
Australia’s future defence simulation capability will move from monolithic, independent systems to the defence equivalent of plug and play, sharing realistic “virtual world” modules from a central database repository, and may incorporate technology borrowed from the computer gaming industry to provide maximum realism at affordable cost.

That’s the vision presented by the director general of the Australian Defence Simulation and Training Centre (ADSTC), Commodore Charles McHardie. Speaking at Adelaide’s SimTecT 2014 simulation conference in August, CDRE McHardie outlined a future distributed simulation training system in which Australia’s defence services will have a 24-hour ability to participate in real-time joint training scenarios, share data and use accurate modelling of training areas and possible operational environments to maximise training realism.

He believes this function will be vital to ensure the best use of the unprecedented number of sophisticated, high-value platforms the ADF will bring into service over the next three decades, such as the Joint Strike Fighter, EA-18G Growler electronic attack aircraft, Seahawk Romeo helicopters, Air Warfare Destroyers, P-8 Poseidon anti-submarine aircraft, Wedgetail Airborne Early Warning and Control aircraft and the Canberra class LHDs.

“Each of those platforms are very expensive to run,” he said. “And they’re a leap forward as far as cost per operating hour or cost per sea day, so we need to try and leverage off, as much as we can, the benefits of simulation to reduce our through-life cost of those platforms.”

To achieve his goal, CDRE McHardie will have to turn current simulator architecture on its head. The majority of legacy simulators are platform-specific, designed to train pilots on a particular aircraft type, or navy and army personnel in a particular task. The fidelity of the platform characteristics has often taken precedence over the level of realism offered by the simulated environment in which the training is conducted. And many have used proprietary systems or data that were never designed to be shared or updated by anyone other than the manufacturer, let alone exchanged with other simulators in real-time exercises. But with the future focus on joint training to support joint operations, all of that is about to change.

Under Joint Project JP 3035, the ADF aims to acquire a persistent, integrated, distributed synthetic (simulated) environment to support joint and coalition collective training, mission planning and rehearsal. Phase II of the project will deliver a core simulation capability.

“Rather than bringing live platforms together all the time, what we need to be able to do is distribute our product, our simulated product, over our networks to where it is needed,” CDRE McHardie said. “So it’s interconnecting our live platforms and our simulators to reduce travel costs and load on our platforms.”

The required end state is that officers responsible for training on live assets such as ships and vehicles will be able to plug realistic simulated participants into a training scenario when live participants are not available.

“What they should be able to do is reach into the wider simulation organisation, headed up by the ADSTC, and say: for my period at sea on this day I’m going to need four fast jets for strike, a P3 for strike direction, combat air patrol, submarines, etc. And be able to put together a mix of live and virtual platforms and undertake that training whilst they’re at sea.

“The same thing in an army environment; if we want to go conduct joint fires and train JTACs, to be able to conduct that – for P3s or Wedgetail to be able to interact with a wider joint force. That’s what we’re talking about.”

But that wouldn’t always mean everyone is sitting in an expensive high-fidelity simulator.

“We shouldn’t be building or federating high fidelity platforms that do not need to be federated,” CDRE McHardie said. “Why would we use a very expensive high-fidelity F-18 simulator to conduct some distributed mission training with, say, a frigate, when they don’t need that level of simulation? Because in that example we are not about training the pilot, we are about training the operations staff in that frigate. It may be a lot more cost-effective to use a commercial platform, a quite inexpensive platform with a pilot or someone sitting in a seat with a joystick using combat net radio sim to use a VOIP circuit to interact with the other platforms.”

The first steps towards a linked system came in August, when simulator provider CAE linked the Royal Australian Air Force’s C-130J full-flight and mission simulator with a C-130J tactical airlift crew trainer at RAAF Base Richmond.

Industry contribution
Speaking at another SimTecT forum, CAE’s Chief Technical Officer and Senior Design Engineer, Jawahar Bhalla, explained that the core of the new philosophy is to concentrate on creating a universally available virtual world in which individual platform- or scenario-based simulations can interact, and then to ensure that all the simulators involved operate to an open common standard so they can talk to each other.

“A common synthetic environment means we have to have a form of database that we can all work with,” he said. “Essentially we need to separate out the simulation of the platform, the aircraft or the ship or whatever it is, from the simulation of the synthetic environment. What we are talking about is a shift from a platform-centric architecture to simulation of a virtual world in a sense, where we pull the platform out and the virtual world has simulations of the various things that make up that world.”
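The architectural shift Bhalla describes can be sketched in code. The following minimal Python sketch is illustrative only: the class and message names are invented here, and a real federation would exchange such updates over an open interoperability standard (DIS and HLA are the widely used ones in defence simulation) rather than through in-process callbacks. The point is the separation: the platform simulator publishes into, and subscribes to, a virtual world it does not own.

```python
# Illustrative sketch: a platform simulator decoupled from the shared
# synthetic environment. All names are hypothetical.
from dataclasses import dataclass


@dataclass
class EntityState:
    """A common-standard state update that any federated simulator can read."""
    entity_id: str
    entity_type: str     # e.g. "fast jet", "frigate", "ground vehicle"
    position: tuple      # latitude, longitude, altitude (m)
    velocity: tuple      # metres per second in each axis


class VirtualWorld:
    """The shared synthetic environment: it owns the entity states,
    independent of any single platform simulator."""

    def __init__(self):
        self.entities = {}
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, state):
        self.entities[state.entity_id] = state
        for callback in self.subscribers:
            callback(state)          # push the update to every federate


class PlatformSimulator:
    """A platform simulator plugs into the world rather than owning it."""

    def __init__(self, entity_id, entity_type, world):
        self.entity_id = entity_id
        self.entity_type = entity_type
        self.world = world
        world.subscribe(self.on_world_update)

    def step(self, position, velocity):
        # Publish own state into the shared world each simulation frame.
        self.world.publish(EntityState(self.entity_id, self.entity_type,
                                       position, velocity))

    def on_world_update(self, state):
        if state.entity_id != self.entity_id:
            pass  # update own tactical picture from the shared world


world = VirtualWorld()
jet = PlatformSimulator("jet-01", "fast jet", world)
frigate = PlatformSimulator("ffh-154", "frigate", world)
jet.step((-34.9, 138.6, 6000.0), (250.0, 0.0, 0.0))  # frigate's picture updates
```

In this arrangement a high-fidelity cockpit and an inexpensive joystick station are interchangeable federates: both produce and consume the same state messages.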

To explain the complexity of such a system he used the example of a tactical simulation exercise in which a bridge is destroyed. For the training to remain accurate, once the bridge has been destroyed it must appear that way to everyone involved for the rest of the exercise, regardless of the device they are using to view it.

“It needs to be correlated internally across all systems that use the databases, not just the optical window, but the FLIR, the radar and so on, all need to see the same things in the same places,” Bhalla said. “So we want these simulations to be able to act together in real time to produce that virtual world and simulation of the platform is just another thing that comes and plugs in to this virtual world.”
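That correlation requirement can be illustrated with a minimal sketch (all names invented): a single authoritative world state, with every sensor channel rendering from it, so the optical window, FLIR and radar cannot disagree about the bridge.

```python
# Illustrative sketch of correlated world state across sensor channels.
class SharedWorldState:
    """Single authoritative record of environment features for the exercise."""

    def __init__(self):
        self.features = {"bridge_07": "intact"}
        self.views = []                  # optical, FLIR and radar renderers

    def register_view(self, view):
        self.views.append(view)
        view.sync(self.features)         # late joiners still see current state

    def destroy(self, feature_id):
        self.features[feature_id] = "destroyed"
        for view in self.views:
            view.sync(self.features)     # every channel re-renders at once


class SensorView:
    def __init__(self, channel):
        self.channel = channel           # "optical", "FLIR" or "radar"

    def sync(self, features):
        print(self.channel, features)


world = SharedWorldState()
for channel in ("optical", "FLIR", "radar"):
    world.register_view(SensorView(channel))
world.destroy("bridge_07")  # all three channels now show the bridge destroyed
```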

This view of the virtual world as a module designed separately, managed from one place and made available to all, opens myriad possibilities for increasing realism. Just as commercial computer games allow users to load expansion pack software modules introducing new characters, environments or vehicles, simulators could emulate specific scenarios, climates and weather.

“When the weather was dry we would expect the roads to be dried out; you would see dust from vehicles and the vegetation would dry out,” Bhalla said. “We want to be able to change the properties of the simulation. So if there was rain the vegetation would change, the roads themselves would start to get bogged down and you would expect to get bogged in those situations. And you need to be able to do this quickly, to update it to reflect what is happening in the real world to make it more effective for mission rehearsal.”
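A hypothetical sketch of such a weather-driven module follows; the thresholds and property names are invented purely to illustrate the idea of one terrain database re-parameterised by current conditions.

```python
# Illustrative weather module: the same terrain, re-parameterised.
from dataclasses import dataclass


@dataclass
class EnvironmentConditions:
    recent_rainfall_mm: float = 0.0

    @property
    def road_state(self) -> str:
        # Heavy rain bogs routes down; a prolonged dry spell raises dust.
        if self.recent_rainfall_mm > 20:
            return "boggy"
        return "dusty" if self.recent_rainfall_mm < 2 else "firm"

    @property
    def vegetation(self) -> str:
        return "green" if self.recent_rainfall_mm > 10 else "dried-out"


# Updated quickly from real-world reports ahead of mission rehearsal:
env = EnvironmentConditions(recent_rainfall_mm=35.0)
print(env.road_state, env.vegetation)  # boggy green
```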

International offerings
The concept of plug and play interests more than just the database designers. On its exhibition stand at SimTecT, Canadian company D-Box displayed a portable vehicle simulator with the driver’s seat and controls mounted on four computer-controlled electromagnetic pistons that cycle so quickly they can create the sensation of engine vibration. D-Box said its simulations reproduce the characteristics of specific vehicles and can accurately simulate vehicles sliding in mud or becoming bogged. They can also reproduce the sensations of a vehicle’s armament firing or taking hits from hostile weapons, from small arms to an improvised explosive device (IED) detonation. The system is ready to connect with any VBS2 (Virtual Battlespace 2) or VBS3-based simulation application.
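As an illustration only, and not D-Box’s actual interface, a motion seat of this kind essentially maps simulation events onto displacement profiles for its actuators: rapid low-amplitude cycling is felt as engine vibration, while a sharp decaying pulse is felt as a blast.

```python
# Hypothetical event-to-motion mapping for a four-actuator seat.
import math


def actuator_displacements(event: str, t: float) -> list[float]:
    """Displacement in mm for the four seat actuators at time t seconds.
    Event names and amplitudes are invented for illustration."""
    if event == "engine_idle":
        # ~30 Hz low-amplitude cycling is felt as engine vibration.
        v = 0.4 * math.sin(2 * math.pi * 30 * t)
        return [v, v, v, v]
    if event == "bogged_left":
        # Asymmetric sag conveys the vehicle sinking on one side.
        return [-6.0, -6.0, 0.0, 0.0]
    if event == "ied_detonation":
        # Sharp pulse decaying over a fraction of a second.
        p = 15.0 * math.exp(-8.0 * t)
        return [p, p, p, p]
    return [0.0, 0.0, 0.0, 0.0]
```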

And the gaming world may have more to offer than the plug and play philosophy. CDRE McHardie described how the ADSTC is studying gaming technologies such as the Oculus Rift virtual reality headset and the Omni treadmill, a stationary platform, similar to a gym treadmill, used to simulate walking or running.

“So what we should get is the three services coming to the ADSTC with a need to build a virtual world to conduct that low-end virtual collective training, and we can assist with the requirements,” CDRE McHardie said. “Now we may well be able to build it in-house if it’s quite simple, or we’ll be going out to industry, maybe through our defence simulation minor capital program, to ask industry to deliver those virtual platforms for us. But different to what we’ve done in the past, where we’ve allowed a lot of autonomy for the services to actually just engage directly with industry, we’re trying to drive that technology down a certain path by gaining a better understanding of it.”

“We see this as the heart of Joint Project 3035 Phase II. It needs to deliver us a real enterprise solution to being able to deliver a very contemporary joint exercise operating environment in the future. And we need to get better and better at automating our production of all those products that sit in that joint operating environment.”
