Embedded Training, Closer to the Source

Mississippi Army National Guard Spec. Zachary Wilbanks, of Amory, removes the barrel of a 25mm cannon from a Bradley Fighting Vehicle during a Gunner's Skill Test July 20, 2014, at Camp Shelby Joint Forces Training Center. Wilbanks is with the 2nd Battalion, 198th Combined Arms Battalion of the 155th Armored Brigade Combat Team, which is conducting Annual Training at CSJFTC. (Army National Guard photo by Staff Sgt. Scott Tynes, 102nd Mobile Public Affairs Detachment)

Closer to the Source

Embedded capabilities bring training to the operator level, on demand.

By Henry Canaday, MTI Correspondent

Embedded training (ET) is training hosted in hardware or software that is integrated into operational equipment. When activated, ET overrides the equipment’s normal operations for training and assessment.

The U.S. Army uses full and partial ET approaches. In full ET, all training hardware and software reside on tactical platforms, even in combat. In partial ET, some training components are temporarily attached to the system.

Patrick Sincebaugh, lead engineer for ET at the Program Executive Office for Simulation, Training and Instrumentation, said ET’s primary advantage is warfighter training at the point of need. “Lessons from theater have shown this is critical.” In Iraq, for example, vehicle gunners temporarily assigned other duties would have benefited greatly from ET in maintaining gunnery skills. A Common Embedded Training System (CETS) has therefore been developed for virtual gunnery training on the Abrams and Bradley, but integrating CETS on these vehicles or the Stryker would require engineering change proposals.

ET also allows soldiers to train as they fight, using real combat systems. And it’s tough to keep standalone training aids, devices, simulators and simulations current with combat systems. ET inherently addresses most concurrency issues.

Army regulations prefer ET, but significant hurdles must be overcome, including cybersecurity and safety. Standalone trainers are unclassified, while many vehicles are classified. For example, if un-embedded tactical vehicle systems are used for live force-on-force training, data on location and kill status can be sent to operations centers unencrypted; in an ET system, that data would need encryption.
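To make the encryption point concrete, here is a minimal sketch of protecting such a status report in software, using the open-source Python cryptography library’s Fernet recipe. The message fields, identifiers and key handling are invented for illustration; a fielded ET system would rely on accredited cryptographic equipment and formal key management.

```python
# Minimal sketch: encrypting a live-training status report before it
# leaves the vehicle. Fields and key management are illustrative only.
import json
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in practice, keys come from key-management infrastructure
cipher = Fernet(key)

report = {
    "vehicle_id": "BFV-204",          # hypothetical identifier
    "lat": 31.1801, "lon": -89.2000,  # location data that must not go out in the clear
    "kill_status": "ALIVE",
}

# Encrypt the serialized report; only holders of the key can read it.
token = cipher.encrypt(json.dumps(report).encode("utf-8"))

# The operations center decrypts with the shared key.
recovered = json.loads(cipher.decrypt(token))
assert recovered["kill_status"] == "ALIVE"
```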

Updating ET can be a challenge, as updates may only be allowed during annual or semi-annual software loading. And changing ET may require updated safety certification, which is time-consuming and costly.

Another challenge is using ET with non-standard, proprietary interfaces and software. The government sometimes lacks the data rights needed to embed software affordably. Future live-training systems will use open interface standards, with software available to qualified vendors and government users for download at www.lt2portal.mil.

The Army’s Embedded Training Working Group, with more than 150 members, is developing a strategic roadmap for ET, working on collaborative opportunities, solutions and standards. One achievement is the Multifunction Vehicle Port (MFVP) Interface Standard for connecting standalone training systems to vehicles through training ports.

Development of Live Training Engagement Composition (LTEC) software enables live training for appended, hybrid and full ET. Stryker will embed LTEC to eliminate some appended training devices.

Sincebaugh said the cost of appended, hybrid and full ET could be significantly reduced by implementation of the Vehicular Integration for C4ISR/EW Interoperability (VICTORY) architecture and specification. VICTORY is a PEO Ground Combat Systems (GCS) initiative that originated to address size, weight and power issues with ground vehicles.

“Mission equipment packages on ground vehicles were often developed in a stove-pipe manner,” Sincebaugh noted. “They weren’t fully integrated with the vehicle or other existing systems.”

The Program Manager for Training Devices has been working closely with PEO GCS to ensure test and training requirements are satisfied by VICTORY. Component types for interfacing with weapon systems are now specified in VICTORY architecture.

This provides a common interface through which different vehicles can supply data such as trigger events. Sincebaugh said widespread implementation of VICTORY on ground vehicles would eliminate vehicle-specific kits for interfacing with weapons.

VICTORY can also enable software re-use across training domains. “The same software and message can be used for acquiring a trigger event for live force-on-force training as for ET in gunnery,” Sincebaugh said. Reducing vehicle-specific kits and re-using software across domains would significantly cut training costs.
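To illustrate the kind of re-use Sincebaugh describes, the sketch below shows a single trigger-event message consumed by both a live force-on-force handler and an embedded gunnery handler. The class, field and function names are hypothetical and are not drawn from the published VICTORY specification.

```python
# Illustrative sketch only: one trigger-event message shared by two
# training domains, in the spirit of the VICTORY idea that a common
# vehicle interface replaces per-vehicle interface kits.
from dataclasses import dataclass

@dataclass(frozen=True)
class TriggerEvent:
    vehicle_id: str
    weapon_station: str
    timestamp_us: int  # microseconds since epoch

def on_trigger_live(evt: TriggerEvent) -> None:
    """Live force-on-force: report the shot for engagement adjudication."""
    print(f"[live] {evt.vehicle_id} fired {evt.weapon_station} at {evt.timestamp_us}")

def on_trigger_embedded(evt: TriggerEvent) -> None:
    """Embedded gunnery: score the same shot inside the on-board simulation."""
    print(f"[embedded] scoring shot from {evt.weapon_station}")

# Both domains subscribe to the same event from the same vehicle interface.
subscribers = [on_trigger_live, on_trigger_embedded]
event = TriggerEvent("M1A2-17", "main_gun", 1_700_000_000_000_000)
for handler in subscribers:
    handler(event)
```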

Meggitt Training Systems is well-versed in ET, with more than 2,000 fielded systems in armored vehicles.

Director of business development Paul Romeo said applying ET correctly in the right circumstances can be very cost-effective, since most equipment can be reused rather than replicated or modeled. ET assets can be operated in any environment the real assets operate in. ET can also train anywhere and anytime, especially in theater. And ET offers the best fidelity in look and feel.

But the approach is not suitable for all situations. “Some vehicles are more suited than others,” Romeo noted. The choice of ET should be based on the vehicle’s internal components; otherwise, the cost and challenges of ET will exceed those of standalone trainers. Generally, vehicles that have cameras for primary sighting systems are better suited for ET than vehicles that rely on optical combat sights.

Vehicles must also have room to house ET hardware, mostly a high-end ruggedized computer. And power consumption matters: vehicle OEMs often limit ET to 100 watts, with simulations running on batteries when the engine is off.
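A rough worked example shows why that ceiling matters for engine-off operation; all of the battery figures below are assumptions for illustration, not Meggitt or OEM data.

```python
# Back-of-envelope battery-runtime check for engine-off ET.
# All numbers are assumptions for illustration, not vendor figures.
battery_voltage_v = 24.0     # assumed two-battery military vehicle bus
battery_capacity_ah = 120.0  # assumed usable capacity
et_load_w = 100.0            # the OEM power ceiling cited in the article

usable_wh = battery_voltage_v * battery_capacity_ah * 0.5  # keep a 50% reserve for engine start
runtime_h = usable_wh / et_load_w
print(f"Engine-off ET runtime: about {runtime_h:.1f} hours")  # ~14.4 hours under these assumptions
```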

Meggitt balances the best ET technology against the best performance, and power limits can sometimes force compromise. ET typically requires two to four channels, feeding two to four sighting systems, and using more channels can affect performance.

Romeo also stressed that ET itself does not teach a crew how to perform actions. A computer-assisted training module must be developed, with training guidelines on the step-by-step process of performing tasks.

Matt Jackson, a technical product manager at Presagis, has worked with his firm’s software for both real avionics in cockpits and for training. “In Europe, that requires certification, and in the U.S. the ability to get data on the screen with fusion,” Jackson explained. Presagis works with real systems for both aircraft and ground vehicles and also does ET.

ET requires plenty of software for simulating virtual sensors and mimicking radar behavior and tactical environments. Blending virtual tools with real systems enables troops to train realistically in the field. That is the ET approach.

For example, ET simulation lets pilots work with other aircraft, manned or unmanned, without the associated operational and sustainment costs. ET lets pilots employ virtual weapons in place of real ones, which are both dangerous and expensive. Presagis software simulates the information that real hardware would produce.

So ET for both aircraft and ground vehicles usually requires a combination of real hardware and simulated hardware, data and situations. The challenge is inserting the virtual world into the real world safely, effectively and affordably.

Jackson said modern simulation makes it theoretically possible to do almost anything in the half-real, half-virtual world of ET. But military budgets mean ET is limited by resources and must focus on essential training functions. “You have to have a reason and operational need. That determines how realistic you can make it.”

Whether it’s a vehicle, aircraft or ship, space inside most platforms is at a premium, so any system that takes away more space has to be considered carefully. “D-Box systems are so compact they can be adding kinesthetic cues virtually anywhere, even in confined and compact military vehicle environments,” said Sebastien Loze, D-Box’s senior product marketing director.

In late 2015, the Quebec, Canada-based D-Box demonstrated a prototype Embedded Tactical Team Trainer, a combined simulator with a driver/pilot position and an unstabilized gunnery trainer. “The Embedded Tactical Team Trainer provides trainees with the finest immersion techniques through motion and visual cues. It integrates world-class technologies from D-BOX, Esterline and Laser Shot to create true-to-life training scenarios,” said Loze.

“Embedded training is just starting. Adoption of simulation communication standards such as HLA is increasing, and with it the possibility of linking these simulators together in a network, building better team training from different simulators,” Loze continued.

Textron Systems’ Embedded Onboard Training (OBT) allows forces to cost-effectively train operators in port and rehearse missions at sea. OBT technology “stimulates systems installed on the ship, supporting and integrating high-value assets in their operational environment,” explained Byron Green, electronic systems vice president of business development.

OBT’s environment and scenario controller is a common tool used across many Textron products, allowing collaborative ET across many domains, assets and countries participating in networked exercises.

Textron ET presents realistic visuals of the battlespace on ship-operator consoles by stimulating various sensors—navigation, radio, radar, sonar and weapons—with high fidelity, representing the behavior of the players in each scenario. Green emphasized that the stimulations are purely physics-based. Training scenarios can be entirely virtual or virtual augmentations of real situations.

Green also stressed OBT’s modularity, allowing rapid delivery of new functions. OBT enables collaborative training and mission rehearsals, for example, by crews sailing to mission positions. And ET allows warfighters to train inexpensively and frequently on the ground or in port without costly missions at sea.

Green said Textron is uniquely able to offer ET systems because it also makes the radio-frequency receivers, synthesizers and other components that are being modeled or stimulated. “The science is the hard part. We do the math determining what the results would be in real time, not just provide old-fashioned look-up tables.” ET generally requires excellent visualization of training situations.
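Green’s do-the-math-in-real-time point can be illustrated with a textbook example: computing a radar echo’s received power from the radar range equation each frame, rather than interpolating a stored table. The parameter values below are placeholders for illustration, not Textron’s models.

```python
# Sketch: physics-based radar echo power via the radar range equation,
# Pr = Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4).
# Parameter values are placeholders for illustration only.
import math

def received_power_w(pt_w: float, gain: float, wavelength_m: float,
                     rcs_m2: float, range_m: float) -> float:
    return (pt_w * gain**2 * wavelength_m**2 * rcs_m2) / ((4 * math.pi)**3 * range_m**4)

# A 10 m^2 target at 20 km seen by an X-band radar (3 cm wavelength, 35 dB gain):
pr = received_power_w(pt_w=25_000, gain=10**(35 / 10), wavelength_m=0.03,
                      rcs_m2=10.0, range_m=20_000)
print(f"Echo power: {pr:.3e} W")  # recomputed each frame as the scenario evolves
```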

MetaVR’s 3D product, Virtual Reality Scene Generator (VRSG), is a real-time 3D graphics engine that can render very large, expansive and geo-specific terrain in round-earth format, explained W. Garth Smith, MetaVR’s president.

The U.S. Army uses VRSG to train unmanned aerial system (UAS) pilots in ET and other configurations.

VRSG is a commercial off-the-shelf system that runs on game-level Windows computers. Customers use VRSG to build terrain, populate it with culture and create training scenarios to run in VRSG. Especially important for UAS training is VRSG’s ability to simulate UAS video feeds, coupled with its physics-based infrared (IR) rendering capability.

The Army uses VRSG in its RQ-7 Shadow Crew Trainer and Universal Ground Control Station (UGCS) for training operators of the Shadow, MQ-1C Gray Eagle and MQ-5 Hunter. VRSG simulates the UAS camera payload by streaming simulated real-time UAS key-length-value (KLV) metadata multiplexed into an HD H.264 transport stream (H.264 is a commonly used video coding format). Tactical systems use this streaming MPEG feed to visualize sensor imagery in real time.

VRSG can encode Motion Imagery Standards Board (MISB) 0104.5 and Engineering Guideline (EG) 0601 metadata. The HD H.264 stream can be transmitted live over User Datagram Protocol (UDP) to any remotely operated device that can play back an actual intelligence, surveillance and reconnaissance video feed. VRSG’s real-time, high-definition simulated video is indistinguishable from an actual UAS video feed.
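As a hedged illustration of that key-length-value pattern, the sketch below builds a 16-byte universal label key, an overall length and a few tag-length-value items, then sends the packet over UDP. It is deliberately simplified: real EG/ST 0601 encoding mandates exact fixed-point scaling of each field and a trailing checksum item, both omitted here, and in a real system the KLV rides inside an MPEG transport stream next to the H.264 video.

```python
# Simplified sketch of the key-length-value (KLV) pattern used by MISB
# UAS metadata. Scaling and the mandatory checksum item are omitted.
import socket
import struct
import time

# MISB 0601 UAS Datalink Local Set universal label (published constant).
UAS_LS_KEY = bytes.fromhex("060E2B34020B01010E01030101000000")

def tlv(tag: int, value: bytes) -> bytes:
    """Encode one tag-length-value item (short-form BER length only)."""
    return bytes([tag, len(value)]) + value

items = (
    tlv(2, struct.pack(">Q", int(time.time() * 1e6)))  # tag 2: Unix timestamp, microseconds
    + tlv(13, struct.pack(">i", 311_801_000))          # tag 13: sensor latitude (scaling simplified here)
    + tlv(14, struct.pack(">i", -892_000_000))         # tag 14: sensor longitude (scaling simplified here)
)
packet = UAS_LS_KEY + bytes([len(items)]) + items

# Send the raw KLV packet over UDP to a (hypothetical) local receiver.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 15000))
```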

The Army uses VRSG in its UGCS for training operators of multiple platforms in universal mission simulators, the next-generation simulators for multiple UASs. The UGCS Tactical Common Data Link can use VRSG’s streaming feed to visualize simulated sensor imagery in real time and extract the UAS metadata.

Using VRSG, the switch from a UGCS simulation exercise to an actual mission is seamless; trainees need only learn one system. Thus, when UAS operators are not flying an actual UAS, they can fly a simulated UAS mission using the same hardware.

Training UAS operators often requires interacting with joint terminal attack controller (JTAC) trainees during close air support exercises. VRSG can stream its simulated sensor video to a JTAC’s Remote Operational Video Enhanced Receiver, generating target range and coordinates on its monitors.

VRSG simulates IR images directly from the visual database by combining automatic material classification of red-green-blue images with a physics-based IR radiance and sensor model. This means VRSG emulates the heat signatures of terrain as well as vehicles, characters and objects on the terrain with a very high degree of accuracy.
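The physics behind this kind of IR rendering can be illustrated with the Stefan-Boltzmann law, in which a material’s emissivity and temperature determine its radiant exitance. The material values below are assumptions for illustration, not MetaVR’s classification data.

```python
# Sketch: per-material IR radiant exitance via the Stefan-Boltzmann law,
# M = emissivity * sigma * T^4. Material properties are illustrative only;
# a production sensor model also handles spectral bands, atmosphere, etc.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

materials = {               # (emissivity, temperature in kelvin), assumed values
    "asphalt":      (0.95, 310.0),  # sun-warmed road
    "grass":        (0.98, 295.0),
    "vehicle_hull": (0.85, 330.0),  # engine heat
}

for name, (emissivity, temp_k) in materials.items():
    exitance = emissivity * SIGMA * temp_k**4
    print(f"{name:>12}: {exitance:7.1f} W/m^2")
# Hotter, more emissive surfaces render brighter in the simulated IR image.
```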

MetaVR provides a number of tools and products to enhance the usefulness of VRSG.

To aid in building realistic, highly detailed 3D terrain, MetaVR’s new remote-controlled aircraft collects images at sub-inch resolution. Synthetic environments built from the resulting imagery with MetaVR’s Terrain Tools plugin for Esri ArcGIS benefit from this ultra-high resolution and can be rendered in VRSG’s simulated sensor feeds with a high degree of ground detail.

UAS customers use MetaVR’s VRSG Scenario Editor to build culture-rich, dense scenes and real-time pattern-of-life scenarios to support training exercises in VRSG. Using this game-level editor, users can drag and drop culture and moving models onto the 3D terrain, create paths of movement, assign appearances and animations, and sequence activities in a timeline. In doing so, users can draw from MetaVR’s extensive libraries of more than 5,500 models.