3D modeling simulations mesh well with military training needs.
By Patrick E. Clarke, MTI Correspondent
3D virtual reality (VR) training continues to become more realistic in order to meet military expectations. These expectations have truly spread throughout all ranks within all services.
That’s because new recruits are used to Xbox 360 and other video games so realistic that players half expect Arnold Schwarzenegger to reach out and tap them on the shoulder.
Two of the key components in 3D modeling simulations are the simulator and the content library.
MetaVR has just such an extensive content library, from zombies to battleships. No, that wasn’t a typo: MetaVR even has zombies. “The zombie was actually a customer request,” said Phillip Winston, lead software engineer at MetaVR. “The customer wanted a ‘stand in’ which could be used in the simulation but would not be confused with the regular entities.”
MetaVR’s flagship product is the Virtual Reality Scene Generator (VRSG). The accompanying 3D terrain generation and visualization products include substantial libraries of 3D content, totaling over 5,300 models.
MetaVR has become one of the largest suppliers of commercially licensed 3D visualization software for unmanned aircraft system simulation training in the U.S. military, according to Winston. VRSG is used in settings ranging from classroom training at Fort Huachuca, Ariz., and other sites, to embedded training in portable ground control stations in the field.
He explained some of the unique features MetaVR offers. “VRSG has the ability to render ultra-high-resolution 2-centimeter-per-pixel terrain at 60 Hz. At 2 cm resolution one can see small details on the terrain such as small craters left from exploded ordnance and bullet holes on targets,” said Winston.
Winston explained MetaVR can achieve this level of detail because of their unique approach to obtaining the data. “The fact that we do imagery collection with our own remotely controlled aircraft is pretty fantastic. We process the data ourselves with our own tools, then we visualize the results with our image generator (IG).”
Of course the military always needs to see the results of any mission, so battle damage assessments (BDAs) are included. MetaVR’s visualization software has long been used as a tool to teach military vehicle identification for combat target recognition.
The company’s 3D military models achieve high fidelity because they are built from publicly available photographs, and they can be viewed in various sensor modes with simulated noise, contrast and blur. Because a model compares so closely with a photograph, it can be examined from any angle, complete with thermal signatures in the various sensor modes. By zooming and rotating the model and moving its articulated parts, trainees can learn to recognize a vehicle from different ranges and perspective angles.
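The sensor modes described above amount to image-space filters applied to the rendered view. Here is a minimal sketch in Python of three such filters, operating on a grayscale image stored as nested lists of 0–255 pixel values. The function names and parameters are illustrative assumptions, not MetaVR’s implementation:

```python
import random

def adjust_contrast(image, factor):
    """Scale pixel values about mid-gray (128) and clamp to the 0-255 range."""
    return [[max(0, min(255, int(128 + (p - 128) * factor))) for p in row]
            for row in image]

def add_noise(image, amplitude, rng):
    """Add uniform sensor noise of the given amplitude to every pixel."""
    return [[max(0, min(255, p + rng.randint(-amplitude, amplitude)))
             for p in row] for row in image]

def box_blur_row(row):
    """Blur one image row with a 3-tap box filter (edge pixels repeated)."""
    padded = [row[0]] + row + [row[-1]]
    return [(padded[i] + padded[i + 1] + padded[i + 2]) // 3
            for i in range(len(row))]

# Lowering contrast pulls pixels toward mid-gray; blurring smears sharp edges.
frame = [[0, 255, 0]]
low_contrast = adjust_contrast(frame, 0.5)   # [[64, 191, 64]]
blurred = [box_blur_row(r) for r in frame]   # [[85, 85, 85]]
```

In a real image generator these operations would run on the GPU per frame, but the effect on the displayed picture is the same in principle.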
“At I/ITSEC 2015 we demonstrated our largest damage-capable terrain to date, located in our Kabul, Afghanistan, database,” said Winston. “With this approach every building in the area can receive several degrees of damage. Damage is critical to creating a dynamic battlefield because it shows the history and impact of operations.”
As the training and simulation needs of the U.S. military and its allies shift toward the Horn of Africa, MetaVR has built a 3D virtual representation of the southern Somalia port city of Kismayo. The terrain is populated with hundreds of geographically specific culture models built from ground-level photographs taken on the streets of Kismayo. In addition to these geolocated, photographically specific models of buildings and other structures, several hundred more buildings were modeled by matching the structural footprints visible in the imagery to geographically typical models with culturally and architecturally accurate details.
MetaVR VRSG’s 3D model file format supports several features that enhance a model’s realism in a real-time context, for example:
- Switch states toggle between open and closed doors and windows.
- Level of detail switches swap in a less detailed version of a model at a distance to improve performance.
- Animation nodes add a looping sequence to models requiring movement (such as flags and vehicle wheels).
- Material assignments determine how the surfaces of a model respond to lighting.
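The level-of-detail switch in the list above is, in essence, a lookup keyed on viewer distance. A minimal sketch in Python follows; the distance thresholds and mesh names are invented for the example and do not reflect MetaVR’s actual file format:

```python
def select_lod(distance_m, lod_table):
    """Return the mesh whose switch-in range covers the given viewer distance.

    lod_table is a list of (max_distance_m, mesh_name) pairs sorted by
    increasing distance; the last entry covers everything beyond it.
    """
    for max_dist, mesh in lod_table:
        if distance_m <= max_dist:
            return mesh
    return lod_table[-1][1]  # beyond the last threshold: coarsest mesh

# Hypothetical three-level vehicle model.
tank_lods = [
    (200.0, "tank_high.mesh"),     # full detail, articulated parts
    (1000.0, "tank_medium.mesh"),  # simplified geometry
    (5000.0, "tank_low.mesh"),     # a few hundred polygons
]

nearby = select_lod(150.0, tank_lods)    # "tank_high.mesh"
distant = select_lod(3000.0, tank_lods)  # "tank_low.mesh"
```

Swapping in the cheaper mesh at range is what lets an image generator hold a fixed frame rate while thousands of entities populate the terrain.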
As for future challenges: “I think in the future we’ll improve collection capacity by using multiple aircraft at once, and we’ll use many machines to process data in hours rather than weeks,” said Winston.
In Winston’s view, the future is always now. For example, “We will release a version of VRSG this year that uses DirectX 11. We created a DirectX 11 prototype a number of years ago but only now are rolling it into the shipping product. If you adopt new technology too soon, your customers are not ready for it and the platform is immature. If you wait too long, you fall behind. I think we timed this one very well.”
AEgis Technologies is another leader in 3D content and has been around for more than 25 years. “We’ve helped set a lot of standards for 3D simulations,” said Anthony Castillo, production manager for real-time 3D modeling programs at AEgis. Castillo uses the Xbox consoles as a point of comparison. Xbox gaming has higher visual quality and more realistic scenarios, he explained. But an Xbox game is finite, fixed and linear. AEgis simulation models and databases are designed to be open-ended and are not tied to one type of software or IG. At the same time, the products are backward compatible to support older systems.
One of AEgis’ primary products is the Improved Moving Target Simulator System (IMTS), a 40-foot domed simulator with 360-degree projection that can accommodate two three-man Stinger teams. AEgis has delivered multiple IMTS simulators, comprising both hardware and software, to the U.S. military.
In addition to the domed environment, AEgis provides fully embedded simulations that train all operator functions and emergency procedures for family-of-systems small unmanned aircraft systems. AEgis Vampire runs on fielded Panasonic Toughbooks and ground control stations with no modification to fielded hardware and software. AEgis products provide an open-ended training environment in which trainees can learn from their mistakes. The cost of live training and training exercises is always a concern, especially with the wartime posture the U.S. has maintained for years. “It’s always more efficient to use simulators, as well as more cost effective, from live ammunition costs to transporting troops,” said Lauren Johannesmeyer, business development manager, technology solutions at AEgis.
The new content library, AEgis Elements, is loaded with high-fidelity, real-time, military 3D models that have been custom built and compiled over the last two decades by leading 3D artists, designers and developers for use in simulators, gaming applications, demos and more. A customer can search AEgis Elements for hundreds of 3D models: articulated ground vehicles, maritime vessels, aircraft, missiles and weaponry. Search criteria can be filtered by level-of-detail, country code, damage states and national markings.
AEgis Elements 3D models are designed for real-time simulations, featuring high-fidelity geometry and high-resolution textures that can be tailored to individual needs. Models are designed to be universally compatible with numerous geospatial datasets, 2D/3D viewing applications and gaming engines. Model features include:
- High-quality texture
- Multiple paint schemes
- IR texture
- Multiple levels of detail
- Articulated parts where applicable
- Standard structure to ease integration
- Available for delivery in multiple formats
  - Standard format: MultiGen OpenFlight (.FLT)
  - Optional formats: 3D Studio Max (.3DS), VRML (.WRL), VBS2 (.pbo)
AEgis products allow trainees to build physical and/or visual muscle memory. Battle damage assessment is also available; battle damage depends on the use case and the customer’s requirements. Some traditional IG-based simulator models use multiple states based on the damage type. For example, if a vehicle took battle damage to its weapon, that area would show a darkened, burnt-looking texture as well as visible damage to, or removal of, the weapon geometry. Mobility damage would show texture damage to the track or wheel area, with a thrown track or tire. Catastrophic kill states usually show extreme, non-repairable damage.
The newer game engines being used for some simulations have their own visual damage. In most cases it is more area-specific than the legacy method: the damage appears exactly where the vehicle takes a hit rather than as a general damage type. Game-engine damage is also usually controlled by the engine rather than pre-built into the source 3D model.
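The two damage approaches described above can be contrasted in a short sketch. All state names, textures and record fields below are invented for illustration and do not reflect AEgis’ actual data format:

```python
# Legacy IG-style model: a handful of pre-authored whole-vehicle states.
# A hit swaps the entire model to one of these states.
LEGACY_DAMAGE_STATES = {
    "none": {"texture": "tank_clean.rgb", "geometry": "tank_full"},
    "firepower_kill": {"texture": "tank_burnt_turret.rgb",
                       "geometry": "tank_no_weapon"},
    "mobility_kill": {"texture": "tank_thrown_track.rgb",
                      "geometry": "tank_broken_track"},
    "catastrophic": {"texture": "tank_wreck.rgb",
                     "geometry": "tank_hull_only"},
}

def legacy_damage(hit_type):
    """Swap the whole model to a pre-built state for this damage type."""
    return LEGACY_DAMAGE_STATES.get(hit_type, LEGACY_DAMAGE_STATES["none"])

def engine_damage(hit_point, decals):
    """Game-engine style: record damage exactly where the hit landed.

    The engine accumulates decals/deformation at run time instead of
    swapping between a few pre-built whole-model states.
    """
    decals.append({"position": hit_point, "effect": "scorch_decal"})
    return decals
```

The trade-off follows from the sketch: legacy states are cheap and predictable but coarse, while engine-driven damage is localized but lives in the engine rather than in the portable source model.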
AEgis is aggressive in looking at future environments, according to Castillo. But they also make sure products are backward compatible. “We can support older systems while always developing new products, giving them a well-rounded capability,” he explained.
Future systems must be lightweight and portable so they can be downloaded onto a laptop or tablet. “The goal is a product that would allow cost effective access to multiple environments, so training could continue even while troops are deployed,” said Castillo.
He added, “The generation now coming into the military grew up playing video games that were realistic and believable. Military training must be the same quality.”
Bohemia Interactive Simulations (BISim) is a global software company providing simulation training solutions for military and civilian organizations. They use game-based technology for military-specific training and simulation software products.
Virtual Battlespace 3 (VBS3) simulations can be used for training solutions such as scenario training, mission rehearsal and more. VBS3 was selected by the U.S. Army as the flagship product for its Games for Training program. Oliver Arup, vice president of product management, said, “We are based on gaming technology. We’re not old school.”
In fact, the founder of the company, Pete Morrison, co-chief executive officer, got the idea for the company from a game. While serving in the Australian Army, he used a game called Operation Flashpoint and thought it provided better training than the military did.
BISim can provide simulations for most equipment types. For example, ground-based vehicle simulation can be done with HMMWVs, and Marines can do convoy training with simulated vehicles and weapons in a simulated combat environment.
VBS3 comes equipped with a robust Developer Suite including tools for creating buildings, creating terrains from satellite imagery and other data sources, and converting models to the simulation environment.
“Things have to look good because what users use for personal use is high end,” said Arup. “Graphics aren’t the be-all end-all, but have to be credible.” He added, “If you get the users past that first hurdle with the software then the training can be effective.”
That, and having a commander say that this is a real world simulation and soldiers will be judged accordingly. “Then the users get immersed and take the training seriously,” said Arup.
“Simulations may not be specifically designed for BDA, but you can build in BDA as you feel it’s needed,” said Arup. “Basically, BDA can be built into a simulation on the fly.” BISim systems support cratering and destruction.
From a practical standpoint, weather can shut down a live exercise but not a simulation. This provides additional cost savings, since a unit training in a simulation would not have to cancel an exercise.
Arup emphasizes that Bohemia is always looking to meet the challenges of the future. “Every six months hardware improves. We have to be able to match that,” said Arup. He pointed out that expectations are driven by the game market. “However, personal games have billions of dollars to support them. We have to match that with much less funding.”
Another leader in the field of virtual training for air, ground and naval platforms is Elbit Systems. Linked together for interoperability, Elbit Systems’ training solutions simulate complex combat scenarios in which coordination between multiple teams is crucial for success.
Their solutions range from the individual soldier to full-scale joint forces LVC training systems, using advanced modeling, visualization and networking capabilities.
ARTIST (Augmented Reality Integrated Training System) is Elbit Systems’ latest product, one that combines live, embedded and augmented reality technologies into one innovative training suite, according to Livneh Ofer, research and development manager, training and simulation, at Elbit Systems.
A major gap for live embedded training, especially in combat platforms, lies in the shortcomings of the training arena. Training in a true operational arena is very costly, and even then many restrictions are imposed, such as safety rules or limits on where firing is actually permitted. “ARTIST integrates a virtual arena (entities and effects) into the real training arena, where the training occurs. By using ARTIST, the warrior is enabled, while operating the combat platform’s systems, to view the environment and to operate the platform sensors (e.g., electro-optical) against the real world, after it has been enriched by using augmented reality techniques,” said Ofer. The end result is that the training arena the operator is exposed to, via the platform’s displays, is rich, and the elements look, behave and react as in the real operational arena.
The training audiences are operators of sensors and systems with visual displays, whether screens or out-the-window views: commanders and gunners of land or naval platforms, forward observers, border protection personnel, pilots and so on. ARTIST allows operators to train on detection, recognition, identification and fire procedures, as well as defense protocols when red forces are operating against the trainees. The training is aimed at both crew and formation training scenarios.
Battle damage assessment is a crucial element of the training Elbit Systems provides. “We capture, automatically and manually by the instructor, a vast variety of trainee actions, training and arena statuses, events along with the video and audio of the training,” said Ofer. “All that data is presented to the trainee in an after action debrief session. The trainee can use the data and perform various actions like jumping between events, viewing all the data in a synchronized manner or comparing between the trainees who participated in the training.”
The training and simulation domain must bridge two main challenges: constant budget reductions on one hand, and the constantly increasing complexity of platforms, sensors and the operational battlefield on the other. In addition, more and more operational scenarios are carried out by joint forces, and training should support those needs.