The Federal Aviation Administration (FAA) uses MetaVR Virtual Reality Scene Generator (VRSG) in two of its out-the-window general aviation flight simulators to support human factors research related to synthetic vision systems at the FAA Civil Aerospace Medical Institute (CAMI) within the Mike Monroney Aeronautical Center, Oklahoma City.
Synthetic vision is a computer-generated image that provides situational awareness of the external scene topography from the perspective of the cockpit. This image is derived from aircraft attitude, a high-precision navigation solution, and a terrain database with buildings, towers, and other relevant cultural features. NASA's definition of a synthetic vision system is "an aircraft cockpit display technology that presents the visual environment external to the aircraft using computer-generated imagery in a manner analogous to how it would appear to the pilot if forward visibility were not restricted."
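The derivation described above — combining aircraft attitude and position with a terrain database to produce a cockpit-perspective image — can be sketched in miniature. The function below is an illustrative assumption, not code from any actual synthetic vision system: it uses a simple pinhole-camera model with only heading and pitch (ignoring roll) to project one terrain point onto a display plane.

```python
import math

def project_terrain_point(point, aircraft_pos, heading_deg, pitch_deg, focal=800.0):
    """Project a world-space terrain point into display coordinates using a
    simple pinhole camera model (illustrative only; roll is ignored).
    World frame: x = east, y = north, z = up, all in metres."""
    # Translate the terrain point into an aircraft-centred frame.
    dx = point[0] - aircraft_pos[0]
    dy = point[1] - aircraft_pos[1]
    dz = point[2] - aircraft_pos[2]
    # Rotate by heading (about the up axis) so "forward" looks out the nose.
    h = math.radians(heading_deg)
    fwd = dx * math.sin(h) + dy * math.cos(h)
    right = dx * math.cos(h) - dy * math.sin(h)
    # Rotate by pitch (about the right axis).
    p = math.radians(pitch_deg)
    depth = fwd * math.cos(p) + dz * math.sin(p)
    up = -fwd * math.sin(p) + dz * math.cos(p)
    if depth <= 0:
        return None  # point is behind the viewpoint; not drawn
    # Perspective divide onto the display plane.
    return (focal * right / depth, focal * up / depth)

# A terrain point 1 km due north of, and 100 m below, an aircraft flying
# level and heading north appears below the centre of the display:
px = project_terrain_point((0.0, 1000.0, 0.0), (0.0, 0.0, 100.0), 0.0, 0.0)
```

A real system would repeat this projection for every vertex of the terrain and cultural-feature database each frame, driven by the live navigation solution.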
The FAA uses 15 VRSG licenses in its Advanced General Aviation Research simulator (with a Piper Malibu configuration) and its Very Light Jet simulator (with a Cessna 510 Citation Mustang configuration). These simulators, built by ZedaSoft, use VRSG to render virtual airfields in MetaVR's round-earth terrain format and are used to study and test landing on runways, with the aid of synthetic vision, when a pilot's out-the-window visibility is limited.
The FAA's human factors research department conducts its research at two laboratories with five active simulators, two of which are the Very Light Jet (VLJ) simulator and the Advanced General Aviation Research Simulator (AGARS). Research focuses on improving individual system effectiveness, efficiency, and safety, with a major emphasis on improving human performance through enhanced equipment design. In these two general aviation flight simulators, experimenters typically run pilots through pre-designed scenarios and record the pilots' interactions and behavioral responses; the recorded video, audio, and aircraft computer data are then analyzed by the experimenters. In this case, the scenarios simulate landing in adverse weather conditions, when visibility of the airfield is obscured.
The visual system for the FAA’s Cessna Citation Mustang VLJ simulator (shown above) uses MetaVR VRSG with five projectors and a 225-degree fixed dome. The simulated cockpit replicates a Cessna Mustang VLJ cockpit’s physical equipment and furnishings, controls, and interphone and air/ground communications equipment. Having the simulated system function and perform like the actual aircraft system increases realism and helps pilot trainees gain experience with tools they would use in the real world, such as the Garmin G1000 integrated flight instrument system.
The VRSG licenses were acquired concurrently with the FAA’s purchase of nine virtual airfields delivered in MetaVR’s round-earth Metadesic terrain format for visualizing in VRSG.
The synthetic environments include all major airport infrastructure, navigational aids, approach lights, VFR reference points, runways, aprons, and taxiways to form a cohesive, realistic out-the-window scene.
Built using the latest airport diagrams and elevation data, these airfield databases conform to the FAA's strict standards: the terrain elevation matches FAA data, and the virtual runways match real-world data closely enough for trainees to experience the challenges of changing runway elevation and slope. For example, the elevation of the ends of each virtual runway matches the FAA runway data exactly, and the slopes from the runway ends toward the center match the FAA slope data, so some sections of a runway are flat while others slope up or down. Such terrain elevation accuracy is key for pilot training, as pilots must be aware of the changing slopes in order to avoid landing short of, or overshooting, the runway.
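The endpoint-elevation check described above reduces to simple arithmetic. The sketch below shows the convention used in published airport data, where average runway gradient is the elevation change between runway ends divided by runway length; the figures in the example are illustrative, not actual FAA data for any airport.

```python
def runway_gradient_percent(elev_end_a_ft, elev_end_b_ft, length_ft):
    """Average runway gradient as a percentage, per the usual airport-data
    convention: (elevation change between ends / runway length) * 100.
    A positive result means the runway rises from end A to end B."""
    return (elev_end_b_ft - elev_end_a_ft) / length_ft * 100.0

# Illustrative (not actual FAA) figures: a 7,006 ft runway whose ends sit
# at 7,680 ft and 7,815 ft MSL rises roughly 1.9% from end A to end B.
grade = runway_gradient_percent(7680.0, 7815.0, 7006.0)
```

A database builder could run this check against each virtual runway's endpoint elevations to confirm they reproduce the published gradient.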
Synthetic vision simulation
Synthetic vision is an emerging area of research and technology focused on reducing approach-and-landing accidents caused by the pilot's inability to see the runway in adverse weather conditions. Limited visibility remains the single most critical factor affecting safety and capacity in worldwide aviation operations, and NASA and the FAA are among the organizations seeking to maximize situational awareness in the cockpit through synthetic vision with the goal of reducing such accidents.
As shown in a publicly available video of a missed approach to Aspen airport in severely restricted, weather-related visibility, during times of reduced visibility pilots rely solely on the instrumentation and other tools in the cockpit, along with reports from Air Traffic Control, to maintain a mental image (situational awareness) of their position in space relative to terrain, airports, and other air traffic.
Synthetic vision displays an image relative to the terrain and airport within the limits of the navigation source capabilities (position, altitude, heading, track, and the correlated geospecific terrain database). Elements of a synthetic vision system are shown in the FAA diagram to the left.
By providing a visual solution to a visibility problem, synthetic vision gives the pilot a clear picture of where the aircraft is in space, especially in relation to the airfield, and enables the pilot to land an aircraft under the virtual equivalent of unobstructed daylight conditions.
The image below shows a different example of a flight simulator with a VRSG synthetic vision view. In the image, a flight simulator running Battlespace Simulations' Modern Air Combat Environment (MACE) is using VRSG with its simulated clouds for the out-the-window view, displaying the approach to the Aspen airfield virtual terrain. Below the out-the-window view is the VRSG synthetic vision view (without weather effects), just above the cockpit controls.