{"id":14294,"date":"2022-11-17T16:45:21","date_gmt":"2022-11-17T22:45:21","guid":{"rendered":"https:\/\/pvfa.tamu.edu\/?p=14294"},"modified":"2022-11-17T09:36:27","modified_gmt":"2022-11-17T15:36:27","slug":"engineering-and-visualization-students-collaborate-on-extended-reality-flight-simulator","status":"publish","type":"post","link":"https:\/\/pvfa.tamu.edu\/news\/2022\/11\/17\/engineering-and-visualization-students-collaborate-on-extended-reality-flight-simulator\/","title":{"rendered":"Engineering, visualization students collaborate on extended reality flight simulator"},"content":{"rendered":"\n

Students at Texas A&M University built an extended reality (XR) flight simulator that could potentially be used by the Texas Air National Guard as a cost-effective training tool for pilots. The simulator has been designed to work in conjunction with a mixed-reality deep-immersion headset developed by Passenger Inc., a company out of Austin, Texas.  <\/p>\n\n\n\n

After seeing the Retro Rocket simulator created by students in the Department of Aerospace Engineering and the Department of Mechanical Engineering, Ron Maynard, CEO and founder of Passenger Inc., connected with Dr. Darren Hartl, associate professor in the Department of Aerospace Engineering, about building the F-16 XR flight simulator.

Although Hartl works with virtual reality (VR), XR was new territory for him and his students. XR is a combination of VR and augmented reality (AR). VR is an immersive experience; think of the headsets gamers use today. It replaces what the user sees with a computer-generated environment. AR overlays computer-generated imagery onto our reality, as in Snapchat filters or Pokémon GO. XR lies between these two, blending the virtual and physical worlds almost seamlessly and allowing the two to interact.

“When you’re sitting in the simulator, I don’t know that you can get any closer to feeling like you are flying an F-16, visually and tactilely, in terms of how you use and see your hands and other features of the experience that we have made,” said Hartl.

The physical elements of the design include toggle switches, functional buttons, HOTAS (hands on throttle and stick) controls, rudder pedals and a vibrating seat that immerses the user in the flight. There is also a multifunctional display onto which different VR elements can be cast and which doubles as a touchscreen. All of these controls are fully tactile, and pilots can see their own hands as they manipulate each one. Everything sits within an aluminum frame designed by the students to be light, portable and modular.

The simulator also has a virtual landing gear lever controlled by hand tracking: when the user places a hand over the virtual lever, it moves accordingly. The students also created a VR terrain equipped with pylons for racing agility simulations, missiles that the pilot can fire and a fully viewable aircraft around the pilot.

“The purpose was to demonstrate that we can use this for any airframe,” said Jesse Cate, an aerospace engineering student. “The hand tracking was there to show that we might not even need a full physical cockpit, and the touchscreen was also important to show that we could replace the physical components as well if needed.”

This design shows that such a system can be built quickly, economically and flexibly, giving more pilots a chance to stay proficient across a range of mission tasks when they are not at the airfield.
