Rojas-Muñoz To Explore Adaptable Virtual Reality Environments Through ASCEND Research Fellowship
Edgar Rojas-Muñoz, assistant professor in the School of Performance, Visualization and Fine Arts, was recently selected for Texas A&M University’s inaugural ASCEND research seed grant initiative.
The Research Leadership Fellowship aims to develop junior faculty into research leaders and includes a $75,000 award for a one-year project. Twelve researchers across the university were selected.
Rojas-Muñoz leads the interdisciplinary research team that also includes Patrick Suermann, interim dean in the School of Architecture, and Xin Li, visual computing professor in the School of Performance, Visualization and Fine Arts.
The project will explore the artificial generation of virtual environments, targeting urban planning and room modification. The goal: accelerate content creation for virtual reality applications, making it more flexible and adaptable for nonexpert users.
Rojas-Muñoz explains this in the context of firefighter training. A virtual environment can be designed and its interactions programmed, but the finished product cannot be altered or scaled by the client, so a firefighter can experience only what is already there. After repeated use, the virtual building's floor plan becomes familiar and anticipated. That reduces the training's effectiveness, Rojas-Muñoz said, because firefighters encounter new and unfamiliar floor plans during a real emergency. Changing the scenario then requires rehiring the company that produced it.
“That’s the business model,” Rojas-Muñoz said. “It’s not sustainable for everybody, which means a lot of people don’t want to take on the initial commitment to VR because they know that it will be unsustainable for them. They’re not going to have the flexibility they require.”
The project aims to solve this through a two-step process. The first step uses camera devices to scan and capture the environment, a technique commonly used for 3D experiences. But instead of acquiring a static representation of the room, the system will build a semantic interpretation, identifying the room's characteristics, such as the roof, windows, floor and furniture. In the second step, users will enter a virtual reality rendition of the environment and modify it intuitively. Through this immersive editing, users can generate new virtual models from existing real ones.
“You put on the goggles and you’re able to walk inside,” he said. “And while immersed there, you have access to the defining characteristics of the room. You can easily say, ‘OK, I like this window and I want to put another one in that wall.’ So I grab this window and boom — it’s there. But maybe it’s too big for that wall. So I’ll shrink it a bit. Or the material that this floor has, you want to have it in the ceiling. Since it’s one of the defining characteristics, you can copy it and now the ceiling has the same material.”
For firefighter training, Rojas-Muñoz said the dimensions of a room — making a wall longer, reducing the size of the windows — could be altered, thereby changing the exercise.
A second example comes from health care: a surgeon relocates to a new hospital but wants to duplicate the operating room from the previous location. A scan of the room captures its defining features, which can then be adapted to fit the new location's constraints, such as room dimensions, lighting and building position.
“People can have more intuitive and easy-to-use control over this type of room modification inside of VR,” he said. “Essentially, someone who has no skill set on building the models or programming VR applications can re-create existing rooms and modify them to whatever they want, and then save them and use them later.”
The fellowship provides more than seed money, Rojas-Muñoz said, including leadership training workshops and events through the Texas A&M Division of Research.
“Throughout the coming year there will be several sessions, and the idea is they will help us take this idea and develop a full-fledged proposal that can later be submitted to the National Science Foundation or another funding institution,” he said. “It’s the seed money to develop an initial prototype, an initial idea of the research, conduct an initial evaluation and provide the assistance to take the idea and refine it for future external funding.”
The funding will support graduate students from the visualization or computer science programs who will assist in the research, along with the acquisition of equipment and virtual reality headsets. Rojas-Muñoz said he wants to connect with skilled graduate students interested in this type of research.
“I feel very honored, very excited,” he said. “I think this is going to be interesting. I’m happy to be able to put the name of the school out there with a great opportunity like this.”
Photo by Glen Vigus.