Safety In Maritime Surveying: Dr. Edgar Rojas-Muñoz Leading Research With Augmented Reality Devices
Dr. Edgar Rojas-Muñoz, assistant professor in Visualization, is working to improve the safety of maritime surveyors by using augmented reality devices.
The American Bureau of Shipping approached Rojas-Muñoz in June 2023 to evaluate whether these devices are safe enough to be used by workers performing industrial tasks in hazardous maritime environments.
Rojas-Muñoz’s lab — the Laboratory for Extended and Mixed User Realities — is connected to the American Bureau of Shipping’s Laboratory for Ocean Innovation, located in the Haynes Engineering Building. The bureau collaborated with the Department of Ocean Engineering to fund the innovation lab as a hub for maritime research.
Dr. Freddie Witherden, assistant professor in the Department of Ocean Engineering, and four graduate students are working alongside Rojas-Muñoz to complete several experiments for the project. In one, participants completed a surveyor's tasks on a ship, including inspecting the interior and exterior, while wearing AR devices.
Sensors installed throughout the room collected data on potential accidents, such as head injuries and missteps, Rojas-Muñoz said. The data revealed whether the devices impaired the participants' ability to perceive hazards.
“The surveyors make sure everything is working properly and that there aren’t cracks or holes on board,” Rojas-Muñoz said. “It’s pretty intense work along with the hazards they have to deal with — having to climb through tiny holes, walk through narrow hallways, so the setting is already fairly hazardous.”
An AR device could benefit a maritime surveyor, Rojas-Muñoz said, but it could also introduce safety hazards.
“If the conditions they are in are already inherently dangerous, then we could be putting them in even more danger with the augmented reality devices,” he said. “This experiment is looking at what needs to be improved from augmented reality devices to keep the surveyors safe.”
Rojas-Muñoz and his team built a simulation room modeled on the steering gear room, which houses the ship's steering equipment, aboard the training ship TS Kennedy on the Texas A&M University at Galveston campus. Graciela Camacho, a graduate student in Visualization, collected measurements from the TS Kennedy to re-create the space and gathered feedback from surveyors.
“We really hope to have the insights to be able to use this technology in the future,” Camacho said. “We also really hope to have more insights about safety.”
Each participant was assigned one of three devices for the experiment: a Microsoft HoloLens 2, an optical see-through headset that displays holograms; a RealWear Navigator 520, a monocular device with a small screen positioned below the user's eye; or a smartphone used to receive information from a remote expert.
Participants had to check binders and tools and report numbers from panels while wearing the personal protective equipment a surveyor would, including coveralls, gloves, safety goggles, steel-toe boots and a safety helmet. Motion capture sensors were used to trigger hazards as participants completed their tasks.
“We are looking at whether people who are wearing these devices — is it making them hit or step on the hazards more often?” Rojas-Muñoz said. “If that is the case, that means they are in a situation where their safety is being affected by them wearing these devices.”
Randy Brooks, associate professor of practice in the College of Engineering, was one of the 60 participants in the experiment. He used a smartphone during his simulation and said he faced a few obstacles.
“The biggest challenge was getting the phone to recognize my glove,” he said. “I really had to watch my footing because there is a lot to watch out for. On ships, they have water and holes and things of different heights, so the challenge was trying to avoid one obstacle putting me in another.”
Blain Judkins and Kylee Friederichs, graduate students in Computer Engineering, and Lara Soberanis, a graduate student in Visualization, are also part of the project. Judkins served as the remote expert and was on the phone with Brooks to answer questions or aid in completing a task.
Judkins said he was curious to see how many participants bumped into certain objects because of their height, or because of the augmented screens. Friederichs said she was intrigued by the data showing how each device affects users.
“Especially the amount of time it takes users to use different devices, which corresponds to how fast they get to their task and how successful they feel,” Friederichs said. “For certain devices, it may be more difficult if they are not experienced with it.”
Phase two of the project is in discussion, Rojas-Muñoz said, and would focus on improving how the devices interact with users: adding calming elements when the user is under high stress or struggling to complete a task; expanding the user interface to adapt to its surroundings to better serve the user; and adding sensors to flag potential hazards for the user.
Rojas-Muñoz said he and his team hope to find the safest way to use these devices in all types of settings.
“This is very interesting work, and someone needs to do it,” he said. “The outcome from this project is for the people — in the sense that you want people to be safe. There is a direct benefit from this work in society, with less accidents and less potential deaths.”
Top photo: Dr. Edgar Rojas-Muñoz wears the RealWear Navigator 520 AR device during the simulation. Photo courtesy of Rojas-Muñoz.