Johns Hopkins University Applied Physics Laboratory LLC


Johns Hopkins APL’s Glassworld Project Opens a Wider Window on Situational Awareness

January 21, 2022


By Ajai Raj


Glassworld fuses APL's expertise in multiple domains to give SOCOM operators an unprecedented level of situational awareness in the field.

Credit: Johns Hopkins APL/Taylor Buck

Scientists and engineers at the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, are working to give Special Operations Command (SOCOM) operators an enhanced level of awareness in the field, equipping them with capabilities - like the ability to see through walls and a mini-map indicating the locations of teammates, adversaries and important landmarks - that could make the difference between success and failure, or even life and death.

Known as Glassworld, the project brings together APL's expertise in robotics, sensors, machine learning and mixed reality (MR) with off-the-shelf augmented reality (AR) technology to extend operators' line of sight, give them a map of their environment complete with relevant landmarks and feed them crucial contextual information, such as how an adversary is armed - all without GPS.

Glassworld fuses visual data from cameras with 3D light detection and ranging (LIDAR) data, which scans the physical surfaces of the environment and returns them as a point cloud. The results are processed using a technique called simultaneous localization and mapping, or SLAM (in place of GPS), and sent to a Microsoft HoloLens 2, a head-mounted AR display, giving the user the illusion of seeing through walls and letting them visualize people and landmarks in the environment. All of the necessary equipment can be mounted on robots, drones or any other vehicle, creating a self-contained system that can be deployed in the field.
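The core idea can be sketched in miniature: given a headset pose estimated by SLAM, an object at a known position in the shared map (say, a teammate behind a wall) can be transformed into the headset's frame and projected onto its display. Everything below, including the pose representation, the pinhole projection and the sample coordinates, is an illustrative assumption rather than Glassworld's actual implementation:

```python
import numpy as np

def world_to_headset(points_world, R, t):
    """Transform world-frame points into the headset frame.
    R (3x3) and t (3,) are the headset's SLAM-estimated pose,
    i.e. rotation and position in the world frame."""
    # Inverse rigid transform: R^T (p - t), applied row-wise
    return (points_world - t) @ R

def project(points_headset, f=1.0):
    """Pinhole projection of headset-frame points to normalized
    image coordinates; keeps only points in front of the camera."""
    in_front = points_headset[:, 2] > 0
    p = points_headset[in_front]
    return p[:, :2] * (f / p[:, 2:3])

# A teammate's world position, e.g. reported via the robot's map
teammate = np.array([[2.0, 0.0, 5.0]])

R = np.eye(3)      # headset orientation (identity: looking down +z)
t = np.zeros(3)    # headset position at the world origin

# Where a "through-the-wall" marker would land on the display
marker_px = project(world_to_headset(teammate, R, t))
```

In a real system the pose would come from the SLAM solver and the projection from the headset's calibrated camera model, but the geometry is the same: map-frame position in, screen coordinates out.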

Glassworld originated from the combination of two internal projects at APL: Wall Hacks in Real Life (WHIRL) and Flying Lidar On-board: Real-time Indoor/Outdoor Dense 3D Reconstruction (FLO-RID3R). WHIRL focused on using LIDAR and the HoloLens to enable operators to see through walls, while FLO-RID3R used LIDAR sensors mounted on aerial drones to generate 3D maps in real time. The success of those projects earned both teams additional funding to merge their work into a system that could be used by operators in the field.

The central challenge the combined team faced from the user's point of view was presenting all the necessary information in an unobtrusive manner, said Rebecca Crockett, an assistant program manager in APL's Asymmetric Operations Sector (AOS) and project manager of Glassworld.

"There's a definite balancing act to presenting the right amount of information in an intuitive way, so that it's available when it's needed and fades into the periphery when it isn't," Crockett said. "It can't be overwhelming or distracting because, for SOCOM operators in the field, their lives are on the line."

In addressing that challenge, the team drew heavily on lessons from video games, a medium in which developers have been addressing the very same problem for decades. The first-person shooter genre - which includes some of the most popular game franchises of all time, like Halo and Call of Duty - was the primary inspiration, according to Stephen Bailey, who led WHIRL and is leading user interface development on Glassworld.

"The concept of 'wall hacks' that inspired WHIRL came from a way that players cheat in these kinds of games, and it was the easiest way to express what we were going for," said Bailey, a software engineer who specializes in data visualization and decision support in APL's Air and Missile Defense Sector. "People have become very comfortable with features like a mini-map that shows navigation waypoints, and a heads-up display (HUD) that gives you contextual information. It's easy to jump into a new game and instantly grasp what's going on. We want to replicate that 'pick-up-and-play' quality with the Glassworld interface."

The team also had to overcome a number of daunting technical problems. As on the visualization side, the technical hurdles came down to processing a tremendous amount of information without overwhelming the capabilities of the hardware.

"The biggest challenge from a technical standpoint has been the localization piece - the headset and the robot need to know where each of them is at all times," said Miguel Rufino, a robotics engineer in APL's Force Projection Sector and the technical lead on Glassworld. "That involves keeping track of a tremendous amount of visual and LIDAR data, and updating it in real time."

The design and engineering challenges are inseparable from one another, because any lag in the system can have an immediate and disruptive impact on the user.

"The HoloLens isn't built for capturing and displaying a lot of data, and if the system slows down, you're moving your head around and the holograms are stuttering about, and that's not a good experience," Rufino said. "So, it's about figuring out how to process more information on the robot side without taking too much processing power away from the headset. Finding that sweet spot isn't easy, but we're definitely seeing improvements."

In the current phase of the ongoing work, the team hopes to eliminate the need for calibration, improve the user interface, add the ability to identify adversary weaponry and create a fieldable prototype of the Glassworld system, which has thus far been limited to laboratory testing.

As the concept matures, Lou Colangelo, APL's Special Operations mission area executive, anticipates the system being deployed in a variety of formats and environments and being used by forces around the world.

"This concept is undeniably the future, and there are very few theoretical limits to how and where it could be used," Colangelo said. "It could make use of ground-based sensors, autonomous vehicles and even aerial and sea vehicles to assist special operators on missions. And eventually, we imagine it will be used not only by special operators, but by military forces and law enforcement as well."

Looking farther into the future, Hirsh Goldberg, who led the FLO-RID3R effort and is advising the Glassworld team, envisions a platform that not only delivers enhanced awareness but also serves as an active partner on missions.

"The next big step will be incorporating autonomous navigation into a system like Glassworld, which could be based on commands given by the operator to the platform," Goldberg said. "'Search around that corner,' 'Inspect that object' and so on. When we're able to combine the unique capabilities already afforded by Glassworld with the ability to respond dynamically to human commands, we'll have a platform that can act as another team member in the field with its own complementary set of 'eyes.' This can be a critical enabler for reconnaissance operations or search and rescue missions."

Crockett attributed Glassworld's success to its cross-disciplinary nature, drawing on talent and expertise from across APL.

"We're trying to not just create, but actually implement a truly novel capability, which is a difficult balance to achieve compared to doing research alone," Crockett said. "For that kind of an effort to be successful, it takes more than just one sector. Fortunately, we've been able to pull together a variety of skill sets and perspectives into an amazing and capable team."

Media contact: Amanda Zrebiec, 240-592-2794, [email protected]

The Applied Physics Laboratory, a not-for-profit division of The Johns Hopkins University, meets critical national challenges through the innovative application of science and technology. For more information, visit www.jhuapl.edu.