Every wall is a mirror. Not for light, obviously, which scatters uselessly off plaster and concrete, but for radio waves. Their wavelengths are so much longer than the tiny imperfections in a building’s surfaces that those surfaces reflect them cleanly, predictably, the way a polished sheet of glass handles visible light. Engineers at the University of Pennsylvania have now exploited that quirk to build a system that lets robots see what’s hiding around corners.
That system is HoloRadar, and it lets robots reconstruct three-dimensional scenes beyond their direct line of sight. Presented at the NeurIPS conference in December 2025, it works in complete darkness and under variable lighting, where camera-based approaches would fail entirely.
The trick hinges on a counterintuitive property of radio waves. Their wavelengths are far longer than those of visible light, which has always been viewed as a drawback for imaging because it limits the resolution you can achieve. But Mingmin Zhao, an assistant professor in computer and information science at Penn and the project’s senior researcher, and his team realised that for seeing around corners this supposed weakness is actually the key advantage. “Because radio waves are so much larger than the tiny surface variations in walls,” says Haowen Lai, a doctoral student on the project, “those surfaces effectively become mirrors that reflect radio signals in predictable ways.”
Walls, floors, ceilings: all of them bounce radio signals around corners in a manner the system can exploit. Ordinary light scatters in all directions when it hits a rough surface, which is why previous attempts at non-line-of-sight imaging using lasers and sensitive cameras have struggled outside controlled laboratory settings.
“It’s similar to how human drivers sometimes rely on mirrors stationed at blind intersections,” says Lai. “Because HoloRadar uses radio waves, the environment itself becomes full of mirrors, without actually having to change the environment.” It is a neat inversion of the usual problem with radar: low resolution becomes, in this particular context, a kind of superpower.
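The mirror analogy can be made geometrically precise. For a single specular bounce off a flat wall, the radar measures the same path length as the straight-line distance to the object's virtual image, i.e. its reflection across the wall plane. Here is a minimal sketch of that fact in plain Python; the wall plane and point positions are invented for illustration and are not from the paper.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def mirror(p, q, n):
    """Reflect point p across the plane through point q with unit normal n."""
    d = dot([pi - qi for pi, qi in zip(p, q)], n)
    return [pi - 2.0 * d * ni for pi, ni in zip(p, n)]

# Toy scene: the wall is the plane x = 0; sensor and hidden object sit in x > 0.
sensor = [3.0, 0.0, 1.0]
hidden = [2.0, 4.0, 1.0]
wall_q, wall_n = [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]

# The wall acts as a mirror: the radar "sees" the object at its virtual image.
virtual = mirror(hidden, wall_q, wall_n)

# Specular bounce point: where the sensor-to-virtual-image line meets the wall.
direction = [v - s for v, s in zip(virtual, sensor)]
t = dot([q - s for q, s in zip(wall_q, sensor)], wall_n) / dot(direction, wall_n)
bounce = [s + t * d for s, d in zip(sensor, direction)]

# The one-bounce path length equals the straight-line distance to the image.
path = math.dist(sensor, bounce) + math.dist(bounce, hidden)
assert abs(path - math.dist(sensor, virtual)) < 1e-9
```

This is why a low-resolution radar return still carries usable geometry: the wall does not blur the signal, it relocates it to a predictable virtual position.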
The field of seeing around corners has been gathering momentum for over a decade. Back in 2012, the MIT computer scientist Antonio Torralba noticed that the wall of his Spanish hotel room carried a faint, inverted image of the patio outside, cast through the window acting as a pinhole camera. That observation helped launch a wave of research into non-line-of-sight imaging. Most of the work since has relied on visible light, analysing shadows or the faint reflections of laser pulses bounced off relay surfaces. These approaches can produce detailed reconstructions, but they tend to need expensive equipment and carefully controlled conditions, and they fall apart in darkness or fog.
HoloRadar sidesteps those limitations. It uses a single millimetre-wave radar sensor mounted on a mobile robot. A pulse of radio waves goes out, bounces off walls and floors, and comes back as a tangle of overlapping reflections. Separating those returns is the hard part. A single pulse might bounce two or three times before reaching the sensor, creating phantom copies of objects at mirrored locations. “In some sense, the challenge is similar to walking into a room full of mirrors,” says Zitong Lan, a doctoral student in electrical and systems engineering who worked on the project. “You see many copies of the same object reflected in different places, and the hard part is figuring out where things really are.”
To untangle all of this, the team built a two-stage AI pipeline. The first stage enhances the noisy, low-resolution radar data and identifies multiple “returns” from different bounce paths, essentially turning the radar into a kind of multi-return LiDAR. The second stage traces those reflections backward using a physics-guided model, undoing the mirroring effects to reconstruct where objects actually sit in three-dimensional space.
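Geometrically, the "unmirroring" in the second stage rests on a simple property: a reflection is its own inverse. The paper's actual pipeline is a learned, physics-guided model, but the core fact it exploits can be illustrated with a hypothetical double-bounce return, where a ghost detection folded back across the estimated wall planes, in reverse bounce order, lands on the true position. The corner geometry below is made up for the example.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def mirror(p, q, n):
    """Reflect point p across the plane through point q with unit normal n."""
    d = dot([pi - qi for pi, qi in zip(p, q)], n)
    return [pi - 2.0 * d * ni for pi, ni in zip(p, n)]

# Two walls of a corner (invented geometry): the planes x = 0 and y = 5.
wall_a = ([0.0, 0.0, 0.0], [1.0, 0.0, 0.0])
wall_b = ([0.0, 5.0, 0.0], [0.0, 1.0, 0.0])

true_pos = [2.0, 4.0, 1.0]

# A return that bounced off wall A, then wall B, looks like an object
# sitting at the twice-mirrored "ghost" location.
ghost = mirror(mirror(true_pos, *wall_a), *wall_b)

# Undoing the mirroring: reflect back across each wall in reverse order.
recovered = mirror(mirror(ghost, *wall_b), *wall_a)
assert recovered == true_pos
```

The hard part in practice, of course, is everything this sketch assumes away: estimating the wall planes from noisy data and deciding which bounce path produced which return, which is what the learned first stage is for.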
The system was tested on a mobile robot navigating real indoor environments. Across 32 distinct corners in five buildings on the Penn campus, some built as early as 1906, HoloRadar successfully reconstructed hidden corridors, walls and human subjects standing out of the robot’s line of sight.
What sets this apart from earlier radar-based attempts at non-line-of-sight perception is the practical bit. Previous systems required slow, bulky scanning equipment. HoloRadar is mobile. It runs in real time. And it doesn’t replace existing sensors like LiDAR or cameras; it complements them, adding a layer of perception that covers blind spots those tools simply cannot reach. “HoloRadar is designed to work in the kinds of environments robots actually operate in,” says Zhao.
The next step is outdoors, where intersections and urban streets pose longer distances and messier conditions. If a self-driving car could sense a pedestrian about to cross from behind a parked van, or a delivery robot could anticipate a cyclist rounding a building corner, the safety implications are considerable. We are a long way from deploying that in traffic, of course. But the walls, it turns out, have been quietly carrying the information all along.
Study link: https://waves.seas.upenn.edu/projects/holoradar/