Duke University researchers developed WildFusion, a framework combining visual, tactile, and vibrational data for outdoor robot navigation.

Sensory input, including touch, smell, hearing, and balance, helps the brain interpret and respond to environmental conditions. These senses are essential for seemingly simple tasks, such as walking across uneven terrain.
Visual cues from the canopy overhead help with navigation, while the sound of snapping branches or the feel of moss underfoot offers information about ground stability. Noise from a falling tree or branches moving in strong winds may signal nearby hazards.
Robots typically rely on visual inputs such as cameras or lidar for navigation, and multisensory navigation remains a complex task for machines. Forests, with dense vegetation, physical obstacles, and irregular terrain, present significant challenges for conventional robotic systems.
Researchers from Duke University have developed a framework named WildFusion that combines visual, tactile, and vibrational data to support robotic navigation in outdoor environments. The work was recently accepted to the IEEE International Conference on Robotics and Automation (ICRA 2025), which will take place May 19-23, 2025, in Atlanta, Georgia.
“WildFusion opens a new chapter in robotic navigation and 3D mapping,” said Boyuan Chen, the Dickinson Family Assistant Professor of Mechanical Engineering and Materials Science, Electrical and Computer Engineering, and Computer Science at Duke University. “It helps robots to operate more confidently in unstructured, unpredictable environments like forests, disaster zones and off-road terrain.”
“Typical robots rely heavily on vision or LiDAR alone, which often falter without clear paths or predictable landmarks,” added Yanbaihui Liu, the lead student author and a second-year Ph.D. student in Chen’s lab. “Even advanced 3D mapping methods struggle to reconstruct a continuous map when sensor data is sparse, noisy or incomplete, which is a frequent problem in unstructured outdoor environments. That’s exactly the challenge WildFusion was designed to solve.”
WildFusion, built on a quadruped robot, combines several sensors, including an RGB camera, lidar, inertial sensors, contact microphones and tactile sensors. The camera and lidar capture the environment's geometry, color and distance, while the contact microphones and tactile sensors add acoustic vibration and touch data.
As the robot moves, contact microphones record the vibrations produced by each step, helping distinguish surface types such as dry leaves or mud. Tactile sensors measure the force applied to each foot to detect surface stability. These inputs are used alongside an inertial sensor that tracks the robot's acceleration and body motion, such as tilt and roll, as it moves over uneven terrain.
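To make the idea of a vibration signature concrete, the snippet below shows one simple way a contact-microphone recording of a footstep could be summarized with spectral statistics. This is only an illustrative stand-in: WildFusion itself feeds these raw signals through learned encoders, as described next, rather than through hand-crafted features like these.

```python
# Illustrative only: summarizing a footstep vibration with spectral features.
# All values and thresholds here are assumptions for demonstration.
import numpy as np

def vibration_features(signal, sample_rate=16000):
    """Summarize a short footstep recording with a few spectral statistics."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    energy = spectrum.sum() + 1e-9
    centroid = (freqs * spectrum).sum() / energy        # "brightness" of the impact
    high_ratio = spectrum[freqs > 2000].sum() / energy  # crunchy surfaces like dry leaves skew high
    return np.array([np.log(energy), centroid, high_ratio])

# Hypothetical footstep clip: dry leaves tend to produce broadband,
# high-frequency energy, while mud damps the impact.
clip = np.random.randn(4096) * 0.1
print(vibration_features(clip))
```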
Each type of sensory data is processed through specialized encoders and combined into a single, comprehensive representation. The core component of WildFusion is a deep learning model based on implicit neural representations. Unlike traditional methods that treat the environment as discrete points, this approach models surfaces and features continuously, allowing the robot to make more informed decisions about foot placement, even with limited or uncertain visual input.
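As a rough illustration of that idea (a minimal sketch, not the authors' released code), the snippet below shows in PyTorch how separate encoders for hypothetical vision, lidar, audio and tactile inputs can be fused into a single latent code, which an implicit decoder then queries at arbitrary 3D points to produce continuous occupancy and traversability estimates. Every class name, dimension and fusion choice here is an assumption made for illustration.

```python
# Illustrative sketch of a multimodal implicit representation (not the authors' code).
import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Placeholder encoder: maps one preprocessed sensor stream to a feature vector."""
    def __init__(self, in_dim, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                 nn.Linear(128, feat_dim))

    def forward(self, x):
        return self.net(x)

class ImplicitDecoder(nn.Module):
    """Maps (3D query point, fused latent) -> occupancy and traversability."""
    def __init__(self, latent_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3 + latent_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU())
        self.occupancy_head = nn.Linear(hidden, 1)
        self.traversability_head = nn.Linear(hidden, 1)

    def forward(self, points, latent):
        # points: (N, 3); the fused latent is shared by every query point
        h = self.net(torch.cat([points, latent.expand(points.shape[0], -1)], dim=-1))
        return torch.sigmoid(self.occupancy_head(h)), torch.sigmoid(self.traversability_head(h))

# Hypothetical per-modality inputs, already preprocessed into fixed-size vectors
encoders = {"vision": ModalityEncoder(512), "lidar": ModalityEncoder(256),
            "audio": ModalityEncoder(128), "tactile": ModalityEncoder(32)}
inputs = {"vision": torch.randn(512), "lidar": torch.randn(256),
          "audio": torch.randn(128), "tactile": torch.randn(32)}

# Fuse by concatenating per-modality features into one latent code
latent = torch.cat([encoders[m](inputs[m]) for m in encoders], dim=-1)
decoder = ImplicitDecoder(latent_dim=latent.shape[-1])

# Query the continuous representation at arbitrary 3D points
query_points = torch.rand(1000, 3)
occupancy, traversability = decoder(query_points, latent)
```

The design choice mirrored here is the one the researchers highlight: because the decoder takes a 3D coordinate as input, the map is a continuous function rather than a fixed grid of discrete points, so it can be evaluated anywhere even where sensor coverage is sparse.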
“Think of it like solving a puzzle where some pieces are missing, yet you’re able to intuitively imagine the complete picture,” explained Chen. “WildFusion’s multimodal approach lets the robot ‘fill in the blanks’ when sensor data is sparse or noisy, much like what humans do.”
WildFusion was tested at Eno River State Park in North Carolina, near Duke's campus, where it enabled a robot to navigate environments including forests, grasslands and gravel paths. “Watching the robot confidently navigate terrain was incredibly rewarding,” Liu shared. “These real-world tests proved WildFusion’s remarkable ability to accurately predict traversability, significantly improving the robot’s decision-making on safe paths through challenging terrain.”
The team plans to expand the system with additional sensors, such as thermal or humidity detectors, to improve how the robot responds to different environmental conditions. Its modular design also supports applications beyond forests, including disaster response across varied terrain, inspection of remote infrastructure and navigation in other unstructured areas.
This research was supported by DARPA (HR00112490419, HR00112490372) and the Army Research Laboratory (W911NF2320182, W911NF2220113).
“WildFusion: Multimodal Implicit 3D Reconstructions in the Wild.” Yanbaihui Liu and Boyuan Chen. IEEE International Conference on Robotics and Automation (ICRA 2025).
Project Website: http://generalroboticslab.com/WildFusion
General Robotics Lab Website: http://generalroboticslab.com
Edited by Puja Mitra, WTWH Media, for Control Engineering, from a Duke University news release.