In the Ocotillo desert, we set out to push our software-only visual navigation system to its operational limit. In the process, we achieved a remarkable breakthrough: ultra-low (<100 ft AGL) flights over desert terrain with almost no identifiable features. Crucially, the entire system ran in real time on a limited compute node with a narrow-FOV camera.
Humans can navigate by land, sea, and air in unseen environments, yet today's navigation stacks are tailored to each platform and rely heavily on domain-specific assumptions that don't generalize.
We’re thrilled to unveil our first research paper, “TARDIS STRIDE: A Spatio-Temporal Road Image Dataset and World Model for Autonomy,” which breaks fresh ground in modeling dynamic real-world environments.