A Novel Caltech Algorithm Allows Autonomous Systems to Navigate by Referencing the Surrounding Terrain, Summer or Winter

In recent years, autonomous devices have relied predominantly on GPS to navigate their environments. But a recent study has introduced a new Caltech algorithm that, for the first time, allows autonomous systems to identify their location simply by looking at the surrounding terrain, regardless of seasonal changes.

The algorithm builds on a process called 'visual terrain-relative navigation' (VTRN), first developed in the 1960s, in which an autonomous device compares the nearby terrain to high-resolution satellite images to determine its location.

However, VTRN strictly requires the terrain it is looking at to closely match the images in its database. Anything that changes or obscures the terrain, such as snow cover or fallen leaves, causes the images to mismatch and confuses the system. Thus, VTRN systems are easily thrown off unless a database of landscape images exists for every conceivable condition.

To solve this issue, the laboratory of Soon-Jo Chung, Bren Professor of Aerospace and Control and Dynamical Systems and research scientist at JPL, turned to deep learning and artificial intelligence to remove the seasonal content that hinders existing VTRN systems.

The current technique only works when the image from the satellite and the image from the autonomous system have similar content. In real-world scenarios, however, terrain changes dramatically with the seasons, so the images no longer contain the same features and cannot be directly compared.

The process developed by the team uses the “self-supervised learning” approach. The AI examines patterns in images by highlighting details and features that humans are likely to overlook. Unlike most computer-vision strategies, which rely on human annotators to train an algorithm to recognize what it sees, the self-supervised learning method allows the algorithm to teach itself.
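To make the idea concrete, here is a toy NumPy sketch of self-supervised removal of seasonal content. It is an illustration of the principle only, not the team's actual method: paired summer/winter views of the same locations supervise themselves (no human labels), and the dominant direction of their differences, here standing in for the hypothetical "seasonal" appearance change, is projected out of every feature vector before matching.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: D-dim feature vectors for N locations, each imaged in summer
# and winter. "stable" is season-invariant terrain content; "seasonal" is
# a shared appearance shift (e.g., snow cover) present only in winter.
N, D = 200, 16
stable = rng.normal(size=(N, D))
seasonal = rng.normal(size=D)
summer = stable + 0.05 * rng.normal(size=(N, D))
winter = stable + 4.0 * seasonal + 0.05 * rng.normal(size=(N, D))

# Self-supervised step: pairs of views of the same place supervise
# themselves -- no annotators. Estimate the dominant direction of the
# winter-summer differences and project it out of every feature vector.
diffs = winter - summer
_, _, vt = np.linalg.svd(diffs, full_matrices=False)
s_dir = vt[0]  # estimated seasonal direction (top singular vector)

def strip_seasonal(x):
    """Remove the component of each row lying along the seasonal direction."""
    return x - np.outer(x @ s_dir, s_dir)

def match_rate(queries, database):
    """Fraction of queries whose nearest (cosine) database entry is correct."""
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    d = database / np.linalg.norm(database, axis=1, keepdims=True)
    return float(np.mean(np.argmax(q @ d.T, axis=1) == np.arange(len(q))))

raw = match_rate(winter, summer)
cleaned = match_rate(strip_seasonal(winter), strip_seasonal(summer))
print(f"match rate, raw features: {raw:.2f}")
print(f"match rate, seasonal content removed: {cleaned:.2f}")
```

In this toy setup the raw winter features are dominated by the shared seasonal shift and match poorly against the summer database, while matching the stripped features recovers the correct locations almost every time.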

Using the new system along with the current generation of VTRN results in more accurate localization. This was demonstrated in an experiment where researchers attempted to localize images of summer foliage against winter leaf-off imagery using a correlation-based VTRN technique. They discovered that navigation failed in half of the attempts. Adding the new algorithm to the VTRN pipeline performed far better: 92 percent of attempts were matched correctly, and the remaining 8 percent could be flagged as problematic in advance.
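Correlation-based matching of the kind used as the baseline above can be sketched in a few lines. This is a generic, simplified illustration (not the researchers' pipeline): a small onboard camera view is slid over a synthetic satellite map, and the offset with the highest normalized cross-correlation is taken as the estimated position.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy satellite map and an onboard camera view: a patch of the map taken
# at a "true" offset and lightly corrupted by sensor noise.
MAP, PATCH = 64, 16
sat_map = rng.normal(size=(MAP, MAP))
true_r, true_c = 23, 41
view = sat_map[true_r:true_r + PATCH, true_c:true_c + PATCH]
view = view + 0.1 * rng.normal(size=view.shape)

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# Slide the camera view over the map and keep the best-correlating offset.
scores = np.array([
    [ncc(view, sat_map[r:r + PATCH, c:c + PATCH])
     for c in range(MAP - PATCH + 1)]
    for r in range(MAP - PATCH + 1)
])
est_r, est_c = np.unravel_index(np.argmax(scores), scores.shape)
print(f"true offset: ({true_r}, {true_c}), estimated: ({est_r}, {est_c})")
```

With mild noise the correlation peak sits at the true offset; the article's point is that strong appearance changes such as snow break exactly this kind of pixel-level matching, which is what the learned seasonal-content removal is meant to fix.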

The system also has applications for space missions. For example, VTRN was used for the first time on the Red Planet during the Mars 2020 Perseverance rover mission to land in Jezero Crater, a site previously considered too dangerous for a safe entry. The team also considered the Martian polar regions, which undergo extreme seasonal changes analogous to those on Earth, and believes the new navigation system could support scientific goals there, including the search for water.

In the near future, the team plans to extend the technology to account for weather effects such as fog, rain, and snow, which would offer significant improvements to the navigation systems of driverless cars.

Source: https://www.caltech.edu/about/news/new-algorithm-helps-autonomous-vehicles-find-themselves-summer-or-winter

Paper: https://robotics.sciencemag.org/content/6/55/eabf3320
