From Mars to the deep

Thu May 6th, 2021
WHOI

Last February, the rover Perseverance—the most advanced robot ever sent to explore another world—landed on Mars. It tore through the planet’s wispy-thin atmosphere at 12,000 mph before a drogue parachute opened to slow the spacecraft for its first glimpse of the planet’s rocky, pink surface.

Soon, the navigation system that made that journey possible could guide robots through another unexplored terrain much closer to home: the deepest trenches of the ocean. Russell Smith, an engineer with NASA’s Jet Propulsion Laboratory (JPL), is involved in both projects. He recalls the thrill of watching the rover land.

“My heart was pounding in my chest,” says Smith, who was virtually strapped in at home during the live YouTube broadcast of NASA’s Perseverance landing. Smith was personally invested: he had spent many long days and nights building a system that simulated reduced-gravity environments in order to test the Mars Helicopter, dubbed Ingenuity. The helicopter hitched a ride to Mars aboard the rover and went on to perform the first powered, controlled flight of an aircraft on another planet.

As the rover descended to an altitude of two miles above Mars, rocky boulders and steep cliffs suddenly appeared in high-def. Moments later, furious clouds of orange-red dust billowed up as Perseverance hovered just above the surface. Then, finally, its six aluminum-cleated wheels settled gently onto a flat, rock-free patch of Jezero Crater.

“It was really thrilling!” Smith says. But those “seven minutes of terror”—NASA-speak for the time it takes a spacecraft to enter, descend, and land on Mars—were also nerve-wracking. As the rover descended, he was relieved to see that the navigation systems appeared to be working. The Lander Vision System (LVS), which uses Terrain Relative Navigation (TRN), enabled Perseverance not only to land safely in “a field of dangers,” he says, but to do so within a car’s length of its target. The helicopter’s flight was one step closer.

Now, Smith has his sights set on bringing similar TRN-based navigation technology, called xVIO, into the hadal zone—the deepest, darkest reaches of the sea, extending from 6,000 to 11,000 meters below the surface. He’s working with WHOI biologist Tim Shank, WHOI research engineer Casey Machado, and others on WHOI’s HADEX program team to integrate the technology into Orpheus-class hadal robots. These relatively small, bright-orange drones are designed specifically for hadal-zone exploration.

On Earth, advanced GPS systems are sufficient for navigation—at least on land. “But deep in the ocean it’s far more difficult,” Smith says. Space and the deep ocean both lack the satellite constellations that make such navigation possible; GPS signals can’t penetrate seawater to reach a robot thousands of meters down.

TRN, however, could be an ideal solution. The concept behind the technology is simple—it works much like you do when walking around your own house. You know where you are based on the objects you see: doors, furniture, the refrigerator, a staircase. But for a robot to navigate this way, it needs a tightly integrated system of machine-vision cameras, lighting, and pattern-matching software. These components let the system reconstruct the seafloor as three-dimensional maps stitched together from images of the features it sees, such as rocks and clams. The maps are stored in the system’s memory, so when Orpheus flies back over a mapped area of the ocean, it knows where it is based on the familiar objects it sees.
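To make the idea concrete, here is a minimal sketch of the map-matching step at the heart of terrain-relative navigation: landmarks stored in a map are matched, by appearance, to features the vehicle currently sees, and the offset that best aligns the two gives the vehicle its position. Everything in this sketch (the landmark map, the descriptors, the numbers) is an illustrative assumption written for this article, not JPL’s xVIO or LVS code.

```python
# Minimal, hypothetical sketch of feature-based map matching for
# terrain-relative navigation. A stored map holds landmark positions
# (in a seafloor frame) and appearance descriptors; the vehicle matches
# what its camera sees against that map and solves for the translation
# that best aligns the two.

import numpy as np

def match_features(map_desc, obs_desc):
    """Pair each observed descriptor with its nearest neighbor in the map."""
    pairs = []
    for i, d in enumerate(obs_desc):
        dists = np.linalg.norm(map_desc - d, axis=1)  # distance to every map landmark
        pairs.append((i, int(np.argmin(dists))))      # (observed index, map index)
    return pairs

def estimate_position(map_xy, obs_xy, pairs):
    """Average offset between matched map and observed positions = vehicle position."""
    deltas = [map_xy[j] - obs_xy[i] for i, j in pairs]
    return np.mean(deltas, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Hypothetical map: 50 landmarks (rocks, clams) with 2D seafloor positions
    # and 8-dimensional appearance descriptors, built on a previous pass.
    map_xy = rng.uniform(0.0, 100.0, size=(50, 2))
    map_desc = rng.normal(size=(50, 8))

    # Simulate the vehicle at an unknown position seeing 10 of those landmarks.
    # In the vehicle's camera frame, each landmark appears shifted by that position.
    true_position = np.array([12.3, -4.7])
    seen = rng.choice(50, size=10, replace=False)
    obs_xy = map_xy[seen] - true_position + rng.normal(scale=0.1, size=(10, 2))
    obs_desc = map_desc[seen] + rng.normal(scale=0.01, size=(10, 8))

    pairs = match_features(map_desc, obs_desc)
    print("estimated position:", estimate_position(map_xy, obs_xy, pairs))
    print("true position:     ", true_position)
```

A real system estimates a full six-degree-of-freedom pose, fuses camera measurements with inertial sensors, and rejects bad matches, but the core loop is the same: recognize features you have seen before and infer where you must be to see them that way.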

But the maps are more than simply a navigation tool. They will also enable Orpheus to locate scientifically interesting features like cold seeps, hydrothermal vents, and even animals. Shank says this could be a revolutionary advance.


“These will be the most detailed maps we’ve ever had on the seafloor, with a resolution that gets down to the biological scale,” he says. “From there, we want to be able to command, ‘Go back to that clam we saw earlier’ and have the system tell the vehicle how to get back there. It’s a huge step we hope to reach.”

Another key advantage of this navigation system, according to Shank, is that it is compact. “To do conventional mapping in the ocean, you typically have to mount heavy sonars on the vehicle,” he says. This system, by contrast, adds very little weight.

But getting the technology to work well in the ocean will be a challenge. On Mars, visibility is relatively good due to the planet’s thin atmosphere (unless you find yourself in the middle of a dust storm). But the ocean is often murky, and conditions change quickly. The combination of turbulence, particulates, and even sea life swimming around a robot’s cameras can make it difficult for the system to recognize landmarks.

Shank and Smith believe the system will nevertheless be a game-changer for ocean exploration. They will field test it next month, when Orpheus and its twin robot Eurydice descend to the Blake Plateau in the Western Atlantic Ocean.

“To date, we’ve been testing the system out on a mini version of Orpheus in a tank at JPL,” Shank says. “So, we’re excited to be able to finally get it into deep ocean conditions. The seafloor is relatively flat along the Blake Plateau, but there is relief there that we think may be coral mounds and methane seeps, which we’re very interested in studying.”

For Smith, the field tests will give him yet another opportunity to watch with bated breath as an autonomous robot roams a completely different unexplored world. “The engineering challenge of working in extreme environments like this is always fun because you don't know the challenges you're going to hit,” he says. “And down there, there’s so much potential for finding interesting things.”