Robots Help Ocean Explorers Map the Sea Floor

To explore the ocean, map the seafloor, and get an up-close view, ocean explorers deploy a range of powerful technologies, from ship-based sonar to autonomous underwater and remotely operated vehicles.

The crew of E/V Nautilus lowers Argus, a remotely operated vehicle, into the ocean. Image Credit: OET/Nautilus Live.

Some 5,000 meters below the surface of the Caribbean Sea, between the Cayman Islands and Jamaica, cracks in the seafloor spew scorching fluids into the ocean. At more than 400 degrees C — four times the boiling point of water — these copper-laden fluids rise in a black plume the height of two-and-a-half Empire State Buildings. Thousands of pale shrimp cluster near the fissures, along with sea anemones, snails, and mats of microbes. Fish swim by intermittently, and lonely squat lobsters crawl on nearby rocks.

Here, in the deep, in the dark, lies a system of hydrothermal vents further from the surface than any other on Earth: the Beebe Vent Field.

Because of its extreme depth, discoveries made in the vent field helped scientists understand how ocean pressure affects hydrothermal systems. Scientists aboard the British Royal Research Ship (RRS) James Cook discovered and explored the vent field in 2010. They also found the nearby Von Damm Vent Field, a shallower site with vent chimneys made of talc, a composition never before seen in an active vent field. While humans have spread across every continent and even ventured into space, there is still much to learn about the world beneath the waves that cover 71 percent of Earth's surface.

Video footage of the Beebe Vent Field, which at 5 kilometers below the waves is the deepest hydrothermal system known to humans, filmed by an ROV in 2013. Video Credit: RRS James Cook Voyage 82

“A lot of oceans in the world are not explored, meaning there’s no baseline data to inform management or any other decision-making about it,” says Mashkoor Malik, lead mapping scientist for Okeanos Explorer, a research vessel operated by the U.S. National Oceanic and Atmospheric Administration’s (NOAA) Office of Ocean Exploration and Research.

Opening Google Earth, you might assume we know the ocean well — a forgivable mistake given the rich array of trenches and ridges painted in blue. Zoom in, however, and the map becomes increasingly blotchy, even pixelated. There are limits to its resolution.

Those limits come from mapping methods. Global ocean floor maps, like Google Earth's, draw not on direct measurements but on gravity data. Satellites send radar pulses down to Earth, where they bounce off the surface of the sea. These pulses aren't suited to swimming and diving, though: radio waves quickly dissipate in water, so they can't measure the seafloor directly.

But the satellites do measure the height of the ocean surface, which, when paired with complex calculations to account for the effect of tides, waves and gravity, sketches a rough outline of the ocean’s bottom. The entire ocean floor has been mapped in this way, with the most recent effort in 2014 capturing features in the deep that exceed 5 kilometers in width.
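The core of that technique is simple timing: a radar altimeter measures how long a pulse takes to bounce off the sea surface and return, and halving that round trip gives the range. The sketch below uses hypothetical numbers purely for illustration; it is not based on any particular satellite's specifications.

```python
# Speed of light in a vacuum, m/s (atmospheric corrections are ignored
# here; real altimetry applies several of them).
C_VACUUM = 299_792_458.0

def altimeter_range(round_trip_s: float) -> float:
    """Distance from satellite to sea surface, from two-way travel time."""
    return C_VACUUM * round_trip_s / 2.0

# Illustrative example: a satellite orbiting roughly 800 km up hears its
# echo after about 5.3 milliseconds.
rng = altimeter_range(5.336e-3)
print(f"range to sea surface: {rng / 1000:.0f} km")
```

Subtracting that range from the satellite's precisely known orbital height yields the height of the sea surface itself; after correcting for tides and waves, the remaining bumps and dips trace the gravitational pull of seafloor features below.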

A screenshot of Google Earth shows the texture of the seafloor. This image centers on the Cayman Trough, the ocean basin in which the Beebe Vent Field lies. Image Credit: Google Earth

Seeing with Sonar

But at that resolution, any feature smaller than 5 kilometers across simply won't appear on scans. For more precise images, researchers don't rely on outer space but instead take to the waves, ditching radar for sonar.

Sonar devices are mounted on the undersides of ships dedicated to exploration, like NOAA’s Okeanos Explorer and the Ocean Exploration Trust’s Exploration Vessel (E/V) Nautilus, both of which sport a 30 kHz Kongsberg Simrad EM302 echosounder. Arranged in a T- or U-shape, these devices hurl sound waves into the sea, creating a fan of imaging below and to the sides of the vessel. Those waves bounce off ocean terrain, sending echoes back to the ship that tell the system how far away a tangible object is and what it looks like. It’s a technology called multibeam sonar.

Multibeam sonar attached to a ship can achieve map resolutions of tens to hundreds of meters. But the deeper the seafloor, the lower the map resolution: the sound beams spread as they travel, covering an ever-wider footprint by the time they strike bottom, and their echoes must climb much farther to get back to the sonar transducer.
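Both effects fall out of simple geometry. The sketch below uses a typical average sound speed in seawater and an illustrative beam width; the actual values vary by instrument and water conditions, and these numbers are not tied to any specific echosounder.

```python
import math

# A commonly used average for seawater, m/s; the true value varies
# with temperature, salinity, and pressure.
SOUND_SPEED_SEAWATER = 1500.0

def echo_depth(two_way_time_s: float) -> float:
    """Seafloor depth implied by the two-way travel time of one ping."""
    return SOUND_SPEED_SEAWATER * two_way_time_s / 2.0

def beam_footprint(depth_m: float, beamwidth_deg: float) -> float:
    """Width of seafloor covered by one beam; grows linearly with depth."""
    return 2.0 * depth_m * math.tan(math.radians(beamwidth_deg) / 2.0)

# A ping that returns after 4 seconds implies about 3,000 m of water...
print(echo_depth(4.0))  # 3000.0

# ...where even a narrow 1-degree beam already spans a patch of seafloor
# more than 50 meters wide, limiting what details the map can resolve.
print(round(beam_footprint(3000.0, 1.0), 1))  # 52.4
```

This is why flying a sonar on a robot a few tens of meters above the bottom, rather than pinging from a ship at the surface, shrinks the footprint dramatically and sharpens the map.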

In the 1980s, a new solution emerged: robots. Researchers began using autonomous underwater vehicles (AUVs), underwater watercraft that dart through the blue on a pre-programmed course. Multibeam sonar turns these scientific robots into mapping machines.

The Autosub6000 AUV, which was lowered from a research ship into the ocean to locate the Beebe and Von Damm Vent Fields. Image Credit: Jon Copley

Robots Swimming on Their Own

AUVs entail lower costs and less risk than crewed submersibles.

And they’ve opened new frontiers in ocean exploration, journeying underneath ice and traveling to new depths. Among AUVs, a small number of vehicles can even venture 5,000 meters or more below the surface. Getting so close to the seafloor means getting maps with much higher resolution, along the lines of 4 or 5 meters, with some scans mapping objects just half a meter wide.

“It’s like looking at a blurred image and suddenly the lights go on and everything looks pin-sharp,” says Russell Wynn, chief scientist of the Marine Autonomous and Robotics Systems (MARS) group at the UK’s National Oceanography Centre. “That’s what an AUV gives you.”

AUVs are commonly used to study hydrothermal vents, such as the Beebe Vent Field, and undersea volcanoes. In the middle of the Atlantic Ocean, for example, the Massachusetts-based Woods Hole Oceanographic Institution (WHOI) deployed an AUV called the Autonomous Benthic Explorer (ABE). Plunging some 900 meters into the ocean, a depth equal to the combined length of eight football fields, the vehicle mapped Lost City, a field of hydrothermal vents atop a 4,200-meter undersea mountain.

Research using AUVs to collect data has grown rapidly since the mid-2000s. By using AUV-mounted sonar alongside ship-based sonar, ocean explorers garner the benefits of both. Ships equipped with multibeam sonar can map much larger areas than an AUV, but the robots can bring the ocean world into sharper focus with greater detail.

Autonomous vehicles have their limits, however. Once set in the water, they’re stuck to one pre-programmed course and have limited, if any, communication with the ship. Sometimes, AUVs get lost, as happened with ABE, the vehicle that, among its 222 research dives, explored the Lost City Hydrothermal Field. While off the coast of Chile in 2010 — during a mission gathering data on the movements of tectonic plates — ABE lost contact with the research vessel and disappeared.

Getting Down Deep and Up Close

Some vehicles are easier to track. Unlike AUVs, remotely operated underwater vehicles (ROVs) remain tethered via cable to a ship. While this makes them slower and less agile than AUVs, it keeps them connected to a constant source of power and enables them to communicate with their ship in real time.

ROV operators control the instrument’s movements directly, either from the nearby ship or from hundreds of miles away. Their precision bests even AUVs: ROVs can create maps capturing objects that are just a few centimeters wide. On the seafloor, these vehicles can send real-time images back to the ship, capture video and take objects back to the surface for further study.

In 2010, it was an ROV, HyBIS, that let researchers aboard the RRS James Cook photograph the rugged terrain and abundant life at the Beebe and Von Damm Vent Fields. The crew first surveyed a large area with multibeam sonar before deploying the Autosub6000 AUV 60 meters above the seafloor to map the area in greater detail.

Doug Connelly, a marine geoscientist who led the expedition, explains that locating the vent fields could once have taken weeks, with crews casting an electronic device called a CTD (conductivity, temperature, depth) into the water over and over again for measurements, tracking chemical changes and triangulating to find hydrothermal fields. But with multibeam sonar on ship, AUV, and ROV, they found the exact location of the vent sites in just days.

Only after that could the researchers lower the ROV, capturing photos, videos and water samples. “We still need a human-directed presence to collect samples,” says Jonathan Copley, a marine biologist who led the expedition. “What’s great is, while you’re doing that, your AUV can be off scouting for the next vent field.”

Some 160 meters deep near the Channel Islands, the Hercules ROV examines sedimentary rocks. Image Credit: OET/Nautilus Live

An Important First Step for Science

Using these three technologies in tandem, a “three-phase nested survey,” lets teams home in on areas of interest at higher resolutions. Many projects use multiple technologies concurrently, since ship time is expensive and multitasking saves money.

Around the Channel Islands off the coast of Southern California, scientists and engineers on board E/V Nautilus employed a similar sequence, swapping AUVs out for autonomous surface vehicles (ASVs). In 2017, they mapped old, submerged beaches off the islands and explored underwater caves.

Nicole Raineault, who heads exploration and science operations for Nautilus, says their ROVs are equipped with downward-facing sonar for mapping and forward-facing sonar to image caves.

Through multibeam sonar and a suite of associated technologies, researchers are illuminating areas once shrouded by darkness and depth, recording the baseline data demanded, for example, by grant committees that fund hypothesis-driven research.

“I always view exploration as a precursor to the traditional scientific method,” which entails testing particular hypotheses instead of exploring more generally, says Raineault, calling exploration “a very important first step for science.”

Andrew Urevig is a 2017 Earthzine Writing Fellow and freelance science writer. Follow him on Twitter @aurevig.

Editor’s Note: This article was updated on March 13, 2018, to add information about transducer shapes and clarify the purpose of Nautilus mapping. 
