
Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea

By Keith Cowing
Status Report
NOAA
February 18, 2026
Remotely operated vehicle MiniROV in a simulated environment during the training of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Image courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.

Editor’s note: When we start to mount astrobiology missions to explore ocean worlds, we’ll need ways for our robotic submersibles to observe and interact with whatever life forms they may encounter. We’re going to need our droids to be as smart and self-reliant as possible. Studying our own living water world is the perfect way to prepare for future, off-world missions to detect and characterize life. Of special importance to astrobiology expeditions are the plans being formulated for the exploration of icy ocean worlds in our own solar system, such as Enceladus, Europa, and Ganymede, and the necessary role that autonomous submersibles will play in those missions.


Overview

For the past two years, engineers and scientists have been developing a way for artificial intelligence (AI)-driven underwater vehicles to autonomously find, follow, and identify deep-sea animals in real time, with limited human oversight. Their Deployable AI will help scientists, policymakers, and the public understand the life that inhabits our ocean better and more rapidly.

Modern robotics, low-cost observation platforms, and other emerging exploration tools make underwater imaging easier. However, searching for and following animals, and then analyzing all the resulting imagery, takes a lot of effort. This new technology aims to overcome these major obstacles to discovery.

How It Works

The Deployable AI technology consists of hardware (cameras and a compact computer) and software (with several computational algorithms) to enable detection and tracking of underwater animals by a remotely operated vehicle (ROV) or autonomous underwater vehicle (AUV). During a dive, “detector” and “supervisor” algorithms review live video, looking for animals they’ve been trained to recognize (e.g., fish, jellyfish, siphonophores, and comb jellies), similar to facial recognition.
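To make the detection step concrete, the sketch below shows a minimal frame-by-frame loop of the kind described above. It is an illustration under assumptions, not the project’s actual code: the choice of an off-the-shelf YOLO detector (via the ultralytics package), the model file name, the class list, the camera stream URL, and the thresholds are all hypothetical stand-ins.

```python
# Illustrative sketch only: the detector framework, model file, class list,
# stream URL, and thresholds are assumptions, not the project's actual code.
import cv2
from ultralytics import YOLO

TARGET_CLASSES = {"fish", "jellyfish", "siphonophore", "ctenophore"}  # assumed labels
CONFIRM_FRAMES = 10  # "supervisor"-style check: require a persistent detection

model = YOLO("midwater_animals.pt")            # hypothetical model trained on labeled imagery
cap = cv2.VideoCapture("rtsp://rov-camera/stream")  # hypothetical live video feed

streak = 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    result = model(frame, verbose=False)[0]    # run the detector on the current frame
    hits = [
        (box, float(conf))
        for box, conf, cls in zip(result.boxes.xyxy, result.boxes.conf, result.boxes.cls)
        if model.names[int(cls)] in TARGET_CLASSES and conf > 0.5
    ]

    # Only declare a target once detections persist across frames; this filters
    # out single-frame false positives such as marine snow or glare.
    streak = streak + 1 if hits else 0
    if streak >= CONFIRM_FRAMES:
        target_box = max(hits, key=lambda h: h[1])[0]  # highest-confidence detection
        print("Target confirmed:", target_box.tolist())
        break

cap.release()
```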

Once they find something of interest, another “agent” algorithm works with the vehicle control algorithm to maneuver the vehicle and follow the animal, moving slowly and staying far enough away to avoid disturbing it, while continuing to image it for as long as possible.
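The hand-off from detection to vehicle control can be pictured as a feedback loop. The sketch below is a minimal proportional controller that keeps the animal centered in the image while holding a gentle standoff distance; the command structure, gains, and standoff range are hypothetical, and the project’s actual agent is a trained policy rather than a hand-coded controller.

```python
# Minimal sketch of the control objective: keep the animal centered in the
# frame and hold a standoff distance. Gains and ranges are illustrative only.
from dataclasses import dataclass

@dataclass
class ROVCommand:
    yaw: float     # positive = turn right
    heave: float   # positive = move up
    surge: float   # positive = move forward

K_YAW, K_HEAVE, K_SURGE = 0.4, 0.4, 0.2
STANDOFF_M = 2.0  # stay roughly this far away to avoid disturbing the animal

def follow(target_box, frame_w, frame_h, range_m):
    """Convert a bounding box (x1, y1, x2, y2) and an estimated range
    into small corrective thruster commands."""
    cx = (target_box[0] + target_box[2]) / 2
    cy = (target_box[1] + target_box[3]) / 2

    # Normalized error of the animal's position from the image center (-1..1).
    err_x = (cx - frame_w / 2) / (frame_w / 2)
    err_y = (cy - frame_h / 2) / (frame_h / 2)
    err_range = range_m - STANDOFF_M  # positive = too far away

    return ROVCommand(
        yaw=K_YAW * err_x,           # turn toward the animal
        heave=-K_HEAVE * err_y,      # image y grows downward, so invert
        surge=K_SURGE * err_range,   # creep forward only if beyond standoff
    )

# Example: animal slightly right of center, 3 m away, in a 1920x1080 frame.
print(follow((1100, 500, 1200, 620), 1920, 1080, 3.0))
```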

Initial Testing

To ensure the agent could perform these tasks effectively, it underwent additional training in a simulated environment with an ROV and animals, much like playing a video game. After the team was satisfied with the agent’s performance, they integrated it on MBARI’s (Monterey Bay Aquarium Research Institute’s) MiniROV, since ROVs offer greater control for testing purposes, and then tested and refined it, teaching it to follow an artificial jellyfish mimic in a 10-meter-deep (33-foot-deep) test tank.
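Training an agent against a simulator typically follows a standard trial-and-reward loop. The toy sketch below shows that structure with a made-up one-dimensional environment and a placeholder proportional policy; the environment, reward terms, and policy are assumptions for illustration and bear no relation to MBARI’s actual simulator or training setup.

```python
# Toy illustration of simulator-based training; not the project's setup.
import numpy as np

class ROVFollowSim:
    """Stand-in simulator: the 'animal' drifts randomly in one dimension and
    the agent chooses a thrust to hold a desired standoff range."""
    def __init__(self, standoff=2.0):
        self.standoff = standoff
        self.reset()

    def reset(self):
        self.animal = np.random.uniform(1.0, 5.0)  # animal position (m)
        self.rov = 0.0                             # ROV position (m)
        return self.animal - self.rov              # observation: current range

    def step(self, thrust):
        self.rov += float(np.clip(thrust, -0.5, 0.5))
        self.animal += np.random.normal(0, 0.05)   # animal drifts slightly
        rng = self.animal - self.rov
        reward = -abs(rng - self.standoff)         # best when standoff is held
        done = abs(rng) > 10.0                     # lost the animal
        return rng, reward, done

# Placeholder policy: creep toward the standoff range. A real training run
# would replace this with a learned policy updated from the rewards.
env = ROVFollowSim()
obs = env.reset()
total = 0.0
for _ in range(200):
    action = 0.3 * (obs - env.standoff)
    obs, reward, done = env.step(action)
    total += reward
    if done:
        obs = env.reset()
print(f"episode return: {total:.1f}")
```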

Screenshot from the FathomNet Database, a publicly available underwater image database used to train the Deployable AI developed to find, follow, and identify deep-sea animals as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. The jellyfish seen here was observed during an expedition led by NOAA Ocean Exploration, one of a number of contributors to the database. Image courtesy of MBARI/FathomNet, NOAA Ocean Exploration.

Next Stop, the Ocean

In October 2024, the agent was ready for real-world testing in Monterey Bay. Deployed on MiniROV from Research Vessel Rachel Carson, it performed both tasks relatively well — finding and following siphonophores, comb jellies, and jellyfish — but still needed work.

Screenshots captured in the control room of Research Vessel Rachel Carson during the first field test on remotely operated vehicle MiniROV of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Image courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.

Astrobiology, oceanography, AI

Explorers Club Fellow, ex-NASA Space Station Payload manager/space biologist, Away Teams, Journalist, Lapsed climber, Synaesthete, Na’Vi-Jedi-Freman-Buddhist-mix, ASL, Devon Island and Everest Base Camp veteran, (he/him) 🖖🏻