A new 'AI Captain' will go to sea this month for trials which, if successful, will enable the Mayflower Autonomous Ship (MAS) to navigate the Atlantic autonomously later this year.
The ship will trace the route of the original 1620 Mayflower to commemorate the 400th anniversary of the famous voyage.
Sailing from Plymouth, UK, to Plymouth, Massachusetts, with no human captain or crew, it will become one of the first full-sized, fully autonomous vessels to cross the Atlantic. The mission will further the development of autonomous ships and transform the future of marine research.
IBM and marine research organisation Promare will conduct the trial on a manned research vessel off the coast of Plymouth, UK.
The aim is to evaluate how the vessel uses its onboard AI and edge computing systems to safely navigate around ships, buoys and other ocean hazards that it is expected to meet during its record-breaking attempt on 16 September.
Don Scott, CTO of the Mayflower Autonomous Ship, said: "While the autonomous shipping market is set to grow from $90bn today to over $130bn by 2030*, many of today's autonomous ships are really just automated - robots which do not dynamically adapt to new situations and rely heavily on operator override.
"Using an integrated suite of IBM's AI, cloud, and edge technologies, we are aiming to give the Mayflower full autonomy and are pushing the boundaries of what's currently possible."
The Mayflower Autonomous Ship
MAS will rely on IBM’s advanced AI and edge computing systems to sense, think and make decisions at sea with no human intervention.
With the three hulls of the trimaran Mayflower Autonomous Ship currently reaching the final phase of construction in Gdansk, Poland, a prototype of the Mayflower's ‘AI Captain’ will first take to the water on a manned vessel - the Plymouth Quest - a research ship owned and operated by the Plymouth Marine Laboratory.
The March sea trials, which will be conducted in the waters of Smart Sound Plymouth under the watchful eye of the Quest’s human crew, will help determine how the Mayflower’s AI Captain performs in real-world maritime scenarios, and provide valuable feedback to help further refine the ship’s machine learning models.
Over the past two years, the Mayflower team has been training the ship’s AI models using more than a million nautical images collected from cameras in the Plymouth Sound, as well as open-source databases.
To meet the demands of the machine learning process, the team used an IBM Power AC922 - the same IBM POWER technology behind the world’s smartest AI supercomputers.
Now, using IBM’s computer vision technology PowerAI Vision, the Mayflower’s AI Captain should be able to independently detect and classify ships, buoys and other hazards such as land, breakwaters and debris.
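The detection step described above can be illustrated with a minimal sketch of how raw model output might be post-processed before the ship acts on it. The class names, dictionary shape and confidence threshold below are illustrative assumptions, not the Mayflower's actual configuration or the PowerAI Vision API:

```python
# Hypothetical post-processing of object-detection output. The hazard
# classes and confidence threshold are assumptions for illustration,
# not the ship's real configuration.

HAZARD_CLASSES = {"ship", "buoy", "land", "breakwater", "debris"}
CONFIDENCE_THRESHOLD = 0.6  # assumed cut-off for acting on a detection

def filter_hazards(detections):
    """Keep only confident detections of known hazard classes.

    `detections` is a list of dicts with 'label' and 'confidence' keys,
    a typical shape for object-detection model output.
    """
    return [
        d for d in detections
        if d["label"] in HAZARD_CLASSES
        and d["confidence"] >= CONFIDENCE_THRESHOLD
    ]

# Example: three raw detections, one too uncertain to act on.
raw = [
    {"label": "ship", "confidence": 0.92},
    {"label": "buoy", "confidence": 0.45},   # below threshold: ignored
    {"label": "debris", "confidence": 0.71},
]
hazards = filter_hazards(raw)
print([d["label"] for d in hazards])  # prints ['ship', 'debris']
```

In a real pipeline the filtered hazards would then feed the navigation logic, with each class potentially triggering different avoidance behaviour.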
Plymouth Sound. Picture by Jay Stone
As the Mayflower will not have access to high-bandwidth connectivity throughout its transatlantic journey, it will use a fully autonomous edge system running on Red Hat Enterprise Linux and IBM’s edge computing solutions, powered by several onboard NVIDIA Xavier devices.
While at sea, the Mayflower will process data locally, increasing the speed of decision making and reducing the amount of data flow and storage on the ship.
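The local-processing pattern can be sketched as follows; the sensor readings and summary fields are assumptions chosen for illustration, not the ship's actual telemetry:

```python
# Illustrative edge-computing pattern: collapse a high-rate sensor
# stream into a compact summary on the device, so little raw data
# needs to be stored or transmitted. The depth readings and summary
# fields are assumptions for illustration only.

def summarise_readings(readings):
    """Reduce a window of raw depth readings (metres) to one record."""
    return {
        "count": len(readings),
        "min_depth_m": min(readings),
        "mean_depth_m": sum(readings) / len(readings),
    }

window = [42.1, 41.8, 40.9, 39.7, 41.2]  # raw samples stay on the edge device
summary = summarise_readings(window)      # only this summary is retained
print(summary["count"], summary["min_depth_m"])  # prints 5 39.7
```

The design choice is the one the article describes: decisions are taken against locally computed summaries, so the vessel does not depend on a high-bandwidth link mid-ocean.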
Rob High, VP and CTO for Edge Computing, IBM, commented: "Edge computing is critical to making an autonomous ship like the Mayflower possible. The ship needs to sense its environment, make smart decisions about the situation and then act on these insights in the minimum amount of time – even in the presence of intermittent connectivity, and all while keeping data secure from cyber threats.
"IBM’s edge computing solutions are designed to support mission-critical workloads like the Mayflower Autonomous Ship, extending the power of the cloud and the security and flexibility of Red Hat Enterprise Linux all the way out to the edge of the network, even in the middle of the ocean."
Image: Bob Stone, Human Interface Technology Team, University of Birmingham
As well as following the overall mission objectives to reach Plymouth, Massachusetts, in the shortest amount of time, the AI Captain will draw on IBM’s rule management system (Operational Decision Manager - ODM) to follow the International Regulations for Preventing Collisions at Sea (COLREGs) as well as recommendations from the International Convention for the Safety of Life at Sea (SOLAS).
As the weather is one of the most significant factors impacting the success of the voyage, the AI Captain will use forecast data from The Weather Company to help make navigation decisions.
Taking into account this and other critical situational data such as depth and vessel status, the Mayflower’s AI captain is designed to operate independently in some of the most challenging circumstances. ODM also provides a completely transparent record of its decision-making process, avoiding black box scenarios.
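The combination of declarative rules and an auditable decision trail can be sketched as below. The rules, thresholds and field names are simplified illustrations, not ODM's actual API or the real COLREGs logic:

```python
# Simplified sketch of rule-driven decision making with a transparent
# decision log, in the spirit of the rule-engine approach described.
# The rules and thresholds are illustrative assumptions, not real
# COLREGs provisions.

decision_log = []  # every decision is recorded with the rule that fired

def decide(situation):
    """Apply ordered rules to a situation dict; log which rule fired."""
    rules = [
        ("give_way_to_starboard_vessel",
         lambda s: s.get("vessel_to_starboard"), "turn_starboard"),
        ("avoid_heavy_weather",
         lambda s: s.get("wave_height_m", 0) > 5, "reroute"),
        ("default_hold_course",
         lambda s: True, "hold_course"),
    ]
    for name, condition, action in rules:
        if condition(situation):
            decision_log.append({"rule": name, "action": action})
            return action

print(decide({"vessel_to_starboard": True}))  # prints turn_starboard
print(decide({"wave_height_m": 6.2}))         # prints reroute
print(decide({}))                             # prints hold_course
print(len(decision_log))                      # prints 3
```

Because each decision is appended to the log together with the rule that produced it, the full reasoning trail can be replayed afterwards, which is the "no black box" property the article attributes to ODM.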
After the current three months of evaluation with a human captain at the helm, further trials of full autonomy will begin in May.