France turns to AI for signals analysis in underwater acoustics war

PARIS — The French Navy is turning to artificial intelligence to help its submariners detect enemy vessels in a growing sea of underwater sounds.

The Navy’s acoustic recognition and interpretation center, CIRA, in Toulon is working with French startup Preligens on AI-powered analysis of underwater acoustic signals, the center’s head, Vincent Magnan, said in a presentation here Thursday. France expects to test the technology onboard its submarines by the end of the year, with operational deployment scheduled for 2025.

As France equips more and more vessels with increasingly powerful passive acoustic sensors, the amount of data collected for analysis is growing exponentially. The Navy is counting on AI to help its acoustics analysts, nicknamed “golden ears,” cut through the noise, both at the Toulon center and on board its submarines.

More sensors and greater detection ranges will result in “a massive flow of data,” Magnan said. “To be able to analyze all this data, and especially to be able to isolate from it the useful and decisive information for the conduct of our combat operations, we need to resort to technological innovations, including artificial intelligence.”

In addition to submarines, frigates and aircraft fitted with passive sensors, the near future will bring drones and underwater gliders that capture acoustic data, according to Magnan. The amount of such data gathered by CIRA has increased to around 10 terabytes in 2024 from 1 terabyte in 2020, and is expected to approach 100 terabytes or more by 2030.

Interest in “passive acoustic warfare” is growing because it allows surface vessels and submarines to detect underwater sounds during operations at sea and derive tactical information in “all discretion,” without an adversary knowing about it, Magnan said. A particular propulsion pattern might allow the Navy to determine a target’s speed, which can in turn inform a tactical maneuver.

The Toulon center is using AI to isolate acoustic signals of interest, after which humans can carry out high value-added analysis. The goal will be broadly similar at sea, with AI allowing human operators to focus on the useful signals.


“So we use technology to discard or filter the standard part of the signal, the almost useless part, and we rely on humans to exploit the useful part,” Magnan said.

Sifting through 12 days of acoustic data recorded in the waters off Toulon takes two “golden ears” more than 40 working days, Magnan said. With the AI demonstrator from Preligens, extracting the useful signals from those same recordings can be done in four to five hours, followed by five to six days of human analysis. “So you can already see that the gain is enormous,” he said.
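Magnan did not describe how the Preligens demonstrator works under the hood. As a rough illustration of the filter-then-review workflow he outlines, the sketch below splits a long recording into fixed-length segments and keeps only those a scoring function flags for a human analyst; the spectral-prominence score, sample rate, threshold and every name in it are illustrative assumptions, not the Navy’s or Preligens’ actual system.

    import numpy as np

    SAMPLE_RATE = 16_000        # assumed hydrophone sample rate, Hz (illustrative)
    SEGMENT_SECONDS = 30        # length of each analysis window

    def segment_audio(samples, sample_rate, seconds):
        """Split a long recording into fixed-length segments."""
        step = sample_rate * seconds
        return [samples[i:i + step] for i in range(0, len(samples) - step + 1, step)]

    def score_segment(segment):
        """Placeholder 'interest' score: prominence of the strongest spectral
        line over the background, standing in for a trained acoustic model."""
        spectrum = np.abs(np.fft.rfft(segment))
        return float(spectrum.max() / (np.median(spectrum) + 1e-12))

    def triage(samples, threshold=20.0):
        """Keep only the segments the score flags, so analysts skip the rest."""
        segments = segment_audio(samples, SAMPLE_RATE, SEGMENT_SECONDS)
        return [i for i, seg in enumerate(segments) if score_segment(seg) >= threshold]

    if __name__ == "__main__":
        # Synthetic two-minute "recording": broadband noise with a narrowband
        # tone (a crude stand-in for a contact) between the 60- and 90-second marks.
        rng = np.random.default_rng(0)
        recording = rng.normal(0.0, 1.0, SAMPLE_RATE * 120)
        t = np.arange(SAMPLE_RATE * 30) / SAMPLE_RATE
        recording[SAMPLE_RATE * 60:SAMPLE_RATE * 90] += 3.0 * np.sin(2 * np.pi * 300 * t)
        flagged = triage(recording)
        print(f"{len(flagged)} of 4 segments flagged for 'golden ear' review:", flagged)

In a real pipeline the scoring function would be the trained model itself, and the flagged segments would be queued for the analysts Magnan describes rather than printed.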

Whereas in the 1990s and 2000s CIRA analyzed acoustic recordings of around five minutes targeted at a particular threat, the center now deals with data stretching over 40-day periods, which requires “a great deal of human capacity” to process, according to the head of the center.

In the early 2000s, a sonar operator could see out to around 20 kilometers and would monitor 10 simultaneous acoustic contacts; by 2020 that had increased to more than 200 kilometers and a hundred tracks, Magnan said. France’s third-generation ballistic missile submarines will have even greater sensor capabilities, creating a real need to ease the detection task, the commander said.

France currently operates four Le Triomphant-class nuclear-powered ballistic missile submarines and is in the process of replacing its Rubis-class nuclear-powered attack submarines with six Suffren-class vessels.

The AI model has shown “very encouraging results,” able to distinguish hobbyist boats from commercial vessels, and identify propeller speed, propulsion systems and even the number of propeller blades, according to Magnan. A future step will be combining the AI models applied to acoustics with other sources of information, including satellite, radar, visual and electromagnetic.

The team working on acoustics detection has created a tool to automatically detect and identify various acoustic sources and sound emissions that will be demonstrated at the Viva Technology show in Paris next week, said Julian Le Deunf, an expert at the Armed Forces Ministry’s newly created agency for AI in defense.

“The promising results over these last few months also encourage us to test all these capabilities in real-life conditions, so to take the jump onboard the submarine and test these models directly at sea,” Le Deunf said. “The goal for the end of the year is really to succeed in plugging the model directly behind an audio stream, behind a sensor.”

The AI project has been running at CIRA since 2021, after Magnan met with Preligens executives in October of that year. French military intelligence was already using the company’s AI products to analyze satellite imagery, and Magnan said his discussions with Preligens led to the idea that the approach could be replicated to make sense of underwater signals.

Eventually, the AI algorithms will be able to identify ambient noises such as a pump starting up or a wrench falling in a hold, according to Magnan.

“The idea in the long run is obviously to find models that are effective and efficient over the whole acoustic spectrum of the sources we encounter at sea,” he said.



from Defense News https://ift.tt/HByFdL4