NATO navies must build capability and capacity to master the use of data, alongside enhancing platforms, sensors, and weapons, in order to dominate the modern naval battlespace, a UK Royal Navy (RN) information warfare expert told the Paris Naval Conference in early February.
The event, held for the fourth year and co-hosted once more by the French Navy and IFRI (the French Institute of International Relations), brought together navy chiefs, senior naval operators, and experts from across NATO to discuss themes such as the decisive factors for prevailing in contested operations at sea.
The RN is transitioning to a ‘hybrid’ fleet, developing a force structure balanced between crewed, uncrewed, and increasingly autonomous systems. A core component of the capability requirement for uncrewed and autonomous systems is to provide sensing mass across broad geographical areas, so that crewed platform operations can be more effectively targeted, including in tasks like anti-submarine warfare (ASW) or critical undersea infrastructure (CUI) protection. In such areas, the use of data to provide information on where, when, and how to focus this targeted effect is increasingly significant.
Given this context and requirement, there is growing need for the collection, collation, and processing of data; then, the subsequent dissemination of the resultant information to decision-makers and operators; and finally, the application of information for operational effect. With greater activity levels occurring at sea, and larger numbers of sensors and volumes of data used to both conduct and counter such activity, mastering data and information use is becoming ever more critical, as is procurement of computer processing and storage power to manage this data.

“We know we need weapons that are capable of going much further. We know we need resilience in our naval capabilities. We need redundancy. We need the ability to operate in a degraded, denied, intermittent, and low-bandwidth (DDIL) environment. But for a modern, warfighting, hybrid navy, if we don’t also have mastery of data, we’re going to lose,” Captain Bryan McCavour – the RN’s Deputy Assistant Chief of Staff for Information Warfare, and Deputy Head of the service’s Information Warfare specialisation – told the conference.
“In effect,” said Capt McCavour, “battlespace awareness collapses if it’s not underpinned by the ability to curate and share data at pace in order to sense, decide, and effect.” “We need authoritative data catalogues, we need fused data, we need target-quality data, backed up by an assured command-and-control (C2) system that’s capable of moving this data at machine speed and making sense of it,” he added.
“What we should emphasize is fighting on the same picture,” Capt McCavour continued, referring to the need for navies to share and use a recognised common operational picture. “However, that only works if you have agile bearer networks, if you have multiple redundant systems, and if you’re able to blend agentic artificial intelligence (AI)* and computing power.” The issue is harnessing not just the data but the processing power to understand it, both in maritime C2 headquarters on land and amongst operators at sea, Capt McCavour explained. This is all crucial so that data and information can be used effectively to mitigate anti-access/area denial (A2/AD) capabilities employed in DDIL operational contexts, he added.
Much of the conference discussion noted that NATO navies are preparing to be ready for war by or before 2030. This lends a ‘real-world’ context to the development of concepts for enhancing data and information use.
“If you’re able to move that data, curate it, and analyse it, you need to be able to do it in very challenging conditions,” said Capt McCavour. NATO’s North Atlantic region, from Norway across to Greenland, lies almost wholly within Russia’s weapon engagement zone. “The ability to not only take data, analyse it, and compute it, but compress it and use broadcast data architectures that move data in kilobytes – not terabytes – will be really critical for our ability to operate in EMCON [emissions controlled] or denied environments: to maintain a surety to battlespace awareness; to deliver integrated fires; to not be detected, not be killed; and survive to keep fighting,” Capt McCavour continued. “We have to be able to operate in the enemy’s weapon engagement zone. If we can’t do that, we won’t survive: we’ll be detected, we’ll be targeted, and we’ll be killed.”
In this overarching operational context, digital enablers are crucial.
“We talk a lot about data, but I don’t think we talk enough about the things that make the data useful and available,” Capt McCavour said. Illustrating this point with the relationship between a gun and its ammunition, he explained that if data is the ammunition, then a delivery mechanism – the gun – is needed to deliver the data’s effect.
Computing power is one such enabler, and one such mechanism.
“You need a lot of computing power, and if we are going to exploit agentic AI, we absolutely have to have computing power, both ashore and at sea,” said Capt McCavour. “Computing power has to be resilient and redundant, and it has to be available in a time of conflict.”
The use of data drawn from sensors, processed, analysed, and distributed using tools like AI, and all enabled by enhanced computing power, is part of what is referred to as a navy’s ‘digital backbone’. Such capacity is sovereign in both national and NATO contexts. In wartime, Capt McCavour said, drawing on civil society and the commercial sector to provide this capacity and redundancy will be critical. “We can’t just rely on sovereign military ability to compute,” he added, noting the need to be able to ‘burst’ into commercial communication and cloud capabilities in the build-up to and during conflict.
*Agentic AI involves agents that can behave and interact autonomously in order to achieve their objectives, with multiple AI agents working together to solve complex problems.