
Underwater Communications and the Challenge of Ocean Connectivity

In May 2025, The Networking Channel brought together a panel of leading researchers for a deep technical discussion on underwater communications and networks. Organised by Marco Ajmone Marsan of IMDEA Networks Institute and Politecnico di Torino, the webinar explored how connectivity can be extended into one of the most challenging environments on Earth: the underwater domain. The session highlighted not only how far the field has progressed, but also why underwater networking remains fundamentally different from the terrestrial and satellite systems that dominate today’s connectivity landscape.

The discussion opened by setting the context for underwater communications as an enabling technology for ocean observation, industrial operations and autonomous systems. From seabed sensors and autonomous underwater vehicles to surface buoys and gateways linking back to shore, the vision is of a heterogeneous network that spans the seabed, the water column and the surface. Cabled connections are often impractical due to cost, weight and lack of flexibility, making wireless communication essential despite the severe constraints imposed by the underwater channel. 

A central theme of the webinar was the dominance of acoustics as the only viable technology for long range underwater communication. Radio waves attenuate rapidly in water, while optical links, although capable of high data rates, are limited to short distances and require precise alignment. Acoustic signals, by contrast, can travel over kilometres, but at the cost of extremely limited bandwidth and high latency. These characteristics shape every layer of system design, from physical layer signal processing through to network protocols and applications. 
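As a rough illustration of why latency dominates acoustic link design, the sketch below compares the one-way propagation delay of an acoustic link with that of a radio link of the same length. The sound speed and link distances are assumed, order-of-magnitude values, not figures from the webinar.

```python
# Rough comparison of one-way propagation delay: acoustic vs radio.
# Values are illustrative assumptions, not measurements from the webinar.

SOUND_SPEED_WATER = 1500.0      # m/s, nominal speed of sound in seawater
SPEED_OF_LIGHT = 3.0e8          # m/s, propagation speed of a radio wave

for distance_m in (100, 1_000, 5_000):
    acoustic_delay = distance_m / SOUND_SPEED_WATER
    radio_delay = distance_m / SPEED_OF_LIGHT
    print(f"{distance_m:>6} m: acoustic ≈ {acoustic_delay * 1e3:8.1f} ms, "
          f"radio ≈ {radio_delay * 1e6:6.2f} µs")

# A 5 km acoustic link already carries more than three seconds of one-way
# delay, roughly a million times that of a radio link of the same length.
```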

At the physical layer, the underwater acoustic channel was described as hostile and unpredictable. Severe path loss forces operation at very low frequencies, while ambient noise from shipping, wind and waves creates a non-white noise environment. The usable bandwidth is often small in absolute terms, yet large relative to the carrier frequency, which means underwater acoustic systems behave as inherently broadband systems. This has major implications for modulation, synchronisation and receiver design. Doppler effects caused by relative motion between transmitter and receiver can significantly distort signals, while multipath propagation, driven by reflections and sound speed variations with depth, introduces long delay spreads that challenge conventional equalisation techniques.
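To make the "inherently broadband" point concrete, the short sketch below compares the fractional bandwidth and Doppler scaling of an acoustic link against a typical radio link. The carrier frequencies, bandwidths and speeds are assumed but representative numbers, chosen only for illustration.

```python
# Fractional bandwidth and Doppler factor: acoustic vs radio.
# All numbers below are illustrative assumptions.

def fractional_bandwidth(bandwidth_hz, carrier_hz):
    """Bandwidth as a fraction of the carrier frequency."""
    return bandwidth_hz / carrier_hz

def doppler_factor(relative_speed, propagation_speed):
    """Relative frequency scaling caused by transmitter/receiver motion."""
    return relative_speed / propagation_speed

# Acoustic link: 5 kHz of bandwidth around a 15 kHz carrier, 1.5 m/s motion.
acoustic_fb = fractional_bandwidth(5e3, 15e3)       # ≈ 0.33
acoustic_dopp = doppler_factor(1.5, 1500.0)         # ≈ 1e-3

# Radio link: 20 MHz of bandwidth around a 2.4 GHz carrier, 30 m/s motion.
radio_fb = fractional_bandwidth(20e6, 2.4e9)        # ≈ 0.008
radio_dopp = doppler_factor(30.0, 3.0e8)            # ≈ 1e-7

print(f"acoustic: fractional bandwidth {acoustic_fb:.2f}, Doppler {acoustic_dopp:.1e}")
print(f"radio:    fractional bandwidth {radio_fb:.3f}, Doppler {radio_dopp:.1e}")
# The acoustic signal is wideband relative to its carrier and its Doppler
# scaling is orders of magnitude larger, so narrowband assumptions that hold
# for radio receivers break down underwater.
```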

To cope with these impairments, sophisticated signal processing techniques are required, often relying on adaptive algorithms and array processing. Single carrier systems with decision feedback equalisers remain common for long range links, while multicarrier approaches such as OFDM are being explored to better handle frequency selectivity. However, sensitivity to Doppler and power efficiency remain key trade-offs. Even with decades of research, the data rates of reliable underwater links are measured in kilobits per second rather than megabits per second, underlining how different this environment is from radio based systems.
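As a purely illustrative sketch of the decision feedback idea mentioned above, the toy equaliser below combines a feedforward filter on received samples with a feedback filter on past symbol decisions, adapting both with an LMS update while a known training sequence is available. The tap lengths, step size and one-sample-per-symbol simplification are assumptions for readability; real underwater receivers add synchronisation, Doppler compensation, fractional sampling and far more elaborate adaptation.

```python
import numpy as np

def lms_dfe(received, training, n_ff=8, n_fb=4, mu=0.01):
    """Toy decision feedback equaliser with LMS adaptation (BPSK symbols).

    received : distorted sample sequence, one sample per symbol
    training : known transmitted symbols used to adapt the filters
    Returns the hard symbol decisions.
    """
    ff = np.zeros(n_ff)                 # feedforward taps on received samples
    fb = np.zeros(n_fb)                 # feedback taps on past decisions
    past = np.zeros(n_fb)               # register of previous symbols
    decisions = []

    for k in range(len(received)):
        # Most recent received samples, newest first, zero-padded at the start.
        start = max(0, k - n_ff + 1)
        window = np.asarray(received[start:k + 1][::-1], dtype=float)
        window = np.pad(window, (0, n_ff - len(window)))

        # Equaliser output: feedforward part minus estimated post-cursor ISI.
        y = ff @ window - fb @ past
        d = 1.0 if y >= 0 else -1.0     # hard BPSK decision

        # Train on known symbols while available, then run decision-directed.
        ref = training[k] if k < len(training) else d
        err = ref - y

        # LMS update of both filter sections.
        ff += mu * err * window
        fb -= mu * err * past

        past = np.roll(past, 1)
        past[0] = ref
        decisions.append(d)

    return np.array(decisions)
```

In practice the training length, tap counts and step size would be tuned to the measured channel, and the filters would typically operate on oversampled data rather than symbol-spaced samples.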

Moving beyond the physical layer, the webinar emphasised that networking in underwater systems cannot simply reuse terrestrial protocols. The slow speed of sound in water leads to propagation delays that can exceed packet durations, breaking assumptions that underpin many MAC and transport protocols. Retransmission based error control becomes inefficient, while routing decisions must consider distance dependent bandwidth and long delays. As a result, many underwater networks still rely on simple flooding based approaches, not because they are elegant, but because the environment makes more complex schemes difficult to justify. Transport protocols such as TCP are poorly suited due to overhead and sensitivity to delay and loss, prompting ongoing research into lighter and more adaptive alternatives. 
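A quick back-of-the-envelope calculation, using assumed link parameters, shows why retransmission-heavy schemes struggle: the time a packet spends propagating can dwarf the time spent transmitting it, so a stop-and-wait style exchange leaves the channel idle for most of each cycle.

```python
# Why handshake/ARQ-style protocols are inefficient underwater.
# Assumed, illustrative parameters: a 2 km acoustic link at 2 kbit/s.

SOUND_SPEED = 1500.0        # m/s
DISTANCE = 2_000.0          # m
DATA_RATE = 2_000.0         # bit/s
PACKET_BITS = 8 * 256       # a 256-byte data packet

tx_time = PACKET_BITS / DATA_RATE        # time spent transmitting the packet
prop_delay = DISTANCE / SOUND_SPEED      # one-way propagation delay

# Stop-and-wait: send a packet, then wait one round trip for the
# acknowledgement (ignoring the short acknowledgement transmission time).
cycle = tx_time + 2 * prop_delay
efficiency = tx_time / cycle

print(f"transmission time : {tx_time:.2f} s")
print(f"propagation delay : {prop_delay:.2f} s (one way)")
print(f"channel efficiency: {efficiency:.0%} under stop-and-wait")
# Roughly 1 s of transmission against almost 2.7 s of waiting: the link sits
# idle most of the time, which is why simpler, less chatty schemes persist.
```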

Another important aspect discussed was the role of multimodal communication. Acoustic, optical and in some cases short range radio links each occupy different regions of the range and data rate trade space. Rather than searching for a single universal solution, future underwater networks are likely to combine multiple technologies, selecting or switching between them based on application requirements. This opens up new design challenges, but also creates opportunities for more capable and flexible systems. 
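As a very simplified sketch of the kind of policy such a multimodal node might apply, the function below picks a link technology from assumed range and data-rate envelopes for acoustic, optical and short-range radio links. The figures in the table are rough illustrative values; a real system would also weigh energy budgets, pointing and alignment, water conditions and the cost of switching.

```python
# Toy link selector for a multimodal underwater node.
# Range and data-rate envelopes are rough, assumed figures for illustration.

LINKS = {
    # name       (max range in metres, typical data rate in bit/s)
    "acoustic": (5_000.0, 5_000.0),
    "optical":  (50.0, 10_000_000.0),
    "radio":    (5.0, 1_000_000.0),
}

def select_link(distance_m, required_bps):
    """Return the link types that satisfy range and rate, best rate first."""
    candidates = [
        (rate, name)
        for name, (max_range, rate) in LINKS.items()
        if distance_m <= max_range and required_bps <= rate
    ]
    return [name for rate, name in sorted(candidates, reverse=True)]

print(select_link(2_000, 1_000))       # ['acoustic']  -> only sound reaches 2 km
print(select_link(20, 1_000_000))      # ['optical']   -> short range, high rate
print(select_link(3, 500_000))         # ['optical', 'radio'] at very close range
```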

The availability and cost of hardware was highlighted as a major barrier to wider experimentation and adoption. Traditional acoustic modems have been designed for defence, offshore and industrial use, prioritising long range and robustness at the expense of cost and accessibility. This has limited hands on experimentation, particularly for students and smaller research groups. Recent efforts to develop low cost and software defined acoustic modems aim to change this by leveraging off the shelf components, reduced power operation and shallow water use cases. While performance is lower than high end commercial systems, these platforms make it possible to prototype new waveforms, protocols and applications without prohibitive investment. 

Standardisation was presented as a critical enabler for interoperability and ecosystem growth. Unlike terrestrial communications, where Wi-Fi, mobile cellular and optical standards are deeply embedded, underwater communications lack widely adopted open standards. One notable exception is the NATO JANUS standard, which defines a common acoustic waveform to allow basic interoperability between heterogeneous systems. JANUS focuses on robust signalling rather than high data rates, acting as a digital underwater lingua franca that enables discovery, coordination and coexistence between proprietary systems. While optical underwater communications currently lack comparable standards, the discussion highlighted the need for similar efforts to support future hybrid networks. 
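To illustrate the kind of minimal, robust signalling such a lingua franca enables, the snippet below packs a node identifier and a couple of status fields into a tiny fixed-layout frame that any listener could decode. This is a made-up beacon layout for illustration only; it is not the actual JANUS packet format, which is defined by the standard itself.

```python
import struct

# Hypothetical discovery beacon -- NOT the real JANUS packet format, just an
# illustration of the small, fixed-layout signalling a common standard enables.

def encode_beacon(node_id: int, depth_m: int, battery_pct: int) -> bytes:
    """Pack a tiny fixed-size beacon: 2-byte id, 2-byte depth, 1-byte battery."""
    return struct.pack(">HHB", node_id, depth_m, battery_pct)

def decode_beacon(frame: bytes) -> dict:
    node_id, depth_m, battery_pct = struct.unpack(">HHB", frame)
    return {"node_id": node_id, "depth_m": depth_m, "battery_pct": battery_pct}

frame = encode_beacon(node_id=42, depth_m=120, battery_pct=87)
print(len(frame), "bytes:", frame.hex())   # 5 bytes: 002a007857
print(decode_beacon(frame))
```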

The webinar made it clear that underwater communications and networks remain a specialised but increasingly important field. Applications ranging from environmental monitoring and offshore energy to robotics, defence and scientific exploration all depend on reliable underwater connectivity. Progress will require continued advances across signal processing, networking, hardware platforms and standards, as well as closer collaboration between academia, industry and standardisation bodies. As interest in the oceans grows, both for economic activity and environmental stewardship, the ability to connect underwater systems effectively is set to become a core component of future connectivity technologies.

Details of the webinar are available here, and the recording is embedded below:
