Unlike land or airborne vehicles, which can use radar and cameras to locate and identify objects, underwater vehicles face greater navigation and exploration challenges. When underwater visibility is good, cameras provide a good source of information, and recent improvements such as higher resolution and additional cameras have increased their value. However, just as cameras alone are insufficient for land vehicles, other techniques are needed underwater to avoid striking objects during navigation and to locate objects during exploration.
In advanced driver assistance systems (ADAS), radar and LiDAR compete with each other to supplement the data from cameras. Radar is ineffective underwater because its microwave frequencies (with centimeter-range wavelengths) are absorbed within feet of transmission. The well-known alternative is sound navigation and ranging, or sonar. Like radar and LiDAR, sonar uses the time-of-flight measurement technique to determine range: the source signal bounces off an object and returns to a receiver, where the round-trip time and the propagation velocity of the signal are used to calculate the range.
R = v*∆T/2,
where R is the range (in m),
v is the wave propagation velocity (in m/s), and
∆T is the round-trip time (in s).
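As a minimal sketch, the range equation above can be applied directly in code. The function name and the nominal speed of sound in seawater (about 1,500 m/s, which varies with temperature, salinity, and depth) are illustrative assumptions, not part of any particular sonar API:

```python
# Sketch of the time-of-flight range calculation R = v * dT / 2.
# SPEED_OF_SOUND_SEAWATER_M_S is a typical nominal value; real systems
# correct it for temperature, salinity, and depth.
SPEED_OF_SOUND_SEAWATER_M_S = 1500.0

def sonar_range_m(round_trip_s: float,
                  velocity_m_s: float = SPEED_OF_SOUND_SEAWATER_M_S) -> float:
    """Return the one-way range in meters from a round-trip echo time in seconds."""
    return velocity_m_s * round_trip_s / 2.0

# An echo returning after 0.2 s corresponds to a target about 150 m away.
print(sonar_range_m(0.2))  # → 150.0
```

The divide-by-two reflects the fact that the measured time covers the outbound and return legs of the signal's path.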
Today’s advanced sonar systems include side-scanning and sector-scanning sonar. After measuring the returned acoustic signal, the forward-looking imaging system is rotated slightly in one direction. The smaller the step (in degrees), the higher the resolution of the picture. Critical design factors include how quickly the signal can be transferred between the underwater scanning head and the sonar processor located in the vessel, as well as the narrowness of the beam.
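The trade-off between step size and resolution can be illustrated with a short sketch. The function below is hypothetical and simply counts how many beam positions a sweep requires; halving the step roughly doubles both the number of measurements per image and the time a full sweep takes:

```python
import math

def scan_positions(sector_deg: float, step_deg: float) -> int:
    """Number of beam positions needed to sweep a sector at a given angular step,
    counting both edges of the sector."""
    return math.ceil(sector_deg / step_deg) + 1

# A 90-degree sector at 1.0-degree steps vs. 0.5-degree steps:
print(scan_positions(90.0, 1.0))  # → 91
print(scan_positions(90.0, 0.5))  # → 181
```

Because each position requires waiting for the echo from the maximum range of interest, finer steps increase image resolution at the direct cost of a slower refresh rate.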
Part 2 will address the use of LiDAR in underwater navigation and exploration.