Groundbreaking new sonar technology could be used to monitor the movement of vulnerable care home patients, according to researchers at the University of Glasgow.
As part of a recent project, researchers from the University said they have found ways to equip everyday objects, including smartphones or laptops, with ‘bat-like’ senses.
Using a sophisticated machine-learning algorithm, researchers were able to analyse reflected echoes to generate images, similar to the way bats use echolocation to navigate and hunt.
This algorithm measures the time it takes for blips of sound emitted by speakers or radio waves pulsed from small antennas to bounce around inside an indoor space and return to the sensor.
By analysing the results, the algorithm can establish the shape, size and layout of a room, as well as identify objects or people.
Results are then displayed as a video feed, which turns the echo data into three-dimensional vision. Hypothetically, researchers say the sonar technology could be used to generate images through any device equipped with a microphone and speaker.
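The core quantity behind this kind of imaging is simple: the round-trip time of a pulse tells you how far away the reflecting surface is. A minimal sketch of that calculation (the constant and example delay are illustrative, not from the paper):

```python
# Estimate the distance to a reflector from an echo's round-trip time.
SPEED_OF_SOUND = 343.0  # metres per second in air at ~20 °C

def echo_distance(round_trip_seconds: float, wave_speed: float = SPEED_OF_SOUND) -> float:
    """Distance to a reflector, halving the time because the pulse travels out and back."""
    return wave_speed * round_trip_seconds / 2.0

print(echo_distance(0.02))  # a 20 ms echo corresponds to a reflector ~3.43 m away
```

The researchers' algorithm goes well beyond this single-reflector case, untangling the many overlapping echoes produced as a pulse bounces around an entire room.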
Findings of the study have been published in the journal Physical Review Letters. Lead authors, Dr Alex Turpin and Dr Valentin Kapitany, believe the techniques could have applications in a range of areas, including healthcare and security.
“We believe that the algorithm we’ve developed could turn any device with either of those pieces of kit [microphone or antenna] into an echolocation device,” Dr Turpin commented.
“That means that the cost of this kind of 3D imaging could be greatly reduced, opening up many new applications.”
Buildings could be kept secure without the use of traditional cameras, Turpin suggested, with the sonar technology able to pick up signals reflected from an intruder. Similarly, the same technology could be used to monitor the movements of vulnerable patients in care homes.
“We could even see the system being used to track the rise and fall of a patient’s chest in healthcare settings, alerting staff to changes in their breathing,” Turpin added.
This research builds on previous work by the Glasgow team, which trained a neural-network algorithm to build 3D images by measuring the reflections from flashes of light using a single-pixel detector.
The paper, titled ‘3D imaging from multipath temporal echoes’, outlines how the team used speakers and microphones from a laptop to generate and receive acoustic waves in the kilohertz range.
Researchers also used an antenna to do the same, except with radio-frequency waves in the gigahertz range.
In both instances, the team collected data on the reflections of the waves taken in a room as a single person moved around.
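The acoustic side of this measurement can be sketched in a few lines: emit a short kilohertz pulse from the speaker, then locate its echo in the microphone recording. This is a hypothetical illustration, assuming cross-correlation for echo detection; the sample rate, pulse frequency, and simulated delay are illustrative, not taken from the paper:

```python
import numpy as np

FS = 48_000  # sample rate in Hz (illustrative)

def make_pulse(freq_hz: float = 8_000.0, duration_s: float = 0.005) -> np.ndarray:
    """A short windowed tone in the kilohertz range, suitable for emission from a speaker."""
    t = np.arange(int(FS * duration_s)) / FS
    return np.sin(2 * np.pi * freq_hz * t) * np.hanning(t.size)

def echo_delay_samples(recording: np.ndarray, pulse: np.ndarray) -> int:
    """Sample offset where the emitted pulse best matches the recording."""
    corr = np.correlate(recording, pulse, mode="valid")
    return int(np.argmax(np.abs(corr)))

# Simulate a recording in which an attenuated echo returns after 240 samples (~5 ms).
pulse = make_pulse()
recording = np.zeros(4_096)
recording[240:240 + pulse.size] += 0.3 * pulse
print(echo_delay_samples(recording, pulse))  # → 240
```

A real room produces many such overlapping echoes from walls, furniture, and people, which is why the team needed a learned model rather than a single correlation peak.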
At the same time, they also recorded data about the room using a special camera which uses a process known as ‘time-of-flight’ to measure the dimensions of the room and provide low-resolution images.
By combining the echo data from the microphone and the image data from the time-of-flight camera, the team ‘trained’ the algorithm over hundreds of repetitions to associate specific delays in the echoes with images.
Eventually, the algorithm had learned enough to generate its own detailed images of the room and its contents from the echo data alone, giving it the ‘bat-like’ ability to sense its surroundings.
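The supervised training setup described above can be sketched as follows: each echo trace is paired with a low-resolution time-of-flight depth image, and a model is fitted to map one to the other. The paper uses a neural network; a linear least-squares fit stands in here purely to keep the sketch self-contained, and all array sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

n_examples, trace_len, img_pixels = 300, 64, 16 * 16  # illustrative sizes
true_map = rng.normal(size=(trace_len, img_pixels))   # unknown room response

# Synthetic training pairs: echo traces (inputs) and ToF depth images (targets).
echoes = rng.normal(size=(n_examples, trace_len))
images = echoes @ true_map

# "Training": recover the echo-to-image mapping from the paired examples.
learned_map, *_ = np.linalg.lstsq(echoes, images, rcond=None)

# Once fitted, the model images the scene from echo data alone.
new_echo = rng.normal(size=trace_len)
predicted_image = (new_echo @ learned_map).reshape(16, 16)
```

The design point this illustrates is that the time-of-flight camera is only needed during training to supply ground truth; afterwards, the microphone or antenna alone suffices.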
“We’ve now been able to demonstrate the effectiveness of this algorithmic machine-learning technique using light and sound, which is very exciting,” Dr Turpin continued.
“It’s clear that there is a lot of potential here for sensing the world in new ways, and we’re keen to continue exploring the possibilities of generating more high-resolution images in the future.”