Researchers have come up with an ingenious way to listen in on conversations in a room at a distance without relying on planted bugs, sophisticated lasers, or eagle-eyed lip readers: they just stare at a light bulb.
The project, led by Ben Nassi of the Ben-Gurion University of the Negev in Israel, recovers speech from a room in real time simply by watching its light bulb.
A lot of Ben-Gurion’s research focuses on getting data out of air-gapped systems, often at very low bandwidths. In the past, we’ve covered its use of everything from keyboard LED lights through to screen brightness and the vibrations from computer fans. Now, though, its researchers are after your boardroom, expanding their research to listen in on ambient noise.
The team’s latest research, Lamphone: Real-Time Passive Sound Recovery from Light Bulb Vibrations, can listen in (or, more accurately, look in) to what’s being said in a room from up to 25 metres away by examining the bulb’s frequency response to sound.
Researchers have proposed eavesdropping methods before. One team bounced a laser off a window to convert vibrations in the glass into audio. The problem there is that defenders could detect the beam with an optical sensor.
Another used a camera to watch vibrations caused by sound in materials like crisp bags and the surface of water in a glass. That’s ingenious but impractical, argues the Ben-Gurion team, because it picks up information slowly and it has to be analysed for hours afterwards to reproduce the audio.
The Lamphone attack solves both these problems because it can eavesdrop in real time and is entirely passive, making it undetectable. It uses the fact that light bulbs vibrate in response to sound.
The team pointed a telescope with an optical sensor at an E27 LED light bulb and used it to analyse the displacement patterns that sound vibrations created in the glowing bulb.
It was able to sample the bulb 4,000 times per second, converting the sampled optical signal with an analogue-to-digital converter running on a laptop. The team then reduced the noise in the signal and ran it through an audio signal processor to optimise the quality of the speech in the audio.
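The paper doesn’t publish its processing code, but the pipeline described (sample the light level at 4 kHz, remove the steady glow, filter down to the speech band, normalise) can be sketched roughly as below. The 330 Hz test tone, the 100–1,000 Hz band and the noise levels are illustrative assumptions, not figures from the research:

```python
import numpy as np

FS = 4_000  # the team sampled the bulb 4,000 times per second


def bandpass_fir(low_hz, high_hz, fs, ntaps=257):
    """Windowed-sinc band-pass FIR filter (a crude 'speech band' filter)."""
    n = np.arange(ntaps) - (ntaps - 1) / 2

    def lowpass(fc):
        h = np.sinc(2 * fc / fs * n)
        return h / h.sum()  # unity gain at DC

    return (lowpass(high_hz) - lowpass(low_hz)) * np.hamming(ntaps)


def recover_audio(optical_samples):
    """Turn raw brightness readings into a normalised audio waveform."""
    x = optical_samples - optical_samples.mean()  # drop the steady glow (DC)
    y = np.convolve(x, bandpass_fir(100, 1_000, FS), mode="same")
    return y / (np.abs(y).max() + 1e-12)          # normalise to [-1, 1]


# Demo on synthetic data: a steady glow plus a tiny sound-induced flicker.
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / FS)
flicker = 1e-3 * np.sin(2 * np.pi * 330 * t)      # hypothetical 330 Hz tone
readings = 1.0 + flicker + 1e-4 * rng.standard_normal(t.size)
audio = recover_audio(readings)
```

Even with this toy filter, the recovered waveform correlates strongly with the original tone, which is the essence of the attack: the sound is sitting in the brightness signal, waiting to be filtered out.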
The research team tested the concept by pointing three telescopes at a room 25 metres away. They were able to recover two songs, Clocks by Coldplay and Let It Be by the Beatles, with audio good enough for Shazam to recognise both. They also retrieved a recording of President Trump speaking, which Google’s speech API recognised successfully.
There are some countermeasures, which are pretty low-tech. You can reduce the wattage of the bulb, which degrades the quality of the audio even though the resulting drop in light is insignificant for people in a room.
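The logic behind that countermeasure can be illustrated with a back-of-the-envelope signal-to-noise calculation: the sound-induced flicker scales with the bulb’s brightness, while the eavesdropper’s sensor noise stays roughly constant, so dimming the bulb cuts the attacker’s SNR. The flicker fraction and noise figure below are made-up illustrative values, not numbers from the paper:

```python
import numpy as np


def snr_db(brightness, flicker_fraction=1e-3, sensor_noise=1e-4):
    """Eavesdropper's SNR in dB for a given relative bulb brightness.

    Assumes the sound-induced flicker is a fixed fraction of brightness
    and the optical sensor's noise floor is constant (both hypothetical).
    """
    signal_rms = brightness * flicker_fraction / np.sqrt(2)
    return 20 * np.log10(signal_rms / sensor_noise)


full = snr_db(1.0)    # bulb at full brightness
dimmed = snr_db(0.5)  # bulb at half brightness
```

Under these assumptions, halving the brightness costs the attacker about 6 dB of SNR, while the room only looks slightly dimmer to its occupants.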
You can use a building with curtain walls: non-structural outer walls (full-building glass cladding is a common example) whose glazing is typically thicker than a regular window’s, attenuating the optical signal an eavesdropper relies on.
If you’re really paranoid, you could always draw a blind. Failing all that, just turn the lights off and hold your top secret meetings as spy novelists would prefer it: in the dark.