INFERRING TOUCH GESTURES, NO PERMISSIONS REQUIRED

Ambient light sensors can reveal your device activity. How big a threat is it?

For now, there's no reason for concern, but that could change in coming years.


An overwhelming majority of handheld devices these days have ambient light sensors built into them. A large percentage of TVs and monitors do, too, and that proportion is growing. The sensors allow devices to automatically adjust screen brightness based on how light or dark the surroundings are. That, in turn, reduces both eye strain and power consumption.

New research reveals that embedded ambient light sensors can, under certain conditions, allow website operators, app makers, and others to pry into user actions that until now have been presumed to be private. A proof-of-concept attack coming out of the research, for instance, can determine what touch gestures a user is performing on the screen. Gestures including one-finger slides, two-finger scrolls, three-finger pinches, four-finger swipes, and five-finger rotates can all be determined. As screen resolutions and sensors improve, the attack is likely to get better.
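To give a concrete, if deliberately simplified, sense of how a one-dimensional stream of lux readings could betray a gesture, the hypothetical Kotlin sketch below matches a captured trace against labeled reference traces by nearest-neighbor distance. The synthetic numbers and the matching approach are purely illustrative; the researchers' actual technique reconstructs images from the sensor output and is considerably more sophisticated.

```kotlin
import kotlin.math.sqrt

// Hypothetical illustration only: classify a coarse touch gesture from a short
// trace of ambient-light readings (lux), assuming the attacker already holds
// labeled reference traces recorded while the screen showed known content.
// This nearest-neighbor matcher is NOT the reconstruction method in the paper;
// it merely shows that a 1-D lux stream can separate coarse gestures.

fun distance(a: DoubleArray, b: DoubleArray): Double {
    require(a.size == b.size) { "traces must be the same length" }
    var sum = 0.0
    for (i in a.indices) {
        val d = a[i] - b[i]
        sum += d * d
    }
    return sqrt(sum)
}

fun classify(trace: DoubleArray, references: Map<String, DoubleArray>): String =
    references.minByOrNull { (_, ref) -> distance(trace, ref) }!!.key

fun main() {
    // Synthetic reference traces: average lux over time for each gesture.
    val references = mapOf(
        "one-finger slide"   to doubleArrayOf(320.0, 310.0, 295.0, 280.0, 270.0),
        "two-finger scroll"  to doubleArrayOf(320.0, 300.0, 270.0, 240.0, 230.0),
        "three-finger pinch" to doubleArrayOf(320.0, 280.0, 230.0, 200.0, 190.0)
    )

    // A freshly captured (synthetic) trace from the victim device.
    val observed = doubleArrayOf(318.0, 302.0, 268.0, 242.0, 228.0)
    println("Best match: ${classify(observed, references)}")  // two-finger scroll
}
```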

Always-on sensors, no permissions required

There are plenty of limitations that prevent the attack as it exists now from being practical or posing an immediate threat. The biggest restrictions: It works only on devices with a large screen, in environments without bright ambient light, and when the screen is displaying certain types of content that are known to the attacker. The technique also can’t reveal the identity of people in front of the screen. The researchers, from the Massachusetts Institute of Technology, readily acknowledge these constraints but say it’s important for device makers and end users to be aware of the potential threat going forward.

“We aim to raise the public awareness and suggest that simple software steps can be made to make ambient light sensors safer, that is restricting the permission and information rate of ambient light sensors,” Yang Liu, a fifth-year PhD student and the lead author of the study, wrote in an email. “Additionally, we want to warn people of the potential privacy/security risk of the combination of passive (sensor) and active (screen) components of modern smart devices, as they are getting ‘smarter’ with more sensors. The trend of consumer electronics pursuing larger and brighter screens can also impact the landscape by pushing the imaging privacy threat towards the warning zone.”
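In software terms, the "restricting the permission and information rate" mitigation Liu describes could amount to degrading what untrusted code sees, for example by capping the delivery rate and quantizing readings into coarse buckets. The Kotlin sketch below is only an illustration of that idea; no current mobile OS exposes such a hook, and the class and parameter names are invented.

```kotlin
import kotlin.math.round

// Hypothetical sketch of the mitigation idea: before untrusted code sees
// ambient-light readings, cap the delivery rate and quantize values into
// coarse buckets so fine-grained reflections (and thus gestures) are no
// longer recoverable. The names are invented; no current OS exposes this hook.

data class LuxSample(val timestampMs: Long, val lux: Double)

class ThrottledLightSensor(
    private val minIntervalMs: Long = 1_000,  // at most one reading per second
    private val bucketSizeLux: Double = 50.0  // round to the nearest 50 lux
) {
    private var lastDeliveredMs: Long? = null

    /** Returns a degraded sample, or null if the reading arrives too soon. */
    fun filter(sample: LuxSample): LuxSample? {
        val last = lastDeliveredMs
        if (last != null && sample.timestampMs - last < minIntervalMs) return null
        lastDeliveredMs = sample.timestampMs
        val quantized = round(sample.lux / bucketSizeLux) * bucketSizeLux
        return sample.copy(lux = quantized)
    }
}

fun main() {
    val limiter = ThrottledLightSensor()
    val raw = listOf(
        LuxSample(0, 312.0), LuxSample(200, 287.0),
        LuxSample(400, 265.0), LuxSample(1_200, 241.0)
    )
    // Only the first and last samples survive, rounded to 300.0 and 250.0 lux.
    raw.mapNotNull(limiter::filter).forEach { println(it) }
}
```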

There’s a large body of existing attacks that use sensors on phones and other devices as a side channel that can leak private details about the people using them. An attack devised by researchers in 2013, for instance, used a phone's embedded video camera and microphone to accurately guess PINs as they were entered. Research from 2019 showed how monitoring a device's accelerometer and gyroscope output can also lead to accurate guesses of entered PINs. Research from 2015 used accelerometers to detect speech activity and correlate it with mood. And an attack presented in 2020 showed how accelerometers can recognize speech and reconstruct the corresponding audio signals.

Exacerbating the potential risk: This sensor data is always on, and neither Android nor iOS requires any permission for apps to access it. End users are left with little, if any, effective recourse.
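On Android, for instance, any installed app can subscribe to the ambient light sensor with a few lines of code and no permission entry in its manifest. The sketch below (class and log-tag names are illustrative, and it assumes a standard Activity) is all that is needed to stream lux readings at the fastest rate the platform allows.

```kotlin
import android.app.Activity
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Bundle
import android.util.Log

// Illustrative Android snippet: streaming ambient-light readings requires no
// permission declaration at all; any installed app can do this.
class LightSensorActivity : Activity(), SensorEventListener {

    private lateinit var sensorManager: SensorManager
    private var lightSensor: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
        lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT)
    }

    override fun onResume() {
        super.onResume()
        // Request the fastest delivery rate the platform will allow.
        lightSensor?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_FASTEST)
        }
    }

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent) {
        // values[0] is the ambient illuminance in lux.
        Log.d("LightSensor", "lux = ${event.values[0]}")
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* not needed */ }
}
```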

The MIT researchers add to this existing corpus with an eavesdropping technique that can capture rough images of objects or events taking place directly in front of the device screen. The device used in the experiments was a Samsung Galaxy View2, an Android tablet the researchers chose for its large (17.3-inch) screen. Under current conditions, a large screen is necessary for the attack to work because it emits the large amount of light the technique needs. The Galaxy View2 also provided easy access to the light sensor. MIT researcher Liu said iOS devices and TVs with embedded light sensors from a host of manufacturers are also likely vulnerable.
