Dave Gershgorn, writing for Quartz: At the Neural Information Processing Systems conference in Long Beach, California, next week, Google researchers Hee Jung Ryu and Florian Schroff will present a project they're calling an electronic screen protector, in which a Google Pixel phone uses its front-facing camera and eye-detecting artificial intelligence to detect whether more than one person is looking at the screen. An unlisted but public video by Ryu shows the software interrupting a Google messaging app to display a camera view, with the peeking perpetrator identified and given a Snapchat-esque vomit rainbow. Ryu and Schroff claim the system works across different lighting conditions and poses, and can recognize a person's gaze in 2 milliseconds. Ostensibly, this AI software is able to work so quickly because it runs on the phone itself, rather than sending images for processing on the company's powerful cloud servers.
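The researchers' model isn't public, but the decision logic the article describes -- count how many detected faces are gazing at the screen, and trigger the protector when there is more than one -- can be sketched as follows. This is a minimal illustration, not Ryu and Schroff's implementation: the `Face` record, the per-face gaze angles, and the tolerance threshold are all hypothetical stand-ins for whatever the on-device gaze estimator actually outputs.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Face:
    """Hypothetical per-face output of an on-device gaze estimator."""
    gaze_yaw: float    # degrees; 0 means gaze is horizontally centered on the screen
    gaze_pitch: float  # degrees; 0 means gaze is vertically centered on the screen

def is_looking_at_screen(face: Face, tolerance_deg: float = 15.0) -> bool:
    # Treat a face as "looking" if its gaze direction falls within a
    # cone around the screen normal; the tolerance is an assumption.
    return abs(face.gaze_yaw) <= tolerance_deg and abs(face.gaze_pitch) <= tolerance_deg

def screen_protector_should_trigger(faces: List[Face]) -> bool:
    # The article's rule: fire when more than one person is looking.
    lookers = sum(1 for f in faces if is_looking_at_screen(f))
    return lookers > 1
```

With only the owner gazing at the phone (`[Face(2.0, -3.0)]`) nothing happens; add a shoulder-surfer whose gaze also lands on the screen (`Face(-8.0, 5.0)`) and the trigger fires. Running this per-frame classification locally, rather than round-tripping frames to a server, is what would make the reported millisecond-scale latency plausible.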
Read more of this story at Slashdot.