Google’s researchers are constantly experimenting. Some of their projects, like Gmail, are almost destined to become full-scale products. Others never see the light of day. But we’re hoping that one — an artificially intelligent (AI) camera app that recognizes when more than one person is looking at a phone’s screen — is in the former category.
Google’s person-detecting demo, which will be presented by Google researchers Hee Jung Ryu and Florian Schroff at the Neural Information Processing Systems conference in Long Beach, California next week, uses the Google Pixel’s front-facing camera and an eye-detecting algorithm to determine whether more than one person is looking at the screen. It can recognize a person’s gaze in as little as two milliseconds, and it runs entirely on the device, so camera footage never has to be sent to Google’s servers for processing.
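Google hasn’t released code for the demo, but the basic loop (grab frames from the front camera, count the faces turned toward the screen, and react when there is more than one) is easy to sketch with off-the-shelf tools. The snippet below is a minimal, hypothetical stand-in built on OpenCV’s bundled frontal-face Haar cascade; treating “frontal face detected” as “person looking at the screen” is only a crude proxy for gaze detection, and this is nowhere near the two-millisecond, on-device neural network the researchers describe.

```python
# Minimal sketch of an "electronic screen protector": watch the front camera
# and flag the moment more than one face is visible. This is a rough,
# hypothetical stand-in, not Google's method: a frontal-face detection is
# only a crude proxy for "this person is looking at the screen."
import cv2

# OpenCV ships with pretrained Haar cascade classifiers for face detection.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # 0 = the default (front-facing) camera

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Haar cascades operate on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    if len(faces) > 1:
        # More than one viewer: a real app might blur the screen or
        # switch to a privacy view at this point.
        print(f"Privacy alert: {len(faces)} faces watching the screen")

    cv2.imshow("screen protector demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The real demo reportedly replaces the cascade with a compact gaze-aware neural network running on the phone itself, which is what lets it react within milliseconds without ever sending footage off the device.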
The researchers’ demo video gives an idea of how it works in practice. (They call it an “electronic screen protector.”)
It’s not Google’s first foray into gaze detection (as Quartz points out, Google holds patents on vision-tracking computer pointers and Pay-Per-Gaze ad analytics), but it’s arguably the most useful. As more and more people view family photos, check bank statements, and enter payment details on the go, smartphones are becoming incredibly personal devices. And just as someone looking over your shoulder while you enter your ATM PIN feels creepy, there’s something uncomfortable about a stranger stealing glances at your phone’s screen.
As Google accelerates its neural network research and more devices ship with dedicated AI-acceleration hardware, it isn’t a stretch to say projects like this will become more common. But as with any internal Google research project, there’s no guarantee it will ever make its way into a shipping app. At the very least, it’s a fascinating example of how neural networks have the potential to affect our lives.
Source: Quartz