They say you can tell a real smile because it reaches the eyes. Of course, that just means we all have to learn to fake that kind of smile, too. But the subtle expressiveness of our eyeball area has a fringe benefit: VR researchers can use it to guess at what the rest of your face is doing.
Google Research just published a fun little project that attempts to track expressions solely by looking at your eyes inside the headset. Between the shape of the exposed eye, the direction of your gaze, your eyebrows, wrinkles (for those of us expressive enough to have them) and so on, there’s actually quite a bit of information to work with.
Enough, anyway, that a deep learning system can figure out a few basic expressions and degrees thereof with decent accuracy. “Happiness” and “surprise” are there, but the data isn’t rich enough to detect “schadenfreude” or “mischief.”
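To get a feel for the idea, here’s a toy sketch (not Google’s actual model) of how a classifier’s final layer might turn a few eye-region features into expression probabilities. The feature names, weights, and expression list are all invented for illustration; a real system would learn them from training data.

```python
import math

# Hypothetical expression labels and hand-picked weights, purely illustrative.
EXPRESSIONS = ["neutral", "happiness", "surprise"]

# Features (all 0-1): eye openness, brow raise, eye-corner crinkle.
WEIGHTS = {
    "neutral":   [ 1.0, -1.0, -1.0],
    "happiness": [-0.5, -0.2,  2.0],   # squinted, crinkled eyes
    "surprise":  [ 2.0,  2.0, -0.5],   # wide eyes, raised brows
}

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(features):
    """Score each expression and return its probability, i.e. the
    'expressions and degrees thereof' mentioned above."""
    scores = [sum(w * f for w, f in zip(WEIGHTS[name], features))
              for name in EXPRESSIONS]
    return dict(zip(EXPRESSIONS, softmax(scores)))

# Wide-open eyes with raised brows and little crinkle:
result = classify([0.9, 0.8, 0.1])
print(max(result, key=result.get))  # prints "surprise"
```

The degrees come for free: the probabilities double as a rough intensity for each expression, which is all a bare-bones avatar needs.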
The idea is that with minimal monitoring tools (eye-tracking cameras inside headsets, which seem inevitable) you can get at least a bare-bones read on what a user’s face is doing in real time.
They’ve put it all in a paper, of course, which you can read now or check out soon at SIGGRAPH.