When our screens look back at us


Machine Learning Experimentation





tl;dr

I coded an interaction that transforms your home screen when you are looking away.




Context
We always have multiple screens on.

We’re constantly surrounded by screens—laptops, extra monitors, phones—but we can only look at one at a time. The rest just sit there, on but ignored. I was curious about what those “forgotten” screens are doing while our attention is elsewhere.




Design

Our screens looking back at us

I wanted to flip the dynamic and imagine screens that are aware of our attention. Here, the screen looks back at us and notices when we’ve looked away. That moment of neglect becomes the trigger for something intentional, playful, and slightly uncanny—suggesting a future where our devices don’t just display information, but notice us noticing them.






Code
Determining when someone is truly looking

To track someone’s gaze, I used an eye-tracking model paired with facial recognition.

However, a user could be facing the screen while actually looking at a nearby monitor. It was critical to detect their “attention,” not just the presence of their eyes.

I spent time tuning the relationship between gaze direction and head position, doing the math to decide when someone was technically present versus actually looking.
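
To make that concrete, here’s a minimal sketch of the kind of check involved. It assumes MediaPipe Face Mesh for the landmarks, and the tolerances are illustrative rather than the exact values I landed on: the iris position between the eye corners approximates gaze direction, and the nose offset from the cheek midpoint approximates head yaw.

    import cv2
    import mediapipe as mp

    # Iris centers and matching eye corners, in MediaPipe Face Mesh numbering
    # (refine_landmarks=True adds the iris points).
    IRIS_A, EYE_A = 468, (33, 133)
    IRIS_B, EYE_B = 473, (362, 263)
    NOSE, CHEEK_L, CHEEK_R = 1, 234, 454

    def gaze_ratio(lm, iris, corners):
        """Where the iris sits between the eye corners; ~0.5 means centered."""
        x0, x1 = lm[corners[0]].x, lm[corners[1]].x
        return (lm[iris].x - x0) / ((x1 - x0) or 1e-6)

    def is_looking(lm, gaze_tol=0.15, yaw_tol=0.12):
        # Gaze: average both eyes; near 0.5 means the eyes point at the camera.
        gaze = (gaze_ratio(lm, IRIS_A, EYE_A) + gaze_ratio(lm, IRIS_B, EYE_B)) / 2
        # Head yaw proxy: nose offset from the cheek midpoint over face width.
        mid = (lm[CHEEK_L].x + lm[CHEEK_R].x) / 2
        yaw = (lm[NOSE].x - mid) / (abs(lm[CHEEK_R].x - lm[CHEEK_L].x) or 1e-6)
        # "Actually looking" = eyes AND head both roughly centered on the screen.
        return abs(gaze - 0.5) < gaze_tol and abs(yaw) < yaw_tol

    cap = cv2.VideoCapture(0)
    mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True, max_num_faces=1)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        res = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if res.multi_face_landmarks:
            lm = res.multi_face_landmarks[0].landmark
            print("looking" if is_looking(lm) else "present, but looking away")
        else:
            print("no face")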




The user’s eyes are present, but they’re looking away from the screen, triggering the interaction. When the user’s attention returns, the interaction stops.
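
A raw per-frame decision flickers whenever the model wavers for a frame or two, so a trigger like this benefits from some hysteresis. Here’s a hypothetical sketch of that gating (the class name and frame counts are illustrative, not from the project):

    class AttentionGate:
        """Hypothetical debounce: require N consecutive frames before flipping
        state, so one noisy frame doesn't flicker the interaction."""

        def __init__(self, away_frames=15, back_frames=5):
            self.away_frames = away_frames  # ~0.5 s looking away at 30 fps
            self.back_frames = back_frames  # restore faster once gaze returns
            self.count = 0
            self.triggered = False

        def update(self, looking):
            """Feed one frame's is_looking() result; returns True while the
            looked-away interaction should run."""
            # Count consecutive frames of the opposite state, then flip.
            opposite = looking if self.triggered else not looking
            self.count = self.count + 1 if opposite else 0
            if self.triggered and self.count >= self.back_frames:
                self.triggered, self.count = False, 0
            elif not self.triggered and self.count >= self.away_frames:
                self.triggered, self.count = True, 0
            return self.triggered

Running each frame’s is_looking() result through update() turns a jittery per-frame signal into a stable on/off switch for the home-screen transformation.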