Google is designing computers that respect your personal space

You’re bingeing a show on Netflix late at night when you realize you want to grab a snack. The room is dark, so you feel around for the remote. Maybe you even hit the wrong button and end up fast-forwarding instead of pausing.

Now imagine another way. When you stand up from the couch, the movie simply pauses. And when you sit back down, it resumes.
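
To make the idea concrete, here is a minimal sketch of that pause-and-resume behavior in Python. The Player class and the stream of seated/standing readings are stand-ins invented for illustration; they are not Google's APIs, just one plausible shape for the logic.

```python
class Player:
    """Toy media player that just reports state changes."""
    def __init__(self):
        self.playing = True

    def pause(self):
        if self.playing:
            self.playing = False
            print("paused")

    def resume(self):
        if not self.playing:
            self.playing = True
            print("resumed")


def run(player, seated_readings):
    """React to a stream of seated/standing presence readings."""
    was_seated = True
    for seated in seated_readings:
        if was_seated and not seated:
            player.pause()   # viewer stood up: pause playback
        elif not was_seated and seated:
            player.resume()  # viewer sat back down: resume
        was_seated = seated


# Simulated readings: seated, still seated, stands up, away, sits down.
run(Player(), [True, True, False, False, True])  # prints "paused", then "resumed"
```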

This is how seamlessly the world would work, if only the computers all around us could understand our implicit intent rather than just our explicit mouse clicks, screen taps, and even voice commands. It’s a vision technologists have imagined for decades, sometimes called “ubiquitous computing,” “ambient computing,” or even “quiet computing.” Now Google is trying to bring it to fruition by building computers that understand social norms around personal space.

“We’re really inspired by the way people understand each other,” says Leonardo Giusti, Head of Design at Google’s Advanced Technology & Projects (ATAP) Lab. “When you walk behind someone, they hold the door open for you. When you reach for something, it’s handed to you. As humans, we understand each other intuitively, often without saying a word.”

At ATAP, design researchers have been supercharging their Soli radar sensor to understand the social nuance embedded in our everyday movements. Soli sees your body as nothing more than a blob. But through the right lens, this blob has inertia, posture, and a gaze—all of the things we constantly size up when we interact with other people.
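
As a rough illustration of what “reading” a blob could mean, the sketch below recovers two of those cues, speed (inertia) and heading, from a short history of tracked positions. The data layout and sampling rate are assumptions made for this example; Soli’s actual representation is not public.

```python
import math

def blob_cues(track, dt=0.1):
    """Estimate speed and heading from chronological (x, y) centroids.

    track: list of (x, y) positions in meters, oldest first.
    dt: seconds between samples (assumed fixed).
    """
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    speed = math.hypot(vx, vy)                  # magnitude: the blob's "inertia"
    heading = math.degrees(math.atan2(vy, vx))  # direction of travel
    return speed, heading

# Someone walking steadily toward a sensor at the origin along the x-axis:
print(blob_cues([(2.0, 0.0), (1.9, 0.0), (1.8, 0.0)]))
# -> roughly (1.0, 180.0): about 1 m/s, headed toward the sensor
```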

Soli first debuted as a way to track gestures in midair, and landed in Pixel phones to let you do things like air-swipe past songs—intriguing technologically, but pointless practically. Then Google integrated Soli into its Nest Hub, where it could track your sleep by sensing when you lay down, how much you tossed and turned, and even the cadence of your breathing. It was a promising use case. But it was also exactly one use case for Soli, highly situational and context-dependent.

ATAP’s new demos push Soli’s capabilities to new extremes. Soli’s first iterations could scan only a few feet; now it can scan a small room. And with a pile of algorithms layered on top, it can let a Google phone, thermostat, or smart screen read your body language much like another person would, anticipating what you’d like next.

“As this technology gets more present in our life, it’s fair to ask tech to take a few more cues from us,” says Giusti. “The same way a partner would tell you to grab an umbrella on the way out [the door on a rainy day], a thermostat by the door could do the same thing.”

Google isn’t there yet, but it has trained Soli to understand something key to it all: “We’ve imagined that people and devices can have a personal space,” says Giusti, “and the overlap between these spaces can give us a good understanding of the type of engagement, and social relation between them, at any given moment.”
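
One speculative way to picture that overlap model: give the person and the device each a zone, and map how deeply the zones intersect to a level of engagement. The radii, thresholds, and labels below are invented for illustration and are not from Google.

```python
import math
from dataclasses import dataclass

@dataclass
class Zone:
    x: float       # position in meters
    y: float
    radius: float  # personal-space radius in meters

def engagement(person: Zone, device: Zone) -> str:
    """Classify the relationship from how deeply the two zones overlap."""
    gap = math.hypot(person.x - device.x, person.y - device.y)
    overlap = (person.radius + device.radius) - gap
    if overlap <= 0:
        return "out of range"   # spaces don't touch: stay quiet
    if overlap < min(person.radius, device.radius):
        return "approaching"    # spaces brush: surface a glanceable UI
    return "engaged"            # deep overlap: assume direct interaction

# A person 3 m from a smart display, then walking up to it:
display = Zone(0.0, 0.0, 1.0)
print(engagement(Zone(3.0, 0.0, 1.2), display))  # -> out of range
print(engagement(Zone(1.5, 0.0, 1.2), display))  # -> approaching
print(engagement(Zone(0.5, 0.0, 1.2), display))  # -> engaged
```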

If I were to walk right up to you—standing four feet away—and make eye contact, you’d know that I wanted to talk to you, so you’d perk up and acknowledge my …

Source: https://www.fastcompany.com/90725730/google-is-designing-computers-that-respect-your-personal-space
