The Google Pixel’s squeeze for assistant was a button without a button


The Pixel 2 is an almost five-year-old phone, but it introduced a feature that I miss more and more with each passing year. It was called Active Edge, and it let you summon Google Assistant just by giving your phone a squeeze. In some ways, it’s an unusual idea. But it effectively gave you something sorely lacking on modern phones: a way to physically interact with the phone to just get something done.

Looking at the sides of the Pixel 2 and 2 XL, you won't see anything to indicate that you're holding something special. Sure, there's a power button and a volume rocker, but otherwise the sides are sparse. Give the phone's bare edges a good squeeze, though, and you'll feel a subtle vibration and see an animation as Google Assistant pops up from the bottom of the screen, ready to start listening. You don't have to wake the phone, long-press any physical or virtual button, or tap the screen. You just squeeze and start talking.

Looking at the sides of the Pixel 2, you’d never guess it’s actually a button.
Photo by Amelia Holowaty Krales / The Verge

We'll talk about how useful this is in a second, but I don't want to gloss over just how cool it feels. Phones are rigid objects made of metal and plastic, and yet the Pixel can tell when I'm applying more pressure than I do when simply holding it. According to an old iFixit teardown, this is made possible by a few strain gauges mounted inside the phone that detect the ever-so-slight bend in the case when you squeeze it. For the record, that's a change my nervous system is incapable of picking up on; I can't tell that the phone is bending at all.
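To get a feel for the idea, here's a rough, purely illustrative sketch of how squeeze detection from a pair of strain gauges could work in software. None of this is Google's actual implementation; the threshold, sample count, and sensor readings are all invented for the example. The core trick is simply comparing the gauges' combined signal against a threshold and requiring it to hold for a few consecutive samples.

```python
# Purely illustrative sketch (not Google's actual firmware): turn raw
# strain-gauge readings into a single "squeeze" event by comparing the
# combined signal against a threshold and requiring it to persist briefly.

SQUEEZE_THRESHOLD = 40.0   # hypothetical pressure units from the gauges
HOLD_SAMPLES = 3           # consecutive samples needed to count as a squeeze


def detect_squeeze(samples):
    """Return True once the summed gauge readings stay above the
    threshold for HOLD_SAMPLES consecutive readings."""
    streak = 0
    for reading in samples:
        if sum(reading) >= SQUEEZE_THRESHOLD:
            streak += 1
            if streak >= HOLD_SAMPLES:
                return True
        else:
            streak = 0
    return False


# Example: readings from two gauges ramping up as the user squeezes.
sensor_stream = [(5, 6), (12, 14), (22, 25), (24, 26), (23, 27), (8, 9)]
print(detect_squeeze(sensor_stream))  # True -> summon the assistant
```

Requiring a short streak of strong readings, rather than a single spike, is what would keep a bump or a tight pocket from launching the assistant by accident.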

Whether you found Active Edge useful probably came down to whether you liked using Google Assistant, as illustrated by this Reddit thread. Personally, the only time I ever really used a voice assistant on a daily basis was when I had the Pixel 2 because it was literally right at hand. The thing that made it so convenient is that the squeeze basically always worked. Even if you were in an app that hid the navigation buttons or your phone’s screen was completely off, Active Edge still did its job.

While that made it extremely useful for looking up fun facts or doing quick calculations and conversions, I’d argue that Active Edge could’ve been so much more useful had you been able to remap it. I enjoyed having the assistant, but if I had been able to turn on my flashlight with a squeeze, I would’ve had instant access to the most important features of my phone no matter what.

This version of the feature actually existed. HTC's U11, which came out a few months before the Pixel 2, had a similar but more customizable feature called Edge Sense. The two companies had worked together on the original Pixel and the Pixel 2, which explains how the feature ended up on Google's devices. That same year, Google acquired a large chunk of HTC's smartphone team.

Active Edge was not Google’s first attempt at providing an alternative to using the touchscreen or physical buttons to control your phone, either. A few years before the Pixel 2, Motorola was letting you open the camera by twisting your phone and turn on the flashlight with a karate chop — not unlike how you shuffled music on a 2008 iPod Nano. The camera shortcut came about during the relatively short amount of time that Google owned Motorola.

As time went on, though, phone manufacturers moved further away from letting you access a few essential features with a physical action. Take my daily driver, an iPhone 12 Mini, for instance. To launch Siri, I have to press and hold the power button, which has become burdened with responsibilities since Apple got rid of the home button. To turn on the flashlight, something I do multiple times a day, I have to wake the screen and then tap and hold the flashlight button in the bottom-left corner of the lock screen. The camera is slightly more convenient, accessible with a left swipe on the lock screen, but the screen still has to be on for that to work. And if I'm actually using the phone, the easiest way to get to the flashlight or camera is through Control Center, which involves swiping down from the top-right corner and picking out one specific icon from a grid.

In other words, if I look up from my phone and notice my cat doing something cute, he may very well have stopped by the time I actually get the camera open. It’s not that it’s difficult to launch the camera or turn on the flashlight — it’s just that it could be so much more convenient if there were a dedicated button or squeeze gesture. Apple even briefly acknowledged this when it made a battery case for the iPhone that had a button to launch the camera. A few seconds saved here or there add up over the lifetime of a phone.

Just to prove the point, here’s how fast launching the camera is on my…



