
JR Raphael
Contributing Editor

The Pixel phone’s Motion Sense mystery

news analysis
May 19, 2020 | 10 mins
Android | Google | Mobile

Google's latest Pixel phone had what should have been a killer new kind of mobile technology. But we may never get to experience its full potential.

[Image: Google Pixel phone. Credit: Google/JR Raphael]

This time nearly a year ago, I was genuinely excited.

The month was June of 2019. Rumors were flying fast about Google’s then-still-under-wraps Pixel 4 phone, and an especially juicy one had just made its way to the surface.

The Pixel 4 would feature a wild new kind of radar system, the grapevine informed us — a system we’d been hearing about from Google for years but that had remained a lab-based experiment up until that point. It was called Project Soli, and Goog almighty, did it sound promising.

Project Soli got its start as a part of Google’s Advanced Technology and Projects (ATAP) division and had been the subject of several awe-inspiring demos over the years. The Pixel 4, though, would mark the first time we’d see it in an actual user-facing product — and the first time we mere tech-toting mortals would have the chance to experience its magical-seeming wares.

It should have been spectacular. And it should have been just the beginning.

But here we are today, nearly seven months to the day from the Pixel 4’s debut — and the phone’s flashy radar system, now known as Motion Sense, hasn’t even come close to meeting its potential. What’s more, a fresh set of rumors suggests Google could be giving up on the effort entirely with its Pixel phones and going Soli-free with this year’s Pixel 5 flagship.

If so, it’d be a classic Google about-face — yet another one of the company’s many moments of having some inspired idea, breathlessly convincing us of its value, and then losing interest and moving on instead of nourishing the notion and allowing it to develop. And with Motion Sense in particular, that’d be a damn shame to see — because this system really had the potential to turn into something special.

We’ll get into why as well as the question of what might’ve happened along the way in a moment. First, we need to rewind for a second to refresh ourselves on what Google’s crazy-sounding radar system was supposed to accomplish — and then come back to what it’s actually done so far. Because boy howdy, is there quite the contrast between those two things.

The Project Soli promise

So first: what Google’s Pixel-based Soli radar was supposed to be capable of doing for us. At its core, the Soli radar was designed to track the tiniest of hand movements — “micromotions” or “twitches,” as they were lovingly called during development. The system was then trained to “extract specific gesture information” from the radar signal at a “high frame rate,” in the words of its engineers.

What that ultimately meant was that the chip could, in theory, sense precisely and reliably how you were moving your hand at any given moment and then perform an action on your device that was mapped to that specific movement. And it wouldn’t require any complicated mime-on-mescaline-like actions to make that happen.
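To make that mapping idea concrete, here’s a minimal, purely illustrative Python sketch. Every name in it is hypothetical — this is not Google’s actual Soli or Motion Sense API — but it captures the core concept: the radar classifies a hand micromotion into a gesture label, and the device then runs whatever action is mapped to that label.

```python
# Illustrative sketch only: hypothetical names, not Google's Soli API.
# Idea: a recognized gesture label is dispatched to a mapped device action.

from typing import Callable, Dict, Optional


class GestureDispatcher:
    """Maps recognized gesture labels to device actions."""

    def __init__(self) -> None:
        self._actions: Dict[str, Callable[[], str]] = {}

    def register(self, gesture: str, action: Callable[[], str]) -> None:
        self._actions[gesture] = action

    def dispatch(self, gesture: str) -> Optional[str]:
        # Unrecognized motions are ignored rather than raising an error,
        # mirroring how a phone should treat ambiguous radar input.
        action = self._actions.get(gesture)
        return action() if action is not None else None


# Hypothetical wiring, echoing the examples in the text:
dispatcher = GestureDispatcher()
dispatcher.register("swipe_right", lambda: "next slide")
dispatcher.register("swipe_left", lambda: "previous slide")
dispatcher.register("finger_twist", lambda: "adjust volume")

print(dispatcher.dispatch("swipe_right"))  # -> next slide
```

The interesting (and hard) part in a real system is everything upstream of this dispatch step: turning raw radar returns into a reliable gesture label at a high frame rate, which is exactly what the Soli team spent years training models to do.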

It’s one of those things you kind of have to see to appreciate. And this early Soli video, created long before the Pixel 4 came into the equation, does a dynamite job of demonstrating it:

It was in part because of Soli’s unusual ability to detect those fine hand movements — an effect of its radar-based nature as opposed to the more ordinary camera-driven methods most gesture-recognizing systems rely on — that I was so optimistic about what it could do for us in a Pixel phone. But even that wasn’t the entire story.

According to earlier Soli demos, the nature of the radar technology meant the system could detect hand movements from as far as 49 feet away — meaning you could be a full tractor-trailer’s length from your device and still fire off a hand gesture to control it. And since it’s all radar-based, you wouldn’t even have to position your hand in any direct line-of-sight path in order for your commands to be detected. You wouldn’t even have to have any line-of-sight at all, in fact; the folks from Google’s ATAP group had noted that the Soli radar could detect hand movements even through fabrics, with no visible path between your hand and the gadget.

That’s some seriously sci-fi-level stuff, right? I mean, just imagine all the productivity-oriented possibilities (again, in theory): You could wave your hand left or right in the air to control a presentation playing from your phone and move on to the next slide — or you could lift your hand up or down to scroll through a document or web page. With a twist of your fingers, you could adjust your device’s media volume. And all of that could happen even from across a large conference room and even if your phone were tucked away inside a bag.

Heck, with the right kind of smart-hardware integrations — integrations most of us hoped would arrive eventually — you could adjust something like the level of light in a room with an associated gesture (and, if you really want to impress your fellow nerds, an enthusiastic “Lumos!” incantation for good measure). And productivity-oriented purposes aside, you could control all sorts of facets of your phone while driving, running, working out, working outside, or doing anything else where your hands aren’t readily available.

So. Much. Potential. And as Google emphasized repeatedly when Motion Sense was first being shown off with the Pixel 4, the phone’s earliest capabilities were “just the start” — “just as Pixels get better over time,” the company emphasized, “Motion Sense [would] evolve as well.” Google wanted to give us all time to get used to this new manner of interacting with our devices, the thinking went, and would expand the system’s unique “language” and the capabilities around it as time moved on.

And yet, here we are.

The Motion Sense mystery

This week officially marks seven months into the Pixel 4’s life — and we’ve got basically the same set of limited Motion Sense features we saw when the phone arrived last October: You can skip songs, silence an alarm, stop a timer, or quiet an incoming call’s ring by waving your hand over the phone. Google made one trivial addition to that list in February, allowing you to pause music by making a tapping gesture above the device’s display, but that’s all the progress there’s been so far.

Perhaps most disappointingly, all of those gestures work only when you perform ’em within a few inches of the phone’s screen, with your hand directly above it. They’re mildly useful on occasion — like if you’re working out or have your hands dirty with something, for instance, and want to adjust audio playback without having to mess with your phone — but it’s more mild added convenience than anything transformative. And as someone who’s owned a Pixel 4 since the start, I’m still baffled by how often it takes multiple tries to get the gestures to work properly (and that’s to say nothing of the moments when I activate a Motion Sense gesture inadvertently, which is consistently infuriating).

The system’s ability to enhance the Pixel’s face unlock mechanism by detecting when you’re reaching for your device and then proactively activating the display is a genuinely nice touch, but it’s a relatively subtle enhancement over the gyroscope-driven “Lift to check phone” feature built into Android — and it’s still far from addressing the vast promise this technology presented us with at the get-go or justifying its complication-adding presence in Google’s flagship phone.

So what in the world happened? How did we go from awe-inspiring, potential-filled innovation to a limited set of so-so gestures, next to no follow-up development, and now the possible phasing out of Motion Sense entirely — and with it, presumably, the end of any real, focused work on the system on the phone front?

The obvious explanation is the typical one: Google lost interest, changed priorities, and decided to cut its losses and move on — an all-too-familiar tale for those of us who watch this company closely. Today’s strategic focus is tomorrow’s abandoned plan. It happens all the freakin’ time.

But it’s possible there may be more to this — provided, of course, that the Pixel radar pivot is actually happening. Earlier this year, remember, we saw signs that the upcoming Pixel 5 flagship could use a more midrange-level processor instead of the top-of-the-line chip most 2020 Android flagships are packing. That’s especially significant because that top-of-the-line chip requires the presence of 5G, which makes the phones using it exceptionally expensive and with little added benefit for most of us.

Google going with a more cost-effective chip for the Pixel 5 could eliminate the premature 5G-fixation so many other device-makers are now exhibiting. And that, in turn, could pave the way for Google to lower the Pixel 5’s price from the line’s current $800 starting point to, say, somewhere in the ballpark of $600 or $700 — a notion reinforced by an apparent survey making the rounds right now to gauge perception of a $699 Pixel phone.

Combined with recent reports of particularly disappointing Pixel 4 sales, it’s entirely possible Google decided going for a lower-priced phone was its best chance at helping the Pixel achieve more mainstream success — something I think makes an awful lot of sense. And it’s quite conceivable that a fancy radar-gesture system simply wouldn’t fit into a more value-minded scenario.

So if that proves to be the case, then an absence of Motion Sense in a Pixel 5 could actually be sensible. But regardless, what we saw with the feature at the start of the Pixel 4’s life was supposed to have been just the beginning. Google even hinted that the technology could make its way to other types of devices — something I optimistically hoped would serve as a missing piece of the puzzle that really showed off the value of Google’s homegrown hardware effort.

And who knows? Maybe some of that development will still happen on a less unified, more piecemeal level. (Just last week, word broke about an apparent Google patent filing involving Motion-Sense-reminiscent gestures on a smartwatch, but it seems to use an “optical sensor” of some sort instead of the Soli-like radar system. It was also filed all the way back in January of 2019. And patent filings in general tend to fly fast and frequently in the tech world and often have no direct connection to a company’s actual active road map.)

But knowing Google, it’s hard not to wonder if this could be the end — the end of an exciting and promising technological journey that never truly progressed past its beginning. Google’s willingness to constantly reassess its products and pivot away from once-prominent plans can sometimes be an asset. But that same lack of commitment and willingness to stick with something long enough to see it through can also be a liability. And most frustrating of all is the realization that, if things end up playing out as expected here, we’ll likely never know what could have been. 

Sign up for my weekly newsletter to get more practical tips, personal recommendations, and plain-English perspective on the news that matters.


JR Raphael
Contributing Editor

JR Raphael has been covering Android and ChromeOS since their earliest days. You can ingest his advice and insight in his long-standing Android Intelligence column at Computerworld and get even more tasty tech knowledge with his free Android Intelligence newsletter at The Intelligence.
