
JR Raphael
Contributing Editor

Project Soli in depth: How radar-detected gestures could set the Pixel 4 apart

news analysis
Jun 12, 2019 · 8 mins
Android, Computers and Peripherals, Google

An experimental Google project may finally be ready to make its way into the real world — and the implications could be enormous.

[Image: Google Project Soli and the Pixel 4. Credit: Google]

Well, here’s something straight out of science fiction for ya: If a pair of recent rumors is to be believed, Google may be planning to include a futuristic radar chip in its next Pixel phone — possibly to allow for a wild new kind of touch-free gesture control.

Stop and let that sink in for a second. Crazy stuff, right?

Now, let’s be clear: The chip itself is absolutely real — no question about it. Google’s been talking about the thing since 2015, in fact, as part of its Motorola-born Advanced Technology and Projects (ATAP) group. That’s the same group that came up with Google’s now-defunct modular smartphone system, Project Ara, as well as the also-abandoned Project Tango program that aimed to create a new kind of augmented reality platform.

The radar chip is part of an ongoing ATAP effort called Project Soli — and according to the website 9to5Google, the first Soli-based chip could make its debut in this fall’s Pixel 4 phone. That notion lines up with a separate report from the gang at XDA Developers, which discovered some code in the Android Q beta software that points to OS-level support for a series of top-secret gestures — not the standard screen-based Android Q gestures, mind you, but a whole new category of hands-in-the-air movements that require a special “aware sensor” in order to be recognized.

Oh, and earlier this year, Google got a “waiver” from the FCC that gave it permission to operate those same Soli sensors at a higher frequency than current regulations allow. In its report, the FCC said the move would “serve the public interest by providing for innovative device control features using touchless hand gesture technology.”

Good golly, Miss Soli, there sure is a lot going on here. Let’s step back for a second and explore this whole thing a little more closely so we can get the full context of what’s actually up and what might be looming in the not-so-distant future.

Project Soli’s not-so-humble start

We’ll start at the beginning: Back in May of 2015, as part of its I/O developers’ conference, Google first took the wraps off its Project Soli, erm, project. At the time, the concept seemed a bit far-fetched — like another one of those lab-based ideas that’d blow us away in a demo but never make its way to the real world.

In a video detailing the effort, the folks behind Project Soli described how the chip would use radar to track the tiniest hand movements — “micromotions” or “twitches,” even — and then use those motions to interact with different types of virtual interfaces. The system, they explained, was designed to “extract specific gesture information” from the radar signal at a “high frame rate.”

Translated into non-geek-speak, that means the chip can sense precisely how you’re moving your hand — making a twisting-like motion as if you were turning a volume knob up or down, for instance, or tapping your thumb and index finger together as if you were tapping a button — and then perform an action on your device that’s mapped to that specific movement.
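To make that pipeline a little more concrete, here's a minimal sketch of the basic flow in Python: radar frames in, features out, gestures classified, actions dispatched. To be clear, everything in it is a hypothetical stand-in for illustration; the feature extraction, the toy classifier, and the gesture names are my own assumptions, not Google's actual Soli implementation.

```python
# Illustrative sketch of a radar-gesture pipeline (not Google's actual code).
# Each incoming radar frame is reduced to a feature vector, classified as a
# gesture, and the recognized gesture is mapped to a device action.

from typing import Iterable, Optional

import numpy as np

# Hypothetical gesture-to-action table.
ACTIONS = {
    "knob_turn_cw": lambda: print("volume up"),
    "knob_turn_ccw": lambda: print("volume down"),
    "finger_tap": lambda: print("button press"),
}

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Collapse a raw radar frame into a tiny feature vector
    (total reflected energy and its spread across the frame)."""
    energy = np.abs(frame) ** 2
    return np.array([energy.sum(), energy.std()])

def classify(features: np.ndarray) -> Optional[str]:
    """Toy stand-in: a real system would run a trained model over a
    sliding window of features at a high frame rate."""
    return "finger_tap" if features[0] > 650.0 else None

def run(frames: Iterable[np.ndarray]) -> None:
    for frame in frames:
        gesture = classify(extract_features(frame))
        if gesture in ACTIONS:
            ACTIONS[gesture]()  # dispatch the mapped device action

if __name__ == "__main__":
    # Smoke test: feed the pipeline a few random 10x64 "frames."
    run(np.random.randn(10, 64) for _ in range(5))
```

All the interesting engineering hides inside that classify step, of course; Soli's whole pitch is that the recognition runs fast enough, and at fine enough resolution, to pick up those "micromotions."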

Seriously, take four minutes to watch this. It’ll blow your mind.

That’s all cool, of course, in that Google I/O developer-session-demo sense — the sense that you assume it’s something you’ll never actually see or use in your regular-person life (or at least not anytime soon). When you add in the possibility of this technology showing up in this fall’s Pixel 4, though, it takes on a whole new meaning.

And there’s more to the Soli story yet.

The Soli sensor evolution

About a year after its debut, Project Soli showed up again, this time in a session at the 2016 Google I/O conference. That year, the Soli team announced it had shrunk the chip down and optimized it to be small and efficient enough to run on standard smartwatch hardware — a significant step from the supercomputer-level power typically required to run most radar technology.

“If you can make something run on a smartwatch, you can make it run anywhere you want,” Soli head honcho Ivan Poupyrev explained.

[Image: The Project Soli radar chip. Credit: Google]

Poupyrev and his associates went on to note that the Soli chip ran on Android software and already worked not only with watches but also with phones and home entertainment devices. What’s more, the radar could sense gestures being performed as far as 15 meters — roughly 49 feet (!) — away. And critically, they pointed out that the idea was not to replace existing forms of interaction but to provide an additional option on top of the more mundane methods we all already knew.

“It offers a third dimension of interaction which complements and enhances other interaction modalities such as touch-screen and voice input,” Poupyrev said. “We don’t fight with them. We work together.”

The full presentation is pretty long, but there’s a demo halfway through that’s well worth watching. I’ve got it cued up for you here:

One bit that struck me was the gesture shown for holding up your hand to stop a speaker from playing. It’s eerily reminiscent of the gesture built into the recently announced Google Nest Hub Max smart display (gesundheit!) — almost identical, in fact:

The Nest Hub Max seems to use a regular camera to recognize the gesture, but regardless, the similarity in implementation hardly seems coincidental — especially when you consider what the Soli team has consistently said about establishing a “universal set of gestures” that you can eventually use to “control any device around you.”

Soli’s next steps

So what else can Project Soli and its gesture-sensing radar technology do? From the looks of it, plenty. An exploration from the University of St Andrews in Scotland shows Soli performing tasks like counting cards, sensing compass orientation, and analyzing patterns of blocks in Lego towers.

“The sensing technique remains rather similar, [but] the main contribution is the vast exploration into … the counting, ordering, stacking, movement, and orientation of different objects,” the researchers told The Verge earlier this year. The chips, they added, could even be built into smart home devices in order to monitor specific items in a house and detect if anything about them ever changes.

Hmmmmmm.

Project Soli and the Pixel 4

So putting it all in perspective, what might we expect if this Soli chip does in fact make its way into the Pixel 4 phone? The clues found within the Android Q software suggest there might be gestures for commands like silencing music or skipping tracks, but it’s hard to imagine the effects of this technology ending there.
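Just to picture how those reported hooks might behave, here's a tiny dispatch sketch in the same illustrative vein. The "silence" and "skip" labels follow the behaviors reported from the Android Q beta strings; the mapping itself, and every name in it, is an assumption for illustration rather than anything confirmed by Google.

```python
# Toy mapping from reported "aware sensor" gestures to media commands.
# The "silence" and "skip" labels mirror the behaviors reported from the
# Android Q beta code; the dispatch logic is purely illustrative.

MEDIA_COMMANDS = {
    "silence": "pause playback / mute alerts",
    "skip": "advance to the next track",
}

def on_aware_gesture(gesture: str) -> None:
    command = MEDIA_COMMANDS.get(gesture)
    if command is None:
        return  # unrecognized motion; better to ignore than to guess
    print(f"gesture '{gesture}' -> {command}")

on_aware_gesture("skip")     # gesture 'skip' -> advance to the next track
on_aware_gesture("silence")  # gesture 'silence' -> pause playback / mute alerts
```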

Looking at all the demos and ideas discussed in the early Soli materials, it seems like there’s a world of possibilities just waiting to be tapped into. And if this technology truly is ready to make its way into widely available hardware, it only makes sense that it’d find its way into more than just the Pixel phone. Time and time again, the Google Soli team has talked about the technology working with wearables, speakers, phones, computers, and even vehicles. And guess what? Google has a hand in all of those areas in some capacity — so, provided the Pixel 4 debut pans out as predicted, it doesn’t seem like much of a stretch to consider Soli and its radar-detected gesture system becoming a common thread throughout many of the company’s future products.

As for how useful it’d actually be, that’s a question only time will answer. “Cool” and “practical” don’t automatically go hand in hand, and lots of eye-catching features end up feeling more gimmicky than helpful in the real world. But the possibilities Soli suggests certainly seem more valuable than those provided by the silly and limited “air gesture” systems we’ve seen on Android phones before.

Perhaps most significant, the nature of the chip and its radar technology means movements can be detected through fabrics — which suggests the gestures could work, at least in theory, even if the associated device is tucked away in a pocket or a purse. And “even though these controls are virtual,” Google has said, the interactions “feel physical and responsive” — with feedback “generated by the haptic sensation of fingers touching each other.”

From a bigger-picture perspective, what’s especially interesting is the way this advancement could demonstrate the power of Google’s still-relatively-young homegrown hardware philosophy. We’ve talked from the start about how the Pixel has always been much more than the sum of its parts and how the whole “holistic,” end-to-end control of the entire user experience is what Google really gains from building its own devices.

If this Soli stuff does in fact show up in the Pixel 4 this fall — and if it proves to be as effective and practical in the real world as it appears in these demos — we may see the biggest indication yet of how that approach could ultimately pay off, not only for Google but also for us as humans who carry and rely on its products. And software support aside, the Pixel phones may finally have the killer feature they need to set them apart from the rest of the smartphone pack and stand out from the flailing, tech-for-the-sake-of-tech forms of “differentiation” most of the industry is currently attempting.

Heck, it might almost be enough to offset the obnoxious nature of all the sales-driven, user-hostile changes we’ve been seeing in smartphone hardware as of late. Almost — and maybe.

Sign up for my weekly newsletter to get more practical tips, personal recommendations, and plain-English perspective on the news that matters.

JR Raphael
Contributing Editor

JR Raphael has been covering Android and ChromeOS since their earliest days. You can ingest his advice and insight in his long-standing Android Intelligence column at Computerworld and get even more tasty tech knowledge with his free Android Intelligence newsletter at The Intelligence.
