JR Raphael
Contributing Editor

30 time-saving tricks to try with the new Google Assistant

how-to
Oct 29, 2019 | 11 mins
Android | Artificial Intelligence | Enterprise Applications

Google's revamped Assistant can do some really useful stuff — if you know what to ask.

Google’s new and improved Assistant may not be a full-fledged revolution quite yet, but it sure is a substantial upgrade over the traditional Assistant experience.

The new Assistant, if you haven’t heard, is currently something you can experience only on the new Pixel 4 phone. But rest assured, that’ll change before long: I’ve confirmed with Google that the revamped setup will indeed be making its way to the broader Android ecosystem sometime next year. So even if you don’t have it in your hands this minute, it should reach you eventually.

So what makes the new Assistant so special? Well, the interface is dramatically improved, for one: Instead of having a big honkin’ box pop up every time you summon your virtual genie, the Assistant now shows it’s listening with an animated, multicolored bar at the bottom of your screen. The words you speak appear in that same area, and responses pop up in a small partial-screen window instead of requiring you to stop what you’re doing and shift over to the full Google app.

[Image: The new Google Assistant on the Pixel 4]

On top of that, the new Assistant moves much of its actual processing onto your device instead of relying on a constant connection to Google’s servers — and that, in turn, opens up the door to some nifty new ways Assistant can work for you. As of now, there’s no easy way to know what those freshly added options are; instead, it’s up to you to flail around and experiment to find ’em.

Today’s your lucky day, though, for I am a master flailer. Flailing, in fact, is what I do best. (That’s probably because I’m at least 7 percent amphibian, according to a recent analysis that I can neither confirm nor deny.) So spare yourself any further flailing, and allow me to take you on a tasty, totally flail-free tour of what the new Assistant can do.

After living with the Pixel 4 for the past several days, these are some of the most interesting and potentially useful new tricks I’ve discovered.

Ask more of your apps

Google Assistant can now open apps on your phone faster than ever, but that’s only half the story. With the new Assistant, you can actually go a step further and pull up specific parts of apps, all by giving your beautiful little larynx a little bit of exercise.

For example, you could say to Assistant: Open my messages with Maude — and just like that, you’d be catapulted over to your messaging thread with your lovely friend (and possibly Great-Great-Great Aunt) Maude.

The same basic idea can work with other apps, too. For instance, you could tell Assistant to search for pickling instruction videos in YouTube — or search for Made by Google on Twitter — and the service would open the respective apps (assuming you have them installed, of course) and take you directly to the search you requested.

You can start that sort of search while you’re actively using an app as well. In apps with visible search boxes — YouTube, Twitter, Gmail, Google Docs, Google Drive, Google Maps, and so on — firing up Assistant and saying search for followed by any term will immediately launch a search within that app.
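If you’re curious how an app can receive that kind of handed-off search on Android, the long-standing public mechanism is a searchable activity that’s sent an ACTION_SEARCH intent with the query attached. Here’s a minimal Kotlin sketch of that pattern; the activity name is hypothetical, a real app would also need the usual searchable configuration in its manifest, and none of this is a claim about how Assistant is wired up internally:

```kotlin
import android.app.Activity
import android.app.SearchManager
import android.content.Intent
import android.os.Bundle

// Hypothetical example of Android's standard searchable-activity pattern.
class SearchResultsActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        handleSearchIntent(intent)
    }

    override fun onNewIntent(intent: Intent?) {
        super.onNewIntent(intent)
        intent?.let { handleSearchIntent(it) }
    }

    private fun handleSearchIntent(intent: Intent) {
        if (intent.action == Intent.ACTION_SEARCH) {
            // The query text arrives in the standard SearchManager.QUERY extra.
            val query = intent.getStringExtra(SearchManager.QUERY) ?: return
            performSearch(query)
        }
    }

    private fun performSearch(query: String) {
        // Placeholder: a real app would update its UI with results for `query`.
    }
}
```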

But wait! There’s more: In Google Photos, you can use the same logic to narrow things down even further once a search has begun. Let’s say you started by telling Assistant: Find my photos from Halloween. Once Assistant pulls up the Photos app and shows you all your Halloween images, you could then say the ones with pumpkins — and it’d search within your results to show you only the pictures that are both from Halloween and with pumpkins in them.

And the same sort of thing works in Maps. While in the app, you might activate Assistant and then say: Search for restaurants near me. When you get a list of places, you could follow up by saying the ones that serve barbecue — and Assistant would narrow down the field accordingly.
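As background, that kind of nearby-places search is also available to any Android app through a plain geo: URI, which the installed maps app handles. This is a rough Kotlin sketch of that public mechanism, with a made-up helper name, not a description of Assistant’s internals:

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical helper: ask the device's maps app to search for places.
fun searchPlacesNearby(context: Context, query: String) {
    // "geo:0,0?q=..." is the standard URI for a place search in the maps app.
    val uri = Uri.parse("geo:0,0?q=" + Uri.encode(query))
    context.startActivity(Intent(Intent.ACTION_VIEW, uri))
}
```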

Not bad, right? And all of this is still just the start.

Interact with info on your screen

Right now, the new Assistant doesn’t have the “What’s on my screen?” button that used to be present in the service; in fact, if you ask Assistant what’s on your screen, it’ll give you a confusing message instructing you to touch and hold your phone’s non-existent Home button and then to tap the also-non-existent “What’s on my screen?” button in order to access that function.

[Image: The new Google Assistant’s response when asked what’s on the screen]

(Google tells me that appears to be a bug, so it’ll presumably be fixed before long.)

Even without that option in place, though, the new Assistant is able to view your screen and interact with info in some pretty interesting ways — ways that are arguably more useful and consistently effective than the old system’s method, where the service would attempt to read your entire screen and then guess what you wanted to know about.

For instance, if you see a physical address on your screen — be it on a web page, in an email, in a document, or whatever — you can activate Assistant and say navigate there.

Similarly, anytime you see a phone number, you can summon Assistant and say call that number.

Got an email with a non-hyperlinked web address in it? Save yourself some highlighting and simply tell Assistant to open that web page.

And if you’re watching a movie trailer on YouTube, try asking Assistant: When’s that playing near me? You’ll get the answer faster than you can say “sorry, sucky Siri.”
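For the technically curious, these on-screen actions line up conceptually with Android’s everyday intent system: navigation, dialing, and opening a link are all things any app can trigger with a standard ACTION_VIEW or ACTION_DIAL intent. Here’s a minimal Kotlin sketch of those public equivalents (the helper names are mine, and this isn’t a claim about how Assistant does it under the hood):

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical helpers showing the standard intents these voice actions resemble.

fun navigateTo(context: Context, address: String) {
    // Google Maps' navigation URI scheme starts turn-by-turn directions.
    val uri = Uri.parse("google.navigation:q=" + Uri.encode(address))
    context.startActivity(Intent(Intent.ACTION_VIEW, uri))
}

fun dialNumber(context: Context, phoneNumber: String) {
    // ACTION_DIAL opens the dialer with the number filled in (no call permission needed).
    context.startActivity(Intent(Intent.ACTION_DIAL, Uri.parse("tel:$phoneNumber")))
}

fun openWebPage(context: Context, url: String) {
    // ACTION_VIEW on an http(s) URL opens the default browser.
    context.startActivity(Intent(Intent.ACTION_VIEW, Uri.parse(url)))
}
```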

Capture and share screenshots in a snap

‘Twas a time when taking a screenshot on Android was a painfully complicated affair. (Anyone else remember the days of connecting a computer to your phone just to do that?!) With the new Assistant, though, capturing and then sharing what’s on your screen at any given moment has never been easier.

From anywhere on your phone, all you’ve gotta do is tell Assistant to take a screenshot, and it’ll oblige in a split second. And here’s the really neat part: As soon as that screenshot is captured, you can follow up and tell Assistant to send it to anyone in your contacts — and just like that, it’ll open up Messages and attach the screenshot to the appropriate thread.

Share from anywhere

Speaking of sharing, the new Assistant makes it super-simple to share almost anything, from anywhere. Whether you’re looking at an image in your Google Photos collection, viewing a web page in Chrome, or watching a video in YouTube, you can activate your Assistant and tell it to share this with or send this to anyone you want — and Assistant will instantly take you over to that person’s thread in your messaging app with a link pasted in and ready.

If you use that command in a place that doesn’t lend itself well to link-based sharing, meanwhile — say, a message you’re viewing in Gmail, something you’re looking at in Drive or Docs, or even some sort of results or non-page-specific info you’re checking out within the Google app — Assistant will generate a screenshot and then share that image with the person you specify, all in the span of a single second. Once again, all you have to do is open Assistant from wherever you are and tell it to share this with or send this to whomever you want.
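Behind the scenes, this kind of hand-off is exactly what Android’s standard share mechanism (ACTION_SEND) exists for: a link goes out as plain text, a screenshot as an image URI. The Kotlin sketch below shows that general platform pattern with hypothetical helper names; it’s an illustration of the mechanism, not of Assistant’s own code:

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical helpers illustrating Android's standard ACTION_SEND sharing.

fun shareLink(context: Context, url: String) {
    val send = Intent(Intent.ACTION_SEND).apply {
        type = "text/plain"
        putExtra(Intent.EXTRA_TEXT, url)
    }
    context.startActivity(Intent.createChooser(send, "Share link"))
}

fun shareScreenshot(context: Context, screenshotUri: Uri) {
    val send = Intent(Intent.ACTION_SEND).apply {
        type = "image/png"
        putExtra(Intent.EXTRA_STREAM, screenshotUri)
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    context.startActivity(Intent.createChooser(send, "Share screenshot"))
}
```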

Reply without lifting a finger

While you’re looking at a thread within Google’s Messages app, take note: You can now reply to anyone you’re talking to by telling Assistant reply followed by your message.

Control your camera

Assistant’s camera control capabilities had become curiously limited as of late, but the new Assistant fixes that. From anywhere in your phone, try some of these camera-specific commands:

  • Take a picture — to open your Camera app and then take a picture with your phone’s primary camera in three seconds
  • Take a selfie — to snap a photo using the phone’s front-facing camera after a countdown of three
  • Take a picture (or a selfie) in Night Sight — to capture a picture using whichever camera you prefer with Night Sight mode already activated, again with a three-second countdown
  • Take a picture (or a selfie) in Portrait — to take a picture using either camera with Portrait mode activated, with that same three-second countdown
  • Take a panorama — to launch immediately into a panoramic photo-capturing sequence
  • Take a video — to launch immediately into recording a video (note that there’s no countdown for this one by default)
  • Take a slow-motion video — to launch immediately into recording a slow-mo video (again, no countdown here by default)
  • Take a video time-lapse — to open your camera’s time-lapse function (after which you’ll have to select the specific setting you want to use before the recording will begin)

Prefer to set your own timer for any of these functions? No problem: Just add in 10 seconds (or whatever amount of time you want) after any of those commands to create your own custom countdown.
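If you like knowing what’s underneath, those photo and video commands map loosely onto the camera intents Android has exposed for years, though the Pixel-specific extras (Night Sight, Portrait, the spoken countdowns) live inside the Google Camera app rather than in these generic intents. Here’s a small Kotlin sketch of the public intents, for illustration only:

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.MediaStore

// Illustrative only: the generic system intents for opening the camera app.

fun openCameraForPhoto(context: Context) {
    // Launches the default camera app in still-photo mode.
    context.startActivity(Intent(MediaStore.INTENT_ACTION_STILL_IMAGE_CAMERA))
}

fun openCameraForVideo(context: Context) {
    // Launches the default camera app in video mode.
    context.startActivity(Intent(MediaStore.INTENT_ACTION_VIDEO_CAMERA))
}
```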

Get to Google Lens

Strangely absent from the new Assistant setup is a shortcut for opening Google Lens, Google’s system for searching the real world through your phone’s camera. Lens can identify everything from plants and animals to landmarks around you, and it also lets you copy and paste text or translate words from physical signs and documents.

Fear not, though, for where there’s a will, there’s a way. If you want to get to Lens from the new Assistant, just grab the official Google Lens app from the Play Store. It’s basically just a shortcut to the main Lens service, but once it’s installed, you’ll be able to tell Assistant to open Google Lens — and just like that, you’ll have it open and ready for action.
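That trick works because the Lens app, like any installed app, can be launched by its package name. If you wanted to do the same thing programmatically, the Kotlin sketch below shows the general approach; note that the package name is my assumption about the Lens app’s identifier, so verify it on your own device before relying on it:

```kotlin
import android.content.Context

// Launch an installed app by package name; the Lens package name below is an assumption.
fun openLens(context: Context) {
    val lensPackage = "com.google.ar.lens"  // Assumed package for the Google Lens app.
    val launchIntent = context.packageManager.getLaunchIntentForPackage(lensPackage)
    if (launchIntent != null) {
        context.startActivity(launchIntent)
    }
}
```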

Let Assistant keep listening

One of the new Assistant’s most convenient capabilities is its ability to keep listening for additional commands after you’ve told it something. That’s what lets the service do things like drill down deeper into searches within Photos or send additional messages (via the reply command) after you’ve shared something with someone in Messages.

But for now, at least, that ability is oddly disabled by default. To turn it on, tell Assistant: Open my Assistant settings. Then, turn the toggle next to “Continued Conversation” on — and speak away to your heart’s content.

Edit your queries more easily

Let’s face it: For as good as Assistant may be at understanding what you’re asking, it sometimes gets it wrong (and by “sometimes,” I mean “shockingly often, even if you don’t have a total mush-mouth”). The new Assistant has a handy way to correct its misinterpretations, though, without forcing you to start over and issue your command again with the desperate hope that it understands you the second time.

With any command that results in an on-screen answer, you’ll see the words you uttered — as Assistant interpreted them — at the top of the Assistant interface, right next to your profile picture. You’d probably never realize it, but you can actually tap those words. And once you do, a keyboard will pop up with your full query in a box, where you can then tweak it as needed.

[Image: Editing a voice query in the new Google Assistant]

In some cases, when Assistant thinks it might’ve misunderstood you, it’ll actually even underline part of your query and put an “Edit” command (complete with a pencil icon) below it. But even when that prompt doesn’t appear, all it takes is a quick tap of the text to refine your command.

So there ya have it: tons of new tricks to try with Google’s new and improved Assistant. Now, let’s be realistic: None of this stuff is gonna dramatically change your life. You’re still gonna be tapping your screen for most things, most of the time. But Assistant’s newfound capabilities could genuinely be useful in certain circumstances — and once you get in the habit of using ’em, you’ll almost certainly rely on Assistant more than you did before.

That’s a win for Google, without a doubt — and with every second saved, it’s a win for you, too. And just think: You didn’t even have to flail to get here.

Sign up for my weekly newsletter to get more practical tips, personal recommendations, and plain-English perspective on the news that matters.


JR Raphael
Contributing Editor

JR Raphael has been covering Android and ChromeOS since their earliest days. You can ingest his advice and insight in his long-standing Android Intelligence column at Computerworld and get even more tasty tech knowledge with his free Android Intelligence newsletter at The Intelligence.
