Everything Important From Google I/O 2018 in an Easy-to-Read Bulleted List

Besides the admittedly weird but also super interesting machine-learning-produced music, and the revelation that beer foam cannot defy physics, there were a bunch of things announced at Google I/O that I think are worth mentioning even to people who aren’t developers. From new Google Maps features to Android P UI changes, here are the most important things I found from watching the Google I/O 2018 Keynote (so you don’t have to).

Gboard and Gmail

Gmail Smart Compose

  • Gboard now supports Morse code input, which helps people with disabilities.
  • Smart Compose is a new autocomplete that suggests whole phrases instead of just words; it will roll out in Gmail first, later this month.

Google Assistant

  • FINALLY! Google Assistant can now handle follow-up commands (a feature Google calls Continued Conversation). After you issue a command, it keeps listening for another one instead of making you say “Hey, Google” every time. You know, like a real human would (and like Alexa already does).
  • Multiple commands can now be issued at once without repeating “Hey, Google,” e.g. “Turn off the kitchen and living room lights.”
  • Six new voices are coming to Google Assistant. John Legend’s voice is also coming.
  • Optionally add “Please” to Google Home commands to teach children to be more polite.
  • Google Assistant is coming to navigation in Google Maps and to YouTube TV.
  • Google Duplex will let Google Assistant make real phone calls to businesses for you and have a full-blown conversation with the person on the other end to book an appointment or reservation without you needing to be involved. I’ll believe it when I see it, but if it works, that is insane. Coming in the next few weeks as an “experiment.”
  • Google Photos will now suggest actions like fixing the exposure of an image, sending a photo to a friend it recognizes in it, converting a photo of a document to a PDF, adding color to an old black-and-white photo that never had it, etc.
  • Google is adding reminders to take breaks from apps, starting with YouTube.
  • Get a daily digest notification for YouTube instead of individual push notifications as videos come out (maybe this means all of our subscribers will actually get our notifications, instead of YouTube only sending them to whoever it thinks will click and refusing to send to the rest, which is how it works now).

Google News

Google News Full Coverage

  • Google News uses new AI and reinforcement learning to figure out which stories to show you, and you can choose to see more or less of each.
  • You can still get to top stories.
  • All of it uses the new material design.
  • Newscast is a new feature that resembles Moments from Twitter. It’ll grab info from a bunch of sources on a topic it thinks you are interested in and display it all in one place.
  • Full Coverage uses “temporal co-locality” to show you the same story from various sources so you get more perspective on it. It’s displayed as a timeline if the story has been developing for some time.
  • You can subscribe (free and paid) to publications you like within the news app with just one click.
  • Subscribe with Google lets you use your Google account to subscribe to publications with one click, both within the app and on the sites of publishers Google has partnered with.
  • The new Google News is available today.

Android P


Google showed off some Android P features, but it’s only a small sampling; the full list is available here if you’re curious.

  • Adaptive Battery, built in partnership with DeepMind, learns how you use apps and essentially stops them from waking up the CPU unnecessarily, conserving power.
  • Adaptive Brightness learns how you set the brightness slider in different ambient lighting instead of deciding what brightness to use based on current lighting alone.
  • App Action predictions show you actions you might be about to take instead of just predicting which apps you’ll open.
  • Slices bring pieces of apps and their actions directly into search so you don’t have to open the app itself (e.g. searching for Lyft in the search bar might give you the ability to call a ride without opening Lyft). Slices are coming to developers next month.
  • ML Kit gives developers APIs for speech, image, and facial recognition, along with other common machine learning tasks, to use directly in their apps without needing to code any of it themselves.
  • Android P has a simplified new UI.
  • Gestures on the on-screen home button replace the navigation buttons. Swipe up once for multitasking, swipe up again to open the app drawer, swipe left or right on it to scrub through a timeline of recently used apps, etc.
  • Horizontal cards in the multitasking view let you see all the information in each app without it being covered by another. You can also interact directly with the apps in this view (e.g. copy and paste text from one to another without having to open it).
  • The volume control has moved to the side of the screen instead of the top, and it adjusts media volume instead of ringer volume by default, which makes so much more sense to me.
  • The Dashboard shows you how often and when you use the apps on your phone (and will give this info to developers, by the way).
  • App Timers allow you to set limits on apps.
  • Do Not Disturb has been updated to silence visual notifications as well as audio ones, and you can flip your phone face-down to turn it on (a gesture Google calls Shush).
  • Wind Down Mode fades the screen to greyscale to reduce stimulation before bedtime.
  • Coming in Fall to Pixel devices.
  • The Android P Beta is available now (and, for the first time, it’s coming to devices that aren’t Pixels).

Google Maps

  • Tells you about parking, wait times, and how long people usually spend at locations.
  • Can now give motorcycle directions instead of just car (Not quite sure why. Are we assuming motorcyclists don’t obey speed limits?).
  • For You tab has new locations, trending places and personal recommendations for places around you.
  • Your Match gives a personalized score for each place it suggests instead of just overall star ratings.
  • Shortlists lets you create lists of places that you can share with friends (I use Foursquare for this, but maybe that’ll change now).
  • The camera can now combine Street View and image recognition to immediately show you which way to go, instead of you walking up and down the street to see which way the dot is moving and deciding from there (I do that all the time).

Google Lens

  • Google Lens can detect buildings, landmarks, animals, etc.
  • Google Lens will be integrated directly into the camera app on the LG G7 and Pixel devices (more devices coming).
  • Google Lens can recognize words so you can copy and paste text from the real world to your phone, or tap those words to quickly get answers from Google.
  • Style Match lets you find items similar in style to something you point the camera at instead of just trying to find the exact item (I might want to use this to help me decorate my apartment; I’m useless).
  • These features are coming to Lens in the next few weeks.

There you go, guys! Hope that helped some of you save some time instead of watching the keynote, and expect a few videos later on some of the things in here (namely, how to get the Android P Beta on your device is coming ASAP for those who want to mess with it).

