Google I/O is a festival for developers and enthusiasts. It’s also an opportunity for the search brand to end all search brands to promote itself. Developers such as Kevin Barry, creator of the Nova Launcher (article in German), can participate, get hands-on with free Android software and check out its features. I’m sure the bars around the Shoreline Amphitheatre in Mountain View, California are thrilled too.
For us Europeans, however, the keynote is where it’s at. After all, it’s how we find out what the future of Google and Android holds.
This new operating system will have tiles. Easily reachable with a swipe on your smartwatch, they show you the following data:
These tiles can be rearranged at will. The update containing the tiles will roll out sometime this month.
Another blog post describes the new look of Android Auto, the smart car application. The functionality is barely changing: music will still seamlessly transfer from your phone to your car’s loudspeakers. The look is getting darker, however, as the popularity of dark mode has led to its adoption across many Google apps. In dark mode, the background isn’t bright white but dark grey, HEX #424242.
Dark mode could prove to be extremely necessary when driving. It’s handy and saves some battery on your phone, but in a car it could – quite literally – save your life. If you stare at a bright display at night, your low-light vision is gone. After a few seconds, your eyes will have got used to the darkness again, but not to the same level as before you stared at that screen. Those few seconds could make the difference between an accident and arriving safely. I’d prefer it if Google offered an Amoled option, HEX #000000. That way, all Android Auto background pixels would remain unpowered. In simpler terms: less light in your cockpit means better driving at night. Still: a good dark mode for Android Auto is a solid start.
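To put numbers on it, here’s a quick back-of-the-envelope sketch – my own illustration, not anything from Google – of what those two hex values mean in RGB terms, under the simplifying assumption that an OLED pixel’s power draw scales with its subpixel intensity:

```python
# Decode the two background colours and estimate relative OLED pixel
# power. Real panels are more complicated; this is a rough sketch.

def hex_to_rgb(hex_code: str) -> tuple:
    """Convert a hex colour like '#424242' to an (R, G, B) tuple."""
    h = hex_code.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))

def relative_oled_power(rgb: tuple) -> float:
    """Crude estimate: power scales with summed subpixel intensity."""
    return sum(rgb) / (3 * 255)

dark_grey = hex_to_rgb("#424242")   # Android Auto's dark mode
pure_black = hex_to_rgb("#000000")  # the Amoled option I'd prefer

print(dark_grey, relative_oled_power(dark_grey))
print(pure_black, relative_oled_power(pure_black))  # 0.0: unpowered
```

Even the dark grey still drives every subpixel at roughly a quarter of full intensity; only true black leaves them off entirely.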
Back in the day, Google’s Pixel devices weren’t called Pixel yet, but Nexus. They were cheap and good. So good, in fact, that some users still use them today. As the Pixel era dawned, these phones got a massive price hike and were transformed from usefully cheap gadgets into collector’s items with godly cameras. Now Google’s taking a step back and releasing two mid-range contenders: the Google Pixel 3a and the larger Pixel 3a XL.
The larger Google Pixel 3a XL has a 6-inch screen diagonal, or 15.24 centimetres. According to Google’s presentation, they went with a so-called gOLED display instead of the usual Amoled screen. It could stand for «Google OLED». Experts can’t really agree on the differences between gOLED and Amoled displays.
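If you want to double-check that figure yourself, it’s simple arithmetic:

```python
# Screen diagonals are quoted in inches; one inch is exactly 2.54 cm.
INCH_TO_CM = 2.54

def diagonal_in_cm(inches: float) -> float:
    """Convert a screen diagonal from inches to centimetres."""
    return round(inches * INCH_TO_CM, 2)

print(diagonal_in_cm(6.0))  # 15.24 – the Pixel 3a XL's diagonal
```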
This year, Google is firing on all cylinders with AI. Android is to become smarter and even more integrated. «Google will help more,» is what they’re saying. One of these features is «Full Coverage» on Google News, which gives you multiple viewpoints on a single subject. Google Search will be equipped with it throughout the year. Among other things, Google is attempting to curb belief in fake news and bring balance to the media landscape.
The Assistant is becoming quicker, of course. The «Continued Conversation» feature allows you to activate the Assistant without even having to say «Hey Google». Still, one feature is missing: an AND function. You can ask it a billion questions at lightning speed, send SMS and e-mails, check the weather and a million other things, but only one at a time. You can’t say «Hey Google, turn off the lights and write an SMS to Andrea saying "I like dolphins"». Google will try to «turn off the lights and write an SMS to Andrea saying "I like dolphins"» as a whole, and can’t process the crucial «and». That’s how it seems, at least, judging from their stage show.
The reworked Assistant will coincide with the Google Pixel 4 release.
Google Search is getting a pretty big update. You can view a three-dimensional display of your search results, and using Augmented Reality, these can even be transferred into the real world.
On stage, a great white shark was projected in 3D. The animation is still a bit wooden and the textures could be better, but the technology’s impressive nonetheless. I don’t really know why I’d need a shark in my office, but I’ll still gladly try it out, seeing as Google Search’s updates have been purely visual for a while now.
Google Lens, Google’s virtual eye that can interpret images, is becoming smarter. Point your smartphone camera at a restaurant menu, and Lens will show you the popular dishes, calculate the tip and divide the bill amongst your guests. This is done by cross-referencing data from Google Maps.
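Google hasn’t published how Lens does any of this, but the arithmetic at the end of the meal is straightforward. A minimal sketch with made-up numbers of my own:

```python
def split_bill(total: float, tip_percent: float, guests: int) -> float:
    """Add the tip to the total, then divide evenly among the guests."""
    if guests < 1:
        raise ValueError("need at least one guest")
    tipped = total * (1 + tip_percent / 100)
    return round(tipped / guests, 2)

# Hypothetical dinner: a 120.00 bill, 15 % tip, three guests.
print(split_bill(120.00, 15, 3))  # 46.0 per person
```

The clever part, of course, is everything before this step: recognising the menu, reading the amounts off a crumpled receipt and knowing which dishes are popular.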
A fun gimmick, but Google also wants to help illiterate people the world over using Google Go. Tap Lens in the search bar and Google Assistant will read out any text to you. The passage currently being read is highlighted, of course. Translations work too, in both image and sound.
It’s obvious that Google is uniting its apps, connecting their services and data with all the knowledge it has acquired over the years, and letting it all grow ever larger. Creepy? Definitely. For the first time, we can really see what «a small scrap of data here or there» can amount to when combined into one unified mass.
No one can doubt that this is a massive technological achievement. However, I was disturbed by the audience’s rapturous applause. Is this really something we like or enjoy? Is it objectively good, and not just technologically impressive?
By the way, a small detail: Google, it’s called «Deutsch» or «Deutsche Sprache». There’s no such thing as «Deutsche».
Google Duplex is Google’s telephone solution. Here’s what that means: you give your «Hey Google» a command. Let’s say it’s «Hey Google, get me a table for three at the Zürcher Tales Bar at half past eight». If Duplex worked in Zurich, it would call Tales and reserve a table there with a human voice.
This also works in the browser. Duplex understands your travelling plans, for one. It can read your journal entries, display your plane tickets and recommend car rentals while collecting a mountain of data about you. To enable this as a user, all that’s needed is a click on «Continue» and «Yes».
Google has realised that huge loads of data concerning your private business could be damaging if they fell into the wrong hands. Which is why they’ve adjusted their privacy settings.
With a simple click on your profile icon, you can access the security settings for the feature you’re using at that moment. These features, alongside a separate Incognito mode for searching, will come out this year.
The talk around Android Q is also mostly about security and privacy. It was only briefly mentioned, but still: there’s a lot happening behind the scenes.
Google will also be supporting foldable smartphones natively. Let’s look back: notch support wasn’t originally part of Android’s core either. Are foldables the future? I don’t know. What Google does know, however, is that 5G will definitely be innovative. The standard will work natively with Android Q products.
As funny and as cool as all these brand new comfort features sound, the kicker is something else. Federated Learning wants to open the world to everyone.
Google is implementing «Federated Learning». All data collected by the Google empire will be analysed and ultimately used for things such as predicting the next word on Google’s Gboard smartphone keyboard. The Cloud Speech API will transcribe speech to text in real time. Live Caption will even apply this to videos. What this means: AI will automatically generate captions for a video while you’re watching it. This shall also be possible for phone conversations. Imagine you’re deaf, mute or both. Impressive.
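To make «Federated Learning» a bit more concrete: the core idea is that each device trains on its own data, and only model updates – never the raw data – are sent off and averaged. Here’s a deliberately tiny sketch of that averaging idea, a toy linear model of my own invention and nothing like Google’s actual Gboard pipeline:

```python
# Toy federated averaging: two «devices» each hold private data drawn
# from y = 2x and train a simple linear model locally. The server only
# ever sees model weights, never the data points themselves.

def local_update(weights, data, lr=0.1):
    """One local pass of gradient descent on a device's private data."""
    w0, w1 = weights
    for x, y in data:
        err = (w0 + w1 * x) - y   # prediction error on one sample
        w0 -= lr * err
        w1 -= lr * err * x
    return [w0, w1]

def federated_average(updates):
    """Server step: average the weights returned by all devices."""
    n = len(updates)
    return [sum(u[i] for u in updates) / n for i in range(len(updates[0]))]

device_a = [(1.0, 2.0), (2.0, 4.0)]  # private to device A
device_b = [(3.0, 6.0), (4.0, 8.0)]  # private to device B

weights = [0.0, 0.0]
for _ in range(300):  # communication rounds
    weights = federated_average([local_update(weights, device_a),
                                 local_update(weights, device_b)])

print(weights)  # the slope converges towards 2.0
```

The model gradually learns the shared pattern even though neither device’s data ever leaves it – that’s the whole trick.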
What’s more, these conversations remain on your device: private, and without any data connectivity. The neural network running this technology can operate independently on your phone and is only 80 megabytes in size. It’s system-wide, not app-dependent.
And Google’s still not finished. Of course, there are already recordings of deaf people or stroke victims. But what about non-verbal communicators? Google’s working on that as well. And to be honest, I’m in favour of this idea. Extremely in favour.
Google Assistant has found a new home. All Home hardware is now called Nest, as the former start-up of the same name and Google’s Home team have been merged. Google Home Hub is now called Nest Hub. The newest product from that team is the Google Nest Hub Max. With a ten-inch display and a camera, it combines devices such as Nest Cams, smart lights and smart locks into one hub. The Google Home app on your smartphone can already do this, but dedicated hardware always carries advantages with it.
Speaking of which, the Nest Hub Max has a kill switch at the back that immediately cuts all power to the camera and microphone. Finally, a security feature that’s actually useful.
As nice as all this sounds, there’s one big, glaring problem with I/O for us in the Old World: everything’s pretty US-centric. Let me give you an example: Google Assistant can speak in dozens of voices – male, female, dialects and accents. It can even imitate the voice of singer-songwriter John Legend. There’s a crazy amount of engineering behind creating a voice that can say anything despite never having been recorded saying it. The feature debuted at the I/O festival and is slated for release.
However, here in Switzerland, it doesn’t work like that: all these glorious features such as Google Duplex won’t be available in Swiss German for a long while. We’re still a crazy bunch of a few million eccentrics who can never settle on one language. I’m pretty sure Google won’t try to unite the meanings of words such as «Mond», «Moo» and «Manneschi». And I don’t think Google Assistant will come strolling by and order «foif Tickets» or «Auä vier Stüu» any time soon. Shame. Looks like we’ll have to remain human for a while.