SAN FRANCISCO — If there was one thing that was evident at Google’s Pixel 2 press event here Wednesday, it was this: The future will be beautiful — and noisy.
The company unleashed a stampede of product introductions: Pixel 2 smartphones in two size variations, two new Google Home speakers, a high-end Pixelbook convertible laptop-tablet, a smart clip-on camera and wireless smart ear buds that can translate spoken foreign languages in real time.
The devices share a new common design language, visually unifying the lot and turning what might have been a chaotic tangle of gadgets into more of a cohesive collection. But the real common thread was not the custom fabric and muted colors — it was “algorithms.” Google made it clear that artificial intelligence, neural networks, computer vision and machine learning are powering its vision of the future, and these gadgets serve as gateways or connection points.
In this future, the company knows more than just where photos were shot or how to sort tagged friends. Its AI can read and understand precisely what’s in those images, so it can power useful features. Point the Pixel’s improved 12 MP camera at a building, and it can identify the subject and provide the best information the web can offer about the structure. Set Google’s new Clips camera — which somewhat resembles a slightly larger Tile — in front of a child, and it can capture a stream of images when it notices that he or she is smiling or laughing.
Voice technology, courtesy of Google Assistant, got smarter as well, and is available on more devices. As Rishi Chandra, lead of the Google Home team, said onstage, the technology works whether you’re “using your voice to order diapers,” interacting with your smart lights or performing other tasks. The voice assistant seems to shine when a Google Home speaker is tied to other devices in the company’s Nest smart home system: a user can ask the device to show who is at the front door, and it magically funnels the desired smart camera feed to the television. Thanks to improvements to its face-matching algorithms, Google Assistant could even announce or display Aunt Sally’s arrival at the door without the homeowner having to ask.
Whether those features are cool or creepy will vary from one person to the next, and may depend on whether these devices are parked in every corner of the house or pointed at a child. Google seems to understand that on some level, even if it won’t address it directly.
“When we first started this project, the goal was to create an assistant for every room,” said Isabelle Olsson, Google’s head of industrial design for home products and wearables, and leader of the CMF team (color, material and finish) for the entire hardware line. “It has to be able to fit anywhere and blend into the background. You don’t want gadgets all over your house. We wanted people to feel comfortable placing [these devices] anywhere.”
In designing the new Google Home speakers, which involved custom fabrication of the nylon-blend material, Olsson, a former furniture designer, emphasized the company’s desire to be “thoughtful” — a word uttered often on the stage by multiple Google executives. She explained that the team aimed to show a softer side of technology. “You can definitely see it through the softer curves, the fabric, the way we applied color.”
That softer side was less evident in the company’s ousting of the 3.5-mm headphone jack from its smartphones. But, as with Apple, the hardware change seemed to pave the way for a new wireless ear bud product — an impressive one, at that. The accessory not only makes Google Assistant available as soon as it’s paired with the Pixel, but it can deliver real-time spoken language translation.
Such translation powers have been a highly anticipated feature in the growing smart ear bud category, also known as “hearables.” In the onstage demo, the Pixel Buds deftly translated Swedish to English, and vice versa. Whether they will hold up in the real world, outside a controlled environment, is too soon to say.