Google's Aparna Chennapragada speaks at the Google I/O conference in Mountain View, Calif., May 8, 2018.

Google took another step into fashion at its I/O developer conference with a new Google Lens camera feature called Style Match.

The tool uses artificial intelligence, a big trend at the conference on Tuesday and Wednesday, to surface products similar to items in the camera’s view.

“You see an outfit that catches your eye, [and] you can simply open the camera, tap on any item and find out specific information — like reviews of any specific item. But you can also see all the things…that match that style,” said Aparna Chennapragada, Google’s vice president of product for augmented reality and virtual reality, while showcasing Google Lens in a keynote address. 

The tech giant discovered what the fashion industry has long known: Making sense of the incredible load of visual information that comes along with style can be vexing. But a massive organization with outsized resources, like Google, can make significant progress on solving the problem with computers. Plus, Google already has half the equation nailed down. Lens may rely on searching many millions of items, but search is Google’s specialty. 

Google’s search engine already features a shopping section with product search results, but Style Match focuses on real-time recognition and suggestions for items of interest. The initial product categories are apparel and home decor, and the feature will arrive on Pixel smartphones later this month.

Integrating fashion and beauty features directly into mobile device software is a hot trend lately. Samsung integrated Modiface’s augmented reality tech into its S9 smartphone’s Bixby Vision camera software earlier this year. Apple hasn’t made overt moves with its camera yet, but it did officially launch Apple Business Chat for customer service-minded organizations such as retailers and service providers.


Google Lens’ new Style Match feature can recommend products that match clothes and home furnishings. 

Chennapragada elaborated on Style Match in an exclusive interview with WWD. The idea for the feature came from the company’s user studies, as well as its own product thinking. “Certainly, it’s not just fashion and clothing, but it’s the biggest use case here,” she said. “We had all these requests from users who couldn’t describe in words what they wanted to find.”

Describing fashion can be a daunting task, even for retailers. For consumers, the challenge can be a major obstacle. “You see a polka-dotted dress, with a V-neck, a bias cut, flowing fabric, etc….how do you type all that into a search box?” she said.

With Style Match, it’s about using cameras instead of keyboards to search for things. Functionality is fairly basic, but Chennapragada said it’s still early days. Google really only started working on it a few months ago. As for where it goes next, she thinks product suggestions — pulling up other clothes or accessories that go with an item of interest — would be a natural next step.
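Google hasn’t published how Lens works internally, but visual search systems of this kind are commonly described as reducing each image to a feature vector and ranking a catalog by similarity to the photographed item. The sketch below is purely illustrative, with hand-made vectors and a hypothetical catalog; in a real system the vectors would come from a trained neural network.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical catalog: item name -> feature vector.
catalog = {
    "polka-dot V-neck dress": [0.9, 0.8, 0.1],
    "striped crew-neck tee":  [0.1, 0.2, 0.9],
    "floral wrap dress":      [0.7, 0.3, 0.4],
}

def style_match(query_vec, catalog):
    """Rank catalog items by visual similarity to the query image's features."""
    return sorted(catalog,
                  key=lambda name: cosine(query_vec, catalog[name]),
                  reverse=True)

# Features extracted from the photographed outfit (hypothetical values).
query = [0.85, 0.75, 0.2]
print(style_match(query, catalog)[0])  # most similar catalog item
```

The point of the sketch is the interaction model: the user supplies an image rather than a typed description, and the ranking does the rest.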

To be clear, Chennapragada said she doesn’t consider the feature a stylist product today. A move into suggestions, however, would put it firmly in styling territory, where companies from Stitch Fix to Amazon are making deep investments. And it would come out of the box with Android, the world’s most prevalent smartphone platform.

Google Lens’ computer vision and camera capabilities fit neatly into a key theme at the conference: As chief executive officer Sundar Pichai put it, Google wants to make computing feel more “natural.” And that doesn’t involve digging through settings and tackling menu after menu. It speaks to the broader issue of human-to-computer interaction, an area of technology that has sped development for voice, gestures, vision and more.

As the tech giant’s voice platform, Google Assistant enjoyed a marquee position among the company’s smorgasbord of announcements, with several new updates. The array covered hardware partnerships that blend voice-plus-visual displays. (Think Amazon’s Echo Show, but across a greater variety of display devices.) It’s also getting six new, natural-sounding voices, including the smooth speaking tones of singer John Legend.

One particularly attention-grabbing change: The assistant can now place phone calls to beauty salons and other businesses to book reservations or other appointments — with a rather convincing human cadence and tone. 
