With the tag line “drag, drop, buy,” image-driven discovery platform Ave23.com is looking to change the landscape of fashion search engines with visual intelligence technology.
Just more than a year in the making, the new e-commerce model, which launched in beta today, is the first search engine that uses images — instead of words — to find visually similar products. All users have to do is drag an image into Ave23.com’s search bar — and voilà — the site spits out an exhaustive list of products that make up the look.
This story first appeared in the July 26, 2011 issue of WWD.
“In the future, people will no longer use text as a main form of search, people will be using visual search to look for different products,” said Julian Reis, co-founder of Ave23.com. “It’s much more important for us now to understand what things look like and return similar products to what you’re looking for, rather than just typing in the text word and hoping you’ll get it.”
Reis cites Google’s Boutiques.com as the site’s most direct competitor at the moment, but hopes that Ave23.com — with a database featuring over a million products that will be updated daily with thousands of new “looks” — will change how consumers search for and discover products online. Ave23 claims to be the only venture to date whose technology can recognize products by analyzing a whole look, separating it into its individual items, and then searching for visually relevant matches to each one.
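Ave23 has not published how its matching works, but the pipeline the company describes — embed each product image as a feature vector, then rank the catalog by similarity to the query image — follows the general shape of visual similarity search. The sketch below is purely illustrative: the toy three-number "embeddings" stand in for features that a real system would extract with a computer-vision model, and the catalog items are invented examples, not Ave23 data.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, catalog, top_k=3):
    """Rank (name, feature-vector) catalog entries by similarity to the query."""
    ranked = sorted(
        catalog,
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:top_k]]

# Toy catalog: in practice these vectors would come from a vision model
# run over millions of product photos.
catalog = [
    ("red dress (designer)",    [0.90, 0.10, 0.00]),
    ("red dress (high street)", [0.85, 0.15, 0.05]),
    ("blue jeans",              [0.10, 0.20, 0.90]),
]

# Embedding of the shopper's uploaded image (hypothetical values).
query = [0.88, 0.12, 0.02]
print(search(query, catalog, top_k=2))
```

In this toy example, both red dresses rank above the jeans, mirroring the Prada-to-Zara scenario Reis describes: the query image retrieves look-alike products regardless of brand or price.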
“For someone who loves a dress from Prada, she might not be able to afford it, so they might find one that looks just like it from Zara,” Reis said, adding that the platform brings the consumer to the merchant seamlessly. A mobile app — which will allow users to snap photos of fashionable passersby on the street and upload them to find similar products — will also launch within the next month.
The venture is independently funded by Reis and his partner, but the two are looking to raise capital.