Will online shoppers more readily click on a blue-colored “buy now” Web site button or is gold a more enticing hue? Do numerous small images grab a consumer or is a solitary high-quality photo more powerful? Block letters or italic?

Retailers continually evaluate design elements of their Web sites, searching for the best presentation to engage the consumer and boost sales. To minimize the risk of fiddling with a Web site’s appearance and alienating visitors, companies are turning to technology-driven techniques called A/B testing and multivariate testing.

Simply put, A/B testing software presents two variations of a Web site, an A version and a B version that differ in perhaps just one detail. The analytical software measures each site’s ability to convert shoppers to buyers; multivariate testing serves the same purpose, but examines multiple design variations simultaneously.
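In concept, the mechanics behind such tools are straightforward. The minimal Python sketch below is purely illustrative (the function names, traffic split and conversion figures are hypothetical, not drawn from any vendor's product): it assigns simulated visitors to an A or a B page and tallies each variant's conversion rate.

```python
import random
from collections import defaultdict

# Illustrative only: split visitors between variant "A" and variant "B",
# record whether each one converts (places an order), and report each
# variant's conversion rate. The conversion rates below are invented.

def assign_variant(visitor_id: int) -> str:
    """Deterministic 50/50 split so a repeat visitor always sees the same page."""
    return "A" if visitor_id % 2 == 0 else "B"

def record_visit(stats, variant, converted):
    stats[variant]["visits"] += 1
    if converted:
        stats[variant]["conversions"] += 1

if __name__ == "__main__":
    random.seed(0)
    stats = defaultdict(lambda: {"visits": 0, "conversions": 0})

    # Simulated traffic: in this toy model, variant B converts slightly better.
    true_rates = {"A": 0.020, "B": 0.025}
    for visitor_id in range(100_000):
        variant = assign_variant(visitor_id)
        converted = random.random() < true_rates[variant]
        record_visit(stats, variant, converted)

    for variant, s in sorted(stats.items()):
        rate = s["conversions"] / s["visits"]
        print(f"Variant {variant}: {s['conversions']} / {s['visits']} = {rate:.3%}")
```

A multivariate test extends the same idea by assigning visitors across combinations of several design elements at once, rather than a single A-versus-B pair.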

Thirty-three percent of online retailers use A/B testing to evaluate Web site design elements, according to Forrester Research’s 2005 State of Retailing Online study, which polled 137 U.S. retailers. Among those respondents using A/B testing, 97 percent rated the practice as effective or very effective in improving conversion rates. Carrie Johnson, research director for consumer markets at Forrester, said A/B testing usage figures from 2004 are unavailable as the practice is “a fairly new phenomenon” and last year’s survey did not address it.

Linsly Donnelly, chief operating officer for joann.com, the online version of the fabric and crafts store, is among the believers in A/B and multivariate testing. The value of joann.com’s average online order rose 137 percent after the retailer launched a promotion that no one thought would work, but which testing software endorsed as a winner. The promotion, “buy two sewing machines and save 10 percent,” was counterintuitive, said Donnelly. “Who buys two sewing machines?” she asked, noting that price points range from $250 to $5,000. Still, the unlikely promotion outperformed other more standard incentives, such as free shipping and deep discounts.

“We tested that more as a whim and that was the one that won,” Donnelly said. “I can pretty much bet that if I had surveyed people and said, ‘Which offer do you think would win?’ it would not be that one.” The testing software, from Offermatica of San Francisco, revealed that craft and sewing hobbyists are very communal and will collaborate on a purchase if an offer saves them money.

This story first appeared in the October 12, 2005 issue of WWD.

Eric Peterson, senior analyst for Jupiter Research, said he’s “bullish” on A/B and multivariate testing. “It’s so easy to set these tests up and try things that sound crazy. You can find out in a couple hours, a couple days or a week whether they work. It takes a lot of the fear out of extending yourself as a marketing person and merchandiser to try new things.”

Some retailers attempt this testing with software developed in-house, but Peterson said commercial software can offer more extensive functionality. Such tools “are not inexpensive,” he said, and can cost $8,000 to $10,000 per month. “When companies don’t have these full-blown testing facilities, they are a lot less likely to try radical ideas,” Peterson added.

Conversion rates rocketed 429 percent at Skinner Inc. after the Bolton, Mass., auction house followed A/B test recommendations that suggested larger images would encourage more consumers to place bids. At Skinner, “conversion” is defined as the rate at which shoppers submit bids for fine art, antique couture apparel, textiles, furniture or any of the items featured in its auctions.

Kerry Shrives, senior appraiser, auctioneer and director of Skinner’s Internet auctions, used software from SiteSpect of Boston to test consumers’ responses to images 15 and 30 percent larger than the site’s existing ones. Shrives, who appears on the PBS-TV series “Antiques Roadshow,” went into the testing with the preconception that image size “wouldn’t make much difference for people. We were looking to debunk that” bigger-is-better assumption to justify keeping the images small. “So the question became, would there be advantages that would [outweigh] the disadvantages” of dealing with larger images, such as the added cost of server space to store them and image quality issues.

When conversion rates rose a dramatic 429 percent among users viewing images 30 percent larger, it became clear that bigger was better and that enlarging the images would be worth the added expense. Next, Shrives will test the color of the “bid now” button, which is currently gray with a maroon-colored band. Initial testing indicated consumers respond better to a cream-colored bid button, and Skinner will conduct more testing to verify that conclusion.
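For readers curious how a lift figure such as that 429 percent is calculated, the sketch below shows one common approach (the visit and conversion counts are invented for illustration and are not Skinner’s data): compute each variant’s conversion rate, express the change relative to the control, and run a rough two-proportion z-test to gauge whether the difference is likely to be real rather than noise.

```python
import math

# Hypothetical example: compare a control variant (e.g. current images) with a
# test variant (e.g. images 30 percent larger). All counts are invented.

def conversion_lift(conv_a, visits_a, conv_b, visits_b):
    rate_a = conv_a / visits_a
    rate_b = conv_b / visits_b
    lift = (rate_b - rate_a) / rate_a          # 4.29 would mean a 429% increase
    # Pooled two-proportion z-test as a quick significance check.
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return rate_a, rate_b, lift, p_value

if __name__ == "__main__":
    rate_a, rate_b, lift, p = conversion_lift(conv_a=40, visits_a=10_000,
                                              conv_b=212, visits_b=10_000)
    print(f"control {rate_a:.2%}, test {rate_b:.2%}, lift {lift:.0%}, p={p:.4f}")
```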

“The best thing about A/B testing is you can figure out if this [proposed Web site change] is a good thing to be doing,” Shrives said. “Are we going to spend money that will have some impact, or will we be tossing it away” by tweaking the site in a way that has no quantifiable impact on the shopper experience and conversion rates?
