Amazon is reportedly set to test a new emotion-reading wearable.

As the retail landscape gets techier by the minute, one legendary shopper springs to mind: Carrie Bradshaw, admitted shopaholic and tech-phobic protagonist of “Sex and the City.”

If the fictional columnist were surveying these modern times, she might have mused, “People are always fascinated by things shiny and new. But as we bring these bits and bytes into our lives to assist us, I can’t help but wonder — can they actually know us better than we know ourselves? Should they?”

It’s an intimacy issue of a different sort than she was used to. But it’s fundamentally salient now, as innovations zero in on people’s fingerprints, faces, mind-sets — and now, potentially, their moods.

What would Carrie have made of Dylan, the wrist-worn wearable that Amazon is reportedly developing that can recognize human emotions?

This week, reports of leaked documents and Amazon patent filings indicate that the technology can read vocal cues to determine how a user is feeling. Apparently the same Lab126 team that developed Alexa, the artificially intelligent voice assistant powering Amazon’s Echo line of devices, is beta-testing Dylan at the moment.

Like so many wearables now, the Amazon wrist gizmo appears to have health and wellness in its crosshairs. Apple, for instance, has doubled down on the health focus of its Apple Watch, having launched electrocardiogram features and more over the past year. The device leads a global wearables market that jumped 31.4 percent at the end of last year, according to IDC.

If Dylan works as described, it might stand a decent chance of carving out a place for itself in the market.

“Detecting a user’s voice and cross-referencing it against the heart rate can give a better indication of what a user wants,” Ramon Llamas, research director at IDC’s mobile devices and AR/VR unit, told WWD. “Think of the person in distress and the heart rate is racing. This could be a cue to call emergency services.”

That’s harder than it looks, though, Llamas added. Zeroing in on emotions based on speech is tough, even for human beings. “This is where the device has to be smart about content and context, and that’s a significant challenge,” he said. He recommends a wait-and-see approach.
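The cross-referencing Llamas describes can be pictured as a simple decision rule. The sketch below is purely illustrative: the thresholds, field names and scoring scale are invented for the example, and nothing here reflects how Amazon's device actually works. The point is the logic of requiring both signals to agree, which reduces false alarms, since a racing heart alone could just be exercise and a strained voice alone could be acting.

```python
# Illustrative sketch only: combining a voice-derived stress score with
# heart rate to flag possible distress. All thresholds and names are
# hypothetical, not Amazon's design.

from dataclasses import dataclass


@dataclass
class Reading:
    voice_stress: float   # 0.0 (calm) to 1.0 (highly stressed), from a hypothetical speech model
    heart_rate_bpm: int   # beats per minute from the wrist sensor


def assess_distress(reading: Reading,
                    stress_threshold: float = 0.8,
                    hr_threshold: int = 120) -> str:
    """Cross-reference vocal cues with heart rate before acting."""
    stressed_voice = reading.voice_stress >= stress_threshold
    elevated_hr = reading.heart_rate_bpm >= hr_threshold
    if stressed_voice and elevated_hr:
        # Both signals agree: this is the "cue to call emergency services" case.
        return "possible distress: suggest emergency check-in"
    if stressed_voice or elevated_hr:
        # One signal alone is ambiguous (exercise, acting, background noise).
        return "ambiguous: keep monitoring"
    return "no action"
```

A stressed voice paired with a 140 bpm heart rate would trip the check-in suggestion, while either signal on its own would not; that conjunction is exactly the "content and context" problem Llamas flags, compressed into its crudest possible form.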

The scenario is one that Amazon may be uniquely poised to tackle. The company has a well-known penchant for experimentation, as well as the patience and the deep pockets to keep at it for as long as it takes. The first Alexa-powered Amazon Echo started as a test device on a limited rollout in 2014, before rocketing up to become a category-defining market leader in smart speakers.

But Amazon likely has other reasons to pursue development for Dylan. And those motivations may have nothing to do with becoming a wearables giant.

Consider this: Retailers and brands have been clamoring for ways to capture or track customer sentiments, preferences and intentions. And so they throw money at AI, machine-learning and other tech, hoping to reach consumers at the precise moment when they are the most open to certain types of messaging.

This is the holy grail in retail, Amazon’s core business, as well as advertising, one of the company’s fastest-growing segments.

Emotion has always been a driving force in consumerism, and there’s no sign that things are changing. According to a Clicktale survey of 2,091 consumers, 40 percent used “retail therapy” to calm down and 74 percent admitted to having “stress-shopped.”

As a test device, Dylan may not be heading to Amazon’s virtual shelves anytime soon — or, perhaps, even at all. But it’s clear that the company has compelling business reasons to pursue it.

How the gadget would be received is another matter entirely. Brand partners might cheer, but rivals would dread its arrival. Privacy wonks would surely take a dim view of an emotion-reading technology, especially during a time of heightened sensitivities around how much personal data tech companies are vacuuming up about their users.

A recent Microsoft study revealed that 41 percent of voice-assistant users are worried about privacy and trust. The results came out following reports that Amazon personnel annotate sound clips captured by Alexa and feed them back into the software to improve its understanding of human speech.

Dylan, Alexa’s sibling tech, seems to sit in a complex nexus where technology and humanity overlap — and that space has become a fraught, messy place marked by a thicket of concerns.

Take facial recognition, as another example. The technology has huge implications for areas ranging from law enforcement and enterprise security to mobile entertainment, wellness and beauty.

And yet San Francisco, a city that birthed some of the biggest advancements in facial recognition, effectively banned law enforcement’s use of the technology earlier this month.

“I think part of San Francisco being the real and perceived headquarters for all things tech also comes with a responsibility for its local legislators,” city supervisor Aaron Peskin publicly stated at the time. “We have an outsize responsibility to regulate the excesses of technology precisely because they are headquartered here.”

It’s worth noting that Amazon is a developer of facial recognition software. And during its latest shareholder meeting on Wednesday, a vote came up on whether the company should suspend sales of its “Rekognition” face tech. Result: Shareholders voted to continue selling the software.

The stakes can only rise, as technology gets more intimate with its human owners. A young, vibrant Carrie Bradshaw, fashionista, Mac laptop killer and iPhone rejecter, probably wouldn’t wear Dylan. But an older Carrie, perhaps concerned about her health, might consider it — at least if it’s well-designed or caps off her look beautifully.

What she might not know is what she’d be signing away when she puts it on.
