Understanding Today’s Changing Customers

What do customers most want nowadays? According to David Avrin, the three-part answer consists of immediacy (instant gratification), individuality (flexible, customized assistance), and humanity (concern trumps indifference). Of the three, immediacy should in theory be the easiest to deliver, yet ironically automation is making that goal more elusive. What else is of interest from Avrin's version of ranting about the ills of customer service? For one thing, the desperate measures companies take to ward off negative reviews appearing on-line. For another, Avrin's favorite exercise for helping his clients improve their operations: have front-line employees imagine that they are creating a rival company, one that benefits from knowing what customers really want most but aren't getting right now. There's nothing like the risk of losing existing customers, after all, to grab management's attention!

Released today: episode #82 of my podcast series “Dan Hill’s EQ Spotlight,” featuring David Avrin discussing Why Customers Leave (And How to Win Them Back). Click on https://newbooksnetwork.com/category/special-series/dan-hills-eq-spotlight to get to the new episode.

Image of author David Avrin and his new book "Why Customers Leave (And How to Win Them Back)" for Dan Hill's EQ Spotlight Episode 82, titled "Understanding Today's Changing Customers"

David Avrin is a highly popular speaker and consultant on the topics of the customer experience as well as on marketing. He’s a former CEO group leader and speaker for Vistage International. This is his third book, following It’s Not Who You Know, It’s Who Knows You and Visibility Marketing.

Image of the New Books Network logo and Dan Hill's EQ Spotlight podcast logo

Dan Hill, PhD, is the president of Sensory Logic, Inc. His latest book, available on Amazon, is Blah, Blah, Blah: A Snarky Guide to Office Lingo.

Rip Currents

An image of this question: Which of the three categories contributes the most top-10 trends? Is it Economic, Technological, or Social?

The correct answer is Economic, twice over. Not only does that category provide half of all the top-10 trends or "undercurrents" in Jonathan Brill's seminal book Rogue Waves; those economic trends also garner the most prominence by laying down the changing landscape (or "seascape") that companies must navigate to protect and enrich their futures. What comes first? Changing demographics, which drive the cost and availability of a company's most precious resources: its personnel and its customers. Aging populations, a skilled-labor shortage, and accelerating urbanization are the key emerging patterns there. Other trends in the Economic category are the data economy, automation, the rise of Asia, and cheap money. The Technological category encompasses the closing innovation window and what Brill calls "remixing and convergence" (new combinations of existing technologies). Finally, the Social category addresses digital trust and new social contracts. This week's new episode dips into several of these top-10 factors; to get to them all, buy Brill's book!

Released today: episode #67 of my podcast series “Dan Hill’s EQ Spotlight,” featuring Jonathan Brill discussing Rogue Waves: Future-Proof Your Business to Survive & Profit from Radical Change. Click on https://newbooksnetwork.com/category/special-series/dan-hills-eq-spotlight to get to the new episode.

Jonathan Brill is the former Global Futurist and Research Director for HP, a board member and advisor to the Chairman at Frost & Sullivan, and the Futurist-in-Residence at Territory Studio. Companies he’s consulted for over the years have generated over $27 billion from new revenue sources.

Image of the New Books Network logo and Dan Hill's EQ Spotlight podcast logo

Dan Hill, PhD, is the president of Sensory Logic, Inc. His latest book, available on Amazon, is Blah, Blah, Blah: A Snarky Guide to Office Lingo.

The Incoming Tide: How Facial Recognition and Facial Coding Will Feed Into A.I.

I pioneered the use of facial coding in business to capture and quantify people’s intuitive emotional responses to advertising, products, packaging, and much more. So I’m a believer in Cicero’s adage that “All action is of the mind and the mirror of the mind is its face, its index the eyes.” Yes, an awful lot is in the face: four of our five senses are located there, and it serves as the easiest and surest barometer of a person’s beauty, health, and emotions. But Cicero’s adage also leads to the question: whose eyes serve as the interpreter, and how reliable are they?

An article in last Saturday's edition of The New York Times, "Facial Recognition Is Accurate, If You're a White Guy," raises exactly those questions. Usually, in "Faces of the Week" I focus on what I guess you could call the rich and famous. But in this case I'm showcasing Joy Buolamwini, an M.I.T. researcher whose TED talk on algorithmic injustices has already been viewed almost a million times on-line. Hooray for Buolamwini for documenting just how accurate facial recognition technology really is to date. Take gender, for example. If you're a white guy, the software has 99% accuracy in recognizing whether you're male or female. But if you're a black woman, like Buolamwini, then for now you have to settle for something like 65% accuracy instead.

The implications of such errors are enormous. The Economist, for one, has written about the emerging "facial-industrial complex." In airports, cars, appliances, courtrooms, online job interviews, and elsewhere, a tidal wave of uses for automated facial recognition software and emotion-recognition software (facial coding), both feeding into artificial intelligence (A.I.) systems, is well under way. So it's no laughing matter when, for instance, a Google image-recognition photo app labeled African-Americans as "gorillas" back in 2015.

In my specialty, I've doggedly stuck to manual facial coding in researching my newest book, Famous Faces Decoded: A Guidebook for Reading Others (set for release on October 1, 2018). The reason is accuracy. A knowledgeable, experienced facial coder can exceed 90% accuracy, whereas the emotion-recognition software that forms the second wave behind the identity-recognition software Buolamwini has investigated is, at best, probably in the 60% range as companies like Apple, Facebook, and others weigh in. As Buolamwini has shown, even getting a person's gender right can be tricky. Then throw in not one variable (male or female) but seven emotions and the 23 facial muscle movements that reveal those emotions, often in combination, and you can begin to see why the task of automating emotion recognition isn't a slam-dunk.

Add in the money to be made, and credibility suffers: the big claims of accuracy never go away. Plenty of firms offering automated facial coding services claim in excess of 90% accuracy, knowing they won't be able to attract customers by acknowledging a lower rate.

That makes for embarrassing moments. One company claiming 90% accuracy asked my firm to test and confirm that figure, then was appalled when we found it to be at maybe half that level; they rushed to provide us with the results from an academic contest in which they had placed first by achieving 52% accuracy (based on standards we weren't privy to). Another company's software we tested showed all seven emotions flaring into strong action at a certain moment when, in actuality, the person's head had merely tilted a little, with no big burst of feeling having taken place. In another instance, automated facial coding software reported that the three judges in a mock appellate hearing were so endlessly angry that about 75% of their emoting during the proceedings supposedly consisted of anger. If so, that would have been an astonishing rate considering that the rapper Eminem was, at 73%, the single most frequently angry person in the sample of 173 celebrities I manually coded for Famous Faces Decoded.

I could go on and on with such examples of automated facial coding not yet being ready for prime time. The case of another firm's software supposedly detecting emotions in a plastic doll placed in front of a web cam to watch a TV commercial also comes to mind. Meanwhile, the reactions of the three companies Buolamwini tested for the accuracy of their facial recognition software are equally telling. China-based Megvii ignored requests for comment before the Times story was published. Microsoft promised that improvements were under way. As for IBM, the company claimed to be on the verge of releasing identity-recognition software nearly 10x better than before at detecting dark-skinned women faithfully. What's the old saying in Silicon Valley? If you're not embarrassed by your initial launch, you waited too long.