by Glen Drummond
Estimated reading time: 7 minutes

Part One in a two-part series

Empathy. It’s such a defining human quality, you could say it’s in our bones. For sure, it’s in our brains. Neuroscience reveals that we have “mirror neurons” that cause other people’s emotional experiences to become our own. That concept would be astonishing if it were not so familiar. Empathy runs in our veins, too: the hormone oxytocin makes us feel closer to those we’re close to.

Beyond this, there are the mental gadgets that history has draped on our biology. For instance, our fine-tuned sense of justice, fairness, and balance.  These qualities also incline us to prosocial behavior, such as helping a stranger on the street, supporting a local non-profit,  separating our recycling…

So if empathy comes naturally, why call for “Artificial Empathy”? (Presuming, of course, that such a thing is even possible.) The answer begins with an observation about a trend in scale. Human nature developed over a long period in which there were rewards for cooperation within groups and competition between groups. But compared to today, those groups were small. It’s not clear that biologically rooted empathy equips us adequately for that change of scale.

It’s not merely that there are more of us, although the human population has tripled since 1945. It’s that the nature of the connectivity between us has been transformed. As members of media-fueled electorates, our mood swings are damaging institutions that took centuries to build. As members of a global economy, our collective emissions are generating planet-scale impacts on the environment.

There are broad conversations underway about these forms of connectivity, but fewer about our participation in corporations. Arguably, no prior form of connectivity rivals the modern corporation’s capacity to pursue its objectives with such speed, scale, and precision.

And big corporations are getting bigger. The World Bank reported in 2016 that, among the 100 largest revenue-collecting entities in the world, 69 are corporations and 31 are nation-states. A decade ago, the US Supreme Court awarded corporations a human right: freedom of speech. The Danish government has appointed an ambassador to liaise between the midsized nation and giant tech corporations.

If you have spent your career inside corporations, you know there are instances where scale acts as a liability as much as a strength.  The world knows that something went wrong at Volkswagen, at Facebook, at United, at Boeing.   And while the particulars are different, the circumstances rhyme.  A group of people sincerely felt it was their job to do something that the public would come to hate and the owners would come to regret.   What corporation is free from this risk?  

So why does business need “Artificial Empathy?”  It’s partly because natural empathy is poorly matched to the scale of the modern corporation.  And it’s partly because the consumer and the public are not going to let corporations off the hook for un-empathetic behavior.    

Here’s the basis for my confidence in that second observation.  People imagine brands as if they were other people. The marketing practice of managing brands using a system of archetypal characters speaks to this fact.  So does the blow-back that follows when corporations act in notably inhuman ways. There’s even neuro-imaging research that shows we look at logos and faces in surprisingly similar ways.      

So here, in a nutshell, is why brands need artificial empathy:  

  1. Because we imagine brands as if they were other people, and 
  2. Because we expect other people to be inherently empathetic, so 
  3. We expect brands to be inherently empathetic too, and yet
  4. Brands have no natural capacity to fulfill this expectation.

This fabric of observations explains a lot. Corporations, pursuing their interests without attention to this prevalent expectation, violate customer trust, and sometimes public trust too.

Only rarely does this violation happen in dramatic ways, as in the cases of Volkswagen’s emissions masking or Cambridge Analytica’s democracy hacks.

Far more common are violations so banal they barely register. Robotic voice-response systems that remind you: “Please continue to hold; your call is important to us.” Departure lounges that add acoustic assault to the list of insults suffered by air passengers. Manipulative marketing and sales tactics, like the email that arrived in my inbox this morning, by no coincidence at 9:18 AM, with the subject header “9:00 AM Meeting.”

Viewed through the lens of empathy (and the lack thereof), the distinction between the dramatic and undramatic instances becomes a distinction of degree, not kind. That observation is potentially helpful because it offers some guidance on what needs to be done.

Now, you might say, “Ah, you’re talking about customer experience,” and yes, in a way that’s true.  But insofar as the term “customer experience” stands for a department, a performance measure or one in a set of parallel business disciplines,  a “customer experience” capability will only act on symptoms while failing to address the root cause. (Sociopaths are known, after all, for their ability to charm.)

Or, you might say, “Ah, so you’re talking about corporate governance.” And yes, again in a way that’s true. But how much real capacity do the people charged with such weighty responsibilities have to intervene in the minor daily violations of the customer’s expectation of empathy? It has long been observed that “the road to hell is paved with good intentions.”

Since empathy violations take place despite the ubiquity of “customer experience” and “corporate governance” functions, the empathy gap – the delta between customer expectations of empathy and the level of empathy corporations are presently organized to muster – is a real business problem.

Given the value of the potential outcome if it could be solved, it seems like a problem worth taking risks to explore.

To summarize, let’s retrace our steps.   

  • Corporations are large, powerful engines of collective influence and action.
  • They are growing ever larger, more powerful, and more influential in the lives of people.
  • People expect them to act empathetically, but corporations have no natural, inherent capacity, as people do, to fulfill that expectation.
  • So we should expect the empathy gap to grow with the power and reach of corporations, until either corporations design a technology of empathy – “artificial empathy,” if you will – or face a more concerted backlash directed at individual brands (“United breaks guitars”), at industry sectors (say, “big tech”), and at corporations in general.

Despite all the technical progress, investment and hype devoted to it, there remains a debate over whether “artificial intelligence” (AI) actually exists.  The concept of “artificial empathy,” if it were to enter the public discussion, would be subject to a similar philosophical challenge.  

So why talk about it at all? 

Because corporations have plenty of resources for tackling challenges once they can be identified. This one is staring us in the face. 

Since the processes we call “artificial intelligence” will inevitably shape more of the experiences that corporations project and that customers and the public absorb, is there any question that the need for artificial empathy will grow with each passing day?

The conjunction of “artificial” and “empathy” is a provocative framing of a real problem. It matters greatly to a corporation’s stakeholders and deserves far more rigorous thinking and effort than has been devoted to it thus far. Rather than a zero-sum game, “artificial empathy” will be a project that aligns the interests of shareholders, employees, customers, and the public. Rather than a departmental problem, it will require a systems-level response.

I’ll leave for a subsequent article the questions of how “artificial empathy” might work and what resources it might draw upon. For now, suffice it to say that if corporations need empathy and don’t have it as a natural quality, the commercial incentive is there to synthesize it.

The ingenuity and organized effort that have made predictive science – machine learning, deep learning, expert systems, big data, or more generally, “artificial intelligence” – such an important component of corporate strategy today provide at least a framing metaphor for this initiative, and maybe some important tools too.

But intelligence (natural or artificial)  is no substitute for empathy. No matter what strides we make in AI, brands need to make progress now on Artificial Empathy. And if AI begins to make strides on its own, there’s a good chance brands will need to pick up the pace.   

