by Glen Drummond
Estimated read time: 6 minutes
Part Two in a two-part series
Recently, I published an article with a provocative observation. While much attention has been devoted to the need for organizations to adopt Artificial Intelligence as a core capability, we should consider an even-more-pressing need for “artificial empathy.”
If you did not read part one, I’ll retrace some footsteps here. The corporation is a creature of human invention. But the creature has grown so enormously in its size, capabilities, and power that we the people now encounter a diminishing sense of agency for ourselves and an increasing sense of agency for corporations to shape our future on issues including privacy, equality, safety, the environment, and the behavior of public institutions that once governed these things. Not to mention the stuff of everyday experience: stupid IVRs, impenetrable clam-shell packaging, and infuriating password implementations, just to name a few.
The ramifications of this observation extend beyond marketing strategy. But still, people who think deeply about the relationship between people and brands will play a role in how this narrative unfolds.
And here’s why: In our fast-thinking minds, we perceive the brands that stand for corporations as if they were other people.
Now, people – except for sociopaths – are naturally empathetic. Moreover, we expect them to be so. When we sense a sociopath, the hair on our necks stands on end, and adrenaline shocks our bloodstream.
As social creatures, we are born pre-wired with miraculously adapted endocrine and neurological systems that reinforce our empathy in a positive feedback system known as friends and family, community and kin. But corporations are not born with anything of the sort.
Do you see the problem?
At least in our hearts, we have an expectation for brands to behave in a way that they are poorly equipped to fulfill. Expectations disappointed are brands diminished.
Organizational scale amplifies this problem. (We all know what “faceless corporation” means.) So does the doctrine of maximizing shareholder profits. Are there signs that both society and corporate leaders are beginning to discern that the corporation has gained such power that the power needs to be matched with greater empathy? The recent “statement of social purpose” by 181 corporate leaders suggests this might be so.
The question is how? Some people who read my first post may have been under the impression that I had a plan for how “artificial empathy” could be created. Rest assured this was far from the case. I’m sympathetic to the aspirations of the customer experience movement, but I’m skeptical those aspirations are advanced by continuing to ask socially clueless questions that amount to: “How do you like me now?”
Still, having once stumbled upon the problem of artificial empathy, it’s tempting to speculate. So, with apologies for pairing a ten-dollar question with nickel-and-dime answers, here are some preliminary thoughts.
Biomimicry
If you’re familiar with the literature on biomimicry, you will know that many industrial inventions begin with the observation of patterns in nature. Could we re-conceive the information systems used by corporations through this lens?
In that case, the challenge of “artificial empathy” would cause us to think about a system involving a sensory apparatus, a cortex that integrates the signals from the senses, real-time feedback, amplifier mechanisms, and so on.
It does not take long to see that analogues for each of these things already exist within the information systems of corporations – but what’s lacking is an architecture marshalled by the imperative of empathy.
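To make the analogy concrete, here is a minimal structural sketch, not a real product. The names (`Signal`, `SensoryLayer`, `EmpathyCortex`) are hypothetical, invented for illustration; the point is that the biological analogues named above could map directly onto an information architecture whose organizing imperative is empathy.

```python
# Hypothetical sketch: mapping the biological analogues onto an architecture.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Signal:
    """One raw observation from a 'sense' (e.g., a support call, a review)."""
    source: str
    sentiment: float  # -1.0 (distress) .. 1.0 (delight)

class SensoryLayer:
    """Analogue of the sensory apparatus: collects signals from many channels."""
    def __init__(self) -> None:
        self.signals: List[Signal] = []

    def sense(self, signal: Signal) -> None:
        self.signals.append(signal)

class EmpathyCortex:
    """Analogue of the cortex: integrates the senses and drives amplifiers."""
    def __init__(self, senses: SensoryLayer) -> None:
        self.senses = senses
        self.amplifiers: List[Callable[[float], None]] = []

    def integrate(self) -> float:
        """Real-time feedback: one integrated 'felt sense' of the public."""
        if not self.senses.signals:
            return 0.0
        return sum(s.sentiment for s in self.senses.signals) / len(self.senses.signals)

    def react(self) -> None:
        """Amplifier mechanisms fire on the integrated mood, not on any one metric."""
        mood = self.integrate()
        for amplify in self.amplifiers:
            amplify(mood)
```

The design choice worth noticing is that the reaction is wired into the architecture itself rather than bolted on as a reporting afterthought: the amplifiers respond to the integrated signal, the way a nervous system does.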
For humans as social creatures, empathy is essential for survival. Embracing the biomimicry idea in an IT architecture geared to artificial empathy would mean that the selfish subjectivity of the corporation would need to be subjugated to human experience and dignity. Do we have engineers this creative and leaders this courageous?
Philosophy
There is a branch of philosophy, “epistemology,” that deals with the question of how we know what we know. Historically, for corporations, and indeed any large organization, to operate at scale has required that an internal representation of customers and prospects be shared across the organization. Sometimes this internal representation goes out of date. Sometimes it is simply wrong-headed from the start. Invariably this internal representation is reductive.
Done well, the disciplines of customer segmentation and personas offer steps in a journey away from the most reductive internal representations of the corporation’s publics. But too often in practice, people mistake the map for the territory. In a product-centric world-view with no imperative for empathy, mistaking the customer map for the territory is standard operating procedure – “best practice” even. In a corporation seeking to attain the capacity of artificial empathy these old habits must die.
While corporations have raced to hire data scientists and put them to work analyzing customer behavior and customer responses to various stimuli, they have not been as quick or adept at hiring and training people in the discipline of keeping the map separate from the territory while the study of people is underway.
The pairing of these disciplines feels important going forward. Data scientists are in demand now. Data scientists with a flair for philosophy will be the rarest and most valuable of all.
Artificial Intelligence
Setting aside the semantic arguments about the existence of AI, we now can access algorithmic tools that can explore datasets to find multiple features of interest about people, and discover patterns of difference, similarity, and prediction that are more subtle than those derived from averages, demographic co-variates, single-touch attributions, and the other mainstays of traditional customer analytics.
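As an illustrative sketch only: here is a tiny k-means clustering routine in plain Python, standing in for the algorithmic tools described above (real work would use a library such as scikit-learn). The customer data is invented; the point it demonstrates is that clusters can reveal distinct groups that a single average of the data would hide entirely.

```python
# Illustrative sketch: Lloyd's k-means algorithm on invented customer data.
import random
from typing import List, Tuple

Point = Tuple[float, float]  # (visits per month, spend per visit)

def kmeans(points: List[Point], k: int, iters: int = 20, seed: int = 0) -> List[Point]:
    """Return k centroids: repeatedly assign points to the nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters: List[List[Point]] = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: (p[0] - centroids[c][0]) ** 2 + (p[1] - centroids[c][1]) ** 2,
            )
            clusters[nearest].append(p)
        for i, cluster in enumerate(clusters):
            if cluster:  # keep the old centroid if its cluster emptied out
                centroids[i] = (
                    sum(p[0] for p in cluster) / len(cluster),
                    sum(p[1] for p in cluster) / len(cluster),
                )
    return centroids

# Two very different customer groups; their overall average describes no one:
frequent_small = [(9.0, 12.0), (10.0, 11.0), (11.0, 13.0)]   # frequent, small spend
rare_large = [(1.0, 480.0), (2.0, 510.0), (1.0, 495.0)]      # rare, large spend
centroids = kmeans(frequent_small + rare_large, k=2)
```

Averaging all six customers yields a profile (about 5.7 visits, $253.50 spend) that matches nobody, while the two centroids recover the real groups. That gap between the average and the actual structure is exactly the subtlety the paragraph above is pointing at.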
Indeed, if we are going to operate with less reductive representations of people, and if we are going to simulate the biological mechanisms of empathy within a corporation, artificial intelligence may be the technology that finally enables meaningful progress on a problem that has been building for some time.
Final Thoughts
None of these answers, by itself, is a prescription for artificial empathy. The confluence of all three may point in a worthy direction. Still, some journeys are worth taking, even when the destination is distant and the route uncertain.
This might be one.