Empathy is essential for human relationships. This article explores the sustainability of being human in an increasingly virtual and artificially oversaturated world.
Practicing social distancing and abiding by stay-at-home orders are physical and psychological shocks; they also pose a great risk to the social fabric of our societies.
Before the crisis, most of us weren’t isolated: we had, if we chose to take it, the opportunity to mix with individuals from different socio-economic classes at work, in our free time, and on public transportation. We could practice empathy as we saw how other people’s lives were different from our own.
During the lockdown we have been largely on our own or with our own, often confronting a volatile blend of professional and home life and feelings of extreme stress, mostly through our screens.
We are primed to classify the news we receive as good or bad, right or wrong, without critical thought. The press, social media platforms, and politicians deliver. Catastrophic headlines and stories vie for our attention. Many in these camps shun empathy and promote a single truth: their narrative.
In a world where we previously discussed news and ideas with those from other communities, our isolation makes it harder to practice empathy, and easier to be excessively judgmental.
The more we ache and fear, the more we seek rewards and affirmation, which we find in the form of dopamine hits from our smartphone applications. In extreme uncertainty, when we need a sense of community, we instead maximize self-interest.
Our reward-seeking mechanisms are misfiring, just as some of today's transformative inventions based on machine learning are. Algorithms are designed to identify patterns (which means excluding exceptions), but they are poorly suited to conveying meaning and intent.
On social media, they are designed to deliver intermittent rewards. The best platforms examine our past behavior and devise new ways to keep us engaged.
Keep in mind that humans are the ultimate supercomputers; our emotions are a form of AI. We are trained to react based on past experience, and digital positive reinforcement can make us feel good when we are in fact lost. In addition, the models, data, and expert knowledge upon which these algorithms are built prove incapable of telling us what is going to happen next and how to do it better, or perhaps they are designed to keep us in anticipation.
So if we were to design better algorithms, for ourselves and for the systems we use, what ethical dimensions would we try to give them? This goes back to isolation. We need to weave together a more meaningful tapestry of social space: how we behave online, how we reach out to people with common passions, how we reconnect with our community's pain.
Our connections have shrunk. We must be intentional, and we need to think about more diverse communities and the challenges they face. In our current environment, technology can be used in different ways to help us embrace this diversity, to weave our tapestry differently and reach beyond the limits of our perceptions of race and class.
This crisis will fundamentally change the way we think about how we live, travel, manage, and measure progress in society, especially regarding inequality and race relations.
Humanity has faced crises before, but this one has pushed us to confront our own history. We must choose to be responsive and address the challenges before us; wishing them away is irrational.
The loss, ache, and racial and social injustices are unbearable; yet to prepare for the future we need to focus on what we can learn from this crisis. Only then will we be able to look back on this time not just as a pitiful waste, but as an opportunity for introspection, for lucidity, and for transformation. We have the opportunity to become better. Let's turn regrets into resets.
Mel Martin & Carin-Isabel Knoop, neighbors, friends and colleagues at Harvard Business School, who were inspired by a conversation with Professor Lee Schlenker and his exploration of insight-based, rather than data-based, decision-making and interest in rethinking ethics in an increasingly technologically-intermediated culture.
September 8, 2020