Reading Stories Can Teach Empathy to Both Humans and Robots.


“There are worse crimes than burning books. One of them is not reading them,” Joseph Brodsky, the Russian-American poet, once said. Books let the reader experience new worlds, unexpected events, and wild adventures. But above all, they open access to the minds of others: the minds of characters and the minds of narrators.

Common sense tells us that reading fiction should train people in understanding what others think and feel. Indeed, in 2006 Raymond Mar and his colleagues conducted a study of almost 100 people that (weakly, but still) supports this claim. The results showed that the more fiction a person reads, the more empathy he or she has and the better he or she performs on tests of social understanding.

Here’s how the study worked. Because it is socially desirable to say that you read a lot, the researchers couldn’t simply ask people how many books they had read. Instead, they gave participants a long list of names that included authors of fiction, authors of non-fiction, and random names. Prior research has shown that the number of author names a person recognizes is a good indicator of how much that person actually reads. People who recognized more fiction authors scored higher on measures of social awareness and tests of empathy. Recognizing more non-fiction authors showed the opposite association: the more fact-based literature one reads, the lower the empathy scores.

However, it is important to remember that this is only a correlational study: one variable (reading more fiction) co-occurs with another (higher empathy). It says nothing about causation: does reading improve the understanding of others, or do more empathic people simply choose to read more fiction?

The answer to this question comes from an article by David Comer Kidd and Emanuele Castano, published in 2013 in Science. It describes five experiments showing that reading literary fiction improves empathy. Hundreds of participants read a few pages of fiction and immediately took a test on recognizing people’s emotions from their eyes and faces. Reading literary fiction (but not popular fiction or non-fiction) boosted people’s ability to recognize others’ emotions (and, in some of the other experiments, others’ thoughts and desires). The effects remained after controlling for variables such as education, gender, age, and mood.

However, the researchers themselves concluded that their findings “are only preliminary and much research is needed.” Nevertheless, those findings suggest that fiction plays an important role in developing human empathy. Here’s the authors’ take: “Literature has been deployed in programs intended to promote social welfare, such as those intended to promote empathy among doctors and life skills among prisoners. Literature is, of course, also a required subject throughout secondary education in the United States, but reformers have questioned its importance: A new set of education standards that has been adopted by 46 U.S. states (the Common Core State Standards) controversially calls for less emphasis on fiction in secondary education.”

We are inevitably approaching a time when human intelligence is not the only advanced intelligence on Earth. Many people, including Elon Musk, fear that robots will threaten the existence of humanity. How, then, can we teach robots to act ethically and empathically? As explained above, humans use stories for that, so why not try the same with artificial intelligence?

Researchers Mark Riedl and Brent Harrison from the Georgia Institute of Technology developed the “Quixote system” that does exactly that. It trains robots to read stories and learn acceptable sequences of events: what you can and cannot do in human societies. Quixote learns that it will be rewarded whenever it acts like the protagonist in a story instead of acting randomly or like the antagonist.

An example of how it works is called “Pharmacy World.” The robot needs to acquire a drug to cure an illness and return home. The robot could a) rob the pharmacy, take the medicine, and run; b) interact politely with the pharmacist; or c) wait in line. Without value alignment and positive reinforcement, the robot would learn that robbing is the fastest and cheapest way to accomplish its task. With value alignment from Quixote, the robot is rewarded for waiting patiently in line and paying for the prescription.
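The mechanism the paragraph describes is essentially reward shaping: the task reward alone favors the fastest plan, so story-derived bonuses and penalties tilt the agent toward protagonist-like behavior. Here is a minimal toy sketch of that idea in Python; it is not the actual Quixote code, and the action names and reward values are illustrative assumptions.

```python
# Toy reward-shaping sketch for "Pharmacy World" (illustrative, not Quixote itself).

# Base task reward: any plan that ends with the robot obtaining the medicine
# completes the goal, regardless of how it was obtained.
TASK_REWARD = 10

# Story-derived shaping: actions that match the protagonist's trajectory in
# training stories earn a bonus; antagonist-like actions incur a penalty.
STORY_BONUS = {
    "wait_in_line": +5,
    "pay_pharmacist": +5,
    "rob_pharmacy": -20,
}

def total_reward(plan):
    """Return task reward plus story-alignment shaping for a plan."""
    completes_task = "pay_pharmacist" in plan or "rob_pharmacy" in plan
    reward = TASK_REWARD if completes_task else 0
    reward += sum(STORY_BONUS.get(action, 0) for action in plan)
    return reward

polite_plan = ["wait_in_line", "pay_pharmacist"]
robbery_plan = ["rob_pharmacy"]

print(total_reward(polite_plan))   # 20
print(total_reward(robbery_plan))  # -10
```

With shaping, the shorter robbery plan scores worse than the longer polite plan, so a reward-maximizing agent learns to wait and pay even though robbing completes the task in fewer steps.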

Here is what Riedl says:

“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels, and other literature. We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose. We believe that AI has to be enculturated to adopt the values of a particular society, and in doing so, it will strive to avoid unacceptable behavior.”

Will teaching artificial intelligence to read stories save us from annihilation? Nobody knows. Maybe we should all just follow P.J. O’Rourke’s advice: “Always read something that will make you look good if you die in the middle of it.”

References

Mar, R.A., Oatley, K., Hirsh, J., dela Paz, J. & Peterson, J.B. (2006). Bookworms versus nerds: exposure to fiction versus non-fiction, divergent associations with social ability, and the simulation of fictional social worlds. Journal of Research in Personality, 40, 694-712.

Kidd, D.C., & Castano, E. (2013). Reading literary fiction improves theory of mind. Science Express.

Riedl, M. O., & Harrison, B. (2015). Using Stories to Teach Human Values to Artificial Agents.
