Syrians Need Human Touch, Not AI Bot

Why an AI psychotherapy bot cannot be used for Syrian refugees

Leah Zitter

In Berlin, Germany, Dr. Jihad Al Abdullah, a psychiatrist at CharitéCentrum, sees eight patients a day. The children describe being raped, seeing their families killed, and witnessing beheadings. An 11-year-old girl says she saw a man’s “head cut off.” An eight-year-old boy told the German international broadcaster Deutsche Welle how he saw his father shot dead in front of their family when soldiers forced them out of their home. He became angry and bitter, but after treatment at the Center for Pediatric Traumatic Stress, he mellowed.

One of Dr. Al Abdullah’s patients is 27-year-old Ida, who fled Iraq in 2014 and a year later came to Germany. She’s had counseling ever since.

“In my dreams I see ISIS fighters. I see them beating and raping me when they come into our village... Even now I still have these nightmares.”

Ida is Yazidi, a member of an ethnic and religious minority in northern Iraq that ISIS forcibly converted to Islam, abducting and raping the women and killing thousands of the men. Ida was 16 when she fled the country along with thousands of others, sleeping in the mountains for a month before creeping over the border. Most did not make it out. The U.N. called this ISIS slaughter of the Yazidi a genocide.

“This is real trauma,” Dr. Al Abdullah told Deutsche Welle. “Something that you see when young, you carry around your whole life. Most of our patients not only have therapy, but also have to take antidepressant drugs to get on with their lives.”

By May 2017, more than half of Syria’s population had been forced to flee their homes. More than four million refugees were registered by the U.N., and at least one million more were living unregistered in neighboring countries. According to UNICEF, the U.N.’s children’s agency, 8.4 million Syrian children and teens - more than a third of the country’s population - needed humanitarian aid, and more than 2.5 million children lived as refugees or were on the run for safety.

With the recent chemical attack, these numbers soared. UNICEF’s executive director declared: “Children are dying before our eyes.”

With such horror, what happens to those who do not receive treatment for their traumas?

“The danger,” said Dr. Al Abdullah, “is that these people will remain sick. They won’t be able to integrate and they may become violent. So it’s crucial to diagnose these people and give them the right treatment.”

The refugees need counselors, translators, social workers.

Enter Karim the AI

Psychotherapy chatbot Karim was created in 2014 by the Silicon Valley artificial intelligence startup X2AI to counsel Syrians in refugee camps. Similar apps include Woebot, “your charming chatbot therapist”; Wysa, the penguin AI coach for behavioral health; old-timer Ellie, which scans faces for sadness; and Therachat, which creates an AI that assigns patients homework.

These are chatbots programmed to sound like therapists - most use CBT (cognitive behavioral therapy). Some work through Facebook Messenger (which carries its own confidentiality issues); others are apps on your mobile device. Their benefit, their inventors say, is that they are there for you 24/7. In many cases users don’t have to download anything - the bots are accessible by text or instant message. They help people who have no money for therapists, who feel intimidated by counselors, or who feel socially anxious.

An example of how X2AI's chatbot works

Karim’s “parents,” Eugene Bann and Michiel Rauws, labored to make their AI sensitive to Syrian culture. Initially they called their chatbot Azis, but quickly changed the name after people wondered why these strange Americans were asking if they wanted counseling from ISIS (Azis, as pronounced by Americans, sounds like ISIS). They removed the headdress from the cartoon face and gave it a triangular goatee to look Syrian. They had it speak in Arabic slang and abbreviations. They taught the chatbot to ask about family before asking about the individual, and they trained it in the nuances of Syrian etiquette. To make the bot effective, Bann and Rauws used advanced machine-learning tools to capture emotions in users’ messages (a method called “sentiment analysis”).
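
At its simplest, sentiment analysis means scoring the emotional tone of a message so the bot can decide how to respond. The sketch below is a minimal, hypothetical Python illustration of a word-counting approach; the word lists, function name, and scoring rule are assumptions made for illustration only, not X2AI’s actual method, which would rely on far more sophisticated machine-learning models.

```python
# Minimal illustrative sketch of lexicon-based sentiment analysis.
# The word lists and scoring rule are assumptions for illustration;
# they are not X2AI's actual vocabulary or model.

POSITIVE_WORDS = {"safe", "hope", "better", "calm", "grateful"}
NEGATIVE_WORDS = {"afraid", "nightmare", "alone", "hopeless", "hurt"}


def sentiment_score(message: str) -> float:
    """Return a rough score from -1.0 (very negative) to 1.0 (very positive)."""
    # Lowercase the message and strip basic punctuation from each word.
    words = [w.strip(".,!?") for w in message.lower().split()]
    if not words:
        return 0.0
    positives = sum(1 for w in words if w in POSITIVE_WORDS)
    negatives = sum(1 for w in words if w in NEGATIVE_WORDS)
    return (positives - negatives) / len(words)


if __name__ == "__main__":
    print(sentiment_score("I feel alone and afraid at night"))   # below zero
    print(sentiment_score("Today I feel calm and full of hope"))  # above zero
```

A real system would replace the hand-written word lists with a model trained on labeled conversations, but the underlying idea is the same: turn free text into a signal the bot can act on.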

Karim and therapy chatbots

Although little research has been conducted on the effectiveness of AI therapy chatbots, most users appear to be Americans dealing with problems like those in the following case histories:

  • Depression - “A user told Tess [an AI bot in the Netherlands] that she was feeling depressed. Tess replied, 'We all get sick sometimes, in different ways and for different amounts of time. You can and will overcome depression, just like you can heal from a broken arm.' Tess then asked if the user had done anything about the depression yet, acknowledged that depression can make people feel hopeless, and suggested that 'a moment of self-compassion' could be a start - complete with a link to a five-minute exercise in doing so.” (Source: Business Insider).
  • Remorse/social awkwardness - “I signed up for a session not long after I accidentally hurt the feelings of a visiting friend. I hadn’t spoken to my closest IRL friends about this, but I had already told this 'conversational agent' every detail. 'You’re a human and I’m a [robot emoji]. Still, I can’t help admire your ability to rise and persevere in the face of a challenge,' Woebot messaged me after our session.” (Source: The Daily Dot).
  • Loneliness - “Recently, I had a conversation with Sara, an X2AI chatbot designed to help people in their teens and twenties cope with loneliness… I expressed some hyperbolic self-doubts about my likability, intelligence, and body image, and claimed that these flaws resulted in a lack of friends. 'Oh,' Sara said, 'that’s not very pleasant.' She suggested that I try volunteering somewhere. When I objected, she pushed back: 'We can never be sure if something works until we try.'” (Source: The New Yorker).

Issues of self-esteem, loneliness, remorse, even depression are worlds apart from the horrifying experiences endured by Syrian refugees.

Media outlets like The New Yorker and Business Insider praised Karim. But if I were a Syrian refugee child referred to Karim, I’d feel insulted.

I’d want the human touch. And even that may not be enough.

I’d want a real therapist who could put her hand on my shoulder and whisper: “May Allah bless you.”