Ellie Foreman

This is part of our ‘Meet Our Delegates’ series where we introduce you to social scientists and technologists who are attending the conference. Ellie will also be giving a PechaKucha talk on “how should we design Conversational AI in order to provide emotional support?”.

Meet Ellie Foreman

Hey Ellie, nice to meet you! You’re an Automation Fellow for the South West Creative Technology Network (SWCTN). You came to tech after graduating in Philosophy and Politics in 2015. What drew you to tech and what do you love about it?

Hi! That’s a great question because if you had told me when I was in school or university that I would now be working as a developer, there’s no way I would have believed you!

I have always gravitated towards the arts and humanities, and a career in tech never crossed my mind as something I’d like to do. Then at university, I took a module in Philosophical Logic and absolutely loved it. I love the process of breaking a problem down to its core elements, then working back up from there, developing a greater understanding of the whole.

A few years after graduating, I took some online coding courses and then decided to do a three-month web development bootcamp in Bristol. I love the challenge of coding, and the satisfaction when something I’m stuck on finally works! I also love how versatile and creative coding can be. I spend a lot of my spare time playing around with Processing and p5.js, and making interactive Arduino projects.

As an Automation Fellow, you’re exploring the relationship between humans and AI, questioning how we converse with technology. Your research focuses on whether emotional support can be provided through communicating with artificial intelligence. This sounds utterly fascinating! What does your research involve and what have you discovered so far (that you can share with us)?

Something I find fascinating is how our belief in robotic capability could potentially make a difference to how we perceive technology expressing emotion.

In the past few months, I’ve been carrying out interviews and workshops to explore how users feel when technology expresses emotions such as sympathy and empathy.

Something I find fascinating is how our belief in robotic capability could potentially make a difference to how we perceive technology expressing emotion. Although there is a lot of research into how users feel when interacting with conversational AI, there doesn’t seem to be much research looking into why we feel the way we do. A study by Bingjie Liu and S. Shyam Sundar found that users who believed in robotic feelings had a negative reaction when empathy was expressed. However, those who did not believe in robotic feelings reacted more positively when empathy was expressed than in cases where they were just given informational support. The idea that we respond more positively to technology expressing emotion when we actively believe this expression to be artificial is really interesting. I’m excited to explore this more!

One of my favourite documentaries involving robots is called Alice Cares. It shows older people who are living alone being introduced to a small doll-like robot called Alice, and their initial scepticism about a robot being a potential companion… well, I won’t give away the ending; our readers will have to seek it out! What do you think about robots providing emotional support to humans?

Alice Cares (Ik ben Alice): http://www.ikbenalice.nl

That documentary sounds great, I’ll have to watch it! It’s a difficult question because there are so many forms of emotional support. The support we receive from friends and family is different to emotionally-supportive therapeutic relationships, and both are very different to the companionship we feel from our pets. Some would argue that we shouldn’t try to solve social problems like isolation and loneliness through technology. While that concern is valid, I think there is a lot of potential to use technology in a way which augments human support.

Veterans suffering from PTSD were more willing to talk openly about the depth of their emotions when talking to a virtual therapist compared to a human therapist.

It’s interesting to look at cases where someone may feel more comfortable discussing their emotions with technology rather than a human. Researchers at the USC Institute for Creative Technologies created a virtual therapist, called Ellie, designed to help veterans talk about PTSD. Studies showed that veterans were more willing to talk openly about the depth of their emotions when talking to Ellie compared to a human therapist. In this way, technology could have the potential to reduce the stigma and fear of judgement that people often feel when confiding in a human.

There are important ethical questions when designing technology to emotionally support people, especially when caring for vulnerable groups in society. This highlights the importance of bringing social scientists and technologists together at conferences like Anthropology + Technology!

Why are you excited to attend the conference and what do you hope to get from attending?

I’m excited to explore the potential of emerging technologies alongside the critical analysis that social sciences bring to the table.

The Anthropology + Technology conference offers a unique space to reflect on important questions about the future of technology. I’m excited to explore the potential of emerging technologies alongside the critical analysis that social sciences bring to the table.

From a personal point of view, since studying Philosophy and Politics and then moving into the world of tech, I’m interested in how I can build a career which involves both these disciplines, so I’m excited to meet and learn from all the amazing attendees and speakers!

Why do you think other technologists should attend the conference?

Technologists need to engage with the social sciences in order to create long-term, sustainable value for society.

Technologists need to engage with the social sciences in order to create long-term, sustainable value for society. Disciplines like Philosophy ask questions, rather than offer solutions. This is especially important with the growth of AI, which could have a hugely positive impact on society, as well as the potential to be detrimental. We need to work together to ensure that ethics and safety are woven into the fabric of researching, designing and implementing intelligent systems.

Is there anything else you’d like to tell us or say?

Just thank you for organising such a great conference; I’m excited for October!

Thanks so much, Ellie, we look forward to your PechaKucha talk and meeting you in October!