‘Eliza’ Explores Moral Questions Around AI-Based Therapy

Eliza is a virtual counselling app, offering help to thousands who, for whatever reason, can’t or won’t go to therapy themselves. But can technology replace real therapy in this visual novel?

You’ll join Evelyn, a woman who abandoned a high-powered tech career but has now returned to work as a proxy for this virtual counselling app. Eliza follows Evelyn as she reads through scripts provided by the therapy AI, and through her daily life in present-day Seattle, where she bears witness to the growth of the tech company that created the AI, as well as its effects on the people who use it.

Eliza examines what happens when tech companies try to handle complex human interactions and emotions with algorithms and AI. It explores the harm in believing you can plan for every human possibility, how technology inherently carries the biases of those who create it, and the often-ignored dangers of assuming these technologies are correct and unbiased. Is it better than nothing for people who, for whatever reason, don’t have access to therapy? The story is a whirlwind of complex questions and worries that apply to a great deal of modern thinking about technology.

Eliza explores these effects and questions through your interactions and choices as you meet people and learn their stories. It’s a grim yet necessary look at the troubles that arise when we rely too heavily on programs and tech to resolve human problems, and the very real dangers of ignoring these destructive tendencies.



Eliza is available now on Steam.

The post ‘Eliza’ Explores Moral Questions Around AI-Based Therapy appeared first on Indie Games Plus.
