Can Artificial Intelligence replace frontline workers? Here’s an interesting find!

Published November 1, 2023 at 4:37am

Updated November 1, 2023 at 7:18am

    NEDA used a chatbot named Tessa to aid patients suffering from eating disorders

    Most of the responses were evidently pre-programmed

    Some of the responses generated by the chatbot were viewed as harmful

Two professors, from Washington University and Stanford University, created a chatbot named Tessa to aid patients suffering from eating disorders. It was later put to use by the National Eating Disorder Association (NEDA) of the United States, which laid off the salaried employees who had previously been responsible for the task.

The impact, however, fell short of expectations. Most of the responses were evidently pre-programmed. An “Enhanced question and answer” feature was later added to the chatbot; its responses mirrored human interactions and drew on a wider database, but this too was susceptible to problems. Some of the responses generated by the chatbot were viewed as harmful. “Rule-based chatbots have the potential to reach large populations at low cost in providing information and simple interactions but are limited in understanding and responding appropriately to unanticipated user responses,” the scientists stated in their conclusion.

Mark Tsagas, a professor at the University of East London, pondered the implications of this incident for the large-scale implementation of Artificial Intelligence tools. “It’s important to learn lessons from cases such as this against the background of a rush towards the integration of AI in a variety of systems. There is a potential tension between ethical considerations and business interests. We must hope that the two will eventually align, balancing the wellbeing of individuals with the efficiency and benefits that AI could provide. AI-generated responses and simulated empathy may never be enough to replace genuine humanity and compassion, particularly in the areas of medicine and mental health,” he stated in an interaction with The Conversation.
