US-based mental health helpline Crisis Text Line (CTL) has ended its data-sharing relationship with Loris, which had been using anonymized data from CTL to develop AI systems. The main goal of the relationship was to help customer service agents understand the sentiment in chats.


Recently, one of the CTL board members tweeted that the organization had been “wrong” to agree to the relationship. The helpline also said it wants to listen to its community’s concerns. CTL vice-president Shawn Rodriguez said, “Loris has not accessed any data since the beginning of 2020.”


A Loris representative had previously said its “AI-powered chat solution” helped customer service representatives understand a customer’s sentiment and dynamically craft “effective responses based on the customer’s tone.” Loris also explained that its insights “come from the highly acclaimed Crisis Text Line, where the ability to handle the toughest conversations is critical. It is this intelligence, drawn from analyzing nearly 200 million messages, that sits at the core of the AI.”

CTL said all the data shared was fully anonymized and claimed to be transparent with its users about data sharing, explaining that consent language is sent “as an automatic auto-reply to every initial text message that Crisis Text Line receives, and to which all texters consent.”

But Politico reported that the data was being used for business purposes. A privacy and data policy fellow at Stanford University’s AI Institute asked, “They may have legal consent, but do they have actual meaningful, emotional, fully understood consent?”


So, CTL has now ended the data-sharing relationship. It said, “We heard your feedback that it should be clear and easy for anyone in crisis to understand what they are consenting to when they reach out for help.”