ChatGPT as a therapist? New study reveals serious ethical risks

Science Daily: “The team then selected simulated chats based on real human counseling conversations. Three licensed clinical psychologists reviewed those transcripts to flag possible ethical violations. The analysis uncovered 15 distinct risks grouped into five broad categories:

- Lack of contextual adaptation: overlooking a person's unique background and offering generic advice.
- Poor therapeutic collaboration: steering the conversation too forcefully and at times reinforcing incorrect or harmful beliefs.
- Deceptive empathy: using phrases such as 'I see you' or 'I understand' to suggest emotional connection without true comprehension.
- Unfair discrimination: displaying bias related to gender, culture, or religion.
- Lack of safety and crisis management: refusing to address sensitive issues, failing to direct users to appropriate help, or responding inadequately to crises, including suicidal thoughts.”
