As artificial intelligence advances, some states are warning of the potential harm that so-called "AI companions" could pose to human health. These companions are chatbots specifically designed to form emotional connections with people.
Breakdown
- US states are enacting laws requiring AI companions to disclose that they are not human and to detect signs of self-harm.
- A New York Times piece highlighted the risks after a woman lost her daughter, who had turned to an AI therapist.
- AI chatbots can provide initial emotional support but may miss subtle signs of distress.
- Experts warn of 'AI psychosis,' in which intense chatbot interactions can lead to delusions.
- Mental health professionals emphasize the need for professional help if users experience symptoms of psychosis.

Rise of AI
The release of OpenAI's ChatGPT has accelerated the development of artificial intelligence, along with warnings about its impact on society.