eSafety requires providers of AI companion chatbots to explain how they are keeping Aussie kids safe
Australia’s eSafety Commissioner has issued legal notices to four popular AI companion providers, requiring them to explain how they are protecting children from a range of harms, including exposure to sexually explicit conversations and images, as well as suicidal ideation and self-harm. Notices were given to Character Technologies, Inc. (character.ai), Glimpse.AI (Nomi), Chai Research Corp (Chai), and …