eSafety requires providers of AI companion chatbots to explain how they are keeping minors safe

Australia’s eSafety Commissioner has issued legal notices to four popular AI companion providers requiring them to explain how they are protecting children from a range of harms, including exposure to sexually explicit conversations and images, suicidal ideation, and self-harm. Notices were given to Character Technologies, Inc. (character.ai), Glimpse.AI (Nomi), Chai Research Corp (Chai), and Chub AI Inc. (Chub.ai) under Australia’s Online Safety Act. The notices require the four companies to answer a series of questions about how they are complying with the Government’s Basic Online Safety Expectations Determination, reporting on the steps they are taking to keep Australians safe online, especially children. AI companion providers that fail to comply with a reporting notice could face enforcement action, including court proceedings and financial penalties of up to $825,000 per day.

Click here for the official article/release

Disclaimer

The Legal Wire takes all necessary precautions to ensure that the materials, information, and documents on its website, including but not limited to articles, newsletters, reports, and blogs (“Materials”), are accurate and complete. Nevertheless, these Materials are intended solely for general informational purposes and do not constitute legal advice. They may not necessarily reflect the current laws or regulations. The Materials should not be interpreted as legal advice on any specific matter. Furthermore, the content and interpretation of the Materials and the laws discussed within are subject to change.

AI was used to generate part or all of this content - more information
