The Australian eSafety Commissioner has launched an investigation into an unnamed technology company that operates AI-powered “nudify” services used to create deepfake pornographic images. The platforms, which attract roughly 100,000 Australian users each month, let individuals upload photos of real people, including minors, and generate fake sexualized content from them. The eSafety Commissioner said the company failed to prevent its services from being used to create child sexual abuse material, and it now faces potential fines of up to AUD 49.5 million. The regulator withheld the company’s identity to avoid giving it further publicity.