
Amazon Flagged Hundreds of Thousands of Suspected CSAM Items in AI Training Data
Amazon.com Inc. revealed that last year it detected hundreds of thousands of pieces of content in its AI training data that appeared to contain child sexual abuse material (CSAM).
While the company removed the material before using it to train AI models, child safety officials say Amazon has not shared enough information about where the content came from, making it harder for law enforcement to protect victims and track down offenders.
Amazon uses an automated scanning tool that compares content against a database of known CSAM, a technique known as hash matching.
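The core idea of hash matching can be illustrated with a minimal sketch. This is not Amazon's tool; production systems typically use perceptual hashes (such as Microsoft's PhotoDNA) so that resized or slightly altered copies still match, whereas the cryptographic hash below only catches exact byte-for-byte copies. The blocklist values and function names here are purely illustrative assumptions.

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal content.
# In practice such hash lists are supplied by organizations like NCMEC.
# (The entry below is simply the SHA-256 of b"test", used for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes, blocklist: set[str] = KNOWN_HASHES) -> bool:
    """Flag content whose exact bytes match an entry in the blocklist."""
    return sha256_hex(data) in blocklist

def filter_training_data(items: list[bytes]) -> list[bytes]:
    """Drop any item whose hash appears in the blocklist
    before the remaining items are passed on for training."""
    return [item for item in items if not is_known_match(item)]
```

Because an exact-match hash misses altered copies, real systems lean on fuzzy perceptual matching, which is also one reason over-reporting and false positives arise: looser matching catches more variants but also more benign look-alikes.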
According to the company, nearly all the reports came from non-proprietary training data obtained from external sources, like publicly available web content.
The company also admitted it tends to over-report potential CSAM to avoid missing anything, which can lead to a high number of false positives.
The number of AI-related reports from Amazon jumped dramatically in 2025. The company accounted for the bulk of the more than 1 million AI-related CSAM reports submitted that year to the National Center for Missing and Exploited Children (NCMEC); by comparison, the rest of the tech industry submitted just 67,000 such reports the year before.
Experts say this surge is an outlier, raising concerns about the source of the material and the safeguards in place during AI training.
While Amazon is required to report suspected CSAM to NCMEC, the company has provided very little detail on where the content came from or who shared it, limiting the ability of authorities to remove the material or investigate offenders. NCMEC officials said that without these details, the reports are often "inactionable," according to Bloomberg News.
The spike in reports comes amid a fast-paced AI race, where companies are rapidly gathering large amounts of data to improve their models.
Experts warn that this speed increases the risk that exploitative material can enter AI training pipelines, and training AI on illegal content could unintentionally teach models to manipulate or sexualize images of children.
Amazon said it is committed to preventing CSAM across all its businesses. A spokesperson emphasized that none of the flagged material was AI-generated, and the company’s AI models have not produced any CSAM. They also highlighted that Amazon’s tools scan training data carefully and remove known illegal content before it is used.
Other tech companies, including Google, OpenAI, Meta, and Anthropic, also scan AI training data for CSAM. But according to NCMEC, Amazon’s reporting is far higher than its peers, while providing much less information about the source of the material. Experts say this underscores the need for greater transparency and stronger safeguards in AI development.
Experts like David Thiel, former technologist at the Stanford Internet Observatory, say companies should be more open about where their AI training data comes from and how it is cleaned. Without transparency, there is always a risk that illegal material slips through, and children remain at risk of exploitation.
The discovery of hundreds of thousands of CSAM instances in Amazon’s AI training data highlights the challenges of developing AI responsibly.
While Amazon has systems in place to scan and remove illegal content, experts say more transparency, oversight, and safety measures are urgently needed to protect children and prevent AI from being trained on exploitative material.
Sofia Babu Chacko is a journalist with over five years of experience reporting on Indian politics, crime, human rights, gender issues, and stories about marginalized communities. She believes journalism plays a crucial role in amplifying unheard voices and bringing attention to issues that truly matter. Sofia has contributed articles to The New Indian Express, Youth Ki Awaaz, and Maktoob Media. She is also a recipient of the 2025 Laadli Media Awards for gender sensitivity. Beyond the newsroom, she is a music enthusiast who enjoys singing. Connect with Sofia on X: https://x.com/SBCism