AI Training Dataset Contains Child Abuse Images, Canadian Analysis Finds 

Hundreds of child sexual abuse images were found in an image dataset used to train AI models, according to an analysis by a Canadian children's organization.
The Canadian Centre for Child Protection (C3P) analyzed the NudeNet dataset, which contains tens of thousands of images used by researchers to build AI tools for detecting sexually explicit content. The images are sourced from social media platforms and adult pornography websites, an Oct. 22 C3P press release noted.
The analysis found nearly 680 images that the centre either recognized or suspected to be child sexual abuse and exploitation material, including images of more than 120 underage victims from Canada and the United States….