Child abuse images removed from AI image-generator training source, researchers say

KVIA

AP Technology Writer

Artificial intelligence researchers said Friday that they have deleted more than 2,000 web links to suspected child sexual abuse imagery from a dataset used to train popular AI image-generator tools. The research dataset, called LAION, is a huge index of online images and captions that has been a source for leading AI image-makers such as Stable Diffusion and Midjourney. But a report last year by the Stanford Internet Observatory found that it contained links to sexually explicit images of children, contributing to the ease with which some AI tools have been able to produce photorealistic deepfakes depicting children.
Associated Press