British Cops Want to Use AI to Spot Porn—But It Keeps Mistaking Desert Pics for Nudes
https://gizmodo.com/british-cops-want-to-use-ai-to-spot-porn-but-it-keeps-m-1821384511
London's Metropolitan Police believes that its artificial intelligence software will be up to the task of detecting images of child abuse in the next "two to three years." But, in its current state, the system can't tell the difference between a photo of a desert and a photo of a naked body.
A new twist on Justice Potter Stewart's famous quote ...
I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description [hard-core pornography]; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that.
This is pretty funny.
Quote from: Yukon Cornelius on December 18, 2017, 05:42:46 PM
A new twist on Justice Potter Stewart's famous quote ...
I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description [hard-core pornography]; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that.
Funny story about that quote. Justice Stewart said it in front of a building that was decorated with statues of naked angels.