A new AI system, developed by researchers in Seattle, is designed to detect and automatically block unsolicited images that are deemed too obscene to be shown.
Kelsey Bressler is the CTO of Badass Army, an organization that campaigns against revenge porn. She was "inspired" to help develop the AI after she was sent an unsolicited nude picture of a man. To train the AI, she and her team set up a dummy Twitter account whose inbox was flooded with nude pictures, which the AI would then learn to sort out.
When it comes to direct identification of genitalia, the AI performs reasonably well. However, it is still far from complete: it has reportedly failed several times to identify a penis when it is partially covered or decorated with various objects.
The necessity of the idea can be a bit tricky to rationalize. After all, one can simply close the message if they really don't want to see the unsolicited picture. But with that approach, the picture is still seen at least once. What the AI aims to do is identify the picture before you even open it, or before you decide the attachment's filename looks suspicious enough to ignore outright.
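The "block before it is ever rendered" mechanism described above can be sketched roughly as follows. This is a minimal illustration, not the team's actual code: the classifier here is a hypothetical stand-in stub that returns a precomputed score, and the threshold value is an assumption.

```python
from dataclasses import dataclass

@dataclass
class Attachment:
    filename: str
    nudity_score: float  # 0.0-1.0, as a real image classifier might return

# Assumed cutoff for illustration; not a value from the actual project.
BLOCK_THRESHOLD = 0.8

def classify(attachment: Attachment) -> float:
    # Stand-in for the real model: in practice this would run an image
    # classifier over the decoded image bytes before anything is displayed.
    return attachment.nudity_score

def filter_inbox(attachments):
    """Split attachments into those safe to display and those blocked unseen."""
    delivered, blocked = [], []
    for att in attachments:
        if classify(att) >= BLOCK_THRESHOLD:
            blocked.append(att)   # quarantined; never rendered to the recipient
        else:
            delivered.append(att)
    return delivered, blocked

inbox = [Attachment("cat.jpg", 0.05), Attachment("img_0042.jpg", 0.97)]
delivered, blocked = filter_inbox(inbox)
```

The key design point is that scoring happens upstream of display, so the recipient never sees a blocked image even once.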
Ms. Bressler compares it to flashing in public: even if the flasher is a stranger, it is still deeply disrespectful. It may not cause lasting harm, but it still distresses the person seeing the image.
So, what was the final fate of that poor little Twitter inbox? It is now officially closed (predictably) due to the sheer number of pictures that were thrown in there by "volunteers". The research itself continues, however, and Ms. Bressler announced that her team will share the research data once it is complete.
Featured image credit: Анастасия Гепп via Pixabay