Recently, some programmers developed an application called DeepNude, which has attracted much attention. The reason the app became "popular": given a photo of a woman, it uses neural network technology to automatically "remove" her clothes, producing a fake nude image.
Such a controversial practice has drawn strong criticism from workers in the AI industry. In their view, much like the earlier DeepFakes face-swapping application, DeepNude exposes the dark side of AI tools, because developers have no way to ensure that other people will abide by ethical standards.
Under outside pressure, the developers have shut down the DeepNude application, writing on Twitter: "We don't really want to make money this way."
Earlier, Katelyn Bowden, founder of the anti-revenge-porn group Badass, lamented: "It's shocking. Now anyone can become a victim of revenge porn, even someone who has never taken a nude photo. Such technology should never have been released to the public."
Danielle Citron, a professor at the University of Maryland School of Law, believes that DeepNude is without doubt a violation of sexual privacy.