We are pleased to report that on February 26 we mailed the letter below, together with the collected signatures. Its contents are as follows.
The ‘porNOphone’ campaign: Develop a feature that prevents
pornographic images from being recorded on smartphones,
and incorporate it as a standard OS feature
We are an organization that provides counseling and support for victims of sexual exploitation and sexual violence. We monitor the circumstances of sexual exploitation and violence in Japan, demand remedies from businesses that profit from sexual exploitation, and work to provide relief to victims.
One of these victims, a high school student we will call ‘Aiko’, was forced into a sexual relationship with a male acquaintance, who recorded the acts on his smartphone. She didn’t like it, but she felt she couldn’t say no for fear of ruining the relationship, and she had convinced herself that it was simply something men wanted to do.
Afterwards, the footage was uploaded to a pornography video-sharing website and her schoolmates and parents found out about it. Her parents asked the website owner to take it down, but their request was ignored. We at PAPS managed to get the website owner to take it down, but since then the video has periodically re-appeared, and so Aiko must constantly monitor the site. She now lives her life afraid that someone will see the video, and in this state of constant tension she has thoughts of death.
Another case that PAPS has been involved with is that of a junior high school student, ‘Marie’, who became friends with someone she met through a smartphone game. She began chatting with this friend on Twitter about her daily school life and various problems. She hadn’t had anyone in her life to talk to about her problems, so she came to trust this friend even though they hadn’t actually met in real life.
One day, the friend sent Marie a photo of himself through social media and asked for one in return, so she sent a photo of her face. Sending photos became a daily activity that Marie looked forward to as an emotional comfort in her loneliness; the friend would compliment her looks, and the photos became a topic of conversation between the two online. Marie was eventually goaded by this friend into sending pictures of herself in her underwear.
Marie was afraid that if she refused, she would hurt this friend and be rejected by him. Then, one day, she was asked to send naked photos. She refused at first, but her friend threatened to circulate the underwear photos she had already sent. He then created a fake Twitter account with a doctored photo of Marie in her underwear. In response to this intimidation, Marie sent the friend a naked photo taken on her phone.
She subsequently consulted with us at PAPS, and the perpetrator was arrested. However, the perpetrator turned out to be an underage boy in another class at Marie’s own school.
When child or revenge pornography is released onto the Internet in the ways shown in these examples, it is virtually impossible to remove, and the damage to victims can continue for 10 or 20 years. Further, for a victim to demand the removal of such images, they must first search the Internet and find the images of themselves, which can cause further trauma. Our organization searches for sexual images on behalf of victims and submits take-down requests to website owners, but even when the images are taken down, they re-appear after a period of time, and a cat-and-mouse game ensues.
Due to the proliferation of smartphones, even primary school children now have them, and our organization receives endless requests for consultation over harms arising from the circulation of pornographic images, including child and revenge pornography. The people who produce these images do so easily with their smartphones, and so are unaware of the extent of the human rights violations their victims sustain. Almost all of the requests we have received for consultation over revenge and child pornography have involved smartphone cameras.
According to Japan’s National Police Agency, the number of calls to police relating to revenge pornography, the release of sexual images or videos without a person’s consent, hit a record 1,479 in 2019, a 9.8 percent increase over the previous year. More than half of the perpetrators were named as current or former dating partners. A record 1,559 children were victims of child pornography, an increase of 283 over the previous year.
These statistics make it obvious that smartphones, as convenient as they might be in our convenience-centred societies, are causing children to become both victims and perpetrators of image-based sexual abuse.
We do not believe smartphones should be the easiest tools for creating victims and perpetrators of pornographic images. Our ‘porNOphone’ campaign will of course not prevent all harms arising from the proliferation of these images, but it aims to reduce the number of take-down requests that victims must make for material online. The campaign also serves to raise awareness of the criminality of producing sexually explicit images against someone’s will, and to encourage a shift in social norms towards intolerance of such practices.
In launching our ‘porNOphone’ campaign, we appeal to the sense of corporate responsibility that Apple and Google exercise in relation to their products, and ask for their cooperation. We urge Apple and Google to apply their brilliant minds, technology, and pioneering spirit to this problem, and to innovate against sexual exploitation around the world.
Specifically, we ask the following two things of the smartphone OS vendors Apple and Google.
1. We request that they develop a feature that prevents the recording of naked human beings on smartphones, utilizing AI and image recognition technology (an illustrative sketch follows this list).
2. Once this feature is developed, we request that it be incorporated as a standard, built-in function of their operating systems.
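To make this request concrete, below is a minimal sketch of how such a check might look on iOS, assuming a hypothetical on-device classifier. The model name NudityClassifier, the function shouldBlockCapture, the label "explicit", and the 0.9 confidence threshold are all our illustrative assumptions, not an existing implementation; only the Vision and Core ML framework calls are real Apple APIs. The actual model and integration would of course be for the vendors to design.

```swift
import CoreGraphics
import CoreML
import Vision

// Purely illustrative: gate a camera capture on an on-device classifier.
// "NudityClassifier" is a hypothetical Core ML model name; Apple and Google
// would train and ship their own models. The Vision and Core ML calls used
// here are real iOS APIs.
func shouldBlockCapture(_ frame: CGImage, completion: @escaping (Bool) -> Void) {
    // Load the (hypothetical) classifier that would be bundled with the OS.
    guard let mlModel = try? NudityClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        completion(false) // no model available; do not block
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Block only when the top label is "explicit" at high confidence.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier == "explicit" && (top?.confidence ?? 0) > 0.9)
    }

    // Classify the frame before it is ever written to storage.
    do {
        try VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])
    } catch {
        completion(false)
    }
}
```

Keeping the check on the device, rather than in the cloud, would mean that no image ever leaves the phone for analysis, which we believe matters for the privacy of the very people this feature is meant to protect.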
We raised money through crowdfunding in January. Backed by the voices and funds of our supporters, we will continue to educate society about the reality of digital sexual violence.
If AI stops smartphone cameras from shooting nudes and pornography by default, then installing a separate camera app becomes a deliberate human decision. It is our sincere hope that AI can be properly involved in this decision-making, keeping opportunities for crime and mistakes away from our children, even if only one step at a time.
Sincerely,