Trigger warning – rape, sexual violence, child sexual harassment
Globally, porn is a USD 97 billion industry. PornHub, the largest distributor, gets 115 million unique visits a day on average, and the content uploaded to the site adds up to over 170 years of streaming time. The nationwide lockdown has exposed hideous parts of our society, where child rapists, abusers, and pedophiles hide behind the dark web to consume the sexual abuse of children while others traffic, film, and upload such content. A racket like this runs a huge business, operating through WhatsApp, Facebook, and Telegram groups. Members are required to upload a certain number of videos every day to retain their membership and income, and thus begins the vicious cycle of entrapping gullible children into forced sex labor.
This industry makes billions through advertisement revenue, premium subscriptions, and data collection, all built on rape, abuse, and child sex trafficking. According to a BBC report, a 14-year-old girl (unnamed) was kidnapped, raped, and assaulted on camera for 12 hours at gunpoint, and she had to beg PornHub for months to remove the videos. An investigation by The Sunday Times discovered images of a toddler being sexually abused on a porn website. These incidents are a drop in the ocean, barely a scratch on a mind-bogglingly enormous sex racket. By not holding PornHub accountable, we sanction the company to profit from rape, violence, and child sex trafficking.
The India Child Protection Fund (ICPF), in a survey published in April 2020, found an alarming rise in demand for child pornography. The average demand was 5 million searches per month across 100 cities on the public web alone; figures from Virtual Private Network (VPN) and other dark web channels remain unknown. Chennai and Bhubaneshwar topped the consumption of child sexual abuse material (CSAM), and 90% of CSAM consumers are men. Searches for ‘child porn’, ‘sexy child’, and ‘hot teen’ have spiked, and 18% of individuals consuming CSAM have exhibited an explicit interest in videos where children are choking, bleeding in pain, or screaming. In addition, the 2019 search keywords in India included ‘forced sex India’ and ‘rape sex videos Indian’. With six billion videos uploaded every year, it is a shame that there is no system in place to verify age and consent before a pornographic video is uploaded. PornHub makes uploading content insanely easy: all a user needs is an email ID. That’s it!
In an attempt to counter this closer home, Rajya Sabha Chairman M Venkaiah Naidu constituted a Rajya Sabha ad hoc committee, which has made more than 40 recommendations. These include greater regulation to protect children online, strict government action to prevent the generation and distribution of child porn, and holding Internet service providers (ISPs) and social media companies accountable for their inaction and involvement.
The Kerala state police conducted an operation in April 2019 that resulted in the registration of 21 cases and the arrest of 14 persons. It was followed by a second operation in June, in which 12 persons were arrested and 16 cases were registered. According to the police, the investigation found that a host of child pornography groups were operating on social media applications such as Facebook, WhatsApp, and particularly Telegram, since these offer strong encryption and greater anonymity.
In a similar vein, the Maharashtra police undertook a special drive called Operation Blackface. They have lodged 47 First Information Reports (FIRs) across the state, of which 16 are under the Protection of Children from Sexual Offences (POCSO) Act, 29 are under the Information Technology Act, 2000, and one is under the Indian Penal Code (IPC). Over the last one and a half months, 10 persons have been arrested under this drive.
In its recommendations, the ICPF stated that an aggressive online campaign is required to pull down websites distributing CSAM, along with greater governmental and non-governmental engagement to educate children and parents about the threat posed by online child sexual abusers and the means to report the crime. A CSAM Offenders Registry should be maintained for individuals found to be consuming CSAM and made available to national and local ISPs. Individuals found to be consuming, distributing, or selling CSAM should be denied personal internet services for a reasonable period. ISPs and social media platforms should be under orders to mandatorily report and pull down CSAM content and offending accounts; failure to report or remove CSAM should qualify as a violation of the terms of permission to provide networking services. Most importantly, a CSAM Tracker should be created through a nationwide scale-up of the artificial intelligence tools deployed for this research. The tracker would monitor the hosting, sharing, viewership, and download of CSAM and link it with the existing cybersecurity systems of the Government of India.
While all these ideas look credible, the essence lies in their implementation. The battle does not end with the eradication of CSAM: we need a coherent legal system that makes the consent of performers mandatory for filming porn, efficient technical support to detect non-consensual and violent content, prompt action against offenders, and, most importantly, sensitization among viewers.