I hope Pandora won't mind us copying her very important piece in its entirety. We have left in all her links, and we do suggest you try to contribute a little to help Pandora in this fight. We only report here what others are saying, but sometimes it is so very important that we give the full gist of the argument, so we will have this post open for comments.
How to get a meeting with the DCMS
I was recently invited to meet with the Department for Digital, Culture, Media and Sport (DCMS) to discuss age verification.
I’ve been trying for a while to make this conversation happen. I’d previously connected with a couple of different members of the Child Internet Safety Team, the group responsible for implementing age verification – and then they moved on to other roles, leaving me without any active contacts. I first met with different representatives in collaboration with the UK Adult Producers trade association (UKAP) and with the Open Rights Group – and more briefly at events organised by UKAP and the Adult Provider Network. Twice I was given email addresses and told to keep in touch, only to then have my emails ignored. No doubt they’re busy people.
Now, with the DCMS in prime position to shape age verification and create a robust regulatory system in which data is protected and user privacy is respected, I was faced with another new Senior Policy Manager – the third since I started campaigning around age verification – who was refusing to engage.
I had to act strategically. In November I organised a roundtable on age verification, and invited the DCMS. They declined to attend. The next day I sent them a long list of questions, exposing the myriad cracks in age verification, and holding the DCMS to account for protecting privacy, security and freedom of expression. I waited for an answer – until I posted my piece revealing the catastrophic consequences should MindGeek attain a monopoly on age verification with their AgeID software, which could potentially track porn users across the internet. In that piece I called out the DCMS for refusing to answer my questions via email, or take responsibility for regulating the age verification market they have created. It worked. The next day I received a reply to my email, inviting me to a meeting to address my concerns.
Age verification: the questions we need answered
When the Senior Policy Manager of the Child Internet Safety Team sent his apologies for not attending the age verification roundtable, he included a paragraph of DCMS policy stating that age verification is an “important child protection measure” and that they “want to make the UK the safest place in the world to be online” – a line copied and pasted directly from government press releases. I decided to engage with this claim, and tackle all the ways in which the age verification policy outlined in the Digital Economy Act is inconsistent, unworkable, and threatens to harm small businesses and individuals, particularly members of marginalised communities such as sex workers and LGBTQ people.
These are the questions I asked:
- Part 3, section 14(6) of the Digital Economy Act states that “For the purposes of this Part, making material available on the internet does not include making the content of an on-demand programme service available on the internet in the course of providing such a service.” Does this mean that on-demand programme services will not have to age verify users?
- The scope of the Act is not well defined. Sky’s content filter blocks over 4 million adult websites. How many websites can the BBFC classify?
- Under the definitions provided in the Act, “commercial pornography providers” could include amateur sex bloggers posting nude personal pictures to a small audience as part of a consensual adult lifestyle and freedom of sexual expression, if they make a little money (even £50/month) from advertising, sponsored posts or sex toy reviews. How will sites like this be able to afford the costs of age verification?
- Many independent escorts, to avoid working on the streets or for exploitative bosses, advertise their services online via self-hosted websites, which often contain nude pictures and video to promote their brand. Will these sites need to install age verification? If sex workers cannot afford to pay 25 pence per unique visitor (which adds up to hundreds of pounds a day for even relatively small websites – see the rough cost illustration after this list), and therefore need to stop advertising online and go back to working for exploitative agencies or bosses, how does this make them safer?
- The word “security” is referenced by the Digital Economy Act only in terms of national security, not personal security. Are you aware that a law which forces citizens into the habit of sharing private details online is extraordinarily bad security on a national scale?
- Have you considered the vast potential for identity theft which widespread age verification will lead to, with ripe pickings for identity fraudsters to collect credit card details and other identifying information on fake websites with fake “age checks”? I’d be thrilled to learn how exactly you think this will make the internet “safe”. What grounds will the regulator have to sanction fraudulent or data harvesting “age verification” software?
- Part 3, section 25(1)(a) requires the regulator to publish “guidance about the types of arrangements for making pornographic material available that the regulator will treat as complying with section 14”. Does this extend to providers of Age Verification software; and does the regulator therefore have the power to decide what age verification solutions will be considered compliant or not? On what basis will these decisions be made? Will the security and privacy strengths of AV solutions be taken into account when considering their compliance, or merely whether or not they correctly verify age?
- The Act completely fails to address the privacy and security of internet users – it does not use the word “privacy” once. MindGeek (the biggest online porn company, who own the majority of the “free” adult tube sites such as PornHub) anticipate that 20-25 million users will sign up to their age verification solution AgeID in the first month. Their sites PornHub, YouPorn, Digital Playground and Brazzers have all suffered security breaches in the past. Therefore, without any privacy safeguards in the body of the Act, this seems to pave the way for 25 million users’ sexual preferences to be leaked onto the Internet. Are you intending to produce a mandatory privacy standard which age verification technologies must meet in order to be considered compliant? Will the regulator have authority to enforce this standard?
- It has been repeatedly claimed that age verification only needs to know the user’s age, not their identity. Nonetheless, many of the currently available age verification technologies (eg VeriMe, Yoti, Experian, Equifax etc) require users to share identifying details such as their mobile number, real name or credit card details. When it comes to secure systems design, the most confidential data is not retained anywhere. What safeguards will be in place to ensure that age verification providers only see the minimal amount of data; blind data as much as possible; and do not needlessly retain sensitive user data, thereby making it vulnerable to security breaches, abuse or leaks? (A minimal sketch of this data minimisation principle follows this list.)
- How do you resolve the discrepancy between the content regulations laid out in the Act (where pornography is permitted behind age checks as long as it is not “extreme” according to the definition provided by the Criminal Justice and Immigration Act 2008), and those laid out in the Audiovisual Media Services Regulations 2014 (AVMS 2014), which prohibit any content beyond R18? The AVMS 2014 is still in force, even though Parliament ruled during the debates on the Digital Economy Act that the CPS guidance on the Obscene Publications Act (OPA) is out of step with recent case law, and not fit for purpose. Why is there one rule for on-demand programme services (ODPS) and another for everyone else?
- Part 3, section 15(2) of the Act defines “material” as “(a) a series of visual images shown as a moving picture, with or without sound; (b) a still image or series of still images, with or without sound; or (c) sound”. Given the definition of “pornographic” requires “that any classification certificate issued in respect of a video work including it would be an 18 certificate;” how will sound be classified? Have the CPS issued any guidance on the OPA relating to sound? Will you be producing guidance clarifying what sort of audio would be considered “pornographic” for the purposes of the Act?
- I understand that your second draft of the guidance for the age verification regulator is about to be presented to Parliament, and must be approved by both Houses. The regulator then needs to be appointed, and they then need to produce guidance which must also be approved. Age verification companies are not going to launch their products until after the regulator’s guidance has been published, as they want to ensure their products are compliant before they go to market. It therefore seems unlikely that any age verification solutions will be available for public consideration before January or February next year. Given Matt Hancock has announced the deadline for compliance as May 2018, how do you expect pornography site owners, most of which are one- or two-person outfits without a separate IT department, to research the relevant software and make informed decisions within this timeframe? Many site owners are eager to begin preparations, but they are unable to do so until the relevant guidance has been issued, which might not be for many months. Do you consider a two-month turnaround to be realistic for one person who is not an IT expert to conduct a complete site overhaul?
- Although guidance about the scope of the legislation and what will be considered compliant has still not been issued, are you aware that since the Act passed we are already seeing a chilling effect, with small porn sites closing down for fear of being blocked? Do you believe that UK law should uphold diversity and freedom of expression, particularly when it comes to LGBTQ sexualities and other marginalised communities?
- MindGeek are producing their own age verification solution, AgeID, which as noted above will verify 25 million users in the first month. They are intending to license this out to other site owners, thereby creating an unimpeded traffic flow where users only have to age verify once, as long as they continue to browse sites using the AgeID system. This will disadvantage sites using other age verification solutions which would require the user to re-verify, as many users will be put off by having to go through the age verification process multiple times. Are you aware that this will gift the biggest porn company in the world with a monopoly that effectively makes them the gatekeepers of porn? MindGeek’s tube sites make money by allowing users to upload pirated (stolen) content made by producers like myself, and then monetising it via advertising; once age verification is in effect, if we want to stay in business we will effectively have to pay a “MindGeek tax” to our biggest competitor, who has established market dominance by pirating our content. Is this sort of extortion your idea of a “safe internet”?
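To put the 25 pence per unique visitor figure in context, here is a rough illustration of how quickly that cost mounts up. The traffic levels are assumptions of mine for the sake of the example, not figures from the article or from any real site.

```python
# Rough illustration of the 25p-per-unique-visitor age verification cost.
# The daily visitor counts below are hypothetical, not real site statistics.

COST_PER_UNIQUE_VISITOR_GBP = 0.25  # 25 pence per unique visitor

def daily_av_cost(unique_visitors_per_day: int) -> float:
    """Return the daily age verification bill in pounds."""
    return unique_visitors_per_day * COST_PER_UNIQUE_VISITOR_GBP

for visitors in (500, 1000, 5000):  # hypothetical daily traffic levels
    cost = daily_av_cost(visitors)
    print(f"{visitors:>5} unique visitors/day -> £{cost:,.2f}/day (about £{cost * 30:,.2f}/month)")
```

Even a site with only 1,000 unique visitors a day would be facing roughly £250 a day on these terms – far more than the £50 a month some of the sites mentioned above actually earn.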
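On the data minimisation question above, here is a minimal sketch of what “only seeing the minimal amount of data” and “blinding” could look like in practice. Everything in it is hypothetical: the function names and token format are mine and do not describe the API of any real age verification product.

```python
# A minimal, hypothetical sketch of data minimisation for an age check:
# the site stores only a blinded session reference and an over-18 flag,
# never a name, date of birth, mobile number or card details.

import hashlib
import hmac
import secrets

SESSION_KEY = secrets.token_bytes(32)  # kept in memory only and rotated regularly

def record_age_check(passed: bool, session_id: str) -> dict:
    """Store only a keyed hash of the session ID plus the over-18 result."""
    blinded_ref = hmac.new(SESSION_KEY, session_id.encode(), hashlib.sha256).hexdigest()
    return {"over_18": passed, "session": blinded_ref}

# Even if this record leaked, it would reveal nothing about who the visitor is,
# because nothing identifying is ever retained.
print(record_age_check(True, "visitor-session-abc123"))
```

The point of the sketch is simply that an age check can be designed so that the most sensitive data is never stored at all – which is the standard the question above asks the regulator to require of providers.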
I am extremely grateful to the amazing 114 people who support me with a small monthly pledge to do work like this. Become a patron and help me continue to lobby for privacy, freedom of speech and other adult liberties. Patrons get early access to campaigning news, including my article reporting the DCMS’ answers to my questions, which has just gone live. Pledge $1 or more to view it.