Given Royal Assent last month, the UK has enacted its Online Safety Act 2023 and, while its intentions are good, it is likely to have both positive and negative impacts. David Ashman takes a look at what it might mean for the telemedia market
The UK Government’s Online Safety Act 2023 was given Royal Assent in October and presents a key opportunity for the evolution of the web or, more simply, for a safer place for our children to spend time. This is the aspiration and it’s a laudable endeavour.
While there is great detail contained within the Act, the key themes cover a range of powers to protect underage users from the darker forces at work online. These can be summarised in six key aims:
- Zero-tolerance approach to protecting children, meaning social media platforms will be legally responsible for the content they host
- Empowering adults with more choices over what they see online
- Legal responsibility on tech companies to prevent and rapidly remove illegal content, like terrorism and revenge pornography
- Enforce age limits and use age-checking measures on platforms where content harmful to children is published
- Stop children seeing material that is harmful to them such as bullying, content promoting self-harm and eating disorders, and pornography
- A general provision (detail not yet clear) placing advertising-control obligations on providers
However, as in Newtonian dynamics, every action has an equal and opposite reaction: for every good intention, there will be nefarious forces picking the legislation apart for loopholes in order to continue business as normal.
Jonathan Swift once stated, “Laws are like cobwebs, which may catch small flies, but let wasps and hornets break through”. One need only think back to GDPR and the results for online cookie management for an example of intention gone awry: it provides little practical benefit to the consumer whilst heaping further red tape and bureaucracy on businesses. Arguably, cookie management has produced an even worse user experience, with a mandatory pop-up box on every website requesting consents that only the most dedicated and data-savvy user has the patience – time after time – to read pages of permissions and, in many cases, to perform elongated steps to restrict them. A more educated legislator might instead have looked to mandate Web3 and blockchain technologies to enable users to manage permissions and control their own data.
For the Online Safety Act, the first potential cracks have been exposed before the ink has even dried. The Government’s definitions may not align with those in use in other sectors of society or, worse, there may be discord and inconsistency between the public-sector enforcement body and the aims of the legislator or the wider public.
In this context, a pertinent example is the ongoing debate surrounding the term “Jihad.” The Metropolitan Police has pointed out to an infuriated Home Secretary that this term carries a dual definition, encompassing both an internal struggle to hold one’s sinful urges at bay and an external interpretation sometimes linked to acts that incite terrorism.
Context is important, and the rising non-compliance we at MCP have identified in the placement of Google ads demonstrates this principle, as does Google’s current difficulty in controlling this without third-party assistance from compliance monitoring houses such as MCP.
These subjectivities pose real jeopardy for social media companies forced to tread the line between the principles of free speech and the prospect of sanctions if boundaries are deemed to have been overstepped.
Who is the appropriate arbiter? The personal opinion of an Ofcom employee? The complaints of a vocal few? The church, the proverbial Mary Whitehouse or, like the episode of Black Mirror, the oblivion of cancellation through the kangaroo court of an emotionally charged dislike button?
Scope and jurisdiction
The challenges with the scope and jurisdiction of country-specific enforcement of internet activity have also recently been brought into focus by the successful appeal of an American facial recognition company: see Clearview AI Inc v The Information Commissioner [2023] UKFTT 00819 (GRC).
Given the challenges that even the heavy-weight authoritarian states of the censorship world have faced due to the ubiquity of VPNs, and that the West has experienced in trying to control the Dark Web, whether the UK Government can successfully act unilaterally in regulating a global community has a very potent question mark hanging over it.
Content providers in the DCB and mVAS sector are already familiar with the tools and techniques to monitor and control affiliate partners. During my discussions with one of the social media giants – at the time looking to launch its own virtual currency – it was evident, despite the company drawing senior talent from the likes of PayPal, that control of the wider value chain was an alien and barely considered concept.
As mainstream businesses are thrust into the bright light of enhanced compliance, MCP’s monitoring expertise has already become attractive to the likes of Google, and many more businesses in the coming months will need to adopt tools and services like MCP SCANNER to discharge their new obligations to protect the public.
Levelling the playing field
The Act – now published – contains provisions to combat fraudulent advertising, placing responsibility on the value chain. With Ofcom the intended regulator, the future regulations could very likely mirror those with which the DCB industry is already very familiar within the Phone-paid Services Authority Code.
In many respects, the Online Safety Act could be seen to level the playing field.
Having had the forbearance to sit through trade association meetings for over a decade, the most frequent theme I have heard has been the disparity between DCB payment regulations’ focus on the entire user journey and the lesser standards demanded of marketing for traditional payment methods such as credit and debit cards. This has long been considered a disadvantage. However, the tide is now turning as general legislation moves in a direction in which the Mobile Payments sector already has significant experience.
For DCB payment providers this will be an opportunity to present a tried, tested and compliant solution to those companies that are currently exposed by these new requirements.
Further, for providers of adult content, who have seen margins cannibalised by free sites, the likely requirement for such material to move behind age verification could present new opportunities to reinvigorate paywalls.
How hard is this going to be to implement?
It won’t be a walk in the park. It will require consistent and measured enforcement, which is easier said than done. Balancing free speech and the prevention of harmful content is a real challenge. Plus, ensuring that everyone’s on the same page regarding definitions and compliance across different sectors adds another layer of complexity.
The DCB and mVAS sector does have an advantage though. We’ve long been subjected to rigorous regulatory scrutiny in the UK and already have the compliance solutions needed. So, barring the issues of definitions, scope and jurisdiction, we’re in good shape to implement appropriate solutions.
MCP has already invested in solutions to identify and control the types of promotion and content being displayed to children. It is also grappling with other thorny issues, such as defining which environments are considered attractive to children, given the blurry boundary posed by cartoons and games with an equal or predominantly adult target market.
For example, MCP’s monitoring in the Middle East has given us a head start in identifying adult and political content to enable rapid blocking, while work perfecting traffic analysis techniques through MCP SHIELD fraud blocking has already laid the foundations for geo-specific, silo-based approaches to applying legislation.
How effective the legislation will be remains in the balance. Consistent, measured enforcement will be the linchpin. The political will to stand up to the vested interests of the corporate giants will need to be a keystone. Above all, an understanding of, and the mandating of, the right technology in the value chain will be crucial.
What is clear is that the solutions, skills and experience gained within the DCB sector mean that its talent and products are likely to be in high demand.
David Ashman is Head of Compliance, MCP Insight
Where Europe’s Digital Services Act fits in
Ratified in November 2022 – but not due to take full effect until 17 February 2024 – the EU’s Digital Services Act (DSA) seeks to play a similar role to the UK’s Online Safety Act 2023, which received Royal Assent in October 2023.
Designed to harmonise EU member states’ online safety rules, which had diverged considerably as the online world pulled away from how it looked in 2000 when the E-commerce Directive sought to police how the web was used, the DSA joins the Digital Markets Act (DMA) and everyone’s favourite, GDPR – not to mention the proposed AI Act – as a triumvirate of legislation designed to protect Europe’s users of the web and ecommerce.
The rules are wide reaching and will see everyone from Big Tech downwards forced to be accountable for everything from fake news to propaganda to illicit products to child grooming and abuse.
The DSA is designed around five key themes: combatting illegal products; policing and removing illegal content; protection of children; sensitivity about racial and gender diversity; and a ban on ‘Dark Patterns’ – effectively rules to prevent shoppers from being manipulated into buying things they don’t want or need. There are also plans to prevent tech companies from ranking their own services more favourably than others, and to make it easier to uninstall pre-installed apps and replace them with others.
For those that don’t comply there are heavy fines – up to 6% of global revenue – and, in the worst cases, a total ban from operating in Europe.
As with the UK Online Safety Act, the DSA has been broadly welcomed. Sam Media’s Leon Dijksman says: “The act would potentially protect children from being confronted with adult content and it will certainly make things a lot more difficult for rogue affiliates and fraudulent advertisers – and I only expect a positive impact for the parties [involved]. There will also be less fraud and fewer complaints.”
However, like many in the industry, he admits that implementation will require a lot of effort and investment from advertising media and platforms – though, ultimately, it can only be good for business.
Comparing DSA and UK Online Safety Act
While both the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act are geared towards doing the same job, they vary in scope and approach – a divergence that is a direct result of Brexit. The UK’s law, for example, divides services into different categories depending on their size and deemed risk, while the DSA focuses on digital intermediaries, including a wide variety of social media, search engines, marketplaces and cloud services.
While the DSA treats all kinds of illegal content equally, the UK law contains different obligations for different types of illegal content. The nature of risk assessment obligations is also different: only Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) need to comply with these DSA requirements, as opposed to all services under the Online Safety Act’s remit. Finally, the DSA will be enforced by the EU Commission and Digital Services Coordinators – government entities – while the UK law will be enforced by Ofcom, the UK’s independent communications regulator.