
1 in 4 UK businesses worried about complying with EU online safety regulations proposed under Digital Services Act


Research reveals that many UK businesses feel unprepared for the new Digital Services Act (DSA), regulation focused on protecting online users from harmful content that is set to come into force across the EU late this year.

The findings from Besedo, an expert in online content moderation, reveal how UK businesses are reacting to the EU’s proposal for a Digital Services Act (DSA). The research, conducted among 200 UK businesses, shows that although many are “cautiously optimistic” about complying, nearly all (92%) have concerns about the new regulations.

Submitted in December 2020, the DSA, if passed by the European Council and European Parliament, will give businesses operating online in the EU a new set of rules designed to create a safer digital space, and will involve a significant overhaul of businesses’ current responsibilities.

Petter Nylander, CEO of Besedo, comments: “With the pandemic driving more consumers to use online platforms to shop, date and connect in a socially distanced world, the opportunity for fraudulent, harmful and upsetting content has increased. There’s no hiding from the fact this regulation will significantly impact how businesses operate online. The DSA will force businesses to change the way they approach content moderation to protect users against dangerous and fraudulent activity.”

Worries over compliance

Topping the list of main concerns for businesses around the DSA is lack of understanding on how to comply with the regulations (25%), whilst a further quarter of respondents have concerns regarding the cost of compliance. The research also reveals that reputational damage from not complying is causing apprehension, with 22% stating this as their biggest concern about the DSA.

The research shows an awareness gap, particularly between large and small businesses: 87% of big businesses consider the DSA to be “a wide spanning act to create a safer digital space, affecting multiple platforms.” However, this is true of just 40% of small businesses, suggesting that small businesses aren’t fully cognisant of the fact that the DSA will affect them in serious ways.

Turning to technology to ensure compliance

Many businesses know they will need to improve their content moderation to prepare for compliance; 93% of respondents expect to upgrade their existing content moderation systems.

Nearly half of businesses are planning on using staff to moderate content manually and manage the process themselves in order to comply. Over a third plan to build their own content moderation tool, with a further 22% saying they will look to set up their own Artificial Intelligence to moderate their content.

Nylander adds: “Our research suggests that businesses don’t yet fully understand the potential impact the DSA will have, or the scale of the content moderation challenge that lies ahead of them to be compliant. Although it’s too soon to build a moderation approach specifically designed for the DSA, businesses should not delay in setting up systems that keep their users safe. In a similar way to how the GDPR guidelines were publicised in the media, it’s likely that the DSA will also be discussed before the new regulations are in place. This will put pressure on companies to prioritise safety before it becomes a legal requirement due to consumer awareness.”

He concludes: “Businesses should start working on improvements now, to show customers that they are working with companies they can trust. Businesses cannot afford to take a cavalier attitude towards removing harmful content, not only as regulators crack down, but as users’ expectations of services rise. Effective content moderation, using a combination of AI and human moderators, ensures businesses can safeguard themselves and avoid reputational damage, as well as grow their business based on positive user experiences.”
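The hybrid approach Nylander describes typically works by letting an automated classifier decide clear-cut cases and routing anything it is unsure about to a human moderator. The sketch below is a hypothetical illustration only: the thresholds, the keyword heuristic standing in for a real ML model, and all function names are assumptions, not part of Besedo’s product or the DSA.

```python
# Hypothetical sketch of hybrid AI + human content moderation:
# confident scores are actioned automatically; uncertain ones go to review.

REMOVE_THRESHOLD = 0.9   # assumed: score at or above this is auto-removed
APPROVE_THRESHOLD = 0.1  # assumed: score at or below this is auto-approved

def score_content(text: str) -> float:
    """Stand-in for an ML model: returns a harm score in [0, 1].
    A naive keyword heuristic is used purely for illustration."""
    flagged = {"scam", "fraud", "abuse"}
    hits = sum(1 for word in text.lower().split() if word in flagged)
    return min(1.0, hits / 2)

def moderate(text: str) -> str:
    """Route content to one of: 'removed', 'approved', 'human_review'."""
    score = score_content(text)
    if score >= REMOVE_THRESHOLD:
        return "removed"
    if score <= APPROVE_THRESHOLD:
        return "approved"
    return "human_review"
```

For example, `moderate("looks like a scam to me")` falls between the two thresholds and is routed to a human, which is where the combination of automation and human judgement the quote refers to comes in.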
