UK publishes safety-focused rules for video-sharing platforms like TikTok – TechCrunch

Video-sharing platforms offering a service in the UK must comply with new regulations designed to protect users, and under-18s especially, from harmful content such as hate speech and videos or ads likely to incite violence against protected groups.

Ofcom, the communications, broadcast and – in a growing role – internet content regulator, today released guidelines for platforms such as TikTok, Snapchat, Vimeo and Twitch.

One of the requirements for the affected services is that they must take “appropriate measures” to protect users from harmful content.

Terrorist content, child sexual abuse material, racism and xenophobia also fall under the category of “harmful content”.

In a press release, the regulator said its research shows that a third of UK internet users have witnessed or experienced hateful content; a quarter claim to have been exposed to violent or disturbing content; and one in five have been exposed to videos or content promoting racism.

There is no prescriptive list of measures that video-sharing platforms must use to prevent users from being exposed to such content.

But there are a number of recommendations – such as clauses in terms and conditions; functionality like the ability for uploaders to declare whether their content contains advertising; user-friendly mechanisms for viewers to flag or report harmful content; and transparent complaints procedures.

Age assurance systems are also recommended, as are parental controls, given the regulation’s specific objective of protecting under-18s from viewing videos and ads containing restricted material.

Ofcom also recommends a “robust” age check for video-sharing platforms that host pornography to prevent those under 18 from viewing adult material.

A list of video-sharing platforms that have notified Ofcom that they fall within the scope of the regulation is available from the regulator. (In addition to the aforementioned platform giants, it also includes OnlyFans, Triller and Recast.)

“We recommend providers put in place systematic risk management processes to help them identify and implement practicable and proportionate protective measures,” Ofcom continues in its guidance for video-sharing platforms.

“While we recognize that harmful material may not be completely eradicated from a platform, we expect providers to make significant efforts to prevent users from encountering it,” it adds.

“The VSP [video-sharing platform] regime is about platforms’ safety systems and processes, not the regulation of individual videos, but evidence of a prevalence of harmful material on a platform may require further investigation.”

The regulator says it will want to understand the measures platforms have in place, as well as their effectiveness in protecting users – and “any processes that have informed a provider’s decisions about which protective measures to use”. Platforms will therefore have to document their decisions and be able to justify them if the regulator comes knocking following a complaint.

Monitoring tech platforms’ compliance with the new requirements will be a key new role for Ofcom – and a taste of what it will need to take on under incoming, much more expansive online safety-focused regulation.

“In addition to engaging with providers themselves, we expect to inform our understanding of effective user protection by, for example, monitoring complaints and engaging with interested parties such as charities, NGOs and tech safety groups,” Ofcom also writes, adding that this engagement will play an important role in shaping its decisions about “areas of focus”.

Ofcom’s role as a regulator of internet content is set to deepen in the years to come, as the government works to pass legislation that will impose a broad duty of care on digital service providers of all stripes, requiring them to handle user-generated content in a way that prevents people – and children in particular – from being exposed to illegal and/or harmful material.

A key appointment – the chair of Ofcom – has been delayed after the government decided to rerun the competition for the post.

Reports have suggested the government wants former Daily Mail editor Paul Dacre for the job, but an independent panel involved in the initial selection process rejected him as unsuitable earlier this year. (It is not clear whether the government will keep trying to parachute Dacre into the post.)

Ofcom, meanwhile, has been regulating video-on-demand services in the UK since 2010.

But the video-sharing framework is a separate regulatory instrument, intended to reflect the difference in the level of editorial control: video-sharing platforms provide tools for users to upload their own content rather than commissioning it themselves.

However, this framework is itself expected to be superseded by legislation forming part of the incoming online safety regime.

So these regulations for video-sharing platforms are something of a placeholder – and a taste of what’s to come – as UK lawmakers scramble to set more comprehensive online safety rules that will apply much more broadly.

Indeed, in the guidance, Ofcom describes the VSP regime as “an important precursor to future online safety legislation”, adding: “Given the common objective of both regimes to improve user safety by requiring services to protect users through the adoption of appropriate systems and processes, Ofcom considers that compliance with the VSP regime will help services prepare to comply with the online safety regime as proposed by the government in the draft Online Safety Bill.”

UK data protection regulation also already enforces a set of “age-appropriate design” requirements for digital services likely to be accessed by children.

