Education Law Solicitors

Everything you face, we face with you

Tech companies face fines if they let children see harmful content

New government legislation has handed Ofcom powers to investigate, fine and disrupt video-sharing and live-streaming platforms to protect children from violence, child abuse and pornography. The new legal requirement to protect children from any video content that “impairs their physical, mental or moral development” comes into effect in September under an EU-wide directive that Britain will enact as part of the withdrawal agreement. The regulator will also play a key role in enforcing a statutory duty of care to protect users from harmful and illegal terrorist and child abuse content. It is another step towards achieving the government’s pledge to make the UK the safest place in the world to be online.

Social media platforms such as Facebook, Instagram and YouTube, as well as TikTok, Snapchat and Twitch, will be among those held to account under the new regime, which will require tough age-verification checks. Under the plans, the lead regulator will be based in the country where the social media or video-sharing platform has its headquarters or biggest base, which means YouTube will be regulated from Dublin. Ofcom will be responsible for policing UK-based platforms, including Snapchat, TikTok and Twitch. The move is part of plans to protect children and vulnerable people online and give consumers greater confidence to use technology. It will provide the certainty technology businesses need to flourish and innovate while creating a fair and proportionate regulatory environment.

The new regulation follows pressure and numerous concerns from various agencies highlighting the dangers that unregulated platforms pose to children. It is claimed that there has been a surge in children viewing live streams, which paedophiles also use to groom and abuse them. In addition, police and campaigners blame live streaming for fuelling knife crime via violent drill music videos used by gangs to intimidate each other. The new legislation also covers online bullying, hate crime, discrimination and suicide, all of which are increasing each year with the growth of such platforms.

Last week, the National Crime Agency criticised the failure of the tech giants to protect children. The new legislation will give Ofcom powers to impose fines of up to five per cent of a tech firm’s revenues if it fails to protect children from harmful content. Some big tech companies could face fines worth millions of pounds.

Ofcom will be able to serve notices requiring firms by law to remedy failings, and may suspend or restrict services for breaches of the video content rules. Additionally, Ofcom will be able to charge video-sharing platforms fees to finance the regulation.

Home Secretary Priti Patel said: “While the internet can be used to connect people and drive innovation, we know it can also be a hiding place for criminals, including paedophiles, to cause immense harm. It is incumbent on tech firms to balance issues of privacy and technological advances with child protection. That’s why it is right that we have a strong regulator to ensure social media firms fulfil their vital responsibility to vulnerable users.”
