On Monday, the UK’s internet regulator, Ofcom, published final guidelines for internet service providers under the Online Safety Act. This starts the clock on the sweeping law’s first compliance deadline, which the regulator expects to take effect in three months’ time.
Ofcom has been under pressure to speed up implementation of the online safety regime after riots in the summer that were widely thought to have been fueled by social media activity. That said, it is only following the process lawmakers set out, which required it to consult on, and parliament to approve, the final compliance measures.
“The Illegal Harms Codes and guidance mark a major milestone, with online providers now legally required to protect their users from illegal harm,” Ofcom wrote in a press release.
“Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of March 16, 2025. Subject to the Codes completing the parliamentary process, from March 17, 2025, providers will need to take the safety measures set out in the Codes, or use other effective measures, to protect users from illegal content and activity.”
“We are ready to take enforcement action if providers do not act promptly to address the risks on their services,” it added.
According to Ofcom, more than 100,000 tech companies could be in scope of the law’s duties to protect users from a range of illegal content types – in relation to the more than 130 “priority offences” set out in the Act, which span areas including terrorism, hate speech, child abuse and exploitation, and fraud and financial crimes.
Failure to comply risks fines of up to 10% of global annual turnover (or up to £18 million, whichever is greater).
In-scope companies range from tech giants to “very small” service providers, with affected sectors including social media, dating, gaming, search, and pornography.
“The duties in the Act apply to providers of services with links to the UK regardless of where in the world they are based. The number of online services in scope of the rules could total more than 100,000, ranging from some of the largest tech companies in the world to very small services,” wrote Ofcom.
The codes and guidance follow a consultation process, with Ofcom drawing on research and stakeholder feedback to shape the rules since the legislation passed parliament and became law in October 2023.
The regulator has set out measures for user-to-user and search services to reduce the risks associated with illegal content, along with guidance on risk assessments, record keeping, and reviews in an official document.
Ofcom has also published a summary covering each chapter of today’s policy statement.
The approach taken by the UK law is the opposite of one-size-fits-all – with, in general, more obligations placed on larger services and platforms, where multiple risks may arise, than on smaller services with fewer risks.
However, smaller, lower-risk services are not exempt from obligations, either. And, indeed, many requirements apply to all services, such as having a content moderation system that allows for the swift takedown of illegal content; having a mechanism for users to submit content complaints; having clear and accessible terms of service; removing the accounts of proscribed organizations; and several others. That said, many of these blanket requirements are features that mainstream services, at least, may already offer.
But it is fair to say that any tech company offering user-to-user or search services in the UK will need, at a minimum, to assess how the law applies to its business, if not revise its operations to address specific areas of regulatory risk.
For large platforms with engagement-driven business models – where the ability to monetize user-generated content is tied to holding people’s attention – more substantial operational changes may be required to avoid falling foul of the law’s duties to protect users from myriad harms.
A key lever for forcing change is that the law introduces criminal liability for senior executives in certain circumstances, meaning tech CEOs could be held personally accountable for some types of noncompliance.
Speaking to BBC Radio 4’s Today program on Monday morning, Ofcom CEO Melanie Dawes said 2025 will see a big change in the way the major platforms operate.
“What we are announcing today is a big moment for online safety, because in three months’ time, the tech companies will need to start taking proper action,” she said. “What will they need to change? They have to change how their algorithms work. They have to test them so that illegal content like terror and hate, intimate image abuse, and a lot more, doesn’t show up on our feeds.”
“And then if things slip through the net, they will have to take them down. And for children, we want their accounts to be set to private, so they cannot be contacted by strangers,” she added.
That said, Monday’s policy statement is just the start of Ofcom implementing the law, with the regulator still working on further measures and duties, including what Dawes described as “wider protections for children,” which she said would be introduced in the new year.
So the more substantive child-safety changes to platforms that parents have been clamoring for may not filter through until later in the year.
“In January, we will come forward with our requirements on age checks so that we know where children are,” said Dawes. “Then in April, we will finalize the wider protections for children, and that will cover pornography, suicide and self-harm material, violent content and so on, not being fed to kids in the way that has become so normal but is really harmful today.”
Ofcom’s policy statement also notes that further measures may be required to keep pace with technological developments, such as the rise of generative AI; the regulator says it will continue to review risks and may evolve its requirements on service providers accordingly.
The regulator is also planning “crisis response protocols for emergency events” such as last summer’s riots; proposals to block the accounts of those who share CSAM (child sexual abuse material); and guidance on using AI to tackle illegal harms.