Social Media Ban in UK: Britain is tightening its stance on social media platforms. The country’s media and data privacy regulators have asked big tech companies to take stronger steps to keep children off their platforms, saying many companies are not properly enforcing their own minimum age requirements. The government is also considering barring children under 16 from social media, a step similar to the ban adopted in Australia.
How algorithms affect children
Regulators say they are particularly concerned about social media algorithms, which can expose children to content that is harmful or addictive.
Melanie Dawes, head of Britain’s communications regulator Ofcom, said these platforms are household names, yet children’s safety is not prioritized in their design. She warned that if the companies do not improve quickly, they could face regulatory action.
Strict instructions given to companies
Under Britain’s online safety law, several companies have been told to strengthen their rules, including Meta’s Facebook and Instagram as well as TikTok, Snapchat, YouTube and Roblox. These companies must explain by April 30 how they will strengthen age verification, limit contact between children and people they do not know, and make the content shown on their platforms safer. They have also been asked to stop testing new features on minors.
Call for age verification with modern technology
Britain’s data privacy regulator, the Information Commissioner’s Office, has also sent an open letter to these platforms. It says companies should use modern technology to verify ages accurately so that children under 13 cannot access services that are not designed for them. The ICO’s Paul Arnold said that with so many technological options now available to verify age, companies no longer have any excuse.
What do the companies say?
A Meta spokesperson said the company already uses an AI-based system that estimates users’ ages and applies safe account settings for teens. Meta also argues that age verification should happen at the app store level so that families do not have to share personal information repeatedly.
Google-owned YouTube said the platform already offers age-appropriate experiences and that regulators should focus on services that are not following the rules.
Heavy fines possible for breaking the rules
Under British rules, companies that fail to meet child-safety requirements can face heavy fines: Ofcom can fine companies up to 10 percent of their global revenue, while the Information Commissioner’s Office can impose penalties of up to 4 percent of annual global revenue.