**Social Media’s Last Chance to Combat Illegal Content**
In a significant development for online safety, social media platforms have been granted a “last chance” to tackle illegal postings on their services. Under the Online Safety Act (OSA), all online platforms must assess the risk that their users will encounter illegal material, with the assessments due by March 16, 2025. Firms that fail to comply face fines of up to 10% of their global turnover.
The UK’s communications regulator, Ofcom, has published its final codes of practice setting out how firms should manage and respond to illegal online content. The codes are intended to ensure that platforms tackle harmful content responsibly, particularly where it affects vulnerable groups such as children and adolescents. Speaking to BBC News, Ofcom’s chief executive, Dame Melanie Dawes, described this as a crucial juncture at which the industry must make real changes to how it operates.
Dame Melanie issued a clear directive: social media platforms must act immediately to put their procedures in order. She warned that if companies fail to comply, public calls for tougher measures, including outright bans on minors using social media, will grow more insistent. Non-compliance would not only mean fines; it could also prompt further restrictions on younger users’ access to social media.
Despite the ambitious scope of the OSA, it has drawn criticism from advocacy groups who argue that it does not adequately address the full range of dangers children face online. Andy Burrows of the Molly Rose Foundation expressed disappointment that the guidelines contain few specific, actionable measures on online suicide and self-harm content, arguing that only robust regulation can manage illegal material effectively.
The codes issued by Ofcom require platforms to identify how illegal content could appear on their services and to establish mechanisms that prevent this material from reaching users. This includes child sexual abuse material (CSAM), coercive behavior, extreme sexual violence, and material that promotes self-harm or suicide. Ofcom’s approach demands not just compliance but proactive engagement from technology firms.
In addition to assessing risks and implementing preventative measures, Ofcom’s codes impose safety features aimed explicitly at protecting children. For instance, platforms must refrain from suggesting children’s accounts to other users and must warn minors about the risks of sharing personal information. The use of hash-matching technology to detect CSAM has also become a critical requirement, extending even to smaller file hosting and storage services. In practice, hash matching reduces each uploaded image or video to a unique digital fingerprint, which is then checked against databases of fingerprints derived from known illegal content.
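To illustrate the principle, here is a minimal sketch of hash matching in Python. The hash set and file path are hypothetical, and the example uses a plain cryptographic digest for simplicity; real detection systems typically rely on perceptual hashes (such as PhotoDNA) so that resized or slightly edited copies of a known image can still be matched.

```python
import hashlib
from pathlib import Path

# Hypothetical set of SHA-256 digests of known illegal files, standing in
# for the hash databases maintained by child-protection organisations.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_known_content(path: Path) -> bool:
    """Return True if the file's digest appears in the known-hash set."""
    return sha256_of_file(path) in KNOWN_HASHES


if __name__ == "__main__":
    upload = Path("upload.jpg")  # hypothetical path to a newly uploaded file
    if upload.exists() and matches_known_content(upload):
        print("Blocked: upload matches known illegal content.")
```

A key design point is that only the fingerprints, not the illegal material itself, need to be stored and compared, which is part of why the technique can be required even of smaller file hosting and storage services.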
As the discourse around online safety evolves, many tech companies have already introduced safety measures aimed at adolescent users. Platforms such as Facebook, Instagram, and Snapchat have added features that prevent users under 18 from being found in searches by unknown accounts. Instagram has also begun blocking certain screenshots in direct messages to combat the rise in sextortion, a tactic that poses a severe threat to young people.
While there is optimism that the OSA can create a safer online environment, significant concerns persist about the sheer range of services the regulations cover. Advocacy groups have also raised privacy concerns about age verification processes, and families of young people harmed by online content have criticized Ofcom for the slow pace at which the OSA is being brought into force, arguing that protections are urgently needed.
The illegal content codes must still be approved by the UK Parliament before they can be fully enforced, with enforcement due to begin on March 17. Nevertheless, platforms are urged to act preemptively, since the codes are widely expected to be ratified without significant objection. As the deadline approaches, the onus is on social media companies to show that their measures effectively curtail the availability of illegal material on their platforms. The call to action is clear: enact meaningful change, or face the consequences.