Social media executives from Meta, Snap, YouTube, TikTok and X are being summoned to Downing Street on Thursday for a crucial meeting with Prime Minister Sir Keir Starmer and Technology Secretary Liz Kendall over online safety for children. The tech bosses will be questioned about the measures they are taking to safeguard young people and address parental concerns, as the government continues to review whether to impose a complete ban on social media for under-16s, in line with Australia’s approach. Sir Keir has emphasised that the meeting will focus on ensuring “social media companies accept and demonstrate responsibility”, warning that “the consequences of failing to act are stark” and that the government has a duty to parents and the next generation to put children’s safety first.
The Number 10 Face-off
Thursday’s meeting marks a pivotal moment in the government’s drive to hold tech giants to account for their part in protecting vulnerable young users. The gathering comes at a crucial juncture, with Parliament having rejected calls for a complete ban on social media for those under 16 just hours earlier, despite support from the House of Lords. Instead of implementing a broad prohibition, MPs voted to grant ministers authority to set their own restrictions, signalling the government’s preference for a more tailored regulatory approach rather than a sweeping legislative ban.
The timing of the Downing Street summit demonstrates the administration’s resolve to appear firm on digital safety whilst managing complex political and commercial pressures. Professor Gina Neff of the University of Cambridge’s Minderoo Centre for Technology and Democracy said the meeting enables the government to show it is taking the initiative on digital harms. Downing Street has previously acknowledged that some services have made progress, implementing steps such as disabling autoplay for children by default and giving parents greater control over device usage, though critics maintain substantially more must be done.
- Tech executives grilled over safeguarding measures and responses to parental concerns
- Ministers weighing a ban on social media for under-16s, modelled on Australia’s approach
- MPs rejected full ban but gave ministers authority to introduce restrictions
- Some services already implemented protections like disabling autoplay for children
Parliamentary Rejection and the Broader Debate
Wednesday evening’s Commons vote dealt a significant blow to campaigners advocating a complete ban on social media for those under 16, marking the second occasion on which MPs have dismissed such proposals despite strong support from the House of Lords. The government’s decision to prioritise ministerial discretion over legislative action reflects a more cautious strategy, with officials contending that an outright ban would be premature given continuing policy discussions. This approach gives the government room for manoeuvre in crafting bespoke restrictions rather than a sweeping ban that some fear could be hard to enforce and oversee across multiple platforms.
The rejection has intensified debate about whether the UK is adequately protecting its young people from online threats. Whilst the government maintains that giving ministers authority to implement bespoke rules represents a more sensible solution, critics contend this approach lacks the decisive action the situation requires. Recent research from Australia, where a social media ban for those under 16 was introduced in December 2025, reveals that approximately 60 per cent of young users continue to access platforms regardless, raising serious doubts about the effectiveness of legislative bans and suggesting the challenge stretches well beyond simple prohibition.
Bipartisan Criticism
The parliamentary vote has attracted sharp criticism from the opposition benches. Conservative shadow education secretary Laura Trott accused Labour MPs of failing parents and children by rejecting the ban, arguing that other nations are recognising social media’s harms whilst the UK falls behind under the current government. Liberal Democrat education spokeswoman Munira Wilson echoed these concerns, asserting that “the time for incremental steps is over” and calling for immediate measures to restrict the most destructive platforms for young users rather than gradual policy tweaks.
Australia’s Cautionary Example
Australia’s experience with social media restrictions offers a cautionary case study for policymakers considering a comparable approach in the UK. When the country introduced its ban on social media for under-16s in December 2025, it was celebrated as a landmark step in safeguarding young users from online harms. However, emerging research from the Molly Rose Foundation has revealed a concerning reality: more than 60 per cent of young Australians continue to use social media platforms in spite of the legal ban. This substantial rate of non-compliance suggests that legislative bans alone may be inadequate to stop determined young users from accessing the services they want.
The Australian findings carry significant implications for the UK’s ongoing policy deliberations. If a similar ban were implemented in Britain, the evidence suggests enforcement would present substantial challenges, with young people likely finding ways to circumvent age-verification systems and restrictions by various technical means. The data undermines the argument that a straightforward legal ban offers a quick fix for online safety concerns, instead highlighting the need for a more comprehensive approach combining regulatory frameworks, platform responsibility, parental oversight tools and digital literacy training to tackle the risks young people face online.
| Key Finding | Implication |
|---|---|
| Over 60% of underage Australians still access social media despite ban | Legislative prohibitions alone cannot effectively prevent determined young users from accessing platforms |
| Ban introduced in December 2025 has failed to achieve widespread compliance | Enforcement mechanisms remain weak and young people find workarounds to restrictions |
| Blanket bans do not address underlying appeal of social media to young people | Multi-faceted approach combining regulation, platform accountability, and education is necessary |
Experts Push for Substantive Measures
Child safety advocates and online protection specialists have intensified calls for tech companies to take meaningful action beyond self-regulation. The Molly Rose Foundation, set up in memory of 14-year-old Molly Russell, who took her own life after viewing harmful material online, has been especially outspoken in calling for structural reform. Rather than sweeping prohibitions that prove hard to police, campaigners argue the priority should be holding platforms accountable for the systems that push dangerous material to at-risk users.
Andy Burrows, head of the Molly Rose Foundation, has emphasised that Thursday’s Downing Street meeting represents a critical moment for government action. The charity has repeatedly maintained that social media companies have the technical capability to implement robust safeguards, yet frequently prioritise engagement metrics over user wellbeing. Experts stress that real safeguarding requires platforms to redesign their recommendation systems, improve content moderation, and give parents meaningful tools to monitor their children’s online activity effectively.
The Algorithm Problem
At the centre of these concerns lie the algorithmic systems that control what content young users see. These algorithms are designed to maximise engagement, often pushing sensational, harmful or addictive content to at-risk groups. Reforming these systems is one of the most critical issues in online safety, requiring transparency from platforms about how their recommendation algorithms operate and what safeguards exist.
- Algorithms favour engagement over user safety and wellbeing
- Platforms need to improve transparency about algorithmic recommendation processes
- External reviews of harm caused by algorithms are essential for maintaining accountability
What’s Coming Next
Thursday’s summit at Downing Street will set the tone for the government’s position on online child safety in the months ahead. Following the meeting, Sir Keir Starmer and Liz Kendall are expected to set out their conclusions and decide whether existing voluntary commitments from tech companies suffice or whether stronger legislative action is needed. The government remains in the midst of its consultation on whether to introduce an Australia-style ban on social media for under-16s, with the outcome of this week’s discussions likely to shape the final policy direction.
Ministers have expressed a preference for granting themselves powers to impose restrictions rather than enacting an outright ban, citing concerns about enforceability and impact. However, mounting criticism from opposition parties, child protection advocates and parents suggests the government will face sustained pressure for more decisive action. The coming weeks will prove crucial in determining whether tech companies can demonstrate genuine commitment to keeping young users safe or whether Parliament will legislate to force compliance with more stringent safety standards.