The Federal Government has begun consultations on plans to introduce age restrictions for social media use, seeking public input on how to balance child safety with the benefits of digital connectivity. The Ministry of Communications, Innovation and Digital Economy launched a public poll targeting parents, teachers, young people, and digital experts to help shape an evidence-based policy framework.
Minister Bosun Tijani stated that while the internet offers significant opportunities for learning, creativity, and communication, it also exposes children to risks including cyberbullying, harmful content, online exploitation, misuse of personal data, and emerging challenges linked to artificial intelligence tools. “As Nigeria evaluates potential policy approaches for protection of children online, including age restrictions, improved age verification systems, platform accountability measures, and enhanced regulatory oversight, public input is essential,” he said.
The move comes amid steady growth in internet usage across Nigeria. Data from the Nigeria Data Protection Commission shows that more than 40 million Nigerians spend an average of six hours daily on social media platforms. Recent industry figures from the Nigerian Communications Commission show that active telephone subscriptions rose to about 182 million in January 2026, internet subscriptions on narrowband networks reached 151.5 million, and broadband penetration climbed to 53 percent, representing approximately 115 million users.
If implemented, the proposed restrictions would place Nigeria among countries such as France, Denmark, and Australia that have introduced measures to strengthen child safety on platforms like TikTok, Instagram, and YouTube.
Global Context: Australia and UK Lead on Social Media Regulation
Nigeria’s consideration of age restrictions aligns with a growing international movement toward tighter regulation of social media platforms. Australia made global headlines in late 2024 when it passed legislation banning children under 16 from accessing social media platforms, imposing fines of up to A$49.5 million on companies that fail to comply. The Australian law, with enforcement beginning in December 2025, requires platforms to take “reasonable steps” to prevent age-restricted users from holding accounts, sparking debate about privacy implications and enforcement mechanisms.
The United Kingdom has taken a different approach through its Online Safety Act, which imposes a duty of care on platforms to protect children from harmful content. The UK legislation requires age verification for adult content and gives regulators the power to fine companies up to £18 million or 10 percent of global annual turnover, whichever is greater, for non-compliance. British authorities have focused on holding platforms accountable for the algorithmic amplification of harmful material rather than imposing blanket age bans.
For Nigeria, these international precedents offer both models and warnings. Australia’s approach demonstrates that outright bans are technically possible but face implementation challenges, while the UK model shows the complexity of regulating platform behaviour. As Nigeria develops its framework, policymakers must weigh the effectiveness of different approaches against local realities, including limited digital literacy, enforcement capacity, and the potential economic impact on a growing digital ecosystem where social media platforms serve as crucial channels for business, education, and civic engagement.