MANILA, Philippines – As several countries around the world propose measures banning minors from social media, following Australia’s lead, the Philippine Senate on Wednesday, February 11, held a committee hearing on five related bills looking to regulate or ban children’s social media access.
A keyword frequently used was “age-appropriate,” referring largely to a more nuanced regulation of minors’ social media use rather than an outright ban.
University of the Philippines president Angelo Jimenez, a lawyer, called for a “calibrated age-appropriate framework” that is not only in alignment with free expression rights but also confronts the reality that childhood development is “not monolithic.”
“A 12-year-old child is very different from a 16-year-old,” he said, arguing that a “blanket regulation may restrict older adolescents’ legitimate use of digital platforms for education, civic participation, and creative enterprise.”
“A calibrated, age-appropriate framework may better reflect developmental realities.”
Four of the five Senate bills propose a ban, each with a different age threshold.
Senate Bill 40 (filed by Ping Lacson) is the most restrictive, proposing a ban all the way to 18 years old, followed by SB 1735 (Imee Marcos) at 16, SB 595 (Erwin Tulfo) at 13, and SB 185 (Camille Villar) at 12.
Jimenez argued for a framework with rules that adjust to a minor’s specific age, given how quickly children’s development and learning capacities change during these years.
“Children are evolving rights-holders capable of responsible participation. Our Constitution reinforces this approach,” Jimenez said.
The lawyer also highlighted that along with regulation, key elements are needed to nurture kids in the digital era: “Digital literacy, critical thinking, parental engagement, media education, mental health awareness are indispensable complements.”
The Council for the Welfare of Children’s Bryan Santamaria said that age-appropriate regulation “ensures we allow more mature children to use social media” while the Child Rights Network’s Angelica Reyes noted that a ban might “disincentivize” social media platforms from putting in the effort to make their platforms safer.
Representatives from the National Privacy Commission (NPC) and the Department of Education (DepEd) support an “age-appropriate” and a “learner-supportive” framework as well, recognizing the “evolving capacities of the child.”
Annalyn Capulong, assistant professor at UP Diliman’s Department of Psychology, explained that “parental mediation” remains the most important aspect of fostering children in the digital age.
“Sentro ‘yung ambag ng magulang (The parents’ contribution is central),” Capulong said. But she acknowledged that parents are also overwhelmed because this is quite a new challenge for them, and the reality is that sometimes, parents do use gadgets and screen devices to keep a child busy while they attend to another immediate task.
Capulong, whose dissertation focused on the impact of digital media on child development, recommended that if screen time cannot be avoided, it is important to choose the right kind of content for the child. She said to avoid content that’s very passive in nature such as unboxing videos of toys or low quality content, and to guide them towards educational content.
Still, she said, screen time for kids six years old and under must be limited to a maximum of two hours.
The parent needs to show to the child that there is a physical world, and a digital world, Capulong said.
NPC’s Analyn Taguiling stressed the importance of data privacy as several bills carry a provision for age verification processes to be conducted by the social media platform or a third-party provider.
“We strongly advocate data minimization and privacy technologies so that only data that is necessary and proportionate will be collected, and intrusive methods such as biometrics and ID verification should be treated as measures of last resort and should be allowed only upon a clear showing of strict necessity, demonstrable effectiveness, and the implementation of security, retention limits, and accountability controls,” Taguiling said.
Parents who lost children to social media–related harms hold a vigil ahead of a social media addiction trial set to begin next week, in Los Angeles, California, US, February 5, 2026. REUTERS/Jill Connelly
Large social media platforms currently face a shake-up, not just from proposed bans worldwide but from a landmark trial in the US that specifically alleges that the platforms Facebook, TikTok, Snap, YouTube, and Instagram have design features that are addictive and cause harm.
In opening statements at the trial, the plaintiffs’ side called the platforms “addiction machines.”
Internal documents and expert reports have shown that the platforms were aware of these harms but did not institute sufficient protections or warnings for users.
The European Commission’s preliminary findings on TikTok have also said that the platform is in breach of the bloc’s Digital Services Act for its addictive features, including infinite scroll, autoplay, push notifications, and the “highly personalized recommender system” powered by the app’s algorithm.
These “urge” the user to stay on the system, while mitigation measures are said to be ineffective. “TikTok needs to change the basic design of its service,” the Commission said.
The Commission also noted that the features may be harmful not just to kids but also to “vulnerable adults.”
Bills filed in the Philippine Senate have also pointed out a similar set of “addictive” design features or features that lead to “compulsive behavior.”
Camille Villar’s SB 185 defines a social media platform as a platform that employs “algorithms that analyze user data or information on users to select content for users” and has “addictive features,” including “infinite scroll,” “continuously loading content,” or “the use of pages with no visible or apparent end or page breaks.”
Senator Robin Padilla’s SB 601 aims to order social media platforms to “limit features that increase, sustain or extend the use of the SMP by the child such as automatic playing of media, rewards system for time spent on the platform, notifications, and other features that result in compulsive usage.”
A screenshot of a portion of Senator Robin Padilla’s Senate Bill 601 showing the provision on the limitation of features that result in ‘compulsive usage.’
It also mentioned regulating the algorithm and targeted ads systems of the platforms: “Control personalized recommendation systems and targeted advertising systems to ensure that recommendations or advertisements that may be accessed by a child are age- and developmentally-appropriate,” the bill said.
This acknowledgment in local legislation that social media features can be addictive will hopefully lead to further inquiries into regulating these platforms to protect not just minors but the rest of the adult population as well.
The majority of the other participants in the hearing, including the Cybercrime Investigation and Coordination Center, the Department of Justice, the Philippine Psychological Association, and the platform owners Meta, Google, and TikTok, were supportive of the so-called age-appropriate framework. – Rappler.com


