New Parental Controls Could Change How Kids Use WhatsApp

WhatsApp is introducing a new set of parental control features designed to give families greater oversight of how children and pre-teens use the messaging platform, as concerns grow globally over online safety and exposure to unknown contacts.

The update, rolled out by Meta, marks a significant shift in how younger users interact with one of the world’s most widely used communication apps, which now serves over three billion users worldwide.

Parent-managed accounts for younger users

At the centre of the update is the introduction of “parent-managed accounts,” a new system that allows parents or guardians to link a child’s WhatsApp account to their own device and manage key safety settings. The system gives children, particularly those under 13, a controlled environment in which to use the app with restricted features.

Under the new system, parents can control who is allowed to contact their child, approve or block incoming message requests and manage which groups the child can join. Requests from unknown contacts are placed in a protected folder that requires parental approval.

“These accounts come with strict new default settings, parental controls and options for parents to guide their pre-teens,” the company said.

The accounts are limited to messaging and calling only, with features such as channels, status updates, AI tools and certain disappearing message options disabled to reduce exposure to potential risks.

Privacy maintained despite controls

Despite the additional oversight, WhatsApp says user privacy will remain intact. All conversations will continue to be protected by end-to-end encryption, meaning even parents cannot read the actual content of messages.

Instead, the system focuses on supervision without intrusion. Parents can receive alerts about activity such as new contact requests, group invitations or profile changes, while still respecting the child’s private conversations.

To activate the feature, both the parent’s and the child’s devices must be present, with accounts linked through a secure verification process. Settings are protected by a PIN to prevent unauthorised changes.

Growing pressure for safer digital spaces

The move comes amid increasing global scrutiny over the impact of social media and messaging platforms on children’s mental health and safety.

Governments and regulators in several countries have been pushing for stricter safeguards, including age restrictions and enhanced parental controls, to address risks such as cyberbullying, online predators and exposure to harmful content.

Meta has already introduced similar safety features across its platforms, including Instagram and Facebook, as part of a broader strategy to improve trust and comply with evolving regulations.

Industry analysts say WhatsApp’s latest update reflects a wider shift toward “family-first” design in technology, where platforms are increasingly expected to build safety tools directly into their core experience rather than relying on third-party controls.

For parents, the changes offer more visibility and control. For WhatsApp, they represent an effort to balance privacy with protection in an environment where digital communication is becoming part of childhood earlier than ever.

As the rollout continues globally, the effectiveness of these tools will likely depend on how well they are adopted and whether they can keep pace with the rapidly evolving online risks faced by younger users.
