Parents Want More Rigorous Age-Verification Protocols to Protect Children From Harmful Online Content
Who’s Accountable for Children’s Safety Online?
New survey insights show that a strong majority of parents believe content providers, such as TikTok and Roblox, should shoulder significant responsibility and that legislative action is essential to ensure these platforms act.
In today’s world, where connected technology is pervasive and often invasive, online child safety protections are an urgent priority, or should be. American parents are particularly concerned about popular apps like TikTok, Instagram, and Fortnite, expressing strong dissatisfaction with current age controls and strong support for more rigorous age-verification protocols from content providers.
While technology can expand learning opportunities for children and puts a wealth of information at their fingertips, the online world can be a dangerous place for them. The ill effects of inappropriate content on children are well documented; they include increased anxiety and depression, loss of analytical and cognitive skills, and addictive behavior. There are also countless examples of children being able to access harmful content online. A recent research report described one of the most popular online children’s platforms as an “X-rated pedophile hellscape.”
The most pressing question in the current debate about protecting minors from harm online concerns the role content providers should play: Should responsibility rest with platforms like Instagram, TikTok, or Roblox, where the content is hosted, or with the app stores run by device manufacturers, such as Apple’s App Store or Google Play, where the apps are downloaded?
To better understand the challenges, AudienceNet conducted a national poll of more than 2,000 American parents of children aged 6 to 17. The message is clear: American parents want action.
The survey results reveal that 97% of respondents support robust age-verification systems for online content, with 75% calling them “very important.” Eighty-seven percent of respondents want regulations to safeguard children’s mental health and shield them from inappropriate content, and 69% say stopping minors’ access to adult-only content is a top priority.
Eighty-nine percent of parents believe that social media, gaming, dating, and adult-only platforms should be doing more to prevent access to inappropriate adult-only content. More tellingly, 88% want legislation that mandates age checks on such sites to protect minors from exposure to that content.
Gaming platforms are another area of concern. A large majority of parents (77%) worry about video games’ impact on minors, and two-thirds see legislation as “very important” to combat the problem.
Parents widely acknowledged their own responsibility, ranking themselves first among the groups responsible. But they also affirmed that they need help and more accountability from content providers; 93% pointed to content providers as critical players in the search for solutions. Meanwhile, only 7% assigned primary responsibility to device manufacturers and a scant 3% to schools.
When it comes to enforcement, a majority disagree that the platforms can effectively police themselves and support action by Congress to establish stronger age-verification measures. Overall, 87% say they support legislative protections against online predators and business practices harmful to children within these digital environments. Furthermore, 86% want an 18+ minimum age requirement for social media, underscoring the belief that these platforms are not adequately safeguarding children.
But parents also have a nuanced understanding of what is a complicated issue and see the need to balance tougher age restrictions with protecting digital privacy. Privacy concerns among parents remain quite high: 78% are wary about their own privacy, and 88% feel uneasy about their children’s digital privacy. As lawmakers craft policies to address age verification and online safety, they must meet these demands without compromising privacy.
This survey highlights an urgent issue. As content providers and device manufacturers debate where responsibility lies, parents are sending a clear signal: they need effective, privacy-conscious solutions, and they see content providers as best positioned to provide them.
As a new Congress convenes in Washington and state legislatures around the country consider this topic, there is one issue that everyone should get behind. Protecting children from harmful content online is no longer optional; the providers of that content must do more.