Clark Howard

Concerns grow about YouTube content geared toward children

We all know that children of all ages are increasingly comfortable with electronic devices, with many of them able (given the opportunity) to turn them on and access content by themselves. But as parents are learning, there’s a dark side to kids’ familiarity with all things internet.

Recent reports have raised concerns that content creators are altering online videos geared toward children and inserting alarming and inappropriate images into them.

Report: Many inappropriate videos geared toward kids found on YouTube

The videos, uploaded to Google-owned YouTube, use the names of popular children’s titles such as Disney’s “Frozen” and Mickey Mouse, Entertainment One’s “Peppa Pig” and Nickelodeon’s “PAW Patrol.” The disturbing content has racked up millions of views, and people are making big money from it, courtesy of the platform’s liberal monetization policy and rabbit hole-inducing algorithm.

The issue reached a fever pitch in early November when the controversial kiddie videos were highlighted by writer James Bridle in a fiery Medium blog post.

YouTube tried to address the issue a few weeks later, writing that "In recent months, we've noticed a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not. While some of these videos may be suitable for adults, others are completely unacceptable."

Then in December, YouTube announced that it would crack down on "bad actors" who were "exploiting our openness to mislead, manipulate, harass or even harm."

The resulting rules, including changes to who is eligible for monetization, wiped thousands of videos off the service but also crippled the livelihoods of numerous popular YouTubers, while others were penalized for videos deemed in poor taste.

But the larger issue — how children are affected by what they see online — seems to have become an afterthought. Recently, medical professionals have expressed concerns that such YouTube content continues to target children as young as 2 years old, according to CNBC.com.

They fear that continual exposure to such content may damage a child’s development and have lasting negative effects as children grow into adulthood.

“Children who repeatedly experience stressful and/or fearful emotions may underdevelop parts of their brain’s prefrontal cortex and frontal lobe, the parts of the brain responsible for executive functions, like making conscious choices and planning ahead,” Donna Volpitta, Ed.D., founder of the Center for Resilient Leadership, said.

RELATED: How young is too young for social media?

Some critics assert that even on YouTube Kids, the platform’s video service specifically for children, malicious content is plentiful.

Many of the videos start off harmlessly enough but soon descend into decidedly darker themes and imagery. The New York Times reports that one 10-minute clip of "Paw Patrol" shows a nightmarish sequence in which "some characters died and one walked off a roof after being hypnotized by a likeness of a doll possessed by a demon."

Another YouTube channel, which is still online, features Mickey Mouse shooting people, according to the Today Show.

YouTube has promised to put more "real people" into the process of flagging videos, has pulled ads from 2 million clips and has implemented still more rules as recently as last week, but it's clear that more vigilance, with the help of parents, is needed.

Still, the question of what exactly is age-appropriate content and who gets to decide continues to stir debate.

Team Clark recently asked its Facebook fans what they thought was the right age for children to join social media. Should it be younger than 13?

Let us know your thoughts in the comments or on our Facebook page!

RELATED: Google removes 60 apps, many due to porn