But Many Parents Agree: YouTube Kids Isn’t Really Responsible for What Your Kids See
We’ve been told that screen time is bad for our kids, but sometimes even the best parents give in.
Enter the YouTube Kids app, billed as a child-friendly app packed full of age-appropriate videos, channels and playlists.
A coalition of consumer and child advocacy groups produced this montage of not-so-age-appropriate videos allegedly found on the YouTube Kids app. Some of the videos that made it through YouTube’s child-friendly filter include cartoons with language banned by the FCC and videos about drugs, alcohol and even suicide.
They ask, “Is YouTube Kids A Safe Place for Young Children to Explore?” They’re calling on the FTC to take action against YouTube for deceptive and misleading claims about age-appropriate content. Just last month the same group filed a complaint about the app for excessive advertising to children in violation of FTC regulations.
But when CBS San Francisco ConsumerWatch Reporter Julie Watts showed the video to parents, they all had the same reaction.
YouTube Kids said in a statement, “We work to make the videos … as family-friendly as possible.” The company also points out that parents can flag inappropriate videos to be removed and use parental controls that disable the search function.
YouTube says it’s constantly improving its filter feature but admits it’s not 100 percent accurate yet. It also says it immediately removed the inappropriate videos flagged by the consumer groups and monitors flagged content 24/7.
Google is taking more heat from consumer groups over its new YouTube Kids app, which they claim is full of disturbing videos meant for mature audiences despite being intended for toddlers.
Google launched YouTube Kids, its kid-friendly app, in February to much fanfare, but consumer and child advocacy groups are now filing a complaint against the app with the Federal Trade Commission, claiming it illegally targets kids with ads.
So what do you think?
Is it YouTube’s job to filter what your kids see? Should it be held accountable for content that “slips through” its algorithm?
Tell us on Facebook @NewsMomDotCom… and like us while you’re there.