Doctor Reveals Dangerous Content In YouTube Kids Videos

Florida Mom Discovers Suicide Instructions in YouTube Videos For Children

A U.S. pediatrician is raising the alarm about instructions on how to slit one's wrists inserted into YouTube videos aimed at children, showing that inappropriate content continues to slip through the streaming site's filters. The pediatrician, Dr. Free Hess of Ocala, Fla., has been blogging about the altered videos and working to get them taken down amid an outcry from parents and child health experts, who say such visuals can be damaging to children.

"I don't doubt that social media and things such as this are contributing," she later told CNN.

In the parody video, a second character appears in time to stop the attempted suicide, with the narrator stating: "Why couldn't he just let me hang myself?"

"I think our kids are facing a whole new world with social media and internet access". She said she found videos glorifying not only suicide but sexual exploitation and abuse, human trafficking, gun violence and domestic violence.

Ms Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company works to ensure that it is "not used to encourage risky behaviour and we have strict policies that prohibit videos which promote self-harm."

Although she reported the video, YouTube did not act immediately, but the clip was eventually deleted from the video-streaming site. "Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed," the company said.

The man featured is YouTuber Filthy Frank, who has over 6.2 million subscribers and calls himself "the embodiment of everything a person should not be", although there is no evidence that Frank, whose real name is George Miller, was involved in creating the doctored video.

"We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video", the statement said.

Nadine Kaslow, a psychologist who teaches at Emory University's School of Medicine, said that some children may ignore the grim video content but that others, particularly those who are more vulnerable, may be drawn to it.

"I believe this cartoon was up for about six months prior to it coming down", she told the station.

Hess then told a number of groups about the videos. "We are making constant improvements to our systems and recognize there's more work to do," YouTube said.

"There is this disconnect between what kids know about technology and what their parents know because the parents didn't grow up with it", she said. Vulnerable children, perhaps too young to understand suicide, may develop nightmares or try harming themselves out of curiosity, she warned.

"We need to fix this", she said, "and we all need to fix this together".

If you have thoughts of suicide, confidential help is available for free at the National Suicide Prevention Lifeline.
