Pediatrician exposes suicide tips hidden in videos on YouTube and YouTube Kids

(Jenny Kane | The Associated Press) The YouTube app and YouTube Kids app are displayed on an iPhone in New York on Wednesday, April 25, 2018. YouTube is overhauling its kid-focused video app to give parents the option of letting humans, not computer algorithms, select what shows their children can watch. The updates that begin rolling out Thursday are a response to complaints that the YouTube Kids app has repeatedly failed to filter out disturbing content.

Free Hess, a pediatrician and mother, first learned about the chilling videos over the summer, when another mom spotted one on YouTube Kids.

She said that minutes into the clip from a children’s video game, a man appeared on the screen — giving instructions on how to commit suicide.

"I was shocked," Hess said, noting that since then, the scene has been spliced into several more videos from the popular Nintendo game "Splatoon" on YouTube and YouTube Kids, a video app for children. Hess, from Ocala, Florida, has been blogging about the altered videos and working to get them taken down amid an outcry from parents and child health experts, who say such visuals can be damaging to children.

One video on YouTube shows a man popping into the frame. "Remember, kids," he begins, holding what appears to be an imaginary blade to the inside of his arm. "Sideways for attention. Longways for results."

"I think it's extremely dangerous for our kids," Hess said about the clips Sunday in a phone interview with The Washington Post. "I think our kids are facing a whole new world with social media and internet access. It's changing the way they're growing, and it's changing the way they're developing. I think videos like this put them at risk."

A recent YouTube video viewed by The Post appears to include a spliced-in scene showing internet personality Filthy Frank. It's unclear why he was edited into these clips, but his fans have been known to put him in memes and other videos. There is a similar video on his channel filmed in front of a green screen, but the origins and context of the clip in question are not clear.

Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company works to ensure that it is "not used to encourage dangerous behavior and we have strict policies that prohibit videos which promote self-harm."

“We rely on both user flagging and smart detection technology to flag this content for our reviewers,” Faville added. “Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report (transparencyreport.google.com) and give users a dashboard showing the status of videos they’ve flagged to us.”

The videos come amid mounting questions about how YouTube, the world's largest video-sharing platform, monitors and removes problematic content.

YouTube has long struggled with how to keep the platform free from such material — removing hateful and violent videos, banning dangerous pranks and cracking down on child sexual exploitation. As The Post’s Elizabeth Dwoskin reported last month, YouTube announced that it was rebuilding its recommendation algorithm to prevent it from recommending videos that include conspiracy theories and other bogus information, though the videos would remain on the site.

Hess said she has been writing about the distressing video clips on her blog, PediMom, to raise awareness and to get them removed from the platform.

Earlier this month, she found a second one — this time on YouTube. She recorded it, wrote about it and reported the content to the video-sharing platform, she said. The video was taken down.

Another version was reposted Feb. 12, receiving more than 1,000 views before it, too, was removed from the site.

Hess said the doctored "Splatoon" videos are not the only ones pushing dark and potentially dangerous content on social media platforms, particularly on YouTube Kids. In a blog post last week, Hess alerted other parents to numerous concerning videos she said she found on the app — a "Minecraft" video depicting a school shooting, a cartoon centered on human trafficking, one about a child who committed suicide by stabbing and another about a child who attempted suicide by hanging.

Nadine Kaslow, a past president of the American Psychological Association, told The Post that it is a "tragic" situation in which "trolls are targeting kids and encouraging kids to kill themselves."

Kaslow, who teaches at Emory University's school of medicine, said that some children may ignore the grim video content but that others, particularly those who are more vulnerable, may be drawn to it. She said such videos can cause children to have nightmares, trigger bad memories about people close to them who have killed themselves or even encourage them to try it, though some of them may be too young to understand the consequences.

Kaslow said parents should monitor what their children do online and tech companies should ensure such content is removed. Still, she said, it's not enough.

"I don't think you can just take them down," she said about the videos. "For children who have been exposed, they've been exposed. There needs to be messaging — this is why it's not OK."

Though parents should talk to their children about the videos, Kaslow said, YouTube Kids also should address the issue, explaining to children what the videos were and why children should never harm themselves.

She added that there should be "serious consequences" for those who had a hand in the videos, noting that it was "very worrisome" that they were targeting children.

According to the Centers for Disease Control and Prevention, risk factors associated with suicide may include mental disorders such as clinical depression, previous suicide attempts, barriers to accessing mental health treatment, physical illness and feelings of hopelessness or isolation. Those who need help, including children, can call the National Suicide Prevention Lifeline at 1-800-273-TALK.