
YouTube hit by suicide video criticism

YouTube has come under renewed pressure to better police content after a prominent vlogger apologised for uploading a video that appeared to show the body of someone who had taken their own life.


Logan Paul, whose YouTube channel has more than 15 million subscribers, had originally published a video on December 31 of him visiting Aokigahara Forest in Japan, a notorious suicide location, with an “intent to focus on the haunted aspect” of the forest. In the video, however, Paul and his group appear to come across a dead body.

The clip, which cuts back to the body several times and features Paul talking about how suicide and mental illness are not jokes, was reportedly viewed millions of times before being deleted following widespread criticism. Paul apologised on January 2 in a post on Twitter to his 3.9m followers, saying he “didn’t do it for views.”

“I do this sh*t every day. I’ve made a 15 minute TV show every single day for the past 460+ days,” he wrote. “One may understand that it’s easy to get caught up in the moment without fully weighing the possible ramifications.”

However, many people have questioned the online platform’s policing of its content, with fellow YouTube stars calling for a response.

Philip DeFranco, creator of the YouTube series The Philip DeFranco Show, said that before “the extended community outrage” over the video, there was a “seemingly uncontested 550-600,000 likes on it.”

“Unless YouTube does something, this doesn’t hurt him,” he said.

YouTube and Snapchat video maker Kandee Johnson questioned how the video was even allowed on the platform.

She wrote on Twitter: “Dear @YouTube, after the Logan Paul video where he shows a dead body of a suicide victim, uses that for the title, makes heartless jokes next to the body, there needs to be age restrictions for certain creators… His followers are children! Horrifying.”

Similarly, actor and comedian Ed Petrie, who spent 10 years working in children’s TV at Nickelodeon and CBBC, called for better protection of children from harmful content and demanded regulation of the site.

“Seeing the thought and care that goes into providing content for young people, there needs to be some serious questions asked about how YouTube functions,” he wrote on Twitter.

“A good start? On their YouTube kids app they should vet every video before it’s posted, instead of taking things down in retrospect after they get complaints. It will cost them more money. But they make a lot of money. And with that comes responsibility.

“Can you imagine the outcry if CBBC gave presenters a platform to show kids footage of suicide victims? But YouTube let this stuff go out and seem to think business can carry on as usual. Why are they not subject to the same rules as the rest of us?”

In December, YouTube CEO Susan Wojcicki wrote a public statement about “abuse of our platform,” including its YouTube Kids app, and said she aimed to bring the total number of people working to address content that might violate Google’s policies to more than 10,000 in 2018.

The company’s action preceded numerous sector-wide comments at the Children’s Global Media Summit, including from BBC director general Tony Hall, who warned about the dangers of “children exposed daily to unfiltered content that is harmful and inappropriate” and of “internet platform providers taking no responsibility for what appears on their platforms.”

In a statement YouTube said: “Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated.

“We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Centre.”
