Why was a gruesome YouTube video of a decapitated head left online for hours?

This story includes graphic descriptions some readers may find disturbing
Wednesday, January 31, 2024
MIDDLETOWN TWP., Pennsylvania -- A disturbing video of a man holding what he claimed was his father's decapitated head circulated for hours on YouTube. It was viewed more than 5,000 times before it was taken down.

Justin Mohn, 32, from Levittown, Pennsylvania, has been charged with first-degree murder and abuse of a corpse after his father, Michael Mohn, was found dead inside their family home Tuesday night.

Mohn is accused of not only beheading his father but also bragging about it online.

Police said the more than 14-minute-long YouTube video, titled "Mohn's Militia - Call to Arms for American Patriots," showed Justin Mohn picking up his father's decapitated head and identifying him by name. Police said it appeared Mohn was reading from a script as he railed about the government.

Mohn referred to himself as a militia leader and called his father a traitor to the country because he had worked as a federal employee for 20 years.

In the video, he also spoke about President Joe Biden and threatened multiple federal agencies.

The incident is one of countless examples of gruesome and often horrifying content that circulates on social media with no filter. Last week, AI-generated pornographic images of Taylor Swift were viewed millions of times on X - and similar videos are increasingly appearing online featuring underage and non-consenting women. Some people have live-streamed murders on Facebook.

The horrifying decapitation video was published hours before major tech CEOs were set to head to Capitol Hill for a hearing on child safety and social media. Sundar Pichai, the CEO of YouTube parent Alphabet, is not among those chief executives.

In a statement, YouTube said: "YouTube has strict policies prohibiting graphic violence and violent extremism. The video was removed for violating our graphic violence policy and Justin Mohn's channel was terminated in line with our violent extremism policies. Our teams are closely tracking to remove any re-uploads of the video."

But online platforms are having difficulty keeping up. And they aren't doing themselves any favors by relying on algorithms and outsourced teams to moderate content, rather than on employees who can develop better strategies for tackling the problem.

In 2022, X eliminated teams focused on security, public policy and human rights issues after Elon Musk took over. Early last year, Twitch, a livestreaming platform owned by Amazon, laid off some employees focused on responsible AI and other trust and safety work, according to former employees and public social media posts. Microsoft cut a key team focused on ethical AI product development. And Facebook-parent Meta cut staff working in non-technical roles as part of its latest round of layoffs.

Critics often point to the social media platforms' lack of investment in safety when disturbing videos and posts filled with misinformation remain online for too long - and spread to other platforms.

"Platforms like YouTube haven't invested nearly enough in their trust and safety teams - compared, for instance, to what they've invested in ad sales - so that these videos far too often take far too long to come down," said Josh Golin, the executive director of Fair Play for Kids, which works to protect kids online.

But that's only part of the issue, he said. The algorithms that power these platforms promote videos that get a lot of attention in the form of shares and likes - which compounds the problem for videos like these.

"Even when tech companies have practices in place to label violent content, they aren't able to moderate and remove them fast enough, and the unfortunate reality is that kids and teens still see them before they are taken down," said James Steyer, founder and CEO of Common Sense Media.

Steyer added that the volume of videos needing moderation overwhelms YouTube and other platforms - whether for lack of ability or of will. He noted that traumatizing images can have a lasting effect on children's mental health and well-being.

But until recently, tech companies have had little incentive to rethink their investments in content moderation. Despite promises from lawmakers and regulators, Big Tech has largely been left alone - even as consumer advocates say social media puts young users at risk of everything from depression to bullying to sexual abuse.

And when tech companies have acted to rein in harmful content on their platforms, they've found it difficult to keep up - and their reputations haven't improved; quite the opposite.

Facing a grilling before Congress on Wednesday, tech companies are expected to tout tools and policies intended to protect children and give parents more control over their kids' online experiences. But parents and online safety advocacy groups say many of those tools don't go far enough because they largely leave the job of protecting teens to parents - and, in some cases, to the young users themselves. Advocates say tech platforms can no longer be left to self-regulate.

Mohn was eventually taken into custody at Fort Indiantown Gap in Lebanon County, about 100 miles from the crime scene. While it is unclear why he was in that area, Fort Indiantown Gap is home to a National Guard training base that its website calls "America's busiest National Guard Training Center."

Mohn, who also was arrested on a weapons possession charge, was arraigned early Wednesday and held without bail. He is scheduled for a hearing on Feb. 8.

A motive for the killing is still under investigation.

WPVI contributed to this post.

(The-CNN-Wire™ & © 2023 Cable News Network, Inc., a Time Warner Company. All rights reserved.)