Everybody enjoys YouTube.
In fact, the newest generational cohort, Generation Z, has been described by sociologists as a generation shaped by YouTube, rather than by Google (as their predecessors, the Millennials, were). From vlogs to instructional videos, gaming let's plays, and the relatively new YouTube Red, there is something for everybody on the video platform.
Recently, however, some rather disturbing phenomena on the platform have come to public attention. Strange video content is nothing new; the wave of violent cartoons masquerading as kid-friendly entertainment made that clear.
Andrew Torba, CEO and Founder of Gab.ai, an alternative social networking website that has taken the internet by storm, recently posted a screenshot of his YouTube suggestions. As he typed, “How to hav-” into YouTube’s search engine, the following results are what popped into the suggestion box:
Disturbingly, the top suggestion YouTube offers to anyone typing “How to hav-” is “How to have sex with your kids.” While this is clearly a symptom of a pedophilia problem that needs to be addressed, why is YouTube permitting such search terms to be saved and recommended by its AI?
The dreaded “adpocalypse” on YouTube is a much-discussed concern among those who rely on the platform for ad revenue. Because content YouTube has deemed “inappropriate” or “controversial” has been censored and demonetized, public dissatisfaction with the video platform has been vocal since 2016.
Yet while politically incorrect and offensive videos are being actively targeted by Google's censors, search suggestions containing pedophilia-related terms are being tolerated on the platform.
Like my content? Consider buying me a cup of coffee! Squawker Media is a grassroots media outlet comprised of independent journalists, and we are in desperate need of support to keep this cause going. I appreciate all of my readers, as I would be nothing without your loyalty and support.