
YouTube promotes pedophilia, auto-completes search for “how to have” with suggestion of sexual relations with children


What’s passing as news these days includes a recent scandal over at YouTube that’s turning the world of social media and the blogosphere completely upside down. Apparently, when some people typed the words “how to have” into YouTube’s search box, the first auto-complete suggestion recommended by the online video service was a phrase promoting both incest and pedophilia.

The exact words that some people say they’re seeing pop up in the YouTube search box after the aforementioned phrase are “s*x with your children,” a sickeningly evil suggestion that has many people asking the question: Are you serious, YouTube? As reports have circulated across the internet about this major offense, several news outlets have picked up the story as well.

BuzzFeed reporter Charlie Warzel, for instance, reportedly attempted the search while logged out of his personal account and, sure enough, it pulled up the vile suggestion. He took to Twitter with a screenshot of what he says he saw, stating:

“Seeing this screenshot float around so I just did it myself in incognito mode. This i (sic) a VERY troubling YouTube search autocomplete, no?”

Warzel made it clear that, should one actually click the auto-completed phrase, YouTube does not pull up the gross content in question. In a follow-up tweet he said:

“should note that if you click any of those you do not get any pedophile vids (that i could see but still…what’s going on?”

Independent journalist Mike Cernovich reportedly had a similar experience, which he also tweeted about with his own screenshot – as did many others after trying the search themselves. As was to be expected, the resounding consensus was one of absolute disgust and confusion as to why YouTube would suggest such a vile thing.

Internet sleuths say disturbing auto-complete suggestions are the result of coordinated attacks by online ‘trolls’

Warzel was apparently so disgusted with the whole situation that he personally contacted YouTube directly about the issue. A spokesperson from YouTube reportedly offered up its own perplexed response, indicating that the matter is still under investigation.

“Earlier today our teams were alerted to this awful autocomplete result and we worked to quickly remove it,” the company stated. “We are investigating this matter to determine what was behind the appearance of this autocompletion.”

In the absence of an official answer from YouTube, many amateur online investigators have come to the conclusion that this must be some kind of large-scale, coordinated effort by internet “trolls” to alter the auto-complete suggestions that show up on YouTube.

Google, the parent company of YouTube, has been in this situation before. The multinational corporation sustained its own barrage of criticism for controversial search results that prompted it to publish a special blog post back in June 2016 explaining how auto-complete algorithms are supposed to work.

In no way is the auto-complete algorithm ever supposed to complete a search for a person’s name with terms that the company describes as “offensive or disparaging.” Further, the auto-complete feature populates suggestions based on what a lot of people are currently searching, as well as more recent search terms – which gives credence to the coordinated attack theory.

“Autocomplete isn’t an exact science, and the output of the prediction algorithms changes frequently,” Google explains in its blog post about the topic. “Predictions are produced based on a number of factors including the popularity and freshness of search terms.”
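To see why “popularity and freshness” would make a coordinated search burst effective, consider a rough sketch of that kind of ranking. This is a purely hypothetical illustration – not Google’s or YouTube’s actual algorithm, whose details are not public – in which each candidate completion is scored by its total search count weighted by an exponential decay on how recently it was searched:

```python
import math
import time

# Hypothetical illustration only -- not Google's or YouTube's real
# autocomplete algorithm. It scores candidate completions by combining
# how often a query has been searched ("popularity") with how recently
# it was searched ("freshness"), so a burst of coordinated recent
# searches can outrank an older, far more popular phrase.

def score(search_count, last_searched, now, half_life=86400.0):
    """Popularity weighted by an exponential freshness decay (seconds)."""
    age = now - last_searched            # seconds since the last search
    freshness = math.exp(-age / half_life)
    return search_count * freshness

def autocomplete(prefix, query_log, now=None, top_n=3):
    """Return the top-scoring logged queries that start with `prefix`."""
    now = now if now is not None else time.time()
    candidates = [
        (score(count, last, now), query)
        for query, (count, last) in query_log.items()
        if query.startswith(prefix)
    ]
    return [query for _, query in sorted(candidates, reverse=True)[:top_n]]

# Example log: query -> (total searches, unix time of most recent search)
now = 1_000_000.0
log = {
    "how to have a garage sale": (50_000, now - 30 * 86400),  # popular, old
    "how to have fun at home":   (20_000, now - 7 * 86400),
    "how to have a yard sale":   (500,    now - 3600),        # recent burst
}
print(autocomplete("how to have", log, now=now))
# The small but very recent "yard sale" burst ranks first despite
# having a fraction of the search volume of the older queries.
```

Under this kind of scoring, a relatively small group repeatedly searching an obscure phrase over a short window could plausibly push it to the top of the suggestion list – which is exactly the mechanism the coordinated-troll theory assumes.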

According to Warzel, a source familiar with YouTube’s algorithms told BuzzFeed News that the unusual asterisk present in the “s*x” portion of the inappropriate auto-complete phrase further suggests that the anomaly is a result of a sick group of people with perhaps an even sicker social agenda.

Sources for this article include:

TheGatewayPundit.com

NaturalNews.com

Blog.Google
