Nathan'ette Burdine's The Nyle Magazine




An article in Wired is why YouTube
issued a new policy addressing the
problem of child predators on its site
by Nathan'ette Burdine: March 13, 2019

Eight days before YouTube announced its new policy to remove predatory content targeting children, KG Orphanides wrote an article in Wired titled “On YouTube, a network of pedophiles is hiding in plain sight.”

The one thing Orphanides wants folks to keep in mind is that YouTube search code makes it fairly easy for pedophiles to find and share videos of the children they want to target.

You read it right. YouTube’s own system makes it easy for pedophiles to find videos of children to target with their sick comments.

Just in case you continue to doubt the words I’ve written, go on and read what Orphanides wrote, “Its algorithm specifically suggests videos that are seemingly popular with other pedophiles, most of which have hundreds of thousands of views and dozens of disturbing comments.”

All a pedophile has to do is type "young girls" in the search box, and up pop several suggestions of young girls doing everything from dancing and eating candy to swimming and doing yoga.

Now, I know that us normals see it as normal for a child to dance, eat candy, and do whatever else it is that children like to do nowadays. But a pedophile, unh-unh, those nasty good-for-nothing devils aren’t normal. Those things see triple-X porno when they see a child engaging in normal child activities like dancing or swimming. Those nasty devils look at a video of a child just simply being a child, and those good-for-nothings immediately begin looking for “pocket book” and “booty” shots to take, time stamp, and then share with their fellow perverts.

Mind you, this nasty mess has been going on at YouTube’s site for quite some time now.

Less than two years ago, back in 2017, The Times of London labeled YouTube a “sweet shop” for sickos because YouTube’s algorithm made it easy for them to search for and then share videos of young girls.

Fast forward to 2019, less than two years later, and the company still has a pedophile problem, one that is the result of its own search code making it easy for pedophiles to seek out and target children.

Therefore, asking YouTube to change its search code is like asking a zebra to change its stripes. It just ain’t gonna happen. The search code is central to the bottom line that keeps YouTube going.

All of that data YouTube obtains from users’ searches is information YouTube sells to advertisers.

Advertisers use information about the number of visitors a site has, and those visitors’ behavior while on the site, to determine how much ad space to buy.

That’s why YouTube got up and decided to do something by issuing that new policy. It is worried that more advertisers will join the likes of Grammarly and Peloton in reevaluating their relationships with the social media site.

A spokesperson for Grammarly told Orphanides that the company is “absolutely horrified” by the thought of its products being placed alongside such vile content and that it “would never knowingly associate ourselves with channels like this.”

A spokesperson for Peloton said the company is going to find out why its “ads were being displayed against such videos.” YouTube looked at those comments and was like, “We better do something before we lose this money.”

The something that YouTube did was to issue a new policy: “Over the past weeks, we disabled comments from tens of millions of videos that could be subject to predatory behavior. Over the next few months, we will be broadening this action to suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behavior.”

It sounds good. But hey, it’s the same ol’ song with different words. The sad truth is that in another one to two years, YouTube will be going right back around this bin again.

There will be another report about how YouTube’s algorithm is child predator friendly. The public shame will make some advertisers leave. YouTube will feel bad about the entire situation and address the issue in a blog post. But as soon as the light on the problem dims down, YouTube will go back to doing what it does best which is focusing on its bottom line.
