Google, which owns YouTube, has been accused of giving unusual prominence to YouTube videos featuring extreme content, conspiracy theories, and misinformation on its search results page, according to an investigation by Sky News.
The investigation discovered a number of search terms resulted in content pushing viewers towards false claims appearing in the box marked “videos”, which is usually featured near the top of the first page among the top results.
For example, when Sky News searched the phrase “5G birds” – a conspiracy theory that the upgraded mobile networks will kill birds – the top results showed an article debunking the theory.
However, directly below that, the Videos box showed three YouTube videos supporting the claim, the first titled “5G is killing birds What is it doing to us”, from a New Zealand-based YouTube channel with only 3,096 subscribers.
This was not an isolated incident: other search terms, including “Maddy McCann” – a term trending after the release of a Netflix documentary – and “Yellow vests”, returned videos featuring conspiracy theories that have been repeatedly debunked.
Since Sky News alerted Google to the search results, the Videos box has no longer appeared for these search terms or other related terms.
A spokesman for the platform told Sky News that deliberate misinformation was “a major concern” and that it was tackling the problem by cutting off some sites’ revenue and prioritising authoritative sources – particularly around topics such as chemtrails and vaccines. It was neither ethical nor in the platform’s financial interest to push harmful content, the spokesman added.
In response to Sky News’ findings, Will Moy, director of the independent fact-checking charity Full Fact, said: “I’m concerned that when we look for information we ought to get the stuff that actually helps us make up our minds. It’s not so great that they then amplify that to everyone who is casually searching. That seems to be a risk that they haven’t fully understood.”
A former YouTube software engineer, Guillaume Chaslot, who now runs a non-profit called Algo Transparency, describes the platform as a “radicalisation engine”. Chaslot, who helped build the platform’s search algorithm, said: “The algorithm for the last 10 years has been pushing people down rabbit holes.”
“That is most efficient for watch times,” he said. “Whether it’s a terrorist rabbit hole or one that will make you believe crazy conspiracy theories or make you watch crap all day, it doesn’t matter.”
YouTube has dismissed these claims, saying it no longer uses that algorithm.