TheirTube lets you see how conspiracy theorists fall down the YouTube rabbit hole – how the platform can show the same points of view over and over again, confirming and amplifying an existing bias

TheirTube is a demonstration of this concept: it shows how recommendation algorithms ("what to watch next") tend to show you more of what you want to see, and how recommendations can be tailored to personality types similar to yours. Of course it's not only YouTube; Facebook has regularly been highlighted for the way its home feed algorithms function in a similar manner.

Over the years, YouTube has increasingly been criticized for its controversial recommendation engine. That algorithm determines which other videos to recommend to users based on what they watch on the site. Critics have highlighted how this can lead YouTube's recommendation engine to promote videos with ever more extreme viewpoints, pushing users down a "rabbit hole" where these kinds of videos are all they see on the platform.
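The feedback loop described above can be illustrated with a toy sketch. This is not YouTube's actual algorithm (which is proprietary and far more complex); it is a deliberately simplified simulation in which a recommender always suggests the unseen video closest to the user's viewing history, showing how the range of viewpoints a user sees can narrow over time. All names and numbers here are made up for illustration.

```python
# Toy illustration (NOT YouTube's real system): a recommender that keeps
# suggesting whatever is closest to the user's viewing history.
import random

random.seed(42)

# Each "video" is reduced to a single viewpoint score in [-1.0, 1.0],
# where the extremes represent opposing extreme positions.
catalog = [random.uniform(-1, 1) for _ in range(1000)]

history = [0.6]  # the user starts by watching one mildly slanted video

for step in range(20):
    # The user's "taste" is the average viewpoint of everything watched so far.
    taste = sum(history) / len(history)
    # Recommend the unseen video whose viewpoint is closest to that taste.
    pick = min((v for v in catalog if v not in history),
               key=lambda v: abs(v - taste))
    history.append(pick)

# After 20 recommendations, everything watched sits in a narrow band
# around the original video's viewpoint: the rabbit hole in miniature.
spread = max(history) - min(history)
print(f"viewpoint range watched: {spread:.3f}")
```

The design choice to recommend by similarity alone is what produces the narrowing: nothing in the loop ever pushes the user toward a contrasting viewpoint, so each recommendation reinforces the running average.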

This can be a pitfall of "Google research" too, where you get what you search for, so be mindful of how you phrase a search term. It demonstrates the challenges the Internet poses with regard to the quality, context and bias of information.

See This website lets you see how conspiracy theorists fall down the YouTube rabbit hole

#technology


Do you know any conspiracists, fruitarians, or climate deniers?