YouTube's algorithm recommends right-wing, extremist videos to users — even if they haven't interacted with that content before — a recent study found.
If Nielsen stats are to be believed, we collectively spend more time in front of YouTube than any other streaming service—including Disney+ and Netflix. That's a lot of watch hours, especially for an ...
If you've ever opened YouTube and seen a talking AI SpongeBob, a looping slime video, or something that feels engineered to melt your brain just enough to keep you scrolling, congrats. According to a ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
YouTube Shorts, the short-form platform from Google-owned video giant YouTube, has seen massive success since its launch in September 2020. Today, an estimated 1% of all waking human hours are spent ...
The researchers, the New York Times reports, find that the same dynamics that reward extremism also apply to sexual content on YouTube: a user who watches erotic videos might be recommended videos of ...
A new study conducted by the Computational Social Science Lab (CSSLab) at the University of Pennsylvania sheds light on a pressing question: Does YouTube's algorithm radicalize young Americans?
Over the years, the YouTube suggestion algorithm has become pretty complex. I’ve noticed that it can extrapolate my tastes very well based on my watch history, continuously tempting me to consume more ...
You may think you’re too smart to fall for a conspiracy theory. Your social media is dedicated to cat videos, Trader Joe’s hauls and Saturday Night Live sketches. You think you’re safe in this ...