YouTube users have reported potentially objectionable content in thousands of videos recommended to them by the platform's algorithm, according to the nonprofit Mozilla Foundation. The findings, ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
YouTube's algorithm recommends right-wing, extremist videos to users, even if they haven't interacted with that content before, a recent study found.
A new study conducted by the Computational Social Science Lab (CSSLab) at the University of Pennsylvania sheds light on a pressing question: Does YouTube's algorithm radicalize young Americans?