By: The Hill
July 7, 2021
YouTube users have reported potentially objectionable content in thousands of videos recommended to them by the platform’s algorithm, according to the nonprofit Mozilla Foundation.
The findings, released Wednesday, revealed many instances of YouTube recommending videos that users had marked as “regrettable” — a broad category including misinformation, violence and hate speech.
The 10-month investigation drew on crowdsourced data that the foundation gathered through a browser extension, available for its own Firefox browser as well as for Chrome, that let users report potentially problematic content.
Mozilla collected 3,362 reports submitted by 1,622 unique contributors in 91 countries between July 2020 and June 2021.
The nonprofit then hired 41 researchers from the University of Exeter to review the submissions and determine whether the flagged videos should be on YouTube and, if not, which of the platform’s guidelines they might violate.