Radical bubbles on YouTube? Revisiting algorithmic extremism with personalised recommendations
M. Ledwich, Anna Zaitsev, A. Laukemper
Published 2022 in First Monday
ABSTRACT
Radicalisation via algorithmic recommendations on social media is an ongoing concern. Our prior study, Ledwich and Zaitsev (2020), investigated the flow of recommendations presented to anonymous control users with no prior watch history. This study extends that work on the behaviour of the YouTube recommendation algorithm by introducing personalised recommendations via personas: bots with content preferences and watch histories. We extended our prior dataset to several thousand YouTube channels, identified and classified with a machine learning algorithm. Each persona was first shown content corresponding to its preference, and each was then shown a common set of YouTube content. The study reveals that YouTube generates moderate filter bubbles for most personas. However, the filter bubble effect is weak for personas who engaged with niche content, such as Conspiracy and QAnon channels. Surprisingly, all political personas, excluding the mainstream media persona, are recommended fewer videos from the mainstream media content category than an anonymous viewer with no personalisation. The study also shows that personalisation has a larger influence on the home page than on the Up Next recommendations feed.
PUBLICATION RECORD
- Publication year
2022
- Venue
First Monday
- Publication date
2022-12-13
- Fields of study
Computer Science, Political Science
- Source metadata
Semantic Scholar
REFERENCES
- 25 references
CITED BY
- 14 citing papers