Research Projects

Influencers are often argued to connect with their audience through a host of media tactics, such as personal vlogs. But how do audiences reciprocate and (re)construct their own side of this relationship? In this project, executed during the DMI Winterschool 2022, we reveal how audiences engage in a parasocial relationship with "the most paranoid man in America."

For many people, social media has replaced traditional media. On the day of the 2020 U.S. presidential election, political commentators on YouTube competed with television media in their coverage of the events. But how did audiences experience these livestreams? This DMI 2020 project reveals how audiences create subcultural rituals using emotes and memes on YouTube and Twitch to experience live political events.

Are media radicalizing us? Over the past half-decade, this question has increasingly come to preoccupy public debate. This web project offers a cross-disciplinary and public-facing dialogue on the question, with non-linear and interactive presentations of experts' views alongside related essays, video art, and documentary footage.

An investigation for the Dutch government into the extent to which disinformation and so-called junk news resonated in online political arenas on social media before and during the 2019 Provincial Council and European Parliament elections in the Netherlands...

Over the last few years, the creation of ‘filter bubbles’ and the issue of algorithmic radicalization have sparked significant controversy within academia and privacy-focused collectives. In this literature, YouTube especially has come under scrutiny. In 2019, Ribeiro et al. sought to computationally verify the ‘anecdotal literature’ on the ‘radicalizing power’ of YouTube’s Recommendation Systems (RSs), arguing that there is indeed empirical evidence that users navigate along YouTube’s recommendations. Contrary to this research, at the end of 2019, Ledwich and Zaitsev claimed that the hypothesis of YouTube as a platform for algorithmic radicalization is actually false. Instead, they highlighted how YouTube recommends more mainstream content to its users. In other words, blaming the algorithm for creating niche viewing behavior might be incorrect...

How are several popular conspiracy theories (involving China, 5G, Bill Gates, QAnon, flat earth, and the deep state) spreading across social media? Using the concept of a “conspiracy tribe” as a heuristic for gauging how online communities produce and consume conspiratorial content in a variety of ever-shifting social media constellations, we wanted to know whether social media platforms saw an increase in engagement with these theories and how this engagement changed over time, in terms of the user base and types of conspiratorial content...

In 2018, YouTube became known as one of the main platforms where audiences engage with the controversial content and discussions uploaded and shared by followers of the so-called Intellectual Dark Web. As Joe Rogan -- one of the leading figures in the field of lengthy contrarian discussions -- explains, “people are starving for controversial opinions." YouTube seems to have become a market for ideas considered too controversial to be discussed by the mainstream press and academic establishment. One topic that, according to influencers on YouTube, deserved more 'intellectual' engagement was that of 'race realism', otherwise known as 'scientific racism'...