Logic & ethos of recommender systems in European public television: Strategies for preserving diversity

Abstract: 

Introduction

This article aims to analyze and compare the Video on Demand (VOD) recommender system strategies of three European Public Service Media broadcasters (following Hallin and Mancini's media system classification: the BBC for the liberal model, ZDF for the democratic-corporatist model and RTP for the polarized-pluralist model) in order to assess how these broadcasters are facing the challenge of organizing and distributing content on their online platforms. The objective is to determine whether these broadcasters take notions of diversity and universality into account when recommending online content, and to what extent users are able to filter their own content consumption.

Theoretical Framework

Diversity has largely been regarded as a fundamental part of news quality (McQuail, 1992; Strömbäck, 2005). In recent years, audiences around the world have taken up new forms of content consumption following the boom of VOD services (IHS Markit, 2019) and algorithmic automation (Túñez, Toural & Valdiviezo-Abad, 2019), and these algorithms now play a major role in preserving (or not) this diversity.

The use of algorithms to generate automatic recommendations is a method of retrieving and presenting personalized content to the user. This personalization can be explicit, when the user proactively reveals their preferences, or implicit, when the personalization is based on the observation of a user's online behavior (Thurman & Schifferes, 2012). In both cases the purpose is to deliver content that is better adjusted to the user's interests (Zhang, Wang, Yuan & Jin, 2019). These recommendation algorithms and content filters are the basis of search engines such as Google, social networks such as Facebook or Instagram, and Subscription Video on Demand (SVOD) platforms such as Netflix.
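To make the distinction between explicit and implicit personalization concrete, the following minimal sketch blends the two kinds of signal into a single recommendation score. The catalogue, genre labels, function names and weighting are hypothetical illustrations, not the actual system of any of the broadcasters studied here.

```python
# Illustrative sketch only: combining explicit preferences (genres the user
# declares) with implicit signals (observed watch history), in the sense of
# Thurman & Schifferes (2012). All data below is made up for demonstration.

from collections import Counter

CATALOGUE = {
    "doc_climate": {"genre": "documentary"},
    "drama_period": {"genre": "drama"},
    "news_daily": {"genre": "news"},
    "quiz_night": {"genre": "entertainment"},
}

def explicit_score(item, declared_genres):
    """Score based on preferences the user proactively revealed."""
    return 1.0 if CATALOGUE[item]["genre"] in declared_genres else 0.0

def implicit_score(item, watch_history):
    """Score inferred from behavior: share of past views in the item's genre."""
    if not watch_history:
        return 0.0
    genre_counts = Counter(CATALOGUE[v]["genre"] for v in watch_history)
    return genre_counts[CATALOGUE[item]["genre"]] / len(watch_history)

def recommend(declared_genres, watch_history, weight_explicit=0.5, top_n=3):
    """Blend both signals; the weight decides how much the user's own
    stated filtering counts against the observed behavior."""
    scored = {
        item: weight_explicit * explicit_score(item, declared_genres)
        + (1 - weight_explicit) * implicit_score(item, watch_history)
        for item in CATALOGUE
    }
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

# A user who declares an interest in news but mostly watches drama:
print(recommend(declared_genres={"news"}, watch_history=["drama_period", "drama_period"]))
```

In this sketch the `weight_explicit` parameter is the point where the user's own filtering meets the platform's inference: the lower it is set, the more the output is driven by observed behavior rather than declared preferences.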

However, when applied to Public Service Media, these algorithms have raised concerns among scholars for their potential to restrict content diversity (Pasquale, 2015). As algorithms aim to deliver more personalized, interest-narrowed content, they are presumably more likely to produce what is known as a 'filter bubble' (Pariser, 2011), where recommendations converge on whatever is presumed to be the most 'engaging' content, filtering out other content of public service value and driving the user towards an 'echo chamber' (Sunstein, 2009) or 'sphericule' (Gitlin, 1998), in which the user is not sufficiently equipped to act as an informed and rational democratic citizen (Haim, Graefe & Brosius, 2017).
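One common way such narrowing can be quantified, offered here only as an illustration and not as the method of this article, is to measure how evenly a recommendation list is spread across genres, for example with normalised Shannon entropy. The genre labels and lists below are hypothetical.

```python
# Illustrative sketch: normalised Shannon entropy of the genres in a
# recommendation list. Values near 0 suggest the narrowing associated with
# a 'filter bubble'; values near 1 suggest broad exposure.

import math
from collections import Counter

def genre_entropy(recommended_genres):
    """Normalised entropy of the genre distribution of a recommendation list."""
    counts = Counter(recommended_genres)
    total = len(recommended_genres)
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
    max_entropy = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return entropy / max_entropy

narrow_list = ["drama", "drama", "drama", "drama", "comedy"]
broad_list = ["drama", "news", "documentary", "comedy", "children"]

print(genre_entropy(narrow_list))  # lower value: recommendations cluster in one genre
print(genre_entropy(broad_list))   # 1.0: recommendations spread evenly across genres
```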

Methodology

We will start by reviewing the statutes, the official website descriptions, and the annual reports of the previous five years for all three broadcasters, looking for the description of their mission, values and principles and how these have been applied in recent years.

This will be complemented with a qualitative method: semi-structured interviews with the heads of the innovation, content and technological development departments of the aforementioned broadcasters, in order to understand their strategies on content recommendation, the behavior of their recommender systems and the technological policies of each broadcaster.

The broadcasters will be analyzed and compared following previous work on algorithmic design (Diakopoulos, 2019) that includes notions of journalistic values, public service values and interface design.