“I Get on Better with Anglo People”: Racial Discrimination in Guest Acceptance Decision-Making Within London’s AirBnB Host Community

Abstract: 

Research on racial discrimination within the “sharing” economy is still emerging. While large-scale quantitative data have been employed to identify differences in acceptance rates based on racial categorisation, little qualitative research has been conducted with hosts to understand how bias enters their decision-making processes on these platforms. In this study I utilise data generated through in-depth interviews with 18 AirBnB hosts in London between 2017 and 2018.

Although the issue of race was never raised by the interviewer, three hosts directly discussed guest race or nationality as a factor informing their decision-making practices; in these cases, hosts indicated that they would reject Asian guests. Hosts based this decision-making on vague racist or xenophobic generalisations. For example, one host said, ‘I had one from Singapore; I would never have anybody from Singapore again.’ In these instances, a single negative experience with the ‘other’ can produce a generalisation that leads to ongoing discrimination.

This study aims to understand the rationales hosts employ in guest acceptance decision-making, including those of hosts who do not openly discriminate, and through this to develop an understanding of how racial bias might propagate both directly and indirectly on AirBnB. It is important to examine, for instance, how other factors that hosts use when deciding whether to accept a guest, such as the guest’s proficiency in English, might allow for indirect discrimination.

This paper offers some of the first direct qualitative evidence of racial discrimination in the ‘sharing’ economy and shows how digital platforms have the potential to deepen, rather than soften, offline racial divides.