This paper analyses cultural factors within tech companies that limit the participation of older people in the digital society. The algorithms that run behind the screens of digital systems are not neutral. They are social constructs, engineered by humans, and hence they embody rules, ideals, imaginations, perceptions and cultures (Klinger & Svensson, 2018). Algorithms respond to corporate interests (Zuboff, 2019) as well as corporate cultures (Kunda, 2006; Svensson, 2020), and, directly or indirectly, they also reflect the personal ideals (Levy, 2010) of those involved in coding. Notably, tech culture is quite homogeneous in terms of age, ethnicity and gender: it is young and predominantly populated by men of Caucasian or Asian origin (Wachter-Boettcher, 2017). This homogeneity is associated with the structural discrimination embedded in digital technologies (Faulkner, 2001; Wajcman, 2009), which reinforce sexism and racism, as in the racial biases of facial recognition systems (Buolamwini & Gebru, 2018) and the gender biases of image search algorithms (Kay, Matuszek, & Munson, 2015).
Discriminatory practices concerning age, however, are less studied (Rosales & Fernández-Ardèvol, 2019). In this paper, we analyse unstructured interviews with 18 programmers at tech companies in Germany, India, Israel, Spain and the USA. The interviews revolved around how tech workers understand old age in relation to working in tech, as well as around their past, present and future programming trajectories. We also asked how ideas of age shape the products they develop. Our study shows that widely accepted ageist ideas in tech companies tend to deprioritise, disregard or exclude older people, who are discriminated against as developers, as test users and as target users. Thus, by not taking the habits, innovations and interests of older people into account, media technologies tend to reinforce ageism, leaving older people with the challenge of agentively appropriating ICTs.
References
Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Fairness, Accountability and Transparency, 81, 1–15.
Faulkner, W. (2001). The technology question in feminism: A view from feminist technology studies. Women's Studies International Forum, 24(1), 79–95.
Kay, M., Matuszek, C., & Munson, S. A. (2015). Unequal representation and gender stereotypes in image search results for occupations. In Proceedings of the ACM Conference on Human Factors in Computing Systems (pp. 3819–3828). ACM.
Klinger, Ulrike & Svensson, Jakob (2018) “The End of Media Logics? On Algorithms and Agency”. New Media & Society. Vol. 20. Issue 12.
Kunda, G. (2006). Engineering culture: Control and commitment in a high-tech corporation. Temple University Press.
Levy, S. (2010). Geek power: Steven Levy revisits tech titans, hackers, idealists. Wired. Retrieved February 13, 2019, from https://www.wired.com/2010/04/ff-hackers/
Rosales, A., & Fernández-Ardèvol, M. (2019). Structural ageism in big data approaches. Nordicom Review, 40(s1), 51–64.
Svensson, J. (2020). Wizards of the web: A journey into tech culture and mathematics. Göteborg: Nordicom.
Wachter-Boettcher, S. (2017). Technically wrong: Sexist apps, biased algorithms, and other threats of toxic tech. W. W. Norton & Company.
Wajcman, J. (2009). Feminist theories of technology. Cambridge Journal of Economics, 34(1), 143–152.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.