
The inevitability of misclassification in media recommender systems

Tuesday, 20 April 2021, 12:00 to 13:00, Online (Zoom)

Speaker: Prof. Dr. Christian Stöcker, Hamburg University of Applied Sciences


Title: The inevitability of misclassification in media recommender systems


Abstract: The abundance of user-generated and user-submitted media content on modern social media platforms makes algorithmic curation of content more or less inevitable. The precise principles governing these recommender systems are usually opaque, but the goals behind the optimization process are transparent: factors like engagement, dwell time, or watch time are optimized for because these measures translate into monetizable usage time and thus directly affect the platforms' financial bottom line. There are numerous real-world examples of collateral damage caused by these optimization mechanisms. In simulations, it is possible to demonstrate that the suggestion and promotion of inappropriate content is an inevitable outcome of the interplay of user behavior, content sources, and algorithmic sorting of media content.
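The simulations mentioned in the abstract are not spelled out here, so the following is only a minimal Python sketch of the mechanism being described, not the speaker's actual model. The pool size, the 5% policy-violation rate, the 95% classifier accuracy, and the engagement distributions are illustrative assumptions rather than figures from the talk or the papers listed below.

```python
# Toy illustration (assumed parameters): an engagement-optimizing ranker
# combined with an imperfect content classifier still surfaces some
# inappropriate items, because misclassified violating items tend to
# carry high engagement.
import random

random.seed(42)

N_ITEMS = 10_000             # size of the candidate pool (assumption)
CLASSIFIER_ACCURACY = 0.95   # probability the policy classifier labels an item correctly (assumption)
TOP_K = 100                  # number of items surfaced to users (assumption)

# Build a content pool: 5% of items violate policy, and violating items
# are assumed to draw higher raw engagement on average.
items = []
for i in range(N_ITEMS):
    inappropriate = random.random() < 0.05
    engagement = random.gauss(1.5 if inappropriate else 1.0, 0.3)
    # The classifier errs with probability 1 - CLASSIFIER_ACCURACY.
    flagged = inappropriate if random.random() < CLASSIFIER_ACCURACY else not inappropriate
    items.append({"id": i, "bad": inappropriate, "engagement": engagement, "flagged": flagged})

# Recommender: drop flagged items, then rank the rest purely by engagement.
eligible = [it for it in items if not it["flagged"]]
recommended = sorted(eligible, key=lambda it: it["engagement"], reverse=True)[:TOP_K]

slipped_through = sum(it["bad"] for it in recommended)
print(f"Inappropriate items in the top {TOP_K}: {slipped_through}")
```

Under these assumptions, the script typically places a handful of policy-violating items in the top 100 even though they make up well under one percent of the filtered pool: the engagement objective over-represents exactly the items the classifier missed.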


Short Bio: Christian Stöcker is Professor of Digital Communication at Hamburg University of Applied Sciences, where he is in charge of a new master's program for aspiring journalists and communicators. In addition, he is a columnist at Spiegel Online (one of the most widely read German-language news websites), where he previously served as head of the internet department from 2011 to 2016. Christian Stöcker has published several books on the impact of digitization on society and is one of the few German journalists who had access to Edward Snowden's archive of intelligence documents; he co-authored several investigative stories about the NSA, GCHQ, and other intelligence services. He has advised the German Bundestag, the state parliament of Thuringia, and the Bundestag's Enquete Commission on artificial intelligence and society on questions of digital publics and the effects of technological developments on society and public discourse.


Selected Publications


  • Stöcker, C. (2020). How Facebook and Google Accidentally Created a Perfect Ecosystem for Targeted Disinformation. In C. Grimme, M. Preuss, F. W. Takes, & A. Waldherr (Eds.), Disinformation in Open Online Media (Vol. 12021, pp. 129–149). Springer International Publishing. https://doi.org/10.1007/978-3-030-39627-5_11
  • Stöcker, C., & Preuss, M. (2020). Riding the Wave of Misclassification: How We End up with Extreme YouTube Content. In G. Meiselwitz (Ed.), Social Computing and Social Media. Design, Ethics, User Behavior, and Social Network Analysis (Vol. 12194, pp. 359–375). Springer International Publishing. https://doi.org/10.1007/978-3-030-49570-1_25
  • Stöcker, C. (2019). Bedeutung von Emotionen in den Sozialen Medien, Emotionalisierung durch Soziale Medien: Emotion bringt Reichweite? In A. Besand, B. Overwien, & P. Zorn (Eds.), Politische Bildung mit Gefühl (Vol. 10299). Bundeszentrale für politische Bildung.
  • Stöcker, C., & Lischka, K. (2018). Wie algorithmische Prozesse Öffentlichkeit strukturieren. In R. Mohabbat Kar, B. E. Thapa, & P. Parycek (Eds.), (Un)berechenbar? Algorithmen und Automatisierung in Staat und Gesellschaft (p. 28).
  • Lischka, K., & Stöcker, C. (2018). The Digital Public. Discussion Paper Ethics of Algorithms. https://doi.org/10.11586/2017049
  • Stöcker, C., & Lischka, K. (2018). Digital public: Looking at what algorithms actually do. The Conversation. http://theconversation.com/digital-public-looking-at-what-algorithms-act...