# Thomas Barth
A theoretical problem in recent sociology concerns how differentiated social units are integrated in digital environments. This abstract argues that by reconceptualising Luhmann's "generalised symbolic media" in connection with Foucault's theory of panopticism, a new theory of integration can be posited, one that exposes the importance of regulating platform capitalism. The Data Strategy of the German government (January 2021) expressed the will to follow the Digital Markets Act (DMA), one of the centrepieces of the European digital strategy. According to the DMA, some large online platforms act as "gatekeepers" in digital markets. The DMA aims to ensure that these platforms behave fairly online, which seemingly requires some theoretical effort.
Shoshana Zuboff states that new capitalist products are traded in a new kind of marketplace "that I call behavioural futures markets". The new surveillance capitalism treats human experience as free raw material for translation into behavioural data, which are declared a proprietary behavioural surplus, fed into AI-based processes, "and fabricated into prediction products that anticipate what you will do now, soon, and later."
Luhmann's generalised symbolic media need to be complemented by a new concept of panoptic surveillance of the individual, one that is connected with the targeting of messages to that individual. I would like to call this new generalised symbolic medium "ACCESS" (access to the privacy of an individual in order to manipulate that individual's access to messages). The Facebook/Cambridge Analytica privacy scandal is a well-known example of this new formation of digital discourse mechanisms:
In March 2018, media broke news of Cambridge Analytica's business practices. The personal data of up to 87 million Facebook users were acquired via the 270,000 Facebook users who used a Facebook app. The company claimed to use data enhancement and audience segmentation techniques, providing "psychographic analysis" for a "deeper knowledge of the target audience" based on the Big Five model of personality for behavioural microtargeting. Services could then be targeted individually for the benefit of clients from the political arena, governments, and companies. The 2016 Trump campaign was provided with such individualised political messages to American voters. In January 2020, a release of more than 100,000 documents showed how Cambridge Analytica operated in 68 countries as a global infrastructure designed to manipulate voters on an industrial scale.
The business model of Cambridge Analytica may be seen as a dispositif (Foucault) or as a (social) discourse practice. Platform capitalism functions as discourse in Foucault's sense of the term: it constitutes subjects outside the immediacy of consciousness. The system-theoretical approach of ACCESS may offer a better way to explain what is done to the target subjects of these digital discourse practices. The fashioning of "digital selves" within a digital panopticism is probably harming us. Social media lead us to create a strategic social self between Goffman's front stage and back stage. This social self is the main target of the new generalised symbolic medium "ACCESS".
Foucault, M.: Überwachen und Strafen, Frankfurt/M. 1981
Luhmann, N.: Soziale Systeme, Frankfurt/M. 1987
Poster, M.: The Second Media Age, Cambridge 1995
Zuboff, S.: Das Zeitalter des Überwachungskapitalismus (The Age of Surveillance Capitalism), Frankfurt/M. 2019
Cadwalladr, C.: Fresh Cambridge Analytica leak 'shows global manipulation is out of control', The Guardian, 4 Jan. 2020, https://www.theguardian.com/uk-news/2020/jan/04/cambridge-analytica-data-leak-global-election-manipulation
Submitted as a proposal on 15 February 2021 to:
CfP: First annual conference of the Platform Governance Research Network
The event will be held in English.
The discussions around platform governance can be traced back to long-standing debates on the legal, social, and material structures that constitute the Internet’s ordering. For over 20 years, scholars from multiple fields have sought to decipher this sprawling web of power struggles. However, the consolidation of a few digital platforms as central global spaces of interaction and consumption has re-oriented many of these endeavours, making them more specific but not less complex. How platforms create, enforce, and enact rules and technologies that affect billions of people around the world — and the ways in which different actors seek to affect those structures — is now a major focus of public and governmental attention. As a scholarly area, platform governance can be understood as a part of a longer-term project to explore the logics behind, and the consequences of, the “private mediation between Internet content and the humans who provide and access this content” (DeNardis, 2012).
Work on this topic is now increasingly featured at various disciplinary conferences across communication, public policy, computer science, human-computer interaction, law and technology, and science and technology studies, as well as at interdisciplinary conferences like AoIR, FAccT, or GIGANET. However, there is still no single venue that tries to bring these broader communities together into a more focused conversation, looking more specifically at the multifaceted and increasingly complex role that online intermediaries play in today's platformized societies (Van Dijck et al., 2018).
Please contact email@example.com with any questions.