There is a new open access journal in the making, and I am glad to be part of its editorial board. Please check out the homepage of Memory, Mind and Media at Cambridge University Press. While its official launch date is 2022, first online articles will be published by mid-2021. The journal is edited by Andrew Hoskins (University of Glasgow, UK) and Amanda J. Barnier (Macquarie University, Australia).
Memory, Mind & Media (MMM) explores the impact of media and technology on individual, social and cultural remembering and forgetting. This agenda-setting journal fosters high-quality, interdisciplinary conversations combining cognitive, social and cultural approaches to the study of memory and forgetting in the digital era. The pervasiveness, complexity and immediacy of digital media, communication networks and archives are transforming what memory is and what memory does, changing the relationship between memory in the head and memory in the wild.
MMM offers a new home for a wide variety of scholars working on these questions, within and across disciplines, from history, philosophy, media studies, cultural studies, law, literature, anthropology, political science, sociology, neuroscience, psychology, cognitive and computational science and elsewhere.
The journal gives priority to submissions that are cross-disciplinary and/or interdisciplinary, experimental, and agenda-setting, and that push the boundaries of existing knowledge and methods. The journal insists on jargon-free, plain-English submissions to ensure a widely accessible forum for cutting-edge work.
MMM is a high-quality, peer-reviewed journal, publishing online and Open Access. As a barrier-free Gold OA journal, a fee waiver system is in place for unfunded authors. You can submit your article using our online submission system here. General queries should go to email@example.com.
During the volume’s long production time, the reproduction of images somehow took a strange trajectory. This is why I republish them in this blog post for your viewing and reading pleasure. And do not forget to check out the other excellent contributions to “The Practice Turn in Media Studies”, the publication of the German Research Foundation’s third Media Studies symposium! It is also the first time that this has been a transatlantic event. I am very grateful to have been a part of it.
Based on historical case studies focused on media and data practices,
the project reconstructs the co-operative creation of networked media
since 1989. From a media-historical perspective, it aims to
contribute to the European and transatlantic history of the Internet
and the World Wide Web. From a media-theoretical perspective, the
project aims to develop and specify a concept of digitality that takes
into account its cooperative emergence, its infrastructural
maintenance, universalization, and its specific publics.
We thereby focus on the constitutive role of a) interchangeability of representations and the growth of digital systems, b) cooperative production of interoperability and modularity, and c) elementary practices of reading, writing and algorithmic control. The three work packages of the project explore
the constitution of the World Wide Web through its situated work practices (Gießmann, Schüttpelz, Taha, Volmar),
the development of intranets using the example of German corporate networks (Taha) and
the emergence and spread of IP-based real-time communication via instant messaging (Volmar).
We assume that the establishment of the Internet and especially the
World Wide Web as a public general-purpose infrastructure has led to a
remediation of cooperative practices of local working contexts. The
project therefore reconstructs the emergence and proliferation
of web applications as a software- and data-oriented infrastructural
history of cooperative media. We focus on the mutual production of
cooperative conditions from collective, locally limited as well as
translocally distributed work contexts and the corresponding situated
data practices and arrangements (such as format usage, user
administration, file sharing, collaborative processing of files,
programming, error correction, patenting, standardization, etc.).
We are particularly interested in the interactions between work practices and the specific requirements for cooperation they produce, and in the materializations and affordances of digital micro-practices, through which cooperative conditions are ultimately realized in the form of digitally networked applications. We analyze these dynamics against the background of a longue durée of bureaucratic and administrative processes. These form the underlying socio-technical conditions that determine the materiality of cooperative computing, networking and data processing.
This research project is part of the Collaborative Research Center “Media of Cooperation” at Siegen University. Feel free to contact us anytime! Up-to-date publications can be found on our Media of Cooperation homepage.
The dirty laundry is always washed last. Yasha Levine’s furious reckoning with the Surveillance Valley launches a broadside in its final pages. Whether Edward Snowden, Jacob Appelbaum, Roger Dingledine, or the Electronic Frontier Foundation: for Levine, the activists around the encryption software Tor all too naively play along with the games of intelligence agencies and the military, without critically engaging with the origins of their favored technologies. Levine, the son of Russian immigrants and an investigative journalist, instead sticks to the maxim follow the money. He begins his book with the familiar story of the Sputnik shock and the Vietnam War, which mobilized state research funding on an unprecedented scale in the United States of the 1960s. He then turns to the Advanced Research Projects Agency (ARPA), founded on this basis as the research agency of the US Department of Defense.
In Surveillance Valley, ARPA’s strategic counterinsurgency activities in Project Agile serve as the primal scene of digital surveillance. They rested on an analysis by military intelligence officer William Godel: given the military failures of the French colonial power in Vietnam, he concluded that future counterinsurgency would have to operate on a smaller scale, covertly, and with more high tech and psychological warfare. Even before the outbreak of the Vietnam War, ARPA therefore deliberately built surveillance stations in Vietnam for the Pentagon. As part of Operation Igloo White, thousands of sensors and microphones were placed in the jungle, largely without success.
Thursday, 24 January 2019, University of Siegen, Herrengarten 3, 57072 Siegen, room AH 217/218
13:15 Opening Remarks: Standards Revisited Sebastian Gießmann (University of Siegen) / Nadine Taha (University of Siegen)
13:30 Anna Echterhölter (University of Vienna) Red and Black Boxes: Standardization as Mesuroclasm in German New Guinea
14:30 Nadine Taha (University of Siegen) George Eastman and the Calendar Reform
16:00 Geoffrey C. Bowker (University of California, Irvine) Standard Time: Computers, Clocks and Turtles – via Zoom Conference
17:00 Lawrence Busch (Michigan State University) Markets and Standards – via Zoom Conference
Friday, 25 January 2019
10:00 JoAnne Yates (MIT, Sloan School of Management) A New Model for Standard Setting: How IETF became the Standards Body for the Internet
11:00 Thomas Haigh (University of Wisconsin, Milwaukee / University of Siegen) The Accidental Standard: How a Box Became an Industry
13:00 Sebastian Gießmann (University of Siegen) Standardizing Digital Payments
14:00 Anne Helmond (University of Amsterdam) / Fernando van der Vlist (University of Amsterdam / University of Siegen) ‘It’s Graphs All the Way Down’
Standards are not easy to come by. As infrastructural media they coordinate the
social to an ever-growing extent, thus creating conditions of
cooperation. Standards do so not just by their sociotechnical power, but
also by public uptake and controversies that put their accountability
into question. They can also be understood as engineering and
bureaucratic media that form a basis and condition for cooperation.
Historically, practices of
standardization can be traced back to antiquity, especially in the
history of coins, writing, and measurements. But pre-modern standards
were bound to flounder and dissipate. Early modern knowledge cultures
partly realized standardization via hand-made scientific instruments
that extended metrological chains. While pre-industrial attempts to
standardize the aggregation of information in administrative forms had
been limited in scale and scope, 19th-century industrialization,
interconnected with nationalized politics, extended the territories of
standardization. Media infrastructures such as the postal service and
telegraphy became transnational through their administration in
international organizations and a legal foundation via international
treaties. Scale and scope of – inherently political and normative –
standards and metrologies were at the same time constitutive for
colonial prospection and rule.
Computing has given rise to its own
regimes and obsessions of non-governmental standardizing. While early
digital computers were unique, the trajectories of standardization were
then tied to governmental contract research, commercialization and its
coordinative and delegative practices. Serial production and the
diffusion of architectural norms became a matter of economic competition
in the era of mainframe computing in organizations. In multiple ways
both the networking of heterogeneous computers and the success of the
IBM-compatible PC created a pathway to “open standards” that made
computers publicly accessible. In the transpacific and global arena of
hardware and software production, hyper-standardization has been an
issue ever since. This also involves the questions of formats that
mediate bureaucratic processes, textual representation, visual and
auditory perception, and digital audiovisuality. Formats thus have
become standards that mediate digital practices in their own right, just
like network protocols and Internet standards. In many ways, the
World Wide Web owes its ecology to its standardizing
bodies, communities of practice, and institutions like the Internet
Engineering Task Force (IETF) and the World Wide Web Consortium (W3C).
Our aim is to understand how standards
generalize and universalize media technologies, and to ask: How do
metrology, industrialization, and imperialism/colonialism intersect with
standards? What is the relation between standards, digital media, and
coordination? How can we explain the longue durée, ecology, and the enduring
power of standards to configure cooperation? What is the relation
between standards, delegative power, scale, and scope of media?
The article intertwines the history of the American credit card, its standardization, and its interactional realization with the latest developments in payment systems. Understanding both credit cards and systems like Apple Pay or blockchain-based applications as part of an administrative longue durée, it argues for a different understanding of the Internet of Things: it should be understood both as a technical-informational and as an accounting infrastructure, with tensions arising between these two layers.