The Datafied Web: Call for Contributions RESAW25

Call for Participation: The Datafied Web, RESAW25

Do you remember the beginnings of web metrics in the 1990s: the birth of web counters, those digital pioneers that started to quantify the pulse of online activity; the novelty of seeing website visits measured in real time; eye-catching graphics becoming the currency of online attention; and the early days of companies like Webtrends, Urchin and DoubleClick?

We invite scholars, researchers, and web archivists to contribute to the 6th RESAW conference on the topic of “The Datafied Web”, approached through a historical lens. We would like to delve into the historical roots, trends, and trajectories that shaped the data-driven paradigm in web development and to examine the genealogies of the datafied and metrified web. Historical studies of trajectories towards a databased web and the emergence of platform-driven mobile ecosystems are very welcome, as are case studies related, for instance, to the development of Application Programming Interfaces (APIs) and the evolution of data-sharing practices.

Uncovering the early forms of analytics software, their origins, and the role they played in shaping the web landscape, and examining the historical context, aesthetics and role of web counters, analytics tools, mobile sensing and other metrics may also help us deepen our understanding of online interactions, past publics and audiences, and their (uneasy) trajectories. “The datafied web” also raises questions about the methods and (web) archives that allow us to research this evolution: what, for instance, are the challenges and methodologies involved in archiving the metrified and increasingly mobile web, including its back-end infrastructure?

This theme also invites us to trace the historical trajectory of data surveillance and the evolution of data-capturing practices on the web. Complementary to this are issues related to the historical development of tracking mechanisms, cookies, and the creation of digital footprints, as well as the evolution of companies relying on metrics and the development of financialized web spaces and their implications.

By investigating historical controversies and debates surrounding the increasing datafication of the web, and by uncovering historical instances of innovative data use or resistance against the datafication of the web, this conference also aims to reconstruct vivid and key debates that run transversally through the history of the web. How did the datafied web provide for the sensory media environments that we are now living in?

Finally, we wish to discuss innovative research methodologies for uncovering the historical dimensions of the datafied and metrified web, as well as methods that approach web archives as data (from both an archiving and a research perspective) and explore them through distant reading, metadata, seed lists, and other means. We also want to encourage everyone to think about datafication as a practice of sensing and sense-making that creates, sustains, and undermines media environments.

This is a reblog from the original call at the RESAW homepage.
You can download a PDF version right here.


Why the Internet Is Not an Internet


Myth: The Internet is a ‘network of networks’. It connects heterogeneous elements, not just technically, but also socially and economically. The ‘network of networks’ idea has influenced peer-to-peer networking, ideals of scientific and democratic values, and Internet Governance. Basically, it promises universal connectivity and interoperability.

Busted: Yet the Internet we have is not an internetwork of heterogeneous networks, as counterintuitive as that might seem. Network protocols are infrastructure, and infrastructure is boring, bureaucratic, and usually taken for granted. Yet developers and administrators of network protocols know about the social and relational character of digital infrastructure, and about what is politically at stake in the design of network protocols. In a 2006 interview, computer scientist David Reed made some of the 1980s political choices of protocol developers transparent: “In fact, the idea of pursuing a thing called ‘the Internet’ (an ur-network-of-networks) was a political choice – that universal interoperability was achievable and desirable. It’s parallel to ‘One Europe’ or ‘World Government’, though not the same. The engineers involved were not ignorant of the potential implications at the political level of that choice” (Reed in Gillespie 2006, 452). Reed’s argument is typical of the values that influenced the design of the Internet protocols and their end-to-end architecture. It also misses one important historical point.

‘Universal interoperability’ depends on standardisation, and network protocols form the de facto standards of digital mediation. TCP/IP, the Transmission Control Protocol and Internet Protocol, was imposed as a standard by the US Department of Defense on January 1, 1983. US universities followed that directive and gladly adopted TCP/IP. What did that transition within the ARPANET achieve? Computer scientist John Day argues that in this infrastructural shift the internetworking layer actually got lost. Picture Day’s central argument not in all its subtlety but in its consequences, when he asks “How in the heck do you lose a layer?” (Day 2011). He stresses that the split of TCP and IP “contributed to being an Internet in name only” (Day 2013, 22). Open Systems Interconnection (OSI) and other internetworking approaches took into account that interconnected networks could be based on completely different technologies and addressing schemes. But the Internet Protocol created only one address space for all connected networks, and today’s Domain Name System has been built along that path dependency. You can still hook up any other network with obscure protocols to the Internet, as long as it uses the ruling IP addressing system. The 1990s slogan “IP on everything” did not create an ur-network-of-networks. It rather reinforced the loss of what would have been an internetworking layer in a scientifically sound and technically interoperable network architecture (Day 2008). For now, we have to live with that flaw. The Internet is not doing the internetworking of heterogeneous networks that so many people still expect it to do: “OSI had an Internet Architecture and the Internet has a Network Architecture” (Day 2012, 15; cf. Russell 2014).

Truth: Ever since the internetworking layer got lost in 1983, the Internet’s architecture has depended on a homogeneous system of naming and addressing. The Domain Name System (DNS) does exactly that, creating one seamless namespace mapped onto IP addresses that needs to be centrally administered, even if domain registration procedures are decentralised. The current Internet does not interconnect completely heterogeneous networks; it remains a single network at the level of naming and addressing. So when will we have a real internetwork?
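
A small practical illustration of this point, a sketch of my own rather than anything from the book chapter, assuming a standard Python environment and the placeholder hostname example.org: whatever kind of network sits behind a name, resolving that name on today’s Internet can only ever return addresses from the single global IP address space.

    # Minimal sketch (assumptions: standard Python, placeholder hostname): whichever
    # hostname you resolve, DNS can only answer with addresses from the one global
    # IP address space (IPv4 or IPv6), never with an address from a non-IP network.
    import socket

    def resolve(hostname: str) -> None:
        # getaddrinfo() consults the system resolver / DNS and yields
        # (family, type, proto, canonname, sockaddr) tuples.
        for family, _, _, _, sockaddr in socket.getaddrinfo(hostname, None):
            label = "IPv4" if family == socket.AF_INET else "IPv6"
            print(f"{hostname} -> {label} {sockaddr[0]}")

    if __name__ == "__main__":
        resolve("example.org")  # placeholder hostname for the demonstration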


This is a slightly modified version of a text which appears in Busted! The Truth About the 50 Most Common Internet Myths, edited by Matthias C. Kettemann and Stephan Dreyer (Hamburg: Leibniz Institute for Media Research | Hans-Bredow-Institut, 2019). The book is going to be launched at the Internet Governance Forum in Berlin.

References

John Day, Patterns in Network Architecture: A Return to Fundamentals (Upper Saddle River, NJ: Pearson, 2008).

John Day, How in the Heck Do You Lose a Layer!? (International Conference on the Network of the Future, Paris, 2011), 135-143. doi: 10.1109/NOF.2011.6126673.

John Day, How in the Heck Do You Lose a Layer!? (Future Network Architectures Workshop University of Kaiserslautern, 2012). https://www.researchgate.net/publication/261458332_How_in_the_Heck_do_you_lose_a_layer.

John Day, Surviving Networking’s Dark Ages or How in the Hell Do You Lose a Layer!? (IRATI RINA Workshop, Barcelona, 2013). http://irati.eu/wp-content/uploads/2013/01/1-LostLayer130123.pdf

Tarleton Gillespie, Engineering a Principle: “End-to-End” in the Design of the Internet, Social Studies of Science 36 (3) (2006), 427-457.

Andrew L. Russell, Open Standards and the Digital Age: History, Ideology, and Networks (Cambridge: Cambridge University Press, 2014).

Circulating Indexicality, Cyberspace and the Early Web


Looking back at 1990s representations of cyberspace always makes one feel alienated, a bit dislocated, and amazed at the same time. Did the American and Western European grasp of the World Wide Web really mix it with imaginations of cyberspace, all of the time? How could the mundane interfaces, modems, and slowly loading websites give rise to such an enthusiastic mapping of online spatiality, creating a unique visual culture of new cyberspaces? Some explanations are easier to give: cyberpunk, gaming cultures, and media arts had been engaged with online spatiality before the Web grew exponentially within a short time. Interlinking public, and especially urban, space with representations of digital cities and information landscapes also did not start with the Web, as Kirsten Wagner showed as early as 2006 (Wagner 2006). Yet some of the Web’s practices quickly became engaged with translating urbanity into cyber-urbanity, affording a new situationist dérive while surfing. John Perry Barlow’s “Declaration of the Independence of Cyberspace” attempted to remove cyberspace from the realm of old statehood and legality, while addressing its representatives at the highly localized 1996 World Economic Forum in Davos.

A lot of this resonates in and with Martin Dodge’s and Rob Kitchin’s seminal work “Mapping Cyberspace” (2000), which we want to revisit here. For them, the “Web has become such a powerful interface and interaction paradigm that [it] is the mode of cyberspace, particularly for the mass of users who only came online since the mid-1990s” (Dodge/Kitchin 2000, p. 3). Along with Dodge and Kitchin, a slightly more systematic explanation can be given of the dynamics between locating the Internet, and the Web, topographically while at the same time accounting for information spaces that felt new and attaching a topological spatiality to them. Relations between topography and topology are, as I would like to argue, always shifting and relational, depending on what kind of indexicality a mapping wants to achieve. Neither is topography bound to mimetic mappings of actual geographic space, nor is topology something only to be found in the realm of abstract diagrammatics and mathematics, refraining from any geo-indexicality. Methodologically, Dodge and Kitchin appropriated the whole range of digital cartographic options at hand, including a multitude of distributed mappings by geographers at universities and telco companies. Geo-indexicality thus almost always remained topical, even if it was absent in representations of, let us say, a hyperlink topology between websites, such as Ben Fry’s Valence (1999). “[G]eography continues to matter, despite recent rhetoric claiming the ‘death of distance’” (Dodge/Kitchin 2000, p. x).
