
 

The internet, in the words of pioneer Vint Cerf, has gone on to ‘revolutionise the computer and communications world’. It is the largest computer network in existence, with an estimated 5.3 billion users, almost 66% of the world’s population.

But how is it being recorded, how are we tackling digital storytelling, and how are collections being archived and accessed? These are some of the questions posed at AIT’s inaugural Forum on the Histories of the Internet.

The Forum, which took place on 9 January at the Worshipful Company of Information Technologists’ Livery Hall and online, consisted of five panels and lively corresponding discussions with a total of 17 presentations.

The talks were led by a multi-disciplinary set of academics, practitioners and leaders from civil society, business, industry and government, with online representations from the US and Egypt.

There was also an introduction by AIT’s chair of trustees, John Carrington, and closing remarks from AIT trustee Professor Bill Dutton, founding Director of the Oxford Internet Institute, University of Oxford, and from Professor Jim Norton, Fellow of the UK Royal Academy of Engineering, who worked on packet switching at the Post Office in the 1970s.

The Forum also looked at how we arrived at our current networked society: what was happening before packet switching took over and how it enabled mobile technology; what preceded social media and how the latter developed; the biases of AI; and the societal impacts of the internet, such as working from home (teleworking), digital poverty and gender inequality.

The Five Panels:

Panel 1 – Who Cares about the History and Archives of the Internet?
Panel 2 – Innovations and Implementation, Successes and Failures
Panel 3 – AI Future Realities
Panel 4 – Learning More from History
Panel 5 – Societal Implications

“The panels covered a wide spectrum of IT developments, which were thought-provoking and a notable addition to our online archive that is accessible to all without charge,” said Carrington, who launched Cellnet’s first cellular service.

“It was an opportunity to build new academic networks in an under-developed field, and also foster academic-industry interaction, which complemented AIT’s objective of raising public awareness of the rich and important history of technology in the UK.”

 

The Highlights

Panel 1 – Who Cares about the History and Archives of the Internet?

Niels Brügger – Lessons from Editing the Internet Histories Journal

In the first panel, Who Cares about the History and Archives of the Internet?, Niels Brügger, Professor of Media Studies at Aarhus University, opened proceedings with an emphatic ‘I do’ as he explained how he launched Internet Histories, an international, inter-disciplinary, peer-reviewed journal.

“It started in 2010 as an idea. I was editing this book called Web History and in the epilogue I wrote ‘what do we need if we want to push forward the history of the web?’

“And one of the underlying things is that we should have some infrastructure to support us, such as conferences and also an international peer review journal,” he said.

Niels became the managing editor of Internet Histories and in 2017 Taylor and Francis began publishing the journal online; it also publishes physical books of special issues and interviews.

 

Brian Sudlow – Insights from IT History on the Search for Accessibility and Effectiveness

Digital Storytelling and Snackable Content

Following on from Niels, Brian Sudlow, lecturer in history at Aston University, presented his talk Insights from IT History on the Search for Accessibility and Effectiveness and reframed the panel question as Who cares about Archives of IT?

His key question was whether Archives of IT could promote prospective history and if so how?

“Essentially what prospective history means is using history as a tool for future, tactical and strategic planning. The idea is to use digital storytelling as a way of translating the very rich sources that Archives of IT now represents, communicating all that material in an agile way.

“How can digital storytelling actually ‘Capture the past, inspire the future’, which is the AIT tagline? So, I think inspiring the future is something that needs to be impacted.”

He believes the AIT archive is underused because of time constraints, with all the interviews being an hour long or more.

Today, he said, the preference that Generation Z has for short-format media, known in the jargon as ‘snackable content’, is overwhelming, and he has been working on short narrative videos that encapsulate much more complex stories such as those contained in the AIT interviews.

“In that sense we think that these histories could serve as curatives. Secondly, histories could also serve as catalysts, because essentially what we find in them is a way of enriching the imagination, of making users of history more sensitive to the unexpected and the contingent, which is the essence of history in many ways.”

 

Jane Winters – Reflections from Digital Humanities

Professor Jane Winters, Professor of Digital Humanities at the School of Advanced Study, University of London, talked about the histories and archives of the internet from the perspective of digital humanities.

Professor Winters focused on four main areas: use, access, value and the linked questions of ephemerality and preservation.

She said there are multiple actual and potential uses of the archives of technology, the web and the internet: from the archived web, to the collections of cultural heritage institutions such as the Science Museum Group, to the interviews and oral histories created and archived by AIT.

“But the uses to which these enormously rich archives can be put depend on who is able to access them and how they can then be analysed, published and reused,” she said. “And that’s why it’s so wonderful that the AIT interviews are open and available for everybody.”

One example she focused on was the UK Web Archive, which is only accessible on site in the reading rooms of the UK and Ireland’s six legal deposit libraries [the British Library, the National Library of Scotland, the National Library of Wales, the Bodleian Library in Oxford, Cambridge University Library and Trinity College Library, Dublin] and not online, so its collections can’t easily be analysed at scale.

There was an update from Nicola Bingham, Lead Curator, Web Archiving at the British Library, who was in the audience and said this situation was being reviewed by the Joint Committee on Legal Deposit, which represents the legal deposit libraries and also publishers.

“What it is trying to do is strike a balance between opening up these collections for researchers and for members of the public while also protecting the commercial interests of the publishers.”

Jane complimented the work of the National Archives (https://www.nationalarchives.gov.uk/webarchive/find-a-website/), and Niels mentioned the International Internet Preservation Consortium, an international coalition that keeps a long list of all the web archives in its membership. So that’s also a place to start if you want to study the web as it is being archived.

 

Panel 2 – Innovations and Implementation, Successes and Failures

In his talk Telecommunications and Computing: British Rail’s Nationwide Train Operating System and its Evolution, researcher Jonathan Aylen said there was what amounts to an internet before packet switching.

“There was something called circuit switching of telecoms links, way back when. And these were large centralised computer networks linked by circuit switching.

“In this method of telecommunications, believe it or not, you had to keep an end-to-end connection open while the data was being transmitted. That is the key point: it was not being sent in little packets and reassembled at the destination.

“What we want to emphasise here is that TOPS (Total Operations Processing System) kept track of every single freight loco and every wagon across British Rail’s network by 1975.

“It was actually rolled out across Britain for a period of about three years. And the key point to take away from this is that this was a command-and-control system.”
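
Aylen’s distinction is easy to illustrate in code. The short sketch below is ours, not from the talk (the function names are illustrative): a message is cut into numbered packets that can travel independently and arrive out of order, and the destination reassembles them by sequence number, with no end-to-end circuit held open in between – the opposite of the circuit-switched model TOPS ran on.

    import random

    def packetise(message: str, size: int = 8) -> list[tuple[int, str]]:
        # Cut the message into (sequence number, payload) packets.
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets: list[tuple[int, str]]) -> str:
        # Rebuild the message whatever order the packets arrived in.
        return "".join(payload for _, payload in sorted(packets))

    msg = "Every freight loco and every wagon, tracked across the network."
    packets = packetise(msg)
    random.shuffle(packets)  # independently routed packets may arrive out of order
    assert reassemble(packets) == msg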

 

Ed Smith – Evolving and Exploiting Packet Switched Networks

But of course packet switching became the predominant force in the internet and telecommunications, and Ed Smith reflected on the period between 1982 and 2000, when he worked for BT, and on how the technology enabled better mobile connections.

“This was an interesting period for BT, which was the main network provider at the time and had rather a lot on its plate. It had just separated from the Post Office, was about to be privatised, and had to break into new markets to see off the competition and keep revenues up,” he said.

“And among those were mobile communications. At the beginning of the period, most of the networks were customer provided and made use of their IT providers’ architectures and equipment.

“These would be linked by the customer organisation using basic telecommunications systems. A standardised approach didn’t become available until about 1976. One was X.25, which was standardised as a packet-switching technology.

“And in 1981 BT replaced its experimental packet service with the national X.25 service, based on Telenet equipment: first because Telenet was headed up by a chap called Larry Roberts, and second because it was a subsidiary of BBN [Bolt Beranek and Newman], both of which are pretty significant in the development of the internet.”

 

Simon Rowberry – A Revisionist History of Videotex in Britain: The Importance of Connecting Editorial and Engineering in Teletext/Videotex Adoption in Britain

In his talk, Simon Rowberry, Director of the Centre for Publishing and Lecturer in Publishing at University College London, opened with the observation that people do not read most of their digital content in long form format.

For almost 20 years we have had social media, and he has been researching what was ‘actually an early form of this: teletext and videotex’.

He focuses on three services: Ceefax, a BBC service from 1974 onwards; Channel 4’s 4-Tel; and Prestel, built on the strength of BT but which never really took off.

Most people, he says, will remember Ceefax because it ran for longer than most of the other services and was supported by the BBC.

“But overall, it wasn’t doing something completely revolutionary. So why should we care about it? Well actually, from an editorial perspective, it’s about how you use that very low-resolution text mosaic to create interesting and engaging content, and how you can use the fact that it refreshes over time. There was a lot of creativity, and Ceefax can be seen as the pinnacle of this.

“It was very strong as a brand and in how it was used to convey information effectively, rapidly, quickly. So we can see it as a kind of proto form of something like Twitter in its broadcast mechanism: not in terms of interaction, but in very quickly saying these are the headlines, to the extent that a TV channel could have the Ceefax breaking news appear along the bottom of the screen.

“This demonstrates that you can’t just look at the technical aspects of these historical platforms; you need to figure out how they worked from the social perspective as well – that kind of socio-technical analysis. It will then allow you to understand more effectively why this was successful in some cases, but not in others.”

 

Robin Mansell, Professor Emerita in the Department of Media and Communications at LSE and panel chair, said she thought Simon did an excellent job in highlighting one of her favourite terms: socio-technical.

“Sometimes it sounds a bit awkward, but it is really needful, because where would we be without the interfaces between content and engineering and the technical standards? This is what the world we live in today is about, and historically there must be a huge number of lessons.”

Simon: “I think for me, this draws on the previous panel and the problem of gaps in archives: if you go to the institutional archives of any of these teletext providers, they have very little direct evidence of their service.

“But actually, people can be very creative and reconstruct things in interesting ways. In the case of teletext, if you have a VHS recording of a television programme that carried a teletext transmission, you can reverse engineer the cassette to extract a partial reconstruction of that material.

“In the same way with web archives, some of them might be lost, but there might be some way of reconstructing material and there might be hope in a few generations’ time of hobbyists, archivists extracting content in interesting and innovative ways.”

Robin: “One of the things that struck me when I saw the curve of internet explosion was that quite often we’re told that with the internet everything was disruptive and very rapid. And yet the stories that these histories uncover reveal not so much that huge exponential take-off as a process of learning, sometimes making mistakes and sometimes correcting for those mistakes; learning from the United States as well as continental Europe, alongside the innovations in Britain of pushing forward with new services and technologies. I think that’s what’s fascinating about these histories, and why it really does need to be preserved.”

 

Panel 3 – AI Future Realities

John Handby – The Coming AI Tsunami

John Handby, who has worked for government and corporates on advanced technology and major change programmes, says that he titled his talk rather provocatively.

He said this was because, in his experience, technologies sometimes sit in the background for years, and internet technology is no exception: AI has been there for most of his career, and now it has suddenly come of age.

“I’ve been talking on the subject for about eight to ten years now and got really little traction, certainly among politicians; now governments and the media have grown fond of it.”

He said the companies in California are currently leading an arms race between themselves and governments, with some of the governments being ‘good guys and some of the governments not so good guys’.

“We’re going through this transformational change and that’s why I talk about the coming tsunami, something that begins as ripples and has now become major waves and it’s engulfing us all.

“When you look at what’s happening with chatbots and all the rest of it you can see the way in which the whole nature of what is happening is changing, it’s suddenly got this traction.

“It’s happening very quickly. We’ve got quantum computing coming. The Turing test has effectively been passed. Of course, there is good news. You’ve only got to look at healthcare, the way that offices work, the use of algorithms, manufacturing: there are masses of things going on.”

 

Vassilis Galanos: To Have Done with AI and Internet Summer/Winter Narratives: Can History Cure the Hype?

I have a PhD in science and technology studies from the University of Edinburgh and I’ve been researching and conducting what is known as a historical sociology of artificial intelligence. I’ve also been teaching a course called Internet and Society for four years now. So I’ve been reflecting a lot on the history of the internet on the one hand and the history of AI on the other.

And the more you go into these histories, especially the mainstream histories, the more you see how they were written: how the victors have written those histories.

And I’m saying that some of these metaphors we have decided to use are quite bad linguistic choices, essentially because they distract from technological advancements – what an earlier speaker termed small-scale innovations. Small-scale innovations happen at the same time, but how we describe them diverts our attention from them.

So we like our dichotomies: we like the idea of a summer and a winter, we like the idea of the analogue and the digital, we like the idea of AI as a companion but also the idea of connection, and of the two being distinct. We like natural versus artificial, right? But I say history doesn’t really know these differences. And a little theoretical handle to give you: the winter as opposed to the AI effect.

One of the arguments goes back to packet switching and the way time sharing was used for AI research in the ’80s. Much of what became known as the internet was used by AI professionals to communicate with each other. So the internet was essentially a by-product of the effort to create artificial intelligence in DARPA’s Strategic Computing Initiative.

You can also look at the work of Lynn Conway, probably the first openly trans person in the history of computing, covering very large scale integration, and her statements about the role of internet connection and intelligence through time-sharing platforms.

In parallel, during the 1980s connectionism took off with Geoffrey Hinton and his colleagues, who were producing some very important and influential papers. And interestingly, what we now consider as AI is the connectionist approach, although at the time people didn’t consider his approach to be AI (Hinton wasn’t offered a postdoc at the University of Edinburgh), so he essentially rebranded it under the term neural networks.

Web 2.0

So, step two: who remembers web 2.0? I’m of the web 2.0 generation, and everything on the web was about sharing, bottom up instead of top down. And that was all about the IT information revolution, information superhighways, we’re going to build up the web, and so on.

If you read the initial scientific papers that led to contemporary deep machine learning, they don’t refer to AI at all. They cite some of the classics, some of the Minskys, some of the Hintons, but they don’t cite artificial intelligence as a term.

They do cite web 2.0. So for them it was web 2.0 that brought about this kind of revolution.

And the third step: contemporary AI. I was surprised to see that within the first couple of months of ChatGPT’s launch, it was initially used for web searches.

So people were disillusioned by Google’s advertisement-led approach because, at least for now, ChatGPT seems to be advertisement free and seems to carry this sort of objectivist knowledge that appears very balanced – or, as Elon Musk called it, woke, though I don’t agree with the term. So it seems that people are using it as a sort of authoritative, one-hit search engine.

Which takes us back to that question: what is the internet for? What is AI for? Is it about memory, is it about search, is it about whose memory, is it about power, is it about politics, and so on.

So this tension between hype and history is something that can be useful in the future and that feeds into scientific motivation.

 

Chris Reynolds: Did the Hype Associated with Early AI Research Lead to Alternative Routes Towards Intelligent Interactive Computer Systems Being Overlooked?

The latest large language models are very impressive, but they are black boxes which don’t really understand the information they are processing. For comparison, his paper reviews the archives of the CODIL project (1966-1988).

CODIL was a transparent language system modelling human thought processes, but which was abandoned because it was not compatible with the AI paradigms that were fashionable at the time.

In addition to demonstrating how to construct a transparent, human-friendly “electronic clerk”, the archives also provide information about how research was handled at the time of the ICL merger, and about the problems which neurodiverse researchers can face when their “blue sky” research differs from establishment views.

It is possible that, if the merger to form ICL hadn’t happened, CODIL, or something like it, could have formed the basis of a human-friendly computer language.

 

Jon Agar – Why Did James Lighthill Attack AI?

Jon Agar originally trained as a mathematician and then became a historian of modern science and technology. His talk on the history of AI concerned a particular incident and, going back to what we heard from Vassilis, one of the causes of the so-called AI winter: the first real crash in funding in the early 1970s, which had many causes.

“One of the causes is often attributed to the mathematician, James Lighthill, who wrote a report for the British Science Research Council on artificial intelligence, on the state of research and whether or not we should be done with it,” he said.

Cuts to funding

“And basically his critical comments set off a wave of cuts in funding for AI research around the world.

He told the audience how he had wondered for many years about where Lighthill’s archives were and eventually found them right under his nose.

“I’m based at UCL and it turns out that they were 100 metres away in the UCL archives; he was provost there at some stage [1979-89].”

Agar said that Lighthill’s report is usually taken to be an attack on artificial intelligence, however he found that it was more nuanced than that.

Lighthill’s three areas of AI

“Lighthill’s argument was that you can see artificial intelligence in the 1970s as essentially three areas. At one end you have A, things to do with advanced automation – replacing people with machines – and at the other end C, what you’d call computer-based central nervous system research, closely connected with brains, psychology and neural networks.

“And in the middle, B, there’s this bridging category: building robots. This is where you find a lot of the much more general AI work going on. And he basically says that A and C are coherent and advancing fast, but that B is over-promising, it has failed again and again, and basically funding should be cut, which meant severing AI as a coherent thing.”

 

Panel 4 – Learning More from History

Brian Vagts: Telenet and the Construction of Network Security Culture

Brian Vagts is a graduate student in the Science and Technology programme at Virginia Tech and Associate Professor of History at Northern Virginia Community College.

He said that one of the interesting things about Telenet is that where he is from, the suburbs of DC in Northern Virginia, where Telenet was established, no one knows about it and he is constantly having to explain what it is; yet it had already been mentioned twice at the Forum.

“So just a little bit of background on Telenet. Again, I’m arguing against the Arpanet being the beginning. Arpanet is important, but it is largely about developing IP. People don’t care about protocols; they care about what they’re doing on their networks. And as best I can tell, in 1983 Telenet was orders of magnitude larger than Arpanet was at the time. Arpanet had up to 113 nodes; Telenet had more than 2,000 customers. It’s an apples to oranges comparison, but each of those customers represents a large enterprise computer system, and while I don’t know how to break that down into absolute numbers of users, it is several orders of magnitude larger. And it’s these users, along with those of Tymnet and CompuServe and the other services of the BBS [bulletin board system] world of the time, who go on to bring those learned behaviours into the internet when it starts expanding in the 1990s.

So what I’m arguing, in terms of the technical construction, is: okay, yes, Arpanet is important for the IP conversion and the development of those protocols. But the social basis is much more bifurcated.

Hacking incidents in the ‘80s

“We had a series of hacking incidents, one of which utilised Telenet as a vehicle for attack – a transport mechanism that wasn’t directly involved.

“But later there were attacks on the Telemail email service itself. So Telenet was both a vehicle for and a victim of attack. I looked into this through the idea called SCOT (the social construction of technology). Basically there’s a feedback cycle.

“You have a technology, people respond, and the technology incorporates that feedback, so I wanted to see how security technology was incorporating this feedback. The hacking incidents in 1983 were pretty much your stereotypical teenage male suburban hackers; it was a really nice early point to address security issues before the stakes went up. And we didn’t address it. So the question is, what was going on?

So, how did we respond to these hacking incidents? How did the technology change? At first what was really disappointing was that there were no changes; in fact it’s almost like a broken record. What led to all of these systems being penetrated was the fact that the passwords were horrible. If you watch the movie War Games, the protagonist breaks into the school computer because the password was ‘pencil’.

On the Telenet systems it was documented that the passwords were the person’s first name, and if you were an administrator you had a capital A afterwards. So, of course, the hackers went after that. You can take this story and repeat it for decades and decades; the security didn’t develop to incorporate this feedback. And I’m like: oh, what the heck, I’m in here – so that’s not how it should work.

And there were some mechanisms at the time, cumbersome ones, such as callback modems and, a couple of years later, the RSA SecurID tokens, but they had a very slow uptake.

But we don’t see an effective response to this. And a year after that we get the TRW hacks of credit card information, so the costs are escalating very rapidly. How Telenet responded to this was interesting.

Going through the literature of the period, most people thought that hacking was wrong, but they weren’t quite sure why. The US code at the time was ambiguous on whether it was legal or not. It had strong wiretapping protections: you can’t mess with data when it’s being transmitted, but if it’s in storage on a computer, well, what are you going to do with that?

There was a conflation between software piracy and what these kids were doing, and these kids were basically just messing around; they weren’t pirates as far as I can tell. And some people were like, well, they’re not pirating commercial software, so there’s no crime here.

And what Telenet ultimately did was launch a campaign trying to associate hacking with criminal activity. They took a kind of two-pronged approach: on the one hand these kids are just dupes, but on the other there’s a reference that comes up repeatedly and repeatedly that this is a front for organised crime, for the mafia – which was far from helpful at the time, is it not?

But later, it’s almost prescient. And so we see, in response to this, not much change in the technology, though change is what I was expecting: if the feedback had been incorporated, there would have been a change in these systems.

Unauthorised computer access

What we saw in fact was change on the social side – this was going on, but it’s not the whole process – and we start defining this idea of unauthorised computer access as a new form of criminal activity in its own right.

And in doing so, we kicked the can down the road for decades; I’d say we only really started addressing this effectively five years ago, maybe. Passwords should be awesome but in reality they seldom are. In essence we refused to deal with that technological feedback from this early period.

And what we ultimately see driving this is that cost and ease of use trump everything else. No one likes the idea of bad security, but it’s never enough to overcome those other factors. Ultimately security was, and I would argue still is, a secondary element, and that has messed with us long-term.

 

Mennatullah Hendawy: IoT Co-production for Inclusive Smart Cities

Mennatullah Hendawy is an interdisciplinary urban planner who joined the Forum from Cairo where she is affiliated to Ain Shams University as an assistant professor and also to the Center for Advanced Internet Studies in Germany.

Her talk was about Internet of Things (IoT) co-production for inclusive smart cities. She said the research is in its initial stages, and that she is investigating how citizens will become data in smart cities with the use of IoT.

“What I argue today is that there needs to be a new movement which tries to link the socio-technical and the hybrid – I would say the top-down and bottom-up procedures – in developing smart cities and IoT systems.

“And for this I propose trying to develop more co-production processes and to co-create smart cities. I argue that interdisciplinary knowledge and co-production can be a systemic innovation policy approach that integrates missions targeted at solving societal socio-technical challenges.

“So in smart cities, it’s basically about using big data. Using IoT devices and autonomous machines and sensors to collect massive amounts of data and collating it for decision-making.

“Knowledge co-production is an approach to data collection in science that emerged in the 1980s, especially in urban studies, to understand who actually produces knowledge and science beyond universities.

“Collecting information in this way as a co-production enables us to rethink how we collect the data crucial for smart cities to work. For smart cities to operate, they need data and what I’m trying to explore is how we can co-produce data. Not only co-produce knowledge, but also give more agency to the different stakeholders to be part of the decision-making in smart cities.

“So I try to look at the data as a non-human actor, and also at the human actor, and to question whether in smart cities the technological tools can become more active or more responsive to citizens and vice versa – so essentially to help citizens become more active, rather than technology taking over and citizens disappearing in the process. We see this especially in AI systems but also in IoT systems, and it can be traced in a methodical and beneficial way.”

 

George Zoukas: Decentralisation and Platform Migration: Lessons Learned and Expectations

George Zoukas travelled from Greece to be at the Forum and is a Science and Technology Studies postdoctoral research fellow at the Department of History and Philosophy of Science, National and Kapodistrian University of Athens.

Like Vassilis he completed his PhD at the University of Edinburgh and has a background in technology. One of his interests includes online communication and the history of online communication within the context of climate and environmental communication.

“In April 2022, on exactly the day that the acquisition of Twitter by Elon Musk was officially initiated, Mastodon, the largest decentralised social media platform, which has operated since 2016, used its competitor Twitter to promote the benefits of decentralisation, and of moving to Mastodon in particular,” he told the audience.

“So, independence appears to be the main advantage of Mastodon, as people can now use a social media platform whose developers claim, for instance, that by design they have no power to define your rules, to show you ads, or to track your data.”

However, he says the idea of independent online communication and its association with decentralisation is not new, given that it first appeared at least 45 years ago with the initiation of Usenet, the decentralised computer-based communication system distributed across different locations.

“Usenet was developed as an inexpensive alternative to Arpanet, the latter generally regarded as the official ancestor of the internet, with the aim of providing an independent way of communicating online.

“Usenet was distinguished by its anti-bureaucratic ethos of collaboration, egalitarianism and bottom-up democracy, according to which users were the only ones who decided how to control their content.

“So that sort of promise, we could argue, of Usenet was mirrored not only in its decentralised and distributed function, but also in the way it was designed from the very beginning.

“However, due to its commercialisation and increasing openness to the public, which began in the early ‘90s, Usenet eventually deteriorated.

“I developed my case study around the sci.environment group, which involved the appropriation of a blogging platform by medical scientists for climate communication.

“Most of the blogs related directly to some aspect of the history of online communication, and I found them an especially good source of data, given that in this kind of historical study it is difficult to find and interview people who were using something 25 or even 30 years ago.

“So why did a small group of scientists move from Usenet to blogs? I was specifically interested in the social, cultural and technological factors involved and the character of that migration.

“And finally, I was interested in how we could use the example of Usenet to analyse current examples of platform migration, especially from centralised platforms to decentralised ones – for instance, the migration from Twitter to Mastodon.

“I really like this phrase from one of the regular sci.environment users, later a climate blogger, who posted on his blog: ‘The great virtue of Usenet was that no-one owned it and anyone could post. The great vice of Usenet was that no-one owned it and so anyone could post’.”

“I like this because it’s somehow indicative of the main paradigm or of the main argument I put forward in my research paper.

“And the argument I put forward is that understanding platform migration from centralised platforms to decentralised platforms, and vice versa, as a techno-cultural transition exposes no superiority of the new/alternative platforms over the old/mainstream ones, but rather indicates different paradigms of communication.”

 

Panel 5 – Societal Implications

Juliet Webster: Gender in IT Industries and Workplaces

Juliet says that since the early ’80s she has been very interested in, and concerned by, the gender inequalities in computing professions, partly as a result of having observed the rollout of computers into workplaces in the early ’80s and the disparities of skill and opportunity that went alongside it.

“And still it seems to me that nothing very much has changed substantially. I’ll say a bit more about that in a minute,” she says.

She has a mixed academic/policy background and has also worked in the European Commission and in NGOs as well as in a university in Spain where she ran, under the auspices of Manuel Castells, a research centre on gender and information technology.

Virtual work

“While I was in Spain, I got drawn into an EU COST Action on virtual work. And virtual work was described, and we understood it at that time as being, and I quote, ‘labour, whether paid or unpaid, that is carried out using a combination of digital and telecommunications technologies and/or produces content for digital media’.

“While I was in Barcelona, we were looking at the differences in women’s participation in various broadly defined IT professions across different EU countries and drawing on lessons from elsewhere in the world.

Enduring absence of women in IT

“And what was particularly striking to us was the enduring absence of women in IT. So for the preparation of this talk, I went back through my own personal archive, which I don’t know what to do with either.

“These included a DTI strategy for women in science, engineering and technology dated 2003, and the Institute of Physics guide to best practice in career break management, designed to help attract women into scientific and technical professions.

“Then there was the UK Resource Centre for Women in Science, Engineering and Technology, which existed from the mid-2000s to the early 2010s and quoted the statistic that only 14.4% of computing professionals were female. The latest stats I’ve laid my hands on, the Eurostat statistics, show a figure across Europe of something like 17%.

“So why? When I was doing work in Spain, we identified several points in a woman’s life course at which they left the professions altogether, and that seems to me to be still relevant.

Critical factors: maternity and mid-career

“The two points that seemed to be critical were maternity and mid-career. Maternity because a lot of the work was so hostile to people trying to balance several different demands on them that women left in droves if they were the key people within families responsible for child rearing.

“In mid-career, what we were seeing was those women who came back into the profession and into this area of work leaving again, because they were sandwiched between two types of care: caring for the elderly and caring for children at the same time. And it’s still the case that women form the majority of carers.

“And so, what you had was women coming into the profession, sometimes encouraged by special public policy measures, by company initiatives and so on, coming in, studying and then dropping out at maternity.

“And on and on it goes. So what we have is a mixture of hostile working conditions and hostile culture. And that remains the case. So we’ve got this kind of chilly climate. I think that explains what’s happening in conventionally defined tech professions.

“I became very interested then in what’s happening within digital work now. And by digital work I mean that vast variety of professions including media, content production and other things.

“And within the COST Action I was involved with, we were finding that digital labour had a certain set of features that were not only disadvantageous to social groups such as women, but exhibited characteristics which meant the disadvantage was actually being spread throughout the workforce.

“So what started as a sort of set of working conditions that exhibited gender disparities I think became disadvantages to all.”

 

Jack Nilles: Evolving Telework

My formal education was as a physicist and engineer. And my initial career began basically managing the design and implementation of a variety of reconnaissance systems for the Air Force.

For much of that career it was space systems that I was designing: payloads, essentially, for reconnaissance satellites.

Now this went on for about 17 years or so, until the late 1960s. At one point I was talking with an urban planner, who looked at me and said: you know, if you people can put a man on the moon, why can’t you do something about traffic?

Oh, well, I said, I’ve helped NASA pick the cameras to map the landscape of the moon and to pick landing spots. Why not traffic? You know, it’s just an engineering problem, right?

Wrong. I thought about this for a while and tried to talk my company into doing a little research on telecommunications as a substitute, so people could work at home, or near home, instead of hogging the freeways twice or more a day.

My company didn’t like this; they said, we’re engineers and we don’t deal with stuff like this. So I moved to the University of Southern California and invented a job called Director of Interdisciplinary Program Development, and since nobody knew what that meant, I had a fairly free hand picking people from around the university to do the research with me.

Telecommunication and transportation

We got a grant from the National Science Foundation to develop policy on the telecommunications–transportation trade-off. In 1973 we picked an insurance company to try this on, because I wanted it done with an actual operating company, because they have clear criteria on what works.

We ran an experiment for several months, moving their employees to locations near their homes that we called satellite offices, where they worked on dumb terminals connected to their local mini-computer, which would upload their information to the company mainframe downtown.

Having people work at home wasn’t an option, because the telephone costs at that time would have been out of hand; the internet had not yet arrived. Anyway, the experiment was a success. We figured the company would save four to five million dollars a year in reduced turnover, by not having to replace a third of their employees every year.

Great success but no take up

The turnover rate went from one third to zero during our experiment, and they saved on facility costs and so forth. So it was a great success. But the company said: we’re not going to do it.

I asked why, and they said: because we’re afraid we’ll get unionised, and we don’t want the union to come by and pick off these satellite offices one by one.

A couple of weeks later I was at a conference of the AFL-CIO labour unions in San Francisco, and one of them said: you know, this telecommuting idea, as you’re calling it, is a terrible idea.

Why is that, I asked? And they said well, if the company’s employees are scattered all over the countryside how will we ever get them organised?

Attracting Fortune 100 firms

I spent the next decade trying to get more support for this from government agencies and so forth. And finally, in the mid-1980s, we were able to attract several Fortune 100 firms to try it out, IBM and AT&T among them.

And again, they had successful projects. But they didn’t want us to say anything about them, because if you’ve got a great idea that works at low cost and produces returns year after year in lower costs and improved productivity, why tell your competitors?

That same tune made me decide to get some public sector organisations to do this, so I could talk about it to people.

So we started recruiting public sector organisations, and finally the state of California climbed on board. We had a project that ran for three years, from 1987 to 1990. Same results: productivity went up, turnover went down, the state would save money, and so forth. It was a great success.

But the next governor turned it down, because he didn’t want people doing strange stuff like working where they couldn’t be seen all the time.

So, same problem: management resistance was the big barrier to making all this happen. We did another experiment with the city of Los Angeles. The mayor supported it, and we had good results, pretty much the same as the state of California. A new mayor came in and stopped the programme.

Same reason: ‘I want people to come into the office, where I can see them. And so forth and so on.’

All the problems we’ve had were in convincing management that this would work for them.

And I kept looking for a magic elixir that would turn this around and finally out of the blue it appeared as something I hadn’t anticipated at all.

COVID-19

And overnight, hundreds of thousands of companies had to get their employees to work at home, because working together they would get COVID and a substantial fraction of them might die of it.

I was in two minds about this: on the one hand I was happy to see teleworking had finally taken off, forcefully as it turned out; on the other, I had some apprehension that most of these companies had no clue how to manage it properly.

I was very worried that we’d see a bunch of failures but, as it turned out, in the years since 2020 companies have quickly adapted to working at home. Many managers still wanted people back in the office and, over the past year or so, the battle has been between the employees, who really like working at home and are willing to go to the office some of the time but not all of the time, and the managers, who want them back full-time but are now finally relenting.

Currently the average rate of people working from home is roughly 30% to 50%, with most working half and half: spending half the time at home and half in the office. That corresponds with a survey of teleworkers around the United States that we did in the year 2000.

 

Chris Winter: The Internet and Web Have Brought Great Benefits but Who Has Been Left Behind?

For the past few years, I’ve been focusing on the people who have been left behind by technology, which is particularly worrying as more and more of our lives depend on digital services: essential services as well as broader ones. And there are a lot of people, probably around 20%, who are left behind.

And for two years I’ve been working with the Digital Poverty Alliance, who have quite a broad definition of digital poverty. I realised that most people were focusing on the poverty part, which I refer to as affordability.

People who can’t afford devices, or cannot afford to pay for broadband or any other form of connectivity, are being left behind.

Digital accessibility

I’d like to give you three facts, which I hope will worry people, just looking at the UK this past year. Every year the DWP [Department for Work and Pensions] publishes statistics on people who are registered disabled, whether the disability is physical or mental, and in 2023 the number rose to 16 million. That will be republished around March or April.

Focusing on just the web aspects of digital accessibility: there’s an organisation I’ve been working with in the US called WebAIM [Web Accessibility in Mind], which performs a service evaluating websites, specifically homepages, against the W3C [World Wide Web Consortium] accessibility guidelines.

96% of websites have errors

They assess a million websites per annum, and at least 96% have non-conformance issues on their homepage. The W3C guidelines were produced initially in 1999, and one of the key instigators was Sir Tim Berners-Lee.

Now, during the last week, what have we seen all over the media about 1999? It was the year the first issues with Horizon IT came to the surface, and now it’s all over the place because of the TV drama.

Not because of all the other, proper channels. Similarly, issues with the accessibility of the web specifically have been known since the 1990s and yet, as I said, 96% of websites have errors, non-conformance issues.

Contrast is the most prolific error

And they’re almost all to do with contrast, contrast being the most prolific of all the error types: just poor colour choice. In our village’s case it was green and white, which is a very common one.

Especially with organisations pushing the sustainability agenda. Green is a sustainable colour. Pale green on white is a disaster for many people.
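
The contrast figure WebAIM measures is precisely defined in the WCAG guidelines. The sketch below is our own minimal rendering of that published formula (the colour values are illustrative): relative luminance is computed from the sRGB components, and the ratio of the lighter to the darker luminance, each offset by 0.05, must reach at least 4.5:1 for body text at the AA level.

    def relative_luminance(rgb: tuple[int, int, int]) -> float:
        # WCAG relative luminance from 0-255 sRGB components.
        def linearise(c: int) -> float:
            s = c / 255
            return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4
        r, g, b = (linearise(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
        # (lighter + 0.05) / (darker + 0.05); WCAG AA expects >= 4.5:1 for body text.
        lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    # Pale green (#90EE90) on white comes out near 1.4:1, far short of 4.5:1.
    print(round(contrast_ratio((144, 238, 144), (255, 255, 255)), 2))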

Of the 16 million disabled people in the UK, people who suffer from colour blindness are not included: it’s not a disability, it’s an impairment. There are 3 million people with colour blindness in the UK and 300 million worldwide. And by the way, there are 1.3 billion disabled people in the world. Just to set the size of the problem in context.

I am not trying to solve the technical problem here. I don’t believe this is a technical problem. There are very good technologies available to provide more accessible digital services.

 

Conclusions

Professor Bill Dutton

“I think the diversity of this group was just incredible. Everybody described how they got into this field, and into studying the history of the internet and IT, from different angles, and it made for a very interesting conversation because of that.

One of the purposes of putting this together was to help us network with other people interested in the history of the internet and IT, and hopefully through this event, and maybe future ones, we can connect people who would not necessarily be expected to be in the same room or to have similar interests. So I thank AIT for doing this.”

 

Professor Jim Norton

I will go back to 1948 when Winston Churchill pointed out to the House of Commons that, and I quote: “Those that fail to learn from history are doomed to repeat it.”

He was paraphrasing, in fact, George Santayana from his book The Life of Reason published in 1905, who noted that those who cannot remember the past are condemned to repeat it.

I think this thought provides the key reason for why the whole Archives of IT project and our conference today are both timely and very important.

There are so many lessons learnt over the past 70 years that later practitioners are in danger of never having known, let alone forgetting. Each generation of computing (mainframe, mini-computer, personal computer, embedded systems, cloud) seems to have to relearn the need for things such as careful systems analysis and planning, let alone the need for the use of, for example, formal methods in software development.

I was there at the beginning of the packet switching revolution that led to Arpanet and BT’s experimental packet switched service, and on to the internet we know today.

Little consideration to security

We gave little if any consideration to security. We were hell-bent on providing maximum access and interoperability. We certainly paid the price for that, in spam and spoofing. We’re in danger of repeating that pattern in the social and societal impacts of information and communication technology.

Our technologies continue to follow an exponential curve of growth in price performance and thus capability. Sensibly managed social diffusion and adoption remains a steadfastly linear process, lagging, I would argue, dramatically.

We didn’t fully comprehend what we were unleashing with social networks and social media. Are we about to face exactly the same challenges with generative machine learning tools such as ChatGPT and Bard?

Great value of Archives of IT

You will understand from this preamble that I firmly believe in the great value of Archives of IT. John [Carrington], Bill [Dutton], Tom [Abram] and the whole team are doing a great job in capturing the lessons and experiences of the past. Even more importantly, they are pioneering dissemination of that knowledge and material into our schools and universities.

Dead expertise, I would argue, is of limited value; it must be put back to work. How many of our children know of the key role the UK has played in the development of ICT? The role, for example, of Dr Donald Davies, who I was honoured to work with at the National Physical Laboratory, who arguably established the concept of packet switching and certainly pioneered its first implementations in this country. Or the role of the Royal Signals and Radar Establishment at Malvern in developing liquid crystal displays.

They may know about the contribution of Tim Berners-Lee to the World Wide Web. But what of his parents? His father, Conway Berners-Lee, helped John Pinkerton develop the Ferranti Mark I, the first commercially built computer. And his mother, Mary Lee Woods, was arguably among the first computer programmers at Ferranti.

I would also argue, by the way, not on script, that as I recall from what we did in the BCS many years ago, at the start of the computing era it was more like 50:50 [the ratio of women to men]; we managed to kill that.

In networking, how many of us know of the pioneering work of JANET, the Joint Academic Network, and its Europe-wide equivalent, GÉANT? Or all the work on data communications protocols carried out in the 1980s by the UK universities, the so-called Coloured Books? I think it’s important that we share role models and we share achievements.

What more does the archive project need to do? How can we make better use of the archive material we have? How can we reach out to schools, colleges and universities? Should we run further forums such as today’s? But as I close, will you please join me in thanking the whole team who have worked so hard to put on today’s forum, and many thanks to you all, both here and online, for joining us.
