
Human Computer Interaction: The UK’s distinctive contribution and lessons for our future

The iPhone is a remarkable example of how HCI and technology combined to produce something that people love and find easy to use

AIT has published an in-depth report on the history of Human-Computer Interaction (HCI). From Punch Cards to Brain Computer Interfaces: 75 Years of Human Computer Interaction and its Impact on Society describes how HCI evolved from the early years of computing in the 1950s, when the subject was dominated by a small number of mainframes, to today’s world where we talk about Graphical User Interfaces and interactive displays used by billions of people.

The feature was commissioned to develop the story of HCI by leveraging AIT’s unique knowledge base, reviewing our archive of interviews produced since 2015 and conducting six new interviews between late 2022 and early 2023. Together they now tell a more comprehensive, up-to-date and personal account of this life-changing aspect of social and industrial history.

The research was conducted by Dr Elisabetta Mori and has a particular emphasis on the UK’s distinctive contribution to the subject.

We are very grateful to the Worshipful Company of Information Technologists for funding this work in line with its aim of providing leadership for the IT industry. 

Reading about the past made those of us who have been working closely with Dr Mori think about the future. We have been struck, once again, by how the lessons of history are so often relevant going forward: HCI has a key part to play in building a future that is more democratic, is based on the total human experience and caters for all genders, backgrounds, and abilities – but only if we do it right.

The early days of mainframes

The Manchester Baby recreated. Science Museum Group Collection (c) The Board of Trustees of the Science Museum (CC BY-NC-SA 4.0)

HCI has its roots in research work going back to the 1950s, when computers were huge machines used and operated by a small number of specialists. Its focus was on how to enter data and extract results as efficiently as possible. HCI evolved with technology over the following decades, and indeed played a crucial part in moving us to the age of the PC and the internet, with the invention and adoption of such key things as the mouse and touch screen.

In the past 25 years the internet has, of course, changed the world fundamentally and irreversibly. There are now thought to be perhaps 2 billion personal computer users and well over 7 billion mobile phone users in the world, and HCI is now far more concerned with how we as humans manage these digital devices than with how a relative handful of specialists spent their days.

And in the not-too-distant future HCI will be concerned with how we manage our way through what has been termed by some the ‘Fourth Industrial Revolution’ (4IR), as new and emerging technologies such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3-D printing, nanotechnology, biotechnology, materials science, energy storage, and quantum computing become pervasive and embedded. We might all of us be able to do anything with anyone, anywhere – things like collaborative surgery or interactive arts involving real-time sight, sound, feeling and thinking. We will all be impacted by this – indeed, some of the devices envisaged will be placed on and within our bodies.

Deliberate and inclusive design

Smart watches encapsulate decades of research into HCI: they can monitor our heart rate, blood oxygen level, blood pressure and temperature, and facilitate contactless payment, messaging and calling

We all know that technology can bring huge benefits to society, but a key lesson from Dr Mori’s research is that achieving this requires deliberate and inclusive design, development and testing of the human user experience. We hear the phrase the ‘digital divide’, and it refers to the fact that although digital devices are appealing to many, not everyone wants to use them and not everyone finds them easy to get on with.

Professor Alan Dix of Swansea University was interviewed by Dr Mori for her article, and he puts it like this: “…if you want to buy an airline ticket or fill up your tax return it is difficult not to use a computer. We have made a world now, where to be a citizen of the 21st century you need to be digitally connected.” For this reason, Professor Dix goes on to say: “Digital technology on its own deepens the existing social divisions.”

Professor Alan Newell, Emeritus Professor at Dundee University, was also interviewed: “[IT] certainly has made a lot of difference to a number of disabled people, but it also made it more difficult for some categories of disabled people to take part in society.” Professor Newell said: “When Windows appeared, blind people who had been able to use computers until that point were not able to use them anymore, because they could not see the screen, whereas they were quite capable of using command line interpretation. This is one of the examples where allegedly a move forward has put a number of people at disadvantage.”

A different perspective on the patchy adoption of technology is offered by two more interviewees. Professor Harold Thimbleby of Swansea University talked about the iPhone: “The iPhone was (and still is) a remarkable piece of technology and engineering, but key to its success was that its user interface was an immediate hit with the general public. All of a sudden, people wanted to have one and use one.”

But ex-BBC technology correspondent, Rory Cellan-Jones, said: “I was very excited when I saw the demo of Google’s Google Glass, this augmented reality headset, and eventually managed to persuade the BBC to get me one. [I] then wore the thing for three months solid, wherever I went, while my colleagues, my friends, my family, told me I looked like an idiot. And eventually I realised I did look like an idiot, and stopped wearing it, and that actually betrayed a bigger truth about the product, which is, however clever the technology, the look and feel of it is incredibly important to its eventual success.”

Future technologies

Healthcare is a major area where HCI could contribute positive changes in the future

What does this tell us about the likely success of our future technologies? Professor Thimbleby points out that reaping the full benefits of new technology will require a lot to happen behind the scenes. Referring particularly to the healthcare setting he said it would involve “changing the culture in the hospitals to rationalise it and re-engineer it and that’s really difficult … If you just computerise the NHS or healthcare, you end up with the mess going faster.”

In general, this tells us that for the industrial benefits of 4IR-like technology to be fully realised we need process and culture change. These changes will be major, could well be socially divisive, and will need a lot of focus. But what of the end-user benefits of this technology – being able to do anything with anyone, anywhere? We already have a digital divide in which swathes of society globally cannot and do not want to use today’s technology, so how are we going to ensure this doesn’t get worse?

Rory Cellan-Jones’ point is surely germane: “… the look and feel of it is incredibly important to its eventual success.” This is not a new issue. Dianne Murray, a hugely experienced usability and interface design consultant, recalled in an interview with AIT how, before the 1970s, “the software engineers who did testing but didn’t do anything like usability testing, didn’t have concepts of what the user actually did, or required”.

Learning from history

There is always a temptation to get the latest tech out there as soon as it’s working, but without an interface that is relevant and appropriate to millions, or even billions, of people the tech will remain niche, the province of wealthy aficionados. There could be many reasons why something is launched early – maybe the desired interface is not yet achievable, or maybe it’s essential to get an early return on an investment – but we need an understanding by the key global players that the job isn’t done until their tech is actually usable by billions.

People want to use and be seen to use tomorrow’s technology, but it needs to fit their lifestyle requirements. They need to love it, not just wonder at it. Technological improvements will continue at an enormous rate, and without an empathetic understanding of how to meet the needs of the human condition – whether in health, education, entertainment, fashion, media, law and so on – technology will fail to deliver its full potential to make life better for everyone all around the world.

Maybe we should change the terminology from “HCI” to “HCE” – the Human Computer Experience?