At Archives of IT’s recent event exploring 75 years of Human-Computer Interaction (HCI), an expert panel said advances in human user interface design were now being overlooked as the latest technology was churned out without being properly tested.
The event was a response to an in-depth report commissioned by Archives of IT (AIT) and funded by the Worshipful Company of Information Technologists (WCIT). From Punch Cards to Brain Computer Interfaces: 75 Years of Human-Computer Interaction and its Impact on Society was published in September 2023 and examined the contribution UK pioneers have made to the development of HCI.
Led by Dr Elisabetta Mori, the report reviewed past interviews by AIT with leading figures in the field as well as interviews conducted by Dr Mori as part of the project to provide a more comprehensive, up-to-date overview of the history of HCI in the UK. The HCI event, held at the WCIT’s Livery Hall in Central London and online, was an opportunity for a wider audience to discuss the contents of Dr Mori’s research following a summary by AIT CEO, Tola Sargeant.
The panel included Professor Leela Damodaran, Professor Emerita of Digital Inclusion and Participation at Loughborough University; Chris Winter, Advocate for the Digital Poverty Alliance; and Paul Excell, WCIT member and IT entrepreneur. Also in the audience were project interviewees Professor Ernest Edmonds, Emeritus Professor of Computational Art, Leicester Media School, De Montfort University, and Professor Harold Thimbleby, Emeritus Professor of Computer Science, Swansea University.

Professors Damodaran, Thimbleby and Edmonds had all worked together at Loughborough University in the pioneering HUSAT (Human Sciences and Advanced Technology) Research Institute, founded in 1970 by Brian Shackel, which played a crucial role in the development of human-computer interaction. They hoped the report would be a stimulus to revive the visionary thinking they were part of from the 1970s to the 2000s, with less of today’s profit-making excess and more careful attention to human-centred design that would make technology accessible to far more people, especially the elderly, disabled and poor.
The panel was joined online by HCI project interviewees Dianne Murray, a usability and interface design consultant, and Linda Macaulay, Emeritus Professor of System Design at the University of Manchester, as well as students from as far afield as the Islamic University of Malaysia’s Faculty of ICT.
The panel and audience considered the following questions:
– Do we recognise the digital divide here in the UK or elsewhere?
– How can we guide the safer, more ethical, enriching deployment of HCI?
– What can we learn from our pioneers to improve society for today and tomorrow?
AIT Video – Introduction to the Human-Computer Interaction project
1951 – The Lyons Electronic Office (LEO) becomes the world’s first business computer, developed by J. Lyons & Co
Mid-1950s – UK begins involvement in the development of HCI as a discipline
1959 – Professor Brian Shackel’s article, Ergonomics for a Computer was published, which was one of the first papers on HCI
1960s-1970s – Microcomputers, microprocessors and graphical user interfaces are developed
1970 – Professor Brian Shackel launches the Human Sciences and Advanced Technology (HUSAT) Research Group at Loughborough University. Over the next three decades it played a prominent role in the development of human-computer interaction
1979 – British computer scientist and author, Christopher Evans, publishes his book The Mighty Micro: The Impact of the Computer Revolution in which he predicted the arrival of ubiquitous computing
1983 – The UK government launches the Alvey Programme to research information technology, focusing on Advanced Microelectronics, Artificial Intelligence, Software Engineering and Man-Machine Interaction (including Natural Language Processing)
1993 – The World Wide Web opens to the public and the Internet accelerates the expansion of computer users, which leads industry and academia to consider the needs of disabled users for the first time; the Disability Discrimination Act 1995 comes into force
2007 – The launch of the iPhone heralds the rise of smartphones, the rapid miniaturisation of other devices, touch screens and gesture-based input, and the development of augmented and virtual reality. HCI is now truly a global concern, and both the UK Government and BCS launch initiatives to try to ensure that disadvantaged users aren’t left behind
2010s – Huge growth in social media companies such as Facebook, which is generating revenues of nearly 70 billion dollars by the end of the decade. As more services become available via the internet, in some cases only via the internet, the phrase ‘digital divide’ emerges
2020s – Advancements in human machine symbiosis technologies such as GenAI, robotics and autonomous vehicles, which promise to redefine human-computer interaction once again
In her summary of Dr Mori’s work, Tola Sargeant said that in preparation for the event she had asked ChatGPT how it would impact the evolution of HCI, and it gave her a lengthy response about how large language models are making technology more intuitive, natural and human-like. These and other developments, it said, mean technology will be increasingly immersive and all-pervasive as we move into the future. Instead of operating computers, humans and computers will have an inextricably linked existence.
To begin the discussion, she then asked the panel what HCI meant to them and to highlight some of the things we could learn from the past 75 years that might inform the future.
HUSAT team driven by passion to make life better
Professor Leela Damodaran began by saying how fortunate she had been to be one of the ten founder members of Professor Shackel’s HUSAT Group at Loughborough University in 1970, and how the team were driven by passion and a belief that computer technology had vast potential to make life better for everybody.
“That was the rationale for the experimentation and the initial projects we worked on,” she said. “It started off as a group, and during the 30 years it survived, there were a lot of important developments that saw us move from working on the basics of interface design and usability, to being commissioned by companies such as Plessey, IBM and ICL, to investigate and provide guidance on interface design.
“And one of my questions is whether there’s that rationale and demand coming from the technologists now? I’m not sure about this. Our family acquired a Tesla about a year ago and I’m forever infuriated that many of the things we were researching 20 years ago are not part of the design we’re experiencing. There was a lot of research on tactile controls, for example; they’re not in the Tesla.
Going backwards in human factors
“So, there is the feeling that we are going backwards in human factors and I think our hope, and this would be endorsed by Dr Mori, is that her research is a reminder of where we’ve come from and what has happened in the past 75 years will be a stimulus to get some of that visionary thinking back, some of the passion to make technology accessible to far more people.”
Paul Excell, who on behalf of the WCIT chairs an organisation called AI4C (AI and Machine Learning Exchange for Charities), which has 60 different charities deploying AI for good, said one of the main strengths of the report was the human stories from the HCI pioneers and what they had learnt from their endeavours. He was particularly impressed by Robin Christopherson MBE, Head of Digital Inclusion at AbilityNet, who is widely quoted in Dr Mori’s HCI report, and how he talked about the ways technology can improve our lives.
Legislation and the Disability Discrimination Act 1995
“What came out of Dr Mori’s report around legislation is that Robin Christopherson said the Disability Discrimination Act 1995 drove people to enforce action because it suddenly became a legal requirement,” he said. “If I had a wish, I would hire the very best men and women on the planet, pay them well, and get them to focus on regulatory safeguarding: producing solutions that enable entrepreneurs and innovators to get on with their work, but to do it safely, to protect us as a society moving forward.
“In my charity’s most recent meeting we had the Royal National Institute of Blind People talking about all their engagements with technology to help, facilitate and improve the lives of people with visual impairments. That’s why this work is important. And I liked what Professor Alan Newell said in the HCI video about us tending to design for ordinary users, when what he was interested in was the extraordinary users, a positive way of looking at disabled people. These are people with superpowers, and if we don’t include them in classic user design by putting the human at the centre then I think we’ve missed a trick.”
Modern audience vastly different to early days of HCI
Chris Winter said that, revisiting the report as someone who had been in the technology industry since the 1960s working with companies such as IBM, one of the key considerations for him was that the audience who interact with computers is now vastly different.
“It used to be people like me working with paper tape, and I was so excited when I switched to punch cards. What more could you want? But now people interact and they depend on technology so much. I work with a charity called the Digital Poverty Alliance and they are really focused on the people who are being left behind. A lot of the focus is on people who can’t afford devices, can’t afford connectivity and don’t have the skills.
“And the particular area I focus on is disabled people who cannot interact with something that’s not accessible. There are 16 million registered disabled people in the UK, 24% of the population, with various disabilities and 1.3 billion worldwide.
“And our industry continues to ship systems that are not accessible or not fully accessible to many of those people. The problem has been there and been recognised since the 1990s, very much with the launch of the web. Soon after, accessibility challenges were identified with Google and [web founder] Tim Berners-Lee, and it’s still an issue.”
One of the largest audiences interacting with computers today is gamers, with the latest figures suggesting there are 3.2 billion active video gamers in 2025. Simon Rae, who was attending online, wanted to ask the experts in the room whether it was possible to draw direct links from the Alvey Programme’s work in HCI to the current success of the UK’s video game industry.
Professor Edmonds said he would say no, because he thought the benefits of the Alvey Programme were not technological in that sense. “They were to do with bringing people together, getting industry working with academic researchers and also moving academic researchers into industry. So, I think there was a lot of work done that changed the climate of the IT world in the UK, and that was much more important than any technological advance. So many of the people who worked for me in academia under the Alvey remit ended up working in the IT industry. So, it was like a recruitment process.”
Tola Sargeant then asked if bringing disciplines together was something we need today and whether we had lost the ability to share information between disciplines.
Loss of vision from today’s government, which thinks tech is magic
Professor Damodaran said she felt we had lost any kind of vision from government that went above and beyond profit, boosting different aspects of the economy or growth in particular areas; what has been lost is the really big picture of how we can make this something that transforms everybody’s lives and enhances life across the world.
Professor Thimbleby said he thought it was much worse than that. “The government thinks technology is magic and it’s easy. They think you just need to buy AI and the solutions to our problems will be everywhere. And you mentioned the lack of human factors in the Tesla. I mean, the government doesn’t even know human factors are relevant to AI. There are two points I’d make. One is, HCI is not just about human factors; it’s humans and computers. So, there’s computational thinking and you put the two together. And then there’s something really interesting going on in computer science and human factors that’s making both completely different from everything else.
“If you think of physics, chemistry, biology – I’m an ex-physicist and we study the world – in computing the world’s changing the whole time. Somebody invents Teslas, or driverless cars, or AI, and it’s like trying to understand the world with the ground shifting under our feet. And everybody else who’s a consumer, such as the government, just thinks it’s wonderful. They don’t realise we’re struggling to manage the complexity of these systems. And we produce Teslas, and even trivial things like parking machines, that are an absolute mess, because the people who designed them know what they’re doing, so it’s easy for them, but they don’t seem to realise the user doesn’t have the privileged information of how they built it. So, it’s a nightmare for the user.”
Crucial to investigate user needs
Professor Damodaran replied by saying that we had forgotten that the crucial thing is to investigate user needs in a detailed way, looking not just at what the user does with their hands or their eyes and how they physically interact, but at the wider issue: should they trust technology?
“Can they trust it? What are the implications of it down the line? Who else is going to be involved in this? The bigger picture is not there. I hope I’m proved wrong by those who are working in HCI now. However, it seems to be the case that it is just assumed that more, better and faster is a good thing, without thinking about what higher order of activity is this going to change or contribute to.”
Professor Thimbleby said he was baffled by this approach: “Because more, better, faster would apply, say to cars, but you’d also know you want them to be safe, reliable and low energy and a whole raft of criteria. Where are those criteria for computer systems? We’re just swamped with excitement.”
Paul Excell said the fundamentals of HCI should be good user design and rigorous testing. “All the successful tech entrepreneurs and investments I’ve ever seen or made have spent so much time trying to understand the user experience. So, they really understand the pain points and can design a solution that takes on board everybody’s user needs. They try to pre-empt these user needs and design something that is, for example, good, solid, safe software. And it seems to me that some of those fundamental standards just aren’t being adhered to.”
IT industry failing to make applications accessible
Chris Winter said there was a disconnect between the various businesses that work in the IT industry: “I started out in the computer industry in the ‘60s, and now people refer to the IT industry and others to Big Tech. And I think they’re separate industries, and I think part of the issue is that the big tech providers, the Microsofts, the Apples, etc, provide fabulous accessible machines, and then that accessibility is not built into the business applications produced by the IT industry, which applies that technology.
“So, we’re not applying the technology that exists. And the other thing that’s really shocked me is that the public sector is in a better position globally than the private sector from an accessibility point of view. Why is that? Regulation. So, this year the EU’s legislation that’s been 10 years in gestation becomes law. So if you’re trading, if you want to operate in the European Union, you need to comply with it.
“Strangely enough, the UK is kind of ahead of the pack, because the public sector, knowing that this EU legislation was coming out came up with something called PSBAR [Public Sector Bodies Accessibility Regulations], which has been there since 2018.”
Paul Finch, WCIT warden and chair of the charity Community Tech Hub, wanted to know whether current HCI technologists were able to rebuild trust and enthusiasm among users such as the elderly: “I run an organisation that tries to keep the retired safe from tech fraud and online scams. But one of the things I’m witnessing is a reduction in the over-65s’ desire to use tech; their enthusiasm is reducing as each new technical evolution brings out a new set of gadgets. So, can re-establishing the trust of those who are not so technically proficient be achieved with innovative thinking through HCI?”
Professor Damodaran said the failure to address this had consequences for the productivity of society: “This is a massive issue. One of the things that we have throughout society is that everybody over the age of about 60 will gradually develop some level of microvascular disease. So, a lot of us in this room have it to some degree, but it means that our population is affected. It seems bizarre that as a society we’re designing systems that cannot work for those people, or they will have limited capability to use many of the features that we’re developing now. I’m a septuagenarian and there are often things that I want to do or use but it’s online only and I find it increasingly hard to do that. And when you complain you’re told to ask your children, your grandchildren, a friendly neighbour, a friend. And there is a madness about this.
“Our government, everybody’s governments are worried about reducing productivity. So, we’re actually telling people to make sure that they occupy the time of increasing numbers of the most productive in society, the people who should be the most productive in society, the IT able, etc with addressing this problem. They can’t just be getting on with their jobs and competitiveness and everything else in society and the economy, because they’re propping up their grannies, their parents, their elderly aunts to compensate for bad design.”
Creating an environment that encourages people to engage
Paul Excell said it was the case that many older people were very lonely, and we ought to be focusing on how we create good designs which not only give a good return for investors but somehow create an environment that encourages people to use services seamlessly and engage.
“It seems to me, without playing to the audience, that we’re losing all that wealth of experience if you don’t provide connectivity and good design. Again, if you get the design of the system right, you can find that a large majority of the conversations can be human to machine and the user can benefit from it. It also strikes me, and I’m not a big gamer, that if you ask particularly young people, and I know we’re talking about different demographics of all socio-economic backgrounds, they will say they can get on a new game and quickly adapt to it because it plays to their interests and strengths. So, I wonder if there’s the equivalent [for older generations] of how you make that incentive to want to use technology because it’s fun and easy to use. So, if you can gamify it or have some sort of value proposition for a person, I think they’re just going to want to use it.”
Mark Jones, an adviser for AIT who worked with Dr Mori on the HCI report, asked: “As individual users, why do we accept second-rate tech, including second-rate interfaces? And what can we do, apart from spending lots of money, to sort this out?”
Professor Thimbleby replied by saying he might sound cynical, but that businesses make a lot of money out of this scenario and there’s a huge pressure on users to buy new devices and to upgrade.
“There’s this feeling that we’re to blame and it’s our problem. So let me give an example. John is sitting here and his chair doesn’t work. That’s not because John is chair illiterate. It’s because it’s a bad chair. Now, why do we talk about us being IT literate or unable to cope with technology when it’s a failure of human factors and general design and user interface design in the technology?
Manufacturers churn out stuff without testing it
“The manufacturers just churn out stuff without testing it and developing it for the target users and expecting us to pay for it. And when it’s not satisfactory, it’s a dopamine shot or something, we go and buy some more of the wretched stuff. It’s deeply embedded in our society. If you buy a computer system it will have a warranty that says the manufacturer is not liable for anything, basically. Whereas if you buy a washing machine or a car and the wheels drop off, or whatever, you can go back on the warranty and say, fix it. If you buy some software, a bad user experience, it’s your own fault, because you have signed up, as it were, to that. We accept that it’s our own fault and it isn’t.
“And here’s a suggestion, Paul [Excell] said earlier that we need highly qualified people. I think we need to step back. We need the certification, so we know who those highly qualified people are. And then we need a law that says, we need certified people to build these things who know human factors, user interface design, computer science and the rest of it, like simple bits of technology such as the sockets on the wall are wired up by people who are competent electricians.
“We could build a robot that puts in the sockets, and we don’t have to be qualified to build a robot. But yet it can do all the jobs that qualified people have to do. It’s bizarre.”
Tola Sargeant mentioned that in AIT’s Forum on Norms for the Digital Age there was a panel on professionalism in IT that addressed this problem with the outgoing President of the BCS, Alastair Revell, advocating for a chartered system for information technologists.
Chris Winter said a further self-inflicted problem was the complexity of the systems that we interact with. “And a lot of the complexity is what I refer to as induced. We add complexity where it’s not required when we should be taking it away, making the interface simpler. Going back to what Harold just said, to sit on a chair is straightforward, but it’s not necessarily straightforward how to use a business application to file your tax return.”
Professor Damodaran said she agreed with the complexity issue but felt things were worse than that. “I think we’re developing the characteristics of a people in a totalitarian state, where things are so bad that to preserve your own mental health you say it’s all fine, or you just get on with it and use it. I’m increasingly hearing people say that about digitalisation in general, ‘well, it’s the way of things’, such as automated tills at supermarkets, when we’re told it’s all about the people. Well, that’s fine only if we all roll over and say ‘yes, it’s fine’.”
Professor Thimbleby characterised the current situation with HCI and technology as a turning point as grave as that of the years between the end of the Second World War and the Universal Declaration of Human Rights in 1948.
“After the Second World War we realised nations have problems, so we introduced the Declaration of Human Rights. And now we’re selling our lives to computer systems and NHS data disappears into America. It’s nothing to do with states. It’s to do with businesses, and we do not understand how to manage businesses and the way they’re giving us mental health problems. It’s a bigger issue than the Declaration of Human Rights. There’s a human right to be able to not have mental health issues and get cross with your computer.”
Professor Edmonds agreed that we need to look at these systems so they benefit humans: “One thing to consider, which is behind everything we’ve said, is the notion of considering total systems. So many of the problems we have are concerns with computer systems, which should be created in a way that is good in whatever way they’re meant to be.
“Does it solve the problem that we’re trying to deal with? Does it help human beings in the way that they want to be helped? We need to look at the total system, which includes human beings.”
Computers were invented too soon
Professor Thimbleby said that during his PhD in the 1970s he interviewed Joseph Weizenbaum, the MIT computer scientist and professor and author of Computer Power and Human Reason, who said: ‘computers were invented too soon’.
“Maybe he was right. The world is getting complicated. Tax and social security is a mess and some idiot invented computers, and now they can become even messier. So, we can’t be bothered to sort out the mess. We’ve just computerised it.”
To conclude, Professor Damodaran reminded the audience of the still pertinent words of Norman Cousins, the American political journalist, author, professor and world peace advocate, in his essay The Poet and the Computer, published in 1966.
“The question persists, and indeed grows, whether the computer will make it easier or harder for human beings to know who they really are, to identify their real problems, to respond more fully to beauty, to place adequate value on life, and to make their world safer than it is now.”
The answer is that the computer has the potential to do so, but that potential is being hampered by what the panel called the excitement factor: the never-ending supply of new devices, new apps and system updates, and the will to build ‘more, better and faster’ rather than adequate systems that are accessible and can be used by everyone.