Alan Newell is Emeritus Professor at Dundee University. He has spent over forty years conducting research in Human Computer Interaction, primarily into supporting elderly and disabled people. He founded and headed Dundee University’s School of Computing, and later set up the Queen Mother Research Centre, an important academic group researching digital systems for older and disabled people. In 2000 he was awarded an MBE for services to IT and communication for people with disabilities.
Prof Newell wrote “Design and the Digital Divide”, published in 2011 by Morgan and Claypool. He is a Fellow of the British Computer Society, a Fellow of the Royal Society of Edinburgh, and an Honorary Fellow of the Royal College of Speech and Language Therapists. He was named an ACM Fellow in 2006 for his contribution to computer-based systems for people with disabilities. In 2011 he was awarded the CHI Social Impact Award, and in 2012 he was appointed to the ACM CHI Academy.
Interview conducted by Dr Elisabetta Mori on 24 January 2023 via Zoom.
Early Life and Education
Alan Newell was born in 1941 in Birmingham. His father was a motor engineer and his mother was a milliner. He had an older brother. Alan attended his local primary school and won a scholarship to attend a grammar school, where he focussed on engineering and science. He was awarded a scholarship to attend Birmingham University to study electrical engineering. Having completed his degree in 1962, he went on to complete a PhD in the electrical engineering department. Alan says of his topic: “It was on the subjective response to patterns. I looked at the ways human beings discriminated certain patterns, visually. It involved having to understand some experimental psychology, so that’s how I moved from being a straightforward engineer to an engineer/experimental psychologist. It was exactly the right background for someone who ended up in human-computer interaction.”
Standard Telephones and Cables
Having gained his PhD, Alan became a research engineer at Standard Telephones and Cables, which had ambitions to make speech recognition machines. His work involved the use of a Digital Equipment PDP-8, which he describes as “one of the early laboratory computers which had 4,000 words of storage.” While working for Standard Telephones and Cables, Alan developed VOTEM. He explains: “I didn’t believe at that stage that speech recognition would be capable of being built in the near future. However, I realised that the one area where it could be useful was by disabled people who had no manual dexterity. I started looking at how we could develop a computer system that worked on voice. That’s why I developed VOTEM, which stands for Voice Operated Typewriter Employing Morse Code, because the user spoke Morse Code at it.”
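The interview does not describe how VOTEM worked internally. Purely as an illustrative sketch of the general idea of turning spoken Morse tokens into text, a minimal Python version might look like the following; the token format (“dit”, “dah”, “pause”) and every name in it are assumptions made for illustration, not details of the original device.

```python
# Illustrative sketch only: decoding spoken Morse tokens into text.
# The token format ("dit"/"dah", with "pause" between letters) and all
# names here are assumptions for illustration, not details of VOTEM.

MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_spoken_morse(tokens):
    """Turn a stream of spoken tokens into text.

    'dit' -> dot, 'dah' -> dash, 'pause' -> end of the current letter.
    """
    text, current = [], ""
    for token in tokens + ["pause"]:          # trailing pause flushes the last letter
        if token == "dit":
            current += "."
        elif token == "dah":
            current += "-"
        elif token == "pause" and current:
            text.append(MORSE_TO_CHAR.get(current, "?"))
            current = ""
    return "".join(text)

# Example: the spoken sequence for "HI"
print(decode_spoken_morse(
    ["dit", "dit", "dit", "dit", "pause", "dit", "dit"]))   # -> "HI"
```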
University of Southampton
In 1970, feeling that he did not really fit with industrial research, Alan returned to academia and became a lecturer at the University of Southampton. It was here that his team developed the Talking Brooch: a small rolling alphanumeric display worn on the lapel and operated via a handheld keyboard. He says of it: “It was for people who couldn’t speak. Up till that point, non-speaking people could use teleprinters or typewriters to communicate but they didn’t look at each other, they looked at what was being typed. The idea of the Talking Brooch was to retain eye contact, which I thought was important.”

The Talking Brooch was among the factors that saw Alan awarded a Winston Churchill Travel Fellowship in 1976. He took what he describes as “a grand tour of the States”, where he met fellow researchers working in the field, including Penny Parnes from Toronto, Arlene Kratt from New York, and Rick Foulds from Boston. Alan also demonstrated the Talking Brooch to a Member of Parliament who visited the department in Southampton. It was suggested that the brooch would be of use to Jack Ashley, an MP who had lost his hearing. Having conferred with Jack, Alan realised that the brooch was not the right thing for him, because he needed a verbatim transcript of what was said. He explains what happened next: “At that stage, because of my research in speech recognition, I knew about machine shorthand and I decided that what Jack needed was a Palantypist who would use machine shorthand to type verbatim what was said in the House of Commons and Jack would be able to read it.

“We went back to Southampton and with Andrew Downton, John Arnott, Colin Brookes and others we developed various systems to do that. It was then installed in the House of Commons. I’m sure it was the first electronic system to be installed in the chamber of the House of Commons and when it became computerised, it was the first computer system to be used in the Chamber. Jack Ashley used to have a screen in front of him and the Palantypist sat in the foreign press gallery. He used it for a number of years. Colin Brookes moved to Possum Controls, where they marketed a system based on these ideas. My group also worked on TV subtitling for the deaf and produced a system to achieve this, which was developed and marketed by my colleague Andrew Lambourn.”
Dundee University
In 1980, Alan was appointed a Professor at Dundee University, where he remained until his retirement. During his time at Dundee, he has carried out numerous research projects on computer support for people with disabilities, with a particular focus on the development of augmentative and alternative communication systems for non-speaking people. Alan says: “One of the aspects we concentrated on was that most of the devices available at the time focussed on the needs of disabled people, such as whether they were hungry, thirsty, and so on. It’s not actually easy, however, to have a conversation and make friends with somebody when all you can do is to tell them you’re hungry. So, I was more interested in devices which would enable people to have more normal conversations.

“One of the first devices that Norman Alm, my colleague, developed was called CHAT, which enabled a user to have a conversation by very easily picking out from a number of stored general phrases. For example, they could start off the conversation by choosing from a number of ways of saying hello. In most currently available systems they would just type H-E-L-L-O, but our system stored various ways of saying ‘Hello’, and once they had said ‘Hello’ they would expect the other person to say ‘Hello’, so the user would then be presented with a number of messages that they had developed in the past which were follow-up remarks to ‘Hello’. At the end of the conversation it would provide the user with phrases which implied ‘Goodbye’, ‘Nice seeing you’ and so on. With a single keystroke, this system enabled the user to say polite statements (or even impolite statements if they wanted to), and to socialise more easily. With my colleague Annalu Waller, we took it further and started looking at the sorts of conversations you might have, for example, if you’re in a restaurant.”

Based on his research, Alan published his first book Design and the Digital Divide in 2011, which was essentially a research autobiography. Explaining the digital divide, Alan says: “It’s the fact that a significant proportion of the population are not able to easily use modern technology and therefore there’s a divide between those who can use technology and those who can’t. This has got worse and worse as technology has got better, so it tends to make it so difficult that people can’t use it and that applies particularly with older people.”

Alan’s research also suggested a shift from “user-centred design” to “user-sensitive inclusive design”. He explains: “The idea was that you should be sensitive to the user’s needs, rather than centred on the user, and that it’s also inclusive. It includes the whole range of human beings, not just the young eighteen-year-old, physically fit, mentally capable person that much technology is designed for.”

He also developed the concept of “ordinary and extraordinary human-computer interaction”. Alan says: “Engineers tend to design for what could be called ‘ordinary users’ and I’m more interested in ‘extraordinary users’. It’s a positive way of looking at people with disabilities. The other concept is that an ordinary person in an extraordinary situation, such as a pilot flying a high-performance aircraft, is in the same position as an extraordinary person (a disabled person) in an ordinary situation (e.g. using a typewriter), because they’re both constrained by the Human Computer Interaction (HCI) bandwidth between the person and the machine. So we can learn a lot about HCI from looking at disabled users.”
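The abstract describes CHAT only in outline: a bank of pre-stored phrases grouped by conversation stage, so that a single selection produces a whole social utterance. A minimal sketch of that idea in Python might look like the following; the stage names, phrases and selection mechanism are invented for illustration and are not taken from the original system.

```python
import random

# Illustrative sketch of a CHAT-style phrase bank: pre-stored utterances
# grouped by conversation stage, so that one selection produces a whole
# social remark. The stages and phrases are invented for illustration.
PHRASE_BANK = {
    "greeting":  ["Hello, lovely to see you!", "Hi there, how are things?"],
    "follow_up": ["How was your weekend?", "Any news since we last spoke?"],
    "farewell":  ["Nice seeing you, take care.", "Goodbye, see you soon."],
}

STAGES = ["greeting", "follow_up", "farewell"]

def offer_phrases(stage):
    """Return the stored options for the current conversation stage."""
    return PHRASE_BANK[stage]

def speak(stage, choice=0):
    """One keystroke (here, an index) selects and 'speaks' a whole phrase."""
    phrase = PHRASE_BANK[stage][choice]
    print(phrase)   # a real device would route this to speech output
    return phrase

# A short conversation: each turn costs the user a single selection.
for stage in STAGES:
    speak(stage, choice=random.randrange(len(PHRASE_BANK[stage])))
```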
The Queen Mother Research Centre
In the nineties, Alan founded the Queen Mother Research Centre in Dundee. He explains: “I was conscious that there wasn’t a great deal of research done on the problems particularly older people had with technology. In contrast to “disabled” people, who tended to be perceived as having one major disability, older people generally have multiple minor disabilities which create many HCI challenges. It therefore seemed to me important to start researching in depth in that area. I thought it would be good to name the centre the Queen Mother Centre, and the Dowager Countess of Strathmore, with whom we already had contact, helped to arrange for us to have permission to use that name.”
Theatre in research
Alan introduced the concept of using theatre as a research tool, having seen his wife’s theatre company use actors to discuss sensitive subjects and encourage an audience to discuss issues. He says: “It seemed to me that that was exactly what was needed for HCI research. We needed to raise awareness of the issues, by showing users not being able to use computers and discussing with them why not. However, it’s not too easy to do that with disabled or older people, because they may think that you’re getting at them; whereas, if we used actors, we could write plays which demonstrated typical problems and then the audience would be able to talk to the actors about problems they had, but there was no embarrassment because it was an actor. There were many research benefits to this approach, and we have shown that at a number of international conferences. It went down very well with the audience; they were able to ask questions which I don’t think they would have thought about before the event.”
The evolution and future of human-computer interaction
Describing the main milestones in the evolution of HCI since the 1950s, Alan says: “From my perspective one of the main milestones was when Windows and ‘What You See Is What You Get’ appeared. Blind people, who’d been able to use computers until that point by using command line interpretation, were not able to use it any more because they couldn’t see the screen. That’s one of the examples of where an alleged move forward put a number of people at a disadvantage for a number of years.

“What’s been happening more recently is that manufacturers seem to think that they’ve got to have something new. For example, I, being an older person, find that whenever things are working in such a way as I can understand them, then the designers change them. It confuses me and many other older people, as do some of the various new metaphors which designers have introduced more recently.

“As computers have got more complicated, human-computer interaction should become clearer because they’ve got so much power behind them, but instead many designers seem to make things more complicated for the user.”

Looking to the future and the digital divide, Alan adds: “It’s not got any easier over the past few years in terms of the divide between older people and ordinary people and I see nothing that makes me think that it will get better.” While Alan thinks IT has improved, and may in the future improve, life for some disabled people, enabling them to do things they couldn’t do before, he also feels it has made it more difficult for some categories of disabled people to take part in society. He says: “It’s now often impossible to talk to anyone on the telephone, for example. If you have a query, you have to go via the web. If you haven’t got internet access, or cannot use a computer, that’s you without a bank account, etc. It would be fairly easy to make it so that the things that you provide were available to disabled people.”
Proudest Achievements
Alan’s proudest achievements include the introduction of the devices into the House of Commons, and the use of theatre in HCI research. He adds: “The proudest thing I wanted to achieve was to get the general population of HCI users to be aware that there were a great number of people that they weren’t addressing. Whether I’ve achieved that, I’m not absolutely sure.”
British contribution to HCI
He says the work done by British researchers in HCI is more focused on the sociological aspects of computers than on the technology, adding: “British researchers in particular have raised the awareness that human-computer interaction is about people, not about computers.”
Alvey Programme
In the 1980s, Alan was part of the Alvey Programme. He says: “It was to develop a speech recognition machine. The ideal was what was called a listening typewriter. My concern was that a listening typewriter would not be particularly effective. When the Alvey Programme was launched, I suggested that I should use my Palantype transcription system as a ‘working’ speech recognition system, so we could find out how real human beings managed when they’re trying to use a talking typewriter.”
Advice
For those wanting to enter the HCI field as a career, Alan says: “I would advise them at least to read my book, because it suggests various approaches that you should take to research. One of the things that I mention, for example, is the motto of the SAS, which is ‘He who dares wins’. You mustn’t follow fashion too much, because most of life is based on fashion, as is most research, and the way to get on is to be a bit unfashionable. Act a bit like a maverick.”
Interview Data
Interviewed by Elisabetta Mori
Transcribed by Susan Nicholls
Abstracted by Lynda Feeley