Martyn Catlow’s two professional passions are weather and computing, which he has combined over a 43-year career at the UK’s Meteorological Office (the Met Office), where he is currently Hybrid Cloud Solutions Architect.
Over his career he has combined theory and practice to master Fortran and Assembler programming, and has worked on everything from proprietary operating systems through to Linux. He helped plan and execute the Met Office’s move from Bracknell to Exeter, the largest IT relocation in Europe at the time, and is now examining the future of supercomputing in the cloud.
Interviewed by Richard Sharpe on 15 August 2022 on Teams.
Read more in our feature – Supercomputers and the Met Office: at the forefront of weather and climate science
Early Life & Education
Martyn Catlow was born in 1960 in Blackburn. He is an only child. His mother worked the cotton looms in one of the local mills and his father was a delivery driver. Martyn went to his local primary and secondary schools. He says: “Cedar Street Primary school was probably the place that influenced my interest in weather the most and that led to other things later on.” He adds: “I went on to Billinge secondary school which was an ex-grammar school. I had a really good education there because a lot of the teachers were from that old grammar school.” Martyn studied geography and physics at A level, which also set him on his road to meteorology. He adds: “Having gained my A levels, I decided to get a job rather than go to university.” Throughout his education Martyn had the full support of his parents, who also backed his plan to attend Lancaster University to study environmental science. However, his plans changed when he was offered a job at the Met Office, which he says “seemed a better option”.
The Met Office
Martyn joined the Met Office in Bracknell as an Assistant Scientific Officer and attended the Met Office College for the first of several courses that he would go on to study during his career with the organisation. He says: “The first one I did was an introduction to meteorology and, I’m proud to say, I came top in that class. It was only a month long and it was a very specialist course in the basics of meteorology at the Met Office, introducing you to forecasting and observations and all the good stuff that generates data.”
The Met Office started to introduce computers into its forecasting processes in 1959 with a Ferranti Mercury, which it called Meteor; it was one of 5,500 computers in the world at that time. In the mid 1960s it bought an English Electric KDF9 for £500,000, which it called the Comet. It was on the Comet that the Met Office ran its first operational computer forecast. The Comet was followed, in 1971, by an IBM 360/195.
Martyn says: “At home in Lancashire I was completely unaware of all these things going on at the Met Office. The main driver for me was the weather. I thought weather forecasting was a manual process. Little did I know that to do all the weather forecasting and processing they were introducing computers. Around the time I was leaving school, they’d got the IBM 360/195 machine, which was one of the most powerful computers in the world at the time. They ran the first real production-scale computer models for atmospheric simulation on that machine; relatively small scale, but they were quite sophisticated for the time and leading the game in terms of computerised weather forecasting.
“So joining the Met Office at that point in time, in 1979, they were really ramping up how they were using the IBM 360 machine, and not only were they using it for weather forecasting, but they were using it for climate data digitisation. A huge history of climate data had been generated over the years since the 1850s and that was all being digitised onto magnetic tape. Looking after the digitised archive and managing the tape archive that it was being put on to was one of my first jobs.”
Digitising data to forecast the weather
Martyn explains more about meteorological data processing, saying: “There were two ways of tackling data processing. One went back to the 1920s, when marine data was punched onto Hollerith cards. That continued right up to the 1960s. However, once technology became available in the form of electronic digital computers in the 1960s and seventies, the approach to data processing completely changed and it was done at scale. Not only did they start to punch data from records directly on specialised computer systems, which allowed teams of people to key information from forms into digitised form by visually translating it and using a keyboard, but they’d also got this archive of punched cards going back to the 1920s, which they digitised into electronic form using card readers and so on. So there were two channels of data flow coming into the archives at that point.”
The data was used to look for diurnal and climatic patterns, an analysis that had been carried out since the 1850s. Martyn explains: “At that time the Met Office, Professor Galton, Lord Kelvin (of temperature scale fame) and colleagues developed mechanical machines to process data. These were analogue computers and were used to do analysis of daily patterns and the behaviour of climate over longer periods of time. That early work has fed into the modelling that we do today, in a very loose way.”
Today, the Met Office’s modelling is on a global basis. Martyn explains: “If you imagine a grid that covers the entire surface of the earth, at each intersection of the vertical and horizontal lines of that grid you can forecast such parameters as wind speed, temperature, humidity and the whole range of other parameters in the very sophisticated models that we run. That’s done on a global scale, and in more detail and at much higher resolution over certain areas of the world, like the UK and Europe, for example. That is presented in digitised form through those channels to the media and to specialist consumers like aviation and so on, and they all have different requirements. For example, aviation is more interested in what’s going on at the higher levels of the atmosphere as opposed to what’s going on at the surface, so we do vertical forecasting as well as the points on the surface grid. If you imagine different layers in the atmosphere, different grids on top of the surface grid going up, for each point there you’ve got a vertical representation of that data as well. There’s a whole load of data, and the more you increase the resolution of those models, the more data you’ve got to process; the aim is to process that in minimal time so that people get a good and timely forecast out at the end.”
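The grid-and-levels picture Martyn describes maps naturally onto a multi-dimensional array. The sketch below is a generic illustration in Python, not Met Office code; the grid spacing, level count and parameter names are invented for the example.

```python
import numpy as np

# Hypothetical global grid: 2.5-degree spacing, 20 vertical levels, 4 parameters.
# These numbers are illustrative only, not the Met Office's actual configuration.
n_lat, n_lon, n_levels = 73, 144, 20
parameters = ["temperature", "wind_speed", "humidity", "pressure"]

# One 3-D array per parameter: a value at every grid intersection and level.
fields = {p: np.zeros((n_levels, n_lat, n_lon)) for p in parameters}

def value_at(field, level, lat, lon):
    """Look up the forecast value at the grid point nearest to (lat, lon)."""
    i = round((lat + 90.0) / 2.5)            # latitude index: -90..+90 degrees
    j = round((lon % 360.0) / 2.5) % n_lon   # longitude index wraps around
    return float(field[level, i, j])

# Surface users read level 0; aviation reads the upper levels of the same grid.
print(value_at(fields["wind_speed"], level=15, lat=51.5, lon=0.0))
```

Doubling the horizontal resolution of such a grid quadruples the number of points per level, which is the data growth Martyn refers to.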
Support Programmer
After two years, Martyn moved on to be a Support Programmer, Climate Data Processing, where he learned to program in Fortran. He says: “I understood what data was about, and there was an opportunity to understand how you could do much more with it by using computerised techniques. I met some very, very clever people who were pathfinding at that time and they were a big influence on my future career.” On learning to program in Fortran, he says: “The team had some very good tutors who were very bright and understood meteorology and also had an aptitude for programming, and they conveyed that over to me. It triggered things in my mind that allowed me to understand how to program, and I had an aptitude for that as well, which was great. That’s been the theme for the rest of my career, really.
“I loved Fortran because it was very scientific and you could translate exactly what you wanted to do back to very simple coding techniques, and they were simple in those days. All languages have now become much more sophisticated and more capable.”
It was while working as a Support Programmer that Martyn was encouraged and supported by the Met Office to study for his HNC in computer studies at Reading College of Technology and Design. He says of his love of programming: “If you can make a machine do something and see that you’ve made that happen, there is a real feeling of achievement; you’ve mastered something. It’s a bit like playing an instrument: you can hone it to make the most beautiful sound, and if you’re really good at programming you can make a fantastic program that uses the machine to its best efficiency and produces the very best results. That’s what you aim to do as a programmer.”
As well as writing programs when needed, Martyn and the team had access to a library of Fortran routines that the team had already produced. He adds: “There was a lot of stuff that had been developed by these guys who had been working on it during the 1970s. They had produced sets of routines and libraries of regularly used things that people could reuse, rather than code from scratch every time. That allowed a standardisation to start to come into what we were doing. At the end of the day, the quality of modelling and of data control all depends on having standards. The principles they adopted for developing common libraries of routines were similar to those adopted for open-source programming.”
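The shared libraries Martyn describes amount to writing a conversion or calculation once and having every program call the same code. A minimal sketch of the idea in Python; the routine names and contents are hypothetical, not the Met Office's actual library.

```python
# A tiny, hypothetical "shared routines" module, in the spirit of the
# reusable Fortran libraries Martyn describes. Names are illustrative.

def celsius_to_kelvin(t_c: float) -> float:
    """Convert a temperature from degrees Celsius to kelvin."""
    return t_c + 273.15

def knots_to_ms(speed_kt: float) -> float:
    """Convert a wind speed from knots to metres per second."""
    return speed_kt * 0.514444

def daily_mean(values: list[float]) -> float:
    """Mean of a day's observations; every caller uses the same definition."""
    if not values:
        raise ValueError("no observations supplied")
    return sum(values) / len(values)

# Callers reuse the library rather than re-coding the conversions each time,
# which is where the standardisation Martyn mentions comes from.
print(daily_mean([celsius_to_kelvin(t) for t in (11.2, 14.8, 9.5)]))
```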
Consultancy, Training and Technology Exploration
The next stage in Martyn’s career with the Met Office saw him promoted into the field of consultancy and training. He says of his experience: “For some people it’s a breeze because they’re just intellectually bright, and other people have to put a lot of work into achieving outcomes; I was the latter. I learnt and achieved through practice and trial and error rather than always taking a purely academic approach. There’s a beauty to that because it converges the practicality of the real world with the academic.
“The outcome of that is that you can still achieve in life, even if you’re not particularly gifted in academic terms. … I think that allowed me to move into this next area of work, which was techniques and training, where I was using some of the techniques that I had both developed and learnt from my colleagues and passing them on to a new generation of people that were coming into the Office after me during the early eighties.”
It was while working in his new role that Martyn was involved with some of the Met Office’s early graphics work. He says one of the challenges he was given was updating a program called C-Map, written by Peter Cockerill. Martyn adds: “He developed this program which did a visualisation of data. It had been around since the late seventies and in the early eighties the Met Office wanted to refresh it so it could use colour graphics as opposed to monochrome contours. … I proved the principles (applying the graduated colour shading you see in temperature maps today), although I didn’t get to refine the process. It was a fantastic exploration of someone else’s very, very deep and clever code. It taught me a lot about challenge and persistence.”
As well as working on this and other projects, Martyn’s new role included training people. In addition to in-house training, Martyn lectured at Reading Tech in the evenings. He says he took to training quickly and enjoyed it, adding: “In my early to mid twenties, I was probably more confident than I am today and I found training people was really good fun. Standing in front of a group of people and feeling confident about what you’re presenting to them is definitely a challenge, but I think I got there and I was master of the stuff that I was conveying at that point, which you really have to be in order to convey confidence to the people who are learning.”
IBM Mainframe System Team
In 1985 Martyn joined the Met Office’s IBM mainframe system team. Martyn explains: “At the time I joined, the 360/195 was being replaced by newer versions of the same technology. The team that I moved into was a very specialist team, and that was the most engineering-focussed job I think I’ve ever done. It was low-level stuff to make the best out of the investment in the mainframe technology. At that time, in the mid-eighties, the mainframe was probably the only machine doing operational data processing in the Office, so everyone in the Met Office relied on that machine, its availability, its performance and the tools that were available to help them do their jobs in terms of data processing.”
The machine ran IBM’s proprietary mainframe operating system (the line that became System/390); the original System/360 name had been chosen to represent all 360 degrees of the compass and the variety of workloads the machine could support. Martyn adds: “It was a multi-purpose platform and it did everything from forecasting through data management and data archiving; the machine was the focus of all of the data processing in the Office in the mid-eighties, really.
“It was a small team of about seven to ten people, and we were very focussed on making sure that that mainframe system and the peripherals associated with it, the graphical presentation machines, printers, tape drives, disk drives and all that low-level IT stuff, all hung together so that the Met Office had a twenty-four seven production platform for its weather forecasting.”
Martyn and the team used the Assembler language, which he describes as ‘like Fortran’ but better. He explains why: “It allows you to go right into the mechanics of the machine and operate aspects of the machine that other programming languages couldn’t. It’s designed to interact with the instruction set of the machine directly, so you’re handling machine operation codes, bits and bytes of storage, and changing individual bits of storage, and privileged areas of storage as well, not just areas of storage that a Fortran program could access. Doing that allowed us to access things like computer peripheral channels and communications networks. Because Fortran was a very high-level environment, it didn’t really allow you to interact with specific devices. It had the concept of a card reader and printer, but it didn’t understand screen technology and communications channels. So being able to program in lower-level languages like Assembler enabled us to write routines that could be called from Fortran but would also interact with some of these devices and channels that were very specific to the machine architecture, and that allowed data to be passed between computers.
“At that time we had other computers which were very specific to supporting data exchange with other World Meteorological Organisation centres; the Ferranti Argus and various other specialist processors. The data that was produced on the mainframe was sent to those machines using low-level language programs like the ones that I was involved with writing.”
Asked if the Met Office did any transaction processing, Martyn responds: “The processing was largely batch. It was all about large volumes of data, large volumes of model outputs, and moving that from one place to another, from where it was being generated to customers who wanted to consume it.” With so many demands on the mainframe, Martyn created an automatic job scheduling system for the mainframe batch workloads. He says: “We had different workloads using the same resource.
“… I was involved in writing a lot of the stuff around how you managed the resource of the mainframe in an elegant way such that everything managed to get processed, rather than the emphasis being on one thing over another. It was a shared environment, so everyone had a fair share of the computing, and to do that there were various approaches to scheduling different types of workload.”
Martyn also installed remote terminal access for the Edinburgh and Belfast Met Offices, which had no access to the central computing. Martyn adds: “There was an initiative to try and help that push of access out to the regions, and I was lucky enough to spend a week in Edinburgh and a week in Belfast and help them set up the remote communications and access to the mainframe. That was a great day when they turned on the terminal and could see a program running at Bracknell and access it and interact with it from Edinburgh and Belfast.”
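The fair-share scheduling idea Martyn outlines, where every workload class gets its agreed slice of the machine and nothing starves, can be sketched in a few lines. This is a generic illustration in Python, not the scheduler he actually wrote; the workload classes and share values are invented.

```python
from collections import deque

# Hypothetical workload classes and their fair shares of the machine.
# The class names and numbers are invented for illustration.
queues = {
    "forecast":   {"share": 0.5, "used": 0.0, "jobs": deque(["fc_00z", "fc_06z"])},
    "climate":    {"share": 0.3, "used": 0.0, "jobs": deque(["digitise_1890s"])},
    "background": {"share": 0.2, "used": 0.0, "jobs": deque(["tape_audit", "stats"])},
}

def next_job():
    """Pick a job from the class that is furthest below its fair share."""
    candidates = {name: q for name, q in queues.items() if q["jobs"]}
    if not candidates:
        return None
    name = min(candidates, key=lambda n: candidates[n]["used"] / candidates[n]["share"])
    return name, candidates[name]["jobs"].popleft()

def record_usage(name, cpu_seconds):
    """Account the CPU time a finished job consumed against its class."""
    queues[name]["used"] += cpu_seconds

# Dispatch loop: every class gets processed, none starves the others.
while (picked := next_job()) is not None:
    name, job = picked
    record_usage(name, cpu_seconds=10.0)   # pretend each job took 10 seconds
    print(f"ran {job} from {name}")
```

The key point is the dispatch rule: always run work from the class furthest below its fair share, so one heavy workload cannot indefinitely crowd out the others.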
Technical Lead, Mainframes
By the early nineties, Martyn was offered the opportunity to become the team lead and was appointed Technical Lead, Mainframes. He says: “Managing a team of about fourteen or fifteen people, running that mainframe environment, that was a great time for me.” Asked about his management style, he says: “I think you have to encourage people to learn. Some of the stuff that we were involved with in that area was very challenging, and new people coming in really had difficulty understanding it.
“Technology’s not just about reading a manual, it’s about understanding what other people have done; you don’t start everything from scratch, you take things forward and build upon them. So I very much consider that what I did was building on the shoulders of the giants of the 1970s, and that’s what I tried to instil into people that were coming into the team when I was running it: you have to impart your knowledge to them in a way that makes them enthusiastic and want to continue and drive things forward.
“We had a lot of staff turnover during the nineties because computing was becoming much, much more predominant in the business world; people came, got trained up and left. But it never stopped me taking the view that you have to be enthusiastic, encourage people and drive that enthusiasm in them, and if they choose to leave, then someone else has benefitted. The UK’s benefitting in general.”
Asked why he chose to stay at the Met Office, Martyn says: “I was offered a couple of jobs with commercial organisations over the years, but they never seemed to be as challenging as what the Met Office was doing in terms of its computing and direction.
“The Met Office acquired its first supercomputer in 1980 and was already leading the way in model development; in fact it was probably on its second-generation supercomputer by this time, and it was a really challenging and exciting place for someone who was interested in both data processing and the weather. At the time, commercial folks were interested in database management and banking, and that didn’t really appeal to me; I was still really into the weather and making sure that you got the best out of the data processing in that domain. Civil Service salaries have never been top of the pile, but I stayed because of the enthusiasm of working in an organisation that’s forward looking and challenging.”
The organisation’s use of technology had moved apace during Martyn’s career, from 1982, when the Met Office used a fifteen-level numerical weather prediction model, to 1991, when a twenty-level unified model was in operation. This created a need not only for larger machines, but also for a more sophisticated software environment. Martyn explains some of the changes: “As the models increase in complexity, and the science that drives the models increases in complexity, the processing demands of those models increase. However, the time window that you’ve got to put a forecast out still has to be maintained. So you need more power out of the machines to get more cycles processed faster, and more efficient code to make sure that you’re using the minimum number of cycles, to reduce the time to output results.
“To support that, the Met Office was looking at different areas of supercomputing and what it could do. So the move began of all modelling to supercomputers, while the mainframe environments were then used to do all the subsequent downstream processing.
“That processing was largely data processing, with the supercomputers doing the higher-performance numerical computing. At that time the realisation that global warming was an impending disaster meant climate modelling was becoming a very big thing, and dedicated high-performance computers were needed to support the analysis and investigation required to understand its impact. At the same time, very powerful workstations were becoming financially viable, providing scientists with additional dedicated computing capability for exploration at their own desktops. These Unix workstations supported the development of specialised programming and visualisation of data, and they were being deployed around the office.
“So that was the mid-nineties, and by the time we reached the next millennium the Met Office had changed from being an organisation focussed around batch processing into an organisation that was beginning to use interactive processing, allowing scientists to make changes in real time so they could start to do things interactively and get results straight away. That speeded up the development of modelling and of the data processing techniques that were being applied to the data.”
The implementation of Unix meant that Martyn was once again able to combine his experience with learning to ensure the technology was implemented effectively. He says: “I wouldn’t classify myself as a Unix expert, but I saw the benefits of a standard and portable application environment. The mainframe team’s experience of managing large-scale technology was key in supporting the introduction of these departmental systems and ensuring that they were deployed and used efficiently. Wherever you’re ground-breaking, experience is useful, and even experience in a different computer technology is still valuable when you’re applying the same principles to a new environment, like the Unix environment.”
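A rough back-of-the-envelope calculation shows why each model upgrade demanded so much more machine: halving the horizontal grid spacing quadruples the number of columns and usually forces a proportionally smaller time step, and extra vertical levels multiply the work again. The sketch below is a generic scaling estimate under those standard assumptions, not the Met Office's actual figures.

```python
def relative_cost(dx_factor: float, levels_factor: float) -> float:
    """Rough relative compute cost of a model upgrade.

    dx_factor: old horizontal grid spacing divided by new (2.0 = twice as fine).
    levels_factor: new number of vertical levels divided by old.
    Cost scales with grid columns (dx_factor**2), time steps (dx_factor)
    and vertical levels, under the usual stability-limited time-step assumption.
    """
    return dx_factor ** 3 * levels_factor

# Example: twice the horizontal resolution and 20 levels instead of 15
# gives roughly 10.7 times the work for the same forecast window.
print(relative_cost(dx_factor=2.0, levels_factor=20 / 15))
```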
Y2K
Martyn was part of the quality assurance team for Y2K preparedness. He says of the experience: “It was daunting at the time. Although we had the supercomputing for the modelling, much of the data output was still being done through the mainframe, so we didn’t want it to go wrong at midnight 1999 with no aviation forecasts being available; that would have been catastrophic. There was a lot of pressure and a lot of emphasis on ensuring all the systems that were in place, not just on the mainframe but across the Office, were ready for 2000. As it turned out, there were very few serious implications of that change. We were lucky.”
Asked if he thinks it was over-egged, Martyn adds: “It was over-egged a little bit, as things always tend to be; creating fear, uncertainty and doubt about things generates revenue. If you’re a technician you can home in and see that the problem isn’t all-pervasive, it’s in some very specific areas, and you can probably find out what those are quite quickly. But nevertheless, there was an opportunity for a lot of money to be made in a very short time.”
The Met Office used consultants to audit what the team had already established. Martyn adds: “Because we’re a public sector organisation we took it very seriously and made sure that all our customers were assured that we’d taken the year 2000 seriously, and to do that it’s good to get the rubber stamp from an independent organisation.”
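The heart of the Y2K problem was date fields that stored only two digits of the year, so ‘00’ compared as earlier than ‘99’ and date arithmetic went negative. A minimal illustration of the bug and of the common ‘windowing’ remediation; the pivot year of 70 is an arbitrary choice for the example, not anything specific to the Met Office.

```python
def bad_elapsed_years(start_yy: int, end_yy: int) -> int:
    """Two-digit arithmetic as many old systems did it: 00 minus 99 goes negative."""
    return end_yy - start_yy

def windowed_year(yy: int, pivot: int = 70) -> int:
    """Common remediation: interpret 00-69 as 2000-2069 and 70-99 as 1970-1999.
    The pivot of 70 is an arbitrary illustrative choice."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(bad_elapsed_years(99, 0))                 # -99: the Y2K bug
print(windowed_year(0) - windowed_year(99))     # 1: correct after windowing
```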
Bracknell to Exeter
In 2002, the Met Office relocated from Bracknell to Exeter. Martyn was involved in the technology transfer. He says: “The move was a remarkable achievement in the Met Office’s history. I don’t think any other change so impactful had ever taken place. Because of the reliance on the Met Office and its services by that point, it was really important that the transition was made in a seamless way. The planning that went into the move was very diligent and there were a lot of people involved. Imagine the challenge: you’ve got an entire operation that’s working and has been established in Bracknell for twenty-odd years, and then you’re going to move it several hundred miles cross-country, and you have to continue to produce the outputs from the models and supply the customers with all the products that they’re expecting, without disruption. It was one of the biggest challenges I think the Office has faced in its entire history.”
The move employed a twinned IT operation to ensure capability could continue should the systems at one of the sites fail. Martyn adds: “The principle of migrating work between one system and another was already established for resilience at Bracknell; it was just that it had never been done over hundreds of miles across country before. In essence it sounds simple, but we needed to line up the ducks to make sure that it all worked and the communications were open and available all the time.” He highlights that the infrastructure of that time was not on a par with the fibre connections of today, adding: “The infrastructure in the UK has completely changed in the past twenty years. Back in 2002 things were much more elemental and less reliable, and you still had to maintain the data flow between the two sites as well as moving the equipment. Co-ordinating that was a massive challenge for everyone.”
The move also allowed time for the Met Office to review its IT infrastructure. Martyn adds: “There was a big clear-out; if you’re going to move house you’re going to get rid of all the rubbish, and we had the luxury of moving into completely new, freshly built IT halls. They were built from the ground up with the latest available technologies and infrastructure, and some of the IT kit didn’t migrate. New infrastructure and IT systems were installed at the Exeter site, leaving much old kit behind in Bracknell. It was a really good opportunity to clear out dead wood and force a refresh of technology which was, in some cases, long overdue in Bracknell.”
While some Met Office employees decided not to make the move to Exeter, Martyn felt it was a good opportunity to have a change of scenery. Among the new challenges that emerged from the move was the adoption of Linux, which Martyn led. He explains: “I think we were probably very innovative, because Linux is a derivative of Unix, but it’s a much more open technology, allowing people to contribute to its development as opposed to it being developed by, say, Hewlett Packard, SGI or Sun Microsystems.” The open-source concept took off in the nineties and allowed individuals to create a code base that is shared freely. Martyn adds: “In the Met Office that concept had been used by the early developers of Fortran to create libraries that were in common use so that everyone could use them. … The advent of commercially supported Linux turned that open-source drive, that community, into something that businesses could start to adopt.
“I’m very pleased to say that we were again pathfinding with Linux at a very early stage in the Met Office, and we implemented the first production Linux on the mainframe, pathfinding a technology that would later become dominant in the Met Office.
“By the time we’d moved to Exeter, it had really taken off and there were so many different uses of Linux being deployed around the entire organisation, on desktops and on departmental computers, as well as on the mainframe. Today we’ve got Linux running on the supercomputers as well. So the age of proprietary operating environments and proprietary software is largely past; we’re in a new world now where Linux is king.”
Asked if open source is a better way of creating software, Martyn says: “One hundred per cent. So often during the 1980s and nineties I was frustrated by commercial software that we had to install to support facilities. At the time, if something was wrong it took ages to get a fix. If you wanted a change to the functionality, it would take forever for that to happen; it was all driven by business decisions within the companies that owned the software. Now it’s a completely different world: people can contribute, and they make assessments based on need rather than on commercial considerations, and open source is just fantastic.
“However, there is a dark sign on the horizon that’s beginning to emerge with cloud computing. From my perspective we seem to be going back in a cycle towards platform-specific code, which is beginning to become apparent on platforms like Amazon, where you’re bought into techniques that are almost proprietary to their environment. This takes us back, in my mind, to the 1970s when IBM was king and their operating systems, System/390 and z/OS, were very proprietary. In making computing decisions, I think you need to always take the long-term view and make sure that you’re making the right choices rather than fulfilling an immediate need by doing something pragmatic.”
Oracle and the Met Office
In 2003, the Met Office adopted Oracle databases. Martyn explains the history: “We started off with a product called IDMS, which was a non-relational database, and the thing about Met Office data was that it suited a relational type of database architecture; it soon became apparent that, for processing performance and efficiency, things had to change. That’s where Oracle Database did well. The choice at the time was between Oracle, Ingres and DB2. We chose Oracle because its geospatial capability was more advanced; it was more aligned with the geographical nature of Met data than the other database implementations, which were more oriented around business function and processing. Although new data storage and manipulation technologies have emerged, we’re still running Oracle today on the mainframe.”
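The attraction of a relational database for observational data is that geographical selection becomes a simple query over latitude and longitude columns. The sketch below uses SQLite from Python purely to illustrate that idea; Oracle's actual geospatial features are far richer, and the schema and station values here are invented.

```python
import sqlite3

# Illustrative only: a generic relational layout for station observations
# and a bounding-box query. Schema and data are invented for the example.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE observation (
    station   TEXT,
    lat       REAL,
    lon       REAL,
    obs_time  TEXT,
    temp_c    REAL)""")
db.executemany(
    "INSERT INTO observation VALUES (?, ?, ?, ?, ?)",
    [("Exeter",    50.73, -3.41, "2022-08-15T09:00", 21.4),
     ("Bracknell", 51.42, -0.76, "2022-08-15T09:00", 23.1),
     ("Lerwick",   60.14, -1.18, "2022-08-15T09:00", 13.8)])

# Geographical selection expressed relationally: everything inside a lat/lon box.
rows = db.execute(
    "SELECT station, temp_c FROM observation "
    "WHERE lat BETWEEN 49 AND 52 AND lon BETWEEN -6 AND 2").fetchall()
print(rows)   # Exeter and Bracknell fall inside the box; Lerwick does not
```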
Next generation of supercomputing and the Met Office
The Met Office has, from its first supercomputer, always installed and managed its supercomputers in its own data centres, in collaboration with suppliers. However, in 2020, the Met Office took a change of direction. Martyn explains: “Supercomputing is very demanding from an environmental perspective, from power supply and cooling to the specialised infrastructure needed to ensure capability always matches what’s needed by Met Office science. Scale has been a challenge, and much more so in supporting the most recent ambitious plans. The Met Office has often wanted to increase the capability and resolution of its models, and the on-premises installation facilities had limitations. So, although we’ve run our own computing installations since the first digital computing in the 1950s, in 2020 the decision was taken to host supercomputing differently, taking advantage of cloud computing scalability.”
Martyn was initially resistant to the idea but now agrees that “we absolutely needed to make that change to a different way of hosting supercomputing.” Martyn’s initial reservations centred on his view of the cloud industry as profit driven and acquisitive. He says: “Once you’ve got your workload on somebody else’s platform you’re no longer completely in control of your data or how much performance you get. You’re handing over a huge amount of trust to organisations that are very commercially focussed. You have to decide whether you want to take that risk, traded off against losing the ability to do the very best you can to get the very best out of your investment; for me that was a big challenge. I still have reservations to some degree, but what the Met Office is doing in terms of its supercomputing and cloud is not your average public cloud deployment. It’s being done in a very innovative and novel way which enables us to take advantage of the scale of cloud without necessarily buying into all the dangers of cloud in terms of the economics and in terms of the lock-in.”
Quantum Computing
Martyn says that quantum computing is on the horizon but is not yet manifesting itself in anything real. He adds: “Because computer models produce outputs of massive scale, to change them from working in a conventional computer architecture to a completely new one not only means you have to tune and re-engineer for that architecture, but you also have to make sure the science works on that type of architecture as well, so it’s an unknown for me at the moment how we would transition into quantum.”
Cyber Security in the Cloud
Martyn describes the effort that has been put into cyber security to ensure that Met Office products are not impacted by cyber threats. He goes on to discuss the wider issues of cyber security and the cloud, saying: “From my perspective that remains the huge risk in terms of where we are in the world, the global politics and how it affects multi-national organisations and how they’re potential targets for international terrorism or warfare. The more you hand over your control, the more you are in their hands as to how seriously they take cyber security and installation security and all of those things that sit around computing. It’s a trust thing, and you have to work on the basis that these organisations have the money to make the investments that make secure IT possible at the very highest levels, and consequently that they’re meeting the levels of expectation that you would have if you were doing it yourself. I think they do, but there’s always that risk because they are so big that things get forgotten or lost, or processes fail and something breaks.
“So, in my head, in the world we’re heading towards we have to be far more conscious about maintaining a capability in both cloud and on-premise, so that you don’t lose complete control of your ability to operate in the event of a major disaster.”
Mistakes
Looking at mistakes in his career, Martyn says: “I’m in two minds on this because I really enjoy the direction that my career’s taken, but I think the biggest challenge has been not being able to progress through management. If you focus on an engineering career, you miss out on the benefits of being able to influence the management side and set the direction for the organisation. Even though you’re still influential, you’re not part of that culture. I think that might have been a mistake, but I’m still not convinced.”
No Regrets
Asked if he has any regrets about staying with the Met Office for over forty years, Martyn jokes about salaries and pensions, but adds: “In terms of fulfilment, of enjoying what I’ve done, of working with some of the best, brightest and most intelligent people, and some of the nicest people that I’ve ever come across, I couldn’t have chosen a better organisation to work for. I’m pleased I’m still here.”
Interview Data
Interviewed by Richard Sharpe
Transcribed by Susan Nicholls
Abstracted by Lynda Feeley