Editorial: A Journey Through History

By Ric Parkin

Overload, 19(105):2-3, October 2011


Despite early pioneers, the computer revolution is relatively young. Ric Parkin takes a personal tour.

A few news stories recently reminded me of just how far the world of computing has progressed, often in a surprisingly short space of time, so I thought I’d look at a potted, and at times personal, history of computing.

In a literal sense, the first computers were just mechanical devices that helped humans make calculations: from the humble (yet powerful in the right hands) abacus and the impressive Antikythera mechanism [ Antikythera ], via clocks and orreries, up to powerful mechanical devices such as automated programmable looms [ Jacquard ]. While not radically different from these forerunners, the Babbage Difference Engine is still historically interesting. It was basically a big automated calculator that could evaluate polynomial approximations using finite differences (those reading Richard Harris’ articles will note their long history). Its advantages were that it was faster than a human and avoided the inevitable errors caused by the tedious task of calculating tables of logarithms and trigonometric functions for uses such as navigation, ballistics, and engineering.
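To give a flavour of the method of finite differences that the engine mechanised (a minimal modern sketch, not Babbage’s actual design), a quadratic can be tabulated using nothing but repeated additions once its initial differences are set up:

#include <iostream>

// Tabulate f(x) = 2x^2 + 3x + 1 using only additions, the way a
// difference engine cranks out successive values of a polynomial.
int main()
{
    long long value  = 1;   // f(0)
    long long first  = 5;   // first difference:  f(1) - f(0)
    long long second = 4;   // second difference: constant for a quadratic

    for (int x = 0; x < 10; ++x)
    {
        std::cout << "f(" << x << ") = " << value << '\n';
        value += first;     // next value needs no multiplication
        first += second;    // update the running first difference
    }
}

Each running difference corresponds roughly to one column of figure wheels on the engine; higher-degree polynomials just need more columns.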

But this just set the scene for his Analytical Engine. This was much more powerful and flexible, using various types of punch cards – as seen in the programmable looms – not only for input and output, but also as a way of changing the behaviour of the engine. In terms of its architecture it was revolutionary and way ahead of its time, with a separate arithmetical unit, a central processing unit that supported conditional branching and looping, and separate storage, input and output units. It is recognisably a modern computer. However, compared to current (or even old!) hardware it is a lumbering beast – estimates suggest a few KB of memory at most, with the processor executing only a handful of instructions per second, perhaps able to multiply two numbers in a couple of minutes. All in a package the size of a large room! Sadly only small prototypes were built, although there is a project trying to construct a working model [ Analytical ]. In many ways its design was way ahead of anything built for another century, which has led to plenty of ‘Alternative History’ fiction wondering what the world would have been like if it had succeeded [ Gibson Sterling ]. It did bring about another first though – while translating a paper describing the machine in 1842, Ada Lovelace added many notes and thoughts, including the instructions needed to calculate Bernoulli numbers. Subsequent analysis has shown this would have worked correctly had the engine ever been built, and so it is credited as the world’s first computer program [ Lovelace ].

Things went quiet after that, until the pressure leading up to the Second World War led to many of these ideas being reinvented (sadly Babbage’s work itself was largely forgotten and had little influence at the time). Some pioneers are relatively obscure nowadays – I found that a German, Konrad Zuse, had built a series of increasingly powerful computers in the late 1930s, culminating in the Z3 [ Zuse ], one of the first Turing-complete machines. However the authorities thought it ‘strategically unimportant’ and didn’t fund his work. Post-war, he continued to develop his ideas and invented the first high-level language, ‘Plankalkül’, although it wasn’t actually implemented until 2000.

The more famous pioneers of the time were of course the code breakers at Bletchley Park, who came up with ingenious ways of automating the tedious work of decrypting masses of intercepted messages: they improved on the Polish Bombes that checked for possible decryption keys, and built Colossus, which was semi-programmable and used to crack the hardest codes. Kept secret for many decades, it’s only relatively recently that the importance of this period has been recognised, and efforts made to restore and rebuild both the park and its treasures. Some news in this area is that Astrid Byro has completed her trek to Everest Base Camp, raising money for the continuing work at Bletchley and generating many stunning photographs [ Byro ]. And sadly, Tony Sale, who led the mammoth task of rebuilding Colossus and will be recalled by those who have attended the ACCU autumn security conferences, died recently aged 80 [ Sale ]. Apart from Colossus, he achieved many amazing engineering feats, including building an early android, and is a true inspiration.

The post-war years saw a flurry of new computers that are better known, such as the USA’s ENIAC [ Eniac ] and Manchester’s SSEM [ Baby ]. These were now fully programmable, with even more of the many features that we now take for granted. The revolution had started, with many key technologies helping boost the power of computers, such as the invention of the transistor at Bell Labs in 1948 and the integrated circuit at Texas Instruments in 1958. Programming languages were also improving, with still-extant ones such as FORTRAN in 1954 and LISP in 1958.

The subsequent history I can couch in more personal terms. As the 60s ended, my father was a computer engineer, maintaining computers for Sperry – the UNIVAC being one I remember. Work was sometimes brought home in the form of used punch cards and folded teletype paper, which held little interest for me beyond their use as drawing paper. By the end of the 70s he’d moved on to training other engineers for DEC in Manchester. For various reasons we’d pick him up on the way back from school, and as it was usually too early to leave I’d be allowed to keep myself occupied on some of the computers there, mainly PDPs and VAXes accessed via various devices. Some of these were quite memorable, such as playing Lunar Lander [ LunarLander ] on a vector graphics screen using a light pen, Adventure [ Adventure ] on a teletype (which is where the phrase ‘You are in a maze of twisty little passages, all alike’ comes from), or Star Trek on a VDU. Of course after a while you get bored, and so I was given a small blue book which told me about something called BASIC.

This was the time of the home computer revolution, and the chemistry teacher at school set up a computer club with one of the early ZX80s, followed by ZX81s and an Apple II. After saving up, I finally (after a notoriously long wait) got a 48K ZX Spectrum. I learnt an awful lot from this humble machine, not only programming in BASIC, but trying assembler, C and Pascal. I even used it in my Control Technology ‘O’ Level project, where it drove a simple ‘Multiple Choice Quiz Marker’. (Of course, disaster struck on the morning it was to be marked – one of the light detectors used to find marks on the paper broke. Fortunately I realised that the sample paper I was using just happened to have some redundancy in it, and I could use the other three detectors to infer the result of the broken one – my first experience of error correction!)
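For the curious, here’s a minimal sketch of that kind of inference (a modern reconstruction, assuming exactly one of the four boxes per question is marked – not the original Spectrum code!):

#include <array>
#include <iostream>

// If exactly one box per question is marked and one detector is dead,
// the answer can still be recovered: either a working detector saw the
// mark, or by elimination it must lie under the broken detector.
int chosenAnswer(const std::array<bool, 4>& detected, int brokenIndex)
{
    for (int i = 0; i < 4; ++i)
        if (i != brokenIndex && detected[i])
            return i;            // a working detector found the mark
    return brokenIndex;          // by elimination: the broken detector's box
}

int main()
{
    std::array<bool, 4> reading = { false, false, false, false }; // detector 3 is dead
    std::cout << "Inferred answer: " << chosenAnswer(reading, 3) << '\n'; // prints 3
}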

Eventually the home computer craze ran out of steam, and I’d sold my computer before going on to sixth form and university. I still occasionally came into contact with computers, using Pascal and LaTeX on VAXes during summer jobs at the Royal Signals and Radar Establishment, and in some university projects on numerical approximation when solving Schrödinger’s equation.

By the time of my first job in early 1991, the PC had come to dominate. My first work machine had an Intel 286 running at about 8MHz. I forget how much memory it had, but it was pitiful by today’s standards; yet it, along with a few 386 machines and a rather expensive 486, was used to do some rather remarkable jobs. We were a small company that did map digitising for the likes of the Ordnance Survey, but also lots of location-based applications such as hydrographic surveying. This was quite fun as we’d go out to interesting locations, such as when we developed the software for a major survey ship [ Protea ]. At the time land-based transponders and triangulation were the main techniques for determining location, but we did have one of the new GPS receivers, which was the size of a small suitcase. My last job there was to write a new version of our graphical survey editing suite for an upstart operating system called Windows 3.0.

At my next job, I came across a language called C, and even invented some simple languages myself. We quickly developed an application for a book database, with various search and results screens. Knowing that the database format and screen layouts were bound to need tweaking close to release, I came up with a simple script format that described the database, defined ways of searching it, gave the screen layouts, and specified how the navigation between them fitted together. Without realising it I had basically (poorly) reinvented SQL and HTML!

After that I moved to a job in the computing hotspot known as ‘Silicon Fen’, where we were porting from C to the trendy new language C++. We also had a new tool – something called email that worked across a thing called ‘the Internet’. This also brought something new called ‘the World Wide Web’, accessed by a browser such as Lynx or Mosaic. We weren’t sure what it would be useful for, although a very popular early use was to check whether there was any coffee, even though it was someone else’s [ Trojan ].

By the end of the 90s I’d got a bit cocky, thinking I was a bit of a C++ whizz, so discovering Usenet, and in particular comp.lang.c++.moderated, came as a bit of a shock: in reality I knew so little, and I hadn’t even known it! Apparently this is quite common – if you graph perceived ability against actual ability you get an early peak, then a valley, then a slow rise: you overrate yourself to start with, but when you discover just how much you don’t know your confidence falls back and you tend to underrate yourself. I think this is similar to the Dunning-Kruger effect [ D-K ], but seen in a single individual learning over time. But via the newsgroups I did find out about ACCU, which helped me realise my ignorance as well as providing a means of improving!

The 2000s will be more familiar to most people – the internet became ubiquitous, and chip clock speeds stalled, but Moore’s Law continued to hold, with the extra transistors going towards more on-chip caches, multiple cores, and dedicated graphics and audio functions. Social media, facilitated by computers and mobile communications, has put people in touch like never before (I’ve literally just heard from an old girlfriend who has lived near Edinburgh for 17 years – in a previous era we’d never have met again). Politics and technology are still ill at ease – in the wake of the urban disturbances in the UK over the summer there were calls for the government to be able to shut down the likes of BlackBerry and Twitter on the basis that they could be used to organise trouble. Thankfully people have realised that these services aren’t the problem in themselves: they were also used to respond positively, and obvious workarounds exist. The existing laws on incitement dealt with the issue quickly (and easily, as such communications could be tracked, providing evidence).

So what of the future? Cloud computing has been The Next Big Thing for a while, but may well go mainstream if someone can make it as reliable, usable, and seamless as local computing. Multicore and parallel processing are here now and will grow in importance – learning how to design software that safely takes advantage of it is the big problem for the next few years.

And the big news for many will be the ratification of the new C++ Standard. Some compilers already implement parts, and the next couple of years will see better compliance – I hope the major vendors commit to full implementations and not just pick and choose parts. For commercial vendors, pressure from their customers (ie you) will help, and for the open source implementations, input from their developers (ie you) will as well. But be quick, the next C++ standard is already being considered!
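As a small taste of how the new standard helps with the multicore problem mentioned above, here’s a minimal C++11 sketch (the names and sizes are purely illustrative) that sums the two halves of a vector concurrently using std::async and a lambda:

#include <future>
#include <iostream>
#include <numeric>
#include <vector>

int main()
{
    std::vector<int> data(1000000, 1);          // illustrative workload
    auto mid = data.begin() + data.size() / 2;

    // Sum the lower half on another thread while this one handles the upper half.
    auto lower = std::async(std::launch::async,
                            [&] { return std::accumulate(data.begin(), mid, 0LL); });
    long long upper = std::accumulate(mid, data.end(), 0LL);

    std::cout << "total = " << lower.get() + upper << '\n';
}

Whether this really runs on two cores is up to the implementation, but the point is that the standard now gives us portable building blocks for exactly this kind of work.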

On a personal note

I came up with the idea for this historical overview a while ago because of some stories in the news. But the personal aspect now seems most apt: while writing it my father, the person who got me started with computers all those years ago, died suddenly aged 71. I’d like to dedicate this to him.

References

[Adventure] http://en.wikipedia.org/wiki/Colossal_Cave_Adventure

[Analytical] http://www.bbc.co.uk/news/technology-15001514

[Antikythera] http://www.antikythera-mechanism.gr/

[Baby] http://en.wikipedia.org/wiki/Manchester_Small-Scale_Experimental_Machine

[Byro] http://abc-ebc.blogspot.com/

[D-K] http://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect

[Eniac] http://en.wikipedia.org/wiki/ENIAC

[Gibson Sterling] http://en.wikipedia.org/wiki/The_Difference_Engine

[Jacquard] http://en.wikipedia.org/wiki/Jacquard_loom

[Lovelace] http://en.wikipedia.org/wiki/Ada_Lovelace

[LunarLander] http://en.wikipedia.org/wiki/Lunar_Lander_%28video_game%29

[Protea] http://www.navy.mil.za/vtour/protea/index.htm

[Sale] http://www.bbc.co.uk/news/technology-14720180

[Trojan] http://www.cl.cam.ac.uk/coffee/qsf/coffee.html

[Zuse] http://en.wikipedia.org/wiki/Z3_%28computer%29





