Think you’ve learnt it all? Chris Oldwood reminds us that unlearning then becomes our next problem.
After finishing my first article for CVu just over a decade ago, I was asked to come up with a short biography and photo to give ACCU readers a tiny insight into the author. At that point, the only thing I’d ever written to describe myself was a CV for job applications, but I guessed that wasn’t really what they were looking for. Instead I had to find a way to sum myself up in just a sentence or two.
I’m a firm believer that ‘context is king’, and therefore I decided that to distil my essence into such a short piece I should focus on where I came from (programmatically speaking) and where I am now, so the reader could extrapolate from that the kinds of projects and organisations that have shaped my programming career, and consequently my writing. Hence I arrived at the bio you now see adorning my articles to this very day (recent pandemic-related tweak notwithstanding).
During my university years and the start of my professional programming career, I saw being an ‘80’s bedroom coder’ as a badge of honour. Working at a small software house writing graphics software to run on underpowered PCs required some of the skills I had developed writing demos in assembly during my teens, such as the ability to read what the optimizing compiler had come up with and then find a way to make it work when the compiler got it badly wrong. Who knew that in the real world, though, most code is not written in assembly with speed being the only concern...
My new-found interest in networking and distributed systems caused me to leave that behind and enter the corporate world in a freelance capacity. Time marched on, CPUs grew faster and ever more complex, optimizing compilers became reliable, disk and network speeds jostled for position, memory became abundant, and the claims on my CV about my knowledge of PC hardware became weaker as I slowly moved ‘further up the stack’. What I once (naively) saw as the meat-and-potatoes of programming had been downgraded to ‘mechanical sympathy’ [MechSym].
For me, any appreciation of new hardware or technology has tended to come from a single impressive moment of its application rather than a change in numbers on a data-sheet. For instance, I had a low opinion of the JVM in the early 2000s until I saw some Atari ST demos that Equinox (an old Atari demo group) had ported to run as Java applets and, while they were sluggish on the Sun JVM, they ran in real time on the Microsoft JVM. Any performance reservations about the mundane Java project I was assigned to at the time dropped away instantly. Sadly, they were replaced by a more unexpected time bomb – the buggy date/time class.
One networking epiphany came when I had to debug an occasional crash in a C++-based service, and I struggled for some time to believe what my eventual hypothesis was suggesting – that the service could send a financial trade to another machine, which would value it and return the response, and that response could be processed on another thread before the sending thread got switched back in. As for disk I/O, which has never really been that stellar under Windows, I sported a rather large grin the first time I experienced compiling C++ code on an SSD.
More recently-ish I attended Jason McGuiness’s ACCU talk about the impact of Meltdown and Spectre on high-frequency trading. Any pretence I might still have had that my mental model of what went on inside a modern CPU was accurate was readily dismissed; in my heart I probably knew that, but it was still a brutal awakening after all those teenage years counting cycles. Although I still occasionally inspect modern assembly code in the debugger, it’s really the stack traces I’ve become more interested in as I try to reason about the flow, rather than question the compiler’s choice of instructions and sequencing to get the best out of a CPU.
Now, as I write these very words, there is a flurry of excitement about Apple’s new M1 chip and how its ability to emulate a different CPU architecture faster than the real thing is an impressive achievement. For those “in the know” I’m sure it’s just one more inevitable step forward, but for me it’s yet another virtualization bubble burst. Even Knuth’s MIX is struggling to stay relevant.
And that’s one of the downsides of being a programmer as the years whizz past: there comes a point at which you find yourself spending more and more time ‘unlearning’. You might say it’s really just learning something new, but unlearning is really about trying to forget what you learned because the game has changed and you need to catch back up with those who never learned the old ways in the first place. Unchecked, that badge of honour is slowly turning into a millstone.
Modern C++ is yet another example. I used to have a snippet of code I liked to chew over with candidates in an interview that encapsulated various idioms and pitfalls when working in C++. It was a well-honed example based on 15 years of blood, sweat, and tears, and yet most of the discussion points are now moot as the language has changed dramatically since then due to move semantics, lambdas, range-based for loops, etc. Old habits die hard, and unlearning the old rule that you shouldn’t return containers by value – because it now Just Works™ – is another example where years of inertia can be difficult to overcome.
Luckily there are still some inalienable truths to keep me warm at night, like the speed of light limiting my ping time and giving me an excuse for why I lost at Fortnite, yet again. Having sympathy for the machine is undeniably a valuable skill but who has sympathy for the poor programmer that is forever learning and then unlearning to keep up with the march of progress in an effort to stay relevant in today’s fast paced world?
[MechSym] Mechanical Sympathy: https://mechanical-sympathy.blogspot.com/2011/07/why-mechanical-sympathy.html
Chris Oldwood is a freelance programmer who started out as a bedroom coder in the 80s writing assembler on 8-bit micros. These days it’s enterprise grade technology from the lounge below his bedroom rather than plush corporate offices. With no Godmanchester duck race to commentate on this year, he’s been even more easily distracted by messages.