Computers and I go back a long way. When I came to London as a research student in 1967 the whole of London University was served by a single huge computer. It was called Atlas and it occupied an entire building in a street alongside Euston Station. It looked like something from a science fiction movie, with rotating tape reels and flashing lights but no keyboards or screens, and nary a mouse to be seen as they had yet to come into general use. Atlas was staffed by young men in white coats — I don’t recall a single female operative — and mere students weren’t allowed anywhere near it. The data from our experiments had to be fed in on punched cards, and at UCL there was a special shed full of hole-punching machines where distraught researchers like me cussed as they punched holes in the wrong places and had to start all over again. If you were lucky you’d find that there was an existing program to analyse the data on your cards, but if there wasn’t you had to write one yourself, and I was sent on a programming course to learn how to do this in a language called FORTRAN IV. There was a waiting list for data processing, and when you eventually got your results they came in the form of numerical print-outs: reams and reams of paper. Not even the science fiction writers predicted that in the future all this would be miniaturised and made affordable for home users.
It took a while, however. It wasn’t until the late 1970s that the publishing firm I worked for could obtain sales figures from the warehouse via a dial-up modem attached to a telephone line, which was a slow and highly unreliable process, and once again I was allowed nowhere near it. More years went by, and I started my own publishing company. Towards the end of the 1980s some of our authors were writing on Amstrad computers with green lettering on black screens (no graphics), saving their writing onto floppy disks and expecting us to be pleased about receiving these state-of-the-art submissions. A decent-sized novel could occupy three or four disks and we had no computer to sort out the damn things, so people quickly acquired daisywheel printers and sent us stuff that we could actually read. We saw no reason to get one of these crappy computers or printers for ourselves.
I didn’t acquire my own computer until 1995 when, feeling I ought to be learning about these new-fangled things, I enrolled on a basic computing course, which I got for free being theoretically unemployed at the time. Windows 95 was hot news just then, and while it was OK for processing text its graphics capabilities were pretty well non-existent. The course was fun, though. Most of the other students were college-leavers trying to get a first foothold in the job market, and some of them were female. They all watched Friends, in which Monica was just embarking on an affair with Richard, played by Tom Selleck, an older man with a moustache, which seemed to give some of them the idea that getting to know an older man might be a neat idea.
I was certainly no Tom Selleck, but I was an older man called Richard with a moustache, and two girls in particular started being very friendly to me. One of them was clearly unstable (you don’t have to be clinically insane to fancy me but it helps, as the saying doesn’t quite go), but the other was tall and slim with closely-cropped black hair like Louise Brooks and she’d just graduated in Art History. I’ll merely say that we got on extremely well, and if she hadn’t had this boyfriend back home in Warwick … I still think of her sometimes. She was lovely: intelligent, beautiful, gentle and kind. Damn it, I should have fought much harder for her. Anyway, in amongst all that I bought a second-hand PC running Windows.
This computer didn’t last much longer than the course and the romance, but computing for me changed vastly for the better when my friend Bob, who was supporting a big project that I was working on, gave me an Apple Mac together with a matching scanner, laser printer and modem. Astonishingly kind, and I lost no time in enrolling on another course, learning Photoshop, Illustrator and QuarkXPress over a period of eight months and copping an NVQ Level 2 into the bargain, boast boast. I also connected to the internet for the first time, which was not then plagued so much by advertising. This enabled me to do most of the things that we now take for granted on our Macs and PCs, and as not many others were doing desktop publishing at the time it helped me get going as a freelance editor and designer, and saved my bacon financially. Since then we’ve got much faster computers and near-universal broadband and — well, you know the rest as well as I do.
It’s astonishing to realise that the cheapest modern laptop is more powerful than Atlas was. The other day I was relating all this to another twenty-something woman, a freelance writer — I’m a fascinating conversationalist — and trying to persuade her that if things could change so much in the last 50 years, computing would be vastly different 50 years hence when she’s a granny, but she was reluctant to believe that items like keyboards, screens and mice would disappear and be as forgotten as card punches, floppy disks and modems when they’re superseded by all the data and imaging going directly to the brain. It’s already happening. I read in the newspaper the other day that a dog has been trained to move things about on a screen merely by willing them to do so, and I believe that Elon Musk is working on direct connections to the head. I’m glad in a way that I won’t be around to see the fruits of these researches, though I’d quite like to return to this young woman as a ghost saying in spooky tones “I tooold you sooo.”
As I’ve been writing this piece I’ve found Pulp’s ‘Help the Aged’ playing in what’s left of my brain. Can’t think why.