Computers have been around for a long time. Back in 2005, when I was teaching our introductory course in computer science, I devoted a whole week to the history of computing because I felt it was important for students to understand how computers got started and where they came from. Studying the simpler, early technology helps one grasp the far more complicated machines of today.
Being of the age I am, I grew up in the era of the first microcomputers of the mid-1970s. In those days, mainframes (aka “Big Iron”) from the likes of IBM and Sperry Rand (maker of the UNIVAC) were still the kings of computing. These machines evolved from humbler beginnings in the 1940s as an outgrowth of the war effort. Built initially with vacuum tubes, and later with transistors, they were attended by round-the-clock teams of technicians trying to squeeze as many useful operational cycles out of the machine as possible. No surprise, as these machines cost millions of dollars!
In the 1960s, as miniaturization matured with the introduction of integrated circuits containing tens or hundreds of transistors, certain mainframe functions could be migrated to less expensive, smaller, and easier-to-maintain minicomputers. Costing only tens of thousands of dollars for a modest machine, a minicomputer could be operated in an office environment with just a couple of tech gurus. Even so, computing remained largely confined to the corporate world. A small handful of computer “hobbyists” at the time used slow dial-up links to a minicomputer in an arrangement called time-sharing. Many of these links only worked in the wee hours of the morning, when the rest of the office was asleep. But the hobbyists wanted more…
Early in the 1970s, a new round of electronic miniaturization was rolling out: the LSI (large-scale integration) chip. A direct outgrowth of the space program, the integrated circuit now held thousands of transistors, and several companies believed they could build the core of a minicomputer onto a single chip. While the details could fill volumes, the breakthrough came in April 1974, when Intel (yes, the same Intel you know today) released its 8080 microprocessor to production. By the end of the year, a small company named MITS had built a computer around it: the Altair 8800, widely credited as the first commercially successful microcomputer. It could be purchased by anyone for under $500 and did not require highly trained personnel to run. It helped (a lot!) to know something about computers and electronics, but here was what those hobbyists had wanted… a computer of their very own!
This brings us back to my involvement in the mid-1970s. I was one of those hobbyists, albeit at the age of 15, and I could only wish for some way to find $500 to buy one. Did it happen? You will have to tune in to part two for the answer, along with more on how this interest turned into a career and how it still plays a big part in my work and life today.
Learn more about Harrisburg Academy on our school website.