The Power Of The Future Is Rooted In The Past
The first programming languages were written in the 1950s, long before our modern-day IDE software was even a twinkle in the eye of the most forward-thinking minds. Those early languages were very close to the hardware. Instructions were extremely unsophisticated and were almost one-to-one conversions of a circuit's purpose: load value, multiply value, subtract value. In fact, multiplication was achieved by simply adding a value multiple times, so "4 multiplied by 3" was really "load 4, add 4 to that, add 4 to the result, print the result". In essence, the programming language was a series of grunts used to communicate those absolute essentials to the computer to get the work done. Was it a language? Of sorts, yes, but only really a step away from a traveler stranded in a foreign country gesticulating wildly with hand signals at a bartender in order to get the drink they want. It was a struggle. It was arduous and error-prone, even though hardware limitations meant the actual quantity of instructions which could be fed to the machine was tiny by today's standards. Programmers were speaking the language of the computer to make the computer's life easier. We grunted, they understood. It was magical.

What is a high-level computer language?

[Image: Margaret Hamilton with the code she and her MIT team produced for the Apollo project. Source: WikiMedia Commons]

This onerous and laborious level of communication endured for years. Even by the time of the first moon landings, instructions were still designed to make the job easy for the computer, with little or no deference to the ease of human comprehension. The capacity of the computer's instruction store – where it kept the actual "what should I do" instructions – had massively increased, but you still had to be something akin to a superhuman to write code which worked reliably. No wonder that era spawned so many comparisons of programmers to wizards and warlocks!
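The repeated-addition trick described above is easy to see in a modern high-level language. This is just an illustrative sketch (the function name and structure are mine, not from any historical machine), showing how "4 multiplied by 3" reduces to a loop of additions:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers by repeated addition,
    the way early hardware without a multiply instruction did."""
    acc = 0                 # LOAD 0 into the accumulator
    for _ in range(b):      # repeat b times...
        acc += a            # ...ADD a to the accumulator
    return acc              # the result to PRINT

print(multiply(4, 3))  # 4 + 4 + 4 = 12
```

Each line maps roughly onto one of those grunt-level instructions: a load, a run of adds, and a final output.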
Of course, the time came when a push towards making programming languages more human-friendly arrived. Algol, Fortran, COBOL and C all broke new ground in abstracting those LOAD, STORE, SHIFT LEFT and ROTR computer-edible instructions into something more palatable for human programmers to consume. The shift from wizards and warlocks towards more regular humanity had begun.

What happened to computer programming in the 1980s?

The 1980s were a huge boom time. That early abstraction of computer instructions into "high-level" languages such as BASIC and COBOL had gone a long way towards making programming more accessible. The advent of the semiconductor and advances in miniaturization brought about programmable calculators, the first digital watches and, eventually, the first personal computers, which freed computing from the enormous room-sized behemoths of the '60s and '70s to a box which could be plugged into an ordinary power supply and would fit on the average office desk. Now, in theory at least, everyone could have a computer. Computing was personal.

Why did the personal computer spawn a new era in computer languages?

The invention of 'personal' computing meant that suddenly the possibility of writing and using computer programs could spread to 'normal' people. It was no longer constrained to the environs of huge corporations with development staff who had been university-trained at great expense. The ordinary individual could write code for the personal […]
