The rise of a giant: History of Computing, Part 5

Kevin Hofer
12.4.2019
Translation: machine translated

In the early days, digital computers were huge. Their sheer size fuelled fears, and they were also seen as a threat to jobs. One company was central to the acceptance of computers: IBM.

The stored-program concept, set out in John von Neumann's 1945 "First Draft of a Report on the EDVAC", states that computers should store data and programs in binary code in memory. It is the most important invention in the history of computing, because it allows one program to treat another program as data.
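To make that point concrete, here is a loose modern-day sketch in Python (an analogy, not anything the early machines ran): one program holds another program as ordinary data that it can inspect and rewrite, and only then hands it over for execution.

```python
# A loose modern analogy for the stored-program idea:
# a program is just data until we choose to execute it.
source = "print(2 + 3)"               # the 'other program', stored as plain data

print(len(source))                    # treat it like any other data: 12 characters
modified = source.replace("3", "40")  # or rewrite it, much as a translator would
exec(modified)                        # and finally execute it: prints 42
```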

Most of the computers built in the following years were based on this concept, and by the early 1950s a handful of such machines existed.

Mainframe computers for specialists

However, the big breakthrough was a long time coming. One reason was that the computers of the time could only be operated by specialists, and the work of those specialists was expensive. The machines themselves were highly specialised and could only perform one computing task at a time, so machine time was precious: to save money, any work that a scientist could do by hand was done by a scientist rather than by the computer.

Computers did not have a good public image, as they fuelled fears that they would destroy jobs. This was also reflected in popular culture, for example in the film "Desk Set" (German title: "A Woman Who Knows Everything"): when a computer is introduced at the company, the employees fear for their jobs.

Translations for the machine

Programs for early computers had to be written in the language of the respective machine. The vocabulary and syntax of these machine languages differed significantly from mathematical notation and from natural language. It was obvious that this translation had to be automated. Ada Lovelace and Charles Babbage had already realised this back in the 1830s.

To automate it, a translation program written in machine language has to run on the computer itself and produce the target program in machine code. Early higher-level programming languages, however, were still translated for the machine by hand rather than by the computer; Herman Goldstine, for example, did this using flowcharts.

This is where compilers came to the rescue. Instead of translating each instruction from the higher-level programming language into numerical code every time it runs, the compiler translates the entire program into machine language once and saves the result for later use. Although this initial translation takes a long time, the translated program can then be run again and again without any further translation.
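As a rough modern-day illustration of this compile-once, run-many idea (a sketch using Python's built-in compile(), not how 1950s compilers worked), the source text is translated a single time and the result is reused on every subsequent run:

```python
# Sketch of the compile-once, run-many idea:
# the slow translation step happens once, the cheap execution step many times.
source = "total = sum(range(10)); print(total)"

code = compile(source, "<demo>", "exec")  # translate the source text once

for _ in range(3):
    exec(code)                            # reuse the translation: prints 45 three times
```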

Programming languages

Also worth mentioning is Cobol. Its syntax was modelled on natural English, which made it easier to read than Fortran and helped the acceptance of computers after its release in 1959. Cobol was designed for business applications, whereas Fortran was aimed at scientists.

IBM's rise

The company entered the mainframe computer business in the mid-1950s with the IBM 650. The invention of the transistor led IBM to gradually switch from vacuum tubes to semiconductor electronics. These first transistorised computers heralded the second generation of computers.

At this time, IBM maintained several separate computer series: for science and engineering, for data processing, for accounting, and for supercomputing. In the early 1960s, IBM's managers decided to put all their eggs in one basket and combine all of these applications in a single architecture. For an estimated 5 billion dollars, the company developed System/360.

IBM System/360 was more of an architecture than a single machine. Central to it was the operating system, which ran on all 360 models and came in three variants: for installations without hard drives, for smaller installations with hard drives, and for larger installations with hard drives. The first 360 models from 1965 were hybrids of transistors and integrated circuits; today they are regarded as the third generation of computers.

The 360 operating system led to a shift: computers were now valued according to their operating system rather than their hardware. The financial risk taken in development paid off for IBM. Well into the 1970s, the company from Armonk, New York, was the undisputed market leader.

That's it for the fifth part of the history of computing. After a long break, I'm resuming the series and giving you a short, crisp look at IT's past. If you don't want to miss any further digressions into the history of computing, follow me by clicking the "Follow author" button.
