The computer has existed in different forms since antiquity: the abacus was already in use in the Greek and Roman periods, although by modern standards it amounts to little more than a data storage device. The abacus stores numerical values in varying positions along its frame, but it still requires a human to supply the logic and perform the calculations.
The last millennium saw the invention of the computer as we now think of it, but not before several precursors had appeared. In the 1600s Pascal and Leibniz built machines that performed calculations using fixed mechanical procedures. These machines stored values in a gearing system, and data was entered mechanically. The procedures could not be changed once the machine was built, so by modern standards such a machine was, roughly speaking, a semi-automatic calculating device, analogous to a primitive calculator.
The 1800s saw Charles Babbage produce the specification of a computing device, the Analytical Engine, which in theory could perform calculations of considerable complexity. The importance of Babbage's design is that it was programmable: it would read programs from punched cards and print its results on paper to prevent transcription errors. His collaborator Ada Byron Lovelace speculated that the machine could facilitate the composition of complex music, produce graphics, and be applied to practical and scientific problems. She also specified a method for calculating Bernoulli numbers on the machine. Because Babbage's machine was programmable, some accounts regard it as the world's first computer, Ada Lovelace as the first computer programmer, and her specification for calculating Bernoulli numbers as the first computer program.
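Lovelace's calculation itself is not reproduced here, but as a purely illustrative modern sketch (not her algorithm, and with a function name and output format of my own choosing), the standard recurrence sum over j from 0 to m of C(m+1, j)·B_j = 0 for m >= 1, with B_0 = 1, yields the Bernoulli numbers in a few lines of Python:

    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
        using the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1."""
        B = [Fraction(1)]                      # B_0 = 1
        for m in range(1, n + 1):
            s = sum(Fraction(comb(m + 1, j)) * B[j] for j in range(m))
            B.append(-s / (m + 1))             # solve the recurrence for B_m
        return B

    if __name__ == "__main__":
        for i, b in enumerate(bernoulli(8)):
            print(f"B_{i} = {b}")

Using exact fractions rather than floating-point numbers keeps the results precise, which matters because the Bernoulli numbers grow quickly and alternate in sign.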
The modern computing age began in the 1940s with the creation of several computers that compete for the title of the first modern computer. Two machines of particular importance are the Z3, created by Konrad Zuse in Germany in 1941, and Colossus, built in Great Britain in 1943 by Tommy Flowers and his colleagues for the codebreakers at Bletchley Park, where Alan Turing worked. The Z3 was based on electromechanical relays, whereas Colossus was the first programmable electronic digital machine.
Both machines have factors that make them viable candidates for the title of first modern computer. Ironically, however, the deciding factor lies in a theory developed by Turing himself:
A Turing machine consists of an unbounded memory tape, a finite set of symbols, and a finite instruction set (a table of states and transitions). By reading and rewriting the symbols on the tape according to its instructions, the machine can calculate mathematical functions. According to the Church–Turing thesis, a function is computable if and only if it can be computed by a Turing machine. A Universal Turing machine is one that can imitate any other Turing machine, and any machine with the same computational power as a Universal Turing machine is described as Turing-complete.
Colossus was electronic and programmable, but it was incapable of certain computational tasks and was therefore not Turing-complete. The Z3 was fully programmable, and had it been possible to give the Z3 an unbounded amount of memory, it could have imitated any other Turing machine. It is therefore described as Turing-complete, and by this standard it can be called the first modern computer.
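To make the definition concrete, the sketch below (written in Python purely for illustration; the function and the example program are hypothetical and not drawn from any of the machines discussed) simulates a single-tape Turing machine. A transition table maps each (state, symbol) pair to a symbol to write, a head movement, and a next state, and the tape is allowed to grow without bound to approximate the machine's unlimited memory.

    def run_turing_machine(transitions, tape, state="start", accept="halt", max_steps=10_000):
        """Simulate a single-tape Turing machine.

        transitions maps (state, symbol) -> (write_symbol, move, next_state),
        where move is -1 (left), +1 (right) or 0 (stay). The tape is modelled
        as a dict so it can grow without bound in either direction,
        approximating the machine's unbounded memory.
        """
        cells = {i: s for i, s in enumerate(tape)}
        head = 0
        for _ in range(max_steps):
            if state == accept:
                break
            symbol = cells.get(head, "_")          # "_" stands for a blank cell
            write, move, state = transitions[(state, symbol)]
            cells[head] = write
            head += move
        return "".join(cells[i] for i in sorted(cells)).strip("_")

    # A hypothetical example program: walk right along the input,
    # flipping 0s to 1s and 1s to 0s, and halt at the first blank.
    invert_bits = {
        ("start", "0"): ("1", +1, "start"),
        ("start", "1"): ("0", +1, "start"),
        ("start", "_"): ("_", 0, "halt"),
    }

    print(run_turing_machine(invert_bits, "10110"))   # prints 01001

The point of the sketch is only to show how simple the underlying model is: any machine that can rearrange symbols in memory according to such a table, given enough memory, has the full power of a Universal Turing machine.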
In the America of the 1950s and 1960s, a new revolution in computing began…
Although the development of computers continued throughout the 1940s and early 1950s, it took place within laboratories belonging to governments and to large corporations that were beginning to specialise in the area. The outlook within these laboratories was not always far-seeing: IBM, for example, saw the computer as a bulk-processing machine, operating continuously but with very little interactivity. Had this trend continued, it is quite feasible that the computer as we now know it would not exist. However, as development continued and new machines were created, old machines were not discarded but were given to universities and other academic institutions, such as the Massachusetts Institute of Technology (MIT), for study and teaching purposes.
Within the laboratories of this institution the world of hacking was born, and from that epicentre the world of computing has never been the same since. The MIT hackers, along with those at other institutions given access to machines such as the TX-0, the PDP-1 and the PDP-6, began to interact directly with the machines and produce new software; where the existing compilers, programming languages and other tools were not comprehensive enough, they designed new ones for ever-expanding purposes. These hackers started from scratch in terms of software, creating mathematical routines, graphical routines, basic calculators and rudimentary games, and as their visions grew so did their projects, venturing into the realms of artificial intelligence and musical composition. The predictions made by Ada Lovelace were beginning to be realised. Hardware was not untouched either: where these hackers were able to improve on the circuits and components of their machines, they were given the opportunity to experiment and to install new components when this proved advantageous to development.
The culture grew, and although it began to leave the institutions, the programmers who left went on to form companies or to spread the word in other colleges, while those who discovered computers outside academic life became part of the culture themselves. The late 1960s and 1970s saw rapid development in hardware and the advent of the first personal computers. The desire to create a machine that could be used in the home for programming, and the friendly competition and cooperation that arose in pursuit of this goal, produced many computers of varying capabilities that could be assembled at home and then used for programming tasks. However, the unprecedented arrival of the Apple II, created by Steve Wozniak, one of the many hardware hackers of this age, signalled the arrival of the personal desktop computer.
The 1980s saw the creation of a gaming culture, along with the development of home and business software applications. Hardware also improved, spread into ordinary homes and businesses, and started to become part of everyday life. This trend has continued and is still growing in the 21st century.
Computing has become part of everyday culture and a large part of our heritage and the world's identity; and just as computing has become part of our culture, culture itself is beginning to take advantage of digital processes in ways that could scarcely have been imagined before.
When the very first computer game was created on the machines in the MIT laboratories in the 1950s, it was an early example of modelling and simulation: an artificially intelligent opponent gave the application a method of automated movement around the board, although this behaviour could be hard-coded. The first applications to take advantage of the data storage capabilities of computers created a whole new world of data representation, and the first applications that allowed users to type words and save the resulting document created a new method for the storage and retrieval of the printed form. The first database, the first use of the Internet: the list goes on, but each advance in the computer revolution did not merely solve a specific problem; rather, solutions could be taken and reused where appropriate, or reprogrammed into more practical forms. This has led to the application of computers across many cultural areas, new and old, each taking advantage of established methods while spearheading the development and innovation of new computing technologies, in areas including:
- Art History
- History
- Music
- Information Management
- Libraries
- Museums
- Records and Archives
- Architecture
- Archaeology
So out of two seemingly disparate areas, technology and culture, there now exists a synergy that will, it is hoped, prove advantageous to both as the relationship continues to evolve. As we move through this course we shall learn more about the technology, begin to see applications of computers within these varying subject areas, examine how the two can help one another in terms of further development, and explore exactly how this synergy works.