The lecture conducted by Charlie Gere looked at what it means to call a culture digital. Tracing the development of technology from the early 1800s, it showed how computing and digital culture have evolved through a continual chain of events aimed at improving the quality of human life and making everyday tasks easier and more automated.
The story begins with the Jacquard loom of 1801, which used punch cards to control the weaving of patterns in fabric, each card corresponding to a single row of the weave. The loom exhibits a form of semi-automation: because the pattern was pre-determined on the cards rather than configured by hand at run-time, human error was reduced. The punch-card mechanism would later be used in early modern computers for storing programs, though it was first adopted by Charles Babbage for the Analytical Engine.
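The card-per-row principle can be sketched in a few lines of code. This is a toy illustration, not a historical card layout: each "card" is a list of hole positions (1 = hole, warp thread lifted; 0 = no hole), and each card drives exactly one row of the weave.

```python
def weave(cards):
    """Render each punch card as one row of fabric.

    A hole (1) lifts the thread, shown as '#'; no hole (0) leaves it
    down, shown as '.'. One card = one row, as on the Jacquard loom.
    """
    return ["".join("#" if hole else "." for hole in card) for card in cards]

# A simple diamond motif encoded as four illustrative cards
cards = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]

for row in weave(cards):
    print(row)
```

The point of the sketch is that the pattern lives entirely in the cards: swapping the deck changes the fabric without touching the machine, which is the sense in which the loom separates "program" from mechanism.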
Charles Babbage sought to reduce the rate of human error in the calculation of mathematical tables, and looked for a way to achieve this mechanically. His design for the Analytical Engine, widely considered the first design for a general-purpose mechanical computer, could in theory accomplish this. His collaborator Ada Lovelace specified a method for calculating Bernoulli numbers on the machine, and predicted that it could one day facilitate the composition of complex music. Again the machine sought to automate, to self-regulate, and to reduce human error.
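The kind of calculation Lovelace specified can be reproduced today in a few lines. This sketch does not follow her actual operation sequence for the Engine; it uses the standard recurrence for Bernoulli numbers, with the convention B_0 = 1 and sum over j of C(m+1, j)·B_j = 0.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
    using the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

print(bernoulli(6))
# B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, odd terms beyond B_1 vanish
```

Exact rational arithmetic is used deliberately: like Babbage's machine, it removes the rounding and transcription errors that motivated the Engine in the first place.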
Other discoveries and inventions between the 1800s and the present day have also fed into the digital culture we now live in: the realisation that a logical mathematics could express far more than purely numerical mathematics, and the resulting creation of Boolean algebra; Morse code, a digital form of communication built from just dots and dashes (in effect, ones and zeros); even the parallel evolution of travel networks. The typewriter, and the use of punch cards and tabulating machines to calculate census results, belong to the same lineage. Even the gramophone helped us understand the principles of mechanical reproduction of analogue signals.
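The sense in which Morse is already "digital" is easy to make concrete: every message reduces to sequences drawn from a two-symbol alphabet. The sketch below uses a small subset of the International Morse table; the function name and the choice of letters are illustrative only.

```python
# Subset of International Morse: each letter maps to a string over
# a two-symbol alphabet, '.' (dot) and '-' (dash).
MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",
    "A": ".-", "N": "-.", "D": "-..",
}

def encode(text):
    """Encode text as Morse, one space between letters.

    Characters outside the (deliberately small) table are skipped.
    """
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

print(encode("SOS"))  # ... --- ...
```

Replace dot with 0 and dash with 1 and the connection to binary coding is immediate, which is why Morse is often cited as an early digital communication scheme.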
The work that led to the electromagnetic theory of light eventually made wireless communication possible. The Second World War demanded techniques for decoding signals sent by the German army, and the machines built for this purpose are often described as the first computers, although each was dedicated to a single task. The first modern stored-program computer grew out of research conducted over this period, and is generally considered to be the Manchester Mark 1, developed at the University of Manchester by Frederic Williams and Tom Kilburn, with Alan Turing later contributing to its programming.
In more recent years, research into durable communication techniques and global memory, integrated circuits, visual display routines and concepts, and the accessibility of computers and electronics to amateurs (such as the hackers mentioned in the section on digital practice) has led, respectively, to the internet, modern processors, modern user interfaces such as Mac OS and Windows, and the personal computer as we know it today, beginning with the Apple Macintosh in the early 1980s.
The history is vast, and only one of many threads that have bled into this point. The overall theme, however, is that our current culture has been created through a series of events that people have pushed for, or discovered in the course of other events and research. Everything relating to digital culture is a product of the ideas and concepts developed over the past 200 years: the quest to improve human quality of life, the quest to gain control of the environment around us, and a general 'hands-on imperative' (a term coined to describe the ideals of the MIT hackers of the late 1950s) to discover how things work, and how to improve and manipulate them.
Today we embrace digital technologies that have arrived through these 200 years of development, using them for education and for creating art, music and other cultural works. Digital technology now supports research through the modelling of information and data and the simulation of events and practices. We also make use of digital technologies in everyday life: running our bank accounts, doing our everyday shopping, and even socialising. Health issues are explored first on the internet, travel bookings are made online, and games, email and instant messaging still make up a good proportion of how individuals use this one technology. Hoffman et al. (2004), for example, in a research project investigating whether the Internet had become indispensable, found that for a large proportion of individuals it had.
Digital identity, however, moves beyond the identity of the individual as defined by the digital technologies they use. Digital identity in this sense looks at society and culture as a whole, and at how they have been shaped by the development of technology. We have shown only a small selection of the events and developments that have led us to this point. Digital has shaped our current identity, but how is this identity likely to evolve?
With development converging on this one point, it is difficult to see where we are likely to go next. The key, though, is that developments have brought us to a point where we can define our culture as digital. We may be heading for a period of technological inertia, in which the technology we now have is simply modified and improved, as humanity has been doing with the aid of technology over the past 200 years. But the key could lie in a pattern that Babbage first noticed with the Analytical Engine. A breakthrough in its design came when he redirected the machine's output back into its input for further equations; he described this as the machine "eating its own tail".
This concept of the machine eating its own tail could indicate where culture is likely to go. As technology and processes improve, we feed them back into developing new technology. For example, as processor technology improves, we can build new machines that run simulations faster and more accurately; more accurate, faster simulations enable new developments in new products, or advances in old ones, which can in turn feed back into improving processor technology itself, through modelling techniques made possible by the faster processors that came before. Technology breeds new technology, while providing solutions for society to use in the meantime.
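The feedback loop described above can be expressed as a toy model. This is purely illustrative, not an empirical claim: each generation reinvests a fraction of its "capability" as extra growth for the next, which is the output-fed-back-as-input pattern in its simplest form.

```python
def generations(n, capability=1.0, feedback=0.1):
    """Toy model of technology feeding back into its own development.

    Each generation's output (capability) is fed back as input: a
    fraction `feedback` of it becomes additional growth. The numbers
    are illustrative only.
    """
    history = [capability]
    for _ in range(n):
        capability += feedback * capability  # output fed back as input
        history.append(capability)
    return history

print(generations(5))
```

Even this crude model shows the qualitative point: a constant feedback fraction yields compounding rather than linear growth, which is why a culture that reinvests its technology in technology does not simply inch forward.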
References
Wikipedia: Charles Babbage. http://en.wikipedia.org/wiki/Charles_Babbage (accessed 12 June 2005).
Wikipedia: Jacquard loom. http://en.wikipedia.org/wiki/Jacquard_loom (accessed 12 June 2005).
Hoffman, D. L., Novak, T. P., and Venkatesh, A. Has the Internet become Indispensable? Communications of the ACM, 47(7), July 2004, pp. 37–42.
Kraut, R., Patterson, M., Lundmark, V., Kiesler, S., Mukopadhyay, T., and Scherlis, W. Internet Paradox: A Social Technology that Reduces Social Involvement and Psychological Well-Being? American Psychologist, 53 (1998), pp. 1017–1032.