If video killed the radio star, then digital killed analogue. The days of taking a roll of film to Boots and collecting it an hour or a day later are long gone. Those who nostalgically hold on to the past might blame the rise of computing. Yet Batchen (2002: 165) claims that, ever since their creation, photography and computing were destined to converge: "the two technologies share a common history and embody comparable logics."
Whilst Henry Fox Talbot was perfecting his technique for fixing light, his confidant, Charles Babbage (1792-1871), was tinkering with ideas for his Analytical Engine, the first mechanical computer. Batchen (2002) reflects on how Babbage, Talbot and their contemporaries would have been aware of one another's scientific endeavours, and how this awareness would have shaped their work. Stripped down to their basic elements, both the camera and the computer operate algorithmically. Babbage took great satisfaction in how his 'cultural artefact' could help nature portray itself as a series of mathematical equations. Meanwhile, Talbot could replicate the world around him, based on the absence or presence of light.
These automations arose during the industrial revolution, when repetitive manual tasks were being replaced by mechanical processes that could replicate a sequence of actions quickly and accurately. Batchen (2002: 171) observes that modern computers use the algebraic logic of George Boole: decisions are made based on the presence or absence of things.
Today, the need for speed and the automatic replication of tasks has been the driving force behind a digital revolution. Advancements in associated technologies have enabled smaller batteries and electronics to be produced, resulting in the camera and the computer sharing the screen of the smartphone. This would have been inconceivable in Babbage and Talbot's time. The industrial revolution consisted of large-scale, grand machines powered by coal and steam; the technology to combine Babbage's and Talbot's inventions simply did not exist. As a result, computing and photography followed parallel paths, paths which finally converged in the 21st century.
It could be argued that, by following their own separate paths, both technologies are far more advanced than they would have been if either Babbage or Talbot had tried to invent a computer that takes photographs, or a camera that computes! Rather than fulfilling Batchen's (2002: 165) warning that "computing may even spell photography's doom," photography has given computing a new relevance. What was once confined to the domain of IBM machines in offices has now been opened up to the masses.
Batchen, G. (2002) Each Wild Idea, Cambridge, MA: MIT Press.