What is digital technology?
Authors and date
- Submitted on: September 7th 2021
- Corinne Touati; Inria
- Christophe Bravard; Grenoble Alpes University
- Benjamin Ninassi; Inria
- Christine Leininger; Inria
- Martine Courbin-Coulaud; Inria
In concrete terms, is digital technology virtual?
In our daily lives, we use many objects from the digital world. We receive e-mails, which we file in virtual folders, which themselves can be placed on a virtual desktop. All these objects seem immaterial to us. Yet they all have a physical reality in our machines, cables and other equipment that allow them to communicate with each other. This confusion between the material and the immaterial is found in the terminology we use; the boundaries between digital, computing and electronics are blurred.
We begin by clarifying these notions, distinguishing the material from the immaterial. We also illustrate this dichotomy through a brief history of the development of digital sciences and technologies.
Some elements of terminology
Let's take a quick look at some of the terms frequently used interchangeably in everyday life.
Informatics, a word coined by contracting information and automatic, is the automated processing of information. This information can be of different types. For example, it can be the position of an object in space.
This information often concerns numbers and their properties. This type of information corresponds to the term digital. More precisely, numbers are generally the immaterial representation of information from the real world. The study of the properties of information, its processing, utilization and transmission, belongs to the field of information theory.
A procedure for manipulating a piece of information to produce a result is called an algorithm 1. Algorithmics is then the study of the efficiency of such a procedure, for example its reliability or its speed. An example of an algorithm is a sequence of instructions for a geometric construction, such as drawing the perpendicular bisector of the segment joining two points. A classical (digital) algorithm is one that sorts a series of numbers in an array.
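To make the idea concrete, here is a minimal sketch of such a procedure, a sorting algorithm written as insertion sort in Python (one of many possible algorithms for this task, chosen here purely for illustration):

```python
def insertion_sort(numbers):
    """Sort a list of numbers in ascending order, in place."""
    for i in range(1, len(numbers)):
        value = numbers[i]
        j = i - 1
        # Shift every larger element one position to the right,
        # then drop the current value into the gap.
        while j >= 0 and numbers[j] > value:
            numbers[j + 1] = numbers[j]
            j -= 1
        numbers[j + 1] = value
    return numbers

print(insertion_sort([5, 2, 9, 1]))  # → [1, 2, 5, 9]
```

Each step of the procedure is unambiguous and mechanical, which is precisely what allows a machine, rather than a human, to carry it out.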
A computer is a machine with the capability of acquiring, storing and communicating information and implementing algorithms on that information.
The theoretical principles behind such machines were developed and studied by the mathematician Alan Turing. In other words, the computer is a material support for information science.
Note that the English word computer reflects a shift in meaning: it used to designate humans who performed calculations, kept books of accounts, and so on. Moreover, English distinguishes between "calculation", arithmetic calculation, and "computation", the logical execution of a series of calculation rules, which is closer to the notions of computer science and algorithmics.
The evolution of computing over the centuries
The first computer systems were of course... human brains. Thus the "procedures" (later called algorithms) for calculation, problem solving or even geometric construction were devised by humans to be used by others.
With technological progress, machines came to replace humans in performing these automatic tasks, and computing machines appeared.
One of the most rudimentary systems is the Pascaline, an ancestor of the calculator dating from 1642. The information manipulated is the number. Nearly a century later, in 1725, Basile Bouchon invented a loom programmable by punched cards. Here the information is the position of the hole(s) in the card, which in turn determines the movement of the hooks and thus of the threads. This invention inspired Charles Babbage, who in 1834 laid the foundations for today's digital machines, although he could not build his machine for lack of financial support. The manipulation of numbers thus entered the physical world.
But the real computer revolution 2 came almost a century later, thanks to another technological development: electronics. It allowed the construction of machines more sophisticated and more reliable than the earlier mechanical ones.
During the Second World War, the need to transmit information securely made (secret) research in computer science crucial, and such research received substantial funding.
Progress, on both the theoretical and the technological fronts, was dazzling. The first computer (i.e. the first implementation of the Turing machine) was tested in 1948 at the University of Manchester.
From then on, the evolution of computers was closely linked to that of electronic systems, and in particular to the development of microelectronics, which gradually allowed the development of systems that were more reliable, smaller, faster and endowed with greater memory capacity. Moore's law, which states that the number of transistors on a microprocessor chip (the core of computers) doubles every 18 months, still seems to hold today, more than 50 years after it was first stated.
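The growth this implies is easy to underestimate. A back-of-the-envelope sketch, assuming a strict 18-month doubling period as stated above (the function name and parameters are illustrative, not from any standard library):

```python
def transistors_after(years, initial=1, doubling_period_months=18):
    """Growth factor in transistor count after `years`,
    assuming one doubling every `doubling_period_months` months."""
    doublings = (years * 12) / doubling_period_months
    return initial * 2 ** doublings

# Over 30 years, an 18-month doubling period yields 2^20 doublings'
# worth of growth: a factor of roughly one million.
print(transistors_after(30))  # → 1048576.0
```

Such exponential growth is why a chip from the 2010s can carry billions of transistors where its 1970s ancestors carried thousands.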
This forward march has continued, especially since the 1970s, with the democratization of the computer, which became "personal" (Personal Computer, PC). Thus, in parallel with supercomputers for companies, computers of lower capacity were developed for individuals.
Moreover, with the continuing miniaturization and integration of electronic circuits, more and more objects and systems are being equipped with automated information-processing capabilities. Thus telephone systems, railways, air traffic control systems and even the electricity grid are controlled by computer systems.
The same holds for everyday objects: telephones, cars, household appliances. This is referred to as "ambient computing", "object computing" or "smart" objects.
This omnipresence of digital technology, and its materiality, especially electronic, leads us to ask: what is its environmental impact?
Some references to go further:
- Sur la définition du mot « NUMÉRIQUE » [online]. Available from Pixees.fr website, 2019 [17/09/2021]
- François Rechenmann. Idée reçue : L'informatique, c'est récent ! [online]. Interstices, 2008. Available at interstices.info [17/09/2021]
- Frédéric Prost. En toute logique : une origine de l’ordinateur [online]. Interstices, 2006. Available at interstices.info [17/09/2021]
- Serge Abiteboul, Thierry Viéville. La naissance du numérique [online]. Binaire, 2018. Available at le blog binaire du Monde [17/09/2021]
- Françoise Berthoud, Eric Drezet, Laurent Lefèvre, Anne-Cécile Orgerie. Sciences du numérique et développement durable: des liens complexes [online]. Interstices, 2015. Available at interstices.info [17/09/2021]