Evolution
Origin of the word
The word algorithm comes from the name of the 9th century Persian Muslim mathematician Abu Abdullah Muhammad ibn Musa Al-Khwarizmi. The word algorism originally referred only to the rules of performing arithmetic using Hindu-Arabic numerals but evolved via European Latin translation of Al-Khwarizmi's name into algorithm by the 18th century. The use of the word evolved to include all definite procedures for solving problems or performing tasks.
Algebra, the origin of variables
The work of the ancient Greek geometers, of the Persian mathematician Al-Khwarizmi (often considered the "father of algebra"), and of Chinese and Western European mathematicians culminated in Leibniz's notion of the "calculus ratiocinator", an algebra of logic.
Algorithms by the ancients
Euclid devised an algorithm that now bears his name. It computes the greatest common divisor of two numbers a and b:
- divide the number a by b, the remainder is r
- replace a by b
- replace b by r
- continue until the remainder is 0 (so b becomes 0); at that point, a is the gcd.
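The steps above translate directly into Python; the function name `euclid_gcd` is mine, not from the source:

```python
def euclid_gcd(a: int, b: int) -> int:
    """Greatest common divisor by Euclid's method."""
    while b != 0:
        r = a % b   # divide a by b; the remainder is r
        a = b       # replace a by b
        b = r       # replace b by r
    return a        # when b reaches 0, a holds the gcd

print(euclid_gcd(252, 105))  # → 21
```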
The algorithm of Archimedes gives an approximation of the number π, by bounding a circle between inscribed and circumscribed polygons.
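One common reconstruction of Archimedes' polygon method starts from a regular hexagon inscribed in a unit circle and repeatedly doubles the number of sides; the side-doubling recurrence and the function name below are my additions, a sketch rather than Archimedes' own procedure:

```python
import math

def archimedes_pi(doublings: int = 10) -> float:
    """Approximate pi by the half-perimeter of a regular polygon
    inscribed in a unit circle, doubling the side count each step."""
    n = 6      # a hexagon inscribed in a unit circle has 6 sides...
    s = 1.0    # ...each of length 1
    for _ in range(doublings):
        # numerically stable form of s' = sqrt(2 - sqrt(4 - s*s))
        s = s / math.sqrt(2.0 + math.sqrt(4.0 - s * s))
        n *= 2
    return n * s / 2.0   # perimeter / 2 approaches pi from below

print(archimedes_pi())
```

Because the polygon is inscribed, every iterate underestimates π; ten doublings (a 6144-gon) already agree with π to about seven digits.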
Eratosthenes defined an algorithm for finding prime numbers, the sieve of Eratosthenes.
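The sieve can be sketched in a few lines of Python: list the numbers up to a limit, then repeatedly cross out the multiples of each prime found.

```python
def sieve(limit: int) -> list[int]:
    """Sieve of Eratosthenes: return all primes <= limit."""
    if limit < 2:
        return []
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # cross out every multiple of p, starting at p*p
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n, prime in enumerate(is_prime) if prime]

print(sieve(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```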
Averroes (1126-1198) used algorithmic methods for calculations.
Adelard of Bath (12th century) introduced the term algorismus, derived from the name of Al-Khwarizmi.
Symbols, rules, and paradoxes
From the 1800s up to the mid-1900s:
- George Boole (1847) invented binary algebra, the basis of modern computers; in effect he unified logic and calculation in a common symbolism.
- Gottlob Frege (1879) introduced his formula language, a lingua characterica: a language written with special symbols, "for pure thought", free from rhetorical embellishments and constructed from specific symbols that are manipulated according to definite rules.
- Giuseppe Peano (1888) published The principles of arithmetic, presented by a new method, the first attempt at an axiomatization of mathematics in a symbolic language.
- Alfred North Whitehead and Bertrand Russell, in their Principia Mathematica (1910-1913), further simplified and amplified the work of Frege.
- Kurt Gödel (1931) cited the liar's paradox in a proof that completely reduces rules of recursion to numbers.
First formalization
The concept of algorithm was formalized in 1936 through Alan Turing's Turing machines and Alonzo Church's lambda calculus, which in turn formed the foundation of computer science.
The symbol space of Post
Emil Post (1936) described the actions of a "computer" as follows:
- "...two concepts are involved: that of a symbol space in which the work leading from problem to answer is to be carried out, and a fixed unalterable set of directions.
His symbol space would be
- "a two way infinite sequence of spaces or boxes... The problem solver or worker is to move and work in this symbol space, being capable of being in, and operating in but one box at a time.... a box is to admit of but two possible conditions, i.e. being empty or unmarked, and having a single mark in it, say a vertical stroke.
- "One box is to be singled out and called the starting point. ...a specific problem is to be given in symbolic form by a finite number of boxes - input - being marked with a stroke. Likewise the answer - output - is to be given in symbolic form by such a configuration of marked boxes....
- "A set of directions applicable to a general problem sets up a deterministic process when applied to each specific problem. This process will terminate only when it comes to the direction of type stop."
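Post's worker can be rendered as a tiny interpreter: a two-way infinite row of boxes, each empty or marked, and a numbered set of directions (mark, erase, move, a conditional test, stop). The direction set and the example "append one stroke" program below are my illustration, not Post's own notation:

```python
from collections import defaultdict

def run_post(program, tape, position=0):
    """Interpret a Post-style worker in a two-way infinite symbol space.

    `tape` maps box index -> 1 (marked) or 0 (empty). `program` maps a
    direction number to a tuple: ("mark",), ("erase",), ("right",),
    ("left",), ("if_marked", j, k) meaning go to direction j if the
    current box is marked and to k otherwise, or ("stop",).
    """
    boxes = defaultdict(int, tape)
    i = 1                              # current direction number
    while True:
        op = program[i]
        if op[0] == "stop":
            return dict(boxes), position
        elif op[0] == "mark":
            boxes[position] = 1
        elif op[0] == "erase":
            boxes[position] = 0
        elif op[0] == "right":
            position += 1
        elif op[0] == "left":
            position -= 1
        elif op[0] == "if_marked":
            i = op[1] if boxes[position] else op[2]
            continue                   # jump: skip the i += 1 below
        i += 1

# Example: walk right past a unary input and add one stroke (successor).
successor = {
    1: ("if_marked", 2, 4),   # on a stroke? keep walking; else mark
    2: ("right",),
    3: ("if_marked", 2, 4),   # re-test after each move
    4: ("mark",),             # first empty box: mark it
    5: ("stop",),
}

boxes, pos = run_post(successor, {0: 1, 1: 1})  # input: two strokes
print(sum(boxes.values()))  # → 3
```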
Turing and the human computer
Alan Turing's work (1936-1937) preceded that of Stibitz (1937); it is unknown whether Stibitz knew of Turing's work. Turing's biographer believed that Turing's use of a typewriter-like model derived from a youthful interest: he had dreamt of inventing typewriters as a boy; Mrs. Turing had a typewriter, and he could well have begun by asking himself what was meant by calling a typewriter "mechanical". Given the prevalence at the time of Morse code, telegraphy, ticker tape machines, and Teletypes, we might conjecture that all were influences.
Turing - his model of computation is now called a Turing machine - begins, as did Post, with an analysis of a human computer that he whittles down to a simple set of basic motions and "states of mind". But he goes one step further and constructs his machine as a model of the computation of numbers:
- "Computing is normally done by writing certain symbols on paper. We may suppose this paper is divided into squares like a child's arithmetic book... I assume then that the computation is carried out on one-dimensional paper, i.e. on a tape divided into squares. I shall also suppose that the number of symbols which may be printed is finite..."
- "The behavior of the computer at any moment is determined by the symbols which he is observing, and his "state of mind" at that moment. We may suppose that there is a bound B to the number of symbols or squares which the computer can observe at one moment. If he wishes to observe more, he must use successive observations. We will also suppose that the number of states of mind which need be taken into account is finite..."
- "Let us imagine the operations performed by the computer to be split up into 'simple operations' which are so elementary that it is not easy to imagine them further divided".
Turing's reduction yields the following:
- "The simple operations must therefore include:
1) Changes of the symbol on one of the observed squares
2) Changes of one of the squares observed to another square within L squares of one of the previously observed squares."
"It may be that some of these changes necessarily involve a change of state of mind. The most general single operation must therefore be taken to be one of the following:
- "1) A possible change (a) of symbol together with a possible change of state of mind.
- "2) A possible change (b) of observed squares, together with a possible change of state of mind".
- "We may now construct a machine to do the work of this computer."
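A machine "to do the work of this computer" can be sketched as a table of rules indexed by (state, observed symbol); the interpreter and the unary-increment example below are my illustration of the idea, not Turing's own construction:

```python
def run_turing(rules, tape, state="start", head=0, max_steps=10_000):
    """Run a Turing machine. `rules` maps (state, symbol) to
    (new_symbol, move, new_state), where move is -1, 0, or +1.
    `max_steps` guards against non-halting rule tables."""
    cells = dict(enumerate(tape))          # tape squares, sparsely stored
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, " ")      # unvisited squares are blank
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Example: increment a unary number - scan right over the strokes,
# write one more stroke on the first blank square, then halt.
increment = {
    ("start", "1"): ("1", +1, "start"),
    ("start", " "): ("1", 0, "halt"),
}

print(run_turing(increment, "111"))  # → 1111
```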
The effective method of Rosser
J. Barkley Rosser (1939) boldly defined an effective mathematical method in the following manner:
- "'Effective method' is used here in the rather special sense of a method each step of which is precisely determined and which is certain to produce the answer in a finite number of steps. With this special meaning, three different precise definitions have been given to date. (1). The simplest of these to state (due to Post and Turing) says essentially that an effective method of solving certain sets of problems exists if one can build a machine which will then solve any problem of the set with no human intervention beyond inserting the question and (later) reading the answer. All three definitions are equivalent, so it doesn't matter which one is used. Moreover, the fact that all three are equivalent is a very strong argument for the correctness of any one."
(1) Rosser references the work of Church and Kleene and their definition of λ-definability; Herbrand and Gödel and their use of recursion; Post and Turing in their mechanism-models of computation.
The algorithmic theory of Kleene
Stephen C. Kleene (1943) stated his now-famous thesis, later known as the "Church-Turing thesis". In this context:
- "Algorithmic theories... In setting up a complete algorithmic theory, what we do is to describe a procedure, performable for each set of values of the independent variables, which procedure necessarily terminates and in such manner that from the outcome we can read a definite answer, "yes" or "no," to the question, "is the predicate value true?"
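A complete algorithmic theory in Kleene's sense is thus a procedure that terminates for every input with a definite yes or no. A simple instance, with primality chosen by me as the predicate:

```python
def is_prime(n: int) -> bool:
    """Decision procedure for the predicate 'n is prime': it
    terminates for every n with a definite yes/no answer."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:          # trial division up to sqrt(n)
        if n % d == 0:
            return False       # "no": a proper divisor was found
        d += 1
    return True                # "yes": no divisor up to sqrt(n)

print(is_prime(97))  # → True
```

The loop is bounded by sqrt(n), so the procedure necessarily terminates, which is exactly what distinguishes a complete algorithmic theory from a mere search that might run forever.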