In mathematics and logic, an operation is finitary if it has finite arity, i.e. if it has a finite number of input values. Similarly, an infinitary operation is one with an infinite number of input values.
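As an illustrative sketch (the particular operations named here are standard examples, not drawn from the text above): a finitary operation of arity n on a set A is a function

    f \colon A^{n} \to A, \qquad n \in \mathbb{N},

for example addition + \colon \mathbb{R}^{2} \to \mathbb{R}, which has arity 2. An infinitary operation instead accepts an infinite family of inputs, such as a supremum operation \bigvee \colon \{0,1\}^{\mathbb{N}} \to \{0,1\} acting on countable sequences of truth values.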
In standard mathematics, an operation is finitary by definition. Therefore, these terms are usually only used in the context of infinitary logic.
By contrast, infinitary logic studies logics that allow infinitely long statements and proofs. In such a logic, one can regard the existential quantifier, for instance, as derived from an infinitary disjunction.
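To make the example concrete (assuming, for illustration, a countable domain enumerated as a_0, a_1, a_2, \ldots), the existential quantifier can be read as the countably infinite disjunction

    \exists x\, \varphi(x) \;\equiv\; \varphi(a_0) \vee \varphi(a_1) \vee \varphi(a_2) \vee \cdots \;=\; \bigvee_{n \in \mathbb{N}} \varphi(a_n),

a statement of infinite length that is admissible in an infinitary logic but not in a finitary one.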
The stress on finiteness came from the idea that human mathematical thought is based on a finite number of principles and that all reasoning follows essentially one rule: modus ponens. The project was to fix a finite number of symbols (essentially the numerals 1, 2, 3, ..., the letters of the alphabet, and some special symbols like "+", "⇒", "(", ")", etc.), give a finite number of propositions expressed in those symbols, which were to be taken as "foundations" (the axioms), and some rules of inference which would model the way humans draw conclusions. From these, regardless of the semantic interpretation of the symbols, the remaining theorems should follow formally using only the stated rules (which makes mathematics look more like a game with symbols than a science), without the need to rely on ingenuity. The hope was to prove that from these axioms and rules all the theorems of mathematics could be deduced. That aim is known as logicism.
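For reference, modus ponens, the single rule of reasoning alluded to above, licenses exactly the following inference: from a proposition \varphi and an implication \varphi \Rightarrow \psi, conclude \psi, often displayed as

    \frac{\varphi \qquad \varphi \Rightarrow \psi}{\psi}.

Any proof built from the axioms by finitely many applications of such a rule is itself a finite object.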