Systems of natural deduction


Systems (or calculi) of natural deduction are a type of calculus in mathematical and philosophical logic, developed in 1934 by Gerhard Gentzen and, around the same time, by Stanisław Jaśkowski, a representative of the Lviv-Warsaw School.

The concept of a calculus of natural deduction (below: KdnS, after the German Kalkül des natürlichen Schließens) is not strictly defined; rather, there are a number of characteristics that apply to such calculi to varying degrees and thereby determine how typical a given specimen is of the genus.

  • In contrast to most other types of calculus, such as tableau calculi, axiomatic calculi, dialogical calculi, etc., a KdnS allows one to make assumptions that are valid for a while within a derivation. These assumptions can later be discharged (see also below). This feature accounts for a large part of the “naturalness” of reasoning in a KdnS, because it corresponds to common practice in mathematical proofs.
  • In contrast to axiomatic calculi, a system of natural deduction contains no axioms, or hardly any, but instead a larger number of rules of inference. Together with the sequent calculi, likewise developed by Gentzen, systems of natural deduction therefore belong to the family of rule logics or rule calculi.
  • The inference rules of a KdnS should be intuitively justifiable and ideally correspond to pre-theoretical proof techniques. This feature also contributes to the “naturalness” of the reasoning.
  • Usually the inference rules are systematized so that an introduction rule and an elimination rule are specified for each logical operator (connective or quantifier). The introduction rule for an operator O licenses the inference of a statement with O as its main operator; the elimination rule leads from a statement with O as its main operator to another statement.

Because of the naturalness of the inferences and the systematization into introduction and elimination rules, a KdnS can be combined with the program of a “proof-theoretic semantics”, which seeks to define the meaning of the logical operators by specifying inference rules.

History

Łukasiewicz gave the impetus for Jaśkowski's development of the KdnS. In 1926 he characterized mathematicians' standard proof practice as making assumptions and seeing where they lead. As a seminar topic, he formulated the project of setting up a logical theory that captures such inferences. Gentzen, independently, characterizes his motivation very similarly:

My first point of view was this: The formalization of logical inference, as developed in particular by Frege, Russell, and Hilbert, is quite a long way from the kind of inference actually practiced in mathematical proofs. [...] First of all, I wanted to set up a formalism that comes as close as possible to actual reasoning. This resulted in a “calculus of natural deduction”.
The most essential external difference between NJ-derivations and derivations in the systems of Russell, Hilbert, and Heyting is this: in the latter, the correct formulas are derived from a set of “basic logical formulas” by a few modes of inference; natural deduction, by contrast, does not in general start from basic logical propositions but from assumptions [...], to which logical inferences are applied. A later inference then makes the result independent of the assumption again.

In order to show the range over which an assumption is in force, Jaśkowski developed two different methods. In one, the formulas that depend on a certain formula are enclosed in a rectangle. Such rectangles can be nested inside one another, but they cannot overlap in the sense that the top of one rectangle lies inside and the bottom outside another rectangle. Jaśkowski's other method is essentially equivalent: here the validity of assumptions is represented by a chain of numerals written at the edge of the proof.

Gentzen, on the other hand, used a tree-like arrangement of the formulas in the proof to represent the dependencies: the leaves of the tree correspond to the assumptions, the root to the proven formula. Some edges are annotated with information about discharged assumptions. The system of rules is new with Gentzen; unlike in Jaśkowski, there are here for the first time introduction and elimination rules for every operator. (In Gentzen's case, however, this yields the intuitionistic calculus; classical inferences are obtained only through additional axioms.)

A KdnS is used for the first time in a textbook in Quine's "Methods of Logic" from 1950. In 1952, Fitch combined Gentzen's system of introduction and elimination rules with Jaśkowski's rectangular representation ("Fitch calculus", although Fitch does not draw the rectangle closed, but only indicates it on the left by a "hypothesis line"). In 1957 Suppes developed a new form of representation for dependencies: similarly to Jaśkowski's second method, the dependencies appear as annotations at the side of the proof. In contrast to Jaśkowski and every other KdnS existing up to that point, in Suppes's system the formulas that depend on a certain assumption no longer need to stand one after another. This eliminates the need to "nest" assumptions and allows proofs to be written in a more linear form. Another of Suppes's innovations is the introduction of parameters.

The rules

Assumption rule and dependency

In a KdnS there is an assumption rule that allows arbitrary statements to be assumed. Inference rules are mostly applied to statements that have already been obtained. If these statements are assumptions, the result of the rule is said to have been obtained "in dependence on" these assumptions. The same holds if the statements to which the rule was applied are not themselves assumptions but were obtained in dependence on some assumption. As a result of the rule, all the dependencies involved accumulate. Some rules, however, allow very specific assumptions to be "discharged": the result of the rule then no longer depends on a certain assumption on which one of the statements to which the rule was applied still depended.

Here is an example: If a statement B can be derived in dependence on a statement A, then the statement “If A, then B” may be inferred. (This is the implication-introduction rule, see below.) Since the assumption A is now explicitly incorporated into the statement (“If A, ...”), the dependence on it can be discharged; that is, “If A, then B” is no longer dependent on A. If B also depended on further statements  ...  , those dependencies would continue to exist: “If A, then B” would still depend on  …  .
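In the linear style of bookkeeping used in the example derivation below, this situation can be sketched roughly as follows (a schematic illustration only: C stands for some arbitrary further assumption, and line 3 for whichever rule actually yields B):

Dependency(s) | Line | Statement | Rule | Applied to
1 | 1 | A | assumption |
2 | 2 | C | assumption |
1, 2 | 3 | B | (some rule) | 1, 2
2 | 4 | If A, then B | →-introduction | (1,) 3

Line 4 no longer depends on assumption 1, which has been discharged by the →-introduction, but it still depends on assumption 2.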

If one wants to prove a logical theorem, the proven statement must be completely free of dependencies. If the proven statement A still depends on statements  ...  , then it has only been shown that the statements  ...  logically imply A.

Classical logic

Propositional logic

The following introduction (I) and elimination (E) rules are usually used for classical propositional logic. The premises stand above the inference line, the conclusion below it; square brackets mark assumptions that are discharged:

The conjunction “A and B” can be inferred from two statements A and B (rule of conjunction introduction).
Example: From the statements “Skolem was Norwegian” (A) and “Skolem was a logician” (B) one can conclude: “Skolem was Norwegian and Skolem was a logician” (A ∧ B).
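Rendered in the schematic notation described above (the rule labels such as ∧-I are added here merely for reference), conjunction introduction reads:

$$\frac{A \qquad B}{A \land B}\;(\land\text{-I})$$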

From a conjunction “A and B”, each individual conjunct, i.e. both A and B, can be inferred (rule of conjunction elimination).
Example: From “Skolem was Norwegian and Skolem was a logician” (A ∧ B) it can be concluded: “Skolem was Norwegian” (A) [and also “Skolem was a logician” (B)].
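Schematically, the two forms of conjunction elimination can be written as:

$$\frac{A \land B}{A}\;(\land\text{-E}) \qquad \frac{A \land B}{B}\;(\land\text{-E})$$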

From a statement A, the disjunction “A or B” can be inferred (rule of disjunction introduction).
Example: From “Skolem was Norwegian” (A) it can be concluded: “Skolem was Norwegian or a Swede” (A ∨ B).
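In schematic form (B may be any statement whatsoever):

$$\frac{A}{A \lor B}\;(\lor\text{-I}) \qquad \frac{B}{A \lor B}\;(\lor\text{-I})$$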

If a sentence C can be derived from each disjunct of a disjunction “A or B”, then this sentence follows from the disjunction itself (rule of disjunction elimination).
Example: Suppose I know that Skolem was Norwegian or a Swede (A ∨ B). From “Skolem was Norwegian” (A) follows “Skolem was Scandinavian” (C). The same follows from “Skolem was a Swede” (B). So from “Skolem was Norwegian or a Swede” (A ∨ B) I can conclude “Skolem was Scandinavian” (C).
This rule corresponds to the proof technique of "case distinction" (cf. Proof (mathematics) ).
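One common schematic rendering of disjunction elimination, with the square brackets marking the assumptions discharged by the rule:

$$\frac{A \lor B \qquad \begin{matrix}[A]\\ \vdots\\ C\end{matrix} \qquad \begin{matrix}[B]\\ \vdots\\ C\end{matrix}}{C}\;(\lor\text{-E})$$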

If a statement B can be derived from a statement A, then (by the deduction theorem) the implication “If A, then B” can also be derived.
Example: If I can conclude from the assumption “Skolem was Norwegian” (A) that “Skolem was Scandinavian” (B), I can (free of this assumption) conclude: “If Skolem was Norwegian, he was Scandinavian” (A → B).
This rule corresponds to the proving technique "conditional proof".
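Schematically, with [A] marking the assumption that is discharged:

$$\frac{\begin{matrix}[A]\\ \vdots\\ B\end{matrix}}{A \to B}\;(\to\text{-I})$$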

Modus ponens: From the implication “If A, then B” together with A, the statement B follows.
Example: From “If Skolem was Norwegian, then he was Scandinavian” (A → B), together with “Skolem was Norwegian” (A) the statement “Skolem was Scandinavian” (B) follows.
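In schematic form:

$$\frac{A \to B \qquad A}{B}\;(\to\text{-E})$$
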
If both B and “not B”, and thus a contradiction, can be derived from a statement A, then the negation “not A” can be concluded.
Example: From the statement “2 and 3 are the only prime numbers that exist” (A) it follows on the one hand that “5 is a prime number” (B), because 5 is divisible neither by 2 nor by 3. But it also follows that “5 is not a prime number” (¬B), because by assumption there are only the prime numbers 2 and 3. We can therefore conclude that “2 and 3 are not the only prime numbers” (¬A).
This rule corresponds to the proof technique "indirect proof" or reductio ad absurdum .

If a statement B can be derived from the statement A, then together with “not B” the negation “not A” follows.
Example: From the statement “Skolem was a Swede” (A) it follows that “Skolem was not Norwegian” (B). But if we know “Skolem was Norwegian” (¬B), then we can infer the negation of A, that is, “Skolem was not a Swede” (¬A).
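Schematically (again with [A] marking the discharged assumption):

$$\frac{\begin{matrix}[A]\\ \vdots\\ B\end{matrix} \qquad \lnot B}{\lnot A}$$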

Duplex negatio affirmat: From the statement “not not A” one can infer A.
Example: From “It is not true that Skolem was not Norwegian” (¬¬A) follows “Skolem was Norwegian” (A).
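Schematically:

$$\frac{\lnot\lnot A}{A}\;(\lnot\lnot\text{-E})$$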

Predicate logic

For a calculus of natural deduction for predicate logic, additional introduction and elimination rules for the quantifiers are necessary.

When formulating the rules, so-called "parameters" are used. Parameters are terms that may not occur in axioms. It is assumed that an infinite supply of parameters is available. A parameter plays roughly the same role in the calculus of natural deduction that free variables play in other calculi, but parameters cannot be bound by quantifiers.

Two auxiliary notions are required for an exact formulation of the quantifier rules: an instantiation of a universal or existential statement, $\forall x\, A$ or $\exists x\, A$, by the term t, written A(t), is the result of replacing all free occurrences of x in A by t. A parameterization of one of these statements by the parameter u, A(u), is an instantiation by u, where u must not already occur in A.

$$\frac{A(u)}{\forall x\, A}\;(\forall\text{-I}),$$ where A(u) does not depend on any statement in which u occurs.

From the parameterization A(u), the statement “For every x, A” can be concluded. An additional condition is that A(u) does not depend on any statement that contains the parameter u.
Example: If I have concluded that Skolem is perishable, and if no information about Skolem has entered into that conclusion, then I can conclude that everything is perishable. The proviso that no information about Skolem entered into the conclusion corresponds to the condition that A(u) must not depend on a statement in which u occurs.

The instantiation A(t) can be inferred from the statement “For every x, A”.
Example: From “Everything is perishable” can be concluded “Skolem is perishable”.
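Schematically, with A(t) an instantiation as defined above:

$$\frac{\forall x\, A}{A(t)}\;(\forall\text{-E})$$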

From the instantiation A(t), the statement “There is an x for which A holds” can be concluded.
Example: From “Skolem was Norwegian” one can conclude “There were Norwegians”.
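The corresponding schematic form is:

$$\frac{A(t)}{\exists x\, A}\;(\exists\text{-I})$$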

$$\frac{\exists x\, A \qquad \begin{matrix}[A(u)]\\ \vdots\\ B\end{matrix}}{B}\;(\exists\text{-E}),$$ where u does not occur in B or in any statement on which B depends, except for the parameterization A(u).

A statement B can be inferred from the statement “There is an x for which A holds” if B can be derived from the parameterization A(u). An additional condition is that B does not contain u and does not depend on any statements that contain u, except for A(u).
Example: I know that there is a Norwegian logician. From the statement “Skolem was Norwegian and Skolem was a logician” I can conclude “There is a Scandinavian logician”, and no further information about Skolem enters into this conclusion (except that he is Norwegian and a logician). Then from “There is a Norwegian logician” I can conclude: “There is a Scandinavian logician”.

Identity

The identity sign can also be given a meaning by means of introduction and elimination rules.

The statement “t = t” can always be inferred.
Example: The statement “Skolem is identical to Skolem” holds. Note that this rule need not be applied to any prior statements: it has no premises.
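Since the rule has no premises, its schematic form has nothing above the inference line:

$$\frac{\;}{t = t}\;(=\text{-I})$$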

From “t1 = t2” and a statement A, a statement A(t1 // t2) can be inferred, where A(t1 // t2) results from A by replacing all or some occurrences of t1 in A with t2.
Example: From “Skolem is identical to the inventor of the Skolemian normal form” and “Skolem was Norwegian” it can be concluded: “The inventor of the Skolemian normal form was Norwegian”.
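Schematically, with A(t1 // t2) as described above:

$$\frac{t_1 = t_2 \qquad A}{A(t_1 // t_2)}\;(=\text{-E})$$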

An example derivation

This derivation proves contraposition: (A → B) → (¬B → ¬A).

Dependency(s) | Line | Statement | Rule | Applied to
1 | 1 | A → B | assumption (fixed) |
2 | 2 | ¬B | assumption (fixed) |
3 | 3 | A | assumption |
1, 3 | 4 | B | →-elimination (modus ponens) | 1, 3
1, 2, 3 | 5 | B ∧ ¬B | ∧-introduction | 2, 4
1, 2 | 6 | ¬A | ¬-introduction | (3,) 5
1 | 7 | ¬B → ¬A | →-introduction | (2,) 6
 | 8 | (A → B) → (¬B → ¬A) | →-introduction | (1,) 7

Non-classical logic

Just as one obtains non-classical logical systems from an axiomatic calculus for classical logic by omitting individual axioms or replacing them with new ones, one can obtain non-classical systems of natural deduction by deleting individual rules from the above set of rules or by replacing them with certain other rules:

  • If one replaces the elimination rule for double negation with the rule ex contradictione sequitur quodlibet, one obtains a calculus corresponding to intuitionistic logic. This corresponds to the use of the effective framework rule in dialogical logic.
  • If one deletes the elimination rule for double negation without replacement, one obtains the so-called minimal calculus (Kolmogorow 1924, Johansson 1937).
  • If, on the other hand, one deletes the introduction rule for the conjunction, one obtains a so-called non-adjunctive paraconsistent calculus.

Cut rule

Gentzen's Hauptsatz (main theorem) states that the cut rule

$$\frac{\Gamma \vdash A \qquad A, \Delta \vdash B}{\Gamma, \Delta \vdash B}$$

is admissible (eliminable) in systems of natural deduction. (Γ and Δ are lists of formulas.)

Literature

  • Ludwik Borkowski: Formal logic. Logical systems. Introduction to metalogics. Akademie-Verlag, Berlin 1976, pp. 41-83.
  • Irving Copi: Introduction to Logic (= university paperbacks. Philosophy 2031). Munich, Fink 1998, ISBN 3-8252-2031-1 .
  • Wilhelm K. Essler, Elke Brendel, Rosa F. Martínez Cruzado: Basic features of logic. Volume 1: Wilhelm K. Essler, Rosa F. Martínez Cruzado: The logical conclusion. 4th edition. Klostermann, Frankfurt am Main 1991, ISBN 3-465-02501-6 .
  • Frederic Brenton Fitch : Symbolic Logic. An Introduction. Ronald Press Company, New York NY 1952.
  • Gerhard Gentzen: Investigations into logical reasoning. In: Mathematische Zeitschrift. Volume 39, 1935, pp. 176-210, 405-431; reprint in: Karel Berka, Lothar Kreiser: Logic texts. Annotated selection on the history of modern logic. 4th, expanded and revised edition. Akademie-Verlag, Berlin 1986; online version of the University of Göttingen: Part 1 and Part 2.
  • Gerhard Gentzen: The Collected Papers. Edited by Manfred E. Szabo. North-Holland Publishing Co., Amsterdam et al. 1969.
  • Michael Huth, Mark Ryan: Logic in Computer Science: Modeling and Reasoning about Systems. 2nd edition, Cambridge University Press, 2004.
  • Stanisław Jaśkowski: On the Rules of Suppositions in Formal Logic (= Uniwersytet Warszawski. Seminarjum Filozoficzne. Studia logica. No. 1). Nakładem Seminarjum Filozoficznego Wydziału Matematyczno-Przyrodniczego Uniwersytetu Warszawskiego, Warsaw 1934.
  • Ingebrigt Johansson: The minimal calculus, a reduced intuitionistic formalism. In: Compositio Mathematica. Volume 4, 1937, pp. 119-136, numdam.org (PDF; 1.4 MB)
  • Willard Van Orman Quine: Methods of Logic. Holt, Rinehart and Winston, New York NY et al. 1950.
  • Eike von Savigny : Basic course in logical reasoning. Exercises for self-study (= Kleine Vandenhoeck series. Volume 1504). 2nd, improved edition. Vandenhoeck & Ruprecht, Göttingen 1984, ISBN 3-525-33502-4 .
  • Patrick Suppes: Introduction to Logic. Van Nostrand, New York NY et al. 1957.

Web links

References

  1. Pelletier: A History of Natural Deduction and Elementary Logic Textbooks. 2001, p. 106 ff. (For the text see web links).
  2. Pelletier: A History of Natural Deduction and Elementary Logic Textbooks. 2001, p. 107 f.
  3. ^ Gentzen: Investigations into logical reasoning. 1935, p. 176.
  4. ^ Gentzen: Investigations into logical reasoning. 1935, p. 184.
  5. Pelletier: A History of Natural Deduction and Elementary Logic Textbooks. 2001, p. 109.
  6. Pelletier: A History of Natural Deduction and Elementary Logic Textbooks. 2001, p. 110.
  7. Pelletier: A History of Natural Deduction and Elementary Logic Textbooks. 2001, p. 121.
  8. Pelletier: A History of Natural Deduction and Elementary Logic Textbooks. 2001, p. 124 f.
  9. Pelletier: A History of Natural Deduction and Elementary Logic Textbooks. 2001, p. 129.