

Main article: Inductive and deductive reasoning

In traditional Aristotelian logic, deductive reasoning is inference in which the conclusion is of no greater generality than the premises, as opposed to inductive reasoning, where the conclusion is of greater generality than the premises. Other theories of logic define deductive reasoning as inference in which the conclusion is just as certain as the premises, as opposed to inductive reasoning, where the conclusion can have less certainty than the premises. In both approaches, the conclusion of a deductive inference is necessitated by the premises: the premises can't be true while the conclusion is false. (In Aristotelian logic, the premises in inductive reasoning can also be related in this way to the conclusion.)

Examples

Valid:

All men are mortal.
Socrates is a man.
Therefore Socrates is mortal.

The picture is above the desk.
The desk is above the floor.
Therefore the picture is above the floor.

All birds have wings.
A cardinal is a bird.
Therefore a cardinal has wings.

Invalid:

Every criminal opposes the government.
Everyone in the opposition party opposes the government.
Therefore everyone in the opposition party is a criminal.

This is invalid because the premises fail to establish commonality between membership in the opposition party and being a criminal. This is the famous fallacy of the undistributed middle.
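
One way to see that the form is invalid is to exhibit a model in which both premises hold while the conclusion fails. The following minimal Python sketch does exactly that; the individuals and the three predicates are invented purely for illustration.

# A hypothetical counterexample model showing why the argument form is invalid:
# both premises come out true while the conclusion comes out false.

domain = {"alice", "bob"}
criminal = {"alice"}                    # alice is the only criminal
opposition = {"bob"}                    # bob is the only opposition-party member
opposes_government = {"alice", "bob"}   # both oppose the government

# Premise 1: every criminal opposes the government.
premise1 = all(x in opposes_government for x in criminal)
# Premise 2: everyone in the opposition party opposes the government.
premise2 = all(x in opposes_government for x in opposition)
# Conclusion: everyone in the opposition party is a criminal.
conclusion = all(x in criminal for x in opposition)

print(premise1, premise2, conclusion)   # True True False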

Basic argument forms of the calculus:

  • Modus Ponens: [(p → q) ∧ p] ├ q. If p then q; p; therefore q.
  • Modus Tollens: [(p → q) ∧ ¬q] ├ ¬p. If p then q; not q; therefore not p.
  • Hypothetical Syllogism: [(p → q) ∧ (q → r)] ├ (p → r). If p then q; if q then r; therefore, if p then r.
  • Disjunctive Syllogism: [(p ∨ q) ∧ ¬p] ├ q. Either p or q; not p; therefore, q.
  • Constructive Dilemma: [(p → q) ∧ (r → s) ∧ (p ∨ r)] ├ (q ∨ s). If p then q; and if r then s; but either p or r; therefore either q or s.
  • Destructive Dilemma: [(p → q) ∧ (r → s) ∧ (¬q ∨ ¬s)] ├ (¬p ∨ ¬r). If p then q; and if r then s; but either not q or not s; therefore either not p or not r.
  • Simplification: (p ∧ q) ├ p. p and q are true; therefore p is true.
  • Conjunction: p, q ├ (p ∧ q). p and q are true separately; therefore they are true conjointly.
  • Addition: p ├ (p ∨ q). p is true; therefore the disjunction (p or q) is true.
  • Composition: [(p → q) ∧ (p → r)] ├ [p → (q ∧ r)]. If p then q; and if p then r; therefore if p is true then q and r are true.
  • De Morgan's Theorem (1): ¬(p ∧ q) ├ (¬p ∨ ¬q). The negation of (p and q) is equivalent to (not p or not q).
  • De Morgan's Theorem (2): ¬(p ∨ q) ├ (¬p ∧ ¬q). The negation of (p or q) is equivalent to (not p and not q).
  • Commutation (1): (p ∨ q) ├ (q ∨ p). (p or q) is equivalent to (q or p).
  • Commutation (2): (p ∧ q) ├ (q ∧ p). (p and q) is equivalent to (q and p).
  • Association (1): [p ∨ (q ∨ r)] ├ [(p ∨ q) ∨ r]. p or (q or r) is equivalent to (p or q) or r.
  • Association (2): [p ∧ (q ∧ r)] ├ [(p ∧ q) ∧ r]. p and (q and r) is equivalent to (p and q) and r.
  • Distribution (1): [p ∧ (q ∨ r)] ├ [(p ∧ q) ∨ (p ∧ r)]. p and (q or r) is equivalent to (p and q) or (p and r).
  • Distribution (2): [p ∨ (q ∧ r)] ├ [(p ∨ q) ∧ (p ∨ r)]. p or (q and r) is equivalent to (p or q) and (p or r).
  • Double Negation: p ├ ¬¬p. p is equivalent to the negation of not p.
  • Transposition: (p → q) ├ (¬q → ¬p). If p then q is equivalent to if not q then not p.
  • Material Implication: (p → q) ├ (¬p ∨ q). If p then q is equivalent to either not p or q.
  • Material Equivalence (1): (p ↔ q) ├ [(p → q) ∧ (q → p)]. (p is equivalent to q) means (if p is true then q is true) and (if q is true then p is true).
  • Material Equivalence (2): (p ↔ q) ├ [(p ∧ q) ∨ (¬q ∧ ¬p)]. (p is equivalent to q) means either (p and q are true) or (both p and q are false).
  • Exportation: [(p ∧ q) → r] ├ [p → (q → r)]. From (if p and q are true then r is true) we can prove (if q is true then r is true, if p is true).
  • Importation: [p → (q → r)] ├ [(p ∧ q) → r]. From (if p is true, then if q then r) we can prove (if p and q are true then r is true).
  • Tautology: p ├ (p ∨ p). p is true is equivalent to p is true or p is true.
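
Each of these sequents can be checked mechanically: an argument form is valid exactly when no assignment of truth values to its variables makes every premise true and the conclusion false. The following Python sketch brute-forces that check for a few of the forms above; the helper names (implies, valid) are ours, invented for illustration, not part of any standard library.

from itertools import product

def implies(a, b):
    # Material implication: (a -> b) is false only when a is true and b is false.
    return (not a) or b

def valid(argument, n_vars):
    """An argument form is valid iff the conclusion is true under
    every assignment that makes all premises true."""
    for values in product([True, False], repeat=n_vars):
        premises, conclusion = argument(*values)
        if all(premises) and not conclusion:
            return False
    return True

# Modus Tollens: [(p -> q) & ~q] |- ~p
modus_tollens = lambda p, q: ([implies(p, q), not q], not p)
# Disjunctive Syllogism: [(p | q) & ~p] |- q
disjunctive_syllogism = lambda p, q: ([p or q, not p], q)
# Constructive Dilemma: [(p -> q) & (r -> s) & (p | r)] |- (q | s)
constructive_dilemma = lambda p, q, r, s: (
    [implies(p, q), implies(r, s), p or r], q or s)

print(valid(modus_tollens, 2))          # True
print(valid(disjunctive_syllogism, 2))  # True
print(valid(constructive_dilemma, 4))   # True

An invalid form, such as [(p → q) ∧ q] ├ p (affirming the consequent), fails this check, since the assignment p false, q true makes the premises true and the conclusion false.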

Axiomatization

In more formal terms, a deduction is a sequence of statements such that every statement can be derived from those before it. This, however, leaves open the question of how we prove the first sentence (since it cannot follow from anything). Axiomatic propositional logic solves this by requiring a proof to meet the following conditions:

A proof of α from an ensemble Σ of well-formed formulas (wffs) is a finite sequence of wffs:

β1,...,βi,...,βn

where

βn = α

and for each βi (1 ≤ i ≤ n), either

  • βi ∈ Σ

or

  • βi is an axiom,

or

  • βi is the output of Modus Ponens for two previous wffs, βi-g and βi-h.

Different versions of axiomatic propositional logic contain a few axioms, usually three or more, in addition to one or more rules of inference. For instance, Gottlob Frege's axiomatization of propositional logic, which was the first such attempt, has six propositional axioms and two rules. Bertrand Russell and Alfred North Whitehead also proposed a system with five axioms.

For instance, a version of axiomatic propositional logic due to Jan Łukasiewicz (1878-1956) has a set A of axioms adopted as follows:

  • [PL1] p → (q → p)
  • [PL2] (p → (q → r)) → ((p → q) → (p → r))
  • [PL3] (¬p → ¬q) → (q → p)

and a set R of rules of inference containing a single rule, Modus Ponendo Ponens, as follows:

  • [MP] from α and α → β, infer β.

The inference rule(s) allow us to derive statements that follow from the axioms or from the given wffs of the ensemble Σ.
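
As a rough illustration of the definition above (not of any particular textbook implementation), the following Python sketch checks that a candidate sequence is a proof of α from Σ: every step must belong to Σ, be an axiom, or follow from two earlier steps by Modus Ponens, and the last step must be α. Formulas are encoded as nested tuples, and the axioms are supplied as concrete formula instances; handling axiom schemas such as [PL1]-[PL3] would additionally require pattern matching, which is omitted here.

# A minimal sketch of the proof definition above. An atom is a string,
# an implication is ("->", antecedent, consequent), a negation is ("~", formula).
# All names here are invented for illustration.

def follows_by_mp(formula, earlier):
    """True if `formula` is the consequent of an implication in `earlier`
    whose antecedent also appears in `earlier` (Modus Ponens)."""
    for phi in earlier:
        if isinstance(phi, tuple) and phi[0] == "->" and phi[2] == formula:
            if phi[1] in earlier:
                return True
    return False

def is_proof(sequence, sigma, axioms, alpha):
    """Checks the definition: every step is in sigma, an axiom, or the output
    of Modus Ponens applied to two previous wffs; the last step is alpha."""
    for i, beta in enumerate(sequence):
        earlier = sequence[:i]
        if beta in sigma or beta in axioms or follows_by_mp(beta, earlier):
            continue
        return False
    return bool(sequence) and sequence[-1] == alpha

# Example: from sigma = {p, p -> q}, the sequence (p -> q), p, q proves q.
p, q = "p", "q"
sigma = {p, ("->", p, q)}
proof = [("->", p, q), p, q]
print(is_proof(proof, sigma, axioms=set(), alpha=q))  # True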

Natural deductive logic

In one version of natural deductive logic, presented by E. J. Lemmon and referred to here as system L, there are no axioms to begin with. There are only nine primitive rules that govern the syntax of a proof.

The nine primitive rules of system L are:

  1. The Rule of Assumption (A)
  2. Modus Ponendo Ponens (MPP)
  3. The Rule of Double Negation (DN)
  4. The Rule of Conditional Proof (CP)
  5. The Rule of ∧-introduction (∧I)
  6. The Rule of ∧-elimination (∧E)
  7. The Rule of ∨-introduction (∨I)
  8. The Rule of ∨-elimination (∨E)
  9. Reductio Ad Absurdum (RAA)

In system L, a proof is defined by the following conditions:

  1. it is a finite sequence of wffs (well-formed formulas)
  2. each line is justified by a rule of system L
  3. the last line of the proof is what is intended (Q.E.D., quod erat demonstrandum, a Latin expression meaning: which was the thing to be proved), and this last line rests only on the premise(s) given, or on no premise if none is given.

If no premise is given, the sequent is called a theorem. Therefore, the definition of a theorem in system L is:

  • a theorem is a sequent that can be proved in system L using an empty set of assumptions.

or in other words:

  • a theorem is a sequent that can be proved from an empty set of assumptions in system L


An example of the proof of a sequent (Modus Tollendo Tollens in this case):

p → q, ¬q ├ ¬p [Modus Tollendo Tollens (MTT)]

Assumptions   Line   Formula (wff)   Lines in use and justification
1             (1)    p → q           A
2             (2)    ¬q              A
3             (3)    p               A (for RAA)
1,3           (4)    q               1,3 MPP
1,2,3         (5)    q ∧ ¬q          2,4 ∧I
1,2           (6)    ¬p              3,5 RAA
Q.E.D.

An example of the proof of a sequent (a theorem in this case):

├ (p ∨ ¬p)

Assumptions   Line   Formula (wff)            Lines in use and justification
1             (1)    ¬(p ∨ ¬p)                A (for RAA)
2             (2)    ¬p                       A (for RAA)
2             (3)    (p ∨ ¬p)                 2 ∨I
1,2           (4)    (p ∨ ¬p) ∧ ¬(p ∨ ¬p)     1,3 ∧I
1             (5)    ¬¬p                      2,4 RAA
1             (6)    p                        5 DN
1             (7)    (p ∨ ¬p)                 6 ∨I
1             (8)    (p ∨ ¬p) ∧ ¬(p ∨ ¬p)     1,7 ∧I
              (9)    ¬¬(p ∨ ¬p)               1,8 RAA
              (10)   (p ∨ ¬p)                 9 DN
Q.E.D.

Each rule of system L has its own requirements for the type of input(s) or entries it can accept, and its own way of treating and calculating the assumptions carried by its inputs.
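
As a rough sketch of how this bookkeeping works (assuming the usual Lemmon-style convention that each line carries the set of assumption numbers it rests on), the following Python fragment models three of the rules, A, MPP and ∧I, and reproduces the assumption sets of lines (1)-(5) of the MTT proof above. The function names and formula encoding are invented for illustration; RAA, which discharges an assumption rather than pooling assumptions, is not modelled here.

# Each proof line is a pair (assumptions, formula); the rule functions below
# are simplified illustrations, not a full implementation of system L.

def assume(line_number, formula):
    """Rule A: any formula may be assumed; it rests on itself."""
    return (frozenset([line_number]), formula)

def mpp(line_imp, line_ant):
    """MPP: from (p -> q) and p infer q; assumption sets are pooled."""
    (a1, imp), (a2, ant) = line_imp, line_ant
    assert imp[0] == "->" and imp[1] == ant
    return (a1 | a2, imp[2])

def and_intro(line1, line2):
    """∧I: conjoin two formulas; assumption sets are pooled."""
    (a1, f1), (a2, f2) = line1, line2
    return (a1 | a2, ("&", f1, f2))

# Mirroring lines (1)-(5) of the MTT proof above:
l1 = assume(1, ("->", "p", "q"))   # 1     (1) p -> q    A
l2 = assume(2, ("~", "q"))         # 2     (2) ~q        A
l3 = assume(3, "p")                # 3     (3) p         A (for RAA)
l4 = mpp(l1, l3)                   # 1,3   (4) q         1,3 MPP
l5 = and_intro(l4, l2)             # 1,2,3 (5) q & ~q    2,4 ∧I
print(sorted(l5[0]), l5[1])        # [1, 2, 3] ('&', 'q', ('~', 'q'))

Applying RAA at this point would discharge assumption 3 from the pooled set, leaving ¬p resting on assumptions 1 and 2, exactly as in line (6) of the proof.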



This page uses Creative Commons Licensed content from Wikipedia.