Formal Semantics (Logic)

From Wikipedia, the free encyclopedia

Contents

1 Atomic sentence
   1.1 Examples
      1.1.1 Assumptions
      1.1.2 Atomic sentences
      1.1.3 Atomic formulae
      1.1.4 Compound sentences
      1.1.5 Compound formulae
   1.2 Interpretations
   1.3 Translating sentences from a natural language into an artificial language
   1.4 Philosophical significance
   1.5 See also
   1.6 References

2 First-order logic
   2.1 Introduction
   2.2 Syntax
      2.2.1 Alphabet
      2.2.2 Formation rules
      2.2.3 Free and bound variables
      2.2.4 Examples
   2.3 Semantics
      2.3.1 First-order structures
      2.3.2 Evaluation of truth values
      2.3.3 Validity, satisfiability, and logical consequence
      2.3.4 Algebraizations
      2.3.5 First-order theories, models, and elementary classes
      2.3.6 Empty domains
   2.4 Deductive systems
      2.4.1 Rules of inference
      2.4.2 Hilbert-style systems and natural deduction
      2.4.3 Sequent calculus
      2.4.4 Tableaux method
      2.4.5 Resolution
      2.4.6 Provable identities
   2.5 Equality and its axioms
      2.5.1 First-order logic without equality
      2.5.2 Defining equality within a theory
   2.6 Metalogical properties
      2.6.1 Completeness and undecidability
      2.6.2 The Löwenheim–Skolem theorem
      2.6.3 The compactness theorem
      2.6.4 Lindström’s theorem
   2.7 Limitations
      2.7.1 Expressiveness
      2.7.2 Formalizing natural languages
   2.8 Restrictions, extensions, and variations
      2.8.1 Restricted languages
      2.8.2 Many-sorted logic
      2.8.3 Additional quantifiers
      2.8.4 Infinitary logics
      2.8.5 Non-classical and modal logics
      2.8.6 Fixpoint logic
      2.8.7 Higher-order logics
   2.9 Automated theorem proving and formal methods
   2.10 See also
   2.11 Notes
   2.12 References
   2.13 External links

3 Formal proof
   3.1 Background
      3.1.1 Formal language
      3.1.2 Formal grammar
      3.1.3 Formal systems
      3.1.4 Interpretations
   3.2 See also
   3.3 References
   3.4 External links

4 Formal semantics (logic)
   4.1 Notes

5 Formal system
   5.1 Related subjects
      5.1.1 Logical system
      5.1.2 Deductive system
      5.1.3 Formal proofs
      5.1.4 Formal language
      5.1.5 Formal grammar
   5.2 See also
   5.3 References
   5.4 Further reading
   5.5 External links

6 Formation rule
   6.1 Formal language
   6.2 Formal systems
   6.3 Propositional and predicate logic
   6.4 See also

7 Interpretation (logic)
   7.1 Formal languages
      7.1.1 Example
      7.1.2 Logical constants
   7.2 General properties of truth-functional interpretations
      7.2.1 Logical connectives
   7.3 Interpretation of a theory
   7.4 Interpretations for propositional logic
   7.5 First-order logic
      7.5.1 Formal languages for first-order logic
      7.5.2 Interpretations of a first-order language
      7.5.3 Example of a first-order interpretation
      7.5.4 Non-empty domain requirement
      7.5.5 Interpreting equality
      7.5.6 Many-sorted first-order logic
   7.6 Higher-order predicate logics
   7.7 Non-classical interpretations
   7.8 Intended interpretations
      7.8.1 Example
   7.9 Other concepts of interpretation
   7.10 See also
   7.11 References
   7.12 External links

8 Logical consequence
   8.1 Formal accounts
   8.2 A priori property of logical consequence
   8.3 Proofs and models
      8.3.1 Syntactic consequence
      8.3.2 Semantic consequence
   8.4 Modal accounts
      8.4.1 Modal-formal accounts
      8.4.2 Warrant-based accounts
      8.4.3 Non-monotonic logical consequence
   8.5 See also
   8.6 Notes
   8.7 Resources
   8.8 External links

9 Logical constant
   9.1 See also
   9.2 References
   9.3 External links

10 Logical Syntax of Language
   10.1 Life and work
   10.2 Logical syntax
      10.2.1 The purpose of logical syntax
   10.3 Rejection of metaphysics
      10.3.1 The function of logical analysis
   10.4 Selected publications
   10.5 See also
   10.6 References
   10.7 Sources
   10.8 External links

11 Metasyntactic variable
   11.1 Construction
   11.2 Examples
   11.3 Words commonly used as metasyntactic variables
      11.3.1 Arabic
      11.3.2 English
      11.3.3 German
      11.3.4 French
      11.3.5 Hebrew
      11.3.6 Italian
      11.3.7 Japanese
      11.3.8 Portuguese
      11.3.9 Spanish
      11.3.10 Turkish
      11.3.11 Persian
   11.4 See also
   11.5 References
   11.6 External links

12 Metavariable
   12.1 See also
   12.2 References

13 Proposition
   13.1 Historical usage
      13.1.1 By Aristotle
      13.1.2 By the logical positivists
      13.1.3 By Russell
   13.2 Relation to the mind
   13.3 Treatment in logic
   13.4 Objections to propositions
   13.5 See also
   13.6 References
   13.7 External links

14 Propositional calculus
   14.1 History
   14.2 Terminology
   14.3 Basic concepts
      14.3.1 Closure under operations
      14.3.2 Argument
   14.4 Generic description of a propositional calculus
   14.5 Example 1. Simple axiom system
   14.6 Example 2. Natural deduction system
   14.7 Basic and derived argument forms
   14.8 Proofs in propositional calculus
      14.8.1 Example of a proof
   14.9 Soundness and completeness of the rules
      14.9.1 Sketch of a soundness proof
      14.9.2 Sketch of completeness proof
      14.9.3 Another outline for a completeness proof
   14.10 Interpretation of a truth-functional propositional calculus
      14.10.1 Interpretation of a sentence of truth-functional propositional logic
   14.11 Alternative calculus
      14.11.1 Axioms
      14.11.2 Inference rule
      14.11.3 Meta-inference rule
      14.11.4 Example of a proof
   14.12 Equivalence to equational logics
   14.13 Graphical calculi
   14.14 Other logical calculi
   14.15 Solvers
   14.16 See also
      14.16.1 Higher logical levels
      14.16.2 Related topics
   14.17 References
   14.18 Further reading
      14.18.1 Related works
   14.19 External links

15 Propositional formula
   15.1 Propositions
      15.1.1 Relationship between propositional and predicate formulas
      15.1.2 Identity
   15.2 An algebra of propositions, the propositional calculus
      15.2.1 Usefulness of propositional formulas
      15.2.2 Propositional variables
      15.2.3 Truth-value assignments, formula evaluations
   15.3 Propositional connectives
      15.3.1 Connectives of rhetoric, philosophy and mathematics
      15.3.2 Engineering connectives
      15.3.3 CASE connective: IF ... THEN ... ELSE ...
      15.3.4 IDENTITY and evaluation
   15.4 More complex formulas
      15.4.1 Definitions
      15.4.2 Axiom and definition schemas
      15.4.3 Substitution versus replacement
   15.5 Inductive definition
   15.6 Parsing formulas
      15.6.1 Connective seniority (symbol rank)
      15.6.2 Commutative and associative laws
      15.6.3 Distributive laws
      15.6.4 De Morgan’s laws
      15.6.5 Laws of absorption
      15.6.6 Laws of evaluation: Identity, nullity, and complement
      15.6.7 Double negative (Involution)
   15.7 Well-formed formulas (wffs)
      15.7.1 Wffs versus valid formulas in inferences
   15.8 Reduced sets of connectives
      15.8.1 The stroke (NAND)
      15.8.2 IF ... THEN ... ELSE
   15.9 Normal forms
      15.9.1 Reduction to normal form
      15.9.2 Reduction by use of the map method (Veitch, Karnaugh)
   15.10 Impredicative propositions
   15.11 Propositional formula with “feedback”
      15.11.1 Oscillation
      15.11.2 Memory
   15.12 Historical development
   15.13 Footnotes
   15.14 References

16 Rule of inference
   16.1 The standard form of rules of inference
   16.2 Axiom schemas and axioms
   16.3 Example: Hilbert systems for two propositional logics
   16.4 Admissibility and derivability
   16.5 See also
   16.6 References

17 Semantics
   17.1 Linguistics
   17.2 Montague grammar
   17.3 Dynamic turn in semantics
   17.4 Prototype theory
   17.5 Theories in semantics
      17.5.1 Model theoretic semantics
      17.5.2 Formal (or truth-conditional) semantics
      17.5.3 Lexical and conceptual semantics
      17.5.4 Lexical semantics
      17.5.5 Computational semantics
   17.6 Computer science
      17.6.1 Programming languages
      17.6.2 Semantic models
   17.7 Psychology
   17.8 See also
      17.8.1 Linguistics and semiotics
      17.8.2 Logic and mathematics
      17.8.3 Computer science
      17.8.4 Psychology
   17.9 References
   17.10 External links

18 Symbol (formal)
   18.1 Can words be modeled as formal symbols?
   18.2 References
   18.3 See also

19 Syntax (logic)
   19.1 Syntactic entities
      19.1.1 Symbols
      19.1.2 Formal language
      19.1.3 Formation rules
      19.1.4 Propositions
      19.1.5 Formal theories
      19.1.6 Formal systems
      19.1.7 Interpretations
   19.2 References
   19.3 See also

20 Theorem
   20.1 Informal account of theorems
   20.2 Provability and theoremhood
   20.3 Relation with scientific theories
   20.4 Terminology
   20.5 Layout
   20.6 Lore
   20.7 Theorems in logic
      20.7.1 Syntax and semantics
      20.7.2 Derivation of a theorem
      20.7.3 Interpretation of a formal theorem
      20.7.4 Theorems and theories
   20.8 See also
   20.9 Notes
   20.10 References
   20.11 External links

21 Theory (mathematical logic)
   21.1 Theories expressed in formal language generally
      21.1.1 Subtheories and extensions
      21.1.2 Deductive theories
      21.1.3 Consistency and completeness
      21.1.4 Interpretation of a theory
      21.1.5 Theories associated with a structure
   21.2 First-order theories
      21.2.1 Derivation in a first order theory
      21.2.2 Syntactic consequence in a first order theory
      21.2.3 Interpretation of a first order theory
      21.2.4 First order theories with identity
      21.2.5 Topics related to first order theories
   21.3 Examples
   21.4 See also
   21.5 References
   21.6 Further reading

22 Unate function

23 Variable (mathematics)
   23.1 Etymology
   23.2 Genesis and evolution of the concept
   23.3 Specific kinds of variables
      23.3.1 Dependent and independent variables
      23.3.2 Examples
   23.4 Notation
   23.5 See also
   23.6 Bibliography
   23.7 References

24 Well-formed formula
   24.1 Introduction
   24.2 Propositional calculus
   24.3 Predicate logic
   24.4 Atomic and open formulas
   24.5 Closed formulas
   24.6 Properties applicable to formulas
   24.7 Usage of the terminology
   24.8 See also
   24.9 Notes
   24.10 References
   24.11 External links
   24.12 Text and image sources, contributors, and licenses
      24.12.1 Text
      24.12.2 Images
      24.12.3 Content license


Chapter 1

Atomic sentence

In logic, an atomic sentence is a type of declarative sentence which is either true or false (it may also be referred to as a proposition, statement or truthbearer) and which cannot be broken down into other simpler sentences. For example, “The dog ran” is an atomic sentence in natural language, whereas “The dog ran and the cat hid” is a molecular sentence in natural language.

From a logical analysis, the truth or falsity of sentences in general is determined by only two things: the logical form of the sentence and the truth or falsity of its simple sentences. This is to say, for example, that the truth of the sentence “John is Greek and John is happy” is a function of the meaning of “and” and of the truth values of the atomic sentences “John is Greek” and “John is happy”. However, the truth or falsity of an atomic sentence is not a matter that is within the scope of logic itself, but rather of whatever art or science the content of the atomic sentence happens to be talking about.[1]

Logic has developed artificial languages, for example the sentential calculus and the predicate calculus, partly with the purpose of revealing the underlying logic of natural-language statements, whose surface grammar may conceal the underlying logical structure; see Analytic Philosophy. In these artificial languages an atomic sentence is a string of symbols which can represent an elementary sentence in a natural language, and it can be defined as follows.

In a formal language, a well-formed formula (or wff) is a string of symbols constituted in accordance with the rules of syntax of the language. A term is a variable, an individual constant or an n-place function letter followed by n terms. An atomic formula is a wff consisting of either a sentential letter or an n-place predicate letter followed by n terms. A sentence is a wff in which any variables are bound. An atomic sentence is an atomic formula containing no variables. It follows that an atomic sentence contains no logical connectives, variables or quantifiers. A sentence consisting of one or more sentences and a logical connective is a compound (or molecular) sentence. See the vocabulary in First-order logic.
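
The definitions above are purely syntactic, so they can be mirrored directly in a program. The following minimal sketch (in Python, with invented class and function names; it is an illustration added by this edit, not part of the article) represents an atomic formula as a predicate letter applied to terms and classifies it as an atomic sentence exactly when none of its terms is a variable.

    # Minimal sketch: terms are variables or individual constants; an atomic
    # formula is a predicate letter applied to a tuple of terms.
    from dataclasses import dataclass
    from typing import Tuple, Union

    @dataclass(frozen=True)
    class Var:
        name: str            # e.g. "x", "y", "z"

    @dataclass(frozen=True)
    class Const:
        name: str            # e.g. "a", "b", "c"

    Term = Union[Var, Const]

    @dataclass(frozen=True)
    class Atom:
        predicate: str       # e.g. "F", "G", "H"
        args: Tuple[Term, ...]

    def is_atomic_sentence(formula: Atom) -> bool:
        # An atomic sentence is an atomic formula containing no variables.
        return all(isinstance(t, Const) for t in formula.args)

    print(is_atomic_sentence(Atom("F", (Const("a"),))))   # F(a)  -> True
    print(is_atomic_sentence(Atom("F", (Var("x"),))))     # F(x)  -> False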

1.1 Examples

1.1.1 Assumptions

In the following examples:

• let F, G, H be predicate letters;

• let a, b, c be individual constants;

• let x, y, z be variables.

1.1.2 Atomic sentences

These wffs are atomic sentences; they contain no variables or conjunctions:

• F(a)

• H(b, a, c)


1.1.3 Atomic formulae

These wffs are atomic formulae, but are not sentences (atomic or otherwise) because they include free variables:

• F(x)

• G(a, z)

• H(x, y, z)

1.1.4 Compound sentences

These wffs are compound sentences. They are sentences, but are not atomic sentences because they are not atomic formulae:

• ∀x (F(x))

• ∃z (G(a, z))

• ∃x ∀y ∃z (H(x, y, z))

• ∀x ∃z (F(x) ∧ G(a, z))

• ∃x ∀y ∃z (G(a, z) ∨ H(x, y, z))

1.1.5 Compound formulae

These wffs are compound formulae. They are not atomic formulae but are built up from atomic formulae using logical connectives. They are also not sentences because they contain free variables:

• F(x) ∧ G(a, z)

• G(a, z) ∨ H(x, y, z)
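
The four categories above can be checked mechanically. In the hypothetical encoding below (a sketch for illustration, not the article's notation), a wff is a nested tuple, the variables are the strings "x", "y", "z", and a wff counts as a sentence exactly when it has no free variables.

    # Hypothetical nested-tuple encoding of wffs: ("F", "x") is F(x);
    # ("and", p, q) and ("or", p, q) are compounds; ("forall", v, p) and
    # ("exists", v, p) bind the variable v in p.
    VARIABLES = {"x", "y", "z"}        # the individual constants are a, b, c

    def free_vars(wff):
        head = wff[0]
        if head in ("forall", "exists"):
            _, v, body = wff
            return free_vars(body) - {v}
        if head in ("and", "or", "not", "implies"):
            return set().union(*(free_vars(p) for p in wff[1:]))
        # otherwise an atomic formula: a predicate letter followed by terms
        return {t for t in wff[1:] if t in VARIABLES}

    compound_formula = ("and", ("F", "x"), ("G", "a", "z"))                  # F(x) ∧ G(a, z)
    compound_sentence = ("forall", "x", ("exists", "z", compound_formula))   # ∀x ∃z (F(x) ∧ G(a, z))

    print(free_vars(compound_formula))    # {'x', 'z'}  -> a compound formula, not a sentence
    print(free_vars(compound_sentence))   # set()       -> a compound sentence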

1.2 Interpretations

Main article: Interpretation (logic)

A sentence is either true or false under an interpretation which assigns values to the logical variables. We might, for example, make the following assignments:

Individual constants:

• a: Socrates

• b: Plato

• c: Aristotle

Predicates:

• Fα: α is sleeping

• Gαβ: α hates β

• Hαβγ: α made β hit γ

Sentential variables:


• p: It is raining.

Under this interpretation the sentences discussed above would represent the following English statements:

• p: “It is raining.”

• F(a): “Socrates is sleeping.”

• H(b, a, c): “Plato made Socrates hit Aristotle.”

• ∀x (F(x)): “Everybody is sleeping.”

• ∃z (G(a, z)): “Socrates hates somebody.”

• ∃x ∀y ∃z (H(x, y, z)): “Somebody made everybody hit somebody.” (They may not have all hit the same person z, but they all did so because of the same person x.)

• ∀x ∃z (F(x) ∧ G(a, z)): “Everybody is sleeping and Socrates hates somebody.”

• ∃x ∀y ∃z (G(a, z) ∨ H(x, y, z)): “Either Socrates hates somebody or somebody made everybody hit somebody.”
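
As a rough sketch of how such an evaluation works (the data structures and names below are assumptions of this edit, not the article's), the interpretation can be modelled as a domain together with denotations for the constants and extensions for the predicates; an atomic sentence is then true exactly when the denoted tuple lies in the predicate's extension.

    # Hypothetical model of the interpretation given above.
    domain = {"Socrates", "Plato", "Aristotle"}
    constants = {"a": "Socrates", "b": "Plato", "c": "Aristotle"}

    # Predicate extensions: the tuples for which the predicate holds.
    F = {("Socrates",)}                          # "is sleeping"
    G = {("Socrates", "Plato")}                  # "hates"
    H = {("Plato", "Socrates", "Aristotle")}     # "made ... hit ..."

    def holds(extension, *constant_names):
        """Evaluate an atomic sentence such as F(a) or H(b, a, c)."""
        return tuple(constants[c] for c in constant_names) in extension

    print(holds(F, "a"))            # F(a): "Socrates is sleeping"              -> True
    print(holds(H, "b", "a", "c"))  # H(b, a, c): "Plato made Socrates hit Aristotle" -> True

    # A quantified sentence such as ∃z G(a, z) ranges over the whole domain:
    print(any((constants["a"], d) in G for d in domain))   # True: Socrates hates somebody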

1.3 Translating sentences from a natural language into an artificial language

Sentences in natural languages can be ambiguous, whereas the languages of sentential logic and predicate logic are precise. Translation can reveal such ambiguities and express precisely the intended meaning.

For example, take the English sentence “Father Ted married Jack and Jill”. Does this mean that Jack married Jill? In translating, we might make the following assignments:

Individual constants:

• a: Father Ted

• b: Jack

• c: Jill

Predicates:

• Mαβγ: α officiated at the marriage of β to γ

Using these assignments the sentence above could be translated as follows:

• M(a, b, c): Father Ted officiated at the marriage of Jack to Jill.

• ∃x ∃y (M(a, b, x) ∧ M(a, c, y)): Father Ted officiated at the marriage of Jack to somebody and Father Ted officiated at the marriage of Jill to somebody.

• ∃x ∃y (M(x, a, b) ∧ M(y, a, c)): Somebody officiated at the marriage of Father Ted to Jack and somebody officiated at the marriage of Father Ted to Jill.

To establish which is the correct translation of “Father Ted married Jack and Jill”, it would be necessary to ask the speaker exactly what was meant.
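
The ambiguity can also be exhibited by evaluating the candidate translations in two invented situations (a sketch under assumptions of this edit): one in which Father Ted marries Jack to Jill, and one in which he marries each of them to somebody else. The first translation distinguishes the two situations; the weaker existential translation does not.

    # M is the extension of Mαβγ: "α officiated at the marriage of β to γ".
    # The relation is recorded symmetrically in its last two places.
    ted, jack, jill, ann, bob = "Ted", "Jack", "Jill", "Ann", "Bob"
    domain = {ted, jack, jill, ann, bob}

    situation_1 = {(ted, jack, jill), (ted, jill, jack)}            # Jack married Jill
    situation_2 = {(ted, jack, ann), (ted, ann, jack),
                   (ted, jill, bob), (ted, bob, jill)}              # each married someone else

    def reading_1(M):
        # M(a, b, c): Father Ted officiated at the marriage of Jack to Jill.
        return (ted, jack, jill) in M

    def reading_2(M):
        # ∃x ∃y (M(a, b, x) ∧ M(a, c, y)): Ted married Jack to somebody
        # and Ted married Jill to somebody.
        return (any((ted, jack, x) in M for x in domain)
                and any((ted, jill, y) in M for y in domain))

    print(reading_1(situation_1), reading_2(situation_1))   # True True
    print(reading_1(situation_2), reading_2(situation_2))   # False True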


1.4 Philosophical significance

Atomic sentences are of particular interest in philosophical logic and the theory of truth and, it has been argued, there are corresponding atomic facts. An atomic sentence (or possibly the meaning of an atomic sentence) is called an elementary proposition by Wittgenstein and an atomic proposition by Russell:

• 4.2 The sense of a proposition is its agreement and disagreement with possibilities of existence and non-existence of states of affairs. 4.21 The simplest kind of proposition, an elementary proposition, asserts the existence of a state of affairs. (Wittgenstein, Tractatus Logico-Philosophicus, s:Tractatus Logico-Philosophicus)

• A proposition (true or false) asserting an atomic fact is called an atomic proposition. (Russell, Introduction to Tractatus Logico-Philosophicus, s:Tractatus Logico-Philosophicus/Introduction)

• See also [2] and [3], especially regarding elementary proposition and atomic proposition as discussed by Russell and Wittgenstein.

Note the distinction between an elementary/atomic proposition and an atomic fact.

No atomic sentence can be deduced from (i.e. is entailed by) any other atomic sentence, no two atomic sentences are incompatible, and no set of atomic sentences is self-contradictory. Wittgenstein made much of this in his Tractatus Logico-Philosophicus. If there are any atomic sentences then there must be “atomic facts” which correspond to those that are true, and the conjunction of all true atomic sentences would say all that was the case, i.e. “the world”, since, according to Wittgenstein, “The world is all that is the case” (TLP 1). Similarly, the set of all sets of atomic sentences corresponds to the set of all possible worlds (all that could be the case).

The T-schema, which embodies the theory of truth proposed by Alfred Tarski, defines the truth of arbitrary sentences from the truth of atomic sentences.

1.5 See also

• Logical atomism

• Logical constant

• Truthbearer

1.6 References

• Benson Mates, Elementary Logic, OUP, New York 1972 (Library of Congress Catalog Card no. 74-166004)

• Elliott Mendelson, Introduction to Mathematical Logic, Van Nostrand Reinhold Company, New York 1964

• Wittgenstein, Tractatus Logico-Philosophicus: s:Tractatus Logico-Philosophicus

[1] Philosophy of Logic, Willard Van Orman Quine

[2] http://plato.stanford.edu/entries/logical-atomism/

[3] http://plato.stanford.edu/entries/wittgenstein-atomism/


Chapter 2

First-order logic

First-order logic is a formal system used in mathematics, philosophy, linguistics, and computer science. It is also known as first-order predicate calculus, the lower predicate calculus, quantification theory, and predicate logic. First-order logic uses quantified variables over (non-logical) objects. This distinguishes it from propositional logic, which does not use quantifiers.

A theory about some topic is usually first-order logic together with a specified domain of discourse over which the quantified variables range, finitely many functions which map from that domain into it, finitely many predicates defined on that domain, and a recursive set of axioms which are believed to hold for those things. Sometimes “theory” is understood in a more formal sense, which is just a set of sentences in first-order logic.

The adjective “first-order” distinguishes first-order logic from higher-order logic, in which there are predicates having predicates or functions as arguments, or in which one or both of predicate quantifiers or function quantifiers are permitted.[1] In first-order theories, predicates are often associated with sets. In interpreted higher-order theories, predicates may be interpreted as sets of sets.

There are many deductive systems for first-order logic that are sound (all provable statements are true in all models) and complete (all statements which are true in all models are provable). Although the logical consequence relation is only semidecidable, much progress has been made in automated theorem proving in first-order logic. First-order logic also satisfies several metalogical theorems that make it amenable to analysis in proof theory, such as the Löwenheim–Skolem theorem and the compactness theorem.

First-order logic is the standard for the formalization of mathematics into axioms and is studied in the foundations of mathematics. Mathematical theories, such as number theory and set theory, have been formalized into first-order axiom schemas such as Peano arithmetic and Zermelo–Fraenkel set theory (ZF) respectively.

No first-order theory, however, has the strength to describe uniquely a structure with an infinite domain, such as the natural numbers or the real line. A uniquely describing, i.e. categorical, axiom system for such a structure can be obtained in stronger logics such as second-order logic.

For a history of first-order logic and how it came to dominate formal logic, see José Ferreirós (2001).

2.1 Introduction

While propositional logic deals with simple declarative propositions, first-order logic additionally covers predicates and quantification.

A predicate takes an entity or entities in the domain of discourse as input and outputs either True or False. Consider the two sentences “Socrates is a philosopher” and “Plato is a philosopher”. In propositional logic, these sentences are viewed as being unrelated and are denoted, for example, by p and q. However, the predicate “is a philosopher” occurs in both sentences, which have the common structure “a is a philosopher”. The variable a is instantiated as “Socrates” in the first sentence and as “Plato” in the second sentence. The use of predicates, such as “is a philosopher” in this example, distinguishes first-order logic from propositional logic.

Predicates can be compared. Consider, for example, the first-order formula “if a is a philosopher, then a is a scholar”. This formula is a conditional statement with “a is a philosopher” as hypothesis and “a is a scholar” as conclusion.


The truth of this formula depends on which object is denoted by a, and on the interpretations of the predicates “is a philosopher” and “is a scholar”.

Variables can be quantified over. The variable a in the previous formula can be quantified over, for instance, in the first-order sentence “For every a, if a is a philosopher, then a is a scholar”. The universal quantifier “for every” in this sentence expresses the idea that the claim “if a is a philosopher, then a is a scholar” holds for all choices of a.

The negation of the sentence “For every a, if a is a philosopher, then a is a scholar” is logically equivalent to the sentence “There exists a such that a is a philosopher and a is not a scholar”. The existential quantifier “there exists” expresses the idea that the claim “a is a philosopher and a is not a scholar” holds for some choice of a.

The predicates “is a philosopher” and “is a scholar” each take a single variable. Predicates can take several variables. In the first-order sentence “Socrates is the teacher of Plato”, the predicate “is the teacher of” takes two variables.

To interpret a first-order formula, one specifies what each predicate means and the entities that can instantiate the predicated variables. These entities form the domain of discourse or universe, which is usually required to be a nonempty set. Given an interpretation whose domain of discourse consists of all human beings and in which the predicate “is a philosopher” is understood as “has written the Republic”, the sentence “There exists a such that a is a philosopher” is seen as being true, as witnessed by Plato.
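
Over a finite domain of discourse the quantified sentences in this introduction can be checked directly. The sketch below (with invented data; an illustration added by this edit, not part of the article) also shows that the universal claim fails exactly when its existential negation holds.

    # Hypothetical finite domain and unary predicates.
    domain = {"Socrates", "Plato", "Aristotle", "Alexander"}
    philosopher = {"Socrates", "Plato", "Aristotle"}
    scholar = {"Socrates", "Plato"}

    # "For every a, if a is a philosopher, then a is a scholar."
    universal = all((a not in philosopher) or (a in scholar) for a in domain)

    # "There exists a such that a is a philosopher and a is not a scholar."
    negation = any((a in philosopher) and (a not in scholar) for a in domain)

    print(universal, negation)   # False True: here Aristotle is a philosopher but not a scholar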

2.2 Syntax

There are two key parts of first-order logic. The syntax determines which collections of symbols are legal expressions in first-order logic, while the semantics determine the meanings behind these expressions.

2.2.1 Alphabet

Unlike natural languages, such as English, the language of first-order logic is completely formal, so that it can be mechanically determined whether a given expression is legal. There are two key types of legal expressions: terms, which intuitively represent objects, and formulas, which intuitively express predicates that can be true or false. The terms and formulas of first-order logic are strings of symbols which together form the alphabet of the language. As with all formal languages, the nature of the symbols themselves is outside the scope of formal logic; they are often regarded simply as letters and punctuation symbols.

It is common to divide the symbols of the alphabet into logical symbols, which always have the same meaning, and non-logical symbols, whose meaning varies by interpretation. For example, the logical symbol ∧ always represents “and”; it is never interpreted as “or”. On the other hand, a non-logical predicate symbol such as Phil(x) could be interpreted to mean “x is a philosopher”, “x is a man named Philip”, or any other unary predicate, depending on the interpretation at hand.

Logical symbols

There are several logical symbols in the alphabet, which vary by author but usually include:

• The quantifier symbols ∀ and ∃

• The logical connectives: ∧ for conjunction, ∨ for disjunction, → for implication, ↔ for biconditional, ¬ for negation. Occasionally other logical connective symbols are included. Some authors use Cpq instead of → and Epq instead of ↔, especially in contexts where → is used for other purposes. Moreover, the horseshoe ⊃ may replace →; the triple-bar ≡ may replace ↔; a tilde (~), Np, or Fpq may replace ¬; ||, or Apq, may replace ∨; and &, Kpq, or the middle dot, ⋅, may replace ∧, especially if these symbols are not available for technical reasons. (Note: the aforementioned symbols Cpq, Epq, Np, Apq, and Kpq are used in Polish notation.)

• Parentheses, brackets, and other punctuation symbols. The choice of such symbols varies depending on context.

• An infinite set of variables, often denoted by lowercase letters at the end of the alphabet x, y, z, ... . Subscripts are often used to distinguish variables: x₀, x₁, x₂, ... .

• An equality symbol (sometimes, identity symbol) =; see the section on equality below.


Not all of these symbols are required – only one of the quantifiers, together with negation, conjunction, variables, brackets and equality, suffices. There are numerous minor variations that may define additional logical symbols:

• Sometimes the truth constants T, Vpq, or ⊤ for “true” and F, Opq, or ⊥ for “false” are included. Without any such logical operators of valence 0, these two constants can only be expressed using quantifiers.

• Sometimes additional logical connectives are included, such as the Sheffer stroke, Dpq (NAND), and exclusive or, Jpq.

Non-logical symbols

The non-logical symbols represent predicates (relations), functions and constants on the domain of discourse. It used to be standard practice to use a fixed, infinite set of non-logical symbols for all purposes. A more recent practice is to use different non-logical symbols according to the application one has in mind. Therefore it has become necessary to name the set of all non-logical symbols used in a particular application. This choice is made via a signature.[2]

The traditional approach is to have only one, infinite, set of non-logical symbols (one signature) for all applications. Consequently, under the traditional approach there is only one language of first-order logic.[3] This approach is still common, especially in philosophically oriented books.

1. For every integer n ≥ 0 there is a collection of n-ary, or n-place, predicate symbols. Because they represent relations between n elements, they are also called relation symbols. For each arity n we have an infinite supply of them:

Pⁿ₀, Pⁿ₁, Pⁿ₂, Pⁿ₃, ...

2. For every integer n ≥ 0 there are infinitely many n-ary function symbols:

fⁿ₀, fⁿ₁, fⁿ₂, fⁿ₃, ...

In contemporary mathematical logic, the signature varies by application. Typical signatures in mathematics are {1, ×} or just {×} for groups, or {0, 1, +, ×, <} for ordered fields. There are no restrictions on the number of non-logical symbols. The signature can be empty, finite, or infinite, even uncountable. Uncountable signatures occur for example in modern proofs of the Löwenheim–Skolem theorem.

In this approach, every non-logical symbol is of one of the following types.

1. A predicate symbol (or relation symbol) with some valence (or arity, number of arguments) greater than or equal to 0. These are often denoted by uppercase letters P, Q, R,... .

• Relations of valence 0 can be identified with propositional variables. For example, P, which can stand for any statement.

• For example, P(x) is a predicate variable of valence 1. One possible interpretation is “x is a man”.

• Q(x, y) is a predicate variable of valence 2. Possible interpretations include “x is greater than y” and “x is the father of y”.

2. A function symbol, with some valence greater than or equal to 0. These are often denoted by lowercase letters f, g, h,... .

• Examples: f(x) may be interpreted as “the father of x”. In arithmetic, it may stand for “−x”. In set theory, it may stand for “the power set of x”. In arithmetic, g(x, y) may stand for “x + y”. In set theory, it may stand for “the union of x and y”.

• Function symbols of valence 0 are called constant symbols, and are often denoted by lowercase letters at the beginning of the alphabet a, b, c,... . The symbol a may stand for Socrates. In arithmetic, it may stand for 0. In set theory, such a constant may stand for the empty set.

The traditional approach can be recovered in the modern approach by simply specifying the “custom” signature to consist of the traditional sequences of non-logical symbols.
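
A signature can be written down concretely as a table of non-logical symbols with their kinds and arities. The sketch below is a loose illustration added by this edit (the dictionary layout and names are assumptions, with * standing in for ×); it encodes the two signatures mentioned above.

    # A signature lists the non-logical symbols with their kind and arity.
    # Constant symbols are treated as 0-ary function symbols.
    group_signature = {
        "1": ("function", 0),    # identity element (a constant symbol)
        "*": ("function", 2),    # the group operation
    }

    ordered_field_signature = {
        "0": ("function", 0),
        "1": ("function", 0),
        "+": ("function", 2),
        "*": ("function", 2),
        "<": ("predicate", 2),   # the order relation
    }

    def arity(signature, symbol):
        kind, n = signature[symbol]
        return n

    print(arity(ordered_field_signature, "<"))   # 2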


2.2.2 Formation rules

The formation rules define the terms and formulas of first-order logic. When terms and formulas are represented as strings of symbols, these rules can be used to write a formal grammar for terms and formulas. These rules are generally context-free (each production has a single symbol on the left side), except that the set of symbols may be allowed to be infinite and there may be many start symbols, for example the variables in the case of terms.

Terms

The set of terms is inductively defined by the following rules:

1. Variables. Any variable is a term.

2. Functions. Any expression f(t₁, ..., tₙ) of n arguments (where each argument tᵢ is a term and f is a function symbol of valence n) is a term. In particular, symbols denoting individual constants are 0-ary function symbols, and are thus terms.

Only expressions which can be obtained by finitely many applications of rules 1 and 2 are terms. For example, no expression involving a predicate symbol is a term.
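
Because the two rules are inductive, a term is naturally a recursive data structure, and well-formedness can be checked by recursion on that structure. A small sketch (an illustration added by this edit, using a hypothetical encoding in which a variable is a string and a function application is a tuple headed by the function symbol):

    VARIABLES = {"x", "y", "z"}
    FUNCTIONS = {"f": 1, "g": 2, "c": 0}     # an assumed signature; "c" is a constant symbol

    def is_term(t):
        if isinstance(t, str):
            return t in VARIABLES                         # rule 1: variables are terms
        symbol, *args = t
        return (symbol in FUNCTIONS
                and len(args) == FUNCTIONS[symbol]        # correct arity
                and all(is_term(a) for a in args))        # rule 2: f(t1, ..., tn)

    print(is_term(("g", ("f", "x"), ("c",))))   # g(f(x), c) -> True
    print(is_term(("f", "w")))                  # "w" is neither a variable nor a term -> False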

Formulas

The set of formulas (also called well-formed formulas [4] or wffs) is inductively defined by the following rules:

1. Predicate symbols. If P is an n-ary predicate symbol and t1, ..., tn are terms then P(t1,...,tn) is a formula.

2. Equality. If the equality symbol is considered part of logic, and t1 and t2 are terms, then t1 = t2 is a formula.

3. Negation. If φ is a formula, then ¬ φ is a formula.

4. Binary connectives. If φ and ψ are formulas, then (φ → ψ) is a formula. Similar rules apply to other binary logical connectives.

5. Quantifiers. If φ is a formula and x is a variable, then ∀xφ (for all x, φ holds) and ∃xφ (there exists x such that φ) are formulas.

Only expressions which can be obtained by finitely many applications of rules 1–5 are formulas. The formulas obtained from the first two rules are said to be atomic formulas.

For example,

∀x∀y(P (f(x)) → ¬(P (x) → Q(f(y), x, z)))

is a formula, if f is a unary function symbol, P a unary predicate symbol, and Q a ternary predicate symbol. On the other hand, ∀x x→ is not a formula, although it is a string of symbols from the alphabet.

The role of the parentheses in the definition is to ensure that any formula can only be obtained in one way by following the inductive definition (in other words, there is a unique parse tree for each formula). This property is known as unique readability of formulas. There are many conventions for where parentheses are used in formulas. For example, some authors use colons or full stops instead of parentheses, or change the places in which parentheses are inserted. Each author's particular definition must be accompanied by a proof of unique readability.

This definition of a formula does not support defining an if-then-else function ite(c, a, b), where "c" is a condition expressed as a formula, that would return "a" if c is true, and "b" if it is false. This is because both predicates and functions can only accept terms as parameters, but the first parameter is a formula. Some languages built on first-order logic, such as SMT-LIB 2.0, add this.[5]
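The five formation rules translate directly into a recursive datatype. The sketch below, in Python, mirrors rules 1–5 with one constructor per rule (only → and ∀ are shown for the binary connectives and quantifiers); the class names are illustrative.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Pred:        # rule 1: P(t1, ..., tn), with terms represented however one likes
    symbol: str
    args: Tuple

@dataclass(frozen=True)
class Eq:          # rule 2: t1 = t2
    left: object
    right: object

@dataclass(frozen=True)
class Not:         # rule 3: ¬φ
    body: object

@dataclass(frozen=True)
class Implies:     # rule 4: (φ → ψ); other binary connectives would be analogous
    left: object
    right: object

@dataclass(frozen=True)
class ForAll:      # rule 5: ∀x φ (an Exists class for ∃x φ would look the same)
    var: str
    body: object

# ∀x (P(x) → ¬Q(x)), built by finitely many applications of the rules:
example = ForAll("x", Implies(Pred("P", ("x",)), Not(Pred("Q", ("x",)))))
```

Because each constructor records its sub-formulas explicitly, a value of this datatype is its own parse tree, which is one way to picture the unique readability property mentioned above.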


Notational conventions

For convenience, conventions have been developed about the precedence of the logical operators, to avoid the need to write parentheses in some cases. These rules are similar to the order of operations in arithmetic. A common convention is:

• ¬ is evaluated first

• ∧ and ∨ are evaluated next

• Quantifiers are evaluated next

• → is evaluated last.

Moreover, extra punctuation not required by the definition may be inserted to make formulas easier to read. Thus the formula

(¬∀xP (x) → ∃x¬P (x))

might be written as

(¬[∀xP (x)]) → ∃x[¬P (x)].

In some fields, it is common to use infix notation for binary relations and functions, instead of the prefix notation defined above. For example, in arithmetic, one typically writes "2 + 2 = 4" instead of "=(+(2,2),4)". It is common to regard formulas in infix notation as abbreviations for the corresponding formulas in prefix notation.

The definitions above use infix notation for binary connectives such as → . A less common convention is Polish notation, in which one writes → , ∧ , and so on in front of their arguments rather than between them. This convention allows all punctuation symbols to be discarded. Polish notation is compact and elegant, but rarely used in practice because it is hard for humans to read. In Polish notation, the formula

∀x∀y(P (f(x)) → ¬(P (x) → Q(f(y), x, z)))

becomes "∀x∀y→Pfx¬→ PxQfyxz”.

2.2.3 Free and bound variables

Main article: Free variables and bound variables

In a formula, a variable may occur free or bound. Intuitively, a variable is free in a formula if it is not quantified: in ∀y P(x, y), variable x is free while y is bound. The free and bound variables of a formula are defined inductively as follows.

1. Atomic formulas. If φ is an atomic formula then x is free in φ if and only if x occurs in φ. Moreover, there are no bound variables in any atomic formula.

2. Negation. x is free in ¬ φ if and only if x is free in φ. x is bound in ¬ φ if and only if x is bound in φ.

3. Binary connectives. x is free in (φ → ψ) if and only if x is free in either φ or ψ. x is bound in (φ → ψ) if and only if x is bound in either φ or ψ. The same rule applies to any other binary connective in place of → .

4. Quantifiers. x is free in ∀y φ if and only if x is free in φ and x is a different symbol from y. Also, x is bound in ∀y φ if and only if x is y or x is bound in φ. The same rule holds with ∃ in place of ∀ .


For example, in ∀x ∀y (P(x) → Q(x,f(x),z)), x and y are bound variables, z is a free variable, and w is neither because it does not occur in the formula.

Free and bound variables of a formula need not be disjoint sets: x is both free and bound in P(x) → ∀xQ(x). Freeness and boundness can be also specialized to specific occurrences of variables in a formula. For example, in P(x) → ∀xQ(x), the first occurrence of x is free while the second is bound. In other words, the x in P(x) is free while the x in ∀xQ(x) is bound.

A formula in first-order logic with no free variables is called a first-order sentence. These are the formulas that will have well-defined truth values under an interpretation. For example, whether a formula such as Phil(x) is true must depend on what x represents. But the sentence ∃xPhil(x) will be either true or false in a given interpretation.
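The inductive clauses above can be read off as a short recursive function. The following Python sketch computes the set of free variables over the same nested-tuple representation used in the earlier sketches; the names are illustrative.

```python
def term_vars(t):
    """Variables occurring in a term: a bare string is a variable, and a
    function application is ('fn', symbol, args)."""
    if isinstance(t, str):
        return {t}
    return {v for a in t[2] for v in term_vars(a)}

def free_vars(phi):
    tag = phi[0]
    if tag == "pred":                      # atomic: every occurring variable is free
        return {v for t in phi[2] for v in term_vars(t)}
    if tag == "eq":
        return term_vars(phi[1]) | term_vars(phi[2])
    if tag == "not":
        return free_vars(phi[1])
    if tag == "imp":                       # other binary connectives are analogous
        return free_vars(phi[1]) | free_vars(phi[2])
    if tag in ("forall", "exists"):        # the quantified variable is bound
        return free_vars(phi[2]) - {phi[1]}
    raise ValueError("unknown formula")

# P(x) → ∀x Q(x): x occurs both free (in P(x)) and bound (in ∀x Q(x)),
# but the formula as a whole still has x among its free variables.
phi = ("imp", ("pred", "P", ["x"]), ("forall", "x", ("pred", "Q", ["x"])))
print(free_vars(phi))   # {'x'}
```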

2.2.4 Examples

Ordered abelian groups

In mathematics the language of ordered abelian groups has one constant symbol 0, one unary function symbol −, one binary function symbol +, and one binary relation symbol ≤. Then:

• The expressions +(x, y) and +(x, +(y, −(z))) are terms. These are usually written as x + y and x + y − z.

• The expressions +(x, y) = 0 and ≤(+(x, +(y, −(z))), +(x, y)) are atomic formulas. These are usually written as x + y = 0 and x + y − z ≤ x + y.

• The expression (∀x∀y ≤(+(x, y), z) → ∀x∀y +(x, y) = 0) is a formula, which is usually written as ∀x∀y(x + y ≤ z) → ∀x∀y(x + y = 0).

Loving relation

English sentences like "everyone loves someone" can be formalized by first-order logic formulas like ∀x∃y L(x,y). This is accomplished by abbreviating the relation "x loves y" by L(x,y). Using just the two quantifiers ∀ and ∃ and the loving relation symbol L, but no logical connectives and no function symbols (including constants), formulas with 8 different meanings can be built. The following diagrams show models for each of them, assuming that there are exactly five individuals a,...,e who can love (vertical axis) and be loved (horizontal axis). A small red box at row x and column y indicates L(x,y). Only for formulas 9 and 10 is the model unique; all other formulas may be satisfied by several models.

Each model, represented by a logical matrix, satisfies the formulas in its caption in a "minimal" way, i.e. whitening any red cell in any matrix would make it no longer satisfy the corresponding formula. For example, formula 1 is also satisfied by the matrices at 3, 6, and 10, but not by those at 2, 4, 5, and 7. Conversely, the matrix shown at 6 satisfies 1, 2, 5, 6, 7, and 8, but not 3, 4, 9, and 10.

Some formulas imply others, i.e. all matrices satisfying the antecedent (LHS) also satisfy the conclusion (RHS) of the implication — e.g. formula 3 implies formula 1, i.e. each matrix fulfilling formula 3 also fulfills formula 1, but not vice versa (see the Hasse diagram for this ordering relation). In contrast, only some matrices[6] which satisfy formula 2 happen to also satisfy formula 5, whereas others,[7] also satisfying formula 2, do not; therefore formula 5 is not a logical consequence of formula 2.

The sequence of the quantifiers is important! So it is instructive to distinguish formula 1: ∀x ∃y L(y,x), and formula 3: ∃x ∀y L(x,y). In both cases everyone is loved; but in the first case everyone (x) is loved by someone (y), possibly a different one for each x, while in the second case everyone (y) is loved by one and the same person (x).

2.3 Semantics

An interpretation of a first-order language assigns a denotation to all non-logical constants in that language. It also determines a domain of discourse that specifies the range of the quantifiers. The result is that each term is assigned an object that it represents, and each sentence is assigned a truth value. In this way, an interpretation provides semantic meaning to the terms and formulas of the language.


The study of the interpretations of formal languages is called formal semantics. What follows is a description of the standard or Tarskian semantics for first-order logic. (It is also possible to define game semantics for first-order logic, but aside from requiring the axiom of choice, game semantics agree with Tarskian semantics for first-order logic, so game semantics will not be elaborated herein.)

The domain of discourse D is a nonempty set of "objects" of some kind. Intuitively, a first-order formula is a statement about these objects; for example, ∃xP(x) states the existence of an object x such that the predicate P is true of it. The domain of discourse is the set of considered objects. For example, one can take D to be the set of integer numbers.

The interpretation of a function symbol is a function. For example, if the domain of discourse consists of integers, a function symbol f of arity 2 can be interpreted as the function that gives the sum of its arguments. In other words, the symbol f is associated with the function I(f) which, in this interpretation, is addition.

The interpretation of a constant symbol is a function from the one-element set D⁰ to D, which can be simply identified with an object in D. For example, an interpretation may assign the value I(c) = 10 to the constant symbol c.

The interpretation of an n-ary predicate symbol is a set of n-tuples of elements of the domain of discourse. This means that, given an interpretation, a predicate symbol, and n elements of the domain of discourse, one can tell whether the predicate is true of those elements according to the given interpretation. For example, an interpretation I(P) of a binary predicate symbol P may be the set of pairs of integers such that the first one is less than the second. According to this interpretation, the predicate P would be true if its first argument is less than the second.

2.3.1 First-order structures

Main article: Structure (mathematical logic)

The most common way of specifying an interpretation (especially in mathematics) is to specify a structure (also called a model; see below). The structure consists of a nonempty set D that forms the domain of discourse and an interpretation I of the non-logical terms of the signature. This interpretation is itself a function:

• Each function symbol f of arity n is assigned a function I(f) from Dⁿ to D. In particular, each constant symbol of the signature is assigned an individual in the domain of discourse.

• Each predicate symbol P of arity n is assigned a relation I(P) over Dⁿ or, equivalently, a function from Dⁿ to {true, false}. Thus each predicate symbol is interpreted by a Boolean-valued function on D.
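A structure in this sense is easy to write down concretely. The sketch below shows one way to package a domain, the functions I(f), and the relations I(P) in Python; the class and field names, and the example structure, are illustrative only.

```python
from dataclasses import dataclass
from typing import Callable, Dict, FrozenSet, Tuple

@dataclass
class Structure:
    domain: frozenset                          # nonempty domain of discourse D
    funcs: Dict[str, Callable]                 # I(f): D^n -> D (0-ary for constants)
    preds: Dict[str, FrozenSet[Tuple]]         # I(P) as a set of n-tuples over D

# Example: the integers 0..4 with addition modulo 5, the constant 0,
# and the usual "less than" relation.
M = Structure(
    domain=frozenset(range(5)),
    funcs={"+": lambda a, b: (a + b) % 5, "0": lambda: 0},
    preds={"<": frozenset((a, b) for a in range(5) for b in range(5) if a < b)},
)
```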

2.3.2 Evaluation of truth values

A formula evaluates to true or false given an interpretation, and a variable assignment μ that associates an element of the domain of discourse with each variable. The reason that a variable assignment is required is to give meanings to formulas with free variables, such as y = x. The truth value of this formula changes depending on whether x and y denote the same individual.

First, the variable assignment μ can be extended to all terms of the language, with the result that each term maps to a single element of the domain of discourse. The following rules are used to make this assignment:

1. Variables. Each variable x evaluates to μ(x).

2. Functions. Given terms t1, . . . , tn that have been evaluated to elements d1, . . . , dn of the domain of discourse, and an n-ary function symbol f, the term f(t1, . . . , tn) evaluates to (I(f))(d1, . . . , dn).

Next, each formula is assigned a truth value. The inductive definition used to make this assignment is called the T-schema.

1. Atomic formulas (1). A formula P(t1, . . . , tn) is assigned the value true or false depending on whether ⟨v1, . . . , vn⟩ ∈ I(P), where v1, . . . , vn are the evaluations of the terms t1, . . . , tn and I(P) is the interpretation of P, which by assumption is a subset of Dⁿ.


2. Atomic formulas (2). A formula t1 = t2 is assigned true if t1 and t2 evaluate to the same object of the domain of discourse (see the section on equality below).

3. Logical connectives. A formula in the form ¬φ, φ → ψ, etc. is evaluated according to the truth table for the connective in question, as in propositional logic.

4. Existential quantifiers. A formula ∃xφ(x) is true according to M and μ if there exists an evaluation μ′ of the variables that only differs from μ regarding the evaluation of x and such that φ is true according to the interpretation M and the variable assignment μ′. This formal definition captures the idea that ∃xφ(x) is true if and only if there is a way to choose a value for x such that φ(x) is satisfied.

5. Universal quantifiers. A formula ∀xφ(x) is true according to M and μ if φ(x) is true for every pair composed of the interpretation M and some variable assignment μ′ that differs from μ only on the value of x. This captures the idea that ∀xφ(x) is true if every possible choice of a value for x causes φ(x) to be true.

If a formula does not contain free variables, and so is a sentence, then the initial variable assignment does not affect its truth value. In other words, a sentence is true according to M and μ if and only if it is true according to M and every other variable assignment μ′.

There is a second common approach to defining truth values that does not rely on variable assignment functions. Instead, given an interpretation M, one first adds to the signature a collection of constant symbols, one for each element of the domain of discourse in M; say that for each d in the domain the constant symbol cd is fixed. The interpretation is extended so that each new constant symbol is assigned to its corresponding element of the domain. One now defines truth for quantified formulas syntactically, as follows:

1. Existential quantifiers (alternate). A formula ∃xφ(x) is true according to M if there is some d in the domain of discourse such that φ(cd) holds. Here φ(cd) is the result of substituting cd for every free occurrence of x in φ.

2. Universal quantifiers (alternate). A formula ∀xφ(x) is true according to M if, for every d in the domain of discourse, φ(cd) is true according to M.

This alternate approach gives exactly the same truth values to all sentences as the approach via variable assignments.
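Over a finite domain, the T-schema above can be implemented directly: quantifiers become loops over the domain, and a variable assignment is just a dictionary. The following self-contained Python sketch does this for the nested-tuple formula representation used in the earlier sketches; the names and the particular example interpretation are illustrative.

```python
def eval_term(t, interp, mu):
    """A bare string is a variable (looked up in the assignment mu); a term
    ('fn', symbol, args) is evaluated by applying I(f) to the argument values."""
    if isinstance(t, str):
        return mu[t]
    return interp["funcs"][t[1]](*[eval_term(a, interp, mu) for a in t[2]])

def holds(phi, interp, mu):
    tag = phi[0]
    if tag == "pred":      # atomic (1): membership of the value tuple in I(P)
        return tuple(eval_term(a, interp, mu) for a in phi[2]) in interp["preds"][phi[1]]
    if tag == "eq":        # atomic (2): both terms denote the same object
        return eval_term(phi[1], interp, mu) == eval_term(phi[2], interp, mu)
    if tag == "not":
        return not holds(phi[1], interp, mu)
    if tag == "imp":
        return (not holds(phi[1], interp, mu)) or holds(phi[2], interp, mu)
    if tag == "exists":    # some assignment differing only at the quantified variable
        return any(holds(phi[2], interp, {**mu, phi[1]: d}) for d in interp["domain"])
    if tag == "forall":    # every assignment differing only at the quantified variable
        return all(holds(phi[2], interp, {**mu, phi[1]: d}) for d in interp["domain"])
    raise ValueError("unknown formula")

# Domain {0,...,4} with "<" interpreted as the usual order.
interp = {
    "domain": range(5),
    "funcs": {},
    "preds": {"<": {(a, b) for a in range(5) for b in range(5) if a < b}},
}
# ∀x ∃y (x < y) is false here (nothing is greater than 4),
# while ∃y ∀x ¬(y < x) is true (take y = 4).
print(holds(("forall", "x", ("exists", "y", ("pred", "<", ["x", "y"]))), interp, {}))  # False
print(holds(("exists", "y", ("forall", "x", ("not", ("pred", "<", ["y", "x"])))), interp, {}))  # True
```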

2.3.3 Validity, satisfiability, and logical consequence

See also: Satisfiability

If a sentence φ evaluates to true under a given interpretation M, one says that M satisfies φ; this is denoted M ⊨ φ. A sentence is satisfiable if there is some interpretation under which it is true.

Satisfiability of formulas with free variables is more complicated, because an interpretation on its own does not determine the truth value of such a formula. The most common convention is that a formula with free variables is said to be satisfied by an interpretation if the formula remains true regardless of which individuals from the domain of discourse are assigned to its free variables. This has the same effect as saying that a formula is satisfied if and only if its universal closure is satisfied.

A formula is logically valid (or simply valid) if it is true in every interpretation. These formulas play a role similar to tautologies in propositional logic.

A formula φ is a logical consequence of a formula ψ if every interpretation that makes ψ true also makes φ true. In this case one says that φ is logically implied by ψ.

2.3.4 Algebraizations

An alternate approach to the semantics of first-order logic proceeds via abstract algebra. This approach generalizes the Lindenbaum–Tarski algebras of propositional logic. There are three ways of eliminating quantified variables from first-order logic that do not involve replacing quantifiers with other variable binding term operators:

• Cylindric algebra, by Alfred Tarski and his coworkers;


• Polyadic algebra, by Paul Halmos;

• Predicate functor logic, mainly due to Willard Quine.

These algebras are all lattices that properly extend the two-element Boolean algebra.

Tarski and Givant (1987) showed that the fragment of first-order logic that has no atomic sentence lying in the scope of more than three quantifiers has the same expressive power as relation algebra. This fragment is of great interest because it suffices for Peano arithmetic and most axiomatic set theory, including the canonical ZFC. They also proved that first-order logic with a primitive ordered pair is equivalent to a relation algebra with two ordered pair projection functions.

2.3.5 First-order theories, models, and elementary classes

A first-order theory of a particular signature is a set of axioms, which are sentences consisting of symbols from that signature. The set of axioms is often finite or recursively enumerable, in which case the theory is called effective. Some authors require theories to also include all logical consequences of the axioms. The axioms are considered to hold within the theory, and from them other sentences that hold within the theory can be derived.

A first-order structure that satisfies all sentences in a given theory is said to be a model of the theory. An elementary class is the set of all structures satisfying a particular theory. These classes are a main subject of study in model theory.

Many theories have an intended interpretation, a certain model that is kept in mind when studying the theory. For example, the intended interpretation of Peano arithmetic consists of the usual natural numbers with their usual operations. However, the Löwenheim–Skolem theorem shows that most first-order theories will also have other, nonstandard models.

A theory is consistent if it is not possible to prove a contradiction from the axioms of the theory. A theory is complete if, for every formula in its signature, either that formula or its negation is a logical consequence of the axioms of the theory. Gödel's incompleteness theorem shows that effective first-order theories that include a sufficient portion of the theory of the natural numbers can never be both consistent and complete.

For more information on this subject see List of first-order theories and Theory (mathematical logic).

2.3.6 Empty domains

Main article: Empty domain

The definition above requires that the domain of discourse of any interpretation must be a nonempty set. There are settings, such as inclusive logic, where empty domains are permitted. Moreover, if a class of algebraic structures includes an empty structure (for example, there is an empty poset), that class can only be an elementary class in first-order logic if empty domains are permitted or the empty structure is removed from the class.

There are several difficulties with empty domains, however:

• Many common rules of inference are only valid when the domain of discourse is required to be nonempty. One example is the rule stating that φ ∨ ∃xψ implies ∃x(φ ∨ ψ) when x is not a free variable in φ. This rule, which is used to put formulas into prenex normal form, is sound in nonempty domains, but unsound if the empty domain is permitted.

• The definition of truth in an interpretation that uses a variable assignment function cannot work with empty domains, because there are no variable assignment functions whose range is empty. (Similarly, one cannot assign interpretations to constant symbols.) This truth definition requires that one must select a variable assignment function (μ above) before truth values for even atomic formulas can be defined. Then the truth value of a sentence is defined to be its truth value under any variable assignment, and it is proved that this truth value does not depend on which assignment is chosen. This technique does not work if there are no assignment functions at all; it must be changed to accommodate empty domains.

Thus, when the empty domain is permitted, it must often be treated as a special case. Most authors, however, simply exclude the empty domain by definition.


2.4 Deductive systems

A deductive system is used to demonstrate, on a purely syntactic basis, that one formula is a logical consequence of another formula. There are many such systems for first-order logic, including Hilbert-style deductive systems, natural deduction, the sequent calculus, the tableaux method, and resolution. These share the common property that a deduction is a finite syntactic object; the format of this object, and the way it is constructed, vary widely. These finite deductions themselves are often called derivations in proof theory. They are also often called proofs, but are completely formalized unlike natural-language mathematical proofs.

A deductive system is sound if any formula that can be derived in the system is logically valid. Conversely, a deductive system is complete if every logically valid formula is derivable. All of the systems discussed in this article are both sound and complete. They also share the property that it is possible to effectively verify that a purportedly valid deduction is actually a deduction; such deduction systems are called effective.

A key property of deductive systems is that they are purely syntactic, so that derivations can be verified without considering any interpretation. Thus a sound argument is correct in every possible interpretation of the language, regardless of whether that interpretation is about mathematics, economics, or some other area.

In general, logical consequence in first-order logic is only semidecidable: if a sentence A logically implies a sentence B then this can be discovered (for example, by searching for a proof until one is found, using some effective, sound, complete proof system). However, if A does not logically imply B, this does not mean that A logically implies the negation of B. There is no effective procedure that, given formulas A and B, always correctly decides whether A logically implies B.

2.4.1 Rules of inference

Further information: List of rules of inference

A rule of inference states that, given a particular formula (or set of formulas) with a certain property as a hypothesis, another specific formula (or set of formulas) can be derived as a conclusion. The rule is sound (or truth-preserving) if it preserves validity in the sense that whenever any interpretation satisfies the hypothesis, that interpretation also satisfies the conclusion.

For example, one common rule of inference is the rule of substitution. If t is a term and φ is a formula possibly containing the variable x, then φ[t/x] (often denoted φ[x/t]) is the result of replacing all free instances of x by t in φ. The substitution rule states that for any φ and any term t, one can conclude φ[t/x] from φ provided that no free variable of t becomes bound during the substitution process. (If some free variable of t becomes bound, then to substitute t for x it is first necessary to change the bound variables of φ to differ from the free variables of t.)

To see why the restriction on bound variables is necessary, consider the logically valid formula φ given by ∃x(x = y), in the signature (0,1,+,×,=) of arithmetic. If t is the term "x + 1", the formula φ[t/y] is ∃x(x = x + 1), which will be false in many interpretations. The problem is that the free variable x of t became bound during the substitution. The intended replacement can be obtained by renaming the bound variable x of φ to something else, say z, so that the formula after substitution is ∃z(z = x + 1), which is again logically valid.

The substitution rule demonstrates several common aspects of rules of inference. It is entirely syntactical; one can tell whether it was correctly applied without appeal to any interpretation. It has (syntactically defined) limitations on when it can be applied, which must be respected to preserve the correctness of derivations. Moreover, as is often the case, these limitations are necessary because of interactions between free and bound variables that occur during syntactic manipulations of the formulas involved in the inference rule.
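The capture condition in the substitution rule is mechanical enough to check in code. Below is a hedged Python sketch of φ[t/x] over the nested-tuple representation used earlier: it substitutes into terms, leaves bound occurrences alone, and refuses the substitution when a free variable of t would be captured. The names are illustrative.

```python
def subst_term(term, x, t):
    if term == x:                       # an occurrence of the variable x
        return t
    if isinstance(term, str):           # some other variable
        return term
    return ("fn", term[1], [subst_term(a, x, t) for a in term[2]])

def substitute(phi, x, t, t_vars):
    """Compute phi[t/x]; t_vars is the set of variables occurring in t."""
    tag = phi[0]
    if tag == "pred":
        return ("pred", phi[1], [subst_term(a, x, t) for a in phi[2]])
    if tag == "eq":
        return ("eq", subst_term(phi[1], x, t), subst_term(phi[2], x, t))
    if tag == "not":
        return ("not", substitute(phi[1], x, t, t_vars))
    if tag == "imp":
        return ("imp", substitute(phi[1], x, t, t_vars), substitute(phi[2], x, t, t_vars))
    if tag in ("forall", "exists"):
        y, body = phi[1], phi[2]
        if y == x:                      # x is bound here, so there is nothing to replace
            return phi
        if y in t_vars:                 # a free variable of t would become bound
            raise ValueError("capture: rename the bound variable %r first" % y)
        return (tag, y, substitute(body, x, t, t_vars))
    raise ValueError("unknown formula")

# φ = ∃x (x = y).  Substituting the term x + 1 for y is refused, exactly as in
# the example above, because the variable x of the term would be captured.
phi = ("exists", "x", ("eq", "x", "y"))
term = ("fn", "+", ["x", ("fn", "1", [])])
try:
    substitute(phi, "y", term, {"x"})
except ValueError as err:
    print(err)    # capture: rename the bound variable 'x' first
```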

2.4.2 Hilbert-style systems and natural deduction

A deduction in a Hilbert-style deductive system is a list of formulas, each of which is a logical axiom, a hypothesis that has been assumed for the derivation at hand, or follows from previous formulas via a rule of inference. The logical axioms consist of several axiom schemas of logically valid formulas; these encompass a significant amount of propositional logic. The rules of inference enable the manipulation of quantifiers. Typical Hilbert-style systems have a small number of rules of inference, along with several infinite schemas of logical axioms. It is common to have only modus ponens and universal generalization as rules of inference.


Natural deduction systems resemble Hilbert-style systems in that a deduction is a finite list of formulas. However, natural deduction systems have no logical axioms; they compensate by adding additional rules of inference that can be used to manipulate the logical connectives in formulas in the proof.

2.4.3 Sequent calculus

Further information: Sequent calculus

The sequent calculus was developed to study the properties of natural deduction systems. Instead of working with one formula at a time, it uses sequents, which are expressions of the form

A1, . . . , An ⊢ B1, . . . , Bk,

where A1, ..., An, B1, ..., Bk are formulas and the turnstile symbol ⊢ is used as punctuation to separate the two halves. Intuitively, a sequent expresses the idea that (A1 ∧ · · · ∧ An) implies (B1 ∨ · · · ∨ Bk).

2.4.4 Tableaux method

Further information: Method of analytic tableaux

Unlike the methods just described, the derivations in the tableaux method are not lists of formulas. Instead, a derivation is a tree of formulas. To show that a formula A is provable, the tableaux method attempts to demonstrate that the negation of A is unsatisfiable. The tree of the derivation has ¬A at its root; the tree branches in a way that reflects the structure of the formula. For example, to show that C ∨ D is unsatisfiable requires showing that C and D are each unsatisfiable; this corresponds to a branching point in the tree with parent C ∨ D and children C and D.
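To keep the idea concrete without the quantifier rules, here is a propositional-only Python sketch of the tableau method for formulas in negation normal form: a branch closes when it contains a literal and its negation, and a formula is unsatisfiable when every branch closes. This is a simplification for illustration, not the full first-order procedure.

```python
def closed(branch):
    """A branch (a set of (name, polarity) literals) is closed when it
    contains a complementary pair."""
    return any((name, not pol) in branch for (name, pol) in branch)

def satisfiable(todo, branch=frozenset()):
    """Tableau expansion for negation-normal-form formulas built from
    ('lit', name, polarity), ('and', a, b), ('or', a, b)."""
    if not todo:
        return not closed(branch)
    head, rest = todo[0], todo[1:]
    tag = head[0]
    if tag == "lit":
        new_branch = branch | {(head[1], head[2])}
        return (not closed(new_branch)) and satisfiable(rest, new_branch)
    if tag == "and":                 # both conjuncts stay on the same branch
        return satisfiable([head[1], head[2]] + rest, branch)
    if tag == "or":                  # the branch splits, as described above
        return satisfiable([head[1]] + rest, branch) or satisfiable([head[2]] + rest, branch)
    raise ValueError("unknown formula")

# ((a ∨ ¬b) ∧ b) ∧ ¬a is unsatisfiable, so its negation, the tautology
# ((a ∨ ¬b) ∧ b) → a, is provable by the tableau method.
neg_of_goal = ("and", ("and", ("or", ("lit", "a", True), ("lit", "b", False)),
                              ("lit", "b", True)),
                      ("lit", "a", False))
print(satisfiable([neg_of_goal]))    # False
```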

2.4.5 Resolution

The resolution rule is a single rule of inference that, together with unification, is sound and complete for first-order logic. As with the tableaux method, a formula is proved by showing that the negation of the formula is unsatisfiable. Resolution is commonly used in automated theorem proving.

The resolution method works only with formulas that are disjunctions of atomic formulas and their negations (i.e., clauses); arbitrary formulas must first be converted to this form through Skolemization. The resolution rule states that from the hypotheses A1 ∨ · · · ∨ Ak ∨ C and B1 ∨ · · · ∨ Bl ∨ ¬C, the conclusion A1 ∨ · · · ∨ Ak ∨ B1 ∨ · · · ∨ Bl can be obtained.
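At the propositional level (i.e., ignoring unification), the resolution rule itself is a one-liner over clauses represented as sets of literals. The sketch below is only that propositional core; full first-order resolution additionally unifies complementary literals. The names are illustrative.

```python
def resolve(clause1, clause2):
    """Return all resolvents of two clauses.  A clause is a frozenset of
    literals, and a literal is a (name, polarity) pair.  From
    A1 ∨ ... ∨ Ak ∨ C and B1 ∨ ... ∨ Bl ∨ ¬C we obtain A1 ∨ ... ∨ Ak ∨ B1 ∨ ... ∨ Bl."""
    resolvents = set()
    for name, pol in clause1:
        if (name, not pol) in clause2:
            resolvents.add(frozenset((clause1 - {(name, pol)}) |
                                     (clause2 - {(name, not pol)})))
    return resolvents

# {P, Q} and {¬Q, R} resolve on Q to give {P, R}.
c1 = frozenset({("P", True), ("Q", True)})
c2 = frozenset({("Q", False), ("R", True)})
print(resolve(c1, c2))   # {frozenset({('P', True), ('R', True)})}
```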

2.4.6 Provable identities

The following sentences can be called “identities” because the main connective in each is the biconditional.

¬∀xP (x) ⇔ ∃x¬P (x)

¬∃xP (x) ⇔ ∀x¬P (x)

∀x ∀y P (x, y) ⇔ ∀y ∀xP (x, y)

∃x ∃y P (x, y) ⇔ ∃y ∃xP (x, y)

∀xP (x) ∧ ∀xQ(x) ⇔ ∀x (P (x) ∧Q(x))

∃xP (x) ∨ ∃xQ(x) ⇔ ∃x (P (x) ∨Q(x))

P ∧ ∃xQ(x) ⇔ ∃x (P ∧ Q(x)) (where x must not occur free in P)

P ∨ ∀xQ(x) ⇔ ∀x (P ∨ Q(x)) (where x must not occur free in P)
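Each of these identities can be sanity-checked by brute force on a small finite domain (which is, of course, only evidence rather than a proof). The Python snippet below checks the first identity, ¬∀x P(x) ⇔ ∃x ¬P(x), against every interpretation of a unary predicate P over a three-element domain; the domain size is an arbitrary choice.

```python
from itertools import chain, combinations

domain = range(3)
# Every interpretation of a unary predicate P is a subset of the domain.
all_interpretations = chain.from_iterable(
    combinations(domain, k) for k in range(len(domain) + 1))

for subset in all_interpretations:
    P = set(subset)
    lhs = not all(d in P for d in domain)      # ¬∀x P(x)
    rhs = any(d not in P for d in domain)      # ∃x ¬P(x)
    assert lhs == rhs
print("¬∀xP(x) ⇔ ∃x¬P(x) holds in every interpretation over a 3-element domain")
```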


2.5 Equality and its axioms

There are several different conventions for using equality (or identity) in first-order logic. The most common convention, known as first-order logic with equality, includes the equality symbol as a primitive logical symbol which is always interpreted as the real equality relation between members of the domain of discourse, such that the "two" given members are the same member. This approach also adds certain axioms about equality to the deductive system employed. These equality axioms are:

1. Reflexivity. For each variable x, x = x.

2. Substitution for functions. For all variables x and y, and any function symbol f,

x = y → f(...,x,...) = f(...,y,...).

3. Substitution for formulas. For any variables x and y and any formula φ(x), if φ' is obtained by replacing any number of free occurrences of x in φ with y, such that these remain free occurrences of y, then

x = y → (φ → φ').

These are axiom schemas, each of which specifies an infinite set of axioms. The third schema is known as Leibniz's law, "the principle of substitutivity", "the indiscernibility of identicals", or "the replacement property". The second schema, involving the function symbol f, is (equivalent to) a special case of the third schema, using the formula

x = y → (f(...,x,...) = z → f(...,y,...) = z).

Many other properties of equality are consequences of the axioms above, for example:

1. Symmetry. If x = y then y = x.

2. Transitivity. If x = y and y = z then x = z.

2.5.1 First-order logic without equality

An alternate approach considers the equality relation to be a non-logical symbol. This convention is known as first-order logic without equality. If an equality relation is included in the signature, the axioms of equality must now be added to the theories under consideration, if desired, instead of being considered rules of logic. The main difference between this method and first-order logic with equality is that an interpretation may now interpret two distinct individuals as "equal" (although, by Leibniz's law, these will satisfy exactly the same formulas under any interpretation). That is, the equality relation may now be interpreted by an arbitrary equivalence relation on the domain of discourse that is congruent with respect to the functions and relations of the interpretation.

When this second convention is followed, the term normal model is used to refer to an interpretation where no distinct individuals a and b satisfy a = b. In first-order logic with equality, only normal models are considered, and so there is no term for a model other than a normal model. When first-order logic without equality is studied, it is necessary to amend the statements of results such as the Löwenheim–Skolem theorem so that only normal models are considered.

First-order logic without equality is often employed in the context of second-order arithmetic and other higher-order theories of arithmetic, where the equality relation between sets of natural numbers is usually omitted.

2.5.2 Defining equality within a theory

If a theory has a binary formula A(x,y) which satisfies reflexivity and Leibniz's law, the theory is said to have equality, or to be a theory with equality. The theory may not have all instances of the above schemas as axioms, but rather as derivable theorems. For example, in theories with no function symbols and a finite number of relations, it is possible to define equality in terms of the relations, by defining the two terms s and t to be equal if any relation is unchanged by changing s to t in any argument.

Some theories allow other ad hoc definitions of equality:


• In the theory of partial orders with one relation symbol ≤, one could define s = t to be an abbreviation for s ≤ t ∧ t ≤ s.

• In set theory with one relation ∈, one may define s = t to be an abbreviation for ∀x (s ∈ x ↔ t ∈ x) ∧ ∀x (x ∈ s ↔ x ∈ t). This definition of equality then automatically satisfies the axioms for equality. In this case, one should replace the usual axiom of extensionality, ∀x∀y[∀z(z ∈ x ⇔ z ∈ y) ⇒ x = y], by ∀x∀y[∀z(z ∈ x ⇔ z ∈ y) ⇒ ∀z(x ∈ z ⇔ y ∈ z)], i.e. if x and y have the same elements, then they belong to the same sets.

2.6 Metalogical properties

One motivation for the use of first-order logic, rather than higher-order logic, is that first-order logic has many metalogical properties that stronger logics do not have. These results concern general properties of first-order logic itself, rather than properties of individual theories. They provide fundamental tools for the construction of models of first-order theories.

2.6.1 Completeness and undecidability

Gödel's completeness theorem, proved by Kurt Gödel in 1929, establishes that there are sound, complete, effective deductive systems for first-order logic, and thus the first-order logical consequence relation is captured by finite provability. Naively, the statement that a formula φ logically implies a formula ψ depends on every model of φ; these models will in general be of arbitrarily large cardinality, and so logical consequence cannot be effectively verified by checking every model. However, it is possible to enumerate all finite derivations and search for a derivation of ψ from φ. If ψ is logically implied by φ, such a derivation will eventually be found. Thus first-order logical consequence is semidecidable: it is possible to make an effective enumeration of all pairs of sentences (φ,ψ) such that ψ is a logical consequence of φ.

Unlike propositional logic, first-order logic is undecidable (although semidecidable), provided that the language has at least one predicate of arity at least 2 (other than equality). This means that there is no decision procedure that determines whether arbitrary formulas are logically valid. This result was established independently by Alonzo Church and Alan Turing in 1936 and 1937, respectively, giving a negative answer to the Entscheidungsproblem posed by David Hilbert in 1928. Their proofs demonstrate a connection between the unsolvability of the decision problem for first-order logic and the unsolvability of the halting problem.

There are systems weaker than full first-order logic for which the logical consequence relation is decidable. These include propositional logic and monadic predicate logic, which is first-order logic restricted to unary predicate symbols and no function symbols. Other logics with no function symbols which are decidable are the guarded fragment of first-order logic, as well as two-variable logic. The Bernays–Schönfinkel class of first-order formulas is also decidable. Decidable subsets of first-order logic are also studied in the framework of description logics.

2.6.2 The Löwenheim–Skolem theorem

The Löwenheim–Skolem theorem shows that if a first-order theory of cardinality λ has an infinite model, then it has models of every infinite cardinality greater than or equal to λ. One of the earliest results in model theory, it implies that it is not possible to characterize countability or uncountability in a first-order language. That is, there is no first-order formula φ(x) such that an arbitrary structure M satisfies φ if and only if the domain of discourse of M is countable (or, in the second case, uncountable).

The Löwenheim–Skolem theorem implies that infinite structures cannot be categorically axiomatized in first-order logic. For example, there is no first-order theory whose only model is the real line: any first-order theory with an infinite model also has a model of cardinality larger than the continuum. Since the real line is infinite, any theory satisfied by the real line is also satisfied by some nonstandard models. When the Löwenheim–Skolem theorem is applied to first-order set theories, the nonintuitive consequences are known as Skolem's paradox.


2.6.3 The compactness theorem

The compactness theorem states that a set of first-order sentences has a model if and only if every finite subset of it has a model. This implies that if a formula is a logical consequence of an infinite set of first-order axioms, then it is a logical consequence of some finite number of those axioms. This theorem was proved first by Kurt Gödel as a consequence of the completeness theorem, but many additional proofs have been obtained over time. It is a central tool in model theory, providing a fundamental method for constructing models.

The compactness theorem has a limiting effect on which collections of first-order structures are elementary classes. For example, the compactness theorem implies that any theory that has arbitrarily large finite models has an infinite model. Thus the class of all finite graphs is not an elementary class (the same holds for many other algebraic structures).

There are also more subtle limitations of first-order logic that are implied by the compactness theorem. For example, in computer science, many situations can be modeled as a directed graph of states (nodes) and connections (directed edges). Validating such a system may require showing that no "bad" state can be reached from any "good" state. Thus one seeks to determine if the good and bad states are in different connected components of the graph. However, the compactness theorem can be used to show that connected graphs are not an elementary class in first-order logic, and there is no formula φ(x,y) of first-order logic, in the signature of graphs, that expresses the idea that there is a path from x to y. Connectedness can be expressed in second-order logic, however, but not with only existential set quantifiers, as Σ¹₁ also enjoys compactness.

2.6.4 Lindström’s theorem

Main article: Lindström’s theorem

Per Lindström showed that the metalogical properties just discussed actually characterize first-order logic in the sense that no stronger logic can also have those properties (Ebbinghaus and Flum 1994, Chapter XIII). Lindström defined a class of abstract logical systems, and a rigorous definition of the relative strength of a member of this class. He established two theorems for systems of this type:

• A logical system satisfying Lindström’s definition that contains first-order logic and satisfies both the Löwenheim–Skolem theorem and the compactness theorem must be equivalent to first-order logic.

• A logical system satisfying Lindström’s definition that has a semidecidable logical consequence relation andsatisfies the Löwenheim–Skolem theorem must be equivalent to first-order logic.

2.7 Limitations

Although first-order logic is sufficient for formalizing much of mathematics, and is commonly used in computer science and other fields, it has certain limitations. These include limitations on its expressiveness and limitations of the fragments of natural languages that it can describe.

For instance, first-order logic is undecidable, meaning a sound, complete and terminating decision algorithm is impossible. This has led to the study of interesting decidable fragments such as C2, first-order logic with two variables and the counting quantifiers ∃≥n and ∃≤n (these quantifiers are, respectively, "there exists at least n" and "there exists at most n") (Horrocks 2010).

2.7.1 Expressiveness

The Löwenheim–Skolem theorem shows that if a first-order theory has any infinite model, then it has infinite models of every cardinality. In particular, no first-order theory with an infinite model can be categorical. Thus there is no first-order theory whose only model has the set of natural numbers as its domain, or whose only model has the set of real numbers as its domain. Many extensions of first-order logic, including infinitary logics and higher-order logics, are more expressive in the sense that they do permit categorical axiomatizations of the natural numbers or real numbers. This expressiveness comes at a metalogical cost, however: by Lindström's theorem, the compactness theorem and the downward Löwenheim–Skolem theorem cannot hold in any logic stronger than first-order.


2.7.2 Formalizing natural languages

First-order logic is able to formalize many simple quantifier constructions in natural language, such as "every person who lives in Perth lives in Australia". But there are many more complicated features of natural language that cannot be expressed in (single-sorted) first-order logic. "Any logical system which is appropriate as an instrument for the analysis of natural language needs a much richer structure than first-order predicate logic" (Gamut 1991, p. 75).

2.8 Restrictions, extensions, and variations

There are many variations of first-order logic. Some of these are inessential in the sense that they merely change notation without affecting the semantics. Others change the expressive power more significantly, by extending the semantics through additional quantifiers or other new logical symbols. For example, infinitary logics permit formulas of infinite size, and modal logics add symbols for possibility and necessity.

2.8.1 Restricted languages

First-order logic can be studied in languages with fewer logical symbols than were described above.

• Because ∃xϕ(x) can be expressed as ¬∀x¬ϕ(x), and ∀xϕ(x) can be expressed as ¬∃x¬ϕ(x), either of the two quantifiers ∃ and ∀ can be dropped.

• Since ϕ ∨ ψ can be expressed as ¬(¬ϕ ∧ ¬ψ) and ϕ ∧ ψ can be expressed as ¬(¬ϕ ∨ ¬ψ), either ∨ or ∧ can be dropped. In other words, it is sufficient to have ¬ and ∨, or ¬ and ∧, as the only logical connectives.

• Similarly, it is sufficient to have only ¬ and → as logical connectives, or to have only the Sheffer stroke (NAND) or the Peirce arrow (NOR) operator.

• It is possible to entirely avoid function symbols and constant symbols, rewriting them via predicate symbols in an appropriate way. For example, instead of using a constant symbol 0 one may use a predicate 0(x) (interpreted as x = 0), and replace every predicate such as P(0, y) with ∀x (0(x) → P(x, y)). A function such as f(x1, x2, ..., xn) will similarly be replaced by a predicate F(x1, x2, ..., xn, y) interpreted as y = f(x1, x2, ..., xn). This change requires adding additional axioms to the theory at hand, so that interpretations of the predicate symbols used have the correct semantics.

Restrictions such as these are useful as a technique to reduce the number of inference rules or axiom schemas in deductive systems, which leads to shorter proofs of metalogical results. The cost of the restrictions is that it becomes more difficult to express natural-language statements in the formal system at hand, because the logical connectives used in the natural language statements must be replaced by their (longer) definitions in terms of the restricted collection of logical connectives. Similarly, derivations in the limited systems may be longer than derivations in systems that include additional connectives. There is thus a trade-off between the ease of working within the formal system and the ease of proving results about the formal system.

It is also possible to restrict the arities of function symbols and predicate symbols, in sufficiently expressive theories. One can in principle dispense entirely with functions of arity greater than 2 and predicates of arity greater than 1 in theories that include a pairing function. This is a function of arity 2 that takes pairs of elements of the domain and returns an ordered pair containing them. It is also sufficient to have two predicate symbols of arity 2 that define projection functions from an ordered pair to its components. In either case it is necessary that the natural axioms for a pairing function and its projections are satisfied.

2.8.2 Many-sorted logic

Ordinary first-order interpretations have a single domain of discourse over which all quantifiers range. Many-sorted first-order logic allows variables to have different sorts, which have different domains. This is also called typed first-order logic, and the sorts called types (as in data type), but it is not the same as first-order type theory. Many-sorted first-order logic is often used in the study of second-order arithmetic.


When there are only finitely many sorts in a theory, many-sorted first-order logic can be reduced to single-sorted first-order logic. One introduces into the single-sorted theory a unary predicate symbol for each sort in the many-sorted theory, and adds an axiom saying that these unary predicates partition the domain of discourse. For example, if there are two sorts, one adds predicate symbols P1(x) and P2(x) and the axiom

∀x(P1(x) ∨ P2(x)) ∧ ¬∃x(P1(x) ∧ P2(x))

Then the elements satisfying P1 are thought of as elements of the first sort, and elements satisfying P2 as elements of the second sort. One can quantify over each sort by using the corresponding predicate symbol to limit the range of quantification. For example, to say there is an element of the first sort satisfying formula φ(x), one writes

∃x(P1(x) ∧ ϕ(x))

2.8.3 Additional quantifiers

Additional quantifiers can be added to first-order logic.

• Sometimes it is useful to say that "P(x) holds for exactly one x", which can be expressed as ∃!x P(x). This notation, called uniqueness quantification, may be taken to abbreviate a formula such as ∃x (P(x) ∧ ∀y (P(y) → (x = y))).

• First-order logic with extra quantifiers has new quantifiers Qx,..., with meanings such as "there are many x such that ...". Also see branching quantifiers and the plural quantifiers of George Boolos and others.

• Bounded quantifiers are often used in the study of set theory or arithmetic.

2.8.4 Infinitary logics

Main article: Infinitary logic

Infinitary logic allows infinitely long sentences. For example, one may allow a conjunction or disjunction of infinitely many formulas, or quantification over infinitely many variables. Infinitely long sentences arise in areas of mathematics including topology and model theory.

Infinitary logic generalizes first-order logic to allow formulas of infinite length. The most common way in which formulas can become infinite is through infinite conjunctions and disjunctions. However, it is also possible to admit generalized signatures in which function and relation symbols are allowed to have infinite arities, or in which quantifiers can bind infinitely many variables. Because an infinite formula cannot be represented by a finite string, it is necessary to choose some other representation of formulas; the usual representation in this context is a tree. Thus formulas are, essentially, identified with their parse trees, rather than with the strings being parsed.

The most commonly studied infinitary logics are denoted Lαβ, where α and β are each either cardinal numbers or the symbol ∞. In this notation, ordinary first-order logic is Lωω. In the logic L∞ω, arbitrary conjunctions or disjunctions are allowed when building formulas, and there is an unlimited supply of variables. More generally, the logic that permits conjunctions or disjunctions with less than κ constituents is known as Lκω. For example, Lω₁ω permits countable conjunctions and disjunctions.

The set of free variables in a formula of Lκω can have any cardinality strictly less than κ, yet only finitely many of them can be in the scope of any quantifier when a formula appears as a subformula of another.[8] In other infinitary logics, a subformula may be in the scope of infinitely many quantifiers. For example, in Lκ∞, a single universal or existential quantifier may bind arbitrarily many variables simultaneously. Similarly, the logic Lκλ permits simultaneous quantification over fewer than λ variables, as well as conjunctions and disjunctions of size less than κ.

2.8.5 Non-classical and modal logics

• Intuitionistic first-order logic uses intuitionistic rather than classical propositional calculus; for example, ¬¬φ need not be equivalent to φ.


• First-order modal logic allows one to describe other possible worlds as well as this contingently true world which we inhabit. In some versions, the set of possible worlds varies depending on which possible world one inhabits. Modal logic has extra modal operators with meanings which can be characterized informally as, for example, "it is necessary that φ" (true in all possible worlds) and "it is possible that φ" (true in some possible world). With standard first-order logic we have a single domain and each predicate is assigned one extension. With first-order modal logic we have a domain function that assigns each possible world its own domain, so that each predicate gets an extension only relative to these possible worlds. This allows us to model cases where, for example, Alex is a Philosopher, but might have been a Mathematician, and might not have existed at all. In the first possible world P(a) is true, in the second P(a) is false, and in the third possible world there is no a in the domain at all.

• First-order fuzzy logics are first-order extensions of propositional fuzzy logics rather than classical propositional calculus.

2.8.6 Fixpoint logic

Fixpoint logic extends first-order logic by adding the closure under the least fixed points of positive operators.[9]

2.8.7 Higher-order logics

Main article: Higher-order logic

The characteristic feature of first-order logic is that individuals can be quantified, but not predicates. Thus

∃a(Phil(a))

is a legal first-order formula, but

∃Phil(Phil(a))

is not, in most formalizations of first-order logic. Second-order logic extends first-order logic by adding the latter type of quantification. Other higher-order logics allow quantification over even higher types than second-order logic permits. These higher types include relations between relations, functions from relations to relations between relations, and other higher-type objects. Thus the "first" in first-order logic describes the type of objects that can be quantified.

Unlike first-order logic, for which only one semantics is studied, there are several possible semantics for second-order logic. The most commonly employed semantics for second-order and higher-order logic is known as full semantics. The combination of additional quantifiers and the full semantics for these quantifiers makes higher-order logic stronger than first-order logic. In particular, the (semantic) logical consequence relation for second-order and higher-order logic is not semidecidable; there is no effective deduction system for second-order logic that is sound and complete under full semantics.

Second-order logic with full semantics is more expressive than first-order logic. For example, it is possible to create axiom systems in second-order logic that uniquely characterize the natural numbers and the real line. The cost of this expressiveness is that second-order and higher-order logics have fewer attractive metalogical properties than first-order logic. For example, the Löwenheim–Skolem theorem and compactness theorem of first-order logic become false when generalized to higher-order logics with full semantics.

2.9 Automated theorem proving and formal methods

Further information: First-order theorem proving

Automated theorem proving refers to the development of computer programs that search and find derivations (formal proofs) of mathematical theorems. Finding derivations is a difficult task because the search space can be very large; an exhaustive search of every possible derivation is theoretically possible but computationally infeasible for many systems of interest in mathematics.


Thus complicated heuristic functions are developed to attempt to find a derivation in less time than a blind search.

The related area of automated proof verification uses computer programs to check that human-created proofs are correct. Unlike complicated automated theorem provers, verification systems may be small enough that their correctness can be checked both by hand and through automated software verification. This validation of the proof verifier is needed to give confidence that any derivation labeled as "correct" is actually correct.

Some proof verifiers, such as Metamath, insist on having a complete derivation as input. Others, such as Mizar and Isabelle, take a well-formatted proof sketch (which may still be very long and detailed) and fill in the missing pieces by doing simple proof searches or applying known decision procedures: the resulting derivation is then verified by a small, core "kernel". Many such systems are primarily intended for interactive use by human mathematicians: these are known as proof assistants. They may also use formal logics that are stronger than first-order logic, such as type theory. Because a full derivation of any nontrivial result in a first-order deductive system will be extremely long for a human to write,[10] results are often formalized as a series of lemmas, for which derivations can be constructed separately.

Automated theorem provers are also used to implement formal verification in computer science. In this setting, theorem provers are used to verify the correctness of programs and of hardware such as processors with respect to a formal specification. Because such analysis is time-consuming and thus expensive, it is usually reserved for projects in which a malfunction would have grave human or financial consequences.

2.10 See also

• ACL2 — A Computational Logic for Applicative Common Lisp.

• Equiconsistency

• Extension by definitions

• Herbrandization

• Higher-order logic

• List of logic symbols

• Löwenheim number

• Prenex normal form

• Relational algebra

• Relational model

• Second-order logic

• Skolem normal form

• Tarski’s World

• Truth table

• Type (model theory)

2.11 Notes

[1] Mendelson, Elliott (1964). Introduction to Mathematical Logic. Van Nostrand Reinhold. p. 56.

[2] The word language is sometimes used as a synonym for signature, but this can be confusing because "language" can also refer to the set of formulas.

[3] More precisely, there is only one language of each variant of one-sorted first-order logic: with or without equality, with or without functions, with or without propositional variables, ....


[4] Some authors who use the term "well-formed formula" use "formula" to mean any string of symbols from the alphabet. However, most authors in mathematical logic use "formula" to mean "well-formed formula" and have no term for non-well-formed formulas. In every context, it is only the well-formed formulas that are of interest.

[5] The SMT-LIB Standard: Version 2.0, by Clark Barrett, Aaron Stump, and Cesare Tinelli. http://smtlib.cs.uiowa.edu/language.shtml

[6] e.g. the matrix shown at 4

[7] e.g. the matrix shown at 2

[8] Some authors only admit formulas with finitely many free variables in Lκω, and more generally only formulas with < λ free variables in Lκλ.

[9] Bosse, Uwe (1993). "An Ehrenfeucht–Fraïssé game for fixpoint logic and stratified fixpoint logic". In Börger, Egon. Computer Science Logic: 6th Workshop, CSL'92, San Miniato, Italy, September 28 – October 2, 1992. Selected Papers. Lecture Notes in Computer Science 702. Springer-Verlag. pp. 100–114. ISBN 3-540-56992-8. Zbl 0808.03024.

[10] Avigad et al. (2007) discuss the process of formally verifying a proof of the prime number theorem. The formalized proof required approximately 30,000 lines of input to the Isabelle proof verifier.

2.12 References

• Andrews, Peter B. (2002); An Introduction to Mathematical Logic and Type Theory: To Truth Through Proof, 2nd ed., Berlin: Kluwer Academic Publishers. Available from Springer.

• Avigad, Jeremy; Donnelly, Kevin; Gray, David; and Raff, Paul (2007); "A formally verified proof of the prime number theorem", ACM Transactions on Computational Logic, vol. 9 no. 1, doi:10.1145/1297658.1297660

• Barwise, Jon (1977); "An Introduction to First-Order Logic", in Barwise, Jon, ed. (1982). Handbook of Mathematical Logic. Studies in Logic and the Foundations of Mathematics. Amsterdam, NL: North-Holland. ISBN 978-0-444-86388-1.

• Barwise, Jon; and Etchemendy, John (2000); Language Proof and Logic, Stanford, CA: CSLI Publications (Distributed by the University of Chicago Press)

• Bocheński, Józef Maria (2007); A Précis of Mathematical Logic, Dordrecht, NL: D. Reidel, translated from the French and German editions by Otto Bird

• Ferreirós, José (2001); The Road to Modern Logic — An Interpretation, Bulletin of Symbolic Logic, Volume 7, Issue 4, 2001, pp. 441–484, DOI 10.2307/2687794, JStor

• Gamut, L. T. F. (1991); Logic, Language, and Meaning, Volume 2: Intensional Logic and Logical Grammar,Chicago, IL: University of Chicago Press, ISBN 0-226-28088-8

• Hilbert, David; and Ackermann, Wilhelm (1950); Principles of Mathematical Logic, Chelsea (English translation of Grundzüge der theoretischen Logik, 1928 German first edition)

• Hodges, Wilfrid (2001); “Classical Logic I: First Order Logic”, in Goble, Lou (ed.); The Blackwell Guide toPhilosophical Logic, Blackwell

• Ebbinghaus, Heinz-Dieter; Flum, Jörg; and Thomas, Wolfgang (1994); Mathematical Logic, UndergraduateTexts in Mathematics, Berlin, DE/New York, NY: Springer-Verlag, Second Edition, ISBN 978-0-387-94258-2

• Rautenberg,Wolfgang (2010),AConcise Introduction toMathematical Logic (3rd ed.), NewYork, NY: SpringerScience+Business Media, doi:10.1007/978-1-4419-1221-3, ISBN 978-1-4419-1220-6

• Tarski, Alfred andGivant, Steven (1987); AFormalization of Set Theory without Variables. Vol.41 ofAmericanMathematical Society colloquium publications, Providence RI: American Mathematical Society, ISBN 978-0821810415.


2.13 External links

• Hazewinkel, Michiel, ed. (2001), “Predicate calculus”, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4

• Stanford Encyclopedia of Philosophy: Shapiro, Stewart; "Classical Logic". Covers syntax, model theory, and metatheory for first-order logic in the natural deduction style.

• Magnus, P. D.; forall x: an introduction to formal logic. Covers formal semantics and proof theory for first-order logic.

• Metamath: an ongoing online project to reconstruct mathematics as a huge first-order theory, using first-order logic and the axiomatic set theory ZFC. Principia Mathematica modernized.

• Podnieks, Karl; Introduction to mathematical logic

• Cambridge Mathematics Tripos Notes (typeset by John Fremlin). These notes cover part of a past Cambridge Mathematics Tripos course taught to undergraduate students (usually) within their third year. The course is entitled “Logic, Computation and Set Theory” and covers ordinals and cardinals, posets and Zorn's Lemma, propositional logic, predicate logic, set theory and consistency issues related to ZFC and other set theories.

• Tree Proof Generator can validate or invalidate formulas of FOL through the semantic tableaux method.


A tableaux proof for the propositional formula ((a ∨ ~b) & b) → a.


Chapter 3

Formal proof

A formal proof or derivation is a finite sequence of sentences (called well-formed formulas in the case of a formal language) each of which is an axiom, an assumption, or follows from the preceding sentences in the sequence by a rule of inference. The last sentence in the sequence is a theorem of a formal system. The notion of theorem is not in general effective, therefore there may be no method by which we can always find a proof of a given sentence or determine that none exists. The concept of natural deduction is a generalization of the concept of proof.[1]

The theorem is a syntactic consequence of all the well-formed formulas preceding it in the proof. For a well-formed formula to qualify as part of a proof, it must be the result of applying a rule of the deductive apparatus of some formal system to the previous well-formed formulae in the proof sequence.

Formal proofs often are constructed with the help of computers in interactive theorem proving. Significantly, these proofs can be checked automatically, also by computer. Checking formal proofs is usually simple, while the problem of finding proofs (automated theorem proving) is usually computationally intractable and/or only semi-decidable, depending upon the formal system in use.
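
Checking that a sequence of formulas really is a derivation is mechanical once the axioms and rules are fixed. The following is a minimal sketch in Python (illustrative only, not the code of any actual verifier); it assumes a Hilbert-style setting whose only rule is modus ponens and represents formulas as plain strings, and the helper name is_derivation is hypothetical.

```python
# Illustrative sketch: verify a derivation in a Hilbert-style system whose
# only inference rule is modus ponens (from A and (A -> B), infer B).
# Formulas are plain strings; the "(A -> B)" notation is purely illustrative.

def is_derivation(lines, axioms, assumptions=()):
    """Return True if every line is an axiom, an assumption, or follows
    from two earlier lines by modus ponens."""
    accepted = []
    for formula in lines:
        ok = formula in axioms or formula in assumptions
        if not ok:
            # Look for earlier lines A and (A -> B) with B equal to this line.
            ok = any(f"({a} -> {formula})" in accepted for a in accepted)
        if not ok:
            return False
        accepted.append(formula)
    return True

# Toy usage: q is derivable from the assumptions p and (p -> q).
print(is_derivation(["p", "(p -> q)", "q"], axioms=set(), assumptions={"p", "(p -> q)"}))  # True
print(is_derivation(["q"], axioms=set(), assumptions={"p"}))                              # False
```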

3.1 Background

3.1.1 Formal language

Main article: Formal language

A formal language is a set of finite sequences of symbols. Such a language can be defined without reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it – that is, before it has any meaning. Formal proofs are expressed in some formal language.

3.1.2 Formal grammar

Main articles: Formal grammar and Formation rule

A formal grammar (also called formation rules) is a precise description of the well-formed formulas of a formal language. It is synonymous with the set of strings over the alphabet of the formal language which constitute well-formed formulas. However, it does not describe their semantics (i.e. what they mean).

3.1.3 Formal systems

Main article: Formal system

A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation


rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions.

3.1.4 Interpretations

Main articles: Formal semantics (logic) and Interpretation (logic)

An interpretation of a formal system is the assignment of meanings to the symbols, and truth-values to the sentences of a formal system. The study of interpretations is called formal semantics. Giving an interpretation is synonymous with constructing a model.

3.2 See also

• Proof (truth)

• Mathematical proof

• Proof theory

• Axiomatic system

3.3 References

[1] The Cambridge Dictionary of Philosophy, deduction

3.4 External links

• “A Special Issue on Formal Proof”. Notices of the American Mathematical Society. December 2008.

• 2πix.com: Logic Part of a series of articles covering mathematics and logic.


Chapter 4

Formal semantics (logic)

For other uses, see Formal semantics.

In logic, formal semantics or logical semantics[1][2][3] is the study of the semantics, or interpretations, of formal and (idealizations of) natural languages, usually trying to capture the pre-theoretic notion of entailment. (Although both linguistics and logic lay claim to providing theories of natural language, according to Geach, logic generally ignores the “idiotism of idiom”, and sees natural languages as cluttered with idioms of no logical interest.)[4]

The truth conditions of various sentences we may encounter in arguments will depend upon their meaning, and so logicians cannot completely avoid the need to provide some treatment of the meaning of these sentences. The semantics of logic refers to the approaches that logicians have introduced to understand and determine that part of meaning in which they are interested; the logician traditionally is not interested in the sentence as uttered but in the proposition, an idealised sentence suitable for logical manipulation.

Until the advent of modern logic, Aristotle's Organon, especially De Interpretatione, provided the basis for understanding the significance of logic. The introduction of quantification, needed to solve the problem of multiple generality, rendered impossible the kind of subject-predicate analysis that governed Aristotle's account, although there is a renewed interest in term logic, attempting to find calculi in the spirit of Aristotle's syllogistic but with the generality of modern logics based on the quantifier.

The main modern approaches to semantics for formal languages are the following:

• Model-theoretic semantics is the archetype of Alfred Tarski's semantic theory of truth, based on his T-schema, and is one of the founding concepts of model theory. This is the most widespread approach, and is based on the idea that the meaning of the various parts of the propositions are given by the possible ways we can give a recursively specified group of interpretation functions from them to some predefined mathematical domains: an interpretation of first-order predicate logic is given by a mapping from terms to a universe of individuals, and a mapping from propositions to the truth values “true” and “false”. Model-theoretic semantics provides the foundations for an approach to the theory of meaning known as truth-conditional semantics, which was pioneered by Donald Davidson. Kripke semantics introduces innovations, but is broadly in the Tarskian mold.

• Proof-theoretic semantics associates the meaning of propositions with the roles that they can play in inferences. Gerhard Gentzen, Dag Prawitz and Michael Dummett are generally seen as the founders of this approach; it is heavily influenced by Ludwig Wittgenstein's later philosophy, especially his aphorism “meaning is use”.

• Truth-value semantics (also commonly referred to as substitutional quantification) was advocated by Ruth Barcan Marcus for modal logics in the early 1960s and later championed by Dunn, Belnap, and Leblanc for standard first-order logic. James Garson has given some results in the areas of adequacy for intensional logics outfitted with such a semantics. The truth conditions for quantified formulas are given purely in terms of truth with no appeal to domains whatsoever (and hence its name truth-value semantics).

• Game-theoretical semantics has made a resurgence lately mainly due to Jaakko Hintikka for logics of (finite) partially ordered quantification, which were originally investigated by Leon Henkin, who studied Henkin quantifiers.

• Probabilistic semantics originated from H. Field and has been shown equivalent to and a natural generalization of truth-value semantics. Like truth-value semantics, it is also non-referential in nature.

4.1 Notes

[1] Winfried Nöth, Handbook of Semiotics, p. 103

[2] p.64

[3] pp.32-3

[4] Mieszko Talasiewicz (2009). Philosophy of Syntax - Foundational Topics. Springer. p. 12. ISBN 978-90-481-3287-4.


Chapter 5

Formal system

A formal system is broadly defined as any well-defined system of abstract thought based on the model of mathematics. Euclid's Elements is often held to be the first formal system and displays the characteristics of a formal system. The entailment of the system by its logical foundation is what distinguishes a formal system from others which may have some basis in an abstract model. Often the formal system will be the basis for or even identified with a larger theory or field (e.g. Euclidean geometry) consistent with the usage in modern mathematics such as model theory. A formal system need not be mathematical as such; for example, Spinoza's Ethics imitates the form of Euclid's Elements.

Each formal system has a formal language, which is composed of primitive symbols. These symbols act on certain rules of formation and are developed by inference from a set of axioms. The system thus consists of any number of formulas built up through finite combinations of the primitive symbols—combinations that are formed from the axioms in accordance with the stated rules.[1]

Formal systems in mathematics consist of the following elements:

1. A finite set of symbols (i.e. the alphabet) that can be used for constructing formulas (i.e. finite strings of symbols).

2. A grammar, which tells how well-formed formulas (abbreviated wff) are constructed out of the symbols in the alphabet. It is usually required that there be a decision procedure for deciding whether a formula is well formed or not.

3. A set of axioms or axiom schemata: each axiom must be a wff.

4. A set of inference rules.

A formal system is said to be recursive (i.e. effective) if the set of axioms and the set of inference rules are decidable sets or semidecidable sets, according to context.

Some theorists use the term formalism as a rough synonym for formal system, but the term is also used to refer to a particular style of notation, for example, Paul Dirac's bra–ket notation.

5.1 Related subjects

5.1.1 Logical system

A logical system or, for short, logic, is a formal system together with a form of semantics, usually in the form of model-theoretic interpretation, which assigns truth values to sentences of the formal language, that is, formulae that contain no free variables. A logic is sound if all sentences that can be derived are true in the interpretation, and complete if, conversely, all true sentences can be derived.


5.1.2 Deductive system

A deductive system (also called a deductive apparatus of a formal system) consists of the axioms (or axiom schemata) and rules of inference that can be used to derive the theorems of the system.[2]

Such a deductive system is intended to preserve deductive qualities in the formulas that are expressed in the system. Usually the quality we are concerned with is truth as opposed to falsehood. However, other modalities, such as justification or belief, may be preserved instead.

In order to sustain its deductive integrity, a deductive apparatus must be definable without reference to any intended interpretation of the language. The aim is to ensure that each line of a derivation is merely a syntactic consequence of the lines that precede it. There should be no element of any interpretation of the language that gets involved with the deductive nature of the system.

5.1.3 Formal proofs

Main article: Formal proof

Formal proofs are sequences of well-formed formulas. For a wff to qualify as part of a proof, it might either be an axiom or be the product of applying an inference rule on previous wffs in the proof sequence. The last wff in the sequence is recognized as a theorem.

The point of view that generating formal proofs is all there is to mathematics is often called formalism. David Hilbert founded metamathematics as a discipline for discussing formal systems. Any language that one uses to talk about a formal system is called a metalanguage. The metalanguage may be a natural language, or it may be partially formalized itself, but it is generally less completely formalized than the formal language component of the formal system under examination, which is then called the object language, that is, the object of the discussion in question.

Once a formal system is given, one can define the set of theorems which can be proved inside the formal system. This set consists of all wffs for which there is a proof. Thus all axioms are considered theorems. Unlike the grammar for wffs, there is no guarantee that there will be a decision procedure for deciding whether a given wff is a theorem or not. The notion of theorem just defined should not be confused with theorems about the formal system, which, in order to avoid confusion, are usually called metatheorems.

5.1.4 Formal language

Main article: Formal language

In mathematics, logic, and computer science, a formal language is a language that is defined by precise mathematical or machine processable formulas. Like languages in linguistics, formal languages generally have two aspects:

• the syntax of a language is what the language looks like (more formally: the set of possible expressions that are valid utterances in the language)

• the semantics of a language are what the utterances of the language mean (which is formalized in various ways, depending on the type of language in question)

A special branch of mathematics and computer science exists that is devoted exclusively to the theory of language syntax: formal language theory. In formal language theory, a language is nothing more than its syntax; questions of semantics are not included in this specialty.

5.1.5 Formal grammar

Main article: Formal grammar

In computer science and linguistics, a formal grammar is a precise description of a formal language: a set of strings. The two main categories of formal grammar are that of generative grammars, which are sets of rules for how strings in a language can be generated, and that of analytic grammars (or reductive grammars[3][4]), which are sets of rules for how a string can be analyzed to determine whether it is a member of the language. In short, an analytic grammar describes how to recognize when strings are members in the set, whereas a generative grammar describes how to write only those strings in the set.
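
As an illustration of the two styles (a sketch under simplifying assumptions, not tied to any particular grammar formalism discussed here), consider the toy language of strings aⁿbⁿ with n ≥ 1: a generative description produces its members from the rule S → ab | aSb, while an analytic description decides membership of a given string.

```python
# Illustrative sketch: generative vs. analytic descriptions of { a^n b^n : n >= 1 }.

def generate(up_to_n):
    """Generative view: list the members of the language with at most up_to_n 'a's."""
    return ["a" * n + "b" * n for n in range(1, up_to_n + 1)]

def recognize(s):
    """Analytic (reductive) view: decide whether a given string is in the language."""
    n = len(s) // 2
    return len(s) == 2 * n and n >= 1 and s == "a" * n + "b" * n

print(generate(3))          # ['ab', 'aabb', 'aaabbb']
print(recognize("aaabbb"))  # True
print(recognize("aabbb"))   # False
```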

5.2 See also

5.3 References

[1] Encyclopædia Britannica, Formal system definition, 2007.

[2] Hunter, Geoffrey, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, University of California Press, 1971

[3] Reductive grammar: (computer science) A set of syntactic rules for the analysis of strings to determine whether the strings exist in a language. “Sci-Tech Dictionary McGraw-Hill Dictionary of Scientific and Technical Terms” (6th ed.). McGraw-Hill.

[4] “There are two classes of formal-language definition compiler-writing schemes. The productive grammar approach is the most common. A productive grammar consists primarily of a set of rules that describe a method of generating all possible strings of the language. The reductive or analytical grammar technique states a set of rules that describe a method of analyzing any string of characters and deciding whether that string is in the language.” "The TREE-META Compiler-Compiler System: A Meta Compiler System for the Univac 1108 and General Electric 645, University of Utah Technical Report RADC-TR-69-83. C. Stephen Carr, David A. Luther, Sherian Erdmann” (PDF). Retrieved 5 January 2015.

5.4 Further reading

• Raymond M. Smullyan, 1961. Theory of Formal Systems: Annals of Mathematics Studies, Princeton University Press (April 1, 1961), 156 pages, ISBN 0-691-08047-X

• S. C. Kleene, 1967. Mathematical Logic Reprinted by Dover, 2002. ISBN 0-486-42533-9

• Douglas Hofstadter, 1979. Gödel, Escher, Bach: An Eternal Golden Braid ISBN 978-0-465-02656-2. 777 pages.

5.5 External links

• Encyclopædia Britannica, Formal system definition, 2007.

• What is a Formal System?: Some quotes from John Haugeland's 'Artificial Intelligence: The Very Idea' (1985), pp. 48–64.

• Peter Suber, Formal Systems and Machines: An Isomorphism, 1997.


Chapter 6

Formation rule

In mathematical logic, formation rules are rules for describing which strings of symbols formed from the alphabet of a formal language are syntactically valid within the language. These rules only address the location and manipulation of the strings of the language. They do not describe anything else about a language, such as its semantics (i.e. what the strings mean). (See also formal grammar.)

6.1 Formal language

Main article: Formal language

A formal language is an organized set of symbols, the essential feature being that it can be precisely defined in terms of just the shapes and locations of those symbols. Such a language can be defined, then, without any reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it—that is, before it has any meaning. A formal grammar determines which symbols and sets of symbols are formulas in a formal language.

6.2 Formal systems

Main article: Formal system

A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions. Propositional and predicate calculi are examples of formal systems.

6.3 Propositional and predicate logic

The formation rules of a propositional calculus may, for instance, take a form such that:

• if we take Φ to be a propositional formula we can also take ¬Φ to be a formula;

• if we take Φ and Ψ to be propositional formulas we can also take (Φ & Ψ), (Φ → Ψ), (Φ ∨ Ψ) and (Φ ↔ Ψ) to also be formulas.

A predicate calculus will usually include all the same rules as a propositional calculus, with the addition of quantifiers such that if we take Φ to be a formula of propositional logic and α as a variable then we can take (∀α)Φ and (∃α)Φ each to be formulas of our predicate calculus.
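
These clauses amount to an inductive definition, which can be mirrored directly by constructors. The sketch below is illustrative only (the class names are not from the text): each constructor corresponds to one of the formation rules just listed.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:           # a propositional variable / atomic formula
    name: str

@dataclass(frozen=True)
class Not:           # if Φ is a formula, ¬Φ is a formula
    sub: object

@dataclass(frozen=True)
class BinOp:         # if Φ, Ψ are formulas, so are (Φ & Ψ), (Φ ∨ Ψ), (Φ → Ψ), (Φ ↔ Ψ)
    op: str
    left: object
    right: object

@dataclass(frozen=True)
class Quant:         # if Φ is a formula and α a variable, (∀α)Φ and (∃α)Φ are formulas
    quantifier: str  # "forall" or "exists"
    var: str
    body: object

# Example formula: (∀x)(P → ¬Q)
formula = Quant("forall", "x", BinOp("->", Var("P"), Not(Var("Q"))))
print(formula)
```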


6.4 See also

• Finite state automaton


Chapter 7

Interpretation (logic)

For other uses, see Interpretation (disambiguation).

An interpretation is an assignment of meaning to the symbols of a formal language. Many formal languages used in mathematics, logic, and theoretical computer science are defined in solely syntactic terms, and as such do not have any meaning until they are given some interpretation. The general study of interpretations of formal languages is called formal semantics.

The most commonly studied formal logics are propositional logic, predicate logic and their modal analogs, and for these there are standard ways of presenting an interpretation. In these contexts an interpretation is a function that provides the extension of symbols and strings of symbols of an object language. For example, an interpretation function could take the predicate T (for “tall”) and assign it the extension a (for “Abraham Lincoln”). Note that all our interpretation does is assign the extension a to the non-logical constant T, and does not make a claim about whether T is to stand for tall and 'a' for Abraham Lincoln. Nor does logical interpretation have anything to say about logical connectives like 'and', 'or' and 'not'. Though we may take these symbols to stand for certain things or concepts, this is not determined by the interpretation function.

An interpretation often (but not always) provides a way to determine the truth values of sentences in a language. If a given interpretation assigns the value True to a sentence or theory, the interpretation is called a model of that sentence or theory.

7.1 Formal languages

Main article: Formal language

A formal language consists of a fixed collection of sentences (also called words or formulas, depending on the context) composed from a fixed set of letters or symbols. The inventory from which these letters are taken is called the alphabet over which the language is defined. The essential feature of a formal language is that its syntax can be defined without reference to interpretation. We can determine that (P or Q) is a well-formed formula even without knowing whether it is true or false.

To distinguish the strings of symbols that are in a formal language from arbitrary strings of symbols, the former are sometimes called well-formed formulæ (wff).

7.1.1 Example

A formal language W can be defined with the alphabet α = { ▲, ◆ }, and with a word being in W if it begins with ▲ and is composed solely of the symbols ▲ and ◆.

A possible interpretation of W could assign the decimal digit '1' to ▲ and '0' to ◆. Then ▲◆▲ would denote 101 under this interpretation of W.
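
A minimal sketch of this interpretation (assuming the two alphabet symbols are rendered ▲ and ◆ as above): map each symbol to its digit and read off the numeral that a word of W denotes.

```python
# Illustrative sketch: the interpretation of W that assigns '1' to ▲ and '0' to ◆.
interpretation = {"▲": "1", "◆": "0"}

def denote(word):
    """Return the numeral a word of W denotes under this interpretation."""
    return "".join(interpretation[symbol] for symbol in word)

print(denote("▲◆▲"))  # '101'
```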


7.1.2 Logical constants

In the specific cases of propositional logic and predicate logic, the formal languages considered have alphabets that are divided into two sets: the logical symbols (logical constants) and the non-logical symbols. The idea behind this terminology is that logical symbols have the same meaning regardless of the subject matter being studied, while non-logical symbols change in meaning depending on the area of investigation.

Logical constants are always given the same meaning by every interpretation of the standard kind, so that only the meanings of the non-logical symbols are changed. Logical constants include quantifier symbols ∀ (“all”) and ∃ (“some”), symbols for logical connectives ∧ (“and”), ∨ (“or”), ¬ (“not”), parentheses and other grouping symbols, and (in many treatments) the equality symbol =.

7.2 General properties of truth-functional interpretations

Many of the commonly studied interpretations associate each sentence in a formal language with a single truth value, either True or False. These interpretations are called truth functional; they include the usual interpretations of propositional and first-order logic. The sentences that are made true by a particular assignment are said to be satisfied by that assignment.

No sentence can be made both true and false by the same interpretation, but it is possible that the truth value of the same sentence can be different under different interpretations. A sentence is consistent if it is true under at least one interpretation; otherwise it is inconsistent. A sentence φ is said to be logically valid if it is satisfied by every interpretation (if φ is satisfied by every interpretation that satisfies ψ then φ is said to be a logical consequence of ψ).

7.2.1 Logical connectives

Some of the logical symbols of a language (other than quantifiers) are truth-functional connectives that represent truth functions — functions that take truth values as arguments and return truth values as outputs (in other words, these are operations on truth values of sentences).

The truth-functional connectives enable compound sentences to be built up from simpler sentences. In this way, the truth value of the compound sentence is defined as a certain truth function of the truth values of the simpler sentences. The connectives are usually taken to be logical constants, meaning that the meaning of the connectives is always the same, independent of what interpretations are given to the other symbols in a formula.

This is how we define logical connectives in propositional logic:

• ¬Φ is True iff Φ is False.
• (Φ & Ψ) is True iff Φ is True and Ψ is True.
• (Φ ∨ Ψ) is True iff ¬(¬Φ & ¬Ψ) is True.
• (Φ → Ψ) is True iff (¬Φ is True or Ψ is True).
• (Φ ↔ Ψ) is True iff (Φ → Ψ) is True and (Ψ → Φ) is True.

So under a given interpretation of all the sentence letters Φ and Ψ (i.e., after assigning a truth-value to each sentence letter), we can determine the truth-values of all formulas that have them as constituents, as a function of the logical connectives. The following table shows how this kind of thing looks. The first two columns show the truth-values of the sentence letters as determined by the four possible interpretations. The other columns show the truth-values of formulas built from these sentence letters, with truth-values determined recursively.

Now it's easier to see what makes a formula logically valid. Take the formula F: (Φ ∨ ~Φ). If our interpretation function makes Φ True, then ~Φ is made False by the negation connective. Since the disjunct Φ of F is True under that interpretation, F is True. Now the only other possible interpretation of Φ makes it False, and if so, ~Φ is made True by the negation function. That would make F True again, since one of F's disjuncts, ~Φ, would be true under this interpretation. Since these two interpretations for F are the only possible logical interpretations, and since F comes out True for both, we say that it is logically valid or tautologous.
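
The case analysis just given can be carried out mechanically by enumerating interpretations. The sketch below (illustrative function names only) encodes the truth conditions listed above and confirms that (Φ ∨ ~Φ) is True under both interpretations of Φ, while (Φ → Ψ) is not True under every interpretation.

```python
from itertools import product

# Illustrative truth functions matching the clauses listed above.
def NOT(p): return not p
def AND(p, q): return p and q
def OR(p, q): return not AND(NOT(p), NOT(q))        # (Φ ∨ Ψ) is True iff ¬(¬Φ & ¬Ψ) is True
def IMPLIES(p, q): return NOT(p) or q
def IFF(p, q): return AND(IMPLIES(p, q), IMPLIES(q, p))

# F = (Φ ∨ ~Φ): True under every interpretation of the single sentence letter Φ.
print(all(OR(phi, NOT(phi)) for phi in (True, False)))                           # True (tautologous)

# By contrast, (Φ → Ψ) is satisfiable but not logically valid.
print(all(IMPLIES(phi, psi) for phi, psi in product((True, False), repeat=2)))   # False
```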


7.3 Interpretation of a theory

Main article: Theory (mathematical logic)

An interpretation of a theory is the relationship between a theory and some subject matter when there is a many-to-one correspondence between certain elementary statements of the theory, and certain statements related to the subject matter. If every elementary statement in the theory has a correspondent it is called a full interpretation, otherwise it is called a partial interpretation.[1]

7.4 Interpretations for propositional logic

The formal language for propositional logic consists of formulas built up from propositional symbols (also called sentential symbols, sentential variables, and propositional variables) and logical connectives. The only non-logical symbols in a formal language for propositional logic are the propositional symbols, which are often denoted by capital letters. To make the formal language precise, a specific set of propositional symbols must be fixed.

The standard kind of interpretation in this setting is a function that maps each propositional symbol to one of the truth values true and false. This function is known as a truth assignment or valuation function. In many presentations, it is literally a truth value that is assigned, but some presentations assign truthbearers instead.

For a language with n distinct propositional variables there are 2ⁿ distinct possible interpretations. For any particular variable a, for example, there are 2¹ = 2 possible interpretations: 1) a is assigned T, or 2) a is assigned F. For the pair a, b there are 2² = 4 possible interpretations: 1) both are assigned T, 2) both are assigned F, 3) a is assigned T and b is assigned F, or 4) a is assigned F and b is assigned T.

Given any truth assignment for a set of propositional symbols, there is a unique extension to an interpretation for all the propositional formulas built up from those variables. This extended interpretation is defined inductively, using the truth-table definitions of the logical connectives discussed above.
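
For example, the 2ⁿ interpretations of n propositional symbols can be listed exhaustively; this short sketch (illustrative names only) does so for two symbols and extends each truth assignment to a compound formula built from them.

```python
from itertools import product

# Illustrative sketch: enumerate all truth assignments for two propositional symbols.
symbols = ["a", "b"]
interpretations = [dict(zip(symbols, values))
                   for values in product([True, False], repeat=len(symbols))]
print(len(interpretations))   # 4, i.e. 2**2

# Extend each assignment to the compound formula (a & ¬b).
for v in interpretations:
    print(v, "makes (a & ¬b)", v["a"] and not v["b"])
```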

7.5 First-order logic

Unlike propositional logic, where every language is the same apart from a choice of a different set of propositional variables, there are many different first-order languages. Each first-order language is defined by a signature. The signature consists of a set of non-logical symbols and an identification of each of these symbols as a constant symbol, a function symbol, or a predicate symbol. In the case of function and predicate symbols, a natural number arity is also assigned. The alphabet for the formal language consists of logical constants, the equality relation symbol =, all the symbols from the signature, and an additional infinite set of symbols known as variables.

For example, in the language of rings, there are constant symbols 0 and 1, two binary function symbols + and ·, and no binary relation symbols. (Here the equality relation is taken as a logical constant.)

Again, we might define a first-order language L as consisting of individual symbols a, b, and c; predicate symbols F, G, H, I and J; variables x, y, z; no function letters; no sentential symbols.

7.5.1 Formal languages for first-order logic

Given a signature σ, the corresponding formal language is known as the set of σ-formulas. Each σ-formula is built up out of atomic formulas by means of logical connectives; atomic formulas are built from terms using predicate symbols. The formal definition of the set of σ-formulas proceeds in the other direction: first, terms are assembled from the constant and function symbols together with the variables. Then, terms can be combined into an atomic formula using a predicate symbol (relation symbol) from the signature or the special predicate symbol "=" for equality (see the section "Interpreting equality” below). Finally, the formulas of the language are assembled from atomic formulas using the logical connectives and quantifiers.

Page 48: Formal Semantics (Logic)

38 CHAPTER 7. INTERPRETATION (LOGIC)

7.5.2 Interpretations of a first-order language

To ascribe meaning to all sentences of a first-order language, the following information is needed.

• A domain of discourse[2] D, usually required to be non-empty (see below).

• For every constant symbol, an element of D as its interpretation.

• For every n-ary function symbol, an n-ary function from D to D as its interpretation (that is, a function Dⁿ → D).

• For every n-ary predicate symbol, an n-ary relation on D as its interpretation (that is, a subset of Dⁿ).

An object carrying this information is known as a structure (of signature σ, or σ-structure, or L-structure), or as a “model”.

The information specified in the interpretation provides enough information to give a truth value to any atomic formula, after each of its free variables, if any, has been replaced by an element of the domain. The truth value of an arbitrary sentence is then defined inductively using the T-schema, which is a definition of first-order semantics developed by Alfred Tarski. The T-schema interprets the logical connectives using truth tables, as discussed above. Thus, for example, φ & ψ is satisfied if and only if both φ and ψ are satisfied.

This leaves the issue of how to interpret formulas of the form ∀x φ(x) and ∃x φ(x). The domain of discourse forms the range for these quantifiers. The idea is that the sentence ∀x φ(x) is true under an interpretation exactly when every substitution instance of φ(x), where x is replaced by some element of the domain, is satisfied. The formula ∃x φ(x) is satisfied if there is at least one element d of the domain such that φ(d) is satisfied.

Strictly speaking, a substitution instance such as the formula φ(d) mentioned above is not a formula in the original formal language of φ, because d is an element of the domain. There are two ways of handling this technical issue. The first is to pass to a larger language in which each element of the domain is named by a constant symbol. The second is to add to the interpretation a function that assigns each variable to an element of the domain. Then the T-schema can quantify over variations of the original interpretation in which this variable assignment function is changed, instead of quantifying over substitution instances.

Some authors also admit propositional variables in first-order logic, which must then also be interpreted. A propositional variable can stand on its own as an atomic formula. The interpretation of a propositional variable is one of the two truth values true and false.[3]

Because the first-order interpretations described here are defined in set theory, they do not associate each predicate symbol with a property[4] (or relation), but rather with the extension of that property (or relation). In other words, these first-order interpretations are extensional,[5] not intensional.
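
A minimal sketch of a first-order interpretation of the kind described in this section (illustrative only, and restricted to a finite domain so that the quantifier clauses can be checked by exhausting the domain): the structure supplies a domain together with extensions for a constant symbol and a unary predicate symbol, and the clause for ∀ is evaluated by varying a variable assignment, in the spirit of the second approach described above.

```python
# Illustrative sketch of a tiny first-order structure with a finite domain.
domain = {0, 1, 2}
constants = {"c": 2}                 # interpretation of the constant symbol c
predicates = {"E": {0, 2}}           # extension of the unary predicate symbol E ("is even")

def sat_atomic(pred, term, assignment):
    """Atomic P(t): t is a constant symbol or a variable looked up in the assignment."""
    value = constants.get(term, assignment.get(term))
    return value in predicates[pred]

def sat_forall(var, pred, term, assignment):
    """∀var P(term): satisfied iff P(term) holds for every way of assigning
    a domain element to var (varying the variable assignment)."""
    return all(sat_atomic(pred, term, {**assignment, var: d}) for d in domain)

print(sat_atomic("E", "c", {}))        # True: the interpretation of c lies in the extension of E
print(sat_forall("x", "E", "x", {}))   # False: 1 is not in the extension of E
```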

7.5.3 Example of a first-order interpretation

An example of interpretation I of the language L described above is as follows.

• Domain: A chess set

• Individual constants: a: the white King; b: the black Queen; c: the white King's pawn

• F(x): x is a piece

• G(x): x is a pawn

• H(x): x is black

• I(x): x is white

• J(x, y): x can capture y

In the interpretation I of L:

• the following are true sentences: F(a), G(c), H(b), I(a), J(b, c);

• the following are false sentences: J(a, c), G(a).


7.5.4 Non-empty domain requirement

As stated above, a first-order interpretation is usually required to specify a nonempty set as the domain of discourse. The reason for this requirement is to guarantee that equivalences such as

(ϕ ∨ ∃xψ) ↔ ∃x(ϕ ∨ ψ)

where x is not a free variable of φ, are logically valid. This equivalence holds in every interpretation with a nonempty domain, but does not always hold when empty domains are permitted. For example, the equivalence

[∀y(y = y) ∨ ∃x(x = x)] ≡ ∃x[∀y(y = y) ∨ x = x]

fails in any structure with an empty domain. Thus the proof theory of first-order logic becomes more complicated when empty structures are permitted. However, the gain in allowing them is negligible, as both the intended interpretations and the interesting interpretations of the theories people study have non-empty domains.[6][7]

Empty relations do not cause any problem for first-order interpretations, because there is no similar notion of passing a relation symbol across a logical connective, enlarging its scope in the process. Thus it is acceptable for relation symbols to be interpreted as being identically false. However, the interpretation of a function symbol must always assign a well-defined and total function to the symbol.

7.5.5 Interpreting equality

The equality relation is often treated specially in first order logic and other predicate logics. There are two general approaches.

The first approach is to treat equality as no different than any other binary relation. In this case, if an equality symbol is included in the signature, it is usually necessary to add various axioms about equality to axiom systems (for example, the substitution axiom saying that if a = b and R(a) holds then R(b) holds as well). This approach to equality is most useful when studying signatures that do not include the equality relation, such as the signature for set theory or the signature for second-order arithmetic in which there is only an equality relation for numbers, but not an equality relation for sets of numbers.

The second approach is to treat the equality relation symbol as a logical constant that must be interpreted by the real equality relation in any interpretation. An interpretation that interprets equality this way is known as a normal model, so this second approach is the same as only studying interpretations that happen to be normal models. The advantage of this approach is that the axioms related to equality are automatically satisfied by every normal model, and so they do not need to be explicitly included in first-order theories when equality is treated this way. This second approach is sometimes called first order logic with equality, but many authors adopt it for the general study of first-order logic without comment.

There are a few other reasons to restrict study of first-order logic to normal models. First, it is known that any first-order interpretation in which equality is interpreted by an equivalence relation and satisfies the substitution axioms for equality can be cut down to an elementarily equivalent interpretation on a subset of the original domain. Thus there is little additional generality in studying non-normal models. Second, if non-normal models are considered, then every consistent theory has an infinite model; this affects the statements of results such as the Löwenheim–Skolem theorem, which are usually stated under the assumption that only normal models are considered.

7.5.6 Many-sorted first-order logic

A generalization of first order logic considers languages with more than one sort of variables. The idea is that different sorts of variables represent different types of objects. Every sort of variable can be quantified; thus an interpretation for a many-sorted language has a separate domain for each of the sorts of variables to range over (there is an infinite collection of variables of each of the different sorts). Function and relation symbols, in addition to having arities, are specified so that each of their arguments must come from a certain sort.


One example of many-sorted logic is for planar Euclidean geometry. There are two sorts: points and lines. There is an equality relation symbol for points, an equality relation symbol for lines, and a binary incidence relation E which takes one point variable and one line variable. The intended interpretation of this language has the point variables range over all points on the Euclidean plane, the line variables range over all lines on the plane, and the incidence relation E(p, l) holds if and only if point p is on line l.

7.6 Higher-order predicate logics

A formal language for higher-order predicate logic looks much the same as a formal language for first-order logic. The difference is that there are now many different types of variables. Some variables correspond to elements of the domain, as in first-order logic. Other variables correspond to objects of higher type: subsets of the domain, functions from the domain, functions that take a subset of the domain and return a function from the domain to subsets of the domain, etc. All of these types of variables can be quantified.

There are two kinds of interpretations commonly employed for higher-order logic. Full semantics require that, once the domain of discourse is specified, the higher-order variables range over all possible elements of the correct type (all subsets of the domain, all functions from the domain to itself, etc.). Thus the specification of a full interpretation is the same as the specification of a first-order interpretation. Henkin semantics, which are essentially multi-sorted first-order semantics, require the interpretation to specify a separate domain for each type of higher-order variable to range over. Thus an interpretation in Henkin semantics includes a domain D, a collection of subsets of D, a collection of functions from D to D, etc. The relationship between these two semantics is an important topic in higher order logic.

7.7 Non-classical interpretations

The interpretations of propositional logic and predicate logic described above are not the only possible interpretations. In particular, there are other types of interpretations that are used in the study of non-classical logic (such as intuitionistic logic), and in the study of modal logic.

Interpretations used to study non-classical logic include topological models, Boolean valued models, and Kripke models. Modal logic is also studied using Kripke models.

7.8 Intended interpretations

Many formal languages are associated with a particular interpretation that is used to motivate them. For example, the first-order signature for set theory includes only one binary relation, ∈, which is intended to represent set membership, and the domain of discourse in a first-order theory of the natural numbers is intended to be the set of natural numbers.

The intended interpretation is called the standard model (a term introduced by Abraham Robinson in 1960).[8] In the context of Peano arithmetic, it consists of the natural numbers with their ordinary arithmetical operations. All models that are isomorphic to the one just given are also called standard; these models all satisfy the Peano axioms. There are also non-standard models of the (first-order version of the) Peano axioms, which contain elements not correlated with any natural number.

While the intended interpretation can have no explicit indication in the syntactical rules – since these rules must be strictly formal – the author's intention respecting interpretation naturally affects her choice of the formation and transformation rules of the syntactical system. For example, she chooses primitive signs in such a way that certain concepts can be expressed; she chooses sentential formulas in such a way that their counterparts in the intended interpretation can appear as meaningful declarative sentences; her choice of primitive sentences must meet the requirement that these primitive sentences come out as true sentences in the interpretation; her rules of inference must be such that, if by one of these rules the sentence Ij is directly derivable from a sentence Ii, then Ii → Ij turns out to be a true sentence (under the customary interpretation of → as meaning implication). These requirements ensure that all provable sentences also come out to be true.[9]

Most formal systems have many more models than they were intended to have (the existence of non-standard models is an example). When we speak about 'models' in empirical sciences, we mean, if we want reality to be a model of our science, to speak about an intended model. A model in the empirical sciences is an intended factually-true descriptive interpretation (or in other contexts: a non-intended arbitrary interpretation used to clarify such an intended factually-true descriptive interpretation). All models are interpretations that have the same domain of discourse as the intended one, but other assignments for non-logical constants.[10]

7.8.1 Example

Given a simple formal system (we shall call this one FS′) whose alphabet α consists only of three symbols { ◼, ⋆, ♦ } and whose formation rule for formulas is:

'Any string of symbols of FS′ which is at least 6 symbols long, and which is not infinitely long, is a formula of FS′. Nothing else is a formula of FS′.'

The single axiom schema of FS′ is:

" ⋆ * ♦ * " (where " * " is a metasyntactic variable standing for a finite string of " “s )

A formal proof can be constructed as follows:

(1) ◼ ⋆ ◼ ♦ ◼◼  (2) ◼ ⋆ ◼◼ ♦ ◼◼◼  (3) ◼ ⋆ ◼◼◼ ♦ ◼◼◼◼

In this example the theorem produced " ◼ ⋆ ◼◼◼ ♦ ◼◼◼◼ " can be interpreted as meaning “One plus three equals four.” A different interpretation would be to read it backwards as “Four minus three equals one.”[11]
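
The arithmetical reading can be reproduced mechanically. In the sketch below the three symbols of FS′ are replaced by the ASCII stand-ins '#', '+' and '=' (an assumption made purely so the strings are easy to type); a string instantiates the axiom schema when it has the shape # + #…# = #…# with the last block one stroke longer than the middle one, and the intended interpretation reads each block of strokes as a number.

```python
import re

# Illustrative sketch: ASCII stand-ins for the three symbols of FS'
# ('#' for the stroke symbol, '+' for the star, '=' for the diamond).
AXIOM_SHAPE = re.compile(r"^#\+(#+)=(#+)$")

def is_axiom_instance(s):
    """A string instantiates the schema when it reads  # + #^k = #^(k+1)."""
    m = AXIOM_SHAPE.match(s)
    return bool(m) and len(m.group(2)) == len(m.group(1)) + 1

def interpret(s):
    """The intended interpretation: read each block of strokes as a number."""
    m = AXIOM_SHAPE.match(s)
    return f"1 plus {len(m.group(1))} equals {len(m.group(2))}"

theorem = "#+###=####"
print(is_axiom_instance(theorem))   # True
print(interpret(theorem))           # '1 plus 3 equals 4'
```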

7.9 Other concepts of interpretation

There are other uses of the term “interpretation” that are commonly used, which do not refer to the assignment of meanings to formal languages.

In model theory, a structure A is said to interpret a structure B if there is a definable subset D of A, and definable relations and functions on D, such that B is isomorphic to the structure with domain D and these functions and relations. In some settings, it is not the domain D that is used, but rather D modulo an equivalence relation definable in A. For additional information, see Interpretation (model theory).

A theory T is said to interpret another theory S if there is a finite extension by definitions T′ of T such that S is contained in T′.

7.10 See also

• Free variables and Name binding

• Herbrand interpretation

• Interpretation (model theory)

• Logical system

• Löwenheim-Skolem theorem

• Modal logic

• Model (abstract)

• Model theory

• Satisfiable

• Truth


7.11 References

[1] Curry, Haskell, Foundations of Mathematical Logic p.48

[2] Sometimes called the “universe of discourse”

[3] Mates, Benson (1972), Elementary Logic, Second Edition, New York: Oxford University Press, p. 56, ISBN 0-19-501491-X

[4] The extension of a property (also called an attribute) is a set of individuals, so a property is a unary relation. E.g. the properties “yellow” and “prime” are unary relations.

[5] see also Extension (predicate logic)

[6] Hailperin, Theodore (1953), “Quantification theory and empty individual-domains”, The Journal of Symbolic Logic (Association for Symbolic Logic) 18 (3): 197–200, doi:10.2307/2267402, JSTOR 2267402, MR 0057820

[7] Quine, W. V. (1954), “Quantification and the empty domain”, The Journal of Symbolic Logic (Association for Symbolic Logic) 19 (3): 177–179, doi:10.2307/2268615, JSTOR 2268615, MR 0064715

[8] Roland Müller (2009). “The Notion of a Model”. In Anthonie Meijers. Philosophy of technology and engineering sciences. Handbook of the Philosophy of Science 9. Elsevier. ISBN 978-0-444-51667-1.

[9] Rudolf Carnap, Introduction to Symbolic Logic and its Applications

[10] The Concept and the Role of the Model in Mathematics and Natural and Social Sciences

[11] Geoffrey Hunter, Metalogic

7.12 External links

• Stanford Enc. Phil: Classical Logic, 4. Semantics

• mathworld.wolfram.com: Formal Language

• mathworld.wolfram.com: Connective

• mathworld.wolfram.com: Interpretation

• mathworld.wolfram.com: Propositional Calculus

• mathworld.wolfram.com: First Order Logic


Chapter 8

Logical consequence

“Entailment” redirects here. For other uses, see Entail (disambiguation).“Therefore” redirects here. For the therefore symbol (∴), see Therefore sign.“Logical implication” redirects here. For the binary connective, see Material conditional.

Logical consequence (also entailment) is one of the most fundamental concepts in logic. It is the relationship between statements that holds true when one logically “follows from” one or more others. A valid logical argument is one in which the conclusions follow from its premises, and its conclusions are consequences of its premises. The philosophical analysis of logical consequence involves asking, 'in what sense does a conclusion follow from its premises?' and 'what does it mean for a conclusion to be a consequence of premises?'[1] All of philosophical logic can be thought of as providing accounts of the nature of logical consequence, as well as logical truth.[2]

Logical consequence is taken to be both necessary and formal with examples explicated using models and proofs.[1] A sentence is said to be a logical consequence of a set of sentences, for a given language, if and only if, using logic alone (i.e. without regard to any interpretations of the sentences) the sentence must be true if every sentence in the set were to be true.[3]

Logicians make precise accounts of logical consequence with respect to a given language L by constructing a deductive system for L, or by formalizing the intended semantics for L. Alfred Tarski highlighted three salient features for which any adequate characterization of logical consequence needs to account: 1) that the logical consequence relation relies on the logical form of the sentences involved, 2) that the relation is a priori, i.e. it can be determined whether or not it holds without regard to sense experience, and 3) that the relation has a modal component.[3]

8.1 Formal accounts

The most widely prevailing view on how to best account for logical consequence is to appeal to formality. This is to say that whether statements follow from one another logically depends on the structure or logical form of the statements without regard to the contents of that form.

Syntactic accounts of logical consequence rely on schemes using inference rules. For instance, we can express the logical form of a valid argument as “All A are B. All C are A. Therefore, all C are B.” This argument is formally valid, because every instance of arguments constructed using this scheme is valid.

This is in contrast to an argument like “Fred is Mike's brother's son. Therefore Fred is Mike's nephew.” Since this argument depends on the meanings of the words “brother”, “son”, and “nephew”, the statement “Fred is Mike's nephew” is a so-called material consequence of “Fred is Mike's brother's son,” not a formal consequence. A formal consequence must be true in all cases, however this is an incomplete definition of formal consequence, since even the argument “P is Q's brother's son, therefore P is Q's nephew” is valid in all cases, but is not a formal argument.[1]


8.2 A priori property of logical consequence

If you know that Q follows logically from P, no information about the possible interpretations of P or Q will affect that knowledge. Our knowledge that Q is a logical consequence of P cannot be influenced by empirical knowledge.[1] Deductively valid arguments can be known to be so without recourse to experience, so they must be knowable a priori.[1] However, formality alone does not guarantee that logical consequence is not influenced by empirical knowledge. So the a priori property of logical consequence is considered to be independent of formality.[1]

8.3 Proofs and models

The two prevailing techniques for providing accounts of logical consequence involve expressing the concept in terms of proofs and via models. The study of the syntactic consequence (of a logic) is called (its) proof theory whereas the study of (its) semantic consequence is called (its) model theory.[4]

8.3.1 Syntactic consequence

See also: ∴ and ⊢

A formula A is a syntactic consequence[5][6][7][8] within some formal system FS of a set Γ of formulas if there is a formal proof in FS of A from the set Γ.

Γ ⊢FS A

Syntactic consequence does not depend on any interpretation of the formal system.[9]

8.3.2 Semantic consequence

See also: ⊨

A formula A is a semantic consequence within some formal system FS of a set of statements Γ

Γ |=FS A,

if and only if there is no model I in which all members of Γ are true and A is false.[10] Or, in other words, the set of the interpretations that make all members of Γ true is a subset of the set of the interpretations that make A true.
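
For propositional logic this definition can be checked directly by enumerating interpretations: Γ ⊨ A holds when no truth assignment makes every member of Γ true while making A false. The sketch below is illustrative only, with formulas represented as Python functions of a truth assignment; it tests the familiar example that q follows from {p, p → q} but not from {p → q} alone.

```python
from itertools import product

def entails(gamma, a, symbols):
    """Γ ⊨ A: no interpretation makes every member of Γ true and A false."""
    for values in product([True, False], repeat=len(symbols)):
        v = dict(zip(symbols, values))
        if all(premise(v) for premise in gamma) and not a(v):
            return False
    return True

# Formulas as functions of a truth assignment v.
p = lambda v: v["p"]
q = lambda v: v["q"]
p_implies_q = lambda v: (not v["p"]) or v["q"]

print(entails([p, p_implies_q], q, ["p", "q"]))   # True
print(entails([p_implies_q], q, ["p", "q"]))      # False
```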

8.4 Modal accounts

Modal accounts of logical consequence are variations on the following basic idea:

Γ ⊢ A is true if and only if it is necessary that if all of the elements of Γ are true, then A is true.

Alternatively (and, most would say, equivalently):

Γ ⊢ A is true if and only if it is impossible for all of the elements of Γ to be true and A false.

Such accounts are called “modal” because they appeal to the modal notions of logical necessity and logical possibility. 'It is necessary that' is often expressed as a universal quantifier over possible worlds, so that the accounts above translate as:


Γ ⊢ A is true if and only if there is no possible world at which all of the elements of Γ are true and A is false (untrue).

Consider the modal account in terms of the argument given as an example above:

All frogs are green.
Kermit is a frog.
Therefore, Kermit is green.

The conclusion is a logical consequence of the premises because we can't imagine a possible world where (a) all frogs are green; (b) Kermit is a frog; and (c) Kermit is not green.

8.4.1 Modal-formal accounts

Modal-formal accounts of logical consequence combine the modal and formal accounts above, yielding variations on the following basic idea:

Γ ⊢ A if and only if it is impossible for an argument with the same logical form as Γ / A to have true premises and a false conclusion.

8.4.2 Warrant-based accounts

The accounts considered above are all “truth-preservational,” in that they all assume that the characteristic feature of a good inference is that it never allows one to move from true premises to an untrue conclusion. As an alternative, some have proposed "warrant-preservational” accounts, according to which the characteristic feature of a good inference is that it never allows one to move from justifiably assertible premises to a conclusion that is not justifiably assertible. This is (roughly) the account favored by intuitionists such as Michael Dummett.

8.4.3 Non-monotonic logical consequence

Main article: Non-monotonic logic

The accounts discussed above all yield monotonic consequence relations, i.e. ones such that if A is a consequence of Γ, then A is a consequence of any superset of Γ. It is also possible to specify non-monotonic consequence relations to capture the idea that, e.g., 'Tweety can fly' is a logical consequence of

Birds can typically fly, Tweety is a bird

but not of

Birds can typically fly, Tweety is a bird, Tweety is a penguin.

For more on this, see Belief revision#Non-monotonic inference relation.

8.5 See also

8.6 Notes

[1] Beall, JC and Restall, Greg, Logical Consequence, The Stanford Encyclopedia of Philosophy (Fall 2009 Edition), Edward N. Zalta (ed.).

[2] Quine, Willard Van Orman, Philosophy of logic


[3] McKeon, Matthew, Logical Consequence Internet Encyclopedia of Philosophy.

[4] Kosta Dosen (1996). “Logical consequence: a turn in style”. In Maria Luisa Dalla Chiara, Kees Doets, Daniele Mundici, Johan van Benthem. Logic and Scientific Methods: Volume One of the Tenth International Congress of Logic, Methodology and Philosophy of Science, Florence, August 1995. Springer. p. 292. ISBN 978-0-7923-4383-7.

[5] Dummett, Michael (1993) Frege: philosophy of language Harvard University Press, p.82ff

[6] Lear, Jonathan (1986) Aristotle and Logical Theory Cambridge University Press, 136p.

[7] Creath, Richard, and Friedman, Michael (2007) The Cambridge companion to Carnap Cambridge University Press, 371p.

[8] FOLDOC: “syntactic consequence”

[9] Hunter, Geoffrey, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, University of California Press, 1971, p. 75.

[10] Etchemendy, John, Logical consequence, The Cambridge Dictionary of Philosophy

8.7 Resources

• Anderson, A.R.; Belnap, N.D., Jr. (1975), Entailment 1, Princeton, NJ: Princeton.

• Barwise, Jon; Etchemendy, John (2008), Language, Proof and Logic, Stanford: CSLI Publications.

• Brown, Frank Markham (2003), Boolean Reasoning: The Logic of Boolean Equations, 1st edition, Kluwer Academic Publishers, Norwell, MA. 2nd edition, Dover Publications, Mineola, NY, 2003.

• Davis, Martin, (editor) (1965), The Undecidable, Basic Papers on Undecidable Propositions, Unsolvable Problems And Computable Functions, New York: Raven Press. Papers include those by Gödel, Church, Rosser, Kleene, and Post.

• Dummett, Michael (1991), The Logical Basis of Metaphysics, Harvard University Press.

• Edgington, Dorothy (2001), Conditionals, Blackwell, in Lou Goble (ed.), The Blackwell Guide to Philosophical Logic.

• Edgington, Dorothy (2006), Conditionals in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy.

• Etchemendy, John (1990), The Concept of Logical Consequence, Harvard University Press.

• Goble, Lou, ed. (2001), The Blackwell Guide to Philosophical Logic, Blackwell.

• Hanson, William H (1997), “The concept of logical consequence”, The Philosophical Review 106 365–409.

• Hendricks, Vincent F. (2005), Thought 2 Talk: A Crash Course in Reflection and Expression, New York: Automatic Press / VIP, ISBN 87-991013-7-8

• Planchette, P. A. (2001), Logical Consequence, in Goble, Lou, ed., The Blackwell Guide to Philosophical Logic. Blackwell.

• Quine, W.V. (1982), Methods of Logic, Cambridge, MA: Harvard University Press (1st ed. 1950), (2nd ed. 1959), (3rd ed. 1972), (4th edition, 1982).

• Shapiro, Stewart (2002), Necessity, meaning, and rationality: the notion of logical consequence, in D. Jacquette, ed., A Companion to Philosophical Logic. Blackwell.

• Tarski, Alfred (1936), On the concept of logical consequence. Reprinted in Tarski, A., 1983. Logic, Semantics, Metamathematics, 2nd ed. Oxford University Press. Originally published in Polish and German.

• A paper on 'implication' from math.niu.edu, Implication

• A definition of 'implicant' AllWords

• Ryszard Wójcicki (1988). Theory of Logical Calculi: Basic Theory of Consequence Operations. Springer. ISBN 978-90-277-2785-5.


Chapter 9

Logical constant

In logic, a logical constant of a language L is a symbol that has the same semantic value under every interpretation of L. Two important types of logical constants are logical connectives and quantifiers. The equality predicate (usually written '=') is also treated as a logical constant in many systems of logic. One of the fundamental questions in the philosophy of logic is “What is a logical constant?"; that is, what special feature of certain constants makes them logical in nature?[1]

Symbols commonly treated as logical constants include the connectives ¬ (“not”), ∧ (“and”), ∨ (“or”), → (“implies”) and ↔ (“if and only if”), the quantifiers ∀ (“for all”) and ∃ (“there exists”), and the equality predicate =. Many of these logical constants are sometimes denoted by alternate symbols (e.g., the use of the symbol "&" rather than "∧" to denote the logical and).
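
As a small illustration of the defining feature mentioned above, the sketch below (Python, used only for illustration; the function name conj is an arbitrary choice) runs through every interpretation of two proposition letters: the truth function attached to “∧” is the same in every row, while the values of the non-logical letters p and q vary with the interpretation:

from itertools import product

def conj(a, b):          # the fixed semantic value of the logical constant "∧"
    return a and b

# p and q receive different values under different interpretations,
# but the truth function of "∧" never changes.
for p, q in product([True, False], repeat=2):
    print(f"p={p!s:5} q={q!s:5}  p ∧ q = {conj(p, q)}")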

9.1 See also

• Non-logical symbol

• Logical value

• Logical connective

9.2 References

[1] Carnap

9.3 External links

• Stanford Encyclopedia of Philosophy entry on logical constants


Chapter 10

Logical Syntax of Language

Rudolf Carnap (/ˈkɑrnæp/;[1] German: [ˈkaɐnaːp]; May 18, 1891 – September 14, 1970) was a German-born philosopher who was active in Europe before 1935 and in the United States thereafter. He was a major member of the Vienna Circle and an advocate of logical positivism. He is considered “one of the giants among twentieth-century philosophers.”[2]

10.1 Life and work

Carnap’s Birthplace in Wuppertal

Carnap’s father had risen from the status of a poor ribbon-weaver to become the owner of a ribbon-making factory. His mother came from academic stock; her father was an educational reformer and her oldest brother was the archaeologist Wilhelm Dörpfeld. As a ten-year-old, Carnap accompanied his uncle on an expedition to Greece.[3]

He began his formal education at the Barmen Gymnasium. From 1910 to 1914, he attended the University of Jena, intending to write a thesis in physics. But he also studied carefully Kant's Critique of Pure Reason during a course taught by Bruno Bauch, and was one of very few students to attend Gottlob Frege's courses in mathematical logic. While Carnap held moral and political opposition to World War I, he felt obligated to serve in the German army. After three years of service, he was given permission to study physics at the University of Berlin, 1917–18, where Albert Einstein was a newly appointed professor. Carnap then attended the University of Jena, where he wrote a thesis defining an axiomatic theory of space and time. The physics department said it was too philosophical, and Bruno Bauch of the philosophy department said it was pure physics. Carnap then wrote another thesis, with Bauch’s supervision, on the theory of space in a more orthodox Kantian style, and published it as Der Raum (Space) in a supplemental issue of Kant-Studien (1922). In it he makes the clear distinction between formal, physical and perceptual (e.g., visual) spaces.

Frege’s course exposed him to Bertrand Russell's work on logic and philosophy, which put a sense of the aims to his studies. He accepted the effort to surpass traditional philosophy with logical innovations that inform the sciences. He wrote a letter to Russell, who responded by copying by hand long passages from his Principia Mathematica for Carnap’s benefit, as neither Carnap nor his university could afford a copy of this epochal work. In 1924 and 1925, he attended seminars led by Edmund Husserl, the founder of phenomenology, and continued to write on physics from a logical positivist perspective.

Carnap discovered a kindred spirit when he met Hans Reichenbach at a 1923 conference. Reichenbach introduced Carnap to Moritz Schlick, a professor at the University of Vienna who offered Carnap a position in his department, which Carnap accepted in 1926. Carnap thereupon joined an informal group of Viennese intellectuals that came to be known as the Vienna Circle, directed largely by Moritz Schlick and including Hans Hahn, Friedrich Waismann, Otto Neurath, and Herbert Feigl, with occasional visits by Hahn’s student Kurt Gödel. When Wittgenstein visited Vienna, Carnap would meet with him. He (with Hahn and Neurath) wrote the 1929 manifesto of the Circle, and (with Hans Reichenbach) initiated the philosophy journal Erkenntnis.

In 1928, Carnap published two important books:

• The Logical Structure of the World (German: “Der logische Aufbau der Welt”), in which he developed a rigorous formal version of empiricism, defining all scientific terms in phenomenalistic terms. The formal system of the Aufbau (as the work is commonly termed) was grounded in a single primitive dyadic predicate, which is satisfied if “two” individuals “resemble” each other. The Aufbau was greatly influenced by Principia Mathematica, and warrants comparison with the mereotopological metaphysics A. N. Whitehead developed over 1916–29. It appears, however, that Carnap soon became somewhat disenchanted with this book. In particular, he did not authorize an English translation until 1967.

• Pseudoproblems in Philosophy asserted that many philosophical questions were meaningless, i.e., the way they were posed amounted to an abuse of language. An operational implication of this opinion was taken to be the elimination of metaphysics from responsible human discourse. This is the statement for which Carnap was best known for many years.

See also: Carnap–Ramsey sentence

In February 1930 Tarski lectured in Vienna, and during November 1930 Carnap visited Warsaw. On these occasions he learned much about Tarski’s model-theoretic method of semantics. Rose Rand, another philosopher in the Vienna Circle, noted, “Carnap’s conception of semantics starts from the basis given in Tarski’s work but a distinction is made between logical and non-logical constants and between logical and factual truth... At the same time he worked with the concepts of intension and extension and took these two concepts as a basis of a new method of semantics.”[4] In 1931, Carnap was appointed Professor at the German language University of Prague. There he wrote the book that was to make him the most famous logical positivist and member of the Vienna Circle, his Logical Syntax of Language (Carnap 1934). In this work, Carnap advanced his Principle of Tolerance, according to which there is not any such thing as a “true” or “correct” logic or language. One is free to adopt whatever form of language is useful for one’s purposes. In 1933, W. V. Quine met Carnap in Prague and discussed the latter’s work at some length. Thus began the lifelong mutual respect these two men shared, one that survived Quine’s eventual forceful disagreements with a number of Carnap’s philosophical conclusions.

Carnap, whose socialist and pacifist beliefs put him at risk in Nazi Germany, emigrated to the United States in 1935 and became a naturalized citizen in 1941. Meanwhile back in Vienna, Moritz Schlick was murdered in 1936. From 1936 to 1952, Carnap was a professor of philosophy at the University of Chicago. During the late 1930s, Carnap offered an assistant position in philosophy to Carl Gustav Hempel, who accepted. The two conducted research including Logical Syntax.[5] Thanks partly to Quine’s help, Carnap spent the years 1939–41 at Harvard, where he was reunited with Tarski.

Carnap (1963) later expressed some irritation about his time at Chicago, where he and Charles W. Morris were the only members of the department committed to the primacy of science and logic. (Their Chicago colleagues included Richard McKeon, Mortimer Adler, Charles Hartshorne, and Manley Thompson.) Carnap’s years at Chicago were nonetheless very productive ones. He wrote books on semantics (Carnap 1942, 1943, 1956), modal logic, being very similar in Carnap (1956) to the now-standard possible worlds semantics for that logic Saul Kripke proposed starting in 1959, and on the philosophical foundations of probability and induction (Carnap 1950, 1952).

After a stint at the Institute for Advanced Study in Princeton, he joined the philosophy department at UCLA in 1954, Hans Reichenbach having died the previous year. He had earlier refused an offer of a similar job at the University of California, because accepting that position required that he sign a loyalty oath, a practice to which he was opposed on principle. While at UCLA, he wrote on scientific knowledge, the analytic–synthetic dichotomy, and the verification principle. His writings on thermodynamics and on the foundations of probability and induction were published posthumously as Carnap (1971, 1977, 1980).

Carnap taught himself Esperanto when he was 14 years of age, and remained sympathetic to it (Carnap 1963). He later attended the World Congress of Esperanto in 1908 and 1922, and employed the language while traveling.

Carnap had four children by his first marriage to Elizabeth Schöndube, which ended in divorce in 1929. He married his second wife, Elizabeth Ina Stögner, in 1933.[3] Ina committed suicide in 1964.

10.2 Logical syntax

Carnap’s Logical Syntax of Language can be regarded as a response to Wittgenstein's Tractatus. Carnap elaborated and extended the concept of logical syntax proposed by Wittgenstein in the Tractatus (Section 3.325).

3.325. In order to avoid such errors we must make use of a sign-language that excludes them by not using the same sign for different symbols and by not using in a superficially similar way signs that have different modes of signification: that is to say, a sign-language that is governed by logical grammar—by logical syntax. …

—Wittgenstein, Section 3.325, Tractatus

However, Wittgenstein stated that propositions cannot represent logical form.

4.121. Propositions cannot represent logical form: it is mirrored in them. What finds its reflection in language, language cannot represent. What expresses itself in language, we cannot express by means of language. Propositions show the logical form of reality. They display it.

—Wittgenstein, Section 4.121, Tractatus

Carnap disagreed. While Wittgenstein proposed the idea of logical syntax, it was Carnap who designed, formulated and implemented its details in philosophical analysis. Carnap defined logical syntax as:

By the logical syntax of a language, we mean the formal theory of the linguistic forms of that language – the systematic statement of the formal rules which govern it together with the development of the consequences which follow from these rules.

A theory, a rule, a definition, or the like is to be called formal when no reference is made in it either to the meaning of the symbols (for example, the words) or to the sense of the expressions (e.g. the sentences), but simply and solely to the kinds and order of the symbols from which the expressions are constructed.

— Carnap, Page 1, Logical Syntax of Language

In the U.S., the concept of logical syntax helped the development of natural language processing and compiler design.
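
To make the idea of a “formal” rule concrete, here is a minimal sketch (Python, used only for illustration; the toy grammar is an assumption for this example, not Carnap's own system) of a rule that classifies strings purely by the kinds and order of their symbols, never by what the symbols mean:

# formula := letter | "¬" formula | "(" formula connective formula ")"
LETTERS = set("pqr")
CONNECTIVES = set("∧∨→")

def parse(s):
    if s[:1] in LETTERS:
        return True, s[1:]
    if s[:1] == "¬":
        return parse(s[1:])
    if s[:1] == "(":
        ok, rest = parse(s[1:])
        if ok and rest[:1] in CONNECTIVES:
            ok2, rest2 = parse(rest[1:])
            if ok2 and rest2[:1] == ")":
                return True, rest2[1:]
    return False, s

def is_formula(s):
    ok, rest = parse(s)
    return ok and rest == ""

print(is_formula("(p∧¬q)"))   # True
print(is_formula("p∧"))       # False

The checker never consults the meaning of ¬ or ∧; it only inspects which kinds of symbols occur in which order, which is what the definition of “formal” quoted above requires.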

10.2.1 The purpose of logical syntax

The purpose of logical syntax is to provide a system of concepts, a language, by the help of which the results of logical analysis will be exactly formulable.


Carnap stated:

Philosophy is to be replaced by the logic of science – that is to say, by the logical analysis of the concepts and sentences of the sciences, for the logic of science is nothing other than the logical syntax of the language of science.

—Carnap, Foreword, Logical Syntax of Language

… According to this view, the sentences of metaphysics are pseudo-sentences which on logical analysis are proved to be either empty phrases or phrases which violate the rules of syntax. Of the so-called philosophical problems, the only questions which have any meaning are those of the logic of science. To share this view is to substitute logical syntax for philosophy.

—Carnap, Page 8, Logical Syntax of Language

Carnap wanted to end only metaphysics, not philosophy.

10.3 Rejection of metaphysics

Carnap, in his book Philosophy and Logical Syntax, used the concept of verifiability to reject metaphysics.

10.3.1 The function of logical analysis

Carnap used the method of logical analysis to reject metaphysics.

The function of logical analysis is to analyse all knowledge, all assertions of science and of everyday life, in order to make clear the sense of each such assertion and the connections between them. One of the principal tasks of the logical analysis of a given proposition is to find out the method of verification for that proposition.

—Carnap, pp. 9–10, Philosophy and Logical Syntax

10.4 Selected publications

For links to Carnap’s publications and discussions of his work, see "“Carnap” in All Fields of Study". Microsoft Academic Search. Retrieved 2013-05-26.

• 1922. Der Raum: Ein Beitrag zur Wissenschaftslehre, Kant-Studien, Ergänzungshefte, no. 56. His Ph.D. thesis.

• 1926. Physikalische Begriffsbildung. Karlsruhe: Braun.

• 1928. Scheinprobleme in der Philosophie (Pseudoproblems of Philosophy). Berlin: Weltkreis-Verlag.

• 1928. Der Logische Aufbau der Welt. Leipzig: Felix Meiner Verlag. English translation by Rolf A. George, 1967. The Logical Structure of the World. Pseudoproblems in Philosophy. University of California Press. ISBN 0-812-69523-2

• 1929. Abriss der Logistik, mit besonderer Berücksichtigung der Relationstheorie und ihrer Anwendungen. Springer.[6]

• 1934. Logische Syntax der Sprache. English translation 1937, The Logical Syntax of Language. Kegan Paul.[7]

• 1996 (1935). Philosophy and Logical Syntax. Bristol UK: Thoemmes. Excerpt.

• 1939, Foundations of Logic and Mathematics in International Encyclopedia of Unified Science, Vol. I, no. 3. University of Chicago Press.[8]

• 1942. Introduction to Semantics. Harvard Uni. Press.

• 1943. Formalization of Logic. Harvard Uni. Press.


• 1945. On Inductive Logic in Philosophy of Science, Vol. 12, pp. 72–97.

• 1945. The Two Concepts of Probability in Philosophy and Phenomenological Research, Vol. 5, No. 4 (Jun), pp. 513–532.

• 1947. On the Application of Inductive Logic in Philosophy and Phenomenological Research, Vol. 8, p. 133-148.

• 1956 (1947). Meaning and Necessity: a Study in Semantics and Modal Logic. University of Chicago Press.

• 1950. Logical Foundations of Probability. University of Chicago Press. Pp. 3–15 online.

• 1950. "Empiricism, Semantics, Ontology", Revue Internationale de Philosophie 4: 20–40.

• 1952. The Continuum of Inductive Methods. University of Chicago Press.

• 1958. Introduction to Symbolic Logic with Applications. Dover.

• 1963, “Intellectual Autobiography” in Schilpp (1963: 1–84).

• 1966. Philosophical Foundations of Physics. Martin Gardner, ed. Basic Books. Online excerpt.

• 1971. Studies in inductive logic and probability, Vol. 1. University of California Press.

• 1977. Two essays on entropy. Shimony, Abner, ed. University of California Press.

• 1980. Studies in inductive logic and probability, Vol. 2. Jeffrey, R. C., ed. University of California Press.

• 2000. Untersuchungen zur Allgemeinen Axiomatik. Edited from unpublished manuscript by T. Bonk and J. Mosterín. Darmstadt: Wissenschaftliche Buchgesellschaft. 167 pp. ISBN 3-534-14298-5.

Online bibliography. Under construction, with no entries dated later than 1937. For a more comprehensive bibliography, see also http://fr.wikipedia.org/wiki/Rudolf_Carnap

10.5 See also

• Analytic–synthetic distinction

• Carnap–Ramsey sentence

• Internal–external distinction

• Meta-ontology

• Philosophy of science

• Skepticism

• Visual space

10.6 References

[1] “Carnap”. Random House Webster’s Unabridged Dictionary.

[2] http://texts.cdlib.org/view?docId=hb6h4nb3q7&doc.view=frames&chunk.id=div00004&toc.depth=1&toc.id=

[3] Quine, W.V. and Rudolf Carnap (1990). Dear Carnap, Dear Van: The Quine-Carnap Correspondence and Related Work. Berkeley, CA: University of California Press. p. 23.

[4] Rand, Rose. “Reading Notes and Summaries on Works by Rudolph Carnap, 1932 and Undated” (PDF). Rose Rand Papers. Special Collections Department, University of Pittsburgh. Retrieved May 16, 2013.

[5] Carnap, Rudolf. “Rudolf Carnap Papers”. Special Collections Department, University of Pittsburgh. Retrieved September 17, 2013.


[6] Weiss, Paul (1929). “Review: Abriss der Logistik by Rudolf Carnap” (PDF). Bull. Amer. Math. Soc. 35 (6): 880. doi:10.1090/s0002-9904-1929-04818-3.

[7] Mac Lane, Saunders (1938). “Review: The Logical Syntax of Language by Rudolf Carnap, translated from the German by Amethe Smeaton” (PDF). Bull. Amer. Math. Soc. 44 (3): 171–176.

[8] Church, Alonzo (1939). “Review: Foundations of Logic and Mathematics by Rudolf Carnap” (PDF). Bull. Amer. Math. Soc. 45 (11): 821–822. doi:10.1090/s0002-9904-1939-07085-7.

10.7 Sources

• Richard Creath, Michael Friedman, ed. (2007). The Cambridge companion to Carnap. Cambridge University Press. ISBN 0521840155.

• Roger F Gibson, ed. (2004). The Cambridge companion to Quine. Cambridge University Press. ISBN 0521639492.

• Ivor Grattan-Guinness, 2000. In Search of Mathematical Roots. Princeton Uni. Press.

• Thomas Mormann, 2000. “Rudolf Carnap” (book). München, Beck.

• Willard Quine

• 1951, Two Dogmas of Empiricism. The Philosophical Review 60: 20–43. Reprinted in his 1953 From a Logical Point of View. Harvard University Press.

• 1985, The Time of My Life: An Autobiography. MIT Press.

• Richardson, Alan W., 1998. Carnap’s construction of the world: the Aufbau and the emergence of logical empiricism. Cambridge Uni. Press.

• Schilpp, P. A., ed., 1963. The Philosophy of Rudolf Carnap. LaSalle IL: Open Court.

• Spohn, Wolfgang, ed., 1991. Erkenntnis Orientated: A Centennial Volume for Rudolf Carnap and Hans Reichenbach. Kluwer Academic Publishers.

• 1991. Logic, Language, and the Structure of Scientific Theories: Proceedings of the Carnap-Reichenbach Centennial, University of Konstanz, May 21–24, 1991. University of Pittsburgh Press.

• Wagner, Pierre, ed., 2009. Carnap’s Logical Syntax of Language. Palgrave Macmillan.

• Wagner, Pierre, ed., 2012. Carnap’s Ideal of Explication and Naturalism. Palgrave Macmillan.

10.8 External links

• Rudolf Carnap entry by Mauro Murzi in the Internet Encyclopedia of Philosophy

• Carnap’s Modal Logic entry by M.J. Cresswell in the Internet Encyclopedia of Philosophy

• Rudolf Carnap Webpage and Directory of Internet Resources

• Homepage of the Collected Works of Rudolf Carnap. Department of Philosophy, Carnegie Mellon University

• Precis of Carnap’s philosophy.

• The Life of Rudolf Carnap, Philosophy at RBJones.com

• R. Carnap: “Von der Erkenntnistheorie zur Wissenschaftslogik”, Paris Congress in 1935, Paris, 1936.

• R. Carnap: "Über die Einheitssprache der Wissenschaft”, Paris Congress in 1935, Paris, 1936.

• R. Carnap: “Wahrheit und Bewährung”, Paris Congress in 1935, Paris, 1936.

• Rudolf Carnap Papers: (Rudolf Carnap Papers, 1905-1970, ASP.1974.01, Special Collections Department, University of Pittsburgh.)


• Das Fremdpsychische bei Rudolf Carnap (German) by Robert Bauer.

• FBI file on Rudolph Carnap

• Luchte, James, 2007. “Martin Heidegger and Rudolf Carnap: Radical Phenomenology, Logical Positivism and the Roots of the Continental/Analytic Divide,” Philosophy Today, Vol. 51, No. 3, 241–260.


Chapter 11

Metasyntactic variable

This article is about metasyntactic variables in computer science and hacker culture. For metasyntactic variables as used in formal logic, see Metavariable (logic). For general usage, see placeholder name.

A metasyntactic variable is a placeholder name used in computer science, a word without meaning intended to be substituted by some objects pertaining to the context where it is used. The word foo as used in IETF Requests for Comments is a good example.[1]

By mathematical analogy, a metasyntactic variable is a word that is a variable for other words, just as in algebra letters are used as variables for numbers.[1] Any symbol or word which does not violate the syntactic rules of the language can be used as a metasyntactic variable. For specifications written in natural language, nonsense words are commonly used as metasyntactic variables.

Metasyntactic variables have a secondary, implied meaning to the reader (often students), which makes them different from normal metavariables. It is understood by those who have studied computer science that certain words are placeholders or examples only and should or must be replaced in a production-level computer program.

In hacker culture, “metasyntactic variable” has come to denote some typical (otherwise meaningless) words used as metavariables in computing; see reification. For example, The Hacker’s Dictionary (1st ed.) defined FOO as “the first metasyntactic variable” and BAR as “the second metasyntactic variable”, explaining that “When you have to invent an arbitrary temporary name for something for the sake of exposition, FOO is usually used. If you need a second one, BAR or BAZ is usually used; there is a slight preference at MIT for bar and at Stanford for baz. Clearly, bar was the original, for the concatenation FOOBAR is widely used also, and this in turn can be traced to the obscene acronym 'FUBAR' that arose in the armed forces during World War II. [...] A hacker avoids using 'foo' as the real name of anything. Indeed, a standard convention is that any file with 'foo' in its name is temporary and can be deleted on sight.”[2] The names of these consecrated “metasyntactic variables” are also commonly used as actual identifiers (for variables, functions, etc.) in tutorial programming examples when their purpose is to emphasize syntax; in this usage, “metasyntactic variable” is synonymous with "meaningless word".[3]

11.1 Construction

• meta- means providing information about, or transcending,

• syntax denotes the grammatical arrangement of words or the grammatical rules of a programming language, and

• a variable is something that can assume a value, or something likely to vary.

So metasyntactic variable denotes a word that “transcends grammar and can assume a value” or one that is “more comprehensive than suggested by its grammatical arrangement and is likely to vary”. It may also denote a word that provides information about the grammatical arrangement of words by being able to assume a value that is expected to vary.


11.2 Examples

RFC 772 (cited in RFC 3092) contains for instance:

All is well; now the recipients can be specified.

   S: MRCP TO:<Foo@Y> <CRLF>
   R: 200 OK
   S: MRCP TO:<Raboof@Y> <CRLF>
   R: 553 No such user here
   S: MRCP TO:<bar@Y> <CRLF>
   R: 200 OK
   S: MRCP TO:<@Y,@X,fubar@Z> <CRLF>
   R: 200 OK

Note that the failure of “Raboof” has no effect on the storage of mail for “Foo”, “bar” or the mail to be forwarded to “fubar@Z” through host “X”.

Both the IETF RFCs and computer programming languages are rendered in plain text, making it necessary to distinguish metasyntactic variables by a naming convention, more or less obvious from context. If rich text formatting is available, e.g. as in the HTML produced from texinfo sources, then a typographical convention may be used, as done for the example in the GNU Fortran manual:[4]

A metasyntactic variable—that is, a name used in this document to serve as a placeholder for whatever text is used by the user or programmer—appears as shown in the following example:

“The INTEGER ivar statement specifies that ivar is a variable or array of type INTEGER.” In the above example, any valid text may be substituted for the metasyntactic variable ivar to make the statement apply to a specific instance, as long as the same text is substituted for both occurrences of ivar.

The above example uses italics to denote metavariables (borrowing from the common convention to use italics for variables in mathematics), although italics are also used in the same text for emphasizing other words. (The documentation for texinfo emphasizes the distinction between metavariables and mere variables used in a programming language being documented in some texinfo file as: “Use the @var command to indicate metasyntactic variables. A metasyntactic variable is something that stands for another piece of text. For example, you should use a metasyntactic variable in the documentation of a function to describe the arguments that are passed to that function. Do not use @var for the names of particular variables in programming languages. These are specific names from a program, so @code is correct for them.”[5]) Another point reflected in the above example is the convention that a metavariable is to be uniformly substituted with the same instance in all its appearances in a given schema. This is in contrast with nonterminal symbols in formal grammars, where the nonterminals on the right of a production can be substituted by different instances.[6]

A third example of the use of the “metasyntactic variables” foo and bar, this time as actual identifiers in a programming (interview) example, is contrasting the following C++ function prototypes for their different argument passing mechanisms:[7]

void foo(Fruit bar); void foo(Fruit* bar); void foo(Fruit& bar);

11.3 Words commonly used as metasyntactic variables

11.3.1 Arabic

In Arabic, the word “kedha” (كذا) is often used in the same way English speakers use the word “bla”, as in “kedha, kedha, kedha” to mean “this, that, and the other thing” or “such and such”. Similarly, the names “Fullan” (فلان) and “'Allan” (علان) are used to refer to non-specific persons, a practice which has been adopted in other languages (see Portuguese, Spanish, Turkish and Persian below).

11.3.2 English

A “standard list of metasyntactic variables used in syntax examples” often used in the United States is: foo, bar, baz, qux, quux, corge, grault, garply, waldo, fred, plugh, xyzzy, thud.[1] The word foo occurs in over 330 RFCs and bar occurs in over 290.[8] Wibble, wobble, wubble, Fred and flob are often used in the UK.[9]

Because English is the foundation language, or lingua franca, of most computer programming languages, these variables are also commonly seen even in programs and program examples written for other spoken-language audiences.


The typical names may depend, however, on the subculture that has developed around a given programming language. For example, spam, ham, and eggs are the principal metasyntactic variables used in the Python programming language.[10] This is a reference to the famous comedy sketch Spam by Monty Python, the eponym of the language.[11]

The R programming language often adds norf to the list.[12][13]

11.3.3 German

In German, the words bla, blubb and blabla are commonly used as names for metasyntactic variables (comparable with English blah, blah-blah).

11.3.4 French

In French, the words toto, titi, tata, tutu, truc, bidule, machin and azerty are commonly used (AZERTY being the order of first letters on French keyboards).

11.3.5 Hebrew

In Hebrew, the words chupchick and stam are commonly used.

11.3.6 Italian

In Italian, the word pippo is commonly used. Strangely enough, besides being a diminutive of the first names Giuseppe (Joseph) and Filippo (Philip), pippo is the Italian name of the Disney character Goofy, but it is probably used just because of its sound, which is quite odd; moreover, this name can be typed very quickly on a computer keyboard, as it involves three nearby keys (P, I and O). Sometimes the words pluto and paperino (the Italian name of Donald Duck) can hence be used as additional terms.

11.3.7 Japanese

In Japanese, the words hoge and piyo are commonly used, with other common words and variants being fuga, hogera, and hogehoge.[14] Note that -ra is a pluralizing ending in Japanese, and reduplication is also used for pluralizing. The origin of hoge as a metasyntactic variable is not known, but it is believed to date to the early 1980s.[14] Hoge was also used extensively in the lyrics of the theme song for the Dororo animated adaptation, in 1969.

11.3.8 Portuguese

In Portuguese, the words fulano, sicrano and beltrano are commonly used to refer to people.[15] To refer to objects in general, the most common placeholder name is XPTO.

11.3.9 Spanish

In Spanish, the words fulano,[16] mengano[17] and zutano[18] are commonly used, often followed by de tal, mocking a last name in Spanish form (e.g. Fulano de Tal). These words have the constraint that they can only be used to refer to people, as is the case with Portuguese. Also, when referring to an example of some person performing a certain action, Perico de los Palotes can be used as a placeholder for a real name. In place of people or objects (including numbers, etc.) the usual X, Y, Z are used (e.g. Person X, Quantity Z).

11.3.10 Turkish

In Turkish, the words falan, filan, hede, hödö, hebele, hübele are commonly used.


11.3.11 Persian

In Persian, the word folân is used for Foo and the words bahmān and bisār are used for Bar.

11.4 See also

• Metavariable (logic)

• Alice and Bob

• John Doe

• Fnord

• Free variables and bound variables

• Gadget

• Lorem ipsum

• Nonce word

• Placeholder name

• Widget

• Smurf

11.5 References

[1] RFC 3092 (rfc3092) - Etymology of “Foo”

[2] As reproduced in Hank Bromley; Richard Lamson (1987). Lisp lore: a guide to programming the Lisp machine. Kluwer Academic. p. 291.

[3] Mark Slagell (2002). Sams Teach Yourself Ruby in 21 Days. Sams Publishing. p. 108. ISBN 978-0-672-32252-5.

[4] http://gcc.gnu.org/onlinedocs/gcc-3.3.5/g77/Notation-Used.html

[5] http://sunsite.ualberta.ca/Documentation/Gnu/texinfo-4.0/html_chapter/texinfo_10.html

[6] R. D. Tennent (2002). Specifying Software: A Hands-On Introduction. Cambridge University Press. pp. 36–37 and 210. ISBN 978-0-521-00401-5.

[7] John Mongan; Noah Kindler; Eric Giguere (2012). Programming Interviews Exposed: Secrets to Landing Your Next Job. John Wiley & Sons. p. 242. ISBN 978-1-118-28720-0.

[8] RFC-Editor.org

[9] wibble. (n.d.). Jargon File 4.4.7. Retrieved February 23, 2010, from

[10] Python Tutorial

[11] General Python FAQ

[12] http://www.vidyokarma.com/article.php?c=R-programming-language&page=1

[13] http://use-r.com/page/2/

[14] (Japanese)

[15] http://www.priberam.pt/dlpo/fulano

[16] http://lema.rae.es/drae/?val=fulano

[17] http://lema.rae.es/drae/?val=mengano

[18] http://lema.rae.es/drae/?val=zutano


11.6 External links

• Definition of metasyntactic variable, with examples.

• Examples of metasyntactic variables used in Commonwealth Hackish, such as wombat.

• Variable “foo” and Other Programming Oddities


Chapter 12

Metavariable

For the term as used in computer science and hacking culture, see Metasyntactic variable.

In logic, a metavariable (also metalinguistic variable[1] or syntactical variable[2]) is a symbol or symbol string which belongs to a metalanguage and stands for elements of some object language. For instance, in the sentence

Let A and B be two sentences of a language ℒ

the symbols A and B are part of the metalanguage in which the statement about the object language ℒ is formulated. John Corcoran considers this terminology unfortunate because it obscures the use of schemata and because such “variables” do not actually range over a domain.[3]:220

The convention is that a metavariable is to be uniformly substituted with the same instance in all its appearances in a given schema. This is in contrast with nonterminal symbols in formal grammars, where the nonterminals on the right of a production can be substituted by different instances.[4]
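
A minimal sketch of this convention (Python, for illustration only; the schema and the substitution are made-up examples) replaces every occurrence of each metavariable by the same object-language formula:

# Uniform substitution: each metavariable is replaced by one and the same
# formula at all of its occurrences in the schema.
def instantiate(schema, substitution):
    out = schema
    for metavar, formula in substitution.items():
        out = out.replace(metavar, formula)
    return out

print(instantiate("A → (B → A)", {"A": "(p ∧ q)", "B": "r"}))
# prints: (p ∧ q) → (r → (p ∧ q))

Note that both occurrences of A receive the same instance (p ∧ q), which is what distinguishes a metavariable from a grammar nonterminal.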

Attempts to formalize the notion of metavariable result in some kind of type theory.[5]

In computing one often needs to specify and document the syntax and semantics of a computer language, more or less formally. A term often used for metavariable in that area is "metasyntactic variable". Furthermore, because of the common practice in hacker culture to use nonsense words like "foo" as metavariables, the term “metasyntactic variable” has come to denote such words by themselves; for instance, “foo” is referred to as “the first metasyntactic variable” in the first edition of The Hacker’s Dictionary.

12.1 See also

• Explicit substitution

12.2 References

[1] Geoffrey Hunter, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, p. 13

[2] Shoenfield, Joseph R. (2001) [1967], Mathematical Logic (2nd ed.), A K Peters, p. 7, ISBN 978-1-56881-135-2

[3] Corcoran, J. 2006. Schemata: the Concept of Schema in the History of Logic. Bulletin of Symbolic Logic 12: 219-40

[4] R. D. Tennent (2002). Specifying Software: A Hands-On Introduction. Cambridge University Press. pp. 36–37 and 210. ISBN 978-0-521-00401-5.

[5] Masahiko Sato, Takafumi Sakurai, Yukiyoshi Kameyama, and Atsushi Igarashi. "Calculi of Meta-variables" in Computer Science Logic. 17th International Workshop CSL 2003. 12th Annual Conference of the EACSL. 8th Kurt Gödel Colloquium, KGC 2003, Vienna, Austria, August 25-30, 2003. Proceedings, Springer Lecture Notes in Computer Science 2803. ISBN 3-540-40801-0. pp. 484–497


Chapter 13

Proposition

This article is about the term in logic and philosophy. For other uses, see Proposition (disambiguation).
Not to be confused with preposition.

The term proposition has a broad use in contemporary philosophy. It is used to refer to some or all of the following: the primary bearers of truth-value, the objects of belief and other "propositional attitudes" (i.e., what is believed, doubted, etc.), the referents of that-clauses, and the meanings of declarative sentences. Propositions are the sharable objects of attitudes and the primary bearers of truth and falsity. This stipulation rules out certain candidates for propositions, including thought- and utterance-tokens, which are not sharable, and concrete events or facts, which cannot be false.[1]

13.1 Historical usage

13.1.1 By Aristotle

Aristotelian logic identifies a proposition as a sentence which affirms or denies a predicate of a subject. An Aristotelian proposition may take the form “All men are mortal” or “Socrates is a man.” In the first example the subject is “All men” and the predicate “are mortal.” In the second example the subject is “Socrates” and the predicate is “is a man.”

13.1.2 By the logical positivists

Often propositions are related to closed sentences, to distinguish them from what is expressed by an open sentence. In this sense, propositions are “statements” that are truth-bearers. This conception of a proposition was supported by the philosophical school of logical positivism.

Some philosophers argue that some (or all) kinds of speech or actions besides the declarative ones also have propositional content. For example, yes–no questions present propositions, being inquiries into their truth value. On the other hand, some signs can be declarative assertions of propositions without forming a sentence or even being linguistic, e.g. traffic signs convey definite meaning which is either true or false.

Propositions are also spoken of as the content of beliefs and similar intentional attitudes such as desires, preferences, and hopes. For example, “I desire that I have a new car,” or “I wonder whether it will snow” (or, whether it is the case that “it will snow”). Desire, belief, and so on, are thus called propositional attitudes when they take this sort of content.

13.1.3 By Russell

Bertrand Russell held that propositions were structured entities with objects and properties as constituents. Wittgenstein held that a proposition is the set of possible worlds/states of affairs in which it is true. One important difference between these views is that on the Russellian account, two propositions that are true in all the same states of affairs can still be differentiated. For instance, the proposition that two plus two equals four is distinct on a Russellian account from the proposition that three plus three equals six. If propositions are sets of possible worlds, however, then all mathematical truths (and all other necessary truths) are the same set (the set of all possible worlds).

13.2 Relation to the mind

In relation to the mind, propositions are discussed primarily as they fit into propositional attitudes. Propositional attitudes are simply attitudes characteristic of folk psychology (belief, desire, etc.) that one can take toward a proposition (e.g. 'it is raining,' 'snow is white,' etc.). In English, propositions usually follow folk psychological attitudes by a “that clause” (e.g. “Jane believes that it is raining”). In philosophy of mind and psychology, mental states are often taken to primarily consist in propositional attitudes. The propositions are usually said to be the “mental content” of the attitude. For example, if Jane has a mental state of believing that it is raining, her mental content is the proposition 'it is raining.' Furthermore, since such mental states are about something (namely propositions), they are said to be intentional mental states. Philosophical debates surrounding propositions as they relate to propositional attitudes have also recently centered on whether they are internal or external to the agent, or whether they are mind-dependent or mind-independent entities (see the entry on internalism and externalism in philosophy of mind).

13.3 Treatment in logic

As noted above, in Aristotelian logic a proposition is a particular kind of sentence, one which affirms or denies a predicate of a subject. Aristotelian propositions take forms like “All men are mortal” and “Socrates is a man.”

Propositions show up in formal logic as objects of a formal language. A formal language begins with different types of symbols. These types can include variables, operators, function symbols, predicate (or relation) symbols, quantifiers, and propositional constants. (Grouping symbols are often added for convenience in using the language but do not play a logical role.) Symbols are concatenated together according to recursive rules in order to construct strings to which truth-values will be assigned. The rules specify how the operators, function and predicate symbols, and quantifiers are to be concatenated with other strings. A proposition is then a string with a specific form. The form that a proposition takes depends on the type of logic.

The type of logic called propositional, sentential, or statement logic includes only operators and propositional constants as symbols in its language. The propositions in this language are propositional constants, which are considered atomic propositions, and composite propositions, which are composed by recursively applying operators to propositions. Application here is simply a short way of saying that the corresponding concatenation rule has been applied.

The types of logics called predicate, quantificational, or n-order logic include variables, operators, predicate and function symbols, and quantifiers as symbols in their languages. The propositions in these logics are more complex. First, terms must be defined. A term is (i) a variable or (ii) a function symbol applied to the number of terms required by the function symbol’s arity. For example, if + is a binary function symbol and x, y, and z are variables, then x+(y+z) is a term, which might be written with the symbols in various orders. A proposition is (i) a predicate symbol applied to the number of terms required by its arity, (ii) an operator applied to the number of propositions required by its arity, or (iii) a quantifier applied to a proposition. For example, if = is a binary predicate symbol and ∀ is a quantifier, then ∀x,y,z [(x = y) → (x+z = y+z)] is a proposition. This more complex structure of propositions allows these logics to make finer distinctions between inferences, i.e., to have greater expressive power.

In this context, propositions are also called sentences, statements, statement forms, formulas, and well-formed formulas, though these terms are usually not synonymous within a single text. This definition treats propositions as syntactic objects, as opposed to semantic or mental objects. That is, propositions in this sense are meaningless, formal, abstract objects. They are assigned meaning and truth-values by mappings called interpretations and valuations, respectively.
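
The view of propositions as syntactic objects can be made concrete with a small sketch (Python, used only for illustration; the class and function names are arbitrary choices, not standard terminology): formulas are built recursively from proposition letters by operators, and a separate valuation assigns them truth-values.

from dataclasses import dataclass

@dataclass(frozen=True)
class Var:              # an atomic proposition (proposition letter)
    name: str

@dataclass(frozen=True)
class Not:              # negation applied to one formula
    sub: object

@dataclass(frozen=True)
class And:              # conjunction applied to two formulas
    left: object
    right: object

def value(formula, valuation):
    """Evaluate a purely syntactic formula under a valuation of its letters."""
    if isinstance(formula, Var):
        return valuation[formula.name]
    if isinstance(formula, Not):
        return not value(formula.sub, valuation)
    if isinstance(formula, And):
        return value(formula.left, valuation) and value(formula.right, valuation)
    raise TypeError("unknown formula")

f = And(Var("P"), Not(Var("Q")))          # the syntactic object P ∧ ¬Q
print(value(f, {"P": True, "Q": False}))  # True under this valuation

The formula object itself carries no meaning; only the valuation passed to value assigns truth-values, mirroring the distinction drawn in the paragraph above.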

13.4 Objections to propositions

Attempts to provide a workable definition of proposition include

Two meaningful declarative sentences express the same proposition if and only if they mean the same thing.


thus defining proposition in terms of synonymity. For example, “Snow is white” (in English) and “Schnee ist weiß” (in German) are different sentences, but they say the same thing, so they express the same proposition.

Two meaningful declarative sentence-tokens express the same proposition if and only if they mean the same thing.

Unfortunately, the above definition has the result that two sentences/sentence-tokens which have the same meaning and thus express the same proposition could have different truth-values, e.g. “I am Spartacus” said by Spartacus and said by John Smith; and e.g. “It is Wednesday” said on a Wednesday and on a Thursday.

A number of philosophers and linguists claim that all definitions of a proposition are too vague to be useful. For them, it is just a misleading concept that should be removed from philosophy and semantics. W.V. Quine maintained that the indeterminacy of translation prevented any meaningful discussion of propositions, and that they should be discarded in favor of sentences.[2] Strawson advocated the use of the term “statement”.

13.5 See also

• Main contention

13.6 References

[1] “Propositions (Stanford Encyclopedia of Philosophy)". Plato.stanford.edu. Retrieved 2014-06-23.

[2] Quine, W.V. Philosophy of Logic, Prentice-Hall, NJ, USA: 1970, pp. 1–14

13.7 External links

• Stanford Encyclopedia of Philosophy articles on:

• Propositions, by Matthew McGrath
• Singular Propositions, by Greg Fitch
• Structured Propositions, by Jeffrey C. King


Chapter 14

Propositional calculus

Propositional calculus (also called propositional logic, sentential calculus, or sentential logic) is the branch of mathematical logic concerned with the study of propositions (whether they are true or false) that are formed by other propositions with the use of logical connectives, and how their value depends on the truth value of their components. Logical connectives are found in natural languages. In English, for example, they include “and” (conjunction), “or” (disjunction), “not” (negation) and “if” (but only when used to denote material conditional).

The following is an example of a very simple inference within the scope of propositional logic:

Premise 1: If it’s raining then it’s cloudy.
Premise 2: It’s raining.
Conclusion: It’s cloudy.

Both premises and the conclusion are propositions. The premises are taken for granted, and then with the application of modus ponens (an inference rule) the conclusion follows.

As propositional logic is not concerned with the structure of propositions beyond the point where they cannot be decomposed any further by logical connectives, this inference can be restated replacing those atomic statements with statement letters, which are interpreted as variables representing statements:

P → Q

P

Q

The same can be stated succinctly in the following way:

P → Q, P ⊢ Q

When P is interpreted as “It’s raining” and Q as “it’s cloudy”, the above symbolic expressions can be seen to correspond exactly with the original expression in natural language. Not only that, but they will also correspond with any other inference of this form, which will be valid on the same basis as this inference.

Propositional logic may be studied through a formal system in which formulas of a formal language may be interpreted to represent propositions. A system of inference rules and axioms allows certain formulas to be derived. These derived formulas are called theorems and may be interpreted to be true propositions. A constructed sequence of such formulas is known as a derivation or proof, and the last formula of the sequence is the theorem. The derivation may be interpreted as proof of the proposition represented by the theorem.

When a formal system is used to represent formal logic, only statement letters are represented directly. The natural language propositions that arise when they are interpreted are outside the scope of the system, and the relation between the formal system and its interpretation is likewise outside the formal system itself.

Usually in truth-functional propositional logic, formulas are interpreted as having either a truth value of true or a truth value of false. Truth-functional propositional logic, and systems isomorphic to it, are considered to be zeroth-order logic.
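
The claim that this inference form is valid can be checked mechanically by enumerating interpretations. The sketch below (Python, used only for illustration; the helper name implies is an arbitrary choice) confirms that no assignment of truth values makes both premises of P → Q, P ⊢ Q true while making Q false:

from itertools import product

def implies(p, q):
    return (not p) or q

# In every row of the truth table where both premises hold, Q holds as well.
valid = all(q
            for p, q in product([True, False], repeat=2)
            if implies(p, q) and p)
print(valid)  # True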


14.1 History

Main article: History of logic

Although propositional logic (which is interchangeable with propositional calculus) had been hinted at by earlier philosophers, it was developed into a formal logic by Chrysippus in the 3rd century BC[1] and expanded by the Stoics. The logic was focused on propositions. This advancement was different from the traditional syllogistic logic, which was focused on terms. However, later in antiquity, the propositional logic developed by the Stoics was no longer understood. Consequently, the system was essentially reinvented by Peter Abelard in the 12th century.[2]

Propositional logic was eventually refined using symbolic logic. The 17th/18th-century philosopher Gottfried Leibniz has been credited with being the founder of symbolic logic for his work with the calculus ratiocinator. Although his work was the first of its kind, it was unknown to the larger logical community. Consequently, many of the advances achieved by Leibniz were re-achieved by logicians like George Boole and Augustus De Morgan completely independently of Leibniz.[3]

Just as propositional logic can be considered an advancement from the earlier syllogistic logic, Gottlob Frege's predicate logic was an advancement from the earlier propositional logic. One author describes predicate logic as combining “the distinctive features of syllogistic logic and propositional logic.”[4] Consequently, predicate logic ushered in a new era in logic’s history; however, advances in propositional logic were still made after Frege, including natural deduction, truth trees and truth tables. Natural deduction was invented by Gerhard Gentzen and Jan Łukasiewicz. Truth trees were invented by Evert Willem Beth.[5] The invention of truth tables, however, is of controversial attribution.

Within works by Frege[6] and Bertrand Russell,[7] one finds ideas influential in bringing about the notion of truth tables. The actual 'tabular structure' (being formatted as a table), itself, is generally credited to either Ludwig Wittgenstein or Emil Post (or both, independently).[6] Besides Frege and Russell, others credited with having ideas preceding truth tables include Philo, Boole, Charles Sanders Peirce, and Ernst Schröder. Others credited with the tabular structure include Łukasiewicz, Schröder, Alfred North Whitehead, William Stanley Jevons, John Venn, and Clarence Irving Lewis.[7] Ultimately, some have concluded, like John Shosky, that “It is far from clear that any one person should be given the title of 'inventor' of truth-tables.”[7]

14.2 Terminology

In general terms, a calculus is a formal system that consists of a set of syntactic expressions (well-formed formulas), a distinguished subset of these expressions (axioms), plus a set of formal rules that define a specific binary relation, intended to be interpreted to be logical equivalence, on the space of expressions.

When the formal system is intended to be a logical system, the expressions are meant to be interpreted to be statements, and the rules, known to be inference rules, are typically intended to be truth-preserving. In this setting, the rules (which may include axioms) can then be used to derive (“infer”) formulas representing true statements from given formulas representing true statements.

The set of axioms may be empty, a nonempty finite set, a countably infinite set, or be given by axiom schemata. A formal grammar recursively defines the expressions and well-formed formulas of the language. In addition a semantics may be given which defines truth and valuations (or interpretations).

The language of a propositional calculus consists of

1. a set of primitive symbols, variously referred to as atomic formulas, placeholders, proposition letters, or variables, and

2. a set of operator symbols, variously interpreted as logical operators or logical connectives.

A well-formed formula is any atomic formula, or any formula that can be built up from atomic formulas by means of operator symbols according to the rules of the grammar.

Mathematicians sometimes distinguish between propositional constants, propositional variables, and schemata. Propositional constants represent some particular proposition, while propositional variables range over the set of all atomic propositions. Schemata, however, range over all propositions. It is common to represent propositional constants by A, B, and C, propositional variables by P, Q, and R, and schematic letters are often Greek letters, most often φ, ψ, and χ.

14.3 Basic concepts

The following outlines a standard propositional calculus. Many different formulations exist which are all more or less equivalent but differ in the details of:

1. their language, that is, the particular collection of primitive symbols and operator symbols,
2. the set of axioms, or distinguished formulas, and
3. the set of inference rules.

Any given proposition may be represented with a letter called a 'propositional constant', analogous to representing a number by a letter in mathematics, for instance, a = 5. All propositions require exactly one of two truth-values: true or false. For example, let P be the proposition that it is raining outside. This will be true (P) if it is raining outside and false otherwise (¬P).

• We then define truth-functional operators, beginning with negation. ¬P represents the negation of P, which can be thought of as the denial of P. In the example above, ¬P expresses that it is not raining outside, or by a more standard reading: “It is not the case that it is raining outside.” When P is true, ¬P is false; and when P is false, ¬P is true. ¬¬P always has the same truth-value as P.

• Conjunction is a truth-functional connective which forms a proposition out of two simpler propositions, for example, P and Q. The conjunction of P and Q is written P ∧ Q, and expresses that each is true. We read P ∧ Q as “P and Q”. For any two propositions, there are four possible assignments of truth values:

1. P is true and Q is true
2. P is true and Q is false
3. P is false and Q is true
4. P is false and Q is false

The conjunction of P and Q is true in case 1 and is false otherwise. Where P is the proposition that it is raining outside and Q is the proposition that a cold-front is over Kansas, P ∧ Q is true when it is raining outside and there is a cold-front over Kansas. If it is not raining outside, then P ∧ Q is false; and if there is no cold-front over Kansas, then P ∧ Q is false.

• Disjunction resembles conjunction in that it forms a proposition out of two simpler propositions. We write it P ∨ Q, and it is read “P or Q”. It expresses that either P or Q is true. Thus, in the cases listed above, the disjunction of P and Q is true in all cases except 4. Using the example above, the disjunction expresses that it is either raining outside or there is a cold front over Kansas. (Note, this use of disjunction is supposed to resemble the use of the English word “or”. However, it is most like the English inclusive “or”, which can be used to express the truth of at least one of two propositions. It is not like the English exclusive “or”, which expresses the truth of exactly one of two propositions. That is to say, the exclusive “or” is false when both P and Q are true (case 1). An example of the exclusive or is: You may have a bagel or a pastry, but not both. Often in natural language, given the appropriate context, the addendum “but not both” is omitted but implied. In mathematics, however, “or” is always inclusive or; if exclusive or is meant it will be specified, possibly by “xor”.)

• Material conditional also joins two simpler propositions, and we write P → Q, which is read “if P then Q”. The proposition to the left of the arrow is called the antecedent and the proposition to the right is called the consequent. (There is no such designation for conjunction or disjunction, since they are commutative operations.) It expresses that Q is true whenever P is true. Thus it is true in every case above except case 2, because this is the only case when P is true but Q is not. Using the example, if P then Q expresses that if it is raining outside then there is a cold-front over Kansas. The material conditional is often confused with physical causation. The material conditional, however, only relates two propositions by their truth-values, which is not the relation of cause and effect. It is contentious in the literature whether the material implication represents logical causation.

Page 78: Formal Semantics (Logic)

68 CHAPTER 14. PROPOSITIONAL CALCULUS

• Biconditional joins two simpler propositions, and we write P ↔ Q, which is read “P if and only if Q”. It expresses that P and Q have the same truth-value; thus P if and only if Q is true in cases 1 and 4, and false otherwise.

It is extremely helpful to look at the truth tables for these different operators, as well as the method of analytic tableaux.

14.3.1 Closure under operations

Propositional logic is closed under truth-functional connectives. That is to say, for any proposition φ, ¬φ is also a proposition. Likewise, for any propositions φ and ψ, φ ∧ ψ is a proposition, and similarly for disjunction, conditional, and biconditional. This implies that, for instance, φ ∧ ψ is a proposition, and so it can be conjoined with another proposition. In order to represent this, we need to use parentheses to indicate which proposition is conjoined with which. For instance, P ∧ Q ∧ R is not a well-formed formula, because we do not know if we are conjoining P ∧ Q with R or if we are conjoining P with Q ∧ R. Thus we must write either (P ∧ Q) ∧ R to represent the former, or P ∧ (Q ∧ R) to represent the latter. By evaluating the truth conditions, we see that both expressions have the same truth conditions (will be true in the same cases), and moreover that any proposition formed by arbitrary conjunctions will have the same truth conditions, regardless of the location of the parentheses. This means that conjunction is associative; however, one should not assume that parentheses never serve a purpose. For instance, the sentence P ∧ (Q ∨ R) does not have the same truth conditions as (P ∧ Q) ∨ R, so they are different sentences distinguished only by the parentheses. One can verify this by the truth-table method referenced above.

Note: For any arbitrary number of propositional constants, we can form a finite number of cases which list their possible truth-values. A simple way to generate this is by truth-tables, in which one writes P, Q, ..., Z, for any list of k propositional constants, that is to say, any list of propositional constants with k entries. Below this list, one writes 2^k rows, and below P one fills in the first half of the rows with true (or T) and the second half with false (or F). Below Q one fills in one-quarter of the rows with T, then one-quarter with F, then one-quarter with T and the last quarter with F. The next column alternates between true and false for each eighth of the rows, then sixteenths, and so on, until the last propositional constant varies between T and F for each row. This will give a complete listing of cases or truth-value assignments possible for those propositional constants.
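
The construction just described is easy to mechanize; the following sketch (Python, for illustration only; the helper name truth_table is an arbitrary choice) enumerates the 2^k truth-value assignments for k propositional constants and evaluates a sample formula in each row:

from itertools import product

def truth_table(letters):
    """All 2**len(letters) assignments; the first letter alternates in halves,
    the next in quarters, and so on, as described above."""
    return [dict(zip(letters, row))
            for row in product([True, False], repeat=len(letters))]

for row in truth_table(["P", "Q", "R"]):        # 2**3 = 8 rows
    print(row, " P ∧ (Q ∨ R) =", row["P"] and (row["Q"] or row["R"]))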

14.3.2 Argument

The propositional calculus then defines an argument to be a set of propositions. A valid argument is a set of propositions, the last of which follows from—or is implied by—the rest. All other arguments are invalid. The simplest valid argument is modus ponens, one instance of which is the following set of propositions:

1. P → Q
2. P
∴ Q

This is a set of three propositions, each line is a proposition, and the last follows from the rest. The first two lines are called premises, and the last line the conclusion. We say that any proposition C follows from any set of propositions (P₁, ..., Pₙ), if C must be true whenever every member of the set (P₁, ..., Pₙ) is true. In the argument above, for any P and Q, whenever P → Q and P are true, necessarily Q is true. Notice that, when P is true, we cannot consider cases 3 and 4 (from the truth table). When P → Q is true, we cannot consider case 2. This leaves only case 1, in which Q is also true. Thus Q is implied by the premises.

This generalizes schematically. Thus, where φ and ψ may be any propositions at all,

1. φ → ψ
2. φ
∴ ψ

Other argument forms are convenient, but not necessary. Given a complete set of axioms (see below for one such set), modus ponens is sufficient to prove all other argument forms in propositional logic, so they may be considered derivative. Note that this is not true of the extension of propositional logic to other logics like first-order logic: first-order logic requires at least one additional rule of inference in order to obtain completeness.


The significance of argument in formal logic is that one may obtain new truths from established truths. In the first example above, given the two premises, the truth of Q is not yet known or stated. After the argument is made, Q is deduced. In this way, we define a deduction system to be a set of all propositions that may be deduced from another set of propositions. For instance, given the set of propositions A = { P ∨ Q, ¬Q ∧ R, (P ∨ Q) → R }, we can define a deduction system, Γ, which is the set of all propositions which follow from A. Reiteration is always assumed, so P ∨ Q, ¬Q ∧ R, (P ∨ Q) → R ∈ Γ. Also, from the first element of A, the last element, and modus ponens, R is a consequence, and so R ∈ Γ. Because we have not included sufficiently complete axioms, though, nothing else may be deduced. Thus, even though most deduction systems studied in propositional logic are able to deduce (P ∨ Q) ↔ (¬P → Q), this one is too weak to prove such a proposition.

14.4 Generic description of a propositional calculus

A propositional calculus is a formal system L = L(A, Ω, Z, I), where:

• The alpha set A is a finite set of elements called proposition symbols or propositional variables. Syntactically speaking, these are the most basic elements of the formal language L, otherwise referred to as atomic formulas or terminal elements. In the examples to follow, the elements of A are typically the letters p, q, r, and so on.

• The omega set Ω is a finite set of elements called operator symbols or logical connectives. The set Ω is partitioned into disjoint subsets as follows:

Ω = Ω₀ ∪ Ω₁ ∪ … ∪ Ωⱼ ∪ … ∪ Ωₘ.

In this partition, Ωⱼ is the set of operator symbols of arity j.

In the more familiar propositional calculi, Ω is typically partitioned as follows:

Ω₁ = {¬},

Ω₂ ⊆ {∧, ∨, →, ↔}.

A frequently adopted convention treats the constant logical values as operators of arity zero, thus:

Ω₀ = {0, 1}.

Some writers use the tilde (~), or N, instead of ¬; and some use the ampersand (&), the prefixed K, or · instead of ∧. Notation varies even more for the set of logical values, with symbols like {false, true}, {F, T}, or {⊥, ⊤} all being seen in various contexts instead of {0, 1}.

• The zeta set Z is a finite set of transformation rules that are called inference rules when they acquire logical applications.

• The iota set I is a finite set of initial points that are called axioms when they receive logical interpretations.

The language of L, also known as its set of formulas or well-formed formulas, is inductively defined by the following rules:


1. Base: Any element of the alpha set A is a formula of L .

2. If p₁, p₂, . . . , pⱼ are formulas and f is in Ωⱼ, then (f(p₁, p₂, . . . , pⱼ)) is a formula.

3. Closed: Nothing else is a formula of L .

Repeated applications of these rules permit the construction of complex formulas. For example:

1. By rule 1, p is a formula.

2. By rule 2, ¬p is a formula.

3. By rule 1, q is a formula.

4. By rule 2, (¬p ∨ q) is a formula.
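The construction just carried out can be mirrored in a few lines of code. The Python sketch below is a hypothetical representation (not part of the calculus itself): atomic formulas are strings from the alpha set, and compound formulas are tuples whose first element is an operator symbol.

```python
# A sketch of the three formation rules.
ALPHA = {"p", "q", "r"}                                   # alpha set A
OMEGA = {"¬": 1, "∧": 2, "∨": 2, "→": 2, "↔": 2}          # operators with arities

def is_formula(x):
    """Membership in the language of L, following rules 1-3."""
    if isinstance(x, str):                                # rule 1 (base)
        return x in ALPHA
    if isinstance(x, tuple) and x and x[0] in OMEGA:      # rule 2 (f(p1, ..., pj))
        op, *args = x
        return len(args) == OMEGA[op] and all(is_formula(a) for a in args)
    return False                                          # rule 3 (closed)

# The four construction steps from the example:
step1 = "p"                    # by rule 1
step2 = ("¬", step1)           # by rule 2:  ¬p
step3 = "q"                    # by rule 1
step4 = ("∨", step2, step3)    # by rule 2:  (¬p ∨ q)
assert all(is_formula(s) for s in (step1, step2, step3, step4))
```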

14.5 Example 1. Simple axiom system

Let L₁ = L(A, Ω, Z, I), where A, Ω, Z, I are defined as follows:

• The alpha set A is a finite set of symbols that is large enough to supply the needs of a given discussion, for example:

A = {p, q, r, s, t, u}.

• Of the three connectives for conjunction, disjunction, and implication (∧, ∨, and →), one can be taken as primitive and the other two can be defined in terms of it and negation (¬).[8] Indeed, all of the logical connectives can be defined in terms of a sole sufficient operator. The biconditional (↔) can of course be defined in terms of conjunction and implication, with a ↔ b defined as (a → b) ∧ (b → a).

Ω = Ω₁ ∪ Ω₂

Ω₁ = {¬},

Ω₂ = {→}.

• An axiom system discovered by Jan Łukasiewicz formulates a propositional calculus in this language as follows. The axioms are all substitution instances of:

• (p→ (q → p))

• ((p→ (q → r)) → ((p→ q) → (p→ r)))

• ((¬p→ ¬q) → (q → p))

• The rule of inference is modus ponens (i.e., from p and (p → q), infer q). Then a ∨ b is defined as ¬a → b, and a ∧ b is defined as ¬(a → ¬b). This system is used in the Metamath set.mm formal proof database.
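As a quick semantic sanity check (a sketch that sits outside the formal system itself), the Python snippet below confirms that the three axiom schemas are tautologies and that the defined connectives have the usual truth tables for disjunction and conjunction.

```python
from itertools import product

def imp(a, b):                 # truth table of the material conditional
    return (not a) or b

for p, q, r in product([True, False], repeat=3):
    assert imp(p, imp(q, p))                                      # (p → (q → p))
    assert imp(imp(p, imp(q, r)), imp(imp(p, q), imp(p, r)))      # second axiom
    assert imp(imp(not p, not q), imp(q, p))                      # third axiom

for a, b in product([True, False], repeat=2):
    assert imp(not a, b) == (a or b)          # a ∨ b defined as ¬a → b
    assert (not imp(a, not b)) == (a and b)   # a ∧ b defined as ¬(a → ¬b)
```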


14.6 Example 2. Natural deduction system

Let L₂ = L(A, Ω, Z, I), where A, Ω, Z, I are defined as follows:

• The alpha set A is a finite set of symbols that is large enough to supply the needs of a given discussion, for example:

A = {p, q, r, s, t, u}.

• The omega set Ω = Ω₁ ∪ Ω₂ partitions as follows:

Ω₁ = {¬},

Ω₂ = {∧, ∨, →, ↔}.

In the following example of a propositional calculus, the transformation rules are intended to be interpreted as the inference rules of a so-called natural deduction system. The particular system presented here has no initial points, which means that its interpretation for logical applications derives its theorems from an empty axiom set.

• The set of initial points is empty, that is, I = ∅ .

• The set of transformation rules, Z , is described as follows:

Our propositional calculus has ten inference rules. These rules allow us to derive other true formulas given a set of formulas that are assumed to be true. The first nine simply state that we can infer certain well-formed formulas from other well-formed formulas. The last rule, however, uses hypothetical reasoning in the sense that in the premise of the rule we temporarily assume an (unproven) hypothesis to be part of the set of inferred formulas to see if we can infer a certain other formula. Since the first nine rules don't do this they are usually described as non-hypothetical rules, and the last one as a hypothetical rule.

In describing the transformation rules, we may introduce a metalanguage symbol ⊢. It is basically a convenient shorthand for saying “infer that”. The format is Γ ⊢ ψ, in which Γ is a (possibly empty) set of formulas called premises, and ψ is a formula called the conclusion. The transformation rule Γ ⊢ ψ means that if every proposition in Γ is a theorem (or has the same truth value as the axioms), then ψ is also a theorem. Note that, in view of the rule Conjunction introduction below, whenever Γ has more than one formula we can always safely reduce it to a single formula using conjunction; so, for short, we may represent Γ as one formula instead of a set. Another omission for convenience: when Γ is the empty set, it may simply be left out.

Negation introduction From (p→ q) and (p→ ¬q) , infer ¬p .

That is, (p→ q), (p→ ¬q) ⊢ ¬p .

Negation elimination From ¬p , infer (p→ r) .

That is, ¬p ⊢ (p→ r) .

Double negative elimination From ¬¬p , infer p.

That is, ¬¬p ⊢ p .

Conjunction introduction From p and q, infer (p ∧ q) .

That is, p, q ⊢ (p ∧ q) .

Conjunction elimination From (p ∧ q) , infer p.

From (p ∧ q) , infer q.

That is, (p ∧ q) ⊢ p and (p ∧ q) ⊢ q .

Disjunction introduction From p, infer (p ∨ q) .

From q, infer (p ∨ q) .

Page 82: Formal Semantics (Logic)

72 CHAPTER 14. PROPOSITIONAL CALCULUS

That is, p ⊢ (p ∨ q) and q ⊢ (p ∨ q) .

Disjunction elimination From (p ∨ q) and (p→ r) and (q → r) , infer r.

That is, p ∨ q, p→ r, q → r ⊢ r .

Biconditional introduction From (p→ q) and (q → p) , infer (p↔ q) .

That is, p→ q, q → p ⊢ (p↔ q) .

Biconditional elimination From (p↔ q) , infer (p→ q) .

From (p↔ q) , infer (q → p) .

That is, (p↔ q) ⊢ (p→ q) and (p↔ q) ⊢ (q → p) .

Modus ponens (conditional elimination) From p and (p→ q) , infer q.

That is, p, p→ q ⊢ q .

Conditional proof (conditional introduction) From [accepting p allows a proof of q], infer (p→ q) .

That is, (p ⊢ q) ⊢ (p→ q) .

14.7 Basic and derived argument forms

14.8 Proofs in propositional calculus

One of the main uses of a propositional calculus, when interpreted for logical applications, is to determine relations of logical equivalence between propositional formulas. These relationships are determined by means of the available transformation rules, sequences of which are called derivations or proofs.

In the discussion to follow, a proof is presented as a sequence of numbered lines, with each line consisting of a single formula followed by a reason or justification for introducing that formula. Each premise of the argument, that is, an assumption introduced as an hypothesis of the argument, is listed at the beginning of the sequence and is marked as a “premise” in lieu of other justification. The conclusion is listed on the last line. A proof is complete if every line follows from the previous ones by the correct application of a transformation rule. (For a contrasting approach, see proof-trees.)

14.8.1 Example of a proof

• To be shown that A→ A.

• One possible proof of this (which, though valid, happens to contain more steps than are necessary) may be arranged as follows:

1. A (premise)
2. A ∨ A (from (1) by disjunction introduction)
3. (A ∨ A) ∧ A (from (1) and (2) by conjunction introduction)
4. A (from (3) by conjunction elimination)
5. A ⊢ A (summary of (1) through (4))
6. ⊢ A → A (from (5) by conditional proof)

Interpret A ⊢ A as “Assuming A, infer A”. Read ⊢ A → A as “Assuming nothing, infer that A implies A”, or “It is a tautology that A implies A”, or “It is always true that A implies A”.

14.9 Soundness and completeness of the rules

The crucial properties of this set of rules are that they are sound and complete. Informally this means that the rules are correct and that no other rules are required. These claims can be made more formal as follows.

We define a truth assignment as a function that maps propositional variables to true or false. Informally such a truth assignment can be understood as the description of a possible state of affairs (or possible world) where certain statements are true and others are not. The semantics of formulas can then be formalized by defining for which “state of affairs” they are considered to be true, which is what is done by the following definition.

We define when such a truth assignment A satisfies a certain well-formed formula with the following rules:


• A satisfies the propositional variable P if and only if A(P) = true

• A satisfies ¬φ if and only if A does not satisfy φ

• A satisfies (φ ∧ ψ) if and only if A satisfies both φ and ψ

• A satisfies (φ ∨ ψ) if and only if A satisfies at least one of either φ or ψ

• A satisfies (φ→ ψ) if and only if it is not the case that A satisfies φ but not ψ

• A satisfies (φ↔ ψ) if and only if A satisfies both φ and ψ or satisfies neither one of them

With this definition we can now formalize what it means for a formula φ to be implied by a certain set S of formulas. Informally this is true if in all worlds that are possible given the set of formulas S the formula φ also holds. This leads to the following formal definition: We say that a set S of well-formed formulas semantically entails (or implies) a certain well-formed formula φ if all truth assignments that satisfy all the formulas in S also satisfy φ.

Finally we define syntactical entailment such that φ is syntactically entailed by S if and only if we can derive it with the inference rules that were presented above in a finite number of steps. This allows us to formulate exactly what it means for the set of inference rules to be sound and complete:

Soundness: If the set of well-formed formulas S syntactically entails the well-formed formula φ then S semantically entails φ.

Completeness: If the set of well-formed formulas S semantically entails the well-formed formula φ then S syntactically entails φ.

For the above set of rules this is indeed the case.
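Because propositional logic can be decided by finite enumeration, both the satisfaction clauses above and semantic entailment can be implemented directly. The Python sketch below uses a nested-tuple formula representation and function names of its own (not part of the source text) to transcribe those definitions.

```python
from itertools import product

def satisfies(A, phi):
    """Does truth assignment A (dict: variable -> bool) satisfy formula phi?"""
    if isinstance(phi, str):
        return A[phi]
    op, *args = phi
    if op == "¬":
        return not satisfies(A, args[0])
    x, y = satisfies(A, args[0]), satisfies(A, args[1])
    if op == "∧": return x and y
    if op == "∨": return x or y
    if op == "→": return (not x) or y
    if op == "↔": return x == y
    raise ValueError(op)

def variables(phi):
    return {phi} if isinstance(phi, str) else set().union(*map(variables, phi[1:]))

def entails(S, phi):
    """S semantically entails phi: every assignment satisfying all of S satisfies phi."""
    vs = sorted(set().union(variables(phi), *map(variables, S)))
    return all(satisfies(dict(zip(vs, vals)), phi)
               for vals in product([True, False], repeat=len(vs))
               if all(satisfies(dict(zip(vs, vals)), f) for f in S))

# Modus ponens is sound in this semantic sense:
assert entails([("→", "P", "Q"), "P"], "Q")
```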

14.9.1 Sketch of a soundness proof

(For most logical systems, this is the comparatively “simple” direction of proof.)

Notational conventions: Let G be a variable ranging over sets of sentences. Let A, B and C range over sentences. For “G syntactically entails A” we write “G proves A”. For “G semantically entails A” we write “G implies A”.

We want to show: (A)(G) (if G proves A, then G implies A).

We note that “G proves A” has an inductive definition, and that gives us the immediate resources for demonstrating claims of the form “If G proves A, then ...”. So our proof proceeds by induction.

1. Basis. Show: If A is a member of G, then G implies A.

2. Basis. Show: If A is an axiom, then G implies A.

3. Inductive step (induction on n, the length of the proof):

(a) Assume for arbitrary G and A that if G proves A in n or fewer steps, then G implies A.

(b) For each possible application of a rule of inference at step n + 1, leading to a new theorem B, show that G implies B.

Notice that Basis Step II can be omitted for natural deduction systems because they have no axioms. When used, Step II involves showing that each of the axioms is a (semantic) logical truth.

The Basis steps demonstrate that the simplest provable sentences from G are also implied by G, for any G. (The proof is simple, since the semantic fact that a set implies any of its members is also trivial.) The Inductive step will systematically cover all the further sentences that might be provable—by considering each case where we might reach a logical conclusion using an inference rule—and show that if a new sentence is provable, it is also logically implied. (For example, we might have a rule telling us that from “A” we can derive “A or B”. In III.(a) we assume that if A is provable it is implied. We also know that if A is provable then “A or B” is provable. We have to show that then “A or B” too is implied. We do so by appeal to the semantic definition and the assumption we just made. A is provable from G, we assume. So it is also implied by G. So any semantic valuation making all of G true makes A true. But any valuation making A true makes “A or B” true, by the defined semantics for “or”. So any valuation which makes all of G true makes “A or B” true. So “A or B” is implied.) Generally, the Inductive step will consist of a lengthy but simple case-by-case analysis of all the rules of inference, showing that each “preserves” semantic implication.

By the definition of provability, there are no sentences provable other than by being a member of G, an axiom, or following by a rule; so if all of those are semantically implied, the deduction calculus is sound.

14.9.2 Sketch of completeness proof

(This is usually the much harder direction of proof.)

We adopt the same notational conventions as above.

We want to show: If G implies A, then G proves A. We proceed by contraposition: we show instead that if G does not prove A then G does not imply A.

1. G does not prove A. (Assumption)

2. If G does not prove A, then we can construct an (infinite) Maximal Set, G∗, which is a superset of G and which also does not prove A.

(a) Place an ordering on all the sentences in the language (e.g., shortest first, and equally long ones in extended alphabetical ordering), and number them (E₁, E₂, ...).

(b) Define a series of sets (G₀, G₁, ...) inductively:

i. G₀ = G

ii. If Gₖ ∪ {Eₖ₊₁} proves A, then Gₖ₊₁ = Gₖ

iii. If Gₖ ∪ {Eₖ₊₁} does not prove A, then Gₖ₊₁ = Gₖ ∪ {Eₖ₊₁}

(c) Define G∗ as the union of all the Gₖ. (That is, G∗ is the set of all the sentences that are in any Gₖ.)

(d) It can be easily shown that

i. G∗ contains (is a superset of) G (by (b.i));

ii. G∗ does not prove A (because if it proves A then some sentence was added to some Gₖ which caused it to prove A; but this was ruled out by definition); and

iii. G∗ is a Maximal Set with respect to A: if any more sentences whatever were added to G∗, it would prove A. (Because if it were possible to add any more sentences, they should have been added when they were encountered during the construction of the Gₖ, again by definition.)

3. If G∗ is a Maximal Set with respect to A, then it is truth-like. This means that it contains C only if it does not contain ¬C; if it contains C and contains “If C then B” then it also contains B; and so forth.

4. If G∗ is truth-like there is a G∗-Canonical valuation of the language: one that makes every sentence in G∗ true and everything outside G∗ false while still obeying the laws of semantic composition in the language.

5. A G∗-canonical valuation will make our original set G all true, and make A false.

6. If there is a valuation on which G are true and A is false, then G does not (semantically) imply A.

QED

14.9.3 Another outline for a completeness proof

If a formula is a tautology, then there is a truth table for it which shows that each valuation yields the value true for the formula. Consider such a valuation. By mathematical induction on the length of the subformulas, show that the truth or falsity of the subformula follows from the truth or falsity (as appropriate for the valuation) of each propositional variable in the subformula. Then combine the lines of the truth table together two at a time by using "(P is true implies S) implies ((P is false implies S) implies S)". Keep repeating this until all dependencies on propositional variables have been eliminated. The result is that we have proved the given tautology. Since every tautology is provable, the logic is complete.


14.10 Interpretation of a truth-functional propositional calculus

An interpretation of a truth-functional propositional calculus P is an assignment to each propositional symbol of P of one or the other (but not both) of the truth values truth (T) and falsity (F), and an assignment to the connective symbols of P of their usual truth-functional meanings. An interpretation of a truth-functional propositional calculus may also be expressed in terms of truth tables.[10]

For n distinct propositional symbols there are 2ⁿ distinct possible interpretations. For any particular symbol a, for example, there are 2¹ = 2 possible interpretations:

1. a is assigned T, or

2. a is assigned F.

For the pair a, b there are 2² = 4 possible interpretations:

1. both are assigned T,

2. both are assigned F,

3. a is assigned T and b is assigned F, or

4. a is assigned F and b is assigned T.[10]

Since P has ℵ₀, that is, denumerably many propositional symbols, there are 2^ℵ₀ = c, and therefore uncountably many distinct possible interpretations of P.[10]

14.10.1 Interpretation of a sentence of truth-functional propositional logic

Main article: Interpretation (logic)

If φ and ψ are formulas of P and I is an interpretation of P then:

• A sentence of propositional logic is true under an interpretation I iff I assigns the truth value T to that sentence. If a sentence is true under an interpretation, then that interpretation is called a model of that sentence.

• φ is false under an interpretation I iff φ is not true under I .[10]

• A sentence of propositional logic is logically valid if it is true under every interpretation.

|= ϕ means that φ is logically valid

• A sentence ψ of propositional logic is a semantic consequence of a sentence φ iff there is no interpretation under which φ is true and ψ is false.

• A sentence of propositional logic is consistent iff it is true under at least one interpretation. It is inconsistent if it is not consistent.

Some consequences of these definitions:

• For any given interpretation a given formula is either true or false.[10]

• No formula is both true and false under the same interpretation.[10]

• φ is false for a given interpretation iff ¬φ is true for that interpretation; and φ is true under an interpretation iff ¬φ is false under that interpretation.[10]

• If φ and (ϕ→ ψ) are both true under a given interpretation, then ψ is true under that interpretation.[10]

• If |=P ϕ and |=P (ϕ→ ψ) , then |=P ψ .[10]


• ¬ϕ is true under I iff φ is not true under I .

• (ϕ→ ψ) is true under I iff either φ is not true under I or ψ is true under I .[10]

• A sentence ψ of propositional logic is a semantic consequence of a sentence φ iff (φ → ψ) is logically valid, that is, φ |=P ψ iff |=P (φ → ψ).[10]

14.11 Alternative calculus

It is possible to define another version of propositional calculus, which defines most of the syntax of the logical operators by means of axioms, and which uses only one inference rule.

14.11.1 Axioms

Let φ, χ, and ψ stand for well-formed formulas. (The well-formed formulas themselves would not contain any Greek letters, but only capital Roman letters, connective operators, and parentheses.) Then the axioms are as follows (the names are those used in the discussion below):

• THEN-1: φ → (χ → φ)

• THEN-2: (φ → (χ → ψ)) → ((φ → χ) → (φ → ψ))

• AND-1: (φ ∧ χ) → φ

• AND-2: (φ ∧ χ) → χ

• AND-3: φ → (χ → (φ ∧ χ))

• OR-1: φ → (φ ∨ χ)

• OR-2: χ → (φ ∨ χ)

• NOT-1: (φ → χ) → ((φ → ¬χ) → ¬φ)

• NOT-2: φ → (¬φ → χ)

• NOT-3: φ ∨ ¬φ

• Axiom THEN-2 may be considered to be a “distributive property of implication with respect to implication.”

• Axioms AND-1 and AND-2 correspond to “conjunction elimination”. The relation between AND-1 and AND-2 reflects the commutativity of the conjunction operator.

• Axiom AND-3 corresponds to “conjunction introduction.”

• Axioms OR-1 and OR-2 correspond to “disjunction introduction.” The relation between OR-1 and OR-2 reflects the commutativity of the disjunction operator.

• Axiom NOT-1 corresponds to “reductio ad absurdum.”

• Axiom NOT-2 says that “anything can be deduced from a contradiction.”

• Axiom NOT-3 is called "tertium non datur" (Latin: “a third is not given”) and reflects the semantic valuation of propositional formulas: a formula can have a truth-value of either true or false. There is no third truth-value, at least not in classical logic. Intuitionistic logicians do not accept the axiom NOT-3.

14.11.2 Inference rule

The inference rule is modus ponens:

ϕ, ϕ→ χ ⊢ χ

14.11.3 Meta-inference rule

Let a demonstration be represented by a sequence, with hypotheses to the left of the turnstile and the conclusion to the right of the turnstile. Then the deduction theorem can be stated as follows:

If the sequence

φ₁, φ₂, ..., φₙ, χ ⊢ ψ

has been demonstrated, then it is also possible to demonstrate the sequence

φ₁, φ₂, ..., φₙ ⊢ χ → ψ


This deduction theorem (DT) is not itself formulated within propositional calculus: it is not a theorem of propositional calculus, but a theorem about propositional calculus. In this sense, it is a meta-theorem, comparable to theorems about the soundness or completeness of propositional calculus.

On the other hand, DT is so useful for simplifying the syntactical proof process that it can be considered and used as another inference rule, accompanying modus ponens. In this sense, DT corresponds to the natural conditional proof inference rule which is part of the first version of propositional calculus introduced in this article.

The converse of DT is also valid:

If the sequence

φ₁, φ₂, ..., φₙ ⊢ χ → ψ

has been demonstrated, then it is also possible to demonstrate the sequence

φ₁, φ₂, ..., φₙ, χ ⊢ ψ

In fact, the validity of the converse of DT is almost trivial compared to that of DT:

If

φ₁, ..., φₙ ⊢ χ → ψ

then

φ₁, ..., φₙ, χ ⊢ χ → ψ

φ₁, ..., φₙ, χ ⊢ χ

and from (1) and (2) can be deduced

φ₁, ..., φₙ, χ ⊢ ψ

by means of modus ponens, Q.E.D.

The converse of DT has powerful implications: it can be used to convert an axiom into an inference rule. For example, the axiom AND-1,

⊢ φ ∧ χ → φ

can be transformed by means of the converse of the deduction theorem into the inference rule

ϕ ∧ χ ⊢ ϕ

which is conjunction elimination, one of the ten inference rules used in the first version (in this article) of the propositional calculus.

14.11.4 Example of a proof

The following is an example of a (syntactical) demonstration, involving only axioms THEN-1 and THEN-2:

Prove: A → A (Reflexivity of implication).

Proof:

1. (A→ ((B → A) → A)) → ((A→ (B → A)) → (A→ A))

Axiom THEN-2 with φ = A, χ = B → A, ψ = A.

Page 88: Formal Semantics (Logic)

78 CHAPTER 14. PROPOSITIONAL CALCULUS

2. A→ ((B → A) → A)

Axiom THEN-1 with φ = A, χ = B → A.

3. (A→ (B → A)) → (A→ A)

From (1) and (2) by modus ponens.

4. A→ (B → A)

Axiom THEN-1 with φ = A, χ = B.

5. A→ A

From (3) and (4) by modus ponens.
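The derivation above is purely syntactic, but it can be cross-checked semantically: each numbered line is a tautology. The Python sketch below (illustrative only) verifies this by brute force over the truth values of A and B.

```python
from itertools import product

def imp(x, y):
    return (not x) or y

lines = {
    1: lambda A, B: imp(imp(A, imp(imp(B, A), A)),
                        imp(imp(A, imp(B, A)), imp(A, A))),
    2: lambda A, B: imp(A, imp(imp(B, A), A)),
    3: lambda A, B: imp(imp(A, imp(B, A)), imp(A, A)),
    4: lambda A, B: imp(A, imp(B, A)),
    5: lambda A, B: imp(A, A),
}

for n, f in lines.items():
    # every line of the proof evaluates to true under all valuations
    assert all(f(A, B) for A, B in product([True, False], repeat=2)), n
```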

14.12 Equivalence to equational logics

The preceding alternative calculus is an example of a Hilbert-style deduction system. In the case of propositional systems the axioms are terms built with logical connectives and the only inference rule is modus ponens. Equational logic as standardly used informally in high school algebra is a different kind of calculus from Hilbert systems. Its theorems are equations and its inference rules express the properties of equality, namely that it is a congruence on terms that admits substitution.

Classical propositional calculus as described above is equivalent to Boolean algebra, while intuitionistic propositional calculus is equivalent to Heyting algebra. The equivalence is shown by translation in each direction of the theorems of the respective systems. Theorems φ of classical or intuitionistic propositional calculus are translated as equations φ = 1 of Boolean or Heyting algebra respectively. Conversely theorems x = y of Boolean or Heyting algebra are translated as theorems (x → y) ∧ (y → x) of classical or intuitionistic calculus respectively, for which x ≡ y is a standard abbreviation. In the case of Boolean algebra x = y can also be translated as (x ∧ y) ∨ (¬x ∧ ¬y), but this translation is incorrect intuitionistically.

In both Boolean and Heyting algebra, inequality x ≤ y can be used in place of equality. The equality x = y is expressible as a pair of inequalities x ≤ y and y ≤ x. Conversely the inequality x ≤ y is expressible as the equality x ∧ y = x, or as x ∨ y = y. The significance of inequality for Hilbert-style systems is that it corresponds to the latter’s deduction or entailment symbol ⊢. An entailment

φ₁, φ₂, ..., φₙ ⊢ ψ

is translated in the inequality version of the algebraic framework as

φ₁ ∧ φ₂ ∧ ... ∧ φₙ ≤ ψ

Conversely the algebraic inequality x ≤ y is translated as the entailment

x ⊢ y

The difference between implication x → y and inequality or entailment x ≤ y or x ⊢ y is that the former is internal to the logic while the latter is external. Internal implication between two terms is another term of the same kind. Entailment as external implication between two terms expresses a metatruth outside the language of the logic, and is considered part of the metalanguage. Even when the logic under study is intuitionistic, entailment is ordinarily understood classically as two-valued: either the left side entails, or is less-or-equal to, the right side, or it is not.
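In the two-element Boolean algebra the correspondence between the order ≤, the operations ∧ and ∨, and the value of the implication can be checked exhaustively; the short Python sketch below does so (an illustration only, using 0 and 1 for the truth values).

```python
from itertools import product

for x, y in product([0, 1], repeat=2):
    leq = x <= y
    assert leq == (min(x, y) == x)        # x ∧ y = x
    assert leq == (max(x, y) == y)        # x ∨ y = y
    assert leq == (max(1 - x, y) == 1)    # the implication x → y takes value 1
```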


Similar but more complex translations to and from algebraic logics are possible for natural deduction systems as described above and for the sequent calculus. The entailments of the latter can be interpreted as two-valued, but a more insightful interpretation is as a set, the elements of which can be understood as abstract proofs organized as the morphisms of a category. In this interpretation the cut rule of the sequent calculus corresponds to composition in the category. Boolean and Heyting algebras enter this picture as special categories having at most one morphism per homset, i.e., one proof per entailment, corresponding to the idea that existence of proofs is all that matters: any proof will do and there is no point in distinguishing them.

14.13 Graphical calculi

It is possible to generalize the definition of a formal language from a set of finite sequences over a finite basis to include many other sets of mathematical structures, so long as they are built up by finitary means from finite materials. What’s more, many of these families of formal structures are especially well-suited for use in logic.

For example, there are many families of graphs that are close enough analogues of formal languages that the concept of a calculus is quite easily and naturally extended to them. Indeed, many species of graphs arise as parse graphs in the syntactic analysis of the corresponding families of text structures. The exigencies of practical computation on formal languages frequently demand that text strings be converted into pointer structure renditions of parse graphs, simply as a matter of checking whether strings are well-formed formulas or not. Once this is done, there are many advantages to be gained from developing the graphical analogue of the calculus on strings. The mapping from strings to parse graphs is called parsing and the inverse mapping from parse graphs to strings is achieved by an operation that is called traversing the graph.

14.14 Other logical calculi

Propositional calculus is about the simplest kind of logical calculus in current use. It can be extended in several ways. (Aristotelian “syllogistic” calculus, which is largely supplanted in modern logic, is in some ways simpler – but in other ways more complex – than propositional calculus.) The most immediate way to develop a more complex logical calculus is to introduce rules that are sensitive to more fine-grained details of the sentences being used.

First-order logic (a.k.a. first-order predicate logic) results when the “atomic sentences” of propositional logic are broken up into terms, variables, predicates, and quantifiers, all keeping the rules of propositional logic with some new ones introduced. (For example, from “All dogs are mammals” we may infer “If Rover is a dog then Rover is a mammal”.) With the tools of first-order logic it is possible to formulate a number of theories, either with explicit axioms or by rules of inference, that can themselves be treated as logical calculi. Arithmetic is the best known of these; others include set theory and mereology. Second-order logic and other higher-order logics are formal extensions of first-order logic. Thus, it makes sense to refer to propositional logic as “zeroth-order logic”, when comparing it with these logics.

Modal logic also offers a variety of inferences that cannot be captured in propositional calculus. For example, from “Necessarily p” we may infer that p. From p we may infer “It is possible that p”. The translation between modal logics and algebraic logics concerns classical and intuitionistic logics but with the introduction of a unary operator on Boolean or Heyting algebras, different from the Boolean operations, interpreting the possibility modality, and in the case of Heyting algebra a second operator interpreting necessity (for Boolean algebra this is redundant since necessity is the De Morgan dual of possibility). The first operator preserves 0 and disjunction while the second preserves 1 and conjunction.

Many-valued logics are those allowing sentences to have values other than true and false. (For example, neither and both are standard “extra values”; “continuum logic” allows each sentence to have any of an infinite number of “degrees of truth” between true and false.) These logics often require calculational devices quite distinct from propositional calculus. When the values form a Boolean algebra (which may have more than two or even infinitely many values), many-valued logic reduces to classical logic; many-valued logics are therefore only of independent interest when the values form an algebra that is not Boolean.


14.15 Solvers

Determining whether a propositional logic formula is satisfiable is an NP-complete problem. However, practical methods exist (e.g., the DPLL algorithm, 1962; the Chaff algorithm, 2001) that are very fast for many useful cases. Recent work has extended the SAT solver algorithms to work with propositions containing arithmetic expressions; these are the SMT solvers.
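The core of DPLL is unit propagation plus backtracking search. The Python sketch below is a deliberately minimal illustration of that idea (real solvers such as Chaff add clause learning, watched literals, and branching heuristics); the clause encoding and function name are this sketch's own.

```python
# A formula is given in CNF as a list of clauses; each clause is a list of
# non-zero integers, with -n denoting "not variable n".
def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    while True:
        simplified, unit = [], None
        for clause in clauses:
            # Drop satisfied clauses; drop falsified literals from the rest.
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue
            lits = [l for l in clause if abs(l) not in assignment]
            if not lits:
                return None                    # empty clause: conflict
            if len(lits) == 1:
                unit = lits[0]                 # remember a unit literal
            simplified.append(lits)
        clauses = simplified
        if not clauses:
            return assignment                  # every clause satisfied
        if unit is None:
            break                              # no more unit propagation
        assignment[abs(unit)] = unit > 0
    # Branch on the first remaining (necessarily unassigned) variable.
    var = abs(clauses[0][0])
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

# (p ∨ q) ∧ (¬p ∨ q) ∧ (¬q ∨ r) is satisfiable, e.g. with q and r true.
print(dpll([[1, 2], [-1, 2], [-2, 3]]))   # prints a satisfying assignment
```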

14.16 See also

14.16.1 Higher logical levels

• First-order logic

• Second-order propositional logic

• Second-order logic

• Higher-order logic

14.16.2 Related topics

14.17 References

[1] Ancient Logic (Stanford Encyclopedia of Philosophy)

[2] Marenbon, John (2007). Medieval philosophy: an historical and philosophical introduction. Routledge. p. 137.

[3] Leibniz’s Influence on 19th Century Logic

[4] Hurley, Patrick (2007). A Concise Introduction to Logic, 10th edition. Wadsworth Publishing. p. 392.

[5] Beth, Evert W.; “Semantic entailment and formal derivability”, series: Mededelingen van de Koninklijke Nederlandse Akademie van Wetenschappen, Afdeling Letterkunde, Nieuwe Reeks, vol. 18, no. 13, Noord-Hollandsche Uitg. Mij., Amsterdam, 1955, pp. 309–42. Reprinted in Jaakko Hintikka (ed.), The Philosophy of Mathematics, Oxford University Press, 1969.

[6] Truth in Frege

[7] Russell’s Use of Truth-Tables

[8] Wernick, William (1942) “Complete Sets of Logical Functions,” Transactions of the American Mathematical Society 51, pp. 117–132.

[9] Toida, Shunichi (2 August 2009). “Proof of Implications”. CS381 Discrete Structures/Discrete Mathematics Web Course Material. Department of Computer Science, Old Dominion University. Retrieved 10 March 2010.

[10] Hunter, Geoffrey (1971). Metalogic: An Introduction to the Metatheory of Standard First-Order Logic. University of California Press. ISBN 0-520-02356-0.

14.18 Further reading

• Brown, Frank Markham (2003), Boolean Reasoning: The Logic of Boolean Equations, 1st edition, Kluwer Academic Publishers, Norwell, MA. 2nd edition, Dover Publications, Mineola, NY.

• Chang, C.C. and Keisler, H.J. (1973), Model Theory, North-Holland, Amsterdam, Netherlands.

• Kohavi, Zvi (1978), Switching and Finite Automata Theory, 1st edition, McGraw–Hill, 1970. 2nd edition, McGraw–Hill, 1978.

• Korfhage, Robert R. (1974), Discrete Computational Structures, Academic Press, New York, NY.

• Lambek, J. and Scott, P.J. (1986), Introduction to Higher Order Categorical Logic, Cambridge University Press, Cambridge, UK.

• Mendelson, Elliot (1964), Introduction to Mathematical Logic, D. Van Nostrand Company.


14.18.1 Related works

• Hofstadter, Douglas (1979). Gödel, Escher, Bach: An Eternal Golden Braid. Basic Books. ISBN 978-0-465-02656-2.

14.19 External links

• Klement, Kevin C. (2006), “Propositional Logic”, in James Fieser and Bradley Dowden (eds.), Internet Encyclopedia of Philosophy, Eprint.

• Formal Predicate Calculus, contains a systematic formal development along the lines of Alternative calculus

• forall x: an introduction to formal logic, by P.D. Magnus, covers formal semantics and proof theory for sentential logic.

• Category:Propositional Calculus on ProofWiki (GFDLed)

• An Outline of Propositional Logic


Chapter 15

Propositional formula

In propositional logic, a propositional formula is a type of syntactic formula which is well formed and has a truth value. If the values of all variables in a propositional formula are given, it determines a unique truth value. A propositional formula may also be called a propositional expression, a sentence, or a sentential formula.

A propositional formula is constructed from simple propositions, such as “five is greater than three”, or propositional variables such as P and Q, using connectives such as NOT, AND, OR, and IMPLIES; for example:

(P AND NOT Q) IMPLIES (P OR Q).

In mathematics, a propositional formula is often more briefly referred to as a "proposition", but, more precisely, a propositional formula is not a proposition but a formal expression that denotes a proposition, a formal object under discussion, just like an expression such as "x + y" is not a value, but denotes a value. In some contexts, maintaining the distinction may be of importance.

15.1 Propositions

For the purposes of the propositional calculus, propositions (utterances, sentences, assertions) are considered to be either simple or compound.[1] Compound propositions are considered to be linked by sentential connectives, some of the most common of which are “AND”, “OR”, “IF ... THEN ...”, “NEITHER ... NOR ...”, "... IS EQUIVALENT TO ...”. The linking semicolon ";" and the connective “BUT” are considered to be expressions of “AND”. A sequence of discrete sentences is considered to be linked by “AND"s, and formal analysis applies a recursive “parenthesis rule” with respect to sequences of simple propositions (see more below about well-formed formulas).

For example: The assertion: “This cow is blue. That horse is orange but this horse here is purple.” is actually a compound proposition linked by “AND"s: ( (“This cow is blue” AND “that horse is orange”) AND “this horse here is purple” ).

Simple propositions are declarative in nature, that is, they make assertions about the condition or nature of a particular object of sensation e.g. “This cow is blue”, “There’s a coyote!" (“That coyote IS there, behind the rocks.”).[2] Thus the simple “primitive” assertions must be about specific objects or specific states of mind. Each must have at least a subject (an immediate object of thought or observation), a verb (in the active voice and present tense preferred), and perhaps an adjective or adverb. “Dog!" probably implies “I see a dog” but should be rejected as too ambiguous.

Example: “That purple dog is running”, “This cow is blue”, “Switch M31 is closed”, “This cap is off”,“Tomorrow is Friday”.

For the purposes of the propositional calculus a compound proposition can usually be reworded into a series of simple sentences, although the result will probably sound stilted.


15.1.1 Relationship between propositional and predicate formulas

The predicate calculus goes a step further than the propositional calculus to an “analysis of the inner structure of propositions”.[3] It breaks a simple sentence down into two parts (i) its subject (the object (singular or plural) of discourse) and (ii) a predicate (a verb or possibly verb-clause that asserts a quality or attribute of the object(s)). The predicate calculus then generalizes the “subject|predicate” form (where | symbolizes concatenation (stringing together) of symbols) into a form with the following blank-subject structure " ___|predicate”, and the predicate in turn is generalized to all things with that property.

Example: “This blue pig has wings” becomes two sentences in the propositional calculus: “This pig has wings” AND “This pig is blue”, whose internal structure is not considered. In contrast, in the predicate calculus, the first sentence breaks into “this pig” as the subject, and “has wings” as the predicate. Thus it asserts that object “this pig” is a member of the class (set, collection) of “winged things”. The second sentence asserts that object “this pig” has an attribute “blue” and thus is a member of the class of “blue things”. One might choose to write the two sentences connected with AND as:

p|W AND p|B

The generalization of “this pig” to a (potential) member of two classes “winged things” and “blue things” means that it has a truth-relationship with both of these classes. In other words, given a domain of discourse “winged things”, either we find p to be a member of this domain or not. Thus we have a relationship W (wingedness) between p (pig) and {T, F}: W(p) evaluates to {T, F}. Likewise for B (blueness) and p (pig) and {T, F}: B(p) evaluates to {T, F}. So we now can analyze the connected assertions “B(p) AND W(p)" for its overall truth-value, i.e.:

( B(p) AND W(p) ) evaluates to {T, F}

In particular, simple sentences that employ notions of “all”, “some”, “a few”, “one of”, etc. are treated by the predicate calculus. Along with the new function symbolism “F(x)" two new symbols are introduced: ∀ (For all), and ∃ (There exists ..., At least one of ... exists, etc.). The predicate calculus, but not the propositional calculus, can establish the formal validity of the following statement:

“All blue pigs have wings but some pigs have no wings, hence some pigs are not blue”.

15.1.2 Identity

Tarski asserts that the notion of IDENTITY (as distinguished from LOGICAL EQUIVALENCE) lies outside the propositional calculus; however, he notes that if a logic is to be of use for mathematics and the sciences it must contain a “theory” of IDENTITY.[4] Some authors refer to “predicate logic with identity” to emphasize this extension. See more about this below.

15.2 An algebra of propositions, the propositional calculus

An algebra (and there are many different ones), loosely defined, is a method by which a collection of symbols called variables together with some other symbols such as parentheses (, ) and some sub-set of symbols such as *, +, ~, &, V, =, ≡, ⋀, ¬ are manipulated within a system of rules. These symbols, and well-formed strings of them, are said to represent objects, but in a specific algebraic system these objects do not have meanings. Thus work inside the algebra becomes an exercise in obeying certain laws (rules) of the algebra’s syntax (symbol-formation) rather than in semantics (meaning) of the symbols. The meanings are to be found outside the algebra.

For a well-formed sequence of symbols in the algebra—a formula—to have some usefulness outside the algebra the symbols are assigned meanings and eventually the variables are assigned values; then by a series of rules the formula is evaluated.

When the values are restricted to just two and applied to the notion of simple sentences (e.g. spoken utterances or written assertions) linked by propositional connectives this whole algebraic system of symbols and rules and evaluation-methods is usually called the propositional calculus or the sentential calculus.


While some of the familiar rules of arithmetic algebra continue to hold in the algebra of propositions (e.g. the commutative and associative laws for AND and OR), some do not (e.g. the distributive laws for AND, OR and NOT).

15.2.1 Usefulness of propositional formulas

Analysis: In deductive reasoning, philosophers, rhetoricians and mathematicians reduce arguments to formulas and then study them (usually with truth tables) for correctness (soundness). For example: Is the following argument sound?

“Given that consciousness is sufficient for an artificial intelligence and only conscious entities can passthe Turing test, before we can conclude that a robot is an artificial intelligence the robot must pass theTuring test.”

Engineers analyze the logic circuits they have designed using synthesis techniques and then apply various reduction and minimization techniques to simplify their designs.

Synthesis: Engineers in particular synthesize propositional formulas (that eventually end up as circuits of symbols) from truth tables. For example, one might write down a truth table for how binary addition should behave given the addition of variables “b” and “a” and “carry_in” “ci”, and the results “carry_out” “co” and “sum” Σ:

Example: in row 5, ( (b+a) + ci ) = ( (1+0) + 1 ) = the number “2”. Written as a binary number this is 10₂, where “co” = 1 and Σ = 0, as shown in the right-most columns.
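A sketch of that synthesis step in Python (illustrative only, since the original truth table is not reproduced here): enumerate the eight input rows, read off the “co” and Σ columns, and check the formulas an engineer would synthesize from them.

```python
from itertools import product

print(" a  b ci | co  Σ")
for a, b, ci in product([0, 1], repeat=3):
    total = a + b + ci            # 0, 1, 2, or 3
    co, s = divmod(total, 2)      # co is the high bit, Σ the low bit
    print(f" {a}  {b}  {ci} |  {co}  {s}")

# Formulas synthesized from the table, checked against it:
#   Σ  = a XOR b XOR ci
#   co = (a AND b) OR (ci AND (a XOR b))
for a, b, ci in product([0, 1], repeat=3):
    assert (a ^ b ^ ci) == (a + b + ci) % 2
    assert ((a & b) | (ci & (a ^ b))) == (a + b + ci) // 2
```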

15.2.2 Propositional variables

The simplest type of propositional formula is a propositional variable. Propositions that are simple (atomic), symbolic expressions are often denoted by variables named a, b, or A, B, etc. A propositional variable is intended to represent an atomic proposition (assertion), such as “It is Saturday” = a (here the symbol = means " ... is assigned the variable named ...”) or “I only go to the movies on Monday” = b.

15.2.3 Truth-value assignments, formula evaluations

Evaluation of a propositional formula begins with assignment of a truth value to each variable. Because each variable represents a simple sentence, the truth values are being applied to the “truth” or “falsity” of these simple sentences.

Truth values in rhetoric, philosophy and mathematics: The truth values are only two: {TRUTH “T”, FALSITY “F”}. An empiricist puts all propositions into two broad classes: analytic—true no matter what (e.g. tautology), and synthetic—derived from experience and thereby susceptible to confirmation by third parties (the verification theory of meaning).[5] Empiricists hold that, in general, to arrive at the truth-value of a synthetic proposition, meanings (pattern-matching templates) must first be applied to the words, and then these meaning-templates must be matched against whatever it is that is being asserted. For example, my utterance “That cow is blue!" Is this statement a TRUTH? Truly I said it. And maybe I am seeing a blue cow—unless I am lying my statement is a TRUTH relative to the object of my (perhaps flawed) perception. But is the blue cow “really there"? What do you see when you look out the same window? In order to proceed with a verification, you will need a prior notion (a template) of both “cow” and “blue”, and an ability to match the templates against the object of sensation (if indeed there is one).

Truth values in engineering: Engineers try to avoid notions of truth and falsity that bedevil philosophers, but in the final analysis engineers must trust their measuring instruments. In their quest for robustness, engineers prefer to pull known objects from a small library—objects that have well-defined, predictable behaviors even in large combinations (hence their name for the propositional calculus: “combinatorial logic”). The fewest behaviors of a single object are two (e.g. {OFF, ON}, {open, shut}, {UP, DOWN}, etc.), and these are put in correspondence with {0, 1}. Such elements are called digital; those with a continuous range of behaviors are called analog. Whenever decisions must be made in an analog system, quite often an engineer will convert an analog behavior (the door is 45.32146% UP) to digital (e.g. DOWN = 0) by use of a comparator.[6]


Thus an assignment of meaning to the variables and the two value-symbols {0, 1} comes from “outside” the formula that represents the behavior of the (usually) compound object. An example is a garage door with two “limit switches”, one for UP labelled SW_U and one for DOWN labelled SW_D, and whatever else is in the door’s circuitry. Inspection of the circuit (either the diagram or the actual objects themselves—door, switches, wires, circuit board, etc.) might reveal that, on the circuit board, “node 22” goes to +0 volts when the contacts of switch “SW_D” are mechanically in contact (“closed”) and the door is in the “down” position (95% down), and “node 29” goes to +0 volts when the door is 95% UP and the contacts of switch SW_U are in mechanical contact (“closed”).[7] The engineer must define the meanings of these voltages and all possible combinations (all 4 of them), including the “bad” ones (e.g. both nodes 22 and 29 at 0 volts, meaning that the door is open and closed at the same time). The circuit mindlessly responds to whatever voltages it experiences without any awareness of TRUTH or FALSEHOOD, RIGHT or WRONG, SAFE or DANGEROUS.

15.3 Propositional connectives

Arbitrary propositional formulas are built from propositional variables and other propositional formulas using propositional connectives. Examples of connectives include:

• The unary negation connective. If α is a formula, then ¬α is a formula.

• The classical binary connectives ∧,∨,→,↔ . Thus, for example, if α and β are formulas, so is (α→ β) .

• Other binary connectives, such as NAND, NOR, and XOR

• The ternary connective IF ... THEN ... ELSE ...

• Constant 0-ary connectives ⊤ and ⊥ (alternately, constants {T, F}, {1, 0}, etc.)

• The “theory-extension” connective EQUALS (alternately, IDENTITY, or the sign " = " as distinguished from the “logical connective” ↔)

15.3.1 Connectives of rhetoric, philosophy and mathematics

The following are the connectives common to rhetoric, philosophy and mathematics together with their truth tables. The symbols used will vary from author to author and between fields of endeavor. In general the abbreviations “T” and “F” stand for the evaluations TRUTH and FALSITY applied to the variables in the propositional formula (e.g. the assertion: “That cow is blue” will have the truth-value “T” for Truth or “F” for Falsity, as the case may be).

The connectives go by a number of different word-usages, e.g. “a IMPLIES b” is also said “IF a THEN b”. Some of these are shown in the table.

15.3.2 Engineering connectives

In general, the engineering connectives are just the same as the mathematics connectives except that they tend to evaluate with “1” = “T” and “0” = “F”. This is done for the purposes of analysis/minimization and synthesis of formulas by use of the notion of minterms and Karnaugh maps (see below). Engineers also use the words logical product from Boole's notion (a*a = a) and logical sum from Jevons' notion (a+a = a).[8]

15.3.3 CASE connective: IF ... THEN ... ELSE ...

The IF ... THEN ... ELSE ... connective appears as the simplest form of CASE operator of recursion theory and computation theory and is the connective responsible for conditional goto's (jumps, branches). From this one connective all other connectives can be constructed (see more below). Although " IF c THEN b ELSE a " sounds like an implication it is, in its most reduced form, a switch that makes a decision and offers as outcome only one of two alternatives “a” or “b” (hence the name switch statement in the C programming language).[9]

The following three propositions are equivalent (as indicated by the logical equivalence sign ≡ ):


Engineering symbols have varied over the years, but these are commonplace. Sometimes they appear simply as boxes with symbols in them. “a” and “b” are called “the inputs” and “c” is called “the output”. An output will typically “connect to” an input (unless it is the final connective); this accomplishes the mathematical notion of substitution.

• (1) ( IF 'counter is zero' THEN 'go to instruction b' ELSE 'go to instruction a' ) ≡

• (2) ( (c → b) & (~c → a) ) ≡ ( ( IF 'counter is zero' THEN 'go to instruction b' ) AND ( IF 'it is NOT the case that counter is zero' THEN 'go to instruction a' ) ) ≡

• (3) ( (c & b) V (~c & a) ) = ( ( 'counter is zero' AND 'go to instruction b' ) OR ( 'it is NOT the case that counter is zero' AND 'go to instruction a' ) )

Thus IF ... THEN ... ELSE—unlike implication—does not evaluate to an ambiguous “TRUTH” when the first proposition is false, i.e. c = F in (c → b). For example, most people would reject the following compound proposition as a nonsensical non sequitur because the second sentence is not connected in meaning to the first.[10]

Example: The proposition " IF 'Winston Churchill was Chinese' THEN 'The sun rises in the east' " evaluates as a TRUTH given that 'Winston Churchill was Chinese' is a FALSEHOOD and 'The sun rises in the east' evaluates as a TRUTH.

In recognition of this problem, the sign → of formal implication in the propositional calculus is called material implication to distinguish it from the everyday, intuitive implication.[11]

The use of the IF ... THEN ... ELSE construction avoids controversy because it offers a completely deterministic choice between two stated alternatives; it offers two “objects” (the two alternatives b and a), and it selects between them exhaustively and unambiguously.[12] In the truth table below, d1 is the formula: ( (IF c THEN b) AND (IF NOT-c THEN a) ). Its fully reduced form d2 is the formula: ( (c AND b) OR (NOT-c AND a) ). The two formulas are equivalent as shown by the columns "=d1” and "=d2”. Electrical engineers call the fully reduced formula the AND-OR-SELECT operator. The CASE (or SWITCH) operator is an extension of the same idea to n possible, but mutually exclusive outcomes. Electrical engineers call the CASE operator a multiplexer.
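The equivalence of d1 and d2, and their agreement with the IF ... THEN ... ELSE selection, can be checked exhaustively; the Python sketch below does so (illustrative only).

```python
from itertools import product

for a, b, c in product([False, True], repeat=3):
    d1 = ((not c) or b) and (c or a)      # (IF c THEN b) AND (IF NOT-c THEN a)
    d2 = (c and b) or ((not c) and a)     # (c AND b) OR (NOT-c AND a)
    assert d1 == d2
    assert d2 == (b if c else a)          # i.e. IF c THEN b ELSE a
```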

15.3.4 IDENTITY and evaluation

The first table of this section stars *** the entry logical equivalence to note the fact that “Logical equivalence” is not the same thing as “identity”. For example, most would agree that the assertion “That cow is blue” is identical to the assertion “That cow is blue”. On the other hand logical equivalence sometimes appears in speech as in this example: " 'The sun is shining' means 'I'm biking' ". Translated into a propositional formula the words become: “IF 'the sun is shining' THEN 'I'm biking', AND IF 'I'm biking' THEN 'the sun is shining'":[13]

“IF 's’ THEN 'b' AND IF 'b' THEN 's’ " is written as ((s → b) & (b → s)) or in an abbreviated form as (s ↔ b). As the rightmost symbol string is a definition for a new symbol in terms of the symbols on the left, the use of the IDENTITY sign = is appropriate:


((s → b) & (b → s)) = (s ↔ b)

Different authors use different signs for logical equivalence: ↔ (e.g. Suppes, Goodstein, Hamilton), ≡ (e.g. Robbin), ⇔ (e.g. Bender and Williamson). Typically identity is written as the equals sign =. One exception to this rule is found in Principia Mathematica. For more about the philosophy of the notion of IDENTITY see Leibniz’s law.

As noted above, Tarski considers IDENTITY to lie outside the propositional calculus, but he asserts that without the notion, “logic” is insufficient for mathematics and the deductive sciences. In fact the sign comes into the propositional calculus when a formula is to be evaluated.[14]

In some systems there are no truth tables, but rather just formal axioms (e.g. strings of symbols from a set { ~, →, (, ) }, variables p₁, p₂, p₃, ...) and formula-formation rules (rules about how to make more symbol strings from previous strings by use of e.g. substitution and modus ponens). The result of such a calculus will be another formula (i.e. a well-formed symbol string). Eventually, however, if one wants to use the calculus to study notions of validity and truth, one must add axioms that define the behavior of the symbols called “the truth values” {T, F} (or {1, 0}, etc.) relative to the other symbols.

For example, Hamilton uses two symbols = and ≠ when he defines the notion of a valuation v of any wffs A and B in his “formal statement calculus” L. A valuation v is a function from the wffs of his system L to the range (output) {T, F}, given that each variable p₁, p₂, p₃ in a wff is assigned an arbitrary truth value {T, F}.

• (i) v(A) ≠ v(~A)

• (ii) v(A→ B) = F if and only if v(A) = T and v(B) = F

The two definitions (i) and (ii) define the equivalent of the truth tables for the ~ (NOT) and → (IMPLICATION) connectives of his system. The first one derives F ≠ T and T ≠ F, in other words " v(A) does not mean v(~A)". Definition (ii) specifies the third row in the truth table, and the other three rows then come from an application of definition (i). In particular (ii) assigns the value F (or a meaning of “F”) to the entire expression. The definitions also serve as formation rules that allow substitution of a value previously derived into a formula.

Some formal systems specify these valuation axioms at the outset in the form of certain formulas such as the law of contradiction or laws of identity and nullity. The choice of which ones to use, together with laws such as commutation and distribution, is up to the system’s designer as long as the set of axioms is complete (i.e. sufficient to form and to evaluate any well-formed formula created in the system).

15.4 More complex formulas

As shown above, the CASE (IF c THEN b ELSE a) connective is constructed either from the 2-argument connectives IF...THEN... and AND or from OR and AND and the 1-argument NOT. Connectives such as the n-argument AND (a & b & c & ... & n), OR (a V b V c V ... V n) are constructed from strings of two-argument AND and OR and written in abbreviated form without the parentheses. These, and other connectives as well, can then be used as building blocks for yet further connectives. Rhetoricians, philosophers, and mathematicians use truth tables and the various theorems to analyze and simplify their formulas.

Electrical engineers use drawn symbols and connect them with lines that stand for the mathematical acts of substitution and replacement. They then verify their drawings with truth tables and simplify the expressions as shown below by use of Karnaugh maps or the theorems. In this way engineers have created a host of “combinatorial logic” (i.e. connectives without feedback) such as “decoders”, “encoders”, “multifunction gates”, “majority logic”, “binary adders”, “arithmetic logic units”, etc.

15.4.1 Definitions

A definition creates a new symbol and its behavior, often for the purposes of abbreviation. Once the definition is presented, either form of the equivalent symbol or formula can be used. The symbolism =D below follows the convention of Reichenbach.[15] Some examples of convenient definitions drawn from the symbol set { ~, &, (, ) } and variables follow. Each definition produces a logically equivalent formula that can be used for substitution or replacement.


• definition of a new variable: (c & d) =D s
• OR: ~(~a & ~b) =D (a V b)
• IMPLICATION: (~a V b) =D (a → b)
• XOR: (~a & b) V (a & ~b) =D (a ⊕ b)
• LOGICAL EQUIVALENCE: ( (a → b) & (b → a) ) =D ( a ≡ b )
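As a hedged sketch (not part of the source), the four connective definitions above can be spelled out from NOT and AND alone and spot-checked in Python:

NOT = lambda a: not a
AND = lambda a, b: a and b

OR  = lambda a, b: NOT(AND(NOT(a), NOT(b)))              # ~(~a & ~b) =D (a V b)
IMP = lambda a, b: OR(NOT(a), b)                         # (~a V b)   =D (a → b)
XOR = lambda a, b: OR(AND(NOT(a), b), AND(a, NOT(b)))    # (~a & b) V (a & ~b) =D (a ⊕ b)
EQV = lambda a, b: AND(IMP(a, b), IMP(b, a))             # ( (a → b) & (b → a) ) =D ( a ≡ b )

# Spot-check the XOR definition over all four input pairs:
for a in (False, True):
    for b in (False, True):
        assert XOR(a, b) == (a != b)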

15.4.2 Axiom and definition schemas

The definitions above for OR, IMPLICATION, XOR, and logical equivalence are actually schemas (or "schemata"), that is, they are models (demonstrations, examples) for a general formula format but shown (for illustrative purposes) with specific letters a, b, c for the variables, whereas any variable letters can go in their places as long as the letter substitutions follow the rule of substitution below.

Example: In the definition (~a V b) =D (a → b), other variable-symbols such as "SW2" and "CON1" might be used, i.e. formally:

a =D SW2, b =D CON1, so we would have as an instance of the definition schema (~SW2 V CON1) =D (SW2 → CON1)

15.4.3 Substitution versus replacement

Substitution: The variable or sub-formula to be substituted with another variable, constant, or sub-formula must bereplaced in all instances throughout the overall formula.

Example: (c & d) V (p & ~(c & ~d)), but (q1 & ~q2) ≡ d. Now wherever variable “d” occurs, substitute(q1 & ~q2):

(c & (q1 & ~q2)) V (p & ~(c & ~(q1 & ~q2)))

Replacement: (i) the formula to be replaced must be within a tautology, i.e. logically equivalent (connected by ≡ or ↔) to the formula that replaces it, and (ii) unlike substitution it is permissible for the replacement to occur only in one place (i.e. for one formula).

Example: Use this set of formula schemas/equivalences: 1: ( (a V 0) ≡ a ). 2: ( (a & ~a) ≡ 0 ). 3: ( (~a V b) =D (a → b) ). 6: ( ~(~a) ≡ a )

• start with "a": a
• Use 1 to replace "a" with (a V 0): (a V 0)
• Use the notion of "schema" to substitute b for a in 2: ( (b & ~b) ≡ 0 )
• Use 2 to replace 0 with (b & ~b): ( a V (b & ~b) )
• (see below for how to distribute "a V" over (b & ~b), etc.)
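A rough illustration of the substitution/replacement distinction, treating formulas as plain strings (the string-based approach is an assumption for brevity; it works here only because no other symbol contains the letter "d"):

formula = "(c & d) V (p & ~(c & ~d))"

# Substitution: every occurrence of the variable d is rewritten.
substituted = formula.replace("d", "(q1 & ~q2)")
# Replacement: one occurrence of a logically equivalent sub-formula is rewritten,
# here using the commutative law (c & d) ≡ (d & c).
replaced_once = formula.replace("(c & d)", "(d & c)", 1)

print(substituted)    # (c & (q1 & ~q2)) V (p & ~(c & ~(q1 & ~q2)))
print(replaced_once)  # (d & c) V (p & ~(c & ~d))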

15.5 Inductive definition

The classical presentation of propositional logic (see Enderton 2002) uses the connectives ¬, ∧, ∨, →, ↔. The set of formulas over a given set of propositional variables is inductively defined to be the smallest set of expressions such that:

• Each propositional variable in the set is a formula,

• (¬α) is a formula whenever α is, and

• (α □ β) is a formula whenever α and β are formulas and □ is one of the binary connectives ∧, ∨, →, ↔.


This inductive definition can be easily extended to cover additional connectives.

The inductive definition can also be rephrased in terms of a closure operation (Enderton 2002). Let V denote a set of propositional variables and let XV denote the set of all strings from an alphabet including symbols in V, left and right parentheses, and all the logical connectives under consideration. Each logical connective corresponds to a formula-building operation, a function that takes strings in XV and returns a string in XV:

• Given a string z, the operation E¬(z) returns (¬z) .

• Given strings y and z, the operation E∧(y, z) returns (y ∧ z). There are similar operations E∨, E→, and E↔ corresponding to the other binary connectives.

The set of formulas over V is defined to be the smallest subset of XV containing V and closed under all the formula-building operations.
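A rough rendering of this closure-style definition in Python (the function names mirror the E¬, E∧, ... operations above; the depth bound is an assumption added so the program terminates, since the full closure is infinite):

def E_not(z):  return "(¬" + z + ")"
def E_bin(op): return lambda y, z: "(" + y + " " + op + " " + z + ")"
E_and, E_or, E_imp, E_iff = E_bin("∧"), E_bin("∨"), E_bin("→"), E_bin("↔")

def formulas_up_to(V, depth):
    """All formulas over the variables V buildable in at most `depth` rounds of the operations."""
    layer = set(V)
    for _ in range(depth):
        new = set(E_not(z) for z in layer)
        for f in (E_and, E_or, E_imp, E_iff):
            new |= set(f(y, z) for y in layer for z in layer)
        layer |= new
    return layer

print(sorted(formulas_up_to({"p", "q"}, 1))[:5])  # a few depth-1 formulas over {p, q}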

15.6 Parsing formulas

The following "laws" of the propositional calculus are used to "reduce" complex formulas. The "laws" can be easily verified with truth tables. For each law, the principal (outermost) connective is associated with logical equivalence ≡ or identity =. A complete analysis of all 2ⁿ combinations of truth-values for its n distinct variables will result in a column of 1's (T's) underneath this connective. This finding makes each law, by definition, a tautology. And, for a given law, because its formula on the left and right are equivalent (or identical) they can be substituted for one another.

Example: The following truth table is De Morgan's law for the behavior of NOT over OR: ~(a V b) ≡ (~a & ~b). To the left of the principal connective ≡ (yellow column labelled "taut") the formula ~(b V a) evaluates to (1, 0, 0, 0) under the label "P". On the right of "taut" the formula (~(b) V ~(a)) also evaluates to (1, 0, 0, 0) under the label "Q". As the two columns have equivalent evaluations, the logical equivalence ≡ under "taut" evaluates to (1, 1, 1, 1), i.e. P ≡ Q. Thus either formula can be substituted for the other if it appears in a larger formula.
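A minimal sketch of the same truth-table check in Python (the helper name is_tautology is an assumption): it exhausts all 2ⁿ rows and confirms that the biconditional holds on every one of them.

from itertools import product

def is_tautology(f, names):
    return all(f(**dict(zip(names, row))) for row in product([False, True], repeat=len(names)))

# De Morgan's law for OR: ~(a V b) ≡ (~a & ~b)
de_morgan = lambda a, b: (not (a or b)) == ((not a) and (not b))
print(is_tautology(de_morgan, ["a", "b"]))   # True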

Enterprising readers might challenge themselves to invent an "axiomatic system" that uses the symbols V, &, ~, (, ), variables a, b, c, the formation rules specified above, and as few as possible of the laws listed below, and then derive as theorems the others as well as the truth-table valuations for V, &, and ~. One set attributed to Huntington (1904) (Suppes:204) uses 8 of the laws defined below.

Note that if used in an axiomatic system, the symbols 1 and 0 (or T and F) are considered to be wffs and thus obey all the same rules as the variables. Thus the laws listed below are actually axiom schemas, that is, they stand in place of an infinite number of instances. Thus ( x V y ) ≡ ( y V x ) might be used in one instance as ( p V 0 ) ≡ ( 0 V p ) and in another instance as ( 1 V q ) ≡ ( q V 1 ), etc.

15.6.1 Connective seniority (symbol rank)

In general, to avoid confusion during analysis and evaluation of propositional formulas, make liberal use of parentheses. However, quite often authors leave them out. To parse a complicated formula one first needs to know the seniority, or rank, that each of the connectives (excepting *) has over the other connectives. To "well-form" a formula, start with the connective with the highest rank and add parentheses around its components, then move down in rank (paying close attention to the connective's scope over which it is working). From most- to least-senior, with the predicate signs ∀x and ∃x, the IDENTITY = and arithmetic signs added for completeness:[16]

≡ (LOGICAL EQUIVALENCE), → (IMPLICATION), & (AND), V (OR), ~ (NOT), ∀x (FOR ALL x), ∃x (THERE EXISTS AN x), = (IDENTITY), + (arithmetic sum), * (arithmetic multiply), ' (s, arithmetic successor).

Thus the formula can be parsed—but note that, because NOT does not obey the distributive law, the parentheses around the inner formula (~c & ~d) are mandatory:


Example: " d & c V w " rewritten is ( (d & c) V w )Example: " a & a → b ≡ a & ~a V b " rewritten (rigorously) is

• ≡ has seniority: ( ( a & a → b ) ≡ ( a & ~a V b ) )
• → has seniority: ( ( a & (a → b) ) ≡ ( a & ~a V b ) )
• & has seniority both sides: ( ( ( (a) & (a → b) ) ) ≡ ( ( (a) & (~a V b) ) )
• ~ has seniority: ( ( ( (a) & (a → b) ) ) ≡ ( ( (a) & (~(a) V b) ) )
• check 9 ( -parenthesis and 9 ) -parenthesis: ( ( ( (a) & (a → b) ) ) ≡ ( ( (a) & (~(a) V b)) )

Example:

d & c V p & ~(c & ~d) ≡ c & d V p & c V p & ~d rewritten is ( ( (d & c) V ( p & ~((c & ~(d)) ) ) ) ≡ ( (c & d) V (p & c) V (p & ~(d)) ) )

15.6.2 Commutative and associative laws

Both AND and OR obey the commutative law and associative law:

• Commutative law for OR: ( a V b ) ≡ ( b V a )

• Commutative law for AND: ( a & b ) ≡ ( b & a )

• Associative law for OR: (( a V b ) V c ) ≡ ( a V (b V c) )

• Associative law for AND: (( a & b ) & c ) ≡ ( a & (b & c) )

Omitting parentheses in strings of AND and OR: The connectives are considered to be unary (one-variable, e.g.NOT) and binary (i.e. two-variable AND, OR, IMPLIES). For example:

( (c & d) V (p & c) V (p & ~d) ) above should be written ( ((c & d) V (p & c)) V (p & ~(d) ) ) or possibly ( (c & d) V ( (p & c) V (p & ~(d)) ) )

However, a truth-table demonstration shows that the form without the extra parentheses is perfectly adequate.

Omitting parentheses with regard to a single-variable NOT: While ~(a), where a is a single variable, is perfectly clear, ~a is adequate and is the usual way this literal would appear. When the NOT is over a formula with more than one symbol, then the parentheses are mandatory, e.g. ~(a V b).

15.6.3 Distributive laws

OR distributes over AND and AND distributes over OR. NOT does not distribute over either AND or OR. See below about De Morgan's laws:

• Distributive law for OR: ( c V ( a & b) ) ≡ ( (c V a) & (c V b) )

• Distributive law for AND: ( c & ( a V b) ) ≡ ( (c & a) V (c & b) )

15.6.4 De Morgan’s laws

NOT, when distributed over OR or AND, does something peculiar (again, these can be verified with a truth-table):

• De Morgan’s law for OR: ~(a V b) ≡ (~a & ~b)

• De Morgan’s law for AND: ~(a & b) ≡ (~a V ~b)


15.6.5 Laws of absorption

Absorption, in particular the first one, causes the "laws" of logic to differ from the "laws" of arithmetic:

• Absorption (idempotency) for OR: (a V a) ≡ a

• Absorption (idempotency) for AND: (a & a) ≡ a

15.6.6 Laws of evaluation: Identity, nullity, and complement

The sign " = " (as distinguished from logical equivalence ≡, alternately ↔ or ⇔) symbolizes the assignment of value or meaning. Thus the string (a & ~(a)) symbolizes "0", i.e. it means the same thing as the symbol "0". In some "systems" this will be an axiom (definition) perhaps shown as ( (a & ~(a)) =D 0 ); in other systems, it may be derived in the truth table below:

• Commutation of equality: (a = b) ≡ (b = a)

• Identity for OR: (a V 0) = a or (a V F) = a

• Identity for AND: (a & 1) = a or (a & T) = a

• Nullity for OR: (a V 1) = 1 or (a V T) = T

• Nullity for AND: (a & 0) = 0 or (a & F) = F

• Complement for OR: (a V ~a) = 1 or (a V ~a) = T, law of excluded middle

• Complement for AND: (a & ~a) = 0 or (a & ~a) = F, law of contradiction

15.6.7 Double negative (Involution)

• ~(~a) = a

15.7 Well-formed formulas (wffs)

A key property of formulas is that they can be uniquely parsed to determine the structure of the formula in terms of its propositional variables and logical connectives. When formulas are written in infix notation, as above, unique readability is ensured through an appropriate use of parentheses in the definition of formulas. Alternatively, formulas can be written in Polish notation or reverse Polish notation, eliminating the need for parentheses altogether.

The inductive definition of infix formulas in the previous section can be converted to a formal grammar in Backus-Naur form:

<formula> ::= <propositional variable>
            | ( ¬ <formula> )
            | ( <formula> ∧ <formula> )
            | ( <formula> ∨ <formula> )
            | ( <formula> → <formula> )
            | ( <formula> ↔ <formula> )

It can be shown that any expression matched by the grammar has a balanced number of left and right parentheses, and any nonempty initial segment of a formula has more left than right parentheses.[17] This fact can be used to give an algorithm for parsing formulas. For example, suppose that an expression x begins with (¬. Starting after the second symbol, match the shortest subexpression y of x that has balanced parentheses. If x is a formula, there is exactly one symbol left after this expression, this symbol is a closing parenthesis, and y itself is a formula. This idea can be used to generate a recursive descent parser for formulas.
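A minimal sketch of such a recursive-descent parser for the fully parenthesized grammar above (the single-character tokenization and single-letter variable names are simplifying assumptions, not part of the grammar):

BINARY = {"∧", "∨", "→", "↔"}

def parse(tokens, i=0):
    """Return (parse_tree, next_index) for the formula starting at tokens[i]."""
    t = tokens[i]
    if t.isalpha():                       # propositional variable
        return t, i + 1
    assert t == "(", "expected '(' or a variable"
    if tokens[i + 1] == "¬":              # ( ¬ <formula> )
        sub, j = parse(tokens, i + 2)
        assert tokens[j] == ")"
        return ("¬", sub), j + 1
    left, j = parse(tokens, i + 1)        # ( <formula> op <formula> )
    op = tokens[j]
    assert op in BINARY, "expected a binary connective"
    right, k = parse(tokens, j + 1)
    assert tokens[k] == ")"
    return (op, left, right), k + 1

tokens = list("((¬p)∧(q∨r))")
tree, end = parse(tokens)
print(tree)          # ('∧', ('¬', 'p'), ('∨', 'q', 'r'))
assert end == len(tokens)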


Example of parenthesis counting:

This method locates as "1" the principal connective, the connective under which the overall evaluation of the formula occurs for the outer-most parentheses (which are often omitted).[18] It also locates the inner-most connective where one would begin evaluation of the formula without the use of a truth table, e.g. at "level 6".

15.7.1 Wffs versus valid formulas in inferences

The notion of valid argument is usually applied to inferences in arguments, but arguments reduce to propositional formulas and can be evaluated the same as any other propositional formula. Here a valid inference means: "The formula that represents the inference evaluates to "truth" beneath its principal connective, no matter what truth-values are assigned to its variables", i.e. the formula is a tautology.[19] Quite possibly a formula will be well-formed but not valid. Another way of saying this is: "Being well-formed is necessary for a formula to be valid but it is not sufficient." The only way to find out if it is both well-formed and valid is to submit it to verification with a truth table or by use of the "laws":

Example 1: What does one make of the following difficult-to-follow assertion? Is it valid? "If it's sunny, but if the frog is croaking then it's not sunny, then it's the same as saying that the frog isn't croaking." Convert this to a propositional formula as follows:

" IF (a AND (IF b THEN NOT-a) THEN NOT-a” where " a " represents “its sunny” and " b" represents “the frog is croaking":( ( (a) & ( (b) → ~(a) ) ≡ ~(b) )

This is well-formed, but is it valid? In other words, when evaluated will this yield a tautology (all T) beneath the logical-equivalence symbol ≡ ? The answer is NO, it is not valid. However, if reconstructed as an implication then the argument is valid.

"Saying it's sunny, but if the frog is croaking then it's not sunny, implies that the frog isn't croaking."

Other circumstances may be preventing the frog from croaking: perhaps a crane ate it.

Example 2 (from Reichenbach via Bertrand Russell):

"If pigs have wings, some winged animals are good to eat. Some winged animals are good to eat, so pigs have wings."

( ((a) → (b)) & (b) → (a) ) is well formed, but an invalid argument, as shown by the red evaluation under the principal implication:

15.8 Reduced sets of connectives

A set of logical connectives is called complete if every propositional formula is tautologically equivalent to a formula with just the connectives in that set. There are many complete sets of connectives, including {∧, ¬}, {∨, ¬}, and {→, ¬}. There are two binary connectives that are complete on their own, corresponding to NAND and NOR, respectively.[20] Some pairs are not complete, for example {∧, ∨}.

15.8.1 The stroke (NAND)

The binary connective corresponding to NAND is called the Sheffer stroke, and written with a vertical bar | or vertical arrow ↑. The completeness of this connective was noted in Principia Mathematica (1927:xvii). Since it is complete on its own, all other connectives can be expressed using only the stroke. For example, where the symbol " ≡ " represents logical equivalence:

~p ≡ p|p
p → q ≡ p|~q
p V q ≡ ~p|~q
p & q ≡ ~(p|q)
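A short sketch exercising the four identities above (the Python function names are assumptions): every derived connective is built from the stroke alone and then checked exhaustively.

nand = lambda p, q: not (p and q)

NOT_ = lambda p: nand(p, p)                    # ~p  ≡ p|p
IMP_ = lambda p, q: nand(p, NOT_(q))           # p→q ≡ p|~q
OR_  = lambda p, q: nand(NOT_(p), NOT_(q))     # pVq ≡ ~p|~q
AND_ = lambda p, q: NOT_(nand(p, q))           # p&q ≡ ~(p|q)

for p in (False, True):
    for q in (False, True):
        assert NOT_(p) == (not p)
        assert OR_(p, q) == (p or q) and AND_(p, q) == (p and q)
        assert IMP_(p, q) == ((not p) or q)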


The engineering symbol for the NAND connective (the 'stroke') can be used to build any propositional formula. The notion that truth (1) and falsity (0) can be defined in terms of this connective is shown in the sequence of NANDs on the left, and the derivations of the four evaluations of a NAND b are shown along the bottom. The more common method is to use the definition of the NAND from the truth table.

In particular, the zero-ary connectives ⊤ (representing truth) and ⊥ (representing falsity) can be expressed using the stroke:

⊤ ≡ (a|(a|a))

⊥ ≡ (⊤|⊤)

15.8.2 IF ... THEN ... ELSE

This connective together with { 0, 1 } (or { F, T }, or { ⊥, ⊤ }) forms a complete set. In the following, the IF...THEN...ELSE relation (c, b, a) = d represents ( (c → b) V (~c → a) ) ≡ ( (c & b) V (~c & a) ) = d

(c, b, a):

(c, 0, 1) ≡ ~c
(c, b, 1) ≡ (c → b)
(c, c, a) ≡ (c V a)
(c, b, c) ≡ (c & b)
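A minimal sketch checking these four identities of the CASE connective exhaustively (the Python name case is an assumption; it implements (c & b) V (~c & a)):

from itertools import product

case = lambda c, b, a: (c and b) or ((not c) and a)

for c, b, a in product([False, True], repeat=3):
    assert case(c, False, True) == (not c)          # (c, 0, 1) ≡ ~c
    assert case(c, b, True)  == ((not c) or b)      # (c, b, 1) ≡ (c → b)
    assert case(c, c, a)     == (c or a)            # (c, c, a) ≡ (c V a)
    assert case(c, b, c)     == (c and b)           # (c, b, c) ≡ (c & b)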


Example: The following shows how a theorem-based proof of "(c, b, 1) ≡ (c → b)" would proceed; below the proof is its truth-table verification. (Note: (c → b) is defined to be (~c V b).)

• Begin with the reduced form: ( (c & b) V (~c & a) )
• Substitute "1" for a: ( (c & b) V (~c & 1) )
• Identity (~c & 1) = ~c: ( (c & b) V (~c) )
• Law of commutation for V: ( (~c) V (c & b) )
• Distribute "~c V" over (c & b): ( ((~c) V c ) & ((~c) V b ) )
• Law of excluded middle (((~c) V c ) = 1 ): ( (1) & ((~c) V b ) )
• Distribute "(1) &" over ((~c) V b): ( ((1) & (~c)) V ((1) & b) )
• Commutativity and Identity ( (1 & ~c) = (~c & 1) = ~c, and (1 & b) ≡ (b & 1) ≡ b ): ( ~c V b )
• ( ~c V b ) is defined as c → b. Q.E.D.

In the following truth table the column labelled "taut" for tautology evaluates logical equivalence (symbolized here by ≡) between the two columns labelled d. Because all four rows under "taut" are 1's, the equivalence indeed represents a tautology.

15.9 Normal forms

An arbitrary propositional formula may have a very complicated structure. It is often convenient to work with formulas that have simpler forms, known as normal forms. Some common normal forms include conjunctive normal form and disjunctive normal form. Any propositional formula can be reduced to its conjunctive or disjunctive normal form.

15.9.1 Reduction to normal form

Reduction to normal form is relatively simple once a truth table for the formula is prepared. But further attempts to minimize the number of literals (see below) require some tools: reduction by De Morgan's laws and truth tables can be unwieldy, but Karnaugh maps are very suitable for a small number of variables (5 or fewer). Some sophisticated tabular methods exist for more complex circuits with multiple outputs but these are beyond the scope of this article; for more see Quine–McCluskey algorithm.

Literal, term and alterm

In electrical engineering a variable x or its negation ~(x) is lumped together into a single notion called a literal. A string of literals connected by ANDs is called a term. A string of literals connected by ORs is called an alterm. Typically the literal ~(x) is abbreviated ~x. Sometimes the &-symbol is omitted altogether in the manner of algebraic multiplication.

Example: a, b, c, d are variables. ((( a & ~(b) ) & ~(c)) & d) is a term. This can be abbreviated as (a & ~b & ~c & d), or a~b~cd.

Example: p, q, r, s are variables. (((p V ~(q) ) V r) V ~(s) ) is an alterm. This can be abbreviated as (p V ~q V r V ~s).

Minterms

In the same way that a 2ⁿ-row truth table displays the evaluation of a propositional formula for all 2ⁿ possible values of its variables, n variables produce a 2ⁿ-square Karnaugh map (even though we cannot draw it in its full-dimensional realization). For example, 3 variables produce 2³ = 8 rows and 8 Karnaugh squares; 4 variables produce 16 truth-table rows and 16 squares and therefore 16 minterms. Each Karnaugh-map square and its corresponding truth-table evaluation represents one minterm.


Any propositional formula can be reduced to the "logical sum" (OR) of the active (i.e. "1"- or "T"-valued) minterms. When in this form the formula is said to be in disjunctive normal form. But even though it is in this form, it is not necessarily minimized with respect to either the number of terms or the number of literals.

In the following table, observe the peculiar numbering of the rows: (0, 1, 3, 2, 6, 7, 5, 4, 0). The first column is the decimal equivalent of the binary equivalent of the digits "cba", in other words:

Example: cba₂ = c*2² + b*2¹ + a*2⁰:

cba = (c=1, b=0, a=1) = 101₂ = 1*2² + 0*2¹ + 1*2⁰ = 5₁₀

This numbering comes about because as one moves down the table from row to row only one variable at a time changes its value. Gray code is derived from this notion. This notion can be extended to three- and four-dimensional hypercubes called Hasse diagrams where each corner's variables change only one at a time as one moves around the edges of the cube. Hasse diagrams (hypercubes) flattened into two dimensions are either Veitch diagrams or Karnaugh maps (these are virtually the same thing).

When working with Karnaugh maps one must always keep in mind that the top edge "wraps around" to the bottom edge, and the left edge wraps around to the right edge—the Karnaugh diagram is really a three- or four- or n-dimensional flattened object.

15.9.2 Reduction by use of the map method (Veitch, Karnaugh)

Veitch improved the notion of Venn diagrams by converting the circles to abutting squares, and Karnaugh simplified the Veitch diagram by converting the minterms, written in their literal-form (e.g. ~abc~d), into numbers.[21] The method proceeds as follows:

(1) Produce the formula’s truth table

Produce the formula's truth table. Number its rows using the binary-equivalents of the variables (usually just sequentially 0 through 2ⁿ−1) for n variables.

Technically, the propositional function has been reduced to its (unminimized) disjunctive normal form: each row has its minterm expression and these can be OR'd to produce the formula in its (unminimized) disjunctive normal form.

Example: ((c & d) V (p & ~(c & (~d)))) = q in disjunctive normal form is:

( (~p & d & c ) V (p & d & c) V (p & d & ~c) V (p & ~d & ~c) ) = q

However, this formula can be reduced both in the number of terms (from 4 to 3) and in the total count of its literals (12 to 6).
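A rough sketch of how the unminimized disjunctive normal form above falls out of the truth table: collect the minterm for every row on which q evaluates to 1 (the variable ordering and helper names are assumptions).

from itertools import product

q = lambda p, d, c: (c and d) or (p and not (c and not d))

minterms = []
for p, d, c in product([False, True], repeat=3):
    if q(p, d, c):
        minterms.append(" & ".join(name if val else "~" + name
                                   for name, val in (("p", p), ("d", d), ("c", c))))
print(" V ".join("(" + m + ")" for m in minterms))
# (~p & d & c) V (p & ~d & ~c) V (p & d & ~c) V (p & d & c)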

(2) Create the formula’s Karnaugh map

Use the values of the formula (e.g. "p") found by the truth-table method and place them into their respective (associated) Karnaugh squares (these are numbered per the Gray code convention). If values of "d" for "don't care" appear in the table, this adds flexibility during the reduction phase.

(3) Reduce minterms

Minterms of adjacent (abutting) 1-squares (T-squares) can be reduced with respect to the number of their literals, and the number of terms also will be reduced in the process. Two abutting squares (2 x 1 horizontal or 1 x 2 vertical, even the edges represent abutting squares) lose one literal, four squares in a 4 x 1 rectangle (horizontal or vertical) or 2 x 2 square (even the four corners represent abutting squares) lose two literals, eight squares in a rectangle lose 3 literals, etc. (One seeks out the largest squares or rectangles and ignores the smaller squares or rectangles contained totally within them.) This process continues until all abutting squares are accounted for, at which point the propositional formula is said to be minimized. For example, squares #3 and #7 abut; these two abutting squares can lose one literal (e.g. "p" from squares #3 and #7).

Example: The map method usually is done by inspection. The following example expands the algebraic method to show the "trick" behind the combining of terms on a Karnaugh map:

Minterms #3 and #7 abut, #7 and #6 abut, and #4 and #6 abut (because the table’s edges wrap around).So each of these pairs can be reduced.

Observe that by the Idempotency law (A V A) = A, we can create more terms. Then by the association and distributive laws the variables to disappear can be paired, and then "disappeared" with the Law of contradiction (x & ~x) = 0. The following uses brackets [ and ] only to keep track of the terms; they have no special significance:

• Put the formula in conjunctive normal form with the formula to be reduced:

q = ( (~p & d & c ) V (p & d & c) V (p & d & ~c) V (p & ~d & ~c) ) = ( #3 V #7 V #6 V #4 )

• Idempotency (absorption) (A V A) = A:

( #3 V [ #7 V #7 ] V [ #6 V #6 ] V #4 )

• Associative law (x V (y V z)) = ( (x V y) V z )

( [ #3 V #7 ] V [ #7 V #6 ] V [ #6 V #4 ] )
[ (~p & d & c ) V (p & d & c) ] V [ (p & d & c) V (p & d & ~c) ] V [ (p & d & ~c) V (p & ~d & ~c) ]

• Distributive law ( x & (y V z) ) = ( (x & y) V (x & z) ) :

( [ (d & c) V (~p & p) ] V [ (p & d) V (~c & c) ] V [ (p & ~c) V (c & ~c) ] )

• Commutative law and law of contradiction (x & ~x) = (~x & x) = 0:

( [ (d & c) V (0) ] V [ (p & d) V (0) ] V [ (p & ~c) V (0) ] )

• Law of identity ( x V 0 ) = x leading to the reduced form of the formula:

q = ( (d & c) V (p & d) V (p & ~c) )

(4) Verify reduction with a truth table
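A minimal sketch of this verification step in Python (the lambda names are assumptions): the reduced formula and the original must agree on every one of the 2³ rows.

from itertools import product

original = lambda p, d, c: (c and d) or (p and not (c and not d))
reduced  = lambda p, d, c: (d and c) or (p and d) or (p and not c)

assert all(original(p, d, c) == reduced(p, d, c)
           for p, d, c in product([False, True], repeat=3))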

15.10 Impredicative propositions

Given the following examples-as-definitions, what does one make of the subsequent reasoning:

(1) “This sentence is simple.” (2) “This sentence is complex, and it is conjoined by AND.”

Then assign the variable "s" to the left-most sentence "This sentence is simple". Define "compound" c = "not simple" ~s, and assign c = ~s to "This sentence is compound"; assign "j" to "It [this sentence] is conjoined by AND". The second sentence can be expressed as:


( NOT(s) AND j )

If truth values are to be placed on the sentences c = ~s and j, then all are clearly FALSEHOODS: e.g. "This sentence is complex" is a FALSEHOOD (it is simple, by definition). So their conjunction (AND) is a falsehood. But when taken in its assembled form, the sentence is a TRUTH.

This is an example of the paradoxes that result from an impredicative definition—that is, when an object m has a property P, but the object m is defined in terms of property P.[22] The best advice for a rhetorician or one involved in deductive analysis is to avoid impredicative definitions but at the same time be on the lookout for them because they can indeed create paradoxes. Engineers, on the other hand, put them to work in the form of propositional formulas with feedback.

15.11 Propositional formula with “feedback”

The notion of a propositional formula appearing as one of its own variables requires a formation rule that allows the assignment of the formula to a variable. In general there is no stipulation (in either axiomatic or truth-table systems of objects and relations) that forbids this from happening.[23]

The simplest case occurs when an OR formula becomes one of its own inputs, e.g. p = q. Begin with (p V s) = q, then let p = q. Observe that q's "definition" depends on itself "q" as well as on "s" and the OR connective; this definition of q is thus impredicative. Either of two conditions can result:[24] oscillation or memory.

It helps to think of the formula as a black box. Without knowledge of what is going on "inside" the formula-"box", from the outside it would appear that the output is no longer a function of the inputs alone. That is, sometimes one looks at q and sees 0 and other times 1. To avoid this problem one has to know the state (condition) of the "hidden" variable p inside the box (i.e. the value of q fed back and assigned to p). When this is known the apparent inconsistency goes away.

To understand [predict] the behavior of formulas with feedback requires the more sophisticated analysis of sequential circuits. Propositional formulas with feedback lead, in their simplest form, to state machines; they also lead to memories in the form of Turing tapes and counter-machine counters. From combinations of these elements one can build any sort of bounded computational model (e.g. Turing machines, counter machines, register machines, Macintosh computers, etc.).

15.11.1 Oscillation

In the abstract (ideal) case the simplest oscillating formula is a NOT fed back to itself: ~(~(p=q)) = q. Analysis of an abstract (ideal) propositional formula in a truth-table reveals an inconsistency for both the p=1 and p=0 cases: when p=1, q=0, and this cannot be because p=q; ditto for when p=0 and q=1.

Oscillation with delay: If a delay[25] (ideal or non-ideal) is inserted in the abstract formula between p and q then p will oscillate between 1 and 0: 101010...101... ad infinitum. If either the delay or the NOT is not abstract (i.e. not ideal), the type of analysis to be used will depend upon the exact nature of the objects that make up the oscillator; such things fall outside mathematics and into engineering.

Analysis requires a delay to be inserted and then the loop cut between the delay and the input "p". The delay must be viewed as a kind of proposition that has "qd" (q-delayed) as output for "q" as input. This new proposition adds another column to the truth table. The inconsistency is now between "qd" and "p" as shown in red, two stable states resulting:

15.11.2 Memory

Without delay, inconsistencies must be eliminated from a truth table analysis. With the notion of "delay", this condition presents itself as a momentary inconsistency between the fed-back output variable q and p = q_delayed.

A truth table reveals the rows where inconsistencies occur between p = q_delayed at the input and q at the output. After "breaking" the feed-back,[26] the truth table construction proceeds in the conventional manner. But afterwards, in every row the output q is compared to the now-independent input p and any inconsistencies between p and q are noted (i.e. p=0 together with q=1, or p=1 and q=0); when the "line" is "remade" both are rendered impossible by the Law of contradiction ~(p & ~p). Rows revealing inconsistencies are either considered transient states or just eliminated as inconsistent and hence "impossible".

Once-flip memory

About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output "q" feeds back into "p". Given that the formula is first evaluated (initialized) with p=0 & q=0, it will "flip" once when "set" by s=1. Thereafter, output "q" will sustain "q" in the "flipped" condition (state q=1). This behavior, now time-dependent, is shown by the state diagram to the right of the once-flip.

Flip-flop memory

The next simplest case is the "set-reset" flip-flop shown below the once-flip. Given that r=0 & s=0 and q=0 at the outset, it is "set" (s=1) in a manner similar to the once-flip. It however has a provision to "reset" q=0 when "r"=1. An additional complication occurs if both set=1 and reset=1. In this formula, the set=1 forces the output q=1 so when and if (s=0 & r=1) the flip-flop will be reset. Or, if (s=1 & r=0) the flip-flop will be set. In the abstract (ideal) instance in which s=1 => s=0 & r=1 => r=0 simultaneously, the formula q will be indeterminate (undecidable). Due to delays in "real" OR, AND and NOT the result will be unknown at the outset but thereafter predictable.

Clocked flip-flop memory

The formula known as "clocked flip-flop" memory ("c" is the "clock" and "d" is the "data") is given below. It works as follows: When c = 0 the data d (either 0 or 1) cannot "get through" to affect output q. When c = 1 the data d "gets through" and output q "follows" d's value. When c goes from 1 to 0 the last value of the data remains "trapped" at output "q". As long as c=0, d can change value without causing q to change.

Example: ( ( c & d ) V ( p & ( ~( c & ~( d ) ) ) ) ) = q, but now let p = q:

Example: ( ( c & d ) V ( q & ( ~( c & ~( d ) ) ) ) ) = q

The state diagram is similar in shape to the flip-flop’s state diagram, but with different labelling on the transitions.
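A time-stepped sketch of the clocked flip-flop formula above, treating the fed-back q as one unit of delay (its previous value); the step function and input sequence are assumptions for illustration.

def step(c, d, q_prev):
    # q = (c & d) V (q_prev & ~(c & ~d))
    return (c and d) or (q_prev and not (c and not d))

q = False
for t, (c, d) in enumerate([(0, 1), (1, 1), (0, 0), (0, 1), (1, 0), (0, 1)]):
    q = step(bool(c), bool(d), q)
    print(t, c, d, int(q))
# q follows d while c=1 and holds its last value while c=0.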

15.12 Historical development

Bertrand Russell (1912:74) lists three laws of thought that derive from Aristotle: (1) The law of identity: "Whatever is, is.", (2) The law of contradiction: "Nothing can both be and not be", and (3) The law of excluded middle: "Everything must be or not be."

Example: Here O is an expression about an object's BEING or QUALITY:

(1) Law of Identity: O = O
(2) Law of contradiction: ~(O & ~(O))
(3) Law of excluded middle: (O V ~(O))

The use of the word "everything" in the law of excluded middle renders Russell's expression of this law open to debate. If restricted to an expression about BEING or QUALITY with reference to a finite collection of objects (a finite "universe of discourse") -- the members of which can be investigated one after another for the presence or absence of the assertion—then the law is considered intuitionistically appropriate. Thus an assertion such as: "This object must either BE or NOT BE (in the collection)", or "This object must either have this QUALITY or NOT have this QUALITY (relative to the objects in the collection)" is acceptable. See more at Venn diagram.

Although a propositional calculus originated with Aristotle, the notion of an algebra applied to propositions had to wait until the early 19th century. In an (adverse) reaction to the 2000 year tradition of Aristotle's syllogisms, John Locke's Essay concerning human understanding (1690) used the word semiotics (theory of the use of symbols). By 1826 Richard Whately had critically analyzed the syllogistic logic with a sympathy toward Locke's semiotics. George Bentham's work (1827) resulted in the notion of "quantification of the predicate" (1827) (nowadays symbolized as ∀ ≡ "for all"). A "row" instigated by William Hamilton over a priority dispute with Augustus De Morgan "inspired George Boole to write up his ideas on logic, and to publish them as MAL [Mathematical Analysis of Logic] in 1847" (Grattan-Guinness and Bornet 1997:xxviii).

About his contribution Grattan-Guinness and Bornet comment:

"Boole's principal single innovation was [the] law [ xⁿ = x ] for logic: it stated that the mental acts of choosing the property x and choosing x again and again is the same as choosing x once... As consequence of it he formed the equations x•(1-x)=0 and x+(1-x)=1 which for him expressed respectively the law of contradiction and the law of excluded middle" (p. xxviiff). For Boole "1" was the universe of discourse and "0" was nothing.

Gottlob Frege's massive undertaking (1879) resulted in a formal calculus of propositions, but his symbolism is so daunting that it had little influence excepting on one person: Bertrand Russell. First as the student of Alfred North Whitehead he studied Frege's work and suggested a (famous and notorious) emendation with respect to it (1904) around the problem of an antinomy that he discovered in Frege's treatment (cf Russell's paradox). Russell's work led to a collaboration with Whitehead that, in the year 1912, produced the first volume of Principia Mathematica (PM). It is here that what we consider "modern" propositional logic first appeared. In particular, PM introduces NOT and OR and the assertion symbol ⊦ as primitives. In terms of these notions they define IMPLICATION → (def. *1.01: ~p V q), then AND (def. *3.01: ~(~p V ~q) ), then EQUIVALENCE p ←→ q (*4.01: (p → q) & ( q → p )).

• Henry M. Sheffer (1921) and Jean Nicod demonstrate that only one connective, the "stroke" |, is sufficient to express all propositional formulas.

• Emil Post (1921) develops the truth-table method of analysis in his "Introduction to a general theory of elementary propositions". He notes Nicod's stroke | .

• Whitehead and Russell add an introduction to their 1927 re-publication of PM adding, in part, a favorable treatment of the "stroke".

Computation and switching logic:

• William Eccles and F. W. Jordan (1919) describe a “trigger relay” made from a vacuum tube.

• George Stibitz (1937) invents the binary adder using mechanical relays. He builds this on his kitchen table.

Example: Given binary bits aᵢ and bᵢ and carry-in c_inᵢ, their summation Σᵢ and carry-out c_outᵢ are (a sketch verifying these equations appears after this list):

• ( ( aᵢ XOR bᵢ ) XOR c_inᵢ ) = Σᵢ
• ( ( aᵢ & bᵢ ) V ( c_inᵢ & ( aᵢ XOR bᵢ ) ) ) = c_outᵢ

• Alan Turing builds a multiplier using relays (1937–1938). He has to hand-wind his own relay coils to do this.

• Textbooks about “switching circuits” appear in early 1950s.

• Willard Quine 1952 and 1955, E. W. Veitch 1952, and M. Karnaugh (1953) develop map-methods for simplifying propositional functions.

• George H. Mealy (1955) and Edward F. Moore (1956) address the theory of sequential (i.e. switching-circuit) "machines".

• E. J. McCluskey and H. Shorr develop a method for simplifying propositional (switching) circuits (1962).
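As promised in the Stibitz example above, a minimal sketch of one-bit binary addition (the function name full_adder and the exhaustive check are assumptions added for illustration; the equations are the standard full-adder relations):

def full_adder(a, b, c_in):
    s = (a ^ b) ^ c_in                     # Σ = (a XOR b) XOR c_in
    c_out = (a & b) | (c_in & (a ^ b))     # carry-out
    return s, c_out

# Exhaustive check against ordinary integer addition:
for a in (0, 1):
    for b in (0, 1):
        for c_in in (0, 1):
            s, c_out = full_adder(a, b, c_in)
            assert 2 * c_out + s == a + b + c_in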


15.13 Footnotes

[1] Hamilton 1978:1

[2] PM p. 91 eschews “the” because they require a clear-cut “object of sensation"; they stipulate the use of “this”

[3] (italics added) Reichenbach p.80.

[4] Tarski p.54-68. Suppes calls IDENTITY a "further rule of inference" and has a brief development around it; Robbin, Bender and Williamson, and Goodstein introduce the sign and its usage without comment or explanation. Hamilton p. 37 employs two signs ≠ and = with respect to the valuation of a formula in a formal calculus. Kleene p. 70 and Hamilton p. 52 place it in the predicate calculus, in particular with regards to the arithmetic of natural numbers.

[5] Empiricists eschew the notion of a priori (built-in, born-with) knowledge. "Radical reductionists" such as John Locke and David Hume "held that every idea must either originate directly in sense experience or else be compounded of ideas thus originating"; quoted from Quine reprinted in 1996 The Emergence of Logical Empiricism, Garland Publishing Inc. http://www.marxists.org/reference/subject/philosophy/works/us/quine.htm

[6] Neural net modelling offers a good mathematical model for a comparator as follows: Given a signal S and a threshold "thr", subtract "thr" from S and substitute this difference d into a sigmoid function: for large "gains" k, e.g. k=100, 1/(1 + e^(−k*d)) = 1/(1 + e^(−k*(S−thr))) ≈ 0 or ≈ 1. For example, if "The door is DOWN" means "The door is less than 50% of the way up", then a threshold thr=0.5 corresponding to 0.5*5.0 = +2.50 volts could be applied to a "linear" measuring-device with an output of 0 volts when fully closed and +5.0 volts when fully open.

[7] In actuality the digital 1 and 0 are defined over non-overlapping ranges, e.g. "1" = +5/+0.2/−1.0 volts, 0 = +0.5/−0.2 volts. When a value falls outside the defined range(s) the value becomes "u" -- unknown; e.g. +2.3 would be "u".

[8] While the notion of logical product is not so peculiar (e.g. 0*0=0, 0*1=0, 1*0=0, 1*1=1), the notion of 1+1=1 is peculiar; in fact (a "+" b) = (a + (b - a*b)) where "+" is the "logical sum" but + and - are the true arithmetic counterparts. Occasionally all four notions do appear in a formula: A AND B = 1/2*( A plus B minus ( A XOR B ) ) (cf p. 146 in John Wakerly 1978, Error Detecting Codes, Self-Checking Circuits and Applications, North-Holland, New York, ISBN 0-444-00259-6 pbk.)

[9] A careful look at its Karnaugh map shows that IF...THEN...ELSE can also be expressed, in a rather round-about way, in terms of two exclusive-ORs: ( (b AND (c XOR a)) OR (a AND (c XOR b)) ) = d.

[10] Robbin p. 3.

[11] Rosenbloom p. 30 and p. 54ff discusses this problem of implication at some length. Most philosophers and mathematicians just accept the material definition as given above. But some do not, including the intuitionists; they consider it a form of the law of excluded middle misapplied.

[12] Indeed, exhaustive selection between alternatives -- mutual exclusion -- is required by the definition that Kleene gives the CASE operator (Kleene 1952:229)

[13] The use of quote marks around the expressions is not accidental. Tarski comments on the use of quotes in his “18. Identityof things and identity of their designations; use of quotation marks” p. 58ff.

[14] Hamilton p. 37. Bender and Williamson p. 29 state “In what follows, we'll replace “equals” with the symbol " ⇔ "(equivalence) which is usually used in logic. We use the more familiar " = " for assigning meaning and values.”

[15] Reichenbach p. 20-22, following the conventions of PM. The symbol =D is in the metalanguage and is not a formal symbol; it has the following meaning: "the symbol ' s ' is to have the same meaning as the formula '(c & d)'".

[16] Rosenbloom 1950:32. Kleene 1952:73-74 ranks all 11 symbols.

[17] cf Minsky 1967:75, section 4.2.3 "The method of parenthesis counting". Minsky presents a state machine that will do the job, and by use of induction (recursive definition) Minsky proves the "method" and presents a theorem as the result. A fully generalized "parenthesis grammar" requires an infinite state machine (e.g. a Turing machine) to do the counting.

[18] Robbin p. 7

[19] cf Reichenbach p. 68 for a more involved discussion: "If the inference is valid and the premises are true, the inference is called conclusive."

[20] As well as the first three, Hamilton pp. 19-22 discusses logics built from only | (NAND) and ↓ (NOR).

[21] Wickes 1967:36ff. Wickes offers a good example of 8 of the 2 x 4 (3-variable) maps and 16 of the 4 x 4 (4-variable) maps. As an arbitrary 3-variable map could represent any one of 2⁸ = 256 2x4 maps, and an arbitrary 4-variable map could represent any one of 2¹⁶ = 65,536 different formula-evaluations, writing down every one is infeasible.

Page 111: Formal Semantics (Logic)

15.14. REFERENCES 101

[22] This definition is given by Stephen Kleene. Both Kurt Gödel and Kleene believed that the classical paradoxes are uniformly examples of this sort of definition. But Kleene went on to assert that the problem has not been solved satisfactorily and impredicative definitions can be found in analysis. He gives as example the definition of the least upper bound (l.u.b.) u of M. Given a Dedekind cut of the number line C and the two parts into which the number line is cut, i.e. M and (C - M), l.u.b. = u is defined in terms of the notion M, whereas M is defined in terms of C. Thus the definition of u, an element of C, is defined in terms of the totality C and this makes its definition impredicative. Kleene asserts that attempts to argue this away can be used to uphold the impredicative definitions in the paradoxes. (Kleene 1952:43).

[23] McCluskey comments that "it could be argued that the analysis is still incomplete because the word statement "The outputs are equal to the previous values of the inputs" has not been obtained"; he goes on to dismiss such worries because "English is not a formal language in a mathematical sense, [and] it is not really possible to have a formal procedure for obtaining word statements" (p. 185).

[24] More precisely, given enough "loop gain", either oscillation or memory will occur (cf McCluskey p. 191-2). In abstract (idealized) mathematical systems adequate loop gain is not a problem.

[25] The notion of delay and the principle of local causation as caused ultimately by the speed of light appears in Robin Gandy (1980), "Church's thesis and Principles for Mechanisms", in J. Barwise, H. J. Keisler and K. Kunen, eds., The Kleene Symposium, North-Holland Publishing Company (1980) 123-148. Gandy considered this to be the most important of his principles: "Contemporary physics rejects the possibility of instantaneous action at a distance" (p. 135). Gandy was Alan Turing's student and close friend.

[26] McCluskey p. 194-5 discusses "breaking the loop" and inserts "amplifiers" to do this; Wickes (p. 118-121) discusses inserting delays. McCluskey p. 195ff discusses the problem of "races" caused by delays.

15.14 References

• Bender, Edward A. and Williamson, S. Gill, 2005, A Short Course in Discrete Mathematics, Dover Publications, Mineola NY, ISBN 0-486-43946-1. This text is used in a "lower division two-quarter [computer science] course" at UC San Diego.

• Enderton, H. B., 2002, A Mathematical Introduction to Logic. Harcourt/Academic Press. ISBN 0-12-238452-0

• Goodstein, R. L., (Pergamon Press 1963), 1966, (Dover edition 2007), Boolean Algebra, Dover Publications, Inc., Mineola, New York, ISBN 0-486-45894-6. Emphasis on the notion of "algebra of classes" with set-theoretic symbols such as ∩, ∪, ' (NOT), ⊂ (IMPLIES). Later Goodstein replaces these with &, ∨, ¬, → (respectively) in his treatment of "Sentence Logic" pp. 76–93.

• Ivor Grattan-Guinness and Gérard Bornet 1997, George Boole: Selected Manuscripts on Logic and its Philosophy, Birkhäuser Verlag, Basel, ISBN 978-0-8176-5456-6 (Boston).

• A. G. Hamilton 1978, Logic for Mathematicians, Cambridge University Press, Cambridge UK, ISBN 0-521-21838-1.

• E. J. McCluskey 1965, Introduction to the Theory of Switching Circuits, McGraw-Hill Book Company, New York. No ISBN. Library of Congress Catalog Card Number 65-17394. McCluskey was a student of Willard Quine and developed some notable theorems with Quine and on his own. For those interested in the history, the book contains a wealth of references.

• Marvin L. Minsky 1967, Computation: Finite and Infinite Machines, Prentice-Hall, Inc., Englewood Cliffs, N.J. No ISBN. Library of Congress Catalog Card Number 67-12342. Useful especially for computability, plus good sources.

• Paul C. Rosenbloom 1950, Dover edition 2005, The Elements of Mathematical Logic, Dover Publications, Inc.,Mineola, New York, ISBN 0-486-44617-4.

• Joel W. Robbin 1969, 1997, Mathematical Logic: A First Course, Dover Publications, Inc., Mineola, New York, ISBN 0-486-45018-X (pbk.).

• Patrick Suppes 1957 (1999 Dover edition), Introduction to Logic, Dover Publications, Inc., Mineola, New York. ISBN 0-486-40687-3 (pbk.). This book is in print and readily available. On his page 204 in a footnote he references his set of axioms to E. V. Huntington, "Sets of Independent Postulates for the Algebra of Logic", Transactions of the American Mathematical Society, Vol. 5 (1904) pp. 288-309.

• Alfred Tarski 1941 (1995 Dover edition), Introduction to Logic and to the Methodology of Deductive Sciences, Dover Publications, Inc., Mineola, New York. ISBN 0-486-28462-X (pbk.). This book is in print and readily available.

• Jean van Heijenoort 1967, 3rd printing with emendations 1976, From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931, Harvard University Press, Cambridge, Massachusetts. ISBN 0-674-32449-8 (pbk.). Translation/reprints of Frege (1879), Russell's letter to Frege (1902) and Frege's letter to Russell (1902), Richard's paradox (1905), Post (1921) can be found here.

• Alfred North Whitehead and Bertrand Russell 1927 2nd edition, paperback edition to *53 1962, Principia Mathematica, Cambridge University Press, no ISBN. In the years between the first edition of 1912 and the 2nd edition of 1927, H. M. Sheffer 1921 and M. Jean Nicod (no year cited) brought to Russell's and Whitehead's attention that what they considered their primitive propositions (connectives) could be reduced to a single |, nowadays known as the "stroke" or NAND (NOT-AND, NEITHER ... NOR...). Russell-Whitehead discuss this in their "Introduction to the Second Edition" and make the definitions as discussed above.

• William E. Wickes 1968, Logic Design with Integrated Circuits, John Wiley & Sons, Inc., New York. No ISBN. Library of Congress Catalog Card Number: 68-21185. Tight presentation of engineering's analysis and synthesis methods, references McCluskey 1965. Unlike Suppes, Wickes' presentation of "Boolean algebra" starts with a set of postulates of a truth-table nature and then derives the customary theorems of them (p. 18ff).


A truth table will contain 2ⁿ rows, where n is the number of variables (e.g. three variables "p", "d", "c" produce 2³ rows). Each row represents a minterm. Each minterm can be found on the Hasse diagram, on the Veitch diagram, and on the Karnaugh map. (The evaluations of "p" shown in the truth table are not shown in the Hasse, Veitch and Karnaugh diagrams; these are shown in the Karnaugh map of the following section.)


Steps in the reduction using a Karnaugh map. The final result is the OR (logical “sum”) of the three reduced terms.


About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output "q" feeding back into "p". The next simplest is the "flip-flop" shown below the once-flip. Analysis of these sorts of formulas can be done by either cutting the feedback path(s) or inserting (ideal) delay in the path. A cut path and an assumption that no delay occurs anywhere in the "circuit" results in inconsistencies for some of the total states (combination of inputs and outputs, e.g. (p=0, s=1, r=1) results in an inconsistency). When delay is present these inconsistencies are merely transient and expire when the delay(s) expire. The drawings on the right are called state diagrams.


A "clocked flip-flop" memory ("c" is the "clock" and "d" is the "data"). The data can change at any time when clock c=0; when clock c=1 the output q "tracks" the value of data d. When c goes from 1 to 0 it "traps" d = q's value and this continues to appear at q no matter what d does (as long as c remains 0).


Chapter 16

Rule of inference

In logic, a rule of inference, inference rule, or transformation rule is a logical form consisting of a function which takes premises, analyzes their syntax, and returns a conclusion (or conclusions). For example, the rule of inference called modus ponens takes two premises, one in the form "If p then q" and another in the form "p", and returns the conclusion "q". The rule is valid with respect to the semantics of classical logic (as well as the semantics of many other non-classical logics), in the sense that if the premises are true (under an interpretation), then so is the conclusion.

Typically, a rule of inference preserves truth, a semantic property. In many-valued logic, it preserves a general designation. But a rule of inference's action is purely syntactic, and does not need to preserve any semantic property: any function from sets of formulae to formulae counts as a rule of inference. Usually only rules that are recursive are important; i.e. rules such that there is an effective procedure for determining whether any given formula is the conclusion of a given set of formulae according to the rule. An example of a rule that is not effective in this sense is the infinitary ω-rule.[1]

Popular rules of inference in propositional logic include modus ponens, modus tollens, and contraposition. First-order predicate logic uses rules of inference to deal with logical quantifiers.

16.1 The standard form of rules of inference

In formal logic (and many related areas), rules of inference are usually given in the following standard form:

Premise#1
Premise#2
...
Premise#n
Conclusion

This expression states that whenever in the course of some logical derivation the given premises have been obtained, the specified conclusion can be taken for granted as well. The exact formal language that is used to describe both premises and conclusions depends on the actual context of the derivations. In a simple case, one may use logical formulae, such as in:

A→ B

A

B

This is the modus ponens rule of propositional logic. Rules of inference are often formulated as schemata employing metavariables.[2] In the rule (schema) above, the metavariables A and B can be instantiated to any element of the universe (or sometimes, by convention, a restricted subset such as propositions) to form an infinite set of inference rules.

A proof system is formed from a set of rules chained together to form proofs, also called derivations. Any derivation has only one final conclusion, which is the statement proved or derived. If premises are left unsatisfied in the derivation, then the derivation is a proof of a hypothetical statement: "if the premises hold, then the conclusion holds."
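A toy rendering of the "function which takes premises ... and returns a conclusion" view of a rule, using modus ponens (the tuple encoding of implications and the function name are assumptions for illustration):

def modus_ponens(premises):
    """From A and ('→', A, B) among the premises, return B; otherwise return None."""
    for p in premises:
        if isinstance(p, tuple) and p[0] == '→' and p[1] in premises:
            return p[2]
    return None

print(modus_ponens(['A', ('→', 'A', 'B')]))   # B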


16.2 Axiom schemas and axioms

Inference rules may also be stated in this form: (1) zero or more premises, (2) a turnstile symbol ⊢, which means "infers", "proves", or "concludes", and (3) a conclusion. This form usually embodies the relational (as opposed to functional) view of a rule of inference, where the turnstile stands for a deducibility relation holding between premises and conclusion.

An inference rule containing no premises is called an axiom schema or, if it contains no metavariables, simply an axiom.[2]

Rules of inference must be distinguished from axioms of a theory. In terms of semantics, axioms are valid assertions. Axioms are usually regarded as starting points for applying rules of inference and generating a set of conclusions. Or, in less technical terms:

Rules are statements about the system, axioms are statements in the system. For example:

• The rule that from ⊢ p you can infer ⊢ Provable(p) is a statement that says if you've proven p, then it is provable that p is provable. This rule holds in Peano arithmetic, for example.

• The axiom p → Provable(p) would mean that every true statement is provable. This axiom does not hold in Peano arithmetic.

Rules of inference play a vital role in the specification of logical calculi as they are considered in proof theory, such as the sequent calculus and natural deduction.

16.3 Example: Hilbert systems for two propositional logics

In a Hilbert system, the premises and conclusion of the inference rules are simply formulae of some language, usually employing metavariables. For graphical compactness of the presentation and to emphasize the distinction between axioms and rules of inference, this section uses the sequent notation (⊢) instead of a vertical presentation of rules. The formal language for classical propositional logic can be expressed using just negation (¬), implication (→) and propositional symbols. A well-known axiomatization, comprising three axiom schemas and one inference rule (modus ponens), is:

(CA1) ⊢ A → (B → A)
(CA2) ⊢ (A → (B → C)) → ((A → B) → (A → C))
(CA3) ⊢ (¬A → ¬B) → (B → A)
(MP) A, A → B ⊢ B

It may seem redundant to have two notions of inference in this case, ⊢ and →. In classical propositional logic, they indeed coincide; the deduction theorem states that A ⊢ B if and only if ⊢ A → B. There is however a distinction worth emphasizing even in this case: the first notation describes a deduction, that is an activity of passing from sentences to sentences, whereas A → B is simply a formula made with a logical connective, implication in this case. Without an inference rule (like modus ponens in this case), there is no deduction or inference. This point is illustrated in Lewis Carroll's dialogue called "What the Tortoise Said to Achilles".[3]

For some non-classical logics, the deduction theorem does not hold. For example, the three-valued logic Ł3 of Łukasiewicz can be axiomatized as:[4]

(CA1) ⊢ A → (B → A)
(LA2) ⊢ (A → B) → ((B → C) → (A → C))
(CA3) ⊢ (¬A → ¬B) → (B → A)
(LA4) ⊢ ((A → ¬A) → A) → A
(MP) A, A → B ⊢ B

This sequence differs from classical logic by the change in axiom 2 and the addition of axiom 4. The classical deduction theorem does not hold for this logic, however a modified form does hold, namely A ⊢ B if and only if ⊢ A → (A → B).[5]


16.4 Admissibility and derivability

Main article: Admissible rule

In a set of rules, an inference rule could be redundant in the sense that it is admissible or derivable. A derivable rule is one whose conclusion can be derived from its premises using the other rules. An admissible rule is one whose conclusion holds whenever the premises hold. All derivable rules are admissible. To appreciate the difference, consider the following set of rules for defining the natural numbers (the judgment n nat asserts the fact that n is a natural number):

0 nat

n nat
s(n) nat

The first rule states that 0 is a natural number, and the second states that s(n) is a natural number if n is. In this proof system, the following rule, demonstrating that the second successor of a natural number is also a natural number, is derivable:

n nat
s(s(n)) nat

Its derivation is the composition of two uses of the successor rule above. The following rule for asserting the existence of a predecessor for any nonzero number is merely admissible:

s(n) nat
n nat

This is a true fact of natural numbers, as can be proven by induction. (To prove that this rule is admissible, assume a derivation of the premise and induct on it to produce a derivation of n nat.) However, it is not derivable, because it depends on the structure of the derivation of the premise. Because of this, derivability is stable under additions to the proof system, whereas admissibility is not. To see the difference, suppose the following nonsense rule were added to the proof system:

s(−3) nat

In this new system, the double-successor rule is still derivable. However, the rule for finding the predecessor is no longer admissible, because there is no way to derive −3 nat. The brittleness of admissibility comes from the way it is proved: since the proof can induct on the structure of the derivations of the premises, extensions to the system add new cases to this proof, which may no longer hold.

Admissible rules can be thought of as theorems of a proof system. For instance, in a sequent calculus where cut elimination holds, the cut rule is admissible.
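A small sketch of the nat proof system (the Python encoding of judgments as pairs is an assumption): the double-successor rule is literally a composition of two uses of the successor rule, which is what makes it derivable rather than merely admissible.

def zero_rule():            # axiom: ⊢ 0 nat
    return ("0", "nat")

def succ_rule(derivation):  # rule: from n nat infer s(n) nat
    n, judgment = derivation
    assert judgment == "nat"
    return ("s(" + n + ")", "nat")

def double_succ_rule(derivation):
    # Derivable: just two uses of succ_rule. The predecessor rule, by contrast,
    # cannot be written as a composition of the given rules; it is only admissible.
    return succ_rule(succ_rule(derivation))

print(double_succ_rule(succ_rule(zero_rule())))   # ('s(s(s(0)))', 'nat')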

16.5 See also

• Inference objection

• Immediate inference

• Law of thought

• List of rules of inference

• Logical truth

• Structural rule


16.6 References

[1] Boolos, George; Burgess, John; Jeffrey, Richard C. (2007). Computability and Logic. Cambridge: Cambridge University Press. p. 364. ISBN 0-521-87752-0.

[2] John C. Reynolds (2009) [1998]. Theories of Programming Languages. Cambridge University Press. p. 12. ISBN 978-0-521-10697-9.

[3] Kosta Dosen (1996). “Logical consequence: a turn in style”. In Maria Luisa Dalla Chiara, Kees Doets, Daniele Mundici, Johan van Benthem. Logic and Scientific Methods: Volume One of the Tenth International Congress of Logic, Methodology and Philosophy of Science, Florence, August 1995. Springer. p. 290. ISBN 978-0-7923-4383-7. Preprint (with different pagination).

[4] Bergmann, Merrie (2008). An introduction to many-valued and fuzzy logic: semantics, algebras, and derivation systems. Cambridge University Press. p. 100. ISBN 978-0-521-88128-9.

[5] Bergmann, Merrie (2008). An introduction to many-valued and fuzzy logic: semantics, algebras, and derivation systems. Cambridge University Press. p. 114. ISBN 978-0-521-88128-9.


Chapter 17

Semantics

Semantics (from Ancient Greek: σημαντικός sēmantikós, “significant”)[1][2] is the study of meaning. It focuses on the relation between signifiers, like words, phrases, signs, and symbols, and what they stand for; their denotation. Linguistic semantics is the study of meaning that is used for understanding human expression through language. Other forms of semantics include the semantics of programming languages, formal logics, and semiotics. In international scientific vocabulary semantics is also called semasiology.

The word semantics itself denotes a range of ideas—from the popular to the highly technical. It is often used in ordinary language for denoting a problem of understanding that comes down to word selection or connotation. This problem of understanding has been the subject of many formal enquiries, over a long period of time, especially in the field of formal semantics. In linguistics, it is the study of the interpretation of signs or symbols used in agents or communities within particular circumstances and contexts.[3] Within this view, sounds, facial expressions, body language, and proxemics have semantic (meaningful) content, and each comprises several branches of study. In written language, things like paragraph structure and punctuation bear semantic content; other forms of language bear other semantic content.[3]

The formal study of semantics intersects with many other fields of inquiry, including lexicology, syntax, pragmatics, etymology and others. Independently, semantics is also a well-defined field in its own right, often with synthetic properties.[4] In the philosophy of language, semantics and reference are closely connected. Further related fields include philology, communication, and semiotics. The formal study of semantics can therefore be manifold and complex.

Semantics contrasts with syntax, the study of the combinatorics of units of a language (without reference to their meaning), and pragmatics, the study of the relationships between the symbols of a language, their meaning, and the users of the language.[5] Semantics as a field of study also has significant ties to various representational theories of meaning including truth theories of meaning, coherence theories of meaning, and correspondence theories of meaning. Each of these is related to the general philosophical study of reality and the representation of meaning.

17.1 Linguistics

In linguistics, semantics is the subfield that is devoted to the study of meaning, as inherent at the levels of words, phrases, sentences, and larger units of discourse (termed texts, or narratives). The study of semantics is also closely linked to the subjects of representation, reference and denotation. The basic study of semantics is oriented to the examination of the meaning of signs, and the study of relations between different linguistic units and compounds: homonymy, synonymy, antonymy, hypernymy, hyponymy, meronymy, metonymy, holonymy, paronyms. A key concern is how meaning attaches to larger chunks of text, possibly as a result of the composition from smaller units of meaning. Traditionally, semantics has included the study of sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax.


17.2 Montague grammar

In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of the lambda calculus. In these terms, the syntactic parse of the sentence John ate every bagel would consist of a subject (John) and a predicate (ate every bagel); Montague demonstrated that the meaning of the sentence altogether could be decomposed into the meanings of its parts and in relatively few rules of combination (a toy illustration of this kind of composition is sketched after the list below). The logical predicate thus obtained would be elaborated further, e.g. using truth theory models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s.

Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, and led to several attempts at incorporating context, such as:

• Situation semantics (1980s): truth-values are incomplete, they get assigned based on context

• Generative lexicon (1990s): categories (types) are incomplete, and get assigned based on context
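As a toy illustration of the compositional idea (not from the article: the model, lexicon, and Python encoding are invented, with ordinary functions standing in for lambda-calculus terms), the sentence John ate every bagel can be evaluated by composing the denotations of its parts:

# Hypothetical toy model:
domain  = {'john', 'bagel1', 'bagel2', 'toast1'}
bagels  = {'bagel1', 'bagel2'}
ate_rel = {('john', 'bagel1'), ('john', 'bagel2')}

# Lexical entries as denotations:
john  = lambda p: p('john')                     # proper name lifted to a generalized quantifier
bagel = lambda x: x in bagels                   # one-place predicate
every = lambda restr: lambda scope: all(scope(x) for x in domain if restr(x))
ate   = lambda obj_q: lambda subj: obj_q(lambda y: (subj, y) in ate_rel)

# Composition mirrors the parse [John [ate [every bagel]]]:
print(john(ate(every(bagel))))                  # True in this toy model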

17.3 Dynamic turn in semantics

In Chomskyan linguistics there was no mechanism for the learning of semantic relations, and the nativist view considered all semantic notions as inborn. Thus, even novel concepts were proposed to have been dormant in some sense. This view was also thought unable to address many issues such as metaphor or associative meanings, and semantic change, where meanings within a linguistic community change over time, and qualia or subjective experience. Another issue not addressed by the nativist model was how perceptual cues are combined in thought, e.g. in mental rotation.[6]

This view of semantics, as an innate finite meaning inherent in a lexical unit that can be composed to generate meanings for larger chunks of discourse, is now being fiercely debated in the emerging domain of cognitive linguistics[7] and also in the non-Fodorian camp in philosophy of language.[8] The challenge is motivated by:

• factors internal to language, such as the problem of resolving indexicals or anaphora (e.g. this x, him, last week). In these situations context serves as the input, but the interpreted utterance also modifies the context, so it is also the output. Thus, the interpretation is necessarily dynamic and the meaning of sentences is viewed as context change potentials instead of propositions.

• factors external to language, i.e. language is not a set of labels stuck on things, but “a toolbox, the importance of whose elements lie in the way they function rather than their attachments to things.”[8] This view reflects the position of the later Wittgenstein and his famous game example, and is related to the positions of Quine, Davidson, and others.

A concrete example of the latter phenomenon is semantic underspecification – meanings are not complete without some elements of context. To take an example of one word, red, its meaning in a phrase such as red book is similar to many other usages, and can be viewed as compositional.[9] However, the colours implied in phrases such as red wine (very dark), and red hair (coppery), or red soil, or red skin are very different. Indeed, these colours by themselves would not be called red by native speakers. These instances are contrastive, so red wine is so called only in comparison with the other kind of wine (which also is not white for the same reasons). This view goes back to de Saussure:

Each of a set of synonyms like redouter ('to dread'), craindre ('to fear'), avoir peur ('to be afraid') has its particular value only because they stand in contrast with one another. No word has a value that can be identified independently of what else is in its vicinity.[10]

and may go back to earlier Indian views on language, especially the Nyaya view of words as indicators and not carriers of meaning.[11]

An attempt to defend a system based on propositional meaning for semantic underspecification can be found in the generative lexicon model of James Pustejovsky, who extends contextual operations (based on type shifting) into the lexicon. Thus meanings are generated “on the fly” (as you go), based on finite context.

Page 124: Formal Semantics (Logic)

114 CHAPTER 17. SEMANTICS

17.4 Prototype theory

Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch in the 1970s led to a view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members. One may compare it with Jung's archetype, though the concept of archetype sticks to a static concept. Some post-structuralists are against the fixed or static meaning of words. Derrida, following Nietzsche, talked about slippages in fixed meanings.

Systems of categories are not objectively out there in the world but are rooted in people’s experience. These categories evolve as learned concepts of the world – meaning is not an objective truth, but a subjective construct, learned from experience, and language arises out of the “grounding of our conceptual systems in shared embodiment and bodily experience”.[12] A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate (see the Sapir–Whorf hypothesis or Eskimo words for snow).

17.5 Theories in semantics

17.5.1 Model theoretic semantics

Main article: formal semantics (linguistics)

Originates from Montague’s work (see above). A highly formalized theory of natural language semantics in which expressions are assigned denotations (meanings) such as individuals, truth values, or functions from one of these to another. The truth of a sentence, and more interestingly, its logical relation to other sentences, is then evaluated relative to a model.

17.5.2 Formal (or truth-conditional) semantics

Main article: truth-conditional semantics

Pioneered by the philosopher Donald Davidson, another formalized theory, which aims to associate each natural language sentence with a meta-language description of the conditions under which it is true, for example: 'Snow is white' is true if and only if snow is white. The challenge is to arrive at the truth conditions for any sentence from fixed meanings assigned to the individual words and fixed rules for how to combine them. In practice, truth-conditional semantics is similar to model-theoretic semantics; conceptually, however, they differ in that truth-conditional semantics seeks to connect language with statements about the real world (in the form of meta-language statements), rather than with abstract models.

17.5.3 Lexical and conceptual semantics

Main article: conceptual semantics

This theory is an effort to explain properties of argument structure. The assumption behind this theory is that syntactic properties of phrases reflect the meanings of the words that head them.[13] With this theory, linguists can better deal with the fact that subtle differences in word meaning correlate with other differences in the syntactic structure that the word appears in.[13] The way this is gone about is by looking at the internal structure of words.[14] These small parts that make up the internal structure of words are termed semantic primitives.[14]

17.5.4 Lexical semantics

Main article: lexical semantics


A linguistic theory that investigates word meaning. This theory understands that the meaning of a word is fully reflected by its context. Here, the meaning of a word is constituted by its contextual relations.[15] Therefore, a distinction between degrees of participation as well as modes of participation is made.[15] In order to accomplish this distinction any part of a sentence that bears a meaning and combines with the meanings of other constituents is labeled as a semantic constituent. Semantic constituents that cannot be broken down into more elementary constituents are labeled minimal semantic constituents.[15]

17.5.5 Computational semantics

Main article: computational semantics

Computational semantics is focused on the processing of linguistic meaning. In order to do this, concrete algorithms and architectures are described. Within this framework the algorithms and architectures are also analyzed in terms of decidability, time/space complexity, data structures they require and communication protocols.[16]

17.6 Computer science

Main article: Semantics (computer science)

In computer science, the term semantics refers to the meaning of languages, as opposed to their form (syntax). According to Euzenat, semantics “provides the rules for interpreting the syntax which do not provide the meaning directly but constrains the possible interpretations of what is declared.”[17] In other words, semantics is about interpretation of an expression. Additionally, the term is applied to certain types of data structures specifically designed and used for representing information content.

17.6.1 Programming languages

The semantics of programming languages and other languages is an important issue and area of study in computer science. Like the syntax of a language, its semantics can be defined exactly.

For instance, statements in different languages can use different syntaxes but cause the same instructions to be executed: x += y; in C or Java, x := x + y in Pascal, and LET X = X + Y in BASIC all perform an arithmetical addition of 'y' to 'x' and store the result in a variable called 'x'.

Various ways have been developed to describe the semantics of programming languages formally, building on mathematical logic:[18]

• Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced (a minimal evaluator in this style is sketched after this list).

• Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.

• Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
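As a minimal sketch of the operational style (not from the article; the abstract syntax and helper names are invented), the following Python evaluator gives the meaning of an assignment as the state transformation it induces:

def eval_expr(expr, state):
    if isinstance(expr, int):          # numeral
        return expr
    if isinstance(expr, str):          # variable
        return state[expr]
    op, lhs, rhs = expr                # compound expression, e.g. ('add', e1, e2)
    if op == 'add':
        return eval_expr(lhs, state) + eval_expr(rhs, state)
    raise ValueError('unknown operator: %s' % op)

def exec_stmt(stmt, state):
    var, expr = stmt                   # ('x', e)  meaning  x := e
    new_state = dict(state)
    new_state[var] = eval_expr(expr, state)
    return new_state

# Both 'x += y' and 'x := x + y' would be parsed to the same abstract statement,
# whose operational meaning is the same state transformation:
print(exec_stmt(('x', ('add', 'x', 'y')), {'x': 3, 'y': 4}))   # {'x': 7, 'y': 4}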

17.6.2 Semantic models

Terms such as semantic network and semantic data model are used to describe particular types of data model characterized by the use of directed graphs in which the vertices denote concepts or entities in the world, and the arcs denote relationships between them.

The Semantic Web refers to the extension of the World Wide Web via embedding added semantic metadata, using semantic data modelling techniques such as Resource Description Framework (RDF) and Web Ontology Language (OWL).
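The underlying data model is essentially a labelled directed graph. A tiny Python sketch (not from the article; the triples are invented examples in the spirit of RDF) can represent it with subject–predicate–object tuples:

# Each triple is an arc: (subject vertex, relationship label, object vertex).
triples = {
    ('Socrates', 'instance_of', 'Human'),
    ('Human', 'subclass_of', 'Mortal'),
}

def related(subject, predicate):
    """Return every object reachable from `subject` via an arc labelled `predicate`."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

print(related('Socrates', 'instance_of'))   # {'Human'}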


17.7 Psychology

In psychology, semantic memory is memory for meaning – in other words, the aspect of memory that preserves only the gist, the general significance, of remembered experience – while episodic memory is memory for the ephemeral details – the individual features, or the unique particulars of experience. The term 'episodic memory' was introduced by Tulving and Schacter in the context of 'declarative memory', which involved simple association of factual or objective information concerning its object. Word meaning is measured by the company words keep, i.e. the relationships among words themselves in a semantic network. The memories may be transferred intergenerationally or isolated in one generation due to a cultural disruption. Different generations may have different experiences at similar points in their own time-lines. This may then create a vertically heterogeneous semantic net for certain words in an otherwise homogeneous culture.[19] In a network created by people analyzing their understanding of the word (such as WordNet) the links and decomposition structures of the network are few in number and kind, and include part of, kind of, and similar links. In automated ontologies the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines as well as natural language processing, neural networks and predicate calculus techniques.

Ideasthesia is a psychological phenomenon in which activation of concepts evokes sensory experiences. For example, in synesthesia, activation of a concept of a letter (e.g., that of the letter A) evokes sensory-like experiences (e.g., of red color).

17.8 See also

17.8.1 Linguistics and semiotics

• Asemic writing

• Cognitive semantics

• Colorless green ideas sleep furiously

• Computational semantics

• Discourse representation theory

• General semantics

• Generative semantics

• Hermeneutics

• Natural semantic metalanguage

• Onomasiology

• Phono-semantic matching

• Pragmatic maxim

• Pragmaticism

• Pragmatism

• Problem of universals

• Semantic change or progression

• Semantic class

• Semantic feature

• Semantic field

• Semantic lexicon


• Semantic primes

• Semantic property

• Sememe

• Semiosis

• Semiotics

• SPL notation

17.8.2 Logic and mathematics

• Formal logic

• Game semantics

• Model theory

• Gödel’s incompleteness theorems

• Proof-theoretic semantics

• Semantic consequence

• Semantic theory of truth

• Semantics of logic

• Truth-value semantics

17.8.3 Computer science

• Formal semantics of programming languages

• Knowledge representation

• Semantic networks

• Semantic transversal

• Semantic analysis

• Semantic compression

• Semantic HTML

• Semantic integration

• Semantic interpretation

• Semantic link

• Semantic reasoner

• Semantic service oriented architecture

• Semantic spectrum

• Semantic unification

• Semantic Web

17.8.4 Psychology

• Ideasthesia


17.9 References

[1] σημαντικός. Liddell, Henry George; Scott, Robert; A Greek–English Lexicon at the Perseus Project

[2] The word is derived from the Ancient Greek word σημαντικός (semantikos), “related to meaning, significant”, from σημαίνω semaino, “to signify, to indicate”, which is from σῆμα sema, “sign, mark, token”. The plural is used in analogy with words similar to physics, which was in the neuter plural in Ancient Greek and meant “things relating to nature”.

[3] Neurath, Otto; Carnap, Rudolf; Morris, Charles F. W. (Editors) (1955). International Encyclopedia of Unified Science. Chicago, IL: University of Chicago Press.

[4] Cruse, Alan; Meaning and Language: An introduction to Semantics and Pragmatics, Chapter 1, Oxford Textbooks in Linguistics, 2004; Kearns, Kate; Semantics, Palgrave MacMillan 2000; Cruse, D. A.; Lexical Semantics, Cambridge, MA, 1986.

[5] Kitcher, Philip; Salmon, Wesley C. (1989). Scientific Explanation. Minneapolis, MN: University of Minnesota Press. p. 35.

[6] Barsalou, L.; Perceptual Symbol Systems, Behavioral and Brain Sciences, 22(4), 1999

[7] Langacker, Ronald W. (1999). Grammar and Conceptualization. Berlin/New York: Mouton de Gruyter. ISBN 3-11-016603-8.

[8] Peregrin, Jaroslav (2003). Meaning: The Dynamic Turn. Current Research in the Semantics/Pragmatics Interface. London: Elsevier.

[9] Gärdenfors, Peter (2000). Conceptual Spaces: The Geometry of Thought. MIT Press/Bradford Books. ISBN 978-0-585-22837-2.

[10] de Saussure, Ferdinand (1916). The Course of General Linguistics (Cours de linguistique générale).

[11] Matilal, Bimal Krishna (1990). The Word and the World: India’s Contribution to the Study of Language. Oxford. The Nyaya and Mimamsa schools in Indian vyākaraṇa tradition conducted a centuries-long debate on whether sentence meaning arises through composition on word meanings, which are primary; or whether word meanings are obtained through analysis of sentences where they appear. (Chapter 8).

[12] Lakoff, George; Johnson, Mark (1999). Philosophy in the Flesh: The embodied mind and its challenge to Western thought. Chapter 1. New York, NY: Basic Books. OCLC 93961754.

[13] Levin, Beth; Pinker, Steven; Lexical & Conceptual Semantics, Blackwell, Cambridge, MA, 1991

[14] Jackendoff, Ray; Semantic Structures, MIT Press, Cambridge, MA, 1990

[15] Cruse, D.; Lexical Semantics, Cambridge University Press, Cambridge, MA, 1986

[16] Nerbonne, J.; The Handbook of Contemporary Semantic Theory (ed. Lappin, S.), Blackwell Publishing, Cambridge, MA, 1996

[17] Euzenat, Jerome. Ontology Matching. Springer-Verlag Berlin Heidelberg, 2007, p. 36

[18] Nielson, Hanne Riis; Nielson, Flemming (1995). Semantics with Applications, A Formal Introduction (1st ed.). Chichester, England: John Wiley & Sons. ISBN 0-471-92980-8.

[19] Giannini, A. J.; Semiotic and Semantic Implications of “Authenticity”, Psychological Reports, 106(2):611–612, 2010

17.10 External links

• semanticsarchive.net

• Teaching page for A-level semantics

• Chomsky, Noam; On Referring, Harvard University, 30 October 2007 (video)

• Jackendoff, Ray; Conceptual Semantics, Harvard University, 13 November 2007 (video)

• Semantics: an interview with Jerry Fodor (ReVEL, vol. 5, no. 8 (2007))


Chapter 18

Symbol (formal)

For other uses see Symbol (disambiguation)

[Figure: nested sets – symbols and strings of symbols ⊇ well-formed formulas ⊇ theorems]

This diagram shows the syntactic entities that may be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

A logical symbol is a fundamental concept in logic, tokens of which may be marks or a configuration of marks which form a particular pattern. Although the term “symbol” in common use refers at some times to the idea being symbolized, and at other times to the marks on a piece of paper or chalkboard which are being used to express that idea; in the formal languages studied in mathematics and logic, the term “symbol” refers to the idea, and the marks are considered to be a token instance of the symbol.


In logic, symbols build literal utility to illustrate ideas. Symbols of a formal language need not be symbols of anything. For instance there are logical constants which do not refer to any idea, but rather serve as a form of punctuation in the language (e.g. parentheses). Symbols of a formal language must be capable of being specified without any reference to any interpretation of them.

A symbol or string of symbols may comprise a well-formed formula if it is consistent with the formation rules of the language.

In a formal system a symbol may be used as a token in formal operations. The set of formal symbols in a formal language is referred to as an alphabet (hence each symbol may be referred to as a “letter”).[1]

A formal symbol as used in first-order logic may be a variable (member from a universe of discourse), a constant, a function (mapping to another member of the universe) or a predicate (mapping to T/F).

Formal symbols are usually thought of as purely syntactic structures, composed into larger structures using a formal grammar, though sometimes they may be associated with an interpretation or model (a formal semantics).

18.1 Can words be modeled as formal symbols?

The move to view units in natural language (e.g. English) as formal symbols was initiated by Noam Chomsky (it was this work that resulted in the Chomsky hierarchy in formal languages). The generative grammar model looked upon syntax as autonomous from semantics. Building on these models, the logician Richard Montague proposed that semantics could also be constructed on top of the formal structure:

There is in my opinion no important theoretical difference between natural languages and the artificial languages of logicians; indeed, I consider it possible to comprehend the syntax and semantics of both kinds of language within a single natural and mathematically precise theory. On this point I differ from a number of philosophers, but agree, I believe, with Chomsky and his associates.[2]

This is the philosophical premise underlying Montague grammar.

However, this attempt to equate linguistic symbols with formal symbols has been challenged widely, particularly in the tradition of cognitive linguistics, by philosophers like Stevan Harnad, and linguists like George Lakoff and Ronald Langacker.

18.2 References

[1] John Hopcroft, Rajeev Motwani and Jeffrey Ullman, Introduction to Automata Theory, Languages, and Computation, 2000

[2] Richard Montague, Universal Grammar, 1970

18.3 See also

• List of mathematical symbols


Chapter 19

Syntax (logic)

[Figure: nested sets – symbols and strings of symbols ⊇ well-formed formulas ⊇ theorems]

This diagram shows the syntactic entities which may be constructed from formal languages.[1] The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language is identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

In logic, syntax is anything having to do with formal languages or formal systems without regard to any interpretation or meaning given to them. Syntax is concerned with the rules used for constructing, or transforming the symbols and words of a language, as contrasted with the semantics of a language which is concerned with its meaning.

The symbols, formulas, systems, theorems, proofs, and interpretations expressed in formal languages are syntactic entities whose properties may be studied without regard to any meaning they may be given, and, in fact, need not be given any.


Syntax is usually associated with the rules (or grammar) governing the composition of texts in a formal language that constitute the well-formed formulas of a formal system.

In computer science, the term syntax refers to the rules governing the composition of meaningful texts in a formal language, such as a programming language, that is, those texts for which it makes sense to define the semantics or meaning, or otherwise provide an interpretation.[2]

19.1 Syntactic entities

19.1.1 Symbols

Main article: Symbol (formal)

A symbol is an idea, abstraction or concept, tokens of which may be marks or a configuration of marks which form a particular pattern. Symbols of a formal language need not be symbols of anything. For instance there are logical constants which do not refer to any idea, but rather serve as a form of punctuation in the language (e.g. parentheses). A symbol or string of symbols may comprise a well-formed formula if the formulation is consistent with the formation rules of the language. Symbols of a formal language must be capable of being specified without any reference to any interpretation of them.

19.1.2 Formal language

Main article: Formal language

A formal language is a syntactic entity which consists of a set of finite strings of symbols which are its words (usually called its well-formed formulas). Which strings of symbols are words is determined by fiat by the creator of the language, usually by specifying a set of formation rules. Such a language can be defined without reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it – that is, before it has any meaning.

19.1.3 Formation rules

Main article: Formation rule

Formation rules are a precise description of which strings of symbols are the well-formed formulas of a formal language. It is synonymous with the set of strings over the alphabet of the formal language which constitute well-formed formulas. However, it does not describe their semantics (i.e. what they mean).

19.1.4 Propositions

Main article: Proposition

A proposition is a sentence expressing something true or false. A proposition is identified ontologically as an idea, concept or abstraction whose token instances are patterns of symbols, marks, sounds, or strings of words.[3] Propositions are considered to be syntactic entities and also truthbearers.

19.1.5 Formal theories

Main article: Theory (mathematical logic)


A formal theory is a set of sentences in a formal language.

19.1.6 Formal systems

Main article: Formal system

A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions. Formal systems, like other syntactic entities, may be defined without any interpretation given to them (as being, for instance, a system of arithmetic).

Syntactic consequence within a formal system

A formula A is a syntactic consequence[4][5][6][7] within some formal system FS of a set Γ of formulas if there is a derivation in formal system FS of A from the set Γ.

Γ ⊢FS A

Syntactic consequence does not depend on any interpretation of the formal system.[8]
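As a simple illustration (not part of the article), consider a formal system whose language is that of propositional logic and whose only inference rule is modus ponens. Then {P, P → Q} ⊢ Q, since Q is the last line of the one-step derivation that applies modus ponens to the two members of the set; the claim makes no appeal to truth values or interpretations.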

Syntactic completeness of a formal system

Main article: Completeness (logic)

A formal system S is syntactically complete[9][10][11][12] (also deductively complete, maximally complete, negation complete or simply complete) iff for each formula A of the language of the system either A or ¬A is a theorem of S. In another sense, a formal system is syntactically complete iff no unprovable axiom can be added to it as an axiom without introducing an inconsistency. Truth-functional propositional logic and first-order predicate logic are semantically complete, but not syntactically complete (for example, the propositional logic statement consisting of a single variable “a” is not a theorem, and neither is its negation; but neither of these is a tautology). Gödel’s incompleteness theorem shows that no recursive system that is sufficiently powerful, such as the Peano axioms, can be both consistent and complete.

19.1.7 Interpretations

Main articles: Formal semantics (logic) and Interpretation (logic)

An interpretation of a formal system is the assignment of meanings to the symbols, and truth values to the sentences of a formal system. The study of interpretations is called formal semantics. Giving an interpretation is synonymous with constructing a model. An interpretation is expressed in a metalanguage, which may itself be a formal language, and as such itself is a syntactic entity.

19.2 References

[1] Dictionary Definition

[2] Abstract Syntax and Logic Programming

[3] Metalogic, Geoffrey Hunter

[4] Dummett, M. (1981). Frege: Philosophy of Language. Harvard University Press. p. 82. ISBN 9780674319318. Retrieved 2014-10-15.


[5] Lear, J. (1986). Aristotle and Logical Theory. Cambridge University Press. p. 1. ISBN 9780521311786. Retrieved 2014-10-15.

[6] Creath, R.; Friedman, M. (2007). The Cambridge Companion to Carnap. Cambridge University Press. p. 189. ISBN 9780521840156. Retrieved 2014-10-15.

[7] “syntactic consequence from FOLDOC”. swif.uniba.it. Retrieved 2014-10-15.

[8] Hunter, Geoffrey, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, University of California Press, 1971, p. 75.

[9] “A Note on Interaction and Incompleteness” (PDF). Retrieved 2014-10-15.

[10] “Normal forms and syntactic completeness proofs for functional independencies”. portal.acm.org. Retrieved 2014-10-15.

[11] Barwise, J. (1982). Handbook of Mathematical Logic. Elsevier Science. p. 236. ISBN 9780080933641. Retrieved 2014-10-15.

[12] “syntactic completeness from FOLDOC”. swif.uniba.it. Retrieved 2014-10-15.

19.3 See also

• Symbol (formal)

• Formation rule

• Formal grammar

• Syntax (linguistics)

• Syntax (programming languages)

• Mathematical logic

• Well-formed formula


Chapter 20

Theorem

For the Italian film, see Teorema (film).

In mathematics, a theorem is a statement that has been proven on the basis of previously established statements, such as other theorems—and generally accepted statements, such as axioms. The proof of a mathematical theorem is a logical argument for the theorem statement given in accord with the rules of a deductive system. The proof of a theorem is often interpreted as justification of the truth of the theorem statement. In light of the requirement that theorems be proved, the concept of a theorem is fundamentally deductive, in contrast to the notion of a scientific theory, which is empirical.[2]

Many mathematical theorems are conditional statements. In this case, the proof deduces the conclusion from conditions called hypotheses or premises. In light of the interpretation of proof as justification of truth, the conclusion is often viewed as a necessary consequence of the hypotheses, namely, that the conclusion is true in case the hypotheses are true, without any further assumptions. However, the conditional could be interpreted differently in certain deductive systems, depending on the meanings assigned to the derivation rules and the conditional symbol.

Although they can be written in a completely symbolic form, for example, within the propositional calculus, theorems are often expressed in a natural language such as English. The same is true of proofs, which are often expressed as logically organized and clearly worded informal arguments, intended to convince readers of the truth of the statement of the theorem beyond any doubt, and from which a formal symbolic proof can in principle be constructed. Such arguments are typically easier to check than purely symbolic ones—indeed, many mathematicians would express a preference for a proof that not only demonstrates the validity of a theorem, but also explains in some way why it is obviously true. In some cases, a picture alone may be sufficient to prove a theorem. Because theorems lie at the core of mathematics, they are also central to its aesthetics. Theorems are often described as being “trivial”, or “difficult”, or “deep”, or even “beautiful”. These subjective judgments vary not only from person to person, but also with time: for example, as a proof is simplified or better understood, a theorem that was once difficult may become trivial. On the other hand, a deep theorem may be simply stated, but its proof may involve surprising and subtle connections between disparate areas of mathematics. Fermat’s Last Theorem is a particularly well-known example of such a theorem.

20.1 Informal account of theorems

Logically, many theorems are of the form of an indicative conditional: if A, then B. Such a theorem does not assert B, only that B is a necessary consequence of A. In this case A is called the hypothesis of the theorem (note that “hypothesis” here is something very different from a conjecture) and B the conclusion (formally, A and B are termed the antecedent and consequent). The theorem “If n is an even natural number then n/2 is a natural number” is a typical example in which the hypothesis is “n is an even natural number” and the conclusion is “n/2 is also a natural number”.

To be proven, a theorem must be expressible as a precise, formal statement. Nevertheless, theorems are usually expressed in natural language rather than in a completely symbolic form, with the intention that the reader can produce a formal statement from the informal one.

It is common in mathematics to choose a number of hypotheses within a given language and declare that the theory consists of all statements provable from these hypotheses. These hypotheses form the foundational basis of the theory and are called axioms or postulates. The field of mathematics known as proof theory studies formal languages, axioms and the structure of proofs.


Some theorems are “trivial”, in the sense that they follow from definitions, axioms, and other theorems in obvious ways and do not contain any surprising insights. Some, on the other hand, may be called “deep”, because their proofs may be long and difficult, involve areas of mathematics superficially distinct from the statement of the theorem itself, or show surprising connections between disparate areas of mathematics.[3] A theorem might be simple to state and yet be deep. An excellent example is Fermat’s Last Theorem, and there are many other examples of simple yet deep theorems in number theory and combinatorics, among other areas.

Other theorems have a known proof that cannot easily be written down. The most prominent examples are the four color theorem and the Kepler conjecture. Both of these theorems are only known to be true by reducing them to a computational search that is then verified by a computer program. Initially, many mathematicians did not accept this form of proof, but it has become more widely accepted. The mathematician Doron Zeilberger has even gone so far as to claim that these are possibly the only nontrivial results that mathematicians have ever proved.[4] Many mathematical theorems can be reduced to more straightforward computation, including polynomial identities, trigonometric identities and hypergeometric identities.[5]

20.2 Provability and theoremhood

To establish a mathematical statement as a theorem, a proof is required, that is, a line of reasoning from axioms in the system (and other, already established theorems) to the given statement must be demonstrated. However, the proof is usually considered as separate from the theorem statement. Although more than one proof may be known for a single theorem, only one proof is required to establish the status of a statement as a theorem. The Pythagorean theorem and the law of quadratic reciprocity are contenders for the title of theorem with the greatest number of distinct proofs.

20.3 Relation with scientific theories

Theorems in mathematics and theories in science are fundamentally different in their epistemology. A scientific theory cannot be proven; its key attribute is that it is falsifiable, that is, it makes predictions about the natural world that are testable by experiments. Any disagreement between prediction and experiment demonstrates the incorrectness of the scientific theory, or at least limits its accuracy or domain of validity. Mathematical theorems, on the other hand, are purely abstract formal statements: the proof of a theorem cannot involve experiments or other empirical evidence in the same way such evidence is used to support scientific theories.

Nonetheless, there is some degree of empiricism and data collection involved in the discovery of mathematical theorems. By establishing a pattern, sometimes with the use of a powerful computer, mathematicians may have an idea of what to prove, and in some cases even a plan for how to set about doing the proof. For example, the Collatz conjecture has been verified for start values up to about 2.88 × 10^18. The Riemann hypothesis has been verified for the first 10 trillion zeroes of the zeta function. Neither of these statements is considered proven.

Such evidence does not constitute proof. For example, the Mertens conjecture is a statement about natural numbers that is now known to be false, but no explicit counterexample (i.e., a natural number n for which the Mertens function M(n) equals or exceeds the square root of n) is known: all numbers less than 10^14 have the Mertens property, and the smallest number that does not have this property is only known to be less than the exponential of 1.59 × 10^40, which is approximately 10 to the power 4.3 × 10^39. Since the number of particles in the universe is generally considered less than 10 to the power 100 (a googol), there is no hope to find an explicit counterexample by exhaustive search.

Note that the word “theory” also exists in mathematics, to denote a body of mathematical axioms, definitions and theorems, as in, for example, group theory. There are also “theorems” in science, particularly physics, and in engineering, but they often have statements and proofs in which physical assumptions and intuition play an important role; the physical axioms on which such “theorems” are based are themselves falsifiable.
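In the same spirit as the verifications mentioned above, a few lines of Python (a sketch, not from the article) can check the Collatz conjecture for small start values; however many values are checked, this remains evidence rather than proof:

def reaches_one(n, max_steps=10_000):
    """Iterate the Collatz map from n and report whether 1 is reached."""
    steps = 0
    while n != 1 and steps < max_steps:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return n == 1

# Empirical check for start values 1..99_999 (far short of 2.88 × 10^18):
assert all(reaches_one(n) for n in range(1, 100_000))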

20.4 Terminology

A number of different terms for mathematical statements exist; these terms indicate the role statements play in a particular subject. The distinction between different terms is sometimes rather arbitrary, and the usage of some terms has evolved over time.


• An axiom or postulate is a statement that is accepted without proof and regarded as fundamental to a subject. Historically these have been regarded as “self-evident”, but more recently they are considered assumptions that characterize the subject of study. In classical geometry, axioms are general statements, while postulates are statements about geometrical objects.[6] A definition is also accepted without proof since it simply gives the meaning of a word or phrase in terms of known concepts.

• An unproven statement that is believed true is called a conjecture (or sometimes a hypothesis, but with a different meaning from the one discussed above). To be considered a conjecture, a statement must usually be proposed publicly, at which point the name of the proponent may be attached to the conjecture, as with Goldbach’s conjecture. Other famous conjectures include the Collatz conjecture and the Riemann hypothesis. On the other hand, Fermat’s last theorem has always been known by that name, even before it was proven; it was never known as “Fermat’s conjecture”.

• A proposition is a theorem of no particular importance. This term sometimes connotes a statement with a simple proof, while the term theorem is usually reserved for the most important results or those with long or difficult proofs. In classical geometry, a proposition may be a construction that satisfies given requirements; for example, Proposition 1 in Book I of Euclid’s Elements is the construction of an equilateral triangle.[7]

• A lemma is a “helping theorem”, a proposition with little applicability except that it forms part of the proof of a larger theorem. In some cases, as the relative importance of different theorems becomes more clear, what was once considered a lemma is now considered a theorem, though the word “lemma” remains in the name. Examples include Gauss’s lemma, Zorn’s lemma, and the Fundamental lemma.

• A corollary is a proposition that follows with little or no proof from another theorem or definition.[8]

• A converse of a theorem is a statement formed by interchanging what is given in a theorem and what is to be proved. For example, the isosceles triangle theorem states that if two sides of a triangle are equal then two angles are equal. In the converse, the given (that two sides are equal) and what is to be proved (that two angles are equal) are swapped, so the converse is the statement that if two angles of a triangle are equal then two sides are equal. In this example, the converse can be proven as another theorem, but this is often not the case. For example, the converse to the theorem that two right angles are equal angles is the statement that two equal angles must be right angles, and this is clearly not always the case.[9]

• A generalization is a theorem which includes a previously proven theorem as a special case and hence as a corollary.

There are other terms, less commonly used, that are conventionally attached to proven statements, so that certain theorems are referred to by historical or customary names. For example:

• An identity is an equality, contained in a theorem, between two mathematical expressions that holds regardless of what values are used for any variables or parameters appearing in the expressions. Examples include Euler’s formula and Vandermonde’s identity.

• A rule is a theorem, such as Bayes’ rule and Cramer’s rule, that establishes a useful formula.

• A law or a principle is a theorem that applies in a wide range of circumstances. Examples include the law of large numbers, the law of cosines, Kolmogorov’s zero-one law, Harnack’s principle, the least upper bound principle, and the pigeonhole principle.[10]

A few well-known theorems have even more idiosyncratic names. The division algorithm (see Euclidean division) is a theorem expressing the outcome of division in the natural numbers and more general rings. Bézout’s identity is a theorem asserting that the greatest common divisor of two numbers may be written as a linear combination of these numbers. The Banach–Tarski paradox is a theorem in measure theory that is paradoxical in the sense that it contradicts common intuitions about volume in three-dimensional space.


20.5 Layout

A theorem and its proof are typically laid out as follows:

Theorem (name of person who proved it and year of discovery, proof or publication).
Statement of theorem (sometimes called the proposition).
Proof.
Description of proof.

End mark.

The end of the proof may be signalled by the letters Q.E.D. (quod erat demonstrandum) or by one of the tombstone marks "□" or "∎" meaning “End of Proof”, introduced by Paul Halmos following their usage in magazine articles. The exact style depends on the author or publication. Many publications provide instructions or macros for typesetting in the house style.

It is common for a theorem to be preceded by definitions describing the exact meaning of the terms used in the theorem. It is also common for a theorem to be preceded by a number of propositions or lemmas which are then used in the proof. However, lemmas are sometimes embedded in the proof of a theorem, either with nested proofs, or with their proofs presented after the proof of the theorem.

Corollaries to a theorem are either presented between the theorem and the proof, or directly after the proof. Sometimes, corollaries have proofs of their own that explain why they follow from the theorem.

20.6 Lore

It has been estimated that over a quarter of a million theorems are proved every year.[11]

The well-known aphorism, “A mathematician is a device for turning coffee into theorems”, is probably due to Alfréd Rényi, although it is often attributed to Rényi’s colleague Paul Erdős (and Rényi may have been thinking of Erdős), who was famous for the many theorems he produced, the number of his collaborations, and his coffee drinking.[12]

The classification of finite simple groups is regarded by some to be the longest proof of a theorem. It comprises tens of thousands of pages in 500 journal articles by some 100 authors. These papers are together believed to give a complete proof, and several ongoing projects hope to shorten and simplify this proof.[13] Another theorem of this type is the four color theorem, whose computer-generated proof is too long for a human to read. It is certainly the longest known proof of a theorem whose statement can be easily understood by a layman.

20.7 Theorems in logic

Logic, especially in the field of proof theory, considers theorems as statements (called formulas or well-formed formulas) of a formal language. The statements of the language are strings of symbols and may be broadly divided into nonsense and well-formed formulas. A set of deduction rules, also called transformation rules or rules of inference, must be provided. These deduction rules tell exactly when a formula can be derived from a set of premises. The set of well-formed formulas may be broadly divided into theorems and non-theorems. However, according to Hofstadter, a formal system often simply defines all its well-formed formulas as theorems.[14]

Different sets of derivation rules give rise to different interpretations of what it means for an expression to be a theorem. Some derivation rules and formal languages are intended to capture mathematical reasoning; the most common examples use first-order logic. Other deductive systems describe term rewriting, such as the reduction rules for λ calculus.

The definition of theorems as elements of a formal language allows for results in proof theory that study the structure of formal proofs and the structure of provable formulas. The most famous result is Gödel’s incompleteness theorem; by representing theorems about basic number theory as expressions in a formal language, and then representing this language within number theory itself, Gödel constructed examples of statements that are neither provable nor disprovable from axiomatizations of number theory.


A theorem may be expressed in a formal language (or “formalized”). A formal theorem is the purely formal analogue of a theorem. In general, a formal theorem is a type of well-formed formula that satisfies certain logical and syntactic conditions. The notation ⊢ S is often used to indicate that S is a theorem.

Formal theorems consist of formulas of a formal language and the transformation rules of a formal system. Specifically, a formal theorem is always the last formula of a derivation in some formal system, each formula of which is a logical consequence of the formulas that came before it in the derivation. The initially accepted formulas in the derivation are called its axioms, and are the basis on which the theorem is derived. A set of theorems is called a theory.

What makes formal theorems useful and of interest is that they can be interpreted as true propositions and their derivations may be interpreted as a proof of the truth of the resulting expression. A set of formal theorems may be referred to as a formal theory. A theorem whose interpretation is a true statement about a formal system is called a metatheorem.

20.7.1 Syntax and semantics

Main articles: Syntax (logic) and Formal semantics (logic)

The concept of a formal theorem is fundamentally syntactic, in contrast to the notion of a true proposition, which introduces semantics. Different deductive systems can yield other interpretations, depending on the presumptions of the derivation rules (i.e. belief, justification or other modalities). The soundness of a formal system depends on whether or not all of its theorems are also validities. A validity is a formula that is true under any possible interpretation, e.g. in classical propositional logic validities are tautologies. A formal system is considered semantically complete when all of its tautologies are also theorems.
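For instance, in classical propositional logic a validity can be recognized by brute force over all interpretations; the following Python sketch (not from the article; Peirce's law is used only as an example) checks a formula against every truth-value assignment:

from itertools import product

def is_tautology(formula, variables):
    """True if the formula is true under every assignment of truth values."""
    return all(formula(dict(zip(variables, vals)))
               for vals in product([False, True], repeat=len(variables)))

# Peirce's law ((A -> B) -> A) -> A, with x -> y encoded as (not x) or y:
peirce = lambda v: (not ((not ((not v['A']) or v['B'])) or v['A'])) or v['A']
print(is_tautology(peirce, ['A', 'B']))   # True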

20.7.2 Derivation of a theorem

Main article: Formal proof

The notion of a theorem is very closely connected to its formal proof (also called a “derivation”). To illustrate how derivations are done, we will work in a very simplified formal system. Let us call ours FS. Its alphabet consists only of two symbols { A, B } and its formation rule for formulas is:

Any string of symbols of FS which is at least three symbols long, and which is not infinitely long, is a formula of FS. Nothing else is a formula of FS.

The single axiom of FS is:

ABBA.

The only rule of inference (transformation rule) for FS is:

Any occurrence of “A” in a theorem may be replaced by an occurrence of the string “AB” and the result is a theorem.

Theorems in FS are defined as those formulae that have a derivation ending with that formula. For example

1. ABBA (Given as axiom)

2. ABBBA (by applying the transformation rule)

3. ABBBAB (by applying the transformation rule)

is a derivation. Therefore, “ABBBAB” is a theorem of FS. The notion of truth (or falsity) cannot be applied to the formula “ABBBAB” until an interpretation is given to its symbols. Thus in this example, the formula does not yet represent a proposition, but is merely an empty abstraction.

Two metatheorems of FS are:


Every theorem begins with “A”.
Every theorem has exactly two “A”s.
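Because FS is so small, its theorems up to a given length can be enumerated mechanically. The following Python sketch (not from the article; the function names are invented) regenerates the derivation above and checks both metatheorems on the enumerated theorems:

def successors(theorem):
    """All results of replacing one occurrence of 'A' by 'AB'."""
    return {theorem[:i] + 'AB' + theorem[i + 1:]
            for i, ch in enumerate(theorem) if ch == 'A'}

def theorems_up_to(max_len):
    """Enumerate every theorem of FS whose length is at most max_len."""
    found, frontier = set(), {'ABBA'}          # start from the single axiom
    while frontier:
        found |= frontier
        frontier = {t for s in frontier for t in successors(s)
                    if len(t) <= max_len and t not in found}
    return found

ts = theorems_up_to(8)
print('ABBBAB' in ts)                          # True: matches the derivation above
# Both metatheorems hold for every enumerated theorem:
assert all(t.startswith('A') and t.count('A') == 2 for t in ts)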

20.7.3 Interpretation of a formal theorem

Main article: Interpretation (logic)

20.7.4 Theorems and theories

Main articles: Theory and Theory (mathematical logic)

20.8 See also

• Inference

• List of theorems

• Toy theorem

• Metamath – a language for developing strictly formalized mathematical definitions and proofs accompanied by a proof checker for this language and a growing database of thousands of proved theorems

20.9 Notes

[1] For full text of 2nd edition of 1940, see Elisha Scott Loomis. “The Pythagorean proposition: its demonstrations analyzed and classified, and bibliography of sources for data of the four kinds of proofs” (PDF). Education Resources Information Center. Institute of Education Sciences (IES) of the U.S. Department of Education. Retrieved 2010-09-26. Originally published in 1940 and reprinted in 1968 by National Council of Teachers of Mathematics.

[2] However, both theorems and theories are investigations. See Heath 1897 Introduction, The terminology of Archimedes, p. clxxxii: “theorem (θεώρημα) from θεωρεῖν, to investigate”

[3] Weisstein, Eric W., “Deep Theorem”, MathWorld.

[4] Doron Zeilberger. “Opinion 51”.

[5] Petkovsek et al. 1996.

[6] Wentworth, G.; Smith, D.E. (1913). “Art. 46, 47”. Plane Geometry. Ginn & Co.

[7] Wentworth & Smith Art. 50

[8] Wentworth & Smith Art. 51

[9] Follows Wentworth & Smith Art. 79

[10] The word law can also refer to an axiom, a rule of inference, or, in probability theory, a probability distribution.

[11] Hoffman 1998, p. 204.

[12] Hoffman 1998, p. 7.

[13] An enormous theorem: the classification of finite simple groups, Richard Elwes, Plus Magazine, Issue 41 December 2006.

[14] Hofstadter 1980


20.10 References

• Heath, Sir Thomas Little (1897). The works of Archimedes. Dover. Retrieved 2009-11-15.

• Hoffman, P. (1998). The Man Who Loved Only Numbers: The Story of Paul Erdős and the Search for Mathematical Truth. Hyperion, New York. ISBN 1-85702-829-5.

• Hofstadter, Douglas (1979). Gödel, Escher, Bach: An Eternal Golden Braid. Basic Books.

• Hunter, Geoffrey (1996) [1973]. Metalogic: An Introduction to the Metatheory of Standard First Order Logic. University of California Press. ISBN 0-520-02356-0.

• Mates, Benson (1972). Elementary Logic. Oxford University Press. ISBN 0-19-501491-X.

• Petkovsek, Marko; Wilf, Herbert; Zeilberger, Doron (1996). A = B. A.K. Peters, Wellesley, Massachusetts. ISBN 1-56881-063-6.

20.11 External links

• Weisstein, Eric W., “Theorem”, MathWorld.

• Theorem of the Day


The Pythagorean theorem has at least 370 known proofs[1]


A planar map with five colors such that no two regions with the same color meet. It can actually be colored in this way with only four colors. The four color theorem states that such colorings are possible for any planar map, but every known proof involves a computational search that is too long to check by hand.


The Collatz conjecture: one way to illustrate its complexity is to extend the iteration from the natural numbers to the complex numbers. The result is a fractal, which (in accordance with universality) resembles the Mandelbrot set.


[Figure: nested sets – symbols and strings of symbols ⊇ well-formed formulas ⊇ theorems]

This diagram shows the syntactic entities that can be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.


Chapter 21

Theory (mathematical logic)

In mathematical logic, a theory (also called a formal theory) is a set of sentences in a formal language. Usually a deductive system is understood from context. An element ϕ ∈ T of a theory T is then called an axiom of the theory, and any sentence that follows from the axioms (T ⊢ ϕ) is called a theorem of the theory. Every axiom is also a theorem. A first-order theory is a set of first-order sentences.

21.1 Theories expressed in formal language generally

When defining theories for foundational purposes, additional care must be taken, and normal set-theoretic language may not be appropriate.

The construction of a theory begins by specifying a definite non-empty conceptual class E, the elements of which are called statements. These initial statements are often called the primitive elements or elementary statements of the theory, to distinguish them from other statements which may be derived from them.

A theory T is a conceptual class consisting of certain of these elementary statements. The elementary statements which belong to T are called the elementary theorems of T and are said to be true. In this way, a theory is a way of designating a subset of E which consists entirely of true statements.

This general way of designating a theory stipulates that the truth of any of its elementary statements is not known without reference to T. Thus the same elementary statement may be true with respect to one theory and not true with respect to another. This is as in ordinary language, where statements such as “He is a terrible person.” cannot be judged to be true or false without reference to some interpretation of who “He” is and, for that matter, what a “terrible person” is under this theory.[1]
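As a concrete, purely illustrative rendering of these definitions, the following minimal Python sketch treats E as a finite set of elementary statements and a theory as a subset of E; the statements and the theory names are invented for this example and are not part of the article:

# Toy illustration: truth of an elementary statement is relative to a theory T ⊆ E.
E = {"He is a terrible person.", "Snow is white.", "7 is prime."}   # conceptual class of elementary statements

T1 = {"Snow is white.", "7 is prime."}    # one theory: a subset of E
T2 = {"He is a terrible person."}         # another theory over the same E

def is_true_in(statement, theory):
    """An elementary statement is 'true' exactly when it is an elementary theorem of the theory."""
    return statement in theory

print(is_true_in("7 is prime.", T1))   # True  (an elementary theorem of T1)
print(is_true_in("7 is prime.", T2))   # False (not an elementary theorem of T2)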

21.1.1 Subtheories and extensions

A theory S is a subtheory of a theory T if S is a subset of T. If T is a subset of S, then S is an extension or supertheory of T.

21.1.2 Deductive theories

A theory T is said to be a deductive theory if T is an inductive class, that is, if its content is based on some formal deductive system and some of its elementary statements are taken as axioms. In a deductive theory, any sentence which is a logical consequence of one or more of the axioms is also a sentence of that theory.[1]
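For intuition only, here is a small sketch of a deductive closure computation; the sentences, the tuple representation of implications, and the use of modus ponens as the sole inference rule are simplifying assumptions made for this example:

# Toy deductive theory: sentences are either atoms (strings) or implications ("->", A, B).
axioms = {
    "p",
    ("->", "p", "q"),
    ("->", "q", "r"),
}

def deductive_closure(sentences):
    """Repeatedly apply modus ponens until no new sentences appear."""
    theory = set(sentences)
    changed = True
    while changed:
        changed = False
        for s in list(theory):
            if isinstance(s, tuple) and s[0] == "->" and s[1] in theory and s[2] not in theory:
                theory.add(s[2])        # from A and A -> B, infer B
                changed = True
    return theory

print(deductive_closure(axioms))   # the closure also contains the theorems "q" and "r"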

21.1.3 Consistency and completeness

Main articles: Consistency and Completeness (logic)


A syntactically consistent theory is a theory from which not every sentence in the underlying language can be proven (with respect to some deductive system, which is usually clear from context). In a deductive system (such as first-order logic) that satisfies the principle of explosion, this is equivalent to requiring that there is no sentence φ such that both φ and its negation can be proven from the theory.

A satisfiable theory is a theory that has a model. This means there is a structure M that satisfies every sentence in the theory. Any satisfiable theory is syntactically consistent, because the structure satisfying the theory will satisfy exactly one of φ and the negation of φ, for each sentence φ.

A consistent theory is sometimes defined to be a syntactically consistent theory, and sometimes defined to be a satisfiable theory. For first-order logic, the most important case, it follows from the completeness theorem that the two meanings coincide. In other logics, such as second-order logic, there are syntactically consistent theories that are not satisfiable, such as ω-inconsistent theories.

A complete consistent theory (or just a complete theory) is a consistent theory T such that for every sentence φ in its language, either φ is provable from T or T ∪ {φ} is inconsistent. For theories closed under logical consequence, this means that for every sentence φ, either φ or its negation is contained in the theory. An incomplete theory is a consistent theory that is not complete.

See also ω-consistent theory for a stronger notion of consistency.
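Since a finite propositional theory has only finitely many valuations, “having a model” can be checked by brute force. The sketch below is illustrative only; the toy theory and its three propositional variables are assumptions made for this example:

from itertools import product

# A toy propositional theory: each sentence is a Python predicate on a valuation.
theory = [
    lambda v: v["p"] or v["q"],        # p ∨ q
    lambda v: not v["p"] or v["r"],    # p → r
    lambda v: not v["q"],              # ¬q
]
variables = ["p", "q", "r"]

def find_model(theory, variables):
    """Return a valuation satisfying every sentence, or None if the theory is unsatisfiable."""
    for values in product([False, True], repeat=len(variables)):
        valuation = dict(zip(variables, values))
        if all(sentence(valuation) for sentence in theory):
            return valuation
    return None

print(find_model(theory, variables))   # {'p': True, 'q': False, 'r': True}: the theory has a model, hence is consistent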

21.1.4 Interpretation of a theory

Main article: Interpretation (logic)

An interpretation of a theory is the relationship between a theory and some contensive subject matter when there is a many-to-one correspondence between certain elementary statements of the theory, and certain contensive statements related to the subject matter. If every elementary statement in the theory has a contensive correspondent it is called a full interpretation, otherwise it is called a partial interpretation.[2]

21.1.5 Theories associated with a structure

Each structure has several associated theories. The complete theory of a structure A is the set of all first-order sentences over the signature of A which are satisfied by A. It is denoted by Th(A). More generally, the theory of K, a class of σ-structures, is the set of all first-order σ-sentences that are satisfied by all structures in K, and is denoted by Th(K). Clearly Th(A) = Th({A}). These notions can also be defined with respect to other logics.

For each σ-structure A, there are several associated theories in a larger signature σ' that extends σ by adding one new constant symbol for each element of the domain of A. (If the new constant symbols are identified with the elements of A which they represent, σ' can be taken to be σ ∪ A.) The cardinality of σ' is thus the larger of the cardinality of σ and the cardinality of A.

The diagram of A consists of all atomic or negated atomic σ'-sentences that are satisfied by A and is denoted by diagA. The positive diagram of A is the set of all atomic σ'-sentences which A satisfies. It is denoted by diag+A. The elementary diagram of A is the set eldiagA of all first-order σ'-sentences that are satisfied by A or, equivalently, the complete (first-order) theory of the natural expansion of A to the signature σ'.
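To make the diagram constructions concrete, here is a small sketch that computes the positive diagram of a two-element structure with a single binary relation; the structure, the constant-naming scheme c_a, and the string output format are all invented for this illustration:

# Structure A: domain {0, 1}, one binary relation symbol ≤ interpreted as the usual order.
domain = [0, 1]
leq = {(0, 0), (0, 1), (1, 1)}                 # interpretation of ≤ in A

# Expanded signature σ': one new constant c_a for every element a of the domain.
constant = {a: f"c_{a}" for a in domain}

# Positive diagram: all atomic σ'-sentences that A satisfies.
positive_diagram = set()
for a in domain:
    for b in domain:
        if (a, b) in leq:
            positive_diagram.add(f"{constant[a]} ≤ {constant[b]}")
        if a == b:
            positive_diagram.add(f"{constant[a]} = {constant[b]}")

print(sorted(positive_diagram))
# ['c_0 = c_0', 'c_0 ≤ c_0', 'c_0 ≤ c_1', 'c_1 = c_1', 'c_1 ≤ c_1']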

21.2 First-order theories

Further information: List of first-order theories

A first-order theory QS is a set of sentences in a first-order formal language Q.

21.2.1 Derivation in a first order theory

Main article: First order logic § Deductive systems


There are many formal derivation (“proof”) systems for first-order logic.

21.2.2 Syntactic consequence in a first order theory

Main article: First-order logic § Validity, satisfiability, and logical consequence

A formula A is a syntactic consequence of a first-order theory QS if there is a derivation of A using only formulas in QS as non-logical axioms. Such a formula A is also called a theorem of QS. The notation “QS ⊢ A” indicates A is a theorem of QS.

21.2.3 Interpretation of a first order theory

Main article: Structure (mathematical logic)

An interpretation of a first-order theory provides a semantics for the formulas of the theory. An interpretation is said to satisfy a formula if the formula is true according to the interpretation. A model of a first order theory QS is an interpretation in which every formula of QS is satisfied.

21.2.4 First order theories with identity

Main article: First order logic § Equality and its axioms

A first order theory QS is a first-order theory with identity if QS includes the identity relation symbol “=” and the reflexivity and substitution axiom schemes for this symbol.

21.2.5 Topics related to first order theories

• Compactness theorem

• Consistent set

• Deduction theorem

• Enumeration theorem

• Lindenbaum’s lemma

• Löwenheim–Skolem theorem

21.3 Examples

One way to specify a theory is to define a set of axioms in a particular language. The theory can be taken to include just those axioms, or their logical or provable consequences, as desired. Theories obtained this way include ZFC and Peano arithmetic.

A second way to specify a theory is to begin with a structure and then let the theory be the set of sentences that are satisfied by the structure. This is one method for producing complete theories, described below. Examples of theories of this sort include the sets of true sentences in the structures (N, +, ×, 0, 1, =) and (R, +, ×, 0, 1, =), where N is the set of natural numbers and R is the set of real numbers. The first of these, called the theory of true arithmetic, cannot be written as the set of logical consequences of any enumerable set of axioms. The theory of (R, +, ×, 0, 1, =) was shown by Tarski to be decidable; it is the theory of real closed fields.


21.4 See also

• Axiomatic system

• List of first-order theories

21.5 References

[1] Curry, Haskell, Foundations of Mathematical Logic

[2] Curry, Haskell, Foundations of Mathematical Logic, p. 48

21.6 Further reading

• Hodges, Wilfrid (1997). A shorter model theory. Cambridge University Press. ISBN 0-521-58713-1.


Chapter 22

Unate function

A unate function is a type of boolean function which has monotonic properties. They have been studied extensively in switching theory.

A function f(x1, x2, ..., xn) is said to be positive unate in xi if for all possible values of xj, j ≠ i,

f(x1, x2, ..., xi−1, 1, xi+1, ..., xn) ≥ f(x1, x2, ..., xi−1, 0, xi+1, ..., xn).

Likewise, it is negative unate in xi if

f(x1, x2, ..., xi−1, 0, xi+1, ..., xn) ≥ f(x1, x2, ..., xi−1, 1, xi+1, ..., xn).

If for every xi, f is either positive or negative unate in the variable xi, then it is said to be unate (note that some xi may be positive unate and some negative unate to satisfy the definition of a unate function). A function is binate if it is not unate (i.e., it is neither positive unate nor negative unate in at least one of its variables).

For example, the logical disjunction function OR, with boolean values used for true (1) and false (0), is positive unate.

NB: positive unateness can also be thought of as passing the same slope (no change in the input), negative unateness as passing the opposite slope, and non-unateness as dependence on more than one input (of the same or different slopes).
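Because unateness only involves finitely many comparisons, it can be checked by brute force for small n. The following sketch is illustrative; the helper names and the example functions OR and XOR are assumptions made for this example, not part of the article:

from itertools import product

def is_positive_unate_in(f, n, i):
    """True if f is positive unate in variable i (0-based) over all 0/1 inputs."""
    for bits in product([0, 1], repeat=n - 1):
        low  = list(bits[:i]) + [0] + list(bits[i:])   # xi = 0, other variables fixed
        high = list(bits[:i]) + [1] + list(bits[i:])   # xi = 1, same other variables
        if f(*high) < f(*low):
            return False
    return True

def is_negative_unate_in(f, n, i):
    # f is negative unate in xi exactly when f with xi flipped is positive unate in xi.
    return is_positive_unate_in(lambda *xs: f(*xs[:i], 1 - xs[i], *xs[i + 1:]), n, i)

def is_unate(f, n):
    """Unate: in every variable, f is positive unate or negative unate."""
    return all(is_positive_unate_in(f, n, i) or is_negative_unate_in(f, n, i) for i in range(n))

OR  = lambda x, y: x | y          # positive unate in both variables
XOR = lambda x, y: x ^ y          # binate: neither positive nor negative unate in either variable
print(is_unate(OR, 2), is_unate(XOR, 2))   # True False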


Chapter 23

Variable (mathematics)

For variables in computer science, see Variable (computer science). For other uses, see Variable (disambiguation).

In elementary mathematics, a variable is an alphabetic character representing a number, called the value of the variable, which is either arbitrary, not fully specified, or unknown. Making algebraic computations with variables as if they were explicit numbers allows one to solve a range of problems in a single computation. A typical example is the quadratic formula, which allows one to solve every quadratic equation by simply substituting the numeric values of the coefficients of the given equation for the variables that represent them.

The concept of variable is also fundamental in calculus. Typically, a function y = f(x) involves two variables, y and x, representing respectively the value and the argument of the function. The term “variable” comes from the fact that, when the argument (also called the “variable of the function”) varies, then the value varies accordingly.[1]
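As a small illustration of the remark about the quadratic formula, here is a sketch in Python; the function name and the sample coefficients are chosen purely for illustration:

import cmath

def solve_quadratic(a, b, c):
    """Roots of a*x**2 + b*x + c = 0 via the quadratic formula (assumes a != 0)."""
    d = cmath.sqrt(b * b - 4 * a * c)          # square root of the discriminant, complex-safe
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

# The same computation solves every quadratic equation: only the values of a, b, c change.
print(solve_quadratic(1, -3, 2))   # roots 2 and 1 of x² − 3x + 2 = 0
print(solve_quadratic(1, 0, 1))    # roots ±i of x² + 1 = 0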

In more advanced mathematics, a variable is a symbol that denotes a mathematical object, which could be a number, a vector, a matrix, or even a function. In this case, the original property of “variability” of a variable is not kept (except, sometimes, for informal explanations).

Similarly, in computer science, a variable is a name (commonly an alphabetic character or a word) representing some value represented in computer memory. In mathematical logic, a variable is either a symbol representing an unspecified term of the theory, or a basic object of the theory, which is manipulated without referring to its possible intuitive interpretation.

23.1 Etymology

“Variable” comes from a Latin word, variābilis, with “vari(us)” meaning “various” and “-ābilis” meaning “-able”, meaning “capable of changing”.[2]

23.2 Genesis and evolution of the concept

François Viète introduced at the end of the 16th century the idea of representing known and unknown numbers by letters, nowadays called variables, and of computing with them as if they were numbers, in order to obtain, at the end, the result by a simple replacement. François Viète's convention was to use consonants for known values and vowels for unknowns.[3]

In 1637, René Descartes “invented the convention of representing unknowns in equations by x, y, and z, and knowns by a, b, and c”.[4] Contrary to Viète’s convention, Descartes’ convention is still commonly in use.

Starting in the 1660s, Isaac Newton and Gottfried Wilhelm Leibniz independently developed the infinitesimal calculus, which essentially consists of studying how an infinitesimal variation of a variable quantity induces a corresponding variation of another quantity which is a function of the first variable (quantity). Almost a century later, Leonhard Euler fixed the terminology of infinitesimal calculus and introduced the notation y = f(x) for a function f, its variable x and its value y. Until the end of the 19th century, the word variable referred almost exclusively to the arguments and the values of functions.


In the second half of the 19th century, it appeared that the foundation of infinitesimal calculus was not formalized enough to deal with apparent paradoxes such as a continuous function which is nowhere differentiable. To solve this problem, Karl Weierstrass introduced a new formalism consisting of replacing the intuitive notion of limit by a formal definition. The older notion of limit was “when the variable x varies and tends toward a, then f(x) tends toward L”, without any accurate definition of “tends”. Weierstrass replaced this sentence by the formula

(∀ϵ > 0)(∃η > 0)(∀x) |x − a| < η ⇒ |L − f(x)| < ϵ,

in which none of the five variables is considered as varying.

This static formulation led to the modern notion of variable, which is simply a symbol representing a mathematical object which either is unknown or may be replaced by any element of a given set; for example, the set of real numbers.
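To see the quantifier structure of Weierstrass’s formula in action, here is a small numerical spot check; the function f(x) = 3x + 1, the point a = 2, the limit L = 7, and the choice η = ε/4 are all illustrative assumptions, not from the article:

import random

f, a, L = (lambda x: 3 * x + 1), 2.0, 7.0

# For every ε > 0 we must exhibit some η > 0. For this linear f, any η ≤ ε/3 works,
# since |x − a| < η implies |L − f(x)| = 3·|x − a| < 3η; we take η = ε/4 for a safety margin.
for epsilon in (1.0, 0.1, 0.001):
    eta = epsilon / 4
    samples = [a + random.uniform(-eta, eta) for _ in range(10_000)]
    assert all(abs(L - f(x)) < epsilon for x in samples if abs(x - a) < eta)
print("spot checks passed")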

23.3 Specific kinds of variables

It is common for many variables to appear in the same mathematical formula, and they may play different roles. Some names or qualifiers have been introduced to distinguish them. For example, in the general cubic equation

ax³ + bx² + cx + d = 0,

there are five variables. Four of them, a, b, c, d, represent given numbers, and the last one, x, represents the unknown number, which is a solution of the equation. To distinguish them, the variable x is called an unknown, and the other variables are called parameters or coefficients, or sometimes constants, although this last terminology is incorrect for an equation and should be reserved for the function defined by the left-hand side of this equation.

In the context of functions, the term variable refers commonly to the arguments of the functions. This is typically the case in sentences like “function of a real variable”, “x is the variable of the function f: x ↦ f(x)”, “f is a function of the variable x” (meaning that the argument of the function is referred to by the variable x).

In the same context, the variables that are independent of x define constant functions and are therefore called constant. For example, a constant of integration is an arbitrary constant function that is added to a particular antiderivative to obtain the other antiderivatives. Because of the strong relationship between polynomials and polynomial functions, the term “constant” is often used to denote the coefficients of a polynomial, which are constant functions of the indeterminates.

This use of “constant” as an abbreviation of “constant function” must be distinguished from the normal meaning of the word in mathematics. A constant, or mathematical constant, is a well and unambiguously defined number or other mathematical object, as, for example, the numbers 0, 1, π and the identity element of a group.

Here are other specific names for variables.

• An unknown is a variable for which an equation has to be solved.

• An indeterminate is a symbol, commonly called variable, that appears in a polynomial or a formal power series. Formally speaking, an indeterminate is not a variable, but a constant in the polynomial ring or the ring of formal power series. However, because of the strong relationship between polynomials or power series and the functions that they define, many authors consider indeterminates as a special kind of variables.

• A parameter is a quantity (usually a number) which is a part of the input of a problem, and remains constant during the whole solution of this problem. For example, in mechanics the mass and the size of a solid body are parameters for the study of its movement. It should be noted that in computer science, parameter has a different meaning and denotes an argument of a function.

• Free variables and bound variables

• A random variable is a kind of variable that is used in probability theory and its applications.

It should be emphasized that all these denominations of variables are of a semantic nature and that the way of computing with them (syntax) is the same for all.


23.3.1 Dependent and independent variables

Main article: Dependent and independent variables

In calculus and its application to physics and other sciences, it is rather common to consider a variable, say y, whose possible values depend on the value of another variable, say x. In mathematical terms, the dependent variable y represents the value of a function of x. To simplify formulas, it is often useful to use the same symbol for the dependent variable y and the function mapping x onto y. For example, the state of a physical system depends on measurable quantities such as the pressure, the temperature, the spatial position, ..., and all these quantities vary when the system evolves, that is, they are functions of time. In the formulas describing the system, these quantities are represented by variables which are dependent on the time, and thus considered implicitly as functions of time.

Therefore, in a formula, a dependent variable is a variable that is implicitly a function of another (or several other) variables. An independent variable is a variable that is not dependent.[5]

The property of a variable to be dependent or independent often depends on the point of view and is not intrinsic. For example, in the notation f(x, y, z), the three variables may be all independent and the notation represents a function of three variables. On the other hand, if y and z depend on x (are dependent variables), then the notation represents a function of the single independent variable x.[6]

23.3.2 Examples

If one defines a function f from the real numbers to the real numbers by

f(x) = x² + sin(x + 4)

then x is a variable standing for the argument of the function being defined, which can be any real number. In the identity

∑_{i=1}^{n} i = (n² + n)/2

the variable i is a summation variable which designates in turn each of the integers 1, 2, ..., n (it is also called an index because its variation is over a discrete set of values), while n is a parameter (it does not vary within the formula).

In the theory of polynomials, a polynomial of degree 2 is generally denoted as ax² + bx + c, where a, b and c are called coefficients (they are assumed to be fixed, i.e., parameters of the problem considered) while x is called a variable. When studying this polynomial for its polynomial function, this x stands for the function argument. When studying the polynomial as an object in itself, x is taken to be an indeterminate, and would often be written with a capital letter instead to indicate this status.
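A quick numerical check of this identity, and of the different roles of the summation variable i and the parameter n (a throwaway sketch, not part of the article):

def triangular(n):
    # i takes each of the values 1, 2, ..., n in turn; n stays fixed throughout the sum.
    return sum(i for i in range(1, n + 1))

for n in (1, 10, 100):
    assert triangular(n) == (n * n + n) // 2   # the identity ∑ i = (n² + n)/2
print("identity holds for the sampled values of n")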

23.4 Notation

In mathematics, the variables are generally denoted by a single letter. However, this letter is frequently followed by a subscript, as in x2, and this subscript may be a number, another variable (xi), a word or the abbreviation of a word (xᵢₙ and xₒᵤₜ), and even a mathematical expression. Under the influence of computer science, one may encounter in pure mathematics some variable names consisting of several letters and digits.

Following the 17th-century French philosopher and mathematician René Descartes, letters at the beginning of the alphabet, e.g. a, b, c, are commonly used for known values and parameters, and letters at the end of the alphabet, e.g. x, y, z, and t, are commonly used for unknowns and variables of functions.[7] In printed mathematics, the norm is to set variables and constants in an italic typeface.[8]

For example, a general quadratic function is conventionally written as:

ax² + bx + c,


where a, b and c are parameters (also called constants, because they are constant functions), while x is the variable of the function. A more explicit way to denote this function is

x ↦ ax² + bx + c,

which makes the function-argument status of x clear, and thereby implicitly the constant status of a, b and c. Since c occurs in a term that is a constant function of x, it is called the constant term.[9]:18

Specific branches and applications of mathematics usually have specific naming conventions for variables. Variables with similar roles or meanings are often assigned consecutive letters. For example, the three axes in 3D coordinate space are conventionally called x, y, and z. In physics, the names of variables are largely determined by the physical quantity they describe, but various naming conventions exist. A convention often followed in probability and statistics is to use X, Y, Z for the names of random variables, keeping x, y, z for variables representing corresponding actual values.

There are many other notational usages. Usually, variables that play a similar role are represented by consecutive letters or by the same letter with different subscripts. Below are some of the most common usages.

• a, b, c, and d (sometimes extended to e and f) often represent parameters or coefficients.

• a0, a1, a2, ... play a similar role, when otherwise too many different letters would be needed.

• ai or ui is often used to denote the i-th term of a sequence or the i-th coefficient of a series.

• f and g (sometimes h) commonly denote functions.

• i, j, and k (sometimes l or h) are often used to denote varying integers or indices in an indexed family.

• l and w are often used to represent the length and width of a figure.

• l is also used to denote a line. In number theory, l often denotes a prime number not equal to p.

• n usually denotes a fixed integer, such as a count of objects or the degree of an equation.

• When two integers are needed, for example for the dimensions of a matrix, one commonly uses m and n.

• p often denotes a prime number or a probability.

• q often denotes a prime power or a quotient.

• r often denotes a remainder.

• t often denotes time.

• x, y and z usually denote the three Cartesian coordinates of a point in Euclidean geometry. By extension, theyare used to name the corresponding axes.

• z typically denotes a complex number, or, in statistics, a normal random variable.

• α, β, γ, θ and φ commonly denote angle measures.

• ε usually represents an arbitrarily small positive number.

• ε and δ commonly denote two small positive numbers.

• λ is used for eigenvalues.

• σ often denotes a sum, or, in statistics, the standard deviation.


23.5 See also

• Free variables and bound variables (Bound variables are also known as dummy variables)

• Variable (programming)

• Mathematical expression

• Physical constant

• Coefficient

• Constant of integration

• Constant term of a polynomial

• Indeterminate (variable)

• Lambda calculus

23.6 Bibliography

• J. Edwards (1892). Differential Calculus. London: MacMillan and Co. pp. 1 ff.

• Karl Menger, “On Variables in Mathematics and in Natural Science”, The British Journal for the Philosophy of Science 5:18:134–142 (August 1954), JSTOR 685170

• Jaroslav Peregrin, “Variables in Natural Language: Where do they come from?”, in M. Boettner, W. Thümmel, eds., Variable-Free Semantics, 2000, pp. 46–65.

• W. V. Quine, “Variables Explained Away”, Proceedings of the American Philosophical Society 104:343–347 (1960).

23.7 References

[1] Syracuse University. “Appendix One Review of Constants and Variables”. cstl.syr.edu.

[2] “‘Variable’ Origin”. dictionary.com. Retrieved 18 May 2015.

[3] Fraleigh, John B. (1989). A First Course in Abstract Algebra (4th ed.). United States: Addison-Wesley. p. 276. ISBN 0-201-52821-5.

[4] Tom Sorell, Descartes: A Very Short Introduction, (2000). New York: Oxford University Press. p. 19.

[5] Edwards Art. 5

[6] Edwards Art. 6

[7] Edwards Art. 4

[8] William L. Hosch (editor), The Britannica Guide to Algebra and Trigonometry, Britannica Educational Publishing, The Rosen Publishing Group, 2010, ISBN 1615302190, 9781615302192, page 71

[9] Foerster, Paul A. (2006). Algebra and Trigonometry: Functions and Applications, Teacher’s Edition (Classics ed.). Upper Saddle River, NJ: Prentice Hall. ISBN 0-13-165711-9.


Chapter 24

Well-formed formula

[Diagram: nested sets labelled “Symbols and strings of symbols”, “Well-formed formulas”, and “Theorems”.]

This diagram shows the syntactic entities which may be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

In mathematical logic, a well-formed formula, abbreviated wff and often simply called a formula, is a word (i.e. a finite sequence of symbols from a given alphabet) that is part of a formal language.[1] A formal language can be considered to be identical to the set containing all and only its formulas.

A formula is a syntactic formal object that can be given a semantic meaning by means of semantics.


24.1 Introduction

A key use of formulae is in propositional logic and predicate logics such as first-order logic. In those contexts, a formula is a string of symbols φ for which it makes sense to ask “is φ true?”, once any free variables in φ have been instantiated. In formal logic, proofs can be represented by sequences of formulas with certain properties, and the final formula in the sequence is what is proven.

Although the term “formula” may be used for written marks (for instance, on a piece of paper or chalkboard), it is more precisely understood as the sequence being expressed, with the marks being a token instance of formula. It is not necessary for the existence of a formula that there be any actual tokens of it. A formal language may thus have an infinite number of formulas regardless of whether each formula has a token instance. Moreover, a single formula may have more than one token instance, if it is written more than once.

Formulas are quite often interpreted as propositions (as, for instance, in propositional logic). However, formulas are syntactic entities, and as such must be specified in a formal language without regard to any interpretation of them. An interpreted formula may be the name of something, an adjective, an adverb, a preposition, a phrase, a clause, an imperative sentence, a string of sentences, a string of names, etc. A formula may even turn out to be nonsense, if the symbols of the language are specified so that it does. Furthermore, a formula need not be given any interpretation.

24.2 Propositional calculus

The formulas of propositional calculus, also called propositional formulas,[2] are expressions such as (A ∧ (B ∨ C)). Their definition begins with the arbitrary choice of a set V of propositional variables. The alphabet consists of the letters in V along with the symbols for the propositional connectives and parentheses “(” and “)”, all of which are assumed not to be in V. The formulas will be certain expressions (that is, strings of symbols) over this alphabet.

The formulas are inductively defined as follows:

• Each propositional variable is, on its own, a formula.

• If φ is a formula, then ¬ φ is a formula.

• If φ and ψ are formulas, and • is any binary connective, then (φ • ψ) is a formula. Here • could be (but is not limited to) the usual operators ∨, ∧, →, or ↔.

This definition can also be written as a formal grammar in Backus–Naur form, provided the set of variables is finite:

<alpha set> ::= p | q | r | s | t | u | ... (the arbitrary finite set of propositional variables)
<form> ::= <alpha set> | ¬<form> | (<form> ∧ <form>) | (<form> ∨ <form>) | (<form> → <form>) | (<form> ↔ <form>)

Using this grammar, the sequence of symbols

(((p → q) ∧ (r → s)) ∨ (¬q ∧ ¬s))

is a formula, because it is grammatically correct. The sequence of symbols

((p→ q)→ (qq))p))

is not a formula, because it does not conform to the grammar.

A complex formula may be difficult to read, owing to, for example, the proliferation of parentheses. To alleviate this last phenomenon, precedence rules (akin to the standard mathematical order of operations) are assumed among the operators, making some operators more binding than others. For example, assume the precedence (from most binding to least binding) 1. ¬, 2. →, 3. ∧, 4. ∨. Then the formula

(((p → q) ∧ (r → s)) ∨ (¬q ∧ ¬s))


may be abbreviated as

p → q ∧ r → s ∨ ¬q ∧ ¬s

This is, however, only a convention used to simplify the written representation of a formula. If the precedence were assumed, for example, to be left-right associative, in the following order: 1. ¬, 2. ∧, 3. ∨, 4. →, then the same formula above (without parentheses) would be rewritten as

(p → (q ∧ r)) → (s ∨ ((¬q) ∧ (¬s)))
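As a concrete companion to the grammar given earlier in this section, here is a minimal recursive-descent well-formedness checker in Python. It is a sketch: the connective symbols and the single-letter variable set mirror the grammar above, spaces are ignored, and the precedence-based abbreviations are deliberately not handled (only fully parenthesized formulas are accepted):

VARIABLES = set("pqrstu")          # the arbitrary finite set of propositional variables
CONNECTIVES = {"∧", "∨", "→", "↔"}

def parse_form(s, i=0):
    """Parse one <form> starting at index i; return the index just past it, or raise ValueError."""
    if i < len(s) and s[i] in VARIABLES:
        return i + 1
    if i < len(s) and s[i] == "¬":
        return parse_form(s, i + 1)
    if i < len(s) and s[i] == "(":
        j = parse_form(s, i + 1)                    # left subformula
        if j < len(s) and s[j] in CONNECTIVES:
            k = parse_form(s, j + 1)                # right subformula
            if k < len(s) and s[k] == ")":
                return k + 1
    raise ValueError(f"not a formula at position {i}")

def is_wff(s):
    s = s.replace(" ", "")                          # ignore spacing
    try:
        return parse_form(s) == len(s)              # must consume the whole string
    except ValueError:
        return False

print(is_wff("(((p → q) ∧ (r → s)) ∨ (¬q ∧ ¬s))"))   # True
print(is_wff("((p → q) → (qq))p))"))                  # False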

24.3 Predicate logic

The definition of a formula in first-order logic QS is relative to the signature of the theory at hand. This signature specifies the constant symbols, relation symbols, and function symbols of the theory at hand, along with the arities of the function and relation symbols.

The definition of a formula comes in several parts. First, the set of terms is defined recursively. Terms, informally, are expressions that represent objects from the domain of discourse.

1. Any variable is a term.

2. Any constant symbol from the signature is a term.

3. An expression of the form f(t1,...,tn), where f is an n-ary function symbol, and t1,...,tn are terms, is again a term.

The next step is to define the atomic formulas.

1. If t1 and t2 are terms, then t1 = t2 is an atomic formula.

2. If R is an n-ary relation symbol, and t1,...,tn are terms, then R(t1,...,tn) is an atomic formula.

Finally, the set of formulas is defined to be the smallest set containing the set of atomic formulas such that the following holds:

1. ¬ϕ is a formula when ϕ is a formula

2. (ϕ ∧ ψ) and (ϕ ∨ ψ) are formulas when ϕ and ψ are formulas;

3. ∃xϕ is a formula when x is a variable and ϕ is a formula;

4. ∀xϕ is a formula when x is a variable and ϕ is a formula (alternatively, ∀xϕ could be defined as an abbreviation for ¬∃x¬ϕ).

If a formula has no occurrences of ∃x or ∀x, for any variable x, then it is called quantifier-free. An existential formula is a formula starting with a sequence of existential quantification followed by a quantifier-free formula.
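The recursive definitions of terms and formulas translate directly into data types. The following sketch is illustrative only: the class names, the example signature, and the restriction to a few connectives are assumptions made for this example, and well-formedness is enforced simply because formulas are built bottom-up from these constructors:

from dataclasses import dataclass

@dataclass
class Var:                      # a variable is a term
    name: str

@dataclass
class Const:                    # a constant symbol is a term
    name: str

@dataclass
class Func:                     # f(t1, ..., tn) is a term
    name: str
    args: tuple

@dataclass
class Eq:                       # t1 = t2 is an atomic formula
    left: object
    right: object

@dataclass
class Not:                      # ¬φ
    sub: object

@dataclass
class And:                      # (φ ∧ ψ)
    left: object
    right: object

@dataclass
class Exists:                   # ∃x φ
    var: str
    sub: object

# ∃x ¬(f(x) = c), built bottom-up from terms and atomic formulas.
phi = Exists("x", Not(Eq(Func("f", (Var("x"),)), Const("c"))))

def quantifier_free(formula):
    """True when the formula contains no quantifier (only ∃ is modelled here)."""
    if isinstance(formula, Exists):
        return False
    if isinstance(formula, Not):
        return quantifier_free(formula.sub)
    if isinstance(formula, And):
        return quantifier_free(formula.left) and quantifier_free(formula.right)
    return True                                 # atomic formulas

print(quantifier_free(phi), quantifier_free(phi.sub))   # False True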

24.4 Atomic and open formulas

Main article: Atomic formula

An atomic formula is a formula that contains no logical connectives or quantifiers, or equivalently a formula that has no strict subformulas. The precise form of atomic formulas depends on the formal system under consideration; for propositional logic, for example, the atomic formulas are the propositional variables. For predicate logic, the atoms are predicate symbols together with their arguments, each argument being a term.

According to some terminology, an open formula is formed by combining atomic formulas using only logical connectives, to the exclusion of quantifiers.[3] This is not to be confused with a formula which is not closed.


24.5 Closed formulas

Main article: Sentence (mathematical logic)

A closed formula, also ground formula or sentence, is a formula in which there are no free occurrences of any variable. If A is a formula of a first-order language in which the variables v1, ..., vn have free occurrences, then A preceded by ∀v1 ... ∀vn is a closure of A.

24.6 Properties applicable to formulas

• A formula A in a language Q is valid if it is true for every interpretation of Q .

• A formula A in a language Q is satisfiable if it is true for some interpretation of Q .

• A formula A of the language of arithmetic is decidable if it represents a decidable set, i.e. if there is an effective method which, given a substitution of the free variables of A, says that either the resulting instance of A is provable or its negation is.

24.7 Usage of the terminology

In earlier works on mathematical logic (e.g. by Church[4]), formulas referred to any strings of symbols, and among these strings, well-formed formulas were the strings that followed the formation rules of (correct) formulas.

Several authors simply say formula.[5][6][7][8] Modern usages (especially in the context of computer science with mathematical software such as model checkers, automated theorem provers, interactive theorem provers) tend to retain only the algebraic concept of a formula and to leave the question of well-formedness, i.e. of the concrete string representation of formulas (using this or that symbol for connectives and quantifiers, using this or that parenthesizing convention, using Polish or infix notation, etc.), as a mere notational problem.

However, the expression well-formed formulas can still be found in various works,[9][10][11] these authors using the name well-formed formula without necessarily opposing it to the old sense of formula as an arbitrary string of symbols, so that it is no longer common in mathematical logic to refer to arbitrary strings of symbols in the old sense of formulas.

The expression “well-formed formula” (WFF) has also found its way into popular culture. Indeed, WFF is part of an esoteric pun used in the name of the academic game “WFF 'N PROOF: The Game of Modern Logic,” by Layman Allen,[12] developed while he was at Yale Law School (he was later a professor at the University of Michigan). The suite of games is designed to teach the principles of symbolic logic to children (in Polish notation).[13] Its name is an echo of whiffenpoof, a nonsense word used as a cheer at Yale University made popular in The Whiffenpoof Song and The Whiffenpoofs.[14]

24.8 See also

• Ground expression

24.9 Notes

[1] Formulas are a standard topic in introductory logic, and are covered by all introductory textbooks, including Enderton (2001), Gamut (1990), and Kleene (1967).

[2] First-order logic and automated theorem proving, Melvin Fitting, Springer, 1996

[3] Handbook of the history of logic (Vol 5, Logic from Russell to Church), Tarski’s logic by Keith Simmons, D. Gabbay and J. Woods, eds., p. 568.

[4] Alonzo Church, [1996] (1944), Introduction to mathematical logic, page 49


[5] Hilbert, David; Ackermann, Wilhelm (1950) [1937], Principles of Mathematical Logic, New York: Chelsea

[6] Hodges, Wilfrid (1997), A shorter model theory, Cambridge University Press, ISBN 978-0-521-58713-6

[7] Barwise, Jon, ed. (1982), Handbook of Mathematical Logic, Studies in Logic and the Foundations of Mathematics, Amsterdam: North-Holland, ISBN 978-0-444-86388-1

[8] Cori, Rene; Lascar, Daniel (2000), Mathematical Logic: A Course with Exercises, Oxford University Press, ISBN 978-0-19-850048-3

[9] Enderton, Herbert [2001] (1972), A mathematical introduction to logic (2nd ed.), Boston, MA: Academic Press, ISBN 978-0-12-238452-3

[10] R. L. Simpson (1999), Essentials of Symbolic Logic, page 12

[11] Mendelson, Elliott [2010] (1964), An Introduction to Mathematical Logic (5th ed.), London: Chapman & Hall

[12] Ehrenburg 2002

[13] More technically, propositional logic using the Fitch-style calculus.

[14] Allen (1965) acknowledges the pun.

24.10 References

• Allen, Layman E. (1965), “Toward Autotelic Learning of Mathematical Logic by the WFF 'N PROOF Games”, Mathematical Learning: Report of a Conference Sponsored by the Committee on Intellective Processes Research of the Social Science Research Council, Monographs of the Society for Research in Child Development 30 (1): 29–41

• Boolos, George; Burgess, John; Jeffrey, Richard (2002), Computability and Logic (4th ed.), Cambridge University Press, ISBN 978-0-521-00758-0

• Ehrenberg, Rachel (Spring 2002). “He’s Positively Logical”. Michigan Today (University of Michigan). Retrieved 2007-08-19.

• Enderton, Herbert (2001), A mathematical introduction to logic (2nd ed.), Boston, MA: Academic Press, ISBN 978-0-12-238452-3

• Gamut, L.T.F. (1990), Logic, Language, and Meaning, Volume 1: Introduction to Logic, University of Chicago Press, ISBN 0-226-28085-3

• Hodges, Wilfrid (2001), “Classical Logic I: First-Order Logic”, in Goble, Lou, The Blackwell Guide to Philosophical Logic, Blackwell, ISBN 978-0-631-20692-7

• Hofstadter, Douglas (1980), Gödel, Escher, Bach: An Eternal Golden Braid, Penguin Books, ISBN 978-0-14-005579-5

• Kleene, Stephen Cole (2002) [1967], Mathematical logic, New York: Dover Publications, ISBN 978-0-486-42533-7, MR 1950307

• Rautenberg, Wolfgang (2010), A Concise Introduction to Mathematical Logic (3rd ed.), New York: Springer Science+Business Media, doi:10.1007/978-1-4419-1221-3, ISBN 978-1-4419-1220-6

24.11 External links

• Well-Formed Formula for First Order Predicate Logic - includes a short Java quiz.

• Well-Formed Formula at ProvenMath

• WFF N PROOF game site


24.12 Text and image sources, contributors, and licenses

24.12.1 Text

• Atomic sentence Source: https://en.wikipedia.org/wiki/Atomic_sentence?oldid=654112998 Contributors: AugPi, OkPerson, Creidieki,

Nortexoid, Oleg Alexandrov, Joriki, Linas, BD2412, Rjwilmsi, TheDJ, Rick Norwood, Tomisti, Mhss, Byelf2007, Dreftymac, CR-Greathouse, CBM, Gregbard, Cydebot, Julian Mendez, DWRZ, Zorakoid, Juliancolton, Tomer T, Philogo, JimJJewett, Arda Xi, Flyer22,Botsjeh, WikHead, Addbot, Zorrobot, Headlikeawhole, FrescoBot, BrideOfKripkenstein, Joerom5883, Mhiji, Blitzmut, Khazar2, CamilaCavalcanti Nery, Erick Lucena and Anonymous: 11

• First-order logic Source: https://en.wikipedia.org/wiki/First-order_logic?oldid=675933401 Contributors: AxelBoldt, The Anome, Ben-Baker, Dwheeler, Youandme, Stevertigo, Frecklefoot, Edward, Patrick, Michael Hardy, Kwertii, Kku, Ixfd64, Chinju, Zeno Gantner,Minesweeper, Looxix~enwiki, TallJosh, Julesd, AugPi, Dpol, Jod, Nzrs0006, Charles Matthews, Timwi, Dcoetzee, Dysprosia, Greenrd,Markhurd, Hyacinth, David.Monniaux, Robbot, Fredrik, Vanden, Wikibot, Jleedev, Tobias Bergemann, Filemon, Snobot, Giftlite, Xplat,Kim Bruning, Lethe, Jorend, Guanaco, Siroxo, Gubbubu, Mmm~enwiki, Utcursch, Kusunose, Almit39, Karl-Henner, Creidieki, Urhix-idur, Lucidish, Mormegil, Rich Farmbrough, Guanabot, Paul August, Bender235, Elwikipedista~enwiki, Pmetzger, Spayrard, Chalst,Nile, Rsmelt, EmilJ, Marner, Randall Holmes, Per Olofsson, Nortexoid, Spug, ToastieIL, AshtonBenson, Obradovic Goran, Mpeisenbr,Officiallyover, Msh210, Axl, Harburg, Dhruvee, Caesura, BRW, Iannigb, Omphaloscope, Apolkhanov, Bookandcoffee, Oleg Alexan-drov, Kendrick Hang, Hq3473, Joriki, Velho, Kelly Martin, Linas, Ahouseholder, Ruud Koot, BD2412, SixWingedSeraph, Grammar-bot, Rjwilmsi, Tizio, .digamma, MarSch, Mike Segal, Ekspiulo, R.e.b., Penumbra2000, Mathbot, Banazir, NavarroJ, Chobot, Bgwhite,Jayme, Roboto de Ajvol, Wavelength, Borgx, Michael Slone, Marcus Cyron, Meloman, Trovatore, Expensivehat, Hakeem.gadi, JEComp-ton, Saric, Arthur Rubin, Netrapt, Nahaj, Katieh5584, RG2, Otto ter Haar, Jsnx, SmackBot, InverseHypercube, Brick Thrower, Slaniel,NickGarvey, Mhss, Foxjwill, Onceler, Jon Awbrey, Turms, Henning Makholm, Tesseran, Byelf2007, Lambiam, Cdills, Dbtfz, Richard L.Peterson, Cronholm144, Physis, Loadmaster, Mets501, Pezant, Phuzion, Mike Fikes, JulianMendez, Dan Gluck, Iridescent, Hilverd, Zerosharp, JRSpriggs, 8754865, CRGreathouse, CBM,Mindfruit, Gregbard, Fl, Danman3459, Blaisorblade, JulianMendez, Juansempere, Eu-bulide, Malleus Fatuorum, Mojo Hand, RobHar, Nick Number, Rriegs, Klausness, Eleuther, Jirislaby, VictorAnyakin, Childoftv, TigranesDamaskinos, JAnDbot, Ahmed saeed, Thenub314, RubyQ, Igodard, Martinkunev, Alastair Haines, Jay Gatsby, A3nm, David Eppstein,Pkrecker, TechnoFaye, Avakar, Exostor, Pomte, Maurice Carbonaro, WarthogDemon, Inquam, SpigotMap, Policron, Heyitspeter, Mis-tercupcake, Camrn86, English Subtitle, Crowne, Voorlandt, The Tetrast, Philogo, LBehounek, VanishedUserABC, Kgoarany, RJaguar3,ConcernedScientist, Lord British, Ljf255, SouthLake, Kumioko (renamed), DesolateReality, Anchor Link Bot, Wireless99, Randomblue,CBM2, NoBu11, Francvs, Classicalecon, Phyte, NicDumZ, Jan1nad, Gherson2, Mild Bill Hiccup, Dkf11, Nanobear~enwiki, Nanmus,Watchduck, Cacadril, Hans Adler, Djk3, Willhig, Palnot, WikHead, Subversive.sound, Sameer0s, Addbot, Norman Ramsey, Histre, Pdib-ner, Tassedethe, ב ,.דניאל Snaily, Legobot, Yobot, Ht686rg90, Cloudyed, Pcap, AnakngAraw, AnomieBOT, Citation bot, TitusCarus,Grim23, Ejars, FrescoBot, Hobsonlane, Mark Renier, Liiiii, Citation bot 1, Tkuvho, DrilBot, Sh Najd, 34jjkky, Rlcolasanti, Diannaa,Reach Out to the Truth, Lauri.pirttiaho, WildBot, Gf uip, Carbo1200, Be hajian, Chharvey, Sampletalk, Bulwersator, Jaseemabid, Ti-jfo098, Templatetypedef, ClueBot NG, Johannes Schützel, MerlIwBot, Daviddwd, BG19bot, Lifeformnoho, Dhruvbaldawa, Virago250,Solomon7968, Rjs.swarnkar, Sanpra1989, Deltahedron, Gabefair, Jochen Burghardt, Hoppeduppeanut, Cptwunderlich, Seppi333, Holy-seven007, Wilbertcr, Threerealtrees, Immanuel Thoughtmaker, Jwinder47, Mario Castelán Castro, Purgy Purgatorio, 
Comp-heur-intel,Broswald and Anonymous: 248

• Formal proof Source: https://en.wikipedia.org/wiki/Formal_proof?oldid=661169170 Contributors: Charles Matthews, Hyacinth, J D,Timrollpickering, Pmanderson, EmilJ, Dfranke, BD2412, Salix alba, Gaius Cornelius, Seegoon, Chrylis, Gregbard, Nick Number, Phil-ogo, VanishedUserABC, Radagast3, Ndenison, Mild Bill Hiccup, Hans Adler, HexaChord, Addbot, Numbo3-bot, AnomieBOT,MattTait,The Wiki ghost, Disinvented and Anonymous: 9

• Formal semantics (logic) Source: https://en.wikipedia.org/wiki/Formal_semantics_(logic)?oldid=630172993 Contributors: Stevertigo,Rp, Hyacinth, Giftlite, Thv, Robin klein, Chalst, Wareh, Nortexoid, Mdd, Thüringer, Tabor, Woohookitty, BD2412, Porcher, WouterBot,YurikBot, RussBot, StefanMangold, Jpbowen, Arthur Rubin, SmackBot, Mhss, Cybercobra, Byelf2007, Lambiam, Gregbard, Cydebot,AnAj, Philogo, Accounting4Taste, Firstwingman, Kumioko (renamed), SummerWithMorons, CohesionBot, Alexbot, Hans Adler, Ad-dbot, AnomieBOT, MauritsBot, Shadowjams, LucienBOT, D'ohBot, Gamewizard71, Diannaa, Gf uip, EmausBot, Tijfo098, Doctoram-bient, Helpful Pixie Bot, Brad7777, Jochen Burghardt, DimasMendes and Anonymous: 8

• Formal system Source: https://en.wikipedia.org/wiki/Formal_system?oldid=671526383Contributors: Michael Hardy,Modster, Mpagano,AugPi, Charles Matthews, Dysprosia, Hyacinth, Timrollpickering, Benc, Ancheta Wis, Giftlite, Acattell, Pmanderson, Felix Wiemann,Ascánder, R. S. Shaw, PWilkinson, Obradovic Goran, Mdd, Ruud Koot, Waldir, Raguks, Qwertyus, FlaBot, Margosbot~enwiki, Tillmo,Rekleov, YurikBot, Jpbowen, Arthur Rubin, Bsod2, SmackBot, Rex the first, Pro8, Hmains, Colonies Chris, Salt Yeung, Jon Awbrey,Byelf2007, Lambiam, Bjankuloski06en~enwiki, RandomCritic, 16@r, Mets501, JMK, BrainMagMo, CBM, Neelix, Gregbard, Cydebot,Al Lemos, Olaf, MartinBot, R'n'B, Maurice Carbonaro, Jonathanzung, Am Fiosaigear~enwiki, Maximillion Pegasus, Philogo, Popopp,Dmcq, Udirock, Vanished user kijsdion3i4jf, DesolateReality, Kas-nik, WestwoodMatt, Hans Adler, Aleksd, Sleepinj, Libcub, Addbot,TomorrowsDream, Adama44, Ptbotgourou, TaBOT-zerem, Pcap, KamikazeBot, AnomieBOT, Buenasdiaz, Omnipaedista, The Wikighost, Nikiriy, Undsoweiter, FrescoBot, Anthony.h.burton, Jonkerz, EmausBot, John of Reading, Montgolfière, Architectchao, Tijfo098,ChuispastonBot, Rmashhadi, Wcherowi, Paylett, JRBugembe, Helpful Pixie Bot, BG19bot, Justincheng12345-bot, Steamerandy, Saehry,Trackteur, Mario Castelán Castro and Anonymous: 36

• Formation rule Source: https://en.wikipedia.org/wiki/Formation_rule?oldid=635230107 Contributors: Michael Hardy, Hyacinth, Tim-rollpickering, Giftlite, EmilJ, BD2412, Kbdank71, Arthur Rubin, SmackBot, Gregbard, Cydebot, Cpiral, ClueBot, Hans Adler, Addbot,Tahu88810, Xqbot, The Wiki ghost, Snotbot, Brirush and Anonymous: 1

• Interpretation (logic) Source: https://en.wikipedia.org/wiki/Interpretation_(logic)?oldid=663599204Contributors: Michael Hardy,Markhurd,Hyacinth, Ancheta Wis, Bogdanb, Urhixidur, Chalst, EmilJ, Alansohn, Sligocki, Woohookitty, Linas, Kbdank71, Salix alba, Trovatore,Arthur Rubin, SmackBot, Malagent, Ppntori, Javalenok, TheGerm, Lambiam, Levineps, Zero sharp, CBM, Dgw, Neelix, Ksoileau, Greg-bard, Cydebot, Cic, Heyitspeter, DASonnenfeld, Camrn86, Philogo, Aaron Rotenberg, Costela, Tomaxer, Cnilep, CBM2, Hans Adler,Alexey Muranov, Muro Bot, Kaba3, Djk3, Good Olfactory, Addbot, Luckas-bot, AnomieBOT, Citation bot, The Wiki ghost, Citationbot 1, Trappist the monk, Hriber, Tijfo098, RockMagnetist, ClueBot NG, Helpful Pixie Bot, BG19bot, Nathanielfirst, Jochen Burghardt,Sean61961, Evgeni.rovinsky and Anonymous: 21

• Logical consequence Source: https://en.wikipedia.org/wiki/Logical_consequence?oldid=667253451 Contributors: Hyacinth, AnchetaWis, Giftlite, Mani1, Eric Kvaalen, BDD, Velho, Dionyziz, Macaddct1984, BD2412, Kbdank71, Koavf, Mathbot, Algebraist, Borgx,

Alynna Kasmira, Arthur Rubin, SmackBot, Incnis Mrsi, Kintetsubuffalo, Bluebot, Javalenok, DMacks, Dbtfz, Grumpyyoungman01,Slakr, Inquisitus, KyleP, Igoldste, CBM, Gregbard, Cydebot, Gimmetrow, Thijs!bot, Luna Santin, Albany NY, Magioladitis, Trusilver,Maurice Carbonaro, VolkovBot, Philogo, Jamelan, Graymornings, Wemlands, Cnilep, Botev, Aplex, ClueBot, Tomas e, Sps00789, Panyd,Hans Adler, Good Olfactory, Iranway, Addbot, Niriel, AnomieBOT, RibotBOT, Minister Alkabaz, Machine Elf 1735, I dream of horses,MoreNet, Adam.a.a.golding, Tijfo098, Tziemer991, ClueBot NG, Wbm1058, Hanlon1755, ,چالاک Aubreybardo and Anonymous: 30

• Logical constant Source: https://en.wikipedia.org/wiki/Logical_constant?oldid=666654591 Contributors: Timrollpickering, Nortexoid,BD2412, Reinis, SmackBot, The great kawa, Frap, Lambiam, Dbtfz, CBM, Gregbard, Nick Number, SieBot, Randomblue, Fadesga,MystBot, Addbot, Luckas-bot, AnomieBOT, RibotBOT, Undsoweiter, FrescoBot, Hriber, Tijfo098, Frietjes, Masssly, Camila CavalcantiNery, ZX95 and Anonymous: 6

• Logical Syntax of Language Source: https://en.wikipedia.org/wiki/Rudolf_Carnap?oldid=671120071 Contributors: Mav, William Av-ery, Michael Hardy, Gabbe, Poor Yorick, Jod, Charles Matthews, Radgeek, Markhurd, Hyacinth, Qertis, Banno, Robbot, Josh Cherry,Fredrik, Timrollpickering, Snobot, Giftlite, Carlo.Ierna, Protagoras~enwiki, Ot, Marcos, Hkpawn~enwiki, Jutta, Lucidish, D6, Mormegil,Simonides, Rich Farmbrough, Euthydemos, DcoetzeeBot~enwiki, Bender235, Whosyourjudas, Sebastianlutz, Rje, LostLeviathan, Lok-ifer, Mdd, Tweedy7736, Velho, Eras-mus, BD2412, Rjwilmsi, Pitan, Olessi, FlaBot, Chobot, YurikBot, Wavelength, RobotE, Gellersen,Arado, KSchutte, Tony1, Hans G. Oberlack, Elonka, Hmains, Jaymay, Josteinn, Sistema13, OrphanBot, TheKMan, Makemi, JJstro-ker, Lacatosias, Byelf2007, ArglebargleIV, Ser Amantio di Nicolao, JzG, Dbtfz, Dfass, Aiwendil42, Kripkenstein, RevTarthpeigust,Joseph Solis in Australia, Ewulp, Gregbard, Cydebot, Bellerophon5685, Universitytruth, Lindsay658, PKT, Basilo, Dalliance, OreoPriest, Sobaka, Fayenatic london, Danny lost, Nikolaos Bakalis, VoABot II, Here2fixCategorizations, Exiledone, Philosophy Junkie, JaGa,Pikolas, CommonsDelinker, Johnpacklambert, Warren Allen Smith, DadaNeem, Cometstyles, Inwind, Nadavvv, VolkovBot, TXiKiBoT,BertSen, Starnold, Ontoraul, Broadbot, Insane-Contrast, Alcmaeonid, Logan, Castelargus, Jmccance, BotMultichill, Jauerback, Vojvo-daen, Admitone, ClueBot, Marenty, Auntof6, PixelBot, Brews ohare, Adamwodeham, Wulf Isebrand, Tassedethe, Luckas-bot, Yobot,Bunnyhop11, Ptbotgourou, AnomieBOT, W.stanovsky, MauritsBot, Xqbot, Omnipaedista, FreeKnowledgeCreator, FrescoBot, Schnuf-flus, Shiki2, Tkuvho, Foobarnix, SEVEREN, Minimac, Dewritech, Faolin42, ZéroBot, Fæ, Shuipzv3, Jenks24, Suslindisambiguator,Gwestheimer, Kunstlerbob, Cntras, Masssly, Chastra, ChrisGualtieri, YFdyh-bot, Lf(lx(f)(x)x)lx(f)(x)x, VIAFbot, Jochen Burghardt,Unearthly Stew, Nblount, Tomajohnson, Begotknow, Nixin06, Aubreybardo, A Boelen, Hammel a, Neadv, KasparBot and Anonymous:92

• Metasyntactic variable Source: https://en.wikipedia.org/wiki/Metasyntactic_variable?oldid=666176352 Contributors: Dreamyshade,Tarquin, BlckKnght, Benwbrum, Ortolan88, Shii, DavidLevinson, Anthere, Cayzle, Shaydon, Xoder, Stevertigo, Frecklefoot, Bdesham,Michael Hardy, Tim Starling, Nixdorf, Philbog, Liftarn, MartinHarper, Taras, Graue, Dcljr, TakuyaMurata, GTBacchus, Minesweeper,Tregoweth, Card~enwiki, Ihcoyc, Jpatokal, Kimiko, IMSoP, Cimon Avaro, Kaihsu, Harvester, Ec5618, Timwi, A1r, Dcoetzee, Ww,Dysprosia, Timc, Furrykef, Hyacinth, Saltine, Jnc, Dogface, Populus, Sabbut, Rnbc, ,דוד Dpbsmith, Mpost89, Finlay McWalter, Rob-bot, RichiH, Popageorgio, Josh Cherry, ChrisO~enwiki, Fredrik, Faco~enwiki, R3m0t, Sanders muc, RedWolf, Altenmann, Greudin,Ashdurbat, Henrygb, Meelar, Timrollpickering, Hadal, Iron Bishop, Wereon, Benc, Jor, Danceswithzerglings, Diberri, Jleedev, TobiasBergemann, Vir4030, Mintleaf~enwiki, Inter, BenFrantzDale, Geeoharee, Tom harrison, Sploo22, LT-P, FunnyMan3595, Guanaco, Big-Ben212, RemyB,Mboverload, ElfMage, Lakefall~enwiki, ChicXulub, LucasVB, Gzuckier, Joeblakesley,WhiteDragon, Ablewisuk, Mza-jac, OwenBlacker, Maximaximax, RetiredUser2, Patilkedar, DenisMoskowitz, Drhaggis, Defenestrate, HenHei~enwiki, Kevyn, Trafton,Jbinder, Andreas Kaufmann, Mtnerd, Grunt, RandalSchwartz, Omassey, Astronouth7303, Venu62, Sparky the Seventh Chaos, Mind-spillage, Rich Farmbrough, Reallycoolguy, MCBastos, Drano, Luqui, Rama, Ardonik, Mikkel, Bo Lindbergh, Night Gyr, Dolda2000,Ljosa, Demo, Nabla, CanisRufus, Kwamikagami, Tverbeek, PhilHibbs, Triona, Circeus, GalaxiaGuy, Mike Schwartz, AKGhetto, Co-hesion, Brian McNeil, Kajiki, Alphax, Benji22210, Anthony Appleyard, Diego Moya, Andrewpmk, Andrew Gray, Jnothman, ,ליאורHoary, InShaneee, Dark Shikari, Stillnotelf, Saga City, Vyruss, Paul1337, Yuckfoo, Ilse@, Alanhwiki, Rotring, Kay Dekker, Brookie,Ron Ritzman, Kelly Martin, Simetrical, Ae-a, Quattrop, Riumplus, Tckma, Srborlongan, Firien, Dionyziz, MarkusHagenlocher, XiongChiamiov, RoyHui, Starwed, Radiant!, Marudubshinki, Matturn, Yoghurt, MagisterMathematicae, Keeves, BD2412, Crocodealer, Gram-marbot, BorgHunter, Quiddity, Bruce1ee, Nick R, Brajeshs, Danfuzz, Sarg, Faduci, Cassowary, KaiMartin, Ecb29, Fëaluinix, Margos-bot~enwiki, Loggie, OpenToppedBus, Mark Yen, Correon, Peterl, FrankTobia, YurikBot, Flameviper, RussBot, FrenchIsAwesome,Hyad, SpuriousQ, Jasonglchu, Sikon, Viltris, Dysmorodrepanis~enwiki, Cleared as filed, PeepP, Lomn, Tertulia, Freshgavin, Richard-cavell, FF2010, Ninly, Nzzl, Nentuaby, Clayhalliwell, StealthFox, Eric TF Bat, Trubye, Leon2323, Chrismith, IslandHopper973, Sar-danaphalus, SmackBot, Larry Doolittle, Matt Raines, Kilo-Lima, Reverend Loki, Zzzzz, Tjp1982, Setanta747 (locked), Renesis, ProveIt,Guyalsfere, TheDoctor10, Colonel Tom, Shai-kun, Moe Aboulkheir, Kesafloyd, Richfife, Masklinn, Wcoenen, Nympheta, Crashmatrix,Snori, Stevage, Pigeon.dyndns.org, Nbarth, Jdthood, Efitu, MrBananaGrabber, Metageek, Kristod, Tinctorius, Radagast83, Cybercobra,TheLimbicOne, Datapoohbah, BlackFingolfin, FelisLeo, TenPoundHammer, Amtiss, Gennaro Prota, LtPowers, MegA, Errorx666, Soap,MageKing17, Aroundthewayboy, Breno, IronGargoyle, Astrait47, LandruBek, Shannernanner, LPH, Tawkerbot2, CRGreathouse, Cm-drObot, USMCM1A1, JohnCD, Except, Coldplayer, Davnor, Gregbard, AndrewHowse, Cydebot, Ccnelson, Cvxdes, Thijs!bot, Scampida dude, Scampi da dudee, Criggie, Ajo Mama, Alex Barrett, AgentPeppermint, AntiVandalBot, Jimbomorrison, Scepia, TexMurphy,Alphachimpbot, Phloopy, 
Danfe~enwiki, Coolhandscot, Tqbf, Murgh, VoABot II, Bruce Tindall, Usien6, Skew-t, Shocking Blue, Thibbs,Talon Artaine, Gwern, Pauljackson1, Jtir, Kf4yfd, MartinBot, Church of emacs, Kiore, B33R, Test0zero, Donnaidh sidhe, Noah Keating,Inapplicable, Jaredehansen, Akuankka en, Thomas Larsen, Dannys-777, Raise exception, Babedacus, OliverHarris, AlKing464, AdamZivner, Rogerkilday, Kevinkor2, D A Patriarche, Thaddeus Slamp, Tclark88, Sami Kerola, Antixt, Cnilep, FrostPaladin, Dgmjr05, Kara-boom, BlueAzure, CaptainLepton, Anchor Link Bot, Foxj, Jaredj, Excirial, Qwfp, DumZiBoT, Asrghasrhiojadrhr, Logicist, Slacker874,Elehack, Legobot, PlankBot, Yobot, TaBOT-zerem, Denispir, Jason Recliner, Esq., AnomieBOT, 1exec1, ThaddeusB, Crimsonmar-garine, Freebirth Toad, Iheartmarek, Carlesso, Shirik, Lothar von Richthofen, ComputScientist, BrideOfKripkenstein, Luisdanielmesa,AreaMan, Lotje, Morton Shumway, Hogehogeloon, Jesse V., Fugafugafudesaka, Fugokyara, Wikipelli, John Cline, Wayne Slam, Ti-jfo098, ClueBot NG,Monotrema, BG19bot, Chmarkine, Fraulein451, Choephix, ChrisGualtieri, Arash.Joorabchi, Enamex, Nyuszika7H,Ahsbdx, Babsbshd and Anonymous: 414

• Metavariable Source: https://en.wikipedia.org/wiki/Metavariable?oldid=618282154Contributors: Timrollpickering, Vegaswikian, DavidPierce, Courcelles, Gregbard, Kf4yfd, Cnilep, CorenSearchBot, AnomieBOT,Morton Shumway, Bk314159, Tijfo098, Garetjax3891 andAnonymous: 3

• Proposition Source: https://en.wikipedia.org/wiki/Proposition?oldid=672228119 Contributors: AxelBoldt, Mav, Toby Bartels, Zoe,Stevertigo, K.lee, Michael Hardy, Zeno Gantner, TakuyaMurata, Minesweeper, Evercat, Sethmahoney, Conti, Reddi, Greenrd, Markhurd,Hyacinth, Banno, RedWolf, Ojigiri~enwiki, Timrollpickering, Tobias Bergemann, Giftlite, Jason Quinn, Stevietheman, Antandrus, Su-perborsuk, Sebbe, Amicuspublilius, Martpol, Hapsiainen, Vanished user lp09qa86ft, Chalst, Phiwum, Duesentrieb, Bobo192, Larry V,MPerel, Helix84, V2Blast, Ish ishwar, Emvee~enwiki, RJFJR, Bobrayner, Philthecow, Velho, Woohookitty, Kzollman, Isnow, Patl,

Brolin Empey, Lakitu~enwiki, Fresheneesz, Bornhj, YurikBot, Hairy Dude, Rick Norwood, Wknight94, Finell, SmackBot, Evanreyes,Ignacioerrico, Bluebot, Jaymay, DHN-bot~enwiki, Cybercobra, Richard001, Lacatosias, Jon Awbrey, Vina-iwbot~enwiki, Byelf2007,Harryboyles, SilkTork, Ckatz, 16@r, Grumpyyoungman01, Stwalkerster, Caiaffa, Levineps, Iridescent, JoeBot, Gveret Tered, Eastlaw,CRGreathouse, CBM, Sdorrance, Andkore, Gregbard, Juansempere, Yesterdog, Thijs!bot, Barticus88, Kredal, AllenFerguson, Voyag-ing, NSH001, JAnDbot, MER-C, Leolaursen, Bookinvestor, Connormah, VoABot II, WhatamIdoing, Pomte, Tgeairn, J.delanoy, Ali,Ginsengbomb, Katalaveno, Coppertwig, Nieske, Funandtrvl, King Lopez, ABF, TXiKiBoT, Philogo, Tracerbullet11, Cnilep, Barkeep,SieBot, Legion fi, Oxymoron83, OKBot, ClueBot, The Thing That Should Not Be, Watchduck, Estirabot, Hans Adler, Hugo Herbelin,DumZiBoT, Makotoy, Crazy Boris with a red beard, Dthomsen8, Dwnelson, SilvonenBot, Good Olfactory, Addbot, Andrewghutchison,LAAFan, Luckas-bot, TheSuave, Denyss, THENWHOWAS PHONE?, Ehuss, KamikazeBot, AnomieBOT, E235, Yalckram, Wortafad,ArthurBot, Luis Felipe Schenone, Omnipaedista, FrescoBot, BrideOfKripkenstein, Motomuku, Pinethicket, A8UDI, Monkeymanman,Gamewizard71, FoxBot, Lotje, TheMesquito, Daliot, EmausBot, John of Reading, Eekerz, Look2See1, Honestrosewater, Bollyjeff,Coasterlover1994, Chewings72, ClueBot NG, MelbourneStar, Satellizer, Masssly, Helpful Pixie Bot, Hans-Jürgen Streicher~enwiki,,زكريا ChrisGualtieri, Jochen Burghardt, Eyesnore, Purnendu Karmakar, DetectiveKraken, SanketDash, Ashika Bieber, Eavestn andAnonymous: 163

• Propositional calculus Source: https://en.wikipedia.org/wiki/Propositional_calculus?oldid=675960448 Contributors: The Anome, Tar-quin, Jan Hidders, Tzartzam, Michael Hardy, JakeVortex, Kku, Justin Johnson, Minesweeper, Looxix~enwiki, AugPi, Rossami, Ev-ercat, BAxelrod, Charles Matthews, Dysprosia, Hyacinth, UninvitedCompany, BobDrzyzgula, Robbot, Benwing, MathMartin, Rorro,GreatWhiteNortherner, Marc Venot, Ancheta Wis, Giftlite, Lethe, Jason Quinn, Gubbubu, Gadfium, LiDaobing, Grauw, Almit39, Ku-tulu, Creidieki, Urhixidur, PhotoBox, EricBright, Extrapiramidale, Rich Farmbrough, Guanabot, FranksValli, Paul August, GlennWillen,Elwikipedista~enwiki, Tompw, Chalst, BrokenSegue, Cmdrjameson, Nortexoid, Varuna, Red Winged Duck, ABCD, Xee, Nightstallion,Bookandcoffee, Oleg Alexandrov, Japanese Searobin, Joriki, Linas, Mindmatrix, Ruud Koot, Trevor Andersen, Waldir, Graham87, Qw-ertyus, Kbdank71, Porcher, Koavf, PlatypeanArchcow, Margosbot~enwiki, Kri, Gareth E Kegg, Roboto de Ajvol, Hairy Dude, Russell C.Sibley, Gaius Cornelius, Ihope127, Rick Norwood, Trovatore, TechnoGuyRob, Jpbowen, Cruise, Voidxor, Jerome Kelly, Arthur Rubin,Reyk, Teply, GrinBot~enwiki, SmackBot, Michael Meyling, Imz, Incnis Mrsi, Srnec, Mhss, Bluebot, Cybercobra, Jon Awbrey, Andeggs,Ohconfucius, Lambiam, Wvbailey, Scientizzle, Loadmaster, Mets501, Pejman47, JulianMendez, Adriatikus, Zero sharp, JRSpriggs,George100, Harold f, Vaughan Pratt, CBM, ShelfSkewed, Sdorrance, Gregbard, Cydebot, Julian Mendez, Taneli HUUSKONEN, Ap-plemeister, GeePriest, Salgueiro~enwiki, JAnDbot, Thenub314, Hut 8.5, Magioladitis, Paroswiki, MetsBot, JJ Harrison, Epsilon0, San-tiago Saint James, R'n'B, N4nojohn, Wideshanks, TomS TDotO, Created Equal, The One I Love, Our Fathers, STBotD, Mistercupcake,VolkovBot, JohnBlackburne, TXiKiBoT, Lynxmb, The Tetrast, Philogo, Wikiisawesome, General Reader, Jmath666, VanishedUser-ABC, Sapphic, Newbyguesses, SieBot, Iamthedeus, Дарко Максимовић, Jimmycleveland, OKBot, Svick, Huku-chan, Francvs, Clue-Bot, Unica111, Wysprgr2005, Garyzx, Niceguyedc, Thinker1221, Shivakumar2009, Estirabot, Alejandrocaro35, Reuben.cornel, HansAdler, MilesAgain, Djk3, Lightbearer, Addbot, Rdanneskjold, Legobot, Yobot, Tannkrem, Stefan.vatev, Jean Santeuil, AnomieBOT,Materialscientist, Ayda D, Doezxcty, Cwchng, Omnipaedista, SassoBot, January2009, Thehelpfulbot, FrescoBot, LucienBOT, Xenfreak,HRoestBot, Dinamik-bot, EmausBot, John of Reading, 478jjjz, Chharvey, Chewings72, Bomazi, Tijfo098, MrKoplin, Frietjes, Help-ful Pixie Bot, Brad7777, Wolfmanx122, Hanlon1755, Jochen Burghardt, Mark viking, Mrellisdee, Christian Nassif-Haynes, MatthewKastor, Marco volpe, Jwinder47, Mario Castelán Castro, Eavestn, SiriusGR and Anonymous: 148

• Propositional formula Source: https://en.wikipedia.org/wiki/Propositional_formula?oldid=658354394 Contributors: Michael Hardy, Hyacinth, Timrollpickering, Tobias Bergemann, Filemon, Giftlite, Golbez, PWilkinson, Klparrot, Bookandcoffee, Woohookitty, Linas, Mindmatrix, Tabletop, BD2412, Kbdank71, Rjwilmsi, Bgwhite, YurikBot, Hairy Dude, RussBot, Open2universe, SmackBot, Hmains, Chris the speller, Bluebot, Colonies Chris, Tsca.bot, Jon Awbrey, Muhammad Hamza, Lambiam, Wvbailey, Wizard191, Iridescent, Happy-melon, ChrisCork, CBM, Gregbard, Cydebot, Julian Mendez, Nick Number, Arch dude, Djihed, R'n'B, Raise exception, Wikiisawesome, Billinghurst, Spinningspark, WRK, Maelgwnbot, Jaded-view, Mild Bill Hiccup, Neuralwarp, Addbot, Yobot, Adelpine, AnomieBOT, Neurolysis, LilHelpa, The Evil IP address, Kwiki, Kevin Gorman, Helpful Pixie Bot, BG19bot, PhnomPencil, Wolfmanx122, Jochen Burghardt, Mark viking, Knife-in-the-drawer and Anonymous: 18

• Rule of inference Source: https://en.wikipedia.org/wiki/Rule_of_inference?oldid=671924786 Contributors: Michael Hardy, Darkwind, Poor Yorick, Rossami, BAxelrod, Hyacinth, Ldo, Timrollpickering, Markus Krötzsch, Jason Quinn, Khalid hassani, Neilc, Quadell, CSTAR, Lucidish, MeltBanana, Elwikipedista~enwiki, EmilJ, Nortexoid, Giraffedata, Joriki, Ruud Koot, Hurricane Angel, Waldir, BD2412, Kbdank71, Emallove, Brighterorange, Algebraist, YurikBot, Rsrikanth05, Cleared as filed, Arthur Rubin, Fram, Nahaj, Elwood j blues, Mhss, Chlewbot, Byelf2007, ArglebargleIV, Robofish, Tktktk, Jim.belk, Physis, JHunterJ, Grumpyyoungman01, Dan Gluck, CRGreathouse, CBM, Simeon, Gregbard, Cydebot, Thijs!bot, Epbr123, LokiClock, TXiKiBoT, Cliff, Eusebius, Alejandrocaro35, Addbot, Luckas-bot, AnomieBOT, Citation bot, GrouchoBot, RibotBOT, WillMall, Undsoweiter, Jonesey95, Gamewizard71, Onel5969, TomT0m, Tesseract2, Tijfo098, ClueBot NG, Delphinebbd, Ginsuloft and Anonymous: 28

• Semantics Source: https://en.wikipedia.org/wiki/Semantics?oldid=671135429 Contributors: The Anome, Youssefsan, Vaganyik, Or-tolan88, Ben-Zin~enwiki, HannesHirzel, Heron, Ryguasu, Netesq, Stevertigo,Michael Hardy, Pit~enwiki, Gdarin, Rp, Kku, Looxix~enwiki,Glenn, Rossami, Andres, Hectorthebat, Jitse Niesen, Mjklin, Haukurth, Shizhao, Fvw, Jens Meiert, Jon Roland, Seriv, Robbot, Lambda,Pigsonthewing, Jakohn, Kiwibird, Sverdrup, Rursus, Moink, Spellbinder, Marc Venot, Gwalla, Markus Krötzsch, Jpta~enwiki, HHirzel,Everyking, Zhen Lin, Eequor, Khalid hassani, Jackol, Javier Carro, JoJan, Mukerjee, Augur, Kntg, Bornslippy, Urhixidur, Yuriz, Lu-cidish, Rich Farmbrough, Cacycle, Rama, Slipstream, Kzzl, Dbachmann, Paul August, Jaberwocky6669, Evice, El C, Chalst, Joan-joc~enwiki, Linkoman, Enric Naval, Nortexoid, Jonsafari, Jooyoonchung, Helix84, Anthony Appleyard, Mark Dingemanse, Sligocki,Cdc, Sabrebattletank, Ish ishwar, Tycho, EvenT, Jason L. Gohlke, Redvers, Simlorie, Galaxiaad, Ott, Jtauber, Velho, Woohookitty,Mindmatrix, Kokoriko, Kelisi, Analogisub, SDC, Mandarax, Graham87, Imersion, Rjwilmsi, Mayumashu, Koavf, Jivecat, Dmccreary,Brighterorange, Mlinar~enwiki, NeoAmsterdam, FlaBot, Sinatra, Isotope23, Ben Babcock, Vonkje, Comiscuous, Lambyuk, Chobot,Sonic Mew, Roboto de Ajvol, YurikBot, Wavelength, Hairy Dude, Retodon8, Stephenb, Anomalocaris, NawlinWiki, Maunus, Mark-Brooks, JECompton, WAS 4.250, Light current, G. Lakoff, Lt-wiki-bot, Donald Albury, SMcCandlish, JuJube, Pred, AGToth, Nick-elShoe, Sardanaphalus, SmackBot, Zerida, Unyoyega, Shamalyguy, Lindosland, Chris the speller, MasterofUnvrs314, MK8, MalafayaBot,Droll, Jerome Charles Potts, A. B., Scwlong, Zsinj, Frap, Ioscius, Chlewbot, SundarBot, Khoikhoi, Cybercobra, Iblardi, Battamer, JonAwbrey, Byelf2007, SashatoBot, 16@r, Hvn0413, Nabeth, Kvng, Hu12, Gandalf1491, J Di, DEddy, Ziusudra, George100, Stifynse-mons, Wolfdog, Sir Vicious, Kensall, Gregbard, FilipeS, Cydebot, Warhorus, ST47, Quibik, Nickleus, Gimmetrow, Thijs!bot, Wikid77,Runch, Mbell, Dalahäst, Azymuthca, X201, Nick Number, Mentifisto, AntiVandalBot, Shawn wiki, Gioto, Widefox, TimVickers, DylanLake, Danny lost, JAnDbot, MER-C, Shermanmonroe, Jmchambers90, Dcooper, .anacondabot, Daveh1, AndriesVanRenssen, Tmus-grove, Nicodemus13, Mahitgar, Revery~enwiki, Mechanismic, Ekotkie, MartinBot, J.delanoy, Cyborg Ninja, Piercetheorganist, Dbiel,Rod57, AKA MBG, Lygophile, Erick.Antezana, Lrunge, RasputinJSvengali, Macedonian, LokiClock, Philip Trueman, Amos Han,

TXiKiBoT, Purpose Observatory, Aaeamdar, Goberiko~enwiki, HillarySco, Merijn2, Synthebot, Lova Falk, Cnilep, Jimbo2222, Lo-gan, Botev, SieBot, Nubiatech, Kgoarany, Asderff, PaulColby, Jerryobject, Yerpo, ScAvenger lv, Strife911, Bguest, MiNombreDeGuerra,Doc honcho, CharlesGillingham, Emptymountains, Martarius, ClueBot, Bbadree, Tanglewood4, Eklir, Niceguyedc, DragonBot, Awi007,PixelBot, Vanisheduser12345, Rhododendrites, MacedonianBoy, Cenarium, Aleksd, Micmachete, MystBot, Alanthehat, Addbot, Rdan-neskjold, The singapore ministry of education sucks, AVand, Guoguo12, Landon1980, Friginator, K1US, Aboctok, Ayatniazi, Cana-dianLinuxUser, Pirtskhalava, CarsracBot, Numbo3-bot, Erutuon, Tide rolls, JAHendler, Krixou, Legobot, Luckas-bot, TaBOT-zerem,Vanished user rt41as76lk, AnakngAraw, 8ung3st, Molewood6, Rockypedia, Rjanag, Govindmaheswaran, Jim1138, Materialscientist,Citation bot, LilHelpa, Xqbot, Hyggelig, Lynch9000s, Aenioc, JustinCope82, Omnipaedista, Benjamin Dominic, FrescoBot, Levalley,Citation bot 1, Mundart, Smithonian, Harold Philby, Pinethicket, Joost.b, RedBot, MastiBot, Nora lives, FoxBot, ,کاشفعقیل Jonkerz,Lotje, Theyetiman12345, RobotQuistnix, 2bluey, Mchcopl, Zegarad, EmausBot, Jefffi, Active Banana, Hpvpp, Alexey.kudinkin, Lla-mas4drama'10, Unreal7, SporkBot, Gabnh, Eric Biggs, Edunoramus, Kgsbot, Ready, Odysseus1479, Tijfo098, Manytexts, ClueBot NG,Squarrels, Aniketdalal, Movses-bot, Helpful Pixie Bot, BG19bot, BenSmak, Boblibr, Lawandeconomics1, Davidiad, Tom Pippens, Se-mantia, UnconsciousInferno, Darylgolden, Suraduttashandilya, Dave5702, Kevin12xd, Faizan, Bienmanchot, Ahernandez33, Didigodot,Noizy Boy, Sarahjane212013, Pavel Stankov, Csusarah, FelixRosch, Good afternoon, Nøkkenbuer, Spyker247, KasparBot, Vjpand andAnonymous: 278

• Symbol (formal) Source: https://en.wikipedia.org/wiki/Symbol_(formal)?oldid=630172650 Contributors: Dominus, Markhurd, Hyacinth, Timrollpickering, Mukerjee, Rich Farmbrough, Ruud Koot, MithrandirMage, Arthur Rubin, SmackBot, CBM, Gregbard, Cydebot, PamD, Nick Number, Calaka, R'n'B, Good Olfactory, Addbot, AnomieBOT, FrescoBot, Tijfo098, Kejia, Jiri 1984, Masssly, HMSSolent, Saehry, Wamiq, Hbb 1988 and Anonymous: 6

• Syntax (logic) Source: https://en.wikipedia.org/wiki/Syntax_(logic)?oldid=643357223 Contributors: Michael Hardy, Charles Matthews, Hyacinth, Gwydion~enwiki, Rich Farmbrough, Whosyourjudas, Oleg Alexandrov, Woohookitty, BD2412, MithrandirMage, Roboto de Ajvol, Bhny, Arthur Rubin, Otto ter Haar, SmackBot, Nbarth, Lambiam, Derek farn, Physis, CBM, Gregbard, FilipeS, Cydebot, Hebrides, Thijs!bot, Nick Number, Thenub314, Johan1298~enwiki, Macedonian, Hans Adler, Marc van Leeuwen, Good Olfactory, Addbot, Cote d'Azur, Erik9bot, FrescoBot, Eball, Masssly and Anonymous: 3

• Theorem Source: https://en.wikipedia.org/wiki/Theorem?oldid=668391294 Contributors: AxelBoldt, Mav, Zundark, The Anome, Tar-quin, Tbackstr, XJaM, Aldie, Michael Hardy, Zeno Gantner, TakuyaMurata, Bagpuss, Glenn, Tim Retout, Rotem Dan, Andres, CharlesMatthews, Dcoetzee, Bemoeial, Hyacinth, Traroth, SirPeebles, Moriel~enwiki, Josh Cherry, Fredrik, MathMartin, Ojigiri~enwiki, Tim-rollpickering, Hadal, Alan Liefting, Snobot, Ancheta Wis, Tosha, Giftlite, Monedula, Fropuff, Fishal, Chowbok, Alaz, MarkSweep,Karol Langner, Jacob grace, Pmanderson, Tyler McHenry, Hkpawn~enwiki, TzankoMatev, Joyous!, EugeneZelenko, Discospinster, RichFarmbrough, Paul August, Bender235, Gauge, Tompw, El C, Edwinstearns, Billymac00, Sasquatch, Alansohn, Gary, Sciurinæ, Joriki,Igny, Ruud Koot, Eras-mus, Mekong Bluesman, Tslocum, Graham87, BD2412, Gmelli, Sdornan, Salix alba, Jrtayloriv, AndriuZ, M7bot,Chobot, MithrandirMage, Sbrools, DVdm, Algebraist, Roboto de Ajvol, Borgx, Chaos, Trovatore, Dbfirs, Bota47, Tomisti, Arthur Ru-bin, Kier07, Anclation~enwiki, Curpsbot-unicodify, Erudy, Finell, Sardanaphalus, SmackBot, RDBury, Bomac, Skizzik, Chris the speller,Fuzzform, MalafayaBot, DHN-bot~enwiki, Sholto Maud, Cybercobra, Acdx, SashatoBot, Lambiam, IronGargoyle, Craigblock, Lalaith,Autonova, Mike Fikes, Zero sharp, JRSpriggs, CRGreathouse, Hi.ro, CBM, Gregbard, Cydebot, R Harris, Moxmalin, Thijs!bot, Epbr123,Mrcs, James086, Nick Number, AntiVandalBot, Thenub314, .anacondabot, Magioladitis, Dvptl, Animum, Gwern, Stephenchou0722,Pomte, Hippasus the Younger, Fcsuper, Jeepday, Nznancy, Coppertwig, Haseldon, Tparameter, Fjbfour, Dessources, DavidCBryant,Caiodnh, Austinmohr, VolkovBot, Psmythirl, Am Fiosaigear~enwiki, TXiKiBoT, Rei-bot, IKiddo, Voorlandt, Philogo, Geometry guy,Graymornings, Dmcq, HiDrNick, DestroyerofDreams, Hthoreau2, Newbyguesses, SieBot, Respir, Huzzah018, Oxymoron83, Simon-Trew, OKBot, Kumioko (renamed), DesolateReality, Wjemather, Loren.wilton, ClueBot, Blanchardb, Excirial, Alexbot, TheSnacks,Hans Adler, Muzz 2008, MonoBot, XLinkBot, Burkaja, Marc van Leeuwen, SilvonenBot, Badgernet, Addbot, DrThunder88, Some jerkon the Internet, CanadianLinuxUser, CarsracBot, Dr. Universe, Favonian, Numbo3-bot, Flatfish89, Stepfordswife, Zorrobot, Legobot,Cote d'Azur, Luckas-bot, Yobot, II MusLiM HyBRiD II, Kristen Eriksen, LilHelpa, Xqbot, Doezxcty, Capricorn42, Almabot, Miym,GrouchoBot, Jubb-green, RibotBOT, FrescoBot, Sirtywell, Haeinous, Heptadecagon, BigDwiki, RedBot, Eric wisniewski, EmausBot,ZéroBot, The Nut, Xzenu, GZ-Bot, H3llBot, D.Lazard, ChuispastonBot, ClueBot NG, Helpful Pixie Bot, Howald, Jibun, bukiyou desukara, Khazar2, Oracions, Wywin, Yamaha5, Loraof, EoRdE6, BabyChastie, Nbro, KasparBot, Blakktaktiks and Anonymous: 111

• Theory (mathematical logic) Source: https://en.wikipedia.org/wiki/Theory_(mathematical_logic)?oldid=655374287 Contributors: Vkuncak, Hyacinth, Timrollpickering, Jabowery, Uffish, Woohookitty, Linas, Waldir, Tillmo, Algebraist, Trovatore, Arthur Rubin, Cícero, Lambiam, CBM, Myasuda, Gregbard, David Eppstein, Philogo, Hans Adler, Iranway, Addbot, Numbo3-bot, Yobot, Buenasdiaz, MastiBot, ClueBot NG, Wcherowi, Masssly and Anonymous: 9

• Unate function Source: https://en.wikipedia.org/wiki/Unate_function?oldid=646692253 Contributors: Timrollpickering, Rich Farmbrough, Sgauria, RJFJR, Salix alba, Cydebot, Addbot, Leeaiwei, San wolverine4, Avsmal, Mogism and Anonymous: 3

• Variable (mathematics) Source: https://en.wikipedia.org/wiki/Variable_(mathematics)?oldid=666440432Contributors: Michael Hardy,Rp, TakuyaMurata, Nickshanks, Robbot, Gandalf61, Tobias Bergemann, Giftlite, Micru, Macrakis, Kusunose, Iantresman, Mike Rosoft,Discospinster, Rgdboer, Kwamikagami, MattGiuca, Eclecticos, Silivrenion, Bgwhite, Phantomsteve, Reyk, True Pagan Warrior, Smack-Bot, RDBury, Georg-Johann, Rrburke, Cybercobra, Kashmiri, JForget, CRGreathouse,Myasuda, Gregbard, Cydebot, Thijs!bot, Marek69,Seaphoto, QuiteUnusual, Bongwarrior, P64, JamesBWatson, JaGa, R'n'B, Gill110951, Fylwind, Idioma-bot, VolkovBot, LokiClock,Philip Trueman, LimStift, Enviroboy, Symane, Gaelen S., SieBot, Steorra, Gerakibot, Flyer22, Michel421, Correogsk, Sean.hoyland,Melcombe, ClueBot, Mild Bill Hiccup, Excirial, He7d3r, Muhandes, SchreiberBike, Qwfp, Marc van Leeuwen, Mifter, Addbot, Somejerk on the Internet, Vchorozopoulos, Glane23, Tide rolls, Zorrobot, Luckas-bot, KamikazeBot, Reindra, AnomieBOT, Materialscientist,Holmes7893, The High Fin SpermWhale, Xqbot, Jsharpminor, Isheden, Rodneidy, Erik9bot, Sławomir Biały, Boxplot, Pinethicket, Mrs-marenawalker, LittleWink, RedBot, TobeBot, LogAntiLog, Nataev, TjBot, DASHBot, EmausBot, Razertek, ZéroBot, Bollyjeff, Phrixus-sun, D.Lazard, Paulmiko, FrankFlanagan, BioPupil, Emperyan, ChuispastonBot, EdoBot, Txus.aparicio, ClueBot NG, Wcherowi, Satel-lizer, Cntras, Kevin Gorman, LightBringer, Widr, Jojo966, Ignacitum, HMSSolent, Wiki13, David815, AwamerT, Mark Arsten, Trav-elour, Thatemooverthere, GoShow, EuroCarGT, Dexbot, Lugia2453, Bulba2036, Jamesx12345, Nbeaver, YiFeiBot, Cubism44, Ashleyangulo, Wikapedist, Thewikiguru1, Grayhawk22, Mhsh98, BrianPansky, Gmalaven, Rainboomcool, Gamingforfun365, Karissaisbae andAnonymous: 146

• Well-formed formula Source: https://en.wikipedia.org/wiki/Well-formed_formula?oldid=660313819 Contributors: Edward, Michael Hardy, DavidWBrooks, Charles Matthews, Wik, Hyacinth, Onebyone, Josh Cherry, Tobias Bergemann, Giftlite, Andy, Jeffwarnica, Abdull, Mh, Dolda2000, Spayrard, Linas, GregorB, BD2412, Qwertyus, MithrandirMage, SpuriousQ, IanManka, Ihope127, Trovatore, Pawyilee, Ripper234, Arthur Rubin, Otto ter Haar, SmackBot, Pokipsy76, Mhss, Ioscius, Jgoulden, B7T, Midnighttonight, CRGreathouse, CBM, Gregbard, Cydebot, Julian Mendez, Forgot, Nick Number, Behco, Laymanal, CountingPine, Epsilon0, Dessources, Philogo, Paradoctor, DeathByNukes, Kumioko (renamed), Martarius, ClueBot, PixelBot, Hans Adler, Jonverve, Hugo Herbelin, LheaJLove, Addbot, Yobot, Ptbotgourou, Notacupcakebaker, GrouchoBot, Tkuvho, Jonesey95, Dude1818, Gamewizard71, BillyPreset, GoingBatty, Bamyers99, Pyy999, Wcherowi, Helpful Pixie Bot, Laymanallen, CitationCleanerBot, Blakehill, Jochen Burghardt and Anonymous: 38

24.12.2 Images

• File:4CT_Non-Counterexample_1.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a6/4CT_Non-Counterexample_1.svg License: Public domain Contributors: Based on this raster image by Dmharvey on en.wikipedia. Original artist: Inductiveload

• File:Accusative_alignment.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/66/Accusative_alignment.svg License: Public domain Contributors: Own work Original artist: User:RedHotHeat

• File:Ambox_important.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/b4/Ambox_important.svg License: Public domain Contributors: Own work, based off of Image:Ambox scales.svg Original artist: Dsmurat (talk · contribs)

• File:CollatzFractal.png Source: https://upload.wikimedia.org/wikipedia/commons/1/1c/CollatzFractal.png License: Public domain Contributors: English wikipedia Original artist: Pokipsy76

• File:Commons-logo.svg Source: https://upload.wikimedia.org/wikipedia/en/4/4a/Commons-logo.svg License: ? Contributors: ? Original artist: ?

• File:Complex-adaptive-system.jpg Source: https://upload.wikimedia.org/wikipedia/commons/0/00/Complex-adaptive-system.jpg License: Public domain Contributors: Own work by Acadac : Taken from en.wikipedia.org, where Acadac was inspired to create this graphic after reading: Original artist: Acadac

• File:Folder_Hexagonal_Icon.svg Source: https://upload.wikimedia.org/wikipedia/en/4/48/Folder_Hexagonal_Icon.svg License: Cc-by-sa-3.0 Contributors: ? Original artist: ?

• File:Formal_languages.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/da/Formal_languages.svg License: CC BY-SA 3.0 Contributors: Own work based on: en:Image:Formal languages.png by Gregbard. Original artist: MithrandirMage

• File:HelloWorld.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/28/HelloWorld.svg License: Public domain Contributors: Own work Original artist: Wooptoo

• File:Logic.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/e7/Logic.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: It Is Me Here

• File:Logic_portal.svg Source: https://upload.wikimedia.org/wikipedia/commons/7/7c/Logic_portal.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Watchduck (a.k.a. Tilman Piesk)

• File:Logical_connectives_Hasse_diagram.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3e/Logical_connectives_Hasse_diagram.svg License: Public domain Contributors: Own work Original artist: Watchduck (a.k.a. Tilman Piesk)

• File:Mergefrom.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/Mergefrom.svg License: Public domain Contributors: ? Original artist: ?

• File:ParseTree.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6e/ParseTree.svg License: Public domain Contributors: en:Image:ParseTree.jpg Original artist: Traced by User:Stannered

• File:Predicate_logic;_2_variables;_example_matrix_a(12).svg Source: https://upload.wikimedia.org/wikipedia/commons/5/53/Predicate_logic%3B_2_variables%3B_example_matrix_a%2812%29.svg License: Public domain Contributors: ? Original artist: ?

• File:Predicate_logic;_2_variables;_example_matrix_a12.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/55/Predicate_logic%3B_2_variables%3B_example_matrix_a12.svg License: Public domain Contributors: ? Original artist: ?

• File:Predicate_logic;_2_variables;_example_matrix_a1e2_nodiag.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/ce/Predicate_logic%3B_2_variables%3B_example_matrix_a1e2_nodiag.svg License: CC BY-SA 3.0 Contributors: Own work, based on File:Predicate_logic;_2_variables;_example_matrix_a1e2.svg (didn't clean-up the mess in the svg source before modifying) Original artist: Jochen Burghardt

• File:Predicate_logic;_2_variables;_example_matrix_a2e1.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/62/Predicate_logic%3B_2_variables%3B_example_matrix_a2e1.svg License: Public domain Contributors: ? Original artist: ?

• File:Predicate_logic;_2_variables;_example_matrix_e(12).svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f0/Predicate_logic%3B_2_variables%3B_example_matrix_e%2812%29.svg License: Public domain Contributors: ? Original artist: ?

• File:Predicate_logic;_2_variables;_example_matrix_e12.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5d/Predicate_logic%3B_2_variables%3B_example_matrix_e12.svg License: Public domain Contributors: ? Original artist: ?

• File:Predicate_logic;_2_variables;_example_matrix_e1a2.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/02/Predicate_logic%3B_2_variables%3B_example_matrix_e1a2.svg License: Public domain Contributors: ? Original artist: ?

• File:Predicate_logic;_2_variables;_example_matrix_e2a1.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3a/Predicate_logic%3B_2_variables%3B_example_matrix_e2a1.svg License: Public domain Contributors: ? Original artist: ?

• File:Predicate_logic;_2_variables;_implications.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/Predicate_logic%3B_2_variables%3B_implications.svg License: Public domain Contributors: Own work Original artist: Watchduck (a.k.a. Tilman Piesk)

• File:Prop-tableau-4.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/21/Prop-tableau-4.svg License: CC-BY-SA-3.0 Contributors: Transferred from en.wikipedia; transferred to Commons by User:Piquart using CommonsHelper. Original artist: Original uploader was Tizio at en.wikipedia. Later version(s) were uploaded by RobHar at en.wikipedia.

• File:Propositional_formula_3.png Source: https://upload.wikimedia.org/wikipedia/commons/d/dc/Propositional_formula_3.png License: CC-BY-SA-3.0 Contributors: Drawn by wvbailey in Autosketch then imported into Adobe Acrobat and exported as .png. Original artist: User:Wvbailey

• File:Propositional_formula_NANDs.png Source: https://upload.wikimedia.org/wikipedia/commons/c/c9/Propositional_formula_NANDs.png License: CC-BY-SA-3.0 Contributors: Own work Original artist: User:Wvbailey

• File:Propositional_formula_connectives_1.png Source: https://upload.wikimedia.org/wikipedia/en/c/ca/Propositional_formula_connectives_1.png License: Cc-by-sa-3.0 Contributors: ? Original artist: ?

• File:Propositional_formula_flip_flops_1.png Source: https://upload.wikimedia.org/wikipedia/en/5/5b/Propositional_formula_flip_flops_1.png License: Cc-by-sa-3.0 Contributors: ? Original artist: ?

• File:Propositional_formula_maps_1.png Source: https://upload.wikimedia.org/wikipedia/en/b/bb/Propositional_formula_maps_1.png License: Cc-by-sa-3.0 Contributors: ? Original artist: ?

• File:Propositional_formula_maps_2.png Source: https://upload.wikimedia.org/wikipedia/commons/9/90/Propositional_formula_maps_2.png License: CC-BY-SA-3.0 Contributors: Own work by the original uploader Original artist: User:Wvbailey

• File:Propositional_formula_oscillator_1.png Source: https://upload.wikimedia.org/wikipedia/en/e/e3/Propositional_formula_oscillator_1.png License: Cc-by-sa-3.0 Contributors: ? Original artist: ?

• File:Pythagorean_Proof_(3).PNG Source: https://upload.wikimedia.org/wikipedia/commons/1/16/Pythagorean_Proof_%283%29.PNG License: CC BY-SA 3.0 Contributors: Own work Original artist: Brews ohare

• File:Question_book-new.svg Source: https://upload.wikimedia.org/wikipedia/en/9/99/Question_book-new.svg License: Cc-by-sa-3.0 Contributors: Created from scratch in Adobe Illustrator. Based on Image:Question book.png created by User:Equazcion Original artist: Tkgd2007

• File:Socrates.png Source: https://upload.wikimedia.org/wikipedia/commons/c/cd/Socrates.png License: Public domain Contributors: Transferred from en.wikipedia to Commons. Original artist: The original uploader was Magnus Manske at English Wikipedia. Later versions were uploaded by Optimager at en.wikipedia.

• File:Venn1001.svg Source: https://upload.wikimedia.org/wikipedia/commons/4/47/Venn1001.svg License: Public domain Contributors:? Original artist: ?

• File:Wikibooks-logo-en-noslogan.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/df/Wikibooks-logo-en-noslogan.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: User:Bastique, User:Ramac et al.

• File:Wikiquote-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/fa/Wikiquote-logo.svg License: Public domain Contributors: ? Original artist: ?

• File:Wiktionary-logo-en.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f8/Wiktionary-logo-en.svg License: Public domain Contributors: Vector version of Image:Wiktionary-logo-en.png. Original artist: Vectorized by Fvasconcellos (talk · contribs), based on original logo tossed together by Brion Vibber

• File:Wuppertal_Ronsdorf_-_Villa_Carnap_01_ies.jpg Source: https://upload.wikimedia.org/wikipedia/commons/4/49/Wuppertal_Ronsdorf_-_Villa_Carnap_01_ies.jpg License: CC-BY-SA-3.0 Contributors: Own work Original artist: Frank Vincentz

24.12.3 Content license

• Creative Commons Attribution-Share Alike 3.0