RUSSELL MARCUS
Hamilton College
Printed by LSC Communications, United States of America
Contents
Preface vii
Introduction to Formal Logic (IFL) and Introduction to Formal Logic with Philosophi-
cal Applications (IFLPA) are a pair of new logic textbooks, designed for students of
formal logic and their instructors, to be rigorous, yet friendly and accessible. Unlike
many other logic books, IFL and IFLPA both focus on deductive logic. They cover
syntax, semantics, and natural deduction for propositional and predicate logics. They
emphasize translation and derivations, with an eye to semantics throughout. Both
books contain over 2000 exercises, enough for in-class work and homework, with
plenty left over for extra practice, and more available on the Oxford website.
Teachers of logic are often faced with a bimodal distribution of student abili-
ties: some students get the material quickly, and some students take more time—
sometimes significantly more time—to master it. Thus one central challenge to logic
teachers is to figure out how to support the former group of students while keeping
the latter group engaged. I have addressed this challenge, in part, by providing lots
of exercises with varying, progressive levels of difficulty and including some exercise
sections that can be used by the strongest students and skipped by others without
undermining their later work.
Since logic is most often taught in philosophy departments, special attention is
given to how logic is useful for philosophers. Many examples use philosophical con-
cepts, translating philosophical arguments to one of the formal languages, for ex-
ample, and deriving their conclusions using the inferential tools of the text. Some of
these arguments are artificial, as one might expect in an introductory logic text; I do
not endorse their content. I hope mainly to have the arguments be ones that someone
might use. There are plenty of exercises with more ordinary content, too, which may
be friendlier to the beginning student, or one with no background in philosophy.
SPECIAL FEATURES
Each section of IFL contains a Summary and a section of important points to
Keep in Mind.
Key terms are boldfaced in the text and defined in the margins, and are listed
at the end of each chapter. In addition, all terms are defined in a glossary at
the end of the book.
There are over 2000 exercises in the book.
Exercises are presented progressively, from easier to more challenging.
Translate-and-derive exercises are available in every section on deriva-
tions, helping to maintain students’ translation skills.
Translation exercises are supplemented with examples for translation
from formal languages into English.
Regimentations and translations contain both ordinary and philosophi-
cal themes.
Solutions to about 20% of the exercises are included at the back of the
book. Solutions to translate-and-derive exercises appear in two parts:
first, just the translation, and then the derivation. Solutions to all exer-
cises are available for instructors.
IFL contains several topics and exercise types not appearing in many standard
logic textbooks:
Seven rules for biconditionals, parallel to the standard rules for conditionals;
Exercises asking students to interpret and model short theories;
Two sections on functions at the end of Chapter 5.
ACKNOWLEDGMENTS
The first draft of this book was written in the summer of 2011. I worked that sum-
mer alongside my student, Jess Gutfleish, with support of a Class of 1966 Faculty
Development Award from the Dean of Faculty’s Office at Hamilton College, in the
archaeology teaching lab at Hamilton. I wrote the text, and she worked assiduously
and indefatigably writing exercises; I had difficulty keeping up with her. I am ineffably
grateful to Jess for all of her hard work and the mountain of insidiously difficult (as
well as more ordinary) logic problems she devised. Jess worked on more problems in
spring 2014. Spencer Livingstone worked with me and Jess in spring 2014. Deanna
Cho helped enormously with the section summaries and glossary in the summer of
2014, supported by the philosophy department at Hamilton College. Spencer Living-
stone and Phil Parkes worked during summer 2015, helping me with some research
and writing still further exercises. Sophie Gaulkin made many editing suggestions
during summer 2015. Reinaldo Camacho assisted me with new exercises in fall 2016.
Jess, Spencer, and Rey have all been indescribably supportive and useful as teaching
assistants and error-seeking weapons. Students in my logic classes at Hamilton, too
numerous to mention, found many typos. Andrew Winters, using a draft of the text at
Slippery Rock University in 2016, sent the errors he and his students discovered, and
made many helpful suggestions.
At the behest of Oxford, the following people made helpful comments on drafts of
the book, and I am grateful for their work:
Joshua Alexander, Siena College
Brian Barnett, St. John Fisher College
Larry Behrendt, Mercer County Community College
Thomas A. Blackson, Arizona State University
Dan Boisvert, University of North Carolina, Charlotte
Jeff Buechner, Rutgers University, Newark
Eric Chelstrom, Minnesota State University
Chris Dodsworth, Spring Hill College
Michael Futch, University of Tulsa
Nathaniel Goldberg, Washington and Lee University
Nancy Slonneger Hancock, Northern Kentucky University
Brian Harding, Texas Woman’s University
Reina Hayaki, University of Nebraska, Lincoln
Marc A. Hight, Hampden Sydney College
Jeremy Hovda, KU Leuven
Gyula Klima, Fordham University
Karen Lewis, Barnard College
Leemon McHenry, California State University, Northridge
John Piers Rawling, Florida State University
Reginald Raymer, University of North Carolina, Charlotte
Ian Schnee, Western Kentucky University
Chapter 1: Introducing Logic
When evaluating an argument, we can perform two distinct steps. First, we can see
whether the conclusion follows from the assumptions. An argument whose conclu-
sion follows from its premises is called valid. Chapter 2 is dedicated to constructing
a precise notion of deductive validity, of what follows, for propositional logic. Indeed,
the notion of validity is the central topic of the book.
A second step in evaluating an argument is to see whether the premises are true.
In a valid deductive argument, if the premises are true, then the conclusion must be
true. This result is what makes deductive logic interesting and is, in a sense, the most
important sentence of this entire book: in a valid deductive argument, if the premises
are true, then the conclusion must be.
An Introduction to Formal Logic is dedicated to the first step in the process of
evaluating arguments. The second step is not purely logical; it is largely scientific.
Roughly speaking, we examine our logic to see if our reasoning is acceptable and we
examine the world to see if our premises are true. Although we prefer our arguments
both to be valid and to have true premises, this book is dedicated mainly to the form
of the argument, not to its content.
concepts, it seemed obvious that the parallel postulate, Euclid’s fifth postulate,
would also hold.
1.3.3 Euclid’s Fifth Axiom, the Parallel Postulate: If a straight line falling
on two straight lines makes the interior angles on the same side less
than two right angles, the two straight lines, if produced indefinitely,
meet on that side on which are the angles less than the two right
angles.
The parallel postulate is equivalent to Playfair’s postulate (after John Playfair, the
Scottish mathematician who proposed his version in 1795), 1.3.4, which may be easier
to visualize.
1.3.4 Given a line, and a point not on that line, there exists a single line
that passes through the point and is parallel to the given line.
In the two millennia between Euclid and the early nineteenth century, geometers
tried in vain to prove 1.3.3 or 1.3.4. They did so mainly by trying to show that a
contradiction would arise from denying one or the other. They supposed that there
was more than one parallel line through the given point. They supposed that
there were no parallel lines through the given point. Both suppositions led to odd
kinds of spaces. But neither supposition led to an outright contradiction.
By the early nineteenth century, some mathematicians realized that instead of lead-
ing to contradiction, the denials of 1.3.3 and 1.3.4 lead to more abstract conceptions
of geometry, and exciting new fields of study. Riemann and others explored the prop-
erties of elliptical geometries, those that arise when adding the claim that there are
no parallel lines through the given point mentioned in Playfair’s postulate to the first
four axioms. Lobachevsky, Gauss, and others explored the properties of hyperbolic
geometries, which arise when adding the claim that there are infinitely many parallel
lines through the given point in 1.3.4 to the first four axioms. In both elliptical and
hyperbolic geometries, the notions of straightness and right-angularity, among oth-
ers, have to be adjusted. Our original Euclidean conceptions had been smuggled into
the study of geometry for millennia, preventing mathematicians from discovering
important geometric theories.
These geometric theories eventually found important applications in physical sci-
ence. The parallel postulate is also equivalent to the claim that the sum of the angles
of a triangle is 180°. Consider an interstellar triangle, formed by the light rays of three
stars, whose vertices are the centers of those stars. The sum of the angles of our inter-
stellar triangle will be less than 180° due to the curvatures of space-time correspond-
ing to the gravitational pull of the stars and other large objects. Space-time is not
Euclidean, but hyperbolic.
As in the case of Cantor’s work with infinity, mathematicians considering the
counterintuitive results of non-Euclidean geometries worried that the laws of logical
Augustus De Morgan, even earlier than Peirce and Frege, had worked on relational
logic. But Frege’s larger logicist project, coming mainly as a response to Kant’s phi-
losophy and that of the early nineteenth-century idealists, is especially interesting
to contemporary philosophers. Indeed, Frege produced seminal work not only in
logic and philosophy of mathematics, but in philosophy of language, epistemology,
and metaphysics.
But enough about this engaging history. Let’s get started with the formal work.
Premise indicators        Conclusion indicators
since                     therefore
because                   we may conclude that
for                       we may infer that
in that                   entails that
may be inferred from      hence
given that                thus
seeing that               consequently
for the reason that       it follows that
inasmuch as               implies that
owing to                  as a result
Although these lists are handy, they should not be taken as exhaustive or categorical.
Natural languages like English are inexact and non-formulaic. Not all sentences in an
argument will contain indicators. ‘And’ often indicates the presence of an additional
premise, but it can also be used to indicate the extension of a conclusion. Often you will
have to judge from the content of an argument which propositions are premises and
which are conclusions. The best way to identify premises and conclusions is to deter-
mine what the main point of an argument is, and then to see what supports that point.
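To make the indicator heuristic concrete, here is a small Python sketch. It is our illustration, not part of the text's method: the function name classify is invented, and the word lists are a small sample of the indicators above, not exhaustive. As the passage warns, a sentence with no indicator must be classified by thinking about content, which no simple scan can do.

```python
# A toy heuristic for flagging premises and conclusions by indicator
# words. It only handles the easy cases; judgment about content is
# still required for sentences with no indicator.

PREMISE_INDICATORS = ("since", "because", "given that", "seeing that",
                      "for the reason that", "inasmuch as", "owing to")
CONCLUSION_INDICATORS = ("therefore", "hence", "thus", "consequently",
                         "it follows that", "we may conclude that")

def classify(sentence: str) -> str:
    """Guess whether a sentence states a premise or a conclusion."""
    s = sentence.lower()
    if any(word in s for word in CONCLUSION_INDICATORS):
        return "conclusion"
    if any(word in s for word in PREMISE_INDICATORS):
        return "premise"
    return "unknown"  # no indicator: content, not form, must decide

print(classify("Therefore, this world is the best of all possible worlds."))
print(classify("Given that texting is distracting."))  # premise
```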
Once we have determined what the conclusion of an argument is, and which prop-
ositions are the premises, we can regiment the argument into numbered premise-
conclusion form, identifying each of the premises (P1, P2, etc.) and indicating the
conclusion with a ‘C’. Thus we can regiment 1.4.1 as the perspicuous 1.4.2, eliminat-
ing premise and conclusion indicators, and placing the conclusion at the end.
1.4.2 P1. Texting is distracting.
P2. Driving while distracted is wrong.
C. Texting while driving is wrong.
When regimenting an argument, the order of premises is unimportant. 1.4.3 would
be just as good a regimentation as 1.4.2.
1.4.3 P1. Driving while distracted is wrong.
P2. Texting is distracting.
C. Texting while driving is wrong.
Similarly, the number of premises is not very important. You can combine or sepa-
rate premises, though it is often useful to keep the premises as simple as possible. 1.4.4
is logically acceptable but not as perspicuous as 1.4.2 or 1.4.3.
1.4.4 P1. Driving while distracted is wrong, and texting is distracting.
C. Texting while driving is wrong.
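Premise-conclusion form lends itself to a simple data representation. The sketch below is our own illustration (the text works with arguments on paper, not in code); it treats a regimentation as a list of premises plus a conclusion, and makes the point above explicit: reordering the premises of 1.4.2 into 1.4.3 yields the same regimented argument.

```python
# A regimented argument: a list of premises plus one conclusion.
class Argument:
    def __init__(self, premises, conclusion):
        self.premises = list(premises)
        self.conclusion = conclusion

    def same_as(self, other):
        """Premise order is unimportant, so compare premises as sets."""
        return (set(self.premises) == set(other.premises)
                and self.conclusion == other.conclusion)

arg_142 = Argument(["Texting is distracting.",
                    "Driving while distracted is wrong."],
                   "Texting while driving is wrong.")
arg_143 = Argument(["Driving while distracted is wrong.",
                    "Texting is distracting."],
                   "Texting while driving is wrong.")
print(arg_142.same_as(arg_143))  # True: 1.4.2 and 1.4.3 are the same argument
```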
The most important task when first analyzing an argument is to determine its con-
clusion. The most serious mistake you can make in this exercise is to confuse prem-
ises and conclusions. Argument 1.4.5 is derived from Leibniz’s work.
1.4: Separating Premises from Conclusions
1.4.5 God is the creator of the world. If this world is not the best of all
possible worlds, then either God is not powerful enough to bring
about a better world or God did not wish this world to be the best.
So, this world is the best of all possible worlds, because God is both
omnipotent and all-good.
1.4.6 is a poor and misleading regimentation of 1.4.5, merely listing the assertions
in the order in which they appear in 1.4.5.
1.4.6 P1. God is the creator of the world.
P2. If this world is not the best of all possible worlds, then either
God is not powerful enough to bring about a better world or
God did not wish this world to be the best.
P3. This world is the best of all possible worlds.
C. God is both omnipotent and all-good.
The main problem with 1.4.6 is that it switches a premise and the conclusion. The
central claim of 1.4.5 is that this is the best of all possible worlds. The “so” at the begin-
ning of the last sentence is a hint to the conclusion. Thinking about the content of the
argument should produce the same analysis. A proper regimentation would switch
P3 and C, as in 1.4.7.
1.4.7 P1. God is the creator of the world.
P2. If this world is not the best of all possible worlds, then either
God is not powerful enough to bring about a better world or
God did not wish this world to be the best.
P3. God is both omnipotent and all-good.
C. This world is the best of all possible worlds.
Sometimes it is not easy to determine how to separate premises from conclusions.
Often, such discrimination requires broad context. For example, some single sen-
tences contain both a premise and a conclusion. Such compound sentences must be
divided. 1.4.8 is derived from Locke’s work.
1.4.8 Words must refer either to my ideas or to something outside my mind.
Since my ideas precede my communication, words must refer to my
ideas before they could refer to anything else.
A good regimentation of 1.4.8 divides the last sentence, as in 1.4.9.
1.4.9 P1. Words must refer either to my ideas or to something outside
my mind.
P2. My ideas precede my communication.
C. Words must refer to my ideas before they could refer to any-
thing else.
Some arguments contain irrelevant, extraneous information. When constructing
an argument, it is better to avoid extraneous claims, lest you distract or mislead a
reader. But when regimenting someone else’s argument, it is usually good practice to
include all claims, even extraneous ones. Then, when you are evaluating an argument,
you can distinguish the important premises from the extraneous ones.
Lastly, some arguments contain implicit claims not stated in the premises. These
arguments are called enthymemes. 1.4.10 is enthymemic.
1.4.10 P1. Capital punishment is killing a human being.
C. Capital punishment is wrong.
Again, when regimenting an argument, we ordinarily show just what is explicitly
present in the original. When evaluating an argument, we can mention suppressed
premises. For instance, we can convert 1.4.10 into a more complete argument by in-
serting a second premise.
1.4.11 P1. Capital punishment is killing a human being.
P2. Killing a human being is wrong.
C. Capital punishment is wrong.
Notice that P2 here is contentious. Is it always wrong to kill a human being? What if
you are defending yourself from a raging murderer? Or what if you are fighting a just
war? Some people believe that euthanasia is acceptable for people suffering from ter-
minal illnesses and in great pain. The contentiousness of P2 might explain why some-
one defending 1.4.10 might suppress it. Still, filling out an enthymeme is a job for later,
once you have become confident regimenting arguments as they appear. Nothing in
our logic will determine an answer to the interesting questions around claims like P2,
but logic will help us understand the structures of arguments that contain or suppress
such premises.
KEEP IN MIND
The first step in analyzing arguments is to identify a conclusion and separate it from the
premises.
There are often indicators for premises and conclusions.
EXERCISES 1.4
Regiment each of the following arguments into premise-
conclusion form. The inspiration for each argument is noted;
not all arguments are direct quotations.
1. Statements are meaningful if they are verifiable. There are mountains on the
other side of the moon. No rocket has confirmed this, but we could verify it to
be true. Therefore, the original statement is significant. (A. J. Ayer, Language,
Truth, and Logic)
2. The workingman does not have time for true integrity on a daily basis. He can-
not afford to sustain the manliest relations to men, for his work would be mini-
mized in the market. (Henry David Thoreau, Walden)
3. The passage from one stage to another may lead to long-continued different
physical conditions in different regions. These changes can be attributed to
natural selection. Hence, the dominant species are the most diffused in their
own country and make up the majority of the individuals, and often the most
well marked varieties. (Charles Darwin, On the Origin of Species)
4. We must be realists about mathematics. Mathematics succeeds as the language
of science. And there must be a reason for the success of mathematics as the lan-
guage of science. But no positions other than realism in mathematics provide a
reason. (Hilary Putnam)
5. Local timelines are temporally ordered. The faster you go, the quicker you get
to your destination. As you go faster, time itself becomes compressed. But it is
not possible to go so fast that you get there before you started. (Albert Einstein,
Relativity)
6. The sphere is the most perfect shape, needing no joint and being a complete
whole. A sphere is best suited to enclose and contain things. The sun, moon,
planets, and stars are seen to be of this shape. Thus, the universe is spherical.
(Nicolaus Copernicus, The Revolution of the Celestial Orbs)
7. The happiest men are those whom the world calls fools. Fools are entirely de-
void of the fear of death. They have no accusing consciences to make them fear
it. Moreover, they feel no shame, no solicitude, no envy, and no love. And they
are free from any imputation of the guilt of sin. (Desiderius Erasmus, In Praise
of Folly)
8. It is impossible for someone to scatter his fears about the most important mat-
ters if he knows nothing about the universe, but gives credit to myths. Without
the study of nature, there is no enjoyment of pure pleasure. (Epicurus of Samos,
Sovran Maxims)
9. If understanding is common to all mankind, then reason must also be com-
mon. Additionally, the reason which governs conduct by commands and prohi-
bitions is common to us. Therefore, mankind is under one common law and so
are fellow citizens. (Marcus Aurelius, Meditations)
10. Rulers define ‘justice’ as simply making a profit from the people. Unjust men
come off best in business. But just men refuse to bend the rules. So, just men get
less and are despised by their own friends. (Plato, Republic)
11. We must take non-vacuous mathematical sentences to be false. This is because
we ought to take mathematical sentences at face value. If we take some sentences
to be non-vacuously true, then we have to explain our access to mathematical
objects. The only good account of access is the indispensability argument. But
the indispensability argument fails. (Hartry Field)
12. Labor was the first price, in that it yielded money that was paid for all things.
But it is difficult to ascertain the proportion between two quantities of labor.
Every commodity is compared with other exchanged commodities rather than
labor. Therefore, most people better understand the quantity of a particular
commodity than the quantity of labor. (Adam Smith, The Wealth of Nations)
13. Authority comes from only agreed conventions between men. Strength alone
is not enough to make a man into a master. Moreover, no man has natural au-
thority over his fellows and force creates no right. (Jean Jacques Rousseau, The
Social Contract)
14. Just as many plants only bear fruit when they do not grow too tall, so in the
practical arts, the theoretical leaves and flowers must not be constructed to
sprout too high, but kept near to experience, which is their proper soil. (Carl
von Clausewitz, On War)
15. The greatest danger to liberty is the omnipotence of the majority. A democratic
power is never likely to perish for lack of strength or resources, but it may fall
because of the misdirection of this strength and the abuse of resources. There-
fore, if liberty is lost, it will be due to an oppression of minorities, which may
drive them to an appeal to arms. (Alexis de Tocqueville, Democracy in America)
16. There is no distinction between analytic and synthetic claims. If there is an
analytic/synthetic distinction, there must be a good explanation of synonymy.
The only ways to explain synonymy are by interchangeability salva veritate or
definition. However, interchangeability cannot explain synonymy. And defini-
tion presupposes synonymy. (W. V. Quine)
17. The object of religion is the same as that of philosophy; it is the internal verity
itself in its objective existence. Philosophy is not the wisdom of the world, but
the knowledge of things that are not of this world. It is not the knowledge of ex-
ternal mass, empirical life and existence, but of the eternal, of the nature of God,
and all which flows from his nature. This nature ought to manifest and develop
itself. Consequently, philosophy in unfolding religion merely unfolds itself and
in unfolding itself it unfolds religion. (Georg Wilhelm Friedrich Hegel, The Phi-
losophy of Religion)
18. Every art and every inquiry, and similarly every action and pursuit, is thought
to aim at some good; and for this reason the good has rightly been declared to
be that at which all things aim. (Aristotle, Nicomachean Ethics)
19. By ‘matter’ we are to understand an inert, senseless substance, in which ex-
tension, figure, and motion do actually subsist. But it is evident from what we
have already shown that extension, figure, and motion are only ideas existing
in the mind, and that an idea can be like nothing but another idea, and that
consequently neither they nor their archetypes can exist in an unperceiving
substance. Hence it is plain that the very notion of what is called matter, or
corporeal substance, involves a contradiction in it. (George Berkeley, A Treatise
Concerning the Principles of Human Knowledge)
20. Reading challenges a person more than any other task of the day. It requires the
type of training that athletes undergo, and with the same life-long dedication.
Books must be read as deliberately and reservedly as they were written. Thus, to
read well, as in, to read books in a true spirit, is a noble exercise. (Henry David
Thoreau, Walden)
21. The only course open to one who wished to deduce all our knowledge from first
principles would be to begin with a priori truths. An a priori truth is a tautology.
From a set of tautologies alone, only further tautologies can be further deduced.
However, it would be absurd to put forward a system of tautologies as consti-
tuting the whole truth about the universe. Therefore, we cannot deduce all our
knowledge from first principles. (A. J. Ayer, Language, Truth, and Logic)
22. Men, in the state of nature, must have reached some point when the obstacles
maintaining their state exceed the ability of the individual. Then the human
race must either perish or change. Men cannot create new forces, only unite
and direct existing ones. Therefore, they can preserve themselves only by com-
bining forces great enough to overcome resistance. (Jean Jacques Rousseau,
On the Social Contract)
23. Physics can be defined as the study of the laws that regulate the general prop-
erties of bodies regarded en masse. In observing physics, all senses are used.
Mathematical analysis and experiments help with observation. Thus in the
phenomena of physics man begins to modify natural phenomena. (Auguste
Comte, The Course in Positive Philosophy)
24. There are not two indiscernible individuals in our world. If there were two in-
discernible individuals in our world then there must be another possible world
in which those individuals are switched. God could have had no reason for choos-
ing one of these worlds over the other. But God must have a reason for acting
as she does. (Leibniz)
25. In aristocratic countries, great families have enormous privileges, which their
pride rests on. They consider these privileges as a natural right ingrained in
their being, and thus their feeling of superiority is a peaceful one. They have
no reason to boast of the prerogatives that everyone grants to them without
question. So, when public affairs are directed by an aristocracy, the national
pride takes a reserved, haughty, and independent form. (Alexis de Tocqueville,
Democracy in America)
26. It must be some one impression that gives rise to every real idea. But self or
person is not any one impression, but that to which our several impressions and
ideas are supposed to have a reference. If any impression gives rise to the idea
of self, that impression must continue invariably the same through the whole
course of our lives, since self is supposed to exist after that manner. But there
is no impression constant and invariable. Pain and pleasure, grief and joy, pas-
sions and sensations succeed each other and never all exist at the same time. It
cannot, therefore, be from any of these impressions or from any other that the
idea of self is derived, and, consequently, there is no idea of the self. (David
Hume, A Treatise of Human Nature)
27. Every violent movement of the will, every emotion, directly agitates the body.
This agitation interferes with the body’s vital functions. So, we can legitimately
say that the body is the objectivity of the will. (Arthur Schopenhauer, The
World as Will and Idea)
28. The work of the defensive forces of the ego prevents repressed desires from en-
tering the conscious during waking life, and even during sleep. The dreamer
knows just as little about the meaning of his dreams as the hysteric knows
about the significance of his symptoms. The technique of psychoanalysis is the
act of discovering through analysis the relation between manifest and latent
dream content. Therefore, the only way to treat these patients is through the
technique of psychoanalysis. (Sigmund Freud, The Origin and Development of
Psychoanalysis)
29. Either mathematical theorems refer to ideal objects or they refer to objects that
we sense. If they refer to ideal objects, the radical empiricist cannot defend our
knowledge of them, since we never sense such objects. If they refer to objects
that we sense, they are false. So, for the radical empiricist, mathematical theo-
rems are either unknowable or false. In either case, the radical empiricist can-
not justify any proof of a mathematical theorem. (John Stuart Mill)
30. My mind is distinct from my body. I have a clear and distinct understanding
of my mind, independent of my body. I have a clear and distinct understand-
ing of my body, independent of my mind. Whatever I can clearly and distinctly
conceive of as separate can be separated by God and so are really distinct.
(René Descartes, Meditations on First Philosophy)
KEEP IN MIND
In deductive logic, if the form of an argument is valid, and the premises are all true, then
the conclusion must be true.
An argument is valid if the conclusion follows logically from the premises.
The validity of an argument depends on its form and is independent of the truth of its
premises.
A valid argument is sound if all of its premises are true.
Only valid arguments can be sound.
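The idea behind validity, that no case makes all the premises true and the conclusion false, can be previewed in code. The sketch below brute-forces every assignment of truth values to the sentence letters, anticipating the truth-table method the text develops for propositional logic; the representation (premises as Python functions of an assignment) is our own illustration, not the book's notation.

```python
from itertools import product

def is_valid(premises, conclusion, letters):
    """True if no assignment of truth values to the sentence letters
    makes every premise true while making the conclusion false."""
    for values in product([True, False], repeat=len(letters)):
        row = dict(zip(letters, values))
        if all(p(row) for p in premises) and not conclusion(row):
            return False  # counterexample: premises true, conclusion false
    return True

# Modus ponens: P, if P then Q; so Q. Valid.
mp = is_valid([lambda v: v["P"], lambda v: (not v["P"]) or v["Q"]],
              lambda v: v["Q"], ["P", "Q"])
print(mp)  # True

# Affirming the consequent: if P then Q, Q; so P. Invalid.
ac = is_valid([lambda v: (not v["P"]) or v["Q"], lambda v: v["Q"]],
              lambda v: v["P"], ["P", "Q"])
print(ac)  # False
```

Note that this check addresses only validity, the form of the argument; soundness additionally requires examining the world to see whether the premises are in fact true.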
1.5: Validity and Soundness
EXERCISES 1.5
Determine whether each of the following arguments is
intuitively valid or invalid. For valid arguments, determine
whether they are sound (if you can).
17. Either it is raining or it is sunny, but not both. It is not raining. So, it is sunny.
18. Either I stop smoking or I risk getting ill. If I stop smoking, then I will have
withdrawal symptoms. If I get ill, then I risk death. So, either I have withdrawal
symptoms or I risk death.
19. Some fish live in the Atlantic Ocean. The Atlantic Ocean is a body of water. So,
some fish live in water.
20. All rats have tails. Some rats are white. So, all rats are white and have tails.
21. All rats have tails. Some rats are white. Therefore, some white things have tails.
22. All squares are rectangles. All rectangles are parallelograms. All parallelograms
are quadrilaterals. Therefore, all squares are quadrilaterals.
23. All professional singers are classically trained. Some classically trained singers
are Italian. So, some professional singers are Italian.
24. Kangaroos live in Australia. Sydney is in Australia. Hence, kangaroos live in
Sydney.
25. All logicians are philosophers. All philosophers study Kant. It follows that all
logicians study Kant.
26. If mathematical objects exist, then either we have mathematical intuition or we
can’t know about them. We don’t have mathematical intuition. So, mathemati-
cal objects don’t exist.
27. Either only the present is real or time is four-dimensional. Time is four-dimen-
sional. So, only the present is real.
28. Logic is a priori if, and only if, mathematics is. Mathematics is a priori if, and
only if, metaphysics is. So, logic is a priori if, and only if, metaphysics is.
29. Nietzsche believes in eternal recurrence, but Spinoza does not. If Heidegger
believes in the reality of time, then Spinoza believes in eternal recurrence. So,
Heidegger does not believe in the reality of time.
30. Objective morality is either consequentialist or deontological. If objective mo-
rality is deontological then Aristotle is a relativist. So, Aristotle is not a relativist.
31. All logical empiricists are verificationists. Some verificationists are holists. So,
some holists are logical empiricists.
32. Either Plato taught Aristotle or Aristotle taught Plato. But Aristotle taught
Alexander, and Alexander was not taught by Plato. So, Plato taught Aristotle.
33. Descartes corresponded with Elisabeth of Bohemia and Queen Christina of
Sweden. So, Queen Christina and Elisabeth corresponded with each other.
34. If Hegel was influenced by Kant, then Marx was influenced by Hegel. Marx was
influenced by Hegel if, and only if, Nietzsche was influenced by Marx. So, if
Hegel was influenced by Kant, then Nietzsche was influenced by Marx.
35. There is a difference between correlation and causation only if we have knowl-
edge of the laws of nature. But the laws of nature are obscured to us. So, correla-
tion is causation.
36. All ravens are black. But black is a color. And nothing has color. So, there are
no ravens.
37. All humans have some virtues. Not all humans have all virtues. So, some hu-
mans lack some virtues, but no humans lack all virtues.
38. If infinity is actual, then Achilles cannot catch the tortoise. If infinity is poten-
tial, then Achilles can catch the tortoise. Infinity is either actual or potential. So,
Achilles can catch the tortoise.
39. If I am my body, then the self is constantly changing and does not persist
through time. If the self does not persist through time, then the person who
borrows money is not the one who returns it. So, if the person who borrows
money is the person who returns it, then I am not my body.
40. If knowledge is justified true belief, then Gettier cases are not counterexamples.
But Gettier cases are counterexamples, and there are others, too. So, knowl-
edge is justified true belief with a causal connection between the knower and
the object of knowledge.
KEY TERMS
2.1: Logical Operators and Translation
uninhabited palace, longer than the whole village and much taller than the
steeple of the church, and it sailed by in the darkness toward the colonial
city on the other side of the bay that had been fortified against buccaneers,
with its old slave port and the rotating light, whose gloomy beams transfig-
ured the village into a lunar encampment of glowing houses and streets of
volcanic deserts every fifteen seconds . . . (Gabriel García Márquez, “The Last
Voyage of the Ghost Ship,” emphases added)
Grammarians often bristle at long, run-on sentences like this one. But from a logi-
cal point of view, we can build sentences of indefinite length by repeated applications
of operators like the ‘and’ in Márquez’s story. Such operators, including ‘or’ and ‘not’,
are often all called conjunctions in grammar, though in logic we reserve the term
‘conjunction’ for just the operator for which we use ‘and’.
The system of propositional logic that we will study uses five operators, which we
identify by their syntactic properties, or shapes:
Tilde ∼
Dot ∙
Vel ∨
Horseshoe ⊃
Triple bar ≡
These operators are used to represent logical operations on sentences. We will
consider five basic logical operations, though systems of logic can be built from
merely one or two operations. We could also introduce other, less intuitive logical
operations.
These five operators are standard for propositional logic:
Negation ∼
Conjunction ∙
Disjunction ∨
Material implication ⊃
The biconditional ≡
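To preview the truth-functional readings that section 2.3 will develop, here is a minimal sketch, in Python, of the five operators as functions on truth values. The function names are my own labels for the symbols, not official notation.

```python
# A preview sketch (my own labels, not official notation) of the five
# operators as Python functions on truth values; the official
# truth-table semantics comes in section 2.3.
def tilde(p):           # negation: ~P
    return not p

def dot(p, q):          # conjunction: P . Q
    return p and q

def vel(p, q):          # disjunction: P v Q (inclusive 'or')
    return p or q

def horseshoe(p, q):    # material implication: P > Q
    return (not p) or q

def triple_bar(p, q):   # biconditional: P = Q
    return p == q

# A conditional is false only when its antecedent is true and its
# consequent is false:
print(horseshoe(True, False))   # False
print(horseshoe(False, True))   # True
```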
We read or write sentences of English from left to right, and we might think of them
as being composed in that way. But the logical structure of a complex sentence is
grounded in its simple parts and the operators used, like bricks and mortar. We think
of complex sentences, as we will see in the next section, as being composed or built up
from smaller parts using the operators.
Along with the assumption of our ability to construct sentences of indefinite length, we presume a principle, called compositionality, that the meaning of the longer sentences is determined by the meanings of the shorter sentences, along with the meanings of the conjunctions or other logical operators. The compositionality of our logic allows us to understand the properties of even very long sentences as long as we understand the nature of the logical operators. This section is a detailed explication of each of our five operators.

Compositionality: the meaning of a complex sentence is determined by the meanings of its component parts.
Chapter 2: Propositional Logic
Negation
Negation is a unary operator, applying to one proposition. The other four operators are all binary. Some English indicators of negation include the following:

not
it is not the case that
it is not true that
it is false that

Negation, ∼, is the logical operator used for ‘it is not the case that’ and related terms. A unary operator applies to a single proposition. Binary operators relate or connect two propositions.

2.1.2–2.1.4 each express a negation of 2.1.1.
2.1.1 John will take the train.
2.1.2 John won’t take the train.
2.1.3 It’s not the case that John will take the train.
2.1.4 John takes the train . . . not!
We can represent 2.1.1 as ‘P’ and each of 2.1.2–2.1.4 as the negation ‘∼P’. 2.1.5–2.1.7
are all negations, too.
2.1.5 ∼R
2.1.6 ∼(P ∙ Q)
2.1.7 ∼{[(A ∨ B) ⊃ C] ∙ ∼D}
2.1.5 is built out of a simple sentence ‘R’ and a negation in front of it. 2.1.6 is built
out of two simple sentences, conjoined and then negated. 2.1.7 is the negation of a
conjunction of a conditional and another negation, though now we’re getting a little
bit ahead of ourselves.
Negation is a fairly simple logical operator to translate, though some subtleties are
worth considering. Ordinarily, when we translate natural language into logical lan-
guage, we want to reveal as much logical structure as we can so that we can see the
logical relations among sentences. We use single capital letters to represent simple,
positive sentences, so that we can show the logical operation of negation on those
simple sentences. For example, we symbolize ‘Pedro has no beard’ as ‘∼P’, where ‘P’
stands for ‘Pedro has a beard’.
For some sentences, it is not clear whether to use a negation when symbolizing.
2.1.9 has a negative feel to it.
2.1.8 Kant affirms that arithmetic is synthetic a priori.
2.1.9 Kant denies that arithmetic is synthetic a priori.
It would be misleading to represent 2.1.9 as the negation of 2.1.8, though. Deny-
ing is not the negation of affirming. There are two ways to fail to affirm P. First, one
can deny P. Second, one can remain silent. Denying is an activity that is related to
affirming, but it is not, strictly, the negation of affirming. For similar reasons, reject-
ing, disputing, and dissenting are not negations of accepting or affirming. We want
our simple sentences to be positive, if possible, but not at the expense of the meaning
of the original. Sometimes a negative verb can represent a positive act, or anyway not
the logical negation of any simple act.
Conjunction
These are some English indicators of a logical conjunction:

and
but
also
however
yet
still
moreover
although
nevertheless
both

Conjunction, ∙, is the logical operator used for ‘and’ and related terms. The formulas joined by a conjunction are called conjuncts.
2.1.10–2.1.13 are English sentences that we can represent as conjunctions.
2.1.10 Angelina walks the dog, and Brad cleans the floors. A∙B
2.1.11 Although Beyonce walks the dog, Jay cleans the floors. B∙J
2.1.12 Key and Peele are comedians. K∙P
2.1.13 Carolina is nice, but Emilia is really nice. C∙E
Although the logical operator in each of 2.1.10–2.1.13 is a conjunction, the tone
of the conjunction varies. Logicians often distinguish between the logical and prag-
matic properties of language. ‘And’ and ‘but’ are both used to express conjunctions
even though they have different practical uses.
We use conjunctions to combine complete sentences. In English, 2.1.12 is short for
a more complete sentence like 2.1.14.
2.1.14 Key is a comedian and Peele is a comedian.
Sometimes, sentences using ‘and’ are not naturally rendered as conjunctions.
2.1.15 Key and Peele are brothers.
2.1.15 is most naturally interpreted as expressing a relation between two people,
and not a conjunction of two sentences. Of course, 2.1.15 could also be used to ex-
press the claim that both Key and Peele are monks, in which case it would best be
represented in logic as a conjunction. In propositional logic, we regiment the most
natural sense of 2.1.15 merely as a simple letter: ‘P’, say. We will see how to represent
the sibling relation more finely in chapter 5. The difference between the two inter-
pretations cannot be found in the sentence itself. It has to be seen from the use of the
sentence in context. Many sentences are ambiguous when seen out of context.
In symbols, 2.1.16–2.1.18 are all conjunctions.
2.1.16 P ∙ ∼Q
2.1.17 (A ⊃ B) ∙ (B ⊃ A)
2.1.18 (P ∨ ∼Q) ∙ ∼[P ≡ (Q ∙ R)]
Disjunction
Disjunction is sometimes called alternation. Some English indicators of disjunction include the following:

or
either
unless

Disjunction, ∨, is the logical operator used for ‘or’ and related terms. The formulas joined by a disjunction are called disjuncts.

Most disjunctions use an ‘or’, though ‘unless’ is also frequently used for disjunction; ‘unless’ is represented as a disjunction in PL. 2.1.19–2.1.21 are English sentences that we can represent as disjunctions.
2.1.19 Either Paco makes the website or Matt does. P∨M
2.1.20 Jared or Rene will go to the party. J∨R
2.1.21 Tomas doesn’t feed the kids unless Aisha asks him to. ∼T ∨ A
In symbols, 2.1.22–2.1.24 are all disjunctions.
2.1.22 ∼P ∨ Q
2.1.23 (A ⊃ B) ∨ (B ⊃ A)
2.1.24 (P ∨ ∼Q) ∨ ∼[P ≡ (Q ∙ R)]
Standard combinations of negations with disjunctions and conjunctions are useful to learn. The negation of a conjunction is simply ‘not both’, as in 2.1.25; ‘not both P and Q’ is represented in PL as ∼(P ∙ Q).

2.1.25 It is not the case that both Adam goes to the movies and Bianca works on her paper.

2.1.25 is the denial that both claims hold, leaving open the possibility that one (but not the other) holds. Such a claim is best translated as 2.1.26, which (as we will see in section 3.3) is logically equivalent to the form at 2.1.27.

2.1.26 ∼(A ∙ B) Not both A and B
2.1.27 ∼A ∨ ∼B Not both A and B

In parallel, the negation of a disjunction is just the common structure of ‘neither’, short for ‘not either’; ‘neither P nor Q’ is represented in PL as ∼(P ∨ Q). 2.1.28 is both a denial that Caleb takes ethics and a denial that Danica does.
2.1.28 Neither Caleb nor Danica takes ethics.
2.1.28 is most directly translated as 2.1.29, the negation of a disjunction. 2.1.30,
the conjunction of two negations, is logically equivalent to both, and also acceptable.
2.1.29 ∼(C ∨ D) Neither C nor D
2.1.30 ∼C ∙ ∼D Neither C nor D
‘Neither’ and ‘not-both’ sentences are not logically equivalent to each other, so it is
important not to confuse the two.
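These equivalences (often called De Morgan's laws), and the non-equivalence of 'not both' and 'neither', can be checked by brute force over the four possible truth-value combinations. The following is my own illustration in Python, anticipating the truth-table methods of chapter 3.

```python
from itertools import product

# My own brute-force check of the forms above: 'not both' matches
# '~A v ~B', 'neither' matches '~A . ~B', and the two forms differ.
not_both   = lambda a, b: not (a and b)        # ~(A . B)
either_not = lambda a, b: (not a) or (not b)   # ~A v ~B
neither    = lambda a, b: not (a or b)         # ~(A v B)
both_not   = lambda a, b: (not a) and (not b)  # ~A . ~B

rows = list(product([True, False], repeat=2))
print(all(not_both(a, b) == either_not(a, b) for a, b in rows))  # True
print(all(neither(a, b) == both_not(a, b) for a, b in rows))     # True
print(all(not_both(a, b) == neither(a, b) for a, b in rows))     # False
```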
understand. If some condition in the antecedent is met, then the consequent follows;
the condition is sufficient to entail the consequent.
Necessary conditions are trickier. If A is necessary for B, then if B is true, we can
infer that A must also be true. For example, oxygen is necessary for burning. So, if
something is burning, there must be oxygen present; the necessary condition is in the
consequent. Given that the presence of oxygen is a necessary condition for something
burning, we cannot infer from the presence of oxygen to something burning. Oxygen
is not sufficient to cause a fire; it’s just one of various necessary conditions.
To remember that sufficient conditions are antecedents and necessary conditions
are consequents, we can use the mnemonic ‘SUN’. Rotating the ‘U’ to a ‘⊃’ we get
‘S ⊃ N’.
In symbols, 2.1.40–2.1.42 are all conditionals.
2.1.40 ∼P ⊃ Q
2.1.41 (A ⊃ B) ⊃ (B ⊃ A)
2.1.42 (P ∨ ∼Q) ⊃ ∼[P ≡ (Q ∙ R)]
While we’re defining terms, we can define three conditionals using traditional
names that you might run into. The names of the conditionals 2.1.44–2.1.46 are all
relative to the original conditional at 2.1.43.
2.1.43 The conditional A⊃B
2.1.44 Its converse B⊃A
2.1.45 Its inverse ∼A ⊃ ∼B
2.1.46 Its contrapositive ∼B ⊃ ∼A
A statement and its contrapositive, 2.1.43 and 2.1.46, are logically equivalent. The converse and the inverse of a conditional, 2.1.44 and 2.1.45, are also logically equivalent to each other. But a conditional is equivalent to neither its converse nor its inverse.
These names are holdovers from the traditional, Aristotelian logic, and ‘inverse’ es-
pecially is not much used in modern logic. I will explain what ‘logical equivalence’
means in more detail in section 2.5.
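These equivalence claims can be verified by enumerating all truth-value assignments. This is my own sketch, reading '⊃' truth-functionally as section 2.3 will do.

```python
from itertools import product

# A sketch (mine) verifying the equivalence claims by enumeration.
cond     = lambda a, b: (not a) or b   # A > B
converse = lambda a, b: (not b) or a   # B > A
inverse  = lambda a, b: a or (not b)   # ~A > ~B
contra   = lambda a, b: b or (not a)   # ~B > ~A

rows = list(product([True, False], repeat=2))
print(all(cond(a, b) == contra(a, b) for a, b in rows))       # True
print(all(converse(a, b) == inverse(a, b) for a, b in rows))  # True
print(all(cond(a, b) == converse(a, b) for a, b in rows))     # False
```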
The Biconditional
Our final propositional operator, the biconditional, is really the conjunction of a conditional with its converse. We see biconditionals in definitions, which give both necessary and sufficient conditions.

The biconditional, ≡, is the logical operator used for ‘if and only if’ and related terms.

Some English indicators of a biconditional include the following:
if and only if
when and only when
just in case
is a necessary and sufficient condition for
The biconditional ‘A ≡ B’ is short for ‘(A ⊃ B) ∙ (B ⊃ A)’, to which we will return once we are familiar with truth conditions. ‘If and only if’ statements often indicate definitions. For example, something is water if, and only if, it is H2O. Thus, if something is water, then it is H2O. And, if something is H2O, then it is water. ‘If and only if’ is sometimes abbreviated ‘iff’.
2.1.47 and 2.1.48 are English examples of biconditionals.
2.1.47 You’ll be successful just in case you work hard and are lucky.
S ≡ (W ∙ L)
2.1.48 Something is a bachelor if, and only if, it is unwed and a man.
B ≡ (∼W ∙ M)
In symbols, 2.1.49–2.1.51 are all biconditionals.
2.1.49 ∼P ≡ Q
2.1.50 (A ⊃ B) ≡ (B ⊃ A)
2.1.51 (P ∨ ∼Q) ≡ ∼[P ≡ (Q ∙ R)]
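The claim that a biconditional is the conjunction of a conditional with its converse can likewise be confirmed by enumeration. A sketch of my own, assuming the usual truth-functional readings:

```python
from itertools import product

# A sketch (mine) confirming that 'A = B' and '(A > B) . (B > A)'
# agree on every assignment of truth values.
bicond    = lambda a, b: a == b
both_ways = lambda a, b: ((not a) or b) and ((not b) or a)

print(all(bicond(a, b) == both_ways(a, b)
          for a, b in product([True, False], repeat=2)))  # True
```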
Remember that we can regiment in two ways, by putting an argument into num-
bered premise-conclusion form, or by translating the argument into a formal lan-
guage. Here we will do both.
For the first step, remember that in chapter 1, we indicated premises with ‘P’ and a
number, and conclusions with a ‘C’. Here, we will adjust that form slightly, omitting
the ‘P’s and ‘C’s, and using a ‘/’ to indicate the separation between the premises and
the conclusion. Thus, we can write the argument 2.1.57 as 2.1.58.
2.1.58 1. If morality is possible, then it is either forward-looking or
backward-looking.
2. We can be moral.
3. Morality is not forward-looking.
/ Morality is backward-looking.
There are alternatives to the ‘/’ to indicate conclusions. We could just use ‘so’, or
some other simple English conclusion indicator. Some people use ‘∴’ to indicate a
conclusion. Sometimes logicians draw a horizontal line between the premises and
conclusions, as at 2.1.59.
2.1.59 1. If morality is possible, then it is either forward-looking or
backward-looking.
2. We can be moral.
3. Morality is not forward-looking.
Morality is backward-looking.
Neither 2.1.58 nor 2.1.59 is regimented into PL, which is our goal here. To regi-
ment it, we need to choose propositional letters for the simple English sentences. I’ll
use ‘P’ for ‘morality is possible’ and ‘we can be moral’, since I take those to be the
same proposition expressed slightly differently. ‘F’ can stand for ‘morality is forward-
looking’ and ‘B’ for ‘morality is backward-looking.’ The result is 2.1.60.
2.1.60 1. P ⊃ (F ∨ B)
2. P
3. ∼F /B
Notice that I put the conclusion on the same line as the last premise, rather than on
a different line. This form will be useful later on, and it makes the argument just a bit
more compact.
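One payoff of regimenting into PL is that arguments become mechanically checkable. As a sketch (my own, not a method the text has introduced yet), we can confirm that 2.1.60 is valid in the sense of section 1.5: no assignment of truth values makes all the premises true while the conclusion is false.

```python
from itertools import product

# A brute-force check (my own sketch) that 2.1.60 is valid: search
# all eight assignments for one with true premises and a false
# conclusion.
counterexample_found = False
for P, F, B in product([True, False], repeat=3):
    premises = [(not P) or (F or B),  # P > (F v B)
                P,                    # P
                not F]                # ~F
    conclusion = B
    if all(premises) and not conclusion:
        counterexample_found = True

print(counterexample_found)  # False: the argument is valid
```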
Summary
Now that you have seen each of the five operators and their English-language approxi-
mations, you can start to translate both simple and complex English sentences into
our artificial language, PL. Given a translation key, you can also interpret sentences
of PL into English sentences.
Translation is an art. In this section, I presented some guidelines for translating
English terms, like ‘and’ and ‘if . . . then . . .’, into our precise formal language. These
guidelines are not hard and fast rules. Natural language is flexible and inexact, which
is part of why formal languages are useful. The indicators of conditionals are particu-
larly liable to misconstrual. You must be careful to distinguish antecedents and con-
sequents. Be especially wary of confusing ‘only if ’ with ‘if ’, and with ‘if and only if.’
Certain uses of the indicators are not even properly translated as logical operators.
‘Means’ is a conditional in “this means war” and “Beth’s deciding to join us means that
Kevin will be uncomfortable.” But ‘means’ is not a conditional in “Josie means to go
to the party tonight” and “ ‘querer’ means ‘to love’ in Spanish.” Sometimes the indica-
tors can be quite misleading; we will even see, in section 4.2, instances of ‘and’ that
are best translated using disjunction! But the indicators provided are generally good
hints about where to start with a translation, and the guidelines in this section should
be violated only for good reasons. As you develop greater facility with the logical languages in this book, you will get a better feel for how best to translate. And there are many acceptable alternatives to any translation, as we will see after more discussion of logical equivalence.
KEEP IN MIND
Our language PL uses five operators, which we identify by their syntactic properties.
The five propositional logical operations are negation, conjunction, disjunction, material
implication, and the biconditional.
The operators always apply to complete propositions, whether simple or complex.
The rules for translating conditionals are particularly tricky and require carefully distin-
guishing between antecedents and consequents.
EXERCISES 2.1a
Identify the antecedents and consequents of each of the
following sentences.
8. Lisette joins the activities board on the condition that the board revises its
funding rules.
9. Mercedes manages mock trial only if Nana is too busy to do it.
10. Orlando organizes peer tutoring when Percy rounds up volunteers.
11. Aristotle distinguishes actual from potential infinity if Parmenides argues for
the One.
12. If Bergson denies time, then so does McTaggart.
13. Camus encouraging authenticity means that Sartre does too.
14. Davidson defending anomalous monism is sufficient for Spinoza’s being cor-
rect about parallelism.
15. Emerson bails out Thoreau on the condition that Thoreau pays his taxes.
16. Fanon writing Black Skin, White Masks is a necessary condition for Freire writ-
ing Pedagogy of the Oppressed.
17. Grice analyzes pragmatics on the condition that Austin follows Wittgenstein.
18. Foot discussing trolley cases and philosophers reflecting on her work entail
that there will be more thought experiments.
19. The Churchlands denying mental states is sufficient for Dennett denying qualia
and Chalmers emphasizing the hard problem of consciousness.
20. When Singer is a utilitarian, no one else is.
EXERCISES 2.1b
Translate each sentence into propositional logic using the
propositional variables given after the sentence.
9. It is safe to swim when, and only when, the water is calm or a lifeguard is on
duty. (S, C, L)
10. Logic is challenging and fun given that you pay attention in class. (C, F, P)
11. Cars are eco-friendly if they are hybrids or run on low-emission fuel. (E, H, L)
12. Cara will go horseback riding only if it doesn’t rain, and she has a helmet.
(C, R, H)
13. The restaurant served chicken, and either peas or carrots. (C, P, T)
14. Making butter is a necessary condition for the farmer to go to the market and
make a profit. (B, M, P)
15. Patrons may have corn and potatoes if, and only if, they do not order carrots.
(C, P, T)
16. If the restaurant runs out of cheesecake, then you can have a meal of chicken
and pie and ice cream. (C, K, P, I)
17. A farmer keeps goats in a pen and sheep in a pen only if the dogs and cat are kept
inside. (G, S, D, C)
18. Either the farmer shears the sheep and milks the cows, or he slops the pigs and
walks the dogs. (S, C, P, D)
19. If the farmer shears the sheep, then he makes wool, and if he milks the cows,
then he makes butter. (S, W, C, B)
20. If the farmer goes to the market, then she makes a profit, and her wife is happy.
(M, P, W)
21. Plato believed in the theory of forms, and Aristotle held that there are four
kinds of causes, but Parmenides thought that only the one exists. (P, A, R)
22. If Thales reduced everything to water, then Democritus was an atomist if and
only if Heraclitus claimed that the world is constantly in flux. (T, D, H)
23. If Plato believed in the theory of forms or Democritus was an atomist, then
Aristotle held that there are four kinds of causes or Parmenides thought that
only the one exists. (P, D, A, R)
24. Democritus was not an atomist if and only if Plato didn’t believe in the theory
of forms, and Thales didn’t reduce everything to water. (D, P, T)
25. Either Heraclitus claimed that the world is constantly in flux or Thales reduced
everything to water, and either Aristotle held that there are four kinds of causes
or Parmenides thought that only the one exists. (H, T, A, R)
26. Smart believes that minds are brains, and Skinner thinks that inner states are
otiose, unless Descartes argues that the mind and body are distinct. (M, K, D)
27. Either Putnam claims that minds are probabilistic automata, or the Church-
lands deny that there are any minds and Turing believes that machines can
think. (P, C, T)
28. Searle rejects the possibility of artificial intelligence if, and only if, Smart
believes that minds are brains and Turing believes that machines can think.
(E, M, T)
29. Either Putnam doesn’t claim that minds are probabilistic automata and the
Churchlands don’t deny that there are any minds, if Skinner thinks that inner
states are otiose, or Searle rejects the possibility of artificial intelligence and
Descartes doesn’t argue that the mind and body are distinct. (S, P, C, R, D)
30. Either Turing believes that machines can think or Smart doesn’t believe that
minds are brains, and the Churchlands deny that there are any minds. (T, S, C)
EXERCISES 2.1c
Translate each argument into propositional logic using the
letters provided.
4. Descartes defended libertarian free will just in case Spinoza defended deter-
minism. If Spinoza defended determinism, then either Hume developed com-
patibilism or Elisabeth complained that free will makes virtue independent of
luck. Descartes defended libertarian free will. Hume does not develop com-
patibilism. Therefore, Descartes defended libertarian free will and Elisabeth
complained that free will makes virtue independent of luck; also Spinoza
defended determinism.
5. Descartes defended libertarian free will. Hume developed compatibilism if
either Descartes defended libertarian free will or Spinoza defended determin-
ism. Elisabeth complained that free will makes virtue independent of luck if
Hume developed compatibilism. Elisabeth complaining that free will makes
virtue independent of luck and Descartes defending libertarian free will are
sufficient conditions for Spinoza not defending determinism. So, Spinoza does
not defend determinism.
the fat man scenario. If either Foot developed the trolley problem or Thomson
introduced the fat man scenario, then Kamm does not present the looping trol-
ley case. Hence, Foot did not develop the trolley problem.
15. Either Foot developed the trolley problem or Kamm presents the looping trol-
ley case. Foot developed the trolley problem unless Thomson introduced the
fat man scenario. It is not the case that Foot developed the trolley problem,
and Thomson introduced the fat man scenario. If Thomson introduced the fat
man scenario, then Kamm presents the looping trolley case. Both Foot did not
develop the trolley problem and Thomson introduced the fat man scenario if
Kamm presents the looping trolley case. So, Foot did not develop the trolley
problem and Thomson introduced the fat man scenario.
F: Field is a fictionalist.
B: Bueno is a nominalist.
W: Wright is a neo-logicist.
L: Leng is an instrumentalist.
M: Maddy is a naturalist.
16. If Field is a fictionalist, then Leng is an instrumentalist and Wright is a neo-
logicist. Maddy is a naturalist and Field is a fictionalist. If Wright is a neo-logicist,
then Bueno is a nominalist. So, Bueno is a nominalist.
17. Maddy is a naturalist only if Wright is a neo-logicist. Wright is a neo-logicist
only if Field is a fictionalist and Bueno is a nominalist. Leng is an instrumen-
talist, but Maddy is a naturalist. Hence, Field is a fictionalist and Bueno is a
nominalist.
18. Maddy is a naturalist, if, and only if, Bueno is not a nominalist. Maddy is a
naturalist unless Leng is an instrumentalist. Leng being an instrumentalist is
a sufficient condition for Field being a fictionalist. Wright is a neo-logicist, yet
Bueno is a nominalist. Therefore, Field is a fictionalist.
19. Bueno is a nominalist unless both Wright is a neo-logicist and Maddy is a natu-
ralist. Leng being an instrumentalist is a necessary condition for Bueno not
being a nominalist entailing that Wright is a neo-logicist. Leng being an instru-
mentalist entails that Field is a fictionalist. Bueno is not a nominalist; however,
Maddy is a naturalist. Thus, Field is a fictionalist.
20. Leng is an instrumentalist given that Field is not a fictionalist. If Bueno is a
nominalist, then Leng is not an instrumentalist. Either Field is not a fictionalist
and Bueno is a nominalist, or Maddy is a naturalist. Maddy is a naturalist just
in case Wright is a neo-logicist. Wright is a neo-logicist only if Bueno is not a
nominalist. So, Bueno is not a nominalist.
R: Rawls is a deontologist.
V: Hursthouse is a virtue ethicist.
A: Anscombe defends the doctrine of double effect.
U: Hardin is a utilitarian.
21. If Hardin is a utilitarian, then Rawls is a deontologist. Rawls is a deontologist
only if Hursthouse is not a virtue ethicist. Hardin is a utilitarian. Consequently,
Anscombe defends the doctrine of double effect if Hursthouse is a virtue
ethicist.
22. Hardin is a utilitarian and Rawls is not a deontologist only if both Hursthouse
is a virtue ethicist and Anscombe defends the doctrine of double effect. Hurst-
house is not a virtue ethicist unless Anscombe does not defend the doctrine of
double effect. Hardin is a utilitarian. Hence, Rawls is a deontologist.
23. Hursthouse being a virtue ethicist is a necessary condition for Hardin not be-
ing a utilitarian. Hardin being a utilitarian is a sufficient condition for Rawls
not being a deontologist. Rawls is a deontologist. Anscombe defends the doc-
trine of double effect if Hardin is not a utilitarian. So, Anscombe defends the
doctrine of double effect and Hursthouse is a virtue ethicist.
24. If Anscombe defends the doctrine of double effect, then Hardin is a utilitarian.
Either Hursthouse is a virtue ethicist or Rawls is a deontologist. Hursthouse is
not a virtue ethicist. Rawls is not a deontologist if Hardin is a utilitarian. Con-
sequently, Anscombe does not defend the doctrine of double effect.
25. Hardin is not a utilitarian if, and only if, Rawls is not a deontologist. Rawls is
a deontologist. Anscombe defends the doctrine of double effect if Hardin is
a utilitarian. Hursthouse is a virtue ethicist. If Hursthouse is a virtue ethicist
and Anscombe defends the doctrine of double effect, then it is not the case that
either Hardin is not a utilitarian or Anscombe defends the doctrine of double
effect. So, Hardin is a utilitarian and Anscombe does not defend the doctrine
of double effect.
C: Chisholm is a foundationalist.
L: Lehrer is a coherentist.
G: Goldman is a reliabilist.
U: Unger is a skeptic.
Z: Zagzebski is a virtue epistemologist.
31. Zagzebski is a virtue epistemologist if, and only if, either Goldman is a reliabi-
list or Chisholm is a foundationalist. Zagzebski is a virtue epistemologist, but
Unger is a skeptic. Lehrer is a coherentist and Chisholm is not a foundationalist.
Thus, Goldman is a reliabilist.
32. If Unger is a skeptic, then Lehrer is a coherentist and Zagzebski is a virtue epis-
temologist. Chisholm being a foundationalist is sufficient for both Goldman
being a reliabilist and Lehrer not being a coherentist. Chisholm is a foundation-
alist; still, Goldman is a reliabilist. So, Unger is not a skeptic.
33. Chisholm is a foundationalist, or Unger is a skeptic only if Lehrer is a coherent-
ist. Zagzebski is a virtue epistemologist given that Goldman is a reliabilist. If
S: Searle is a descriptivist.
K: Kripke is a direct reference theorist.
N: Neale is a metalinguistic descriptivist.
36. Searle is a descriptivist or Neale is a metalinguistic descriptivist. If Neale is a
metalinguistic descriptivist, then Kripke is a direct reference theorist. Searle is
a descriptivist if Kripke is a direct reference theorist. So, Searle is a descriptivist.
37. Either Searle is a descriptivist or Kripke is a direct reference theorist. Kripke is
not a direct reference theorist. Searle is a descriptivist only if Neale is a meta-
linguistic descriptivist. Therefore, Neale is a metalinguistic descriptivist and
Searle is a descriptivist.
38. Searle is a descriptivist given that Neale is a metalinguistic descriptivist. It is
not the case that Neale is a metalinguistic descriptivist unless Kripke is a direct
reference theorist. Either Searle is a descriptivist or Kripke is not a direct refer-
ence theorist. If it is not the case that both Searle is a descriptivist and Kripke
is a direct reference theorist, then Neale is a metalinguistic descriptivist. Thus,
it is not the case that Searle is a descriptivist and Neale is not a metalinguistic
descriptivist.
39. Searle being a descriptivist is sufficient for Kripke being a direct reference
theorist. Kripke being a direct reference theorist is necessary and sufficient for
Neale not being a metalinguistic descriptivist. If it is not the case that Kripke is
a direct reference theorist, then Searle is a descriptivist and Neale is a metalin-
guistic descriptivist. Searle is a descriptivist. So, it is not the case that Neale is a
metalinguistic descriptivist.
40. Either Kripke is a direct reference theorist or Neale is a metalinguistic descrip-
tivist, just in case Searle is not a descriptivist. Neale is not a metalinguistic de-
scriptivist unless Kripke is a direct reference theorist. Kripke is not a direct
EXERCISES 2.1d
Interpret the following sentences of propositional logic using
the given translation key. Strive for elegance in your English
sentences.
13. [(N ∙ P) ∙ Q ] ⊃ ∼R
14. [(M ∨ N) ∙ P] ⊃ (Q ∨ R)
15. ∼P ≡ ∼Q
there are infinitely many wffs, constructed by applying a simple set of rules, called formation rules. Formation rules say how to combine the vocabulary of a language into well-formed formulas.

Formation Rules for Wffs of PL
PL1. A single capital English letter is a wff.
PL2. If α is a wff, so is ∼α.
PL3. If α and β are wffs, then so are:
(α ∙ β)
(α ∨ β)
(α ⊃ β)
(α ≡ β)
PL4. These are the only ways to make wffs.
The simplest wffs, which we call atomic, are formed by a single use of PL1. All other wffs are complex formulas, composed of atomic wffs using any of the other rules. The Greek letters α and β in the formation rules are metalinguistic variables; they can be replaced by any wffs of the object language to form more complex wffs.
We add the punctuation in PL3 to group any pair of wffs combined using a binary
operator. By convention, we drop the outermost punctuation of a wff. That punctua-
tion must be replaced when a shorter formula is included in a more complex formula.
As wffs get longer, it can become difficult to distinguish nested punctuation. For read-
ability, I use square brackets, [ and ], when I need a second set of parentheses, and
braces, { and }, when I need a third. The three kinds of punctuation are interchangeable.
2.2.7 provides an example of how one might construct a complex wff using the for-
mation rules, starting with simple letters.
2.2.7 W By PL 1
X By PL 1
∼W By PL2
∼W ∙ X By PL3, and the convention for dropping brackets
(∼W ∙ X) ≡ X By PL3, putting the brackets back
∼[(∼W ∙ X) ≡ X] By PL2
The order of construction of a wff is especially important because it helps us determine the main operator: the last operator added to a wff according to the formation rules. The main operator of a wff is important because we characterize wffs by their main operators: negations, conjunctions, disjunctions, conditionals, or biconditionals. In the next few sections, we will learn how to characterize wffs further.
We can determine the main operator of any wff of PL by analyzing the formation of
that wff, as I do at 2.2.8.
2.2.8 (∼M ⊃ P) ∙ (∼N ⊃ Q)
‘M’, ‘P’, ‘N’, and ‘Q’ are all wffs, by PL1.
‘∼M’ and ‘∼N’ are wffs by PL2.
‘(∼M ⊃ P)’ and ‘(∼N ⊃ Q)’ are then wffs by PL3.
Finally, the whole formula is a wff by PL3 and the convention for dropping
brackets.
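The analysis at 2.2.8 can also be sketched programmatically. The sketch below is mine, not the book's: with the outermost punctuation dropped, the main operator of a complex wff is the binary operator lying outside all brackets, or the initial tilde if there is no such operator. ASCII stand-ins: '~' for ∼, '&' for ∙, 'v' for ∨, '>' for ⊃, '=' for ≡.

```python
# Find the main operator of a wff written as a string, outermost
# punctuation dropped per the book's convention.
def main_operator(wff):
    depth = 0
    for i, ch in enumerate(wff):
        if ch in '([{':
            depth += 1
        elif ch in ')]}':
            depth -= 1
        elif ch in '&v>=' and depth == 0:
            return ch, i        # a binary operator outside all brackets
    if wff.startswith('~'):
        return '~', 0           # no top-level binary operator: a negation
    return None, None           # an atomic wff has no main operator

# The wff of example 2.2.8: its main operator is the conjunction.
print(main_operator('(~M > P) & (~N > Q)'))
```

The three kinds of punctuation are treated as interchangeable here, matching the convention stated above.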
2.2: Syntax of PL: Wffs and Main Operators
Summary
In this section, we examined the syntax of the language PL, its vocabulary and rules
for constructing well-formed formulas. We saw that the main operator of a complex
formula is the final operator added when the formula is built according to the forma-
tion rules. We identify formulas with their main operators, and, as we leave syntax to
study the semantics of PL in the next three sections, we will classify formulas accord-
ing to the truth conditions at their main operators.
KEEP IN MIND
We can use the formation rules to distinguish wffs from non-well-formed strings.
We can also use the formation rules to identify the main operator of any wff.
EXERCISES 2.2
Are the following formulas wffs? If so, which operator is the
main operator? (For the purposes of this exercise, consider
formulas without their outermost punctuation as well formed,
according to the convention mentioned in this section.)
We will start our study of the semantics of propositional logic by looking at how
we calculate the truth value of a complex proposition on the basis of the truth values
of its component sentences. We can calculate the truth value of any complex proposi-
tion using the truth values of its component propositions and the basic truth tables
for each propositional operator, which we will see shortly. The fact that the truth
values of complex propositions are completely determined by the truth values of the
component propositions is called truth-functional compositionality, a basic presup-
position of our logic. Consider a complex proposition like 2.3.1 and its translation
into PL 2.3.2.
2.3.1 If either the Beatles made The White Album or Jay-Z didn’t make
The Black Album, then Danger Mouse did not make The Grey Album
or Jay-Z did make The Black Album.
2.3.2 (W ∨ ∼J) ⊃ (∼G ∙ J)
We can easily determine the truth values of the component, atomic propositions,
W, J, and G. In this case, all of the atomic propositions are true: the Beatles made The
White Album, Jay-Z made The Black Album, and Danger Mouse made The Grey Album.
But what is the truth value of the whole complex proposition 2.3.1?
To determine the truth value of a complex proposition, we combine the truth
values of the component propositions using rules for each operator; the truth value
of a complex proposition is the truth value at its main operator. These rules are
summarized in basic truth tables, one for each propositional operator. The basic
truth table for each logical operator defines the operator by showing the truth value
of the operation, given any possible distribution of truth values of the component
propositions.
Once we combine these truth tables, our semantics, with our translations of natural
languages into PL, can cause certain problems to arise. Not all of our natural-language
sentences conform precisely to the semantics given by the truth tables. Difficulties
arise for the conditional, in particular. In this section, we’ll look at the details
of the truth tables for each operator before returning to 2.3.1 to see how to use the
basic truth tables.
Negation
Negation is the simplest truth function. When a statement is true, its negation is false;
when a statement is false, its negation is true.
2.3.3 Two plus two is four.
2.3.4 Two plus two is not four.
2.3.5 Two plus two is five.
2.3.6 Two plus two is not five.
2.3.3 is true, and its negation, 2.3.4, is false. 2.3.5 is false, and its negation, 2.3.6,
is true.
Chapter 2: Propositional Logic
We generalize these results using the basic truth table for negation. In the first row
of the truth table, we have an operator, the tilde, and a Greek metalinguistic letter,
α. The column under the ‘α’ represents all possible assignments of truth values to a
single proposition. We could use ‘T’ for ‘true’ and ‘F’ for ‘false’ in the truth table. I use
‘1’ for true and ‘0’ for false in this book, largely because ‘1’s and ‘0’s are very easy to tell
apart. The column under the ‘∼’ represents the values of the negation of the proposi-
tion in each row.
∼ α
0 1
1 0
The truth table for a complex proposition containing one variable has two lines,
since there are only two possible assignments of truth values. This truth table says
that if the value of a propositional variable is true, the value of its negation is false, and
if the value of a propositional variable is false, the value of its negation is true.
Conjunction
Conjunctions are true only when both conjuncts are true; otherwise they are false.
2.3.7 Esmeralda likes logic and metaphysics.
2.3.7 is true if ‘Esmeralda likes logic’ is true and ‘Esmeralda likes metaphysics’ is
true. It is false otherwise. Note that we need four lines to explore all the possibilities
of combinations of truth values of two propositions: when both are true, when one is
true and the other is false (and vice versa), and when both are false.
α ∙ β
1 1 1
1 0 0
0 0 1
0 0 0
Our basic truth tables all have either two lines or four lines, since all of our opera-
tors use either one or two variables. Truth tables for more-complex sentences can be
indefinitely long.
2.3: Semantics of PL: Truth Functions
Disjunction
Disjunctions are false only when both disjuncts are false.
2.3.8 Kareem will get an A in either history or physics.
We’ll take 2.3.8 as expressing our optimism that Kareem will do very well in at least
one of the named courses. If he gets an A in neither course, then our optimism will
have proven to be unfounded; our statement will have been false. But as long as he
gets an A in either history or physics, the statement will have been shown to be true.
And if he gets an A in both of those classes, our optimism will have been shown to be
more than called for.
This interpretation of the ‘∨’ is slightly contentious, and is called inclusive disjunc-
tion. On inclusive disjunction, 2.3.8 is false only when both component statements
are false.
α ∨ β
1 1 1
1 1 0
0 1 1
0 0 0
There is an alternative use of ‘or’ on which a disjunction is also false when both com-
ponent propositions are true, which we can call exclusive disjunction. 2.3.9 is most
naturally interpreted as using an exclusive disjunction.
2.3.9 You may have either soup or salad.
Uses of 2.3.9 are usually made to express that one may have either soup or salad, but
not both. Thus it seems that some uses of ‘or’ are inclusive and some uses of ‘or’ are
exclusive.
One way to manage the problem of the different senses of ‘or’ would be to have two
different logical operators, one for inclusive ‘or’ and one for exclusive ‘or.’ This would
give us more operators than we need, since we can define either one in terms of the
other, along with other logical operators. So, it is mainly arbitrary whether we take
inclusive or exclusive disjunction as the semantics of ‘∨’. We just need to be clear
about what we mean when we are regimenting sentences into our formal logic. We
will thus (traditionally, but also sort of arbitrarily) use inclusive disjunction, the ∨, to
translate ‘or’.
5 0 C h apter 2 P ropos i t i onal L og i c
Material Implication
To interpret English-language conditionals, we use what is called the material inter-
pretation on which a conditional is false only when the antecedent is true and the
consequent is false.
α ⊃ β
1 1 1
1 0 0
0 1 1
0 1 0
boiling water while steel does not. That is not a logical difference, though, and the two
sentences have the same truth conditions as far as ⊃ is concerned.
Some uses of conditionals in English are truth-functional, and we are going to
use ‘⊃’ to regiment conditionals into PL despite worries about sentences like 2.3.11
and 2.3.12.
The Biconditional
A biconditional is true if the component statements share the same truth value. It is
false if the components have different values.
α ≡ β
1 1 1
1 0 0
0 0 1
0 1 0
Let’s see how to use this method with the example 2.3.14.
2.3.14 (A ∨ X) ∙ ∼B
Let’s arbitrarily assume that A and B are true and X is false. If we were starting with
an English sentence, we might be able to determine appropriate truth values of the
component sentences.
First, assign the assumed values to the atomic formulas A, B, and X.
(A ∨ X) ∙ ∼ B
1 0 1
Next, evaluate the negation, since we know the value of B.
(A ∨ X) ∙ ∼ B
1 0 0 1
Since we know the values of the disjuncts, we can next evaluate the disjunction.
(A ∨ X) ∙ ∼ B
1 1 0 0 1
Finally, evaluate the main operator, the conjunction, using the values of the
disjunction and the negation. On this valuation, the whole proposition is false.
(A ∨ X) ∙ ∼ B
1 1 0 0 0 1
Now let’s return to 2.3.2. First, assign the values of the atomic propositions,
which are all true.
(W ∨ ∼ J) ⊃ (∼ G ∙ J)
1 1 1 1
Then we can use our method for determining the truth value of a complex proposi-
tion, first evaluating the negations.
(W ∨ ∼ J) ⊃ (∼ G ∙ J)
1 0 1 0 1 1
Now we can evaluate the disjunction on the left and the conjunction on the right.
(W ∨ ∼ J) ⊃ (∼ G ∙ J)
1 1 0 1 0 1 0 1
Finally, we can find the truth value of the main operator, the horseshoe.
(W ∨ ∼ J) ⊃ (∼ G ∙ J)
1 1 0 1 0 0 1 0 1
The antecedent is true and the consequent is false, so 2.3.2, and thus 2.3.1, is false.
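The section's method, combining the basic truth tables recursively from the atomic propositions up to the main operator, can be sketched in a few lines of Python. This sketch is mine, not the book's: wffs are nested tuples, with '~', '&', 'v', '>', and '=' standing in for the five operators.

```python
# Evaluate a wff of PL under an assignment of values to its atomic letters.
def ev(wff, val):
    if isinstance(wff, str):                    # atomic proposition
        return val[wff]
    if wff[0] == '~':                           # basic truth table for negation
        return 1 - ev(wff[1], val)
    a, b = ev(wff[1], val), ev(wff[2], val)
    return {'&': a & b,                         # conjunction
            'v': a | b,                         # inclusive disjunction
            '>': 0 if (a, b) == (1, 0) else 1,  # material conditional
            '=': int(a == b)}[wff[0]]           # biconditional

# 2.3.2: (W v ~J) > (~G & J), with W, J, and G all true.
wff_232 = ('>', ('v', 'W', ('~', 'J')), ('&', ('~', 'G'), 'J'))
print(ev(wff_232, {'W': 1, 'J': 1, 'G': 1}))    # 0: the conditional is false
```

The recursion mirrors the order of construction of the wff, so the value returned at the top is the value at the main operator.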
Here is another example, evaluated in the same way, where A is true and X and Y
are false.
A ⊃ (∼ X ∙ ∼ Y)
1 0 0
A ⊃ (∼ X ∙ ∼ Y)
1 1 0 1 0
A ⊃ (∼ X ∙ ∼ Y)
1 1 0 1 1 0
A ⊃ (∼ X ∙ ∼ Y)
1 1 1 0 1 1 0
For a longer example, suppose A, B, and C are true and Y and Z are false, and first
assign those values to the atomic formulas.
[(A ∙ B) ⊃ Y] ⊃ [A ⊃ (C ⊃ Z)]
1 1 0 1 1 0
Next, evaluate the operators deepest inside the parentheses.
[(A ∙ B) ⊃ Y] ⊃ [A ⊃ (C ⊃ Z)]
1 1 1 0 1 1 0 0
Now we can evaluate both the antecedent and the consequent of the main operator.
[(A ∙ B) ⊃ Y] ⊃ [A ⊃ (C ⊃ Z)]
1 1 1 0 0 1 0 1 0 0
[(A ∙ B) ⊃ Y] ⊃ [A ⊃ (C ⊃ Z)]
1 1 1 0 0 1 1 0 1 0 0
Since the antecedent of the main conditional is false, the whole proposition is true.
Sometimes we do not know the truth values of some of the component propositions.
In such cases, we can consider what happens on each possible value of the unknowns.
If the truth values of the whole proposition are the same whatever values we assign
to the unknown propositions, then the statement has that truth value. If the values
come out different in different cases, then the truth value of the complex statement
is really unknown.
Let’s look at a few of these cases, and suppose that A, B, C are true; X, Y, Z are false;
and P and Q are unknown for the remainder of the section. We’ll start with 2.3.17.
2.3.17 P ∙ A
P ∙ A
1 1 1
P ∙ A
0 0 1
Since the truth value of 2.3.17 depends on the truth value of P, it too is unknown.
In contrast, 2.3.18 has a determinable truth value even though one of the atomic
propositions in it is unknown.
2.3.18 P ∨ A
P ∨ A
1 1 1
P ∨ A
0 1 1
The truth value of 2.3.18 is true in both cases. In our bivalent logic, these are the
only cases we have to consider. Thus, the value of that statement is true, even though
we didn’t know the truth value of one of its component propositions.
We have seen that the truth value of a complex proposition containing a component
proposition with an unknown truth value may be unknown and it may be true. Some-
times the truth value of such a complex proposition will come out false, as in 2.3.19.
2.3.19 Q ∙ Y
If Q is true, then 2.3.19 is false.
Q ∙ Y
1 0 0
Q ∙ Y
0 0 0
Since the truth value of the complex proposition is false in both cases, the value of
2.3.19 is false.
Lastly, we can have more than one unknown in a statement. If there are two un-
knowns, we must consider four cases: when both propositions are true; when one is
true and the other is false; the reverse case, when the first is false and the second is
true; and when both are false, as in 2.3.20.
2.3.20 (A ⊃ P) ∨ (Q ⊃ A) where A is true.
(A ⊃ P) ∨ (Q ⊃ A)
1 1 1 1 1 1 1
1 1 1 1 0 1 1
1 0 0 1 1 1 1
1 0 0 1 0 1 1
Since all possible substitutions of truth values for ‘P’ and ‘Q’ in 2.3.20 yield a true
statement, the statement itself is true.
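The try-both-cases reasoning used for 2.3.17 through 2.3.19 can be sketched in Python. This is my illustration, not the book's; the evaluator and the helper below handle a single unknown letter (for 2.3.20's two unknowns, one would iterate over all four combinations).

```python
# Evaluate a wff of PL; '~', '&', 'v', '>', '=' stand in for the operators.
def ev(wff, val):
    if isinstance(wff, str):
        return val[wff]
    if wff[0] == '~':
        return 1 - ev(wff[1], val)
    a, b = ev(wff[1], val), ev(wff[2], val)
    return {'&': a & b, 'v': a | b,
            '>': 0 if (a, b) == (1, 0) else 1,
            '=': int(a == b)}[wff[0]]

def value_with_unknown(wff, val, unknown):
    # Try the unknown letter both ways; if the result is the same in both
    # cases, the whole wff has that value, otherwise it is unknown.
    cases = {ev(wff, {**val, unknown: 1}), ev(wff, {**val, unknown: 0})}
    return cases.pop() if len(cases) == 1 else 'unknown'

# A true, Y false, P and Q unknown:
print(value_with_unknown(('&', 'P', 'A'), {'A': 1}, 'P'))   # 2.3.17: 'unknown'
print(value_with_unknown(('v', 'P', 'A'), {'A': 1}, 'P'))   # 2.3.18: 1
print(value_with_unknown(('&', 'Q', 'Y'), {'Y': 0}, 'Q'))   # 2.3.19: 0
```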
Summary
In this section I introduced all of the basic truth tables, one for each of the five propo-
sitional operators. The basic truth tables are mostly intuitive, and so not very difficult
to reconstruct if you forget one or other of the lines. Remember, especially, that our
disjunction is inclusive, and the material conditional is false only in the second row,
when the antecedent is true and the consequent is false.
The basic truth tables are useful in evaluating the truth value of a complex proposi-
tion on the basis of the truth values of the component, atomic propositions. We can
even sometimes evaluate propositions for which we do not know all of the truth val-
ues of the atomic propositions.
KEEP IN MIND
EXERCISES 2.3a
Assume A, B, C are true and X, Y, Z are false. Evaluate the
truth values of each complex expression.
1. X ∨ Z 11. X ∙ [A ⊃ (Y ∨ Z)]
2. A ∙ ∼C 12. (B ∨ X) ⊃ ∼(Y ≡ C)
3. ∼C ⊃ Z 13. (∼B ⊃ Z) ∙ (A ≡ X)
4. (A ∙ Y) ∨ B 14. ∼(A ≡ C) ⊃ (X ∙ Y)
5. (Z ≡ ∼B) ⊃ X 15. ∼(A ∨ Z) ≡ (X ∙ Y)
6. (A ⊃ B) ∨ ∼X 16. (C ⊃ Y) ∨ [(A ∙ B) ⊃ ∼X]
7. (Z ∙ ∼X) ⊃ (B ∨ Y) 17. [(C ∙ Y) ∨ Z] ≡[∼B ∨ (X ⊃ Y)]
8. (B ≡ C) ⊃ ( A ⊃ X) 18. [(X ∙ A) ⊃ B] ≡[C ∨ ∼(Z ⊃ Y)]
9. (A ∙ Z) ∨ ∼(X ∙ C) 19. [(A ∙ B) ≡ X] ⊃ [(∼Z ∙ C) ∨ Y]
10. (Z ∙ A) ∨ (∼C ∙ Y) 20. [X ⊃ (A ∨ B)] ≡ [(X ∙ Y) ∨ (Z ∙ C)]
EXERCISES 2.3b
Assume A, B, C are true; X, Y, Z are false; and P and Q are
unknown. Evaluate the truth value of each complex
expression.
1. Q ∙ ∼Q 11. ∼[(P ∙ Z) ⊃ Y] ≡ (Z ∨ X)
2. Q ⊃ B 12. [Q ∙ (B ≡ C)] ∙ ∼Y
3. P ∙ ∼C 13. [(A ∨ X) ⊃ (Y ∙ B)] ≡ ∼Q
4. P ≡ ∼P 14. ∼(A ∨ P) ≡ [(B ∙ X) ⊃ Y]
5. P ∨ (X ∙ Y) 15. ∼P ⊃ [∼(A ∙ B) ∨ (Z ∙ Y)]
6. ∼(Z ∙ A) ⊃ P 16. [∼Z ∙ (P ⊃ A)] ∨ [X ≡ ∼(B ⊃ Y)]
7. Q ∨ ∼(Z ∙ A) 17. ∼(X ∨ C) ∙[(P ⊃ B) ⊃ (Y ∙ Z)]
8. (P ⊃ A) ∙ (Z ∨ B) 18. [∼P ⊃ (A ∨ X)] ⊃ [(B ∨ P) ≡ (Y ⊃ Z)]
9. (P ≡ B) ∨ (Y ⊃ C) 19. [(P ∙ A) ∨ ∼B] ≡{∼A ⊃ [(C ∨ X) ∙ Z]}
10. [(Z ⊃ C) ∙ P] ≡ (A ∨ X) 20. [(Q ∨ ∼C) ⊃ Q ] ≡ ∼[Q ≡ (A ∙ ∼Q )]
EXERCISES 2.3c
As in Exercises 2.3b, assume A, B, C are true; X, Y, Z are
false; and P and Q are unknown. Evaluate the truth value of
each complex expression.
1. Q ⊃ (A ∨ P)
2. (P ⊃ C) ≡ [(B ∨ Q ) ⊃ X]
3. (A ∙ Q ) ∙ (X ∙ P)
4. (P ∙ Q ) ⊃ (X ∨ A)
5. (Q ⊃ P) ∙(Z ∨ ∼Y)
6. (P ∙ Z) ⊃ (Q ∨ A)
7. (P ∨ Q ) ∨ (∼A ≡ Y)
8. (P ∙ Z) ⊃ ∼(Q ≡ C)
9. ∼(Y ∨ Q ) ∨ [(P ⊃ B) ≡ A]
10. (X ∙ P) ≡ [(Q ∨ B) ⊃ (Z ≡ A)]
11. ∼{[P ⊃ (Q ⊃ C)] ∙ Z}
2.4: Truth Tables
We construct truth tables for wffs of PL in three steps: determine how many rows
we need, assign truth values to the component variables, and work inside out until
we reach the main operator.
For step 1, the number of rows of a truth table is a function of the number of
variables in the wff. With one propositional variable, we need only two rows, as in the
basic truth table for negation: one for when the variable is true and one for when it
is false. With two propositional variables, we need four rows, as in the basic truth
tables for all the binary operators. Each additional variable doubles the number of
rows needed: the number of rows needed for the simpler table when the new variable
is true and the same number again when the new variable is false.
Determining the Size of a Truth Table
1 variable: 2 rows
2 variables: 4 rows
3 variables: 8 rows
4 variables: 16 rows
n variables: 2ⁿ rows
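Both steps can be sketched in Python (my illustration, not the book's). The row count is 2ⁿ, and, as it happens, `itertools.product([1, 0], ...)` generates rows in exactly the conventional order described below: truest assignments first.

```python
from itertools import product

def rows_needed(n):
    return 2 ** n       # each added variable doubles the number of rows

print(rows_needed(3))   # 8

# The conventional four-row pattern for two variables, P and Q:
for row in product([1, 0], repeat=2):
    print(row)          # (1, 1), (1, 0), (0, 1), (0, 0)
```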
For step 2, it is conventional and useful to start truth tables in a systematic way,
assigning a set of truth values that depends on the size of the truth table to the first
variable in the proposition, a different set to the next variable, and so on. These con-
ventions are constrained by two requirements:
• The truth table must contain every different combination of truth values of the
component propositions.
• The assignments of truth values to any particular propositional variable must
be consistent within the truth table: if the third row under the variable ‘P’ has
a 1 in one column, the third row under the variable ‘P’ must have a 1 in every
column.
Our conventional method for constructing truth tables, which I’ll describe in the
remainder of this section, can be adapted to construct a truth table for any wff of PL.
First, I’ll introduce columns on the left of the table, one for each variable in the wff.
Then, I’ll use a conventional method for assigning truth values to each variable. The
method is the same for each wff with the same number of variables and expands in a
natural way for longer formulas. There are other ways of presenting the same infor-
mation, the truth conditions for any proposition, but I’ll use this one method consis-
tently throughout the book.
For wffs with only one variable, we only need to consider what happens when that
variable is true and when it is false: two rows. We’ll consider what happens when
the variable is true in the first row and what happens when it is false in the second row.
Here is a two-row truth table, for ‘P ⊃ P’:
P P ⊃ P
1 1 1 1
0 0 1 0
Notice that the left side of the truth table contains a column for the only variable,
‘P’. The values in that column are exactly the same under every instance of ‘P’ in the
table. The column under the ⊃, the main operator, contains the values of the whole
wff, which we calculate using the values of ‘P’ and the basic truth table for the material
conditional.
Also notice that, to make things a little easier to read, I highlight the values of the
main operator. Some students like to use highlighters for the values of the main op-
erator, or even different colored highlighters for different columns as one constructs
the table.
Below example 2.4.1 is the beginning of a four-row truth table.
2.4.1 (P ∨ ∼Q) ∙ (Q ⊃ P)
P Q (P ∨ ∼ Q) ∙ (Q ⊃ P)
1 1
1 0
0 1
0 0
Since the wff at 2.4.1 has two variables, the left side of the truth table has two col-
umns. The assignments of truth values to the variables ‘P’ and ‘Q’ use the conven-
tional method I mentioned; it would be good to memorize this pair of columns. All
four-row truth tables ordinarily begin with this set of assignments, though it does not
matter which variable gets which column at first.
To continue to complete the truth table for 2.4.1, we copy the values from the left
side of the truth table to columns under each propositional variable on the right side,
making sure to assign the same values to any particular variable each time it occurs.
P Q (P ∨ ∼ Q) ∙ (Q ⊃ P)
1 1 1 1 1 1
1 0 1 0 0 1
0 1 0 1 1 0
0 0 0 0 0 0
To complete the truth table, we have to fill in the column under the main operator,
the conjunction. We work toward it in the order described in the formation rules of
section 2.2, first evaluating the negations of any formulas whose columns are already
complete, then evaluating binary operators whose two sides are complete.
Let’s continue our example 2.4.1. First complete the column under the tilde.
P Q (P ∨ ∼ Q) ∙ (Q ⊃ P)
1 1 1 0 1 1 1
1 0 1 1 0 0 1
0 1 0 0 1 1 0
0 0 0 1 0 0 0
Then we can complete the columns under the disjunction and the conditional.
P Q (P ∨ ∼ Q) ∙ (Q ⊃ P)
1 1 1 1 0 1 1 1 1
1 0 1 1 1 0 0 1 1
0 1 0 0 0 1 1 0 0
0 0 0 1 1 0 0 1 0
Finally, we can complete the truth table by completing the column under the main
operator, the conjunction, using the columns for the disjunction and the conditional.
P Q (P ∨ ∼ Q) ∙ (Q ⊃ P)
1 1 1 1 0 1 1 1 1 1
1 0 1 1 1 0 1 0 1 1
0 1 0 0 0 1 0 1 0 0
0 0 0 1 1 0 1 0 1 0
Thus, 2.4.1 is false when P is false and Q is true, and true otherwise.
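The completed table for 2.4.1 can be checked with a short Python sketch (mine, not the book's), computing the main-operator column row by row in the conventional order.

```python
from itertools import product

# Evaluate a wff of PL; '~', '&', 'v', '>', '=' stand in for the operators.
def ev(wff, val):
    if isinstance(wff, str):
        return val[wff]
    if wff[0] == '~':
        return 1 - ev(wff[1], val)
    a, b = ev(wff[1], val), ev(wff[2], val)
    return {'&': a & b, 'v': a | b,
            '>': 0 if (a, b) == (1, 0) else 1,
            '=': int(a == b)}[wff[0]]

# 2.4.1: (P v ~Q) & (Q > P)
wff_241 = ('&', ('v', 'P', ('~', 'Q')), ('>', 'Q', 'P'))
column = [ev(wff_241, {'P': p, 'Q': q}) for p, q in product([1, 0], repeat=2)]
print(column)   # [1, 1, 0, 1]: false only where P is 0 and Q is 1
```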
Ordinarily, we write out the truth table only once, as in the last table in this dem-
onstration. Some people choose not to use the left side of the truth table, just assign-
ing values to variables directly. This has the short-term advantage of making your
truth tables shorter, but the long-term disadvantage of making them more difficult
to read.
Here is the start to an eight-line truth table, for 2.4.2.
2.4.2 [(P ⊃ Q) ∙ (Q ⊃ R)] ⊃ (P ⊃ R)
P Q R [(P ⊃ Q) ∙ (Q ⊃ R)] ⊃ (P ⊃ R)
1 1 1
1 1 0
1 0 1
1 0 0
0 1 1
0 1 0
0 0 1
0 0 0
To proceed, first copy the values of the component propositions, P, Q, and R, into
the right side of the table.
P Q R [(P ⊃ Q) ∙ (Q ⊃ R)] ⊃ (P ⊃ R)
1 1 1 1 1 1 1 1 1
1 1 0 1 1 1 0 1 0
1 0 1 1 0 0 1 1 1
1 0 0 1 0 0 0 1 0
0 1 1 0 1 1 1 0 1
0 1 0 0 1 1 0 0 0
0 0 1 0 0 0 1 0 1
0 0 0 0 0 0 0 0 0
Now work inside out, determining the truth values of the operators inside
parentheses.
P Q R [(P ⊃ Q) ∙ (Q ⊃ R)] ⊃ (P ⊃ R)
1 1 1 1 1 1 1 1 1 1 1 1
1 1 0 1 1 1 1 0 0 1 0 0
1 0 1 1 0 0 0 1 1 1 1 1
1 0 0 1 0 0 0 1 0 1 0 0
0 1 1 0 1 1 1 1 1 0 1 1
0 1 0 0 1 1 1 0 0 0 1 0
0 0 1 0 1 0 0 1 1 0 1 1
0 0 0 0 1 0 0 1 0 0 1 0
Next, complete the column under the conjunction, using the columns for the two
conditionals inside the brackets.
P Q R [(P ⊃ Q) ∙ (Q ⊃ R)] ⊃ (P ⊃ R)
1 1 1 1 1 1 1 1 1 1 1 1 1
1 1 0 1 1 1 0 1 0 0 1 0 0
1 0 1 1 0 0 0 0 1 1 1 1 1
1 0 0 1 0 0 0 0 1 0 1 0 0
0 1 1 0 1 1 1 1 1 1 0 1 1
0 1 0 0 1 1 0 1 0 0 0 1 0
0 0 1 0 1 0 1 0 1 1 0 1 1
0 0 0 0 1 0 1 0 1 0 0 1 0
Finally, complete the column under the main operator, the central horseshoe, using
the columns for the conjunction and for ‘P ⊃ R’.
P Q R [(P ⊃ Q) ∙ (Q ⊃ R)] ⊃ (P ⊃ R)
1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 1 0 1 1 1 0 1 0 0 1 1 0 0
1 0 1 1 0 0 0 0 1 1 1 1 1 1
1 0 0 1 0 0 0 0 1 0 1 1 0 0
0 1 1 0 1 1 1 1 1 1 1 0 1 1
0 1 0 0 1 1 0 1 0 0 1 0 1 0
0 0 1 0 1 0 1 0 1 1 1 0 1 1
0 0 0 0 1 0 1 0 1 0 1 0 1 0
You may notice that 2.4.2 has an interesting property: it is true in every row. Not
every proposition is true in all cases! We will return to this property, and others, in
the next section.
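The property just noted can be checked mechanically. Here is a sketch of mine, not the book's, confirming that 2.4.2 is true in every row of its eight-row truth table.

```python
from itertools import product

# Evaluate a wff of PL; '~', '&', 'v', '>', '=' stand in for the operators.
def ev(wff, val):
    if isinstance(wff, str):
        return val[wff]
    if wff[0] == '~':
        return 1 - ev(wff[1], val)
    a, b = ev(wff[1], val), ev(wff[2], val)
    return {'&': a & b, 'v': a | b,
            '>': 0 if (a, b) == (1, 0) else 1,
            '=': int(a == b)}[wff[0]]

# 2.4.2: [(P > Q) & (Q > R)] > (P > R)
wff_242 = ('>', ('&', ('>', 'P', 'Q'), ('>', 'Q', 'R')), ('>', 'P', 'R'))
column = [ev(wff_242, dict(zip('PQR', row)))
          for row in product([1, 0], repeat=3)]
print(column)        # [1, 1, 1, 1, 1, 1, 1, 1]
print(all(column))   # True: 2.4.2 is true in every row
```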
Summary
The goal of this section is to show you how to construct truth tables for any proposi-
tion, of any length, at least in principle. It will be helpful to memorize the method
for assigning truth values to variables for propositions with one, two, three, and
even four variables. But there is, of course, a general method that you could learn for
propositions of any number of variables.
KEEP IN MIND
Truth tables summarize the distributions of truth values of simple and complex propositions.
We construct truth tables for wffs of PL in three steps:
1. Determine how many rows we need.
2. Assign truth values to the component variables.
3. Work inside out until we reach the main operator.
The number of rows of a truth table is a function of the number of variables in the wff.
Assignments of truth values to a propositional variable must be consistent within the truth
table.
If completed correctly, the truth table will contain every different combination of truth values
of the component propositions.
EXERCISES 2.4
Construct truth tables for each of the following propositions.
Tautologies are important because they are the logical truths of PL: propositions
that are true on any interpretation, for any values of their component propositions.
2.5: Classifying Propositions
While there are infinitely many tautologies of PL, most wffs are not tautologies.
2.5.2 is true in some cases, false in others; its truth value is contingent on the truth
values of its component propositions.
2.5.2 P ∨ ∼Q
P Q P ∨ ∼ Q
1 1 1 1 0 1
1 0 1 1 1 0
0 1 0 0 0 1
0 0 0 1 1 0
Contingencies are true in at least one row of their truth table and false in at least
one row. In ordinary language, we say that an event is contingent if it is possible that
it happens and possible that it doesn’t happen; a logical contingency is similarly
neither certainly true nor certainly false.
Some propositions are false in every row of their truth tables. We call such
statements contradictions. 2.5.3 and 2.5.4 are contradictions.
2.5.3 P ∙ ∼P
P P ∙ ∼ P
1 1 0 0 1
0 0 0 1 0
2.5.4 (∼P ⊃ Q) ≡ ∼(Q ∨ P)
P Q (∼ P ⊃ Q) ≡ ∼ (Q ∨ P)
1 1 0 1 1 1 0 0 1 1 1
1 0 0 1 1 0 0 0 0 1 1
0 1 1 0 1 1 0 0 1 1 0
0 0 1 0 0 0 0 1 0 0 0
Consider 2.5.5, another tautology.
2.5.5 (A ∨ B) ≡ (∼B ⊃ A)
A B (A ∨ B) ≡ (∼ B ⊃ A)
1 1 1 1 1 1 0 1 1 1
1 0 1 1 0 1 1 0 1 1
0 1 0 1 1 1 0 1 1 0
0 0 0 0 0 1 1 0 0 0
Now consider the two sides of the biconditional in 2.5.5, as separate propositions,
2.5.6 and 2.5.7, and let’s look at the truth table for just the main operators of each.
2.5.6 A ∨ B 2.5.7 ∼B ⊃ A
A B A ∨ B ∼B ⊃ A
1 1 1 1
1 0 1 1
0 1 1 1
0 0 0 0
Notice that 2.5.6 and 2.5.7 have the same truth values in each row; that’s what
makes the biconditional between them a tautology. This property of propositions,
having identical truth conditions, is called logical equivalence: two or more
propositions are logically equivalent when they have the same truth values in every
row of their truth tables.
The concept of logical equivalence has many uses. It is important in part because
it shows a limit to the expressibility of truth-functional languages like PL: there are
many equivalent ways of saying the same thing, of expressing the same truth con-
ditions. For example, notice that the truth conditions of any statement made using
the biconditional are identical to those made with a conjunction of two conditionals.
That is, a statement of the form ‘α ≡ β’ is logically equivalent to a statement that uses
only other operators, a statement of the form ‘(α ⊃ β) ∙ (β ⊃ α)’.
α β α ≡ β (α ⊃ β) ∙ (β ⊃ α)
1 1 1 1 1 1 1 1 1 1 1 1
1 0 1 0 0 1 0 0 0 0 1 1
0 1 0 0 1 0 1 1 0 1 0 0
0 0 0 1 0 0 1 0 1 0 1 0
We can thus see the biconditional as a superfluous element of our logical language.
Other operators can be shown to be superfluous in similar ways. When constructing
languages for propositional logic, we have choices of which operators to use and how
many operators to use. The study of the relations among the different operators, and
which operators are adequate for propositional logic, is a topic in metalogic.
Metalogic is the study of logical systems.
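The equivalence displayed in the table above can also be checked by brute force. The sketch below is mine, not the book's, instantiating α and β with the letters 'A' and 'B' and comparing the two forms in every row.

```python
from itertools import product

# Evaluate a wff of PL; '~', '&', 'v', '>', '=' stand in for the operators.
def ev(wff, val):
    if isinstance(wff, str):
        return val[wff]
    if wff[0] == '~':
        return 1 - ev(wff[1], val)
    a, b = ev(wff[1], val), ev(wff[2], val)
    return {'&': a & b, 'v': a | b,
            '>': 0 if (a, b) == (1, 0) else 1,
            '=': int(a == b)}[wff[0]]

def agree_everywhere(w1, w2, letters):
    # Logical equivalence: same truth value in every row of the truth table.
    return all(ev(w1, dict(zip(letters, row))) == ev(w2, dict(zip(letters, row)))
               for row in product([1, 0], repeat=len(letters)))

# 'A = B' versus '(A > B) & (B > A)'
print(agree_everywhere(('=', 'A', 'B'),
                       ('&', ('>', 'A', 'B'), ('>', 'B', 'A')), 'AB'))  # True
```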
When evaluating the relations among two or more propositions, make sure to
assign the same truth values to the same variables throughout the exercise. To
compare the two propositions, the column under the A in 2.5.8 should be the same as the
column under the A in 2.5.9, and similarly for the B, even though the B comes first,
reading left to right, in the latter proposition.
2.5.8 A ∨ ∼B
A B A ∨ ∼ B
1 1 1 1 0 1
1 0 1 1 1 0
0 1 0 0 0 1
0 0 0 1 1 0
2.5.9 B ∙ ∼A
A B B ∙ ∼ A
1 1 1 0 0 1
1 0 0 0 0 1
0 1 1 1 1 0
0 0 0 0 1 0
2.5.8 and 2.5.9 have opposite truth values in each row: two propositions with
opposite truth values in all rows of their truth tables are contradictory.
Notice that just as a biconditional connecting logically equivalent statements
is a tautology, a biconditional connecting two contradictory statements will be a
contradiction.
Also notice that contradiction is a relation between exactly two propositions,
whereas logical equivalence can hold for indefinitely many propositions.
Most pairs of statements, like 2.5.10 and 2.5.11, are neither logically equivalent nor
contradictory.
2.5.10 E ⊃ D 2.5.11 ∼E ∙ D
E D E ⊃ D E D ∼ E ∙ D
1 1 1 1 1 1 1 0 1 0 1
1 0 1 0 0 1 0 0 1 0 0
0 1 0 1 1 0 1 1 0 1 1
0 0 0 1 0 0 0 1 0 0 0
We can see that 2.5.10 and 2.5.11 are not contradictory in rows 2 and 3, where they
have the same truth values. We can see that they are not logically equivalent in rows
1 and 4, where they have opposite truth values. Still, there are ways to characterize
their relation.
2.5.10 and 2.5.11 are called consistent propositions, since while they are not
equivalent, they still may be true together: two or more propositions that are true in
at least one common row of their truth tables are consistent. There is at least one
row of the truth tables in
which both propositions are true. In consistent propositions, there are values of the
component variables that will make both propositions true in the same conditions.
2.5.10 and 2.5.11 are both true in row 3. Thus, someone who uttered both proposi-
tions would be speaking truthfully if E is false and D is true. This assignment of truth
values to simple component propositions is called a valuation. When you determine
that two or more propositions are consistent, you can thus describe a consistent
valuation by stating the values of the component variables in the row in which both
full propositions are true.
If two statements are neither logically equivalent nor contradictory, they may thus
be consistent or inconsistent. Inconsistency is just the negation of consistency; like
contradictoriness, inconsistency holds only among pairs of propositions.
2.5.12 and 2.5.13 are an inconsistent pair: there is no row of their truth tables in
which both are true.
2.5.12 E ∙ F 2.5.13 ∼(E ⊃ F)
E F E ∙ F E F ∼ (E ⊃ F)
1 1 1 1 1 1 1 0 1 1 1
1 0 1 0 0 1 0 1 1 0 0
0 1 0 0 1 0 1 0 0 1 1
0 0 0 0 0 0 0 0 0 1 0
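Checking consistency is a search for a common true row. The sketch below is mine, not the book's: it returns a consistent valuation if one exists, and None otherwise.

```python
from itertools import product

# Evaluate a wff of PL; '~', '&', 'v', '>', '=' stand in for the operators.
def ev(wff, val):
    if isinstance(wff, str):
        return val[wff]
    if wff[0] == '~':
        return 1 - ev(wff[1], val)
    a, b = ev(wff[1], val), ev(wff[2], val)
    return {'&': a & b, 'v': a | b,
            '>': 0 if (a, b) == (1, 0) else 1,
            '=': int(a == b)}[wff[0]]

def consistent_valuation(w1, w2, letters):
    for row in product([1, 0], repeat=len(letters)):
        val = dict(zip(letters, row))
        if ev(w1, val) == 1 and ev(w2, val) == 1:
            return val          # a row in which both propositions are true
    return None                 # no common true row: the pair is inconsistent

# 2.5.10 and 2.5.11 (E > D and ~E & D): consistent, with E false and D true.
print(consistent_valuation(('>', 'E', 'D'), ('&', ('~', 'E'), 'D'), 'ED'))
# 2.5.12 and 2.5.13 (E & F and ~(E > F)): inconsistent.
print(consistent_valuation(('&', 'E', 'F'), ('~', ('>', 'E', 'F')), 'EF'))
```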
KEEP IN MIND
EXERCISES 2.5a
Construct truth tables for each of the following propositions
and then classify each proposition as tautologous,
contingent, or contradictory.
1. A ∨ ∼A 9. (E ⊃ F) ≡ ∼(F ∨ ∼E)
2. B ≡ ∼B 10. (G ∙ ∼H) ∨ (H ⊃ ∼G)
3. ∼C ⊃ ∼C 11. ∼(I ∙ J) ≡ (∼I ∨ ∼J)
4. ∼(A ∨ ∼A) 12. (K ⊃ L) ≡ (K ∙ ∼L)
5. ∼(B ∙ ∼B) 13. (∼M ∙ N) ∙ (N ⊃ M)
6. ∼C ≡ (C ∨ ∼C) 14. (A ⊃ B) ≡ (∼A ∨ B)
15. (∼E ⊃ F) ∨ (∼E ∙ ∼F)
7. B ≡ (A ∙ ∼B) 16. (M ⊃ ∼N) ∙ (M ∙ N)
8. (C ∨ D) ∙ ∼(D ⊃ C) 17. (Q ⊃ R) ≡ (∼R ∙ Q )
18. (S ⊃ ∼T) ∨ (T ≡ S)
19. (U ∙ ∼V) ⊃ (V ≡ ∼U)
20. (∼P ≡ Q ) ∙ ∼[Q ⊃ (P ∨ Q )]
21. (T ⊃ U) ∨ (U ⊃ T)
22. (D ⊃ F) ∨ (E ⊃ D)
23. (O ≡ P) ≡ [(∼O ∨ P) ⊃ (P ∙ ∼O)]
24. ∼[W ⊃ (X ∨ ∼W)]
25. (∼Y ⊃ ∼Z) ∙ (Z ∨ Y)
26. ∼C ≡ (A ∨ ∼B)
27. (G ∙ H) ⊃ (G ∨ I)
28. (J ∙ ∼K) ∙ ∼(L ∨ J)
29. (N ∨ O) ⊃ (M ∙ O)
30. ∼(P ∙ Q ) ∨ (Q ⊃ R)
31. ∼{A ⊃ [(B ∙ C) ≡ ∼A]}
32. [(G ∙ H) ⊃ (I ∨ ∼H)] ≡ ∼(G ∙ H)
33. [(J ∙ K) ⊃ L] ≡ [L ∨ (∼J ∨ ∼K)]
34. [M ⊃ (∼N ∙ ∼O)] ∙ [(M ∙ N) ∨ (M ∙ O)]
35. [∼A ∨ (∼B ∙ ∼C)] ≡ [(A ∙ B) ∨ (A ∙ C)]
36. [D ∨ (E ∙ F)] ≡ [(D ∨ E) ∙ (D ∨ F)]
37. (G ∨ H) ∨ (I ∨ J)
38. (T ∙ U) ⊃ ∼(V ⊃ W)
39. [K ∙ (L ⊃ M)] ∨ (N ≡ K)
40. [P ⊃ (Q ∙ R)] ⊃ [∼S ≡ (P ∨ R)]
41. [(W ∙ X) ⊃ (Y ∨ Z)] ∨ [(∼Z ∙ X) ∙ (W ∙ ∼Y)]
42. [(A ∨ B) ⊃ (∼D ∙ C)] ≡ {∼[(A ∨ B) ∙ D] ∙ [(A ∨ B) ⊃ C]}
43. [(E ∙ F) ∨ (∼E ∨ ∼F)] ⊃ [(∼G ∙ H) ∙ (∼G ⊃ ∼H)]
44. [(J ⊃ ∼I) ∙ (∼K ⊃ ∼L)] ∙[(L ∙ ∼K) ∨ (I ∙ J)]
45. [M ≡ (∼N ∙ O)] ⊃ [(P ∙ N) ⊃ M]
EXERCISES 2.5b
Construct truth tables for each of the following pairs of
propositions. Then, for each pair of propositions, determine
whether the statements are logically equivalent or
contradictory. If neither, determine whether they are
consistent or inconsistent.
1. ∼E ⊃ ∼F and E ∨ F
2. G ⊃ H and ∼H ∙ G
3. K ≡ L and ∼(L ⊃ K)
4. ∼(M ∨ N) and ∼M ∙ ∼N
5. ∼O ⊃ P and O ∨ P
6. ∼Q ≡ R and Q ∙ R
7. (S ∨ T) ∙ ∼S and T ⊃ S
8. ∼Y ⊃ Z and ∼Z ⊃ Y
9. ∼(A ∙ B) and ∼A ⊃ B
10. C ⊃ (D ∙ C) and ∼D ∙ C
11. (E ∨ F) ∙ E and ∼(E ∨ F)
12. (G ∙ H) ∨ ∼G and ∼H ⊃ (G ≡ H)
13. I ∨ (J ∙ ∼J) and (J ≡ ∼I) ∙ J
14. (∼M ∙ ∼N) ≡ N and (N ∨ M) ∙ ∼M
15. (O ∨ P) ⊃ O and ∼O ≡ (P ∙ O)
16. (Q ∨ R) ∙ S and (Q ⊃ S) ∙ R
17. T ∨ (U ∙ W) and (T ∨ U) ∙ (T ∨ W)
18. (X ∙ Y) ∨ Z and (∼X ∨ ∼Y) ∙ ∼Z
19. (A ∙ B) ⊃ C and A ⊃ (B ⊃ C)
20. ∼(G ∨ H) ∙ I and (I ⊃ G) ∙ H
21. ( J ≡ K) ∙ L and [(∼L ∨ ∼K) ∙(L ∨ K)] ∨ ∼L
22. (M ⊃ N) ∨ (N ∙ ∼O) and (M ∙ ∼N) ∙ (∼N ∨ O)
23. (X ∙ Y) ⊃ Z and (X ∙ Y) ∙ ∼Z
24. (A ⊃ B) ∙ C and (∼B ⊃ ∼A) ∙ C
2.6: Valid and Invalid Arguments
To show that 2.6.3 is invalid, we could assign truth values to the component propo-
sitions which yield true premises and a false conclusion. If ‘God exists’ were false and
‘every effect has a cause’ were true, then the conclusion would be false, but each of the
premises would be true. (The first premise is vacuously true according to the basic
truth table for the material conditional.) This assignment of truth values, or valuation,
is called a counterexample to argument 2.6.3: a counterexample to an argument is a
valuation that makes the premises true and the conclusion false.
The argument in example 2.6.3 has the form at 2.6.4.
2.6.4 α ⊃ β
β / α
In deductive logic, an invalid argument is called a fallacy. (In informal or inductive contexts, the term ‘fallacy’ has a broader meaning.) Arguments of the form 2.6.4 commit a fallacy so well known that it has a name: affirming the consequent. It is logically possible for the premises of such an argument to be true while its conclusion is false. A counterexample is generated when the wff that replaces α is false and the wff that replaces β is true. This fallacy is a formal result, having nothing to do with the content of the propositions used in the argument.
We need a rigorous method for distinguishing valid argument forms like 2.6.2 from
invalid ones like 2.6.4. The truth table method for determining if an argument is valid
is both rigorous and simple.
Consider argument 2.6.5, with the premises ‘P ⊃ Q’ and ‘P’ and the conclusion ‘Q’. We construct a single truth table for the whole argument, listing the premises and the conclusion across the top.

P Q P ⊃ Q / P // Q
1 1 1 1 1 1 1
1 0 1 0 0 1 0
0 1 0 1 1 0 1
0 0 0 1 0 0 0
Now that our truth table is complete, we can search for a counterexample. Notice
that in no row are the premises true and the conclusion false. There is thus no coun-
terexample. 2.6.5 is a valid argument.
In contrast, both 2.6.6 and 2.6.7 are invalid arguments. To show that they are invalid, we specify a counterexample. Some arguments have more than one counterexample; a single counterexample is sufficient to show that an argument is invalid.
2.6.6 P ⊃ Q
Q /P
P Q P ⊃ Q / Q // P
1 1 1 1 1 1 1
1 0 1 0 0 0 1
0 1 0 1 1 1 0
0 0 0 1 0 0 0
Argument 2.6.6 has a counterexample in the third row, when P is false and Q is true.
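This search for a counterexample is mechanical, and it can be instructive to see it carried out in code. The following Python sketch is my own illustration, not part of the text's apparatus: it enumerates every valuation for argument 2.6.6 and collects those that make the premises true and the conclusion false.

```python
from itertools import product

def counterexamples(premises, conclusion, num_vars):
    """Return every valuation making all premises true and the conclusion false."""
    return [row for row in product([True, False], repeat=num_vars)
            if all(p(*row) for p in premises) and not conclusion(*row)]

# Argument 2.6.6: P > Q, Q, therefore P (affirming the consequent).
found = counterexamples(
    premises=[lambda p, q: (not p) or q,   # P > Q, as a material conditional
              lambda p, q: q],             # Q
    conclusion=lambda p, q: p,             # P
    num_vars=2,
)
print(found)  # [(False, True)]: P false and Q true, as in the third row
```

Running the same search with premises ‘P ⊃ Q’ and ‘P’ and conclusion ‘Q’ returns an empty list, confirming that argument form valid.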
2.6.7 (P ∙ Q) ⊃ R
∼P ∨ R
Q ∨ R / R ∙ Q
P Q R (P ∙ Q) ⊃ R / ∼ P ∨ R /
1 1 1 1 1 1 1 1 0 1 1 1
1 1 0 1 1 1 0 0 0 1 0 0
1 0 1 1 0 0 1 1 0 1 1 1
1 0 0 1 0 0 1 0 0 1 0 0
0 1 1 0 0 1 1 1 1 0 1 1
0 1 0 0 0 1 1 0 1 0 1 0
0 0 1 0 0 0 1 1 1 0 1 1
0 0 0 0 0 0 1 0 1 0 1 0
Q ∨ R // R ∙ Q
1 1 1 1 1 1
1 1 0 0 0 1
0 1 1 1 0 0
0 0 0 0 0 0
1 1 1 1 1 1
1 1 0 0 0 1
0 1 1 1 0 0
0 0 0 0 0 0
Argument 2.6.7 has a counterexample in row 3, where P and R are true and Q is false. There is another counterexample in row 6, where Q is true but P and R are false, and another in row 7, where P and Q are false and R is true. Again, one needs only a single counterexample to demonstrate that an argument is invalid.
Summary
The method of determining the validity of an argument of PL in this section is the
most important item in this chapter. It is a foundation of all of the work on deriva-
tions in chapter 3 and in principle could be used to test the validity of any argument
of propositional logic. As we will see in the next section, and in the next chapter, this
method can get unwieldy and there are alternative methods for determining the va-
lidity and the invalidity of longer, more complicated arguments. All of those methods,
though, rely on this method for their justifications.
KEEP IN MIND
To test an argument for validity, look for a counterexample, a valuation on which the prem-
ises are true and the conclusion is false.
To look for a counterexample, construct one truth table for the entire argument, including
all of the premises and the conclusion.
Line up premises and conclusion horizontally, separating premises with a single slash and
separating the premises from the conclusion with a double slash.
Use consistent assignments to component variables throughout the whole truth table.
If there is a counterexample, the argument is invalid.
An invalid argument is one in which it is possible for true premises to yield a false
conclusion.
EXERCISES 2.6
Construct truth tables to determine whether each argument
is valid. If an argument is invalid, specify a counterexample.
1. A ⊃ ∼A
∼A / A
3. A ⊃ ∼A
∼∼A / ∼A
4. B ∙ C
C / ∼B
5. C ∨ D
∼D / ∼C
6. E ∨ F
∼(E ∙ ∼F) / E ≡ F
7. G ≡ H
∼H / ∼G
8. (K ∙ L) ∨ (K ∙ ∼L)
∼K / L
9. M ≡ ∼N
M ∨ N
M / ∼N ⊃ N
10. ∼P ⊃ Q
Q ⊃ P / ∼P
11. A ⊃ B
∼B ∙ ∼A
B / B ∨ ∼A
12. G ⊃ (H ∙ ∼G)
H ∨ G
∼H / ∼G
13. J ⊃ K
K
∼J ∨ K / ∼J
14. P ⊃ Q
∼Q ∨ P
∼Q / P
15. R ≡ S
∼R ∨ S
∼S ⊃ ∼R / R
16. R ⊃ S
S ∨ T / R ⊃ T
17. X ∙ ∼Y
Y ∨ Z / ∼Z
18. ∼(A ∙ B)
B ⊃ C / A
19. D ∨ E
∼D ∙ ∼F / ∼(E ∙ F)
20. G ≡ H
H ∙ ∼I / ∼(I ∙ G)
21. J ⊃ ∼K
K ⊃ L / ∼(L ∙ J)
22. I ⊃ (J ∙ K)
I ∙ ∼K / J ∙ ∼K
23. O ⊃ P
∼P ∨ Q
∼Q / ∼O
24. (∼A ∨ B) ⊃ C
A ∙ B / C
25. L ≡ (M ∨ N)
L ∙ ∼N
M ⊃ ∼L / ∼L
26. ∼R ∨ S
∼(∼T ∙ S)
∼T ∙ ∼R
R ∨ S / T ∙ S
27. (U ∙ V) ∨ W
(∼W ⊃ U) ⊃ V
∼V ∨ W
∼W ∨ U / U
28. (X ∙ Y) ≡ Z
∼Z ∙ X
∼X ⊃ Y / Y
29. D ∨ ∼E
∼E ∙ F
∼D ⊃ F / D
30. (G ∙ H) ⊃ ∼I
I ∨ G
H ⊃ ∼G
H ≡ I / ∼H ∨ G
31. T ⊃ (U ∙ V)
T ∙ U
∼V
U ⊃ ∼T / T
32. M ∙ ∼N
O ⊃ P
P ∨ N / ∼M
33. Q ⊃ R
S ∨ T
T / R
34. ∼W ⊃ (X ∨ Y)
Y ∙ Z
∼(Z ⊃ X) / W ≡ Y
35. ∼A ∙ (B ∨ C)
C ⊃ A
B ⊃ D / A ⊃ ∼D
36. E ∙ F
G ⊃ (H ∨ ∼E)
∼F ∨ G / H
37. (W ⊃ X) ≡ Z
∼Z ∙ Y
(Y ∙ W) ⊃ X
X ∙ Z
(W ∙ Y) ⊃ (∼Z ∙ X) / Z ∨ W
38. ∼A ⊃ (B ∨ C)
∼C ∙ (∼B ∨ A)
C ∨ ∼A
A ≡ (B ⊃ ∼C)
B / ∼A
39. (D ∙ G) ⊃ (E ∙ F)
D ∨ E
(G ∨ F) ≡ ∼E
G ⊃ E / ∼G
40. ∼(H ⊃ K)
K ⊃ (I ∙ J)
I ≡ H
H ⊃ (J ∨ K)
∼J ∙ (H ∨ ∼K) / K
2.7: Indirect Truth Tables

2.7.1 is an invalid argument, as we can show with the indirect, or shortcut, method.
2.7.1 G ≡ H
G / ∼H
To show that 2.7.1 is invalid, first write it out, as you would a normal truth table for
an argument. Just as I did for the truth tables, I’ll list all the component propositions
on the left side of the table; that way, when we’re done, the valuation that generates a
counterexample will be obvious.
G H G ≡ H / G // ∼ H
Next, we can assign the value true to H, in order to make the conclusion false. I’ll
use the left side of the truth table to keep track of the valuation.
G H G ≡ H / G // ∼ H
1 0 1
Next, we assign true to G, making the second premise true.

G H G ≡ H / G // ∼ H
1 1 0 1
Finally, we complete the first premise. With G and H both true, ‘G ≡ H’ is true: all of the premises are true and the conclusion is false.

G H G ≡ H / G // ∼ H
1 1 1 1 1 1 0 1
2.7.1 is thus invalid since there is a counterexample when G is true and H is true.
Note that an argument is either valid or invalid. If there is at least one counterexample,
the argument is invalid. It is not merely invalid on that assignment of truth values; it
is always invalid.
If there is a counterexample, this indirect method will be able to find it. But we have
to make sure to try all possible valuations before we pronounce the argument valid.
2.7.2 is a valid argument. We will not be able to construct a counterexample. Let’s
see how that goes.
2.7.2 C ⊃ (D ⊃ E)
D ⊃ (E ⊃ F) / C ⊃ (D ⊃ F)
The only way to make the conclusion false is to assign true to C and to D, and
false to F.
C D E F C ⊃ (D ⊃ E) /
1 1 0
D ⊃ (E ⊃ F) // C ⊃ (D ⊃ F)
1 0 1 0 0
C D E F C ⊃ (D ⊃ E) /
1 1 0 1 1
D ⊃ (E ⊃ F) // C ⊃ (D ⊃ F)
1 0 1 0 1 0 0
C D E F C ⊃ (D ⊃ E) /
1 1 1 0 1 1 1 1 1
D ⊃ (E ⊃ F) // C ⊃ (D ⊃ F)
1 1 0 1 0 1 0 0
But now the second premise is false. If we tried to make the second premise true
by making E false, the first premise would come out false. There was no other way to
make the conclusion false. So, there is no counterexample. 2.7.2 is thus valid.
In some arguments, there is more than one way to make a conclusion false or to
make premises true. You may have to try more than one. Once you arrive at a counter-
example, you may stop. But if you fail to find a counterexample, you must keep going
until you have tried all possible assignments.
The argument at 2.7.3 has multiple counterexamples.
2.7.3 I ⊃ K
K ⊃ J /I∙J
There are three ways to make the conclusion of 2.7.3 false. We can try them in any
order, but we have to remember that if our first attempts to construct true premises
fail, we must try the others. I’ll write them all, which gives us (potentially) a three-
row truth table to complete; it’s still fewer than the eight rows we would need in a full
truth table.
I K J I ⊃ K / K ⊃ J // I ∙ J
1 0 1 0 0
0 1 0 0 1
0 0 0 0 0
In the first row, there is no way to assign a truth value to K that makes the prem-
ises true.
I K J I ⊃ K / K ⊃ J // I ∙ J
1 0 1 ? ? 0 1 0 0
0 1 0 0 1
0 0 0 0 0
I K J I ⊃ K / K ⊃ J // I ∙ J
1 0 1 ? ? 0 1 0 0
0 1 0 1 0 0 1
0 0 0 0 0
In the second row, we can assign either value to K and find a counterexample. So,
2.7.3 is shown invalid by the counterexample when I is false, J is true, and K is true;
it is also shown invalid by the counterexample when I is false, J is true, and K is false.
Since we found counterexamples in the second option, there is no need to continue
with the third option.
2.7.4 requires more work.
2.7.4 T ⊃ (U ∨ X)
U ⊃ (Y ∨ Z)
Z ⊃ A
∼(A ∨ Y) / ∼T
Let’s start with the conclusion, making T true in order to make its negation false,
carrying that assignment into the first premise.
T U X Y Z A T ⊃ (U ∨ X) /
1 1
U ⊃ (Y ∨ Z) / Z ⊃ A / ∼ (A ∨ Y) // ∼ T
0 1
From the first premise, ‘U ∨ X’ must be true, but there are three ways to assign
values to make it so. Similarly, there are multiple ways to assign values for the second
and third premises. But there is only one assignment that makes the fourth premise
true, making A and Y false.
T U X Y Z A T ⊃ (U ∨ X) /
1 0 0 1
U ⊃ (Y ∨ Z) / Z ⊃ A / ∼ (A ∨ Y) // ∼ T
1 0 0 0 0 1
Let’s carry these assignments to the Y in the second premise and the A in the third.
T U X Y Z A T ⊃ (U ∨ X) /
1 0 0 1
U ⊃ (Y ∨ Z) / Z ⊃ A / ∼ (A ∨ Y) // ∼ T
0 0 1 0 0 0 0 1
Inspecting the third premise, we can see that Z must also be false; we can carry this
value to the second premise.
T U X Y Z A T ⊃ (U ∨ X) /
1 0 0 0 1
U ⊃ (Y ∨ Z) / Z ⊃ A / ∼ (A ∨ Y) // ∼ T
0 0 0 1 0 1 0 0 0 0 1
Since ‘Y ∨ Z’ has now been made false, U must be made false in order to keep the second premise true.
T U X Y Z A T ⊃ (U ∨ X) /
1 0 0 0 0 1
U ⊃ (Y ∨ Z) / Z ⊃ A / ∼ (A ∨ Y) // ∼ T
0 1 0 0 0 0 1 0 1 0 0 0 0 1
Carry the value of U to the first premise; we are now forced to make X true in order
to make the first premise true.
T U X Y Z A T ⊃ (U ∨ X) /
1 0 0 0 0 1 0
U ⊃ (Y ∨ Z) / Z ⊃ A / ∼ (A ∨ Y) // ∼ T
0 1 0 0 0 0 1 0 1 0 0 0 0 1
T U X Y Z A T ⊃ (U ∨ X) /
1 0 1 0 0 0 1 1 0 1 1
U ⊃ (Y ∨ Z) / Z ⊃ A / ∼ (A ∨ Y) // ∼ T
0 1 0 0 0 0 1 0 1 0 0 0 0 1
The premises are now all true and the conclusion is false: 2.7.4 is invalid, with a counterexample on which T and X are true and U, Y, Z, and A are all false.

The indirect method also applies to sets of propositions: to show that a set is consistent, we try to construct a valuation on which all of its members are true. Consider the following set. The only way to make the last proposition, ‘A ∙ D’, true is to assign true to both A and D.

A B C D A ⊃ (B ∨ C) / ∼ B ∨ ∼ C /
1 1
(A ∙ B) ⊃ C / A ∙ D
1 1 1
I’ll carry the value of A through the rest of the set (there are no other Ds), but there
are no other obvious, forced moves.
A B C D A ⊃ (B ∨ C) / ∼ B ∨ ∼ C /
1 1 1
(A ∙ B) ⊃ C / A ∙ D
1 1 1 1
The consequent of the first proposition must be true, but there are three ways to make it true (making B true, C true, or both). There are likewise three ways to make the disjunction in the second proposition true. And the antecedent of the conditional in the third proposition may be either true or false, so we are not forced to assign a value to its consequent.
We must arbitrarily choose a next place to work. I’ll choose to start with B, expand-
ing the table to include a true value and a false value for B. If one does not work out, I
will have to return to the other one.
A B C D A ⊃ (B ∨ C) / ∼ B ∨ ∼ C /
1 1 1 1 1 1
1 0 1 1 0 0
(A ∙ B) ⊃ C / A ∙ D
1 1 1 1 1
1 0 1 1 1
I’ll try the first line first. Assigning 1 to B makes the first proposition true, without
constraining an assignment to C; so far so good. In the second proposition, if B is true,
then C must be false. But in the third proposition, if B is true, then the antecedent is
true and so C must be true.
A B C D A ⊃ (B ∨ C) / ∼ B ∨ ∼ C /
1 1 1 1 1 1 1 0 1
1 0 1 1 0 0
(A ∙ B) ⊃ C / A ∙ D
1 1 1 1 1 1
1 0 1 1 1
There is thus no consistent valuation with B true. Let’s move to the second line,
where B is false; I’ll cross off the values in the first row to remind us that we’re finished
with it. With B false, in the first proposition, C must be true.
A B C D A ⊃ (B ∨ C) / ∼ B ∨ ∼ C /
1 1 1 1 1 1 1 0 1
1 0 1 1 1 1 0 1 1 0
(A ∙ B) ⊃ C / A ∙ D
1 1 1 1 1 1
1 0 1 1 1
But making B false makes the second proposition true without considering the
value for C. And the third proposition is the same; once we make B false, the anteced-
ent is false and so the proposition is true.
A B C D A ⊃ (B ∨ C) / ∼ B ∨ ∼ C /
1 1 1 1 1 1 1 0 1
1 0 1 1 1 1 0 1 1 1 0 1
(A ∙ B) ⊃ C / A ∙ D
1 1 1 1 1 1
1 0 0 1 1 1 1
We have thus found a consistent valuation. The set of propositions is shown consis-
tent when A, C, and D are true and B is false.
The set at 2.7.6 is larger; again, we try to construct a valuation on which all of the propositions are true.

A B C D E F A ≡ B / (B ∨ ∼ A) ⊃ C /
(A ∨ ∼ B) ⊃ D / D ⊃ E / ∼ F ∨ ∼ D
There is no obvious place to start. There are three ways to make the conditionals in
the second, third, and fourth propositions true and three ways to make the disjunc-
tion in the final proposition true. We might as well start with the first proposition,
since there are only two ways to make it true: either both A and B are true or both A
and B are false. Other options are available, and may even be better in the long run.
In this example, I’ll work on both rows at the same time, carrying values for A and
B throughout.
A B C D E F A ≡ B / (B ∨ ∼ A) ⊃ C /
1 1 1 1 1 1 1
0 0 0 1 0 0 0
(A ∨ ∼ B) ⊃ D / D ⊃ E / ∼ F ∨ ∼ D
1 1
0 0
A B C D E F A ≡ B / (B ∨ ∼ A) ⊃ C /
1 1 1 1 1 1 1 0 1
0 0 0 1 0 0 1 1 0
(A ∨ ∼ B) ⊃ D / D ⊃ E / ∼ F ∨ ∼ D
1 1 0 1
0 1 1 0
Looking at the second proposition, above, we see that C must be true in both rows,
since the antecedent of the main operator is true in both rows. Similar reasoning
holds for D. I’ll fill in the results for the second and third propositions and carry the
values for D to the fourth and fifth.
A B C D E F A ≡ B / (B ∨ ∼ A) ⊃ C /
1 1 1 1 1 1 1 1 1 0 1 1 1
0 0 1 1 0 1 0 0 1 1 0 1 1
(A ∨ ∼ B) ⊃ D / D ⊃ E / ∼ F ∨ ∼ D
1 1 0 1 1 1 1 1
0 1 1 0 1 1 1 1
Now we can see from the fourth proposition that E must be true in both rows, too.
We can also evaluate the negation in the fifth proposition.
A B C D E F A ≡ B / (B ∨ ∼ A) ⊃ C /
1 1 1 1 1 1 1 1 1 1 0 1 1 1
0 0 1 1 1 0 1 0 0 1 1 0 1 1
(A ∨ ∼ B) ⊃ D / D ⊃ E / ∼ F ∨ ∼ D
1 1 0 1 1 1 1 1 1 0 1
0 1 1 0 1 1 1 1 1 0 1
Since we are working toward a consistent valuation, we want the last proposition to be true. Its right disjunct, ‘∼D’, is false, so the negation of F must be true; and for ‘∼F’ to be true, F must be false.
A B C D E F A ≡ B / (B ∨ ∼ A) ⊃ C /
1 1 1 1 1 0 1 1 1 1 1 0 1 1 1
0 0 1 1 1 0 0 1 0 0 1 1 0 1 1
(A ∨ ∼ B) ⊃ D / D ⊃ E / ∼ F ∨ ∼ D
1 1 0 1 1 1 1 1 1 1 0 1 0 1
0 1 1 0 1 1 1 1 1 1 0 1 0 1
We have thus found two consistent valuations for 2.7.6: when A, B, C, D, and E are
all true and F is false; and when A, B, and F are false and C, D, and E are true.
Remember, just as an argument is invalid if there is at least one counterexample, a
set of propositions is consistent if there is at least one consistent valuation; we do not
need the second one. If there is no consistent valuation, the set is inconsistent.
Let’s look at one more example.
2.7.7 P ⊃ (Q ∙ R)
Q ⊃ (S ⊃ T)
R ⊃ (T ⊃ ∼S)
P ∙ S
We have a clear place to begin: the fourth proposition. P and S must both be true.
I’ll fill in those values through all four propositions.
P Q R S T P ⊃ (Q ∙ R) / Q ⊃ (S ⊃ T) /
1 1 1 1
R ⊃ (T ⊃ ∼ S) / P ∙ S
1 1 1 1
Looking at the first proposition next, we can see that the values of Q and R are also
determined. I’ll fill those in throughout, finishing the first proposition, and evaluate
the negation of S in the third proposition.
P Q R S T P ⊃ (Q ∙ R) / Q ⊃ (S ⊃ T) /
1 1 1 1 1 1 1 1 1 1 1
R ⊃ (T ⊃ ∼ S) / P ∙ S
1 0 1 1 1 1
Now we can turn our attention to the final component variable T. If we make T true,
then the second proposition comes out true but the third proposition turns out false.
If we make T false, then the third proposition comes out true but the second propo-
sition comes out false. There are no other possibilities: our hand was forced at each
prior step. There is no way to make all the propositions in the set true. 2.7.7 is thus an
inconsistent set of propositions.
Summary
The method of indirect truth tables is powerful when applied both to determining the
validity of an argument and to determining the consistency of a set of propositions.
(It can also be fun to use!) At root, it is the same method. But be careful to distinguish
the two cases. When we want to know if a set of propositions is consistent, we try to
make all the propositions true. When we want to know if an argument is valid, we
look for a counterexample, a valuation on which the premises all come out true but
the conclusion comes out false. And remember, some arguments are valid and some
arguments are invalid, and some sets of propositions are consistent and some sets of
propositions are inconsistent. So, even though you must try all possible valuations,
you might not be able to find a counterexample or consistent valuation.
We will use an extended version of this indirect truth table method for determining
counterexamples to arguments again in chapters 4 and 5, in first-order logic. For now,
there are two salient applications of the method. When determining if an argument
is valid, the method, if used properly, will generate a counterexample if there is one.
For sets of sentences, the method will yield a consistent valuation if there is one. Make
sure to work until you have exhausted all possible assignments of truth values to the
simple, component propositions.
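The two applications described in this summary can both be phrased as searches over valuations. As a brute-force illustration (my own sketch, not the shortcut method itself, which prunes most of these rows), the following Python code checks the set 2.7.7 for consistency by trying every assignment to P, Q, R, S, and T:

```python
from itertools import product

def consistent_valuations(props, num_vars):
    """All valuations (tuples of truth values) making every proposition true."""
    return [row for row in product([True, False], repeat=num_vars)
            if all(p(*row) for p in props)]

def implies(a, b):
    """Material conditional."""
    return (not a) or b

# The set 2.7.7, with variables in the order P, Q, R, S, T:
props_277 = [
    lambda p, q, r, s, t: implies(p, q and r),           # P > (Q . R)
    lambda p, q, r, s, t: implies(q, implies(s, t)),     # Q > (S > T)
    lambda p, q, r, s, t: implies(r, implies(t, not s)), # R > (T > ~S)
    lambda p, q, r, s, t: p and s,                       # P . S
]
print(consistent_valuations(props_277, 5))  # []: the set is inconsistent
```

The empty result agrees with the indirect table above: our hand was forced at each step, and no assignment to T rescues both the second and third propositions.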
KEEP IN MIND

To show that an argument is invalid with an indirect truth table, try to construct a counterexample: make the conclusion false while making all of the premises true.
To show that a set of propositions is consistent, try to make all of the propositions true.
If your first attempt fails, try the other possible assignments; do not pronounce an argument valid or a set inconsistent until you have exhausted them all.
EXERCISES 2.7a
Determine whether each of the following arguments is valid.
If invalid, specify a counterexample.
1. L ≡ (M ∙ N)
L∙O
(M ∙ O) ⊃ P /P
2. A ⊃ (B ∨ C)
C ∙ (∼D ⊃ A)
E ∙ B /E∙A
3. F ≡ (G ∨ H)
I ⊃ (J ⊃ F)
(I ∙ G) ∨ H /J⊃G
4. (Z ∙ V) ⊃ (U ∨ W)
X ∨ (∼Y ≡ W)
Z ∙ Y / ∼U
5. A ∙ B
B⊃C
∼B ∨ (D ⊃ ∼C) / ∼D
6. ∼Y ≡ (∼X ∙ Z)
Z ⊃ Y /Z⊃X
7. J ∨ M
L∙M
K ⊃ L / ∼K ⊃ J
8. N ⊃ O
O∙P
P ≡ Q / ∼(Q ∨ N)
9. T ≡ S
S∙U
R ⊃ U /R∨T
10. Z ∨ (X ∙ Y)
W≡V
Z ∙ V / W ⊃ (X ∨ Y)
11. S ⊃ (V ∙ T)
U∨R
∼S ≡ (R ∨ T) /T⊃U
12. E ⊃ (F ∨ H)
(G ∙ H) ⊃ E
∼F ∙ ∼H
E ⊃ ∼G / E ⊃ ∼H
13. W ⊃ (X ∙ Y)
∼(Z ⊃ X)
X ∨ (W ∙ ∼Z) / Y ∙ ∼Z
14. A ∨ (D ∙ C)
A ⊃ (B ∨ C)
D ∙ (∼B ∙ ∼C) / D ∙ C
15. ∼N
[(O ∨ P) ∨ Q ] ⊃ (N ∙ R)
P ⊃ ∼Q
(O ∙ R) ⊃ N /P
16. D ∨ ∼E
(F ∙ G) ∙ ∼H
D ⊃ (H ∨ I)
∼I / F ∙ ∼E
17. J ⊃ (K ∙ ∼L)
∼L ≡ (N ⊃ M)
J ∨ ∼N
K ∙ M / ∼N
18. ∼(P ⊃ Q )
R ≡ (S ∨ ∼T)
P⊃R
Q ∨ T /S∨O
19. B ∙ (D ∨ C)
D ⊃ (A ∨ E)
∼E ∨ (B ∙ C) / (A ⊃ E) ∨ C
20. (F ∨ G) ≡ (H ∙ J)
(I ⊃ H) ∙ (J ∨ G)
∼G /I⊃F
21. K ⊃ (M ⊃ P)
P ∙ ∼(N ∨ L)
O ⊃ (K ≡ N) /M
22. Q ⊃ (T ∙ S)
R ≡ (U ∨ T)
∼[S ⊃ (T ⊃ Q )] / ∼U ∙ S
23. Y ⊃ (Z ≡ X)
Y ∙ ∼W
W ⊃ (Y ∨ Z) / ∼(X ⊃ ∼W)
24. L ⊃ (M ≡ ∼N)
(M ∙ O) ∨ (∼P ∙ O)
O∨L
∼M /N
25. S ∙ (T ∨ W)
U ⊃ (W ∙ V)
S ≡ ∼W
T ⊃ V / ∼(S ∨ U)
26. ∼(X ∨ Y)
∼(Z ⊃ W)
U∙Z
X ⊃ V /Y≡U
27. R ⊃ [U ∨ (S ∨ Q )]
R ∙ ∼S
∼U ≡ T /T⊃Q
28. N ∙ (Q ⊃ P)
M ∨ ∼L
L⊃Q
P ∨ M /L≡M
29. U ∙ ∼R
(T ∨ ∼S) ≡ U
R∨S
T ⊃ (∼R ∨ V) / ∼V
30. E ∙ F
E⊃G
∼G ∙ ∼H /F≡H
31. N ∨ O
N ⊃ (Q ⊃ O)
(P ∨ Q ) ∨ R
R ⊃ ∼R / ∼O ⊃ P
32. A ⊃ B
B∨C
D ≡ C / D ≡ ∼B
33. ∼(I ≡ J)
K ⊃ ( J ∨ L)
(I ∙ L) ⊃ K
(L ∨ J) ∨ (K ⊃ I) / ∼( J ⊃ L)
34. Q ⊃ T
(T ∙ S) ∨ R
R≡Q
∼(S ∙ R) / ∼Q ≡ S
35. Q ⊃ (R ∙ ∼S)
(T ∨ U) ∙ (V ⊃ W)
(R ⊃ S) ∙ ∼U /Q∙T
36. F ⊃ (G ∙ H)
∼I ⊃ (G ⊃ ∼H)
I ⊃ ( J ∙ K) / F ⊃ ( J ∙ K)
37. V ⊃ (Z ∙ W)
X ∨ ∼Y
Z⊃Y
V ≡ Y / ∼W
38. B ∙ (C ⊃ E)
B ⊃ (A ∙ F)
D ⊃ (∼B ∨ C)
E ⊃ D / A ≡ [F ∙ (E ⊃ C)]
39. E ≡ [(F ∙ G) ⊃ H]
∼H ∙ ∼F
E∨G
G ⊃ (F ∙ E) / H ∙ ∼E
40. ∼(E ∙ F)
F ⊃ (G ∨ H)
(H ∙ E) ≡ F
G ∨ ∼F / ∼(E ∨ G)
41. I ∨ ( J ∙ K)
(∼I ⊃ J) ⊃ L
L ≡ (∼J ∨ ∼I) / (I ∨ K) ∙ (I ⊃ ∼J)
42. ∼(∼C ∙ B)
A∨D
D ≡ (∼B ∙ ∼A)
C ⊃ ∼A /∼(B ∨ D)
43. ∼[I ⊃ ( J ∙ K)]
J ∨ ∼L
M ⊃ (K ∙ I)
L ≡ M /J≡K
44. (M ⊃ N) ∙ (O ⊃ P)
N∨O
(M ∨ P) ≡ (∼N ⊃ ∼O)
(O ⊃ N) ⊃ (∼M ∙ ∼P) / ∼(M ∨ P)
45. (A ∙ ∼D) ∨ (∼B ∙ C)
∼C ⊃ ∼B
(A ∨ E) ⊃ D
A ≡ B / ∼(A ∙ ∼E) ∙ B
EXERCISES 2.7b
Determine, for each given set of propositions, whether it is
consistent. If it is, provide a consistent valuation.
1. A ∨ B
B ∙ ∼C
∼C ⊃ D
D ≡ A
2. D ⊃ F
F ≡ (A ∙ E)
D ∙ (B ∨ C)
∼A
E ∨ C
3. D
A ⊃ C
(B ∙ ∼C) ∙ ∼A
D ⊃ (A ∙ B)
4. B ∙ (C ⊃ A)
D ∨ (E ∙ F)
F ⊃ (C ∨ D)
E ∙ ∼A
5. ∼A ∙ ∼E
(A ∨ B) ⊃ (D ∙ F)
C ⊃ (E ⊃ D)
∼A ∙ (C ∨ B)
6. ∼[A ⊃ (F ∙ B)]
B ∙ (E ∙ ∼D)
E ≡ F
D ⊃ (C ∙ A)
7. G ⊃ (H ∙ I)
∼J ⊃ (K ∨ L)
L ∨ (G ⊃ J)
(I ≡ K) ∨ H
8. (A ∙ C) ⊃ (D ∙ B)
∼(A ⊃ D) ∙ ∼(C ⊃ B)
B ≡ ∼(D ∨ C)
(A ∙ B) ⊃ ∼C
9. (W ∙ X) ⊃ Z
(Y ∙ W) ≡ (X ∙ Z)
W ∨ (Y ⊃ Z)
(X ∙ Y) ⊃ (Z ∨ W)
10. (E ∙ F) ⊃ (G ∨ H)
(E ∙ ∼H) ∙ (I ∨ J)
(I ⊃ ∼H) ∙ (F ∙ ∼G)
(J ∙ I) ≡ ∼F
11. ∼F ∙ ∼G
H ⊃ (I ∙ F)
J ∙ (F ∨ G)
∼H ∨ (I ∙ J)
H ≡ ∼F
12. (F ∙ G) ≡ I
(H ∨ J) ⊃ F
K ≡ (G ∙ J)
H ⊃ (K ≡ I)
13. C ≡ (D ∨ B)
D ∙ (C ⊃ A)
∼A ∙ (E ∨ F)
F ⊃ (B ∙ A)
14. D ⊃ (∼A ∙ ∼F)
E ∨ (∼B ∨ C)
E ⊃ C
A ∙ (∼B ≡ D)
15. B ∨ (F ∙ D)
E ≡ B
∼E ∙ ∼F
D ⊃ (A ⊃ C)
(C ∙ E) ∨ A
16. (O ∨ ∼P) ⊃ ∼Q
R ∙ (∼S ∨ T)
O ∙ ∼(R ⊃ Q)
P ⊃ S
17. O ≡ Q
P ∙ (Q ∨ O)
R ⊃ ∼(P ∙ S)
(S ∨ O) ∙ ∼Q
18. T ∙ V
U ⊃ (W ∙ X)
Y ∙ (T ⊃ ∼V)
(Z ∙ U) ≡ (W ∙ Y)
X ⊃ (V ∨ W)
19. Q ⊃ (R ∨ S)
T ≡ (U ∙ Q)
(∼S ∙ Q) ∙ (R ∨ T)
U ∨ (S ∙ T)
20. ∼(J ⊃ I)
I ∙ (K ∨ L)
(L ∙ J) ≡ K
(K ∙ I) ⊃ ∼(J ∨ L)
21. ∼F
(E ∙ G) ⊃ F
(E ∙ H) ∙ G
F ≡ H
22. ∼(M ⊃ K)
(J ∙ L) ⊃ K
(J ∨ M) ∙ (M ⊃ J)
K ∨ L
23. ∼(J ⊃ N)
N ⊃ (M ∙ ∼L)
K ≡ ∼I
J ∙ (K ∨ M)
I ∙ L
24. K ⊃ (L ∙ M)
N ∙ ∼M
(K ∙ ∼L) ∨ (K ∙ ∼M)
(N ⊃ K) ∨ (N ⊃ L)
2.8.3r Q ∙ ∼F
2.8.4r ∼(∼Q ∨ F)
Q ∙ ∼ F ∼ (∼ Q ∨ F)
1 0 0 1 0 0 1 1 1
1 1 1 0 1 0 1 0 0
0 0 0 1 0 1 0 1 1
0 0 1 0 0 1 0 1 0
Since the two propositions have the same values for the main operator in their truth
tables, despite whatever differences they might have in meaning, 2.8.3 and 2.8.4 are
logically equivalent. As far as our truth-functional logic is concerned, we can use
these two propositions interchangeably. They have the same entailments. They are
consistent or inconsistent with the same propositions.
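Sameness of truth table columns is easy to check mechanically. As an illustrative sketch (the function and names here are mine, not the text's), the following Python code confirms that 2.8.3 and 2.8.4 agree on every valuation:

```python
from itertools import product

def equivalent(f, g, num_vars):
    """True if f and g have the same truth value on every valuation."""
    return all(f(*row) == g(*row)
               for row in product([True, False], repeat=num_vars))

f = lambda a, b: a and not b           # 2.8.3: Q . ~F
g = lambda a, b: not ((not a) or b)    # 2.8.4: ~(~Q v F)
print(equivalent(f, g, 2))  # True
```

This is just De Morgan's observation in executable form: negating a disjunction yields the conjunction of the negations.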
The notion of an intension, or a meaning, like the concept of a proposition, is con-
troversial. To help clarify or illustrate the concept of an intension, some philosophers
and logicians have explored, fruitfully, possible worlds and their corresponding
modal logics, advanced topics not covered in this book.
In contrast, the concept of logical equivalence is the central concept in the charac-
terization of logic as extensional. We can use it, for example, to help us understand
the biconditional.
α ≡ β (α ⊃ β) ∙ (β ⊃ α)
1 1 1 1 1 1 1 1 1 1
1 0 0 1 0 0 0 0 1 1
0 0 1 0 1 1 0 1 0 0
0 1 0 0 1 0 1 0 1 0
2.8: Notes on Translation with PL
Notice that claims of each form are logically equivalent. And the expression on the
right is just the conjunction of ‘α if β’, on the right of the conjunction, and ‘α only if β’,
on the left: α if, and only if, β.
Compare next the truth tables for inclusive and exclusive disjunction.

α ∨ β   α ⊕ β
1 1 1 1 0 1
1 1 0 1 1 0
0 1 1 0 1 1
0 0 0 0 0 0
Using the concept of logical equivalence, we can show that ⊕ is definable in terms of ∨, ∙, and ∼, and thus that we do not need a special symbol for exclusive disjunction. We just need to provide a formula that yields the same truth table as ⊕ but does not use that symbol. Such a truth table is at 2.8.7.
2.8.7 (α ∨ β) ∙ ∼(α ∙ β)
(α ∨ β) ∙ ∼ (α ∙ β)
1 1 1 0 0 1 1 1
1 1 0 1 1 1 0 0
0 1 1 1 1 0 0 1
0 0 0 0 1 0 0 0
Thus we can see that if we want to regiment a sentence of English as an exclusive ‘or’,
we can just use the conjunction of ‘α ∨ β’ with ‘∼(α ∙ β)’, which, if you think about
it, should strike you as sensible: you’ll take either the Thursday lab or the Tuesday lab,
but not both.
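The claim that 2.8.7 captures exclusive disjunction can be checked row by row. In this Python sketch (my own illustration), `!=` applied to two Booleans behaves exactly as ⊕ does:

```python
from itertools import product

# Compare 2.8.7 with exclusive disjunction on all four rows.
for a, b in product([True, False], repeat=2):
    assert (a != b) == ((a or b) and not (a and b))
print("2.8.7 matches the truth table for exclusive disjunction")
```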
A good grasp of logical equivalence allows us also to clear up a related question about translation: the use of disjunction for ‘unless’. Consider 2.8.8.

2.8.8 The car will not run unless it has gas.

Using ‘R’ for ‘the car runs’ and ‘G’ for ‘the car has gas’, we can regiment 2.8.8 as ‘∼R ∨ G’; its truth table is at 2.8.9.

2.8.9
∼ R ∨ G
0 1 1 1
0 1 0 0
1 0 1 1
1 0 1 0
In the first row, the car runs and has gas, so the complex proposition 2.8.8 should be
true. In the second row, the car runs but does not have gas. In this case, perhaps the
car runs on an alternative fuel source, or magic. The proposition 2.8.8 should thus be
false in the second row.
In the third row, the car does not run but has gas. Perhaps the car is missing its en-
gine. This case does not falsify the complex proposition, which does not say what else
the car needs to run. 2.8.8 gives a necessary condition for a car to run (having gas),
but not sufficient conditions. Thus 2.8.8 should be considered true in the third row.
In the fourth row, the car does not run and does not have gas. The proposition thus
should be true in the fourth row.
Considering our desired truth values for the sentence, we get a truth table for ‘unless’, at 2.8.10.

2.8.10
The car runs.   The car will not run unless it has gas.   The car has gas.
1 1 1
1 0 0
0 1 1
0 1 0
Notice that the truth table for ‘unless’, at 2.8.10, is precisely the same as the truth
table for the ∨, at 2.8.9. Since the two truth tables are the same, we can use the ∨ to
stand for ‘unless’; it gives us precisely what we want.
Unfortunately, this felicitous result does not hold for all uses of ‘unless’. Let’s ana-
lyze 2.8.11 the same way we analyzed 2.8.8.
2.8.11 Liesse attends school full time unless she gets a job.
1 1
1 0
0 1
0 0
This time, we will work from the bottom up. In the fourth row, Liesse does not get
a job but doesn’t go to school. The complex proposition is false, since it says that she
will attend school unless she gets a job. In the third row, she gets a job and doesn’t go
to school, and so the proposition should be true. In the second row, she attends school
but doesn’t get a job, and so the proposition should be true.
In the first row, Liesse gets a job but attends school anyway. What are your intu-
itions about the truth value of 2.8.11 in this case?
In my experience, most people who have not studied formal logic take 2.8.11 to be
false in the first row. It’s clear that if the proposition is true and Liesse does not get a
job, then she will attend school. Many people also believe that if the complex propo-
sition is true and Liesse does get a job, then she will not attend school. Here, ‘unless’
is taken in what is sometimes called a stronger sense. In this case, the truth table for
2.8.11 should be 2.8.12.
2.8.12
1 0 1
1 1 0
0 1 1
0 0 0
The truth table for ‘unless’ as used in 2.8.11 seems to be precisely that of ⊕, exclusive disjunction, not that of ∨. ‘Unless’ thus appears to be ambiguous in the same way as ‘or’: there is an inclusive and an exclusive ‘unless’.
To regiment 2.8.11, then, it would be natural to use the form of 2.8.7, the exclusive disjunction. Using ‘S’ for ‘Liesse attends school full time’ and ‘J’ for ‘Liesse gets a job’ yields 2.8.13.
2.8.13
(S ∨ J) ∙ ∼ (S ∙ J)
1 1 1 0 0 1 1 1
1 1 0 1 1 1 0 0
0 1 1 1 1 0 0 1
0 0 0 0 1 0 0 0
There are even simpler ways of representing exclusive disjunctions. Notice that
we understand 2.8.11 really as a biconditional: Liesse attends school if she does not
get a job, and if she attends school she does not get a job. Thus we can use either ‘∼S ≡
J’ or ‘∼(S ≡ J)’, as we see at 2.8.14, since they are logically equivalent to 2.8.13 (and
shorter too!).
2.8.14
∼ S ≡ J ∼ (S ≡ J)
0 1 0 1 0 1 1 1
0 1 1 0 1 1 0 0
1 0 1 1 1 0 0 1
1 0 0 0 0 0 1 0
In other words, if you have a sentence that you wish to regiment as an exclusive disjunction, you can use a proposition of any of the forms ∼α ≡ β, ∼(α ≡ β), or (α ∨ β) ∙ ∼(α ∙ β), or any alternative form that is logically equivalent to these.
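We can verify that these three forms are interchangeable by generating their truth table columns. Here is a small Python sketch of that check (my own illustration, using `==` on Booleans for ≡):

```python
from itertools import product

forms = [
    lambda a, b: (not a) == b,                 # ~a ≡ b
    lambda a, b: not (a == b),                 # ~(a ≡ b)
    lambda a, b: (a or b) and not (a and b),   # (a v b) . ~(a . b)
]
rows = list(product([True, False], repeat=2))
columns = [[f(a, b) for (a, b) in rows] for f in forms]
print(columns[0] == columns[1] == columns[2])  # True: all three are equivalent
```

Each column comes out false exactly when the two components agree, which is the exclusive-disjunction pattern of 2.8.12.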
When faced with an ‘unless’, then, we ordinarily just take it to be a ∨. But if we are
concerned about getting the truth conditions precisely correct, then we have to de-
cide whether the sentence functions more like 2.8.8, and so deserves the inclusive
disjunction, or more like 2.8.11, in which case we should write it with one of the ac-
ceptable forms for exclusive disjunction. Nothing in our logic can tell you which truth
conditions you want in a translation. That is a matter of interpretation.
Summary
The extensionality of our logic means that our main concern in translation is getting
the truth conditions of our propositions right. There are always different, but logically
equivalent, ways of regimenting a sentence of English into PL. The concept of logical
equivalence is thus central to our work in translation.
Generally, we seek the simplest translations. But the concept of simplicity is not
clear and categorical. Using ⊕ for exclusive disjunction, for example, makes our language more complicated. But ‘P ⊕ Q’ is a shorter, and thus simpler, way of expressing ‘∼(P ≡ Q)’ or ‘(P ∨ Q) ∙ ∼(P ∙ Q)’. This tension in the notion of simplicity becomes
more apparent as we think more about how many logical operators we really need to
express the concepts and entailments of propositional logic.
Suggested Readings
Fitting, Melvin. “Intensional Logic.” In The Stanford Encyclopedia of Philosophy. http://plato
.stanford.edu/archives/sum2015/entries/logic-intensional/. Accessed January 25, 2016.
Traces the history of intensional logics and presents some details of various approaches.
Hurford, James. “Exclusive or Inclusive Disjunction.” Foundations of Language 11 (1974):
409–411. Hurford argues that some uses of ‘or’ are exclusive.
Orlandini, Anna. “Logical, Semantic and Cultural Paradoxes.” Argumentation 17 (2003):
65–86. Orlandini connects the exclusive disjunction to some paradoxes.
Sainsbury, Mark. Logical Forms: An Introduction to Philosophical Logic, 2nd ed. Oxford, UK:
Blackwell, 2001. Chapter 2 has a lovely and engaging discussion of many aspects of trans-
lation with propositional logic.
KEY TERMS
Chapter 3: Inference in Propositional Logic
methods. The rules are chosen so that our system is complete: every valid argument and logical truth will be provable using our rules.

In a complete system of inference, every valid argument and every logical truth is provable.
For PL, the logical truths are just the tautologies; we will expand our definition
of logical truth for the logics in chapters 4 and 5. Our rules are chosen arbitrarily, in
the sense that there are many different complete systems of rules—indeed, infinitely
many. One can devise deductive systems with very few rules; the resulting proofs be-
come very long. One can devise systems so that proofs become very short; in such sys-
tems the required number of rules can be unfeasibly large. I chose a moderate number
of rules (twenty-five) so that there are not too many to memorize and the proofs are
not too long.
I also chose the rules and proof methods in our system of inference to mirror, at
least loosely, natural patterns of inference. You are likely to find some of the rules to
be easy and obvious, though the full collection of rules will include some inferences
you may find awkward at first.
The rules we choose are defined purely syntactically, in terms of their form, but they are justified semantically. A rule of inference must preserve truth: given true premises, the rules must never yield a false conclusion. A rule preserves truth if every argument of its form is valid. We can show that each of the rules of inference preserves truth using the indirect truth table method. We show that each rule of equivalence preserves truth using truth tables as well.

Rules of inference are valid argument forms that are used to justify steps in an inference.
This criterion for our rules, that they should preserve the truth of the premises, underlies our goal of soundness for a system of inference. I do not prove the metalogical results of soundness or completeness for the systems in this book; the proofs require more mathematics than we will use.

In a sound system of inference, every provable argument is semantically valid; every provable proposition is logically true.

Derivations begin with any number of premises and proceed by steps to a conclusion. A derivation is valid if every step is either a premise or derived from premises
or previous steps using our rules. I introduce four rules of inference in this section
and four more in the next section. I introduce ten rules of equivalence in the third
and fourth sections of this chapter. In section 6, I introduce the seven remaining rules (three inference rules and four equivalence rules), all of which govern the
biconditional.
Notice that despite their differing complexity, 3.1.1–3.1.3 share a form. The first
premise of each argument is a conditional. The second premise is the antecedent of
that conditional. The conclusion is the consequent of the conditional.
We can write this form at 3.1.4, using metalinguistic (Greek) variables.
3.1.4 α ⊃ β
α / β   Modus Ponens
This form of argument is called modus ponens, abbreviated MP. We can apply 3.1.4 in our object language, PL, by constructing substitution instances of it: particular applications of the rule which match, syntactically, its form. In particular, the main operators of each formula in the substitution instance will be the same as the main operators in the rule. So, a substitution instance of MP will contain one wff whose main operator is a conditional and another wff that is precisely the antecedent of that conditional. The last wff of a substitution instance of MP, the new wff entered into the derivation, will be exactly the consequent of that conditional.

Modus ponens (MP) is a rule of inference of PL.
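The syntactic matching just described can be made fully explicit. In the sketch below (my own encoding, not the book's notation: wffs are nested tuples, with '>', 'v', '.', '~' standing in for ⊃, ∨, ∙, ∼), three wffs count as a substitution instance of MP exactly when the first is a conditional, the second is its antecedent, and the third is its consequent.

```python
def is_mp_instance(premise1, premise2, conclusion):
    """True iff the three wffs form a substitution instance of modus ponens."""
    return (isinstance(premise1, tuple) and premise1[0] == '>'  # main operator is ⊃
            and premise2 == premise1[1]                         # exactly the antecedent
            and conclusion == premise1[2])                      # exactly the consequent

# 3.1.5: [(H ∨ G) ⊃ I] ⊃ (K ∙ ∼L), (H ∨ G) ⊃ I / K ∙ ∼L
antecedent = ('>', ('v', 'H', 'G'), 'I')
consequent = ('.', 'K', ('~', 'L'))
print(is_mp_instance(('>', antecedent, consequent), antecedent, consequent))  # True

# P ⊃ (Q ⊃ R) with Q does not match: 'Q' is not the antecedent of the conditional.
print(is_mp_instance(('>', 'P', ('>', 'Q', 'R')), 'Q', ('>', 'P', 'R')))      # False
```

The failing case previews a restriction discussed later in the section: the rule matches only the main operator of a whole line, not a conditional buried inside it.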
Notice that any substitution instance of MP yields a valid argument. Logicians ordinarily prove the validity of rules by mathematical induction. Here, an informal argument should suffice: a counterexample would require a valuation on which the conclusion were false and the second premise were true. Any such valuation would make the first premise false, so no counterexample is possible and the inference is valid. (Remember, a counterexample requires true premises and a false conclusion.)

A substitution instance of a rule is a set of wffs of PL that match the form of the rule.
Given that every substitution instance of MP will be valid, we can substitute simple
or complex formulas for α and β in 3.1.4 and be sure that the resulting deduction is
valid. 3.1.5 is another example of MP, with even greater complexity.
3.1.5 [(H ∨ G) ⊃ I] ⊃ (K ∙ ∼L)
[(H ∨ G) ⊃ I] / (K ∙ ∼L)
of wffs, we will write our deductions in the metalanguage, including line numbers and justifications in a second column. The line numbers allow us to keep track of our justifications. All steps except the premises require justification. The justification of any step includes the line numbers and rule of inference used to generate the new wff. For example, “3, 4, MP” on line 5 indicates that ‘∼S’ is derived directly from the wffs at lines 3 and 4 by a use of the rule of modus ponens. The explanations such as “taking ‘U’ for α and ‘∼S’ for β” are not required elements of the derivation, but they can be useful, especially when you are first learning to use natural deductions.

A justification in a derivation includes the rule used and the earlier line numbers to which the rule is applied.
The conclusion of the argument is initially written after a single slash following the
last premise. The conclusion, like the justifications of every following step, is not technically part of the deduction. Importantly, you may not use it as part of your proof. It
merely indicates what the last numbered line of your derivation should be.
Lastly, QED at the end of the derivation stands for ‘Quod erat demonstrandum’, which is Latin for ‘that which was required to be shown’. ‘QED’ is a logician’s punctuation mark: “I’m done!” It is not essential to a proof, but looks neat and signals your intention to end the derivation.

QED is placed at the end of a derivation, to show that it is finished.
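The bookkeeping conventions above, numbered lines with each derived line justified by cited line numbers and a rule, can be modeled directly. Here is a minimal sketch of a derivation checker; the function names and tuple encoding ('>' for ⊃, '~' for ∼) are my own, and only MP is implemented.

```python
def apply_mp(p1, p2):
    """Return the consequent if p1 is a conditional and p2 its antecedent, else None."""
    if isinstance(p1, tuple) and p1[0] == '>' and p2 == p1[1]:
        return p1[2]
    return None

def check_derivation(premises, steps):
    """Verify a derivation. `steps` lists (wff, cited_line_numbers, rule_name);
    line numbers are 1-indexed, with the premises occupying the first lines."""
    lines = list(premises)
    for wff, cited, rule in steps:
        if rule != 'MP':
            return False                 # other rules omitted in this sketch
        p1, p2 = (lines[n - 1] for n in cited)
        if apply_mp(p1, p2) != wff:
            return False                 # the cited justification does not yield the wff
        lines.append(wff)
    return True

# 1. U ⊃ ∼S   2. U   / ∼S         3. ∼S   1, 2, MP
premises = [('>', 'U', ('~', 'S')), 'U']
print(check_derivation(premises, [(('~', 'S'), (1, 2), 'MP')]))   # True
```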
Rules of inference are to be used only on whole lines, not on portions of lines. In
other words, the main operators of the propositions to which you are applying the rule
must match the operators given in the rule. The inference at 3.1.14 violates this condition and so is illegitimate, even though valid.
3.1.14 1. P ⊃ (Q ⊃ R)
2. Q
3. P ⊃ R 1, 2, MP Not Acceptable!
We’ll have other ways to make such valid inferences once our proof system is
complete.
3.1.15 is an example of a longer derivation using our first four rules of inference.
3.1.15 1. ∼A ⊃ [A ∨ (B ⊃ C)]
2. (B ∨ D) ⊃ ∼A
3. B ∨ D
4. C ⊃ A /D
5. ∼A 2, 3, MP
6. A ∨ (B ⊃ C) 1, 5, MP
7. B ⊃ C 6, 5, DS
8. B ⊃ A 7, 4, HS
9. ∼B 8, 5, MT
10. D 3, 9, DS
QED
Summary
In this section, we saw the first four of our rules of inference and how they can combine to form derivations of the conclusions of arguments. Constructing derivations
can be intimidating at first. If you can, start with simple sentences or negations of simple sentences. Plan ahead. Working backward from the conclusion on the side can
be helpful. For example, in 3.1.15, we could start the derivation by observing that we
could get the conclusion, ‘D’, by DS from line 3 if we had ‘∼B’. Then, both ‘∼B’ and ‘D’
are goals as we work forward through the proof.
Don’t worry about introducing extraneous lines into your proof as long as they are
the results of valid inferences. Especially as we introduce further rules, we are going
to be able to infer statements that are not needed for the most concise derivation. But
as long as every step is valid, the entire inference will be valid. It is not the case that
every wff must be used after it is introduced into the deduction.
Lastly, notice that some wffs may be used more than once in a derivation. In 3.1.15,
the ‘∼A’ at line 5 was used first with premise 1 in an application of MP to yield the wff at line 6. Then,
it is used immediately a second time, with the wff at line 6, to yield ‘B ⊃ C’ on line 7.
Some students will have encountered proofs like these, perhaps in slightly less rigorous form, in a geometry class, or in other mathematics courses. For other students, natural deductions of this sort are new. Be patient, and practice. And practice some more.
KEEP IN MIND
Our formal system for propositional logic will use eleven rules of inference, fourteen rules
of equivalence, and three proof methods.
We have seen four rules of inference: modus ponens (MP); modus tollens (MT);
disjunctive syllogism (DS); hypothetical syllogism (HS).
Every valid argument will be provable using our rules once our rule set is complete.
Rules of inference preserve truth; given true premises, the rules never yield a false
conclusion.
Derivations begin with any number of premises and proceed by steps to a conclusion.
A derivation is valid if every step is either a premise or derived from premises or previous
steps using our rules.
In derivations:
Number all premises and every wff that follows.
The conclusion of the argument is written after a single slash following the last premise.
Justify all steps except the premises.
A justification includes line numbers and the rule of inference used to generate the
new wff.
Use rules of inference only on whole lines, not on portions of lines.
QED may be added to the end of a derivation to mark its conclusion.
Rules Introduced
Modus Ponens (MP)
α ⊃ β
α / β

Modus Tollens (MT)
α ⊃ β
∼β / ∼α

Disjunctive Syllogism (DS)
α ∨ β
∼α / β

Hypothetical Syllogism (HS)
α ⊃ β
β ⊃ γ / α ⊃ γ
EXERCISES 3.1a
Derive the conclusions of each of the following arguments
using natural deduction.
1. 1. V ⊃ (W ⊃ X)
2. V
3. ∼X / ∼W
2. 1. X ⊃ Y
2. ∼Y
3. X ∨ Z / Z
3. 1. E ⊃ F
2. ∼F
3. ∼E ⊃ (G ∙ H) / G ∙ H
4. 1. I ⊃ J
2. J ⊃ K
3. ∼K / ∼I
5. 1. (I ∙ L) ⊃ (K ∨ J)
2. I ∙ L
3. ∼K / J
6. 1. (P ∨ Q) ⊃ R
2. R ⊃ S
3. P ∨ Q / S
7. 1. P ⊃ R
2. Q ⊃ P
3. (Q ⊃ R) ⊃ S / S
8. 1. (P ∙ Q) ∨ R
2. ∼(P ∙ Q)
3. R ⊃ ∼S / ∼S
9. 1. P ⊃ (Q ⊃ R)
2. (Q ⊃ R) ⊃ S
3. (P ⊃ S) ⊃ (T ⊃ P) / T ⊃ P
10. 1. P ⊃ (Q ∙ R)
2. ∼(Q ∙ R)
3. P ∨ (S ≡ T) / S ≡ T
11. 1. (P ⊃ Q) ⊃ (P ⊃ R)
2. P ⊃ S
3. S ⊃ Q / P ⊃ R
12. 1. G ⊃ E
2. F ⊃ ∼E
3. H ∨ F
4. ∼H / ∼G
13. 1. A ⊃ D
2. D ⊃ (B ⊃ C)
3. B
4. A / C
14. 1. L ∨ N
2. ∼L
3. N ⊃ (M ∨ O)
4. (M ∨ O) ⊃ (P ≡ Q) / P ≡ Q
15. 1. R ⊃ S
2. S ⊃ (T ∨ U)
3. R
4. ∼T / U
16. 1. U ⊃ V
2. ∼V
3. U ∨ W
4. W ⊃ X / X
17. 1. X ⊃ Z
2. Z ⊃ Y
3. ∼Y
4. ∼X ⊃ A / A
18. 1. P ⊃ (Q ∙ ∼R)
2. S ⊃ ∼(Q ∙ ∼R)
3. T ∨ S
4. ∼T / ∼P
24. 1. J ⊃ L
2. L ⊃ (I ∙ M)
3. (I ∙ M) ⊃ K
4. ∼K / ∼J
25. 1. Q ⊃ (∼R ⊃ S)
2. T ∨ Q
3. ∼T
4. R ⊃ T / S
26. 1. ∼Q ⊃ (N ∙ O)
2. (N ∙ O) ⊃ (P ⊃ Q)
3. M ∨ ∼Q
4. ∼M / ∼P
27. 1. (P ∨ Q) ∨ (S ∨ ∼T)
2. R ⊃ ∼(P ∨ Q)
3. (S ∨ ∼T) ⊃ ∼S
4. R / ∼T
28. 1. (P ∙ ∼R) ⊃ (Q ∨ S)
2. Q ⊃ (S ≡ T)
3. ∼(S ≡ T) ⊃ (P ∙ ∼R)
4. ∼(S ≡ T) / S
EXERCISES 3.1b
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the arguments
using the first four rules of our system of natural deduction.
1. If Allison doesn’t go grocery shopping, Billy will go. Allison goes grocery
shopping only if Carla gets home from school early. Carla doesn’t get home
early. Therefore, Billy goes grocery shopping.
2. Don Juan plays golf only if Edie makes a reservation. If Edie makes a reservation,
then Frederique writes it on the calendar. Don Juan played golf. So, Frederique
wrote it down on the calendar.
3. If Gertrude mops the kitchen, then Hillary washes the dishes. Either Inez or
Gertrude mops the kitchen. Inez doesn’t mop the kitchen. So, Hillary washes
the dishes.
14. If mathematics can be known a priori, then so can logic. If logic is knowable a
priori, then human reason is not purely scientific. If the a priori knowability of
mathematics entails that human reason is not purely scientific, then if logic can
be known a priori, then there are eternal truths. So, if mathematics is knowable
a priori, then there are eternal truths.
15. Either monadism is true just in case atomism is, or space is infinitely divisible
if and only if the world is a plenum. If monadism entails atomism, then space
is not infinitely divisible. Either space is infinitely divisible or it’s not the
case that monadism is true just in case atomism is. If monadism is true, then
there are elementary particles. But if there are elementary particles, then
atomism is true. So, space is infinitely divisible if, and only if, the world is a
plenum.
16. You’re befuddled. Either you are a necessitarian or you are not a proper
apriorist. Either you are a contingentist or you are not a proper empiricist. If
you’re a contingentist, then you don’t believe that logic is a priori. But it’s not
the case that you do not believe that logic is a priori. You’re not a necessitarian.
And if you aren’t a proper apriorist, then if you aren’t a proper empiricist, then
you are befuddled.
Simplification (Simp)
Simplification, the rule of inference shown at 3.2.5, is like the reverse of conjunction, allowing you to infer the first conjunct of a conjunction.

3.2.5 α ∙ β / α   Simplification

Simplification (Simp) is a rule of inference of PL.
If you have peas and carrots, then you have peas. Notice that Simp does not license
the derivation of ‘you have carrots’ from ‘you have peas and you have carrots’; a rule
of equivalence in the next section will allow us to infer the second conjunct. For now,
our list of rules is incomplete. We must leave the second conjunct alone.
3.2.6 is a sample derivation using conjunction and simplification.
3.2.6 1. A ⊃ B
2. F ⊃ D
3. A ∙ E
4. ∼D / B ∙ ∼F
5. A 3, Simp
6. B 1, 5, MP
7. ∼F 2, 4, MT
8. B ∙ ∼F 6, 7, Conj
QED
Be careful to avoid the invalid inferences 3.2.7 and 3.2.8.
3.2.7 α / α ∙ β Invalid!
3.2.8 α ∨ β / α Invalid!
From a single proposition, 3.2.7, we cannot conclude the conjunction of two propositions unless the second appears earlier in our derivation. And from a disjunction,
3.2.8, we cannot conclude either disjunct unless the negation of the other appears
earlier in our derivation.
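A counterexample search makes the invalidity of 3.2.7 and 3.2.8 concrete. The sketch below (my own tuple encoding, with '.', 'v', '~' standing in for ∙, ∨, ∼) hunts for a valuation with true premises and a false conclusion:

```python
from itertools import product

def val(wff, v):
    """Evaluate a wff (atom string or operator tuple) under valuation v."""
    if isinstance(wff, str):
        return v[wff]
    if wff[0] == '~':
        return not val(wff[1], v)
    a, b = val(wff[1], v), val(wff[2], v)
    return a and b if wff[0] == '.' else a or b    # '.' is ∙, 'v' is ∨

def counterexample(premises, conclusion, letters):
    """Return a valuation with all premises true and the conclusion false, if any."""
    for values in product([True, False], repeat=len(letters)):
        v = dict(zip(letters, values))
        if all(val(p, v) for p in premises) and not val(conclusion, v):
            return v
    return None

# 3.2.7  α / α ∙ β: invalid whenever α is true and β false.
print(counterexample(['a'], ('.', 'a', 'b'), ['a', 'b']))     # {'a': True, 'b': False}
# 3.2.8  α ∨ β / α: invalid whenever α is false and β true.
print(counterexample([('v', 'a', 'b')], 'a', ['a', 'b']))     # {'a': False, 'b': True}
```

Run on a genuine rule such as Simp, the search comes back empty.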
The derivation at 3.2.11 uses all of the rules of inference of this section.
3.2.11 1. P ∨ Q
2. Q ⊃ S
3. R ⊃ T
4. ∼P ∙ U / Q ∙ (S ∨ T)
5. ∼P 4, Simp
6. Q 1, 5, DS
7. Q ∨ R 6, Add
8. S ∨ T 2, 3, 7, CD
9. Q ∙ (S ∨ T) 6, 8, Conj
QED
Summary
The four new rules of inference in this section differ only in their details from the four
rules of section 3.1. Any substitution instance of a rule yields a valid inference. You
can check the validity of each form using the truth table method, or indirect truth
table method, applied to the metalinguistic forms.
Remember, we choose these rules on two bases: the completeness of the resulting
logical system and the way in which they represent or reflect ordinary inferences.
As our derivations become more complex, it will become increasingly important
for you not only to be able to use the rules we have, but also to see substitution instances of the rules quickly and naturally. Constructing derivations requires not just
understanding the rules, but knowing how to use them. It’s like riding a bicycle or
cooking: you can’t just know how to do it in theory; you have actually to do it in order
to get good at it. At the risk of redundancy: practice, practice, practice.
KEEP IN MIND
We have seen four more rules of inference: conjunction (Conj); addition (Add); simplification (Simp); constructive dilemma (CD).
We now have eight rules.
Be especially careful not to confuse conjunction and addition.
Rules Introduced
Conjunction (Conj)
α
β / α ∙ β
Addition (Add)
α / α ∨ β

Simplification (Simp)
α ∙ β / α

Constructive Dilemma (CD)
α ⊃ β
γ ⊃ δ
α ∨ γ / β ∨ δ
EXERCISES 3.2a
For each of the following arguments, determine which, if
any, of the eight rules of inference is being followed. Though
there are many valid inferences other than our eight rules,
in these exercises, if the inference is not in the form of one
of the eight rules, it is invalid.
The invalid inferences in these exercises are common
errors that logic students make when learning the rules of
inference, so it might be worth your time to study and
understand the errors in order to avoid them yourself.
1. A ⊃ (B ∙ C)
∼(B ∙ C) / ∼A
2. (D ∨ E) ⊃ F
F ⊃ (G ≡ H)
(D ∨ E) ∨ F / F ∨ (G ≡ H)
3. I ⊃ ∼J
K ⊃ I / K ⊃ ∼J
4. L
∼M ∙ N / ∼(M ∙ N) ∙ L
5. O / O ∙ ∼O
6. P / P ∨ [Q ≡ (R ∙ ∼P)]
7. S ∨ ∼T
∼∼T / ∼S
8. ∼U ≡ V
(∼U ≡ V) ⊃ W /W
9. X ⊃ ∼Y
∼Y ⊃ Z / (X ⊃ ∼Y) ∙ (∼Y ⊃ Z)
10. (A ∨ ∼B) ∨ ∼∼C / A ∨ ∼B
EXERCISES 3.2b
Derive the conclusions of each of the following arguments
using the eight rules of inference.
1. 1. P ∙ Q
2. R / P ∙ R
2. 1. P ⊃ ∼Q
2. ∼Q ⊃ R / (P ⊃ R) ∨ (S ⊃ T)
3. 1. (A ⊃ C) ⊃ D
2. ∼B ⊃ C
3. A ⊃ ∼B / D
4. 1. (E ∨ F) ⊃ ∼G
2. H ⊃ G
3. E / ∼H
5. 1. I ∨ J
2. ∼I ∙ K / J ∨ L
6. 1. W ⊃ X
2. ∼X ∙ Y / (∼W ∨ Z) ∙ ∼X
7. 1. T ∨ S
2. ∼T
3. U / U ∙ S
8. 1. ∼P ⊃ ∼Q
2. ∼R ⊃ ∼S
3. T ∨ (∼P ∨ ∼R)
4. ∼T / ∼Q ∨ ∼S
9. 1. N ∨ ∼∼P
2. ∼N ∙ Q
3. ∼P ∨ Q / ∼∼P ∙ Q
10. 1. (P ≡ Q) ⊃ R
2. Q ∨ ∼R
3. ∼Q
4. ∼P ⊃ (P ≡ Q) / ∼∼P
11. 1. P ⊃ Q
2. ∼R ⊃ S
3. P ∨ ∼R
4. ∼Q / S
12. 1. P ⊃ Q
2. ∼Q ∙ R / ∼P ∨ R
13. 1. ∼P ∨ Q
2. ∼P ⊃ R
3. ∼R / Q ∨ S
14. 1. P ∙ ∼Q
2. ∼Q ∙ R
3. (P ∙ ∼Q) ⊃ S / S ∙ P
15. 1. ∼P ⊃ Q
2. ∼Q ⊃ R
3. (∼P ∨ ∼Q) ∙ S / Q ∨ R
16. 1. (P ∙ ∼Q) ∙ R
2. P ⊃ S
3. R ⊃ T / S ∨ T
17. 1. (P ∨ Q) ⊃ R
2. (P ∨ S) ⊃ T
3. P ∙ V / R ∙ T
18. 1. ∼P ⊃ Q
2. ∼R ⊃ S
3. Q ⊃ ∼R
4. ∼P / ∼R ∨ S
19. 1. (E ∨ I) ⊃ H
2. H ⊃ (F ∙ G)
3. E / (F ∙ G) ∙ E
20. 1. M ⊃ N
2. O ⊃ P
3. M ∙ Q / N ∨ P
21. 1. ∼A ⊃ B
2. C ⊃ D
3. A ⊃ D
4. ∼D / B ∨ D
22. 1. M ⊃ N
2. N ⊃ O
3. M ∙ P / O ∨ P
23. 1. B ⊃ A
2. ∼A ∙ D
3. ∼B ⊃ C / C ∨ A
24. 1. D ∨ E
2. D ⊃ F
3. ∼F ∙ G / (E ∨ H) ∙ ∼F
25. 1. O ⊃ Q
2. Q ⊃ P
3. P ⊃ (R ∙ S)
4. O / R ∙ S
26. 1. (R ∨ T) ⊃ S
2. S ⊃ U
3. R / U ∨ T
27. 1. P ∙ Q
2. ∼P ∙ R / S
28. 1. [(∼Q ∙ ∼P) ⊃ R] ∙ (S ∨ ∼T)
2. P ⊃ Q
3. ∼Q / R ∨ T
29. 1. P ⊃ (Q ⊃ R)
2. P ⊃ (R ⊃ ∼S)
3. Q ∨ R
4. P ∙ T / R ∨ ∼S
39. 1. P ⊃ (Q ⊃ ∼U)
2. R ⊃ (Q ⊃ S)
3. (P ∨ R) ∙ T
4. ∼(Q ⊃ ∼U)
5. Q / S ∨ ∼U
40. 1. P ⊃ (Q ⊃ R)
2. S ⊃ (T ⊃ U)
3. W ⊃ X
4. ∼(Q ⊃ R)
5. P ∨ S
6. T ∨ W / U ∨ X
EXERCISES 3.2c
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the arguments
using the eight rules of inference.
1. If Alessandro sings in the musical, then Beatriz will buy a ticket. Beatriz doesn’t
buy a ticket and Carlo goes to watch the musical. So, Alessandro doesn’t sing in
the musical and Beatriz doesn’t buy a ticket.
2. If Don is an EMT, then everyone is saved. All girls are saved provided that
Frank is an EMT. Helga’s being a doctor implies that Don is an EMT. Helga is
a doctor; moreover, all girls are saved. So, either everyone is saved or all girls
are saved.
3. If the classroom is quiet, then it is not rowdy. If the classroom isn’t rowdy, then
it’s silent. The classroom is quiet and not tumultuous. So, the classroom is quiet
and silent.
4. Having a thunderstorm is a sufficient condition for needing an umbrella. Either
it is very cloudy or you don’t need an umbrella. It’s not very cloudy. So, either
there aren’t thunderstorms or it’s windy.
5. Either elephants or flamingos eat nuts. If elephants eat nuts, then gorillas eat
fruit. Gorillas don’t eat fruit, but hippos eat berries. So, either flamingos eat
nuts or hippos eat berries.
6. Elia playing basketball is a necessary condition of her taking art. She’ll walk the
dog on the condition that she takes ceramics. She doesn’t play basketball. She
takes ceramics. So, she doesn’t take art, but she does walk the dog.
7. Jaime either flies a kite or lies in the sun and listens to music. He doesn’t fly a
kite, but he juggles. If he lies in the sun, then he juggles. So, he either juggles or
listens to music.
8. If Xavier takes Spanish, then Yolanda tutors him. Zeke pays Yolanda if she tu-
tors Xavier. Either Waldo or Xavier takes Spanish. Waldo doesn’t take Spanish;
also Yolanda doesn’t tutor Xavier. So, Zeke pays Yolanda, but Waldo doesn’t
take Spanish.
9. If God is either benevolent or omnipotent, then we have both freedom and
knowledge. Either God is morally neutral or benevolent. But God is not mor-
ally neutral. So, we are free.
10. If I do not have sense experience of apples, then I do not know about apples. If I
have an idea of an apple, then the apple is real. If you tell me about apples, then
either I do not have sense experience of apples or I have an idea of an apple. You
tell me about apples. It is not the case that I do not know about apples. So, an
apple is real.
11. If we eat meat, then the environment is degraded. If we are vegetarians, then
fewer livestock are raised. If humanity persists, then either we eat meat or are
vegetarians. Humanity persists. So, either the environment is degraded or
fewer livestock are raised.
12. Either art is dead or a new form will appear. If art is dead, then it is not the
case that some sculpture by Botero is valuable. But the claim that it’s not the case
that some sculpture by Botero is valuable is false. So, a new form will appear
and art is not dead.
13. If Mill is right, then consequences have moral weight; also, I like Mill’s work. If
Kant is right, then pleasure is not important; I’m not a fan of Kant’s work. Either
Mill is right or Kant is. So, either consequences have moral weight or pleasure
is not important.
14. If values are transcendent, then truth does not matter. Either values are tran-
scendent or the world has no meaning. But it is not the case that truth does not
matter. So, either the world has no meaning or truth is pleonastic.
15. If names are either purely referential or contain descriptive content, then both
Mill and Frege are worth reading. Names are purely referential and do not contain
descriptive content. So, Mill is worth reading and names are purely referential.
16. If there is a self, then I could be eternal. If I could be eternal, then I am not my body. If I could be eternal, then I am not my soul. Either there is a self or I could be eternal. So, either I am not my body or I am not my soul.
3.3.1 1. (A ∨ B) ⊃ E
2. ∼E
3. A ∨ D /D
4. ∼(A ∨ B) 1, 2, MT
5. ∼A ∙ ∼B 4, DM
6. ∼A 5, Simp
7. D 3, 6, DS
QED
3.3.2 1. G ⊃ ∼(H ∙ F)
2. ∼(∼H ∨ ∼F) / ∼G
3. ∼∼(H ∙ F) 2, DM
4. ∼G 1, 3, MT
QED
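As the step from line 2 to line 3 of 3.3.2 shows, rules of equivalence, unlike rules of inference, may be applied to subformulas as well as to whole lines. That idea can be sketched as a tree rewrite; the encoding is illustrative (tuples for wffs, with '~', '.', 'v', '>' for ∼, ∙, ∨, ⊃), and only the forward direction of DM is implemented.

```python
def dm_forward(wff):
    """One forward De Morgan step at the top of a wff, if it matches; else None.
    ∼(α ∙ β) becomes ∼α ∨ ∼β, and ∼(α ∨ β) becomes ∼α ∙ ∼β."""
    if (isinstance(wff, tuple) and wff[0] == '~'
            and isinstance(wff[1], tuple) and wff[1][0] in ('.', 'v')):
        dual = 'v' if wff[1][0] == '.' else '.'
        return (dual, ('~', wff[1][1]), ('~', wff[1][2]))
    return None

def rewrite_once(wff, rule):
    """Apply `rule` to the wff itself, or else to its leftmost matching subformula."""
    out = rule(wff)
    if out is not None:
        return out
    if isinstance(wff, tuple):
        for i, part in enumerate(wff[1:], start=1):
            new = rewrite_once(part, rule)
            if new is not None:
                return wff[:i] + (new,) + wff[i + 1:]
    return None

# Whole line, as in 3.3.1: ∼(A ∨ B) becomes ∼A ∙ ∼B.
print(rewrite_once(('~', ('v', 'A', 'B')), dm_forward))
# Subformula: G ⊃ ∼(H ∙ F) becomes G ⊃ (∼H ∨ ∼F), rewriting inside the consequent.
print(rewrite_once(('>', 'G', ('~', ('.', 'H', 'F'))), dm_forward))
```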
Association (Assoc)
Association allows you to regroup series of conjunctions or disjunctions.

α ∨ (β ∨ γ) ⇄ (α ∨ β) ∨ γ   Association
α ∙ (β ∙ γ) ⇄ (α ∙ β) ∙ γ

Association (Assoc) are rules of equivalence of PL.
As with DM, Assoc has a version for conjunction and a version for disjunction. Un-
like DM, Assoc requires no switching of operators. It merely allows you to regroup
the component propositions; the two operators must be the same. Assoc is often used
to organize a series of conjunctions before simplifying one of the conjuncts, or with
DS, as in 3.3.3.
3.3.3 1. (L ∨ M) ∨ N
2. ∼L
3. (M ∨ N) ⊃ O /O
4. L ∨ (M ∨ N) 1, Assoc
5. M ∨ N 4, 2, DS
6. O 3, 5, MP
QED
Distribution (Dist)
The rules of distribution allow you to distribute a conjunction over a disjunction or to distribute a disjunction over a conjunction.

α ∙ (β ∨ γ) ⇄ (α ∙ β) ∨ (α ∙ γ)   Distribution
α ∨ (β ∙ γ) ⇄ (α ∨ β) ∙ (α ∨ γ)

Distribution (Dist) are rules of equivalence of PL.
The main operator is always switched (between conjunction and disjunction) after
a use of Dist. So, using Dist on a sentence whose main operator is a disjunction yields
a conjunction from which you can simplify.
Notice that while the grouping of terms changes, the order of the first two opera-
tors remains after using Dist, with an extra operator of the first type added at the end
(going left to right) or taken away (going right to left). So, ∙∨ becomes ∙∨∙ and ∨∙
becomes ∨∙∨ (and vice versa).
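Because Dist is a rule of equivalence, the two sides must take the same truth value on every valuation, which we can confirm by brute force. A Python sketch (my own tuple encoding, with '.', 'v', '~' for ∙, ∨, ∼) also shows why regrouping mixed operators as if by Assoc is not an equivalence:

```python
from itertools import product

def val(wff, v):
    """Evaluate a wff (atom string or operator tuple) under valuation v."""
    if isinstance(wff, str):
        return v[wff]
    if wff[0] == '~':
        return not val(wff[1], v)
    a, b = val(wff[1], v), val(wff[2], v)
    return a and b if wff[0] == '.' else a or b    # '.' is ∙, 'v' is ∨

def equivalent(w1, w2, letters):
    """True iff w1 and w2 agree in truth value under every valuation."""
    return all(val(w1, dict(zip(letters, vs))) == val(w2, dict(zip(letters, vs)))
               for vs in product([True, False], repeat=len(letters)))

abc = ['a', 'b', 'c']
# Dist: α ∙ (β ∨ γ) is equivalent to (α ∙ β) ∨ (α ∙ γ)
print(equivalent(('.', 'a', ('v', 'b', 'c')),
                 ('v', ('.', 'a', 'b'), ('.', 'a', 'c')), abc))    # True
# Dist: α ∨ (β ∙ γ) is equivalent to (α ∨ β) ∙ (α ∨ γ)
print(equivalent(('v', 'a', ('.', 'b', 'c')),
                 ('.', ('v', 'a', 'b'), ('v', 'a', 'c')), abc))    # True
# Regrouping mixed operators, Assoc-style, is NOT an equivalence:
print(equivalent(('.', 'a', ('v', 'b', 'c')),
                 ('v', ('.', 'a', 'b'), 'c'), abc))                # False
```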
Be careful to distinguish Dist from Assoc. Assoc is used when you have two of the
same operators. Dist is used when you have a combination of conjunction and dis-
junction. 3.3.4 contains a forward use of Dist, while 3.3.5 contains a backward use.
3.3.4 1. H ∙ (I ∨ J)
2. ∼(H ∙ I) /H∙J
3. (H ∙ I) ∨ (H ∙ J) 1, Dist
4. H ∙ J 3, 2, DS
QED
3.3.5 1. (P ∨ Q) ∙ (P ∨ R)
2. ∼P /Q∙R
3. P ∨ (Q ∙ R) 1, Dist
4. Q ∙ R 3, 2, DS
QED
Commutativity (Com)
Commutativity often combines with rules of inference to facilitate some obvious inferences that we could not yet make.

α ∨ β ⇄ β ∨ α   Commutativity
α ∙ β ⇄ β ∙ α

Commutativity (Com) are rules of equivalence of PL.
In effect, Com doubles the rules DS, Simp, and Add. From a disjunction, we can
now infer the first disjunct from the negation of the second, as at 3.3.6. From a con-
junction, we can now infer the second conjunct using Simp, as at 3.3.7. And we can
add a proposition in front of a given wff, as at 3.3.8.
3.3.6 1. P ∨ Q
2. ∼Q
3. Q ∨ P 1, Com
4. P 3, 2, DS
3.3.7 1. P ∙ Q
2. Q ∙ P 1, Com
3. Q 2, Simp
3.3.8 1. P
2. P ∨ Q 1, Add
3. Q ∨ P 2, Com
Each of the three derivations 3.3.6–3.3.8 can be inserted into any derivation. 3.3.9
demonstrates the use of commutativity with simplification and disjunctive syllogism.
3.3.9 1. A ∙ B
2. B ⊃ (D ∨ E)
3. ∼E /D
4. B ∙ A 1, Com
5. B 4, Simp
6. D ∨ E 2, 5, MP
7. E ∨ D 6, Com
8. D 7, 3, DS
QED
Summary
Rules of equivalence are transformation rules that allow us to replace some formulas
and subformulas with logical equivalents. These transformations help expand the
applications of our rules of inference. They are also, in many cases, formal versions
of natural-language equivalencies. We’ve seen five rules of equivalence in this section, though each rule has at least two different applications (in each direction), and
some of the rules, like De Morgan’s laws and distribution, are actually two pairs of
rules. We’ll see five more rules of equivalence in the next section, and four more in
section 3.6.
While the thirteen rules to this point are not very many to manage, they allow so
many more derivations than just the first few rules that the proofs can be subtle and
interesting. Even the strongest logic students should find some of the derivations in
this section challenging.
KEEP IN MIND
Rules of equivalence allow you to substitute one proposition or part of a proposition with a
logically equivalent expression.
We saw five rules of equivalence in this section: De Morgan’s laws (DM); association
(Assoc); distribution (Dist); commutativity (Com); double negation (DN).
Forward DM distributes a tilde to the components of a conjunction or disjunction.
Backward DM factors out the tilde.
All uses of DM switch a conjunction to a disjunction or a disjunction to a conjunction.
Rules Introduced
De Morgan’s Laws (DM)
∼(α ∙ β) ⇄ ∼α ∨ ∼β
∼(α ∨ β) ⇄ ∼α ∙ ∼β

Association (Assoc)
α ∨ (β ∨ γ) ⇄ (α ∨ β) ∨ γ
α ∙ (β ∙ γ) ⇄ (α ∙ β) ∙ γ

Distribution (Dist)
α ∙ (β ∨ γ) ⇄ (α ∙ β) ∨ (α ∙ γ)
α ∨ (β ∙ γ) ⇄ (α ∨ β) ∙ (α ∨ γ)

Commutativity (Com)
α ∨ β ⇄ β ∨ α
α ∙ β ⇄ β ∙ α

Double Negation (DN)
α ⇄ ∼∼α
EXERCISES 3.3a
Derive the conclusions of each of the following arguments
using the rules of inference and the first five rules of
equivalence.
1. 1. A ⊃ B
2. C ∙ A / B
2. 1. ∼(P ∨ Q)
2. R ⊃ P / ∼R
3. 1. H ∨ J
2. I ∙ ∼H / J
4. 1. X ⊃ Y
2. Z ∙ ∼Y / ∼X ∙ Z
5. 1. R ∨ B
2. B ⊃ M
3. R ⊃ D
4. ∼M / D
6. 1. Q ⊃ R
2. ∼(S ∨ T)
3. T ∨ Q / R
7. 1. X ⊃ Y
2. (∼Y ∙ Z) ∙ T
3. X ∨ W / W
8. 1. ∼A ∨ B
2. ∼[(∼A ∨ C) ∨ D] / B
9. 1. A ∨ (B ∙ C)
2. (C ∨ A) ⊃ ∼∼B / B
10. 1. A ⊃ (C ∨ B)
2. ∼C ∙ A
3. B ⊃ D / D
11. 1. (A ⊃ B) ∨ T
2. ∼T
3. B ⊃ C / A ⊃ C
12. 1. ∼A ⊃ C
2. B ∙ ∼C
3. A ⊃ D / D ∙ B
13. 1. ∼D ∙ ∼E
2. (D ∨ F) ∨ E / F
14. 1. E ∙ D
2. D ⊃ ∼A
3. (B ∨ A) ∨ C / B ∨ C
15. 1. P ∨ (Q ∙ R)
2. P ⊃ S
3. R ⊃ T / S ∨ T
29. 1. C ∨ (D ∙ B)
2. (C ∨ D) ⊃ ∼C / D ∙ B
30. 1. E ∨ (F ∨ G)
2. ∼(∼∼G ∨ ∼H)
3. [(E ∨ F) ∙ ∼G] ⊃ A / A
31. 1. ∼X ∙ (Y ∨ Z)
2. ∼Y ∨ ∼∼X
3. (∼X ∙ Z) ⊃ W / T ∨ W
32. 1. (P ∨ Q) ∨ R
2. ∼P
3. Q ⊃ S
4. R ⊃ T
5. ∼S / T
33. 1. J ⊃ K
2. K ⊃ [L ∨ (M ∙ N)]
3. ∼N ∙ J / L
34. 1. [O ∨ (P ∙ Q )] ⊃ R
2. R ⊃ ∼S
3. P ∙ S / ∼Q
35. 1. A ⊃ B
2. ∼[(C ∙ D) ∨ (C ∙ B)]
3. C ∙ E / ∼A
36. 1. F ⊃ G
2. H ⊃ I
3. (J ∨ F) ∨ H
4. ∼J ∙ ∼G / I
37. 1. ∼(A ∨ B)
2. D ⊃ B
3. A ∨ (∼E ∨ D)
4. [∼(∼C ∨ E) ⊃ F] ∙ C / F
38. 1. A ∙ ∼C
2. ∼(C ∙ D) ⊃ E
3. ∼(F ∨ C) ⊃ ∼E / F ∙ E
39. 1. M ∨ (Q ⊃ ∼P)
2. (∼Q ∙ L) ⊃ (∼Q ⊃ ∼O)
3. (P ∨ M) ∙ (M ∨ L)
4. ∼M / ∼O
40. 1. (O ∙ P) ⊃ (Q ∙ R)
2. P ⊃ ∼Q
3. O ⊃ ∼R
4. P ∨ O / ∼P ∨ ∼O
EXERCISES 3.3b
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the arguments
using the eight rules of inference and the first five rules of
equivalence.
1. If Albert asks Bernice on a date, then she’ll say yes. Bernice doesn’t say yes to a
date and her cat died, but her dog is still alive. So, Albert didn’t ask Bernice on
a date.
2. Callie majors in English only if she reads Charles Dickens. Either Callie and
Elisa major in English or Callie and Franz major in English. So, Callie reads
Charles Dickens.
3. If there is a mouse in the house, then nuts were left out. The lights were turned
off unless no nuts were left out. Neither the lights were turned off nor were the
doors left open. So, there was no mouse in the house.
4. It is not the case that either there was a paper or both a quiz and recitation
in French class. If there is no quiz, then the students are happy. If there is no
recitation, the teacher is happy. So, either the students or the teacher is happy.
5. Roland will either go on the upside-down roller coaster, or the speedy vehicle or
the water slide. He doesn’t go on the upside-down roller coaster and he doesn’t
go on the speedy vehicle. If he goes on the tilt-a-whirl, then he won’t go on the
water slide. So, he doesn’t go on the tilt-a-whirl.
6. If Luz doesn’t travel to Greece, then she’ll go to Haiti. She’ll go to Israel given
that she travels to Haiti. She doesn’t go to either Greece or Jordan. So, she goes
to Israel and not Jordan.
7. It is not the case that either Ernesto and Francisco go to swim practice or Gillian
or Hayden go to swim practice. Either Isaac or Joan goes to swim practice. If
Isaac goes to swim practice, then Hayden will go to swim practice. So, Joan
goes to swim practice.
8. If it’s not the case that both Katrina and Laetitia go to math class, then Ms.
Macdonald will be angry. Ms. Macdonald is angry only when Nigel skips math
class. It is not the case that either Olivia and Polly both skip math class, or Nigel
does. Therefore, Laetitia goes to math class.
9. Time is not both dynamic and static. But time is both subjective and dynamic.
So, time is not static.
10. Anaximander, Thales, or Pythagoras believes that everything is made of water.
But neither Anaximander nor Pythagoras believes that everything is made of
water. So, either Thales or Protagoras believes that everything is made of water.
11. If meaning is atomic and compositional, then there are no incompatible
translation manuals. But there are incompatible translation manuals. And
meaning is compositional. So, meaning is not atomic.
12. Either Sartre believes in freedom just in case Camus does, or existentialism
is problematic. But existentialism is neither incoherent nor problematic. So,
Sartre believes in freedom if, and only if, Camus does.
13. Descartes and either Spinoza or Leibniz defend the ontological argument. But
if Descartes and Spinoza defend the ontological argument, then rationalism
is not theistic. If Descartes and Leibniz defend the ontological argument,
then rationalism is not libertarian. So, rationalism is not both theistic and
libertarian.
14. If truth is not subjective, then there are universally valid principles of justice. If
truth is not relative, then we can know the principles of justice. If truth is both
subjective and relative, then there are no moral facts. But there are moral facts.
So, either there are universally valid principles of justice or we can know the
principles of justice.
15. Either morality is individualistic or Nietzsche is not right about morality.
Either morality is individualistic or Thrasymachus is not right about
morality. Nietzsche and Thrasymachus are not both wrong. So, morality is
individualistic.
16. The self is either the soul or consciousness, or it’s irreducible or nonexistent.
If the self is either the soul or consciousness, then empirical science is
useless. If the self is irreducible, then it is really consciousness. Empirical
science is not useless. So, neither empirical science is useless nor is the soul
not nonexistent.
Contraposition (Cont)
Contraposition is based on the equivalence of a conditional and its contrapositive.

α ⊃ β ⇄ ∼β ⊃ ∼α   Contraposition

Contraposition (Cont) is a rule of equivalence of PL.
In other words, the antecedent and consequent of a conditional statement may be
exchanged if they are both negated (or, right-to-left, un-negated). Cont is often used
with HS, as in 3.4.1.
3.4.1 1. A ⊃ B
2. D ⊃ ∼B / A ⊃ ∼D
3. ∼∼B ⊃ ∼D 2, Cont
4. B ⊃ ∼D 3, DN
5. A ⊃ ∼D 1, 4, HS
QED
Cont can be tricky when only one formula is negated, as we can see in 3.4.2 and
3.4.3, which perform the same transformation in different orders. You can either add
a negation to both the antecedent and consequent when you use Cont or you can take
a tilde off of each of them. But you cannot mix-and-match. Thus, you often need to
invoke DN together with Cont.
3.4.2 A ⊃ ∼B
∼∼B ⊃ ∼A by Cont (left-to-right)
B ⊃ ∼A by DN
3.4.3 A ⊃ ∼B
∼∼A ⊃ ∼B by DN
B ⊃ ∼A by Cont (right-to-left)
If you need to derive a biconditional, again the first version of the rule is often more
useful. First, derive the two component conditionals. Then, conjoin them and use the
rule. We will explore this method more carefully in sections 3.6, 3.7, and 3.9. For now,
take a moment to see how the rule is used at 3.4.7.
3.4.7 1. ∼[(K ⊃ ∼H) ∙ (∼H ⊃ K)]
2. (I ∙ J) ⊃ (K ≡ ∼H) / ∼(I ∙ J)
3. ∼(K ≡ ∼H) 1, Equiv
4. ∼(I ∙ J) 2, 3, MT
QED
Exportation (Exp)
Exportation allows you to group antecedents of nested conditionals either together as a conjunction (on the right) or separately (on the left).

α ⊃ (β ⊃ γ) ⇄ (α ∙ β) ⊃ γ   Exportation

Exportation (Exp) is a rule of equivalence of PL.
According to Exp, a typical nested conditional like 3.4.8 can be translated as either
3.4.9 or 3.4.10.
3.4.8 If I get my paycheck today, then if you come with me, we can go
to dinner.
3.4.9 P ⊃ (C ⊃ D)
3.4.10 (P ∙ C) ⊃ D
While 3.4.9 is the more natural reading of 3.4.8, the alternative 3.4.10 is also satis-
fying. A close English translation of 3.4.10, at 3.4.11, is intuitively equivalent to the
original.
3.4.11 If I get my paycheck today and you come with me, then we can go
to dinner.
Further, exportation, when combined with commutativity, allows us to switch an-
tecedents. So, 3.4.9 is also equivalent to 3.4.12. A natural translation of that proposi-
tion into English is at 3.4.13.
3.4.12 C ⊃ (P ⊃ D)
3.4.13 If you come with me, then if I get my paycheck, we can go to
dinner.
While 3.4.13 is not as intuitively satisfying as 3.4.11 as an equivalent of 3.4.8, they
are all logically equivalent. The difference in tone or presupposition may arise from
the awkwardness of representing natural-language conditionals, and their causal
properties, with the material conditional.
The rule of exportation sometimes allows you to get to MP or MT, as in 3.4.14.
3.4: Rules of Equivalence
3.4.14 1. L ⊃ (M ⊃ N)
2. ∼N / ∼L ∨ ∼M
3. (L ∙ M) ⊃ N 1, Exp
4. ∼(L ∙ M) 3, 2, MT
5. ∼L ∨ ∼M 4, DM
QED
When using exportation, be careful to distinguish propositions like 3.4.15 from
propositions like 3.4.16. These are not equivalent. Remember that exportation allows
us to group two antecedents, as in the former, not two consequents, as in the latter.
Only 3.4.15 may be used with exportation.
3.4.15 A ⊃ (B ⊃ C)
3.4.16 (A ⊃ B) ⊃ C
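The contrast between 3.4.15 and 3.4.16 can be confirmed with truth tables; a quick Python sketch (the encoding and names are ours):

```python
from itertools import product

def imp(a, b):
    return (not a) or b

vals = list(product([True, False], repeat=3))

nested   = lambda a, b, c: imp(a, imp(b, c))   # 3.4.15: A > (B > C)
exported = lambda a, b, c: imp(a and b, c)     # (A . B) > C
other    = lambda a, b, c: imp(imp(a, b), c)   # 3.4.16: (A > B) > C

nested_eq = all(nested(*v) == exported(*v) for v in vals)
other_eq  = all(nested(*v) == other(*v) for v in vals)
print(nested_eq, other_eq)  # True False
```

The valuation where A, B, and C are all false already separates 3.4.15 (true) from 3.4.16 (false).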
Tautology (Taut)
The two Tautology (Taut) rules are rules of equivalence of PL. Tautology eliminates some redundancy.
Tautology:  α  ←→  α ∙ α
            α  ←→  α ∨ α
The conjunction version of Taut is redundant on whole lines, right-to-left, since we
can use Simp instead. The disjunction version is redundant on whole lines left-to-
right, since we can use Add instead. But Taut can be used on parts of lines, and the
other directions can also be useful, especially for disjunction, as in 3.4.17.
3.4.17 1. O ⊃ ∼O / ∼O
2. ∼O ∨ ∼O 1, Impl
3. ∼O 2, Taut
QED
Summary
We have now seen eight rules of inference and ten rules of equivalence. It is a lot of
rules to learn and master. The best way to learn how to use the rules is just to practice
lots of derivations.
With our eighteen rules, our proof system is almost complete. We’ll need at least
one of the proof methods of sections 3.7 and 3.9 to finish. But we have plenty of in-
teresting rules to learn and use, and the derivations of this section and the next are
among the most difficult in the textbook. While it will take some work to learn the
new proof techniques, they will, in the end, make derivations much simpler.
Before we get to the new techniques, though, our next section, 3.5, features some
hints and tricks that may be adapted for use in lots of longer derivations and should
help make some of the more difficult derivations more manageable. In section 3.6, we
will see a set of rules governing inferences using the biconditional.
KEEP IN MIND
We saw five further rules of equivalence in this section: contraposition (Cont), material
implication (Impl), material equivalence (Equiv), exportation (Exp), and tautology
(Taut).
Cont displays the equivalence of a statement with its contrapositive.
The rule of material implication is another way of saying that either the antecedent is false
or the consequent is true.
Equiv provides two ways to unpack or introduce a biconditional.
Exp allows you to group the antecedents of some nested conditionals.
Taut eliminates redundancy with conjunctions or disjunctions.
We now have eighteen rules available for use in derivations.
Rules Introduced
Contraposition (Cont):  α ⊃ β  ←→  ∼β ⊃ ∼α
Material Implication (Impl):  α ⊃ β  ←→  ∼α ∨ β
Material Equivalence (Equiv):  α ≡ β  ←→  (α ⊃ β) ∙ (β ⊃ α)
                               α ≡ β  ←→  (α ∙ β) ∨ (∼α ∙ ∼β)
Exportation (Exp):  α ⊃ (β ⊃ γ)  ←→  (α ∙ β) ⊃ γ
Tautology (Taut):  α  ←→  α ∙ α
                   α  ←→  α ∨ α
EXERCISES 3.4a
For each of the following inferences, determine which single
rule of equivalence of sections 3.3 or 3.4 is used, if any. If
the second formula does not result from a single application
of a rule of equivalence to the first formula, write, ‘does not
follow’. (Some of those inferences are valid, even if not immediately inferable in our system.)
The inferences that do not immediately follow in these exercises are common errors that logic students make when applying the rules of equivalence.
EXERCISES 3.4b
Derive the conclusions of each of the following arguments
using the rules of inference and equivalence.
1. 1. P ⊃ ∼Q
2. R ⊃ Q
3. (P ⊃ ∼R) ⊃ S /S
2. 1. P ∨ Q
2. ∼Q ∨ R /P∨R
3. 1. ∼(P ≡ ∼Q )
2. P /Q
4. 1. ∼I ∨ J
2. J ≡ K
3. (I ∙ L) ∨ (I ∙ M) / K
5. 1. G ∨ H
2. ∼I ∙ ( J ∙ ∼G) / H ∨ ∼I
6. 1. P ∨ (Q ∙ R)
2. S ⊃ ∼R /S⊃P
7. 1. ∼P ∨ (Q ∨ S)
2. ∼P ⊃ R / ∼R ⊃ (Q ∨ S)
8. 1. E ≡ F
2. ∼(G ∨ E) / ∼F
9. 1. A ∨ (B ∨ A)
2. ∼(B ∨ C)
3. A ⊃ D /D
10. 1. (P ∙ Q ) ⊃ R
2. (P ∙ S) ∨ (P ∙ T) / Q ⊃ R
11. 1. L ⊃ ∼(∼M ∨ K)
2. M ⊃ (∼K ⊃ N)
3. ∼N / ∼L
12. 1. D ≡ E
2. (E ∨ F) ⊃ G
3. ∼(G ∨ H) / ∼D
13. 1. (P ∙ Q ) ∨ (R ∙ S)
2. ∼S /P
14. 1. (P ∙ Q ) ⊃ R
2. ∼(R ∨ S) / P ⊃ ∼Q
15. 1. P ⊃ (∼Q ⊃ R)
2. ∼(R ∨ S) /P⊃Q
16. 1. ∼P ∨ Q
2. ∼R ⊃ ∼Q
3. S ∨ ∼R /P⊃S
17. 1. ∼(P ∨ Q ) ⊃ R
2. ∼P / ∼R ⊃ Q
18. 1. ∼(P ∙ Q ) ⊃ R
2. ∼S ∨ ∼R /S⊃P
19. 1. ∼Q ⊃ ∼P
2. ∼Q ∨ R
3. ∼(∼S ∙ R) / ∼S ⊃ ∼P
20. 1. P ≡ ∼Q
2. P ∨ R
3. Q /R
21. 1. (P ∙ Q ) ∨ ∼R
2. ∼R ⊃ S / ∼S ⊃ P
22. 1. ∼P ∨ Q
2. ∼Q ∨ (R ⊃ ∼S) / S ⊃ (∼P ∨ ∼R)
23. 1. (P ∙ Q ) ⊃ R
2. ∼S ∨ P / (S ∙ Q ) ⊃ R
24. 1. D ∨ (E ∨ F)
2. F ⊃ (G ∙ H)
3. ∼G /D∨E
25. 1. Q ⊃ R
2. R ⊃ (S ⊃ T) / ∼T ⊃ (S ⊃ ∼Q )
26. 1. (P ⊃ ∼Q ) ∨ R / (∼R ∙ P) ⊃ ∼Q
27. 1. (P ≡ Q ) ∨ P / P ∨ ∼Q
28. 1. ∼[(P ∨ Q ) ∙ R]
2. R ∨ S /Q⊃S
29. 1. (P ≡ Q ) ∨ ∼P /P⊃Q
30. 1. ∼P ∨ Q
2. R ⊃ ∼Q
3. R ∨ ∼S
4. ∼T ⊃ S /P⊃T
31. 1. (S ≡ T) ∙ ∼U
2. ∼S ∨ (∼T ∨ U) / ∼S
32. 1. [V ∨ (W ∨ X)] ⊃ Y
2. Y ⊃ Z / Z ∨ ∼V
33. 1. F ⊃ (G ⊃ H)
2. G ∙ ∼H
3. J ⊃ F / ∼J
34. 1. N ⊃ O
2. P ⊃ Q
3. ∼(Q ∨ O) /P≡N
35. 1. T ⊃ (U ⊃ V)
2. Q ⊃ (R ⊃ V)
3. (T ∙ U) ∨ (Q ∙ R) / V
36. 1. (P ∙ Q ) ⊃ (R ∙ S)
2. Q / ∼S ⊃ ∼P
37. 1. (P ∙ ∼Q ) ⊃ (R ∨ S)
2. P ∙ ∼S /Q∨R
38. 1. Q ⊃ ∼P
2. ∼Q ⊃ R
3. ∼R ∨ ∼S
4. S ∨ ∼P / ∼P
39. 1. P ≡ (Q ∙ R)
2. S ⊃ P
3. T ⊃ P
4. ∼S ⊃ T /Q
40. 1. ∼(P ≡ ∼Q )
2. P ⊃ R
3. Q ∨ R / R
EXERCISES 3.4c
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the arguments
using the rules of inference and equivalence.
1. There is a rainbow if, and only if, the sun is out. The sun is not out. So, there is
no rainbow.
2. If there are alpacas on the farm, then there are beagles. If there are beagles, then
there are cows. So, either there are cows or there are no alpacas.
3. If there is a line, Marla must wait in it. If New England High School shows up,
then there is a line if the organist attends. The organist attends and New Eng-
land High School shows up. Therefore, Marla must wait in line.
4. Cecilia goes roller skating if, and only if, Denise comes with her. Denise and
Elise go roller skating, and Felicia goes running. So, Cecilia goes roller skating.
5. Either Ana doesn’t like lemons or she likes mangoes. She likes lemons and nec-
tarines, and oranges. She either doesn’t like mangoes or she likes plums. So, she
likes plums.
6. Quincy takes the job just in case Miriam does not veto the move. Miriam
vetoes the move. So, either Quincy does not take the job or she gets another
offer.
7. I can be happy if, and only if, I have both friends and wealth. But I have no
friends. So, I cannot be happy.
8. Either we act freely or we lack reasons to act. Either we conceive of ourselves
as free or we do not act freely. So, either we conceive of ourselves as free or we
lack reasons to act.
9. Either art does not presuppose a distinctive sort of experience or there is no
unified essence for art. If art does not presuppose a distinctive sort of experience
then there is a unified essence for art. So, art presupposes a distinctive sort of
experience if, and only if, there is no unified essence for art.
10. Either there are moral facts or murder is not wrong. Either murder is wrong or
we cannot know ethical principles. If there are moral facts then we can know
ethical principles. So, there are moral facts if, and only if, we can know ethical
principles.
11. If metaphysics is a priori, then if it is synthetic, then Hume is wrong about
causation. If we cannot see gravity, then Hume is not wrong about causation.
Therefore, if metaphysics is synthetic and a priori, then we can see gravity.
12. We are conscious if, and only if, not all facts are physical. If we are not conscious
and we are zombies, then dualism is true. All facts are physical. So, if we are
zombies, then dualism is true.
13. If there is a self, then the concept of the self is irreducible. If I am my conscious
experience, then the concept of the self is not irreducible. If I do not have a soul,
then I am my conscious experience. If I do have a soul, then I am not my body.
So, if I am my body, then there is no self.
14. Consequences are morally important if, and only if, duties are not. Either con-
sequences are morally important or duties are not. So, consequences are mor-
ally important and duties are not.
Making Conditionals
In 3.5.1, we infer from the negation of a wff that the wff (un-negated) entails anything.
You just add the desired consequent and use the rule of material implication.
3.5.1 1. ∼A /A⊃B
2. ∼A ∨ B 1, Add
3. A ⊃ B 2, Impl
QED
In 3.5.2, we see that any wff entails a formula that is already assumed or proven. As
in 3.5.1, you add a wff: this time, the negation of your desired antecedent. Again, a use
of Impl ends the derivation.
3.5.2 1. E /F⊃E
2. E ∨ ∼F 1, Add
3. ∼F ∨ E 2, Com
4. F ⊃ E 3, Impl
QED
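Both techniques rest on semantic facts about the material conditional, which a truth-table sweep confirms; a sketch in Python (the helper names are ours):

```python
from itertools import product

def imp(a, b):
    return (not a) or b

def entails(premise, conclusion, n):
    """Valid iff the conclusion holds wherever the premise is true."""
    return all(conclusion(*v)
               for v in product([True, False], repeat=n)
               if premise(*v))

# 3.5.1: from ~A we may conclude A > B, for any B at all.
print(entails(lambda a, b: not a, lambda a, b: imp(a, b), 2))  # True

# 3.5.2: from E we may conclude F > E, for any F at all.
print(entails(lambda e, f: e, lambda e, f: imp(f, e), 2))      # True
```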
3.5: Practice with Derivations
Negated Conditionals
Having the negation of a conditional in a proof can often be useful. Remember, the
only way for a conditional to be false is for the antecedent to be true and the consequent
to be false. So, if you have assumed or derived the negation of a conditional, you can
also derive the antecedent conjoined with the negation of the consequent, as at 3.5.4.
Then you can simplify either conjunct.
3.5.4 1. ∼(P ⊃ Q) / P ∙ ∼Q
2. ∼(∼P ∨ Q) 1, Impl
3. ∼∼P ∙ ∼Q 2, DM
4. P ∙ ∼Q 3, DN
QED
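The claim that a negated conditional amounts to the antecedent conjoined with the negated consequent is easy to confirm by truth table; a Python sketch (names ours):

```python
from itertools import product

def imp(a, b):
    return (not a) or b

# 3.5.4: ~(P > Q) is equivalent to P . ~Q.
neg_cond_eq = all((not imp(p, q)) == (p and not q)
                  for p, q in product([True, False], repeat=2))
print(neg_cond_eq)  # True
```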
3.5.6 1. (R ∨ S) ⊃ T /R⊃T
2. ∼(R ∨ S) ∨ T 1, Impl
3. (∼R ∙ ∼S) ∨ T 2, DM
4. T ∨ (∼R ∙ ∼S) 3, Com
5. (T ∨ ∼R) ∙ (T ∨ ∼S) 4, Dist
6. T ∨ ∼R 5, Simp
7. ∼R ∨ T 6, Com
8. R ⊃ T 7, Impl
QED
Be careful to note the contrast between 3.5.5 and 3.5.6. We can reduce a condi-
tional with a conjunction in the consequent or a conditional with a disjunction in the
antecedent. We cannot reduce a conditional with a conjunction in the antecedent,
nor can we reduce a conditional with a disjunction in the consequent. If α entails β
and γ, then α entails β and α entails γ. If either α or β entails γ, then α entails γ and
β entails γ. But from α and β together entailing γ, one cannot conclude that either
α or β on its own entails γ. And from α entailing either β or γ, one does not know
whether β or γ is entailed.
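The two cautions in this paragraph can be confirmed by hunting for counterexamples, that is, valuations where the first formula is true and the second false; a Python sketch (names ours):

```python
from itertools import product

def imp(a, b):
    return (not a) or b

vals = list(product([True, False], repeat=3))

# (A . B) > C does NOT entail A > C: look for a valuation making the
# first true and the second false.
bad = [v for v in vals if imp(v[0] and v[1], v[2]) and not imp(v[0], v[2])]
print(bad)  # e.g. (True, False, False) is a counterexample

# A > (B v C) does NOT entail A > B.
bad2 = [v for v in vals if imp(v[0], v[1] or v[2]) and not imp(v[0], v[1])]
print(bad2)  # e.g. (True, False, True) is a counterexample
```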
Combining Conditionals
3.5.7 and 3.5.8 show techniques that are the reverse of those in 3.5.5 and 3.5.6, com-
bining two conditionals that share a consequent (in the former) and combining two
conditionals that share an antecedent (in the latter).
3.5.7 1. W ⊃ X
2. Y ⊃ X / (W ∨ Y) ⊃ X
3. (W ⊃ X) ∙ (Y ⊃ X) 1, 2, Conj
4. (∼W ∨ X) ∙ (Y ⊃ X) 3, Impl
5. (∼W ∨ X) ∙ (∼Y ∨ X) 4, Impl
6. (X ∨ ∼W) ∙ (∼Y ∨ X) 5, Com
7. (X ∨ ∼W) ∙ (X ∨ ∼Y) 6, Com
8. X ∨ (∼W ∙ ∼Y) 7, Dist
9. (∼W ∙ ∼Y) ∨ X 8, Com
10. ∼(W ∨ Y) ∨ X 9, DM
11. (W ∨ Y) ⊃ X 10, Impl
QED
3.5.8 1. A ⊃ B
2. A ⊃ C / A ⊃ (B ∙ C)
3. ∼A ∨ B 1, Impl
4. ∼A ∨ C 2, Impl
5. (∼A ∨ B) ∙ (∼A ∨ C) 3, 4, Conj
6. ∼A ∨ (B ∙ C) 5, Dist
7. A ⊃ (B ∙ C) 6, Impl
QED
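The equivalences behind 3.5.7 and 3.5.8 can likewise be checked by truth table; a Python sketch (names ours):

```python
from itertools import product

def imp(a, b):
    return (not a) or b

vals = list(product([True, False], repeat=3))

# 3.5.7: (W > X) . (Y > X) is equivalent to (W v Y) > X.
shared_consequent = all(
    (imp(w, x) and imp(y, x)) == imp(w or y, x) for w, x, y in vals)

# 3.5.8: (A > B) . (A > C) is equivalent to A > (B . C).
shared_antecedent = all(
    (imp(a, b) and imp(a, c)) == imp(a, b and c) for a, b, c in vals)

print(shared_consequent, shared_antecedent)  # True True
```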
Explosion
Lastly, let’s take a look at an important and curious inference that logicians call
explosion. Explosion is a characteristic of inconsistent theories, given the rules of
inference of classical logic. An inconsistent theory is one in which both a statement
and its negation are derivable. In other words, inconsistent theories contain contradictions. (In derivations, a contradiction is any statement of the form α ∙ ∼α.)
In chapter 2, we saw that individual statements can be self-contradictory, if they are false in every row of the truth table. We also saw that pairs of statements can be contradictory, if they differ in truth value in each row of the truth table, and inconsistent, if they cannot be true together. For the purposes of our proof theory, we will henceforth take a narrower view of the term ‘contradiction’, as the conjunction of any statement with its negation, any statement of the form α ∙ ∼α, for any wff α.
Explosion is a property of classical systems of inference: from a contradiction, any statement can be derived. Let’s look at explosion, starting with a contradiction, at 3.5.10.
3.5.10 1. P ∙ ∼P
2. P 1, Simp
3. P ∨ Q 2, Add
4. ∼P ∙ P 1, Com
5. ∼P 4, Simp
6. Q 3, 5, DS
QED
Notice that the only premise for the explosive inference is the contradiction at
line 1; Q never appears until it’s added at line 3. And then it is derived all by itself!
From a contradiction, anything, and everything, follows. That’s why logicians call
this property of logical systems explosion: every wff of the language can be derived
from any contradiction. Classical systems explode.
We will return to explosion and the importance of contradictions (and avoid-
ing them in classical systems such as ours) in section 3.9. For now, just notice that
if you ever find an argument in which a contradiction is provable, you can just in-
sert a few lines, as in 3.5.10, to demonstrate any conclusion. (I ordinarily try to keep
the premises of the arguments in the exercises consistent, but here I included a few
contradictions—see if you can find them!)
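Explosion also holds semantically: since no valuation makes a contradiction true, the definition of validity is satisfied vacuously. A Python sketch (names ours):

```python
from itertools import product

def entails(premise, conclusion, n):
    """Vacuously valid when no valuation makes the premise true."""
    return all(conclusion(*v)
               for v in product([True, False], repeat=n)
               if premise(*v))

# Explosion: P . ~P entails Q (and anything else), because the filter
# `if premise(*v)` lets no valuation through.
print(entails(lambda p, q: p and not p, lambda p, q: q, 2))  # True
```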
Summary
The proofs of section 3.4 were often difficult due both to the number of rules we have
to know and the complexities of the arguments whose conclusions we are now able to
derive. One way to improve your abilities to construct complicated derivations is to
know and recognize a variety of common techniques, such as the ones of this section.
They are worth a little time studying, so that you can use them in future derivations.
Moreover, some of the underlying concepts, such as the fact that a true statement is entailed by any statement, or that a contradiction entails all other formulas, are central to PL, classical propositional logic. So, getting to know these techniques can help you better understand the logic you are using.
KEEP IN MIND
EXERCISES 3.5a
Derive the conclusions of each of the following arguments
using the rules of inference and equivalence.
1. 1. A ⊃ B
2. B ⊃ ∼B / ∼A
2. 1. ∼K ∨ L
2. L ⊃ ∼K / ∼K
3. 1. G ⊃ H
2. ∼(I ⊃ H) / ∼G
4. 1. (T ∙ U) ⊃ V
2. ∼(T ⊃ W) / U ⊃ V
5. 1. ∼(P ⊃ Q )
2. ∼(R ⊃ S) / ∼(Q ∨ S)
6. 1. P ⊃ Q
2. P /R⊃Q
7. 1. (P ∨ Q ) ⊃ R
2. R ⊃ ∼S / P ⊃ ∼S
8. 1. (A ⊃ B) ⊃ C
2. ∼A ∨ (B ∙ D) / C
9. 1. W ⊃ (X ∙ Y)
2. (W ∙ ∼X) ∨ Z / Z
10. 1. N ⊃ (O ∙ P)
2. ∼N ⊃ Q / ∼O ⊃ Q
11. 1. ∼P ⊃ R
2. ∼Q ⊃ R
3. ∼R / S ⊃ (P ∙ Q )
12. 1. P ≡ (Q ∙ R)
2. ∼Q / ∼P
13. 1. P ⊃ (∼Q ⊃ ∼R)
2. R /P⊃Q
14. 1. ∼[(P ∙ Q ) ∙ R]
2. R / P ⊃ ∼Q
15. 1. (P ∙ Q ) ⊃ (R ⊃ S)
2. Q ∙ R / ∼S ⊃ ∼P
16. 1. I ⊃ J
2. ∼J ∙ K
3. ∼J ⊃ L
4. ∼∼I /K∙L
17. 1. ∼(P ≡ ∼Q )
2. P ⊃ ∼Q / ∼Q ∙ ∼P
18. 1. P ⊃ R
2. Q ⊃ R
3. S ⊃ (P ∨ Q ) /S⊃R
19. 1. R ∨ Q
2. ∼R ∨ ∼S
3. ∼(∼S ∙ ∼T)
4. ∼(P ⊃ U)
5. ∼(P ∙ Q ) / T ∙ ∼U
20. 1. (P ∙ Q ) ⊃ (R ∨ S) / ∼R ⊃ [(Q ∙ P) ⊃ S]
21. 1. ∼(X ⊃ Y)
2. Y ∨ (Z ∙ A) /Z≡A
22. 1. (H ∙ I) ⊃ J
2. H ∙ (I ∨ K) / ∼J ⊃ K
23. 1. (X ⊃ Y) ⊃ Z
2. W ⊃ ∼Z / ∼(W ∙ Y)
24. 1. ∼V ⊃ W
2. X ⊃ Y
3. V ⊃ Z
4. ∼W ∙ X
5. ∼Z ∙ Y / Y ∙ ∼V
25. 1. P ⊃ Q
2. P ⊃ R
3. (Q ∙ R) ⊃ ∼S / ∼P ∨ ∼S
26. 1. P ⊃ (Q ∨ R)
2. R ⊃ (S ∙ T)
3. ∼Q /P⊃T
27. 1. ∼P ∨ Q
2. ∼R ∨ ∼Q
3. ∼R ⊃ (S ∙ T) /P⊃S
28. 1. A ⊃ B
2. B ⊃ D
3. D ⊃ A
4. A ⊃ ∼D / ∼A ∙ ∼D
29. 1. (I ∙ E) ⊃ ∼F
2. F ∨ (G ∙ H)
3. I ≡ E /I⊃G
30. 1. ( J ⊃ J) ⊃ (K ⊃ K)
2. (K ⊃ L) ⊃ ( J ⊃ J) /K⊃K
EXERCISES 3.5b
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the arguments
using the rules of inference and equivalence.
1. If David quits the team, then Sandra watches the games provided that Ross
joins the team. So, it is not the case that David quits the team, and Ross joins
the team, and Sandra doesn’t watch the games.
2. If you are from the planet Orc, then you have pin-sized nostrils. But, things
with pin-sized nostrils are not from Orc. Either you are from Orc or Quaznic,
or you rode a long way in your spaceship. So, you are from Quaznic unless you
rode a long way in your spaceship.
3. It is not the case that violets bloom only if they are watered. Either violets are
watered or they undergo special treatment. So, they undergo special treatment.
4. If Francesca playing the xylophone entails that she yawns in class, then Zara
gives a presentation in class. If Zara gives a presentation, then the woodwind
players listen. So, either the woodwind players listen or Francesca plays
xylophone.
5. Either experience eternally recurs unless there is no God, or suffering is the
meaning of existence. If I can go under, then experience does not eternally re-
cur. So, if I can go under and there is a God, then the suffering is the meaning
of existence.
6. If life is suffering, then if you do not have compassion, then only the truth can
save us. It is not the case that if life is suffering, then you have compassion. So,
only the truth can save us.
7. If we explain events by reference to better-known phenomena, then explana-
tions are not inferences. Explanations of events refer to better-known phenom-
ena. So, we explain events by reference to better-known phenomena if, and only
if, explanations are not inferences.
8. If God’s nonexistence entails her existence, then the existence of goodness en-
tails that there is no goodness. God exists. So, there is no goodness.
9. If removing one’s glasses entails that the quality of experience changes, then
the content of experience is subjective. But the content of experience is not sub-
jective. So, if the quality of experience changes, then qualia fade.
10. If truth arises from societies with constraints and not from solitary freedom,
then philosophy and politics are inextricably linked. Truth arising from socie
ties with constraints does not entail that philosophy and politics are inextrica-
bly linked. So, truth arises from solitary freedom.
11. If acting freely entails the existence of external causation, then we are aware
of our freedom. If we are aware of our freedom, then we are unaware of our
freedom. So, we act freely.
12. If slowness is not a property of a walker, then if it is a property of walking,
then events exist. Either slowness is a property of walking and of running,
or slowness is a property of walking and of thinking. So, if slowness is not a
property of a walker, then events exist.
13. If moral theory is useful, then it should not serve only oneself. If either moral
theory should not serve only oneself or self-interest is difficult to know, then
we ought to consider the good of others. If moral theory is not useful, then we
should not consider the good of others. So, a moral theory is useful if, and only
if, we should consider the good of others.
14. If the general will is common interest, then if foreign powers see the state as
an individual, then the general will involves total subjection and is sometimes
misunderstood. If the general will is sometimes misunderstood, then to govern
is to serve. Foreign powers see the state as an individual. So, if the general will
is common interest, then to govern is to serve.
15. If sense experience is reliable, then mass is a real property and color is not. If
Newtonian physics is true, then mass is a real property and teleology is not a
physical concept. If Newtonian physics is not true, then sense experience is reli-
able. If mass is a real property and color is not, then teleology is a physical con-
cept. So, color is a real property if, and only if, teleology is not a physical concept.
16. If there is a God, then there is goodness. But the existence of a God also entails
that we are free. Either the nonexistence of God entails the nonexistence of
goodness or we are not free. So, God exists if, and only if, there is goodness and
we are free.
the biconditional. 3.6.2, which is identical to 3.6.1 except for the main operator of the
first premise, is logically valid.
3.6.2 I’ll go with you if, and only if, you go to the movies.
You don’t go to the movies.
So, I don’t go with you.
In 3.6.1, I commit to joining you if you go to the movies, but I say nothing about
what happens if you decide instead to go bowling. Perhaps I really like you and would
join you no matter what you do. (And perhaps I utter the first premise of 3.6.1 in order
not to appear overeager!) In contrast, in 3.6.2, I both commit to joining you if you go
to the movies and not going with you if you do anything else. I join you only if you
go to the movies, so if you go bowling, I’m out.
Compounding the confusion, perhaps, is the fact that in many mathematical or log-
ical contexts, people use conditionals where biconditionals are also (perhaps more)
appropriate. For example, a mathematician might utter 3.6.3.
3.6.3 If a tangent to a circle intersects a chord at a right angle, the
chord is a diameter.
While there’s nothing wrong with 3.6.3, the stronger 3.6.4 is also warranted.
3.6.4 A tangent to a circle intersects a chord at a right angle if, and only
if, the chord is a diameter.
Since conditionals and biconditionals have different truth conditions, it is impor-
tant to keep them distinct in your mind and regimentations. It will also be useful to
have some more rules governing inferences using the biconditional.
We have lots of rules governing the conditional, but so far the only rule governing use of the biconditional is Equiv. Given our conditional rules, the inference 3.6.5 is made in a single step.
3.6.5 1. P ⊃ Q Premise
2. P Premise
3. Q 1, 2, MP
QED
In contrast, the parallel inference 3.6.6 has five lines.
3.6.6 1. P ≡ Q Premise
2. P Premise
3. (P ⊃ Q) ∙ (Q ⊃ P) 1, Equiv
4. P ⊃ Q 3, Simp
5. Q 4, 2, MP
QED
Since our reasoning with biconditionals often parallels (with important differ-
ences) our reasoning with conditionals, it is useful to shorten some derivations by
adopting some rules governing the biconditional that are parallel to those governing
the conditional. Here are three rules of inference and four rules of equivalence. The
validity of these rules of inference and the equivalence of the rules of equivalence
are easily demonstrated using truth tables; tables for BDM and BInver appear in the
appendix.
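If you want to generate those tables yourself, here is a Python sketch of the BDM and BInver checks, reading the biconditional as truth-functional equality (the encoding is ours, not the appendix's):

```python
from itertools import product

vals = list(product([True, False], repeat=2))

# BDM: ~(a = b) is equivalent to ~a = b.
bdm = all((not (a == b)) == ((not a) == b) for a, b in vals)

# BInver: a = b is equivalent to ~a = ~b.
binver = all((a == b) == ((not a) == (not b)) for a, b in vals)

print(bdm, binver)  # True True
```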
Lastly, biconditional association helps with propositions containing multiple biconditionals, often in combination with other biconditional rules. (Biconditional Association (BAssoc) is a rule of equivalence of PL which allows you to regroup propositions with two biconditionals.)
3.6.11 1. P ≡ (Q ≡ R)
2. ∼R ∙ ∼Q / P
3. (P ≡ Q) ≡ R 1, BAssoc
4. R ≡ (P ≡ Q) 3, BCom
5. ∼R 2, Simp
6. ∼(P ≡ Q) 4, 5, BMT
7. ∼P ≡ Q 6, BDM
8. Q ≡ ∼P 7, BCom
9. ∼Q ∙ ∼R 2, Com
10. ∼Q 9, Simp
11. ∼∼P 8, 10, BMT
12. P 11, DN
QED
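BAssoc holds because the biconditional, read truth-functionally, is associative; a quick Python check (encoding ours):

```python
from itertools import product

# BAssoc: P = (Q = R) is equivalent to (P = Q) = R, with = read as
# the biconditional (truth-functional equality).
bassoc = all((p == (q == r)) == ((p == q) == r)
             for p, q, r in product([True, False], repeat=3))
print(bassoc)  # True
```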
Summary
The biconditional rules of this section facilitate many inferences, though they do not
allow you to convert biconditionals into other operators as Equiv does. They supple-
ment, rather than supplant, the earlier rule.
KEEP IN MIND
We saw three new rules of inference in this section: biconditional modus ponens (BMP),
biconditional modus tollens (BMT), and biconditional hypothetical syllogism
(BHS).
We saw four new rules of equivalence in this section: biconditional De Morgan’s law
(BDM), biconditional commutativity (BCom), biconditional inversion (BInver), and
biconditional association (BAssoc).
BMT uses the negation of the left side of the biconditional.
BHS often must be set up properly with BCom.
BDM does not require changing an operator, only the punctuation.
To use BInver, either add one negation to each side of a biconditional or remove one from
each side.
It is important, especially in future sections, not to confuse the biconditional rules with the
parallel rules governing the conditional.
Rules Introduced
Rules of Inference:
Biconditional Modus Ponens (BMP):  α ≡ β,  α  ∴  β
Biconditional Modus Tollens (BMT):  α ≡ β,  ∼α  ∴  ∼β
Biconditional Hypothetical Syllogism (BHS):  α ≡ β,  β ≡ γ  ∴  α ≡ γ
Rules of Equivalence:
Biconditional De Morgan’s Law (BDM):  ∼(α ≡ β)  ←→  ∼α ≡ β
Biconditional Commutativity (BCom):  α ≡ β  ←→  β ≡ α
Biconditional Inversion (BInver):  α ≡ β  ←→  ∼α ≡ ∼β
Biconditional Association (BAssoc):  α ≡ (β ≡ γ)  ←→  (α ≡ β) ≡ γ
EXERCISES 3.6a
Derive the conclusions of each of the following arguments
using the eighteen standard rules and the new rules for the
biconditional. Compare your derivations to those done in
3.4b without these new rules.
EXERCISES 3.6b
Derive the conclusions of each of the following arguments
using the rules of inference and equivalence, including the
biconditional rules.
1. 1. A ≡ B
2. ∼B / ∼A
2. 1. ∼(E ≡ F)
2. F / ∼E
3. 1. G ≡ H
2. ∼H ≡ ∼I /G≡I
4. 1. J ≡ K
2. K ≡ ∼L / L ≡ ∼J
5. 1. M ≡ (N ≡ O)
2. ∼O / ∼M ≡ N
6. 1. ∼(S ≡ T)
2. ∼(T ≡ U) / S ≡ U
7. 1. X ≡ (∼Y ∨ Z)
2. X ∙ ∼Z / ∼Y
8. 1. (A ≡ B) ≡ C
2. ∼B / ∼A ≡ C
9. 1. ∼[D ≡ (E ∙ F)]
2. ∼F /D
10. 1. (G ≡ H) ⊃ H
2. ∼H /G
11. 1. L ∙ M
2. M ≡ N /L≡N
12. 1. (P ≡ Q ) ∙ (P ∨ R)
2. ∼R /Q
13. 1. W ≡ (X ∨ Y)
2. Y ∨ Z
3. ∼W /Z
14. 1. (P ≡ Q ) ⊃ ∼(R ≡ ∼S)
2. ∼(R ≡ S) / ∼P ≡ Q
15. 1. ∼P ≡ (Q ∙ R)
2. ∼Q /P
3.6: The Biconditional
30. 1. P ≡ Q
2. ∼Q ≡ R
3. R ≡ P /S
31. 1. (A ∙ B) ≡ C
2. (D ∙ ∼A) ∨ (D ∙ ∼B)
3. (C ≡ D) ≡ E / ∼E
32. 1. (J ∙ K) ≡ L
2. J ≡ M
3. K ≡ N
4. M ≡ N
5. M ∨ N /L
33. 1. ∼X ∨ Y
2. X ∨ ∼Y
3. (Z ≡ A) ⊃ ∼(X ≡ Y) / ∼Z ≡ A
34. 1. ∼P ≡ Q
2. Q ≡ R
3. (R ∙ S) ≡ T
4. S ∙ ∼T /P
35. 1. P ≡ (Q ∙ ∼R)
2. ∼S ≡ P
3. S ∙ ∼R /Q≡R
36. 1. ∼P ≡ Q
2. ∼Q ≡ R
3. P ⊃ S
4. ∼R ⊃ S /S
37. 1. B ≡ (C ∙ D)
2. E ≡ C
3. ∼D ≡ ∼E /B≡E
38. 1. (F ∨ G) ⊃ H
2. (I ∨ J) ⊃ ∼H
3. ∼I ⊃ F / ∼(F ≡ I)
39. 1. P ∨ (Q ∙ R)
2. ∼(P ∙ Q ) / ∼(P ≡ Q )
40. 1. P ≡ (Q ∨ R)
2. R ≡ S
3. Q ⊃ R /P≡S
EXERCISES 3.6c
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the arguments
using the rules of inference and equivalence.
1. Edye is patient when, and only when, she is not sleepy. She is sleepy when, and
only when, her children are not happy. So, Edye’s children are happy when,
and only when, Edye is patient.
2. Gustavo plays tennis if, and only if, he runs. But Gustavo doesn’t run. So, if Gus-
tavo plays tennis, then Martians have landed on Earth.
3. Aardvarks eat ants just in case they don’t drink beer. Aardvarks drink beer just
in case they do not chase chickens. But aardvarks do chase chickens. So, they
eat ants.
4. Doug’s playing golf entails his eating a hearty dinner if, and only if, he either
plays with Bob or he doesn’t eat at home. But it’s not the case that if Doug eats
at home, then he plays with Bob. So, Doug plays golf but does not eat a hearty
dinner.
5. Emily studies in Rome if, and only if, it is not the case that she prefers classes
on campus and her funding does not fall through. Her preferring classes on
campus does not entail that her funding falls through. So, she does not study
in Rome.
6. I’ll work in the supermarket this summer just in case I need money for a new
guitar. It’s not the case that I need money for a new guitar if, and only if, my
band gets back together. My band gets back together if, and only if, the drum-
mer drops out and the guitarist transfers back home. Neither the guitarist
transfers back home nor the singer breaks up with her girlfriend. So, I will work
in the supermarket this summer.
7. Isla stays in to study if, and only if, Christine goes to the party just in case Mer-
cedes does not go to the movie. Either Mercedes goes to the movie or Kwadwo
doesn’t hang around reading Wittgenstein. Kwadwo hangs around reading
Wittgenstein if, and only if, Hunter is busy working on his paper. It’s not the
case that Hunter’s being busy with his paper entails that Christine goes to the
party. So, Isla stays in to study.
8. Genesis does research on Hobbes if, and only if, she gets a grant or finds other
money for her work. She does not do research on Hobbes if, and only if, she
takes a different job. So, if she gets a grant, then she will not take a different job.
9. We are not free if, and only if, our wills are determined or our bodies are con-
strained. But we are free. So, if our wills are determined, then Leibniz is a
libertarian.
10. We have moral responsibilities if, and only if, it is not the case that our wills are
free just in case there are souls. But we do not have moral responsibilities and
our wills are not free. So, there are no souls.
11. I am not mortal if, and only if, the self is a conceptual construct. I am mortal
just in case either my body dies or the self is something physical. So, the self
is a conceptual construction if, and only if, my body does not die and the self is
not physical.
12. God is perfect if, and only if, there is no evil, just in case human intelligence is
limited. If God is perfect, then there is no evil. If God is not perfect, then there
is evil. So, human intelligence is limited.
13. Zombies are possible if, and only if, we are conscious just in case mental states
are not physical. But mental states are physical and zombies are not possible.
So, we are conscious.
14. I am altruistic, just in case I am not just if, and only if, I use the ring of Gyges.
But I am just. So, I do not use the ring of Gyges if, and only if, I am altruistic.
15. Either arithmetic is synthetic or not a priori, if, and only if, it is not analytic.
Arithmetic is synthetic just in case seven and five are not contained in twelve.
Seven and five are contained in twelve if, and only if, arithmetic is not a priori.
So, arithmetic is not analytic.
16. If color is real if, and only if, mass is, then philosophy is not independent of
science. If philosophy is not independent of science, then there are synthetic
a priori claims. If there are synthetic a priori claims, then arithmetic is syn-
thetic. Either arithmetic is not synthetic or philosophy is independent of sci-
ence. Color is not real. So, mass is.
Conditional proof is a derivation method useful for deriving conditional conclusions. When you want to derive a conditional conclusion, assume the antecedent of the desired conditional for the purposes of the derivation, taking care to indicate the presence of that assumption later.
Consider the argument at 3.7.1, which has a conditional conclusion.
3.7.1 1. A ∨ B
2. B ⊃ (E ∙ D) / ∼A ⊃ D
Think about what would happen if we had the antecedent of the conditional
conclusion, ‘∼A’, as another premise. First, we would be able to infer ‘B’ by DS with
line 1. Then, since we would have ‘B’, we could use MP to infer ‘E ∙ D’ from line 2.
Lastly, given ‘E ∙ D’ we could use Com and Simp to get ‘D’. So, ‘D’ would follow from
‘∼A’. The method of conditional proof formalizes this line of thought.
The line of thought we took discussing 3.7.1 is thus formalized by using the in-
dented sequence you see at 3.7.2.
3.7.2 1. A ∨ B
2. B ⊃ (E ∙ D) / ∼A ⊃ D
3. | ∼A ACP (Suppose ∼A.)
4. | B 1, 3, DS
5. | E ∙ D 2, 4, MP
6. | D ∙ E 5, Com
7. | D 6, Simp (Then D would follow.)
8. ∼A ⊃ D 3–7, CP (So, if ∼A were true, then D would be.)
QED
An indented sequence is a series of lines in a derivation that do not follow from the premises directly, but only with a further assumption, indicated on the first line of the sequence. The purpose of indenting and using a vertical line is to create an indented sequence that marks the scope of your assumption. Any statements you derive within the scope of an assumption are not derived only from the premises, as in all the direct derivations we have done until now. They are derived from the premises with an
1 7 6 C h apter 3 Inference i n P ropos i t i onal L og i c
additional assumption like the one we made at line 3. Thus, after you discharge an
assumption, you may not use statements derived within the scope of that assumption
later in the proof. We could have discharged our assumption in 3.7.2 after any number
of steps in the indented sequence: ‘∼A ⊃ (D ∙ E)’; ‘∼A ⊃ (E ∙ D)’; ‘∼A ⊃ B’; and even
‘∼A ⊃ ∼A’ are all valid inferences given the premises. But none of the consequents
of those conditional statements are themselves validly inferred from the premises
without assuming ‘∼A’.
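The semantic claim here can also be checked by brute force. The following Python sketch is my own illustration, not part of the official machinery of PL: sentence letters become booleans, and the lambda encodings of the connectives are assumptions of the sketch. It verifies that no assignment of truth values makes the premises of 3.7.1 true and ‘∼A ⊃ D’ false:

```python
from itertools import product

# Premises and conclusion of 3.7.1, encoded as functions of the
# truth values of the sentence letters A, B, D, E.
premise1 = lambda A, B, D, E: A or B                # A ∨ B
premise2 = lambda A, B, D, E: (not B) or (E and D)  # B ⊃ (E ∙ D)
conclusion = lambda A, B, D, E: A or D              # ∼A ⊃ D, i.e., A ∨ D

# The argument is valid iff no row of the truth table makes the
# premises true and the conclusion false.
valid = all(
    conclusion(A, B, D, E)
    for A, B, D, E in product([True, False], repeat=4)
    if premise1(A, B, D, E) and premise2(A, B, D, E)
)
print(valid)  # True
```

Since no row makes both premises true and the conclusion false, the result of the conditional proof agrees with the truth-table test of validity.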
Conditional proof makes many of the derivations we have done earlier using the
direct method significantly easier. To see a striking difference between the direct and
conditional derivation methods, compare an argument proved directly, in 3.7.3, and
conditionally, in 3.7.4.
3.7.3 Direct Method
1. (P ⊃ Q) ∙ (R ⊃ S) / (P ∙ R) ⊃ (Q ∙ S)
2. P ⊃ Q 1, Simp
3. ∼P ∨ Q 2, Impl
4. (∼P ∨ Q) ∨ ∼R 3, Add
5. ∼P ∨ (Q ∨ ∼R) 4, Assoc
6. (R ⊃ S) ∙ (P ⊃ Q) 1, Com
7. (R ⊃ S) 6, Simp
8. ∼R ∨ S 7, Impl
9. (∼R ∨ S) ∨ ∼P 8, Add
10. ∼P ∨ (∼R ∨ S) 9, Com
11. [∼P ∨ (Q ∨ ∼R)] ∙ [∼P ∨ (∼R ∨ S)] 5, 10, Conj
12. ∼P ∨ [(Q ∨ ∼R) ∙ (∼R ∨ S)] 11, Dist
13. ∼P ∨ [(∼R ∨ Q) ∙ (∼R ∨ S)] 12, Com
14. ∼P ∨ [∼R ∨ (Q ∙ S)] 13, Dist
15. P ⊃ [∼R ∨ (Q ∙ S)] 14, Impl
16. P ⊃ [R ⊃ (Q ∙ S)] 15, Impl
17. (P ∙ R) ⊃ (Q ∙ S) 16, Exp
QED
3.7.4 Conditional Method
1. (P ⊃ Q) ∙ (R ⊃ S) / (P ∙ R) ⊃ (Q ∙ S)
2. P ∙ R ACP
3. P ⊃ Q 1, Simp
4. P 2, Simp
5. Q 3, 4, MP
6. (R ⊃ S) ∙ (P ⊃ Q) 1, Com
7. R ⊃ S 6, Simp
8. R ∙ P 2, Com
9. R 8, Simp
10. S 7,9, MP
11. Q ∙ S 5, 10, Conj
12. (P ∙ R) ⊃ (Q ∙ S) 2–11, CP
QED
3.7: Conditional Proof
Not only is the conditional method often much shorter, as in this case, but it is also conceptually much easier. In the direct version, it is not easy to see that one has to add just what one needs at lines 4 and 9. The conditional proof proceeds in more obvious ways.
You can use CP repeatedly within the same proof, whether nested or sequentially. 3.7.5 demonstrates a nested use of CP.
A nested sequence is an assumption within another assumption.
3.7.5 1. P ⊃ (Q ∨ R)
2. (S ∙ P) ⊃ ∼Q / (S ⊃ P) ⊃ (S ⊃ R)
3. S ⊃ P ACP Now we want S ⊃ R.
4. S ACP Now we want R.
5. P 3, 4, MP
6. Q ∨ R 1, 5, MP
7. S ∙ P 4, 5, Conj
8. ∼Q 2, 7, MP
9. R 6, 8, DS
10. S ⊃ R 4–9, CP
11. (S ⊃ P) ⊃ (S ⊃ R) 3–10, CP
QED
Within an indented sequence, you can use any formula of the derivation in which that sequence is embedded. So, in the sequence following line 4, you can use lines 1 and 2 as well as line 3. But once you discharge your assumption, as I do at line 10, any conclusions of that
indented sequence are also put off limits. At line 10, the only lines I can use are lines
1–3. If you need any of the propositions derived within an indented sequence after
you discharge the relevant assumption, you have to rederive them. Given this restric-
tion, it is often useful to do as much work as you can before making an assumption.
3.7.6 shows how we can use CP sequentially to prove biconditionals. In such cases, you want ‘α ≡ β’, which is logically equivalent to ‘(α ⊃ β) ∙ (β ⊃ α)’. This method is not always the best one, but it is usually a good first thought.
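That equivalence claim is easy to confirm mechanically. Here is a quick Python check, my own sketch with Python booleans standing in for truth values, that ‘α ≡ β’ and ‘(α ⊃ β) ∙ (β ⊃ α)’ agree on all four valuations:

```python
from itertools import product

impl = lambda p, q: (not p) or q  # the material conditional ⊃

# ‘α ≡ β’ and ‘(α ⊃ β) ∙ (β ⊃ α)’ get the same truth value on
# every assignment to α and β.
agree = all(
    (a == b) == (impl(a, b) and impl(b, a))
    for a, b in product([True, False], repeat=2)
)
print(agree)  # True
```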
3.7.6 1. (B ∨ A) ⊃ D
2. A ⊃ ∼D
3. ∼A ⊃ B /B≡D
4. B ACP
5. B ∨ A 4, Add
6. D 1, 5, MP
7. B ⊃ D 4–6, CP
8. D ACP
9. ∼ ∼D 8, DN
10. ∼A 2, 9, MT
11. B 3, 10, MP
12. D ⊃ B 8–11, CP
13. (B ⊃ D) ∙ (D ⊃ B) 7, 12, Conj
14. B ≡ D 13, Equiv
QED
Notice that we start the second sequence at line 8 intending to derive ‘B’. We al-
ready have a ‘B’ in the proof at line 4. But that ‘B’ was a discharged assumption, and is
off limits after line 6.
You may also use CP in the middle of a proof to derive statements that are not your
main conclusion, as in 3.7.7.
3.7.7 1. P ⊃ (Q ∙ R)
2. (P ⊃ R) ⊃ (S ∙ T) /T
3. P ACP
4. Q ∙ R 1, 3, MP
5. R ∙ Q 4, Com
6. R 5, Simp
7. P ⊃ R 3–6, CP
8. S ∙ T 2, 7, MP
9. T ∙ S 8, Com
10. T 9, Simp
QED
Such uses are perhaps not common. But you can feel free to use a conditional proof
at any point in a derivation if you need a conditional claim.
Summary
We now have two derivation methods, a direct method and a conditional method. In
direct proofs we ordinarily construct our derivations by looking at the premises and
seeing what we can infer. Sometimes we work backward from our conclusions, figur-
ing out what we need, but that kind of work is done on the side, not within a proof.
When setting up conditional proofs, in contrast, we generally look toward our de-
sired conditionals, assuming the antecedent of some conditional we want, rather than
looking at what we have in the premises. We hope that our assumptions will work
with our premises, of course, and we proceed, after our assumptions, to use the ordi-
nary, direct methods. But in setting up our indented sequences, we focus on what we
want, thinking about how our assumption will be discharged.
In this section, the conditional derivation method was used within a direct proof, as a subsequence of formulas. In the next section, we’ll do some proofs
completely by the conditional derivation method. In the following section, we’ll look
at a third and final derivation method, indirect proof.
KEEP IN MIND
When you want to derive a conditional conclusion, you can assume the antecedent of the
conditional, taking care to indicate the presence of that assumption later.
Conditional proofs are especially useful when the conclusion of the argument is a conditional or a biconditional. For biconditionals, assume one side to derive the other side and discharge; do a second CP for the reverse (if necessary); then conjoin the two conditionals.
Indent and use a vertical line to mark the scope of an assumption.
After you discharge an assumption, you may not use statements derived within the scope of
that assumption later in the proof.
It is often useful to do what you can with a proof before making an assumption so that the
propositions you derive are available after you discharge your assumption.
You can use conditional proof at any point during a proof and anytime you need a condi-
tional statement, not just when the conclusion of the argument is a conditional.
EXERCISES 3.7a
Derive the conclusions of each of the following arguments
using the method of conditional proof where appropriate.
1. 1. (A ∨ C) ⊃ D
2. D ⊃ B /A⊃B
2. 1. X ⊃ Y
2. Y ⊃ Z / X ⊃ (Y ∙ Z)
3. 1. R ⊃ ∼O
2. ∼R ⊃ [S ∙ (P ∨ Q )] / O ⊃ (P ∨ Q )
4. 1. (E ∨ F) ∨ G
2. ∼F / ∼E ⊃ G
5. 1. L ⊃ M
2. L ⊃ N
3. (M ∙ N) ⊃ O /L⊃O
6. 1. Q ⊃ (∼R ∙ S) / R ⊃ ∼Q
7. 1. ∼M ⊃ N
2. L ⊃ ∼N / ∼L ∨ M
8. 1. I ⊃ H
2. ∼I ⊃ J
3. J ⊃ ∼H / J ≡ ∼H
9. 1. ∼M ∨ N
2. P / (M ∨ ∼P) ⊃ (O ∨ N)
26. 1. A ≡ (B ∙ ∼C)
2. C ⊃ (D ∙ E)
3. (D ∨ F) ⊃ G / (∼A ∙ B) ⊃ G
27. 1. (H ∨ J) ⊃ K
2. (I ∨ L) ⊃ M / (H ∨ I) ⊃ (K ∨ M)
28. 1. J ⊃ K
2. L ⊃ ∼K
3. ∼J ⊃ M
4. N ⊃ ∼O
5. ∼N ⊃ I
6. ∼O ⊃ L /M∨I
29. 1. D ⊃ (F ∨ G)
2. E ⊃ (F ∨ H)
3. I ⊃ ∼F
4. ∼H / (D ∨ E) ⊃ (I ⊃ G)
30. 1. (X ⊃ Y) ⊃ Z
2. (∼X ∨ Y) ≡ (A ∨ B)
3. ∼B ⊃ (D ⊃ A) / ∼Z ⊃ ∼D
31. 1. (K ∙ ∼L) ⊃ ∼M
2. M ∨ N
3. M ∨ O
4. ∼(N ∙ O) / ∼K ∨ L
32. 1. L ⊃ M
2. O ⊃ M
3. ∼N ⊃ (L ∨ O)
4. (M ∙ N) ⊃ K
5. ∼(J ⊃ K) / ∼M ≡ N
33. 1. I ⊃ (J ∨ K)
2. ∼J ∨ (∼I ∨ L)
3. L ⊃ ∼I /I⊃K
34. 1. (A ⊃ B) ⊃ (C ⊃ B)
2. A ⊃ ∼(B ⊃ D)
3. (A ⊃ ∼D) ⊃ C / B
35. 1. A ⊃ (∼B ∨ C)
2. ∼A ⊃ (B ∨ C)
3. C ⊃ ∼C / ∼(A ≡ B)
36. 1. (A ∙ B) ⊃ (C ∙ D)
2. (A ∙ C) ⊃ (E ∨ ∼D)
3. F ⊃ (E ⊃ G) / A ⊃ [B ⊃ (F ⊃ G)]
37. 1. (P ∙ Q ) ∨ (R ∙ S)
2. ∼P ∨ T
3. ∼Q ∨ W
4. T ⊃ (W ⊃ S) / ∼R ⊃ S
38. 1. X ⊃ [(T ∨ W) ⊃ S]
2. (W ⊃ S) ⊃ (Y ⊃ R)
3. ∼Z ⊃ ∼R / X ⊃ (Y ⊃ Z)
39. 1. ∼R ⊃ S
2. S ⊃ (R ∨ ∼P)
3. ∼(R ∨ P) ⊃ (Q ⊃ ∼S) / (P ∨ Q ) ⊃ R
40. 1. J ≡ (L ∨ M)
2. (M ∨ J) ≡ N
3. (L ⊃ N) ⊃ (K ≡ ∼K) / L ≡ (N ∨ K)
EXERCISES 3.7b
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the
arguments.
1. If Raul doesn’t play lacrosse, then he plays tennis. So, if Raul doesn’t play la-
crosse, then he plays either tennis or soccer.
2. It is not the case that either Polly or Ramon takes out the trash. So, if Owen
cleans his room, then Polly takes out the trash only if Quinn clears the table.
3. If Adams and Barnes are translators, then Cooper is a reviewer. Evans is an edi-
tor if either Cooper or Durning are reviewers. Hence, Adams being a translator
is a sufficient condition for Barnes being a translator only if Evans is an editor.
4. If it’s not the case that there are frogs in the pond, then George will go swim-
ming. So, if Eloise goes swimming and George does not, then either there are
frogs in the pond or hornets in the trees.
5. If Kip does well on his report card, then he will get ice cream. If Kip doesn’t do
well on his report card, then he’ll be jealous of his brother. So, Kip will either
get ice cream or be jealous.
6. If Lisa goes to Arizona, then she’ll go to Colorado. If she goes to Boulder, Colo-
rado, then she’ll go to Dragoon, Arizona. So, if she goes to Arizona and Boulder,
then she’ll go to Colorado and Dragoon.
7. If the train doesn’t come, then it is not the case that Shanti and Ricardo go to New York. So, if Ricardo goes to New York, then Shanti goes to New York only if the train comes.
8. If Justin goes to Ikea, then Luke doesn’t go. Either Luke goes to Ikea or Kate
sleeps on the floor. If either Kate or Madeline sleeps on the floor, then Justin
goes to Ikea. So, Justin goes to Ikea if, and only if, Kate sleeps on the floor.
9. If Aristotle’s Physics is right, then motion is goal-directed and everything has a
telos. But if everything is goal-directed, then other planets are unlike Earth. So,
if Aristotle’s Physics is right, then other planets are unlike Earth.
10. If nothing is worse for society than anarchy, then if people are mutually hostile,
then we need a central authority. But we do not need a central authority. So, if
nothing is worse for society than anarchy, then people are not mutually hostile.
11. If meanings are abstract objects or mental states, then if I believe that cats are
robots, then cats are robots. But cats are not robots. So, if meanings are mental
states, then I don’t believe that cats are robots.
12. If being a platonist entails rejecting empiricism, then Quine is not a platonist.
Being a platonist entails being an apriorist. Not rejecting empiricism entails not
being an apriorist. So, Quine is not a platonist.
13. If the common interest is imposed on individuals, then they are alienated or not
self-determining. But people are self-determining. So, if people are not alien-
ated, then the common interest is not imposed.
14. Either it is not the case that nothing is certain or we have unmediated access
to our mental states. If we have unmediated access to our mental states and
our basic beliefs are not secure, then either our mental states are potentially
misleading or we lack mental states. But if we lack mental states, then our basic
beliefs are secure. So, if nothing is certain and our basic beliefs are not secure,
then we have unmediated access to our mental states, but they are potentially
misleading.
15. Either some objects are beautiful or we impose cultural standards on artifacts.
It’s not the case that some particular proportions are best. So, if some objects be-
ing beautiful entails that some particular proportions are best, then if something
is aesthetically moving, then we impose cultural standards on artifacts.
16. If suicide is not legal, then we lack autonomy and the least powerful people do
not have self-determination. If education is universal and free, then the least
powerful people have self-determination. If only the privileged are educated,
then suicide is not legal. Either education is universal and free or only the
privileged are educated. So, suicide is legal if, and only if, the least powerful
people have self-determination.
Many proofs of logical truths involve nesting conditional proofs, as the derivation
3.8.2 does in showing that ‘(P ⊃ Q ) ⊃ [(Q ⊃ R) ⊃ (P ⊃ R)]’ is a logical truth.
3.8.2 1. P ⊃ Q ACP
2. Q ⊃ R ACP
3. P ⊃ R 1, 2, HS
4. (Q ⊃ R) ⊃ (P ⊃ R) 2–3, CP
5. (P ⊃ Q) ⊃ [(Q ⊃ R) ⊃ (P ⊃ R)] 1–4, CP
QED
Again, the conclusion is a conditional statement, but one that requires no premises
for its derivability. It is another logical truth. You can check that the theorems at 3.8.1
and 3.8.2 are logical truths by constructing truth tables for them, or for any of the
logical truths of this section. They will all be tautologies.
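Such truth-table checks can also be run mechanically. The following short Python script is an informal aid of my own, not part of the text’s apparatus; it runs the truth-table check for the theorem of 3.8.2:

```python
from itertools import product

impl = lambda p, q: (not p) or q  # the material conditional ⊃

# The theorem of 3.8.2: (P ⊃ Q) ⊃ [(Q ⊃ R) ⊃ (P ⊃ R)]
theorem = lambda P, Q, R: impl(impl(P, Q), impl(impl(Q, R), impl(P, R)))

# A logical truth of PL is true on every row of its truth table.
tautology = all(
    theorem(P, Q, R) for P, Q, R in product([True, False], repeat=3)
)
print(tautology)  # True
```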
Derivations of logical truths can look awkward when you are first constructing and
considering them. Remember, the logical truth we prove in 3.8.2 is conditional, and
doubly so: if P entails Q , then if Q entails R, then P entails R. So, while we have
demonstrated a logical truth out of thin air, the nature of that logical truth should
make the process seem less magical.
When the logical truth has nested conditionals, as 3.8.2 does, setting up the as-
sumptions can require care. But such logical truths are often simple to derive once
they are set up properly. Be especially careful not to use the assigned proposition in
the proof. The conclusion is not part of the derivation until the very end.
3.8.3 shows that ‘[P ⊃ (Q ⊃ R)] ⊃ [(P ⊃ Q ) ⊃ (P ⊃ R)]’ is a logical truth, using
three nested conditional sequences.
3.8.3 1. P ⊃ (Q ⊃ R) ACP (to prove (P ⊃ Q) ⊃ (P ⊃ R))
2. P ⊃ Q ACP (to prove (P ⊃ R))
3. P ACP (to prove R)
4. Q ⊃ R 1, 3, MP
5. Q 2, 3, MP
6. R 4, 5, MP
7. P ⊃ R 3–6 CP
8. (P ⊃ Q) ⊃ (P ⊃ R) 2–7, CP
9. [P ⊃ (Q ⊃ R)] ⊃ [(P ⊃ Q) ⊃ (P ⊃ R)] 1–8, CP
QED
A trivial, or degenerate, instance of CP can prove one of the simplest logical truths,
at 3.8.4.
3.8.4 1. P ACP
2. P ⊃ P 1, CP
QED
Notice that the CP at 3.8.4 has only one line. The second line discharges the as-
sumption; since the first and last line are the same, the antecedent and consequent of
the discharging formula are the same. It should be no surprise that a statement entails
itself. But a use of Impl and Com on that formula yields an instance of the standard
form of the law of excluded middle, at 3.8.5, one of the characteristic logical truths. The law of the excluded middle is that any claim of the form α ∨ ∼α is a tautology, a logical truth of PL.
3.8.5 P ∨ ∼P
The metalinguistic version of the law of the excluded middle is called bivalence, as we
saw in section 2.3. Bivalence, that every proposition is either true or false, and not both,
underlies the two-valued semantics of PL. (The middle that is excluded is any truth
value other than truth or falsity.) Bivalence has long been a controversial claim. Con-
sider the problem of future contingents; Aristotle’s example is, ‘there will be a sea battle
tomorrow’. Since we do not know today whether there will be a sea battle tomorrow, we
don’t know whether the statement is true or false and seem unable to assert either. We
surely could look back on the day after tomorrow to assign a truth value to the claim, but
as of today, it may not even have a truth value. Though even this simple logical truth is
controversial, our uses of CP do not raise these problems. The problem comes from the
semantics of PL, since every instance of the law of excluded middle is a tautology.
The second version of the error, not properly setting up the CP, is to take the desired
formula as a premise, as at 3.8.9. Then a CP can just prove the same formula you’ve
already assumed.
3.8.9 1. [P ⊃ (Q ⊃ R)] ⊃[(P ⊃ Q) ⊃ (P ⊃ R)] Premise
2. P ⊃ (Q ⊃ R) ACP
3. (P ⊃ Q ) ⊃ (P ⊃ R) 1, 2, MP
4. [P ⊃ (Q ⊃ R)] ⊃ [(P ⊃ Q) ⊃ (P ⊃ R)] 2–3, CP
In 3.8.9, the conclusion certainly follows from the premise. Line 4 is just a restate-
ment of line 1. Any statement entails itself! But what we want, as at 3.8.3, is a deriva-
tion of the logical truth with no premises at all.
The assumptions made in 3.8.6–3.8.9 all illustrate errors to avoid in constructing conditional proofs to demonstrate logical truths. If you learn to set up your CPs correctly, indenting and assuming only the antecedent of your desired conditional, you can easily avoid these mistakes, and the proofs tend to be quite simple.
Any consistent substitution instance of these new forms, ones in which each metalinguistic variable is replaced by the same wff of the object language throughout, will be a logical truth and provable in PL with no premises.
All ten rules of equivalence we have been using can also be turned into templates for constructing logical truths, even more easily. We can just replace the metalinguistic double arrow with the object-language symbol ‘≡’, as I did for Impl and one version of DM in 3.8.19.
3.8.19 (α ⊃ β) ≡ (∼α ∨ β)
∼(α ∨ β) ≡ (∼α ∙ ∼β)
Again, any substitution instance of these forms will be a logical truth.
These metalinguistic templates for logical truths are the kinds of rules one would
adopt in an axiomatic system of logic. The templates are called axiom schemas. Such
axiomatic theories can be constructed to derive the same logical theorems as our
PL, to have the same strength as our system of logic, often with many fewer rules of
inference or equivalence. Again, we are not using an axiomatic system, and we will
retain all twenty-five rules, as well as the direct, conditional, and indirect derivation
methods, the last of which is the subject of our next section.
Summary
The primary goal of this section was to show you how to construct proofs of logical
truths of PL, the theorems of propositional logic. Using conditional proof, we start by
indenting and assuming the antecedents of a conditional logical truth and then derive
the consequent. When we discharge our assumption, we have proven a formula of PL
without any premises.
The secondary goal of the section was to show the relation between our ordinary
proofs so far, which contain premises, and the proofs of logical truths. Since every
proof that assumes premises is convertible into a proof that does not, even the deriva-
tions that assume contingent premises can be seen as proofs of logical truths.
KEEP IN MIND
EXERCISES 3.8a
Convert each of the following arguments to a logical truth,
using either of the methods described above.
1. 1. ∼A ⊃ B
   2. ∼B / A
2. 1. ∼C ∨ D
   2. C / D
3. 1. E ∙ (F ∨ G)
   2. ∼E / G
4. 1. ∼(H ∨ I)
   2. J ⊃ I / ∼J
5. 1. K ∙ (∼L ∨ M)
   2. L ⊃ ∼K / M
6. 1. N ⊃ (P ∙ Q)
   2. ∼(O ∨ P) / ∼N
7. 1. R ⊃ S
   2. S ⊃ T
   3. ∼(T ∨ U) / ∼R
8. 1. V ⊃ W
   2. ∼W ∨ X
   3. V ∙ (Y ∙ Z) / X
9. 1. A ∨ (B ∙ C)
   2. A ⊃ D
   3. ∼(D ∨ E) / C
10. 1. F ⊃ G
    2. H ⊃ F
    3. H ∙ I / ∼G ⊃ I
EXERCISES 3.8b
Use conditional proof to derive each of the following logical
truths.
1. [A ∨ (B ∙ C)] ⊃ (A ∨ C)
2. [(A ⊃ B) ∙ C] ⊃ (∼B ⊃ ∼A)
3. (O ∨ P) ⊃ [∼(P ∨ Q ) ⊃ O]
4. [V ∙ (W ∨ X)] ⊃ (∼X ⊃ W)
5. [(P ∨ Q ) ∨ (R ∨ S)] ⊃ [(R ∨ Q ) ∨ (S ∨ P)]
6. [P ⊃ (Q ⊃ R)] ⊃ [(P ∙ ∼R) ⊃ ∼Q ]
7. [(P ∨ Q ) ∨ R] ⊃ [∼P ⊃ (∼Q ⊃ R)]
8. (P ⊃ Q ) ⊃ [(Q ⊃ S) ⊃ (∼S ⊃ ∼P)]
3.9: Indirect Proof
a premise, show that it leads to an unacceptable (or absurd) consequence, and then
conclude the opposite of our assumption. Indirect proof, like conditional proof, is
useful for proving logical truths.
We can see the justification for indirect proof by considering the arguments 3.9.1,
which we called explosion in section 3.5, and 3.9.2.
3.9.1 1. A ∙ ∼A /B
2. A 1, Simp
3. A ∨ B 2, Add
4. ∼A ∙ A 1, Com
5. ∼A 4, Simp
6. B 3, 5, DS
QED
3.9.2 1. B ⊃ (P ∙ ∼P) / ∼B
2. B ACP
3. P ∙ ∼P 1, 2, MP
4. P 3, Simp
5. P ∨ ∼B 4, Add
6. ∼P ∙ P 3, Com
7. ∼P 6, Simp
8. ∼B 5, 7, DS
9. B ⊃ ∼B 2–8, CP
10. ∼B ∨ ∼B 9, Impl
11. ∼B 10, Taut
QED
The moral of 3.9.1 is that anything follows from a contradiction in PL. The moral
of 3.9.2 is that if a statement entails a contradiction in PL, then its negation is
provable. Indirect proof is based on these two morals, and it captures a natural style
of inference: showing that some assumption leads to unacceptable consequences and
then rejecting the assumption.
To use an indirect proof, we assume the opposite of our desired conclusion and
derive a contradiction. When we get the contradiction, then we can infer the negation
of our assumption.
The last line of an indented sequence for indirect proof is always a contradiction. As
in section 2.5, a contradiction, for the purposes of indirect proof, is any statement of
the form α ∙ ∼α. The wffs listed in 3.9.3 are all contradictions.
3.9.3 P ∙ ∼P
∼ ∼P ∙ ∼ ∼ ∼P
∼(P ∨ ∼Q) ∙ ∼ ∼(P ∨ ∼Q)
We can assume any wff we want, for both CP and IP, by indenting and noting the
assumption. But only certain assumptions will discharge in the desired way. For CP,
we assume the antecedent of a desired conditional because when we discharge, the
first line of the assumption becomes the antecedent of the resulting conditional. For
IP, we always discharge the first line of the indented sequence with one more tilde. Thus, if we wish to prove the negation of a formula, we can just assume the formula itself.
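The semantic fact behind IP can also be confirmed by brute force. The Python sketch below is my own encoding, not part of the text; it checks that every valuation satisfying the premise of 3.9.2, ‘B ⊃ (P ∙ ∼P)’, makes ‘B’ false:

```python
from itertools import product

impl = lambda p, q: (not p) or q  # the material conditional ⊃

# Premise of 3.9.2: B ⊃ (P ∙ ∼P). Its consequent is a contradiction,
# so on every row where the premise is true, B must be false.
holds = all(
    not B
    for B, P in product([True, False], repeat=2)
    if impl(B, P and (not P))
)
print(holds)  # True
```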
3.9.4 is a sample derivation using IP. At line 3, we are considering what would follow
if the opposite of the conclusion is true. At line 6, we have found a contradiction, and
so we discharge our assumption at line 7.
3.9.4 1. A ⊃ B
2. A ⊃ ∼B / ∼A
3. A AIP
4. B 1, 3, MP
5. ∼B 2, 3, MP
6. B ∙ ∼B 4, 5, Conj
7. ∼A 3–6, IP
QED
Since the discharge step of an indirect proof requires an extra ‘∼’, we often need to use DN at the end of an indirect proof, as in 3.9.5.
3.9.5 1. F ⊃ ∼D
2. D
3. (D ∙ ∼E) ⊃ F /E
4. ∼E AIP
5. D ∙ ∼E 2, 4, Conj
6. F 3, 5, MP
7. ∼D 1, 6, MP
8. D ∙ ∼D 2, 7, Conj
9. ∼ ∼E 4–8, IP
10. E 9, DN
QED
In addition to deriving simple statements and negations, the method of indirect
proof is especially useful for proving disjunctions, as in 3.9.6. Assuming the negation
of a disjunction leads quickly, by DM, to two conjuncts that you can simplify.
3.9.6 1. ∼A ⊃ (B ⊃ C)
2. C ⊃ D
3. B /A∨D
4. ∼(A ∨ D) AIP
5. ∼A ∙ ∼D 4, DM
6. ∼A 5, Simp
7. B ⊃ C 1, 6, MP
8. ∼D ∙ ∼A 5, Com
9. ∼D 8, Simp
10. ∼C 2, 9, MT
11. C 7, 3, MP
12. C ∙ ∼C 11, 10, Conj
13. ∼ ∼(A ∨ D) 4–12, IP
14. A ∨ D 13, DN
QED
Indirect proof is compatible with conditional proof. Indeed, the structure of many
mathematical proofs involves making a conditional assumption, and then assuming
the opposite of a desired conclusion to get a contradiction. 3.9.7 is a formal example
of exactly this procedure, nesting an IP within a CP.
3.9.7 1. E ⊃ (A ∙ D)
2. B ⊃ E / (E ∨ B) ⊃ A
3. E ∨ B ACP
4. ∼A AIP
5. ∼A ∨ ∼D 4, Add
6. ∼(A ∙ D) 5, DM
7. ∼E 1, 6, MT
8. B 3, 7, DS
9. ∼B 2, 7, MT
10. B ∙ ∼B 8, 9, Conj
11. ∼ ∼A 4–10, IP
12. A 11, DN
13. (E ∨ B) ⊃ A 3–12, CP
QED
Essentially the same proof structure could have been used with a single assump-
tion of the negation of the whole desired conclusion, as a single IP without using CP.
I begin that alternative at 3.9.8.
3.9.8 1. E ⊃ (A ∙ D)
2. B ⊃ E / (E ∨ B) ⊃ A
3. ∼[(E ∨ B) ⊃ A] AIP
4. ∼[∼(E ∨ B) ∨ A] 3, Impl
5. ∼ ∼(E ∨ B) ∙ ∼A 4, DM
6. (E ∨ B) ∙ ∼A 5, DN
Now the proof can proceed as it did from line 5 in 3.9.7. Either method is acceptable,
though some find the nested structure of 3.9.7 both clearer and more conceptually
useful. You can even nest indirect proofs within one another, though such measures
are rarely warranted.
When first learning to use IP, it is typical to try to invoke it as if it were magic, turn-
ing statements into their negations. Be very careful with your negations and with the
structure of indirect proofs. DN always adds or subtracts pairs of consecutive tildes.
IP always places a single tilde in front of the formula you assumed in the first line of
your indented sequence after that sequence ends in a contradiction.
Like conditional proof, the method of indirect proof is easily adapted to proving
logical truths. To prove that ‘∼[(X ≡ Y) ∙ ∼(X ∨ ∼Y)]’ is a logical truth, as in 3.9.9, we
again start with an assumption, the opposite of the theorem we wish to prove.
3.9.9 1. (X ≡ Y) ∙ ∼(X ∨ ∼Y) AIP
2. X ≡ Y 1, Simp
3. (X ⊃ Y) ∙ (Y ⊃ X) 2, Equiv
4. ∼(X ∨ ∼Y) ∙ (X ≡ Y) 1, Com
5. ∼(X ∨ ∼Y) 4, Simp
6. ∼X ∙ ∼ ∼Y 5, DM
7. ∼X ∙ Y 6, DN
8. (Y ⊃ X) ∙ (X ⊃ Y) 3, Com
9. Y ⊃ X 8, Simp
10. ∼X 6, Simp
11. ∼Y 9, 10, MT
12. Y ∙ ∼X 7, Com
13. Y 12, Simp
14. Y ∙ ∼Y 13, 11, Conj
15. ∼[(X ≡ Y) ∙ ∼(X ∨ ∼Y)] 1–14, IP
QED
3.9.10 is another example of using IP to derive a logical truth, ‘(P ⊃ Q ) ∨ (∼Q ⊃ P)’.
Since our desired formula this time is a disjunction, an indirect proof quickly yields,
by a use of DM, two simpler formulas with which to work. Since the assumption is a
formula with a negation, though, we have to use DN at the end (line 17) to get our
desired formula.
Summary
We now have three derivation methods: direct, conditional, and indirect. Indirect
proof is both a useful, legitimate tool of inference in classical systems like ours and
the last hope of the desperate. If you are stuck in a proof and cannot see how to get
your conclusion, it is often very useful just to assume the opposite of what you want
and derive whatever you can, looking for a contradiction. The result might not be the
most efficient derivation, but as long as you do not misuse any rules, the derivation
will be legitimate.
We now have two different kinds of assumptions: assumptions for conditional
proof and assumptions for indirect proof. These assumptions are really no different.
Indeed, you might think of indirect proof as a conditional proof of a formula whose
consequent is a contradiction. Since the antecedent entails a contradiction, we know
that the first line of the indented sequence is false, and we can, given bivalence, con-
clude its opposite.
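That way of viewing indirect proof corresponds to a tautology of PL: if the opposite of a statement entails a contradiction, the statement itself holds. A brief Python check, my own illustration, confirms that ‘[∼P ⊃ (Q ∙ ∼Q)] ⊃ P’ is true on every valuation:

```python
from itertools import product

impl = lambda p, q: (not p) or q  # the material conditional ⊃

# If the opposite of P entails a contradiction, P itself holds:
# [∼P ⊃ (Q ∙ ∼Q)] ⊃ P is true on every valuation.
schema = lambda P, Q: impl(impl(not P, Q and (not Q)), P)

tautology = all(schema(P, Q) for P, Q in product([True, False], repeat=2))
print(tautology)  # True
```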
It is natural, especially at first, to wonder about which derivation method to use in
any particular derivation. Some guidelines are generally useful, though they should
not be taken as inviolable rules.
KEEP IN MIND
For indirect proof, assume (in the first indented line) the opposite of your desired
conclusion.
The last line of an indented sequence for IP should always be a contradiction.
A contradiction is any statement of the form α ∙ ∼α.
For IP, always discharge the first line of the indented sequence with one more tilde.
Logical truths may be proven using either CP or IP.
You may use indirect proof whenever you are stuck in a derivation.
EXERCISES 3.9a
Derive the conclusions of the following arguments using
conditional proof and/or indirect proof where appropriate.
1. 1. U ⊃ (V ∨ W)
2. ∼(W ∨ V) / ∼U
2. 1. Y ∨ ∼Z
2. ∼X ∨ Z /X⊃Y
3. 1. A ⊃ B
2. ∼(C ∨ ∼A) /B
4. 1. L ⊃ M
2. L ∨ O /M∨O
5. 1. A ∨ ∼B
2. (B ∨ C) ⊃ ∼A / ∼B
6. 1. F ⊃ (E ∨ D)
2. ∼E ∙ (∼D ∨ ∼F) / ∼F
7. 1. M ⊃ L
2. ∼(K ∙ N) ⊃ (M ∨ L) / K ∨ L
8. 1. H ⊃ G
2. H ∨ J
3. ∼(J ∨ ∼I) /G∙I
9. 1. X ⊃ Y
2. ∼(Z ⊃ W) / X ⊃ (Y ∙ Z)
10. 1. ∼(G ⊃ H) ⊃ ∼F
2. G ∙ (F ∨ H) /H
11. 1. B ≡ (A ∙ D)
2. ∼A ⊃ (∼B ⊃ C) /A∨C
12. 1. P ≡ (Q ∨ ∼R)
2. T ∙ ∼(Q ∙ P) / ∼(P ∙ R)
13. 1. (C ∨ ∼B) ⊃ (∼D ⊃ A)
2. (A ∨ B) ≡ D /D
14. 1. X ⊃ T
2. Y ⊃ T
3. T ⊃ Z / (X ∨ Y) ⊃ Z
15. 1. S ⊃ T
2. S ∨ (∼R ∙ U) /R⊃T
16. 1. A ≡ (B ∙ D)
2. C ⊃ (E ∨ F)
3. A ∨ ∼E
4. A ∨ ∼F /C⊃B
17. 1. M ⊃ (L ∙ ∼P)
2. K ⊃ ∼(O ∙ ∼P)
3. N ⊃ O / (K ∙ M) ⊃ ∼N
18. 1. A ⊃ B
2. ∼C ⊃ ∼(A ∨ ∼D)
3. ∼D ∨ (B ∙ C) / A ⊃ (B ∙ C)
19. 1. Z ⊃ Y
2. Z ∨ W
3. Y ⊃ ∼W
4. W ≡ ∼X /X≡Y
20. 1. W ≡ (X ∙ Z)
2. ∼(∼X ∙ ∼W) /Z⊃W
21. 1. ∼[J ∨ (F ∙ ∼H)]
2. ∼G ⊃ ∼H
3. G ∨ [∼F ⊃ (J ∙ K)] / E ∨ G
22. 1. (G ∙ ∼H) ⊃ F
2. G / (H ∨ F) ∙ G
23. 1. Y ≡ ∼(V ∙ X)
2. ∼W ⊃ ∼V
3. ∼(Y ⊃ ∼V) / ∼(W ⊃ X)
24. 1. ∼(I ⊃ J) ⊃ ∼F
2. (F ∨ H) ∙ (G ∨ I)
3. ∼H ⊃ ∼J /H∨G
25. 1. K ⊃ (L ∙ I)
2. ∼(J ⊃ M)
3. L ⊃ (∼K ∨ ∼I) / ∼[ J ⊃ (M ∨ K)]
26. 1. ∼(∼E ∙ ∼H) ∨ I
2. (E ∙ ∼I) ⊃ (H ∙ G) /H∨I
27. 1. (T ⊃ U) ∙ (S ⊃ V)
2. [V ⊃ (∼T ⊃ W)] ⊃ ∼U
3. S / ∼T ∙ V
28. 1. M ⊃ (O ⊃ L)
2. ∼[(∼O ∙ ∼K) ≡ (L ∨ M)] / L ∨ ∼O
29. 1. P ⊃ (Q ∙ R)
2. ∼Q ⊃ R
3. (∼R ≡ ∼Q ) ∨ P / ∼(Q ⊃ ∼R)
30. 1. A ≡ ∼(B ∨ C)
2. (D ∨ E) ⊃ ∼C
3. ∼(A ∙ D) /D⊃B
31. 1. U ⊃ (P ∙ ∼Q )
2. T ⊃ (S ∨ U)
3. ∼T ⊃ ∼R / (P ⊃ Q ) ⊃ (R ⊃ S)
32. 1. B ⊃ C
2. E ≡ ∼(B ∨ A)
3. D ⊃ ∼E / D ⊃ (A ∨ C)
33. 1. F ⊃ (K ≡ M)
2. ∼F ⊃ [L ⊃ (F ≡ H)]
3. ∼(M ∨ ∼L)
4. ∼H ⊃ ∼(∼K ∙ L) /F≡H
34. 1. ∼P ∨ R
2. ∼P ⊃ ∼(N ⊃ ∼Q )
3. ∼R ≡ (P ∨ O) /Q∙N
35. 1. ∼(R ∙ U) ⊃ T
2. [R ⊃ ∼(S ∙ ∼Q )] ⊃ ∼T / R ∙ (S ∨ U)
36. 1. ∼L ⊃ ∼K
2. N ∙ ∼(K ∙ L) / ∼[K ∨ ∼(J ⊃ N)]
37. 1. (L ⊃ ∼J) ∨ (K ∙ M)
2. (∼M ⊃ K) ⊃ ( J ∙ L) /K≡M
38. 1. (E ⊃ ∼A) ⊃ B
2. [(A ∙ D) ⊃ ∼C] ⊃ ∼B / A ∙ (C ∨ E)
39. 1. ∼E ⊃ ∼(A ⊃ C)
2. (∼D ∙ A) ⊃ (B ∙ ∼B)
3. ∼(∼A ∙ E) /D
40. 1. V ⊃ (T ∙ ∼W)
2. (T ⊃ W) ⊃ (∼X ∨ ∼Y)
3. ∼[∼(V ∨ Y) ∨ ∼(V ∨ X)] / ∼(T ⊃ W)
EXERCISES 3.9b
Translate each of the following paragraphs into arguments
written in PL. Then, derive the conclusions of the
arguments.
1. If Lorena makes quiche, then she’ll make potatoes. She either doesn’t make po-
tatoes or doesn’t make quiche. So, she doesn’t make quiche.
2. Stephanie either plays miniature golf and not netball, or she goes to the ocean.
She doesn’t play miniature golf. So, she goes to the ocean.
3. If Grady eats quickly, then he’ll get hiccups. If he gets hiccups, then he’ll suck
on an ice cube and will not eat quickly. So, Grady doesn’t eat quickly.
4. If either Xander or Yael go to the water park, then Vivian will go. Winston go-
ing to the water park is sufficient for Vivian not to go. So, if Winston goes to the
water park, then Xander will not.
5. If Esme grows olives, then she grows mangoes. She grows either olives or nec-
tarines. So, she grows either mangoes or nectarines.
6. Having gorillas at the circus entails that there are elephants. There are either
gorillas or hippos. Having fancy ponies means that there are no hippos. Thus,
either there are elephants or there are no fancy ponies.
7. If the house is painted ivory and not green, then it will appear friendly. The
neighbors are either happy or jealous. If the neighbors are jealous, then the
house will be painted ivory. So, if it is not the case that either the house appears
friendly or it is painted green, then the neighbors will be happy.
8. If tank tops are worn in school, then the rules are not enforced. It is not the
case that either short skirts or very high heels are in the dress code. Tank tops
are worn in school, and either uniforms are taken into consideration or the
rules are not enforced. So, it is not the case that either the rules are enforced or
short skirts are in the dress code.
9. If we are just, we help our friends. If we are unjust, we harm our enemies. So, we
either help our friends or harm our enemies.
10. If beauty does not increase with familiarity, then it either is intellectual perfection or a manifestation of secret natural laws. But beauty is not intellectual perfection. If it’s a manifestation of secret natural laws, then it is intellectual perfection. So, beauty increases with familiarity.
11. If I am my body, then I am constantly changing. If I am my conscious aware-
ness, then I am sometimes changing. If I am either constantly or sometimes
changing, then I do not have to repay my debts. But I do have to repay my debts.
So, I am not my body and I am not my conscious awareness.
12. If there are no atoms, then multiplicity is an illusion. If there are no atoms, we
can’t explain physical phenomena. Either we can explain physical phenomena
or there is a physical world. Either there is no physical world or multiplicity is
not an illusion. So, there are atoms.
13. If everything is either simple or real, then either causation is observable or time
is an illusion. But time is no illusion. So, if everything is simple, then causation
is observable.
14. Truth is not both correspondence of words to reality and consistency. If truth
is not consistency, then we do not know whether our sentences are true and we
are threatened with solipsism. If we have a good semantic theory, then we know
whether our sentences are true. So, if truth is correspondence of words to real-
ity, then we don’t have a good semantic theory.
15. If life is not all suffering, then we can be compassionate. If we can be compas-
sionate or have empathy, then we are emotionally vulnerable. It is not the case
that our sentience entails that we are emotionally vulnerable. So, life is all
suffering.
16. If morality is relative, then it is either subjective or culturally conditioned. If
morality is absolute, then either it is intuitive or not culturally conditioned.
If morality is not intuitive, then it is not subjective. So, if morality is relative and
not intuitive, then it is not absolute.
EXERCISES 3.9c
Use conditional or indirect proof to derive each of the
following logical truths.
1. ∼(∼P ∨ ∼Q) ⊃ P
2. [∼P ∨ (Q ∙ R)] ⊃ (Q ∨ ∼P)
3.10: Chapter Review
3. ∼(P ≡ ∼P)
4. (P ∨ Q) ∨ (∼P ∙ ∼Q)
5. A ∨ (B ∨ ∼A)
6. C ∨ (C ⊃ D)
7. ∼(P ∙ Q) ∨ P
8. ∼P ∨ (P ∨ Q)
9. ∼[(I ⊃ ∼I) ∙ (∼I ⊃ I)]
10. J ≡ [J ∨ (J ∙ K)]
11. (∼P ≡ Q) ≡ [(∼P ∙ Q) ∨ (P ∙ ∼Q)]
12. [(∼P ∨ Q) ∙ (∼P ∨ R)] ∨ [P ∨ (∼Q ∙ ∼R)]
13. (P ∨ ∼Q) ∨ (∼P ∨ R)
14. (E ⊃ F) ∨ (F ⊃ E)
15. (G ⊃ H) ∨ (∼G ⊃ H)
16. (L ≡ ∼M) ≡ ∼(L ≡ M)
17. (P ⊃ Q) ≡ (Q ∨ ∼P)
18. (∼P ≡ Q) ∨ (∼P ∨ Q)
19. [(P ∙ Q) ∙ ∼R] ∨ [(P ∙ ∼Q) ∨ (∼P ∨ R)]
20. [(P ∙ ∼Q) ∨ (R ∙ ∼S)] ∨ [(Q ∙ S) ∨ (∼P ∨ ∼R)]
Proof Strategies
Sometimes, when faced with the challenge of deriving the conclusion of an argument
or proving a logical truth, we can quickly see our way through to the end. Other times,
we get stuck. At such times, it is useful to work off to the side of the proof, or on scratch
paper, trying different strategies. In particular, it can often be useful to work back-
ward from our desired conclusions. Here are some useful strategies worth keeping in
mind, for various kinds of conclusions. They can work for the main conclusions of an
argument, or to get propositions that you see you need along the way. It is not a com-
plete list, but it collects some of the most reliable strategies.
If your desired conclusion is a simple propositional letter or a negation of one, it is
useful to see where that letter exists in the premises. If it is in the consequent of a
conditional, try to derive the antecedent of that conditional, so you can use MP. If it is
in the antecedent of a conditional, try to derive the negation of the consequent, so you
can use MT. If it is part of a disjunction, try to get the negation of the other disjunct,
so you can use DS. You might also try an indirect proof, starting with the negation of
your desired conclusion. Sometimes, though much less frequently, you can use Taut
on statements of the form α ∨ α.
If your desired conclusion is a conjunction, it is typical to derive each conjunct sepa-
rately. Remember that conjunctions are the negations of disjunctions, by DM, so that
statements of the form ∼(α ∨ β) turn into statements of the form ∼α ∙ ∼β.
You can sometimes derive a disjunction merely by deriving one of the disjuncts and
using Add for the other. If that fails, CD can be useful. Since Impl allows us to turn
statements of the form α ∨ β into statements of the form ∼α ⊃ β, conditional proof
can be effective with disjunctions, too. And an indirect proof of a disjunction allows
you quickly to get two simpler statements. One use of DM on the negation of a state-
ment of the form α ∨ β yields ∼α ∙ ∼β; you can simplify either side.
Conditional proof is often effective in proving conditionals, especially for logical
truths. Don’t forget HS, especially when you are given a few conditionals in the prem-
ises. Cont can help you set up HS properly. DM can turn disjunctions into condition-
als on which you can use HS too.
Lastly, while there are many rules for deriving biconditionals in section 3.6, it re-
mains typical to derive each of the two component conditionals and then conjoin
them. CP can help with each side, though you should try first to see if you really need
CP; sometimes derivations are quicker without it.
To determine whether it is a logical truth, we can just construct a truth table and see
whether it comes out false in any row, as we did in section 2.5. Perhaps more easily, we
can attempt to construct a derivation, as I begin to do at 3.10.2.
3.10.2 1. P ⊃ (R ⊃ Q) ACP (to prove P ⊃ Q)
2. P ACP (to prove Q)
3. R ⊃ Q 1, 2, MP
At this point, I don’t see any further helpful inferences and I begin to worry that I
might have a contingent (or even contradictory) proposition on my hands. I turn to
my semantic techniques: Can I construct a valuation that makes 3.10.1 false?
P Q R [P ⊃ (R ⊃ Q)] ⊃ (P ⊃ Q)
To make the proposition false, I have to make the antecedent true and the conse-
quent false. To make the consequent false, I must make P true and Q false. I can carry
these values through the formula.
P Q R   [P ⊃ (R ⊃ Q)] ⊃ (P ⊃ Q)
1 0      1        0       1 0 0
If 3.10.1 were a logical truth, I would not be able to make the antecedent true. But if
I take R to be false, the antecedent comes out true and the whole formula comes out
false. We have a valuation that shows that 3.10.1 is not a logical truth.
Valid or Invalid?
We can use a similar combination of the methods of chapters 2 and 3 when given an
argument that we do not know to be valid or invalid, like 3.10.3.
3.10.3 1. P ≡ Q
2. ∼P ∨ R
3. R ⊃ S / ∼Q ∙ S
We might try to derive the conclusion, as I do at 3.10.4.
3.10.4 1. P ≡ Q
2. ∼P ∨ R
3. R ⊃ S / ∼Q ∙ S
4. (P ⊃ Q) ∙ (Q ⊃ P) 1, Equiv
5. (Q ⊃ P) ∙ (P ⊃ Q) 4, Com
6. Q ⊃ P 5, Simp
7. ∼P ⊃ ∼Q 6, Cont
8. ∼Q ∨ S 7, 3, 2, CD
At this point, despite my ingenuity in using CD, I begin to suspect that the argu-
ment is invalid. I could try an indirect proof, but with the conjunction in the conclu-
sion, it doesn’t seem promising. If the argument is invalid, I should be able to construct
a counterexample. I turn to that task next.
There’s no obvious place to start, so I’ll start with the first premise, which is true
either when P and Q are both true or when they are both false.
P Q R S   P ≡ Q   ∼P ∨ R   R ⊃ S   ∼Q ∙ S
1 1       1 1 1   0 1              0 1 0
0 0       0 1 0   1 0              1 0
In the first row, our conclusion is already false, so we just need to make the second
and third premises true. If we take R to be false, we make the third premise true, but
the second premise is false. But if we take R to be true, we can make both premises true
by taking S to be true. We have a counterexample when all atomic formulas are true.
P Q R S   P ≡ Q   ∼P ∨ R    R ⊃ S   ∼Q ∙ S
1 1 1 1   1 1 1   0 1 1 1   1 1 1   0 1 0 1
0 0       0 1 0   1 0               1 0
Since the argument has a counterexample, it is invalid. Since all and only valid argu-
ments are provable in our system of deduction, the attempted derivation at 3.10.4 was
indeed quixotic.
To complete chapter 3, then, and our study of PL, use the tools from both chapters
2 and 3 on the exercises below, which give you arguments and propositions without
telling you whether they are valid or invalid, logical truths or not.
EXERCISES 3.10a
Determine whether each of the following arguments is valid
or invalid. If it is valid, provide a derivation of the
conclusion. If it is invalid, provide a counterexample.
1. 1. A ≡ C
2. C ⊃ (D ∨ B)
3. D / A ⊃ B
2. 1. E
2. (E ∨ G) ⊃ H
3. H ⊃ F
4. (F ∙ E) ⊃ ∼G / ∼G
3. 1. L ⊃ I
2. I ⊃ (K ⊃ J)
3. K ⊃ L / J ⊃ L
4. 1. M ⊃ N
2. N ≡ ∼O
3. ∼N ⊃ (M ∙ O) / ∼N
5. 1. (Q ∨ R) ≡ ∼P
2. Q ∨ S
3. P / S ∙ R
6. 1. X ⊃ W
2. W ⊃ X
3. Y
4. (Z ∙ Y) ⊃ ∼X / ∼X
7. 1. (A ∙ B) ⊃ (C ∙ D)
2. ∼C
3. B
4. A ∨ (∼D ∙ ∼B) / ∼D
8. 1. E ∨ F
2. ∼F ∨ G
3. E ≡ G
4. F ⊃ (G ∨ E) / F
9. 1. P ⊃ Q
2. R ∨ S
3. ∼R
4. Q ⊃ S / ∼P
10. 1. Z ≡ ∼X
2. ∼X ∨ Y
3. W ∙ ∼Y / Z ∙ W
11. 1. A ≡ B
2. ∼B
3. C
4. (D ∙ C) ⊃ ∼(A ∨ D) / ∼A ∙ ∼D
12. 1. F ≡ (H ∙ I)
2. ∼H ∨ ∼I
3. ∼F ⊃ G
4. G ⊃ E /E
13. 1. ∼P ⊃ R
2. Q ⊃ ∼R
3. (∼P ∙ Q) ∨ S
4. S ≡ T
5. T ⊃ ∼Q / ∼Q
14. 1. (W ∙ X) ⊃ Y
2. Y ⊃ (Z ∨ ∼X)
3. ∼Z / ∼(W ∙ X)
15. 1. ∼A ⊃ ∼B
2. A ⊃ (C ∙ D)
3. (C ∙ D) ≡ A /A
16. 1. ∼(E ∨ F)
2. H ≡ F
3. (H ∙ G) ∨ (H ∙ I) / ∼(G ⊃ E)
17. 1. J ≡ K
2. ∼J ∙ L
3. M ⊃ J
4. N ⊃ (K ∨ M) / L ∙ ∼N
18. 1. P ⊃ Q
2. P ∨ R
3. Q ⊃ ∼R
4. R ≡ ∼S / S ≡ Q
19. 1. ∼W ∨ X
2. Y ⊃ X
3. Y ⊃ ∼(Z ∙ X) / ∼Z ∨ ∼X
21. 1. P ≡ (∼Q ∙ R)
2. (R ⊃ Q) ⊃ S
3. S ⊃ T
4. S ⊃ ∼T
5. P ⊃ (T ≡ ∼X) / ∼(X ≡ T)
23. 1. (E ∙ F) ⊃ (G ∙ H)
2. ∼G ∨ ∼H
3. F
4. I ⊃ (J ⊃ E) / ∼I ∨ ∼J
24. 1. K ⊃ (∼L ⊃ M)
2. N ∨ K
3. L ⊃ ∼N / M ∨ L
25. 1. ∼Z ⊃ Y
2. Z ⊃ ∼X
3. X ∨ ∼Z
4. Y ⊃ A
5. X ⊃ ∼A / ∼X
EXERCISES 3.10b
Determine whether each of the following propositions is a
logical truth. If it is a logical truth, provide a proof using
our system of natural deduction. If it is not a logical truth,
provide a valuation that makes the statement false.
1. (G ∨ G) ⊃ G
2. (T ∨ ∼T) ⊃ T
3. (P ∙ Q) ⊃ (P ∨ Q)
4. (R ∨ S) ⊃ (R ∙ S)
5. [(A ∨ ∼B) ∙ ∼A] ⊃ B
6. [(C ∨ ∼D) ∙ ∼C] ⊃ (∼D ∨ E)
7. [(A ⊃ B) ∙ (B ⊃ C)] ⊃ (∼C ⊃ ∼A)
8. [E ⊃ (F ⊃ G)] ⊃ [F ⊃ (E ⊃ G)]
9. [(H ∨ I) ⊃ K] ⊃ [(H ∙ I) ⊃ K]
10. [(J ∙ L) ⊃ M] ⊃ [(J ∨ L) ⊃ M]
11. ∼(R ⊃ S) ≡ (T ⊃ R)
12. ∼(P ⊃ Q) ≡ (P ∨ ∼Q)
13. ∼(X ⊃ Y) ⊃ (Y ⊃ Z)
14. [(S ∨ T) ∙ ∼T] ⊃ (S ⊃ R)
15. [(P ∨ Q) ∙ ∼P] ⊃ [(Q ⊃ R) ⊃ R]
16. [P ⊃ (Q ∨ S)] ⊃ (∼Q ⊃ ∼P)
17. [J ≡ (K ∙ L)] ⊃ [(J ⊃ K) ∙ (K ⊃ J)]
18. ∼(A ∨ ∼B) ⊃ [(A ⊃ C) ∙ (C ⊃ B)]
19. [G ≡ (H ∨ I)] ⊃ [(H ⊃ G) ∙ (I ⊃ G)]
20. [A ≡ (B ∙ C)] ⊃ [(A ≡ B) ∙ (A ≡ C)]
21. (E ∨ F) ⊃ {(E ⊃ H) ⊃ [(F ⊃ H) ⊃ H]}
22. [D ≡ (E ∨ F)] ⊃ [(D ⊃ E) ∙ (D ⊃ F)]
23. [(W ≡ X) ⊃ (Y ≡ Z)] ⊃ [(Y ≡ ∼Z) ⊃ (∼W ≡ X)]
24. [(P ∨ Q ) ⊃ (R ∙ S)] ⊃ [(P ⊃ R) ∙ (Q ⊃ S)]
25. [(W ∙ X) ⊃ (Y ∙ Z)] ⊃ [(W ⊃ X) ⊃ Y]
KEY TERMS
4.1: Introducing Predicate Logic
A predicate logic includes singular terms, predicates, and quantifiers.

In predicate logic, we extend the vocabulary. We retain the same propositional operators and punctuation. But the terms are more complex, revealing some subpropositional logical relations:
Complex statements made of singular terms and predicates
Quantifiers
Five propositional operators
Punctuation
Our study of predicate logic starts with a simple language, which I will call M, for monadic predicate logic.

The predicates used in 4.1.3 and 4.1.4, and generally in M, are called one-place predicates since they are followed by only one singular term; we call M monadic because its predicates take exactly one singular term. In section 5.1, we will extend our uses of predicates, using capital letters followed by any number of singular terms to stand for relations among various objects.
Returning to 4.1.1, we can now regiment the second premise and the conclusion.
Emily is a philosopher. Pe
Emily is happy. He
To finish translating the argument in M, we must deal with the first premise, which
is not about a single thing and so cannot be translated using a constant. We can use a
variable, but variables are themselves insufficient to complete a proposition. ‘Px’ just
means that x is a philosopher and ‘Hx’ just means that x is happy. Those claims are,
by themselves, ambiguous among claims that something is a philosopher, nothing
is a philosopher, or everything is a philosopher; and among claims that something is
happy, nothing is happy, or everything is happy. We need to disambiguate.
Frege thought of predicates as functions from singular terms to complete proposi-
tions. He put the singular terms after the predicates in imitation of the mathematical
practice of putting a function in front of its argument: f(3) or g(x). (See section 5.6 for
more on functions.) We follow Frege, writing ‘Pe’ for ‘Emily is a philosopher’ instead
of ‘eP’, or ‘Ep’, either of which might be a bit more natural.
Just as a function needs an argument, a proposition expressed by a predicate has
a hole in it, which must be filled with a singular term. When the singular term is a
constant, we have a complete proposition, as at 4.1.3. But when the singular term is
a variable, as at 4.1.4, we have to complete the proposition by indicating more about
the variable, disambiguating among something, nothing, and everything. We do that
with quantifiers.
Quantifiers
The subject of ‘All philosophers are happy’ is not a specific philosopher. No specific object is mentioned. Similarly, in ‘Something is made in the USA’, there is no specific thing to which the sentence refers. For sentences like these, we use quantifiers to bind and modify our singular terms. There are two quantifiers, existential and universal, which always appear with a variable.

(∃x), (∃y), (∃z), (∃w), (∃v)
(∀x), (∀y), (∀z), (∀w), (∀v)

Quantifiers are operators that work with variables to stand for terms like ‘something’, ‘everything’, ‘nothing’, and ‘anything’. They may be existential or universal.
Existential quantifiers are used to represent expressions like the following:
There exists a thing such that
For some thing
There is a thing
For at least one thing
Something
and is not mortal; the second option says that it is not the case that all gods are mortal,
which would be the case only if some god is not mortal. The two forms are logically
equivalent. In parallel, the first version at 4.1.17 says that everything that is a frog is
not a person, whereas the second says, equivalently, that it is not the case that there is
something that is a frog and a person. Notice that even with the negation, the univer-
sal statement is a conditional and the existential statement is a conjunction. Later, in
section 4.5, we will move between these equivalent translations.
Summary
The goal of this section is to start you translating between English and monadic predi-
cate logic. When faced with a sentence of English, you first have to ask whether it uses
constants (if it names particular objects) or quantifiers and variables (if it uses
quantifier terms like ‘all’, ‘some’, ‘none’, ‘any’, or ‘only’). Some sentences will use both
constants and variables.
The main subformulas of universally quantified sentences (after their quantifiers)
are ordinarily conditionals, with subjects as their antecedents and attributes as their
consequents. The main subformulas of existentially quantified sentences are ordinar-
ily conjunctions; the order of the subject and attribute does not matter.
Remember that sentences containing ‘nothing’ and related quantifiers can be trans-
lated either using a universal quantifier, with a negation embedded inside the formula,
or using the negation of an existentially quantified sentence.
Chapter 4: Monadic Predicate Logic
KEEP IN MIND
Predicate logic extends propositional logic with predicates, singular terms, and quantifiers.
Singular terms may be constants, standing for particular things, or variables, which must be
modified by quantifiers to form a closed sentence that expresses a complete proposition.
Quantifiers may be existential or universal.
Statements with quantifiers and negations can be translated in at least two different ways.
Start translating into M by asking whether the sentence is universal or existential.
Think of English sentences in terms of the ordinary rules of subject-predicate grammar.
The subject of the proposition is what we are talking about.
The attribute of the proposition is what we are saying about it.
The subject of a sentence is the antecedent in a universally quantified statement or the first
conjunct in an existentially quantified statement.
The attribute of a sentence is the consequent of the conditional in a universally quantified
statement or the second conjunct of an existentially quantified statement.
Quantifiers are logical operators and may be the main operators of a proposition.
EXERCISES 4.1a
Translate each sentence into predicate logic using constants
in each.
1. Andre is tall.
2. Belinda sings well.
3. Deanna drives to New York City.
4. The Getty Museum is located in Los Angeles.
5. Snowy is called Milou in Belgium.
6. Cortez and Guillermo go to the gym after school.
7. Either Hilda makes dinner or Ian does.
8. Jenna doesn’t run for class president.
9. Ken doesn’t walk to school when it rains.
10. Either Lauren or Megan buys lunch.
11. Nate and Orlando play in the college orchestra.
12. Paco will play football only if he’s not injured.
13. Ramona plays volleyball if, and only if, she sets up the net.
4.2: Translation Using M
14. If Salvador invests all his money in the stock market, then he takes a second job.
15. Hamilton College is closed if, and only if, President Wippman invokes the
closure policy.
EXERCISES 4.1b
Translate each sentence into predicate logic. Do not use
constants.
Only
Like ‘all’ and ‘some’, ‘only’ can modify an open sentence and so indicate the presence of a quantifier. But such translations can be tricky. ‘Only’ usually indicates a universal quantifier, as at 4.2.12, and sentences using ‘only’ must be carefully distinguished from their related ‘all’ sentences.

4.2.12 Only men have been presidents.

4.2.12 claims that if something has been a president, it has been a man; all presidents have been men. Thus, it is equivalent to 4.2.13.
4.2.13 All presidents have been men.
In propositions with just two predicates, ‘only Ps are Qs’ is logically equivalent to
‘all Qs are Ps’. Thus, in simple cases, we can just invert the antecedent and consequent
of a parallel sentence that uses ‘all’. Start with a related ‘all’ sentence, like 4.2.14 or
4.2.16. Then take the converse to find the ‘only’ sentence.
4.2.14 All men have been presidents. (∀x)(Mx ⊃ Px)
4.2.15 Only men have been presidents. (∀x)(Px ⊃ Mx)
4.2.16 All cats are animals. (∀x)(Cx ⊃ Ax)
4.2.17 Only cats are animals. (∀x)(Ax ⊃ Cx)
In more complex sentences, the rule of just switching antecedent and consequent
between an ‘all’ sentence and its correlated ‘only’ sentence must be adjusted. 4.2.18 is
standardly regimented as 4.2.19.
4.2.18 All intelligent students understand Kant.
4.2.19 (∀x)[(Ix ∙ Sx) ⊃ Ux]
If we regiment 4.2.20 merely by taking the converse of the conditional in 4.2.19, we
get 4.2.21.
4.2.20 Only intelligent students understand Kant.
4.2.21 (∀x)[Ux ⊃ (Ix ∙ Sx)]
4.2.21 says that anything that understands Kant must be an intelligent student. It
follows from that regimentation that I don’t understand Kant, since I am no longer
a student. I am not sure whether I understand Kant, but that I do not is not a logical
consequence of 4.2.20.
A preferred regimentation of 4.2.20 is 4.2.22, which says that any student who un-
derstands Kant is intelligent.
4.2.22 (∀x)[(Ux ∙ Sx) ⊃ Ix]
4.2.22 is a reasonable representation of 4.2.20. When regimenting, we need not
assume that everything that is said is reasonable; that’s surely a false assumption.
But it is customary and charitable to presume reasonableness unless we have good
reason not to.
Just above, I said that to regiment sentences into predicate logic, we think of them
as divided into a subject and an attribute. Universally quantified sentences ordinarily
have a horseshoe between the subject portion of the proposition and the attribute por-
tion. In existential sentences, we use a conjunction between the subject and attribute.
In sentences like 4.2.18, the subject portion of the sentence has both a subordi-
nate subject (‘x is a student’) and a subordinate attribute (‘x is intelligent’); there is a
single grammatical attribute (‘x understands Kant’). The relation between the only-
quantified sentence and its corresponding all-quantified sentence is that the sub-
ordinate attribute is switched with the main attribute, but the subordinate subject
remains where it is, in the antecedent.
Thus, an amended rule could be that if an only-quantified sentence uses only two
predicates, you can just switch the antecedent and consequent from the related ‘all’
sentence, the one that results from replacing ‘only’ with ‘all’; but if the grammatical
subject contains two predicates (a subordinate subject and an attribute), then you
should just switch the two subordinate attributes (‘x is intelligent’ and ‘x understands
Kant’), leaving the subordinate subject alone. Let’s summarize this new guideline for
‘only’ as 4.2.23.
4.2.23 ‘Only PQs are R’ is ordinarily the same as ‘All RQs are P’
4.2.23 is a good general rule, often applicable. But there are exceptions, and some
sentences may be ambiguous. It is not especially clear whether 4.2.24 is best regi-
mented as 4.2.25 or as 4.2.26.
4.2.24 Only famous men have been presidents.
4.2.25 (∀x)[Px ⊃ (Mx ∙ Fx)]
4.2.26 (∀x)[(Px ∙ Mx) ⊃ Fx]
4.2.25 and 4.2.26 are not logically equivalent. 4.2.25 says that if something is a
president, then it is a famous man. 4.2.26 says that if something is a male president,
then it is famous. If we take ‘president’ to refer to presidents of the United States, say,
the former regimentation seems better. But imagine a place in which there have been
both men and women presidents (like Switzerland). Of the women presidents, let’s
imagine, some have been famous, and some have been obscure. But, all of the men
who have been president have been famous. In such a case, we would favor the second
regimentation, using an inflection on ‘men’ when we utter the original 4.2.24 to say
that of the male presidents, all of them have been famous, but of the women, some
have been famous and some have not.
4.2.27 is a good exception to the rule at 4.2.23.
4.2.27 Only probability-challenged ticket holders win the lottery.
Since one must hold a ticket to win the lottery, ‘winners of the lottery who are ticket
holders’, at 4.2.28, which the rule at 4.2.23 would recommend, is redundant. The bet-
ter regimentation is 4.2.29.
4.2.28 (∀x)[(Wx ∙ Tx) ⊃ Px]
4.2.29 (∀x)[Wx ⊃ (Px ∙ Tx)]
When translating ‘only’ sentences, then, you have to decide from the context
whether to use the simple converse rule (as at 4.2.14–4.2.17) or the more complex
rule at 4.2.23.
Adjectives
Adjectives are a main source of increasing complexity in our sentences and their
regimentations. For example, in 4.2.1 we represented ‘wooden desks’ as ‘(Wx ∙ Dx)’,
something that has the properties both of being wooden and of being a desk. 4.2.34
has a selection of similar examples.
4.2.34 green book Gx ∙ Bx
beautiful painting Bx ∙ Px
hungry puppy Hx ∙ Px
confused teenager Cx ∙ Tx
A green book is something that is both green and a book; a confused teenager is
something that is both confused and a teenager.
But not all adjectives are properly regimented using an additional predicate, as in
the items in the list at 4.2.35.
4.2.35 large baby
smart bee
old fruit fly
A large baby is not something that is large and a baby; it is something that is large
for a baby. Such adjectives are context sensitive and cannot be ascribed to something
in the way that ‘green’ or ‘hungry’ can. Nothing could be said to be large or smart or
old by itself; things have these properties only relative to other things of their types.
When faced with a sentence containing such context-sensitive adjectives, it is
best to use one predicate for the modified noun. In 4.2.36, I use ‘Sx’ for ‘x is a jumbo
shrimp’ and ‘Px’ for ‘x is on the plate’.
4.2.36 There are jumbo shrimp on the plate.
(∃x)(Sx ∙ Px)
In the exercises that follow, I provide predicates and specify what they are to rep-
resent, so you won’t find yourself challenged to make the distinction. But if you are
regimenting completely on your own, it is worth keeping this phenomenon in mind.
Summary
In section 4.1, we started translating between English and monadic predicate logic. In
this section, we explored the subtleties of M. As sentences become more complicated,
they have increasing numbers of predicates. A rough division of our natural-language
sentences into subjects and attributes can be useful. While the main subformulas
of universally quantified sentences are ordinarily conditionals, the antecedents and
consequents of those conditionals may be complex formulas, often conjunctions,
especially in the antecedents. The main subformulas of existentially quantified sen-
tences are ordinarily conjunctions; again, the first and second conjuncts may be com-
plex formulas themselves.
There are lots of translation exercises in this section and the following sections that
explore derivations in M. In section 5.1, we expand beyond monadic predicate logic
into full first-order predicate logic. Even if you have mastered translation in M, the
new translations there, and in section 5.4 where we look at identity theory, will be
challenging. Practice! The translations to English from logic in exercises 4.2b can also
be useful in learning how to translate from logic to English.
KEEP IN MIND
Simple quantified English sentences often have two predicates, separated by a conditional,
for universal sentences, or a conjunction, for existential sentences. More complex uni-
versal sentences may have complex antecedents (often conjunctions) or consequents.
More complex existential sentences may have multiple predicates either before or after
the main conjunction.
Be careful to distinguish sentences with ‘someone’, ‘everyone’, ‘anyone’, and ‘no one’ from
the more general sentences that contain ‘something’, ‘everything’, ‘anything’,
and ‘nothing’.
To formalize sentences that use ‘only’ as a quantifier, there are two options:
For two-predicate sentences, and some more complex sentences, just use the converse
of the related ‘all’ sentence.
For more-complex sentences, ‘Only PQs are R’ is often best rendered as ‘All RQs are P’.
The meaning of the sentence, in context, will help you decide between the two
alternatives.
Sentences with multiple quantifiers often have propositional operators as their main
operator.
EXERCISES 4.2a
Translate each sentence into predicate logic using the given
translation keys.
23. It is not the case that both some dogs with pointed ears like humans and no cats
with whiskers like humans.
24. All cats have whiskers if, and only if, they have pointed ears.
Mx: x is a materialist
Ox: x is a monist
Px: x is a philosopher
85. No libertarian philosophers are determinists.
86. Monists are compatibilists if they are materialists.
87. Monists are compatibilists only if they are materialists.
88. If you’re a compatibilist, then you’re a determinist, but not a libertarian.
89. Either every material monist is a compatibilist or some material monists are
determinists.
90. Some materialists are compatibilists, if some philosophers are monists, but not
determinists.
91. No determinist who is not a materialist is a compatibilist.
92. If all materialist monists are compatibilists, if they are philosophers, then some
libertarian philosophers are actually determinists.
EXERCISES 4.2b
Use the given interpretations to translate the following
arguments written in predicate logic into natural, English
sentences.
Ax: x is an athlete
Bx: x is brawny
Cx: x is a champion
m: Malik
g: Gita
n: Ned
1. 1. (∀x)(Ax ⊃ Bx)
2. Am ∙ An / Bm ∙ Bn
2. 1. (∀x)(Ax ⊃ Bx)
2. (∀x)(Bx ⊃ Cx) / (∀x)(Ax ⊃ Cx)
3. 1. (∀x)(Bx ⊃ Cx)
2. (∃x)(Ax ∙ Bx) / (∃x)(Ax ∙ Cx)
4. 1. (∀x)(Ax ⊃ Bx)
2. ∼Bm / (∃x)∼Ax
5. 1. (∀x)[Ax ⊃ (Bx ∨ Cx)]
2. Ag ∙ ∼Bg / Cg
6. 1. (∀x)[(Ax ∙ Bx) ⊃ Cx]
2. (∃x)(Bx ∙ ∼Cx) / (∃x)∼Ax
7. 1. (∃x)Ax ⊃ (∀x)(Cx ⊃ Bx)
2. (∃x)(Ax ∨ Bx)
3. (∀x)(Bx ⊃ Ax) / (∀x)(Cx ⊃ Ax)
8. 1. (∀x)[Bx ∨ (Cx ∙ Ax)]
2. ∼Bg / ∼(∀x)(Cx ⊃ ∼Ax)
9. 1. Cg ∙ (∃x)Bx
2. ∼Am ⊃ (∀x)∼Cx / ∼[(∃x)Ax ⊃ ∼(∃x)Bx]
10. 1. (∀x)[Bx ∙ (Ax ∨ Cx)]
2. Cn ⊃ (∀x)∼(Ax ∨ Bx)
3. ∼(∃x)Cx / ∼Cn
4.3: Syntax for M
Vocabulary of M
Capital letters A . . . Z used as one-place predicates
Lower-case letters used as singular terms
a, b, c, . . . u are used as constants.
v, w, x, y, z are used as variables.
Five operators: ∼, ∙, ∨, ⊃, ≡
Quantifier symbols: ∃, ∀
Punctuation: (), [], {}
The next step is to specify formation rules for formulas (wffs) of M. In order to explain the formation rules and use quantifiers properly, one has to be sensitive to their scope. The scope of an operator is its range of application. The quantifiers in 4.3.1 and 4.3.2 have different scope.

4.3.1 (∀x)(Px ⊃ Qx) Every P is Q.
4.3.2 (∀x)Px ⊃ Qx If everything is P, then x is Q.

We have already tacitly seen the notion of scope in using negations. The scope of a negation is whatever directly follows the tilde.

If what follows the tilde is a single propositional variable, then the scope of the negation is just that propositional variable.
If what follows the tilde is another tilde, then the scope of the first (outside) negation is the scope of the second (inside) negation plus that inside tilde.
If what follows the tilde is a bracket, then the entire formula that occurs between the opening and closing of that bracket is in the scope of the negation.
4.3.3 ∼{(P ∙ Q) ⊃ [∼R ∨ ∼∼(S ≡ T)]}
There are four tildes in 4.3.3. The first one has the broadest scope. Since what follows
it is a bracket, the rest of the formula, everything enclosed in the squiggly brackets,
is in the scope of the leading negation. The second tilde in the formula, which occurs
just in front of the ‘R’, has narrow scope. It applies only to the ‘R’. The third tilde in the
formula has ‘∼(S ≡ T)’ in its scope. The fourth tilde has ‘(S ≡ T)’ in its scope.
Similarly, the scope of a quantifier is whatever formula immediately follows the quantifier.

If what follows the quantifier is a bracket, then any formulas that occur until that bracket is closed are in the scope of the quantifier.
If what follows the quantifier is a tilde, then the tilde and every formula in its
scope is in the scope of the quantifier.
If what follows the quantifier is another quantifier, then the inside quantifier
and every formula in the scope of the inside quantifier is in the scope of the
outside quantifier.
4.3.4 (∀w){Pw ⊃ (∃x)(∀y)[(Px ∙ Py) ⊃ (∃z)∼(Qz ∨ Rz)]}
The scope of a quantifier may be wider or narrower. We can increase the scope by
using punctuation. There are four quantifiers in the formula at 4.3.4. Their scopes are
as follows.
Quantifier   Scope
(∀w)         {Pw ⊃ (∃x)(∀y)[(Px ∙ Py) ⊃ (∃z)∼(Qz ∨ Rz)]}
(∃x)         (∀y)[(Px ∙ Py) ⊃ (∃z)∼(Qz ∨ Rz)]
(∀y)         [(Px ∙ Py) ⊃ (∃z)∼(Qz ∨ Rz)]
(∃z)         ∼(Qz ∨ Rz)
Scope is important for quantifiers because it affects which variables are bound by
the quantifier. When we construct derivations in predicate logic, we will often remove
quantifiers from formulas. When we do so, the variables bound by those quantifiers
will become unbound. Similarly, we will add quantifiers to the fronts of formulas,
binding variables that are in their scopes. We will see some rules for removing and
replacing quantifiers, unbinding and binding variables, in the next section, with a
few further restrictions to follow. If we are not careful in using these rules, observant
about binding and unbinding variables, invalid inferences can result.
Quantifiers bind every instance of their variable in their scope. A bound variable is attached, or connected, to the quantifier that binds it: a variable is bound by a quantifier when it is in the scope of the quantifier and they share a variable; a free variable is not bound by any quantifier. In 4.3.1, the ‘x’ in ‘Qx’ is bound, as is the ‘x’ in ‘Px’. In 4.3.2, the ‘x’ in ‘Qx’ is not bound, though the ‘x’ in ‘Px’ is bound. An unbound variable is called a free variable.

Wffs that contain at least one unbound variable are open sentences, as we saw in section 4.1. Examples 4.3.5–4.3.8 are all open sentences.

4.3.5 Ax
4.3.6 (∀x)Px ∨ Qx
4.3.7 (∃x)(Px ∨ Qy)
4.3.8 (∀x)(Px ⊃ Qx) ⊃ Rz
4.3.6, 4.3.7, and 4.3.8 contain both bound and free variables. In 4.3.6, ‘Qx’ is not in
the scope of the quantifier, so its ‘x’ is unbound. In 4.3.7, ‘Qy’ is in the scope of the
quantifier, but ‘y’ is not the quantifier variable, so it is unbound. In 4.3.8, ‘Rz’ is neither
in the scope of the quantifier nor does it contain the quantifier variable, so its ‘z’ is free.
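The binding rules above can be sketched computationally. The following is a minimal illustration, not from the text: wffs of M are represented as nested tuples, and a small function collects the free variables, treating ‘v’ through ‘z’ as variables (as in the text) and earlier letters as constants. The representation and function names are my own.

```python
# A minimal sketch (not the book's notation) of free-variable detection in M.
# Formulas are nested tuples: ('atom', 'P', 'x'), ('not', f), ('and', f, g),
# ('or', f, g), ('if', f, g), ('all', 'x', f), ('some', 'x', f).

def free_vars(wff):
    """Return the set of variables occurring free in wff."""
    op = wff[0]
    if op == 'atom':
        term = wff[2]
        return {term} if term >= 'v' else set()  # v-z are variables
    if op == 'not':
        return free_vars(wff[1])
    if op in ('and', 'or', 'if'):
        return free_vars(wff[1]) | free_vars(wff[2])
    if op in ('all', 'some'):
        # a quantifier binds its own variable throughout its scope
        return free_vars(wff[2]) - {wff[1]}
    raise ValueError(op)

# 4.3.6: (∀x)Px ∨ Qx — the 'x' in 'Qx' is outside the quantifier's scope
ex_4_3_6 = ('or', ('all', 'x', ('atom', 'P', 'x')), ('atom', 'Q', 'x'))
# 4.3.7: (∃x)(Px ∨ Qy) — 'y' is in the scope but is not the quantifier variable
ex_4_3_7 = ('some', 'x', ('or', ('atom', 'P', 'x'), ('atom', 'Q', 'y')))

print(free_vars(ex_4_3_6))  # {'x'} — an open sentence
print(free_vars(ex_4_3_7))  # {'y'} — an open sentence
```

A wff for which `free_vars` returns the empty set is a closed sentence in the sense defined below.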
A closed sentence has no free variables. An open sentence has at least one free
variable.
If a wff has no free variables, it is a closed sentence, and expresses a proposition.
4.3.9 and 4.3.10 are closed sentences. Translations from English into M should ordi-
narily yield closed sentences.
4.3: Syntax for M 235
The terms ‘Qx’ and ‘Rx’ contain variables that appear to be bound by both the lead-
ing existential quantifier and the universal quantifier inside the proposition. In the
first few sections of chapter 4, we won’t normally be tempted to construct such sen-
tences. But after we introduce relational predicates, we will have to be very careful to
avoid such overlapping.
Summary
As we explore the languages of predicate logic, we will focus mainly on two central
tasks: translation and derivation. Each time we extend our language, I will show
the changes to the vocabulary and formation rules. It will be important, especially as
we learn the derivation rules for predicate logic, to understand scope and binding, the
central concepts in proofs for predicate logic. As with PL, it will also be important
to quickly determine the main operator of a wff. The few exercises in this section are
aimed at helping you master these important concepts in order to make the deriva-
tions easier.
In 4.7 and 5.2, we will look at the semantics for our languages of predicate logic,
which are distinctly more complicated than the mere truth tables of PL. The semantics
will allow us also to demonstrate the invalidity of arguments in predicate logic.
KEEP IN MIND
EXERCISES 4.3
For each of the following wffs of M, answer each of the
following questions:
A. For each quantifier in the sentence, which subformulas are in its scope?
(List them all.)
B. For each quantifier in the sentence, which variables are bound by the
quantifier?
C. Which variables in the sentence are free?
D. Is the sentence open or closed?
E. What is the main operator of the sentence?
1. (∃x)(Px ∙ Qx)
2. (∀x)[(Px ∙ Qx) ⊃ ∼Ra]
4.4: DERIVATIONS IN M
In this section, we start to construct derivations in M. All of the twenty-five rules we
used with PL continue to hold, governing the uses of the propositional operators.
There are four new rules governing removing and adding quantifiers, the subjects of
this section. In the next section, we will add a rule for exchanging the universal and
existential quantifiers. Then, in section 4.6, we will look at how the methods of condi-
tional and indirect proof must be modified for M.
The general structure of most of the derivations of this section is first to take off
quantifiers; second, to use the rules we already saw for PL; and last, to put quantifiers
on. So, we need four rules, two for taking off each of the quantifiers and two for put-
ting on each of the quantifiers.
1. (∀x)(Hx ⊃ Cx)
2. (∀x)(Mx ⊃ ∼Cx) / (∀x)(Mx ⊃ ∼Hx)
We have UI to guide our removal of the quantifiers. We just need a rule allowing us
to put a universal quantifier on the front of a formula in a derivation.
We might be tempted to introduce a rule such as 4.4.7.
4.4.7 Bad Universal Generalization Rule
Fa
(∀x)Fx
To see why 4.4.7 is a bad generalization rule, consider the instance of it at 4.4.8.
4.4.8 1. Pa
2. (∀x)Px
Inferring a universal claim from an existential one commits the fallacy of hasty
generalization.
Now, interpret ‘P’ as ‘is a professor’ and ‘a’ as ‘Asha’. 4.4.7 thus licenses the conclusion
that everything is a professor from just the premise that Asha is a professor. Such an
inference is called the fallacy of hasty generalization. Most of the restrictions on the
instantiation and generalization rules are constructed precisely to avoid confusing
our existential assertions with our universal ones, to prevent our making a strong
universal conclusion on the basis of weak existential assumptions.
To avoid hasty generalization, we never universally generalize (or quantify) over a
constant. In other words, we may not replace a constant with a variable bound by a
universal quantifier. This restriction keeps us from ever universally quantifying over
individual cases.
While we do not universally quantify over constants, we may do so over variables.
Indeed, the point of introducing variables, and distinguishing them from constants,
is to mark where universal generalization is permitted. Variables, except in circum-
stances we will introduce in section 4.6, retain universal character, even when they
are unbound. Generalizing over them (i.e., binding them with a universal quantifier)
does not commit a fallacy because the variable can stand for anything and everything.
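The semantic point behind the bad rule 4.4.7 can be made concrete with a small countermodel. This is an illustrative sketch, not from the text; the domain and extension are hypothetical choices of mine.

```python
# A hypothetical countermodel showing why the bad rule 4.4.7 fails:
# 'Pa' can be true while '(∀x)Px' is false.

domain = ['asha', 'desk', 'moon']        # a small three-object domain
P = {'asha'}                             # extension of 'P' ('is a professor')
a = 'asha'                               # referent of the constant 'a'

premise = a in P                              # Pa
conclusion = all(obj in P for obj in domain)  # (∀x)Px

print(premise)     # True
print(conclusion)  # False — so the inference from Pa to (∀x)Px is invalid
```

Any model where the extension of ‘P’ is a nonempty proper subset of the domain works the same way.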
Strictly speaking, this would not be a proof of the stated conclusion. But since the statements
are equivalent, such a derivation would suffice.
UI would have allowed us to instantiate either premise to constants. Indeed, the
derivation could have proceeded through line 6 with all of the ‘y’s changed to ‘a’s or
‘b’s. But line 7 would not have been permitted by UG had the ‘y’s been constants.
Since EI contains a restriction whereas UI does not, in the common case in which
you have to instantiate both universally quantified and existentially quantified
propositions, EI before you UI.
4.4.15 contains an acceptable use of EI.
4.4.15 1. (∀x)(Nx ⊃ Ax)
2. (∃x)(Nx ∙ Bx) / (∃x)(Ax ∙ Bx)
3. Na ∙ Ba 2, EI
4. Na ⊃ Aa 1, UI
5. Na 3, Simp
6. Aa 4, 5, MP
7. Ba ∙ Na 3, Com
8. Ba 7, Simp
9. Aa ∙ Ba 6, 8, Conj
10. (∃x)(Ax ∙ Bx) 9, EG
QED
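The derivation at 4.4.15 can also be checked semantically. The sketch below, which is mine rather than the book's, searches for a countermodel by brute force; since monadic predicate logic has the finite-model property, failing to find one in small domains is strong evidence of validity (and here a short argument shows no countermodel of any size exists).

```python
from itertools import product

# A brute-force countermodel search for the argument at 4.4.15:
# (∀x)(Nx ⊃ Ax), (∃x)(Nx ∙ Bx) / (∃x)(Ax ∙ Bx).
# Each predicate extension is a tuple of booleans indexed by domain element.

def counterexample(max_size=3):
    for n in range(1, max_size + 1):
        dom = range(n)
        extensions = list(product([False, True], repeat=n))
        for N, A, B in product(extensions, repeat=3):
            p1 = all((not N[x]) or A[x] for x in dom)   # (∀x)(Nx ⊃ Ax)
            p2 = any(N[x] and B[x] for x in dom)        # (∃x)(Nx ∙ Bx)
            c  = any(A[x] and B[x] for x in dom)        # (∃x)(Ax ∙ Bx)
            if p1 and p2 and not c:
                return (N, A, B)   # premises true, conclusion false
    return None

print(counterexample())  # None — no countermodel in these domains
```

The search comes up empty, agreeing with the derivation: any witness for the second premise is N and B, hence (by the first premise) A and B.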
Summary
The four rules of inference in this section allow you to take off quantifiers and put
them back on. Once the quantifiers are off, proofs generally proceed according to the
rules of PL.
Great care must be paid in order not to misuse the instantiation and generalization
rules. Students first using these rules are sometimes not as sensitive as they should be
to the differences between constants and variables. A proof can look perfectly fine,
and use all of the PL rules well, and yet make serious errors of using constants when
one must use variables, or vice versa.
Be careful not to instantiate parts of lines. The exercises for this section mainly
illustrate the instantiation and generalization rules and so mainly contain premises
and conclusions that have quantifiers as the main operators. In the next section,
we will work with propositions whose main operators are not quantifiers, but the
propositional operators, and you will have to take care not to instantiate (or gener-
alize) errantly.
KEEP IN MIND
Pay close attention to the application conditions for each of the four rules, whether they
hold for just constants, just variables, or for any singular term.
EG and UI are anytime, anywhere rules; they have no restrictions.
EI and UG require care:
Never EI to a variable; always use a new constant.
We need new constants when using EI in order not to confuse our claims about par-
ticular objects.
A new constant is one that appears nowhere earlier in the derivation, not even in the
stated conclusion.
Never UG from (i.e., over) a constant.
The restrictions on EI and UG are grounded mainly in avoiding hasty generalization.
Constants may be replaced only by existentially quantified variables.
Unbound variables are available for universal generalization.
If you want to make inferences that connect existential and universal claims, EI before
you UI.
Rules Introduced
Universal Instantiation (UI)
(∀α)Fα
Fβ for any variable α, any formula F, and any
singular term β
Universal Generalization (UG)
Fβ
(∀α)Fα for any variable β, any formula F not
containing α, and any variable α
Never UG over a constant.
Existential Instantiation (EI)
(∃α)Fα
Fβ for any variable α, any formula F, and any
new constant β
Never EI to a variable.
Existential Generalization (EG)
Fβ
(∃α)Fα for any singular term β, any formula F not
containing α, and for any variable α
4.4: Derivations in M 247
EXERCISES 4.4a
Derive the conclusions of the following arguments.
1. 1. (∀x)(Ax ⊃ Bx)
2. (∀x)(Cx ⊃ ∼Bx)
3. Aa / ∼Ca
2. 1. (∀x)(Ax ⊃ Bx)
2. (∀x)(Cx ⊃ ∼Bx) / (∀x)(Cx ⊃ ∼Ax)
3. 1. (∃x)(Dx ∙ ∼Ex)
2. (∀x)(Ex ∨ Fx) / (∃x)Fx
4. 1. (∃x)(Ax ∙ ∼Bx)
2. (∀x)(Cx ⊃ Bx) / (∃x)(Ax ∙ ∼Cx)
5. 1. (∀x)Hx ∨ Ja
2. (∀x)[(∼Jx ∙ Ix) ∨ (∼Jx ∙ Kx)] / (∀x)Hx
6. 1. (∀x)(Jx ∙ Kx) / (∃x)Jx ∙ (∃x)Kx
7. 1. (∃x)(Px ∙ Qx)
2. (∃x)(Rx ∙ Sx) / (∃x)Px ∙ (∃x)Rx
8. 1. (∀x)(Fx ∨ Hx) ⊃ (∃x)Ex
2. (∀x)[Fx ∨ (Gx ∙ Hx)] / (∃x)Ex
9. 1. (∀x)(Ix ⊃ Kx)
2. (∀x)(Jx ⊃ Lx)
3. (∃x)(Jx ∨ Ix) / (∃x)(Kx ∨ Lx)
10. 1. (∀x)[Gx ⊃ (Hx ∨ Ix)]
2. (∃x)(Gx ∙ ∼Ix) / (∃x)(Gx ∙ Hx)
11. 1. (∀x)(Dx ⊃ Ex)
2. (∀x)(Ex ⊃ ∼Gx)
3. (∃x)Gx / (∃x) ∼Dx
12. 1. (∀x)(Ox ⊃ Qx)
2. (∀x)(Ox ∨ Px)
3. (∃x)(Nx ∙ ∼Qx) / (∃x)(Nx ∙ Px)
13. 1. (∀x)[Ax ⊃ (Bx ∨ Cx)]
2. (∃x)∼(Bx ∨ ∼Ax) / (∃x)Cx
EXERCISES 4.4b
Translate each of the following paragraphs into arguments
written in M, using the given translation key. Then, derive
the conclusions of the arguments using the four quantifier
rules, plus the rules of inference and equivalence for PL.
1. Some students are teenagers. Everything is either not a teenager or not a child.
So, some students are not children. (Cx: x is a child; Sx: x is a student; Tx: x is
a teenager)
2. If there are black holes, then there are star clusters. S5 0014+81 is a black hole.1
All star clusters are gravitationally bound. So, something is gravitationally
bound. (a: S5 0014+81; Bx: x is a black hole; Gx: x is gravitationally bound; Sx:
x is a star cluster)
3. Someone is either an elephant or a badger. No one is a badger. If there are el-
ephants, then there are tusks. So, there are tusks. (Bx: x is a badger; Ex: x is an
elephant; Px: x is a person; Tx: x is a tusk)
4. Some prime numbers are either Mersenne primes or semiprimes. Things are
prime if, and only if, they are not composite. Semiprimes are composite. So,
some prime numbers are Mersenne primes. (Cx: x is composite; Mx: x is a Mer-
senne prime; Px: x is a prime number; Sx: x is a semiprime)
5. Things are cats just in case they are feline. No feline is canine. There are cats. So,
something is not canine. (Cx: x is canine; Fx: x is feline; Mx: x is a cat)
6. All trains run on tracks. Trains that run on tracks lack steering wheels. No cars
lack steering wheels. Some trains are purple. So, some trains aren’t cars. (Lx: x
lacks a steering wheel; Px: x is purple; Rx: x runs on tracks; Tx: x is a train)
7. If Shangri-La and the Shire exist, then so does Sodor. Anything that’s Sodor
has tank engines. Nothing with tank engines has real people. But Utopia is
Shangri-La and i Drann is the Shire. So, Sodor exists and does not have real
people. (i: i Drann; u: Utopia; Lx: x is Shangri-La; Px: x has real people; Rx: x is
the Shire; Sx: x is Sodor; Tx: x has tank engines)
1. S5 0014+81 is actually the name of a “blazar, in fact an FSRQ quasar, the most energetic subclass
of objects known as active galactic nuclei, produced by the rapid accretion of matter by a central
supermassive black hole,” according to its Wikipedia entry, June 9, 2016. But let’s take it as the name
of the black hole itself here.
8. Someone is a composer but does not get paid. Others are composers and work
in Hollywood. Anyone who works in Hollywood gets paid. So, some people get
paid and some don’t. (Cx: x is a composer; Gx: x gets paid; Px: x is a person; Wx:
x works in Hollywood)
9. All treatises are books. No journal article is a book. So, everything is either not
a treatise or not a journal article. (Bx: x is a book; Jx: x is a journal article; Tx: x
is a treatise)
10. All fallacies seem valid, if they resemble formal inferences. But nothing that
seems valid is valid. So, everything valid, if it resembles a formal inference, is
not a fallacy. (Fx: x is a fallacy; Rx: x resembles a formal inference; Sx: x seems
valid; Vx: x is valid)
11. Some intuitions are reliable. Nothing reliable is obviously false. If some intu-
ition is not obviously false, then there are useful epistemologies. So, there are
useful epistemologies. (Ix: x is an intuition; Ox: x is obviously false; Rx: x is
reliable; Ux: x is a useful epistemology)
12. There is a thing that is either a utilitarian or a Kantian. Any utilitarian is a con-
sequentialist. Any Kantian is a deontologist. If something is either a conse-
quentialist or a deontologist, then something is a moral theorist. So, something
is a moral theorist. (Cx: x is a consequentialist; Dx: x is a deontologist; Kx: x is
a Kantian; Mx: x is a moral theorist; Ux: x is a utilitarian)
13. All empiricists make sense experience primary. No rationalist does. And every-
thing is either an empiricist or a rationalist. So, everything is an empiricist just
in case it is not a rationalist. (Ex: x is an empiricist; Rx: x is a rationalist; Sx: x
makes sense experience primary)
14. Everything good is beautiful and hard work. If something is hard work or re-
warding, then it is worth pursuing. So, the good is worth pursuing. (Bx: x is
beautiful; Gx: x is good; Hx: x is hard work; Rx: x is rewarding; Wx: x is worth
pursuing)
15. Everything good is beautiful and hard work. If something is hard work, then
either you do it yourself or you ask someone else to do it for you. Nothing you
do yourself is beautiful. So, anything good you ask someone else to do for you.
(Ax: you ask someone else to do x for you; Bx: x is beautiful; Gx: x is good; Hx:
x is hard work; Yx: you do x yourself)
16. If some philosophers are existentialists, then some are nihilists. There are her-
meneuticist philosophers. All philosophers are hermeneuticists just in case
they are existentialists. All nihilist philosophers are empowering. So, some-
thing is empowering. (Ex: x is an existentialist; Hx: x is a hermeneuticist; Nx: x
is a nihilist; Px: x is a philosopher; Sx: x is empowering)
EXERCISES 4.4c
Find the errors in each of the following illicit inferences.
Some of the arguments are valid; some are not. All
derivations contain errors. (We’ll show the invalid ones to be
invalid in Exercises 4.8b.)
1. 1. (∀x)(Px ⊃ Qx)
2. (∃x)(Px ∙ Rx) / (∃x)(Qx ∙ Rx)
3. Px ∙ Rx 2, EI
4. Px ⊃ Qx 1, UI
5. Px 3, Simp
6. Qx 4, 5, MP
7. Rx ∙ Px 3, Com
8. Rx 7, Simp
9. Qx ∙ Rx 6, 8, Conj
10. (∃x)(Qx ∙ Rx) 9, EG
QED—Oops!
5. 1. (∃x)(Px ∙ Qx)
2. (∀x)(Px ⊃ Rx) / (∀x)(Qx ⊃ Rx)
3. Pa ∙ Qa 1, EI
4. Pa ⊃ Ra 2, UI
5. Pa 3, Simp
6. Ra 4, 5, MP
7. Ra ∨ ∼Qa 6, Add
8. ∼Qa ∨ Ra 7, Com
9. Qa ⊃ Ra 8, Impl
10. (∀x)(Qx ⊃ Rx) 9, UG
QED—Oops!
QE says that to change a quantifier, you change each of the three spaces.
Add or remove a tilde directly before the quantifier.
Switch quantifiers: existential to universal or vice versa.
Add or remove a tilde directly after the quantifier.
For example, in 4.5.10, we have a negation in front of a universal quantifier, but no
negation directly after it.
4.5.10 ∼(∀x)(Px ⊃ Qx)
Using quantifier exchange, we can transform 4.5.10 into 4.5.11, removing the lead-
ing negation (the main operator of 4.5.10), changing the universal quantifier to an
existential quantifier, and adding a negation immediately following the existential
quantifier.
4.5.11 (∃x)∼(Px ⊃ Qx)
We can also transform 4.5.12 into 4.5.13 by adding a negation in front, where there
is none, changing the existential quantifier to a universal quantifier, and adding a ne-
gation directly after the quantifier where again there is none.
4.5.12 (∃x)(Px ∙ Qx)
4.5.13 ∼(∀x)∼(Px ∙ Qx)
Wffs like 4.5.11 and 4.5.13 may seem unnatural; we would rarely translate a sen-
tence of English into forms like either of those. But a few uses of propositional rules
of equivalence within those formulas can transform them into wffs that would be
the obvious results of translations from natural language, as we will see in the next
subsection.
Summary
The rules of quantifier exchange (QE) allow us to manage the interactions between
negations and quantifiers. They allow us to instantiate some wffs in which the main
operator is not the quantifier but the negation; just make sure to use QE to change
the wff so that the quantifier is the main operator before instantiating. QE also allows
us to use the rules governing the propositional operators to make inferences with
propositions whose main operators are neither negations nor quantifiers, especially
in propositions with multiple quantifiers.
KEEP IN MIND
Rules Introduced
Quantifier Exchange (QE)
(∀α)Fα ⇄ ∼(∃α)∼Fα
(∃α)Fα ⇄ ∼(∀α)∼Fα
(∀α)∼Fα ⇄ ∼(∃α)Fα
(∃α)∼Fα ⇄ ∼(∀α)Fα
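Each QE rule is a semantic equivalence, so each can be spot-checked by evaluating both sides in every interpretation of a single predicate over small domains. This sketch is mine, not the book's; the helper names are illustrative.

```python
from itertools import product

# Spot-checking the four QE equivalences over every interpretation of one
# predicate F on domains of size 1-3. Each rule returns (left, right) truth
# values for a given extension F and domain d; QE says they always match.

def holds_everywhere(rule, max_size=3):
    for n in range(1, max_size + 1):
        dom = range(n)
        for F in product([False, True], repeat=n):
            left, right = rule(F, dom)
            if left != right:
                return False
    return True

qe_rules = [
    # (∀α)Fα ⇄ ∼(∃α)∼Fα
    lambda F, d: (all(F[x] for x in d), not any(not F[x] for x in d)),
    # (∃α)Fα ⇄ ∼(∀α)∼Fα
    lambda F, d: (any(F[x] for x in d), not all(not F[x] for x in d)),
    # (∀α)∼Fα ⇄ ∼(∃α)Fα
    lambda F, d: (all(not F[x] for x in d), not any(F[x] for x in d)),
    # (∃α)∼Fα ⇄ ∼(∀α)Fα
    lambda F, d: (any(not F[x] for x in d), not all(F[x] for x in d)),
]

print([holds_everywhere(r) for r in qe_rules])  # [True, True, True, True]
```

The check mirrors the informal gloss of QE: a universal claim is the denial of an existential counterexample, and vice versa.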
EXERCISES 4.5a
Derive the conclusions of each of the following arguments.
Do not use CP or IP.
1. 1. (∀x)Ax ⊃ (∃x)Bx
2. (∀x)∼Bx / (∃x)∼Ax
2. 1. (∃x)[Qx ∙ (Rx ∙ ∼Sx)] / ∼(∀x)Sx
3. 1. (∀x)Xx ⊃ (∀x)Yx
2. (∃x)∼Yx / (∃x)∼Xx
4.5: Quantifier Exchange 259
4. 1. (∃x)(Px ∙ Qx)
2. ∼(∃x)(Px ∙ Rx) / (∃x)(Px ∙ ∼Rx)
5. 1. (∀x)(Dx ⊃ Ex)
2. ∼(∀x)(Dx ⊃ Fx) / (∃x)(Ex ∙ ∼Fx)
6. 1. (∃x)[(Gx ∙ Hx) ∙ Ix]
2. ∼(∃x)(Ix ∙ Jx) / (∃x)(Hx ∙ ∼Jx)
7. 1. (∀x)(Px ⊃ Qx)
2. (∀x)(Rx ⊃ ∼Qx) / ∼(∃x)(Px ∙ Rx)
8. 1. (∃x)Sx ⊃ (∃x)Tx
2. (∀x)∼Tx / (∀x)∼Sx
9. 1. (∃x)(Xx ∙ Yx) ⊃ (∃x)(Xx ∙ Zx)
2. (∀x)(Xx ⊃ ∼Zx) / ∼(∃x)(Xx ∙ Yx)
10. 1. (∀x)(Ax ⊃ Bx) ⊃ (∀x)(Ax ⊃ Cx)
2. (∃x)(Ax ∙ ∼Cx) / (∃x)(Ax ∙ ∼Bx)
11. 1. (∃x)(Tx ∙ ∼Vx)
2. (∃x)(Tx ∙ Vx) / ∼(∀x)(Tx ⊃ Vx) ∙ ∼(∀x)(Tx ⊃ ∼Vx)
12. 1. (∃x)∼Fx ∨ (∀x)(Gx ∙ Hx)
2. (∀x)[(Fx ∙ Gx) ∨ (Fx ∙ Hx)] / (∃x)(Gx ∙ Hx)
13. 1. ∼(∀x)(Qx ⊃ Rx)
2. (∀x)(∼Rx ⊃ Tx) / ∼(∀x)∼Tx
14. 1. (∀x)[Lx ∨ (Mx ∙ ∼Nx)]
2. ∼(∃x)Lx / ∼(∃x)(Lx ∨ Nx)
15. 1. (∀x)(Ax ∨ Bx)
2. (∀x)(Ax ⊃ Dx)
3. ∼(∀x)(Bx ∙ ∼Cx) / (∃y)(Dy ∨ Cy)
16. 1. ∼(∃x)(Ox ≡ Px)
2. Pa / ∼(∀x)Ox
17. 1. (∃x)(Px ∙ Qx) ⊃ (∃x)(Px ∙ Rx)
2. (∀x)(Px ⊃ ∼Rx) / (∀x)(Qx ⊃ ∼Px)
18. 1. ∼(∃x)(Lx ∙ ∼Mx)
2. ∼(∃x)(Mx ∙ Nx) / ∼(∃x)(Lx ∙ Nx)
19. 1. ∼(∃x)[(Px ∙ Qx) ∙ ∼Rx]
2. ∼(∃x)(Rx ∙ ∼Sx) / ∼(∃x)[(Px ∙ Qx) ∙ ∼Sx]
EXERCISES 4.5b
Translate each of the following arguments into propositions
of M. Then, derive the conclusions of the arguments.
1. Everyone is weird. But not everyone is nice. So, some weird things aren’t nice.
(Nx: x is nice; Px: x is a person; Wx: x is weird)
2. If there are gods, then everything is determined. But something is not deter-
mined. So, everything is not a god. (Dx: x is determined; Gx: x is a god)
3. Nothing blue is edible. This Sour Patch Kid is blue food. So, not all food is ed-
ible. (s: this Sour Patch Kid; Bx: x is blue; Ex: x is edible; Fx: x is food)
4. Someone in the class doesn’t keep up with the reading. Anyone who doesn’t
keep up with the reading has trouble understanding the classwork. It is not the
case that someone who has trouble understanding the classwork doesn’t strug-
gle with the final. So, someone in the class struggles with the final. (Cx: x is in
the class; Fx: x struggles with the final; Kx: x keeps up with the reading; Px: x is
a person; Ux: x has trouble understanding the classwork)
5. All new phones have lots of memory and large screens. Not every new phone
lacks a screen protector. So, not everything with a large screen lacks a screen
protector. (Lx: x has a large screen; Mx: x has lots of memory; Px: x is a new
phone; Sx: x has a screen protector)
6. Any fruit on the table is either a strawberry or has a pit. Some fruits on the
tables are apples. It is not the case that some apples are strawberries. So, it is not
the case that no apples have a pit. (Ax: x is an apple; Fx: x is a fruit on the table;
Px: x has a pit; Sx: x is a strawberry)
7. Everything is an Earthling just in case it is not an alien. It is false that some
politicians are aliens from Mars. So, all politicians from Mars are Earthlings.
(Ax: x is an alien; Ex: x is an Earthling; Mx: x is from Mars; Px: x is a politician)
8. No rock stars have bad hair. It is not the case that some rock stars lack
amplifiers. Not every rock star has either devoted fans or a functioning website.
So, not everything with amplifiers but not bad hair has either devoted fans or a
functioning website. (Ax: x has amplifiers; Fx: x has devoted fans; Hx: x has bad
hair; Rx: x is a rock star; Wx: x has a functioning website)
9. Some philosophers are A-theorists. It is not the case that some A-theorist
doesn’t overvalue the present. So, some philosophers overvalue the present.
(Ax: x is an A-theorist; Ox: x overvalues the present; Px: x is a philosopher)
10. Every Hegelian idealist believes in the transcendent. Not everything believes
in the transcendent. So, not everything is a Hegelian idealist. (Hx: x is a Hege-
lian; Ix: x is an idealist; Tx: x believes in the transcendent)
11. All ethicists are utilitarians if, and only if, they are consequentialists. Not every
ethicist is a utilitarian. So, not everything is a consequentialist. (Cx: x is a con-
sequentialist; Ex: x is an ethicist; Ux: x is a utilitarian)
12. If all beliefs are grounded in sense experience, then some beliefs are abstract.
All beliefs are mental states. It is not the case that some mental states are not
grounded in sense experience. And it is not the case that something abstract is
not ineffable. So, some beliefs are ineffable. (Ax: x is abstract; Bx: x is a belief;
Ix: x is ineffable; Mx: x is a mental state; Sx: x is grounded in sense experience)
13. All existentialists are either nihilists or theists. All theists have faith. Not all
existentialists have faith. So, it is not the case that no existentialists are nihil-
ists. (Ex: x is an existentialist; Fx: x has faith; Nx: x is a nihilist; Tx: x is a theist)
14. Neither everything is material nor some people are zombies. It’s not the case
that something is both not a zombie and not material. So, not everything is a
person. (Mx: x is material; Px: x is a person; Zx: x is a zombie)
15. All philosophers are determinists if, and only if, they are not libertarians. Not
all philosophers are either determinists or nihilists. It is not the case that some
libertarians are pessimists and not nihilists. So, not everything is either a de-
terminist or a pessimist. (Dx: x is a determinist; Lx: x is a libertarian; Nx: x is a
nihilist; Px: x is a philosopher; Sx: x is a pessimist)
4.6: Conditional and Indirect Proof in M 263
16. All empiricists either believe in abstract ideas or do not believe that we have
mathematical knowledge. It is not the case that some empiricists who believe
in abstract ideas are fictionalists. It is not the case that some empiricists who
do not believe that we have mathematical knowledge approve of the calculus.
So, it is not the case that some empiricists both are fictionalists and approve of
the calculus. (Ax: x believes in abstract ideas; Cx: x approves of the calculus;
Ex: x is an empiricist; Fx: x is a fictionalist; Mx: x believes that we have
mathematical knowledge)
The problem with 4.6.1 can be seen at step 3. The assumption for conditional proof
at line 2 just means that a random thing has the property denoted by ‘R’, not that
everything has that property. While variables ordinarily retain their universal char-
acter in a proof, when they are used within an assumption (for CP or IP), they lose
that universal character. It is as if we are saying, “Imagine that some (particular)
thing has the property ascribed in the assumption.” If it follows that the object in
the assumption also has other properties, we may universally generalize after we’ve
discharged, as in line 7, for we have not made any specific claims about the thing
outside of the assumption.
Using conditional proof in this way should be familiar to mathematics students.
Often in mathematics we will show that some property holds of a particular example.
Then we claim, without loss of generality, that since our example was chosen arbitrarily,
whatever we derived using our assumption holds universally. Within the assumption,
we have a particular example and we treat it existentially. Once we are done with that
portion of the proof, we can treat our object universally.
Consider an indirect proof of some universally quantified formula, ‘(∀x)α’. To be-
gin the proof, we assume its opposite: ‘∼(∀x)α’. We can then change that assumption,
using QE, to ‘(∃x)∼α’. In other words, we start an indirect proof of a universal claim
with an existential assertion: let’s say that something is not α. Another way to do such
an indirect proof would be to assume ‘∼α’ immediately. We could do this by making
the free variables in α constants or variables. Either way, they have to act as constants
within the assumption, so we must not use UG within the assumption on those sin-
gular terms.
All lines of an indented sequence are within the scope of an assumption in the
first line.
Whenever we use CP or IP, we start by indenting, drawing a vertical line, and
making an assumption. All lines of the proof until we discharge the assumption are
also indented, indicating that they are within the scope of the assumption in the first
line of the indented sequence.
To summarize the restriction, we may not UG on a variable within the scope of
an assumption in which that variable is free. Once the assumption is discharged,
the restriction is dismissed and you may UG on the variable. This restriction holds
on both CP and IP, though it would be unusual to use IP with a free variable in the
first line.
Addendum to the rule of inference UG: Within the scope of an assumption
for conditional or indirect proof, never UG on a variable that is free in the
assumption.
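To see semantically what this restriction prevents, suppose we assumed ‘Px’ for CP, illicitly applied UG inside the assumption to get ‘(∀y)Py’, and discharged: we would have “proved” (∀x)[Px ⊃ (∀y)Py], which is not a logical truth. The countermodel below is my own illustrative sketch.

```python
# A hypothetical countermodel for (∀x)[Px ⊃ (∀y)Py], the formula that an
# illicit UG inside a CP assumption would let us "prove".

domain = [0, 1]
P = {0}          # 'P' is true of 0 but not of 1

# (∀x)[Px ⊃ (∀y)Py]: for every x, if Px then everything is P
bad_claim = all((x not in P) or all(y in P for y in domain)
                for x in domain)
print(bad_claim)  # False: 0 is P, but not everything is P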
Logical Truths of M
Just as CP and IP allowed us to use our proof theory to prove that some formulas of
PL were logical truths, these methods allow us to prove that some formulas of M, like
4.6.9, are logical truths.
4.6.9 (∀x)(Px ∨ ∼Px)
1. ∼(∀x)(Px ∨ ∼Px) AIP
2. (∃x)∼(Px ∨ ∼Px) 1, QE
3. ∼(Pa ∨ ∼Pa) 2, EI
4. ∼Pa ∙ ∼ ∼Pa 3, DM
5. ∼ ∼(∀x)(Px ∨ ∼Px) 1–4, IP
6. (∀x)(Px ∨ ∼Px) 5, DN
QED
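The indirect proof above has a model-theoretic counterpart: 4.6.9 comes out true in every interpretation. The brute-force check below is a sketch of mine, assuming the obvious truth conditions for ‘∀’, ‘∨’, and ‘∼’, over small domains.

```python
from itertools import product

# Checking that 4.6.9, (∀x)(Px ∨ ∼Px), is true in every interpretation of
# 'P' over domains of size 1-4.

def true_in_all_models(max_size=4):
    for n in range(1, max_size + 1):
        dom = range(n)
        for P in product([False, True], repeat=n):
            if not all(P[x] or not P[x] for x in dom):
                return False
    return True

print(true_in_all_models())  # True
```

No interpretation falsifies the formula, which is just what the derivation by IP establishes proof-theoretically.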
4.6.10–4.6.13 are further logical truths of M. Note that each one has a similarity to
one of the four rules for removing or replacing quantifiers.
4.6.10 (∀y)[(∀x)Fx ⊃ Fy]
4.6.11 (∀y)[Fy ⊃ (∃x)Fx]
4.6.12 (∃y)[Fy ⊃ (∀x)Fx]
4.6.13 (∃y)[(∃x)Fx ⊃ Fy]
I’ll prove the first, at 4.6.14, leaving the others for Exercises 4.6c.
4.6.14 1. ∼(∀y)[(∀x)Fx ⊃ Fy] AIP
2. (∃y)∼[(∀x)Fx ⊃ Fy] 1, QE
3. (∃y)∼[∼(∀x)Fx ∨ Fy] 2, Impl
4. (∃y)[∼ ∼(∀x)Fx ∙ ∼Fy] 3, DM
5. (∃y)[(∀x)Fx ∙ ∼Fy] 4, DN
6. (∀x)Fx ∙ ∼Fa 5, EI
7. (∀x)Fx 6, Simp
8. Fa 7, UI
9. ∼Fa ∙ (∀x)Fx 6, Com
10. ∼Fa 9, Simp
11. Fa ∙ ∼Fa 8, 10, Conj
12. ∼ ∼(∀y)[(∀x)Fx ⊃ Fy] 1–11, IP
13. (∀y)[(∀x)Fx ⊃ Fy] 12, DN
QED
Summary
In PL, we first showed that statements were logical truths semantically by using the
truth tables to show that they were tautologies. We can show that statements are logi-
cal truths of M semantically, too, though the semantics for predicate logic are more
complicated; we’ll deal with them in the next two sections, after which we’ll be able
to show that arguments are invalid, too.
KEEP IN MIND
The conditional and indirect derivation methods are useful in predicate logic, though there
is an important restriction on UG within any indented sequence.
Within the scope of an assumption for conditional or indirect proof, never UG on a vari-
able that is free in the assumption.
You may UG on a variable that is free in an assumption after the assumption is discharged.
Conditional proof is especially useful for deriving universally quantified conclusions or for
deriving conditional conclusions.
Indirect proof is often used just as in PL, by assuming the opposite of your desired
conclusion.
Be sure to maintain our strict sense of ‘contradiction’ for the last line of an indirect proof.
Either CP or IP is useful in proving logical truths of M.
EXERCISES 4.6a
Derive the conclusions of the following arguments.
1. 1. (∀x)(Ax ⊃ Bx)
2. (∀x)∼(Bx ∙ ∼Cx) / (∀x)(Ax ⊃ Cx)
2. 1. (∀x)(Dx ∨ Ex)
2. (∀x)(Fx ⊃ ∼Ex) / (∀x)(∼Dx ⊃ ∼Fx)
3. 1. (∀x)(Gx ≡ ∼Hx)
2. (∀x)(Ix ⊃ Hx) / (∃x)Ix ⊃ (∃x)∼Gx
4. 1. (∀x)[Mx ⊃ (Nx ∙ Ox)]
2. (∃x)∼Nx / (∃x)∼Mx
5. 1. (∀x)[Px ⊃ (Qx ∙ Rx)]
2. (∀x)(Qx ⊃ Sx) / (∀x)(Px ⊃ Sx)
6. 1. (∀x)(Tx ≡ ∼Vx)
2. (∀x)[Vx ⊃ (Wx ∙ Xx)] / (∀x)(∼Tx ⊃ Xx)
7. 1. (∀x)(Ax ⊃ Cx)
2. ∼(∃x)(Bx ∙ ∼Cx) / (∀x)[(Ax ∨ Bx) ⊃ Cx]
8. 1. (∃x)[(Dx ∙ Ex) ∙ ∼Fx]
2. (∀x)(Gx ⊃ Fx) / ∼(∀x)(Ex ⊃ Gx)
9. 1. (∀x)[Hx ≡ (Ix ∨ Jx)]
2. ∼(∃x)Jx / (∀x)(Hx ≡ Ix)
EXERCISES 4.6b
Translate each of the following arguments into propositions
of M. Then, derive the conclusions of the arguments.
1. All gibbons are apes. It’s not the case that there are apes that are not primates.
So, if there are gibbons, there are primates. (Ax: x is an ape; Gx: x is a gibbon;
Px: x is a primate)
2. All living things are carbon-based. Things that aren’t living are eternal. So,
anything not eternal is carbon-based. (Cx: x is carbon-based; Ex: x is eternal;
Lx: x is living)
3. Anything corrupt is not happy if it’s real. There are real dinosaurs. So, if every-
thing is corrupt, then there are unhappy dinosaurs. (Cx: x is corrupt; Dx: x is a
dinosaur; Hx: x is happy; Rx: x is real)
4. All plays are either comedies or tragedies. Everything is not a tragedy if, and
only if, it ends well. So, if some play is not a comedy, then something doesn’t
end well. (Cx: x is a comedy; Ex: x ends well; Px: x is a play; Tx: x is a tragedy)
5. No violent thunderstorms are safe. There are safe thunderstorms. So, not every-
thing is violent. (Sx: x is safe; Tx: x is a thunderstorm; Vx: x is violent)
6. All restaurants have chefs. It’s not the case that there are lazy chefs. There are
restaurants. So, something isn’t lazy. (Cx: x is a chef; Lx: x is lazy; Rx: x is a
restaurant)
7. All deserts are arid and cool at night. Anything arid or semi-arid has lizards.
So, it is not the case that some deserts lack lizards. (Ax: x is arid; Cx: x is cool at
night; Dx: x is a desert; Lx: x has lizards; Sx: x is semi-arid)
8. Good parents are either not too busy or don’t fail to make time for their children.
So, if all good parents are too busy, then if something is a good parent, then not
everything fails to make time for its children. (Bx: x is too busy; Px: x is a good
parent; Tx: x fails to make time for its children)
9. Platonists believe that forms are causes. Aristotelians believe that forms are
material. So, if there are Platonists or Aristotelians, then something believes
either that forms are causes or that they are material. (Ax: x is an Aristotelian;
Cx: x believes that forms are causes; Mx: x believes that forms are material; Px:
x is a Platonist)
10. All art is either expressive or representational. All art is either expressive or
formal. Art exists. So, either something is expressive or something is both rep-
resentational and formal. (Ax: x is art; Ex: x is expressive; Fx: x is formal; Rx: x
is representational)
11. Everything is a human if, and only if, it is rational. Everything is an animal if,
and only if, it is either human or not rational. So, there are animals. (Ax: x is an
animal; Hx: x is human; Rx: x is rational)
12. Everything is either a substance or an accident. Something is not a substance,
but a shape. So, something is an accident and a shape. (Ax: x is an accident; Fx:
x is a shape; Sx: x is a substance)
13. All desire is self-destructive. It is not the case that something is both not desire
and not self-destructive. So, something is self-destructive. (Dx: x is a desire; Sx:
x is self-destructive)
14. It is not the case that some historians are not both broadly trained and learned.
Some philosophers are not broadly trained. So, it’s not the case that everything
is an historian if, and only if, it is a philosopher. (Hx: x is an historian; Lx: x is
learned; Px: x is a philosopher; Tx: x is broadly trained)
15. If no morality is objective, then all morality is relative. Some morality is not
relative. If something is objective, then something lacks perspective. So, not
everything has perspective. (Mx: x is morality; Ox: x is objective; Px: x has per-
spective; Rx: x is relative)
16. All idealists are either empirical or transcendental. Some idealist is not empiri-
cal. All transcendentalists are empirical if they haven’t read Kant. So, not ev-
erything hasn’t read Kant. (Ex: x is empirical; Ix: x is an idealist; Kx: x has read
Kant; Tx: x is transcendental)
4.7: Semantics for M
EXERCISES 4.6c
Derive the following logical truths of M.
1. (∀y)[Fy ⊃ (∃x)Fx]
2. (∃y)[Fy ⊃ (∀x)Fx]
3. (∃y)[(∃x)Fx ⊃ Fy]
4. (∃x)Ax ∨ (∀x)∼Ax
5. (∀x)Bx ⊃ (∃x)Bx
6. (∀x)(Cx ⊃ Dx) ⊃ [(∀x)Cx ⊃ (∀x)Dx]
7. [(∀x)(Gx ⊃ Hx) ∙ (∃x)Gx] ⊃ (∃x)Hx
8. (∀x)(Ix ⊃ Jx) ∨ (∃x)(Ix ∙ ∼Jx)
9. Fa ∨ [(∀x)Fx ⊃ Ga]
10. (∃x)(Px ∙ Qx) ⊃ [(∀x)(Qx ⊃ Rx) ⊃ (∃x)(Px ∙ Rx)]
11. (∃x)(Ax ∙ Bx) ⊃ [(∃x)Ax ∙ (∃x)Bx]
12. [(∀x)Dx ∨ (∀x)Ex] ⊃ (∀x)(Dx ∨ Ex)
13. (∃x)Ix ∨ (∀x)(Ix ⊃ Jx)
14. [(∀x)(Px ⊃ Qx) ∙ (∀x)(Qx ⊃ Rx)] ⊃ (∀x)(∼Rx ⊃ ∼Px)
15. [(∀x)(Mx ⊃ Nx) ∙ ∼(∃x)(Ox ∙ Nx)] ⊃ ∼(∃x)(Mx ∙ Ox)
16. ∼(∃x)Kx ≡ [(∀x)(Kx ⊃ Lx) ∙ (∀x)(Kx ⊃ ∼Lx)]
17. [(∃x)Ax ⊃ Ba] ≡ (∀x)(Ax ⊃ Ba)
18. [∼(∃x)Cx ∙ ∼(∃x)Dx] ⊃ (∀x)(Cx ≡ Dx)
19. {[(∃x)Fx ∨ (∃x)Gx] ∙ (∀x)(Gx ⊃ Hx)} ⊃ [(∃x)Fx ∨ (∃x)Hx]
20. (∃x)(Ka ∙ Lx) ≡ [Ka ∙ (∃x)Lx]
(in section 2.2) and M (in section 4.3). Once we have specified the wffs of a language,
we can use that language in a theory. But until we specify a semantics or a proof theory,
a language can be used in a variety of theories. We could have, for instance, adopted
a three-valued semantics for PL, which would have generated different logical truths
and thus a different logical theory.
There are different ways to specify a theory. We can just list some theorems. List-
ing the theorems of infinite theories like those we use with PL or M would be an
arduous task.
More promisingly, we can describe some limited ways of generating theorems.
For example, we can adopt some axioms and rules of inference. In geometry, the
Euclidean axioms, along with a background logic, characterize what we call Euclid-
ean geometry. We can also axiomatize physical theories, like quantum mechanics,
and purely logical systems too.
The logical systems in this book do not include any axioms. Instead, to characterize
the theories we are using, we have two options. The first option involves what we call
proof theory, the subject of chapter 3. Proof theory studies the axioms, for theories
that include axioms, and the rules of a formal theory. Our proof theory included both
rules of inference and rules of equivalence.

Proof theory is the study of the axioms (if any) and rules for a formal theory.
To generate the theorems of the theory we used with the language PL, we just stated
our rules of inference, including the methods of conditional and indirect proof that
allow us to derive the logical truths. By adding the inference rules of the previous few
sections to those of chapter 3, we have been developing a proof theory for monadic
predicate logic.
The second option, which is independent of proof theory, is to provide a semantics
for our language, a pursuit more generally called model theory. Our semantics for
propositional logic consists of assigning truth values to the simple sentences and
using the basic truth tables to compute truth conditions for complex sentences. We
simply interpret formulas by assigning 1 or 0 to each atomic sentence. We compute
truth values of complex propositions by combining the truth values of the atomic
sentences according to the truth table definitions. Since we have only twenty-six
simple terms, there are only 2²⁶ = 67,108,864 (about 67 million) possible
interpretations, a large, but finite, number.
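The computation just described can be mimicked mechanically. The following Python sketch (my own illustration, not part of the text's formal apparatus) enumerates the valuations for a two-letter sentence and evaluates the horseshoe by its truth table:

```python
from itertools import product

# The horseshoe, per its truth table: false only when the antecedent
# is 1 and the consequent is 0.
def implies(p, q):
    return 0 if (p == 1 and q == 0) else 1

# With two sentence letters there are 2**2 = 4 interpretations; with
# all twenty-six letters there would be 2**26 = 67,108,864.
letters = ["A", "B"]
valuations = [dict(zip(letters, row))
              for row in product([0, 1], repeat=len(letters))]
print(len(valuations))                                   # 4
print(sum(implies(v["A"], v["B"]) for v in valuations))  # 3 valuations make 'A ⊃ B' true
```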
The semantics for PL was thus pretty easy, using the truth tables. For M, and the
other languages of predicate logic, the semantics is more complicated. We have to
deal with logical particles, singular terms, predicates, and quantifiers. That is the goal
of this section, and we’ll use the framework described here to show the invalidity of
arguments in M in the next section.
Different constants may correspond to the same object, just as an individual person
or thing can have multiple names.
For example, if we are using M and working with a small domain of interpretation
{1, 2, 3}, we can assign the number 1 to ‘a’, the number 2 to ‘b’, and the number 3 to all
of the remaining nineteen constants (‘c’, . . .‘u’).
Just as not every object in our world has a name, not every object in a domain of
interpretation needs to have a name in a theory. So we can pick a universe of many
objects and name only some of them. Also, since one object can have multiple names,
a theory with many different constants can be interpreted with a domain of fewer
objects. But we ordinarily use a different name for each object.
Step 3. Assign some set of objects in the domain to each predicate.
We interpret predicates as subsets of the domain of interpretation, the objects of
which that predicate holds. We can interpret predicates by providing a list of mem-
bers of the domain or by providing a rule. If we use a predicate ‘Dx’ to stand for ‘x is a
Democrat who has been elected president of the United States’, then the interpreta-
tion of that predicate will be the set of things in the domain of interpretation that were
elected president as Democrats. Using a domain of S1, the interpretation of ‘Dx’ will
be {Bill Clinton, Barack Obama}. Using a domain of S2 , it will be empty.
In the domain of natural numbers, S3, we might define a predicate of even numbers,
‘Ex’, as the set of all objects that are multiples of two: {2, 4, 6 . . . }.
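Both styles of interpreting a predicate, by list and by rule, can be pictured as sets in Python; the domain and assignments below are my own illustrative choices, not the book's official examples:

```python
# A predicate is interpreted as a subset of the domain of
# interpretation: the objects of which the predicate holds.
domain = {1, 2, 3, 4, 5, 6, 7, 8}

# Interpretation by list: simply enumerate the members.
D = {2, 4}

# Interpretation by rule: 'Ex' holds of the multiples of two.
E = {x for x in domain if x % 2 == 0}

# An atomic sentence like 'Ea' is true just in case the object
# assigned to 'a' belongs to the set assigned to 'Ex'.
a = 4
print(a in E)  # True
print(a in D)  # True
```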
Step 4. Use the customary truth tables to interpret the propositional operators.
We are familiar with step 4 of the semantics from our work with PL, and we natu-
rally assume the truth table definitions for all the propositional operators when inter-
preting theories written in M.
Let’s take, for an example, the interpretation of a small set of sentences that I’ll call
Theory TM1, with a small domain.
Theory TM1 1. Pa ∙ Pb
2. ∼Ib
3. (∃x)Px
4. (∀x)Px
5. (∀x)(Ix ⊃ Px)
6. (∀x)(Px ⊃ Ix)
An Interpretation of TM1
Domain: {Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune}
a: Venus
b: Mars
c: Neptune
To show that a formula is true for every interpretation, we have to think about
various domains, with various assignments of objects to constants and sets of objects
to predicates. I will show semantically that 4.7.1 is a logical truth.
4.7.1 Pa ∨ [(∀x)Px ⊃ Qa]
Suppose that ‘Pa ∨ [(∀x)Px ⊃ Qa]’ is not a logical truth.
Then there is an interpretation on which it is false.
On that interpretation, the object assigned to ‘a’ will not be
in the set assigned to ‘Px’, and there is some counterex-
ample to ‘(∀x)Px ⊃ Qa’.
Any counterexample to a conditional statement has to have a
true antecedent.
So, every object in the domain of our interpretation will be
in the set assigned to ‘Px’.
That contradicts the claim that the object assigned to ‘a’ will
not be in the set assigned to ‘Px’.
So, our assumption must be false: no interpretation will
make that sentence false.
So, ‘Pa ∨ [(∀x)Px ⊃ Qa]’ is logically true.
QED
Semantic proofs of the logical truth of wffs of M are essentially metalogical,
and very different in feel from the semantic proofs for PL. The truth tables are
also metalogical, not part of the object language, but they are more mechanical.
Semantic proofs for logical truths of M are often structured as reductio arguments:
suppose that the given proposition is not a logical truth. Then there will be an
interpretation that makes it false. If the statement is a logical truth, a contradiction
should follow.
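A reductio like the one above is a proof about all interpretations, which no finite search can replace. Still, we can gather evidence mechanically; this Python sketch (an illustration of mine) checks 4.7.1 on every interpretation over domains of up to three objects:

```python
from itertools import product

# 'Pa ∨ [(∀x)Px ⊃ Qa]' on an interpretation: the domain is
# range(n); P and Q are tuples of 0/1 membership flags; 'a' is
# assigned some object of the domain.
def holds(domain, P, Q, a):
    all_P = all(P[x] for x in domain)             # (∀x)Px
    return P[a] == 1 or (not all_P) or Q[a] == 1  # Pa ∨ [(∀x)Px ⊃ Qa]

falsifying = 0
for n in (1, 2, 3):
    domain = range(n)
    for P in product([0, 1], repeat=n):
        for Q in product([0, 1], repeat=n):
            for a in domain:
                if not holds(domain, P, Q, a):
                    falsifying += 1
print(falsifying)  # 0: no such interpretation makes the sentence false
```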
We’ll spend more time on proof theory for M and F than we will on the semantics
for logical truth. Still, there is a nice, simple, and agreeable method for showing that
an argument is invalid using the semantics for M, one that we will examine in our
next section.
Summary
We are near the end of our studies of M. We have translated between natural language
and predicate logic. We have a proof system to show that arguments are valid and
which can be used to show that formulas are logical truths. And we have a semantic
method for interpreting our theories of M, constructing models and showing that
formulas are logical truths.
When we introduced our system of inference for PL, we already had a way of
distinguishing the valid from the invalid arguments, using truth tables. In M, we need
a corresponding method for showing that an argument is invalid. In the next section,
we will explore a formal, semantic method for showing that an argument is invalid in
M. Then, we will proceed to a new language, of relational predicate logic.
KEEP IN MIND
EXERCISES 4.7a
Construct models for each of the following theories by
specifying a domain of interpretation (make one up) and
interpreting the constants and predicates. Translate each of
the sentences of the theory to English, given your
interpretation.
1. Pa ∙ ∼Pb
   Qa ∙ Qb
   (∀x)(Px ⊃ Qx)
   (∃x)(∼Px ∙ ∼Qx)
2. Mb ∙ ∼Md
   ∼La ∙ ∼Wa
   Wc ∙ Wd
   (∃x)(Mx ∙ Lx)
   (∃x)(Mx ∙ ∼Wx)
   (∀x)(Lx ⊃ ∼Wx)
3. Eb ∙ Ec
   Kd ∙ ∼Ka
   ∼Ea ∙ Pa
   (∀x)(Ex ⊃ ∼Kx)
   (∃x)(Px ∙ Kx)
   (Eb ∨ Ed) ⊃ ∼Ka
4. Oa ∙ ∼Ob
   Ra ∙ ∼Ea
   Rd ∙ Od ∙ ∼Ed
   (∃x)(Rx ∙ Ox)
   ∼(∃x)(Ex ∙ Ox)
   (∃x)(Ex ∙ Rx) ⊃ ∼Oc
5. (Pa ∙ Pb) ∙ Pc
(Qa ∙ Qb) ∙ ∼Qc
(∀x)[(Px ∙ Qx) ⊃ Rx]
(∃x)[(Px ∙ Qx) ∙ Sx]
(∀x)[(Px ∙ ∼Qx) ⊃ (∼Rx ∙ ∼Sx)]
EXERCISES 4.7b
Show, semantically, that the following propositions selected
from Exercises 4.6c are logical truths.
4.8: INVALIDITY IN M
We studied proof-theoretic methods for showing that an argument in M is valid in
sections 4.4–4.6. In this section, I demonstrate a semantic method for showing that
an argument in M is invalid.
A valid argument is one that is valid under any interpretation, using any domain.
An invalid argument will have counterexamples, interpretations on which the
premises come out true and the conclusion comes out false. Understanding how we
interpret theories in the language of predicate logic, the subject of section 4.7,
will help us here to formulate a method for showing that an argument in predicate
logic is invalid.
Recall how we proved that an argument in PL, such as 4.8.1, is invalid.
4.8.1 1. A ⊃ B
2. ∼(B ∙ A) /A ≡ B
We lined up the propositional variables on the left side of the table, and the premises
and conclusion on the right. Then we assigned truth values to the component
sentences to form a counterexample, a valuation that makes the premises true and the
conclusion false.
4.8.2
A B A ⊃ B / ∼ (B ∙ A) // A ≡ B
0 1 0 1 1 1 1 0 0 0 0 1
The table at 4.8.2 shows that the argument is invalid since there is a counterexample
when A is false and B is true. We will adapt this method for first-order logic.
Just as logical truths are true for all interpretations, if an argument is valid, then it is
valid no matter what we choose as our domain of interpretation. Even if our domain has
only one member, or two or three or a million, valid arguments have no counterexamples.
Conversely, if an argument is invalid, then there will be a counterexample in some
finite domain, though a given finite domain may contain no counterexample.
As in PL, we will show that arguments of M are invalid by constructing
counterexamples. Our approach is sometimes called the method of finite universes.
Of course, the counterexamples for M will be more complex. Assigning truth values to
closed atomic propositions is easy enough. It's the quantifiers that create complexity.
So, to construct a counterexample, we transform propositions with quantifiers into
unquantified equivalents in finite domains. Then we will have propositions whose
operators are just the operators of PL and we'll be able to use our old methods.

The method of finite universes is a semantic method that can produce
counterexamples to arguments in predicate logic.
We’ll start with some examples in domains of one member and then move to more
complex examples that require larger domains.
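The transformation of quantified propositions into unquantified equivalents can be sketched as simple string rewriting; the Python helper names below are mine, and the naive `replace` call assumes the variable letter does not occur inside predicate names:

```python
# Expand '(∀x)…' into a conjunction and '(∃x)…' into a disjunction,
# one instance per object in the domain, as the method of finite
# universes prescribes.
def expand_universal(matrix, variable, constants):
    # (∀x)Fx over {a, b} becomes Fa ∙ Fb
    return " ∙ ".join(matrix.replace(variable, c) for c in constants)

def expand_existential(matrix, variable, constants):
    # (∃x)Fx over {a, b} becomes Fa ∨ Fb
    return " ∨ ".join(matrix.replace(variable, c) for c in constants)

print(expand_universal("(Wx ⊃ Mx)", "x", ["a"]))       # (Wa ⊃ Ma)
print(expand_universal("(Wx ⊃ Mx)", "x", ["a", "b"]))  # (Wa ⊃ Ma) ∙ (Wb ⊃ Mb)
print(expand_existential("Fx", "x", ["a", "b", "c"]))  # Fa ∨ Fb ∨ Fc
```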
To show that 4.8.3 is invalid more formally, I will start by choosing a domain of one
object. We will call it ‘a’. Since there is only one object in the domain, the universally
quantified formulas are equivalent to statements about that one object.
4.8.4 (∀x)(Wx ⊃ Mx) is equivalent to Wa ⊃ Ma
(∀x)(Px ⊃ Mx) is equivalent to Pa ⊃ Ma
(∀x)(Wx ⊃ Px) is equivalent to Wa ⊃ Pa
We can thus eliminate the quantifiers and use the same method we used for
arguments in PL. We assign truth values to make the premises true and the conclusion
false, as in 4.8.5.
4.8.5
Wa Ma Pa Wa ⊃ Ma / Pa ⊃ Ma // Wa ⊃ Pa
1 1 0 1 1 1 0 1 1 1 0 0
4.8.8 [a counterexample table involving ‘Ua’, ‘Ta’, and ‘Wa’; its rows are garbled in
the source]
Be careful not to confuse expansions into finite domains with instantiation in natu-
ral deductions. In each case, we remove quantifiers. But the restrictions on EI play no
role in expansions.
To show that an argument is invalid, we need only one counterexample. For many
simple arguments, we can construct a counterexample in a domain of one member.
But not all invalid arguments have counterexamples in a one-member domain. To
construct a counterexample, we often must use a larger domain.
Wa Ha Ea Wa ⊃ Ha / Ea ∙ Ha // Wa ⊃ Ea
[no assignment of truth values makes both premises true and the conclusion false]
Thus, to show that 4.8.9 is invalid, we have to consider a larger domain. If there are
two objects in a domain, a and b, then the expansions of quantified formulas become
more complex. Universally quantified formulas become conjunctions: a universally
quantified proposition states that every object in the domain has some property.
Existentially quantified formulas become disjunctions: at least one object in the
domain has the property ascribed by an existential formula.
4.8.10 shows the rules for expanding quantified formulas into two- and three-
member domains.
4.8.10 In a two-member domain, (∀x)Fx expands to Fa ∙ Fb and (∃x)Fx expands to
Fa ∨ Fb; in a three-member domain, they expand to Fa ∙ Fb ∙ Fc and Fa ∨ Fb ∨ Fc.
[A worked expansion table followed here; its rows are garbled in the source.]
Note that in a two-membered domain, each quantified wff has two instances, one
for each object in the domain.
Constants
When expanding formulas into finite domains, constants remain themselves; there
is no need to expand a term with a constant when moving to a larger domain. If an
argument contains more than one constant, then it will require a domain larger than
one object.
Remember that expanding formulas into finite domains is not the same as
instantiating. In particular, the restriction on EI that we must instantiate to a new
constant does not apply. If an argument contains both an existential quantifier and
a constant, you may expand the quantifier into a single-member domain using the
constant already present in the argument. It need not be a new constant.
4.8.12 cannot be shown invalid in a one-member domain.
Ac Bc Ac ∙ Bc / Ac // Bc
[no valuation makes both premises true and ‘Bc’ false in a one-member domain]
Aa Ba Ac Bc (Aa ∙ Ba) ∨ (Ac ∙ Bc) / Ac // Bc
1   1   1   0
The counterexample is in a two-member domain, when Aa, Ac, and Ba are true and
Bc is false.
Some arguments require three-member, four-member, or larger domains to be
shown invalid. The pattern apparent at 4.8.10 can be extended for larger domains,
adding further conjunctions for universal quantifiers and further disjunctions for ex-
istential quantifiers.
[A partially completed expansion table appeared here, with atomic sentences Pa,
Qa, Ra, Pb, Qb, Rb, Pc, Qc, Rc and the unexpanded first premise Pa ∙ Qa; its rows
are garbled in the source.]
Note that the first premise does not get expanded to other objects; only quantified
sentences expand. No matter how large a domain you choose, a statement without
quantifiers remains the same.
Also notice that I do not group the three disjuncts in the second premise, the third
premise, or the conclusion, and that I do not group the three conjuncts in the fourth
premise. Technically, according to our formation rules, each pair of disjuncts or con-
juncts should be grouped. But since conjunction and disjunction are both associative
and commutative, the grouping really doesn’t matter. For a disjunction to be true,
only one of however many disjuncts appear must be true; it doesn’t matter which.
For a disjunction to be false, every one of the disjuncts must be false. For a conjunc-
tion to be true, every one of the conjuncts must be true. For a conjunction to be false,
just one of the conjuncts has to be false. The extra punctuation as you reach three- or
four-membered domains is less helpful than it is cluttering, so I relax the need for
groupings of pairs when unpacking quantifiers into larger domains. I’ll still use it
for derivations until section 5.5, when similar considerations lead me again to relax
punctuation in long conjunctions or disjunctions.
Returning to our work at hand, the counterexample is relatively easy to construct. I’ll
describe my process of constructing a counterexample and provide a completed table.
I started by assigning values for Pa and Qa in the first premise, both true. The con-
clusion has three disjuncts that each have to be false, and the truth of Qa means that
the first disjunct is false.
The fourth premise includes three conjuncts, each of which must be true, and the
truth of Pa entails that Ra must be false in order for the first conjunct to be true.
The second and third premises are both series of disjuncts. The values so far as-
signed entail that the first disjunct in each expanded premise is false, but we have two
other disjuncts that we can make true for each, and only one of the disjuncts has to
be true.
I assigned true to Pb and false to Qb to take care of the second premise. The truth
of Pb, carried to the fourth premise, entails that Rb must be false. And the falsity of
Rb makes the second disjunct in the conclusion false, which was needed there given
the falsity of Qb.
Still, the third premise now had two false disjuncts, so I had to make Qc and Rc both
true. Then all that remained was making the last conjunct of the fourth premise true
and the last disjunct of the conclusion false. The truth of Qc already accomplished
the latter task, and making Pc false accomplishes the former. The counterexample is
constructed.
Pa Qa Ra Pb Qb Rb Pc Qc Rc
1   1   0   1   0   0   0   1   1

Pa ∙ Qa
1   1   1

[The rows displaying the expanded second, third, and fourth premises and the
conclusion under this valuation are garbled in the source.]
There is no easy rule for determining how large a domain is required for a
counterexample for a given argument. The standard approach is just to start with
a one-membered domain and work upward as needed. But students often ask for
guidelines, and a rough one is that the size of the required domain increases with
the number of existential premises. Universal premises are easily satisfied trivially,
with false antecedents of their conditionals. But existentials often require conflicting
assignments of truth values and so can increase the size of the required domain.
It is useful and elegant to find a counterexample in the smallest domain possible.
But whatever the minimum size of the domain required to construct a counterex-
ample for a particular argument, there will be counterexamples in all larger domains.
So, if you mistakenly miss a counterexample in, say, a two-membered domain, there
will be one in a three-membered domain, and in larger ones.
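That start-small-and-grow strategy is easy to automate. As an illustration (my own code, applied to exercise 4.8a, number 9), this search finds the smallest domain containing a counterexample:

```python
from itertools import product

# Search domains of objects 0..n-1 for a counterexample to
# exercise 4.8a, number 9:  (∃x)(Ex ∙ Fx),  Fb  /  Eb.
# The constant 'b' must name some object of the domain.
def counterexample(n):
    for E in product([0, 1], repeat=n):
        for F in product([0, 1], repeat=n):
            for b in range(n):
                p1 = any(E[x] and F[x] for x in range(n))  # (∃x)(Ex ∙ Fx)
                p2 = F[b] == 1                             # Fb
                c = E[b] == 1                              # Eb
                if p1 and p2 and not c:
                    return (E, F, b)
    return None

print(counterexample(1))              # None: the premises force 'Eb' in a one-member domain
print(counterexample(2) is not None)  # True: a two-member counterexample exists
```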
Pa Qa Ra Pa ∙ Qa / Pa ⊃ Ra / Ra ⊃ Qa // Qa
0 0 0 0
4.8.16
1 1 0 1 1
0 1 0 1 1 1 1
// Qa ∙ Qb
0 0 1
Logical Truths
The method of finite domains can easily be adapted to show that individual propo-
sitions are not logical truths. If a proposition is a logical truth, it will be true on
any valuation, in any domain. So, if we can find a valuation that makes it false in a
domain of any size, we have a counterexample to the claim that the proposition is
a logical truth.
4.8.17 is not a logical truth.
4.8.17 (∀x)(Px ⊃ Qx) ∨ (∀x)(Qx ⊃ Px)
Let’s start by translating it into a domain of one object, at 4.8.18.
4.8.18 (Pa ⊃ Qa) ∨ (Qa ⊃ Pa)
In a one-object domain, no false valuation is possible. Making either disjunct false
makes the other disjunct true. We’ll have to expand it into a domain of two objects,
at 4.8.19.
4.8.19 [(Pa ⊃ Qa) ∙ (Pb ⊃ Qb)] ∨ [(Qa ⊃ Pa) ∙ (Qb ⊃ Pb)]
The expansion into a two-object domain, 4.8.20, is more promising for a false
valuation.
4.8.20 Let ‘Pa’ and ‘Qb’ be true and ‘Qa’ and ‘Pb’ be false. Then ‘Pa ⊃ Qa’ is false,
so the first disjunct of 4.8.19 is false; and ‘Qb ⊃ Pb’ is false, so the second disjunct
is false.
We can make each disjunct false, so that the whole proposition is false. Thus, we
have a valuation that shows that 4.8.17 is not a logical truth.
Overlapping Quantifiers
Sometimes, two quantifiers of M overlap. Unpacking propositions with overlapping
quantifiers requires some care. Consider a logical truth such as 4.8.21. (We saw the
derivation of this proposition at example 4.6.14.)
4.8.21 (∀y)[(∀x)Fx ⊃ Fy]
To expand 4.8.21 into a finite domain, we have to manage the overlapping quanti-
fiers. For a one-membered domain, the expansion is simple, as at 4.8.22.
4.8.22 Fa ⊃ Fa
For larger domains, just work in stages, starting with the outside quantifiers, as I do
at 4.8.23, in a two-object domain, and at 4.8.24, in a three-object domain.
4.8.23 [(∀x)Fx ⊃ Fa] ∙ [(∀x)Fx ⊃ Fb]
[(Fa ∙ Fb) ⊃ Fa] ∙ [(Fa ∙ Fb) ⊃ Fb]
4.8.24 [(∀x)Fx ⊃ Fa] ∙ [(∀x)Fx ⊃ Fb] ∙ [(∀x)Fx ⊃ Fc]
[(Fa ∙ Fb ∙ Fc) ⊃ Fa] ∙ [(Fa ∙ Fb ∙ Fc) ⊃ Fb] ∙ [(Fa ∙ Fb ∙ Fc) ⊃ Fc]
As you should be able to see, no matter how large a domain we consider, we will not
be able to construct a counterexample.
Now consider a related claim that is not a logical truth, 4.8.25.
4.8.25 (∀y)[(∃x)Fx ⊃ Fy]
To show that it is not a logical truth, we just need a valuation that makes the state-
ment false. There is no counterexample in a one-membered domain, which looks
exactly like 4.8.22. For a two-membered domain, once again start with the outside
quantifier, as at 4.8.26.
4.8.26 [(∃x)Fx ⊃ Fa] ∙ [(∃x)Fx ⊃ Fb]
[(Fa ∨ Fb) ⊃ Fa] ∙ [(Fa ∨ Fb) ⊃ Fb]
There is a false valuation of 4.8.25 in a domain of two objects, when Fa is false and
Fb is true.
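The valuation just described can be verified by enumerating the two-member expansion of 4.8.25, [(Fa ∨ Fb) ⊃ Fa] ∙ [(Fa ∨ Fb) ⊃ Fb], in Python (my own check):

```python
from itertools import product

# Collect the valuations that make the two-member expansion false.
def implies(p, q):
    return (not p) or q

false_valuations = []
for Fa, Fb in product([False, True], repeat=2):
    some_F = Fa or Fb                                    # (∃x)Fx
    expansion = implies(some_F, Fa) and implies(some_F, Fb)
    if not expansion:
        false_valuations.append((Fa, Fb))
print(false_valuations)  # [(False, True), (True, False)]
```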
Summary
The method of constructing counterexamples to arguments by considering
interpretations in finite domains draws on both our semantics for PL, in the uses
of truth tables, and our semantics for M, in translating quantified sentences into
unquantified claims in finite domains. We now have a semantic method for proving
arguments of M invalid and a proof-theoretic method of proving arguments valid.
We can also adapt our method of expansion into finite domains to provide a
semantic method for showing that a statement is not a logical truth. We can prove
logical truths of M using the methods of conditional or indirect proof, or the semantic
method sketched at the end of section 4.7. We can now show that a wff is not a logical
truth by providing a valuation that makes it false in a finite domain.
KEEP IN MIND
EXERCISES 4.8a
Show that each of the following arguments is invalid by
generating a counterexample.
1. 1. (∃x)(Ax ∨ Bx)
2. (∀x)Ax / (∀x)Bx
2. 1. (∀x)(Cx ⊃ Dx)
2. Da / Ca
3. 1. (∀x)(Kx ≡ Lx)
2. (∃x)(Mx ∙ Lx) / (∃x)(Nx ∙ Kx)
5. 1. (∀x)(Px ≡ Rx)
2. (∃x)(Qx ∙ ∼Sx) / (∀x)(Qx ⊃ ∼Rx)
6. 1. (∃x)(Ex ∙ Fx) ⊃ (∀x)(Gx ⊃ Hx)
2. ∼(∀x)(Fx ⊃ Ex) / (∀x)(∼Hx ⊃ ∼Gx)
7. 1. (∃x)(Ix ∙ Jx) ≡ (∀x)(Lx ⊃ Kx)
2. (∃x)(Jx ∙ ∼Kx) ≡ (∀x)(Lx ⊃ ∼Kx)
3. (∃x)(Ix ∙ ∼Kx) / (∃x)(Lx ∙ Kx)
8. 1. (∃x)[(Ax ∙ Bx) ∙ Cx]
2. (∃x)[(Ax ∙ Bx) ∙ ∼Cx]
3. (∃x)(Bx ∙ Dx)
4. ∼Da / (∀x)(Cx ⊃ Dx)
9. 1. (∃x)(Ex ∙ Fx)
2. Fb / Eb
10. 1. (∃x)Dx ⊃ (∃x)Gx
2. (∃x)(Dx ∙ Ex) / (∃x)(Ex ∙ Gx)
11. 1. (∃x)(Sx ∙ Tx)
2. (∃x)(Tx ∙ Vx) / (∃x)(Sx ∙ Vx)
12. 1. (∃x)(Xx ∙ Yx)
2. (∀x)(Yx ⊃ Zx)
3. (∃x)(Zx ∙ ∼Yx) / ∼(∀x)(Xx ⊃ Yx)
13. 1. Pa ∙ Qb
2. (∃x)(Rx ∙ Sx)
3. (∃x)(Rx ∙ ∼Sx)
4. (∀x)(Sx ⊃ Qx) / (∀x)(Rx ⊃ Px)
14. 1. (∃x)(Lx ∙ Nx)
2. (∃x)(Mx ∙ ∼Nx)
3. (∀x)(Lx ⊃ Ox) / (∀x)(Mx ⊃ Ox)
15. 1. (∃x)(Rx ∨ ∼Tx)
2. (∃x)(∼Rx ∙ Tx)
3. (∀x)(Sx ≡ Tx) / (∀x)(Sx ⊃ Rx)
16. 1. (∃x)(Ax ∙ Bx)
2. (∃x)(Cx ∙ ∼Bx)
3. (∀x)[(Ax ∙ Cx) ⊃ Dx] / (∀x)(Bx ⊃ Dx)
17. 1. (∃x)(Ex ∙ Fx)
2. ∼(∀x)(Ex ⊃ Fx)
3. (∀x)(Fx ⊃ Ex) / (∀x)(∼Fx ⊃ ∼Ex)
18. 1. (∃x)(Jx ∨ Kx) ⊃ (∃x)(Lx ∙ ∼Jx)
2. (∃x)(Lx ∙ Jx) / (∃x)(Kx ∙ ∼Jx)
EXERCISES 4.8b
Show that each of the invalid arguments from Exercises 4.4c,
listed here, is invalid.
EXERCISES 4.8c
For each argument, determine whether it is valid or invalid.
If it is valid, derive the conclusion using our rules of
inference and equivalence. If it is invalid, provide a
counterexample.
EXERCISES 4.8d
For each proposition, determine if it is a logical truth. If it
is a logical truth, provide a derivation. If it is not, provide a
valuation that shows it false in some finite domain.
1. (∀x)Ax ⊃ (∃x)Ax
2. (∀x)(Bx ⊃ ∼Bx)
3. (∃x)Cx ∨ (∃x)∼Cx
4. (∀x)Dx ∨ (∀x)∼Dx
5. (∀x)(Ex ⊃ Fx) ⊃ (∃x)(Ex ∙ Fx)
6. [(∀x)(Gx ⊃ Hx) ∙ (∃x)∼Hx] ⊃ (∃x)∼Gx
7. ∼(∃x)∼(Kx ∙ Lx) ⊃ (∀x)(Kx ∙ Lx)
8. (∃x)(Ix ∙ ∼Jx) ≡ ∼(∀x)(Ix ⊃ Jx)
9. (∀x)[(Mx ∨ Nx) ⊃ Ox] ⊃ (∀x)(Mx ⊃ Ox)
10. (∀x)[(Px ∙ Qx) ⊃ Rx] ⊃ (∀x)(Px ⊃ Rx)
11. [(∃x)(Sx ∙ ∼Tx) ∙ (∃x)Tx] ⊃ (∃x)∼Sx
12. (∀x)[Xx ⊃ ∼(Yx ∨ Zx)] ⊃ ∼(∃x)(Xx ∙ Yx)
13. (∀x)[Ax ⊃ ∼(Bx ∙ Cx)] ⊃ (∃x)(∼Bx ∨ ∼Cx)
14. (∀x)(Dx ⊃ ∼Ex) ∨ (∃x)(Dx ∙ Ex)
15. (∀x)[(Fx ∨ Gx) ∨ Hx] ⊃ [(∃x)(∼Fx ∙ ∼Gx) ⊃ (∃x)Hx]
16. (∀x)[(Ix ∨ Jx) ∨ Kx] ⊃ (∀x)[∼(Ix ∙ Jx) ⊃ Kx]
17. (∃x)(Lx ∨ Mx) ≡ ∼(∀x)(Lx ∙ Mx)
18. (∃x)(Nx ∨ Ox) ∨ (∃x)(∼Nx ∨ ∼Ox)
19. (∃x)(Px ∙ Qx) ∨ (∃x)(∼Px ∙ ∼Qx)
20. [(∀x)(Rx ∙ Sx) ∨ (∃x)(Rx ∙ ∼Sx)] ∨ [(∃x)(∼Rx ∙ Sx) ∨ (∃x)(∼Rx ∙ ∼Sx)]
4.9: Notes on Translation with M
(or vice versa), then 4.9.12 and 4.9.14 are the correct pair. But if the speaker intended
a proposition that would be refuted only if all planets and all asteroids were rocky,
then 4.9.11 and 4.9.13 are the right ones. And if one wants a version of those with two
quantifiers, one can use 4.9.15.
4.9.15 (∃x)(Px ∙ ∼Rx) ∨ (∃x)(Ax ∙ ∼Rx)
Again, I leave the proofs of the equivalence of 4.9.15 and 4.9.13 to the appendix.
a bit of complex charity, but that’s not the kind of thing we can ever really avoid.
Interpretive questions are just below the surface, and sometimes they poke out
without warning.
At this point, you might just want to give up completely on charity. What’s the
use of charity if it gives you such a headache? But charity is always a factor in our
translations. Even taking the less controversial sentences for translation in this book
as grammatical sentences of English requires charity. You might take them as Swahili
nonsense, for example, and refuse to translate nonsense!
Many uses of universal quantifiers for translation require some kind of charity. We
rarely say anything about everything. More often with the universal quantifier, we
make claims about all things within a particular domain, as at 4.9.24, where we are
talking about only humans, not all things.
4.9.24 All humans are mortal.
(∀x)(Hx ⊃ Mx)
4.9.25 seems false, unless we were to stipulate it as a definition of ‘executive’, and such
a definition is implausible. To read 4.9.25 charitably, we are likely to want to restrict
the domain to a particular institution in which only executives have administrative
assistants. For example, the speaker of 4.9.25 is likely to mean something like 4.9.27,
translated into M at 4.9.28.
4.9.27 At Metalogic Incorporated, only executives have administrative
assistants.
4.9.28 (∀x)[(Mx ∙ Ax) ⊃ Ex]
Do we take this as a true sentence, and thus interpret ‘president’ as ‘president of the
United States’? Or do we take this as a false sentence in which the speaker forgot that
there are presidents of many different sorts?
In practice, it often doesn’t matter. We work mainly on the surface grammar. And
for the purposes of this book, 4.9.18 is probably the best choice for the original 4.9.16,
and 4.9.25 is probably best as 4.9.26.
Still, there are important cases in which the surface grammar of a sentence is really
not what we usually use it to mean, as in 4.9.30.
4.9.30 All that glitters is not gold.
The surface grammar yields 4.9.31, with ‘Gx’ standing for ‘x glitters’ and ‘Ax’
standing for ‘x is gold’: nothing gold glitters.
4.9.31 (∀x)(Gx ⊃ ∼Ax)
But ordinary uses of 4.9.30 are usually better rendered as either 4.9.32 or 4.9.33:
there are things that glitter that aren’t gold; you’d better not conclude from its
glittering that what you’ve got is valuable.
4.9.32 ∼(∀x)(Gx ⊃ Ax)
4.9.33 (∃x)(Gx ∙ ∼Ax)
So, while we ordinarily translate according to the surface grammar, in cases like
4.9.30, where the usage is so obviously not according to the surface grammar, we
have to invoke charitable interpretation. It would be nice if language were cleaner and
easier to translate. But if English were precise in all cases, we wouldn’t need formal
logic.
Summary
Pretty much all work in philosophy requires interpretation and critical assessment.
We first have to know what folks are saying before we can determine whether it is
valid or sound, true or false. One of the advantages of formal language is that it can be
more precise than natural language. Thus, regimenting sentences of English or other
languages into formal languages requires us to disambiguate and clarify. Written
sentences are often ambiguous or unclear. When we translate into logical languages,
we have to make decisions about their likely intended meanings. Such meanings will
not always be determinate. Our translations must be guided by general principles of
charity and by our practical goals. Do we need relational predicates, or will monadic
ones do for our purposes? Should we use functions or definite descriptions? Why are
we formalizing our claims? Answers to questions about why we are translating into
logical languages can help us frame our decisions about levels of precision and charity.
Suggested Readings
Davidson, Donald. “Coherence Theory of Truth and Knowledge.” In Truth and Interpretation:
Perspectives on the Philosophy of Donald Davidson, edited by Ernest LePore, 308–319.
Oxford, UK: Blackwell, 1986. Davidson uses the principle of charity in responding to
problems of skepticism.
Dummett, Michael. “Quantifiers.” In A Philosophical Companion to First-Order Logic, edited
by R. I. Hughes, 136–161. Indianapolis, IN: Hackett, 1993. A detailed examination of the
nature of quantification in Frege’s work.
Fisher, Jennifer. On the Philosophy of Logic. Belmont, CA: Wadsworth, 2008. Chapters 1 and 5
contain some useful observations on the utility of quantificational logics.
Frege, Gottlob. Begriffsschrift. In From Frege to Gödel: A Source Book in Mathematical Logic,
1879–1931, edited by Jean van Heijenoort, 1–82. Cambridge, MA: Harvard University
Press, 1967. The preface is an important and engaging statement of the purposes of formal
logic, and includes Frege’s eye and microscope analogies.
Kneale, W., and M. Kneale. The Development of Logic. Oxford, UK: Clarendon Press, 1962.
In this classic history of logic, chapter VIII on Frege’s logic contains a detailed and lucid
discussion of Frege’s work on quantification.
Quine, W. V. Methods of Logic, 4th ed. Cambridge, MA: Harvard University Press, 1982.
Chapters 14–18 describe various alternatives to quantification, historical antecedents to
Fregean quantification, and their limits.
Quine, W. V. Word and Object. Cambridge, MA: MIT Press, 1960. Chapter 2, “Translation
and Meaning,” is an influential work on the challenges of translation, especially from an
unknown language, and contains some of Quine’s thoughts on charity.
Sainsbury, Mark. Logical Forms: An Introduction to Philosophical Logic, 2nd ed. Oxford,
UK: Blackwell, 2001. Chapter 4, “Quantification,” is a broad discussion of the uses of
quantificational logic, with close attention to questions about translation.
Strawson, P. F. “Logical Appraisal.” In A Philosophical Companion to First-Order Logic, edited
by R. I. Hughes, 6–27. Indianapolis, IN: Hackett, 1993. This chapter contains insightful
observations about logic and our goals in using it.
Chapter 4 Monadic Predicate Logic
Appendix to 4.9
DERIVING 4.9.7 FROM 4.9.6
1. (∀x)[(Px ∨ Ax) ⊃ Rx] Premise
2. ∼[(∀x)(Px ⊃ Rx) ∙ (∀x)(Ax ⊃ Rx)] AIP
3. ∼(∀x)(Px ⊃ Rx) ∨ ∼(∀x)(Ax ⊃ Rx) 2, DM
4. (∃x)∼(Px ⊃ Rx) ∨ (∃x)∼(Ax ⊃ Rx) 3, QE
5. (∃x)∼(∼Px ∨ Rx) ∨ (∃x)∼(∼Ax ∨ Rx) 4, Impl
6. (∃x)(Px ∙ ∼Rx) ∨ (∃x)(Ax ∙ ∼Rx) 5, DM, DN
7. (∃x)(Px ∙ ∼Rx) AIP
8. Pa ∙ ∼Ra 7, EI
9. (Pa ∨ Aa) ⊃ Ra 1, UI
10. Pa 8, Simp
11. Pa ∨ Aa 10, Add
12. Ra 9, 11, MP
13. ∼Ra 8, Com, Simp
14. Ra ∙ ∼Ra 12, 13, Conj
15. ∼(∃x)(Px ∙ ∼Rx) 7–14, IP
16. (∃x)(Ax ∙ ∼Rx) 6, 15, DS
17. Ab ∙ ∼Rb 16, EI
18. Ab 17, Simp
19. (Pb ∨ Ab) ⊃ Rb 1, UI
20. Pb ∨ Ab 18, Add, Com
21. Rb 19, 20, MP
22. ∼Rb 17, Com, Simp
23. Rb ∙ ∼Rb 21, 22, Conj
24. (∀x)(Px ⊃ Rx) ∙ (∀x)(Ax ⊃ Rx) 2–23, IP, DN
QED
KEY TERMS
first is also taller than the third'. We translate that claim, also known as the transitive
property of 'taller than', with three universal quantifiers, as in the third premise of
argument 5.1.17.
5.1.17 1. Tab
2. Tbc
3. (∀x)(∀y)(∀z)[(Txy ∙ Tyz) ⊃ Txz] / Tac
We will return to deriving the conclusion of this argument in section 5.3. For the
remainder of this section, and in the next section as well, we will look at some more
complicated translations.
More Translations
For 5.1.33–5.1.38, I use Px: x is a person, and Kxy: x knows y.
5.1.33 Someone knows everything. (∃x)[Px ∙ (∀y)Kxy]
5.1.34 Someone knows everyone. (∃x)[Px ∙ (∀y)(Py ⊃ Kxy)]
5.1.35 Everyone knows someone. (∀x)[Px ⊃ (∃y)(Py ∙ Kxy)]
5.1.36 Everyone knows everyone. (∀x)[Px ⊃ (∀y)(Py ⊃ Kxy)]
or (∀x)(∀y)[(Px ∙ Py) ⊃ Kxy]
5.1.37 No one knows everything. (∀x)[Px ⊃ (∃y)∼Kxy]
or ∼(∃x)[Px ∙ (∀y)Kxy]
5.1.38 No one knows everyone. (∀x)[Px ⊃ (∃y)(Py ∙ ∼Kxy)]
or ∼(∃x)[Px ∙ (∀y)(Py ⊃ Kxy)]
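These paired translations can be checked mechanically over a small finite domain, using Python's any and all as surrogates for ∃ and ∀. The domain and the 'knows' relation below are my own inventions, purely for illustration; on this interpretation 5.1.35 comes out true while 5.1.34 comes out false, confirming that the two translations are not equivalent.

```python
# Finite-domain check of 5.1.34 and 5.1.35. The domain, the persons, and
# the 'knows' relation are invented for illustration.
domain = ["ana", "ben", "cal"]
person = set(domain)                 # Px: everything here counts as a person
knows = {("ana", "ben"), ("ana", "cal"), ("ben", "ana"), ("cal", "ben")}

def K(x, y): return (x, y) in knows
def P(x): return x in person

# 5.1.34  Someone knows everyone: (∃x)[Px ∙ (∀y)(Py ⊃ Kxy)]
someone_knows_everyone = any(
    P(x) and all(not P(y) or K(x, y) for y in domain) for x in domain)

# 5.1.35  Everyone knows someone: (∀x)[Px ⊃ (∃y)(Py ∙ Kxy)]
everyone_knows_someone = all(
    not P(x) or any(P(y) and K(x, y) for y in domain) for x in domain)

print(someone_knows_everyone, everyone_knows_someone)  # False True
```

Here no one knows every person (no one in the relation knows him- or herself), though each person knows someone, so the order of the quantifiers genuinely matters.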
5.1: Translation Using Relational Predicates
When punctuating, make sure never to leave variables unbound. It is often useful
to punctuate after the translation is done, rather than along the way, or at least to
check your punctuation once you have completed a translation. Leading quantifiers
generally have the whole statement in their scope. Other quantifiers tend to have
smaller scopes. In 5.1.26, we saw two quantifiers with very narrow scopes. The second
quantifier in 5.1.45, as in many of the earlier examples, has the remainder of the
formula in its scope since it has to bind a variable in the last term of the wff.
5.1.45 A dead lion is more dangerous than a live dog.
(Ax: x is alive; Dx: x is a dog; Lx: x is a lion;
Dxy: x is more dangerous than y)
(∀x){(Lx ∙ ∼Ax) ⊃ (∀y)[(Dy ∙ Ay) ⊃ Dxy]}
The Power of F
F allows us to translate some neat subtleties and helps us understand many
aspects of our language. Using F can be pretty amusing, too. For example, check out
the formalization of William Carlos Williams’s “The Red Wheelbarrow,” at 5.1.46,
using: Bx: x is a wheelbarrow; Bxy: x is beside y; Cx: x is a chicken; Dxy: x depends
on y; Gxy: x glazes y; Rx: x is red; Sx: x is rainwater; Wx: x is white.
5.1.46 so much depends
upon
a red wheel
barrow
glazed with rain
water
beside the white
chickens.
(∃x){(Bx ∙ Rx) ∙ (∃y)Dyx ∙ (∃z)(Sz ∙ Gzx) ∙ (∃w)(Cw ∙ Ww ∙ Bxw)}
An interesting exercise would be to discuss the virtues and weaknesses of this for-
malization. Another interesting exercise would be to translate other work.
There is a translation of Williams’s “This Is Just to Say” at the end of the exercises;
look it up and give it a try before peeking!
Summary
F is a powerful language, nearly the strongest of the formal languages we will study. It
allows us to represent, in logical language, a wide range of propositions and inferences
of English, without the ambiguity of natural languages. Exercise set 5.1b, which I
adapted from the logic textbook I used as an undergraduate, asks you to translate into
English some well-known sentences that have been rendered in F; there especially you'll
see how much subtlety and expression can be packed into the wffs of F. Once
again, the best way to get comfortable with the difficulties and subtleties of F is to
practice your translations as much as possible. Remember to check your punctuation.
In the next section, I’ll lay out the formal syntax and semantics of F. You might
find that looking at that material will help you with the translation exercises of this
section. Then we’ll look at derivations in section 5.3. There’s one more major topic,
identity theory, that we will study in sections 5.4 and 5.5. Identity theory actually
doesn’t change our language, but it introduces a special predicate and some rules
governing inferences with it.
KEEP IN MIND
Relational predicates can be followed by any number of singular terms, though most of our
work in this text will use one- to three-place predicates.
The order of the singular terms matters.
The order of quantifiers also matters.
Try to keep the scope of your quantifiers as narrow as possible.
When all quantifiers are existential or all are universal, putting them all in front, with wide
scope, is acceptable.
Be careful to distinguish “someone” from “something” and “everyone” from “everything.”
It is important to punctuate correctly, never leaving an unbound variable.
EXERCISES 5.1a
Translate each of the following into predicate logic using
relational predicates.
40. If some students take courses in philosophy and mathematics, then all litera-
ture majors take courses in philosophy or mathematics.
132. Some unpunished act is more heinous than any punished act.
133. For any act that is punished, there is some act more heinous.
134. Any act that produces better consequences than some act is laudable.
135. No laudable act is more heinous than all punished acts.
136. For any laudable act, some more heinous act produces better consequences.
137. Some laudable act does not produce better consequences than some act that is
not laudable.
138. Some good acts are punished even though they produce better consequences
than some acts that are not good.
139. No punished act is laudable if it doesn’t produce better consequences than
some good act.
140. If no good acts are punished, then no acts which are not good produce better
consequences than any laudable acts.
EXERCISES 5.1b
Use the translation key to translate the formulas into
natural English sentences.¹
¹ Adapted from I. Copi, Symbolic Logic, 5th ed. (New York: Macmillan, 1979), 127–128.
Chapter 5 Full First-Order Logic
I have eaten
the plums
that were in
the icebox
and which
you were probably
saving
for breakfast
Forgive me
they were delicious
so sweet
and so cold
(∃x){Px ∙ Dx ∙ Sx ∙ Cx ∙ (∃y)(Iy ∙ Ixy) ∙ ◊Sux ∙ Eix ∙ Fiu}
where: i: me; u: you; Cx: x is cold; Dx: x is delicious; Exy: x eats y; Fxy: x asks
forgiveness from y; Ix: x is an icebox; Ixy: x is in y; Px: x is a plum; Sx: x is
sweet; Sxy: x saves y; and ◊ is to be taken (contentiously) as a modal operator
representing 'probably'.
For relational predicates, our definitions of satisfaction and truth must be adjusted
as well. Objects in the domain can satisfy predicates; that remains the case for one-
place predicates. Ordered n-tuples may satisfy relational predicates. A wff will be
satisfiable if there are objects in the domain of quantification that stand in the rela-
tions indicated in the wff. A wff will be true for an interpretation if all objects in the
domain of quantification stand in the relations indicated in the wff. The definition of
logical truth remains the same: a wff is logically true if, and only if, it is true for every
interpretation.
For an example, let’s extend the interpretation we considered when originally dis-
cussing semantics of M, in section 4.7, to the theory TF1.
Theory TF1: 1. Pa ∙ Pb
2. Ib ∙ ∼Ic
3. Nab
4. Nbc
5. (∃x)(Px ∙ Nxb)
6. (∃x)(Px ∙ Nbx)
7. (∀x)[Ix ⊃ (∃y)(Py ∙ Nxy)]
An Interpretation of TF1
Domain: {Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune}
a: Venus
b: Mars
c: Neptune
Px: {Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune}
Ix: {Mercury, Venus, Earth, Mars}
Nxy: {<Mercury, Venus>, <Mercury, Earth>, <Mercury, Mars>,
<Mercury, Jupiter>, <Mercury, Saturn>, <Mercury, Uranus>,
<Mercury, Neptune>, <Venus, Earth>, <Venus, Mars>, <Ve-
nus, Jupiter>, <Venus, Saturn>, <Venus, Uranus>, <Venus,
Neptune>, <Earth, Mars>, <Earth, Jupiter>, <Earth, Saturn>,
<Earth, Uranus>, <Earth, Neptune>, <Mars, Jupiter>, <Mars,
Saturn>, <Mars, Uranus>, <Mars, Neptune>, <Jupiter, Saturn>,
<Jupiter, Uranus>, <Jupiter, Neptune>, <Saturn, Uranus>,
<Saturn, Neptune>, <Uranus, Neptune>}
Notice that our interpretation is a model of TF1; all of the statements of the theory
come out true.
Constructing an interpretation of a theory of F can be arduous, especially if the
theory contains lots of relational predicates. I wrote out all of the ordered pairs for
‘Nxy’. But, as you probably observed, I could have just said that I was taking that rela-
tion to be interpreted as ‘x is nearer to the sun than y’, in which case I would have at
least provided a rule that allows us to generate the list if we need it.
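That rule-based alternative can be sketched in a few lines of Python: generate the extension of 'Nxy' from the planets' order from the sun, then check each of TF1's seven statements against the interpretation. (Reading 'Ix' as 'x is an inner planet' matches the listed extension, but that gloss is my assumption, not the text's.)

```python
# Generate the pairs for 'Nxy' ('x is nearer to the sun than y') by rule,
# then verify that the interpretation models all seven statements of TF1.
planets = ["Mercury", "Venus", "Earth", "Mars",
           "Jupiter", "Saturn", "Uranus", "Neptune"]
a, b, c = "Venus", "Mars", "Neptune"
P = set(planets)                              # Px: the whole domain
I = {"Mercury", "Venus", "Earth", "Mars"}     # Ix, as listed above
N = {(x, y) for x in planets for y in planets
     if planets.index(x) < planets.index(y)}  # nearer to the sun than

checks = [
    a in P and b in P,                             # 1. Pa ∙ Pb
    b in I and c not in I,                         # 2. Ib ∙ ∼Ic
    (a, b) in N,                                   # 3. Nab
    (b, c) in N,                                   # 4. Nbc
    any(x in P and (x, b) in N for x in planets),  # 5. (∃x)(Px ∙ Nxb)
    any(x in P and (b, x) in N for x in planets),  # 6. (∃x)(Px ∙ Nbx)
    all(x not in I or any(y in P and (x, y) in N for y in planets)
        for x in planets),                         # 7. (∀x)[Ix ⊃ (∃y)(Py ∙ Nxy)]
]
print(len(N), all(checks))  # 28 True
```

The rule generates all twenty-eight ordered pairs, and every statement of the theory comes out true, just as hand-checking the list shows.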
For a three-place predicate, we use ordered triples. We can interpret the predicate
‘Bxyz’ as at 5.2.2, with a small domain.
5.2: Syntax, Semantics, and Invalidity in F
Invalidity in F
The method of finite domains of section 4.8 can be used in F just as well as it can be
used in M, though the preponderance of overlapping quantifiers in many formulas of
F can make the process more arduous. Let’s work with the invalid argument 5.2.3.
5.2.3 (∀x)[Px ⊃ (∃y)(Py ∙ Lxy)]
(∃x)(Px ∙ Qx) / (∃x)[Qx ∙ (∃y)Lyx]
The argument is easily expanded into a domain of one object, though there is no
counterexample there.
We’re ready to construct the counterexample, lining up the premises and a conclu-
sion, after a list of all the atomic formulas. It has taken a little more work to get to the
unquantified expansion, but the work from here is no more difficult than it was in M.
I’ll start with the second premise. One of the disjuncts has to be true, so I’ll arbi-
trarily choose the first one, making Pa and Qa true. Carrying those values into the
conclusion, we see that Laa and Lba must be false. Then, on the left side of the first
premise, we can see that Pb and Lab each must be true.
With those choices, the expansion into the two-member domain {a, b} looks like this so far:
[Pa ⊃ ((Pa ∙ Laa) ∨ (Pb ∙ Lab))] ∙ [Pb ⊃ ((Pa ∙ Lba) ∨ (Pb ∙ Lbb))]
(Pa ∙ Qa) ∨ (Pb ∙ Qb) / [Qa ∙ (Laa ∨ Lba)] ∨ [Qb ∙ (Lab ∨ Lbb)]
with Pa, Qa, Pb, and Lab true, and Laa and Lba false.
The second premise is done, but we still have to make the right conjunct of the first
premise true and the right disjunct of the conclusion false.
All we need to do to make the first premise true is make Lbb true. Then the disjunc-
tion is true, and so the conditional is also true, finishing our work with the premise.
Only the conclusion remains, and that’s easily completed by making Qb false.
The completed counterexample assigns true to Pa, Pb, Qa, Lab, and Lbb, and false
to Qb, Laa, and Lba. On that assignment, both premises come out true and the
conclusion comes out false, so the argument is invalid.
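The search that the last few paragraphs perform by hand can also be automated. The sketch below (an illustration, not the text's method) tries every assignment of truth values to the eight atomic sentences of the two-member expansion of 5.2.3 and reports one that makes the premises true and the conclusion false.

```python
# Brute-force counterexample search for 5.2.3 over the domain {a, b}.
from itertools import product

D = ["a", "b"]
atoms = ([f"P{x}" for x in D] + [f"Q{x}" for x in D]
         + [f"L{x}{y}" for x in D for y in D])

def find_counterexample():
    for values in product([True, False], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        # (∀x)[Px ⊃ (∃y)(Py ∙ Lxy)]
        prem1 = all(not v[f"P{x}"]
                    or any(v[f"P{y}"] and v[f"L{x}{y}"] for y in D) for x in D)
        # (∃x)(Px ∙ Qx)
        prem2 = any(v[f"P{x}"] and v[f"Q{x}"] for x in D)
        # (∃x)[Qx ∙ (∃y)Lyx]
        concl = any(v[f"Q{x}"] and any(v[f"L{y}{x}"] for y in D) for x in D)
        if prem1 and prem2 and not concl:
            return v
    return None

print(find_counterexample() is not None)  # True: a counterexample exists
```

With eight atomic sentences there are only 256 assignments to try, so exhaustive search is cheap here, though, as the summary below notes, expansions grow quickly with more quantifiers and larger domains.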
Summary
The semantics for F are not much different from the semantics for M, except for the
interpretations of relational predicates by ordered n-tuples. The semantic definitions
of validity and logical truth remain unaltered.
We can also still use our method of finite domains, though its utility is limited. The
expansions of formulas with three or more quantifiers can get unpleasantly long, even
in a two-membered domain, let alone larger domains. But this method can generate
counterexamples reliably for many invalid arguments.
There are other methods for generating counterexamples for invalid arguments of F
and the further extensions of logic in this book. Most notably, truth trees, sometimes
called semantic tableaux, can be both amusing and effective. But we’ll stick with our
work on natural deduction, moving to proof theory for F in the next section.
KEEP IN MIND
EXERCISES 5.2a
Construct models for each of the given theories by
specifying a domain of interpretation and interpreting the
constants and predicates so that all sentences of the theory
come out true.
1. 1. Aa ∙ Ab
2. Rab ∙ Rba
3. (∃x)∼Rax ∙ (∃x)∼Rbx
4. (∃x)∼Rxa ∙ (∃x)∼Rxb
2. 1. (Pa ∙ Pb) ∙ Pc
2. Babc ∙ ∼Bcba
3. (∀x)(∃y)(∃z)(Byxz ∨ Bzxy)
3. 1. Pa ∙ ∼Sa
2. Pb ∙ ∼Tb
3. (∃x)(∃y)[(Px ∙ Py) ∙ (Rxy ∙ ∼Ryx)]
4. (∀x)[Px ⊃ (Sx ∨ Tx)]
EXERCISES 5.2b
Show that each of the following arguments is invalid by
generating a counterexample.
1. 1. Aa ∙ Ab
2. Bab ∙ ∼Bba / (∃x)(Ax ∙ Bxa)
2. 1. (∃x)Cax
2. (∃x)Cbx / (∃x)(Cax ∙ Cbx)
3. 1. Da ∙ (∃x)Eax
2. Db ∙ (∃x)Ebx / (∀x)[Dx ∙ (∃y)Exy]
4. 1. (∀x)(Fax ⊃ Gx)
2. (∀x)[Gx ⊃ (∃y)Fyx]
3. Faa / (∀x)(∃y)Fyx
5. 1. (∀x)(∀y)[(Jx ∙ Jy) ⊃ (Kxa ∙ Kya)]
2. Jb / Kba
6. 1. (∀x)[Lx ⊃ (∃y)Mxy]
2. ∼Mab / ∼La
7. 1. (∃x)[Px ∙ (∃y)(Py ∙ Qxy)]
2. (∀x)(Px ⊃ Rx) / (∀x)[Rx ⊃ (∃y)(Py ∙ Qyx)]
8. 1. (∀x)(∀y)[(Hx ∙ Hy) ⊃ Ixy]
2. Ha / (∀x)[Hx ⊃ (∀y)Ixy]
9. 1. Da ∙ Eab
2. (∃x)(∃y)(Eyx ∙ Fx) / (∃x)(Dx ∙ Fx)
5.3: DERIVATIONS IN F
In section 5.1, I motivated extending our language M to a language F by introducing
relational predicates to regiment argument 5.1.1.
5.1.1 Alyssa is taller than Bhavin.
Bhavin is taller than Carlos.
Given any three things, if one is taller than another, and the latter is
taller than the third, then the first is also taller than the third.
So, Alyssa is taller than Carlos.
1. Tab
2. Tbc
3. (∀x)(∀y)(∀z)[(Txy ∙ Tyz) ⊃ Txz] / Tac
To derive the conclusion, we use the same rules of inference we used with M. When
instantiating, we remove quantifiers one at a time, taking care to make appropriate
instantiations to variables or constants. We will need to make only one small
adjustment to the rule UG, which I will note shortly. A derivation of our motivating
argument is below, at 5.3.1. Notice that the removal of quantifiers from the third
premise takes three steps.
5.3.1 1. Tab
2. Tbc
3. (∀x)(∀y)(∀z)[(Txy ∙ Tyz) ⊃ Txz] / Tac
4. (∀y)(∀z)[(Tay ∙ Tyz) ⊃ Taz] 3, UI
5. (∀z)[(Tab ∙ Tbz) ⊃ Taz] 4, UI
6. (Tab ∙ Tbc) ⊃ Tac 5, UI
7. (Tab ∙ Tbc) 1, 2, Conj
8. Tac 6, 7, MP
QED
The Restriction on UG
All of our rules for removing and replacing quantifiers work in F just as they did in M,
with only one exception. Consider the problematic 5.3.3, beginning with a proposition
that can be interpreted as ‘Everything loves something’.
5.3.3 1. (∀x)(∃y)Lxy
2. (∃y)Lxy 1, UI
3. Lxa 2, EI
4. (∀x)Lxa 3, UG: but wrong!
5. (∃y)(∀x)Lxy 4, EG
Given our interpretation of line 1, line 5 reads, ‘There’s something that everything
loves’. It does not follow from the proposition that everything loves something that
there is one thing that everything loves. Imagine that we arranged all the things in
a circle and everyone loved just the thing to its left. Line 1 would be true, but line 5
would be false. We should not be able to derive step 5 from step 1.
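The circle counterexample can be made concrete with a quick finite-domain check, sketched here in Python with three objects standing in for "all the things":

```python
# Three things in a circle, each loving only the thing to its left.
D = [0, 1, 2]
loves = {(0, 1), (1, 2), (2, 0)}

line1 = all(any((x, y) in loves for y in D) for x in D)  # (∀x)(∃y)Lxy
line5 = any(all((x, y) in loves for x in D) for y in D)  # (∃y)(∀x)Lxy
print(line1, line5)  # True False
```

Line 1 is true and line 5 false on this model, so no acceptable set of rules can let 5.3.3 go through.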
We can locate the problem in step 4 of 5.3.3. In line 2 we universally instantiated
to an arbitrary object x. So, 'x' could have stood for any object. It retains its
universal character, even without a universal quantifier to bind it, and so we are free
to UG over x.
Then, in line 3, we existentially instantiated. In existentially instantiating, we gave
a name, 'a', to the thing that x bears relation L to: the thing that x loves. Once we
gave a name to the thing that x loves, x lost its universal character. It could no longer
be anything that loves something. It now is the thing that loves a. Thus ‘x’ became as
particular an object as ‘a’ is. So, the generalization at line 4 must be blocked. In other
words, variables lose their universal character if they are free when EI is used.
We formulate the resultant restriction on UG as 5.3.4.
5.3.4 Never UG on a variable when there’s a constant present and the
variable was free when the constant was introduced.
A constant may be introduced as the result of EI or UI, and these are the cases you
will have to keep your eye on. Constants may also be introduced in the premises,
though there are ordinarily no free variables in premises, since premises should be
closed formulas. The restriction on UG debars line 4 of 5.3.3 because ‘x’ was free in
line 3 when ‘a’ was introduced.
5.3.5 contains an acceptable use of UG in F.
5.3.5 1. (∃x)(∀y)[(∃z)Ayz ⊃ Ayx]
2. (∀y)(∃z)Ayz / (∃x)(∀y)Ayx
3. (∀y)[(∃z)Ayz ⊃ Aya] 1, EI
4. (∃z)Ayz ⊃ Aya 3, UI
5. (∃z)Ayz 2, UI
6. Aya 4, 5, MP
7. (∀y)Aya 6, UG
8. (∃x)(∀y)Ayx 7, EG
QED
Note that at line 7, UG is acceptable because ‘y’ was not free when ‘a’ was introduced
in line 3. The restriction 5.3.4 applies only to UG. All other rules are just as they are
in monadic predicate logic.
Accidental Binding
When using UG or EG, watch for illicit accidental binding. 5.3.6 contains an instance
of accidental binding.
More Derivations
Derivations in F often involve propositions with overlapping quantifiers. Neverthe-
less, we must adhere to the rules and restrictions we had in M, as well as the new
restriction on UG for F. UI and EG remain anytime-anywhere rules. The restrictions
on EI can be trickier to manage, since quantifiers may be buried inside formulas. Still,
remember always to EI to a new constant. Derivations with more than one existential
quantifier in the premises are likely to need multiple constants, as in 5.3.9, where at
line 4 I EI line 2 to ‘b’ because I had already EIed line 1 to ‘a’.
5.3.9 1. (∃x)[Px ∙ (∀y)(Py ⊃ Qxy)]
2. (∃x)(Px ∙ Sx) / (∃x)[Sx ∙ (∃y)Qyx]
3. Pa ∙ (∀y)(Py ⊃ Qay) 1, EI
4. Pb ∙ Sb 2, EI
5. (∀y)(Py ⊃ Qay) ∙ Pa 3, Com
6. (∀y)(Py ⊃ Qay) 5, Simp
7. Pb ⊃ Qab 6, UI
8. Pb 4, Simp
9. Qab 7, 8, MP
10. (∃y)Qyb 9, EG
11. Sb ∙ Pb 4, Com
12. Sb 11, Simp
13. Sb ∙ (∃y)Qyb 12, 10, Conj
14. (∃x)[Sx ∙ (∃y)Qyx] 13, EG
QED
It remains generally useful to EI before you UI. But sometimes an existential quan-
tifier is buried in a line and we cannot instantiate its subformula until we have the
quantifier as the main operator, as in 5.3.10, which uses conditional proof.
Logical Truths
We can use CP and IP to prove logical truths in F. 5.3.13 proves that ‘(∃x)(∀y)Pxy ⊃
(∀x)(∃y)Pyx’ is a logical truth by conditional proof.
5.3.13 1. (∃x)(∀y)Pxy ACP
2. (∀y)Pay 1, EI
3. Pax 2, UI
4. (∃y)Pyx 3, EG
5. (∀x)(∃y)Pyx 4, UG
6. (∃x)(∀y)Pxy ⊃ (∀x)(∃y)Pyx 1–5, CP
QED
Notice that the use of UG at line 5 is legitimate: the constant introduced by EI at
line 2 was generalized away by EG at line 4, so there's no constant present on the
line on which I used UG.
As with all other proofs in F, take your time with the quantifiers. Notice that the
exchange of the consecutive quantifiers from lines 6–8 takes two separate steps. Be
careful also to obey the restrictions on UG, and always EI to a new constant.
Summary
Derivations in F look different from those in M, and they are generally more
complex, but the rules are basically the same. The presence of multiple quantifiers
tends to lengthen any derivation, since instantiation, generalization, and the
exchange of quantifiers have to be done one step at a time. Keep track of your
variables and constants, make sure to obey the restrictions on UG and EI, and be
patient. And, of course, practice. It is much better to do a little every day than to
try to do a lot at once.
KEEP IN MIND
All rules for M are the same for F, with one exception, a restriction on UG.
Never UG on a variable when there’s a constant present and the variable was free when the
constant was introduced.
Remove quantifiers from formulas one at a time, and only when they are the main
operators.
Logical truths of F can be derived using conditional or indirect proof, just as for M.
EXERCISES 5.3a
Derive the conclusions of each of the following arguments.
1. 1. Bab
2. (∀x)(Bax ⊃ Ax) / (∃x)Ax
2. 1. Da ∙ (∃x)Eax
2. Db ∙ (∀x)Ebx / (∃x)(Eax ∙ Ebx)
3. 1. Fab
2. (∀x)(Fax ⊃ Gx)
3. (∀x)(Gx ⊃ Fxa) / Fba
4. 1. ∼(∃x)(Hx ∙ Ixa)
2. (∃x)Ixa / (∃x)∼Hx
5. 1. (∀x)[Lx ⊃ (∃y)Mxy]
2. (∀y)∼May / ∼La
6. 1. Aa ∙ (Ba ∙ ∼Cab)
2. (∀y)Cay ∨ (∀z)Dbz / (∃y)(∀z)Dyz
7. 1. (∀x)[(∃y)Bxy ⊃ (Ax ∨ Cx)]
2. (∃z)(∼Az ∙ ∼Cz) / (∃z)(∀y)∼Bzy
8. 1. Db ∙ Eab
2. (∀x)[(∃y)Eyx ⊃ Fx] / (∃x)(Dx ∙ Fx)
9. 1. (∃x)[Nx ∙ (∃y)(Ny ∙ Qxy)]
2. (∀x)(Nx ⊃ Px) / (∃x)[Px ∙ (∃y)(Py ∙ Qyx)]
10. 1. (∃x)[Qx ∨ (∃y)(Ry ∙ Pxy)]
2. ∼(∃x)(Sx ∨ Qx) / (∃z)(∃y)(Ry ∙ Pzy)
11. 1. (∀x)[(∀y)Uxy ⊃ (Tx ∙ Vx)]
2. ∼(∃x)Tx / (∃z)∼Uza
12. 1. (∀x)[Ax ⊃ (∃y)Bxy]
2. (∀x)[(∃y)Bxy ⊃ (Cx ∨ Dx)]
3. (∃x)(Ax ∙ ∼Cx) / (∃x)(Ax ∙ Dx)
13. 1. (∃x)[Mx ∙ (∃y)(Ny ∙ Lxy)]
2. (∀x)(∀y)[Lxy ⊃ (∃z)Oyz] / (∃x)(∃y)Oxy
14. 1. (∀x)[Ex ∙ (Fx ∨ Gx)]
2. (∃x){Hx ∙ (∀y)[(Fy ∨ Gy) ⊃ Ixy]} / (∃y)(∃x)Ixy
15. 1. (∀x)[Ax ⊃ (∃y)(Cy ∙ Dxy)]
2. (∀x)(∀y)(Dxy ⊃ By) / (∀x)Ax ⊃ (∃y)(By ∙ Cy)
EXERCISES 5.3b
Translate each of the following arguments into propositions
of F using the indicated formulas. Then, derive the
conclusions of the arguments.
1. Some ballet dancers are shorter than some gymnasts. No gymnasts are clumsy.
So, it is not the case that all things are clumsy. (Bx: x is a ballet dancer; Gx: x is
a gymnast; Cx: x is clumsy; Sxy: x is shorter than y)
2. Anyone who teaches a math class is intelligent. Professor Rosen is a person who
teaches Calculus I. Calculus I is a math class. So, Professor Rosen is intelligent.
(c: Calculus I; r: Professor Rosen; Px: x is a person; Ix: x is intelligent; Mx: x is
a math class; Txy: x teaches y)
3. All cats love all dogs. It is not the case that everything loves Brendan; and all
things are cats. So, it is not the case that everything is a dog. (b: Brendan; Cx:
x is a cat; Dx: x is a dog; Lxy: x loves y)
4. Alice buys a baguette from some store. Baguettes are food. Alice is a resident
of Clinton. So, some residents of Clinton buy some food from some store. (a:
Alice; c: Clinton; Bx: x is a baguette; Fx: x is food; Sx: x is a store; Rxy: x is a
resident of y; Bxyz: x buys y from z)
5. All philosophers have some mentor to whom they respond. Either something
isn’t a philosopher or nothing is a mentor. So, not everything is a philosopher.
(Mx: x is a mentor; Px: x is a philosopher; Rxy: x responds to y)
6. Some students read books written by professors. All books written by profes-
sors are well-researched. So, some professor wrote a well-researched book. (Bx:
x is a book; Px: x is a professor; Sx: x is a student; Wx: x is well-researched; Rxy:
x reads y; Wxy: x wrote y)
7. Sunflowers and roses are plants. Some sunflowers grow taller than all roses.
Russell gave a rose to Emily. So, some plant is taller than some rose. (e: Emily;
r: Russell; Px: x is a plant; Rx: x is a rose; Sx: x is a sunflower; Gxy: x grows taller
than y; Gxyz: x gives y to z)
8. There is something trendier than everything that’s expensive or of good quality.
Anything that’s meaningful or serves a purpose is either expensive, or there’s
something more uninteresting than it. Not everything is expensive or not
meaningful, but everything is of good quality. So, there is something trendier,
and there is something more uninteresting, than something of good quality.
(Ex: x is expensive; Mx: x is meaningful; Px: x serves a purpose; Qx: x is of good
quality; Txy: x is trendier than y; Uxy: x is more uninteresting than y)
9. All philosophers are more skeptical than some physicists. All physicists are sci-
entists. So, all philosophers are more skeptical than some scientists. (Px: x is
a philosopher; Sx: x is a scientist; Yx: x is a physicist; Sxy: x is more skeptical
than y)
10. Some sets include sets. If something includes all sets, then it is not a set. So,
some set does not include some set. (Sx: x is a set; Ixy: x includes y)
11. All philosophers who influenced Mill influenced Quine. Bentham was a politi-
cal theorist and a philosopher who influenced Mill. Any philosopher who influ-
enced Quine was an empiricist. So, Bentham was an empiricist. (b: Bentham;
m: Mill; q: Quine; Ex: x is an empiricist; Px: x is a philosopher; Tx: x is a politi-
cal theorist; Ixy: x influenced y)
12. Any act with better consequences than some act is more morally required than
it. Pulling the lever in a trolley case is an act with better consequences than the
act of ignoring it. If pulling the lever is more morally required than ignoring it,
then the doctrine of acts and omissions is unsupportable. So, the doctrine of
acts and omissions is unsupportable. (a: the doctrine of acts and omissions; i:
ignoring the lever in a trolley case; p: pulling the lever in a trolley case; Ax: x
is an act; Sx: x is supportable; Cxy: x has better consequences than y; Mxy: x is
more morally required than y)
13. Any characteristic that is between extremes is a virtue. Cowardice and rash-
ness are vices. Every vice is an extreme. Courage is a characteristic between
cowardice and rashness. So, courage is a virtue. (c: courage; f: cowardice; r:
rashness; Cx: x is a characteristic; Ex: x is an extreme; Gx: x is a virtue; Vx: x is
a vice; Bxyz: y is between x and z)
14. All virtues are between some extremes. Any characteristic between any two
things is not an extreme. Any characteristic that is not extreme has some ben-
efit. Temperance is a characteristic that is a virtue. So, temperance has some
benefit. (t: temperance; Cx: x is a characteristic; Ex: x is an extreme; Vx: x is a
virtue; Bxy: x is a benefit of y; Bxyz: y is between x and z)
15. Philosophers who are read more widely than other philosophers have greater
influence than them. No philosopher has greater influence than the philoso-
pher Plato. So, no philosopher is read more widely than Plato. (p: Plato; Px: x
is a philosopher; Ixy: x has greater influence than y; Rxy: x is read more widely
than y)
16. Given any three works of philosophy, if the first has greater influence than the
second, and the second has greater influence than the third, then the first has
greater influence than the third. Gorgias, Republic, and Laws are all dialogues
written by Plato. Everything written by Plato is a work of philosophy. Gorgias
has more influence than Laws, but Republic has more influence than Gorgias. So,
Republic has greater influence than Laws. (g: Gorgias; l: Laws; p: Plato; r: Repub-
lic; Dx: x is a dialogue; Wx: x is a work of philosophy; Ixy: x has greater influence
than y; Wxy: x wrote y)
EXERCISES 5.3c
Derive the following logical truths of F.
1. (∀x)(∀y)Axy ⊃ (∃x)(∃y)Axy
2. (∃x)(∀y)Dyx ⊃ (∃x)Dxx
3. (∀x)Fmxn ⊃ (∃x)(∃y)Fxoy
4. (∀x)(∃y)(Gxy ∨ ∼Gxx)
5. (∃x)Exx ⊃ (∃x)(∃y)Exy
6. (∃x)∼Bxa ∨ (∃x)Bbx
7. (∃x)(∀y)Cxy ⊃ (∀y)(∃x)Cxy
8. (∀x)(∃y)Hxy ⊃ (∃x)(∃y)Hxy
9. (∀x)[Px ⊃ (∃y)(Qy ∙ Rxy)] ⊃ {(∃x)Px ⊃ (∃x)(∃y)[(Px ∙ Qy) ∙ Rxy]}
10. (∃x)(∀y)(Jxy ∙ ∼Jyx) ∨ (∀x)(∃y)(Jxy ⊃ Jyx)
11. (∃x)(∀y)(Kxy ∨ Kyx) ⊃ (∃x)[(∃y)∼Kxy ⊃ (∃y)Kyx]
12. (∀x)[Px ⊃ (∃y)Qxy] ⊃ [(∀x)(∀y)∼Qxy ⊃ ∼(∃x)Px]
13. (∃x)[Px ∙ (∃y)(Qy ∙ Rxy)] ⊃ (∃x)[Qx ∙ (∃y)(Py ∙ Ryx)]
14. (∀x)[Px ⊃ (∀y)Qxy] ≡ (∀x)(∀y)(Py ⊃ Q yx)
15. (∀x)[Px ⊃ (∃y)(Qy ∙ Rxy)] ∨ (∃x)(∀y)[Px ∙ ∼(Qy ∙ Rxy)]
EXERCISES 5.3d
For each argument, determine whether it is valid or invalid.
If it is valid, derive the conclusion using our rules of
inference and equivalence. If it is invalid, provide a
counterexample.
1. 1. (∀x)(∀y)(Bxy ≡ Byx)
2. Bab ∙ Bbc / Bac
2. 1. (∀x)(∀y)(Pxy ≡ ∼Pyx)
2. ∼(∃x)Pxa / (∃x)Pax
3. 1. (∀x)(Px ⊃ Qxi)
2. (∃x)(Qix ∙ Px)
3. Pa / Qia
4. 1. (∀x)(∀y)(∃z)(Bxzy ≡ Byzx)
2. Babc / Bcba
5. 1. (∀x)[Px ⊃ (∃y)(Py ∙ Qxy)]
2. (∀x)(Px ⊃ ∼Rx) / (∀x)[Rx ⊃ (∀y)(Ry ⊃ ∼Qxy)]
6. 1. (∀x)[Px ⊃ (∃y)(Py ∙ Rxy)]
2. (∀x)(Px ⊃ Qx) / (∀x)[Px ⊃ (∃y)(Qy ∙ Rxy)]
7. 1. (∀x)[Px ⊃ (∃y)Qxy]
2. (∃x)∼Qax / (∃x)∼Px
8. 1. (∀x)[Ux ⊃ (∃y)(Ty ∙ Vxy)]
2. (∃x)Vax ⊃ (∀x)Vax
3. Ua / (∃x)(∀y)Vxy
9. 1. (∃x)(∃y)[(Px ∙ Py) ∙ Rxy]
2. (∃x)(∃y)[(Px ∙ Py) ∙ Qxy] / (∃x)(∃y)(Qxy ∙ Rxy)
10. 1. (∀x)(∀y)(Pxy ⊃ Pyx)
2. (∃x)[Qx ∙ (∀y)Pxy] / (∃x)[Qx ∙ (∀y)Pyx]
11. 1. (∀x)[(Px ∙ Qx) ⊃ Rxx]
2. (∃x)(Px ∙ ∼Rxx)
3. (∀x)[Qx ⊃ (∃y)(Py ∙ Rxy)] / (∃x)(∃y)(Rxy ∙ ∼Rxx)
12. 1. (∀x)[(∃y)Pxy ⊃ (∃y)Qxy]
2. (∃x)(∀y)∼Qxy / (∃x)(∀y)∼Pxy
13. 1. (∀x)[(∃y)Pxy ⊃ (∃y)Qxy]
2. (∃x)(∃y)∼Qxy / (∃x)(∃y)∼Pxy
14. 1. (∃x)(∀y)[(Fx ∙ Dx) ∨ (Ey ⊃ Gxy)]
2. (∀x)[(∃y)Gxy ⊃ (∃z)Hxz]
3. ∼(∃x)Fx ∙ (∀z)Ez / (∃y)(∃z)Hyz
15. 1. (∀x)(∀y)(Pxy ⊃ Pyx)
2. Pab ∙ Pbc / Pac
16. 1. (∀x)(∀y)(∀z)[(Pxy ∙ Pyz) ⊃ Pxz]
2. Pab ∙ ∼Pac / Pbc
17. 1. (∀x)(∀y)(∀z)[(Pxy ∙ Pyz) ⊃ Pxz]
2. Pab ∙ Pba / (∃x)Pxx
18. 1. (∀x)(∀y)(∀z)[(Pxy ∙ Pyz) ⊃ Pxz]
2. (∀x)Pxx
3. Pac ∙ ∼Pba / ∼Pcb
19. 1. (∀x)(∀y)(∀z)(Bxzy ≡ ∼Byzx)
2. (∀x)(∀y)(∀z){[(Px ∙ Py) ∙ Pz] ⊃ Bxyz}
3. Pa ∙ Pb
4. Babc / ∼Pc
20. 1. (∀x)(∀y)(Pxy ⊃ Pyx)
2. (∀x)[Qx ⊃ (∃y)(Sy ∙ Rxy)]
3. (∀x)(Sx ⊃ Qx)
4. Qa ∙ Pba / (∃x)(Qx ∙ Pxb) ∙ (∃x)(Qx ∙ Rax)
5.4: THE IDENTITY PREDICATE: TRANSLATION
So, we introduce derivation rules that govern inferences like this one and give
identity its own symbol, ‘=’.
Translation
The identity predicate allows us to reveal inferential structure for a wide variety of
propositions, making it extraordinarily powerful. It allows us to express propositions
with ‘only’ and ‘except’; superlatives; and ‘at least’, ‘at most’, and ‘exactly’; and to man-
age a problem with names and definite descriptions.
To start, note that, as a convention for the rest of the chapter, I will drop the
requirement on wffs that series of conjunctions and series of disjunctions have
brackets for every two conjuncts or disjuncts. Propositions using identity can
become long and complex. To reduce the amount of punctuation in our formulas,
given that commutativity and association hold for both conjunction and disjunction,
we allow such series, even if they have many terms, to be collected with one set of
brackets.
Thus, 5.4.8 can be written as 5.4.9, and 5.4.10 can be written as 5.4.11.
5.4.8 (∃x)(∃y){(Ax ∙ Bxj) ∙ [(Ay ∙ Iyj) ∙ x≠y]}
5.4.9 (∃x)(∃y)(Ax ∙ Bxj ∙ Ay ∙ Iyj ∙ x≠y)
5.4.10 (∀x)(∀y)(∀z)(∀w){[(Px ∙ Py) ∙ (Pz ∙ Pw)] ⊃ {[(x=y ∨ x=z) ∨
(x=w ∨ y=z)] ∨ (y=w ∨ z=w)}}
5.4.11 (∀x)(∀y)(∀z)(∀w)[(Px ∙ Py ∙ Pz ∙ Pw) ⊃ (x=y ∨ x=z ∨ x=w ∨ y=z
∨ y=w ∨ z=w)]
‘Only’ sentences can be even more complex, as at 5.4.18, in which both clauses con-
tain quantification.
5.4.18
Only Locke plays billiards with some rationalist who is read more
widely than Descartes.
(Rx: x is a rationalist; Mxy: x is read more widely than y;
Pxy: x plays billiards with y)
(∃x){(Rx ∙ Mxd ∙ Plx) ∙ (∀y)[(Ry ∙ Myd) ⊃ (∀z)(Pzy ⊃ z=l)]}
Sentences with ‘except’ also contain universal claims and a preceding clause. As
usual, universal claims have a conditional as the main propositional operator in their
scope. But identity shows up in the consequent of the conditional for ‘only’ claims,
while it shows up in the antecedent in ‘except’ claims, allowing us to omit the desired
exception, as in 5.4.19.
5.4.19 Everyone except Julio loves Maria.
∼Ljm ∙ (∀x)[(Px ∙ x≠j) ⊃ Lxm]
Ordinarily, when we use ‘except’, not only do we exempt one individual from a uni-
versal claim, we also deny that whatever we are ascribing to everyone else holds of
the exemption. Julio doesn’t love Maria, and every other person does. As with ‘only’
sentences, these denials are extra clauses that I put at the beginning.
5.4.20 and 5.4.21 have slightly more complex preceding clauses; you can see the
role of negation in the latter.
5.4.20 Every philosopher except Berkeley respects Locke.
Pb ∙ ∼Rbl ∙ (∀x)[(Px ∙ x≠b) ⊃ Rxl]
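The translation at 5.4.20 can also be tested on a small invented interpretation (the domain and relations below are mine, for illustration); the identity clause x≠b is what exempts Berkeley from the universal claim.

```python
# Checking 5.4.20 on an invented interpretation.
domain = ["berkeley", "locke", "hume", "newton"]
philosopher = {"berkeley", "locke", "hume"}            # newton is not
respects_locke = {"locke", "hume"}                     # who respects Locke

# Pb ∙ ∼Rbl ∙ (∀x)[(Px ∙ x≠b) ⊃ Rxl]
claim = ("berkeley" in philosopher
         and "berkeley" not in respects_locke
         and all(x not in philosopher or x == "berkeley" or x in respects_locke
                 for x in domain))
print(claim)  # True
```

Every philosopher other than Berkeley respects Locke, Berkeley does not, and the exempted case never gets tested against the consequent, so the whole claim comes out true.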
The exception clause added to the antecedent of the conditional following the uni-
versal quantifier can also be longer, as when we except more than one thing, as at
5.4.22.
5.4.22 Some philosopher respects all philosophers except Plato and Aristotle.
Pp ∙ Pa ∙ (∃x){Px ∙ ~Rxp ∙ ~Rxa ∙ (∀y)[(Py ∙ y≠p ∙ y≠a) ⊃ Rxy]}
Some uses of ‘but’ work just like ordinary uses of ‘except’, as at 5.4.23, which also
has a quantified preceding clause.
5.4.23
Every philosopher but Socrates wrote a book.
(Bx: x is a book; Px: x is a philosopher; Wxy: x wrote y)
Ps ∙ ∼(∃x)(Bx ∙ Wsx) ∙ (∀x)[(Px ∙ x≠s) ⊃ (∃y)(By ∙ Wxy)]
Socrates is a philosopher, and there is no book that he wrote, but for all philosophers
except Socrates, there is a book that they wrote. Of course, 5.4.23 is false, though
that’s no barrier to writing it.
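Because these regimentations have finite-model truth conditions, they can be checked mechanically. The sketch below is my illustration, not the text's: it evaluates the regimentation of 5.4.20 in two small models whose domain and extensions are invented for the example.

```python
# Evaluate 5.4.20, Pb . ~Rbl . (Ax)[(Px . x != b) -> Rxl], in a finite model.
# P is the extension of 'is a philosopher'; R that of 'respects' (a set of pairs).

def every_philosopher_but_berkeley_respects_locke(domain, P, R):
    return ('berkeley' in P                               # Pb
            and ('berkeley', 'locke') not in R            # ~Rbl
            and all((x, 'locke') in R                     # (Ax)[(Px . x!=b) -> Rxl]
                    for x in domain
                    if x in P and x != 'berkeley'))

domain = {'berkeley', 'locke', 'hume'}
P = {'berkeley', 'locke', 'hume'}                 # all three are philosophers
R_true = {('locke', 'locke'), ('hume', 'locke')}  # everyone else respects Locke
R_false = {('hume', 'locke')}                     # Locke fails to respect Locke

print(every_philosopher_but_berkeley_respects_locke(domain, P, R_true))   # True
print(every_philosopher_but_berkeley_respects_locke(domain, P, R_false))  # False
```

The second model fails because the universal clause requires even Locke, who is neither Berkeley nor otherwise excepted, to respect Locke.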
5.4: The Identity Predicate: Translation 355
SUPERLATIVES
Relational predicates allow us to express comparisons: larger than, smaller than, older
than, funnier than, and so on. The identity predicate allows us to express superlatives.
We have a comparison at 5.4.24 where ‘Ix’ stands for ‘x is an impressionist’ and ‘Bxy’
stands for ‘x is a better impressionist than y’.
5.4.24 Degas is a better impressionist than Monet. Id ∙ Im ∙ Bdm
We don’t really need the ‘Ix’ clauses for 5.4.24, and we don’t need identity. But what
if you want to say that Degas is the best impressionist, or to say that you are the nicest
person? If you are nicer than anyone, then you are nicer than yourself, which is impos-
sible. We really need to say ‘nicer than anyone else’, ‘nicer than anyone except oneself ’.
We thus add a universal quantifier with an identity clause to except the single, reflex-
ive case: better (or more profound or nicer or whatever) than anyone else, as at 5.4.25.
5.4.25 Degas is the best impressionist. Id ∙ (∀x)[(Ix ∙ x≠d) ⊃ Bdx]
Notice that we do need the ‘Ix’ clauses here: Degas is an impressionist, and no mat-
ter what other impressionist you pick, he’s a better impressionist.
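The role of the x≠d clause can also be seen by evaluating 5.4.25 in a small finite model. The following sketch is illustrative only; the domain and extensions are made up.

```python
def best_impressionist(domain, I, B, d='degas'):
    # Id . (Ax)[(Ix . x != d) -> Bdx]
    return d in I and all((d, x) in B
                          for x in domain if x in I and x != d)

domain = {'degas', 'monet', 'renoir', 'kant'}
I = {'degas', 'monet', 'renoir'}               # Kant is no impressionist
B = {('degas', 'monet'), ('degas', 'renoir')}  # 'is a better impressionist than'

print(best_impressionist(domain, I, B))  # True: no (degas, degas) pair is needed,
                                         # and non-impressionists are irrelevant
```

Dropping the x≠d clause would demand the impossible reflexive pair ('degas', 'degas').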
5.4.26 is another standard superlative sentence. 5.4.27 adds a negation, which leads
to two equivalent propositions (given QE).
5.4.26
Hume is the biggest philosopher.
(h: Hume; Px: x is a philosopher; Bxy: x is bigger than y)
Ph ∙ (∀x)[(Px ∙ x≠h) ⊃ Bhx]
5.4.27
Hume is not the most difficult empiricist to read.
(h: Hume; Ex: x is an empiricist; Dxy: x is more difficult to
read than y)
Eh ∙ ∼(∀x)[(Ex ∙ x≠h) ⊃ Dhx]
Eh ∙ (∃x)[(Ex ∙ x≠h) ∙ ∼Dhx]
5.4.28 just complicates the sentence slightly, and 5.4.29 a bit more.
5.4.28
The Ethics is the most difficult book by Spinoza to read.
(e: The Ethics; Bx: x is a book; Wxy: x wrote y; Dxy:
x is more difficult to read than y)
Be ∙ Wse ∙ (∀x)[(Bx ∙ Wsx ∙ x≠e) ⊃ Dex]
5.4.29 Either The Critique of Pure Reason or The Ethics is the most difficult
book to read.
(c: The Critique of Pure Reason; e: The Ethics;
Bx: x is a book; Dxy: x is more difficult to read than y)
Bc ∙ Be ∙ (∀x)[(Bx ∙ x≠c ∙ x≠e) ⊃ (Dcx ∨ Dex)]
The last few uses of identity that I will discuss are especially philosophically inter-
esting. The next few (‘at least’, ‘at most’, and ‘exactly’) concern how much mathemat-
ics can be developed using just logic. The latter (‘definite descriptions’) concerns a
puzzle in the philosophy of language, often called the problem of empty reference.
356 Chapter 5 Full First-Order Logic
sure to have one clause for each pair of variables. The complexity of relational predi-
cates and quantified subformulas, which we see in 5.4.44–5.4.49, does not change the
‘at most’ pattern.
5.4.44
Nietzsche respects at most one philosopher.
(n: Nietzsche; Px: x is a philosopher; Rxy: x respects y)
(∀x)(∀y)[(Px ∙ Rnx ∙ Py ∙ Rny) ⊃ x=y]
5.4.45 Nietzsche respects at most two philosophers.
(∀x)(∀y)(∀z)[(Px ∙ Rnx ∙ Py ∙ Rny ∙ Pz ∙ Rnz) ⊃ (x=y ∨ x=z ∨ y=z)]
5.4.46 Nietzsche respects at most three philosophers.
(∀x)(∀y)(∀z)(∀w)[(Px ∙ Rnx ∙ Py ∙ Rny ∙ Pz ∙ Rnz ∙ Pw ∙ Rnw) ⊃
(x=y ∨ x=z ∨ x=w ∨ y=z ∨ y=w ∨ z=w)]
5.4.47
Kant likes at most two empiricists better than Hume.
(h: Hume; k: Kant; Ex: x is an empiricist; Lxyz: x likes y better
than z)
(∀x)(∀y)(∀z)[(Ex ∙ Lkxh ∙ Ey ∙ Lkyh ∙ Ez ∙ Lkzh) ⊃
(x=y ∨ x=z ∨ y=z)]
5.4.48
At most one idealist plays billiards with some rationalist.
(Ix: x is an idealist; Rx: x is a rationalist; Pxy: x plays
billiards with y)
(∀x)(∀y){[Ix ∙ (∃z)(Rz ∙ Pxz) ∙ Iy ∙ (∃z)(Rz ∙ Pyz)] ⊃ x=y}
5.4.49
At most two rationalists wrote a book more widely read than every
book written by Hume.
(h: Hume; Bx: x is a book; Rx: x is a rationalist; Wxy: x wrote y;
Mxy: x is read more widely than y)
(∀x)(∀y)(∀z){{Rx ∙ (∃w)[Bw ∙ Wxw ∙ (∀v)[(Bv ∙ Whv) ⊃ Mwv]] ∙
Ry ∙ (∃w)[Bw ∙ Wyw ∙ (∀v)[(Bv ∙ Whv) ⊃ Mwv]] ∙ Rz ∙ (∃w)[Bw ∙
Wzw ∙ (∀v)[(Bv ∙ Whv) ⊃ Mwv]]} ⊃ (x=y ∨ x=z ∨ y=z)}
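The 'at most n' pattern, n+1 universal quantifiers with a disjunction of identities, is true in a model exactly when no more than n things satisfy the condition. A brute-force sketch (mine, not the book's) confirms this pigeonhole reading:

```python
from itertools import product

def at_most_n(domain, sat, n):
    """(Ax1)...(Axn+1)[(sat x1 . ... . sat xn+1) -> some xi = xj]:
    any choice of n+1 witnesses must contain a repeat."""
    return all(
        not all(sat(x) for x in combo)
        or any(combo[i] == combo[j]
               for i in range(n + 1) for j in range(i + 1, n + 1))
        for combo in product(domain, repeat=n + 1))

domain = range(6)
print(at_most_n(domain, lambda x: x < 2, 2))  # True: 2 satisfiers, n = 2
print(at_most_n(domain, lambda x: x < 3, 2))  # False: 3 satisfiers, n = 2
```

Whenever more than n things satisfy the condition, n+1 distinct witnesses can be chosen, falsifying the identity disjunction; otherwise any n+1 choices must repeat.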
EXACTLY
To express ‘exactly’, we combine the at-least and at-most clauses. 5.4.30 says that
there is exactly one aardvark. The first portion says that there is at least one. The sec-
ond portion, starting with the universal quantifier, expresses the redundancy that fol-
lows from supposing that there are two aardvarks. We still need n+1 quantifiers in an
‘exactly’ sentence. The first n quantifiers are existential. Then we add the one further
universal quantifier.
The identity clauses at the end of the at-most portion of the proposition hold be-
tween only the variable bound by the universal quantifier and the other variables, not
among the existentially bound variables: there are n things that have such and such a
property; if you think that you have another one, an n+1 thing, it must be identical to
one of the first n. As you can see at 5.4.50–5.4.52, the ‘at most’ clause always has just
one universal quantifier.
5.4.50 There are exactly two aardvarks.
(∃x)(∃y){Ax ∙ Ay ∙ x≠y ∙ (∀z)[Az ⊃ (z=x ∨ z=y)]}
5.4.51 There are exactly three aardvarks.
(∃x)(∃y)(∃z){Ax ∙ Ay ∙ Az ∙ x≠y ∙ x≠z ∙ y≠z ∙
(∀w)[Aw ⊃ (w=x ∨ w=y ∨ w=z)]}
5.4.52 There are exactly four aardvarks.
(∃x)(∃y)(∃z)(∃w){Ax ∙ Ay ∙ Az ∙ Aw ∙ x≠y ∙ x≠z ∙ x≠w ∙ y≠z ∙
y≠w ∙ z≠w ∙ (∀v)[Av ⊃ (v=x ∨ v=y ∨ v=z ∨ v=w)]}
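The whole 'exactly n' pattern, n existentials with pairwise distinctness plus one universal, can likewise be checked against direct counting. A sketch under the same kind of illustrative assumptions:

```python
from itertools import permutations

def exactly_n(domain, sat, n):
    """(Ex1)...(Exn){distinct satisfiers . (Az)[sat z -> z is one of them]}"""
    return any(
        all(sat(w) for w in witnesses)                     # the 'at least' part
        and all(z in witnesses for z in domain if sat(z))  # the 'at most' part
        for witnesses in permutations(domain, n))          # distinct n-tuples only

domain = range(5)
print(exactly_n(domain, lambda x: x in {1, 3}, 2))  # True: two satisfiers
print(exactly_n(domain, lambda x: x in {1, 3}, 3))  # False
```

Using permutations rather than arbitrary tuples enforces the pairwise x≠y clauses of the pattern.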
These numerical sentences get very long very quickly. Indeed, our language of predi-
cate logic, F, cannot express ‘exactly five’ or more, since we have run out of variables.
To abbreviate numerical sentences, logicians sometimes introduce special shorthand
quantifiers like the ones at 5.4.53.
5.4.53 (∃1x), (∃2x), (∃3x) . . .
The quantifiers at 5.4.53 can be read as saying that there are at least one, at least two,
or at least three things, respectively. To indicate exactly a number, ‘!’ is sometimes used. For exactly one
thing, people sometimes write ‘(∃!x)’. For more things, we can insert the number and
the ‘!’, as at 5.4.54.
5.4.54 (∃1!x), (∃2!x), (∃3!x) . . .
These abbreviations are useful for translation. But once we want to make inferences
using the numbers, we have to unpack their longer forms. We will not extend our lan-
guage F to include more variables, or to include numerals or ‘!’, but it is easy enough
to do so.
5.4.55–5.4.58 contain further ‘exactly’ translations, with the same kinds of compli-
cations we saw above with ‘at least’ and ‘at most’ sentences.
5.4.55 There is exactly one even prime number.
(∃x){(Ex ∙ Px ∙ Nx) ∙ (∀y)[(Ey ∙ Py ∙ Ny) ⊃ y=x]}
5.4.56 There are exactly two chipmunks in the yard.
(∃x)(∃y){Cx ∙ Yx ∙ Cy ∙ Yy ∙ x≠y ∙ (∀z)[(Cz ∙ Yz) ⊃ (z=x ∨ z=y)]}
5.4.57 There are exactly three aardvarks on the log.
(∃x)(∃y)(∃z){Ax ∙ Lx ∙ Ay ∙ Ly ∙ Az ∙ Lz ∙ x≠y ∙ x≠z ∙ y≠z ∙
(∀w)[(Aw ∙ Lw) ⊃ (w=x ∨ w=y ∨ w=z)]}
5.4.58 Exactly three idealists play billiards with some rationalist.
(∃x)(∃y)(∃z){[Ix ∙ (∃w)(Rw ∙ Pxw) ∙ Iy ∙ (∃w)(Rw ∙ Pyw) ∙ Iz ∙
(∃w)(Rw ∙ Pzw) ∙ x≠y ∙ x≠z ∙ y≠z] ∙ (∀v){[Iv ∙ (∃w)(Rw ∙
Pvw)] ⊃ (v=x ∨ v=y ∨ v=z)}}
DEFINITE DESCRIPTIONS
Our last use of the identity predicate is in a solution to a problem in the philosophy of
language. The problem can be seen in trying to interpret 5.4.59.
5.4.59 The king of America is bald.
We might regiment 5.4.59 as 5.4.60, taking ‘k’ for ‘the king of America’.
5.4.60 Bk
5.4.60 is false, since there is no king of America. Given our bivalent semantics, then,
5.4.61 should be true since it is the negation of a false statement.
5.4.61 ∼Bk
5.4.61 seems to be a perfectly reasonable regimentation of 5.4.62.
5.4.62 The king of America is not bald.
5.4.62 has the same grammatical form as 5.4.63.
5.4.63 This happy man is not bald.
We take 5.4.63 to be true because the happy man has a lot of hair. So, 5.4.61 may
reasonably be taken to say that the king of America has hair. But that’s not something
we want to assert as true.
In fact, we want both 5.4.60 and 5.4.61 to be false. The conjunction of their negations
is the contradiction 5.4.64.
5.4.64 ∼Bk ∙ ∼∼Bk
And given what we saw about explosion in section 3.5, we certainly don’t want to
assert that! We had better regiment our sentences differently.
Bertrand Russell, facing just this problem, focused on the fact that ‘the king of
America’ is a definite description that refers to no real thing. Like a name, a definite
description is a way of referring to a specific object. A definite description picks out
an object by using a descriptive phrase that begins with ‘the’, as in ‘the person who . . .’,
or ‘the thing that . . .’.
Both 5.4.59 and 5.4.62 use definite descriptions to refer to an object. They are
both false due to a false presupposition in the description that there exists a king of
America.
Russell’s solution to the problem is to rewrite sentences that use definite
descriptions. Definite descriptions, he says, are disguised complex propositions,
and the grammatical forms of sentences that contain definite descriptions are more
complicated than they look. We have to unpack them to reveal their true logical
form. So, according to Russell, 5.4.59, properly understood, consists of three
simpler expressions: (A) there is a king of America; (B) there is at most one king of
America; and (C) that thing is bald.
Putting them together, so that every term is within the scope of the original
existential quantifier, we get 5.4.65, which Russell claims is the proper analysis of
5.4.59.
5.4.65 (∃x)[Kx ∙ (∀y)(Ky ⊃ y=x) ∙ Bx]
5.4.59 is false because clause A is false. 5.4.62 is also false, for the same reason,
which we can see in its proper regimentation, 5.4.66.
5.4.66 (∃x)[Kx ∙ (∀y)(Ky ⊃ y=x) ∙ ∼Bx]
The tilde in 5.4.66 affects only the third clause. The first clause is the same in 5.4.65
and 5.4.66, and still false. Further, when we conjoin the negations of 5.4.65 and
5.4.66, we do not get a contradiction, as we did at 5.4.64.
5.4.67 ∼(∃x)[Kx ∙ (∀y)(Ky ⊃ y=x) ∙ Bx] ∙ ∼(∃x)[Kx ∙ (∀y)(Ky ⊃ y=x) ∙ ∼Bx]
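We can confirm the key point, that on Russell's analysis both 5.4.59 and 5.4.62 come out false in a kingless world, so that denying both is consistent, with a small model check. This sketch is my illustration; the domain and extensions are invented.

```python
def the_king_is(domain, K, B, bald=True):
    # (Ex)[Kx . (Ay)(Ky -> y=x) . Bx]   (or with ~Bx when bald=False)
    return any(x in K
               and all(y == x for y in domain if y in K)   # uniqueness clause
               and ((x in B) if bald else (x not in B))
               for x in domain)

domain = {'lincoln', 'obama'}
K = set()          # there is no king of America
B = {'lincoln'}    # the bald things, for illustration

print(the_king_is(domain, K, B, bald=True))   # False: 5.4.65
print(the_king_is(domain, K, B, bald=False))  # False: 5.4.66
```

Both come out false because the existential clause fails, so their negations are jointly true, and no explosion threatens.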
Summary
The identity symbol, =, is just an ordinary binary relation between two singular terms.
But the logic of that relation is both simple and powerful in translation, allowing us to
regiment sentences with ‘except’, ‘only’, superlatives, ‘at least’, ‘at most’, ‘exactly’, and
definite descriptions. Each kind of translation follows a standard pattern that can be
learned without too much effort, if you have mastered F.
In our next section, we will construct derivations using the rules governing
identity that I introduced in this section. Take your time to get comfortable with the
translations before moving on to derivations.
KEEP IN MIND
EXERCISES 5.4
Translate into first-order logic, using the identity predicate
where applicable.
Axy: x attends y
Exy: x enrolls at y
9. At most two students who attend Riverdale High enroll at Sunnydale Uni-
versity.
10. At most three students who attend Riverdale High enroll at Sunnydale
University.
11. All students who attend Riverdale High enroll at Sunnydale, except Leah.
12. All students who enroll in some university attend some high school, except
Zoe and Leah.
13. Exactly three students who attend Riverdale High enroll at Sunnydale Uni-
versity.
14. Only Zoe attends high school without enrolling in some university.
15. The university in our town is Sunnydale.
16. If exactly one student attends Riverdale High and enrolls in Sunnydale Uni-
versity, then Zoe enrolls in a university in our town.
73. At least one determinist believes both in free will and moral responsibility.
74. At least two determinists believe in moral responsibility, but not free will.
75. At most three compatibilists do not believe in moral responsibility.
76. All compatibilists who believe in moral responsibility are determinists, except
Hume.
77. No philosopher is a libertarian except Descartes.
78. The libertarian is Descartes; the determinist is Spinoza; the compatibilist is Hume.
79. The only determinist who does not believe in free will but does believe in moral
responsibility is Spinoza.
80. If exactly one compatibilist believes in free will, then only Hume believes in
moral responsibility.
The identity of indiscernibles says that no two things share all properties. Whether
two things can share all of their properties is a vexing question whose answer depends
on a theory of properties, a topic well beyond our range.
For examples of these rules in use, let’s start with 5.4.4, the inference with which I
motivated identity theory.
Superman can fly.
Superman is Clark Kent.
So, Clark Kent can fly.
To derive the conclusion, we need only a simple application of IDi, as at 5.5.3.
5.5.3 1. Fs
2. s=c / Fc
3. Fc 1, 2, IDi
QED
5.5.4 uses IDs and IDi.
5.5.4 1. a=b ⊃ j=k
2. b=a
3. Fj / Fk
4. a=b 2, IDs
5. j=k 1, 4, MP
6. Fk 3, 5, IDi
QED
To derive the negation of an identity statement, one ordinarily uses indirect proof
as in 5.5.5.
5.5.5 1. Rm
2. ∼Rj / m≠j
3. m=j AIP
4. Rj 1, 3, IDi
5. Rj ∙ ∼Rj 4, 2, Conj
6. m≠j 3–5, IP
QED
5.5.6 uses the reflexivity rule, at line 4, to produce a contradiction. Alternatively,
one could use it to set up a modus tollens with line 3.
5.5.6 1. (∀x)(∼Gx ⊃ x≠d) / Gd
2. ∼Gd AIP
3. ∼Gd ⊃ d≠d 1, UI
4. d=d IDr
5. d≠d 3, 2, MP
6. d=d ∙ d≠d 4, 5, Conj
7. ∼∼Gd 2–6, IP
8. Gd 7, DN
QED
You may use Dist to distribute a conjunction over any number of disjuncts
and to distribute a disjunction over any number of conjuncts.
You may use Com to re-order, in any way, any series of disjuncts or of conjuncts.
In the proof of the argument 5.5.9, which is at 5.5.10 and uses a standard CP, I avail
myself of the first of these conventions at lines 4 and 7.
5.5.10 1. (∃x){Jx ∙ Hx ∙ (∀y)[(Jy ∙ Hy) ⊃ x=y] ∙ Ex} / (∀x)[(Jx ∙ Hx) ⊃ Ex]
2. Jx ∙ Hx ACP
3. Ja ∙ Ha ∙ (∀y)[(Jy ∙ Hy) ⊃ a=y] ∙ Ea 1, EI
4. (∀y)[(Jy ∙ Hy) ⊃ a=y] 3, Simp
5. (Jx ∙ Hx) ⊃ a=x 4, UI
6. a=x 5, 2, MP
7. Ea 3, Simp
8. Ex 7, 6, IDi
9. (Jx ∙ Hx) ⊃ Ex 2–8, CP
10. (∀x)[(Jx ∙ Hx) ⊃ Ex] 9, UG
QED
5.5.11 contains another substantial proof using propositions with identity, this
time showing how ‘at least’ and ‘at most’ entail ‘exactly’ proof-theoretically.
5.5.11 There is at least one moon of Earth.
There is at most one moon of Earth. / So, there is exactly one
moon of Earth.
1. (∃x)Mx
2. (∀x)(∀y)[(Mx ∙ My) ⊃ x=y] / (∃x)[Mx ∙ (∀y)(My ⊃ x=y)]
3. Ma 1, EI
4. My ACP
5. (∀y)[(Ma ∙ My) ⊃ a=y] 2, UI
6. (Ma ∙ My) ⊃ a=y 5, UI
7. Ma ∙ My 3, 4, Conj
8. a=y 6, 7, MP
9. My ⊃ a=y 4–8, CP
10. (∀y)(My ⊃ a=y) 9, UG
11. Ma ∙ (∀y)(My ⊃ a=y) 3, 10, Conj
12. (∃x)[Mx ∙ (∀y)(My ⊃ x=y)] 11, EG
QED
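The proof at 5.5.11 has a semantic counterpart: in every model, if 'at least one' and 'at most one' are both true, so is 'exactly one'. A brute-force check over all extensions of M in a four-element domain (my sketch, not the text's):

```python
from itertools import chain, combinations

domain = list(range(4))
all_extensions = chain.from_iterable(
    combinations(domain, r) for r in range(len(domain) + 1))

for ext in all_extensions:
    M = set(ext)                                            # extension of 'Mx'
    at_least_one = any(x in M for x in domain)              # premise 1
    at_most_one = all(x == y for x in M for y in M)         # premise 2
    exactly_one = any(x in M and all(y == x for y in M)     # conclusion
                      for x in domain)
    assert not (at_least_one and at_most_one) or exactly_one

print("the entailment holds in all 16 models")
```

This checks validity only over one small domain, of course; the derivation above establishes it for all models.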
5.5.12 has an even longer derivation, even with our new conventions (especially at
lines 27, 33, and 40). Removing and replacing multiple quantifiers, moving negations
across multiple quantifiers using QE, and just working with the complex statements
that identity helps us represent all lengthen the proofs.
When working with long proofs, be especially careful to keep track of your different
singular terms, which ones are constants and which are variables. Look ahead to see
whether you are going to need to UG, in which case you’ll need to work with variables.
And, as always, indirect proof is the refuge of the desperate.
Summary
The rules governing the identity predicate are fairly simple and easy to learn. The
propositions that use identity, though, can be long and complex. Arguments that use
such propositions consequently tend to be long, and sometimes difficult, mainly just
because of the complexity of the propositions.
KEEP IN MIND
Singular terms of which identity holds may be exchanged in wffs; we call this property the
indiscernibility of identicals, or Leibniz’s law.
Do not confuse Leibniz’s law with its converse, the identity of indiscernibles.
IDi allows us to rewrite a whole line, switching one singular term for another.
IDs is a rule of equivalence, allowing us to commute the two singular terms flanking a ‘=’.
IDr allows us to insert an identity sentence, of a singular with itself, with no line justifica-
tion; it is rarely useful in derivations.
Our conventions for dropping brackets in series of conjunctions or disjunctions lead to
further conventions within derivations for some rules.
Rules Introduced
For any singular terms α and β:
IDr (reflexivity) α=α
IDs (symmetry) α=β ⇄ β=α
IDi (indiscernibility of identicals) ℱα, α=β / ℱβ
EXERCISES 5.5a
Derive the conclusions of each of the following arguments.
1. 1. (∀x)[(∃y)Pxy ⊃ (∃z)Pzx]
2. (∃x)(Pxb ∙ x=d) / (∃z)Pzd
2. 1. (∀x)(∀y)[Ax ⊃ (By ⊃ Cxy)]
2. Aa ∙ Ba
3. a=b / Cab
3. 1. (∃x)(Mx ∙ Px)
2. (∀x)[Mx ⊃ (∀y)(Ky ⊃ x=y)]
3. Kf / Mf ∙ Pf
4. 1. Pa ∙ (∀x)[(Px ∙ x≠a) ⊃ Qax]
2. Pb ∙ a≠b / Qab
5. 1. Dkm ∙ (∀x)(Dkx ⊃ x=m)
2. Dab
3. Fb ∙ ∼Fm / a≠k
6. 1. (∀x)[Jx ∨ (Kx ∙ Lx)]
2. ∼(Ja ∨ Kb) / a≠b
7. 1. (∀x)[(Mx ∨ Nx) ⊃ Ox]
2. ∼Oc
3. Md / c≠d
8. 1. (∀x)(Qx ⊃ Sx)
2. (∀x)(Rx ⊃ Tx)
3. (∀x)[Qx ∨ (Rx ∙ Ux)]
4. a=b / Sb ∨ Ta
9. 1. (∀x)[Ax ∨ (Bx ∙ Cx)]
2. ∼(∀x)Bx
3. (∀x)(Ax ⊃ x=c) / (∃x)x=c
10. 1. (∃x){Px ∙ Qx ∙ (∀y)[(Py ∙ Qy ∙ x≠y) ⊃ Axy]}
2. (∀x)(∀y)(Axy ⊃ Byx) / (∃x){Px ∙ Qx ∙ (∀y)[(Py ∙ Qy ∙ x≠y) ⊃ Byx]}
11. 1. (∀x)[(Px ∙ Qx ∙ x≠a) ⊃ (∃y)Rxy]
2. ∼(∃y)Rby
3. Sa ∙ ∼Sb / ∼(Pb ∙ Qb)
12. 1. Pa ∙ Qab ∙ (∀x)[(Px ∙ Qxb ∙ x≠a) ⊃ Rax]
2. Pc ∙ Qcb ∙ ∼Rac / c=a
13. 1. Dp ∙ (∃x)(Ex ∙ ∼Fxp)
2. (∀x)[Gx ⊃ (∀y)Fyx] / (∃x)(Dx ∙ ∼Gx)
5.5: The Identity Predicate: Derivations 377
14. 1. La ∙ Lb ∙ a≠b
2. (∀x)(∀y)(∀z)[(Lx ∙ Ly ∙ Lz) ⊃ (x=y ∨ y=z ∨ x=z)]
/ (∀x)[Lx ⊃ (x=a ∨ x=b)]
15. 1. (∃x){Px ∙ Qx ∙ (∀y)[(Py ∙ Qy) ⊃ y=x]}
2. (∃x){Rx ∙ Qx ∙ (∀y)[(Ry ∙ Qy) ⊃ y=x]}
3. (∀x)(Px ≡ ∼Rx) / (∃x)(∃y)(Qx ∙ Qy ∙ x≠y)
16. 1. (∀x)(∀y)[(Px ∙ Qx ∙ Py ∙ Qy) ⊃ x=y]
2. (∃x)(∃y)(Px ∙ Rx ∙ Py ∙ Ry ∙ x≠y) / (∃x)(Px ∙ ∼Qx)
17. 1. (∃x)[Px ∙ (∀y)(Py ⊃ y=x)]
2. (∀x){Px ⊃ (∃y)[Qy ∙ (∀z)(Qz ⊃ z=y) ∙ Rxy]}
/ (∃x)(∃y)[Px ∙ Qy ∙ Rxy ∙ (∀z)(Pz ⊃ z=x) ∙ (∀z)(Qz ⊃ z=y)]
18. 1. (∀x)[(Px ∙ Qx) ⊃ x≠a]
2. (∃x){Px ∙ Rx ∙ (∀y)[(Py ∙ Ry) ⊃ y=x]}
3. (∀x)(Rx ⊃ Qx) / ∼(Pa ∙ Ra)
19. 1. (∃x)[Px ∙ (∀y)(Py ⊃ y=x) ∙ Qx]
2. (∀x)[Qx ⊃ (∃y)Rxy]
3. (∃x)(Px ∙ Sx) / (∃x)[Qx ∙ Sx ∙ (∃y)Rxy]
20. 1. (∃x)(∃y)(Px ∙ Qx ∙ Py ∙ Qy ∙ x≠y)
2. (∀x)(Px ⊃ Rx)
3. (∀x)(∀y)(∀z)[(Qx ∙ Rx ∙ Qy ∙ Ry ∙ Qz ∙ Rz) ⊃ (x=y ∨ x=z ∨ y=z)]
/ (∃x)(∃y){Px ∙ Qx ∙ Py ∙ Qy ∙ x≠y ∙ (∀z)[(Pz ∙ Qz) ⊃ (z=x ∨ z=y)]}
21. 1. (∃x)(∃y)(∃z)(Px ∙ Py ∙ Pz ∙ x≠y ∙ x≠z ∙ y≠z ∙ Qxyz ∙ Qzyx)
2. (∀x)(∀y)(∀z)(Qxyz ≡ Qyxz)
3. (∀x)(∀y)(∀z)(Qxyz ≡ Qxzy) / (∃x)(∃y)(∃z)(Px ∙ Py ∙
Pz ∙ x≠y ∙ x≠z ∙ y≠z ∙ Qxyz ∙ Qxzy ∙ Qyxz ∙ Qyzx ∙ Qzxy ∙ Qzyx)
22. 1. (∃x)(∃y)(Hx ∙ Ix ∙ Jx ∙ Hy ∙ Iy ∙ Jy ∙ x≠y)
2. (∀x)(∀y)(∀z)[(Hx ∙ Ix ∙ Jx ∙ Hy ∙ Iy ∙ Jy ∙ Hz ∙ Iz ∙ Jz) ⊃ (x=y ∨ x=z ∨ y=z)]
/ (∃x)(∃y){Hx ∙ Ix ∙ Jx ∙
Hy ∙ Iy ∙ Jy ∙ x≠y ∙ (∀z)[(Hz ∙ Iz ∙ Jz) ⊃ (z=x ∨ z=y)]}
23. 1. Na ∙ Oa ∙ Nb ∙ Ob ∙ a≠b ∙ (∀x)[(Nx ∙ Ox) ⊃ (x=a ∨ x=b)]
2. Na ∙ ∼Pa ∙ (∀x)[(Nx ∙ x≠a) ⊃ Px]
/ (∃x){Nx ∙ Ox ∙ Px ∙ (∀y)[(Ny ∙ Oy ∙ Py) ⊃ y=x]}
24. 1. (∃x)(∃y)(Kx ∙ Lx ∙ Ky ∙ Ly ∙ x≠y)
2. Ka ∙ La ∙ Ma ∙ (∀y)[(Ky ∙ Ly ∙ My) ⊃ y=a] / (∃x)(Kx ∙ Lx ∙ ∼Mx)
25. 1. (∃x)(∃y)(Ax ∙ Cx ∙ Ay ∙ Cy ∙ x≠y)
2. (∀x)(∀y)(∀z)[(Cx ∙ Cy ∙ Cz) ⊃ (x=y ∨ x=z ∨ y=z)]
3. (∃x)(Bx ∙ ∼Ax) / ∼(∀x)(Bx ⊃ Cx)
EXERCISES 5.5b
Translate each of the following arguments into F, using the
given terms and the identity predicate, where useful. Then,
derive the conclusion using our rules of inference.
1. Polly flies. Olivia doesn’t. So, Polly is not Olivia. (o: Olivia; p: Polly; Fx: x flies)
2. If George is Dr. Martin, then Dr. Martin is married to Mrs. Wilson. Dr. Martin
is George. Mrs. Wilson is Hilda. So, George is married to Hilda. (g: George; h:
Hilda; m: Dr. Martin; w: Mrs. Wilson; Mxy: x is married to y)
3. If something is not a superhero, then everything is not Wonder Woman. So,
Wonder Woman is a superhero. (w: Wonder Woman; Sx: x is a superhero)
4. Katerina is the fastest runner on the team. Pedro is a runner on the team. Kat-
erina is not Pedro. So, Katerina is faster than Pedro. (k: Katerina; p: Pedro; Rx:
x is a runner; Tx: x is on the team; Fxy: x is faster than y)
5. The author of Republic was a Greek philosopher. John Locke was a philosopher,
but he was not Greek. Therefore, John Locke did not write Republic. (l: John
Locke; r: Republic; Gx: x is Greek; Px: x is a philosopher; Wxy: x wrote y)
6. The only person who went skiing was James. The only person who caught a cold
was Mr. Brown. Some person who went skiing also caught a cold. So, James is
Mr. Brown. (b: Mr. Brown; j: James; Cx: x caught a cold; Px: x is a person; Sx:
x went skiing)
7. Exactly one student in the class gives a presentation about Spinoza. At least two
students in the class give a presentation about Leibniz. No student in the class
gives a presentation about both Leibniz and Spinoza. So, there are at least three
students in the class. (l: Leibniz; s: Spinoza; Sx: x is a student in the class;
Gxy: x gives a presentation about y)
8. Every employee except Rupert got a promotion. The only employee to get a
promotion was Jane. So, there are exactly two employees. (j: Jane; r: Rupert;
Ex: x is an employee; Px: x gets a promotion)
9. No philosopher except Descartes is a dualist. Spinoza is a philosopher, distinct
from Descartes. Every philosopher is either a dualist or a monist. So, Spinoza
is a monist. (d: Descartes; s: Spinoza; Dx: x is a dualist; Mx: x is a monist; Px:
x is a philosopher)
10. Kierkegaard and Sartre are both existentialists, but Kierkegaard is a theist and
Sartre is not. If all existentialists are nihilists, then Kierkegaard and Sartre are
identical. So, some existentialists are not nihilists. (k: Kierkegaard; s: Sartre;
Ex: x is an existentialist; Nx: x is a nihilist; Tx: x is a theist)
11. No idealist is more renowned than Berkeley, except Kant. Russell, who is nei-
ther Berkeley nor Kant, is more renowned than Berkeley. So, Russell is not an
idealist. (b: Berkeley; k: Kant; r: Russell; Ix: x is an idealist; Rxy: x is more re-
nowned than y)
12. Every platonist except Plato believes in the existence of the material world. Ev-
ery platonist believes in an abstract realm. Gödel is a platonist who is not Plato.
So, something believes in both a material world and an abstract realm, and
something does not. (g: Gödel; p: Plato; Ax: x believes in an abstract realm;
Mx: x believes in the existence of a material world; Px: x is a platonist)
13. At least two philosophers are more prolific than the philosopher Hume. No
philosopher is more insightful than Hume. Nothing is more prolific than itself.
So, at least two philosophers are more prolific, without being more insightful,
than a third philosopher. (h: Hume; Px: x is a philosopher; Ixy: x is more in-
sightful than y; Pxy: x is more prolific than y)
14. At most one argument for consequentialism is not utilitarian. There are some
non-utilitarian arguments for consequentialism. Any argument for consequen-
tialism faces trolley-case objections. So, exactly one non-utilitarian argument for
consequentialism faces trolley-case objections.
EXERCISES 5.5c
Derive the following logical truths of identity theory.
1. (∀x)(∀y)(x=y ≡ y=x)
2. (Fa ∙ a=b) ⊃ Fb
3. (∃x)x=a ∨ (∀x)x≠a
4. (∀x)(∀y)(∀z)[(x=y ∙ y=z) ⊃ x=z]
5. (∀x)(∀y)(∀z)[(x=y ∙ x=z) ⊃ y=z]
6. (∀x)(∀y)[(Fx ∙ ∼Fy) ⊃ x≠y]
7. (∀x)(∀y)[x=y ⊃ (Fx ≡ Fy)]
8. (∀x)(∀y)[x=y ⊃ (∀z)(Pxz ≡ Pyz)]
9. (∀x)(∀y){(x=a ∙ y=a) ⊃ [Rab ≡ (Rxb ∙ Ryb)]}
10. (∀x)(Pax ⊃ x=b) ⊃ [(∃y)Pay ⊃ Pab]
While the derivation at 5.6.6 is successful, there is a more efficient, and more fe-
cund, option for regimenting ‘the first child of x and y’: we can take ‘the first child of x
and y’ to be a function. Using a function allows us to regiment both the third premise
and the conclusion more simply, and to construct tighter derivations. Let’s take a mo-
ment to explore functions before returning to 5.6.1.
Consider terms like ‘the biological father of’, ‘the successor of’, ‘the sum of’, and
‘the academic adviser of’. Each takes one or more arguments, from their domain,
and produces a single output, the range. We can tell that there is a single output by the
use of the definite description. One-place functions take one argument, two-place
functions take two arguments, and n-place functions take n arguments. With a small
extension of F, adding functors like ‘f(x)’, we can express such functions neatly.
When working with functions, an argument is an element or ordered n-tuple of
elements of the domain paired with exactly one element of the range.
5.6.7 lists some functions and some possible logical representations.
5.6.7 f(x) the father of
g(x) the successor of
f(x, y) the sum of
f(a, b) the truth value of the conjunction of A and B
g(x1 . . . xn) the teacher of
The last function can take as arguments, say, all the students in a class.
An essential characteristic of functions is that they yield exactly one value no mat-
ter how many arguments they take. Thus, the expressions at 5.6.8 are not functions.
5.6.8 the biological parents of a
the classes that a and b share
the square root of x
These expressions are relations. Relations may be one-many, like ‘the square root
of n’, which pairs a single number, say 4, with both its positive and negative square
roots, +2 and −2. Relations may be many-many, like the classes that Johanna and
Alexis share when they are both taking Logic, Organic Chemistry, and The Study
of the Novel. Functions are special types of relations that always yield a single value.
‘The positive square root of x’ is a function, as is ‘the first class of the day for student x’.
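The defining condition, exactly one value for each argument, can be checked mechanically when a relation is given as a set of argument–value pairs. A small sketch of mine, with invented extensions:

```python
def is_function(relation):
    """True iff no argument in the set of (argument, value) pairs
    is paired with two different values."""
    values = {}
    for arg, val in relation:
        if values.setdefault(arg, val) != val:  # conflicting second value
            return False
    return True

square_root = {(4, 2), (4, -2), (9, 3), (9, -3)}  # one-many: a mere relation
positive_root = {(4, 2), (9, 3)}                  # one value each: a function

print(is_function(square_root))    # False
print(is_function(positive_root))  # True
```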
Functions play an important role in mathematics and science, as well as logic. We
have seen that we can use the identity predicate to simulate adjectival uses of numbers:
three apples, seven seas. With functions, we can express even more mathematics. A
functor is a symbol used to represent a function, like any of the functions ubiquitous
in mathematics and science. In mathematics, there are linear functions, exponential
functions, periodic functions, quadratic functions, and trigonometric functions. In
science, force is a function of mass and acceleration; momentum is a function of mass
and velocity. The genetic code of a child is a function of the genetic codes of its biologi-
cal parents. Functions are also essential for metalogic. Recall that the semantics for
PL is presented in terms of truth functions. All the operators are truth functions, tak-
ing one argument (negation) or two arguments (the rest of the operators) and yield-
ing a specific truth value.
5.6: Translation with Functions 383
By adding functors to our language F, we adopt a new language, which I call FF, for
full first-order predicate logic with functors.
Vocabulary of FF
Capital letters A . . . Z, used as predicates
Lower-case letters
a, b, c, d, e, i, j, k . . . u are used as constants.
f, g, and h are used as functors.
v, w, x, y, z are used as variables.
Five propositional operators: ∼, ∙, ∨, ⊃, ≡
Quantifiers: ∃, ∀
Punctuation: (), [], {}
In order to specify the formation rules for FF, we invoke n-tuples of singular
terms, ordered series of singular terms: constants, variables, or functor terms. As we
saw in section 5.2, n-tuples are like sets in that they are collections of objects but
differ from sets in that the order of their objects (which we call arguments) matters.
Often, n-tuples are represented using angle brackets: <a, b>, <Clinton, New Hartford,
Utica>, or <Lady Gaga>. For FF, we will represent n-tuples of singular terms by listing
the singular terms separated by commas, as at 5.6.9.
5.6.9 a, b two arguments
a, a, f(a) three arguments
x, y, b, d, f(x), f(a, b, f(x)) six arguments
a one argument
Now that we have characterized n-tuples, we can use them to define functor
terms. Suppose α is an n-tuple of singular terms. Then a functor symbol, followed by
an n-tuple of singular terms in brackets, is a functor term. The expressions at 5.6.10
are all functor terms (once we substitute the proper n-tuple for α).
5.6.10 f(α)
g(α)
h(α)
Note that an n-tuple of singular terms can include functor terms, as in the second
and third examples at 5.6.9. ‘Functor term’ is defined recursively, which allows for
composition of functions. For example, one can refer to the grandfather of x using just
the functions for father, for example f(x), and mother, for example g(x). 5.6.11 repre-
sents ‘paternal grandfather’ and 5.6.12 represents ‘maternal grandfather’.
5.6.11 f(f(x))
5.6.12 f(g(x))
The punctuation (parentheses) in functor terms can multiply, but it is sadly
needed.
For another example, if we take ‘h(x)’ to represent the square of x, then 5.6.13 rep-
resents the eighth power of x, in other words, ((x²)²)².
5.6.13 h(h(h(x)))
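The recursion in the definition of functor terms corresponds to ordinary composition of functions, which we can mimic directly:

```python
def h(x):               # h(x): the square of x
    return x * x

def eighth_power(x):
    return h(h(h(x)))   # 5.6.13: h(h(h(x)))

print(eighth_power(2))            # 256, i.e., 2 ** 8
print(eighth_power(3) == 3 ** 8)  # True
```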
I have introduced only three functor letters. As with variables and constants (see
4.3), there are several different tricks for constructing an indefinite number of terms
out of a finite vocabulary using indexing. But we won’t need more than the three let-
ters here, so we will make do with only these.
Even with just the three letters, we have an indefinite number of functors, since
each of 5.6.14 is technically a different functor and can represent a different function.
5.6.14 f(a)
f(a, b)
f(a, b, c)
f(a, b, c, d)
and so on
The scope and binding rules are the same for FF as they were for M and F. The for-
mation rules need only one small adjustment, at the first line.
Semantics for FF
Step 1. Specify a set to serve as a domain of interpretation.
Step 2. Assign a member of the domain to each constant.
Step 3. Assign a function with arguments and ranges in the domain to each function symbol.
Step 4. Assign some set of objects in the domain to each one-place predicate; assign sets of ordered n-tuples to each relational predicate.
Step 5. Use the customary truth tables for the interpretation of the propositional
operators.
The function assigned in step 3 will be a function in the metalanguage used to inter-
pret the function in the object language. I won’t pursue a discussion of metalinguis-
tic functions, except to say that they work just like ordinary mathematical functions.
Once you have the idea of how functions work in the object language, it will become
clear how they work in the metalanguage.
Let’s move on to the nuts and bolts of translation with functions.
We can write the Peano axioms in FF using the given key, as I do at 5.6.17.
5.6.17 Peano’s Axioms in FF
a: zero
Nx: x is a number
f(x): the successor of x
PA1. Na
PA2. (∀x)(Nx ⊃ Nf(x))
PA3. ∼(∃x)(Nx ∙ f(x)=a)
PA4. (∀x)(∀y)[(Nx ∙ Ny) ⊃ (f(x)=f(y) ⊃ x=y)]
PA5. {Pa ∙ (∀x)[(Nx ∙ Px) ⊃ Pf(x)]} ⊃ (∀x)(Nx ⊃ Px)
Notice that the predicate ‘P’ as used in PA5 can stand for any property, like the
property of being prime or the property of having a square. To write this axiom even
more generally, one needs a stronger language, such as second-order logic.
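Modeling the numbers as Python integers, zero as 0, and the successor functor as a one-place function, we can spot-check PA1, PA3, and PA4 on an initial segment. This is my illustration, and the segment length is an arbitrary choice; PA5 quantifies over properties and resists this kind of finite check.

```python
zero = 0
succ = lambda x: x + 1      # f(x): the successor of x
segment = range(100)        # a finite stand-in for the numbers

assert zero in segment                                  # PA1: zero is a number
assert all(succ(x) != zero for x in segment)            # PA3: zero succeeds nothing
assert all(x == y for x in segment for y in segment     # PA4: succ is injective
           if succ(x) == succ(y))
print("PA1, PA3, and PA4 hold on this segment")
```

PA2 (closure under successor) holds for the integers at large, though of course not within any finite segment.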
5.6.18–5.6.21 present translations of some arithmetic sentences using functions.
Note that in the following sentences, I take ‘number’ to mean ‘natural number’ (i.e.,
the counting numbers 1, 2, 3, . . .) and use the following translation key.
o: one
f(x): the successor of x
f(x, y): the product of x and y
Ex: x is even
Nx: x is a number
Ox: x is odd
Px: x is prime
5.6.18 One is the successor of some number.
(∃x)[Nx ∙ f(x)=o]
5.6.19 The product of the successor of one and any other number is even.
(∀x)Ef(f(o), x)
5.6.20
If the product of a pair of numbers is odd, then the product of the
successors of those numbers is even.
(∀x)(∀y){(Nx ∙ Ny) ⊃ [Of(x, y) ⊃ Ef(f(x), f(y))]}
5.6.21 There are no prime numbers such that their product is prime.
∼(∃x)(∃y)[Nx ∙ Px ∙ Ny ∙ Py ∙ Pf(x, y)]
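Translations like 5.6.21 can also be tested against a fragment of the numbers. The helper below is mine, not the text's; it spot-checks the claim that no product of two primes is prime on an initial segment.

```python
# Spot-checking 5.6.21: no pair of primes has a prime product.
def is_prime(n):
    # trial division; fine for the small range used here
    return n > 1 and all(n % d != 0 for d in range(2, n))

primes = [n for n in range(2, 50) if is_prime(n)]
counterexamples = [(x, y) for x in primes for y in primes
                   if is_prime(x * y)]
print(counterexamples)  # [] -- the translated sentence holds on this fragment
```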
Summary
Functors are not in the vocabulary of standard first-order logic. By adding functors
to our language, we switch from F to FF. The addition facilitates some natural infer-
ences. But some philosophers resist seeing functions as purely logical, and see them
as mathematical. Mathematicians treat functions as kinds of relations, and relations
as kinds of sets; we can define relations and functions in terms of sets. Set theory is
ordinarily taken to be mathematics, not logic. But we are here supposed to be working
with a purely logical, and not mathematical, language.
Concerns that functions are mathematical, and not logical, should be allayed some-
what by noting that the work we are doing with functions here can be done in F with
definite descriptions, though in more complicated fashion.
Our last technical subject, in the next section, is derivations with functions.
KEEP IN MIND
The work of functions can be done, less efficiently, with definite descriptions.
FF is the result of adding functors to the language of F.
We reserve ‘f ’, ‘g’, and ‘h’ as functor symbols; they don’t work as constants in FF.
EXERCISES 5.6
Use the given key to translate the following sentences into FF.
22. Some philosophy majors are two spaces in front of some sociology majors.
23. Of all the graduates in line, none is one place in front of Olivia, and none is one
place behind Spencer.
24. At most three sociology majors are two spaces in front of some philosophy
major.
A complex singular term acts like a variable if there are any variables in any of its argument places, or those of any of its embedded functions. Otherwise, it acts like a constant.
If the arguments of a function are all variables, then you are free to use UG over the
variables in that function. If the arguments of a function contain any constants, then
you may not use UG. You may use UG on either ‘x’ or ‘y’ in 5.7.2, assuming that the
proposition does not appear within an indented sequence in which the variables are
free in the first line.
5.7.2 Af(x, y) ⊃ Bxf(x)
You may not use UG on ‘x’ or ‘y’ in 5.7.3 if the variables were free when ‘a’ was
introduced.
5.7.3 Af(x, y, a)
For EI, we must continue always to instantiate to a new singular term. A functor is
not a new singular term if any of its arguments or any of the arguments of any of its
subfunctors have already appeared in the derivation or appear in the conclusion. The
functor itself need not be new. At 5.7.4, you may not instantiate line 2 to ‘a’ or to ‘b’;
use a new constant, as at line 3.
5.7.4 1. f(a)=b Premise
2. (∃x)Sf(x) Premise
3. Sf(c) 2, EI
Turning to complete proofs, the derivation at 5.7.5 uses a function merely as a sin-
gular term and does not alter the functional structure of any singular term.
5.7.5 1. (∀x)[Px ⊃ Pf(x)]
2. (∃x)(Px ∙ Rxa) / (∃x)[Pf(x) ∙ Rxa]
3. Pb ∙ Rba 2, EI
4. Pb 3, Simp
5. Pb ⊃ Pf(b) 1, UI
6. Pf(b) 5, MP
7. Rba 3, Simp
8. Pf(b) ∙ Rba 6, 7, Conj
9. (∃x)[Pf(x) ∙ Rxa] 8, EG
QED
Sometimes, though, a derivation requires us to add or reduce functional structure,
as at 5.7.6.
5.7.6 1. (∀x)[Px ⊃ Pf(x)]
2. (∃x)[Pf(x) ∙ Qf(f(x))] / (∃x)[Pf(f(x)) ∙ Qf(f(x))]
In order to derive the conclusion of 5.7.6, we have to UI line 1 to ‘f (a)’. That will in-
crease the functional structure of the terms in the premise. That’s acceptable, though,
since the premise is universal. If a claim holds of anything, it holds of all functions of
anything. So, the derivation at 5.7.7 is perfectly fine.
than zero from the premise that all successors are greater than zero. But zero is not
greater than zero!
Decreasing functional structure is also unacceptable for UI. Imagine an interpre-
tation of 5.7.11 that takes ‘Px’ as ‘x is even’ and ‘f(x)’ as ‘twice x’, and imagine again a
domain of natural numbers.
5.7.11 (∀x)[Pf(f(x)) ⊃ Pf(x)]
On our interpretation, 5.7.11 says that if four times a number is even, then twice
that number is even. That’s true. But if we decrease the functional structure when
instantiating, as at 5.7.12, we get a false claim.
5.7.12 Pf(a) ⊃ Pa
5.7.12 says that if twice ‘a’ is even, then ‘a’ is even. If we interpret ‘a’ as any odd num-
ber, say 3, 5.7.12 is false even though 5.7.11 is true.
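The counterexample can be replayed numerically. In this sketch ‘P’ is evenness and ‘f’ doubles its argument, matching the interpretation above.

```python
# P: 'x is even'; f: 'twice x', as in the interpretation of 5.7.11.
P = lambda n: n % 2 == 0
f = lambda n: 2 * n

# 5.7.11 with structure preserved: Pf(f(x)) ⊃ Pf(x). True for every x,
# since 4x and 2x are both always even.
print(all(not P(f(f(x))) or P(f(x)) for x in range(20)))  # True

# 5.7.12, with structure illegally decreased: Pf(a) ⊃ Pa. Fails for odd a.
a = 3
print(not P(f(a)) or P(a))  # False: 6 is even but 3 is not
```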
So, when using universal instantiation and generalization rules, you can increase
functional structure. But never decrease functional structure with the universal rules.
Conversely, you may decrease functional structure with existential rules, both EI
and EG, but you may never increase functional structure with them.
Since existentially quantified sentences are so weak, merely claiming that some ob-
ject in the domain has a property, we can EG at any point over any singular terms.
‘(∃x)(Px ∙ Qx)’ can be inferred from any of the statements listed above the horizontal
line at 5.7.13, decreasing even very complex functional structure.
5.7.13 Pa ∙ Qa
Pf(a) ∙ Qf(a)
Pf(x) ∙ Qf(x)
Pf(a, b, c) ∙ Qf(a, b, c)
Pf(f(x), x, f(f(x))) ∙ Qf(f(x), x, f(f(x)))
Pf(f(g(f(a)))) ∙ Qf(f(g(f(a))))
Pf(f(g(f(x)))) ∙ Qf(f(g(f(x))))
(∃x)(Px ∙ Qx)
Moreover, with nested functions, you can EG in different ways. All of the proposi-
tions below the line at 5.7.14 can also be acceptably inferred from the proposition at
the top using EG.
5.7.14 Pf(f(g(f(a)))) ∙ Qf(f(g(f(a))))
(∃x)[Pf(f(g(f(a)))) ∙ Qf(f(g(f(a))))]
(∃x)[Pf(f(g(x))) ∙ Qf(f(g(x)))]
(∃x)[Pf(f(x)) ∙ Qf(f(x))]
(∃x)[Pf(x) ∙ Qf(x)]
Decreasing functional structure using EI is also acceptable. Either inference from
the quantified formula at 5.7.15 is acceptable.
But, as I said, invoking functions will make the derivation simpler. Let’s use a func-
tion ‘f(x, y)’ for ‘the first child of x and y’ to regiment the third premise and conclu-
sion; the result is at 5.7.19. Notice how quickly and easily the derivation follows.
5.7.19 1. (∀x)(Ax ⊃ Gx)
2. Aj
3. j=f(d, h) / (∃x)(∃y)Gf(x, y)
4. Aj ⊃ Gj 1, UI
5. Gj 4, 2, MP
6. Gf(d, h) 5, 3, IDi
7. (∃y)Gf(d, y) 6, EG
8. (∃x)(∃y)Gf(x, y) 7, EG
QED
5.7.20 contains a derivation that uses some composition of functions. Note that ‘B’
is a two-place predicate, taking as arguments a variable and a functor term with a vari-
able argument in the first premise, and taking as arguments two functor terms, each
with variable arguments, in the conclusion.
5.7.20 1. (∀x)[Ax ⊃ Bxf(x)]
2. (∃x)Af(x) / (∃x)Bf(x)f(f(x))
3. Af(a) 2, EI to “a”
4. Af(a) ⊃ Bf(a)f(f(a)) 1, UI to “f(a)”
5. Bf(a)f(f(a)) 4, 3, MP
6. (∃x)Bf(x)f(f(x)) 5, EG
QED
In the short derivation 5.7.21, we instantiate to a two-place function, f(x, g(x)), one
of whose arguments is itself a function. Since none of the arguments of any of the
functions in 5.7.21 are constants, UG is permissible at line 3.
5.7.21 1. (∀x)∼Cx / (∀x)∼Cf(x, g(x))
2. ∼Cf(x, g(x)) 1, UI
3. (∀x)∼Cf(x, g(x)) 2, UG
QED
5.7.22 derives the conclusion of an argument that uses concepts from number the-
ory in which functions play an important role.
5.7.22 1. If the product of a pair of numbers is odd, then the
product of the successors of those numbers is even.
2. Seven and three are odd numbers.
3. The product of seven and three is odd.
So, the product of the successors of seven and three
is even.
1. (∀x)(∀y){(Nx ∙ Ny) ⊃ [Of(x, y) ⊃ Ef(f(x), f(y))]}
2. Os ∙ Ns ∙ Ot ∙ Nt
3. Of(s, t) / Ef(f(s), f(t))
4. (∀y){(Ns ∙ Ny) ⊃ [Of(s, y) ⊃ Ef(f(s), f(y))]} 1, UI
5. (Ns ∙ Nt) ⊃ [Of(s, t) ⊃ Ef(f(s), f(t))] 4, UI
6. Ns ∙ Nt 2, Simp
7. Of(s, t) ⊃ Ef(f(s), f(t)) 5, 6, MP
8. Ef(f(s), f(t)) 7, 3, MP
QED
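As a check on content rather than form, the instance with seven and three can be computed directly. This arithmetic is outside the proof system, of course; the helper names below are my own.

```python
# Confirming the instance in 5.7.22 numerically, with s = 7 and t = 3.
odd = lambda n: n % 2 == 1
even = lambda n: n % 2 == 0
succ = lambda n: n + 1       # the successor function
prod = lambda x, y: x * y    # the product function

s, t = 7, 3
print(odd(prod(s, t)))                 # True: 7 * 3 = 21 is odd
print(even(prod(succ(s), succ(t))))    # True: 8 * 4 = 32 is even
```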
Summary
The derivation system we use with FF is basically the same as the one we use with
F; you mainly have to be careful to obey the guidelines about altering functional
structure.
We have come to the end of our main technical work. Still, there are many
further logical languages and systems, discussions of some of which are available as
supplements to this book.
KEEP IN MIND
The restrictions on instantiation and generalization rules for constants and variables are
the same whether the singular terms are simple or complex.
A complex singular term acts like a variable if there are any variables in any of its
argument places, or those of any of its embedded functions. Otherwise, it acts like
a constant.
You may increase functional structure when using universal rules (UI or UG), but you may
not decrease it.
You may decrease functional structure when using existential rules (EI or EG), but you
may not increase it.
If you change the functional structure of a wff, you must change it uniformly throughout.
EXERCISES 5.7a
Derive the conclusions of each of the following arguments.
1. 1. (∀x)[Ax ⊃ Af(x)]
2. Aa
3. f(a)=b / Ab
2. 1. (∀x)[Bx ≡ Bg(x)]
2. (∀x)g(x)=f(x, x)
3. Ba / Bf(a, a)
3. 1. (∀x)[Px ≡ Pf(x)]
2. f(a)=f(b)
3. Pa / Pb
4. 1. (∀x)[Px ⊃ Pf(x)]
2. (∀x)(Qx ⊃ Px)
3. Qa / Pf(a)
5. 1. (∀x)(∀y)(∀z)[f(x, z)=y ⊃ f(y, z)=x]
2. f(a, b)=c
3. Pc ∙ Pa / (∃x)[Pf(a, x) ∙ Pf(c, x)]
6. 1. (∀x)Hf(x)
2. a=f(b) ∙ b=f(c)
3. (∀x)(Hx ⊃ ∼Ix) / a=f(f(c)) ∙ ∼Ia
7. 1. (∀x)(∀y)f(x, y)=f(y, x)
2. a=f(b, c)
3. b=f(c, a)
4. a≠b
5. Pa ∙ Pb / (∃x)(∃y)(∃z)[Pf(x, z) ∙ Pf(y, z) ∙ x≠y]
8. 1. (∀x)(∀y)[f(x)=f(y) ⊃ x=y]
2. f(a)=g(c, d)
3. f(b)=g(c, e)
4. d=e / a=b
9. 1. (∀x)[Pf(x) ⊃ (Qx ≡ Rx)]
2. Pa ∙ Qf(a)
3. (∀x)f(f(x))=x / Rf(f(f(a)))
10. 1. Pa ∙ (∀x)[(Px ∙ x≠a) ⊃ Qax]
2. (∀x)(∀y)[x=f(y) ⊃ x≠y]
3. Pb ∙ b=f(a) / Qab
EXERCISES 5.7b
Translate each of the following arguments into FF. Then,
derive the conclusion using our rules of inference.
1. If something is your father, then you are its child. Pavel has no children. So,
Pavel is not the father of Andres. (a: Andres; p: Pavel; f(x): the father of x; Cxy:
x is the child of y)
2. No number is equal to its successor. One and two are numbers, and two is the
successor of one. So, one is not two. (a: one; b: two; f(x): the successor of x;
Nx: x is a number)
3. The brother of Amanda and Amanda are children of Nancy. Peter’s mother is
a woman named Nancy. Something is your mother if, and only if, it is a woman
and you are her child. So, Amanda and Peter share a mother. (a: Amanda; n:
Nancy; p: Peter; f(x): the mother of x; Wx: x is a woman; Bxy: x is a brother of
y; Cxy: x is a child of y)
4. Anyone is happy on any day if, and only if, they are unhappy on the following
day. Joyce is a person who will be happy in three days. Today is a day, and the day
after any day is a day. So, Joyce won’t be happy in two days. (j: Joyce; t: today;
f(x): the day after x; Dx: x is a day; Px: x is a person; Hxy: x is happy on day y)
5. Anyone who completes a task is proud on the following day. Friday is the day
that the person Emma completed the task of her logic homework. Saturday is
the day after Friday. So, Emma is proud on Saturday. (a: Friday; b: Saturday; e:
Emma; l: Emma’s logic homework; f(x): the day after x; Dx: x is a day; Px: x is a
person; Tx: x is a task; Cxyz: x completes y on day z)
6. The product of two and any odd is even. The sum of two and any odd is odd.
Seven is odd. So, the product of two and the sum of two and seven is even. (a:
two; b: seven; f(x, y): the product of x and y; g(x, y): the sum of x and y; Ex: x is
even; Ox: x is odd)
7. One, two, and four are distinct numbers. The positive square root of four is two.
Two is the sum of one and itself. So, the positive square root of some number is
the sum of some other number and itself. (a: one; b: two; c: four; f(x): the posi-
tive square root of x; f(x, y): the sum of x and y; Nx: x is a number)
8. One, two, and four are distinct numbers. The sum of two and the sum of one
and one is four. Two is the sum of one and itself. So, some number is the sum of
the sum of some other number with itself and the sum of the latter number with
itself again. (a: one; b: two; c: four; f(x, y): the sum of x and y; Nx: x is a number)
9. Exactly one number is the sum of itself and itself. Zero is the sum of itself and
itself. The number one is the successor of the number zero, and no number is its
own successor. So, one is not the sum of itself and itself. (a: zero; b: one; f(x):
the successor of x; f(x, y): the sum of x and y; Nx: x is a number)
10. Exactly two numbers are the products of themselves with themselves. The prod-
uct of a number and itself is its square. The square of zero is zero. The square of
one is one. Zero, one, and five are distinct numbers. So, the square of five is not
five. (a: zero; b: one; c: five; f(x): the square of x; g(x, y): the product of x and y;
Nx: x is a number)
KEY TERMS
This book is dedicated to distinguishing good deductive arguments from bad ones.
To that end, in the first five chapters, I discuss not only the rules for valid inferences,
but also ways to identify invalid inferences and construct counterexamples. Given
any argument of our main formal languages, PL, M, and F, we should be able to show
either that it is valid or that it is invalid.
In ordinary discourse, though, the concepts of argument and validity range far be-
yond their applications to our formal languages and deductive reasoning. Many argu-
ments are not deductive, including those we see every day in the news, in advertising,
in science, and in our personal conversations. Compliance with the formal, deductive
methods of this book is important, perhaps even necessary, for good reasoning. But
there is much more to be said about good argumentation generally.
The logic of non-deductive arguments is not as clean as that of the deductive logic
of this book. There are, in principle, no categorical formal rules for distinguishing
good inductive arguments from poor ones, which we can call, most broadly, fallacies.
For reasons partly rooted in the problems of induction identified by Hume, there is
no formal criterion for valid induction. No comprehensive set of rules is available for
identifying good informal reasoning. No list of rules suffices to show us how to avoid
all fallacies.
Still, we can identify some poor patterns of reasoning and pick out some general
principles for distinguishing good informal inferences from bad ones. Indeed, there
is a long history of philosophers trying to develop such principles. Aristotle identified
many fallacies, especially in On Sophistical Refutations, Prior Analytics, and On Rheto-
ric; his work continues to influence contemporary research. Other important histori-
cal figures in advancing the understanding of fallacies include Antoine Arnauld and
Pierre Nicole, in their seventeenth-century Port-Royal Logic, and John Stuart Mill,
in his nineteenth-century A System of Logic. All of these works identified patterns of
arguments or dialogue to be avoided in discourse that is not purely deductive.
Attention to the fecund tools of formal logic in the late nineteenth and early
twentieth centuries largely eclipsed research on fallacies, inductive reasoning, and
informal logic, even as greater attention was being paid to the methods of science,
especially to the related notions of confirmation and explanation. Through most of
the twentieth century, logicians paid little attention to inductive or informal fallacies,
and logic books mainly mentioned them in passing.
Still, beginning perhaps with Hamblin’s Fallacies in 1970, and with the develop-
ment of an academic society devoted to the study of informal logic, research on
non-deductive argument, especially informal fallacies, has recently burgeoned. In
parallel, philosophers have paid increasing attention to probabilistic reasoning and
details of the methods of the hard and social sciences. One result is that many con-
temporary logic books now split their attention between formal, deductive methods
and informal, inductive ones. Moreover, contemporary research on cognitive biases
has produced work related to the traditional study of fallacies. There are many ways
to infer badly.
This book is focused on formal, deductive methods, leaving the mountain of work
on informal logic and natural reasoning to other sources. But in the perhaps lamen-
table tradition of lip service to informal logic, this section is devoted to identifying
and describing some general fallacies of reasoning, both formal and informal. There
are many competing ways of classifying fallacies and distinguishing among them. I’ll
start by distinguishing formal fallacies, which are defects in the structure of an
argument, from informal fallacies, which are, generally, defects in the content, and so
perhaps not really logical fallacies in the sense in which we have been using ‘logic’.
FORMAL FALLACIES
One aspect of many fallacies, which some philosophers take to be essential to any
fallacy, is their shallow similarity to legitimate, even deductively valid, inferences.
This similarity is especially apparent in some formal fallacies, especially when they
are presented abstractly, like rules of inference. In §3.1, we saw two formal fallacies
that have traditional names: affirming the consequent and denying the antecedent.
AFA.1  α ⊃ β
       β / α      Fallacy of Affirming the Consequent
AFA.2  α ⊃ β
       ∼α / ∼β    Fallacy of Denying the Antecedent
Inferences of these forms are categorically fallacious, since the premises can be true
while the conclusion is false, as we saw in §3.1. Of course, it is possible to provide
substitution instances on which the conclusions are true. But we define ‘validity’ so that
any form that allows for true premises and a false conclusion is invalid.
Other formal fallacies include the fallacy of the undistributed middle, AFA.3,
which is similar in appearance to some rules of Aristotelian logic, or syllogism, a logi-
cal theory superseded by, and mainly derivable from, our work in predicate logic.
INFORMAL FALLACIES
Consider the argument AFA.7, which Descartes presents in his letter of dedication to
the Meditations on First Philosophy.
AFA.7 We must believe in God’s existence because it is taught in the Holy
Scriptures, and, conversely, we must believe the Holy Scriptures
because they have come from God (Descartes, AT VII.2, CSM II.3).
In the ensuing discussion, Descartes points out two characteristics of the argu-
ment. First, whether one accepts the argument or not depends on one’s background
beliefs. For theists like Descartes, such an argument is acceptable. But nonbelievers
will judge it to be fallaciously circular. Indeed, though Descartes does not say so ex-
plicitly, such a fallacy has a traditional Latin name, petitio principii, and is also known
as begging the question, or just circular reasoning.
IRRELEVANT PREMISES
Many arguments in our day-to-day conversations are not nearly as tight as most of
the arguments in this textbook. In the valid inferences of this book, I usually provide
just the right premises to derive the conclusion. In politics, philosophy, and ordinary
conversation, people tend to speak or argue more freely, often mistakenly omitting
key premises or offering irrelevant reasons for their conclusions. Arguments missing
a premise are called enthymemes; they are easily remedied by the addition of what
was omitted. Arguments with irrelevant premises may be called non sequiturs, since
the conclusions don’t follow, either deductively or informally.
We can identify several different kinds of irrelevant premises. Advertisers often
seem to commit the fallacy of appeal to unreliable authority when they present the
endorsement of a product from a famous person. Athletes, movie stars, and other ce-
lebrities are often used to sell products, even if their authority about those products
is minimal. A football player, say, usually has no particular expertise about the nutri-
tional value or the taste of a breakfast cereal, or the reliability of a car. Expertise in one
domain, acting, say, does not automatically transfer to another domain, like evaluat-
ing a medication. The premise of a person’s authority in some area is not relevant to
the truth of a claim in another area.
Still, many product endorsements can be seen not as arguments for the quality of
a product, but as lending the product a certain quality by association with a celeb-
rity. Since many people, either consciously or unconsciously, idolize celebrities, a
celebrity’s endorsement of a product can be a compelling reason to buy or approve of
the product. And, of course, many celebrities are experts about some of the products
they endorse: a model’s endorsement of a skin care product or a basketball player’s
endorsement of a sneaker may well be based on reliable expertise.
So appeals to authority can be challenging to evaluate. Given the vastness and de-
tail of human knowledge, all of us need to defer to authorities: to doctors about our
health, to physicists about the structure of the universe, to mechanics about our cars.
A biology professor’s assertion about the structure of a phylum is a good reason to be-
lieve that the phylum is structured as she says. A biology professor’s assertions about
the development of a fetus in the womb might be similarly reliable. But if a biology
professor were to make assertions about the morality of abortion, say, we might ques-
tion whether her expertise extends to ethical domains.
Even if someone is not reliably expert in an area, that is not a cause to dismiss a
claim made in that area by that person. We should not rule out an assertion because
of the ignorance or even bias of a speaker. But neither should we use their authority in
other areas as an argument for the claim.
More broadly, we commit the so-called ad hominem (to the person) fallacy when
we accept or deny a claim on the basis of the person who makes it. An ad populum
fallacy is an appeal to a group sentiment, accepting or denying a claim because of oth-
ers’ beliefs about the claim. Nationalists may appeal to their views about a country’s
values in order to court voters: this is what it means to be an American, or French, or
Chinese. Advertisers often tout the popularity of a product as a reason to buy it. While
a product’s popularity may be justified by its effectiveness or utility, it is no guarantee
that you should buy it. Appeals to tradition are similar. In the United States, we of-
ten hear that rights to bear weapons are grounded in the Second Amendment to the
Constitution, though the difference between the kinds of weapons available now and
those that were available at the time the Bill of Rights was adopted may undermine
the importance of that tradition.
positions in between the two branches. I might be neither with you nor against you,
but somewhere in between.
Begging the question (or petitio principii, or circular argument), a classic example
of which we saw in the scriptural circle from Descartes at the beginning of this sec-
tion, may be seen as a fallacy of unwarranted premises. Neither the truth of scripture
nor the existence of God is warranted from inside of the small circle. Of course, one
might have independent reasons to believe in God or the truth of scripture, or one
may not. But the premises in the circular argument themselves are insufficient.
An ordinary example of begging the question is an argument for someone’s, or
something’s, trustworthiness. You can ask the person (or, say, the crystal ball) whether
s/he is trustworthy, but you are unlikely to get any information that will assuage your
concerns, unless the person presents evidence apart from their assurances. The crys-
tal ball’s assurances that you should believe what it says are no assurances at all.
Some results in philosophy suggest that some sorts of circular reasoning, and thus
begging the question, are unavoidable. According to philosophers who are sometimes
called atomists, certain fundamental or basic propositions (perhaps the claim from
Augustine and Descartes that I exist whenever I am thinking, perhaps the claim that
one and one are two, perhaps our current sense perceptions) are known incorrigibly;
other beliefs are derived from the basic ones. In contrast to the claims of atomists,
holists argue that no belief is fundamental. Any belief requires a host of other beliefs
in order even to make sense. For the holist, the argument for any claim can never
be traced back to fundamental, incorrigible premises. All reasoning is, at root, circu-
lar. Still, whether atomism or holism is true, small circles like the scriptural or trust
circles seem clearly fallacious.
Like begging the question, the slippery slope fallacy is closely related to some le-
gitimate reasoning patterns. In its most offensive instances, users of the slippery slope
fallacy argue against a small change by insisting that it will lead to larger, repugnant
changes: if we limit sales of assault rifles, then the government will start to limit all
guns, and repress the people, and take away all of our rights until we are nothing but
slaves. While pretty much everyone agrees that we must be aware of unjust extensions
of governmental intrusion into our lives, not every federal restriction is an enslave-
ment of the people. Such slippery slope arguments often involve appeals to fear, as we
cringe from the loss of important freedoms.
Still, as the famous poem by Pastor Martin Niemöller points out, we must be vigi-
lant about the consequences of any of our actions.
First they came for the Socialists, and I did not speak out—
Because I was not a Socialist.
Then they came for the Trade Unionists, and I did not speak out—
Because I was not a Trade Unionist.
Then they came for the Jews, and I did not speak out—
Because I was not a Jew.
Then they came for me—and there was no one left to speak for me.
CAUSAL FALLACIES
Smoking causes cancer. It does not cause cancer in quite the same way that throwing
a stone off a cliff causes it to fall into the sea below. In the latter, simple case, the causa-
tion is nearly categorical. A strong wind or errant arm angle might alter the trajectory
of the stone, but it feels like we can almost see the workings of the physical laws, espe-
cially of gravity, in some cases.
Other causal relationships, like the connection between smoking and cancer, are
more complicated. The time between an action and its effect may be distant, and the
connection between the cause and the effect is not without exception. Not everyone
who smokes gets lung cancer, and even those who do may appear healthy for a long
time. Similarly, the effects of carbon emissions on climate change have been more
difficult to see than the falling stone off of the cliff. It is sometimes difficult on a blus-
tery day to believe the warnings we hear about global warming. The proper inferences
require detailed understandings of statistical principles and the relevant data. Such
research is essentially scientific, and failure to understand and respect good scientific
practice underlies lots of fallacious reasoning.
One error, common when using statistics and science more generally, is to make an
induction on too few cases, often by using a sample size that is too small, by neglect-
ing to randomize one’s sample effectively, or by using an unrepresentative sample.
Such errors are essentially hasty generalizations.
A similar error is called post hoc ergo propter hoc (after this, so caused by this). It
often darkens before a storm, but the storm is not caused by the darkness. There are
causal connections among events and phenomena, and there are accidental correla-
tions. It is, in large part, the business of science to distinguish them.
One slogan invoked to help people avoid such fallacious inferences is the claim that
correlation does not entail causation. Often, phenomena that are related have no di-
rect causal connection. A snowstorm might cause schools to close and milk deliveries
to be delayed. But the closed schools don’t cause the delays in the deliveries, nor the
reverse. Those two events are collateral effects of a common cause, and so their cor-
relation need not indicate any causal relationship between them.
More importantly, effects may correlate without even having a common cause.
An amusing website, Spurious Correlations (http://www.tylervigen.com/spurious-correlations),
shows that many unrelated events can be correlated, even with statistical
significance: the number of people who drowned by falling into a pool correlates
with the number of films Nicolas Cage appeared in, between 1999 and 2009; total
revenue generated by arcades correlates with computer science doctorates awarded
in the United States over the same period.
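The same phenomenon is easy to manufacture: two series generated independently, with no causal or common source, can still show a sizable correlation. The simulation below is my own illustration; the seed and series lengths are arbitrary.

```python
# Two independent random walks often correlate substantially by accident.
import random

random.seed(1)

def walk(n):
    # cumulative sum of independent normal steps
    x, path = 0.0, []
    for _ in range(n):
        x += random.gauss(0, 1)
        path.append(x)
    return path

a, b = walk(500), walk(500)

# Pearson correlation coefficient, computed by hand.
ma, mb = sum(a) / len(a), sum(b) / len(b)
cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
r = cov / (sum((x - ma) ** 2 for x in a) ** 0.5
           * sum((y - mb) ** 2 for y in b) ** 0.5)
print(r)  # often far from zero, despite complete independence
```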
Humans are notoriously bad at applying mathematics. One might categorize sta-
tistical and other mathematical errors among the causal fallacies. One commits the
gambler’s fallacy when one’s expectation for a random event increases over trials
in which the event does not occur: thinking that a particular roll of the dice is more
likely since it hasn’t been rolled lately, say. The likelihood of flipping an ideal fair coin
and getting heads is one-half, no matter how many tails in a row have come up.
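A quick simulation makes the point vivid: after a run of three tails, the next flip still comes up heads about half the time. The setup below is my own illustration.

```python
# The chance of heads after three tails in a row is still about one-half.
import random

random.seed(0)
flips = [random.choice("HT") for _ in range(200_000)]

# Gather the flip immediately following each run of three tails.
after_ttt = [flips[i + 3] for i in range(len(flips) - 3)
             if flips[i:i + 3] == ["T", "T", "T"]]

rate = after_ttt.count("H") / len(after_ttt)
print(round(rate, 3))  # approximately 0.5
```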
Many people misunderstand statistics and make decisions, even important life
decisions, on such ill-informed and ill-understood grounds. Indeed, some philosophers
and psychologists call us fundamentally irrational because of our failures to apply
mathematics well and because of other cognitive biases, including framing, or anchoring,
in which first impressions distract us from what should be more compelling subsequent
evidence. People are especially bad at understanding and applying statistics,
especially concepts like regression toward the mean and the importance of sample
size. Research into human cognitive limitations and biases is legion these days; work
by Daniel Kahneman is especially engaging and enlightening.
AMBIGUITY
As we have seen throughout the book, a central advantage of formal languages is their
relative lack of ambiguity, especially when compared to natural languages. A word,
phrase, or sentence is ambiguous when it has multiple meanings. ‘Bear’ is ambiguous
between a verb meaning ‘carry’ and a noun for an ursine animal; ‘visiting relatives can
be annoying’ is an ambiguous sentence.
Fallacies of ambiguity often arise from using words in different ways in different
parts of a sentence. For example, one might deny the existence of a past or future,
even in thought, since to think of the past or future, one has to make it present. Such
SUMMARY
We have identified some of the most common fallacies, grouping them into the cat-
egories of irrelevant premises, unwarranted or weak premises, causal fallacies, and
ambiguity:
ad hominem
ad populum
anchoring, or framing
begging the question
complex questions
composition
division
emotion
false dilemma
gambler’s fallacy
hasty generalization, or induction on too few cases
post hoc ergo propter hoc
slippery slope
straw man
tradition
unreliable authority
Fallacies are sometimes committed intentionally, in order to manipulate consumers
or voters, say. The best way to avoid being subject to such manipulation is to learn to
read and consume in a critical way, to question claims of politicians and advertisers,
to look at good research and facts, and to learn how to use and understand statistics,
especially concepts such as statistical significance, confidence intervals, and regres-
sion to the mean (among others).
You may also worry about how to avoid committing fallacies in your own argu-
ments and conversations. Most fallacies are, in some ways, close to justifiable patterns
of reasoning. For example, while the gambler’s fallacy is clearly an error in statistical
reasoning, there are cases in which the longer one waits, the more likely what one is
waiting for will arrive, as in waiting for a bus or an elevator. Learning more mathemat-
ics, especially more statistics, and embracing the mathematics of daily life is a good
step toward avoiding some errant reasoning.
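That contrast can be made concrete with a short simulation (a Python sketch offered as an illustration, not part of the text; the fair coin and the sample size are assumptions of the example). Unlike a bus, which really does become more likely to arrive the longer you wait, a fair coin is memoryless: heads is no more likely after a run of tails.

```python
import random

random.seed(0)  # fixed seed so the experiment is repeatable

# Simulate 200,000 independent flips of a fair coin (True = heads).
flips = [random.random() < 0.5 for _ in range(200_000)]

# Collect the outcome of the flip immediately following three tails in a row.
after_three_tails = [flips[i + 3]
                     for i in range(len(flips) - 3)
                     if not any(flips[i:i + 3])]

p = sum(after_three_tails) / len(after_three_tails)
print(round(p, 3))  # stays near 0.5: heads never becomes "due"
```

The gambler's fallacy is precisely the expectation that this proportion should rise after a run of tails.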
To avoid fallacies of suppressed premises, ask yourself whether the argument is
missing some important information. To avoid fallacies of unwarranted premises,
you can ask whether each assumption or reason given in an argument is itself justi-
fied or justifiable. Focus also on the relevance of each premise to its conclusion. For
ambiguity, look at the grammar of an argument and the meanings of its terms. And
to avoid causal fallacies, make sure to consider alternative explanations of any event.
Adjudicating between alternative explanations can be tricky work, sometimes requir-
ing deep scientific understanding. But the work of the logician is mainly to ensure
that the structure of an explanatory argument is legitimate, to verify that the form of
the scientific reasoning is acceptable.
5. What is the difference between formal and informal fallacies? Compare the
work of the first five chapters on invalid inferences with the fallacies of this
section.
SUGGESTED READINGS
Aristotle. Prior Analytics, Sophistical Refutations, Rhetoric. In The Complete Works of Aristo-
tle, edited by Jonathan Barnes, 39–113, 278–314, 2152–2269. Princeton, NJ: Princeton
University Press, 1984. The three best sources in Aristotle for discussions of reasoning,
fallacious and legitimate.
Arnauld, Antoine, and Pierre Nicole. Logic, or the Art of Thinking, 5th ed. Translated and ed-
ited by Jill Vance Buroker. Cambridge: Cambridge University Press, (1683) 1996. Incom-
parably influential in the seventeenth, eighteenth, and even nineteenth centuries, this
book, also known as the Port-Royal Logic, has received renewed attention in recent years.
It is heavily influenced by Descartes and is perhaps the most important discussion of rea-
soning between Aristotle’s and Kant’s.
Carroll, Lewis. Alice’s Adventures in Wonderland and Through the Looking-Glass. New York:
Signet Classic, (1872, 1960) 2000. Carroll’s amusing uses of logical fallacies are manifest
in Through the Looking Glass. His work on logic is an interesting perspective on the last
days of pre-Fregean logic; his two books Symbolic Logic and Game of Logic are available
together from Dover.
Hamblin, C. L. Fallacies. London: Methuen, 1970. Hamblin emphasizes the dialectical na-
ture of reasoning and criticizes some treatments of fallacies for failing to recognize the
broader contexts. The book contains a useful discussion of Aristotle’s work.
Hansen, Hans. “Fallacies.” In Stanford Encyclopedia of Philosophy, edited by Edward N. Zalta. Stan-
ford University, Summer 2015. plato.stanford.edu/archives/sum2015/entries/fallacies/.
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011. An
engaging, readable, and recently influential overview of the long and impressive career of
Kahneman, a psychologist awarded the Nobel Prize in economics for his studies of human reasoning.
For more strictly academic work, see Judgment Under Uncertainty: Heuristics and Biases,
edited by Daniel Kahneman, Paul Slovic, and Amos Tversky (Cambridge: Cambridge
University Press, 1982).
Walton, Douglas. A Pragmatic Theory of Fallacies. Tuscaloosa: University of Alabama Press,
1995. Walton’s work is highly influential in the contemporary world of informal logic, and
he invokes some formal tools for explication. See also his book-length treatments Begging
the Question (New York: Greenwood, 1991); Slippery Slope Arguments (Oxford: Claren-
don Press, 1992); and Ad Hominem Arguments (Tuscaloosa: University of Alabama Press,
1998).
In addition to the above sources, many standard logic textbooks, especially textbooks
for informal logic or critical reasoning, have extended discussion of logical fallacies.
Appendix B: The Logical Equivalence of the Rules of Equivalence
De Morgan's Laws: ~(α ∨ β) ←→ ~α ∙ ~β
~ (α ∨ β) ~ α ∙ ~ β
0 1 1 1 0 1 0 0 1
0 1 1 0 0 1 0 1 0
0 0 1 1 1 0 0 0 1
1 0 0 0 1 0 1 1 0
De Morgan's Laws: ~(α ∙ β) ←→ ~α ∨ ~β
~ (α ∙ β) ~ α ∨ ~ β
0 1 1 1 0 1 0 0 1
1 1 0 0 0 1 1 1 0
1 0 0 1 1 0 1 0 1
1 0 0 0 1 0 1 1 0
Association: α ∨ (β ∨ γ) ←→ (α ∨ β) ∨ γ
α ∨ (β ∨ γ) (α ∨ β) ∨ γ
1 1 1 1 1 1 1 1 1 1
1 1 1 1 0 1 1 1 1 0
1 1 0 1 1 1 1 0 1 1
1 1 0 0 0 1 1 0 1 0
0 1 1 1 1 0 1 1 1 1
0 1 1 1 0 0 1 1 1 0
0 1 0 1 1 0 0 0 1 1
0 0 0 0 0 0 0 0 0 0
Association: α ∙ (β ∙ γ) ←→ (α ∙ β) ∙ γ
α ∙ (β ∙ γ) (α ∙ β) ∙ γ
1 1 1 1 1 1 1 1 1 1
1 0 1 0 0 1 1 1 0 0
1 0 0 0 1 1 0 0 0 1
1 0 0 0 0 1 0 0 0 0
0 0 1 1 1 0 0 1 0 1
0 0 1 0 0 0 0 1 0 0
0 0 0 0 1 0 0 0 0 1
0 0 0 0 0 0 0 0 0 0
Distribution: α ∨ (β ∙ γ) ←→ (α ∨ β) ∙ (α ∨ γ)
α ∨ (β ∙ γ) (α ∨ β) ∙ (α ∨ γ)
1 1 1 1 1 1 1 1 1 1 1 1
1 1 1 0 0 1 1 1 1 1 1 0
1 1 0 0 1 1 1 0 1 1 1 1
1 1 0 0 0 1 1 0 1 1 1 0
0 1 1 1 1 0 1 1 1 0 1 1
0 0 1 0 0 0 1 1 0 0 0 0
0 0 0 0 1 0 0 0 0 0 1 1
0 0 0 0 0 0 0 0 0 0 0 0
Distribution: α ∙ (β ∨ γ) ←→ (α ∙ β) ∨ (α ∙ γ)
α ∙ (β ∨ γ) (α ∙ β) ∨ (α ∙ γ)
1 1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 0 1 1 1 1 1 0 0
1 1 0 1 1 1 0 0 1 1 1 1
1 0 0 0 0 1 0 0 0 1 0 0
0 0 1 1 1 0 0 1 0 0 0 1
0 0 1 1 0 0 0 1 0 0 0 0
0 0 0 1 1 0 0 0 0 0 0 1
0 0 0 0 0 0 0 0 0 0 0 0
Contraposition: α ⊃ β ←→ ~β ⊃ ~α
α ⊃ β ~ β ⊃ ~ α
1 1 1 0 1 1 0 1
1 0 0 1 0 0 0 1
0 1 1 0 1 1 1 0
0 1 0 1 0 1 1 0
Material Implication: α ⊃ β ←→ ~α ∨ β
α ⊃ β ~ α ∨ β
1 1 1 0 1 1 1
1 0 0 0 1 0 0
0 1 1 1 0 1 1
0 1 0 1 0 1 0
Material Equivalence: α ≡ β ←→ (α ⊃ β) ∙ (β ⊃ α)
α ≡ β (α ⊃ β) ∙ (β ⊃ α)
1 1 1 1 1 1 1 1 1 1
1 0 0 1 0 0 0 0 1 1
0 0 1 0 1 1 0 1 0 0
0 1 0 0 1 0 1 0 1 0
Material Equivalence: α ≡ β ←→ (α ∙ β) ∨ (~α ∙ ~β)
α ≡ β (α ∙ β) ∨ (~ α ∙ ~ β)
1 1 1 1 1 1 1 0 1 0 0 1
1 0 0 1 0 0 0 0 1 0 1 0
0 0 1 0 0 1 0 1 0 0 0 1
0 1 0 0 0 0 1 1 0 1 1 0
Exportation: (α ∙ β) ⊃ γ ←→ α ⊃ (β ⊃ γ)
(α ∙ β) ⊃ γ α ⊃ (β ⊃ γ)
1 1 1 1 1 1 1 1 1 1
1 1 1 0 0 1 0 1 0 0
1 0 0 1 1 1 1 0 1 1
1 0 0 1 0 1 1 0 1 0
0 0 1 1 1 0 1 1 1 1
0 0 1 1 0 0 1 1 0 0
0 0 0 1 1 0 1 0 1 1
0 0 0 1 0 0 1 0 1 0
Tautology: α ←→ α ∨ α
Tautology: α ←→ α ∙ α

α ∨ α    α ∙ α
1 1 1    1 1 1
0 0 0    0 0 0
Biconditional De Morgan's Law (BDM): ~(α ≡ β) ←→ ~α ≡ β
~ (α ≡ β) ~ α ≡ β
0 1 1 1 0 1 0 1
1 1 0 0 0 1 1 0
1 0 0 1 1 0 1 1
0 0 1 0 1 0 0 0
Biconditional Inversion (BInver): α ≡ β ←→ ~α ≡ ~β
α ≡ β ~ α ≡ ~ β
1 1 1 0 1 1 0 1
1 0 0 0 1 0 1 0
0 0 1 1 0 0 0 1
0 1 0 1 0 1 1 0
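Each equivalence tabulated above can also be checked mechanically. The sketch below (Python, offered as an illustration rather than as part of the text) evaluates both sides of several of the rules under every assignment of truth values and confirms that they never differ:

```python
from itertools import product

def imp(p, q):
    # Material conditional: p ⊃ q is false only when p is true and q is false.
    return (not p) or q

# Each entry pairs the two sides of a rule of equivalence, as functions
# of the truth values of α and β.
rules = {
    "De Morgan's Laws (1)": (lambda a, b: not (a or b),  lambda a, b: (not a) and (not b)),
    "De Morgan's Laws (2)": (lambda a, b: not (a and b), lambda a, b: (not a) or (not b)),
    "Contraposition":       (lambda a, b: imp(a, b),     lambda a, b: imp(not b, not a)),
    "Material Implication": (lambda a, b: imp(a, b),     lambda a, b: (not a) or b),
    "Material Equivalence": (lambda a, b: a == b,        lambda a, b: imp(a, b) and imp(b, a)),
    "BDM":                  (lambda a, b: not (a == b),  lambda a, b: (not a) == b),
    "BInver":               (lambda a, b: a == b,        lambda a, b: (not a) == (not b)),
}

for name, (lhs, rhs) in rules.items():
    assert all(lhs(a, b) == rhs(a, b)
               for a, b in product([True, False], repeat=2)), name

print("all listed equivalences verified")
```

The same brute-force idea extends to the three-variable rules (Association, Distribution, Exportation) by enumerating eight assignments instead of four.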
Summary of Rules and Terms
Names of Languages
PL: propositional logic
M: monadic (first-order) predicate logic
F: full (first-order) predicate logic
FF: full (first-order) predicate logic with functors
Symbols
~, or tilde, is used to represent negation. (2.1)
∙, or dot, is used to represent conjunction. (2.1)
∨, or vel, is used to represent disjunction. (2.1)
⊃, or horseshoe, is used to represent material implication. (2.1)
≡, or triple bar, is used to represent biconditionals. (2.1)
←→ is a metalogical symbol used to show the equivalence of two different forms
of wffs. (3.3)
∃ is the existential quantifier. (4.1)
∀ is the universal quantifier. (4.1)
ℱ is a metalogical symbol for a formula. (4.4)
= is the identity relation; α=β is shorthand for formulas using an identity relation
Iαβ. (5.4)
EXERCISES 1.4
5. P1. The faster you go, the quicker you get to your destination.
P2. As you go faster, time itself becomes compressed.
P3. But it is not possible to go so fast that you get there before you started.
C. Local timelines are temporally ordered.

10. P1. Rulers define 'justice' as simply making a profit from the people.
P2. Unjust men come off best in business.
P3. Just men refuse to bend the rules.
C. Just men get less and are despised by their own friends.

15. P1. The greatest danger to liberty is the omnipotence of the majority.
P2. A democratic power is never likely to perish for lack of strength or resources, but it may fall because of the misdirection of this strength and the abuse of resources.
C. If liberty is lost, it will be due to an oppression of minorities, which may drive them to an appeal to arms.

20. P1. Reading challenges a person more than any other task of the day.
P2. It requires the type of training that athletes undergo, and with the same life-long dedication.
P3. Books must be read as deliberately and reservedly as they were written.
C. To read well, as in, to read books in a true spirit, is a noble exercise.

25. P1. In aristocratic countries, great families have enormous privileges, which their pride rests on.
P2. They consider these privileges as a natural right ingrained in their being, and thus their feeling of superiority is a peaceful one.
P3. They have no reason to boast of the prerogatives which everyone grants to them without question.
C. When public affairs are directed by an aristocracy, the national pride takes a reserved, haughty and independent form.

30. P1. I have a clear and distinct understanding of my mind, independent of my body.
P2. I have a clear and distinct understanding of my body, independent of my mind.
P3. Whatever I can clearly and distinctly conceive of as separate, can be separated by God, and so are really distinct.
C. My mind is distinct from my body.

EXERCISES 1.5
5. Invalid
10. Valid, unsound
15. Valid, unsound
20. Invalid
25. Valid, unsound
30. Invalid
35. Valid, soundness is up for debate
40. Invalid

EXERCISES 2.1a
5. Antecedent: Gita's financial aid comes through. Consequent: Gita plays lacrosse.
10. Antecedent: Percy rounds up volunteers. Consequent: Orlando organizes peer tutoring.
15. Antecedent: Thoreau pays his taxes. Consequent: Emerson bails out Thoreau.
20. Antecedent: Singer is a utilitarian. Consequent: No one else is.

EXERCISES 2.1b
5. M ∙ A
10. P ⊃ (C ∙ F)
15. (C ∙ P) ≡ ~T
20. M ⊃ (P ∙ W)
25. (H ∨ T) ∙ (A ∨ R)
30. (T ∨ ~S) ∙ C
Solutions to Selected Exercises
EXERCISES 2.4
5.
E    E ≡ ~ E
1    1 0 0 1
0    0 0 1 0

10.
M N    ~ (M ∨ N) ≡ N
1 1    0 1 1 1 0 1
1 0    0 1 1 0 1 0
0 1    0 0 1 1 0 1
0 0    1 0 0 0 0 0
15.
S T (S ∙ ~ T) ∨ (T ⊃ S)
1 1 1 0 0 1 1 1 1 1
1 0 1 1 1 0 1 0 1 1
0 1 0 0 0 1 0 1 0 0
0 0 0 0 1 0 1 0 1 0
20.
A B (A ≡ ~ B) ⊃ [(B ∨ ~ B) ∙ A]
1 1 1 0 0 1 1 1 1 0 1 1 1
1 0 1 1 1 0 1 0 1 1 0 1 1
0 1 0 1 0 1 0 1 1 0 1 0 0
0 0 0 0 1 0 1 0 1 1 0 0 0
25.
P Q R (P ⊃ Q) ∨ [R ≡ (~ Q ∙ P)]
1 1 1 1 1 1 1 1 0 0 1 0 1
1 1 0 1 1 1 1 0 1 0 1 0 1
1 0 1 1 0 0 1 1 1 1 0 1 1
1 0 0 1 0 0 0 0 0 1 0 1 1
0 1 1 0 1 1 1 1 0 0 1 0 0
0 1 0 0 1 1 1 0 1 0 1 0 0
0 0 1 0 1 0 1 1 0 1 0 0 0
0 0 0 0 1 0 1 0 1 1 0 0 0
30.
U V W [U ⊃ (V ⊃ W)] ∙ (V ∨ W)
1 1 1 1 1 1 1 1 1 1 1 1
1 1 0 1 0 1 0 0 0 1 1 0
1 0 1 1 1 0 1 1 1 0 1 1
1 0 0 1 1 0 1 0 0 0 0 0
0 1 1 0 1 1 1 1 1 1 1 1
0 1 0 0 1 1 0 0 1 1 1 0
0 0 1 0 1 0 1 1 1 0 1 1
0 0 0 0 1 0 1 0 0 0 0 0
35.
A B C D ~ (A ⊃ B) ∙ (C ∨ D)
1 1 1 1 0 1 1 1 0 1 1 1
1 1 1 0 0 1 1 1 0 1 1 0
1 1 0 1 0 1 1 1 0 0 1 1
1 1 0 0 0 1 1 1 0 0 0 0
1 0 1 1 1 1 0 0 1 1 1 1
1 0 1 0 1 1 0 0 1 1 1 0
1 0 0 1 1 1 0 0 1 0 1 1
1 0 0 0 1 1 0 0 0 0 0 0
0 1 1 1 0 0 1 1 0 1 1 1
0 1 1 0 0 0 1 1 0 1 1 0
0 1 0 1 0 0 1 1 0 0 1 1
0 1 0 0 0 0 1 1 0 0 0 0
0 0 1 1 0 0 1 0 0 1 1 1
0 0 1 0 0 0 1 0 0 1 1 0
0 0 0 1 0 0 1 0 0 0 1 1
0 0 0 0 0 0 1 0 0 0 0 0
40.
M N O P [(~ M ∙ N) ∨ (O ⊃ P)] ≡ M
1 1 1 1 0 1 0 1 1 1 1 1 1 1
1 1 1 0 0 1 0 1 0 1 0 0 0 1
1 1 0 1 0 1 0 1 1 0 1 1 1 1
1 1 0 0 0 1 0 1 1 0 1 0 1 1
1 0 1 1 0 1 0 0 1 1 1 1 1 1
1 0 1 0 0 1 0 0 0 1 0 0 0 1
1 0 0 1 0 1 0 0 1 0 1 1 1 1
1 0 0 0 0 1 0 0 1 0 1 0 1 1
0 1 1 1 1 0 1 1 1 1 1 1 0 0
0 1 1 0 1 0 1 1 1 1 0 0 0 0
0 1 0 1 1 0 1 1 1 0 1 1 0 0
0 1 0 0 1 0 1 1 1 0 1 0 0 0
0 0 1 1 1 0 0 0 1 1 1 1 0 0
0 0 1 0 1 0 0 0 0 1 0 0 1 0
0 0 0 1 1 0 0 0 1 0 1 1 0 0
0 0 0 0 1 0 0 0 1 0 1 0 0 0
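Truth tables like those above can also be generated mechanically. As an illustration (a Python sketch, not part of the text), the following recomputes the column under the main operator for the formula of exercise 30, [U ⊃ (V ⊃ W)] ∙ (V ∨ W), with rows ordered as in the table (true before false):

```python
from itertools import product

def imp(p, q):
    # Material conditional: p ⊃ q
    return (not p) or q

main_column = []
for u, v, w in product([True, False], repeat=3):  # rows TTT, TTF, ..., FFF
    main_column.append(imp(u, imp(v, w)) and (v or w))

print([int(x) for x in main_column])  # [1, 0, 1, 0, 1, 1, 1, 0]
```

The printed values agree with the column under the dot in the table for exercise 30.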
20. 1. M ⊃ N
2. O ⊃ P
3. M ∙ Q / N ∨ P
4. M 3, Simp
5. M ∨ O 4, Add
6. N ∨ P 1, 2, 5, CD
QED

25. 1. O ⊃ Q
2. Q ⊃ P
3. P ⊃ (R ∙ S)
4. O / R ∙ S
5. O ⊃ P 1, 2, HS
6. P 5, 4, MP
7. R ∙ S 3, 6, MP
QED

35. 1. R ⊃ S
2. S ⊃ (T ⊃ U)
3. R
4. U ⊃ R / T ⊃ R
5. R ⊃ (T ⊃ U) 1, 2, HS
6. T ⊃ U 5, 3, MP
7. T ⊃ R 6, 4, HS
QED

40. 1. P ⊃ (Q ⊃ R)
2. S ⊃ (T ⊃ U)
3. W ⊃ X
4. ~(Q ⊃ R)
5. P ∨ S
6. T ∨ W / U ∨ X
7. (Q ⊃ R) ∨ (T ⊃ U) 1, 2, 5, CD
8. T ⊃ U 7, 4, DS
9. U ∨ X 8, 3, 6, CD
QED

EXERCISES 3.2c: TRANSLATIONS
4. 1. T ⊃ U
2. V ∨ ~U
3. ~V / ~T ∨ W

8. 1. X ⊃ Y
2. Y ⊃ Z
3. W ∨ X
4. ~W ∙ Y / Z ∙ ~W

12. 1. D ∨ F
2. D ⊃ ~B
3. ~~B / F ∙ ~D

16. 1. S ⊃ E
2. E ⊃ ~B
3. E ⊃ ~A
4. S ∨ E / ~B ∨ ~A

EXERCISES 3.2c: DERIVATIONS
8. 1. X ⊃ Y
2. Y ⊃ Z
3. W ∨ X
4. ~W ∙ Y / Z ∙ ~W
5. ~W 4, Simp
6. X 3, 5, DS
7. X ⊃ Z 1, 2, HS
8. Z 6, 7, MP
9. Z ∙ ~W 5, 8, Conj
QED

12. 1. D ∨ F
2. D ⊃ ~B
3. ~~B / F ∙ ~D
4. ~D 2, 3, MT
5. F 1, 4, DS
6. F ∙ ~D 5, 4, Conj
QED

16. 1. S ⊃ E
2. E ⊃ ~B
3. E ⊃ ~A
4. S ∨ E / ~B ∨ ~A
5. S ⊃ ~B 1, 2, HS
6. ~B ∨ ~A 5, 3, 4, CD
QED
30. 1. ~P ∨ Q
2. R ⊃ ~Q
3. R ∨ ~S
4. ~T ⊃ S / P ⊃ T
5. P ⊃ Q 1, Impl
6. ~~Q ⊃ ~R 2, Cont
7. Q ⊃ ~R 6, DN
8. P ⊃ ~R 5, 7, HS
9. ~~R ∨ ~S 3, DN
10. ~R ⊃ ~S 9, Impl
11. P ⊃ ~S 8, 10, HS
12. ~S ⊃ ~~T 4, Cont
13. ~S ⊃ T 12, DN
14. P ⊃ T 11, 13, HS
QED

35. 1. T ⊃ (U ⊃ V)
2. Q ⊃ (R ⊃ V)
3. (T ∙ U) ∨ (Q ∙ R) / V
4. (T ∙ U) ⊃ V 1, Exp
5. (Q ∙ R) ⊃ V 2, Exp
6. V ∨ V 4, 5, 3, CD
7. V 6, Taut
QED

40. 1. ~(P ≡ ~Q)
2. P ⊃ R
3. Q ∨ R / R
4. ~[(P ∙ ~Q) ∨ (~P ∙ ~~Q)] 1, Equiv
5. ~(P ∙ ~Q) ∙ ~(~P ∙ ~~Q) 4, DM
6. ~(~P ∙ ~~Q) ∙ ~(P ∙ ~Q) 5, Com
7. ~(~P ∙ ~~Q) 6, Simp
8. ~~P ∨ ~~~Q 7, DM
9. ~~P ∨ ~Q 8, DN
10. ~P ⊃ ~Q 9, Impl
11. Q ⊃ P 10, Cont
12. Q ⊃ R 11, 2, HS
13. R ∨ Q 3, Com
14. ~~R ∨ Q 13, DN
15. ~R ⊃ Q 14, Impl
16. ~R ⊃ R 15, 12, HS
17. ~~R ∨ R 16, Impl
18. R ∨ R 17, DN
19. R 18, Taut
QED

EXERCISES 3.4c: TRANSLATIONS
4. 1. C ≡ D
2. (D ∙ E) ∙ F / C

8. 1. F ∨ L
2. C ∨ ~F / C ∨ L

12. 1. C ≡ ~F
2. (~C ∙ Z) ⊃ D
3. F / Z ⊃ D

16. 1. (M ∙ R) ∨ ~H
2. (~M ∙ R) ∨ H
3. ~(H ≡ M) ∨ R / R

EXERCISES 3.4c: DERIVATIONS
4. 1. C ≡ D
2. (D ∙ E) ∙ F / C
3. (C ⊃ D) ∙ (D ⊃ C) 1, Equiv
4. (D ⊃ C) ∙ (C ⊃ D) 3, Com
5. D ⊃ C 4, Simp
6. D ∙ (E ∙ F) 2, Assoc
7. D 6, Simp
8. C 5, 7, MP
QED

8. 1. F ∨ L
2. C ∨ ~F / C ∨ L
3. ~~F ∨ L 1, DN
4. ~F ⊃ L 3, Impl
5. ~~C ∨ ~F 2, DN
6. ~C ⊃ ~F 5, Impl
7. ~C ⊃ L 6, 4, HS
8. ~~C ∨ L 7, Impl
9. C ∨ L 8, DN
QED

12. 1. C ≡ ~F
2. (~C ∙ Z) ⊃ D
3. F / Z ⊃ D
4. ~C ⊃ (Z ⊃ D) 2, Exp
5. (C ⊃ ~F) ∙ (~F ⊃ C) 1, Equiv
6. C ⊃ ~F 5, Simp
7. ~~F 3, DN
8. ~C 6, 7, MT
9. Z ⊃ D 4, 8, MP
QED

16. 1. (M ∙ R) ∨ ~H
2. (~M ∙ R) ∨ H
3. ~(H ≡ M) ∨ R / R
4. ~H ∨ (M ∙ R) 1, Com
5. (~H ∨ M) ∙ (~H ∨ R) 4, Dist
6. ~H ∨ M 5, Simp
7. H ⊃ M 6, Impl
8. H ∨ (~M ∙ R) 2, Com
9. (H ∨ ~M) ∙ (H ∨ R) 8, Dist
20. 1. D ≡ (E ∙ F)
2. ~F / ~D
3. ~F ∨ ~E 2, Add
4. ~(F ∙ E) 3, DM
5. ~(E ∙ F) 4, Com
6. (E ∙ F) ≡ D 1, BCom
7. ~D 6, 5, BMT
QED

25. 1. (P ∙ Q) ≡ R
2. P ≡ S
3. R / S ∙ Q
4. R ≡ (P ∙ Q) 1, BCom
5. P ∙ Q 4, 3, BMP
6. P 5, Simp
7. S 2, 6, BMP
8. Q ∙ P 5, Com
9. Q 8, Simp
10. S ∙ Q 7, 9, Conj
QED

30. 1. P ≡ Q
2. ~Q ≡ R
3. R ≡ P / S
4. ~Q ≡ P 2, 3, BHS
5. ~~Q ≡ ~P 4, BInver
6. Q ≡ ~P 5, DN
7. P ≡ ~P 1, 6, BHS
8. (P ∙ ~P) ∨ (~P ∙ ~~P) 7, Equiv
9. (P ∙ ~P) ∨ (~P ∙ P) 8, DN
10. (P ∙ ~P) ∨ (P ∙ ~P) 9, Com
11. P ∙ ~P 10, Taut
12. P 11, Simp
13. P ∨ S 12, Add
14. ~P ∙ P 11, Com
15. ~P 14, Simp
16. S 13, 15, DS
QED

35. 1. P ≡ (Q ∙ ~R)
2. ~S ≡ P
3. S ∙ ~R / Q ≡ R
4. S 3, Simp
5. ~~S 4, DN
6. ~P 2, 5, BMT
7. ~(Q ∙ ~R) 1, 6, BMT
8. ~Q ∨ ~~R 7, DM
9. ~Q ∨ R 8, DN
10. Q ⊃ R 9, Impl
11. ~R ∙ S 3, Com
12. ~R 11, Simp
13. ~R ∨ Q 12, Add
14. R ⊃ Q 13, Impl
15. (Q ⊃ R) ∙ (R ⊃ Q) 10, 14, Conj
16. Q ≡ R 15, Equiv
QED

40. 1. P ≡ (Q ∨ R)
2. R ≡ S
3. Q ⊃ R / P ≡ S
4. [P ⊃ (Q ∨ R)] ∙ [(Q ∨ R) ⊃ P] 1, Equiv
5. P ⊃ (Q ∨ R) 4, Simp
6. ~P ∨ (Q ∨ R) 5, Impl
7. ~P ∨ (R ∨ Q) 6, Com
8. (~P ∨ R) ∨ Q 7, Assoc
9. (~P ∨ ~~R) ∨ Q 8, DN
10. ~(P ∙ ~R) ∨ Q 9, DM
11. (P ∙ ~R) ⊃ Q 10, Impl
12. (P ∙ ~R) ⊃ R 11, 3, HS
13. ~(P ∙ ~R) ∨ R 12, Impl
14. (~P ∨ ~~R) ∨ R 13, DM
15. (~P ∨ R) ∨ R 14, DN
16. ~P ∨ (R ∨ R) 15, Assoc
17. ~P ∨ R 16, Taut
18. P ⊃ R 17, Impl
19. [(Q ∨ R) ⊃ P] ∙ [P ⊃ (Q ∨ R)] 4, Com
20. (Q ∨ R) ⊃ P 19, Simp
21. ~(Q ∨ R) ∨ P 20, Impl
22. (~Q ∙ ~R) ∨ P 21, DM
23. P ∨ (~Q ∙ ~R) 22, Com
24. (P ∨ ~Q) ∙ (P ∨ ~R) 23, Dist
25. (P ∨ ~R) ∙ (P ∨ ~Q) 24, Com
26. P ∨ ~R 25, Simp
27. ~R ∨ P 26, Com
28. R ⊃ P 27, Impl
29. (P ⊃ R) ∙ (R ⊃ P) 18, 28, Conj
30. P ≡ R 29, Equiv
31. P ≡ S 30, 2, BHS
QED

EXERCISES 3.6c: TRANSLATIONS
4. 1. (G ⊃ D) ≡ (B ∨ ~H)
2. ~(H ⊃ B) / G ∙ ~D

8. 1. H ≡ (G ∨ O)
2. ~H ≡ D / G ⊃ ~D

12. 1. (P ≡ ~E) ≡ L
2. P ⊃ ~E
3. ~P ⊃ E / L

16. 1. (C ≡ M) ⊃ ~I
2. ~I ⊃ S
3. S ⊃ A
4. ~A ∨ I
5. ~C / M

EXERCISES 3.6c: DERIVATIONS
4. 1. (G ⊃ D) ≡ (B ∨ ~H)
2. ~(H ⊃ B) / G ∙ ~D
3. ~(~H ∨ B) 2, Impl
4. ~(B ∨ ~H) 3, Com
40. 1. V ⊃ (T ∙ ~W)
2. (T ⊃ W) ⊃ (~X ∨ ~Y)
3. ~[~(V ∨ Y) ∨ ~(V ∨ X)] / ~(T ⊃ W)
4. T ⊃ W AIP
5. ~X ∨ ~Y 2, 4, MP
6. ~~(V ∨ Y) ∙ ~~(V ∨ X) 3, DM
7. (V ∨ Y) ∙ ~~(V ∨ X) 6, DN
8. (V ∨ Y) ∙ (V ∨ X) 7, DN
9. V ∨ (Y ∙ X) 8, Dist
10. ~Y ∨ ~X 5, Com
11. ~(Y ∙ X) 10, DM
12. (Y ∙ X) ∨ V 9, Com
13. V 12, 11, DS
14. T ∙ ~W 1, 13, MP
15. ~W ∙ T 14, Com
16. ~W 15, Simp
17. T 14, Simp
18. W 4, 17, MP
19. W ∙ ~W 18, 16, Conj
20. ~(T ⊃ W) 4–19, IP
QED

EXERCISES 3.9b: TRANSLATIONS
4. 1. (X ∨ Y) ⊃ V
2. W ⊃ ~V / W ⊃ ~X

8. 1. T ⊃ ~R
2. ~(S ∨ V)
3. T ∙ (U ∨ ~R) / ~(R ∨ S)

12. 1. ~A ⊃ M
2. ~A ⊃ ~E
3. E ∨ P
4. ~P ∨ ~M / A

16. 1. R ⊃ (S ∨ C)
2. A ⊃ (I ∨ ~C)
3. ~I ⊃ ~S / (R ∙ ~I) ⊃ ~A

EXERCISES 3.9b: DERIVATIONS
4. 1. (X ∨ Y) ⊃ V
2. W ⊃ ~V / W ⊃ ~X
3. W ACP
4. X AIP
5. X ∨ Y 4, Add
6. V 1, 5, MP
7. ~~V 6, DN
8. ~W 2, 7, MT
9. W ∙ ~W 3, 8, Conj
10. ~X 4–9, IP
11. W ⊃ ~X 3–10, CP
QED

8. 1. T ⊃ ~R
2. ~(S ∨ V)
3. T ∙ (U ∨ ~R) / ~(R ∨ S)
4. R ∨ S AIP
5. ~~R ∨ S 4, DN
6. ~R ⊃ S 5, Impl
7. T ⊃ S 1, 6, HS
8. ~S ∙ ~V 2, DM
9. ~S 8, Simp
10. (T ∙ U) ∨ (T ∙ ~R) 3, Dist
11. ~T 7, 9, MT
12. ~T ∨ ~U 11, Add
13. ~(T ∙ U) 12, DM
14. T ∙ ~R 10, 13, DS
15. ~R ∙ T 14, Com
16. ~R 15, Simp
17. S 4, 16, DS
18. ~S ∙ S 9, 17, Conj
19. ~(R ∨ S) 4–18, IP
QED

12. 1. ~A ⊃ M
2. ~A ⊃ ~E
3. E ∨ P
4. ~P ∨ ~M / A
5. ~A AIP
6. M 1, 5, MP
7. ~E 2, 5, MP
8. P 3, 7, DS
9. ~~P 8, DN
10. ~M 4, 9, DS
11. M ∙ ~M 6, 10, Conj
12. ~~A 5–11, IP
13. A 12, DN
QED

16. 1. R ⊃ (S ∨ C)
2. A ⊃ (I ∨ ~C)
3. ~I ⊃ ~S / (R ∙ ~I) ⊃ ~A
4. R ∙ ~I ACP
5. R 4, Simp
6. S ∨ C 1, 5, MP
7. ~I ∙ R 4, Com
8. ~I 7, Simp
9. ~S 3, 8, MP
10. C 6, 9, DS
11. A AIP
12. I ∨ ~C 2, 11, MP
13. ~C 12, 8, DS
14. C ∙ ~C 10, 13, Conj
15. ~A 11–14, IP
16. (R ∙ ~I) ⊃ ~A 4–15, CP
QED
15. Invalid. Counterexample when all propositions are false.
20. Invalid. Counterexample when K is true, L is false, M is false, and N is true.
25. 1. ~Z ⊃ Y
2. Z ⊃ ~X
3. X ∨ ~Z
4. Y ⊃ A
5. X ⊃ ~A / ~X
6. ~~X ∨ ~Z 3, DN
7. ~X ⊃ ~Z 6, Impl
8. Z ⊃ ~Z 2, 7, HS
9. ~Z ∨ ~Z 8, Impl
10. ~Z 9, Taut
11. Y 1, 10, MP
12. A 4, 11, MP
13. ~~A 12, DN
14. ~X 5, 13, MT
QED

EXERCISES 3.10b
1. 1. G ∨ G ACP
2. G 1, Taut
3. (G ∨ G) ⊃ G 1–2, CP
QED
5. False valuation when A and B are false.
10. False valuation when J is true, L is false, and M is false.
15. 1. (P ∨ Q) ∙ ~P ACP
2. P ∨ Q 1, Simp
3. ~P ∙ (P ∨ Q) 1, Com
4. ~P 3, Simp
5. Q 2, 4, DS
6. Q ⊃ R ACP
7. R 6, 5, MP
8. (Q ⊃ R) ⊃ R 6–7, CP
9. [(P ∨ Q) ∙ ~P] ⊃ [(Q ⊃ R) ⊃ R] 1–8, CP
QED
20. False valuation when A is false, B is true, and C is false.
25. False valuation when W is false, X is true, Y is false, and Z is true.

EXERCISES 4.1a
1. Ta
5. Cs
10. Bl ∨ Bm
15. Ch ≡ Iw

EXERCISES 4.1b
1. (∀x)(Cx ⊃ Dx)
5. (∃x)(Px ∙ Wx)
10. (∃x)(Px ∙ Sx)
15. (∀x)(Lx ⊃ Cx)

EXERCISES 4.2a
1. (∃x)[(Px ∙ Fx) ∙ Sx]
5. (∀x)[Fx ⊃ ~(Ox ∙ Px)] or ~(∃x)[Fx ∙ (Ox ∙ Px)]
9. (∃x)(Px ∙ Fx)
13. (∀x){[Px ∙ (Fx ∙ Ix)] ⊃ Sx}
17. (∃x)[Cx ∙ (Wx ∙ ~Ex)]
21. (∀x)(Cx ⊃ Wx) ∙ (∀x)(Dx ⊃ Wx)
25. (∃x)[(Ax ∙ Lx) ∙ Gx]
29. ~Gt ≡ (∃x)[(Px ∙ Cx) ∙ ~Gx]
33. Et ∙ (Pt ∙ Nt)
38. (∀x)[(Nx ∙ Px) ⊃ Ox] ⊃ (∀x)[(Nx ∙ Px) ⊃ ~Ex]
43. (∀x)[(Ax ∙ Hx) ⊃ Mx]
47. (∃x)(Gx ∙ Mx) ∙ (∃x)(Cx ∙ Mx)
51. (∀x)[(Rx ∙ Px) ⊃ Sx]
55. (∃x)[(Mx ∙ Sx) ∙ (Cx ∙ ~Rx)]
59. (∀x)[(Ex ∙ Ax) ⊃ Ox]
64. (∀x)[Ex ⊃ (~Ax ⊃ Rx)] or (∀x)[(Ex ∙ ~Ax) ⊃ Rx]
69. (∃x)[(Ex ∙ Px) ∙ Hx]
73. (∃x)[(Hx ∙ Px) ∙ Nx] ∨ (∀x)(Nx ⊃ ~Hx)
77. (Bh ∙ Eh) ∙ ~(Bs ∨ Es)
81. (Bs ∙ Es) ⊃ [~Sh ∙ (∀x)(Rx ⊃ Bx)]
85. (∀x)[(Lx ∙ Px) ⊃ ~Dx]
89. (∀x)[(Mx ∙ Ox) ⊃ Cx] ∨ (∃x)[(Mx ∙ Ox) ∙ Dx]
93. (Dk ∙ Kk) ∙ ~(Dm ∨ Km)
97. (∀x)(Ux ⊃ Cx) ⊃ (∀x)(Kx ⊃ Dx)

EXERCISES 4.2b
1. All athletes are brawny. Malik and Ned are athletes. So, Malik and Ned are brawny.
5. All athletes are either brawny or champions. Gita is an athlete, but she isn't brawny. So, Gita is a champion.
10. Everything is brawny, and either an athlete or a champion. If Ned is a champion, then everything is neither an athlete nor brawny. Nothing is a champion. So Ned is not a champion.

EXERCISES 4.3
5. a) Px
b) Only the x in Px is bound
c) The x in Qx is unbound
d) Open
e) ⊃
10. a) There are no quantifiers
b) There are no bound variables
c) There are no unbound variables
d) Closed
e) ⊃
15. a) (∀x): (Px ∙ Qy) ⊃ (∃y)[(Ry ⊃ Sy) ∙ Tx], Px ∙ Qy, Px, Qy, (∃y)[(Ry ⊃ Sy) ∙ Tx], (Ry ⊃ Sy) ∙ Tx, Ry ⊃ Sy, Ry, Sy, Tx; (∃y): (Ry ⊃ Sy) ∙ Tx, Ry ⊃ Sy, Ry, Sy, Tx
b) (∀x): Both x's are bound; (∃y): The y's in Ry and Sy are bound.
c) The y in Qy is unbound
d) Open
e) (∀x)
8. 1. (∀x)(Rx ⊃ ~Hx)
2. ~(∃x)(Rx ∙ ~Ax)
3. ~(∀x)[Rx ⊃ (Fx ∨ Wx)] / ~(∀x)[(Ax ∙ ~Hx) ⊃ (Fx ∨ Wx)]
4. (∃x)~[Rx ⊃ (Fx ∨ Wx)] 3, QE
5. (∃x)~[~Rx ∨ (Fx ∨ Wx)] 4, Impl
6. (∃x)[~~Rx ∙ ~(Fx ∨ Wx)] 5, DM
7. (∃x)[Rx ∙ ~(Fx ∨ Wx)] 6, DN
8. (∀x)~(Rx ∙ ~Ax) 2, QE
9. (∀x)(~Rx ∨ ~~Ax) 8, DM
10. (∀x)(~Rx ∨ Ax) 9, DN
11. (∀x)(Rx ⊃ Ax) 10, Impl
12. Rj ∙ ~(Fj ∨ Wj) 7, EI
13. Rj 12, Simp
14. Rj ⊃ ~Hj 1, UI
15. ~Hj 14, 13, MP
16. Rj ⊃ Aj 11, UI
17. Aj 16, 13, MP
18. Aj ∙ ~Hj 17, 15, Conj
19. ~(Fj ∨ Wj) ∙ Rj 12, Com
20. ~(Fj ∨ Wj) 19, Simp
21. (Aj ∙ ~Hj) ∙ ~(Fj ∨ Wj) 18, 20, Conj
22. (∃x)[(Ax ∙ ~Hx) ∙ ~(Fx ∨ Wx)] 21, EG
23. (∃x)[~~(Ax ∙ ~Hx) ∙ ~(Fx ∨ Wx)] 22, DN
24. (∃x)~[~(Ax ∙ ~Hx) ∨ (Fx ∨ Wx)] 23, DM
25. (∃x)~[(Ax ∙ ~Hx) ⊃ (Fx ∨ Wx)] 24, Impl
26. ~(∀x)[(Ax ∙ ~Hx) ⊃ (Fx ∨ Wx)] 25, QE
QED

12. 1. (∀x)(Bx ⊃ Sx) ⊃ (∃x)(Bx ∙ Ax)
2. (∀x)(Bx ⊃ Mx)
3. ~(∃x)(Mx ∙ ~Sx)
4. ~(∃x)(Ax ∙ ~Ix) / (∃x)(Bx ∙ Ix)
5. (∀x)~(Mx ∙ ~Sx) 3, QE
6. (∀x)(~Mx ∨ ~~Sx) 5, DM
7. (∀x)(~Mx ∨ Sx) 6, DN
8. (∀x)(Mx ⊃ Sx) 7, Impl
9. Mx ⊃ Sx 8, UI
10. Bx ⊃ Mx 2, UI
11. Bx ⊃ Sx 10, 9, HS
12. (∀x)(Bx ⊃ Sx) 11, UG
13. (∃x)(Bx ∙ Ax) 1, 12, MP
14. (∀x)~(Ax ∙ ~Ix) 4, QE
15. (∀x)(~Ax ∨ ~~Ix) 14, DM
16. (∀x)(~Ax ∨ Ix) 15, DN
17. (∀x)(Ax ⊃ Ix) 16, Impl
18. Ba ∙ Aa 13, EI
19. Aa ∙ Ba 18, Com
20. Aa 19, Simp
21. Aa ⊃ Ia 17, UI
22. Ia 21, 20, MP
23. Ba 18, Simp
24. Ba ∙ Ia 23, 22, Conj
25. (∃x)(Bx ∙ Ix) 24, EG
QED

16. 1. (∀x)[Ex ⊃ (Ax ∨ ~Mx)]
2. ~(∃x)[(Ex ∙ Ax) ∙ Fx]
3. ~(∃x)[(Ex ∙ ~Mx) ∙ Cx] / ~(∃x)[Ex ∙ (Fx ∙ Cx)]
4. (∀x)~[(Ex ∙ Ax) ∙ Fx] 2, QE
5. (∀x)[~(Ex ∙ Ax) ∨ ~Fx] 4, DM
6. (∀x)[(Ex ∙ Ax) ⊃ ~Fx] 5, Impl
7. (∀x)~[(Ex ∙ ~Mx) ∙ Cx] 3, QE
8. (∀x)[~(Ex ∙ ~Mx) ∨ ~Cx] 7, DM
9. (∀x)[(Ex ∙ ~Mx) ⊃ ~Cx] 8, Impl
10. Ex ⊃ (Ax ∨ ~Mx) 1, UI
11. Ex ⊃ (~Mx ∨ Ax) 10, Com
12. Ex ⊃ (Mx ⊃ Ax) 11, Impl
13. (Ex ∙ Mx) ⊃ Ax 12, Exp
14. (Mx ∙ Ex) ⊃ Ax 13, Com
15. Mx ⊃ (Ex ⊃ Ax) 14, Exp
16. (Ex ∙ ~Mx) ⊃ ~Cx 9, UI
17. Ex ⊃ (~Mx ⊃ ~Cx) 16, Exp
18. Ex ⊃ (Cx ⊃ Mx) 17, Cont
19. (Ex ∙ Cx) ⊃ Mx 18, Exp
20. (Ex ∙ Cx) ⊃ (Ex ⊃ Ax) 19, 15, HS
21. (Cx ∙ Ex) ⊃ (Ex ⊃ Ax) 20, Com
22. Cx ⊃ [Ex ⊃ (Ex ⊃ Ax)] 21, Exp
23. Cx ⊃ [(Ex ∙ Ex) ⊃ Ax] 22, Exp
24. Cx ⊃ (Ex ⊃ Ax) 23, Taut
25. (Cx ∙ Ex) ⊃ Ax 24, Exp
26. (Ex ∙ Ax) ⊃ ~Fx 6, UI
27. (Ax ∙ Ex) ⊃ ~Fx 26, Com
28. Ax ⊃ (Ex ⊃ ~Fx) 27, Exp
29. (Cx ∙ Ex) ⊃ (Ex ⊃ ~Fx) 25, 28, HS
30. Cx ⊃ [Ex ⊃ (Ex ⊃ ~Fx)] 29, Exp
31. Cx ⊃ [(Ex ∙ Ex) ⊃ ~Fx] 30, Exp
32. Cx ⊃ (Ex ⊃ ~Fx) 31, Taut
33. (Cx ∙ Ex) ⊃ ~Fx 32, Exp
34. (Ex ∙ Cx) ⊃ ~Fx 33, Com
35. Ex ⊃ (Cx ⊃ ~Fx) 34, Exp
36. Ex ⊃ (~Cx ∨ ~Fx) 35, Impl
37. Ex ⊃ ~(Cx ∙ Fx) 36, DM
38. ~Ex ∨ ~(Cx ∙ Fx) 37, Impl
39. ~[Ex ∙ (Cx ∙ Fx)] 38, DM
40. (∀x)~[Ex ∙ (Cx ∙ Fx)] 39, UG
41. ~(∃x)[Ex ∙ (Cx ∙ Fx)] 40, QE
QED

EXERCISES 4.6a
5. 1. (∀x)[Px ⊃ (Qx ∙ Rx)]
2. (∀x)(Qx ⊃ Sx) / (∀x)(Px ⊃ Sx)
3. Px ACP
4. Px ⊃ (Qx ∙ Rx) 1, UI
5. Qx ∙ Rx 4, 3, MP
6. Qx 5, Simp
7. Qx ⊃ Sx 2, UI
8. Sx 7, 6, MP
9. Px ⊃ Sx 3–8, CP
10. (∀x)(Px ⊃ Sx) 9, UG
QED

10. 1. (∀x)(Gx ⊃ Hx)
2. ~(∃x)(Ix ∙ ~Gx)
3. (∀x)(~Hx ⊃ Ix) / (∀x)Hx
4. (∃x)~Hx AIP
5. ~Ha 4, EI
6. Ga ⊃ Ha 1, UI
7. ~Ga 6, 5, MT
8. (∀x)~(Ix ∙ ~Gx) 2, QE
9. ~(Ia ∙ ~Ga) 8, UI
10. ~Ia ∨ ~~Ga 9, DM
11. ~~Ga ∨ ~Ia 10, Com
12. Ga ∨ ~Ia 11, DN
13. ~Ia 12, 7, DS
14. ~Ha ⊃ Ia 3, UI
15. ~~Ha 14, 13, MT
16. ~Ha ∙ ~~Ha 5, 15, Conj
17. ~(∃x)~Hx 4–16, IP
18. (∀x)Hx 17, QE
QED
12. 1. (∀x)[Xx ⊃ ~(Yx ∨ Zx)] ACP
2. (∃x)(Xx ∙ Yx) AIP
3. Xd ∙ Yd 2, EI
4. Xd ⊃ ~(Yd ∨ Zd) 1, UI
5. Xd 3, Simp
6. ~(Yd ∨ Zd) 4, 5, MP
7. ~Yd ∙ ~Zd 6, DM
8. ~Yd 7, Simp
9. Yd ∙ Xd 3, Com
10. Yd 9, Simp
11. Yd ∙ ~Yd 10, 8, Conj
12. ~(∃x)(Xx ∙ Yx) 2–11, IP
13. (∀x)[Xx ⊃ ~(Yx ∨ Zx)] ⊃ ~(∃x)(Xx ∙ Yx) 1–12, CP
QED

16. False valuation in a one-member domain in which:
Ia: true
Ja: false
Ka: false
20. 1. ~{[(∀x)(Rx ∙ Sx) ∨ (∃x)(Rx ∙ ~Sx)] ∨ [(∃x)(~Rx ∙ Sx) ∨ (∃x)(~Rx ∙ ~Sx)]} AIP
2. ~[(∀x)(Rx ∙ Sx) ∨ (∃x)(Rx ∙ ~Sx)] ∙ ~[(∃x)(~Rx ∙ Sx) ∨ (∃x)(~Rx ∙ ~Sx)] 1, DM
3. ~[(∀x)(Rx ∙ Sx) ∨ (∃x)(Rx ∙ ~Sx)] 2, Simp
4. ~(∀x)(Rx ∙ Sx) ∙ ~(∃x)(Rx ∙ ~Sx) 3, DM
5. ~(∀x)(Rx ∙ Sx) 4, Simp
6. (∃x)~(Rx ∙ Sx) 5, QE
7. (∃x)(~Rx ∨ ~Sx) 6, DM
8. ~Ra ∨ ~Sa 7, EI
9. Ra ⊃ ~Sa 8, Impl
10. ~(∃x)(Rx ∙ ~Sx) ∙ ~(∀x)(Rx ∙ Sx) 4, Com
11. ~(∃x)(Rx ∙ ~Sx) 10, Simp
12. (∀x)~(Rx ∙ ~Sx) 11, QE
13. ~(Ra ∙ ~Sa) 12, UI
14. ~Ra ∨ ~~Sa 13, DM
15. Ra ⊃ ~~Sa 14, Impl
16. Ra ⊃ Sa 15, DN
17. ~Sa ⊃ ~Ra 16, Cont
18. Ra ⊃ ~Ra 9, 17, HS
19. ~Ra ∨ ~Ra 18, Impl
20. ~Ra 19, Taut
21. ~[(∃x)(~Rx ∙ Sx) ∨ (∃x)(~Rx ∙ ~Sx)] ∙ ~[(∀x)(Rx ∙ Sx) ∨ (∃x)(Rx ∙ ~Sx)] 2, Com
22. ~[(∃x)(~Rx ∙ Sx) ∨ (∃x)(~Rx ∙ ~Sx)] 21, Simp
23. ~(∃x)(~Rx ∙ Sx) ∙ ~(∃x)(~Rx ∙ ~Sx) 22, DM
24. ~(∃x)(~Rx ∙ Sx) 23, Simp
25. (∀x)~(~Rx ∙ Sx) 24, QE
26. ~(~Ra ∙ Sa) 25, UI
27. ~~Ra ∨ ~Sa 26, DM
28. ~~~Ra 20, DN
29. ~Sa 27, 28, DS
30. ~(∃x)(~Rx ∙ ~Sx) ∙ ~(∃x)(~Rx ∙ Sx) 23, Com
31. ~(∃x)(~Rx ∙ ~Sx) 30, Simp
32. (∀x)~(~Rx ∙ ~Sx) 31, QE
33. ~(~Ra ∙ ~Sa) 32, UI
34. ~Ra ∙ ~Sa 20, 29, Conj
35. (~Ra ∙ ~Sa) ∙ ~(~Ra ∙ ~Sa) 34, 33, Conj
36. ~~{[(∀x)(Rx ∙ Sx) ∨ (∃x)(Rx ∙ ~Sx)] ∨ [(∃x)(~Rx ∙ Sx) ∨ (∃x)(~Rx ∙ ~Sx)]} 1–35, IP
37. [(∀x)(Rx ∙ Sx) ∨ (∃x)(Rx ∙ ~Sx)] ∨ [(∃x)(~Rx ∙ Sx) ∨ (∃x)(~Rx ∙ ~Sx)] 36, DN
QED
EXERCISES 5.3c
1. 1. (∀x)(∀y)Axy ACP
2. (∀y)Aay 1, UI
3. Aab 2, UI
4. (∃y)Aay 3, EG
5. (∃x)(∃y)Axy 4, EG
6. (∀x)(∀y)Axy ⊃ (∃x)(∃y)Axy 1–5, CP
QED
5. 1. (∃x)Exx ACP
2. Egg 1, EI
3. (∃y)Egy 2, EG
4. (∃x)(∃y)Exy 3, EG
5. (∃x)Exx ⊃ (∃x)(∃y)Exy 1–4, CP
QED
10. 1. ~[(∃x)(∀y)( Jxy ∙ ~Jyx) ∨ (∀x)(∃y)( Jxy ⊃ Jyx)] AIP
2. ~(∃x)(∀y)( Jxy ∙ ~Jyx) ∙ ~(∀x)(∃y)(Jxy ⊃ Jyx) 1, DM
3. ~(∃x)(∀y)( Jxy ∙ ~Jyx) 2, Simp
4. ~(∀x)(∃y)( Jxy ⊃ Jyx) ∙ ~(∃x)(∀y)( Jxy ∙ ~Jyx) 2, Com
5. ~(∀x)(∃y)( Jxy ⊃ Jyx) 4, Simp
6. (∃x)~(∃y)( Jxy ⊃ Jyx) 5, QE
7. (∃x)(∀y)~( Jxy ⊃ Jyx) 6, QE
8. (∀y)~( Jay ⊃ Jya) 7, EI
9. (∀y)~(~Jay ∨ Jya) 8, Impl
10. (∀y)(~~Jay ∙ ~Jya) 9, DM
11. (∀y)( Jay ∙ ~Jya) 10, DN
12. (∀x)~(∀y)( Jxy ∙ ~Jyx) 3, QE
13. (∀x)(∃y)~( Jxy ∙ ~Jyx) 12, QE
14. (∀x)(∃y)(~Jxy ∨ ~~Jyx) 13, DM
15. (∀x)(∃y)(~Jxy ∨ Jyx) 14, DN
16. (∀x)(∃y)(Jxy ⊃ Jyx) 15, Impl
17. (∃y)(Jay ⊃ Jya) 16, UI
18. Jab ⊃ Jba 17, EI
19. Jab ∙ ~Jba 11, UI
20. Jab 19, Simp
21. Jba 18, 20, MP
22. ~Jba ∙ Jab 19, Com
23. ~Jba 22, Simp
24. Jba ∙ ~Jba 21, 23, Conj
25. ~~[(∃x)(∀y)( Jxy ∙ ~Jyx) ∨ (∀x)(∃y)( Jxy ⊃ Jyx)] 1–24, IP
26. (∃x)(∀y)( Jxy ∙ ~Jyx) ∨ (∀x)(∃y)(Jxy ⊃ Jyx) 25, DN
QED
EXERCISES 5.4
1. (∃x)(∃y)(Sx ∙ Sy ∙ x≠y)
5. (∃x){Sx ∙ Px ∙ Dx ∙ (∀y)[(Sy ∙ Py ∙ Dy) ⊃ y=x]}
9. (∀x)(∀y)(∀z)[(Sx ∙ Axr ∙ Exs ∙ Sy ∙ Ayr ∙ Eys ∙ Sz ∙ Azr ∙ Ezs) ⊃ (x=y ∨ x=z ∨ y=z)]
13. (∃x)(∃y)(∃z){Sx ∙ Axr ∙ Sy ∙ Ayr ∙ Sz ∙ Azr ∙ Exs ∙ Eys ∙ Ezs ∙ x≠y ∙ x≠z ∙ y≠z ∙ (∀w)[(Sw ∙ Awr ∙ Ews) ⊃ (w=x ∨ w=y ∨ w=z)]}
17. (∃x)(Dx ∙ Tcx) ∙ (∀x)[(∃y)(Dy ∙ Txy) ⊃ x=c]
21. Df ∙ Tcf ∙ (∀x)[(Dx ∙ Tcx ∙ x≠f) ⊃ Bfx]
25. (∃x)(∃y)(Wx ∙ Lxh ∙ Wy ∙ Lyh ∙ x≠y)
29. Wf ∙ Ifc ∙ (∀x)[(Wx ∙ Ixc ∙ x≠f) ⊃ Sfx]
33. (∃x){(Cx ∙ Bx) ∙ (∀y)[(Cy ∙ By) ⊃ y=x] ∙ x=n}
37. (∀x)(∀y)(∀z){[Fxn ∙ Fyn ∙ Fzn ∙ (∃w)(Aw ∙ Hxw ∙ Bwg) ∙ (∃w)(Aw ∙ Hyw ∙ Bwg) ∙ (∃w)(Aw ∙ Hzw ∙ Bwg)] ⊃ (x=y ∨ x=z ∨ y=z)}
41. (∃x)(∃y)(Bx ∙ Ixp ∙ By ∙ Iyp ∙ x≠y)
45. Bn ∙ ~Tnp ∙ (∀x)[(Bx ∙ x≠n) ⊃ Txp]
49. Sa ∙ Ial ∙ (∀x)[(Sx ∙ Ixl ∙ x≠a) ⊃ Bax]
53. Sa ∙ Ial ∙ (∀x)[(Px ∙ Wtx) ⊃ ~Rax] ∙ (∀x)[(Sx ∙ Ixl ∙ x≠a) ⊃ (∃y)(Py ∙ Wty ∙ Rxy)]
57. (∀x)(∀y)[(Lx ∙ Bxg ∙ Ly ∙ Byg) ⊃ y=x]
61. (∀x)(∀y)(∀z){(Lx ∙ Dxp ∙ Ly ∙ Dyp ∙ Lz ∙ Dzp) ⊃ [(x=g ∨ x=s) ∙ (y=g ∨ y=s) ∙ (z=g ∨ z=s)]}
65. (∃x)(∃y)(∃z){Mx ∙ Ix ∙ My ∙ Iy ∙ Mz ∙ Iz ∙ x≠y ∙ y≠z ∙ x≠z ∙ (∀w)[(Mw ∙ Iw) ⊃ (w=x ∨ w=y ∨ w=z)]}
69. Mk ∙ Ik ∙ (∀x)[(Mx ∙ x≠k) ⊃ ~Ix]
74. (∃x)(∃y)(Dx ∙ Dy ∙ Rx ∙ Ry ∙ ~Fx ∙ ~Fy ∙ x≠y)
77. Pd ∙ Ld ∙ (∀x)[(Px ∙ x≠d) ⊃ ~Lx]
81. (∀x)(∀y)(∀z)[(Dx ∙ Kx ∙ Dy ∙ Ky ∙ Dz ∙ Kz) ⊃ (x=y ∨ x=z ∨ y=z)]
85. (∃x){Dx ∙ Gx ∙ (∀y)[(Dy ∙ Gy) ⊃ y=x] ∙ Kx}
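Numerically definite quantifications like these can be checked against small finite models by direct enumeration. The sketch below (Python, offered as an illustration; the domain and predicates are invented for the example) tests the translation of 'there are at least two S's', (∃x)(∃y)(Sx ∙ Sy ∙ x≠y):

```python
from itertools import product

def at_least_two(domain, S):
    # (∃x)(∃y)(Sx ∙ Sy ∙ x≠y): look for a pair of distinct objects in S.
    return any(S(x) and S(y) and x != y
               for x, y in product(domain, repeat=2))

domain = ["a", "b", "c"]
in_S = lambda x: x in {"a", "b"}   # two S's: the formula should come out true
in_T = lambda x: x == "c"          # only one T: the formula should come out false

print(at_least_two(domain, in_S), at_least_two(domain, in_T))  # True False
```

The same enumeration strategy extends to 'exactly one' and 'at most three', at the cost of nesting more quantifier loops.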
EXERCISES 5.5a
5. 1. Dkm ∙ (∀x)(Dkx ⊃ x=m)
2. Dab
3. Fb ∙ ~Fm / a≠k
4. (∀x)(Dkx ⊃ x=m) ∙ Dkm 1, Com
5. (∀x)(Dkx ⊃ x=m) 4, Simp
6. a=k AIP
7. Dkb 2, 6, IDi
8. Dkb ⊃ b=m 5, UI
9. b=m 8, 7, MP
10. Fb 3, Simp
11. Fm 10, 9, IDi
12. ~Fm ∙ Fb 3, Com
13. ~Fm 12, Simp
14. Fm ∙ ~Fm 11, 13, Conj
15. a≠k 6–14, IP
QED
16. 1. (∀x)(∀y){(Px ∙ Py ∙ x≠y) ⊃ (∃z){Lz ∙ Czx ∙ Czy ∙ (∀w)[(Lw ∙ Cwx ∙ Cwy) ⊃ w=z]}}
2. Pa ∙ Pb ∙ a≠b
3. Cla ∙ Clb
4. Ll ∙ Lm ∙ l≠m / ~(Cma ∙ Cmb)
5. (∀y){(Pa ∙ Py ∙ a≠y) ⊃ (∃z){Lz ∙ Cza ∙ Czy ∙ (∀w)[(Lw ∙ Cwa ∙ Cwy) ⊃ w=z]}} 1, UI
6. (Pa ∙ Pb ∙ a≠b) ⊃ (∃z){Lz ∙ Cza ∙ Czb ∙ (∀w)[(Lw ∙ Cwa ∙ Cwb) ⊃ w=z]} 5, UI
7. (∃z){Lz ∙ Cza ∙ Czb ∙ (∀w)[(Lw ∙ Cwa ∙ Cwb) ⊃ w=z]} 6, 2, MP
8. Lr ∙ Cra ∙ Crb ∙ (∀w)[(Lw ∙ Cwa ∙ Cwb) ⊃ w=r] 7, EI
9. (∀w)[(Lw ∙ Cwa ∙ Cwb) ⊃ w=r] 8, Simp
10. (Ll ∙ Cla ∙ Clb) ⊃ l=r 9, UI
11. Ll 4, Simp
12. Ll ∙ Cla ∙ Clb 11, 3, Conj
13. l=r 10, 12, MP
14. Cma ∙ Cmb AIP
15. Lm 4, Simp
16. Lm ∙ Cma ∙ Cmb 15, 14, Conj
17. (Lm ∙ Cma ∙ Cmb) ⊃ m=r 9, UI
18. m=r 17, 16, MP
19. r=l 13, IDs
20. m=l 18, 19, IDi
21. l≠m 4, Simp
22. l=m 20, IDs
23. l=m ∙ l≠m 22, 21, Conj
24. ~(Cma ∙ Cmb) 14–23, IP
QED
EXERCISES 5.5c
1. 1. ~(∀x)(∀y)(x=y ≡ y=x) AIP
2. (∃x)~(∀y)(x=y ≡ y=x) 1, QE
3. (∃x)(∃y)~(x=y ≡ y=x) 2, QE
4. (∃x)(∃y)(~x=y ≡ y=x) 3, BDM
5. (∃x)(∃y)(~x=y ≡ x=y) 4, IDs
6. (∃y)(~a=y ≡ a=y) 5, EI
7. ~a=b ≡ a=b 6, EI
8. (~a=b ∙ a=b) ∨ (~~a=b ∙ ~a=b) 7, Equiv
9. (a=b ∙ ~a=b) ∨ (~~a=b ∙ ~a=b) 8, Com
10. (a=b ∙ ~a=b) ∨ (a=b ∙ ~a=b) 9, DN
11. a=b ∙ ~a=b 10, Taut
12. ~~(∀x)(∀y)(x=y ≡ y=x) 1–11, IP
13. (∀x)(∀y)(x=y ≡ y=x) 12, DN
QED
5. 1. x=y ∙ x=z ACP
2. x=y 1, Simp
3. x=z 1, Simp
4. z=y 2, 3, IDi
5. y=z 4, IDs
6. (x=y ∙ x=z) ⊃ y=z 1–5, CP
7. (∀z)[(x=y ∙ x=z) ⊃ y=z] 6, UG
8. (∀y)(∀z)[(x=y ∙ x=z) ⊃ y=z] 7, UG
9. (∀x)(∀y)(∀z)[(x=y ∙ x=z) ⊃ y=z] 8, UG
QED
EXERCISES 5.7a
4. 1. (∀x)[Px ⊃ Pf(x)]
2. (∀x)(Qx ⊃ Px)
3. Qa / Pf(a)
4. Qa ⊃ Pa 2, UI
5. Pa 4, 3, MP
6. Pa ⊃ Pf(a) 1, UI
7. Pf(a) 6, 5, MP
QED

12. 1. (∀x)[Bf(x) ⊃ (Cx ∙ Df(f(x)))]
2. (∃x)Bf(f(x))
3. (∃x)Cf(x) ⊃ (∀x)Ex / (∃x)[Df(f(f(x))) ∙ Ef(f(f(x)))]
4. Bf(f(a)) 2, EI
5. Bf(f(a)) ⊃ [Cf(a) ∙ Df(f(f(a)))] 1, UI
6. Cf(a) ∙ Df(f(f(a))) 5, 4, MP
7. Cf(a) 6, Simp
8. (∃x)Cf(x) 7, EG
9. (∀x)Ex 3, 8, MP
10. Ef(f(f(a))) 9, UI
11. Df(f(f(a))) 6, Simp
12. Df(f(f(a))) ∙ Ef(f(f(a))) 11, 10, Conj
13. (∃x)[Df(f(f(x))) ∙ Ef(f(f(x)))] 12, EG
QED
GLOSSARY / INDEX
addition (Add) A rule of inference of PL, 124–125, 127
ad hominem, 405
ad populum, 405
ambiguity, 29, 409–410
anchoring, 409
antecedent In a conditional, the formula that precedes the ⊃ is called the antecedent, 27
   exercises, 31–32
   simplifying, 157–158
anyone A term that indicates a quantifier, but which should be distinguished from ‘anything’, 221
anything A term that indicates a quantifier, and which may be existential or universal, 215
appeals to emotion, 406
appeals to tradition, 405
appeal to unreliable authority, 405
argument Collections of propositions, called premises, together with a claim, called the conclusion, that the premises are intended to support or establish, 9
   exercises for determining validity of, 98–102
   logic and, 1–3
   and numbered premise-conclusion form, 29–30
   premise-conclusion form, 12–16
   translating into propositional logic, 34–41
   valid and invalid, 77–81
   validity and soundness, 16–18
argumentation. See fallacies and argumentation
Aristotle, 5, 8, 14, 20, 299, 401
arithmetic, Peano axioms for, 385–386
Arnauld, Antoine, 401
association (Assoc) Rules of equivalence of PL, 136, 140, 414
atomic formula The simplest type of formula of a language
   An atomic formula of F is an n-placed predicate followed by n singular terms, 329
   An atomic formula of M is formed by a predicate followed by a singular term, 235
   An atomic formula of PL is a single capital letter, 44
attribute A grammatical predicate. Attributes may be simple (as ‘are happy’ in ‘Some philosophers are happy’) and be regimented as a single predicate. They may be complex (as ‘is a big, strong, blue ox’ in ‘Babe is a big, strong, blue ox’) and regimented using multiple predicates, 220
Aurelius, Marcus, 13
Ayer, A. J., 12, 15
basic truth table For a logical operator, defines the operator by showing the truth value of the operation, given any possible distribution of truth values of the component premises, 47
begging the question, 403, 407
Begriffsschrift (Frege), 4, 8
Berkeley, George, 15
biconditional A complex proposition, most notably used to represent ‘if and only if’ claims, 28–29
   deriving conclusions using rules of inference and equivalence, 169–174
   material conditional and, 106–107
   method for proving biconditional conclusion, 178
   proof strategies, 204
   propositional logic, 164–169
   rules of equivalence, 168, 169
   rules of inference, 166, 168
   truth table for, 51
biconditional association (BAssoc) A rule of equivalence of PL that allows you to regroup propositions with two biconditionals, 166, 167, 169
biconditional commutativity (BCom) A rule of equivalence of PL that allows you to switch the order of formulas around a biconditional, 166, 168
biconditional De Morgan’s law (BDM) A rule of equivalence of PL. When bringing a negation inside parentheses with BDM, make sure to negate only the formula on the left side of the biconditional, 166, 167, 168
biconditional inversion (BInver) A rule of equivalence of PL. To use BInver, negate both sides of the biconditional, but do not switch their positions, 166, 167, 169
biconditional hypothetical syllogism (BHS) A rule of inference of PL, and works just like ordinary hypothetical syllogism, 166, 167, 168
biconditional modus ponens (BMP) A rule of inference of PL, parallel to modus ponens, but used when the major premise has a biconditional, rather than a conditional, 166, 168
biconditional modus tollens (BMT) A rule of inference of PL. Unlike modus tollens, use BMT when you have the negation of the term which comes before the biconditional in the major premise, 166, 168
binary operators Logical operators that relate or connect two propositions, 24
bivalent logic In a bivalent logic every statement is either true or false, and not both, 46
bound variable A bound variable is attached, or related, to the quantifier that binds it. A variable is bound by a quantifier when it is in the scope of the quantifier and they share a variable, 234
Cantor, Georg, 6, 7
Carroll, Lewis, 411
causal fallacies, 408–409
charity, principle of, 301–304
Chrysippus, 5
circular reasoning, 403
closed sentence A closed sentence has no free variables, 213, 234
commutativity (Com) Rules of equivalence of PL, 137–138, 140
complete system of inference One in which every valid argument and every logical truth is provable, 114
complex formula One that is not atomic, 44
complex proposition One that is not atomic, 47
   truth values of, 51–54
   with unknown truth values, 54–56
complex questions, 410
composition, 410
compositionality The principle that the meaning of a complex sentence is determined by the meanings of its component parts, 23
Comte, Auguste, 15
conclusion A proposition in an argument that is supposed to follow from the premises, 9
   separating premises from, 9–12
conditional A complex claim, often in an ‘if . . . then . . .’ form, that consists of an antecedent (the claim that follows the ‘if’) and a consequent (the claim that follows the ‘then’). In PL, we use material implication to represent conditionals, 26–28
   combining, 158
   making conditionals, 156
   negated, 157
   proof strategies, 204
   simplifying antecedents and consequents, 157–158
   switching antecedents of nested, 157
conditional proof One of three derivation methods. In a conditional proof, we indent, assuming the antecedent of a desired conditional, derive the consequent of our desired conditional within the indented sequence, and discharge our assumption by concluding the conditional: if the first line of the sequence, then the last line of the sequence, 175
   common error when deriving logical truths, 186–187
   derivations in predicate logic with CP, 264–265
   deriving conclusions using, 179–183
   exercises in deriving logical truths, 190–191
   method of, 175
   nested sequence, 177
   propositional logic, 174–179
conjunction A complex proposition, consisting of two conjuncts. We use conjuncts to represent many propositions that contain ‘and’, 25
   proof strategies, 204
   truth table for, 48
conjunction (Conj) A rule of inference of PL, 124–125, 127
consequent In a conditional, the formula that follows the ⊃ is called the consequent, 27
   exercises, 31–32
   simplifying, 157–158
consistent propositions Two or more propositions that are true in at least one common row of their truth tables are consistent, 72
   method of indirect truth tables for, 93
consistent valuation A consistent valuation is an assignment of truth values to atomic propositions that makes a set of propositions all true, 90
   exercises determining, 102–104
constant In predicate logic, a singular term that stands for a specific object; a, b, c, . . . u are used as constants in M and F. In FF, f, g, and h are used as functors, 213
   invalidity in M, 284–285
constructive dilemma (CD) A rule of inference of PL, 126–127, 128
contingencies A contingency is a proposition that is true in some rows of a truth table and false in others, 69
contradictions Contradiction is used in three different ways in this book:
   A single proposition that is false in every row of its truth table is a contradiction, 69
   Two propositions with opposite truth values in all rows of the truth table are contradictory, 72
   In derivations, a contradiction is any wff of the form α • ~α, 159
contraposition (Cont) A rule of equivalence of PL, 146, 150, 416
Copernicus, Nicolaus, 13
counterexample A counterexample to an argument is a valuation that makes the premises true and the conclusion false, 78
Darwin, Charles, 13
Dedekind, Richard, 385
definite descriptions A definite description picks out an object by using a descriptive phrase beginning with ‘the’, as in ‘the person who . . . ,’ or ‘the thing that . . .’, 360
   identity predicate, 360–361
   translation with function, 381
De Morgan, Augustus, 9
De Morgan’s laws (DM) Rules of equivalence of PL, 135–136, 140, 413, 418. See also biconditional De Morgan’s law (BDM)
derivation A sequence of formulas, every member of which is an assumed premise or follows from earlier formulas in the sequence according to specified rules, 113, 197
   converting into logical truths, 187–189
   exercises for deriving conclusions of arguments, 160–164
   in F (full first-order logic), 337–343
functional structure A functional structure reflects the complexity of a functor term or of the n-tuple of singular terms in a functor term. The functional structure increases with the number of embedded functions, 390
   derivations and, 390–394
functor A functor is a symbol used to represent a function. In FF, we use f, g, and h as functors, 382
   derivations with, 394–396
functor terms A functor term is a functor followed by an n-tuple of singular terms in brackets, 383
gambler’s fallacy, 409
Gentzen, Gerhard, 8
Gödel, Kurt, 8
hasty generalization A logical fallacy. In inductive logic, it is sometimes called induction on too few cases. Instantiation and generalization rules for deductive logic are designed to avoid hasty generalization by preventing universal generalization from existential premises, 240, 408
Hegel, G. W. F., 5, 14, 20
Hilbert, David, 8
Hume, David, 5, 16
hypothetical syllogism (HS) A rule of inference of PL, 117, 120
identity predicate
   ‘at least’ and ‘at most’, 356–358
   conventions for derivations with dropped brackets, 372–375
   definite descriptions, 360–362
   derivations, 370–375
   deriving logical truths of, 380
   ‘exactly’, 358–359
   ‘except’ and ‘only’, 353–354
   exercises deriving conclusions, 376–378
   exercises translating arguments using, 378–380
   exercises translating into first-order logic, 363–369
   identity symbol, ‘=’, 352, 362
   introducing identity theory, 351–352
   rules, 352, 370–375
   simple identity claims, 353
   superlatives, 355
   syntax for, 352
   translation of, 351–362
identity rules (IDi, IDr and IDs) Three rules governing the identity relation, 352, 370–375
identity theory, 351–352, 380
IDi The indiscernibility of identicals, also known as Leibniz’s law: if α=β, then any wff containing α may be exchanged for a wff containing β in the same places, 370, 371, 375
IDr The reflexive property of identity, α=α, for any singular term α, 370, 375
IDs The symmetry property of identity: α=β ←→ β=α, for any singular terms, 370, 375
inconsistent pair In an inconsistent pair of propositions, there is no row of the truth table in which both statements are true; there is no consistent valuation, 73
indented sequence An indented sequence is a series of lines in a derivation that do not follow from the premises directly, but only with a further assumption, indicated on the first line of the sequence, 175
indirect proof or reductio ad absurdum One of three derivation methods. In an indirect proof, we assume the opposite of a desired conclusion, indenting to note the assumption, and find a contradiction, some statement of the form α • ~α. Then, we discharge our assumption, unindenting, writing the negation of the first line of the assumption in the first line of the indented sequence, 191
   derivations in predicate logic with IP, 265–266
   deriving conclusions of arguments using, 198–203
   method for, 192–193
indirect truth tables, 83–97
   consistency and method of, 90–97
   method for consistency, 93
   method for testing validity, 85
induction on too few cases, 408
inference. See rules of inference
informal fallacies, 403–404
interpretation An interpretation of a formal language describes the meanings or truth conditions of its components. For M, we interpret constants, predicates, quantifiers, and the propositional operators, 274
invalid argument, 78
invalidity
   in PL (propositional logic), 77–83
   in M (monadic predicate logic), 280–292
   in F (full first-order logic), 331–334
   generating counterexamples to show, 292–298
irrelevant premises, 404–406
Jaskowski, Stanislaw, 8
justification A justification in a derivation includes the line numbers and rule that allows the inference at every step after the premises, 118
Kahneman, Daniel, 409
Kant, Immanuel, 5–6, 9, 20
languages, logic and, 3–5
law of the excluded middle. See excluded middle
Leibniz, G. W., 6, 15, 370
Leibniz’s law, 352, 370
Locke, John, 5
logic
   defining, 1–3
   fallacies and argumentation, 401–411
   and languages, 3–5
   See also three-valued logics
logical equivalence Two or more propositions are logically equivalent when they have the same truth conditions, in other words, they have the same truth values in every row of their truth tables, 70
   logically equivalent (←→) is a metalogical symbol used for “is logically equivalent to”, 135
logical truths Logical truths are propositions which are true on any interpretation. For PL, the logical truths are tautologies. Given the completeness of PL, M, and F, they are definable
semantically or proof-theoretically. They can be proved with no premises, 68, 277
   common error in using conditional proof to derive, 186–187
   conditional and indirect proofs in F, 342–343
   converting ordinary derivations into, 187–189
   exercises in determining, 204
   invalidity in M, 289–290
   in propositional logic (PL), 184–189
   semantic arguments, 277–278
M The formal language of monadic predicate logic, 214
   adjectives, 223–224
   ‘and’s and ‘or’s and universally quantified formulas, 299–301
   appendices of derivations, 306–308
   conditional and indirect proof in M, 263–268
   constants, 284–285
   constructing models of theories, 279–280
   derivations in M, 238–246
   deriving logical truths of M, 273
   domains of one member, 281–283
   domains of three or more members, 285–288
   domains of two members, 283–284
   exercises in deriving conclusions, 247–249, 268–271
   expanding vocabulary, 236
   finding errors in illicit inferences, 252–254
   formation rules for wffs of M, 235–236
   invalidity in M, 280–292
   logical truths of, 267, 289–290
   negations of quantified formulas, 291
   only, 221–223
   overlapping quantifiers, 290–291
   propositions whose main operator is not a quantifier, 288–289
   propositions with more than one quantifier, 223
   quantified sentences with more than two predicates, 220
   quantifier exchange, 254–258
   quantifiers, domains and charity, 301–304
   semantics for, 273–279
   steps to interpret theory of M, 279
   syntax for M, 233–237
   things and people, 220–221
   translation exercises, 225–232, 237–238, 250–251, 271–272
   translation using M, 219–225, 299–305
   universally quantified formulas and existential import, 299
   vocabulary of M, 233–235
main operator The last operator added to a wff according to the formation rules is called the main operator, 44
material conditional. See material implication
material equivalence (Equiv) A rule of equivalence of PL, 147–148, 150, 416, 417
material implication A complex proposition consisting of an antecedent and a consequent,
   biconditional and, 106–107
   often used to represent ‘if . . . then . . .’ statements, 26–28
   truth table for, 50–51
material implication (Impl) A rule of equivalence of PL, 146–147, 150, 416
mathematics
   logic and, 2
   Peano axioms for, 385
Meditations on First Philosophy (Descartes), 403
metalogic, 71
method of finite universes The method of finite universes is a semantic method that can produce counterexamples to arguments in predicate logic, 281
model A model of a theory is an interpretation on which all of the sentences of the theory are true, 277
modus ponens (MP) A rule of inference of PL, 114–115, 119
modus tollens (MT) A rule of inference of PL, 115–116, 120
monadic predicate logic Predicate logic in which the predicates take only one singular term, 213
narrow scope of quantifier A quantifier’s scope is narrower the fewer subformulas it contains, 314
negation A complex proposition used for denying a proposition. The tilde, used to represent negation, is the only unary logical operator in PL, 24
   proof strategies, 204
   of quantified formulas, 291
   statement entailing its own, 159
   truth table for, 47–48
neither Neither is ‘not either’, and is usually represented as the negation of a disjunction, and should be carefully distinguished from ‘not both’, 26
nested sequence A nested sequence arises from an assumption within another assumption, 177
new constant A new constant is one that does not appear in either any earlier line of the argument or the desired conclusion, 242
Newton, Isaac, 6
Nicole, Pierre, 401
Niemöller, Martin, 407
Nietzsche, Friedrich, 20
non sequiturs, 405
no one A term that indicates a quantifier, but which should be distinguished from ‘nothing’, 221
not both Not both is usually represented as the negation of a conjunction, and should be carefully distinguished from ‘neither’, 26
n-tuple An n-tuple is a set with structure used to describe an n-place relation. Also, ‘n-tuple’ is a general term for pairs, triples, quadruples, and so on, 329
n-tuple of singular terms An n-tuple of singular terms is an ordered series of singular terms (constants, variables or functor terms), 383
only Only is a term that often indicates a quantifier. Sentences with ‘only’ may be related to sentences using ‘all’, 221–223
   identity statements, 353–354
open sentence An open sentence has at least one free variable, 213, 234
operators Logical operators are tools for manipulating and combining propositions or terms. They are defined by their basic truth tables, 22
   negation of, 24
Peano, Giuseppe, 385
Peano axioms, arithmetic, 385–386
Peirce, Charles Sanders, 8
petitio principii, 403
PL The language of propositional logic used in this book; the term is also used to refer to the system of deduction used with that language, 4, 22, 27, 29–30
   the biconditional, 106–107, 164–169
   conditional proof, 174–179
   inclusive and exclusive disjunction, 107–108
   indirect proof, 191–198
   indirect truth tables, 83–97
   interpreting sentences of, 42–43
   logical equivalence and translation, 105–106
   logical truths, 184–189, 204–205
   material conditional, 106–107
   modus ponens (MP), 114–115
   modus tollens (MT), 115–116
   notes on translation with PL, 105–111
   practice with derivations, 156–160
   proof strategies, 204
   rules of equivalence, 135–140, 146–150
   rules of inference, 113–120, 124–128
   semantics of, 46–57
   syntax of PL, 43–45
   translating argument into, 34–41
   translating sentences, 32–34
   truth tables, 59–67
   “unless” and exclusive disjunction, 108–111
   valid and invalid arguments, 77–81, 205–207
Plato, 13, 20, 33
Playfair, John, 7
polyadic predicates. See relational predicates
Port-Royal Logic (Arnauld and Nicole), 401
post hoc ergo propter hoc, 408
predicate A predicate is an upper-case letter that precedes a singular term in predicate logic. Predicates stand for properties, 213
   quantified sentences with two predicates, 216–217
predicate logic A language that includes predicates, singular terms, and quantifiers. In this book, M, F, and FF are all predicate logics, 213
   languages of, 217
   quantifiers, 214–215
   singular terms and predicates, 213–214
premise-conclusion form, arguments and numbered, 29–30
premises A proposition in an argument on which the conclusion is based or should follow, 9
   separating from conclusions, 9–12
problem of empty reference, 355
proof A derivation, or proof, is a sequence of formulas, every member of which is an assumed premise or follows from earlier formulas in the sequence according to specified rules, 113, 197
   strategies, 204
proof theory Proof theory is the study of axioms (if any) and rules of inference for a formal theory, 274
proposition A statement, often expressed by a sentence, 9
   classifying, 68–74
   classifying exercises, 74–75
   consistent, 72
   contingencies, 69
   contradictions, 69
   contradictory, 72
   exercises determining consistency of, 102–104
   inconsistent pairs, 73
   logical equivalence, 70
   tautology, 68
   valuation, 73
propositional logic (PL). See PL
Putnam, Hilary, 13
QED An acronym for the Latin “Quod erat demonstrandum,” or “that which was required to be shown,” and is used as a logician’s punctuation mark, to indicate the end of a derivation, to show that it is finished, 118
quantifier In predicate logic, operators that work with variables to stand for terms like ‘something’, ‘everything’, ‘nothing’, and ‘anything’. They may be existential (∃) or universal (∀), 214
   existential, 214
   instantiating the same quantifier twice, 244–245
   instantiation and generalization rules, 243–244
   narrower scope, 314
   overlapping, 290–291
   putting on the existential, 241
   putting on the universal, 239–241
   quantified sentences with two predicates, 216–217
   taking off the existential, 241–243
   taking off the universal, 238–239
   translation in M, 301–304
   universal, 215
   wider scope, 314
quantifier exchange (QE) A rule of replacement in predicate logic in which quantifiers may be switched, along with surrounding negations, 255–256
   exercise in translating arguments, 261–263
   exercises deriving conclusions, 258–261
   rules for removing and replacing quantifiers, 254–256
   transformations permitted by, 256–258
Quine, W. V., 14
reductio ad absurdum. See indirect proof
regimentation A regimentation of an argument helps reveal its logical structure, either by putting the argument into numbered premise-conclusion form, or by translating the argument into a formal language, 9
relational predicates Relational predicates or polyadic predicates are followed by more than one singular term, 310
   exercises translating formulas into English sentences, 327
   exercises translating into predicate logic, 317–326
   people and things and using, 313
   quantifier’s scope, 314
   quantifiers with, 312–313
   translation using, 310–317
religion. See philosophy of religion
Rousseau, Jean Jacques, 14, 15
rules, governing identity, 352, 370–375
rules of equivalence A pair of logically equivalent statement forms that allows the replacement of wffs in a derivation with logically equivalent wffs. In contrast to a rule of inference, it may be used on whole lines or on parts of lines, 135
   appendix on logical equivalence of, 413–418
   association (Assoc), 136, 140, 414
   biconditional association (BAssoc), 166, 167, 168
   biconditional commutativity (BCom), 166, 168
   biconditional De Morgan’s law (BDM), 166, 167, 168, 418
   biconditional inversion (BInver), 166, 167, 169, 418
   commutativity (Com), 137–138, 140
   contraposition (Cont), 146, 150, 416
   De Morgan’s laws (DM), 135–136, 140, 413
   deriving conclusions of arguments using, 140–144, 152–156
   distribution (Dist), 136–137, 140, 415
   double negation (DN), 138, 140
   exportation (Exp), 148–149, 150, 417
   material equivalence (Equiv), 147–148, 150, 416, 417
   material implication (Impl), 146–147, 150, 416
   quantifier exchange (QE), 255–258
   rules of inference and, 139
   tautology (Taut), 149, 150, 418
rules of inference A rule of inference is used to justify steps in a derivation. It may be used on whole lines only, in contrast with a rule of equivalence, which may be used on parts of lines as well, 114
   addition (Add), 124–125, 127
   biconditional hypothetical syllogism (BHS), 166, 167, 168
   biconditional modus ponens (BMP), 166, 168
   biconditional modus tollens (BMT), 166, 168
   conjunction, 124–125, 127
   constructive dilemma (CD), 126–127, 128
   deriving conclusions of arguments using, 129–134, 140–144, 152–156
   disjunctive syllogism (DS), 116, 120
   exercises identifying, 128–129
   existential generalization (EG), 241, 246
   existential instantiation (EI), 242, 246
   hypothetical syllogism (HS), 117, 120
   modus ponens (MP), 114–115, 119
   modus tollens (MT), 115–116, 120
   rules of equivalence and, 139
   simplification, 125–126, 127
   universal generalization (UG), 240, 246
   universal instantiation (UI), 239, 246
   using in derivations, 117–118
Russell, Bertrand, 360–361
satisfy An object satisfies a predicate if it is in the set that interprets that predicate. An existentially quantified sentence is satisfied if, and only if, it is satisfied by some object in the domain; a universally quantified sentence is satisfied if, and only if, it is satisfied by all objects in the domain, 277
Schopenhauer, Arthur, 16
scope The scope of an operator is the range of its application. Scopes may be wider or narrower; they can be increased in extent by the use of punctuation, 233
   wide and narrow, 314
scope of an assumption Every line of an indented sequence of a derivation that begins with the assumption. Nested indented sequences are within the scopes of multiple assumptions, 264
scope of a negation The scope of a negation is whatever directly follows the tilde, 233
scope of a quantifier The scope of a quantifier is whatever formula immediately follows the quantifier, 233
semantics The semantics of a formal language are the rules for interpreting the symbols and formulas of the language, 46
   for FF, 384–385
   identity theory, 351
   interpretations, satisfaction and models, 274–277
   logical truth, 277–278
   semantics for M (monadic language), 273–279
set An unordered collection of objects, 275
simplification (Simp) A rule of inference of PL, 125–126, 127
singular terms In all predicate logics, singular terms are lower-case letters that follow predicates. They may be constants (a, b, c, . . . , u) or variables (v, w, x, y, z). In FF, f, g, and h are used as functors, 213
   predicates and, 213–214
slippery slope, 407
Smith, Adam, 14
someone A term that indicates a quantifier, but which should be distinguished from ‘something’, 220
sound argument A valid argument is sound if, and only if, all of its premises are true, 17
sound system of inference In a sound system of inference or theory, every provable argument is semantically valid; every provable proposition is logically true, 114
sound theory. See sound system of inference
soundness, 114
   exercises, 19–21
   validity and, 17–18
straw man, 406
subformula A formula that is part of another formula, 235
subject A subject of a sentence is what is discussed; it may be regimented in predicate logic by one or more predicates, 220
subset A subset of a set is a collection, all of whose members are in the larger set, 275
substitution instance The substitution instance of a rule is a set of wffs of PL that match the form of the rule, 115
superlatives, identity predicate, 355
syllogism, 402
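Each rule of equivalence indexed above pairs two statement forms with identical truth tables, so any of them can be confirmed by exhaustive checking. A small sketch (the helper name `equivalent` is ours, not the text's) verifying DM and Cont over every valuation:

```python
from itertools import product

def equivalent(lhs, rhs):
    """True when two 2-place truth functions agree on every row of the truth table."""
    return all(lhs(p, q) == rhs(p, q) for p, q in product((False, True), repeat=2))

# De Morgan's laws (DM): ~(p ∙ q) is equivalent to ~p ∨ ~q
dm = equivalent(lambda p, q: not (p and q),
                lambda p, q: (not p) or (not q))

# Contraposition (Cont): p ⊃ q is equivalent to ~q ⊃ ~p
cont = equivalent(lambda p, q: (not p) or q,
                  lambda p, q: (not (not q)) or (not p))

print(dm, cont)  # → True True
```

The same four-row enumeration is exactly the truth-table method the text applies by hand.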
syntax The syntax of a logical language is the definition of its vocabulary and rules for making formulas, 43
   of PL, 43–45
   of M, 233–237
   of F, 328–331
   identity statements, 352
system of inference A collection of rules (of inference or equivalence) used with a logical language. Many systems of inference include axioms, though PL and M use no axioms, and F has only one, 113
A System of Logic (Mill), 401
Tarski, Alfred, 8
tautology A tautology is a proposition that is true in every row of its truth table, 68, 415
tautology (Taut) A rule of equivalence of PL, 149, 150, 418
terms, 419–420
theorems A sentence of a theory. In logic, the theorems are also called logical truths, 184
theory A set of sentences, called theorems, 184
Thoreau, Henry David, 13, 15
Tocqueville, Alexis de, 14, 15
translation, 29
   logical equivalence and, 105–106
   notes on, with M, 299–305
   notes on, with PL, 105–111
triadic predicates Triadic predicates are followed by three singular terms, 310
truth functions
   biconditional, 51
   conjunction, 48
   disjunction, 49
   material implication, 50–51
   negation, 47–48
   semantics of PL, 46–57
truth tables A truth table summarizes the possible truth values of a proposition, in other words, their truth conditions, 59
   constructing exercises, 74–77
   constructing for propositions with any number of variables, 66
   determining the size of, 60–66
   determining validity of, 81–83
   eight-row, 63–65
   exercises, 67–68
   four-row, 61–63
   indirect, 83–97
   method for constructing, 59
   method for testing validity, 78
truth values Interpretations of propositions. In bivalent logic, we use two truth values, true and false. Other logics, including ones with three or more truth values, are possible. The truth value of a complex proposition is the truth value of its main operator, 46, 47
   of complex expression, 57–59
   of complex propositions, 51–54
   complex propositions with unknown, 54–56
unary operator A logical operator that applies to a single proposition, 24
universal generalization (UG) The rule of inference in predicate logic that allows us to put a universal quantifier onto a formula, 240, 246
   restriction on, in F, 338–339
universal instantiation (UI) The rule of inference in predicate logic that allows us to take off a universal quantifier, 239, 246
universal quantifier The symbol used to regiment terms including ‘all’ and ‘everything’, 215
   ‘and’s and ‘or’s, 299–301
   formulas, 299–301
   putting on the, 239–241
   taking off the, 238–239
unless Ordinarily represented as a disjunction, but may also be represented as a conditional in which the antecedent is negated, 25, 26
   exclusive disjunction and, 108–111
   truth table for, 108–110
unsound A valid argument is unsound when at least one of its premises is false, 17
unwarranted premises, 406–408
valid argument An argument is valid when the conclusion is a logical consequence of the premises. In propositional logic, a valid argument has no row of its truth table in which the premises are true and the conclusion is false. An invalid argument has at least one counterexample, 17, 78
validity
   determining, 205–207
   exercises, 19–21, 207–209
   method of indirect truth tables to test, 85
   method of truth tables to test, 78
   and soundness, 17–18
valuation A valuation is an assignment of truth values to simple component propositions, 73
variables In predicate logic, a singular term which may be bound by a quantifier; v, w, x, y, z are used as variables, 213
von Clausewitz, Carl, 14
weak premises, 406–408
wff A well-formed formula of a formal language, 43
   exercises, 45–46
   formation rules for wffs of F, 328–331
   formation rules for wffs of M, 235–236
   formation rules of PL, 44–45
wide scope of quantifier A quantifier’s scope is wider the more subformulas it contains, 314
Williams, William Carlos, 316, 328