Second, thanks to a recent policy change in Italy’s national museums, I was finally able to take literally thousands of photos of artifacts and spaces in museums that have been forbidden to cameras for years. I’ve started sharing the photos on Twitter (#historypix), so follow me there if you would enjoy random photos of cool historical artifacts twice a day.
Meanwhile I don’t yet have another full essay ready to post here, but I’m happy to say the reason is that I’m working away on the page proofs of Too Like the Lightning, the final editing step before the books go to press. I’ve even received a photo from my editor of the Advanced Release Copies for book reviewers sitting in a delicious little pile! It’s fun seeing how many different baby steps the book is taking on its long path to becoming real: cover art, page count, typography, physicality in many stages, first the pre-copy-edit Advanced Bound Manuscripts, then the post-copy-edit but pre-page-proof Advanced Release Copies, evolving toward the final hardcover transformation by transformation. My biggest point of suspense at this point is wondering how fat it will be, how heavy in the hand…
And now, a quick piece of history fun:
There is a dimly-lit hallway halfway through the Vatican Museums (after you’ve looked at 2,000 Roman marbles, 1,000 Etruscan vases and enough overwhelming architecture to make you start feeling slightly punchy) hung on the left-hand side with stunning tapestries of scenes from the life of Christ based on cartoons by Raphael. But on the right-hand side of the same hallway, largely ignored by the thousands of visitors who stumble through, is my favorite Renaissance tapestry cycle, a sequence of images of The Excessively Exciting Life of Pope Urban VIII. My best summary of these images is that, when I showed them to my excellent friend Jonathan (author of our What Color is Pluto? guest post), he scratched his chin and said, “I think the patronage system may have introduced some bias.” And it’s very true, these are an amazing example of Renaissance art whose sole purpose is excessive flattery of the patron, a genre common in all media: histories, biographies, dedications, sculptures, paintings, verses, and, in this case, thread.
These tapestries are fragile and quite faded, and the narrow hallway thronging with Raphael-admirers makes it awkward to get a good angle, but with much effort I think these photos capture the over-the-top absurdity which makes the tapestries such a delight. Urban VIII is now best known for engaging in unusually complicated military and political maneuvering, expanding and fortifying the papal territories, pushing fiercely against Hapsburg expansion into Italy, finishing the canonization of St. Ignatius of Loyola, persecuting Galileo, commissioning a lot of Bernini sculptures, and spending so much on military and artistic projects that the papacy ended up so head over heels in debt that the Roman people hated him, the Cardinals conspired to depose him (note: it usually takes a few high-profile murders and/or orgies to get them to do that, so this was a LOT of debt), and his successor was left spending 80% of the Vatican’s annual income on interest payments alone. But let’s see what scenes from his life he himself wanted us to remember:
My favorite is the first: Angels and Muses descend from Heaven to attend the college graduation of young Maffeo Barberini (not yet Pope Urban VIII) and give him a laurel crown. If all graduation ceremonies were this exciting, we’d never miss them! Also, someone there is carrying a Caduceus; is she some weird female version of Hermes? Hard to say. And look at the amazing fabric on the robe of the man overseeing the ceremony.
Second, Maffeo Barberini receives the Cardinal’s Hat, attended by an angel, while Pope Paul V who is giving him the hat points in a heavy-handed foreshadowing way to his own pope hat nearby. What could it mean?!
Next, the fateful election! Heavenly allegories of princely virtues come to watch as the wooden slips are counted and the vote counter is astonished by the dramatic result! Note how, propaganda aside, this is useful for showing us what the slips looked like.
In the one above I particularly like the guy who’s peering into the goblet to make absolutely sure no slips are stuck there:
On the other side of the same scene, our modest Urban VIII is so surprised to be elected he practically swoons! And even demands a recount, while the nice acolyte kneels before him with the (excessively heavy) papal tiara on a silver platter.
Now Urban’s adventures as pope! He breaks ground for new construction projects in Rome, attended by some floating cupid creature holding a book for the flying allegorical heart of the city:
He builds new fortresses to defend Rome:
He makes peace between allegorical ladies representing Rome and Etruria (the area right next to Rome: note, if there is strife between Rome and Etruria in the first place, things in Italy are VERY VERY BAD! But the tapestries aren’t going into that):
And finally, Urban VIII defends Rome from Famine and Plague by getting help from St. Peter, St. Paul, Athena, and St. Sebastian. Well done, your Holiness!
How about that for the exciting life of a late Renaissance pope? You get to hang out with lots of allegorical figures, and vaguely pagan deities as well as saints, and everyone around you is always gesturing gracefully! No wonder they fought so hard for the papal tiara. Also, no bankers or moneylenders or interest payments to be found!
More seriously, another century’s propaganda rarely makes it into our canon of what art is worth reproducing, teaching and discussing, but I often find this kind of artifact much more historically informative than most: we can learn details of clothing, spaces and items, like how papers were folded, or what voting slips looked like. We can learn which acts a political figure wanted to be remembered for, and what seemed important at the time, so different from what we remember. A tapestry of him canonizing St. Ignatius of Loyola would certainly be popular now, but in his day people cared more about immediate military matters, and he had no way to predict how important St. Ignatius would eventually become. Pieces like this are also a good way to remind ourselves that the Renaissance art we usually see on calendars and cell phone cases isn’t representative; it’s our own curated selection of that tiny Venn diagram intersection of art that fits the tastes of BOTH then AND now. And a good reminder that we should always attend graduation ceremonies, since you never know when Angels and Muses might descend from Heaven to attend.
My own period I will treat the most briefly in this survey. This may seem like a strange choice, but I can either do a general overview, or get sidetracked discussing individual philosophers, theologians and commentators and their uses of skepticism for another five posts. So, in brief:
In the later Middle Ages, within the philosophical world, the breadth of disagreement within scholarship (how far apart the most extreme theories were on any given topic) was rather circumscribed. A good example of a really fractious fight is the question of, within your generally Aristotelian tripartite rational immortal soul, which of the two decision-making principles is more powerful, the Intellect or the Will? It’s a big and important question: without an answer we will starve to death like Buridan’s ass, and be unable to decide whether to send our second sons to a Franciscan or a Dominican monastery, plus we need it to understand how Original Sin, Grace and salvation work. But the breadth of answers is not that big, and the question itself presumes that everyone involved already believes 90% the same thing.
Enter Petrarch: “Let’s read the classics! They’ll make us great like the Romans!” Begin 250 years of working really hard to find, copy, correct, translate, edit, print and proliferate every syllable surviving from antiquity. Now we discover that Epicurus says there’s no afterlife and the universe is made of atoms; the Stoics say the universe is one giant contiguous object without motion or individual existence; Plato says there’s reincarnation (What? The Plato we used to have didn’t say that!); and Aristotle totally doesn’t say what we thought he said, since it turns out the Organon was a terrible translation (Sorry, Boethius, you did your best, and we love you, but it was a terrible translation). Suddenly the palette of questions is much broader, and the degree to which people disagree has opened exponentially wider. If we were charting a solar system before, now we’re charting a galaxy. But the humanists still tried hard to make them all agree, much as the scholastics and Peter Abelard had, since the ancients were ALL wonderful and ALL brilliant and ALL right, right? Even the stuff that contradicts the other stuff? Hence Renaissance Syncretism: attempts by philosophers like Marsilio Ficino and Giovanni Pico della Mirandola to take all the authors of antiquity, with Aquinas and a few others thrown into the mix, and show how they were all really saying the same thing, in a roundabout, hidden, glorious, elusive, poetic, we-can-make-like-Abelard-and-make-it-all-make-sense way.
Before you dismiss these syncretic experiments as silly, or as slavish toadying, there is a logic to it if you can zoom out from modern pluralistic thinking for a minute and look at what Renaissance intellectuals had to work with.
To follow their logic chain you must begin–as they did–by positing that Christianity is true, and there is a single monotheistic God who is the source of all goodness, virtue, and knowledge. Wisdom, being wise and good at judgment, helps you tell true from false and right from wrong, and what is true and right will always agree with and point toward God. Therefore all wise people in history have really been aiming toward the same thing–one truth, one source. Plato and Aristotle and their Criteria of Truth are in the background of this, Plato’s description of the Good which is one divine thing that all reasoning minds tend toward, and Aristotle’s idea that reasoning people (philosophers, scientists) working without error will come to identical conclusions even if they’re on opposite sides of the world, because the knowable categories (fish, equilateral triangle, good) are universal. Thus, as Plato and Aristotle say we use reason to gradually approach knowledge, all philosophers in history have been working toward the same thing, and differ only in the errors they make along the way. This is the logic, but they also have evidence, and here you have to remember that Renaissance scholars did not have our modern tools for evaluating chronology and influence. They looked at early Christian writings, and they looked at Plato and Aristotle, and they said, as we do, “Wow, Plato and Aristotle have a lot of ideas in common with these early Christians!” but while we conclude, “Early Christians sure were influenced by Plato and Aristotle,” they instead concluded, “This proves that Plato and Aristotle were aiming toward the same things as Christianity!” And they had further evidence from how tangled their chronologies were. There were certain key texts like the Chaldean Oracles which they thought were much much older than we now think they are, which made it look like ideas we attribute to Plato had independently existed well before Plato. 
They looked at Plotinus and other late antique Neoplatonists who mixed Plato and Aristotle but claimed the Aristotelian bits were really hidden inside Plato the whole time, and they concluded, “See, Plato and Aristotle were basically saying the same thing!” Similarly confusing were the works of the figure we now call Pseudo-Dionysius, who we think was a late antique Neoplatonist voicing a mature hybrid of Platonism and Aristotelianism with some Stoicism mixed in, but who Renaissance scholars believed was a disciple of Saint Paul, leading them to conclude that Saint Paul believed a lot of this stuff, and making it seem even more like Plato, Aristotle, Stoics, ancient mystics, and Christianity were all aiming at one thing. So any small differences are errors along the way, or resolvable with “sic et non.”
The problem came when they translated more and more texts, and found more contradictions than they could really handle. Ideas much wilder and more out there than they expected suddenly had authoritative, possibly-sort-of-proto-Christian authors endorsing them. Settled questions were unsettled again, sleeping dragons woken. For example, it wasn’t until the Fifth Lateran Council in 1513 that the Church officially made belief in the immortality of the soul a required doctrine for all Christians. This does not mean that lots of Christians before 1513 didn’t believe in the afterlife, but that Christians in 1513 were anxious about belief in the afterlife, feeling that it and many other doctrines which had stood unthreatened throughout the Middle Ages were suddenly in doubt. The intellectual landscape was suddenly bigger and stranger.
Remember how I said Cicero would be back? All these humanists read Cicero constantly, including the philosophical dialogs, with his approach of presenting different classical sects in dialog, all equally plausible but incompatible, leading to… skepticism. And as they explored those same sects more and more broadly, Cicero the skeptic became something of a wedge that started to widen the crack, not overtly stating “Hey, guys, these people don’t agree!” but certainly pressing the idea that they don’t agree, in ways which humanists had more and more trouble ignoring as more texts came back.
Aaaaaand the Reformation made this more extreme, a lot more extreme, by (A) generating an enormous new mass of theological claims made by contradictory parties, adding another arm to our galactic spiral, and (B) developing huge numbers of fierce and damning counter-arguments to all these claims, which in turn meant developing new tools for countering and eroding belief. Thus, as we reach the 1570s, the world of philosophy is a lot bigger, a lot deadlier (as the Reformation and Counter-Reformation killed many more people for their ideas than the Middle Ages did), and a lot scarier, with vast swarms of arguments and counter-arguments, many of them powerful, persuasive, beautifully reasoned, and completely incompatible. And when you make a beautiful yes-and-no attempt to make Plato and Epicurus agree, you don’t have the men themselves on hand to say “Excuse me, in fact, we don’t agree.” But you did have real live Reformation and Counter-Reformation theologians running around responding to each other in real time, which made syncretic reconciliation all the more impossible.
Remember how Abelard, who was able to make St. Jerome and St. Augustine seem to agree, drew followers like Woodstock? Well, now his successors–Scholastic and Humanist, since the Humanists were all ALSO reading Scholasticism all the time–have a thousand times as many authorities to reconcile. You think Jerome and Augustine are hard? Try Calvin and Epicurus! St. Dominic and Zwingli! Thomas Aquinas is a saint now; let’s see if you can Yes-and-No the entire Summa Theologica into agreeing with Epictetus, Pseudo-Dionysius and the Council of Trent at the same time! And remember, in the middle of all this, that most if not all of our Renaissance protagonists still believe in Hell and damnation (or at least something similar to it), and that if you’re wrong you burn in Hellfire forever and ever and ever, and so do all your students, and it’s your fault. Result: FEAR. And its companion, freethought. Contrary to what we might assume, this is not a case where fear stifled inquiry, but one where it stimulated more, firing Renaissance thinkers with the burning need to find a solution to all these contradictions, some way to sort out the safe path amid a thousand pits of Hellfire. New syntheses were proposed, new taxonomies of positions and heresies outlined, and old beliefs reexamined and refined or reaffirmed. And this period of intellectual broadening and competition brought with it an increasing inability to believe that any one of these options is the only right way when there are so many, and they are so good at tearing each other down.
And in the middle of this, experimental and observational science is advancing rapidly, and causing more doubt. We discover new continents that don’t fit on a T-O map (Ptolemy is wrong), new plants that don’t fit existing plant taxonomy (Theophrastus is wrong), details about animals which don’t match Aristotle (we’d better hope he’s not wrong!), and the circulation of the blood, which turns the four humors theory on its head (Not Galen! We really needed him!), while magnification lets us finally see the complexity of a flea, and realize there is a whole unexplored micro-universe of detail too small for the naked eye to experience, raising the question “If God made the Earth for humans, why did God bother to make things humans can’t even perceive?”
Youth: “But, Socrates, why did experimental and observational science advance in that period? Discovering new stuff that isn’t in the classics doesn’t have anything to do with reconstructing antiquity, or with the Reformation, does it?”
Good question. A long answer would be a book, but I can make a quick stab at a short one. I would point at several factors. First, after 1300, and increasingly as we approach 1600, European rulers began competing in new ways, many of them cultural. As more and more nobles were convinced by the humanist claim that true nobility and power came from the lost arts of the ancients, so scholarship and unique knowledge, including knowledge of ancient sciences, became mandatory ornaments of court, and politically valuable as ways of advertising a ruler’s wealth and power. Monarchs and newly-risen families who had seized power through war or bribery could add a veneer of nobility by surrounding themselves with libraries, scholars, poets, and scientists, who studied the ancient scientific sources of Greece and Rome but, in order to understand them more fully, also studied newer sources coming from the Middle East, and did new experiments of their own. A new astronomical model of the heavens proclaimed the power of the patron who had paid for it, just as much as a fur-lined cloak or a diamond-studded scepter.
Add to this the increasing scale of wars, caused by increased wealth which could raise larger armies, generating a situation in which new tools for warfare, and especially fortress construction, were increasingly in demand (when you read Leonardo’s discussions of his abilities, more than 75% of the inventions he mentions are tools of war). Add to that the printing press, which makes it possible for novelties–whether a rediscovered manuscript or a newly-discovered muscle–to spread exponentially faster, and which makes books much more affordable, so that if only one person in 50,000 could afford a library before, now it is one in 5,000, and even merchants could afford a few texts. Education was easier, and educated men were in demand at courts eager to fill themselves with scholars, and advertise their greatness with discoveries.
These are the main facilitators, but I would also cite another fundamental shift. I have talked before about Petrarch, and the humanist project to improve the world by reconstructing a lost golden age. This is the first philosophical movement since ancient Stoicism to have had anything to do with the world, since medieval theology’s (perfectly rational in context!) desire to study the Eternal instead of the ephemeral meant that most scholars for many centuries had considered natural philosophy, the study of impermanent natural phenomena, as useless as studying the bathwater instead of the baby. Humanism generated a lot of arguments about why Earth and earthly things were worth more than nothing, even if they agreed Heaven and eternal things were more important, and I think the mindset which said it was a pious and worthwhile thing to translate Livy or write a treatise on good government contributed to the mindset which said it was a pious and worthwhile thing to measure mountains or write a treatise on metallurgy. Thought turned, just a little bit, toward Earth.
There, that’s the Renaissance and Reformation, oversimplified by necessity, but Descartes is chomping at the bit for what comes next. For those who want more, I shall do the crass thing here and say: for more detail, see my book Reading Lucretius in the Renaissance, or Popkin’s History of Skepticism, or wait.
At last, Montaigne!
Like the world which basked in his writings, and shuddered in his “crisis,” I love Montaigne. I love his sentences, his storytelling, his sincerity, his quips, his authorial voice. Reading Montaigne is like slowly enjoying a glass of whatever complex, rich and subtle beverage you most enjoy a glass of (wine for many, fresh goat milk for me!). Especially because, at the end, your glass is empty. (I see a contented Descartes nodding.) When I set about writing this series, getting to Montaigne was, in fact, my secret end goal, since, if there is a founder of modern skepticism, it is Michel Eyquem de Montaigne.
Montaigne was unique, an experiment, the natural experiment to follow at the maturation of the Renaissance classical project, but still a unique child, raised as an overt pedagogical experiment outlined by his father: Montaigne grew up speaking only Latin. He was exposed to French in his first three years by country nurses, but from three on he was only allowed contact with people–his tutor, parents and servants–speaking Latin. He was a literal attempt to raise a Cicero or a Caesar, formed exclusively by classical ideas, the ideal man that the humanists had been hoping to create. Greek was later added, not with textbooks and the rod as was usual in those days, but with games and music, and studies were always made to seem pleasant and wonderful by surrounding him with music (even waking the child every morning with delightful live music). He grew up to be about as perfect a Platonic Philosopher King as one could hope to imagine, studying law and entering politics, as his father wished, achieving the highest honors, but preferring life alone in his library, and frequently retiring to do just that, only to be dragged back into politics, quite literally by popular demand, by people who would come bang on his library door demanding that he come out to take up office and rule them. I think often about what it must have been like to be Montaigne, to be so immersed, to enjoy these things so much, and only later discover that he was alone in a world with literally no other native speaker of his language. It must have been as difficult as it was wonderful to be Montaigne. But I think I understand why, when he lost his best friend Étienne de la Boétie, Montaigne wrote of his grief, his loss, the pain of solitude, with an intensity rarely approached in the history of human literature. He also wrote the Essais, meandering writings, the source of the modern word “essay,” for which every schoolchild has the right to playfully curse him.
I will now go about explaining why Montaigne was so wonderful by describing Voltaire. Yes, it is an odd way to go about it, but the Voltaire example is clearer and more concise than any Montaigne example I have on hand, and, in this, Voltaire was a student of Montaigne, and Montaigne will only smile to see such a beautiful development of his art, as Bacon smiles on Newton, and Socrates on all of us.
At the beginning of this sequence, I outlined two potential sources of knowledge: either (A) Sense Perception, i.e. Evidence, or (B) Logic/Reason. The classical skeptics were born when the reliability of these two sources of knowledge was drawn into doubt, Sense Perception by the stick in water, Logic by Zeno’s Paradoxes of Motion. Responses included the skeptics’ conclusion “We can’t know anything if we can’t trust Reason or the Senses,” and the various other classical schools’ Criteria of Truth (Plato’s Ideas, Aristotle’s Categories, Epicurus’s weak empiricism, etc.). All refutations we have seen along our long path have been based on undermining one of these two types of knowledge sources: so when Duns Scotus fights with Aquinas, he picks on his logic, and when Ockham fights with him he, often, picks on his material sensory evidence. (“Where is the phantasm? Huh? Huh?”)
Everybody, I’d like to introduce you to Leibniz. Leibniz, this is everybody. “Hello!” says Leibniz, “Very nice to meet you all.” We are going to viciously murder Leibniz in about three minutes. “It’s no trouble,” says Leibniz, “I’m quite used to it.” Thank you, Leibniz, we appreciate it.
Leibniz here made many great contributions to philosophy and mathematics, but one in particular was extraordinarily popular, I would go so far as to say faddy, a fad argument which swept Europe in the first half of the 18th century. You have almost certainly heard it before in mocking form, but I will do my best to be fair as we line up our target in our sights:
God is Omnipotent, Omniscient and Omnibenevolent. (Given.) “Grrrr,” quoth Socrates.
Given that God is Omniscient, He knows what the best of all possible worlds is.
Given that God is Omnipotent, He can create the best of all possible worlds.
Given that God is Omnibenevolent, He wants to create the best of all possible worlds.
Any world such a God would make must logically be the best of all possible worlds.
This is the best of all possible worlds.
Now, this was a proof written, just like Anselm’s and Aquinas’s, by a philosopher expecting a readership who all believe, both in God, and in Providence. It is a comfortable proof of the logical certainty that there is Providence, that this universe is perfect (as the Stoics first theorized), and that anything in it which seems to be bad or evil must, in fact, be part of a greater long-term good that we fail to see because of our limited human perspective. The proof delighted a huge number of people, who now had such an elegant and simple argument for something they enthusiastically believed.
But the proof also had the side-effect that arguments about Providence often have, of making people start trying to reason out what good lay behind hidden evils. “Oh, that guy was struck with disease because he did X bad thing.” “Wolves exist to make us live in villages.” “That plague happened because those people were bad.” It was (much like Medieval proofs of the existence of God) a way philosophers could show off their cleverness to an appreciative audience, make themselves known, and put forward theories about right and wrong and what God might want.
In 1755 an enormous earthquake struck the great port city of Lisbon, Portugal, wiping out tens of thousands of people (some estimates run as high as 100,000) and leveling one of the great gems of European civilization. It remains to this day one of the deadliest earthquakes in recorded history, and some parts of Lisbon still lie in ruins more than 250 years later. The shock and horror, to a progressive, optimistic Europe, was stunning. And immediately thereafter, fans of Leibniz started publishing essays about how it was GOOD that this had happened, because of XYZ reason. For example, one argument was that the Portuguese had been persecuting people for their religion, and this was God saying He disapproved <= REAL argument. (Note: Leibniz himself is innocent of all this, having died years before the earthquake – we are speaking of his followers.) Others argued that it was a bad minor effect of God’s general laws, that the physical rules of the Earth which make everything wonderful for humankind also make earthquakes sometimes happen, but that the suffering they cause is negligible against the greater goods that Providence achieves. And if one person in Europe could not stand these noxious, juvenile, pompous, inhumane, self-serving, condescending, boastful, heartless, self-congratulatory responses to unprecedented human suffering, that person was the one pen mightier than any sword, Voltaire.
Would words like these to peace of mind restore
The natives sad of that disastrous shore?
Grieve not, that others’ bliss may overflow,
Your sumptuous palaces are laid thus low;
Your toppled towers shall other hands rebuild;
With multitudes your walls one day be filled;
Your ruin on the North shall wealth bestow,
For general good from partial ills must flow;
You seem as abject to the sovereign power,
As worms which shall your carcasses devour.
No comfort could such shocking words impart,
But deeper wound the sad, afflicted heart.
When I lament my present wretched state,
Allege not the unchanging laws of fate;
Urge not the links of the eternal chain,
’Tis false philosophy and wisdom vain.
The God who holds the chain can’t be enchained;
By His blest Will are all events ordained:
He’s Just, nor easily to wrath gives way,
Why suffer we beneath so mild a sway:
This is the fatal knot you should untie,
Our evils do you cure when you deny?
Men ever strove into the source to pry,
Of evil, whose existence you deny.
If he whose hand the elements can wield,
To the winds’ force makes rocky mountains yield;
If thunder lays oaks level with the plain,
From the bolts’ strokes they never suffer pain.
But I can feel, my heart oppressed demands
Aid of that God who formed me with His hands.
Sons of the God supreme to suffer all
Fated alike; we on our Father call.
No vessel of the potter asks, we know,
Why it was made so brittle, vile, and low?
Vessels of speech as well as thought are void;
The urn this moment formed and that destroyed,
The potter never could with sense inspire,
Devoid of thought it nothing can desire.
The moralist still obstinate replies,
Others’ enjoyments from your woes arise,
To numerous insects shall my corpse give birth,
When once it mixes with its mother earth:
Small comfort ’tis that when Death’s ruthless power
Closes my life, worms shall my flesh devour.
This (in the William F. Fleming translation) is an excerpt from the middle of Voltaire’s Poem on the Lisbon Earthquake, which I heartily encourage you to read in its entirety. The poem summarizes the arguments of Camp Leibniz, and juxtaposes them with heart-wrenching descriptions of the sufferings of the victims, and with Voltaire’s own earnest and passionate expression of exactly why these kinds of arguments about Providence are so difficult to choke down when one is really on the ground suffering and feeling. The human is not a senseless pottery vessel; it is a thinking thing, it feels pain, it asks questions, it feels the special kind of pain that unanswered questions cause, the same pain the skeptics have been trying to help us escape for 3,000 years. But we don’t escape, and the poem captures it. The poem swept across Europe like a firestorm. People read it, people felt it, people recognized in Voltaire’s words the cries of anger in their own hearts. And they agreed. He won. The Leibniz fad ended. An entire continent-wide philosophical movement, slain.
And he used neither Logic nor Evidence.
Did you feel it? The poem persuaded, attacked, undermined, eroded away the respectability of Leibniz, but it did it without using EITHER of the two pillars of argument. There was no chain of reasoning. And there was no empirical observation. You could say there was some logic in the way he juxtaposed claims (“God is a kind Maker”) with counter-claims (“I am not a potter’s jar, I am a thinking thing! I need more!”). You could say there was some empiricism or evidence-based argument in his descriptions of things he saw, or things he felt, since feelings too are sense-perceptions in a way, so reporting how one feels is reporting a sensory fact. But there was nothing in this so rigorous or so real that any of our ancient skeptics would recognize it as the empiricism they were attacking. The people Voltaire describes: he did not see them, he imagined them, reaching across the breadth of Europe with the strength of empathy. That potter’s wheel is a metaphor, not a syllogism. Voltaire has used a third thing, neither Reason nor Evidence, as a tool of skepticism.
What do we name this Third Thing? I have heard people propose “common sense” but that’s a terribly vexed term, going back to Cicero at least, which has been used by this point to mean 100 things that are not this thing, so even if you could also call this thing “common sense” it would just create confusion (we don’t need Aristotle looming with a lecture on the dangers of unclear vocabulary). I have heard people propose “sentiment” and I like how galling it feels to try to suggest that “sentiment” should enjoy coequal respect and power with Reason and Evidence, but it isn’t quite that either. I am not yet happy with any name for this Third Thing, and am playing around with many. All I will say is that it is real, it is powerful, it is as effective at persuading one to believe or disbelieve as Reason and Evidence are. And, even if there were shadows of this Third Thing earlier in human history, Montaigne was the smith who sharpened the blade and handed it to Voltaire, and to the rest of us.
Montaigne’s Essais are lovely, meandering, personal, structure-less, rambling musings in which topics flow one upon another, he summarizes an argument made for or against some heresy, then, rather than voicing an opinion, tells you a story about his grandmother that one time, or retells a bit of one of Virgil’s pastorals, or an anecdote about some now-obscure general, and then flows on to a different topic, never stating his opinion on the first but having shaped your thinking, through his meanders, until you feel an answer, a belief or, more often, disbelief, even if he never voiced one. And then he keeps going, taking up another argument, making it feel silly with an allegory about two bakers, another and–have you heard the news from Spain?–another, and another, and oh, the loves of Alexander, another, and another. And as it flows along you get to know him, feel you’re having a conversation with him, and somewhere toward the end you no longer believe any of the philosophical arguments he has just summarized are plausible at all, but he never once argued directly against any of them. It is a little bit like our skeptical Cicero, juxtaposing opposing views and leaving us convinced by none, but it is one level less structured, not actually a dialog with arguments and refutations. Skepticism, without Reason, without Evidence, just with the human honesty that is Montaigne, his doubts, his friendship, his communication to you, dear reader, across the barrier of page, and time, and language, this strange French-Roman, this only native Latin speaker born in a millennium, this alien, has made you realize all the philosophical convictions, everything in that broad spectrum that scholasticism plus the Renaissance plus the Reformation and Counter-Reformation ferocity have laid before you, none of it is what a person really feels deep down inside, not Montaigne, and not you. 
And so he leaves you a skeptic, in a completely different way from how the ancient skeptics did it, not with theses, or exercises, or lists, or counterarguments, just with… humanity?
Montaigne did it. His contemporaries found it… odd at first, a bit self-centered, this autobiographical meandering, but it was so beautiful, so entrancing, so powerful. It reared a new generation, armed with Reason and Evidence and This Third Thing, and deeply skeptical. Students at universities started raising their hands in class to ask the teachers to prove the school existed. Theologians advising princes started saying maybe it didn’t matter that much what the difference was between the different Christian faiths if they were close enough. A new age of philosophy was born, not a new school, but a new tool for dogmatism’s ancient symbiotic antagonist: doubt.
And, where doubt grows stronger and richer, so does dogmatic philosophy, having that much more to test itself against. Just as, in antiquity, so many amazing schools and ideas were born from trying to respond to Zeno and the Stick in Water, so Montaigne’s new tools of Skepticism, his revival and embellishment of skepticism, the birth, as we call it, of Modern Skepticism, was also the final ingredient necessary for an explosion of new ideas, new schools, new universes described by new philosophers trying to build systems which can stand up against a new skepticism armed, not just with Reason and Evidence, but with That Third Thing.
Thus, as 1600 approaches, the breakneck proliferation of new ideas and factions makes Montaigne’s skepticism so popular that students in scholastic and Jesuit schools are starting to raise their hands and demand that the professor prove the existence of the classroom before expecting them to attend class. A “skeptical crisis” takes center stage in Europe’s great intellectual conversation, and multiplying doubt seems to have all the traditional Criteria of Truth in flight. It is onto this stage that Descartes will step, and craft, alongside his contemporaries, the first new systems which will have to cope, not with two avenues of attacking certainty, but, thanks to Montaigne, three. And will fight back against them with Montaigne’s arts as well. Next time.
For now, I will leave you with one more little snippet of the future: I lied to you, about a simple happy ending to Voltaire’s quarrel with Leibniz. Oh, Leibniz was quite dead, not just because the man himself had died but because no philosopher could take his argument seriously after the poem. Ever. Again. In fact, a few years ago I went to a talk at a philosophy department in which a young scholar was taking on Leibniz’s Best of All Possible Worlds thesis, and picking it apart using beautiful logical argumentation, and at the end everyone applauded and congratulated him, but when the Q&A started the first Q was “Well, um, this was all quite fascinating, but, isn’t Leibniz, I mean, no one takes that argument seriously anymore…” But the young philosopher was correct to point out that, in fact, no one had ever actually directly refuted it with logic. No one saw the need. But if Voltaire’s victory over logical Leibniz was complete, Leibniz was not the most dangerous of foes. Voltaire had contemporaries, after all, armed with Montaigne’s Third Thing just as Voltaire was. Rousseau will fire back, sweet, endearing, maddening Rousseau, not in defense of Leibniz, but against the poem which he sees as an attack on God. But this battle of two earnest and progressive deists must wait until we have brought about the brave new world that has such creatures in it. For that we need Descartes, Francis Bacon, grim Hobbes, John Locke, and the ambidextrous Bayle.
Socrates, Sartre, Descartes and our Youth have, among them, consumed twelve thousand, six hundred and forty-two hypothetical eclairs in the fourteen months since we left them contemplating skepticism on the banks of a cheerily babbling imaginary brook. Much has changed in the interval, not in the land of philosophical thought-experiments (which is ever peaceful unless someone scary like Ockham or Nietzsche gets inside), but in a world two layers of reality removed from theirs. The changes appear in the world of material circumstances which shape and foster this author, who in turn shapes and fosters our philosophical picnickers. Now, having recovered from my transplant shock of being moved to the new and fertile country of the University of Chicago, and with my summer work done, and Too Like the Lightning fully revised and on its way toward its May 10th release date (YES!), it is time at last to return to our hypothetical heroes, and to my sketches of the history of philosophical skepticism.
When last we saw them, Socrates, Sartre, Descartes and our Youth had rescued themselves from the throes of absolute doubt by developing Criteria of Truth, which allowed them to differentiate arenas of knowledge where certainty is possible from arenas of knowledge where certainty is not possible. (See their previous dramatic adventures in Sketches of a History of Skepticism Part 1 and Part 2). To do this, they looked at three systems: Epicureanism, which suggests that we have certain knowledge of the world perceived by the senses, but no certain knowledge of the imperceptible atomic reality beneath; Platonism, which suggests that we have knowledge of the eternal structures that create the material world, i.e. Forms or Ideas, but not of the flawed, corruptible material objects which are the shadows of those eternal structures; and Aristotelianism, which suggests that we can have certain knowledge of logical principles and of categories within Nature, but not of individual objects.
Notably, neither Epicurus nor Aristotle was invited to our picnic, and, while you never know when any given Socrates will turn out to be a Plato in disguise, our particular Socrates seems to be staying safely in the camp of doubt: he knows that he knows nothing. Our object is not to determine which of these classical camps has the correct Criterion of Truth. In fact, our distinguished guests, Descartes and Sartre, aren’t interested in rehashing these three classical systems, all of whose criteria are not only familiar, but, to them, long defunct. They have not come this great distance in time to watch Socrates open the doors of skepticism to our Youth just to meet antiquity’s familiar dogmatists; the twinkle in Descartes’ eye (and his infinite patience doling out eclairs) tells me he’s waiting for something else.
Descartes and Sartre expect Cicero next — Cicero, whom many might mistake as a voice for the Stoic school (the intellectual party conspicuously missing from the assembly of Plato, Aristotle, and Epicurus) but who is actually more often read by modern scholars as a new and promising kind of Skeptic. Unfortunately, Cicero is currently busy answering a flurry of letters from someone called Petrarch, so has declined to join our little gathering (or possibly he’s just miffed hearing that I’m doing an abbreviated finale to this series, so he’d only get a couple paragraphs, even if he came). So we must do our concise best to cover his contribution on our own. Pyrrho, Zeno and other early skeptical voices argued in favor of doubt by demonstrating the fallibility of the senses and of pure reason: the stick in water that looks bent, the paradoxes of motion which show how logic and reality don’t match. Cicero achieves unbelief (and aims at the eudaimonist tranquility beyond) by a different route, a luxurious one made possible by the fact that he is writing three centuries into the development of philosophy and has many different dogmatic schools to fall back on. In his philosophical dialogs, Cicero presents different interlocutors who put forth different dogmatic positions: Stoic, Platonist, Epicurean; all in dialog with each other, presenting evidence for their own positions and counter-arguments against the conclusions of others. Each interlocutor works strictly by his own Criterion of Truth, and all argue intelligently and well. But they all disagree. When you read them all together, you are left uncertain. No particular voice seems to overtop the others, and the fact that there are so many different equally plausible positions, defended with equally well-defined Criteria of Truth, leaves one with no confidence that any of them is reliable. 
At no point does Cicero say “I am a skeptic, I think there is no certainty,” — but the effect of reading the dialog is to be left with uncertain feelings. Cicero himself does not seem to have been a Pyrrhonist skeptic, and certainly does seem to hold some philosophical positions, especially moral principles, quite strongly. There is certainly a good case to be made that he has strong Stoic leanings, and there is validity to the Renaissance argument that he should be vaguely clustered in with Seneca and Cato, who subscribe to a mixed-together digest of Roman paganism, Stoicism, some Platonic and a few Aristotelian elements. But especially on big questions of epistemology, ontology and physics, Cicero remains solidly, frustratingly, elusive.
There are many important aspects of Cicero’s work, but for our purposes the most important is this: he has achieved doubt without actually making any skeptical arguments, or counter-arguments. He has not attacked the fundamentals of Stoicism, Platonism or Epicureanism. Instead, he has used the strengths of the three schools to undermine each other. All three schools are convincing. All are plausible. All have evidence and/or logic on their side. As a result, none of the three winds up feeling convincing, even though none of the three has been directly undermined. This is not a new achievement of Cicero’s. Epicurus used a similar technique, and Lucretius, his follower, did so too; and we know Cicero read Lucretius. But Cicero is the most important person to use this technique in antiquity, largely because 1,300 years later it will be Cicero who becomes the centerpiece of Renaissance education. And Cicero will have no small Medieval legacy as well.
Medieval Certainty, and the Big Question
Stereotypically for a Renaissance historian, I will move quickly through the Middle Ages, though not for the stereotypical reasons. I don’t think that the Middle Ages were an intellectual stasis; I do think that Medieval philosophy is full of many complex things that I’m just starting to seriously work through in my own studies. I’m not ready to provide a light, fun summary of something which is, for me, still a rich forest to explore. Church Fathers, late Neoplatonists, Chroniclers, theological councils, monastic leaders, rich injections from the Middle East, Maimonides; all intersect with doubt, certainty and Criteria of Truth in rich and fascinating ways that I am not yet prepared to do justice to. So instead I will present an abstraction of one important aspect of Medieval thinking which I hope will help elucidate some overall approaches to doubt, even if I don’t pause to look at individual minds.
When I was in my second year of grad school, I chatted over convenience store cookies in the grad student lounge with a new student entering our program that year, like myself, to study the Renaissance. He poked fun at the philosophers of the Middle Ages. He asked me, “How could anybody possibly be interested in going on and on and on and on like that about God?” And in that moment of politeness, and newness, and fun, I laughed, and nodded. But, happily, we had a good teacher who made us look more at the Medieval, without which we can’t understand the Renaissance, and now I would never laugh at such a comment.
Set aside your modern mindset for a moment, and your modern religious concepts, and see if you can jump into the Medieval mind. To start with, there is a Being of infinite power, Whose existence is known with certainty. (Take that as given — a big given, I know, but it’s a given in this context.) Such a Being created everything that ever has existed or will exist. Everything that happens: events, births, storms, falling objects, thoughts; all were conceived by this Being and exist according to this Being’s script. The Being possesses all knowledge, and all good things are good because they resemble this Being. Everything in the material world is fleeting and imperfect and will someday be destroyed and forgotten, including the entire Earth. But — this Being has access to another universe where all things are eternal and perfect, which will last beyond the end of the material universe, and with this Being’s help there might be some way for us to reach that universe as well. The Being created humans with particular care, and is trying to communicate with us, but direct communication is a difficult process, just as it is difficult for an entomologist to communicate directly with his ants, or for a computer programmer to communicate directly with the artificial intelligences that she has programmed.
Now, the facetious question I laughed at in early grad school comes back, but turned on its head. How could you ever want to study anything other than this Being? It explains everything. You want to know the cause of weather, astronomical events, diseases, time? The answer is this Being. You want to know where the world came from, how thought works, why there is pain? The answer is this Being. History is a script written by this Being, the stars are a diagram drawn by this Being, the suitability and adaptation of animals and plants to their environments is the ingenuity of this Being, and the laws that make rocks sink and wood float and fire burn and rain fall are all decisions made by this Being. If you have any intellectual curiosity at all, wouldn’t it be an act of insanity to dedicate your life to anything other than understanding this Being? And in a world in which there has been, for centuries, effective universal consensus on all these premises, what society would want to fund a school that didn’t study them? Or pay tuition for a child to study something else? Theology dominated other sciences in the Middle Ages, not because people were backward, or closed-minded, or lacked curiosity, but because they were ambitious, keenly intellectual and fixed on a subject from which they had every reason to expect answers, not just to theological questions, but to all questions. They didn’t have blinders, they had their eyes on the prize, and they felt that choosing to study Natural Philosophy (i.e. the world, nature, biology, plants, animals) instead of Theology was like trying to study toenail clippings instead of the being they were clipped from.
To put it another way: have you ever watched a fun, formulaic, episodic genre show like Buffy the Vampire Slayer, or the X-Files? There’ll be one particular episode where the baddie-of-the-day is Christianity-flavored, and at some point a manifest miracle happens, or an angel or a ghost shows up, and then we have to reset the formula and move onto the next episode, but you spend that whole next episode thinking, “You know, they just found proof of the existence of the afterlife and the immortality of the soul. You’d think they’d decide that’s more important than this conspiracy involving genetically-modified corn.” That’s how people in the Middle Ages felt about people who wanted to study things that weren’t God.
Doubt comes into this in important ways, but not the ways that modern rhetoric about the Middle Ages leads most people to expect.
Wikipedia, at the time of writing, defines Scholasticism as “a method of critical thought which dominated teaching by the academics (“scholastics,” or “schoolmen”) of medieval universities in Europe from about 1100 to 1700.” It was “a program of employing that [critical] method in articulating and defending dogma in an increasingly pluralistic context.” It “originated as an outgrowth of, and a departure from, Christian monastic schools at the earliest European universities.” Philosophy students traditionally define Scholasticism as “that incredibly boring hard stuff about God that you have to read between the classics and Descartes”. Both definitions are true. Scholasticism is an incredibly tedious, exacting body of philosophy, intentionally impenetrable, obsessed with micro-detail, and happy to spend three thousand words proving to you that Good is good, or to set out a twenty-step argument that it is better to exist than not to exist (this is presumably why Hamlet still hadn’t graduated at age 30). Scholasticism was also so incredibly exciting that, apart from the ever-profitable medical and law schools, European higher education devoted itself to practically nothing else for the whole late Middle Ages, and, even though the intellectual firebrands of both the Renaissance and the 17th and 18th centuries devoted themselves largely to fiercely attacking the scholastic system, it did not truly crumble until deep into the Enlightenment.
Why was Scholasticism so exciting? Even if people who believed in an omnipotent God had good reason to devote their studies pretty exclusively to Theology, why was this one particularly dense and intentionally difficult method the method for hundreds of years? Why didn’t they write easy-to-read, penetrable treatises, or witty philosophical tales, or even a good old-fashioned Platonic-type dialog?
The answer is that Christianity changes the stakes for being wrong. In antiquity, if you’re wrong about philosophy, and the philosophical end of theology, you’ll make incorrect decisions, possibly lead a sadder or less successful life than you would otherwise, and it might mean your legacy isn’t what you wanted it to be, but that’s it. If you’re really, really wrong you might offend Artemis or something and get zapped, but it’s pretty easy to cover your bases by going to the right festivals. By the logic of antiquity, if you put a Platonist and an Epicurean in a room, one of them will be wrong and living life the wrong way, at least in some ways, but they can both have a nice conversation, and in the end, either they’ll both reincarnate and the Epicurean will have another chance to be right later, or they’ll both disperse into atoms and it won’t matter. OK. In Medieval Christianity, if you’re wrong about theology, your immortal soul goes to Hell forever, where you’ll be tormented by unspeakable devils for the rest of eternity, and everyone else who believes your errors is also likely to lose the chance of eternal paradise and absolute knowledge, and will be plunged into a pit of absolute misery and despair, irrevocably, forever. Error is incredibly dangerous, to you and to everyone around you who might get pulled down with you. If you’re really bad, you might even bring the wrath of God down upon your native city, or if you’re really bad then, while you’re still alive, your soul might depart your body and sink down to Hell, leaving your body to be a house for a devil who will use you to visit evil on the Earth (see Inferno Canto 27). But leaving aside those more extreme and superstition-tainted possibilities, error became more pernicious because of eternal damnation. If people who read your theologically incorrect works go to Hell, you’re infinitely culpable, morally, since every student misled to damnation is literally an infinite crime.
So, if you are a Medieval person, Theology is incredibly valuable, the only kind of study worth doing, but also incredibly dangerous. You want to tread very carefully. You want a lot of safety nets and spotters. You want ways to avoid error. And you know error is easy! Errors of logic, errors of failing senses. Enter Aristotle, or more specifically enter Aristotle’s Organon, a translation of the logical works of Aristotle completed by dear Boethius, part of the latter’s efforts to preserve Greek learning when he realized Greek and other relics of antiquity were fading. The Organon explains, in great detail, how you can go about constructing chains of logic in careful, methodical ways to avoid error. Use only clearly defined unequivocal vocabulary, and strict syllogistic and geometric reasoning. Here it is, foolproof logic in 50 steps, I’ll show you! Sound familiar? This is Aristotle’s old Criterion of Truth, but it’s also the Medieval Theologian’s #1 Christmas Wish List. The Criterion of Truth which was, for Aristotle, a path through the dark woods and a solution to Zeno and the Stick in Water, is, to our theologian, a safety net over a pit of eternal Hellfire. That is why it was so exciting. That is why people who wanted to do theology were willing to train for five years just in logic before even looking at a theological question, just as astronauts train in simulators for a long time before going out into the deadly vacuum of space! That is even why scholastic texts are so hard to read and understand – they were intentionally written to be difficult to read, partly because they’re using an incredibly complicated method, but even more because they don’t want anyone to read them who hasn’t studied their method, because if you read them unprepared you might misunderstand, and then you’d go to Hell forever and ever and ever, and it would be Thomas Aquinas’s fault. And he would be very sad.
When Thomas Aquinas was presented for canonization, after his death, they made the argument that every chapter of the Summa Theologica was itself a miracle. It’s easy to laugh, but if you think about how desperately they wanted perfect logic, and how good Aquinas was at offering it, it’s an argument I understand. If you were dying of thirst in the desert, wouldn’t a glass of water feel like a miracle?
To give credit where credit is due, the mature application of Aristotle’s formal logic to theological questions was not pioneered by Aquinas but by a predecessor: Peter Abelard, the wild rockstar of Medieval Theology. People crowded in thousands and lived in fields to hear Peter Abelard preach, it was like Woodstock, only with more Aristotle. Why were people so excited? Did Abelard finally have the right answer to all things? “Yes and No,” as Peter Abelard would say, “Sic et Non”, that being the title of his famous book, a demonstration of his skill. (Wait, yes AND no, isn’t that even scarier and worse and more damnable than everything else? This is the most dangerous person ever! Bernard of Clairvaux thought so, but the Woodstock crowd at the Paraclete, they don’t.) Abelard’s skill was taking two apparently contradictory statements and showing, by elaborate roundabout logic tricks, how they agree. Why is this so exciting? Any troll on the internet can do that! No, but he did it seriously, and he did it with Authorities. He would take a bit of Plato that seemed to contradict a bit of Aristotle, and show how they actually agree. Even ballsier, he would take a bit of Plato that pretty manifestly DOES contradict another bit of Plato, and show how they both agree. Then, even better, he would take a bit from St. Augustine that seems to contradict a bit from St. Jerome and show how the two actually agree. “OH THANK GOD!” cries Medieval Europe, desperately perplexed by the following conundrum:
The Church Fathers are saints, and divinely inspired; their words are direct messages from God.
If you believe the Church Fathers and act in accordance with their teachings, they will show you the way to Heaven; if you oppose or doubt them, you are a heretic and damned for all eternity.
The Church Fathers often disagree with each other.
Abelard rescued Medieval Europe from this contradiction, not necessarily by his every answer, but by his technique by which seemingly-contradictory authorities could be reconciled. Plato with Aristotle is handy. Plato with Plato sure is helpful. Jerome with Augustine is eternal salvation. And if he does it with the bits of Scripture that seem to contradict the other bits? He is now the most exciting thing since the last time the Virgin Mary showed up in person.
Abelard had a lover–later, wife, but she preferred ‘lover’–the even more extraordinary Heloise, and I consider it immoral to mention him without mentioning her, but her life, her stunningly original philosophical contributions and her terrible treatment at the hands of history are subjects for an essay of their own. For today, the important part is this: Abelard was exciting for his method, more than his ideas, his way of using Reason to resolve doubts and fears when skepticism loomed. Thus even Scholasticism, the most infamously dogmatic philosophical method in European history, was also in symbiosis with skepticism, responding to it, building from it, developing its vast towers of baby-step elaborate logic because it knew Zeno was waiting.
Proofs of the Existence of God
We are all very familiar with the veins of Christianity which focus on faith without proof as an important part of the divine plan, that God wants to test people, and there is no proof of the existence of God because God wants to be unknowable and elusive in order to test people’s faith. The most concise formula is the facetious one by Douglas Adams, where God says: “I refuse to prove that I exist, because proof denies faith and without faith I am nothing.” It’s a type of argument associated with very traditional, conservative Christianity, and, often, with its more zealous, bigoted, or “medieval” side. I play a game whenever I run into a new scholar who works on Medieval or early modern theological sources, any sources, any period, any place, from pre-Constantine Rome to Renaissance Poland. I ask: “Hey, have you ever run into arguments that God’s existence can’t be proved, or God wants to be known by faith alone, before the Reformation?” Answers: “No.” “Nope.” “Naah.” “No, never.” “Uhhh, not really, no.” “Nope.” “No.” “Nothing like that.” “Hmm… no.” “Never.” “Oh, yeah, one time I thought I found that in this fifth-century guy, but actually it was totally not that at all.” Like biblical literalism, it’s one of these positions that feels old because it’s part of a conservative position now, but it’s actually a very recent development from the perspective of 2,000 years of Christianity plus centuries more of earlier theological conversations. So, that isn’t what the Middle Ages generally does with doubt; it doesn’t rave about faith or God’s existence being elusive. Europe’s Medieval philosophers were so sure of God’s existence that it was considered manifestly obvious, and doubting it was considered a mental illness or a form of mental retardation (“The fool said in his heart ‘there is no God’,” ergo there must be some kind of brain deficiency which makes people doubt God; for details on this see Alan C. Kors, Atheism in France, vol. 1). And when St. Anselm and Thomas Aquinas and Duns Scotus work up technical proofs of the existence of God, they’re doing it, not because they or anyone was doubting the existence of God, but to demonstrate the efficacy of logic. If you invent a snazzy new metal detector you first aim it at a big hunk of metal to make sure it works. If you design a sophisticated robot arm, you start the test by having it pick up something easy to grab. If you want to demonstrate the power of a new tool of logic, you test it by trying to prove the biggest, simplest, most obvious thing possible: the existence of God.
(PARENTHESIS: Remember, I am skipping many Medieval things of great importance. *cough*Averroes*cough* This is a snapshot, not a survey.)
Three blossoms on the thorny rose of this Medieval trend toward writing proofs of the existence of God are worth stopping to sniff.
The first blossom is the famous William of Ockham (of “razor” fame) and his “anti-proof” of the existence of God. Ockham was a scholastic, writing in response to and in the same style and genre as Abelard, Aquinas, Scotus, and their ilk. But, when one read along and got to the bit where one would expect him to demonstrate his mastery of logic by proving the existence of God, he included instead a plea (paraphrase): Please, guys, stop writing proofs of the existence of God! Everyone believes in Him already anyway. If you keep writing these proofs, and then somebody proves your proof wrong by pointing out an error in your logic, reading the disproof might make people who didn’t doubt the existence of God start to doubt Him, because they would start to think the evidence for His Existence doesn’t hold up! Some will read into this Anti-Proof hints of the beginning of “God will not offer proof, He requires faith…” arguments, and perhaps it does play a role in the birth of that vein of thinking. (I say this very provisionally, because it is not my area, and I would want to do a lot of reading before saying anything firm.) My gut says, though, that it is less that Ockham thought God was trying to hide, and more that he thought everyone by nature believed in God, that God’s existence was so incredibly obvious, that he simply didn’t want the weakness of fractious scholastic in-fighting to erode what he thought was already there in everyone: belief.
Aside: While we are on the subject of Ockham, a few words on his “razor”. Ockham is credited with the principle that the simplest explanation for a thing is most likely to be the correct one. That was not, in fact, a formula he put forward in anything like modern scientific terms. Rather, what we refer to as Ockham’s Razor is a distillation of his approach in a specific argument: Ockham hated the Aristotelian-Thomist model of cognition, i.e. the explanation of how sense perception and thoughts work. Hating it was fair, and anyone who has ever studied Aristotle and labored through the agent intellect, and the active intellect, and the passive intellect, and the will, and the phantasm, and innate ideas, and eternal Ideas, and forms, and categories, and potentialities, shares William of Ockham’s desire to pick Thomas Aquinas up and shake him until all the terminology falls out like loose change, and then tell him he’s only allowed to have a sensible number of incredibly technical terms (like 10, 10 would be a HUGE reduction!). Ockham proposed a new model of cognition which he set out to make much simpler, without most of the components posited by Aristotle and Aquinas, and introduced formal Nominalism. (Here Descartes cheers and sets off a little firecracker he’s been saving.) Nominalism is the idea that “concepts” are created by the mind based on sense experience, and exist ONLY in the mind (like furniture in a room, adds Sherlock Holmes) rather than in some immaterial external sense (like Platonic forms). Having vastly simplified and revolutionized cognition, Ockham then proceeded to describe the types of concepts, vocabulary terms and linguistic categories we use to refer to concepts in infuriating detail, inventing fifty jillion more technical terms than Aquinas ever used, and driving everyone who read him crazy.
(If you are ever transported to a dungeon where you have to fight great philosophers personified as Dungeons & Dragons monsters, the best weapon against Ockham is to grab his razor of +10 against unnecessary terminology and use it on the man himself). One takeaway note from this aside: while “Ockham’s Razor” is a popular rallying cry of modern (post-Darwin) atheism, and more broadly of modern rationalism, that is a modern usage entirely unrelated to the creator himself. He thought that the existence of God was so incredibly obvious, and necessary to explain so many things, from the existence of the universe to the buoyancy of cork, that if you presented him with the principle that the simplest explanation is usually best, he would agree, and happily assume that you believed, along with him, that “God” (being infinitely simple, see Plotinus and Aquinas) is therefore a far simpler answer to 10,000 technical scientific questions than 10,000 separate technical scientific answers. Like Machiavelli, Aristotle and many more, Ockham would have been utterly stunned (and, I think, more than a little scared) if he could have seen how his principles would be used later.
The second blossom (or perhaps thorn?) of this Medieval fad of proving God’s existence was, well, that Ockham was 110% correct. Here again I cite Alan Kors’ masterful Atheism in France; in short, his findings were that, when proving the existence of God became more and more popular as the first field test to make sure your logical system worked (a la metal detector… beep, beep, beep, yup, it’s working!), it created an incentive for competing logicians to attack people’s proofs of the existence of God (i.e. if it can’t find a giant lump of iron the size of a house, it’s not a very good metal detector, is it?). Thus believers spent centuries writing attacks on proofs of the existence of God, not because they doubted, but to prove their own mastery of Aristotelian logic superior to others’. This generated thousands of pages of arguments against the existence of God, and, by a bizarre coincidence *cough*cough*, when, in the 17th and 18th centuries, we finally do start getting writings by actual overt “I really think there is no God!” atheists, they use many of the same arguments, which were waiting for them, readily available in volumes upon volumes of Church-generated books. Dogmatism here fed and enriched skepticism, much as skepticism has always fed and enriched dogmatism, in their ongoing and fruitful symbiosis.
The third blossom is, of course, sitting with us doling out eclairs. Impatient Descartes has been itching, ever since I mentioned Anselm, to leap in with his own Proof of the Existence of God, one which uses a more mature form of Ockham’s Nominalism, coupled with the tools of skepticism, especially doubt of the senses. But Descartes may not speak yet! (Don’t make that angry face at me, Monsieur, you’ll agree when you hear why.) It won’t be Descartes’ turn until we have reviewed a few more details, a little Renaissance and Reformation, and introduced you to Descartes’ great predecessor, the fertile plain on whom Descartes will erect his Cathedral. Smiling now, realizing that we draw near the Illustrious Father of Skeptics whom he has been waiting for, Descartes sits back content, until next time.
But do not fear, the wait will be short this time. Socrates is in more suspense than Descartes, and if I stop writing he’ll start demanding that I define “illustrious” or “next” or “man”, so I’d better plunge straight in. Meanwhile, I hope you will leave this little snapshot with the following takeaways:
Medieval thought was not dominated by the idea that logic and inquiry are bad and Blind Faith should rule; much more often, Medieval thinkers argued that logic and inquiry were wonderful because they could reinforce and explain faith, and protect people from error and eternal damnation. Medieval society threw tons of energy into the pursuit of knowledge (scientia, science); it’s just that they thought theology was 1000x more important than any other topic, so they concentrated the resources there.
When you see theologians discussing whether certain areas of knowledge are “beyond human knowledge” or “unknowable”, before you automatically call this a backwards and closed-minded attitude, remember that it comes from Plato, Epicurus and Aristotle, who tried to differentiate knowledge into areas that could be known with certainty, and areas where our sources (senses/logic) are unreliable, so there will always be doubt. The act of dividing certain from uncertain only becomes closed-minded when “that falls outside what can be known with certainty” becomes an excuse for telling the bright young questioner to shut up. This happened, but not always.
Even when there were not many philosophers we could call “skeptics” in the formal sense, and the great ancient skeptics were not being read much, skepticism continued to be a huge part of philosophy because the tools developed to combat it (Aristotle’s logical methods, for example) continued to be used, expanded and re-purposed in the ongoing search for certainty.
Welcome to a new feature here on Ex Urbe — the promoted comment.
From time to time, Ada makes a long, substantive, chewy comment which could almost be its own post. Making it into an actual post would take valuable time. The comment is already written and fascinating — but hidden down in a comment thread where many people may not notice it. From now on, when this happens, I will extract it and promote it. I may even go back and do this with some older especially awesome comments. You’ll be able to tell the difference between this and a real post because it’ll say it’s posted by Bluejo and not by Exurbe, because it will say “a promoted comment”, and because it won’t be full of beautiful, relevant, carefully selected art but will have just one or two pieces of much more random art.
I thoroughly enjoyed reading this new post. As I am reviewing macroeconomics, especially the different variations of the Solow Model, I cannot help but link “intellectual technology” with a specific endogenous growth model, which attempts to let the model itself generate technological growth without an exogenous “manna from heaven”. In this model, technology growth arises endogenously from capital as a “productive externality”: individual workers, through “learning by doing,” gain more “skills” as capital grows. Of course, the “technology factor” in the model I learned is vaguely defined and does not cover the many definitions and various effects of “intellectual technology” not directly related to economic production.
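(For readers curious what this looks like formally, here is a minimal sketch of the learning-by-doing mechanism, in notation of my own choosing following the standard Arrow/Romer formulation rather than any particular course’s version:)

```latex
% Production with labor-augmenting technology A:
\[ Y = K^{\alpha}(AL)^{1-\alpha}, \qquad 0 < \alpha < 1 \]
% Learning by doing: workers' skill A is a productive
% externality of the aggregate capital stock, not "manna
% from heaven":
\[ A = B K^{\phi}, \qquad B > 0 \]
% In the knife-edge case \phi = 1, with L fixed, substituting gives
\[ Y = K^{\alpha}(BKL)^{1-\alpha} = B^{1-\alpha} L^{1-\alpha}\, K \]
```

Output becomes linear in capital (the “AK” form): the marginal product of capital no longer diminishes, so growth can be sustained endogenously, without any exogenous technological progress term.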
Your conversation with Michael reminds me of the lectures and seminars I took with you at Texas A&M. By the time I took your Intellectual History from the Middle Ages to the 17th Century, I had already taken some classes on philosophy. Sadly, my fellow philosophy students and I usually fell into anachronism and criticized early thinkers a bit “unfairly” on many issues. That is why your courses were like a beam of light to me, for I had never been aware of the fact that we have different logic, concepts, and definitions of words from our predecessors and should hence put those thinkers back into their own historical context.
It seems to me that Prof. Peter E. Gordon’s essay “What is Intellectual History?” captures the different angles from which you and Michael construe Machiavelli: Michael seems more like a philosophy/political science student who attempts to examine how and why early thinkers’ ideas work or fail to work for our society based on our modern definitions, concepts, and logic, thus raising more debates on political philosophy and pushing the progress of philosophical innovation; your role as an intellectual historian requires one to be detached from our own understanding of ideas and concepts, and to be aware of even the logic that seems to be rooted in our subconscious, so as to examine a past thinker fairly, without rash judgement. Michael is like one who attempts to keep building the existing tower upward, while you are carefully examining the foundation below. For me personally, it would be nice to have both of these two ways of thinking.
I have a question: I have been attempting to read a bit of Karl Marx whenever time allows. He argues that our thinking and ideology are a reflection of our material conditions. If we accept his point of view, would it be useful to connect intellectual history with economic history?
Nahua, I think you have hit it spot on with your discussion of Peter Gordon’s essay. When I worked with him at Harvard (I had the privilege of having him on my committee, as well as being his teaching assistant for a course) I remember being struck by how, even when we were teaching thinkers far outside my usual scope like Heidegger, I found his presentation of them welcoming and approachable despite my lack of background, because he approached them in the same context-focused way that I did, evaluating not their correctness or their applicability to the present, but their roots in their contemporary historical contexts and the reasons why they believed what they believed.
As for Marx’s claim that “our thinking and ideology are a reflection of our material conditions,” I think it is often very useful to connect intellectual history with economic history, not in a strictly deterministic way, but by considering economic changes as major environmental or enabling factors that facilitate or deter intellectual change and/or the dissemination of new ideas. I already discussed the example of how I think the dissemination of feminism in the 19th century was greatly facilitated by the economic liberation of female labor through the development of industrial cloth production, more efficient ways of doing laundry, cleaning, cooking, etc. Ideas about female equality existed in antiquity. They enjoyed a large surge in conversation and support from the intellectual firebrands of the Enlightenment, through figures like Montesquieu, Voltaire and Wollstonecraft. But mass movements and substantial political changes, like female suffrage, came when the economic shift had occurred. To use the “intellectual technology” concept, the technology existed in antiquity and was revived and refined in the 18th century, but it also required economic shifts to help it reach a state in which large portions of the population, or whole nations/governments, could embrace and employ it.
As I work on Renaissance history, I constantly feel the close relationship between economics and the intellectual world as well. Humanism as I understand it began when Petrarch called for a revival of antiquity. Economics comes into this in two ways. First, the reason he thought a revival of antiquity was so desperately necessary was that Italy had become so politically tumultuous and unstable, and was under such threat of cultural or literal invasion from France–these are largely the consequences of economic situations, since Italy’s development of banking and its central position as a trade hub for the Mediterranean had filled its small, vulnerable city-states with incomparable wealth, creating situations where powerful families could feud, small powers could hire large mercenary armies, and every king in Europe wanted to invade Italy for a piece of its plump pie. Then, after Petrarch, humanism’s ability to spread and succeed was also economically linked. You can’t have a humanist without books, you just can’t: it’s about reading, studying, correcting and living the classics. But in an era when a book cost as much as a house, and more than a year’s salary for a young schoolmaster, a library required a staggering investment of capital. That required wealthy powers–families or governments–to value humanism and have the resources to spend on it. Powers like the Medici, and Florence’s Republican government, were convinced to spend their money on libraries and humanism because they believed it would bring them glory, strength, respect, legitimacy, the love of the people, that it would improve life, heal their souls, bring peace, and make their names ring in posterity; but they couldn’t have made the investment if they hadn’t had the money to invest, and they wouldn’t have believed humanism could yield so much if not for the particular (and particularly tumultuous) economic situation in which Renaissance Italy found itself.
Yesterday I found myself thinking about the history of the book in this light, and comparing it to some comments I heard a scientist make on a panel about space elevators. We all want a space elevator–then space exploration will become much, much less expensive, everyone can afford satellites, space-dependent technologies will become cheap, and we can have a Moon Base, and a Mars program, and all the space stations we want, and all our kids can have field trips to space (slight exaggeration). To have a space elevator, we need incredibly strong cables, probably produced using nanofibers. Developing nanofibers is expensive. What the scientist pointed out is that he has high hopes for nanofiber development, because nanofibers have the ideal demand pattern for a new technology. A new technology like this has the problem that, even if there are giant economic benefits to it later on, the people who pay for its development need a short-term return, which is difficult in the new baby stages of a technology when it’s at its most expensive. (Some of you may remember the West Wing episode where they debate the price of a cancer medication, arguing that producing each pill costs 5 cents so it’s unfair to charge more, to which the rebuttal is that the second pill cost 5 cents, but the first pill cost $300 million in research.) Once nanofiber production becomes cheap, absolutely it will be profitable, but while it’s still in the stage of costing $300 million to produce a few yards of thread, that’s a problem, and can be enough to keep a technology from getting support. One of the ways we work around this as a society today is the university system, which (through a form of patronage) supports researchers and gives them liberty to direct research toward avenues expected to be valuable independent of profit. Another is grant funding, which gives money based on arguments for the merit of a project without expecting to be paid back.
A third is NASA, which develops new technologies (like velcro, or pyrex) to achieve a particular project (Moon!), which are then used and reused in society for the benefit of all. But looking at just the private sector, at the odds of a technology getting funding from investors rather than non-profits, what the scientist said is that, for a technology to receive funding, you want it to have a big long-term application which will show that you’ll make a steady profit once you can make lots of the thing, but it also needs to have a short-term application for which a small number of clients will be prepared to pay an enormous amount, so you can sell it while it still costs $300 million, as well as expecting to sell it when it costs 5 cents. Nanofibers, he said, hit this sweet spot because of two demands. The first is body armor, since it looks like nanofibers can create bullet-proof fabric as light as normal fabric, and if we can do that then governments will certainly pay an enormous amount to get bullet-proof clothing for a head of state and his/her bodyguards, and for elite military applications. The second is super-high-end lightweight golf clubs, which may seem like a frivolous thing, but there are people who will pay thousands of dollars for an extremely high-end golf club, and that is something nanofibers can profit from even while expensive (super lightweight bicycles for racing also qualify). So nanofibers can depend on the excitement of the specific investors who want the expensive version now, and through their patronage develop toward the ability to produce things cheaply.
In this sense the history of the book, especially in the Renaissance, was very similar to the situation with nanofibers. In the early, manuscript stage, when each new book cost the equivalent of $50,000 (very rough estimate), libraries were built and humanism was funded because wealthy people like Niccolo Niccoli and Cosimo de Medici believed that humanist libraries would give them and their home city political power and spiritual benefits, helping them toward Heaven. That convinced them to invest their millions. Their investments then created the libraries which could be used later on by larger populations, and reproduced cheaply through printing once it developed, but printing would not have developed if patrons like them hadn’t been around to create the demand for the volume of books printing could produce. It took Petrarch, Niccoli and Cosimo to fund a library which could raise a generation of people who could read the classics before there was enough demand to sell the 300-1500 copies of a classical book that a printing press could print. And, working within current capitalism, it may take governments who really want bullet-proof suit jackets to give us our space elevator, though universities, NASA, and private patronage of civilian space programs are certainly also big factors pushing us forward.
In sum, I would say that economics sometimes sparks the generation of new ideas–as the economically-driven strife Petrarch experienced enabled the birth of humanism–but it also strongly affects how easily or quickly a new idea can disseminate, whether it gets patronage and support, or whether its champions have to spread it without the support of elites, patrons or government. Thus, in any given era, an intellectual historian needs to have a sense of funding patterns and patronage systems, so we can understand how ideas travel, where, and why.
One more thought from last night, or rather a test comparison showing how the concept “intellectual technology” can work. I was thinking about comparing atomism and steel.
Steel is a precursor for building skyscrapers. Despite urban demand, we didn’t get a transition to huge, towering metropoles until the development of good steel which could raise our towers of glittering glass. Of course, steel is not the ONLY precursor of the skyscraper–it also requires tempered glass, etc. And it isn’t the only way to build skyscrapers; you can use titanium, or nanotech, but you are very unlikely to get either of those things without going through steel first. Having steel does not guarantee that your society will have skyscrapers. Ancient Rome had steel. In the Middle Ages Europe lost it (though pretty much everywhere except Europe still had steel). When steel came back in the Renaissance it still didn’t lead immediately to skyscrapers; it required many other developments first, and steel had to combine with other things, including social changes (growth of big cities). But when we look at the history of city development, studying steel is extremely important because the advent of steel-frame construction is a very important phase, and a central enabling factor for the development of modern cities.
My Lucretius book looks at the relationship between atomism and atheism in the same way that this analysis looks at steel and skyscrapers. Atomism was around for a long time, went away, came back, etc. And you can have non-atomic atheism, we have lots of it now. But atomism, as the first fully-developed mechanical model of the working of Nature (the first not dependent on God/gods to make the world work) was, in my opinion, one of the factors that you needed to combine with other developments to reach a situation in which an intellectual could combine mechanical models of nature with skepticism with other factors to develop the first fully functional atheistic model of the world. It’s one of the big factors we have to trace to ask “Why did atheism become a major interlocutor in the history of thought when it did, and not before or after?” just as tracing steel helps us answer “Why did skyscrapers start being built when they did?” There had almost certainly been atheisms before and independent of atomism (just as you can make really tall things, like pyramids or cliff-face cities, without steel-frame construction) but it was rare, and didn’t have the infrastructural repeatability necessary to let it become widespread. Modern atheists don’t use Epicurus, they more frequently use Darwin, just as modern skyscrapers use titanium, but the history of skyscrapers becomes clear when we study the history of steel. Just so, the history of atheism becomes much clearer when we study atomism. Of course, we now use steel for lots of things that aren’t skyscrapers (satellite approaching Pluto!), and similarly atomism has lots of non-atheist applications, but we associate atomism a lot with atheism, just as we think a lot about “towers of glass and steel” and usually think less about the steel bolts in our chairs or the steel spoons we eat with. 
All applications of steel, or Epicureanism, can be worth studying, but skyscrapers/atheism will never stop being one of the biggest and most interesting, at least in terms of how they changed the face of our modern world. And finally, while a minority of buildings are skyscrapers, and a minority of contemporary people are atheists, the study of both is broadly useful because the presence of both in the lives of everyone is a defining factor in our current world.
Hello, patient friends. The delight of brilliant and eager students, the siren call of a new university library, the massing threat of conjoining deadlines, and the thousand micro-tasks of moving across the country have caused a very long gap between posts. But I have several pieces of good news to share today, as well as new thoughts on Machiavelli:
The next installment of my Sketches of a History of Skepticism series is 2/3 finished, and I hope to have it up in a week or three, deadlines permitting.
I have an excellent new assistant named Mack Muldofsky, who is helping me with Ex Urbe, music, research and many other projects. So we have him to thank in a big way if the speed of my posting picks up this summer.
Because I have a lot of deadlines this summer, I have asked some friends to contribute guest entries here, and we have a few planned treating science, literature and history, so that’s something we can look forward to together.
For those following my music, the Sundown Kickstarter is complete, and it is now possible to order online the CD and DVD of my Norse Myth song cycle Sundown: Whispers of Ragnarok. In addition to the discs, you can also order two posters, one of my space exploration anthem “Somebody Will” and one which is a detailed map of the Norse mythological cosmos. CD sales go to supporting the costs of traveling to concerts.
I have several concerts and public events lined up for the summer:
At Mythcon (July 31-Aug 2), Lauren Schiller and I, performing as the duo “Sassafrass: Trickster and King,” will join Guest of Honor Jo Walton for “Norse Hour,” in which she will read Norse myth-themed poetry in alternation with our Norse-themed songs.
Sunday August 9th, I have been invited to do a reading of the freshly-polished opening chapters of my novel Too Like the Lightning (due out in Summer 2016) at the Tiptree Award Ceremony event honoring Jo Walton, who couldn’t make it to the initial ceremony but received the Tiptree this year for her novel My Real Children. The event is being held at Borderlands in San Francisco at 3 PM, and will feature readings by local authors, and music performed by myself and Lauren.
Monday August 17th, at 7 PM, I am joining Jo and Lauren again at Powell’s, where Jo will read from her books, Lauren and I will sing, and I will interview Jo and talk about my writing as well as hers.
Finally at Sasquan (Worldcon, Aug 19-23) Lauren and I will have a full concert, I will do another reading from Dogs of Peace, and I will be on several exciting panels.
Meanwhile, I have a little something to share here. I continue to receive frequent responses to my Machiavelli series, and recently one of them sparked such an interesting conversation in e-mail that I wanted to post it here, for others to enjoy and respond to. These are very raw thoughts, and I hope the discussion will gain more participants here in the comment thread (I have trimmed out parts not relevant to the discussion):
In this discussion, I use a term I often use when trying to introduce intellectual history as a concept, and which I have been meaning to write about here for some time, “Intellectual Technology.”
A little conversation about Machiavelli:
I have been reading your blog posts on Machiavelli. You write with tremendous learning, clarity and colour, and really bring past events alive in a brilliant way. But… I think you’re far too soft on Machiavelli!!!
I’m working on a PhD about him and it’s fascinating to see that nearly all present-day academics, and indeed academics during much of the second half of the 20th century, have a largely if not completely uncritical admiration for him and his works. He is lauded, for example, as a forerunner of pluralism and a supporter of republicanism/democracy, yet his clear inspiration of Italian fascism is almost completely overlooked. The fact that Gramsci revered Machiavelli is dealt with by many scholars, but Mussolini’s admiration for him is hurriedly passed over.
Your post on Machiavelli and atheism is really interesting – in that context the 2013 book Machiavelli by Robert Black would be of interest to you…
Best regards, Michael Sanfey, IEP/UCP Lisbon.
Reply from Ada:
Michael,

Thank you for writing in to express your enjoyment of my blog posts. I think your criticisms of Machiavelli are interesting and largely fair, and my own opinions overlap with yours in many ways, though not in others. I agree with you completely that many scholars have an inappropriate tendency to praise Machiavelli as a proto-modern champion of democracy, republicanism, pluralism, modern national pride etc., characterizations which are deeply presentist, reading anachronistic values back into him. But there is also a tendency, dominant earlier in the 20th century, to vilify Machiavelli in precisely the same anachronistic and presentist way, characterizing him as a fascist or a Nazi and reading back into his work the things that were done in the 20th century by people who used some of his ideas but mixed them with many others. My way of approaching Machiavelli focuses above all on trying to distance him from the present and place him in his context, to show that he is neither a modern hero nor a modern villain since he isn’t modern at all. A separate question, which you bring up, is how much to blame or criticize him for opening up the direction of reasoning which led to later consequentialism, and also to fascism, which certainly used him as one of its foundational texts. Here I find myself uncomfortable with the idea of historical blame at all, particularly blame over such a long span of time.
I tend to think of thinkers as toolmakers, or inventors of “intellectual technology”, innovators who have created a new thing which can then be used by many people. New inventions can be used in many ways, anticipatable and unanticipatable. Carbon steel, for example, can be used to raise great towers and send train lines across continents, but it can also be used to build weapons and take lives, so it is a complex question how much to blame the inventor of carbon steel for its many uses. In this sense, I do believe we can see Machiavelli as a weapon-maker, since the ideas he was generating were directly intended to be used in war and politics. We can compare him very directly to the inventor of gunpowder in this sense. I also see him–and this is much of the heart of my critique–as a defensive weapon maker, i.e. someone working in a period of danger and siege trying to create something with which to defend his homeland. So, imagine now the inventor of gunpowder creating it to defend his homeland from an invasion. Is he responsible for all later uses of gunpowder as well? Is he guilty of criminal negligence for not thinking through the fact that, long-term, many more people will be killed by his invention than live in his home town? Do the lives saved by gunpowder throughout its history balance out against the lives lost, in some kind of (Machiavellian/consequentialist) moral calculus? I don’t think “yes” or “no” are fair answers to such a complex question, but I do think it is important, when we think about Machiavelli and what to hold him responsible for, to remember the circumstances in which he created his gunpowder (i.e. consequentialist ethics), and that he invented other great things too, like political science and critical historical reasoning. The debts are complicated, as is the culpability for how inventions are used after the inventor’s death.
So while I join you wholeheartedly in wanting to fight back against the distortion of Machiavelli the Mythical proto-modern Republican, I also think it’s valuable to battle against the myth of Machiavelli the proto-Fascist, and try to create a portrait of the real man as I see him, Machiavelli the frightened Florentine.
I do know Bob Black’s Machiavelli book, but disagree with some of his fundamental ideas about humanism itself – another fun topic, and one I enjoy discussing with him at conferences. He’s a challenging interlocutor. There is a very good recent paper by James Hankins on Academia.edu now about the “Virtue Politics” of humanists, which I recommend that you look at if you’re interested in responses to Black.
Best, Ada Palmer, University of Chicago
More from Michael:
First, I want to thank you for this fantastically detailed and brilliant response… I’d like to “come back at you” on consequentialism and some other points:
* Regarding your point about Machiavelli not being modern at all, I see what you mean, albeit you do say of Machiavelli in the post on atheism that “he is in other ways so very modern”. Leo Strauss certainly thought he had a lot to do with the introduction of what we know as “modernity”.
* When you seek to balance the need to fight against the Proto-republican myth and against the Proto-fascist myth, the first of those “myths” enjoys immeasurably wider currency than the second, and I ask myself, why is this?
* On the “intellectual technology” point below, and its being essentially neutral, in this case I wouldn’t agree with you, because we are not talking here about an object like gunpowder; this actually concerns something much more important. In ethical terms, Machiavelli took transcendent values out of the equation. As you put it, Machiavelli created “an ethics which works without God” – except that it doesn’t work!!!
* Machiavelli has had a questionable impact in regard to “realism” in international relations. You mention in one of the posts that he backed an alliance with Borgia so as to protect Florence, agreeing to offer money and resources to help Borgia conquer more – a very good example of Machiavelli’s undoubted sympathy for imperialism.
PPS On the question of Machiavelli being an atheist or not, I really was fascinated by that part of your Ex Urbe writings. I’ve concluded that, whatever about him being an atheist or not, one could certainly describe him as “ungodly” – would you agree?
Quick response from Ada:
I think “ungodly” does work for Machiavelli depending on how you define it; it has a connotation of being immoral–which does not fit–but if instead you mean it literally as someone who makes his calculations without thinking much about the divine then it fits.
A supplementary comment on “Intellectual Technology”:
I find “intellectual technology” a very useful concept when I try to describe what I study. Broadly my work is “intellectual history” or “the history of ideas” but what I actually study is a bit more specific: how particular kinds of ideas come into existence, disseminate, and come to be regulated at different points in time. The types of ideas I investigate–atomism, determinism, utilitarianism–move through human culture very much the same way technological innovations do. They come into being in a specific place and time, as a result of a single inventor or collaboration. They spread from that point, but their spread is neither inevitable nor simple. Sometimes they are invented separately by independent people in independent places, and sometimes they exist for centuries before having a substantial impact. When a new idea enters a place and comes into common use, it completely changes the situation and makes actions or institutions which worked before no longer viable. I compare Machiavelli’s utilitarianism to gunpowder above, but here are some other examples of famous cases of technological inventions, and ideas which disseminated in similar patterns:
The Bicycle and Atomism
Leonardo da Vinci sketched a design for a bicycle in the Renaissance, and may have seriously tried to construct one, but afterward no one did so for a very long time. Then many other factors changed: the availability of rubber and light-weight strong metals, the growth of large, centralized cities and a working population in need of inexpensive transit, and suddenly the bicycle was able to combine with these other factors to revolutionize life and society in a huge rush, first across Europe and then well beyond. We have moved on to develop more complex technologies that achieve the same function, but we still use and develop the bicycle, and even where we don’t, cities would not have the shapes they do now without it, and it is still transforming, more slowly, the parts of the world it touched later. Similarly atomism was developed and used for a little while, then languished in notebooks for a long time, before combining with the right factors to spread and rapidly transform society and culture.
The Unity of All Life and Calculus
Newton and Leibniz developed calculus independently at the same time. Similarly, both classical Stoicism in Greece and Buddhism in India, roughly simultaneously and, as far as we can tell, independently, developed the idea that all living things–humans, insects, ancients, people not yet born–are, in fact, parts of one contiguous, interconnected, sacred living thing. This enormously rich and complex concept had a huge number of applications in each society, but seems to have been independently developed to meet the demands for metaphysical and emotional answers of societies at remarkably similar developmental stages. The circumstances were right, and the ideas then went on to be applied in vastly different but still similar ways.
Feminism and the Aztec Wheel
For a long time we thought the Aztecs didn’t have the wheel. More recently we discovered that they had children’s toys which used the wheel, but never developed it beyond that. Which means someone thought of it, and it disseminated a bit and was used in a very narrow way, but not developed further, because what we think of as more “advanced” or “industrial” applications (wagon, wheelbarrow) just weren’t compatible with the Aztec world (largely because it was incredibly hilly and didn’t have the elaborate road system Europe developed, relying instead on human legs, stairs, and raw terrain, which were sufficient to let it develop a robust and complex economy and empire of its own; the wheel became more useful in the Americas when European-style city plans and roads were built). Similarly Plato voiced feminism in his Republic, arguing that women and men were fundamentally interchangeable if educated the same way, and people who read the Republic discussed it as a theory among many other elements of the book, but didn’t develop it further (again, I would argue, at least in part because the economic and social structures of the classical world depended on the gendered division of labor, particularly for the production of thread in the absence of advanced spinning technology, which is why literally all women in Rome spent enormous amounts of time spinning–spinning quotas were sometimes even required of prostitutes by law, since if a substantial sliver of the female population were employed without spinning, Rome would run out of cloth. Feminism was better able to become revolutionary in Europe when, among other changes, industrialization reduced the number of hours required for the maintenance of a household and the production of cloth, making it more practical to redirect female labor, and to question why it had been locked into that pattern in the first place).
In sum, there is a concreteness to the ideas whose movements I study, a distinct and recognizable traceability. Interpretive analyses, comparative analyses, subjective analyses, analyses of technique, aesthetics, authorial intent, or authenticity–such analyses are excellent, but they aren’t intellectual history as I practice and teach it. I trace intellectual technology. Just as the gun, or carbon steel, or the moldboard plow came in at a particular time and had an impact, I study particular ideas whose dissemination changed what it was possible for human beings to do, and what shapes human society can take. It is meaningful to talk about being at an “intellectual tech level,” or at least about being pre- or post- a particular piece of intellectual technology (progress, utilitarianism, the scientific method), just as much as we can talk about being pre- or post-computer, gunpowder, or bronze. Such things cannot be un-invented once they disseminate through a society, though some societies regulate or restrict them, and they can be lost, or spend a long time hidden or undeveloped. Elites often have a legal or practical monopoly on some (intellectual) technologies, but nothing can stop things from sometimes getting into the hands or minds of the poor or the oppressed. Sometimes historians are sure a piece of (intellectual) technology was present because we have direct records of it: a surviving example, a reference, a drawing, something which was obviously made with it. Other times we have only secondary evidence (they were farming X crop which, as far as we know, probably requires the moldboard plow; they described a strange kind of unknown weapon which we think means gun; they were discussing heretics of a particular sort which seems to have involved denial of Providence).
I realize that it would be easy to read my use of “intellectual technology” as an attempt to climb on the pro-science-and-engineering bandwagon, presenting intellectual history as quasi-hard-science, much as we joke that if poets started calling themselves “syllabic engineers” they would suddenly be paid more. But it isn’t a term I’m advocating as a label, necessarily. It’s a term I use for thinking, a semantic tool for describing the specific type of idea history I practice, and linking together my different interests into a coherent whole. When I spell out what I’m working on right now as an historian, it’s actually a rather incoherent list: “the history of atheism, atomic science, skepticism, Platonic and Stoic theology, soul theory, homosexuality, theodicy, witchcraft, gender construction, saints and heavenly politics, Viking metaphysics, the Inquisition, utilitarianism, humanist self-fashioning, and what Renaissance people imagined ancient Rome was like. And if you give me an hour, I can sort-of explain what those things have to do with each other.” Or I can say, “I study how particularly controversial pieces of new intellectual technology come into being and spread over time.”
In that light, then, we can think of Machiavelli as the inventor of a piece of intellectual technology, or rather of several pieces of intellectual technology, since consequentialist ethics is one, but his new method of historical analysis (political science) is another. We might compare him to someone who invented both the gun and the calculator. How do we feel about that contribution? Positive? Negative? Critical? Celebratory? I think the only universal answer is: we feel strongly.
On the one hand, I have been looking forward for ages to reading and then writing something about “The Litany of Earth,” an amazing novelette by Ruthanna Emrys, acquired for Tor.com by editor Carl Engle-Laird. But on the other hand I personally usually dislike reading reviews, at least traditional reviews of things I have already decided to read. When a reviewer tells me about what I’m going to experience and what excellent things the author is going to do, it disrupts the reading process for me, makes the things mentioned in the review stand out too boldly, interfering with the craftsmanship of a good story in which the author has taken great pains to give each beat just the right amount of emphasis, no more, no less. The memory of the review in my mind makes it like a used book which someone has gone through with highlighter, which can be fascinating as a window on a fellow reader, and delightful for a reread, but it isn’t what I want on first meeting a new text, which in my ideal world consists of me, the reader, placing myself wholly and directly in the hands of the author, with the editor’s touch there too to help spot us along the way. I do not need a co-pilot. And it is more of a problem, for me at least, with short fiction than with long fiction, since the review could be half as long as the story and burden me with nearly as much weight as the story itself carries. So, today I have set myself the challenge of writing a review, or non-review, of “The Litany of Earth” that isn’t a co-pilot, or a highlighter, and does as much as possible to get across the story’s strengths and the power of the reading experience while doing my best not to change the relative weight of anything in the story, make anything jump out too boldly, leaving the craftsmanship as untouched as it can be.
I have a seven-step plan. (Personal rule: anything with three or more steps counts as a plan. Also, “Profit” is not a step, it’s an outcome, and does not count toward your total of three.)
Recommend you go read “The Litany of Earth” now before I can spoil anything.
Talk amorphously about things the story is doing with structure and world-canon, talking more concretely about a few other pieces of fiction that have done somewhat similar things.
Ramble about Petrarch.
Ramble about Diderot. Dear, dear Diderot…
Urge you to read “The Litany of Earth” again, last chance before I get out my highlighter.
Talk about “The Litany of Earth” directly.
Step One: I strongly recommend that you go read “The Litany of Earth” right now. It’s free online, and if you read it now you won’t be stuck with an intrusive co-pilot even if I do fail in today’s challenge of writing a non-review.
Step Two: Talk amorphously, and compare the story to other works of fiction.
One of the unique literary assets of current fiction is the proliferation of familiar but elaborate and thoroughly developed fictional worlds which authors can step into and use for new purposes. There have always been such worlds, as long as there has been literature. Arthuriana is my favorite pre-modern example, a complex and well-populated world rich with explorable relationships and flexible metaphysics ready to be elaborated upon and repurposed. Geoffrey of Monmouth and Thomas Malory and Petrarch and Ariosto and the traditional artists in Naples who decorated (and still decorate) street vendor wagons with Arthur’s knights each repurposed Arthuriana, just like Marion Zimmer Bradley and Monty Python and Gargoyles and Heather Dale and Babylon 5 and the endlessly hilarious antics of the BBC’s Merlin. Each of the later authors in the genealogy has taken advantage not only of the plot, setting, and characters, but of the genre expectations readers bring.
In the early 1500s when Ariosto began his chivalric and slightly-Arthurian verse epic Orlando Furioso he took advantage of the fact that readers already associated the topic with epic works and grand tourneys and knights and ladies and courtly-love adultery, baggage which let him write a massive and endless rambling snarl of disjointed and fantastic adventurousness so unwieldy that traditional epic structure is to Orlando Furioso as a sturdy rope is to the unassailable rat’s nest of broken headphones and cables for forgotten electronics that I just fished out of this bottom drawer. No reader, not even in 1516, would put up with it without the promise of Arthurian grandeur to make its massive scale feel appropriate. (I will also argue that the BBC Merlin, for all its tomatoes and giant scorpions, has not actually done anything quite so unreasonable as the point when Ariosto has “Saint Merlin” rise from his tomb to deliver an endless rambling prophecy about how awesome Ariosto’s boss Ippolito d’Este is going to be. Fan service long predates the printing press.) In a more recent continuation of this tradition, modern Arthurian adaptations have given us the previously-silenced P.O.V.s of women, of villains, of third-tier characters, and in some sense it’s quite modern to think about P.O.V. at all. But even very old adaptations take advantage of how not just setting but genre is an asset usable to get the reader to follow the author to places a reader might not normally be willing to go. And, of course, in more recent versions authors have taken advantage of exploring silenced P.O.V.s to critique earlier Arthurian works and their blind spots, as a way of reaching the broader blindnesses and silencings of the past stages of our own society that birthed these worlds.
“Is ‘The Litany of Earth’ Arthuriana?” you may wonder. No. It uses a different mythos. I bring up Arthuriana in order to remind you of the many great things you’ve seen humans create by using and reusing a familiar collective fiction, and in order to reinforce my earlier claim that one of the great assets of current fiction is that we have many, many such worlds. If pre-modern Earth had several dozen rich, lively, reusable mythoi and epic settings, the 20th century has added many, many more in which good (and campy) things have been done and can be done. Star Trek, Sherlock Holmes, Gundam, the massive united comics universes of Marvel and DC, these each provide as much complexity and material for reuse and reframing as the richest ancient epics, more if, for example, you compare the countless thousands of pages of surviving X-Men to the fragile little Penguin Classics collections of Eddas and fragmentary sagas which preserve what little we still have of the Norse mythic cosmos. Marvel’s universe, and DC’s too, have a fuller population and a more elaborate and eventful history than any mythos we have inherited from antiquity, and my own facetious in-character reviews of the Marvel movies are but the shallowest tip of what can be done with it.
The specific case of this kind of rich reuse whose parallels to “The Litany of Earth” brought me down this line of analysis comes from the Marvel comics megaverse: the unique and skinny stand-alone Marvels, by Kurt Busiek, illustrated by Alex Ross. What it does with the narrative possibilities of the Marvel universe is very much worth looking at even if one doesn’t care a jot about comics.
Described from the outside and ignoring, for a moment, that these are comic books, the Marvel universe presents us with an Earth-like alternate history in which disasters–supernatural, alien, primordial, divine–have repeatedly threatened Earth, the universe, and, most often, New York City with certain destruction. These have been repeatedly repelled by superheroes, somewhat human, somewhat not, and the P.O.V. from which we the readers have always viewed these events has been that of one of the superpeople at the heart of the battle, deeply enmeshed in the passionate immediacy of the short-term drama, nemeses, kidnappings, personal backstory, and who’s dead lately. Only rarely have we had works that gave us a longer perspective over time, reflecting personal change, evolving perspectives, how being constantly enmeshed in superbusiness makes a person develop and self-reflect, though notably the works that have done so have been among superhero comics’ shining stars (Dark Knight Returns, Red Son, Watchmen).
Marvels instead offers a long-term and distanced P.O.V., that of a photographer who lives in New York City and, during his path from rookie to retirement, experiences in order the great, visible cataclysms that have repeatedly shaken Marvel’s Earth. His perspective gives historicity, sentiment, reflection and above all realism to Marvel, using it as alternate history rather than an action setting. The effect is powerful, beautiful and highly recommended for the way it weaves the richness of Marvel’s setting together with good writing to create a truly valuable work of literature. But it also reverses an interesting silencing which has been present in the back of Marvel, and superhero comics, since their inception: the silencing of the Public.
Very much like the women in early versions of Arthuriana, the Public in Marvel (and DC) has not been an agent in itself, but an object to motivate the hero. The Public exists to be rescued, protected, placated, evaded, sometimes feared. The Public has cheered P.O.V. heroes, hounded them, betrayed them, threatened them with pitchforks and torches, somehow being tricked over and over again into doubting the heroes even after the last seventeen times they were exonerated. The Marvel Public specifically also persistently hates and fears the X-Men and other mutants despite being saved by them sixteen jillion times, and somehow hates and fears the other heroes less even though many of them are aliens or science freaks or robots or other things just as weird as mutants. It is a tool of the author, manipulated by villains, oppressing misfits, causing tension, but virtually never is the reader asked to empathize with the Public. The object of empathy is the hero, or occasionally the villain, but the reader is never supposed to identify with or even think about the emotions of the screaming and yet simultaneously silenced mob. Marvels gives us, at last, the point of view of that mob, or at least one member of it, directing our self-identification and above all our empathy for the first time to something which has been hitherto faceless.
The effect is rather like a stroll through the Uffizi enjoying endless scenes of exciting saints surrounded by choruses of beautiful angels and then hitting the Botticelli room where each angel has a distinctive face and personality and you find yourself wondering what that angel is thinking when it watches Mary come to heaven to be crowned its queen, or sings music for young John the Baptist whose grisly end and subsequent heavenly ascension the angel already knows. Only when Botticelli invites you to see the angels as individuals do you realize that no earlier painting ever did. They had a failure of empathy. They were still beautiful, but here is a rich new direction for empathy which no earlier work has asked us to consider, and which opens up a huge arena we had ignored. Women in Arthuriana; the Public in Marvel; the angels that stand around in paintings of saints.
In just the same way, “The Litany of Earth” uses empathy and P.O.V. to open rich new arenas in one of our other well-known modern fictional settings. And the setting it uses has a fundamental and very problematic failure of empathy rooted deep in its foundations, so addressing that head-on opens a very potent door.
And since I can feel the urge to talk about Naoki Urasawa’s Pluto becoming harder to resist, I believe it is now time to nip that in the bud by moving on to the next stage of my plan.
Step Three: Ramble about Petrarch.
Picture Petrarch in his library, holding his Homer. He has just received it, and turns the stiff vellum pages slowly, his fingertips brushing the precious verses that he has dreamed of since his boyhood. The Iliad in his hands. His friends have always whispered to him of the genius that was Homer, his real friends, not the shortsighted fools he grew up with in Avignon, arrogant Frenchmen and slavish Italians like his parents who followed the papacy and its trail of gold even when France snatched it away from Rome. His real friends are long-dead Romans: Cicero, Seneca, Caesar, men like him who love learning, love virtue, love literature, love Rome and Italy enough to fight and give their lives for it, love truth and excellence enough to write of it with passion and powerful words that sting the reader into wanting to become a better person.
Petrarch was born in exile. Not just the geographic exile of his family from their Florentine homeland, no, something deeper. An exile in time. This world has no one he can relate to, no one whose thoughts are shaped like his, who walks the Roman roads and feels the flowing currents of the Empire, whose understanding of the world connects from Egypt up to Britain without being blinded by ephemeral borders, who can name the Muses and knows how truly rich it is to taste the arts of all nine, and how truly poor one is without. Antiquity was his native time, he knows it, but antiquity was cut off too early–he was born too late. His friends are dead, but their voices live, a few, in chunks, in the books in distant libraries which he has spent his life and fortune gathering. His library. Each volume a new shard of a missing friend, those few, battered whispers of ancient voices which survived the Medieval cataclysm that consumed so much. And now, after hearing so many of his friends speak of Homer, call him the Prince of Poets, the climax of all art and literature, divine epic, the centerpiece of all the ancient world, he has it in his hands. It survived. Homer. In Greek. And he can’t read it. Not a word of it. Greek is gone. No one can read it anymore, no one. Homer. He has it in his hand, but he can’t read it, and for all he knows no one ever will again.
This historical moment, Petrarch with his Homer, is one of the most poignant I have ever met in my scholarship. A portrait of discontinuity. The pain when the chain of cultural transmission, of old hands grasping young, that should connect past, present and future is cut off. The cataclysm doesn’t have to be complete to be enough to disrupt, to silence, to jumble, to leave too little, Greek without Homer, Homer without Greek. Petrarch is a Roman. They all are, he and his Renaissance Italians, they have the blood of the Romans, the lands of the Romans, the ruins of the Romans, but not enough for Petrarch to ever really have the life he might have had if he’d been born in the generation after Cicero, and with his Homer in his hands he knows it.
Petrarch did his best. He spent his life collecting the books of the ancients, trying to reassemble the Library of Alexandria, the pinnacle, he knew, of the culture and education which had made the Romans who had made his world. He found many shards, eventually enough that it took more than ten mules to carry his library when he journeyed from city to city. He journeyed much, working everywhere with voice and pen to convince others to share his passion for antiquity, to read the ancients that could be read, Cicero, Seneca, to learn to think as they did and to try to push this world to be Roman again, which for him meant peaceful, broad-reaching, stable, cultured and strong. People listened, and we have the libraries and cathedrals and Michelangelos they made in answer. And Petrarch never gave up on Homer either, but searched the far corners of the Earth for someone with a hint of Greek and eventually, late in life, did find someone to make a jumbled, fragmentary translation, nothing close to what a second-year Greek student could produce today, let alone a fluid translation, but a taste. By late in life he had his New Library of Alexandria, and real hope that it might rear new Romans.
Petrarch wanted to give the library to Florence, to help his homeland make itself the new Rome, but Florence was too caught up with its own faction fighting for anyone to stably take it. Venice took it in the end, and he hoped his library would make the great port city like the Alexandria of old, the hub where all books came, and multiplied, and spread. Venice put Petrarch’s library in a humid warehouse and let it rot. We lost it. We lost it again. We lost it the first time because of Vandals and corrupt emperors and economic transformation and plague and all the other factors that conspired to make the Roman Empire decline and fall, but we lost it the second time because Venice is humid and no one cared enough to devote space and expense to a library, even the famous collection of the famous Petrarch. Such a tiny cataclysm, but enough to make discontinuity again. We have learned better since. Petrarch had followers who formed new libraries, Poggio, Niccolò, they repeated Petrarch’s effort, finding books. Eventually princes and governments realized there was power in knowledge. Venice built the Marciana library right at the main landing, so when foreigners arrive in St. Mark’s Square they are surrounded by the three facets of power, State in the Doge’s Palace, Church in the Basilica, and Knowledge in the Library. And now we have our Penguin Classics. But we don’t have Petrarch’s library, and we know he had things that were rare, originals, transcriptions of things later lost. There are ancients who made it as far as Petrarch, all the way to the late 1300s, through Vandals, Mongols and the Black Death, before we lost them to one short-sighted disaster. Discontinuity. We have Homer. We don’t know what Petrarch had that we don’t.
This was one of two historical vignettes that came vividly before my mind while I was reading “The Litany of Earth.” The second is…
Step Four: Ramble about Diderot. Dear, dear Diderot…
I must be very careful here. Even though my focus is Renaissance and my native habitat F&SF, Denis Diderot remains my favorite author. Period. My favorite in the history of words. So it is very easy for me to linger too long. But I invoke him today for a very specific reason and shall confine myself strictly to one circumscribed subtopic, however hard the copy of Rameau’s Nephew on my desk stares back.
Three quarters of the way through my survey course on the history of Western thought, I start a lecture by declaring that the Enlightenment Encyclopedia project was the single noblest undertaking in the history of human civilization. I say it because of the defiant, “bring it on!” glances I instantly get from the students, who switch at once from passive listening to critical judgment as they arm themselves with the noblest human undertakings they can think of, and gear up to see if I can follow through on my bold boast. I want that. I want their minds to be full of the Moon Landing, and the Spartans at Thermopylae, and Gandhi, and the US Declaration of Independence, and Mother Teresa, and the Polynesians who braved the infinite Pacific in their tiny log boats; I want it all in their minds’ eyes as I begin.
The Encyclopédie was the life’s work of a century on fire. The newborn concept Progress had taken flight, convincing France and Europe that the human species has the power to change the world instead of just enduring it, that we can fight back against disease, and cold, and mountain crags, and famine cycles, and time, and make each generation’s experience on this Earth a little better. The lion has its claws and strength, the serpent fangs and stealth, the great whales the force of the leviathan, but humans have Reason, and empiricism, and language to let us collaborate, discuss, examine, challenge, and form communities of scientists and thinkers who, like the honeybee, will gather the best fruits of nature and, processing them with our own inborn gifts, produce something good and sweet and useful for the world. The tone here is Francis Bacon’s, but Voltaire popularized it, and by now the fresh passion for collaboration and improvement of the human world had already birthed Descartes’ mathematics, Newton’s optics, Locke’s inalienable rights, calculus, and the Latitudinarian movements toward rational religion which seemed as though they might finally soothe away the wars that lingered from the Reformation. Everything could be improved if keen minds applied reason to it, from treatments for smallpox which could be preventative instead of palliative, to Europe’s law codes which were not rational constructions but mongrel accumulations of tradition and centuries-old legislation passed during half-forgotten crises and old power struggles whose purpose died with the clans and dynasties that made them but which still had the power to condemn a feeling, thinking person to torture and death.
The Encyclopédie had many purposes. Perhaps the least ambitious was to turn every citizen of Earth into a honeybee. Plato had said that only a tiny sliver of human souls were truly guided by reason–able to become Philosopher Kings–while the vast majority were inexorably dominated by base appetites, the daily dose of food and rest and lust, or by the wild but selfish passions of ambition and pride. For two millennia all had agreed, and even when the Renaissance boasted that human souls could rival angels in dignity and glory through the light of learning and the power of Reason, they meant the souls of a tiny, literate elite. But in 1689 John Locke had argued that humans are born blank slates, and nurture rather than an innate disposition of the soul separated young Newton from his father’s stable boy. The Encyclopédie set out to enable universal education, to collect basic knowledge of all subjects in a form accessible to every literate person, and to their illiterate friends who crowded around to hear new chapters read aloud in the heady excitement of its first release. With such an education, everyone could be a honeybee of Progress, and exponential acceleration in discovery and social improvement would birth a better world. So overwhelming was public demand that Europe ran out of paper, of printer’s ink, even ran out of the types of metal needed to make printing presses, so many new print shops appeared to plagiarize and print and sell more and more copies of the book which promised such a future (See F. A. Kafker, “The Recruitment of the Encyclopedists”).
Yet Diderot and his compatriots had another goal, which shows itself in the structure of the Encyclopédie as well as in its bold opening essay. The second half of the 17-volume series is devoted to visual material, a series of beautiful and immensely complicated technical plates which illustrate technology and science. How to fire china dishes, smelt ore, weave rope, irrigate fields, construct ships, calculate distance, catalog fossils and decorate carriages, all are illustrated in loving detail, with diagrams of every tool and its use, every factory and its layout, every human body at work in some complex motion necessary to turn cotton into cloth or rag into precious paper. With this half of the Encyclopédie it is possible to teach oneself every technological achievement of the age. The first half was intended to provide the same for thought. With its essays it should be possible to understand from their roots the philosophies, ethical systems, law codes, customs, religions, great thinkers of the past and present, all aspects of life and the history of humankind’s evolving mental world. It is a snapshot. A time capsule. With this–Diderot smiles thinking it–with this, if a new Dark Age fell upon humanity and but a single copy of the Encyclopédie survived, it would be possible to reconstruct all human progress. With this, the great steps forward, the hard-earned produce of so many lives, the Spartans at Thermopylae, the Polynesian log boats, will be safe forever. We can’t fall back into the dark again. With this, human achievement is immortal. Yes, Petrarch, it even details how to read, and print, and translate Greek.
Let’s linger on that thought a moment. A beautiful, unifying, optimistic, safe, human moment, warm, like when I first heard that, yes, eventually Petrarch did get to read a sliver of his Homer. Because I’m not going to keep talking about dear Diderot today, much as I would like to.
In 2012/13 we lost 170,000 volumes from the Egyptian Scientific Institute in Cairo to the revolution, 20,000 unique manuscripts in the Timbuktu library to a militia fire, and we have barely begun to count the masses of original scientific material burned during a corrupt, botched cost-saving effort to reduce the size of the libraries of Fisheries and Oceans Canada. More than half of the entries on Wikipedia’s list of destroyed libraries were destroyed after the printing of the Encyclopédie, and the libraries on the list are only a minuscule fraction of the texts lost to disasters, natural and manmade. It doesn’t even list Petrarch’s library, let alone the unique contents of the personal libraries and works that accumulate in every house now that we’re all honeybees. Diderot tried so hard to make it all immortal. He tried so hard he used up all the ink and paper in the world. Yet if my numbers for printing history are right, in the past half century we have destroyed more written material than had been produced in the cumulative history of the Earth up until Diderot’s day. And that does not count World Wars. We’re getting better. On February 14th 2014 a fire at the British National Archives threatening thousands of documents, many centuries old, was successfully quenched with no damage to the collection, thanks substantially to advances in our understanding of fluids and pressure made in the 17th and 18th centuries and neatly explained by the Encyclopédie. That much is indeed immortal (thank you, Diderot!) but much is so very far from everything. It’s still so easy to make mistakes.
One of the most powerful mistakes, for me, is this cenotaph monument of Diderot, in the Pantheon in Paris, celebrating his contributions and how the Encyclopédie and the Enlightenment enabled so much of the liberty and rights and change that define our era. Voltaire’s tomb was moved to the Pantheon, Rousseau’s too, but for Diderot there is only this empty cenotaph. I went on a little pilgrimage once to visit Diderot in the out-of-the-way Church of Saint-Roch, where he was buried. There is no tomb to visit. During the French Revolution, Saint-Roch was attacked and mostly destroyed by revolutionaries (carrying banners with Encyclopedist slogans on them!) who, in their zeal to torch the old regime, forgot that their own Diderot was among the Catholic trappings they could only see as symbols of oppression. Once rage and zeal had died down, Paris and all France much lamented the mistake, and many others, too late.
Did I mention we very nearly lost Diderot’s work too? A far more frightening loss than just his body. Diderot didn’t include himself, his own precious original intellectual contributions, in his Encyclopédie. He knew he couldn’t. He was an atheist, you see. A real one, not one of these people we merely suspect, like Hobbes and Machiavelli, but an overt atheist who wrote powerful, deeply speculative books trying to hash out the first moral system without divinity in it, fledgling works of an intellectual tradition which was just then being born, since even a few decades earlier no one had dared set pen to paper, for fear of social exile and the ready fire and steel of Church and law. But Diderot didn’t publish his own works, not even anonymously. He self-censored. He was the figurehead of the Encyclopédie. An atheist was too frightening back then, too strange, too other. If people had known an atheist was part of it, the project would have been dead in the water. Diderot left instructions for future generations to print his works someday, if the manuscripts survived, but gambling with his own legacy was a price he was willing to pay to immortalize everyone else’s. The surviving manuscript of Rameau’s Nephew in Diderot’s own hand turned up by chance at a used bookshop in 1823, one chance street fire away from silence.
Here you get points if you read it before getting this far. It’s free on Tor.com, but if you really liked it you can also buy the ebook for a dollar, and give money to Ruthanna and to Tor, and tell them you like excellent original fiction that does brave things with race and historicity.
Step Six: Talk about “The Litany of Earth” directly.
This is a Cthulhu Mythos story which is in no way horror. The richly-designed and richly-populated metaphysics and macrohistorical narrative of Lovecraft’s universe is here, but as a tool for reflection on society and self, with a narrative that bears no resemblance to the classic tense and chilling horror short stories I (for some reason) enjoy as bedtime reading. Ruthanna Emrys uses Lovecraft’s world to comment on Lovecraft’s writing and the deeply ingrained sexism and especially racism that saturates it, repurposing it into a tool to make us think more about the effects of silencing and othering which Lovecraft used his skill and craftsmanship to lure us into participating in. But the message and questions are universal enough that the target audience is not Lovecraft readers or horror readers but any reader who has even a vague distant awareness that the Lovecraft Mythos is a thing, as one has a vague distant awareness of Celtic or Navajo mythology even if one doesn’t study them. If there is any horror in this story, it is the familiar reality that the things we make and do and are are perishable, that human action often worsens that, and that at the end of all our aeons and equations we face entropy. But rather than presuming (as Lovecraft and much horror does) that facing that will lead to mad cackling and gibberish, the story presents the real things we do to try to face it: spirituality, cultural identity, and the effort to preserve the past and transmit it to the future. It turns a setting which was created as a vehicle for horror into a vehicle for social commentary and historical reflection.
I suppose I should directly address Lovecraft’s failures of empathy, for those less familiar with his work, or who have met it mainly through its fun, recent iterations in board games and reuses which strive to leave behind the baggage. Racism, sexism, classism and other uncomfortable attitudes are not unexpected in an author who lived from 1890 to 1937. We encounter unpalatable depictions of people of color, and equally unpalatable valorizations of entrenched elites, in most literature of the period, from M. R. James to the original Sherlock Holmes. In Lovecraft’s case, the challenge for those who want to continue to work with his universe is that many of the racist and classist elements are worked deeply into the fabric of his worldbuilding. Many of his frightening inhuman races are clearly used to explore his fear of racial minorities, while the keys to battling evil are reserved for elites, like the affluent, white, male scholars who control his libraries, and the Great Race which controls the greatest library.
While many attempts to rehabilitate and use Lovecraft’s world do so by excising these elements, or minimizing them, or balancing them out by letting you play ethnically diverse characters in a Lovecraft game, this story instead uses those very elements as weapons against the kinds of attitudes which birthed them. If the scary fish-people represent a demonized racial “other”, then let them remain exactly that, and show them suffering what targeted minorities have suffered in historical reality. By reversing the point of view and placing the reader within the perspective of the “other”, the original failure of empathy is transformed into a triumph of empathy. Now we are in the place of a woman for whom Lovecraft’s spooky cult rituals are her Passover or Easter, the mysterious symbols her alphabet, “Iä, Cthulhu . . . ” the comforting prayer she thinks to herself when terrified, and a Necronomicon on Charlie’s shelf her Petrarch’s Homer.
And we aren’t asked to empathize with only one group. We empathize with those deprived of education, in the form of Aphra’s brother Caleb, taking on the classist negative depictions of “degenerate” white rural families common in Lovecraft’s work. With the plight of the Jews and other groups targeted in Germany, invoked by Specter’s discussion of his aunt. With those facing physical and medical challenges, invoked in the powerful opening lines where Aphra describes the pleasure she finds in facing the daily difficulty of walking uphill while she slowly heals. And with women, rarely granted any remotely coequal agency in literature of Lovecraft’s era. Not only is this story a powerful triumph of empathy, but after reading it, whenever we reread original Lovecraft, or anything set in his world, the memory of Aphra Marsh and her tender prayer will forever change the meaning of “Iä, iä, Cthulhu thtagn…” The triumph of empathy diffuses past the boundaries of this story, to enrich our future reading.
Another striking facet is that this is a story about legacy, continuity and deep history that manages to address those questions using only very recent history. Usually stories that want to talk about the deep past use material from periods we associate with the deep past: medieval, Roman Empire, Renaissance, Inuits, Minoans, anything we associate with dusty manuscripts and archaeology and anthropology and old culture. Even I in this entry, when trying to evoke the themes and feelings of this story, went back centuries and consequently had to spend a lot of time explaining to the reader the history I’m talking about (what’s Petrarch’s Homer, what’s up with Diderot, etc.) before I could get to what I wanted to do with it. This story instead uses contemporary history, events so recent and familiar that we all know them already, and have seen their direct effects in those around us and in ourselves, or have tried not to see those effects. As a result, the story doesn’t have the baggage of having to explain its history. Instead of needing footnotes and exposition, it touches us directly and personally with our own history and makes us directly face the fact that we too are part of the chain of transmission attempting to connect past to future, and our failures can still heal or harm it just as much as the Visigoths, the Black Death or the Encyclopédie. The use of modern history makes it impossible for us to distance ourselves, greatly enhancing its power.
I have already discussed, in my own roundabout way using Diderot and Petrarch and Marvel comics, many of the key themes which make this story so powerful: othering, empathy, reversal of point of view, legacy, silencing, translation and transmission, and discontinuity, how easy it is for the powerful engine of society to make mistakes that cut the precious thread. The power with which this story is able to present that theme demonstrates perfectly, for me, the potency of genre fiction as a tool, not for escapism or entertainment, but for depicting reality and history. The tragic discontinuities created by World War II, the destruction of life, education and cultural inheritance generated not only by the most gruesome facets of the war but also by great mistakes like the treatment of Japanese Americans, are difficult to communicate in full with such accurate but emotionless descriptive phrases as, “people were rounded up and held in prison camps.” Attempts to communicate the genuine human impact of such an event easily fall far short. We try hard, but often fail. As a teacher, I remember well the flurry of discussion which surrounded some high school history textbooks which, in their efforts to do justice to the often-silenced story of interned Japanese Americans, had a longer section about that than about the rest of the war. Opponents of political correctness used it as a talking point to rail against liberalism gone too far, while apologists focused on the harm done by silencing the events. Yet for me, the centerpiece was the fact that textbooks had to devote that much space to attempting to get the issue across and still largely failed to communicate the event in a way that touched students. “The Litany of Earth” communicates the same event very potently, using the tool of genre to make something most readers might see as only affecting “others” feel universal.
The large-scale horror of Lovecraft’s universe revolves around the inevitability that human achievement, and in the end all life, will fade into nothing. The Yith and their library are the only hope for a legacy, one bought at the terrible price of what they do to those whose bodies they commandeer. By creating a parallel between the fragility of all human achievement, preserved only by the Yith, and Aphra’s barely-literate brother Caleb writing of his doomed search for the family library which contained the history and legacy he and Aphra so desperately miss, the fantasy setting puts all readers in Aphra’s place, and the place of those interned, creating universal empathy which no textbook chapter could achieve; neither, in my opinion, could a non-fantasy short story, at least not with such deeply-cutting efficiency. After reading this story, not only the events of Japanese American internment but many parallel situations feel more personally important, and one feels a new sense of personal investment in such issues as the fate of the Iraqi Jewish Archive. This stoking of emotion and investment is a powerful and lingering achievement.
Structurally, the story interweaves experiences from different points in Aphra’s present–where she encounters Specter–with her past arrival in the city and her encounters with Charlie and his interest in her lost culture and languages. The choice to depict the present scenes in past tense and the flashbacks in present tense might seem counterintuitive, but I found it a powerful and effective choice. Past tense reads as “normal” in prose, so much so that we accept it as an uncomplicated way to depict the main moment of a narrative. In contrast, especially when we have just come from a past tense section, the present tense feels extra-vivid, raw, invasive. It feels like a very particular kind of memory, the kind so vivid that, when something reminds us of it, it jumps to the forefront of our minds and blots out the here and now with the tense, unquenchable emotions of a very potent then. Trauma makes memories do this, but it is not the traumatic memories of camp life that we experience this way. Instead it is the vividness of tender moments of cultural experience: seeing precious books in Charlie’s study, sharing his drying river, warm things. The transitions to vivid present tense make the reader think about memory and trauma without having to show traumatic events, while simultaneously highlighting how, in such a situation of discontinuity and cultural deprivation, the experiences which are most alive, which blaze in the memory, are these tiny, rare moments of connection, even tragically imperfect connection, with the ghostly echo of Aphra’s lost people.
For me, the triumphant surprise of the story comes in the end, when Aphra approaches the cultists, and chooses to act. Specter’s descriptions of bodies hanging from trees, combined with our familiarity with the tropes of creepy cults in Lovecraft and outside, prepare us mid-story to expect that when Aphra approaches the cult they’ll be evil and insane, and she’ll overcome her resentment of the government and do what has to be done. Or possibly the reversal will be stronger than that, and the cult will be good and nice, like Aphra, and the take-home message will be that Specter is wrong and Aphra and the cultists are all just misunderstood and oppressed. It feels like the latter is where the story will take us when we see Wilder and Bergman, and Aphra finds comfort and companionship in participating in a badly-pronounced imitation of her native religion. Even when we hear about the immortality ritual and Bergman refuses to listen to Aphra’s attempts to make her see that her ambition is an illusion, it still feels like we are in the narrative where the cultists are good but misunderstood, and the tragedy is just that there is such deep racial misunderstanding that even Cthulhu-worshipping Bergman cannot believe Aphra’s attempts to help her are sincere. It is a real shock, then, when Aphra calls in Specter to shut the group down, because the genre setting raises such a firm expectation that “bad cultist” = “blood and gore” that even when we read about Bergman’s two drowned predecessors it doesn’t register as “human sacrifice” or “bad cult.” Aphra, unlike the reader, is unclouded by genre expectations, and shows us that, precious as this echo of her lost culture is to her, life is more precious still and requires action.
The ghostly echo of Aphra’s people that she shares with Charlie is precious enough to blaze in her memory, but she is willing to sacrifice the far more welcome possibility of being an actual priestess for people who sincerely want to share her religion, when she realizes that their cultural misunderstanding will cost human lives. And she cares this deeply despite being an immortal among mortals. The triumph of empathy is complete.
Unlike the numerous vampire stories and other tales which so often present immortals seeing themselves as different, special, unapproachable, and usually superior to mortals, here Aphra’s potential immortality enhances the uniqueness of her perspective and the depth of her loss, but without in any way diminishing her respect for and valuation of the short-lived humans that surround her. The grotesque folder of experimental records which is her mother’s cenotaph does make her reflect on how the loss is greater than the human murderers understood, but does not make her present it as fundamentally different from the deaths of humans, or make her (or us) see her suffering in any way more important or special than that of the Japanese family with whom she lives. The history of Earth that her people have learned from the Yith makes her recognize that living until the sun dies is not forever, nor is even the lifespan of the planet-hopping Yith who will persist until the universe has run out of stars and ages to colonize. The Litany of Earth that she shares with Charlie is an equalizer, enabling empathy across even boundaries of mortality by placing finite and indefinite life coequally face-to-face with the ultimate challenges of entropy, extinction and the desire to find something valuable to cling to. “At least the effort is real.” This is something Charlie has despite his failing body, that Aphra’s brother has despite his deprived education, that Aphra has despite her painful solitude, a continuity that overcomes the tragic discontinuity and connects Aphra even with her lost parents, with ancestors, descendants, with forgotten races, races that have not yet evolved, races on distant worlds, races in distant aeons, and with the reader.
One last facet I want to comment on is how the story portrays magic which is at the same time viscerally bodily and also beautiful and positive. This is very unusual, and the more you know about the history of magic the clearer that becomes. Magic, at least positive magic, is much more frequently depicted with connections to the immaterial and spiritual than the bodily: bolts of light, glowing auras, floating illusions, the spirits of great wizards powerfully transcending their age-worn mortal husks. Magical effects that are bodily, using blood, distorting flesh, are usually bad: evil cultism, witchcraft. This trope far predates modern fantasy writing. I have documents from the Renaissance, based on ones from Greece, discussing magic and differentiating between two kinds: a “good” magic based on study, scholarship, texts, words of power, perfection of the mind, the soul transcending the body, angelic flight, spiritual messengers, rays and auras of divine power, an intellectual, disembodied and male-dominated magic; and, contrasted with it in the same types of texts, a bad, evil magic of ritual sacrifice, sexuality, animal forms, distortion of the body, contagion and blood, associated with witchcraft and with women. Cultural baggage from the Middle Ages is hard to break from even now, and we see this in the palette of special effects Hollywood reserves for good wizards and bad wizards. The tender, intimate, visceral but beautiful magic which Ruthanna Emrys has presented is authentic to Lovecraft and to the rituals we associate with “dark arts” and yet positive, a rehabilitation which works in powerful symbiosis with the story’s treatments of discrimination. Since race and religion are so much at the center of the story, its treatment of gender rarely takes center stage, but in these depictions of magic especially it is potent nonetheless.
I’ll stop discussing the story here, since I resolved to make this review shorter than the story itself, and I’m running close to breaking that resolution.
Step Seven: Sing.
One of the most conspicuous effects when I first read “The Litany of Earth” was that it made me get one of my own songs firmly stuck in my head for many, many hours. The piece is “Longer in Stories than Stone” and it is the big finale chorus to my Viking song cycle, a piece about the fragility of memory and the importance of historical transmission. It is a different treatment but with similar themes, and I found that listening to it a few times live and over and over in my head helped me extend the feelings reading the story awoke in me, and let me continue to enjoy and contemplate its messages for several happy hours. So to celebrate the release of the story (taking advantage of the fact that this blog is no longer anonymous) here is the song, and I hope it will do for you what it did for me and help me extend my period of pleasurable mulling. I hope you enjoy:
To rescue us from the dark and gloomy wood of Doubt, in which we have been wandering since my first post in this series (did you say hello to Dante?), comes the Criterion of Truth! The idea that, while the skeptics are correct that logic and the senses sometimes fail, they do not always fail, and if we carefully study when they fail, and why, if we identify the source of error, we can differentiate reliable knowledge from unreliable knowledge. For example, our eyes may deceive us when we judge a stick half-submerged in water to be bent, but if we add the testimony of other senses (touch), and of repeated experience (the last time we saw an object half-way into water, it looked bent but proved straight when drawn out), we can identify the error, and henceforth say that we will not trust sense data based on visual information about objects half-submerged in transparent liquids, but that other sense data may be reliable. Once the causes of error have been defined, once we have a criterion for judging when knowledge is uncertain and when it is reliable, if we thereafter base our conclusions only on what we know is certain, then our conclusions will be reliable, eternal and divine, a steady foundation upon which we may proceed in safety toward that godlike happiness we seek. The Criterion of Truth is the clean and steady light of compromise, which does not banish all shadow, but, like a lantern in the dark, allows a philosophical system to have dogmatic elements while still conceding that much remains in shadow.
“Quite wrong!” cries our Pyrrhonist. “You have it all backwards! Doubt is the steady path toward eudaimonia. The absence of the possibility of certainty is our liberation, not our bane! It is when we embrace the fact that we cannot have certainty that we are finally free from the risk of having our beliefs overturned and our Plutos and Brontosaurs snatched away. It is when truth is firmly beyond human reach that we can finally relax and stop being plagued by curiosity and the endless, restless quest for information. The Criterion of Truth is not a light in darkness, it is a battering ram which has pierced our clean and serene sanctum and smeared it with all the muddled and confusing chaos that we worked so hard to banish! Don’t build a path on this foundation! However steady it may seem, the ground could still give way at any moment and shatter all. And even if it doesn’t, the path will never end. You will exhaust yourself on its construction, your age-gnarled hands still struggling to lay stones when you breathe your last, with never a glimpse of the end in sight, just an infinity of toil and darkness. And then you will inflict the same curse upon your children, and your children’s children, and your children’s, children’s, children’s children!”
Whether one sees it as a blessing or a curse, developing a Criterion of Truth is what has allowed, and still allows, dogmatic philosophical systems to exist and progress in a fertile and symbiotic relationship with skepticism, instead of ending with the blank serenity where Pyrrho and other absolute skeptics wanted to dwell forever. Every philosopher with any dogmatic ideas has a criterion of truth (“Yes, even you, Sartre,” says Descartes, “Don’t give me that look!”), and an explanation for the source of error, and frequently I find that, when I am feeling awash in the ideas of a new thinker, one of the best ways to start to get a grip on things is to find the criterion of truth, which gives me an anchor point from which to explore, and to compare that thinker to others I am more familiar with.
Today I shall attempt something a bit compressed but hopefully the compression itself will be fruitful. I intend to briefly examine three of the major classical schools (Platonism, Aristotelianism and Epicureanism) and explain just enough of each system to make clear its criterion of truth and its explanation for the source of error. By laying these out in a compressed form, side-by-side, I hope to show clearly how skepticism is at play in each of the dogmatic systems, and to show what the early approaches to it were, so that when I move forward to major turning points in skepticism it will be clearer just how new and different the new, different things are. Tradition dictates that I start with Platonism, but Socrates is looking a little too aggressively eager now that I mention Plato, and furthermore he was being mean to Sartre while we were away (Don’t pretend you didn’t know that dialog trying to define “being” would make him cry!), so I shall instead start with Epicurus:
The Epicurean Criterion of Truth: Weak Empiricism
Take the stick out of the water. Epicureanism faces up to the skeptical challenge to the reliability of sense data and still chooses to promote the senses as our primary source of information, simply proposing that we should not rely upon first impressions, but should consider sense data reliable only after careful investigation, ideally using multiple senses and instances of observation. But there is more to it than that.
Epicureanism is a mature form of classical atomism, positing that on the micro-level matter is composed of a mixture of vacuum and invisibly tiny, individual components or seeds known as “atoms” which exist in infinite supply but finite varieties (see the modern Periodic Table), and that the substances and patterns we see in nature are caused by different recurring combinations of these atoms. If the same kind of sand appears on two unrelated beaches, it is composed by chance of the same combination of atoms. If a piece of wood is burned and goes from being brown, firm and porous to being white and powdery, some atoms have left it (in the smoke, for example), and the remaining ones look different.
Atoms too are responsible for the apparently changeable properties of objects (remember the seventh mode of Pyrrhonism, that we cannot have certainty because objects take multiple forms). The properties of substances do not derive from atoms themselves but from their combinations. Colors, smells and flavors are all effects of the shapes of atoms, so it is not true that sweet substances contain sweet atoms and red substances red atoms, rather sweet substances contain smooth atoms which are pleasant to the tongue rather than rough, and red objects contain atoms whose combinations create redness. If bronze is red and then turns green, or wood is brown but burns and turns gray, then atoms have entered or left and the new combinations create a different color. And it is on this atomic basis that the Epicureans argue that (a) natural interactions of atoms and vacuum are enough by themselves to explain all observed phenomena, so there is no need to posit fearsome interfering gods, and (b) the soul is just a collection of very fine atoms, distributed in the body and breath, which disperse at death, so there is no need to fear a punitive afterlife.
Atoms are, believe it or not, largely a solution to Zeno’s paradoxes of motion, and also have much to say about our stick in water. As we all recall, Zeno’s arrow can never reach its target because the space in between can be infinitely subdivided into smaller distances which it must cross before it can finish its path, therefore motion is impossible. Epicurus answers: yes. Motion is indeed impossible. Motion is an illusion. The key is that space is not infinitely divisible, as Zeno proposed. Atoms, according to the Epicurean system, are not only the smallest objects but the smallest subdivision of space; it is literally impossible to subdivide either atoms or space further. (Note that if he were around now Epicurus would deny that our modern “atoms” are atoms – he would confer that title upon the smallest known sub-atomic particle, or reserve it for the piece smaller than that which all the king’s horses and all the king’s cyclotrons still can’t detect.) The smallest distance any object can move is one atom-width – any more nuanced motion is impossible. In other words, fluid motion is an illusion, and on the micro-level objects do not slide from one place to another. Rather their atoms pop in an instant from one position to the next atom-width over. One might call it microscopic teleportation. It is by this means that the arrow moves: every component atom in the arrow teleports one space to the left each moment, and thus the arrow proceeds from right to left sequentially.
Positing micro-teleportation as a substitute for motion may seem alien, but it is something we make use of every day in the modern world, and it is in fact much easier to explain Epicurean theories of motion to modern computer-users than it was to people in the past. As you scroll down this page, the cursor of your mouse and the text on the screen seem to move, but in fact nothing is moving. Instead tiny pixels, the atom-widths of your screen, are changing color, or you could say that the black pixels that form the text are teleporting one pixel-width per moment as you scroll. The eye, unable to see such fine distinctions, blurs that micro-teleportation into the illusion of motion. Why couldn’t all motion be a similar illusion? Zeno is defeated, and Reason is once again reliable.
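The scrolling-pixel analogy can even be sketched in a few lines of code. This is a toy model of my own devising, not anything from Epicurus: position advances only in whole atom-widths, one jump per instant, so the arrow crosses a finite number of positions and Zeno's infinite subdivision never gets started.

```python
# Toy model of Epicurean "micro-teleportation": the arrow never slides;
# its position jumps exactly one atom-width per instant.
ATOM_WIDTH = 1  # the smallest possible increment of distance in this model


def fly(start, target):
    """Yield each discrete position the arrow occupies on its way to target."""
    position = start
    while position < target:
        position += ATOM_WIDTH  # an instantaneous jump, not a continuous slide
        yield position


path = list(fly(0, 10))
# The arrow reaches its target in finitely many jumps; there is no
# infinitely divisible space in between for Zeno to exploit.
assert path[-1] == 10
assert len(path) == 10
```

Sampled by an eye too coarse to see individual jumps, the sequence of positions blurs into apparent motion, just as pixels do.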
Which is good because Reason is the heart of the system of knowledge Epicurus wants to build. The Epicurean atomic theory, after all, is based on a combination of observations of the sensible world and then logical deductions. We observe that objects change their form when burned, that sea-soaked cloth hung up to dry becomes dry but remains salty, and that the same types of substances recur in many independent locations. From this we deduce the existence of atoms of different types in different combinations without ever directly seeing them. Zeno’s paradox of motion does not, in this interpretation, demonstrate that we can’t trust reason, but that we can’t trust rash, unexamined observations. There seemed to be motion, but with time, patience, observation and reason the Epicurean has determined that that was a mistake, and found a better model.
But this does an interesting thing to sense data, which Epicurus still wants to be more our guide than naked logic. Atomism, which predates Epicurus, seems to have itself arisen from observations of motes in a sunbeam, tiny particles which are invisible normally but visible only in special circumstances, and which all classical atomists cite as sensory evidence for the reality of atoms. From motes in a sunbeam and raw logic, they derive the atomic theory. As Epicureans strive to free themselves from fear of the unknown by observing and explaining natural phenomena through the interaction of atoms, they rely on what they can see, feel, hear and touch to derive their theories. This is empiricism but it is (as Richard Popkin aptly named it) weak empiricism. Why? Because the reality beneath what we observe is invisible. (“Exactly!” cries Sartre, leaping up with sufficient force to knock over Descartes’ thermos.) If atoms are undetectably tiny, and everything we see, taste and smell is a consequence of their combinations rather than the atoms themselves, then we can never have real knowledge of the fundamental substructure of being. There is an insuperable barrier between us and knowledge of true things, the barrier of minuteness. Thus Epicurean empiricism involves surrendering forever any certain knowledge of the truth of things, but in return we can have fairly reliable knowledge based on careful, repeated observation using multiple senses, especially now that logic has been rescued from Zeno’s grasp and is once again our ally.
Source of Error: Twofold. Limitations of the senses, which cannot see atomic reality; unquestioned acceptance of sense data and commonplace cultural assumptions (like superstitions about the gods) which are unreliable because they are not based on careful observation and analysis.
Criterion of Truth: Knowledge is certain when it is based on a combination of careful observation of the sensible world with multiple senses, and careful logical analysis.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of the observable world, and we can make rational deductions about the insensible world which are reliable enough to act upon (since we cannot ever prove or disprove them), but we cannot ever have true and certain knowledge of the invisible atomic world which is Nature’s true reality.
At this point some readers are not particularly disturbed by Epicurus’ surrender of true knowledge of microscopic things. After all, we have advanced since 300 BC. We played with microscopes in grade school, we named the proton and the quark and the preon, we made molecules out of toothpicks and gummy candies, and the electric blood of splitting atoms blazes in our lightbulbs. We fixed that weakness. “Delusion!” Sartre says, and he is right that, on a fundamental level, this technological advancement has not let us reclaim what Epicurus surrendered. However advanced our science, we still have no cause to believe we have yet perceived or even hypothesized the literally smallest increment of matter. And, separately, even if we had a machine capable of perceiving the smallest part of matter, we would still be limited by our senses, since the machine would have to use our senses to transmit its findings to us, transmitting only an approximation, rather than reality. And in addition, the vast majority of our daily decisions would still be based on what we perceive at the macroscopic level. Thus, even with technological aid, the Epicurean surrender of knowledge of the fundamental seeds of things is a considerable one, and divides all knowledge firmly into two camps, the perceivable world about which it is possible to have certainty, and the reality beneath about which it is not. We have a path and shadows, dogmatism and skepticism coexisting within one system.
The Platonic Criterion of Truth: the Forms
My approach to Platonism will be rather sideways, but I want to get us to its criterion of truth by a route that is as parallel as possible to Epicurus’. So, for the vast majority of my readers who know basic Platonism already, please read along thinking about Zeno’s paradoxes and the stick in water, and about how this way of outlining Platonism follows the same logical structure Epicurus used.
Plato, like the skeptics, acknowledged that the senses fail and deceive, and, like the atomists, observed that there are recognizable, recurring objects in nature that come into existence in independent parallel to one another: similar rocks, mountains, trees and animals in distant corners of the Earth, which must, he reasoned, have some common source. He also noticed that humans are able to recognize and identify these objects as being the same, even humans who have never met each other, or speak different languages, and even when the objects may have radically different colors and shapes disguising a shared structure – a disguise we see through. Finally he noticed (something Epicurus did not discuss) the fact that humans not only naturally identify objects, but naturally judge them to be better or worse based on unspoken but nonetheless universal criteria. Anyone can tell that a crisp, fresh apple is “better” and a withered, dry one “worse” without having to discuss or debate that fact, or even to be taught it. I could show you a healthy and a diseased version of some deep-sea fish you’ve never heard of and you would nonetheless successfully identify them as “better” and “worse” exemplars of a completely new and unknown thing.
To explain these patterns, and this universal capacity to identify and judge “better” and “worse” examples of things, Plato posited that these objects must have a shared source, but instead of positing a combination of atoms, he posited a source independent of matter that supplied the object’s structure. All quartz crystals, all trees, and all apples take their structures from a separate structure-supplying object, which exists independent of matter and time. It has to, since the objects it generates can come into existence and be destroyed, but the pattern, the archetype, the source remains. Plato named this structural archetype the “Form” and posited that these Forms exist in a separate level of reality. They create the many material manifestations of their structure as a flagpole might cast many shadows on different objects at different times. As some shadows are crisp, straight images of what casts them and others are vague, twisted or distorted, so objects are sometimes fairly straight and sometimes quite twisted manifestations of their Forms. When we judge an object, we judge it based on how good an image it is, how closely it resembles the Form which is the source of its structure. This is why anyone of any age, in any culture, without the necessity of communication, can judge which of two apples is superior, and tell that twisty trees are weird.
But objects are never truly like their Forms because Forms exist on a completely different level of reality, just as the flagpole exists on a different level of reality from its shadows. We know this the same way we know that the godlike eudaimonia we seek cannot be based on fleeting things like lust and truffles. Forms are indestructible – no matter how many trees or apples burn, the Form remains. With that attribute, in the Greek mind, go the others: Forms are eternal, unchanging, perfect, and divine. They cannot be part of this changing and destructible reality, but must exist on some other layer of reality where change and destruction do not exist. Note how this is in many ways exactly symmetrical to Epicurus’s atomic theory, in which atoms are indestructible, unchanging and perfect, and exist on an imperceptible micro-level accessible to us only by deduction, just as real-but-invisible as the Platonic realm of Forms. Both posit a materially inaccessible world which is the source of the structures of the perceivable world.
What about Zeno and the stick in water? Simple: the motions of a flagpole’s shadow across the ground aren’t rational but bizarre, bending and distorting, split in half at times by passing objects, changing and imperfect. Just so the material world. The stick in water looks bent, and motion is rationally impossible, because the entire layer of reality perceived by the senses is itself bent, distorted, an imperfect effect of a perfect reality elsewhere. When we see the stick look bent, or realize that motion makes no sense, it is at that point that we are beginning to perceive the fundamental flaws in sensible reality, and realize that the true, rational, knowable structure lies elsewhere.
True knowledge, reliable, certain knowledge upon which we may build our path toward reliable, certain eudaimonia, must therefore be knowledge of Forms, not of passing things. We can have True knowledge of the Form of Apples, the Form of Trees, the Form of Justice, the Form of Humans, but we cannot have true knowledge of a particular apple, tree, case of justice v. injustice, or human, because such things are changing, imperfect, and perishable, so even if we could know them perfectly at one instant, that knowledge would not be lasting, not enough to be a real foundation for happiness. The only permanent, certain knowledge is knowledge of eternal things, since all other knowledge is, like its objects, destructible. Thus the Forms are the path to Happiness.
And now, without any need to address the soul, or Platonic love, or Truth, or the other great Platonic signatures, we can describe the Platonic Criterion of Truth:
Source of Error: The material world perceived by the senses is imperfect and illusory, and conclusions based on observation of it are full of error, and incomplete.
Criterion of Truth: Knowledge is certain when it is based on knowledge of the eternal Forms, which can be perceived by Reason. So long as we rely only upon knowledge of abstract, eternal Forms and not on knowledge of specific material things, we will make no errors.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of the Forms, i.e. of the eternal structures that create the sensible world, but we cannot ever have true and certain knowledge of individual objects within the material world.
Now, our friend Socrates has been waiting all this time to rant about how Plato put all this in his mouth, by using him as an interlocutor in his philosophical dialogs, when all Socrates stood for was the principle that we know nothing, and wisdom begins when we recognize that we know nothing. But explicated like this, in a way which highlights how substantial a portion of human experience Plato has yielded to the shadows of skeptical unknowability, Socrates has far less cause to object. Plato has taken “I know nothing” as his starting point, as, in fact, did Epicurus, both of them beginning by scrapping the received commonplaces of things people thought they knew about the material world, and instead trying to find a space for certainty far removed from the evidently-unknown world of daily experience. We all know that Plato tried to appropriate Socrates to his system, painting Socrates as a Platonist and implying that Socrates agreed with all Plato’s dogmatic ideas as well as his skeptical ones.
But Plato was far from the only one to do this. In the ancient world, Skeptics, Cynics, Stoics, Aristotelians and Neoplatonists all make claims about Socrates really believing what they believed, that Socrates was really a skeptic, or a stoic sage, etc. This is easy because Socrates left us nothing in his own voice, but also because all of them really did begin as he demanded, by doubting everything, declaring that “I know nothing” and then trying to work from that toward a system which carves out one zone for the knowable and surrenders another to the unknowable. Attempts by later sects to appropriate Socrates reflect his fame, but also their universal gratitude for the way his refinement of skepticism created a starting point from which they could approach their Criteria of Truth, and start from there to lay their foundations. And now that I’ve put it that way, Socrates seems much less set on picking a bone with Plato, and much more interested in the bones of the chicken drumsticks Sartre brought, which are much larger than those Descartes brought, which are larger than the ones Socrates is used to, a mystery which definitely bears investigation. We can in part blame one “Aristotle”, though when I mention him our more modern thinkers smile knowingly, thinking of the many stages that had to pass between the ancient empiricist and the alien concept “progress.”
The Aristotelian Criteria of Truth: Categories and Definitions
Aristotle studied with Plato for decades, and his framework has a similar beginning. Yes, we instantly recognize that apple is apple and cat is cat, even if we are on the other side of the world and recognize apple as ringo and cat as neko. And we instantly judge the withered apple as being farther from what an apple ought to be than the crisp one.
What Aristotle doesn’t like is how Plato has the Forms exist in a hypothetical immaterial reality removed from the sensible reality. Instead, he uses the term “form” to refer to structures within natural objects, which are not material but not immaterial either. They are non-material. This may sound like gibberish, but I recently demonstrated it very effectively to my class by taking two apples to the front of the classroom, setting them down while I had a drink of water, then violently smashing one of the apples with repeated blows from the butt end of the water glass, reducing it to a sticky green pulp and producing an extremely startled and, in the front rows, apple-bespattered classroom. “What did I just destroy?” I asked. It took only a few moments of recovery for one student to supply: “The form of the apple.” Aristotle even goes so far as to say that forms, rather than matter, are what the senses sense. When we see an apple our minds do not register the raw, chaotic matter, they register the structure: apple. When we see smashed apple pulp even then we do not see matter, we see pulp, which has its own structure. We never perceive matter, or rather never recognize matter, never understand matter. All cognition takes place on the level of form, which is why we can identify “apple” at a glance and not have to spend a minute assembling the millions of points of perceived light and color together to deduce that it’s an apple.
But if the form, for Aristotle, is a structure within individual objects, and is destructible, it can’t be a source of eternal certainty, nor can it explain how my colleague in Japan can recognize and judge apple identically to the way I do. For this Aristotle posits Categories. Universal categories exist in nature, non-material structures just like forms, into which the forms of objects fit. Human Reason is capable of identifying these categories, by looking at objects, understanding their forms, and identifying their commonalities, functions etc. We all see the apple and recognize that it fits in the category apple. We further recognize that the category apple fits in the category fruit, and that in the category “part of a plant,” etc. And that the Stayman apple is a sub-category within the category apple. This allows us to identify and judge even objects which we have never seen before and have no names for. You probably do not know at a glance what the creature pictured to the left here is, but you can identify that it belongs in the category mammal, possibly in the rodent category or maybe more like a tiny deer judging by those skinny legs, but certainly in the medium-sized, ground-dwelling, non-carnivore, probably scavenger eating fruit and bugs and things, not-dangerous-to-humans category. (It is, in fact, a Kanchil or “mouse-deer”). Similarly we can all categorize trees, rocks, fish, and other things. Aristotelian categories are part of Nature itself, eternal and unchanging, and indestructible, since the category apple and the category Kanchil will be unchanged regardless of the creation or destruction of any individual. A withered apple doesn’t harm the category apple, nor does a limping three-legged Kanchil, and the extinction of the T-Rex didn’t erase the category T-Rex.
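For programmers, nested categories may feel familiar: they behave much like a class hierarchy, where membership in a narrow category automatically implies membership in every broader one that contains it. The sketch below is my own analogy, not anything in Aristotle's text:

```python
# Toy analogy (mine, not Aristotle's): categories as nested classes.
# Recognizing something as an Apple automatically places it in the
# broader categories Fruit and PlantPart as well.
class PlantPart: pass
class Fruit(PlantPart): pass
class Apple(Fruit): pass

specimen = Apple()

# Membership propagates up the hierarchy of categories.
assert isinstance(specimen, Fruit)
assert isinstance(specimen, PlantPart)

# Destroying an individual does not touch the category itself:
del specimen
assert issubclass(Apple, Fruit)  # the category Apple is unharmed
```

The analogy is loose, of course: Aristotle's categories exist in Nature, not in a programmer's head, but the logic of nested membership is the same.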
The extinction of the Brontosaur didn’t erase the category Brontosaur either – it was our discovery that the category was wrong that did so, and here we get toward Aristotle’s ideas of certainty and error. We had not defined our terms carefully enough, had accidentally separated two things that shouldn’t be, and thus were led to error. Error caused by insufficiently clear definitions of our terms. The categories are sources of true, certain and reliable knowledge. Like with Plato’s forms, we cannot Know-with-a-capital-K individual things with certainty, since they are destructible and changing, and the apple which is fresh today will be withered next week. But we can know the categories, and that it always has been and always will be the nature of the apple to grow on trees and try to be sweet and colorful to attract animals to eat it and spread seeds, and that it always has been and always will be the nature of the T-Rex to be a humongous terrifying predator the sight of which inspires fear in all mammals and other smaller creatures. One source of error is when we make mistakes about categorization. We may mistake the Kanchil for a rodent, until more careful observation reveals it is more closely related to a deer, or mistake a Vaquita for a dolphin. We may mistake the Brontosaur for its own species before we realize it is a juvenile version of another thing, as easy a mistake to make as thinking that a caterpillar and butterfly are different creatures until we examine more closely.
We also want to do this with things we may not, in modern parlance, think of as part of Nature, but just as there is the category “cetacean” within which exists the category “porpoise” so too there exists the category “integer” within which exists the category “prime number,” also the category “system of government” within which lies the category “democracy,” and the category “virtue” within which exists the category “justice.” Aristotle, and the rest of Greece with him, does not draw our modern post-Rousseau line between “Natural” and “artificial” placing human works in the latter. Birds are part of Nature, as are humans; birds’ nests are part of Nature, with a category, as are all the things humans create. The category “web page” which contains the category “blog” is as natural as the category “tree”.
Thus Aristotelian certainty comes with careful, systematic investigation of the categories within nature, and if we want to reduce error we can do so best by studying and measuring and comparing objects we see until we can fit them into categories. The more we study, and the more carefully we define our terms, the clearer our conversations will become, less given to assumptions, misunderstandings and error. One source of error, therefore, is equivocal language, words that are sloppily defined and don’t refer to real categories in nature. Brontosaur, planet, motion, Justice, good, are all sloppily-defined terms. Any term which does not point to a real category in Nature is sloppy and may lead us to error. If we use only vocabulary that is carefully worked through and points only at real categories, then our language will be clear, our communication perfect, and the possibility of error greatly reduced. After all, we only want to be talking about categories, not anything that isn’t one, since, as with Plato’s forms, categories are eternal, unchanging and reliable. On their foundation we can build our path. As with Plato and Epicurus we have surrendered knowledge of individuals, in favor of knowledge of something structural which underlies them.
Excuse me: to proceed farther with Aristotle, I need to go get my fork. Here it is. (Or rather an image of it, one level less real, its Platonic shadow.)
This fork has been part of my life since I was a tiny girl, and it taught me about the Aristotelian sources of error. When I was little, I would help put the silverware away. This fork puzzled me. Why? Because I couldn’t figure out how to categorize it.
Here you see my dilemma. We had one slot for forks, which had tines and metal handles. And one slot for knives, which had blades and wooden handles. Where then goes this fork, which has tines but a wooden handle? Let’s offer the dilemma to our Youth.
Youth: “I think it should go with the metal-handled fork.”

“Why?”
Youth: “Because it’s a fork. It’s used for fork things, that’s more important than what it’s made of.”
*Ding!*Ding!*Ding!* Correct! The Youth, like my child self, has correctly identified the Aristotelian distinction between an “essential property” and an “accidental property”. An essential property is a quality of something essential to it being itself, and filling the function it has in Nature; an accidental property is something that could change and it wouldn’t matter. A cat can be black or tabby (accidental) but must be slinky, carnivorous, and endearing to its owner in order to fulfill the functions of a cat. A tree must grow a woody trunk and produce leaves in order to fulfill the functions of a tree. A fork must fit comfortably in my hand and lift chunks of food to my mouth for it to be a fork. If the cat is orange, the tree is forked, and the fork is a futuristic rod that lifts food using a miniature tractor-beam instead of tines, those are accidents. If these things fulfill these functions badly–if a cat is ugly, a tree is all bent and twisted and produces few leaves, or a plastic fork snaps when I try to skewer food with it–we judge them bad examples of what they are. If these things don’t fill these functions at all–a quadrupedal mammal eats grass, a plant produces a soft viny stalk, and a piece of silverware cuts food in half instead of lifting it–we judge they do not belong in the categories cat, tree and fork respectively because they lack their essential properties. If I had mistakenly stored my wooden-handled fork with knives, that would have produced error, the same source of error as when we mistake a Kanchil for a rodent, or when Descartes, living in the 17th century, reads an article about how people from Africa are not the same as people from Europe because their skin is a different color. Mistaking accidental properties for essential ones has introduced error.
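The essential/accidental distinction also has a natural rendering in code, where the category is the type, the essential property is the function the type must perform, and the accidental property is mere instance data. Again, this is my own toy illustration, not Aristotle's formulation:

```python
from dataclasses import dataclass


# Toy rendering (my illustration, not Aristotle's) of essential vs.
# accidental properties: what makes something a Fork is its function
# (essential); the handle material is accidental data that can vary
# without changing the category.
@dataclass
class Fork:
    handle_material: str  # accidental: wood or metal, still a fork

    def lift_food(self) -> str:  # essential: a fork must do fork things
        return "lifted"


wooden = Fork("wood")
metal = Fork("metal")

# Accidental differences do not change category membership...
assert type(wooden) is type(metal) is Fork
# ...and both individuals fulfill the essential function identically.
assert wooden.lift_food() == metal.lift_food() == "lifted"
```

On this reading, my childhood silverware drawer had sorted by an accidental property (handle material) when it should have sorted by the essential one (function).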
And to call a robot toy a “cat”, or a metaphor for understanding genealogy a “tree”, or a fifteen-foot fork-shaped sculpture a “fork” is to employ ambiguous language, not referring to its categories, introducing error.
But what about Zeno, and our stick in water? For our stick in water Aristotle, much like the Epicureans, wants us to examine the stick more carefully, multiple times with multiple senses, to correct the mistake. And, like the Epicureans and Plato too, he surrenders true knowledge of individual objects, saying we can know Categories with certainty, after careful examination, but not specific things.
As for Zeno, there he comes from a different angle, attempting to refute Zeno with pure logic. Aristotle is big on observing Nature, but also on logical principles, especially a priori principles. By these he means logical principles which are self-evidently true and require no knowledge or experience to be proved. For example: The same thing cannot both be and not be at the same time. Think about it for a while, take your time. It’s the case, and not only is it the case but it’s the case for lampreys, and thumbtacks, and hypothetical frictionless spheres, and ideas, and systems of government, and people. Even if you were a brain in a jar that had never had any experience of the world outside the mind, you could identify that a concept cannot both exist and not exist at the same time. Here’s another: “One” and “many” are different. It is nonsense to imagine that a thing could be both singular and plural at the same time. That too you can conclude without any basis in anything.
Now, it is possible to use clever syntax to come up with what seem like counter-examples. What about a doughnut hole: surely it exists and doesn’t exist at the same time, for this doughnut has a non-existence which is its hole, and yet here I am eating this doughnut hole. No, says Aristotle. That apparent contradiction is merely a function of unclear vocabulary giving two things the same label when they are utterly different. Similarly this pomegranate is one and many at the same time. Again, no: it is many seeds, but one pomegranate. Use strict vocabulary, unambiguous terms, and discuss only categories, and you will find that Aristotle’s a priori principles are sound.
Reasoning from such starting points, and using raw logic without recourse to any knowledge of the material world, he then takes on Zeno. You cannot, says Aristotle, have infinite regression. It may seem you can, but an infinite chain is a logical impossibility because it would never end and never start. When you try to think about it, the mind rebels, just as it does when it tries to think of the one and the many being the same, or a thing both being and not being at the same time. Thus, says Aristotle, Zeno’s paradox is proved false because infinite regression is logically false. We can now rely on logic, so long as it is careful and methodical, and based on first principles and on comparison of the categories rather than leaping to conclusions directly from sense impressions of individual objects, which are flawed.
Sources of Error: (1) People using vague vocabulary that is unclearly defined and does not refer to anything Real, (2) Fallibility of individual material objects and rushed conclusions based on observations of such objects (note how similar this latter is to Plato).
Criterion of Truth: Knowledge is certain when it is based exclusively on either or a combination of a priori logical principles which are not dependent on anything other than logic to be certain, and on the eternal Categories which exist universally in Nature, and can be known through observation and discussed using a carefully-defined lexicon of philosophical vocabulary.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of logical principles, and of the Categories, i.e. of the eternal structures within Nature that the forms of objects fall into, but we cannot ever have true and certain knowledge of individual objects within the material world.
Thus we have a third path, clearly delineating the arena of certain, eternal knowledge (on the basis of which we may seek eudaimonia) and separating it from the unknowable, which we surrender forever to skepticism. And once again the unknowable is the realm of matter, individual things, the essence which is given structure and comprehensibility by form. Aristotle, like Epicurus, has given up any chance of understanding matter itself, confining the cognizable world to that of form and structure, the macro-level. And he has surrendered knowledge of individuals, of this apple and this lamprey, granting us only the categories. We can still know an enormous amount in Aristotle’s system, enough to build a vast system of knowledge, a library of definitions, a vast network of genus and species names, and an empirical basis for an entire scientific system. Infinite knowledge lies before us on our Aristotelian path, infinite logic chains to follow, infinite categories to investigate, name, compare and discuss. The surrender, like Epicurus’s surrender of the ability to see atoms, feels minor.
“It’s still delusion!” Sartre says. “The surrender is vast! Infinite! Infinitely more vast and fundamental than your daily world imagines!” This outburst has been building up in poor Sartre for some time, which we can tell because he has been holding his knees and rocking back-and-forth and flushing, and is only barely sociable enough to thank Descartes for that eclair (which is not, in fact, a lightning bolt but is a delicious pastry named “lightning bolt” in French, much to Aristotle’s chagrin). And, at some risk of frightening our innocent interlocutor the Youth (whom I shall advise to have Socrates hold his hand through the next bit) I will let Sartre continue in his own words, an excerpt from his Nausea (note that this particular translation uses existence rather than being):
“So I was in the park just now. The roots of the chestnut tree were sunk in the ground just under my bench. I couldn’t remember it was a root any more. The words had vanished and with them the significance of things, their methods of use, and the feeble points of reference which men have traced on their surface. I was sitting, stooping forward, head bowed, alone in front of this black, knotty mass, entirely beastly, which frightened me. Then I had this vision. It left me breathless. Never, until these last few days, had I understood the meaning of “existence.” I was like the others, like the ones walking along the seashore, all dressed in their spring finery. I said, like them, “The ocean is green; that white speck up there is a seagull,” but I didn’t feel that it existed or that the seagull was an “existing seagull”; usually existence hides itself. It is there, around us, in us, it is us, you can’t say two words without mentioning it, but you can never touch it. When I believed I was thinking about it, I must believe that I was thinking nothing, my head was empty, or there was just one word in my head, the word “to be.” Or else I was thinking . . . how can I explain it? I was thinking of belonging, I was telling myself that the sea belonged to the class of green objects, or that the green was a part of the quality of the sea. Even when I looked at things, I was miles from dreaming that they existed: they looked like scenery to me. I picked them up in my hands, they served me as tools, I foresaw their resistance. But that all happened on the surface. If anyone had asked me what existence was, I would have answered, in good faith, that it was nothing, simply an empty form which was added to external things without changing anything in their nature. And then all of a sudden, there it was, clear as day: existence had suddenly unveiled itself. It had lost the harmless look of an abstract category: it was the very paste of things, this root was kneaded into existence. 
Or rather the root, the park gates, the bench, the sparse grass, all that had vanished: the diversity of things, their individuality, were only an appearance, a veneer. This veneer had melted, leaving soft, monstrous masses, all in disorder—naked, in a frightful, obscene nakedness.”
By this point our Youth is very glad to have his hand held, and Descartes is having second thoughts about sharing his eclair with what has evidently turned out to be a lunatic Lovecraftian cultist. But I let Sartre speak here to demonstrate that these surrenders, made in the earliest days of philosophy by system-weavers seeking to escape the web of Zeno and the Stick, are still substantial. Even the most recent modern philosophy returns, from time to time, to these ancient surrenders to unknowability, and some try, like Sartre, to make new inroads toward knowing what the majority of thinkers have given up on. New and, in Sartre’s case, scary inroads. Every system-weaver since Plato may have a Criterion of Truth to be our light in the darkness, our path, our foundation, the circle line for the new philosophical subway system, but the fertile symbiosis between skepticism and dogmatism–the symbiosis which has borne such fruit: Platonic forms, genus and species, atoms, eventually the scientific method itself!–is also still sometimes a hostile symbiosis, and the wild, strong skepticism of Pyrrho still sometimes rears its head to plague Sartre and us, even as we make daily use of soft forms of skepticism like Epicurus’ weak empiricism, and Aristotle’s categories.
Of course, many are the centuries between Epicurus and Sartre, and many the new relationships between doubt and dogma, the new Criteria of Truth and new forms of shadowy un-knowledge which will press upon our fragile paths, before we reach the modern world. So we still have much more to explore in further chapters. Good thing Descartes brought plenty of lightning bolts.
It is easy for us to forget how the Scientific Method, at work behind all this research, is a uniquely flexible and dynamic belief system, one which enables our uniquely flexible and dynamic world. Some will feel uncomfortable with me calling science a “belief system” but in this context I use the phrase “belief system” as a reminder of what the Scientific Method and its associated apparatus have displaced. Science has not replaced religion–they coexist happily, productively, even symbiotically within many arenas, places and individuals, even as they chafe and vie in others. But in the modern West, the Scientific Method has largely displaced older systems for guiding daily micro-decision-making which were more closely tied to religion. We now use science-based reasoning a hundred times a day when we are called upon to make decisions. Whether making a sandwich, buying a new teapot or evaluating an argument, we think about data from past experiences, bring in what facts and hypotheses we have accumulated from educated and informed living, consider the credibility of sources, ask ourselves questions about plausibility, probability, evidence and counterargument, speculate about the range of possible errors and outcomes. We go through many steps, often fleeting but still present, before we assemble our sandwich (which recent nutrition advice seems plausible in the ever-changing range?) or buy our teapot (plastic so housemates won’t break it, or ceramic for environmental/health/aesthetic/flavor reasons?) or decide whether to grant a politician’s argument our provisional belief or disbelief. Even for those members of modern Western society whose lives are powerfully informed by faith or institutional religion, who do seriously factor “What would Jesus/Apollo/Whatever do?” into the calculation, evaluatory criteria based on science and its method remain a substantial, if not exclusive, part of our apparatus for daily decision-making.
For my purposes today, the most important part of what I just described is that the belief or disbelief we extend to the politician (or to our teapot) is provisional. We decide that a thing is plausible or implausible, and extend to it a kind of belief which is prepared for the possibility that we will be proven wrong. That thing the politician said might turn out later to be false (or true) when new information arises. As for the teapot: let’s say I pick one which claims to be safe and eco-sound because of XYZ carbon something something. It may seem that I have given its claims my complete belief by buying the teapot, but that too is provisional, since my long-term purchasing decisions for other objects will be informed by further information, changes in industry, and, of course, my empirical experience of whether or not this teapot serves me (and survives my housemates) well.
What we knew about teapots, coral reefs, moths and tree sloths, Arthuriana, protons, and the Greek concept daimon, can all be overturned and yet we remain comfortable with the Scientific Method which produced our old false information, and we are still prepared to let it provide us with new information, then overturn and replace the new information in its turn. We do this without thinking, but it is in no way a universal or natural part of the human psyche. When I chatted with my father about the proton research, he summed it up nicely: two possible responses to hearing that how we measure something seems to change its nature, throwing the reliability of empirical testing into question, are: “Science has been disproved!” or “Great! Another thing to figure out using the Scientific Method!” The latter reaction is everyday to those who are versed in and comfortable with the fact that science is not a set of doctrines but a process of discovery, hypothesis, disproof and replacement. Yet the former reaction, “X is wrong therefore the system which yielded X is wrong!” is, in fact, the historical norm. Whether it’s an Aristotelian crying “Plato has been disproved!” or Bernard of Clairvaux crying “Abelard has been disproved!” or a Scotist crying “Aquinas has been disproved!” the clear overthrow of a single sub-principle within a system was, for many centuries, sufficient to shake the foundations of the system as a whole, and drive people to part with it and seek a new one.
All this is a way of previewing the endpoint of the present series, in order to show how important the often-invisible role of doubt is in current human thought. Without skepticism, and important developments in the history of skepticism, we could not have the Scientific Method occupy the position it does in modern daily lives. So I want to sketch out here some of my favorite moments in the history of skepticism, not a complete history (for that see Popkin’s History of Skepticism or Allen’s Doubt’s Boundless Sea), but the spicy highlights that I’ve most enjoyed.
Dogma and Doubt
There are many ways to subdivide philosophy, but one of the most useful is, in my view, the subdivision into dogmatic and skeptical. I’m using these terms in their technical philosophical senses, so I do not intend to invoke any of the contemporary, negative cultural associations of “dogma” or “skeptic.” (Philosophy and history are constantly plagued with the disconnect between formal uses and modern casual uses of terms like these, Epicurean, Hedonist, Realist, Idealist… and it’s worse when I learn the technical term before I meet the popular one. I can’t tell you how confusing it was the first time I was in a conversation where someone used “libertarian” in its contemporary political sense, which I had never met, having learned it from Spinoza class. Them: “FDR is a big foe of Libertarianism.” Me: “Really? I didn’t know FDR denied the existence of freewill. Was he a materialist? A stoic?” And when I tell my students that, for the purposes of Plato class, “Realist” and “Idealist” are synonyms they sometimes look like they’re about to cry…) For today’s purposes by “dogmatic” I mean any philosophical moment or system which argues that something can be known, or that there can be certainty. By “skeptical” I mean a philosophical moment or system arguing that something cannot be known, or that there cannot be certainty. In this sense, Aristotle’s argument that the existence of a Prime Mover can be logically proved from the principle that any chain of events must have a First Cause is dogmatic, as is the conviction that we know with certainty that the square of the length of the hypotenuse of a right triangle is equal to the sum of the squares of the remaining sides. Pierre Bayle’s argument that God’s existence can be known through faith alone is skeptical, as is the argument that quantum uncertainties like Heisenberg’s mean that material reality can never be fully understood because the act of perceiving it alters it.
Thus neither skepticism nor dogmatism is more or less tied to theism than the other – both are broad and diverse categories, and most great intellectual traditions have both in there somewhere.
Dogmatic philosophy is what most people usually think of when we think about philosophy: systems that propose particular things. The Platonic Good, Aristotle’s Categories, Descartes’ vortices and Heidegger’s Being are all founded in claims that we know or can know some thing or set of things with certainty. Yet skeptical arguments, about what cannot be known, have coexisted with dogmatic claims throughout philosophy’s existence, and the two act as foils to one another, arguing, cross-pollinating, hybridizing, and spurring each other on, and their interactions have been among the most exciting and fruitful in philosophy’s long history.
I will begin as close to the beginning as I can:
Happiness in Ancient Greece:
While post-17th-century philosophy often puts its primary focus on the quest to explain and describe things and create a system of knowledge, one key unifying attribute common to just about all classical Greek philosophical schools, though different in each, is the goal of attaining eudaimonia (εὐδαιμονία), from “eu” = good, happy, fortunate, and “daimon” = spirit, soul. It’s usually translated as happiness, but it’s both more specific and stronger. Other renderings that help get the idea across include wellbeing, self-contentment, self-fulfillment, spiritual joy, and personal welfare. It is the kind of happiness which is deep, lasting, tranquil, reliable, complete, and, in the Greek sense, godlike or divine. By “divine” I mean a list of attributes that most Greek philosophers associated with the gods, who were supposed to be immortal, unchanging, indestructible, eternally happy and satisfied, living in a bliss surrounded by beauties and free from pain. These are not Homeric Greek gods who feud and lust and rage, but more abstract philosophical gods personifying unchanging eternal principles, the sort of gods Plato believed in and for which reason he wanted to censor Homer’s depictions of the more fallible and anthropomorphic ones. The word daimon thus occupies a complex space, much debated, but can be rendered as a spirit, soul or thinking thing, referring to a category vaguely encompassing human souls, gods and intermediary spirits. Thus, eudaimonia is the state of having a happy or fortunate spirit, so my favorite way of rendering eudaimonia is “the kind of happiness Platonic gods experience” i.e. long-term, untroubled, indestructible happiness.
Become a philosopher, lead a philosophical life as I do, and you will achieve, or at least approach, happiness–this is the promise made by every sect, from Epicurus and Seneca to Diogenes and Plato. In the classical world, being a philosopher was much more about life, living well and demonstrating one’s philosophical prowess through one’s personal excellence and successes than it was about writing comprehensive masterworks expounding systems (See Hadot’s What is Ancient Philosophy? and Philosophy as a Way of Life). Each classical philosophical school had its own path to happiness, and each entwined it with different parallel goals, such as the pursuit of personal excellence, or understanding of nature, or civic virtue, or piety, or worldly pleasure, or friendship, any number of things, but we find no classical school for which approaching eudaimonia through leading a philosophical life was not a core promise.
I should note in passing that, in later classical writings, it becomes clear that they take the divine aspect of eudaimonia very seriously, and Neoplatonists especially refer to past philosophical sages as “divine,” arguing that Socrates, Plato, Zeno, Diogenes of Sinope, Seneca and so on, had achieved states of philosophical happiness that made their souls identical with those of gods, even while they were contained within mortal flesh. The daimon or soul which is happy in eudaimonia is, after all, categorically the same type of thing as a god, and one of the leading differences between a human soul and a divine one is that the divine one experiences indestructible happy serenity. If a philosopher’s soul achieves the same state, is it not a god? Particularly in a culture which already practiced deification and ancestor worship? Platonic claims about a philosophical soul growing wings, leaving the body and dwelling among the gods helped further cultivate this impulse. The practice of Theurgy, philosophical magic, developed from the idea that such a divine soul, even while resident in human flesh, could work miraculous effects, such as levitation or generating light.
Now, eudaimonia is a high bar to achieve. Indestructible, god-like happiness must be able to stand unchanged in the face of all changes, a great challenge in a human existence beset by a thousand evils including wolves, tyrants, malaria, civil war, famine, injustice, accidental dismemberment, unrequited love, and human mortality. All our surviving ancients agreed that real eudaimonia could not be dependent upon external sources, like fame, wealth, property, physical fitness, romantic love, even liberty of person, because such things could be taken away from you by fickle fate, making them unreliable, and your happiness destructible. I say surviving ancients because we do not have the writings of the Hedonist school, which we know focused on positive, experiential pleasures including, probably, food, drink and sex, and who may have been an exception, but their exceptionality doomed them to be silenced by the dissenting majority. Those who survive agreed that eudaimonia had to be a state of the thinking thing, the mind or soul, independent of experience, body or social position. It was most frequently connected with things like tranquility, self-mastery, acceptance, and taking enjoyment from things that cannot be destroyed, like Truth. It was also connected with freeing the soul from cares, such as fear, anxiety, envy, ambition, possessiveness, and general attachment to Earthly, perishable things.
These classical philosophical schools developed guides for living and decision-making intended to facilitate a happy life, and those with systems of physics and ontology often tied those closely to their paths to happiness. For example, the atomic explanations for the natural non-divine mechanisms behind thunder and lightning were promoted by Epicureanism as something which could make people happy by freeing them from fear of being zapped by a wrathful Zeus. Thus philosophical disciplines like physics, biology and even basic ontology were in their way tools of eudaimonia as much as they were attempts to explain things. Modern scholars even debate whether, in such cases, the physics was the source of the moral philosophy, or a tool developed afterward to support it when eudaimonia seemed to need it as an ally.
One of the sources of pain and unhappiness from which such systems set out to free people was curiosity, i.e. the unhappiness that derives from hungering for answers. This too needed to be satisfied to achieve the stability of eternal, godlike happiness. The quest to end the pain caused by curiosity meant supplying answers, to questions big and small, but especially big. And they needed to be certain answers, which would be reliable and eternal, and stand up to the assaults of fortune, or else eternal, reliable eudaimonia could not rest upon them. This added extra energy to the quest for certainty. One wanted to be really, really sure an answer was right, so one could rest comfortably with it, and be happy, and know it would never change. And one wanted the facts which served as foundations for philosophers’ broader advice on how to achieve happiness to also be certain and unchanging. If Plato says the key to happiness is Truth, Excellence and the Good, or Aristotle proposes his Golden Mean, you want their claims to be based in certainty.
Two tools were employed in pursuit of certainty: Logic and Evidence. All dogmatic claims (i.e. claims of certainty) made by any of our classical thinkers were based on one, the other, or both.
Evidence includes any claims based on observation, sensation, lived experience, or, more technically, empiricism. If Aristotle says bony fish and cartilaginous fish are different because he has dissected a hundred of them and can describe how their insides are different, that is empiricism. If Thales or Heraclitus draw conclusions based on seeing how fire emerges from wood, that is empiricism. If Plato asks us to think about when we’ve seen someone beat a dog and say whether it makes the dog better or worse, that too is a kind of empiricism.
Logic includes any argument based on reasoning instead of sense experience. If Aristotle says that a thing cannot both be and not be at the same time, that is an argument based on logic. If Plato asks us if beating a dog makes it worse with respect to the properties of horses or worse with respect to the properties of dogs, that is also an argument asking us to apply logic.
Meanwhile, in a nearby lake…
…a stick fell in a pond, and skepticism was born, like Venus, from the waters. Or rather, from someone who saw the water, and saw a stick sticking half-way out of it, and noticed that the stick looks bent or broken at the point where it goes into the water. And yet, the stick is not bent. The person bends over and touches it, just to be sure, and the fingers confirm the wood is whole and strong. My eyes are lying to me! My eyes can’t be trusted! If this stick isn’t bent, what else that my eyes have told me may be false that I haven’t yet realized? What if the sky isn’t blue? Or milk isn’t white? What if trees have faces, chalk is actually as beautiful as gold, and the sky is swarming with exquisite creatures I have no way to detect? And if I can’t trust my eyes, what about my ears? My hands? Sense perception is unreliable! But in that case, how do I know anything I’ve experienced is as I thought it was? Or even that anything is real?! Panic! Panic more! (Quietly in the background Descartes and Sartre are still carrying out the “Panic more!” instruction nearly two millennia later).
The stick in water is a genuine, ancient example, much discussed by pre-Socratic philosophers. We don’t know if it’s actually the first example, since the earliest conversations are lost to time, but it may be, and certainly if it isn’t it was something similar, one of the other optical distortions discussed by ancients, like how a square tower can be mistaken for a round tower when seen from a great distance. What does survive is what later philosophers made of these early discussions of the mystery of the stick in water: epistemology, the study of knowledge, how we know things, and when we can or can’t have certainty. The stick in water challenges any claim that the senses can be relied upon as a source of certainty. Forever after, therefore, any philosopher who wanted to make any claim based on sense perception first had to have a way to explain how we could trust the senses despite this, and other, failings.
And the stick in water has a brother: Zeno’s paradoxes. If the stick in water undermines the credibility of sense-perception, its partner, Zeno’s paradoxes of motion, are what undermine the credibility of the other traditional source of information: logic. You have all heard Zeno’s paradoxes before, but rarely in companionship with the stick in water, which is what gives them their oomph, so it’s worth revisiting one here:
An archer looses an arrow at a target. Before the arrow reaches the target, it must go half way. Next it must go half the remaining distance. Then half that distance. Then half that distance. Half, half, half, half but we can do this forever, so the arrow can never actually reach the target, because it must cross an infinite number of micro-distances first, and nothing can travel infinite distance. Therefore, logically, motion is impossible. Cue polite applause for the logical trick, as at the successful completion of an elegant and challenging ice skating routine. (Cue also Descartes and Sartre glaring at anyone who’s still smiling.)
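(A modern aside, not part of the ancient debate: the “half, half, half” distances Zeno stacks up form a geometric series, and a few lines of Python can tabulate what happens as we add up more and more of those infinite micro-distances. The function name here is my own invention for illustration.)

```python
# Zeno's halvings as a geometric series: 1/2 + 1/4 + 1/8 + ...
# Each partial sum is the fraction of the way to the target covered
# after n successive halvings of the remaining distance.

def distance_covered(n_halvings: int) -> float:
    """Fraction of the full distance covered after n halvings."""
    return sum(0.5 ** k for k in range(1, n_halvings + 1))

for n in (1, 2, 10, 50):
    # The totals creep toward 1 (the target) without ever exceeding it.
    print(f"after {n:2d} halvings: {distance_covered(n):.12f}")
```

However many micro-distances we pile up, the running total stays finite and approaches the full distance, which is one way a modern reader frames the puzzle, though of course it is not how the ancients experienced it.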
Why is this more than a cute trick?
Youth: “But we know motion is possible, Socrates.”
Socrates: “How?” (All philosophical dialogs are with Socrates, even when they aren’t.)
Youth: “Because I can hit you. See?” Hits Socrates.
Socrates: “Yes, very good. So you know it is possible because you did it?”
Youth: “Exactly, Socrates. I can do it again if you aren’t convinced.”
Socrates: “If you want to exercise your will in that way (if there’s such a thing as will) then that’s your choice (if there’s such a thing as choice), but first, perhaps you could explain to me, using logic, how you are able to hit me, if your arm has to cross infinite distance first?”
Youth: “I… um… I don’t think I can, Socrates. I just know that I hit you, and could do it again.”
Socrates: “But you can’t explain logically why.”
Youth: “No, Socrates, I can’t.”
Socrates: “So wouldn’t you say, then, that logic is incapable of explaining motion?”
Youth: “I guess so, Socrates.”
Socrates: “Doesn’t that bother you? That logic fails to be able to explain something so seemingly simple? Doesn’t that make you distrust logic itself as a tool? It would seem that logic itself is unreliable and can’t lead to certainty.”
Descartes (quietly in the background): “Panic more!”
Youth: “I guess that’s so, Socrates, but it just doesn’t bother me the way it bothers those weirdly dressed men over there.”
Socrates: “And why doesn’t it bother you?”
Youth: “Well, because I know that motion is possible because I can do it and see it. I don’t need logic to explain it.”
Socrates: “So even without logic, you’re sure there is motion… because?”
Youth: “Well, because when I move my arm to hit you, I can see it. When I touch you with my hand, I can feel the impact, the texture of your skin. I can still feel it a little on my own skin, the spot where it struck yours.”
Socrates: “So you know there is motion because your senses tell you so?”
Youth: “Yes, Socrates.”
Socrates: “So, since logic is unreliable, you choose to rely on the senses instead?”
Youth: “Yes. I trust things I can see and touch.”
Socrates: “Then tell me, my young friend, have you ever happened to notice what happens when a stick falls so it’s sticking half-way into a pool of water?”
Our youth, whom we shall now leave panicking on the riverbank along with Socrates, Descartes, Sartre and, hopefully, a comfortable picnic, has now received the full impact of why Zeno’s paradoxes of motion matter. They aren’t supposed to convince you there’s no motion, they’re supposed to convince you that logic says there is no motion, therefore we cannot trust logic. Their intended target is any philosopher *cough*Plato*cough*Aristotle*cough* who wants to make the claim that one can achieve certainty by weaving logic chains together. Anyone whose tool is Logic. Meanwhile, the stick in water attacks any philosopher who wants to rely on sense perception *cough*Aristotle*cough*Epicurus*cough* and say that we know things with certainty through Evidence. When you put both side-by-side, and demand that Zeno shoot an arrow at the stick in water that looks bent, then it seems that both Logic and Evidence are unreliable, and therefore that… there can be no certainty!
Don’t panic, be happy…
The double challenge of the stick in water and Zeno’s paradoxes had many effects.
One was to make all classical thinkers who wanted to maintain dogmatic principles work a lot harder to nuance their claims of certainty, to justify why and in what specific circumstances logic and evidence could be trusted, to explain why they sometimes failed or seemed to fail, and how one could reason or observe more carefully in order to achieve greater levels of certainty. Thus these challenges to reason and evidence led dogmatic philosophers to adopt skeptical tools and create systems which had space for both dogma and skepticism, hybridizing the two to achieve greater levels of clarity, complexity, dynamism and subtlety and jumpstarting countless great philosophical leaps. To give two quick examples, Aristotle attempted to create a system for achieving infallible logical information by saying that logic is 100% reliable if it is based on a combination of (A) unequivocal carefully defined terms, (B) self-evident first-principles, and (C) geometrically-strict syllogistic reasoning by baby-steps. Stick to these and exclude logical leaps and unclear vocabulary and you can carve out an arena for reliable logic, even if that arena is necessarily finite and cannot touch everything. Similarly Epicurus and Aristotle both proposed a kind of empiricism of repeated observation, where we do not trust just one glance at the stick in water but examine it carefully with all our senses, look at many sticks, and eventually draw conclusions we consider more reliable. And at the same time, these same thinkers gave ground and mixed their dogmatism with skepticism by saying that logic or empiricism worked in some arenas but not others. Epicurus, for example, says we can learn a lot from sense data but we can never learn the true details of the atomic level since we can’t see anything that small.
Aristotle similarly says we can learn about the level of the universe that we can experience and think about, the level of the objects we see and contemplate, but not about the chaotic base substructure which underlies the visible and comprehensible world.
(Sartre, who has just been handed a sandwich by Socrates and is now unconsciously applying the Scientific Method as he considers whether or not to accept Descartes’ offer of mayonnaise, looks up here to say that he agrees with Aristotle that there are vast and terrifying unknown depths of being which lie beneath perceived reality. He thinks we should address our long-term attentions to that mystery, and that Aristotle is foolish to cling to pursuing the finite certainties offered by his logic chains and fish observations when no finite knowledge is helpful in the face of the raw unknown infinity beneath. But Sartre is not interested in pursuing eudaimonia, even if he is interested in the short-term, destructible pleasure offered by Descartes’ excellent fresh mayonnaise.)
But our ancient Greeks are interested in eudaimonia, and another product of these challenges to reason and evidence, apart from letting dogmatic philosophers hybridize with it, was the birth of Skepticism (big S) as a philosophical school, in addition to skepticism (small S) as an approach. As an approach, skepticism is used by all sorts of thinkers, including Plato and Aristotle in their way, but it was also a school, a rival of Platonists and Stoics. And, like all other ancient schools, Skeptics pursued eudaimonia.
How does doubt lead to happiness? By allowing one to relax and resign one’s self to ignorance, says Pyrrho, the greatest name in pure classical skepticism. We cannot know things with certainty, he says, and this is a release (much as Epicurus thought it was a release to believe there is no afterlife). If we cannot know things with certainty, we don’t have to try. We don’t need to go with Aristotle to the docks and dissect infinite fish. We don’t need to sit with Plato and let him pretend to be Socrates through interminable dialogs. We don’t need to follow Pythagoras and fast ourselves into a trance while contemplating the number ten. We can stop. We can say I don’t know, I can’t know, I’ll never know, no one else knows either, no one is right, no one is wrong (not even people on the internet!), so we can just return to our work and rest. This too, say the skeptics, frees us from pain, from several pains that no dogmatic system can ever free us from. It frees us from the exhaustion of the quest to know. It also frees us from the stress we experience when we turn out to be wrong. If you think you know something, and it’s overturned, that’s stressful and unpleasant. It makes you feel angry, foolish, violated, shaken, abandoned. If you never think you know anything about things, you will never experience the pain of being proved wrong.
You know what the skeptics mean here. You know because you are alive in 2014, and that means you remember when there were nine planets. Weren’t you upset? Wasn’t it distressing and unpleasant, shaking your worldview? We learned there were nine planets in kindergarten! Of course Pluto is a planet! Mike Brown, the scientist responsible for getting Pluto’s status stripped away, receives hate mail for precisely this reason: it hurts to be told you’re wrong. And this is far from the only time you, reader, have experienced this. There used to be such a thing as a Brontosaurus. And a Triceratops. (Youth: “What! We lost the Triceratops too!”) There used to be four food groups, remember that? And coral reefs used to exist only in the tropics, and moths used to have nothing to do with tree sloths, and you used to have a volume of the complete works of Sappho. And the destruction of all these “truths” has unsettled us to different degrees, because we learned them at different times and they were integral to our worldviews to different degrees. Some we are okay with; at others we smile at the angry t-shirts that say: I remember when there were Nine Planets!
Now, Aristotle would tell us the strife has been caused by the fact that we had not defined “Planet” carefully enough before, so it wasn’t an unequivocal term, and thus led us to confusion and misunderstandings. “But!” says Pyrrho, “if you had never studied these things, if you had not been taught as a child to memorize dinosaurs, or rest your worldview on the label attached to a hunk of rock far off in the darkness where you never have cause to perceive it, then you would not experience this unhappiness! Your belief that you knew something has made you unhappy, destroying eudaimonia. Just admit that you do not know anything with certainty and then you need never experience such pain again!” And in the case of things we were prepared for–the tree sloth and the Sappho and Arthur having a knight of African descent–the Scientific Method told us to do just that, to be prepared for truth to be replaced when it was time, because it was never Truth, it was always provisional truth.
Ten Modes of Skepticism
Many exciting things will happen to skepticism as it leaves Greek hands before it reaches ours. It will be transformed by Bacon and Montaigne, by Averroes and Ockham, Descartes will finish his potato salad and have his day, and it has more refinement yet to undergo among the Greeks as well, and from their sunny riverbank Socrates and company will watch skepticism surge over the marble walls of Plato’s Academy like ants into their picnic basket. But for today I want to leave you with a taste of raw classical skepticism, so you can savor for a little while this oddest of philosophies which proposes un-knowledge, rather than knowledge, as its happy goal. To that end, here, to finish, are examples of the Ten Modes of Pyrrhonism (i.e. the kind of raw skepticism practiced by Pyrrho) based on the handbook of Sextus Empiricus (one of our few surviving ancient skeptical authors). It is a list of categories of sources of error, things that can make you be wrong. Many are ones that we are very well prepared for in the modern world and remain on constant or at least near-constant guard against (though rather than guarding against the errors, what Pyrrho and Sextus want us to do is be on guard against imagining we aren’t making errors, i.e. to be on guard against thinking we know something. I see Socrates is nodding in approval, and that the others are too polite to point out the crumbs on his chin).
The Ten Pyrrhonist Proofs that Nothing can be Known with Certainty:
We cannot have certainty because different animals have different senses. When do we encounter this? When walking a dog, sometimes the dog stops to sniff in rapt fascination at a spot on the sidewalk where we see nothing interesting. But evidently there is something very interesting there if a creature as intelligent as a dog is fascinated, and willing to disobey its friend and choke itself by pulling on its collar in order to study this fascinating thing. What an error we commit being unable to see this fascination! Or is the dog in error?
We cannot have certainty because different human beings experience things differently. When do we encounter this? I encounter it when friends drink alcohol. I do not enjoy alcohol. Not only am I not supposed to have it (because of a specific medical condition), but it tastes like nasty poisonous motor oil to me, and yet I see my friends go into paroxysms of delight over the subtleties and complexities of drinks, and my civilization builds entire buildings, institutions, customs and industries around this thing which my senses tell me is terrible. My senses and those of my friends differ. Clearly someone must be wrong, unless there is no right here? I also have a color-blind housemate who cannot tell that Hello Kitty Hot Chocolate is bright pink, and struggles to play the game Set in mediocre lighting.
We cannot have certainty because our senses disagree with each other. If I want to know if something is good, I ask my senses. Yet sometimes they disagree with each other. My eyes tell me this artichoke is just made of smooth leaves, yet my touch tells me it is prickly. My eyes tell me a lobster is scary and dangerous, and yet my tongue says it is delicious. My eyes tell me the molten glass in this glassblowing demo looks goopy and exciting, with a fascinating putty-like texture which would be awesome to touch, and yet my touch tells me owwwwwwwwwww hot hot hot hot hot! My touch tells me this cat is delightful and fuzzy and yet my nose tells me I should not be near it because achooo!
We cannot have certainty because sometimes the same things seem different and lead us to different judgments in different circumstances. I might enjoy a food for a long time but then get food poisoning from it and, after that, always be revolted when I smell it. I might feel warm at 70 degrees but then be sick and feel cold at 70 degrees. I might think Gatorade is nasty but then be dehydrated and think it tastes great because my body craves electrolytes.
We cannot have certainty because the same objects seem different from different perspectives. A mountain that looks like a face from one angle looks like a random jumble from another. A square tower seen from a distance seems round. A stick in water looks bent. The moon above a skyline looks much bigger than the buildings but we have no real sense from that of how enormously big it really is, and can only realize the latter using a lot of math, or a space shuttle.
We cannot have certainty because we never see objects alone. Have you ever had one of those articles of clothing that looks purple in some light and blue in other light, so people argue over which it is? Because it looks one way next to one thing and another way next to another. Well, what does it look like really? We can never see anything alone, we always see it surrounded by other objects including air. If the stick is distorted by water, is it not also distorted by air? By vacuum? By light? We do not see objects, only groups of objects.
We cannot have certainty because things take multiple forms. Bronze is red, except when it turns green. Water is clear, unless it’s blue, or fluffy snow white. Squid ink is black, unless it’s diluted to form purple, or sepia. That molten glass is enticingly orange and squidgy. What do any of these things really look like?
We cannot have certainty because we experience everything relative to other things. We cannot see a thing without making some judgment about things that are relative: this clementine is small, this stick is long, this lake is large. Small, long and large compared to what? Other objects of comparison intrude themselves into our analysis. The clementine is small compared to oranges, the stick long compared to other sticks, the lake large compared to my back yard. But we cannot judge things without judging them relative to others. To feed his fish for a while my father was growing Giant Amoebas. Giant Amoebas! Amoebas so big you could almost see them with the naked eye! They were huge! They were smaller than grains of sand and yet I thought they were huge!
We cannot have certainty because we are biased by scarcity. I love this one, and I love its classic example. This is about how we judge things to be… well, frankly, how we judge them to be awesome or not. For example, comets are awesome. A little bright speck appears in the night that wasn’t there before, and flies across the heavens, really fast, so fast you can almost see it move! When there is a comet we get very excited. We discuss it, announce it, get out telescopes to look at it. In past ages people might pray to it, or read omens from it; now we photograph it and shoot probes at it. It’s super exciting: little bright speck in sky. Okay. So, every morning an enormous blinding ball of fire rises from the horizon, blotting out the night and transforming the entire sky to a wall of brilliant blue brightness streaked with rippling swaths of other beautiful colors, and it radiates down heat enough to transform our weather, burn our skin and feed countless life forms. It is, from any sense-perception objective sense, ten skillion times more exciting than a comet. But it’s just the sun, so, shrug. We are biased by scarcity. Two poems by Sappho are discovered and the whole world hears of it, yet we find thousands of pages of unknown Renaissance poetry every year.
We cannot have certainty because different peoples have different customs, habits, laws, beliefs and ethics, and are biased by them. I think you all know this one. Though it will take over a millennium for this idea to become common, since cultural relativism isn’t a broadly-discussed or accepted thing until the Enlightenment, when Montesquieu and Voltaire made themselves its champions. Skepticism has a long road ahead of it, from Pyrrho to the present. But for now, let’s sit back with Socrates and picnic on this raw form of classical, eudaimonist skepticism, challenging our science-loving, learning-loving, exploration-loving, post-enlightenment selves to test ourselves with the question of whether it might be a safe and happy thing sometimes, in its own strange way, to not know. And we should also comfort Sartre a bit–he hadn’t heard, before today, about poor Pluto. (Descartes: “What’s Pluto?” Socrates: “Are you sure you want to know?”)
This is not a full post yet, but an update, and a recommendation.
The process of transitioning to new hosting is well underway, bugs are vanishing and new features will be online soon. The site is already loading faster, and other new things will follow. UPDATE: the photo album is now fixed. Links will be a little slower to regenerate, but they will in time. Bug reports remain welcome.
Meanwhile, I have an enthusiastic recommendation to make for everyone who has been enjoying the historical and philosophical side of this blog. My work on figures like Machiavelli and topics like the history of atheism grew out of my training in intellectual history. The turning point that set me solidly on this path was a pair of classes on European intellectual development in the 17th and 18th Centuries, taught by Prof. Alan Kors at Penn. The lectures are truly amazing, clear and moving, chronicling the development of the scientific method, the crisis sparked by Thomas Hobbes, the new models of mind and nature advanced by Locke and Newton, and the extraordinary and oft-neglected Pierre Bayle. The second half covered the advent of the Enlightenment, which gave me my first real taste of the great firebrands Voltaire, Rousseau, Montesquieu and Diderot, and the revolution of the mind which so shaped our present day. The very same lectures by Alan Kors are now available on CD/DVD/download through The Teaching Company, and usually cost more than $100, but they are temporarily on sale for about $30, a little more if you want the video version. So if you enjoyed my Machiavelli series, and if you like audiobooks, I can’t recommend them highly enough. You can order them here. (There is not, alas, a printed book equivalent of the same content by the same author, but his book Atheism in France, 1650-1729 is, while out of print and rare, independently excellent.)
Update: that sale is over but new ones come up sometimes and this page has coupons for The Great Courses.
Hopefully that will tide you over until my start-of-semester to-do list eases enough to let me write another essay. Soon!
Two quick announcements, then something fun to share.
First, comments were disabled for a little while. Now they are enabled again. Apologies to everyone who wanted to discuss Beccaria – I hope you still want to discuss him, and now you can.
Second, people have been reporting trouble subscribing by RSS. I have investigated, and it seems that, while Firefox, Explorer etc. are fine, Chrome won’t do RSS (for this site or any site) unless you install a Chrome extension for RSS. Googling “Chrome extension RSS” will supply a variety of equally viable options. However, for those who are struggling with RSS and can’t get it working, I have created a mailing list which you can register for in the right-hand sidebar. Whenever I make a new post I will e-mail the list to alert people. I recommend, however, that you use RSS instead of the mailing list if you can, because RSS will alert you without fail, whereas the mailing list is hampered by my ability to remember to send the announcement.
Meanwhile, I will take this opportunity to present another of my favorite objects in the Florentine Museum of the History of Science (aka. Museo Galileo): the Noon Cannon. This is a strange variant on a sundial. A tiny cannon, well under a foot long, is mounted outside, ideally in the gardens of a grand estate. It is fixed in place on a stone slab, with a lens positioned above it. At precisely noon each day, the lens focuses sunlight onto the cannon, heating up the powder charge and making it go off. If every morning you load the cannon with a little bit of gunpowder, then you will be reliably alerted to noon by the sound of a small explosion from your garden. The effect is sort of like a water clock except, instead of tranquil trickling and the tap of wood on stone, there is a ka-boom.
I think the specimen in the museum is probably from the Eighteenth Century, possibly the Seventeenth, but I can’t remember off the top of my head. Of course, no one in our era can see a Noon Cannon and not instantly think of its potential uses in an old-fashioned murder mystery. Simply put shot in the Noon Cannon along with its daily charge, lure the victim to the garden at the specified time, and you can be miles away having an alibi while the Noon Cannon does the rest. “The Colonel put real shot in the Noon Cannon? How dastardly!” The killer could even mess with the lens to make it fire at an unexpected time, then play around with other sources of a substitute noise, a hunting rifle or a champagne cork to simulate the 12 PM shot… it writes itself…