In the comments to the Progress post, a reader asked for clarification on what was so awful about Hobbes, and this was Ada’s response, which I am reposting as a post so that it doesn’t stay buried down there:
The Hobbes reference referred, not to my opinion of him or modern opinions of him, but to contemporary opinions of him: how hated and feared he was by his peers in the mid-17th century. I’ll treat him more in the next iteration(s) of my skepticism series, but in brief Hobbes was a student of Bacon (he was actually Bacon’s amanuensis for a while) and used Bacon’s new techniques of observation and methodical reasoning with absolute mastery, BUT used them to come to conclusions that were absolutely terrifying to his peers, attacking the dignity of the human race, the foundations of government, and the pillars of the morality of his day, in ways whose true terror is hard for us to feel when we read Leviathan in retrospect, having accepted many of Hobbes’s ideas and being armored against the others by John Locke. But among his contemporaries, “The Beast of Malmesbury,” as he was called, held an unmatched status as the intellectual terror of his day. In fact there are few thinkers in all of history who were so universally feared and hated–it’s only a slight exaggeration to say that for the two decades after the publication of Leviathan, the sole goal of western European philosophy was to find some way to refute Thomas Hobbes WITHOUT (here’s the tricky part) undermining Bacon. Because Bacon was light, hope, progress, the promise of a better future, and Hobbes was THE BEST wielder of Bacon’s techniques. So they couldn’t just DISMISS Hobbes without undermining Bacon; they had to find a way to take Hobbes on on his own terms and do Bacon better than Hobbes did. It took 20 years and John Locke to achieve that, but in the meantime Hobbes so terrified his peers that they literally rewrote the laws of England more than once to extend censorship enough to silence him.
Also the man Just. Wouldn’t. Die. They wanted him dead and gone so they could forget him and move on but he lived to be 91, a constant reminder of the intellectual terror whose shadow had loomed so long over all of Europe. To give a sample of a contemporary articulation of the fear and amazement Hobbes caused in his peers, here is a satirical broadside published to celebrate his death:
My favorite verse from it is:
“Leviathan the Great is dead! But see
The small Behemoths of his progeny
Survive to battle all divinity!”
So I chose Hobbes as an example because he’s really the first “backfire” of Bacon, the first unexpected, unintended consequence of the new method. Hobbes’s book didn’t cause any atrocities, didn’t result in wars or massacres, but it did spread terror through the entire intellectual world, and was the first sniff of the scarier places that thought would go once Bacon’s call to examine EVERYTHING genuinely did examine everything… even things people did NOT want anyone to doubt. So while Hobbes is wonderful, from the perspective of his contemporaries he was the first warning sign that progress cannot be controlled, and that, while it will change parts of society we think are bad, it will change the parts we value too.
Hope that helps clear it up? I’ll discuss Hobbes more in later works.
Is progress inevitable? Is it natural? Is it fragile? Is it possible? Is it a problematic concept in the first place? Many people are reexamining these kinds of questions as 2016 draws to a close, so I thought this would be a good moment to share the sort-of “zoomed out” discussions of the subject that historians like myself are always having.
There is a strange doubleness to experiencing an historic moment while being a historian one’s self. I feel the same shock, fear, overload, emotional exhaustion that so many are feeling, but at the same time another me is analyzing, dredging up historical examples, bigger crises, smaller crises, elections that set the fuse to powder-kegs, elections that changed nothing. I keep thinking about what it felt like during the Wars of the Roses, or the French Wars of Religion, during those little blips of peace, a decade long or so, that we, centuries later, call mere pauses, but which were long enough for a person to be born and grow to political maturity in seeming-peace, which only hindsight would label ‘dormant war.’ Eventually the last flare ended and the peace was real. But on the ground it must have felt exactly the same, the real peace and those blips. That’s why I don’t presume to predict — history is a lesson in complexity, not predictability — but what I do feel I’ve learned to understand, thanks to my studies, are the mechanisms of historical change, the how of history’s dynamism rather than the what next. So, in the middle of so many discussions of the causes of this year’s events (economics, backlash, media, the not-so-sleeping dragon bigotry), and of how to respond to them (petitions, debate, fundraising, art, despair), I hope people will find it useful to zoom out with me, to talk about the causes of historical events and change in general.
Two threads, which I will later bring together. Thread one: progress. Thread two: historical agency.
Part 1: The Question of Progress As Historians Ask It
“How do you discuss progress without getting snared in teleology?” a colleague asked during a teaching discussion. This is a historian’s succinct if somewhat technical way of asking a question which lies at the back of a lot of the questions people are wrestling with now. Progress — change for the better over historical time. The word has many uses (social progress, technological progress), but the reason it raises red flags for historians is the legacy of Whig history, a school of historical thought whose influence still percolates through many of our models of history. Wikipedia has an excellent opening definition of Whig history:
Whig history… presents the past as an inevitable progression towards ever greater liberty and enlightenment, culminating in modern forms of liberal democracy and constitutional monarchy. In general, Whig historians emphasize the rise of constitutional government, personal freedoms, and scientific progress. The term is often applied generally (and pejoratively) to histories that present the past as the inexorable march of progress towards enlightenment… Whig history has many similarities with the Marxist-Leninist theory of history, which presupposes that humanity is moving through historical stages to the classless, egalitarian society to which communism aspires… Whig history is a form of liberalism, putting its faith in the power of human reason to reshape society for the better, regardless of past history and tradition. It proposes the inevitable progress of mankind.
In other words, this approach presumes a teleology to history, that human societies have always been developing toward some pre-set end state: apple seeds into apple trees, humans into enlightened humans, human societies into liberal democratic paradises.
Some of the problems with this approach are transparent, others familiar to those of my readers who have been engaging with current discourse about the problems/failures/weaknesses of liberalism. But let me unpack some of the other problems, the ones historians in particular worry about.
Developed in the early 20th century, Whig history presents a particular set of values and political and social outcomes as the (A) inevitable and (B) superior end-points of all historical change — political and social outcomes that arise from the Western European tradition. The Eurocentric distortions this introduces are obvious, devaluing all other cultures. But even for a Europeanist like myself, who’s already studying Europe, this approach has a distorting effect by focusing our attention on historical moments or changes or people that were “right” or “correct,” that took a step “forward.” When one attempts to write a history using this kind of reasoning, the heroes of this process (the statesman who founded a more liberal-democratic-ish state, the scientist whose invention we still use today, the poet whose pamphlet forwards the cause) loom overlarge in history, receiving too much attention. On the one hand, yes, we need to understand those past figures who are keystones of our present — I teach Plato, and Descartes, and Machiavelli with good reason — but if we study only the keystones, and not the other less conspicuous bricks, we wind up with a very distorted idea of the whole edifice.
Whig history also makes it dangerously easy to stray into placing moral value on those things which advanced the teleologically predetermined future. Such things seem to be “correct,” thus “good,” thus “better,” while those elements which did not contribute to this teleological development were “dead ends” or “mistakes” or “wrong,” which quickly becomes “bad.” In such a history whole eras can be dismissed as unworthy of study for failing to forward progress (The Middle Ages did great stuff, guys!) while other eras can be disproportionately celebrated for advancing it (The Renaissance did a lot of dumb stuff too!). And, of course, whole regions can be dismissed for “failing” to progress (Africa, Asia), as can sub-regions (Poland, Spain).
To give an example within the realm of intellectual history, teleological intellectual histories very often create the false impression that the only figures involved in a period’s intellectual world were heroes and villains, i.e. thinkers we venerate today, or their nasty bad backwards-looking enemies. This makes it seem as if the time period in question was already just previewing the big debates we have today. Such histories don’t know what to do with thinkers whose ideas were orthogonal to such debates, and if one characterizes the Renaissance as “Faith!” vs. “Reason!” and Marsilio Ficino comes along and says “Let’s use Platonic Reason to heal the soul!” a Whig history doesn’t know what to do with that, and reads it as a “dead end” or “detour.” Only heroes or villains fit the narrative, so Ficino must either become one or the other, or be left out. Teleological intellectual histories also tend to give the false impression that the figures we think are important now were always considered important, and if you bring up the fact that Aristotle was hardly read at all in antiquity and only revived in the Middle Ages, or that the most widely owned author in the Enlightenment was the now-obscure fideist encyclopedist Pierre Bayle, the narrative has to scramble to adapt.
Teleological history is also prone to “presentism” (a bad thing, but a very useful term!). Presentism is when one’s reading of history is distorted by one’s modern perspective, often through projecting modern values onto past events, and especially past people. An essay about the Magna Carta which projects Enlightenment values onto its Medieval authors would be presentist. So are histories of the Renaissance which want to portray it as a battle between Reason and religion, or say that only Florence and/or Venice had the real Renaissance because they were republics, and only the democratic spirit of republics could foster fruitful, modern, forward-thinking people. Presentism is also rearing its head when, in the opening episodes of the new Medici: Masters of Florence TV series, Cosimo de Medici talks about bankers as the masterminds of society, and describes himself as a job-creator, a far cry from the conceptual space banking occupied in 1420. Presentism is sometimes conscious, but often unconscious, so mindful historians will pause whenever we see something that feels revolutionary, or progressive, or proto-modern, or too comfortable, to check for other readings, and make triple sure we have real evidence. Sometimes things in the past really were more modern than what surrounded them. I spent many dissertation years assembling vast grids of data which eventually painstakingly proved that Machiavelli’s interest in radical Epicurean materialism was exceptional for his day, and more similar to the interests of peers seventy years in his future than to those of his own generation — that Machiavelli was exceptional and forward-thinking may be the least surprising conclusion a Renaissance historian can come to, but we have to prove such things very, very meticulously, to avoid spawning yet another distorted biography which says that Galileo was fundamentally an oppressed Bill Nye. Hint: Galileo was not Bill Nye; he was Galileo.
These problems, in brief, are why discussions of progress, and of teleology, are red flags now for any historian.
Unfortunately, the bathwater here is very difficult to separate from an important baby. Teleological thinking distorts our understanding of the past, but the Whig approach was developed for a reason. (A) It is important to have ways to discuss historical change over time, to talk about the question of progress as a component of that change. (B) It is important to retain some way to compare societies, or at least to assess when people try to compare societies, so we can talk about how different institutions, laws, or social mores might be better or worse than others on various metrics, and how some historical changes might be positive or negative. While avoiding dangerous narratives of triumphant [insert Western phenomenon here] sweeping through and bringing light to a superstitious and backwards [era/people/place], we also want to be able to talk about things like the eradication of smallpox, and our efforts against malaria and HIV, which are undeniably interconnected steps in a process of change over time — a process which is difficult to call by any name but progress.
So how do historians discuss progress without getting snared in teleology?
And how do I, as a science fiction writer, as a science fiction reader, as someone who tears up every time NASA or ESA posts a new picture of our baby space probes preparing to take the next step in our journey to the stars, how do I discuss progress without getting snared in teleology?
I, at least, begin by being a historian, and talking about the history of progress itself.
Part 2: A Brief History of Progress
In the early seventeenth century, Francis Bacon invented progress.
Let me unpack that.
Ideas of social change over time had existed in European thought since antiquity. Early Greek sources talk about a Golden Age of peaceful, pastoral abundance, followed by a Silver Age, when jewels and luxuries made life more opulent but also more complicated. There followed a Bronze Age, when weapons and guards appeared, and also the hierarchy of haves and have-nots, and finally an Iron Age of blood and war and Troy. Some ancients added more detail to this narrative, notably Lucretius in his Epicurean epic On the Nature of Things. In his version the transition from simple, rural living to luxury-hungry urbanized hierarchy was explicitly developmental, caused, not by divine planning or celestial influences, but by human invention: as people invented more luxuries they then needed more equipment — technological and social — to produce, defend, control, and war over said luxuries, and so, step-by-step, tranquil simplicity degenerated into sophistication and its discontents.
Lucretius’s developmental model of society has several important components of the concept of progress, but not all of them. It has the state of things vary over the course of human history. It also has humanity as the agent of that change, primarily through technological innovation and social changes which arise in reaction to said innovation. It does not have (A) intentionality behind this change, (B) a positive arc to this change, (C) an infinite or unlimited arc to this change, or–perhaps most critically–(D) the expectation that any more change will occur in the future. Lucretius accounts for how society reached its present, and the mythological eras of Gold, Silver, Bronze and Iron do the same. None of these ancient thinkers speculate — as we do every day — about how the experiences of future generations might continue to change and be fundamentally different from their own. Quantitatively things might be different — Rome’s empire might grow or shrink, or fall entirely to be replaced by another — but fundamentally cities will be cities, plows will be plows, empires will be empires, and in a thousand years bread will still be bread. Even if Lucan or Lucretius speculate, they do not live in our world where bread is already poptarts, and will be something even more outlandish in the next generation.
Medieval Europe came to the realization — and if you grant their starting premises they’re absolutely right — that if the entire world is a temporary construct designed by an omnipotent, omniscient Creator God for the purpose of leading humans through their many trials toward eternal salvation or damnation, then it’s madness to look to Earth history for any cause-to-effect chains, since there is one Cause of all effects. Medieval thought is no more monolithic than modern, but many excellent examples discuss the material world as a sort of pageant play being performed for us by God to communicate his moral lessons, and if one stage of history flows into another — an empire rises, prospers, falls — that is because God had a moral message to relate through its progression. Take Dante’s obsession with the Emperor Tiberius, for example. According to Dante, God planned the Crucifixion and wanted His Son to be lawfully executed by all humanity, so the sin and guilt and salvation would be universal, so He created the Roman Empire so that there would be one government large enough to rule and represent the whole world (remember Dante’s maps have nothing south of Egypt except the Mountain of Purgatory). The empire didn’t develop, it was crafted for God’s purposes: Act II scene iii the Roman Empire Rises, scene v it fulfills its purpose, scene vi it falls. Applause.
Did the Renaissance have progress? No. Not conceptually, though constant change was happening, as it does in all eras of history. But the Renaissance did suddenly get closer to the concept. The Renaissance invented the Dark Ages. Specifically the Florentine Leonardo Bruni invented the Dark Ages in the 1420s-1430s. Following on Petrarch’s idea that Italy was in a dark and fallen age and could rise from it again by reviving the lost arts that had made Rome glorious, Bruni divided history into three sections: good Antiquity, bad Dark Ages, and good Renaissance, when the good things lost in antiquity returned. Humans and God were both agents in this, God who planned it and humans who actually translated the Greek, and measured the aqueducts, and memorized the speeches, and built the new golden age. Renaissance thinkers, fusing ideas from Greece and Rome with those of the Middle Ages, added to old ideas of development the first suggestion of a positive trajectory, but not an infinite one, and not a fundamental one. The change the Renaissance believed in lay in reacquiring excellent things the past had already had and lost, climbing out of a pit back to ground level. That change would be great, but finite, and when Renaissance people talk about “surpassing the ancients” (which they do), they talk about painting more realistic paintings, sculpting more elaborate sculptures, perhaps building more stunning temples/cathedrals, or inventing new clever devices like Leonardo’s heated underground pipes to let you keep your potted lemon tree roots warm in winter (just like ancient Roman underfloor heating!). But cities would be cities, plows would be maybe slightly better plows, and empires would be empires. Surpassing the ancients lay in skill, art, artistry, not fundamentals.
Then in the early seventeenth century, Francis Bacon invented progress.
If we work together — said he — if we observe the world around us, study, share our findings, collaborate, uncover as a human team the secret causes of things hidden in nature, we can base new inventions on our new knowledge which will, in small ways, little by little, make human life just a little easier, just a little better, warm us in winter, shield us in storm, make our crops fail a little less, give us some way to heal the child on his bed. We can make every generation’s experience on this Earth a little better than our own. There are — he said — three kinds of scholar. There is the ant, who ranges the Earth and gathers crumbs of knowledge and piles them, raising his ant-mound higher and higher, competing to have the greatest pile to sit and gloat upon — he is the encyclopedist, who gathers but adds nothing. There is the spider, who spins elaborate webs of theory from the stuff of his own mind, spinning beautiful, intricate patterns in which it is so easy to become entwined — he is the theorist, the system-weaver. And then there is the honeybee, who gathers from the fruits of nature and, processing them through the organ of his own being, produces something good and useful for the world. Let us be honeybees, giving to the world learning and learning’s fruits. Let us found a new method — the Scientific Method — and with it dedicate ourselves to the advancement of knowledge of the secret causes of things, and the expansion of the bounds of human empire to the achievement of all things possible.
Bacon is a gifted wordsmith, and he knows how to make you ache to be the noble thing he paints you as.
“How, Chancellor Bacon, do we know that we can change the world with this new scientific method thing, since no one has ever tried it before so you have no evidence that knowledge will yield anything good and useful, or that each generation’s experience might be better than the previous?”
It is not an easy thing to prove science works when you have no examples of science working yet.
Bacon’s answer — the answer which made kingdom and crown stream passionate support and birthed the Academy of Sciences–may surprise the 21st-century reader, accustomed as we are to hearing science and religion framed as enemies. We know science will work–Bacon replied–because of God. There are a hundred thousand things in this world which cause us pain and suffering, but God is Good. He gave the cheetah speed, the lion claws. He would not have sent humanity out into this wilderness without some way to meet our needs. He would not have given us the desire for a better world without the means to make it so. He gave us Reason. So, from His Goodness, we know that Reason must be able to achieve all He has us desire. God gave us science, and it is an act of Christian charity, an infinite charity toward all posterity, to use it.
They believed him.
And that is the first thing which, in my view, fits every modern definition of progress. Francis Bacon died from pneumonia contracted while experimenting with using snow to preserve chickens, attempting to give us refrigeration, by which food could be stored and spread across a hungry world. Bacon envisioned technological progress, medical progress, but also the small social progresses those would create, not just Renaissance glories for the prince and the cathedral, but food for the shepherd, rest for the farmer, little by little, progress. As Bacon’s followers reexamined medicine from the ground up, throwing out old theories and developing…
I’m going to tangent for a moment. It really took two hundred years for Bacon’s academy to develop anything useful. There was a lot of dissecting animals, and exploding metal spheres, and refracting light, and describing gravity, and it was very, very exciting, and a lot of it was correct, but–as the eloquent James Hankins put it–it was actually the nineteenth century that finally paid Francis Bacon’s I.O.U., his promise that, if you channel an unfathomable research budget, and feed the smartest youths of your society into science, someday we’ll be able to do things we can’t do now, like refrigerate chickens, or cure rabies, or anesthetize. There were a few useful advances (better navigational instruments, Franklin’s lightning rod) but for two hundred years most of science’s fruits were devices with no function beyond demonstrating scientific principles. Two hundred years is a long time for a vastly-complex society-wide project to keep getting support and enthusiasm, fed by nothing but pure confidence that these discoveries streaming out of the Royal Society papers will eventually someday actually do something. I just think… I just think that keeping it up for two hundred years before it paid off, that’s… that’s really cool.
…okay, I was in the middle of a sentence: As Bacon’s followers reexamined science from the ground up, throwing out old theories and developing new correct ones which would eventually enable effective advances, it didn’t take long for his followers to apply his principle (that we should attack everything with Reason’s razor and keep only what stands) to social questions: legal systems, laws, judicial practices, customs, social mores, social classes, religion, government… treason, heresy… hello, Thomas Hobbes. In fact the scientific method that Bacon pitched, the idea of progress, proved effective in causing social change a lot faster than genuinely useful technology. Effectively the call was: “Hey, science will improve our technology! It’s… it’s not doing anything yet, so… let’s try it out on society? Yeah, that’s doing… something… and — Oh! — now the technology’s doing stuff too!” Except that sentence took three hundred years.
We know now, as Bacon’s successors learned, with harsher and harsher vividness in successive generations, that attempts at progress can also cause negative effects, atrocious ones. Like Thomas Hobbes. And the Terror phase of the French Revolution. And the life-expectancy in cities plummeting as industrialization spread soot, and pollutants, and cholera, and mercury-impregnated wallpaper, and lead-whitened bread, Mmmmm lead-whitened bread… And just as technological discoveries had their monstrous offspring, like lead-whitened bread, the horrors of colonization were some of the monstrous offspring of the social applications of Reason. Monstrous offspring we are still wrestling with today.
Part 3: Progresses
We now use the word “progress” in many senses, many more than Bacon and his peers did. There is “technological progress.” There is “social progress.” There is “economic progress.” We sometimes lump these together, and sometimes separate them.
Thus the general question “Has progress failed?” can mean several things. It can mean, “Have our collective efforts toward the improvement of the human condition failed to achieve their desired results?” This is being asked frequently these days in the context of social progress, as efforts toward equality and tolerance are facing backlash.
But “Has progress failed?” can also mean “Has the development of science and technology, our application of Reason to things, failed to make the lived experience of people better/happier/less painful? Have the changes been bad or neutral instead of good?” In other words, was Bacon right that humans using Reason and science can change our world, but wrong that we can make it better?
I want to stress that it is no small intellectual transformation that “progress” can now be used in a negative sense as well as a positive one. The concept as Bacon crystallized it, and as the Enlightenment spread it, was inherently positive, and to use it in a negative sense would be nonsensical, like using “healing” in a negative sense. But look at how we actually use “progress” in speech today. Sometimes it is positive (“Great progress this year!”) and sometimes negative (“Swallowed up by progress…”). This is a revolutionary change from Bacon’s day, enabled by two differences between ourselves and Bacon.
First we have watched the last several centuries. For us, progress is sometimes the first heart transplant and the footprints on the Moon, and sometimes it’s the Belgian Congo with its Heart of Darkness. Sometimes it’s the annihilation of smallpox and sometimes it’s polio becoming worse as a result of sanitation instead of better. Sometimes it’s Geraldine Roman, the Philippines’ first transgender congresswoman, and sometimes it’s Cristina Calderón, the last living speaker of the Yaghan language. Progress has yielded fruits much more complex than honey, which makes sentences like “The prison of progress” sensical to us.
We have also broadened progress. For Bacon, progress was the honey and the honeybees, hard, systematic, intentional human action creating something sweet and useful for mankind. It was good. It was new. And it was intentional. In its nascent form, Bacon’s progress did not differentiate between progress the phenomenon and progress the concept. If you asked Bacon “Was there progress in the Middle Ages?” he would have answered, “No. We’re starting to have progress right now.” And he’s correct about the concept being new, about intentional or self-aware progress, progress as a conscious effort, being new. But if we turn to Wikipedia, it defines “Progress (historical)” as “the idea that the world can become increasingly better in terms of science, technology, modernization, liberty, democracy, quality of life, etc.” Notice how agency and intentionality are absent from this, because there was technological and social change before 1600; there were even technological and social changes that undeniably made things better, even if they came less frequently than they do in the modern world. So the phenomenon is something we can study through the whole of history, long before the maturation of the concept.
As “progress” broadened to include unsystematic progress as well as the modern project of progress, that was the moment we acquired the questions “Is progress natural?” and “Is progress inevitable?” Because those questions require progress to be something that happens whether people intend it or not. In a sense, Bacon’s notion of progress wasn’t as teleological as Whig history. Bacon believed that human action could begin the process of progress, and that God gave Reason to humanity with this end in mind, but Bacon thought humans had to use a system, act intentionally, gather the pollen to make the honey; he didn’t think the honey just flowed. Not until progress is broadened to include pre-modern progress, and non-systematic, non-intentional modern progress, can the fully teleological idea of an inescapable momentum, an inevitability, join the manifold implications of the word “progress.”
Now I’m going to show you two maps.
This is a map of global population, rendered to look like a terrain. It shows the jagged mountain ranges of south and east Asia, the vast, sweeping valleys of forest and wilderness. The most jagged spikes may be a little jarring, the intensity of India and China, but even those are rich brown mountains, while the whole thing has the mood of a semi-untouched world, much more pastoral wilderness than city, and almost everywhere a healthy green. This makes progress, or at least the spread of population, feel like a natural phenomenon, a neutral phenomenon.
This is the Human Ooze Map. This map shows exactly the same data, reoriented to drip down instead of spiking up, and to be a pus-like yellow against an ominous black background. Instantly the human metropolises are not natural spikes within a healthy terrain, but an infection clinging to every oozing coastline, with the densest mega-cities seeming to bleed out amidst the goop, like open pustules.
Both these maps show one aspect of ‘progress’. Whether the teeming cities of our modern day are an apocalyptic infection, or a force as natural as the meandering of shores and tree-lines, depends on how we present the narrative, and the moral assumptions that underlie that presentation. Presentism and the progress narrative in general have very similar distorting effects. When we examine past phenomena, institutions, events, people, ideas, some feel viscerally good or viscerally bad, right or wrong, forward-moving or backward-moving, values they acquire from narratives which we ourselves have created, and which orient how we analyze history, just as these mapmakers have oriented population up, or down, resulting in radically different feelings. Jean-Jacques Rousseau’s model of the Noble Savage, happier in the rural simplicity of Lucretius’s Golden Age than in the stressful ever-changing urban world of progress, is itself an image of progress presented like the Human Ooze Map, reversing the moral presentation of the same facts.
Realizing that the ways we present data about progress are themselves morally charged can help us clarify questions that are being asked right now about liberalism, and nationalism, and social change, and opposition to social change. Because when we ask whether the world is experiencing a “failure” or a “revolution” or a “regression” or a “backlash” or a “last gasp” or a “pendulum swing” or a “prelude to triumph” etc., all these characterizations reorient data around different facets of the concept of progress, positive or negative, natural or intentional, just as these two maps reorient population around different morally-charged visualizations.
In sum: post colonialism, post industrialization, post Hobbes, we can no longer talk about progress as a unilateral, uncomplicated good, not without distorting history and ignoring the terrible facets of the last several centuries. Bacon thought there would be only honey; he was wrong. But we can’t not discuss progress because, during these same centuries, each generation’s experience has been increasingly different from the last generation’s, and science and human action are propelling this change. And there has been some honey. We need ways to talk about that.
But not without bearing in mind how we invest progress with different kinds of moral weight (the terrain or the ooze…)
And not without a question Bacon never thought to ask, because he did not realize (as we do) that technological and social change had been going on for many centuries before he made the action conscious. So Bacon never thought to ask: Do we have any power over progress?
Part 4: Do Individuals Have the Power to Change History?
Feelings of helplessness and despair have also been big parts of the shock of 2016. Helplessness and despair are questions, as well as feelings. They ask: Am I powerless? Can I personally do anything to change this? Do individuals have any power to shape history? Are we just swept along by the vast tides of social forces? Are we just cogs in the machine? What changes history?
Within a history department this divide often manifests methodologically.
Economic historians, and social historians, write masterful examinations of how vast social and economic forces, and their changes, whether incremental or rapid, have shaped history. Let’s call that Great Forces history. Whenever you hear people comparing our current wealth gap to the eve of the French Revolution, that is Great Forces history. When a Marxist talks about the inevitable interactions of proletariat and bourgeoisie, or when a Whig historian talks about the inevitable march of progress, those are also kinds of Great Forces history.
Great Forces history is wonderful, invaluable. It lets us draw illuminating comparisons, and helps us predict, not what will happen but what could happen, by looking at what has happened in similar circumstances. I mentioned earlier the French Wars of Religion, with their intermittent blips of peace. My excellent colleague Brian Sandberg of NIU (a brilliant historian of violence) recently pointed out to me that France during the Catholic-Protestant religious wars was about 10% Protestant, somewhat comparable to the African American population of the USA today which is around 13%. A striking comparison, though with stark differences. In particular, France’s Protestant/Calvinist population fell disproportionately in the wealthy, politically-empowered aristocratic class (comprising 30% of the ruling class), in contrast with African Americans today who fall disproportionately in the poorer, politically-disempowered classes. These similarities and differences make it very fruitful to look at the mechanisms of civil violence in 16th and 17th century France (how outbreaks of violence started, how they ended, who against whom) to help us understand the similar-yet-different ways civil violence might operate around us now. That kind of comparison is, in my view, Great Forces history at its most fruitful. (You can read more by Brian Sandberg on this issue in his book, on his blog, and on the Center for the Study of Religious Violence blog; more citations at the end of this article.)
But are we all, then, helpless water droplets, with no power beyond our infinitesimal contribution to the tidal forces of our history? Is there room for human agency?
History departments also have biographers, and intellectual historians, and micro-historians, who churn out brilliant histories of how one town, one woman, one invention, one idea reshaped our world. Readers have seen me do this here on Ex Urbe, describing how Beccaria persuaded Europe to discontinue torture, how Petrarch sparked the Renaissance, how Machiavelli gave us so much. Histories of agents, of people who changed the world. Such histories are absolutely true — just as the Great Forces histories are — but if Great Forces histories tell us we are helpless droplets in a great wave, these histories give us hope that human agency, our power to act meaningfully upon our world, is real. I am quite certain that one of the causes of the explosive response to the Hamilton musical right now is its firm, optimistic message that, yes, individuals can, and in fact did, reshape this world — and so can we.
This kind of history, inspiring as it is, is also dangerous. The antiquated/old-fashioned/bad version of this kind of history is Great Man history, the model epitomized by Thomas Carlyle’s Heroes, Hero-Worship and the Heroic in History (a gorgeous read) which presents humanity as a kind of inert but rich medium, like agar ready for a bacterial culture. Onto this great and ready stage, Nature (or God or Providence) periodically sends a Great Man, a leader, inventor, revolutionary, firebrand, who makes empires rise, or fall, or leads us out of the black of ignorance. Great Man history is very prone to erasing everyone outside a narrow elite, erasing women, erasing the negative consequences of the actions of Great Men, justifying atrocities as the collateral damage of greatness, and other problems which I hope are familiar to my readers.
But when done well, histories of human agency are valuable. Are true. Are hope.
So if Great Forces history is correct, and useful, and Human Agency history is also correct, and useful… how do we balance that? They are, after all, contradictory.
Part 5: The Papal Election of 2016
Every year in my Italian Renaissance class, here at the University of Chicago, I run a simulation of a Renaissance papal election, circa 1490-1500. Each student is a different participant in the process, and they negotiate, form coalitions, and, eventually, elect a pope. And then they have a war, and destroy some chunk of Europe. Each student receives a packet describing that student’s character’s goals, background, personality, allies and enemies, and a packet of resources, cards representing money, titles, treasures, armies, nieces and nephews one can marry off, contracts one can sign, artists or scholars one can use to boost one’s influence, or trade to others as commodities: “I’ll give you Leonardo if you send three armies to guard my city from the French.”
Some students in the simulation play powerful Cardinals wielding vast economic resources and power networks, with clients and subordinates, complicated political agendas, and a strong shot at the papacy. Others are minor Cardinals, with debts, vulnerabilities, short-term needs tied to some personal crisis in their home cities, or long-term hopes of rising on the coattails of others and perhaps being elected three or four popes from now. Others, locked in a secret chamber in the basement, are the Crowned Heads of Europe — the King of France, the Queen of Castile, the Holy Roman Emperor — who smuggle secret orders (text messages) to their agents in the conclave, attempting to forge alliances with Italian powers, and gain influence over the papacy so they can use Church power to strengthen their plans to launch invasions or lay claim to distant thrones. And others are not Cardinals at all but functionaries who count the votes, distribute the food, the guard who keeps watch, the choir director who entertains the churchmen locked in the Sistine, who have no votes but can hear, and watch, and whisper.
There are many aspects to this simulation, which I may someday discuss here at greater length (for now you can read a bit about it on our History Department blog), but for the moment I just want to talk about the outcomes, and what structures the outcomes. I designed this simulation not to have any pre-set outcome. I looked into the period as best I could, and gave each historical figure the resources and goals that I felt accurately reflected that person’s real historical resources and actions. I also intentionally moved some characters in time, including some Cardinals and political issues which do not quite overlap with each other, in order to make this an alternate history, not a mechanical reconstruction, so that students who already knew what happened to Italy in this period would know they couldn’t have the “correct” outcome even if they tried, which frees everyone to pursue goals, not “correct” choices, and to genuinely explore the range of what could happen without being too locked in to what did. I set up the tensions and the actors to simulate what I felt the situation was when the election began, then left it free to flow.
I have now run the simulation four times. Each time some outcomes are similar, similar enough that they are clearly locked in by the greater political webs and economic forces. The same few powerful Cardinals are always leading candidates for the throne. There is usually also a wildcard candidate, someone who has never before been one of the top contenders, but around whom circumstances bring a coalition together. And, usually, perhaps inevitably, a juggernaut wins, one of the Cardinals who began with a strong power-base, but it’s usually very, very close. And the efforts of the wildcard candidate, and the coalition that formed around that wildcard, always have a powerful effect on the new pope’s policies and first actions, who’s in the inner circle and who’s out, what opposition parties form, and that determines which city-states rise and which city-states burn as Italy erupts in war.
And the war is Always. Totally. Different.
Because as the monarchies race to make alliances and team up against their enemies, they get pulled back-and-forth by the ricocheting consequences of small actions: a marriage, an insult, a bribe traded for a whisper, someone paying off someone else’s debts, someone taking a shine to a bright young thing. Sometimes France invades Spain. Sometimes France and Spain unite to invade the Holy Roman Empire. Sometimes England and Spain unite to keep the French out of Italy. Sometimes France and the Empire unite to keep Spain out of Italy. Once they made a giant pan-European peace treaty, with a set of marriage alliances which looked likely to permanently unify all four great Crowns, but it was shattered by the sudden assassination of a crown prince.
So when I tell people about this election, and they ask me “Does it always have the same outcome?” the answer is yes and no. Because the Great Forces always push the same way. The strong factions are strong. Money is power. Blood is thicker than promises. Virtue is manipulable. In the end, a bad man will be pope. And he will do bad things. The war is coming, and the land — some land somewhere — will burn. But the details are always different. A Cardinal needs to gather fourteen votes to get the throne, but it’s never the same fourteen votes, so it’s never the same fourteen people who get papal favor, whose agendas are strengthened, whose homelands prosper while their enemies fall. And I have never once seen a pope elected in this simulation who did not owe his victory, not only to those who voted, but to one or more of the humble functionaries, who repeated just the right whisper at just the right moment, and genuinely handed the throne to Monster A instead of Monster B. And from that functionary flow the consequences. There are always several kingmakers in the election, who often do more than the candidate himself to get him on the throne, but what they do, who they help, and which kingmaker ends up most favored, most influential, can change a small war in Genoa into a huge war in Burgundy, a union of thrones between France and England into another century of guns and steel, or determine which decrees the new pope signs. That sometimes matters more than whether war is in Burgundy or Genoa, since papal signatures resolve questions such as: Who gets the New World? Will there be another crusade? Will the Inquisition grow more tolerant or less toward new philosophies? Who gets to be King of Naples? These things are different every time, though shaped by the same forces.
Frequently the most explosive action is right after the pope is elected, after the Great Forces have thrust a bad man onto Saint Peter’s throne, and set the great and somber stage for war, often that’s the moment that I see human action do most. That’s when I get the after-midnight message on the day before the war begins: “Secret meeting. 9AM. Economics cafe. Make sure no one sees you. Sforza, Medici, D’Este, Dominicans. Borgia has the throne but he will not be master of Italy.” And together, these brave and haste-born allies, they… faicceed? Fail and succeed? They give it all they have: diplomacy, force, wealth, guile, all woven together. They strike. The bad pope rages, sends forces out to smite these enemies. The kings and great thrones take advantage, launch invasions. The armies clash. One of the rebel cities burns, but the other five survive, and Borgia (that year at least) is not Master of Italy.
We feel it, the students and I alike, coming out of the simulation. The Great Forces were real, and were unstoppable. The dam was about to break. No one could stop it. But the human agents — even the tiniest junior clerk who does the paperwork — the human agents shaped what happened, and every action had its consequences, imperfect, entwined, but real. The dam was about to break, but every person there got to dig a channel to try to direct the waters once they flowed, and that is what determined the real shape of the flood, its path, its damage. No one controlled what happened, and no one could predict what happened, but those who worked hard and dug their channels, most of them succeeded in diverting most of the damage, achieving many of their goals, preventing the worst. Not all, but most.
And what I see in the simulation I also see over and over in real historical sources.
This is how both kinds of history are true. There are Great Forces. Economics, class, wealth gaps, prosperity, stagnation, these Great Forces make particular historical moments ripe for change, ripe for war, ripe for wealth, ripe for crisis, ripe for healing, ripe for peace. But individuals also have real agency, and our actions determine the actual consequences of these Great Forces as they reshape our world. We have to understand both, and study both, and act on the world now remembering that both are real.
So, can human beings control progress? Yes and no.
Part 6: Ways to Talk About Progress in the 21st Century
Few things have taught me more about the world than keeping a fish tank.
You get some new fish, put them in your fish tank, everything’s fine. You get some more new fish, the next morning one of them has killed almost all the others. Another time you get a new fish and it’s all gaspy and pumping its gills desperately, because it’s from alkaline waters and your tank is too acidic for it. So you put in a little pH adjusting powder and… all the other fish get sick from the ammonia that releases and die. Another time you get a new fish and it’s sick! So you put fish antibiotics in the water, aaaand… they kill all the symbiotic bacteria in your filter system and the water gets filled with rotting bacteria, and the fish die. Another time you do absolutely nothing, and the fish die.
What’s happening? The same thing that happened in the first two centuries after Francis Bacon, when science was learning tons, but achieving little that actually improved daily life. The system is more complex than it seems. A change which achieves its intended purpose also throws out-of-whack vital forces you did not realize were connected to it. The acidity buffer in the fish tank increases the nutrients in the water, which causes an algae bloom, which uses up the oxygen and suffocates the catfish. The marriage alliance between Milan and Ferrara makes Venice friends with Milan, which makes Venice’s rival Genoa side with Spain, which makes Spain reluctant to anger Portugal, which makes them agree to a marriage alliance, and then Spain is out of princesses and can’t marry the Prince of Wales, and the next thing you know there are soldiers from Scotland attacking Bologna. A seventeenth-century surgeon realizes that cataracts are caused by something white and opaque appearing at the front of the eye, and so removes it, not yet understanding that it’s the lens and you really need it.
So when I hear people ask “Has social progress failed?” or “Has liberalism failed?” or “Has the Civil Rights Movement failed?” my zoomed-in self, my scared self, the self living in this crisis feels afraid and uncertain, but my zoomed-out self, my historian self answers very easily. No. These movements have done wonders, achieved tons! But they have also done what all movements do in a dynamic historical system: they have had large, complicated consequences. They have added something to the fish tank. Because the same Enlightenment impulse to make a better, more rational world, where everyone would have education and equal political empowerment BOTH caused the brutalities of the Belgian Congo AND gave me the vote. And that’s the sort of thing historians look at, all day.
But if the consequences of our actions are completely unpredictable, would it be better to say that change is real but progress controlled by humans is just an idea which turned out to be wrong? No. I say no. Because I gradually got better at understanding the fish tank. Because the doctors gradually figured out how the eye really does function. Because some of our civil rights have come by blood and war, and others have come through negotiation and agreement. Because we as humans are gradually learning more about how our world is interconnected, and how we can take action within that interconnected system. And by doing so we really have achieved some of what Francis Bacon and his followers waited for through those long centuries: we have made the next generation’s experience on this Earth a little better than our own. Not smoothly, and not quickly, but actually. Because, in my mock papal election, the dam did break, but those students who worked hard to dig their channels did direct the flood, and most of them managed to achieve some of what they aimed at, though they always caused some other effects too.
Is it still blowing up in our faces?
Is it going to keep blowing up in our faces, over and over?
Is it going to blow up so much, sometimes, that it doesn’t seem like it’s actually any better?
Is that still progress?
Because there was a baby in the bathwater of Whig history. If we work hard at it, we can find metrics for comparing times and places which don’t privilege particular ideologies. Metrics like infant mortality. Metrics like malnutrition. Metrics like the frequency of massacres. We can even find metrics for social progress which don’t irrevocably privilege a particular Western value system. One of my favorite social progress metrics is: “What portion of the population of this society can be murdered by a different portion of the population and have the murderer suffer no meaningful consequences?” The answer, for America in 2017, is not 0%. But it’s also not 90%. That number has gone down, and is now far below the geohistorical norm. That is progress. That, and infant mortality, and the conquest of smallpox. These are genuine improvements to the human condition, of the sort that Bacon and his followers believed would come if they kept working to learn the causes and secret motions of things. And they were right. While Whig history privileges a very narrow set of values, metrics which track things like infant mortality, or murder with impunity, still privilege particular values — life, justice, equality — but aim to be compatible with as many different cultures, and even time periods, as possible. They are metrics which stranded time travelers would find it fairly easy to explain, no matter where they were dumped in Earth’s broad timeline. At least that’s our aim. And such metrics are the best tool we have at present to make the comparisons, and have the discussions about progress, that we need to have to grapple with our changing world.
Because progress is both a concept and a phenomenon.
The concept is the hope that collective human effort can make every generation’s experience on this Earth a little better than the previous generation’s. That concept has itself become a mighty force shaping the human experience, like communism, iron, or the wheel. It is a valuable thing to look at the effects that concept has had, to talk about how some have been destructive and others constructive, and to study, from a zoomed-out perspective, the consequences, successes, and failures of different movements or individuals who have acted in the name of progress.
The phenomenon is also real. My own personal assessment of it is just that, a personal assessment, with no authority beyond some years spent studying history. I hope to keep reexamining and improving this assessment all the days of my life. But here at the beginning of 2017 I would say this:
Progress is not inevitable, but it is happening.
It is not transparent, but it is visible.
It is not safe, but it is beneficial.
It is not linear, but it is directional.
It is not controllable, but it is us. In fact, it is nothing but us.
Progress is also natural, in my view, not in the sense that it will inevitably triumph over its doomed opposition, but in the sense that the human animal is part of nature, so the Declaration of the Rights of Man is as natural as a bird’s nest or a beaver dam. There is no teleology, no inevitable correct ending locked in from time immemorial. But I personally think there is a certain outcome to progress, gradual but certain: the decrease of pain in the human condition over time. Because there is so much desire in this world to make a better one. Bacon was right that we ache for it. And the real measurable changes we have made show that he was also right that we can use Reason and collective effort to meet our desires, even if the process is agonizingly slow, imperfect, and dangerous. But we know now how to go about learning the causes and secret motions of things. And how to use that knowledge.
We are also learning to understand the accidental negative consequences of progress, looking out for them, mitigating them, preventing them, creating safety nets. We’re getting better at it. Slowly, but we are.
Zooming back in hurts. It’s easy to say “the French Wars of Religion” and erase the little blips of peace, but it’s hard to feel fear and pain, or watch a friend feel fear and pain. Sometimes I hear people say they think that things today are worse than they’ve ever been, especially the hate, or the race relations in the USA, that they’re worse now than ever. That we’ve made no progress, quite the opposite. Similarly, I think a person who grew up during one of the peaceful pauses in the French Wars of Religion might say, when the violence restarted, that the wars were worse now than they had ever been, and farther than ever from real peace. They aren’t actually worse now. They genuinely were worse before. But they are really, really bad right now, and it does really, really hurt.
The slowness of social progress is painful, I think especially because it’s the aspect of progress that seemed it would come fastest. During that first century, when Bacon’s followers were waiting in maddening impatience for their better medical knowledge to result in any actual increase in their ability to save lives, social progress was already working wonders. The Enlightenment extended the franchise, ended torture on an entire continent, achieved much, and had this great, heady, explosive feeling of victory and momentum. It seemed like social progress was already half-way-done before tech even got started. But Charles Babbage kicked off programmable computing in 1833 and now my pocket contains 100x the computing power needed to get Apollo XI to the Moon, so why, if Olympe de Gouges wrote the Declaration of the Rights of Woman and the Citizen in 1791, do we still not have equal pay?
Because society is a very complicated fish tank. Because we still have a lot to learn about the causes and secret motions of society.
But if there is a dam right now, ready to break and usher in a change, Great Forces are still shaped by human action. Our action.
Studying history has proved to me, over and over, that things used to be worse. That they are better now. Progress is real. That’s a consolation, but a hollow one while we’re still here facing the pain. What fills its hollowness, for me at least, is remembering that secret meeting in the Economics cafe, that hasty plan, diplomacy, quick action — not a second chance after the disaster, but a next chance. And a next. And a next, to take actions that really did achieve things, even if not everything. Human action combining with the flood is not powerlessness. And that’s how I think progress really works.
And as promised, more citations on the demographics of religious violence in France, with thanks to Brian Sandberg:
Brian Sandberg, Warrior Pursuits: Noble Culture and Civil Conflict in Early Modern France (Baltimore, MD: Johns Hopkins University Press, 2010).
Philip Benedict, “The Huguenot Population of France, 1600-85,” in The Faith and Fortunes of France’s Huguenots, 1600-85 (Aldershot: Ashgate, 2001), 39-42, 92-95.
Arlette Jouanna, La France du XVIe siècle, 1483-1598 (Paris: Presses Universitaires de France, 1996), 325-340.
Jacques Dupâquier, ed., De la Renaissance à 1789, vol. 2 of Histoire de la population française (Paris: Presses Universitaires de France, 1988), 81-94.
Off to Italy again. This seems like a good time to share a link to a video of an illustrated talk Ada gave at the Lumen Christi Institute in Chicago in February. It’s a fascinating overview of the place of San Marco in Florence, with lots of excellent pictures. It’s like an audio version of an Ex Urbe post, with Fra Angelico, the meaning of blue, the Magi, the Medici, Savonarola, confraternities, and the complexities of Renaissance religious and artistic patronage.
And here’s one of the pictures mentioned but not shown in the presentation, a nine panel illustration by Filippo Dolcaiati “The History of Antonio Rinaldeschi.” It depicts the real historical fate of Rinaldeschi, who became drunk while gambling and threw manure at an icon of the Virgin Mary. A fascinating incident for demonstrating the functions of confraternities, and for demonstrating how seriously the people of Florence took the protection offered by saints and icons.
Second, due to a recent policy change in Italy’s national museums I was able to finally take literally thousands of photos of artifacts and spaces in museums that have been forbidden to cameras for years. I’ve started sharing the photos on Twitter (#historypix) so follow me on Twitter if you would enjoy random photos of cool historical artifacts twice a day.
Meanwhile I don’t yet have another full essay ready to post here, but I’m happy to say the reason is that I’m working away on the page proofs of Too Like the Lightning, the final editing step before the books go to press. I’ve even received a photo from my editor of the Advanced Release Copies for book reviewers sitting in a delicious little pile! It’s fun seeing how many different baby steps the book is taking on its long path to becoming real: cover art, page count, typography, physicality in many stages, first the pre-copy-edit Advanced Bound Manuscripts, then the post-copy-edit but pre-page-proof Advanced Release Copies, evolving toward the final hardcover transformation by transformation. My biggest point of suspense at this point is wondering how fat it will be, how heavy in the hand…
And now, a quick piece of history fun:
There is a dimly-lit hallway half way through the Vatican museum (after you’ve looked at 2,000 Roman marbles, 1,000 Etruscan vases and enough overwhelming architecture to make you start feeling slightly punchy) hung on the left-hand side with stunning tapestries of scenes from the life of Christ based on cartoons by Raphael. But on the right-hand side in the same hallway, largely ignored by the thousands of visitors who stumble through, is my favorite Renaissance tapestry cycle, a sequence of images of The Excessively Exciting Life of Pope Urban VIII. My best summary of these images is that, when I showed them to my excellent friend Jonathan (author of our What Color is Pluto? guest post) he scratched his chin and said, “I think the patronage system may have introduced some bias.” And it’s very true, these are an amazing example of Renaissance art whose sole purpose is excessive flattery of the patron, a genre common in all media: histories, biographies, dedications, sculptures, paintings, verses, and, in this case, thread.
These tapestries are fragile and quite faded, and the narrow hallway thronging with Raphael-admirers makes it awkward to get a good angle, but with much effort I think these capture the over-the-top absurdity which makes these tapestries such a delight. Urban VIII now is best known for engaging in unusually complicated military and political maneuvering, expanding and fortifying the papal territories, pushing fiercely against Hapsburg expansion into Italy, finishing the canonization of St. Ignatius of Loyola, persecuting Galileo, commissioning a lot of Bernini sculptures, and spending so much on military and artistic expenses that he got the papacy so head over heels in debt that the Roman people hated him, the Cardinals conspired to depose him (note: it usually takes a few high-profile murders and/or orgies to get them to do that, so this was a LOT of debt), and his successor was left spending 80% of the Vatican’s annual income on interest repayments alone. But let’s see what scenes from his life he himself wanted us to remember:
My favorite is the first: Angels and Muses descend from Heaven to attend the college graduation of young Maffeo Barberini (not yet pope Urban VIII) and give him a laurel crown. If all graduation ceremonies were this exciting, we’d never miss them! Also someone there has a Caduceus, some weird female version of Hermes? Hard to say. And look at the amazing fabric on the robe of the man overseeing the ceremony.
Second, Maffeo Barberini receives the Cardinal’s Hat, attended by an angel, while Pope Paul V who is giving him the hat points in a heavy-handed foreshadowing way to his own pope hat nearby. What could it mean?!
Next, the fateful election! Heavenly allegories of princely virtues come to watch as the wooden slips are counted and the vote counter is astonished by the dramatic result! Note how, propaganda aside, this is useful for showing us what the slips looked like.
In the one above I particularly like the guy who’s peering into the goblet to make absolutely sure no slips are stuck there:
On the other side of the same scene, our modest Urban VIII is so surprised to be elected he practically swoons! And even demands a recount, while the nice acolyte kneels before him with the (excessively heavy) papal tiara on a silver platter.
Now Urban’s adventures as pope! He breaks ground for new construction projects in Rome, attended by some floating cupid creature holding a book for the flying allegorical heart of the city:
He builds new fortresses to defend Rome:
He makes peace between allegorical ladies representing Rome and Etruria (the area right next to Rome: note, if there is strife between Rome and Etruria in the first place, things in Italy are VERY VERY BAD! But the tapestries aren’t going into that):
And finally, Urban VIII defends Rome from Famine and Plague by getting help from St. Peter, St. Paul, Athena, and St. Sebastian. Well done, your Holiness!
How about that for the exciting life of a late Renaissance pope? You get to hang out with lots of allegorical figures, and vaguely pagan deities as well as saints, and everyone around you is always gesturing gracefully! No wonder they fought so hard for the papal tiara. Also, no bankers or moneylenders or interest repayment to be found!
More seriously, another century’s propaganda rarely makes it into our canon of what art is worth reproducing, teaching and discussing, but I often find this kind of artifact much more historically informative than most: we can learn details of clothing, spaces and items like how papers are folded, or what voting slips looked like. We can learn which acts a political figure wanted to be remembered for, what seemed important at the time, so different from what we remember. A tapestry of him canonizing St. Ignatius of Loyola would certainly be popular now, but in his day people cared more about immediate military matters, and he had no way to predict how important St. Ignatius would eventually become. Pieces like this are also a good way to remind ourselves that the Renaissance art we usually see on calendars and cell phone cases isn’t representative; it’s our own curated selection of that tiny Venn diagram intersection of art that fits the tastes of BOTH then AND now. And a good reminder that we should always attend graduation ceremonies, since you never know when Angels and Muses might descend from Heaven to attend.
My own period I will treat the most briefly in this survey. This may seem like a strange choice, but I can either do a general overview, or get sidetracked discussing individual philosophers, theologians and commentators and their uses of skepticism for another five posts. So, in brief:
In the later Middle Ages, within the philosophical world, the breadth of disagreement within scholarship, how different the far extreme theories were on any given topic, was rather circumscribed. A good example of a really fractious fight is the question of, within your generally Aristotelian tripartite rational immortal soul, which of the two decision-making principles is more powerful, the Intellect or the Will? It’s a big and important question – without it we will starve to death like Buridan’s ass, and be unable to decide whether to send our second sons to a Franciscan or a Dominican monastery, plus we need it to understand how Original Sin, Grace and salvation work. But the breadth of answers is not that big, and the question itself presumes that everyone involved already believes 90% the same thing.
Enter Petrarch, “Let’s read the classics! They’ll make us great like the Romans!” Begin 250 years of working really hard to find, copy, correct, translate, edit, print and proliferate every syllable surviving from antiquity. Now we discover that Epicurus says there’s no afterlife and the universe is made of atoms; Stoics say the universe is one giant contiguous object without motion or individual existence; Plato says there’s reincarnation (What? The Plato we used to have didn’t say that!); and Aristotle totally doesn’t say what we thought he said, it turns out the Organon was a terrible translation (Sorry, Boethius, you did your best, and we love you, but it was a terrible translation.) Suddenly the palette of questions is much broader, and the degree to which people disagree has opened exponentially wider. If we were charting a solar system before, now we’re charting a galaxy. But the humanists still tried hard to make them all agree, much as the scholastics and Peter Abelard had, since the ancients were ALL wonderful and ALL brilliant and ALL right, right? Even the stuff that contradicts the other stuff? Hence Renaissance Syncretism, attempts by philosophers like Marsilio Ficino and Giovanni Pico della Mirandola to take all the authors of antiquity, and Aquinas and a few others in the mix, and show how they were all really saying the same thing, in a roundabout, hidden, glorious, elusive, poetic, we-can-make-like-Abelard-and-make-it-all-make-sense way.
Before you dismiss these syncretic experiments as silly, or as slavish toadying, there is a logic to it if you can zoom out from modern pluralistic thinking for a minute and look at what Renaissance intellectuals had to work with.
To follow their logic chain you must begin–as they did–by positing that Christianity is true, and there is a single monotheistic God who is the source of all goodness, virtue, and knowledge. Wisdom, being wise and good at judgment, helps you tell true from false and right from wrong, and what is true and right will always agree with and point toward God. Therefore all wise people in history have really been aiming toward the same thing–one truth, one source. Plato and Aristotle and their Criteria of Truth are in the background of this, Plato’s description of the Good which is one divine thing that all reasoning minds tend toward, and Aristotle’s idea that reasoning people (philosophers, scientists) working without error will come to identical conclusions even if they’re on opposite sides of the world, because the knowable categories (fish, equilateral triangle, good) are universal. Thus, as Plato and Aristotle say we use reason to gradually approach knowledge, all philosophers in history have been working toward the same thing, and differ only in the errors they make along the way. This is the logic, but they also have evidence, and here you have to remember that Renaissance scholars did not have our modern tools for evaluating chronology and influence. They looked at early Christian writings, and they looked at Plato and Aristotle, and they said, as we do, “Wow, Plato and Aristotle have a lot of ideas in common with these early Christians!” but while we conclude, “Early Christians sure were influenced by Plato and Aristotle,” they instead concluded, “This proves that Plato and Aristotle were aiming toward the same things as Christianity!” And they had further evidence from how tangled their chronologies were. There were certain key texts like the Chaldean Oracles which they thought were much much older than we now think they are, which made it look like ideas we attribute to Plato had independently existed well before Plato. 
They looked at Plotinus and other late antique Neoplatonists who mixed Plato and Aristotle but claimed the Aristotelian bits were really hidden inside Plato the whole time, and they concluded, “See, Plato and Aristotle were basically saying the same thing!” Similarly confusing were the works of the figure we now call Pseudo-Dionysius, who we think was a late antique Neoplatonist voicing a mature hybrid of Platonism and Aristotelianism with some Stoicism mixed in, but who Renaissance scholars believed was a disciple of Saint Paul, leading them to conclude that Saint Paul believed a lot of this stuff, and making it seem even more like Plato, Aristotle, Stoics, ancient mystics, and Christianity were all aiming at one thing. So any small differences are errors along the way, or resolvable with “sic et non.”
The problem came when they translated more and more texts, and found more contradictions than they could really handle. Ideas much wilder and more out there than they expected suddenly had authoritative possibly-sort-of-proto-Christian authors endorsing them. Settled questions were unsettled again, sleeping dragons woken. For example, it wasn’t until the Fifth Lateran Council in 1513 that the Church officially made belief in the immortality of the soul a required doctrine for all Christians, which does not mean that lots of Christians before 1513 didn’t believe in the afterlife, but that Christians in 1513 were anxious about belief in the afterlife, feeling that it and many other doctrines which had stood unthreatened throughout the Middle Ages were suddenly in doubt. The intellectual landscape was suddenly bigger and stranger.
Remember how I said Cicero would be back? All these humanists read Cicero constantly, including the philosophical dialogs with his approach of presenting different classical sects in dialog, all equally plausible but incompatible, leading to… skepticism. And as they explored those same sects more and more broadly, Cicero the skeptic became something of the wedge that started to expand the crack, not overtly stating “Hey, guys, these people don’t agree!” but certainly pressing the idea that they don’t agree, in ways which humanists had more and more trouble ignoring as more texts came back.
Aaaaaand the Reformation made this more extreme, a lot more extreme, by (A) generating an enormous new mass of theological claims made by contradictory parties, adding another arm to our galactic spiral, and (B) developing huge numbers of fierce and damning counter-arguments to all these claims, which in turn meant developing new tools for countering and eroding belief. Thus, as we reach the 1570s, the world of philosophy is a lot bigger, a lot deadlier (as the Reformation and Counter-Reformation killed many more people for their ideas than the Middle Ages did), and a lot scarier, with vast swarms of arguments and counter-arguments, many of them powerful, persuasive, beautifully reasoned, and completely incompatible. And when you make a beautiful yes-and-no attempt to make Plato and Epicurus agree, you don’t have the men themselves on hand to say “Excuse me, in fact, we don’t agree.” But you did have real live Reformation and Counter-Reformation theologians running around responding to each other in real time, which made syncretic reconciliation all the more impossible.
Remember how Abelard, who was able to make St. Jerome and St. Augustine seem to agree, drew followers like Woodstock? Well, now his successors–Scholastic and Humanist, since the Humanists were all ALSO reading Scholasticism all the time–have a thousand times as many authorities to reconcile. You think Jerome and Augustine is hard? Try Calvin and Epicurus! St. Dominic and Zwingli! Thomas Aquinas is a saint now, let’s see if you can Yes-and-No the entire Summa Theologica into agreeing with Epictetus, Pseudo-Dionysius and the Council of Trent at the same time! And remember, in the middle of all this, that most if not all of our Renaissance protagonists still believe in Hell and damnation (or at least something similar to it), and that if you’re wrong you burn in Hellfire forever and ever and ever and so do all your students and it’s your fault. Result: FEAR. And its companion, freethought. Contrary to what we might assume, this is not a case where fear stifled inquiry, but where it stimulated more, firing Renaissance thinkers with the burning need to have a solution to all these contradictions, some way to sort out the safe path amid a thousand pits of Hellfire. New syntheses were proposed, new taxonomies of positions and heresies outlined, and old beliefs reexamined and refined or reaffirmed. And this period of intellectual broadening and competition brought with it an increasing inability to believe that any one of these options is the only right way when there are so many, and they are so good at tearing each other down.
And in the middle of this, experimental and observational science is advancing rapidly, and causing more doubt. We discover new continents that don’t fit in a T-O map (Ptolemy is wrong), new plants that don’t fit existing plant taxonomy (Theophrastus is wrong), details about Animals which don’t match Aristotle (we’d better hope he’s not wrong!), the circulation of the blood which turns the four humors theory on its head (Not Galen! We really needed him!), and magnification lets us finally see the complexity of a flea, and realize there is a whole unexplored micro-universe of detail too small for the naked eye to experience, raising the question “If God made the Earth for humans, why did God bother to make things humans can’t even perceive?”
Youth: “But, Socrates, why did experimental and observational science advance in that period? Discovering new stuff that isn’t in the classics doesn’t have anything to do with reconstructing antiquity, or with the Reformation, does it?”
Good question. A long answer would be a book, but I can make a quick stab at a short one. I would point at several factors. First, after 1300, and increasingly as we approach 1600, European rulers began competing in new ways, many of them cultural. As more and more nobles were convinced by the humanist claim that true nobility and power came from the lost arts of the ancients, so scholarship and unique knowledge, including knowledge of ancient sciences, became mandatory ornaments of court, and politically valuable as ways of advertising a ruler’s wealth and power. Monarchs and newly-risen families who had seized power through war or bribery could add a veneer of nobility by surrounding themselves with libraries, scholars, poets, and scientists, who studied the ancient scientific sources of Greece and Rome but, in order to understand them more fully, also studied newer sources coming from the Middle East, and did new experiments of their own. A new astronomical model of the heavens proclaimed the power of the patron who had paid for it, just as much as a fur-lined cloak or a diamond-studded scepter.
Add to this the increase in the scale of wars, as increased wealth could raise larger armies, generating a situation in which new tools for warfare, and especially fortress construction, were increasingly in demand (when you read Leonardo’s discussions of his abilities, more than 75% of the inventions he mentions are tools of war). Add to that the printing press, which makes it possible for novelties–whether a rediscovered manuscript or a newly-discovered muscle–to spread exponentially faster, and which makes books much more affordable, so that if only one person in 50,000 could afford a library before, now it is one in 5,000, and even merchants could afford a few texts. Education was easier, and educated men were in demand at courts eager to fill themselves with scholars, and advertise their greatness with discoveries.
These are the main facilitators, but I would also cite another fundamental shift. I have talked before about Petrarch, and the humanist project to improve the world by reconstructing a lost golden age. This is the first philosophical movement since ancient stoicism that has had anything to do with the world, since medieval theology’s (perfectly rational in context!) desire to study the Eternal instead of the ephemeral meant that most scholars for many centuries had considered natural philosophy, the study of impermanent natural phenomena, as useless as studying the bathwater instead of the baby. Humanism generated a lot of arguments about why Earth and earthly things were worth more than nothing, even if they agreed Heaven and eternal things were more important, and I think the mindset which said it was a pious and worthwhile thing to translate Livy or write a treatise on good government contributed to the mindset which said it was a pious and worthwhile thing to measure mountains or write a treatise on metallurgy. Thought turned, just a little bit, toward Earth.
There, that’s the Renaissance and Reformation, oversimplified by necessity, but Descartes is chomping at the bit for what comes next. For those who want more, I shall do the crass thing here and say: for more detail, see my book Reading Lucretius in the Renaissance, or Popkin’s History of Skepticism, or wait.
At last, Montaigne!
Like the world which basked in his writings, and shuddered in his “crisis,” I love Montaigne. I love his sentences, his storytelling, his sincerity, his quips, his authorial voice. Reading Montaigne is like slowly enjoying a glass of whatever complex, rich and subtle beverage you most enjoy a glass of (wine for many, fresh goat milk for me!). Especially because, at the end, your glass is empty. (I see a contented Descartes nodding). When I set about writing this series, getting to Montaigne was, in fact, my secret end goal, since, if there is a founder of modern skepticism, it is Michel Eyquem de Montaigne.
Montaigne was unique, an experiment, the natural experiment to follow at the maturation of the Renaissance classical project but still, a unique child, raised as an overt pedagogical experiment outlined by his father: Montaigne grew up speaking only Latin. He was exposed to French in his first three years by country nurses, but from three on he was only allowed contact with people–his tutor, parents and servants–speaking Latin. He was a literal attempt to raise a Cicero or Caesar, formed exclusively by classical ideas, the ideal man that the humanists had been hoping to create. Greek was later added, not with textbooks and the rod as was usual in those days but with games and music, and studies were always made to seem pleasant and wonderful by surrounding him with music (even waking the child every morning with delightful live music). He grew up to be about as perfect a Platonic Philosopher King as one could hope to imagine, studying law and entering politics, as his father wished, achieving the highest honors, but preferring life alone in his library, and frequently retiring to do just that, only to be dragged back into politics, literally by popular demand, by people who would come bang on his library door demanding that he come out to take up office and rule them. I think often about what it must have been like to be Montaigne, to be so immersed, enjoy these things so much, and only later discover that he was alone in a world with literally no other native speaker of his language. It must have been as difficult as it was wonderful to be Montaigne. But I think I understand why, when he lost his best friend Étienne de la Boétie, Montaigne wrote of his grief, his loss, the pain of solitude, with an intensity rarely approached in the history of human literature. He also wrote Essais, meandering writings, the source of the modern word “essay”, for which every schoolchild has the right to playfully curse him.
I will now go about explaining why Montaigne was so wonderful by describing Voltaire. Yes, it is an odd way to go about it, but the Voltaire example is clearer and more concise than any Montaigne example I have on hand, and, in this, Voltaire was a student of Montaigne, and Montaigne will only smile to see such a beautiful development of his art, as Bacon smiles on Newton, and Socrates on all of us.
At the beginning of this sequence, I outlined two potential sources of knowledge: either (A) Sense Perception i.e. Evidence, or (B) Logic/Reason. The classical skeptics were born when the reliability of these two sources of knowledge was drawn into doubt, Sense Perception by the stick in water, Logic by Zeno’s Paradoxes of Motion. Responses included the skeptics’ conclusion “We can’t know anything if we can’t trust Reason or the Senses,” and the various other classical schools’ Criteria of Truth (Plato’s Ideas, Aristotle’s Categories, Epicurus’s weak empiricism, etc.) All refutations we have seen along our long path have been based on undermining one of these types of knowledge sources: so when Duns Scotus fights with Aquinas, he picks on his logic, and when Ockham fights with him he, often, picks on his material sensory evidence. (“Where is the phantasm? Huh? Huh?”)
Everybody, I’d like to introduce you to Leibniz. Leibniz, this is everybody. “Hello!” says Leibniz, “Very nice to meet you all.” We are going to viciously murder Leibniz in about three minutes. “It’s no trouble,” says Leibniz, “I’m quite used to it.” Thank you, Leibniz, we appreciate it.
Leibniz here made many great contributions to philosophy and mathematics, but one particular one was extraordinarily popular, I would go so far as to say faddy, a fad argument which swept Europe in the first half of the 18th century. You have almost certainly heard it before in mocking form, but I will do my best to be fair as we line up our target in our sights:
God is Omnipotent, Omniscient and Omnibenevolent. (Given.) “Grrrr,” quoth Socrates.
Given that God is Omniscient, He knows what the best of all possible worlds is.
Given that God is Omnipotent, He can create the best of all possible worlds.
Given that God is Omnibenevolent, He wants to create the best of all possible worlds.
Any world such a God would make must logically be the best of all possible worlds.
This is the best of all possible worlds.
Now, this was a proof written, just like Anselm’s and Aquinas’s, by a philosopher expecting a readership who all believe, both in God, and in Providence. It is a comfortable proof of the logical certainty that there is Providence, that this universe is perfect (as the Stoics first theorized), and anything in it that seems to be bad or evil must, in fact, be part of a greater long-term good that we fail to see because of our limited human perspective. The proof made a huge number of people delighted to have such an elegant and simple argument for something they enthusiastically believed.
But the proof also had the side-effect that arguments about Providence often have, of making people start to try to reason out what the good was behind hidden evils. “Oh, that guy was struck with disease because he did X bad thing.” “Wolves exist to make us live in villages.” “That plague happened because those people were bad.” It was (much like Medieval proofs of the existence of God) a way philosophers could show off their cleverness to an appreciative audience, make themselves known, and put forward theories about right and wrong and what God might want.
In 1755 an enormous earthquake struck the great port city of Lisbon (Portugal), wiping out tens of thousands of people (some estimate up to 100,000) and leveling one of the great gems of European civilization. It remains to this day one of the deadliest earthquakes in recorded history, and parts of Lisbon still bear its ruins more than 250 years later. The shock and horror, to a progressive, optimistic Europe, was stunning. And immediately thereafter, fans of Leibniz started publishing essays about how it was GOOD that this had happened, because of XYZ reason. For example, one argument was that the Portuguese were persecuting people for their religion, and this was God saying he disapproved <= REAL argument. (Note: Leibniz himself is innocent of all this, having died years before the earthquake – we are speaking of his followers.) Others argued that it was a bad minor effect of God’s general laws, that the physical rules of the Earth which make everything wonderful for humankind also make earthquakes sometimes happen, but that the suffering they cause is negligible against the greater goods that Providence achieves. And if one person in Europe could not stand these noxious, juvenile, pompous, inhumane, self-serving, condescending, boastful, heartless, self-congratulatory responses to unprecedented human suffering, that person was the one pen mightier than any sword, Voltaire.
Would words like these to peace of mind restore
The natives sad of that disastrous shore?
Grieve not, that others’ bliss may overflow,
Your sumptuous palaces are laid thus low;
Your toppled towers shall other hands rebuild;
With multitudes your walls one day be filled;
Your ruin on the North shall wealth bestow,
For general good from partial ills must flow;
You seem as abject to the sovereign power,
As worms which shall your carcasses devour.
No comfort could such shocking words impart,
But deeper wound the sad, afflicted heart.
When I lament my present wretched state,
Allege not the unchanging laws of fate;
Urge not the links of the eternal chain,
’Tis false philosophy and wisdom vain.
The God who holds the chain can’t be enchained;
By His blest Will are all events ordained:
He’s Just, nor easily to wrath gives way,
Why suffer we beneath so mild a sway:
This is the fatal knot you should untie,
Our evils do you cure when you deny?
Men ever strove into the source to pry,
Of evil, whose existence you deny.
If he whose hand the elements can wield,
To the winds’ force makes rocky mountains yield;
If thunder lays oaks level with the plain,
From the bolts’ strokes they never suffer pain.
But I can feel, my heart oppressed demands
Aid of that God who formed me with His hands.
Sons of the God supreme to suffer all
Fated alike; we on our Father call.
No vessel of the potter asks, we know,
Why it was made so brittle, vile, and low?
Vessels of speech as well as thought are void;
The urn this moment formed and that destroyed,
The potter never could with sense inspire,
Devoid of thought it nothing can desire.
The moralist still obstinate replies,
Others’ enjoyments from your woes arise,
To numerous insects shall my corpse give birth,
When once it mixes with its mother earth:
Small comfort ’tis that when Death’s ruthless power
Closes my life, worms shall my flesh devour.
This (in the William F. Fleming translation) is an excerpt from the middle of Voltaire’s Poem on the Lisbon Earthquake, which I heartily encourage you to read in its entirety. The poem summarizes the arguments of Camp Leibniz, and juxtaposes them with heart-wrenching descriptions of the sufferings of the victims, and with Voltaire’s own earnest and passionate expression of exactly why these kinds of arguments about Providence are so difficult to choke down when one is really on the ground suffering and feeling. The human is not a senseless pottery vessel, it is a thinking thing, it feels pain, it asks questions, it feels the special kind of pain that unanswered questions cause, the same pain the skeptics have been trying to help us escape for 3,000 years. But we don’t escape, and the poem captures it. The poem swept across Europe like a firestorm. People read it, people felt it, people recognized in Voltaire’s words the cries of anger in their own hearts. And they agreed. He won. The Leibniz fad ended. An entire continent-wide philosophical movement, slain.
And he used neither Logic nor Evidence.
Did you feel it? The poem persuaded, attacked, undermined, eroded away the respectability of Leibniz, but it did it without using EITHER of the two pillars of argument. There was no chain of reasoning. And there was no empirical observation. You could say there was some logic in the way he juxtaposed claims “God is a kind Maker” with counter-claims “I am not a potter’s jar, I am a thinking thing! I need more!”. You could say there was some empiricism or evidence-based argument in his descriptions of things he saw, or things he felt, since feelings too are sense-perceptions in a way, so reporting how one feels is reporting a sensory fact. But there was nothing in this so rigorous or so real that any of our ancient skeptics would recognize it as the empiricism they were attacking. Those people Voltaire describes – he did not see them, he just imagines them, reaching across the breadth of Europe with the strength of empathy. That potter’s wheel is a metaphor, not a syllogism. Voltaire has used a third thing, neither Reason nor Evidence, as a tool of skepticism.
What do we name this Third Thing? I have heard people propose “common sense” but that’s a terribly vexed term, going back to Cicero at least, which has been used by this point to mean 100 things that are not this thing, so even if you could also call this thing “common sense” it would just create confusion (we don’t need Aristotle looming with a lecture on the dangers of unclear vocabulary). I have heard people propose “sentiment” and I like how galling it feels to try to suggest that “sentiment” should enjoy coequal respect and power with Reason and Evidence, but it isn’t quite that either. I am not yet happy with any name for this Third Thing, and am playing around with many. All I will say is that it is real, it is powerful, it is as effective at persuading one to believe or disbelieve as Reason and Evidence are. And, even if there were shadows of this Third Thing earlier in human history, Montaigne was the smith who sharpened the blade and handed it to Voltaire, and to the rest of us.
Montaigne’s Essais are lovely, meandering, personal, structure-less, rambling musings in which topics flow one upon another, he summarizes an argument made for or against some heresy, then, rather than voicing an opinion, tells you a story about his grandmother that one time, or retells a bit of one of Virgil’s pastorals, or an anecdote about some now-obscure general, and then flows on to a different topic, never stating his opinion on the first but having shaped your thinking, through his meanders, until you feel an answer, a belief or, more often, disbelief, even if he never voiced one. And then he keeps going, taking up another argument, making it feel silly with an allegory about two bakers, another and–have you heard the news from Spain?–another, and another, and oh, the loves of Alexander, another, and another. And as it flows along you get to know him, feel you’re having a conversation with him, and somewhere toward the end you no longer believe any of the philosophical arguments he has just summarized are plausible at all, but he never once argued directly against any of them. It is a little bit like our skeptical Cicero, juxtaposing opposing views and leaving us convinced by none, but it is one level less structured, not actually a dialog with arguments and refutations. Skepticism, without Reason, without Evidence, just with the human honesty that is Montaigne, his doubts, his friendship, his communication to you, dear reader, across the barrier of page, and time, and language, this strange French-Roman, this only native Latin speaker born in a millennium, this alien, has made you realize all the philosophical convictions, everything in that broad spectrum that scholasticism plus the Renaissance plus the Reformation and Counter-Reformation ferocity have laid before you, none of it is what a person really feels deep down inside, not Montaigne, and not you. 
And so he leaves you a skeptic, in a completely different way from how the ancient skeptics did it, not with theses, or exercises, or lists, or counterarguments, just with… humanity?
Montaigne did it. His contemporaries found it… odd at first, a bit self-centered, this autobiographical meandering, but it was so beautiful, so entrancing, so powerful. It reared a new generation, armed with Reason and Evidence and This Third Thing, and deeply skeptical. Students at universities started raising their hands in class to ask the teachers to prove the school existed. Theologians advising princes started saying maybe it didn’t matter that much what the difference was between the different Christian faiths if they were close enough. A new age of philosophy was born, not a new school, but a new tool for dogmatism’s ancient symbiotic antagonist: doubt.
And, where doubt grows stronger and richer, so does dogmatic philosophy, having that much more to test itself against. Just as, in antiquity, so many amazing schools and ideas were born from trying to respond to Zeno and the Stick in Water, so Montaigne’s new tools of Skepticism, his revival and embellishment of skepticism, the birth, as we call it, of Modern Skepticism, was also the final ingredient necessary for an explosion of new ideas, new schools, new universes described by new philosophers trying to build systems which can stand up against a new skepticism armed, not just against Reason and Evidence, but with That Third Thing.
Thus, as 1600 approaches, the breakneck proliferation of new ideas and factions makes Montaigne’s skepticism so popular that students in scholastic and Jesuit schools are starting to raise their hands and demand that the professor prove the existence of the classroom before expecting them to attend class. A “skeptical crisis” takes center stage in Europe’s great intellectual conversation, and multiplying doubt seems to have all the traditional Criteria of Truth in flight. It is onto this stage that Descartes will step, and craft, alongside his contemporaries, the first new systems which will have to cope, not with two avenues of attacking certainty, but, thanks to Montaigne, three. And will fight back against them with Montaigne’s arts as well. Next time.
For now, I will leave you with one more little snippet of the future: I lied to you about a simple happy ending to Voltaire’s quarrel with Leibniz. Oh, Leibniz was quite dead, not just because the man himself had died but because no philosopher could take his argument seriously after the poem. Ever. Again. In fact, a few years ago I went to a talk at a philosophy department in which a young scholar was taking on Leibniz’s Best of All Possible Worlds thesis, and picking it apart using beautiful logical argumentation, and at the end everyone applauded and congratulated him, but when the Q&A started the first Q was “Well, um, this was all quite fascinating, but, isn’t Leibniz, I mean, no one takes that argument seriously anymore…” But the young philosopher was correct to point out that, in fact, no one had ever actually directly refuted it with logic. No one saw the need. But if Voltaire’s victory over logical Leibniz was complete, Leibniz was not the most dangerous of foes. Voltaire had contemporaries, after all, armed with Montaigne’s Third Thing just as Voltaire was. Rousseau will fire back, sweet, endearing, maddening Rousseau, not in defense of Leibniz, but against the poem which he sees as an attack on God. But this battle of two earnest and progressive deists must wait until we have brought about the brave new world that has such creatures in it. For that we need Descartes, Francis Bacon, grim Hobbes, John Locke, and the ambidextrous Bayle.
Socrates, Sartre, Descartes and our Youth have, among them, consumed twelve thousand, six hundred and forty-two hypothetical eclairs in the fourteen months since we left them contemplating skepticism on the banks of a cheerily babbling imaginary brook. Much has changed in the interval, not in the land of philosophical thought-experiments (which is ever peaceful unless someone scary like Ockham or Nietzsche gets inside), but in a world two layers of reality removed from theirs. The changes appear in the world of material circumstances which shape and foster this author, who in turn shapes and fosters our philosophical picnickers. Now, having recovered from my transplant shock of being moved to the new and fertile country of the University of Chicago, and with my summer work done, and Too Like the Lightning fully revised and on its way toward its May 10th release date (YES!), it is time at last to return to our hypothetical heroes, and to my sketches of the history of philosophical skepticism.
When last we saw them, Socrates, Sartre, Descartes and our Youth had rescued themselves from the throes of absolute doubt by developing Criteria of Truth, which allowed them to differentiate arenas of knowledge where certainty is possible from arenas of knowledge where it is not. (See their previous dramatic adventures in Sketches of a History of Skepticism Part 1 and Part 2.) To do this, they looked at three systems: Epicureanism, which suggests that we have certain knowledge of the world perceived by the senses, but no certain knowledge of the imperceptible atomic reality beneath; Platonism, which suggests that we have knowledge of the eternal structures that create the material world, i.e. Forms or Ideas, but not of the flawed, corruptible material objects which are the shadows of those eternal structures; and Aristotelianism, which suggests that we can have certain knowledge of logical principles and of categories within Nature, but not of individual objects.
Notably, neither Epicurus nor Aristotle was invited to our picnic, and, while you never know when any given Socrates will turn out to be a Plato in disguise, our particular Socrates seems to be staying safely in the camp of doubt: he knows that he knows nothing. Our object is not to determine which of these classical camps has the correct Criterion of Truth. In fact, our distinguished guests, Descartes and Sartre, aren’t interested in rehashing these three classical systems, all of whose criteria are not only familiar, but, to them, long defunct. They have not come this great distance in time to watch Socrates open the doors of skepticism to our Youth just to meet antiquity’s familiar dogmatists; the twinkle in Descartes’ eye (and his infinite patience doling out eclairs) tells me he’s waiting for something else.
Descartes and Sartre expect Cicero next — Cicero, whom many might mistake for a voice of the Stoic school (the intellectual party conspicuously missing from the assembly of Plato, Aristotle, and Epicurus) but who is actually more often read by modern scholars as a new and promising kind of Skeptic. Unfortunately, Cicero is currently busy answering a flurry of letters from someone called Petrarch, so has declined to join our little gathering (or possibly he’s just miffed at hearing that I’m doing an abbreviated finale to this series, so he’d only get a couple paragraphs even if he came). So we must do our concise best to cover his contribution on our own. Pyrrho, Zeno and other early skeptical voices argued in favor of doubt by demonstrating the fallibility of the senses and of pure reason: the stick in water that looks bent, the paradoxes of motion which show how logic and reality don’t match. Cicero achieves unbelief (and aims at the eudaimonist tranquility beyond) by a different route, a luxurious one made possible by the fact that he is writing three centuries into the development of philosophy and has many different dogmatic schools to fall back on. In his philosophical dialogs, Cicero presents different interlocutors who put forth different dogmatic positions: Stoic, Platonist, Epicurean; all in dialog with each other, presenting evidence for their own positions and counter-arguments against the conclusions of others. Each interlocutor works strictly by his own Criterion of Truth, and all argue intelligently and well. But they all disagree. When you read them all together, you are left uncertain. No particular voice seems to overtop the others, and the fact that there are so many different equally plausible positions, defended with equally well-defined Criteria of Truth, leaves one with no confidence that any of them is reliable.
At no point does Cicero say “I am a skeptic; I think there is no certainty,” but the effect of reading the dialogs is to be left uncertain. Cicero himself does not seem to have been a Pyrrhonist skeptic, and he clearly does hold some philosophical positions, especially moral principles, quite strongly. There is certainly a good case to be made that he has strong Stoic leanings, and there is validity to the Renaissance argument that he should be loosely clustered with Seneca and Cato, who subscribe to a mixed-together digest of Roman paganism, Stoicism, some Platonic and a few Aristotelian elements. But especially on the big questions of epistemology, ontology and physics, Cicero remains solidly, frustratingly, elusive.
There are many important aspects of Cicero’s work, but for our purposes the most important is this: he has achieved doubt without actually making any skeptical arguments, or counter-arguments. He has not attacked the fundamentals of Stoicism, Platonism or Epicureanism. Instead, he has used the strengths of the three schools to undermine each other. All three schools are convincing. All are plausible. All have evidence and/or logic on their side. As a result, none of the three winds up feeling convincing, even though none of the three has been directly undermined. This is not a new achievement of Cicero’s. Epicurus used a similar technique, and Lucretius, his follower, did so too; and we know Cicero read Lucretius. But Cicero is the most important person to use this technique in antiquity, largely because 1,300 years later it will be Cicero who becomes the centerpiece of Renaissance education. And Cicero will have no small Medieval legacy as well.
Medieval Certainty, and the Big Question
Stereotypically for a Renaissance historian, I will move quickly through the Middle Ages, though not for the stereotypical reasons. I don’t think that the Middle Ages were an intellectual stasis; I do think that Medieval philosophy is full of many complex things that I’m just starting to seriously work through in my own studies. I’m not ready to provide a light, fun summary of something which is, for me, still a rich forest to explore. Church Fathers, late Neoplatonists, Chroniclers, theological councils, monastic leaders, rich injections from the Middle East, Maimonides; all intersect with doubt, certainty and Criteria of Truth in rich and fascinating ways that I am not yet prepared to do justice to. So instead I will present an abstraction of one important aspect of Medieval thinking which I hope will help elucidate some overall approaches to doubt, even if I don’t pause to look at individual minds.
When I was in my second year of grad school, I chatted over convenience store cookies in the grad student lounge with a new student entering our program that year, like myself, to study the Renaissance. He poked fun at the philosophers of the Middle Ages. He asked me, “How could anybody possibly be interested in going on and on and on and on like that about God?” And in that moment of politeness, and newness, and fun, I laughed, and nodded. But, happily, we had a good teacher who made us look more at the Medieval, without which we can’t understand the Renaissance, and now I would never laugh at such a comment.
Set aside your modern mindset for a moment, and your modern religious concepts, and see if you can jump into the Medieval mind. To start with, there is a Being of infinite power, Whose existence is known with certainty. (Take that as given — a big given, I know, but it’s a given in this context.) Such a Being created everything that ever has existed or will exist. Everything that happens: events, births, storms, falling objects, thoughts; all were conceived by this Being and exist according to this Being’s script. The Being possesses all knowledge, and all good things are good because they resemble this Being. Everything in the material world is fleeting and imperfect and will someday be destroyed and forgotten, including the entire Earth. But — this Being has access to another universe where all things are eternal and perfect, which will last beyond the end of the material universe, and with this Being’s help there might be some way for us to reach that universe as well. The Being created humans with particular care, and is trying to communicate with us, but direct communication is a difficult process, just as it is difficult for an entomologist to communicate directly with his ants, or for a computer programmer to communicate directly with the artificial intelligences that she has programmed.
Now, the facetious question I laughed at in early grad school comes back, but turned on its head. How could you ever want to study anything other than this Being? It explains everything. You want to know the cause of weather, astronomical events, diseases, time? The answer is this Being. You want to know where the world came from, how thought works, why there is pain? The answer is this Being. History is a script written by this Being, the stars are a diagram drawn by this Being, the suitability and adaptation of animals and plants to their environments is the ingenuity of this Being, and the laws that make rocks sink and wood float and fire burn and rain fall are all decisions made by this Being. If you have any intellectual curiosity at all, wouldn’t it be an act of insanity to dedicate your life to anything other than understanding this Being? And in a world in which there has been, for centuries, effective universal consensus on all these premises, what society would want to fund a school that didn’t study them? Or pay tuition for a child to study something else? Theology dominated other sciences in the Middle Ages, not because people were backward, or closed-minded, or lacked curiosity, but because they were ambitious, keenly intellectual and fixed on a subject from which they had every reason to expect answers, not just to theological questions, but to all questions. They didn’t have blinders, they had their eyes on the prize, and they felt that choosing to study Natural Philosophy (i.e. the world, nature, biology, plants, animals) instead of Theology was like trying to study toenail clippings instead of the being they were clipped from.
To put it another way: have you ever watched a fun, formulaic, episodic genre show like Buffy the Vampire Slayer, or the X-Files? There’ll be one particular episode where the baddie-of-the-day is Christianity-flavored, and at some point a manifest miracle happens, or an angel or a ghost shows up, and then we have to reset the formula and move onto the next episode, but you spend that whole next episode thinking, “You know, they just found proof of the existence of the afterlife and the immortality of the soul. You’d think they’d decide that’s more important than this conspiracy involving genetically-modified corn.” That’s how people in the Middle Ages felt about people who wanted to study things that weren’t God.
Doubt comes into this in important ways, but not the ways that modern rhetoric about the Middle Ages leads most people to expect.
Wikipedia, at the time of writing, defines Scholasticism as “a method of critical thought which dominated teaching by the academics (“scholastics,” or “schoolmen”) of medieval universities in Europe from about 1100 to 1700.” It was “a program of employing that [critical] method in articulating and defending dogma in an increasingly pluralistic context.” It “originated as an outgrowth of, and a departure from, Christian monastic schools at the earliest European universities.” Philosophy students traditionally define Scholasticism as “that incredibly boring hard stuff about God that you have to read between the classics and Descartes”. Both definitions are true. Scholasticism is an incredibly tedious, exacting body of philosophy, intentionally impenetrable, obsessed with micro-detail, and happy to spend three thousand words proving to you that Good is good, or to set out a twenty-step argument that it is better to exist than not to exist (this is presumably why Hamlet still hadn’t graduated at age 30). Scholasticism was also so incredibly exciting that, apart from the ever-profitable medical and law schools, European higher education devoted itself to practically nothing else for the whole late Middle Ages, and, even though the intellectual firebrands of both the Renaissance and the 17th and 18th centuries devoted themselves largely to fiercely attacking the scholastic system, it did not truly crumble until deep into the Enlightenment.
Why was Scholasticism so exciting? Even if people who believed in an omnipotent God had good reason to devote their studies pretty exclusively to Theology, why was this one particularly dense and intentionally difficult method the method for hundreds of years? Why didn’t they write easy-to-read, penetrable treatises, or witty philosophical tales, or even a good old-fashioned Platonic-type dialog?
The answer is that Christianity changes the stakes for being wrong. In antiquity, if you’re wrong about philosophy, and the philosophical end of theology, you’ll make incorrect decisions, possibly lead a sadder or less successful life than you would otherwise, and it might mean your legacy isn’t what you wanted it to be, but that’s it. If you’re really, really wrong you might offend Artemis or something and get zapped, but it’s pretty easy to cover your bases by going to the right festivals. By the logic of antiquity, if you put a Platonist and an Epicurean in a room, one of them will be wrong and living life the wrong way, at least in some ways, but they can both have a nice conversation, and in the end, either they’ll both reincarnate and the Epicurean will have another chance to be right later, or they’ll both disperse into atoms and it won’t matter. OK. In Medieval Christianity, if you’re wrong about theology, your immortal soul goes to Hell forever, where you’ll be tormented by unspeakable devils for the rest of eternity, and everyone else who believes your errors is also likely to lose the chance of eternal paradise and absolute knowledge, and will be plunged into a pit of absolute misery and despair, irrevocably, forever. Error is incredibly dangerous, to you and to everyone around you who might get pulled down with you. If you’re really bad, you might even bring the wrath of God down upon your native city, or, if you’re even worse, then while you’re still alive your soul might depart your body and sink down to Hell, leaving your body to be a house for a devil who will use you to visit evil on the Earth (see Inferno Canto 27). But leaving aside those more extreme and superstition-tainted possibilities, error became more pernicious because of eternal damnation. If people who read your theologically incorrect works go to Hell, you’re infinitely culpable, morally, since every student misled to damnation is literally an infinite crime.
So, if you are a Medieval person, Theology is incredibly valuable, the only kind of study worth doing, but also incredibly dangerous. You want to tread very carefully. You want a lot of safety nets and spotters. You want ways to avoid error. And you know error is easy! Errors of logic, errors of failing senses. Enter Aristotle, or more specifically enter Aristotle’s Organon, a translation of the logical works of Aristotle completed by dear Boethius, part of the latter’s efforts to preserve Greek learning when he realized Greek and other relics of antiquity were fading. The Organon explains in great detail how you can go about constructing chains of logic in careful, methodical ways to avoid error. Use only clearly defined, unequivocal vocabulary, and strict syllogistic and geometric reasoning. Here it is, foolproof logic in 50 steps, I’ll show you! Sound familiar? This is Aristotle’s old Criterion of Truth, but it’s also the Medieval Theologian’s #1 Christmas Wish List. The Criterion of Truth which was, for Aristotle, a path through the dark woods and a solution to Zeno and the Stick in Water, is, to our theologian, a safety net over a pit of eternal Hellfire. That is why it was so exciting. That is why people who wanted to do theology were willing to train for five years just in logic before even looking at a theological question, just as astronauts train in simulators for a long time before going out into the deadly vacuum of space! That is even why scholastic texts are so hard to read and understand: they were intentionally written to be difficult to read, partly because they’re using an incredibly complicated method, but even more because they don’t want anyone to read them who hasn’t studied their method, because if you read them unprepared you might misunderstand, and then you’d go to Hell forever and ever and ever, and it would be Thomas Aquinas’s fault. And he would be very sad.
When Thomas Aquinas was presented for canonization, after his death, they made the argument that every chapter of the Summa Theologica was itself a miracle. It’s easy to laugh, but if you think about how desperately they wanted perfect logic, and how good Aquinas was at offering it, it’s an argument I understand. If you were dying of thirst in the desert, wouldn’t a glass of water feel like a miracle?
To give credit where credit is due, the mature application of Aristotle’s formal logic to theological questions was not pioneered by Aquinas but by a predecessor: Peter Abelard, the wild rockstar of Medieval Theology. People crowded in thousands and lived in fields to hear Peter Abelard preach; it was like Woodstock, only with more Aristotle. Why were people so excited? Did Abelard finally have the right answer to all things? “Yes and No,” as Peter Abelard would say, “Sic et Non,” that being the title of his famous book, a demonstration of his skill. (Wait, yes AND no, isn’t that even scarier and worse and more damnable than everything else? This is the most dangerous person ever! Bernard of Clairvaux thought so, but the Woodstock crowd at the Paraclete, they don’t.) Abelard’s skill was taking two apparently contradictory statements and showing, by elaborate roundabout logic tricks, how they agree. Why is this so exciting? Any troll on the internet can do that! No, but he did it seriously, and he did it with Authorities. He would take a bit of Plato that seemed to contradict a bit of Aristotle, and show how they actually agree. Even ballsier, he would take a bit of Plato that pretty manifestly DOES contradict another bit of Plato, and show how they both agree. Then, even better, he would take a bit from St. Augustine that seems to contradict a bit from St. Jerome and show how the two actually agree. “OH THANK GOD!” cries Medieval Europe, desperately perplexed by the following conundrum:
The Church Fathers are saints, and divinely inspired; their words are direct messages from God.
If you believe the Church Fathers and act in accordance with their teachings, they will show you the way to Heaven; if you oppose or doubt them, you are a heretic and damned for all eternity.
The Church Fathers often disagree with each other.
Abelard rescued Medieval Europe from this contradiction, not necessarily by his every answer, but by his technique, which allowed seemingly-contradictory authorities to be reconciled. Plato with Aristotle is handy. Plato with Plato sure is helpful. Jerome with Augustine is eternal salvation. And if he does it with the bits of Scripture that seem to contradict the other bits? He is now the most exciting thing since the last time the Virgin Mary showed up in person.
Abelard had a lover–later, wife, but she preferred ‘lover’–the even more extraordinary Heloise, and I consider it immoral to mention him without mentioning her, but her life, her stunningly original philosophical contributions and her terrible treatment at the hands of history are subjects for another essay in their own right. For today, the important part is this: Abelard was exciting for his method, more than his ideas, his way of using Reason to resolve doubts and fears when skepticism loomed. Thus even Scholasticism, the most infamously dogmatic philosophical method in European history, was also in symbiosis with skepticism, responding to it, building from it, developing its vast towers of baby-step elaborate logic because it knew Zeno was waiting.
Proofs of the Existence of God
We are all very familiar with the veins of Christianity which focus on faith without proof as an important part of the divine plan: that God wants to test people, and there is no proof of the existence of God because God wants to be unknowable and elusive in order to test people’s faith. The most concise formula is the facetious one by Douglas Adams, where God says: “I refuse to prove that I exist, because proof denies faith and without faith I am nothing.” It’s a type of argument associated with very traditional, conservative Christianity, and, often, with its more zealous, bigoted, or “medieval” side. I play a game whenever I run into a new scholar who works on Medieval or early modern theological sources, any sources, any period, any place, from pre-Constantine Rome to Renaissance Poland. I ask: “Hey, have you ever run into arguments that God’s existence can’t be proved, or God wants to be known by faith alone, before the Reformation?” Answers: “No.” “Nope.” “Naah.” “No, never.” “Uhhh, not really, no.” “Nope.” “No.” “Nothing like that.” “Hmm… no.” “Never.” “Oh, yeah, one time I thought I found that in this fifth-century guy, but actually it was totally not that at all.” Like biblical literalism, it’s one of these positions that feels old because it’s part of a conservative position now, but it’s actually a very recent development from the perspective of 2,000 years of Christianity plus centuries more of earlier theological conversations. So, that isn’t what the Middle Ages generally does with doubt; it doesn’t rave about faith or God’s existence being elusive. Europe’s Medieval philosophers were so sure of God’s existence that it was considered manifestly obvious, and doubting it was considered a mental illness or a form of mental retardation (“The fool said in his heart ‘there is no God’,” ergo there must be some kind of brain deficiency which makes people doubt God; for details on this see Alan C. Kors, Atheism in France, vol. 1).
And when St. Anselm and Thomas Aquinas and Duns Scotus work up technical proofs of the existence of God they’re doing it, not because they or anyone was doubting the existence of God, but to demonstrate the efficacy of logic. If you invent a snazzy new metal detector you first aim it at a big hunk of metal to make sure it works. If you design a sophisticated robot arm, you start the test by having it pick up something easy to grab. If you want to demonstrate the power of a new tool of logic, you test it by trying to prove the biggest, simplest, most obvious thing possible: the existence of God.
(PARENTHESIS: Remember, I am skipping many Medieval things of great importance. *cough*Averroes*cough* This is a snapshot, not a survey.)
Three blossoms on the thorny rose of this Medieval trend toward writing proofs of the existence of God are worth stopping to sniff.
The first blossom is the famous William of Ockham (of “razor” fame) and his “anti-proof” of the existence of God. Ockham was a scholastic, writing in response to and in the same style and genre as Abelard, Aquinas, Scotus, and their ilk. But, when one read along and got to the bit where one would expect him to demonstrate his mastery of logic by proving the existence of God, he included instead a plea (paraphrase): Please, guys, stop writing proofs of the existence of God! Everyone believes in Him already anyway. If you keep writing these proofs, and then somebody proves your proof wrong by pointing out an error in your logic, reading the disproof might make people who didn’t doubt the existence of God start to doubt Him, because they would start to think the evidence for His Existence doesn’t hold up! Some will read into this Anti-Proof hints of the beginning of “God will not offer proof, He requires faith…” arguments, and perhaps it does play a role in the birth of that vein of thinking. (I say this very provisionally, because it is not my area, and I would want to do a lot of reading before saying anything firm.) My gut says, though, that it is more that Ockham thought everyone by nature believed in God, that God’s existence was so incredibly obvious, and that God was not trying to hide; rather, he didn’t want the weakness of fractious scholastic in-fighting to erode what he thought was already there in everyone: belief.
Aside: While we are on the subject of Ockham, a few words on his “razor”. Ockham is credited with the principle that the simplest explanation for a thing is most likely to be the correct one. That was not, in fact, a formula he put forward in anything like modern scientific terms. Rather, what we refer to as Ockham’s Razor is a distillation of his approach in a specific argument: Ockham hated the Aristotelian-Thomist model of cognition, i.e. the explanation of how sense perception and thoughts work. Hating it was fair, and anyone who has ever studied Aristotle and labored through the agent intellect, and the active intellect, and the passive intellect, and the will, and the phantasm, and innate ideas, and eternal Ideas, and forms, and categories, and potentialities, shares William of Ockham’s desire to pick Thomas Aquinas up and shake him until all the terminology falls out like loose change, and then tell him he’s only allowed to have a sensible number of incredibly technical terms (like 10; 10 would be a HUGE reduction!). Ockham proposed a new model of cognition which he set out to make much simpler, without most of the components posited by Aristotle and Aquinas, and introduced formal Nominalism. (Here Descartes cheers and sets off a little firecracker he’s been saving.) Nominalism is the idea that “concepts” are created by the mind based on sense experience, and exist ONLY in the mind (like furniture in a room, adds Sherlock Holmes) rather than in some immaterial external sense (like Platonic forms). Having vastly simplified and revolutionized cognition, Ockham then proceeded to describe the types of concepts, vocabulary terms and linguistic categories we use to refer to concepts in infuriating detail, inventing fifty jillion more technical terms than Aquinas ever used, and driving everyone who read him crazy.
(If you are ever transported to a dungeon where you have to fight great philosophers personified as Dungeons & Dragons monsters, the best weapon against Ockham is to grab his razor of +10 against unnecessary terminology and use it on the man himself). One takeaway note from this aside: while “Ockham’s Razor” is a popular rallying cry of modern (post-Darwin) atheism, and more broadly of modern rationalism, that is a modern usage entirely unrelated to the creator himself. He thought that the existence of God was so incredibly obvious, and necessary to explain so many things, from the existence of the universe to the buoyancy of cork, that if you presented him with the principle that the simplest explanation is usually best, he would agree, and happily assume that you believed, along with him, that “God” (being infinitely simple, see Plotinus and Aquinas) is therefore a far simpler answer to 10,000 technical scientific questions than 10,000 separate technical scientific answers. Like Machiavelli, Aristotle and many more, Ockham would have been utterly stunned (and, I think, more than a little scared) if he could have seen how his principles would be used later.
The second blossom (or perhaps thorn?) of this Medieval fad of proving God’s existence was, well, that Ockham was 110% correct. Here again I cite Alan Kors’ masterful Atheism in France; in short, his findings were that, when proving the existence of God became more and more popular as the first field test to make sure your logical system worked (à la metal detector… beep, beep, beep, yup, it’s working!), it created an incentive for competing logicians to attack people’s proofs of the existence of God (i.e. if it can’t find a giant lump of iron the size of a house, it’s not a very good metal detector, is it?). Thus believers spent centuries writing attacks on the existence of God, not because they doubted, but to prove their own mastery of Aristotelian logic superior to others’. This then generated thousands of pages of attacks on the existence of God, and, by a bizarre coincidence *cough*cough*, when, in the 17th and 18th centuries, we finally do start getting writings by actual overt “I really think there is no God!” atheists, they use many of the same arguments, which were waiting for them, readily available in volumes upon volumes of Church-generated books. Dogmatism here fed and enriched skepticism, much as skepticism has always fed and enriched dogmatism, in their ongoing and fruitful symbiosis.
The third blossom is, of course, sitting with us doling out eclairs. Impatient Descartes has been itching, ever since I mentioned Anselm, to leap in with his own Proof of the Existence of God, one which uses a more mature form of Ockham’s Nominalism, coupled with the tools of skepticism, especially doubt of the senses. But Descartes may not speak yet! (Don’t make that angry face at me, Monsieur, you’ll agree when you hear why.) It won’t be Descartes’ turn until we have reviewed a few more details, a little Renaissance and Reformation, and introduced you to Descartes’ great predecessor, the fertile plain on whom Descartes will erect his Cathedral. Smiling now, realizing that we draw near the Illustrious Father of Skeptics whom he has been waiting for, Descartes sits back content, until next time.
But do not fear, the wait will be short this time. Socrates is in more suspense than Descartes, and if I stop writing he’ll start demanding that I define “illustrious” or “next” or “man”, so I’d better plunge straight in. Meanwhile, I hope you will leave this little snapshot with the following takeaways:
Medieval thought was not dominated by the idea that logic and inquiry are bad and Blind Faith should rule; much more often, Medieval thinkers argued that logic and inquiry were wonderful because they could reinforce and explain faith, and protect people from error and eternal damnation. Medieval society threw tons of energy into the pursuit of knowledge (scientia, science); it’s just that they thought theology was 1000x more important than any other topic, so they concentrated their resources there.
When you see theologians discussing whether certain areas of knowledge are “beyond human knowledge” or “unknowable”, before you automatically call this a backwards and closed-minded attitude, remember that it comes from Plato, Epicurus and Aristotle, who tried to differentiate knowledge into areas that could be known with certainty, and areas where our sources (senses/logic) are unreliable, so there will always be doubt. The act of dividing certain from uncertain only becomes closed-minded when “that falls outside what can be known with certainty” becomes an excuse for telling the bright young questioner to shut up. This happened, but not always.
Even when there were not many philosophers we could call “skeptics” in the formal sense, and the great ancient skeptics were not being read much, skepticism continued to be a huge part of philosophy because the tools developed to combat it (Aristotle’s logical methods, for example) continued to be used, expanded and re-purposed in the ongoing search for certainty.
Welcome to a new feature here on Ex Urbe — the promoted comment.
From time to time, Ada makes a long substantive chewy comment, which could almost be its own post. Making it into an actual post would take valuable time. The comment is already written and fascinating — but hidden down in a comment thread where many people may not notice it. From now on, when this happens, I will extract it and promote it. I may even go back and do this with some older especially awesome comments. You’ll be able to tell the difference between this and a real post because it’ll say it’s posted by Bluejo and not by Exurbe, because it will say “a promoted comment”, and also because it won’t be full of beautiful relevant carefully selected art but will have just one or two pieces of much more random art.
I thoroughly enjoyed reading this new post. As I am reviewing macroeconomics, especially the different variations of the Solow Model, I cannot help but link “intellectual technology” with a specific endogenous growth model, which attempts to let the model itself generate technological growth without an exogenous “manna from heaven”. In this model, technology growth is expressed endogenously by the factor capital as “productive externalities”, and individual workers, through “learning by doing,” obtain more “skills” as the capital grows. Of course, the “technology factor” in the model I learned is vaguely defined and does not cover the many definitions and various effects of “intellectual technology” not directly related to economic production.
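The learning-by-doing variant alluded to here can be sketched in a line or two. (This formulation is my own illustrative choice, in the Arrow/Romer style; the comment does not specify which exact model is meant.)

```latex
% Learning-by-doing sketch: technology A is not exogenous "manna from
% heaven" but an externality of the aggregate capital stock K.
\[
  Y = K^{\alpha}\,(A L)^{1-\alpha},
  \qquad
  A = K^{\phi}, \quad \phi > 0,
\]
% so workers' "skills" (A) rise as capital accumulates, and
% technological growth is generated inside the model itself.
```

Here \(Y\) is output, \(K\) capital, \(L\) labor, \(\alpha\) the capital share, and \(\phi\) the strength of the productive externality.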
Your conversation with Michael reminds me of the lectures and seminars I took with you at Texas A&M. By the time I took your Intellectual History from Middle Ages to 17th Century, I had already taken some classes on philosophy. Sadly, my fellow philosophy students and I usually fell into anachronism and criticized early thinkers a bit “unfairly” on many issues. That is why your courses were like a beam of light to me, for I was never aware of the fact that we have different logic, concepts, and definitions of words from our predecessors and should hence put those thinkers back into their own historical context.
It seems to me that Prof. Peter E. Gordon’s essay “What is Intellectual History?” captures the different angles from which you and Michael construe Machiavelli: Michael seems more like a philosophy/political science student who attempts to examine how and why early thinkers’ ideas do or don’t work for our society based on our modern definitions, concepts, and logic, thus raising more debates on political philosophy and pushing the progress of philosophical innovation; your role as an intellectual historian requires one to be unattached from our own understanding of ideas and concepts and to be aware of even the logic that seems to be rooted in our subconscious, so as to examine a past thinker fairly without rash judgement. Michael is like the one who attempts to keep building the existing tower upward, while you are examining carefully the foundation below. For me personally, it would be nice to have both of these two different ways of thinking.
I have a question: I have been attempting to read a bit of Karl Marx whenever time allows. He argues that our thinking and ideology are a reflection of our material conditions. If we accept his point of view, would it be useful to connect intellectual history with economic history?
Nahua, I think you have hit it spot on with your discussion of Peter Gordon’s essay. When I worked with him at Harvard (I had the privilege of having him on my committee, as well as being his teaching assistant for a course) I remember being struck by how, even when we were teaching thinkers far outside my usual scope like Heidegger, I found his presentation of them welcoming and approachable despite my lack of background, because he approached them in the same context-focused way that I did, evaluating, not their correctness or their applicability to the present, but their roots in their contemporary historical contexts and the reasons why they believed what they believed.
For Marx’s comment that “our thinking and ideology are a reflection of our material conditions” I think it is often very useful to connect intellectual history with economic history, not in a strictly deterministic way, but by considering economic changes as major environmental or enabling factors that facilitate or deter intellectual change and/or the dissemination of new ideas. I already discussed the example of how I think the dissemination of feminism in the 19th century was greatly facilitated by the economic liberation of female labor because of the development of industrial cloth production, more efficient ways of doing laundry, cleaning, cooking etc. Ideas about female equality existed in antiquity. They enjoyed a large surge in conversation and support from the intellectual firebrands of the Enlightenment, through figures like Montesquieu, Voltaire and Wollstonecraft. But mass movements and substantial political changes, like female suffrage, came when the economic shift had occurred. To use the “intellectual technology” concept, the technology existed in antiquity and was revived and refined in the 18th century, but it required economic shifts as well to help reach a state when large portions of the population or whole nations/governments could embrace and employ it.
As I work on Renaissance history, I constantly feel the close relationship between economics and the intellectual world as well. Humanism as I understand it began when Petrarch called for a revival of antiquity. Economics comes into this in two ways. First, the reason he thought a revival of antiquity was so desperately necessary was that Italy had become so politically tumultuous and unstable, and was under such threat of cultural or literal invasion from France–these are the consequences, largely, of economic situations, since Italy’s development of banking and its central position as a trade hub for the Mediterranean had filled its small, vulnerable city-states with incomparable wealth, creating situations where powerful families could feud, small powers could hire large mercenary armies, and every king in Europe wanted to invade Italy for a piece of its plump pie. Then after Petrarch, humanism’s ability to spread and succeed was also economically linked. You can’t have a humanist without books, you just can’t, it’s about reading, studying, correcting and living the classics. But in an era when a book cost as much as a house, and more than a year’s salary for a young schoolmaster, a library required a staggering investment of capital. That required wealthy powers–families or governments–to value humanism and have the resources to spend on it. Powers like the Medici, and Florence’s Republican government, were convinced to spend their money on libraries and humanism because they believed it would bring them glory, strength, respect, legitimacy, the love of the people, that it would improve life, heal their souls, bring peace, and make their names ring in posterity, but they couldn’t have made the investment if they hadn’t had the money to invest, and they wouldn’t have believed humanism could yield so much if not for the particular (and particularly tumultuous) economic situation in which Renaissance Italy found itself.
Yesterday I found myself thinking about the history of the book in this light, and comparing it to some comments I heard a scientist make on a panel about space elevators. We all want a space elevator–then space exploration will become much, much less expensive, everyone can afford satellites, space-dependent technologies will become cheap, and we can have a Moon Base, and a Mars program, and all the space stations we want, and all our kids can have field trips to space (slight exaggeration). To have a space elevator, we need incredibly strong cables, probably produced using nanofibers. Developing nanofibers is expensive. What the engineer pointed out is that he has high hopes for nanofiber development, because nanofibers have the ideal demand pattern for a new technology. A new technology like this has the problem that, even if there are giant economic benefits to it later on, the people who pay for its development need a short-term return on that, which is difficult in the new baby stages of a technology when it’s at its most expensive. (Some of you may remember the West Wing episode where they debate the price of a cancer medication, arguing that producing each pill costs 5 cents so it’s unfair to charge more, to which the rebuttal is that the second pill cost 5 cents, but the first pill cost $300 million in research.) Once nanofiber production becomes cheap, absolutely it will be profitable, but while it’s still in the stage of costing $300 million to produce a few yards of thread, that’s a problem, and can be enough to keep a technology from getting support. One of the ways we work around this as a society today is the university system, which (through a form of patronage) supports researchers and gives them liberty to direct research toward avenues expected to be valuable independent of profit. Another is grant funding, which gives money based on arguments for the merit of a project without expecting to be paid back.
A third is NASA, which develops new technologies (like velcro, or pyrex) to achieve a particular project (Moon!), which are then used and reused in society for the benefit of all. But looking at just the private sector, at the odds of a technology getting funding from investors rather than non-profits, what the scientist said is that, for a technology to receive funding, you want it to have a big long-term application which will show that you’ll make a steady profit once you can make lots of the thing, but it also needs to have a short-term application for which a small number of clients will be prepared to pay an enormous amount, so you can sell it while it still costs $300 million, as well as expecting to sell it when it costs 5 cents. Nanofibers, he said, hit this sweet spot because of two demands. The first is body armor, since it looks like nanofibers can create bullet-proof fabric as light as normal fabric, and if we can do that then governments will certainly pay an enormous amount to get bullet-proof clothing for a head of state and his/her bodyguards, and for elite military applications. The second is super-high-end lightweight golf clubs, which may seem like a frivolous thing, but there are people who will pay thousands of dollars for an extremely high end golf club, and that is something nanofibers can profit from even while expensive (super lightweight bicycles for racing also qualify). So nanofibers can depend on the excitement of the specific investors who want the expensive version now, and through their patronage develop toward the ability to produce things cheaply.
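The fixed-versus-marginal-cost logic above can be made concrete with a toy calculation (a sketch with illustrative numbers of my own, reusing the West Wing post’s $300 million / 5 cents figures):

```python
# Toy illustration: why a technology that costs a fortune to develop but
# pennies to produce needs either enormous volume, or early clients
# willing to pay the early price.

def average_unit_cost(fixed_rd_cost: float, marginal_cost: float, units: int) -> float:
    """Average cost per unit once the one-time R&D cost is spread over production."""
    return (fixed_rd_cost + marginal_cost * units) / units

FIXED = 300_000_000.0   # the "first pill": one-time research cost
MARGINAL = 0.05         # the "second pill": cost of each additional unit

for units in (1, 1_000, 1_000_000, 1_000_000_000):
    cost = average_unit_cost(FIXED, MARGINAL, units)
    print(f"{units:>13,} units -> ${cost:,.2f} per unit")
```

At a billion units the average cost approaches the 5-cent marginal cost, but at the top of the curve someone (the head of state’s body armor, the golf clubs) has to be willing to pay something near the full research price.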
In this sense the history of the book, especially in the Renaissance, was very similar to the situation with nanofibers. In the early, manuscript stage when each new book cost the equivalent of $50,000 (very rough estimate), libraries were built and humanism was funded because wealthy people like Niccolo Niccoli and Cosimo de Medici believed that humanist libraries would give them and their home city political power and spiritual benefits, helping them toward Heaven. That convinced them to invest their millions. Their investments then created the libraries which could be used later on by larger populations, and reproduced cheaply through printing once it developed, but printing would not have developed if patrons like them weren’t around to create the demand for the volume of books printing could produce. It took Petrarch, Niccoli and Cosimo to fund a library which could raise a generation of people who could read the classics before there was enough demand to sell the 300-1500 copies of a classical book that a printing press could print. And, working within current capitalism, it may take governments who really want bullet-proof suit jackets to give us our space elevator, though universities, NASA, and private patronage of civilian space programs are certainly also big factors pushing us forward.
In sum, I would say that economics sometimes sparks the generation of new ideas–as the economically-driven strife Petrarch experienced enabled the birth of humanism–but it also strongly affects how easily or quickly a new idea can disseminate, whether it gets patronage and support, or whether its champions have to spread it without the support of elites, patrons or government. Thus, in any given era, an intellectual historian needs to have a sense of funding patterns and patronage systems, so we can understand how ideas travel, where, and why.
One more thought from last night, or rather a test comparison showing how the concept “intellectual technology” can work. I was thinking about comparing atomism and steel.
Steel is a precursor for building skyscrapers. Despite urban demand, we didn’t get a transition to huge, towering metropoles until the development of good steel which could raise our towers of glittering glass. Of course, steel is not the ONLY precursor of the skyscraper–it also requires tempered glass, etc. And it isn’t the only way to build skyscrapers, you can use titanium, or nanotech, but you are very unlikely to get either of those things without going through steel first. Having steel does not guarantee that your society will have skyscrapers. Ancient Rome had steel. In the Middle Ages Europe lost it (though pretty much everywhere except Europe still had steel). When steel came back in the Renaissance it still didn’t lead immediately to skyscrapers; it required many other developments first, and steel had to combine with other things, including social changes (growth of big cities). But when we look at the history of city development, studying steel is extremely important because the advent of steel-frame construction is a very important phase, and a central enabling factor for the development of modern cities.
My Lucretius book looks at the relationship between atomism and atheism in the same way that this analysis looks at steel and skyscrapers. Atomism was around for a long time, went away, came back, etc. And you can have non-atomic atheism, we have lots of it now. But atomism, as the first fully-developed mechanical model of the working of Nature (the first not dependent on God/gods to make the world work) was, in my opinion, one of the factors that you needed to combine with other developments to reach a situation in which an intellectual could combine mechanical models of nature with skepticism with other factors to develop the first fully functional atheistic model of the world. It’s one of the big factors we have to trace to ask “Why did atheism become a major interlocutor in the history of thought when it did, and not before or after?” just as tracing steel helps us answer “Why did skyscrapers start being built when they did?” There had almost certainly been atheisms before and independent of atomism (just as you can make really tall things, like pyramids or cliff-face cities, without steel-frame construction) but it was rare, and didn’t have the infrastructural repeatability necessary to let it become widespread. Modern atheists don’t use Epicurus, they more frequently use Darwin, just as modern skyscrapers use titanium, but the history of skyscrapers becomes clear when we study the history of steel. Just so, the history of atheism becomes much clearer when we study atomism. Of course, we now use steel for lots of things that aren’t skyscrapers (satellite approaching Pluto!), and similarly atomism has lots of non-atheist applications, but we associate atomism a lot with atheism, just as we think a lot about “towers of glass and steel” and usually think less about the steel bolts in our chairs or the steel spoons we eat with. 
All applications of steel, or Epicureanism, can be worth studying, but skyscrapers/atheism will never stop being one of the biggest and most interesting, at least in terms of how they changed the face of our modern world. And finally, while a minority of buildings are skyscrapers, and a minority of contemporary people are atheists, the study of both is broadly useful because the presence of both in the lives of everyone is a defining factor in our current world.
Hello, patient friends. The delight of brilliant and eager students, the siren call of a new university library, the massing threat of conjoining deadlines, and the thousand micro-tasks of moving across the country have caused a very long gap between posts. But I have several pieces of good news to share today, as well as new thoughts on Machiavelli:
The next installment of my Sketches of a History of Skepticism series is 2/3 finished, and I hope to have it up in a week or three, deadlines permitting.
I have an excellent new assistant named Mack Muldofsky, who is helping me with Ex Urbe, music, research and many other projects. So we have him to thank in a big way if the speed of my posting picks up this summer.
Because I have a lot of deadlines this summer, I have asked some friends to contribute guest entries here, and we have a few planned treating science, literature and history, so that’s something we can look forward to together.
For those following my music, the Sundown Kickstarter is complete, and it is now possible to order online the CD and DVD of my Norse Myth song cycle Sundown: Whispers of Ragnarok. In addition to the discs, you can also order two posters, one of my space exploration anthem “Somebody Will” and one which is a detailed map of the Norse mythological cosmos. CD sales go to supporting the costs of traveling to concerts.
I have several concerts and public events lined up for the summer:
At Mythcon (July 31-Aug 2), Lauren Schiller and I, performing as the duo “Sassafrass: Trickster and King”, will join Guest of Honor Jo Walton for “Norse Hour,” in which she will read Norse myth-themed poetry in alternation with our Norse-themed songs.
Sunday August 9th, I have been invited to do a reading of the freshly-polished opening chapters of my novel Too Like the Lightning (due out in Summer 2016) at the Tiptree Award Ceremony event honoring Jo Walton, who couldn’t make it to the initial ceremony but received the Tiptree this year for her novel My Real Children. The event is being held at Borderlands in San Francisco at 3 PM, and will feature readings by local authors, and music performed by myself and Lauren.
Monday August 17th, at 7 PM, I am joining Jo and Lauren again at Powell’s, where Jo will read from her books, Lauren and I will sing, and I will interview Jo and talk about my writing as well as hers.
Finally at Sasquan (Worldcon, Aug 19-23) Lauren and I will have a full concert, I will do another reading from Dogs of Peace, and I will be on several exciting panels.
Meanwhile, I have a little something to share here. I continue to receive frequent responses to my Machiavelli series, and recently one of them sparked such an interesting conversation in e-mail that I wanted to post it here, for others to enjoy and respond to. These are very raw thoughts, and I hope the discussion will gain more participants here in the comment thread (I have trimmed out parts not relevant to the discussion):
In this discussion, I use a term I often use when trying to introduce intellectual history as a concept, and which I have been meaning to write about here for some time, “Intellectual Technology.”
A little conversation about Machiavelli:
I have been reading your blog posts on Machiavelli. You write with tremendous learning, clarity and colour, and really bring past events alive in a brilliant way. But…….. I think you’re far too soft on Machiavelli!!!
I’m working on a PhD about him and it’s fascinating to see that nearly all present-day academics, and indeed academics during much of the second half of the 20th century, have a largely if not completely uncritical admiration for him and his works. He is lauded, for example as a forerunner of pluralism, and supporter of republicanism/democracy, yet his clear inspiration of Italian fascism is almost completely overlooked. The fact that Gramsci revered Machiavelli is dealt with by many scholars, but Mussolini’s admiration for him is hurriedly passed over.
Your post on Machiavelli and atheism is really interesting – in that context the 2013 book Machiavelli by Robert Black would be of interest to you…
Best regards, Michael Sanfey, IEP/UCP Lisbon.
Reply from Ada:
Michael, thank you for writing in to express your enjoyment of my blog posts. I think your criticisms of Machiavelli are interesting and largely fair, and my own opinions overlap with yours in many ways, though not in others. I agree with you completely that a lot of scholars tend to praise Machiavelli inappropriately as a proto-modern champion of democracy, republicanism, pluralism, modern national pride etc., all of which are deeply inappropriate and deeply presentist characterizations, reading anachronistic values back into him. But there is also a tendency, dominant earlier in the 20th century, to vilify Machiavelli too much in precisely the same anachronistic and presentist way, characterizing him as a fascist or a Nazi and reading back into his work the things that were done in the 20th century by people who used some of his ideas but mixed them with many others. My way of approaching Machiavelli focuses above all on trying to distance him from the present and place him in his context, to show that he is neither a modern hero nor a modern villain since he isn’t modern at all. A separate question, which you bring up, is how much to blame him or criticize him for opening up the direction of reasoning which led to later consequentialism, and also to fascism, which certainly used him as one of its foundational texts. Here I find myself uncomfortable with the idea of historical blame at all, particularly when it’s blame over such a long span of time.
I tend to think of thinkers as toolmakers, or inventors of “intellectual technology”, innovators who have created a new thing which can then be used by many people. New inventions can be used in many ways, both anticipatable and unanticipatable. Carbon steel, for example, can be used to raise great towers and send train lines across continents, or to build weapons and take lives, so it is a complex question how much to blame the inventor of carbon steel for its many uses. In this sense, I do believe we can see Machiavelli as a weapon-maker, since the ideas he was generating were directly intended to be used in war and politics. We can compare him very directly to the inventor of gunpowder in this sense. I also see him–and this is much of the heart of my critique–as a defensive weapon maker, i.e. someone working in a period of danger and siege trying to create something with which to defend his homeland. So, imagine now the inventor of gunpowder creating it to defend his homeland from an invasion. Is he responsible for all later uses of gunpowder as well? Is he guilty of criminal negligence for not thinking through the fact that long-term many more people will be killed by his invention than live in his home town? Do the lives taken by gunpowder throughout its history balance out against the lives saved in some kind of (Machiavellian/consequentialist) moral calculus? I don’t think “yes” or “no” are fair answers to such a complex question, but I do think it is important, when we think about Machiavelli and what to hold him responsible for, to remember the circumstances in which he created gunpowder (i.e. consequentialist ethics), and that he invented other great things too, like political science and critical historical reasoning. The debts are complicated, as is the culpability for how inventions are used after the inventor’s death.
So while I join you wholeheartedly in wanting to fight back against the distortion of Machiavelli the Mythical proto-modern Republican, I also think it’s valuable to battle against the myth of Machiavelli the proto-Fascist, and try to create a portrait of the real man as I see him, Machiavelli the frightened Florentine.
I do know Bob Black’s Machiavelli book, but disagree with some of his fundamental ideas about humanism itself – another fun topic, and one I enjoy discussing with him at conferences. He’s a challenging interlocutor. There is a very good recent paper by James Hankins on Academia.edu now about the “Virtue Politics” of humanists, which I recommend that you look at if you’re interested in responses to Black.
Best, Ada Palmer, University of Chicago
More from Michael:
First, I want to thank you for this fantastically detailed and brilliant response… I’d like to “come back at you” on consequentialism and some other points:
* Regarding your point about Machiavelli not being modern at all, I see what you mean, albeit you do say of Machiavelli in the post on atheism that “he is in other ways so very modern”. Leo Strauss certainly thought he had a lot to do with the introduction of what we know as “modernity”.
* When you seek to balance the need to fight against the Proto-republican myth and against the Proto-fascist myth, the first of those “myths” enjoys immeasurably wider currency than the second, and I ask myself, why is this?
* On the “intellectual technology” point below, and its being essentially neutral, in this case I wouldn’t agree with you, because we are not talking here about an object like gunpowder; it actually concerns something much more important. In ethical terms, Machiavelli took transcendent values out of the equation. As you put it, Machiavelli created “an ethics which works without God” – except that it doesn’t work!!!
* Machiavelli has had a questionable impact in regard to “realism” in International relations. You mention in one of the posts that he backed an alliance with Borgia so as to protect Florence, agreeing to offer money and resources to help Borgia conquer more – a very good example of Machiavelli‘s undoubted sympathy for imperialism.
PPS On the question of Machiavelli being an atheist or not, I really was fascinated by that part of your Ex Urbe writings. I’ve concluded that, whatever about him being an atheist or not, one could certainly describe him as “ungodly” – would you agree?
Quick response from Ada:
I think “ungodly” does work for Machiavelli depending on how you define it; it has a connotation of being immoral–which does not fit–but if instead you mean it literally as someone who makes his calculations without thinking much about the divine then it fits.
A supplementary comment on “Intellectual Technology”:
I find “intellectual technology” a very useful concept when I try to describe what I study. Broadly my work is “intellectual history” or “the history of ideas” but what I actually study is a bit more specific: how particular kinds of ideas come into existence, disseminate, and come to be regulated at different points in time. The types of ideas I investigate–atomism, determinism, utilitarianism–move through human culture very much the same way technological innovations do. They come into being in a specific place and time, as a result of a single inventor or collaboration. They spread from that point, but their spread is neither inevitable nor simple. Sometimes they are invented separately by independent people in independent places, and sometimes they exist for centuries before having a substantial impact. When a new idea enters a place and comes into common use, it completely changes the situation and makes actions or institutions which worked before no longer viable. I compare Machiavelli’s utilitarianism to gunpowder above, but here are some other examples of famous cases of technological inventions, and ideas which disseminated in similar patterns:
The Bicycle and Atomism
Leonardo da Vinci sketched a design for a bicycle in the Renaissance, and may have seriously tried to construct one, but afterward no one did so for a very long time. Then many other factors changed: the availability of rubber and light-weight strong metals, the growth of large, centralized cities and a working population in need of inexpensive transit, and suddenly the bicycle was able to combine with these other factors to revolutionize life and society in a huge rush, first across Europe and then well beyond. We have moved on to develop more complex technologies that achieve the same function, but we still use and develop the bicycle, and even where we don’t, cities would not have the shapes they do now without it, and it is still transforming parts of the world it has touched more slowly. Similarly atomism was developed and used for a little while, then languished in notebooks for a long time, before combining with the right factors to spread and rapidly transform society and culture.
The Unity of All Life and Calculus
Newton and Leibniz developed Calculus independently at the same time. Similarly, both classical Stoicism in Greece and Buddhism in India roughly simultaneously and independently, as far as we can tell, developed the idea that all living things–humans, insects, ancients, people not yet born–are, in fact, parts of one contiguous, interconnected, sacred living thing. This enormously rich and complex concept had a huge number of applications in each society, but seems to have been independently developed to meet the demands for metaphysical and emotional answers of societies at remarkably similar developmental stages. The circumstances were right, and the ideas then went on to be applied in vastly different but still similar ways.
Feminism and the Aztec Wheel
For a long time we thought the Aztecs didn’t have the wheel. More recently we discovered that they had children’s toys which used the wheel, but never developed it beyond that. Which means someone thought of it, and it disseminated a bit and was used in a very narrow way, but not developed further, because what we think of as more “advanced” or “industrial” applications (wagon, wheelbarrow) just weren’t compatible with the Aztec world, largely because it was incredibly hilly and didn’t have the elaborate road system Europe developed, relying instead on human legs, stairs, and raw terrain, which were sufficient to let it develop a robust and complex economy and empire of its own. (The wheel became more useful in the Americas when European-style city plans and roads were built.) Similarly Plato voiced feminism in his Republic, arguing that women and men were fundamentally interchangeable if educated the same way, and people who read the Republic discussed it as a theory among many other elements of the book, but didn’t develop it further. Again, I would argue, this was at least in part because the economic and social structures of the classical world depended on the gendered division of labor, particularly for the production of thread in the absence of advanced spinning technology, which is why literally all women in Rome spent tons of time spinning–spinning quotas were even sometimes required of prostitutes by law, since if a substantial sliver of the female population was employed without spinning, Rome would run out of cloth. Feminism was better able to become revolutionary in Europe when (among other changes) industrialization reduced the number of hours required for the maintenance of a household and the production of cloth, making it more practical to redirect female labor, and to question why it had been locked into that in the first place.
In sum, there is a concreteness to the ideas whose movements I study, a distinct and recognizable traceability. Interpretive analyses, comparative, subjective analyses, analyses of technique, aesthetics, authorial intent, authenticity, such analyses are excellent, but they aren’t intellectual history as I practice and teach it. I trace intellectual technology. Just as the gun, or carbon steel, or the moldboard plow came in at a particular time and had an impact, I study particular ideas whose dissemination changed what it was possible for human beings to do, and what shapes human society can be. It is meaningful to talk about being at an “intellectual tech level” or at least about being pre- or post- a particular piece of intellectual technology (progress, utilitarianism, the scientific method) just as much as we can talk about being pre- or post-computer, gunpowder, or bronze. Such things cannot be un-invented once they disseminate through a society, though some societies regulate or restrict them, and they can be lost, or spend a long time hidden, or undeveloped. Elites often have a legal or practical monopoly on some (intellectual) technologies, but nothing can stop things from sometimes getting into the hands or minds of the poor or the oppressed. Sometimes historians are sure a piece of (intellectual) technology was present because we have direct records of it: a surviving example, a reference, a drawing, something which was obviously made with it. Other times we have only secondary evidence (they were farming X crop which, as far as we know, probably requires the moldboard plow; they described a strange kind of unknown weapon which we think means gun; they were discussing heretics of a particular sort which seems to have involved denial of Providence).
I realize that it would be easy to read my use of “intellectual technology” as an attempt to climb on the pro-science-and-engineering bandwagon, presenting intellectual history as quasi-hard-science, much as we joke that if poets started calling themselves “syllabic engineers” they would suddenly be paid more. But it isn’t a term I’m advocating as a label, necessarily. It’s a term I use for thinking, a semantic tool for describing the specific type of idea history I practice, and linking together my different interests into a coherent whole. When I spell out what I’m working on right now as an historian, it’s actually a rather incoherent list: “the history of atheism, atomic science, skepticism, Platonic and Stoic theology, soul theory, homosexuality, theodicy, witchcraft, gender construction, saints and heavenly politics, Viking metaphysics, the Inquisition, utilitarianism, humanist self-fashioning, and what Renaissance people imagined ancient Rome was like. And if you give me an hour, I can sort-of explain what those things have to do with each other.” Or I can say, “I study how particularly controversial pieces of new intellectual technology come into being and spread over time.”
In that light, then, we can think of Machiavelli as the inventor of a piece of intellectual technology, or rather of several pieces of intellectual technology, since consequential ethics is one, but his new method of historical analysis (political science) is another. We might compare him to someone who invented both the gun and the calculator. How do we feel about that contribution? Positive? Negative? Critical? Celebratory? I think the only universal answer is: we feel strongly.
On the one hand, I have been looking forward for ages to reading and then writing something about “The Litany of Earth,” an amazing novelette by Ruthanna Emrys, acquired for Tor.com by editor Carl Engle-Laird. But on the other hand I personally usually dislike reading reviews, at least traditional reviews of things I have already decided to read. When a reviewer tells me about what I’m going to experience and what excellent things the author is going to do, it disrupts the reading process for me, makes the things mentioned in the review stand out too boldly, interfering with the craftsmanship of a good story in which the author has taken great pains to give each beat just the right amount of emphasis, no more, no less. The memory of the review in my mind makes it like a used book which someone has gone through with highlighter, which can be fascinating as a window on a fellow reader, and delightful for a reread, but it isn’t what I want on first meeting a new text, which in my ideal world consists of me, the reader, placing myself wholly and directly in the hands of the author, with the editor’s touch there too to help spot us along the way. I do not need a co-pilot. And it is more of a problem, for me at least, with short fiction than with long fiction since the review could be half as long as the story and weigh me down with nearly as much weight as the whole thing carries. So, today I have set myself the challenge of writing a review, or non-review, of “The Litany of Earth” that isn’t a co-pilot, or a highlighter, and does as much as possible to get across the story’s strengths and the power of the reading experience while doing my best not to change the relative weight of anything in the story, make anything jump out too boldly, leaving the craftsmanship as untouched as it can be.
I have a six-step plan. (Personal rule: anything with three or more steps counts as a plan. Also, “Profit” is not a step, it’s an outcome, and does not count toward your total of three.)
1. Recommend you go read “The Litany of Earth” now before I can spoil anything.
2. Talk amorphously about things the story is doing with structure and world-canon, talking more concretely about a few other pieces of fiction that have done somewhat similar things.
3. Ramble about Petrarch.
4. Ramble about Diderot. Dear, dear Diderot…
5. Urge you to read “The Litany of Earth” again, last chance before I get out my highlighter.
6. Talk about “The Litany of Earth” directly.
Step One: I strongly recommend that you go read “The Litany of Earth” right now. It’s free online, and if you read it now you won’t be stuck with an intrusive co-pilot even if I do fail in today’s challenge of writing a non-review.
Step Two: Talk amorphously, and compare the story to other works of fiction.
One of the unique literary assets of current fiction is the proliferation of familiar but elaborate and thoroughly developed fictional worlds which authors can step into and use for new purposes. There have always been such worlds as long as there has been literature. Arthuriana is my favorite pre-modern example, a complex and well-populated world rich with explorable relationships and flexible metaphysics ready to be elaborated upon and repurposed. Geoffrey of Monmouth and Thomas Malory and Petrarch and Ariosto and the traditional artists in Naples who decorated (and still decorate) street vendor wagons with Arthur’s knights each repurposed Arthuriana just like Marion Zimmer Bradley and Monty Python and Gargoyles and Heather Dale and Babylon 5 and the endlessly hilarious antics of the BBC’s Merlin. Each of the later authors in the genealogy has taken advantage not only of the plot, setting, and characters but also of the genre expectations readers bring with them.
In the early 1500s when Ariosto began his chivalric and slightly-Arthurian verse epic Orlando Furioso he took advantage of the fact that readers already associated the topic with epic works and grand tourneys and knights and ladies and courtly-love adultery, baggage which let him write a massive and endless rambling snarl of disjointed and fantastic adventurousness so unwieldy that traditional epic structure is to Orlando Furioso as a sturdy rope is to the unassailable rat’s nest of broken headphones and cables for forgotten electronics that I just fished out of this bottom drawer. No reader, not even in 1516, would put up with it without the promise of Arthurian grandeur to make its massive scale feel appropriate. (I will also argue that the BBC Merlin, for all its tomatoes and giant scorpions, has not actually done anything quite so unreasonable as the point when Ariosto has “Saint Merlin” rise from his tomb to deliver an endless rambling prophecy about how awesome Ariosto’s boss Ippolito d’Este is going to be. Fan service long predates the printing press.) In a more recent continuation of this tradition, modern Arthurian adaptations have given us the previously-silenced P.O.V.s of women, of villains, of third-tier characters, and in some sense it’s quite modern to think about P.O.V. at all. But even very old adaptations take advantage of how not just setting but genre is an asset usable to get the reader to follow the author to places a reader might not normally be willing to go. And, of course, in more recent versions authors have taken advantage of exploring silenced P.O.V.s to critique earlier Arthurian works and their blind spots, as a way of reaching the broader blindnesses and silencings of the past stages of our own society that birthed these worlds.
“Is ‘The Litany of Earth’ Arthuriana?” you may wonder. No. It uses a different mythos. I bring up Arthuriana in order to remind you of the many great things you’ve seen humans create by using and reusing a familiar collective fiction, and in order to reinforce my earlier claim that one of the great assets of current fiction is that we have many, many such worlds. If pre-modern Earth had several dozen rich, lively, reusable mythoi and epic settings, the 20th century has added many, many more in which good (and campy) things have been and can be done. Star Trek, Sherlock Holmes, Gundam, the massive united comics universes of Marvel and DC, these each provide as much complexity and material for reuse and reframing as the richest ancient epics, more if, for example, you compare the countless thousands of pages of surviving X-Men to the fragile little Penguin Classics collections of Eddas and fragmentary sagas which preserve what little we still have of the Norse mythic cosmos. Marvel’s universe, and DC’s too, have a fuller population and a more elaborate and eventful history than any mythos we have inherited from antiquity, and my own facetious in-character reviews of the Marvel movies are but the shallowest tip of what can be done with it.
The specific case of this kind of rich reuse whose parallels to “The Litany of Earth” are what brought me down this line of analysis comes from the Marvel comics megaverse: the unique and skinny stand-alone Marvels, by Kurt Busiek, illustrated by Alex Ross. What it does with the narrative possibilities of the Marvel universe is very much worth looking at even if one doesn’t care a jot about comics.
Described from the outside and ignoring, for a moment, that these are comic books, the Marvel universe presents us with an Earth-like alternate history in which disasters–supernatural, alien, primordial, divine–have repeatedly threatened Earth, the universe, and, most often, New York City with certain destruction. These have been repeatedly repelled by superheroes, somewhat human somewhat not, and the P.O.V. from which we the reader have always viewed these events has been as one of the superpeople at the heart of the battle, deeply enmeshed in the passionate immediacy of the short-term drama, nemeses, kidnappings, personal backstory, and who’s dead lately. Only rarely have we had works that gave us a longer perspective over time, reflecting personal change, evolving perspectives, how being constantly enmeshed in superbusiness makes a person develop and self-reflect, though notably the works that have done so have been among superhero comics’ shining stars (Dark Knight Returns, Red Son, Watchmen).
Marvels instead offers a long-term and distanced P.O.V., that of a photographer who lives in New York City and, during his path from rookie to retirement, experiences in order the great, visible cataclysms that have repeatedly shaken Marvel’s Earth. His perspective gives historicity, sentiment, reflection and above all realism to Marvel, using it as alternate history rather than an action setting. The effect is powerful, beautiful and highly recommended for the way it weaves the richness of Marvel’s setting together with good writing to create a truly valuable work of literature. But it also reverses an interesting silencing which has been present in the back of Marvel, and superhero comics, since their inception: the silencing of the Public.
Very much like the women in early versions of Arthuriana, the Public in Marvel (and DC) has not been an agent in itself, but an object to motivate the hero. The Public exists to be rescued, protected, placated, evaded, sometimes feared. The Public has cheered P.O.V. heroes, hounded them, betrayed them, threatened them with pitchforks and torches, somehow being tricked over and over again into doubting the heroes even after the last seventeen times they were exonerated. The Marvel Public specifically also persistently hates and fears the X-Men and other mutants despite being saved by them sixteen jillion times, and somehow hates and fears the other heroes less even though many of them are aliens or science freaks or robots or other things just as weird as mutants. It is a tool of the author, manipulated by villains, oppressing misfits, causing tension, but virtually never is the reader asked to empathize with the Public. The object of empathy is the hero, or occasionally the villain, but the reader is never supposed to identify with or even think about the emotions of the screaming and yet simultaneously silenced mob. Marvels gives us, at last, the point of view of that mob, or at least one member of it, directing our self-identification and above all our empathy for the first time to something which has been hitherto faceless.
The effect is rather like a stroll through the Uffizi enjoying endless scenes of exciting saints surrounded by choruses of beautiful angels and then hitting the Botticelli room where each angel has a distinctive face and personality and you find yourself wondering what that angel is thinking when it watches Mary come to heaven to be crowned its queen, or sings music for young John the Baptist whose grisly end and subsequent heavenly ascension the angel already knows. Only when Botticelli invites you to see the angels as individuals do you realize that no earlier painting ever did. They had a failure of empathy. They were still beautiful, but here is a rich new direction for empathy which no earlier work has asked us to consider, and which opens up a huge arena we had ignored. Women in Arthuriana; the Public in Marvel; the angels that stand around in paintings of saints.
In just the same way, “The Litany of Earth” uses empathy and P.O.V. to open rich new arenas in one of our other well-known modern fictional settings. And the setting it uses has a fundamental and very problematic failure of empathy rooted deep in its foundations, so addressing that head-on opens a very potent door.
And since I can feel the urge to talk about Naoki Urasawa’s Pluto becoming harder to resist, I believe it is now time to nip that in the bud by moving on to the next stage of my plan.
Step Three: Ramble about Petrarch.
Picture Petrarch in his library, holding his Homer. He has just received it, and turns the stiff vellum pages slowly, his fingertips brushing the precious verses that he has dreamed of since his boyhood. The Iliad in his hands. His friends have always whispered to him of the genius that was Homer, his real friends, not the shortsighted fools he grew up with in Avignon, arrogant Frenchmen and slavish Italians like his parents who followed the papacy and its trail of gold even when France snatched it away from Rome. His real friends are long-dead Romans: Cicero, Seneca, Caesar, men like him who love learning, love virtue, love literature, love Rome and Italy enough to fight and give their lives for it, love truth and excellence enough to write of it with passion and powerful words that sting the reader into wanting to become a better person.
Petrarch was born in exile. Not just the geographic exile of his family from their Florentine homeland, no, something deeper. An exile in time. This world has no one he can relate to, no one whose thoughts are shaped like his, who walks the Roman roads and feels the flowing currents of the Empire, whose understanding of the world connects from Egypt up to Britain without being blinded by ephemeral borders, who can name the Muses and knows how truly rich it is to taste the arts of all nine, and how truly poor one is without. Antiquity was his native time, he knows it, but antiquity was cut off too early–he was born too late. His friends are dead, but their voices live, a few, in chunks, in the books in distant libraries which he has spent his life and fortune gathering. His library. Each volume a new shard of a missing friend, those few, battered whispers of ancient voices which survived the Medieval cataclysm that consumed so much. And now, after hearing so many of his friends speak of Homer, call him the Prince of Poets, the climax of all art and literature, divine epic, the centerpiece of all the ancient world, he has it in his hands. It survived. Homer. In Greek. And he can’t read it. Not a word of it. Greek is gone. No one can read it anymore, no one. Homer. He has it in his hand, but he can’t read it, and for all he knows no one ever will again.
This historical moment, Petrarch with his Homer, is one of the most poignant I have ever met in my scholarship. A portrait of discontinuity. The pain when the chain of cultural transmission, of old hands grasping young, that should connect past, present and future is cut off. The cataclysm doesn’t have to be complete to be enough to disrupt, to silence, to jumble, to leave too little, Greek without Homer, Homer without Greek. Petrarch is a Roman. They all are, he and his Renaissance Italians, they have the blood of the Romans, the lands of the Romans, the ruins of the Romans, but not enough for Petrarch to ever really have the life he might have had if he’d been born in the generation after Cicero, and with his Homer in his hands he knows it.
Petrarch did his best. He spent his life collecting the books of the ancients, trying to reassemble the Library of Alexandria, the pinnacle, he knew, of the culture and education which had made the Romans who had made his world. He found many shards, eventually enough that it took more than ten mules to carry his library when he journeyed from city to city. He journeyed much, working everywhere with voice and pen to convince others to share his passion for antiquity, to read the ancients that could be read, Cicero, Seneca, to learn to think as they did and to try to push this world to be Roman again, which for him meant peaceful, broad-reaching, stable, cultured and strong. People listened, and we have the libraries and cathedrals and Michelangelos they made in answer. And Petrarch never gave up on Homer either, but searched the far corners of the Earth for someone with a hint of Greek and eventually, late in life, did find someone to make a jumbled, fragmentary translation, nothing close to what a second-year-Greek student could produce today let alone a fluid translation, but a taste. By late in life he had his New Library of Alexandria, and real hope that it might rear new Romans.
Petrarch wanted to give the library to Florence, to help his homeland make itself the new Rome, but Florence was too caught up with its own faction fighting for anyone to stably take it. Venice was the taker in the end, and he hoped his library would make the great port city like the Alexandria of old, the hub where all books came, and multiplied, and spread. Venice put Petrarch’s library in a humid warehouse and let it rot. We lost it. We lost it again. We lost it the first time because of Vandals and corrupt emperors and economic transformation and plague and all the other factors that conspired to make the Roman Empire decline and fall, but we lost it the second time because Venice is humid and no one cared enough to devote space and expense to a library, even the famous collection of the famous Petrarch. Such a tiny cataclysm, but enough to make discontinuity again. We have learned better since. Petrarch had followers who formed new libraries, Poggio, Niccolo, they repeated Petrarch’s effort, finding books. Eventually princes and governments realized there was power in knowledge. Venice built the Marciana library right at the main landing, so when foreigners arrive in St. Mark’s square they are surrounded by the three facets of power, State in the Doge’s Palace, Church in the Basilica, and Knowledge in the Library. And now we have our Penguin Classics. But we don’t have Petrarch’s library, and we know he had things that were rare, originals, transcriptions of things later lost. There are ancients who made it as far as Petrarch, all the way to the late 1300s, through Vandals, Mongols and the Black Death, before we lost them to one short-sighted disaster. Discontinuity. We have Homer. We don’t know what Petrarch had that we don’t.
This was one of two historical vignettes that came vividly before my mind while I was reading “The Litany of Earth.” The second is…
Step Four: Ramble about Diderot. Dear, dear Diderot…
I must be very careful here. Even though my focus is Renaissance and my native habitat F&SF, Denis Diderot remains my favorite author. Period. My favorite in the history of words. So it is very easy for me to linger too long. But I invoke him today for a very specific reason and shall confine myself strictly to one circumscribed subtopic, however hard the copy of Rameau’s Nephew on my desk stares back.
Three quarters of the way through my survey course on the history of Western thought, I start a lecture by declaring that the Enlightenment Encyclopedia project was the single noblest undertaking in the history of human civilization. I say it because of the defiant, “bring it on!” glances I instantly get from the students, who switch at once from passive listening to critical judgment as they arm themselves with the noblest human undertakings they can think of, and gear up to see if I can follow through on my bold boast. I want that. I want their minds to be full of the Moon Landing, and the Spartans at Thermopylae, and Gandhi, and the US Declaration of Independence, and Mother Teresa, and the Polynesians who braved the infinite Pacific in their tiny log boats; I want it all in their minds’ eyes as I begin.
The Encyclopédie was the life’s work of a century on fire. The newborn concept Progress had taken flight, convincing France and Europe that the human species has the power to change the world instead of just enduring it, that we can fight back against disease, and cold, and mountain crags, and famine cycles, and time, and make each generation’s experience on this Earth a little better. The lion has its claws and strength, the serpent fangs and stealth, the great whales the force of the leviathan, but humans have Reason, and empiricism, and language to let us collaborate, discuss, examine, challenge, and form communities of scientists and thinkers who, like the honeybee, will gather the best fruits of nature and, processing them with our own inborn gifts, produce something good and sweet and useful for the world. The tone here is Francis Bacon’s, but Voltaire popularized it, and by now the fresh passion for collaboration and improvement of the human world had already birthed Descartes’ mathematics, Newton’s optics, Locke’s inalienable rights, calculus, and the Latitudinarian movements toward rational religion which seemed they might finally soothe away the wars that lingered from the Reformation. Everything could be improved if keen minds applied reason to it, from treatments for smallpox which could be preventative instead of palliative, to Europe’s law codes which were not rational constructions but mongrel accumulations of tradition and centuries-old legislation passed during half-forgotten crises and old power struggles whose purpose died with the clans and dynasties that made them but which still had the power to condemn a feeling, thinking person to torture and death.
The Encyclopédie had many purposes. Perhaps the least ambitious was to turn every citizen of Earth into a honeybee. Plato had said that only a tiny sliver of human souls were truly guided by reason–able to become Philosopher Kings–while the vast majority were inexorably dominated by base appetites, the daily dose of food and rest and lust, or by the wild but selfish passions of ambition and pride. For two millennia all had agreed, and even when the Renaissance boasted that human souls could rival angels in dignity and glory through the light of learning and the power of Reason, they meant the souls of a tiny, literate elite. But in 1689 John Locke had argued that humans are born blank slates, and nurture rather than an innate disposition of the soul separated young Newton from his father’s stable boy. The Encyclopédie set out to enable universal education, to collect basic knowledge of all subjects in a form accessible to every literate person, and to their illiterate friends who crowded around to hear new chapters read aloud in the heady excitement of its first release. With such an education, everyone could be a honeybee of Progress, and exponential acceleration in discovery and social improvement would birth a better world. So overwhelming was public demand that Europe ran out of paper, of printer’s ink, even ran out of the types of metal needed to make printing presses, so many new print shops appeared to plagiarize and print and sell more and more copies of the book which promised such a future (See F. A. Kafker, “The Recruitment of the Encyclopedists”).
Yet Diderot and his compatriots had another goal which shows itself in the structure of the Encyclopédie as well as in its bold opening essay. The second half of the 17-volume series is devoted to visual material, a series of beautiful and immensely complicated technical plates which illustrate technology and science. How to fire china dishes, smelt ore, weave rope, irrigate fields, construct ships, calculate distance, catalog fossils and decorate carriages, all are illustrated in loving detail, with diagrams of every tool and its use, every factory and its layout, every human body at work in some complex motion necessary to turn cotton into cloth or rag into precious paper. With this half of the Encyclopédie it is possible to teach oneself every technological achievement of the age. The first half was intended to provide the same for thought. With its essays it should be possible to understand from their roots the philosophies, ethical systems, law codes, customs, religions, great thinkers of the past and present, all aspects of life and the history of humankind’s evolving mental world. It is a snapshot. A time capsule. With this–Diderot smiles thinking it–with this, if a new Dark Age fell upon humanity and but a single copy of the Encyclopédie survived, it would be possible to reconstruct all human progress. With this, the great steps forward, the hard-earned produce of so many lives, the Spartans at Thermopylae, the Polynesian log boats, will be safe forever. We can’t fall back into the dark again. With this, human achievement is immortal. Yes, Petrarch, it even details how to read, and print, and translate Greek.
Let’s linger on that thought a moment. A beautiful, unifying, optimistic, safe, human moment, warm, like when I first heard that, yes, eventually Petrarch did get to read a sliver of his Homer. Because I’m not going to keep talking about dear Diderot today, much as I would like to.
In 2012/13 we lost 170,000 volumes from the Egyptian Scientific Institute in Cairo to the revolution, 20,000 unique manuscripts in a Timbuktu library to a militia fire, and we have barely begun to count the masses of original scientific material burned during a corrupt, botched cost-saving effort to reduce the size of the Libraries of Fisheries and Oceans of Canada. More than half of the entries on Wikipedia’s list of destroyed libraries were destroyed after the printing of the Encyclopédie, and the libraries on the list are only a minuscule fraction of the texts lost to disasters, natural and manmade. It doesn’t even list Petrarch’s library, let alone the unique contents of the personal libraries and works that accumulate in every house now that we’re all honeybees. Diderot tried so hard to make it all immortal. He tried so hard he used up all the ink and paper in the world. Yet if my numbers for printing history are right, in the past half century we have destroyed more written material than had been produced in the cumulative history of the Earth up until Diderot’s day. And that does not count World Wars. We’re getting better. On February 14th, 2014, a fire at the British National Archives threatening thousands of documents, many centuries old, was successfully quenched with no damage to the collection, thanks substantially to advances in our understandings of fluids and pressure made in the 17th and 18th centuries and neatly explained by the Encyclopédie. That much is indeed immortal (thank you, Diderot!), but “much” is so very far from “everything.” It’s still so easy to make mistakes.
One of the most powerful mistakes, for me, is this cenotaph monument of Diderot, in the Pantheon in Paris, celebrating his contributions and how the Encyclopedia and enlightenment enabled so much of the liberty and rights and change that defines our era. Voltaire’s tomb was moved to the Pantheon, Rousseau’s too, but for Diderot there is only this empty cenotaph. I went on a little pilgrimage once to visit Diderot in the out-of-the-way Church of Saint-Roch, where he was buried. There is no tomb to visit. During the French Revolution, Saint-Roch was attacked and mostly destroyed by revolutionaries (carrying banners with Encyclopedist slogans on them!) who, in their zeal to torch the old regime, forgot that their own Diderot was among the Catholic trappings they could only see as symbols of oppression. Once rage and zeal had died down Paris and all France much lamented the mistake, and many others, too late.
Did I mention we very nearly lost Diderot’s work too? A far more frightening loss than just his body. Diderot didn’t include himself, his own precious original intellectual contributions, in his Encyclopédie. He knew he couldn’t. He was an atheist, you see. A real one, not one of these people we suspect like Hobbes and Machiavelli, but an overt atheist who wrote powerful, deeply speculative books trying to hash out the first moral system without divinity in it, fledgling works of an intellectual tradition which was just then being born, since even a few decades earlier no one had dared set pen to paper, for fear of social exile and the ready fire and steel of Church and law. But Diderot didn’t publish his own works, not even anonymously. He self-censored. He was the figurehead of the Encyclopédie. An atheist was too frightening back then, too strange, too other. If people had known an atheist was part of it, the project would have been dead in the water. Diderot left instructions for future generations to print his works someday, if the manuscripts survived, but gambling with his own legacy was a price he was willing to pay to immortalize everyone else’s. The surviving manuscript of Rameau’s Nephew in Diderot’s own hand turned up by chance in a used bookshop in 1823, one chance street fire away from silence.
Step Five: Urge you to read “The Litany of Earth” again. Here you get points if you read it before getting this far. It’s free on Tor.com, but if you really liked it you can also buy the ebook for a dollar, and give money to Ruthanna and to Tor, and tell them you like excellent original fiction that does brave things with race and historicity.
Step Six: Talk about “The Litany of Earth” directly.
This is a Cthulhu Mythos story which is in no way horror. The richly-designed, populated metaphysics and macrohistorical narrative of Lovecraft’s universe is here, but as a tool for reflection on society and self, with a narrative that bears no resemblance to the classic tense and chilling horror short stories I (for some reason) enjoy as bedtime reading. Ruthanna Emrys uses Lovecraft’s world to comment on Lovecraft’s writing and the deeply ingrained sexism and especially racism that saturate it, repurposing that world into a tool to make us think more about the effects of silencing and othering which Lovecraft used his skill and craftsmanship to lure us into participating in. But the message and questions are universal enough that the target audience is not Lovecraft readers or horror readers but any reader who has even a vague distant awareness that the Lovecraft Mythos is a thing, as one has a vague distant awareness of Celtic or Navajo mythology even if one doesn’t study them. If there is any horror in this story, it is the familiar reality that the things we make and do and are are perishable, that human action often worsens that, and that at the end of all our aeons and equations we face entropy. But rather than presuming (as Lovecraft and much horror does) that facing that will lead to mad cackling and gibberish, the story presents the real things we do to try to face it: spirituality, cultural identity, and the effort to preserve the past and transmit it to the future. It turns a setting which was created as a vehicle for horror into a vehicle for social commentary and historical reflection.
I suppose I should directly address Lovecraft’s failures of empathy, for those less familiar with his work, or who have met it mainly through its fun, recent iterations in board games and reuses which strive to leave behind the baggage. Racism, sexism, classism and other uncomfortable attitudes are not unexpected in an author who lived from 1890 to 1937. We encounter unpalatable depictions of people of color, and equally unpalatable valorizations of entrenched elites, in most literature of the period, from M. R. James to the original Sherlock Holmes. In Lovecraft’s case, the challenge for those who want to continue to work with his universe is that many of the racist and classist elements are worked deeply into the fabric of his worldbuilding. Many of his frightening inhuman races are clearly used to explore his fear of racial minorities, while the keys to battling evil are reserved for elites, like the affluent, white, male scholars who control his libraries, and the Great Race which controls the greatest library.
While many attempts to rehabilitate and use Lovecraft’s world do so by excising these elements, or minimizing them, or balancing them out by letting you play ethnically diverse characters in a Lovecraft game, this story instead uses those very elements as weapons against the kinds of attitudes which birthed them. If the scary fish-people represent a demonized racial “other” then let them remain exactly that, and show them suffering what targeted minorities have suffered in historical reality. By reversing the point of view and placing the reader within the perspective of the “other”, the original failure of empathy is transformed into a triumph of empathy. Now we are in the place of a woman for whom Lovecraft’s spooky cult rituals are her Passover or Easter, the mysterious symbols her alphabet, “Iä, Cthulhu . . . ” is the comforting prayer she thinks to herself when terrified, and a Necronomicon on Charlie’s shelf is Petrarch’s Homer.
And we aren’t asked to empathize with only one group. We empathize with those deprived of education, in the form of Aphra’s brother Caleb, taking on the classist negative depictions of “degenerate” white rural families common in Lovecraft’s work. With the plight of the Jews and other groups targeted in Germany, invoked by Specter’s discussion of his aunt. With those facing physical and medical challenges, invoked in the powerful opening lines where Aphra describes the pleasure she finds in facing the daily difficulty of walking uphill while she slowly heals. And with women, rarely granted any remotely coequal agency in literature of Lovecraft’s era. Not only is this story a powerful triumph of empathy, but after reading it, whenever we reread original Lovecraft, or anything set in his world, the memory of Aphra Marsh and her tender prayer will forever change the meaning of “Iä, iä, Cthulhu thtagn…” The triumph of empathy diffuses past the boundaries of this story, to enrich our future reading.
Another striking facet is that this is a story about legacy, continuity and deep history that manages to address those questions using only very recent history. Usually stories that want to talk about the deep past use material from periods we associate with the deep past: medieval, Roman Empire, Renaissance, Inuits, Minoans, anything we associate with dusty manuscripts and archaeology and anthropology and old culture. Even I in this entry, when trying to evoke the themes and feelings of this story, went back centuries and consequently had to spend a lot of time explaining to the reader the history I’m talking about (what’s Petrarch’s Homer, what’s up with Diderot, etc.) before I could get to what I wanted to do with it. This story instead uses contemporary history, events so recent and familiar that we all know them already, and have seen their direct effects in those around us and in ourselves, or have tried not to see said effects. As a result, the story doesn’t have the baggage of having to explain its history. Instead of needing footnotes and exposition, it touches us directly and personally with our own history and makes us directly face the fact that we too are part of the link of transmission attempting to connect past to future, and our actions can still heal or harm it just as much as the Visigoths, the Black Death or the Encyclopédie did. The use of modern history makes it impossible for us to distance ourselves, greatly enhancing its power.
I have already discussed, in my own roundabout way using Diderot and Petrarch and Marvel comics, many of the key themes which make this story so powerful: othering, empathy, reversal of point of view, legacy, silencing, translation and transmission, and discontinuity, how easy it is for the powerful engine of society to make mistakes that cut the precious thread. The power with which this story is able to present that theme demonstrates perfectly, for me, the potency of genre fiction as a tool, not for escapism or entertainment, but for depicting reality and history. The tragic discontinuities created by World War II, the destruction of life, education and cultural inheritance generated not only by the most gruesome facets of the war but also by great mistakes like the treatment of Japanese Americans, are difficult to communicate in full with such accurate but emotionless descriptive phrases as, “people were rounded up and held in prison camps.” Attempts to communicate the genuine human impact of such an event easily fall short. We try hard, but often fail. As a teacher, I remember well the flurry of discussion which surrounded some High School history textbooks which, in their efforts to do justice to the often-silenced story of interned Japanese Americans, had a longer section about that than they did about the rest of the war. Opponents of political correctness used it as a talking point to rail against liberalism gone too far, while apologists focused on the harm done by silencing the events. Yet for me, the centerpiece was the fact that textbooks had to devote that much space to attempting to get the issue across and still largely failed to communicate the event in a way that touched students. “The Litany of Earth” communicates the same event very potently, using the tool of genre to make something most readers might see as only affecting “others” feel universal.
The large-scale horror of Lovecraft’s universe revolves around the inevitability that human achievement, and in the end all life, will fade into nothing. The Yith and their library are the only hope for a legacy, one bought at the terrible price of what they do to those whose bodies they commandeer. By creating a parallel between the fragility of all human achievement, preserved only by the Yith, and Aphra’s barely-literate brother Caleb writing of his doomed search for the family library which contained the history and legacy he and Aphra so desperately miss, the fantasy setting puts all readers in Aphra’s place, and the place of those interned, creating universal empathy which no textbook chapter could achieve; neither, in my opinion, could a non-fantasy short story, at least not with such deeply-cutting efficiency. After reading this story, not only the events of Japanese American internment but many parallel situations feel more personally important, and one feels a new sense of personal investment in such issues as the fate of the Iraqi Jewish Archive. This stoking of emotion and investment is a powerful and lingering achievement.
Structurally, the story interweaves experiences from different points in Aphra’s present–where she encounters Specter–with her past arriving in the city and encountering Charlie and his interest in her lost culture and languages. The choice to depict the present scenes in past tense and the flashbacks in present tense might seem counterintuitive, but I found it a powerful and effective choice. Past tense reads as “normal” in prose, so much so that we accept it as an uncomplicated way to depict the main moment of a narrative. In contrast, especially when we have just come from a past tense section, the present tense feels extra-vivid, raw, invasive. It feels like a very certain type of memory, the kind so vivid that, when something reminds us of them, they jump to the forefront of our minds and blot out the here and now with the tense, unquenchable emotions of a very potent then. Trauma makes memories do this, but it is not the traumatic memories of camp life that we experience this way. Instead it is the vividness of tender moments of cultural experience: seeing precious books in Charlie’s study, sharing his drying river, warm things. The transitions to vivid present tense make the reader think about memory and trauma without having to show traumatic events, while simultaneously highlighting how, in such a situation of discontinuity and cultural deprivation, the experiences which are most alive, which blaze in the memory, are these tiny, rare moments of connection, even tragically imperfect connection, with the ghostly echo of Aphra’s lost people.
For me, the triumphant surprise of the story comes in the end, when Aphra approaches the cultists, and chooses to act. Specter’s descriptions of bodies hanging from trees, combined with our familiarity with the tropes of creepy cults in Lovecraft and outside, prepare us mid-story to expect that when Aphra approaches the cult they’ll be evil and insane, and she’ll overcome her resentment of the government and do what has to be done. Or possibly the reversal will be stronger than that, and the cult will be good and nice, like Aphra, and the take-home message will be that Specter is wrong and Aphra and the cultists are all just misunderstood and oppressed. It feels like the latter is where the story will take us when we see Wilder and Bergman, and Aphra finds comfort and companionship in participating in a badly-pronounced imitation of her native religion. Even when we hear about the immortality ritual and Bergman refuses to listen to Aphra’s attempts to make her see that her ambition is an illusion, it still feels like we are in the narrative where the cultists are good but misunderstood, and the tragedy is just that there is such deep racial misunderstanding that even Cthulhu-worshipping Bergman cannot believe Aphra’s attempts to help her are sincere. It is a real shock, then, when Aphra calls in Specter to shut the group down, because the genre setting raises such a firm expectation that “bad cultist” = “blood and gore” that even when we read about Bergman’s two drowned predecessors it doesn’t register as “human sacrifice” or “bad cult.” Aphra, unlike the reader, is unclouded by genre expectations, and shows us that, precious as this echo of her lost culture is to her, life is more precious still and requires action.
The ghostly echo of Aphra’s people that she shares with Charlie is precious enough to blaze in her memory, but she is willing to sacrifice the far more welcome possibility of being an actual priestess for people who sincerely want to share her religion, when she realizes that their cultural misunderstanding will cost human lives. And she cares this deeply despite being an immortal among mortals. The triumph of empathy is complete.
Unlike the numerous vampire stories and other tales which so often present immortals seeing themselves as different, special, unapproachable, and usually superior to mortals, here Aphra’s potential immortality enhances the uniqueness of her perspective and the depth of her loss, but without in any way diminishing her respect for and valuation of the short-lived humans that surround her. The grotesque folder of experimental records which is her mother’s cenotaph does make her reflect on how the loss is greater than the human murderers understood, but does not make her present it as fundamentally different from the deaths of humans, or make her (or us) see her suffering in any way more important or special than that of the Japanese family with whom she lives. The history of Earth that her people have learned from the Yith makes her recognize that living until the sun dies is not forever, nor is even the lifespan of the planet-hopping Yith who will persist until the universe has run out of stars and ages to colonize. The Litany of Earth that she shares with Charlie is an equalizer, enabling empathy across even boundaries of mortality by placing finite and indefinite life coequally face-to-face with the ultimate challenges of entropy, extinction and the desire to find something valuable to cling to. “At least the effort is real.” This is something Charlie has despite his failing body, that Aphra’s brother has despite his deprived education, that Aphra has despite her painful solitude, a continuity that overcomes the tragic discontinuity and connects Aphra even with her lost parents, with ancestors, descendants, with forgotten races, races that have not yet evolved, races on distant worlds, races in distant aeons, and with the reader.
One last facet I want to comment on is how the story portrays magic which is at the same time viscerally bodily and also beautiful and positive. This is very unusual, and the more you know about the history of magic the clearer that becomes. Magic, at least positive magic, is much more frequently depicted with connections to the immaterial and spiritual than the bodily: bolts of light, glowing auras, floating illusions, the spirits of great wizards powerfully transcending their age-worn mortal husks. Magical effects that are bodily, using blood, distorting flesh, are usually bad, evil cultism, witchcraft. This trope far predates modern fantasy writing. I have documents from the Renaissance, based on ones from Greece, which discuss magic and differentiate between two kinds: a good magic based on study, scholarship, texts, words of power, perfection of the mind, the soul transcending the body, angelic flight, spiritual messengers, rays and auras of divine power (an intellectual, disembodied and male-dominated “good” magic), contrasted, in the same types of texts, with the bad, evil magic of ritual sacrifice, sexuality, animal forms, distortion of the body, contagion and blood, associated with witchcraft and with women. Cultural baggage from the Middle Ages is hard to break from even now, and we see this in the palette of special effects Hollywood reserves for good wizards and bad wizards. The tender, intimate, visceral but beautiful magic which Ruthanna Emrys has presented is authentic to Lovecraft and to the rituals we associate with “dark arts” and yet positive, a rehabilitation which works in powerful symbiosis with the story’s treatments of discrimination. Since race and religion are so much in the center of the story, its treatment of gender rarely takes center stage, but in these depictions of magic especially it is potent nonetheless.
I’ll stop discussing the story here, since I resolved to make this review shorter than the story itself, and I’m running close to breaking that resolution.
Step Seven: Sing.
One of the most conspicuous effects when I first read “The Litany of Earth” was that it made me get one of my own songs firmly stuck in my head for many, many hours. The piece is “Longer in Stories than Stone” and it is the big finale chorus to my Viking song cycle, a piece about the fragility of memory and the importance of historical transmission. It is a different treatment but with similar themes, and I found that listening to it a few times live and over and over in my head helped me extend the feelings reading the story awoke in me, and let me continue to enjoy and contemplate its messages for several happy hours. So to celebrate the release of the story (taking advantage of the fact that this blog is no longer anonymous) here is the song, and I hope it will do for you what it did for me and help me extend my period of pleasurable mulling. I hope you enjoy:
To rescue us from the dark and gloomy wood of Doubt in which we have been wandering since my first post in this series (did you say hello to Dante?) comes the Criterion of Truth! The idea that, while the skeptics are correct that logic and the senses sometimes fail, they do not always fail, and if we carefully study when they fail, and why, if we identify the source of error, we can differentiate reliable knowledge from unreliable knowledge. For example, our eyes may deceive us when we judge a stick half-submerged in water to be bent, but if we add the testimony of other senses (touch), and of repeated experience (we recall what happened last time we saw an object half-way into water), we can identify the error, and henceforth say that we will not trust sense data based on visual information about objects half-submerged in transparent liquids, but that other sense data may be reliable. Once the causes of error have been defined, once we have a criterion for judging when knowledge is uncertain and when it is reliable, if we thereafter base our conclusions only on what we know is certain, then our conclusions will be reliable, eternal and divine, a steady foundation upon which we may proceed in safety toward that godlike happiness we seek. The Criterion of Truth is the clean and steady light of compromise, which does not banish all shadow, but, like a lantern in the dark, allows a philosophical system to have dogmatic elements while still conceding that much remains in shadow.
“Quite wrong!” cries our Pyrrhonist. “You have it all backwards! Doubt is the steady path toward eudaimonia. The absence of the possibility of certainty is our liberation, not our bane! It is when we embrace the fact that we cannot have certainty that we are finally free from the risk of having our beliefs overturned and our Plutos and Brontosaurs snatched away. It is when truth is firmly beyond human reach that we can finally relax and stop being plagued by curiosity and the endless, restless quest for information. The Criterion of Truth is not a light in darkness, it is a battering ram which has pierced our clean and serene sanctum and smeared it with all the muddled and confusing chaos that we worked so hard to banish! Don’t build a path on this foundation! However steady it may seem, the ground could still give way at any moment and shatter all. And even if it doesn’t, the path will never end. You will exhaust yourself on its construction, your age-gnarled hands still struggling to lay stones when you breathe your last, with never a glimpse of the end in sight, just infinity of toil and darkness. And then you will inflict the same curse upon your children, and your children’s children, and your children’s, children’s, children’s children!”
Whether one sees it as a blessing or a curse, developing a Criterion of Truth is what has allowed, and still allows, dogmatic philosophical systems to exist and progress in a fertile and symbiotic relationship with skepticism, instead of ending with the blank serenity where Pyrrho and other absolute skeptics wanted to dwell forever. Every philosopher with any dogmatic ideas has a criterion of truth (“Yes, even you, Sartre,” says Descartes, “Don’t give me that look!”), and an explanation for the source of error, and frequently I find that, when I am feeling awash in the ideas of a new thinker, one of the best ways to start to get a grip on things is to find the criterion of truth, which gives me an anchor point from which to explore, and to compare that thinker to others I am more familiar with.
Today I shall attempt something a bit compressed but hopefully the compression itself will be fruitful. I intend to briefly examine three of the major classical schools (Platonism, Aristotelianism and Epicureanism) and explain just enough of each system to make clear its criterion of truth and its explanation for the source of error. By laying these out in a compressed form, side-by-side, I hope to show clearly how skepticism is at play in each of the dogmatic systems, and to show what the early approaches to it were, so that when I move forward to major turning points in skepticism it will be clearer just how new and different the new, different things are. Tradition dictates that I start with Platonism, but Socrates is looking a little too aggressively eager now that I mention Plato, and furthermore he was being mean to Sartre while we were away (Don’t pretend you didn’t know that dialog trying to define “being” would make him cry!), so I shall instead start with Epicurus:
The Epicurean Criterion of Truth: Weak Empiricism
Take the stick out of the water. Epicureanism faces up to the skeptical challenge to the reliability of sense data and still chooses to promote the senses as our primary source of information, simply proposing that we should not rely upon first impressions, but should consider sense data reliable only after careful investigation, ideally using multiple senses and instances of observation. But there is more to it than that.
Epicureanism is a mature form of classical atomism, positing that on the micro-level matter is composed of a mixture of vacuum and invisibly tiny, individual components or seeds known as “atoms” which exist in infinite supply but finite varieties (see the modern Periodic Table), and that the substances and patterns we see in nature are caused by different recurring combinations of these atoms. If the same kind of sand appears on two unrelated beaches, it is composed by chance of the same combination of atoms. If a piece of wood is burned and goes from being brown, firm and porous to being white and powdery, some atoms have left it (in the smoke, for example), and the remaining ones look different.
Atoms too are responsible for the apparently changeable properties of objects (remember the seventh mode of Pyrrhonism, that we cannot have certainty because objects take multiple forms). The properties of substances do not derive from atoms themselves but from their combinations. Colors, smells and flavors are all effects of the shapes of atoms, so it is not true that sweet substances contain sweet atoms and red substances red atoms, rather sweet substances contain smooth atoms which are pleasant to the tongue rather than rough, and red objects contain atoms whose combinations create redness. If bronze is red and then turns green, or wood is brown but burns and turns gray, then atoms have entered or left and the new combinations create a different color. And it is on this atomic basis that the Epicureans argue that (a) natural interactions of atoms and vacuum are enough by themselves to explain all observed phenomena, so there is no need to posit fearsome interfering gods, and (b) the soul is just a collection of very fine atoms, distributed in the body and breath, which disperse at death, so there is no need to fear a punitive afterlife.
Atoms are, believe it or not, largely a solution to Zeno’s paradoxes of motion, and also have much to say about our stick in water. As we all recall, Zeno’s arrow can never reach its target because the space in between can be infinitely subdivided into smaller distances which it must cross before it can finish its path, therefore motion is impossible. Epicurus answers: yes. Motion is indeed impossible. Motion is an illusion. The key is that space is not infinitely divisible, as Zeno proposed. Atoms, according to the Epicurean system, are not only the smallest objects but the smallest subdivision of space; it is literally impossible to subdivide either atoms or space further. (Note that if he were around now Epicurus would deny that our modern “atoms” are atoms – he would confer that title upon the smallest known sub-atomic particle, or reserve it for the piece smaller than that which all the king’s horses and all the king’s cyclotrons still can’t detect.) The smallest distance any object can move is one atom-width – any more nuanced motion is impossible. In other words, fluid motion is an illusion, and on the micro-level objects do not slide from one place to another. Rather their atoms pop in an instant from one position to the next atom-width over. One might call it microscopic teleportation. It is by this means that the arrow moves: every component atom in the arrow teleports one space to the left each moment, and thus the arrow proceeds from right to left sequentially.
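To put the logic of this escape in modern shorthand (my own notation, not anything from the ancient sources): Zeno’s regress depends on space being divisible without end, and positing a smallest increment terminates it.

```latex
% Zeno's premise: a distance d can be subdivided forever, so crossing it
% means completing infinitely many sub-crossings:
d = \tfrac{d}{2} + \tfrac{d}{4} + \tfrac{d}{8} + \cdots
% Epicurus's reply: space has a minimum indivisible increment a (one
% atom-width), so the subdivision stops and the crossing is a finite
% sequence of discrete jumps:
N = \frac{d}{a}, \qquad N \ \text{finite whenever} \ a > 0
```

The whole paradox thus dissolves not by refuting the infinite series but by denying the premise that generates it.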
Positing micro-teleportation as a substitute for motion may seem alien, but it is something we make use of every day in the modern world, and it is in fact much easier to explain Epicurean theories of motion to modern computer-users than it was to people in the past. As you scroll down this page, the cursor of your mouse and the text on the screen seem to move, but in fact nothing is moving. Instead tiny pixels, the atom-widths of your screen, are changing color, or you could say that the black pixels that form the text are teleporting one pixel-width per moment as you scroll. The eye, unable to see such fine distinctions, blurs that micro-teleportation into the illusion of motion. Why couldn’t all motion be a similar illusion? Zeno is defeated, and Reason is once again reliable.
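The scrolling-text analogy can even be mimicked in a few lines of code. What follows is a toy sketch of my own (the names `tick` and `arrow` are invented for illustration, not anything from the original discussion): an “arrow” occupying discrete cells on a one-dimensional grid, where at each indivisible moment every atom jumps exactly one cell, never occupying any position in between.

```python
# Epicurean "micro-teleportation" as a toy simulation: the arrow is a
# set of atoms at integer grid positions, and motion is nothing but
# each atom popping one cell to the left per indivisible moment.

def tick(atoms):
    """Advance every atom one cell to the left (one 'atom-width')."""
    return [cell - 1 for cell in atoms]

arrow = [10, 11, 12]          # three atoms forming the arrow
trajectory = [arrow]
for _ in range(5):            # five indivisible moments
    arrow = tick(arrow)
    trajectory.append(arrow)

print(trajectory[-1])         # [5, 6, 7]: the arrow has "moved" 5 cells
```

Sampled over many ticks, the successive positions trace what the eye would read as smooth flight, exactly the pixel illusion described above, even though no atom ever occupies an intermediate position.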
Which is good because Reason is the heart of the system of knowledge Epicurus wants to build. The Epicurean atomic theory, after all, is based on a combination of observations of the sensible world and then logical deductions. We observe that objects change their form when burned, that sea-soaked cloth hung up to dry becomes dry but remains salty, and that the same types of substances recur in many independent locations. From this we deduce the existence of atoms of different types in different combinations without ever directly seeing them. Zeno’s paradox of motion does not, in this interpretation, demonstrate that we can’t trust reason, but that we can’t trust rash, unexamined observations. There seemed to be motion, but with time, patience, observation and reason the Epicurean has determined that that was a mistake, and found a better model.
But this does an interesting thing to sense data, which Epicurus still wants to be more our guide than naked logic. Atomism, which predates Epicurus, seems to have itself arisen from observations of motes in a sunbeam, tiny particles which are invisible normally but visible only in special circumstances, and which all classical atomists cite as sensory evidence for the reality of atoms. From motes in a sunbeam and raw logic, they derive the atomic theory. As Epicureans strive to free themselves from fear of the unknown by observing and explaining natural phenomena through the interaction of atoms, they rely on what they can see, taste, hear and touch to derive their theories. This is empiricism but it is (as Richard Popkin aptly named it) weak empiricism. Why? Because the reality beneath what we observe is invisible. (“Exactly!” cries Sartre, leaping up with sufficient force to knock over Descartes’ thermos.) If atoms are undetectably tiny, and everything we see, taste and smell is a consequence of their combinations rather than the atoms themselves, then we can never have real knowledge of the fundamental substructure of being. There is an insuperable barrier between us and knowledge of true things, the barrier of minuteness. Thus Epicurean empiricism involves surrendering forever any certain knowledge of the truth of things, but in return we can have fairly reliable knowledge based on careful, repeated observation using multiple senses, especially now that logic has been rescued from Zeno’s grasp and is once again our ally.
Source of Error: Twofold. Limitations of the senses, which cannot see atomic reality; unquestioned acceptance of sense data and commonplace cultural assumptions (like superstitions about the gods) which are unreliable because they are not based on careful observation and analysis.
Criterion of Truth: Knowledge is certain when it is based on a combination of careful observation of the sensible world with multiple senses, and careful logical analysis.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of the observable world, and we can make rational deductions about the insensible world which are reliable enough to act upon (since we cannot ever prove or disprove them), but we cannot ever have true and certain knowledge of the invisible atomic world which is Nature’s true reality.
At this point some readers are not particularly disturbed by Epicurus’ surrender of true knowledge of microscopic things. After all, we have advanced since 300 BC. We played with microscopes in grade school, we named the proton and the quark and the preon, we made molecules out of toothpicks and gummy candies, and the electric blood of splitting atoms blazes in our lightbulbs. We fixed that weakness. “Delusion!” Sartre says, and he is right that, on a fundamental level, this technological advancement has not let us reclaim what Epicurus surrendered. However advanced our science, we still have no cause to believe we have yet perceived or even hypothesized the literally smallest increment of matter. And, separately, even if we had a machine capable of perceiving the smallest part of matter, we would still be limited by our senses since the machine would have to use our senses to transmit its findings to us, transmitting only an approximation, rather than reality. And in addition, the vast majority of our daily decisions would still be based on what we perceive at the macroscopic level. Thus, even with technological aid, the Epicurean surrender of knowledge of the fundamental seeds of things is a considerable one, and divides all knowledge firmly into two camps, the perceivable world about which it is possible to have certainty, and the reality beneath about which it is not. We have a path and shadows, dogmatism and skepticism coexisting within one system.
The Platonic Criterion of Truth: the Forms
My approach to Platonism will be rather sideways, but I want to get us to its criterion of truth by a route that is as parallel as possible to Epicurus’. So, for the vast majority of my readers who know basic Platonism already, please read along thinking about Zeno’s paradoxes and the stick in water, and notice how this way of outlining Platonism follows the same logical structure Epicurus used.
Plato, like the skeptics, acknowledges that the senses fail and deceive, and, like the atomists, observed that there are recognizable, recurring objects in nature that come into existence in independent parallel to one another: similar rocks, mountains, trees and animals in distant corners of the Earth, which must, he reasoned, have some common source. He also noticed that humans are able to recognize and identify these objects as being the same, even humans who have never met each other, or speak different languages, and even when the objects may have radically different colors and shapes disguising a shared structure – a disguise we see through. Finally he noticed (something Epicurus did not discuss) the fact that humans not only naturally identify objects, but naturally judge them to be better or worse based on unspoken but nonetheless universal criteria. Anyone can tell that a crisp, fresh apple is “better” and a withered, dry one “worse” without having to discuss or debate that fact, or even to be taught it. I could show you a healthy and a diseased version of some deep-sea fish you’ve never heard of and you would nonetheless successfully identify them as “better” and “worse” exemplars of a completely new and unknown thing.
To explain these patterns, and this universal capacity to identify and judge “better” and “worse” examples of things, Plato posited that these objects must have a shared source, but instead of positing a combination of atoms, he posited a source independent of matter that supplied the object’s structure. All quartz crystals, all trees, and all apples take their structures from a separate structure-supplying object, which exists independent of matter and time. It has to, since the objects it generates can come into existence and be destroyed, but the pattern, the archetype, the source remains. Plato named this structural archetype the “Form” and posited that these Forms exist in a separate level of reality. They create the many material manifestations of their structure as a flag pole might cast many shadows on different objects at different times. As some shadows are crisp, straight images of what casts them and others are vague, twisted or distorted, so objects are sometimes fairly straight and sometimes quite twisted manifestations of their Forms. When we judge an object, we judge it based on how good an image it is, how closely it resembles the Form which is the source of its structure. Hence why anyone of any age, in any culture, without the necessity of communication, can judge the superior of two apples, and tell that twisty trees are weird.
But objects are never truly like their Forms because Forms exist on a completely different level of reality, just as the flag pole exists on a different level of reality from its shadows. We know this the same way we know that the godlike eudaimonia we seek cannot be based on fleeting things like lust and truffles. Forms are indestructible – no matter how many trees or apples burn, the Form remains. With that attribute, in the Greek mind, go the others: Forms are eternal, unchanging, perfect, and divine. They cannot be part of this changing and destructible reality, but must exist on some other layer of reality where change and destruction do not exist. Note how this is in many ways exactly symmetrical to Epicurus’s atomic theory, in which atoms are indestructible, unchanging and perfect, and exist on an imperceptible micro-level accessible to us only by deduction, just as real-but-invisible as the Platonic realm of Forms. Both posit a materially inaccessible world which is the source of the structures of the perceivable world.
What about Zeno and the stick in water? Simple: the motions of a flagpole’s shadow across the earth and ground aren’t rational but bizarre, bending and distorting, split in half at times by passing objects, changing and imperfect. Just so the material world. The stick in water looks bent, and motion is rationally impossible, because the entire layer of reality perceived by the senses is itself bent, distorted, an imperfect effect of a perfect reality elsewhere. When we see the stick look bent, or realize that motion makes no sense, it is at that point that we are beginning to perceive the fundamental flaws in sensible reality, and realize that the true, rational, knowable structure lies elsewhere.
True knowledge, reliable, certain knowledge upon which we may build our path toward reliable, certain eudaimonia must therefore be knowledge of Forms, not of passing things. We can have True knowledge of the Form of Apples, the Form of Trees, the Form of Justice, the Form of Humans, but we cannot have true knowledge of a particular apple, tree, case of justice v. injustice, or human, because such things are changing, imperfect, and perishable, so even if we could know them perfectly at one instant, that knowledge would not be lasting, not enough to be a real foundation for happiness. The only permanent, certain knowledge is knowledge of eternal things, since all other knowledge is, like its objects, destructible. Thus the Forms are the path to Happiness.
And now, without any need to address the soul, or Platonic love, or Truth, or the other great Platonic signatures, we can describe the Platonic Criterion of Truth:
Source of Error: The material world perceived by the senses is imperfect and illusory, and conclusions based on observation of it are full of error, and incomplete.
Criterion of Truth: Knowledge is certain when it is based on knowledge of the eternal Forms, which can be perceived by Reason. So long as we rely only upon knowledge of abstract, eternal Forms and not on knowledge of specific material things, we will make no errors.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of the Forms, i.e. of the eternal structures that create the sensible world, but we cannot ever have true and certain knowledge of individual objects within the material world.
Now, our friend Socrates has been waiting all this time to rant about how Plato put all this in his mouth, by using him as an interlocutor in his philosophical dialogs, when all Socrates stood for was the principle that we know nothing, and wisdom begins when we recognize that we know nothing. But explicated like this, in a way which highlights how substantial a portion of human experience Plato has yielded to the shadows of skeptical unknowability, Socrates has far less cause to object. Plato has taken “I know nothing” as his starting point, as, in fact, did Epicurus, both of them beginning by scrapping the received commonplaces of things people thought they knew about the material world, and instead trying to find a space for certainty far removed from the evidently-unknown world of daily experience. We all know that Plato tried to appropriate Socrates to his system, painting Socrates as a Platonist and implying that Socrates agreed with all Plato’s dogmatic ideas as well as his skeptical ones.
But Plato was far from the only one to do this. In the ancient world, Skeptics, Cynics, Stoics, Aristotelians and Neoplatonists all make claims about Socrates really believing what they believed, that Socrates was really a skeptic, or a stoic sage, etc. This is easy because Socrates left us nothing in his own voice, but also because all of them really did begin as he demanded, by doubting everything, declaring that “I know nothing” and then trying to work from that toward a system which carves out one zone for the knowable and surrenders another to the unknowable. Attempts by later sects to appropriate Socrates reflect his fame, but also their universal gratitude for the way his refinement of skepticism created a starting point from which they could approach their Criteria of Truth, and start from there to lay their foundations. And now that I’ve put it that way, Socrates seems much less set on picking a bone with Plato, and much more interested in the bones of the chicken drumsticks Sartre brought, which are much larger than those Descartes brought, which are larger than the ones Socrates is used to, a mystery which definitely bears investigation. We can in part blame one “Aristotle”, though when I mention him our more modern thinkers smile knowingly, thinking of the many stages that had to pass between the ancient empiricist and the alien concept “progress.”
The Aristotelian Criteria of Truth: Categories and Definitions
Aristotle studied with Plato for decades, and his framework has a similar beginning. Yes, we instantly recognize that apple is apple and cat is cat, even if we are on the other side of the world and recognize apple as ringo and cat as neko. And we instantly judge the withered apple as being farther from what an apple ought to be than the crisp one.
What Aristotle doesn’t like is how Plato has the Forms exist in a hypothetical immaterial reality removed from the sensible reality. Instead, he uses the term “form” to refer to structures within natural objects, which are not material but not immaterial either. They are non-material. This may sound like gibberish, but I recently demonstrated it very effectively to my class by taking two apples to the front of the classroom, setting them down while I had a drink of water, then violently smashing one of the apples with repeated blows from the butt end of the water glass, reducing it to a sticky green pulp and producing an extremely startled and, in the front rows, apple-bespattered classroom. “What did I just destroy?” I asked. It took only a few moments of recovery for one to supply: “The form of the apple.” Aristotle even goes so far as to say that forms, rather than matter, are what senses sense. When we see an apple our minds do not register the raw, chaotic matter, they register the structure: apple. When we see smashed apple pulp even then we do not see matter, we see pulp, which has its own structure. We never perceive matter, or rather never recognize matter, never understand matter. All cognition takes place on the level of form, which is why we can identify “apple” at a glance and not have to spend a minute assembling the millions of points of perceived light and color together to deduce that it’s an apple.
But if the form, for Aristotle, is a structure within individual objects, and is destructible, it can’t be a source of eternal certainty, nor can it explain how my colleague in Japan can recognize and judge apple identically to the way I do. For this Aristotle posits Categories. Universal categories exist in nature, non-material structures just like forms, into which the forms of objects fit. Human Reason is capable of identifying these categories, by looking at objects, understanding their forms, and identifying their commonalities, functions etc. We all see the apple and recognize that it fits in the category apple. We further recognize that the category apple fits in the category fruit, that in the category “part of a plant” etc. And that the Stayman apple is a sub-category within the category apple. This allows us to identify and judge even objects which we have never seen before and have no names for. You probably do not know at a glance what the creature pictured to the left here is, but you can identify that it belongs in the category mammal, possibly in the rodent category or maybe more like a tiny deer judging by those skinny legs, but certainly in the medium-sized, ground-dwelling, non-carnivore, probably scavenger eating fruit and bugs and things, not-dangerous-to-humans category. (It is, in fact, a Kanchil or “mouse-deer”). Similarly we can all categorize trees, rocks, fish, and other things. Aristotelian categories are part of Nature itself, eternal and unchanging, and indestructible, since the category apple and the category Kanchil will be unchanged regardless of the creation or destruction of any individual. A withered apple doesn’t harm the category apple, nor does a limping three-legged Kanchil, and the extinction of the T-Rex didn’t erase the category T-Rex.
The extinction of the Brontosaur didn’t erase the category Brontosaur either – it was our discovery that the category was wrong that did so, and here we get toward Aristotle’s ideas of certainty and error. We had not defined our terms carefully enough, had accidentally separated two things that shouldn’t be, and thus were led to error. Error caused by insufficiently clear definitions of our terms. The categories are sources of true, certain and reliable knowledge. Like with Plato’s forms, we cannot Know-with-a-capital-K individual things with certainty, since they are destructible and changing, and the apple which is fresh today will be withered next week. But we can know the categories, and that it always has been and always will be the nature of the apple to grow on trees and try to be sweet and colorful to attract animals to eat it and spread seeds, and that it always has been and always will be the nature of the T-Rex to be a humongous terrifying predator the sight of which inspires fear in all mammals and other smaller creatures. One source of error is when we make mistakes about categorization. We may mistake the Kanchil for a rodent, or a Vaquita for a dolphin, but with more careful observation we realize that the Kanchil is more closely related to a deer. We may mistake the Brontosaur for its own species before we realize it is a juvenile version of another thing, as easy a mistake to make as thinking that a caterpillar and butterfly are different creatures until we examine more closely.
We also want to do this with things we may not, in modern parlance, think of as part of Nature, but just as there is the category “cetacean” within which exists the category “porpoise,” so too there exists the category “integer” within which exists the category “prime number,” also the category “system of government” within which lies the category “democracy,” and the category “virtue” within which exists the category “justice.” Aristotle, and the rest of Greece with him, does not draw our modern post-Rousseau line between “Natural” and “artificial” placing human works in the latter. Birds are part of Nature, as are humans; birds’ nests are part of Nature, with a category, as are all the things humans create. The category “web page” which contains the category “blog” is as natural as the category “tree”.
Thus Aristotelian certainty comes with careful, systematic investigation of the categories within nature, and if we want to reduce error we can do so best by studying and measuring and comparing objects we see until we can fit them into categories. The more we study, and the more carefully we define our terms, the clearer our conversations will become, less given to assumptions, misunderstandings and error. One source of error, therefore, is equivocal language, words that are sloppily defined and don’t refer to real categories in nature. Brontosaur, planet, motion, Justice, good, are all sloppily-defined terms. Any term which does not point to a real category in Nature is sloppy and may lead us to error. If we use only vocabulary that is carefully worked through and points only at real categories, then our language will be clear, our communication perfect, and the possibility of error greatly reduced. After all, we only want to be talking about categories, not anything that isn’t one, since, as with Plato’s forms, categories are eternal, unchanging and reliable. On their foundation we can build our path. As with Plato and Epicurus we have surrendered knowledge of individuals, in favor of knowledge of something structural which underlies them.
Excuse me: to proceed farther with Aristotle, I need to go get my fork. Here it is. (Or rather an image of it, one level less real, its Platonic shadow.)
This fork has been part of my life since I was a tiny girl, and it taught me about the Aristotelian sources of error. When I was little, I would help put the silverware away. This fork puzzled me. Why? Because I couldn’t figure out how to categorize it.
Here you see my dilemma. We had one slot for forks, which had tines and metal handles. And one slot for knives, which had blades and wooden handles. Where then goes this fork, which has tines but a wooden handle? Let’s offer the dilemma to our Youth.
Youth: “I think it should go with the metal-handled fork.”
“Why?”
Youth: “Because it’s a fork. It’s used for fork things, that’s more important than what it’s made of.”
*Ding!*Ding!*Ding!* Correct! The Youth, like my child self, has correctly identified the Aristotelian distinction between an “essential property” and an “accidental property”. An essential property is a quality of something essential to it being itself, and filling the function it has in Nature; an accidental property is something that could change and it wouldn’t matter. A cat can be black or tabby (accidental) but must be slinky, carnivorous, and endearing to its owner in order to fulfill the functions of a cat. A tree must grow a woody trunk and produce leaves in order to fulfill the functions of a tree. A fork must fit comfortably in my hand and lift chunks of food to my mouth for it to be a fork. If the cat is orange, the tree is forked, and the fork is a futuristic rod that lifts food using a miniature tractor-beam instead of tines, those are accidents. If these things fulfill these functions badly–if a cat is ugly, a tree is all bent and twisted and produces few leaves, or a plastic fork snaps when I try to skewer food with it–we judge them bad examples of what they are. If these things don’t fill these functions at all–a quadrupedal mammal eats grass, a plant produces a soft viny stalk, and a piece of silverware cuts food in half instead of lifting it–we judge they do not belong in the categories cat, tree and fork respectively because they lack their essential properties. If I had mistakenly stored my wooden-handled fork with knives, that would have produced error, the same source of error as when we mistake a Kanchil for a rodent, or when Descartes, living in the 17th century, reads an article about how people from Africa are not the same as people from Europe because their skin is a different color. Mistaking accidental properties for essential ones has introduced error.
And to call a robot toy a “cat”, or a metaphor for understanding genealogy a “tree”, or a fifteen-foot fork-shaped sculpture a “fork” is to employ ambiguous language that does not refer to their real categories, introducing error.
But what about Zeno, and our stick in water? For our stick in water Aristotle, much like the Epicureans, wants us to examine the stick more carefully, multiple times with multiple senses, to correct the mistake. And, like the Epicureans and Plato too, he surrenders true knowledge of individual objects, saying we can know Categories with certainty, after careful examination, but not specific things.
As for Zeno, there he comes from a different angle, attempting to refute Zeno with pure logic. Aristotle is big on observing Nature, but also on logical principles, especially a priori principles. By these he means logical principles which are self-evidently true and require no knowledge or experience to be proved. For example: The same thing cannot both be and not be at the same time. Think about it for a while, take your time. It’s the case, and not only is it the case but it’s the case for lampreys, and thumbtacks, and hypothetical frictionless spheres, and ideas, and systems of government, and people. Even if you were a brain in a jar that had never had any experience of the world outside the mind, you could identify that a concept cannot both exist and not exist at the same time. Here’s another: “One” and “many” are different. It is nonsense to imagine that a thing could be both singular and plural at the same time. That too you can conclude without any basis in anything.
Now, it is possible to use clever syntax to come up with what seem like counter-examples. What about a doughnut hole: surely it exists and doesn’t exist at the same time, for this doughnut has a non-existence which is its hole, and yet here I am eating this doughnut hole. No, says Aristotle. That apparent contradiction is merely a function of unclear vocabulary giving two things the same label when they are utterly different. Similarly this pomegranate is one and many at the same time. Again, no: it is many seeds, but one pomegranate. Use strict vocabulary, unambiguous terms, and discuss only categories, and you will find that Aristotle’s a priori principles are sound.
Reasoning from such starting points, and using raw logic without recourse to any knowledge of the material world, he then takes on Zeno. You cannot, says Aristotle, have infinite regression. It may seem you can, but an infinite chain is a logical impossibility because it would never end and never start. When you try to think about it, the mind rebels, just as it does when it tries to think of the one and the many being the same, or a thing both being and not being at the same time. Thus, says Aristotle, Zeno’s paradox is proved false because infinite regression is logically false. We can, now, rely on logic, so long as it is careful and methodical, and based on first principles and on comparison of the categories rather than leaping to conclusions directly from sense impressions of individual objects, which are flawed.
Sources of Error: (1) People using vague vocabulary that is unclearly defined and does not refer to anything Real, (2) Fallibility of individual material objects and rushed conclusions based on observations of such objects (note how similar this latter is to Plato).
Criterion of Truth: Knowledge is certain when it is based exclusively on a priori logical principles, which depend on nothing but logic to be certain, on the eternal Categories which exist universally in Nature and can be known through observation and discussed using a carefully-defined lexicon of philosophical vocabulary, or on a combination of the two.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of logical principles, and of the Categories, i.e. of the eternal structures within Nature that the forms of objects fall into, but we cannot ever have true and certain knowledge of individual objects within the material world.
Thus we have a third path, clearly delineating the arena of certain, eternal knowledge (on the basis of which we may seek eudaimonia) and separating it from the unknowable, which we surrender forever to skepticism. And once again the unknowable is the realm of matter, individual things, the essence which is given structure and comprehensibility by form. Aristotle, like Epicurus, has given up any chance of understanding matter itself, confining the cognizable world to that of form and structure, the macro-level. And he has surrendered knowledge of individuals, of this apple and this lamprey, granting us only the categories. We can still know an enormous amount in Aristotle’s system, enough to build a vast system of knowledge, a library of definitions, a vast network of genus and species names, and an empirical basis for an entire scientific system. Infinite knowledge lies before us on our Aristotelian path, infinite logic chains to follow, infinite categories to investigate, name, compare and discuss. The surrender, like Epicurus’s surrender of the ability to see atoms, feels minor.
“It’s still delusion!” Sartre says. “The surrender is vast! Infinite! Infinitely more vast and fundamental than your daily world imagines!” This outburst has been building up in poor Sartre for some time, which we can tell because he’s been holding his knees and rocking back-and-forth and flushing, and only barely sociable enough to thank Descartes for that eclair (which is not, in fact, a lightning bolt but is a delicious pastry named “lightning bolt” in French, much to Aristotle’s chagrin). And, at some risk of frightening our innocent interlocutor the Youth (whom I shall advise to have Socrates hold his hand through the next bit) I will let Sartre continue in his own words, an excerpt from his Nausea (note that this particular translation uses existence rather than being):
“So I was in the park just now. The roots of the chestnut tree were sunk in the ground just under my bench. I couldn’t remember it was a root any more. The words had vanished and with them the significance of things, their methods of use, and the feeble points of reference which men have traced on their surface. I was sitting, stooping forward, head bowed, alone in front of this black, knotty mass, entirely beastly, which frightened me. Then I had this vision. It left me breathless. Never, until these last few days, had I understood the meaning of “existence.” I was like the others, like the ones walking along the seashore, all dressed in their spring finery. I said, like them, “The ocean is green; that white speck up there is a seagull,” but I didn’t feel that it existed or that the seagull was an “existing seagull”; usually existence hides itself. It is there, around us, in us, it is us, you can’t say two words without mentioning it, but you can never touch it. When I believed I was thinking about it, I must believe that I was thinking nothing, my head was empty, or there was just one word in my head, the word “to be.” Or else I was thinking . . . how can I explain it? I was thinking of belonging, I was telling myself that the sea belonged to the class of green objects, or that the green was a part of the quality of the sea. Even when I looked at things, I was miles from dreaming that they existed: they looked like scenery to me. I picked them up in my hands, they served me as tools, I foresaw their resistance. But that all happened on the surface. If anyone had asked me what existence was, I would have answered, in good faith, that it was nothing, simply an empty form which was added to external things without changing anything in their nature. And then all of a sudden, there it was, clear as day: existence had suddenly unveiled itself. It had lost the harmless look of an abstract category: it was the very paste of things, this root was kneaded into existence. 
Or rather the root, the park gates, the bench, the sparse grass, all that had vanished: the diversity of things, their individuality, were only an appearance, a veneer. This veneer had melted, leaving soft, monstrous masses, all in disorder—naked, in a frightful, obscene nakedness.”
By this point our Youth is very glad to have his hand held, and Descartes is having second thoughts about sharing his eclair with what has evidently turned out to be a lunatic Lovecraftian cultist. But I let Sartre speak here to demonstrate the fact that these surrenders, made in the earliest days of philosophy by system-weavers seeking to escape the web of Zeno and the Stick, are still substantial. Even the most recent modern philosophy returns, from time to time, to these ancient surrenders to unknowability, and some try, like Sartre, to make new inroads toward knowing what the majority of thinkers have given up on. New and, in Sartre’s case, scary inroads. Every system-weaver since Plato may have a Criterion of Truth to be our light in the darkness, our path, our foundation, the circle line for the new philosophical subway system, but the fertile symbiosis between skepticism and dogmatism–the symbiosis which has borne such fruit: Platonic forms, genus and species, atoms, eventually the scientific method itself!–is also still sometimes a hostile symbiosis, and the wild, strong skepticism of Pyrrho still sometimes rears its head to plague Sartre and us, even as we make daily use of soft forms of skepticism like Epicurus’ weak empiricism, and Aristotle’s categories.
Of course, many are the centuries between Epicurus and Sartre, and many the new relationships between doubt and dogma, the new Criteria of Truth and new forms of shadowy un-knowledge which will press upon our fragile paths, before we reach the modern world. So we still have much more to explore in further chapters. Good thing Descartes brought plenty of lightning bolts.