Welcome to a new feature here on Ex Urbe — the promoted comment.
From time to time, Ada makes a long, substantive, chewy comment which could almost be its own post. Making it into an actual post would take valuable time, and the comment is already written and fascinating — but hidden down in a comment thread where many people may not notice it. From now on, when this happens, I will extract it and promote it. I may even go back and do this with some older especially awesome comments. You’ll be able to tell the difference between these and real posts because they’ll say they’re posted by Bluejo and not by Exurbe, because they’ll say “a promoted comment”, and also because they won’t be full of beautiful, relevant, carefully selected art but will have just one or two pieces of much more random art.
I thoroughly enjoyed reading this new post. As I am reviewing macroeconomics, especially the different variations of the Solow Model, I cannot help but link “intellectual technology” with a specific endogenous growth model, which attempts to let the model itself generate technological growth without an exogenous “manna from heaven”. In this model, technological growth is expressed endogenously through capital’s “productive externalities”: individual workers, through “learning by doing,” obtain more “skills” as capital grows. Of course, the “technology factor” in the model I learned is vaguely defined and does not cover the many definitions and various effects of “intellectual technology” not directly related to economic production.
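[For readers who have not met the model Nahua describes, a standard learning-by-doing formulation (in the spirit of Arrow and Romer; the notation here is an illustrative sketch supplied by the editor, not something from the comment) looks roughly like:

\[
Y_i = K_i^{\alpha}\,(A L_i)^{1-\alpha}, \qquad A = \bar{K}^{\phi}
\]

Each firm $i$ produces output $Y_i$ from its own capital $K_i$ and labor $L_i$, but the skill level $A$ of its workers depends on the economy-wide capital stock $\bar{K}$: every firm’s investment teaches every worker (“learning by doing”), a productive externality. When the externality is strong enough (e.g. $\phi = 1$, so that aggregate output $Y = K L^{1-\alpha}$ becomes linear in capital), growth sustains itself endogenously, with no exogenous “manna from heaven” term needed.]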
Your conversation with Michael reminds me of the lectures and seminars I took with you at Texas A&M. By the time I took your Intellectual History from Middle Ages to 17th Century, I had already taken some classes on philosophy. Sadly, my fellow philosophy students and I usually fell into anachronism and criticized early thinkers a bit “unfairly” on many issues. That is why your courses were like a beam of light to me, for I had never been aware of the fact that we have different logic, concepts, and definitions of words from our predecessors and should hence put those thinkers back into their own historical context.
It seems to me that Prof. Peter E. Gordon’s essay “What is intellectual history?” captures the different angles from which you and Michael construe Machiavelli. Michael seems more like a philosophy/political science student who attempts to examine how and why early thinkers’ ideas work or fail to work for our society based on our modern definitions, concepts, and logic, thus raising more debates in political philosophy and pushing forward philosophical innovation; your role as an intellectual historian requires one to be detached from our own understanding of ideas and concepts, and to be aware even of logic that seems rooted in our subconscious, so as to examine a past thinker fairly, without rash judgement. Michael is like one who attempts to keep building the existing tower upward, while you are carefully examining the foundation below. For me personally, it would be nice to have both of these ways of thinking.
I have a question: I have been attempting to read a bit of Karl Marx whenever time allows. He argues that our thinking and ideology are a reflection of our material conditions. If we accept his point of view, would it be useful to connect intellectual history with economic history?
Nahua, I think you have hit it spot on with your discussion of Peter Gordon’s essay. When I worked with him at Harvard (I had the privilege of having him on my committee, as well as being his teaching assistant for a course) I remember being struck by how, even when we were teaching thinkers far outside my usual scope, like Heidegger, I found his presentation of them welcoming and approachable despite my lack of background, because he approached them in the same context-focused way that I did, evaluating not their correctness or their applicability to the present, but their roots in their contemporary historical contexts and the reasons why they believed what they believed.
For Marx’s comment that “our thinking and ideology are a reflection of our material conditions” I think it is often very useful to connect intellectual history with economic history, not in a strictly deterministic way, but by considering economic changes as major environmental or enabling factors that facilitate or deter intellectual change and/or the dissemination of new ideas. I already discussed the example of how I think the dissemination of feminism in the 19th century was greatly facilitated by the economic liberation of female labor because of the development of industrial cloth production, more efficient ways of doing laundry, cleaning, cooking etc. Ideas about female equality existed in antiquity. They enjoyed a large surge in conversation and support from the intellectual firebrands of the Enlightenment, through figures like Montesquieu, Voltaire and Wollstonecraft. But mass movements and substantial political changes, like female suffrage, came when the economic shift had occurred. To use the “intellectual technology” concept, the technology existed in antiquity and was revived and refined in the 18th century, but it required economic shifts as well to help reach a state when large portions of the population or whole nations/governments could embrace and employ it.
As I work on Renaissance history, I constantly feel the close relationship between economics and the intellectual world as well. Humanism as I understand it began when Petrarch called for a revival of antiquity. Economics comes into this in two ways. First, the reason he thought a revival of antiquity was so desperately necessary was because Italy had become so politically tumultuous and unstable, and was under such threat of cultural or literal invasion from France–these are the consequences, largely, of economic situations, since Italy’s development of banking and its central position as a trade hub for the Mediterranean had filled its small, vulnerable city-states with incomparable wealth, creating situations where powerful families could feud, small powers could hire large mercenary armies, and every king in Europe wanted to invade Italy for a piece of its plump pie. Then after Petrarch, humanism’s ability to spread and succeed was also economically linked. You can’t have a humanist without books, you just can’t, it’s about reading, studying, correcting and living the classics. But in an era when a book cost as much as a house, and more than a year’s salary for a young schoolmaster, a library required a staggering investment of capital. That required wealthy powers–families or governments–to value humanism and have the resources to spend on it. Powers like the Medici, and Florence’s Republican government, were convinced to spend their money on libraries and humanism because they believed it would bring them glory, strength, respect, legitimacy, the love of the people, that it would improve life, heal their souls, bring peace, and make their names ring in posterity, but they couldn’t have made the investment if they hadn’t had the money to invest, and they wouldn’t have believed humanism could yield so much if not for the particular (and particularly tumultuous) economic situation in which Renaissance Italy found itself.
Yesterday I found myself thinking about the history of the book in this light, and comparing it to some comments I heard a scientist make on a panel about space elevators. We all want a space elevator–then space exploration will become much, much less expensive, everyone can afford satellites, space-dependent technologies will become cheap, and we can have a Moon Base, and a Mars program, and all the space stations we want, and all our kids can have field trips to space (slight exaggeration). To have a space elevator, we need incredibly strong cables, probably produced using nanofibers. Developing nanofibers is expensive. What the engineer pointed out is that he has high hopes for nanofiber development, because nanofibers have the ideal demand pattern for a new technology. A new technology like this has the problem that, even if there are giant economic benefits to it later on, the people who pay for its development need a short-term return on that, which is difficult in the new baby stages of a technology when it’s at its most expensive. (Some of you may remember the West Wing episode where they debate the price of a cancer medication, arguing that producing each pill costs 5 cents so it’s unfair to charge more, to which the rebuttal is that the second pill cost 5 cents, but the first pill cost $300 million in research.) Once nanofiber production becomes cheap, absolutely it will be profitable, but while it’s still in the stage of costing $300 million to produce a few yards of thread, that’s a problem, and can be enough to keep a technology from getting support. One of the ways we work around this as a society today is the university system, which (through a form of patronage) supports researchers and gives them liberty to direct research toward avenues expected to be valuable independent of profit. Another is grant funding, which gives money based on arguments for the merit of a project without expecting to be paid back.
A third is NASA, which develops new technologies (like velcro, or pyrex) to achieve a particular project (Moon!), which are then used and reused in society for the benefit of all. But looking at just the private sector, at the odds of a technology getting funding from investors rather than non-profits, what the scientist said is that, for a technology to receive funding, you want it to have a big long-term application which will show that you’ll make a steady profit once you can make lots of the thing, but it also needs to have a short-term application for which a small number of clients will be prepared to pay an enormous amount, so you can sell it while it still costs $300 million, as well as expecting to sell it when it costs 5 cents. Nanofibers, he said, hit this sweet spot because of two demands. The first is body armor, since it looks like nanofibers can create bullet-proof fabric as light as normal fabric, and if we can do that then governments will certainly pay an enormous amount to get bullet-proof clothing for a head of state and his/her bodyguards, and for elite military applications. The second is super-high-end lightweight golf clubs, which may seem like a frivolous thing, but there are people who will pay thousands of dollars for an extremely high end golf club, and that is something nanofibers can profit from even while expensive (super lightweight bicycles for racing also qualify). So nanofibers can depend on the excitement of the specific investors who want the expensive version now, and through their patronage develop toward the ability to produce things cheaply.
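[Editor’s aside: the pill arithmetic above can be made concrete with a toy sketch (the figures come only from the anecdote, not from any real pharmaceutical data). The average cost per unit is just the fixed research cost amortized over production volume, plus the marginal cost:

```python
def average_cost(rd_cost, marginal_cost, units):
    """Average cost per unit: fixed R&D amortized over volume, plus marginal cost."""
    return (rd_cost + marginal_cost * units) / units

RD = 300_000_000   # the "first pill": up-front research cost
MARGINAL = 0.05    # the "second pill": 5 cents to manufacture

for n in [1, 1_000, 1_000_000, 1_000_000_000]:
    print(f"{n:>13,} pills -> ${average_cost(RD, MARGINAL, n):,.2f} each")
```

The first unit effectively costs the whole research budget; by a billion units the research cost has nearly vanished into the 5-cent marginal cost. That gap between the first unit and the billionth is exactly what a short-term, high-paying client (body armor, luxury golf clubs) helps a young technology survive.]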
In this sense the history of the book, especially in the Renaissance, was very similar to the situation with nanofibers. In the early, manuscript stage when each new book cost the equivalent of $50,000 (very rough estimate), libraries were built and humanism was funded because wealthy people like Niccolò Niccoli and Cosimo de’ Medici believed that humanist libraries would give them and their home city political power and spiritual benefits, helping them toward Heaven. That convinced them to invest their millions. Their investments then created the libraries which could be used later on by larger populations, and reproduced cheaply through printing once it developed, but printing would not have developed if patrons like them weren’t around to create demand for the volume of books printing could produce. It took Petrarch, Niccoli and Cosimo to fund a library which could raise a generation of people who could read the classics before there was enough demand to sell the 300-1500 copies of a classical book that a printing press could print. And, working within current capitalism, it may take governments who really want bullet-proof suit jackets to give us our space elevator, though universities, NASA, and private patronage of civilian space programs are certainly also big factors pushing us forward.
In sum, I would say that economics sometimes sparks the generation of new ideas–as the economically-driven strife Petrarch experienced enabled the birth of humanism–but it also strongly affects how easily or quickly a new idea can disseminate, whether it gets patronage and support, or whether its champions have to spread it without the support of elites, patrons or government. Thus, in any given era, an intellectual historian needs to have a sense of funding patterns and patronage systems, so we can understand how ideas travel, where, and why.
One more thought from last night, or rather a test comparison showing how the concept “intellectual technology” can work. I was thinking about comparing atomism and steel.
Steel is a precursor for building skyscrapers. Despite urban demand, we didn’t get a transition to huge, towering metropoles until the development of good steel which could raise our towers of glittering glass. Of course, steel is not the ONLY precursor of the skyscraper–it also requires tempered glass, etc. And it isn’t the only way to build skyscrapers, you can use titanium, or nanotech, but you are very unlikely to get either of those things without going through steel first. Having steel does not guarantee that your society will have skyscrapers. Ancient Rome had steel. In the Middle Ages Europe lost it (though pretty much everywhere except Europe still had steel). When steel came back in the Renaissance it still didn’t lead immediately to skyscrapers, it required many other developments first, and steel had to combine with other things, including social changes (growth of big cities). But when we look at the history of city development, studying steel is extremely important because the advent of steel-frame construction is a very important phase, and a central enabling factor for the development of modern cities.
My Lucretius book looks at the relationship between atomism and atheism in the same way that this analysis looks at steel and skyscrapers. Atomism was around for a long time, went away, came back, etc. And you can have non-atomic atheism, we have lots of it now. But atomism, as the first fully-developed mechanical model of the working of Nature (the first not dependent on God/gods to make the world work) was, in my opinion, one of the factors that you needed to combine with other developments to reach a situation in which an intellectual could combine mechanical models of nature with skepticism with other factors to develop the first fully functional atheistic model of the world. It’s one of the big factors we have to trace to ask “Why did atheism become a major interlocutor in the history of thought when it did, and not before or after?” just as tracing steel helps us answer “Why did skyscrapers start being built when they did?” There had almost certainly been atheisms before and independent of atomism (just as you can make really tall things, like pyramids or cliff-face cities, without steel-frame construction) but it was rare, and didn’t have the infrastructural repeatability necessary to let it become widespread. Modern atheists don’t use Epicurus, they more frequently use Darwin, just as modern skyscrapers use titanium, but the history of skyscrapers becomes clear when we study the history of steel. Just so, the history of atheism becomes much clearer when we study atomism. Of course, we now use steel for lots of things that aren’t skyscrapers (satellite approaching Pluto!), and similarly atomism has lots of non-atheist applications, but we associate atomism a lot with atheism, just as we think a lot about “towers of glass and steel” and usually think less about the steel bolts in our chairs or the steel spoons we eat with. 
All applications of steel, or Epicureanism, can be worth studying, but skyscrapers/atheism will never stop being one of the biggest and most interesting, at least in terms of how they changed the face of our modern world. And finally, while a minority of buildings are skyscrapers, and a minority of contemporary people are atheists, the study of both is broadly useful because their presence in everyone’s lives is a defining factor in our current world.
Hello, patient friends. The delight of brilliant and eager students, the siren call of a new university library, the massing threat of conjoining deadlines, and the thousand micro-tasks of moving across the country have caused a very long gap between posts. But I have several pieces of good news to share today, as well as new thoughts on Machiavelli:
The next installment of my Sketches of a History of Skepticism series is 2/3 finished, and I hope to have it up in a week or three, deadlines permitting.
I have an excellent new assistant named Mack Muldofsky, who is helping me with Ex Urbe, music, research and many other projects. So we have him to thank in a big way if the speed of my posting picks up this summer.
Because I have a lot of deadlines this summer, I have asked some friends to contribute guest entries here, and we have a few planned treating science, literature and history, so that’s something we can look forward to together.
For those following my music, the Sundown Kickstarter is complete, and it is now possible to order online the CD and DVD of my Norse Myth song cycle Sundown: Whispers of Ragnarok. In addition to the discs, you can also order two posters, one of my space exploration anthem “Somebody Will” and one which is a detailed map of the Norse mythological cosmos. CD sales go to supporting the costs of traveling to concerts.
I have several concerts and public events lined up for the summer:
At Mythcon (July 31-Aug 2), Lauren Schiller and I, performing as the duo “Sassafrass: Trickster and King,” will join Guest of Honor Jo Walton for “Norse Hour,” in which she will read Norse myth-themed poetry in alternation with our Norse-themed songs.
Sunday August 9th, I have been invited to do a reading of the freshly-polished opening chapters of my novel Too Like the Lightning (due out in Summer 2016) at the Tiptree Award Ceremony event honoring Jo Walton, who couldn’t make it to the initial ceremony but received the Tiptree this year for her novel My Real Children. The event is being held at Borderlands in San Francisco at 3 PM, and will feature readings by local authors, and music performed by myself and Lauren.
Monday August 17th, at 7 PM, I am joining Jo and Lauren again at Powell’s, where Jo will read from her books, Lauren and I will sing, and I will interview Jo and talk about my writing as well as hers.
Finally at Sasquan (Worldcon, Aug 19-23) Lauren and I will have a full concert, I will do another reading from Dogs of Peace, and I will be on several exciting panels.
Meanwhile, I have a little something to share here. I continue to receive frequent responses to my Machiavelli series, and recently one of them sparked such an interesting conversation in e-mail that I wanted to post it here, for others to enjoy and respond to. These are very raw thoughts, and I hope the discussion will gain more participants here in the comment thread (I have trimmed out parts not relevant to the discussion):
In this discussion, I use a term I often use when trying to introduce intellectual history as a concept, and which I have been meaning to write about here for some time, “Intellectual Technology.”
A little conversation about Machiavelli:
I have been reading your blog posts on Machiavelli. You write with tremendous learning, clarity and colour, and really bring past events alive in a brilliant way. But… I think you’re far too soft on Machiavelli!!!
I’m working on a PhD about him and it’s fascinating to see that nearly all present-day academics, and indeed academics during much of the second half of the 20th century, have a largely if not completely uncritical admiration for him and his works. He is lauded, for example, as a forerunner of pluralism and a supporter of republicanism/democracy, yet his clear inspiration of Italian fascism is almost completely overlooked. The fact that Gramsci revered Machiavelli is dealt with by many scholars, but Mussolini’s admiration for him is hurriedly passed over.
Your post on Machiavelli and atheism is really interesting – in that context the 2013 book Machiavelli by Robert Black would be of interest to you…
Best regards, Michael Sanfey, IEP/UCP Lisbon.
Reply from Ada:
Michael, thank you for writing in to express your enjoyment of my blog posts. I think your criticisms of Machiavelli are interesting and largely fair, and my own opinions overlap with yours in many ways, though not in others. I agree with you completely that many scholars tend to praise Machiavelli inappropriately as a proto-modern champion of democracy, republicanism, pluralism, modern national pride, etc., characterizations which are deeply presentist, reading anachronistic values back into him. But there is also a tendency, dominant earlier in the 20th century, to vilify Machiavelli too much in precisely the same anachronistic and presentist way, characterizing him as a fascist or a Nazi and reading back into his work the things that were done in the 20th century by people who used some of his ideas but mixed them with many others. My way of approaching Machiavelli focuses above all on trying to distance him from the present and place him in his context, to show that he is neither a modern hero nor a modern villain since he isn’t modern at all. A separate question, which you bring up, is how much to blame him or criticize him for opening up the direction of reasoning which led to later consequentialism, and also to fascism, which certainly used him as one of its foundational texts. Here I find myself uncomfortable with the idea of historical blame at all, particularly when it’s blame over such a long span of time.
I tend to think of thinkers as toolmakers, or inventors of “intellectual technology”, innovators who have created a new thing which can then be used by many people. New inventions can be used in many ways, anticipatable and unanticipatable. Carbon steel, for example, can be used to raise great towers and send train lines across continents, but it can also be used to build weapons and take lives, so it is a complex question how much to blame the inventor of carbon steel for its many uses. In this sense, I do believe we can see Machiavelli as a weapon-maker, since the ideas he was generating were directly intended to be used in war and politics. We can compare him very directly to the inventor of gunpowder in this sense. I also see him–and this is much of the heart of my critique–as a defensive weapon maker, i.e. someone working in a period of danger and siege, trying to create something with which to defend his homeland. So, imagine now the inventor of gunpowder creating it to defend his homeland from an invasion. Is he responsible for all later uses of gunpowder as well? Is he guilty of criminal negligence for not thinking through the fact that, long-term, many more people will be killed by his invention than live in his home town? Do the lives saved by gunpowder throughout its history balance out against the lives taken by it, in some kind of (Machiavellian/consequentialist) moral calculus? I don’t think “yes” or “no” are fair answers to such a complex question, but I do think it is important, when we think about Machiavelli and what to hold him responsible for, to remember the circumstances in which he created his gunpowder (i.e. consequentialist ethics), and that he invented other great things too, like political science and critical historical reasoning. The debts are complicated, as is the culpability for how inventions are used after the inventor’s death.
So while I join you wholeheartedly in wanting to fight back against the distortion of Machiavelli the Mythical proto-modern Republican, I also think it’s valuable to battle against the myth of Machiavelli the proto-Fascist, and try to create a portrait of the real man as I see him, Machiavelli the frightened Florentine.
I do know Bob Black’s Machiavelli book, but disagree with some of his fundamental ideas about humanism itself – another fun topic, and one I enjoy discussing with him at conferences. He’s a challenging interlocutor. There is a very good recent paper by James Hankins on Academia.edu now about the “Virtue Politics” of humanists, which I recommend that you look at if you’re interested in responses to Black.
Best, Ada Palmer, University of Chicago
More from Michael:
First, I want to thank you for this fantastically detailed and brilliant response… I’d like to “come back at you” on consequentialism and some other points:
* Regarding your point about Machiavelli not being modern at all, I see what you mean, although you do say of Machiavelli in the post on atheism that “he is in other ways so very modern”. Leo Strauss certainly thought he had a lot to do with the introduction of what we know as “modernity”.
* When you seek to balance the need to fight against the Proto-republican myth and against the Proto-fascist myth, the first of those “myths” enjoys immeasurably wider currency than the second, and I ask myself, why is this?
* On the “intellectual technology” point below, and its being essentially neutral, in this case I wouldn’t agree with you, because we are not talking here about an object like gunpowder; it actually concerns something much more important. In ethical terms, Machiavelli took transcendent values out of the equation. As you put it, Machiavelli created “an ethics which works without God” – except that it doesn’t work!!!
* Machiavelli has had a questionable impact in regard to “realism” in International relations. You mention in one of the posts that he backed an alliance with Borgia so as to protect Florence, agreeing to offer money and resources to help Borgia conquer more – a very good example of Machiavelli‘s undoubted sympathy for imperialism.
PPS On the question of Machiavelli being an atheist or not, I really was fascinated by that part of your Ex Urbe writings. I’ve concluded that, whatever about him being an atheist or not, one could certainly describe him as “ungodly” would you agree?
Quick response from Ada:
I think “ungodly” does work for Machiavelli depending on how you define it; it has a connotation of being immoral–which does not fit–but if instead you mean it literally as someone who makes his calculations without thinking much about the divine then it fits.
A supplementary comment on “Intellectual Technology”:
I find “intellectual technology” a very useful concept when I try to describe what I study. Broadly my work is “intellectual history” or “the history of ideas” but what I actually study is a bit more specific: how particular kinds of ideas come into existence, disseminate, and come to be regulated at different points in time. The types of ideas I investigate–atomism, determinism, utilitarianism–move through human culture very much the same way technological innovations do. They come into being in a specific place and time, as a result of a single inventor or collaboration. They spread from that point, but their spread is neither inevitable nor simple. Sometimes they are invented separately by independent people in independent places, and sometimes they exist for centuries before having a substantial impact. When a new idea enters a place and comes into common use, it completely changes the situation and makes actions or institutions which worked before no longer viable. I compare Machiavelli’s utilitarianism to gunpowder above, but here are some other examples of famous cases of technological inventions, and ideas which disseminated in similar patterns:
The Bicycle and Atomism
Leonardo da Vinci sketched a design for a bicycle in the Renaissance, and may have seriously tried to construct one, but afterward no one did so for a very long time. Then many other factors changed: the availability of rubber and lightweight strong metals, the growth of large, centralized cities and a working population in need of inexpensive transit, and suddenly the bicycle was able to combine with these other factors to revolutionize life and society in a huge rush, first across Europe and then well beyond. We have since moved on to develop more complex technologies that achieve the same function, but we still use the bicycle and develop it further; and even where we don’t, cities would not have the shapes they do now without it, and it is still transforming parts of the world it has touched more slowly. Similarly, atomism was developed and used for a little while, then languished in notebooks for a long time, before combining with the right factors to spread and rapidly transform society and culture.
The Unity of All Life and Calculus
Newton and Leibniz developed calculus independently at the same time. Similarly, both classical Stoicism in Greece and Buddhism in India roughly simultaneously and independently, as far as we can tell, developed the idea that all living things–humans, insects, ancients, people not yet born–are, in fact, parts of one contiguous, interconnected, sacred living thing. This enormously rich and complex concept had a huge number of applications in each society, but seems to have been independently developed to meet the demands for metaphysical and emotional answers of societies at remarkably similar developmental stages. The circumstances were right, and the ideas then went on to be applied in vastly different but still similar ways.
Feminism and the Aztec Wheel
For a long time we thought the Aztecs didn’t have the wheel. More recently we discovered that they had children’s toys which used the wheel, but never developed it beyond that. Which means someone thought of it, and it disseminated a bit and was used in a very narrow way, but not developed further, because what we think of as more “advanced” or “industrial” applications (wagon, wheelbarrow) just weren’t compatible with the Aztec world. That world was incredibly hilly and lacked the elaborate road system Europe developed, relying instead on human legs, stairs, and raw terrain, which were sufficient to let it develop a robust and complex economy and empire of its own; the wheel became more useful in the Americas when European-style city plans and roads were built. Similarly, Plato voiced feminism in his Republic, arguing that women and men were fundamentally interchangeable if educated the same way, and people who read the Republic discussed it as a theory among the book’s many other elements, but didn’t develop it further. This too, I would argue, was at least in part because the economic and social structures of the classical world depended on the gendered division of labor, particularly for the production of thread in the absence of advanced spinning technology. That is why literally all women in Rome spent enormous amounts of time spinning; spinning quotas were sometimes even required by law of prostitutes, since if a substantial sliver of the female population were employed without spinning, Rome would run out of cloth. Feminism was better able to become revolutionary in Europe when (among other changes) industrialization reduced the number of hours required for the maintenance of a household and the production of cloth, making it more practical to redirect female labor, and to question why it had been locked into that in the first place.
In sum, there is a concreteness to the ideas whose movements I study, a distinct and recognizable traceability. Interpretive analyses, comparative, subjective analyses, analyses of technique, aesthetics, authorial intent, authenticity, such analyses are excellent, but they aren’t intellectual history as I practice and teach it. I trace intellectual technology. Just as the gun, or carbon steel, or the moldboard plow came in at a particular time and had an impact, I study particular ideas whose dissemination changed what it was possible for human beings to do, and what shapes human society can be. It is meaningful to talk about being at an “intellectual tech level” or at least about being pre- or post- a particular piece of intellectual technology (progress, utilitarianism, the scientific method) just as much as we can talk about being pre- or post-computer, gunpowder, or bronze. Such things cannot be un-invented once they disseminate through a society, though some societies regulate or restrict them, and they can be lost, or spend a long time hidden, or undeveloped. Elites often have a legal or practical monopoly on some (intellectual) technologies, but nothing can stop things from sometimes getting into the hands or minds of the poor or the oppressed. Sometimes historians are sure a piece of (intellectual) technology was present because we have direct records of it: a surviving example, a reference, a drawing, something which was obviously made with it. Other times we have only secondary evidence (they were farming X crop which, as far as we know, probably requires the moldboard plow; they described a strange kind of unknown weapon which we think means gun; they were discussing heretics of a particular sort which seems to have involved denial of Providence).
I realize that it would be easy to read my use of “intellectual technology” as an attempt to climb on the pro-science-and-engineering bandwagon, presenting intellectual history as quasi-hard-science, much as we joke that if poets started calling themselves “syllabic engineers” they would suddenly be paid more. But it isn’t a term I’m advocating as a label, necessarily. It’s a term I use for thinking, a semantic tool for describing the specific type of idea history I practice, and linking together my different interests into a coherent whole. When I spell out what I’m working on right now as an historian, it’s actually a rather incoherent list: “the history of atheism, atomic science, skepticism, Platonic and Stoic theology, soul theory, homosexuality, theodicy, witchcraft, gender construction, saints and heavenly politics, Viking metaphysics, the Inquisition, utilitarianism, humanist self-fashioning, and what Renaissance people imagined ancient Rome was like. And if you give me an hour, I can sort-of explain what those things have to do with each other.” Or I can say, “I study how particularly controversial pieces of new intellectual technology come into being and spread over time.”
In that light, then, we can think of Machiavelli as the inventor of a piece of intellectual technology, or rather of several pieces of intellectual technology, since consequential ethics is one, but his new method of historical analysis (political science) is another. We might compare him to someone who invented both the gun and the calculator. How do we feel about that contribution? Positive? Negative? Critical? Celebratory? I think the only universal answer is: we feel strongly.
On the one hand, I have been looking forward for ages to reading and then writing something about “The Litany of Earth,” an amazing novelette by Ruthanna Emrys, acquired for Tor.com by editor Carl Engle-Laird. But on the other hand I personally usually dislike reading reviews, at least traditional reviews of things I have already decided to read. When a reviewer tells me about what I’m going to experience and what excellent things the author is going to do, it disrupts the reading process for me, makes the things mentioned in the review stand out too boldly, interfering with the craftsmanship of a good story in which the author has taken great pains to give each beat just the right amount of emphasis, no more, no less. The memory of the review in my mind makes it like a used book which someone has gone through with highlighter, which can be fascinating as a window on a fellow reader, and delightful for a reread, but it isn’t what I want on first meeting a new text, which in my ideal world consists of me, the reader, placing myself wholly and directly in the hands of the author, with the editor’s touch there too to help spot us along the way. I do not need a co-pilot. And it is more of a problem, for me at least, with short fiction than with long fiction, since the review could be half as long as the story and carry nearly as much weight as the story itself. So, today I have set myself the challenge of writing a review, or non-review, of “The Litany of Earth” that isn’t a co-pilot, or a highlighter, and does as much as possible to get across the story’s strengths and the power of the reading experience while doing my best not to change the relative weight of anything in the story, make anything jump out too boldly, leaving the craftsmanship as untouched as it can be.
I have a six-step plan. (Personal rule: anything with three or more steps counts as a plan. Also, “Profit” is not a step, it’s an outcome, and does not count toward your total of three.)
Recommend you go read “The Litany of Earth” now before I can spoil anything.
Talk amorphously about things the story is doing with structure and world-canon, talking more concretely about a few other pieces of fiction that have done somewhat similar things.
Ramble about Petrarch.
Ramble about Diderot. Dear, dear Diderot…
Urge you to read “The Litany of Earth” again, last chance before I get out my highlighter.
Talk about “The Litany of Earth” directly.
Step One: I strongly recommend that you go read “The Litany of Earth” right now. It’s free online, and if you read it now you won’t be stuck with an intrusive co-pilot even if I do fail in today’s challenge of writing a non-review.
Step Two: Talk amorphously, and compare the story to other works of fiction.
One of the unique literary assets of current fiction is the proliferation of familiar but elaborate and thoroughly developed fictional worlds which authors can step into and use for new purposes. There have always been such worlds as long as there has been literature. Arthuriana is my favorite pre-modern example, a complex and well-populated world rich with explorable relationships and flexible metaphysics ready to be elaborated upon and repurposed. Geoffrey of Monmouth and Thomas Malory and Petrarch and Ariosto and the traditional artists in Naples who decorated (and still decorate) street vendor wagons with Arthur’s knights each repurposed Arthuriana just like Marion Zimmer Bradley and Monty Python and Gargoyles and Heather Dale and Babylon 5 and the endlessly hilarious antics of the BBC’s Merlin. Each of the later authors in the genealogy has taken advantage not only of the plot, setting, and characters but of readers’ genre expectations.
In the early 1500s when Ariosto began his chivalric and slightly-Arthurian verse epic Orlando Furioso he took advantage of the fact that readers already associated the topic with epic works and grand tourneys and knights and ladies and courtly-love adultery, baggage which let him write a massive and endless rambling snarl of disjointed and fantastic adventurousness so unwieldy that traditional epic structure is to Orlando Furioso as a sturdy rope is to the unassailable rat’s nest of broken headphones and cables for forgotten electronics that I just fished out of this bottom drawer. No reader, not even in 1516, would put up with it without the promise of Arthurian grandeur to make its massive scale feel appropriate. (I will also argue that the BBC Merlin, for all its tomatoes and giant scorpions, has not actually done anything quite so unreasonable as the point when Ariosto has “Saint Merlin” rise from his tomb to deliver an endless rambling prophecy about how awesome Ariosto’s boss Ippolito d’Este is going to be. Fan service long predates the printing press.) In a more recent continuation of this tradition, modern Arthurian adaptations have given us the previously-silenced P.O.V.s of women, of villains, of third-tier characters, and in some sense it’s quite modern to think about P.O.V. at all. But even very old adaptations take advantage of how not just setting but genre is an asset usable to get the reader to follow the author to places a reader might not normally be willing to go. And, of course, in more recent versions authors have taken advantage of exploring silenced P.O.V.s to critique earlier Arthurian works and their blind spots, as a way of reaching the broader blindnesses and silencings of the past stages of our own society that birthed these worlds.
“Is ‘The Litany of Earth’ Arthuriana?” you may wonder. No. It uses a different mythos. I bring up Arthuriana in order to remind you of the many great things you’ve seen humans create by using and reusing a familiar collective fiction, and in order to reinforce my earlier claim that one of the great assets of current fiction is that we have many, many such worlds. If pre-modern Earth had several dozen rich, lively, reusable mythoi and epic settings, the 20th century has added many, many more in which good (and campy) things have and can be done. Star Trek, Sherlock Holmes, Gundam, the massive united comics universes of Marvel and DC, these each provide as much complexity and material for reuse and reframing as the richest ancient epics, more if, for example, you compare the countless thousands of pages of surviving X-Men to the fragile little Penguin Classics collections of Eddas and fragmentary sagas which preserve what little we still have of the Norse mythic cosmos. Marvel’s universe, and DC’s too, have a fuller population and a more elaborate and eventful history than any mythos we have inherited from antiquity, and my own facetious in-character reviews of the Marvel movies are but the shallowest tip of what can be done with it.
The specific case of this kind of rich reuse whose parallels to “The Litany of Earth” are what brought me down this line of analysis comes from the Marvel comics megaverse, the unique and skinny stand-alone Marvels, by Kurt Busiek, illustrated by Alex Ross. What it does with the narrative possibilities of the Marvel universe is very much worth looking at even if one doesn’t care a jot about comics.
Described from the outside and ignoring, for a moment, that these are comic books, the Marvel universe presents us with an Earth-like alternate history in which disasters–supernatural, alien, primordial, divine–have repeatedly threatened Earth, the universe, and, most often, New York City with certain destruction. These have been repeatedly repelled by superheroes, somewhat human somewhat not, and the P.O.V. from which we the reader have always viewed these events has been as one of the superpeople at the heart of the battle, deeply enmeshed in the passionate immediacy of the short-term drama, nemeses, kidnappings, personal backstory, and who’s dead lately. Only rarely have we had works that gave us a longer perspective over time, reflecting personal change, evolving perspectives, how being constantly enmeshed in superbusiness makes a person develop and self-reflect, though notably the works that have done so have been among superhero comics’ shining stars (Dark Knight Returns, Red Son, Watchmen).
Marvels instead offers a long-term and distanced P.O.V., that of a photographer who lives in New York City and, during his path from rookie to retirement, experiences in order the great, visible cataclysms that have repeatedly shaken Marvel’s Earth. His perspective gives historicity, sentiment, reflection and above all realism to Marvel, using it as alternate history rather than an action setting. The effect is powerful, beautiful and highly recommended for the way it weaves the richness of Marvel’s setting together with good writing to create a truly valuable work of literature. But it also reverses an interesting silencing which has been present in the back of Marvel, and superhero comics, since their inception: the silencing of the Public.
Very much like the women in early versions of Arthuriana, the Public in Marvel (and DC) has not been an agent in itself, but an object to motivate the hero. The Public exists to be rescued, protected, placated, evaded, sometimes feared. The Public has cheered P.O.V. heroes, hounded them, betrayed them, threatened them with pitchforks and torches, somehow being tricked over and over again into doubting the heroes even after the last seventeen times they were exonerated. The Marvel Public specifically also persistently hates and fears the X-Men and other mutants despite being saved by them sixteen jillion times, and somehow hates and fears the other heroes less even though many of them are aliens or science freaks or robots or other things just as weird as mutants. It is a tool of the author, manipulated by villains, oppressing misfits, causing tension, but virtually never is the reader asked to empathize with the Public. The object of empathy is the hero, or occasionally the villain, but the reader is never supposed to identify with or even think about the emotions of the screaming and yet simultaneously silenced mob. Marvels gives us, at last, the point of view of that mob, or at least one member of it, directing our self-identification and above all our empathy for the first time to something which has been hitherto faceless.
The effect is rather like a stroll through the Uffizi enjoying endless scenes of exciting saints surrounded by choruses of beautiful angels and then hitting the Botticelli room where each angel has a distinctive face and personality and you find yourself wondering what that angel is thinking when it watches Mary come to heaven to be crowned its queen, or sings music for young John the Baptist whose grisly end and subsequent heavenly ascension the angel already knows. Only when Botticelli invites you to see the angels as individuals do you realize that no earlier painting ever did. They had a failure of empathy. They were still beautiful, but here is a rich new direction for empathy which no earlier work has asked us to consider, and which opens up a huge arena we had ignored. Women in Arthuriana; the Public in Marvel; the angels that stand around in paintings of saints.
In just the same way, “The Litany of Earth” uses empathy and P.O.V. to open rich new arenas in one of our other well-known modern fictional settings. And the setting it uses has a fundamental and very problematic failure of empathy rooted deep in its foundations, so addressing that head-on opens a very potent door.
And since I can feel the urge to talk about Naoki Urasawa’s Pluto becoming harder to resist, I believe it is now time to nip that in the bud by moving on to the next stage of my plan.
Step Three: Ramble about Petrarch.
Picture Petrarch in his library, holding his Homer. He has just received it, and turns the stiff vellum pages slowly, his fingertips brushing the precious verses that he has dreamed of since his boyhood. The Iliad in his hands. His friends have always whispered to him of the genius that was Homer, his real friends, not the shortsighted fools he grew up with in Avignon, arrogant Frenchmen and slavish Italians like his parents who followed the papacy and its trail of gold even when France snatched it away from Rome. His real friends are long-dead Romans: Cicero, Seneca, Caesar, men like him who love learning, love virtue, love literature, love Rome and Italy enough to fight and give their lives for it, love truth and excellence enough to write of it with passion and powerful words that sting the reader into wanting to become a better person.
Petrarch was born in exile. Not just the geographic exile of his family from their Florentine homeland, no, something deeper. An exile in time. This world has no one he can relate to, no one whose thoughts are shaped like his, who walks the Roman roads and feels the flowing currents of the Empire, whose understanding of the world connects from Egypt up to Britain without being blinded by ephemeral borders, who can name the Muses and knows how truly rich it is to taste the arts of all nine, and how truly poor one is without. Antiquity was his native time, he knows it, but antiquity was cut off too early–he was born too late. His friends are dead, but their voices live, a few, in chunks, in the books in distant libraries which he has spent his life and fortune gathering. His library. Each volume a new shard of a missing friend, those few, battered whispers of ancient voices which survived the Medieval cataclysm that consumed so much. And now, after hearing so many of his friends speak of Homer, call him the Prince of Poets, the climax of all art and literature, divine epic, the centerpiece of all the ancient world, he has it in his hands. It survived. Homer. In Greek. And he can’t read it. Not a word of it. Greek is gone. No one can read it anymore, no one. Homer. He has it in his hand, but he can’t read it, and for all he knows no one ever will again.
This historical moment, Petrarch with his Homer, is one of the most poignant I have ever met in my scholarship. A portrait of discontinuity. The pain when the chain of cultural transmission, of old hands grasping young, that should connect past, present and future is cut off. The cataclysm doesn’t have to be complete to be enough to disrupt, to silence, to jumble, to leave too little, Greek without Homer, Homer without Greek. Petrarch is a Roman. They all are, he and his Renaissance Italians, they have the blood of the Romans, the lands of the Romans, the ruins of the Romans, but not enough for Petrarch to ever really have the life he might have had if he’d been born in the generation after Cicero, and with his Homer in his hands he knows it.
Petrarch did his best. He spent his life collecting the books of the ancients, trying to reassemble the Library of Alexandria, the pinnacle, he knew, of the culture and education which had made the Romans who had made his world. He found many shards, eventually enough that it took more than ten mules to carry his library when he journeyed from city to city. He journeyed much, working everywhere with voice and pen to convince others to share his passion for antiquity, to read the ancients that could be read, Cicero, Seneca, to learn to think as they did and to try to push this world to be Roman again, which for him meant peaceful, broad-reaching, stable, cultured and strong. People listened, and we have the libraries and cathedrals and Michelangelos they made in answer. And Petrarch never gave up on Homer either, but searched the far corners of the Earth for someone with a hint of Greek and eventually, late in life, did find someone to make a jumbled, fragmentary translation, nothing close to what a second-year-Greek student could produce today let alone a fluid translation, but a taste. By late in life he had his New Library of Alexandria, and real hope that it might rear new Romans.
Petrarch wanted to give the library to Florence, to help his homeland make itself the new Rome, but Florence was too caught up with its own faction fighting for anyone to stably take it. Venice was the taker in the end, and he hoped his library would make the great port city like the Alexandria of old, the hub where all books came, and multiplied, and spread. Venice put Petrarch’s library in a humid warehouse and let it rot. We lost it. We lost it again. We lost it the first time because of Vandals and corrupt emperors and economic transformation and plague and all the other factors that conspired to make the Roman Empire decline and fall, but we lost it the second time because Venice is humid and no one cared enough to devote space and expense to a library, even the famous collection of the famous Petrarch. Such a tiny cataclysm, but enough to make discontinuity again. We have learned better since. Petrarch had followers who formed new libraries, Poggio, Niccolo, they repeated Petrarch’s effort, finding books. Eventually princes and governments realized there was power in knowledge. Venice built the Marciana library right at the main landing, so when foreigners arrive in St. Mark’s square they are surrounded by the three facets of power, State in the Doge’s Palace, Church in the Basilica, and Knowledge in the Library. And now we have our Penguin Classics. But we don’t have Petrarch’s library, and we know he had things that were rare, originals, transcriptions of things later lost. There are ancients who made it as far as Petrarch, all the way to the late 1300s, through Vandals, Mongols and the Black Death, before we lost them to one short-sighted disaster. Discontinuity. We have Homer. We don’t know what Petrarch had that we don’t.
This was one of two historical vignettes that came vividly before my mind while I was reading “The Litany of Earth.” The second is…
Step Four: Ramble about Diderot. Dear, dear Diderot…
I must be very careful here. Even though my focus is Renaissance and my native habitat F&SF, Denis Diderot remains my favorite author. Period. My favorite in the history of words. So it is very easy for me to linger too long. But I invoke him today for a very specific reason and shall confine myself strictly to one circumscribed subtopic, however hard the copy of Rameau’s Nephew on my desk stares back.
Three quarters of the way through my survey course on the history of Western thought, I start a lecture by declaring that the Enlightenment Encyclopedia project was the single noblest undertaking in the history of human civilization. I say it because of the defiant, “bring it on!” glances I instantly get from the students, who switch at once from passive listening to critical judgment as they arm themselves with the noblest human undertakings they can think of, and gear up to see if I can follow through on my bold boast. I want that. I want their minds to be full of the Moon Landing, and the Spartans at Thermopylae, and Gandhi, and the US Declaration of Independence, and Mother Teresa, and the Polynesians who braved the infinite Pacific in their tiny log boats; I want it all in their minds’ eyes as I begin.
The Encyclopédie was the life’s work of a century on fire. The newborn concept Progress had taken flight, convincing France and Europe that the human species has the power to change the world instead of just enduring it, that we can fight back against disease, and cold, and mountain crags, and famine cycles, and time, and make each generation’s experience on this Earth a little better. The lion has its claws and strength, the serpent fangs and stealth, the great whales the force of the leviathan, but humans have Reason, and empiricism, and language to let us collaborate, discuss, examine, challenge, and form communities of scientists and thinkers who, like the honeybee, will gather the best fruits of nature and, processing them with our own inborn gifts, produce something good and sweet and useful for the world. The tone here is Francis Bacon’s, but Voltaire popularized it, and by now the fresh passion for collaboration and improvement of the human world had already birthed Descartes’ mathematics, Newton’s optics, Locke’s inalienable rights, calculus, and the Latitudinarian movements toward rational religion which seemed they might finally soothe away the wars that lingered from the Reformation. Everything could be improved if keen minds applied reason to it, from treatments for smallpox which could be preventative instead of palliative, to Europe’s law codes which were not rational constructions but mongrel accumulations of tradition and centuries-old legislation passed during half-forgotten crises and old power struggles whose purpose died with the clans and dynasties that made them but which still had the power to condemn a feeling, thinking person to torture and death.
The Encyclopédie had many purposes. Perhaps the least ambitious was to turn every citizen of Earth into a honeybee. Plato had said that only a tiny sliver of human souls were truly guided by reason–able to become Philosopher Kings–while the vast majority were inexorably dominated by base appetites, the daily dose of food and rest and lust, or by the wild but selfish passions of ambition and pride. For two millennia all had agreed, and even when the Renaissance boasted that human souls could rival angels in dignity and glory through the light of learning and the power of Reason, they meant the souls of a tiny, literate elite. But in 1689 John Locke had argued that humans are born blank slates, and nurture rather than an innate disposition of the soul separated young Newton from his father’s stable boy. The Encyclopédie set out to enable universal education, to collect basic knowledge of all subjects in a form accessible to every literate person, and to their illiterate friends who crowded around to hear new chapters read aloud in the heady excitement of its first release. With such an education, everyone could be a honeybee of Progress, and exponential acceleration in discovery and social improvement would birth a better world. So overwhelming was public demand that Europe ran out of paper, of printer’s ink, even ran out of the types of metal needed to make printing presses, so many new print shops appeared to plagiarize and print and sell more and more copies of the book which promised such a future (See F. A. Kafker, “The Recruitment of the Encyclopedists”).
Yet Diderot and his compatriots had another goal which shows itself in the structure of the Encyclopédie as well as in its bold opening essay. The second half of the 17-volume series is devoted to visual material, a series of beautiful and immensely complicated technical plates which illustrate technology and science. How to fire china dishes, smelt ore, weave rope, irrigate fields, construct ships, calculate distance, catalog fossils and decorate carriages, all are illustrated in loving detail, with diagrams of every tool and its use, every factory and its layout, every human body at work in some complex motion necessary to turn cotton into cloth or rag into precious paper. With this half of the Encyclopédie it is possible to teach oneself every technological achievement of the age. The first half was intended to provide the same for thought. With its essays it should be possible to understand from their roots the philosophies, ethical systems, law codes, customs, religions, great thinkers of the past and present, all aspects of life and the history of humankind’s evolving mental world. It is a snapshot. A time capsule. With this–Diderot smiles thinking it–with this, if a new Dark Age fell upon humanity and but a single copy of the Encyclopédie survived, it would be possible to reconstruct all human progress. With this, the great steps forward, the hard-earned produce of so many lives, the Spartans at Thermopylae, the Polynesian log boats, will be safe forever. We can’t fall back into the dark again. With this, human achievement is immortal. Yes, Petrarch, it even details how to read, and print, and translate Greek.
Let’s linger on that thought a moment. A beautiful, unifying, optimistic, safe, human moment, warm, like when I first heard that, yes, eventually Petrarch did get to read a sliver of his Homer. Because I’m not going to keep talking about dear Diderot today, much as I would like to.
In 2012/13 we lost 170,000 volumes from the Egyptian Scientific Institute in Cairo to the revolution, 20,000 unique manuscripts in the Timbuktu library to a militia fire, and we have barely begun to count the masses of original scientific material burned during a corrupt, botched cost-saving effort to reduce the size of the Libraries of Fisheries and Oceans of Canada. More than half of the entries on Wikipedia’s list of destroyed libraries were destroyed after the printing of the Encyclopédie, and the libraries on the list are only a minuscule fraction of the texts lost to disasters, natural and manmade. It doesn’t even list Petrarch’s library, let alone the unique contents of the personal libraries and works that accumulate in every house now that we’re all honeybees. Diderot tried so hard to make it all immortal. He tried so hard he used up all the ink and paper in the world. Yet if my numbers for printing history are right, in the past half century we have destroyed more written material than had been produced in the cumulative history of the Earth up until Diderot’s day. And that does not count World Wars. We’re getting better. On February 14th, 2014, a fire at the British National Archives, threatening thousands of documents, many centuries old, was successfully quenched with no damage to the collection, thanks substantially to advances in our understandings of fluids and pressure made in the 17th and 18th centuries and neatly explained by the Encyclopédie. That much is indeed immortal (thank you, Diderot!) but much is so very far from everything. It’s still so easy to make mistakes.
One of the most powerful mistakes, for me, is this cenotaph monument of Diderot, in the Pantheon in Paris, celebrating his contributions and how the Encyclopedia and enlightenment enabled so much of the liberty and rights and change that defines our era. Voltaire’s tomb was moved to the Pantheon, Rousseau’s too, but for Diderot there is only this empty cenotaph. I went on a little pilgrimage once to visit Diderot in the out-of-the-way Church of Saint-Roch, where he was buried. There is no tomb to visit. During the French Revolution, Saint-Roch was attacked and mostly destroyed by revolutionaries (carrying banners with Encyclopedist slogans on them!) who, in their zeal to torch the old regime, forgot that their own Diderot was among the Catholic trappings they could only see as symbols of oppression. Once rage and zeal had died down Paris and all France much lamented the mistake, and many others, too late.
Did I mention we very nearly lost Diderot’s work too? A far more frightening loss than just his body. Diderot didn’t include himself, his own precious original intellectual contributions, in his Encyclopédie. He knew he couldn’t. He was an atheist, you see. A real one, not one of these people we suspect like Hobbes and Machiavelli, but an overt atheist who wrote powerful, deeply speculative books trying to hash out the first moral system without divinity in it, fledgling works of an intellectual tradition which was just then being born, since even a few decades earlier no one had dared set pen to paper, for fear of social exile and the ready fire and steel of Church and law. But Diderot didn’t publish his own works, not even anonymously. He self-censored. He was the figurehead of the Encyclopédie. An atheist was too frightening back then, too strange, too other. If people had known an atheist was part of it, the project would have been dead in the water. Diderot left instructions for future generations to print his works someday, if the manuscripts survived, but gambling with his own legacy was a price he was willing to pay to immortalize everyone else’s. The surviving manuscript of Rameau’s Nephew in Diderot’s own hand turned up by chance at a used bookshop in 1823, one chance street fire away from silence.
Step Five: Urge you to read “The Litany of Earth” again, last chance before I get out my highlighter. You get points if you read it before getting this far. It’s free on Tor.com, but if you really liked it you can also buy the ebook for a dollar, and give money to Ruthanna and to Tor, and tell them you like excellent original fiction that does brave things with race and historicity.
Step Six: Talk about “The Litany of Earth” directly.
This is a Cthulhu Mythos story which is in no way horror. The richly designed, populated metaphysics and macrohistorical narrative of Lovecraft’s universe are here, but as a tool for reflection on society and self, with a narrative that bears no resemblance to the classic tense and chilling horror short stories I (for some reason) enjoy as bedtime reading. Ruthanna Emrys uses Lovecraft’s world to comment on Lovecraft’s writing and the deeply ingrained sexism and especially racism that saturates it, repurposing that into a tool to make us think more about the effects of silencing and othering which Lovecraft used his skill and craftsmanship to lure us into participating in. But the message and questions are universal enough that the target audience is not Lovecraft readers or horror readers but any reader who has even a vague distant awareness that the Lovecraft Mythos is a thing, as one has a vague distant awareness of Celtic or Navajo mythology even if one doesn’t study them. If there is any horror in this story, it is the familiar reality that the things we make and do and are are perishable, that human action often worsens that, and that at the end of all our aeons and equations we face entropy. But rather than presuming (as Lovecraft and much horror does) that facing that will lead to mad cackling and gibberish, the story presents the real things we do to try to face that: spirituality, cultural identity, and the effort to preserve the past and transmit it to the future. It turns a setting which was created as a vehicle for horror into a vehicle for social commentary and historical reflection.
I suppose I should directly address Lovecraft’s failures of empathy, for those less familiar with his work, or who have met it mainly through its fun, recent iterations in board games and reuses which strive to leave behind the baggage. Racism, sexism, classism and other uncomfortable attitudes are not unexpected in an author who lived from 1890 to 1937. We encounter unpalatable depictions of people of color, and equally unpalatable valorizations of entrenched elites, in most literature of the period, from M. R. James to the original Sherlock Holmes. In Lovecraft’s case, the challenge for those who want to continue to work with his universe is that many of the racist and classist elements are worked deeply into the fabric of his worldbuilding. Many of his frightening inhuman races are clearly used to explore his fear of racial minorities, while the keys to battling evil are reserved for elites, like the affluent, white, male scholars who control his libraries, and the Great Race which controls the greatest library.
While many attempts to rehabilitate and use Lovecraft’s world do so by excising these elements, or minimizing them, or balancing them out by letting you play ethnically diverse characters in a Lovecraft game, this story instead uses those very elements as weapons against the kinds of attitudes which birthed them. If the scary fish-people represent a demonized racial “other” then let them remain exactly that, and show them suffering what targeted minorities have suffered in historical reality. By reversing the point of view and placing the reader within the perspective of the “other”, the original failure of empathy is transformed into a triumph of empathy. Now we are in the place of a woman for whom Lovecraft’s spooky cult rituals are her Passover or Easter, the mysterious symbols her alphabet, “Iä, Cthulhu . . . ” is the comforting prayer she thinks to herself when terrified, and a Necronomicon on Charlie’s shelf is Petrarch’s Homer.
And we aren’t asked to empathize with only one group. We empathize with those deprived of education, in the form of Aphra’s brother Caleb, taking on the classist negative depictions of “degenerate” white rural families common in Lovecraft’s work. With the plight of the Jews and other groups targeted in Germany, invoked by Specter’s discussion of his aunt. With those facing physical and medical challenges, invoked in the powerful opening lines where Aphra describes the pleasure she finds in facing the daily difficulty of walking uphill while she slowly heals. And with women, rarely granted any remotely coequal agency in literature of Lovecraft’s era. Not only is this story a powerful triumph of empathy, but after reading it, whenever we reread original Lovecraft, or anything set in his world, the memory of Aphra Marsh and her tender prayer will forever change the meaning of “Iä, iä, Cthulhu fhtagn…” The triumph of empathy diffuses past the boundaries of this story, to enrich our future reading.
Another striking facet is that this is a story about legacy, continuity and deep history that manages to address those questions using only very recent history. Usually stories that want to talk about the deep past use material from periods we associate with the deep past: medieval, Roman Empire, Renaissance, Inuits, Minoans, anything we associate with dusty manuscripts and archaeology and anthropology and old culture. Even I in this entry, when trying to evoke the themes and feelings of this story, went back centuries and consequently had to spend a lot of time explaining to the reader the history I’m talking about (what’s Petrarch’s Homer, what’s up with Diderot, etc.) before I could get to what I wanted to do with it. This story instead uses contemporary history, events so recent and familiar that we all know them already, and have seen their direct effects in those around us and in ourselves, or have tried not to see said effects. As a result, the story doesn’t have the baggage of having to explain its history. Instead of needing footnotes and exposition, it touches us directly and personally with our own history and makes us directly face the fact that we too are part of the link of transmission attempting to connect past to future, and our successes and failures can still heal or harm it just as much as the Visigoths, the Black Death or the Encyclopédie did. The use of modern history makes it impossible for us to distance ourselves, greatly enhancing its power.
I have already discussed, in my own roundabout way using Diderot and Petrarch and Marvel comics, many of the key themes which make this story so powerful: othering, empathy, reversal of point of view, legacy, silencing, translation and transmission, and discontinuity, how easy it is for the powerful engine of society to make mistakes that cut the precious thread. The power with which this story is able to present that theme demonstrates perfectly, for me, the potency of genre fiction as a tool, not for escapism or entertainment, but for depicting reality and history. The tragic discontinuities created by World War II, the destruction of life, education and cultural inheritance generated not only by the most gruesome facets of the war but also by great mistakes like the treatment of Japanese Americans, are difficult to communicate in full with such accurate but emotionless descriptive phrases as, “people were rounded up and held in prison camps.” Attempts to communicate the genuine human impact of such an event all too easily fall short. We try hard, but often fail. As a teacher, I remember well the flurry of discussion which surrounded some High School history textbooks which, in their efforts to do justice to the often-silenced story of interned Japanese Americans, had a longer section about that than they did about the rest of the war. Opponents of political correctness used it as a talking point to rail against liberalism gone too far, while apologists focused on the harm done by silencing the events. Yet for me, the centerpiece was the fact that textbooks had to devote that much space to attempting to get the issue across and still largely failed to communicate the event in a way that touched students. “The Litany of Earth” communicates the same event very potently, using the tool of genre to make something most readers might see as only affecting “others” feel universal.
The large-scale horror of Lovecraft’s universe revolves around the inevitability that human achievement, and in the end all life, will fade into nothing. The Yith and their library are the only hope for a legacy, one bought at the terrible price of what they do to those whose bodies they commandeer. By creating a parallel between the fragility of all human achievement, preserved only by the Yith, and Aphra’s barely-literate brother Caleb writing of his doomed search for the family library which contained the history and legacy he and Aphra so desperately miss, the fantasy setting puts all readers in Aphra’s place, and the place of those interned, creating universal empathy which no textbook chapter could achieve; neither, in my opinion, could a non-fantasy short story, at least not with such deeply-cutting efficiency. After reading this story, not only the events of Japanese American internment but many parallel situations feel more personally important, and one feels a new sense of personal investment in such issues as the fate of the Iraqi Jewish Archive. This stoking of emotion and investment is a powerful and lingering achievement.
Structurally, the story interweaves experiences from different points in Aphra’s present–where she encounters Specter–with her past arriving in the city and encountering Charlie and his interest in her lost culture and languages. The choice to depict the present scenes in past tense and the flashbacks in present tense might seem counterintuitive, but I found it a powerful and effective choice. Past tense reads as “normal” in prose, so much so that we accept it as an uncomplicated way to depict the main moment of a narrative. In contrast, especially when we have just come from a past tense section, the present tense feels extra-vivid, raw, invasive. It feels like a very certain type of memory, the kind so vivid that, when something reminds us of them, they jump to the forefront of our minds and blot out the here and now with the tense, unquenchable emotions of a very potent then. Trauma makes memories do this, but it is not the traumatic memories of camp life that we experience this way. Instead it is the vividness of tender moments of cultural experience: seeing precious books in Charlie’s study, sharing his drying river, warm things. The transitions to vivid present tense make the reader think about memory and trauma without having to show traumatic events, while simultaneously highlighting how, in such a situation of discontinuity and cultural deprivation, the experiences which are most alive, which blaze in the memory, are these tiny, rare moments of connection, even tragically imperfect connection, with the ghostly echo of Aphra’s lost people.
For me, the triumphant surprise of the story comes in the end, when Aphra approaches the cultists, and chooses to act. Specter’s descriptions of bodies hanging from trees, combined with our familiarity with the tropes of creepy cults in Lovecraft and outside, prepare us mid-story to expect that when Aphra approaches the cult they’ll be evil and insane, and she’ll overcome her resentment of the government and do what has to be done. Or possibly the reversal will be stronger than that, and the cult will be good and nice, like Aphra, and the take-home message will be that Specter is wrong and Aphra and the cultists are all just misunderstood and oppressed. It feels like the latter is where the story will take us when we see Wilder and Bergman, and Aphra finds comfort and companionship in participating in a badly-pronounced imitation of her native religion. Even when we hear about the immortality ritual and Bergman refuses to listen to Aphra’s attempts to make her see that her ambition is an illusion, it still feels like we are in the narrative where the cultists are good but misunderstood, and the tragedy is just that there is such deep racial misunderstanding that even Cthulhu-worshipping Bergman cannot believe Aphra’s attempts to help her are sincere. It is a real shock, then, when Aphra calls in Specter to shut the group down, because the genre setting raises such a firm expectation that “bad cultist” = “blood and gore” that even when we read about Bergman’s two drowned predecessors it doesn’t register as “human sacrifice” or “bad cult.” Aphra, unlike the reader, is unclouded by genre expectations, and shows us that, precious as this echo of her lost culture is to her, life is more precious still and requires action.
The ghostly echo of Aphra’s people that she shares with Charlie is precious enough to blaze in her memory, but she is willing to sacrifice the far more welcome possibility of being an actual priestess for people who sincerely want to share her religion, when she realizes that their cultural misunderstanding will cost human lives. And she cares this deeply despite being an immortal among mortals. The triumph of empathy is complete.
Unlike the numerous vampire stories and other tales which so often present immortals seeing themselves as different, special, unapproachable, and usually superior to mortals, here Aphra’s potential immortality enhances the uniqueness of her perspective and the depth of her loss, but without in any way diminishing her respect for and valuation of the short-lived humans that surround her. The grotesque folder of experimental records which is her mother’s cenotaph does make her reflect on how the loss is greater than the human murderers understood, but does not make her present it as fundamentally different from the deaths of humans, or make her (or us) see her suffering in any way more important or special than that of the Japanese family with whom she lives. The history of Earth that her people have learned from the Yith makes her recognize that living until the sun dies is not forever, nor is even the lifespan of the planet-hopping Yith who will persist until the universe has run out of stars and ages to colonize. The Litany of Earth that she shares with Charlie is an equalizer, enabling empathy across even boundaries of mortality by placing finite and indefinite life coequally face-to-face with the ultimate challenges of entropy, extinction and the desire to find something valuable to cling to. “At least the effort is real.” This is something Charlie has despite his failing body, that Aphra’s brother has despite his deprived education, that Aphra has despite her painful solitude, a continuity that overcomes the tragic discontinuity and connects Aphra even with her lost parents, with ancestors, descendants, with forgotten races, races that have not yet evolved, races on distant worlds, races in distant aeons, and with the reader.
One last facet I want to comment on is how the story portrays magic which is at the same time viscerally bodily and also beautiful and positive. This is very unusual, and the more you know about the history of magic the clearer that becomes. Magic, at least positive magic, is much more frequently depicted with connections to the immaterial and spiritual than the bodily: bolts of light, glowing auras, floating illusions, the spirits of great wizards powerfully transcending their age-worn mortal husks. Magical effects that are bodily, using blood, distorting flesh, are usually bad, evil cultism, witchcraft. This trope far predates modern fantasy writing. I have documents from the Renaissance, based on ones from Greece, which discuss magic and differentiate between two kinds: a good kind based on study, scholarship, texts, words of power, perfection of the mind, the soul transcending the body, angelic flight, spiritual messengers, and rays and auras of divine power (an intellectual, disembodied, male-dominated “good” magic), contrasted in the same types of texts with the bad, evil magic of ritual sacrifice, sexuality, animal forms, distortion of the body, contagion and blood, associated with witchcraft and with women. Cultural baggage from the Middle Ages is hard to break from even now, and we see this in the palette of special effects Hollywood reserves for good wizards and bad wizards. The tender, intimate, visceral but beautiful magic which Ruthanna Emrys has presented is authentic to Lovecraft and to the rituals we associate with “dark arts” and yet positive, a rehabilitation which works in powerful symbiosis with the story’s treatments of discrimination. Since race and religion are so much in the center of the story, its treatment of gender rarely takes center stage, but in these depictions of magic especially it is potent nonetheless.
I’ll stop discussing the story here, since I resolved to make this review shorter than the story itself, and I’m running close to breaking that resolution.
Step Seven: Sing.
One of the most conspicuous effects when I first read “The Litany of Earth” was that it made me get one of my own songs firmly stuck in my head for many, many hours. The piece is “Longer in Stories than Stone” and it is the big finale chorus to my Viking song cycle, a piece about the fragility of memory and the importance of historical transmission. It is a different treatment but with similar themes, and I found that listening to it a few times live and over and over in my head helped me extend the feelings reading the story awoke in me, and let me continue to enjoy and contemplate its messages for several happy hours. So to celebrate the release of the story (taking advantage of the fact that this blog is no longer anonymous) here is the song, and I hope it will do for you what it did for me and help me extend my period of pleasurable mulling. I hope you enjoy:
To rescue us from the dark and gloomy wood of Doubt, in which we have been wandering since my first post in this series (did you say hello to Dante?), comes the Criterion of Truth! The idea is that, while the skeptics are correct that logic and the senses sometimes fail, they do not always fail, and if we carefully study when they fail, and why, if we identify the source of error, we can differentiate reliable knowledge from unreliable knowledge. For example, our eyes may deceive us when we judge a stick half-submerged in water to be bent, but if we add the testimony of other senses (touch), and of repeated experience (what we learned last time we saw an object half-way into water), we can identify the error, and henceforth say that we will not trust visual sense data about objects half-submerged in transparent liquids, but that other sense data may be reliable. Once the causes of error have been defined, once we have a criterion for judging when knowledge is uncertain and when it is reliable, if we thereafter base our conclusions only on what we know is certain, then our conclusions will be reliable, eternal and divine, a steady foundation upon which we may proceed in safety toward that godlike happiness we seek. The Criterion of Truth is the clean and steady light of compromise, which does not banish all shadow, but, like a lantern in the dark, allows a philosophical system to have dogmatic elements while still conceding that much remains in shadow.
“Quite wrong!” cries our Pyrrhonist. “You have it all backwards! Doubt is the steady path toward eudaimonia. The absence of the possibility of certainty is our liberation, not our bane! It is when we embrace the fact that we cannot have certainty that we are finally free from the risk of having our beliefs overturned and our Plutos and Brontosaurs snatched away. It is when truth is firmly beyond human reach that we can finally relax and stop being plagued by curiosity and the endless, restless quest for information. The Criterion of Truth is not a light in darkness, it is a battering ram which has pierced our clean and serene sanctum and smeared it with all the muddled and confusing chaos that we worked so hard to banish! Don’t build a path on this foundation! However steady it may seem, the ground could still give way at any moment and shatter all. And even if it doesn’t, the path will never end. You will exhaust yourself on its construction, your age-gnarled hands still struggling to lay stones when you breathe your last, with never a glimpse of the end in sight, just an infinity of toil and darkness. And then you will inflict the same curse upon your children, and your children’s children, and your children’s, children’s, children’s children!”
Whether one sees it as a blessing or a curse, developing a Criterion of Truth is what has allowed, and still allows, dogmatic philosophical systems to exist and progress in a fertile and symbiotic relationship with skepticism, instead of ending with the blank serenity where Pyrrho and other absolute skeptics wanted to dwell forever. Every philosopher with any dogmatic ideas has a criterion of truth (“Yes, even you, Sartre,” says Descartes, “Don’t give me that look!”), and an explanation for the source of error, and frequently I find that, when I am feeling awash in the ideas of a new thinker, one of the best ways to start to get a grip on things is to find the criterion of truth, which gives me an anchor point from which to explore, and to compare that thinker to others I am more familiar with.
Today I shall attempt something a bit compressed but hopefully the compression itself will be fruitful. I intend to briefly examine three of the major classical schools (Platonism, Aristotelianism and Epicureanism) and explain just enough of each system to make clear its criterion of truth and its explanation for the source of error. By laying these out in a compressed form, side-by-side, I hope to show clearly how skepticism is at play in each of the dogmatic systems, and to show what the early approaches to it were, so that when I move forward to major turning points in skepticism it will be clearer just how new and different the new, different things are. Tradition dictates that I start with Platonism, but Socrates is looking a little too aggressively eager now that I mention Plato, and furthermore he was being mean to Sartre while we were away (Don’t pretend you didn’t know that dialog trying to define “being” would make him cry!), so I shall instead start with Epicurus:
The Epicurean Criterion of Truth: Weak Empiricism
Take the stick out of the water. Epicureanism faces up to the skeptical challenge to the reliability of sense data and still chooses to promote the senses as our primary source of information, simply proposing that we should not rely upon first impressions, but should consider sense data reliable only after careful investigation, ideally using multiple senses and instances of observation. But there is more to it than that.
Epicureanism is a mature form of classical atomism, positing that on the micro-level matter is composed of a mixture of vacuum and invisibly tiny, individual components or seeds known as “atoms” which exist in infinite supply but finite varieties (see the modern Periodic Table), and that the substances and patterns we see in nature are caused by different recurring combinations of these atoms. If the same kind of sand appears on two unrelated beaches, it is composed by chance of the same combination of atoms. If a piece of wood is burned and goes from being brown, firm and porous to being white and powdery, some atoms have left it (in the smoke, for example), and the remaining ones look different.
Atoms too are responsible for the apparently changeable properties of objects (remember the seventh mode of Pyrrhonism, that we cannot have certainty because objects take multiple forms). The properties of substances do not derive from atoms themselves but from their combinations. Colors, smells and flavors are all effects of the shapes of atoms, so it is not true that sweet substances contain sweet atoms and red substances red atoms, rather sweet substances contain smooth atoms which are pleasant to the tongue rather than rough, and red objects contain atoms whose combinations create redness. If bronze is red and then turns green, or wood is brown but burns and turns gray, then atoms have entered or left and the new combinations create a different color. And it is on this atomic basis that the Epicureans argue that (a) natural interactions of atoms and vacuum are enough by themselves to explain all observed phenomena, so there is no need to posit fearsome interfering gods, and (b) the soul is just a collection of very fine atoms, distributed in the body and breath, which disperse at death, so there is no need to fear a punitive afterlife.
Atoms are, believe it or not, largely a solution to Zeno’s paradoxes of motion, and also have much to say about our stick in water. As we all recall, Zeno’s arrow can never reach its target because the space in between can be infinitely subdivided into smaller distances which it must cross before it can finish its path, therefore motion is impossible. Epicurus answers: yes. Motion is indeed impossible. Motion is an illusion. The key is that space is not infinitely divisible, as Zeno proposed. Atoms, according to the Epicurean system, are not only the smallest objects but the smallest subdivision of space; it is literally impossible to subdivide either atoms or space further. (Note that if he were around now Epicurus would deny that our modern “atoms” are atoms – he would confer that title upon the smallest known sub-atomic particle, or reserve it for the piece smaller than that which all the king’s horses and all the king’s cyclotrons still can’t detect.) The smallest distance any object can move is one atom-width – any more nuanced motion is impossible. In other words, fluid motion is an illusion, and on the micro-level objects do not slide from one place to another. Rather their atoms pop in an instant from one position to the next atom-width over. One might call it microscopic teleportation. It is by this means that the arrow moves: every component atom in the arrow teleports one space to the left each moment, and thus the arrow proceeds from right to left sequentially.
Positing micro-teleportation as a substitute for motion may seem alien, but it is something we make use of every day in the modern world, and it is in fact much easier to explain Epicurean theories of motion to modern computer-users than it was to people in the past. As you scroll down this page, the cursor of your mouse and the text on the screen seem to move, but in fact nothing is moving. Instead tiny pixels, the atom-widths of your screen, are changing color, or you could say that the black pixels that form the text are teleporting one pixel-width per moment as you scroll. The eye, unable to see such fine distinctions, blurs that micro-teleportation into the illusion of motion. Why couldn’t all motion be a similar illusion? Zeno is defeated, and Reason is once again reliable.
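For readers who like their analogies literal, here is a tiny sketch of my own (an illustration, not anything from Epicurus or from the story): a one-dimensional “screen” in which an “arrow” proceeds from right to left, one pixel-width per moment. No pixel ever slides anywhere; each frame is simply a new arrangement of on/off cells, and the eye, blurring the frames together, supplies the motion.

```python
# A toy illustration of motion as discrete "teleportation": a 1-D row of
# pixels in which a three-pixel arrow advances one pixel-width per tick.
# Nothing moves; each frame is just a new on/off arrangement of cells.

WIDTH = 12   # pixels in our one-dimensional "screen"
ARROW = 3    # the arrow occupies three pixels

def frame(offset):
    """Render the screen with the arrow's leftmost pixel at `offset`."""
    return "".join(
        "<" if offset <= i < offset + ARROW else "." for i in range(WIDTH)
    )

# The arrow "proceeds from right to left sequentially", one cell per moment.
frames = [frame(t) for t in range(WIDTH - ARROW, -1, -1)]
for f in frames:
    print(f)   # first frame: .........<<<   last frame: <<<.........
```

Scrolled past quickly, the printed frames read as a single gliding arrow, exactly the illusion the Epicurean says all motion is.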
Which is good because Reason is the heart of the system of knowledge Epicurus wants to build. The Epicurean atomic theory, after all, is based on a combination of observations of the sensible world and then logical deductions. We observe that objects change their form when burned, that sea-soaked cloth hung up to dry becomes dry but remains salty, and that the same types of substances recur in many independent locations. From this we deduce the existence of atoms of different types in different combinations without ever directly seeing them. Zeno’s paradox of motion does not, in this interpretation, demonstrate that we can’t trust reason, but that we can’t trust rash, unexamined observations. There seemed to be motion, but with time, patience, observation and reason the Epicurean has determined that that was a mistake, and found a better model.
But this does an interesting thing to sense data, which Epicurus still wants to be our guide more than naked logic. Atomism, which predates Epicurus, seems to have itself arisen from observations of motes in a sunbeam, tiny particles which are invisible normally but visible only in special circumstances, and which all classical atomists cite as sensory evidence for the reality of atoms. From motes in a sunbeam and raw logic, they derive the atomic theory. As Epicureans strive to free themselves from fear of the unknown by observing and explaining natural phenomena through the interaction of atoms, they rely on what they can see, hear, taste and touch to derive their theories. This is empiricism but it is (as Richard Popkin aptly named it) weak empiricism. Why? Because the reality beneath what we observe is invisible. (“Exactly!” cries Sartre, leaping up with sufficient force to knock over Descartes’ thermos.) If atoms are undetectably tiny, and everything we see, taste and smell is a consequence of their combinations rather than the atoms themselves, then we can never have real knowledge of the fundamental substructure of being. There is an insuperable barrier between us and knowledge of true things, the barrier of minuteness. Thus Epicurean empiricism involves surrendering forever any certain knowledge of the truth of things, but in return we can have fairly reliable knowledge based on careful, repeated observation using multiple senses, especially now that logic has been rescued from Zeno’s grasp and is once again our ally.
Source of Error: Twofold. Limitations of the senses, which cannot see atomic reality; unquestioned acceptance of sense data and commonplace cultural assumptions (like superstitions about the gods) which are unreliable because they are not based on careful observation and analysis.
Criterion of Truth: Knowledge is certain when it is based on a combination of careful observation of the sensible world with multiple senses, and careful logical analysis.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of the observable world, and we can make rational deductions about the insensible world which are reliable enough to act upon (since we cannot ever prove or disprove them), but we cannot ever have true and certain knowledge of the invisible atomic world which is Nature’s true reality.
At this point some readers are not particularly disturbed by Epicurus’ surrender of true knowledge of microscopic things. After all, we have advanced since 300 BC. We played with microscopes in grade school, we named the proton and the quark and the preon, we made molecules out of toothpicks and gummy candies, and the electric blood of splitting atoms blazes in our lightbulbs. We fixed that weakness. “Delusion!” Sartre says, and he is right that, on a fundamental level, this technological advancement has not let us reclaim what Epicurus surrendered. However advanced our science, we still have no cause to believe we have yet perceived or even hypothesized the literally smallest increment of matter. And, separately, even if we had a machine capable of perceiving the smallest part of matter, we would still be limited by our senses, since the machine would have to use our senses to transmit its findings to us, transmitting only an approximation rather than reality. And in addition, the vast majority of our daily decisions would still be based on what we perceive at the macroscopic level. Thus, even with technological aid, the Epicurean surrender of knowledge of the fundamental seeds of things is a considerable one, and divides all knowledge firmly into two camps: the perceivable world, about which it is possible to have certainty, and the reality beneath, about which it is not. We have a path and shadows, dogmatism and skepticism coexistent within one system.
The Platonic Criterion of Truth: the Forms
My approach to Platonism will be rather sideways, but I want to get us to its criterion of truth by a route that is as parallel as possible to Epicurus’. So, for the vast majority of my readers who know basic Platonism already, please read along thinking about Zeno’s paradoxes and the stick in water, and notice how this way of outlining Platonism follows the same logical structure Epicurus’ did.
Plato, like the skeptics, acknowledges that the senses fail and deceive, and, like the atomists, observed that there are recognizable, recurring objects in nature that come into existence in independent parallel to one another: similar rocks, mountains, trees and animals in distant corners of the Earth, which must, he reasoned, have some common source. He also noticed that humans are able to recognize and identify these objects as being the same, even humans who have never met each other, or speak different languages, and even when the objects may have radically different colors and shapes disguising a shared structure – a disguise we see through. Finally he noticed (something Epicurus did not discuss) the fact that humans not only naturally identify objects, but naturally judge them to be better or worse based on unspoken but nonetheless universal criteria. Anyone can tell that a crisp, fresh apple is “better” and a withered, dry one “worse” without having to discuss or debate that fact, or even to be taught it. I could show you a healthy and a diseased version of some deep-sea fish you’ve never heard of and you would nonetheless successfully identify them as “better” and “worse” exemplars of a completely new and unknown thing.
To explain these patterns, and this universal capacity to identify and judge “better” and “worse” examples of things, Plato posited that these objects must have a shared source, but instead of positing a combination of atoms, he posited a source independent of matter that supplied the object’s structure. All quartz crystals, all trees, and all apples take their structures from a separate structure-supplying object, which exists independent of matter and time. It has to, since the objects it generates can come into existence and be destroyed, but the pattern, the archetype, the source remains. Plato named this structural archetype the “Form” and posited that these Forms exist in a separate level of reality. They create the many material manifestations of their structure as a flag pole might cast many shadows on different objects at different times. As some shadows are crisp, straight images of what casts them and others are vague, twisted or distorted, so objects are sometimes fairly straight and sometimes quite twisted manifestations of their Forms. When we judge an object, we judge it based on how good an image it is, how closely it resembles the Form which is the source of its structure. Hence why anyone of any age, in any culture, without the necessity of communication, can judge the superior of two apples, and tell that twisty trees are weird.
But objects are never truly like their Forms because Forms exist on a completely different level of reality, just as the flag pole exists on a different level of reality from its shadows. We know this the same way we know that the godlike eudaimonia we seek cannot be based on fleeting things like lust and truffles. Forms are indestructible – no matter how many trees or apples burn, the Form remains. With that attribute, in the Greek mind, go the others: Forms are eternal, unchanging, perfect, and divine. They cannot be part of this changing and destructible reality, but must exist on some other layer of reality where change and destruction do not exist. Note how this is in many ways exactly symmetrical to Epicurus’s atomic theory, in which atoms are indestructible, unchanging and perfect, and exist on an imperceptible micro-level accessible to us only by deduction, just as real-but-invisible as the Platonic realm of Forms. Both posit a materially inaccessible world which is the source of the structures of the perceivable world.
What about Zeno and the stick in water? Simple: the motions of a flagpole’s shadow across the earth and ground aren’t rational but bizarre, bending and distorting, split in half at times by passing objects, changing and imperfect. Just so the material world. The stick in water looks bent, and motion is rationally impossible, because the entire layer of reality perceived by the senses is itself bent, distorted, an imperfect effect of a perfect reality elsewhere. When we see the stick look bent, or realize that motion makes no sense, it is at that point that we are beginning to perceive the fundamental flaws in sensible reality, and realize that the true, rational, knowable structure lies elsewhere.
True knowledge, reliable, certain knowledge upon which we may build our path toward reliable, certain eudaimonia must therefore be knowledge of Forms, not of passing things. We can have True knowledge of the Form of Apples, the Form of Trees, the Form of Justice, the Form of Humans, but we cannot have true knowledge of a particular apple, tree, case of justice v. injustice, or human, because such things are changing, imperfect, and perishable, so even if we could know them perfectly at one instant, that knowledge would not be lasting, not enough to be a real foundation for happiness. The only permanent, certain knowledge is knowledge of eternal things, since all other knowledge is, like its objects, destructible. Thus the Forms are the path to Happiness.
And now, without any need to address the soul, or Platonic love, or Truth, or the other great Platonic signatures, we can describe the Platonic Criterion of Truth:
Source of Error: The material world perceived by the senses is imperfect and illusory, and conclusions based on observation of it are full of error, and incomplete.
Criterion of Truth: Knowledge is certain when it is based on knowledge of the eternal Forms, which can be perceived by Reason. So long as we rely only upon knowledge of abstract, eternal Forms and not on knowledge of specific material things, we will make no errors.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of the Forms, i.e. of the eternal structures that create the sensible world, but we cannot ever have true and certain knowledge of individual objects within the material world.
Now, our friend Socrates has been waiting all this time to rant about how Plato put all this in his mouth, by using him as an interlocutor in his philosophical dialogs, when all Socrates stood for was the principle that we know nothing, and wisdom begins when we recognize that we know nothing. But explicated like this, in a way which highlights how substantial a portion of human experience Plato has yielded to the shadows of skeptical unknowability, Socrates has far less cause to object. Plato has taken “I know nothing” as his starting point, as, in fact, did Epicurus, both of them beginning by scrapping the received commonplaces of things people thought they knew about the material world, and instead trying to find a space for certainty far removed from the evidently-unknown world of daily experience. We all know that Plato tried to appropriate Socrates to his system, painting Socrates as a Platonist and implying that Socrates agreed with all Plato’s dogmatic ideas as well as his skeptical ones.
But Plato was far from the only one to do this. In the ancient world, Skeptics, Cynics, Stoics, Aristotelians and Neoplatonists all make claims about Socrates really believing what they believed, that Socrates was really a skeptic, or a stoic sage, etc. This is easy because Socrates left us nothing in his own voice, but also because all of them really did begin as he demanded, by doubting everything, declaring that “I know nothing” and then trying to work from that toward a system which carves out one zone for the knowable and surrenders another to the unknowable. Attempts by later sects to appropriate Socrates reflect his fame, but also their universal gratitude for the way his refinement of skepticism created a starting point from which they could approach their Criteria of Truth, and start from there to lay their foundations. And now that I’ve put it that way, Socrates seems much less set on picking a bone with Plato, and much more interested in the bones of the chicken drumsticks Sartre brought, which are much larger than those Descartes brought, which are larger than the ones Socrates is used to, a mystery which definitely bears investigation. We can in part blame one “Aristotle”, though when I mention him our more modern thinkers smile knowingly, thinking of the many stages that had to pass between the ancient empiricist and the alien concept “progress.”
The Aristotelian Criteria of Truth: Categories and Definitions
Aristotle studied with Plato for decades, and his framework has a similar beginning. Yes, we instantly recognize that apple is apple and cat is cat, even if we are on the other side of the world and recognize apple as ringo and cat as neko. And we instantly judge the withered apple as being farther from what an apple ought to be than the crisp one.
What Aristotle doesn’t like is how Plato has the Forms exist in a hypothetical immaterial reality removed from the sensible reality. Instead, he uses the term “form” to refer to structures within natural objects, which are not material but not immaterial either. They are non-material. This may sound like gibberish, but I recently demonstrated it very effectively to my class by taking two apples to the front of the classroom, setting them down while I had a drink of water, then violently smashing one of the apples with repeated blows from the butt end of the water glass, reducing it to a sticky green pulp and producing an extremely startled and, in the front rows, apple-bespattered classroom. “What did I just destroy?” I asked. It took only a few moments of recovery for one to supply: “The form of the apple.” Aristotle even goes so far as to say that forms, rather than matter, are what senses sense. When we see an apple our minds do not register the raw, chaotic matter, they register the structure: apple. When we see smashed apple pulp even then we do not see matter, we see pulp, which has its own structure. We never perceive matter, or rather never recognize matter, never understand matter. All cognition takes place on the level of form, which is why we can identify “apple” at a glance and not have to spend a minute assembling the millions of points of perceived light and color together to deduce that it’s an apple.
But if the form, for Aristotle, is a structure within individual objects, and is destructible, it can’t be a source of eternal certainty, nor can it explain how my colleague in Japan can recognize and judge apple identically to the way I do. For this Aristotle posits Categories. Universal categories exist in nature, non-material structures just like forms, into which the forms of objects fit. Human Reason is capable of identifying these categories, by looking at objects, understanding their forms, and identifying their commonalities, functions etc. We all see the apple and recognize that it fits in the category apple. We further recognize that the category apple fits in the category fruit, that in the category “part of a plant” etc. And that Stamen Apple is a sub-category within the category apple. This allows us to identify and judge even objects which we have never seen before and have no names for. You probably do not know at a glance what the creature pictured to the left here is, but you can identify that it belongs in the category mammal, possibly in the rodent category or maybe more like a tiny deer judging by those skinny legs, but certainly in the medium-sized, ground-dwelling, non-carnivore, probably scavenger eating fruit and bugs and things, not-dangerous-to-humans category. (It is, in fact, a Kanchil or “mouse-deer”). Similarly we can all categorize trees, rocks, fish, and other things. Aristotelian categories are part of Nature itself, eternal and unchanging, and indestructible, since the category apple and the category Kanchil will be unchanged regardless of the creation or destruction of any individual. A withered apple doesn’t harm the category apple, nor does a limping three-legged Kanchil, and the extinction of the T-Rex didn’t erase the category T-Rex.
The extinction of the Brontosaur didn’t erase the category Brontosaur either – it was our discovery that the category was wrong that did so, and here we get toward Aristotle’s ideas of certainty and error. We had not defined our terms carefully enough, had accidentally separated two things that shouldn’t be separated, and thus were led into error, an error caused by insufficiently clear definitions of our terms. The categories are sources of true, certain and reliable knowledge. As with Plato’s Forms, we cannot Know-with-a-capital-K individual things with certainty, since they are destructible and changing, and the apple which is fresh today will be withered next week. But we can know the categories, and that it always has been and always will be the nature of the apple to grow on trees and try to be sweet and colorful to attract animals to eat it and spread seeds, and that it always has been and always will be the nature of the T-Rex to be a humongous terrifying predator the sight of which inspires fear in all mammals and other smaller creatures. One source of error is when we make mistakes about categorization. We may mistake the Kanchil for a rodent before more careful observation shows it is more closely related to a deer, or mistake a Vaquita for a dolphin. We may mistake the Brontosaur for its own species before we realize it is a juvenile version of another thing, as easy a mistake to make as thinking that a caterpillar and butterfly are different creatures until we examine more closely.
We also want to do this with things we may not, in modern parlance, think of as part of Nature, but just as there is the category “cetacean” within which exists the category “porpoise,” so too there exists the category “integer” within which exists the category “prime number,” also the category “system of government” within which lies the category “democracy,” and the category “virtue” within which exists the category “justice.” Aristotle, and the rest of Greece with him, does not draw our modern post-Rousseau line between “Natural” and “artificial,” placing human works in the latter. Birds are part of Nature, as are humans; birds’ nests are part of Nature, with a category, as are all the things humans create. The category “web page” which contains the category “blog” is as natural as the category “tree”.
Thus Aristotelian certainty comes with careful, systematic investigation of the categories within nature, and if we want to reduce error we can do so best by studying and measuring and comparing the objects we see until we can fit them into categories. The more we study, and the more carefully we define our terms, the clearer our conversations will become, less given to assumptions, misunderstandings and error. One source of error, therefore, is equivocal language, words that are sloppily defined and don’t refer to real categories in nature. Brontosaur, planet, motion, Justice, good, are all sloppily-defined terms. Any term which does not point to a real category in Nature is sloppy and may lead us to error. If we use only vocabulary that is carefully worked through and points only at real categories, then our language will be clear, our communication perfect, and the possibility of error greatly reduced. After all, we only want to be talking about categories, not anything that isn’t one, since, as with Plato’s Forms, categories are eternal, unchanging and reliable. On their foundation we can build our path. As with Plato and Epicurus, we have surrendered knowledge of individuals in favor of knowledge of something structural which underlies them.
Excuse me: to proceed farther with Aristotle, I need to go get my fork. Here it is. (Or rather an image of it, one level less real, its Platonic shadow.)
This fork has been part of my life since I was a tiny girl, and it taught me about the Aristotelian sources of error. When I was little, I would help put the silverware away. This fork puzzled me. Why? Because I couldn’t figure out how to categorize it.
Here you see my dilemma. We had one slot for forks, which had tines and metal handles. And one slot for knives, which had blades and wooden handles. Where then goes this fork, which has tines but a wooden handle? Let’s offer the dilemma to our Youth.
Youth: “I think it should go with the metal-handled fork.”

Me: “Why?”

Youth: “Because it’s a fork. It’s used for fork things, that’s more important than what it’s made of.”
*Ding!*Ding!*Ding!* Correct! The Youth, like my child self, has correctly identified the Aristotelian distinction between an “essential property” and an “accidental property”. An essential property is a quality of something essential to it being itself, and filling the function it has in Nature; an accidental property is something that could change and it wouldn’t matter. A cat can be black or tabby (accidental) but must be slinky, carnivorous, and endearing to its owner in order to fulfill the functions of a cat. A tree must grow a woody trunk and produce leaves in order to fulfill the functions of a tree. A fork must fit comfortably in my hand and lift chunks of food to my mouth for it to be a fork. If the cat is orange, the tree is forked, and the fork is a futuristic rod that lifts food using a miniature tractor-beam instead of tines, those are accidents. If these things fulfill these functions badly–if a cat is ugly, a tree is all bent and twisted and produces few leaves, or a plastic fork snaps when I try to skewer food with it–we judge them bad examples of what they are. If these things don’t fulfill these functions at all–a quadrupedal mammal eats grass, a plant produces a soft viny stalk, and a piece of silverware cuts food in half instead of lifting it–we judge they do not belong in the categories cat, tree and fork respectively because they lack their essential properties. If I had mistakenly stored my wooden-handled fork with knives, that would have produced error, the same source of error as when we mistake a Kanchil for a rodent, or when Descartes, living in the 17th century, reads an article about how people from Africa are not the same as people from Europe because their skin is a different color. Mistaking accidental properties for essential ones has introduced error.
And to call a robot toy a “cat”, or a metaphor for understanding genealogy a “tree”, or a fifteen-foot fork-shaped sculpture a “fork” is to employ ambiguous language, language which does not refer to these things’ real categories, introducing error.
But what about Zeno, and our stick in water? For our stick in water Aristotle, much like the Epicureans, wants us to examine the stick more carefully, multiple times with multiple senses, to correct the mistake. And, like the Epicureans and Plato too, he surrenders true knowledge of individual objects, saying we can know Categories with certainty, after careful examination, but not specific things.
As for Zeno, there he comes from a different angle, attempting to refute Zeno with pure logic. Aristotle is big on observing Nature, but also on logical principles, especially a priori principles. By these he means logical principles which are self-evidently true and require no knowledge or experience to be proved. For example: The same thing cannot both be and not be at the same time. Think about it for a while, take your time. It’s the case, and not only is it the case but it’s the case for lampreys, and thumbtacks, and hypothetical frictionless spheres, and ideas, and systems of government, and people. Even if you were a brain in a jar that had never had any experience of the world outside the mind, you could identify that a concept cannot both exist and not exist at the same time. Here’s another: “One” and “many” are different. It is nonsense to imagine that a thing could be both singular and plural at the same time. That too you can conclude without any basis in anything.
Now, it is possible to use clever syntax to come up with what seem like counter-examples. What about a doughnut hole: surely it exists and doesn’t exist at the same time, for this doughnut has a non-existence which is its hole, and yet here I am eating this doughnut hole. No, says Aristotle. That apparent contradiction is merely a function of unclear vocabulary giving two things the same label when they are utterly different. Similarly this pomegranate is one and many at the same time. Again, no: it is many seeds, but one pomegranate. Use strict vocabulary, unambiguous terms, and discuss only categories, and you will find that Aristotle’s a priori principles are sound.
Reasoning from such starts, and using raw logic without recourse to any knowledge of the material world, he then takes on Zeno. You cannot, says Aristotle, have infinite regression. It may seem you can, but an infinite chain is a logical impossibility because it would never end and never start. When you try to think about it, the mind rebels, just as it does when it tries to think of the one and the many being the same, or a thing both being and not being at the same time. Thus, says Aristotle, Zeno’s paradox is proved false because infinite regression is logically false. We can, now, rely on logic, so long as it is careful and methodical, and based on first principles and on comparison of the categories rather than leaping to conclusions directly from sense impressions of individual objects, which are flawed.
Sources of Error: (1) People using vague vocabulary that is unclearly defined and does not refer to anything Real, (2) Fallibility of individual material objects and rushed conclusions based on observations of such objects (note how similar this latter is to Plato).
Criterion of Truth: Knowledge is certain when it is based exclusively on either or a combination of a priori logical principles which are not dependent on anything other than logic to be certain, and on the eternal Categories which exist universally in Nature, and can be known through observation and discussed using a carefully-defined lexicon of philosophical vocabulary.
Zones of the Knowable and Unknowable: We can have true and certain knowledge of logical principles, and of the Categories, i.e. of the eternal structures within Nature that the forms of objects fall into, but we cannot ever have true and certain knowledge of individual objects within the material world.
Thus we have a third path, clearly delineating the arena of certain, eternal knowledge (on the basis of which we may seek eudaimonia) and separating it from the unknowable, which we surrender forever to skepticism. And once again the unknowable is the realm of matter, individual things, the essence which is given structure and comprehensibility by form. Aristotle, like Epicurus, has given up any chance of understanding matter itself, confining the cognizable world to that of form and structure, the macro-level. And he has surrendered knowledge of individuals, of this apple and this lamprey, granting us only the categories. We can still know an enormous amount in Aristotle’s system, enough to build a vast system of knowledge, a library of definitions, a vast network of genus and species names, and an empirical basis for an entire scientific system. Infinite knowledge lies before us on our Aristotelian path, infinite logic chains to follow, infinite categories to investigate, name, compare and discuss. The surrender, like Epicurus’s surrender of the ability to see atoms, feels minor.
“It’s still delusion!” Sartre says. “The surrender is vast! Infinite! Infinitely more vast and fundamental than your daily world imagines!” This outburst has been building up in poor Sartre for some time, which we can tell because he’s been holding his knees and rocking back and forth and flushing, and has been only barely sociable enough to thank Descartes for that eclair (which is not, in fact, a lightning bolt but is a delicious pastry named “lightning bolt” in French, much to Aristotle’s chagrin). And, at some risk of frightening our innocent interlocutor the Youth (whom I shall advise to have Socrates hold his hand through the next bit), I will let Sartre continue in his own words, an excerpt from his Nausea (note that this particular translation uses existence rather than being):
“So I was in the park just now. The roots of the chestnut tree were sunk in the ground just under my bench. I couldn’t remember it was a root any more. The words had vanished and with them the significance of things, their methods of use, and the feeble points of reference which men have traced on their surface. I was sitting, stooping forward, head bowed, alone in front of this black, knotty mass, entirely beastly, which frightened me. Then I had this vision. It left me breathless. Never, until these last few days, had I understood the meaning of “existence.” I was like the others, like the ones walking along the seashore, all dressed in their spring finery. I said, like them, “The ocean is green; that white speck up there is a seagull,” but I didn’t feel that it existed or that the seagull was an “existing seagull”; usually existence hides itself. It is there, around us, in us, it is us, you can’t say two words without mentioning it, but you can never touch it. When I believed I was thinking about it, I must believe that I was thinking nothing, my head was empty, or there was just one word in my head, the word “to be.” Or else I was thinking . . . how can I explain it? I was thinking of belonging, I was telling myself that the sea belonged to the class of green objects, or that the green was a part of the quality of the sea. Even when I looked at things, I was miles from dreaming that they existed: they looked like scenery to me. I picked them up in my hands, they served me as tools, I foresaw their resistance. But that all happened on the surface. If anyone had asked me what existence was, I would have answered, in good faith, that it was nothing, simply an empty form which was added to external things without changing anything in their nature. And then all of a sudden, there it was, clear as day: existence had suddenly unveiled itself. It had lost the harmless look of an abstract category: it was the very paste of things, this root was kneaded into existence. 
Or rather the root, the park gates, the bench, the sparse grass, all that had vanished: the diversity of things, their individuality, were only an appearance, a veneer. This veneer had melted, leaving soft, monstrous masses, all in disorder—naked, in a frightful, obscene nakedness.”
By this point our Youth is very glad to have his hand held, and Descartes is having second thoughts about sharing his eclair with what has evidently turned out to be a lunatic Lovecraftean cultist. But I let Sartre speak here to demonstrate the fact that these surrenders, made in the earliest days of philosophy by system-weavers seeking to escape the web of Zeno and the Stick, are still substantial. Even the most recent modern philosophy returns, from time to time, to these ancient surrenders to unknowability, and some try, like Sartre, to make new inroads toward knowing what the majority of thinkers have given up on. New and, in Sartre’s case, scary inroads. Every system-weaver since Plato may have a Criterion of Truth to be our light in the darkness, our path, our foundation, the circle line for the new philosophical subway system, but the fertile symbiosis between skepticism and dogmatism–the symbiosis which has borne such fruit: Platonic forms, genus and species, atoms, eventually the scientific method itself!–is also still sometimes a hostile symbiosis, and the wild, strong skepticism of Pyrrho still sometimes rears its head to plague Sartre and us, even as we make daily use of soft forms of skepticism like Epicurus’ weak empiricism, and Aristotle’s categories.
Of course, many are the centuries between Epicurus and Sartre, and many the new relationships between doubt and dogma, the new Criteria of Truth and new forms of shadowy un-knowledge which will press upon our fragile paths, before we reach the modern world. So we still have much more to explore in further chapters. Good thing Descartes brought plenty of lightning bolts.
It is easy for us to forget how the Scientific Method, at work behind all this research, is a uniquely flexible and dynamic belief system, one which enables our uniquely flexible and dynamic world. Some will feel uncomfortable with me calling science a “belief system” but in this context I use the phrase “belief system” as a reminder of what the Scientific Method and its associated apparatus have displaced. Science has not replaced religion–they coexist happily, productively, even symbiotically within many arenas, places and individuals, even as they chafe and vie in others. But in the modern West, the Scientific Method has largely displaced older systems for guiding daily micro-decision-making which were more closely tied to religion. We now use science-based reasoning a hundred times a day when we are called upon to make decisions. Whether making a sandwich, buying a new teapot or evaluating an argument, we think about data from past experiences, bring in what facts and hypotheses we have accumulated from educated and informed living, consider the credibility of sources, ask ourselves questions about plausibility, probability, evidence and counterargument, and speculate about the range of possible errors and outcomes. We go through many steps, often fleeting but still present, before we assemble our sandwich (which recent nutrition advice seems plausible in the ever-changing range?) or buy our teapot (plastic so housemates won’t break it, or ceramic for environmental/health/aesthetic/flavor reasons?) or decide whether to grant a politician’s argument our provisional belief or disbelief. Even for those members of modern Western society whose lives are powerfully informed by faith or institutional religion, who do seriously factor “What would Jesus/Apollo/Whatever do?” into the calculation, evaluative criteria based on science and its method remain a substantial, if not exclusive, part of our apparatus for daily decision-making.
For my purposes today, the most important part of what I just described is that the belief or disbelief we extend to the politician (or to our teapot) is provisional. We decide that a thing is plausible or implausible, and extend to it a kind of belief which is prepared for the possibility that we will be proven wrong. That thing the politician said might turn out later to be false (or true) when new information arises. As for the teapot, let’s say I pick one which claims to be safe and eco-sound because of XYZ carbon something something. It may seem that, in buying the teapot, I have given its claims my complete belief, but that too is provisional, since my long-term purchasing decisions for other objects will be informed by further information, changes in industry, and, of course, my empirical experience of whether or not this teapot serves me (and survives my housemates) well.
What we knew about teapots, coral reefs, moths and tree sloths, Arthuriana, protons, and the Greek concept daimon, can all be overturned, and yet we remain comfortable with the Scientific Method which produced our old false information, and we are still prepared to let it provide us with new information, then overturn and replace the new information in its turn. We do this without thinking, but it is in no way a universal or natural part of the human psyche. When chatting with my father about the proton research, he summed it up nicely: two possible responses to hearing that how we measure something seems to change its nature, throwing the reliability of empirical testing into question, are “Science has been disproved!” or “Great! Another thing to figure out using the Scientific Method!” The latter reaction is everyday to those who are versed in and comfortable with the fact that science is not a set of doctrines but a process of discovery, hypothesis, disproof and replacement. Yet the former reaction, “X is wrong therefore the system which yielded X is wrong!” is, in fact, the historical norm. Whether it’s an Aristotelian crying “Plato has been disproved!” or Bernard of Clairvaux crying “Abelard has been disproved!” or a Scotist crying “Aquinas has been disproved!” the clear overthrow of a single sub-principle within a system was, for many centuries, sufficient to shake the foundations of the system as a whole, and drive people to part with it and seek a new one.
All this is a way of previewing the endpoint of the present series, in order to show how important the often-invisible role of doubt is in current human thought. Without skepticism, and important developments in the history of skepticism, we could not have the Scientific Method occupy the position it does in modern daily lives. So I want to sketch out here some of my favorite moments in the history of skepticism, not a complete history (for that see Popkin’s History of Skepticism or Allen’s Doubt’s Boundless Sea), but the spicy highlights that I’ve most enjoyed.
Dogma and Doubt
There are many ways to subdivide philosophy, but one of the most useful is, in my view, the subdivision into dogmatic and skeptical. I’m using these terms in their technical philosophical senses, so I do not intend to invoke any of the contemporary, negative cultural associations of “dogma” or “skeptic.” (Philosophy and history are constantly plagued with the disconnect between formal uses and modern casual uses of terms like these, Epicurean, Hedonist, Realist, Idealist… and it’s worse when I learn the technical term before I meet the popular one. I can’t tell you how confusing it was the first time I was in a conversation where someone used “libertarian” in its contemporary political sense, which I had never met, having learned it from Spinoza class. Them: “FDR is a big foe of Libertarianism.” Me: “Really? I didn’t know FDR denied the existence of free will. Was he a materialist? A stoic?” And when I tell my students that, for the purposes of Plato class, “Realist” and “Idealist” are synonyms they sometimes look like they’re about to cry…) For today’s purposes, by “dogmatic” I mean any philosophical moment or system which argues that something can be known, or that there can be certainty. By “skeptical” I mean a philosophical moment or system arguing that something cannot be known, or that there cannot be certainty. In this sense, Aristotle’s argument that the existence of a Prime Mover can be logically proved from the principle that any chain of events must have a First Cause is dogmatic, as is the conviction that we know with certainty that the square of the length of the hypotenuse of a right triangle is equal to the sum of the squares of the remaining sides. Pierre Bayle’s argument that God’s existence can be known through faith alone is skeptical, as is the argument that quantum uncertainties like Heisenberg’s mean that material reality can never be fully understood because the act of perceiving it alters it.
Thus neither skepticism nor dogmatism is more or less tied to theism than the other – both are broad and diverse categories, and most great intellectual traditions have both in there somewhere.
Dogmatic philosophy is what most people usually think of when we think about philosophy: systems that propose particular things. The Platonic Good, Aristotle’s Categories, Descartes’ vortices and Heidegger’s Being are all founded in claims that we know or can know some thing or set of things with certainty. Yet skeptical arguments, about what cannot be known, have coexisted with dogmatic claims throughout philosophy’s existence, and the two act as foils to one another, arguing, cross-pollinating, hybridizing, and spurring each other on, and their interactions have been among the most exciting and fruitful in philosophy’s long history.
I will begin as close to the beginning as I can:
Happiness in Ancient Greece:
While post-17th-century philosophy often puts its primary focus on the quest to explain and describe things and create a system of knowledge, one key unifying attribute common to just about all classical Greek philosophical schools, though different in each, is the goal of attaining eudaimonia (εὐδαιμονία), from “eu” = good, happy, fortunate, and “daimon” = spirit, soul. It’s usually translated as happiness, but it’s both more specific and stronger. Other renderings that help get the idea across include wellbeing, self-contentment, self-fulfillment, spiritual joy, and personal welfare. It is the kind of happiness which is deep, lasting, tranquil, reliable, complete, and, in the Greek sense, godlike or divine. By “divine” I mean a list of attributes that most Greek philosophers associated with the gods, who were supposed to be immortal, unchanging, indestructible, eternally happy and satisfied, living in bliss surrounded by beauties and free from pain. These are not Homeric Greek gods who feud and lust and rage, but more abstract philosophical gods personifying unchanging eternal principles, the sort of gods Plato believed in, which is why he wanted to censor Homer’s depictions of the more fallible and anthropomorphic ones. The word daimon thus occupies a complex space, much debated, but can be rendered as a spirit, soul or thinking thing, referring to a category vaguely encompassing human souls, gods and intermediary spirits. Thus, eudaimonia is the state of having a happy or fortunate spirit, so my favorite way of rendering eudaimonia is “the kind of happiness Platonic gods experience” i.e. long-term, untroubled, indestructible happiness.
Become a philosopher, lead a philosophical life as I do, and you will achieve, or at least approach, happiness–this is the promise made by every sect, from Epicurus and Seneca to Diogenes and Plato. In the classical world, being a philosopher was much more about life, living well and demonstrating one’s philosophical prowess through one’s personal excellence and successes than it was about writing comprehensive masterworks expounding systems (See Hadot’s What is Ancient Philosophy? and Philosophy as a Way of Life). Each classical philosophical school had its own path to happiness, and each entwined it with different parallel goals, such as the pursuit of personal excellence, or understanding of nature, or civic virtue, or piety, or worldly pleasure, or friendship, any number of things, but we find no classical school for which approaching eudaimonia through leading a philosophical life was not a core promise.
I should note in passing that, in later classical writings, it becomes clear that they take the divine aspect of eudaimonia very seriously, and Neoplatonists especially refer to past philosophical sages as “divine,” arguing that Socrates, Plato, Zeno, Diogenes of Sinope, Seneca and so on, had achieved states of philosophical happiness that made their souls identical with those of gods, even while they were contained within mortal flesh. The daimon or soul which is happy in eudaimonia is, after all, categorically the same type of thing as a god, and one of the leading differences between a human soul and a divine one is that the divine one experiences indestructible happy serenity. If a philosopher’s soul achieves the same state, is it not a god? Particularly in a culture which already practiced deification and ancestor worship? Platonic claims about a philosophical soul growing wings, leaving the body and dwelling among the gods helped further cultivate this impulse. The practice of Theurgy, philosophical magic, developed from the idea that such a divine soul, even while resident in human flesh, could work miraculous effects, such as levitation or generating light.
Now, eudaimonia is a high bar to achieve. Indestructible, god-like happiness must be able to stand unchanged in the face of all changes, a great challenge in a human existence beset by a thousand evils including wolves, tyrants, malaria, civil war, famine, injustice, accidental dismemberment, unrequited love, and human mortality. All our surviving ancients agreed that real eudaimonia could not be dependent upon external sources, like fame, wealth, property, physical fitness, romantic love, even liberty of person, because such things could be taken away from you by fickle fate, making them unreliable, and your happiness destructible. I say surviving ancients because we do not have the writings of the Hedonist school, which we know focused on positive, experiential pleasures including, probably, food, drink and sex, and who may have been an exception, but their exceptionality doomed them to be silenced by the dissenting majority. Those who did agree held that eudaimonia had to be a state of the thinking thing, the mind or soul, independent of experience, body or social position. It was most frequently connected with things like tranquility, self-mastery, acceptance, and taking enjoyment from things that cannot be destroyed, like Truth. It was also connected with freeing the soul from cares, such as fear, anxiety, envy, ambition, possessiveness, and general attachment to Earthly, perishable things.
These classical philosophical schools developed guides for living and decision-making intended to facilitate a happy life, and those with systems of physics and ontology often tied those closely to their paths to happiness. For example, the atomic explanations for the natural non-divine mechanisms behind thunder and lightning were promoted by Epicureanism as something which could make people happy by freeing them from fear of being zapped by a wrathful Zeus. Thus philosophical disciplines like physics, biology and even basic ontology were in their way tools of eudaimonia as much as they were attempts to explain things. Modern scholars even debate whether, in such cases, the physics was the source of the moral philosophy, or a tool developed afterward to support it when eudaimonia seemed to need it as an ally.
One of the sources of pain and unhappiness from which such systems set out to free people was curiosity, i.e. the unhappiness that derives from hungering for answers. This too needed to be satisfied to achieve the stability of eternal, godlike happiness. The quest to end the pain caused by curiosity meant supplying answers, to questions big and small, but especially big. And they needed to be certain answers, which would be reliable and eternal, and stand up to the assaults of fortune, or else eternal, reliable eudaimonia could not rest upon them. This added extra energy to the quest for certainty. One wanted to be really, really sure an answer was right, so one could rest comfortably with it, and be happy, and know it would never change. And one wanted the facts which served as foundations for philosophers’ broader advice on how to achieve happiness to also be certain and unchanging. If Plato says the key to happiness is Truth, Excellence and the Good, or Aristotle proposes his Golden Mean, you want their claims to be based in certainty.
Two tools were employed in pursuit of certainty: Logic and Evidence. All dogmatic claims (i.e. claims of certainty) made by any of our classical thinkers were based on one, the other, or both.
Evidence includes any claims based on observation, sensation, lived experience, or, more technically, empiricism. If Aristotle says bony fish and cartilaginous fish are different because he has dissected a hundred of them and can describe how their insides are different, that is empiricism. If Thales or Heraclitus draw conclusions based on seeing how fire emerges from wood, that is empiricism. If Plato asks us to think about when we’ve seen someone beat a dog and say whether it makes the dog better or worse, that too is a kind of empiricism.
Logic includes any argument based on reasoning instead of sense experience. If Aristotle says that a thing cannot both be and not be at the same time, that is an argument based on logic. If Plato asks us if beating a dog makes it worse with respect to the properties of horses or worse with respect to the properties of dogs, that is also an argument asking us to apply logic.
Meanwhile, in a nearby lake…
…a stick fell in a pond, and skepticism was born, like Venus, from the waters. Or rather, from someone who saw the water, and saw a stick sticking half-way out of it, and noticed that the stick looks bent or broken at the point where it goes into the water. And yet, the stick is not bent. The person bends over and touches it, just to be sure, and the fingers confirm the wood is whole and strong. My eyes are lying to me! My eyes can’t be trusted! If this stick isn’t bent, what else that my eyes have told me may be false that I haven’t yet realized? What if the sky isn’t blue? Or milk isn’t white? What if trees have faces, chalk is actually as beautiful as gold, and the sky is swarming with exquisite creatures I have no way to detect? And if I can’t trust my eyes, what about my ears? My hands? Sense perception is unreliable! But in that case, how do I know anything I’ve experienced is as I thought it was? Or even that anything is real?! Panic! Panic more! (Quietly in the background Descartes and Sartre are still carrying out the “Panic more!” instruction nearly two millennia later).
The stick in water is a genuine, ancient example, much discussed by pre-Socratic philosophers. We don’t know if it’s actually the first example, since the earliest conversations are lost to time, but it may be, and certainly if it isn’t it was something similar, one of the other optical distortions discussed by ancients, like how a square tower can be mistaken for a round tower when seen from a great distance. What does survive is what later philosophers made of these early discussions of the mystery of the stick in water: epistemology, the study of knowledge, how we know things, and when we can or can’t have certainty. The stick in water challenges any claim that the senses can be relied upon as a source of certainty. Forever after, therefore, any philosopher who wanted to make any claim based on sense perception first had to have a way to explain how we could trust the senses despite this, and other, failings.
And the stick in water has a brother: Zeno’s paradoxes. If the stick in water undermines the credibility of sense-perception, its partner, Zeno’s paradoxes of motion, are what undermine the credibility of the other traditional source of information: logic. You have all heard Zeno’s paradoxes before, but rarely in companionship with the stick in water, which is what gives them their oomph, so it’s worth revisiting one here:
An archer looses an arrow at a target. Before the arrow reaches the target, it must go half way. Next it must go half the remaining distance. Then half that distance. Then half that distance. Half, half, half, half but we can do this forever, so the arrow can never actually reach the target, because it must cross an infinite number of micro-distances first, and nothing can travel infinite distance. Therefore, logically, motion is impossible. Cue polite applause for the logical trick, as at the successful completion of an elegant and challenging ice skating routine. (Cue also Descartes and Sartre glaring at anyone who’s still smiling.)
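For readers who like to see the arithmetic spelled out, modern mathematics dissolves the numerical half of the trick, though not, as we shall see, its real challenge to logic: the infinitely many half-distances form a convergent geometric series, so the arrow need only cross a finite total distance, and at constant speed that takes only a finite time. (The Greeks had no rigorous theory of limits, which is part of why the paradox bit so hard.)

```latex
\frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots
  \;=\; \sum_{n=1}^{\infty} \left(\frac{1}{2}\right)^{n}
  \;=\; 1
```

An infinite number of steps, but a finite sum: the arrow arrives.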
Why is this more than a cute trick?
Youth: “But we know motion is possible, Socrates.”
Socrates: “How?” (All philosophical dialogs are with Socrates, even when they aren’t.)
Youth: “Because I can hit you. See?” Hits Socrates.
Socrates: “Yes, very good. So you know it is possible because you did it?”
Youth: “Exactly, Socrates. I can do it again if you aren’t convinced.”
Socrates: “If you want to exercise your will in that way (if there’s such a thing as will) then that’s your choice (if there’s such a thing as choice), but first, perhaps you could explain to me, using logic, how you are able to hit me, if your arm has to cross infinite distance first?”
Youth: “I… um… I don’t think I can, Socrates. I just know that I hit you, and could do it again.”
Socrates: “But you can’t explain logically why.”
Youth: “No, Socrates, I cannot.”
Socrates: “So wouldn’t you say, then, that logic is incapable of explaining motion?”
Youth: “I guess so, Socrates.”
Socrates: “Doesn’t that bother you? That logic fails to be able to explain something so seemingly simple? Doesn’t that make you distrust logic itself as a tool? It would seem that logic itself is unreliable and can’t lead to certainty.”
Descartes (quietly in the background): “Panic more!”
Youth: “I guess that’s so, Socrates, but it just doesn’t bother me the way it bothers those weirdly dressed men over there.”
Socrates: “And why doesn’t it bother you?”
Youth: “Well, because I know that motion is possible because I can do it and see it. I don’t need logic to explain it.”
Socrates: “So even without logic, you’re sure there is motion… because?”
Youth: “Well, because when I move my arm to hit you, I can see it. When I touch you with my hand, I can feel the impact, the texture of your skin. I can still feel it a little on my own skin, the spot where it struck yours.”
Socrates: “So you know there is motion because your senses tell you so?”
Youth: “Yes, Socrates.”
Socrates: “So, since logic is unreliable, you choose to rely on the senses instead?”
Youth: “Yes. I trust things I can see and touch.”
Socrates: “Then tell me, my young friend, have you ever happened to notice what happens when a stick falls so it’s sticking half-way into a pool of water?”
Our youth, whom we shall now leave panicking on the riverbank along with Socrates, Descartes, Sartre and, hopefully, a comfortable picnic, has now received the full impact of why Zeno’s paradoxes of motion matter. They aren’t supposed to convince you there’s no motion, they’re supposed to convince you that logic says there is no motion, therefore we cannot trust logic. Their intended target is any philosopher *cough*Plato*cough*Aristotle*cough* who wants to make the claim that one can achieve certainty by weaving logic chains together. Anyone whose tool is Logic. Meanwhile, the stick in water attacks any philosopher who wants to rely on sense perception *cough*Aristotle*cough*Epicurus*cough* and say that we know things with certainty through Evidence. When you put both side-by-side, and demand that Zeno shoot an arrow at the stick in water that looks bent, then it seems that both Logic and Evidence are unreliable, and therefore that… there can be no certainty!
Don’t panic, be happy…
The double challenge of the stick in water and Zeno’s paradoxes had many effects.
One was to make all classical thinkers who wanted to maintain dogmatic principles work a lot harder to nuance their claims of certainty, to justify why and in what specific circumstances logic and evidence could be trusted, to explain why they sometimes failed or seemed to fail, and how one could reason or observe more carefully in order to achieve greater levels of certainty. Thus these challenges to reason and evidence let dogmatic philosophers adopt skeptical tools and create systems with space for both dogma and skepticism, hybridizing the two to achieve greater levels of clarity, complexity, dynamism and subtlety and jumpstarting countless great philosophical leaps. To give two quick examples, Aristotle attempted to create a system for achieving infallible logical information by saying that logic is 100% reliable if it is based on a combination of (A) unequivocal carefully defined terms, (B) self-evident first-principles, and (C) geometrically-strict syllogistic reasoning by baby-steps. Stick to these and exclude logical leaps and unclear vocabulary and you can carve out an arena for reliable logic, even if that arena is necessarily finite and cannot touch everything. Similarly Epicurus and Aristotle both proposed a kind of empiricism of repeated observation, where we do not trust just one glance at the stick in water but examine it carefully with all our senses, look at many sticks, and eventually draw conclusions we consider more reliable. And at the same time, these same thinkers gave ground and mixed their dogmatism with skepticism by saying that logic or empiricism worked in some arenas but not others. Epicurus, for example, says we can learn a lot from sense data but we can never learn the true details of the atomic level since we can’t see anything that small.
Aristotle similarly says we can learn about the level of the universe that we can experience and think about, the level of the objects we see and contemplate, but not about the chaotic base substructure which underlies the visible and comprehensible world.
(Sartre, who has just been handed a sandwich by Socrates and is now unconsciously applying the Scientific Method as he considers whether or not to accept Descartes’ offer of mayonnaise, looks up here to say that he agrees with Aristotle that there are vast and terrifying unknown depths of being which lie beneath perceived reality. He thinks we should address our long-term attentions to that mystery, and that Aristotle is foolish to cling to pursuing the finite certainties offered by his logic chains and fish observations when no finite knowledge is helpful in the face of the raw unknown infinity beneath. But Sartre is not interested in pursuing eudaimonia, even if he is interested in the short-term, destructible pleasure offered by Descartes’ excellent fresh mayonnaise.)
But our ancient Greeks are interested in eudaimonia, and another product of these challenges to reason and evidence, apart from letting dogmatic philosophers hybridize with it, was the birth of Skepticism (big S) as a philosophical school, in addition to skepticism (small s) as an approach. As an approach, skepticism is used by all sorts of thinkers, including Plato and Aristotle in their way, but it was also a school, a rival of Platonists and Stoics. And, like all other ancient schools, Skeptics pursued eudaimonia.
How does doubt lead to happiness? By allowing one to relax and resign one’s self to ignorance, says Pyrrho, the greatest name in pure classical skepticism. We cannot know things with certainty, he says, and this is a release (much as Epicurus thought it was a release to believe there is no afterlife). If we cannot know things with certainty, we don’t have to try. We don’t need to go with Aristotle to the docks and dissect infinite fish. We don’t need to sit with Plato and let him pretend to be Socrates through interminable dialogs. We don’t need to follow Pythagoras and fast ourselves into a trance while contemplating the number ten. We can stop. We can say I don’t know, I can’t know, I’ll never know, no one else knows either, no one is right, no one is wrong (not even people on the internet!), so we can just return to our work and rest. This too, say the skeptics, frees us from pain, from several pains that no dogmatic system can ever free us from. It frees us from the exhaustion of the quest to know. It also frees us from the stress we experience when we turn out to be wrong. If you think you know something, and it’s overturned, that’s stressful and unpleasant. It makes you feel angry, foolish, violated, shaken, abandoned. If you never think you know anything about things, you will never experience the pain of being proved wrong.
You know what the skeptics mean here. You know because you are alive in 2014, and that means you remember when there were nine planets. Weren’t you upset? Wasn’t it distressing and unpleasant, shaking your worldview? We learned there were nine planets in kindergarten! Of course Pluto is a planet! Mike Brown, the scientist responsible for getting Pluto’s status stripped away, receives hate mail, for precisely this reason: it hurts to be told you’re wrong. And this is far from the only time you, reader, have experienced this. There used to be such a thing as a Brontosaurus. And a Triceratops. (Youth: “What! We lost the Triceratops too!”) There used to be four food groups, remember that? And coral reefs used to exist only in the tropics, and moths used to have nothing to do with tree sloths, and you used to have a volume of the complete works of Sappho. And the destruction of all these “truths” has unsettled us to different degrees, because we learned them at different times and they were integral to our worldviews to different degrees. And some we are okay with, while at others we smile at the angry t-shirts that say: I remember when there were Nine Planets!
Now, Aristotle would tell us the strife has been caused by the fact that we had not defined “Planet” carefully enough before, so it wasn’t an unequivocal term, and thus led us to confusion and misunderstandings. “But!” says Pyrrho, “if you had never studied these things, if you had not been taught as a child to memorize dinosaurs, or rest your worldview on the label attached to a hunk of rock far off in the darkness where you never have cause to perceive it, then you would not experience this unhappiness! Your belief that you knew something has made you unhappy, destroying eudaimonia. Just admit that you do not know anything with certainty and then you need never experience such pain again!” And in the case of things we were prepared for–the tree sloth and the Sappho and Arthur having a knight of African descent–the Scientific Method told us to do just that, to be prepared for truth to be replaced when it was time, because it was never Truth, it was always provisional truth.
Ten Modes of Skepticism
Many exciting things will happen to skepticism as it leaves Greek hands before it reaches ours. It will be transformed by Bacon and Montaigne, by Averroes and Ockham, Descartes will finish his potato salad and have his day, and it has more refinement yet to undergo among the Greeks as well, and from their sunny riverbank Socrates and company will watch skepticism surge over the marble walls of Plato’s Academy like ants into their picnic basket. But for today I want to leave you with a taste of raw classical skepticism, so you can sit with it for a little while, this oddest of philosophies which proposes un-knowledge, rather than knowledge, as its happy goal. To that end, here, to finish, are examples of the Ten Modes of Pyrrhonism (i.e. the kind of raw skepticism practiced by Pyrrho) based on the handbook of Sextus Empiricus (one of our few surviving ancient skeptical authors). It is a list of categories of sources of error, things that can make you wrong. Many are ones that we are very well prepared for in the modern world and remain on constant or at least near-constant guard against (though rather than guarding against the errors, what Pyrrho and Sextus want us to do is be on guard against imagining we aren’t making errors, i.e. to be on guard against thinking we know something. I see Socrates is nodding in approval, and that the others are too polite to point out the crumbs on his chin).
The Ten Pyrrhonist Proofs that Nothing can be Known with Certainty:
We cannot have certainty because different animals have different senses. When do we encounter this? When walking a dog, sometimes the dog stops to sniff in rapt fascination at a spot on the sidewalk where we see nothing interesting. But evidently there is something very interesting there if a creature as intelligent as a dog is fascinated, and willing to disobey its friend and choke itself by pulling on its collar in order to study this fascinating thing. What an error we commit being unable to see this fascination! Or is the dog in error?
We cannot have certainty because different human beings experience things differently. When do we encounter this? I encounter it when friends drink alcohol. I do not enjoy alcohol. Not only am I not supposed to have it (because of a specific medical condition), but it tastes like nasty poisonous motor oil to me, and yet I see my friends go into paroxysms of delight over the subtleties and complexities of drinks, and my civilization build entire buildings, institutions, customs and industries around this thing which my senses tell me is terrible. My senses and those of my friends differ. Clearly someone must be wrong, unless there is no right here? I also have a color blind housemate who cannot tell that Hello Kitty Hot Chocolate is bright pink, and struggles to play the game Set in mediocre lighting.
We cannot have certainty because our senses disagree with each other. If I want to know if something is good, I ask my senses. Yet sometimes they disagree with each other. My eyes tell me this artichoke is just made of smooth leaves, yet my touch tells me it is prickly. My eyes tell me a lobster is scary and dangerous, and yet my tongue says it is delicious. My eyes tell me the molten glass in this glassblowing demo looks goopy and exciting and like a fascinating texture like putty which would be awesome to touch, and yet my touch tells me owwwwwwwwwww hot hot hot hot hot! My touch tells me this cat is delightful and fuzzy and yet my nose tells me I should not be near it because achooo!
We cannot have certainty because sometimes the same things seem different and lead us to different judgments in different circumstances. I might enjoy a food for a long time but then get food poisoning from it and, after that, always be revolted when I smell it. I might feel warm at 70 degrees but then be sick and feel cold at 70 degrees. I might think Gatorade is nasty but then be dehydrated and think it tastes great because my body craves electrolytes.
We cannot have certainty because the same objects seem different from different perspectives. A mountain that looks like a face from one angle looks like a random jumble from another. A square tower seen from a distance seems round. A stick in water looks bent. The moon above a skyline looks much bigger than the buildings but we have no real sense from that of how enormously big it really is, and can only realize the latter using a lot of math, or a space shuttle.
We cannot have certainty because we never see objects alone. Have you ever had one of those articles of clothing that looks purple in some light and blue in other light, so people argue over which it is? Because it looks one way next to one thing and another way next to another. Well, what does it look like really? We can never see anything alone, we always see it surrounded by other objects including air. If the stick is distorted by water, is it not also distorted by air? By vacuum? By light? We do not see objects, only groups of objects.
We cannot have certainty because things take multiple forms. Bronze is red, except when it turns green. Water is clear, unless it’s blue, or fluffy snow white. Squid ink is black, unless it’s diluted to form purple, or sepia. That molten glass is enticingly orange and squidgy. What do any of these things really look like?
We cannot have certainty because we experience everything relative to other things. We cannot see a thing without making some judgment about things that are relative: this clementine is small, this stick is long, this lake is large. Small, long and large compared to what? Other objects of comparison intrude themselves into our analysis. The clementine is small compared to oranges, the stick long compared to other sticks, the lake large compared to my back yard. But we cannot judge things without judging them relative to others. To feed his fish for a while my father was growing Giant Amoebas. Giant Amoebas! Amoebas so big you could almost see them with the naked eye! They were huge! They were smaller than grains of sand and yet I thought they were huge!
We cannot have certainty because we are biased by scarcity. I love this one, and I love its classic example. This is about how we judge things to be… well, frankly, how we judge them to be awesome or not. For example, comets are awesome. A little bright speck appears in the night that wasn’t there before, and flies across the heavens, really fast, so fast you can almost see it move! When there is a comet we get very excited. We discuss it, announce it, get out telescopes to look at it. In past ages people might pray to it, or read omens from it; now we photograph it and shoot probes at it. It’s super exciting: little bright speck in sky. Okay. So, every morning an enormous blinding ball of fire rises from the horizon, blotting out the night and transforming the entire sky to a wall of brilliant blue brightness streaked with rippling swaths of other beautiful colors, and it radiates down heat enough to transform our weather, burn our skin and feed countless life forms. It is, from any sense-perception objective sense, ten skillion times more exciting than a comet. But it’s just the sun, so, shrug. We are biased by scarcity. Two new poems by Sappho turn up and we all hear of it, but we find thousands of pages of unknown Renaissance poetry every year.
We cannot have certainty because different peoples have different customs, habits, laws, beliefs and ethics, and are biased by them. I think you all know this one. Though it will take over a millennium to become so common, since cultural relativism isn’t a broadly-discussed or accepted thing until the Enlightenment when Montesquieu and Voltaire made themselves its champions. Skepticism has a long road ahead of it, from Pyrrho to the present. But for now, let’s sit back with Socrates and picnic on this raw form of classical, eudaimonist skepticism, challenging our science-loving, learning-loving, exploration-loving, post-enlightenment selves to test ourselves with the question of whether it might be a safe and happy thing sometimes, in its own strange way, to not know. And we should also comfort Sartre a bit–he hadn’t heard, before today, about poor Pluto. (Descartes: “What’s Pluto?” Socrates: “Are you sure you want to know?”)
This is not a full post yet, but an update, and a recommendation.
The process of transitioning to new hosting is well underway, bugs are vanishing and new features will be online soon. The site is already loading faster, and other new things will follow. UPDATE: the photo album is now fixed. Links will be a little slower to regenerate, but they will in time. Bug reports remain welcome.
Meanwhile, I have an enthusiastic recommendation to make for everyone who has been enjoying the historical and philosophical side of this blog. My work on figures like Machiavelli and topics like the history of atheism grew out of my training in intellectual history. The turning point that set me solidly on this path was a pair of classes on European intellectual development in the 17th and 18th Centuries, by Prof. Alan Kors at Penn. The lectures are truly amazing, clear and moving, chronicling the development of the scientific method, the crisis sparked by Thomas Hobbes, the new models of mind and nature advanced by Locke and Newton, the extraordinary and oft-neglected Pierre Bayle. The second half covered the advent of the Enlightenment, which gave me my first real taste of the great firebrands Voltaire, Rousseau, Montesquieu and Diderot, and the revolution of the mind which so shaped our present day. The very same lectures by Alan Kors are now available on CD/DVD/download through The Teaching Company, and usually cost more than $100, but they are temporarily on sale for about $30, a little more if you want the video version. So if you enjoyed my Machiavelli series, and if you like audiobooks, I can’t recommend them highly enough. You can order them here. (There is not, alas, a printed book equivalent of the same content by the same author, but his book Atheism in France, 1650-1729 is, while out of print and rare, independently excellent.)
Update: that sale is over but new ones come up sometimes and this page has coupons for The Great Courses.
Hopefully that will tide you over until my start-of-semester to-do list eases enough to let me write another essay. Soon!
Two quick announcements, then something fun to share.
First, comments were disabled for a little while. Now they are enabled again. Apologies to everyone who wanted to discuss Beccaria – I hope you still want to discuss him, and now you can.
Second, people have been reporting trouble subscribing by RSS. I have investigated, and it seems that, while Firefox, Explorer etc. are fine, Chrome won’t do RSS (for this site or any site) unless you install a Chrome extension for RSS. Googling “Chrome extension RSS” will supply a variety of equally viable methods. However, for those who are struggling with RSS and can’t get it working, I have created a mailing list which you can register for in the right-hand sidebar. Whenever I make a new post I will e-mail the list to alert people. I recommend, however, that you use RSS instead of the mailing list if you can, because RSS will definitely alert you, whereas the mailing list is hampered by my ability to remember to send the e-mails.
Meanwhile, I will take this opportunity to present another of my favorite objects in the Florentine Museum of the History of Science (aka. Museo Galileo): the Noon Cannon. This is a strange variant on a sundial. A tiny cannon, well under a foot long, is mounted outside, ideally in the gardens of a grand estate. It is fixed in place on a stone slab, with a lens positioned above it. At precisely noon each day, the lens focuses sunlight onto the cannon, heating up the powder charge and making it go off. If every morning you load the cannon with a little bit of gunpowder, then you will be reliably alerted to noon by the sound of a small explosion from your garden. The effect is sort-of like a water clock except, instead of tranquil trickling and the tap of wood on stone, there is a ka-boom.
I think the specimen in the museum is probably from the Eighteenth Century, possibly the Seventeenth, but I can’t remember off the top of my head. Of course, no one in our era can see a Noon Cannon and not instantly think of its potential uses in an old-fashioned murder mystery. Simply put shot in the Noon Cannon along with its daily charge, lure the victim to the garden at the specified time, and you can be miles away having an alibi while the Noon Cannon does the rest. “The Colonel put real shot in the Noon Cannon? How dastardly!” The killer could even mess with the lens to make it fire at an unexpected time, then play around with other sources of a substitute noise, a hunting rifle or a champagne cork to simulate the 12 PM shot… it writes itself…
“Make everyone read Beccaria!” is one of many sentiments I share with François-Marie Arouet, better known as Voltaire.
This post was prompted by two things.
The first was this comment responding to my post about the two recent Borgia TV series, which mentioned TV depictions of horrific pre-modern executions.
Jen: “I am watching the final episode of The Borgias, Season 2, in which Savonarola is tortured and burnt at the stake, and again I find myself wondering – what was the supposed justification and thinking behind these acts? What did the church think burning people achieved? I know it was meant to be symbolic in some way, but of what I don’t know. I just do not understand why people were capable of such hideous acts of evil and why they did not realise that it was evil? How on earth could they reconcile this acts with their supposed devout religious beliefs??? Why was torture used without a second thought? So many questions about humanity and religion. Why did it take so long for us humans to develop a moral compass, and to value compassion?”
Addressing all these questions would take me deep into fraught realms of psychology, speculation, and accusation, and also deep into unhappy contemporary controversies over torture and capital punishment, none of which I want to stick my foot in. I do believe I can respond in one useful way with an historical portrait of one important moment in the history of this question. This is also one of those great undersung moments of real history which is so unilaterally good that it can make us all feel that much more proud to be human.
My second prompt was a recent experience with jury duty. There was some excitement among my friends when I was summoned for jury duty, speculating about how exactly I would get myself disqualified, since they were confident no attorney in the land would want me. I did rather want to be on the jury, in the name of interesting life experiences, so I started out trying to be inert and quiet, but eventually the defense attorney brought up that he saw from the sheet that I was a professor and asked me what I taught, and it was clear from that that I was pre-disqualified whatever I did, so I decided thereafter to be honest. The jury selection scene was so stereotypical as to be almost a parody of itself, with a clean-cut young city slicker prosecutor, with a distinctively stylish haircut, black pinstripe suit, and rimless glasses, who had such a boyish face he might have passed for an undergrad, facing off against a gray-haired defense attorney in a corduroy jacket and jeans with a southern drawl and a giant belt buckle shaped like Texas.
In his slow, meandering style (and with a gratuitous, emotionally manipulative photo of a mother cradling a baby on his Powerpoint, which was absolutely unrelated to any aspect of the case at hand) the defense attorney proceeded to go along the line and ask each potential juror what they thought the purpose of judicial punishment was: deterrence or rehabilitation. When asked to define “deterrence,” he explained it as “punishment, let’s get ’em, eye for an eye, tooth for a tooth.” He went along getting a ratio of about two rehabilitations to one deterrence until he got to me. I froze a moment, pursed my lips, then delivered what was honestly absolutely the most restrained impassioned speech I could manage. “You’re conflating two different types of justice,” I said (rough reconstruction). “Eye for an eye justice isn’t deterrence, it’s retributive justice, and the two are radically different. Retributive justice selects punishments with the goal of inflicting suffering on the guilty party in order to achieve some kind of justice, balance, repentance, or fairness. Deterrence-based justice instead selects punishments based on what effect the punishment will have on the general population as a disincentive discouraging the crime in question. The two are not only different but, from an historical perspective, directly opposed, and their opposition is at the heart of all post-Enlightenment judicial codes including our own, thanks to the influence of Voltaire and Cesare Beccaria.” By this point the court stenographer declared me her eternal enemy and halted the proceedings so I could spell Cesare Beccaria for her, slowly, twice. Both the lawyers gave me that special sort of “And this is why we don’t put people with Ph.D.s on juries” smile, but I was satisfied to find that two other prospective jurors after me did speak up and say, “I agree with the professor, retribution isn’t deterrence.”
It is the moment of the birth of this distinction that I want to visit today. This moment addresses Jen’s questions about why medieval governments and the Church used so much violent torture, not by analyzing the Middle Ages, but by revisiting the first moment that the very questions Jen asked were asked by someone else, and thereby entered the central conversation of European thought, with real and wonderful consequences.
Some other day I will sing the praises of the Enlightenment in their full glory. For now suffice to say that the Age of Reason deserved its title. In the seventeenth century, the new philosophers, especially Descartes and Francis Bacon, had birthed the new and exciting idea that, by applying Reason and systematic analysis to things, human beings could find ways to alter them to make them more rational and better, for the good of all humankind. They saw Reason as a tool supplied by Nature and/or God to let human beings govern themselves and improve their condition, with the power to achieve anything humanity could dream of if we work carefully enough and long enough. In this spirit, intellectuals investigated engines, spinning methods, the circulation of the blood, birthing procedures, baking chemistry, light, optics, physics, and refrigeration, and discovered many new things which promised greatness, and some which were already delivering. As the eighteenth century approached, the methods which had been applied primarily to what we might call hard sciences (with the terrifying exception of the shadowy “Beast of Malmesbury” a.k.a. Thomas Hobbes, whose fascinating infamy I hope someday to treat as I have Machiavelli’s) began with increasing frequency to be applied to other matters: government, law, justice (see Montesquieu and Locke), religion (Rousseau, Paine), and eventually crimes and punishments. If human institutions are held up for examination before the Light of Reason, claims the Method, they can be revised to be more rational and better, also better in line with Nature – with these improvements we will make a better world. It was this effort which was spearheaded by the great lights we remember: the Encyclopedia Project, Voltaire, Diderot, d’Holbach, d’Alembert, Franklin, Jefferson, and taken even further by other more chilling figures like La Mettrie and Sade.
Cesare Beccaria was from Milan, a nobleman and jurist under the Hapsburgs. He and other excited young intellectuals were enthusiastic readers of the firebrand treatises of Voltaire and others which trickled down from France. In that spirit, they set up their own intellectual circle jokingly named “L’Accademia dei pugni” (the Academy of Fists). Beccaria was interested in applying Reason’s razor to the ancient law codes he was now empowered to enforce (in the name of foreign but theoretically enlightened rulers in a conquered but civilized land). The young Beccaria, who was only 26 at the time, collaborated with Pietro and Alessandro Verri and produced, in 1764, a tiny little treatise On Crimes and Punishments. It was released anonymously, to protect its radical authors. It was thereafter translated into French, where it became an immediate sensation, particularly since Voltaire, The Pen Mightier than Any Sword, embraced the treatise like a long-lost child, wrote a commentary on it, and shoved it at everyone. Though there were three minds behind the treatise, Beccaria was chosen to author it because of his flair for rhetoric. You can see it in the opening lines, which precisely express the first time someone asked Jen’s big question “Why did Europe of that era use such gruesome punishments?”:
Some remains of the laws of an ancient conquering people, compiled on the authority of a prince who reigned twelve centuries ago in Constantinople, later mingled with Lombard customs and collected in hodge-podge volumes by unofficial and obscure commentators–this is what forms the traditional opinions that in a large part of Europe are nonetheless called “law.” Moreover, it is today as pernicious as it is common that an opinion of Carpzov, an ancient custom cited by Claro, or a torture suggested with irate complacency by Farinacci, should be the laws unhesitatingly followed by those who ought to dispose of the lives and fortunes of men only with diffidence. (Young translation, Hackett, 1986)
In On Crimes and Punishments Beccaria examined the purpose of extreme punishments, thereby exposing, certainly not the only answer, but a set of answers which he then used to propose a shocking new way to think about punishment: deterrence.
Beccaria begins from the extremely Enlightenment position of considering the pleasure-pain principle the natural core of human (and animal) life. Animals, people among them, pursue happiness and flee unhappiness: pleasures including food and love but also virtue and success; pains including physical pain, deprivation, shame, and death. The purpose of a legal system is to ensure and protect a situation which will secure the most happiness for the most people. Just as a farmer must examine his methods to choose the techniques that will produce the most wheat of the best quality, so must the jurist examine his laws and punishments and choose those which will best protect and cultivate the common happiness of the people.
Beccaria follows Montesquieu, following Locke, in his political fundamentals. He believes in Laws of Nature, among them the rights to life, liberty, and pursuit of happiness. He believes that governments are instituted by a Social Contract, created by humans for mutual protection and benefit. Fearing their defenselessness in the State of Nature, early humans united together, sacrificing a small portion of their liberty to create the sovereignty of the state so it could protect them “against the private usurpations against each individual.” In this system, governments were not created by God with divine right, as was the traditional view, but they do have divine sources in that Reason and Nature are divine creations, and Reason is God’s gift to humanity to let humans protect and govern themselves. He therefore will not accept arguments that invoke religious justification against Reason, because in the dominantly Deist spirit of the Enlightenment, even an Italian Catholic believes that God is Light and Reason and therefore that if Reason and divine edicts seem to contradict there must be a mistake somewhere. Reason and religion, if both true, will always, the age believed, align. In his treatise on the small topic of crime and punishment, therefore, Beccaria sees himself contributing a footnote to preceding treatises on rational government, rational law and rational religion, Montesquieu’s Spirit of the Laws foremost among them. And by “he sees himself contributing a small footnote,” I mean in the most sweet and adorable way, as this passage sums up:
The immortal President de Montesquieu touched hastily upon this matter. Indivisible truth has compelled me to follow the shining footsteps of this great man… I shall count myself fortunate if I, as did he, can earn the secret gratitude of the little-known and peace-loving followers of reason and if I can inspire the sweet thrill with which sensitive souls respond to whoever upholds the interests of humanity! (Introduction)
And he took up this great topic with the overt intention of beginning an international dialog, inviting replies thus:
Whoever would wish to honor me with his criticisms, I repeat, should not begin, then, by supposing that I hold principles which are subversive either of virtue or of religion… But anyone who will write with the decency that becomes honorable men and with enough intelligence to free me from proving elementary principles, of whatever character he may be, will find me not so much a man eager to reply in his own defense as a peaceful friend of the truth. (Address to the Reader)
In all this, it is important to remember that, in Beccaria’s examinations of “Why do we use torture?” and “Why do we execute people?”, he does not have modern psychology in his analytic repertoire. He cannot, as we would, suggest that public executions were social catharsis, venting aggression in a controlled way, as sports would later. He cannot discuss the psychological relationship between the authority and the condemned, or talk about how sentences reinforce personal power or vent subconscious drives. He acts, as all pre-Freud thinkers do, on the belief that all human behavior stems from active, conscious decision-making. Some actions may be unexamined, i.e. based on bad logic and false conclusions, and actions based on imperfect information lead to error, but they are still based on some form of mental calculation, and the better examined they are, the more likely they are to be right. The judges enforcing the old mongrel legal code, part Roman, part Lombard, which Beccaria asks us to question, do so, in his view, in an unexamined way, falsely believing that that code is good and right in itself, or at least serves their ends. They have not examined it under the light of reason and asked what the utility is of each law and punishment. But they still decide to enforce this law code rationally, consciously, knowingly, not for hidden reasons deep in the root of the inaccessible mind.
What, Beccaria asks, is the purpose of legal punishment?
By Beccaria’s metric, all activities of the state must serve its primary function, that is, to provide the most happiness to the greatest number of citizens. This follows from the principle that the state is founded on the basis of reason for the protection and happiness of the people. Any aspect of the government, and within that of the legal system, which does not help serve this mandate to protect and distribute happiness will be rejected as irrational. All punishments, then, must serve to increase human happiness. He agrees with Montesquieu that “every punishment which does not derive from absolute necessity is tyrannical.” (ch. 2) From this he concludes three principles: (1) that only law, and not individuals with some kind of special authority, can justly impose punishments, (2) that since punishments derive from a social contract which binds all people equally, all people are equally bound and are entitled to the same treatment and the same punishment under the law, and (3) that excessively cruel punishments which have no benefit to public happiness have no justification and are tyrannical, and contrary to the virtue of reasoning people.
How do we determine the appropriate severity for a punishment? It should, he argues, be measured based on the harm done to the nation by the crime, and the punishment should be proportional, and focused on preventing the crime. In other words, deterrence. Ever the Enlightenment scientist, Beccaria likens self-interest to gravity, a powerful and universal force driving people toward action which can only be stopped by an opposing force. Thus when self-interest directs toward crime, that drive must be countered by an opposing one: fear of punishment. Prevention of crime, then, is the sole justification for judicial punishment in Beccaria’s analysis, not retribution, nor the at-this-point-largely-undreamed-of idea of rehabilitation.
Can the cries of a poor wretch turn back time and undo actions which have already been done?… The purpose of punishment, then, is nothing other than to dissuade the criminal from doing fresh harm… punishments and the method of inflicting them should be chosen that, mindful of the proportion between crime and punishment, will make the most effective and lasting impression on men’s minds and inflict the least torment on the body of the criminal. (ch. 12)
He does, however, review (in ch. 7) what he sees as other traditional justifications for proposing punishments, and it is here that his treatise gives us a snapshot of what one legal expert saw as the logic underlying the mass of gradually-accumulated law.
Some people, he says, have measured crimes on the basis of the dignity of the injured party (an interesting metric, and one the modern world has left far behind). Here he would be thinking of how a crime of a commoner against a nobleman is far more harshly dealt with than one against another commoner. If this is the system of logic, we can see why offenses against the Crown or against a lawful feudal lord could be punished with great severity, if they are read as injuring the Dignity, Grace, or Person of the sovereign. To use the Robin Hood example, if one hunts the king’s deer this seems like a minor injury if we see it as harming the deer, forest, or warden, but if the offense is seen as being one against the dignity and rights of the king then, by rank proportion logic, a punishment sufficient to avenge an offense against such great dignity must indeed be extreme. Yet, Beccaria argues, this type of reasoning cannot be the true metric people are using, because if so then crimes against God, i.e. blasphemy or irreverence, would be punished far more gruesomely and severely than the assassination of a monarch. Crimes against God were indeed punished very severely in his era (see the extreme examples of burning at the stake), but the assassin of a king was certainly regarded with more hatred, and executed with more gruesome creativity. In addition, actual burning at the stake for heresy or blasphemy or even witchcraft was, in the era of the Inquisition Beccaria was familiar with, exceptionally rare. Extreme cases like that of our dear Giordano Bruno did indeed end with blood and fire (a particularly visceral reality for me since he was burned alive a few paces from the apartment where I used to live in Rome). 
But in the Italian Inquisition such cases were rare, exceptions, usually examples brought on by some special political circumstance, and the usual sentence for blasphemy or even devil worship was being forced to sit through a bunch of boring religious re-education seminars and recite a lot of prayers (see the work of Nicholas Davidson on the Inquisition in Venice). Clearly, Beccaria concludes, the logic of the current law cannot always be that the punishment is chosen to be proportional to the dignity of the victim, but that type of thinking does seem, to him, to be an inconsistent but present factor in the thought behind the gore.
Other people, Beccaria says, have proposed that the punishment should be in proportion to the crime, i.e. “that the gravity of sin should play some part in the measurement of crimes.” In other words, that the purpose of punishment could be to achieve some kind of abstract balance or justice, righting wrongs, giving criminals their just deserts, etc. This reasoning he sees behind some aspects of the current law, and certainly it fits an eye for an eye and a life for a life, though it doesn’t quite help us understand the practices of hacking off a hand for theft, or sawing a man in half from crotch to head for committing murder on a day that irritated the pope. But choosing punishments to balance the gravity of sin, Beccaria says, is also contrary to Reason. His argument? He asks us to look at “the relationships between men and men, and between men and God.” The former, he says, are relations of equality in which issues of common utility are primary, since those are what form the relationships between people. Thus utility, not abstract justice, should govern such relationships, and thus if punishments are to be based on relations between people, then utility, i.e. deterrence, should be the deciding factor. As for relations “between men and God,” it is here that Beccaria puts the idea of abstract, cosmic, or universal justness demanding that a crime be punished. He then argues that it is not humanity’s task to pursue universal justice.
If [God] has established eternal punishments for anyone who disobeys His omnipotence, what insect will dare to supplement divine justice? What insect will wish to avenge [wrongs against] the Being Who is sufficient unto Himself, Who cannot receive impressions of pleasure or pain from objects, and Who alone among all beings acts without being acted upon? The seriousness of sin depends upon the unfathomable malice of the human heart, and finite beings cannot know this without revelation. How, then, can a standard for punishing crimes be drawn from this? In such a case, men might punish when God forgives and forgive when God punishes. If men can be in conflict with the Almighty by offending Him, they can also be so by punishing.
It is interesting for the modern observer to note how directly Beccaria equates notions of abstract justice or balance with the idea that crimes are offenses against God. At no point in his treatise does Beccaria undertake to argue against any concept of secular universal justice. Justice is, for him, either a question of balancing individual relations between people, where utility should reign, or it is a matter of religion. Sin, with all its religious weight, is the word he chooses when discussing the idea of proportional punishment–people, he says, think punishment should balance sin, not evil, or wrong. It does not occur to Beccaria that anyone might propose a secular moral code demanding that killers get their just deserts, etc. The only secular principles he would accept are those of Nature and Reason, though for him, as for so many Enlightenment figures, these factors are far from secular in his understanding. Despite Pierre Bayle’s comparatively recent but (in)famous argument to the contrary, Beccaria is still very much thinking in the era when even such a radical as Thomas Paine believed that an atheist could not be a citizen, would not respect the law, and would never have any reason to refrain from crime.
These, then, are Beccaria’s notions of what logic lay buried under the accumulated traditions and contradictions of pre-modern European law: avenging the dignity of the injured party, and proportioning punishment to sin. He rejects both of these as irrational, saying we may justly assign punishment only when it secures public happiness. For those who have read my Machiavelli entry on the three branches of Ethics, note here how Beccaria is arguing that human relations must be analyzed using utilitarianism, confining deontology to divine questions, though one can certainly make the case that he is applying a kind of deontology of his own, using his understanding of Nature and Reason as his abstract internal laws. This kind of Reason-based deontology, closely aligned to utilitarianism, is common among those Enlightenment figures who invoke Laws of Nature, or so-called self-evident principles.
Deterrence reigns, for Beccaria, as the keyword of the day. The purpose of punishment is to discourage crime, not to achieve balance or to avenge the dignity of the injured party. From this conclusion, Beccaria then derives a set of new and original guidelines for how punishments should be selected. Among them we find the following ideas:
Preventing crime is more valuable than punishing it.
Punishments for crimes should be proportional to the harm done to society by the crime.
Punishments should be as mild as they can be while still being an effective deterrent.
Every crime offends society, but only some crimes threaten the state with destruction, and it is on the latter that laws and punishments should focus.
Honor (the “despotism of opinion”) is not a clear and consistent moral code but a vague and blurry accumulation, hard for us to articulate and understand because it is so personal, much as an object too close to the eye is blurry and hard to focus on. Conflicts between honor, society’s self-interest, and the law have long caused strife.
Dueling is destructive, and in punishing those who cause strife by dueling the party who caused the offense should be held culpable, not the party who challenged him to the duel who “through no fault of his own, has been constrained to defend something that the laws on the books do not assure him, that is, the opinion which others hold of him.” (ch. 10)
Secret denunciations are more tools of calumny than justice and cause more harm than good (it was a widespread practice at the time to have boxes wherein citizens could deposit secret denunciations accusing each other of crimes, especially sodomy and blasphemy, and this was widely abused).
The more promptly punishment follows crime, the more powerful a deterrent it will be.
Since the criminal is doing pleasure-pain calculus, it is less important that the punishment be gruesome than that it be inescapable. The certainty of a mild punishment which is still bad enough to more than counter the benefit of the crime is more effective than a severe punishment which the criminal has a realistic hope of evading.
Crimes against property can be punished with fines, but crimes against persons must be punished with corporal punishment (which includes imprisonment/unfreedom) because otherwise people are reduced in dignity to objects bought and sold. He targets this sentiment particularly against the wealthy, who, in his era, generally paid a fine for crimes including murder, instead of suffering personal punishment.
Banishment is appropriate for those who have been accused of an atrocious crime which is not certain, and who cannot therefore be tolerated to remain. But the property of the banished person should not be confiscated by the state, since that is too powerful an incentive to corruption.
Punishments should be visited on individuals, not whole families, because punishing families as a unit encourages a spirit which thinks of the family as a political unit, rather than individual citizens, and this spirit is opposed to republican sentiment. Such a system would have people think of the paterfamilias as a monarch, and make the nation see itself as ten thousand tiny monarchies instead of fifty thousand free-thinking citizens. (From modern eyes, this is a great example of a sentiment widely agreed with in the modern era, that the individual and not the family should suffer for a crime, but justified by wholly period logic not present in modern legal discourse.)
Crimes are best prevented by combining enlightenment with liberty. The best possible preventative is perfect education.
Crimes can also be prevented by the state awarding rewards for virtue.
And, of course, at the heart of the new ground he intends to break, ground not treated by Montesquieu in whose footsteps Beccaria so reverently treads, lies torture:
What is the purpose of torture?
One proposed purpose, he begins, again trying to puzzle out what logic lies behind the present laws so he can point out its flaws, is that torture helps secure confession and extract truth. Torture’s usefulness as a method of extracting truth had long been a key assumption of the law, so much so that under some legal systems confessions were only admissible if they were extracted under torture, since that was considered the most reliable system (see Roman policies on interrogating slaves, where torture was a necessity before the court would listen). Beccaria then makes the argument (new in his day) that pain breaks innocent people too, so torture will force false confessions from the innocent. Thus, he concludes, torture is not a reliable path to truth, so the goal of extracting information does not rationally justify the use of torture. If torture has any real utility, it must therefore be as a punishment, rather than an interrogation tool. This leads to a very novel and yet, to us, very familiar argument:
A man cannot be called ‘guilty’ before the judge has passed sentence, and society cannot withdraw its protection except when it has been determined that he has violated the contracts on the basis of which that protection was granted to him. What right, then, other than the right of force, gives a judge the power to inflict punishment on a citizen while the question of his guilt or innocence is still in doubt?
In more familiar words, innocent until proven guilty. The argument is more utilitarian than moral: techniques which secure false confession are injurious to justice and society. He further argues that torture is better for the criminal than for the innocent man, a weird but interesting argument. Torture provides the criminal the chance to say “Hey, I deserve this pain, but if I endure it they’ll acquit me and I’ll be spared worse pain,” helping him bear it, while the innocent man suffers not only torture but the despair-inducing knowledge that he suffers unjustifiably, so that if he is found guilty he suffers an injustice, and if he is acquitted he has still suffered unjust torture. And on the practice, common in his day, of torturing the guilty to try to force him to confess to other crimes in addition to the one he is accused of, here Beccaria dips into some of his most biting rhetoric, writing: “This is equivalent to the following line of reasoning: ‘You are guilty of one crime; hence it is possible that you are guilty of a hundred others. This doubt weighs on me, and I want to reassure myself by using my criterion of truth. The law torments you because you are guilty, because you may be guilty, because I want you to be guilty’.”
Torture cannot therefore, Beccaria concludes, be useful before conviction, and must be used only after conviction, as a punishment, not a tool. But what function does it serve then? The purpose of torture could be to purge or cleanse the soul with pain. This idea is closely tied to religion, not just to Christianity but to a much broader palette of belief systems which hold that pain can discipline the body, clarify the mind, and cleanse the soul. In a broader sense (placing Beccaria’s discussion in context) Christian ideas of Purgatory and Plato’s depiction of the soul’s cleansing before reincarnation both use this idea that fire and pain can burn away past sin and also past bad moral/intellectual development, removing the weight of sin and past dark thoughts, making the soul pure, light, and open to truth. This is also reflected in monastic practices of mortification of the flesh, in the West and East. In this model, the idea is that the pain of an excruciating death is actually good for the convict by helping cleanse the soul and increasing the chances that the criminal will reform, either mending wicked ways and leading a good life thereafter, or, in the case of lethal tortures, paying for the crime before death, increasing the chance of getting into Heaven. Beccaria is so concise and articulate that it keeps being most efficient to just quote him directly:
Another ridiculous reason for torture is the purgation of infamy; that is, a man judged infamous by law must confirm his deposition with the dislocation of his bones. This abuse should not be tolerated in the eighteenth century. The underlying belief is that pain, which is a sensation, purges infamy, which is simply a moral relationship… It is not difficult to go back to the origin of this ridiculous law… This custom seems to be taken from religious and spiritual ideas which have so much influence on the thoughts of men, nations and ages. An infallible dogma assures us that the blemishes which result from human weakness and which yet have not deserved the eternal wrath of the Great Being must be purged with an incomprehensible fire. Now infamy is a civil blemish, and, since pain and fire remove spiritual and disembodied stains, will the spasms of torture not remove a civil stain, namely infamy? (Ch. 16)
In other words, he believes that the concept of Purgatory, and related beliefs that spiritual suffering purges sin and cleans the soul, led people to presume that physical suffering could purge the worldly equivalent of sin, “infamy” or criminality.
This is linked to the idea of certain crimes–mainly intellectual crimes such as heresy, blasphemy, or witchcraft–being somehow contagious, or harming the community of people who contact the criminal, either by spreading, or by inviting divine wrath which might, when punishing one sinner, withhold blessings from neighbors as well, so the plague or famine affects the whole city, doing public harm. Thus the purpose of torture could be to cleanse, not the convict, but the city or society. Here we turn naturally to the questions of heresy, blasphemy, atheism and other crimes of thought which loom ever over the populace, especially over the intellectual. This question Beccaria… evades… for now.
What is the purpose of gruesome execution? Here again torture fails Beccaria’s utility test. Beccaria argues that death is a sufficiently ultimate punishment that anyone who would not be deterred from a crime by death would not be deterred from it by death plus agony. If the sole purpose of punishment is to deter crime, heaping extra punishment on top of death counts for nothing. In fact, he goes further. Over time, he argues, as gruesome executions are repeated, and seen as spectacles, the hearts of people are hardened and the torture loses its edge as a deterrent. Since fear is at the heart of deterrence, Beccaria argues that what really matters in cases of Ultimate Punishment is not the actual severity of the punishment but the fact that it be Ultimate. Whatever the severest punishment of a society is, that will command the most fear from the would-be criminal. He posits two imaginary civilizations, one having as its Ultimate Punishment some brutal and protracted death, and the other perpetual slavery. He argues that the two will be equal in how successfully they deter crime, since in both the punishment will loom in the imagination as Ultimate Punishment, instilling the same fear. An interesting theory. As for making a public spectacle of executions, he argues that this trains people to think of execution with a mixture of fear, scorn, pity and perverse enthusiasm. With moderate punishments, though, fear is the only reaction, making them more effective deterrents. “The limit that the legislator should assign to the rigor of punishment, then, seems to be the point at which the feeling of compassion begins to outweigh every other emotion in the hearts of those who witness a chastisement…” (ch. 28). One flaw in the death penalty, he says, is that it means one crime supplies only one example of punishment to the nation, while a lifetime’s hard labor may let the nation continue to see and remember the crime, criminal, and punishment, and so be deterred lifelong.
This, of course, posits a system in which the populace has the opportunity to see the “enslaved” prisoner at work, and thereby be constantly reminded of the fruits of crime – Beccaria’s world is one of rock pits and chain gangs, not closed prisons which keep the imprisoned populace out of the public eye and memory.
Beccaria therefore advocates mildness of punishments, and argues against the death penalty, not because he thinks it is immoral, but because he thinks it is less useful than lifelong punishments. He also argues that it might make people suspect the law of hypocrisy, when those employed to punish homicide commit it, and that this confusion could undermine public respect for the law. Executions, he says, encourage bloodlust in the populace, and, he thinks, decrease rather than increase deterrence. But his argument against it is not entrenched – he is far more interested in arguing against gruesome punishments than against death, which he presents simply as a reasonable option which is not to be preferred while others are more effective. He does throw the full flower of his rhetoric into his argument against the death penalty, but not in order to move the reader’s passions to horror at how terrible it is to execute people. Instead he stresses how much everyone would rejoice and love their monarchs if the monarchs discarded the old laws and instituted new laws based on the Light of Reason. “How happy humanity would be if laws were being given to it for the first time, now that we see beneficent monarchs seated on the thrones of Europe!” (ch. 29). Today’s enlightened princes, he argues, genuinely want to make good and better laws, and in this Age of Reason they could finally strike down the old and muddled law and replace it with something rational and good, saving all humanity from the tyranny of archaic and defective law codes. He finishes this section with a sentiment very alien and unexpected to the modern reader: “If such monarchs, I say, allow ancient laws to remain, it is the result of the infinite difficulty of stripping errors of the venerable rust of many centuries.
This is a reason for enlightened citizens to desire more ardently the continued increase of their authority.” In other words, he believes that the best way to eliminate torture and gruesome executions is to have an absolute authoritarian monarch, who, moved by the spirit of the Enlightenment, and empowered to rewrite law and government as he will, will make a better, more rational government. Here modern readers, raised to associate “innocent until proven guilty” and bans on “cruel and unusual punishment” with democratic anti-authoritarian sentiments, experience a moment of healthy historical whiplash.
Toward the very end, after his outline of a new ethic of punishment and his declaration of confidence in enlightened monarchy, Beccaria at last turns, timidly, to that most dangerous of issues, that is the punishment of heresy, blasphemy, atheism, etc. I say most dangerous because this is the arena which could get our author in very deep and potentially lethal political trouble. At this moment, the violence of the Wars of Religion continues to flare, fresh religious persecutions and burnings are constantly in the news, and Beccaria must be a good Catholic or risk paying a lethal price which had become more and more common as Reformation concerns spread. If the pre-Reformation Inquisition’s most common punishment for heresy was a tedious course of lectures, this deep into the age of heavily politicized religious violence it was rarely the slow and methodical inquisitors and more often the swift secular magistrates, or the mob, who burned or massacred. Beccaria is a proud, free-thinking optimist who wants to reform and improve the human condition. His heart has thrilled at Voltaire’s calls for religious tolerance, at the pro-peace “Irenist” movement that had finally let England stop massacring its citizens over the differences between transubstantiation and consubstantiation. But he is also a Milanese Catholic and knows what fate awaits him who dares wake the barely-sleeping dragon. He does not even dare name the issue he addresses in “Chapter 39: On a Particular Type of Crime”. It is a single brave paragraph, which I will quote almost in its entirety, so as to give you a full taste of the artful irony and quiet grief of this intellectual forced to bridle himself, to make the mandatory profession of support for religious persecution. Yet, through the coat of lies he must paint across his message, the passion of his objection shines, clear as a star:
The reader of this work will notice that I have omitted a kind of crime which covered Europe with human blood and raised those terrible pyres where living human bodies fed the fire. It was a pleasing entertainment and an agreeable concert of the blind mob to hear the muffled, confused groans of poor wretches issuing out of vortices of black smoke–the smoke of human limbs–amid the crackling of charred bones and the sizzling of palpitating entrails. But rational men will see that the place where I live, the present age, and the matter at hand do not permit me to examine the nature of such a crime. It would take me too long and too far from my subject to prove how a perfect uniformity of thought is necessary in a state, the example of many nations to the contrary not withstanding; how opinions that differ only in a few subtle and obscure points altogether beyond human comprehension can nonetheless disturb public order if one of them is not authorized to the exclusion of the others… It would take me too long to prove that, however odious the triumph of force over human minds may seem, since the only fruits of its conquest are dissembling and, consequently, degradation; however contrary it may seem to the spirit of gentleness and brotherly love enjoined by reason and the authority we most revere; it is still necessary and indispensable… [In this treatise] I speak only of crimes that arise from human nature and from the social contract. I do not address myself to sins; their punishment, even in this world, should be governed by principles other than those of a narrow philosophy.
A sad self-portrait peeks through here. Odious force has triumphed over human minds and degraded Beccaria. He, and his partners, must dissemble. Yet the conquest is not full. He has hope still that his little treatise will be read by kindred spirits, by those “sensitive souls [who] respond to whoever upholds the interests of humanity,” by those fellow readers of Locke and Montesquieu who will read between the lines and recognize him as a “peaceful friend of the truth.” His hope is not in vain.
What were the consequences of young Beccaria’s little treatise On Crimes and Punishments?
It spread like wildfire. It penetrated the salon culture whose radical intellectual experiments had inspired Beccaria’s Accademia dei pugni. And it reached Voltaire. Voltaire, who exercised literally unprecedented influence, as a new age saturated with printing houses made it possible for the first time for an independent intellectual to support himself, and to see that his ideas reached every corner of literate Europe with unheard-of speed. Voltaire, whose wit and incisiveness made everyone sit up and listen, not only intellectuals but the great public he entertained. Voltaire, who had just come through the terrible crisis of the Lisbon Earthquake and the death of his beloved Emilie, and who in Candide (1759) proclaimed his conviction that it is the duty of a thinking person to cultivate the human garden. This moment began the latter stage of Voltaire’s career, when he moved from popularizing Enlightenment ideals to direct political activism. He campaigned against religious violence and judicial murder. He spoke out against particular cases and trials and fired France with outrage and calls for reform. And he made sure everyone read Beccaria.
And it worked.
Rarely in the history of thought do I have a chance to say the outcome was so simply good, but it worked. Within their lifetimes, Voltaire and Beccaria saw real reform, a sincere and solid transformation of the legal codes of most of Europe, the spread of deterrence-based judicial thought. Within decades, judicial torture virtually vanished from European law. The laws of America, and the other new constitutions drafted in the latter 18th century, all show the touch of Beccaria’s call. It worked. The change was not absolute, of course. Torture, the primary target, retreated, as did the notions of retributive justice, avenging dignity, and purging sin. But prisons were still squalid, punishments severe, and other things Beccaria had campaigned against remained, capital punishment primary among them. But even here there was what Beccaria would call progress. The guillotine lives in infamy, but it too was a consequence of this call for enlightened justice: a quick, egalitarian execution, death with the least possible suffering, and equal for all, giving no advantage to the noble, who had long been able to hire an expert and humane headsman while the poor man suffered the clumsy hackings of an amateur who might take many blows to sever a writhing neck. Most states judged death still necessary, but agreed that law and punishment should bind all men equally, and that unnecessary pain did not serve the public good. It is strange to call the guillotine a happy ending, but it was in a small way, and even more victorious was the dialog that birthed it. The first country to ever abolish the death penalty was the Grand Duchy of Tuscany, which did so on Beccaria’s utilitarian grounds rather than principle (Hey, look, Machiavelli! Your new branch of ethics, flourishing in Florence!).
Between them, Beccaria and Voltaire made people think seriously and critically about the tortures which had been employed so long without consideration of their purpose. Beccaria asked people to ask themselves why we use torture, and the reading public did just that. Judges examined the questions, jurists, even kings. And they did change things. Even the sad and careful chapter about “a particular type of crime” had its impact. After all, in the eighteenth century so many carried the torches of reform that even among the magistrates and priests and censors whose job it was to suppress threats to the status quo, many were secret sympathizers, in favor of the changes they were employed to slow, and willing to read Beccaria’s chapter “on a particular type of crime” and realize (as we can’t fail to) his true meaning, but give it the stamp of approval anyway, and hope wholeheartedly that it would do some good. It did. Not universal good, not perfect. It needed a next step, and there were many atrocities it did not manage to prevent, especially in the colonial world. But it did real good nonetheless. The days of European governments and Churches sawing men in half gave way. And when later on there were movements to reduce violence against slaves and conquered peoples, these too owe some thanks to the 26-year-old jurist from Milan who turned his friends’ idealistic ambition into such potent prose.
The target of Beccaria’s treatise was not torture itself, nor the death penalty, nor even the concept of retributive justice. His target was the unquestioning acceptance with which his age enforced the mass of traditional opinions which was then called “law.” We have not eliminated torture from the world, but, in the nations touched by the Enlightenment at least, that unquestioning acceptance of old laws has been conquered. We still have much to fix, many more steps to take in the footsteps of Voltaire and Montesquieu, but if, when I turn up for small town jury duty, the defense attorney begins by asking the jurors our opinions about the purpose of punishment, then, even if he blurs deterrence and retribution, even if the court stenographer doesn’t know how to spell Beccaria’s name, Beccaria is present in the conversation, and the fact that there is a conversation is his victory. And ours.
Periodically readers ask me if there is a “tip jar” or other way to support Ex Urbe and say thank you for my work. Normally the answer is no, I do it out of pure enjoyment and the desire to share the places and periods I study. But short-term there is a way.
I compose music (polyphonic a cappella and mythology-themed folk music), and have just launched a kickstarter to raise money for my next project. I have been working on this set of songs for ten years, and have given my mythological subject the same care I have given the Renaissance in my posts, so it has come together to be quite something (though I do say so myself). I can’t post a direct link here because I still want to preserve the public anonymity of this blog, but if any readers would like to support me, or check out the project, please e-mail me (JEDDMason at gmail dot com) and I will send you the link. Thank you.
Meanwhile, for your enjoyment…
I want to introduce you to my favorite artifact at the “Museo Galileo,” the Florentine museum of the History of Science. It is a unique artifact, a “Military Compass.” Designed in the seventeenth century, it is the epitome of a gentleman’s weapon: a dagger which splits apart to become a geometric compass, so the educated wielder can use it to measure ground, calculate the sizes of fortresses and distances, and aim artillery:
A hinge in the hilt allows the blade to snap apart. The two halves are marked as rulers, and a weight drops out of the handle into the center to form a central measure, making it possible to use it as a compass, as a level, and for many other types of calculations. The pommel flips open to reveal a small magnetic compass, and pieces fold out from inside the hollow blade to provide additional tools for calculation. But if it is folded up, it is as deadly as a dagger should be. Such weapons were far from commonplace, more curiosities, status symbols and showpieces, but they were certainly designed to be usable. We have records of them as early as the 16th century, when one was designed for a Medici commander.
Instructions for the use of such a weapon are preserved in an anonymous 16th century Italian manuscript, also on display:
The museum also preserves other gentlemen’s instruments, such as walking sticks with concealed barometers, telescopes and other paraphernalia far more impressive than a cane-sword, but this one has always caught my imagination as something which encapsulates the ideal of the warrior-scholar-adventurer-noble which is so much at the heart of romantic notions of the “Renaissance man”, both our notions and the ideals they had in the period. I want to read a book where the protagonist carries this. I want to read ten books where the protagonist carries this. Also, as a reminder that the past never fails to sex-segregate no matter how unnecessary it is, they also have items intended for ladies of the period at the museum, including a “lady’s microscope” which is petite, delicate and carved out of ivory.
Was Machiavelli an atheist? We don’t know and never will, but we can learn much about our society’s attitudes toward atheism by examining the persistence of the question, and the different reasons we have asked it over and over for centuries even though we know we have no proof.
No historical discipline can be honestly called “neutral”, but the study of atheism (and of its cousins skepticism, deism, and more general freethought, heterodoxy and radical religion) has always been particularly charged because it is so impossible to be detached from the central question. Setting aside the elaborate and bloody history of religious violence, oppression and entanglement in politics: whether you answer “No,” “Yes,” “Maybe,” or “Sort-of” to the question of whether or not there is a divine force and/or being(s) ordering or governing the cosmos, your answer has an enormous impact on your everyday actions, decision-making, ethics, attitudes toward law and government, and every other corner of the human condition. Even if religion and government had never mixed in the history of the Earth, if tomorrow you encountered irrefutable proof that the answer to the question was the opposite of what you had hitherto believed, your life and actions thenceforth would be radically different. The stakes are high, and personal. This makes it hard for historians to be calm about it.
Historians did not try to be calm about it in the early, juicy days when atheism was first presented as having a history. In the late sixteenth and seventeenth centuries, pamphlets and books discussing famous atheists were a thriller genre, scandalous tales of tyrants and madmen which occupied largely the same niche as biographies of serial killers, or penny museums displaying the death masks of executed murderers. Treatises on “Infamous Atheists” served a slightly more learned audience than wax heads and the numerous early versions of the Sweeney Todd legend, but only slightly, and as they proliferated in printing shops tales of the scandalous excesses of Tiberius and Caligula under the label “atheist” were part morality play, part voyeurism, and part slander as each particular collection targeted its audience’s enemies. French collections accused Italians and Englishmen of atheism while Italian collections accused Frenchmen; Catholic collections accused Martin Luther and John Calvin of atheism, while Protestant collections accused popes and papists, and almost all European collections accused Muslims and Jews of atheism in a spirit of general racism and lack of accountability and lexical clarity.
You may note that neither Martin Luther nor Caligula is on record as ever having philosophically attacked the existence of God, but the logic chain of these collections is, from our perspective, backwards: (1) Fear of Hell drives men to good behavior. (2) These men were bad. (3) These men did not fear Hell. (4) These men were atheists. In the Renaissance, sinful living in overt defiance of divine law was considered evidence of atheism, to the degree that we have records of many atheism trials from the fourteenth through sixteenth centuries in which the evidence brought by the prosecution involves no statement of unbelief on the part of the accused. Rather the evidence will be sinful living, promiscuity, homosexuality, gluttony, irreverence of civic and religious authority, anything from a monk taking in a mistress to a drunkard running around in public with no pants on (See Nicholas Davidson, “Atheism in Italy 1500-1700,” in Atheism from the Reformation to the Enlightenment, ed. Michael Hunter & David Wootton (Oxford, 1992), 55-86, esp. 56-7).
Serious attempts to write a history of atheism began in the later nineteenth century, when secularization had progressed enough that an atheist was no longer a thrilling exotic creature, but was instead a black sheep in a land with many, many sheep of which some were even more alarming colors than black. It was also at this point that histories of atheism bifurcated. Some presented pessimistic accounts (by theist authors) of the modern decay of morals as atheism proliferated, while others presented optimistic accounts (by guess who) of the progress of secularization. Even in their more objective accounts, when dealing with earlier periods when atheism was rare and its traces elusive, these historians were, or rather we historians are, still prone to hyperenthusiasm when we think we have found what we are looking for, as whale watchers may mistake any dark shape for a humped back.
Everyone (whether theist or atheist) who studies pre-modern atheism is excited when we find evidence of it. This is because secularization, this brave new world in which atheism is both commonplace and legal, is an essential characteristic of the modern Western world, one of its unique features, differentiating us, here, now, from all earlier times and all other places. When I say the modern world is secularized I do not mean that atheism is a majority or even a plurality—it remains a small minority. What I mean is that atheism is universally present in Western discourse as a coequal interlocutor in theological debate, and all contemporary Western theists have lived their whole lives in contact with atheism, debating with atheists or at least expecting they might have to do so, and generally knowing that atheism is a commonplace alternative to their own views. This is radically different from the pre-modern situation, in which people saw atheists as elusive and invisible enemies (rather like vampires), and most books on the subject described atheism as a form of mental illness (often thought to be inborn), or as a moral perversion (compared in the period to homosexuality), while the genuine philosophical atheist was expected to be so extraordinarily rare that we might see only a couple in a century (such categories are employed by David Derodon in his treatise L’atheism Convaincu (1659), see Alan Kors, Atheism in France (1990), p. 28).
If the study of history is more than mere delight in exciting stories of past exploits, it is an attempt to understand our origins and ourselves. When we comb the past and spot something characteristically modern—be it the scientific method, hygiene, feminism or atheism—we are excited because we have found an early trace of home. Religious tolerance and the presence of atheism as a coequal participant in religious discourse in our own day is part of what makes us radically different from our predecessors. The following claim may seem counter-intuitive, but if I were to send an average modern American theist back in time to the seventeenth century, I think that person would debate more comfortably with an early atheist than with a theist of the same era, because the atheist, while disagreeing with our time traveler, would be disagreeing with somewhat familiar vocabulary and justifications, while the seventeenth-century theist would be going on about Aristotle, and teleology, and angels pushing the Moon around, and other fruits of an alien religious conversation that has no experience of 90% of the theological issues which our modern time traveler is used to considering. The seventeenth-century atheist probably knows what “natural selection” is (he read about it in Lucretius) but the corresponding theist probably hasn’t read such a rare and stigmatized text, so when our time traveler says “I want a proof of the existence of God that stands up against natural selection,” the atheist can have that conversation, while the theist is much less prepared. For most Renaissance theists Thomas Aquinas’ Proof of the Existence of God from Design is unassailable; for us, it’s been assailed every minute of every day of our lives; for the early atheist, it has an assailant, and it’s a similar assailant to the one we moderns are used to, so we can talk about it with the atheist and feel more at home than if we tried to talk to a theist who had never experienced any such attack. 
A pre-modern theist is, of course, well prepared for attacks from heresies we no longer worry much about: Arianism, Averroism, Antinomianism, but Darwin is a bolt from the blue. Not so much so for the early atheist, who, whether right or wrong, is more prepared for modern conversations than the average theist of his day. Thus, for atheists and believers alike, the history of atheism is the history of theology coming to be shaped more like what we’re used to in the modern era. Hence why even a theist historian thinks it’s super special awesome when we spot a bona fide atheist before the Enlightenment.
The study of what was going on with atheism before the mid-seventeenth century is not, and cannot be, the study of actual atheists. There are none for us to study. There may have been some, there may not, but in a period when saying “I think there is no God” led pretty directly to arrest and execution, no one said it. No one wrote it. If anyone thought it, not even private letters confirm it. Knowing that an atheist won’t fess up in documents, we historians naturally read between the lines, seeking hints of heterodoxy in the subtext of a treatise or the double meaning of a couplet. This is the only place we can realistically expect to find evidence, but it is also prone to giving us false positives. As Lucien Febvre put it in his enormously influential The Problem of Unbelief in the Sixteenth Century: the Religion of Rabelais, we moderns are bound to see that rare beast the atheist around every dark corner. We see him because we want to.
The first really real for sure definite actual atheists who, by golly, said they were atheists (OMG!) date to the mid-seventeenth century, the Libertine movement, when a push toward religious tolerance (largely in the name of stopping the Reformation wars of religion before they wiped out all homo sapiens on the European continent) meant that wealth and power were enough to armor figures like the Earl of Rochester and his circle (including the bone-chilling Charles Blount) sufficiently that they could be known to be atheists and survive so long as they denied it in public. This trend strengthened in the Enlightenment. I often compare late seventeenth- and eighteenth-century atheism to late nineteenth- and early twentieth-century homosexuality: there were circles in which one could let it be an open secret that one was an [atheist/homosexual] and it would be okay so long as one didn’t ruffle too many feathers or say anything in public or in front of civic authorities. One was always at risk of prosecution, and if one wanted to be safe and respected one kept it carefully hidden (as Diderot hid his atheist works), but there was enough sympathy within the apparatus of power that one could write of one’s [atheism/homosexuality] in private letters, and even hint at it in public works, and more often than not be safe. The pre-seventeenth-century atheist enjoyed no such safety, so not even in Renaissance private correspondence (where talk of homosexuality is quite commonplace) do we see even the most timid hand raised when the historian calls back: “Is anybody there an atheist? Anybody? Machiavelli?”
Why is Machiavelli our favorite candidate? Many reasons. First, he is in other ways so very modern. Having spotted someone who thinks about history as we do, and thinks about ethics as we do, and definitely, provably thinks in a very much more modern way than others of his century, we naturally suspect him of other modern twists, including atheism.
Second, he was called an atheist by so many people for so long. The mystique of vague, beard-stroking villainy invoked by the term “Machiavellian” (Note: Machiavelli did not have a beard) falls nicely into the pre-modern logic chain: (1) Fear of Hell drives men to good behavior. (2) Machiavelli advocates sinful behavior including lies, betrayal, murder and reign by terror. (3) Machiavelli does not fear Hell. (4) Machiavelli was an atheist.
But there are more focused reasons than that. If we return to Febvre’s warning that we are prone to spot atheists in every shadow, Febvre argues that, instead of seeking the rare beast of our desiring, we should instead confine ourselves to searching for a habitat capable of supporting him; only then can we safely say that we have found him, not his shadow. By “habitat” Febvre means the apparatus of other ideas related to atheism which make atheism easier and more likely.
Imagine that you are a biologist studying a particular fungus. This fungus is hard to find, but often grows around the roots of a particular tree species, with which it has an unexplained but well-documented symbiosis. You thus survey mainly regions where this tree is common. And if you hope to trace your fungus back to before material records survive, you might trace the history of that tree species, through fossils or early human artifacts made of its wood, and conclude that, while you can’t be sure the fungus was there too, the odds are certainly better than the odds of it having been in places where its tree friend was unknown. You have not provably found your fungus, but what you have is certainly enough to talk about, and enough to get people excited if your fungus is a truffle and may yield millions in delicious profit if your information leads to improved cultivation.
Now, for the truffle substitute the elusive pre-1650 atheist, and for the tree substitute the ancient Greek theory that matter is made of atoms. The two are unrelated, and the atomic theory does not attack theism in any way, but it is certainly easier for atheism to flourish when “How was the world made if God didn’t do it?” can be answered with “Atoms interacting chaotically in the void clumped together to form substances… bla bla… planets… bla bla… natural selection… bla bla… people etc.” instead of “I don’t know.”
“I don’t know,” is the centerpiece here. Medieval and Renaissance Europe had perfectly respectable answers to all scientific and sociological questions, they just all depended on God all the time. Take gravity, for example. Celestial bodies are moved by angels. As for why some earthly objects fall and others rise, morally inferior objects fall down toward Satan and morally superior ones rise up toward God, sorting themselves out into natural layers like oil and water. Stones sink in water because Water is superior to Earth, hot air rises because Fire is superior to Air, and virtuous men go to Heaven because good souls are light and wicked souls are heavy with sins which make them fall to the circle of Hell corresponding to the weight of their sins: nine circles separated out in layers, again like oil and water. God established the first societies, handed down the first laws, created the first languages, and directed the rise and fall of empires to communicate His Will. If one wanted to be an atheist in the Middle Ages one had to throw away 90% of all science and social theory, and when asked “Why do rocks sink?” or “How do planets move?” or “Where did the world come from?” one had to answer, “I have no idea.” Turning one’s back on social answers in that way is very difficult, and is part of why the study of atheism is so closely tied to the study of philosophical skepticism—only very recently have atheists had the leisure of both denying God and still having a functional model of the universe. Early atheists had to be, largely, skeptics. They also had to embrace a not-particularly-functional partial worldview which made the rest of the world (which had a much more complete one) think they were completely crazy.
I thus sometimes compare Medieval atheists with modern creationists, since both are individuals willing to say, “I believe this one thing so fiercely that I will throw away all the other things to keep it, even if it makes everyone think I’m nuts.” Doing this is very hard. Doing it when other ideas are around to satisfy the gaps left by removing God from science becomes much easier.
How then do we seek the habitat capable of supporting the invisible pre-1650 atheist? We look for radical scientific theories: atomism, vacuum, heliocentrism, anything which makes Nature more self-sufficient and less dependent on divine participation. We look for related theological challenges: attacks on the immortality of the soul, on miraculous intervention, on Providence, on angelology, anything which diminishes how often God is part of the answer to some basic question. We look for who is reading ancient texts which offer alternate explanations to Christian theological ones: Epicurus, Lucretius, Plato, Pythagorean cult writings, Cicero’s skeptical dialogs, Seneca. Who is reading all this? Machiavelli.
In the pre-modern world, a firestorm of accusations of atheism and wickedness awaited anyone who raised a powerful and persuasive alternate answer to some question whose traditional answer depended on God. This firestorm fell even if the author in question never made any atheist arguments, which, generally, they didn’t. It happened often, and fiercely.
Thomas Hobbes awoke one such firestorm when his Leviathan suggested that savage man, living in a state of terror and war in his caves and trees, might through reason and self-interest alone come together and develop society and government. Until that time, Europe had no explanation for how government came to be other than that God instituted it; no explanation for kings other than that God raised them to glory; no explanation for what glue should hold men together, loyal to the law, other than fear of divine punishment. Hobbes’ alternative does not say “There is no God,” but it says, “Government and society arose without God’s participation,” a political theory which an atheist and a theist might equally use. It gives the atheist an answer, and thereby so terrified England that she passed law after law against “atheism” specifically and personally targeting Hobbes and banning him from publishing in genre after genre, until he spent his final years producing bad translations of Homer and filling them with not-so-subtle Hobbesian political notions one can spot between the lines.
Machiavelli awoke such a firestorm by creating an ethics which works without God. Utilitarianism depends entirely on evaluating the earthly consequences of an act, and can be used as a functional system for decision-making whether or not there exists any external divine force or absolute code of Truth. He also painted a world of politics in which the actions he recommends are the same ones one might take if there were no God watching. In order for people to be virtuous they must first be alive—doesn’t that sound like the sentiment of someone who isn’t thinking about Heaven? It is justified and necessary to kill and lie in order to protect the stability of the state and the lives of the people—doesn’t that sound like there isn’t a separate Judgment waiting? The man who will do so much—even serve the Medici who tortured him—in order to guard and protect Earthly Florence seems to have an Earthly mistress, and not to be thinking of a Higher One. He certainly talks like an atheist, and he certainly created the first system of politics and ethics which an atheist could coherently employ.
In addition to all this, there is what we can glean about religious attitudes from Machiavelli’s personal sentiments and behavior. We know that he was a military commander, and fought, and killed people. We know that he was what I think of as “averagely promiscuous” for a Renaissance man based on my experience of letters and autobiographies, which is to say that (while married) he had both male and female lovers, and wrote comfortably and playfully about friends doing the same. We know friends wrote to him for advice about their love-affairs, which he freely gave, though warning against getting too caught up in them. We know he helped his family in a push for a profitable priestly position for his brother, and was thus involved in minor acts of simony. We know he owned many pagan classics and loved to read them, including a fascinating little volume in his own handwriting (now at the Vatican) which contains his complete transcriptions of two texts, first Lucretius’ De Rerum Natura (On the Nature of Things, containing antiquity’s best account of god-free atomistic physics and denial of the soul, afterlife, Providence and prayer), and second Terence’s Eunuch (containing one of the most uncomfortable scenes in all of ancient comedy, in which the young hero boasts triumphantly about having just committed rape). Machiavelli himself wrote the infamous comedy The Mandrake, which does not contain rape, but in which the twist is that in the end all the deception and adultery goes just dandy, no comeuppance is had, and everyone carries on committing deception and adultery and lives happily ever after, including those being deceived. We know he had a sense of humor, and we know he often directed it against the antics of priests and monks.
I will include one sample of this edge of him, taken from a letter from late in his life, when he was sent on behalf of the Florentine wool guild to recruit a preacher for Lent (an extremely high-profile public performance, rather like picking who will play at the Super Bowl half-time show). Machiavelli wrote to his high-ranking political friend Guicciardini from Carpi, May 17th, 1521. (Note the playful way he juxtaposes the mandatory obsequious Renaissance opening address with the base setting of the second sentence.)
Magnificent one, my most respected superior. I was sitting on the toilet when your messenger arrived, and just at that moment I was mulling over the absurdities of this world; I was completely absorbed in imagining my style of preacher for Florence: he should be just what would please me, because I am going to be as pigheaded about this idea as I am about my other ideas. And because never did I disappoint that republic whenever I was able to help her out – if not with deeds, then with words; if not with words, then with signs – I have no intention of disappointing her now. In truth, I know that I am at variance with the ideas of her citizens, as I am in many other matters. They would like a preacher who would teach them the way to paradise, and I should like to find one who would teach them the way to go to the Devil. Furthermore, they would like their man to be prudent, honest and genuine, and I should like to find one who would be madder than Ponzo (who at first followed Savonarola, then switched), wilier than Fra Girolamo (Savonarola), and more hypocritical than Frater Alberto (either a Boccaccio character or someone whom Alexander VI sent to Florence and who recommended summoning Savonarola to Rome so they could seize him under false pretenses), because I think it would be a fine thing – something worthy of the goodness of these times – should everything we have experienced in many friars be experienced in one of them. For I believe that the following would be the true way to go to Paradise: learn the way to Hell in order to steer clear of it. Moreover, since I am aware how much belief there is in an evil man who hides under the cloak of religion, I can readily conjure up how much belief there would be in a good man who walks in truth, and not in pretense, tramping through the muddy footprints of Saint Francis.
So, since my imaginative creation strikes me as a good one, I intend to choose Rovaio (Giovanni Gualberto, “the north wind” or “the hangman”), and I think if he is like his brothers and sisters he will be just the right man. (Translation from Machiavelli and His Friends: Their Personal Correspondence, James B. Atkinson and David Sices, eds. (DeKalb: Northern Illinois University Press, 1996), p. 336.)
Later in the letter Machiavelli says that he is trying to come up with ways to actively stir up trouble among the monks he’s staying with just to entertain himself. This sparks a hilarious sequence in which Guicciardini starts sending Machiavelli letters with increasing frequency, and stuffing them with random papers to make the packages fat, to get the monks to think that some important political thing is going on. At one point a letter arrives saying that Guicciardini instructed the messenger to jog the last quarter mile so he would be sweaty and out-of-breath when he arrived, and Machiavelli describes with glee the increasing hubbub and attention he receives in the monastery as people become convinced that something of European import must be stirring. Unfortunately a later letter hints that Machiavelli thinks they are on to the prank, and the correspondence ends there.
You now have pretty much as much evidence as anyone does about Machiavelli’s religious beliefs. Smells like an atheist, doesn’t he? His manifest unorthodoxy, the unique modernity of his ethics and political attitudes, and his playful anticlericalism, not to mention his charisma as an historical figure, inevitably tempt us into wondering whether we have found here a beautiful specimen of the rare beast we seek. But until we develop a time-traveling telepathy ray to let us read the thoughts of the dead, we must remain very wary. Is Machiavelli religiously unorthodox? Absolutely. Does he deny the existence of the divine? Perhaps, perhaps not. 1520 is very early, and there are many genres of heterodoxy besides denial of God which we may be smelling here. Thinking forward two hundred years, Enlightenment deism with its Clockmaker God denies divine intervention in Nature, removes the Hand of God from politics and lessens theology’s role in ethics without removing God. If Machiavelli is an early deist, rather than an early atheist, that is certainly enough to fit comfortably with his model of politics without God as a central factor, his ethics which segregates Earthly activities and consequences from broader divine concerns, and his interest in Lucretius and pagan scientific models for how Nature can function without constant divine maintenance. If Machiavelli thinks these monks are corrupt and hypocritical, so would Voltaire, Rousseau, and even Martin Luther, without any of them being atheists. Radicals yes, atheists no. We may, in fact, ascribe any number of heterodoxies to Machiavelli, and as we review the history of writings about him we in a sense review the history of what radical religious veins we are most worried about, since whatever is most scary tends to be ascribed to him in any given decade.
These days it is often atheism, nihilism, skepticism, rarely deism, since we are at present as a society very comfortable with the Clockmaker model and associate it more with the bright and kind Enlightenment than with he-who-advocates-fear-over-love.
Is Machiavelli an atheist? We have no idea, but by looking at why we want him to be, or don’t want him to be, or think he was, or think he wasn’t, and why new historians keep trying to answer this literally unanswerable question, we can watch the evolution of our own societal anxieties about the origins of unbelief, and how we understand how we got to this modern situation in which theism must stand constantly prepared to face its thousand enemies and is not (like Baldur) so secure in the presumption that no one will aim for the heart that it doesn’t realize it might have to dodge. This is a slight exaggeration, as Medieval Christianity did prepare itself for onslaughts of atheism, and we have numerous practice debates written by theologians showing how they would argue with imaginary atheists since they had no real ones about to spar with. (Alan Kors in his meticulous history Atheism in France has argued that these practice debates against pretend atheists were actually critical in introducing atheist arguments to broad audiences and thus themselves responsible for propagating atheism, even though they were written by theists for theists in a world populated probably only by theists.) But it is not much of an exaggeration, since such preparation was much more an academic exercise than real sharpening of mental blades. Since Machiavelli is the first of the great, famous possible-atheists—before Hobbes, before Spinoza, before Bayle, and before the real beast Rochester—Machiavelli is where we turn to test our anxieties about how our world came to be so secularized.
In the small talk phase of a party, I often answer “What do you do?” with “I study the history of atheism.” The response usually takes the general form of, “Tell me more!” but as discussion unfolds I often feel one of two undercurrents shaping my new acquaintance’s replies: either “I’m an atheist and, since I presume you’ll agree with me, I now want to vent at you about how much I hate organized religion and my parents,” or “I’m a theist but pride myself on being rational about it, and I’m scared that if I tell you I’m a believer I’ll sound like the kind of religious nut that gives theism a bad name.” I sympathize with the anxieties behind both these reactions, but both sadden me. They are symptoms of the debate done badly: an atheist motivated more by rebellion than by Reason, a theist shamed into buying into rhetoric in the worst sense. They are what happens when people grow up surrounded by others who care more about propagating their own beliefs than about helping young people meet and explore great questions for themselves (see comment thread). I love this debate. I love all of the people on all the sides. I love the passion, and earnestness, and urgency of writings on atheism, by both sides. It is the essence of the examined life and the exercise of Human Reason at its most intense. I love everyone involved: Plato, Aquinas, Ockham, Ficino, Sade, Nietzsche. I love when a student comes to my office hours and asks me directly, “I want you to be a Socratic gadfly for me and help me test my position,” whichever position it is. I do it. I love it. 
When I wonder whether Machiavelli was an atheist, it’s not because I want to know, but because I want to talk to him about it, at length, and we would stay up all night, and eat all the cheese and olives, and drink all the wine, and Voltaire would come, and Hobbes, and Locke, and Rochester and Rousseau would get plastered and piss themselves, and Diderot would help me mop it up while we talked about Leibniz and the imperfection of Creation, and Machiavelli would keep pace with us even though most of the ideas in question would be two hundred years younger than him. They would be new to him, but he would understand them easily and join comfortably in the debate. He should be there. There isn’t anybody else we know of from Machiavelli’s century who really should be there in the imaginary salon where we revisit the Enlightenment debates that made this modern era secular the way it is. Just Machiavelli. That’s why we can’t stop asking.
(Here ends my Machiavelli series. I hope you have enjoyed it, and thank you for being patient. Also, I have now added a substantial discussion of atheism in the classical world in the comment thread on this post, for those interested. You can also read my entries on remembering the Borgias, and the Borgias in TV dramas.)
If you’re interested in reading more about the history of atheism, skepticism, heterodoxy, deism and freethought, I recommend these sources:
Allen, Don Cameron. Doubt’s Boundless Sea: Skepticism and Faith in the Renaissance. Baltimore: Johns Hopkins University Press, 1964.
Hunter, Michael, and David Wootton, eds. Atheism from the Reformation to the Enlightenment. Oxford: Clarendon Press, 1992.
Kors, Alan Charles. Atheism in France, 1650-1729. Vol. 1. Princeton: Princeton University Press, 1990. (The long-awaited second volume is forthcoming.)
Popkin, R. H. The History of Scepticism from Savonarola to Bayle. Oxford: Oxford University Press, 2003. (Earlier editions of the book have titles of the form “History of Scepticism from X-other-dude to Y-other-other-dude.” All editions are good, but the most recent is the most comprehensive.)
Betts, C. J. Early Deism in France: From the So-Called ‘déistes’ of Lyon (1564) to Voltaire’s ‘Lettres philosophiques’ (1734). The Hague: Martinus Nijhoff Publishers, 1984.
Buckley, Michael J. At the Origins of Modern Atheism. New Haven: Yale University Press, 1987.
Febvre, Lucien. The Problem of Unbelief in the Sixteenth Century: The Religion of Rabelais. Cambridge, MA: Harvard University Press, 1982.
Ginzburg, Carlo. The Cheese and the Worms. New York: Penguin Books, 1992.
Jacob, Margaret. The Radical Enlightenment: Pantheists, Freemasons and Republicans. London and Boston: Allen & Unwin, 1981.
Kristeller, P. O. “The Myth of Renaissance Atheism and the French Tradition of Free Thought.” Journal of the History of Philosophy 6 (1968), pp. 233-243.
Lemay, J. A. Leo, ed. Deism, Masonry and the Enlightenment. Newark: University of Delaware Press, 1987.
Wagar, W. Warren, ed. The Secular Mind: Transformations of Faith in Modern Europe. New York: Holmes & Meier Publishers, Inc., 1982.
Wilson, Catherine. Epicureanism at the Origins of Modernity. Oxford: Oxford University Press, 2008.
Wootton, David. “Lucien Febvre and the Problem of Unbelief in the Early Modern Period.” The Journal of Modern History 60, no. 4 (Dec. 1988), pp. 695-730.