The Ambulatory Sesquipedalian

Typography and Textbook Selection

Effective typography promotes content, presenting it without drawing attention to the presentation. For it is the content that justifies the product; a monospaced manuscript is more valuable than an immaculately presented lorem ipsum. The typography should catch the attention but not hold it, passing it subconsciously to the content. And once the reader attends to the content, the typographer must ease his stay, calling attention neither to its failures nor its extravagances. While typography can be beautiful in its own right, the virtue of typography is selflessness, and thus in its selflessness should we seek its beauty.

This makes evaluation of typography remarkably difficult. If you look at a book and think first, what beautiful typography!, it is likely that the typography is poor, unwilling to yield the reader’s attention to the content. Moreover, refusing to look past presentation to content subjects the presentation to a scrutiny it was never designed to bear: the best typography is such because of the attention it draws away from itself to the content, and can look merely plain to a reviewer who refuses to follow. On the other hand, typographic flaws often only appear when one tries to attend to the content.

I think of these matters on account of the state of textbooks. There are, to be sure, well-designed textbooks, venerable texts of sciences little changing and modern works of mathematicians who have not forgotten elegance. But many others from my experience, the freshman’s texts of calculus or economics, and even more so the grade-schooler’s text of mathematics, display none of this restraint. They have lists of features on the back cover, and pictures and sidebars and boxes and colours; features that catch the eye in passing and pull it from elsewhere, and content outside the main text flow that inhibits a natural flow of reading. How are these monstrosities selected? Who can think this effective pedagogy?

And I suspect the answer is that those who chose these textbooks never tried to read them. They may have opened the covers, perhaps glanced at a few paragraphs, but never sat down to read, to lose themselves within the book. And if given two textbooks, one with sidebars on famous mathematicians and one without, does not the former sound better? Until reading it and finding no way to read these asides without pausing from and forgetting the main argument. Removing the distractions would improve comprehension of the main content, and as the purpose of the textbook is, above all else, to teach that primary content, dropping extraneous information thus improves the work as a whole. Likewise, vivid headings and icons advertising Real-world Applications and the like appeal to the hasty review, demonstrating features to those who drop in with no intent to stay or to learn. But the reader does not need a piece of clip-art to tell him that he is now in an application section, for he is reading the text itself! These are at best wasted space, and at worst an active distraction, pulling the eye from elsewhere on the page (and likely clearing short-term memory in the process).

The judge of typography must be either a reader or a typographer; a reader can evaluate the typography of what he reads from his own experience, and a typographer knows to set aside superficial impressions, to look for coherent flow, and to despise the distractions that attract the untrained eye. But I see little evidence of either approach to modern textbook selection, and, as always, we get what we deserve: an industry furnishing textbooks to those who select against decent typography will inevitably produce badly typeset books.

Regarding Due Process

Many that live deserve death. And some that die deserve life. Can you give it to them?


Ammianus Marcellinus relates an anecdote of the Emperor Julian which illustrates the enforcement of this principle in the Roman law. Numerius, the governor of Narbonensis, was on trial before the Emperor, and, contrary to the usage in criminal cases, the trial was public. Numerius contented himself with denying his guilt, and there was not sufficient proof against him. His adversary, Delphidius, a passionate man, seeing that the failure of the accusation was inevitable, could not restrain himself, and exclaimed, Oh, illustrious Cæsar! if it is sufficient to deny, what hereafter will become of the guilty? to which Julian replied, If it suffices to accuse, what will become of the innocent? Rerum Gestarum, L. XVIII, c. 1.

Coffin v. United States

Due process does not exist to help the guilty; they deserve a punishment fit to their crime, and do not become less deserving because the case was mishandled. Nor does due process exist to help the obviously innocent; a system in which they need fear the legal system has problems beyond disrespect for due process. Rather, due process exists to help the apparently guilty.

The American media and people have a history of opposing due process when it seems inconvenient, griping about letting people off on technicalities and complaining about the verdicts in the Simpson and Zimmerman trials, among many others. But in making these arguments, they forget about the third group, the innocent who seem guilty. The people who were in the wrong place at the wrong time, or who happen to be among the few people whose DNA matches that of a true criminal, or who are misidentified by an eyewitness struggling to differentiate strangers.

We require proof beyond reasonable doubt because the preponderance of the evidence points to many people, and not all are guilty.1 We prohibit double jeopardy because retrying those we think are guilty until they are convicted would undercut the requirement of reasonable doubt. We prohibit forced testimony because given sufficient pressure, both guilty and innocent confess. We have requirements of procedural due process to allow the innocent to explain and challenge the evidence that falsely points to their guilt.

We could change our legal system to avoid what we see as unjust acquittals; require only the preponderance of evidence to show guilt, admit confessions obtained before telling prisoners their rights, allow double jeopardy for prosecutorial incompetence. But all these changes affect not only the guilty, but also the innocent who seem guilty. We declare not guilty the probably guilty against whom a certain, lawful case cannot be made not because the guilty among them do not deserve to see the consequences of law, but because some among them are innocent. Perhaps this makes me soft on crime—but far sooner would I be soft toward criminals than harsh toward innocents!

  1. Thus remember that a juror may, and must, say not guilty while thinking the defendant guilty, if that thought admits reasonable doubt. This is, I think, the point most often neglected in coverage of acquittals–proof beyond reasonable doubt is stronger even than clear and convincing evidence. When complaining that a jury let off someone you are convinced is guilty, remember that the jury may well be convinced of his guilt as well.

The Fall (and Rise) of Privacy

Googling “the end of privacy” (with the quotation marks) produces 809,000 results, plus a log of a search that I do not entirely trust Google to delete. Nearly every result (among a limited sample) worries about the implications of technology, from Facebook’s graph search to the constant threat of recording from Google Glass. Maintaining dual lives becomes difficult: search engines and advertising cookies can track your browsing habits even where you do not log in, and social networks often make it difficult to separate what is visible to various audiences.

First, what is privacy? The term derives from separation, and most dictionaries will define it as the ability to be free from observation or attention. But that definition is inadequate, for by it privacy remains easy: abstain. Remain apart from society, from technology. The internet speeds communication, but what is never disclosed is never communicated. No, a demand for privacy is not a demand for secrecy but for control, the ability to selectively disclose information without fear of it escaping its intended sphere—the ability to disclose on Facebook a lifestyle that you do not wish prospective employers to see, or the ability to record your thoughts without fear of their being caught in a government dragnet. Perfect privacy is perfect control over the disclosure of information, the ability to reveal it to the precise subset of the world’s population you desire to see it.

While the end of privacy is much discussed, few seem to remember that it also had a beginning. The vast majority of humans lived nearly their entire lives within an area of a few miles, usually in a small, discrete community. In the medieval village everyone knew everyone else, and word of mouth could spread as quickly across a community as Twitter can across today’s geographically disparate communities. In such a community no public action is private (as any observer has direct communication with all other community members), and the privacy of group-secret action depends on the discretion of those involved. Privacy as we value it cannot exist.

While there have always been travelers and the like who were largely exempt from the privacy constraints of the enclosed community, the first factor to change this situation for large groups of people was urbanization. Roman Londinium held 50,000 people within less than a square mile of land; more people than you would meet in a lifetime within ten minutes’ walk. This breaks the universal pairwise communication possible in the village, and it becomes much easier to live a life, even a fairly public one, hidden from some part of one’s daily acquaintances. But urbanization progressed slowly, not accounting for half the world’s population until this century.

The true rise of privacy is tied to the rise of the automobile: while the city increases privacy by moving more people within walking distance, the automobile increases privacy by expanding the range of daily life. Convenient travel created the possibility of compartmentalization: your neighbors need not know your coworkers, nor your coworkers your leisure associates. And with this separation came the possibility of the double life: stay out of the newspaper and the police blotter and your employer need never know what you do on weekends. And this, it seems to me, is the standard of privacy the internet fails.

Now the internet does present some new challenges, particularly with respect to persistence and lack of context. What sets the new age of low privacy apart from the old is the ability to collect partial information about total strangers, and in particular what governments are likely to attempt to do with such information, and these issues bear more thought. But it is worth asking, I think, whether the rise of privacy was a good thing in the first place. Privacy is only important if you have something to hide, and at least in general one should not.1 I keep access to my Facebook information somewhat restricted (I think), but I worry little: the story it tells is not one I see a need to hide, whether from strangers, schools, or employers. The worries over privacy largely come back to integrity or authenticity: if you are proud of your life, show it. If you are not, change it.

  1. With a major exception for information such as authentication tokens (passwords, credit card numbers, etc.). But these do not factor into a privacy debate, as they should never be even partially public.

Are the Liberal Arts Worth Saving?

Persons attempting to find a motive in this narrative will be prosecuted; persons attempting to find a moral in it will be banished; persons attempting to find a plot in it will be shot.

Mark Twain, preface to Adventures of Huckleberry Finn

The Common Core State Standards Initiative, gaining momentum as a national curriculum standard, requires that a substantial proportion (25-70% by grade level) of assigned reading be informational texts; recommendations tend toward the dry, politically motivated, and data/conclusion focused. This has produced no shortage of outrage among the defenders of the modern liberal arts curriculum, who argue the importance of assigned fiction reading to education. But their fight is vain: the liberal arts are long dead in American public education, and they defend an impostor, a vapid curriculum appealing to the greats of fiction to hide its abandonment of the greats of thought.

The liberal arts, in their medieval formalization, are grammar, logic, rhetoric, arithmetic, geometry, music, and astronomy. It should, I think, be obvious that the reading of fiction1 is not essential to any of these ends, and only noticeably helpful toward one (grammar, if supplemented by formal study). Indeed, the liberal arts long predate the novel;2 a fact which should give pause to those who think an assault on novels to be an assault on the liberal arts.

The Trivium (grammar, logic, and rhetoric) aims only to teach thinking and communication: grammar aids comprehension and is the foundation of effective presentation, logic aids clarity and accuracy of thought, while rhetoric polishes the presentation. The Trivium thus comprises the tools requisite to participate in the great conversation—after grammar one can listen; after logic one can analyze and expand on what one hears; after rhetoric one can contribute. The Quadrivium, by contrast, is the start of applied studies, the starting point of the student’s entry into that conversation. What, then, are we to make of literature, which, though I find it nowhere among the seven, claims consideration not merely as one of the liberal arts, but as the essence of them all? I see no reason for its inclusion in any curriculum, let alone this hallowed one.

Some say that the study of literature is necessary to encourage a love of reading, but this disagrees with all my experience. First, I am unconvinced of the essential causal relation supposed. For one who loves reading reads widely, but does it follow that one who reads widely loves reading? Surely not, if he reads for the sake of his report card. A reader need not be told to read; the justification of assigned reading must be in its attraction to the non-reader. So what of the non-reader? Are there not some who abstain only from a lack of appreciation of reading’s virtues, and who need only mandatory experience to amend their opinion? Perhaps. But to assign a book as schoolwork puts reading in the worst light, forcing an association of reading with tedium and coercion. For every child brought to a love of reading by assignments, I suspect that two, who could have loved reading in a different light, force themselves to hate it for its associations. Even I, an avid reader from a young age, found my literature classes distasteful; among my favorite works of fiction you will find not one that I first met under duress.

Which brings me to a vaguely relevant aside: it is too often forgotten that English literature, as I have seen it taught, is vastly removed from reading itself. If reading assignments were of the form Read a novel a week throughout the term, I might approximate sympathy. But the schools do not content themselves with forcing students to read, they force them to analyze, to find imagery and symbolism and patent nonsense. If a novel is best suited to diversion, to subject it to such drudgery is to mock its purpose and talent. And if it serves to enlighten, do not analyze it in the intellectual dark of the literature class, but hold its purported lessons up to the light of the subject to which they belong. For Literature is a discipline without a subject within the scheme of intellectual inquiry; whenever an author aspires beyond diversion to insight he puts the judgment of his work beyond the competence of the teacher of English.

Other arguments for literature abound, and I shall not take the time to dismiss all in detail. Reading aids grammar and writing—but that means only that the curriculum should contain reading, not its hypostatisation into its own subject (and materials for that purpose must be selected with considerable care—far more than I have seen in my literature classes). I put the burden of proof on the defender of the subject: why should I think disastrous the removal of a subject whose introduction coincided with the collapse of American education?

But while I reject the claims of the present curricula to represent even a shadow of the true liberal arts, I am no proponent of the Common Core, for it distracts from the purpose of education in its own way. The object of education is the refinement of thought and communication; not the teaching of facts, or procedures, or opinions. I thus object to the Core’s specification of informational texts, and the examples I see confirm my fears. As an economics major, I hold that tables of economic data, and even statistical summaries, are useless to the informed and dangerous to the ignorant (post hoc…). A policy brief is dangerous for opposite reasons; it gives not data nor argument but conclusion, to be believed because this is a scientific age and it was written by experts.3 Information can be acquired as needed; in schooling intended for all, teach what all must know: to read, to think, to write, and to do so in that order. American education is in disarray, this I cannot doubt; a defense of the status quo demonstrates ignorance or negligence. But the solution is not to press madly on, thinking that where little changes brought the opposite of the intended effect great changes will succeed—it is to remember what was forgotten.

  1. From fiction I exclude the philosophical dialogues and extended thought-experiments, focusing on works read as diversion. And even among what might conceivably be read for pleasure, I do not deny the intellectual benefits of Dostoevsky or the like.

  2. Although they do not predate fiction, and Homer was central to ancient Greek education. But Homer filled a role far different from that of the novel in modern education; a better comparison would probably be a subset of the roles of the Bible in 18th-century Protestant education: a common source of reference and standard of values. It only forms precedent for minute study of a small core, common to all; especially in America, cultural heterogeneity precludes such a text. There is not, and can never be, the Great American novel.

  3. I point to this cult of expertise as a primary cause (alongside the abandonment of logic in primary/secondary education) of the disintegration of opinion in our culture. For in this scientific age, we are taught that the validation of opinion is beyond our competence, to be left to the experts. So I find my experts, and my opponent finds his; and they disagree, and we disagree, and each says that his opponent disregards the experts, and is therefore ignorant. Engagement and reconciliation of opinions occurs only when each man judges the arguments for himself. But this is a subject for another time.

Regarding Normality


  1. According to a square or rule; perpendicular; forming a right angle.
  2. According to a rule or principle.
  3. Relating to rudiments or elements; teaching rudiments or first principles; as normal schools in France.

Noah Webster's 1828 dictionary


  1. conforming to a standard; usual, typical, or expected

Oxford Online Dictionary

Normal is not the norm,
It’s just a uniform…

Delain, We Are the Others

While listening to Delain, I was struck by the etymological insight of those lyrics: normal and norm are cognate, both derived from the Latin norma, carpenter's square, rule, pattern. Etymologically, a norm is a rule while normal describes what conforms to that rule, as in Webster’s 1828 definition (interestingly, I can find no derivative of norma in Samuel Johnson’s dictionary). Modern dictionary definitions are mixed; most give mention to the prescriptive sense, but include also (sometimes as primary) some equivalent of usual or common; with only rare exception, my encounters with the word in the wild have been in the latter sense. The Oxford dictionary claims that this sense first appears in the early 19th century;1 perhaps significantly, this is close to the publication of Democracy in America, in which a prescient de Tocqueville writes of an aggressive egalitarianism, bringing down the great when the poor could be elevated no further.

So why should ordinary have a prescriptive synonym? The good, the right, should be a standard leading all on, not leading the laggard few while preaching complacence to the masses and self-consciousness to the great. Normal can describe the right, or the common, but never both.

  1. Sadly, the nearest OED is two miles away, and I am feeling lazy.

A call to stay home

And they said to him, Behold, you have become old, and your sons have not walked in your ways. Now appoint a king to us, to judge us, like all the nations. And the thing was evil in Samuel’s eyes when they said, Give to us a king to judge us. And Samuel prayed to Jehovah. And Jehovah said to Samuel, Listen to the voice of the people, to all that they say to you. For they have not rejected you, but they have rejected Me from reigning over them. According to all the works that they have done from the day I brought them up out of Egypt, even to this day, when they forsook Me and served other gods, so they are also doing to you.

1 Samuel 8:5-8

Put not your trust in princes, in a son of man, for there is no salvation in him.

Psalms 146:3

Tomorrow is election day, and tomorrow I shall make a deliberate choice, that I deem to be in the best interests of myself and this nation, and to best express my preferences with respect to the choices of the election: I shall go to work, and I shall return home, and never shall I set foot in the polls. But today, facing accusations of cowardice and dereliction of civic duty, I feel called to defend my plans, and to defend all who choose likewise on principle.

I am a voluntaryist, rejecting every initiation of force, opposing not only the regulation of voluntary transactions and the punishment of victimless crimes, but the very foundation of our government finances, coercive taxation. I thus can support no candidate with ballot access; I have fundamental disagreements even with Gary Johnson. Indeed, to me, the very act of running for president disqualifies a man: for I trust no man who claims to be Madison’s angel, who claims the wisdom and incorruptible righteousness to rule over men, to wield the awful power of the state.

Many have proceeded to tell me that even if I disagree with all candidates, I should vote for the one with whom I disagree least so as to minimize the harm done, often backing this claim with an apocalyptic depiction of the likely consequences of the wrong candidate being elected. But I reject the burden this supposes, of being caretaker of the world. It is my duty to act justly before God and neighbor, and leave worrying about tomorrow to God in his wisdom, foreknowledge, and omnipotent providence. So I ask: by not voting, whom do I wrong? What punishment do I inflict on the innocent, by not voting? Whom do I rob? How do I dishonor God?

It may be said that by not voting for the least among evils, I fail to defend people from the worse. But suppose I do vote for the lesser evil, and he wins, and my vote is the margin: what shall I say to those he oppresses? Shall I tell them that I judged their suffering an acceptable loss? It is not my choice to make. To be sure, one can prioritize his efforts to help, even if it means that some he could have helped (with opportunity cost) are not helped.1 But that does not allow me to become a participant in particular oppression so that I might reduce the sum of oppression.2

And in the end, a politician is but a man, fallen, fallible. Obama calls me to hope, and I hope—but not in him; my hope, in all that matters, is in the Lord. While politicians are known for little so much as lying, he is Truth. While even the best-meaning politicians are thwarted by their ignorance, he is omniscient. While even the President has but little power in greater contexts, he is omnipotent. To put trust in a candidate for political office before trust in God is idolatry. And if we trust God first, what need have we of a second?

Tomorrow, I shall live my life. I shall strive to love and glorify God and love and serve my neighbor. I shall pray Thy will be done, and I shall trust that whatever trials may arise from the result of this election are within God’s holy plan. Why should I vote? Why should I endorse someone I find to openly oppress the innocent? Why should I endorse a system of greed, corruption, of plunder legalized and production punished? Either way, the sun also rises. Either way, the Lord’s will is done. And either way, the winner will break his promises, make mistakes, disappoint those who trusted him.

  1. I here assume a duty to maximal charity; weakening that duty negates the argument in favor of a duty to vote before it interferes with my response.

  2. Not least because this consequentialist argument runs roughshod over a variety of incommensurabilities.

I am alive!

0.1 A bit of history

Since I moved The Ambulatory Sesquipedalian off its original host, it has been hosted on a VPS running Wordpress/MySQL. I was never satisfied with that arrangement (Wordpress is not very hackable, MySQL is bloated, and I hate PHP), but its intended successor, a blog engine written in Erlang, stalled due to poor library support, and a successor using Haskell/Happstack got the worse of my senior year schedule.

My incentives changed last spring, when my webhost terminated my contract without warning or even notification.1 Not wanting to go through reinstalling Wordpress (I desire to never become familiar with SQL), I took this as incentive to revisit the Haskell replacement (hblog, for lack of imagination). I finally think it ready for deployment, although rounding out its features will take some time more. That said, I do hope to be more active henceforth than I was through the latter stage of the former instantiation.

0.2 What is missing

You may have already noticed that I am missing the archives from the previous blog; as part of the new engine I have modernized to HTML 5, and adding those archives will require considerable effort toward reformatting them. While I do plan to get to these, my primary task is still adding missing features to the engine, including comments, RSS feeds, and a non-stupid admin authentication method. In the meantime, if you notice problems or have suggestions for additions, I welcome the feedback.

0.3 Some implementation notes

Having now talked more about myself here than in the summed prior history of this blog, I would like to turn to hblog itself. It is written in Haskell using the Happstack webapp framework and the acid-state NoSQL database. While Haskell does not lend itself to CGI scripting, excellent HTML generation libraries (I use BlazeHtml) make a monolithic webapp practical: hblog sits in place of the server2, parsing the HTTP requests into function calls that return HTML, allowing complete type-safety and performance considerably above that of AMP.
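The shape of such an app can be sketched in a few lines. This is a minimal illustration rather than hblog's actual code: postPage and the /post route are invented for the example, and it assumes happstack-server's ToMessage instance for blaze Html.

```haskell
import Happstack.Server (ServerPart, Response, nullConf, simpleHTTP,
                         dir, ok, toResponse)
import Text.Blaze.Html (Html, toHtml)
import qualified Text.Blaze.Html5 as H

-- A page is an ordinary pure function returning typed HTML.
postPage :: String -> Html
postPage title = H.docTypeHtml $ do
  H.head (H.title (toHtml title))
  H.body (H.h1 (toHtml title))

-- Route requests for /post to that function; toResponse serializes
-- the Html value into an HTTP response.
post :: ServerPart Response
post = dir "post" (ok (toResponse (postPage "A hypothetical post")))

main :: IO ()
main = simpleHTTP nullConf post
```

Because the page is a typed value rather than a template string, malformed markup is a compile-time error rather than a runtime surprise.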

Use of Haskell for this project was not without its frustrations, primarily relating to the complex monad stack required by the combination of happstack, web-routes, and acid-state. Nonetheless, I appreciate the power and safety of an advanced type system and heavily higher-order language, even if it can be cryptic:

-- needs: Control.Arrow ((&&&)), Control.Monad (liftM), Data.List (group, sort)
clump :: Ord a => [a] -> [(a, Int)]
clump = liftM (head &&& length) . group . sort
-- e.g. clump "mississippi" == [('i',4),('m',1),('p',2),('s',4)]

This function takes a list of items (with duplicates) and returns a list (without duplicates) of each item and the number of its occurrences (used in generating the category and tag lists in the sidebar). Function composition (and monads and arrows!) makes this function, conceptually simple but mildly tedious to write in the structured and conventional OO3 languages of my experience, into a near triviality.

The last major component of the present infrastructure is the HTML generator. I hate writing prose in HTML; writing in a structured markup language (as opposed to unstructured markups that infer structure from the text) imposes high overhead, exacerbated in HTML by its horrific verbosity. In my last blog I used a short Lua program that produced HTML from a LaTeX-like slashcode markup, but the PEG parser was a kludge, and I would much prefer to never touch it again. I was steeling myself to write a similar generator in Haskell (I much prefer Parsec to LPeg) when I found Pandoc, a markup converter written in Haskell. I thus write this post in Markdown (with a handful of LaTeX commands to tag certain spans so that they can be styled in CSS), which Pandoc converts to HTML. My extensions are implemented as filters operating on the Pandoc AST between the Markdown reader and HTML writer; such a filter can be as simple as this one, which I use to increment header levels (so that they do not claim equality with the h1 post header):

-- bottomUp comes from Text.Pandoc.Generic and applies inc to every
-- Block in the tree; Header took two arguments (level, inlines) in
-- the pandoc of the time.
incHeadings :: Pandoc -> Pandoc
incHeadings = bottomUp inc
  where inc (Header n l) = Header (succ n) l
        inc x = x

Far easier to write a few of these functions than write my own parser! Moreover, using Pandoc gives me certain features, such as code syntax highlighting and conversion of LaTeX math to MathML, nearly for free (even if the syntax file for Haskell highlighting desperately needs help).
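The whole pipeline then composes into one function. The sketch below is against the pandoc 1.x API current when this was written (readMarkdown took a ParserState and writeHtmlString a WriterOptions; later releases changed these signatures), with renderPost an invented name and incHeadings the filter defined above:

```haskell
import Text.Pandoc

-- Markdown source in, HTML out, with the header filter applied
-- to the AST between the reader and the writer.
renderPost :: String -> String
renderPost = writeHtmlString defaultWriterOptions
           . incHeadings
           . readMarkdown defaultParserState
```

Adding a new extension is then just inserting another Pandoc -> Pandoc function into the composition.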

Edit 9 Nov 2012: Haskell highlighting helped. It was not too difficult to rewrite the XML syntax specification used by highlighting-kate; the challenge is convincing both Pandoc and hblog to build against the modified library; the optimal solution seems to be to use cabal-dev and add-source.

  1. They seem to have left the consumer market entirely; I discovered the cancellation when I could not access my account to raise a ticket regarding my server being down.

  2. More precisely, hblog sits behind nginx as a reverse proxy handling compression and caching, but the proxy is only needed to serve multiple sites from one IP address.

  3. I specify conventional because there is a strong similarity between the abstraction and encapsulation provided by a strong type system with algebraic types and type classes (as in Haskell) and that of the more common OO languages. I see little difference, for example, between Haskell and the CLOS.

Government under the Law

Most Americans, and other Westerners, claim to value a government under the law. But to what law do they refer? This law could be a law the government sets for itself, but then the statement would be meaningless: even a despotism is likely to have some standard procedures, yet no one would consider such a government to be under the law. Indeed, such a system does not strictly fit the statement, for the government would be simultaneously over and under the law, rather than purely under it. Thus, the government cannot be the source of this law.

Perhaps, then, this law is a law set by the people, yet unique to the government. But following this standard, any representative government (direct or indirect) would be under the law, for the people in those circumstances establish a law for the government, but that law may yet be highly mutable. The Athenian admirals were executed in consonance with the laws established by the Athenians, yet few would consider the Athenian ochlocracy to be government under the law. Really, this is naught but a special case of the first instance, where the government sets the law; for in a representative system, the people are the government, and thus no law alterable (or even once established and then immutable) by the people can ground a government under the law.

Thus the law under which a government under the law operates comes not from the government, nor from the people. Whence else can it come? If from some external, active source, merely considering that source of law to be the government restores the objection. Thus, the law must be eternally immutable, existing outside any human agency: the natural law. A government under the law means a government under the standards of human rights established in the essential order of the universe.

Government under the law and rule of law compared

This definition of government under the law dissatisfies some, for it describes a government whose laws accord with natural law, but in which men may still rule. Should we not seek a form of government in which laws and not men rule? No, for it is impossible. Government under the law is possible, if difficult to attain and to maintain; rule of law cannot exist. Only that possessed of agency can truly rule, and agency belongs to God and his rational creations alone; to say that anything else rules is merely an inaccurate way of saying that men rule in accord with laws. But they are under no necessary compulsion to do so. If a judge judges by whim, and not by law, what power has the law to punish him? Only another ruler, a man and not a law, can do so, and that ruler is either under the authority of another man or is a law unto himself. What legislator, legislating from inclination and contrary to the Constitution, has been struck by lightning emanating from the violated article? To ask the question is absurd, for parchment cannot act.

Government under the law is a condition of government, transient with each new law and each new judicial decision, and not a form of government. No form of government can assure adherence to natural law by elevating laws above men; we must only seek that form most likely to uphold natural law.

Thoughts on Independence Day

How many people say My country, right or wrong while today celebrating those who decided that England would not be their country because it was wrong? Today Americans celebrate the declaration that our rights are endowed by our creator, and yet celebrate a government that proclaims itself their guarantor. They celebrate the declaration that these rights are inalienable, and a government that disregards them at will. They celebrate the declaration of a duty to throw off a government that disregards those rights, and a government that declares those who stand for these rights traitors, terrorists.

Let us not be caught up in the patriotic sentiment, the self-praising history that aligns the principles of the present government with the principles of the war for independence. Let us consider this day a call for choice, between the sentiments that led the signers to proclaim independence of the English government, and to enforce that proclamation with blood, and the sentiment of unconditional obedience. Between the fear of tyranny and the fear of uncertainty, between hope and apathy.

Some will say that all is different, for we have an electoral recourse denied to this country’s founders. Some say that a representative government cannot be in the wrong, that by our participation (or non-participation) we consent to whatever happens. Of these I ask, what difference does it make whether he who denies the rights of another calls himself king or voter? And others say that while representative government can be wrong, the solution is in the polls. And I remind, ballot box, jury box, cartridge box. By all means, work within the system so long as doing so bears hope–but such hope is long past. Who can wean the masses from the idea of profiting at the expense of their neighbor? Plunder, Bastiat calls it, legalized plunder, and as the thief who sees no moral wrong in his plunder is stopped by force of law, so government, whether power lies with voter or king, is stopped by force when it ceases to see the wrong in its legalized plunder.

But let us see also the folly of the founders: the folly of assuming that any people, given the ability to delegate their plunder, to despoil their neighbor while dirtying neither hands nor conscience, could long resist. The American experiment has failed, and let us not delude ourselves that we need only return to some golden era of government. That time has passed irrevocably, passed when the plunder of the many by the many was first discovered. Never again can the people be given the coercive power of government, and expected to use it wisely. Nor, as was found even by the 18th century, can we expect better of the kings. I see but one solution: destroying that temptation of legitimized coercion, plunder by law.

Concerning the Unum Argumentum

Perhaps the most intriguing of the sundry logical proofs of God’s existence is the unum argumentum of Anselm of Canterbury, commonly known by Kant’s term, the ontological argument (although Kant applied the term to a different argument).

I shall let Anselm present his case, from his Proslogion:

But surely that than which a greater cannot be thought cannot be only in the understanding. For if it were only in the understanding, it could be thought to exist also in reality–which is greater. Therefore, if that than which a greater cannot be thought existed only in the understanding, then that than which a greater cannot be thought would be that than which a greater can be thought! But surely this conclusion is impossible. Hence, without doubt, something than which a greater cannot be thought exists both in the understanding and in reality.

Crucially, this argument does not deal with reality, as do the famous Five Ways of Thomas Aquinas, but rather with the thought. Fundamentally, Anselm is not arguing for the existence of God; he is arguing that to doubt the existence of God involves a mental contradiction.

Also important to this argument is Anselm’s definition of greater. While I shall not here attempt to analyze Anselm’s rather complex usage of the term, the important point is contained within the aforequoted passage: to exist in reality is greater than to exist only in the understanding. Thus, greatest is equivalent to greatest in all respects except existence, and existing in reality. Furthermore, that than which a greater cannot be thought is equivalent to the greatest that can be thought. Thus, we may restate Anselm’s argument as That which is thought to be greatest in all respects except existence and is thought to exist cannot be thought to be only in the understanding. Phrased this way, it becomes nearly trivial, for it is manifestly absurd to think that something defined as existing does not exist. Thus, the argument is a reductio ad absurdum: to deny the existence of God is self-contradictory.

But at the same time, so is to deny the existence of a unicorn that exists, for if one asserts that such a unicorn does not exist, clearly one is not referring to the same unicorn. Thus becomes manifest the invalidity of the argument: if we permit such a reductio, anything may be proven to exist. The problem is that thoughts themselves may be false, having no correspondence to reality. To think of a unicorn that exists is to think of a unicorn and to think that it exists; clearly, the second thought is false. Existence cannot signify the unicorn; it can only make a statement about the unicorn, which statement may be true or false. Similarly, to think of the greatest in all respects except existence that exists is to think of the greatest in all respects except existence and to think that it exists; the second statement does not signify the subject, but makes an additional claim about it which may be true or false. Anselm’s inclusion of existence as an element of goodness, however, serves to hide this invalid operation.

The Present Defined

I have been long intrigued by the fact that, in Greek, both past and future times can take the simple aspect (action completed at a time, i.e. he ran) in the aorist and future tenses, respectively, but that the present tense conveys only progressive aspect (action continuing through a time, i.e. he was running). Ultimately, the only explanation is to define the present as the widthless boundary between past and future. Thus, simple action cannot occur in the present because, as the present occupies no time, it can only encompass an action taking no time, and such an action is inconceivable (as it would then be possible to repeat that action an infinite number of times simultaneously). The past and future, on the other hand, are not merely definite, non-zero periods of time, but rays, extending indefinitely away from the present. Thus, while an action can continue to both sides of the present (progressive aspect), terminate at or before the present (perfect aspect) or start at or after the present (to which no simple Greek aspect corresponds), no action can take place in the present (simple aspect).

But this definition of the present, while philosophically sound, is unsatisfying for certain practical purposes. Thus, I would like to advance a separate definition, for the purposes of praxeology: the present is that period of time demarcated by (but not including) the last point in time at which one can receive information regarding reality in a certain place and the first point at which one’s action can influence reality at that place. The most startling aspect of this definition of the present is that it is not absolute: such a present is defined only with respect to a certain actor, and has a different extent at different places and with different technology. At the same time, I think that these attributes of the present so defined are useful for certain purposes. While it matters very much to a general in his planning room whether a comment was made five seconds ago or will be made in five seconds, it matters very little whether a given event in the battle happened even five minutes before or after the time of his action, if it would take ten minutes for news of the battle to reach him or for his orders to reach the battlefield. Historically, I think that this well explains the infamous Battle of New Orleans: from our modern perspective, it seems horrible that a battle should take place two weeks after the signing of the treaty, but if one considers the primitive communications available at the time, the two events may well be considered contemporaneous. Philosophical nuances may, in places, have very little bearing on practical reality, and time seems to be one of those instances.

Pax Americana

Entering the term Pax Americana in Google produces 433,000 results. Some use the term in derision; too many in approbation. While our politicians have not endorsed the term, many have embraced the concept of a global peace enforced by the United States. Yet let us look at the antecedent of all this, the Pax Romana. Is such a hegemony a worthy goal?

The Pax Romana is a period of roughly two hundred years, from 27 BC to AD 180, in which the Roman Empire enjoyed relative security. And for the citizens of Rome, this was certainly a period of prosperity and safety. No barbarians or wayward generals crossed the Rubicon; the emperors were, for the most part, good, at least relative to those of other times (if one omits consideration of Nero and Caligula). But the citizens of Rome were not the entirety of the world, and while many others shared the Roman Empire, few shared the Roman Peace; for them this peace was still a time of war and of oppression. This was the peace of Claudius’s legions as they conquered Britain; this was the peace of the one million civilian dead when Titus besieged Jerusalem; this was the peace of Boudica and the bloody suppression of her war for freedom: a time of peace for the rulers; a time of suffering for the ruled. But not even the rulers escaped unscathed, for they found it impossible to rule others while retaining their own virtue. For Rome this was a time of prosperity, but not the prosperity of Republican austerity: instead, this was the prosperity of bread and circuses, of decadence, of depravity. As the empire rose in might, as the Roman eagles spread across the world, the Romans forgot what it was to be Roman, forgot the ideals of citizen service, of governmental accountability, that had first built the prosperity of Rome, replacing them with spectacle, with handouts, with despotism. And thus, the Roman Empire fell, its foundations cut away by its final, superficial triumph.

So, what ought we to expect from a Pax Americana? Peace in our time, perhaps. But not a peace of virtue, of mutual desire and mutual cooperation. Rather, a peace by war, and not even by the threat of war, but by the active use of war. And since this would be a peace of force, of hegemony, let us not speak of ourselves as a global police force and of making the world safe for democracy. Justice is not to be found in violence, aside from the redress of specific wrongs. Furthermore, as our society comes to rely on war, the slave will become the master. Just as the Roman Republic could not grant its generals the powers necessary to expand the empire without risking the ascension of those generals into supreme power, consummated first in the triumvirates and ultimately in the Caesars, we cannot expect to found a world order dependent on the military and still expect to retain control of that military. Let us remember that few men in the history of the world valued their form of government more highly than did the citizens of the Roman Republic, and yet the Roman Republic became the Roman Empire. Let us not feel so secure in our Constitution as to allow forces that must in time destroy that Constitution. Peace through force we may have; but if we desire a just or a lasting peace, let us look to cooperation and not antagonism.