Dying for the telephone company

"The modern nation-state, in whatever guise, is a dangerous and unmanageable institution, presenting itself on the one hand as a bureaucratic supplier of goods and services, which is always about to, but never actually does, give its clients value for money, and on the other as a repository of sacred values, which from time to time invites one to lay down one’s life on it’s behalf… It is like being asked to die for the telephone company…. The shared public goods of the modern nation-state are not the common goods of a genuine nation-wide community and, when the nation-state masquerades as the guardian of such a common good, the outcome is bound to be either ludicrous or disastrous or both." -- Alasdair MacIntyre, Whose Justice, Which Rationality?

Rationalism in International Politics

E. H. Carr critiqued it at a crucial moment in Europe's history.


If you doubt that "Americanism" is a religion, watch the beginning of a football game. A huge religious icon (the American flag) is spread across the field. Everyone puts their hands over their hearts (similar to making the sign of the cross) and then sings a religious hymn ("The Star Spangled Banner"). The singer is surrounded by a coterie of "monks": Marines, Navy SEALs, paratroopers, etc. Then, like a great spectacle in the coliseum from pagan times, two groups of warriors do battle, interspersed with ads touting consumption (the chief sacrament of Americanism) and the mystical ecstasies that can be achieved by total devotion to one's subcult (favorite team).

Every time Nick Rowe writes a macro post...

you should contemplate it very carefully... you will always learn to think about the macroeconomy more deeply.

In Which I Knock the Bottom out of Niall Ferguson

Noah, lost at sea

Noah Smith is trying to defend empiricism in economics, but when it comes to empirical facts about the history of science... well, those we can just make up to suit our purposes! And so he writes:

"Our most spectacularly successful leaps of theoretical insight - Newton's Principia, Einstein's relativity stuff, Mendel's theory of inheritance - were all very closely guided by data. The general pattern was that some new measurement technology would be invented - telescopes, plant hybridization experiments, etc. - that would provide some new unexplained data. Then some smart theorists would come up with a new theoretical framework (paradigm?) to explain it, and the new framework would then also explain a bunch of other stuff besides, and so people would switch to the new theory."

Now, I haven't studied the history surrounding Mendel much, so I am not going to comment on it (imagine that: choosing not to write about something because one doesn't know much about it!), except to note that it is a little weird to call plant hybridization experiments a "measurement technology." But with Newton and Einstein, Smith just doesn't know what he is talking about.

First, Newton: the telescope was what spurred on the Principia?! This is a bizarre contention. Perhaps it is true that discovering that Jupiter has moons played some small part in prompting Newton's new physics: I spent a year studying the scientific revolution in graduate school, and subsequently read the top scholarly biography of Newton, but while I don't recall those moons being mentioned as important in Newton's thinking, I won't categorically deny that they might have played a part. However, Kepler's conceptual breakthrough in realizing that the planets have elliptical orbits was much more important to Newton's physics, and it had nothing to do with telescopes. Kepler did rely on improved data collected by Tycho Brahe, but that data could have been handled with epicycles, and the idea of elliptical orbits might have been arrived at without that new data: it was abandoning the idea that celestial objects must move in circles that was the crucial factor here: a new idea.

But what is perhaps even more salient in this regard is that Newton's three laws of motion are not empirically verifiable as a whole: they really are a re-conceptualization of motion, and we need to assume at least one of them to empirically verify the other two.

With Einstein, Smith is on even shakier ground, and it is noteworthy that here he does not even try to suggest what new "measurement technology" prompted Einstein's breakthrough. And as far as "new unexplained data" goes, it is usually the negative result of the Michelson-Morley experiment that empiricists indicate as the impetus for Einstein's special theory of relativity. But Einstein himself told Michael Polanyi that "The Michelson–Morley experiment had no role in the foundation of the theory... the theory of relativity was not founded to explain its outcome at all." In fact, it was Einstein's thought experiment considering what it would be like to travel alongside a beam of light that was the chief driver for developing the "relativity stuff."

So: oops! When it comes to history, scientific "empiricists" appear not to care about "the data" in the least!

PS: Since I do care about the data, I am prompting my friend Thony, who knows much more history of science than I, to correct me here if I have strayed from "the data."


One-book-itis is a malady that strikes amateurs in an academic field (e.g., history) when their reading in that field, on a particular topic, is largely restricted to one strong defense of a controversial position about that topic. The amateur simply doesn't know the field (e.g., history) well enough to realize that:

1) Of course any competent professional historian can marshal a strong case for any position he puts forward: he wouldn't put a case forward unless he could marshal strong evidence for it, and his entire professional life has been spent learning how to make the historical case for proposition X strong.

In particular, what the amateur overlooks here is that their champion for this controversial position is in a dialogue with other professional historians. And whatever view he is disputing, those others themselves put forward good cases for the view he is disputing: if they hadn't, he wouldn't even bother disputing it!

2) The professional discussion is nuanced. Say the topic is the causes of some revolution. The "old" view was that the main cause was the decadent actions of the royal family. The "new" view is that it was due to the ascendancy of a propertied class in the towns.

The amateur reads a single book, making the case for the new view, and becomes its enthusiastic proponent: "Smythe-Williams crushes the idiots who think the cause was royal decadence." But if the amateur were to attend a conference where a panel of "new-viewers" and "old-viewers" discussed the issue, he would find widespread agreement among the panelists that both sides have a good case, and that of course the discussion is simply over a matter of emphasis.

A case in point that has come up in comments on this very blog: did Rome "fall," or was there a smooth transition from "late Antiquity" to "the early Middle Ages"?

Bryan Ward-Perkins and Peter Heather have each written books arguing for the "fall" side of things, and an amateur who has read either book might declare either one to be gospel, and claim that it "demonstrated" that the gradualists have been completely wrong. By contrast, a professional reviewing the books understands that they appear as part of a dialogue, and that of course they are stressing one side of the events that they feel their predecessors have under-emphasized, and recognize the validity of claims for the other side. As O'Donnell writes in his review just linked:

"[Heather] is well aware, e.g., of the work of C.R. Whittaker on the symbiotic relations and evolution of relations back and forth across the Roman frontiers, but I suspect that the general reader of this volume will benefit little from it -- it takes the sharp scholarly eye to notice that the qualification is being made and then dropped."

The amateur lacks the "sharp scholarly eye" necessary to notice the qualifications, which essentially say, "Of course the gradualists are not nuts, and there is a lot about this transition that was, in fact, gradual, but I think they have over-emphasized that side of things, and unduly neglected the sudden transitions that occurred."

And an actual scholar of the period in question can recognize the merit in the "more of a fall" case, and still demur:

"In the end, both books are too linear in argument, too much devoted to special pleading for a single line of argument to sustain victory on a crowded field of interpreters. Heather is the better narrative history for the reader who wants to know what happened, while Ward-Perkins does a better job of situating narrative in a context of interpretative possibilities. If there is an implicit moral to each book, Ward-Perkins's is that human prosperity and happiness are fragile things and need to be worked at assiduously, while Heather's is that immigrants can be very bad for a society. The present reviewer will still be numbered amid the Reformers [gradual transition] and not the Counters [sudden fall], but of the two he finds Ward-Perkins's message more persuasive."

The Scientific Achievement of the Middle Ages

A few quotes from the work with the above title by Richard C. Dales:

"The really important thing to be noted, however, is the rapidity with which the scientists of the later thirteenth and fourteenth centuries learned to differ with Aristotle..." (quoting Lynn White).

"The striking thing about this [twelfth] century is the attitude of its scientists. These men are daring, original, inventive, skeptical of traditional authorities although sometimes overly impressed by new ones, and above all steadfastly determined to discover purely rational explanations of natural phenomena."

"Despite the fact that many excellent illuminating studies of medieval science, as well as the texts of the works themselves, have been published in easily accessible volumes during the past fifty years, it is not unusual to find even well-educated people abysmally ignorant of the subject. Unfortunately this does not inhibit them from writing authoritatively about it."

My review of The Cambridge Economic History of Modern Britain

To appear soon in History: Review of New Books.


Floud, Roderick, Jane Humphries, and Paul Johnson. The Cambridge Economic History of Modern Britain. Volume 1: 1700-1870. Cambridge: Cambridge University Press, 2014.

This work is an excellent survey of the important region and period of economic history that was Britain’s industrial revolution. It consists of fifteen essays by a variety of top scholars, each taking up a different aspect of the overall subject: nutrition, international trade, technology, ideology, agriculture, transportation, regional variations, occupations, labor markets, finance, social mobility, and political economy. With such a wealth of information on hand, a short review can only sample a few of the abundant offerings in the volume.

Rational heating

Houses used to have radiators. These were "irrational," as it was hotter near the radiator than on the other side of the room. What people wanted was uniform heat over the entire house.

Except, if they have any sense, that's not what they want. Some people will find the uniformly heated room chilly, while others find it stifling. When we had radiators and fires, one could move closer to the heat source, or further from it, and set one's own room temperature. Now we must all have a single temperature, like it or not.

My thermostat is a Presbyterian

I have said before on this blog that if we wish to ascribe thoughts about chess to a chess-playing computer, we should, for the very same reasons, ascribe thoughts about home heating to our thermostats. It is nice to see that one of the founders of the discipline of artificial intelligence agrees with me on this point:

'In 1979 McCarthy wrote an article[22] entitled "Ascribing Mental Qualities to Machines." In it he wrote, "Machines as simple as thermostats can be said to have beliefs..."'

Of course, McCarthy thinks thermostats have beliefs about home heating and Big Blue has beliefs about chess, while I think neither is true, but we agree that the evidence should lead us to decide both cases the same way. (It is like we agree on the proposition, "If Joe is guilty, then Bill is guilty too," but disagree on whether Joe is guilty.)

Lost in the Medicine Cabinet

13-digit ISBN required, without hyphen

Every time you see a message like this from a web site, a programming angel falls from the sky and is imprisoned on earth until he can get the programmer who wrote that code to stop being a lazy so-and-so. Do you realize how easy it is to strip a hyphen out of a string of text?

Programmers: accept any reasonable format, and change it for the user into the format you need!
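To make the point concrete, here is a sketch of such normalization in Python (the function name and the exact set of separators stripped are my own choices for illustration):

```python
def normalize_isbn(raw):
    """Strip hyphens, spaces, and other separators from ISBN input.

    Accepts any reasonable format the user might type and returns the
    bare digit string (with a possible 'X' check digit for ISBN-10).
    """
    return "".join(ch for ch in raw.upper() if ch.isdigit() or ch == "X")
```

A site that demands "13-digit ISBN, without hyphen" could instead call `normalize_isbn("978-0-306-40615-7")` and get `"9780306406157"`, no scolding of the user required.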

Absolute elsewhere in the stones of your mind...

I was in the waiting room at my chiropractor's office. I had a book to read, but the scene playing out in front of me caught my attention:

There was a girl of about seven sitting directly across from me, with a book in her lap. Her mother sat at right angles to both of us, phone in hand, studying the screen and typing. The girl asked, "Mommy, can I read to you?"

Her mother grunted something that might be interpreted as a yes. The girl began reading and the mother's face remained fixed on her phone, her fingers still typing. Every 30 seconds or so, the mother would look up, and give her daughter about a one-second glance. At one point, she corrected the girl's pronunciation of "Himalaya."

The girl mentioned yetis, and read that they were "apple-like creatures." I was puzzled by this for a moment, and then realized that she had read "ape-like creatures," and did not know that the dash meant that she should break her pronunciation at that point. Her mother took no notice of the "apple-like creatures" roaming the Himalayas, and continued typing.

I am sure this mom is frantically enrolling her daughter in piano lessons and ballet classes and whatever else she thinks the girl needs to "get ahead." But the thing her daughter needs the very most, her mother's assurance that she is of some importance in the world, this mother cannot give her. Because she was doing something more important, like rescheduling the HR meeting that had been set for 10 AM Monday, and seeing if everyone could make it at 1 PM instead.

George Will, Bullsh*&^er

As described here.

It is shocking how often the lie that Obama uses the first person a lot in his speeches has been repeated, given how often it has been shown to be false.

Germaine Greer speaks sense

here, but as a trans activist quoted in the article noted, speaking sense is "out of date."

Bonaventure on the Trinity

I have sometimes had commenters remark that my metaphysical interpretations of the Trinity surely must be completely novel, and have nothing to do with any traditional idea about it. Well, here is Etienne Gilson, commenting on St. Bonaventure's ideas on the Trinity, from about 800 years ago:

"Now, it is clear that within such a substance [as a necessary being] the origin holds the place of principle; the exemplar, of means; the final cause, as its name indicates, of end; and as it likewise appears that the Father is the Principle and the Holy Spirit the End, it follows that the Son is the Means. Thus the Father is the original foundation, the Holy Spirit the completion, and the Son the mental word..." -- "The Spirit of St. Bonaventure"

If we absorb the above, we can see that, for instance, Mises's work on praxeology has a trinitarian basis, even though he would have hated to have heard this!

Hipster "multiculturalism"

At my Italian class, one student, a thirty-something hipster with scraggly beard, skinny arms, and nervous hands, was corrected by the instructor: a woman author is a "scrittrice," not a "scrittore."

In response, he rolled his eyes and mumbled something about how sexist this all was.

Entire languages are subject to condemnation if they do not live up to the standards of the twenty-first century Brooklyn hipster!

(Interestingly, the same fellow often corrects the instructor on basic language points, e.g., "That's not reflexive!" in a case where Italian uses a reflexive verb but English doesn't. Even the logical structure of the language is not up to snuff in his eyes.)

Macro Themes

On my fourth round of teaching macroeconomics, I am really able to tie much of the course together around the theme of "upholders of Say's Law" versus "Keynesians" (with "Keynesians" acting as a synecdoche for "all general glut theorists").

For instance, I was just teaching the chapter of our text on unemployment. When we discussed structural unemployment, I told the class about how the general glut debate initially launched in the wake of the Napoleonic Wars. "The defenders of Say's Law were not idiots: they saw that there were idle resources. But their explanation was that after 20 years of fighting, the European economy was structured around war: it would take time to change factories for making cannons into factories for making sweaters."

And then I explained how a similar structural explanation was offered for the recent housing-led downturn. And I noted that the Keynesians needn't deny that these structural imbalances occur, but only need hold that they are just an aspect of (and perhaps even the trigger for!) a more widespread malaise that affects business in general.

Our chapter on supply and demand was similarly linked to this topic by noting that the diagrams imply instantaneous achievement of equilibrium -- in which case Say's Law must always hold! But what if the adjustments to changed conditions take time? What if they take years?

I daresay few groups of college freshmen will have Say's Law more on their minds than will my macro students!

The Laffer Curve Is No Joke

I have seen a number of pieces from the American left mocking the very idea of the Laffer Curve, as though it is idiotic to think that lowering taxes could ever raise tax revenue. But pretty much every trained economist would admit that there is such a curve; the only question is where its maximum lies.
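The point can be made with a toy model (the linear behavioral response below is purely an assumption for illustration, not an empirical claim): revenue is the tax rate times the taxable base, and if the base shrinks as the rate rises, revenue must peak somewhere strictly between 0% and 100%.

```python
# Toy Laffer curve: revenue = rate * taxable base, where the base
# shrinks as the rate rises (an assumed, illustrative response).
def revenue(rate, base0=100.0):
    base = base0 * (1.0 - rate)  # base falls linearly to zero at a 100% rate
    return rate * base

# Revenue is zero at both a 0% and a 100% rate, so the
# revenue-maximizing rate lies somewhere in between.
rates = [i / 100 for i in range(101)]
peak = max(rates, key=revenue)  # 0.5 under this particular assumption
```

With any other downward-sloping response the peak would land elsewhere, which is exactly the real debate: not whether the curve exists, but where its maximum lies.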


"One great success was the Commutation Act of 1784 which reduced tea duties from 119% to 12.5%, successfully killing smuggling and enhancing the public revenue, a never to be forgotten lesson." -- Julian Hoppit, "Political how are an British economic life, 1650-1870," from The Cambridge Economic History of Modern Britain, Volume I, p. 360, emphasis mine.

Lionel Robbins Discusses "History"

I didn't have a book to bring to the gym at work today, so I scanned the shelves of my (shared) office and plucked from them Lionel Robbins' A History of Economic Thought. Now mind you, I have no axe to grind with Robbins, and the remarks of his I will highlight below have little bearing on any practical current debate. I only note them to show how very wrong even major thinkers often are when they wander outside their area of expertise.

I started with Robbins' second lecture, on Plato and Aristotle. The first sign of trouble came when Robbins said that in The Laws, Plato has a "fascist conception" of the best society, rather than a communist one as in The Republic. So Robbins is trying to line up thinkers of 2400 years ago with the political parties of his own day, a completely hopeless task that falsifies the past.

Next up: "Before the Renaissance Plato was not at all well known, whereas 'the Philosopher' (Aristotle) was appealed to by most of the writers on moral philosophy from Thomas Aquinas downward." This is a mangled version of something Robbins heard as an undergrad. In fact, for centuries, it was Aristotle who was not well known. It is true that when his works were recovered from Islamic sources, he eclipsed Plato in importance, and that only changed with Renaissance Neoplatonism. But there was no period in which Plato was "not at all well known."

Robbins then goes on to offer the near-mandatory disclaimer that he doesn't agree with Aristotle on slavery. He says that "There were enlightened people...who were beginning to question the institution of slavery, and Aristotle... thought that as a moral philosopher he ought to give some justification thereof." Note that Robbins doesn't list any of these "enlightened people." I suspect that is because he doesn't actually know of any. And I suspect that is because there weren't any. And I have been told this by Garrett Fagan, who is a real historian of this period.

Further, for Robbins, this whole discussion is "all very lame and dull stuff," so he doesn't even bother with what Aristotle's argument actually is. If he had, he would have found Aristotle arguing that certain people naturally work well self-guided, while others need to take orders. Thus, Aristotle would have looked at the factory workers of Robbins' day and said, "Oh, I see you've devised a new form of slavery!"

Robbins continues by reciting the much-discredited idea that Medieval thinkers were enslaved to Aristotle and didn't think for themselves, a bit of nonsense refuted simply by glancing at the long lists of propositions from Aristotle that were condemned in the Middle Ages.

Our final flub comes when Robbins accuses Aristotle of having "a value judgment creeping in" to his economic analysis. So Aristotle, who would reject that whole fact / value dichotomy and hold that one certainly can derive an ought from an is, was just accidentally letting value judgments "creep in"!

This is only the first five pages of this chapter, and basically we have found one truly awful historical error per page.

Moral: Read real historians! Dabblers in history from other fields, and pop historians, tend to produce rubbish. (There are, of course, many exceptions, but they are in the minority.)

Was It the Government That Was (Chiefly) Responsible for the Low-Fat Diet?

Nutrition science is (apparently) seriously revising its recommendations for the amount of fat we should have in our diets. In response, many of my libertarian Facebook friends have been posting things like, "See: never pay attention to government nutritional guidelines."

But was it really the government that drove these recommendations? My impression -- and I have not studied this history in any depth, so this is only an impression -- is that this was more a matter of nutrition scientists jumping to plausible conclusions with too little evidence at hand. Studies showed that the presence of cholesterol in the blood had a positive correlation with heart disease. Therefore, people should lower their cholesterol intake. This hypothesis proceeded on the sensible idea that if we suffer from having too much of X in our bodies, we should put less X into our bodies.

But nutrition and physiology are very complex subjects, and it seems that this plausible idea was not tested sufficiently. Perhaps the real problem is a genetic predisposition to accumulate cholesterol in the blood, and diet has little to do with this. Or perhaps something else entirely!

In any case, government health agencies did pick up this low-fat ball and run with it, perhaps foolishly. (I would never deny that government bureaucracies often behave foolishly!) But the public choice explanations I've seen for assigning nefarious motives to this decision don't make sense to me: there are meat and dairy lobbies as well as grain lobbies, and it's not clear why even the grain lobby would be behind the low-meat-consumption guidelines, since it takes much more than a pound of grain to produce a pound of meat.

Scientific hubris and a naive faith in expert pronouncements seem sufficient to explain what happened here: Does anyone more familiar with this history than me know a reason why this is not so?

Algorithms and Their Implementations

As I taught my students the Sieve of Eratosthenes, I described the sieve verbally, then I had them "run" the sieve on paper, then program it in Visual Basic. And as I did so, I contemplated, "What is the sieve itself, apart from its implementations?"
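For concreteness, here is one implementation of the sieve. The class used Visual Basic, so this Python version is my own translation, not the classroom code:

```python
def sieve(limit):
    """Sieve of Eratosthenes: return all primes up to limit, inclusive.

    The simple beginner's version: assume every number is prime, then
    cross off each multiple of every prime found, and collect what's left.
    """
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]  # 0 and 1 are not prime
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            # Start at n*n: smaller multiples were crossed off already.
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n in range(2, limit + 1) if is_prime[n]]
```

Running the sieve on paper, in one's head, or on a machine all realize this same form, which is the puzzle the next paragraph takes up.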

This of course is but one more version of the problem that Plato and Aristotle grappled with, on the relation of the forms to their particulars. Plato's approach was to regard the particulars as inferior "copies" of the forms, which tended to lead to a contempt for the world of particulars, and Gnostic ideas like the creation of the sensible world by an evil demiurge. On the other hand, Aristotle emphasized the particulars, leading him to posit a God who was so totally removed from the world that He did not even know it existed.

The problem, it seemed to me as I contemplated this matter, is that each view is one-sided, and treats the algorithm and its implementations as though they could be pulled apart. But the implementations are meaningless except as implementations of the form. We might understand this by saying that the "algorithm itself" is the generative source of the implementations. But it is not something different from those implementations, nor did it stamp them out, as though on an assembly line. We might say the implementations of the form are one in being with the form, and that they are begotten, not made by it.

Furthermore, we finite beings are only able to perceive the form itself through the implementations. There is simply no way to convey to anyone else what the generative form of the algorithm is without pointing them to the form fleshed out in an implementation, or, we might say, as humans, our only path to the generative source is through the source made flesh.

But we might further notice that we haven't completed our analysis of this matter just yet: neither the generative form nor its implementation are complete without the actual running of the algorithm, until we give it life by executing it, on paper, or in our head, or on a computer. The generative form is made fully real through its implementations by the power of executing it. But that power is also one in being with the form and the implementations: we might say the execution of an algorithm proceeds from its generative form and its implementation.

Somehow, this all reminds me of something, but I'm not sure what at the moment.


The scientific study of the psyche, undertaken by people who generally do not believe that the psyche exists.

Bryan Caplan Explains the Liberal Attitude Towards Religion

I was at the NYU market process colloquium one day when Caplan was presenting. I challenged his notion of rationality, saying that his own view lacked the resources to say why worrying about material well being is rational, while following Biblical injunctions on behavior is irrational. (Something he nevertheless held to be true.)

Caplan's response was along the lines of, "Oh, so we're supposed to be following the dictates of a bunch of desert shepherds from 3000 years ago?" (I quote from memory, but I certainly do not have the essence of Caplan's response wrong.)

The first thing I will point out is that Israel Kirzner, who is an orthodox rabbi (and many times the economist that Caplan is), was sitting next to me in that room. So Caplan was quite deliberately mocking Kirzner's life choices, and in a forum in which he knew Kirzner could not respond. (Kirzner is, of course, too self-possessed and too much of a gentleman to even show a response to Caplan's infantile behavior. Me, on the other hand...)

The second is to note just how stupid Caplan's response is. Imagine he had used the Pythagorean theorem to prove something in his paper, and I had complained, "Oh, so we're supposed to be following the mathematical reasoning of some tunic-wearing Greek cult leader from 2500 years ago?"

Of course, mockery is all that people like Caplan and his fellow GMU ignoramus Tyler Cowen have in their arsenal: if they were actually to engage on these topics with a thinker like, say, Alasdair MacIntyre, they would be completely crushed. So, mockery it is!

Teaching Programming Without a Net

This morning I did something I had never done before: I wrote a program in front of an audience.

I had assigned my introduction to programming students the Sieve of Eratosthenes as a problem. I had already written a sieve in Visual Basic based on Stepanov and Rose's guidelines. But I wanted my students to implement a much simpler version -- they are beginners, after all!

Today, for the first time, I came to class, quite deliberately, without having written the program I was going to show them in advance. I told them, "I want to show you how a programmer thinks through a problem like this."

And I programmed the whole algorithm live, describing each step, and using the Visual Studio debugger to examine what was going on. It was a bit nerve wracking: what if I froze up, and couldn't think of what to do next?

But we got through it, and the students loved it. I will be doing this again.

PS: Having gotten them through the sieve, I need one more algorithm for them to code before I assign them their independent project. Any suggestions?

The Segregated Pop Charts?

I've mentioned this before, but I am always befuddled by claims that Michael Jackson was "the first" black artist who could score hits with white audiences. I was reminded of this again when I happened to be pointed to Billboard's list of number one pop singles. Take the year I first started listening to pop music, 1970, and consider that the United States is about 13% black, so it is just about impossible to reach number one on the overall pop chart if you are selling only to black record buyers.

What I see for that year is that black artists occupied the number one spot 19 out of 52 weeks: almost 40% of the time. And it was not one "crossover" artist: it was five different ones. And me, a white kid in the suburbs, owned records by all five.

The previous year, 1969, I count black artists at number one 20 out of 52 weeks. And again it is five different artists, with only two overlaps with 1970 (Diana Ross and the Supremes, and Sly and the Family Stone). And I owned records by two of the three acts that would not reach number one in 1970. (I don't recall owning anything by Marvin Gaye at that time.)

Black Americans have had a tough enough time, what with slavery and Jim Crow and lynchings, that we really don't need to make up slights that didn't exist.

The promise that the study of the brain is "on the verge"...

of explaining how the brain "produces" consciousness...

is a check that has been "in the mail" for 250 years now!

If People Are Willing to Pay a Lot for It...

it must be bad?!

That's what town rankings like this one always seem to imply: they penalize towns for being pricey. But towns are pricey precisely because they are desirable places to live!

When Law Goes Private...

it turns out that it favors the wealthy even more than does state-made law!

Who could have imagined that in a market for law, the people who can pay the most for law get the law they want?!

The terrible assumption made in the optimistic case for an anarcho-capitalist justice system is that what people will want to pay for in private law is perfect justice, so far as it can be achieved.

Why in the world should that be the case? Large corporations will pay for law that will favor large corporations. Lawsuits based on corporate malfeasance will be increasingly hard to win. Intellectual property law will be strengthened, tremendously. The few dollars you have to spend on "private defense agencies" are peanuts compared to the billions upon billions large corporations will be able to spend.

And, as in any market, the consumer will win: corporations will get exactly the laws they want.

What should be jettisoned from Scholastic philosophy

"Dedicated as they were to the understanding of faith, our theologians accepted without criticism a great deal of ready-made philosophical and scientific knowledge that had no necessary relation to Christian revelation -- and, be it noted, these are precisely the dead and antiquated parts of their work, which we have absolutely no reason to preserve." -- Etienne Gilson, "Historical Research"

I have been trying, apparently without success, to convince the Thomists at Ed Feser's blog of this point, especially concerning the clearly antiquated doctrine of the division of life into non-sentient plants and sentient animals. Set aside the fact that this division totally ignores fungi, which are neither plants nor animals: it is also entirely dependent upon "movement" being defined as "movement at the pace at which humans move." Plants move around plenty, just more slowly than we do. And this antiquated division must classify single-celled organisms as "animals," since, seen under a microscope, they are clearly moving around, hunting, and so forth. But since plants evolved from such single-celled creatures, that would mean that at some point in their evolution, they "passed out" and lost sentience!

It is entirely understandable that Aristotle or Aquinas accepted this division. But today, we have a wealth of research showing us just how active plants are. Both Aristotle and Aquinas would have changed their view in the face of this research. It does their followers no credit that they will not do so.

My response to Walter Block

Is online here at Cosmos and Taxis.


I was looking at a paper published in the Cambridge Economic History of Modern Britain, and I found this passage:

"[The South Sea Company's] first act happened to be the successful conversion of 9 million pounds of government debt into company stock. For this service the government undertook to pay interest at 6%.""

This left me a little puzzled: just what was the government paying 6% on, if its bonds had been converted to South Sea Company stock? I wrote a friend who is an expert on the history of money and banking, and he agreed that the passage is confusing, and said, "The Wikipedia entry on the South Sea Company is better."

So, between a peer-reviewed book from Cambridge and Wikipedia... Wikipedia wins!

And high school teachers are still advising their students never to use it.


The Distributist Definition of the Capitalist State

"The two marks, then, defining the Capitalist State are: (i) that the citizens thereof are politically free: i.e. can use or withhold at will their possessions or their labor, but are also (ii) divided into capitalist and proletarian in such proportions that the state as a whole is not characterized by the institution of ownership among free citizens, but by the restriction of ownership to a section markedly less than the whole, or even to a small minority. Such a Capitalist State is essentially divided into two classes of free citizens, the one capitalist or owning, the other propertyless or proletarian." -- Hillaire Belloc, The Servile State, p. 16

The truth about the "subjectivity" of value judgments

Reading Frances Woolley's post about throwing away pumpkin seeds led me to contemplate this point.

By noting that "waste is a value judgment," Woolley seems to imply that it is "merely subjective," and therefore beyond dispute.

The fact that we can, and do, successfully challenge these judgments by others, and sometimes get them to change their minds, shows that this is not the case. But the confusion is understandable: the claim that value judgments are not subjective is often conflated with the notion that everyone whomsoever, in any circumstance whatsoever, ought to make the same judgment. So, a waste nanny might badger all others about how awful it is for them to throw away their seeds. And that is clearly a mistake.

The truth lies in between: it is either a good idea or a bad idea for me to save and toast my pumpkin seeds. It is not a matter of my whims. But whether or not it is a good or bad idea depends on my "particular circumstances of time and place," and, perhaps most importantly, on the sort of person I am.

The good for Socrates may consist in tossing his pumpkin seeds and spending the time he might have spent roasting them in contemplating justice. The good for a couch potato might consist in getting up off of his butt and cleaning and toasting the seeds, since he was just going to be watching bad TV otherwise. And the good for a farmer might consist in saving the seeds for next year's crop. The fact that the good can vary from person to person does not make it a matter of whim!

Neuroscience: Uncovering the "secrets of consciousness"?

An actual neuroscientist (at least one in training) is honest about where things stand. The findings:
We have no f*%king clue how to simulate a brain.

We have no f*%king clue how to wire up a brain.

We have no f*%king clue what makes human brains work so well.

We have no f*%king clue what the parameters are.

We have no f*%king clue what the important thing to simulate is.
So, we are just about there, hey?

And note that this person is still a materialist, and still thinks we are dealing with a "machine" that thinks. S/he is just honest enough to admit that we have no clue how that "machine" operates.