Wednesday, May 30, 2007

A little Greek is a dangerous thing

Today classical studies are enjoying a modest resurgence. For most of us, this means reading the classics in translation--at best. And (let’s be realistic) for many it is popular films like “300” that do the trick.

Some people do remember bits of high school Latin. Greek, nowadays taught mainly at the college level, is a much rarer accomplishment. Yet because scientific terms like “stratosphere” and “biology” are made up of Greek roots, most educated people command a rough-and-ready set of terms.

This knowledge easily leads to overconfidence, and overconfidence not restricted to novices. An interesting case is that of the Swedish bishop and biblical scholar Anders Nygren (1890-1978). In his monograph “Eros and Agape,” first published in Swedish in 1930-1936, he analyzed the connotations of two Greek words for love, eros (sexual love) and agape (spiritual love), concluding that agape is the only truly Christian kind of love, and that eros (an expression of the individual's desires) turns us away from God. In the English-speaking world many are familiar with C. S. Lewis’s appropriation of Nygren’s pair. However, Lewis complicated matters by adding two other terms, philia and storge. Riffing off Nygren without acknowledgment, Pope Benedict XVI in his first encyclical, Deus Caritas Est, held that both eros and agape are aspects of divine love. Alas, ancient Greek recognizes no hard and fast distinction among this tangle of terms, though agape is preferred in the New Testament.

The Alsatian theologian Oscar Cullmann proposed another pair of terms that proved influential in the mid-20th century. In his 1946 monograph “Christus und die Zeit,” Cullmann asserted that the Greeks had two very different words to express the concept of time. One is chronos, the steady, measurable procession that is tracked by human instruments and that we recognize as built into the structure of the cosmos. Contrasting with this, Cullmann held, was the term kairos, which designates the special character of a particular moment. When we are advised to “seize the time,” kairos is what is meant. Yet as the Scottish theologian James Barr showed, the Greek language observes no absolute contrast between the two. Language, at least, offers no warrant for Cullmann’s interesting dichotomy.

From his reading of Plato’s Gorgias, the political philosopher Leo Strauss imported a concept called “thumos,” defined as a kind of vigorous manly assertiveness. Indeed, the concept has been championed by Harvard professor Harvey Mansfield in his recent book “Manliness.” Straussians like Mansfield would have it that the Greeks had a well-developed concept of thumos, which we can access and use. They had no such thing.

What is common to all these appropriations from the Greek--eros, agape, philia, storge, chronos, kairos, thumos--is the confidence that when one appropriates such a word one is automatically connecting with a well-developed system of thought. It would seem that the ancient Greeks, those clever fellows, did the work for us; all we have to do is access it.

Alas, these coinages are neo-Greek. They have only as much authority as the appropriators can attach to them. As we noted, the elaborate constructions of Nygren and Cullmann are fast receding into oblivion. This is also likely to be the fate of Mansfield’s deliverances.

The moral is this: beware of Greek gifts, especially from those claiming to be experts.

Saturday, May 26, 2007

The insufferable Hillary Clinton

Someone said that one could form a "Hillary Clinton Book of the Month Club." Maybe it should be Book of the Week. I don't plan to read any of them, but enough information is now on hand to make clear why the Clintonessa is infuriating.

She is both self-righteous and opportunistic. There is a place for both in our political system, but not combined in one person. Among others, Woodrow Wilson and Ralph Nader have been consistently self-righteous. Ever reluctant to grant the cogency of any opposing political view, these individuals stick to their guns. They are not trimmers.

Then there are the pragmatists in politics, of whom Franklin Roosevelt was the supreme example. A story (possibly apocryphal) says that one morning he was visited by a group of Protestant pastors, who complained about growing Catholic influence. "You're absolutely right!" the president exclaimed. "I'll exert every effort to curb this menace." An hour later, a Catholic group came calling. They complained that Catholics were ignored and discriminated against. "Right you are," said Roosevelt. "I will do everything I can to correct this great wrong."

Despite this opportunism, Roosevelt was a very effective president.

Because she combines two incompatibles, self-righteousness and opportunism, a Hillary presidency would be a disaster. A political cartoon was telling. In one panel Hillary says, "If I had known then what I know now, I would not have voted to authorize the Iraq war." "What is it that you didn't know then?" asks the interviewer. "I didn't know that the war would be unpopular."

Hillary's self-righteousness was again on display the other day when she vehemently refused to answer a question about how she would vote on the current war authorization bill. In the end she voted against it, a safe vote since she knew the bill would pass and "our troops would be supported." I don't believe for a moment that Hillary would end the Iraq war if she became president. She admits that she would keep 50,000 troops there--a number easily revised upwards.

And all of her prevarications she serves up with liberal helpings of self-righteousness.

Winston Churchill observed that one can always trust the Americans to do the right thing--after they have exhausted every other alternative. I am hoping that we will do the right thing and reject Hillary for the Democratic nomination. But alas, there may not be enough time for the slow-witted American public to come to this realization.

Wednesday, May 23, 2007

Eugen Weber, 1925-2007

Yesterday I was saddened to learn of the death of Eugen Weber, professor of modern European history at UCLA. I took his course in Western civilization in the spring of 1957, my first year in college. Bored by high school, I keenly looked forward to attending the university as the true beginning of my education.

Some of my teachers at UCLA disappointed me. Not Eugen Weber. His lectures, combining telling detail with astute generalizations, were enthralling. He was handsome and self-assured, impressing us all by his cosmopolitanism. Eugen was born in Bucharest of (I suppose) Jewish parents. When he was 12, his parents were astute enough to enroll him in an English boarding school. After the war, he discovered his true calling: the history of modern France. (As far as I know, Eugen was not related to Max Weber, about whom I recently wrote.)

The New York Times obituary stressed Weber’s methodological empiricism and avoidance of grand themes. This is not my recollection, though perhaps he became more skeptical as he went along. To be sure, Weber had a practical side, derived from a serious fund of experience. Once (anent the Crusaders) I fatuously remarked about how single combat must be more satisfying than bombardment and other forms of “distance warfare.” He immediately contradicted me, saying that he had had to bayonet enemy soldiers when he was with the British Army in Sicily. It was not satisfying at all.

Since Weber’s political views inclined to the left, it might seem surprising that he chose as his first major research project the Action Francaise, a far-right movement. In those days, we thought that the French were all lefties, and only gradually did one become aware (in large measure because of Weber’s work) that France had been, regrettably, a hothouse of proto-fascist thought.

Weber’s most innovative book was “Peasants into Frenchmen” (1976). Here he showed, with much detail, how France was not (as some French people think even today) some Platonic idea that had always existed. Instead it was the creation of relatively recent times, beginning in fact with the Third Republic in 1871. The political centralization and educational reforms of that regime began to mold local particularisms into a sense of national identity. As Benedict Anderson and others have since observed in more general terms, nations are, in many cases, artifacts, not “natural” entities.

Many will have seen Eugen Weber on public television in the 52-part series “The Western Tradition,” produced in 1989 by WGBH in Boston. Perhaps because I had heard much of the material in class thirty years before, I was not so impressed. In fact, the television lectures were a kind of elegy for a particular concept of Europe, which now seems dated and exclusivist. Some have even gone so far as to call this approach “Nato history,” a kind of enabling instrument masking some of the seamier aspects of Cold War realpolitik.

During World War II my parents assumed that Europe would never recover from the devastation inflicted by that horrific conflict. Only a few years after the end of the war, though, a book appeared titled “Fire in the Ashes.” Indeed, Europe was making a remarkable comeback, though still very much dependent on the shield of American power. It was in those days--my Weber period, I suppose--that I and many other sensitive young Americans conceived the idea of escaping from the crass commercialism of America to Europe, the true seat, we thought, of all genuine culture. I did indeed reside for a time in Italy and England, and have been back to Europe many times. I also found in trips to Latin America, Africa, and Asia that there is much more to be seen in the world at large. I tempered my Eurocentrism.

It was ironic that I aspired to make the reverse journey of Weber. He had started in Romania, then sojourned in England and France. After a brief period in Canada he had settled by the warm shores of the Pacific Ocean.

In those days one might have said that Weber was a “refugee.” This term is unkind. The best rubric for this geographical and intellectual trajectory is “the Transatlantic Migration.” It was only when I got to NYU in 1956 that I experienced the brain power of this phenomenon at full strength. The teachers who influenced me there were German Jews, including Richard Krautheimer, Karl Lehmann, and Erwin Panofsky. After a titanic struggle, the Third Reich was defeated, a struggle my teachers aided. Yet the other Germany, the good one, had, in these paragons of intellect, vanquished England and France combined. Germania: non omnis moriar.

Those heady days of graduate school in New York gave way to new experiences in London. There I found little nourishment in indigenous thinkers, Little Englanders as they for the most part were. My greatest inspiration came in two mighty figures of Austrian origin, Karl Popper and Ernst Gombrich.

With all this stimulation I practically forgot about Eugen Weber. One affinity remained, for I share Weber’s love for France. Intellectually, though, I suppose I still speak German--trotzdem (“nonetheless,” as another Austrian, the architect Adolf Loos, would have put it).

Monday, May 21, 2007

Max Weber revisited

In the recent flurry of anti-religion books, Christopher Hitchens has decided to turn up the volume. He is getting a lot of attention--except from the people who matter.

The figures for religious adherence among the world’s 6.5 billion people are staggering. According to the latest data available to me, Buddhists number 375 million; Christians, 2.1 billion; Hindus, 851 million; and Muslims, 1.2 billion. Estimates of unbelievers, who are difficult to count for various reasons, range from 150 million to 750 million. Moreover, this imbalance is likely to change--and not in the direction that Hitchens and his fellow debunkers would like. Christians and Muslims are vigorous proselytizers, while Hindus have a high birthrate. Over time, many people in Russia and China who went over to state-sponsored unbelief are likely to return to the faith of their ancestors. Unfortunately, the religiously indifferent in Western Europe have a very low birthrate. They are increasingly challenged by the Muslims in their midst.
To be sure, some of this faith is nominal. Confining ourselves to Christians, a recent survey showed that a majority did not know who delivered the Sermon on the Mount. Some believers, we are told, hold that Joan of Arc was Noah’s wife. In other cases, the overall commitment is tenuous at best, as seen in a friend who proclaims himself an Episcopalian atheist. We have all encountered George Santayana’s assertion “There is no God, and the Virgin Mary is his mother.”

All the same, it must be a sobering thought that poor knowledge and doubt may, under proper circumstances, yield to fervent study and active participation. Moreover, in order to have a “clash of civilizations” you do not need orthodox Christianists and Muslims--just an intense loyalty to one’s historic community, a community that is ostensibly menaced by the other.

These considerations give me no joy. There may be a silver lining, though. Apart from countries like Russia and China, unbelievers are likely to be more prosperous than their believing counterparts. Still, the causality may not run in the direction that one might assume. That is to say, in advanced Western countries it is not the status of unbelief that promotes worldly prosperity. Rather, it is the pervasiveness of higher education which, in preparing the young for success, also tends to erode their religious commitment. When all is said and done, in the hurly-burly of the real world there is truth in the quip “Jesus saves, Moses invests, but Mr. Secular excels as an arbitrageur.”

Looking over the world’s population, though, it seems clear that the world’s poor tend strongly to be religious. Their faith offers consolation, but at the same time it holds them back by making them resigned to their fate.
It would appear then that in the aggregate religious attachment is dysfunctional to economic progress. Is this always true, though? A hundred years ago the German sociologist Max Weber offered evidence for a test case in which religious faith advanced the believer’s chances for worldly success.

In his The Protestant Ethic and the Spirit of Capitalism (1904), Weber posited a close link between Puritan ideals and the rise of capitalism. At first glance, this claim seems counterintuitive, as religious devotion was usually accompanied by rejection of worldly interests, including the pursuit of wealth and possessions. Why was that not the case with Protestantism? Weber addresses this seeming paradox in his book.

The German sociologist defined the spirit of capitalism (which he understood as a unique accomplishment of Western culture) as the complex of ideas and habits that favor the rational pursuit of economic gain. To be sure, Weber acknowledged, such a spirit is not limited to the West if one considers it as the attitude of individuals, for industrious people are found everywhere. Such individuals—heroic entrepreneurs, as he termed them—could not by themselves establish a new economic order. The most common tendencies were the lust for profit with minimum effort and the idea that work was a curse and burden to be avoided, especially when it exceeded what was enough for a modest life. As he wrote:
In order that a manner of life well adapted to the peculiarities of capitalism… could come to dominate others, it had to originate somewhere, and not in isolated individuals alone, but as a way of life common to whole groups of men.

After defining the “spirit of capitalism,” Weber advanced reasons pointing to its origins in the religious ideas of the Reformation. Weber held that certain forms of Protestantism favored the rational pursuit of economic gain and that worldly activities had been given positive spiritual and moral meaning. Worldly prosperity was not the goal of those religious ideas, but rather a byproduct stemming from the inherent logic of those doctrines.

In the absence of the traditional medieval assurances from religious authority, Weber argued that Protestants began to look for other "signs" that they were saved. Calvin and his followers taught a doctrine of double predestination, in which from the beginning God chose some people for salvation and others for damnation. The inability to influence one's own salvation presented a very difficult problem for Calvin's followers. It became an absolute duty to believe that one was chosen for salvation, and to dispel any doubt, for lack of self-confidence showed insufficient faith indicating damnation. In this way, self-confidence took the place of assurance of God's grace.

Weber believed that by the time he wrote his monograph, the religious underpinnings of the Protestant ethic had substantially faded. As a significant milestone along this path, he cited the writings of Benjamin Franklin, which emphasized frugality, hard work, and thrift, but were mostly free of spiritual content. Franklin exemplifies the process of secularization of the Puritan ethic.

Weber’s approach was not monistic, for he acknowledged that while Puritan religious ideas had had a major influence on the development of economic order in Europe and the United States, they were not the only significant factor. Others included rationalism in scientific pursuits, merging observation with mathematics, precision of scholarship and jurisprudence, together with systematization of government administration and economic enterprise. In the end, the study of the Protestant ethic, according to Weber, merely revealed one phase of the larger process of emancipation from the magical world view. This process yielded that disenchantment of the world that he regarded as the distinguishing peculiarity of the Western culture we know today.

The monograph forms an integral part of Weber's criticisms of Karl Marx and his theories. While Marx held, generally speaking, that all human institutions--including religion--were the product of economic factors, The Protestant Ethic turns this theory on its head by implying that a religious movement fostered capitalism, not the other way around.

What has been the fate of this century-old book? In fact, Max Weber’s theory of the role that Protestantism, especially Calvinism, played in the development of capitalism in Western Europe has had a profound effect on the thinking of sociologists and historians. This effect was not limited to the 20th century, but continues today.

Criticism has flowed from all sides. Yet the effect of these criticisms has not been to demolish the book, but rather to demonstrate its continuing vitality as an instrument for beginning to think about one of the most important problems of our own times: the uneven distribution of economic prosperity throughout the world.

Some scholars hold that Weber misunderstood the views of Benjamin Franklin. As Weber was not an Americanist, that may well be the case. However his larger point--that ideas that are of religious origin may continue to circulate in disguised form--seems valid.

Other criticisms go to his assertion that modern capitalism could not have come to fruition in Europe without an ethic or spirit which had its roots in ascetic Protestantism. Thus it has been pointed out that the seeds of capitalism may be found in medieval Catholic Europe. This is indeed true, but whether these trends could have come to full flower without the intervention of the Reformation must remain uncertain--perhaps improbable.

Others hold that the driving force behind capitalism was not asceticism but rationality. The question then becomes, what is the origin of this rationality? One can certainly observe the use of rational methods of argument in the writings of Thomas Aquinas and other Scholastic thinkers. However, rationalism reached its full flower in Western Europe only in the 17th century, after the wars of religion that ensued in the wake of the Reformation had made clear the futility of dogmatic suppression of dissent--at least in some quarters of northern Europe; witness the careers of Spinoza, Hobbes, and Descartes.

The economic historian Jacob Viner has pointed to an instance that seems a significant exception to Weber’s thesis. Pre-18th-century Scotland was a relatively backward country, despite its adoption of Calvinism in the form of the Presbyterian Church. One might argue, though, that the effect was merely delayed. Indeed, a Scotsman, Adam Smith, emerged as the most persuasive analyst of “the spirit of capitalism,” and indeed its actual workings.

Perhaps the most serious criticism of Weber stems from his Eurocentrism. He made a special study of the economic history of China and Japan, concluding that those countries were destined to remain backward because they were not Protestant. The failure of this prediction would appear to be damning.

Yet it is not, if we enlarge the bounds of the Weber thesis into a general theory. This theory posits the role of various religions in fostering economic prosperity. It is noteworthy that the countries in Asia that have achieved the greatest economic prosperity--Japan, China, South Korea, Taiwan, and Singapore--are all countries that share a heritage combining Buddhism and Confucianism. Buddhism supplies asceticism and self-denial, with Confucianism complementing it through the emphasis on this-worldly bonds of family and clan. This mixture seems to function in Asia in a way that is analogous to Protestantism in Europe. Where these elements are lacking, as in the Philippines and Indonesia, economic backwardness persists. Thailand, to be sure, has Buddhism but (apparently) no Confucianism. Closer inspection shows that economic progress in Thailand is due mainly to “Chi-Thais,” ethnic Chinese who have settled there.

Nor is the fact that mainland China is officially Communist an obstacle to the application of the broader version of the Weber thesis proposed here. As we have seen with the case of Benjamin Franklin, Max Weber recognized that elements of the religious orientation could survive in secularized form.

What then about the advance of India? So far this has been a much more uneven process. Nonetheless, the Indian phenomenon probably reflects the role of certain high castes, especially the Brahmins, whose self-denying ethos shows significant similarities with that of Protestant asceticism. There is also the role of numerically small minorities, such as the relatively sparse Parsis, who follow the Persian faith of Zoroaster. Finally, one cannot discount the role of the (mainly Protestant) British colonialists. India is the only large nonwhite country of the former British empire to have successfully replicated the concepts of democracy and the rule of law promoted by its imperial overlords. It may well be that economic progress also reflects, at a remove to be sure, the Protestant values long cherished at home by the former colonizing power.

Returning to the question raised at the outset of this essay, religion is often a hindrance to economic advance. But not always. In various parts of the world, including most significantly East Asia, it has served to promote economic success.

Sunday, May 20, 2007

Philosophia perennis

My learned friend at Gayspecies has indisputably derived great benefit from his extensive course work in philosophy. Unlike many with a college education, he has not allowed this talent to lie fallow, but has instead continually refreshed and invigorated it. Sometimes, though, I wonder if he does not go a bit overboard. In a recent posting on problems in our universities he has set forth the following program:

"If I could structure the "ideal" liberal education for undergraduates at our colleges and universities, I'd divide all studies into four broad areas of general focus, and require all students to apportion their coursework equally among each of the four (e.g., 30 semester units in each group):

"Creative Philosophy (Arts, Literature, Music, Theater, Film, etc.)
Practical Philosophy (Ethics, Politics, Sociology, History, Economics, etc.)
Speculative Philosophy (Metaphysics, Psychology, Epistemology, Literary Theory, etc.)
Natural Philosophy (Chemistry, Physics, Biology, Mathematics)

"In addition, the trivium of grammar, rhetoric, and logic would be mandatory, first courses. They are indispensable to a liberal education (and to a good life)."

This idea of subsuming most worthwhile subjects under the rubric of philosophy ignores--it seems to me--the wariness that many now feel with regard to philosophy as it is commonly pursued in English-speaking universities.

To be sure, virtually every academic field evokes dislike, even hatred, from some quarter or other. Rudolf Wittkower, my boss during my brief time teaching at Columbia University, once vouchsafed to me the following: "As to sociologists, they should all be killed!" This savage recommendation occurred, mind you, at Columbia University, the home base of the brilliant sociologist Robert Merton. Every word he wrote is golden, to be read and reread, and pondered.

So I am not concerned with the common-variety form of academic backbiting, but with the hostility that philosophy generates sui generis. The first reason for this dislike stems from the pride that those equipped with philosophical training often affect. Sometimes they seem to think that they are inherently smarter than anyone else; at other times they seem to believe that it is the study of philosophy that has made them such. Perhaps it is both. To this, I suppose, the vulgar response is "If you're so smart, why aren't you rich?" From what I can gather, Gayspecies is fairly well off, but I doubt that he got that way from studying philosophy. There are too many philosophical cabdrivers driving around.

A more serious problem arises from the definition of philosophy itself. One view is that philosophy addresses, with great acumen and insight, a limited number of topics that are intrinsic to itself. Over the centuries, philosophy has seen the emigration of a number of fields formerly within its purview, starting with the natural sciences in the 17th century and culminating (possibly) with the emancipation of psychology a little over a hundred years ago. In my day, a half century ago, philosophy seemed to have reached a limit in this shedding process, for (within the limits of the proto-analytic trend then hegemonic) it not only accepted the departure of the fields mentioned, but saw fit to cast into the outer darkness metaphysics, ethics, and aesthetics. These fields were redesignated "poetry," possibly charming, but affording no access to truth.

At the same time, there survived the older notion that philosophy is the Queen of the Sciences. As such, it is entitled to intervene at any point in any field in order to dispel confusions and set the practitioners on the right path. Well, in my own realm of art history I have seen nothing but mischief in the efforts of professional philosophers to intervene and set us on the right path. Colleagues in other disciplines have told me the same.

The universalizing concept of philosophy seems to me quite simply a product of arrogance.

Thus there are two forms of arrogance. First, is the notion, cited above, that individuals with philosophical training are per se smarter than everyone else. Secondly, there is the idea that philosophy and its practitioners constitute a kind of Herrenvolk, with a sublime mission to govern and regulate everyone else. In practice their efforts in the latter realm are regularly ignored and rebuffed, but hope springs eternal, as seen in the above-cited utopian proposal to herd most areas of study into the great corral of academic philosophy.

Ne sutor ultra crepidam. Let us all look to the health of our own discipline before we seek to offer therapy to another.

By the way, the Ph.D. degree (I have one) is a mere conventional title. It carries no further implications than does, say, the term "bachelor" underlying the B.A. degree; there is no understanding that the married--or for that matter women--are not eligible to obtain it. Similarly, few would accept that possessors of the M.A. degree are entitled, ipso facto, to be our masters.

Saturday, May 19, 2007

Fifty years of Britain

I first saw London fifty years ago. A graduate student, I was on one of those circular tours of Europe--of the “If this is Tuesday, it must be Belgium” type. We didn’t visit Belgium, but we did go to the Netherlands, West Germany, Austria, Italy, Switzerland, and France.

I have long felt an affinity with what I regard as the more profound and rigorous traditions of Germany, France, and Italy. British empiricism has always struck me as a lackluster affair, cobbled together out of spare parts in some dreary English backstreet. Still, like most American intellectuals, I felt the tug of Anglophilia.

Although the effects of World War II were still evident in bombed-out sites and low standards of cuisine, I found London quite charming. Who doesn’t? At any rate, I returned for a week in 1960, exploring the possibilities of writing a dissertation on a manuscript in the British Museum. For this purpose, I obtained a Fulbright grant in 1963. I was to remain in London for four years. With almost total flex time, I explored (with my partner, another Fulbrighter) much of what the city had to offer.
By 1967, when an academic job beckoned from stateside, I could sense warning clouds. Germany--and soon France and Italy--overtook Britain. Inflation surged ahead, and friends who remained in London had to make drastic changes in their standard of living. Successive governments proved incapable of getting control of the situation--until, that is, Margaret Thatcher came along.

Most Americans have trouble giving Margaret Thatcher her due. Not so Tony Blair who, even though he is a Labourite, does not flinch before the sobriquet “son of Thatcher.” At any rate the whole matter has been turned around. Now it is Americans, with their shrinking dollar, who feel poor in Britain.

It seems that this remarkable series of transformations has been charted in a new book by an able Scottish journalist, Andrew Marr, who explains why he loves living in Britain in this piece from The Independent.


The story of the British in the immediate aftermath of the Second World War is a morally attractive one with much to learn from--a time of optimism and energy, despite apparently crippling difficulties.

Politicians on both sides of the political divide believe that Britain will be important in the new world to be built and a great force for good. Returning soldiers and millions of civilians are determined to make up for lost time, to live happier lives.
Patriotism is not narrow, there is such a thing as society, and the common good is not laughed at. Labour is promising a New Jerusalem and though no one is entirely sure of what that magical city might feel like to live in, it clearly involves a new deal in health, schooling and housing.

In British film there is great energy and ambition. Designers and architects have brought over here plans originally drawn in Europe between the wars to create a brighter, airier and more colourful country. In science and technology Britain seems to have achieved great things which augur well for peacetime.

There is a general and justified pride in victory, not yet much tainted by fear of nuclear confrontation to come. If people are still hungry and ill housed, they are safe again. If they are grieving, they also have much to look forward to, for the baby boom is at full pitch.

There is much in the Britain of the later Forties that would surprise or even disgust people now.

It was not just the shattered cities or the tight rations that would arch modern eyebrows, but the snobbery and casual racism--even, despite the freshly shocking evidence of the concentration camps, widespread anti-Semitism.

Yet overall, this was a country brimming with hope. In history, no quality rubs up as brightly.

The great debate about the meaning of our post-war history has been, roughly, an argument between Left and Right.

There are historians of the Centre Left such as Peter Hennessy who are generally impressed by the country's leaders and get under their skin as they wrestled with dilemmas.

Then there are those led by Correlli Barnett who emphasise failure and missed opportunities, at least until Margaret Thatcher arrives to save the situation in 1979.

Everyone else struggles between these force-fields. And so what is my view? That we grumpy people, perpetually outraged by the stupidity and deceit of our rotten rulers, have (whisper it gently) had rather a good 60 years.

Britain suffered a crisis in the Seventies, a national nervous breakdown, and has recovered since. Britain in the Forties and Fifties was a damaged and inefficient country which would be overtaken by formerly defeated nations such as France, Germany and Japan.

But the longer story, the bigger picture, is that Britain successfully shifted from being one kind of country, an inefficient imperialist manufacturer struggling to maintain her power, to become a wealthier social democracy, and did this without revolution.

And shift she did, in the greatest scuttle in the world.

British governments, Labour and Tory, duly got rid of the Empire. This meant the deaths of untold numbers in other continents - Muslims and Hindus caught up in ethnic cleansing, the African victims of massacre and dictatorship, civil war and famine for the Arabs, Cypriots and many nationalities of the Far East.

Britain, meanwhile, refocused on her new role as a junior partner in the Cold War, close to Europe but never quite European, speaking the same language as Americans, but never meaning exactly the same.

Always, we have been a country on the edge.

We moved from being on the edge of defeat, to the edge of bankruptcy, to the edge of nuclear annihilation and the edge of the American empire, and came out on the other side to find ourselves on the cutting edge of the modern condition, a post-industrial and multi-ethnic island, crowded, inventive and rich.

The years before Thatcher were not a steady slide into disaster.

Nobody has put this relative British success better than the American historian George Bernstein, who called his account of post-1945 Britain The Myth of Decline and who said of the years before the crisis of the Seventies:

''Britain's performance in providing for the wellbeing of its people--as measured by employment, a safety net that kept them out of poverty, and improved standards of living--was outstanding.''

And this despite ferocious economic conditions.

There is a danger of distorting real history with false endings. If one decides that the breakdown of the Seventies was the single most important thing to have happened to post-war Britain, which shadows everything before and since, then inevitably the story of the Forties, Fifties and Sixties becomes darker.

Humdrum events dutifully rearrange themselves as ominous warnings.

All the things that went right, all the successful lives that were lived during 30 crowded years, the triumphs of style and technology, the better health, the time of low inflation, the money in pockets, the holidays and the businesses that grew and thrived, are subtly surrounded with ''yes, but'' brackets... guess what's coming next.

But this is a strange way of thinking. In personal terms it would be like defining the meaning of a life, with all its ups and downs, entirely by reference to a single bout of serious illness or marital break-up in middle age.

Does this mean we should cheer our leaders? Certainly not. For most of the modern period politics has served Britain less well than our self-congratulation about parliamentary democracy might suggest.

Good people, acting honourably, failed to lead well. We have been run by cliques of Right and Left who did not understand the direction the country was taking.

Hennessy is right: the political class was intelligent and faced terrible choices which are easy to brush aside afterwards when the dangers have passed.

But Barnett is also right: we could have had a better country, had we had clearer-minded leaders who did not shrink from telling hard truths, or from treating the voters like adults.

So, Labour did not build a New Jerusalem. So, the Tory Cabinets of the Fifties and early Sixties failed to create the restored great power, the New Elizabethan Age they dreamed of.

The Wilson and Heath years were supposed to be a time of modernisation, a refitted, retooled Britain. They ended with trade unions rampant and the lights flickering out.

John Major set out promising to create a country at ease with itself and ended up with a country ill at ease, above all with John Major.

Tony Blair's New Labour Britain was never as cool or efficient as he told us it would be, even before the Iraq war. Nor was it whiter than white.

Each failure occurred on its own terms.

The exceptions were the Labour government of 1945, which developed a Welfare State even if it did not achieve the social transformation it wanted, and Margaret Thatcher's first two administrations, which addressed the British crisis head-on. Both set templates for what followed.

But even these two counter-examples are not completely clear.

Post-war Labour ran out of popularity and momentum within a couple of years, while Mrs Thatcher's vision of a remoralised, hard-working nation of savers and strong families was hardly what the partying, divided, ''loadsamoney'', easy credit, big-hair Eighties delivered.

What follows is a story of the failure of political elites. Often the famous political names, those faces familiar from a thousand cartoons and newsreels, seem to me like buzzing flywheels with broken teeth, failing to move the huge and complex structures of daily life.

If that was all, it would be a depressing tale. But it is not.

Opening markets, well-educated and busy people, a relatively uncorrupt and law-abiding national tradition, and an optimistic relish for the new technologies and experiences offered by 20th-century life all make the British experience generally better than political history alone would suggest.

In the more recent decades the retreat of faith and ideology, and their replacement by consumerism and celebrity, may have made us a less dignified lot.

Yet modern Britain has made great advances in science, culture and finance which have benefited, and will benefit, the world.
Among the puzzles facing humanity at the beginning of the 21st century are global warming; the mystery of consciousness; and how ageing Western societies adapt to the new migrant cultures they require to keep them functioning.
British people have been important in bringing answers, just as they were seminal in the development of the Web, and in creating modern music and television. We have become a world island in a new way.

In the period covered by this book, the dominant experience has been acceleration. We have lived faster. We have seen, heard, communicated, changed and travelled more. We have experienced a material profusion and perhaps a philosophical or religious emptiness that marks us off from earlier times.

If, by an act of science or magic, a small platoon of British people from 1945 could be time-travelled 60 or so years into the future, what would they make of us?

They would be nudging one another and trying not to laugh. They would be shocked by the different colours of skin. They would be surprised by the crammed and busy roads, the garish shops, the lack of smoke in the air.

They would be amazed at how big so many of us are - not just tall but shamefully fat. They would be impressed by the clean hair, the new-looking clothes and the youthful faces of the new British.

But they would feel shock and revulsion at the gross wastefulness, the food flown from Zambia or Peru then promptly thrown out of houses and supermarkets uneaten, the mountains of intricately designed and hurriedly discarded music players, television sets and fridges, clothes and furniture; the ugly marks of painted, distorted words on walls and the litter everywhere of plastic and coloured paper.

They would wonder at our lack of church-going, our flagrant openness about sex, our divorce habit - our amazingly warm and comfortable houses.

They would then discuss it all in voices that might make us laugh at them - insufferably posh or quaintly regional. Yet these alien people were us. They are us.

The crop-haired urchins of the Forties are our pensioners now. The impatient, lean, young adults of 1947 with their imperial convictions or socialist beliefs are around us still in wheelchairs or hidden in care homes.
It was their lives and the choices they made which led to here and now. So although they might stare at us and ask, ''Who are these alien people?'' we could reply: ''We are you, what you chose to become.''

Andrew Marr's A History of Modern Britain has been published by Macmillan at £25.

Friday, May 18, 2007

Jefferson and Saxonism

Still flourishing in some philological and art-historical circles, Anglo-Saxon studies took a hit in the 1960s, with that decade's insistence on "relevance." The requirement that English majors study the language, once common in our universities, has disappeared. Even graduate students get little exposure to this once vital field, whose origins go back to a directive of Thomas Jefferson that it be taught at the University of Virginia. To be sure, there are several popular translations of Beowulf (and a 1977 rock opera with that name!), but these versions sidestep any encounter with the language itself. Allen J. Frantzen explains this paradox--dismissal and partial survival in special enclaves--in his book Desire for Origins: New Languages, Old English, and Teaching the Tradition.

In England the matter has retained more interest, in part because of the popular enthusiasm for archaeology. An old dispute has reached a surprising conclusion. For a long time historical demographers had debated whether the invading Angles, Saxons, and Jutes had simply ethnically cleansed the indigenous Celtic population, or had absorbed them into their own stock. It turns out that neither is true. DNA and other genetic analyses have shown that the bulk of the population of England (and presumably the other parts of the British Isles) descends from an original peopling as the ice sheets retreated some 10,000 years ago. These folk came from northern Spain. In all likelihood, the closest ethnic affinities of the modern English are with the Basques.

These discoveries are very recent. During the early modern period a powerful set of myths took root in England concerning the Anglo-Saxons. In the 17th century these views became entangled with the dispute between the parliamentary faction and the monarchy. According to the defenders of the privileges of parliament, the English possess a natural sense of liberty which came, with the Angles, Saxons and Jutes, from the forests of northern Germany. By tradition this settlement began with the arrival of the Jutish chieftains Hengist and Horsa, who reputedly landed in southern England in 449 CE. The brutal Norman conquest of 1066 occluded these virtues, but failed to suppress them completely. In fact, the cause of freedom and the "natural rights of Englishmen" made a comeback with the granting of Magna Carta in 1215.

Language still offers some attestation to this legend of origins, as the part of Germany from which the proto-English came is still termed Lower Saxony. In part for this reason, the overall theory of special English virtue owing to the settlement of the Angles, Saxons and Jutes, is commonly termed Saxonism.

The notion also bonded with the fascination with the Goths, a continental Germanic group who ostensibly created Gothic architecture. The Gothic heritage blended synergistically with other trends to form the "Gothic balance." This expression, favored by James Harrington, serves as a kind of shorthand for the principle of mixed government in which no branch will have supremacy. Others preferred the presumed original purity of the Saxon foundations, without any "Gothic" admixture.

The original narrative proved very congenial to Thomas Jefferson. At several points during his life he took up a project for an Anglo-Saxon grammar which remained unfinished. Yet Jefferson’s interest in the Saxon heritage went far beyond matters of philology. He held that the forward movement of British settlement in North America was a continuation of the original migration of Hengist and Horsa. It was all part of the vigorous expansion of a superior group of people. Jefferson even went so far as to suggest that the form of government being adopted in the emerging United States represented a restoration of the sublime Anglo-Saxon principles. It was now North America that represented these verities, not a corrupt England under the rule of foreign monarchs.

Thomas Jefferson held that the basis of the common law was shaped in the immediate aftermath of the arrival of Hengist and Horsa in the mid-fifth century. Since England was not converted to Christianity until two centuries later, the common law is by definition pagan.

Jefferson sought to give these ideas visual form in his proposal for the design of the Great Seal of the United States. One side was to bear the images of Hengist and Horsa. The other was to depict a pillar of fire leading the Chosen People into the Promised Land. The racial character of this combination is unmistakable. Those of English heritage must predominate on the new continent because of the primordial excellence of the Anglo-Saxons, personified by Hengist and Horsa. The pillar of fire designates the collective side. It belongs to what is termed the theory of manifest destiny, the idea that the original settlers of British North America were entitled to exercise supremacy over the whole continent--and beyond.

Jefferson’s enthusiasm for his presumed Germano-English ancestors foreshadows the contemporary preoccupation with "roots," the idea that ethnicity plays a special role in one’s identity. In contemporary parlance, it is the tribal myth of the WASPs. In their exclusiveness, though, Jefferson’s Saxonist beliefs were the immediate ancestor of Nativism, with its suspicion of all immigrants of non-English stock. As such, the ideology is poorly suited to an increasingly multiethnic America. Perhaps that is why this strand of Jefferson’s thought does not figure, as far as I can tell, in any of the current accounts of the ideas of the Founders of the American Republic.

In recent years the iconic status of Thomas Jefferson has sustained a number of shocks, including the revelation of his affair with Sally Hemings, the awareness of his convictions regarding the supposed inferiority of blacks, his faltering support of civil liberties, and his proposal that homosexuals be castrated. Yet his adoption of the Saxonist myth may be the worst of these faults, enlisted as it is in his ideas of American triumphalism and Anglo-Saxon supremacy.


A reader suggests that Jefferson's well-known universalism remains paramount. Perhaps so, but I am not sure the Founder's Enlightenment universalism overrides his seemingly episodic preoccupation with his roots. After all, there is a similar problem in the contrast between his stubborn insistence on black inferiority and the ringing language of the Declaration of Independence. Can we really say that Jefferson's Negrophobia, which was almost pathological, was episodic?

It may be that he adumbrated an answer to the first question in his "A Summary View of the Rights of British America." There he says that the ancestors of the British Americans had twice exercised a "right which nature has given to all men," that is, to change their place of residence. Thus rights are potentially universal, but most peoples have become servile and neglect the exercise of these rights. The Saxons, broadly defined, owe their superiority to this exercise.

Over the years Saxonism has become deeply unfashionable, indeed forgotten. Not so the universalism of the Declaration of Independence. As I noted, though, that universalism is problematic.


Sunday, May 13, 2007

Second Amendment shift

A story in the New York Times (May 7, 2007) outlines a recent shift among liberal legal scholars regarding the controversial right to bear arms found in the Second Amendment. This shift lies behind the March decision of a federal appeals court in striking down a DC gun-control law.

Until quite recently the conventional wisdom was that the Amendment provided only for a collective right, a right that could only be exercised by state militias. According to Professor Sanford Levinson, arguably the initiator of the new trend, "[t]he standard liberal position is that the Second Amendment is basically just read out of the Constitution." The new understanding of the individual right to bear arms is endorsed by (of all people) Professor Laurence Tribe of Harvard. Others are holding out.

The history of the interpretation offers two contradictory lessons. First, political considerations often trump honest readings of the Constitution. Since liberals make up the overwhelming majority of our legal professoriate, it is their biases that come into play. This political bias is particularly evident in the older interpretations of the Second Amendment, because liberal interpreters generally act to broaden the scope of the Constitution. Here, they sought to narrow it. There is, however, a second, more hopeful conclusion: eventually, honest scholarship may win out.

All this brings to mind Winston Churchill’s observation: "One can always rely on the Americans to do the right thing—after they have exhausted every other option."

Hovering in the background is the much-advertised conflict between "strict constructionism" and the "living Constitution" approach, in which that document sometimes seems to amount to just a piece of silly putty. A case in point is the expansion of the Commerce Clause, a development dating back to 1903, which gained much impetus during the New Deal.

The contrast is a false antithesis, however, as very few scholars adhere to strict constructionism, strictly constructed. Even Mr. Justice Scalia, often regarded as the leader of the strict-constructionist faction, denies it, saying that he is a "textualist." Textualism has been defined as a formalist theory of statutory interpretation which holds that a statute's ordinary meaning should govern its interpretation, as opposed to inquiries into non-textual sources such as the intention of the legislature in passing the law, the problem it was intended to remedy, or substantive questions of the justice and rectitude of the law. It should not be confused with the plain-meaning approach, which looks to the dictionary definitions of words, without reference to common public understanding or context.

This is not the place to discuss the overall merits of gun control. Let me end, though, with a story told by the philosopher Leo Strauss. As a university student in Germany in the 1920s he belonged to a Jewish Club. One day they received a visit from the famous Zionist Vladimir (Ze'ev) Jabotinsky. He asked what the Club did. "Oh, we study Torah, the Talmud, Jewish history, and the Jewish contribution to modern intellectual life." "And what about pistol practice?" the visitor asked. "That we don’t do," came the answer.

If only the Jews of Central Europe had armed themselves, at least they could have gone down fighting. Only in 1943 did a few brave fighters in the Warsaw Ghetto finally adopt this tactic.

The more general point is that possession of firearms, and a proper understanding of their use, is an important resource in the protection of minority rights. We see a current example of this in the Pink Pistols, an association of gay men who arm themselves where it is legal to do so. Even the thought that the despised fags might be armed is enough to deter some gay bashers.

Monday, May 07, 2007

Unusual college appointments

It seems that Jim McGreevey, ex-governor of New Jersey and self-proclaimed "gay American," has just completed teaching a college course on ethics. This field of expertise is an interesting choice: not only did McGreevey deceive both of his wives about his sexuality, but his administration was also plagued with corruption allegations.

Can other such appointments be far behind? In fact, news has just come in of two new faculty members at Strom Thurmond University, in Brackish, South Carolina.

David Duke will head the department of Judaic Studies. "After all," he said, "I know more about Jews and their conspiratorial effort to take over the world than almost anybody. Why shouldn't I direct such studies?"

A course on gay studies will be offered by Rev. Fred Phelps of the God Hates Fags ministry.
Gay and lesbian students are advised not to sign up for the laboratory sessions. They might not survive.


Sunday, May 06, 2007

The appalling Tenet

Many find that the enigmatic writings of Nostradamus contain warnings for our own times. An even more ancient source of warnings, it seems to me, stems from the late Roman SATOR square. It turns out to have an important message concerning a current figure, George Tenet. (I should remark parenthetically that I am appalled by the amount of attention this noxious apparatchik is receiving, and only marginally reassured by the hard questions that are being asked, occasionally at least, by his interlocutors in the media.)

At any rate, here is the square, which reads both horizontally and vertically:

S A T O R
A R E P O
T E N E T
O P E R A
R O T A S
Note that the word TENET occurs biaxially, as it forms both the vertical and horizontal center. The identification is confirmed by the first word, SATOR, which means "sower" in Latin. The given name "George" means an agriculturalist in Greek (corresponding to Tenet’s ethnicity).
The second word AREPO is mysterious; it is generally taken to be the name of the sower. However, if pronounced in English it yields "A rape, oh!" The erstwhile spook has indeed participated in the gang rape of the American public.

TENET, usually rendered as "holds," is now perfectly clear.

OPERA seems to mean "with effort." George Tenet’s efforts to retain his footing in Washington were indeed strenuous, though they ultimately failed. A supplementary explanation, complementing the first, relies on the modern acceptation of opera. Tenet used to claim that his "hair was on fire." His temperament does indeed seem operatic—and of course false through and through.

Finally, ROTAS means wheels. Tenet claims only to have been a cog in the war machine. In fact, he was one of its big wheels.
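The symmetry claimed for the square can be checked mechanically. A minimal sketch in Python (nothing here comes from the ancient sources, of course; it merely verifies the word pattern):

```python
# The five Latin words of the SATOR square.
SQUARE = ["SATOR", "AREPO", "TENET", "OPERA", "ROTAS"]

# Reading down each column reproduces the same five words:
# the grid is its own transpose.
columns = ["".join(row[i] for row in SQUARE) for i in range(5)]
assert columns == SQUARE

# Read letter by letter, the whole square is also a palindrome.
flat = "".join(SQUARE)
assert flat == flat[::-1]

# TENET occupies both the central row and the central column.
assert SQUARE[2] == "TENET" and columns[2] == "TENET"
print("square verified")
```

The biaxial placement of TENET noted above falls out directly from the transpose property.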


Friday, May 04, 2007

Getting the blues in Russian

Oddly, this article does not mention one factor influencing Russian speakers' view of the shades of blue, and that is that one of the meanings of goluboy is "gay." See the section on the trope of Color Symbolism in the main body of this site, or (more conveniently) in the companion site.

Russian speakers get the blues
11:46 01 May 2007, NewScientist news service
Roxanne Khamsi
The language you speak can affect how you see the world, a new study of colour perception indicates. Native speakers of Russian – which lacks a single word for "blue" – discriminated between light and dark blues differently from their English-speaking counterparts, researchers found.
The Russian language makes an obligatory distinction between light blue, pronounced "goluboy", and dark blue, pronounced "siniy". Jonathan Winawer at MIT in the US and colleagues set out to determine whether this linguistic distinction influences colour perception.
The team recruited 50 people from the Boston area in Massachusetts, US, roughly half of whom were native Russian speakers.
Volunteers viewed three blue squares on a screen and had to indicate by pushing a button whether the single square on top matched the bottom right or bottom left square in terms of hue. In total there were 20 different shades of blue.

Subjects had to pick which one of the two bottom squares [not reproduced here] matched the colour of the top square.

Subjects completed two types of tests: in one version, the three squares were of a similar shade, whereas the other test involved one square that was a markedly different shade - for example, distinguishing a dark blue from a light blue.
English speakers were no better at distinguishing between dark and light blues than they were at telling apart two blues of a similar shade.
Russian speakers, by comparison, were 10% faster at distinguishing between light (goluboy) blues and dark (siniy) blues than at discriminating between blues within the same shade category.
"This is the first time that evidence has been offered to show cross-linguistic differences in colour perception in an objective task," says Winawer.
Moreover, when Russian speakers had to memorise an eight-digit number while doing the colour task, they were no better at distinguishing between dark and light blues than between blues of a similar shade.
Winawer believes that this is because the concentration needed to memorise the number interfered with their verbal brainpower – removing the extra boost that the Russian language gives in classifying light and dark blues.

Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0701644104)

Source: NewScientist

Wednesday, May 02, 2007

Cordoning off people

In Baghdad the US military has begun to implement a policy of enclosing neighborhoods behind high walls, with checkpoints to control entrances and exits. Ostensibly, the British successfully applied this system to defeat the insurgency in Malaya. Yet when applied in Vietnam it did not work.

These walls, separating small groups within a larger whole, are different from large-scale international walls, beginning with the Great Wall of China.

It is worth recalling the origins of such enclaves. They are not pleasant to contemplate. What we are concerned with is human polderization. A polder is a low-lying tract of land that forms an artificial hydrological entity, enclosed by dikes. The term stems from the Dutch, and indeed the best known examples are found in the Netherlands, where agriculturally useful land has been wrested from the sea by such methods.

During World War II Hitler’s theorists applied the term "human polderization" (Polderisierung) to the territories being conquered in the East, especially the Ukraine. These vast territories were to be controlled by walled cities inhabited by ethnic Germans. Superhighways, again restricted to the Germans, linked the enclaves. In the interstices, the "polders" that resulted from this slicing and dicing, the native population would live, relegated to a status of helotry.

This seems to be what the Israelis have in mind for the Arabs in the West Bank. Settlement proceeds apace there, with special roads for Israeli citizens only. Checkpoints restrict the movement of the indigenous population, which is essentially confined to a series of Bantustans. Perhaps the appropriate term for them is human polders.

Recently a group of German parliamentarians returned from a visit to the state of Israel. Some of them compared the policies currently being implemented by the Israeli government in the Occupied Territories to measures taken in the Third Reich. A scandal ensued. Yet was the perception of those parliamentarians entirely wrong?


Tuesday, May 01, 2007


A spoonerism occurs when two familiar terms in a sentence are mutually transposed or switched. The effect is generally comical, but sometimes more profound. The reference is to the English don, the Rev. William Archibald Spooner (1844-1930), warden of New College, Oxford.

Because Spooner’s slips and quips were achieved orally—he did not commit them to writing—we are dependent on the reports of others. As a result some ascriptions are clearly apocryphal.
With its ramblings, the following gem seems authentic: "Poor soul, very sad; her late husband you know, a very sad death—eaten by missionaries—poor soul."

Still, I prefer the snappy quality of this variation on the words of St. Paul: "In a dark, glassly." This gnomic saying is worth pondering. Was Spooner anticipating night goggles, whose vitreous lenses enable one to see at night? Or is it just an observation about the strain of trying to see in the dark, an effort making one’s eyes glaze over, so to speak? But maybe none of these is correct. Anyone who feels the onset of cataracts knows what he meant.

Many spoonerisms originate as slips of the tongue. Most of these lack profundity, as in "lack of pies" (for "pack of lies").

Some spoonerisms cut deeper, though. Anthony Burgess’ variation on a Dickens title—"A Sale of Two Titties"--seems merely comical—until one remembers that toughs among the Revolutionary mobs would lop off portions of the bodies of the guillotined aristocrats, using them as ornaments.

Oscar Wilde remarked that "work is the curse of the drinking classes." The idea that those who drink should form a whole social class is intriguing—if implausible.

Curiously (for a Christian) T. S. Eliot chose to invert a precept of St. Paul, saying "The spirit killeth, but the letter giveth life." I suppose the idea is that gassy generalities are unhelpful, while precision is always welcome.
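The word-level transposition behind sayings like Eliot's can be sketched in a few lines of Python (a hypothetical helper, purely for illustration; the function name and example are my own):

```python
def spoonerize(sentence, a, b):
    """Swap the two words a and b in a sentence: the mutual
    transposition that defines the spoonerisms discussed here."""
    words = sentence.split()
    i, j = words.index(a), words.index(b)
    words[i], words[j] = words[j], words[i]
    return " ".join(words)

# St. Paul's precept becomes Eliot's inversion:
print(spoonerize("the letter killeth but the spirit giveth life",
                 "letter", "spirit"))
# → the spirit killeth but the letter giveth life
```

Spoonerisms that swap sounds rather than whole words ("lack of pies") would of course need a different treatment.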

Here are some that I thought of:

The motto of a supremely conceited person would be "Nothing alien to me is human."

"Politics is an extension of war by other means." In fact, this assertion follows logically from Clausewitz' original observation, for if war is an extension of politics, it follows that they are really the same thing and each is an extension of the other.

"I have sworn on the altar of tyranny, eternal hostility against every form of God over the mind of man." This could have been Stalin’s motto (see previous posting).

"Up to now the philosophers have merely changed reality. The important thing is to understand it." This is a variation on Marx’s eleventh thesis on Feuerbach. The idea is that most philosophical systems subject reality to a Procrustean bed in order to make it fit their schemes. Representing reality accurately, perhaps for the first time, represents a much greater challenge.

"Whatever doesn’t make me stronger kills me." This precept riffs off a Nietzschean tag currently popular among young people. The idea envisaged by the revised version is this. We are all, as Blaise Pascal remarked, under a sentence of death, with but a temporary reprieve. Making us stronger extends that reprieve; without this buttressing we resume our downward glide path to extinction.