Wednesday, September 30, 2009

Michel and me

It was a front-page review in the Times Literary Supplement (July 2, 1970) of “L’Archéologie du savoir” that first alerted me to the importance of Michel Foucault. Almost forty years ago! (Où sont les neiges d'antan? etc.) The anonymous piece makes much of Foucault’s extravagant claim that “man is only a recent invention,” probably soon to disappear. Nonetheless, I immediately asked a friend of mine in Paris to get the book for me. After it arrived, I realized that the volume was a kind of sequel to a celebrated predecessor, the 1966 “Les mots et les choses: une archéologie des sciences humaines.” The two books need to be read together.

In fact, Foucault began his literary career with a ponderous tome of almost 600 pages, Folie et déraison: Histoire de la folie à l'âge classique (1961), the printed version of his French doctoral thesis. For a long time this work circulated in English in the guise of a poor abridgment. Finally, in 2006, Routledge issued a full English translation entitled The History of Madness.

At first glance one may ask: how can one write a history of madness? That Foucault approached the topic of mental illness from this standpoint was an early indication of his capacity to reformulate conventional ideas in challenging ways. However, the theme was in the air; one need only think of the contemporary work of R. D. Laing and Thomas Szasz.

Foucault begins his account in the Middle Ages, with the social and physical exclusion of lepers. He argues that with the gradual disappearance of leprosy, madness came to occupy the empty niche, so to speak, that had been vacated. The 15th-century motif of the ship of fools is a literary version of one such exclusionary practice, that of sending mad people away in boats. In 17th-century Europe, in a movement which Foucault famously describes as the Great Confinement, "unreasonable" individuals were locked away and institutionalized. Continuing the story, Foucault explains that in the 18th century, madness came to be perceived as the reverse of Reason (that is, déraison) and, finally, in the 19th century as mental illness.

Foucault also argues that madness was silenced by Reason, losing its power to signify the limits of social order and to point to the truth. He examines the rise of scientific and "humanitarian" treatments of the insane, notably at the hands of the alienists Philippe Pinel and Samuel Tuke, who he suggests began the conceptualization of madness as “mental illness.” He maintains that these interventions, ostensibly a humanistic mitigation of the earlier harsh methods, were in fact no less controlling than previous practices. Pinel's treatment of the mad amounted to an extended aversion therapy, including such drastic measures as freezing showers and the use of the straitjacket.

This account is probably the first surfacing of a recurrent stratagem on the part of Foucault: things are not what you think, but rather the opposite. This is the device known to historians as Revisionism.

Because of its length and seeming abstruseness, the doctoral tome attracted little attention at the time. Five years later, though, matters took a dramatic turn with Foucault’s next book, which created a sensation in France.

Foucault’s Les Mots et les choses: une archéologie des sciences humaines, noted just above, was published in 1966. It was translated into English and published by Pantheon Books in 1970 under the title The Order of Things: An Archaeology of the Human Sciences. Somewhat startlingly, the book opens with an extended analysis of Diego Velázquez's painting Las Meninas (Museo del Prado) and its complex manipulation of sight-lines, hiddenness, and appearance. This piece had originally been a separate study. However, Foucault’s editor perceptively suggested that it could serve as an overture to the monograph. The book’s central thesis proposes that every period of history has possessed specific underlying conditions of truth that demarcated what could count as acceptable discourse--scientific discourse, for example. Foucault argues that these conditions of discourse have changed over time, in major and relatively sudden shifts, from one period's "episteme" to another. It is evident that the term episteme is simply a repackaging of Hegel’s concept of the Zeitgeist or Spirit of the Age. In recent centuries, according to Foucault, the Renaissance episteme yielded to the Classic episteme, and that in turn to the modern episteme. This terminology indicates a recurrent problem with Foucault--his preference for provincial French concepts not readily translatable into other languages. What he terms the “classical era” is what is known in most European countries as the Baroque.

L’archéologie du savoir (1969) is essentially a sequel to Les Mots et les choses. In this book Foucault makes reference to Anglo-American analytical philosophy, particularly speech-act theory. Seeming to address the criticism of Francophone parochialism, Foucault shows a certain willingness to break out of the constraints of national boundaries.

In the new book, Foucault directs his analysis toward the "statement" (énoncé), the basic unit of discourse. The iconic notion of “discourse,” how we talk about things instead of the things themselves, was to become a hallmark of Foucaultian research and speculation, and of structuralism in general.

A few years later Foucault’s interest took a new turn, one which was in keeping with the sexual revolution and also a new wave of books on the subject, such as Jonathan Ned Katz’s “Gay American History” (1976), a massive collection of documents; and Sir Kenneth Dover’s “Greek Homosexuality” (1978). At the same time Foucault’s own homosexual orientation became generally known.

La volonté de savoir (1976) bears the supertitle “Histoire de la sexualité.” This introductory volume, a kind of big pamphlet, was to inaugurate a series of six, to be entitled: 2) La chair et le corps; 3) La croisade des enfants; 4) La femme, la mère et l’hystérique; 5) Les pervers; 6) Population et races. Even though some of these themes recur in Foucault’s later work, as far as I can see none of the five promised sequels ever appeared.

What we have instead is a different set that conscripts the 1976 volume to serve as the prologue to something quite different. The volumes are individually titled The Will to Knowledge (Histoire de la sexualité, 1: la volonté de savoir--which we have already noted), The Use of Pleasure (Histoire de la sexualité, II: l'usage des plaisirs), and The Care of the Self (Histoire de la sexualité, III: le souci de soi). The last two appeared as a pair in 1984, the year of Foucault’s death.

Ignoring the two major contributions of 1984, most readers restrict their attention to the first, programmatic volume, which focuses primarily on the last two centuries. In this rapidly sketched account, the functioning of sexuality as an analytics of power is traced in terms of the rise of a science of sexuality (scientia sexualis) and the emergence of “biopower” in the West. In this volume he questions the repressive hypothesis, the now-conventional belief that we have, particularly since the 19th century, "repressed" our natural sexual drives. This repression meme was of course widely held at the height of the Sexual Revolution. Foucault seeks to show that in its heyday what we think of as sexual "repression" did not function as a way to dismiss the subject. Instead, the accompanying discourse, the incessant chatter if you will, actually elevated sexuality to a new status. It became recognized as a core element--perhaps THE core element--of our identity.

In Volume One, Foucault posits a watershed in human history, between the Counter-Reformation and the Industrial Revolution, when the Catholic church and the state sought to control people's sexuality for the stability of the church and the benefit of the economy, respectively. He points to a realignment of the Vatican's views on sexuality during this period as an attempt to make people feel the need to attend confession more often, thus increasing the church's dominion.

Characteristically, Foucault shows little interest in contemporary Protestant Europe. He detects a shift in focus by the French government from viewing citizens as "subjects" to "a population,” a quasiscientific concept that could be manipulated according to the needs of the economy. He claims that his francocentric analysis was emblematic of a process that spread across Europe in the wake of the Industrial Revolution. However, since the Industrial Revolution began in England, would it not be better to start with that country?

As I have noted, the original plan for completing the series effectively vanished. The two later volumes, The Use of Pleasure (Histoire de la sexualité, II: l'usage des plaisirs) and The Care of the Self (Histoire de la sexualité, III: le souci de soi), deal with attitudes towards sexual behavior in Greek and Roman antiquity. Both appeared in 1984, the second volume being translated in 1985, and the third in 1986.

After the first volume came out in 1976, Foucault's thinking had indeed shifted. He extended his analysis of government to its "wider sense of techniques and procedures designed to direct the behavior of men," which involved a new consideration of the "examination of conscience" and confession in early Christian literature. These themes began to loom large in Foucault's work alongside his study of Greek and Roman literature, perhaps not unreasonably, because Christian moralism stands between us and the ancient Greco-Roman world.
Secularists, who idolize Greece and Rome, wish that matters were otherwise; but this is the way things are.

Foucault's death from AIDS left his undertaking incomplete, and the planned fourth volume of his History of Sexuality, the one on Christianity, was never published. That fourth volume was to be entitled Confessions of the Flesh (Les aveux de la chair). In fact the text was almost complete before Foucault's death intervened. The manuscript lingers in the limbo of the Foucault archive; under restrictions established in Foucault’s will, it cannot be published.

We must now return to the first volume of Foucault’s sexuality series. A particular passage in this work turned out to have an amazing career in the field of gay scholarship. Beginning in the late 1970s and 1980s a kind of prairie fire swept the field, attracting many adherents, some academic, others not. This was the Social Construction (SC) approach, about which I have written elsewhere. Few could object to its moderate form, namely that individual human behavior is strongly conditioned by the social context, and that scholars must avoid the temptation of anachronism.

However, many SC scholars embraced a more controversial notion. In a nutshell they held there was no homosexuality prior to its “invention” in Western Europe about 1870. Of course these savants did not deny that same-sex behavior had been occurring since time immemorial: how could they? Yet they maintained that only in the second half of the 19th century did a sense of homosexual identity emerge, THE HOMOSEXUAL in short. Thus was born the thesis of the “modern homosexual,” a somewhat redundant expression, because adepts insisted that there was no other. A few scholars, especially those involved with data from Britain, pushed the origins of the complex further back, to about 1700. Common to both schools, however, was the notion that at some specific point in time, “the homosexual” was i n v e n t e d.

This ploy drew upon several models. In one much-contested view, Denis de Rougemont and other historians had asserted that there was no such thing as romantic love until the 12th century. For his part, Philippe Ariès claimed that childhood was invented only after the close of the Middle Ages. At this point I am not concerned with the historical accuracy (or not) of these proposals, but simply with their (generally unacknowledged) role as precedents.

The Social Constructionist students of same-sex behavior found their charter in a short section of the first volume of Foucault’s series. Since few of these scholars, who tend to be anglophone monoglots, read French easily, their interpretation relies on the faulty 1978 English-language translation. In order to show the distortions that have crept in as a result of this second-hand sourcing, it is necessary first to turn to the French text.

“La sodomie--celle des anciens droits civil ou canonique--était un type d’actes interdits; leur auteur n’en était que le sujet juridique. L’homosexuel du XIXe siècle est devenu un personnage: un passé, une histoire et une enfance, un caractère, une forme de vie; une morphologie aussi avec une anatomie indiscrète et peut-être une physiologie mystérieuse” (top of page 59). Shorn of its rhetorical flights, the import of the passage is clear. Foucault contrasts two historical types. Under the first regime, same-sex behavior is understood in terms of acts; under the second--when the homosexual person appears--it is understood in terms of actors.

Subsequently, some observers have pointed out that the contrast is not so stark, because after all the “sodomite” was a person too. At all events, except for the literal rendering “personage” (when “person” would be better) Robert Hurley’s translation of this first passage is acceptable and need not detain us further.

Not so the conclusion of the same paragraph, at the bottom of page 59 of the French edition, where we read the now-famous words: “Le sodomite était un relaps, l’homosexuel est maintenant une espèce.” Hurley: “The sodomite had been a temporary aberration; the homosexual was now a species.” This rendering is not acceptable. The sodomite was not an “aberration,” a term that belongs to a much later context, but a “backslider.” Worse is the choice of the word “species” for espèce. In ordinary French the latter word is generally used to signify sort or kind, as in common expressions such as “espèce de snob.” It is true that in certain contexts the term can mean species. Yet the fact that Foucault is not using it in this sense is shown by the extraordinary catalogue of “espèces” amassed in the following paragraph, which includes zoophiles, auto-monosexualists, mixoscopophiles, gynecomasts, presbyophiles, sexoesthetic inverts, and dyspareunist women. These terms are outlandish, and that is the effect that is intended. Still, they are subcategories that can be named, or were once thought to have this quality. At best they qualify as sorts or kinds, not separate species.

Let us proceed a bit further. If we were to grant, for the purposes of argument, that these oddities are in fact separate species, why have the Social Constructionists not turned their attention dutifully to writing the history of each one? As constituents of the reorganization of knowledge that took place around 1870 surely it would be important to do so instead of ignoring them.

In fact, these latter “espèces” are but remnants from the Old Curiosity Shoppe of sex research. That is decidedly not true of homosexuals, who have always existed in some sense or another. As we have noted, the sodomite, supposedly so radically different, was a “personnage” also. During the European Middle Ages, and at other times and places, there were more than just same-sex acts, there were actors as well.

In fact changes in ways of thinking about human behavior rarely take place with the suddenness that SC advocates claim. Establishing such neat and clean caesurae may be an effective classroom technique, but it does not meet the challenge of the reality principle. That this is so is shown by the fact that there are two SC schools: the majority, the Bolsheviks if you will, accept a date of about 1870 (the term “homosexual” itself was introduced in 1869), while the Mensheviks, the minority, propose 1700. This divergence in chronology signals that something must be wrong.

In fact the great sea change in attitudes to sexuality in the Western world took place much earlier, during late antiquity. At that point, Christianity, taking its cue in part from certain pagan ascetic thinkers, forced society into its torture chamber of sexual tabooing, a dungeon from which we have only recently begun to emerge.

Foucault refrained from taking an explicit stand on the Social Construction question. However, he chose to ally himself with John Boswell, SC’s most prominent opponent. So the appeal to Foucault as the “godfather” of SC seems to be mistaken.

Today we do not hear much about Social Construction in the field of sex research. However, its unfortunate heritage lives on in "son of SC," that is, Queer Theory, a protean topic to which I will return.


Sunday, September 27, 2009

Coming out in middle school

The feature story of this week’s New York Times Magazine (September 27) is entitled “Coming Out in Middle School.” It is by Benoit Denizet-Lewis, a respected gay journalist who is on the staff of the magazine.

The writer remarks, “[t]hough most adolescents who come out do so in high school, sex researchers and counselors say that middle-school students are increasingly coming out to friends or family or to an adult in school. Just how they’re faring in a world that wasn’t expecting them — and that isn’t so sure a 12-year-old can know if he’s gay — is a complicated question that defies simple geographical explanations. Though gay kids in the South and in rural areas tend to have a harder time than those on the coasts, I met gay youth who were doing well in socially conservative areas like Tulsa and others in progressive cities who were afraid to come out.”

Many young men in this situation find support from their parents, particularly the mothers. Peer groups can also be helpful, especially it seems those made up of young women, many of whom regard themselves as bi. There are also support groups that organize dances and other get-togethers.

This development is particularly notable in view of the widespread bullying that has been rife in these schools. It is almost as if the coming out was a kind of act of defiance. It may be protective as well, because of the support that these young people are getting. They no longer have to face the dangers alone.

“As a response to anti-gay bullying and harassment,” Denizet-Lewis notes, “at least 120 middle schools across the country have formed gay-straight alliance (G.S.A.) groups, where gay and lesbian students — and their straight peers — meet to brainstorm strategies for making their campus safer. Other schools are letting students be part of the national Day of Silence each April (participants take a vow of silence for a day to symbolize the silencing effect of anti-gay harassment), which last year was held in memory of Lawrence King, a 15-year-old gay junior-high student in Oxnard, Calif., who was shot and killed at school by a 14-year-old classmate.”

Denizet-Lewis says that many of the young gay men he interviewed “seemed less burdened with shame and self-loathing than their older gay peers. What had changed? Not only were there increasingly accurate and positive portrayals of gays and lesbians in popular culture, but most teenagers were by then regular Internet users. Going online broke through the isolation that had been a hallmark of being young and gay, and it allowed gay teenagers to find information to refute what their families or churches sometimes still told them — namely, that they would never find happiness and love.”

For understandable reasons, the article shies away from the following question. Are these young--some very young--men having sex or not? It seems that for some individuals it is enough to establish their status (orientation)--to know who they are and let others know. Yet others are apparently engaging in sexual activity. Assuming that there is a positive family and societal response to begin with, few eyebrows seem to be raised if the ages of the two partners are close. One 13-year-old profiled by Denizet-Lewis has a boyfriend who is 14. The connection may have a sexual component or it may not. But in any case, sex or no sex, if the “boyfriend” were 24 instead, there would be universal cries of horror.

Evidently, these coeval relationships fall into what is sometimes called the Romeo-and-Juliet exception. Still, the activity is illegal under current age of consent laws, at least in most states.

It has occurred to me that in the long run this teenage coming-out phenomenon may be the equivalent of medical marijuana. That is, it may serve as a way to acknowledge the undeniable fact that young people--including gay teenagers--are sexual beings, and are likely to act on these feelings. Still, in view of the widespread horror about sex with youngsters in other contexts, the outcome is uncertain, even though the prospects are potentially interesting.

At all events we are confronted with a paradox. The television series “To Catch a Predator” continues to draw large audiences, showing the entrapment procedures deployed against adult males who would have sex with thirteen-year-olds. To me, these programs are salacious and disgusting, because the individuals are enticed into committing a crime--which in fact they haven't committed, since once they enter the stake-out house they are humiliated and charged before anything sexual can occur. Where is the ACLU when we need it?

Mine seems to be distinctly a minority view. Apparently most viewers feel satisfaction at the witch-hunt therein depicted. "These monsters of depravity are getting what they deserve." That is one mindset. Yet it seems to coexist uneasily with another, which evidences growing toleration when actual thirteen-year-olds openly proclaim their gay orientation and proceed to date other teenagers. There is something here that doesn’t compute.

I am not a boy lover (despite the malicious slanders that occasionally surface about me on the Internet). But it does seem that there is a puzzle here, one that may, just possibly, cause some changes in our view about sex and the age of consent.

I am not legislating such changes. In fact I am fairly agnostic: I have no particular investment in attitudes shifting on these issues. I am just observing.

For some the change would be a real boon; for others, though, maybe not. So, as I say, I am not prescribing--just attempting to peer into a very cloudy crystal ball.

One thing is sure. Attitudes may change, but the law will only follow long after.

UPDATE (Oct. 1). As if on cue, an old case has surfaced in which an adult had nonconsensual sex with a 13-year-old girl. The man of course is the film director Roman Polanski, who has been arrested in Switzerland. What Polanski did in Los Angeles was clearly a crime, and he should have been made to do his time. Why did it take so long to catch up to him? This aspect suggests that class differences in the administration of justice played a role. There is one rule for the rich and famous, another for the rest of us.

If anyone doubts that Polanski's charmed life as a fugitive from justice is an instance of class privilege, just look at those poor slobs on "To Catch a Predator." They haven't even done anything sexual, but they are arrested and prosecuted anyway. What would happen if Harvey Weinstein had shown up at Entrapment Villa? That is just a rhetorical question.

And there is another difference as well. If we heard of a Catholic priest who had forced himself on a 13-year old and then gotten away abroad, where he prospered, even receiving awards from Catholic groups, we would be appalled. But someone who is admired by the politically correct in Hollywood gets different treatment.

Still, the case is hard to understand completely--and this I suppose is emblematic of the whole matter of teenage sex. Consent, we generally agree, is essential. Obviously (I would think) an 8-year-old cannot give informed consent. At what age, though, can one be deemed to do so? Maybe there is a gray area between, say, 14 and 18 where the matter has to be judged case by case. However, this murky concept just will not do where the law is concerned.

One thing seems certain. Almost anyone who attempts to grapple with these issues will end up with egg on his face--or worse. That includes yours truly.



Saturday, September 26, 2009

Calling all Sandelistas!

Many years ago, because classrooms were overcrowded, the Sorbonne in Paris introduced the practice of broadcasting some popular courses over the radio waves. I have long felt that our most prestigious universities must begin systematically to offer their best courses on television. The classes would be free--though not the course credit. For that you would still have to enroll and do the papers and exams. So why is there a problem?

In fact things seem to be moving in that direction. Lectures given at Duke and Stanford are available on iTunes U. Some years ago M.I.T. developed its own software to make classes available. Or so I am told; I haven’t sampled these offerings personally.

I am tempted by a new initiative from Harvard, as reported by Patricia Cohen in the Arts Section of the New York Times for September 29. Over the years Professor Michael J. Sandel has attracted some 14,000 students to his wildly popular course commonly known as “Justice.”

“Now Mr. Sandal [sic] gets to play himself on television, not to mention online, as Harvard and public television stations across the country allow viewers to sit in on his classroom discussions about Wall Street bonuses and Aristotle, same-sex marriage and Kant, for the next 12 weeks.

“But what is new about Harvard’s venture, more than five years in the making, is that it is the first time that public broadcasters can remember a regular college course’s being presented on television. What’s more, it is also a highly produced multimedia event, with high-definition video, interactive Webcasts, podcasts, a new book and a speaking tour.”

In short, the telecasts will lure viewers with high production values and spinoffs, instead of being shunted off to watch a dreary "video that looks as if it were made with a convenience-store security camera, as most Internet courses do, without the slides, syllabus, and other materials available to actual students.” (Sandel)

It is somewhat disappointing to learn that the lectures are not being televised live. In fact they were taped in 2005 and 2006 and first used for Harvard’s Extension School and for alumni. In 2007 WGBH, the Boston public broadcaster, joined forces with Harvard.

The station secured a grant from POM Wonderful, the juice company, to put the course on the air. Then Professor Sandel raised the rest of the money — about $600,000 in all — much of it coming from former students, Sandelistas if you will. Each 50-minute class has been edited down to 30 minutes; two appear in each television episode.

Patricia Cohen offers the following sample problem. “Would you switch a runaway trolley from one track to another if it meant killing one person instead of five? Would it be just as moral to push a person in front of the speeding trolley to stop it and save the five? What about a surgeon killing one healthy person and using his organs so that five people who needed organ transplants could live? Is that moral? Why not?”

The example given seems typical of the hypothetical dilemmas that abound in popular classes and newspaper columns on ethics. To his credit, Sandel tries to use real examples as reported in the media. However, they still tend to be atypical, and therefore are not representative of the issues that call for decisions in everyday life.

I am a bit skeptical, as is my wont. But others are euphoric. One observer thinks that if Sandel can keep going before the national media “he might do for us what Socrates did for the ancient Greeks. He might succeed in making moral reflection a public endeavor, not a solitary activity. . . . He and his students (disciples?) might shame our politicians into doing the right thing more often.” This comment sounds a bit like a goo-goo wet dream. I should explain that "goo-goo" is not a sexual term, but a sobriquet for those earnest folks who think that “good government”--as they define it--will solve all our problems.

According to Sandel, the issue that arouses the strongest feelings among the students is affirmative action. I gather that the professor is for it; many students are dubious, in part because they worked so hard to get into Harvard.

Sandel has made telling criticisms of that ubiquitous windbag John Rawls, supposedly the last word in enlightened political theory. To his credit, he is not a doctrinaire “progressive.” Still one wonders if Harvard would choose to showcase someone who is actually critical of affirmative action.

On and off campus, academics lean to the liberal side. This perceived imbalance is one of the reasons why ordinary citizens distrust professors, especially when they seek to offer moral guidance.

At all events, I may not watch the TV series. Instead, as an old Gutenberg type, I can just read the accompanying book, which has drawn praise from both E. J. Dionne, a liberal, and George Will, a conservative. Yet one reader of the book, writing on Amazon, points up some instances of liberal bias--though not so labeled. These reported views are those the reader favors, so perhaps we should take the account with a grain of salt.

“He wants philosophy to be used on economics, not just on matters of abortion and gay marriage. Sandel demonstrates that the growing inequality in the U.S. undermines the solidarity that a democracy requires. Sandel points to the hollowing out of the public realm on which a democratic society depends. As public services decline and decline, as we let our common spaces for all but wealthy Americans deteriorate, we undermine our shared democratic citizenship.”

Also on Amazon, Herbert Gintis provides a more searching critique:

“The most important thing the student learns from this book is that . . . leading a moral life is the highest goal to which we can aspire. I learned moral philosophy in an era dominated by the sort of analytical philosophy according to which moral statements are meaningless utterances, and moral behavior is irrational and constricting. At its best, I was taught that moral principles were an individual's private property, and were about as important as one's musical or artistic taste. For Sandel, morality is not an accoutrement of the genteel life, but is the source of all meaning in life, and he conveys this message to the reader without an ounce of preachiness or self-righteousness.

“In his previous writings, Sandel has been a major critic of John Rawls's theory of justice, which has been the centerpiece of liberal democratic political philosophy for almost forty years. Rawls embraces a Kantian ethic that extends the Categorical Imperative (do unto others ... ) in a way relevant to social policy and political philosophy. According to Rawls, we must erect social institutions using principles that we would individually be willing to accept if we were behind a "veil of ignorance" that prevented us from knowing what position we would hold in the resulting social order. He suggests two major principles. The first is the lexical priority of liberty, meaning that no social order has the right to constrain freedom in the name of some type of social engineering. The second is the principle that society should be organized so that the well-being of least well off is maximized. This leads to a radical egalitarianism in which the question of the justice of the distribution of wealth and income is the major moral issue in society. In particular, it leads to a hyper-individualism in which the moral principles of individuals [are] of no importance in their claim to a "just share" of the material wealth of society, and individuals are worthy of respect whatever they happen to choose as a way of life, provided they leave room for others to pursue their individual goals. Sandel rightly rejects this political philosophy on the grounds that by favoring "rights" over "the good," we necessarily degrade political democracy and republican virtues.

“Sandel's alternative is to embrace a form of virtue ethics according to which the moral is what would be enacted by the virtuous individual, and we can tell what is virtuous by inspecting the character of human nature and the embeddedness of individuals in a close fabric of social life. The virtuous individual will "flourish" through acting in according with his or her highest nature, and immorality is a form of self-destruction brought on through ignorance or laziness.

“The main thing missing from this book is an appreciation for the science [sic] of human morality. Humans make morality in the same sense that they make food, babies, art, music, and war. Sandel does not appear to realize that theories of morality should explain moral behavior, much as linguistics attempts to explain human verbal communication. Philosophers appear to have the idea that the philosophical "experts" have no more reason to study people's actual moral beliefs than physicists have to study folk-physics. This is a serious error, which leads philosophers to seek the "one true theory" from which all moral truths can be deduced. There is no "one true theory." All of the major branches of moral philosophy are represented in the everyday moralizing of people. Obligation, consideration of consequences, a sense of virtue, and even visceral feelings of cleanliness and propriety are all involved in how people make moral choices.

“Because Sandel does not treat moral behavior as worthy of scientific study, he misses one major point about human morality: the strong underlying unity of moral sensibility across all societies and covering most social issues. The motivating force of Sandel's book is moral conflict, either in the form of an individual having to make choices that necessarily involve opting for the lesser evil (for instance, should soldiers kill an innocent shepherd to save the lives of nineteen patriotic soldiers, or should a living fetus be sacrificed to satisfy the preferences of the importuned mother), when in fact most major moral choices concern good versus evil, and what is considered good and evil is pretty much the same the world over. Everywhere, people cherish honesty, loyalty, hard-work, bravery, considerateness, trustworthiness, and charity. Similarly, everywhere people prefer insiders to outsiders, and take pleasure in hurting those who violate personal integrity or social rules. It is these moral values that have made humanity the imposing presence . . . upon the planet, and, if we are to survive into the future, it is these basic moral values, which are universal from small tribes of hunter-gathers to the vast populations of advanced technological society, that will provide the energy for the tasks that lie ahead of us.”

So Herbert Gintis.

As a former, though wayward, student of cultural anthropology, I am a little dubious about "universal moral values." Also, preferring insiders to outsiders does not make for harmonious relations with others. It is a universal that is also a nonuniversal.

The gratuitous assumption that human beings all agree on certain core values reminds me a bit of the vacuous naiveté of a Karen Armstrong--horresco referens.


Thursday, September 24, 2009

Waving the bloody shirt

In American history, the expression "waving the bloody shirt" denotes the demagogic practice of politicians invoking the blood of martyrs or heroes in order to inspire support, deflect criticism, and throw down the gauntlet to opponents.

The phrase originated in the following context. During the period immediately after the Civil War (ca. 1865-1885) ambitious and unscrupulous politicians sought to gain advantage by fostering sectionalist animosities. The phrase implied that the Democratic Party (with its bastion in the defeated states of the South) was responsible for the bloodshed of the war and the assassination of Abraham Lincoln. The Civil War had indeed produced many bloody garments. Some candidates of the Republican Party, as well as a few candidates of other parties rivaling the Democratic Party, deployed this notion to gain election.

Somewhat fancifully, the term "bloody shirt" has been traced back to the aftermath of the murder of the third caliph, Uthman, in 656 CE, when a bloody shirt and some hair alleged to be from his beard were used in what is widely regarded as a cynical ploy to gain support for revenge against opponents. In a more familiar example, the device also appears in the funeral oration scene in William Shakespeare's Julius Caesar, in which Mark Antony brandishes Julius Caesar's blood-stained toga to stir up the emotions of his fellow Romans. Shakespeare was reprising the actual circumstances of Caesar's funeral in 44 BCE, when Antony displayed the toga to the crowd.

In American history a similar literalism marked an incident featuring Benjamin Franklin Butler of Massachusetts (1818-1893). Making a speech on the floor of the U.S. House of Representatives, Butler held up the soiled shirt of a carpetbagger allegedly whipped by the Ku Klux Klan.

The concept of waving the bloody shirt also invites more metaphorical exploitation. Arguably, Michael Moore does this in his new film “Capitalism: A Love Story,” which features scenes of individuals abused by their employers and threatened with loss of their homes. The strategy is familiar: to incite a vivid sense of indignation among supporters and to warn opponents of the possible retribution that lies ahead. Possibly the effect of the film (which I have not yet seen) suffers from the absence of actual blood. Yet the metaphorical use does not require this element.

To skeptics this sort of thing amounts to “Victimology 101.”

Actual blood was present in the incident that may count as the first modern exemplar of the strategy. Joseph Bara (1779-1793) was a young republican soldier during the French Revolution. Trapped by the enemy and ordered to cry "Vive le Roi" ("Long live the King") to save his own life, he supposedly preferred instead to die crying "Vive la République" ("Long live the Republic"). Maximilien Robespierre seized upon the boy's death as a propaganda opportunity. Praising the adolescent at the Convention's tribune, Robespierre claimed that "only the French have thirteen-year-old heroes." He then called for the boy's remains to be transferred to the Panthéon. There is some reason to doubt the details of this story. Nonetheless, the artist Jacques-Louis David selected the incident as the subject of a moving though unfinished painting in which the body of a delicate, androgynous youth lies prone, enveloped by the Revolutionary colors.

Not surprisingly, the motif had cross-over power. Horst Wessel (1907-1930) was a German Nazi activist who was made a posthumous hero of the Nazi movement following his death in 1930, when a Communist shot him over a trivial incident. In early 1929 Wessel had written the lyrics for a new Nazi "fighting song" (Kampflied). This was the anthem later known as "Die Fahne hoch" from its opening line, or simply as the "Horst Wessel Song." The Nazis claimed that the young man also wrote the music, but in fact the tune derives from a World War I song of the German Imperial Navy. After the Nazis came to power in 1933 an elaborate memorial was erected over the grave, and it became the site of annual pilgrimages. In this way Nazi propaganda interwove the legend of Wessel’s “martyr’s death” with the song to create a bloody-shirt motif--which of course lost its cachet with the Nazi defeat in 1945.

American bloody-shirt incidents tend to involve collective victims.

The Haymarket riot was a disturbance that took place on Tuesday, May 4, 1886, at Haymarket Square in Chicago. The event had begun as a rally in support of striking workers. An unknown person threw a bomb at police as they dispersed the public meeting. The bomb blast and ensuing gunfire resulted in the deaths of seven police officers and an unknown number of civilians. In the internationally publicized legal proceedings that followed, eight anarchists were tried for murder. Four were put to death, and one committed suicide in prison.

The interpretation of the Haymarket event was bitterly contested. Popular literature offered the caricature of the “bomb-throwing anarchist.” Yet the left regarded those who were convicted as martyrs. The Haymarket affair is generally considered to have been an important influence on the origin of international May Day observances for workers. Ironically, the most important events of this kind occur in Europe, with the American Labor Day observed on a different date.

In 1992 the site of the incident was designated as a Chicago Landmark. In 1997 the Haymarket Martyrs' Monument in nearby Forest Park was listed on the National Register of Historic Places and as a National Historic Landmark.

The Triangle Shirtwaist Factory fire in New York City on March 25, 1911, was the largest industrial disaster in the history of the city, killing 146 garment workers, who either perished in the flames or jumped to their deaths. As a workplace disaster in New York City the Triangle fire has been surpassed only by 9/11. The disaster led to legislation requiring improved factory safety standards and helped spur the growth of the International Ladies' Garment Workers' Union, which fought for better and safer working conditions for sweatshop workers in that industry. The Triangle Shirtwaist Factory was located inside the Asch Building, now known as the Brown Building of Science. It has been designated as a National Historic Landmark and a New York City landmark.

Progressives and proponents of the labor movement often invoke the Haymarket events and the Triangle fire as bloody-shirt memes intended to indict the perceived evils of untrammeled capitalism.

Secularists and opponents of religion have their own favorite bloody-shirt motifs.

Giordano Bruno (1548-1600) was an Italian philosopher, mathematician, and occultist best known as a proponent of heliocentrism and the infinity of the universe. His cosmological theories went beyond the Copernican model in identifying the sun as just one of an infinite number of independently moving heavenly bodies. In addition to his cosmological writings, Bruno also composed extensive works on the art of memory, an assemblage of mnemonic techniques and principles. In 1600 he was burned at the stake at the Campo de’ Fiori, a market square in Rome, after the Inquisition found him guilty of heresy.

During the 19th and early 20th centuries commentators focusing on his astronomical beliefs hailed him as a martyr for free thought and modern scientific ideas. More recent studies indicate that these assessments are anachronistic. In her brilliant 1964 monograph the British scholar Frances Yates challenged the description of his beliefs as scientific, arguing instead that Bruno was primarily concerned with evolving an occultist or magical view of the universe. His world model melded elements derived from Arab astrology, Neoplatonism, and Renaissance Hermeticism. In addition he believed in metempsychosis (reincarnation), holding that human souls could be reborn in the bodies of animals. Yates’ interpretation is supported not only by Bruno’s extensive writings, but by his place in the broad hermetic tradition that stems largely, but not completely from the legendary Hermes Trismegistus. Giving practical effect to his beliefs, he cast horoscopes and may have engaged in magical procedures, behavior which his opponents found threatening. In short he was feared as a powerful magician.

Despite this evidence, the conventional wisdom clings to the notion that Bruno was a "martyr of science.” This meme pairs him with Galileo Galilei. The traditional view posits that, even though Bruno's theological beliefs were a significant factor in his heresy trial, his Copernicanism and cosmological beliefs played a central role in the outcome. Thus the affair is iconic of the purported relentless Roman Catholic hostility to science. Yet according to the Stanford Encyclopedia of Philosophy, "in 1600 there was no official Catholic position on the Copernican system, and it was certainly not a heresy. When [...] Bruno [...] was burned at the stake as a heretic, it had nothing to do with his writings in support of Copernican cosmology."

Along these lines, secularists and humanists have summoned a whole era before the bar of judgment.

Some historians and others employ the term Dark Ages to designate a period of cultural decline that ostensibly blighted Western Europe between the fall of Rome and the eventual recovery of learning. However, in the 19th century a growing understanding of the accomplishments of the Middle Ages challenged the characterization of the entire period as one of darkness. Such blanket disparagement is clearly nonsense. Some observers have adopted a compromise position, restricting the application of the DA term to the early Middle Ages. However, most scholars today hold that it is best to discard this disparaging label altogether.

The outlines of a concept of a Dark Age have been detected in the work of the Italian scholar Petrarch (Francesco Petrarca), during the 1330s. However, the application was more limited than it later became, having originally been crafted to apply to deteriorating standards in Late Latin literature. In this rather specific context, Petrarch viewed the centuries since the fall of Rome as "dark" compared to the light of classical antiquity. Later historians expanded the notion to refer to the transitional period between Roman times and the High Middle Ages, designating not only the purported decline of Latin literature, but also a dearth of objective contemporary historiography, demographic retrogression, limited building activity, and modest material cultural achievements in general. Some commentators note that it is essentially a matter of perception, for the events of the period seem "dark" to us only because of the paucity of artistic and cultural output, including historical records, when compared with both earlier and later times. Even so, many have found such survivals as the Sutton Hoo treasure, the newly discovered Staffordshire hoard, and the epic poem Beowulf to be of surpassing cultural interest.

Heedless of these cautions, popular culture has glommed onto the DA label as a vehicle for a free-for-all tarring of the entire Middle Ages as a time of backwardness, enhancing its pejorative sting and expanding its scope. It is this broad popular sense that has appealed to some modern secularists and humanists, who are eager to exalt ancient Greece, the Renaissance, and the Enlightenment by arraigning the Middle Ages as a horrid counter-example. In this sense the term Dark Ages is not actually a descriptor but serves as a weapon in a kind of culture war.

Even with regard to the early Middle Ages, the rise of archaeology and other techniques of specialized study in the 20th century has shed much light on the period, offering a more nuanced understanding of its positive developments. Other, more neutral terms of periodization have come to the fore: Late Antiquity, the Early Middle Ages, and the Great Migrations--depending on which aspects of culture are envisaged.

During the 17th and 18th centuries--often called the Age of Enlightenment--religion was commonly perceived as antithetical to reason. As the "Age of Faith," the Middle Ages were caricatured as the polar opposite of contemporary times, rather smugly viewed as the Age of Reason. Even such perceptive thinkers as Immanuel Kant and Voltaire fell under the sway of this dualistic mindset. Yet just as Petrarch, seeing himself as standing on the threshold of a "new age,” was disparaging the centuries leading up to his own time, so too were the Enlightenment writers criticizing the centuries up until their own. These extended well after Petrarch's era (the Trecento), since religious domination and conflict were still common up to the time of the outbreak of the French Revolution. Because of this extension--actually a kind of sleight of hand--events like the condemnations of Bruno and Galileo are often covertly assigned to the Middle Ages--to which of course they do not belong.

In this way a complex conceptual evolution had taken place. Petrarch's original metaphor of light versus dark had been expanded in time, at least implicitly. Even if the pioneering humanists who came after him no longer saw themselves living in a dark age, their times were still not light enough. At least that was the case for 18th-century writers who saw themselves as living in the real Age of Enlightenment, while the period targeted by their own condemnation had been stretched so as to embrace what we now call Early Modern times (aka the Renaissance and Baroque). Moreover, Petrarch's metaphor of darkness, which he used chiefly to deplore what he saw as a lack of secular achievements, was sharpened so as to take on a more explicitly anti-religious connotation. This development foreshadows the contemporary culture wars waged by Dawkins, Harris, Hitchens, and their followers.

In the early 19th century, the Romantic writers sought to reverse the negative assessment of Enlightenment critics. The word "Gothic" had been a term of opprobrium akin to "Vandal" until a few self-confident mid-18th-century English "goths" like Horace Walpole and William Beckford initiated the Gothic Revival in the arts—which for the following Romantic generation began to take on a positive image. To be sure, this image, in reaction to a bleak world dominated by a grim rationalism in which reason trumped emotion, expressed a somewhat overenthusiastic view of the Chivalric Era as a Golden Age. Contrasting with the excesses of the French Revolution, the Middle Ages were viewed in a nostalgic light, as a period of social and ecological harmony and spiritual inspiration. There was also a widespread concern with the environmental and social upheavals and sterile utilitarianism of the emerging industrial revolution. The Romantic view of the earlier centuries suffuses modern-day fairs and festivals celebrating the period with costumes and events. However, the Romantic critique also heralded today’s Green Movement.

In conclusion, one must question whether it is possible today to use the term "Dark Ages" in a neutral way. Some scholars may intend it thus, but confusion is sown among ordinary readers. Moreover, the vast expansion of new knowledge concerning the history and culture of the Early Middle Ages, which 20th-century scholarship has achieved, means that these centuries are no longer dark even in the sense of "unknown to us." Consequently, many academic writers prefer simply to avoid the expression. That is surely the best course.

All the same, disparagement has taken its toll. Films and novels assume as a matter of course that the "Dark Ages” was a time of backwardness. The popular movie Monty Python and the Holy Grail is a humorous example. On the History Channel, the 2007 television show The Dark Ages called the period "600 years of degenerate, godless, inhuman behavior.” (The term “godless” is a curious reversal.)

The public idea of the Middle Ages as a supposed "Dark Age" also surfaces in misconceptions regarding the study of nature during this period. Such contemporary historians of science as David C. Lindberg and Ronald Numbers have targeted the stereotype that the Middle Ages was a "time of ignorance and superstition," the blame for which is to be laid on the Christian Church for allegedly "placing the word of religious authorities over personal experience and rational activity." These scholars emphasize that this view is essentially a caricature. For example, a claim that was first propagated in the 19th century and still lives on in popular culture is the assertion that everyone in the Middle Ages believed that the Earth was flat. Lindberg and Numbers hold that this notion is mistaken: "There was scarcely a Christian scholar of the Middle Ages who did not acknowledge [Earth's] sphericity and even know its approximate circumference.” Ronald Numbers rightly castigates such misconceptions as "the Church prohibited autopsies and dissections during the Middle Ages," "the rise of Christianity killed off ancient science," and "the medieval Christian church suppressed the growth of natural philosophy."

These myths still pass as historical truth. Yet they are not supported by current historical research.


Wednesday, September 23, 2009

Medievalia

My latest post delineating positive features of the European medieval heritage has elicited dissent amongst careful readers of this blog. In forthcoming pieces I will endeavor to flesh out these observations, which some evidently find counterintuitive.

In the meantime, one may consult a blog that is ancillary to this one: Medmod. While I have not added to this roundup lately, it distills the findings of a graduate course I gave two years ago, with particular attention to the way medieval archetypes shaped modern art.

You may reach this cognate site by going to the Profile of this one, or by consulting Medmod.blogspot.com. To read the lectures in sequence, go to the end first.

Tuesday, September 22, 2009

A major cultural debt

The prominent German philosopher Jürgen Habermas, who is an atheist, has acknowledged that "Christianity, and nothing else, is the ultimate foundation of liberty, conscience, human rights, and democracy, the benchmarks of Western civilization. To this we have no other options. We continue to nourish ourselves from this source. Everything else is postmodern chatter."

At first sight this claim seems counterintuitive, if not downright bizarre. Weren’t liberty, human rights, and democracy--to name just three--achieved essentially by OVERCOMING Christianity?

Is Habermas just pulling our leg? I don’t think so, for if there is anything that uniformly characterizes the German thinker’s longwinded books it is the absence of humor. Still, he is a smart guy. Assuming that Habermas has been quoted correctly, what could he possibly have been thinking?

Some distinctions are needed at the outset. I believe that Habermas means what used to be termed Christendom. That is, the reference is not primarily to the faith of the New Testament; in my view, and as I have tried to show in previous postings, that is simply the manifestation of a wayward Jewish sect. Only one factor noted, conscience--especially as a nagging, corrosive presence--can be ascribed to that source, as seen in the writings of Paul. By and large, though, "Primitive Christianity" is not in question: that is why a reference to the Council of Nicaea is unhelpful.

It is also important to exclude, by and large, Eastern Orthodox Christianity. That branch of the faith disembogued in the autocracy of the Third Rome in Moscow.

As I see it, what Habermas is talking about is the achievement of a quarreling lot of peoples clinging to the extreme Western edge of the Eurasian land mass during the period that is still misleadingly termed, by some at least, the Dark Ages.

The first component is demographic. The invading hordes who brought down the crumbling Western Roman Empire triggered a period of complex ethnic negotiation and amalgamation. This process led to the emergence of the competing nation states of France, England, Spain, and the rest. As in ancient Greece and China, political pluralism fostered healthy competition and divergence of views.

In addition to the incipient nation states there was the power of the papacy. The papacy as an agent of democracy? What a grotesque notion! Not so, for during the High Middle Ages the papacy was locked in a battle for supremacy with the Holy Roman Empire, a struggle sometimes known as the Investiture Contest. Most Europeans know about one key episode in this struggle, when Emperor Henry IV, ostensibly the all-powerful ruler of the West, was compelled to go to the castle of Canossa in 1077 to beg forgiveness of Pope Gregory VII. In the end, however, the dispute between the emperor and the pope was a draw--each power was to exist in its own sphere. This is the origin of our modern doctrine of separation of church and state, which has no counterpart in the ancient world--or for that matter in Islam.

Medieval Christianity contributed to democracy in other ways. One was the custom of the monastic orders electing representatives to attend a kind of summit meeting in Rome. From this custom arose the idea of representative government as found in the “estates” or parliaments of various countries. As we know it, representative democracy differs from the participatory democracy of the ancient Greek city states because it is not necessary for the citizens to assemble in person--we send a representative instead.

Gradually these representative institutions became strong enough to challenge the power of the monarch. The parliaments exercised their prerogative by requiring that the king obtain their approval before appropriating money. This tradition survives in our own practice (not always carefully observed) of assigning the details of the budget to the Congress.

The nobility also played a part in resisting the arbitrariness of royal rule. The most famous incident of this kind is Magna Carta, which the barons extracted from King John at Runnymede in 1215. Although the Great Charter originally protected only the rights of the aristocracy, the monarchy had been brought to the edge of a slippery slope that led to fundamental guarantees of rights to all citizens. Of course, traces of the old language still remain, as when Britons speak of “the crown” (that is, the government) and of being an “English subject.” Everyone knows, however, that Queen Elizabeth reigns, but does not rule.

Moreover, from its customary law traditions Western European Christianity evolved a unique concept of the rule of law. To early Germanic rulers was ascribed not the practice of making law but that of finding it. In this “supreme fiction” the law was regarded as a stable, preexisting entity, and not just a kind of silly putty to be reshaped as any powerful person saw fit. By contrast, Roman emperors could modify the law by decree whenever they wished.

These are just a few examples of the way our liberties owe a debt to medieval Christianity. Many others could be cited.

There were also major contributions to literature, the arts, and the natural sciences. Let me cite just one example from each. Rhymed poetry began among the so-called barbarian peoples of early medieval Europe, replacing the quantitative poetry of Greece and Rome. The technique of oil painting emerged about 1300 in Northwest Europe, to be perfected later by the Van Eycks and their successors. About the same time, eyeglasses appeared in Venice. The principle of modifying sight by the intervention of a vitreous medium was the essential forerunner of the microscope and telescope. Without eyeglasses there would be no Galileo and no modern astronomy.

Today, in the view of many observers, Western Europe finds itself adrift in the face of determined adversaries within its gates. Europeans seem to have no core values. In fact they do. But acknowledging these strengths means discarding, once and for all, the historical amnesia that an insouciant secularism has brought in its wake.


Monday, September 21, 2009

Fluctuating reputations: Laski and Joad

Prominent public intellectuals, some of them at least, have the capacity to generate whole climates of opinion. At least they seem to do so, for one could argue contrariwise that intellectuals merely ride the waves instead of launching them. Probably it is something of both.

We have just passed through a Francotropic period. Our latest fin-de-siècle era was graced by such eminentos as Claude Lévi-Strauss and Michel Foucault, Jacques Lacan and Jacques Derrida. Of these, only Lévi-Strauss has enduring value for me, though of course anthropologists and others have singled out certain arbitrary and implausible features of his thinking. Still, C.-L.-S. seems to have a fertility and staying power that the others--mostly played out and found out--now clearly lack. The French themselves seem to acknowledge this distinction, as Lévi-Strauss is the only one of the four to have been admitted into the pantheon of French letters, Gallimard's Pléiade series.

In the mid-20th century we looked not to France but to Britain. Educated people were fascinated by a kind of intellectual tennis foursome--a two-versus-two duel. On the progressive side stood George Bernard Shaw and H. G. Wells. They were opposed by the conservative thinkers G. K. Chesterton and Hilaire Belloc. Today no one sees this “battle of the titans” as having any relevance. Ezra Pound, one of the first doubters, called Shaw an “aesthetic pygmy.” That jibe probably goes too far, and all four can command at least a few readers now.

Matters are very different for two British intellectuals who once stood very high in public esteem, Harold Laski and C. E. M. Joad.

Harold Joseph Laski (1893-1950) was an English political theorist and journalist, who occupied a strategic post at the London School of Economics. Never one to hide his light under a bushel, he turned out a mass of books, and was active on the lecture circuit of American universities. At the age of 23 he brashly initiated a correspondence with the noted US Supreme Court Justice Oliver Wendell Holmes, who was then 75. Their nineteen-year epistolary relationship yielded two volumes of correspondence, published in 1953.

Laski served as a member of the executive committee of the Fabian Society (1922-36), and in 1936 he joined the Executive Committee of the Labour Party. In 1945-46 he was chairman of the Labour Party.

In keeping with his Fabian background, Laski espoused a moderate (some would say revisionist) version of Marxism. Having taught a generation of future Indian leaders, he had a major impact on the politics and formation of India. To his credit, he was steady in his unremitting advocacy of Indian independence. One Indian prime minister exclaimed that “there is a vacant chair at every cabinet meeting in India reserved for the ghost of Professor Harold Laski.”

It has been a different story in Britain and America. Having risen so high, why did Laski fall so low? He had hitched his wagon to a star: the British Labour Party. When it rose to power in 1945 there was widespread hope that the democratic-socialist principles espoused by Labour would be adopted everywhere.

In Britain Laski's reputation suffered through an affair that remains murky. On June 16, 1945, responding to a questioner at a political rally, Laski averred that, if necessary, socialism might have to be achieved by violence. After this off-the-cuff observation was quoted in the Daily Express, Laski angrily denied that he had made the remark, and proceeded to sue the newspaper for libel. Sir Patrick Hastings, the counsel for the defence, took the jury through Laski's writings, arguing that even if he had not uttered the exact words the paper ascribed to him, they were generally consistent with his views. Hastings also implied that there was something "un-English" about being an intellectual, as Laski clearly was. There were distinct anti-Semitic overtones. Harold Laski lost his libel case and was ordered to pay all court costs.

Even so, his reputation remained high in the United States. Yet as Labour and the British model together lost their international cachet, so ultimately did Laski’s fame decline.

In addition he probably wrote too much too fast. George Orwell homed in on a passage from Laski's "Essay in Freedom of Expression" as an example of "especially bad" writing. Overall, his work has an air of assurance--often amounting to smugness--that, in view of the fragility of the policies he was advocating, ultimately came to seem false and self-serving.

Laski’s perpetual earnestness could be wearying. Not so, the roguish C. E. M. Joad.

Cyril Edwin Mitchinson Joad (1891-1963) was an English philosopher and broadcasting personality. In his day he was famous for his appearances on The Brains Trust, an extremely popular BBC Radio wartime discussion program. He ranked as an early media celebrity, until his fame and fortune collapsed in the Train Ticket Scandal of 1948.

Born in Durham and raised in Southampton, he received a very strict Christian upbringing. At Balliol College, Oxford, he honed his skills as a popular philosopher and debater. Swept along by the currents of the time he became a Syndicalist, a Guild Socialist, and then a Fabian.

Joad joined the Board of Trade in 1914, aiming to infuse the civil service with a socialist ethos. In the months leading up to World War I he displayed an ardent pacifism at the risk of unpopularity, a fate which also attended George Bernard Shaw and Bertrand Russell at that time.

After Joad's separation from his wife Mary, he moved to Hampstead in London, where he lived with a student teacher named Marjorie Thomson. She was the first of his many mistresses, all of whom were introduced as “Mrs Joad.” He characterized sexual desire as "a buzzing bluebottle that needed to be swatted promptly before it distracted a man of intellect from higher things."

Imbued with the liberal ideas of Bloomsbury, Joad found that the era in which he lived offered many opportunities to obtain sexual release. In this it contrasted, he believed, with the previous epoch. "Conscience was the barmaid of the Victorian soul. Recognizing that human beings were fallible and that their failings, though regrettable, must be humored, conscience would permit, rather ungraciously perhaps, the indulgence of a number of carefully selected desires." Joad's desires were not so carefully selected.

A misogynist, he held that female minds lacked objectivity, and made it clear that he had no interest in talking to women who would not go to bed with him. By mid-life Joad was "short and rotund, with bright little eyes, round, rosy cheeks, and a stiff, bristly beard." As a kind of test, he elected to wear shabby clothing: anyone who was put off by his slovenliness he deemed too petty to merit acquaintance.

In 1930 he left the Civil Service to become Head of the Department of Philosophy and Psychology at Birkbeck College, University of London, which specialized in educating “nontraditional” students. Joad's two works of popularization, “Guide to Modern Thought” (1933) and “Guide to Philosophy” (1936), made him well known to the educated public at large.

He continued his own nontraditional lifestyle. In 1925 he was expelled from the Fabian Society because of his sexual escapades at its summer school and did not rejoin until 1943.

Joad was also interested in the supernatural and partnered Harry Price on a number of ghost-hunting expeditions. He traveled to the Harz Mountains in Germany to help Price to try to prove that the “Blocksberg Tryst” would turn a male goat into a handsome prince. At home he organized rambles and rode recklessly through the countryside. He also had a passion for hunting.

A tireless lecturer, he grew more and more popular. Ever in quest of the limelight, after the outbreak of World War II (1939) he implored the Ministry of Information to make use of him. His wish was granted, for in January 1940 Joad joined a wartime discussion program called The Brains Trust. At a time of great hunger for information and entertainment, the BBC radio production caught on immediately, attracting millions of listeners. His fund of anecdotes and mild humor stood him in good stead. The program’s topics ranged from the sublime to the ridiculous--from "What is the meaning of life?" to "How can a fly land upside-down on the ceiling?" Ever the popular philosopher, Joad nearly always responded to a question with the catchphrase "It all depends on what you mean by …" Improbably enough, the public generally considered him the greatest British philosopher of the day.

After the War he began to discard his religious skepticism and to turn to the Church of England, as evidenced by his book "The Recovery of Belief." He also started to entertain doubts about socialism. Still, his career was more successful than ever before: he had become a household name. But he had also made many enemies, and they were to have the last laugh.

In his hubris Joad even boasted in print that “I cheat the railway company whenever I can.” In April 1948 he was convicted of travelling on a Waterloo-Exeter train without a valid ticket. Because of his celebrity, this contretemps made front-page headlines in the national newspapers. While the amount at stake was trifling, the fine of £2 effaced all hopes of a peerage and triggered his dismissal from the BBC. This disaster had a serious effect on his health, and he became confined to his bed in Hampstead. His fame and broadcasting career were over.

Joad had had quite a run. He had the temerity to publish a book attacking Americans without having visited the country. He also alleged that the philosophy of logical positivism, which had made his own efforts in the field seem puny and dated, had helped to promote Nazism. In fact, the leading logical positivists had to flee the Continent because of their principled stand against fascism.

During his lifetime, C. E. M. Joad published more than 75 books. He was widely quoted. He may be said to have paved the way for such contemporary popularizing philosophers as Martha Nussbaum and Peter Singer. His journey from Christianity to unbelief and back again to Christianity has been followed by many others.

NOTE. Even today the notion persists that there is something un-English about being an intellectual. It could be argued that, because of the mistaken belief that they have no adequate homegrown exemplars, educated Anglophones gravitate to French gurus by way of compensation. With Emile Zola, many believe, the French actually invented the category. At all events, the notion of the absence of intellectuals on English soil has been refuted in a weighty volume by Stefan Collini, "Absent Minds: Intellectuals in Britain" (Oxford, 2006). This book is a bit of a hard slog, but I recommend it.


Sunday, September 20, 2009

The strange case of Karen Armstrong

Karen Armstrong (b. 1944) is a British scholar who has been widely acclaimed for her eirenic studies of the Abrahamic religions. A former Catholic nun, she improbably maintains that "[a]ll the great traditions are saying the same thing in much the same way, despite their surface differences." Ostensibly they all share a commitment to compassion, as expressed by way of the Golden Rule: “Do not do to others what you would not have done to you.” Her claims of the universality of this precept have been disputed.

Karen Armstrong has published at least twenty-two books. Admired in many quarters for her “inclusive” approach, she has received many awards and honors; for these, see her entry in Wikipedia.

Since 9/11 she has been a popular speaker and writer on the subject of Islam. In 1999 the Muslim Public Affairs Council of Los Angeles gave Armstrong an award for "media fairness.” Without any training in the field, she has been acclaimed for promoting a "more objective" view of Islam to a wide public in Europe and North America. In reality she offers a bland, anodyne image of a humanistic Islam, shorn of the harsh elements of intolerance and violence that are undeniably present in the Qur’an, the hadith traditions, and in the historic expansion of the faith. In devising this construction she starts from the received view, taking no notice of the new scholarship that has cast doubt on much of the traditional wisdom. From this material she selects the components needed to fit her sanitized image of Islam.

The historian Efraim Karsh, head of Mediterranean Studies at King's College London, characterizes Armstrong's biography of Muhammad as "revisionist" and inaccurate. He calls her treatment of the controversial issue of the Banu Qurayza tribe in her book "Muhammad: A Prophet for Our Time" "a travesty of the truth."

In one of her columns in the British newspaper The Guardian, Armstrong argues that in the present situation it is important to understand our adversaries. Fair enough. Then, however, she turns her principle on its head by insisting that Islamic terrorism must not be referred to as such. “Jihad,” we learn, “is a cherished spiritual value that, for most Muslims, has no connection with violence.” How reassuring.

Yet as David Thompson has noted “[t]he Muslims who do commit acts of terrorism do so, by their own account, because of what they perceive as core Islamic teachings. The names they give themselves--jihadist, mujahedin, shahid--have no meaning outside of an Islamic context. But Armstrong would have us ignore what terrorists repeatedly tell us about themselves and their motives.”

Armstrong asserts that “until recently, no Muslim thinker had ever claimed [violent jihad] was a central tenet of Islam.” This statement is staggering. In fact contemporary jihadists draw upon theological traditions reaching back to Muhammad’s own time. In his famous book “The Muqaddimah,” the fourteenth-century historian Ibn Khaldun endorsed the findings of five centuries of prior Sunni theology regarding jihad: “[i]n the Muslim community, the holy war is a religious duty, because of the… mission to convert everybody to Islam either by persuasion or by force … Islam is under obligation to gain power over other nations.” Here Ibn Khaldun simply concurs with the deliverances of the ulama, the religious authorities, for as one law manual states, “Islamic holy war against followers of other religions, such as Jews, is required unless they convert to Islam.”

As a military leader, Muhammad inevitably came to conceive his task as one of expansion by force. In fact Islam is the only major religion that has been spread primarily by conquest. In his book “The Legacy of Jihad,” Andrew Bostom has reviewed the inception and progress of this tendency. First there were the jihad campaigns of religious “cleansing” throughout the Arabian Peninsula. Particularly notable were the five centuries of jihad campaigns in India, during which tens of millions of Hindus and Buddhists were slaughtered or enslaved to further Islamic influence, not to mention similar campaigns in Egypt, Palestine, Syria, Armenia, North Africa, Spain, and the Balkans. All these violent endeavors are thoroughly documented by Muslim sources themselves.

In another one of her Guardian columns, Armstrong claims that, “until the 20th century, anti-Semitism was not part of Islamic culture,” insisting that anti-Semitism ranks as a purely Western invention. To be sure, medieval Christianity was a major incubator of anti-Semitism. However, it also occurred in classical antiquity and throughout the history of Islam, beginning with several dismal episodes recorded in the Qur'an itself.

Here Armstrong is playing to a stereotype rampant among the politically correct. In this view, the evil imperialist West is boundlessly capable of spreading corruption wherever it goes. By contrast, the Third World is inherently virtuous, even though, when one looks more closely, this accolade is gained only at the price of its being depicted as passive, unchanging, and devoid of agency. In short, this purported gain in understanding is reached only by donning the otherwise forbidden glasses of Orientalism.

As I suggested at the outset, Armstrong has never regarded whitewashing Islam as her sole task. She has set herself up as a kind of ultimate authority on religion itself.

On September 12, the Wall Street Journal published a pair of commissioned articles by Karen Armstrong and Richard Dawkins, who were asked to respond independently to the question "Where does evolution leave God?"

At the start of her piece Armstrong concedes that evolution “has indeed dealt a blow to the idea of a benign creator, literally conceived. It tells us that there is no Intelligence controlling the cosmos, and that life itself is the result of a blind process of natural selection, in which innumerable species failed to survive. The fossil record reveals a natural history of pain, death and racial extinction, so if there was a divine plan, it was cruel, callously prodigal and wasteful. Human beings were not the pinnacle of a purposeful creation; like everything else, they evolved by trial and error and God had no direct hand in their making. No wonder so many fundamentalist Christians find their faith shaken to the core.”

Curiously enough, this process may offer gain as well as pain. Armstrong maintains that until recent centuries the Abrahamic faiths had generally understood religion merely as a matter of behavior and feeling. “[M]any of the most influential Jewish, Christian and Muslim thinkers understood that what we call ‘God’ is merely a symbol that points beyond itself to an indescribable transcendence, whose existence cannot be proved but is only intuited by means of spiritual exercises and a compassionate lifestyle that enable us to cultivate new capacities of mind and heart.”

Somehow, she avers, towards the end of the 17th century this marvelous intuitive understanding got lost through an unfortunate overemphasis on “hard fact.” That would be sad--if true. Here she simply expunges centuries of strenuous medieval effort to prove the factual existence of God, beginning with Anselm of Canterbury (ca. 1033-1109), who originated the ontological argument. Surely an ex-nun should know better than to broadcast such fibs. Still, she obstinately holds that only at the end of the 17th century did it become necessary to prove the objective existence of God. Her only witnesses for this bizarre claim seem to be Sir Isaac Newton and the ubiquitous William Paley.

But there was more to come, for in the 19th century “Darwin showed that there could be no proof for God's existence.” This claim is debatable.

Armstrong cannot leave it at that, and so she swings into high gear with one of her infallible pronunciamentos. “Symbolism was essential to premodern religion, because it was only possible to speak about the ultimate reality—God, Tao, Brahman or Nirvana—analogically, since it lay beyond the reach of words. Jews and Christians both developed audaciously innovative and figurative methods of reading the Bible, and every statement of the Qu’ran [sic] is called an ayah ('parable'). St Augustine (354-430), a major authority for both Catholics and Protestants, insisted that if a biblical text contradicted reputable science, it must be interpreted allegorically.” This last assertion is untrue, for like most of his patristic colleagues Augustine required that in every instance the Bible be interpreted literally as well as allegorically. The allegorical reading supplements the literal one; it does not supplant it.

Then comes another dubious generalization. “Most cultures believed that there were two recognized ways of arriving at truth. The Greeks called them mythos and logos. Both were essential and neither was superior to the other; they were not in conflict but complementary, each with its own sphere of competence. Logos ("reason") was the pragmatic mode of thought that enabled us to function effectively in the world and had, therefore, to correspond accurately to external reality. But it could not assuage human grief or find ultimate meaning in life's struggle. For that people turned to mythos, stories that made no pretensions to historical accuracy but should rather be seen as an early form of psychology; if translated into ritual or ethical action, a good myth showed you how to cope with mortality, discover an inner source of strength, and endure pain and sorrow with serenity.” Armstrong speaks of “most cultures,” but she cites only the ancient Greeks. Yet the ancient Egyptians, to note just one example, made no distinction between mythos and logos.

Can she top what she has done so far? Yes, she can, for now comes a true whopper. “In the ancient world, a cosmology was not regarded as factual but was primarily therapeutic; it was recited when people needed an infusion of that mysterious power that had—somehow—brought something out of primal nothingness: at a sickbed, a coronation or during a political crisis. Some cosmologies taught people how to unlock their own creativity, others made them aware of the struggle required to maintain social and political order.” This is nonsense. People studied cosmology because among other things observation of the heavenly bodies taught them when to plant crops.

The value of religion, it seems, is “to help us live creatively with realities for which there are no easy solutions and find an interior haven of peace.” Religion, we learn, is “a kind of art form that, like music or painting, introduces us to a mode of knowledge that is different from the purely rational and which cannot easily be put into words. At its best, it holds us in an attitude of wonder.” Well, I taught the history of art for many years and I am certain that it is not like religion.

Here is how she concludes her recital. “But what of the pain and waste that Darwin unveiled? All the major traditions insist that the faithful meditate on the ubiquitous suffering that is an inescapable part of life; because, if we do not acknowledge this uncomfortable fact, the compassion that lies at the heart of faith is impossible. The almost unbearable spectacle of the myriad species passing painfully into oblivion is not unlike some classic Buddhist meditations on the First Noble Truth (‘Existence is suffering’), the indispensable prerequisite for the transcendent enlightenment that some call Nirvana—and others call God."

Nirvana--that is to say, extinction--is God? Isn’t that what Dawkins has been telling us all along, that God is nonexistent?

In his own Wall Street Journal piece, commenting on views like Armstrong’s, Dawkins pertinently asks: “Where does that leave God? The kindest thing to say is that it leaves him with nothing to do, and no achievements that might attract our praise, our worship or our fear. Evolution is God's redundancy notice, his pink slip. But we have to go further. A complex creative intelligence with nothing to do is not just redundant. A divine designer is all but ruled out by the consideration that he must be at least as complex as the entities he was wheeled out to explain . . .

“Now [Dawkins goes on], there is a certain class of sophisticated modern theologian who will say something like this: ‘Good heavens, of course we are not so naive or simplistic as to care whether God exists. Existence is such a 19th-century preoccupation! It doesn't matter whether God exists in a scientific sense. What matters is whether he exists for you or for me. If God is real for you, who cares whether science has made him redundant? Such arrogance! Such elitism.’

“Well, if that's what floats your canoe, you'll be paddling it up a very lonely creek. The mainstream belief of the world's peoples is very clear. They believe in God, and that means they believe he exists in objective reality, just as surely as the Rock of Gibraltar exists. If sophisticated theologians or postmodern relativists think they are rescuing God from the redundancy scrap-heap by downplaying the importance of existence, they should think again. Tell the congregation of a church or mosque that existence is too vulgar an attribute to fasten onto their God, and they will brand you an atheist.”


Just now Karen Armstrong has published a kind of summation of her views, “The Case for God: What Religion Really Means.” At present this book is available only in England, so I know it only through notices, such as the incisive review by Simon Blackburn.

The review, which appeared in The Guardian on July 4, 2009, bears the heading “All quiet on the God front.” Perhaps it should have been entitled “Nothing will come of nothing.” Here is Blackburn.

“. . . Karen Armstrong takes the reader through a history of religious practice in many different cultures, arguing that in the good old days and purest forms they all come to much the same thing. They use devices of ritual, mystery, drama, dance and meditation in order to enable us better to cope with the vale of tears in which we find ourselves. Religion is therefore properly a matter of a practice, and may be compared with art or music. These are similarly difficult to create, and even to appreciate. But nobody who has managed either would doubt that something valuable has happened in the process. We come out of the art gallery or concert hall enriched and braced, elevated and tranquil, and may even fancy ourselves better people, though the change may or may not be noticed by those around us.
“This is religion as it should be, and, according to Armstrong, as it once was in all the world's best traditions. However, there is a serpent in this paradise, as in others. Or rather, several serpents, but the worst is the folly of intellectualising the practice. This makes it into a matter of belief, argument, and ultimately dogma. It debases religion into a matter of belief in a certain number of propositions, so that if you can recite those sincerely you are an adept, and if you can't you fail. This is Armstrong's principal target. With the scientific triumphs of the 17th century, religion stopped being a practice and started to become a theory--in particular the theory of the divine architect. This is a perversion of anything valuable in religious practice, Armstrong writes, and it is only this perverted view that arouses the scorn of modern "militant" atheists. So Dawkins, Dennett, Hitchens and Harris have chosen a straw man as a target. Real religion is serenely immune to their discovery that it is silly to talk of a divine architect.
“So what should the religious adept actually say by way of expressing his or her faith? Nothing. This is the ‘apophatic’ tradition, in which nothing about God can be put into words. Armstrong firmly recommends silence, having written at least 15 books on the topic. Words such as "God" have to be seen as symbols, not names, but any word falls short of describing what it symbolises, and will always be inadequate, contradictory, metaphorical or allegorical. The mystery at the heart of religious practice is ineffable, unapproachable by reason and by language. Silence is its truest expression. The right kind of silence, of course, not that of the pothead or inebriate. The religious state is exactly that of Alice after hearing the nonsense poem ‘Jabberwocky’: "Somehow it seems to fill my head with ideas--only I don't exactly know what they are." . . .
“Armstrong is not presenting a case for God in the sense most people in our idolatrous world would think of it. The ordinary man or woman in the pew or on the prayer mat probably thinks of God as a kind of large version of themselves with mysterious powers and a rather nasty temper. That is the vice of theory again, and as long as they think like that, ordinary folk are not truly religious, whatever they profess. By contrast, Armstrong promises that her kinds of practice will make us better, wiser, more forgiving, loving, courageous, selfless, hopeful and just. Who can be against that?
“The odd thing is that the book presupposes that such desirable improvements are the same thing as an increase in understanding--only a kind of understanding that has no describable content. It is beyond words, yet is nevertheless to be described in terms of awareness and truth. But why should we accept that? Imagine that I come out of the art gallery or other trance with a beatific smile on my face. I have enjoyed myself, and feel better. Perhaps I give a coin to the beggar I ignored on the way in. Even if I do so, there is no reason to describe the improvement in terms of my having understood anything. If I feel more generous, well and good, but the proof of that pudding is not my beatific smile but how I behave. As Wittgenstein, whose views on religion Armstrong thoroughly endorses, also said, an inner process stands in need of outward criteria. You can feel good without being good, and be good without stretching your understanding beyond words. Her experience of ‘Jabberwocky’ may have improved Alice.
“Silence is just that. It is a kind of lowest common denominator of the human mind. The machine is idling. Which direction it then goes after a period of idling is a highly unpredictable matter. As David Hume put it, in human nature there is ‘some particle of the dove, mixed in with the wolf and the serpent.’ So we can expect that some directions will be better and others worse. And that is what, alas, we always find, with or without the song and dance.”


Saturday, September 19, 2009

ACORN

For some time I have been avidly reading the postings of the independent commentator Glenn Greenwald. I am not alone, for many have been cheered by Greenwald's slaying of sacred cows. He is not a member of the Beltway establishment, perhaps in part because he is forced to live outside the US to be with his Brazilian partner, who is denied the opportunity of residence on these shores.

In his latest post on ACORN, however, Mr. Greenwald has laid an egg (to mix metaphors a bit). As you will recall, two enterprising whistleblowers exposed staffers in two different ACORN offices for offering advice on how to set up a brothel importing teenage Salvadoran prostitutes.

I thought that liberals, especially, were opposed to human trafficking.

At all events, the employees were fired; they were, we were assured, just two bad apples. But were they? At the very least it is evident that ACORN is not just an altruistic organization to help the poor and downtrodden.

Ignoring the moral issue of the exploitation of women, Greenwald essays a relativistic argument. The amount of money received from the federal government--$53 million, he claims--is minuscule in comparison with what we have shelled out to save the financial markets. True enough. But the argument is akin to that of a man stopped for drunken driving who indignantly asks the police why they aren't out solving murders instead. As the French say, comparaison n'est pas raison--a comparison is not an argument.

Then Greenwald advances an ad hominem argument. Since Glenn Beck is spearheading the attack on ACORN, it can't possibly be justified. Yet as Frank Rich's column of September 20 points out, "Even Glenn Beck is right twice a day." Rich does not shrink from comparing Beck with Michael Moore. Both are symptoms of a new populist rage that is welling up among the disaffected and disadvantaged--people who have no reason to agree with Ben Bernanke's claim that the recession is over.

In fact conservatives have a valid reason for distrusting ACORN. The organization was slated to play a major role in the census. Since the group has a well-attested penchant for making up names, there is reason to fear that it would be involved in inflating the numbers of people in liberal districts and thus tilting the process of redistricting, which is based on the census findings.

For a long time liberals have complained that the conservatives who are prominent in the popular media are raucous and one-sided. With Rush Limbaugh and Glenn Beck this is flagrantly so. Yet if they are to gain credibility beyond their usual preaching to the choir, liberals need to be even-handed. (Just being raucous like Keith Olbermann doesn't accomplish the job.) For a long time liberals have not been even-handed, by and large. This lack of good faith was one of the reasons for the liberal collapse in the 1980s.

Some clear-sighted liberals like Sean Wilentz of Princeton can see this. But the temptation to score debater's points and to retreat into self-righteousness is ever present.

UPDATE. In Mother Jones for September 23, Kevin Drum made the following useful points:


"ACORN has filed a lawsuit against James O'Keefe and Hannah Giles, the two undercover filmmakers who taped ACORN workers providing advice about how to smuggle underage sex workers into the country from El Salvador:

"The lawsuit asserts that neither O'Keefe nor Giles obtained consent from ACORN workers for videotaping them, as state law requires.

"ACORN executive director Bertha Lewis told reporters in a conference call that ACORN, the Association of Community Organizations for Reform Now, does not support criminal activity and believes the filmmakers should have obeyed Maryland laws.

"Points for chutzpah, I guess, but this is a bad idea on so many levels it hurts just to think about it. All they're doing is extending the news cycle on this whole debacle, making fools of themselves with transparently petty arguments, and just generally showing less common sense than your average mafia don caught on a 60 Minutes sting. At this point, ACORN needs to take their lumps, finish their internal investigation, and clean up their act. In the meantime, the least they can do is avoid handing the Glenn Beck crowd free additional ammunition. Fair or not, shooting the messenger isn't helping their cause."

UPDATE No. 2 (September 27). Clark Hoyt, public editor of the New York Times, has a tough job. Today he tries to put a favorable spin on the way the Times sought to bury the ACORN scandal. For many days the "newspaper of record" refused to print a word about what most readers had been learning from the Internet, as well as from conservative commentators. As Hoyt now acknowledges, "a video sting had caught Acorn workers counseling a bogus prostitute and pimp on how to set up a brothel staffed by under-age girls, avoid detection and cheat on taxes. The young woman in streetwalker’s clothes and her companion were actually undercover conservative activists with a hidden camera."

He goes on: "But for days, as more videos were posted and government authorities rushed to distance themselves from Acorn, The Times stood still. Its slow reflexes — closely following its slow response to a controversy that forced the resignation of Van Jones, a White House adviser — suggested that it has trouble dealing with stories arising from the polemical world of talk radio, cable television and partisan blogs. Some stories, lacking facts, never catch fire. But others do, and a newspaper like The Times needs to be alert to them or wind up looking clueless or, worse, partisan itself."

Apparently, if the story had never "caught fire" we would never have learned about it from the lordly Times. No wonder the paper's revenues are way down.

Incredibly, NYT editors told Hoyt that they were not immediately aware of the Acorn videos on Fox, YouTube and a new conservative Web site called BigGovernment.com. Moreover, "when the Senate voted to cut off all federal funds to Acorn, there was not a word in the newspaper or on its Web site. When the New York City Council froze all its funding for Acorn and the Brooklyn district attorney opened a criminal investigation, there was still nothing."

A number of readers wrote in to say that they knew why the paper was silent: “protecting the progressive movement.” Surely they are correct.

"Finally, on Sept. 16, nearly a week after the first video was posted, The Times took note of the controversy, under the headline, “Conservatives Draw Blood From Acorn, Favored Foe.” The article said that conservatives hoped to weaken the Obama administration by attacking its allies and appointees they viewed as leftist. The conservatives thought they had a “winning formula,” the article said, mobilizing people “to dig up dirt,” then trumpeting it on talk radio and television."

In other words the only thing newsworthy about the event was that conservatives were exploiting it. But of course they wouldn't have been able to do so if ACORN were clean. And if the Times had been honest the repercussions might have been less. As one reader wrote “A suspicious person might see an attempt to deflect criticism of Acorn by highlighting how those pesky conservatives are at it again." A "suspicious person"? Surely anyone with any sense can see this bias, however much Hoyt tries to spin it.

Hoyt is still trying to deny what most observers think is the case, namely that the Times spiked a story of ACORN corruption last year in order to help the Obama campaign. And of course there is no question at all about the false story the Times posted last year claiming that John McCain had an extramarital affair.

Bloggers are gaining ground, and the mainstream media, headed by the New York Times, is losing it. That in my view is a good thing.


Friday, September 18, 2009

The return of John Maynard Keynes

We are all in the same boat these days--either coping with reduced income or dreading the prospect of same. My income has dwindled somewhat, but the change is bearable, at least so far. Yet I am starting to get worried by the way the euro is trouncing the dollar.

In these days of uncertainty, there has been much casting about for a new (or old) orientation in the realm of economic theory. That is understandable. A feature of this quest is a certain revival of the ideas of Lord Keynes. I must aver that this development makes me a bit nervous, as Keynes has been drafted into service as an excuse for a number of dubious economic policies in the past.

Reduced to a bumper sticker--or very near--Keynes’ economic theory amounts to a reprise of the counsel Joseph gave to Pharaoh in the Hebrew Bible: stock up during the fat years, and then generously dole out the surplus during the lean years. Too often, it seems to me, modern planners have adhered only to the second part. There has been hardly any stockpiling during the fat years.

James M. Buchanan, an economist critical of Keynes, has observed that the Keynesian doctrine of deficit spending provided the academic excuse for elected representatives to spend without taxing, thus removing the self-imposed discipline of balanced budgets that had existed prior to the adoption of Keynesian thinking. "The legacy or heritage of Lord Keynes is the putative intellectual legitimacy provided to the natural and predictable political biases toward deficit spending, inflation, and the growth of government."

Buchanan suggests that Keynesianism might perhaps work under a system of benevolent dictatorship, but not in a democratic setting with citizens who are both taxpayers and beneficiaries of public services, while living in the orbit of professional politicians, political parties, and government bureaucracy. "Political decisions in the United States are made by elected politicians, who respond to the desires of voters and the ensconced bureaucracy. There is no center of power where an enlightened few can effectively isolate themselves from constituency pressures.”

Defenders of Keynes have argued that these damaging conclusions are a distortion of what Keynes actually wrote. Maybe so, but that is the way his legacy has operated in recent decades.

Still, John Maynard Keynes is an interesting--indeed representative--personality in many ways. His formation occurred in the palmy days of Edwardian England. Keynes became a member of the Bloomsbury group, gay branch, along with E.M. Forster, Lytton Strachey, and the painter Duncan Grant.

As far as I can see, all relevant aspects of Keynes’ career are covered in Robert Skidelsky’s masterly, indeed sumptuous biography in three volumes. In 1983, after reading the first volume (which offers full coverage of the economist’s gay period), I could scarcely wait for the second and third installments (1992, 2000). They did not disappoint--except of course for an aspect beyond Lord Skidelsky’s control: Keynes heterosexualized himself by marrying a Russian ballerina. We all have friends who seem to be totally gay (and maybe still are), but who marry and have children. Mr. and Mrs. Keynes did not have children though--hence the jibe that his economic theory took no thought for future generations.

Now Lord Skidelsky has revisited the subject with a new book, “Keynes: The Return of the Master.” It seems that this short work does not recap the three-volume magnum opus; the writer did that earlier. On this occasion, the author tries to imagine what Keynes would want now. Somehow, I’d rather not.

Lord Skidelsky is an interesting figure in his own right. According to the biography on his website, Robert Skidelsky was born on April 25, 1939 in Harbin, Manchuria. His parents were British subjects, but of Russian ancestry. His father worked for the family firm, L. S. Skidelsky, which had leased the Mulin coalmine from the government. When war broke out between Britain and Japan in December 1941, he and his parents were interned first in Manchuria, then in Japan, but released in exchange for Japanese internees in England. A few years ago, when Skidelsky returned to Harbin for a conference, he was acclaimed by his Chinese hosts for helping to restore the city’s Jewish heritage. This struck the honoree as a bit much, as his family was only partly Jewish and he had been raised an Anglican.

Skidelsky was educated at Brighton College and at Jesus College, Oxford.

In 1970, he became an Associate Professor at the School of Advanced International Studies, Johns Hopkins University. But the controversy surrounding the publication of his biography of Sir Oswald Mosley--in which he was felt to have let Mosley, the British fascist, off too lightly--led Johns Hopkins to refuse him tenure. Oxford University also proved unwilling to give him a permanent post. These rejections were their loss.

In 1978 Skidelsky was appointed Professor of International Studies at the University of Warwick, where he has since remained, though joining the Economics Department as Professor of Political Economy in 1990.
 
During the 1980s he began to take a more active interest in politics. He was a founder member of the Social Democratic Party (SDP) and remained in the party until its dissolution in 1992. In 1991 he was made a life peer: Lord Skidelsky. Initially, he took the SDP whip but subsequently joined the Conservatives. In 2001, he left the Conservative Party for the cross benches. His political affiliations show a flexibility and attention to circumstances which I find admirable.
 
Today Lord Skidelsky is Emeritus Professor of Political Economy at the University of Warwick. I recommend his website, www.skidelskyr.com, which is both well written and informative on a number of subjects of current interest.
