Tuesday, January 30, 2007

"Bush hatred"

A conservative site accuses liberals of fomenting "Bush hatred." What need is there for that? Aided by his Mephistophelean consigliere Dick Cheney ("I am the Vice-President, You're Not"), Bush has been perfectly capable of generating the hatred himself.

He has presided over the twin disasters of Iraq and Katrina. Gradually, even those who unwisely supported the Iraq invasion are jumping off the bus. This leaves a dwindling corps of Bushbots to defend the bunker. I was saddened to see that Gaypatriot.com is among their number. What could be less patriotic than demolishing our reputation abroad?

Most disheartening to those who favor limited government and fiscal responsibility, Bush has sought to curtail civil liberties and has permitted Congress to go on the biggest spending spree in our history. This erases the libertarian case for the Republicans.

Bush has turned out to be a pure gift for the Democrats. For a generation, possibly more, he has obliterated any reasonable case for principled Conservatism. Republican government turns out to be a new version of the regime of Boss Tweed. Enrich yourself and your cronies, and the Devil take the hindmost.

With respect to Iraq the Bushbots are unceasingly capable of generating new nonsense. After so many mantras, from Mission Accomplished to Defeat Is Not an Option, have become inoperative, the warmongers now tell us: "We have a plan. Those who criticize us have no plan."

Of course there are plans. The Galbraith-Biden plan is to partition Iraq along the lines of Yugoslavia. Murtha proposes redeployment.

One of the worst features of the current Bushian behavior is the refusal to make provision for what will happen when the day comes, as it inevitably will, that our troops must withdraw. Left behind will be tens of thousands of Iraqis who collaborated with us. Now we are told that we can't leave, because these poor people will be killed. Well then, let's bring them out now.

To do such a thing would be a rare concession to reality. Let's try anything but that. Before long, though, we must withdraw.

So devastating has Bush been to conservative prospects that one might almost suspect a Manchurian Candidate scenario. Could it be that Bush is a liberal mole planted a long time ago with the mission of discrediting the Right? As we used to say on the old Left, he is "objectively" an agent provocateur whose secret mission is to destroy the things he claims to support. At least some of the speakers at a recent National Review weekend seem to think so. He has "betrayed" them.

Many observers, whether on the Right or Left, seem to believe that there is such a thing as "genuine Conservatism," which they either adore or execrate according to taste. Alas, Conservatism has always been a loose, baggy monster. Historical reflection confirms this surmise. Consider the careers of Benjamin Disraeli and Otto von Bismarck. Far from pursuing the modest foreign policy goals the "Realists" think should guide us, Disraeli made Victoria Empress of India. He recognized that ordinary English people would find solace in "National Greatness." And of course Bismarck blunted the effect of German Socialists by instituting the first social-security program in the history of the world.

Wednesday, January 24, 2007

Fakelore, urban legends, and ethnic stereotypes

The term fakelore was coined in 1950 by the American folklorist Richard M. Dorson. Fakelore comprises a vast repertoire of invented stories, songs, legends, persons, and artifacts presented as if they were genuinely traditional. The term can refer to new items made up out of whole cloth, or to folklore that has been reworked and modified for modern tastes. In principle, the elements of inauthenticity and misrepresentation are central. However, the artists and storytellers who transmit the fakelore motifs may not be aware of their dubious origins.

As an example Dorson cited the fictional cowboy Pecos Bill, ostensibly a folk hero of the American West, but actually invented in 1923 by the writer Edward J. O'Reilly. More controversially, Dorson regarded Paul Bunyan as fakelore. He conceded that Bunyan originated as a character in traditional tales told by loggers in the Great Lakes region of North America. Yet an ad writer working for the Red River Lumber Company invented many of the stories about him that are known today. According to Dorson, advertisers and popularizers turned Bunyan into a "pseudo-folk hero of twentieth-century mass culture" who bears little resemblance to the original.

The fate of Paul Bunyan shows how fakelore elements may be coopted by commercial interests. The tourist industry often finds such inventions lucrative. The Kensington runestone is a roughly rectangular slab of greywacke covered in runes on its face and side. Its origin and meaning have been disputed ever since it was found in 1898 near Kensington, Minnesota. If authentic, the stone would prove that Scandinavian explorers reached the heart of North America in the fourteenth century. The preponderance of opinion holds that it is a forgery. However, this finding has not prevented a cluster of motels and other businesses from adopting it as a magnet to lure tourists to this part of rural Minnesota.

Some fakelore may serve to advance the interests and self-consciousness of various groups. For example, some feminists have embraced Wiccan lore, with its stories, festivals, and rituals. A number of Wiccan, Neopagan, and even some "Traditionalist" or "Tribalist" groups cherish spurious "Grandmother Stories." These usually involve initiation by a grandmother, grandfather, or other elderly relative who is said to have instructed them in the arcana of the immemorial traditions of their ancestors. As this "secret wisdom" has almost always been traced to recent sources, or been quite obviously concocted even more recently, most proponents of these stories have eventually admitted they made them up. More broadly, scholars have shown that the stories and observances of the currently fashionable Neopagans have only a tenuous connection with the religious practices of old Europe.

A recent controversy has brought into question the authenticity of a legend concerning African American quilts made during slave days. Purportedly, these were created with motifs indicating ways of traveling along the Underground Railroad. The white masters would take the quilt motifs as merely decorative and abstract, while to the makers and their kin they conveyed a powerful message. For us today, such artifacts would attest to a noble tradition of African American resistance. The quilt story stems from a 1999 book, Hidden in Plain View, by Jacqueline Tobin and Raymond Dobard. Unfortunately, there seems to be no truth to the legend, even though many quilts have been made recently utilizing the spurious visual motifs.

Stagolee (a.k.a. Stackerlee, Stack O'Lee, Stack-a-Lee) was an African American outlaw whose career was immortalized in a blues folk song, which has been recorded in hundreds of different versions. The legend, apparently based on a real person, was the subject of much embroidery. Although Stagolee was a murderer, he is also shown as a shrewd and powerful person who stands up for his own rights.

As these examples show, the stories elaborated by members of ethnic and other groups historically subject to discrimination are generally positive. The other side of the coin consists of the myths and fabrications generated by the host society in order to buttress its inferiorization of a particular minority or social group.

Blood libels against the Jews were a common vehicle of anti-Semitism during the Middle Ages, though there is no ritual involving human blood in Jewish law or custom. The first recorded instance was in the writings of the Greek grammarian Apion (first century C.E.), who claimed that the Jews sacrificed Greek victims in the Temple. Then the historical record falls silent, resuming with the story of the boy William of Norwich, first recorded in the Peterborough Chronicle (twelfth century). The libel afterward became an increasingly common accusation. In many cases, these anti-Semitic legends served as the basis for a blood libel cult, in which the alleged victim of human sacrifice was worshipped as a Christian martyr. Other hostile myths have Jews poisoning wells and profaning the Host. One such story, depicted in a number of prints, even shows medieval Jews worshipping pigs.

Over the centuries Western society has generated a series of myths and fabrications to justify hatred of homosexuals. Dating from the time of Justinian in the sixth century C.E. is a legend that homosexuals cause natural disasters. Originally earthquakes were cited. Several years ago, however, Pat Robertson alleged that God had sent a hurricane to devastate Disneyworld in Florida because the amusement park had dared to permit Gay Days.

A common contemporary urban legend concerns gerbils. Ostensibly gay men take such animals and remove their claws and teeth. They then insert them into the anus, where injury results. Typically, a friend of some friend or relative, who happened to be an orderly at a hospital, is claimed to have observed this. The ultimate function of this sinister fiction is to affirm that gay men are irresponsible hedonists, inveterate risk takers, and seekers of momentary satisfactions to the neglect of long-term interests.

One of the most persistent myths that have gained a foothold in the gay movement is the belief that "faggot" derives from the basic meaning of "bundle of sticks used to light a fire," with the historical commentary that when witches were burned at the stake, "only presumed male homosexuals were considered low enough to help kindle the fires." Since there is no historical record of such use, this story is a myth. In fact, the use of the term faggot derives from the earlier meaning of a “slatternly woman.” See the discussion in my companion blog Homolexis.com.

The faggot story differs from the other two (featuring disasters and gerbils) because it owes its survival to repetition by gay people themselves. Fifty years ago some psychiatrists held that gay men were particularly given to “injustice collecting.” Their widespread adoption of the faggot myth may be an example of such behavior.

Although the matter of internalized homophobia has been much debated, it does seem that gay people have particular issues regarding self-esteem. In former times (though this is diminishing now) bitter queens dispensed a kind of self-loathing that could appear to be clever, but which was also demeaning. Even today it is common to hear gay people describing themselves as flighty, affected, superficial, irresponsible, and heedlessly pleasure-seeking. One of the reasons supporters have advanced for gay marriage has been that access to this institution would reduce “promiscuity,” widely regarded as a problem.

There is no doubt that gay people have made many advances in integrating into society over the past fifty years. As this process continues we may expect to see a gradual diminution of self-contempt. Unfortunately, though, certain motifs, such as the idea of disaster-bringing, the gerbil legend, and the faggot story, may linger.

Sunday, January 21, 2007

The success of failure

As I have several times noted, there was a fascinating contrast of views in the run-up to the Iraq war. Millions of ordinary folk, sitting in ordinary living rooms, said: Don't do it. The war serves no legitimate national interest. And once it was launched, we correctly predicted that the matter would end badly.

But a majority of the politicians, and almost all the well-placed members of the pundit class, were cheerleaders. Ostensibly they were privy to sources of information that we peons didn't have. Of course their information was wrong, as any person of even average intelligence could see. But groupthink, cleverly manipulated by the neocons, ruled.

An Internet friend of mine is a famous journalist. Before the war he was a major hawk. Until last year he continued to believe in "victory." The problem was that we "mismanaged" the war. No, the problem was that we got into it in the first place. Now my friend believes, belatedly, that we should redeploy.

The question that must be asked is this: if you were wrong before, AS, why should we believe you now?

This question applies to a whole raft of journalists whose egregious mistakes in judgment only seem to enhance their visibility and incomes. In my view the worst is that charlatan Tom Friedman, the resident village explainer at the New York Times. Supposedly an expert on the Middle East, Friedman has been wrong over and over again. By contrast, a consistent opponent of the war like Robert Scheer finds himself in the wilderness.

In their far-sightedness, these early opponents recall the "premature anti-fascists" of the 1930s. The PAFs were right, but their prescience embarrasses the johnny-come-latelies. So let's deprive them of their jobs and influence, while rewarding the idiots.

In a list of 113 or so problems that are plaguing America, the incompetence of the punditocracy ranks, I suppose, only as number 23. But the issue is puzzling all the same.

One answer, I am afraid, is: what is the alternative? Pat Buchanan was right about the war, but wrong about most everything else. And that sanctimonious windbag Bill Moyers is waiting in the wings to make a comeback. He is one more proof of the malign influence of religion in public life. Moyers's ally is, predictably enough, our own La Pasionaria, Amy Goodman.

Thursday, January 11, 2007

Gay New York

[Prefatory note: I had the good fortune to be living in Los Angeles during the fifties when, with the appearance of the Mattachine Society, the modern gay emancipation movement began. Then I lived in New York City at the end of the sixties, when the Stonewall Riots signaled a fundamental transformation, marking the inception of the major phase of the movement.

In three postings, based on recent books, I have offered my interpretation of the role of Los Angeles. I now supplement these with a posting on New York City.]

The Stonewall Inn was a small, dank, mob-run gay bar in Manhattan’s Greenwich Village. Despite the seeming insignificance of this dive, what happened there over three nights in June 1969 fundamentally changed the long-standing position of the homosexual in society. The Stonewall Riots are widely acknowledged as the opening shot that ushered in a previously unimagined era of openness, political action, and massive social change.

In his superb book Stonewall: The Riots That Sparked the Gay Revolution (New York: St. Martin's, 2004), David Carter provides a careful account of those riots, together with a conspectus of the bar, the area, and the social, political, and legal climate that led up to those events. Based on hundreds of interviews, an exhaustive search of public and previously sealed files, and much thought about the history and the topic, Stonewall has taken its place as the standard account of one of modern history's singular events.

Over the years a number of myths have arisen around the Stonewall Uprising. While David Carter profiles many individuals who were at the bar, there is no way, after all these years, of filtering out those who falsely claim to have been there during the events. (For the record, I wasn’t there personally. I was residing in New York, but was in Europe that summer. When I returned I noticed the huge change, and resolved to become an activist.)

It is crucial to remember that it was not the police raid itself that made Stonewall extraordinary, but the massive reaction of the crowd outside. At one point it even seemed that the police inside would be burned to death. Such were the times.

At any rate, the Carter book dispels—or should dispel—two major myths about Stonewall. The first is that the modern gay-rights movement started with Stonewall. That claim is false. The American gay movement we now have began in Los Angeles with the launching of the Mattachine Society almost twenty years before. (See my three contributions on Gay Los Angeles.)

The opposite myth, one that finds favor among some New York bashers in several parts of the country, is that there was nothing distinctive about the Stonewall Riots. Other police raids had taken place in other cities, especially San Francisco and Los Angeles. In this view the Stonewall Riots achieved their prominence only because New York is the center of the media.

Not so. Stonewall is of epochal importance not for the raid itself, which indeed does have many precedents, but for the astonishing aftermath. Those arrested did not go quietly, as the police expected. They resisted and found reinforcement in the form of a huge, shouting, bottle-throwing mob. After these people finally dispersed the next morning, the ordeal of the guardians of public order was not over. For the demonstrators returned for two more rowdy nights. To all intents and purposes the scene turned into an insurrection. It is not too much to say that it was the most important such event prior to the fall of the Berlin Wall in 1989.

This is the place, by the way, to dispel yet another myth, the belief that the riots were essentially a middle-class event. As a drinking hole, the Stonewall Inn attracted a mainly middle-class clientele. That much may be conceded. Yet it was the massing outside of the great unwashed—scruffy, disorderly street people, many of them individuals of color, and definitely “of repellent aspect” (as Oscar Wilde would have said)--that made the event seem so dangerous for the authorities. As sometimes happens in history, the “bad people” were the good people. By and large the middle-class respectables didn’t have the guts. But the “rancid little fags” (to quote one homophobic journalist) certainly did.

Almost a quarter of a century ago, I offered my own interpretation of the underlying causes of the Stonewall Riots. With a few minor changes, below I quote a few paragraphs from my analysis (“Afterword” to Jim Levin’s Reflections on the American Homosexual Rights Movement, New York: Gay Academic Union, 1983).

--------------------------------------------------------------------

The Stonewall Rebellion fell in the middle of an extraordinary cultural efflorescence in New York City. Now that that luster seems to be waning [or so it seemed in 1983], we can perhaps for the first time begin to see that flowering clearly. …

Just as Hitler and World War II had driven European movie folk to Hollywood, so they sent a stream of modernist painters and sculptors to New York City. At the end of the war many of these émigrés returned to Europe, but in their wake there developed the first American modernist school of international significance: Abstract Expressionism. Best known perhaps through the vast drip canvases of Jackson Pollock, these artists (and the critics who sought to formulate their aims in words) stressed that their works were not so much artifacts or static monuments as living manifestations of creative encounter—or more concisely, actions. They were making Action Painting. This concept linked up with a deeply rooted American tendency to reject essentialism, the notion that there are fixed and rational definitions for the key features of the world, in favor of commitment to flux and experience (“Pragmatism”). Consequently, the works of the Abstract Expressionists had a broadly liberating quality, nudging those who were attracted to them towards a more open-textured, “improvisational” approach to life’s challenges. The paintings of this school looked messy and untidy; in the view of hostile observers, the “fecal masses” there displayed attested to the hazards of permissiveness. Cleanliness was next to godliness, but messiness could be downright diabolical.

Most of the first generation of the Abstract Expressionists were firmly heterosexual, though they practiced bohemian lifestyles. But their leading advocate, the New York poet and critic Frank O’Hara (1926-1966), was gay, and had a celebrated affair with the bisexual painter Larry Rivers. O’Hara’s own poetry combined a French-derived dandyism with the everyday conversational tone pioneered by William Carlos Williams in such works as In the American Grain. Jasper Johns and Robert Rauschenberg, the two leading painters of the immediately following generation, who form a transition to the vogue of Pop Art, were definitely gay—in fact they had an affair.

Developing a little later than the Abstract Expressionists was the somewhat grandiloquently named New American Cinema. Tirelessly promoted by Jonas Mekas in his columns in the Village Voice (then an important organ of the avant-garde, not the monster of sleaze it has since become), the young film makers sought to break with the naturalistic conventions of the commercial cinema (plot, characterizations, empathy and so forth) so as to create a celluloid equivalent of abstract painting and atonal music. The magus of the movement was the notorious Kenneth Anger, who made his underground homosexual classic “Fireworks” in Hollywood in 1947 when he was only seventeen. After a long stay in Paris (where he wrote Hollywood Babylon, an exposé of movieland’s scandals), 1962 found Anger in Brooklyn creating “Scorpio Rising,” a prophetic mixture of the occult, the leather and biker subculture, and homosexual sadomasochism. Jack Smith and Gregory Markopoulos made other experimental films (more experimental in fact than Anger’s) with a camp or homosexual sensibility. A little later Andy Warhol and his company popularized the genre in such pics as “Blowjob” (1964) and “My Hustler” (1965). It has been said, somewhat breathlessly, that “Warhol was the ‘sixties.” If he is now something of a classic, a marker of the past, there was no gainsaying Warhol’s topicality in that era.

Allen Ginsberg, the acknowledged dean of American gay poetry, has lived most of his life in and around New York City. (San Francisco was for him essentially an episode.) It was at Columbia University in the ‘forties that he met Jack Kerouac (whose bisexuality has only recently come to be widely acknowledged) and the definitely gay William Burroughs. “Howl” and “Kaddish,” Ginsberg’s two major poems, are deeply interwoven with New York’s people and rhythms. In turn the East Village scene is scarcely thinkable without Ginsberg. He was its presiding spirit.

In journalism Paul Krassner’s The Realist startled readers in 1967 with a surrealist description of Lyndon Johnson screwing a bullet hole in President Kennedy’s slain body. The Realist lapsed, but its place was taken by a host of underground publications, including Ed Sanders’ F*ck You: A Magazine of the Arts, Al Goldstein’s Screw, and the East Village Other. It is in this context that NYC’s post-Stonewall journalism—Come Out; Gay; Gay Power; and Gay Scene—must be regarded.

To the new resident arriving in the early or mid-1960s (and throughout we are concentrating on the developments before 1969), perhaps the most salient aspect of NYC’s avant-garde was the theater—not the Broadway octopus, which was shamelessly commercial—but the experimental houses of Off-Broadway and Off-off Broadway. In those days the archetype was the Living Theater of Julian Beck and Judith Malina, which performed in a loft-like space at Sixth Avenue and Fifteenth Street (1951-63), ranging from Sophocles as translated by Ezra Pound, through a staged I Ching, to the contemporary drug-culture classic “The Connection” (1959). With the Living Theater’s departure for Europe the mantle passed to Ellen Stewart’s La MaMa in the East Village. The true center of gay theater, however, was Joe Cino’s Caffè Cino, originally a simple coffee house. Here gay playwrights like Robert Patrick (the Cino is evoked in his famous “Kennedy’s Children”), Doric Wilson, and William Hoffman got their start. By the late ‘sixties storefront theaters were springing up all over the East Village. Sometimes the material was sexually explicit, as in Rochelle Owens’ “Futz” (1967), about a man in love with a pig, and Michael McClure’s “The Beard” (1967), which became notorious for showing Billy the Kid going down on Jean Harlow on stage.

In due course this whole theatrical flowering, with its frank treatment of “deviant” sexuality, along with drugs, rock music, astrology, and anti-war protest, was projected commercially—first to Broadway and then to the nation and the world—in the immensely successful musical “Hair.” At the same time the art world drew closer to the new theater with its own form of improvisation, the happening.

Against this ebullient background, one is tempted to regard the three days of the Stonewall Uprising as simply the most spectacular manifestation of the new funky theater, produced in improvisational style with unpaid actors, and the police playing themselves. Television, then centered in NYC, stood ready to beam scenes of such colorful social protest across the land. In this flamboyant outdoor theater, recruitment for a much-enlarged gay movement had already begun.

Many of the improvisational and “nonhierarchical” tendencies flowed, together with skills of left-sectarian and New Left provenance (leafleting; mimeographing of manifestoes and statements; hatching of demos and zaps) into the volatile Gay Liberation Front (GLF), formed in the wake of Stonewall. While this proved an unstable organization, fragments of its style filtered into other groups in New York City, and elsewhere.

In a deeper sense the New York mutation of the movement owed a great deal to cultural Modernism, with its roots going back to France in the middle of the nineteenth century. In keeping with the “tradition of the new,” there was a tenacious attempt to combine innovative aesthetics with radical politics. A powerful challenge was hurled at all existing standards, either by proposing new ones or by proclaiming “anything goes”—there are no standards. Out of such a yeasty amalgam, which peaked in New York City in the 1950s and ‘60s, arose many of the distinctive features of the gay liberation movement of the 1970s.

----------------------------------------------------------------------------------

In conclusion, I broaden the above conspectus by calling the roll of a half-century of New York City “firsts,” stemming from the list in my article in the Encyclopedia of Homosexuality.

1) the publication of Donald Webster Cory’s (a.k.a. Edward Sagarin) The Homosexual in America (New York: Greenberg, 1951);

2) the appearance of the foundational text of the movement for recognition of intergeneration sex, J. Z. Eglinton’s Greek Love (New York, Oliver Layton Press, 1964);

3) the founding of the first gay student association in the history of the world, the Student Homophile League, by Stephen Donaldson (Robert A. Martin), 1966;

4) the opening in Greenwich Village of the Oscar Wilde Memorial Bookshop, the first to be devoted to gay/lesbian books, by Craig Rodwell, who had earlier organized a gay youth group (November 1967);

5) the Stonewall Rebellion (June 1969);

6) the founding of the Gay Liberation Front, with a “revolutionary” multi-issue format that was to enjoy widespread, though ephemeral, imitation (July 1969);

7) the founding of Gay Activists Alliance, whose single-issue orientation was to provide a more lasting model than that of GLF (December 1969);

8) the first Gay Pride March (simultaneously with Los Angeles) (June 1970);

9) the launching of the Gay Academic Union at John Jay College (1973);

10) the founding of the National Gay Task Force (1974);

11) the establishment of Gay Men’s Health Crisis (1981);

12) the founding of the Gay and Lesbian Alliance Against Defamation (1985);

13) the founding of ACT-UP (AIDS Coalition to Unleash Power) (1987);

14) the publication of the two-volume Encyclopedia of Homosexuality (Wayne R. Dynes, General Editor; New York: Garland, 1990), the first such comprehensive reference work of this kind in the history of the world;

15) the founding of CLAGS (Center for Lesbian and Gay Studies; April 1991), which became fully integrated with the Graduate Center of the City University of New York a few years later, making it the first such research institute to be affiliated with a public university.

The hidden agenda (perhaps)

In all the flood of discussion following Bush's speech Wednesday night, I missed the following Machiavellian scheme. According to a plan ascribed to Dick Cheney, we should abandon any semblance of even-handedness in Iraq, and simply support the creation of a Shia state. In this new order the Sunni would be reduced to a helotry comparable to that of the Palestinians vis-a-vis Israel. Since al-Qaeda is Sunni, it would not flourish in a Shia state.

However, this strategy cannot be openly avowed for several reasons. First, it goes against the rhetoric of establishing Democracy--rhetoric that Bush feels he cannot retire, as it would remove the last fig leaf of the invasion. Second, tilting decisively to the Shia would distress Saudi Arabia and others of our "friends" in the region. So we must appear to be following one policy, while actually promoting another.

In this view Bush was telling the truth (up to a point) when he said that he is supporting Maliki's plan. Maliki is of course entirely a creature of the Shia militias. If we bow to Realpolitik, as the Cheney scheme appears to do, the militias are the ones who will constitute the new Iraqi army. They already control the police.

Bush has probably exacted a price for this gift to Maliki and his supporters. After they have consolidated their hold on Iraq--with our military continuing to provide a protective shield--they must stand aside while we attack their coreligionists in Iran. As several observers have noticed, the speech coincides with other statements of bellicosity towards Iran. A few days ago General Clark, echoing the well-informed reports of Seymour Hersh, sounded the alarm. Preparations for bombing Iran have not ceased, and indeed are well advanced.

In this way Bush will follow the path of every tyrant who finds himself in a losing war--start another war.

Eleanor Clift has astutely suggested that the reason for removing John Negroponte from his post as intelligence czar is that he refused to go along with cooking the intel data about Iran.

Faced with these dire circumstances, the Democrats have only "symbolic" votes to cast. Such impotent tut-tutting will have no effect. Perhaps it is not intended to.

As with Vietnam, we have reached a point where the normal politics of the two-party system has failed. Now only massive demonstrations will work. If these are big enough, they might convince a substantial body of Republican Senators to go to Bush and tell him that the game is up. The country is becoming ungovernable. This step worked with Nixon. Yet there is only a slender hope that it will work now.

Wednesday, January 10, 2007

Turning off the "gay gene"?

Some concern has been generated by work at Oregon State University that suggests the possibility of turning off the “gay gene” in sheep. The researchers have been adjusting hormones in the brains of gay rams in order to stimulate their interest in the opposite sex. It seems that the indifference of many rams to otherwise attractive and fertile ewes is a drag on sheep breeding.

Martina Navratilova and other gay and lesbian observers have sounded the alarm about this work. The gay rams have a right to be what they are, Navratilova asserts. Andrew Sullivan puts the concern this way: “If you can figure out how to flip the gay switch off in sheep, how long will it be before someone tries to do the same in humans?”

“The good news, then, [Sullivan goes on to write] is that the empirical origins of sexual orientation are slowly being discovered. The bad news is that once discovered, they could be manipulated.” He doubts, however, that the procedures applicable to animals could be extended to human beings.

Well, why not? For some time evidence has been accumulating that indicates that there is indeed a genetic component in human same-sex behavior. The best evidence of this is the fact that identical twins show a much greater concordance for sexual orientation than do fraternal twins. At present we can assess the effect (at least up to a point), while the mechanism--commonly, but inaccurately termed the "gay gene"--remains elusive.
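
[A rough illustration of how such twin comparisons are usually read, not the procedure of any particular study: in the classical twin design a crude estimate of the genetic contribution, the broad-sense heritability, comes from Falconer's formula

H^2 \approx 2\,(r_{MZ} - r_{DZ})

where r_{MZ} and r_{DZ} are the trait correlations for identical and fraternal twins. Because identical twins share essentially all their genes and fraternal twins about half, a markedly higher concordance among the former pushes the estimate well above zero. In practice concordance rates must first be converted to correlations under a liability-threshold model, and such an estimate says nothing about which genes are involved or how they act.]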

And so we come to the ultimate nightmare scenario. If the genetic triggers for homosexual orientation could be identified, then fetuses destined to be gay could be detected and aborted. Apparently this is already happening on a large scale with fetuses identified with Down’s Syndrome. Very few such babies are being born nowadays.

It does not seem, though, that the work in Oregon is leading in this direction. The efforts address the indifference of male sheep to female sheep. Such indifference is by no means a problem for many human homosexuals, who gravitate to women as friends and, in some cases, are perfectly able to perform sexually in order to beget children. Call these last individuals bisexuals if you will, but they exist. The bisexual phenomenon indicates that we are not dealing with a simple binarism of gay vs. straight, but with a behavioral spectrum, as Kinsey realized long ago.

There is another problem that concerns animal homosexuality, now so confidently asserted as a result of Bruce Bagemihl’s big book on the subject. Yes, various forms of same-sex behavior occur among animals. However, this behavior is quite variable, ranging from mounting behavior (perhaps better analyzed as dominance assertion) to female bird pairs who raise their young together. In short, while many forms of same-sex behavior are found among animals, no species reveals the complex phenomenon that characterizes erotic and loving homosexuality among human beings.

In the light of this complexity, it is unlikely that a single gene will ever be found that will “make people gay.” Instead, one should expect the interplay of a variety of genetic triggers. Moreover, environment will continue to play a part.
To put the matter in the vernacular: Some of us are born that way, the rest just get sucked into it.

When the genetic conditioning is better understood, it is still possible that some bigoted parents will want to abort in order to avert even the slightest possibility of a gay child. However, the Catch-22 is that many of these homophobic individuals are governed by religious allegiances that forbid abortion. So while they might like to do it, they won't. In this case, though, there could be serious psychological consequences for a child who is unwanted by his or her parents because he or she has been tagged as homosexual before birth. In such cases adoption by gay parents might be the solution.

Genetic advances are having all sorts of consequences. Given this prospect, it is important to avoid alarmism, while at the same time realistically assessing the dangers that may ensue.

Monday, January 08, 2007

At the movies

They say that one can ignore mainstream movies released during ten months of the year. These are just the first step to DVD or electronic release. They are mostly junk. Skipping them suits me fine, as I am very selective about what films I see.

Come December all this changes, though, and the quality stuff comes out (running on through January) to qualify for the Academy awards. Then I start going to movies.

The best one I have seen so far is "The Painted Veil," a fine version, produced by and starring Edward Norton, of an old standby by W. Somerset Maugham. Although it is set in Republican China, it is more a character study about two stereotypical English people, and how one (the wife) achieves personal growth.

Shifting to a Boston venue, "The Departed" has the highest volume of homophobic epithets of any film that I can remember. This filth is practically wall to wall. In general we gay people have made many gains over the last fifty years. In the old days, though, tabooing would have kept such stuff off the screen.

Apparently the cops--or at least these Irish Boston types--commonly use homophobic slurs to "get over" on colleagues or others they wish to control. The homophobic slurs overlap misogynistic ones (e.g. calling another cop a "prom queen").

All the actors who use these words are, I think, just reading their lines. One, though, Mark Wahlberg, spews them out with such venom that you feel he is really invested in them. You will remember that as "Marky Mark," Wahlberg was something of a gay icon. He has always been uncomfortable with this clientele, and now (it seems) he lets it all hang out. If so, the good news is that his looks have faded, and he should henceforth be relegated to doing character parts. Screw him.

Again and again, Leonardo DiCaprio steals the movie. This actor, who started as a kind of male Shirley Temple, has shown an almost incredible ability to develop (with a few false steps along the way). He is also in "Blood Diamond," and may walk off with an Oscar for one or the other performance. Good.

One of the people in "The Departed" that Leo trounces is Matt Damon, truly a good guy in my book. I thought that Damon would win back his spurs in "The Good Shepherd," but he simply doesn't perform at all in that movie. In order to show a character devoid of feeling, he rigidly maintains a neutral mask throughout. The role could have been played by an expressionless android. They should garnish his paycheck.

Much has been expected of "The Good Shepherd," but it suffers from a really major flaw. The film seeks to explain the early years of the CIA through the personality of a single individual, whose emotions were deadened by a childhood trauma. The movie is also full of minor loose ends in terms of plot. In my view, the makers (De Niro directed, as well as playing a supporting role) were trying to create another "Syriana." They failed.

Friday, January 05, 2007

Creative sites

Today the conventional wisdom holds that, because of the ease of electronic communication, it doesn't much matter where a creative person lives nowadays. So one might as well move to a place where the climate is mild and rents are low.

Maybe so, but that has not been my experience. It is true that I have met some persons who influenced me on the Internet, but on the whole I believe that this facility supplements, rather than replaces, the old-fashioned "face-work" of actual encounter in a specified locality.

Sometimes this locality can be very specific, a microenvironment, as it were. In the early sixties I did a brief career detour into the world of publishing. I was fortunate to have Bernard Myers occupying an office next to mine on the twentieth floor of the old McGraw-Hill building (an architectural landmark, but that fact is incidental). Every other day, Bernie would invite me to lunch and tell me all sorts of things about intellectual and professional life in New York (he was born, if memory serves, in 1909). One day, though, Bernie got a promotion and was moved to another floor in the building. Our colloquies ceased. Still, I had learned what I needed to know, and later on, when I was living an impoverished life as a student in London, Bernie Myers gave me freelance work. If we had not shared an office wall, my life would have been different.

A cultural historian has drawn up a map of the places of residence of leading intellectuals in Munich prior to World War I. Few people apart from Germanists realize it, but this was one of the most dynamic centers of creativity in Europe a hundred years ago. A space of little more than a square mile in the Schwabing district contained the apartments of Wassily Kandinsky, Franz Marc, and Paul Klee. Peter Behrens, who jump-started modern architecture, lived there also, as did Stefan George and Thomas Mann. And many more. Not all these luminaries knew one another, but collectively they created an atmosphere of incredible brilliance. Or as they said in those days, "Muenchen leuchtete"--Munich shone.

Such constellations of creativity are fragile, and this one was no exception. The outbreak of World War I in August 1914 destroyed the preeminence of Munich. It was never to return.

It is less easy to define why the city developed this eminence. To be sure, in the 19th century the Wittelsbach house had begun a campaign of embellishing the city architecturally and building up its museums. However, Munich came into its own only after 1871, which began the age of German industrial expansion, symbolized by the Ruhr and by Berlin. Munich, traditionally a center of the Bavarian baroque with close ties to Italy, thrived because it was not the Ruhr, not Berlin.

The novelist Robert Stone has just brought out a book chronicling the efflorescence of the East Village in the 1960s and 70s. Today, the district has been gentrified. The arts scene didn't die, though; it moved to Williamsburg and other places. Affordable housing, especially in rundown districts, is a key factor.

This is one reason why the prescriptions of Richard Florida, purporting to show what cities can do to make themselves attractive to creative people, have generally failed. Creativity flourishes under conditions of benign neglect--at least of a certain sort.

A further question is, What is creativity? Fifty years ago the answer seemed easy. It is achievement in the fine arts of literature, theater, music, dance, painting, sculpture, and architecture. In the post-Gutenberg era this deck has been scrambled, and the popular media of movies and television, rock and rap, seem to fit the Zeitgeist all too closely.

But does adding these new media make the mix wide enough? Some advocate the principle of "symbolic analysis," which would include all brain workers.

One feature remains, though. About the only thing we know about creative districts is that they develop a critical mass after a few innovative souls have settled there, leading to the attraction of others. So far, so good. But then a version of the St. Tropez Syndrome sets in. Too many people, especially the uncreative rich, move in, rents go up--and the magic is gone.

Thursday, January 04, 2007

The God problem

In a recent essay in Time Magazine, Andrew Sullivan points to the prominence of religion as the most salient aspect of the closing years of the 20th century and the beginning ones of the 21st. Although he is a Roman Catholic, Sullivan is by no means comfortable with this resurgence. And indeed with the evidence that religion is crucially implicated in violence in Northern Ireland, West Africa, and the Middle East, what thinking person could be? While our evangelicals are not violent in this way, the efforts of some of them to turn the US into a theocracy are frightening.

It is in this context, I believe, that the interventions of such atheists as Sam Harris and Richard Dawkins must be placed. Unfortunately, they are not doing a very good job. For one thing, they identify religion with the three Abrahamic varieties, Judaism, Christianity, and Islam. This is scarcely an acceptable conspectus of world religion. Animism may be, in our eyes, superstition, but as a rule these beliefs do no harm except to those who subscribe to them. The views of Scientologists I find preposterous, but they are not setting off car bombs. And to turn from the ridiculous to the sublime, Buddhism and Taoism have benefited hundreds of millions of human beings, without fostering the sort of murderousness that our Big Three have caused. So why don't Harris and Dawkins call a spade a spade? Harm is being caused by the faiths they were raised in, Judaism and Christianity, plus Islam. That is the way things are. Yet to be frank about this problem would probably lose them many readers.

Much attention has been focused on this harm question, especially as expounded in the latter part of Dawkins's book. As the review I am citing in part below shows, however, he is guilty of bad faith in the matter of the harms caused by Marxist atheism. He spuriously claims that the victims of these false beliefs cannot (or perhaps should not) be counted. See, however, The Black Book of Communism for very believable estimates. Then, somehow, Dawkins claims that atheism is just a kind of add-on to Communism, an element not central to it. That would be news to Karl Marx, who regarded atheism as the cornerstone of his belief system.

The reality is that atheism has surpassed the Abrahamic religions in the number of its victims in the 20th century. I must stress this point. How did the 20th century acquire its dubious distinction of being the most violent era of world history? The answer is godless totalitarianism--Marxism and fascism.

At any rate Dawkins's book has been devastated by a review from the biologist H. Allen Orr. The full review appears in The New York Review of Books for January 11, 2007. Below are some major excerpts.


[Orr on Dawkins]

Among the [recent crop of books advocating atheism], Dawkins's The God Delusion stands out for two reasons. First, it's by far the most ambitious. . . . Dawkins is on a mission to convert. He is an enemy of religion, wants to explain why, and hopes thereby to drive the beast to extinction. Second, Dawkins has succeeded in grabbing the public's attention in a way that other writers can only dream of. His book is on the New York Times best-seller list and he's just been featured on the cover of Time Magazine.

Dawkins's first book, The Selfish Gene (1976), was a smash hit. An introduction to evolutionary theory, it explained a number of deeply counter-intuitive results, including how an apparently self-centered process like Darwinian natural selection can account for the evolution of altruism. . . .

Dawkins clearly believes his background in science allows him to draw strong conclusions about religion and, in The God Delusion, he presents those conclusions in language that's stronger still. Dawkins not only thinks religion is unalloyed nonsense but that it is an overwhelmingly pernicious, even "very evil," force in the world. His target is not so much organized religion as all religion. And within organized religion, he attacks not only extremist sects but moderate ones. Indeed, he argues that rearing children in a religious tradition amounts to child abuse.

Dawkins's book begins with a description of what he calls the God Hypothesis. This is the idea that "the universe and everything in it" were designed by "a superhuman, supernatural intelligence." This intelligence might be personal (as in Christianity) or impersonal (as in deism). Dawkins is not concerned with the alleged detailed characteristics of God but with whether any form of the God Hypothesis is defensible. His answer is: almost certainly not. Although his target is broad, Dawkins discusses mostly Christianity, partly because this faith has wrestled often with science and partly because it's the tradition Dawkins knows best (he was reared as an Anglican).

The first few chapters of The God Delusion are given over to philosophical matters. Dawkins summarizes the traditional philosophical arguments for God's existence, from Aquinas through pre-Darwinian arguments from biological design, along with the traditional arguments against them. . . .

The latter half of The God Delusion is partly devoted to Dawkins's discussion of religion as practiced. Not surprisingly, he finds little good to say about it: religion for him is the root of much evil and its disappearance from the world would be an unmitigated good. Religion, he tells us, is certainly not the source of our morality (indeed the God of the Old Testament is, he claims, nothing short of monstrous) and believers are no better morally than nonbelievers; in fact they may be worse. Dawkins regales us with tales of Christian cops who threaten to beat up an atheist; presents statistics on the higher rates of crime in regions that are religious; and argues that, when considering religiously inspired violence and terrorism, "we should blame religion itself, not religious extremism—as though that were some kind of terrible perversion of real, decent religion." Late in his book, Dawkins defends a faith-free morality and provides his own, secular, Ten Commandments. (For example, "Do not indoctrinate your children" and "Enjoy your own sex life (so long as it damages nobody else).")

. . . Dawkins when discussing religion is, in effect, a blunt instrument, one that has a hard time distinguishing Unitarians from abortion clinic bombers. What may be less obvious is that, on questions of God, Dawkins cannot abide much dissent, especially from fellow scientists (and especially from fellow evolutionary biologists). Indeed Dawkins is fond of imputing ulterior motives to those "Neville Chamberlain School" scientists not willing to go as far as he in his war on religion: he suggests that they're guilty of disingenuousness, playing politics, and lusting after the large prizes awarded by the Templeton Foundation to scientists sympathetic to religion. The only motive Dawkins doesn't seem to take seriously is that some scientists genuinely disagree with him.

Despite my admiration for much of Dawkins's work, I'm afraid that I'm among those scientists who must part company with him here. Indeed, The God Delusion seems to me badly flawed. Though I once labeled Dawkins a professional atheist, I'm forced, after reading his new book, to conclude he's actually more an amateur. I don't pretend to know whether there's more to the world than meets the eye and, for all I know, Dawkins's general conclusion is right. But his book makes a far from convincing case.

The most disappointing feature of The God Delusion is Dawkins's failure to engage religious thought in any serious way. This is, obviously, an odd thing to say about a book-length investigation into God. But the problem reflects Dawkins's cavalier attitude about the quality of religious thinking. Dawkins tends to dismiss simple expressions of belief as base superstition. Having no patience with the faith of fundamentalists, he also tends to dismiss more sophisticated expressions of belief as sophistry (he cannot, for instance, tolerate the meticulous reasoning of theologians). But if simple religion is barbaric (and thus unworthy of serious thought) and sophisticated religion is logic-chopping (and thus equally unworthy of serious thought), the ineluctable conclusion is that all religion is unworthy of serious thought.

The result is The God Delusion, a book that never squarely faces its opponents. You will find no serious examination of Christian or Jewish theology in Dawkins's book (does he know Augustine rejected biblical literalism in the early fifth century?), no attempt to follow philosophical debates about the nature of religious propositions (are they like ordinary claims about everyday matters?), no effort to appreciate the complex history of interaction between the Church and science (does he know the Church had an important part in the rise of non-Aristotelian science?), and no attempt to understand even the simplest of religious attitudes (does Dawkins really believe, as he says, that Christians should be thrilled to learn they're terminally ill?).

Instead, Dawkins has written a book that's distinctly, even defiantly, middlebrow. Dawkins's intellectual universe appears populated by the likes of Douglas Adams, the author of The Hitchhiker's Guide to the Galaxy, and Carl Sagan, the science popularizer, both of whom he cites repeatedly. This is a different group from thinkers like William James and Ludwig Wittgenstein—both of whom lived after Darwin, both of whom struggled with the question of belief, and both of whom had more to say about religion than Adams and Sagan. Dawkins spends much time on what can only be described as intellectual banalities: "Did Jesus have a human father, or was his mother a virgin at the time of his birth? Whether or not there is enough surviving evidence to decide it, this is still a strictly scientific question."

The vacuum created by Dawkins's failure to engage religious thought must be filled by something, and in The God Delusion, it gets filled by extraneous quotation, letters from correspondents, and, most of all, anecdote after anecdote. Dawkins's discussion of religion's power to console, for example, is interrupted by the story of the Abbot of Ampleforth's joy at learning of a friend's impending death; speculation about why countries, such as the Netherlands, that allow euthanasia are so rare (presumably because of religious prejudice); a nurse who told Dawkins that believers fear death more than nonbelievers do; and the number of days of remission from Purgatory that Pope Pius X allowed cardinals and bishops (two hundred, and fifty, respectively). All this and more in four pages. . . .

One reason for the lack of extended argument in The God Delusion is clear: Dawkins doesn't seem very good at it. Indeed he suffers from several problems when attempting to reason philosophically. The most obvious is that he has a preordained set of conclusions at which he's determined to arrive. Consequently, Dawkins uses any argument, however feeble, that seems to get him there and the merit of various arguments appears judged largely by where they lead.

. . . Exercises in double standards also plague Dawkins's discussion of the idea that religion encourages good behavior. Dawkins cites a litany of statistics revealing that red states (with many conservative Christians) suffer higher rates of crime, including murder, burglary, and theft, than do blue states. But now consider his response to the suggestion that the atheist Stalin and his comrades committed crimes of breathtaking magnitude: "We are not in the business," he says, "of counting evil heads, compiling two rival roll calls of iniquity." We're not? We were forty-five pages ago.

Dawkins's problems with philosophy might be related to a failure of metaphysical imagination. When thinking of those vast matters that make up religion—matters of ultimate meaning that stand at the edge of intelligibility and that are among the most difficult to articulate—he sees only black and white. Despite some attempts at subtlety, Dawkins almost reflexively identifies religion with right-wing fundamentalism and biblical literalism. Other, more nuanced possibilities—varieties of deism, mysticism, or nondenominational spirituality—have a harder time holding his attention. It may be that Dawkins can't imagine these possibilities vividly enough to worry over them in a serious way.

There's an irony here. Dawkins's main criticism of those who doubt Darwin—and it's a good one—is that they suffer a similar failure of imagination. Those, for example, who argue that evolution could never make an eye because anything less than a fully formed eye can't see simply can't imagine the surprising routes taken by evolution. In any case, part of what it means to suffer a failure of imagination may be that one can't conceive that one's imagination is impoverished. It's hard to resist the conclusion that people like James and Wittgenstein struggled personally with religion, while Dawkins shrugs his shoulders, at least in part because they conceived possibilities—mistaken ones perhaps, but certainly more interesting ones—that escape Dawkins.

Putting aside these philosophical matters, Dawkins's key empirical claim—that religion is a pernicious force in the world—might still be right. Is it? Throughout The God Delusion, Dawkins reminds us of the horrors committed in the name of God, from outright war, through the persecution of minority sects, acts of terrorism, the closing of children's minds, and the oppression of those having unorthodox sexual lives. No decent person can fail to be repulsed by the sins committed in the name of religion. So we all agree: religion can be bad.

But the critical question is: compared to what? And here Dawkins is less convincing because he fails to examine the question in a systematic way. Tests of religion's consequences might involve a number of different comparisons: between religion's good and bad effects, or between the behavior of believers and nonbelievers, and so on. While Dawkins touches on each, his modus operandi generally involves comparing religion as practiced—religion, that is, as it plays out in the rough-and-tumble world of compromise, corruption, and incompetence—with atheism as theory. But fairness requires that we compare both religion and atheism as practiced or both as theory. The latter is an amorphous and perhaps impossible task, and I can see why Dawkins sidesteps it. But comparing both as practiced is more straightforward. And, at least when considering religious and atheist institutions, the facts of history do not, I believe, demonstrate beyond doubt that atheism comes out on the side of the angels. Dawkins has a difficult time facing up to the dual facts that (1) the twentieth century was an experiment in secularism; and (2) the result was secular evil, an evil that, if anything, was more spectacularly virulent than that which came before.

Part of Dawkins's difficulty is that his worldview is thoroughly Victorian. He is, as many have noted, a kind of latter-day T.H. Huxley. The problem is that these latter days have witnessed blood-curdling experiments in institutional atheism. Dawkins tends to wave away the resulting crimes. It is, he insists, unclear if they were actually inspired by atheism. He emphasizes, for example, that Stalin's brutality may not have been motivated by his atheism. While this is surely partly true, it's a tricky issue, especially as one would need to allow for the same kind of distinction when considering religious institutions. (Does anyone really believe that the Church's dreadful dealings with the Nazis were motivated by its theism?)

In any case, it's hard to believe that Stalin's wholesale torture and murder of priests and nuns (including crucifixions) and Mao's persecution of Catholics and extermination of nearly every remnant of Buddhism were unconnected to their atheism. Neither the institutions of Christianity nor those of communism are, of course, innocent. But Dawkins's inability to see the difference in the severity of their sins—one of orders of magnitude—suggests an ideological commitment of the sort that usually reflects devotion to a creed.

What of the possibility that present-day churchgoers are worse morally than those who stay away? They might be. Indeed C.S. Lewis, in perhaps the most widely read work of popular theology ever written, Mere Christianity, conceded the possibility. Emphasizing that the Gospel was preached to the weak and poor, Lewis argued that troubled souls might well be drawn disproportionately to the Church. As he also emphasized, the appropriate contrast should not, therefore, be between the behavior of churchgoers and nongoers but between the behavior of people before and after they find religion. Under Dawkins's alternative logic, the fact that those sitting in a doctor's office are on average sicker than those not sitting there must stand as an indictment of medicine. (There's no evidence in The God Delusion that Dawkins is familiar with Lewis's argument.)

In any case, there are some grounds for questioning whether Dawkins's project is even meaningful. As T.S. Eliot famously observed, to ask whether we would have been better off without religion is to ask a question whose answer is unknowable. Our entire history has been so thoroughly shaped by Judeo-Christian tradition that we cannot imagine the present state of society in its absence. But there's a deeper point and one that Dawkins also fails to see. Even what we mean by the world being better off is conditioned by our religious inheritance. What most of us in the West mean—and what Dawkins, as revealed by his own Ten Commandments, means—is a world in which individuals are free to express their thoughts and passions and to develop their talents so long as these do not infringe on the ability of others to do so. But this is assuredly not what a better world would look like to, say, a traditional Confucian culture. There, a new and improved world might be one that allows the readier suppression of individual differences and aspirations. The point is that all judgments, including ethical ones, begin somewhere, and ours, often enough, begin in Judaism and Christianity. Dawkins should, of course, be applauded for his attempt to picture a better world. But intellectual honesty demands acknowledging that his moral vision derives, to a considerable extent, from the tradition he so despises.

One of the most interesting questions about Dawkins's book is why it was written. Why does Dawkins feel he has anything significant to say about religion and what gives him the sense of authority presumably needed to say it at book length? The God Delusion certainly establishes that Dawkins has little new to offer. Its arguments are those of any bright student who has thumbed through Bertrand Russell's more popular books and who has, horrified, watched videos of holy rollers. Dawkins is obviously entitled to his views on God, ballet, and currency markets. But I doubt he feels much need to pen books on the last two topics.

The reason Dawkins thinks he has something to say about God is, of course, clear: he is an evolutionary biologist. And as we all know, Darwinism had an early and noisy run-in with religion. What Dawkins never seems to consider is that this incident might have been, in an important way, local and contingent. It might, in other words, have turned out differently, at least in principle. Believers could, for instance, have uttered a collective "So what?" to evolution. Indeed some did. The angry reaction of many religious leaders to Darwinism had complex causes, involving equal parts ignorance, fear, politics, and the sheer shock of the new. The point is that it's far from certain that there is an ineluctable conflict between the acceptance of evolutionary mechanism and the belief that, as William James put it, "the visible world is part of a more spiritual universe." Instead, we and Dawkins might simply be living through the reverberations of an interesting, but not especially fundamental, bit of Victorian history. If so, evolutionary biology would enjoy no particularly exalted pulpit from which to preach about religion.

None of this is to say that evolutionary biology cannot inform our view of religion. It can and does. At the very least it insists that the Lord works in mysterious ways. More generally, it demands rejection of anything approaching biblical literalism. There are facts of nature—including that human beings evolved on the African savanna several million years ago—and these facts are not subject to negotiation. But Dawkins's book goes far beyond this. The reason, of course, is that The God Delusion is not itself a work of either evolutionary biology in particular or science in general. None of Dawkins's loud pronouncements on God follows from any experiment or piece of data. It's just Dawkins talking.

We should not, though, conclude that there's no debate whatever to be had between science and religion. The view championed by Stephen Jay Gould and others that the two endeavors are utterly distinct and thus incapable of interfering with each other is overly simplistic. There have been, and likely will continue to be, real disagreements between legitimate science and authentic religion. Some of the issues involved are epistemological (Do scientific and religious claims simply begin with different premises, the first materialist and the second not?), and others ethical (Where do we draw the line between what medicine can accomplish and what it should be allowed to accomplish?). These questions are difficult and might well merit extended discussion between scientific and religious thinkers. But if such discussions are to be worthwhile, they will have to take place at a far higher level of sophistication than Richard Dawkins seems either willing or able to muster.

Monday, January 01, 2007

Phony truisms

Over the years I have been struck by how easily people assent to proverbs and other pieces of conventional wisdom that are simply absurd on their face. Take, for example, the adage "it's always darkest before the dawn." Occasionally a sudden storm might blow up to darken the predawn atmosphere for a while, but as a generalization about something that is "always" true, the statement is patently false.

Its purpose in practical affairs, I suppose, is to reassure supporters of a cause in deep trouble that if they just hang on, things will eventually get better.
Imagine, for example, the situation in England in June of 1940, as the Nazi armies overran France. Here, however, Churchill got it right: much blood, sweat, and tears would have to be expended before things got better. In fact, saying that dawn would soon appear (following the proverbial formula) would likely have made things worse, as people would have grown complacent and rested on their oars.

Another saying is of more recent origin; it derives from Friedrich Nietzsche. "Whatever doesn't kill a person makes him stronger." Imagine a vicious Janjaweed thug in Darfur popping a victim's eyes out and saying: "There! You should be grateful. I've made you stronger."

In discussing this gem with my students, I have observed that they like to amend it by saying that rigorous training, say at a gym, will be initially painful, but will make you stronger. Yes, but that is not what Nietzsche said.

Now there is a statement attributed to Albert Einstein: "Insanity is doing the same thing that has failed over and over again in the hope that it will succeed." As regards Iraq, I have concluded that no matter how many times we are deceived by the one-more-try rhetoric, our project will not succeed.

However, this is not a general law. I remember being impressed as a child by the following story about a king of Scotland. It doesn't matter whether the tale is apocryphal: the principle is valid. Here is a modern retelling:

Hundreds of years ago there was a king of Scotland whose name was Robert the Bruce. It was a good thing that he was both brave and wise, because the times in which he lived were wild and dangerous. The King of England was at war with him, and had led a great army into Scotland to drive him out of the land and to make Scotland a part of England.
Battle after battle he had fought with England. Six times Robert the Bruce had led his brave little army against his foes. Six times his men had been beaten, until finally they were driven into flight. At last the army of Scotland was entirely scattered, and the king was forced to hide in the woods and in lonely places among the mountains.
One rainy day, Robert the Bruce lay in a cave, listening to the rainfall outside the cave entrance. He was tired and felt sick at heart, ready to give up all hope. It seemed to him that there was no use for him to try to do anything more.
As he lay thinking, he noticed a spider over his head, getting ready to weave her web. He watched her as she worked slowly and with great care. Six times she tried to throw her thread from one edge of the cave wall to another. Six times her thread fell short.
"Poor thing!" said Robert the Bruce. "You, too, know what it's like to fail six times in a row."
But the spider did not lose hope. With still more care, she made ready to try for a seventh time. Robert the Bruce almost forgot his own troubles as he watched, fascinated. She swung herself out upon the slender line. Would she fail again? No! The thread was carried safely to the cave wall, and fastened there.
"Yes!" cried Bruce, "I, too, will try a seventh time!"
So he arose and called his men together. He told them of his plans, and sent them out with hopeful messages to cheer the discouraged people. Soon there was an army of brave men around him. A seventh battle was fought, and this time the King of England was forced to retreat back to his own country.
It wasn't long before England recognized Scotland as an independent country with Robert the Bruce as its rightful king.
And to this very day, the victory and independence of Scotland are traced to a spider who kept trying, again and again, to spin her web in a cave, and who thereby inspired the king of Scotland, Robert the Bruce.