Saturday, January 30, 2010

'Tis the gift to be Zinnful, NOT

Notices of the death of the American historian Howard Zinn have been overwhelmed by the obituaries of the writer J. D. Salinger, author of "Catcher in the Rye" and other works. This seems appropriate, because while Zinn claimed to be capturing the spirit of the people, J. D. Salinger fundamentally redefined it.

Zinn in fact belongs to a long line of writers who attacked elitism in the name of populism. As a kid, I listened, with mounting impatience, as my would-be radical stepfather played over and over again a set of records by the poet Carl Sandburg (1878-1967), "The People, Yes." Hearing his lugubrious, self-important voice, I shaped my own rebellion, tentatively entitled "The People--yeah, man."

Howard Zinn’s more prosaic work, "A People's History of the United States, 1492-Present" (1980), was in this vein, but if anything more tendentious. In Zinn’s view the task of the historian is to produce a political document. His book has gone through five editions and multiple printings, been assigned in thousands of college courses, sold two million copies, and made the author a celebrity. At least his two Hollywood admirers, Matt Damon and Ben Affleck, are seeking to confirm this status, post-mortem, in their road-show of a filmed version.

The best account of Zinn’s book remains that of Michael Kazin, “Howard Zinn's History Lesson,” published in Dissent in 2004. Kazin says flatly, “People's History is bad history, albeit gilded with virtuous intentions. Zinn reduces the past to a Manichean fable and makes no serious attempt to address the biggest question a leftist can ask about U.S. history: why have most Americans accepted the legitimacy of the capitalist republic in which they live?”

The last question is indeed the operative one. Why is it, in fact, that in the thirty years that have passed since the first publication of Zinn’s opus, the left has been steadily losing ground?

There are many reasons for this decline, but work such as Zinn’s doesn’t seem to have any effect in reversing the trend.


Saturday, January 23, 2010


A host of complaints have been lodged against the new blockbuster film, "Avatar," which I caught up with a few days ago. The most absurd of these--the horror! the horror!--I can scarcely bear to write these words, but write I must. The character played by Sigourney Weaver is shown . . . s m o k i n g. What an example for our youth!

If I were a young person faced with this paternalistic b.s., I would go right out and buy a pack of cigarettes--and blow smoke at anyone I chose to. Actually, I hate smoking. But I also hate the way smokers have been demonized, exemplifying Durkheim's theory that as one group--gays, transies, etc., take your pick--escapes the net of proscription, another must be conscripted into the ranks of the abnormal. The definition of normality implies, as night follows day, the ascription of abnormality.

Could we not just drop this judgmental dichotomy? Oh no; that would be worse than smoking. Civilization depends on demonizing people who are out of favor.

The real problem with the film is that it presents a utopia of vintage 1969, or thereabouts, when presumably director James Cameron was an impressionable youth. Those were the days when rural communes, low tech except for the inevitable music system, were all the rage.

Unbelievably, the Na'vi don't even have cell phones! In their "small is beautiful" paradise, there are no electronic devices of any kind. Theirs is the society of an ant colony: they communicate psychically, all under the direction of the Great Mother Eywa, or whatever the harridan's name is. Of course, the Na'vi are inherently peaceful, with a deep reverence for life. If so, though, why do they have a warrior class? Whom do these warriors make war on?

Many years ago, in a series of studies now neglected, the historian of ideas Arthur O. Lovejoy charted the Western infatuation with primitivism. He distinguished two types: soft primitivism, when the denizens ostensibly just kicked back and enjoyed themselves, and hard primitivism where life was challenging but good, because scarcity inhibited covetousness. "Avatar" leans towards the latter.

It also incorporates a staple of sci-fi space operas, in which the hero must quickly master an unfamiliar culture in order to become its savior.

"Avatar"'s earthling hero, who dies in the end, can assume this role because, paradoxically, he has a strong sense of individuality. The Na'vi do not.

To me, it is this glorification of the suppression of individuality that is the ugliest feature of the juvenile plot. Of course, many still think we have too much individuality--the source of acquisitiveness, lack of caring, and the all-conquering ego. Such at any rate seems to be the view enounced in a new book by the scientists Bert Hölldobler and E. O. Wilson, "The Superorganism: The Beauty, Elegance, and Strangeness of Insect Societies." These authors look forward to the triumph of the superorganism principle in the human species. If this development does occur, though, it will not be through the 1970s notion of a return to nature, but through an ever-proliferating repertoire of electronic devices, the pivotal one possibly to be imprinted on the eyeball at birth. In this way we will achieve the ultimate togetherness. Ugh.

Of course by utilizing cutting-edge technology the film's creators have achieved dazzling visual effects. Twenty-first-century technology is needed to visualize a stone-age society! This irony seems to have been lost on most viewers.

The name Na'vi, btw, echoes the modern Hebrew pronunciation of the Hebrew nabi, "prophet," underlining the retro stance of the movie.

I doubt very much if "Avatar" represents the future of movies. We saw this kind of thing almost sixty years ago with "Bwana Devil." Who wants to have to wear those dorky glasses?

UPDATE (Feb 4). The annual ritual of the Academy Awards is almost upon us. I have never taken this event as marking any serious or lasting premiation of the best in cinema. Those who vote are mainly motivated by opportunism and commercialism. Foreign films are relegated to a tiny ghetto. The Academy Awards are part of Hollywood's survival mechanism. Don't be fooled into thinking that they are anything else.

"Avatar" has been nominated, and will certainly win in a number of categories. Also nominated, curiously enough, is a film of great value, "District 9." (Such aberrations happen in Tinseltown from time to time.)

"District 9," which pulls no punches, is a far more searing indictment of colonialism and ethnocentrism than "Avatar" could ever hope to be. See it.


Friday, January 22, 2010

N o t e

For the next few weeks blogging will be sparse, while I work on my book on the intertextual relations among Judaism, Christianity, and Islam--the Abrahamic religions. Obviously this is a tall order. I have, however, produced fragments of the projected work over the last two years or so on this blog. I am now pruning these and drafting some much needed additional material in an effort to create a continuous narrative.

Having taught elements of this complex of religious material in my college classes in art history (where these themes have inspired countless works of art), I have since come to realize how problematic and indeed deleterious this heritage is. However, one cannot simply throw it out, as the New Atheists urge. The Abrahamic memes have been--and still are--too important to Western civilization, as they are to every part of the world except for East Asia and the Hindu-Buddhist realms of South and Southeast Asia.


Tuesday, January 19, 2010


In an op-ed piece in today's New York Times, the journalist Shankar Vedantam makes a case for "colorism." He holds that perceptions of skin-tone variation affect the way people are judged--and their life chances--across the board.

This claim is too sweeping. To be sure, we have heard that in some African American families there is a tendency to favor children with lighter skin. Some darker-skinned blacks, like Thomas Sowell, have complained of this effect later in life.

Still, if Vedantam's claim were true all the tanning salons across the nation would have to close. Beaches would be deserted in the summer months. And actors like George Hamilton, famous for his deep tan, would have no chance in Hollywood.

Surely the key to the matter is that colorism becomes significant when it interacts with other socio-economic factors. If rich and powerful people choose to go about with deep tans, they will do so.

Mr. Vedantam also brings up his native India, where, he says, skin lighteners are popular cosmetic items. He should have noted that the Sanskrit term "varna" (color) means caste. The disapprobation of dark skins in India has to do with the fact that they are common among the lower castes, where centuries of prejudice have produced lower economic status. It is only when it is perceived as an index of such status that color becomes significant.

For his part, George Hamilton need not worry.


Saturday, January 16, 2010

Another indecent proposal

In the wake of the Haiti disaster, we are hearing the usual demands for increasing aid to that unhappy country. Yet reputedly, even before the earthquake some 10,000 aid organizations had been active in Haiti. But it isn't working. Apparently, they have been handing out fish, but not teaching people to fish.

It is time now to think outside the box. We should arrange to turn the country over to China. The Chinese will establish clean, well-lit factories. Wages will be very low, but some wage is better than none.

And there will, for a while at least, be none of that nonsense about freedom of speech and human rights. Troublemakers will be dealt with.

Moreover, once the system gets rolling there would be no problem of arrogant white people telling black people what to do.

We tried it our way. Let's try another way.


Wednesday, January 13, 2010

"A Single Man"

Currently playing in cinemas, the film "A Single Man" is certainly worth seeing. The movie depicts a day in the life of an English gay expatriate, George Falconer, a professor at a mediocre public college in Los Angeles. The director Tom Ford, whose background is in fashion, has taken a great deal of trouble to reconstruct the appearance and manners of Southern California as they were in 1962. I know because I was there. It is true that the houses of Falconer and his best friend Charlotte are almost unbelievably lavish, but this kind of lifestyle enhancement is normal at the movies.

The film makes one major departure from the 1964 archetype by Christopher Isherwood. In the novel George is simply depressed about the accidental death of his partner Jim a few months before. In the movie Colin Firth repeatedly brandishes a pistol, with which he intends to kill himself. As in the book, though, Falconer dies at the end from a seizure.

The film took me back to the book itself, which I had read in 1965 or so, not long after it came out. Since Isherwood did a couple of turns as a professor and lived in Santa Monica, the basis of the story is clearly autobiographical--à clef, with some invented garnishing thrown in. What struck me at the time was the absence of the lurid melodrama that characterized novels about gay men in those days. Allowing for his English origins and evident prosperity, George is a fairly ordinary citizen. This matter-of-factness was a distinct advance.

In another way, though, the book was not so advanced. It was a convention in novels of those days--Gore Vidal's "The City and the Pillar" comes to mind--for one or both of the male lovers to die prematurely, usually violently, at the end. They could enjoy happiness for a while, perhaps a few years, but then a big price must be paid. (In fact "A Single Man" is dedicated to Gore Vidal.)

Unfortunately, Isherwood's novel conforms to this baleful pattern. At the beginning of the novel, Jim, the architect partner, is already dead; George will die at the end. There is a more specific similarity. Typically sold in a brown wrapper, James Barr's Quatrefoil (1950) was a gay novel we all read in those days. It is about a love affair between two virile naval officers. One of them dies in an accident. In Isherwood's book Jim is a naval officer when George meets him at a bar in Santa Monica. He dies in an accident. Whether consciously or unconsciously, Isherwood seems to be echoing Barr's book.

I also have some reservations about Firth's performance, which makes George much too prissy and reserved. I was privileged to meet Christopher Isherwood several times in his later years. He was not prissy at all, but rather earthy, curiously enough. He went about calmly in a leather jacket, occasionally tossing off some well-chosen salty remark. He was altogether a real person. By contrast, Colin Firth is merely acting.

POSTSCRIPT. A sly touch is the surname Isherwood gave to Falconer's nosy Santa Monica neighbor, Mrs. Strunk. This moniker is almost certainly a kind of homage to William Strunk, Jr., coauthor of a popular writing manual called "The Elements of Style." Strunk and White (as it is usually termed) appeared in its classic edition in 1959, five years before Isherwood's novel. After further revisions the little book still ranks as THE canonical manual of "good writing" on American campuses.

In addition to a series of dos and don'ts, "The Elements of Style" suggests that the tyro writer stick to simple declarative sentences. That is the safest course for neophytes. As a quondam teacher of English, Isherwood doubtless resented the ubiquity of this little book, which he must have associated with the dumbing down of American culture. However, the joke was on him, since Isherwood's writing, never exactly bravura in the style department, abounds in simple declarative sentences.

However that may be, the English writer gets points for naming Mrs. Strunk's obnoxious son "Christopher."


Thursday, January 07, 2010

Mary Daly, radical feminist/revolting hag

Mary Daly, an icon of radical feminism, has died at the age of 81 in a nursing home near Boston. The expression "revolting hag" is a self-descriptor. It is accurate.

In 1973, shortly after I joined the ranks of the Gay Academic Union in New York, some eminentos of the group told me that proficiency in feminist doctrine was a prerequisite for any adequate understanding of gay liberation.

The better to access this seemingly essential knowledge, I picked up one of Daly’s books. I soon wished I hadn't. Reading this screed, I was appalled. Whenever it suited her, the Boston scholar freely invented facts and generalizations. As a kind of bonus feature, the book was laced with a scarcely concealed hatred of men. Over the years this misandry became more outspoken, as Daly enounced her wish that men be reduced to some 10% of the population. Apparently even genocide did not lie outside the boundaries of radical feminism. Put simply, hers was an evil doctrine.

It was said that Daly’s jaundiced view reflected her run-ins with the administration at Boston College, a Jesuit institution, where she taught theology for 37 years. However, these difficulties were of her own making. For almost thirty years Daly had engaged in the despicable practice of excluding men from her classes. They were “dysfunctional.”

Such discriminatory behavior is a serious violation of academic ethics, not to speak of federal law. For centuries minorities and women had fought for full access to knowledge. And here a woman was engaging in the same vile practice of exclusion.

Daly claimed that she was willing to provide separate instruction for men who wanted to take her closed courses. In view of the uncomfortable prospect of being given one-on-one instruction by a hostile professor, I wonder how many male Boston College students were prepared to take up this offer. At all events this is an instance of "separate is equal." But separate is NOT equal. It is segregation.

From time to time individual students would protest that they wanted to be admitted to the actual classes, so that they could benefit from the give and take of the group experience. Perhaps the benefit would be slight, but at least they would be able to judge for themselves.

Oh, no. Daly would not budge. In practice, she would handle the matter by taking a leave of absence, which she did several times. Once she returned to campus, this unethical professor would simply resume her unyielding policy of exclusion. Her contumacy was persistent and seemingly inexhaustible. In 1999, however, the long period of indulgence finally came to an end, as one student insisted on pursuing serious legal action. By this time Daly’s star, and that of radical feminism in general, had faded, and the hitherto craven college authorities decided to take action--at last. A negotiated settlement led to her retirement. A manifest charlatan, she should never have received tenure in the first place.

Ostensibly, Daly came out publicly as a lesbian in the early 1970s. As far as I know, no actual lesbian partner has ever been identified. In one of her more entertaining books, the Wickedary, Daly defines lesbian as "a Woman-Loving woman; a woman who has broken the Terrible Taboo against Women-Touching women on all levels [and] rejected false loyalties to men in every sphere." Did she ever touch anyone in a meaningful way? It seems that Daly was not merely misandrous--man-hating--but misanthropic as well. She herself personified the quality she claimed to oppose, necrophilia. She was antihuman and antilife.

To put it mildly, she was not a decent human being, but a seething cauldron of hatred. That said, what was the actual content of her radical feminist scholarship? Daly adhered to the doctrine of primordial matriarchy, a hypothesis formulated by the Swiss scholar J. J. Bachofen 150 years ago. No conclusive proof of this notion has ever appeared, as shown by Cynthia Eller’s scintillating exposé, "The Myth of Matriarchal Prehistory" (Boston: Beacon, 2000). Nonetheless, Daly was a True Believer, holding that under the succeeding historical regime of patriarchy men had “stolen” the insights that women had achieved in prehistoric times, when this utopia supposedly reigned. For example, Daly believed that the Christian doctrine of the Trinity usurped a “worldwide” belief in Three Goddesses.

A practiced adept in the dubious science of Victimology 101, Daly claimed that nine million witches had been killed in late medieval and early modern Europe. Most reputable historians believe that the true figure ranges between 60,000 and 100,000. Over the years I found that Daly's deplorable example of simply inventing evidence was eagerly followed by a legion of gay and lesbian pseudoscholars. They were apt pupils, and she was the Pied Piper of Hamelin. No proof was needed if the assertion was in a good cause, or what seemed to be such.

Raging largely unchecked far and wide for thirty years, this unprincipled advocacy scholarship came to rank as a major contributor to the decline of American academic standards.

To be sure, there are many varieties of feminism. Achieving legal and social equality for women is a very worthy goal, one that has made salutary advances over the years. But crazies like Daly have done this cause no good. And, as I have indicated, the resulting damage has extended far beyond the precincts of radical feminist advocacy.


Monday, January 04, 2010

Stephen Heersink, 1953-2009

It is with great regret that I report the death on December 29 of my dear Internet colleague Stephen Heersink, who conducted a remarkable website, The Gay Species. Curiously enough, we had never met in the flesh, since I was living in New York City and he in San Francisco, and we traveled little in recent years. We were virtually soul mates. While we had similar interests and views, we were both forthright--unafraid to voice disagreement with the other's opinions. By showing that "we could take it," we honed our own arguments. Stephen showed incredible energy in producing sometimes three or four postings in a single day, most of them festooned with helpful hypertext references (including not a few to my work). I find it hard to imagine getting up in the morning and not looking at the latest post at The Gay Species. He was also a "top 1000 reviewer" at Amazon, where his insightful comments may still be accessed.

Stephen was a brilliant thinker and writer.  He will be greatly missed.

Stephen Heersink was born in Central California on February 21, 1953 to a family of Dutch Calvinists. Recognizing that this background was somewhat narrow, in his senior year in college he sought instruction in Roman Catholic Scholasticism at a seminary in Berkeley. While resolutely secular in his mature views, Stephen knew a lot about the history of Christianity. In this way he was able to offer useful pointers concerning my ongoing manuscript on the Abrahamic religions.

At UC Berkeley and Mills College Stephen deeply immersed himself in analytic philosophy. While we disagreed on the value of that discipline, we both acknowledged a profound indebtedness to Karl Popper and Friedrich Hayek. A little later Stephen became a successful banker (a position from which he had retired). This background lent his analyses of the current economic crisis particular authority.

He was, of course, an unyielding defender of gay rights. It is appropriate, then, that his last posting was to felicitate the first gay marriages to achieve legal status in Latin America--in southern Argentina.


Sunday, January 03, 2010

My evolving political views

Over the years I have put together a few autobiographical reminiscences. These deal with persons and situations, to the neglect of my intellectual development, especially with regard to politics. That issue I will address here. Parenthetically, I note that, in discussing them with others, my politics tend to elicit indifference (because they are eclectic) or disparagement (because they fit no particular established pattern). Still, here goes.

My parents brought me up in a far-left political sect. We tempered our consumption of the “bourgeois” press with a reading of the Daily People’s World, the West Coast counterpart of the Daily Worker. Like many intellectuals of the thirties my stepfather had adopted the vulgar Marxism rife during those Depression years.

In our household I don’t remember any airing of such key issues of Marxian economic theory as surplus value or the purported progressive immiseration of the working class. In the immediate postwar period, when plumbers and truckdrivers began to earn more than professors, this stuff would not have had much traction. We were told, of course, that another Depression was just around the corner (NOT). The main thing I remember absorbing from those conversations and readings was a Manichaean view of the contemporary global situation in which the valiant “progressive forces” (that is, the Warsaw Pact nations dominated by Moscow, and Mao’s China) were arrayed against the evil nemesis of capitalist plutocracy. Without question the US was always the archvillain in this process, a view that I have found wearyingly replicated over and over again in later dissident movements. This is so even now that the Soviet Union is dead and gone. As far as I can see, the unending flood of screeds of Noam Chomsky simply mimics this hoary and simplistic sheep-and-goats doctrine.

Some averred that the only hope for change was the presidential candidacy of Henry Wallace, who ran in 1948 under the aegis of a third party. In fact, the hapless Wallace, whose main expertise lay in agriculture, was manipulated by his Communist and fellow-traveler advisers.

At the age of 16, however, I got off the bus, the Comintern Express. The precipitating event was the defection of Marshal Tito’s Yugoslavia from Soviet allegiance in 1948. I wrote a long letter, a kind of cri du coeur, to the Daily People’s World, asking how a stalwart champion of the people could so suddenly turn into a “social-fascist beast.” No answer came. Of course the excoriation simply reflected the fact that Tito had had the temerity to defy Stalin--and to get away with it. Stalinism, enforced by the party line, pulled the strings that made all the puppets, including my foolish parents, dance.

I then deprogrammed myself by reading two authors, George Orwell ("1984" and "Animal Farm") and Arthur Koestler ("Darkness at Noon"). I later came to find Orwell a narrow and simplistic puritan, riddled with misogynistic, homophobic, and other suburban prejudices (he denounced "pansy" poets). For his part, Koestler returned to his first love, the history of science. I followed him in this interest, as seen most notably in his brilliant treatise, "The Act of Creation." Two recent biographies have highlighted Koestler's personal failings, charges that may well be true. But for me his life was one of the most emblematic trajectories of the twentieth century.

Any perceptive person can benefit from offtrack experiences such as my Commie education, tossing out the dross (lots of it) and retaining what still seems of value. In keeping with their beliefs, for example, my parents sought out and made friends with black people, whom we sometimes entertained in our home. Another thing I gained from this misguided though formative orientation was a healthy skepticism about our two major parties--or rather the Demopublicans. Their alternating dominance is simply a series of switches from Tweedledum to Tweedledee and back again. The reason, of course, is the Permanent Government in Washington DC, staffed by venal career bureaucrats, ruled by lobbyists awash in money, and abetted by a disgraceful, toadying press. Today the truth of this principle seems to be affirmed once again, as the Obama policies more and more mirror those of George W. Bush. Only the rhetoric changes.

From time to time a third party arises, only to fall by the wayside. By and large the Anglo-Saxon political system does not permit such pluralism. We are resolutely binary. Does this realization lead to despair? Not necessarily, because of the success of movements organized around particular goals, as seen in the civil-rights, women's, and (to some degree) GLBTQ movements. I write the acronym reluctantly, as it points to a degree of fragmentation ("diversity") that is not helpful.

In college I took a worthless course in that misnamed discipline Political Science. It was only in the 1960s that I began to read on my own in this field. As a medieval scholar I found, curiously enough, succor in that era, which invented the concepts of separation of powers, representative government, the common law, and the just war. (The latter, however, gives me pause, as the criteria for determining which wars, if any, are just, seem elastic, all too conveniently so.)

My own views have been marked most profoundly by the writings of Karl Popper and Friedrich Hayek. (I attended Popper's seminar in London.) My gay "libertarian" friends claim the same heritage, but in fact most of them are simply neocons with some surface camouflage. Still, if pressed for a label, I would say that I am a libertarian anarchist--but not entirely, since I retain Popper's hope for a better world, which can only be constructed with the aid of "partial planning."

I also maintain a strong dose of political Realism, honed by my readings of Machiavelli and Thomas Hobbes. Time devoted to these towering figures is never lost.

At this point I copy some remarks I made on this blog in 2005 regarding some less well-known thinkers in the Realist vein, Gaetano Mosca, Vilfredo Pareto, and Robert Michels. Reflecting on his experience in Italy, Mosca (as early as 1893) posited that all societies, whatever their formal constitutions and public rituals, are controlled by an elite political formation. This harsh dynamic acknowledges only two social categories: the rulers and the ruled. Mosca’s ideas, and those of his contemporaries Pareto and Michels, differ from those of Marx in that the ruling group is composite, rather than unitary, and therefore not a class in the strict sense. In my view, Marx’s idea of the ruling class was more traditional, in that he envisaged a kind of pseudo-kinship group modeled on, though not the same as, the traditional nobility.

Conventional wisdom assigns Mosca, Pareto, and Michels to the Right. However, a similar point was made by Sidney Webb, the Fabian who, together with his wife Beatrice Webb, was one of the founders of the British Labour Party. Sidney noted that "[n]othing in England is done without the consent of a small intellectual yet practical class in London, not 2,000 in number." Edwardian England was both centralized and close-knit, and probably one has to assume a somewhat larger, more diffuse elite in other countries.

As Vilfredo Pareto emphasized, the pool of the ruling elite is constantly and continually refreshed, as new members find access. Yet the absolute number of these is small--it cannot be otherwise. This changing configuration, shifting by a continuing process of minute adjustments, helps to maintain the Participatory Illusion that would-be players cherish. "If Henry Kissinger could make it to the pinnacle of power, then maybe I can too." In fact, this outcome is very unlikely, perhaps fortunately so for those of us who are ruled.

Robert Michels aptly summarized this situation as the Iron Law of Oligarchy. This law applies to all kinds of societies, whether they be nominally democracies, monarchies, or authoritarian states. Moreover, size matters. The bigger the society, the more necessary—or at least convenient—it is that this ruling elite control matters.

In the old USSR this situation came out into the open (after a fashion) in the concept of the Nomenklatura. The term derives from a formal list (always hard to access) of privileged Party members who make all significant decisions. Oddly enough, in that respect the old Soviet Union was more transparent than the US today. As we have seen, however, the social mechanism is generally applicable--above all to societies like our own, where regrettably its workings are obfuscated as much as possible.

Does this reality mean that individuals such as ourselves (who do not belong to the ruling elites) can expect to have no influence at all over policy decisions? On the whole that is just what it does mean, though there are some marginal exceptions. If they are wise, elite members in good standing will occasionally consult friends who stand outside the magic circle of power. However, if these seemingly consultative players seek, as a result, to foster a policy that goes counter to the collective wishes of their comrades, they will be instantly overruled. If it is a project that the group has already decided to undertake, the advice of the kibitzer is superfluous. At the end of the day, then, the actual influence the outsiders can bring to bear through this channel is highly circumscribed.

It is said that non-elite individuals can make a difference by joining together to form pressure groups. In union there is strength. Even here, though, the leverage accorded to non-elitists is exiguous. The officers of pressure groups are usually themselves members of the elite, whose bidding they are more likely to do than that of their members.

For a time at least mobilization efforts such as those of the civil-rights and women’s movements can effect change. Another example, more narrowly focused, is ACT-UP, which has had a beneficial effect on medical policy. Given enough “testicular pressure,” those who manage the elites will yield, though only up to a point. Their overarching goal, which they pursue with only the most minimal deviations, is to maintain power.

Occasionally there are popular upheavals, as in the massive opposition to the Vietnam War. Yet when it came to deposing President Nixon, that change was deftly managed by a few key players on the inside, who had made sure that one of their more pliable colleagues, the dimwitted Gerald Ford, would take the place of his disgraced predecessor. The king is dead, long live the king!

I do in fact see hope in the rise of the blogosphere. A few of the bloggers are very widely read and quoted. Most though are not. The Iron Law of Oligarchy, it seems, extends its pall even over the blogosphere. Yet at least the blogs hasten the process of the circulation of elites. Andrew Sullivan is in; David Broder is out. Fresh faces may mean better policies. Or so we may hope.

In closing, two objections to the above sketch of the Iron Law of Oligarchy may be noted. First, the analysis seems unduly bleak and pessimistic. In fact, we may easily observe contemporary societies much worse than the managed one we live under now. Examples are the kleptocracies that dominate much of the Third World. Pareto might well have agreed with Churchill that elitist democracy is the worst system in the world—except for every other. Still, it makes sense to go about the world with our eyes open.

The second objection is that mine amounts to a conspiracy theory. Along these lines, there have been attempts to pinpoint the loci of the elite conspiracy—the Club of Rome, the Trilateral Commission, and the Bohemian Grove clique. Yet my theory differs from pinpointing of this type, for it posits a set of arrangements that are looser and pretty much out in the open, if one will simply look to see. There is no need to leave the living room. Watching C-Span on a regular basis shows the ruling-elite folks doing what they do best, talking to each other. Like some privileged prisoner, one can witness this spectacle, but is not allowed to participate.

In this short summary of the Iron Law of Oligarchy I have presented an ideal type. What would be needed to put flesh on these bones would be a series of case studies. One might begin with certain think tanks, such as the odious Council on Foreign Relations and the Rand Corporation. Doubtless such studies exist; the task would be to correlate them.

FOOTNOTE. Much of my recent thinking about these matters has been guided by my discussions (sometimes down-and-dirty) with the brilliant thinker and blogger Stephen Heersink of San Francisco. Alas, such discussions have now ended, for Stephen died on December 28. An obituary will follow. I am glad to say that much of Stephen's work is perpetuated at his blogsite, gayspecies.blogspot.com.