Saturday, December 31, 2011

An End of Year Look at School Vouchers

In an end of the year article in City Journal, Marcus A. Winters celebrates 2011 as “The Year of the Voucher.”  After a decade of slow movement on expanding the voucher approach to school choice, Winters reports, suddenly 2011 saw legislatures in 12 states adopt voucher policies or expand existing programs, and the U.S. Congress saved the D.C. voucher program at the last minute.  This is good news for many families, especially those with low incomes, but I’d caution against seeing vouchers as the solution to educational shortcomings in the United States.
Vouchers and charter schools are two kinds of market-based approaches to schooling. Vouchers subsidize demand. Their purpose is to increase buying power among consumers of education (families) and give those consumers the power to choose among providers (schools). Charter schools subsidize supply. Their purpose is to increase the range of school types by funding institutions with competing strategies for education, with the idea that the consumers can then choose among many different providers. Both of these market-based approaches represent efforts to break down state monopolies on public schooling. State monopolies, choice advocates argue, have no motivation to provide high quality or varied goods to consumers. Just as the stores in the old Soviet Union were scantily stocked with inferior merchandise, so a monopolistic school system stocks its classrooms with shoddy curricular offerings and mediocre teaching.
The poor in this market perspective suffer the greatest damage because they are most subject to monopoly.  Higher-income families can opt out of low-performing schools by moving to other school districts, by enrolling in private or parochial schools, or by home-schooling. In effect, then, our present public school system is a competitive market for those who have sufficient resources, but a monopoly for the disadvantaged.
I cautiously support vouchers (and charters) as ways of improving schooling for some families in some places. But I don’t have as much faith in them as Winters and other choice advocates. This is because, although I do see schools as competitive markets, I think most exponents of school reform of all varieties misunderstand just what kind of markets they are. Specifically, voucher and charter advocates tend to describe students at different times as consumers of schooling and as products of schooling. But students (and the families and communities from which they come) are also, very importantly, producers of schooling.  The quality of a school depends, more than anything else, on the efforts and dedication of the students, who bring their capacity to learn to the school from their families and communities. Each student shares whatever capacity for learning he or she brings to the school with all other students. For this reason, when parents have the opportunity to seek a good school, what they really want is a school with highly motivated, well-prepared students who will create a desirable environment for their own children. By giving low-income families choice through vouchers or some other kind of subsidy, we essentially offer ambitious, committed poor families a chance to escape from concentrations of bad students.
Ironically, even market-based subsidies can worsen the problem of education by encouraging misplaced accountability. If we maintain that students don’t learn because their schools are failing them, we are preserving the myth that students succeed or fail because of something that systems or policies give them or don’t give them. By adhering to the illusion that everyone can and will achieve some accepted level of learning if only officialdom gives them the right program, we move responsibility away from the social structures that primarily shape children.  It will probably help some people in 2012 if there are voucher programs in more places, and those programs won’t be worse than the present system. But let’s hope in 2012 we can get away from the idea that tinkering with the education industry can enable schools to crank out the right kinds of products.

Friday, December 30, 2011

Remembering the Amerasian in the Saigon Coffee Shop

            The tropical glare squats at the edge of the shade,
            studying arcs traced by our coffeecups
            in the rise and fall between crude wooden tables                  
            and our lips. Our rhythms are regular as heartbeats.
            Overhead, coconuts are swelling to self-sacrifice.

            We're taking a break from history.
            All the singers in the boom-box
            are maidens wailing for soldiers,
            soldiers wailing for maidens;
            there's no telling which war is in which song;
            the same enemy keeps changing uniforms.

            Jagged bits of your unknown father's face
            keep falling out of disoriented features.
            I try to fit them together,
            as I try to assemble the words I know
            in sentences and reshape them to my tongue.            
            When you talk the words dash out like small birds                
            and your hands swoop after them like birds of prey,
            a quickness acquired from years of street-life,
            selling peanuts and yourself and cadging petty coins.

            What will it be like in the country of my waking,
            the country of your dreams?
            When will you wake up there?
            Will you wonder, like Chuang-tzu,
            whether the dream was before or after the waking?
           
            You search my round eyes and long nose
            for pieces that will fit your face.
            Every my (your name for us means "beautiful")
            is a father in your eyes. Listen,
            when I smile, it means I have no face to lose
            or share.

            After the sweet coffee, the shopkeeper
            brings a jar of bitter Chinese tea.

Note: Amerasians (children of U.S. servicemen and Vietnamese mothers) were often despised outsiders in Vietnam, known as “the dust of life.” They began relocating to the United States under the Orderly Departure Program in the 1980s and their numbers increased greatly after the U.S. Congress passed the Amerasian Homecoming Act in 1987. When I worked with the U.S. refugee program in the 1980s, Amerasians begging and selling peanuts were a common sight on the streets of Saigon (Ho Chi Minh City). The last time I visited, in 2004, I saw none. The youngest Vietnamese Amerasians are now in their thirties.

Thursday, December 29, 2011

History Ends or Turns Again


Francis Fukuyama has moved on since his 1989 essay “The End of History?” and the 1992 book that grew out of it, The End of History and the Last Man.  Most recently, he has finished the first volume of a massive study of the origins of political order.  Earlier this year, I heard Fukuyama give a somewhat disorganized presentation of the new work. I’ll have to read the book before I decide what I have to say about it, but I was intrigued by his suggestion that religious beliefs and institutions have been critical in driving political and social change.  But no matter how many hefty tomes he piles on our library shelves, he’ll probably always be known as The End of History Guy.
The essay and the subsequent book enjoyed three big advantages in the competition for public attention. First, they carried an impressive and prophetic title. Second, they proclaimed that the world was reaching a historical resolution just when Soviet Communism was collapsing. Third, they made grand assertions that begged for debate, creating a Fukuyama cottage industry for professors and political commentators. The industry continues even today, with John Arquilla offering one of its latest products in the journal Foreign Policy.
Fukuyama did not argue that history would end because we would all sit down and stop doing things. To oversimplify his claim, “history” is not just one thing happening after another but the competition of social and political ideas and systems.  Drawing on Hegel, Fukuyama saw history as having a direction, and argued that the direction ran toward liberal democracy. History was ending because this end point was becoming clear.
In his cleverly titled “The (B)end of History,” Arquilla argues that Fukuyama was wrong to say that political systems have reached a resolution and that the events of 2011 demonstrate how he was wrong. Arquilla maintains that we may have reached the end of conflicts between ideologically based nation states, but that this represents a turning toward a new kind of historical action. The Arab Spring, the Tea Party, the Occupy movement, and even al Qaeda represent the bend in history toward “loose-knit, largely leaderless networks.”
I sympathize with the view that history (even history narrowly understood as competing political goals) doesn’t end so much as it shifts and changes. But I think Arquilla also might be making excessive world-historical claims for today’s headline news.  We do see a lot about networked social movements, but these are not entirely novel, in spite of their use of new social media technology.  I think about, say, the Grange movement among American farmers and the populism it created, and even about the pre-internet movements that ended the Soviet Union.  Since today’s networked movements are such a new turning, moreover, we don’t know which ones will end up being incorporated into more traditional political organizations, which ones will operate as external pressure groups, which ones will disappear entirely, and which ones will transmogrify into different movements.  If they are to be effective, movements must somehow be formalized. If the uprisings in Egypt are to produce a new kind of government, for example, then they must end in an elective parliament.  And that would be a traditional political form.
Ultimately, it seems to me just too early to say that loose networks have become the new direction in history. As Hegel observed, the owl of Minerva only flies at dusk.  Who can say when we’ve reached the end of a world-historical day?

Wednesday, December 28, 2011

The New Knowledge Economy?

Josef Joffe’s review of Thomas Friedman & Michael Mandelbaum’s new book, That Used to Be Us, contains the following bizarre but common claim about how America’s current economic difficulties differ from those of the past: “Back in the Eisenhower days, Little Johnny couldn’t read so well, but so what? He could still take his place in the country’s humming industrial machine. Today, he can’t get a job because (a) net job-growth has been zero for the past decade and (b) low-skill, high-wage jobs are disappearing forever. Nor is this just Johnny’s problem. Behind him lurks an education system that isn’t equipping children with the intellectual capital in demand in the new knowledge economy.” Yes, we’ve heard that over and over. Today’s America needs only high-skilled, knowledge intensive workers and we aren’t producing enough of them. That claim, however, is patently false.
Back in 2006, Forbes published an online article entitled “The 10 Hardest Jobs to Fill in America.” Engineer was indeed number 1. However, the list also included truck drivers, explaining, “They are hard to recruit because they have to be away from home for long periods, receive low wages, work very long hours and put up with a fluctuating workload.” Another hard-to-fill job was laborer. Forbes explained the shortage by the fact that “This is very physical, unskilled and often repetitive work at low pay.”
How have things changed since the Forbes article? A recent MSNBC article by Eve Tahmincioglu quotes Peter Creticos, president and executive director of the Institute for Work and the Economy: “If you look at the job growth distribution of the last two recoveries, it suggests we’re going to see growth of a lot more lower-income jobs.”  The jobs the article lists as the eight lowest-paying jobs in the country, including food preparation workers, cashiers, and home care aides, are those you can see in the help-wanted pages every day.  So, maybe the problem is not that our schools aren’t successfully turning everyone into highly skilled information workers. Maybe the problem is that we expect everyone to be a highly skilled information worker in an economy that really needs laborers, truck drivers, cashiers, restaurant workers, and home health care aides.

Tuesday, December 27, 2011

Thailand, the Monarchy, and Free Speech

Recent U.S. Government comments on Thai lèse majesté laws have stirred some controversy within Thailand. Normally a close U.S. ally, Thailand is unaccustomed to the anti-American sentiments common in other supposed allies, such as Pakistan.  However, in this situation, American officials are dealing with what may be the most sensitive and difficult issue in the Southeast Asian nation.
In recent decades, freedom of expression has been greater in Thailand than in almost any other Asian nation except, perhaps, Japan. The one big exception to this rule has been the monarchy: any criticism of the king or the royal family is strictly prohibited.  We Americans have difficulty sympathizing with this exception, or even understanding it. Our nation originated in the rejection of an already limited monarchy, and in the intentional creation of a system of government through laws. The monarchy, though, was largely the origin of Thailand. The historic Thai kingdoms centered in Sukhothai, Ayuthaya, and finally in Bangkok, grew out of the centralization of feudal nobility under royal authority. On the model of the “wheel-rolling” king of the ancient Indianized states of Southeast Asia, the monarchs of Thailand were invested with divine authority, and this was the basis of their claim to popular allegiance. The king was the country.
Fortunately, Siam, as the nation was known until 1939, enjoyed a series of extremely capable kings.  Rama IV, or King Mongkut, was a former Buddhist monk who, after he ascended to the throne, became an astronomer, a polyglot, and an astute political strategist. The Thai will often boast that theirs was the only nation in Asia to avoid colonization, and this was due more than anything to the cleverness of Rama IV in playing the neighboring French and British off against each other. Rama IV also embarked on a program of modernization that included even his own family, bringing in an English governess named Anna Leonowens, whose somewhat distorted tales became the basis of plays and movies that many Thai still consider offensive.
Rama V, King Chulalongkorn, continued and intensified the modernization efforts of his father. In the process, the kings created two institutions that would challenge the claim to absolute royal authority:  a professional military and the civil service. These two institutions brought about the revolution of 1932, which retained the king but turned him into a greatly revered national symbol, instead of an absolute ruler. Symbols are important, though. For much of the largely farming population, the king remained the nation, regardless of who made policy decisions.
During World War II, Thailand pulled off the feat of being on both sides, an accomplishment that would have made Rama IV proud. The pro-Japanese Thai military allied themselves with Japan, but many of those in the Thai government, especially in the civil service, sympathized with the Allies. The ambassador to the United States, for one, refused to deliver the declaration of war on the Allies, and the pro-Allied faction took power when it became evident that the Japanese would lose the war. Anti-Japanese partisans operated within the country and Japanese soldiers were unpopular. As one older lady told me in the 1980s, “we did not like the Japanese. They had bad manners and they bathed almost naked in public.”
King Rama VIII lived in Switzerland during the war and became a popular image of transcendent unification when he returned at war’s end. However, he was killed in 1946 under circumstances that still remain unclear, and his younger brother, Bhumibol Adulyadej (pronounced, roughly, Bpoo-mee-pohn Ah-doon-yah-det), succeeded him as the present king, Rama IX.  Now the world’s longest-reigning monarch, he owes his near-sacred status to the remnants of the pre-1932 divine kingship, to his status as the key national symbol, and to his personal virtues and good works. The most recent in a series of intelligent and benevolent monarchs, he has mostly contributed to the well-being of his subjects through apolitical development projects, only stepping in to mediate Thailand’s frequent political crises at carefully chosen strategic moments.
Since the king rose to the throne, Thailand has gone through a bewildering variety of administrations and regimes, changing by coups and by elections. A Thai acquaintance once told me:  “You Americans are your democracy. We are our traditions.” The king sums up and represents the traditions. This is why the royalty is an exception to the rule of freedom of expression, in somewhat the same manner that the monarch stands above and outside the system of government.
Traditions and nations do change, though. Perhaps the biggest change for Thailand has been the rise of a prosperous middle class. I remember in 2004, when I was lecturing in Paris as part of a faculty exchange between my department and the École des hautes études en sciences sociales, I was surprised to hear Thai spoken several times on the streets and in the trains. I talked with some of the speakers and found that they were not wealthy jetsetters, but teachers and office workers.  You know that a country has achieved a large middle class when its ordinary citizens start showing up as tourists in Paris.
Despite the large and growing middle class, though, the majority of the Thai population remains poor and rural.  From 2001 to 2006, the wealthy businessman Thaksin Shinawatra (roughly, tahk-sin shin-ah-waht) served as prime minister mainly with the support of the rural poor. However, suspicions of his demagogic approach to government and his concentration of power alienated much of the urban middle class and parts of the military leadership, and his public works program apparently enriched his own companies. He was also accused of insulting the royal house. The Thai military pushed him out in a controversial coup, only to see his younger sister, Yingluck Shinawatra (ying-lak shin-ah-waht), elected as the country’s first female prime minister in 2011.
Thailand is, then, currently internally divided. Traditionalists see the monarchy as essential for stability. At the same time, the king is now ailing, which heightens anxiety over the monarchy.  The heir, Prince Maha Vajiralongkorn, who at 59 is one month older than I am, lacks the king’s wide popularity. This may change once he takes the throne, but at present there are reasons to be concerned about what will happen once the stabilizing presence of the current king is no longer there.
While some in Thailand favor liberalizing the lèse majesté policies, the response to worries over national stability has been a rash of prosecutions of individuals accused of insulting the royal family, with hefty jail time handed out for what in the United States would be considered constitutionally protected free speech.  One Thai citizen received a sentence of twenty years. Perhaps most troubling to Americans, Thai-born U.S. citizen Joe Wichai Commart Gordon received two and a half years for publishing in Colorado a banned biography of the king that he had translated into Thai.
Given the sensitivity of the issue of the monarchy in Thailand, it seems to me that it would be wise for the United States to avoid all public commentary. Condemning what we see as clear violations of human rights will only make us look like we are intervening in that nation’s internal affairs and risk alienating many Thai who may actually oppose the crackdown. We should certainly not abandon our own citizen, but public pronouncements have probably hurt his situation more than helped. It would have been much better to work behind the scenes and make it clear that prosecuting American citizens for things they write or say in the United States can damage U.S.-Thai cooperation. If we want to promote the cause of free speech, rather than indulge in moral display, it seems to me that the best thing we can do is to allow the Thai to reach their own resolution.

Monday, December 26, 2011

Occupy Paranoia

In 1964, Richard Hofstadter published an essay entitled “The Paranoid Style in American Politics.”  “I call it the paranoid style,” he wrote, “simply because no other word adequately evokes the qualities of heated exaggeration, suspiciousness, and conspiratorial fantasy that I have in mind.”  Hofstadter acknowledged the influence of this paranoid style in popular movements throughout American history, but he directed his attention most immediately toward what he characterized as the right-wing political thinking of the day, especially toward anti-Communism. Of course, one response to Hofstadter’s claims might be that he was constructing an ad hominem argument (“you are wrong because you are paranoid”).  Was the anti-Communism of the fifties and early sixties really paranoia, or was it recognition of a genuine conspiracy?  Following the fall of the Soviet Union and the publication of materials such as those in the Venona program, it is today evident that Soviet agents did engage in conspiratorial activities within the United States. The anti-Communist paranoids were wrong only in attributing excessive competence and coordination to the conspirators.
Still, although we might criticize Hofstadter’s essay as an effort to disguise partisan rhetoric as historical commentary, there is something to the concept of a paranoid political style. Not long ago, I read a collection of Gordon S. Wood’s essays, which contains his 1982 adaptation of the Hofstadter conceit to discuss ideas of causality in the eighteenth century. Wood maintains that one of the reasons Americans were so suspicious of the intentions of the English parliament in the years leading up to the Revolution was the commonly shared eighteenth-century understanding of causality as a matter of the intentions of individual actors, rather than as the mechanics of impersonal and unintended historical forces. In reading Wood’s essay, I thought that Wood (like Hofstadter) might too readily assume that paranoids don’t have enemies, and that the English government did not harbor some genuine intentions to limit American autonomy and subordinate American interests. I also thought that Wood was too ready to attribute the “paranoia” to a peculiarly eighteenth-century approach to social thought. The inclination to believe that everything happens because some group of actors is pursuing a virtuous or nefarious end may be more of a universal human tendency than a product of eighteenth-century economic and political complexity outpacing social philosophy.
If the paranoid style is constant, the question may be: when is paranoia reasonable? In other words, when does it make sense to see events as products of small groups of people acting in concert to affect the lives of large numbers of others, and when does it make sense to see events as unintended consequences of actions or institutional structures?  As I look at the various Occupy movements around the United States today, I tend to see them as manifestations of the paranoid style. If home values have been dropping and foreclosures have been increasing, this must be because the mortgage industry has sought to victimize homeowners for the sake of profit. If unemployment has been increasing, this must be because neo-liberals want to drive down American wages and seek more exploitable workforces at home and abroad. If students graduate from college with heavy loans, this must be because they are victims of a deceptive, self-seeking student loan industry. If economic inequality has been growing, this must be because the wealthy are attempting to deprive everyone else and concentrate resources in their own hands.  Ultimately, all evils come together in the machinations of the 1 percent cabal on Wall Street.  Our social and economic issues are, therefore, all questions of morality and justice, and outrage and demonstration are the most appropriate responses to them.
I think in the case of the Occupy movements, the paranoia is indeed true paranoia, a perception of a conspiracy that doesn’t exist. Profit-seeking did play a part in the mortgage bubble, but so did the egalitarian ideology of universal home ownership.  Unemployment has increased, but this was due mainly to the mechanization of domestic labor and to a more inter-connected world labor market. College is expensive and depends heavily on student debt, but this is largely the unintended consequence of efforts to extend college education through subsidizing demand. Economic inequality fluctuates over the decades, but current increases in relative inequality are due more to the financialization of the American economy than to the malevolence of the wealthy.  If the occupiers could somehow dispossess the putative plotters on Wall Street or force CEOs to take smaller pay packages, this would solve nothing.

Sunday, December 25, 2011

Christmas 2011

Yesterday evening my family and I celebrated Christmas Eve by visiting a local Chinese church on invitation of a Chinese friend.  The activities began with a skit by the youth group, portraying contemporary young people faced with dilemmas in living up to the Ten Commandments.  Three of the kids, dressed up as figures from the Old Testament in kaftans, burnooses, and false beards for the two boys, offered comments on temptation and morality, while the others struggled with making the right decisions in common adolescent situations. This dialogue consisted entirely of American-accented English.  Everyone in the audience seemed to enjoy the performance, even though it may have been incomprehensible for many of the older people.
The pastor delivered his homily on the Gospel Christmas story in Mandarin, pausing regularly for one of the ladies of the church to provide an impressive immediate translation into fluent English. A small choir singing traditional Christmas carols in Chinese followed this. I wondered if setting lyrics of a tonal language to Western music poses special challenges. The evening ended with the story of the Nativity, acted out, of course, by the small children of the church. This was the part everyone seemed to like best. For crowd-pleasing, you can’t go wrong with cute little children, especially when some of the people in the audience are their parents.  This was also the most completely understood event, not only because everyone knows the story, but also because narrators rendered it in Mandarin, Cantonese and English.  The adult organizers apparently ran out of sheep costumes and ox costumes are probably hard to find, so the animals in the stable included a little tiger and a little lion. I liked the idea that not only did a lion and a lamb kneel down together, but they even included a tiger in their fellowship.

Saturday, December 24, 2011

Thoughts on Japan

At a Christmas party last night, I spoke with a friend who translates advertisements, contracts, and other business documents from Japanese to English for firms in Japan doing business in the United States. He told me that he’s noticed his work orders always begin to increase just before the Japanese economy starts a spurt in activity, and that he has been very busy lately.  That may be nothing more than his impression, but disaster recovery can often be a stimulant. One of my wishes for the New Year will be that the Japanese people can go beyond rising above the disaster of the past year and emerge from the doldrums of the past two decades.

Friday, December 23, 2011

Iraq: What Lessons Should We Learn?

As the United States withdraws its troops from Iraq, the sectarian and ethnic divisions in Iraq threaten its fragile unity.  Against the backdrop of dozens of people across Baghdad dying from explosions, Prime Minister Nuri al-Maliki, a Shiite, has accused Sunni Vice-President Tariq al-Hashimi of organizing the vice-presidential security detail into a death squad. The vice-president has issued counter-accusations and has taken refuge in the semi-autonomous Kurdish region.  Interviewed on the PBS Newshour, foreign affairs expert and University of Chicago Professor John Mearsheimer said that U.S. problems in Iraq have resulted from not learning the lessons of Vietnam. My own view is that our national misapprehension goes much deeper and that it is less the consequence of our not having learned the right lessons from failure in Vietnam than of our having learned the wrong lessons from success in World War II.
Before World War II, the United States had, by historical standards, a fundamentally prosperous and growing economy even in the troubled times of the Depression. We had not maintained a strong enough military to defend ourselves, trusting in our broad oceans, but the country quickly responded to the demands of war. By concerted effort, we mobilized our resources and, with our allies, achieved victory over Germany and Japan. The United States emerged from the war an economic and military superpower. We completed the victory over our former enemies by occupying them, and after occupation they came out of the rubble and became free and democratic societies, as well as dynamic producers of goods and wealth.  Our global competition with the Soviet Union, along with our apparent success in rebuilding Germany and Japan, encouraged us to believe that we had the power to shape nations around the world in our own image and that to fail to do so would be to allow the Communist forces to shape nations in their image.
At home, the experience of World War II and the ensuing Cold War encouraged us to believe that governmental campaigns could remake our own society. Not only could we, in the words of President Kennedy, “pay any price, bear any burden, meet any hardship, support any friend, oppose any foe, in order to ensure the survival and success of liberty” abroad, we could, by dint of policy and national will, remake America into the ideal society. It was no coincidence that the Vietnam War and the War on Poverty together dominated foreign and domestic policy under President Kennedy’s successor. Both were consequences of the post-World War II belief that if we only mobilized our efforts, we could achieve all things through policy.
Our apparent successes in post-war Germany and Japan were not entirely results of our interventions in those countries, though.  Germany had been rapidly developing since national unification in 1871. Defeat in the two world wars disturbed its trajectory, but did not end it. Japan, similarly, had been on the rise since the Meiji Restoration in 1868. Both nations may have been physically destroyed by war, but they retained the cultural capital that had been driving them forward. Both also had democratic political traditions, in spite of their fall into dictatorship. The United States did not fundamentally re-shape those countries so much as it allowed them to start again on their own.
The belief that we could export our political values and institutions became part of a distorted version of containment strategy in the Cold War. The memoirs of diplomat George F. Kennan and the new biography of Kennan by John Lewis Gaddis attest that Kennan, often regarded as the original theorist of containment, did not think that the United States could halt the spread of Soviet power by intervening to make democracies all around the supposed free world. Kennan, in fact, was an acerbic critic of Wilsonian international idealism. Instead, containment as he formulated the idea meant maintaining our defenses while using diplomatic means to fence in the Soviet Union until that nation would alter its own course.
I would agree with those who argue that invading Iraq was a mistake because it diverted us from the war we needed to fight with the rulers of the country that sponsored an attack on us, Afghanistan. Once in Iraq, though, we should have stopped at the initial goal of overthrowing Saddam Hussein, declared victory, and left.  Our confidence that we could build a new political society and re-shape the entire region, that the order in Iraq would be as we wanted, and that our greater enemy, Iran, would not emerge stronger came out of our post-World War II delusion that federal policies can remake the world at will.
What lesson should we learn, then? I argue that at home, we should aim at maintaining representative government within Constitutional limits.  Abroad, we should set the best example we can and prepare our military for our own defense. We should produce things that people in other countries want to buy and buy the things that we want from them.

Thursday, December 22, 2011

In Search of Civilization

The word “civilization” once represented confidence in complex social systems in general and in the social systems of the ancient Mediterranean world and its European successor in particular.  Derived from the classical Latin term “civilitas,” which refers to the art of civil government or to citizenship, the English word only came into use in the modern period, probably about the middle of the seventeenth century, to describe a highly developed state of society or people who adhere to the norms of a highly developed state of society.  “Civilitas” was already ambiguous in antiquity.  The Romans used it to translate the Greek “πολιτική” (politike), or the means of governing a civil community.  In the classical Greek tradition, this often meant the management of small city states through face-to-face interactions, but the imperial Romans extended the concept to refer to behavior and institutions that could be spread over vast stretches of territory and different peoples.  Being civilized, in the modern sense, carried the varied senses of being part of a broad geographic or imperial order, participating in sophisticated and orderly cultural and political interactions, or conforming to the norms and traditions of a heritage based on Greco-Roman civility or a social pattern in some sense analogous to it.
“Civilization” always calls to my mind Vachel Lindsay’s now politically incorrect poem, “The Congo,” which lauded the triumph of civilization over African “mumbo jumbo.” The condemnation of the Belgian mission civilisatrice in Adam Hochschild’s King Leopold’s Ghost (1998) is one of the most recent illustrations of the contemporary loss of confidence in our civilization. It echoes the well-known retort of Gandhi, who, when asked what he thought of western civilization, replied “I think it would be a good idea.”  This loss of confidence arguably began to spread in the years following World War I, when Ezra Pound described the modern developed world as “a botched civilization.”  By the end of the twentieth century, challenges to Euro-American dominance of the world reinforced cynicism about the past and future of western civilization and about the very concept of civilization.
We may have carried cynicism too far and risk rejecting the values of civilization along with its imperfections. John Armstrong, a Philosopher in Residence at the Melbourne Business School and Senior Adviser to the Vice Chancellor at Melbourne University, offers a fresh look at this idea in In Search of Civilization: Remaking a Tarnished Idea (2009).  Armstrong examines the different uses of the word and attempts to say how those uses overlap and to suggest what value the idea of civilization might have. He identifies four major senses of the word: civilization as belonging, civilization as material progress, civilization as the art of living, and civilization as spiritual prosperity. He sees being civilized, in other words, as having social, economic, aesthetic, and spiritual dimensions.
Those looking for an argument and conclusions are likely to be disappointed. Armstrong’s work is neither strictly systematic nor rigorously logical. His goal is to say what civilization means in general by saying what it means to him. The book is inconclusive and tentative, but this is a part of its charm: it offers a meditation on civilization rather than an argument about its nature. Civilization is social, he suggests, because it entails high-quality relationships with other people and draws on ideals from societies of the past. It is economic because a good life, for him, involves a decent standard of living. It is aesthetic because it requires the balancing of the energies of barbarism with the refinement of decadence. It is spiritual because it calls people to depths of feeling and lofty ideas. Ultimately, he sees civilization as the effort to reconcile two kinds of prosperity: material and spiritual.
One of the limitations of Armstrong’s book is that it is so abstract. Civilization, like ice cream, comes in specific flavors.  But thinking about what is valuable in civilization in general may help us recover confidence in our own civilization.

Wednesday, December 21, 2011

Thinking Critically about Critical Thinking

What do colleges teach these days? One answer to that question that has become common in the past five to ten years is “critical thinking.”  Supposedly, in the age of rapid communication and information storage, we always have facts, accounts, and theories at our fingertips and we have no need to carry large quantities of knowledge around in our minds.  Therefore, our institutions of higher learning should teach students intellectual skills, not content.
I have trouble with this approach because I’m not sure exactly what it means to teach people to think critically. The word “critical” comes from a Greek verb that means “to judge” or “to decide,” so I suppose it may entail teaching how to make judgments or decisions. That would suggest that it refers to formal systems of reasoning, such as syllogistic logic or cost-benefit analyses. But most of the courses at contemporary colleges don’t present those types of formal systems. In fact, the kind of “critical thinking” I see most often is decidedly uncritical, and involves entertaining students with pop culture and training them to repeat the shibboleths and slogans of their instructors.
In truth, thinking must always involve thinking about something.  Therefore, one cannot separate the skills from the content. There is no thought without knowledge. So maybe if our colleges are becoming devoted to teaching things like “critical thinking” or “creativity,” this means that they are moving toward teaching nothing at all.

Tuesday, December 20, 2011

The End of the Dear Leader

To me, the most disturbing scenes to come out of North Korea following the death of Kim Jong Il are the images of North Koreans weeping at the loss of their Dear Leader. Granted, these scenes do come to us from North Korean propaganda sources, so it is possible that many in that country are secretly wishing the late bouffant film fan a long sojourn in hell. But unlike other totalitarian regimes, the counterfactually named Democratic People’s Republic of Korea seems to have achieved the ideal of a completely unified command and control state, with no open opposition and the power to dictate even displays of emotion. This is a level of nightmarish success that neither the Soviet Union nor Nazi Germany ever managed to reach.
Perhaps one way to understand this morally questionable success might be to see totalitarian order as the product of disorder. In The Origins of Totalitarianism (1951), Hannah Arendt argued that the totalizing state was an effort to impose a rational unity in response to contradictions and conflicts in national societies. A last section appended to the second edition of the book attributed totalitarianism to the isolation and loneliness of individuals. Certainly, the efforts at totalitarian dictatorships that we have seen in our recent history emerged from varying degrees of social collapse, and the extent to which they accomplished moral, political, and ideological command largely depended on the degree of collapse.
The internal chaos created by World War I and the fall of the czarist regime precipitated the creation of the Soviet Union.  Political polarization in Germany followed the war and enabled the Nazis to achieve power.  Although Nazi Germany perpetrated some of the most horrific crimes of the last century, the Nazi Party never achieved the same ideological unity in their country that Stalin brought about, arguably because the polarization of German society did not rise to the same level of disorder that prevailed in Russia with the end of the empire. Although the Italian Fascists had some of the most sophisticated theories of totalitarianism, as well as one of history’s best-dressed dictators, Mussolini’s regime was never very successful at the goal of bringing everything inside the state. Italy lacked the requisite disorder for effective totalitarianism. Mao’s Communist Party in China was much more effective, following on the decay of the Chinese empire, decades of competing warlords, the invasion of the Japanese, and civil war.
One of the reasons we might be seeing so many weeping North Koreans, then, is that the utter destruction of North Korean society by Japanese domination and civil war left a nearly complete vacuum of ordinary social institutions.  Kim Jong Il’s smiling father, Kim Il Sung, was able to draw on Soviet and Chinese support to establish his military cadres as the only effective organized body in the nation. Closing the country’s boundaries and propagating a personality cult then enabled the North Korean leadership to come closer to unified central control than any previous historical regime. Even when the people starve, they don’t revolt against the state, because they have no basis for organizing and the state is all the organization that exists.
None of this bodes well for the future of North Korea. I am not a Korea expert. My area is Southeast Asia. Even the Korea hands themselves don’t know what will happen in this strange country. But it looks to me like if the regime falls, it will probably be due to fighting among factions of the military leadership, rather than to any popular resistance.

Monday, December 19, 2011

After Iraq, Another Dangerous Tour of Duty

Previously, I wrote about the take-over of New Orleans schools by the state of Louisiana. While I strongly favor local control of schooling and other public concerns, I had to acknowledge that this take-over was a response to the fact that New Orleans had reached such a state of civic disorder that it was unable to run its own schools effectively. Now, State Representative Austin Badon, a Democrat from New Orleans, has said that he will ask Louisiana Governor Bobby Jindal to send troops of the National Guard to patrol the city's streets.

The recent shooting death of a one-year-old child apparently inspired Rep. Badon's request. This is far from the first child to die in America's murder capital.  The call for the National Guard is yet one more recognition of the descent into savagery of the city once known as the Queen of the South.

Perhaps the most astonishing part of the city's loss of order is the criticism of its supposed gentrification: open warfare prevails in many of the neighborhoods, and yet advocates are concerned that the city might not have enough poor people.

Sunday, December 18, 2011

The Mexican Drug War: Pobre Mexico, Pobre Estados Unidos

“Pobre Mexico,” Mexican dictator Porfirio Diaz is reported to have said in his last days, “tan lejos de Dios y tan cerca de Estados Unidos” [“poor Mexico, so far from God and so close to the United States”].  Mexican writer Carlos Fuentes improved on this lament and uncharacteristically expressed the sentiment with greater impartiality when he wrote, “pobre Mexico, pobre Estados Unidos, tan lejos de Dios, tan cerca el uno del otro” [“poor Mexico, poor United States, so far from God, so close to each other”].
Mexico is currently in the midst of a full-scale drug war. Just how many have died in this conflict remains a matter of debate and speculation. The PBS Newshour gives an estimate of 45,000 deaths since President Felipe Calderon declared the war five years ago, but acknowledges that those fighting the war see “more murders than they can count.” Some of my Mexican friends have told me that the casualty count is too high, that Calderon plunged the country into a bloodbath, and that the federal government of Mexico should call off the war. The casualty count has indeed been high, and any citizen of Mexico has a claim to a say in that country’s policies that I, as a North American, cannot assert.  Nevertheless, the organized criminal drug organizations are there, and it is difficult for me to see that ceasing the war now would be anything other than simple surrender. To understand why these organizations exist, though, we ought to look at their historical background in terms of the Diaz/Fuentes quips: at the political and cultural emptiness of the nations south and north of the Rio Grande, and at how they have mutually exacerbated their domestic troubles.
The criminal organizations took root in Mexico only partly because that nation has a favorable climate for drug crops and lies on the drug route between South and North America. These organizations grew because of the high degree of centralization of the Mexican political system and the high level of corruption fostered by such centralization.  The centralization came into existence because of the absence of functioning local civic institutions. The revolution that broke out in 1910 in an attempt to overthrow the governing elite of Porfirio Diaz plunged the country into a series of revolts and short-lived governments. Mexico was a nation of Mestizo and Indian peasants ruled by large landowners of European ancestry (although, ironically, Diaz himself was a Mestizo). The problem of creating a government with its roots in the broad masses of the nation left the country, in the period immediately following the revolution, in the hands of generals whose claims to power were based on the number of soldiers at their command.
 The first steps toward a solution of this problem were taken during the administration of President Alvaro Obregon (1920-24). Obregon began reorganizing the guerilla armies of the revolutionary years into a disciplined national army, supported the establishment of a national trade confederation, and revived and enforced earlier land-reform legislation. In this way, he not only gave a military basis to his regime, but he also extended control over urban workers and the peasantry.  He was, in other words, creating a highly centralized, mass society. In 1923, Obregon backed Plutarco Elias Calles as his successor. Adolfo de la Huerta, Obregon's own immediate predecessor and rival of Calles, raised a revolt. Although a significant portion of the military favored de la Huerta, volunteers from the labor and agricultural sectors turned the struggle to the side of Calles and Obregon.
The new corporatist regime of military, labor, and peasantry took concrete organizational form under the leadership of Calles. During Calles' presidency, he established such close ties to the Regional Confederation of Mexican Workers (CROM), which represented both agricultural and labor organizations, that the CROM became virtually an organ of the emerging Mexican corporate state. When Calles approached the end of his term, in 1928, he reluctantly gave in to the overwhelming power of Obregon in the Mexican Congress and consented to Obregon's return to power. Although a revolt by the opposition was put down before Obregon's successful election, the returning president was assassinated before he could enjoy his power. To prevent political tensions from erupting once again into armed struggle, Calles addressed Congress, called for that body to name a provisional president, and declared the need for a political party that could institutionalize the Revolution and integrate all factions of Mexican society.  In March 1929, the Partido Nacional Revolucionario was created in the city of Queretaro, under the direction of Calles, for just that purpose. The Comite Central Organizador of the new party undertook "invitar a todos los partidos, agrupaciones y organizaciones politicas de la Republica, de credo y tendencia revolucionaria, para unirse y formar el Partido Nacional Revolucionario" ["to invite all the parties, groups, and political organizations of the Republic, of revolutionary belief and tendency, to unite and form the National Revolutionary Party"].
The party organized in Queretaro continued, despite name changes, to rule Mexico until the very end of the twentieth century. President Manuel Avila Camacho (1940-46) reorganized the party under the name Partido Revolucionario Institucional, which it retains today. While various presidents emphasized "left-wing" issues such as nationalization and land-reform and others emphasized "right-wing" issues such as economic expansion, the party itself remained a mechanism for the incorporation and integration of all national institutions.
Although at one level Mexico became a highly centralized corporatist state, at the village and town level it retained an ancient patron-client system that pre-dated the revolution. I recall in the first half of the 1970s visiting a friend in central Mexico who had built a cabin in the Sierras Occidentales. At that time, the nation was in the throes of one of its most radical efforts at land reform under President Luis Echeverria (1970-1976). Concerned that Echeverria’s regime might seize some of their extensive landholdings, my friend’s father carved up his own property and placed portions of it in the names of his young adult children, providing my friend with the opportunity to try his own experiment in living off the land. I doubt that he is still living in a cabin today, but his attempt at alternative living was probably more successful than the collective ejidos promoted by the Mexican president, which were generally disastrous both for their members and for the Mexican economy.
One of my most distinct memories is of visiting my friend’s parents at their hacienda on the occasion of his father’s birthday. I remember the local villagers gathering in front of the hacienda, all dressed in traditional white Mexican clothes to sing songs for the patron. Underneath the modern mass state, attempting to direct all aspects of the national life of its atomized citizenry was a pre-modern society of patrons and clients.
A highly integrated centralized state cultivates corruption for three reasons. First, it has no balance of power to check the inclinations of office-holders to pursue their own self-interests. Second, all the rewards flow toward a controlling bureaucracy. Third, by attempting to coordinate all activities, the state establishes links with all areas of business and politics, including the illicit areas. A society such as Mexico’s lacks even the limits that may be imposed by local civic institutions, since these consist mainly of those immemorial patron-client networks.
Mexico did at long last move away from single-party government. The Partido de Accion Nacional was founded in 1939 for the purpose of representing a Catholic social philosophy. From its inception, the party was critical of authoritarian tendencies in the PRI and supported increased reliance on individual initiative in the economic life of the nation. The PRI allowed PAN to exist as a nominal opposition party without, until recently, any real power or chance of challenging the ruling party. PAN finally began acquiring some influence, though, largely because of the growth of a Mexican middle class and because of popular disgust with the obvious corruption of the PRI. The PRI itself also fragmented, with its left wing breaking off to form the Frente Democratico Nacional.
In 2000, a PAN candidate, Vicente Fox, finally broke the PRI monopoly by winning the presidency.  The current president, Felipe Calderon, succeeded Fox in a hotly contested election in 2006. While the PRI political monopoly has been broken, though, the pattern of connections among office-holders, business leaders, and unions remains in place, with the difference that there are now competing integrated power bases, which also have their own supports in the police and the military. Many of the people in the nation, though, are still peasants, whose traditional patronage systems have been disrupted both by the federal bureaucracies and by the emergence of new criminal systems. Understanding those criminal systems requires looking at the other unfortunate partner in this troubled relationship, the United States.
About the year 1970 (periodizations are always somewhat arbitrary), the United States entered a new period in its history characterized by three interrelated trends: First, the U.S. underwent a rapid centralization of its own political system.  Second, it became a demand-driven, import-based economy, buying more from other nations than it exported to feed the constantly increasing material expectations of its population. Third, it experienced a cultural shift, in which the consumption of goods became ever more important in the lives of its people.
Although there have been a number of periods of rapid growth in the size and influence of the U.S. federal government, none compare with the dramatic increase in the few years before and after 1970.  This was the time in which the United States truly became a welfare state, with the federal government deeply involved in virtually every area of the lives of its citizens. While there are justifications for some of the growth in federal intervention, such as protecting the voting rights of citizens in minority groups, this trend tended to redefine the social relations of Americans. The sociologist Robert Nisbet has argued that the most effective social relations have existed historically within small, highly localized, face-to-face ties. In the past, Nisbet wrote in The Quest for Community, the “institutional systems of mutual aid, welfare, education, recreation, and distribution” were primarily the products of “family, local community, church, and the whole network of interpersonal relationships.” Centralized state power, from Nisbet’s perspective, has resulted in serious problems for modern societies. It has weakened traditional and immediate institutions, such as the family, without being able to replace fully the functions of those institutions. This has created settlements of atomized individuals in place of true communities as well as undermined the abilities of those individuals to work together for common goals.  While the most serious deterioration of community has occurred in North America’s low-income inner cities, among those most directly dependent on the central state, basic community institutions have been weakened throughout the nation.
Part of the shift in power from localities to the federal government involved federal efforts to maintain high demand and high levels of public consumption. By the end of the 1970s, Americans were consuming more than they were producing, so that imports began to exceed exports. To the extent that Americans were producing domestically, the labor in new and often unpleasant industries fell more and more to immigrants, from Mexico more than anywhere else. America was importing people, in other words, in order to maintain its ever-increasing expectations for standards of living among the native-born.
Defining themselves through consumption led Americans to import more than labor and legal goods. The social psychology of consuming is: the purpose of life is to maximize enjoyment, and these material goods will increase your enjoyment. No consumer goods fit this psychology quite so well as pleasure-inducing goods. The inner cities, where the purposelessness and anomie of living with decayed social institutions were greatest, became sites of the most savage drug wars within North America, but drug use, as the purest form of consumerism, spread through the nation in the post-1970 period. This was precisely why living next to the United States became so unfortunate for Mexico. The latter became an exporter of massive amounts of drugs, as well as workers, to the consumer culture of its northern neighbor, giving rise to highly organized criminal corporations extending throughout Mexican society. Poor Mexico, poor United States, so far from God and so close to each other.

Saturday, December 17, 2011

Against Social Policy

Over the past few years, I have grown increasingly suspicious of the very idea of social policy. This may be biographically rooted in my work during the 1980s with refugees from authoritarian societies with aggressive social policies. It may also be a reaction to working in a modern university and being surrounded by calls to fall into line and march for social change. I don’t like following anything but my own dim vision of the truth. Beyond these personal inclinations, though, I don’t see intentional efforts to achieve one or another sort of society as consistent with democracy.

While different people and different political philosophies use the word “democracy” in a variety of ways, the term most commonly refers to a system of government in which people either make political decisions for themselves (direct democracy) or elect representatives to make political decisions (representative democracy). In the former, there is no question of anyone “re-making” the people, since the people think for themselves and have the freedom to be what they are. In the latter, too, the goal of re-shaping a society along democratic lines is a contradiction, because a representative government represents its public as it is; the government does not try to make the public into something it is not. Since a society is made up of people and of the total of formal and informal relations among people, changing a society means changing the people and their relations with each other. It is a profoundly authoritarian effort and, as it approaches the goal of total reform, a totalitarian one. Political reform is a matter of changing laws. Economic reform involves changing policies relating to matters such as taxation, public expenditures, or interest rates. Either of these may be consistent with democracy. But social reform aims at changing people and their relations with each other. This reverses the direction of action and control in a democratic society, since it involves the authorities attempting to constitute or re-constitute the public, rather than the public constituting the authorities. It recalls Bertolt Brecht’s 1953 quip that the East German people had apparently lost the confidence of their government, so the government should “dissolve the people and elect another one.” Creating a new society is precisely the attempt to dissolve the people as they are and to appoint the people as one would like them to be.

Friday, December 16, 2011

The Chronomyopia of Higher Education

I have the privilege of teaching bright, courteous young people. I am told that they come to our fairly selective university with generally high SAT scores, and I have no reason to doubt this. But my favorable opinion of my students constantly leaves me all the more surprised at how little many of them know beyond current events and how unfamiliar they are with ideas outside the various versions of present-day conventional wisdom. They suffer from a condition that the anthropologist Robin Fox has labeled “chronomyopia,” a narrow focus on the immediately present.
Ironically, our information-rich society may be a large part of the reason for this intellectual short-sightedness. Overloaded by broad and shallow electronic communication, students have no attention left for thinking that goes beyond the present. In the decidedly low-communication environment of Walden, with his books, thoughts, and visitors, Henry David Thoreau wrote, “[w]e are in great haste to construct a magnetic telegraph from Maine to Texas; but Maine and Texas, it may be, have nothing important to communicate.” I’m not sure whether that was true, but I do think that partially disengaging from the networks of his own day enabled Thoreau to reach the depths that produced his masterpiece. Most of the information that flows through the complex networks of our own day likewise has nothing important to communicate, yet it often completely absorbs young people.
Although higher education may not be the source of the near-sightedness, it exacerbates the problem. Historian of education and policy analyst Diane Ravitch, in her 2000 book Left Back: A Century of Failed School Reforms, lamented the displacement of academic education in schools by programs such as “life adjustment” education and education for “social efficiency,” aimed at narrowing teaching and learning to topics immediately relevant in the daily lives of students. In my view, as college has become the new high school, various forms of education for immediate social purposes have moved to the center of post-secondary schooling.
Education for civic engagement, a major initiative at my university, is a part of the narrowing of the curriculum. The job of educators, according to this pedagogical ideology, is to train students to be “engaged citizens” and to direct learning toward solving social problems.  If we preach engagement, of course, we exclude Thoreau’s retreat from the range of possibilities and narrow the thinking of our students by removing from consideration the question of whether it is ever acceptable for them to refuse to throw themselves into the campaigns and crusades of the moment. Since “social problems” don’t exist as facts, but must be defined by someone, universities limit the students’ vision by defining what the students should see as problematic. The goal is to engage students in officially sanctioned service to the here and now, institutionalizing chronomyopia.
I do think that higher education should have something to do with citizenship. But our universities can best prepare students for citizenship in a representative republic by expanding the vision of the students and by encouraging them to make their own decisions about whether and how they will engage with their society. Our students need to learn much more about the history of their own political society and of the world. They need to have access to their cultural heritage as Americans and as human beings. Combining this kind of far-sighted citizenship education with the skills and knowledge for making a living is a big task. Instead of taking up this effort, we are fixing our students’ eyes on our immediate and preferred social goals.

Thursday, December 15, 2011

Stephen Greenblatt's The Swerve: How the World Became Modern

Stephen Greenblatt’s The Swerve: How the World Became Modern has been among the books I’ve most enjoyed reading over the past couple of weeks. I liked Greenblatt’s Will in the World, a speculative effort to reconstruct the largely unrecorded life of Shakespeare, and I consider him one of the best contemporary non-fiction writers. Still, I think Greenblatt’s rich historical imagination sometimes carries him to views that are entertaining but dubious.
The Swerve tells the story of the loss and rediscovery of one of the world’s great literary treasures, De Rerum Natura [On the Nature of Things], written by the Epicurean Roman poet Lucretius. Greenblatt’s account begins with the recovery of a medieval copy of the poem by the Italian humanist Poggio Bracciolini, and moves back and forth between the fifteenth-century Italian Renaissance and the ancient world in which Epicurus formulated, and Lucretius poetically expounded, an atomistic, materialistic world view. In the course of the story, Greenblatt makes stops in the decay and end of the Roman Empire and in the Middle Ages. He displays an impressive ability to carry a clear and compelling narrative through so many diversions. The book is a fascinating reminder of how little of the vast body of ancient literature remains to us today and how tenuously what we do have has survived. For me, it also implicitly posed a question of current importance: what happens to the culture of a civilization when people lose interest in reading?
As much as I like the book, though, I question the claim that this single philosophical poem played much of a part in making the world modern. While materialism did indeed become one of the major philosophical currents of modernity and atomism became part of modern science, I don’t see any evidence that the reading of De Rerum Natura played much of a part in these trends. Greenblatt appears to suggest that because the poem contained ideas widely associated with the post-medieval world, the poem was somehow the source of those ideas. That’s not a very sound approach to historical causation.
The Swerve also offers a somewhat simplistic view of what modernity is. Even the Renaissance, which the book portrays in somewhat conventional fashion as the origin of modernity, carried diverse and sometimes conflicting trends. The literary movement that later became known as humanism was distinct from the scientific Renaissance, which may well have owed more to late medieval scholastic Aristotelianism than to the rediscovery of ancient writings. Many of the later political and economic trends, including the rise of representative democracy and the market economy, were arguably much more closely connected to theological currents than to materialistic thinking. Greenblatt tells such a good story that readers need to watch that the narrative does not sweep them across some big gaps in its claims.

Wednesday, December 14, 2011

A Note on the Swamy Case

After posting my brief thoughts on the censure of Professor Subramanian Swamy by the Harvard Faculty of Arts and Sciences, I received the following email message from a reader:
There's another point that I would add to the analysis. Few people know Dr. Swamy's role in the case of, what came to be known as, the Hashimpura Massacre. While you may read about it at leisure, in brief, Dr. Swamy was the only man standing FOR the massacred muslims at the time, amidst "lip-service" secularists, who spoke against it and did whatever could be done, albeit with partial success, at best, to bring to justice the master-minds of the crime. So, if at all, the muslim students at Harvard should feel protected by his presence in the Faculty, in fact the muslim student community should honor him for it, 'coz an action is worth a million words, and his action was as decisive as an act of goodness can ever be, whereupon Dr. Swamy put his life for the cause of restoring the right to life and dignity of Muslims in India.
Sitting before my computer in Louisiana, I am not able to pass judgment on the morality of Dr. Swamy's various public actions and positions in India. On the basis of the links sent by this reader, it does look to me like he has acted in some decidedly virtuous and courageous ways.  The information in the links also suggests that whatever his views on the desirability of India becoming an officially Hindu state, he is far from bigoted against Muslims.
Ultimately, though, it is not my place to say whether India should become Hindustan or whether Saudi Arabia should continue to be a Muslim nation.  While I am entitled to my opinions about how people in other countries should govern themselves, I am not a citizen of those sovereign nations. The Harvard Faculty of Arts and Sciences, as a body, is also alien to India and Saudi Arabia. From its lofty moral position, though, the Harvard professoriate seems to believe that it can decide what everyone everywhere in the world may be allowed to think and say.
Now, I argue that Harvard, or representative bodies within Harvard, should not even be telling individual faculty members or students what views are proper on issues in Massachusetts. But in trying to enforce political conformity on citizens of other nations regarding the internal affairs of those nations, the Harvard Faculty of Arts and Sciences displays a smug moral tyranny that literally knows no boundaries.