Monday, August 30, 2010

Cry Havoc: How the Iraq War Began

With President Obama about to announce the end of U.S. combat operations in Iraq, it is instructive to remember how all this began, seven years ago. It began with the bombing of Baghdad, a massive American bombing campaign against a civilian population of a country that had not made war on anyone, including the United States. It was one more escalation in the morally indefensible history of bombing, and it presaged everything that's happened since in that part of the world.

In the following piece, published in the San Francisco Chronicle several weeks before the bombing began, I refer to what was then a new and little-known phrase, but which has since become a clichéd description of this bombing campaign: Shock and Awe. It was supposed to break the will of whoever the Bush administration considered the enemy, but it did not.

I also quoted the famous lines from Shakespeare's Julius Caesar: "Cry 'Havoc' and let slip the dogs of war." I focused on the "havoc" in this piece, but in other writing of the time I emphasized the rest of the quotation. The point of the metaphor of the dogs of war is not only that a pack of dogs is indiscriminately violent but also that once loosed, it is very difficult to control or to stop. Once this war started, it was clear to some of us at least that it would go on for a long time. And it has, with immensely and deeply destructive consequences to the U.S. as well as to Iraq. It has eroded us morally, depleted us financially, destroyed thousands of people physically and psychologically, and distorted our politics. We don't seem to have learned much from it either. If for no other reason than the resources of all kinds it stole from facing up to the double challenges of the Climate Crisis, it has deeply wounded the future.

There are going to be attempts to rewrite its history, which will likely be evident in the coming hours and days. But in some measure, its history was written before it started, and some of us saw that, pleading for the dogs of war to be restrained, even as the bombing became inevitable. (The prediction recorded in this piece that the bombing would begin by March 15 was off by only four days. It began March 19.) The following piece concentrated on one element, the bombing, and its moral and historical context.

When Islamic armies were the most powerful in the world, conquerors of Asia Minor and North Africa, and poised at the gates of Europe in the 8th century, Abu Hanifa, founder of a school of law in the city of Baghdad, proposed that the killing, maiming and raping of civilian noncombatants in war be forbidden. It was one of the first attempts to codify some kind of moral and legal restraints on civilized societies engaged in the dangerously uncivilized practice of warfare.

If and when war comes to Iraq, it will likely feature the relentless and perhaps unprecedented bombing of Baghdad. According to CBS News and other sources, the United States is considering implementing a strategy called "Shock and Awe," developed in 1996. The plan could result in at least 300 Tomahawk cruise missiles raining down on Baghdad in just the first day of an aerial campaign - more than were used on all targets in the entire Gulf War. And the plan calls for an equal or greater number on the second day as well, up to 800 total, each capable of carrying 1,000 pounds of explosives. There was no estimate of how many days the bombing would continue.

Although missiles would likely focus on infrastructure including electricity and water supplies, an average of one missile striking a city of 5 million inhabitants every four minutes around the clock could kill and maim thousands of civilians.

"There will not be a safe place in Baghdad," according to an unnamed Pentagon official quoted by CBS. "The sheer size of this has never been seen before, never been contemplated before."

The prospect of war in Iraq is crowded with moral as well as political questions, with multiple possibilities for ethical outrages of stunning proportions. But the continuous bombing of a city of civilians would probably be the first that confronts the watching world.

In A History of Bombing (published by New Press in 2001, and forthcoming this spring in paperback), Sven Lindqvist follows three main threads: the technology and use of aerial bombing in history, the attempts to deal with the moral implications of its use against civilian populations (Abu Hanifa is one example he cites), and social attitudes toward bombing found in sources such as popular fiction.

The historical parallels to the current prospect, as well as the ironies, are disquieting. The impact of the "shock and awe" strategy is meant to be on hearts and minds: to destroy the enemy's will, and mental and psychological ability to resist. But bombing's ability to terrorize - the sudden explosive death from the sky without warning - was one of the first effects to be observed, noted in 13th-century China. It has often been a prime strategy of bombing, according to Lindqvist, used extensively by European colonial powers in Africa, India and Asia.

Bombing is especially terrifying when used on the helpless. At first it was shelling from ships far offshore (which is how the United States bombed Nicaragua in 1854), then bombs dropped from airplanes. Bombing was a cost-effective way of keeping subjugated populations in line. Baghdad was a British target more than once in the 1920s.

As airplanes, bombs and cities all got bigger, moralists and diplomats negotiating the international rules of war and definitions of war crimes struggled to keep up. Several prohibitions against air warfare and the bombing of cities were proposed, and some were even signed, though not by the major powers capable of carrying out the bombing. Well into the 20th century, bombing was considered not so bad if the victims were of "inferior races." Some authors wrote glowingly of bombing as a way to civilize the world by permanently subjugating or even wiping out these races.

European bombing gradually got closer and closer to home, until the German military on behalf of Franco tested new kinds of bombs by dropping them on cities in Spain. Japan bombed civilian cities in China. Then in the 1940s, bombing of even the capital cities of combatant nations - Berlin, London, Tokyo - became a normal instrument of warfare, finally leading to the annihilation of the undefended cities of Hamburg and Dresden by British saturation firebombing, and of Hiroshima and much of Nagasaki by the U.S. atomic bomb.
By that time, terror was not the only result of bombing. Fifty thousand civilians were killed in a single night in Hamburg, most of them women, children and elderly. Twice that number died in Dresden. Two atom bombs did fill Japan with shock and awe, and killed several hundred thousand civilians in the process.

There are various strategic arguments for bombing campaigns that dovetail with apparent moral concerns, usually involving shortening a war's duration or substituting for ground assaults, thus saving lives, especially the lives of the side doing the bombing.

When facing the possibility that this war would unleash chemical, biological or nuclear weapons that have been largely absent from warfare for decades due to international taboos of one kind or another, it may seem quixotic to argue that bombing of civilian populations should be regarded as an evil in itself, and beyond the pale for nations that desire any sort of international relations. But it seems morally obtuse that there is a stronger taboo against assassinating a declared enemy's head of state than against slaughtering babies in their beds. Surely bombing should be a last resort, not the first.

For even if the historical parallels are coincidental and not disturbing echoes of residual racism and empire-building, the bombing of Baghdad to begin this war would have a terrorizing effect on more than its residents. In Shakespeare's time, there was another word for the terror, the shock and awe that accompany a war without moral limits. The word was "havoc," as in the famous quotation from Julius Caesar, "Cry 'Havoc', and let slip the dogs of war."

The bombing and havoc may already be starting by the time you read these words, although according to Los Angeles Times reporter Doyle McManus on Washington Week in Review, "You can pretty well mark on your calendar March 15." It's the date formerly known as the Ides of March.
February 2003

Sunday, August 29, 2010

Paging Doctor Strangelove

The following piece is from 2003, also published in the San Francisco Chronicle Insight section. In a sense it follows from the previous piece posted here, although that was actually written later. It arose from a sense that using nuclear weapons was again becoming legitimized, particularly by the GW Bush administration, but also by fading memory of what nuclear weapons actually are. What they are not is just bigger bombs, with bigger, more cinematic explosions.

But President Obama has turned that particular tide, at least in terms of official U.S. policy. New nuclear weapons programs have been stopped, and steps have been taken to reduce U.S. weapons and reliance on them. More significant in light of this piece are President Obama's efforts to institute new international agreements. He negotiated and signed a new START treaty with Russia, and began implementing the U.S. part of the agreement, even without Senate ratification. He convened a meeting of 47 nations that resulted in agreements designed to decrease proliferation and accidental nuclear explosions. His plans include reviving a Comprehensive Test Ban Treaty. These efforts were prominent reasons for his Nobel Peace Prize. They also clearly and consciously honor the courageous and ground-breaking efforts of President Kennedy in proposing and successfully completing the Nuclear Test Ban Treaty, and gaining public support to ensure its ratification in the Senate.
Still, the tepid reception to Obama's efforts in the U.S., the failure of the Senate to ratify START, and the lack of understanding of either the reality of nuclear weapons or the importance of JFK's achievements, suggest that the fears expressed in the following piece still have substance.

October 2003
Last fall, the 40th anniversary of the Cuban Missile Crisis was accompanied by media stories and symposiums. This fall, the 40th anniversary of the most significant outcome of that crisis, the first nuclear arms treaty signed by the superpowers, has come and gone in silence.

It has been not quite 40 years since the release of Stanley Kubrick's classic film, Dr. Strangelove, or How I Learned to Stop Worrying and Love the Bomb. One conclusion that might be drawn from the widespread indifference to the anniversary of the limited nuclear test ban treaty of 1963 is that we've all learned to stop worrying. We may not love the bomb, but perhaps we're not so frightened of it anymore.

The specter of nuclear war dominated politics and culture, and insinuated itself into daily life, for decades after Hiroshima. But in recent years it seems to have lost its potency. Although the stories are tucked in back pages or hardly reported at all, the dangers continue to slowly grow.

Israel modified U.S.-made cruise missiles to carry nuclear warheads on submarines, to counter suspected advances in Iran's long-range missiles and the possibility of Iran acquiring nuclear arms. This new element of an atomic arms race in the highly volatile Middle East joins the continuing threat of North Korea to make and even export nuclear weapons, and the continuing danger of two known nuclear powers, Pakistan and India, facing off over disputed territory at their borders.

Few North American news outlets even noted a recently revealed Russian plan to consider using nuclear weapons to fight terrorism, although the mayors of Hiroshima and Nagasaki noticed it, and sent protests to the Russian government. But the U.S. could hardly object, since the Bush administration just pushed through the Senate its plan to develop low-yield nuclear bombs for battlefield use.

A chief reason for today's indifference probably is the belief that the fall of the Soviet Union meant that thermonuclear holocaust is no longer a real possibility. But a recent Rand study asserts that due to disorganization in Russia as well as other factors, the threat of a devastating nuclear exchange between the U.S. and Russia caused by accident or miscalculation has not lessened but increased.

Another reason could be that there hasn't been a visible nuclear bomb detonation in many years, and no atom bomb has been used against an enemy since World War II, resulting in a diminished appreciation for their power. But continued proliferation of weapons in a world where hostility is increasingly open, violent and unrestrained may change that, to our certain horror.

It could also be that because the size and power of conventional weapons have grown (and some are radioactive), while new nuclear weapons seem smaller and more precise, there doesn't seem to be as much difference.

But as California Sen. Dianne Feinstein said recently, "The administration is saying we can make nuclear weapons less deadly, and acceptable to use. Neither is true." According to an article in New Scientist magazine, the United States is exploring an entirely new class of gamma-ray nuclear weapons, which are thousands of times more powerful than chemical weapons and (the article stated) "could trigger the next arms race."

That is the greatest danger, as Feinstein noted: a new nuclear arms race. Which is precisely why it is so important to remember the nuclear test ban treaty four decades ago. It did more than ban the ever-larger nuclear explosions pouring radioactive poison into the atmosphere shared by the whole Earth. It broke the momentum of the arms race, which seemed to be out of human control, propelled by its own deadly logic of inexorably increasing force and counterforce. The Cuban Missile Crisis had sobered the U.S. and Soviet leaders into seriously negotiating a treaty.

What made a crucial difference was President John F. Kennedy's eloquent and persistent attack on the irrational logic of the arms race, and his insistence that humanity begin preparing for peace with the same courage and diligence as it prepares for war.

To those who think peace is unrealistic in a world of conflict, Kennedy countered, in a speech at American University, that this view means "that mankind is doomed, that we are gripped by forces we cannot control. We need not accept that view." Then he used the phrase that more than any other sums up the Kennedy faith: "Our problems are man-made; therefore they can be solved by man."

He advocated an attainable peace "based not on a sudden revolution in human nature but on a gradual evolution in human institutions. ... Genuine peace must be the product of many nations, the sum of many acts. ... For peace is a process, a way of solving problems."

In its most quoted phrases (spoken more recently, without attribution, by a fictional Russian president in Tom Clancy's film The Sum of All Fears) Kennedy said: "For in the final analysis our most basic common link is that we all inhabit this planet. We all breathe the same air. We all cherish our children's future. And we are all mortal." How different these words are from any we have heard recently.

"What kind of peace do I mean?" Kennedy asked. "Not a Pax Americana enforced on the world by American weapons of war. ... I am talking about genuine peace, not merely peace for Americans, but peace for all men and women; not merely peace in our time, but peace for all time."

The test ban treaty was negotiated in the midst of the suspicions and fears of the Cold War, just as today's world is permeated with suspicions and fears of terrorism. We desperately need to remember this first step toward peace. The threat of nuclear weapons, of becoming captives of an arms race and a psychology of war, belongs not just to history.

Friday, August 27, 2010

When It Was Real

[Hiroshima and Nagasaki]

This month marked the 65th anniversary of the only nuclear weapons to be used in war, the atomic bombs dropped on Hiroshima and Nagasaki. These anniversaries passed without much notice, although for the first time there was an official American presence at the Hiroshima commemoration. This lack of attention is disquieting, not only to some in Japan, such as this prominent writer, but to others, such as this teacher whose students consider it "ancient history." The further we get from the reality of the damage caused by two atomic bombs of very modest yield by today's standards, the more complacent we seem to be, and therefore the more likely that nuclear weapons will be used again.

What follows is my piece published in the San Francisco Chronicle Outlook section marking the 60th anniversary of Hiroshima in 2005.

On July 16, 1945, the cruiser Indianapolis sailed from Hunters Point Naval Shipyard in San Francisco, carrying one 15-foot crate. Inside were the components for the first atomic bomb destined to be dropped on a city. It was being shipped to Tinian Island in the western Pacific, and its final destination a few weeks later would be Hiroshima. It left San Francisco just four hours after the first successful atomic bomb test in history, in the New Mexico desert.

Sixty years is a long time to keep even such an immense memory alive, but several books published recently bring these events into sharper focus than ever before.

Several are biographies of key figures like Robert Oppenheimer and Edward Teller, but one is billed as a biography of the bomb itself. "The Bomb: A Life" by Gerard DeGroot (Harvard University Press), professor of modern history at the University of St. Andrews in Scotland, benefits from newly available records, especially concerning the Soviet nuclear program. But mostly it is a skillfully condensed narrative of the nuclear era, fascinating in the selection of details and riveting in its revelations of how possessing nuclear weapons changed those involved, and changed America.

On the day of that first test in July 1945, no one knew what would happen. About half the scientists didn't think the device would explode at all. Enrico Fermi was taking bets that it would burn off the Earth's atmosphere.

It did explode, with such brightness that a woman blind from birth traveling in a car some distance away saw it. "A colony on Mars, had such a thing existed, could have seen the flash," DeGroot writes. "All living things within a mile were killed, including all insects."

America was now in sole possession of the most powerful weapon in history. The first effect of the bomb was felt in Potsdam, Germany, where President Harry Truman was conferring with British Prime Minister Winston Churchill and Joseph Stalin, premier of the Soviet Union, then an ally in the war against Japan. After Truman received the news of the successful test, he was "a changed man" and "generally bossed the whole meeting," according to Churchill.

That the second bomb left San Francisco on the day the first was tested suggests the momentum to use it. Whether dropping the bomb was necessary to secure Japan's surrender before an invasion became necessary is still being debated. DeGroot believes that Japan was looking for a way to surrender in June and July. But there were other considerations, mostly to do with demonstrating American power, especially to the Soviet Union.

Using the bomb quickly became a test of patriotism. "For most Manhattan Project scientists the bomb was a deterrent, not a weapon," DeGroot writes. Physicist Leo Szilard had done as much as anyone to try to persuade FDR to develop the bomb because Germany was doing so. But on the day after that first test, he sent government officials a petition signed by 69 project scientists arguing that to use the bomb would ignite a dangerous arms race and damage America's postwar moral position, especially its ability to bring "the unloosed forces of destruction under control."

The petition was ignored, and Gen. Leslie Groves, the senior military official in charge of the project, began making a case that Szilard was a security risk. It's a pattern that would be repeated often.

DeGroot places the decision to drop the bomb on Japan in the context of the brutalization that occurred during the long years of World War II, with an unprecedented scope of savagery on both sides. The bombing of civilians and cities, morally unthinkable in the West before the war, became a major feature of it by its final years, long after most meaningful military targets were gone. Gen. Groves, he writes, was worried that Japan might surrender before the bomb could be dropped.

Hiroshima was selected as the primary target because it had no Allied POW camps. However, there were nearly 5,000 American children in the city -- "mainly children sent to Japan after their parents, U.S. citizens of Japanese origin, had been interned." It seems likely some of those children were from San Francisco.

The nuclear era began with the secrecy of the Manhattan Project, which is perhaps partly why it was accompanied throughout its history by lies and denial. The denial began with Hiroshima. As many as 75,000 people died in the first blast and fire. But within five years the death toll would reach 200,000 because of what the U.S. government denied existed: lethal radiation.

Even after the hydrogen bomb was developed in the 1950s (so powerful that the first test vaporized an island and created a mile-wide crater 175 feet deep), the untruths continued. In 1954, Dr. David Bradley reported on 406 Pacific islanders exposed to H-bomb fallout: nine children were born retarded, 10 more with other abnormalities, and three were stillborn, including one reported to be "not recognizable as human." Such information was denied or routinely suppressed through all the years of testing, even on U.S. soil. Groves even told Congress that death from radiation was "very pleasant."

Even after the war, criticizing the bomb in any way became a threat to national security, an act of disloyalty that only helped the communist enemy. And so people were silent and compliant, and streamed into air-conditioned theatres to see movies about monsters created by atomic radiation.

This extreme weapon prompted extreme and contrary emotions, often within the same people. Some of the same Los Alamos scientists who cheered madly at the first news of Hiroshima were later shell-shocked with regret. Gen. Omar Bradley called his contemporaries "nuclear giants and ethical infants." Yet he pushed for developing the hydrogen bomb.

This peculiar combination of denial plus the immense power of thousands of bombs contributed to an era of deadly absurdities: the age of Dr. Strangelove. Yet reality was not so different, right down to the preposterously appropriate names: the head of the Strategic Air Command, Gen. Tommy Power, gave his philosophy of nuclear war in 1960: "At the end of the war, if there are two Americans and one Russian, we win!"

The warp in American political life created by the bomb might be summarized in two statements. "In order to make the country bear the burden," said President Dwight Eisenhower's secretary of state, John Foster Dulles, referring to the Cold War arms race, "we have to create an emotional atmosphere akin to a wartime psychology. We must create the idea of a threat from without."

The second is more famous, but perhaps its connection to the bomb and its effect on America has been forgotten: Eisenhower's farewell address. "We have been compelled to create a permanent arms industry of vast proportions," he said. "We must not fail to comprehend its vast implications. ... We must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist."

Monday, March 08, 2010

Kathryn Bigelow, first woman to win the Oscar for Best Director. Is it time for genderless acting awards, too? See my Los Angeles Times op-ed re-posted below.
What Gender is Oscar? Revisited

On Oscar Monday 2010, I chanced to see a Los Angeles Times op-ed debunking the idea that the Academy Award category of Best Actress is sexist. So it seems the right time to repost an LA Times op-ed I wrote in 2004, which posed the question this op-ed attempts to answer: why do the separate categories for Best Actor and Best Actress still exist--categories based on gender that exist nowhere else in the Academy Awards?

The question was prompted partly by women being nominated in gender-neutral categories like Best Director, and partly by the analogy of race: there are no race-based categories, even as black actors and actresses were being nominated and winning. So there's another reason to revisit the op-ed--the day after the Best Director award went to a woman for the first time in Academy history.

There is of course a very short answer to the question of why the Best Actress category exists: it's the dresses. That's even clearer now that the Red Carpet gauntlet has become so prominent. But you don't get paid for a three-word op-ed. And actually, there are some other things to say about the subject. Still, I meant the op-ed to be at least slightly tongue-in-cheek.

What follows is a longer version than appeared in the LA Times. It restores some of the Times' edits (which tended to make the piece sound more solemn than I meant it), and some relevant if not especially funny material I'd cut from the version I sent to them, principally the comparisons to the Grammys.

It turns out this subject has been raised before, and since--as chronicled here by Daniel Radosh. Sorry, D.R., I wasn't aware of your prior piece when I wrote mine! Although it's such an obvious idea that he might not have been the first either.

You might still be able to find the version the Times printed here. [continued after photo]
Sandra Bullock, winner of the 2010 Oscar for Best Actress. Six years later and they're still not listening!
Why is there a Best Actress Award?

(Los Angeles Times: February 15, 2004)

As the Academy members mull over their choices among this year's nominees, I pause to ask one perhaps impertinent question about the Best Performance by an Actress categories. It's not about the fine female actors nominated this year--it concerns the categories themselves. My question is, why do they exist?

For after all, there is no award for the best screenplay by a woman writer. Sofia Coppola wasn't nominated as best female director. There will be no award for a Best Picture by a woman producer. Why are there separate acting awards divided by gender?

There doesn't appear to be anything about acting skill that is gender-specific. In fact, many women insist on being called actors, and bristle at the designation of "actress" because it is implicitly demeaning, like the term "authoress." A writer is a writer, and an actor is an actor. Aren't these gender-designated categories just relics of a less enlightened age?

All of the other Academy Award categories are based on the type of work or the type of film. These are the only categories that aren't. There are no separate categories based on race, ethnicity, religion, age, sexual preference or any other element of diversity. Why not best performance by a Latino in a leading role (apart from the extreme difficulty of coming up with five nominees)? Or best performance by a gay or lesbian actor playing a character of the same gender and sexual preference, and another for playing a straight person?

It should be noted that women who might ordinarily lobby for equal treatment haven't exactly been burning their SAG cards to protest gender-specific award categories. The reasons aren't hard to figure out. Thanks in part to the prevalence of action pictures with a worldwide audience, women get fewer starring roles, and even fewer substantial supporting roles, than men (or, these days, than special-effects creatures). More male stars have more box office clout.

So if there were only a single acting category, women might be in danger of getting a token nomination or two, but how often would they win? Having their own categories means that more women are more likely to get more attention, which helps all women actors. In the movie business and particularly in the businesses that own the movie business, there are fewer women decision-makers than men, and only a percentage of them will focus at all on making things better for other women.

Women actors need this category just to survive. So the inevitable conclusion is simply this: the best actress award is an affirmative action program. The award redresses contemporary imbalances and historically derived inequalities that otherwise would continue automatically.

Of course, that's not why these categories were created, or even why they are kept. Glamorous and sexy women attract audiences to movies, apart from their acting performances. Audience interest in watching the awards program is increased as well by beautiful women crossing the stage and tearfully thanking their parents and agents while dressed in daring designer clothes.

On the other hand, if there weren't special categories for women actors, the most popular women might need to be paid in ways other than prestige, like in equal money.

Other movie and television awards shows follow the gender pattern for acting. The Grammy Awards, however, have categories for the music of specific ethnic groups (Native Americans, for instance) and for types of music associated with performers and audiences of one race or another. So instead of getting rid of these categories, should new ones be created to reflect other differences and redress imbalance? Separate categories for black actors and actresses, for instance?

But just as women aren't agitating for an end to their best performance categories, minorities are by and large not asking for their separate categories. Black actors have chosen to compete without reference to race. So Denzel Washington was not the best black actor, nor was Halle Berry the best black actress. They were the best, period.

Even though wins like these two have been very rare, and black actors have historically faced long odds just to get on the screen at all, the greater legitimacy of winning an unrestricted award may be worth it. Also, minority actors can support each other and work for better opportunities without feeling they are competing against just each other for a separate and probably unequal prize.

The Grammys reflect cultural roots of music, and how recorded music is marketed. Even though at various times there were movies made specifically for black audiences, movies have a very different history in most respects, and an entirely different marketing structure than the music business. These days however there is a marketing distinction between male-oriented action pictures, and movies about intimate relationships, widely known as "chick flicks." The actress categories may serve to recognize this market in particular.

The Oscars have gender-specific acting awards today because they've always had them, the press and public like them, and nobody seems to want it any other way. Still, a new entry in the endless stream of awards shows might try something different and give a single genderless Best Performance award. Besides recognizing equality, and quality regardless of gender, it would have the additional advantage of making the awards show just that much shorter.