Wednesday, May 23, 2012

Picasso in Occupied Paris by David King




Another place Sartre, Camus, and Beauvoir could be seen that spring was at the restaurant Catalan, on rue des Grands-Augustins, sometimes seated at the table of their new friend, Pablo Picasso. Despite many invitations to go abroad, the Spanish artist had remained in Paris during the Nazi Occupation, painting in his two-story studio on rue des Grands-Augustins, on the Left Bank. The sixty-two-year-old, with long white hair falling onto his shoulders, was surrounded by his work and his women, including his latest lover, the twenty-two-year-old painter Françoise Gilot.



In the eyes of the Nazi authorities, Picasso was a highly suspect artist. He had supported the Spanish Republicans during the Civil War, raised money for their cause, and published caricatures of the military dictator in his Dream and Lie of Franco. He had commemorated the German firebombing raid on the Basque city of Guernica on the afternoon of April 26, 1937, on a three-hundred-square-foot canvas that had dramatically raised awareness of the tragedy. Hitler, of course, had placed the painter on a list of modern degenerates, and the Nazis banned all his exhibits in Paris.



The French police had actually collected a sizable file on the Spanish painter, a dossier that was only discovered in 2003, when 140 cardboard boxes were returned to Paris from Moscow. The Russians had seized the archives in 1945 from the Germans, who had in turn taken them during the Occupation. As historians then learned, Picasso had applied for French citizenship in April 1940, but the state had rejected the application on the grounds that he was suspected of being an anarchist or communist, or of harboring sympathies leaning in that direction. “He has no right to be naturalized,” an official wrote on the form, and “should even be considered suspect from a national viewpoint.”



Picasso had not even told his closest friends about this request. He had, however, let them know his fears: namely, that his authorization to remain in the country was about to expire and that he had sworn never to return to Spain as long as Franco was in power. Fortunately for Picasso, a sympathetic police officer intervened. “Very illegally,” Maurice Toesca wrote in his diary in September 1943, “I have prolonged his stay for three years.”



The Germans who visited Picasso’s studio during the Occupation were not the SS men who were rumored to be slashing his paintings, but rather a number of officials who admired his work. One frequent visitor was Lieutenant Gerhard Heller of the Referat Schrifttum (Literature Section) of the Propaganda-Staffel. After his introduction in June 1942, Heller, a censor, would take a break from the stacks of manuscripts overflowing the shelves, tables, chairs, and floors of his office at 52 Champs-Élysées to climb the spiral staircase, his heart beating with excitement at another chance to observe the most infamous exemplar of modern ‘degenerate’ art at work.



As usual, Picasso was experimenting with color, texture, and form. In addition to woodcuts and pen-and-ink drawings, he worked on cardboard, matchboxes, cigarette boxes, even food, like a piece of bread – a reflection of his creative zeal as well as the shortage of canvases under the Occupation. Many of the objects in his paintings – sausages, legs of lamb, grand buffet tables, and the empty cooking pot – reflected the preoccupations and hardships of the period, as did the death’s-heads and grotesque monsters reminiscent of his early cubist days. Even his choice of colors – more black, gray, and beige – seemed to parallel the drab palette of the Occupation.



On the evening of March 19, 1944, Picasso’s play Le Désir attrapé par la queue (Desire Caught by the Tail) was performed at his friends Michel and Zette Leiris’s fifth-floor apartment on the Quai des Grands-Augustins. It was a dark surrealist farce that featured a star-studded cast: Jean-Paul Sartre, Simone de Beauvoir, and Picasso’s former lover Dora Maar. Albert Camus narrated, describing the largely imaginary sets – that is, except for a large black box that served alternately as a bed, a bathtub, and a coffin.



Picasso had written the play three years earlier, beginning, as he recorded in a notebook, on the evening of January 14, 1941, and, in the tradition of surrealist automatic writing, finishing it three days later. Reminiscent of 1920s avant-garde theater, the play revolved around deprivation and indulgence, or more specifically hunger and sex. Michel Leiris, who had selected the cast, played the lead role of Big Foot. Sartre was The Round End; Raymond Queneau, The Onion; and Jacques-Laurent Bost, Silence. Simone de Beauvoir played The Cousin, while publisher Jean Aubier was The Curtains.



When Gertrude Stein read the script, she suggested that Picasso stick to painting. But the photographer Gyula Halász, better known as Brassaï, thought otherwise. He praised Picasso’s virtuosity, comparing the composition style to the “verbal trance that gave free rein to dreams, obsessions, unavowed desires, comical connections between ideas and words, everyday banalities, the absurd.” This play, he added, displayed the painter’s “humor and inexhaustible spirit of invention . . . in their pure state.”



Since the play ended about eleven, just before that night’s curfew, Leiris invited the cast and several friends to spend the night. They sang, listened to jazz records, and admired Sartre playing the piano. Camus and the host acted out various scenes, enhanced in part by wine served warm with cinnamon. The party ended at five in the morning. Simone de Beauvoir was overjoyed: “A year before we would have never dreamed of gathering together like this and having a noisy, frivolous party that went on for hours.”



This was the first of the fiestas, as Michel Leiris dubbed them, that would take place in the spring of 1944. Beauvoir described another one not long afterwards, at surrealist Georges Bataille’s house in the Cour de Rohan:



We constituted a sort of carnival with its mountebanks, its confidence men, its clowns, and its parades. Dora Maar used to mime a bullfighting act; Sartre conducted an orchestra from the bottom of a cupboard; Limbour carved up a ham as though he were a cannibal; Queneau and Bataille fought a duel with bottles instead of swords; Camus and Lemarchand played military marches on saucepan lids, while those who knew how to sing, sang. So did those who didn’t. We had pantomimes, comedies, diatribes, parodies, monologues, and confessions: the flow of improvisations never dried up, and they were always greeted with enthusiastic applause. We put on records and danced; some of us . . . very well; others less expertly.



Beauvoir, looking back, remembered being “filled with the joy of living. I regained my old conviction that life can and ought to be a real pleasure.”


Monday, May 21, 2012

Enlightened Aging by Nortin M. Hadler




Aging, dying, and death are no longer solely the purview of philosophers and clerics. Many biological and epidemiological theories of aging have been articulated. Some are even testable, and many have been tested. The result is an informative science. We still have much to learn, and many a theory eludes testing, but the product of all this science is a body of information that has much to say to anyone today who wants to reflect on aging, dying, and death. This book is anchored in that body of information. Beyond reflection, aging, dying, and death have arrived at center stage in realpolitik at the urging of economists and for public-policy considerations, given the needs of the burgeoning population of the elderly.



Aging, dying, and death are not diseases. Yet they are targets for the most egregious marketing, disease mongering, medicalization, and overtreatment. This book is meant to forewarn and arm the reader with evidence-based insights that promote informed medical and social decision making. All who have the good fortune to be healthy enough to confront the challenges of aging need such insights. Otherwise they are no match for the cacophony of broadcast media pronouncing the scare of the week or the miracle of the month; pandering magazine articles; best-selling books pushing “angles” of self-interest; and the ubiquitous marketing of pharmaceuticals and alternative potions, poultices, and chants. All are hawking “successful aging” and “long life” as if both were commodities. We awaken every day to advice as to better ways to eat, think, move, and feel as we strive to live longer and better. We are bombarded with notions of risks lurking in our bodies and in the environment that need to be reduced at all cost. Life, we are told, is a field that is ever more heavily mined with each passing year.



There are places on the globe where life is a literal minefield. There are others where it is a figurative minefield. The former are places where ripe old age is the fate of a lucky few, unconscionably only a few. Those places usually have common denominators: inadequate water and sewer facilities, unstable political structures, and dire poverty. They are a reproach to the collective conscience. However, I am writing this book for those of us fortunate enough to reside in the resource-rich world, in countries that have crossed the epidemiological watershed so that it is safe to drink the water. For us, death before our time is not a fact of life; it’s a tragedy. For us, a ripe old age is not a will-o’-the-wisp; it’s likely. And this happy and fortunate circumstance has almost nothing to do with what we eat, with our potions and pills, or with our metaphysical beliefs, and it has very little to do with the ministrations of the vaunted “health-care” systems that we underwrite. In the chapters that follow, this becomes disconcertingly, even painfully, clear.



Dr. Hadler is professor of medicine and microbiology/immunology at the University of North Carolina and attending rheumatologist at UNC Hospitals. He also wrote


Stabbed in the Back: Confronting Back Pain in an Overtreated Society

Worried Sick: A Prescription for Health in an Overtreated America



Saturday, May 19, 2012

Europe's First War with China by Tonio Andrade






The Sino-Dutch War, 1661-1668, was Europe’s first war with China and the most significant armed conflict between European and Chinese forces before the Opium War two hundred years later. The Opium War, of course, was fought with powerful steamships, and China lost badly. The Sino-Dutch War was fought with the most advanced cannons, muskets, and ships, and the Chinese won. The great Ming warlord Koxinga drove the Dutch out of Taiwan.




The Sino-Dutch War is frequently mentioned in historical literature and in textbooks, but there has never been a major study of it in any language that makes use of the many sources – Chinese and European – that are available. Historians will doubtless uncover new documents and find errors and omissions here, but I hope this book will lead to greater understanding of this fascinating episode of global history.



I certainly found it fascinating to write. One thing that absorbed me as I read the sources is how the weather – the planet – became a major character. Time and time again, the war turned on a storm. Even before the war started, a typhoon destroyed a Dutch fortress on Taiwan and altered the sandy island on which it had been perched so much that the Dutch couldn’t rebuild it. This left the Dutch governor Coyet particularly vulnerable to Koxinga’s invasion. Another storm drove away the relief fleet that Coyet had managed to summon against the prevailing winds, dashing one of its vessels to the ground and, more importantly, taking from Coyet the element of surprise. Tide surges, unexpected currents, freak winds – over and over again nature changed the course of the war. I came to believe that nature was more important than any other factor in the war.



I say “nature” because to me all this is an expression of the stochasticity of a beautiful but indifferent universe. As a botanist friend of mine says, “What do the stars care about some slime mold at the edge of one galaxy?” But of course the Dutch and the Chinese saw it differently. Both felt there was a higher power intervening in earthly affairs. The Dutch called it God, the Chinese called it Heaven, and although their cosmologies and theologies differed, they saw in the storms and tides, famines and floods a divine purpose. That each side thought Heaven favored its own people – or should favor its own people – is just the way they were built, it seems.



The fact that I kept finding myself writing about nature comes mostly from the sources – or so I believe, anyway – but it resonated with me because I am trying to make sense of my own time, when climate catastrophe looms, when nature is about to start bucking like never before in our history. It bucked pretty hard in the seventeenth century, too. Right around the time the action in this book takes place, the global climate cooled abruptly. The cooling might not have caused major problems by itself, but it was accompanied by severe climate instability, just as global warming will be. There were floods and droughts, locusts and famines, riots and rebellions. Bandits raged, and governments fell like never before and never since. In fact, if it hadn’t been for this seventeenth-century global climate crisis, the Sino-Dutch War might never have happened. Koxinga might have ended up a Confucian scholar, passing examinations and writing poetry. The Dutch might have kept Taiwan for generations longer.



So did Koxinga win because he just happened to be better favored by the weather? No. Although luck played a role, Koxinga won because of leadership. His troops were better trained, better disciplined, and most important, better led than the Dutch. Bolstered by a rich military tradition, a Chinese ‘way of war,’ Koxinga and his generals outfought Dutch commanders at every turn.



The Sino-Dutch War can thus teach us valuable lessons about military history. It was fought at a time when the technological balance between China and the West was fairly even, a time more similar to today than the periods of the other Sino-Western wars – the Opium War, the Boxer Rebellion, the Korean War – all of which were fought across a steep technological gradient. Military historians have posited the existence of a “western way of war,” a “peculiar practice of Western warfare . . . that has made Europeans the most deadly soldiers in the history of civilization.” But partisans of this sort of argument are generally ignorant of Chinese military tradition. In the Sino-Dutch War, Chinese strategies, tactics, and leadership were superior, and all were tied to a set of operational precepts drawn from China’s deep history, a history that is as full of wars as Europe’s own. The Chinese sources I read are woven through with strands of wisdom from classics like The Art of War and The Romance of the Three Kingdoms. Indeed, Chinese historians have argued that Koxinga’s victory over the Dutch was due to his mastery of this traditional military wisdom.


[The Dutch had the advantage of large ships, which could sail close to the wind and carry large cannons, but Chinese junks could always outrun them sailing with the wind and could be used to much greater advantage in shallow waters. The Dutch had Renaissance fortresses, with bastions that could establish deadly crossfires against besieging forces. After some initial frustrations, however, Koxinga adapted and learned the arduous siege-works necessary to counter their power and force surrender.]


Thursday, May 17, 2012

The Scandalous World of Olive Oil by Tom Mueller




Everyone in the olive oil business in California, and in America, knows a fraud story, because everyone knows a fraudster. The United States, whose olive oil consumption is third in the world and is growing at 10 percent annually, a market worth over $1.5 billion and climbing, has long had the loosest laws on earth concerning olive oil purity, and the new USDA standards passed in October 2010, which mirror the lax regulations of the International Olive Council, remain voluntary, with no provision for enforcement. Thus, the United States of America is an olive oil criminal’s dream.



A recent survey of supermarket extra virgins performed by the UC Davis Olive Center, in cooperation with the Australian Oils Research Laboratory, revealed that 69 percent of the oils tested had taste flaws such as rancid, fusty, and musty, which meant they weren’t extra virgin at all and had been mislabeled. Such cases of “legal fraud” are common in American supermarket oils, as they are in many parts of the world: similar findings have been reached by Andreas März in Germany, by CHOICE magazine in Australia, and by the regional government in Andalucía. Paul Vossen, a University of California oil specialist who, beginning in 1997, trained and led America’s first IOC-recognized tasting panel, said, “We’ve pulled olive oils off the shelf and I would say very seldom do we ever find one that passes as extra virgin.” The same is often true at gourmet retailers and websites.



[A low incidence of cardiovascular disease, dementia, and certain kinds of cancer is among the central benefits attributed to the Mediterranean diet. Since the 1950s, people have accepted that olive oil, the main source of fat in this diet, is the keystone of this healthy dietary regime. Some of olive oil’s positive effects stem from its monounsaturated fat profile – at least it’s not butter or pork fat – but, more and more, medical research suggests that the polyphenols and other micronutrients, which constitute a scant 2 percent of its volume and are present even to that degree only in fresh, genuine ‘extra virgin’ oil, are the source of olive oil’s health benefits. The concentration of beneficial micronutrients varies widely among oils, depending on which of the 700 kinds of olives are used, where they are grown, how much water the trees receive, the ripeness of the fruit at harvest, and the milling and extraction methods used. ‘Extra virgin’ refers to the coming together of all these variables in a rich soup of micronutrients that is completely absent in refined oils. The price for a genuine bottle of extra virgin is approximately $18 per 500 ml.]

But, “Price is by no means an indicator of quality,” Vossen said. “The high-ticket items can be equally bad.”



In the wholesale market, a lot of oil is adulterated outright with cheap vegetable oils. It is rare to find authentic extra virgin in a restaurant in America, even in fine restaurants that ought to know better. It is nearly impossible in some localities, such as southern California, where large-scale counterfeiters pump out blends of low-grade olive oil and soybean oil dyed bright green and sell them to their fences, the big-name ‘legitimate wholesalers’ such as Unilever, Carapelli, and Sysco.



Much of the fake olive oil sold in America is imported. In 2006, in a rare intervention by authorities, federal marshals seized about 61,000 liters of what was supposedly extra virgin olive oil and 26,000 liters of olive pomace oil from a New Jersey warehouse. The shipment was all soybean oil. In 1997, federal marshals had seized a similar shipment from the same company, which turned out to be mostly sunflower seed oil. The company’s founder had pleaded guilty in 1988 to conspiring to import feta cheese contaminated with benzene hexachloride.



Estimates are that at least 50 percent of the olive oil sold in America is fraudulent, with particularly acute problems in the food service industry. “In America, people can pretty much put whatever they want in the container,” says Leonardo Colavita (the son of the man upon whom Marlon Brando's character in "The Godfather" was based). “So long as the product isn’t toxic, you can sell it however you like. If you put seed oil inside extra virgin, you don’t poison anyone, so it’s the consumer’s choice whether to buy it or not. If you bought yourself some extra virgin that turns out to be lampante [‘lamp oil’, the lowest grade, which has to be chemically refined before it can be sold as food], that’s your tough shit.”

The FDA considers olive oil adulteration a low priority. Martin Stutsman says his agency is hesitant to commit its extremely limited resources to fighting olive oil adulteration because he thinks it doesn’t represent a serious public health hazard. True, olive oil mixed with cheaper vegetable oils doesn’t compare in danger or virulence to anthrax, botulism, or salmonella; customers just miss out on health benefits they thought they were getting and paying for. Yet Italian investigators have found hydrocarbon residues, pesticides, and other contaminants in fake olive oils, and pomace oil, a common adulterant, sometimes contains mineral oil as well as PAHs, proven carcinogens that damage DNA and the immune system. Then there’s the 1981 case of toxic oil syndrome in Spain, when rapeseed oil adulterated with an industrial additive and sold as olive oil killed eight hundred people and seriously injured thousands more. Olive oil imported in flexi-bag containers in 2008 was found to be contaminated with naphthalene, a pesticide commonly used to fumigate cargo ships.


In fact, the FDA is itself the victim of a generalized bias against government regulation and an unfounded faith in laissez-faire economics. A November 2007 report on an internal review by the FDA’s own Subcommittee on Science and Technology stated that


The FDA cannot sufficiently monitor either the tremendous volume of products manufactured domestically or the exponential growth of imported products. During the past 35 years, the decrease in FDA funding for the inspection of our food supply has forced FDA to impose a 78 percent reduction in food inspections. FDA estimates that, at most, it inspects food manufacturers once every ten years, and cosmetic manufacturers even less frequently. The Agency conducts no inspections of retail food establishments or of food-producing farms. The FDA does not have the capacity to ensure the safety of food for the nation, and its inability to keep up with scientific advances means that American lives are at risk.



There are signs that this risk is being addressed. In late 2010, Congress passed new food safety bills that aimed to expand the FDA’s power to inspect and recall tainted foods. Whether these aims will be funded in the next budget is another question. There may be some help on the way from the private sector. In August 2010, responding to the UC Davis study that reported widespread mislabeling in the extra virgin grade, the Orange County law firm of Callahan & Blaine filed a class action complaint against manufacturers and distributors alleging fraud, negligent misrepresentation, false advertising, breach of warranty, unjust enrichment, and “misleading and defrauding California consumers for years.”



Callahan & Blaine dropped their suit, but some say that, at least in Los Angeles, though still blending up bad oil, “the bad guys are sleeping with one eye open.”


Wednesday, May 16, 2012

The Politics of Backlash by William J. Stuntz




Before the 1960s, conservative politicians were either indifferent towards crime or mildly libertarian in their attitudes toward criminal defendants. Conservative Republican President William Howard Taft opposed Prohibition; his son Robert criticized the Nuremberg prosecutions. Save for the father’s fondness for trust-busting and the son’s late-career flirtation with McCarthyism, neither Taft ever sought to make political hay from crime. For political conservatives, that stance was natural. Criminal punishment is an especially intrusive form of government regulation. Spending on criminal justice – including prison spending – is redistributive: money spent to warehouse poor criminals comes disproportionately from rich taxpayers’ pockets. Conservative politicians dislike government regulation and redistributive spending.



Two conservative governors in the liberal 1960s – George Wallace and Ronald Reagan – upended that tradition. Before Wallace, southern politicians’ chief goal with respect to crime was to keep the federal government away from it. Wallace sought to keep the federal government away from civil rights – but with respect to crime, he focused not on states’ rights but on black criminals, and (even more) on the liberal white judges who allegedly protected them. His 1968 stump speech included these lines: “If you walk out of this hall tonight and someone knocks you on the head, he’ll be out of jail before you’re out of the hospital, and on Monday morning they’ll try the policeman instead of the criminal.”



As race riots struck many American cities, Wallace bragged about Alabama’s version of social peace: “They start a riot down here, first one of ‘em to pick up a brick gets a bullet in the brain.” Such racially charged rhetoric worked: Wallace ran strong races in three Democratic presidential primaries in 1964, and four years later he carried five states and won 13 percent of the popular vote on a third-party ticket.



Reagan was more subtle – instead of rhetorical bullets to the head, Reagan noted sadly that “our streets are jungle paths after dark” (the jungle reference was a clear piece of racial code: Reagan wasn’t that subtle) – and also more effective. In his 1966 campaign for California’s governorship, Reagan took Wallace-style tough-on-crime rhetoric, made it more respectable, and used it to draw blue-collar Democrats across the partisan aisle in huge numbers: enough to win by a million-vote margin against a seemingly unbeatable opponent. One of his key tactics was to link urban rioting with disorder on college campuses – a largely white crime problem. That move helped him appeal to white racists without identifying himself as one of them. By doing so, Reagan married two political constituencies that his contemporaries thought were incompatible: economic conservatives who had opposed the New Deal and white union members who had formed its core base of support.



Partisan politics was transformed. To northern and western politicians of the 1950s and early 1960s, blacks and pro-civil rights whites were the swing voters for whose allegiance the two parties competed. Dwight Eisenhower won 40 percent of the black vote in 1956; Richard Nixon won nearly a third in 1960. With the support of every Republican Senator, the Republican Eisenhower administration pushed for major civil rights legislation in 1957; though the bill was watered down by Senate Democrats, Eisenhower ultimately signed the first such legislation enacted since Reconstruction. While blacks were the object of partisan competition, blue-collar whites were generally seen as a core part of the Democratic base. Reagan intuited that, thanks to the Kennedy and Johnson administrations’ support for civil rights, blacks and white liberals were now solidly Democratic; yesterday’s swing voters didn’t swing anymore. Rising crime, falling punishment, and liberal Supreme Court decisions protecting criminal defendants’ procedural rights had created a new set of swing voters: blue-collar whites. That changed electoral configuration gave conservative Republicans the opportunity to build a national majority, just when that opportunity seemed most distant.



The Warren Court’s criminal procedure decisions were crucial to that process, in three respects. First, those decisions allowed politicians to attack black crime indirectly by condemning the white judges who protected black criminals, not the criminals themselves. That gave conservative politicians like Reagan a chance to appeal to more than racist whites. Second, the Court made street crime – violent felonies and felony thefts: classic state-law crimes – a national political issue for the first time since Reconstruction. One reason crime played a larger role in national politics in the last decades of the twentieth century than ever before is that, in the midst of fighting a crime wave, national politicians could talk about the kinds of crime that voters feared the most. Instead of Mafia corruption of local governments and labor unions – the crimes that made Estes Kefauver’s and Robert Kennedy’s careers – the combination of the Supreme Court’s decisions and the 1960s crime trends made robbery, burglary, murder, and rape national issues. Earlier generations had assumed that only local officials concerned themselves with such crimes. Earl Warren helped change that equation. Third, because the Court was the Court, crime talk was cheap: politicians couldn’t change the constitutional rulings that prompted so much controversy, so their criticisms were unburdened by the need to exercise governing responsibility.



Reagan and Wallace exemplified that last point. California’s imprisonment rate fell by nearly half during Reagan’s two terms in Sacramento. Alabama’s imprisonment rate did likewise under Wallace. Neither of these tough-on-crime governors managed to reverse those trends. Their tough rhetoric was just that: rhetoric. Like Kefauver’s hearings and Hoover’s Ten Most Wanted list, the conservative politics of crime was an exercise in political symbolism that seemed, much like the Supreme Court’s procedural decisions in criminal jurisprudence (Miranda, etc.), to have no substantive consequences.



But symbols do not remain purely symbolic for long; substantive consequences have a way of catching up. When conservatives like Reagan, Wallace, and Richard Nixon won blue-collar votes by attacking soft judges and (indirectly) black criminals, liberal politicians were forced to respond. Liberal Democratic President Lyndon Johnson supported and signed legislation that funneled money to local police and purported to overrule Miranda v. Arizona: the Omnibus Crime Control and Safe Streets Act of 1968, the first of what became a long series of federal crime bills targeting urban street crime. Liberal Democratic presidential candidate Robert Kennedy made tough measures against urban disorder a centerpiece of his campaign for his party’s nomination. Jimmy Carter – embodiment of the southern Left in the early 1970s – presided over a 40 percent increase in Georgia’s imprisonment rate, while neighboring Alabama’s prison population stagnated. Liberal Republican Governor Nelson Rockefeller proposed ramped-up penalties for heroin offenders; the so-called Rockefeller laws became the model for the next wave of tough state drug statutes. The same year Rockefeller signed those laws, New York’s imprisonment rate turned up after fifteen years of decline.



For the balance of the 1970s – as liberal Democrats controlled Congress, most state legislatures and governorships, and nearly all big-city mayoralties – prison populations rose steadily. America’s punitive turn did not come from the political right, at least not initially. Rather, the rise in punishment came from the left’s response to the right’s rhetoric. That soon bred its own response. Once liberal politicians like Johnson and Kennedy embraced punitive politics, the right’s bluff had been called. Conservative politicians had two choices: they could back down, cede the crime issue to their liberal opponents, and admit that their tough rhetoric was cheap talk. Or they could follow suit and ramp up punishment still more.



They followed suit. Reagan was once again a key player, the model for his party and for his ideological camp. As governor, he had specialized in combining tough talk with soft policy or no policy at all. As president, his walk matched his talk: he signed into law the most draconian piece of drug legislation to date; partly as a consequence, the federal imprisonment rate doubled in the 1980s. In an increasingly conservative age, state prison populations saw similar trends. What began as a political bluff had become a bidding war.



Overall, late twentieth-century states with Republican legislatures and governors increased prison populations faster than states ruled by Democrats – but there were plenty of exceptions: Ann Richards in Texas, Mel Carnahan in Missouri, and Douglas Wilder in Virginia were all Democrats under whose administrations prison populations expanded more rapidly than under the Republican administrations of their predecessors or successors.



The moment that best captured both the liberals’ dilemma and their response to it came shortly before the New Hampshire primary in 1992. Then-Governor Bill Clinton, falling in the polls, returned to Arkansas to supervise the execution of a mentally disabled black inmate named Ricky Ray Rector. It worked: Clinton finished a close second, was hailed as “The Comeback Kid,” and went on to win the White House. The Rector execution was Clinton’s gruesome answer to the elder Bush’s use of Willie Horton to defeat Michael Dukakis four years earlier. The character of the answer captures the relevant political dynamic. This was no philosophical argument between opposing sides; rather, it was a war of images in which both sides sought to send the same message. The politics of crime had devolved into a game of can-you-top-this.

Bush probably found Lee Atwater’s Horton ad distasteful, and Clinton may have felt similarly about Rector’s execution. If so, the two presidents’ distaste highlights an important feature of late twentieth-century politics: right and left alike supported criminal justice policies that, in principle, they found repugnant. The Reaganite right opposed big government yet helped to create a prison system of unprecedented scope and size. The Clinton left opposed racially discriminatory punishment yet reinforced and expanded the most racially skewed prison population in American history. The source of this conflict between politics and principle was the same on both sides. Crime policy was not a means of addressing crime – and the policy’s consequences for the poor blacks who were both victimized by crime and punished for it were, politically speaking, irrelevant. Each side supported punitive policies because the other side had done so, and because changing course seemed politically risky.



Such political stances worked because the votes that mattered most – the votes for which the two parties competed, the ones most likely to switch sides if the other side’s crime posture seemed more attractive – were not the votes of crime victims and their friends and neighbors, much less of criminal defendants and their friends and neighbors. They were the votes of those for whom crime was at once frightening and distant, those who read about open-air drug markets and the latest gang shootings in the morning paper but never saw them for themselves. Neighborhood democracy for the communities in which most of the crimes occurred faded, and was replaced by the democracy of angry voters in safer havens. The consequence was much more criminal punishment, distributed much less equally.




Monday, May 14, 2012

Captain Bligh by Anne Salmond




If one reads all the documents relating to the mutiny on the Bounty, it is clear that Edward Christian’s vindication of his brother and his fellow mutineers was overstated. Although Fletcher was capable and popular, his shipmates all agreed that he was susceptible to women, and no doubt his decision to take over the ship was influenced by his desire to return to Tahiti. And while Peter Heywood always insisted that George Stewart was a loyal officer who remained true to his captain, several of the Bounty’s men indicated that Stewart had sympathized with the mutiny. Indeed, according to John Fryer and James Morrison, Stewart had helped inspire the uprising, suggesting to Christian that the ‘men are ripe for anything’. If Heywood’s defense of Stewart fails (and it does seem far-fetched), then his account of his own conduct during the mutiny also falters. Since Stewart was his friend and mentor, it seems more likely that the younger midshipman followed Stewart’s example, and simply let the mutiny unfold.



At the same time, William Bligh was seriously flawed as a commander. Vain and ungenerous, he had a volatile temper and a biting tongue. Unlike his mentor Captain Cook, he lacked charisma or an imposing physical presence; and unlike Charles Clerke, he had no sense of humor. Obtuse to the point of cruelty, he had little empathy, except for his family and a few young protégés, and no gift for the art of political management. Often spoken of as a ‘passionate’ man, William Bligh had a violent temper that exploded when he was thwarted. Determined to silence his critics by making a perfect voyage in the Bounty, and then in the Providence, he was enraged by any lapses that threatened his record, and knew how to humiliate those responsible for such infractions. An insecure man, prone to elaborate feats of self-justification, Bligh had a gift, almost amounting to genius, for insulting and infuriating his immediate subordinates. N.A.M. Rodger’s crisp verdict on the mutiny on the Bounty – that Bligh was ‘an outstanding seaman with an ungovernable temper and no idea about how to get the best out of his officers’, and Fletcher Christian ‘an unstable young man who could not stand being shouted at’ – gets close to the heart of the matter.



Compared with Captain Edwards, however, Bligh could be warm and engaging (especially to those who posed no threat to his reputation). In his domestic life, he was ardent and faithful, in stark contrast to many of his former shipmates (Molesworth Phillips, for example, the Resolution’s lieutenant of marines, was an infamous brute to his wife and a bully to his children; while James Burney, Phillips’s brother-in-law, had a series of affairs, including an incestuous one with his half-sister). In this respect, he was more like Captain Cook than most of their comrades. Compared with George Vancouver (and almost every other British commander in the Pacific), too, Bligh was a paragon of restraint in his methods of discipline (flogging only 10 percent of the Bounty’s crew and 8 percent on board the Providence, compared with 25 percent on board Cook’s Resolution and 45 percent on board Vancouver’s Discovery, for instance). On the basis of these figures, his reputation for brutality – initiated by Edward Christian (although Christian did not accuse him of physical violence) and later elaborated into a popular myth of Bligh as an archetypal ‘flogging captain’ – was a triumph of rhetoric over reality.



William Bligh was a fine practical sailor and hydrographer, and a gifted ethnographer who gained some real insights into life in Tahiti. If he tormented his men, they knew how to pay him back with insolence and passive resistance. The responsibility for the breakdown in relationships on board the Bounty cannot be sheeted home to Bligh alone, but must be shared by his officers, especially George Stewart and Fletcher Christian.



Under most circumstances, too, the tensions aboard the Bounty would not have led to a collapse of command. Unfortunately, the planning of the expedition by Joseph Banks and the Admiralty had been fatally flawed, placing Bligh and his officers under intolerable pressures. If the government had been more generous, or Banks had selected a larger ship for the breadfruit expedition, with a Great Cabin for her captain and room for her officers and crew, the atmosphere on board might have been different. If the Admiralty had sent Bligh his orders in time for him to sail around the Horn, his stay in Tahiti would have been brief, just a matter of weeks, and shipboard discipline is unlikely to have unraveled. Had there been other commissioned officers on board, Fletcher Christian would not have been appointed acting lieutenant, without a proper commission and dependent on the whim of a quick-tempered, verbally abusive commander. If there had been a contingent of marines on board, the young officer, fearful of being shot, is unlikely to have indulged himself with fantasies of mutiny and desertion. Like Frank Bond on the Providence, Christian would have been forced to swallow Bligh’s insults and do his duty; and the mutiny would not have happened.



To make matters worse, by the time the Bounty’s men were brought back to England, the French Revolution was unfolding; and the government and the Admiralty could not afford to admit their own part in the responsibility for the mutiny (along with that of Sir Joseph Banks, a close friend of King George III). [But when does the government EVER admit its responsibility in ANY disaster, French Revolution or not?] Instead, the Bounty sailors (those who survived the brutality of their captivity) were put on trial; and after the court martial, three of them were hanged from the yardarm. When public sympathy for the Bounty officers, especially for the well-connected Peter Heywood and Fletcher Christian, was aroused by their relatives, William Bligh became the scapegoat. At the same time, Bligh’s intemperate tirades were the sparks that ignited the mutiny, driving his subordinates to distraction. His bad luck and his bad language proved to be an inflammable combination.



At about this time in Britain, a number of influential Evangelical Christians who had read the official accounts of Cook’s voyages became inspired with the idea of taking the Gospel to the heathen in the Society Islands. As Dr Thomas Haweis, chaplain to the Countess of Huntingdon, explained:


… I cannot but feel a deep regret that so beautiful a part of Creation, and the inhabitants of those innumerable Islands of the Southern Ocean should be regions of the shadow of Death, the Dens of every unclean Beast, and Habitation of Cruelty literally devouring one another.



Enthralled by the idea of converting the ‘heathen’ inhabitants of the Pacific, Haweis spoke with Joseph Banks about taking missionaries to Tahiti on the second breadfruit voyage, and he must have been persuasive, because Banks agreed, provided that the government gave its permission. Haweis was also in touch with William Wilberforce, the anti-slavery campaigner, who helped to secure the government’s support for this proposal, as long as Haweis trained the missionaries and paid for their passage and equipment.



Dr Haweis already had two missionaries in mind, young men named Waugh and Price from the Trevecca Wesleyan college in Wales, but they refused to sail on the Providence unless they were given a pension in the event of their untimely return to England. After this had been arranged, they demanded to be ordained. When the Bishop of London refused, because neither of these men had studied at Oxford or Cambridge, they withdrew from the voyage. By this time Haweis had lost patience with them, remarking caustically, ‘The event left me no regrets.’



Although by this time Banks was a baronet, the President of the Royal Society and a respectably married man, he looked askance at the Dissenters and their pious habits. During his youthful visit to Tahiti, he had reveled in the delights of the arioi society*; and when one of its members, Ma’i, arrived in London in 1774, Banks and his close friend Lord Sandwich had introduced the young Ra’iatean to high society, taking him on jaunts into the country, where they diverted themselves with concerts, feasts and ladies of pleasure, scandalizing those who thought that they should be teaching Ma’i about Christianity and how to read the Bible. As Sir Harry Trelawny exclaimed to the Revd Mr. Broughton of the Society for Promoting Christian Knowledge: “I hint to you what has, I doubt not, appeared to you as it does to me a strange and diabolical neglect – the non-baptism & non-instruction of the Indian Omiah – he is brought here where the full light of the glorious Gospel shines unclouded – and what has he learned? Why, to make refinements on sin in his own country.” Horace Walpole, the British literary eminence, had remarked in a letter to Rev. William Cole in 1780: “How I abominate Mr. Banks and Dr. Solander who routed the poor Otaheitians out of the center of the ocean, and carried our abominable passions among them!”



Stung by these criticisms, one of Banks’s circle penned a pamphlet in which Ma’i was made to describe his encounter with a ‘Methodist preacher who told me that I had been damned to all eternity, had I not been so happy as to have heard the name of Christ and talked about Adam and Eve, and original sin.’ He also passionately defended Tahitian morality, pointing out the lack of Christian charity in such blanket condemnations. Through Ma’i’s fictitious voice, the evangelicals were mercilessly lampooned and accused of preaching ‘the efficacy of faith without works, and doing much harm to the common people of England’.



No doubt this reflected Banks’s own private opinion. Sceptical of evangelical zeal, he was ‘little inclined to Conversions’. In a letter to Count William Bentinck on the ‘Manners of Otaheiti’, he had praised the Tahitians for their sexual tolerance, observing that there ‘the want of Chastity does not preclude a woman from the esteem of those who have it. . . yet are there women there as inviolable in their attachments as here’. . . .



During his Bounty visit to Tahiti, Bligh had delighted in local customs, describing the island as the ‘Paradise of the World’ and demonstrating real insight in his descriptions of Tahitian society. After the mutiny, however, he was quick to blame the island’s seductive women for leading his men astray, like the Sirens in the Odyssey or Eve before the Fall. In his search for self-justification, Bligh showed no mercy towards the mutineers – those ‘pirates’, ‘wretches’ and ‘villains’ – and their island partners. It is also probable that before he sailed from England on the second breadfruit expedition, his attitude towards the Tahitians had been colored by contacts with Dr Haweis and the other evangelical Christians, who saw the islanders as living under the dark shadow of Satan and hoped to bring them to the light of God. The officers on board Bligh’s ships knew about the proposal to bring missionaries to the island, and as they sailed away from Tahiti, George Tobin pronounced his verdict on this pious scheme:



What the exact creed of the Tahitians is, it is not in my power to explain. Yet it is a good one, if faith and good works travel in amity with each other. In the latter, these islanders are “eminent beyond compare”. They encourage a lesson of morality and good will among one another that puts civilization to blush.

Let us then – I still may be in error – in the name of Providence have done with missions of this kind. Take a retrospect of their sanguinary exterminating consequences in many a large portion of the world, and humanity will tremble. The Tahitian needs no conversion; he divides what he has with the stranger, as with his neighbor. He administers to, he anticipates their wants. Can he be taught more, and still retain these amiable and generous qualities?



No doubt Joseph Banks heartily agreed.



For evangelical Christians, and increasingly for William Bligh, however, it was blasphemy to think of Tahiti as a Paradise on earth. “Ah!” thought they, “you may sit under spreading trees, eating the golden bread-fruit, or drinking the sweet milk of the coconut: but how can you be happy when you know not of the Paradise above, nor of the Savior who can wash out your many crimes with his blood? For soon death will snatch you from your sunny isle, and bring you before the judgment-seat.”





*The Arioi society of Tahiti consisted of a special class of entertainers whose purpose was as much spiritual as for amusement. According to Cook, every man and woman in the society was held in common to the others, and sexual relationships between any two individuals rarely lasted more than two or three days. As Moerenhout stated: "Who would not have wished to belong to a society, whose members only seemed to live and die to be happy?" The Arioi carried the idea of the sacredness of sex to extremes.


Thursday, May 10, 2012

Writing Slaughterhouse Five by Charles J. Shields





The friendship between Kurt Vonnegut and Bernie O’Hare had begun in army boot camp and ripened into brotherhood during the Battle of the Bulge and their winter march into captivity as POWs. But the event that melded their lives together was the firestorm of Dresden. They talked about that experience in a private language, usually late at night when both had had a few drinks, like a pair of mediums conjuring voices and scenes from long ago.


In 1965, when God Bless You, Mr. Rosewater had received more attention than any of his previous novels and it began to look as if Kurt could break into the publishing world in a big way, Vonnegut understood that he needed to get back to his war book – at least now he had a title he liked, Slaughterhouse-Five – and he decided to go see his army buddy in Hellertown, Pennsylvania, once again.



The two men smoked, drank, laughed, and went over the details of their capture, the hardships, and their release – the same as they had countless times before – but Kurt was beginning to think he still didn’t have much of a book, because his perspective was no different from that of dozens of other novels about the war. Bothering him too was Bernie’s wife Mary, who kept banging the ice cube trays on the kitchen counter, closing doors loudly, and huffing. Bernie indicated nothing was wrong, but Kurt was getting uncomfortable.



Then she turned to me, let me see how angry she was, and that the anger was for me. She had been talking to herself, so what she said was a fragment of a much larger conversation. “You were just babies then!” she said.

“What?” I said.

“You were just babies in the war – like the ones upstairs!”

I nodded this was true. We had been foolish virgins in the war, right at the end of childhood.

“But you’re not going to write it that way, are you.”

That wasn’t a question. It was an accusation.

“I – I don’t know,” I said.

“Well, I know,” she said. “You’ll pretend you were men instead of babies, and you’ll be played in the movies by Frank Sinatra and John Wayne or some of those other glamorous, war-loving, dirty old men. And war will look just wonderful, so we’ll have a lot more of them.”




He assured her that he wouldn’t write a set piece for some Hollywood star to shout “Let’s go, boys!” as the troops whistled their way into Berlin wearing laundered uniforms; he pledged that if he ever did write the novel, he would include the phrase “The Children’s Crusade” in the title.



Thus it took Mary O’Hare, who wasn’t enamored of the ancient arms-and-the-man ethos about war, to push Vonnegut off dead center about his big book. The truth was that as a twenty-one-year-old private, he hadn’t understood what was happening to him from the afternoon the 106th packed their gear at Camp Atterbury to the morning when he and the other POWs walked into Dresden at dawn. There had been no Ajax, no Achilles in Vonnegut’s anti-Iliad. The sacking of Dresden had been accomplished surgically, at night, from high above by men in machines who returned to their homes in a few hours, not years later like Ulysses. There were no classical heroes in twentieth-century total war, only victimizers and victims. It was the breakthrough he needed after two decades of false starts.



Even then, he might have preoccupied himself with projects that were easier, had he not been offered a position on the faculty of the creative writing program at the University of Iowa – ideal conditions for writing Slaughterhouse-Five straight through. . .



Here also, in an unlikely Midwestern town, Kurt Vonnegut became part of a nucleus of professionals like himself, all of them vibrating sympathetically to the latest changes in the environment of American literature.



During the course of talking about writing with his fellow instructors, he became especially intrigued with the ideas of Robert Coover. Coover was teaching courses in experimental fiction and working on what would become his most highly praised novel, The Universal Baseball Association, J. Henry Waugh, Prop., an early example of metafiction, as it came to be known.



Metafiction is “fiction about fiction.” The true subject is not the characters or other conventions of realism – plot, setting, the suspension of disbelief – but the writer’s self-consciousness. Through irony, deliberate artifice, and digressions, the reader is reminded that the story isn’t real. Unrestrained by convention, many writers found they were free to insert themselves into the narrative in ways that might be ironic, political, comical, metaphysical, or polemical. Said one of Coover’s students, “I learned to see what I was doing in terms of traditions and possibilities more universal than realism.”



Vonnegut’s background in journalism had taught him the opposite: that you must not become part of the story. But the firebombing of Dresden, as he experienced it, was his story. And metafiction gave him permission, in a sense, to tell it brokenly, hauntingly – the way it came to him in his dreams.


Tuesday, May 8, 2012

The Brain-Death Revolution by Dick Teresi




The author suggests that ‘brain death’ has replaced cardiopulmonary failure as the main criterion by which doctors decide whether a person is alive or dead, and that this has been done in order to facilitate the lucrative business of organ transplantation. The author’s primary beef (or ‘dead horse’) is the conclusions of the Ad Hoc Committee of the Harvard Medical School to Examine the Definition of Brain Death, as published in the August 5, 1968, edition of The Journal of the American Medical Association, entitled “A Definition of Irreversible Coma”. Though he admits that this definition and its tests and procedures have no legal standing, he claims that they are the gold standard used by the medical establishment in the U.S.



According to Mr. Teresi, the only infallible way to confirm the death of an individual who has been declared ‘brain dead’ or who demonstrates the symptoms of irreversible cardiopulmonary failure is the age-old wait for certain signs of decomposition. Of course this would make it impossible for organ donors and their families to get paid for their gifts (as Teresi suggests they should throughout his book), since transplantation (except in the rare and ethically challenged case of live donors) would thus be taken completely off the table.



Teresi complains that the Harvard Committee and medicine in general have substituted a philosophical definition of ‘dead’ for a biological one, but, of course, that is what he does as well, besides exploiting people’s fears that they will be ‘buried alive’, sacrificed for their organs in a mammonish sort of way, or helplessly experience excruciating pain in the last minutes, hours, days, weeks, months or years of their lives. So there is a definite ‘tabloid’ aspect to the book and, as far as I know, significant mischaracterization of transplant programs (at least at Massachusetts General Hospital), though it is not written without some imagination and interest for the general reader, who may not be facing the complex reality of his or her own death as honestly and practically as they might.



Sunday, May 6, 2012

Concussion Crisis: Special Education, Drug Abuse and Homelessness by Linda Carroll and David Rosner




Dr. Wayne Gordon specialized in the neuropsychology and rehabilitation of traumatic brain injuries at Mount Sinai School of Medicine in New York City. Most of the patients he saw were adults, but he began to wonder what might be happening to kids in similar situations. He and his colleagues developed a questionnaire designed to ferret out undiagnosed TBIs and cognitive difficulties in children and took it into New York City schools. The results gave Gordon pause. In one city school, 10 percent of the children said they had sustained a significant head injury. When tested later, these children turned out to have cognitive impairments. With a grant from the U.S. Department of Education, he was able to explore the issue further, surveying children who’d been enrolled in special education classes. He was startled by the result: more than 50 percent of the learning-disabled children had experienced a sharp jolt to the head.



The typical curriculum in special education classes didn’t help with the deficits associated with traumatic brain injuries. Gordon realized that the best way to help these children was to educate the educators. He gathered up a team of Mount Sinai psychologists and, with federal dollars that had been set aside to fund TBI education, set up a project in 1995 to send them into New York City schools.



They taught teachers to identify the specific signs of TBI and showed them strategies to help brain-injured children cope better with the demands of school: helping them focus their attention and avoid distractions, making the lettering on handouts larger, and limiting the amount of information presented on a single page so students wouldn’t be overwhelmed. Some students were provided with peer note-takers and tape recorders to help them focus on understanding what was being said. Since brain-injured children tended to become exhausted easily, breaks were scheduled between tough classes like math and science. Students were encouraged to visit a special resource room before and after school so teachers could make sure they had the right assignments or loan them materials they might have forgotten to bring to school. Since TBIs often lead to slow mental processing, the kids were given more time for tests and reduced homework loads. Students were given more time to formulate questions in class, and were encouraged to create day planners and color-code their folders and notebooks. They were allowed to use calculators, since it is so difficult for TBI kids to memorize multiplication tables.



Over the five years that the program was in effect, the psychologists from Mount Sinai worked with more than four hundred children. Funding petered out in 2001, and no one else stepped up to keep the TBI program going.



The experience with “hidden” TBIs in the school system led Dr. Gordon to suspect that other people might be getting off track because of unrecognized brain damage. Since a 2000 study had shown that people with a head injury were at higher risk for depression as well as alcohol and drug abuse, Gordon and his colleagues decided to look at the prevalence of TBI in New York State substance abuse programs. The researchers interviewed more than eight hundred patients and found that 54 percent had a history of head injuries. Forty percent of those with a history of head trauma had symptoms indicative of post-concussion syndrome. Further, those with head injuries turned out to have more mental illness and to be more prone to recidivism and treatment failure. “That suggests to me that these folks need a different treatment program,” says Gordon. “You can’t expect people with learning and memory problems to learn at the same pace as everyone else. If you see a thirty-day program doesn’t work, that may mean that these people need sixty or ninety. Maybe they need structured environments to live in, too.” Also, Gordon rightly concluded, early intervention could prevent damage down the line. “If (these kids) had been picked up and identified and treated as folks with TBI upon that first injury, they might have gotten the services they needed to prevent them from going down the path to substance abuse.”



Later, Gordon and his colleagues tested one hundred homeless persons for signs of brain injury. Nearly 70 percent had deficits in memory, language, or attention – all indicative of a possible brain injury. Two percent reported a significant jolt to the head before they became homeless, often the result of abuse by a parent. Many of these people might be in a very different place in life had their brain injuries been recognized as serious and had they received treatment. While the solution seems simple – get patients diagnosed quickly and then give them whatever rehab is necessary – it doesn’t translate into reality so easily.


Friday, May 4, 2012

Fifth Century Greek Theatre by Jacob Burckhardt




In the Athenian theatre tragedy created the last and grandest realization of myth; writers now treated it with absolute freedom to attain a new psychological depth, while comedy delighted everyone with its grotesque transformation of daily concerns and its caricature of a richly varied world. Clearly Athens was the sole possessor of the two dramatic forms, and was to remain so. It was only here that the Greeks could grasp the perspective on Hellenic civilization that the theatre offered, though at the great agonal sites all the rest of poetic and musical art might be briefly presented in concentrated form. Till this time the only drama known to the ordinary Greek had been the sacred pantomime in which the priest or priestess acted single scenes from the myth of their own temple deity, or else the clowning, character imitations and farcical turns which probably developed impromptu from dialogue and horseplay.


Now the Greek became aware that in one city in his country a living representation of the whole of myth had arisen out of the tumult of the Dionysian cult; he also learned that a huge structure was specially devoted to it, with a semicircular space where the audience felt as if it were in a second popular assembly, while, on a stage, the things that were elsewhere recited by bards or shown in pictures were magically enacted by real people and large choruses. He also heard that on certain festive days, the image of the true Athens of real life was brought before its people in a colossal and grotesque transformation. Finally the individual names of great writers rang throughout Greece as the inventors of all this, and of this new and unique kind of poetry. And this new thing was not some curiosity imported from Asia, but a Hellenic creation in the fullest sense, a deep and essential part of the national life.



Theatre had a darker side. As we have said, the compulsory choregia was often a burden on wealthy men. The personal insults in comedy were astonishingly coarse and crude, and what has always been reckoned filthy the world over was filth in Aristophanes too (as well as in the iambic poets before him) – however hard some scholars try to make an exception for him. On the one hand the tone of Athenian society must clearly have been conditioned by comedy, and on the other hand we cannot ignore the effect that this must have had on the victims. For a society and a social set accustomed to having comedy hanging over them all year round, with all the other guillotines of the polis that menaced them, there was no doubt a strong incentive to affect indifference. In their hearts, however, nobody can really have been indifferent except those whom it robbed of all shame, and when, at every street corner and at every banquet, people would meet the victims of comic writers, or know that they themselves would be victims at the next Dionysia, it must have given rise to that form of consciousness in which the mind secretly closes one door after another, and finally the innermost door of all.



What is extremely characteristic of the Athenian temperament, as distinct, probably, from that of all other Greeks, but certainly from other nations, is the attitude of the old comedy to the political situation. No modern nation would tolerate this objective view of itself, and in a solemn, semiofficial context at that; least of all in emergencies and times of universal suffering and anxiety. The whole grotesque accompaniment to the Peloponnesian War which comedy provided would be condemned by any city in our day, and a writer like Aristophanes would be regarded as a heartless jester on the theme of public misery. Yet, as comedy shows, Athens then bred and tolerated not just one poet but a collection of poets of the same kind, writing in a mature, casual style and aloof from the shared values in a way that has been unthinkable in later nations. Comedy was able to defy and mock not only the rulers of the day, but also universal common sentiment; Athens acknowledged the supremacy of the joke at her own expense.


Tuesday, May 1, 2012

The American Dream Legend by Margaret Gullette




In the United States, the legend of the American Dream – open to all – actually becomes more significant as larger numbers of people go into economic free fall. The gap between what people might hope for and what they are likely to get grows vast. Margaret Mead described what is required when social change goes wrong – when it feels too fast, too intense, too generationally divided, when the systems become brittle and individuals less secure. “The idea of progress, which provides a rationale for the unstable situation, makes it bearable.” Just before the 2008 election, as the gap grew week by week, nobody said the rhetoric about the American Dream was a wicked lie. It was a necessary hope. It made possible an electoral success that few had initially deemed possible. “Change we can really believe in” really meant “Progress we can believe in.”



Yet the requisites for believing in life-course progress after youth have become more elusive in the U.S. over the last thirty years, as the country began to produce less, unionization declined, wages stagnated, and income inequality grew. Layoffs appeared as a mass phenomenon, absent a Depression, for the first time over two decades ago. And the highest rates of displacement in the 1990s occurred among midlife segments of the population. Unemployment is terrible for anyone, but many younger people find jobs quickly when the economy picks up, while many midlife people remain left out of improvements in the business cycle. Even if age discrimination is not the cause of a job loss, midlife discrimination can be a problem when looking for the next position. It bears repeating: displacement among workers in their fifties and sixties often results in lower wages or lasting unemployment. Some are forced out of paid work altogether.



Seniority is the reason that any American can still acquire life-course benefits like respect as we age into our middle years and past retirement. It is not our own merits per se but the existence of a system of age-graded benefits that enables many of us to climb the ladder of income – up to a point – as we climb the ladder of years. But the United States, the richest empire in the history of the world, has been weakening seniority in frightening ways since the 1980s. The Bush administration decimated a well-established system by denying union protections to employees of the Department of Homeland Security, who amounted to one out of twelve government employees. The Supreme Court has weakened the Age Discrimination in Employment Act. The academy is weakening tenure: between 1975 and 1996, colleges saw a 12 percent drop in professors hired with the possibility of tenure, and a 92 percent jump in positions without the possibility of tenure. As business weakens seniority through downsizing, outsourcing, and clawbacks from unions, forces of de-professionalization are ending it tacitly through losses of authority that affect everyone from judges (mandatory sentencing) to doctors (HMOs).



The ‘midlife crisis’ may have seemed like a hiccup in life-course storytelling when it was first named decades ago, but this innocuous, privatized, misleading term has turned out to disguise a vast national crisis. The degradation of work in America – longer hours, shorter tenure in contract jobs, fewer benefits, high and widespread unemployment, eroded or nonexistent pensions – along with the cult of youth, is undermining the security and income support on which self-continuity and progress depend in midlife and beyond. Economic insecurity is the truth offered to more and more Americans, at younger and younger ages, starting earlier for the disadvantaged. “Late capitalism fills young people with ambition and aspiration which by definition can only be enjoyed by the fortunate few,” Mark Brand writes in The Possibility of Progress.



Young people may have reasonable, class-based aspirations, but once the majority age past youth, hope for growing material success, more autonomy, efficacy, responsibility, trust – everything good in older Americans’ life-course stories – seems less and less in their control. Too often, the only prospective narrative that makes sense is decline.



Adults, too, need help in maintaining their progress narratives, but it will not come from anti-aging products or blind faith in change. Governments, law, and market forces structure the conditions in which each of us writes our life-course narrative – conditions not entirely of our own making. These help some and deprive or neglect others. Whose progress is it?




Whoever acts privately on the preferential option for the poor – whether in Nicaragua, low-income areas of the U.S.A., or anywhere else in the world – is compensating for some basic government neglect or active government harm. Housing, clean water, health care, jobs – most governments in the world don’t provide them. They’re governing in name only. They have a flag and an army. Globalization means that more nation-states can’t do it: they are being ground deeper into poverty through debt repayment, exploitation, and resource extraction by multinational corporations, and through inhuman cuts to health and education imposed via the International Monetary Fund. Even for children, such governments provide only in the barest way. Unless an international outcry focuses on subcategories of adults – people with AIDS, say, or the famine-stricken – neoliberalism finds a way to justify abandonment, arguing that self-reliance is the highest good (except for the rich), that grown-ups need to pay for what they get, that NGOs helping them ought quickly to become self-financing. The good work of activists almost always has this other face leering out at it – not a sad or remorseful face, which might be some comfort – but a malicious, blind, or hypocritical face.