Roode History

Tuesday, November 18, 2008

Think Big on Health Care

November 16, 2008, 9:30 pm

In this installment of Health Care Watch, Stuart M. Butler and Ezekiel Emanuel talk about what President-elect Barack Obama should and shouldn’t do on health care reform. Go to Mr. Butler’s post.

Ezekiel Emanuel, an oncologist, is the chairman of the department of bioethics at the Clinical Center of the National Institutes of Health. He is the author of “Health Care, Guaranteed: A Simple, Secure Solution for America.”

The election of Barack Obama is a historic, transformative event. As he and his new administration wrestle with health care reform, here are five points to keep in mind.

1) “Make no little plans. They have no magic to stir men’s blood and probably will not themselves be realized.” So said Daniel Burnham, the architect and urban planner (and fellow Chicagoan).

In health care, big plans are necessary not only to motivate people but as a matter of sound policy. The health care system is broken. It is not enough to just add more people to a broken system. Health care reform must reorganize the system to deliver higher quality care while keeping costs under control. Incremental change that just covers more people will not be sustainable. Reform must include changing the delivery system and how we pay for care. The health care system needs major surgery, not more Band-aids.

More important, as negotiation specialists note, you don’t begin with your compromise position. If we have to settle for incremental Band-aids, it should be only as a last resort.

2) Health care policy is fiscal policy.

Forget Social Security or defense: health care costs are the long-term driving force in federal and state budgets. To control the deficit and keep the country solvent, health care costs must be brought under control. Therefore, when the president-elect considers senior economic advisers, one test should be whether they really get health care policy.

Fortunately, Peter Orszag at the Congressional Budget Office does. So do some of the people rumored to be leading candidates for appointments — Larry Summers at the Treasury, Jim Cooper at the Office of Management and Budget and Jason Furman at the Domestic Policy Council. This is very encouraging.

3) Comprehensive health care reform is cheaper.

One of the secrets of health care reform that has not yet sunk in is that bigger changes to the system actually cost less. Consider the Lewin Group’s analysis of the different health care reform plans, which I wrote about in an earlier post.

4) No plan is perfect; institutionalize tinkering.

Health care reform will be incredibly complex. As improvements are made, problems will arise and unintended consequences will occur. There will need to be numerous mid-course corrections. Good reform will make addressing these issues easy by not requiring major legislation for each adjustment.

5) Everything is connected.

Health care reform cannot be considered in isolation. The new administration must remember how big health care is — $1 out of every $6 in the economy, dwarfing automobiles and every other economic segment. Everything is affected by health policy, and every decision should be examined for its impact on health care reform.

Consequently, if the heart of Mr. Obama’s economic policy is job creation, then it is contradictory to have a health care reform built on an employer mandate or to fund reform with a payroll tax. Employer mandates and payroll taxes stymie job formation.

Similarly, every favor to a constituency should be linked to support for the health care reform agenda. If the automakers want a bailout, then they and their suppliers have to agree to support and lobby for the administration’s health care reform effort. This builds grass-roots support.

Since 1913, the United States has been trying to achieve comprehensive health care reform. If the Obama administration finally does it, it will truly be history-making. The challenge is huge, but the rewards — for the administration and every citizen — will be even “huger.”

Sunday, October 12, 2008

David Brooks gets eviscerated

New York Times Death Spiral Watch (Yet Another David Brooks Edition)

Michael Bérubé speaks to conservatives:

[Y]ou could take poor flailing David Brooks as a model. One day after this humble blog suggested that high-end conservative pundits will slurp down any old slop they’re fed by the party, Brooks was slopping out this review of Sarah Palin’s debate performance:

[T]his debate was about Sarah Palin. She held up her end of an energetic debate that gave voters a direct look at two competing philosophies. She established debating parity with Joe Biden. And in a country that is furious with Washington, she presented herself as a radical alternative. By the end of the debate, most Republicans were not crouching behind the couch, but standing on it. The race has not been transformed, but few could have expected as vibrant and tactically clever a performance as the one Sarah Palin turned in Thursday night...

Only a week later, having realized to his horror that writing columns like this will soon deprive him of dinner-party conversation with sane people, Brooks has decided to call Palin a “fatal cancer to the Republican party.” Now that’s the way to throw someone under the couch, folks—if you want to maintain some sense of self-respect as a Serious Person.

Here is Brooks:

David Brooks: [Sarah Palin] represents a fatal cancer to the Republican party. When I first started in journalism, I worked at the National Review for Bill Buckley.... He thought it was important to have people on the conservative side who celebrated ideas, who celebrated learning. And his whole life was based on that, and that was also true for a lot of the other conservatives in the Reagan era. Reagan had an immense faith in the power of ideas. But there has been a counter, more populist tradition, which is not only to scorn liberal ideas but to scorn ideas entirely. And I'm afraid that Sarah Palin has those prejudices. I think President Bush has those prejudices...

Every time the New York Times publishes a column by David Brooks, a fairy has its wings torn off by a predator and dies of blood loss.

Causes of the financial crisis: McClatchy

McClatchy Washington Bureau

These are the guys who did the best fact-based job of investigating WMD claims before Iraq.
They called the actual inspectors and mid-level bureaucrats who know the scoop and, like I.F. Stone, read the government reports carefully, which is too tedious and boring for most reporters. Pete

Posted on Sat, Oct. 11, 2008

Private sector loans, not Fannie or Freddie, triggered crisis

David Goldstein and Kevin G. Hall | McClatchy Newspapers

last updated: October 11, 2008 04:56:24 PM

WASHINGTON — As the economy worsens and Election Day approaches, a conservative campaign that blames the global financial crisis on a government push to make housing more affordable to lower-class Americans has taken off on talk radio and e-mail.

Commentators say that's what triggered the stock market meltdown and the freeze on credit. They've specifically targeted the mortgage finance giants Fannie Mae and Freddie Mac, which the federal government seized on Sept. 6, contending that lending to poor and minority Americans caused Fannie's and Freddie's financial problems.

Federal housing data reveal that the charges aren't true, and that the private sector, not the government or government-backed companies, was behind the soaring subprime lending at the core of the crisis.

Subprime lending offered high-cost loans to the weakest borrowers during the housing boom that lasted from 2001 to 2007. Subprime lending was at its height from 2004 to 2006.

Federal Reserve Board data show that:

_ More than 84 percent of the subprime mortgages in 2006 were issued by private lending institutions.

_ Private firms made nearly 83 percent of the subprime loans to low- and moderate-income borrowers that year.

_ Only one of the top 25 subprime lenders in 2006 was directly subject to the housing law that's being lambasted by conservative critics.

The "turmoil in financial markets clearly was triggered by a dramatic weakening of underwriting standards for U.S. subprime mortgages, beginning in late 2004 and extending into 2007," the President's Working Group on Financial Markets reported Friday.

Conservative critics claim that the Clinton administration pushed Fannie Mae and Freddie Mac to make home ownership more available to riskier borrowers with little concern for their ability to pay the mortgages.

"I don't remember a clarion call that said Fannie and Freddie are a disaster. Loaning to minorities and risky folks is a disaster," said Neil Cavuto of Fox News.

Fannie, the Federal National Mortgage Association, and Freddie, the Federal Home Loan Mortgage Corp., don't lend money, to minorities or anyone else, however. They purchase loans from the private lenders who actually underwrite the loans.

It's a process called securitization, and by passing on the loans, banks have more capital on hand so they can lend even more.

This much is true. In an effort to promote affordable home ownership for minorities and rural whites, the Department of Housing and Urban Development set targets in 1992 for Fannie and Freddie to purchase low-income loans for sale into the secondary market; the target eventually reached 52 percent of loans given to low- to moderate-income families.

To be sure, encouraging lower-income Americans to become homeowners gave unsophisticated borrowers and unscrupulous lenders and mortgage brokers more chances to turn dreams of homeownership into nightmares.

But these loans, and those to low- and moderate-income families, represent a small portion of overall lending. And at the height of the housing boom in 2005 and 2006, Republicans and their party's standard-bearer, President Bush, didn't criticize any sort of lending, frequently boasting that they were presiding over the highest-ever rates of U.S. homeownership.

Between 2004 and 2006, when subprime lending was exploding, Fannie and Freddie went from holding a high of 48 percent of the subprime loans that were sold into the secondary market to holding about 24 percent, according to data from Inside Mortgage Finance, a specialty publication. One reason is that Fannie and Freddie were subject to tougher standards than many of the unregulated players in the private sector who weakened lending standards, most of whom have gone bankrupt or are now in deep trouble.

During those same explosive three years, private investment banks — not Fannie and Freddie — dominated the mortgage loans that were packaged and sold into the secondary mortgage market. In 2005 and 2006, the private sector securitized almost two thirds of all U.S. mortgages, supplanting Fannie and Freddie, according to a number of specialty publications that track this data.

In 1999, the year many critics charge that the Clinton administration pressured Fannie and Freddie, the private sector sold into the secondary market just 18 percent of all mortgages.

Fueled by low interest rates and cheap credit, home prices between 2001 and 2007 galloped beyond anything ever seen, and that fueled demand for mortgage-backed securities, the technical term for mortgages that are sold to a company, usually an investment bank, which then pools and sells them into the secondary mortgage market.

About 70 percent of all U.S. mortgages are in this secondary mortgage market, according to the Federal Reserve.

Conservative critics also blame the subprime lending mess on the Community Reinvestment Act, a 31-year-old law aimed at freeing credit for underserved neighborhoods.

Congress created the CRA in 1977 to reverse years of redlining and other restrictive banking practices that locked the poor, and especially minorities, out of homeownership and the tax breaks and wealth creation it affords. The CRA requires federally regulated and insured financial institutions to show that they're lending and investing in their communities.

Conservative columnist Charles Krauthammer wrote recently that while the goal of the CRA was admirable, "it led to tremendous pressure on Fannie Mae and Freddie Mac — who in turn pressured banks and other lenders — to extend mortgages to people who were borrowing over their heads. That's called subprime lending. It lies at the root of our current calamity."

Fannie and Freddie, however, didn't pressure lenders to sell them more loans; they struggled to keep pace with their private sector competitors. In fact, their regulator, the Office of Federal Housing Enterprise Oversight, imposed new restrictions in 2006 that led to Fannie and Freddie losing even more market share in the booming subprime market.

What's more, only commercial banks and thrifts must follow CRA rules. The investment banks don't, nor did the now-bankrupt non-bank lenders such as New Century Financial Corp. and Ameriquest that underwrote most of the subprime loans.

These private non-bank lenders enjoyed a regulatory gap, allowing them to be regulated by 50 different state banking supervisors instead of the federal government. And mortgage brokers, who also weren't subject to federal regulation or the CRA, originated most of the subprime loans.

In a speech last March, Janet Yellen, the president of the Federal Reserve Bank of San Francisco, debunked the notion that the push for affordable housing created today's problems.

"Most of the loans made by depository institutions examined under the CRA have not been higher-priced loans," she said. "The CRA has increased the volume of responsible lending to low- and moderate-income households."

In a book on the subprime lending collapse published in June 2007, the late Federal Reserve Governor Ed Gramlich wrote that only one-third of all CRA loans had interest rates high enough to be considered subprime and that, to the pleasant surprise of commercial banks, default rates were low. Banks that participated in CRA lending had found, he wrote, "that this new lending is good business."

(e-mail: khall(at)mcclatchydc.com)

McClatchy Newspapers 2008

Saturday, October 11, 2008

Troopergate trivia

The Palins pressured the Public Safety guy to fire their ne'er-do-well brother-in-law.
Alaska Republicans vote unanimously to release the report.
Big whoop!
The only trouble: lying, multiple stories and a lack of transparency.
Do it in public.
Explain your motives clearly.
Do not use secret political leverage.
The governor's husband does not give up First Amendment rights any more than Billy Carter gave up his right to endorse beer.
As Franklin said, beer is proof that God wants us to be happy.
Yelling at brothers-in-law is also allowed.
Have fun, Pete

Thursday, September 18, 2008

My Gal

SHOUTS & MURMURS

My Gal

by George Saunders, September 22, 2008, The New Yorker

Explaining how she felt when John McCain offered her the Vice-Presidential spot, my Vice-Presidential candidate, Governor Sarah Palin, said something very profound: “I answered him ‘Yes’ because I have the confidence in that readiness and knowing that you can’t blink, you have to be wired in a way of being so committed to the mission, the mission that we’re on, reform of this country and victory in the war, you can’t blink. So I didn’t blink then even when asked to run as his running mate.”

Isn’t that so true? I know that many times, in my life, while living it, someone would come up and, because of I had good readiness, in terms of how I was wired, when they asked that—whatever they asked—I would just not blink, because, knowing that, if I did blink, or even wink, that is weakness, therefore you can’t, you just don’t. You could, but no—you aren’t.

That is just how I am.

Do you know the difference between me and a Hockey Mom who has forgot her lipstick?

A dog collar.

Do you know the difference between me and a dog collar smeared with lipstick?

Not a damn thing.

We are essentially wired identical.

So, when Barack Obama says he will put some lipstick on my pig, I am, like, Are you calling me a pig? If so, thanks! Pigs are the most non-Élite of all barnyard animals. And also, if you put lipstick on my pig, do you know what the difference will be between that pig and a pit bull? I’ll tell you: a pit bull can easily kill a pig. And, as the pig dies, guess what the Hockey Mom is doing? Going to her car, putting on more lipstick, so that, upon returning, finding that pig dead, she once again looks identical to that pit bull, which, staying on mission, the two of them step over the dead pig, looking exactly like twins, except the pit bull is scratching his lower ass with one frantic leg, whereas the Hockey Mom is carrying an extra hockey stick in case Todd breaks his again. But both are going, like, Ha ha, where’s that dumb pig now? Dead, that’s who, and also: not a smidge of lipstick.

A lose-lose for the pig.

There’s a lesson in that, I think.

Who does that pig represent, and that collar, and that Hockey Mom, and that pit bull?

You figure it out. Then give me a call.

Seriously, give me a call.

Now, let us discuss the Élites. There are two kinds of folks: Élites and Regulars. Why people love Sarah Palin is, she is a Regular. That is also why they love me. She did not go to some Élite Ivy League college, which I also did not. Her and me, actually, did not go to the very same Ivy League school. Although she is younger than me, so therefore she didn’t go there slightly earlier than I didn’t go there. But, had I been younger, we possibly could have not graduated in the exact same class. That would have been fun. Sarah Palin is hot. Hot for a politician. Or someone you just see in a store. But, happily, I did not go to college at all, having not finished high school, due to I killed a man. But had I gone to college, trust me, it would not have been some Ivy League Élite-breeding factory but, rather, a community college in danger of losing its accreditation, built right on a fault zone, riddled with asbestos, and also, the crack-addicted professors are all dyslexic.

Sarah Palin was also the mayor of a very small town. To tell the truth, this is where my qualifications begin to outstrip even hers. I have never been the mayor of anything. I can’t even spell right. I had help with the above, but now— Murray, note to Murray: do not correct what follows. Lets shoe the people how I rilly spel Mooray and punshuate so thay can c how reglar I am, and ther 4 fit to leed the nashun, do to: not sum mistir fansy pans.

OK Mooray. Get corecting agin!

Thanks, Murray, you’re fabulous. Very good at what you do. Actually, Murray, come to think of it, you are so good, I suspect you are some kind of Élite. You are fired, Murray, as soon as this article is done. I’m going to hire someone Regular, who is not so excellent, and lives off the salt of the land and the fat of his brow and the sweat of his earth. Although I hope he’s not a screw-up.

I’m finding it hard to concentrate, as my eyes are killing me, due to I have not blinked since I started writing this. And, me being Regular, it takes a long time for me to write something this long.

Where was I? Ah, yes: I hate Élites. Which is why, whenever I am having brain surgery, or eye surgery, which is sometimes necessary due to all my non-blinking, I always hire some random Regular guy, with shaking hands if possible, who is also a drunk, scared of the sight of blood, and harbors a secret dislike for me.

Now, let’s talk about slogans. Ours is: Country First. Think about it. When you think of what should come first, what does? Us ourselves? No. That would be selfish. Our personal families? Selfish. God? God is good, I love Him, but, as our slogan suggests, no, sorry, God, You are not First. No, you don’t, Lord! How about: the common good of all mankind! Is that First? Don’t make me laugh with your weak blinking! No! Mercy is not First and wisdom is not First and love is super but way near the back, and ditto with patience and discernment and compassion and all that happy crap, they are all back behind Country, in the back of my S.U.V., which— Here is an example! Say I am about to run over a nun or orphan, or an orphan who grew up to become a nun—which I admire that, that is cool, good bootstrapping there, Sister—but then God or whomever goes, “It is My will that you hit that orphaned nun, do not ask Me why, don’t you dare, and I say unto thee, if you do not hit that nun, via a skillful swerve, your Country is going to suffer, and don’t ask Me how, specifically, as I have not decided that yet!” Well, I am going to do my best to get that nun in one felt swope, because, at the Convention, at which my Vice-Presidential candidate kicked mucho butt, what did the signs there say? Did they say “Orphaned Nuns First” and then there is a picture of a sad little nun with a hobo pack?

Not in my purview.

Sarah Palin knows a little something about God’s will, knowing God quite well, from their work together on that natural-gas pipeline, and what God wills is: Country First. And not just any country! There was a slight error on our signage. Other countries, such as that one they have in France, reading our slogan, if they can even read real words, might be all, like, “Hey, bonjour, they are saying we can put our country, France, first!” Non, non, non, France! What we are saying is, you’d better put our country first, you merde-heads, or soon there will be so much lipstick on your pit bulls it will make your berets spin!

In summary: Because my candidate, unlike your winking/blinking Vice-Presidential candidate, who, though, yes, he did run as the running mate when the one asking him to run did ask him to run, which that I admire, one thing he did not do, with his bare hands or otherwise, is, did he ever kill a moose? No, but ours did. And I would. Please bring a moose to me, over by me, and down that moose will go, and, if I had a kid, I would take a picture of me showing my kid that dead moose, going, like, Uh, sweetie, no, he is not resting, he is dead, due to I shot him, and now I am going to eat him, and so are you, oh yes you are, which is responsible, as God put this moose here for us to shoot and eat and take a photo of, although I did not, at that time, know why God did, but in years to come, God’s will was revealed, which is: Hey, that is a cool photo for hunters about to vote to see, plus what an honor for that moose, to be on the Internet.

How does the moose feel about it? Who knows? Probably not great. But do you know what the difference is between a dead moose with lipstick on and a dead moose without lipstick?

Lipstick.

Think about it.

Moose are, truth be told, Élites. They are big and fast and sort of rule the forest. Sarah took that one down a notch. Who’s Élite now, Bullwinkle?

Not Sarah.

She’s just Regular as heck.

Moooo: Alaska crony capitalism

September 17, 2008, 9:06 pm Timothy Egan NYT Moooooo
People should stop picking on vice-presidential nominee Sarah Palin because she hired a high school classmate to oversee the state agriculture division, a woman who said she was qualified for the job because she liked cows when she was a kid. And they should lay off the governor for choosing another childhood friend to oversee a failing state-run dairy, allowing the Soviet-style business to ding taxpayers for $800,000 in additional losses.
What these critics don’t understand is that crony capitalism is how things are done in Alaska. They reward failure in the Last Frontier state. In that sense, it’s not unlike Wall Street’s treatment of C.E.O.’s who run companies into the ground.
Look at Carly Fiorina, John McCain’s top economic surrogate — if you can find her this week, after the news and her narrative fused in a negative way. Dismissed as head of Hewlett-Packard after the company’s stock plunged and nearly 20,000 workers were let go, she was rewarded with $44 million in compensation. Sweet!
Thank God McCain wants to appoint a commission to study the practice that enriched his chief economic adviser. On the campaign trail this week, McCain and Palin pledged to “stop multimillion dollar payouts to C.E.O.’s” of failed companies. Good. Go talk to Fiorina at your next strategy session.
Palin’s Alaska is a cultural cousin to this kind of capitalism. The state may seem like a rugged arena for risky free-marketers. In truth, it’s a strange mix of socialized projects and who-you-know hiring practices.
Let’s start with those cows. A few years ago, I met Harvey Baskin, one of the last of Alaska’s taxpayer-subsidized dairy farmers, at his farm outside Anchorage. The state had spent more than $120 million to create farms where none existed before. The epic project was a miserable failure.
“You want to know how to lose money in a hurry?” Harvey told me, while kicking rock-hard clumps of frozen manure. “Become a farmer with the state of Alaska as your partner. This is what you call negative farming.”
That lesson was lost on Palin. As the Wall Street Journal reported this week, Governor Palin overturned a decision to shutter a money-losing, state-run creamery — Matanuska Maid — when her friends in Wasilla complained about losing their subsidies. She fired the board that recommended closure, and replaced it with one run by a childhood friend. After six months, and nearly $1 million in fresh losses, the board came to the same conclusion as the earlier one: Matanuska Maid could not operate without being a perpetual burden on the taxpayers.
This is Heckuva-Job-Brownie government, Far North version.
On a larger scale, consider the proposal to build a 1,715-mile natural gas pipeline, which Palin touts as one of her most significant achievements. Private companies complained they couldn’t build it without government help. That’s where Palin came to the rescue, ensuring that the state would back the project to the tune of $500 million.
And let’s not talk about voodoo infrastructure without one more mention of the bridge that Palin has yet to tell the truth about. The plan was to get American taxpayers to pay for a span that would be 80 feet higher than the Brooklyn Bridge, and about 20 feet short of the Golden Gate — all to serve a tiny airport with a half-dozen or so flights a day and a perfectly good five-minute ferry. Until it was laughed out of Congress, Palin backed it — big time, as the current vice president would say.
Why build it? Because it’s Alaska, where people are used to paying no state taxes and getting the rest of us to buck up for things they can’t afford. Alaska, where the first thing a visitor sees upon landing in Anchorage is the sign welcoming you to Ted Stevens International Airport. Stevens, of course, is the 84-year-old Republican senator indicted on multiple felony charges. He may still win re-election thanks to Palin’s popularity at the top of the ballot.
Alaskans will get $231 per person in federal earmarks — 10 times more than people in Barack Obama’s home state. That’s this year, with Palin as governor. Palin as mayor was even better at sucking at the federal pork teat: $1,384 per person for Wasilla, about 50 times what your town gets on average.
If Palin were a true reformer, she would tell Congress thanks, but no thanks to that other bridge to nowhere.
Yes, there is another one — a proposal to connect Anchorage to an empty peninsula, speeding the commute to Palin’s hometown by a few minutes. It could cost up to $2 billion. The official name is Don Young’s Way, after the congressman who got the federal bridge earmarks. Of late, he’s spent more than $1 million in legal fees fending off corruption investigations. Oh, and Young’s son-in-law has a stake in the property at one end of the bridge.
Some of these projects might be fully explained should Palin ever open herself up to questions. This week she sat down for her second interview — with Sean Hannity of Fox, who has shown sufficient “deference” to Palin, as the campaign requested.
One question: When Palin says “government has got to get out of the way” of the private sector, as she proclaimed this week, does that apply to dairy farms, bridges and gas pipelines in her state? I didn’t think so.

Sunday, September 02, 2007

Worst Mistake in the History of the World

Opinion: Worst Mistake in the History of the Human Race

By Jared Diamond, University of California at Los Angeles Medical School

Discover Magazine, May 1987, pages 64-66

Illustrations by Elliott Danfield

To science we owe dramatic changes in our smug self-image. Astronomy taught us that our earth isn’t the center of the universe but merely one of billions of heavenly bodies. From biology we learned that we weren’t specially created by God but evolved along with millions of other species. Now archaeology is demolishing another sacred belief: that human history over the past million years has been a long tale of progress. In particular, recent discoveries suggest that the adoption of agriculture, supposedly our most decisive step toward a better life, was in many ways a catastrophe from which we have never recovered. With agriculture came the gross social and sexual inequality, the disease and despotism, that curse our existence.

At first, the evidence against this revisionist interpretation will strike twentieth century Americans as irrefutable. We’re better off in almost every respect than people of the Middle Ages, who in turn had it easier than cavemen, who in turn were better off than apes. Just count our advantages. We enjoy the most abundant and varied foods, the best tools and material goods, some of the longest and healthiest lives, in history. Most of us are safe from starvation and predators. We get our energy from oil and machines, not from our sweat. What neo-Luddite among us would trade his life for that of a medieval peasant, a caveman, or an ape?

For most of our history we supported ourselves by hunting and gathering: we hunted wild animals and foraged for wild plants. It’s a life that philosophers have traditionally regarded as nasty, brutish, and short. Since no food is grown and little is stored, there is (in this view) no respite from the struggle that starts anew each day to find wild foods and avoid starving. Our escape from this misery was facilitated only 10,000 years ago, when in different parts of the world people began to domesticate plants and animals. The agricultural revolution spread until today it’s nearly universal and few tribes of hunter-gatherers survive.

From the progressivist perspective on which I was brought up, to ask "Why did almost all our hunter-gatherer ancestors adopt agriculture?" is silly. Of course they adopted it because agriculture is an efficient way to get more food for less work. Planted crops yield far more tons per acre than roots and berries. Just imagine a band of savages, exhausted from searching for nuts or chasing wild animals, suddenly grazing for the first time at a fruit-laden orchard or a pasture full of sheep. How many milliseconds do you think it would take them to appreciate the advantages of agriculture?

The progressivist party line sometimes even goes so far as to credit agriculture with the remarkable flowering of art that has taken place over the past few thousand years. Since crops can be stored, and since it takes less time to pick food from a garden than to find it in the wild, agriculture gave us free time that hunter-gatherers never had. Thus it was agriculture that enabled us to build the Parthenon and compose the B-minor Mass.

While the case for the progressivist view seems overwhelming, it’s hard to prove. How do you show that the lives of people 10,000 years ago got better when they abandoned hunting and gathering for farming? Until recently, archaeologists had to resort to indirect tests, whose results (surprisingly) failed to support the progressivist view. Here’s one example of an indirect test: Are twentieth century hunter-gatherers really worse off than farmers? Scattered throughout the world, several dozen groups of so-called primitive people, like the Kalahari bushmen, continue to support themselves that way. It turns out that these people have plenty of leisure time, sleep a good deal, and work less hard than their farming neighbors. For instance, the average time devoted each week to obtaining food is only 12 to 19 hours for one group of Bushmen, 14 hours or less for the Hadza nomads of Tanzania. One Bushman, when asked why he hadn’t emulated neighboring tribes by adopting agriculture, replied, "Why should we, when there are so many mongongo nuts in the world?"

While farmers concentrate on high-carbohydrate crops like rice and potatoes, the mix of wild plants and animals in the diets of surviving hunter-gatherers provides more protein and a better balance of other nutrients. In one study, the Bushmen’s average daily food intake (during a month when food was plentiful) was 2,140 calories and 93 grams of protein, considerably greater than the recommended daily allowance for people of their size. It’s almost inconceivable that Bushmen, who eat 75 or so wild plants, could die of starvation the way hundreds of thousands of Irish farmers and their families did during the potato famine of the 1840s.

So the lives of at least the surviving hunter-gatherers aren’t nasty and brutish, even though farmers have pushed them into some of the world’s worst real estate. But modern hunter-gatherer societies that have rubbed shoulders with farming societies for thousands of years don’t tell us about conditions before the agricultural revolution. The progressivist view is really making a claim about the distant past: that the lives of primitive people improved when they switched from gathering to farming. Archaeologists can date that switch by distinguishing remains of wild plants and animals from those of domesticated ones in prehistoric garbage dumps.

How can one deduce the health of the prehistoric garbage makers, and thereby directly test the progressivist view? That question has become answerable only in recent years, in part through the newly emerging techniques of paleopathology, the study of signs of disease in the remains of ancient peoples.

In some lucky situations, the paleopathologist has almost as much material to study as a pathologist today. For example, archaeologists in the Chilean deserts found well preserved mummies whose medical conditions at time of death could be determined by autopsy (Discover, October). And feces of long-dead Indians who lived in dry caves in Nevada remain sufficiently well preserved to be examined for hookworm and other parasites.

Usually the only human remains available for study are skeletons, but they permit a surprising number of deductions. To begin with, a skeleton reveals its owner’s sex, weight, and approximate age. In the few cases where there are many skeletons, one can construct mortality tables like the ones life insurance companies use to calculate expected life span and risk of death at any given age. Paleopathologists can also calculate growth rates by measuring bones of people of different ages, examine teeth for enamel defects (signs of childhood malnutrition), and recognize scars left on bones by anemia, tuberculosis, leprosy, and other diseases.

One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5’ 9” for men, 5’ 5” for women. With the adoption of agriculture, height crashed, and by 3000 B. C. had reached a low of only 5’ 3” for men, 5’ for women. By classical times heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the average height of their distant ancestors.

Another example of paleopathology at work is the study of Indian skeletons from burial mounds in the Illinois and Ohio river valleys. At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor. "Life expectancy at birth in the pre-agricultural community was about twenty-six years," says Armelagos, "but in the post-agricultural community it was nineteen years. So these episodes of nutritional stress and infectious disease were seriously affecting their ability to survive."

The evidence suggests that the Indians at Dickson Mounds, like many other primitive peoples, took up farming not by choice but from necessity in order to feed their constantly growing numbers. "I don’t think most hunter-gatherers farmed until they had to, and when they switched to farming they traded quality for quantity," says Mark Cohen of the State University of New York at Plattsburgh, co-editor, with Armelagos, of one of the seminal books in the field, Paleopathology at the Origins of Agriculture. "When I first started making that argument ten years ago, not many people agreed with me. Now it’s become a respectable, albeit controversial, side of the debate."

There are at least three sets of reasons to explain the findings that agriculture was bad for health. First, hunter-gatherers enjoyed a varied diet, while early farmers obtained most of their food from one or a few starchy crops. The farmers gained cheap calories at the cost of poor nutrition. (Today just three high-carbohydrate plants — wheat, rice, and corn — provide the bulk of the calories consumed by the human species, yet each one is deficient in certain vitamins or amino acids essential to life.) Second, because of dependence on a limited number of crops, farmers ran the risk of starvation if one crop failed. Finally, the mere fact that agriculture encouraged people to clump together in crowded societies, many of which then carried on trade with other crowded societies, led to the spread of parasites and infectious disease. (Some archaeologists think it was the crowding, rather than agriculture, that promoted disease, but this is a chicken-and-egg argument, because crowding encourages agriculture and vice versa.) Epidemics couldn’t take hold when populations were scattered in small bands that constantly shifted camp. Tuberculosis and diarrheal disease had to await the rise of farming, measles and bubonic plague the appearance of large cities.

Besides malnutrition, starvation, and epidemic diseases, farming helped bring another curse upon humanity: deep class divisions. Hunter-gatherers have little or no stored food, and no concentrated food sources, like an orchard or a herd of cows: they live off the wild plants and animals they obtain each day. Therefore, there can be no kings, no class of social parasites who grow fat on food seized from others. Only in a farming population could a healthy, non-producing élite set itself above the disease-ridden masses. Skeletons from Greek tombs at Mycenae c. 1500 B. C. suggest that royals enjoyed a better diet than commoners, since the royal skeletons were two or three inches taller and had better teeth (on the average, one instead of six cavities or missing teeth). Among Chilean mummies from c. A. D. 1000, the élite were distinguished not only by ornaments and gold hair clips but also by a fourfold lower rate of bone lesions caused by disease.

Similar contrasts in nutrition and health persist on a global scale today. To people in rich countries like the U. S., it sounds ridiculous to extol the virtues of hunting and gathering. But Americans are an élite, dependent on oil and minerals that must often be imported from countries with poorer health and nutrition. If one could choose between being a peasant farmer in Ethiopia or a bushman gatherer in the Kalahari, which do you think would be the better choice?

Farming may have encouraged inequality between the sexes as well. Freed from the need to transport their babies during a nomadic existence, and under pressure to produce more hands to till the fields, farming women tended to have more frequent pregnancies than their hunter-gatherer counterparts — with consequent drains on their health. Among the Chilean mummies, for example, more women than men had bone lesions from infectious disease.

Women in agricultural societies were sometimes made beasts of burden. In New Guinea farming communities today I often see women staggering under loads of vegetables and firewood while the men walk empty-handed. Once while on a field trip there studying birds, I offered to pay some villagers to carry supplies from an airstrip to my mountain camp. The heaviest item was a 110-pound bag of rice, which I lashed to a pole and assigned to a team of four men to shoulder together. When I eventually caught up with the villagers, the men were carrying light loads, while one small woman weighing less than the bag of rice was bent under it, supporting its weight by a cord across her temples.

As for the claim that agriculture encouraged the flowering of art by providing us with leisure time, modern hunter-gatherers have at least as much free time as do farmers. The whole emphasis on leisure time as a critical factor seems to me misguided. Gorillas have had ample free time to build their own Parthenon, had they wanted to. While post-agricultural technological advances did make new art forms possible and preservation of art easier, great paintings and sculptures were already being produced by hunter-gatherers 15,000 years ago, and were still being produced as recently as the last century by such hunter-gatherers as some Eskimos and the Indians of the Pacific Northwest.

Thus with the advent of agriculture an élite became better off, but most people became worse off. Instead of swallowing the progressivist party line that we chose agriculture because it was good for us, we must ask how we got trapped by it despite its pitfalls.

One answer boils down to the adage "Might makes right." Farming could support many more people than hunting, albeit with a poorer quality of life. (Population densities of hunter-gatherers are rarely over one person per ten square miles, while farmers average 100 times that.) Partly, this is because a field planted entirely in edible crops lets one feed far more mouths than a forest with scattered edible plants. Partly, too, it’s because nomadic hunter-gatherers have to keep their children spaced at four-year intervals by infanticide and other means, since a mother must carry her toddler until it’s old enough to keep up with the adults. Because farm women don’t have that burden, they can and often do bear a child every two years.

As population densities of hunter-gatherers slowly rose at the end of the ice ages, bands had to choose between feeding more mouths by taking the first steps toward agriculture, or else finding ways to limit growth. Some bands chose the former solution, unable to anticipate the evils of farming, and seduced by the transient abundance they enjoyed until population growth caught up with increased food production. Such bands outbred and then drove off or killed the bands that chose to remain hunter-gatherers, because a hundred malnourished farmers can still outfight one healthy hunter. It’s not that hunter-gatherers abandoned their life style, but that those sensible enough not to abandon it were forced out of all areas except the ones farmers didn’t want.

At this point it’s instructive to recall the common complaint that archaeology is a luxury, concerned with the remote past, and offering no lessons for the present. Archaeologists studying the rise of farming have reconstructed a crucial stage at which we made the worst mistake in human history. Forced to choose between limiting population or trying to increase food production, we chose the latter and ended up with starvation, warfare, and tyranny.

Hunter-gatherers practiced the most successful and longest-lasting life style in human history. In contrast, we’re still struggling with the mess into which agriculture has tumbled us, and it’s unclear whether we can solve it. Suppose that an archaeologist who had visited from outer space were trying to explain human history to his fellow spacelings. He might illustrate the results of his digs by a 24-hour clock on which one hour represents 100,000 years of real past time. If the history of the human race began at midnight, then we would now be almost at the end of our first day. We lived as hunter-gatherers for nearly the whole of that day, from midnight through dawn, noon, and sunset. Finally, at 11:54 p. m. we adopted agriculture. As our second midnight approaches, will the plight of famine-stricken peasants gradually spread to engulf us all? Or will we somehow achieve those seductive blessings that we imagine behind agriculture’s glittering façade, and that have so far eluded us?
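The clock arithmetic is easy to verify; here is a minimal Python sketch, using only the constants Diamond states in the paragraph above:

```python
# Diamond's 24-hour clock: one hour of clock time = 100,000 real years,
# so the whole "day" spans 24 * 100,000 = 2.4 million years of history.
YEARS_PER_HOUR = 100_000
AGRICULTURE_YEARS_AGO = 10_000  # from the article

minutes_before_midnight = AGRICULTURE_YEARS_AGO / YEARS_PER_HOUR * 60
print(minutes_before_midnight)  # 6.0, i.e. agriculture begins at 11:54 p.m.
```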

Monday, July 16, 2007

Why are so many Americans in Jail

Why Are So Many Americans in Prison?

Race and the transformation of criminal justice

Glenn C. Loury

The early 1990s were the age of drive-by shootings, drug deals gone bad, crack cocaine, and gangsta rap. Between 1960 and 1990, the annual number of murders in New Haven rose from six to 31, the number of rapes from four to 168, the number of robberies from 16 to 1,784—all this while the city’s population declined by 14 percent. Crime was concentrated in central cities: in 1990, two fifths of Pennsylvania’s violent crimes were committed in Philadelphia, home to one seventh of the state’s population. The subject of crime dominated American domestic-policy debates.

Most observers at the time expected things to get worse. Consulting demographic tables and extrapolating trends, scholars and pundits warned the public to prepare for an onslaught, and for a new kind of criminal—the anomic, vicious, irreligious, amoral juvenile “super-predator.” In 1996, one academic commentator predicted a “bloodbath” of juvenile homicides in 2005.

And so we prepared. Stoked by fear and political opportunism, but also by the need to address a very real social problem, we threw lots of people in jail, and when the old prisons were filled we built new ones.

But the onslaught never came. Crime rates peaked in 1992 and have dropped sharply since. Even as crime rates fell, however, imprisonment rates remained high and continued their upward march. The result, the current American prison system, is a leviathan unmatched in human history.

According to a 2005 report of the International Centre for Prison Studies in London, the United States—with five percent of the world’s population—houses 25 percent of the world’s inmates. Our incarceration rate (714 per 100,000 residents) is almost 40 percent greater than those of our nearest competitors (the Bahamas, Belarus, and Russia). Other industrial democracies, even those with significant crime problems of their own, are much less punitive: our incarceration rate is 6.2 times that of Canada, 7.8 times that of France, and 12.3 times that of Japan. We have a corrections sector that employs more Americans than the combined work forces of General Motors, Ford, and Wal-Mart, the three largest corporate employers in the country, and we are spending some $200 billion annually on law enforcement and corrections at all levels of government, a fourfold increase (in constant dollars) over the past quarter century.

Never before has a supposedly free country denied basic liberty to so many of its citizens. In December 2006, some 2.25 million persons were being held in the nearly 5,000 prisons and jails that are scattered across America’s urban and rural landscapes. One third of inmates in state prisons are violent criminals, convicted of homicide, rape, or robbery. But the other two thirds consist mainly of property and drug offenders. Inmates are disproportionately drawn from the most disadvantaged parts of society. On average, state inmates have fewer than 11 years of schooling. They are also vastly disproportionately black and brown.

How did it come to this? One argument is that the massive increase in incarceration reflects the success of a rational public policy: faced with a compelling social problem, we responded by imprisoning people and succeeded in lowering crime rates. This argument is not entirely misguided. Increased incarceration does appear to have reduced crime somewhat. But by how much? Estimates of the share of the 1990s reduction in violent crime that can be attributed to the prison boom range from five percent to 25 percent. Whatever the number, analysts of all political stripes now agree that we long ago entered the zone of diminishing returns. The conservative scholar John DiIulio, who coined the term “super-predator” in the early 1990s, was by the end of that decade declaring in The Wall Street Journal that “Two Million Prisoners Are Enough.” But there was no political movement for getting America out of the mass-incarceration business. The throttle was stuck.

A more convincing argument is that imprisonment rates have continued to rise while crime rates have fallen because we have become progressively more punitive: not because crime has continued to explode (it hasn’t), not because we made a smart policy choice, but because we have made a collective decision to increase the rate of punishment.

One simple measure of punitiveness is the likelihood that a person who is arrested will be subsequently incarcerated. Between 1980 and 2001, there was no real change in the chances of being arrested in response to a complaint: the rate was just under 50 percent. But the likelihood that an arrest would result in imprisonment more than doubled, from 13 to 28 percent. And because the amount of time served and the rate of prison admission both increased, the incarceration rate for violent crime almost tripled, despite the decline in the level of violence. The incarceration rate for nonviolent and drug offenses increased at an even faster pace: between 1980 and 1997 the number of people incarcerated for nonviolent offenses tripled, and the number of people incarcerated for drug offenses increased by a factor of 11. Indeed, the criminal-justice researcher Alfred Blumstein has argued that none of the growth in incarceration between 1980 and 1996 can be attributed to more crime:

The growth was entirely attributable to a growth in punitiveness, about equally to growth in prison commitments per arrest (an indication of tougher prosecution or judicial sentencing) and to longer time served (an indication of longer sentences, elimination of parole or later parole release, or greater readiness to recommit parolees to prison for either technical violations or new crimes).
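To make that multiplicative logic concrete, here is a purely illustrative Python sketch. The flat 50 percent arrest rate and the 13-to-28 percent jump in commitments per arrest come from the figures above; the baseline crime rate and the years-served values are invented round numbers, not measured data:

```python
# Steady-state prisoners per 100,000 residents: the admissions flow
# times the average stay. Each stage of the pipeline multiplies through.
def incarceration_rate(crimes_per_100k, arrests_per_crime,
                       commitments_per_arrest, years_served):
    admissions = crimes_per_100k * arrests_per_crime * commitments_per_arrest
    return admissions * years_served

# Hypothetical 1980 vs. late-1990s parameters: crime and the arrest rate
# held fixed; only punitiveness (commitments, time served) changes.
early = incarceration_rate(1000, 0.50, 0.13, 2.0)  # -> 130
late = incarceration_rate(1000, 0.50, 0.28, 2.8)   # -> 392
print(late / early)  # ~3.0: the rate triples with no change in crime
```

Holding crime fixed, doubling commitments per arrest and stretching time served is enough to roughly triple the prison population, which is exactly the decomposition Blumstein describes.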

This growth in punitiveness was accompanied by a shift in thinking about the basic purpose of criminal justice. In the 1970s, the sociologist David Garland argues, the corrections system was commonly seen as a way to prepare offenders to rejoin society. Since then, the focus has shifted from rehabilitation to punishment and stayed there. Felons are no longer persons to be supported, but risks to be dealt with. And the way to deal with the risks is to keep them locked up. As of 2000, 33 states had abolished limited parole (up from 17 in 1980); 24 states had introduced three-strikes laws (up from zero); and 40 states had introduced truth-in-sentencing laws (up from three). The vast majority of these changes occurred in the 1990s, as crime rates fell.

This new system of punitive ideas is aided by a new relationship between the media, the politicians, and the public. A handful of cases—in which a predator does an awful thing to an innocent—get excessive media attention and engender public outrage. This attention typically bears no relation to the frequency of the particular type of crime, and yet laws—such as three-strikes laws that give mandatory life sentences to nonviolent drug offenders—and political careers are made on the basis of the public’s reaction to the media coverage of such crimes.

* * *

Despite a sharp national decline in crime, American criminal justice has become crueler and less caring than it has been at any other time in our modern history. Why?

The question has no simple answer, but the racial composition of prisons is a good place to start. The punitive turn in the nation’s social policy—intimately connected with public rhetoric about responsibility, dependency, social hygiene, and the reclamation of public order—can be fully grasped only when viewed against the backdrop of America’s often ugly and violent racial history: there is a reason why our inclination toward forgiveness and the extension of a second chance to those who have violated our behavioral strictures is so stunted, and why our mainstream political discourses are so bereft of self-examination and searching social criticism. This historical resonance between the stigma of race and the stigma of imprisonment serves to keep alive in our public culture the subordinating social meanings that have always been associated with blackness. Race helps to explain why the United States is exceptional among the democratic industrial societies in the severity and extent of its punitive policy and in the paucity of its social-welfare institutions.

Slavery ended a long time ago, but the institution of chattel slavery and the ideology of racial subordination that accompanied it have cast a long shadow. I speak here of the history of lynching throughout the country; the racially biased policing and judging in the South under Jim Crow and in the cities of the Northeast, Midwest, and West to which blacks migrated after the First and Second World Wars; and the history of racial apartheid that ended only as a matter of law with the civil-rights movement. It should come as no surprise that in the post–civil rights era, race, far from being peripheral, has been central to the evolution of American social policy.

The political scientist Vesla Mae Weaver, in a recently completed dissertation, examines policy history, public opinion, and media processes in an attempt to understand the role of race in this historic transformation of criminal justice. She argues—persuasively, I think—that the punitive turn represented a political response to the success of the civil-rights movement. Weaver describes a process of “frontlash” in which opponents of the civil-rights revolution sought to regain the upper hand by shifting to a new issue. Rather than reacting directly to civil-rights developments, and thus continuing to fight a battle they had lost, those opponents—consider George Wallace’s campaigns for the presidency, which drew so much support in states like Michigan and Wisconsin—shifted attention to a seemingly race-neutral concern over crime:

Once the clutch of Jim Crow had loosened, opponents of civil rights shifted the “locus of attack” by injecting crime onto the agenda. Through the process of frontlash, rivals of civil rights progress defined racial discord as criminal and argued that crime legislation would be a panacea to racial unrest. This strategy both imbued crime with race and depoliticized racial struggle, a formula which foreclosed earlier “root causes” alternatives. Fusing anxiety about crime to anxiety over racial change and riots, civil rights and racial disorder—initially defined as a problem of minority disenfranchisement—were defined as a crime problem, which helped shift debate from social reform to punishment.

Of course, this argument (for which Weaver adduces considerable circumstantial evidence) is speculative. But something interesting seems to have been going on in the late 1960s regarding the relationship between attitudes on race and social policy.

Before 1965, public attitudes on the welfare state and on race, as measured by the annually administered General Social Survey, varied year to year independently of one another: you could not predict much about a person’s attitudes on welfare politics by knowing their attitudes about race. After 1965, the attitudes moved in tandem, as welfare came to be seen as a race issue. Indeed, the year-to-year correlation between an index measuring liberalism of racial attitudes and attitudes toward the welfare state over the interval 1950–1965 was .03. These same two series had a correlation of .68 over the period 1966–1996. The association in the American mind of race with welfare, and of race with crime, has been achieved at a common historical moment. Crime-control institutions are part of a larger social-policy complex—they relate to and interact with the labor market, family-welfare efforts, and health and social-work activities. Indeed, Garland argues that the ideological approaches to welfare and crime control have marched rightward to a common beat: “The institutional and cultural changes that have occurred in the crime control field are analogous to those that have occurred in the welfare state more generally.” Just as the welfare state came to be seen as a race issue, so, too, crime came to be seen as a race issue, and policies have been shaped by this perception.
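For readers who want the statistic unpacked: the .03 and .68 figures are correlation coefficients between two annual index series. A short Python sketch of the computation (the numbers below are invented placeholders, not the actual General Social Survey data the article cites):

```python
import numpy as np

# Two hypothetical annual attitude indices that move in tandem,
# as the post-1965 series are said to do.
racial_liberalism = np.array([0.40, 0.42, 0.41, 0.45, 0.44, 0.47])
welfare_support = np.array([0.55, 0.57, 0.56, 0.60, 0.59, 0.63])

r = np.corrcoef(racial_liberalism, welfare_support)[0, 1]
print(round(r, 2))  # close to 1.0 here: the two series track each other
```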

Consider the tortured racial history of the War on Drugs. Blacks were twice as likely as whites to be arrested for a drug offense in 1975 but four times as likely by 1989. Throughout the 1990s, drug-arrest rates remained at historically unprecedented levels. Yet according to the National Survey on Drug Abuse, drug use among adults fell from 20 percent in 1979 to 11 percent in 2000. A similar trend occurred among adolescents: in the age groups 12–17 and 18–25, use of marijuana, cocaine, and heroin peaked in the late 1970s and declined steadily thereafter. Thus, a decline in drug use across the board had begun a decade before the draconian anti-drug efforts of the 1990s were initiated.

Of course, most drug arrests are for trafficking, not possession, so usage rates and arrest rates need not track each other exactly. Still, we do well to bear in mind that the social problem of illicit drug use is endemic to our whole society. Significantly, throughout the period 1979–2000, white high-school seniors reported using drugs at a markedly higher rate than black high-school seniors. High drug-usage rates in white, middle-class American communities in the early 1980s account for the urgency many citizens felt to mount a national attack on the problem. But how successful has the effort been, and at what cost?

Think of the cost this way: to save middle-class kids from the threat of a drug epidemic that may no longer have existed by the time drug incarceration began its rapid increase in the 1980s, we criminalized underclass kids. Arrests went up, but drug prices have fallen sharply over the past 20 years—suggesting that the ratcheting up of enforcement has not made drugs harder to get on the street. The strategy clearly wasn’t keeping drugs away from those who sought them. Not only are prices down, but drug-related visits to emergency rooms also rose steadily throughout the 1980s and 1990s.

An interesting case in point is New York City. Analyzing arrests by residential neighborhood and police precinct, the criminologist Jeffrey Fagan and his colleagues Valerie West and Jan Holland found that incarceration was highest in the city’s poorest neighborhoods, though these were often not the neighborhoods in which crime rates were the highest. Moreover, they discovered a perverse effect of incarceration on crime: higher incarceration in a given neighborhood in one year seemed to predict higher crime rates in that same neighborhood one year later. This growth and persistence of incarceration over time, the authors concluded, was due primarily to the drug enforcement practices of police and to sentencing laws that require imprisonment for repeat felons. Police scrutiny was more intensive and less forgiving in high-incarceration neighborhoods, and parolees returning to such neighborhoods were more closely monitored. Thus, discretionary and spatially discriminatory police behavior led to a high and increasing rate of repeat prison admissions in the designated neighborhoods, even as crime rates fell.

Fagan, West, and Holland explain the effects of spatially concentrated urban anti-drug-law enforcement in the contemporary American metropolis. Buyers may come from any neighborhood and any social stratum. But the sellers—at least the ones who can be readily found hawking their wares on street corners and in public vestibules—come predominantly from the poorest, most non-white parts of the city. The police, with arrest quotas to meet, know precisely where to find them. The researchers conclude:

Incarceration begets more incarceration, and incarceration also begets more crime, which in turn invites more aggressive enforcement, which then re-supplies incarceration . . . three mechanisms . . . contribute to and reinforce incarceration in neighborhoods: the declining economic fortunes of former inmates and the effects on neighborhoods where they tend to reside, resource and relationship strains on families of prisoners that weaken the family’s ability to supervise children, and voter disenfranchisement that weakens the political economy of neighborhoods.

The effects of imprisonment on life chances are profound. For incarcerated black men, hourly wages are ten percent lower after prison than before. For all incarcerated men, the number of weeks worked per year falls by at least a third after their release.

So consider the nearly 60 percent of black male high-school dropouts born in the late 1960s who are imprisoned before their 40th year. While locked up, these felons are stigmatized—they are regarded as fit subjects for shaming. Their links to family are disrupted; their opportunities for work are diminished; their voting rights may be permanently revoked. They suffer civic excommunication. Our zeal for social discipline consigns these men to a permanent nether caste. And yet, since these men—whatever their shortcomings—have emotional and sexual and family needs, including the need to be fathers and lovers and husbands, we are creating a situation where the children of this nether caste are likely to join a new generation of untouchables. This cycle will continue so long as incarceration is viewed as the primary path to social hygiene.

* * *

I have been exploring the issue of causes: of why we took the punitive turn that has resulted in mass incarceration. But even if the racial argument about causes is inconclusive, the racial consequences are clear. To be sure, in the United States, as in any society, public order is maintained by the threat and use of force. We enjoy our good lives only because we are shielded by the forces of law and order, which keep the unruly at bay. Yet in this society, to a degree virtually unmatched in any other, those bearing the brunt of order enforcement belong in vastly disproportionate numbers to historically marginalized racial groups. Crime and punishment in America have a color.

In his fine study Punishment and Inequality in America (2006), the Princeton University sociologist Bruce Western powerfully describes the scope, nature, and consequences of contemporary imprisonment. He finds that the extent of racial disparity in imprisonment rates is greater than in any other major arena of American social life: at eight to one, the black–white ratio of incarceration rates dwarfs the two-to-one ratio of unemployment rates, the three-to-one ratio of non-marital childbearing, the two-to-one ratio of infant-mortality rates, and the one-to-five ratio of net worth. While three out of 200 young whites were incarcerated in 2000, the rate for young blacks was one in nine. A black male resident of the state of California is more likely to go to a state prison than to a state college.

The scandalous truth is that the police and penal apparatus are now the primary contact between adult black American men and the American state. Among black male high-school dropouts aged 20 to 40, a third were locked up on any given day in 2000, fewer than three percent belonged to a union, and less than one quarter were enrolled in any kind of social program. Coercion is the most salient meaning of government for these young men. Western estimates that nearly 60 percent of black male dropouts born between 1965 and 1969 were sent to prison on a felony conviction at least once before they reached the age of 35.

One cannot reckon the world-historic American prison build-up over the past 35 years without calculating the enormous costs imposed upon the persons imprisoned, their families, and their communities. (Of course, this has not stopped many social scientists from pronouncing on the net benefits of incarceration without doing so.) Deciding on the weight to give to a “thug’s” well-being—or to that of his wife or daughter or son—is a question of social morality, not social science. Nor can social science tell us how much additional cost borne by the offending class is justified in order to obtain a given increment of security or property or peace of mind for the rest of us. These are questions about the nature of the American state and its relationship to its people that transcend the categories of benefits and costs.

Yet the discourse surrounding punishment policy invariably discounts the humanity of the thieves, drug sellers, prostitutes, rapists, and, yes, those whom we put to death. It gives insufficient weight to the welfare, to the humanity, of those who are knitted together with offenders in webs of social and psychic affiliation. What is more, institutional arrangements for dealing with criminal offenders in the United States have evolved to serve expressive as well as instrumental ends. We have wanted to “send a message,” and we have done so with a vengeance. In the process, we have created facts. We have answered the question, who is to blame for the domestic maladies that beset us? We have constructed a national narrative. We have created scapegoats, indulged our need to feel virtuous, and assuaged our fears. We have met the enemy, and the enemy is them.

Incarceration keeps them away from us. Thus Garland: “The prison is used today as a kind of reservation, a quarantine zone in which purportedly dangerous individuals are segregated in the name of public safety.” The boundary between prison and community, Garland continues, is “heavily patrolled and carefully monitored to prevent risks leaking out from one to the other. Those offenders who are released ‘into the community’ are subject to much tighter control than previously, and frequently find themselves returned to custody for failure to comply with the conditions that continue to restrict their freedom. For many of these parolees and ex-convicts, the ‘community’ into which they are released is actually a closely monitored terrain, a supervised space, lacking much of the liberty that one associates with ‘normal life’.”

Deciding how citizens of varied social rank within a common polity ought to relate to one another is a more fundamental consideration than deciding which crime-control policy is most efficient. The question of relationship, of solidarity, of who belongs to the body politic and who deserves exclusion—these are philosophical concerns of the highest order. A decent society will on occasion resist the efficient course of action, for the simple reason that to follow it would be to act as though we were not the people we have determined ourselves to be: a people conceived in liberty and dedicated to the proposition that we all are created equal. Assessing the propriety of creating a racially defined pariah class in the middle of our great cities at the start of the 21st century presents us with just such a case.

My recitation of the brutal facts about punishment in today’s America may sound to some like a primal scream at this monstrous social machine that is grinding poor black communities to dust. And I confess that these brutal facts do at times incline me to cry out in despair. But my argument is analytical, not existential. Its principal thesis is this: we law-abiding, middle-class Americans have made decisions about social policy and incarceration, and we benefit from those decisions, and that means from a system of suffering, rooted in state violence, meted out at our request. We had choices and we decided to be more punitive. Our society—the society we have made—creates criminogenic conditions in our sprawling urban ghettos, and then acts out rituals of punishment against them as some awful form of human sacrifice.

This situation raises a moral problem that we cannot avoid. We cannot pretend that there are more important problems in our society, or that this circumstance is the necessary solution to other, more pressing problems—unless we are also prepared to say that we have turned our backs on the ideal of equality for all citizens and abandoned the principles of justice. We ought to ask ourselves two questions: Just what manner of people are we Americans? And in light of this, what are our obligations to our fellow citizens—even those who break our laws?

* * *

To address these questions, we need to think about the evaluation of our prison system as a problem in the theory of distributive justice—not the purely procedural idea of ensuring equal treatment before the law and thereafter letting the chips fall where they may, but the rather more demanding ideal of substantive racial justice. The goal is to bring about through conventional social policy and far-reaching institutional reforms a situation in which the history of racial oppression is no longer so evident in the disparate life experiences of those who descend from slaves.

And I suggest we approach that problem from the perspective of John Rawls’s theory of justice: first, that we think about justice from an “original position” behind a “veil of ignorance” that obstructs from view our own situation, including our class, race, gender, and talents. We need to ask what rules we would pick if we seriously imagined that we could turn out to be anyone in the society. Second, following Rawls’s “difference principle,” we should permit inequalities only if they work to improve the circumstances of the least advantaged members of society. But here, the object of moral inquiry is not the distribution among individuals of wealth and income, but instead the distribution of a negative good, punishment, among individuals and, importantly, racial groups.

So put yourself in John Rawls’s original position and imagine that you could occupy any rank in the social hierarchy. Let me be more concrete: imagine that you could be born a black American male outcast shuffling between prison and the labor market on his way to an early death to the chorus of nigger or criminal or dummy. Suppose we had to stop thinking of us and them. What social rules would we pick if we actually thought that they could be us? I expect that we would still pick some set of punishment institutions to contain bad behavior and protect society. But wouldn’t we pick arrangements that respected the humanity of each individual and of those they are connected to through bonds of social and psychic affiliation? If any one of us had a real chance of being one of those faces looking up from the bottom of the well—of being the least among us—then how would we talk publicly about those who break our laws? What would we do with juveniles who go awry, who roam the streets with guns and sometimes commit acts of violence? What weight would we give to various elements in the deterrence-retribution-incapacitation-rehabilitation calculus, if we thought that calculus could end up being applied to our own children, or to us? How would we apportion blame and affix responsibility for the cultural and social pathologies evident in some quarters of our society if we envisioned that we ourselves might well have been born into the social margins where such pathology flourishes?

If we take these questions as seriously as we should, then we would, I expect, reject a pure ethic of personal responsibility as the basis for distributing punishment. Issues about responsibility are complex, and involve a kind of division of labor—what John Rawls called a “social division of responsibility” between “citizens as a collective body” and individuals: when we hold a person responsible for his or her conduct—by establishing laws, investing in their enforcement, and consigning some persons to prisons—we need also to think about whether we have done our share in ensuring that each person faces a decent set of opportunities for a good life. We need to ask whether we as a society have fulfilled our collective responsibility to ensure fair conditions for each person—for each life that might turn out to be our life.

We would, in short, recognize a kind of social responsibility, even for the wrongful acts freely chosen by individual persons. I am not arguing that people commit crimes because they have no choices, and that in this sense the “root causes” of crime are social; individuals always have choices. My point is that responsibility is a matter of ethics, not social science. Society at large is implicated in an individual person’s choices because we have acquiesced in—perhaps actively supported, through our taxes and votes, words and deeds—social arrangements that work to our benefit and his detriment, and which shape his consciousness and sense of identity in such a way that the choices he makes, which we may condemn, are nevertheless compelling to him—an entirely understandable response to circumstance. Closed and bounded social structures—like racially homogeneous urban ghettos—create contexts where “pathological” and “dysfunctional” cultural forms emerge; but these forms are neither intrinsic to the people caught in these structures nor independent of the behavior of people who stand outside them.

Thus, a central reality of our time is the fact that there has opened a wide racial gap in the acquisition of cognitive skills, the extent of law-abidingness, the stability of family relations, the attachment to the work force, and the like. This disparity in human development is, as a historical matter, rooted in political, economic, social, and cultural factors peculiar to this society and reflective of its unlovely racial history: it is a societal, not communal or personal, achievement. At the level of the individual case we must, of course, act as if this were not so. There could be no law, no civilization, without the imputation to particular persons of responsibility for their wrongful acts. But the sum of a million cases, each one rightly judged on its merits to be individually fair, may nevertheless constitute a great historic wrong. The state does not only deal with individual cases. It also makes policies in the aggregate, and the consequences of these policies are more or less knowable. And who can honestly say—who can look in the mirror and say with a straight face—that we now have laws and policies that we would endorse if we did not know our own situation and genuinely considered the possibility that we might be the least advantaged?

Even if the current racial disparity in punishment in our country gave evidence of no overt racial discrimination—and, perhaps needless to say, I view that as a wildly optimistic supposition—it would still be true that powerful forces are at work to perpetuate the consequences of a universally acknowledged wrongful past. This is in the first instance a matter of interpretation—of the narrative overlay that we impose upon the facts.

The tacit association in the American public’s imagination of “blackness” with “unworthiness” or “dangerousness” has obscured a fundamental ethical point about responsibility, both collective and individual, and promoted essentialist causal misattributions: when confronted by the facts of racially disparate achievement, racially disproportionate crime rates, and racially unequal school achievement, observers will have difficulty identifying with the plight of a group of people whom they (mistakenly) think are simply “reaping what they have sown.” Thus, the enormous racial disparity in the imposition of social exclusion, civic excommunication, and lifelong disgrace has come to seem legitimate, even necessary: we fail to see how our failures as a collective body are implicated in this disparity. We shift all the responsibility onto their shoulders, which we can do only by irresponsibly—indeed, immorally—denying our own. And yet, this entire dynamic has its roots in past unjust acts that were perpetrated on the basis of race.

Given our history, producing a racially defined nether caste through the ostensibly neutral application of law should be profoundly offensive to our ethical sensibilities—to the principles we proudly assert as our own. Mass incarceration has now become a principal vehicle for the reproduction of racial hierarchy in our society. Our country’s policymakers need to do something about it. And all of us are ultimately responsible for making sure that they do.

Glenn C. Loury is the Merton P. Stoltz Professor of the Social Sciences in the department of economics at Brown University. He is the author of The Anatomy of Racial Inequality, and he was a 2002 Carnegie Scholar.

Originally published in the July/August 2007 issue of Boston Review.

Monday, February 05, 2007

FAILED COVER-UP on Uranium

A Failed Cover-Up
What the Libby Trial Is Revealing

By David Ignatius
Friday, February 2, 2007; A15

Why was the White House so nervous in the summer of 2003 about the CIA's reporting on alleged Iraqi attempts to buy uranium from Niger to build a nuclear bomb? That's the big question that runs through the many little details that have emerged in the perjury trial of Vice President Cheney's former top aide, Lewis "Scooter" Libby.

The trial record suggests a simple answer: The White House was worried that the CIA would reveal that it had been pressured in 2002 and early 2003 to support administration claims about Iraqi weapons of mass destruction, and that in the Niger case, the CIA had tried hard to resist this pressure. The machinations of Cheney, Libby and others were an attempt to weave an alternative narrative that blamed the CIA.

The truth began to emerge on July 11, 2003, when CIA Director George Tenet issued a public statement disclosing that the agency had tried to warn the White House off the Niger allegations. In that sense, the Libby trial is about a cover-up that failed.

What helped start the whole brouhaha was a 2003 op-ed article by former ambassador Joseph Wilson, disclosing that his fact-finding trip to Niger the previous year had yielded no evidence of Iraqi uranium purchases. His piece opened with a devastating question: "Did the Bush administration manipulate intelligence about Saddam Hussein's weapons programs to justify an invasion of Iraq?" A frantic White House tried to rebut Wilson's criticism by leaking the fact that his wife, Valerie Plame, worked at the CIA and had suggested sending him to Niger -- as if the CIA connection somehow contaminated Wilson's allegations and made the White House less culpable.

To understand the Libby case, it's important to look at the documentary evidence, which has been usefully compiled by washingtonpost.com.

The record begins with a Feb. 13, 2002, memo from a CIA briefer who had been "tasked" by Cheney on the uranium issue: "The VP was shown an assessment (he thought from DIA) that Iraq is purchasing uranium from Africa. He would like our assessment of that transaction and its implications for Iraq's nuclear program." The CIA briefer responded the next day with a comment that should have aroused skepticism on whether Iraq needed to buy any more uranium: Iraq already had 550 tons of "yellowcake" ore -- 200 tons of it from Niger. But the CIA, eager to please, asked Wilson a few days later to go to Niger to investigate the claim.

A glimpse of the pressure coming from the vice president's office emerges from a memo from CIA briefer Craig R. Schmall, after he was interviewed in January 2004 by FBI agents investigating the leak of Plame's covert identity: "I mentioned also to the agents that Libby was in charge within the administration (or at least the White House side) for producing papers arguing the case for Iraqi WMD and ties between Iraq and al Qaeda, which explains Libby's and the Vice President's interest in the Iraq/Niger/Uranium case."

CIA and State Department documents show that analysts at both agencies became increasingly skeptical about the Niger allegation and tried to warn the White House. A memo from Schmall to Eric Edelman, then Cheney's national security adviser, recalled: "CIA on several occasions has cautioned . . . that available information on this issue was fragmentary and unconfirmed." A memo from Carl W. Ford Jr., then head of the State Department's intelligence bureau, noted that his analysts had found the Niger claims "highly dubious."

The Niger issue wasn't included in Secretary of State Colin Powell's famous U.N. speech on Iraqi weapons of mass destruction, according to Ford, "due to CIA concerns raised during the coordination regarding the veracity of the information on the alleged Iraq-Niger agreement." But despite CIA warnings, Bush referred to uranium purchases from Africa in his January 2003 State of the Union address, attributing it to British sources.

So we begin to understand why the White House was worried about the CIA in the summer of 2003: It feared the agency would breach the wall of silence about the claims regarding weapons of mass destruction. Robert Grenier, a CIA official who was the agency's Iraq mission manager, told colleagues that he remembered "a series of insistent phone calls" that month from Libby, who wanted the CIA to tell reporters that "other community elements such as State and DOD" had encouraged Wilson's Niger trip, not just Cheney.

The bottom line? Grenier was asked in court last week to explain the White House's 2003 machinations. Here's what he said: "I think they were trying to avoid blame for not providing [the truth] about whether or not Iraq had attempted to buy uranium." Let me say it again: This trial is about a cover-up that failed.