donkey o.d. too

My main site, donkey o.d., is moving here. Pardon the dust...

Friday, November 18, 2005

A Private Obsession by PAUL KRUGMAN

November 18, 2005
Op-Ed Columnist
A Private Obsession
By PAUL KRUGMAN
"Lots of things in life are complicated." So declared Michael Leavitt, the secretary of health and human services, in response to the mass confusion as registration for the new Medicare drug benefit began. But the complexity of the program - which has reduced some retirees to tears as they try to make what may be life-or-death decisions - is far greater than necessary.

One reason the drug benefit is so confusing is that older Americans can't simply sign up with Medicare as they can for other benefits. They must, instead, choose from a baffling array of plans offered by private middlemen. Why?

Here's a parallel. Earlier this year Senator Rick Santorum introduced a bill that would have forced the National Weather Service to limit the weather information directly available to the public. Although he didn't say so explicitly, he wanted the service to funnel that information through private forecasters instead.

Mr. Santorum's bill didn't go anywhere. But it was a classic attempt to force gratuitous privatization: involving private corporations in the delivery of public services even when those corporations have no useful role to play.

The Medicare drug benefit is an example of gratuitous privatization on a grand scale.

Here's some background: the elderly have long been offered a choice between standard Medicare, in which the government pays medical bills directly, and plans in which the government pays a middleman, like an H.M.O., to deliver health care. The theory was that the private sector would find innovative ways to lower costs while providing better care.

The theory was wrong. A number of studies have found that managed-care plans, which have much higher administrative costs than government-managed Medicare, end up costing the system money, not saving it.

But privatization, once promoted as a way to save money, has become a goal in itself. The 2003 bill that established the prescription drug benefit also locked in large subsidies for managed care.

And on drug coverage, the 2003 bill went even further: rather than merely subsidizing private plans, it made them mandatory. To receive the drug benefit, one must sign up with a plan offered by a private company. As people are discovering, the result is a deeply confusing system because the competing private plans differ in ways that are very hard to assess.

The peculiar structure of the drug benefit, with its huge gap in coverage - the famous "doughnut hole" I wrote about last week - adds to the confusion. Many better-off retirees have relied on Medigap policies to cover gaps in traditional Medicare, including prescription drugs. But that straightforward approach, which would make it relatively easy to compare drug plans, can't be used to fill the doughnut hole because Medigap policies are no longer allowed to cover drugs.

The only way to get some coverage in the gap is as part of a package in which you pay extra - a lot extra - to one of the private drug plans delivering the basic benefit. And because this coverage is bundled with other aspects of the plans, it's very difficult to figure out which plans offer the best deal.

But confusion isn't the only, or even the main, reason why the privatization of drug benefits is bad for America. The real problem is that we'll end up spending too much and getting too little.

Everything we know about health economics indicates that private drug plans will have much higher administrative costs than would have been incurred if Medicare had administered the benefit directly.

It's also clear that the private plans will spend large sums on marketing rather than on medicine. I have nothing against Don Shula, the former head coach of the Miami Dolphins, who is promoting a drug plan offered by Humana. But do we really want people choosing drug plans based on which one hires the most persuasive celebrity?

Last but not least, competing private drug plans will have less clout in negotiating lower drug prices than Medicare as a whole would have. And the law explicitly forbids Medicare from intervening to help the private plans negotiate better deals.

Last week I explained that the Medicare drug bill was devised by people who don't believe in a positive role for government. An insistence on gratuitous privatization is a byproduct of the same ideology. And the result of that ideology is a piece of legislation so bad it's almost surreal.

Thursday, November 17, 2005

The New Rules on Going Broke in America


Nicholas Kulish, an editorial board member, writes on business issues.

There was a long line of people wrapped around the corner of New York's hundred-year-old granite Custom House at the foot of Broadway one Friday last month. Most of the people in it had little more than cheap black convenience-store umbrellas to protect them from a torrential downpour. But they had good reason to wait for hours in the driving rain. The following Monday, October 17, a much-disputed new federal bankruptcy law took effect.

Similar scenes played out across the country. People living on the economic brink rushed to declare themselves insolvent under the old law, rather than become guinea pigs of the new regime. A surge in filings in the run-up to the deadline was expected, but the sheer number of filings on that gray Friday dwarfed most expectations.

The new law makes it much harder for people in financial distress to make the "fresh start" that has long been the promise of American bankruptcy law. It requires most people who earn more than the median income in their state to pay off their debts on a five-year repayment plan. Poorer filers can still avail themselves of Chapter 7's debt-erasing provisions, but they face an array of new hurdles, including mandatory credit counseling, greater paperwork requirements, and rising lawyers' fees.

America has always been ambivalent about bankruptcy. It has been stigmatized as a refusal to make good on one's obligations. But at the same time, the laws governing bankruptcy have been credited with contributing to the flexibility of the American economic system, and they have been a key ingredient in its success over the last century.

News stories about the new law have largely focused on how it will make life more difficult for people on the economic margins, including those who ended up there as a result of illness, divorce, or other life crises. That focus is understandable - the changes will have a devastating impact on many of the most vulnerable Americans. What is less understood, however, is how the new law could hurt the entire United States economy and, consequently, the financial well-being of all Americans.

The traditionally more lenient approach in past laws to the discharge of debt was not primarily intended to make life easier for the poorest Americans. It was designed to help create the kind of risk-taking, dynamic economy that has been critical to America's success. But the new rules could well chip away at two of the main pillars of the American economy - the entrepreneurial spirit of small businesses, and robust consumer spending.

While the bankruptcy regulations that apply to big businesses were left pretty much the same, the new law is likely to cause hardship well beyond individual filers by putting a damper on small businesses. Unlike large corporations, small businesses are often established and financed by their owners, with money from their own bank accounts. As a result, people whose enterprises fail often file for personal bankruptcy. The new law is likely to inhibit them.

The clear winners are the credit card companies and other lenders who pushed the law through Congress. The losers, though, are not just the poor people who will have more trouble declaring bankruptcy. We may all be worse off because of the way in which the new law weakens American economic life.

I. The Blame Game: Reckless Spenders vs. Victims of Circumstance

Personal bankruptcy filings have increased sharply in recent years, more than doubling between 1994 and 2003, when they reached 1.6 million. There has long been agreement that something had to be done, but creditor and debtor interest groups differed sharply over why bankruptcies were soaring, and how to respond.

Public policy debates often produce stock characters meant to cut through the statistics and complexities - like the welfare queen, living the high life on government largesse. Both sides of the consumer bankruptcy debate have their preferred symbol.

Supporters of tougher restrictions conjured up the image of the luxury-loving overspender, living beyond his means, with a flat-screen television at home and a Porsche Carrera in the driveway. Just when the creditors begin snapping at his heels, the deadbeat hides behind consumer-friendly bankruptcy laws, shielding his ill-gotten gains while ordinary suckers - the ones who pay their bills - absorb the cost of his irresponsibility.

Consumer advocates, on the other hand, paint the picture of a poor, honest family, barely scraping by with both parents working, who then suffer an unavoidable setback, such as the loss of a job or a sudden illness. Bankruptcy is the only salvation for these hardworking victims of circumstance, the argument goes, and the new law condemns them to permanent debt slavery.

There are, of course, examples of both. But the statistics show that there are many more hard-luck cases than footloose overspenders.

In the end, this debate was resolved not by the power of either side's arguments, but by Capitol Hill politics. The financial services industry, one of the nation's biggest campaign contributors, persuaded Congress to enact a law that was a virtual industry wish list. The lawmakers ignored the concerns raised by consumer groups, who wanted the law to address the lenders' role in the debt crisis - the explosion of credit-card offerings to poor people, students, and others who are likely to end up with bills they cannot pay, and the outrageous level of interest levied by many credit card companies.

The lenders have tried to frame bankruptcy purely as a story of irresponsible borrowers. But in many cases, it is the creditors who have been irresponsible, by lending money to people they have reason to know may be unable to pay the money back. In the name of holding debtors more accountable, lenders asked the federal government to let them off the hook for their own bad lending decisions.

Neither borrowers nor lenders are, as a group, entirely without blame when debts go unpaid. But of the two groups, the lenders - who are almost invariably large banks and credit card companies - are in a better position to absorb the loss, since they can spread it over many borrowers. Individuals don't have that luxury. When deciding where to place the burden of a bankruptcy law, Congress should have given the benefit of the doubt to vulnerable individuals.

II. The All-American Second Chance

The United States may be far less charitable than European countries when it comes to social welfare programs like health insurance and unemployment benefits, but it has long been generous in giving debtors a second chance.

Bankruptcy is a legal status that in America must be determined by a judge. In Chapter 7, or so-called "fresh start," cases, once bankruptcy is declared, the debtor's remaining assets - all but a few exempted possessions - are divided up among creditors. But any remaining debts are discharged and do not have to be paid back in the future.

Bankruptcy laws were not always so forgiving. In Roman times a debtor could be sold into slavery and the selling price divided among his creditors. In extreme cases, the debtor could literally be chopped into pieces for divvying up among the same group. England's debtors' prisons were only slightly more humane.

Treating debtors punitively can be cruel to the individuals involved, as those unlucky Romans might have pointed out, but it is also harmful to society as a whole. In our own system, tough bankruptcy measures can discourage people from taking the sort of financial risks that lead to innovation and economic growth. And even those in relatively stable enterprises could be driven out of business by harsh economic climates. American legislators recognized long ago that some measure of protection would benefit the economy.

In 1841, the United States passed its first bankruptcy law that allowed debtors to declare bankruptcy on their own initiative, rather than because their creditors demanded it. Congressman Eugenius Nisbet of Georgia said at the time, "The public will be the great gainers by discharging the bankrupts, because thereby you throw into activity a large amount of intellectual and professional capital which otherwise would be forever lost." In other words, don't leave the best and brightest on the sidelines just because they got burned in a market collapse or an economic downturn.

Since that 1841 law, America has gone even further to establish a system that gives businesses and individuals leeway to fail. "This country has long had the most debtor friendly of bankruptcy laws, designed to promote entrepreneurship and innovation, and it worries me that we're moving away from that tradition," says David Moss, a professor at Harvard Business School. "Generous bankruptcy laws encourage us to take risks, and this is a country that does well on risk taking."

III. Message to Entrepreneurs: Don't Take the Plunge

When most people think of American business, they think of Fortune 500 companies and other behemoths. But small, entrepreneurial businesses also help drive the American economy.

According to the Small Business Administration, small companies provide roughly three-quarters of the net new jobs added to the economy and employ half of the private workforce. Of course, not all small companies stay small. American business history is full of stories like that of Stanford graduates Bill Hewlett and Dave Packard, who founded Hewlett-Packard in 1939 out of a Palo Alto garage, or Ray Kroc, who transformed a single hamburger restaurant owned by the McDonald brothers into a global phenomenon.

These once-small businesses, and many others, were the result of a leap of faith on the part of their founders. It is far less of a hassle to work for a big company, and certainly less of a risk. It takes a particular kind of personality to sink one's life savings into a venture that statistics have proven will most likely fail. In 2004, about 581,000 new firms were founded and 576,000 closed.

This entrepreneurial spirit has long been part of the American economic ideal. And bankruptcy has long been a safety net for entrepreneurs. According to government statistics, there were about 37,000 business bankruptcies in 2003. But a recent study by bankruptcy experts Elizabeth Warren of Harvard and Robert Lawless of the University of Nevada-Las Vegas estimated the actual number at between 260,000 and 315,000 bankruptcies annually. Mr. Lawless called the government numbers "divorced from reality" in a release accompanying the study.

The government figure does not include the personal bankruptcies of small business owners whose enterprises have failed. It can be difficult to separate the individual from the company, when personal credit cards are the first source of venture capital and garages, attics, and dorm rooms serve as the company headquarters. The new bankruptcy law will make things far tougher for hundreds of thousands of small businesses, something entrepreneurs are likely to take into account when they consider whether to take the plunge. Michelle J. White, an economist at the University of California, San Diego, found that states with higher homestead exemptions - which allow bankruptcy filers to keep some amount of home equity after filing - had much higher rates of business ownership. Her conclusion: Entrepreneurs take bankruptcy, and the degree to which they are likely to be punished for failure, into account.

That is logical. Entrepreneurs have to evaluate a wide array of possible outcomes, and one of these is the worst-case scenario, the failure of the company. The bankruptcy debate focuses so much on lower-income groups - and correctly so from a social justice standpoint - that the business side is ignored. When discussed at all, it's usually to debate the justice of big, old concerns like the auto parts manufacturer Delphi slashing wages.

With the bar for personal bankruptcy raised, and the costs associated with failure higher, fewer people may decide to start businesses. Instead of losing almost everything but being able to start anew, would-be business owners now must spend up to five years living on a system of allowances developed by the Internal Revenue Service.

Professor White writes in an article that this "would make the U.S. small business environment more like that of Germany, where bankruptcy law has never included a 'fresh start,' risk taking is frowned upon, there are many fewer entrepreneurs, unemployment is higher and economic growth is slower."

IV. Driving the Poor Out of the System

The changes to the law affecting the poorer half of the population appear modest, but seemingly small changes can do a lot of damage to those on the economic edge. Poor debtors will have to pay for mandatory credit counseling and furnish more pay stubs and tax returns. Lawyers now have to certify their clients' filings and assess those clients' average incomes over the previous months, which in turn is leading to higher fees.

The upshot is that bankruptcy is becoming more of a hassle and more expensive. One Harlem native waiting in line in downtown Manhattan on that wet October afternoon said she was shocked at the price tag. "I couldn't go to a lawyer," said the woman, who preferred that her name not be used. "I tried that first and it was $790. There was no way I could afford that." With the help of the low-cost filing service "We the People," she managed to get it done for just over $500, including court fees. She was embarrassed to admit that she had to borrow much of the money from her sister and a friend. Being too poor to declare bankruptcy sounds like the ultimate Catch-22, but it can be a reality for people in economic distress - and it is likely to become far more common under the new law, which hikes the costs of bankruptcy considerably.

The answer for many people who are too poor to be bankrupt is what academics call "informal bankruptcy." "If you erect barriers, more people will opt not to file," says Lawrence Ausubel, a University of Maryland economics professor who has studied the phenomenon. Rather than pay the high costs of a formal bankruptcy, poor people may choose to hang up on creditors when they call, change their phone numbers, and even change addresses. In a study he co-authored, Mr. Ausubel found that half of the delinquent accounts written off by credit card companies were from debtors who had not filed for bankruptcy.

Worse still is what can happen after that. To avoid having their wages garnished, debtors may begin to turn away from mainstream financial institutions and credit arrangements. They may turn to the sort of high-cost payday loans and check-cashing outlets that some illegal immigrants rely on, and work for cash-only businesses. That hurts them by driving up their cost of borrowing money and limiting their employment options. It also hurts the wider economy, because it means they stop paying taxes.

V. A Blow to Consumer Spending

A final likely result of the new bankruptcy law is that many consumers may do what has become unthinkable: stop spending. If small businesses are important for the economy, consumer spending is absolutely essential.

While proponents of creditor-friendly bankruptcy laws harangue consumers for living beyond their means, economists say that exuberant consumer spending is crucial if the American economy is going to keep growing at its current pace. For manufacturers and service providers to continue to thrive, consumers have to keep buying what they are offering up.

Americans have shown a strong, and ever-growing, commitment to spending: average household credit card balances have risen to over $9,000. A change in the bankruptcy law alone is not enough to change these consumer spending patterns. But the new bankruptcy law is not an isolated occurrence. It comes at a time when a number of forces are all working against consumer spending. This winter, the combination of these forces may finally break the back of United States demand.

The rapid rise in real estate values over the past few years has been an important factor in driving up consumer spending. The "wealth effect" of soaring property values has given Americans the confidence, and in many cases the home equity loans, to spend on consumer goods. There are signs, however, that the real estate market is finally beginning to cool, which means that home equity's days as a cost-free ATM for homeowners may be numbered.

At the same time, consumers are facing an array of fast-rising expenses. Health care costs are skyrocketing, with employee shares of deductibles and premiums increasing far faster than in the past. Gasoline prices have jumped and home heating costs are expected to soar this winter. Americans who make less than the median income already spend well over 10 percent of their budgets on energy, according to Economy.com. That percentage is likely to rise sharply in the days ahead.

Perhaps most significant of all is an expensive but little-noticed rule change by the nation's banking regulators, who decided back in January 2003 to require credit card companies to ask for higher minimum payments. The change was intended to ensure that customers paying the minimum would eventually pay off their full balance and get out of debt. It's a great rule that happens to be hitting at exactly the wrong moment. Some banks have already made the switch, but between now and January 1, more and more customers carrying high balances on their cards will see their minimum payment requirements double.

With consumers under financial pressure from so many directions, we could have expected a surge in bankruptcies. Credit-card delinquencies reached a record of 4.81 percent of accounts in the second quarter - and that was before Hurricane Katrina hit and drove prices up.

It is clear that the new bankruptcy law will, as critics have long argued, make the lives of debtors far worse. It's not as clear how severe the effect on economic growth will be. But consumer spending adjusted for inflation fell in September for the second month in a row. That's the first time that's happened in 15 years. Instead of a White Christmas, we may - thanks in part to the new bankruptcy rules - have one deep in the red.

Tuesday, November 15, 2005

In Intelligent Design Case, a Cause in Search of a Lawsuit

By LAURIE GOODSTEIN
Published: November 4, 2005

For years, a lawyer for the Thomas More Law Center in Michigan visited school boards around the country searching for one willing to challenge evolution by teaching intelligent design, and to face a risky, high-profile trial.

Intelligent design was a departure for a nonprofit law firm founded by two conservative Roman Catholics - one the magnate of Domino's Pizza, the other a former prosecutor - who until then had focused on the defense of anti-abortion advocates, gay-rights opponents and the display of Christian symbols like crosses and Nativity scenes on government property.

But Richard Thompson, the former prosecutor who is president and chief counsel of the Thomas More Center, says its role is to use the courts "to change the culture" - and it well could, depending on the outcome of the test case it finally found.

Lawyers for the center are to sum up their case on Friday after a six-week trial in which they have been defending the school district in the small Pennsylvania town of Dover. The school board voted last year to require that students in ninth grade biology class be read a statement saying that "Darwin's theory" is "not a fact" and that intelligent design is an alternative worth studying.

At issue in the Dover lawsuit, brought by 11 parents in Federal District Court, is whether intelligent design is really religion dressed up as science, and whether teaching it in a public school violates the constitutional separation of church and state.

The More Center's lawyers put scientists on the witness stand who argued that intelligent design - the idea that living organisms are so complex that the best explanation is that a higher intelligence designed them - is a credible scientific theory and not religion because it never identifies God as the designer.

Still, religion is at the heart of the case's appeal for the center, say its lawyers and the chairman of its board.

The chairman, Bowie Kuhn, the former baseball commissioner, said the board agreed that the center should take on an intelligent design case because, while it is not necessarily based on religion, "it is being opposed because people think it is religious." And that was enough for a group whose mission, as explained on its Web site, is "to protect Christians and their religious beliefs in the public square."

"America's culture has been influenced by Christianity from the very beginning," Mr. Thompson said, "but there is an attempt to slowly remove every symbol of Christianity and religious faith in our country. This is a very dangerous movement because what will ultimately happen is, out of sight, out of mind."

The legal group was founded in 1999 by Mr. Thompson and Thomas Monaghan, the former chief executive of Domino's Pizza. At the time, Mr. Thompson had just lost his re-election campaign for prosecutor in Oakland County, Mich., defeated by voters disenchanted by his pursuit of Dr. Jack Kevorkian, the retired pathologist who attended numerous assisted suicides.

In earlier cases, the center defended an enormous cross placed on a hill outside San Diego and Nativity scenes in Florida and New York. It sued the Ann Arbor schools for providing benefits for same-sex partners. And in one of its most controversial cases, it defended an anti-abortion group that ran an online list of doctors it said should be stopped from providing abortions. The doctors said the group was threatening them and their families. Mr. Thompson said in an interview it was "a very important free speech case."

To find its first intelligent design case, the lawyers went around the country looking for a school board willing to withstand a lawsuit. In May 2000, Robert Muise, one of the lawyers, traveled to Charleston, W.Va., to persuade the school board there to buy the intelligent design textbook "Of Pandas and People" and teach it in science class.

Mr. Muise told the board in Charleston that it would undoubtedly be sued if the district taught intelligent design, but that the center would mount a defense at no cost.

"We'll be your shields against such attacks," he told them at a school board meeting, a riff on the center's slogan, "The Sword and Shield for People of Faith." He said they could defend teaching intelligent design as a matter of academic freedom.

John Luoni, the former president of the Charleston school board, said he remembered listening to Mr. Muise and concluding: "It's not really a scientific theory. It's more of a religious theory. It should be taught if a church or a denomination believes in it, but I didn't think that religious viewpoint should be taught as part of a science class."

The board in West Virginia declined the center's offer. So did school districts in Michigan and Minnesota and a handful of other states, Mr. Muise and Mr. Thompson said.

But in Dover, the firm found willing partners when it contacted the school board in the summer of 2004 and promised it a first-class defense.

The Dover school board proceeded despite a memo from its lawyer, Stephen S. Russell, warning that if the board lost the case, it would have to pay its opponents' legal fees - which, according to the plaintiffs' lawyers, exceed $1 million. In the memorandum, revealed in court on Wednesday, Mr. Russell advised that opponents would have a strong case because board members had a lengthy public record of advocating "putting religion back in the schools."

Some of the proponents of intelligent design are also unhappy that the case went to court, and fear it could stop the movement in its infancy because some board members had a public record of advocating creationism, which the Supreme Court has twice ruled cannot be taught in public schools.

"The school district never consulted us and did the exact opposite of what we suggested," said John G. West, a senior fellow at the Discovery Institute, an organization in the forefront of the intelligent design movement. "Frankly I don't even know if school board members know what intelligent design is. They and their supporters are trying to hijack intelligent design for their own purposes. They think they're sending signals in the culture wars."

Mr. Thompson, the Thomas More Center's chief counsel, said the case appealed to him because of its "national impact." Four months before the trial started, he said, he watched the movie "Inherit the Wind," a drama about the Scopes evolution trial 80 years ago that helped turn the country against religious creationists and fundamentalists.

"It's only when you take the cases that are on the borderline that you can change the law," he said.

No matter how the Dover case turns out, the center is considering defending several teachers who are defying their school districts by teaching intelligent design.

"We're developing all this expertise in intelligent design," Mr. Thompson said. "We hope to use it."

Sunday, November 13, 2005

'We Do Not Torture' and Other Funny Stories by FRANK RICH

November 13, 2005
Op-Ed Columnist
'We Do Not Torture' and Other Funny Stories
By FRANK RICH

IF it weren't tragic it would be a New Yorker cartoon. The president of the United States, in the final stop of his forlorn Latin America tour last week, told the world, "We do not torture." Even as he spoke, the administration's flagrant embrace of torture was as hard to escape as publicity for Anderson Cooper.

The vice president, not satisfied that the C.I.A. had already been implicated in four detainee deaths, was busy lobbying Congress to give the agency a green light to commit torture in the future. Dana Priest of The Washington Post, having first uncovered secret C.I.A. prisons two years ago, was uncovering new "black sites" in Eastern Europe, where ghost detainees are subjected to unknown interrogation methods redolent of the region's Stalinist past. Before heading south, Mr. Bush had been doing his own bit for torture by threatening to cast the first veto of his presidency if Congress didn't scrap a spending bill amendment, written by John McCain and passed 90 to 9 by the Senate, banning the "cruel, inhuman or degrading" treatment of prisoners.

So when you watch the president stand there with a straight face and say, "We do not torture" - a full year and a half after the first photos from Abu Ghraib - you have to wonder how we arrived at this ludicrous moment. The answer is not complicated. When people in power get away with telling bigger and bigger lies, they naturally think they can keep getting away with it. And for a long time, Mr. Bush and his cronies did. Not anymore.

The fallout from the Scooter Libby indictment reveals that the administration's credibility, having passed the tipping point with Katrina, is flat-lining. For two weeks, the White House's talking-point monkeys in the press and Congress had been dismissing Patrick Fitzgerald's leak investigation as much ado about nothing except politics and as an exoneration of everyone except Mr. Libby. Now the American people have rendered their verdict: they're not buying it. Last week two major polls came up with the identical finding, that roughly 8 in 10 Americans regard the leak case as a serious matter. One of the polls (The Wall Street Journal/NBC News) also found that 57 percent of Americans believe that Mr. Bush deliberately misled the country into war in Iraq and that only 33 percent now find him "honest and straightforward," down from 50 percent in January.

The Bush loyalists' push to discredit the Libby indictment failed because Americans don't see it as a stand-alone scandal but as the petri dish for a wider culture of lying that becomes more visible every day. The last-ditch argument rolled out by Mr. Bush on Veterans Day in his latest stay-the-course speech - that Democrats, too, endorsed dead-wrong W.M.D. intelligence - is more of the same. Sure, many Democrats (and others) did believe that Saddam had an arsenal before the war, but only the White House hyped selective evidence for nuclear weapons, the most ominous of all of Iraq's supposed W.M.D.'s, to whip up public fears of an imminent doomsday.

There was also an entire other set of lies in the administration's prewar propaganda blitzkrieg that had nothing to do with W.M.D.'s, African uranium or the Wilsons. To get the country to redirect its finite resources to wage war against Saddam Hussein rather than keep its focus on the war against radical Islamic terrorists, the White House had to cook up not only the fiction that Iraq was about to attack us, but also the fiction that Iraq had already attacked us, on 9/11. Thanks to the Michigan Democrat Carl Levin, who last weekend released a previously classified intelligence document, we now have conclusive evidence that the administration's disinformation campaign implying a link connecting Saddam to Al Qaeda and 9/11 was even more duplicitous and manipulative than its relentless flogging of nuclear Armageddon.

Senator Levin's smoking gun is a widely circulated Defense Intelligence Agency document from February 2002 that was probably seen by the National Security Council. It warned that a captured Qaeda terrorist in American custody was in all likelihood "intentionally misleading" interrogators when he claimed that Iraq had trained Qaeda members to use illicit weapons. The report also made the point that an Iraq-Qaeda collaboration was absurd on its face: "Saddam's regime is intensely secular and is wary of Islamic revolutionary movements." But just like any other evidence that disputed the administration's fictional story lines, this intelligence was promptly disregarded.

So much so that eight months later - in October 2002, as the White House was officially rolling out its new war and Congress was on the eve of authorizing it - Mr. Bush gave a major address in Cincinnati intermingling the usual mushroom clouds with information from that discredited, "intentionally misleading" Qaeda informant. "We've learned that Iraq has trained Al Qaeda members in bomb-making and poisons and deadly gases," he said. It was the most important, if hardly the only, example of repeated semantic sleights of hand that the administration used to conflate 9/11 with Iraq. Dick Cheney was fond of brandishing a nonexistent April 2001 "meeting" between Mohamed Atta and an Iraqi intelligence officer in Prague long after Czech and American intelligence analysts had dismissed it.

The power of these lies was considerable. In a CBS News/New York Times poll released on Sept. 25, 2001, 60 percent of Americans thought Osama bin Laden had been the culprit in the attacks of two weeks earlier, either alone or in league with unnamed "others" or with the Taliban; only 6 percent thought bin Laden had collaborated with Saddam; and only 2 percent thought Saddam had been the sole instigator. By the time we invaded Iraq in 2003, however, CBS News found that 53 percent believed Saddam had been "personally involved" in 9/11; other polls showed that a similar percentage of Americans had even convinced themselves that the hijackers were Iraqis.

There is still much more to learn about our government's duplicity in the run-up to the war, just as there is much more to learn about what has gone on since, whether with torture or billions of Iraq reconstruction dollars. That is why the White House and its allies, having failed to discredit the Fitzgerald investigation, are now so desperate to slow or block every other inquiry. Exhibit A is the Senate Intelligence Committee, whose Republican chairman, Pat Roberts, is proving a major farceur with his efforts to sidestep any serious investigation of White House prewar subterfuge. Last Sunday, the same day that newspapers reported Carl Levin's revelation about the "intentionally misleading" Qaeda informant, Senator Roberts could be found on "Face the Nation" saying he had found no evidence of "political manipulation or pressure" in the use of prewar intelligence.

His brazenness is not anomalous. After more than two years of looking into the forged documents used by the White House to help support its bogus claims of Saddam's Niger uranium, the F.B.I. ended its investigation without resolving the identity of the forgers. Last week, Jane Mayer of The New Yorker reported that an investigation into the November 2003 death of an Abu Ghraib detainee, labeled a homicide by the U.S. government, has been, in the words of a lawyer familiar with the case, "lying kind of fallow." The Wall Street Journal similarly reported that 17 months after Condoleezza Rice promised a full investigation into Ahmad Chalabi's alleged leaking of American intelligence to Iran, F.B.I. investigators had yet to interview Mr. Chalabi - who was being welcomed in Washington last week as an honored guest by none other than Ms. Rice.

The Times, meanwhile, discovered that Mr. Libby had set up a legal defense fund to be underwritten by donors who don't have to be publicly disclosed but who may well have a vested interest in the direction of his defense. It's all too eerily reminiscent of the secret fund set up by Richard Nixon's personal lawyer, Herbert Kalmbach, to pay the legal fees of Watergate defendants.

THERE'S so much to stonewall at the White House that last week Scott McClellan was reduced to beating up on the octogenarian Helen Thomas. "You don't want the American people to hear what the facts are, Helen," he said, "and I'm going to tell them the facts." Coming from the press secretary who vowed that neither Mr. Libby nor Karl Rove had any involvement in the C.I.A. leak, this scene was almost as funny as his boss's "We do not torture" charade.

Not that it matters now. The facts the American people are listening to at this point come not from an administration that they no longer find credible, but from the far more reality-based theater of war. The Qaeda suicide bombings of three hotels in Amman on 11/9, like the terrorist attacks in Madrid and London before them, speak louder than anything else of the price we are paying for the lies that diverted us from the war against the suicide bombers of 9/11 to the war in Iraq.

Health Economics 101 by PAUL KRUGMAN

November 14, 2005
Op-Ed Columnist
Health Economics 101
By PAUL KRUGMAN

Several readers have asked me a good question: we rely on free markets to deliver most goods and services, so why shouldn't we do the same thing for health care? Some correspondents were belligerent, others honestly curious. Either way, they deserve an answer.

It comes down to three things: risk, selection and social justice.

First, about risk: in any given year, a small fraction of the population accounts for the bulk of medical expenses. In 2002 a mere 5 percent of Americans incurred almost half of U.S. medical costs. If you find yourself one of the unlucky 5 percent, your medical expenses will be crushing, unless you're very wealthy - or you have good insurance.

But good insurance is hard to come by, because private markets for health insurance suffer from a severe case of the economic problem known as "adverse selection," in which bad risks drive out good.

To understand adverse selection, imagine what would happen if there were only one health insurance company, and everyone was required to buy the same insurance policy. In that case, the insurance company could charge a price reflecting the medical costs of the average American, plus a small extra charge for administrative expenses.

But in the real insurance market, a company that offered such a policy to anyone who wanted it would lose money hand over fist. Healthy people, who don't expect to face high medical bills, would go elsewhere, or go without insurance. Meanwhile, those who bought the policy would be a self-selected group of people likely to have high medical costs. And if the company responded to this selection bias by charging a higher price for insurance, it would drive away even more healthy people.

That's why insurance companies don't offer a standard health insurance policy, available to anyone willing to buy it. Instead, they devote a lot of effort and money to screening applicants, selling insurance only to those considered unlikely to have high costs, while rejecting those with pre-existing conditions or other indicators of high future expenses.

This screening process is the main reason private health insurers spend a much higher share of their revenue on administrative costs than do government insurance programs like Medicare, which doesn't try to screen anyone out. That is, private insurance companies spend large sums not on providing medical care, but on denying insurance to those who need it most.

What happens to those denied coverage? Citizens of advanced countries - the United States included - don't believe that their fellow citizens should be denied essential health care because they can't afford it. And this belief in social justice gets translated into action, however imperfectly. Some of those unable to get private health insurance are covered by Medicaid. Others receive "uncompensated" treatment, which ends up being paid for either by the government or by higher medical bills for the insured. So we have a huge private health care bureaucracy whose main purpose is, in effect, to pass the buck to taxpayers.

At this point some readers may object that I'm painting too dark a picture. After all, most Americans too young to receive Medicare do have private health insurance. So does the free market work better than I've suggested? No: to the extent that we do have a working system of private health insurance, it's the result of huge though hidden subsidies.

Private health insurance in America comes almost entirely in the form of employment-based coverage: insurance provided by corporations as part of their pay packages. The key to this coverage is the fact that compensation in the form of health benefits, as opposed to wages, isn't taxed. One recent study suggests that this tax subsidy may be as large as $190 billion per year. And even with this subsidy, employment-based coverage is in rapid decline.

I'm not an opponent of markets. On the contrary, I've spent a lot of my career defending their virtues. But the fact is that the free market doesn't work for health insurance, and never did. All we ever had was a patchwork, semiprivate system supported by large government subsidies.

That system is now failing. And a rigid belief that markets are always superior to government programs - a belief that ignores basic economics as well as experience - stands in the way of rational thinking about what should replace it.