Why We Make Mistakes Blog
Joe Hallinan (http://www.blogger.com/profile/13193377805917363408)

California Dreamin' (2012-03-14)

For a truly outstanding example of wishful thinking, check out <a href="http://www.wallstreetnews.info/?p=7697">this article</a> on pension fund behemoth Calpers. Faced with pathetic returns on its investments, the California fund appears poised to do what many other pension funds have already (grudgingly) done: lower the rate of return it assumes its investments will earn.<br /><br />Despite record-low interest rates, the median state pension plan in the U.S. still assumes an annual rate of return near 8%, which is a bit like the median American housewife assuming a date with George Clooney; it ain’t gonna happen.<br /><br />So, ever so slightly, public pension plans have been trimming their expected rates of return. Calpers’s pension and health benefits committee has recommended an assumed annual rate of 7.5% -- just a smidgen below its current level of 7.75%.<br /><br />And how, you might ask, does that compare to Calpers’s <span style="font-style:italic;">actual</span> rate of return?<br /><br />For the year ended Dec.
31, Calpers earned a whopping return of just 1.1%.<br /><br />Nevertheless, Calpers’s chief actuary said the 7.5% rate was prudent, and said Calpers has a 50-50 chance of meeting that goal.<br /><br />Right.<br /><br />So years from now, if there is a gigantic shortfall in California’s pension funds and retirees go begging, many excuses will be offered -- but remember: it all started with wishful thinking.

Another Thing That Doesn't Work (2011-11-09)

To the growing list of bright ideas that don’t work we can now add another: bypass surgery to prevent strokes. Until yesterday, many doctors thought that if they connected an artery in the scalp to a deeper vessel to improve blood flow to the brain, they could help patients with poor circulation avoid strokes.<br /><br />But no. It turns out that the surgery itself actually caused strokes! According to a $20-million <a href="http://jama.ama-assn.org/content/306/18/1983.full.pdf+html">government-funded study</a> published on Tuesday, 14.4 percent of the patients who had the surgery suffered a stroke within a month of the operation. By comparison, the stroke rate for the group that did not have surgery was only 2 percent.<br /><br />The evidence was so overwhelming that the study was stopped early. As the <a href="http://www.nytimes.com/2011/11/09/health/research/surgery-to-prevent-strokes-is-found-ineffective.html">New York Times noted</a>, “What had seemed to make sense medically did not work out in fact.”<br /><br />This is becoming a troubling refrain for medical “cures.” A few weeks ago we got a <a href="http://www.nejm.org/doi/full/10.1056/NEJMoa1105335">nearly identical report</a> on the use of brain stents to prevent strokes.
(Those who got the device actually had so many more strokes than those assigned to control groups that that study, too, was stopped early.)<br /><br />And a few weeks before the stent study came out, we got a similar report regarding the drug niacin. Doctors had hoped that it would prevent heart attacks by raising the levels of “good” cholesterol in a patient’s blood. But that, too, didn’t pan out. According to that study, niacin provided no benefit over simple statin therapy.<br /><br />What’s the lesson here? Just because you think a thing <span style="font-style:italic;">should</span> work is no guarantee that it <span style="font-style:italic;">will</span> work. And you never know whether it will work until it has been independently tested.

Left to Our Own Devices... (2011-10-13)

In case you missed it, let me direct your attention to a <a href="http://online.wsj.com/article/SB10001424053111904106704576582621677354508.html">recent page-one story</a> in The Wall Street Journal. The article documents a growing trend among surgeons to implant their own medical devices into the bodies of their patients. As a result, the surgeons stand to profit twice: once from the surgery and again from the sale of the device.<br /><br />The surgeons, as you might suspect, say the prospect of making money (even lots of money) doesn't influence their medical decisions. A lawyer for one of the surgeons is quoted in the article as saying the surgeon used his own devices because they "were the best on the market for the procedure" and not because he stood to profit from them.<br /><br />Perhaps. But nearly every piece of research on the subject points the opposite way.
<br /><br />First of all, most of us don't like to admit we're biased. Our colleagues? Oh sure, we're happy to admit <span style="font-style:italic;">they</span> might be influenced by such things as money. But not us. A 2001 study of medical residents, for instance, found that 84 percent thought that their colleagues were influenced by gifts from pharmaceutical companies...<span style="font-style:italic;">but</span> only 16 percent thought that they were similarly influenced.<br /><br />The interesting thing is that bias is seldom a matter of deliberate choice. It is typically unconscious and unintentional. Research has shown that professionals who sincerely believe that their decisions are “not for sale” (such as physicians) are still biased in the direction of self-interest.<br /><br />Even those who are purportedly neutral have been shown to be biased. A 2005 study of psychiatric drug trials found that when academic researchers were funded by a drug company, they were nearly five times as likely to report that the treatment was effective.<br /><br />Likewise, a 2003 study by economists at Carnegie Mellon and Harvard found that independent auditors were significantly more likely to approve questionable accounting practices when those practices came from the firm paying their bills.<br /><br />And it doesn’t take much to bias someone’s judgment -- even a coffee mug will do. Students at a medical school that permitted gifts such as coffee mugs and pens from drug companies had a more favorable attitude toward a cholesterol drug than did students at a medical school where such gifts were banned.<br /><br />So if you're due for surgery any time soon, ask your surgeon about his or her financial interests in the surgery. As the Journal article pointed out, doctors don't always disclose these. One man died shortly after receiving implants from a doctor who stood to profit from them.
His widow says she would have liked to have known this before the surgery.<br /><br />"It might have caused me to ask: Is the surgery really necessary, or is he out to make more money?"

A Rigged System (2011-06-23)

If you want to see a good example of how bias warps decision-making, check out <a href="http://online.wsj.com/article/SB10001424052702304070104576400032473761332.html">this article</a> in The Wall Street Journal, which has been reporting on developments regarding medical device giant Medtronic Inc. (Disclosure: I have friends and family who work for medical device makers.)
<br />
<br />According to the Journal, the Senate Finance Committee is investigating whether surgeons who received lots of money from Medtronic for consulting and other work failed to note complications associated with a Medtronic product that has become widely used in spinal surgery.
<br />
<br />Medtronic would have us believe the answer is no. It says it provides data about adverse events that occur in clinical trials of its products “irrespective of any financial relationship” between the company and those involved in the studies.
<br />
<br />But academic research suggests otherwise. As Princeton professor and Nobel laureate Daniel Kahneman and colleagues note in a <a href="http://hbr.org/2011/06/the-big-idea-before-you-make-that-big-decision/ar/1?cm_sp=most_widget-_-hbr_articles-_-The%20Big%20Idea%3A%20Before%20You%20Make%20That%20Big%20Decision...">recent article</a> in the Harvard Business Review, “Research has shown that professionals who sincerely believe that their decisions are ‘not for sale’ (such as physicians) are still biased in the direction of their own interests.”
<br />
<br />And in the case of Medtronic, those are quite some interests. According to the Journal, Medtronic paid one Wisconsin surgeon involved in one of its trials $19 million from 2003 to 2007. Another doctor involved in a trial received more than $1.5 million between 2001 and 2006.
<br />
<br />The Senate committee’s investigation was triggered in part by a forthcoming study in a medical journal. That study shows that numerous complications -- including potentially fatal ones -- associated with Medtronic’s Infuse Bone Graft occurred in clinical trials.
<br />
<br />But -- and this is an important but -- those complications went unreported in a dozen research papers about those trials <span style="font-style:italic;">that Medtronic sponsored</span> between 2000 and 2009. (Italics mine.)
<br />
<br />Here you have the nub of the problem: industry money typically funds these trials.
<br />
<br />As the old adage says, he who pays the piper calls the tune. And if you have any doubts, we got a great lesson during the recent financial crisis. We know now that many of the allegedly objective ratings agencies gave high ratings to mortgage securities that later turned out to be junk. Those rosy ratings, of course, favored the ratings agencies’ clients.
<br />
<br />And we have much the same situation with medical trials. They are paid for, in whole or in part, by the companies that make the products being tested. Imagine if we conducted, say, civil trials the same way: the evidence used in court would be paid for by one of the two parties. Which party do you think the evidence would favor?
<br />
<br />The answer here is obvious: we need a system where companies do not directly pay for the evidence used to determine whether their products are safe. Anything else is a rigged system.

Darwin Was Right (and the Doctors Were Wrong) (2011-05-27)

Among the many astute observations made by Charles Darwin is this gem: “Ignorance more frequently begets confidence than does knowledge.”<br /><br />For real-world proof of this claim, look no farther than today’s papers. There, on the front page, are the results of a study that undermines the way physicians have treated heart disease.<br /><br />As the New York Times put it: “The results are part of a string of studies that suggest that what doctors thought they knew about cholesterol may be wrong.”<br /><br />In short, raising the level of their patients’ HDL, or “good” cholesterol, does not matter. For years, doctors thought the opposite was true. They assumed (without proof, apparently) that raising the levels of good cholesterol -- typically by prescribing niacin -- would benefit their patients.<br /><br />Now, it’s egg-on-the-face time.<br /><br />“We were stunned, to say the least,” said Dr. William E. Boden, one of the study’s lead investigators.<br /><br />“It’s a shocker,” Steven Nissen, chief of cardiovascular medicine at the Cleveland Clinic, told The Wall Street Journal.
“Most of us in the medical community, if we were going to bet on anything, we would have bet on niacin.”<br /><br />Q.E.D.

Mis-directions (2011-04-26)

In <em>Why We Make Mistakes</em>, I have a little fun pointing out how often people fail to follow the instructions that come with a variety of products, from nail guns to car seats. But if you want an even scarier example, check out <a href="http://online.wsj.com/article/SB10001424052748703521304576279123606877448.html">today's story</a> in The Wall Street Journal on drug labeling.<br /><br />Drug labels are notoriously hard to read -- and often confusing for those who do read them. Not surprisingly, as many as three in four Americans say they don't take prescription medicine as directed. And in recent studies, more than half of adults misunderstood one or more common prescription warnings and precautions.<br /><br />Often, this leads to a trip to the hospital. Nearly 1.9 million people were treated in hospitals for illnesses and injuries from taking medicines -- a 52% increase from 2004 to 2008.<br /><br />One cure for confusion, of course, is simplicity. And one study shows -- what a shock! -- that patients better understood simple, explicit language. For example, "use only on your skin" is better understood than "for external use only."
<br /><br />So before you grab that bottle of pills, take some time to check the label.

A Tip o' the Hat... (2011-03-15)

...goes to <a href="http://www.bookofjoe.com">Joe Stirt</a>, for passing along <a href="http://www.bookofjoe.com/2010/07/the-wrong-stuff-mountaineer-ed-viesturs-on-making-mistakes.html">this interview</a> of mountaineering legend <a href="http://www.edviesturs.com/">Ed Viesturs</a> by author <a href="http://beingwrongbook.com/author">Kathryn Schulz</a>. It's well worth the read, especially the part where he describes his biggest mistake. I won't spoil it for you, but Schulz tells him the mistake doesn't sound so bad. "You made it down safely, after all."<br /><br />"Yeah," says Viesturs, "but a mistake is a mistake even if you get away with it."<br /><br />I couldn't have put it better.

Missing the Obvious (2011-01-26)

I've talked and written about the problem of failing to see the obvious. It's a common source of error -- and one to which we are all prone -- since most of us believe (understandably, but wrongly) that we would notice obvious things. (See my previous posts on inattentional blindness, for example.)<br /><br />The latest example comes to us from the Financial Crisis Inquiry Commission. The FCIC was established to investigate the causes of the financial collapse that wreaked havoc from Wall Street to Main Street. The Commission's final report is due out tomorrow. But a few news outlets, including the <em>New York Times</em>, have gotten an early look.
<a href="http://www.nytimes.com/2011/01/26/business/economy/26inquiry.html?_r=1&hpw">Here's</a> what the <em>Times</em> had to say:<br /><br />“The captains of finance and the public stewards of our financial system ignored warnings and failed to question, understand and manage evolving risks within a system essential to the well-being of the American public. Theirs was a big miss, not a stumble.”<br /><br />Those words echo the <a href="http://www.c-spanvideo.org/program/Causesof">testimony</a> of Jamie Dimon, chairman of JPMorgan Chase & Co., who appeared before the Commission a year ago.<br /><br />"I’ve already mentioned the biggest mistakes we made," he told the Commission. "In mortgage underwriting, somehow, we just missed that home prices don’t go up forever..."<br /><br />This admission startled one of the commissioners, who asked Dimon, "Did you do a stress test that showed housing prices falling?"<br /><br />"No," said Dimon. "I would say that’s probably one of the big misses."<br /><br />Yes, indeed. <a href="http://www.nytimes.com/2011/01/26/business/economy/26econ.html?src=me&ref=business">Home prices still haven't recovered</a>, as the latest Standard & Poor's/Case-Shiller index makes clear. In many big cities, home prices have sunk to their lowest levels in years.<br /><br />As the article in the <em>Times</em> goes on to point out, "one striking finding (of the FCIC report) is its portrayal of incompetence."<br /><br />"It quotes Citigroup executives conceding that they paid little attention to mortgage-related risks. Executives at the American International Group were found to have been blind to its $79 billion exposure to credit-default swaps, a kind of insurance that was sold to investors seeking protection against a drop in the value of securities backed by home loans. At Merrill Lynch, managers were surprised when seemingly secure mortgage investments suddenly suffered huge losses."
<br /><br />One way to prevent errors of this magnitude is to impose constraints of the kind that other countries (notably Canada, which avoided much of the mortgage mess) have adopted.<br /><br />But with the Dow hovering around 12,000 and <a href="http://online.wsj.com/article/SB10001424052748703555804576102702270161540.html">bankers strutting again at Davos</a>, this seems unlikely.

Thinking Things Through (2011-01-04)

If you want to avoid making big mistakes, it's important to learn to think <em>through</em> a problem. This means learning to think not just about how an idea can succeed, but about how it can fail.<br /><br />Most of us, of course, don't like to do this. When faced with a decision -- which job to take, which college to attend, which car to buy -- our search for information is not neutral. We tend to look for information that supports our pre-existing ideas and to discount information that doesn't. Researchers call this "confirmation bias." Often, it leads to bad decisions.<br /><br />A not-so-obvious example involves the <a href="http://online.wsj.com/article/SB10001424052970204204004576049812990293074.html">high-tech garbage collection system in the city of Toledo</a>, Ohio. The system was supposed to save the city $3 million a year. Instead, it has left the city with $1.3 million in unanticipated expenses -- and a bunch of really honked-off residents.<br /><br />Like a lot of cities these days, Toledo is strapped for cash. So in 2009 its city council voted to spend $22 million (which it had to borrow, by the way) to buy new garbage cans and a fleet of super-duper, driver-operated trucks. The trucks were equipped with pincers that grab the cans, lift them overhead and dump their contents into the truck.
The trucks allowed Toledo to eliminate 70 trash-collector jobs and to slash the solid waste division’s budget to $8 million from $11 million. So far, so good.<br /><br />But then the problems set in. Since the trucks no longer carried a pair of human collectors to jump off the back and grab trash bins on either side of the street, drivers had to double back to cover the opposite side. As a result, the trucks used more gas than the city expected, and fuel costs went up.<br /><br />Not only that, the system proved wildly unpopular with residents. The robo-trucks left trash strewn everywhere, and the new 96-gallon garbage cans were so big that elderly residents complained they couldn't haul them.<br /><br />Residents flooded city hall with as many as 600 complaints a day. As a result, the city had to hire eight telephone operators to run a complaint hotline -- further eating into the savings.<br /><br />The whole thing has turned into such a nightmare that the city now wants to get out of the trash business entirely and let the county handle it.<br /><br />Toledo isn't alone. The city of Seattle not so long ago made a similar blunder, <a href="http://www.nytimes.com/2008/07/17/us/17toilets.html">squandering $5 million on high-tech public toilets</a> that turned into havens for drug dealers and prostitutes.<br /><br />Whether it's toilets or trash cans, the problem is the same: nobody thought through the problem before embarking on the "solution." And the result was a costly error.

Looking on the bright side (2010-09-20)

In <em>Why We Make Mistakes</em>, I talk quite a bit about our tendency not only to err, but to err in a particular (and predictable) direction -- and that direction is toward optimism.<br /><br />Many people see nothing wrong with this.
From the time we are children, most of us are told to look on the bright side, to think positively. Which is fine, as far as it goes. But later in life, this trait usually becomes so ingrained that it forms a cognitive bias that can lead to a number of mistakes.<br /><br />Here are two interesting and seemingly unrelated examples: one involves pensions; the other involves sex. We'll take sex first:<br /><br />A few weeks ago the <em>New York Times</em> ran <a href="http://www.nytimes.com/2010/08/14/health/policy/14pill.html">a front-page story</a> on a birth control pill named ella (with a lower-case e). Ella is a so-called "morning after" pill, an emergency contraceptive designed to be taken after sex.<br /><br />If you're a pharmaceutical company, you might think there's a big market for such a drug. After all, unprotected sex happens quite a lot: studies estimate that more than one million women who do not want to get pregnant have unprotected sex every night in the United States.<br /><br />But think again: According to the <em>Times</em>, many women who have unprotected sex tend to look on the bright side -- they think they won't get pregnant.<br /><br />To quote from the <em>Times</em>: "Studies have found that many women fail to realize they are at risk for an unplanned pregnancy after unprotected sex." Which is astonishing, given that half of all pregnancies in the United States are unintended.<br /><br />Nevertheless, sayeth the <em>Times</em>, "they tend not to use the emergency contraceptives even when they receive them free."<br /><br />In fact, says Dr. James Trussell, director of the Office of Population Research at Princeton, “Emergency contraception has no effect on pregnancy rates or abortion rates. Women just don’t use them enough to make an impact.”<br /><br />No effect.<br /><br />So, so much for a pill called ella. Now for pensions.
It turns out that pension managers (at least those who manage public pension funds) have a lot in common with the morning-after crowd: they tend to look on the bright side. Which may not be so good if you happen to be relying on them for your pension.<br /><br />According to <a href="http://online.wsj.com/article/SB10001424052748704358904575477731696162858.html">The Wall Street Journal</a>, pension managers are clinging to wildly unrealistic estimates of their funds' rates of return. And are those estimates unrealistically low? No, they are unrealistically high. The median expected return for more than 100 U.S. public pension plans surveyed by the National Association of State Retirement Administrators is a whopping 8%. That's the same as it was back in 2001.<br /><br />For handy comparison purposes, a conservative investment like the 10-year U.S. Treasury note currently yields less than 3%.<br /><br />If the funds don't earn 8%, of course, that means funding gaps down the road; the pensions won't have as much money as they thought they would to pay retirees. And that's bad news if you happen to be a retiree.<br /><br />So there you have it: sex, money and optimism all rolled into one.

Tumor? What tumor? (2010-07-20)

Readers of <em>Why We Make Mistakes</em> know that radiologists miss much of what they are supposed to catch. The <a href="http://chestjournal.chestpubs.org/content/115/3/720.full">generally accepted error rate</a> for the radiologic detection of early lung cancer, for instance, is between 20% and 50%; one study at the Mayo Clinic put it as high as 90%.
Now comes the <em>New York Times</em> with a <a href="http://www.nytimes.com/2010/07/20/health/20cancer.html?_r=1&src=me&ref=homepage">front-page story</a> on the high error rate not for lung cancer but for breast cancer.

Information Overload Redux (2010-07-19)

We've talked often about the perils of information overload. The biggest of these is that more information usually leads to more confidence in our predictions -- but not necessarily to more accuracy. In short, we become overconfident about our ability to accurately predict whatever it is we think the information will help us predict. And that overconfidence often leads to complacency. (Just go back and look at Wall Street's ability to predict the housing crisis.)<br /><br />For the latest, and perhaps scariest, example of information overload, take a look at the impressive <a href="http://projects.washingtonpost.com/top-secret-america/articles/a-hidden-world-growing-beyond-control/">investigative piece</a> by <em>The Washington Post</em>.

Blown Calls (2010-06-03)

Poor Jim Joyce.<br /><br /><a href="http://www.usatoday.com/sports/baseball/al/2010-06-03-1093647302_x.htm">His blown call</a> cost Detroit Tigers pitcher Armando Galarraga a perfect game -- and has since made him the most infamous umpire in Major League Baseball.<br /><br />And all because of <em>one</em> bad call. Imagine if Joyce had blown <em>thousands</em> of such calls. He wouldn't be an umpire for long, would he? But he <em>could</em> work for a credit rating agency, where bad calls seem to be the norm.
<br /><br />It turns out that just as the Tigers were getting ready to play, the head of one of the nation's largest credit rating agencies, Moody's, was also warming up -- for his testimony before the Financial Crisis Inquiry Commission.<br /><br />The commission, as its name implies, is investigating the causes of the nation's latest round of financial shenanigans. And it is now zeroing in on the companies that rated all those lovely securities that quickly turned into compost.<br /><br />Phil Angelides, the commission's chairman, wasted no time in putting a bull's-eye squarely on the back of Moody's. <a href="http://www.nytimes.com/2010/06/03/business/03rating.html">In his opening statement</a> he said that 89 percent of the securities given a top triple-A rating by Moody's were later downgraded. 89 percent!<br /><br />"The miss was huge," said Angelides. "Ninety percent downgrade. Even the dumbest kid gets 10% on the exam."<br /><br />Former employees attributed this appalling record to the culture in place at Moody's. "Cooperative analysts got good reviews, promotions, higher pay, bigger bonuses, better grants of stock options and restricted stock," said one former employee.<br /><br />Uncooperative analysts, on the other hand, were often fired.<br /><br />At least Joyce had the courage to admit that he made a mistake and to apologize to his victim.<br /><br />This quality seems in short supply on Wall Street.

The Power of Negative Thinking (2010-05-31)

If you want a textbook example of how negative thinking can help prevent errors, look no further than the BP oil spill in the Gulf of Mexico.
As <a href="http://www.nytimes.com/2010/05/31/opinion/31mon1.html?hp">an editorial</a> in today's <em>New York Times</em> makes clear, "BP's disjointed response suggested it had given little thought to the possibility of a blowout at 5,000 feet."<br /><br />And it's not just BP that gave little thought to a blowout. The same could be said of many Wall Street firms. They, too, failed to give serious thought to the possibility of a blowout of the U.S. housing market. In <a href="http://www.fcic.gov/hearings/01-13-2010.php">testimony</a> earlier this year, for instance, JP Morgan CEO Jamie Dimon made this admission: “In mortgage underwriting,” he said, “somehow, we just missed that home prices don’t go up forever...”<br /><br />Nobody likes to think negatively (by which I mean thinking seriously and deeply, <em>on the front end</em> of a problem, about what can go wrong). We prefer to think there will always be blue sky.<br /><br />But in any serious endeavor, negative thinking is absolutely necessary. Eisenhower thought so. That's why, when planning the invasion of Normandy, he went so far as to draft a <a href="http://www.archives.gov/education/lessons/d-day-message/images/failure-message.gif">letter</a> taking responsibility for the <em>failure</em> of the D-Day invasion.<br /><br />More recently, Lloyd Blankfein, the head of Goldman Sachs, <a href="http://www.fcic.gov/hearings/01-13-2010.php">testified</a> about the essential importance of negative thinking, which at Goldman takes the form of stress tests.<br /><br />“...The one thing that we constantly learn from every crisis,” he said, “is the need for more stress tests.”<br /><br />“What a stress test does is it says, ‘Don’t tell me that this is unlikely. What if it did happen?’<br /><br />“‘But, it’s not going to happen.’<br /><br />“‘What if it did?’”<br /><br />What if it did? That's the key question every negative thinker needs to ask -- and it's one that BP clearly avoided.
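Blankfein's "What if it did?" can be made concrete with a toy calculation. The sketch below is my own illustration, not any bank's actual methodology: the numbers, the loan-to-value figure, and the simple equity-cushion model are all invented for the example. It revalues a hypothetical mortgage book under a range of home-price shocks -- including the downward ones an optimist never bothers to run:

```python
# A toy "what if it did happen?" stress test: revalue a mortgage book
# under hypothetical home-price shocks instead of assuming prices rise.
# All figures here are illustrative, not from any real model.

def stress_loss(book_value, ltv, price_change):
    """Estimated loss on a mortgage book if home prices move by
    `price_change` (e.g. -0.20 for a 20% drop).

    ltv: loan-to-value ratio of the book (e.g. 0.80), so the homes
    backing the loans are worth book_value / ltv. A price drop eats
    the owners' equity cushion first; the book absorbs what's left.
    """
    collateral = book_value / ltv
    new_collateral = collateral * (1 + price_change)
    return max(0.0, book_value - new_collateral)

# "But it's not going to happen."  "What if it did?"
for shock in (0.05, 0.0, -0.10, -0.20, -0.30):
    loss = stress_loss(book_value=100.0, ltv=0.80, price_change=shock)
    print(f"home prices {shock:+.0%}: loss {loss:5.1f} per 100 of book")
```

The point is the last row: the loss is zero for every scenario until the shock exceeds the equity cushion, and then it isn't -- which is precisely the scenario Dimon said his firm never tested.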
<br /><br />But there's a natural impediment to negative thinking. Researchers call this impediment confirmation bias. But <em>The Wall Street Journal</em>, in <a href="http://online.wsj.com/article/SB10001424052748703811604574533680037778184.html">a recent article</a>, referred to it as the "yes-man in your head." Either way, it amounts to the same thing: when we have a decision to make and set out to gather information, our search isn't neutral. As the <em>Journal</em> piece points out, we are twice as likely to seek information that confirms our original belief as we are to seek information that contradicts it.<br /><br />In other words, the more we know, the more certain we become that we are right. And we go on believing that -- right up to the blowout.

Future star or future flop? (2010-04-23)

Maybe you saw <a href="http://www.usatoday.com/NEWS/usaedition/2010-04-22-1Atebow22_cv_U.htm?csp=34">USA Today's front-page piece</a> on University of Florida quarterback Tim Tebow and the NFL draft. The headline was, "A shining star, or a flop in the making?"<br /><br />It's a good question -- and one that is answered in a <a href="http://mba.yale.edu/faculty/pdf/massey_thaler_overconfidence_nfl_draft.pdf">new paper</a> by Cade Massey of Yale and Richard Thaler of the University of Chicago.<br /><br />Readers of this blog know how often I harp on overconfidence as a source of error. Massey and Thaler find that NFL teams are way overconfident when it comes to their ability to spot talent in draft picks.
Just over half the time, they found, the top picks in the draft turn out to be flops.<br /><br />As they state: “The more information teams acquire about players, the more overconfident they will feel about their ability to make fine distinctions...these findings stand as a reminder that decision-makers often know less than they think they know. This lesson has been implicated in disaster after disaster, from international affairs to financial markets.”<br /><br />Amen.Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com0tag:blogger.com,1999:blog-1244187317818411252.post-12687242450668309912010-04-08T09:49:00.000-07:002010-04-08T10:06:21.585-07:00Error by DesignIf you get a chance, check out <a href="http://www.nytimes.com/2010/04/08/technology/personaltech/08pogue.html">David Pogue's column</a> in today's <em>New York Times</em>. On one level, it's about wireless routers. But on another, it shows how bad design can induce people to make errors.<br /><br />It seems that a whopping 25% of wireless routers are returned to the store after purchase. Why? Because they are too complicated to use.<br /><br />In <em>Why We Make Mistakes</em> I talk a lot about how bad design can cause people to make mistakes they might not otherwise make. Those mistakes, in turn, are then blamed on the people, not the design. (For an example, check out the section on the heparin overdose given to the newborn twins of actor Dennis Quaid and his wife.)<br /><br />These kinds of mistakes can be avoided by applying well-known design principles. One example is what engineers call a "forcing function." A forcing function, as the name suggests, forces you to do a certain thing in a certain way.<br /><br />A good example is in your car: to put your car in gear you must first depress the brake. That way, you don't accidentally have your foot on the gas when you drop it into gear and go hurtling into a pedestrian or other car. 
Good idea, huh?<br /><br />But the people who make routers haven't caught on -- yet.Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com1tag:blogger.com,1999:blog-1244187317818411252.post-22774976477355839382010-04-08T09:31:00.000-07:002010-04-08T09:45:36.429-07:00The End is NearIf you like doing stupid things like playing Russian roulette or talking on your cellphone while driving your car, go ahead and do it now. Why? Because soon it will be illegal (at least the cellphone and driving part).<br /><br />U.S. Transportation Secretary Ray LaHood said as much in <a href="http://online.wsj.com/article/SB20001424052702303591204575170232249655828.html">an interview</a> with <em>The Wall Street Journal</em>.<br /><br />"The end game is to get cellphones out of (drivers') hands," said LaHood. <br /><br />DOT is sponsoring pilot programs in Syracuse, N.Y., and Hartford, Conn., to ticket distracted drivers. In Syracuse, it'll cost you 180 beanos if you are caught talking and driving. In Hartford, it'll be a C note, plus costs. The program's motto: "Phone in One Hand. Ticket in the Other." 
<br /><br />DOT's move comes on the heels of mounting evidence that cellphone use while driving dangerously distracts drivers.Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com1tag:blogger.com,1999:blog-1244187317818411252.post-28193894815882759522010-03-31T13:14:00.001-07:002010-03-31T13:16:20.724-07:00This blog has moved<br /> This blog is now located at http://whywemakemistakes.blogspot.com/.<br /> You will be automatically redirected in 30 seconds, or you may click <a href='http://whywemakemistakes.blogspot.com/'>here</a>.<br /><br /> For feed subscribers, please update your feed subscriptions to<br /> http://whywemakemistakes.blogspot.com/feeds/posts/default.<br /> Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com0tag:blogger.com,1999:blog-1244187317818411252.post-81646313339088134622010-02-24T08:59:00.000-08:002010-02-24T09:10:16.230-08:00The Dirty TruthQuestion: Why do we keep making the same mistake over and over?<br /><br />Answer: Because we fail to identify the root cause of the mistake to begin with -- a tendency researchers call "misattribution."<br /><br />Example: Washing your clothes. Say you get a stain on your shirt. You throw the shirt in the washing machine, add some detergent and 45 minutes later -- Voila! -- the stain is still there. <br /><br />You cuss. You holler. You kick the washing machine. Maybe you blame the detergent. But do you blame yourself? Nooooo. But maybe you should. <br /><br />According to a recent report in <em><a href="http://online.wsj.com/article/SB10001424052748703808904575025021214910714.html">The Wall Street Journal</a></em>, most Americans -- 53% -- don’t use the recommended amount of detergent per wash load. Instead they guess, usually filling the cap up to the top. This is a big mistake.<br /><br />Why? 
Because detergent "overpouring" creates a high, foamy tide inside the machine, lifting soil and lint above the water level so it isn't rinsed away. That leaves residue on clothing that fades colors and attracts more dirt.<br /><br />It’s also bad for your washing machine. Inside the machine, detergent buildup encourages odor and bacteria growth, and leads in time to wear and tear that will require professional attention.<br /><br />So why do we do this? Because we don’t read the instructions. And why don’t we read the instructions? Because we think we know better. Most of us, the article reports, have done so many loads of laundry in our lives that we consider ourselves to be laundry experts. And experts don’t need no stinking instructions.<br /><br />So there you have it: Ignorance and overconfidence all wrapped into one. <br /><br />Class dismissed.Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com3tag:blogger.com,1999:blog-1244187317818411252.post-66767909404735718162010-02-01T09:27:00.000-08:002010-02-01T09:47:22.046-08:00Foul? What foul?For another example of how deeply ingrained biases can affect our judgment -- even when we try to be objective -- check out <a href="http://www.educationnews.org/pr_releases/36771.html">this recent study</a> on fouls during soccer games. (And thanks to my old soccer teammate Robbie Woodward for sending it along.)<br /><br />Researchers at the Rotterdam School of Management, Erasmus University, analyzed all recorded fouls in three major soccer competitions over seven years. 
They discovered that an ambiguous foul is more likely to be attributed to the taller of the two players.<br /><br />Similar studies over the years have found that the judgment of referees can be biased by other factors, too -- such as the color of a hockey team's jersey (teams with black jerseys accrue more fouls) or even the <a href="http://bpp.wharton.upenn.edu/jwolfers/Papers/NBARace.pdf">racial makeup of the officiating crew in the National Basketball Association</a>. <br /><br />But we go on pretending the biases don't exist.Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com0tag:blogger.com,1999:blog-1244187317818411252.post-76374958880402179122010-01-26T08:27:00.000-08:002010-01-26T08:38:38.086-08:00DWD (Driving While Distracted)Readers of this blog have heard us carp for some time about the dangers of distracted driving. Now, the federal government is doing something to stop this nuttiness. Effective immediately, drivers of commercial trucks and buses will no longer be allowed to text while driving. Under <a href="http://www.distraction.gov/files/dot/MotorCarrierPressRelease.pdf">federal guidelines</a> that the U.S. Transportation Department announced today, drivers of big rigs and buses may be subject to civil or criminal penalties of up to $2,750. <br /><br />Now, if the feds would apply a similar rule to the rest of the drivers on the road, we'd all be much safer.Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com1tag:blogger.com,1999:blog-1244187317818411252.post-27211392729462392882010-01-22T09:36:00.000-08:002010-01-22T09:44:20.519-08:00Hack MeReaders of <em>Why We Make Mistakes</em> already know why we pick computer passwords that are easily remembered -- and easily hacked (see pages 33-34). But if you have forgotten why or need more proof, check out the <a href="http://www.nytimes.com/2010/01/21/technology/21password.html">New York Times article</a> on commonly used passwords. 
Security researchers discovered a list of 32 million passwords that had been stolen from a website. And the number one password was... 123456. (Number two was 12345.)Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com0tag:blogger.com,1999:blog-1244187317818411252.post-31398399245027653802010-01-11T09:07:00.000-08:002010-01-11T09:39:18.774-08:00Full-Body Scanners and Error RatesAh, <a href="http://www.cnn.com/2010/TRAVEL/01/11/body.scanners/index.html">full-body scanners</a>. They're supposed to make us safer. But will they?<br /><br />My hunch is: not much. My guess is based not on the scanners themselves (which are intrusive and come with real risks, like additional <a href="http://www.nytimes.com/2010/01/09/health/09scanner.html?scp=1&sq=body%20scan%20radiation&st=cse">radiation deaths</a>), but on the people who do the scanning.<br /><br />Undercover <a href="http://www.usatoday.com/printedition/news/20071018/1a_lede18_dom.art.htm">tests conducted at major airports</a> show that the "miss rates" for baggage inspectors using conventional technology are between 60% and 75%. That's a lot. <br /><br />Has the Transportation Security Administration (or anyone else) assured us that full-body scans will result in lower error rates? If so, I've seen no such assurance.<br /><br />Ultimately, all scans must be interpreted by the people behind the scanners. And that's where the problem comes in. As <a href="http://www.nature.com/nature/journal/v435/n7041/full/435439a.html">work by researchers like Jeremy Wolfe</a> has demonstrated, human beings have real-world limits on their ability to detect objects, especially ones that they rarely see, such as bombs and guns. 
That's why the current miss rate is so high -- and why it is unlikely to improve with full-body scanners.Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com0tag:blogger.com,1999:blog-1244187317818411252.post-73901357091726393832009-12-07T07:37:00.000-08:002009-12-07T07:49:02.935-08:00Upwardly MobileAs readers of <em>Why We Make Mistakes</em> know, multi-tasking is usually a bad idea. It's an especially bad idea when you are behind the wheel of a car. Talking on a cell phone or texting while you are driving dramatically increases your chances of an accident. But we do it anyway because, among other reasons, we are overconfident about our abilities to multi-task. <br /><br />For some interesting history about how we got to this point, see the <a href="http://www.nytimes.com/2009/12/07/technology/07distracted.html">page-one story</a> by Matt Richtel in today's <em>New York Times</em>. As the article notes:<br /><br />"Long before cellphones became common, industry pioneers were aware of the risks of multitasking behind the wheel. Their hunches have been validated by many scientific studies showing the dangers of talking while driving and, more recently, of texting.<br /><br />"Despite the mounting evidence, the industry built itself into a $150 billion business in the United States largely by winning over a crucial customer: the driver."Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com0tag:blogger.com,1999:blog-1244187317818411252.post-51403110970456643992009-10-23T11:09:00.000-07:002009-10-23T11:24:47.775-07:00Asleep at the Wheel -- AgainWe've written before about the problem of pilots falling asleep midflight.<br /><br />Now, it seems that this has happened again.<br /><br />Northwest Airlines Flight 188, en route to Minneapolis, overshot the airport -- way overshot the airport. 
The pilots didn't turn around until they were over Eau Claire, Wis., 150 miles away.<br /><br />The pilots, who have not been identified, reportedly told the Federal Bureau of Investigation and the airport police that “they were in a heated discussion over airline policy and they lost situational awareness.”<br /><br />Right.<br /><br />The plane, an Airbus A320, which carried two pilots and three flight attendants as well as 144 passengers, was cruising at 37,000 feet when the crew stopped responding to air traffic controllers and airline dispatchers. According to <em>The Wall Street Journal</em>, the radio silence continued for 78 minutes.<br /><br />As the same article noted, "pilot fatigue has long been regarded as one of the most serious safety issues confronting commercial aviation."<br /><br />The question is: When is the FAA going to wake up?Joe Hallinanhttp://www.blogger.com/profile/13193377805917363408noreply@blogger.com1