Patients indulge wishful thinking in medicine

We humans are a species of animal that tends to look for the hopeful side of anything, especially where our health is concerned.

A new meta-study conducted at Bond University in Australia finds that most patients are unrealistically optimistic about medical procedures, overestimating their potential benefits and underestimating their risks.

“What struck us is that, in general, people thought that treatments were going to be much more beneficial to them [than the evidence suggested],” says researcher Professor Chris del Mar of Bond University.

“And that the harms would be much less,” says del Mar.

He says clinicians and patients need to make decisions based on more accurate information on the pros and cons of such interventions.

“This is one of the most important ways we can save the healthcare system from going bust.”

Del Mar and Associate Professor Tammy Hoffmann carried out a systematic review of 35 studies analysing patient expectations of tests and treatments.

The studies covered such interventions as mammography, prostate specific antigen (PSA) tests, angioplasty, stem cell transplants, statins, kidney transplants, bariatric surgery, inflammatory bowel disease drugs and resuscitation after a cardiac arrest.

This wishful thinking must be a factor in the general mistrust of medical science. Either the medical profession communicates the facts poorly, or patients are strongly prejudiced against factual assessment, biasing their understanding toward the overly optimistic; perhaps both.

Such mistrust in medical science is powerfully exploited by practitioners of alternatives to medicine, whether they are “Big Placebo”, “Big Herba”, or any other industry of quacks preying on the sick and vulnerable who hope for more.

The science-based medical profession is often rightly criticised for spending so little time with each patient that it treats symptoms, not people. This in turn is also exploited by charlatans promising “holistic” treatment, but providing nothing more than what any non-expert could offer by paying closer attention to a patient’s overall health.

The Bond University researchers argue that better consultation is needed in medicine, but find that this doesn’t need to occupy significantly more time.

Some people argue that more talk will lead to longer consultations, but del Mar says studies show this is a “myth”.

“You can do it all in the same time frame. All you’re doing is exchanging one kind of discourse with another,” he says.

“It does require a change in mindset.”

In fact, evidence suggests more talk results in less intervention, says del Mar, citing the case of PSA testing.

“If you talk to people first before you do the screening, and explain all the downstream consequences of being screened, a lot of men say they don’t want it,” he says.

While unnecessary or unwanted treatment and tests add to the cost of healthcare, del Mar says “overtreatment” can also result in physical and emotional harms.

“I think [more talking] will save money but I don’t think that’s the main driver. The main driver is delivering better care,” he says.

Here’s hoping better science will lead to better health services.

The stubborn persistence of false beliefs

Maria Konnikova at The New Yorker gives us a familiar truth phrased in understatement:

Until recently, attempts to correct false beliefs haven’t had much success.

In *Why do people persist in believing things that just aren’t true?* she reports on recent research into the factors influencing the stubbornness of people’s false views of the world.

One such researcher is Brendan Nyhan, who did a longitudinal study into how the beliefs of parents changed on the topic of childhood vaccination, before and after various kinds of pro-vaccination campaign.

The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet—focussed on a lack of evidence connecting vaccines and autism—seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines. “It’s depressing,” Nyhan said. “We were definitely depressed,” he repeated, after a pause.

Stephan Lewandowsky has conducted psychological research into how humans treat misinformation and correction: a series of studies presented participants with a crime scenario, with or without racial information about the perpetrators, and then either retracted that racial information or let it stand.

Everyone’s memory worked correctly: the students could all recall the details of the crime and could report precisely what information was or wasn’t retracted. But the students who scored highest on racial prejudice continued to rely on the racial misinformation that identified the perpetrators as Aboriginals, even though they knew it had been corrected. They answered the factual questions accurately, stating that the information about race was false, and yet they still relied on race in their inference responses, saying that the attackers were likely Aboriginal or that the store owner likely had trouble understanding them because they were Aboriginal. This was, in other words, a laboratory case of the very dynamic that Nyhan identified: strongly held beliefs continued to influence judgment, despite correction attempts—even with a supposedly conscious awareness of what was happening.

The growing impression is that false beliefs are easy to change *only* if they are not tied to the person’s self-identity or group identity.

If someone asked you to explain the relationship between the Earth and the sun, you might say something wrong: perhaps that the sun rotates around the Earth, rising in the east and setting in the west. A friend who understands astronomy may correct you. It’s no big deal; you simply change your belief.

But imagine living in the time of Galileo, when understandings of the Earth-sun relationship were completely different, and when that view was tied closely to ideas of the nature of the world, the self, and religion. What would happen if Galileo tried to correct your belief? The process isn’t nearly as simple. The crucial difference between then and now, of course, is the importance of the misperception. When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur.

This important difference suggests possible improvements in approaches to correcting false beliefs that are unfortunately tied to a person’s identity.

Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would. On all issues, attitudes became more accurate with self-affirmation, and remained just as inaccurate without. That effect held even when no additional information was presented—that is, when people were simply asked the same questions twice, before and after the self-affirmation.

Still, as Nyhan is the first to admit, it’s hardly a solution that can be applied easily outside the lab. “People don’t just go around writing essays about a time they felt good about themselves,” he said. And who knows how long the effect lasts—it’s not as though we often think good thoughts and then go on to debate climate change.

The message is a poignant and important one for skeptics:

Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted. Instead, why not focus on presenting issues in a way that keeps broader notions out of it—messages that are not political, not ideological, not in any way a reflection of who you are?