Wollongong in the wrong: How a university failed science

Universities in Australia are in increasing danger of becoming a comfortable home for loony anti-science ideas, authored by quacks granted the title “Doctor” by the university. The latest is an extensive anti-vaccination screed by Judith Wilyman, dressed up as a PhD thesis and published by the University of Wollongong. That university has failed in its public duty to science, specifically by pairing a research student with a thesis supervisor who would not adequately vet her scientific claims.

Wilyman’s 2015 thesis is a weighty tome, at over 390 pages. Yet for all its claims of global conspiracies, assertions that vaccines are ineffective, and challenges to established germ theory, it devotes none of those pages to substantiating them with data or other credible evidence.

The propagation of harmful nonsense is not new territory for Wilyman. Her anti-vaccination website teems with dangerous misinformation. It promotes unfounded fear of government vaccination programs, the medical establishment, and worldwide charity and health organisations.

Nor is the University of Wollongong a new forum for Wilyman’s crackpot anti-science views. In 2007 they granted her a Master of Science in Population Health, and proceeded to publish (in 2011, in July 2013, and in August 2013) her opinions on vaccination programs.

Despite Wilyman’s Master of Science being in a science field, and despite these publications stoking fears about the science of vaccination, they were given the imprimatur of the University of Wollongong’s school of Law, Humanities and the Arts. Clearly, and perhaps unsurprisingly given their inappropriate venue of publication, they were not subject to review by anyone qualified in the field of vaccination and public health.

That a noted anti-vaccination campaigner would seek the authority of science for those views is not shocking. What should be shocking, and is to the detriment of public understanding of science generally and public health specifically, is that the University of Wollongong grants these dangerous opinions the veneer of scientific authority. They grant Wilyman the title “Master of Science”, and publish a lengthy PhD thesis promoting her easily falsified claims.

The thesis supervisor for the 2015 paper, Brian Martin, is well known at the University of Wollongong as someone who has supervised many past papers without fulfilling the duty to check their scientific claims. His public response to the criticisms of Wilyman’s 2015 thesis fails to address the substantive complaint: that the paper contains numerous scientific errors easily discovered by experts in vaccination and public health, yet was approved by Martin regardless.

If the University of Wollongong wants its students to have credible qualifications, it must publicly commit to never assigning research students to crackpots like Brian Martin who demonstrate no regard for scientific truth. It must exercise – and publicly demonstrate its ongoing exercise of – procedures of critical and skeptical enquiry into any thesis or publication, by known experts in the fields addressed.

That has clearly not been done in the case of Wilyman’s paper, and the University of Wollongong’s reputation is rightly tarred as a result. Worse, the University’s negligent publication of dangerous falsehoods actively undercuts public respect for academic qualifications, and science in general.

The stubborn persistence of false beliefs

Maria Konnikova at The New Yorker gives us a familiar truth phrased in understatement:

Until recently, attempts to correct false beliefs haven’t had much success.

In “Why do people persist in believing things that just aren’t true?” she reports on recent research into the factors influencing the stubbornness of people’s false views of the world.

One such researcher is Brendan Nyhan, who did a longitudinal study into how the beliefs of parents changed on the topic of childhood vaccination, before and after various kinds of pro-vaccination campaign.

The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet—focussed on a lack of evidence connecting vaccines and autism—seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines. “It’s depressing,” Nyhan said. “We were definitely depressed,” he repeated, after a pause.

Stephan Lewandowsky has conducted psychological research into how humans handle misinformation and corrections: in a series of studies, students read an account of a fictional crime that either did or did not identify the perpetrators by race, and that racial information was subsequently either retracted or left standing.

Everyone’s memory worked correctly: the students could all recall the details of the crime and could report precisely what information was or wasn’t retracted. But the students who scored highest on racial prejudice continued to rely on the racial misinformation that identified the perpetrators as Aboriginals, even though they knew it had been corrected. They answered the factual questions accurately, stating that the information about race was false, and yet they still relied on race in their inference responses, saying that the attackers were likely Aboriginal or that the store owner likely had trouble understanding them because they were Aboriginal. This was, in other words, a laboratory case of the very dynamic that Nyhan identified: strongly held beliefs continued to influence judgment, despite correction attempts—even with a supposedly conscious awareness of what was happening.

The growing impression is that false beliefs are easy to change *only* if they are not tied to a person’s self-identity or group identity.

If someone asked you to explain the relationship between the Earth and the sun, you might say something wrong: perhaps that the sun revolves around the Earth, rising in the east and setting in the west. A friend who understands astronomy may correct you. It’s no big deal; you simply change your belief.

But imagine living in the time of Galileo, when understandings of the Earth-sun relationship were completely different, and when that view was tied closely to ideas of the nature of the world, the self, and religion. What would happen if Galileo tried to correct your belief? The process isn’t nearly as simple. The crucial difference between then and now, of course, is the importance of the misperception. When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur.

This important difference suggests ways to improve approaches to correcting false beliefs that are, unfortunately, tied to a person’s identity.

Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would. On all issues, attitudes became more accurate with self-affirmation, and remained just as inaccurate without. That effect held even when no additional information was presented—that is, when people were simply asked the same questions twice, before and after the self-affirmation.

Still, as Nyhan is the first to admit, it’s hardly a solution that can be applied easily outside the lab. “People don’t just go around writing essays about a time they felt good about themselves,” he said. And who knows how long the effect lasts—it’s not as though we often think good thoughts and then go on to debate climate change.

The message is a poignant and important one for skeptics:

Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted. Instead, why not focus on presenting issues in a way that keeps broader notions out of it—messages that are not political, not ideological, not in any way a reflection of who you are?

A Week in Science – 15 March 2013

Paul Willis, from the Royal Institution of Australia, presents a weekly video update of the news in science. Each show runs for 3–5 minutes, and is always worth watching.

Here’s this week’s episode, featuring Martian life, ‘hearing’ earthquakes and annoying phone talk:

Check the RIAus page for more information about each story.