Using experts 'inexpertly' leads to policy failure, warn researchers

October 14, 2015, University of Cambridge

The accuracy and reliability of expert advice are often compromised by "cognitive frailties", and such advice needs to be interrogated with the same tenacity as research data to avoid weak and ill-informed policy, warn two leading risk analysis and conservation researchers in the journal Nature today.

While many governments aspire to evidence-based policy, the researchers say the evidence on experts themselves actually shows that they are highly susceptible to "subjective influences" - from individual values and mood, to whether they stand to gain or lose from a decision - and, while highly credible, experts often vastly overestimate their objectivity and the reliability of peers.

The researchers caution that conventional approaches to informing policy - seeking advice from well-regarded individuals or assembling expert panels - need to be balanced with methods that alleviate the effects of psychological and motivational bias.

They offer a straightforward framework for improving expert advice, and say that experts should provide and assess evidence on which decisions are made - but not advise decision makers directly, which can skew impartiality.

"We are not advocating replacing evidence with expert judgements, rather we suggest integrating and improving them," write professors William Sutherland and Mark Burgman from the universities of Cambridge and Melbourne respectively.

"Policy makers use expert evidence as though it were data. So they should treat expert estimates with the same critical rigour that must be applied to data," they write.

"Experts must be tested, their biases minimised, their accuracy improved, and their estimates validated with independent evidence. Put simply, experts should be held accountable for their opinions."

Sutherland and Burgman point out that highly regarded experts are routinely shown to be no better than novices at making judgements.

However, several processes have been shown to improve performances across the spectrum, they say, such as 'horizon scanning' - identifying all possible changes and threats - and 'solution scanning' - listing all possible options, using both experts and evidence, to reduce the risk of overlooking valuable alternatives.

To get better answers from experts, they need better, more structured questions, say the authors. "A seemingly straightforward question, 'How many diseased animals are there in the area?' for example, could be interpreted very differently by different people. Does it include those that are infectious and those that have recovered? What about those yet to be identified?" said Sutherland, from Cambridge's Department of Zoology.

"Structured question formats that extract upper and lower boundaries, degrees of confidence and force consideration of alternative theories are important for shoring against slides into group-think, or individuals getting ascribed greater credibility based on appearance or background," he said.

When seeking expert advice, all parties must be clear about what they expect of each other, says Burgman, Director of the Centre of Excellence for Biosecurity Risk Analysis. "Are they expecting estimates of facts, predictions of the outcome of events, or advice on the best course of action?"

"Properly managed, experts can help with estimates and predictions, but providing advice assumes the expert shares the same values and objectives as the . Experts need to stick to helping provide and assess evidence on which such decisions are made," he said.

Sutherland and Burgman have created a framework of eight key ways to improve the advice of experts. These include using groups - not individuals - with diverse, carefully selected members working well within their areas of expertise.

They also caution against being bullied or "starstruck" by the over-assertive or heavyweight. "People who are less self-assured will seek information from a more diverse range of sources, and age, number of qualifications and years of experience do not explain an expert's ability to predict future events - a finding that applies in studies from geopolitics to ecology," said Sutherland.

Added Burgman: "Some experts are much better than others at estimation and prediction. However, the only way to tell a good expert from a poor one is to test them. Qualifications and experience don't help to tell them apart."
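
As an illustration of what "testing" an expert could look like, the sketch below scores hypothetical past yes/no forecasts against known outcomes using the Brier score (lower is better). The track records and the choice of scoring rule are assumptions made for the example; the authors do not prescribe a particular test here.

    # Illustrative sketch only: ranking experts by the accuracy of past forecasts.
    def brier_score(forecasts: list[float], outcomes: list[int]) -> float:
        # Mean squared difference between forecast probabilities (0-1)
        # and what actually happened (0 or 1); lower is better.
        return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

    # Hypothetical track records on the same ten yes/no questions.
    outcomes = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    experts = {
        "expert_a": [0.9, 0.2, 0.8, 0.7, 0.1, 0.3, 0.9, 0.2, 0.8, 0.7],
        "expert_b": [0.6, 0.5, 0.5, 0.6, 0.4, 0.5, 0.6, 0.5, 0.5, 0.6],
    }

    for name, probs in sorted(experts.items(), key=lambda kv: brier_score(kv[1], outcomes)):
        print(f"{name}: Brier score = {brier_score(probs, outcomes):.3f}")

On a record like this, the scores - not qualifications or years of experience - are what separate the sharper forecaster from the vaguer one.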

"The cost of ignoring these techniques - of using experts inexpertly - is less accurate information and so more frequent, and more serious, policy failures," write the researchers.

6 comments

julianpenrod
3.7 / 5 (3) Oct 14, 2015
Makes sense. Those in "official" positions of authority don't care what's right or wrong. They do what they want. If they use "experts", it's only to give the gullible the impression that qualified advice is being used. The rich use everything from false advertising to bribing distributors to manipulating stock numbers to the insipidity of a plurality of the public to get money. There is no such thing as "economics"; every turn in the market has been the result of connivance behind the scenes by the rich. Politicians tell "electoral" boards what fraudulent numbers to print to put who they want in office. And the gullible believe that whatever the "news" is ordered to say are the results of various exercises. There is no record of "experts" complaining that various individuals or organizations didn't use their advice right.
chapprg1
1 / 5 (1) Oct 15, 2015
Well said Julian..
""cognitive frailties", and needs to be interrogated with the same tenacity as research data to avoid weak and ill-informed policy,"
How this can be done in the face of bias, political and otherwise, by people who are not expert is the real question. In other words how to prevent the 'same tenacity' from going awry for all the less objective reasons seems to be our perennial problem. Unfortunately these same problems plague the 'experts' who allow for the politicization of science even within the 'expert community'.

antialias_physorg
4.3 / 5 (6) Oct 15, 2015
OK, I get that there is bias, but to propose as a solution that people who know nothing about the problem (or the foundations of an expert opinion) should be the arbiters of impartiality is ludicrous. You might as well forego expert advice altogether and just roll dice.

I say go the other way: Cut the politicians out of the loop and follow expert advice directly. That way you at least eliminate the political distortion/bias that comes with a limited term and a need to pander to the basest drives of money backers and voters.

Who would you believe about the quality of a car? The guy who built it or the salesman? Both will be biased towards the quality of the car, but I'll take the word of a mechanic/engineer over that of a salesman any day.
AGreatWhopper
1 / 5 (2) Oct 20, 2015
Penrod's idea of an expert is an octogenarian bachelor teaching about sexual behavior. Which expert says you're living in sin, Penrod, denying AGW and fighting against efforts to curb it?
SuperThunder
2.3 / 5 (3) Oct 20, 2015
@antialias, hell yes. There really is no need for politicians in a world with the scientific method, except maybe as zoo keepers until education can be stolen back from the stormtroopers of insanity.
docile
Oct 20, 2015
This comment has been removed by a moderator.
