Not all published research studies are created equal, so put them to a pressure test before you take their findings on board.
By Jason Murphy
By now, you’ve probably heard about power posing. You may have even tried it – standing with your feet apart and hands on hips. Research has suggested this cowboy-like stance may make you more confident for a high-stakes experience such as a job interview or a performance review, and ultimately it may help you succeed.
Excitement over power posing began a few years ago with two small psychological studies.
The technique really took off when Harvard Business School associate professor Amy Cuddy – who with fellow social psychologists Dana Carney and Andy Yap had co-authored the original power-posing research (published in 2010 in Psychological Science) – gave a TED Talk outlining their findings and demonstrating a few poses.
Her talk, “Your body language shapes who you are”, has since had more than 38 million views online.
Yet after gaining a great deal of popularity, the power-pose findings failed to be replicated in a much larger study undertaken by European academics (published as “Assessing the Robustness of Power Posing” in 2015 in Psychological Science). If this wasn’t enough to make dedicated power posers slump, last year one of the original study’s co-authors, Carney, posted a document on her website effectively debunking the research.
“Management journals, they want a pretty story.” Professor Jen Overbeck, Melbourne Business School
This generated heated debate among psychology academics, but power posing was just one of several findings to provoke international consternation.
In 2015, a collaboration of 270 investigators working for five years published the results of their efforts to replicate 100 studies that had appeared in three leading psychology journals. The Reproducibility Project: Psychology, led by Brian Nosek of the University of Virginia, reported in Science that the results of these original studies were repeated in just 39 per cent of cases.
What has been called “a crisis of replication” has ensued, raising major questions over how much published psychology research actually stands up.
Professor Jen Overbeck of the Melbourne Business School says academic business literature is yet to deal with the implications of the replication crisis.
“There has been this illusion or maybe motivated denial that what is happening in social psychology has anything to do with what happens in management,” she says. “It’s coming to management. It’ll be here.”
Overbeck publishes in both psychology and business journals, and says the psychology journals have taken steps to increase transparency and reduce the risk of non-replicating studies, “whereas management journals, they want a pretty story”.
Sometimes those pretty stories have a less attractive sequel, such as the study into German names that proclaimed people with a noble moniker, such as Kaiser, were more likely to be in managerial positions than those called Koch.
“I would always be worried if somebody is only talking about his or her own studies. That’s another sign it hasn’t been replicated.” Michael Frese, National University of Singapore Business School
One of the authors of this flawed German names study, associate professor Eric Uhlmann, now at INSEAD Singapore, has become proactive about making replication a vital part of science. In 2016, he co-authored a paper with Martin Schweinsberg calling for findings to be replicated before they are published, rather than after.
Uhlmann and Schweinsberg, in an email interview, say, “No one is immune to failed replications. The good news is that social psychology is beginning to set the standards for reproducible science. The science of business can only benefit from that.”
For now, managers must use their judgement in assessing the merits of eye-catching studies they come across in business media.
One way to ensure sound judgement is to seek safety in numbers. “Do not look for just one study,” says Michael Frese, provost’s chair and head of the Department of Management and Organisation at the National University of Singapore Business School. He recommends studies of studies, called meta-analyses.
“Meta-analysis is a good way to deal with the problem of replication because you look for the average of studies,” says Frese. “I would urge managers or HR departments to look for meta-analyses rather than base everything on one study only.”
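The averaging Frese describes can be sketched in a few lines of code. The snippet below is purely illustrative — the function, the effect sizes and the variances are invented, not drawn from any of the studies mentioned in this article — but it shows the basic inverse-variance weighting behind a simple fixed-effect meta-analysis: bigger, more precise studies count for more than small ones.

```python
# Minimal fixed-effect meta-analysis sketch (illustrative only).
# Each study's effect size is weighted by the inverse of its variance,
# so large, precise studies pull the pooled estimate more than small ones.

def fixed_effect_meta(effects, variances):
    """Return the pooled effect size and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    standard_error = (1.0 / sum(weights)) ** 0.5
    return pooled, standard_error

# Three hypothetical studies: one small study reporting a large effect,
# and two much larger studies reporting small effects.
effects = [0.80, 0.10, 0.05]      # standardised effect sizes (invented)
variances = [0.25, 0.01, 0.02]    # small variance = large study (invented)

pooled, se = fixed_effect_meta(effects, variances)
```

With these made-up numbers the pooled effect comes out near 0.10 — close to the two large studies and far from the eye-catching small one, which is exactly the point of looking at the average of studies rather than a single headline result.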
Where that’s not possible, Frese suggests checking to see if the findings in a particular field come from one industrious researcher. “I would always be worried if somebody is only talking about his or her own studies,” he says. “That’s another sign it hasn’t been replicated.”
Frese cautions against over-reaction. Just because some studies are broken does not mean the whole field is.
“One should not take from this discussion that science is useless and you should listen to your intuition, because your intuition is more frequently wrong,” Frese says. “It’s better to have a meta-analysis based on a few thousand companies and experiences rather than one’s own experience.”
For business, one way to avoid falling into a post-factual fog is to actively help produce good research. Research should not be the sole burden of academics, says marketing professor Utpal Dholakia of Rice University in Houston, Texas.
“Help them so they can generate robust knowledge that will help you,” he urges. “My best research has been in collaborating with supportive and thoughtful managers and business owners.”
Owning your results
In 2013, Eric Uhlmann and Raphael Silberzahn published a study that concluded Germans with last names with noble meanings, such as Kaiser (meaning emperor), were more likely to hold managerial positions than people with other names (such as Koch, meaning cook).
Such claims get noticed and, sure enough, they made headlines worldwide.
However, the study had several problems. Some were to do with statistical techniques, but one was simpler: the social network from which the researchers originally drew their data did not show managers with different names at equal frequencies. In short, there were more Kaisers to start with.
In 2015, the pair published a follow-up with Uri Simonsohn, who has made a name for himself as a “data detective” and is a researcher at the forefront of ensuring good practice.
In the newer paper, they found that “no significant relationship between the meaning of a person’s name and his or her career outcomes can be confirmed”. The table below, from the 2015 paper, indicates that a statistically significant relationship is not present.