Science Fictions links for April 2024
Plagiarism at Harvard (again); retractions at Stanford (again); and a whole bunch of other stuff from the world of bad science
Bad science links! Get your bad science links right here! As usual, over the month I’ve collected all the most interesting stories about meta-science, crap research, scientific fraud, the broken scientific publishing system, and related topics - and I’ve posted them below.
Enjoy. And please do subscribe and share the newsletter with your friends and colleagues if you haven’t done so already.
The links
Another month, another Nobel prizewinner (this time from Stanford) who’s had to retract and correct several of his papers due to “errors” in the images. So it’s not just the ex-President of that university who was making these “errors”!
See also the note at the top: even more errors were found in the same scientist’s papers after the news article appeared. Oops!
The data sleuth who noticed the problems in the above papers, Elisabeth Bik, successfully gets two other papers retracted… a mere nine years after she pointed out the problems. Science is self-correcting… sometimes, eventually.
I can actually go one better: here’s a paper (by the President and CEO of the Dana-Farber Cancer Institute, no less) that was retracted this month after concerns were raised about it ten years ago.
And in other Elisabeth Bik news, the BS harassment complaints thrown at her after she criticised the research of powerful French scientist Didier Raoult have, happily, been thrown out by a court.
In last month’s newsletter I mentioned the disgraced superconductivity researcher Ranga Dias. Since then, journalists have seen the report his university (University of Rochester) commissioned into his misconduct. Fabrication, falsification, plagiarism - it’s all here.
Incidental detail for metascience fans: the investigations of Dias were kicked off by a complaint from Jorge Hirsch, the materials physicist who also—for better or worse—came up with the h-index.
Oh, and talking about plagiarism - you’ll never guess what someone discovered in the work of Harvard prof and very-credibly-accused-fraudster, Francesca Gino. So it’s not just the ex-President of that university who was plagiarising!
Gino’s erstwhile colleague, Dan Ariely, this month posted an update to the effect that his university has investigated his research and given him a slap on the wrist for “faulty data”, but taken no further action. He’s “putting this matter behind him” now. Jolly good!
One of those large-scale reproducibility attempts, covering 110 papers from economics and political science. The results are a mixed bag: a high proportion (85%) of papers had computationally reproducible results (though, c’mon, it’s not that high when you think about it), but a quarter of papers had a substantial coding error, and just over half exaggerated their effect size. Lots more interesting results in there, too.
Do more diverse companies do better financially? Consulting firm McKinsey says yes; these academics who’ve re-analysed the data say no.
From the re-analysis paper: “we commit to posting the data and code by 31 December 2026”. This is stupid. What if you get run over by a bus before that date? Just post it now!
A refreshingly open and honest discussion from the editors of a reproductive health journal on their process of dealing with fraud allegations and retractions.
Remarkable article by a scientist whose paper was retracted for entirely spurious reasons (in actual fact it seems to have been done at the behest of a publishing company the paper identified as a predatory outfit).
Seriously - read the retraction note. It’s Kafkaesque. The “regression analysis didn’t have a control group”? What are you talking about?
Ruben Arslan finds a paper on subliminal priming (cited in Jonathan Haidt’s new book) where it was physically impossible that the stimuli were shown to the participants, even “subliminally”, because of the refresh rate of the monitors they used. And yet they still found effects! Psychology really is magic.
“What is the prevalence of bad social science?” Dunno. But there are some interesting examples and discussions here.
You know that thing where albums suddenly disappear from music streaming services, meaning that if you haven’t got the tracks downloaded to your computer, you can’t listen to them any more? Well, that - but for scientific journals.
Rare example of a female (alleged!) scientific fraudster (though of course cf. Gino above): researcher at the University of Toronto is accused of fabricating data, as well as getting her husband to pretend to be participants in her psychology study. She’s had her PhD revoked, and seems to have lost a new job at Northwestern University, too.
You think you’ve seen it all, and then along comes “co-authoring a paper with your own alter ego”.
P.S. Invisible College
Are you aged 18-22? Are you in the UK on 26-31 August this year? You should consider attending Invisible College, the residential programme in Cambridge run by Works in Progress magazine.
It’s all about Progress Studies - understanding how and why things get better over time. I’ll be giving a series of talks on the topics covered on this Substack: bad science, how to spot it, and what we can do about it. The week will also include loads of other interesting talks on topics like the history of technological progress and how public policy works.
If you’re interested (or you know someone who might be), you can find more details and the application form at this link. See you in August!
P.P.S. Metascience grants!
If you’re a researcher at a UK research org who’s interested in making science better, you need to take a look at these new meta-science grants (up to £300,000 each), co-funded by the UK Government and Open Philanthropy.
P.P.P.S. The Studies Show
More of my opinions on science-related matters, should you wish to hear them, are available on my podcast, The Studies Show. For understandable reasons, the biggest episode we’ve done recently was on the Cass Review and youth gender medicine. You’ll have to become a paid subscriber to The Studies Show for that one, but free subscribers can still listen to new episodes on microplastics, the history of probability, and whether depression really exists.
Image credit: Getty
An 85% computational reproducibility rate sounds depressingly low to me; I'm flabbergasted that it's presented as "high" in the original paper. Only an 85% chance that your conclusions match the data you collected doesn't bode well for the overall correctness rate of studies once you take data collection issues into account. In a sane world this would be >99%.
The comments from Sudhof are so unfathomably mercenary that I'm a little shocked. As far as I can see, this is still very much in "okay, could be fraud, could be sloppiness" territory, but the way he's responded makes me suspect fraud much more than I otherwise would.
It's just such a rote, half-assed attempt to play some sort of identity-politics card: claiming that having errors pointed out in his team's work is somehow particularly bad for the women on his team, for no specific reason, then refusing to give any examples of that being the case or provide anything to back it up. It's a sort of automatic, knee-jerk "oh, I'm under attack, this is, um [spins wheel] sexist - what's that you say - I'm a man - erm, I guess it's bad for some women who are involved, no, I don't know whom, but, you know, demanding that data not be faked or contain sloppy errors is, erm, bad for women in some generalised way because, erm, because accuracy and honesty are more male traits or something, and as I said, you're the sexist here".
It's the laziest attempt I can remember seeing to draw some sort of conceptual line between someone criticising him and some bad thing, and it's offensive in itself to assert, as if it's obvious, that having scientific standards - expecting errors to be corrected, acting against fraud - is particularly bad for women. That's implicitly a claim that women are less honest and/or sloppier than men!