Welcome to the new (and now only) Fora!
Started by Langue_doc, June 26, 2023, 07:11:46 AM
Quote from: ab_grp on June 29, 2023, 09:36:29 AM
I don't think this guy's research has been discussed here before (Nicolas Gueguen): https://retractionwatch.com/2022/12/02/paper-about-sexual-intent-of-women-wearing-red-retracted-seven-years-after-sleuths-raised-concerns/ The Data Thugs did a pretty thorough review of his research, summarized on Nick Brown's blog (https://steamtraen.blogspot.com/2017/12/a-review-of-research-of-dr-nicolas.html) and discussed at length in the 52-page document linked there.

The pushback against this kind of effort bugs me. This has been going on for quite a while now, with flawed research (flawed for various reasons) uncovered and held up to scrutiny, and yet researchers continue to try to get away with the most obvious fraud. I also see researchers I respect pushing back, and I just cannot fathom it. I think there can be a legitimate need to change the incentive structure so that just getting things published is not as much of a career driver, but the researchers who apparently think they can just get away with it should also be held publicly accountable, in my opinion. Unfortunately, there are already plenty of examples of researchers who think the gamble is worth it. Nick Brown and James Heathers have discovered a bunch of them and brought their actions to light.

Regarding clinical trials, Elisabeth Bik has done work in that area. I think she is mainly known for spotting image duplication in publications, but she has received public backlash for her science integrity efforts: https://scienceintegritydigest.com/about/
Quote from: spork on July 07, 2023, 08:24:05 AM
You reminded me of the Michael LaCour case from *gulp* almost ten years ago: https://activelearningps.com/2015/06/04/american-idol/.

AFAIK, none of his collaborators/supervisors faced any consequences.
Quote from: Diogenes on July 07, 2023, 09:12:12 AM
After reading all the Data Colada blogs, their outside analysis is pretty damning. I expect that after Harvard's internal review is complete, she gets fired, unless she can supply proof it was someone else in her lab, like a postdoc or grad student. The whistleblowers don't think others had access to the data, but probably only Harvard's IT could untangle that.
Quote
When behavioral-science researchers are accused of misbehavior, the allegations have a funny way of being a little on the nose. The former Harvard psychologist Marc Hauser, author of Moral Minds: The Nature of Right and Wrong, was found to have fabricated data and manipulated results. The University of Pennsylvania psychologist Lawrence Sanna, who studied judgment and decision making, resigned after facing similar allegations. Diederik Stapel, a Dutch social psychologist whose work touched on such topics as selfishness and morality, fabricated data at least 50 times, making him "perhaps the biggest con man in academic science." And last month, Francesca Gino, a Harvard Business School professor who studies dishonesty—and who wrote a book titled Rebel Talent: Why It Pays to Break the Rules at Work and in Life—was accused of falsifying data in at least four papers, three of which are on their way to being retracted. Her accusers now suggest that Gino, who has been placed on administrative leave from Harvard, may have faked data in dozens of her other published papers.
Quote from: Wahoo Redux on July 07, 2023, 04:17:47 PM
Is this some weird sublimation psychosis among scholars studying honesty, or is it just too tough to get good data on something like people's actual attitudes?
Quote from: ab_grp on July 10, 2023, 04:40:26 PM
Just wanted to pass along a few items I came across today that might be of interest:

Back to Dan Ariely, this time regarding data provenance for a different (heavily cited) dishonesty study. Apparently the researcher he was trying to pin the data collection on has released the emails they exchanged about the situation: https://openmkt.org/blog/2023/ucla-professor-refuses-to-cover-for-dan-ariely-in-issue-of-data-provenance/

Nick Brown has also looked into a study on the relationship of wind speed and voting and has found some data errors that he does not suspect are the result of malicious doings, but that do seem to serve as another cautionary tale about taking results at face value: https://steamtraen.blogspot.com/2023/07/data-errors-in-mo-et-als-2023-analysis.html
Quote from: Hibush on July 07, 2023, 09:47:05 AM
Quote from: spork on July 07, 2023, 08:24:05 AM
You reminded me of the Michael LaCour case from *gulp* almost ten years ago: https://activelearningps.com/2015/06/04/american-idol/.
AFAIK, none of his collaborators/supervisors faced any consequences.

LaCour was an excellent social engineer. He worked with advisors at two institutions (UCLA and Columbia), persuading each one that the other was providing the necessary rigor. As a charmer, he could pull that off and escape the scrutiny that these advisors would normally apply. The advisors appeared to admit getting conned in this way, which may have helped them escape serious reputational consequences.