AI Technology

Academic Publishers Turn To AI Software To Catch Bad Scientists Doctoring Data (theregister.com)

Shady scientists trying to publish bad research may want to think twice, as academic publishers are increasingly using AI software to automatically spot signs of data tampering. The Register: Duplication of images, where the same picture of, for example, a cluster of cells is copied, flipped, rotated, shifted, or cropped, is unfortunately quite common. In cases where the errors aren't accidental, the doctored images are created to look as if the researchers have more data and conducted more experiments than they really did. Image duplication was the top reason papers were retracted by the American Association for Cancer Research (AACR) from 2016 to 2020, according to Daniel Evanko, the association's Director of Journal Operations and Systems. Having to retract a paper damages both the authors' and the publisher's reputations. It shows that the quality of the researchers' work was poor, and that the editors' peer review process missed the mistakes.

To prevent embarrassment for both parties, academic publishers like the AACR have turned to AI software to detect image duplication before a paper is published in a journal. The AACR started trialling Proofig, an image-checking programme developed by an Israeli startup of the same name. Evanko presented results from the pilot study showing how Proofig affected the AACR's operations at the International Congress on Peer Review and Scientific Publication conference held in Chicago this week. The AACR publishes ten research journals and reviews over 13,000 submissions every year. From January 2021 to May 2022, officials used Proofig to screen 1,367 manuscripts that had been provisionally accepted for publication, and contacted authors in 208 cases after reviewing image duplicates flagged by the software. In most cases, the duplication is a sloppy error that can be fixed easily: scientists may have accidentally mixed up their results, and the issue is often resolved by resubmitting new data. On rare occasions, however, the dodgy images highlighted by the software are a sign of foul play.
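The summary describes the kinds of transformed duplicates (flips, rotations, crops) that such screening tools hunt for. Proofig's actual methods are proprietary, but the core idea can be sketched with a simple perceptual "average hash": downsample each image to a small grid, threshold at the mean, and compare fingerprints across a set of candidate transforms. The function names and thresholds below are illustrative assumptions, not Proofig's implementation:

```python
import numpy as np

def ahash(img, size=8):
    """Average hash: downsample to a size x size grid of block means,
    then threshold at the overall mean to get a binary fingerprint."""
    h, w = img.shape
    img = img[: h - h % size, : w - w % size]  # crop to block multiples
    bh, bw = img.shape[0] // size, img.shape[1] // size
    blocks = img.reshape(size, bh, size, bw).mean(axis=(1, 3))
    return blocks > blocks.mean()

def is_duplicate(a, b, max_diff=3):
    """Flag b as a (possibly flipped/rotated) duplicate of a if any
    transformed variant's hash is within max_diff bits of a's hash."""
    ha = ahash(a)
    variants = [b, np.fliplr(b), np.flipud(b),
                np.rot90(b), np.rot90(b, 2), np.rot90(b, 3)]
    return any(int(np.sum(ha != ahash(v))) <= max_diff for v in variants)
```

A real screening pipeline would need far more robustness (e.g. keypoint matching to catch partial crops and shifts), but the sketch shows why flipped or rotated copies are cheap to detect once images are reduced to compact fingerprints.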

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by VeryFluffyBunny ( 5037285 ) on Tuesday September 20, 2022 @12:48PM (#62898525)
    Perhaps we need to take a step back & think how/why we got to the point where falsifying data, & therefore casting doubt on what is known about the world, has become a common practice? What is it that institutions are doing that's cultivated this kind of mentality?
    • by pete6677 ( 681676 ) on Tuesday September 20, 2022 @12:52PM (#62898539)

      Lots of data gets falsified due to pressure from Big Pharma, for one.

      • by Tablizer ( 95088 )

        There's somewhat of a difference between pressure to produce real results, and pressure to cheat. But I do agree they often result in the same thing: cheating.

      • Pressure may make it worse but the problem is wider than this because it is not just seen in scientists writing journal publications but also in students who are cheating on exams and plagiarizing assignments in worryingly increasing numbers. While pressure to "do well" may push some towards taking immoral shortcuts it's the complete absence of morals in the behaviour of many of our political leaders that, I believe, gives them the "permission" to take those shortcuts.

        If our political leaders lie and che
    • by Tablizer ( 95088 )

It's the result of PPP: "publish or perish" pressure. And cross-checking others' work is not glamorous and doesn't pay well. Checks and balances are necessary in a good many endeavors and need to somehow be funded.

      Re TFA: "To Catch Bad Scientists Doctoring Data"

It's not necessary to put "bad" in the title; there's rarely a good scientist who doctors data. (A military spy on "your side" is about the only situation I can think of.)

    • by Gravis Zero ( 934156 ) on Tuesday September 20, 2022 @01:19PM (#62898665)

      Perhaps we need to take a step back & think how/why we got to the point where falsifying data, & therefore casting doubt on what is known about the world, has become a common practice?

      You mean how science became linked to money? Simple, businesses are always pushing to make more money. With US colleges becoming for-profit businesses (despite claims to the contrary) thanks to a major political party being against further education and cutting funding as much as possible, it's a problem in US colleges too.

      A system designed to reward greed will push people to do unsavory things or be replaced by someone who will. Fixing this will be a lifelong endeavor as "the system" is highly lopsided in favor of moneyed parties.

    • It's like everything else in a capitalistic society. Eventually, money becomes the main driver of everything. In the drive to make more money, all ethics and morals either slowly or quickly erode, as people do whatever they can to gain wealth, which, in a capitalistic society, also means gaining power. Nothing is immune from it. Not even science.

      • Re: (Score:1, Flamebait)

        by sfcat ( 872532 )

        Because scientists in other systems are perfect right? You know, right now in liberal arts departments in universities there is an ideology that teaches that you can ignore data and reality if it doesn't align with your political ideology. And this is a distinctly left wing ideology. At the top, university presidents are judged by their ability to build new buildings and little else. A step down, bureaucracy is growing at a rapid rate and consuming all the extra funds universities are taking in. Below

      • by HiThere ( 15173 )

The problem here is that you're thinking it's limited to capitalistic societies. It's literally everywhere and everywhen. Sometimes the push is stronger than others, and sometimes the emphasis is on alternative currencies, like political connections, but it's always present. Even when cowry shells are the currency. (I seem to recall that Sparta was able to avoid the problem by making the currency large chunks of iron that nobody could carry around. Their alternative currencies were slaves and political

        • And it's corruption all the way down.

        • There's a big difference between having "some" competition vs basing your entire economic system on pitting everyone against each other & grinding them down to maximise profits for the owners of the capital, who sit back & watch their obscene fortunes accumulate, at the expense & suffering of everyone else, AKA capitalism. If anything, capitalism is like feudalism on steroids.

          The artificial competition currently being pursued by the institutions & publishers doesn't make any more money
          • by HiThere ( 15173 )

            If you think the slaves weren't ground down, I don't know what your model is. The competition is just as strong in other societies, though the accounting may be a bit different. In some societies they kill lots of people. Rome was like that. It's not always clear from the histories just how things were done, but certainly the push to be on top was strong enough to kill large numbers of pretenders to the throne. To me this doesn't imply weak pressure lower down, it rather implies a desperate striving to

            • It depends which slaves you're talking about. The British empire's & the USA's ideas of slavery were an extreme departure from the rest of the world at the time. Colonialism was an unusually egregious crime against humanity & we're still feeling its effects today, in the workplace & people's attitudes towards what are acceptable working hours. Pre-industrial revolution, people typically worked far fewer hours.
              • by HiThere ( 15173 )

Actually, the slaves I was referring to were the Spartan slaves I had mentioned earlier. You could, however, also check into the "gentle" methods used by Rome and Carthage. My examples were chosen because they were before capitalism had evolved, so they could not have been contaminated by it.

                And if you were talking about pre-agricultural revolution, then I would agree that the people commonly worked fewer hours. The difference the industrial revolution made is that nominally free people worked more intens

                • I think you may be conflating the effects of democracy with the effects of capitalism. It was the combination of feudalism (the UK's king & the crown) with the early stages of modern capitalism (e.g. the East India Trading Company) that developed one of the worst crimes against humanity in history. If anything, democracy is our defence against the ravages of capitalism & the owning classes privately despise it.
      • Haha. Chinese are among the worst offenders.

    • How about spending months, maybe years gathering data to test a hypothesis & getting null results, & then worrying if you're going to have a job going forward? I don't think corporate corruption is directly responsible. Maybe more that institutions are applying a very high-stakes system & otherwise conscientious researchers sometimes buckle under the pressure?
  • IMPOSSIBLE (Score:1, Funny)

    by Var1abl3 ( 1021413 )

This is not possible. Scientists would never lie or mislead the public. I have heard this rule applies to everything from AGW to Covid, so I know it must be true... Trust the science!!!!

    • And Round Earth Conspiracy... and Old Earth conspiracy, And Moon Landing! GO GO GO Trust the Science!
  • by medv4380 ( 1604309 ) on Tuesday September 20, 2022 @12:56PM (#62898563)
    Sounds like an attempt to get more lousy science by creating part of the adversarial network needed to generate even more legit-sounding papers.
  • by Kunedog ( 1033226 ) on Tuesday September 20, 2022 @01:01PM (#62898575)

    Having to retract a paper damages the authors and the publishers' reputation. It shows that the quality of work from the researchers was poor, and the editor's peer review process missed mistakes.

    To prevent embarrassment for both parties, academic publishers like AACR have turned to AI software to detect image duplication before a paper is published in a journal.

    When I reached this part, I had a sudden twinge, as if I were reading an article about pepper spray and stun guns, and how they can be acquired and analysed effectively by serial killers and rapists to hone their craft so everything goes smoothly on "game night."

    How much embarrassment do we want to prevent for crooked researchers?

    • by Zak3056 ( 69287 )

While not as colorful as your take, this was pretty much my immediate thought as well: what they're actually doing here, whether they intend to or not, is training people who falsify results to do a better job of it while ensuring there are no personal or professional consequences for doing so.

      When someone is found to have falsified their data and submitted it for publication, the result should be to broadcast the fact that they did so as far and wide as possible, so that their entire body of work can be revi

  • People that are better with photoshop....

    • by guruevi ( 827432 )

Having worked in academia, I can say Photoshop and InDesign are sadly the prime way some people make figures, statistics and layouts. Slack channels are full of grad students and faculty asking how to stretch the little data they have to support the hypotheses underpinning their work.

      It's primarily government funding that drives this too, basically you get funding on the basis of having already been close to proving your hypothesis.

We need Elisabeth Bik, "super-spotter of duplicated images in science papers," but an AI version of her.
  • The problem is that people skew results to make it either very good or extremely good.

    Science needs us to publish negative results, and admit when approaches fail.

  • by Anonymous Coward

    Just wait until someone turns that engine on the covid data. The heads of the local branch covidians will pop.

    • by sfcat ( 872532 )

      Just wait until someone turns that engine on the covid data. The heads of the local branch covidians will pop.

      The covid data was badly gathered and badly analyzed, it wasn't fake. So this AI won't detect anything there. Sorry to burst your bubble.

  • by Tony Isaac ( 1301187 ) on Tuesday September 20, 2022 @01:57PM (#62898795) Homepage

    Results are likely doctored if:
    - The research is paid for by a commercial interest / drug company
    - The research (methods and assumptions) is not made publicly available
    - The raw data is not made publicly available
    - The research conclusions align with a political agenda
    - The research or devices related to it, are patented by the researchers or university

    All others: 50% chance

    But good luck finding one of those others.

    • by sfcat ( 872532 )

      - The raw data is not made publicly available

So all medical data then? Because the medical field never releases the raw data from their studies. I am sure they claim this is about patient privacy. Most medical studies have a very small number of patients and you can't anonymize such small sample sets. However, I am also sure that if a statistician dug into that data, many, many drugs (with large side effects and questionable efficacy) would be pulled off the market very quickly. Doctors are not scientists and aren't given the statistical traini

      • Medical raw data can be de-identified, at least to the degree that it would require serious forensic analysis to figure out who is who.

        This is similar to the 100,000 Genomes project https://www.genomicsengland.co... [genomicsengland.co.uk] Sure, if you get that raw DNA data, you could upload it to a site like GedMatch and figure out who a particular raw data sample belonged to. But it would take a lot of effort, beyond what a typical marketer would want to spend. If somebody is after YOUR specific identity and are willing to work

      • Also, if the sample size is so small that it can't be effectively de-identified, then you know that the results are little better than random. And yes, I believe there are many drugs on the market that should not be, because they are statistically no better than a placebo, but may have serious side effects.

  • Running fast while standing still...the Red Queen's Race [wikipedia.org] with AI.

AI detects doctored data, then new AI doctors data better, and then the AI checks the doctored data better... ad infinitum. Perhaps it will spur increases in processing power. :-)

    JoshK.

  • by burtosis ( 1124179 ) on Tuesday September 20, 2022 @02:07PM (#62898821)
Well thank goodness an AI could catch some of these people cheating on their research papers. As this is within my field of expertise, perhaps you would like to peruse my latest paper on intent-based data generation, where you just put your goals in as a measurable objective, and it is optimized to spit out the most believable, realistic results, to the point that they may be too believable. But don't worry, you can always dial it back if it looks too good for your taste; after all, it is intent-based generation




    Ok, so I may have falsified the data backing up the algorithm efficacy, but if I had gotten the damn thing to work I wouldn’t have needed to.
    • by swell ( 195815 )

      "put your goals in as a measurable objective"

      In a more idealistic time, a high school teacher explained to me how science is done.

      "A scientist develops an idea, a theory, about some basic science matter. If he thinks it's worth exploring he begins experiments. The purpose of the experiments is to show that the theory is WRONG. After many failures to prove his idea WRONG, the scientist reaches out to others- maybe they can prove it WRONG. Finally, if nobody can prove the theory WRONG, the results are publish

High school teacher obviously never worked as a professional scientist. However, he did attend a church known as the Church of Latter Day Scientists where he was inculcated in some scientolic religion.

        No scientist invents a theory in the hope of proving it wrong. More like he dares others to prove him wrong, confident that he's right. Furthermore, many sciences don't have experiments, only observations.

        • No scientist invents a theory in the hope of proving it wrong. More like he dares others to prove him wrong, confident that he's right. Furthermore, many sciences don't have experiments, only observations.

          Semantics. The only successful scientists are ones that have tried to prove themselves wrong and failed to do so with the math, or through observation and experiment. Sure, I bet none of them wanted to be able to disprove their hunch, but to achieve success it has to actually work and thus fail to be disproven. When they failed to do it themselves and it was predictive that’s when they started daring people to prove them wrong. Scientists that want to be right and did not try to think about how th
