ChatGPT Listed as Author on Research Papers. Many Scientists Disapprove. (nature.com)

The artificial-intelligence (AI) chatbot ChatGPT that has taken the world by storm has made its formal debut in the scientific literature -- racking up at least four authorship credits on published papers and preprints. Journal editors, researchers and publishers are now debating the place of such AI tools in the published literature, and whether it's appropriate to cite the bot as an author. From a report: Publishers are racing to create policies for the chatbot, which was released as a free-to-use tool in November by tech company OpenAI in San Francisco, California. ChatGPT is a large language model (LLM), which generates convincing sentences by mimicking the statistical patterns of language in a huge database of text collated from the Internet. The bot is already disrupting sectors including academia: in particular, it is raising questions about the future of university essays and research production. Publishers and preprint servers contacted by Nature's news team agree that AIs such as ChatGPT do not fulfil the criteria for a study author, because they cannot take responsibility for the content and integrity of scientific papers. But some publishers say that an AI's contribution to writing papers can be acknowledged in sections other than the author list.
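(As a purely illustrative aside: the "mimicking the statistical patterns of language" idea can be sketched with a toy bigram model. The short Python below is a minimal sketch of that general technique, not OpenAI's model or code; the tiny corpus, function name, and defaults are invented for the example.)

import random
from collections import defaultdict

# Toy corpus; a real LLM is trained on a huge database of text collated from the Internet.
corpus = ("the bot writes convincing text because the bot has seen "
          "how words tend to follow other words in real text").split()

# Record which word follows which (the "statistical patterns" being mimicked).
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start="the", length=10):
    word, out = start, [start]
    for _ in range(length - 1):
        nexts = follows.get(word)
        if not nexts:
            break
        word = random.choice(nexts)  # sample the next word from observed continuations
        out.append(word)
    return " ".join(out)

print(generate())

A real LLM conditions on much longer contexts with a neural network rather than a lookup table, but the framing is the same: predict plausible next tokens from patterns in the training text.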
  • by lvxferre ( 2470098 ) on Thursday January 19, 2023 @02:22PM (#63222268)

    ChatGPT is a tool and it should be treated as such.

  • by physicsphairy ( 720718 ) on Thursday January 19, 2023 @02:30PM (#63222298)

    The important thing is not crediting ChatGPT as a coauthor, it is being able to list your other "coauthors" who didn't do crap *after* ChatGPT in the author list.

    An important benchmark.

  • Any AI that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism. If there are any that correctly attribute the works produced, I see nothing wrong with it, as they can then be peer reviewed. As is, this appears to be insignificantly different from citing Google, Wikipedia, or Pinterest for works presented by those services.
    • I am not sure I agree. Some of what it does is grabbing facts and putting them into common language based on a request. The facts themselves are not plagiarized; they can even be corroborated between multiple sources. The common language component is also not directly plagiarism (or even theft of style), if it is trained on a wide enough sample of data.

      • Another problem that makes a reviewer's life difficult is that it can produce very well-written text, of the sort you would normally associate with high-quality authors, but the content may be garbage at times. So you need to question every single sentence if you're reviewing for accuracy.
      • I am not sure I agree. Some of what it does is grabbing facts and putting them into common language based on a request. The facts themselves are not plagiarized; they can even be corroborated between multiple sources.

        Right, and there's a word for providing material that "can even be corroborated between multiple sources" without providing citations for those multiple sources. To quote myself where I addressed that scenario: "As is, this appears to be insignificantly different from citing Google, Wikipedia, or Pinterest for works presented by those services."

        The common language component is also not directly plagiarism (or even theft of style), if it is trained on a wide enough sample of data.

        That falls into the true-some-of-the-time, false-some-of-the-time category. Paraphrasing and citing the paraphrasing tool, rather than the works being paraphrased, is still plagiarism [libguides.com].

        • Paraphrasing and citing the paraphrasing tool, rather than the works being paraphrased, is still plagiarism

          Even stronger, it's probably copyright infringement. A mechanical transformation of copyrighted expressions is at least a derivative work.

    • Any AI that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.

      Any human intelligence that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.

      Thus if you read Shakespeare and write a sonnet, really Shakespeare should be credited and not you for learning how to write a sonnet.

      Your view is contradictory, because the AI learns in the same manner as humans, and we credit humans for their unique sonnets imitating the sonnets of others.

      • Any AI that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.

        Any human intelligence that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.

        correct, other than the application part

        Thus if you read Shakespeare and write a sonnet, really Shakespeare should be credited and not you for learning how to write a sonnet.

        Show me one AI that people are citing that's trained only on material that's out of copyright like Shakespeare and I'll accept I was wrong. One is enough.

        Your view is contradictory, because the AI learns in the same manner as humans, and we credit humans for their unique sonnets imitating the sonnets of others.

        Nearly right; AI doesn't learn in the same manner as humans. In academia it's important that derivative works credit their originators, for copyrighted material and material no longer under copyright alike, so that peer review can be done. These tools fail to do so. Using AI as a black box that fails to indicate where the ideas came from defeats that.

        • by rgmoore ( 133276 )

          In academia it's important that derivative works credit their originators, for copyrighted material and material no longer under copyright alike, so that peer review can be done.

          That's not quite right. Citing work isn't about creating derivative works or copyright in any sense; that's why it's just as important to cite works that are long out of copyright as it is ones that were published yesterday. The purpose of citation in academic works is threefold:

          1. To give proper credit to the people whose work you're building on
  • Don't the majority of them recycle each other's papers anyway?

  • Did they list Microsoft Word as an author too?

  • From what I've seen on stackexchange when people have attempted to answer with chatGPT, the answers have been poor. A research paper is much more involved than answering a simple technical question. Because of this I would expect that any chatGPT technical text would need to be massaged by a human for it to work correctly. In my view chatGPT is a tool, like spellcheck or grammar check. And you don't cite those when you write a research paper; you don't even cite LaTeX, so why would you cite any other tool? ChatG
    • I've asked it to invent things... For example, ask it to invent a new Stargate series pilot episode. It will reuse characters from the existing show, but if you ask it for all new people it just makes them up. Same with the bad guy... Ask for a new antagonist and it will invent something you never thought of before. I was pretty impressed with its creativity.

      • Now ask it to come up with something truly novel, like a completely new genre of TV series with completely novel characters. It couldn't do it, because it's never seen anything like that before. It's already watched Stargate.
    • it doesn't come up with anything new outside of what it's been trained on.

      That's not true, at least not in any direct sense. For example here is a transcript I just generated with it:

      tell me a joke about a guy named wakeboarder using a word that rhymes with boarder

      Why did the wakeboarder bring a border collie with him on the boat? To herd the waves!

      Now we could argue about whether "border" rhymes with "boarder," and also whether the joke is funny. But it's not quoting anybody. And it's definitel

      • New in the sense that it's completely novel. Your transcript is just like ones it has already seen. It could not generate a new grammatical structure unlike anything it has already seen
  • You pulled your data analysis from the public using ChatWhateverIt'sCalled, so why should any of your research be treated as anything but public domain? Sure, use automated tools on data you couldn't be bothered to look up yourself, but don't expect to get credit for it either.
  • An author needs to be a legal entity. The creator of a 3D object is not the printer. ChatGPT can be listed as author after it successfully argues in court that it is a legal entity. So, maybe 2 or 3 years from now. ;-)
  • Citing ChatGPT as an "author" is more like admitting to plagiarism - parts of this article copied from sources unknown, or giving TMI about the editing tools you used ("grammar corrected by grammar-monkey"), or perhaps "I'd like to thank my parents, Clippy, and ChatGPT for the copy-pasta".

    The only place I could see for real using ChatGPT in a paper would be for generating a summary (roughly along the lines of the sketch below).

    I have to guess anyone listing ChatGPT as an "author" is just doing it for laughs/attention.
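    A rough sketch of what that summary-generation step might look like, for illustration only: this assumes the legacy (pre-1.0) OpenAI Python client, and the model name, prompt wording, and file name are hypothetical, not taken from any of the papers in question.

    import os
    import openai  # legacy openai<1.0 client assumed

    openai.api_key = os.environ["OPENAI_API_KEY"]

    # Hypothetical manuscript file to be summarized.
    paper_text = open("manuscript.txt").read()

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": "Summarize the following paper in one paragraph:\n\n" + paper_text,
        }],
    )

    print(response["choices"][0]["message"]["content"])

    Even then, per the publishers quoted in the summary above, that kind of contribution belongs in an acknowledgements or methods section rather than the author list.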

  • What do the four papers referenced by the article have in common? They all came from the medical sector, that is, from people who do not think but memorize stuff and apply it like a bot. That's why they see ChatGPT as a co-author; it works in the same way as them: don't think, just memorize and apply data based on rules and frameworks.

    Google fired an engineer who was saying the company's AI is sentient. The authors of these papers are almost in the same territory, thinking ChatGPT has intelligence. This i
    • What you've just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this thread is now dumber for having read it. I award you no points, and may God have mercy on your soul.
    • But is that how they used it? Or did they do what journalists have been doing for a while: give it specific directions like "Write an introduction to a white paper on the subject of blah where we were testing for blah and blahblahblah...". You edit the technical stuff and insert your data, and massage it all until you have a nice readable paper. Think of using the AI as more of an editor and non-technical wordsmith.
      I am suspicious that most people are reading way too much into their use of the term "author".
