ChatGPT Listed as Author on Research Papers. Many Scientists Disapprove. (nature.com)
The artificial-intelligence (AI) chatbot ChatGPT that has taken the world by storm has made its formal debut in the scientific literature -- racking up at least four authorship credits on published papers and preprints. Journal editors, researchers and publishers are now debating the place of such AI tools in the published literature, and whether it's appropriate to cite the bot as an author. From a report: Publishers are racing to create policies for the chatbot, which was released as a free-to-use tool in November by tech company OpenAI in San Francisco, California. ChatGPT is a large language model (LLM), which generates convincing sentences by mimicking the statistical patterns of language in a huge database of text collated from the Internet. The bot is already disrupting sectors including academia: in particular, it is raising questions about the future of university essays and research production. Publishers and preprint servers contacted by Nature's news team agree that AIs such as ChatGPT do not fulfil the criteria for a study author, because they cannot take responsibility for the content and integrity of scientific papers. But some publishers say that an AI's contribution to writing papers can be acknowledged in sections other than the author list.
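For readers unfamiliar with how an LLM produces text, here is a deliberately tiny sketch of next-token prediction. This is not ChatGPT's actual architecture; it is a toy bigram model over a made-up corpus, offered only to illustrate the "mimicking the statistical patterns of language" idea from the summary.

```python
import random
from collections import defaultdict

# Toy illustration, NOT ChatGPT's real architecture: a bigram "language
# model" that picks each next word according to how often it followed the
# previous word in a tiny made-up corpus. Real LLMs do the same kind of
# next-token prediction with a neural network trained on vastly more text.

corpus = (
    "the model predicts the next word . "
    "the model mimics the statistics of text . "
    "the statistics of text guide the next word ."
).split()

# Record which words follow which (duplicates preserved, so random.choice
# samples in proportion to observed frequency).
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def generate(start="the", length=12):
    word, out = start, [start]
    for _ in range(length):
        candidates = following.get(word)
        if not candidates:
            break
        word = random.choice(candidates)
        out.append(word)
    return " ".join(out)

print(generate())
```

ChatGPT replaces the frequency table with a large neural network, but the generation loop is still one token at a time, which is why the output reads fluently without the model "knowing" or taking responsibility for what it says.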
Might as well list pens and pencils as authors. (Score:4, Insightful)
ChatGPT is a tool and it should be treated as such.
Re: (Score:3)
"ChatGPT is a tool and it should be treated as such."
Exactly. ...
If the paper is signed with 'Written on my iPhone' OTOH
Re: Might as well list pens and pencils as authors (Score:2)
You forgot to append "posted from my iPhone".
Re: (Score:1)
ChatGPT is a tool and it should be treated as such.
Bingo. The humans who think ChatGPT is something more than a tool are the actual tools.
Re: (Score:2)
Re: (Score:1)
Why are we listening to scientists at all?
You ask that question as "social media" becomes "the news" for anyone under 30?
I'd say the answer is rather obvious; because we're dumb enough to assume scientists are never wrong. Kind of like celebrities.
Re: (Score:2)
I'd say the answer is rather obvious; because we're dumb enough to assume scientists are never wrong. Kind of like celebrities.
Is that better or worse than being so dumb you think you know more about a topic than people who actually understand the topic?
Re: (Score:2)
Perhaps it's "better" to understand that you, me, and every other fallible human (and scientist) fall somewhere within the Dunning-Kruger realm of human ignorance and confidence. And we get humbled by our own race. Often.
It's also good to understand that the Disease of Greed has infected mankind for thousands of years now. Questioning the validity and accuracy of "research studies" and scientists has damn near become a full-time industry that exploded in recent years due to the massive market value of selling H
Don't take this from us (Score:3)
The important thing is not crediting ChatGPT as a coauthor; it is being able to list your other "coauthors" who didn't do crap *after* ChatGPT in the author list.
An important benchmark.
Re: (Score:2)
> Unless the scientists are claiming there's a special art to science, following the scientific method is all that is required to do science.
Curious what you mean by a special art, or by the scientific method?
"6. Discourse on scientific method.
"Despite philosophical disagreements, the idea of the scientific method still figures prominently in contemporary discourse on many different topics, both within science and in society at large. Often, reference to scientific method is used in ways that convey eit
It's still plagiarism (Score:2, Insightful)
Re: (Score:3)
I am not sure I agree. Some of what it does is grabbing facts and putting them into common language based on a request. The facts themselves are not plagiarized; they can even be corroborated between multiple sources. The common language component is also not directly plagiarism (or even theft of style), if it is trained on a wide enough sample of data.
Re: (Score:2)
Re: (Score:3)
I am not sure I agree. Some of what it does is grabbing facts and putting them into common language based on a request. The facts themselves are not plagiarized; they can even be corroborated between multiple sources.
Right, and there's a word for providing material that "can even be corroborated between multiple sources" without providing citation for those multiple sources. To quote myself where I addressed the scenario "As is, this appears to be insignificantly different from citing Google, Wikipedia, or Pinterest for works presented by those services."
The common language component is also not directly plagiarism (or even theft of style), if it is trained on a wide enough sample of data.
That falls into the "true some of the time, false some of the time" category. Paraphrasing and citing the paraphrasing tool, rather than the works being paraphrased, is still plagiarism [libguides.com]
Re: (Score:2)
Paraphrasing and citing the paraphrasing tool, rather than the works being paraphrased, is still plagiarism
Even stronger, it's probably copyright infringement. A mechanical transformation of copyrighted expressions is at least a derivative work.
Learning isn't plagiarism. (Score:3)
Any AI that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.
Any human intelligence that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.
Thus if you read Shakespeare and write a sonnet, really Shakespeare should be credited and not you for learning how to write a sonnet.
Your view is contradictory, because the AI learns in the same manner as humans, yet we credit humans for their unique sonnets imitating the sonnets of others.
Re: (Score:2)
Any AI that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.
Any human intelligence that produces output without sufficiently crediting the authors of the inputs it trained on is correctly regarded as an application that just speeds up plagiarism.
correct, other than the application part
Thus if you read Shakespeare and write a sonnet, really Shakespeare should be credited and not you for learning how to write a sonnet.
Show me one AI that people are citing that's trained only on material that's out of copyright like Shakespeare and I'll accept I was wrong. One is enough.
Your view is contradictory, because the AI learns in the same manner as humans, yet we credit humans for their unique sonnets imitating the sonnets of others.
Nearly right: AI doesn't learn in the same manner as humans. In academia it's important to credit the originators of derivative works, whether the source material is under copyright or not, so that peer review can be done. These models fail to do so. Using AI as a black box that fails to indicate where the ideas came from
Re: (Score:1)
That's not quite right. Citing work isn't about creating derivative works or copyright in any sense; that's why it's just as important to cite works that are long out of copyright as it is ones that were published yesterday. The purpose of citation in academic works is threefold:
What is a scientist? (Score:2)
Don't the majority of them recycle each other's papers anyway?
ChatGPT is a tool not an author (Score:2)
Did they list Microsoft Word as an author too?
ChatGPT really isn't that great (Score:2)
Re: (Score:1)
I've asked it to invent things... For example, ask it to invent a new Stargate series pilot episode. It will reuse characters from the existing show, but if you ask it for all-new people it just makes them up. Same with the bad guy... ask for a new antagonist and it will invent something you never thought of before. I was pretty impressed with its creativity.
Re: (Score:2)
Re: (Score:2)
That's not true, at least not in any direct sense. For example here is a transcript I just generated with it:
Now we could argue about whether "border" rhymes with "boarder," and also whether the joke is funny. But it's not quoting anybody. And it's definitely original.
Re: (Score:2)
Then work submitted should be Public Domain. (Score:2)
This post was written by Mozilla Firefox (Score:2)
Just for the laughs, I suppose (Score:2)
Citing ChatGPT as an "author" is more like admitting to plagiarism - parts of this article copied from sources unknown - or giving TMI about the editing tools you used ("grammar corrected by grammar-monkey"), or perhaps "I'd like to thank my parents, Clippy, and ChatGPT for the copy-pasta".
The only place I could see for real using ChatGPT in a paper would be for generating a summary.
I have to guess anyone listing ChatGPT as an "author" is just doing it for laughs/attention.
Dangerous assumptions about what ChatGPT can do (Score:1)
Google fired an engineer who was saying the company's AI is sentient. The authors of these papers are almost in the same territory, thinking ChatGPT has intelligence. This i
Re: (Score:2)
Re: (Score:2)
But is that how they used it? Or did they do what journalists have been doing for a while: give it specific directions like "Write an introduction to a white paper on the subject of blah where we were testing for blah and blahblahblah...". You edit the technical stuff and insert your data, and massage it all until you have a nice readable paper. Think of using the AI as more of an editor and non-technical wordsmith, as sketched below.
I am suspicious that most people are reading way too much into their use of the term "author".
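For what it's worth, the editor-and-wordsmith workflow described above looks roughly like this in code. This is a minimal sketch, assuming the openai Python package (1.x client) and an API key in the environment; the model name and prompt text are placeholders for illustration, not anything the papers in question actually used.

```python
from openai import OpenAI  # assumes the openai Python package (1.x client) is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical prompt in the spirit of the comment above: ask for a draft
# introduction, then a human edits the technical claims and inserts real data.
draft = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; substitute whatever is available
    messages=[
        {"role": "system", "content": "You are a technical editor drafting plain prose."},
        {"role": "user", "content": "Write a short introduction to a white paper on "
                                    "measuring X under condition Y. Leave [DATA] "
                                    "placeholders where the results belong."},
    ],
)

print(draft.choices[0].message.content)
# A human author still reviews every sentence, fills in the [DATA] placeholders,
# and takes responsibility for correctness -- which is the publishers' argument
# for acknowledging the tool rather than listing it as an author.
```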