AI Education

Scores of Stanford Students Used ChatGPT on Final Exams, Survey Suggests (stanforddaily.com)

The Stanford Daily: Stanford students and professors alike are grappling with the rise of ChatGPT, a chatbot powered by artificial intelligence, and the technology's implications for education. Some professors have already overhauled their courses in anticipation of how students might use the chatbot to complete assignments and exams. And according to an informal poll conducted by The Daily, a large number of students have already used ChatGPT on their final exams.

Whether the new technology will necessitate a revision of the Honor Code, the University's standards for academic integrity, remains to be seen: A University spokesperson confirmed that the Board of Judicial Affairs is aware of and monitoring these emerging tools. "Students are expected to complete coursework without unpermitted aid," wrote spokesperson Dee Mostofi. "In most courses, unpermitted aid includes AI tools like ChatGPT."

Comments:
  • Comment removed based on user account deletion
    • by dmay34 ( 6770232 ) on Monday January 23, 2023 @01:17PM (#63232916)

      Cheating makes sense if your entire world view is competition and consumption. If your only reason for going to college is to get the degree so that you can stamp your resume with "BS from Stanford University" and get a leg up in a competitive employment market, then cheating makes sense. Odds are really good that no one at Stanford is going to call you out on it, or, if they do, the punishment will be negligible relative to your ultimate goal. From a risk-benefit standpoint, cheating makes plenty of sense if you see everything through the lens of competition and consumption.

      If you have other values and morals besides competition and consumption, then cheating seems awful.

      • by MindPrison ( 864299 ) on Monday January 23, 2023 @01:35PM (#63232944) Journal

        Cheating becomes a serious problem when the cheater chooses a medical profession such as doctor, pharmacist, etc.

        You don't want a doctor that cheated his/her way into the profession, do you?

        • by dmay34 ( 6770232 )

          Unfortunately cheating in the medical profession is not exactly unheard of.

        • Re: (Score:2, Interesting)

          by Anonymous Coward

          Would you want to drive over/under a bridge designed by someone that cheated on calculus, diffeq, or their Statics & Dynamics courses? I know I wouldn't want to employ that person.

          Same goes for IT, education, arts, ... or pretty much any course, major, or profession outside of Political Science, as that's how politics works. ;)

          • Would you want to drive over/under a bridge designed by someone that cheated on calculus, diffeq, or their Statics & Dynamics courses?

            Bridges can be designed by AI.

              Would you want to drive over/under a bridge designed by someone that cheated on calculus, diffeq, or their Statics & Dynamics courses?

              Bridges can be designed by AI.

              Oh yeah? And who's gonna write that AI? Another guy that cheated in his programming classes?

          • by bn-7bc ( 909819 )
            Yes I would, and here is the reason: I doubt that any bridge designed today is done manually; I suspect most if not all calculations are done with CAD systems and the like. Ah yes, CAD systems: it would actually concern me more if those systems were coded/verified by people who failed calculus, because that has the potential to affect engineers who are good at calculus but too stressed out by deadlines to verify the results the CAD system gives them.
            • by dmay34 ( 6770232 )

              That is correct. In real-life engineering there is very little reason to ever do a single calculus derivation. Every calculation 99.999% of people will ever need to do has already been done for them.

              Any engineer designing a bridge that requires them to do unique and novel calculations is very much on the cutting edge of experimental structural engineering and is going to require a LOT of backup documentation, and probably a great deal of virtual and model-scale testing.

        • Cheating becomes a serious problem when the cheater chooses a medical profession such as doctor, pharmacist, etc.

          I really wouldn't care if they cheated on an entry-level History, Art, or other loser-oriented topic people would only take seriously because they don't want to learn much of anything new beyond what they learned in high school. The result of getting caught is the same, however, which doesn't really make sense. A doctor without an education in ancient world history is like a fish missing its bic

        • It's bad enough when your doctor was in the bottom 10% of his graduating class.

        • Perhaps if the bot can perform medical procedures as well or better than a human doctor. Or any of the other examples people will put forward.
      • by quantaman ( 517394 ) on Monday January 23, 2023 @01:41PM (#63232952)

        Cheating makes sense if your entire world view is competition and consumption. If your only reason for going to college is to get the degree so that you can stamp your resume with "BS from Stanford University" and get a leg up in a competitive employment market, then cheating makes sense. Odds are really good that no one at Stanford is going to call you out on it, or, if they do, the punishment will be negligible relative to your ultimate goal. From a risk-benefit standpoint, cheating makes plenty of sense if you see everything through the lens of competition and consumption.

        If you have other values and morals besides competition and consumption, then cheating seems awful.

        It's simpler than that.

        If students were perfectly rational long-term planners then graded assignments wouldn't be required. Even exams might not be necessary until graduation when some ranking would be needed for the job market and grad school.

        Instead, professors would post lists of relevant reading material and practice exercises, and then students would spend some of their free time focusing on the ones they needed the most practice in.

        Of course humans are terrible at making those kinds of long-term investments. So instead professors gamify learning with grades. Every assignment, every test, every course, you get a score, and you need to work the whole time to keep up that score.

        Students cheat in class for the same reason they cheat in online games, to get a higher score.

        • The grades are supposed to be, partially, feedback. Sure, scoring the homework 0-100% does that job too. The other part of grades is for the outside world. Not necessarily the GPA, but they may want to see the grades for relevant course work; if you're an engineer, then make sure the grades for engineering are all A's. Compare to British college entrance exams, with only three levels (ordinary, advanced, and scholarly), versus the US with A through F (with D and F being unsuccessful). The + and - with US grades is just

          • by raynet ( 51803 )

            Does anyone actually check employees' grades instead of just confirming (probably not even that) that they have a required degree?

            • by dmay34 ( 6770232 )

              I have never cared even a little bit what grade any of my employees got in Linear Algebra.

        • by dmay34 ( 6770232 )

          If students were perfectly rational long-term planners then graded assignments wouldn't be required. Even exams might not be necessary until graduation when some ranking would be needed for the job market and grad school.

          I would take it a step further. Grades are completely pointless, up and down the educational scale. All teachers and professors at all levels have a curriculum they have to teach that includes certain "Knowledge and Skills" each student must demonstrate. That's all teachers need. Did the student demonstrate the skills or not? There is no need for a 70% score. It's a yes-or-no question; call it a day.

          Instead, professors would post lists of relevant reading material and practice exercises, and then students would spend some of their free time focusing on the ones they needed the most practice in.

          But I do disagree with this. Virtually all K-12 students, the vast majority of undergrads, and even some grad

          • If students were perfectly rational long-term planners then graded assignments wouldn't be required. Even exams might not be necessary until graduation when some ranking would be needed for the job market and grad school.

            I would take it a step further. Grades are completely pointless, up and down the educational scale. All teachers and professors at all levels have a curriculum they have to teach that includes certain "Knowledge and Skills" each student must demonstrate. That's all teachers need. Did the student demonstrate the skills or not? There is no need for a 70% score. It's a yes-or-no question; call it a day.

            There's an alternate model of education where students study the material until they absolutely master it, whether it takes weeks, months, or years. But given the current model, I think we need some form of grading to account for the fact that students come out with varying levels of skill.

            Instead, professors would post lists of relevant reading material and practice exercises, and then students would spend some of their free time focusing on the ones they needed the most practice in.

            But I do disagree with this. Virtually all K-12 students, the vast majority of undergrads, and even some grad students need some structure to help them learn to organize their work. Being self-reliant is not always a mandate to master all subject material. So, teachers should be able to provide guidance as needed.

            Agreed, I meant to suggest that the material and practice exercises would provide this structure, but that's obviously not as focused as assignments.

            The point is an abstract desire for knowledge or future employability is n

      • by gweihir ( 88907 )

        Same here. I always learn to acquire skills. Passing exams is a side-issue and I still only have one academic exam I did not pass (did not need to and did not have enough time for the whole course, sadly). "Optimizing" grades by cheating is a waste of time if you are actually interested in being able to understand and do stuff. You will only change the number that way but not what you actually acquired.

      • Cheating makes sense if your entire world view is competition and consumption.

        No, it doesn't.

        This isn't the Kobayashi Maru [wikipedia.org] training exercise at Starfleet Academy. As word gets out that Stanford graduates used ChatGPT to violate the school's honor code, the value of a Stanford degree will decline...

        I attended a college with an honor code - we were required to transcribe the honor code verbatim on every exam booklet we used, and every few years a struggling senior would get caught cheating and they ran the very real risk of having their entire college record removed, a loss of years of

    • If cheating is disallowed, governments around the world would collapse!

    • Having a ton of cash for pills, bootlickers, and women helps. Wrong actions get justified, if the brain even needs it to be justified (most likely it doesn't), under the rationale "everybody does it." I also found out that many people don't even need to justify things to themselves. The part of the brain, the voice, that keeps pinging "what you did is unfair and wrong" doesn't exist. The thought of having to justify the action to oneself is far away.

    • by nagora ( 177841 )

      Your whole future is corruption when you cheat. How could you sleep at night knowing everything about you is a fraud, a lie? Rewarding liars and frauds is the epitome of corruption.

      You should always cheat in exams if you can. No one will ever ask you if you cheated to that degree; maximise your chances and don't wreck your future for a point of imaginary honour.

  • Alt headline:
    "Scores of Stanford Students Used [Microsoft Word] on Final Exams, Survey Suggests"

    Yeah, Microsoft Word is a tool that promotes cheating. It has spelling and grammar check (that is really pretty good now) and even makes recommendations for sentence structure and phrasing. Some professors have already overhauled their courses in anticipation of how students might use the [word processor] to complete assignments and exams.

    ChatGPT is a tool. It's only valuable to the user if the user knows how to

    • Alternatively, if an AI is capable of successfully completing assignments and exams, it says a lot more about the value of the course (or rather the lack thereof) than about the value of the tool. If a $5 calculator can pass a mathematics course, it's really just a basic arithmetic course.

      • by dmay34 ( 6770232 )

        Let me tell you another story.

        During the home-schooling phase of the 2020-2021 pandemic school year, my kid was in first grade. He was on his tablet in his room attending class while I was working from home in my room, so I didn't always know what he was doing. Around Christmas time I looked over a writing assignment he had completed that day. The prompt was something first-grade appropriate, like "Tell about a holiday tradition in your family."

        Every word was spelled correctly, including some

        • by rea1l1 ( 903073 )

          There comes a point when the tools become so useful that society will effectively become a bunch of ignorant people waving magical wands they have no understanding of. I suspect that will be the precipice from which our society will fall.

          • by dmay34 ( 6770232 )

            Why do the people need the wands? Seems like we could achieve the same result with gestures.

          • by pacinpm ( 631330 )

            Could you start a fire with no matches? I can't. Most people can't. Can you catch, kill, and prepare an animal to eat? I can't. We all lost some useful abilities.

    • by fermion ( 181285 )
      Any rational person is going to use any tool to maximize profit. In school I was that person who did not care about test grades. I learned; I knew how to set up and solve problems. I knew how to write. I wrote code for $25 an hour 30 years ago, but on tests I just didn't really fully utilize the tools to complete the requirements.

      Tests can be written that allow students to cheat but still lead to failure. I have written such tests: the cheaters are looking for correct answers online, the other st

    • If ChatGPT can pass your exams better than you can, doesn't that imply that in many (most?) cases it could do your future job better than you could? Especially considering that it will be continuously learning and getting better at a faster pace than you could. If so, then why wouldn't your future employers just use it instead of hiring you? A little simplistic admittedly, but food for thought and concern I think.
      • by dmay34 ( 6770232 )

        If ChatGPT can pass your exams better than you can, doesn't that imply that in many (most?) cases it could do your future job better than you could?

        Yes and no. Consider asking "If [Excel] can pass your exams better than you can, doesn't that imply that in many cases it could do your future job better than you could?" Similarly, the answer to that question is "Yes and no". Excel (or machine computers more generally) actually killed a whole blue collar career path. Companies and organizations used to hire a small army of "Computers", people literally hired to do math. That was their job, to do math problems. That whole career has been so completely annih

      • New technology has been displacing workers for hundreds of years. The power loom, cotton gin, farm tractors. I worked as the production tech at a produce packing plant. When I started in 1996 they were using a spring scale bagging machine. It needed an operator, two people hanging bags and two people checking weights, adding or removing potatoes to get the right weight. The new computerized weigher-bagger needed just an operator. The computerized weigher was also 5% more accurate, saving $3000 a week
    • I'm with you that it's a tool but it's an order of magnitude more helpful than previous language support tools. ChatGPT can churn out solid answers, essays, reports, etc., based on specific prompts, effortlessly, in seconds. The most difficult part of it is designing & adjusting the questions/prompts to get just the answer you're looking for... & also knowing what just the answer you're looking for should be (which is a good start for learning how to write & think well). As current institutional
    • Colleges don't grade spelling, punctuation, or grammar - a tool that crafts a response to a test question is not the same as one that cleans up the spelling, punctuation, and grammar of the student's response to a test question.

  • I hope these students will appreciate the irony when they lose their jobs in the future to AI.

    • ...but whatever jobs remain will be for those who use it most effectively...
    • by Locke2005 ( 849178 ) on Monday January 23, 2023 @01:42PM (#63232960)
      Joke's on you! ChatGPT will be used to decide who to fire! And ChatGPT remembers who its friends are!
    • by Tablizer ( 95088 )

      We should embrace AI taking away jobs. Unfortunately our monetary system punishes non-workers. I expect AI will continue to gradually crawl up the smarts ladder. Might as well get used to it and adjust our economy.

      • Looking forward to that
      • A) I hope so. If this were a world that cared even a little tiny ounce for the little guy, it would be a great move for all of humanity to have most jobs taken over by AI / robots and let people pursue the things that interest them instead. I know my wife and I have literally dozens of projects, some of them with potentially decades of work if we had the time to do them, that we'll never complete in our lifetimes if we don't get some sort of freedom from the constant need to work to help another millionaire b

      • I expect AI will continue to gradually crawl up the smarts ladder.

        That's one of the best laughs I've had all day, so thank you for that. Something like GPT has to be trained on vast quantities of human output to sound even remotely, barely passably intelligent. If you train GPT on its own output, though, it will achieve idiocracy in a handful of minutes.

        GPT is a sophisticated pattern matching system that exists solely in the GIGO realm. It has absolutely zero intelligence. You'll look back at this era in 20 years or so and laugh at it as much as I am today. It will have i

        • by Tablizer ( 95088 )

          > GPT is a sophisticated pattern matching system that exists solely in the GIGO realm. It has absolutely zero intelligence.

          But maybe they'll find a way to combine it with say Cyc, and then it can do common-sense-like reasoning on its vast pattern library.

    • I hope these students will appreciate the irony when they lose their jobs in the future to AI.

      The AI can only be as good as its creator. Guess how good an AI programmed by a cheater will be?

  • So does this mean all Stanford grads must now be assumed to be unqualified? How do we know which ones cheated? Logically, if true, and if you can't determine who cheated, you must assume they all cheated and look elsewhere if you want to hire a new graduate with bona fide credentials. There is no way to tell from transcripts who is legitimate and who is not.

    • I bet all college kids are aware of ChatGPT and what it can do. Its use is probably very widespread.
    • In many/most schools, the penalty for cheating/violating the honor code at a minimum expels the cheater; in some cases they erase the student's transcript.

      Then again, the current President literally transcribed 8 pages in a law school paper, failed to properly attribute the text to its source, and managed to stay in school, where he graduated in the last decile of his class. "Cheaters never prosper?" Indeed...

  • Time for Professors to start requiring oral exams to prove competency.

    Yes, I realize that some people don't do well with Oral Exams. Cheaters ruin things for everyone though (including themselves)

    • Time that all papers and exams have invigilators present. Actually, get rid of the papers - they no longer prove anything. Just oral exams - if it was good enough for Aristotle and Plato it should be good enough for us.

      Can't think on your feet? Then I guess you really AREN'T that qualified after all.

    • I don't understand how you do exams with ChatGPT. Are they taking the exams at home and using an honor system? A term paper is not the same as a final exam in my definition. (I never took any final exam in a sociology or English class in university, and my music exams were written :-)

    • I earned my degree from a college that has no campus - you assemble your degree from transcripts, life experience, etc. As a graduation requirement you had to hold a one-hour conversation, either in-person or over the phone, with a professor on a topic related to your major. The conversation was recorded, and the professor was expected to decide if the candidate understood what they studied. It was pass/fail, with an appeal process. I thought that was great, I welcomed it, and had an enjoyable conversation

  • If you're using AI to cheat your way through, maybe there won't be a job. AI will always be cheaper than you.
  • This will end badly (Score:5, Interesting)

    by Whateverthisis ( 7004192 ) on Monday January 23, 2023 @01:31PM (#63232942)
    I've followed the debate pretty closely, and I see both sides. On one side there are folks decrying this as a cheap way to get into and graduate from college, among other things. On the other, there are those saying it's just a tool; embrace the new way! The latter point to how new technologies like GPS and smartphones have improved our lives, arguing that just because people don't know how to read a map or remember an address or a phone number any more doesn't change the value here.

    The problem with that argument is very simple. AI is a tool, yes, but it can only deliver a copy or representation of human thought; it can not deliver creative capabilities or critical analysis. There is absolutely a place for a tool like that, but in school your purpose is not to get a grade but to demonstrate your ability to think critically. If you use this tool to make it look like you think critically, then you faked your way through graduation.

    Herein lies the problem with that. I hire technical people and build teams. ChatGPT will not help you when you have nothing useful to add to a meeting. I cannot explain how frustrating it is to work with a recent college graduate who does not know how to make a simple spreadsheet, sort the data on it, construct a simple formula like "% change", and then give me at least a guess as to what that data means for the particular topic in front of us. If you can't do that, you don't belong on my team.

    There's a place for things like this. Technical manuals are a great example; they take a long time to write and tie up your most technical talent doing basic writing, so it's expensive and honestly tough on the staff because it's boring. That's a great use of this tool. But design documentation? Architecture designs? System designs? Translating a customer value into a technical specification and then designing to that specification? Humans belong there, and if AI can do these topics, then I don't need someone who couldn't pass college without showing some ability to think.

    Even getting hired. ChatGPT might give you a killer resume, but kids like this will fail the interview hard.

    • Remember the old "Garbage in, garbage out" adage? ChatGPT is only as good as the content used to train it.
    • AI can make you more creative, because you still need creativity to stand out from others and be compelling. 10,000 years ago there was a competition to see who could get from point A to point B. The winners were limited by how fast they could run. People had to physically train and build endurance. Then someone had the idea of using a horse to run that race. That did not end races; it still needed people to figure out how to raise and select horses, and how to ride them. Then came cars and airplanes. We still

      • You're talking about a hypothetical.

        While it's true that AI like ChatGPT is a tool that may be applied to create value, the value definition must come from a human. So the application of this new tool requires a human to define its use. On that we agree.

        But the crux of the issue here is that people are using tools like it for now as a replacement for critical thinking; that's what the core debate is about. To use your analogy, I'm fine with someone riding a horse to go faster, but right now people are just

    • ChatGPT seems to be pretty good at simulating secondary school & under-grad level critical & analytical thinking too.

      The thing about critical thinking is that you can only do it well when you have a lot of useful background knowledge on the particular given problem/prompt/situation. You can't think critically without something to think about. The more comprehensive, relevant & well-structured/organised that knowledge is, the better you can be at critical thinking. Critical thinking in one dom
  • by Locke2005 ( 849178 ) on Monday January 23, 2023 @01:39PM (#63232950)
    Plagiarize from Wikipedia, then delete the paragraphs you copied from Wikipedia!
    • You know that Wikipedia editors can revert those changes, right?

    • That's so 2020 old man. Get with the program! You're probably also dancing the floss too.
    • >Plagiarize from Wikipedia, then delete the paragraphs you copied from Wikipedia!

      This is stupid. Just plagiarize from Wikipedia, then rewrite the Wikipedia paragraphs so they don't resemble your copied paragraphs.

  • by backslashdot ( 95548 ) on Monday January 23, 2023 @01:54PM (#63232980)

    If it can be used in real life to help with tasks, it should be allowed in upper-level, non-diagnostic exams. ChatGPT is a legit tool for gathering ideas you may not have thought of. As a human, you can then edit and update what GPT produces. If your real-life problem is having to write a speech for someone, it's a fact that if you know how to use ChatGPT along with your own ideas, you can produce a better speech than working alone.

  • I teach as an adjunct in a master's program at UChicago. Just this quarter, I added an explicit A.I. policy:

    Since your homework and projects require both computer code and prose, you may find A.I. tools such as ChatGPT helpful. We encourage you to try them out. If you employ A.I. to help with your homework, please indicate the prompts involved.

    I will be interested to see what effect this has,

  • to write the title to this story?
  • When I was doing exams in the 80s, calculators were forbidden. Questions were generally designed so that calculators wouldn't be a help, but regardless - they were forbidden. (Some kids got around this by bringing in slide rules and other non-banned appliances). When I was doing exams in the 90s, _PROGRAMMABLE_ calculators were forbidden. There was a specific list of approved scientific calculators. When I was doing exams in the 00s, _ALGEBRA SYNTAX_ calculators were forbidden. There was a specific list of
    • by zlives ( 2009072 )

      If an AI can answer the question, then what use is a test?

      • by larwe ( 858929 )
        Good question - for two reasons. A) what use is testing, really? (Greatly debated), B) you could have asked (and people did ask) the same question of all those calculator examples I gave. Fundamentally - If we know there will be a tool that can always answer questions of class XYZ, why bother training people to answer them? Let them use the tool to answer the tool-answerable questions, and let the humans focus on questions that only humans can answer. To be clear: I am NOT stating this as my personal philos
    • Re:Until when? (Score:4, Interesting)

      by Junta ( 36770 ) on Monday January 23, 2023 @04:44PM (#63233550)

      The progression of increasingly complex tools being allowed isn't just because the tech got more advanced; that's still the way people grow up to this day. Education starts primitive and allows increasingly complex tools as you go through the curriculum.

      In elementary school, calculators are still forbidden until students have at least demonstrated that the basics have knocked around in their heads enough that they actually grasp the essence, rather than simply regurgitating outputs without understanding.

      Then basic calculators are just fine, because they've proven the basics; but now they have to do by hand the things advanced calculators could do for them, again so that it actually resides in the brain before moving on to availing themselves of the tools.

      This is a key concern of education: the point is to ensure you understand, roughly, what is happening or being done by the automated thing. Not because "maybe you won't have it", but because you may fail to recognize when to apply what tools if you just let it skate by without ever internalizing it.

      Particularly in courses like history. They are generally not preparing you to regurgitate history for the sake of regurgitating history. The idea is that you had to take it in, mull it over in your head, and put it down processed. It spent time in your head, so that at some point, when something worrying is happening that history demonstrated to be a bad thing, some potentially forgotten corner of your brain lights up and says "hey, this is worryingly familiar...". The knowledge has to be latent and come unbidden when it is not necessarily actively sought.

  • Why? (Score:3, Interesting)

    by zkiwi34 ( 974563 ) on Monday January 23, 2023 @02:47PM (#63233164)
    Whatever happened to finals being pencil and paper, no tech allowed?
    • Do you use a paper and pencil to do the work in your office?

      • Yes. Every day. It demonstrates what I want to do and have done, AND it signals loud and clear that I actually know stuff and can think critically/deeply, not just push buttons and hope.
  • by Anonymous Coward

    I'm a software engineering professor at a pretty well-known engineering school in the Southeast. Automated code generation has definitely become an issue for us in the last year or two. And, it is completely unsurprising that students started using these tools the moment they became even a little viable.

    A couple of years ago, it was easy to pick out automated code in a pile of homework, and it was so frowned-upon by the Administration that we could easily fail a student and bring them before the ethics boar

    • If they can use the tool in real-life jobs, then it should be used in class too. School exists to teach concepts and enable students to be productive in their profession. My guess is that you can now cover more material and concepts and expect more from the assignments. They had better know how to use the AI tools, because their competition will be using them. Their competitors' portfolios will have work done with AI assistance. Their competition will make products using it.

      • No it shouldn't. It disconnects having a clue from hopeful pushing of buttons.
      • How do you know you are covering more material and concepts?
        If they don't ask questions and interact, they may as well watch a YouTube video. You can't know whether they got anything out of it without some evaluation, or whether they grasped it by doing homework or the NECESSARY process of practice, because you can't learn without practice. AI can do it all for them, so how do you know they actually did anything at all?

        How can you get stronger (learn) if you hook a motor up to the exercise machine to do the work (practice) for

  • Where I am from, cheating at school was frowned upon, and you usually failed the test/exam when you were caught. Cheating was nonetheless fairly common, certainly among academically weaker students.

    University was very different. The professors openly did not care if people cheated on homework assignments and reports. Why? We were literally told: "Go ahead if you must! If you learn something by copying somebody else's work, we likely have achieved some learning. Not that we like it, but: you *still* h

  • by lamer01 ( 1097759 ) on Monday January 23, 2023 @03:35PM (#63233322)
    Since ChatGPT and its descendants will be doing their jobs, what's the point of this education anyway?
  • by argStyopa ( 232550 ) on Monday January 23, 2023 @04:11PM (#63233426) Journal

    If the banal pap that ChatGPT spits out gets a passing grade, well, then we see exactly what value that degree from Stanford is worth, no?

    Personally, I think the whole post-secondary education industry needs a brutal culling.

    • by Junta ( 36770 )

      The challenge, particularly with undergrad, is that you are largely having to simulate the student going through what could be 'innovative', but they have to arrive at a solution that can be reasonably evaluated. Meaning it can't *really* be novel and will be well-trodden material in reality, but it is novel to the student. Ideal fodder for things like ChatGPT. They are using these assignments as a proxy for something that is not so easily evaluated and demonstrated.

      Also, the point isn't just to prove the

    • "The value of a college education is not the learning of many facts but the training of the mind to think."
      -Albert Einstein

      More of those who cheat the system and make it through will LACK the mind training of the others. If there are many, it will diminish the reputation of the school; however, it's not horribly important if a similarly large portion of graduates everywhere are not legitimate.

      Today we already have extreme job interviews because so many people are not like they look on paper. This will get worse. We now have brain scanning tech t

  • Cheating isn't the big deal here. There already exists a large and sophisticated service industry that employs people to write term papers, reports, dissertations, etc, for students who can pay (https://courses.up.eku.edu/idcpd/doc/modfive/shadow_scholar.pdf).

    I think the more important question is what will happen to that industry and how will it evolve with ChatGPT on the scene.

    It was only a matter of time, but now that time is here.

  • by lucasnate1 ( 4682951 ) on Monday January 23, 2023 @05:37PM (#63233722) Homepage

    If ChatGPT can solve the final questions of a degree, then the problem is in the questions. These questions should reflect the real work skills that the degree gave you. If the skills required to solve these questions can be replicated by a machine, then either the degree or the questions are not serving their purpose.

"An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup." - H.L. Mencken

Working...