Education AI

College Grads Are Pursuing a New Career Path: Training AI Models (bloomberg.com) 34

College graduates across specialized fields are pursuing a new career path training AI models, with companies paying between $30 and $160 per hour for their expertise. Handshake, a university career networking platform, recruited more than 1,000 AI trainers in six months through its newly created Handshake AI division for what it describes as the top five AI laboratories.

The trend stems from federal funding cuts straining academic research and a stalled entry-level job market, making AI training an attractive alternative for recent graduates with specialized knowledge in fields including music, finance, law, education, statistics, virology, and quantum mechanics.
  • by Petersko ( 564140 ) on Friday July 25, 2025 @09:09AM (#65544532)

    It's a transitory need, and the role won't remain available in this firm for long, but make hay while the sun shines. Just don't use the income as a reason to assume a large mortgage.

    • Buddy of mine was approached by one of these companies and told them to piss off, feeling that he'd be training his own AI replacement. But I can't imagine they're doing much quality control so I don't see why you couldn't pee in the well instead.
    • by allo ( 1728082 )

      Why not? You may have a chance for longer than with simple coding. Making a GOOD tune of a network is not straightforward, and each failed try costs you real money.

      • by Junta ( 36770 )

        Because either:

        a) It works as intended and the job inherently fast-tracks self-obsolescence.

        b) It doesn't work as intended and this job evaporates as the hype money comes back down to earth.

        No matter how well/poorly this current technology goes, this is a job that is not set to be a career.

        Just like people claiming to be "prompt engineers": either the LLMs work and you are a useless middleman, or they don't work and people don't want to fool with you. Just like "webmaster" was a thing just by being able to edit HTML files and that evaporated in the early 2000s.

        • Just like "webmaster" was a thing just by being able to edit HTML files and that evaporated in the early 2000s.

          Judging by current websites, they're still around. Just had a web page not work in Firefox: the fields where you enter your credit card information never activated, so I couldn't enter anything. How many decades has the web been around, and we still can't get it right? Had to downgrade to using Edge at work and hope my info doesn't get spread to the four winds.

        • by allo ( 1728082 )

          Why should it fast-track into obsolescence? I don't see networks training themselves yet. Each run is costly, and for large networks you need human experience; a grid search for hyperparameters would cost way too much. People who are able to train AI will be among the last people in IT to get replaced, I suppose.

          • Because the task of training the model is already becoming commoditized. Yes, a handful of oversight roles will stick around for now, but most of the work won't require any special skill. It certainly won't support a workforce of moderately priced IT professionals. Like overnight sorting of transaction records for accountants being done in Mumbai, it's going to happen cheap, and probably elsewhere.

            • by allo ( 1728082 )

              Try training your own model and you'll see that it's not just starting a script and you're done. Even if you had a one-size-fits-all configuration, you still need to curate the dataset, control the outputs, and evaluate the resulting model. And whatever college grads are training can't be really good models either. A fine-tune is doable, but even for a good fine-tune you need someone who understands the math behind it to do good training. Just trying a few parameters and collecting a dataset for
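To make that concrete, here is a minimal, purely illustrative sketch of the three steps the comment above lists for a fine-tune: curate the data, run the training, evaluate the result. The toy dataset, model, and numbers are all invented for illustration; a real fine-tune would use far larger data and models.

```python
# Purely illustrative: a curate -> train -> evaluate loop on a made-up toy
# classification task (not a real LLM fine-tune). All sizes are arbitrary.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset, random_split

# 1) Curate: drop examples a reviewer flagged as bad before they reach training.
raw = [(torch.randn(16), torch.randint(0, 2, (1,)).item(), flagged)
       for flagged in [False] * 900 + [True] * 100]
clean = [(x, y) for x, y, flagged in raw if not flagged]

xs = torch.stack([x for x, _ in clean])
ys = torch.tensor([y for _, y in clean])
train_set, eval_set = random_split(TensorDataset(xs, ys), [800, 100])

# 2) Train: a small supervised loop; every run like this burns compute.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(3):
    for x, y in DataLoader(train_set, batch_size=32, shuffle=True):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

# 3) Evaluate: held-out accuracy decides whether the run was worth its cost.
model.eval()
with torch.no_grad():
    correct = sum((model(x).argmax(-1) == y).sum().item()
                  for x, y in DataLoader(eval_set, batch_size=32))
print(f"held-out accuracy: {correct / len(eval_set):.2%}")
```

Every knob here (hidden size, learning rate, epoch count) is a hyperparameter; grid-searching them on a frontier-scale model, where a single run costs real money, is why the experience mentioned earlier in the thread matters.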

  • by hardwarejunkie9 ( 878942 ) on Friday July 25, 2025 @09:21AM (#65544578)
    Can't get a job with your specialized knowledge? Why not spend your time trying to create your replacement? They're just going to turn around and drop it into services as a way to eliminate the need for actual expertise.
    • by mysidia ( 191772 )

      What if you train your replacement on a lot of incorrect info? Perhaps deliberately in order to make the replacement weak?

      For some reason I think the graduates interested in going into "teaching AIs" might not be the most experienced or brightest individuals in their field.

      Generally, new graduates are not your "experts" or seniors in a field, but the lowest of the low in knowledge and experience. Perhaps that's the reason they would be willing to take an hourly rate in the first place. And $160/hour with no

      • Someone trying to train AI agents on bad data soon won't have a job. It's not an unsupervised position where anyone can just toss bad data into the machine. Typically there are a couple of layers to the process: Person A provides training data, Person B makes corrections, and Person C verifies those corrections. To get bad data into the system, you'd need all three randomly assigned people working together to provide bad training data. Anyone not working in good faith would soon be caught and fired.
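For what it's worth, the layered review described above can be sketched in a few lines. This is purely illustrative: the role names, types, and rejection threshold are hypothetical, not any lab's actual tooling.

```python
# Illustrative sketch of the layered review: contributor A submits an item,
# reviewer B corrects it, verifier C accepts or rejects it, and contributors
# whose items keep getting rejected are flagged. Everything here is hypothetical.
from collections import Counter
from dataclasses import dataclass
from typing import Callable, List, Optional, Set

@dataclass
class Item:
    contributor: str            # person A, who supplied the training data
    answer: str
    corrected: Optional[str] = None
    accepted: bool = False

def review(item: Item, correct: Callable[[str], str],
           verify: Callable[[str], bool]) -> Item:
    item.corrected = correct(item.answer)    # layer B: independent correction
    item.accepted = verify(item.corrected)   # layer C: independent verification
    return item

def flag_bad_actors(items: List[Item], max_rejections: int = 3) -> Set[str]:
    """Contributors whose submissions are repeatedly rejected get flagged."""
    rejections = Counter(i.contributor for i in items if not i.accepted)
    return {who for who, n in rejections.items() if n >= max_rejections}
```

The structural point is the same one the comment makes: an item only reaches the training set after passing reviewers who are assigned independently of the person who submitted it, so poisoning the data requires all three roles to collude.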
  • Companies don't want any costs eating into their profits. They can't do it straight away, so they "put up" with some costs until they have a replacement. It used to be outsourcing to other countries, which destroyed millions of local jobs and created racism in the tech industry. Now they are eliminating all humans so they have a perfect automated money-printing machine.
    • And when there are no jobs and grads have $500K in loans they just don't pay off?

    • AI exists to allow wealth to access skill without skill accessing wealth.

      People keep asking what the rich will do when nobody buys their products.

      Do you want us to think they haven't thought of that?

      They are painfully aware that they are dependent on us filthy consumers and they are taking steps to sever that dependency.

      Right about now you should be asking what that means for you when they do.
      • It does worry me, because I'm quite sure they're too dumb to realize it makes their wealth not really worth anything anymore.
        • It does worry me, because I'm quite sure they're too dumb to realize it makes their wealth not really worth anything anymore.

          That realization will occur to them a few months after the killbots are sent out to deal with the peasant problem. Of course, it will be too late then, but lessons are rarely learned by humanity the easy way.

            Electricity at all for our houses, since it's all going to go to AI and robots by law. As far as a way around that, there really is only one, and it's to change how we teach: focus on critical thinking and on improving children's capabilities to the maximum they can reach. But that's at odds with cost cutting and a desire to create effective corporate robots as fast and as cheaply as possible. It's too expensive to move a warehouse. Robots are devouring jobs.

            Tech workers and customer service worke

      • Rich people aren't nefarious super geniuses, they are stupid assholes lying to themselves about being assholes just like everyone else (present company excluded).

        They want you and your kids to own nothing and be happy while they and the other filthy rich lizard people rule magnanimously and fairly, that's about as nefarious as they get (of course being stupid assholes they overestimate both their ability to rule and their ability to accomplish that rule, so it's not really worth worrying about).

    • by mysidia ( 191772 )

      Companies don't want any costs eating into their profits. They can't do it straight away, so they "put up" with some costs until they have a replacement.

      AI inference is not free; it definitely has a cost, and the AI services companies are not selling their product for free. In fact, the price of these AI services comes out as astronomical, ever-increasing subscription fees.

      No matter how much you train the AIs, they do not think like humans do, and they will be extremely error-prone, constantly coming up with delusions; at be

      • by HiThere ( 15173 )

        Sorry, but even now the AIs often come up with answers no human has ever thought of. It's not that you're wrong about the errors; you aren't. But they have working original insights, and they design working machines that no human understands (admittedly this may be because not too many people have studied them).

        Yes, AIs are currently vastly inferior to people, but they also don't think the same way. (Or even the same as each other.) AFAIK, nobody has a good explanation for this effect yet; I sure don't. But

  • by Pseudonymous Powers ( 4097097 ) on Friday July 25, 2025 @09:25AM (#65544602)
    From the makers of Babies Having Babies comes Dummies Training Dummies.
    • Re: (Score:2, Insightful)

      by drinkypoo ( 153816 )

      Nailed it in one. They're using people with no experience to train their software... so that it can make more mistakes?

      • Don't lie.

        FTFA:
        "Handshake wasn’t looking for coders. It was advertising for music graduates like Spellman, as well as plant biologists, chemists, educators, physicists and food scientists, among other experts, with hourly wages ranging from $30 to $160 an hour depending on the discipline and the level of expertise being sought."

        The point is to create high quality training data 'curated' by people with knowledge in the relevant fields.

        • You, sir, are the one who is lying, or at best cannot read. It says right there that they're looking for graduates, not experienced workers. They could get the same information they will get from them by training on the course materials.

          • There is a wide gap between "no experience" and "experienced workers". Graduates are positioned quite snugly in that gap and qualify as having experience (albeit limited). Your comment implied that the people in question were just random people off the street.

      • Nailed it in one. They're using people with no experience to train their software... so that it can make more mistakes?

        Perhaps, but a quick check of open contract positions shows a PhD, or enrollment in a PhD program, as a prerequisite; so it's not like they're getting some kid who just got an undergraduate degree where AI wrote all the papers. I suspect experts with years of experience would be too expensive, although retired ones might be enticed just as a side gig.

  • by Anonymous Coward
    The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In any case, most actual fighting will be done by small robots, and as you go forth today remember your duty is clear: to build and maintain those robots.
  • And they said AI was taking our jobs! Quite the contrary.
  • career,
  • sawing the branch you're sitting on.
