The Problem With Letting AI Do the Grunt Work (theatlantic.com)
The consulting firm CVL Economics estimated last year that AI would disrupt more than 200,000 entertainment-industry jobs in the United States by 2026, but writer Nick Geisler argues in The Atlantic that the most consequential casualties may be the humble entry-level positions where aspiring artists have traditionally paid dues and learned their craft. Geisler, a screenwriter and WGA member who started out writing copy for a how-to website in the mid-2010s, notes that ChatGPT can now handle the kind of articles he once produced.
This pattern is visible today across creative industries: the AI software Eddie launched an update in September capable of producing first edits of films, and LinkedIn job listings increasingly seek people to train AI models rather than write original copy.

The story adds: The problem is that entry-level creative jobs are much more than grunt work. Working within established formulas and routines is how young artists develop their skills. The historical record suggests those early rungs matter. Hunter S. Thompson began as a copy boy for Time magazine; Joan Didion was a research assistant at Vogue; directors Martin Scorsese, Jonathan Demme, and Francis Ford Coppola shot cheap B movies for Roger Corman before their breakthrough work. Geisler himself landed his first Netflix screenplay commission through a producer he met while making rough cuts for a YouTube channel.

The story adds: Beyond the money, which is usually modest, low-level creative jobs offer practice time and pathways for mentorship that side gigs such as waiting tables and tending bar do not.

Further reading: Hollow at the Base.
Re: (Score:2)
I don't think the man himself would say he was a role model nor does anyone put him up as a role model except for a certain subset of writers.
Also the guy lived to 67, wealthy and famous with a family and went out on his own terms. Outside of moralizing about drug use what was so bad exactly?
Re: (Score:3)
People idolize celebrity musicians who sing about (and do) drugs and sex and crap all the time.
He was cool because he lived the way he wanted... stoned most of the time, and didn't give a crap what anyone said.
Re: (Score:3)
He was also a very talented writer, and so many of our most celebrated authors were druggies and alcoholics that it's an easy joke [frinkiac.com]
Re: (Score:2)
Aww link got busted but it was supposed to be "The Drunken Irish Novelists of Springfield"
https://frinkiac.com/meme/S08E... [frinkiac.com]
Re: (Score:2)
Sir: While Mr. Thompson's life choices were less than optimal, I defy you to find a better motorcycle road test than the Song of the Sausage Creature:
http://www.latexnet.org/~csmit... [latexnet.org]
And here is Cycle World's posthumous write-up of same:
https://www.cycleworld.com/201... [cycleworld.com]
Re: (Score:2)
I know all about HST. I read all the books, I did all the drugs.
For anyone who doesn't get irony or sarcasm, I'll just say it straight so you can understand: There is no rational or polite world in which he could possibly be considered a good role model.
Was he a good journalist? Maybe. Was he a good writer? Probably; I would say so, having read a few thousand pages he wrote. Again, let's be straight up: the man abused drugs FAR beyond excess.
Re: (Score:2)
Amen. I owned and rode a 1977 Ducati 900ss, an ancestor of Mr. Thompson's ride. It was bullet-fast and had the turning radius of an aircraft carrier. Every time I took it out for a frolic, I was dead straight stone sober. Every. Single. Time.
But at least Thompson did not do a Jackson Pollock pavement painting. My sincere apologies if I caused you any distress.
Re: (Score:2)
I loved the Gonzo Papers and all of that Fear and Loathing stuff, and a few other books... I'm just pointing out that he was an interesting character, very thought provoking, insightful, very entertaining, stupidly fearless, arguably an innovative journalist and writer... but by no means a person that you should emulate or even want to emulate. A role model or hero he is not. He's much closer to the class clown saying "hold my beer".
Yeah, all the guys I knew who
The real problem (Score:2)
Future AI will change everything in unpredictable ways.
Pundits, hypemongers and lying salesweasels have their own predictions, usually directed toward investors.
Articles like the one posted are based on speculation about how AI will fit into the old way of doing things.
The real future will be different, very different.
Re: (Score:2)
Future AI will change everything in unpredictable ways.
Oh, I don't know that's true, unless by "future AI" you mean "a hypothetical thing that we currently do not have, nor have any prospects of having anytime soon". The software currently labeled "AI" is - I think - predictable.
They fall into four categories:
1} Language automation.
2} Visual media automation.
3} Audio media automation.
4} Statistical analysis/modeling tools.
#1 through #3 pretty obviously will do what physical robots did to manufacturing. Anyone working in those fields is about to be replaced.
Re: (Score:2)
Re: (Score:2)
Writing software, in case you aren't familiar, can do pretty much everything knowledge-related and produce any digital artifact. Still more: software can generate control instructions for physical media - like a forklift - just as easily as it can crank out a PDF, and we have language models that can already generate the code for that. So think bigger.
I hear you, but I don't personally categorize that as "bigger". It's just more of the same. Writing code is just language automation. Writing code to control physical objects - like Boston Dynamics' Spot - is just writing code. Its results are still - in my eye - predictable. Not at the specific item level; I can't tell you what the name or number of ailments that will have treatments by 2040 because of this, but I can tell you Big Pharma will get bigger. I can't tell you how extensive job loss in the
Re: (Score:2)
Re: (Score:2)
Actually, thought processes will move to the lowest level you can find.
"Things we didn't think were practical" = standard human flying like Superman, growing two more arms so you can play two hands of poker against your buds, driving your Ford Fiesta faster than a drag racer? What things weren't practical before? Having your job replaced by a darn (still not sure about swearing on here) computer which also drives your car (so, do you even need a drivers license anymore?), not having said job = lack of mon
Re: (Score:3)
Yeah, for example, the copy writing example of TFA: instead of the interns doing copy writing for a column or two, they will be in charge of operating ChatGPT to write 10 of them at a time, or similar things. The scale of work possible by one person will increase.
This increases quantity of output at the cost of quality. You don't want interns to crank out 100 columns of shit-work in an afternoon. You want an intern to struggle for a week to come up with 1 column that is good enough for their editor to accept. You want juniors to submit their work to a mentor who marks it up, hands it back and tells them to do it again until they produce good work. That is how they progress in skill and grow in responsibilities.
Using better tools is not a bad thing - it is just
Sounds like a feature! (Score:2)
Re: (Score:2)
Re:Sounds like a feature! (Score:4, Insightful)
the bots have trouble now for sure, but look back 5 years and tell me you thought any of this was possible back then.
I would think the vast majority of us on Slashdot would actually respond to that with: Yes, yes we thought this was possible back then. However, we would have to admit that we anticipated systems that were a lot less terrible than the ones we've ended up with.
Re: (Score:2)
Re: (Score:2)
I think this is revisionist. Look up AI articles from 10-15 years ago and the idea of conversational/generative AI will have been pooh-poohed here on Slashdot.
As for "terrible", AI has made amazing strides in a short time. What is terrible is that the rush to market has resulted in half baked products being pushed out. But given what has improved in just the past year alone (still remember AI generating images with 12 fingers and 14 toes?) I really question your view of the product being "terrible". Put your im
Re: (Score:2)
I think this is revisionist. Look up AI articles from 10-15 years ago and the idea of conversational/generative AI will have been pooh-poohed here on Slashdot.
OK. So we'll just ignore the fact that the time frame up for discussion was 5 years ago, but somehow I'm the one being "revisionist" when you move it to 10-15 years ago? OK, let's go back three times longer.
15 years ago, we already had things like Siri and Watson. Various kinds of computer generated art had been around for decades and "filters" and other "intelligent" tools were all over the place in all kinds of graphics software, editing video live, etc. Sure, turning your face into Shrek, or an anime cha
Re: (Score:2)
Why live? (Score:2)
Why live life as a human in human culture, when you can sit back and let AI do the living for you?
Re: (Score:2)
Re: (Score:2)
Nobody will hire me if I don't have 10 years of experience training AIs.
Ignorance? (Score:4, Insightful)
But MBAs, tech bros, and 'owners'? Yeah... that I can see.
Re: (Score:2)
That's exactly it... the C-suite guys have the piece of paper that says they know how to "manage" a company, and don't give a flying toaster about you or me... robots controlled by LLM-AI and computer vision can do every job (or, so they think right now)... humans cost money to hire and employ and want holidays off and want to work an 8 hour day... a factory full of robots only needs a couple techs to solve problems while the company makes millions a week or whatever.
On the other hand, someone (like me) who
ai slop (Score:2)
"low-level creative jobs offer practice time and pathways for mentorship"
More important, ultimately, is that they offer pathways to high-level creative jobs.
Training an AI offers a pathway to... nothing.
Re: (Score:2)
In this case, these are two pathways to slop. Whether it's entry level creative jobs or AI, the end result is still slop. Maybe it's the end product we don't need.
Of course (Score:5, Insightful)
Working your way up the rungs is how you learn. No one knows the exact steps for doing something in a work environment the moment they walk in the door. You don't know what is or is not right until you get feedback. It's why it's called learning.
Take that learning away and how does someone know what to tell/direct an AI bot to do? If they've never done the steps before, how do they know what is right or wrong?
As a side note, many "manuals" that come with equipment fall into the last category. Clearly the people "writing" the manual have never done what they're telling you to do. Had they done so, the numerous quirks and confusions wouldn't be there.
Re: (Score:3)
Re:Of course (Score:4, Interesting)
So the observation here is that this trend of using AI instead of entry level people will, eventually, leave the job market empty of mid-level people, since nobody got the real-world experience that they need to become mid-level people.
Even if true, it still doesn't make sense for any individual business to hire and pay entry level people that they no longer need. If they do this, they are basically running a charity at that point, and likely violating their fiduciary duties. Each individual business needs to cut costs in whatever ways make sense for the health of the business. The problem of not being able to find the people they need isn't actually a problem until later.
So, even if every single business owner in the world reads these warnings and nods in agreement with them, they still have no incentive to hire entry level people. That would just increase their costs while allowing their competitors to keep their costs low. It wouldn't be rational for them to hire these people they don't need.
The most likely way this plays out (assuming it is actually true) is: when the day comes that mid-level people are needed but none are available, entry-level people will be hired instead, right into mid-level roles. Also, senior level people will be retained longer, paid more, possibly even invited out of retirement, to train and coach these entry-level people who are needed in mid-level roles.
There you go, problem solved. It might be a bit pricey when the day comes, quality and reliability might take a hit, but those costs will be felt industry-wide so all companies will at least be on equal footing.
At no point will warnings about a talent drought prevent any employers from using AI instead of humans as much as they possibly can.
Re: (Score:2)
Take that learning away and how does someone know what to tell/direct an AI bot to do? If they've never done the steps before, how do they know what is right or wrong?
The people making the decision to replace entry level jobs with AI absolutely do not care about the lack of training. That is a future problem, not their problem.
As a side note, many "manuals" that come with equipment fall into the last category. Clearly the people "writing" the manual have never done what they're telling you to do. Had they done so, the numerous quirks and confusions wouldn't be there.
Of course they don't. The people that write the manuals are Tech writers not users and not engineers.
The user or engineer writes the instructions, and the Tech writer dresses it up so it looks nice and gives it back to the user or engineer to review.
They iterate that until they have a decent manual--at least that's what's supposed to happen.
Big surprise. (Score:5, Interesting)
And by big surprise, I mean it's a big surprise that someone finally wrote an article pointing out the obvious. The public articles have mostly tried to ignore this part of the AI trend in favor of, "AI is great at everything," messaging up to this point.
We're seeing it here in podunk nowheresville US as well. We hire new programmers, they rely nearly 100% on AI-assisted vibe coding to do the simple tasks we give them, and their skills are not improving at the rate they need to in order to climb into the bigger projects we have waiting for them once they work through the simple bug fixes and version transfers we need them to get through first. It's honestly frightening how little these "programmers" are learning, as they let the AI do the coding, and for the most part, the bug fixing. Now, if that made them faster than the folks we used to hire a few years back, that'd be one thing. But it actually seems to slow them down, on top of the fact that they aren't really building a knowledge base themselves so much as sorting out how to prompt and reprompt the AI they use. And even then, they don't always get the job done until we do a code review meeting where senior devs march them lockstep through what they've bungled thus far to get it functional.
And then we repeat that process over and over. It doesn't seem they're learning, just repeating the same prompt, reprompt, reprompt, have the seniors fix it in review pattern over and over. Where are we going to get solid coders for our next generation? I know the AI companies swear that if we just ignore the problem now they'll get their systems up to par in time to take over the senior positions as we age out, but man, I'm just not seeing that kind of progress coming fast enough to catch us.
In short, as an old timer, I'm continually asking, when it comes to AI: where's the beef?
Re: (Score:2)
Re:Big surprise. (Score:5, Insightful)
Something that frustrates me even more: a while back I decided to take a programming class in a language/domain I had not used before, and the very first part went over how to use AI to 'help you learn'...
That *might* be OK so long as there's some form of guidance on how to utilize the AI as a research assistant, rather than as a "hey, make this program for me." LLM style AI is pretty decent with research, so long as it's set up to provide its reference material along with its answer. In fact, I'd say that's the one domain it's really great at, but also seems to be the one domain that mostly gets ignored in favor of pushing the, "AI will do everything for you," agenda.
Same as outsourcing (Score:4, Insightful)
A lot of the current concerns around AI very much remind me of the outsourcing trend of the 2000s. Basically keep the senior people in-house, and outsource all the junior positions to some other country with lower wages. It ultimately failed then too, for similar reasons. Too much effort required to review and fix the work, and the removal of a pipeline for new senior people.
Re: (Score:2)
There is still a lot of outsourcing. It's just that the powers that be realized it's not a replacement for all grunt work, but a limited-use tool. AI may turn out similar. We've only had a few years of mass AI adoption. Give it a few more and the dearth of juniors moving up the ranks will get more obvious, deadlines will slip, and the "AI will do everything" narrative will start to fade.
They don't care (Score:2)
Must have experience with ... (Score:4, Funny)
the most consequential casualties may be the humble entry-level positions where aspiring artists have traditionally paid dues and learned their craft.
Makes me think of programming job ads saying "must have 10 years of experience with [insert language]" when that programming language had literally only been around for 5 years. Also makes me think of the junior software engineer, three months out of university, who still needed help with simple things, who asked when he'd be promoted to senior. The thought, "when you don't need a senior engineer to help you with your work," came to mind.
Re: (Score:2)
"must have 10 years of experience with [insert language]" when that programming language had literally only been around for 5 years
Isn't that just code for give me H1Bs?
Re: (Score:2)
"must have 10 years of experience with [insert language]" when that programming language had literally only been around for 5 years
Isn't that just code for give me H1Bs?
Or, at least, "we have incompetent HR and/or tech managers, who don't actually know anything."
In a similar vein, I'll note that a place near here called Jefferson Labs (originally CEBAF - Continuous Electron Beam Accelerator Facility) used to post super detailed and specific, bordering on unreasonable, job ads that made me think they already had a specific person in mind. One said, must be fluent in PostScript - seriously, who (manually) codes/edits PostScript? (admitting that I have actually done
human slop (Score:2)
Misleading headline (Score:1)
Do better.
Yeah but that's only half the problem (Score:3)
Yes, AI destroying entry level jobs is a big problem, and it would be big enough all by itself.
But it's compounded by the fact that you have to know what you're doing to use AI. If it delivered on its promises, then it wouldn't matter that there weren't entry-level jobs, because a noob could do expert-level work.
AI fundamentally can't do that. It takes an experienced person to even know what to ask for, let alone how to fix the fuck-ups, and it takes an expert to use it correctly. Combine that with the destruction of entry-level jobs and that's what's really going to fuck us long term. The jobs upheaval is a relatively short-term problem, but it will cause knock-on effects for years and years as we try to rebuild the skills we've lost. Those noobs were supposed to learn from the elders.
Re: (Score:2)
*waves to drinkypoo*
But, the noobs only learn from the elders if the job the noob has allows for learning time, which is next to never.
So, because only those with 15+ years of experience in a 2-year old industry qualify for the LLM-AI supervising tech job, and that's basically all the jobs that are left... are the rest of us supposed to just starve and die?
Don't worry... the C-suite crew is gonna quickly realize they made a mistake installing all those robots when one bot malfunctions and the cause of the m
Re: (Score:2)
are the rest of us supposed to just starve and die?
[...]
Don't worry... the C-suite crew is gonna quickly realize they made a mistake
If they were going to realize quickly, I wouldn't be worried. Alas, they are not smart enough for that. They will realize slowly, by which time lots of us will already have suffered at best, or died as I do indeed believe they would prefer most of us to do, and ASAP at that. They only need enough of us to exist to produce offspring they can fuck.
It is not grunt work that teaches you (Score:3)
It is not grunt work that teaches you how to work. I personally started programming simply by doing my own projects. It was not grunt work; I just did what I could, and if I didn't know how to do something, I tried to learn how. When I joined the workforce, I pretty much continued to do my own projects. I made a lot of mistakes, and I learned from them.
I think we can easily let go of the grunt work, if we just offer the kids a place where they can do those big projects in an environment where they can make mistakes and where they can ask for advice. If they can't play with the real thing, let them play in simulation.
We don't need so many copy writers (Score:3)
Have you been to LinkedIn or any how-to pages lately? Same shit, different page. Reheating the same subject, the same stale pancake over and over again. Fighting for scraps of the same traffic/attention cake as everyone else.
The truth is we don't need so many copy writers. We need more quality encyclopedia/wiki pages. Here's all the reliable information you need. Here's how you do it. We don't need a million copies of the same shit just so that someone can get 5 cents a year from the viewers.
Back to apprenticeships? (Score:2)
It looks like we're going full circle: if entry-level jobs are too expensive to be profitable and you really want to learn a new craft, you may end up having to work for free, or close to it, to gather experience.