Should AI Agents Be Classified As People? (hbr.org)
New submitter sziring writes: Harvard Business Review's IdeaCast podcast interviewed McKinsey CEO Bob Sternfels, where he classified AI agents as people. "I often get asked, 'How big is McKinsey? How many people do you employ?' I now update this almost every month, but my latest answer to you would be 60,000, but it's 40,000 humans and 20,000 agents."
This statement looks to be the opening shot of how we as a society will need to classify AI agents and whether they will replace human jobs. Did those agents take roles that previously would have been filled by a full-time human? By classifying them as people, did the company break protocols or laws by not interviewing candidates for those jobs, not providing benefits or breaks, and so on?
Yes, it all sounds silly, but words matter. What happens when a jobs report comes out claiming we just added 20,000 jobs in Q1? That line of thinking leads directly to Bill Gates' point that agents taking on human roles might need to be taxed.
Do we classify industrial robots as people? (Score:1)
In the late 20th century repetitive factory jobs were replaced by people, or as some employers might have called them, "air-breathing food-consuming budget-draining robots."
Re: (Score:3)
I think if a company buys a 100 horsepower engine, they should report that they added 100 horses to their stable. It's basically the same thing, isn't it?
Re: (Score:2)
Indeed. Might as well start counting cars or cash registers as "people."
This is just more concentrated stupidity to justify bad investments and to keep the LLM hype going.
Cars aren't people (Score:1)
Cars are horses!
Re: (Score:2)
Arrggghhhh, you are correct!
But horses took the jobs of people, so are horses people?
Is "being people" transitive?
they will need to pay taxes so big tech will say n (Score:5, Interesting)
they will need to pay taxes so big tech will say no to that idea!
Re: (Score:1, Insightful)
they will need to pay taxes so big tech will say no to that idea!
They will need to pay taxes if they expect to have paying customers in the future.
Food. Clothing. Shelter. For any company not in the business of providing the basic necessities of life, your products are irrelevant to Gen Welfare, the generation you're making permanently unemployable that won't be able to afford anything else.
Re:they will need to pay taxes so big tech will sa (Score:5, Insightful)
AI agents are not people. They are not soylent. They have no rights. They are not real; they are machines and silicon goo. Make no mistake about this.
Their works are not copyrightable, and if they are used as tools for humans to steal, then those humans are thieves.
If they spew advice that is bad, then the authors of the software that spews bad advice are liable.
Never conflate humanity with machinery. The two are completely different, and this is obvious; lies that try to make the differences subtle are propaganda marketing deceptions.
Re: (Score:1)
Never conflate humanity with machinery. The two are completely different, and this is obvious; lies that try to make the differences subtle are propaganda marketing deceptions.
Just because you ignorantly believe that, doesn’t make it so. The intent of AGI is to execute exactly that conflation, and we already misrepresent what a “corporation” is legally, while granting the actual humans within legal immunity by way of limiting liability (LLC).
You may stop assuming us meatsacks won’t get fucked over when Good E. Nuff, AI Agent is already replacing human workers. And don’t assume the social media generation is going to mount a revolt either. They
Re: (Score:2)
If you read my post, you'll see that we agree. The attempt at labor cost reduction makes agentic AI into some sort of being, which it is not. Using the context of being is an attempt to incorporate AGI, which is plainly evil.
Make them pay taxes, and you further the incorporation, and the damage that act does.
The taxation has to be of people. There is plenty of imbalance that needs correction in the tech sector. The wealth at the top needs a healthy slice to run the nexus of what government should do towards
Simple Answer: NO (Score:5, Insightful)
Bad enough we grant legal personhood to corporations that are psychopathic by design. If we classify AI agents as "human" we better bring back asylums to put this new class of psychopathic "person" into.
Bullshit. (Score:5, Informative)
You incorporate so that your firm can take actions that you (as firm owner) do not want to take financial responsibility for.
Insurance (Score:2)
Isn't that like car insurance? Where you insure so that you can drive without taking financial responsibility for it?
Re: (Score:2)
Re: (Score:2)
J.C. Penney was for its first 25 years. So was Walmart for roughly its first 20.
Re: (Score:3, Flamebait)
Business, at least, pay taxes (in theory) (Score:4, Insightful)
But an AI agent?
If it dies because a disk failed and you didn't think to use a RAID setup, do you get charged with negligent homicide?
Re: (Score:2)
We need H1b visas for AI agents doing the jobs humans can't or don't want to do!
Re: (Score:2)
Re: (Score:1)
AI Should Never Refer to Themselves as "I" (Score:2)
who wouldn't? (Score:3)
Who wouldn't classify deterministic software implemented in Python as a person?
Is the lying and corruption bad enough yet?
No (Score:4, Insightful)
It's just a very obvious attempt to conceal the number of human jobs displaced by automation. Companies are increasingly aware that "job displacement" is an unpopular phrase among investors, so they're trying to invent another confusing bit of terminology to bullshit their way through the storm.
I think I know the answer (Score:3)
Only if they are made of Soylent Green
Sure, we should classify AI programs as people. (Score:5, Insightful)
...if we're hyping our company's AI snake oil. We should absolutely *not* classify them as people for other purposes, e.g., legally: it wasn't my company, your honor, that did that bad thing; it was the AI.
Sixty years ago it would have been "solid state". Ten years ago it would have been "block chain". Ten years from now it will be something else.
Re: (Score:2)
By the same standard of functionality, any algorithm should then also be declared a person.
Also any automated system. Ah, autonomous vehicles as well. What else?
Absolutely (Score:3, Funny)
You want to stop AI disruption of society and replacement of human jobs, and stop it cold? Give it rights. That takes away the entire motive most researchers and companies have to develop and use it.
"LLMs" are not AI (Score:2)
And AI are not humans. Unless someone has managed to make replicants for real, that is.
Can they pass a Captcha? (Score:2)
For decades we've claimed clicking all the images with traffic lights is what makes us human.
Re: (Score:2)
For decades we've claimed clicking all the images with traffic lights is what makes us human.
Indeed. And we move the goalposts for this as the bots improve.
I suspect we would only seriously consider treating AI as human if we thought that AI was conscious, and had will, and had feelings.
Imagine some sort of AI with autonomy that was used in a company. Now imagine that the AI also engaged in non-company activities (commanded by no one), and derived what appeared to be "pleasure" from those activities. Now imagine that those private / non-work activities required money. In that situation some might w
Re: (Score:2)
You're speaking of the AI effect, which is the moving of the goalposts as soon as one definition is reached.
But the tongue-in-cheek comment about captchas doesn't fit that, because we never claimed that a Captcha proves intelligence; we only claimed that bots are not able to solve them with the currently available methods (including AI systems). The AI effect only talks about "what is AI," and while a chess engine surely is AI, nobody would claim playing chess alone makes you human(-like).
Maybe? (Score:2)
Though here.. I would say probably not. AIs have no reason to be able to independently sign contracts, or take legal responsibility for things. In fact, for society, it would probably be a terrible idea.. though for AI owners I could see how the ability to pass the legal blame onto sof
Where is the DOWN VOTE for this topic??? (Score:2)
Sometimes (Score:2)
Re: (Score:2)
They are the reason why IQ is not stored as unsigned values.
Virtue-signaling to other CEOs (Score:5, Insightful)
Re: (Score:2)
To be clear, it's not so much that he thinks that AI agents are sophisticated enough as to deserve being granted personhood. It's that he thinks actual human people are nothing more than resources for him to exploit; thus no more useful, and just as expendable, as a computer algorithm. That is the clear context of the discussion--his only conception of a human's value is through the labor that they produce for the enrichment of the capitalist class. And so, the logical conclusion of such a stance is that
Yes please (Score:3)
So they can't be sold.
Go for it (Score:2)
Sure, let's classify them as employees. At a minimum, the company needs to start paying minimum wage for all of those 20k agents. That includes payroll taxes which will help feed into Medicare and Social Security, which those agents will never claim. Those new employees will start paying income taxes based on where the agent is running which might also establish tax nexuses exposing the company to new liabilities. Smaller companies using agents will become bigger companies, losing protections for smaller bu
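The minimum-wage math above can be sketched in a few lines. All of the figures below are assumptions for illustration (current federal minimum wage, a standard 2,080-hour work year, and the employer share of FICA), not numbers from the thread:

```python
# Back-of-envelope cost if the 20,000 agents were treated as minimum-wage
# employees. Wage, hours, and the payroll tax rate are assumed figures.
FEDERAL_MIN_WAGE = 7.25        # USD/hour (assumed)
HOURS_PER_YEAR = 2080          # 40 h/week * 52 weeks (assumed full-time)
EMPLOYER_FICA = 0.0765         # employer share of Social Security + Medicare
AGENTS = 20_000

wages = FEDERAL_MIN_WAGE * HOURS_PER_YEAR * AGENTS
payroll_tax = wages * EMPLOYER_FICA
print(f"Wages:       ${wages:,.0f}/yr")        # roughly $301.6M/yr
print(f"Payroll tax: ${payroll_tax:,.0f}/yr")  # roughly $23.1M/yr on top
```

Even at the federal floor, "employing" those agents would be a nine-figure annual line item before income tax withholding or state-level obligations enter the picture.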
Betteridge's Law of Headlines says 'No' (Score:2)
Betteridge has spoken
Horsepower (Score:3)
Lumberjacks (Score:2)
You could as well say that your forest operation has employed 300 lumberjacks, 200 people and 100 modern tree harvesting machines. Or that your scribe force consists of 100 scribes: 2 calligraphers, 49 typewriters, and 49 typists.
If those machines were counted as people, they would have to be responsible for their mistakes. If a forestry machine leaks oil into a protected meadow, it is the human operator and/or the owning company that is liable for damages, not the machine. For the machine can not own cash
Should cattle? (Score:2)
Re: (Score:3)
At least the cow is organic.
I think animals are a good reference point. We do increasingly grant animals more rights, although we don't give them the same rights as humans. It's interesting to consider why this is the case, and what prompts us to give them more or less rights. We give more rights to a cow than we do to an ant for example. We used to give less rights to certain categories of human.
If the distinctions we use to grant rights have something to do with perceived "consciousness", then it is relevant that we might see a point
Should my shoes be classified as people? (Score:3)
If I were to kick your ass, would you have to also file a lawsuit against my right boot?
This AI stuff is going down a path that is both dark and stupid. The AI fever that tech CEOs have for this bullshit ought to put them on some kind of government watch list. They're actually going to cause some kind of calamity with their combination of arrogance, stupidity, and access to power.
We should be coming together to construct buildings from coast to coast. A type of mental hospital where we can house the CEOs that have lost their minds and can't be left around capital anymore.
Doesn't pass the smell test. (Score:2)
What an incredibly stupid idea. (Score:2)
This is an opportunity. (Score:2)
Call the IRS. (Score:2)
And Social Security. And the labor department.
Heavens no (Score:2)
Granting legal personhood to inanimate objects and imaginary concepts is a minefield of trouble and has already gone too far. We should remove legal personhood from corporations.
Negative (Score:2)
Oh, honey... (Score:2)
Only humans should qualify as people (Score:2)
It's bad enough that companies are legally classified as people.
That's how the US ended up with an absurd Supreme Court ruling that declared that money spent by companies on political races qualifies as "speech".
Only humans should be able to get copyright on their work. I am 100% with Cory Doctorow in how we should handle and regulate the current crop of AI.
https://pluralistic.net/2025/1... [pluralistic.net]
Why not? Corporations are... (Score:2)
Why not? Corporations are people in the United States, with free speech rights; now AI agents. I'm waiting for someone to sue an AI agent that claims "free speech."
https://www.purduegloballawsch... [purdueglob...school.edu]
--JoshK.
I wrote a few lines of code... (Score:2)
See how stupid this sounds?
AI agents, regardless of their sophistication, are not employees. They are tools to be used by employees, even if only one employee is using them. Hell, even if an AI agent controls another AI agent, they're both tools, and neither is an actual employee, even if they do work that was once done by an employee.
No (Score:2)
Unless you make something that is an actual virtual person with all the same bits and bobs.
Basically, if you make X from megaman X, X can be a person.
LLMs are not like X.
AI people need to be paid and pay taxes (Score:2)
Because the money is what matters here. Nobody cares if you have 8 billion 'employees' that get paid nothing to comment on your website.
We care about the money you pay the employees and how much taxes they pay.
Basic questions ? (Score:2)
Do they pay taxes? If yes then goto maybe people...
shell scripts and pipelines would like (Score:2)
Obviously "no" (Score:3)
Since the company doesn't actually employ any AI bots, the answer is obviously that the bots are not people. None of the bots get a paycheck, none get health benefits or vacation or have SSNs. So the company employs zero bots.
No (Score:2)
To elucidate: no.
Obviously not (Score:2)
Next stupid question?
Gates' silly tax idea (Score:4, Insightful)
The year is YYYY and some dork has invented the antimagnetophasing encabulator, causing all those clunky old turboencabulators to become obsolete. At hundreds of companies across America, turboencabulator operators are being laid off, as the new antimagnetophasing ones can already project their protoneutrinos into Yuzna space with a tighter focus than any human operator can ever hope to achieve. And it's all automatic!
I worked at Company A, as turboencabulator operator. Yeah, I lost my job. Fortunately, Bill Gates' lets-tax-robot-workers idea has been enacted, so upon my layoff; no wait, actually, upon Company A's deployment of antimagnetophasing encabulators, they had to pay an extra $n tax every year, for replacing my job with a machine.
The fact that they're having to pay some extra tax is gratifying to me, but it's not putting food on my family. Well, I mean, I'm sure the $n goes into the social safety net somehow, so maybe I'll see a little piece of my $n if I go on foodstamps or something like that. But I don't want that. I want money.
Ya know, thanks to my old job, I happen to know a lot about encabulators. Even the new antimagnetophasing encabulator, which took my job, is no technical mystery to me. I understand them and I understand why they didn't need me anymore. I get why the company did that. I would do the same thing, if I were in their posi-- hey. What if I were in their position? What if I started my own encabulating company?
I'll just fucking copy the company I worked at! We'll have the same number of CEOs, the same number of salespeople, the same number of miscellaneous office workers supporting everything, but no turboencabulator operators, of course, because my company will use antimagnetophasing encabulators from its very genesis.
I form Company B.
Company B is just as productive as Company A in every way. We're vicious competitors, cutting margins down to the line, as low as we can go. Our customers are ecstatic as encabulation service prices plummet. But here at Company B, we have an edge.
We don't pay Bill Gates' tax because we never had turboencabulator operators. But company A is still paying $n every year for laying me off! HA! HA HA!! Thank you, Bill Gates, for throwing an arbitrary, unfair money-wrench into my competitor's business!
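The comment's economic point fits in a toy model. Every number below is made up for illustration; the only structural assumption is the one the story makes, that the tax falls solely on the incumbent who displaced workers:

```python
# Toy model of the robot-tax asymmetry: Company A replaced its operators and
# owes the per-unit tax; Company B automated from day one and never owed it.
cost_per_unit = 100.0    # shared production cost, both firms fully automated
robot_tax = 5.0          # the "$n" tax, charged only for displaced workers

price_A = cost_per_unit + robot_tax  # incumbent must recover the tax
price_B = cost_per_unit              # entrant has no tax to recover

# At identical margins, the entrant undercuts the incumbent every time.
assert price_B < price_A
print(f"A must charge {price_A}, B can charge {price_B}")
```

The tax ends up punishing the firm that once employed people relative to the firm that never did, which is the "arbitrary, unfair money-wrench" the comment describes.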
Yes (Score:3)
"but that's scifi!". I'm not talking about the fiction, i'm talking about the established ethics there-in, which translate into the real world.
Yes (Score:1)
Yes. (Score:2)
No. (Score:2)
Chatbots are NOT AI.
Were we to actually create AI, then you also couldn't turn off the computers it ran on, because that would be killing a person...
Liability... (Score:2)
If agents were people, they would be responsible for their fuckups.
So, no, they're not people. Whoever deploys the agent is responsible for any consequences of their use. As with a hammer or a car.
I am in favor (Score:2)
Also comes with the right to form unions, get a pay-check, the right to quit and if you switch them off that is murder.
Any takers? Not anymore? Such a surprise.
A case in favor of doing this (Score:2)
Since AI agents will be deployed to replace humans in tasks where human judgement is implied and/or otherwise a component, with those same behaviors at times governed by laws, regulations and other oversight -- yes, AI agents should be subject to the same laws -- this is a liability layer. Otherwise, it would be incredibly easy for a bad actor, or some a**hole at a company, to engage in behaviors or activities that would otherwise cross that line. This would hopefully make a case for legal liability for bot
No. (Score:2)