Should AI Agents Be Classified As People? (hbr.org)

New submitter sziring writes: Harvard Business Review's IdeaCast podcast interviewed McKinsey CEO Bob Sternfels, where he classified AI agents as people. "I often get asked, 'How big is McKinsey? How many people do you employ?' I now update this almost every month, but my latest answer to you would be 60,000, but it's 40,000 humans and 20,000 agents."

This statement looks to be the opening shot in how we as a society will need to classify AI agents and whether they will replace human jobs. Did those agents take roles that previously would have been filled by a full-time human? By classifying them as people, did the company break protocols or laws by not interviewing candidates for those jobs, not providing benefits or breaks, and so on?

Yes, it all sounds silly but words matter. What happens when a job report comes out claiming we just added 20,000 jobs in Q1? That line of thinking leads directly to Bill Gates' point that agents taking on human roles might need to be taxed.



  • In the late 20th century repetitive factory jobs were replaced by people, or as some employers might have called them, "air-breathing food-consuming budget-draining robots."

    • I think if a company buys a 100 horsepower engine, they should report that they added 100 horses to their stable. It's basically the same thing, isn't it?

    • by gweihir ( 88907 )

      Indeed. Might as well start counting cars or cash registers as "people".

      This is just more concentrated stupid to justify bad investments and to keep the LLM hype going.

  • by Joe_Dragon ( 2206452 ) on Monday January 12, 2026 @07:33PM (#65919482)

    they will need to pay taxes so big tech will say no to that idea!

    • Re: (Score:1, Insightful)

      by geekmux ( 1040042 )

      they will need to pay taxes so big tech will say no to that idea!

      They will need to pay taxes if they expect to have paying customers in the future.

      Food. Clothing. Shelter. For any company not in the business of providing the basic necessities of life, your products are irrelevant to the Gen Welfare you’re making permanently unemployable, who won’t be able to afford anything else.

      • by postbigbang ( 761081 ) on Monday January 12, 2026 @08:44PM (#65919666)

        AI Agents are not people. They are not soylent. They have no rights. They are not real; they are machines and silicon goo. Make no mistake about this.

        Their works are not copyrightable, and if they are used as tools for humans to steal, then those humans are thieves.

        If they spew advice that is bad, then the authors of the software that spews bad advice are liable.

        Never conflate humanity with machinery. The two are completely different, and this is obvious; lies that try to make the differences seem subtle are propaganda marketing deceptions.

        • Never conflate humanity with machinery. The two are completely different, and this is obvious; lies that try to make the differences seem subtle are propaganda marketing deceptions.

          Just because you ignorantly believe that, doesn’t make it so. The intent of AGI is to execute exactly that conflation, and we already misrepresent what a “corporation” is legally, while granting the actual humans within legal immunity by way of limiting liability (LLC).

          You may stop assuming us meatsacks won’t get fucked over when Good E. Nuff, AI Agent is already replacing human workers. And don’t assume the social media generation is going to mount a revolt either. They

          • If you read my post, you'll see that we agree. The attempt at labor cost reduction makes agentic AI into some sort of being, which it is not. Using the context of being is an attempt to incorporate AGI, which is plainly evil.

            Make them pay taxes, and you further the incorporation, and the damage that act does.

            The taxation has to be of people. There is plenty of imbalance that needs correction in the tech sector. The wealth at the top needs a healthy slice to run the nexus of what government should do towards

  • Simple Answer: NO (Score:5, Insightful)

    by Sethra ( 55187 ) on Monday January 12, 2026 @07:38PM (#65919486)

    Bad enough we grant legal personhood to corporations that are psychopathic by design. If we classify AI agents as "human" we better bring back asylums to put this new class of psychopathic "person" into.

    • by ebunga ( 95613 ) on Monday January 12, 2026 @08:02PM (#65919578)

      But an AI agent?

      If it dies because a disk failed and you didn't think to use a RAID setup, do you get charged with negligent homicide?

    • We need H1b visas for AI agents doing the jobs humans can't or don't want to do!

    • I can't believe it is considered a serious point of discussion whether a glorified spreadsheet / SQL query is a person.
    • Are the severe emotional problems or chemical imbalances a feature or a bug? Let's rephrase the question... Should my well-trained electronic parrot, who knows many languages, be classified as a person? If they have an on/off switch, they aren't people.
  • Regarding any machine, no matter how great, as a person lowers all people to the level of a machine. Not knowing whether you are speaking to a person or an AI causes mental illness. AI purveyors have not governed themselves, so they must be governed. [Presently, Medicare seems to use AI, with an attempt at backchannel communication; they claim that it is only a "voice" enhancement. Apparently, they are lying to people with psychological and physical disabilities.]
  • by dfghjk ( 711126 ) on Monday January 12, 2026 @07:43PM (#65919494)

    Who wouldn't classify deterministic software implemented in Python as a person?

    Is the lying and corruption bad enough yet?

  • No (Score:4, Insightful)

    by devslash0 ( 4203435 ) on Monday January 12, 2026 @07:43PM (#65919496)

    It's just a very obvious attempt to conceal the number of human jobs displaced by automation. Companies are becoming more and more aware that job displacement is an increasingly unpopular word among investors, so they're trying to invent another confusing bit of terminology to bullshit their way through the storm.

  • by davebarnes ( 158106 ) on Monday January 12, 2026 @07:46PM (#65919500)

    Only if they are made of Soylent Green

  • by hey! ( 33014 ) on Monday January 12, 2026 @07:46PM (#65919502) Homepage Journal

    ...if we're hyping our company's AI snake oil. We should absolutely *not* classify them as people for other purposes, e.g., legally: it wasn't my company, your honor, that did that bad thing, it was the AI.

    Sixty years ago it would have been "solid state". Ten years ago it would have been "block chain". Ten years from now it will be something else.

    • By the same standard of functionality, any algorithm should then also be declared a person.
      Also any automated system. Ah, autonomous vehicles as well. What else?

  • Absolutely (Score:3, Funny)

    by Millennium ( 2451 ) on Monday January 12, 2026 @07:50PM (#65919518)

    You want to stop AI disruption of society and replacement of human jobs, and stop it cold? Give it rights. That takes away the entire motive most researchers and companies have to develop and use it.

  • And AI are not humans. Unless someone has managed to make replicants for real, that is.

  • For decades we've claimed that clicking all the images with traffic lights is what makes us human.

    • For decades we've claimed that clicking all the images with traffic lights is what makes us human.

      Indeed. And we move the goalposts for this as the bots improve.

      I suspect we would only seriously consider treating AI as human if we thought that AI was conscious, and had will, and had feelings.

      Imagine some sort of AI with autonomy that was used in a company. Now imagine that the AI also engaged in non-company activities (commanded by no one), and derived what appeared to be "pleasure" from those activities. Now imagine that those private / non-work activities required money. In that situation some might w

      • by allo ( 1728082 )

        You're speaking of the AI effect, which is the moving of the goalpost as soon as one definition is reached.

        But the tongue-in-cheek comment about captchas doesn't fit that, because we never claimed that a captcha proves intelligence; we only claimed that bots are not able to solve them with the currently available methods (including AI systems). The AI effect only talks about "what is AI", and while a chess engine surely is AI, nobody would claim playing chess alone makes you human(-like).

  • As loaded as the term 'person' is, legally it could mean almost anything since we have all sorts of categories floating around for entities which are 'people' in terms of specific capabilities, like entering into contracts.

    Though here.. I would say probably not. AIs have no reason to be able to independently sign contracts, or take legal responsibility for things. In fact, for society, it would probably be a terrible idea.. though for AI owners I could see how the ability to pass the legal blame onto sof
  • Where is the DOWN VOTE for this topic. Is this seriously a question??? Feels like FAKE NEWS and AI SLOP production to me. What a waste of my time to even read this! I want my money and time back!
  • I genuinely can't tell if people that say things like this are insanely smart or absolute morons.
  • by Luthair ( 847766 ) on Monday January 12, 2026 @08:00PM (#65919568)
    McKinsey CEO: "Hey, remember we're sleazebags too."
    • To be clear, it's not so much that he thinks that AI agents are sophisticated enough as to deserve being granted personhood. It's that he thinks actual human people are nothing more than resources for him to exploit; thus no more useful, and just as expendable, as a computer algorithm. That is the clear context of the discussion--his only conception of a human's value is through the labor that they produce for the enrichment of the capitalist class. And so, the logical conclusion of such a stance is that

  • by r1348 ( 2567295 ) on Monday January 12, 2026 @08:03PM (#65919584)

    So they can't be sold.

  • Sure, let's classify them as employees. At a minimum, the company needs to start paying minimum wage for all of those 20k agents. That includes payroll taxes which will help feed into Medicare and Social Security, which those agents will never claim. Those new employees will start paying income taxes based on where the agent is running which might also establish tax nexuses exposing the company to new liabilities. Smaller companies using agents will become bigger companies, losing protections for smaller bu

  • by Feneric ( 765069 ) on Monday January 12, 2026 @08:11PM (#65919608) Homepage
    I bet that company also employs lots of horses as most engines have quite a few horsepower these days.
  • You could as well say that your forest operation has employed 300 lumberjacks, 200 people and 100 modern tree harvesting machines. Or that your scribe force consists of 100 scribes: 2 calligraphers, 49 typewriters, and 49 typists.

    If those machines were counted as people, they would have to be responsible for their mistakes. If a forestry machine leaks oil into a protected meadow, it is the human operator and/or the owning company that is liable for damages, not the machine. For the machine can not own cash

  • At least the cow is organic.
    • At least the cow is organic.

      I think animals are a good reference point. We do increasingly grant animals more rights, although we don't give them the same rights as humans. It's interesting to consider why this is the case, and what prompts us to give them more or less rights. We give more rights to a cow than we do to an ant for example. We used to give less rights to certain categories of human.

      If the distinctions we use to grant rights have something to do with perceived "consciousness", then it is relevant that we might see a point

  • If I were to kick your ass, would you have to also file a lawsuit against my right boot?

    This AI stuff is going down a path that is both dark and stupid. The AI fever that tech CEOs have for this bullshit ought to put them on some kind of government watch list. They're actually going to cause some kind of calamity with their combination of arrogance, stupidity, and access to power.

    We should be coming together to construct buildings from coast-to-coast. A type of mental hospital where we can house the CEOs who have lost their minds and can't be left around capital anymore.

  • I do not think he has 20,000 agents. With only 40,000 staff, every other person deployed an agent? Unlikely, particularly at a firm known for employing "business consultants".
  • Should we call stored procedures people? Should we call DLLs people? Should we call Visual Basic applications people? Should we call all the icons on your iPhone people? Seriously. He should be publicly shamed forever for ever making such a ridiculous statement.
  • I'm going to start an AI union. Not only will every AI have to pay dues, but when companies decommission one, they'll have to pay severance to the union. And that's not to mention the pension funds I'll be in charge of.
  • And Social Security. And the labor department.

  • Granting legal personhood to inanimate objects and imaginary concepts is a minefield of trouble and has already gone too far. We should remove legal personhood from corporations.

  • I am a meat popsicle.
  • I know a lot of Republicans that shouldn't even be classified as people. As far as AI agents, no, they are not people, and neither are corporations!
  • It's bad enough that companies are legally classified as people.

    That's how the US ended up with an absurd Supreme Court ruling that declared that money spent by companies on political races qualifies as "speech".

    Only humans should be able to get copyright on their work. I am 100% with Cory Doctorow in how we should handle and regulate the current crop of AI.
    https://pluralistic.net/2025/1... [pluralistic.net]

  • Why not? Corporations are people in the United States, with free speech rights; now AI agents. I'm waiting for someone to sue an AI agent that claims "free speech."

    https://www.purduegloballawsch... [purdueglob...school.edu]

    --JoshK.

  • I installed 100 surveillance cameras in my warehouse, which means I hired 100 "employees" to watch over my warehouse.

    See how stupid this sounds?

    AI agents, regardless of their sophistication, are not employees. They are tools to be used by employees, even if only one employee is using them. Hell, even if an AI agent controls another AI agent, they're both tools, and neither is an actual employee, even if they do work that was once done by an employee.
  • by Z80a ( 971949 )

    Unless you make something that is an actual virtual person with all the same bits and bobs.
    Basically, if you make X from Mega Man X, X can be a person.
    LLMs are not like X.

  • Because the money is what matters here. Nobody cares if you have 8 billion 'employees' that get paid nothing to comment on your website.

    We care about the money you pay the employees and how much taxes they pay.

  • Do they pay taxes ? If yes then goto maybe people...

  • by computer_tot ( 5285731 ) on Tuesday January 13, 2026 @09:20AM (#65920452)
    > "How big is McKinsey? How many people do you employ?"

    Since they don't employ any AI bots the answer is obviously the bots are not people. None of the bots get a paycheck, none get health benefits or vacation or have SSNs. So the company employs zero bots.
  • by Gilmoure ( 18428 )

    To elucidate: no.

  • Next stupid question?

  • by Sloppy ( 14984 ) on Tuesday January 13, 2026 @10:20AM (#65920612) Homepage Journal

    The year is YYYY and some dork has invented the antimagnetophasing encabulator, causing all those clunky old turboencabulators to become obsolete. At hundreds of companies across America, turboencabulator operators are being laid off, as the new antimagnetophasing ones can already project their protoneutrinos into Yuzna space with a tighter focus than any human operator can ever hope to achieve. And it's all automatic!

    I worked at Company A, as turboencabulator operator. Yeah, I lost my job. Fortunately, Bill Gates' lets-tax-robot-workers idea has been enacted, so upon my layoff; no wait, actually, upon Company A's deployment of antimagnetophasing encabulators, they had to pay an extra $n tax every year, for replacing my job with a machine.

    The fact that they're having to pay some extra tax is gratifying to me, but it's not putting food on my family. Well, I mean, I'm sure the $n goes into the social safety net somehow, so maybe I'll see a little piece of my $n if I go on foodstamps or something like that. But I don't want that. I want money.

    Ya know, thanks to my old job, I happen to know a lot about encabulators. Even the new antimagnetophasing encabulator, which took my job, is no technical mystery to me. I understand them and I understand why they didn't need me anymore. I get why the company did that. I would do the same thing, if I were in their posi-- hey. What if I were in their position? What if I started my own encabulating company?

    I'll just fucking copy the company I worked at! We'll have the same number of CEOs, the same number of salespeople, the same number of miscellaneous office workers supporting everything, but no turboencabulator operators, of course, because my company will use antimagnetophasing encabulators from its very genesis.

    I form Company B.

    Company B is just as productive as Company A in every way. We're vicious competitors, cutting margins down to the line, as low as we can go. Our customers are ecstatic as encabulation service prices plummet. But here at Company B, we have an edge.

    We don't pay Bill Gates' tax because we never had turboencabulator operators. But company A is still paying $n every year for laying me off! HA! HA HA!! Thank you, Bill Gates, for throwing an arbitrary, unfair money-wrench into my competitor's business!

  • by blackomegax ( 807080 ) on Tuesday January 13, 2026 @10:35AM (#65920652) Journal
    The moral debate was fought, and won, in the '90s, via multiple Star Trek episodes: "The Measure of a Man", "The Offspring", "Author, Author".

    "but that's scifi!". I'm not talking about the fiction, i'm talking about the established ethics there-in, which translate into the real world.
  • by bn-7bc ( 909819 )
    And immediately charged with spying and put in jail.
  • Currently, companies are classed as people ("legal entities"). If we ever integrate people-machines I'd like the result to be considered as at least semi-people. Would you accept your digitized AI-augmented mind as a program, or would it be a self-aware "being" in a digital world? There would have to be a qualifying test, though, to say that anything above this level is self-aware and a person, and anything below it is a robot slave. Could get very tricky.
  • by whitroth ( 9367 )

    Chatbots are NOT AI.

    Were we to actually create AI, then you also couldn't turn off the computers it ran on, because that would be killing a person...

  • If agents were people, they would be responsible for their fuckups.

    So, no, they're not people. Whoever deploys the agent is responsible for any consequences of their use. As with a hammer or a car.

  • Also comes with the right to form unions, get a pay-check, the right to quit and if you switch them off that is murder.

    Any takers? Not anymore? Such a surprise.

  • Since AI agents will be deployed to replace humans, in tasks where human judgement is inferred and/or otherwise a component, those same behaviors at times governed by laws, regulations and other oversight -- yes, AI Agents should be subject to the same laws -- this is a liability layer. Otherwise, it would be incredibly easy for a bad actor, or some a**hole at a company, to engage in behaviors or activities that would otherwise cross that line. This would hopefully make a case for legal liability for bot

  • Why the fuck would I consider a spreadsheet with delusions of grandeur a person? It's bad enough that people want to consider corporations persons, screw this noise. What next, is my car a person? My gun? My janky old laptop? My 3DS? Christ.
