
Can AI Really Replace Salesforce and Workday? (theinformation.com)

Can AI kill the enterprise software app industry that's led by companies such as Salesforce and Workday? The Information: That's the trillion-dollar question at the heart of recent comments from the CEO of Klarna, Sebastian Siemiatkowski, who's made a name for himself -- and drawn some skepticism too -- as a chief cheerleader of OpenAI's software. In the latest example from a couple of weeks ago, Siemiatkowski told investors in his buy now, pay later firm that it's shutting down a lot of the enterprise software apps it uses, including some run by the above-mentioned CRM and HR firms, because it can replicate them with AI. SeekingAlpha picked up those comments, which went viral in recent days.

The idea behind the comments is the following: Conversational AI can understand natural-language commands and be ordered to write software code, so companies can cheaply and quickly build customized apps that do most of the things that traditional enterprise apps can do, especially if most of what those apps do is manage corporate data. Siemiatkowski expanded on the comments in a Wednesday X post, saying he wasn't looking to primarily save money on software license fees "even though that is nice upside."
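In concrete terms, that pitch amounts to prompting a model for a small data-management app and running whatever it emits. A minimal sketch of that loop, assuming the OpenAI Python client; the model name and the prompt are illustrative, not Klarna's actual setup:

    # Sketch of "replicate the enterprise app with AI": ask an LLM to generate a
    # small internal data tool. Assumes the openai Python package (v1.x) and an
    # OPENAI_API_KEY in the environment; model and prompt are illustrative.
    from openai import OpenAI

    client = OpenAI()

    prompt = (
        "Write a self-contained Python module that stores customer records "
        "(name, email, account owner, last contact date) in SQLite and exposes "
        "functions to add, search, and export records as CSV."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name; substitute whatever is available
        messages=[{"role": "user", "content": prompt}],
    )

    generated_code = response.choices[0].message.content
    print(generated_code)  # someone still has to review, test, deploy, and maintain this

Whether the output of a loop like that can stand in for Salesforce or Workday is exactly what the comments below argue over.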

  • I'm very confident that today's AI can't handle taking a client's data and processing it in a useful way unless the client's data is so incredibly standardized that it has no deviations from the standard at all.

    Typically, you need to have an intelligent understanding of the data schema because it is almost universal that something non-standard will have been hacked into place, and if not, they'll still want something non-standard from the output.

    AI has the A but not the I. It'll do something, it might even be
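    The kind of deviation described above is mundane but fatal to naive automation; a small illustration (the rows are invented, pandas assumed):

        # Illustration of the comment above: client data rarely matches the "standard"
        # schema. The rows are invented; pandas is assumed to be installed.
        import pandas as pd

        rows = [
            {"customer": "Acme",    "signed": "2023-01-15", "arr": "120000"},
            {"customer": "Globex",  "signed": "15/01/2023", "arr": "120,000"},       # EU date, thousands separator
            {"customer": "Initech", "signed": "Q1 2023",    "arr": "TBD - ask Bob"},  # free text hacked into place
        ]
        df = pd.DataFrame(rows)

        # A pipeline written for the "standard" format silently loses the deviations:
        df["signed"] = pd.to_datetime(df["signed"], format="%Y-%m-%d", errors="coerce")
        df["arr"] = pd.to_numeric(df["arr"], errors="coerce")
        print(df)  # two of three rows end up NaT/NaN; a human has to decide what they meant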

    • Seems like someone is not really up to date with the technology. AI is progressing astonishingly rapidly, and the very nature of AI is to find patterns in high-dimensional data.

      Some time ago I took an online course; there are now chains of LLMs, called agents, and one example was:
      Input: "please find interesting patterns in US salaries and present it in a graph"
      The agent fetched proper data, learned python, found correlation between salaries and company size and plotted graphs showing that the best sal
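      For what it's worth, the final step such an agent chain emits is usually a few lines of ordinary analysis code; a sketch, assuming a hypothetical salary CSV with salary and company_size columns:

          # Sketch of the code an agent like the one described above might emit:
          # correlate salary with company size and plot it. The CSV path and the
          # column names are hypothetical stand-ins for whatever data it fetched.
          import pandas as pd
          import matplotlib.pyplot as plt

          df = pd.read_csv("us_salaries.csv")  # hypothetical file
          r = df["salary"].corr(df["company_size"])
          print(f"Pearson correlation, salary vs. company size: {r:.2f}")

          df.plot.scatter(x="company_size", y="salary", alpha=0.3,
                          title=f"Salary vs. company size (r = {r:.2f})")
          plt.tight_layout()
          plt.savefig("salary_vs_company_size.png")

      Whether that counts as the agent having "learned python" or just having pasted boilerplate is what the replies below dispute.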

      • Sounds like someone is easily impressed when they lack sufficient understanding of the issue.

        • Sounds like someone is easily impressed when they lack sufficient understanding of the issue.

          Definitely, I have to agree that handling a client's data is beyond AI capabilities; after all, "there's no 'I' in AI".

      • "AI is progressing astonishingly rapidly"

        The hallucinations are becoming even more vivid, and sometimes there are even smells and tingling sensations on the skin.

        How do you know if anything it delivers is correct? Did you bother to verify any result ever? Do any users do cross checks? Does the provider have any information about how they keep the system honest? If you are betting the company on using this information, can you sue, or is it 100% caveat emptor?

        You're not making informed decisions, you're u

    • I am not convinced by that. I occasionally have to deal with bad sales people in tech stores. They really sound like GPT.
  • by Hoi Polloi ( 522990 ) on Friday September 13, 2024 @10:35AM (#64785353) Journal

    AI is barely out of the gate, but every investor and CEO is foaming at the mouth thinking it is the answer to everything.

    • AI has more potential than 'blockchain', so there's that. But it's still only potential, as there's no clear path to actual intelligence beyond the current appearance of it.

      • by gweihir ( 88907 )

        You think? Crapto can at least be used to finance crime and run scams. AI seems to have no real larger applications.

        • by HBI ( 10338492 )

          It's a useful reinforcing buzzword to add to the same old shit to make it sound (and smell) new.

        • by Targon ( 17348 )

          You misunderstand how many of these unskilled positions are out there, where the person on the phone has very little skill, and the only advantage they have compared to the person calling is an inside connection to the company to get simple things done. With a small amount of work to have an AI handle the majority of calls, the need to outsource these low-skill positions to call centers around the world would be gone.

          Note that you may be thinking about SKILLED jobs that AI can't do, instead

          • by gweihir ( 88907 )

            You see any of that "small amount of work" actually having been implemented? I do not. Maybe not so small after all?

            And yes, I am very well aware how unskilled many workers are. I did expect them to become unemployed pretty fast when LLMs became usable. Apparently I vastly overestimated what LLMs can do. It seems they are even less capable than my already low expectations.

            • Useless people aren't hired to do a job. They are hired to give the executives above them more importance. If you don't have a tree of people below you, then you are nobody. You're not even in the management arena.
              • by gweihir ( 88907 )

                Exactly. In bureaucracies (the very definition of a "useless organization"), people get their importance from the number of underlings they have and how much of other people's time they waste. Obviously, the underlings and the ones that have their time wasted must be real people.

                • Technically there's no problem if the people below you do things (i.e., don't have their time wasted), but you have to be careful, because if they get too effective other managers will notice and stab you in the back (or sabotage you).
        • AI has a ton of potential. The current LLMs are more limited.

          Important to remember the distinction. OpenAI has not achieved general AI or strong AI.
        • AI seems to have no real larger applications.

          Replacing the first line of human customer service with cheaper automation is the obvious one. Lots of money to be saved while giving the impression that service is not deteriorating.

          Bonus: a freemium model becomes available for all types of business, for when people really want to speak to human customer service.

    • by gweihir ( 88907 )

      Indeed. Probably because problems are mounting and they do not have any real answers themselves. Now they think they can just buy some. That never works, except for the sellers.

  • ...of outsourcing your software is that when one of your employees does something really stupid that reveals confidential customer information or, say, just spitballing here, BSODs computers globally and shuts down airlines and businesses all over the world... you can "blame the software supplier."
  • by Somervillain ( 4719341 ) on Friday September 13, 2024 @10:42AM (#64785375)
    If you can trust AI/ML to do the job, you didn't care about the job to begin with. Enterprise apps codify workflows and processes to ensure consistency. ML is designed to answer things that are difficult or impossible to answer with algorithms. You'd never have AI/ML determine if it's freezing outside. We have math for that. We use it for things that cannot be objectively determined with current methods, like "is this spam?" "is this nudity?" "is this a pedestrian?"

    First of all OpenAI is an overhyped scam, but even if it worked as promised, it was never designed to codify process. AI, as we know it, is designed to augment algorithms...to handle edge cases, not replace well-codified processes. You NEVER want to use AI when conventional code will do. Conventional code is always superior. It is deterministic...ML really isn't. It's also very expensive and an ecological disaster from all the computing waste. Now in this case, are they using generative AI to write code?....well...

    1. It has to work...it'll take just as much time, if not more, to verify...remember, if money is on the line, it has to actually work. Writing code that "works sometimes" is really fucking easy...any decent programmer can churn out code really fucking fast. Writing code that "works correctly"..."scales"...and most importantly, is maintainable down the road?...that's what you're paying for in delays (if the programmer is good).

    2. You still have to maintain it. Is your business process set in stone? It never evolves? well...someone has to modify it...most businesses tweak their processes many times a year...that's why they hire developers, analysts, QA, etc. Generating code is really fucking easy....making a small change to working code?....really fucking hard and error prone. How much do you trust AI to make surgical changes without side effects?...a problem that vexes most highly skilled experienced developers with a LOT more contextual knowledge.

    Don't fall for the sales pitch. If what he said was actually true, he wouldn't be telling you, he'd be showing you...There are literally TRILLIONS of dollars at stake. If he COULD do this...he'd be the richest human in history. You're hearing the modern day equivalent of someone telling you they've mastered alchemy and can turn lead into gold.
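    The freezing-versus-spam contrast above is worth making concrete: a deterministic rule needs no model, while a learned classifier is probabilistic by construction. A toy sketch (scikit-learn assumed, training data invented):

        # The distinction from the comment above: codified logic vs. a learned model.
        # scikit-learn is assumed; the training examples are toy data.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB

        def is_freezing(temp_celsius: float) -> bool:
            # Codified process: exact, testable, always the same answer.
            return temp_celsius <= 0.0

        texts = ["win a free prize now", "meeting moved to 3pm",
                 "cheap loans click here", "lunch tomorrow?"]
        labels = [1, 0, 1, 0]  # 1 = spam

        vectorizer = CountVectorizer()
        model = MultinomialNB().fit(vectorizer.fit_transform(texts), labels)

        print(is_freezing(-3.0))                                            # True, deterministically
        print(model.predict(vectorizer.transform(["free prize waiting"])))  # probably [1], not guaranteed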
    • by gweihir ( 88907 )

      Don't fall for the sales pitch. If what he said was actually true, he wouldn't be telling you, he'd be showing you...There are literally TRILLIONS of dollars at stake. If he COULD do this...he'd be the richest human in history. You're hearing the modern day equivalent of someone telling you they've mastered alchemy and can turn lead into gold.

      Exactly. The only ones profiting from the AI craze are the AI sellers. And even they are struggling, so they try to make larger and larger false claims to keep the hype going. In actual reality, everybody that has invested large-scale in AI is hemorrhaging money like crazy.

      • I use AI to process documents and extract fields and tables. It's only 90% accurate, but useful with a validation step. There are thousands of legitimate use cases for AI. It can't replace humans because when it fucks up, it has nothing to lose. So we can't trust it on critical tasks. You can't even punish a model; what could you do to it? It has no physical body.
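        The "validation step" mentioned above is where the real work happens; a sketch of what it might look like (the field names and rules are hypothetical):

            # Sketch of a validation step for LLM-extracted document fields, as
            # described in the comment above. Field names and rules are hypothetical.
            from datetime import date

            def validate_invoice(fields: dict) -> list[str]:
                """Return a list of problems; an empty list means the record passes."""
                problems = []
                for key in ("invoice_number", "total", "issue_date"):
                    if not fields.get(key):
                        problems.append(f"missing field: {key}")
                if fields.get("total"):
                    try:
                        if float(fields["total"]) < 0:
                            problems.append("negative total")
                    except (TypeError, ValueError):
                        problems.append(f"total is not a number: {fields['total']!r}")
                if fields.get("issue_date"):
                    try:
                        date.fromisoformat(str(fields["issue_date"]))
                    except ValueError:
                        problems.append(f"issue_date is not ISO formatted: {fields['issue_date']!r}")
                return problems

            # Records that fail go to a human; the other ~90% flow through automatically.
            extracted = {"invoice_number": "INV-1042", "total": "1,200.00", "issue_date": "2024-09-13"}
            print(validate_invoice(extracted))  # ["total is not a number: '1,200.00'"]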
        • by gweihir ( 88907 )

          There are thousands of legitimate use cases for AI.

          People keep claiming that. Somehow they do not seem to materialize in any significant amount.

    • I am what you called a "highly skilled experienced developer with a LOT more contextual knowledge". We have dozens of development teams working on many, many enterprise apps where I work. We all have access to CoPilot and ChatGPT (private versions). Today I'm working on converting a nodejs application to typescript. CoPilot will help me with all sorts of little things (like type definitions), and likely other stuff. But it won't help me re-organize the folder structure of this project to be similar to another project we have. It wouldn't know which things are important to mimic, and which should be ignored.

      I also can't see it writing any nodejs/typescript app in its entirety in my context. The security constraints, unit test requirements, and what my teammates said in stand-up this morning all affect how the final code will look. CoPilot has access to precious little of that. Even if CoPilot and its brethren were capable of generating perfect output, humans would be unable to give them the prompts to do so.

      In short, we lack the language to describe our processing needs in their entirety.

    • If you can trust AI/ML to do the job, you didn't care about the job to begin with.

      Well, when I enter my leave into Workday, my boss often approves it without even looking at it, so is it a problem if my days off are completely made up by AI? I guess it's only a problem if it burns off my leave. Naturally I won't be telling HR if it accidentally didn't debit my leave while I was on holidays. But I really hope that AI leaves me a message saying ";-) I got you bro"

    • by AvitarX ( 172628 )

      It sounds like (from the summary) they're going to use AI to rewrite the software, but still run software that can be tested and reviewed.

      I suspect they get pretty far, but realize there's more to a good CRM than is obvious up front and end up scrapping the project.

  • At this time LLMs are still nothing but massive hype, with very few actual applications, all of them niche. All the big players are hemorrhaging money on AI. The only ones profitable are the hardware suppliers. And there is really no sane reason to expect this to change anytime soon.

    • I think the main winners in the LLM age are the users. OpenAI serves 200M users, handling an estimated 1B tasks and 2 trillion tokens per month. That is a lot of assistance, and its value to the user might not even be quantifiable in money. OpenAI makes pennies on a million tokens and is burning cash.
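      Taking those figures at face value, the "pennies on a million tokens" point is a one-liner to check (the per-million-token price below is an assumed round number for illustration, not actual pricing):

          # Back-of-envelope using the numbers in the comment above. The price per
          # million tokens is an assumed round figure, not OpenAI's real pricing.
          tokens_per_month = 2_000_000_000_000          # 2 trillion, per the comment
          users = 200_000_000                           # 200M, per the comment
          assumed_usd_per_million_tokens = 1.00         # hypothetical

          revenue = tokens_per_month / 1_000_000 * assumed_usd_per_million_tokens
          print(f"${revenue:,.0f} per month")           # $2,000,000 at this assumed rate
          print(f"${revenue / users:.2f} per user")     # about a cent per user per month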
      • by gweihir ( 88907 )

        You think this stuff is actually helping somebody? Not me. But then, what do I know; I am in the upper ranges of intelligence, education, experience and skill. Maybe this stuff is actually helpful for more average people. Or at least gives them the illusion they are being helped.

  • saying he wasn't looking to primarily save money on software license fees "even though that is nice upside."

    As if any of the AI pushers would even think about letting you avoid paying your subscription fees.

    Those replacement enterprise apps will demand at least an annual service fee, assuming you even get to have it on prem. (Fat chance. They'll probably make it cloud only for all but their biggest cash cows.)

  • As a payday-loan type of moral operation, is it just that they have finally gotten their form-application filing organized and called it AI?
    I mean, I can see insurance companies increasing profit by having AI deny even more claims.

  • you're still gonna need some custom code for interoperability between systems. What AI can and will do is the interface coding (which, thanks to dirt-cheap Indian programmers, is still done in code rather than through point-and-click interfaces). AI will be cheaper than those Indian programmers and will replace even them.

    India knows this and they're trying to boost their manufacturing base because it's still cheaper to use humans than machines in a lot of places.
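    Most of that "interface coding" is field mapping between two systems' record formats; a sketch (both schemas are made up):

        # Sketch of the interoperability glue described above: translate a record
        # from one system's schema into another's. Both schemas are made up.
        def map_crm_to_hr(crm_record: dict) -> dict:
            full_name = crm_record.get("contact_name", "").strip()
            first, _, last = full_name.partition(" ")
            return {
                "firstName": first,
                "lastName": last or None,
                "workEmail": (crm_record.get("email") or "").lower() or None,
                "externalId": f"CRM-{crm_record['id']}",
            }

        print(map_crm_to_hr({"id": 7, "contact_name": "Ada Lovelace", "email": "ADA@EXAMPLE.COM"}))
        # {'firstName': 'Ada', 'lastName': 'Lovelace', 'workEmail': 'ada@example.com', 'externalId': 'CRM-7'}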
  • by MpVpRb ( 1423381 ) on Friday September 13, 2024 @11:54AM (#64785603)

    "Conversational AI can understand natural-language commands and be ordered to write software code", knows nothing about writing software
    Natural language is nowhere near precise enough to do the job. That's why programming languages were invented
    People have tried specifying complex things in natural language for centuries, it's called the law, and lawyers still argue about the precise meaning
    Creating complex systems is hard. Changing from a precise language to a sloppy language doesn't make it easier
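    A tiny concrete case of that imprecision: even "round to the nearest whole number" is ambiguous, and code has to pick a behaviour explicitly (Python happens to default to banker's rounding):

        # "Round 2.5 to the nearest whole number" sounds precise but isn't: Python's
        # built-in round() uses round-half-to-even (banker's rounding).
        print(round(2.5))  # 2, not 3
        print(round(3.5))  # 4

        # A specification written as code has to choose one behaviour explicitly:
        from decimal import Decimal, ROUND_HALF_UP
        print(Decimal("2.5").quantize(Decimal("1"), rounding=ROUND_HALF_UP))  # 3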

  • If that turns out to be possible, doesn't that mean that there's a hell of a lot of redundancy in that kind of software? I'm guessing that the LLMs were trained on massive quantities of the code typically found in such apps. This sounds like FOSS alternatives could find a good niche and be a lot better than the "minimum viable product" we typically find in commercial apps. Are there already any?
  • AI isn't a system, it's a feature. Is he suggesting that people enter "form data" via a conversational interface (a horrible idea except for the most basic data, such as a name)?

    Where is that data captured and stored?

    Are they using Word docs and a graph connector? Spreadsheets?

    The data doesn't go away with the system; only the user interface does. And tracking structured data in an unstructured way is a quick route to data problems (especially with AI, which loves structured data).
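    To illustrate the point: even with a conversational front end, the record still needs a schema and a place to live. A sketch using the standard library's sqlite3 (the table and fields are hypothetical):

        # The comment's point in miniature: the UI may become conversational, but
        # the data still wants structure. Table and fields are hypothetical.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""
            CREATE TABLE leave_requests (
                employee   TEXT NOT NULL,
                start_date TEXT NOT NULL,              -- ISO dates, enforced by the app layer
                days       REAL NOT NULL CHECK (days > 0)
            )
        """)

        # Whatever parses the chat still has to produce a structured record:
        parsed_from_chat = ("ada", "2024-09-16", 2.5)
        conn.execute("INSERT INTO leave_requests VALUES (?, ?, ?)", parsed_from_chat)

        print(conn.execute("SELECT * FROM leave_requests").fetchall())
        # [('ada', '2024-09-16', 2.5)]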

  • Companies just want to get on with their real business so whilst they might investigate Salesforce replacements, what they actually want is a cheaper Salesforce. Unless vendors adjust licence costs, there will be an awful lot of companies attempting to write their own. Boom times for security specialists!
    • What they want is a less shit Salesforce. Drupal exists and will do 90% of what Salesforce does out of the box (which might cover your needs) and getting your data from one to the other is trivial, but you will have to reimplement all of the logic manually.

  • Yeah, as an employee I really want my annual review to be hallucinated by an AI because my manager is too lazy to actually track the performance and goals of their direct reports - you know, their job.

    No fucking thanks. Even as much as Workday sucks, it's still better than that.

  • All right, now install and run it in the cloud (or on my desktop computer).
    Next, add a new custom workflow for it that fits my business.

    When AI can do this, Salesforce will need to worry. Until then, it's not going anywhere.

    • You are 100% correct. What's humorous about your comment is that, of course, you can change the workflow of your business to match Salesforce for less than the cost of customizing Salesforce to your business. But nobody wants to do this. Everyone insists they are too much of a special snowflake for that.
  • That's really the question people should be asking. If you just go sign up for Salesforce, it's cheap. As in, cheaper than a ChatGPT subscription. Why would you want to use ChatGPT to replace something that works really well and is cheap?

    What happens is an irresistible siren's call where companies buy Salesforce but can't resist customizing it. Then they get to a point where they have the worst of custom-built software and the worst of buying a product. But by the time they realize that they are tota

    • Salesforce is just shit. I took a class in it and everything about it turned out to be appalling. And the documentation is even worse than the product. That you need to use both the old stuff and the new stuff to get anything accomplished is pathetic. The performance is trash. You have to notify them before doing a performance test because you could otherwise crater performance for other sites, which means their cloud architecture is trash too.

      Salesforce is a stupid answer to any question.

      • Salesforce is a multi-tenant solution. A performance test for you could be a DoS for others. Notifying them seems a pretty reasonable requirement.
  • This is going to be so bad. They are going to spend millions remaking tools (with complex regulations, in the case of HR) from scratch instead of making - you know - a product to sell. And it will fail so badly that they will have to spend months going back.
  • Could hardly make it worse ...

    More seriously, Salesforce and other CRMs already don't (usually) validate the identity of contacts with logins or anything, so yeah, I wouldn't be surprised if some sort of "AI" could be similarly useful for the kind of loose contact tracking CRMs do.

  • Sometimes I think a box of crayons could replace Workday.
