Open Source Software AI

SaaS Apocalypse Could Be OpenSource's Greatest Opportunity (hackernoon.com) 78

Longtime Slashdot reader internet-redstar writes: Nearly a trillion dollars has been wiped from software stocks in 2026, with hedge funds making billions shorting Salesforce, HubSpot, and Atlassian. At FOSDEM 2026, cURL maintainer Daniel Stenberg shut down his bug bounty program after AI-generated slop overwhelmed his team. A new article on HackerNoon argues that most commercial SaaS will inevitably become OpenSource, driven not by ideology but by economics. The author points to Proxmox replacing VMware at enterprise scale and startups like Holosign replicating DocuSign at $19/month flat as evidence. The catch, the article claims, is that maintainers who refuse to embrace AI tools risk being forked, or simply replicated from scratch, by those who do.
  • What? (Score:5, Insightful)

    by SlashbotAgent ( 6477336 ) on Wednesday March 18, 2026 @02:08PM (#66048160)

    I can't be bothered to read the article after that summary.

    What a mess of conflated nonsense. Stock price, unrelated developer activity due to AI, free software replacing rapist vendor software... What does any of that have to do with open sourcing SaaS or SaaS "apocalypse"?

    If you wanna talk about AI slop, this sure looks like it.

    • Re:What? (Score:5, Insightful)

      by nightflameauto ( 6607976 ) on Wednesday March 18, 2026 @02:22PM (#66048210)

      I can't be bothered to read the article after that summary.

      What a mess of conflated nonsense. Stock price, unrelated developer activity due to AI, free software replacing rapist vendor software... What does any of that have to do with open sourcing SaaS or SaaS "apocalypse"?

      If you wanna talk about AI slop, this sure looks like it.

      Yeah, all of that, and the conclusion is developers *MUST* embrace AI, or risk being replaced by those who do. It's literally just an amalgamation of everything the AI Prophets and their sales henchmen have been preaching for the last several years. "GET ABOARD OUR HYPE TRAIN OR WE WILL RUN YOU DOWN WITH IT!" Meh, whatever.

      Open Source developers can develop however they want. Just because you can flood their bug system with AI generated slop doesn't mean you're running them out of development. These people need to get a grip.

    • Companies that sell software subscriptions. They took a huge beating because investors think, possibly rightly, that those services will just get replaced with AI solutions. If nothing else there's risk of market disruption that could wreck the sales of a big company.

      The thought here is that if the market becomes unprofitable, there's still some demand, and open source software will take that over.



      but of course there's a healthy amount of AI slop in the article at the end because the author was proba
    • I can't be bothered to read the article after that summary.

      What a mess of conflated nonsense. Stock price, unrelated developer activity due to AI, free software replacing rapist vendor software... What does any of that have to do with open sourcing SaaS or SaaS "apocalypse"?

      If you wanna talk about AI slop, this sure looks like it.

      Under normal circumstances I'd yell at you for that, that being my apparent role on this forum it seems, but it turns out I'm as annoyed by the blurb as you are. And it's too hot an

    • The so-called SaaS apocalypse, where *all* SaaS vendors will go out of business because of AI.

      The thing is, TFS can't even back that up - it even says "and startups like Holosign replicating DocuSign" - so a SaaS vendor, replicating another SaaS vendor. Fair play, they're doing it cheaper (by a long way), and it's largely because they could throw the app together quickly and can use AI to do some of the OCR type stuff they need. Truthfully though, it's not the hardest app in the world to start with, how Doc

  • Even I have started to simply use AI and well thought out prompting to "Write my own" forks of certain things. And they work...
    • Working is not the same as good.

      Good software is efficient, maintainable, and secure.

      • When you are talking STM / RP2040 projects like WLED and GP2040CE, forking a page or two of code to get something working that didn't before, is "No-code" now. I tell it the components/boards I am using, how they are wired, and tell it accurately "I want it to do this. Write that code and crosscheck it against previous bugs / issues, optimize it as best you can. Annotate EVERY possible line, so that I can try to understand what you did" And it just... works. (I can settle for works vs 10,000$ to a developer
        • by Ceiu ( 8681133 )

          Don't forget to add "don't make mistakes" to the prompt so it knows not to make any mistakes.

          • So far, every drop of the 500 or so pages of code its written for me, appear to be "By the book" Aka, there are no visible mistakes. Additionally the annotation, would help a laymen such as myself, locate them if there were any. It has never made me a drop of code that wouldn't compile and work as intended. For my use cases, it works. (I am also not trying to use it to re-write salesforce APIs, or write my own OS.... So, for simpler IO programming, its fairly reliable. I am not saying it codes like a season
            • What are you using to write 500 pages of error free code?

              I can't get 15 line bash scripts that work properly.

              • I've used SuperGrok to write and annotate a TON of Arduino Code for a number of projects. Sometimes, I let it write it ALL from scratch, and sometimes, I feed it code I need modified to do something more specific or entirely different. But I am telling it specifically the BoM and components involved, exactly how they are wired, the intended function, the exact details on the specifics of that function, and how it may interact with other functions, etc. And so far, I have not encountered any code that won't w
              • by bcoff12 ( 584459 )
                That really sounds like a you problem. Both ChatGPT and Claude have written a lot of code for me that has been life-changing. My billing process used to take me a complete workday to complete. It now takes an hour. Painful processes have been replaced by painless code. Do I know if the code is efficient, maintainable, or secure? I don't. Do I know if it works? Yes.
            • by gweihir ( 88907 )

              So far, every drop of the 500 or so pages of code its written for me, appear to be "By the book" Aka, there are no visible mistakes.

              You seem to have missed that LLM-type AI excels at creating that impression, not at actually making high-quality code.

              • But you missed the only two parts that actually matter... 1. "It can do programming that I currently cannot." 2. It generates working code. I never tried to say it was expertly coded. Just the opposite. But like an NVIDIA reference board, it's taking the FAQs and Manual, and writing code from that. And so far, accurately. Others may use different functions, use them in another way, or to a more efficient degree, or even make that same code in fewer lines with a more thought-out approach. But what I am gettin
            • every drop of the 500 or so pages of code its written for me

              What does that even mean? What's a page of code? Are you printing it? I have no idea how many pages of code the software systems I've worked on are. We've talked about lines of code and number of files, symbols, functions. Pages? No.

              • Do you not know how much of your screen a page is comprised of? Do you pot calling the kettle much? (These kids man...)
            • No mistakes visible to a layman with little coding skill in 500 "pages" of code doesn't exactly instill confidence in me, but maybe that's just me.
              • No mistakes visible to a seasoned veteran IT technician who has easily built and repaired 10,000+ Computers... OR are you just trolling? ;-)
        • I can settle for works vs 10,000$ to a developer, in most of my use cases

          A completely understandable, and valid, decision.

          It is also the logic that led to businesses being run on MS Excel spreadsheets. It is good enough for a one-off, but long term the technical debt is unsustainable.

          • I'm literally trying to sync up 24 stairs with RGB to a custom PCB made in KiCAD.... OR, trying to edit code to make an emulated analog "Axis" with steps out of a basic rotary encoder in GP2040CE. I think its gonna be ok ;-D
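            As a hedged illustration of the "emulated analog axis from a rotary encoder" idea above (not the actual GP2040CE code; the axis range and step count are assumptions), the core of the mapping boils down to accumulating quadrature steps and clamping them onto a signed axis range:

```python
# Hypothetical sketch (not GP2040CE's actual implementation) of mapping a
# rotary encoder onto an emulated analog gamepad axis.
# Assumptions: a signed 16-bit axis and 24 encoder detents for full travel.

AXIS_MIN, AXIS_MAX = -32768, 32767   # typical signed 16-bit gamepad axis
STEPS_PER_SWEEP = 24                 # assumed detents from min to max travel

def encoder_to_axis(step_count: int) -> int:
    """Map accumulated encoder steps onto the axis range, clamping at the ends."""
    span = AXIS_MAX - AXIS_MIN
    raw = AXIS_MIN + round(step_count / STEPS_PER_SWEEP * span)
    return max(AXIS_MIN, min(AXIS_MAX, raw))

# In real firmware, an interrupt or polling loop would increment/decrement
# step_count from the quadrature signals and report encoder_to_axis(step_count)
# as the axis value on each USB poll.
```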
            • by gweihir ( 88907 )

              It is fine for projects with no real damage potential. Most real-world software does not fall into that class though.

              • Between my stuff at 5v with proper fusing (HW design is key up front), or even modifying production software to add fields, features, and functions that are not inherent, the REAL trick is understanding WHAT you are asking for. 99% of people trying to use AI to code are not sitting on 30 years worth of IT experience, do not have a background in hardware modding and RE, and cannot technically express "What" they need the code to do. I think therein lies the difference. IF you speak search engine and resea
        • by gweihir ( 88907 )

          If that MCU does not control anything important, yes. If you end up, say, burning down or flooding a few houses, those $10k for a real developer may be a far superior choice.

          • Controlled Hardware design upfront can mitigate this 100%, and iterating the code after QA issues are found takes seconds. And can be cross referenced against multiple AI's to reveal coding styles and approaches of those individual developers. Your own pattern recognition just needs to be on point. The annotations alone, indicating what the AI is "Trying" to do, lead me to a framework for understanding that code. I might learn a few bad habits along the way, but learning directly from explained example, is
            • by gweihir ( 88907 )

              What amateur level nonsense. Good luck. You can only hope to never face liability.

              • Your dismissal is basically an admission of defeat. You addressed NOTHING that I actually said. You are simply stuck on your position. Unfortunately, the ability of a human to see 10 moves ahead, when coding, will soon be surpassed by machine coding algorithms that can see 2 million moves ahead. Technology reveals itself in its rapid advancement. 2 years ago, a simple prompt could not produce a convincing action movie scene... Now it can. In seconds. Everyone is radically underestimating that those buildi
              • He's not writing it for commercial purposes. It's hobby code. Besides, half of you on here sound like you think you are God's gift to programming and also think that your average programming sucks anyway. It's really amusing to those of us who aren't coders.

                Of course the AI code isn't perfect. It may contain bugs. Etc. Thing is, so does professional code. Since no one ever seems to take any kind of responsibility for any code produced in any venue, what difference does it even make how it's generated?

                Microsoft devel

      • by gweihir ( 88907 )

        Working is not the same as good.

        Good software is efficient, maintainable, and secure.

        And does follow KISS and the "Principle of Least Surprise" in its interface. And some other things.

        From available evidence, LLM-generated code violates all (!) of these aspects. Somebody here called LLM code "review resistant" so you cannot even reasonably fix those shortcomings. I guess a lot of people are in the process of writing themselves a mountain of technological debt. This will not and can not end well.

        Well, I guess if I ever want to do any work after retirement (in 10 years or so), I will have a la

        Unfortunately, the ability of a human to see 10 moves ahead, when coding, will soon be surpassed by machine coding algorithms that can see 2 million moves ahead. Technology reveals itself in its rapid advancement. 2 years ago, a simple prompt could not produce a convincing action movie scene... Now it can. In seconds. Everyone is radically underestimating that those building these AIs, ARE taking the things you are all seeing and finding into consideration, and then writing specific programs to work aroun
  • by irreverentdiscourse ( 1922968 ) on Wednesday March 18, 2026 @02:09PM (#66048168)

    Is slashdot taking fanfiction submissions now?

  • What a load of contorted nonsense.

    What is "OpenSource" anyway? Is that a word?

  • ...creating complex software is hard, regardless of the tools used
    Hypemongers have been spreading the fiction that AI tools allow the clueless to effortlessly create software
    What they actually create is bloated, inefficient, bug-ridden, insecure slop
    I do believe that companies with expert programmers will be able to use AI tools to create custom software for their in-house use. Some may choose to release it open source, some will keep it private
    I also believe that SAAS and software subscriptions will fade a

  • ...when I can cook at home?

    Sometimes I don't have the time to cook at home. Or it is too much trouble. Or I just want someone else to take care of the dishes and menu planning. Or I'm just lazy. But I'm not paying $1000 for a burger. Maybe $20 for the burger plus the service and convenience.

    Same thing with SaaS. Most SaaS is a huge ripoff, and has been even before AI. Now it is easier than ever to replicate SaaS. There might still be good reasons to choose SaaS, but it had better be much, much cheaper.
  • SaaS is not going away, though some of it may be replaced by AIaaS, at least for relatively simple things. But most companies paying for SaaS are paying for the support that comes with it.

    You can definitely vibe code a lot of crap, but once you get into regulated environments where liability is really high (GDPR, HIPAA, FedRAMP, etc.) it's risky to trust that a probabilistic interpretation of a prompt is compliant.

    • by gweihir ( 88907 )

      You can definitely vibe code a lot of crap, but once you get into regulated environments where liability is really high (GDPR, HIPAA, FedRAMP, etc.) it's risky to trust that a probabilistic interpretation of a prompt is compliant.

      And that is the kicker: Regulation (and liability) will dramatically expand in the next decades, because the damage from crap code has gotten unsustainably high. Clearly, the US will not be a leader in this space, but the EU is going to be. The only thing left to be decided is the speed, but we will get liability, regulation and qualification requirements basically for all commercial software and software that is part of a commercial product.

      As a side note, the EU is aware of the value of FOSS. They will fi

    • by jythie ( 914043 )

      I think this hits on the biggest reason that SaaS companies are not going to be going away. Throwing together a home grown (or fly by night consultant, since that segment seems to REALLY be hyping up these tools) solution is great for stuff where you can absorb the hit of things going wrong, but once you touch anything even remotely regulated, you are going to want those dedicated companies that produce these solutions and sign off on them.

      You don't even have to go as far as GDPR/HIPAA/etc... anyt

    • I'm still trying to figure out what they're trying to say. In at least one case, Docusign vs Holosign, it's two SaaS companies competing, so how is SaaS being threatened in any way? Stenberg's problems with AI slop have nothing to do with SaaS beyond maybe that it's easier to have a third party manage bug bounties, but that's a single application that, in a very specific way, is being threatened, not SaaS itself.

      Does the same issue cause problems for Salesforce? Are SF's customers switching to home grown CR

  • Refusing to embrace AI is not the same as being inundated with poor quality AI output.

  • If you open source a clusterfuck, isn't it still a clusterfuck?
  • by DarkOx ( 621550 ) on Wednesday March 18, 2026 @02:47PM (#66048250) Journal

    *most* of the value of SaaS is the "someone else is responsible" part. All you do is give them a credit card number, set up your tenant by filling in a few forms and creating some users, and you're done. Or at least it *can* be that easy; if you have a lot of people or are part of a big enterprise you are maybe setting up SSO or something.

    The reality is most of these SaaS projects are just slightly targeted CRUD apps. Sure, AI makes cloning them much faster, but any business with a sizeable software team could have already replicated them anyway. The point is:
    1) Leaving all the details to someone else
    2) having someone else to blame when things go wrong
    3) having a very predictable op-ex, you know the bill is $2500 every month, no surprises like a member of the financial software support team deciding to retire, IT having to hire a replacement, and your department getting billed for 33% of the hiring costs this quarter...

    The other end of the spectrum is truly unique software that tackles actually hard engineering tasks: very large scale shipping lane management; very complex industry-specific billing, payment, license, and legal management; telecom/conferencing; where things might actually involve proprietary algorithms, or significant infrastructure requirements not entirely available as some AWS or Azure PaaS service.

    I think the lower end of SaaS is probably in real trouble. Because you can vibe code your own and drop it in AWS/Azure to handle all your infrastructure, and be very much more in the driver's seat vs what SalesForce decides to 'let you do' with Lightning or Apex. AI tools, with their ability to let a single developer rapidly take and extend/customize a FOSS product, will suck the value out of your basic SaaS CRM/ERP/storefront providers in a way FOSS alone never managed.

    • Re: (Score:3, Insightful)

      by mitchy ( 34242 )

      This is spot on. There is also another huge driver that has nothing to do with AI: some companies strongly prefer variable costs (SaaS contracts) over fixed costs (salaries, FOSS-friendly DIY). They will make the decision thinking "this is not core to our business, we do not need to innovate here, we can get economy and speed to market by just paying someone else to provide the service" instead of building it themselves.

      AI definitely lowers the barrier for a DIY approach, but that does not answer the t

    • Once again, monthly subscriptions pay for the warm and fuzzies.

    • by jezwel ( 2451108 )

      ...having a very predictable op-ex, you know the bill is $2500 every month, no surprises...

      I don't know if you track this, but I am - and some SaaS companies love the annual renewal dance where they throw out eye-watering annual increases (25%+) and then graciously drop it a bit, like they're doing you a favour by only going up by 20%.

      • by DarkOx ( 621550 )

        Yeah, I see that stupid game a lot. Although my experience is you can usually talk them down to something more like 5%, if you really threaten to switch vendors and make a halfway compelling case that you're actually able to do so.

        I won't name names but a couple vendors in the SAST industry are especially notorious.

    • by jythie ( 914043 )

      I keep thinking back to that example in the article about how quickly and cheaply an AI company threw together a document signing service to compete with an existing commercial offering. Setting up such a chain is the easy part, and it works great if you don't really care if it works. Companies that act as trusted 3rd parties for such signing though, their software is often not that great, since that isn't really what they are selling.

  • I have an idea for a small app.

    Say it was something like an app to convert PDFs to DOCs.
    NB: It's not that, there are millions of apps that do that, but say it was something like that, but more niche.

    Would I be able to make as much for it if I open-sourced it, compared to charging a small fee for it as a SaaS website?
    And if so, how?

    Serious question.
    • by ceoyoyo ( 59147 )

      Would I be able to make as much for it if I open-sourced it

      No, and that's the point. I think Docusign is a great example. The backend is simple and there are lots of libraries in whatever language you like to do it. Turning that into Docusign involves making a bunch of web forms, e-mail integration, prettifying, documentation and other stuff open source developers don't like doing because it's boring and trivial. Docusign can charge whatever they charge because they're willing to do all that boring stuff. B
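      To illustrate the "simple backend" claim above, here is a minimal, hypothetical sketch of the core signing operation: hash the document and produce a verifiable signature over the digest. Stdlib HMAC stands in for the real public-key infrastructure a service like Docusign would use; the key and function names are illustrative, not any vendor's API.

```python
import hashlib
import hmac

# Illustrative only: a real service would use managed keys / PKI, not a
# hard-coded secret.
SERVER_KEY = b"example-secret-key"

def sign_document(doc: bytes) -> str:
    """Return a hex signature over the document's SHA-256 digest."""
    digest = hashlib.sha256(doc).digest()
    return hmac.new(SERVER_KEY, digest, hashlib.sha256).hexdigest()

def verify_document(doc: bytes, signature: str) -> bool:
    """Constant-time check that the signature matches the document."""
    return hmac.compare_digest(sign_document(doc), signature)
```

      The hard (and boring) part the parent describes — web forms, e-mail flows, audit trails, compliance — sits on top of a core this small.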

  • Software as a Scam!

    People will wise up eventually.

  • AI isn't, so while it can imitate intelligence with approximate regurgitations of training data, it can't apply knowledge in novel ways.

    What happens when nobody wants to share code because it's immediately vacuumed up by a corporation as training data when they do? AI falls behind proprietary code.

    The next big thing in programming is going to be closed code that manages to be opaque to AI.

  • by Tom ( 822 )

    or simply replicated from scratch

    Next week in these news: Companies who went to $random_startup_of_the_month found out that they now bought into being at the START of a five-year crusade to eliminate bugs, performance issues and usability problems.

  • So cURL is going to be forked because they refuse bogus AI PRs??
    • This. There's no need to embrace AI tools or add AI features. 99 times out of 100 the reason AI gets shoehorned into things is not functional, it's to impress investors, which all but a few open source projects don't have to worry about. Adding AI to an open source project is more likely to trigger a fork than prevent one.

  • by dskoll ( 99328 ) on Wednesday March 18, 2026 @05:55PM (#66048646) Homepage

    Maintainers risk being forked? That might not be a bad thing. Five or six Linus Torvalds, a bunch of Greg Kroah-Hartmans, etc. would be awesome!

    Of course, the downside is we might end up with 9 Matt Mullenwegs...

  • Let's see, we see lots of weirdness, and bugs. Ok, so the simple answer is to have AI write code that can be proven to be correct.

    That means AI to write in Ada,

  • How does one replace ERP / ERM frameworks? Is it even possible to write your own app with forms and data reports in a modern programming language such as Rust?
