AI Technology

Meta's Free AI Isn't Cheap To Use, Companies Say (theinformation.com) 18

Some companies that pay for OpenAI's artificial intelligence have been looking to cut costs with free, open-source alternatives. But these AI customers are realizing that oftentimes open-source tech can actually be more expensive than buying from OpenAI. The Information: Take Andreas Homer and Ebby Amir, co-founders of Cypher, an app that helps people create virtual versions of themselves in the form of a chatbot. Industry excitement this summer about the release of Llama 2, an open-source large language model from Meta Platforms, prompted the duo to test it for their app, leading to a $1,200 bill in August from Google Cloud, Cypher's cloud provider. Then they tried using GPT-3.5 Turbo, an OpenAI model that underpins services such as ChatGPT, and were surprised to see that it cost around $5 per month to handle the same amount of work.

Baseten, a startup that helps developers use open-source LLMs, says its customers report that using Llama 2 out of the box costs 50% to 100% more than for OpenAI's GPT-3.5 Turbo. The open-source option is cheaper only for companies that want to customize an LLM by training it on their data; in that case, a customized Llama 2 model costs about one-fourth as much as a customized GPT-3.5 Turbo model, Baseten found. Baseten also found that OpenAI's most advanced model, GPT-4, is about 15 times more expensive than Llama 2, but typically it's only needed for the most advanced generative AI tasks like code generation rather than the ones most large enterprises want to incorporate.
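The core of the cost comparison above is simple arithmetic: a pay-per-token API bills only for work done, while a rented GPU bills for wall-clock time whether it is busy or not. A minimal sketch of that comparison, where every rate (API price, GPU hourly cost, single-stream throughput) is an illustrative assumption rather than a quote from any provider:

```python
# Back-of-the-envelope comparison of per-token API pricing versus
# renting a cloud GPU to serve an open-source model. All numbers
# below are assumptions for the sketch, not real provider quotes.

API_PRICE_PER_1K_TOKENS = 0.002   # assumed GPT-3.5-Turbo-class rate, USD
GPU_PRICE_PER_HOUR = 2.50         # assumed A100-class cloud rental, USD
GPU_TOKENS_PER_SECOND = 30        # assumed single-stream Llama-2 throughput

def api_cost(tokens: int) -> float:
    """Cost of processing `tokens` via a pay-per-token API."""
    return tokens / 1000 * API_PRICE_PER_1K_TOKENS

def gpu_cost(tokens: int) -> float:
    """Cost of the same tokens on a rented GPU, assuming it is billed
    for exactly the time spent generating (the best possible case)."""
    hours = tokens / GPU_TOKENS_PER_SECOND / 3600
    return hours * GPU_PRICE_PER_HOUR

if __name__ == "__main__":
    monthly_tokens = 2_000_000
    print(f"API: ${api_cost(monthly_tokens):.2f}")
    print(f"GPU, perfectly utilized: ${gpu_cost(monthly_tokens):.2f}")
    # A GPU that sits mostly idle costs its hourly rate regardless,
    # which is how a small workload produces a large cloud bill.
    print(f"GPU, rented 24/7 for a month: ${GPU_PRICE_PER_HOUR * 24 * 30:.2f}")
```

Under these assumed rates the API handles two million tokens for a few dollars, while even a perfectly utilized rented GPU costs an order of magnitude more at single-stream throughput, and an idle one far more still, which matches the Cypher founders' experience.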

This discussion has been archived. No new comments can be posted.

Comments Filter:
  • I just hack into Putin's servers and run bots for free.

  • First dose free (Score:4, Insightful)

    by alexru ( 997870 ) on Friday November 03, 2023 @04:07PM (#63977756)
    This is just a preview of the bills you will be paying once OpenAI has 99% of the market and killed all the competition. It is hard to compete with a "business" that loses money and has no chance of getting profitable without jacking up the prices.
    • This is just a preview of the bills you will be paying once OpenAI has 99% of the market and killed all the competition.

      It is hard to compete with a "business" that loses money and has no chance of getting profitable without jacking up the prices.

      That, or renting GPUs in the cloud is super expensive.

The article is paywalled so it's hard to say what the actual cost is, but I suspect the difference is that ChatGPT is running with high demand 24x7. They'd scale resources up or down depending on the time of day, but they'd use dedicated machines and GPUs and keep them close to 100% utilized.

If your workload is smaller than that, a dedicated cloud GPU is largely idle and a massive cost drain, so you basically need on-demand resources, i.e., check out a GPU when it's needed.
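The utilization argument in this comment can be made concrete: a dedicated GPU only beats per-token API pricing once it is busy enough of the time. A sketch of that break-even calculation, with all rates (API price, GPU rental, batched throughput) being illustrative assumptions:

```python
# Sketch of the utilization break-even: what fraction of each hour
# must a rented GPU be busy before it matches per-token API pricing?
# All rates are assumptions for illustration, not real quotes.

API_PRICE_PER_1K = 0.002      # assumed per-1K-token API rate, USD
GPU_PER_HOUR = 2.50           # assumed hourly GPU rental, USD
TOKENS_PER_SECOND = 500       # assumed throughput with heavy batching

def breakeven_utilization() -> float:
    """Fraction of each hour the GPU must spend generating for the
    rental to cost the same as the API would for the same tokens."""
    tokens_per_busy_hour = TOKENS_PER_SECOND * 3600
    api_cost_per_busy_hour = tokens_per_busy_hour / 1000 * API_PRICE_PER_1K
    return GPU_PER_HOUR / api_cost_per_busy_hour

print(f"break-even utilization: {breakeven_utilization():.0%}")
```

With these assumed numbers the GPU must be busy roughly 70% of the time to break even, which is why a big provider running batched dedicated hardware near full utilization can undercut a small tenant whose GPU mostly idles.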

    • But that's the beauty of an open source model - OpenAI can kill off every competitor tomorrow and you can still start spinning up Llama 2 instances the moment OpenAI prices go up more than 2x.

But given that their competitors are presently entities like Google and Amazon, it's pretty hard to imagine them winning the market by just burning money. Even with very generous investors they are not going to have that kind of cash.
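The "stay ready to switch horses" point above is usually implemented by putting the model behind a single interface so the backend can change without touching application code. A minimal sketch of that pattern; the class names and stubbed responses are hypothetical, and a real version would make actual API or HTTP calls inside each backend:

```python
# Minimal sketch of a vendor-swap abstraction: application code talks
# to one wrapper, so moving from OpenAI to a self-hosted Llama 2
# endpoint is a configuration change. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Completion:
    text: str
    backend: str

class OpenAIBackend:
    def complete(self, prompt: str) -> Completion:
        # A real implementation would call the OpenAI API here.
        return Completion(text=f"[openai] {prompt}", backend="openai")

class Llama2Backend:
    def complete(self, prompt: str) -> Completion:
        # A real implementation would hit a self-hosted Llama 2 server.
        return Completion(text=f"[llama2] {prompt}", backend="llama2")

class ChatService:
    """Application code depends only on this wrapper, never on a
    specific vendor's client library."""
    def __init__(self, backend):
        self.backend = backend

    def ask(self, prompt: str) -> Completion:
        return self.backend.complete(prompt)

svc = ChatService(OpenAIBackend())
# Prices went up more than 2x? Swap the backend; callers are unchanged.
svc.backend = Llama2Backend()
```

This is the cheap insurance the comment describes: the open-source model does not have to be the daily driver to discipline the incumbent's pricing, it only has to be one line away.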

If it costs you more to run free software yourself than to buy the same free software from somebody, that's because you don't actually have any engineers, and you need to hire somebody to be the engineers you were too cheap to hire.

    That's not a problem with Open Source. That's because you're not actually a tech company, you're.... I don't know what you are. But it ain't a tech company. Tech companies hire engineers.
    • by alexru ( 997870 )
It is not the labor cost, it is the cloud bill that shows how expensive those things are to run. The same bill OpenAI's investors eat at the moment. But at some point they will want their money back.
      • The same bill OpenAI investors eat at the moment. But at some point they will want their money back.

        Sure, the prices will change and are likely to equalize somewhat compared to what is reported here. But you have to shop around and go with what's best for now, while staying vigilant and technically ready to switch horses when somebody else takes the lead. That's business.

      • You've gotta wonder what revenue streams their profit projections turn to in order to have a business case that the investors will continue to fund. Certainly wouldn't make sense as a search service or something the average consumer would utilize. They would likely be the product, then. What are they planning?
        • by alexru ( 997870 )
Most likely the business plan is to sell to Microsoft. And Microsoft will find a way to ram it down everyone's throat. Plus MS already has the cloud infrastructure.
They set up their own infrastructure, which is likely not used as much, with a new AI platform that is not as mature, hasn't had as much time to be optimized, and doesn't have the advantage of running at scale.

How is this news? The same could have been said of pretty much any service... database, hosting, ....

    • by ceoyoyo ( 59147 )

      They setup their own infrastructure

      No they didn't. Microsoft gave them a bunch of free Azure time. When that runs out they'll have to raise prices, go broke, or convince MS they should continue subsidizing.

      Shockingly, paying for cloud services costs more than not paying for them.

      • by youn ( 1516637 )

        Yes they did.

        I bet most people using the cloud would disagree.
_ Virtual infrastructure is still considered infrastructure.
_ It also still needs to be set up, even if it is virtual.

        • by ceoyoyo ( 59147 )

          From what I've seen, most people using the cloud might disagree with basic math, yes.

  • I just deployed Llama 2 with the big model on an Orange Pi 5+ with a $200 TPU. Performance was much better than GPT-3.5 Turbo.
  • The dirty secret of LLMs, ML and big AI in general is that it's insanely expensive to run and maintain these models, in terms of data, processing power, and upkeep of models. None of this will reasonably scale to a mass marketable, cost effective product, pretty much ever. Don't tell the investors, though!
