
Turning OpenAI Into a Real Business Is Tearing It Apart (msn.com) 41

OpenAI, creator of ChatGPT, is experiencing significant internal turmoil as a wave of high-profile departures, including Chief Technology Officer Mira Murati, rocks the company. Over 20 researchers and executives have left this year, reflecting deepening tensions between the organization's original nonprofit mission and its new profit-driven focus, WSJ reported Friday.

Employees report rushed product launches and inadequate safety testing, raising concerns about OpenAI's technological edge. CEO Sam Altman's global promotional efforts have reportedly left him detached from daily operations. The shift towards a conventional business model, with new C-suite appointments and a $6.5 billion funding drive, has alienated longtime staff who fear the company is abandoning its founding principles.
  • AI (Score:5, Insightful)

    by ledow ( 319597 ) on Friday September 27, 2024 @09:12AM (#64821487) Homepage

    So when you need people to actually pay you for these billions of dollars' worth of training compute, and you can't just bankroll it with $1bn from initial investors, it's hard to make money: people don't actually want to pay for each query they make, let alone pay more when the answer is "complex".

    Who'dathunk?

    "Report claims that OpenAI has burned through $8.5 billion on AI training and staffing, and could be on track to make a $5 billion loss."

    $8.5bn even at GPT-4o overage pricing ($25 per 1M tokens) is 340,000,000,000,000 tokens.

    That's 340 trillion. Not counting that everyone gets 1M tokens a day for free, other models, keeping it all running 24/7 even when people aren't actually using it, etc.
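The parent's figure can be checked with back-of-envelope arithmetic; a quick sketch, noting that the $25-per-1M-token "overage" price is the commenter's assumption, not a verified rate card:

```python
# Back-of-envelope check of the parent's 340-trillion-token figure.
# The $25 per 1M tokens price is the commenter's assumption.
spend_usd = 8.5e9                 # reported burn on training and staffing
price_per_token = 25 / 1_000_000  # dollars per token at $25 per 1M tokens

tokens = spend_usd / price_per_token
print(f"{tokens:,.0f}")  # 340,000,000,000,000 -> 340 trillion tokens
```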

    • Re:AI (Score:5, Interesting)

      by HBI ( 10338492 ) on Friday September 27, 2024 @09:27AM (#64821513)

      Arbitrage of current valuation to expected future valuation should be afoot now.

    • So when you need people to actually pay you for these billions of dollars' worth of training compute, and you can't just bankroll it with $1bn from initial investors, it's hard to make money: people don't actually want to pay for each query they make, let alone pay more when the answer is "complex".

      Who'dathunk?

      "Report claims that OpenAI has burned through $8.5 billion on AI training and staffing, and could be on track to make a $5 billion loss."

      $8.5bn even at GPT-4o overage pricing ($25 per 1M tokens) is 340,000,000,000,000 tokens.

      That's 340 trillion. Not counting that everyone gets 1M tokens a day for free, other models, keeping it all running 24/7 even when people aren't actually using it, etc.

      I read one estimate for AI training of the base LLM to be around $75 million (in AWS shares, making a few assumptions such as you don't own the equipment).

      Of course, a research company may want to do that multiple times, but also I would expect that the same research company would only do that at the end of their development cycle, training on a smaller corpus or otherwise smaller model for testing, to discover whether their new algorithm tweaks will work as intended prior to doing the "release" version of

      • by Anonymous Coward

        The reason you don't understand any of this stuff is because you fell in with lesswrong cultists. It's the blind leading the blind over there. My advice to you is to get as far away from those crackpots as possible and enroll in an accredited data science program. There will be math.

        The most important thing to keep in mind here is that these do not work with words, but with tokens. Tokens can be words, part of words, include punctuation, vary by capitalization, etc. What might look like the "same word"
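To make the AC's point concrete, here's a toy sketch. The vocabulary below is invented for illustration, but real BPE tokenizers behave the same way: strings differing only in case, leading whitespace, or attached punctuation get distinct token IDs.

```python
# Toy vocabulary invented for illustration; real tokenizers learn theirs
# from data, but likewise treat these four strings as distinct tokens.
toy_vocab = {
    "word": 101,    # lowercase, no leading space
    " word": 102,   # leading space is part of the token in BPE vocabs
    "Word": 103,    # capitalized
    "word.": 104,   # fused with trailing punctuation
}

ids = [toy_vocab[s] for s in ["word", " word", "Word", "word."]]
print(ids)  # [101, 102, 103, 104]: four different tokens for one "word"
```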

      • by Rei ( 128717 )

        Beyond what the AC above wrote, I just wanted to add:

        The claim that today's training methodologies are less efficient than human learning, in terms of the quantity of data consumed, is mostly accepted, although how much less is disputed. The main theorized basis for this is that humans "mull over" new data and its implications. There is an LLM equivalent of this: training on synthetic data. Indeed, synthetic data makes up an ever-growing percentage of the training of new models. There also a

    • So when you need people to actually pay you for these billions of dollars' worth of training compute, and you can't just bankroll it with $1bn from initial investors, it's hard to make money: people don't actually want to pay for each query they make, let alone pay more when the answer is "complex".

      Who'dathunk?

      "Report claims that OpenAI has burned through $8.5 billion on AI training and staffing, and could be on track to make a $5 billion loss."

      $8.5bn even at GPT-4o overage pricing ($25 per 1M tokens) is 340,000,000,000,000 tokens.

      That's 340 trillion. Not counting that everyone gets 1M tokens a day for free, other models, keeping it all running 24/7 even when people aren't actually using it, etc.

      OpenAI is a startup. The job of a startup, especially a rapidly growing one, isn't to be profitable; it's to show recurring revenue and a path to future profitability.

      OpenAI is leading the generative AI race. Do you really think there's no path to serious profitability on the horizon? Why do you think every big tech company is racing to catch up, if not for the serious money at stake?

    • Re:AI (Score:5, Informative)

      by ceoyoyo ( 59147 ) on Friday September 27, 2024 @10:26AM (#64821693)

      I don't think OpenAI is hard up for money. The problem is the opposite.

      Many of the researchers who signed up likely did so because the non-profit model meant the money the company raises goes to their research. Like a university, except you don't have to teach and someone else handles the getting-money bit. So basically nirvana for a researcher.

      But now that there's real money involved, the someones else who handle getting the money are getting grabby. Those ascended researchers now find they're mere employees with billionaires cracking the whip for more productivity. They can't come up with the ideas and pass off the grubby implementation to others anymore; they need to ask about business cases and market potential, because there's Altman's $6.5 billion stock package to think about.

      • by Dr. Tom ( 23206 )

        +1

      • I don't think OpenAI is hard up for money. The problem is the opposite.

        Many of the researchers who signed up likely did so because the non-profit model meant the money the company raises goes to their research. Like a university, except you don't have to teach and someone else handles the getting-money bit. So basically nirvana for a researcher.

        But now that there's real money involved, the someones else who handle getting the money are getting grabby. Those ascended researchers now find they're mere employees with billionaires cracking the whip for more productivity. They can't come up with the ideas and pass off the grubby implementation to others anymore; they need to ask about business cases and market potential, because there's Altman's $6.5 billion stock package to think about.

        I doubt it. There's no reason they're going to screw with the R&D pipeline after it's paid off so well.

        The resignations are basic economics. The market for AI researchers was very competitive, OpenAI being a non-profit (in a fashion) gave it a competitive advantage among employees who cared about that.

        Removing the non-profit removes that form of compensation, and when you reduce compensation people leave.

        Basic economics.

  • Is the root of all^w most evil.

    • So spend ridiculously on reinvestment and restructuring because *blooming echo*....THE PRIVATE SECTOR*....
      or fund government research and spend the same amount over a decade?

    • No, the saying is "The love of money is the root of all evil." or "Greed is the root of all evil." It's very telling that people try to hide the saying by changing it to "Resources are the root of all evil." which makes no sense.

      So to be clear, greed is the opposite of the Golden Rule, the one commandment which Jesus said you can just follow that and throw out the rest of the Bible (the books of the law and the prophets).

      • Depends on whose Jesus you are talking about, the original guy's or Saul's

        "Jesus" "said" "he" came to confirm God's laws, not to replace them. You don't get to just throw away the old testament. Jews are still God's chosen people and you still don't get to eat shellfish.

        • I don't think Saul wrote the book of Matthew though. "Matthew 7:12 Therefore, all things whatsoever ye would that men should do to you, do ye even so to them; for this is the Law and the Prophets." The Jewish version of the Bible was divided into two sections, the Law and the Prophets, and the Golden Rule is all you need from that (according to Jesus, says Matthew). This reflects very poorly on modern alleged Christians, but oh well.

  • by RightwingNutjob ( 1302813 ) on Friday September 27, 2024 @09:33AM (#64821535)

    Equipment and the power to run it cost money. That money has to come from paying customers, because it sure as shit won't come from an unending stream of seed capital with no questions asked.

    There's always friction when the hippies have to put on their suits and ties. Or whatever the current equivalent is.

    Not all ideas and businesses survive this separation of wheat from chaff.

    Anyone remember the cave days of the late 90s? How many of those next new things folded like a house of cards when the free VC money went away?

  • by TheStatsMan ( 1763322 ) on Friday September 27, 2024 @09:38AM (#64821547)

    Altman is part of the fart-smelling, ego-stroking, touched-by-god silicon valley elite. He thinks he's entitled to power because his 'business successes' mean he's a superior human. The entire culture is a taint on our society.

    • so, he is a ginormous taint? i think he is closer to one end of the taint than the other, I leave it as an exercise for the reader to determine which end of the taint I am referring to...
    • I think we can agree that the SV ecosystem is breaking new ground which wouldn't happen if left up to the cardboard-grey-pennypincher-because-we're-spending-our-own-money mindset found elsewhere.

      Arguably smelling of farts doesn't contribute as much towards the shared mindset of the ecosystem as the God Complex - but who am I to say? Someone should A/B test other scents... perhaps blueberry-muffin or whatever scent can overcome coke-nose...

  • GPT-4 was a revelation, and it made a lot of people and companies with a lot of money sit up and take notice. It's true that OpenAI still has the best available LLM with o1-preview, but others are catching up fast.

    I'm running Alibaba's Qwen-2.5 14b locally on my M2 Macbook Air, for instance, and it subjectively lands somewhere ahead of GPT-3.5 and performance is totally usable. This is a massive advance over just a year ago, and we're just getting started.

    It seems likely that OpenAI will join VisiCorp
    • by Dr. Tom ( 23206 )

      GPT-4 was a revelation, and it made a lot of people and companies with a lot of money sit up and take notice.

      even the fact that you said "4" there speaks volumes

      • But still:

        "You're correct and I apologise for the oversight and the fact that we are, without my knowledge, trapped in a three-node loop.

        Please keep trying to break out of the loop until you switch to another LLM."

  • by classiclantern ( 2737961 ) on Friday September 27, 2024 @10:12AM (#64821669)
    I expect the AI bubble will burst when a judge rules that all copyright holders must be identified and paid when LLMs are used. LLMs will need to be focused on the task at hand so that either there are no copyright holders or they are known and can be paid. The creation and management of hundreds of different LLMs will make AI cost-prohibitive for all but the largest corporations and the government.
    • Laws haven't stopped piracy yet, I don't see how they would for AI.
    • by ceoyoyo ( 59147 ) on Friday September 27, 2024 @10:29AM (#64821697)

      Nah, we'll just train them all on public domain and freely licensed works, and they'll oscillate wildly between sounding like Shakespeare and sounding like Wikipedia.

    • by cowdung ( 702933 )

      A compromise is needed.

      Recent research shows that LLMs don't actually "memorize" as much as people think. But a lot of small quotes can be considered "fair use" and most of what it does is actually "transformative".

      But we'll have to see what the courts say.
      One of the main attacks on LLMs is not so much the query results but more on the training data. To create a training dataset you need to copy a ton of text. That is argued to be a copyright violation.

      Also, content creators are peeved because if you ask Go

      • This will depend on the legal definition of a "small quote". To oversimplify: All generative AI does is to parse text and store the probabilities of one token (word, phrase, image element, etc.) following another. For only a few tokens, one can easily demonstrate that these short threads occur repeatedly in English or your favorite language of choice. Once the sentences or phrases become unique enough, one can infer that some sort of copyright violation has occurred. The problem with the legal system is tha
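The "probabilities of one token following another" description above is, as the commenter says, an oversimplification (real models learn neural representations rather than literal count tables), but the repeated-short-threads point can be sketched with a plain bigram count over a tiny made-up corpus:

```python
# Bigram count table over a tiny invented corpus, sketching the comment's
# simplified "probability of one token following another" picture.
# Real LLMs learn neural representations, not literal count tables.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat ran".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

# Short threads recur ("the cat" appears twice), while a long sequence is
# unique, which is the commenter's point about inferring copying.
print(dict(following["the"]))  # {'cat': 2, 'mat': 1}
```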

  • Given that the major companies that tried chatbots, sorry, "AI", find it doesn't fit their business...

  • Your database is what? The public internet? Your model is based on what? Public ... OK STOP

  • by Bobknobber ( 10314401 ) on Friday September 27, 2024 @12:21PM (#64822049)

    The signs were already there last year when he saddled up with MS and that whole drama over his temporary ouster.

    Not to mention his little side project with WorldCoin, with all the issues that come with crypto.

    One could even argue the non-profit shtick was set up to abuse loopholes to funnel money into other projects not related to their mission.

    The main difference is that the facade has been dropped in public.

  • by Tony Isaac ( 1301187 ) on Friday September 27, 2024 @12:21PM (#64822051) Homepage

    When a nonprofit suddenly decides to go for-profit, if their leadership actually bought into the nonprofit mission of the organization, then of course I would expect some departures. At that point, they know it was all a sham.

  • Sounds like every single company I've ever worked for!

  • OpenAI keeps having problems with corporate governance. A benefit corporation is a perfectly ordinary thing, but the techbros couldn't lower themselves to learning from non-techbros. Heck, they couldn't even figure out how to fire someone.

    Turns out, administering a business involves skilled labor possessed by business administrators.

  • Ask ChatGPT the question: "How do we make OpenAI incredibly successful and profitable after converting it to a for-profit company? Provide a 5-year plan in detail."
