AI Technology

AI is Entering an Era of Corporate Control (theverge.com) 47

An annual report on AI progress has highlighted the increasing dominance of industry players over academia and government in deploying and safeguarding AI applications. From a report: The 2023 AI Index -- compiled by researchers from Stanford University as well as AI companies including Google, Anthropic, and Hugging Face -- suggests that the world of AI is entering a new phase of development. Over the past year, a large number of AI tools have gone mainstream, from chatbots like ChatGPT to image-generating software like Midjourney. But decisions about how to deploy this technology and how to balance risk and opportunity lie firmly in the hands of corporate players.

The AI Index states that, for many years, academia led the way in developing state-of-the-art AI systems, but industry has now firmly taken over. "In 2022, there were 32 significant industry-produced machine learning models compared to just three produced by academia," it says. This is mostly due to the increasingly large resource demands -- in terms of data, staff, and computing power -- required to create such applications.


Comments Filter:
  • When I read the headline I thought it meant that corporations are using AIs to run themselves. Makes sense; the ideal CEO is a sociopath with no real human emotions.

  • No change there then (Score:5, Interesting)

    by anonymous scaredycat ( 7362120 ) on Tuesday April 04, 2023 @10:57AM (#63425030)

    Headline should really be: Software Still in Era of Corporate Control
    This will continue to be the case until RMS wins.

    • by kbonin ( 58917 ) on Tuesday April 04, 2023 @11:06AM (#63425050)
      This is a fight RMS can't win; you can't buy enough GPUs on the salary of a waiter to compete with a Wall Street or Sand Hill Road funded data center. The best we can do is to continue to advance the tools and open training data sets so that academic and non-corporate models and training remain usable. I look at OpenStreetMap - Google and Garmin may have better maps due to their funding, but most of the interesting stuff is being built using OSM since it's actually open.
      • Spam filtering is a good analogy for the recent history of deep nets: it was a very hot research topic in academia around the turn of the century, but as people gravitated towards more centralized email, largely for better spam filtering, the largest providers like Gmail built an insurmountable lead, because they had a broad view of billions of emails with which to detect spam campaigns, and because managing the problem requires some grunt work, which isn't what academia does.

        ChatGPT is now beyond what you

      • Agreed that no one (ordinary) person could afford to do so, but it could probably be done as a volunteer computing (https://en.wikipedia.org/wiki/Volunteer_computing) effort like SETI@home.

    • by TWX ( 665546 )

      RMS's biggest nemesis in winning is RMS.

      While I can appreciate someone having opinions on the various degrees of 'freedom' one sees in various projects, he vocally dismisses every single mainstream Linux distribution, including those that go out of their way to place walls between free and non-free software and require the user/admin to set parameters in order to use non-free software, only to endorse the most obscure and unusable distros solely because they have never even provided the option to use n

      • by gillbates ( 106458 ) on Tuesday April 04, 2023 @12:25PM (#63425292) Homepage Journal

        The reason we can still use Linux is because it's FOSS.

        Every piece of proprietary "free" software follows one of two trajectories: it's either abandoned (the founder dies, etc.), or it is bought by company X and either becomes a paid version or is killed off.

        The reason we can still use Linux after three decades is the fact that its open-source license has explicitly kept it free. When the developers need to make a living, they pass it off to someone else. Linux is like the Olympic torch of the software movement which is continually passed from one volunteer to another.

        Free (as in freedom) software is the only model that can work this way. Everything else will either have to be bought or abandoned. And yes, developers do deserve to be paid for their efforts, but everyone pays taxes. That is, contributing to the greater good has been a social contract responsibility since the beginning of civilization. If you can write code, your contribution may very well be an unpaid bug fix or patch, from which we all collectively benefit. Contributing to Open Source or Free Software projects is just your way of giving back to the community that has given so much to you. Even though you may not receive monetary remuneration for your efforts, consider how much more difficult your life would be, how much less you could earn, if instead of writing code, you had to write your own compiler (thanks gcc!), your own operating system (thanks gnu!), or your own kernel (thanks Linus!), or deal with the myriad of problems and conflicts of interest inherent in proprietary software.

        When developers contribute to FOSS, it is only "unpaid" work in the monetary sense - what they're really doing is investing in their future. Twenty years ago, Linux experience on your resume meant little to nothing; today, you won't qualify for some jobs without it.

  • by snowshovelboy ( 242280 ) on Tuesday April 04, 2023 @11:20AM (#63425116)

    Alexa doesn't run on the echo dot, it runs in amzn's datacenter. Google's search engine doesn't run on my PC, it runs in google's datacenter. AI has been under corporate control for at least two decades.

  • by TWX ( 665546 ) on Tuesday April 04, 2023 @11:22AM (#63425122)

    If anyone thinks that AI would somehow not be under the control of corporate entities, then they've clearly never paid attention to the copious amount of speculative fiction over the decades, nor to how the majority of large, ambitious projects are organized and funded.

    Basic research falls into three main categories: private companies self-funding because they see some kind of future in their long-duration, large-scale research (Xerox PARC, Bell Labs, etc.); well-funded educational institutions driving research through their tenured faculty; and government funding, either through business arrangements with the prior two, or within divisions of government agencies such as NASA's JPL.

    Each of these methods comes with strings attached, and none of them are steered towards personal control.

    • by timeOday ( 582209 ) on Tuesday April 04, 2023 @11:28AM (#63425138)
      You don't need to consult speculative fiction, all successful industries follow this path. There was a time when hobbyists could build a world-leading aircraft - that's the Orville and Wilbur Wright story. Obviously those days are long gone. Universities and hobbyists cannot build a Boeing 737 for you.
      • There was a time when hobbyists could build a world-leading aircraft - that's the Orville and Wilbur Wright story. Obviously those days are long gone. Universities and hobbyists cannot build a Boeing 737 for you.

        Happened with computers too. Konrad Zuse invented floating point, microcoding and pipelining in his parents' living room, single-handedly. He built the first thing that could reasonably be called a computer a couple of years later with the aid of a very small team. The first machine holding its progr

        • Happened with computers too. Konrad Zuse invented floating point, microcoding and pipelining in his parents' living room, single-handedly. He built the first thing that could reasonably be called a computer a couple of years later with the aid of a very small team. The first machine holding its program in RAM was then built a few years after that by three people as a project at a small university.

          Finish the story:

          Then computers became big, expensive things that cost millions of dollars and required extensive support staff. Then they were scaled down and became accessible to hobbyists and general consumers, and now everyone carries a powerful computer in their pocket, and hobbyists can design and build their own computer-based systems to do whatever they can dream up. Meanwhile, a simultaneous and competing trend is re-centralizing computation and making it available as a service, making "big iron"

          • No, designing and fabbing chips became pretty much the exclusive province of large businesses, and everybody else (including e.g. university researchers) uses off-the-shelf chips for the most part, has learned what to expect from them, and is overwhelmingly OK with that.

            That's what I expect for state-of-the-art large language models. Parties like researchers will learn how to do interesting research on top of existing LLMs, instead of creating them from scratch.

            • Bah. Designing and fabbing chips became the exclusive province of large businesses, but that led directly to the commoditization that makes them so incredibly cheap and powerful today. That's a feature, not a bug. Economies of scale FTW.
          • You can get hold of a computer system easily, but building a leading one by yourself? No, not even slightly.

            The closest you can get is taking an off-the-shelf SoC (i.e. a computer someone else made) and attaching some stuff to it. Even then it will certainly be a decent bit back from the state of the art. Actually making state-of-the-art computers is the province of large companies with a lot of resources, not a dude with some tinsnips hacking around in his parents' living room.

            Likewise you can make air

    • by mysidia ( 191772 )

      It's entirely predictable... And quite honestly, the scary thing is not so much that corporations will obviously be required to create and maintain AI. The scary thing is that corporations are also inherently very selfish.

      Meaning there is a good chance the best AI technologies going forward will be developed by corporations and then kept private for their exclusive use, a secret "special sauce" that benefits only them.

      Like imagine if "Google Search" was kept internal as a tool for

  • by nightflameauto ( 6607976 ) on Tuesday April 04, 2023 @11:30AM (#63425146)

    Now you can join the rest of us in being completely, 100% controlled by corporations. YAY! Servitude is awesome!

  • by WDot ( 1286728 ) on Tuesday April 04, 2023 @11:32AM (#63425152)
    AI is one of the most accessible fields to hobbyists without credentials or connections. Most of the papers are open access (arXiv, CVPR, ICLR, ICML), most of the code is open source, and oftentimes even the model weights are released. Many models can be trained on a single large GPU. The most popular frameworks and libraries for making your own AI are open source. There are even transformer language models (such as the smaller GPT-2s) and generative image networks (Stable Diffusion) that you can run yourself, on your own hardware.

    There is only one class of AI where there is significant "corporate control": large-scale generative models, where in the last couple of years a few large tech firms have found that they can simply increase the size of the models to get better results, and have not yet topped out even when training costs millions of dollars. Even then, hobbyists are chipping away at this advantage by distilling large models into small ones or getting the same model to run with fewer bits of precision. Million-dollar AI models may not be a permanent state of affairs.
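    The reduced-precision trick mentioned above can be sketched in a few lines. This is a toy illustration only (not any framework's actual API): store each weight as a signed 8-bit integer plus one float scale, cutting memory roughly 4x versus float32 in exchange for a bounded rounding error.

```python
# Minimal sketch of symmetric int8 post-training quantization.
# Illustrative only -- real systems use per-channel scales,
# calibration data, and packed storage.

def quantize(weights, bits=8):
    """Map floats to signed ints in [-(2^(b-1)-1), 2^(b-1)-1] plus a scale."""
    qmax = 2 ** (bits - 1) - 1                      # 127 for int8
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from ints and the scale."""
    return [qi * scale for qi in q]

weights = [0.02, -1.27, 0.5, 0.9981, -0.31]
q, scale = quantize(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
# Rounding error is bounded by half a quantization step (scale / 2).
assert max_err <= scale / 2 + 1e-12
```

    The core idea really is this small; the engineering in real quantized-inference stacks is in keeping accuracy acceptable at 4 bits and below.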
    • Hobbyists just need to beowulf cluster their resources. With naked and petrified Natalie Portmans dumping hot grits in their. . . . wait, forgot what year it was again.

    • by JBMcB ( 73720 )

      It's expensive to make general purpose models. Making very limited, purpose-specific models can be pretty cheap and is definitely open to hobbyists.

      The other side of corporatization is that nearly every medium to large sized university has a supercomputer of some type, and they often pool resources and loan computer time out to each other. The only thing stopping them from working on their own models is how much time other departments are using the supercomputer.

    • by godrik ( 1287354 )

      Really, the thing preventing you from building nice models is that you probably don't have access to enough data to make them work. Copilot is useful because it is trained on the entire GitHub corpus; ChatGPT is "good" because of the massive amount of text it is trained on.
      You can replicate some of that for pretty cheap.
      If you are trying to do anything else, you run into not having good labeled data to train on.

      • by WDot ( 1286728 )
        Sure, but how many technology products have "everything" as their scope? An individual with a fairly small labeled dataset for a specific task can get decent results by fine-tuning on a free pre-trained model and playing with regularization/augmentation options. They can do it with much less work, much less expertise, and far fewer resources and yet get better results than well-funded geniuses of 20 years ago.
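        The fine-tuning recipe described here can be caricatured in pure Python: treat the pre-trained network as a frozen feature extractor and train only a small head on top. Everything below is a hypothetical stand-in (the `frozen_backbone` function, the toy task, the tiny dataset); a real workflow would load a pre-trained model from a framework such as PyTorch, freeze its parameters, and train a new final layer.

```python
import math

# Toy sketch of head-only fine-tuning: the "pre-trained backbone" is
# frozen, and only a tiny logistic-regression head on top is trained.

def frozen_backbone(x):
    """Stand-in for a frozen feature extractor; its weights never change."""
    return [x, x * x]

# Tiny labeled dataset for the downstream task: is |x| > 1?
xs = [-2.0, -1.5, -1.2, -0.8, -0.4, 0.0, 0.4, 0.8, 1.2, 1.5, 2.0]
data = [(frozen_backbone(x), 1.0 if abs(x) > 1 else 0.0) for x in xs]

w, b = [0.0, 0.0], 0.0            # the only trainable parameters (the head)
lr, epochs = 0.5, 2000
for _ in range(epochs):           # plain batch gradient descent on log-loss
    gw, gb = [0.0, 0.0], 0.0
    for feats, y in data:
        z = sum(wi * fi for wi, fi in zip(w, feats)) + b
        p = 1.0 / (1.0 + math.exp(-z))             # sigmoid
        for i, fi in enumerate(feats):
            gw[i] += (p - y) * fi                  # gradient w.r.t. head only
        gb += p - y
    w = [wi - lr * gi / len(data) for wi, gi in zip(w, gw)]
    b -= lr * gb / len(data)

def predict(x):
    feats = frozen_backbone(x)
    return 1 if sum(wi * fi for wi, fi in zip(w, feats)) + b > 0 else 0

accuracy = sum(predict(x) == (1 if abs(x) > 1 else 0) for x in xs) / len(xs)
```

        The parent's point survives the caricature: only the head's handful of parameters are updated, so the compute and data needed are a tiny fraction of what training the backbone from scratch would cost.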
  • Time for Ice-T to fire up Body Count and crank out the theme song for this. Not many good ideas that our corporate overlords haven't screwed up by taking them mainstream.
  • by Glasswire ( 302197 ) on Tuesday April 04, 2023 @11:42AM (#63425190) Homepage

    Well sure, it's completely unexpected that AI tech requiring datacenters filled with 1000s of expensive GPUs that require 1000s of hours of expensive training isn't a consumer technology.

    • Well sure, it's completely unexpected that AI tech requiring datacenters filled with 1000s of expensive GPUs that require 1000s of hours of expensive training isn't a consumer technology.

      Not necessarily even GPUs. Google uses TPUs [wikipedia.org], which are purpose-built. Though, consumers can rent them [google.com]. AWS and Azure have large numbers of GPUs for rent, too.

      So it isn't access to the hardware that's the issue, but the money to afford time on it. Either way, training sophisticated AI models is expensive.

  • First, most of AI - and software in general - has always been developed by corporations, and even researchers in academia are often funded by corporations.

    The TFS complains about how there are now more models in general, which should be considered a good thing: as AI becomes mainstream, expect to see more models in everything, everywhere. It's like complaining that there are many commercial compilers or web browsers.

    much ado about nothing, nothing to see here

  • Well duh (Score:5, Insightful)

    by rsilvergun ( 571051 ) on Tuesday April 04, 2023 @12:16PM (#63425272)
    It was open source just long enough to have the universities do all the hard work on the public dime so they could swoop in and monetize it after the taxpayer paid to build it. Same thing the pharmaceutical industry does. And the electric car battery industry. And the computer industry. And...

    Privatize the profits and socialize the losses.
    • by DarkOx ( 621550 )

      Indeed, it's almost like we should stop shoveling taxpayer dollars at basic research and let the people who will commercialize the results foot the bill.

      • At basic research. But the patents should stay with the public along with the profits. Either that or like the polio vaccine there shouldn't be any profits.

        I mean it's seriously fucked up that the vaccine for covid is a for-profit endeavor and we have hundreds of thousands of people around the world dying because they can't get it
  • When they all merge under SkyNet [wikipedia.org], I'm moving to a far off remote island.

  • 2023 AI is Entering an Era of Corporate Control
    2034 Corporations are Entering an Era of AI Control

  • by rbrander ( 73222 ) on Tuesday April 04, 2023 @03:50PM (#63425832) Homepage

    ...since the most accurate description I've ever heard for a profit-making corporation is Cory Doctorow's "Slow AI" term. They behave as AIs that have humans for neurons. No one cell is really in charge, just the program to make money. If even a CEO shows some human weakness about making money (not hiring kids, say), the Slow AI will replace him with one more obedient to the program.

    For those who see it that way, all the whinging about "AI in control" or AI displacing humans in some way is just funny. The corporations that wiped out whole towns like Flint, Michigan, through globalization were not acting like humans, but like AIs.

  • We're pretending that the corporations don't run the colleges now?
  • "AI" was created by corporate marketing more than two decades ago. You know how you can tell? Because the phrase has no real, definable meaning.

    So, no, AI is not "entering" an era of corporate control. It's just been promoted to the next fad to be hyped, a la blockchain/crypto/NFT.

    With any luck, it will cause the downfall of the big tech companies in the same way the "metaverse" took out Facebook (which may linger on for a few years but is definitely beginning its death spiral). That'll leave the marke
