AI Businesses Technology

Executives Say AI Will Change Business, But Aren't Doing Much About It (axios.com) 76

American business executives expect artificial intelligence to have a large impact on their companies, but few are actually doing anything with AI, according to a new MIT-Boston Consulting Group survey. From a report: Key takeaways, per co-author and BCG senior partner Martin Reeves: Nearly 85% of the 3,000-plus executives surveyed expect AI will give them a competitive advantage. But their adoption of AI isn't matching up: just 1 in 5 of the companies use AI in some way, and only 1 in 20 incorporate it extensively. "Less than 39% of all companies have an AI strategy in place," they wrote. The barriers to adoption include: access to data to train algorithms, an understanding of the benefits to their business, a shortage of talent, competing investment priorities, security concerns, and a lack of support among leaders.
This discussion has been archived. No new comments can be posted.

  • >> Nearly 85% of the 3,000-plus executives surveyed expect AI will give them a competitive advantage

    I am quite certain that at least 35.03% of them are wrong.
  • Pattern recognition is not 'AI'. Maybe when some actual AI is developed, companies will adjust to it.
    • "They say a future technology will make them more-competitive but aren't using cold fusion RIGHT NOW!"
      • Web 2.0 has run its course and IoT is stillborn, so they are grasping at straws.
        • IoT is getting a lot of bad publicity for being an enormous security nightmare. It needs a standard. Not another standard so we have 14 standards; it needs a standard that people follow. As for regulations... Congress should stop at accountability for reasonable security measures; legislating technology creates inflexibility.

          I've left mine stillborn, though. I wanted IoT devices to have a near-process set-up (i.e. you have to put the devices together, tell them they're setting up, and they open a win

    • by ShanghaiBill ( 739463 ) on Wednesday September 06, 2017 @03:33PM (#55149721)

      Pattern recognition is not 'AI'.

      Pattern recognition is what your brain does. So how is it "not AI" when a computer does it?

      • The difference between the two is thoroughly discussed elsewhere on the internet. Maybe I have too much 'I' to repeat it here, or am just too lazy.
    • by AHuxley ( 892839 )
      Any sufficiently advanced pattern recognition is indistinguishable from a real AI.
  • What is AI? (Score:3, Interesting)

    by Volda ( 1113105 ) on Wednesday September 06, 2017 @03:32PM (#55149707)
    Can someone give me the definition of AI? I ask because it keeps getting thrown around, but when people get into details they talk about pattern recognition, machine learning, or data analytics. Not what I would consider AI. Our CTO seems to think that if a machine can read, say, safety manuals, then it can make decisions on safety better than humans. We have yet to see this work.
    • by Anonymous Coward

      (With tongue firmly planted in cheek) AI is the set of problems that computer researchers think computers could solve, but can't yet.

      Once there's off-the-shelf software to solve a category of problem, it stops being "AI" and becomes "machine vision" or "autonomous $FOO".

    • Re:What is AI? (Score:5, Interesting)

      by clodney ( 778910 ) on Wednesday September 06, 2017 @03:43PM (#55149787)

      So long as AI is implementing techniques that work on general purpose computers, programmers will look at it and say that that is not AI, that is just code running this algorithm or that. You keep waiting for the magic to happen, and keep finding out that it is just software.

      But so what? Neural nets are making great strides in specific applications, and even though we know how they work in general, the specific way they put together associations still surprises us and lets them come up with answers we didn't expect, or implementations we would never have come up with. Computers playing Jeopardy don't do anything a human can't do, but the fact that they can do it at all is a huge leap forward compared to where we were 20 years ago. And sure the hardware is a million times faster, but as the saying goes, quantity has a quality all its own.

      Going back to the point of magic happening, what would it take for you to decide something was AI? And if you discovered you understood all of the techniques that went into that, would it stop being AI?

      • by Myrdos ( 5031049 )

        That's why I prefer the term "Synthetic Artificial Virtual Intelligence". So much less debate about the meaning, letting us focus on the practical benefits.

      • You admit, then, that it's just more of the same as what we have experienced over the last 25 years. Why is it now suddenly AI? It's just a marketing gimmick.
        • by clodney ( 778910 )

          You admit, then, that it's just more of the same as what we have experienced over the last 25 years. Why is it now suddenly AI? It's just a marketing gimmick.

          The techniques were described as AI 25 years ago as well; it is just that their scale and applicability are hugely different now.

          Perhaps where we are foundering is expectations. When I took an AI course back in college we learned about expert systems, neural networks, associative arrays and other things I don't remember, and all of those were algorithms or approaches that would be used to build AI. I have no problem with calling useful systems built on those techniques AI, even if they don't pass the

          • It's happened through incremental improvements. So what happened in the last year or two such that we suddenly need AI consultants for work that wouldn't be covered by ordinary IT initiatives?
      • Good question. And question is the operative word. First, IMHO, true AI should be indistinguishable from I. Einstein thought computers were uninteresting because they did not ask novel questions. They still don't. An AI should be able to synthesize data sets and generalize across them, posit questions, set novel goals, and elucidate tasks to reach them. An AI should be able to see what isn't and ask, "Why not?"

        Currently, I see AI as just a marketing term for highly capable systems that can perform tasks

    • The waters are so muddied by marketing hype, media hype, and hype from so many other sources, that it's easier to explain what is not 'AI': Nothing that you've seen or heard about is 'artificial intelligence'. It's all ersatz. At the rate we're going we won't have real 'AI' anytime soon, probably not in our lifetimes -- not unless neuroscience makes a big breakthrough in understanding how our own human minds are able to actually 'think' and be 'conscious' and 'self-aware' like we are; in other words, we have to solve the
      • The marketing hype is itself changing the definition of AI. Under the newer definition it means a computer doing something that appears to be intelligent. Under this definition, AI is all around us.
      • by Myrdos ( 5031049 )

        Why does it need to be exactly as smart as a human in order to be AI? I mean, cats aren't self aware. Why not as smart as a cat?

        • Okay, well if someone makes a robot as smart as a cat I am in.....
          • Since you bring it up... currently we can't even build a machine that emulates the cognitive abilities of a cat's brain, either; they don't even really know how an animal's brain works yet.
        • I did not say "as smart as a human"; that is something YOU are superimposing on what I did say, which is being able to 'think' like a human brain can.
          Also, your cat or dog is, in fact, smarter than the junk they're trying to pass off as 'AI'.
          • by myrdos2 ( 989497 )

            Yes, I follow. But why does it need to be able to think like a HUMAN brain? Lots of animals have some intelligence without being self aware.

            • We're trending towards so-called 'AI' being put in charge of critical things, some of which involve human safety, the most notable of which are so-called 'self-driving cars'. The garbage they're trying to pass off as real 'AI' doesn't know the difference between a human being and an inanimate object; that's because it cannot actually THINK.
            • Even a dog or a cat can think, even if it's on the level of about a 2-year-old human, and that's still better than the so-called 'AI' they keep trotting out.
    • At this point there is no real definition. Various companies, researchers, and media have made such a fucking mess of the messaging in this space that anything that in any way resembles a computer providing an answer to something a human would normally do is considered AI. Even if said program is just a simple series of IF-THEN statements that produces pre-programmed speech, somebody will call it AI.
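      For illustration, here is a minimal, purely hypothetical sketch (in Python) of the kind of IF-THEN program described above that still gets branded as "AI"; the function name and canned responses are invented for the example:

      # A trivial rule-based "chatbot": nothing but if/then branches returning
      # canned text, yet easily marketed as "AI" in a press release.
      def canned_ai(message: str) -> str:
          text = message.lower()
          if "hello" in text:
              return "Hello! How can I help you today?"
          elif "price" in text:
              return "Our plans start at $9.99 per month."
          elif "refund" in text:
              return "I've escalated your refund request to a specialist."
          else:
              return "I'm sorry, I didn't understand that."

      print(canned_ai("What is the price?"))  # matches the "price" branch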
    • by Tablizer ( 95088 )

      I'll call it AI when it can fetch me a beer, take out the garbage, and suck my dick.

    • by taustin ( 171655 )

      AI is a marketing term for vaporware. It's an important tool to sell stock options to investors.

    • The definition of AI has changed. We all need to get used to this. If you throw a fit every time somebody says "AI" without meaning some kind of magical super machine, all of your hair is going to fall out. The meanings of words change over time. Just go with it.

      The modern definition is basically just any implementation of machine learning. Which is funny, because the phrase "machine learning" also used to be a buzzword that nobody in the industry actually used unless talking to the media. We'd be
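      As a rough illustration of how low that bar sits, here is a minimal, self-contained sketch (hypothetical, in Python) of "machine learning" in the sense used above: an ordinary least-squares line fit that, under the newer usage, would happily be labeled AI:

      # Fit y = slope * x + intercept by ordinary least squares.
      def fit_line(xs, ys):
          n = len(xs)
          mean_x = sum(xs) / n
          mean_y = sum(ys) / n
          slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
                  / sum((x - mean_x) ** 2 for x in xs)
          return slope, mean_y - slope * mean_x

      # "Train" on four points, then "predict" the fifth.
      slope, intercept = fit_line([1, 2, 3, 4], [2.1, 3.9, 6.2, 8.1])
      print(f"prediction for x=5: {slope * 5 + intercept:.2f}")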
  • I think the reason few are actually doing anything with AI is that it hasn't been turned into a product yet. Ask a financial analyst how they intend to use AI to improve their forecasts, and they will give you a blank look. Sell them a product that they feed a bunch of data into and that spits out a forecast, with AI under the hood, and they will be happy to buy it. But they don't have the ability to start from scratch.

    Kind of like a study asking "Why aren't house builders using superconductors?" Because
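    A hypothetical sketch (in Python; the ForecastBox name and its naive averaging "model" are invented for illustration) of the data-in, forecast-out product described above:

    # The analyst only ever sees feed() and forecast(); whatever "AI" sits
    # under the hood is the vendor's problem. Here it is just a naive
    # average-of-recent-changes stand-in.
    class ForecastBox:
        def __init__(self):
            self._history = []

        def feed(self, value: float) -> None:
            self._history.append(value)

        def forecast(self) -> float:
            if len(self._history) < 2:
                return self._history[-1] if self._history else 0.0
            deltas = [b - a for a, b in zip(self._history, self._history[1:])]
            return self._history[-1] + sum(deltas) / len(deltas)

    box = ForecastBox()
    for q in [100.0, 104.0, 109.0, 115.0]:  # made-up quarterly figures
        box.feed(q)
    print(box.forecast())                   # prints 120.0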

  • I suspect product support lines will be the first to use it, not because it's good, but because companies want to cut staff. It just has to kinda sorta work to make it tempting enough. It's one of the lowest barriers to entry due to low expectations, since product support already sucks at a good many companies.

  • Since apparently nobody really knows what "AI" is:
    https://qz.com/1067123/stop-pr... [qz.com]

    saying an undefined quantity will accomplish something is a bit of a stretch.
  • AI has joined the list of recently overhyped buzzwords: MOOC, 3DTV, AR/VR... When the dust settles, a few will certainly use it, but we will hear much less about it.
  • In other news, surveys show that 85% of executives have no imagination and the attention span of a gnat. They have no interest in what's happening next quarter, let alone what might happen in a year or two.

  • They'll do whatever BS is included in the latest best-selling business book summaries they read, or whatever they read in that airline magazine on their last flight (you know, the article that was sandwiched between Sharper Image ads for electrically heated dog sweaters and the ad for $500-per-person steak dinners). They'll do something with it when one of their hired-gun management consultants tells them what they should do with it.
