
'I've Seen the Future of Consumer AI, and it Doesn't Have One' (theregister.co.uk) 137

Andrew Orlowski of The Register recounts all the gadgets supercharged with AI that he came across at the IFA trade show last week -- and wonders what value AI brought to the table. He writes: I didn't see a blockchain toothbrush at IFA in Berlin last week, but I'm sure there was one lurking about somewhere. With 30 vast halls to cover, I didn't look too hard for it. But I did see many things almost as tragic that no one could miss -- AI being squeezed into almost every conceivable bit of consumer electronics. But none were convincing. If ever there was a solution looking for a problem, it's ramming AI into gadgets to show off a company's machine learning prowess. For the consumer it adds unreliability, cost and complexity, and the annoyance of being prompted.

[...] Back to LG, which takes 2018's prize for sticking AI into a superfluous gadget. The centrepiece of its AI efforts this year is a robot, CLOi. Put Google Assistant or Alexa on wheels, and you have CLOi. I asked the booth person what exactly CLOi could do, to be told "it can take notes for your shopping list." Why wasn't this miracle of the Fourth Industrial Revolution let loose on the LG floor? I wondered -- a question answered by this account of CLOi's debut at CES in January. Clearly things haven't improved much -- this robot buddy was kept indoors.


Comments:
  • by Spy Handler ( 822350 ) on Thursday September 06, 2018 @03:53PM (#57265934) Homepage Journal

    3D printer in every home will fundamentally change human society

    IoT internet connected belt buckles and toothbrushes will take over the world

    AI will revolutionize consumer electronics

    Net PC from Sun will dominate the computer industry (this one is really old)

    • by ShanghaiBill ( 739463 ) on Thursday September 06, 2018 @04:05PM (#57265988)

      Excessive hype is always followed by a trough of disillusionment. But as the TOD fades, plenty of mature, practical applications are likely to emerge. The technological naysayers are usually even more wrong than the hypesters.

      Hype cycle [wikipedia.org]

      • by fahrbot-bot ( 874524 ) on Thursday September 06, 2018 @04:20PM (#57266056)

        Excessive hype is always followed by a trough of disillusionment.

        Pro Tip: Get out in front and mention this *before* taking your date home. Better for her to hear it from you than her working it out on her own ... :-)

      • by CaptainDork ( 3678879 ) on Thursday September 06, 2018 @04:33PM (#57266114)

        If smart phones and tablets are any indicator ...

        AI, too, is an evolutionary dead end.

        It's a buzz word with a vacuous definition.

        • by Q-Hack! ( 37846 )

          Not a lot different than back in the 1950's when the trend was to create all manner of odd gadgets to make life easier. Those deemed useful are still around... The rest can be found in junk markets around the world. But hey, the Cracker Barrels of the future will still need stuff to decorate their walls with.

          • In reaction to your sig:

            I recently re-read "Nineteen Eighty-Four," because my first reading was so long ago.

            Good read, but what a goddam depressing book!

        • AI, too, is an evolutionary dead end.

          It's a buzz word with a vacuous definition.

          Why is AI an evolutionary dead end? If you're justifying that sentence with the following one then what will happen is exactly as the GP said, except that the name will change in the process.

          There's nothing dead end about computers taking thinking off our hands. It's a technological end game salivated over by scifi writers for decades and the processes behind them are already solving some very real problems better than people can.

          • The "AI," you're referring to doesn't mean what you think it does, because it doesn't exist. Hence, "vacuous."

            Go back and review the state of the art.

            The "A" doesn't stand for "Automation."

            And, the "I" part stands for, "Intelligence."

            The intelligent part refers to human intelligence.

            We will never have that because we will not allow for quirks, independence, insanity, and other aspects of intelligence.

            "My AI thermostat went nuts and the fucking house was freezing. I told the machine it's WAY to cold.

            "Loo

            • The "AI," you're referring to doesn't mean what you think it does, because it doesn't exist. Hence, "vacuous."

              The AI I am referring to is precisely what we are calling AI right now. Machine learning algorithms.

              The intelligent part refers to human intelligence. We will never have that because we will not allow for quirks, independence, and insanity, and other aspects of intelligence.

              Sorry but I can't mince my words here. That has to be the dumbest thing I've ever heard. The goal of AI is not, has never been, nor ever should be to copy the stupidity and inadequacies of humans. We are not the be all and end all of intelligence. The whole point of offloading this to a machine is to be better than that.

              • Stop.

                The AI I am referring to is precisely what we are calling AI right now. Machine learning algorithms.

                The former is AI and the latter, MLA.

                You have a right to your opinions, but you don't have the right to mix acronyms. AI != MLA.

            • The "A" doesn't stand for "Automation."

              And it doesn't stand for "Authentic" either.

              The term "Artificial" means that it is not human-like, but is somewhat similar to what a human brain would do.
              Of course, even that is a bit of a stretch today.

      • by Anonymous Coward

        Right. Look at CES in the 80s and notice all the ridiculous PC crap people were trying to push, with all the functionality of Lotus Notes. But there was a future there...

      • Excessive hype is always followed by a trough of disillusionment. But as the TOD fades, plenty of mature, practical applications are likely to emerge. The technological naysayers are usually even more wrong than the hypesters.

        Hype cycle [wikipedia.org]

        Back in the early PC days, when you had to hook up a cassette player to load your application, and then another one to load your data, we used to tell people they could store recipes on their TRS-80 personal computer. This was not much of a productivity enhancer. I'm sure based on this experience some people would have thought PCs were useless and had no future.

        And then floppy disks and spreadsheets were invented.

      • by jythie ( 914043 )
        It is really difficult to say if the naysayers or hypesters are more often right or wrong. One problem with looking back at negative guesses is we only really remember the ones that turned out to be wrong, since the evidence is in modern use today, while all the naysayers that were right, well, the things they were right about faded into obscurity.
        • You only count as a "true" naysayer if you are negative about an overhyped trend with groupies and fanbois, not about an obviously stupid idea.

          The naysayers were right about the Segway, but that was an easy target, since it reached peak hype before it had even been shown to the public.

          Other tech failures were Iridium, Zune, Pebble, Juicero. But none of these were hyped as world changing technology.

    • Prognosticators have been wrong before. While it is easy to poke fun at the unusual, who knows: perhaps in a few years dental floss will come with AI. The thought of not having AI floss will be unthinkable.
    • by Anonymous Coward on Thursday September 06, 2018 @04:14PM (#57266022)

      As much as I am a nerd, I blame "nerds" for this. There is this whole new fad of being a "techie", watching Big Bang Theory, owning a Tesla, and generally being absolutely ignorant about real science, technology and math while "pretending" to be a nerd. I used "pretending", but there may be some legitimate attempt; it is hard to tell if someone is a fake nerd or just a stupid nerd. I think this trend partly follows from women trying to follow the (tech) money and then men trying to follow the women.

      This has led to a culture of "techies/nerds" that don't understand one bit how the underlying technology works or how it can be brought to market. All they care about is being able to spew bullshit about how awesomely nerdy they are. Unlike "real" nerds, they have a very shallow understanding of what they are talking about and easily fall for marketing. They totally read that article on self-driving cars, and they are going to be out next year and totally revolutionize cities and travel. Next, AI will steal all our jobs and we will be unemployed, Google figured it all out yesterday.

      TLDR: Fake nerds need something to talk about and buy to show how nerdy they are. Marketing departments across the world have noticed and capitalized on that to the maximum.

      • I don't know that there's a lot of these people, but they do exist, for certain yes. The 'watching Big Bang Theory' part is the kicker; once someone admits watching that, you know they're very unlikely to be a 'proper nerd', for lack of a better term.

        Considering they only have partial skills in technology then, we can likely guess, if they work in the industry, they're probably higher on the ladder than us and paid more though :/ like most management / consultant types.

    • by JMJimmy ( 2036122 ) on Thursday September 06, 2018 @04:36PM (#57266126)

      The thing no one seems to consider is time.

      "AI" being jammed into things now is probably lame, awkward, and of very limited use. Much like computers were back in the punch card days with devices that. Less than 100 years later we've got computers in our pocket. We are in the early days of AI - we'll look back on it decades from now as we do with things like: https://www.youtube.com/watch?... [youtube.com]

      This article is just another example of someone who can't see past their nose to the road ahead and the million different branching paths this technology could take.

    • by AHuxley ( 892839 )
      Good for a few workers over the decade of hype.
    • by m00sh ( 2538182 )

      3D printer in every home will fundamentally change human society

      IoT internet connected belt buckles and toothbrushes will take over the world

      AI will revolutionize consumer electronics

      Net PC from Sun will dominate the computer industry (this one is really old)

      I don't know about home, but it plays a big part in manufacturing. There are very specialized and successful medical companies that use 3D printing.

      Don't know about belt buckles, but Fitbit, Apple Watch, and Garmin have been worth billions of dollars and fundamentally changed the way a lot of people do things.

      I don't know about NetPC but what about the cloud? The hype that we would all put all our stuff in the cloud blah blah actually materialized. There are many companies who own no hardware except the dev la

    • by lokedhs ( 672255 )
      Net PC was not from Sun. I should know, I worked for them during that era. What they had was JavaStation, which was a neat idea but ahead of its time. That concept is now realised by the Chromebook. Net PC was a Compaq thing, if I recall correctly. However, Wikipedia tells me it was Oracle, so perhaps the Compaq device was called something else.
  • by serviscope_minor ( 664417 ) on Thursday September 06, 2018 @03:59PM (#57265956) Journal

    Andrew Orlowski of The Register is basically a professional dickhead. His main goal seems to be to be as obnoxious and ignorant as possible, presumably with the goal of trolling the readership. He's pretty much the reason I stopped reading the Register, because of the constant stream of utter bullshit from that guy.

  • Red Dwarf has already shown why this is a BAD Idea.

    https://www.youtube.com/watch?v=lhnN4eUiei4

  • by Anonymous Coward

    I have no interest in pretty much any product which has so-called 'AI' in it.

    It's utterly pointless tech, which is very gimmicky but serves no actual purpose.

    I don't want to have my fridge ordering milk, or my oven deciding that now is the time to turn on, or my thermostat to greet me as I come in the door. I don't want to access it from my phone, or have it run in the cloud.

    I don't want any of this connected crap, because it's just annoying technology for the sake of being technology.

    This shit is all just

    • But I do like being able to verbally ask my phone to navigate to a contact, without having to squint at a screen in the sun, and get turn by turn directions. Digital assistants have slipped into a place in my life where they do a few useful things. As time goes on, this set will grow larger.

      But I know: "If it works, it's not AI!" "If it's AI, it won't work!"

  • by Anonymous Coward

    If Sony's Aibo lives up to the demos I have seen - that would be one big application. AI as a pet.

    I also use AI (maybe more ML) all the time with photo sorting, image recognition, etc. It is already in the home.

  • by JoeDuncan ( 874519 ) on Thursday September 06, 2018 @04:17PM (#57266038)

    ... because consumer AI is *ALREADY* ubiquitous and all around us.

    From the face detection in your phone, to the fuzzy logic controllers in washing machines, to the ant colony algorithms being used to route network traffic, to finding directions with google maps, to Netflix and Amazon's recommendation algorithms, to OCR for cheques and mail, to NEST thermostats, to robot vacuum cleaners and lawn mowers, to expert systems in medical diagnosis... (I could keep going)

    AI in consumer products is literally *already* ALL around us.

    Saying that consumer AI "has no future" is like looking around at the world today and saying "personal cars have no future" - it's completely idiotic because to anyone with half an ounce of perception that future is ALREADY here.

    It's like looking at a forest and claiming there are no trees.

    • Yeah, it seems like a natural fit for optimizing the things we do.

      Even though I don't routinely use my phone as an alarm clock, it still knows when I'm likely to get up: if I plug it in at bed time, it'll adjust its charging rate to be done about an hour before then. Yet if I plug it in at 3pm, it'll assume I want as much charge as possible and charge as fast as it can. It's not rocket science, but it's useful.
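
      A minimal sketch of that kind of charging logic, in Python. Everything here is invented for illustration -- the wake-time prediction, the rate numbers, the battery math -- not anything a real phone actually ships:

        from datetime import datetime, timedelta

        FULL_RATE_W = 18.0  # assumed fast-charge rate, in watts

        def pick_charge_rate(now, predicted_wake, battery_pct):
            """Charge slowly so the battery tops out about an hour before
            the predicted wake time; outside that window, go full speed."""
            hours_left = (predicted_wake - timedelta(hours=1) - now).total_seconds() / 3600
            if hours_left <= 0:
                return FULL_RATE_W  # daytime top-up: as fast as possible
            pct_needed = 100.0 - battery_pct
            # Invented conversion: assume 1 W sustained adds ~2% per hour.
            needed = (pct_needed / 2.0) / hours_left
            return min(max(needed, 2.0), FULL_RATE_W)  # small floor so it finishes

        # Plugged in at 23:30 with a 07:00 wake prediction: a slow ~5 W trickle.
        print(pick_charge_rate(datetime(2018, 9, 6, 23, 30),
                               datetime(2018, 9, 7, 7, 0), 35.0))
        # Plugged in at 3pm, well past the wake time: full speed.
        print(pick_charge_rate(datetime(2018, 9, 7, 15, 0),
                               datetime(2018, 9, 7, 7, 0), 35.0))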

      Do I need a dishwasher with

      • Do I need a dishwasher with a screen that I can talk to?

        Nope, but I'm willing to bet it has an embedded fuzzy logic controller in it to control water levels.

      • "Do I need a dishwasher with a screen that I can talk to?" Printers have a screen. You can't talk to it (at least you're not supposed to--when aggravated, I've been know to do so, and not kindly). But try to decipher what's on that screen. I claim that printers are not any easier to use than they were in 1984 (which is when I got my first dot matrix printer). You (ok, I) *still* can't figure out what's wrong with them, despite the screen.

    • by Anonymous Coward

      Well, it all depends on how one defines AI. None of the things you mention actually contain any real artificial intelligence in the sense of being able to make decisions in the face of unknown circumstances and data sources. AI is still more hype than reality.

      • None of the things you mention actually contain any real artificial intelligence in the sense of being able to make decisions in the face of unknown circumstances and data sources.

        They do actually.

        Roombas have to be able to adapt to unknown obstacles and uncertain sensory input (could get blocked, partially occluded etc...).

        Embedded fuzzy logic controllers (also used in anti-lock brakes) have to be able to maintain a steady output signal given uncertain input (wear and tear on the mechanics, grit...) that can vary wildly in an unknown manner.

        OCR systems need to be able to tell the difference between a cheque and unknown things, like night club flyers, and they deal with hand written
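
        For what it's worth, the fuzzy-controller idea is easy to sketch. A toy version in Python -- the membership functions and rules here are invented for illustration; real controllers are tuned per device:

          def tri(x, a, b, c):
              """Triangular membership function: rises at a, peaks at b, falls at c."""
              if x <= a or x >= c:
                  return 0.0
              return (x - a) / (b - a) if x < b else (c - x) / (c - b)

          def valve_opening(load_kg):
              """Map a noisy load estimate (kg) to a water-valve opening in [0, 1]."""
              small  = tri(load_kg, -1, 0, 4)    # fuzzify: how "small" is this load?
              medium = tri(load_kg,  2, 5, 8)
              large  = tri(load_kg,  6, 10, 14)
              total = small + medium + large
              if total == 0:
                  return 0.5  # nonsense input: fall back to a middling opening
              # Rules: small -> 0.2, medium -> 0.5, large -> 0.9. Defuzzify with a
              # weighted average, so the output shifts smoothly even as the
              # input drifts in an unknown manner.
              return (small * 0.2 + medium * 0.5 + large * 0.9) / total

          for kg in (1.0, 4.5, 9.0):
              print(kg, round(valve_opening(kg), 2))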

        • by Knuckles ( 8964 )

          When a Roomba hits an obstacle it stops, rotates by an arbitrary amount, and tries again. Repeat until unstuck or timeout. It's hardly intelligence. Intelligence would be to understand what the obstacle is and how to best free itself, much like you are able to leave a bathroom without walking into the wall until you find the hole.
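
          That bump-and-rotate loop, sketched in Python (the robot object and its method names are hypothetical stand-ins, not an actual Roomba API):

            import random
            import time

            def escape_obstacle(robot, timeout_s=30.0):
                """Stop, rotate by an arbitrary amount, retry until unstuck or timeout."""
                deadline = time.monotonic() + timeout_s
                while time.monotonic() < deadline:
                    robot.stop()
                    robot.rotate_degrees(random.uniform(45, 180))  # arbitrary turn
                    robot.drive_forward()
                    if not robot.bumper_pressed():
                        return True   # unstuck: resume cleaning
                robot.stop()
                return False          # give up and beep for help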

          As for actual "AI" attempts, I booted a fresh OEM install of Windows for the first time 30 years. Cortana showed up and utterly embarrassed herself. I chose English language and Ge

    • by hazem ( 472289 )

      From the face detection in your phone, to the fuzzy logic controllers in washing machines, to the ant colony algorithms being used to route network traffic, to finding directions with google maps, to Netflix and Amazon's recommendation algorithms, to OCR for cheques and mail, to NEST thermostats, to robot vacuum cleaners and lawn mowers, to expert systems in medical diagnosis... (I could keep going)

      When I took an AI class a few years ago, one of my favorite things the professor said was, "What we called 'AI' yesterday is simply the algorithm for how we do a thing today."

      • To be fair, that AI of yesterday was a fixed algorithm with a pre-defined path from inputs to outputs. What is different about it today is that the path is undefined. The whole concept of "machine learning" is what makes the AI of now very different to the AI of the past. Neither represents true intelligence, but the one in the past didn't even represent thought, just looking things up in a list. A toy contrast between the two is sketched below.
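
        The sketch, in Python -- the rule, the data, and the fitting step are all invented for illustration:

          # Expert-system style: the path from input to output is written by hand.
          def expert_diagnose(temp_c):
              return "fever" if temp_c >= 38.0 else "no fever"

          # ML style: the rule isn't written down; a parameter is fitted to data.
          def fit_threshold(samples):  # samples: [(temp_c, is_fever), ...]
              """Pick the threshold that classifies the training data best."""
              candidates = sorted(t for t, _ in samples)
              return max(candidates,
                         key=lambda th: sum((t >= th) == label for t, label in samples))

          data = [(36.5, False), (37.2, False), (38.1, True), (39.0, True)]
          print(expert_diagnose(38.5))                      # hand-written rule
          print("learned threshold:", fit_threshold(data))  # rule derived from data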

        Calling what we do now AI is a bit different to calling the "expert system" of the past AI as we did. That was ulti

      • When I took an AI class a few years ago, one of my favorite things the professor said was, "What we called 'AI' yesterday is simply the algorithm for how we do a thing today."

        Exactly!^

        Far from being a failure, AI has become so successful it's invisible.

  • AI (i.e. machine learning/neural networks) is really good at optimizing stuff, so its natural strength shows when you have hundreds of thousands of entities in a system. Examples are the electricity grid, playing Go, and a department store's inventory.

    In our individual lives, AI seems more like another drop in the bucket of too much technology, and I think one day we'll realize that less is more when it comes to the stuff in our homes.

  • I was looking at new fridges recently as a friend was asking for a recommendation, and it's alarming how trying to find a fridge without a screen is getting to be like trying to find a cell phone without a camera... it really limits your options.

    The only way they could make fridges any worse is if the screens also played CNN constantly when not in use, like in an airport... you can absolutely see subsidized ad-fridges coming down the pipeline.

    • by lgw ( 121541 )

      Seems like only the highest and lowest-end fridges lack screens these days (as well as ice/water in the door, something else I could do without).

      • Come to my house. The refr *has* an ice/water dispenser in the door, but it hasn't worked for over a year. I think the tube to the water dispenser is frozen, and if it gets thawed, it just freezes up again. Same with the water dispenser on the refr nearest my office at work.

        As for the ice dispenser on our refr, we never used it, so I took it out and got lots more room in the freezer. If we want ice cubes, we make them in trays, like the 1960s.

  • My uncle was a computer scientist for a National Lab. He retired 15 or so years ago. I remember just after my grandmother first got internet, he didn't have it at his home yet because he didn't believe it was safe - this was probably 1997 or 98 - and I remember him talking to me about how disappointed he was with the internet. "It was supposed to be this great thing. It's useless. It'll never amount to anything."

    Yeah, he was wrong.

    • My uncle was a computer scientist for a National Lab. He retired 15 or so years ago. I remember just after my grandmother first got internet, he didn't have it at his home yet because he didn't believe it was safe - this was probably 1997 or 98 - and I remember him talking to me about how disappointed he was with the internet. "It was supposed to be this great thing. It's useless. It'll never amount to anything."

      Yeah, he was wrong.

      Was he? Was he really?

      How much of the internet is truly useful and how much is just trash? Judging by my inbox, the ratio is more than 10 to 1 spam to worthwhile messages (and that's AFTER the spam filters).

      I find that this ratio pretty much governs the whole of the internet, where 1/10th of it is actually something of use and the rest is just useless junk.

      So he's not that wrong.

  • AI is turning frogs gay.

  • by Laxator2 ( 973549 ) on Thursday September 06, 2018 @05:46PM (#57266452)

    I did not see any example where someone says: "I did not buy that product because it lacked AI".

    I did not hear from anyone that they need AI so they are going out of their way to buy it. In its current form AI is good for pattern recognition in some cases, for example, face identification in photos.
    The only customers are corporations with massive collections of personal data to analyze, but not individual consumers.
    I believe AI has been over-hyped and pushed into areas where it is not usable in its current form (like self-driving cars), and we are starting to see the backlash.

    I've already seen stories saying that the medical diagnoses made by IBM's Watson are just plain wrong. More examples will follow.

    • by m00sh ( 2538182 )

      I did not see any example where someone says: "I did not buy that product because it lacked AI".

      I did not hear from anyone that they need AI so they are going out of their way to buy it. In its current form AI is good for pattern recognition in some cases, for example, face identification in photos. The only customers are corporations with massive collections of personal data to analyze, but not individual consumers. I believe AI has been over-hyped and pushed into areas where it is not usable in its current form (like self-driving cars), and we are starting to see the backlash.

      I've already seen stories saying that the medical diagnoses made by IBM's Watson are just plain wrong. More examples will follow.

      What about Google Home and Alexa?

      How do you recognize pedestrians in self-driving cars without AI?

      IBM Watson was wrong quite a bit, but it won Jeopardy!

    • I did not see any example where someone says: "I did not buy that product because it lacked AI".

      That's because you're turning the concept on its head and overspecifying what you're looking for. You may not hear anyone say "I didn't buy that because it lacked AI", but you've probably heard the reverse:

      "I bought x because it is smart"
      "I bought y because it learns and does something automatically"
      "Hey check this out this device can tell me me and the other person apart"

      Just because people don't know specifically what AI is in their devices doesn't mean it hasn't been part of their purchasing dec

  • First they ignore you, then they laugh at you, then they fight you, then you win.

    Mahatma Gandhi

    This field is moving so fast compared to the 90s.

  • So-called 'AI' is over-hyped and under-performing.
  • The AI bubble seems to be starting to deflate. It may not pop, but it will likely carry on shrinking. Most people already know that Alexa and co. are little more than gimmicks, good for party games, grins and giggles, and little more. The AI community seems to be making the same mistakes they made in the late 60s and 70s. The second AI winter is nigh.
    • by Tablizer ( 95088 )

      Looks like it. In the 80's, AI (or AI-ish) tech started showing practical promise, so investment funds poured into it. But despite making some strides, AI didn't live up to the hype, and the bottom fell out of the market, creating "AI Winter" (One).

      It looks like "deep neural nets" have reached a plateau such that they too won't live up to the hype; only incremental improvements (for a while, at least). I see more articles on their silly failures and lack of common sense of late. This reality will eventually

  • If Consumer AI doesn't have a future, how can that non-existent future be seen?

    In an alternative interpretation, the author has seen the future of Consumer AI and so of course it exists. But the future of the future of Consumer AI doesn't exist. I.e. Future of Consumer AI doesn't have one - where "one" stands for future.

    Any other interpretations?

  • Since the consumer is not in control of it.

    It's Anti-Consumer AI if anything
  • "AI" right now is a trendy buzzword, like "cloud". But the truth is that modern machine learning is very useful and is showing up in many new places.
