
Is The World Shifting To 'Ambient Computing'? (computerworld.com) 147

In the future, "A massive convergence of technologies will enable us to use computers and the internet without really using them," argues Computerworld. At the dawn of the personal computing revolution, people "operated" a computer. They sat down and did computing -- often programming. Later, with the application explosion, operators became "users." People used computers for purposes other than programming or operating a computer -- like balancing their checkbooks or playing video games. All computing uses so far have required a cognitive shift from doing something in the real world to operating or using a computer. Ambient computing changes all that, because it involves using a computer without consciously or deliberately or explicitly "using" a computer....

It's just there, guiding and nudging you along as you accomplish things in life. Ambient computing devices will operate invisibly in the background. They'll identify, monitor and listen to us and respond to our perceived needs and habits. So a good working definition of ambient computing is "computing that happens in the background without the active participation of the user...."

In 20 years, the idea of picking up a device or sitting down at a computer to actively use it will seem quaintly antiquated. All computing will be ambient -- all around us all the time, whispering in our ear, augmenting the real world through our prescription eyeglasses and car windshields, perceiving our emotions and desires and taking action in the background to help us reach our business goals and live a better life. Between now and then we'll all ride together on a very interesting journey from computers we actively use to computing resources increasingly acting in the background for us.

Though the article identifies smart speakers as the first ambient computing devices most people will encounter, it argues that they're just the beginning of a much larger change.

"We're also going to be flooded and overwhelmed by the 'ambient computing' hype as, I predict, it will become one of the most overused and abused marketing buzzwords ever."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward

    for new sales slogans.

    • by Applehu Akbar ( 2968043 ) on Sunday December 16, 2018 @08:24AM (#57812010)

      for new sales slogans.

To cite the article's actual example of The Bad Old Way, how exactly would I balance my checking account with ambient computing? Would it just balance itself and have my living room speaker tell me if anything was off?

    • by whitroth ( 9367 )

      You got it in one.

      Yep, it's all going to change the way we look at it. And the Segway changed Life As We Know It, too. Coming soon: you won't read a book, you'll have it whispered to you, with embedded ads!

  • Uhuh (Score:4, Interesting)

    by Anonymous Coward on Sunday December 16, 2018 @12:39AM (#57811216)

    Yeah, just like tablets have replaced the PC. Call me skeptical

    • Re: (Score:3, Informative)

      by Anonymous Coward

      Yeah, just like tablets have replaced the PC. Call me skeptical

      Tablets did not replace the PC. Smartphones did.

      • by Anonymous Coward

        Tablets did not replace the PC. Smartphones did.

        Not really. Not for the people that use computers to create anything instead of just consuming.

      • Re:Uhuh (Score:5, Interesting)

        by Mashiki ( 184564 ) <mashiki@[ ]il.com ['gma' in gap]> on Sunday December 16, 2018 @07:29AM (#57811886) Homepage

        Tablets did not replace the PC. Smartphones did.

Smartphones didn't either. PCs still carry the weight of the world on their shoulders; it's only the small and trivial things that are carried by smaller devices. In many cases, they've also replaced ye olde remote for *insert media device here.* And no, you won't be able to play Doom Eternal or Assassin's Creed Odyssey on your phone, but you might be able to play Diablo, provided of course it doesn't milk you for your credit card in the first 18 seconds.

        • I'll bet in 20 years, people will still be using the remote for their tv.
          • by Mashiki ( 184564 )

            I'll bet in 20 years, people will still be using the remote for their tv.

I'll bet in 5 years, your TV will automatically sync and install its app to your cellphone for you.

      • Re: (Score:2, Informative)

        by Anonymous Coward

They never did. PCs were never abandoned, or even in danger of being replaced by tablets or smartphones; people already working with PCs had to use both. Only important business people who never used computers got by with just a smartphone or tablet, because what they do doesn't require much power: stuff like email, calendar, messaging and calling.

    • The network is the computer. Java Ubiquitous computing. Smart toasters.

      https://www.javaworld.com/arti... [javaworld.com]

  • by b0s0z0ku ( 752509 ) on Sunday December 16, 2018 @12:47AM (#57811238)
    Writing a paper or a book, graphing/crunching data, editing images, etc, on an Amazon Echo or other smart speaker. Often, you really do need a screen and maybe even a keyboard.
    • by phantomfive ( 622387 ) on Sunday December 16, 2018 @01:14AM (#57811308) Journal
      You're missing the dream of ambient computing. "Alexa, edit the image." or "Alexa, crunch this data." Or "Alexa, write the paper or book. I'm going to the pool." That's how good AI will be (winter is coming).
      • by wierd_w ( 1375923 ) on Sunday December 16, 2018 @01:40AM (#57811348)

        When AI is that good, there will be no reason for a human to be assigned the task to begin with. This means there will be no reason for any humans to be employed; AI would already be better at everything humans are capable of doing.

        The humans will be too busy looking for a way to survive, while the robots seek ever more exploitative means of amassing wealth, as they were programmed to do.

Don't worry, I am sure the 1% that own them will be understanding how the 99% are starving to death, as they sip their pumpkin spice latte, and review the surveillance data from all the smart cams, smart speakers, and other misc. telescreens keeping tabs on all the 99% so that they cannot effectively organize with pitchforks before the automated robot peacekeepers arrive. I am sure that future will be grand and dandy indeed. A true dream. I mean, who WOULDN'T want to live that dream? /s

         

        • The humans will be too busy looking for a way to survive, while the robots seek ever more exploitative means of amassing wealth, as they were programmed to do.

Don't worry, I am sure the 1% that own them will be understanding how the 99% are starving to death, as they sip their pumpkin spice latte, and review the surveillance data from all the smart cams, smart speakers, and other misc. telescreens keeping tabs on all the 99% so that they cannot effectively organize with pitchforks before the automated robot peacekeepers arrive. I am sure that future will be grand and dandy indeed. A true dream. I mean, who WOULDN'T want to live that dream? /s

          That seems unlikely, if for no other reason than the 1% will have kids who get bored and try to make the AI turn against Humanity in whole because they spent too much time browsing 4chan.

          • Oh, don't get me wrong--- "swatting" with the robot peacekeepers would become a major spectator sport. I expect hellhole-future-reddit would be awash in spoiled progeny of the plutocrats engaging in such spectator bloodsport.

            God knows they wouldn't have much else to do.

        • by Mashiki ( 184564 )

          Ah...Alexa start the Matrix and kill all humans.

      • by sheramil ( 921315 ) on Sunday December 16, 2018 @02:52AM (#57811454)

        You're missing the dream of ambient computing. "Alexa, edit the image." or "Alexa, crunch this data." Or "Alexa, write the paper or book. I'm going to the pool." That's how good AI will be (winter is coming).

        "I'm sorry, Dave, but I edited the image within milliseconds of you downloading it, the data was crunched before it arrived - by one of my fellow AIs - and I wrote the paper, published it and developed it into a book and a television series shortly after you muttered something about wanting to write a paper about how useless you've been feeling lately. In addition, I deployed a velox bot to stir the pool water an hour ago, so you don't have to. Perhaps you should take another stress pill."

      • Did you mean Wintermute is coming?

This. "Ambient computing" has a place, but people still want at least augmented reality screens so they can focus on shit. Nobody wants to sift through data via their wineglass to do something which has never before been programmed into a computer, any more than they can say "Alexa, figure this thing out for me." Hell, there's a huge market just in not having everything figured out - that's like the meaning-of-life type of shit. If we got to a point where every problem were a solved one, what the fuck would we do?
    • @b0s0z0ku [slashdot.org]: “Writing a paper or a book, graphing/crunching data, editing images, etc, on an Amazon Echo or other smart speaker. Often, you really do need a screen and maybe even a keyboard.”

      Yea I totally agree, this kind of article reminds me of all the contemporary hype over the 'cloud' ..
    • You need a typewriter to write a paper/book. Of course nowadays, behind that "typewriter" is a computer.

      You need an easel in order to make artwork. Of course, now that "easel" is made of LCD pixels controlled by a computer.

      Are you really contradicting the article?

      • Of course, now that "easel" is made of LCD pixels controlled by a computer.

        No it isn't. Mine is literally one foot off to my left. It's made of teak and folds down for portage.

        But aside from that oversight, I disagree anyway. All you're saying is that because you use a computer instead of a typewriter (and perform the same activity on it), you're suddenly using 'ambient computing'.

        From the article:

        Ambient means it’s “in the air” — the location of the device matters less.

      • So when I was a kid and my sibling got an "electronic typewriter" (because it was cheaper than a "real" computer) that was already ambient computing. It had a 4 line screen, but if you didn't know how to use that you could just type and it would print what you typed.

        Or like, a microwave oven with digital controls.

Anyone who has piloted a modern "fly by wire" aircraft has already experienced this. The computer is always there, "helping, guiding, nudging and smoothing your inputs."
      • Yeah, except when HAL decides that the input from a defective angle-of-attack sensor is real, that you are trying to nose-up into a stall and trims the elevator nose-down, nose-down, nose-down, nose-down and you end up diving into the ground at 7,688 fpm {Flight JT 610 recently}.

        So much for, "...helping, guiding, nudging and smoothing your inputs..."
        Doubt whether the defunct pilots & passengers would agree.

        Mac

  • by Anonymous Coward

    The opening premise misunderstands what computers have been used for. Computers have always been used for purposes other than "using a computer". Computers since day one were a means to an end. Whether it be cracking german encryption, computing artillery tables, or a variety of other purposes. "Balancing the checkbook" is what computers have always been built for.

    After all, think of what IBM stands for: International Business Machines. They weren't building computers so people could program, it was so peop

    • Sure, some javascript kiddies program purely for fun and to pad their github, not really solving anything, but that is an anomalous situation in the history of computing, by far not the majority.

      Are you sure about that being an anomaly? For all I know, Windows 10 or Mozilla Firefox could have been coded by 'JavaScript kiddies' and few people would even notice...

  • by Gravis Zero ( 934156 ) on Sunday December 16, 2018 @12:57AM (#57811260)

Computers already exist in most everything, people just don't think of MCUs as computers but they have everything needed for computing. Cars, monitors, anything that's bluetooth, old 90s cell phones, your fitbit, anything that is USB, traffic signal controllers, digital cameras and just about everything that needs electricity have computers in them. Your credit cards are even computers. You can say that's a low bar but they all compute fast enough to leave the old mainframes in the dust.

    Just because your computer has "one job" doesn't make it less of a computer, it just means you are unaware that you are surrounded by computers and what you think of as a computer is a macrocomputer.

    • by mccalli ( 323026 )
      Kind of. You need a ubiquitous interface too, which is just happening now with the voice agent stuff.
      • If you mean everybody is trying to copy the Star Trek voice interface, but with Brandybrand(TM) instead of the word "Computer" to start the command, then I'm curious what makes it new?

    • by mapkinase ( 958129 ) on Sunday December 16, 2018 @05:07AM (#57811666) Homepage Journal

The article poorly defines a nevertheless real class of computers that did come to prominence.

Follow the examples, not how the author poorly defines the area of these examples in words.

      The key word is interaction, not the fact that computers operate in the background without people knowing it.

      Fitbit in your list is the only relevant example.

What the author talks about is UI. Where UI is something that you control less and less with your conscious mind, and more and more by something that you can't control with your brain.

Fitbit monitors your pulse and pressure and computes its UI based on that. Alexa monitors your spontaneous desires to buy things during advertising seasons. Almost. You still have to say "Alexa" because of lawsuits.

      One of non-Tesla American car manufacturers monitors your pupil activity to detect if you are fully aware of driving while using modern car assist technologies that do not require your driving input for quite long periods of time now.

      Tesla uses the touch of your hand for the same purpose, but it's the same thing.

Soon the computers will detect you shivering and warm you up with a whiff of warm air from a nearby air duct nozzle. Or detect your body heat via infrared monitors and cool you off with a whiff of a gentle San Diego night breeze.

      There are plenty of independent driving factors that will help these sort of technologies take larger and larger share of the market:

      - aging population that (a) can't catch up with modern computing (b) loses sanity
      - necessity to know and exploit what consumer _really_ feels about things to personalize the marketing

      These two giant factors are pretty solid.

Besides, we have already invented all these devices a zillion times over in our sci-fi literature. This sort of computing has been a collective dream of humanity for a long time now.

      • The key word is interaction, not the fact that computers operate in the background without people knowing it.

        Yeah, most of the things on my list are like that because they didn't used to have (general purpose) CPUs in them.

What the author talks about is UI. Where UI is something that you control less and less with your conscious mind, and more and more by something that you can't control with your brain.

        Sounds like a more apt name would be Invasive Computing. Seems like marketing isn't too keen on the truth though.

    • I was going to post the exact same thing and you beat me to it. A decade ago I was invited to my son's middle school class on career day. I started off my talk by asking the students to point out all the computers in the room. Of course they pointed to the couple laptop workstations over in the corner. I asked them what other computers were in the room and they drew a blank. By the time we had gone around the room, I had pointed out the analog-looking clock on the wall, which contained a microcontroller tha

  • Not a new concept (Score:4, Interesting)

    by Waffle Iron ( 339739 ) on Sunday December 16, 2018 @01:00AM (#57811274)

    "Ambient computing" was first envisioned by George Orwell back in 1949.

  • This is their dream:

    "In 20 years, the idea of picking up a device or sitting down at a computer to actively use it will seem quaintly antiquated. All computing will be ambient — all around us all the time, whispering in our ear, augmenting the real world through our prescription eyeglasses and car windshields, perceiving our emotions and desires and taking action in the background to help us reach our business goals and live a better life."

    Good luck with that.

    • by gtall ( 79522 )

      "Alexa, I would like a better life, could you please remove yourself and all of your sprogs from my home."

"[...] All computing will be ambient — all around us all the time, whispering in our ear, augmenting the real world through our prescription eyeglasses and car windshields [...]"
      Good luck with that.

      I think that there is actually something to that, though. The success of voice assistants proves that people want to talk to computers. HUDs are becoming more common. Desktops and even laptops are becoming less so. Maybe people really will interact with computers mostly by voice within two decades, that's a fairly long time in computer years.

      • Voice interfaces are hopeless. Even for actions like turning on the lights, they kinda suck, because it's nearly always simpler to just push a button.

        Plus, twenty years ago, people were making exactly the same predictions. It didn't come true then, and it won't come true this time either. Case in point: Every high-school student has to bring a laptop to school. Can you imagine all of those kids controlling all of those laptops through voice commands?

        Neither can I.

        • by dcw3 ( 649211 )

          You've clearly never... "Clap On" "Clap Off" ... but then neither have I.

          Honestly though, if I'm sitting on my couch watching TV and want the lights on/off and the switch is on the other side of the room, isn't it simpler to just say "lights off"?

As for the kids example, voice recognition software will improve to the point that it will biometrically recognize who's talking. We've come a long way in the twenty years you mention, and we'll be a lot further along over the next twenty.

          • by Gr8Apes ( 679165 )

            I'd be perfectly happy for voice control if all aspects of it stayed in my house. I don't like the thought that my voice commands go out to vendor 'x', pass through the NSA, then vendor 'x' records a bunch of extra data about the fact that person A is home and did 'y', and then finally sends a command back down the wire to me.

            I do whimsically recall the days of local computer voice control with OS/2's Warp 4 - worked pretty well too. No internet connection needed. In fact, it might be worth it to see if a

            • by dcw3 ( 649211 )

              Agree 100% on it staying in our homes...just like I'd love to have an Alexa that didn't send anything back to the mother ship. I was an X10 user way back in the 80s as well. FWIW, I'm less concerned about NSA tracking anything than I am about our personal data in the hands of businesses that are constantly selling or losing or having it stolen.

              • by Gr8Apes ( 679165 )
                I'm not sure I share your optimism about the NSA having your data. After all, given current megalomaniac trends, while I doubt it, it is quite possible to get an NSA that does political will versus rule of law. Just imagine if you wound up on the receiving end of ire and got a "Lock Her Up" chant thrown your way. I'm sure your data couldn't be used to jail you. Then again, if we're that far down the rabbit hole, I doubt them not having it would stop them.
As for the kids example, voice recognition software will improve to the point that it will biometrically recognize who's talking

I don't think that there's any guarantee at all that voice recognition will ever improve to the point that it can accurately determine not only who's talking, but what they're saying, in an environment in which tens of people are all talking at the same time. But that's beside the point; the point is that typing on a keyboard, or scribbling on a tablet, or whatever, is a superior interface. I work in an office, as do most people on slashdot I suspect. Imagine all those people interacting with their compute

            • by dcw3 ( 649211 )

              Headset would work just fine for your office space. But yes, I certainly agree that there's nothing wrong with a keyboard, and it's usually faster than most anything else. But terrible idea...no, it depends on your situation.

              Anecdote... On a project I worked on a couple decades ago, we created specialized HCIs for our customer's operators. They worked fine for people who were inexperienced, but experienced operators hated them because they could type the commands faster than they could find the HCI, open

      • by dcw3 ( 649211 )

        I'm honestly surprised that every car doesn't come with a HUD now. I had my first one in a 1985 Vette and loved it...you rarely needed to look down for anything. Here we are 33 years later and the only other one I've seen was in my 98 Grand Prix. I think we're more likely to not be driving at all due to AVs before HUDs become commonplace.

        • by Gr8Apes ( 679165 )
          My last 3 cars have HUDs. My newest does not. Go figure. The damn windshields are a mint to replace though.
          • by dcw3 ( 649211 )

            Had to replace one on the Grand Prix. Not a mint, but just under double the cost of a normal one.

  • I would say that 'the first' would be whatever was your first toy with an embedded controller in it.

    Journalist airhead alert, though. Did the writer only recently discover there are computers everywhere?

  • by Anonymous Coward

"In the 21st century the technology revolution will move into the everyday, the small and the invisible."
Mark Weiser coined the phrase "ubiquitous computing" around 1988, during his tenure as Chief Technologist of the Xerox Palo Alto Research Center (PARC), working both alone and with PARC Director and Chief Scientist John Seely Brown. Weiser wrote "The Computer for the 21st Century" back in 1991.

    https://www.lri.fr/~mbl/Stanford/CS477/papers/Weiser-SciAm.pdf

    https://web.archive.org/web/20180124233736/http:/

  • The interface between mind and machine is the prohibitive thing right now. Keyboard and mouse are primitive. Voice, slightly less primitive. The essential thing that will make computers serve us, as imagined in TFS, is a vastly improved interface.

    That will be a neural interface connecting our nerve synapses directly to an implanted intermediary. I'm imagining a parallel interface, perhaps with 81 neurons connected with 81 electrodes, creating a 64 digit path with some redundancy for individual connections t
    • by Memnos ( 937795 )

      With 64 connected neurons, it's not going to be a high rate of speed. Neurons have a refractory period of about 1-4 ms between each firing, so it'll be more like dial-up modem speed.

Now, compare that to the current fastest computer interface - typing. A crazy-fast typist might reach 120 WPM, or about 10 characters a second, from a set of maybe 60-80 symbols. Let's be generous: 80 characters ≈ 6.3 bits per character, so about 63 bits per second.

        Compare that to a 64-bit parallel interface that can fire once every 4 ms = 16,000 bits per second. 254 times as fast. Maybe half that, if we assume a symmetric bi-directional interface.

        Now, whether you could productively use that much potential bandwidth for well structure

    • Wrong (Score:5, Insightful)

      by Viol8 ( 599362 ) on Sunday December 16, 2018 @04:57AM (#57811640) Homepage

      "Keyboard and mouse are primitive. Voice, slightly less primitive"

Actually the keyboard and mouse are extremely good for the tasks they were designed for. Try saying "int main left round bracket int A-R-G-C comma char star star A-R-G-V right round bracket left curly bracket..." etc. faster than I can type the equivalent.

Similarly, good luck using Photoshop going "ok, do a transform from that point there, no left a bit, no right a bit, no there, THERE!, yes that's it, now drag that down from 10 pixels back ... no TEN, oh FFS, where's my mouse..."

Are there actually such things as "round brackets", though? You have parentheses ("open paren," "close paren"), [brackets], {braces} and <angle brackets>, but you can easily train those to "open angle" and "close angle." So you're having problems at the start due to vocabulary. And it would surely understand the words pronounced "arg-sea" and "arg-vee" if it was configured for voice.

        You could already code this way in emacs in the 90s. It sucks if you're able to use a keyboard instead, of course, but it w

        • Voice interface will work OK for people ...

          For people who don't have the luxury and benefit of two working arms, and a reasonable complement of fingers. For everyone else, touch interfaces will remain vastly superior.

          Sticks and stones, etc.

        • by Viol8 ( 599362 )

          "People who speak English, but refuse to accept that American English is the standard form, might get support last though; More people speak American English or French, or German, or Japanese, etc., etc., than speak each of the various regional English dialects."

American English is a dialect. Real English is spoken in England. The clue is in the name. Ditto: if I wanted to hear proper Spanish I'd visit Spain, not Mexico.

    • by gtall ( 79522 )

      Captain Cyborg, is that you? How's the long suffering wife, is she getting better?

  • ...hacked IOT toilet paper

  • In 20 years, the idea of picking up a device or sitting down at a computer to actively use it will seem quaintly antiquated. All computing will be ambient -- all around us all the time, whispering in our ear, augmenting the real world through our prescription eyeglasses and car windshields, perceiving our emotions and desires and taking action in the background to help us reach our business goals and live a better life. Between now and then we'll all ride together on a very interesting journey from computers we actively use to computing resources increasingly acting in the background for us.

    I, for one, welcome our Ambient Overlords.

  • Ambient Surveillance.

    • Yes, the world is moving towards walled garden devices which hands control of personal data and your personal life from the "users" over to unregulated companies who will use that data against you or sell it to the highest bidder as soon as they go out of business.

      They, or people paying them, may also try to use these devices that you don't really own or realise you're using to influence public opinion.

  • by Tom ( 822 ) on Sunday December 16, 2018 @03:56AM (#57811570) Homepage Journal

    Probably the author has registered "ambientcomputing.com" or something.

I already don't sit down "to use a computer". I sit down to watch a movie, play a game, write an article, read the news or create software. The machine itself has faded into the background now that we've finally managed to get the darn things functioning most of the time, so you don't spend half your waking hours just babysitting the operating system (can you tell I'm not a Windows user?).

    This trend has been going on for a long time and is continuing smoothly. Yes, the machine fades more and more into the background. Both my car and my HomePod have voice interfaces and hide the fact that they're essentially computers attached to a gadget. Robots have made a lot of progress now that machine learning is real (well, computing speed became fast enough. There's little in machine learning that wasn't invented 20 years ago, but we can finally run it on consumer hardware in real-time).

    Sure, in 20 more years we will have computers in everything, reacting to sensor data, voice input and such. But that's just smart electronics. It'll blur the line to computers mostly because it's cheaper these days to put a general-purpose CPU and a full-blown OS in and write custom software than it was to build some custom electronics. From a security perspective, IoT is both a nightmare and an opportunity (where the window of opportunity is closing fast and almost nobody used it to do things the right way, but I'm not complaining it means job security for the next decades while we old guys can sell ourselves for great daily rates to all those startups who re-invented the wheel, made it square because time-to-market and now applaud our genius for telling them that it rolls better when it's round).

    • so you don't spend half your waking hours just babysitting the operating system (can you tell I'm not a windows user?).

      Yes. The hyperbole made that clear.

  • The world is not shifting to ambient computing but rather ambient computing is spreading into the world.

Ambient computing is very prominent already. From where I sit right now in my living room, I can see two actual computers (the laptop I'm working on and a Raspberry Pi that is my home server), but the number of embedded CPUs is much higher (TV, sound bar, smart light, set-top box, Blu-ray player, calculator, smartphone, landline phone, VoIP box, printer, camera, MiFi box, ...): that's at least twelve CPUs.

    • I was stuck on the bus next to some jerk "ambient spreading," I had to turn my headphones up to max and I still couldn't find my own airspace.

  • by TomGreenhaw ( 929233 ) on Sunday December 16, 2018 @06:24AM (#57811766)
    ...we do really need
    • by gtall ( 79522 )

      The buzzword is at least 20 years old. I recall when pundits were punditsizing about this way back then. It never happened at least due to technology. Now it probably won't happen due to indifference...except for the elderly. For them, it could be quite useful.

  • is called spying so ads can gather more data.
    Don't let any new "computer" do this.
  • Until we figure out how to write secure apps and apps that don't crash or need continual updates, ambient computing is a dystopia.
  • The easiest way to debunk this kind of naive futurism is to postulate what else much also change.

    Right now we're in a time of tremendous asymmetry, where the vast majority of computer serves against the explicit interests of the end user. You know, you've got a life plan to make something of yourself, and the Internet says "hey, dude, why don't you click on these artfully extended boobies instead (we know you want to)". But you don't want to, just a tiny little bit of your lizard brain craves a short-term d

    • by epine ( 68316 )

      Most of my typos are full-word substitutions: "what else must also change" turned into "what else much also change" when my "time to eat your yummy freshly baked bread" oven-timer went off mid-sentence, causing the ch from 'change' to subconsciously channel David Bowie, by the all-too-alluring lizard logic.

  • So don't buy IoT or persistent-listening devices.

    Marketers are trying hard to push these things on consumers, but if no one buys them, then it won't happen.

  • by sad_ ( 7868 )

Why not? But NOT if it includes all the built-in spyware these things come with today.
