Communications Technology

Internet of Things Endangered By Inaccurate Network Time, Says NIST

An anonymous reader writes: Current standards of network timekeeping are inadequate for some of the critical systems being envisaged for the Internet of Things, according to a report (PDF) by the National Institute of Standards and Technology (NIST). The report says, "A new economy built on the massive growth of endpoints on the internet will require precise and verifiable timing in ways that current systems do not support. Applications, computers, and communications systems have been developed with modules and layers that optimize data processing but degrade accurate timing." NIST's Chad Boutin likens current network accuracy to an attempt to synchronize watches via the postal system, and suggests that remote medicine and self-driving cars will need far higher standards in order not to put lives at risk. He says, "modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require."
  • ORLY? (Score:3, Insightful)

    by Anonymous Coward on Friday March 20, 2015 @01:52PM (#49303249)

    That's assuming self-driving cars and medicine have any place at all on the internet. Which they don't, if you ask me.

    • Re:ORLY? (Score:4, Insightful)

      by Lunix Nutcase ( 1092239 ) on Friday March 20, 2015 @01:53PM (#49303269)

      Yes, but you aren't an "Internet of Things" seller.

    • by Anonymous Coward

      I think that systemd offers a solution to this problem. It probably includes timekeeping functionality. It also powers all Linux distros, and all IoT devices run Linux. So I don't see why there is even a problem here. All IoT devices could use systemd, systemd will keep the time consistent on all IoT devices, and nobody needs to worry about any of this.

    • That's assuming self-driving cars and medicine have any place at all on the internet.. Which they don't, if you ask me.

      The self-driving car must respond correctly to changes in the weather, traffic reports, detours, road-closings, and the like. Will it take the elevated highway that locals have learned to their cost is extraordinarily dangerous in high winds?

  • by Chirs ( 87576 ) on Friday March 20, 2015 @01:52PM (#49303255)

    The network is not necessarily involved. The example given of a self-driving car talks about the amount of time taken to distinguish between a plastic bag blowing in the wind and a child running in front of the car. This is not "network" timekeeping, just regular real-time processing.

    • Yep, they're ascribing two different things to inaccurate network time. Real-time systems mostly don't need to know the date/time; they need an accurate timer/heartbeat. An inaccurate clock, on the other hand, can cause authentication failures in some cases.
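      The distinction shows up directly in code. A minimal sketch (Python; the 50 ms budget is made up):

```python
import time

# Real-time work wants a monotonic clock: it never jumps when NTP steps
# the system time, so deadlines stay meaningful.
deadline = time.monotonic() + 0.050      # hypothetical 50 ms budget
time_left = deadline - time.monotonic()

# Authentication and logging want wall-clock time, which NTP keeps honest;
# this is the clock that, when wrong, breaks certificate/ticket validation.
wall = time.time()

print(f"{time_left * 1000:.1f} ms left in budget; wall clock = {wall:.3f}")
```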
      • by CreatureComfort ( 741652 ) on Friday March 20, 2015 @02:19PM (#49303579)
        Plus, self-driving cars, in particular, will be using the time stamps from GPS, which is about as accurate as you can get outside of a lab these days, and far more accurate than anything the vehicle will need it for.

        Now what time source my IoT toaster will use, to brown my bread for exactly 23.5439263 seconds, starting at precisely 13 minutes and 4.5098 seconds after local dawn... THAT I am concerned about!
        • Re: (Score:2, Funny)

          by Anonymous Coward
          Without accounting for relative humidity and possible changes therein at start and end times? Are you mad?
          • by Duhavid ( 677874 )

            You didn't factor in bread types, densities and thicknesses.....

        • GPS is expensive though, especially when these IoT devices may be running on batteries that are not being recharged nightly. GPS is good when you have it, but it's still just a starting point as some networks may require a tighter synchronization of time than GPS offers.

      • An accurate clock is essential for wireless networking, point-to-point or meshed. A node must know when its neighbors are hopping to new channels. Almost none of the internet-of-things protocols and techniques are going to rely on some global beacon to keep things in sync (except for the dumb IoT stuff that's merely Bluetooth to your phone). That requires either a distributed beacon or a distributed time-synchronization method.

        Even with self-driving cars that matters. The car may want
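        A minimal sketch of staying on a shared hop schedule once clocks agree; every parameter here is made up:

```python
import time

# All nodes derive the current channel from a shared epoch and dwell time,
# so their clocks only need to agree to a small fraction of DWELL.
EPOCH = 1_700_000_000.0   # agreed network start time (assumed)
DWELL = 0.4               # seconds spent on each channel (assumed)
CHANNELS = [11, 15, 20, 25, 26]

def current_channel(now=None):
    now = time.time() if now is None else now
    slot = int((now - EPOCH) / DWELL)
    return CHANNELS[slot % len(CHANNELS)]

print("hop to channel", current_channel())
```

        Two nodes whose clocks disagree by more than a fraction of the dwell time hop out of step and stop hearing each other, which is exactly the failure mode described above.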

    • This is not "network" timekeeping, just regular real-time processing.

      Besides, SDCs don't get their time from "the network", they get it from GPS satellites, which are accurate to within a few nanoseconds. How far does a wind blown plastic bag move in a nanosecond? The width of a molecule?

    • by plover ( 150551 )

      Remember that the bag's Zigbee radio is broadcasting the bag's location constantly in real time, whereas the child's embedded GPS transceiver is using an accelerometer to help predict when the child will zip across the roadway; plus the child's Wi-Fi chip, network path, etc., will all add latency. If that child's GPS receiver has lost signal due to interference, it's going to need to rely on inertial navigation and its own free-running clock to send the predictions of future locations to the car, and those

    • by itzly ( 3699663 )

      Exactly, and real-time processing with safety critical deadlines is nothing new. That stuff has been done since the very first computer systems were used in industrial control.

  • by Jason Pollock ( 45537 ) on Friday March 20, 2015 @01:53PM (#49303265) Homepage

    There is no "now" [1]. If you're relying on accurate timing from a network, you're already broken. If you require accurate local times, then you know that, and you know the error terms on your clocks. Standard OS clocks only tick at about 100 Hz, so you're out by an average of 5 ms anyway.

    [1] https://queue.acm.org/detail.c... [acm.org]
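    That 100 Hz figure is easy to check empirically (Python; on a modern tickless kernel the observed step is usually far finer than 10 ms):

```python
import time

# Rough empirical check of the wall clock's observable resolution: spin
# until the reported time ticks over, and record the smallest step seen.
def min_step(changes=100):
    smallest = float("inf")
    last = time.time()
    for _ in range(changes):
        now = time.time()
        while now == last:        # busy-wait for the next tick
            now = time.time()
        smallest = min(smallest, now - last)
        last = now
    return smallest

print(f"smallest observed step: {min_step() * 1e6:.3f} µs")
print(f"advertised resolution:  {time.get_clock_info('time').resolution * 1e6:.3f} µs")
```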

    • by Anonymous Coward

      And an atomic clock can synchronize it. If the precision requirement is lower, the average network latency can be used to adjust for it. Still, this problem has been solved over and over again without much issue in multiple industries. I have seen it work without issues with precision below a millisecond.

    • by OzPeter ( 195038 )

      If you require accurate local times, then you know that and know the error terms on your clocks.

      And that was the issue pointed out in the second FA - that the error terms are so badly defined that it affects "correctness" of operation.

      “For example,” he writes “for a driverless car to decide whether what it senses ahead is a plastic bag blowing in the wind or a child running, its decision-making program needs to execute within a tight deadline. Yet modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require,”

      While I can't argue the merits of timekeeping one way or another, I'm wondering if the reporting has actually gotten in the way of what the report is about, because I would want my safety systems running on a hard real-time OS, and this quote implies that they aren't.

      • by 0123456 ( 636235 )

        Real-time OS. You're funny.

        Do you really think the outsourced programmers developing Things for the 'Internet Of Things' will do anything but hack together the code in Java or Python on the cheapest OS they can find?

        • by suutar ( 1860506 )

          The ones for cars hopefully are, because the car companies have a concept of liability for poor design decisions and they're likely to have or know someone who realizes that RTOS is going to work better for that case than what you'd put on a web server. Like the folks who do the firmware for the engine control system; that's got some reasonably tight time tolerances.

          Toasters, not so much. Then again, toasters don't really care about windblown bags.

          • It's incredibly precise. I used to test ECU software to 12,000 RPM. That's 200 revolutions per second, or 72,000 degrees per second.

            At 33 MHz, you have about 458 clock cycles per degree, so if you have a 60-tooth crank sensor with 6 degrees per tooth, you have a real-time position update you need to task-switch to, synchronize with, and schedule events on coming every 2750 clock cycles. In between them, you have to read, filter, and diagnose all of the sensors so you can look up, interpolate, and calculate al
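            The arithmetic checks out; a quick re-derivation:

```python
# Re-deriving the parent's ECU timing figures.
rpm = 12_000
deg_per_s = (rpm / 60) * 360           # 200 rev/s -> 72,000 deg/s
clock_hz = 33e6                        # 33 MHz ECU clock
cycles_per_deg = clock_hz / deg_per_s  # ~458 clock cycles per degree
cycles_per_tooth = cycles_per_deg * (360 / 60)  # 60-tooth wheel, 6 deg/tooth
print(round(cycles_per_deg), round(cycles_per_tooth))  # 458 2750
```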

        • by Jeremi ( 14640 )

          Do you really think the outsourced programmers developing Things for the 'Internet Of Things' will do anything but hack together the code in Java or Python on the cheapest OS they can find?

          Some companies will do a half-assed job, and some will do a more thoughtful job. Then the market will decide whether or not it's willing to pay the extra money to have things done well. The outcome will depend a lot on what the particular Thing is used for, and what the costs of the occasional malfunction are versus the extra cost of developing the software 100% correctly.

          • by itzly ( 3699663 )

            And in the case of safety-critical devices, such as autonomous cars, there will probably be government-mandated regulations in addition to market forces.

        • Hmm, I'm doing IoT using a mix of C and assembler. Not outsourced though. I can guarantee that the outsourced people will not be going python or java except as back office data churning or mock-ups to show to the investors.

        • I expect Apple Watch apps will be written in Swift.

          But now you're telling me the watch itself won't keep accurate time because Darwin isn't an RTOS? Suckers!

      • by sjames ( 1099 )

        Even there, the example is not quite right. The computer needs to decide if it's a paper bag on a tight deadline. It's OK if it still doesn't know at the deadline as long as it applies the brakes assuming it's a child. It's fine if it only 'realizes' after applying the brakes that it's a false alarm.

        But, of course, none of that is at all related to timekeeping. The exact time of day doesn't alter the problem.
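        A sketch of that fail-safe decision rule (Python; classify() and the 50 ms budget are hypothetical stand-ins):

```python
import time

def classify(frame):
    """Hypothetical recognizer; may be slow, may return None (undecided)."""
    return None

def decide(frame, budget_s=0.050):
    deadline = time.monotonic() + budget_s
    verdict = classify(frame)
    # Safe default: unless we positively identified a harmless bag before
    # the deadline, brake. A late "it was only a bag" costs nothing.
    if time.monotonic() > deadline or verdict != "plastic_bag":
        return "BRAKE"
    return "CONTINUE"

print(decide(frame=None))  # -> BRAKE
```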

      • Providing an answer inside of a deadline is an entirely different problem to knowing the current time, and you definitely do not need an accurate clock source to do it.

        Even driving a car doesn't require split-second timing. If it did, humans wouldn't be able to do it. That's why we've got the 2s rule...

        • Even that 2s is relative, a delta: the difference between when the vehicle in front passes a marker and when you do.

          It doesn't matter whether those seconds start at 09:35:27 or 23:59:59[1].

          [1] If your software's written right. I've seen supposed one minute delay loops that would run forever if midnight fell in the interval.

    • Nothing doing IoT should be running a "standard OS" anyway.

    • by Bengie ( 1121981 )
      Newer hardware supports timers that tick millions to tens of millions of times per second. My home firewall is kept within 0.1 ms of a remote NTP server that's about 2000 miles away round trip. It's a hardware tick that is programmable and can schedule interrupts. The OS can check the current counter to see how much time has elapsed since the last check, and NTP clients can adjust the frequency to correct for skew.
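      A toy version of the skew estimate an NTP client maintains; the offsets here are invented:

```python
# Two offset measurements against the same server, an hour apart
# (made-up numbers for illustration).
t0_elapsed, off0 = 0.0, 0.000120      # s since start, offset vs server (s)
t1_elapsed, off1 = 3600.0, 0.000450

# Linear drift rate of the local oscillator relative to the server.
drift = (off1 - off0) / (t1_elapsed - t0_elapsed)
print(f"skew ≈ {drift * 1e6:.3f} ppm")  # ≈ 0.092 ppm

# A daemon feeds this into the kernel so the clock frequency is nudged
# continuously instead of the time being stepped.
```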
  • Duh (Score:2, Informative)

    by Anonymous Coward

    Nobody has ever depended on accurate time sync delivered over a network with zero guarantee of packet delivery, let alone guaranteed delivery time. NTP has always just been "good enough" to keep your systems on the same date/time so things can synchronize in a somewhat organized fashion.

    Anything requiring honestly accurate time sync has always relied on external synchronization schemes: ultra-accurate clocks, sometimes synced with outside networks that /do/ have guarantee mechanisms.


  • Sounds like another real job for a "quantum computer"...

  • "modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require."

    So, we're going back to coding in assembly and calculating the execution time of opcodes, right?

    • So, we're going back to coding in assembly and calculating the execution time of opcodes, right?

      No, we're admitting that non-realtime OSs cannot guarantee when a program will be executed or that it won't be interrupted in the middle for some other task. It is a Bad Thing if the program that is in charge of determining whether your autonomous vehicle is about to hit a small person or a wind-blown plastic bag is currently swapped out because the mp3 player is processing your touch-screen input, or even if it is waiting for disk I/O to complete.

      Coding in assembly will not solve that problem.

      I would
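      The parent's point is easy to measure on a general-purpose OS (Python sketch; results vary with load, which is exactly the problem):

```python
import time

# Ask to sleep 1 ms and see how late we actually wake up. On a non-realtime
# kernel the overshoot varies and has no hard upper bound.
worst = 0.0
for _ in range(1000):
    t0 = time.monotonic()
    time.sleep(0.001)
    worst = max(worst, (time.monotonic() - t0) - 0.001)
print(f"worst wakeup latency over 1000 sleeps: {worst * 1000:.2f} ms")
```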

  • I don't buy it. (Score:2, Insightful)

    by Anonymous Coward

    I am a professional real-time embedded software engineer working with mission-critical networking devices. I don't buy the claims in the article because I don't understand _why_ internet-of-things devices need to have tight time sync or be real-time deterministic.
    Accurate time sync is challenging - especially if you have wireless asymmetric links with non-deterministic latency.

    Rather than trying to fix time sync, we should be questioning the reasons why we require tight sync to begin with. It is definitel

    • The article is just clickbait nonsense, like 66% of what is linked from Slashdot; it confounds at least three OS time-related concepts.
    • Re:I don't buy it. (Score:5, Informative)

      by Guspaz ( 556486 ) on Friday March 20, 2015 @02:30PM (#49303693)

      Exactly. The vast majority of Internet-of-Things devices can solve the problem by just installing ntpd and being done with it. My refrigerator, coffee maker, and dehumidifier don't need hyper-accurate timing, and in the past year my devices running ntpd have never been more than around a tenth of a second off, which is more accurate than anything I actually need.

      I get that you may need hyper-accurate timing for some things, but if something is so critical that a few milliseconds of clock skew can kill people, it shouldn't be connected to the Internet anyhow!
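      For the curious, checking how far off a box actually is takes a few lines, assuming the third-party Python ntplib package:

```python
import ntplib  # third-party: pip install ntplib

client = ntplib.NTPClient()
resp = client.request("pool.ntp.org", version=3)
print(f"local clock offset: {resp.offset * 1000:+.1f} ms")
```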

  • by mmell ( 832646 ) on Friday March 20, 2015 @02:01PM (#49303345)
    GPS already solves this. The clocks are hyper-accurate and world-accessible, and the technology is sufficiently robust and mature to be considered essentially bulletproof. It relies on a broadcast technology that scales to any number of receivers you care to connect and doesn't get bogged down by additional loading. Best of all, it's managed and maintained by the US Government - but it works correctly anyway.
    • by Durrik ( 80651 )
      This works nicely for self-driving cars, which need GPS anyway; I have no idea why self-driving cars were listed. And for the times when it can't get a GPS signal, the internal clock shouldn't drift that much. Unless the self-driving car is 100% underground, it should be able to find a GPS signal to time-sync to often enough.

      Things inside a building might be harder. But there are devices that take a GPS signal and put an NTP server on the network. All you need is one of these and you're fine for the local ne
      • by adolf ( 21054 )

        The problem with network time is that it relies on network access. It fails in all of the same ways that GPS wins.

        But it doesn't matter much because GPS repeaters are things that exist. Some additionally handle GLONASS, thus limiting reliance on any singular government's system.

        For example. [gps-repeaters.com]

    • Or better yet.....write software that doesn't depend on a precise time for security.....
    • by Anonymous Coward

      GPS is robust?

      They rely on an ultra weak signal being received from satellites that are 20,000 km away from the user.

      GPS receivers have their limitations. You can get an idea of the timing accuracy by looking at the positioning accuracy. Roughly dividing the positioning accuracy by the speed of light gives you the timing accuracy that can be achieved. A typical consumer-grade GPS in open-sky conditions will get you around 10 m accuracy. Let's call it 9 m, and let's say the speed of light is 3e8 m/s for eas
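      Finishing that back-of-the-envelope calculation:

```python
position_error_m = 9.0  # typical open-sky consumer receiver, per the parent
c_mps = 3e8             # rounded speed of light, per the parent
print(f"{position_error_m / c_mps * 1e9:.0f} ns")  # -> 30 ns timing accuracy
```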

  • I call bullshit (Score:5, Interesting)

    by msobkow ( 48369 ) on Friday March 20, 2015 @02:11PM (#49303489) Homepage Journal

    Anyone who is designing such systems around "accurate time" hasn't got a freaking clue how to build such systems.

    For example, when dealing with spacing on self-driving vehicles, you rely on radar or laser tracking to maintain the separation between vehicles, not some wildly inaccurate network message about the velocity and position sent by other vehicles.

    Medical in particular baffles me. Who in their right mind would design a medical system that synchronizes with anything other than the patient's own body rhythms?

    But hey, that's what happens when you get some simulation designers trying to apply their single-clock logic to complex systems. They don't think about how real systems work -- the problem isn't an inaccurate time value -- it's an inaccurate understanding of the problem itself.

    • ...Who in their right mind would design a medical system that synchronizes with anything other than the patient's own body rhythms? ...

      If you need to understand how external stimuli affect a patient, you need medical event timestamps that are sync'd to an external agreed-upon clock.


    • by dj245 ( 732906 )

      Anyone who is designing such systems around "accurate time" hasn't got a freaking clue how to build such systems.

      For example, when dealing with spacing on self-driving vehicles, you rely on radar or laser tracking to maintain the separation between vehicles, not some wildly inaccurate network message about the velocity and position sent by other vehicles.

      Why not both? I deal with industrial controls somewhat frequently, and it is a common approach to take multiple inputs, align them into comparable units, then weight them according to their importance and add them together. Typically this is done in such a way that if the usual governing input fails, the remaining inputs, combined with the control logic, will guide the system into a safe state.

    • Things on a network very often need accurate time. Wireless networks especially. Even your basic dumb WiFi depends on timing signals from the access point, and your dumb smart phones require accurate times from the cell access points. Many medical devices are on a network; it's the method for getting data back and forth from the medical device in the exam or operating room to storage for images or patient records.

      • by rdnetto ( 955205 )

        There's a difference between elapsed time locally and globally. Locally (i.e. on a single processor), you can have some meaningful concept of absolute time (i.e. whenever the timer interrupt fires). The moment you introduce a second processor, you run into issues where the two can be ticking at different rates, and the non-trivial delay in communications between them means that you can't ever hope to synchronize them to the extent that you can assume they are the same.

        For most applications, you can get away with fa

  • A regular hissy fit over something they don't yet control.
  • Why does my toaster need to know the time more accurately than, say, a five minute window? For that matter, why does my toaster need internet access?

    For that matter, why the hell do I want my two-ton thin-metal-shelled death trap visible on the internet while flinging its contents (me) down the highway at 80MPH?
    • For that matter, why does my toaster need internet access?

      Because companies need you to be a good little consumer whore and buy worthless junk you don't need.

    • Your toaster does not need this. The IoT is screwed up by the mass media, who don't understand it. The sorts of things that make sense to network are off the radar of most mass-media journalists, who normally write articles about the latest phone apps. I.e., these are electric meters, power transformers, stop lights, street lights, traffic counters, shipping pallets or cargo containers, and plenty of other stuff that has nothing to do with mass-market consumer goods.

  • At least as far as Android is concerned, it's endangered by an incredibly buggy implementation of the Bluetooth LE stack.

  • It's hardly the only thing by which the "Internet of Things" is endangered. It's far from the biggest threat, I'd even say.

  • by Anonymous Coward

    DOCSIS cable modems use Time Division Multiple Access (TDMA) techniques to let multiple subscribers share valuable upstream bandwidth. When a cable modem wants to transmit data up to the internet, the data packet must arrive at the cable provider's equipment with a precision of around 6.25 microseconds. With this amount of precision, the transit time has to be factored in. Yet this is done all the time with cable modems that cost less than $100 a pop.

    There already exists all the technology require
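    A rough sense of why transit time must be compensated at that precision; the plant length and velocity factor here are hypothetical:

```python
# One-way propagation delay over a hypothetical 10 km cable plant.
plant_length_m = 10_000
velocity_factor = 0.87          # assumed typical for hardline coax
c_mps = 3e8
transit_s = plant_length_m / (velocity_factor * c_mps)
print(f"one-way transit: {transit_s * 1e6:.1f} µs")  # ≈ 38.3 µs, vs 6.25 µs slots

# DOCSIS "ranging" measures this delay per modem and tells each one how far
# in advance to transmit so its bursts land in the assigned slot.
```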

    • However, if you have a network of devices that all need to keep in time sync with each other, then this gets a lot harder. If there's no global time source that everyone hears (like a WiFi access point), it's harder still, as you have to distribute the time synchronization across the network.

  • I can't figure out what they are talking about.

    I've only seen IoT things that either don't care at all about time: all the datakeeping is local, and you can ask them or not ask them about state and logs (a fridge or a kettle doesn't care what your clock is),

    or IoT things that are real-time, which don't care what your clock is because they just want to contact you as fast as possible, like a fire alarm. It really doesn't care what your clock is, it just wants to get the

    • by Mirar ( 264502 )

      and no,

      “For example,” he writes “for a driverless car to decide whether what it senses ahead is a plastic bag blowing in the wind or a child running, its decision-making program needs to execute within a tight deadline. Yet modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require,”

      has nothing to do with IoT _or_ timekeeping.

    • IoT is a vague concept. I don't think of refrigerators, toasters, or even smart phones as IoT, because those are simple problems to solve. Just put a big noisy access point in the house that everything hears, and that's solved, for the consumer-electronics stuff that mass media loves to talk about when it says IoT. But for a distributed wireless network of things, running off batteries that cannot easily be recharged, some of which may be moving around, this becomes a very difficult problem.

      I have

      • by Mirar ( 264502 )

        As far as I know, network synchronization has been solved in many different ways already, and is not a problem. Can you give an example of where it hasn't been solved?

        Z-Wave, Zigbee, 802.11*, and BLE have all solved that. If you invent more (mesh) networks you'll have to solve it for your stack. But it's not like solutions don't exist. Or that these protocols have anything to do with IoT. (Timing on networks like Ethernet or CAN, or radio protocols like GPS or 4G/LTE, have all been solved as well, and have eve

  • by David_Hart ( 1184661 ) on Friday March 20, 2015 @02:59PM (#49303961)

    The article talks about synchronization of time between systems and processes, not accurate time, as in my watch is 5 minutes fast.

    If a self-driving car sees something in front of it and launches an app to determine what that object is, then that app needs to return an answer before the car hits the object, and in time to brake to a stop if necessary. It needs a time signal to understand how much time it has left. The problem, in this situation, is that without some sort of accurate time signal and time synchronization, the object recognition app could take more than the remaining time to develop an answer. Of course, you could launch a second app that acts as an emergency braking program that will hit the brakes in time, even if the object recognition app hasn't returned a result. The problem here is that you still don't know within a rigid level of certainty that the emergency braking app will complete in time.

    In many ways you can see this exact same problem with inexperienced drivers. It takes them longer to process what's in front of them and decide to hit the brakes or not. An experienced driver almost has an automatic awareness ("muscle memory") that gives them an advantage when reacting to situations that they have encountered before.

    My thought is that as these scenarios become "learned", they can be moved to "muscle memory". For example, most firewall devices rely on application-specific integrated circuits (ASICs) for real-time firewall rule evaluation. It seems to me that self-driving cars will require their own version of ASICs that contain "rules of the road" and evaluation shortcuts to handle real-time events without having to rely on time signaling.

    • by itzly ( 3699663 )

      If a self driving car is seeing something in front of it and launches an app to determine what that object is, then that app needs to return an answer before the car hits the object and in time to brake to a stop, if necessary. It needs a time signal to understand how much time it has left.

      It doesn't need a time signal. It just needs to be written in such a way that it's guaranteed to be fast enough.

    • > If a self driving car is seeing something in front of it and launches an app to
      > determine what that object is, then that app needs to return an answer
      > before the car hits the object and in time to brake to a stop, if necessary.
      > It needs a time signal to understand how much time it has left.

      What are you talking about? Time to impact = distance-to-object divided by your current speed. Distance is obtainable by radar/sonar/whatever, and speed comes from the same speedometer connected to your ca
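      Which is a purely local computation; with made-up numbers:

```python
def time_to_impact_s(distance_m, speed_mps):
    """Budget available to the recognition code, from local sensors only."""
    return distance_m / speed_mps

# Hypothetical: object 50 m ahead at 80 MPH (~35.8 m/s).
print(f"{time_to_impact_s(50.0, 35.8):.2f} s")  # ~1.40 s, no network clock needed
```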

  • 1. Don't let safety-critical decisions be based on unreliable time sources.
    2. Let each device tag incoming messages with its own timestamps, which never leave the device. Due to the laws of nature messages can safely be assumed to have been transmitted no later than the time of reception.

    I wonder if I should patent this...
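    A minimal sketch of that receiver-side stamping (Python):

```python
import time

def on_message(payload):
    # Tag with the receiver's own clock. The stamp never leaves this device,
    # so no cross-device clock agreement is needed; by causality the message
    # was sent no later than this moment.
    return {"payload": payload, "received_at": time.monotonic()}

print(on_message(b"sensor reading"))
```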

    • Due to the laws of nature messages can safely be assumed to have been transmitted no later than the time of reception.

      I wonder if I should patent this...

      Yes, patent it. However, be sufficiently ambiguous in your patent that it will somehow apply to time machines. If the patent passes muster, you'll obviously receive a visit from either your future self, or a beta testing team that needs you to not file that patent. Either way, you'll likely end up either very rich, or very dead.

  • Of not living up to its marketing. Despite what those idiot investors and marketing folks who over-invested in the buzzword would like us to believe, there will be no internet of things. While one can think of plenty of reasons why any particular object in the house might be slightly improved by being able to share some random status or change with the internet at large, one can barely think of anything that would be greatly improved by this behavior. Yes, I can get a push notification when the toaster pops.

  • They are the official time keepers, so of course they want the world to rely on their services for better time keeping!

  • I have seen this with radar processing chains where different components slew the time at different rates, mostly because of differences in the OS and the time synchronization software. If one part of the chain suddenly steps its time by a second, downstream components reject its messages.

  • First, only one system on the network needs to actually check with the internet time servers. Everything else can just check against that local system. The most damaging thing is not having the time be wrong but having the times out of sync with each other. I'd much rather have all the systems be 4 hours off in the same direction than have them be every which way, with some using the right time and some not.

    Second, there are a lot of ways to check the time, and NIST is not the only way to do it. A lot

  • I am sorry to inform you that there is an old Turkish proverb that states "If my aunt had a moustache, she would be my uncle." This was the first thing that came to my mind while reading TFA....
  • The Time Rift of 2100: How We lost the Future --- and Gained the Past.

    WE CAN ONLY BLAME OURSELVES for the Time Rift. From discrete logic to main boards to chipsets to picoboards to nanite molecular clusters, we had machines re-drawing the same machines on a smaller scale until they were like dust and pebbles, and yet, everything worked pretty well most of the time.

    THE DISTINCTION between software and hardware had merged, workable modules open sourced and refined with a really clever interconnection scheme. Somewhere along the line we left hardware design from 'scratch' --- and software design to the 'code' level --- behind. Things were no longer constructed for purpose. Software was no longer compiled. We began to plug and play and clone and shim.

    IT WAS HUMANS, amateur enthusiasts even, that first cloned and shimmed small machines into other machines of similar more refined purpose, and they did it with the same techniques we had used to construct analog circuits: locking together this way, and securing with that, test and done. There was an art to it. Where one had once meshed APIs together in the synchronous communications realm, now it was a matter of finding the proper angle and orientation of these smart pebbles, based on their markings and unique shapes. There was a flair to it, and some of this art was as much judged by its appearance as by function.

    BUT SOON WE GREW WEARY of that, and trained our machines to clone, shim and assemble these smaller machines. It was like some cyborg Tetris game where your challenge was to fit the pieces together as they fell from the sky. And the sky was full of pieces. Anything was possible if your reach was long and you gazed far enough, to grasp the perfect puzzle-piece.

    A FEW RESPONSIBLE ENGINEERS of the era took the time to publish diagnostic procedures with which one could fix these amalgamations, should one have the patience to pull them apart to do so, like the SAMS Photofacts of old. Every piece had its own direct interface for configuration and in theory at least, one could fix problems or reconfigure the pieces by simply talking to them directly. They documented these diagnostic and configuration interfaces, often cribbed from the documents of other engineers, which were scarcely ever used now, probing them to discover the more primal pieces within to gather documentation on those too.

    BUT IT WAS THANKLESS to do so, and these engineers found themselves out of work or forcefully retired. Their productivity paled beside younger geniuses who were simple hunter-gatherers, whose cleverness in assembling working prototypes was deft and swift. From concept to bubble-wrap, technology companies had little interest in deep documentation. It was seen as a fetish. The thing works! Clone it and done. These hastily made things flooded the market and soon replaced other well-documented things. At times something failed and its inventors could not say why; they just assembled a new one or went bankrupt.

    IN A SAD IRONY, given the supposed superiority of digital over analog, this whole profession of digitally-stored 'source' documentation began to fade and was finally lost. It had become dusty, and the unlooked-for documents of previous eras were first flagged and moved to lukewarm storage. It was a circular process, where the world's centralized search indices would be culled to remove pointers to things that were seldom accessed. Then a separate clean-up where the fact that something was not in the index alone determined that it was purgeable. The process was completely automated, of course, so no human was on hand to mourn the passing of material that had been the proud product of entire careers. It simply faded.

    THEN SOMETHING TOOK THE INTERNET BY STORM, it was some silly but popular Game with a perversely intricate (and ultimately useless) information store. Within the space of six months index culling and auto-purge had assigned more than a third of all storage to the Game. Only as the Game itself faded

    • OH --- AND BEFORE YOU GO --- do please look over these necklaces of fine silicon jewels and take one as your own, or for your sweetie. You see they are actually little computers, or 'chips', as ran the great society of old. I have filed off the covering so you can see the tiny chip, which shines in the light. See here! Only a copper or two for each, and if you look me in the eye and promise you will strive to better your mind and help re-build this world, I'll part with it for a shake of the hand.

      Did you kn
