Internet of Things Endangered By Inaccurate Network Time, Says NIST 166
An anonymous reader writes: Current standards of network timekeeping are inadequate for some of the critical systems that are being envisaged for the Internet of Things, according to a report (PDF) by the National Institute of Standards and Technology (NIST). The report says, "A new economy built on the massive growth of endpoints on the internet will require precise and verifiable timing in ways that current systems do not support. Applications, computers, and communications systems have been developed with modules and layers that optimize data processing but degrade accurate timing." NIST's Chad Boutin likens current network accuracy to an attempt to synchronize watches via the postal system, and suggests that remote medicine and self-driving cars will need far higher standards in order not to put lives at risk. He says, "modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require."
ORLY? (Score:3, Insightful)
That's assuming self-driving cars and medicine have any place at all on the internet. Which they don't, if you ask me.
Re:ORLY? (Score:4, Insightful)
Yes, but you aren't an "Internet of Things" seller.
Why not just use systemd? (Score:1)
I think that systemd offers a solution to this problem. It probably includes timekeeping functionality. It also powers all Linux distros, and all IoT devices run Linux. So I don't see why there is even a problem here. All IoT devices could use systemd, systemd will keep the time consistent on all IoT devices, and nobody needs to worry about any of this.
Re: (Score:2)
That's assuming self-driving cars and medicine have any place at all on the internet. Which they don't, if you ask me.
The self-driving car must respond correctly to changes in the weather, traffic reports, detours, road-closings, and the like. Will it take the elevated highway that locals have learned to their cost is extraordinarily dangerous in high winds?
NOT "network timekeeping", just timekeeping (Score:5, Informative)
The network is not necessarily involved. The example given of a self-driving car talks about the amount of time taken to distinguish between a plastic bag blowing in the wind and a child running in front of the car. This is not "network" timekeeping, just regular real-time processing.
Re: (Score:2)
Re:NOT "network timekeeping", just timekeeping (Score:4, Funny)
Now what time source my IoT toaster will use, to brown my bread for exactly 23.5439263 seconds, starting at precisely 13minutes and 4.5098 seconds after local dawn... THAT I am concerned about!
Re: (Score:2, Funny)
Re: (Score:2)
You didn't factor in bread types, densities and thicknesses.....
Re: (Score:2)
Shortly thereafter, toasters will be designed to quietly burn anything that doesn't have a cryptographically signed RFID tag...
Re: (Score:2)
GPS is expensive though, especially when these IoT devices may be running on batteries that are not being recharged nightly. GPS is good when you have it, but it's still just a starting point as some networks may require a tighter synchronization of time than GPS offers.
Re: (Score:2)
Accurate time (clock) is very necessary for wireless networking, point to point or meshed. A node must know when its neighbors are hopping to new channels. Almost all of the internet-of-things protocols and techniques are not going to rely on some global beacon to keep things in sync (except for the dumb IoT stuff that's merely Bluetooth to your phone). That requires either a distributed beacon or a distributed time-synchronization method.
Even with self driving cars that matters. The car may want
Re: (Score:2)
This is not "network" timekeeping, just regular real-time processing.
Besides, SDCs don't get their time from "the network", they get it from GPS satellites, which are accurate to within a few nanoseconds. How far does a wind blown plastic bag move in a nanosecond? The width of a molecule?
Re: (Score:3)
Remember that the bag's Zigbee radio is broadcasting the bag's location constantly in real time, whereas the child's embedded GPS transceiver is using an accelerometer to help predict when the child will zip across the roadway; plus the child's Wi-Fi chip, network path, etc., will all add latency. If that child's GPS receiver has lost signal due to interference, it's going to need to rely on inertial navigation and its own free-running clock to send the predictions of future locations to the car, and those
Re: (Score:2)
Exactly, and real-time processing with safety critical deadlines is nothing new. That stuff has been done since the very first computer systems were used in industrial control.
You're doing it wrong. (Score:5, Insightful)
There is no "now" [1]. If you're relying on accurate timing from a network, you're already broken. If you require accurate local times, then you know that and know the error terms on your clocks. Standard OS clocks only tick at about 100 Hz, so you're always out by an average of 5 ms anyway.
[1] https://queue.acm.org/detail.c... [acm.org]
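The 100 Hz figure varies by OS and configuration; a quick sketch (standard-library only) to see what your own system advertises:

```python
import time

# Ask the OS what granularity its clocks actually advertise.
# A 100 Hz scheduler tick would correspond to 0.01 s resolution;
# many modern systems report much finer values.
wall = time.get_clock_info("time")
mono = time.get_clock_info("monotonic")
print(f"wall clock resolution:      {wall.resolution} s")
print(f"monotonic clock resolution: {mono.resolution} s")
```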
Re: (Score:1)
And an atomic clock can synchronize it. If the precision requirement is lower, the average network latency can be used to adjust for it. Still, this problem has been solved over and over again without much issue in multiple industries. I have seen it work without issues with precision below a millisecond.
Re: (Score:3)
If you require accurate local times, then you know that and know the error terms on your clocks.
And that was the issue pointed out in the second FA - that the error terms are so badly defined that it affects "correctness" of operation.
“For example,” he writes “for a driverless car to decide whether what it senses ahead is a plastic bag blowing in the wind or a child running, its decision-making program needs to execute within a tight deadline. Yet modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require,”
While I can't argue the merits of timekeeping one way or another, I'm wondering if the reporting of this report has actually gotten in the way of what the report is actually about, because I would want my safety systems running on a hard real-time OS, and this quote implies that they aren't.
Re: (Score:2)
Real-time OS. You're funny.
Do you really think the outsourced programmers developing Things for the 'Internet Of Things' will do anything but hack together the code in Java or Python on the cheapest OS they can find?
Re: (Score:2)
The ones for cars hopefully are, because the car companies have a concept of liability for poor design decisions and they're likely to have or know someone who realizes that RTOS is going to work better for that case than what you'd put on a web server. Like the folks who do the firmware for the engine control system; that's got some reasonably tight time tolerances.
Toasters, not so much. Then again, toasters don't really care about windblown bags.
Re: (Score:2)
It's incredibly precise; I used to test ECU software to 12,000 RPM. That's 200 revolutions per second, or 72,000 degrees per second.
At 33 MHz, you have about 458 clock cycles per degree, so if you have a 60-tooth crank sensor with 6 degrees per tooth, you have a real-time position update you need to task switch to, synchronize with, and schedule events on coming every 2750 clock cycles. In between them, you have to read, filter, and diagnose all of the sensors so you can look up, interpolate, and calculate al
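The arithmetic checks out; a quick sketch reproducing the cycle budget from the figures above:

```python
# Cycle budget for a 33 MHz ECU at 12,000 RPM with a 60-tooth
# crank wheel (360/60 = 6 degrees per tooth).
CPU_HZ = 33_000_000
RPM = 12_000
DEG_PER_TOOTH = 360 / 60                # 6 degrees

deg_per_sec = (RPM / 60) * 360          # 200 rev/s * 360 = 72,000 deg/s
cycles_per_degree = CPU_HZ / deg_per_sec            # ~458 cycles
cycles_per_tooth = cycles_per_degree * DEG_PER_TOOTH  # 2,750 cycles

print(round(cycles_per_degree), round(cycles_per_tooth))  # 458 2750
```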
Re: (Score:2)
Do you really think the outsourced programmers developing Things for the 'Internet Of Things' will do anything but hack together the code in Java or Python on the cheapest OS they can find?
Some companies will do a half-assed job, and some will do a more thoughtful job. Then the market will decide whether or not it's willing to pay the extra money to have things done well. The outcome will depend a lot on what the particular Thing is used for, and what the costs of the occasional malfunction are vs the extra development costs of developing the software 100% correctly.
Re: (Score:2)
And in the case of safety-critical devices, such as autonomous cars, there will probably be government-mandated regulations in addition to market forces.
Re: (Score:2)
Hmm, I'm doing IoT using a mix of C and assembler. Not outsourced though. I can guarantee that the outsourced people will not be going python or java except as back office data churning or mock-ups to show to the investors.
Re: (Score:2)
I expect Apple Watch apps will be written in Swift.
But now you're telling me the watch itself won't keep accurate time because Darwin isn't an RTOS? Suckers!
Re: (Score:2)
Even there, the example is not quite right. The computer needs to decide if it's a paper bag on a tight deadline. It's OK if it still doesn't know at the deadline as long as it applies the brakes assuming it's a child. It's fine if it only 'realizes' after applying the brakes that it's a false alarm.
But, of course none of that is at all related to timekeeping. The exact time of day doesn't alter the problem.
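That brake-unless-cleared policy can be sketched in a few lines; the classifier and names here are hypothetical, not from the article:

```python
import time

def decide(classify, deadline_s):
    """Brake-by-default: if the (hypothetical) classifier has not
    returned a confident 'plastic_bag' verdict by the deadline, act
    as if it is a child and brake. A late 'false alarm' answer just
    means the brakes are released afterwards."""
    deadline = time.monotonic() + deadline_s
    verdict = None
    while time.monotonic() < deadline and verdict is None:
        verdict = classify()  # returns None while still undecided
    return "continue" if verdict == "plastic_bag" else "brake"

# A classifier that never finishes in time still yields a safe outcome:
print(decide(lambda: None, deadline_s=0.01))  # brake
```

Note the deadline governs the decision, not the classification: missing it is safe, just slower.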
Re: (Score:2)
Providing an answer inside of a deadline is an entirely different problem to knowing the current time, and you definitely do not need an accurate clock source to do it.
Even driving a car doesn't require split-second timing. If it did, humans wouldn't be able to do it. That's why we've got the two-second rule...
Re: (Score:1)
Even that is a relative 2 s, a delta: the difference between when the vehicle in front passes a marker and when you do.
It doesn't matter whether those seconds start at 09:35:27 or 23:59:59[1].
[1] If your software's written right. I've seen supposed one minute delay loops that would run forever if midnight fell in the interval.
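The midnight bug in footnote [1] comes from anchoring a delay to wall-clock seconds-of-day; anchoring to a monotonic clock avoids it (a sketch):

```python
import time

def delay(seconds):
    # Anchor the deadline to a monotonic clock: it never wraps at
    # midnight and never jumps backward when NTP steps the wall clock.
    deadline = time.monotonic() + seconds
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        time.sleep(min(remaining, 0.05))

start = time.monotonic()
delay(0.1)
assert time.monotonic() - start >= 0.1
```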
Re: (Score:2)
Nothing doing IoT should be doing "standard OS" anyway.
Re: (Score:2)
Duh (Score:2, Informative)
Nobody has ever depended on accurate time synch delivered over a network system with zero guarantee of packet delivery, let alone guarantee of delivery time. NTP has always just been "good enough" to make sure your systems are on the same date/time so things can synchronize in a somewhat organized fashion.
Anything requiring honestly accurate time synch has always relied on external synchronization schemes. Ultra-accurate clocks, sometimes synched with outside networks that /do/ have guarantee mechanisms.
B
"probabilities on execution times..." (Score:2)
Sounds like another real job for a "quantum computer"...
Finally! (Score:2)
So, we're going back to coding in assembly and calculating the execution time of opcodes, right?
Re: (Score:2)
So, we're going back to coding in assembly and calculating the execution time of opcodes, right?
No, we're admitting that non-realtime OSs cannot guarantee when a program will be executed or that it won't be interrupted in the middle for some other task. It is a Bad Thing if the program that is in charge of determining whether your autonomous vehicle is about to hit a small person or a wind-blown plastic bag is currently swapped out because the mp3 player is processing your touch-screen input, or even if it is waiting for disk I/O to complete.
Coding in assembly will not solve that problem.
I would
I don't buy it. (Score:2, Insightful)
I am a professional real-time embedded software engineer working with mission-critical networking devices. I don't buy the claims in the article because I don't understand _why_ internet-of-things devices need to have tight time sync or be real-time deterministic.
Accurate time sync is challenging - especially if you have wireless asymmetric links with non-deterministic latency.
Rather than trying to fix time sync, we should be questioning the reasons why we require tight sync to begin with. It is definitel
Re: (Score:2)
Re:I don't buy it. (Score:5, Informative)
Exactly. The vast majority of Internet-of-Things devices can solve the problem by just installing ntpd and being done with it. My refrigerator or coffee maker or dehumidifier don't need hyper-accurate timing, and in the past year my devices running ntpd have never been more than around a tenth of a second off, which is still more accurate than anything that I actually need.
I get that you may need hyper-accurate timing for some things, but if something is so critical that a few milliseconds of clock skew can kill people, it shouldn't be connected to the Internet anyhow!
Re: (Score:2)
Call the producers of Hoarders, because for that to happen there would need to be IoT devices piled so high in my home that I could not reach the computer.
Re: (Score:2)
Re: (Score:3)
Comment removed (Score:5, Insightful)
Re: (Score:3)
Things inside a building might be harder. But there are things that take a GPS signal and put an NTP server on the network. All you need is one of these and you're fine for the local ne
Re: (Score:2)
The problem with network time is that it relies on network access. It fails in all of the same ways that GPS wins.
But it doesn't matter much because GPS repeaters are things that exist. Some additionally handle GLONASS, thus limiting reliance on any singular government's system.
For example. [gps-repeaters.com]
Re: (Score:2)
Having a backup is a win. The GPS in my phone does both, concurrently.
And good luck on getting the US and Russia to agree to do anything at the same time...
Re: (Score:2)
Re: (Score:1)
GPS is robust?
They rely on an ultra weak signal being received from satellites that are 20,000 km away from the user.
GPS receivers have their limitations. You can get an idea of the timing accuracy by looking at the positioning accuracy. Roughly dividing the positioning accuracy by the speed of light gives you the timing accuracy that can be achieved. A typical consumer-grade GPS in open-sky conditions will get you around 10 m accuracy. Let's call it 9 m, and let's say the speed of light is 3e8 m/s for eas
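Carrying that division through with the same round numbers:

```python
# Rough timing accuracy implied by GPS positioning accuracy:
# divide the position error by the speed of light.
position_error_m = 9.0   # typical consumer GPS, open sky
c = 3e8                  # m/s, rounded as in the comment above

timing_error_s = position_error_m / c
print(f"{timing_error_s * 1e9:.0f} ns")  # 30 ns
```

So even a cheap receiver's *timebase* is tens of nanoseconds, far tighter than its position fix suggests at first glance.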
Re: (Score:2)
Silly! How would that channel extra funds to NIST?
http://tf.nist.gov/time/common... [nist.gov]
Because NIST developed the "Common view time transfer using the GPS system"...
Because NIST has a finger in everything having to do with measurement?
Clearly, you'll never be a politician, son!
I call bullshit (Score:5, Interesting)
Anyone who is designing such systems around "accurate time" hasn't got a freaking clue how to build such systems.
For example, when dealing with spacing on self-driving vehicles, you rely on radar or laser tracking to maintain the separation between vehicles, not some wildly inaccurate network message about the velocity and position sent by other vehicles.
Medical in particular baffles me. Who in their right mind would design a medical system that synchronizes with anything other than the patient's own body rhythms?
But hey, that's what happens when you get some simulation designers trying to apply their single-clock logic to complex systems. They don't think about how real systems work -- the problem isn't an inaccurate time value -- it's an inaccurate understanding of the problem itself.
Re: (Score:3)
...Who in their right mind would design a medical system that synchronizes with anything other than the patient's own body rhythms? ...
If you need to understand how external stimuli affect a patient, you need medical event timestamps that are sync'd to an external agreed-upon clock.
Re: (Score:2)
Anyone who is designing such systems around "accurate time" hasn't got a freaking clue how to build such systems.
For example, when dealing with spacing on self-driving vehicles, you rely on radar or laser tracking to maintain the separation between vehicles, not some wildly inaccurate network message about the velocity and position sent by other vehicles.
Why not both? I deal with industrial controls somewhat frequently, and it is a common approach to take multiple inputs, align them into comparable units, then weight them according to their importance and add them together. Typically this is done in such a way that if the usual governing input fails, the remaining inputs, combined with the control logic, will guide the system into a safe state.
Re: (Score:2)
If something is on a network they very often need accurate time. Wireless networks especially. Even your basic dumb wifi depends on timing signals from the access point, and your dumb smart phones require accurate times from the cell access points. Many medical devices are on a network; it's a method to get data back and forth from the medical device that is in the exam or operating room back to storage for images or patient records.
Re: (Score:2)
There's a difference between elapsed time locally and globally. Locally (i.e. on a single processor), you can have some meaningful concept of absolute time (i.e. whenever the timer interrupt fires). The moment you introduce a second processor, you run into issues where they can be ticking at different rates, and the non-trivial delay in communications between them means that you can't ever hope to synchronize them to the extent you can assume they are the same.
For most applications, you can get away with fa
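A toy number for how fast two free-running oscillators drift apart (the 50 ppm figure is an assumption, but typical of inexpensive crystals):

```python
# Two processors whose oscillators differ by 50 ppm (assumed)
# disagree noticeably within an hour, with no synchronization.
drift_ppm = 50
elapsed_s = 3600                        # one hour of true time

disagreement_s = elapsed_s * drift_ppm * 1e-6
print(f"{disagreement_s * 1000:.0f} ms apart after one hour")  # 180 ms
```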
NISTerical (Score:1)
Can we stop treating the "IoT" as a real thing? (Score:2)
For that matter, why the hell do I want my two-ton thin-metal-shelled death trap visible on the internet while flinging its contents (me) down the highway at 80MPH?
Re: (Score:2)
For that matter, why does my toaster need internet access?
Because companies need you to be a good little consumer whore and buy worthless junk you don't need.
Re: (Score:2)
Your toaster does not need this. The IoT is screwed up by the mass media, who don't understand it. The sorts of things that make sense to be networked are off the radar of most mass-media journalists, who normally write articles about the latest phone apps. I.e., these are electric meters, power transformers, stop lights, street lights, traffic counters, shipping pallets or cargo containers, and plenty of other stuff that has nothing to do with mass-market consumer goods.
Re: (Score:2)
I put the butter in the pan, put the bread in the toaster, put the eggs in the pan, pour the OJ, flip the eggs, take the toast out, and flip the egg onto it. None of that requires Stratum-1 quality time, or even an internet connection.
More importantly - You have apparently confused "magic" for "the internet". The fact that your toast
Android POV (Score:2)
At least as far as Android is concerned it is endangered by an incredibly buggy implementation of the Bluetooth LE stack.
So what? (Score:1)
It's hardly the only thing by which the "Internet of Things" is endangered. It's far from the biggest threat, I'd even say.
How accurate do we need? (Score:1)
DOCSIS cable modems use Time Division Multiple Access (TDMA) techniques to allow multiple subscribers to share their valuable upstream bandwidth. When a cable modem wants to transmit data up to the internet, the data packet must arrive at the cable provider's equipment with a precision of around 6.25 microseconds. With this amount of precision, the transit time has to be factored in. Yet this is done all the time with cable modems that cost less than $100 a pop.
There already exists all the technology require
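To see why transit time must be factored in: signals in cable plant propagate well below c (the 87% velocity factor and 20 km plant length below are illustrative assumptions), so the one-way delay alone dwarfs the 6.25 µs slot precision, which is why DOCSIS ranging measures and compensates for it:

```python
c = 299_792_458.0          # m/s
velocity_factor = 0.87     # assumed propagation factor for the plant
plant_length_m = 20_000    # assumed 20 km from modem to head end

transit_s = plant_length_m / (velocity_factor * c)
print(f"one-way transit: {transit_s * 1e6:.1f} us vs 6.25 us slot precision")
```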
Re: (Score:2)
However if you have a network of devices that all need to keep in time sync with each other then this gets a lot harder. If there's no global time source that everyone hears (like a wifi access point) then it's harder still as you have to distribute the time synchronization across the network.
what system? example? (Score:2)
I can't figure out what they are talking about.
I've only seen IoT things that either don't care at all about time: all the datakeeping is local, and you can ask them or not ask them about the state and the logs (a fridge or a kettle doesn't care what your clock is),
or IoT things that are real time, which don't care what your clock is because they just want to contact you as fast as possible, like a fire alarm. It really doesn't care what your clock is, it just wants to get the
Re: (Score:2)
and no,
“For example,” he writes “for a driverless car to decide whether what it senses ahead is a plastic bag blowing in the wind or a child running, its decision-making program needs to execute within a tight deadline. Yet modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require,”
has nothing to do with IoT _nor_ timekeeping.
Re: (Score:2)
IoT is a vague concept. I don't think about refrigerators, toasters, or even smart phones as IoT, because those are simple problems to solve. Just put a big noisy access point in the house that everything hears and that is solved, for the consumer-electronics stuff that mass media loves to talk about when it says IoT. But for a distributed wireless network of things, running off of batteries that cannot be easily recharged, some of which may be moving around, this becomes a very difficult problem.
I have
Re: (Score:2)
As far as I know, network synchronization has been solved in many different ways already, and is not a problem. Can you give some example of where it hasn't been solved?
Z-Wave, Zigbee, 802.11*, and BLE have all solved that. If you invent more (mesh) networks you'll have to solve it for your stack. But it's not like solutions don't exist. Or that these protocols have anything to do with IoT. (Timing on networks like Ethernet or CAN, or radio protocols like GPS or 4G/LTE, have all been solved as well, and have eve
Summary is misleading... (Score:3)
The article talks about synchronization of time between systems and processes, not accurate time, as in my watch is 5 minutes fast.
If a self-driving car is seeing something in front of it and launches an app to determine what that object is, then that app needs to return an answer before the car hits the object and in time to brake to a stop, if necessary. It needs a time signal to understand how much time it has left. The problem, in this situation, is that without some sort of accurate time signal and time synchronization, the object recognition app could take more than the remaining time to develop an answer. Of course, you could launch a second app that acts as an emergency braking program that will hit the brakes in time, even if the object recognition app hasn't returned a result. The problem here is that you still don't know within a rigid level of certainty that the emergency braking app will complete in time.
In many ways you can see this exact same problem with inexperienced drivers. It takes them longer to process what's in front of them and decide to hit the brakes or not. An experienced driver almost has an automatic awareness ("muscle memory") that gives them an advantage when reacting to situations that they have encountered before.
My thought is that as these scenarios become "learned", they can be moved to "muscle memory". For example, most firewall devices rely on application-specific integrated circuit (ASIC) for real-time firewall rule evaluation. It seems to me that self-driving cars will require their own version of ASICs that contain "rules of the road" and evaluation shortcuts to handle real-time events without having to rely on time signaling.
Re: (Score:2)
If a self driving car is seeing something in front of it and launches an app to determine what that object is, then that app needs to return an answer before the car hits the object and in time to brake to a stop, if necessary. It needs a time signal to understand how much time it has left.
It doesn't need a time signal. It just needs to be written in such a way that it's guaranteed to be fast enough.
Re: (Score:2)
> If a self driving car is seeing something in front of it and launches an app to
> determine what that object is, then that app needs to return an answer
> before the car hits the object and in time to brake to a stop, if necessary.
> It needs a time signal to understand how much time it has left.
What are you talking about? Time to impact = distance-to-object divided by your current speed. Distance is obtainable by radar/sonar/whatever, and speed comes from the same tachometer connected to your ca
Simple (Score:2)
1. Don't let safety-critical decisions be based on unreliable time sources.
2. Let each device tag incoming messages with its own timestamps, which never leave the device. Due to the laws of nature messages can safely be assumed to have been transmitted no later than the time of reception.
I wonder if I should patent this...
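Point 2 can be sketched in a few lines (the function name is hypothetical):

```python
import time

def tag_on_receipt(remote_ts, now=None):
    """A message cannot have been sent later than it was received,
    so clamp any remote timestamp to the local receive time.
    The local tag never leaves this device."""
    if now is None:
        now = time.time()
    return min(remote_ts, now)

# A message stamped 'in the future' by a mis-set remote clock
# gets pulled back to the local receive time:
print(tag_on_receipt(2_000_000_000.0, now=1_000_000_000.0))  # 1000000000.0
```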
Re: (Score:2)
Due to the laws of nature messages can safely be assumed to have been transmitted no later than the time of reception.
I wonder if I should patent this...
Yes, patent it. However, be sufficiently ambiguous in your patent that it will somehow apply to time machines. If the patent passes muster, you'll obviously receive a visit from either your future self, or a beta testing team that needs you to not file that patent. Either way, you'll likely end up either very rich, or very dead.
The Internet things in Great Danger! (Score:2)
Of not living up to its marketing. Despite what those idiot investors and marketing folks who over-invested in the buzzword would like us to believe, there will be no internet of things. While one can think of plenty of reasons why any particular object in the house might be slightly improved by being able to share some random status or change to the internet at large, one can barely think of anything that this would greatly improve. Yes, I can get a push notification when the toaster pops.
Of course NIST would say that! (Score:2)
They are the official time keepers, so of course they want the world to rely on their services for better time keeping!
Leap seconds (Score:2)
I have seen this with radar processing chains where different components slew the time at different rates, mostly because of differences in the OS and the time-synchronisation software. If one part of the chain suddenly steps its time by a second, downstream components reject its messages.
this is a bs problem (Score:2)
First, only one system on the network needs to actually check with the internet time servers. Everything else can just check that local system. The most damaging thing is not having the time be wrong but having the times be out of sync with each other. I'd much rather have all the systems be 4 hours off in the same direction than have them be every which way, with some of them using the right time and some of them not.
Second, there are a lot of ways to check the time and the NIST is not the only way to do it. A lot
Lost in translation (Score:2)
The Time Rift of 2100: How We lost the Future (Score:3)
The Time Rift of 2100: How We lost the Future --- and Gained the Past.
WE CAN ONLY BLAME OURSELVES for the Time Rift. From discrete logic to main boards to chipsets to picoboards to nanite molecular clusters, we had machines re-drawing the same machines on smaller scale until they were like dust and pebbles, and yet, everything worked pretty well most of the time.
THE DISTINCTION between software and hardware had merged, workable modules open sourced and refined with a really clever interconnection scheme. Somewhere along the line we left hardware design from 'scratch' --- and software design to the 'code' level --- behind. Things were no longer constructed for purpose. Software was no longer compiled. We began to plug and play and clone and shim.
IT WAS HUMANS, amateur enthusiasts even, that first cloned and shimmed small machines into other machines of similar more refined purpose, and they did it with the same techniques we had used to construct analog circuits: locking together this way, and securing with that, test and done. There was an art to it. Where one had once meshed APIs together in the synchronous communications realm, now it was a matter of finding the proper angle and orientation of these smart pebbles, based on their markings and unique shapes. There was a flair to it, and some of this art was as much judged by its appearance as by function.
BUT SOON WE GREW WEARY of that, and trained our machines to clone, shim and assemble these smaller machines. It was like some cyborg Tetris game where your challenge was to fit the pieces together as they fell from the sky. And the sky was full of pieces. Anything was possible if your reach was long and you gazed far enough, to grasp the perfect puzzle-piece.
A FEW RESPONSIBLE ENGINEERS of the era took the time to publish diagnostic procedures with which one could fix these amalgamations, should one have the patience to pull them apart to do so, like the SAMS Photofacts of old. Every piece had its own direct interface for configuration and in theory at least, one could fix problems or reconfigure the pieces by simply talking to them directly. They documented these diagnostic and configuration interfaces, often cribbed from the documents of other engineers, which were scarcely ever used now, probing them to discover the more primal pieces within to gather documentation on those too.
BUT IT WAS THANKLESS to do so, and these engineers found themselves out of work or forcefully retired. Their productivity paled beside younger geniuses who were simple hunter-gatherers, whose cleverness in assembling working prototypes was deft and swift. From concept to bubble-wrap, technology companies had little interest in deep documentation. It was seen as a fetish. The thing works! Clone it and done. These hastily made things flooded the market and soon replaced other well-documented things. At times something failed and its inventors could not say why; they just assembled a new one or went bankrupt.
IN A SAD IRONY as to the supposed superiority of digital over analog, this whole profession of digitally-stored 'source' documentation began to fade and was finally lost. It had become dusty, and the unlooked-for documents of previous eras were first flagged and moved to lukewarm storage. It was a circular process, where the world's centralized search indices would be culled to remove pointers to things that were seldom accessed. Then a separate clean-up where the fact that something was not in the index alone determined that it was purgeable. The process was completely automated of course, so no human was on hand to mourn the passing of material that had been the proud product of entire careers. It simply faded.
THEN SOMETHING TOOK THE INTERNET BY STORM, it was some silly but popular Game with a perversely intricate (and ultimately useless) information store. Within the space of six months index culling and auto-purge had assigned more than a third of all storage to the Game. Only as the Game itself faded
Re: (Score:2)
OH --- AND BEFORE YOU GO --- do please look over these necklaces of fine silicon jewels and take one as your own, or for your sweetie. You see they are actually little computers, or 'chips', as ran the great society of old. I have filed off the covering so you can see the tiny chip, which shines in the light. See here! Only a copper or two for each, and if you look me in the eye and promise you will strive to better your mind and help re-build this world, I'll part with it for a shake of the hand.
Did you kn
Re:Internet of Bullshit (Score:4, Insightful)
Dice.com.
Re: It supports it just fine, article is BS (Score:1)
BS (Score:1)
Perhaps GP should read up on IEEE 1588-2008 [wikipedia.org] which is also known as the "Precision Time Protocol".
Re: (Score:2)
And this works fine even if the time differences are 'positive' or 'negative'?
It's when LocalTime appears in the future that the fun begins. Snap LocalTime back a few seconds to sync with CurrentbaseTime, feh, probably livable. But make it more than an hour, and do you risk having all of those log entries either being reset to some arbitrary time, or do they have to disappear? And my password change? And the front door alarm entry? Did that get re-timed, or deleted, or marked as suspicious?
We are going
Re: It supports it just fine, article is BS (Score:1)
The log problem you're talking about isn't an issue if you use a robust logging system like systemd's journal. Binary logs are more resilient to time changes than text logs are.
Re: (Score:2)
Re: (Score:2)
With luck, good components, and good climate control, you can usually manage to keep an internal LAN within about 1/5th of a millisecond. Maybe 1/10th if everything is well behaved.
Re: (Score:2)
Slow/speed time rather than set it (Score:3)
It's when LocalTime appears in the future that the fun begins. Snap LocalTime back a few seconds to sync with CurrentbaseTime, feh, probably livable. But make it more than an hour, and do you risk having all of those log entries either being reset to some arbitrary time, or do they have to disappear?
I once worked on the kernel of a system that had to deal with time updates very carefully to avoid screwing up various user land apps whose programmers never considered that a time delta could be negative. Rather than snap the current reported system time to the correct time I would slow down or speed up the progress of time until reported and actual got sufficiently close.
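The slew-rather-than-step approach the parent describes can be sketched in a few lines. This is a hypothetical simulation, not any real kernel's code; `MAX_SLEW` is an assumed tuning constant:

```python
# Hypothetical sketch of clock slewing: instead of stepping the reported
# time to the reference time, nudge it by a bounded amount each tick, so
# userland never sees time jump forward or run backwards.

MAX_SLEW = 0.005  # assumed cap: at most 5 ms of correction per 1 s tick

def slew(reported, reference, tick=1.0):
    """Advance `reported` by one tick plus a bounded correction toward `reference`."""
    offset = reference - reported
    # Clamp the correction so the reported clock never jumps.
    correction = max(-MAX_SLEW * tick, min(MAX_SLEW * tick, offset))
    return reported + tick + correction

# Simulate a clock that starts 20 ms ahead of the reference converging:
reported, reference = 100.020, 100.000
for _ in range(10):
    reported = slew(reported, reference)
    reference += 1.0
```

With a 5 ms/s cap, the 20 ms error is worked off over four ticks instead of appearing to userland as a backwards jump. Real kernels expose knobs for this (e.g. `adjtime`/`adjtimex` on Unix-like systems), but the clamping idea is the same.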
Slashdot summary is confused (surprise!) (Score:2)
Article TL;DR version: a cluster of microcontrollers (Raspberry Pis) does not a real-time operating system (RTOS) make.
The article is about deadline-based process timing in a dispersed computing cluster. It has nothing at all to do with "network time", which means keeping clocks in sync.
Re: (Score:3)
Delay = LocalTime - SystemTime - TimeDifference
TimeDifference = LocalTime - SystemTime - Delay
Now = LocalTime - TimeDifference
There, corrected it for you. Now you try to figure this problem out.
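The algebra above is circular: TimeDifference and Delay are each defined in terms of the other, so one round of measurements can't solve for both. The way NTP actually breaks the circularity is with four timestamps per exchange. A minimal sketch of the standard formulas, under NTP's assumption that the network path is symmetric:

```python
# Standard NTP-style offset/delay from four timestamps, assuming the
# one-way network delay is the same in both directions:
#   t1: client sends request      (client clock)
#   t2: server receives request   (server clock)
#   t3: server sends reply        (server clock)
#   t4: client receives reply     (client clock)

def ntp_offset_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) + (t3 - t4)) / 2  # client clock error relative to server
    delay = (t4 - t1) - (t3 - t2)         # round trip, minus server processing time
    return offset, delay

# Client clock 0.5 s behind the server, 0.1 s one-way delay each direction,
# 0.1 s of server processing:
offset, delay = ntp_offset_delay(10.0, 10.6, 10.7, 10.3)
```

Here `offset` comes out to 0.5 and `delay` to 0.2, matching the scenario. If the path is asymmetric, the asymmetry shows up directly as offset error, which is one reason the NIST report wants something better than commodity NTP for safety-critical uses.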
Re: Read the PDF. (Score:4, Informative)
It covers a wide range of timing-related topics and is information-dense. I found no marketing BS in this paper at all.
Re: (Score:2)
Isn't there a radio signal transmitted in most modern countries specifically FOR synchronising times without a hardline or remote purposes?
There is WWV in the United States and DCF77 in Europe.
Re: (Score:2)
These radio beacons aren't very reliable, though. A little interference from nearby electronics can be enough to lose the signal.
Re: (Score:2)
Re: (Score:2)
The easiest time source nowadays is the almost ubiquitous cell phone system. You don't even need a paid data service to get the time from the local cellsite.
Re: (Score:2)
You appear to be confusing Republicans with Democrats. Though, to be fair, it's pretty easy to do these days.
Re: (Score:3)
There's also GPS, for which receivers are very cheap and which provides very accurate time.
This article is nonsense. Assuming IoT ever becomes an actual thing, the vast majority of devices won't need any better than the "good enough" that NTP provides. Those that do will probably manage their own time using accurate clocks and GPS.
Time synchronization rarely matters. Usually you need an accurate clock (i.e., one that ticks at exactly 100 Hz) far more than you need your time to be within 0.100 ms of someone else's.
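The distinction the parent is drawing, rate accuracy versus agreement with someone else's clock, is why most runtimes expose a monotonic clock alongside the wall clock. A short sketch using Python's standard `time` module:

```python
import time

# The wall clock (time.time) can be stepped backwards by NTP or an admin,
# so it is the wrong tool for measuring durations. The monotonic clock
# never goes backwards, which is what interval measurement needs, even
# though it says nothing about "what time it is" anywhere else.

start = time.monotonic()
time.sleep(0.01)                      # stand-in for some work being timed
elapsed = time.monotonic() - start    # guaranteed non-negative
```

A device that only needs to sample a sensor every 10 ms can run entirely off a clock like this; it only needs synchronized time when its data has to be correlated with someone else's.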
Re: (Score:2)
There are many broadcasts with time embedded in them. They are not exactly secure, as authenticating the time is far harder than receiving it.
For IoT you really have two times. The local time has to be taken on faith to be right (served from a local trusted source); this is your clock display and so on, and it keeps the complexity of the ever-changing timezone rules out of IoT devices. Then there is the Unix clock; few IoT devices should need that level of accuracy or complexity, and when they do they s
Re: (Score:2)
Don't forget forcing mandatory upgrades, removing features and charging you to add them back, then announcing that they're moving to a subscription model, and if you don't pay them $500 a month, your house will no longer work.
I remember reading an SF story once where everything was 'smart' and people even had to put a quarter in the lock to get in and out of their house; that's the future the rentier corporations want to see.
Re: (Score:2)
If you don't want your utilities to spy on how much of their resources you are using, then disconnect from the grid and the pipes.
Re: (Score:2)
And, if you don't buy them, the government will mandate them. Smart TVs will allow them to immediately send police or an ambulance to your house when something bad happens, and smart thermostats will allow them to control power demand for the most efficient energy usage.
Re: (Score:2)
Time is a big issue here with an RTOS: not just scheduling, but synchronization with other devices. The devices don't have the same clocks, so there often needs to be some synchronization when they talk to each other, especially on networks. All of the wireless protocols we have depend upon accurate timing so that each node knows which channel to be on at which time. The tighter and more accurate you can get this time, the better your network performance is, and as a side effect the more power you sa
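The "which channel at which time" scheduling described above is how TSCH-style hopping in 802.15.4 networks works: nodes that agree on the current slot number (via time sync) and share a hopping sequence land on the same channel without ever exchanging a schedule. A hypothetical sketch; the sequence values here are made up for illustration:

```python
# Hypothetical TSCH-style channel hopping. Synchronized nodes index a
# shared hopping sequence by slot number plus a per-link channel offset,
# so agreement on time implies agreement on channel.

HOPPING_SEQUENCE = [11, 15, 20, 25, 26, 16, 12, 22]  # illustrative channels

def channel_for_slot(slot_number, channel_offset=0):
    return HOPPING_SEQUENCE[(slot_number + channel_offset) % len(HOPPING_SEQUENCE)]

# Two nodes whose clocks agree on "slot 42" pick the same channel:
a = channel_for_slot(42)
b = channel_for_slot(42)
```

This is also where the power savings come from: a node that trusts its clock can sleep until exactly its slot, while a node with sloppy time must wake early and listen longer as a guard band.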