
2016 Bug Hits Text Messages, Payment Processing

An anonymous reader writes "It seems some systems are suffering from a Y2K16 bug. When 2009 ticked over to 2010, some Australian EFTPOS machines skipped to the year 2016. Coincidentally, some Windows Mobile users are also having issues with their New Year SMSes coming from 2016. What function could cause this kind of error?"


  • by Gorobei ( 127755 ) on Sunday January 03, 2010 @10:10AM (#30631410)

    No dial tone, no incoming calls.

    Had to reset the internal datetimes back to 2007, then they started working again. Nice job, Panasonic.

  • Re:Some kind of... (Score:3, Interesting)

    by kalidasa ( 577403 ) on Sunday January 03, 2010 @10:23AM (#30631480) Journal
    Do *you* often work with years as floats?
  • Re:Why (Score:3, Interesting)

    by TheLink ( 130905 ) on Sunday January 03, 2010 @10:46AM (#30631608) Journal
    Nah, it's more likely the same bunch that did Excel 2007... Anyone remember the infamous 77.1 * 850 = 100,000 Excel 2007 bug? :)
  • by Posting=!Working ( 197779 ) on Sunday January 03, 2010 @11:20AM (#30631798)

    Can someone tell me what the advantage of swapped nibble encoding is? Other than being annoying as fuck when you're trying to decode it?
    For those too lazy to read the link, swapped nibble encoding is
    "BCD code where nibbles within octet is swapped. E.g.: 0x31 Represents value of 13"

    So for the format YY MM DD HH MM SS TZ (time zone in 15-minute increments from GMT), instead of 10 01 03 10 11 43 24 for 2010 Jan 3, 10:11:43, time zone 24 (GMT+6), you get the identical data in the less readable form 01 10 30 01 11 34 42 (and now it can be confused with 2001 Oct 30, 01:11:34. Bonus!)

    It's just complete idiocy to me. Is there some reason you'd want the date/time stamp slightly harder to read?
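
    If you want to play with it, here's a minimal C sketch of the decode - field layout as in the YY MM DD HH MM SS TZ example above; the helper name is my own, and the real GSM spec also tucks a sign bit into the timezone octet, which this ignores:

        #include <stdio.h>

        /* Undo one swapped-nibble BCD octet: 0x31 -> 13. */
        static int swapped_bcd(unsigned char octet)
        {
            return (octet & 0x0F) * 10 + (octet >> 4);
        }

        int main(void)
        {
            /* 2010 Jan 3, 10:11:43, GMT+6, as it looks on the wire. */
            unsigned char scts[7] = { 0x01, 0x10, 0x30, 0x01, 0x11, 0x34, 0x42 };

            printf("20%02d-%02d-%02d %02d:%02d:%02d GMT%+g\n",
                   swapped_bcd(scts[0]), swapped_bcd(scts[1]), swapped_bcd(scts[2]),
                   swapped_bcd(scts[3]), swapped_bcd(scts[4]), swapped_bcd(scts[5]),
                   swapped_bcd(scts[6]) / 4.0);  /* TZ is in quarter hours */
            return 0;
        }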

  • by Anonymous Coward on Sunday January 03, 2010 @11:46AM (#30631940)

    It is 2016, not 2k16. Is there something cool about replacing a zero with a k now?
     
    Any EE will tell you that 2k16 means 2.16k or 2160. How does this garbage continue making it to the front page?

  • Re:Some kind of... (Score:3, Interesting)

    by mrslacker ( 1122161 ) on Sunday January 03, 2010 @11:49AM (#30631960)

    Well, certainly BCD confusion caused a similar problem for RISC OS. Machines have been skipping to 2012. More here:

    http://www.riscository.com/ [riscository.com]

  • by Chabil Ha' ( 875116 ) on Sunday January 03, 2010 @11:54AM (#30631986)

    Not to mention Y2K was a pun of sorts for shortening 2000 to three characters from four. Now we're taking more space than the original: Y2K16 vs. 2016. Those COBOL programmers would be rolling in their graves.

  • Re:Some kind of... (Score:3, Interesting)

    by Skapare ( 16644 ) on Sunday January 03, 2010 @01:48PM (#30632770) Homepage

    Someone obviously reverse engineered it. They saw 0x02 in 2002 and 0x03 in 2003 and assumed it was an 8-bit binary count of years since 2000. They should have read the specs instead and learned it was BCD (a leftover relic from the mainframe era that needed to die over a decade ago). OTOH, the spec designers should have used plain unsigned binary (which would have lasted until 2255 instead of 2099). Or better yet, don't break dates up into bytes and nybbles at all; leave them encoded as the number of nanoseconds since the epoch, times some scaling factor to fit less resolution into fewer bits as needed.
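
    The whole bug fits in a couple of lines. A hypothetical sketch of the two readings of the same year octet (plain BCD here, as in the 0x02/0x03 example, not the swapped SMS flavor; variable names mine):

        #include <stdio.h>

        int main(void)
        {
            unsigned char year = 0x10;  /* BCD for decimal 10, i.e. 2010 */

            int guessed  = 2000 + year;                              /* "8-bit binary": 2016 */
            int per_spec = 2000 + (year >> 4) * 10 + (year & 0x0F);  /* BCD: 2010 */

            printf("reverse engineer says %d, spec says %d\n", guessed, per_spec);
            return 0;
        }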

  • Re:Some kind of... (Score:3, Interesting)

    by Bigjeff5 ( 1143585 ) on Sunday January 03, 2010 @02:01PM (#30632866)

    Just nit-picking here, but the decimal number 10 and the BCD number 0001 0000 would be incorrectly written in hexadecimal as 0x10. The correct conversion, had the programmer kept better track of the format he was using, is 0x0A. It could have been avoided with a practice like converting everything to binary before converting to hex, even if you think it's already binary. Really, though, he probably just needed better comments in the code.

    It's basically just a result of not keeping track of your units - it's the same kind of error that destroyed a Mars probe a few years back. One team was talking pound-seconds, another team newton-seconds, and nobody bothered to write down which units the numbers were in. The trajectory was way off, and the probe burned up in the Martian atmosphere instead of easing into orbit.

    In this case, one guy is talking BCD, another guy assumed binary, and it was no big deal until 2010 rolled around and 10 suddenly became 16.

  • by Megane ( 129182 ) on Sunday January 03, 2010 @02:42PM (#30633284)

    What? I looked at someone else's link showing the SMS data format, and it is quite clear and consistent. Dates are six nibbles of BCD, with the low nibble coming first in both the date and the phone number. The year is in exactly the same format as the month and day: two BCD digits. It's not the spec's fault that so many programmers were morons and couldn't manage to process the year (before adding the century) in exactly the same way as the month and day.

    As for "lessons learned from Y2K", this is not a situation where you are keeping old data for years and years, nor are you doing date arithmetic or sorting of 50+ years range, not even ten years. Nobody's birthdate will be in the timestamp of an SMS message, nor will the expiration date of their credit card. As a timestamp, the century is explicitly implied* by context, the context being that of the calendar that's hanging on the wall of your cubicle.

    If someone were to make a database with years of SMS messages and not include the century as messages were added to the database, then yes, that would be a problem. But it would not be a problem with the SMS message format. When the year 2100 rolls around, and assuming we're even still using SMS, the worst that would happen is "hundred year old" text messages. At least then the low two digits would still be right, unlike the current "2016" stupidity.

    *that's an awesome combination of words
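
    To put the asterisked point in code: "century implied by context" is just a window around today's date. A hypothetical sketch (function name mine), assuming whichever candidate year lands closest to the clock on the wall wins:

        #include <stdio.h>

        /* Pick the century by context: the candidate closest to the
           current year.  A live SMS timestamp is not 90 years old. */
        static int full_year(int yy, int now)
        {
            int base = now - now % 100;  /* e.g. 2000 */
            int cand[3] = { base - 100 + yy, base + yy, base + 100 + yy };
            int best = cand[0];
            for (int i = 1; i < 3; i++) {
                int db = best - now    < 0 ? now - best    : best - now;
                int dc = cand[i] - now < 0 ? now - cand[i] : cand[i] - now;
                if (dc < db) best = cand[i];
            }
            return best;
        }

        int main(void)
        {
            printf("%d %d\n", full_year(10, 2010), full_year(99, 2003));  /* 2010 1999 */
            return 0;
        }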

  • Re:Some kind of... (Score:5, Interesting)

    by Mr Z ( 6791 ) on Sunday January 03, 2010 @02:47PM (#30633324) Homepage Journal

    I wouldn't say it's incorrect. It's quite common to represent binary strings as hexadecimal strings regardless of the underlying format. After all, what units would you apply to an x86 opcode? What about a portion of an H.264 bit stream? Or heck, even a floating point number? 0x3F800000 is the hexadecimal representation of the IEEE 754 single precision floating point number 1.0. (And if you don't think people write floating point numbers in hex, think [gnu.org] again [qc.edu].)

    Now the integer 10 encoded in standard binary format would be 1010, which written in hexadecimal would be A (or 0x0A in C syntax). It also happens to be the same as if you directly went from base 10 to base 16 and ended up with 0x0A. If the decimal number were encoded as BCD, though, the resulting binary string would be written 00010000 (or 0001 0000 if you so choose--the spaces are insignificant!), and that binary string is equivalent to the hexadecimal string 0x10. Note that "hexadecimal string" does not imply base-16 number. It's just a shorter representation than the binary string. It is not equivalent to the hexadecimal number 0x10. When I said "written in hexadecimal", I meant "when written as a hexadecimal string."

    Look at it this way: If I were trying to hex-dump a file full of BCD values and the file had nothing but the value "10" (decimal) over and over in each byte, I would see 0x10 in every byte, not 0x0A. Try it. Go write a 6502 or x86 program that uses that processor's BCD mode, and then do a hex dump of memory. Or better yet, go get a BCD number, and pass it to the "%x" format specifier in your favorite C compiler (or other language that uses C's format specifiers).

    In this case, one guy is talking BCD, another guy assumed binary, and it was no big deal until 2010 rolled around and 10 suddenly became 16.

    I think you meant to say "one guy is talking BCD, another guy assumed standard binary integer." The "B" in BCD is "Binary", after all... Every number stored in the computer is in one binary format or another. Saying "assumed binary" is roughly the same as saying "assumed the number was stored in a computer." You have to say which format.
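
    And the "%x" experiment, runnable as-is (variable names mine):

        #include <stdio.h>

        int main(void)
        {
            unsigned char bcd_ten = 0x10;  /* decimal 10 encoded as BCD     */
            unsigned char int_ten = 10;    /* decimal 10 as a plain integer */

            /* Prints "10 a": the BCD byte dumps as the hexadecimal
               *string* "10", the ordinary integer as "a". */
            printf("%x %x\n", bcd_ten, int_ten);
            return 0;
        }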

