Epoch Time Bug Causes Facebook To Congratulate Users On 46 Years of Friendship (gizmodo.com)

An anonymous reader writes: A bunch of Facebook users received mysterious messages yesterday congratulating them on 46 years of being friends with somebody on Facebook. An astute observer may note that Facebook hasn't been around for 46 years. An even more astute observer might note that 46 years ago yesterday would be 12/31/1969, easily recognizable as value '0' in the Unix epoch with a time-zone adjustment. A Microsoft engineer posits that the messages were sent because of how Facebook implemented its congratulatory messages. Many people were already Facebook friends when the feature was rolled out, and instead of finding or estimating the date they became friends, Facebook simply set that database value to '0'. When the script fired to send those messages, it grabbed that value expecting a time, and interpreted the 0 accordingly. "The developer who wrote the 'friends with since' memories algorithm should have added a case WHERE friendsWithSinceDate != '0' or something along those lines."
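For readers who want to see the failure mode concretely, here is a minimal MySQL sketch (the table and column names are hypothetical, not Facebook's actual schema) of how a stored 0 renders as December 31, 1969 once a US time-zone offset is applied, along with the kind of guard the quoted engineer suggests:

    -- Epoch value 0 is 1970-01-01 00:00:00 UTC; shifted to US Pacific it becomes
    -- the evening of 1969-12-31, exactly 46 years before 12/31/2015.
    SET time_zone = '-08:00';
    SELECT FROM_UNIXTIME(0);   -- 1969-12-31 16:00:00

    -- A guard along the lines the summary suggests (hypothetical schema):
    SELECT user_a, user_b
    FROM friendship
    WHERE friends_since != 0;  -- skip rows where 0 was stored to mean "unknown"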
  • Oblig (Score:5, Funny)

    by American AC in Paris ( 230456 ) on Friday January 01, 2016 @02:42PM (#51222335) Homepage
    Epoch fail.
    • At least Facebook's logon system works.

  • by Anonymous Coward

    With so many women graduating from college, early STEM education, and experienced workers training their H-1B replacements, how can there be any coding errors?

  • Not an Epoch bug (Score:5, Insightful)

    by Predius ( 560344 ) <.josh.coombs. .at. .gmail.com.> on Friday January 01, 2016 @02:49PM (#51222381)

    It's not an Epoch time bug, it's a lazy programmer bug. If you're going to use X time system, do so intelligently. If you're going to use Y time system... etc.

    • It's not an Epoch time bug, it's a lazy programmer bug....

      Exactly correct. At least the Microsoft Engineer put the blame in the correct place when he wrote, "The developer who wrote the 'friends with since' memories algorithm should have ..."

    • Or strengthen your type system. If you mean false, don't write a zero literal into your code.
    • But 0 is a real and valid time. It's a common issue when people borrow time systems without understanding them. Unix time was invented to specify file dates only, but it got repurposed to do things it was not designed for. E.g., you cannot specify necessary medical-history dates using 32-bit Unix time. People keep reusing Unix time because they don't know how to adopt a different time library. Expanding to 64 bits is OK, but keeping 1/1/1970 as the epoch is a bit strange (pick maybe year 0, or 20

      • by Kjella ( 173770 )

        And I don't think the test versus 0 is very good in this case. There are indeed people who met and became friends on 1/1/1970.

        In the real world, yes. On Facebook, no.

  • by DahGhostfacedFiddlah ( 470393 ) on Friday January 01, 2016 @02:53PM (#51222407)

    I don't understand people who prefer to use magic numbers over NULL, but there appear to be many.

    • I don't understand people who prefer to use magic numbers over NULL, but there appear to be many.

      Last I checked, NULL is just a magic number (usually zero).

      • NULL should never be zero. If you treat NULLs as zeros, it is just as bad as "default 0" and then using it in a date calculation.

        • Aww, that's cute, you're one of those spoiled kids who doesn't know how the computer works because you use high-level languages that abstract everything from you.

          To store 'NULL' you have to encode it SOMEHOW for it to be of any use to the processor, and that encoding could ALWAYS have another meaning. Processors have absolutely no concept of NULL, so NULL is effectively ZERO to any programmer who understands how computers actually work.

          Sure, in Java, .NET and many other high-level languages, the

            • Aww, that's cute, you're one of those spoiled kids who doesn't know how the computer works because you use high-level languages that abstract everything from you.

            To store 'NULL' you have to encode it SOMEHOW for it to be of any use to the processor, and that encoding could ALWAYS have another meaning.

            Who gives two shits about what the processor is doing? The processor isn't the interface or the logical representation actual people are managing.

            Whether databases use separate bit fields or simply reduce the published range of a datatype by one to make physical room for expressing 'NULL', who exactly cares? Why is it relevant at all?

            In systems which properly express NULL values NULL is NULL, NULL is not 0 ever.

            Processors have absolutely no concept of NULL, so NULL is effectively ZERO to any programmer who actually understands how computers actually work.

            Completely irrelevant.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        NULL is a special case in any SQL database. If you check whether something is equal to NULL, you will never get true; even NULL = NULL is not true (it evaluates to NULL, which a WHERE clause treats as false). That forces you to check for a NULL value explicitly using IS NULL, instead of accidentally bumping into correct-looking behavior and having things inexplicably break later on.

        And the fact is, this Facebook bug is probably caused by how most data connector libraries interpret a NULL timestamp in MySQL (IIRC, FB uses a heavily-modified variant of MySQL). They
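        To see that three-valued logic directly, a quick MySQL sketch; equality with NULL never yields true, and only IS NULL (or MySQL's null-safe <=> operator) treats NULL as testable:

            SELECT NULL = NULL;    -- NULL (not true, so a WHERE clause filters it out)
            SELECT 0 = NULL;       -- NULL
            SELECT NULL IS NULL;   -- 1 (true): the portable way to test for NULL
            SELECT NULL <=> NULL;  -- 1 (true): MySQL-specific null-safe equality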

        • [...] And also because NULL should be NULL, not a value that might be valid to other types of code.

          Blah, blah, blah, I don't care how you fancy it up, everything the computer does is dealing with numbers. NULL is just another binary number any way you slice it, and it will be a valid number in code if you're not careful keeping track of which data is NULL and which is the number used to represent NULL.

          If you disagree, please by all means tell how *you* represent NULL without using some combination of ones and zeros.

          • by Anonymous Coward

            Every participant in this thread, with the exception of you and Bitzstream, understands that the context of this discussion is NULL in a database. How the two of you consistently fail to grasp this concept is surely exhausting to all the other posters. Yes, we understand binary. That has nothing to do with a programmer at Facebook writing a buggy SQL query.


            % select username from users where brain = 0;
            username
            ------------
            (0 rows)

            % select username from users where brain IS NULL;
            username
            -----------

            • Sorry, your database, and its NULL entries, are nothing more than magic numbers. I'm discussing NULL in a database as it relates to the story, and you're discussing NULL in a database that isn't in the story, but pretending it is because that's the database you're familiar with.

        • You mean someone didn't do ISNULL() on the returned value and let the connector interpret ...

          That makes both the connector and the developer stupid :) The connector should have returned NULL or an error/exception.

          Just for reference, NULL behaves like that in any sane language, excluding C/C++ due to their low-level nature, though the compiler will effectively fix that flaw in the case of most processors where it can do so intelligently, since page 0 is almost universally 'protected' against reads AND writes b

      • by bsolar ( 1176767 )
        Check again. We're talking about a database: "the answer is unknown" is definitely not the same as "the answer is known and is 0". NULL exists as concept exactly to differentiate between the two. From the summary it's pretty clear the answer is not known, so it should have been represented with NULL.
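        As a sketch of that distinction (schema and values hypothetical), a nullable column lets the database itself distinguish "known to be 0" from "unknown":

            CREATE TABLE friendship (
                user_a        INT NOT NULL,
                user_b        INT NOT NULL,
                friends_since INT NULL  -- Unix timestamp; NULL means "date unknown"
            );
            INSERT INTO friendship VALUES (1, 2, 0);     -- known: friends since epoch 0
            INSERT INTO friendship VALUES (3, 4, NULL);  -- unknown: no date recorded

            SELECT * FROM friendship WHERE friends_since = 0;      -- only (1, 2)
            SELECT * FROM friendship WHERE friends_since IS NULL;  -- only (3, 4)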
        • So which number do you use to represent NULL in your database? (Hint: it's a binary number)

          • by bsolar ( 1176767 )

            Hint: the specific internal representation is absolutely irrelevant in the context of the discussion: the point is that whatever representation is used for NULL, it's a different representation from any other valid value. Yes, this means that e.g. if you have an in-DB nullable TINYINT (1-byte number), you have *more* information than is representable in a C or Java variable of type 'byte' (primitive).

            How you'll need to define your programming data model to accurately map all the information you read from the DB is a completely different issue and doesn't change the fact that the DB *does* provide a specific value which unambiguously represents the concept "information is unknown".

              • How you'll need to define your programming data model to accurately map all the information you read from the DB is a completely different issue and doesn't change the fact that the DB *does* provide a specific value which unambiguously represents the concept "information is unknown".

              1) That specific value is a number
              2) It's not always unambiguous, hence this story.

              • by bsolar ( 1176767 )

                1) That specific value is a number

                No, it's not. Don't take my word for it: play with SQL and realise that NULL doesn't behave like a number when used in operations and functions.

                2) It's not always unambiguous, hence this story.

                It's definitely unambiguous. Again, don't take my word for it: take e.g. a TINYINT and try to figure out which of its possible values is ambiguous with NULL (read: a value for which SELECT x = NULL results in True; hint: there is no such value).

                The story is *not* about NULL having an ambiguous representation: it's about the programmer *not* using NULL to represent the concept of "missing information" (which is exactly why it exists in SQL) and instead (ab)using a specific numerical value.
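                For anyone who wants to verify the claim, a quick MySQL experiment (table name hypothetical): none of TINYINT's 256 possible values ever compares equal to NULL, so NULL cannot collide with a valid value:

                    CREATE TABLE t (x TINYINT);      -- signed range: -128 .. 127
                    INSERT INTO t VALUES (-128), (0), (127), (NULL);
                    SELECT x FROM t WHERE x = NULL;  -- zero rows, whatever x holds
                    SELECT x FROM t WHERE x IS NULL; -- exactly the one NULL row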

                • 1) That specific value is a number

                  No, it's not. Don't take my word for it: play with SQL and realise that NULL doesn't behave like a number when used in operations and functions.

                  So if I take your SQL database, and read it with a C program that prints the file bit by bit as ones and zeros, at some point your magical non-numerical NULL entry will output something other than a number?

                  2) It's not always unambiguous, hence this story.

                  It's definitely unambiguous. Again, don't take my word for it: take e.g. a TINYINT and try to figure out which of its possible values is ambiguous with NULL (read: a value for which SELECT x = NULL results in True; hint: there is no such value).

                  The story is *not* about NULL having an ambiguous representation: it's about the programmer *not* using NULL to represent the concept of "missing information" (which is exactly why it exists in SQL) and instead (ab)using a specific numerical value.

                  Yeah, but if you read the database with a different program that has a different idea of which number means NULL then all your NULL entries are suddenly numbers. And maybe someone thought they'd be clever and store data more efficiently by using only 8 bits to store their TINYINT, then messed up their handling of NULL.

                  • Yeah, but if you read the database with a different program that has a different idea of which number means NULL then all your NULL entries are suddenly numbers.

                    Database access interfaces include management of the concept of NULL even if a suitable analogue of the concept is not directly supported by the underlying programming environment. Being lazy and/or using shitty APIs has predictable results, including committing totally preventable errors. NULL exists to prevent exactly this type of failure from ever occurring.

                    And maybe someone thought they'd be clever and store data more efficiently by using only 8 bits to store their TINYINT, then messed up their handling of NULL.

                    There always seems to be an infinite array of excuses for fucking up and then being surprised when hit with equally screwed up outcomes. "being clever" in

            • Null is probably stored as a flag, not a number. The flag says that the number value is ignored. It could be a high bit in a numeric value, making it out of range, and that's definitely not zero. But it's not a numeric value, it's a flag.

            How about you look at some open source databases or read Wikipedia or something?

            • So your computer uses trinary, one, zero, NULL? My computer would store a flag bit as a number, either one or zero.

              • by KGIII ( 973947 )

                Hmm... Would it store it as a 1 or 0, or would it store it as a 00001 or 00000? 'Cause the whole (the 1 or 0) might be represented in binary and thus take more than a single bit to represent, no?

      • by prefec2 ( 875483 )

        In C this is true: NULL is defined as 0. However, in SQL the semantics of NULL are "value not set".

        • Close.

          In C a true NULL is defined as a cast FROM integer 0 to a pointer. The difference in definition is that the reverse is not true: casting NULL TO an integer is implementation-defined.

          • That's weird, I didn't think NULL even had to be 0. I remember reading about some platform where NULL actually wasn't zero.
            • According to C11 in the section for pointers (6.something), null is defined in stddef.h as
              #define NULL ( (void *) 0)

              If some platform defines it otherwise then it goes against the C standard.

              That doesn't mean you're not right, it just means that the platform was not following the standard correctly.

              • oh, it was defined recently. Interesting.
                • No. Sorry I just looked at C11 as an example. The text is unchanged from C99.

                  ANSI C on the other hand is a bit more difficult.

                  Section 4.1.5 lists the standard defines in stddef.h and reads:
                  " The macros are
                  NULL
                  which expands to an implementation-defined null pointer constant;"

                  That is interesting because on one hand it says implementation-defined, but on the other hand section 3.2.2.3 on pointers states:
                  "An integral constant expression with the

                  • C99? I'm talking old [stackoverflow.com], bro
                    • Well if you want to talk pre-1989 standardisation then all bets are off. A lack of a standard by definition means people can do whatever the heck they want :-)

                    • One of my earliest memories as a professional programmer was telling another, more experienced coworker, "Don't use 0 instead of NULL! There will be problems if it changes in the future!"
                      He just smiled and said, "It won't."

                      From that I learned to recognize things that won't likely change in the future, in contrast to things that will.
                    • Those problems extend way beyond programming. We take all sorts of things for granted as something that won't change in the future :-) That's why it's so important to quote the year after a standard when you reference it. It's a wonderful lesson in arse-covering, because even the most obviously established of standards can change.

                    • It's a wonderful lesson of arsecovering because even the most obviously established of standards can change.

                      That's true, but I still feel perfectly comfortable using 0 as NULL.
                      Things that are more likely to change need more attention given to make them flexible.

                  • Reading the sections of the ANSI C standard you copied, it seems carefully written so that NULL can be defined to be any value on a given platform. However, the idiom of using (void*)0 to represent a null must be preserved. So you could have code like this that evaluates to true:

                    /* On a hypothetical platform where the null pointer's bit pattern is octal 07777: */
                    void *a = 0;      /* a holds the platform's null pointer representation */
                    int b = (int)a;   /* converting a pointer to int is implementation-defined */
                    if (b == 07777) {
                        printf("true\n");
                    }
                    • Certainly, and that could be true by all standards, because while void *a = 0; is defined in the standard, int b = (int)a; is implementation-specific.

                      All bets are off.

      • by truedfx ( 802492 )
        Are you perhaps thinking of C? Its meaning of NULL is nothing like SQL's.
    • by bondsbw ( 888959 )

      Agreed that using NULL is typically better than using magic numbers, but NULL isn't a perfect solution either. It has different semantics in different languages, and comparisons are often confusing and/or undefined.

      Use a flag instead.
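      A sketch of the flag approach (schema hypothetical): the timestamp column keeps a dummy value, and a separate boolean records whether that value is meaningful, so the congratulation query consults the flag rather than a sentinel:

          CREATE TABLE friendship (
              user_a              INT NOT NULL,
              user_b              INT NOT NULL,
              friends_since       INT NOT NULL DEFAULT 0,  -- ignored unless the flag is set
              friends_since_known BOOLEAN NOT NULL DEFAULT FALSE
          );

          -- Only rows whose date is actually known are eligible for the message:
          SELECT user_a, user_b FROM friendship WHERE friends_since_known;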

    • NULL has its own set of issues, like the fact that you can't index it: it's not zero, it's not nothing or blank, it's literally unknown.
    • There are a million ways this could have been coded 'correctly'. Please let us not start going through them all.
      • A million ways to code it correctly, but only a few architectural decisions that would prevent this particular bug. The decision to give a fake value to an unknown is wrong before the first line of code has been written.

    • NULL is intended for checking pointers, not for integers. In this case, 0 is correct.

      • by bsolar ( 1176767 )
        Checking pointers in a relational DB? We're talking about this NULL [wikipedia.org].
      • Zero is obviously not correct in this case. Null semantics vary by language. Even if writing this in C, you'd want some semantic that would let a database query answer "I don't know that" rather than zero.
    • NULL is 0 on many systems. If you're in an extremely high-level language and you don't care about how much space your database takes up, then you have types stored along with values. But if you're using a low-level language down below for speed purposes, then using NULL won't fix the problem.

  • WHERE friendsWithSinceDate != '0'

    Who would store a numeric timestamp as a varchar? Certainly storing it as a numeric value type would be more appropriate.
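    On the type question, a hedged sketch (column names hypothetical): a proper temporal type, nullable for the unknown case, avoids both the string comparison and the sentinel; notably, a MySQL TIMESTAMP cannot even hold epoch 0, since its range starts at 1970-01-01 00:00:01 UTC:

        CREATE TABLE friendship (
            user_a        INT NOT NULL,
            user_b        INT NOT NULL,
            friends_since TIMESTAMP NULL  -- a real temporal type, NULL when unknown
        );

        SELECT user_a, user_b
        FROM friendship
        WHERE friends_since IS NOT NULL;  -- no string compare, no magic '0'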

    • That and while is normally used for loops... 'IF' would have been more appropriate for his example. But since we all got his point, the example served its purpose.

      • That and while is normally used for loops

        Err, yes, but this is a SQL "WHERE", not a "while".

        I guess it's been a where since you did any SQL ;)

        • Yah; I guess my own language bias got in the way of reading comprehension. I can't think of a time I've ever written a 'where' statement, but in an SQL context that makes perfect sense.

  • Doesn't everyone who is asked for a birthday they don't want to enter use 1/1/1970 :-)

    • Has UNIX ever considered the poor people who actually were born on 19700101? And who have to prove it to every single damn webpage out there?

      • by dissy ( 172727 )

        Has UNIX ever considered the poor people who actually were born on 19700101? And who have to prove it to every single damn webpage out there?

        Unix time was (and occasionally still is) represented by a 32-bit signed integer, meaning both zero and negative values can be stored just fine.

        Only people who are 115 years old or more would have any problems on those old Unix systems.
        Specifically, dates prior to 1901-12-13.

        However, even that limitation was fixed roughly a decade ago when "time_t" was changed to a 64-bit signed integer.

        Negative values in 64 bits are supposed to cover a couple hundred billion years (or so I've been told, I don't have that many fin

  • There's a threeple only an NSL could love.

  • I had a couple people on FB wish me happy birthday today.

    But it's not actually my birthday. (Nor do I display my birthday on FB).
  • Party like it's 1999!

    Happy New Year!
