New York City Has a Y2K-Like Problem, and It Doesn't Want You To Know About It (nytimes.com)

On April 6, something known as the GPS rollover, a cousin to the dreaded Y2K bug, mostly came and went, as businesses and government agencies around the world heeded warnings and made software or hardware updates in advance. But in New York, something went wrong -- and city officials seem not to want anyone to know. [Editor's note: the link may be paywalled; alternative source] New submitter RAYinNYC shares a report: At 7:59 p.m. E.D.T. on Saturday, the New York City Wireless Network, or NYCWiN, went dark, waylaying numerous city tasks and functions, including the collection and transmission of information from some Police Department license plate readers. The shutdown also interrupted the ability of the Department of Transportation to program traffic lights, and prevented agencies such as the sanitation and parks departments from staying connected with far-flung offices and work sites. The culprit was a long-anticipated calendar reset of the Global Positioning System, whose signals are used by devices and computer networks around the world. There has been no public disclosure that NYCWiN, a $500 million network built for the city by Northrop Grumman, was offline and remains so, even as workers are trying to restore it.

City officials tried to play down the shutdown when first asked about it on Monday, speaking of it as if it were a routine maintenance issue. "The city is in the process of upgrading some components of our private wireless network," Stephanie Raphael, a spokeswoman for the Department of Information Technology and Telecommunications, said in an email on Monday. She referred to the glitch as a "brief software installation period." By Tuesday, the agency acknowledged the network shutdown, but said in an emailed statement that "no critical public safety systems are affected." Ms. Raphael admitted that technicians have been unable to get the network back up and running, adding, "We're working overtime to update the network and bring all of it back online." The problem has raised questions about whether the city had taken appropriate measures to prepare the network for the GPS rollover.
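For readers who don't know the underlying bug: the legacy GPS navigation message carries the week number in a 10-bit field, so it wraps from 1023 back to 0 every 1024 weeks, roughly every 19.7 years. A receiver can only turn that truncated value back into a real date by pivoting on a date it already knows, typically one baked in when its firmware was built. A minimal Python sketch of that pivot trick (the function name and pivot choice are illustrative, not taken from the article or from NYCWiN's firmware):

    from datetime import datetime, timedelta, timezone

    GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)  # start of GPS week 0
    ROLLOVER = 1024  # legacy nav message stores the week number in 10 bits

    def resolve_gps_week(week_mod_1024: int, pivot: datetime) -> datetime:
        # Map a 10-bit week number to the first matching absolute week
        # at or after the pivot date.
        pivot_week = (pivot - GPS_EPOCH).days // 7
        n = (pivot_week - week_mod_1024 + ROLLOVER - 1) // ROLLOVER
        return GPS_EPOCH + timedelta(weeks=week_mod_1024 + n * ROLLOVER)

    # Week 0 seen with an April 2019 pivot is really week 2048:
    print(resolve_gps_week(0, datetime(2019, 4, 1, tzinfo=timezone.utc)))
    # -> 2019-04-07 00:00:00+00:00

A device whose pivot is more than 1024 weeks stale maps every incoming week number to the previous cycle, which is why unpatched gear suddenly thinks it is 1999.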

  • by Zorro ( 15797 )

    NYC maintenance budget. You would assume they would swap out 5% of everything each year for a 20-year refresh cycle, but no.

    • >> NYCWiN, a $500 million network built for the city by Northrop Grumman

      For $500M you could have built and deployed the system ($40M?) and then put the rest ($460M) in a trust whose proceeds could have funded maintenance forever.
      • From what I read earlier last week, the issue was affecting an emergency-response network that the city pays Northrop Grumman $40 million a year to maintain.
        A single year of maintenance costs as much as building the whole thing should have cost.

      • Re: (Score:2, Insightful)

        by supremebob ( 574732 )

        How much of that goes to union bosses who get paid to stand around on the job site and look important? This is NYC we're talking about here.

      • Re: (Score:2, Troll)

        by thegarbz ( 1787294 )

        For $500M you could have built and deployed the system ($40M?)

        Making project and construction estimates while knowing neither the scope nor any of the requirements? I take it you spend your days posting on Slashdot because you're an unemployed project manager who ran your company into the ground?

        • He probably has no idea how little money $40 million (his proposed figure) actually is.

          Assuming a router or repeater costs $20 and a work hour costs $20 too, and making the optimistic assumption that a worker sets up 4 routers per hour, we have about $100 in costs per hour (including the routers). So 40,000,000 / 100 is 400,000 work hours and 1,600,000 routers, which covers a grid of roughly 1,260 x 1,260 routers. With about 100m distance from router to router, that would be a ~126km square, something like 78 miles x 78 miles (a quick sanity check of these numbers follows below).

          If a worker is real
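
          A quick self-contained check of the estimate above, in Python (the $20 unit costs, 4 routers per hour, and 100 m spacing are the parent comment's assumptions, not real NYCWiN figures):

              budget = 40_000_000     # dollars
              cost_per_hour = 100     # $20 labor + 4 routers at $20 each
              routers_per_hour = 4
              spacing_m = 100         # assumed router-to-router distance

              hours = budget / cost_per_hour        # worker-hours the budget buys
              routers = hours * routers_per_hour    # routers installed in that time
              side = routers ** 0.5                 # routers per side of a square grid
              print(f"{hours:,.0f} hours, {routers:,.0f} routers, "
                    f"~{side * spacing_m / 1000:.0f} km per side")
              # -> 400,000 hours, 1,600,000 routers, ~126 km per side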

          • I would assume a single router costs several hundred dollars, and that core infrastructure runs into the millions. I was part of a project that installed a wireless multipurpose voice/data system at a single plant. That ran into the $40m mark.

        • I really do enjoy it when I get modded Troll by people who have no idea. $40 million? Ha, we spent more than that on a wireless project at a large chemical plant. If I had to give a thumb-in-the-air estimate, I would have come in at more than $500 million for city-wide multipurpose wireless infrastructure.

      • I was involved with a similar project in a different city, and just the infrastructure upgrades to the shelters and towers ran about $20 million per site for about 20 sites. None of them had originally been designed to sufficiently robust criteria. The networking bits were easily another $10 million per site, from what I understood.

      • That would require planning and brains. The antithesis of Government / Politics.

    • I doubt it is a hardware issue. I am more inclined to think it is software that is too tightly integrated, and that was integrated as one mass with no ability to restart pieces independently.

      If A depends on B, B depends on C, and C depends on A, then how do you bring the system up at A, with all those dependencies? (The sketch below shows why no startup order exists.)

      Systems need to be more loosely coupled.
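
      To make the deadlock concrete (hypothetical service names, not NYCWiN's real components), here is a standard depth-first startup-order computation in Python; with a cycle in the graph there is simply no valid order to return:

          # Hypothetical service graph: each service lists what must be up first.
          deps = {"A": ["C"], "B": ["A"], "C": ["B"]}

          def startup_order(deps):
              # Depth-first topological sort; raises if a dependency cycle
              # makes a clean restart order impossible.
              order, done, visiting = [], set(), set()

              def visit(node):
                  if node in done:
                      return
                  if node in visiting:
                      raise RuntimeError(f"dependency cycle through {node!r}")
                  visiting.add(node)
                  for dep in deps.get(node, []):
                      visit(dep)
                  visiting.discard(node)
                  done.add(node)
                  order.append(node)

              for node in deps:
                  visit(node)
              return order

          startup_order(deps)  # raises: dependency cycle through 'A'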

  • Y2K-like? Possibly. (Score:5, Informative)

    by The MAZZTer ( 911996 ) <megazzt.gmail@com> on Tuesday April 16, 2019 @12:45PM (#58445292) Homepage

    Maybe it went something like this?

    Step 1: Identify huge date-based problem (Y2K).
    Step 2: Succeed in convincing management types that it's a problem that should be dealt with before it becomes serious.
    Step 3: Fix the problem ahead of time.
    Step 4: Nothing serious happens, because the problems were fixed ahead of time.
    Step 5: Identify huge date-based problem (GPS rollover).
    Step 6: Fail to convince management types that it's a huge problem. They spent a lot of money fixing Y2K and it didn't cause any problems, so why would this?
    Step 7: Everything goes offline because the problem wasn't fixed.
    Step 8: Management has no idea what happened. Y2K wasn't this bad!
    • by Anonymous Coward

      This is the same reason programmers often deliberately inject bugs into systems. If everything is working hunky dory then management will assume they don't need their techs and fire them all. Of course, the only reason everything is working so well is because the techs are diligently and proactively dealing with failures before they can become dumpster fires.

      End result: Dumpster fire.

      The problem now is that they just burnt all their staff, so there's no one left to fix it. Hire swarms of interns and turn the dump

      • Dude, I've been earning my living programming for over 35 years. I have never ever, not even once, encountered an instance of a programmer deliberately injecting a bug into a system. You're either speaking out of your donkey, or you know some of the most worthless excuses for programmers in existence.
        • by cfa22 ( 1594513 )
          Ran out of mod points before I found this, but thank you for giving me my new favorite expression. I will accuse many of "speaking out of their donkeys" in the coming days and weeks.
        • Bugs deliberately created, no. Bugs accidentally created, identified, and knowingly not fixed, as a result of time and/or budget pressure, with the idea that "we'll fix it later if we have time"? All the time. I've done it myself. I hate it, and I do feel it my duty to warn management of possible/probable consequences. But they rarely listen. Management in my experience tends to not understand the impact of technical debt, or why spending a little on quality up front pays tremendous dividends over the
    • by sjames ( 1099 )

      Step 5 happened at least 10 years before the equipment was initially installed. Arguably, it happened before the very first GPS satellite was launched.

  • I don't see how they are supporting their claim that the City is trying to keep people from knowing about this. Just because the government isn't jumping up and down declaring "we failed!" doesn't mean they are actively trying to keep people from reaching that conclusion.
    • by Anonymous Coward

      To be fair, the government and cops expect full honesty and compliance when they are asking questions of you, but they want to hide behind "everything is ok, just go away while we fix it" when you ask them. This isn't a private corporation that has a reasonable expectation and rationale for dodging media inquiries. This is a public body that should be honest about the situation, or at least not actively LIE about the situation. This is not and never was "an upgrade of the wireless network" unless upgrade ha

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      No city officials informed anyone on the NY City Council about the network failure and apparent lack of preparation for the rollover. Some council members only became aware of it when the Times called to ask for comment on the situation.

      And then there's this:

      Laura Anglin, the deputy mayor for operations who is responsible for the information technology agency, refused to answer questions about it on Wednesday afternoon as she entered City Hall.

      Asked if the city had taken the necessary steps to prepare for the GPS rollover, she said, “Talk to the press office.”

      Not exactly forthcoming from the city official who's most likely to have the most pertinent information about the subject.

      • by Anonymous Coward

        It's the directed government answer to all media inquiries. And let's be honest: getting ambushed on the way into a building is a hit job, not journalism.

      • No city officials informed anyone on the NY City Council about the network failure and apparent lack of preparation for the rollover. Some council members only became aware of it when the Times called to ask for comment on the situation.

        Let's put this in the context we use for other segments of our government and see if we're being reasonable here.

        If this was the federal government - especially the Trump administration - we would say something like "it's too complex, and the administration was too stupid to understand it" and we'd move on. Yet because it is the government of the largest city in the US, we expect for some reason that they will have a deeper understanding of technical matters that we would place "beyond the pay grade" of

  • Questions raised (Score:4, Insightful)

    by genfail ( 777943 ) on Tuesday April 16, 2019 @01:04PM (#58445394)
    "The problem has raised questions about whether the city had taken appropriate measures to prepare the network for the GPS rollover." I would say it raised answers not questions. The question, did NYC prepare for the GPS rollover, was answered a resounding and emphatic NO, they did not even try to prepare.
    • Re:Questions raised (Score:4, Interesting)

      by spacepimp ( 664856 ) on Tuesday April 16, 2019 @02:02PM (#58445688)

      They didn't have to prepare. The preparation was paying Northrop Grumman $40 million a year to maintain the network and be prepared on their behalf.

      • by jezwel ( 2451108 )
        Government run entity runs something and messes up = government is incompetent

        Government outsources to private enterprise and they mess up = government contract management is incompetent.

        Somewhere in there it needs to be realised that people mess up, regardless of whether they work for a public or private organisation.

        Get rid of the blame game and work out a non-biased system that determines whether the infrastructure or service should be government run or not, and set them up as required.

  • by pintpusher ( 854001 ) on Tuesday April 16, 2019 @01:17PM (#58445446) Journal

    > waylaying [...] the collection and transmission of information from some Police Department license plate readers

    good.

  • Clearly, The Machine is battling Samaritan again.
  • Uh, this is the second GPS date rollover since its inception in 1980. The first was in 1999, after 1024 weeks of operation.

    There is no excuse for any device released or updated after 1999 not to account for this GPS glitch.
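
    The timeline is easy to verify with nothing but Python's standard library (the dates fall at the UTC midnight boundary; GPS time itself runs a few leap seconds ahead):

        from datetime import datetime, timedelta, timezone

        epoch = datetime(1980, 1, 6, tzinfo=timezone.utc)  # GPS week 0 began here
        for n in (1, 2, 3):
            print(n, (epoch + timedelta(weeks=1024 * n)).date())
        # 1 1999-08-22  first rollover
        # 2 2019-04-07  the rollover that took down NYCWiN (7:59 p.m. EDT Apr 6)
        # 3 2038-11-21  next rollover for legacy 10-bit receivers

    Modernized GPS signals widen the week field to 13 bits, pushing the wrap out to about 157 years, but receivers still using the legacy message will keep hitting this every 19.7 years.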

  • by JustAnotherOldGuy ( 4145623 ) on Tuesday April 16, 2019 @07:25PM (#58446814) Journal

    "The problem has raised questions about whether the city had taken appropriate measures to prepare the network for the GPS rollover."

    I'm no rocket scientist but seeing as how they're having massive problems due to the rollover I'd have to say no, they didn't.

  • This gives me lots of faith that we will have 0 problems in 2038 with the epoch rollover.
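
    The 2038 deadline is just as easy to compute: a signed 32-bit time_t counts seconds from the 1970 Unix epoch and tops out at 2**31 - 1. A quick check in Python:

        from datetime import datetime, timedelta, timezone

        unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
        print(unix_epoch + timedelta(seconds=2**31 - 1))
        # 2038-01-19 03:14:07+00:00 -- one second later, a signed 32-bit
        # time_t wraps negative and the clock reads 1901-12-13.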
