Networking IT

Internet Black Holes

An anonymous reader writes "Hubble is a system that operates continuously to find persistent Internet black holes as they occur. Hubble has operated continuously since September 17, 2007. During that time, it identified 881,090 black holes and reachability problems. In the most recent quarter-hourly round, completed at 04:40 PDT, 04/09/2008, Hubble issued 46,846 traceroutes to 1,815 prefixes it identified as likely to be experiencing problems (of 78,772 total prefixes monitored by the system). Of these, it found 195 prefixes to be unreachable from all its vantage points and 139 to be reachable from some vantage points and not others." No relationship to that other Hubble which also tries to find black holes ;)
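The three-way classification described above (unreachable from all vantage points, reachable from some but not others, or fine) is easy to illustrate. The sketch below is not Hubble's implementation; the function and the per-vantage-point probe results are hypothetical stand-ins for the traceroute data the system collects:

    # Hypothetical sketch (not Hubble's code): classify a monitored prefix
    # from the deepest responding hop that each vantage point saw.
    from ipaddress import ip_address, ip_network

    def classify_prefix(prefix, probe_results):
        """probe_results maps vantage point -> deepest responding hop (IP string), or None."""
        net = ip_network(prefix)
        reached = {vp: hop is not None and ip_address(hop) in net
                   for vp, hop in probe_results.items()}
        if not any(reached.values()):
            return "unreachable from all vantage points"
        if not all(reached.values()):
            return "reachable from some vantage points but not others"
        return "reachable from every vantage point"

    # Made-up example using documentation addresses (192.0.2.0/24, 203.0.113.0/24).
    print(classify_prefix("192.0.2.0/24", {
        "vp-seattle": "192.0.2.1",   # traceroute got inside the prefix
        "vp-berlin": "203.0.113.9",  # last response came from outside the prefix
        "vp-tokyo": None,            # no responses at all
    }))

Hubble's real pipeline works from BGP feeds and repeated traceroute rounds rather than a single probe per vantage point; the point here is only the split reported in the summary.
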
  • take note that (Score:5, Interesting)

    by OrochimaruVoldemort ( 1248060 ) on Wednesday April 09, 2008 @10:23AM (#23012570) Journal
    A large majority of them are in Manhattan, followed by the DC area, then France.
  • Does it matter? (Score:5, Interesting)

    by flyingfsck ( 986395 ) on Wednesday April 09, 2008 @10:35AM (#23012696)
    Since traffic cannot go to these black holes, I don't think it matters. A white hole constantly spewing out crap (a spammer) is a real problem, but a dead machine doesn't matter.
  • Re:Does it matter? (Score:5, Interesting)

    by mR.bRiGhTsId3 ( 1196765 ) on Wednesday April 09, 2008 @10:52AM (#23012922)
    I was under the impression that traffic to legitimate hosts was being lost into these black holes. It's not a dead machine, but rather bad routes being advertised for live machines. That's generally not supposed to happen, although I suppose it would be sweet if all the gunk the white holes spewed out were sucked into the black holes.
  • Re:Does it matter? (Score:5, Interesting)

    by JustinOpinion ( 1246824 ) on Wednesday April 09, 2008 @10:54AM (#23012946)
    I suppose it doesn't matter, but it's nice to know about it.

    I've often wondered why we don't have some kind of system where, when I try to go to a web page and it is unreachable (host down? Internet down? slashdotted?), I am instead given the "last known good copy" of the site. If you combined this black-hole detector with the "automatic archives" that already exist (e.g. Google's cache, or the Wayback Machine), then instead of getting an error page you could get a banner that says "host not available for reason X; here is what the site looked like on datetime Y".

    Seems like this could perhaps be built into a Firefox plugin, with it automatically delivering the cached version if the host is on the black-hole list or doesn't respond within a set wait time (roughly as sketched below).

    (Of course, typically when I have an idea like this, I then discover that people have already implemented it. So, if anyone knows of a browser-level or system-level utility that does this, please let me know!)
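    A rough sketch of that fallback idea, assuming the Wayback Machine's public availability endpoint (the function name and timeout are made up; a real plugin would hook the browser's error handling rather than wrap fetches like this):

        # Sketch only: try the live URL first; if it fails or times out, ask the
        # Wayback Machine which snapshot it has and serve that instead.
        import json
        import urllib.parse
        import urllib.request

        def fetch_with_archive_fallback(url, timeout=5):
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read(), "live copy"
            except OSError:
                pass  # covers DNS failures, refused connections, timeouts, HTTP errors
            query = urllib.parse.urlencode({"url": url})
            with urllib.request.urlopen(
                    "https://archive.org/wayback/available?" + query,
                    timeout=timeout) as resp:
                snapshot = json.load(resp).get("archived_snapshots", {}).get("closest")
            if not snapshot:
                raise RuntimeError("host unreachable and no archived copy found")
            with urllib.request.urlopen(snapshot["url"], timeout=timeout) as resp:
                return resp.read(), "archived copy from " + snapshot["timestamp"]

        body, source = fetch_with_archive_fallback("http://example.com/")
        print(source, "-", len(body), "bytes")

    Trying the live site first keeps the common case fast; a black-hole list like Hubble's could be consulted up front to skip the initial timeout entirely.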
  • by Anonymous Coward on Wednesday April 09, 2008 @03:19PM (#23016092)
    That's Verizon's (old GTE) network. The problem with this is that I use a secondary DNS server, 4.2.2.2, as a test to see if the Internet is "up" (a minimal sketch of that kind of check follows below). For about 10 years, whenever I've had network connectivity, that address has been pingable. And no, I've never been inside the Verizon network testing it... I've always been outside their network.

    So I don't see how it's only reachable 71% of the time according to the Hubble project. Makes you wonder how many times the project itself is unreachable... ;)
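    For what it's worth, a minimal sketch of that kind of connectivity check, substituting a TCP connection to the resolver's DNS port for ICMP ping (raw ICMP needs elevated privileges); the host, port, and timeout here are just illustrative:

        import socket

        def internet_is_up(host="4.2.2.2", port=53, timeout=3):
            """Return True if a TCP connection to the resolver succeeds within the timeout."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        print("Internet reachable:", internet_is_up())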

  • by PalmKiller ( 174161 ) on Wednesday April 09, 2008 @03:39PM (#23016300) Homepage
    See, we have this here newfangled Linux-based firewall (actually it's pretty old) that simply ignores ping and traceroute requests... among others... who doesn't these days?
