NETI@home Data Analyzed
An anonymous reader writes "The NETI@home Internet traffic statistics project (featured in Wired and Slashdot previously) has a quick analysis on the malicious traffic they observed. It's a rough world out there." Perhaps not surprising, but still disheartening: the researchers find, among other things, that a large portion of typical end-user traffic consists of malicious connection attempts.
Re:RBL of infected/malicious sites? (Score:3, Informative)
From the abstract of their paper:
Finally, we look at activity relative to the IP address space and observe that the sources of malicious traffic are spread across the allocated range.
So the answer is no, you can't filter effectively for bad sites.
malicious? (Score:3, Informative)
Re:Root of the problem (Score:2, Informative)
Re:RBL of infected/malicious sites? (Score:4, Informative)
Very highly recommended. In the case of p2p, it's good to keep your head down. It's the tall ones that get their heads chopped off...
They also have software to convert the lists to various formats for use in different firewalls. iptables fans should check out "linblock". Beware though, a large list can take an hour to parse on your typical recycled firewall box, but the tool merges the ranges to keep the tables as short as possible.
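The merging step mentioned above is what keeps a huge blocklist from blowing up your rule table. A minimal sketch of that idea (not linblock's actual code; `merge_ranges` is a hypothetical helper treating IP addresses as plain integers) looks like this:

```python
def merge_ranges(ranges):
    """Merge overlapping or adjacent (start, end) IP ranges, given as
    integer pairs, so the resulting blocklist has as few entries as
    possible before it is loaded into the firewall."""
    merged = []
    for start, end in sorted(ranges):
        # Adjacent (end + 1 == start) or overlapping ranges collapse into one.
        if merged and start <= merged[-1][1] + 1:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged
```

For example, `merge_ranges([(1, 5), (6, 9), (20, 30), (25, 40)])` collapses four entries down to two, `[(1, 9), (20, 40)]` — the same shrinkage that keeps the iptables chain short.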
Recent Worms DO organize to manage utilization (Score:4, Informative)
Code Red II [caida.org] implemented a randomized variant on this: "1/8th of the time, CodeRedII probes a completely random IP address. 1/2 of the time, CodeRedII probes a machine in the same /8 (so if the infected machine had the IP address 10.9.8.7, the IP address probed would start with 10.), while 3/8ths of the time, it probes a machine on the same /16 (so the IP address probed would start with 10.9.)" This means the worm doesn't have to keep track of phases, but it gets a similar effect. While there is more chance of overlap, the overlap stays low until the worm has infected most of the net, and the purely random probes help cover netblocks that weren't successfully infected locally due to firewalls, failures, or simple slowness.
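The target-selection scheme quoted above can be sketched in a few lines (a minimal sketch of the probabilities described in the CAIDA analysis, not the worm's actual x86 code; `pick_probe_target` is a hypothetical name):

```python
import random

def pick_probe_target(infected_ip: str) -> str:
    """Pick a probe target with Code Red II's biases:
    1/8 fully random, 1/2 in the same /8, 3/8 in the same /16."""
    octets = infected_ip.split(".")
    r = random.random()
    if r < 1 / 8:
        # Completely random IP address.
        keep = 0
    elif r < 1 / 8 + 1 / 2:
        # Same /8: keep the first octet (e.g. 10.x.x.x).
        keep = 1
    else:
        # Same /16: keep the first two octets (e.g. 10.9.x.x).
        keep = 2
    tail = [str(random.randrange(256)) for _ in range(4 - keep)]
    return ".".join(octets[:keep] + tail)
```

Running this many times from `10.9.8.7` yields roughly 87.5% of probes inside 10.0.0.0/8 and roughly 37.5% inside 10.9.0.0/16, matching the quoted fractions while never requiring the worm to remember a scanning phase.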
At least one worm that took this sort of approach had a bad random number generator, so it kept hitting the same territory too hard and missing other wide-open spaces, which protected a few parts of the net from infection.
List of Zombie Blocklists (+ other Bad-Site-BLs) (Score:3, Informative)