
Stopping Spambots: A Spambot Trap 312

Neil Gunton writes "Having been hit by a load of spambots on my community site, I decided to write a Spambot Trap which uses Linux, Apache, mod_perl, MySQL, ipchains and Embperl to quickly block spambots that fall into the trap. "
This discussion has been archived. No new comments can be posted.

  • Looking at my Day Job and personal web site, other than the very cool technical achievement of the trap (I'll have to see if I can rewrite this for my Checkpoint FW system), there was one thing I learned about good design from this article:

    Eliminate mailto - makes sense. You should have an http based "send me a message system" - force a live person to type stuff in instead of letting a program pick out addresses.

    Eliminating mailto alone would probably help with most of my spam problems (as I have my "contact me" address right on the first page).
    • by hagardtroll ( 562208 ) on Friday April 12, 2002 @09:12AM (#3328932) Journal
      I put my email address in a jpeg image. Haven't found a spambot yet that can decipher that.

    • The only problem with the idea of using entirely http based "send me a message systems" is that some people, like myself, would much rather have an actual email address to use instead of having to use 50 different layouts and 50 different configurations and 50 different methods of communicating with someone or a company. Every html based contact system has its own quirks and problems; I'd rather just need to learn my email program's issues instead.
    • Eliminating mailto alone would probably help in mot of my spam problems

      You're 100% right. And fighting against spambots by relying on UserAgent is akin to... well.... security thru obscurity, albeit somehow in reverse.

      What also looks strange is that he doesn't consider that one can get a link directly to a page on the n-th level: as human browsers don't usually download robots.txt either, sounds like he's gonna ban some poor guys who got a link from a friend...

    • I like the way one site handles the problem. To email a user, you have to click on a link containing the user profile. A link in the profile provides a "contact user" option which provides a form to fill out - if you are also a registered user of the site. If you are not a user of the site, then you are prompted to log in or become a user. If you are a user contacting another user, there is a checkbox which, if checked, will also send your real address to the user you are contacting, so then with his permission, contact may be made via regular mail. This is useful for sending graphics and attachments. The best part is your address is not given out unless you specifically permit it on a case by case basis. I love it.
    • Make a big Macromedia Flash site. Let the bots eat that: this is what a lot of companies do.

      don't worry, Google will adapt. They read even PDF and .doc files.

      new thought: make a site written in .doc format.
    • that you can leave them out of your HTML source:

      Instructions for use are included in comments. The script fragment that replaces mailto: links in the page will actually shorten your code -- it only requires entering the username and domain once. Also, the @ sign is added in by the script, so the address itself never appears in your HTML.
  • /.ed (Score:2, Funny)

    by Anonymous Coward
    Looks like you should've written some code to handle an overload from slashdot too!
  • Slashbot (Score:3, Funny)

    by Ctrl-Z ( 28806 ) <tim.timcoleman@com> on Friday April 12, 2002 @09:10AM (#3328927) Homepage Journal

    "I have a truly marvelous demonstration of this proposition which this bandwidth is too narrow to transmit."
  • by Anonymous Coward on Friday April 12, 2002 @09:13AM (#3328940)
    Why on Earth would you like to block a spambot? So it doesn't get any more useful addresses?
    No way, man.
    If you realize you're serving to a bot, go on serving. Each time the bot follows the "next page" link, you /give/ it a next page. With a nicely formatted page, where words and nums are random.
    Give it thousands, millions of addresses this way.
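    A minimal sketch of this kind of generator (Python here, standing in for whatever the server actually runs; the word list, address shapes, and /trap/ URL scheme are all made up for illustration):

```python
import random
import string

# All names below are invented for this sketch.
WORDS = ["quartz", "plum", "vortex", "ember", "nickel", "fjord"]
TLDS = ["com", "net", "org"]

def fake_address(rng=random):
    """One random, almost-certainly-nonexistent email address."""
    user = "".join(rng.choice(string.ascii_lowercase) for _ in range(8))
    domain = "%s%d.%s" % (rng.choice(WORDS), rng.randrange(100), rng.choice(TLDS))
    return user + "@" + domain

def trap_page(n_addresses=50, rng=random):
    """An HTML page full of junk addresses, plus a 'next page' link
    so the spider can keep wandering forever."""
    lines = ["<html><body>"]
    for _ in range(n_addresses):
        addr = fake_address(rng)
        lines.append('<a href="mailto:%s">%s</a><br>' % (addr, addr))
    lines.append('<a href="/trap/%d/">next page</a>' % rng.randrange(10**6))
    lines.append("</body></html>")
    return "\n".join(lines)
```

    Every page links to another random "next page", so the bot never runs out of road.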
    • by f3lix ( 234302 ) on Friday April 12, 2002 @09:31AM (#3329017) Homepage
      This isn't such a good idea - for every random (non-existent) domain that you generate, a root DNS server will be queried when an email is sent to this address, which increases the load on the root servers, which is generally a bad thing. How about instead, returning pages with the email address abuse@domain-that-spambot-is-coming-from all over them...
      • by BlueUnderwear ( 73957 ) on Friday April 12, 2002 @09:46AM (#3329095)
        - for every random (non-existent) domain that you generate, a root DNS server will be queried when an email is sent to this address, which increases the load on the root servers, which is generally a bad thing.

        Why is this a bad thing? They are owned by Verisign.

        How about instead, returning pages with the email address abuse@domain-that-spambot-is-coming-from all over them...

        This is also a good idea. In fact, I have a script which does a traceroute to the IP of the bot, and then looks up the admin contact using whois for the last couple of hops, and returns these. Oh, and for additional fun, throw in a couple of addresses of especially loved "friends"...

      • I like that idea...look up the originating host, and make links back to abuse@, root@, webmaster@, and whatever else you can think of. Clog their mailservers. The problem is, it would be simple enough (if it's not already in place) to have your spam bot ignore addresses for your own domain.
      • How about instead, returning pages with the email address abuse@domain-that-spambot-is-coming-from all over them...

        Most spambots know better than to send their crap to email addresses containing things like abuse, root, postmaster, .edu, or .gov.

        Also, in regard to the problem of root servers being queried every time a domain is looked up, could you not just use random IP addresses?
      • I've posted a separate article [] about fun tricks with round-robin DNS to feed spammers FQDNs that resolve to open relays, which will forward to other open relays. And if you know machines running Teergrubes [], they're excellent addresses to feed spiders.*

        If you're not messing with DNS, though, there are lots of addresses that can cause trouble:

        •, where the domain may be your spammer (if you customize your spidertrap) or a random spammer. They'll probably reject abuse@ and other obvious administrators, but names like "sales" and "purchasing" and "marketing" and anything that might get a real user is good.
        • If they're not verifying the list before using it, this is good.
        • randomjunkuser@randomjunksubdomain.spammerdomain.com
        •, at some site that encourages spammer customers.
        • randomjunkuser@randomjunksubdomain.spammers-ISP.net - does the spammer's ISP check for bad DNS hits?
        • randomjunkuser@othercustomer-of-spammers-hosting-ISP.net. Your mission is to get the spammer's ISP to throw off the spammer. If you want to be much ruder, you can use real-presidents-name@othercustomer-of-spammers-hosting-ISP.net, but both of those attacks require more customization to hit spammers you're having ongoing problems with, as opposed to shotgunning them all.
        • - anything not immediately recognizable as "remove@". Give some other spammer's list builder a bunch of addresses to work with.
        • Teergrubes [] are tarpits to stick spammers in. They look like perfectly correct SMTP servers, e.x.c.e.p.t. t.h.e.y. a.n.s.w.e.r. v..e..r..y.. s..l..o..w..l..y.. and maybe generate lots of error messages requiring repetition, and basically they leave the spammer's machine tied up for a long time with very little effort. A legitimate mailing list server that encounters a teergrube will normally survive, because it's usually multithreaded, or at least has almost all its recipients as legitimate users, but an occasional few minutes of one thread stuck in a trap isn't a major problem. But a spammer who's encountering a large number of teergrubes (especially if he picked them all up at once from a spidertrap) will have lots of threads tied up for a long time and may not have enough spare capacity to bother real targets. There are a number of implementations around.

          And somewhere out there is a far nastier variant on a teergrube that can keep a typical smtp session up for hours with only a few kilobits/minute, using tricks like setting TCP windows very small, NAKing lots of packets so TCP retransmits them, etc. (It basically works by saying "No, SMTP/TCP/IP isn't a set of protocol drivers in my Linux kernel, it's a definition of a set of messages, and there's no reason I should use a bunch of well-tuned efficient reliable kernel routines when I can send raw IP packets myself designed for maximal ugliness.")

          • Spamido [] is an automated tool for collecting spammers' addresses so they can be fed back to other spammers.
          • Wpoison [] and Sugarplum [] are spidertraps that generate lots of fake addresses for a long time.
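          For the curious, the teergrube idea can be sketched in a few lines (Python here; this is a toy, not one of the implementations linked above, and the port, banner, and delay are all arbitrary):

```python
# Toy teergrube sketch: an SMTP-ish listener that drips every reply
# out one byte at a time, with a long pause between bytes.
import socketserver
import time

DRIP_DELAY = 10.0  # seconds between bytes; tune to taste

def drip(send, line, delay=DRIP_DELAY):
    """Push a protocol line out one byte at a time, pausing between bytes."""
    for i in range(len(line)):
        send(line[i:i + 1])
        time.sleep(delay)

class Teergrube(socketserver.BaseRequestHandler):
    def handle(self):
        drip(self.request.sendall, b"220 mail.example.invalid ESMTP\r\n")
        while self.request.recv(1024):
            # Greet every command with a temporary failure, slowly, so a
            # polite client stays connected and keeps retrying.
            drip(self.request.sendall, b"451 4.7.1 try again later\r\n")

if __name__ == "__main__":
    socketserver.ThreadingTCPServer(("", 2525), Teergrube).serve_forever()
```

          A legitimate multithreaded mailer shrugs this off; a spammer with thousands of trapped connections doesn't.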

    • Wpoison [] does this.

      From the website: Wpoison is a free tool that can be used to help reduce the problem of bulk junk e-mail on the Internet in general, and at sites using Wpoison in particular.

      It solves the problems of trapped spambots sucking up massive bandwidth/CPU time, as well as sparing legitimate spiders (say, google) from severe confusion.

    • Actually, I've done this w/a bot trap on my site at home. It's a perl script that generates a bunch of weird-sounding text w/some fake email addresses at the bottom and a bunch of database-query-looking links back to the original page.

      The bots don't fall for it anymore. Some dorks in Washington state decided to make a couple requests a second to it once, but in the two years I've had it up, they're the only ones.
    • Give it thousands, millions of addresses this way.

      Liberally sprinkled with postmaster@ and abuse@

      • postmaster@ and abuse@

        Good idea, but I'm sure spam software has been rejecting those for many years.

        How about a few people volunteering real FQDNs that all resolve to 127.0.0.1? I realize that people would be volunteering horsepower and bandwidth for DNS lookups, but it would be in the name of dramatically reducing spam. Then, keep a list of all the "loopback FQDN's" and let the rest of us feed those FQDN's into spam-trap generators. Eventually, there would be so many real-looking spam trap email addresses that the spam software wouldn't be able to keep up with the list of loopback FQDN's.

        To take it to the next level, you could hide the list of "loopback FQDN's" by making a reverse DNS lookup against a couple of volunteered IP addresses return a random FQDN from the list of loopback FQDN's at the time that the spamtrap page is dynamically generated.

        Spammers would never know the entire list of FQDN's that resolve to loopback.
    • by boky ( 220626 ) on Friday April 12, 2002 @10:18AM (#3329243) Homepage

      I agree. And, come on, how much technology do you need?

      This is my solution to stopping spambots. It's written as a Java servlet, and I'm posting the code here rather than linking to it, to keep my company's site from being slashdotted. It does not prevent the spammer from harvesting emails; it just slows them down... a lot :) If everyone had a script like this, spambots would be unusable.

      Feel free to use the code in any way you please (LGPL-like and stuff)

      Put robots.txt in your root folder. Content:

      User-agent: *
      Disallow: /members/

      Put in WEB-INF/classes/com/parsek/util:

      package com.parsek.util;
      import javax.servlet.ServletContext;
      import java.util.Enumeration;
      import java.lang.reflect.Array;
      public class StopSpammersServlet extends javax.servlet.http.HttpServlet {
      private static String[] names = { "root", "webmaster", "postmaster", "abuse", "abuse", "abuse", "bill", "john", "jane", "richard", "billy", "mike", "michelle", "george", "michael", "britney" };
      private static String[] lasts = { "gates", "crystal", "fonda", "gere", "crystal", "scheffield", "douglas", "spears", "greene", "walker", "bush", "harisson" };
      private String[] endns = new String[7];
      private static long getNumberOfShashes(String path) {
      int i = 1;
      java.util.StringTokenizer st = new java.util.StringTokenizer(path, "/");
      while(st.hasMoreTokens()) { i++; st.nextToken(); }
      return i;
      }
      // Respond to HTTP GET requests from browsers.
      public void doGet (javax.servlet.http.HttpServletRequest request,
      javax.servlet.http.HttpServletResponse response)
      throws javax.servlet.ServletException, java.io.IOException {
      // Set content type for HTML.
      response.setContentType("text/html; charset=UTF-8");
      // Output goes to the response PrintWriter.
      java.io.PrintWriter out = response.getWriter();
      try {
      ServletContext servletContext = getServletContext();
      endns[0] = "localhost";
      endns[1] = "";
      endns[2] = "2130706433";
      endns[3] = "";
      endns[4] = "";
      endns[5] = request.getRemoteAddr();
      endns[6] = request.getRemoteHost();
      String query = request.getQueryString();
      String path = request.getPathInfo();
      out.println("<title>Members area</title>");
      out.println("<p>Hello random visitor. There is a big chance you are a robot collecting mail addresses and have no place being here.");
      out.println("Therefore you will get some random generated email addresses and some random links to follow endlessly.</p>");
      out.println("<p>Please be aware that your IP has been logged and will be reported to proper authorities if required.</p>");
      out.println("<p>Also note that browsing through the tree will get slower and slower and gradually stop you from spidering other sites.</p>");
      long sleepTime = (long) Math.pow(3, getNumberOfShashes(path));

      do {
      String name = names[ (int) (Math.random() * Array.getLength(names)) ];
      String last = lasts[ (int) (Math.random() * Array.getLength(lasts)) ];
      String endn = endns[ (int) (Math.random() * Array.getLength(endns)) ];

      // The original name-building comparisons were eaten by the
      // lameness filter; any mix of formats works.
      String email;
      double a = Math.random() * 15;
      if (a < 5) email = name;
      else if (a < 10) email = name + "." + last;
      else email = name + "_" + last;
      email = email + "@" + endn;

      out.print("<a href=\"mailto:" + email + "\">" + email + "</a><br>");

      Thread.sleep(sleepTime); // deeper pages sleep exponentially longer
      } while (Math.random() < 0.95);

      do {
      int a = (int) (Math.random() * 1000);
      out.print("<a href=\"" + a + "/\">" + a + "</a> ");
      } while (Math.random() < 0.95);

      out.println("</body>");

      } catch (Exception e) {
      // If an Exception occurs, return the error to the client.
      e.printStackTrace(out);
      } finally {
      // Close the PrintWriter.
      out.close();
      }
      }
      }

      Put this in your WEB-INF/web.xml

      <servlet>
      <servlet-name>stopSpammers</servlet-name>
      <servlet-class>com.parsek.util.StopSpammersServlet</servlet-class>
      </servlet>
      <servlet-mapping>
      <servlet-name>stopSpammers</servlet-name>
      <url-pattern>/members/*</url-pattern>
      </servlet-mapping>

      Here you go. No PHP, no Apache, no MySQL, no Perl, just one servlet container.


      • by erc ( 38443 ) <erc.pobox@com> on Friday April 12, 2002 @11:30AM (#3329650) Homepage
        Way too much work. Here's similar Escapade [] code:

        <QUIET ON>
        <html><head><title>Members area</title></head><body>
        <p>Hello random visitor. There is a big chance you are a robot collecting mail
        addresses and have no place being here.
        Therefore you will get some random generated email addresses and some random links
        to follow endlessly.</p>
        <p>Please be aware that your IP has been logged and will be reported to proper
        authorities if required.</p>
        <DBOPEN "SpamFood", "localhost", "login", "password">
        <FOR I=1 TO 100 STEP 1>
        <SQL select * from names order by rand() limit 1>
        <LET FN="$Name">
        <SQL select * from lasts order by rand() limit 1>
        <LET LN="$Last">
        <SQL select * from addresses order by rand() limit 1>
        <LET AD="$Address">
        <a href="mailto:$FN.$LN@$AD">$FN.$LN@$AD</a> <br>
        • Way too much work. Here's similar Escapade [] code:

          Not similar enough. That makes 300 queries per hit against your database, and I don't think you even used prepared statements. His code slowed their software to a crawl by sleeping. Yours will slow your software to a crawl by excessive database traffic.

    • Wpoison [] basically does that; it serves a page with bogus addresses and adds a nasty delay between pages, keeping the spider occupied.

      However, the instructions for installing Wpoison more or less assume that one has a single website to protect. I have around 20 virtual hosts. So instead of creating a renamed cgi-bin in every DocumentRoot, I added a single

      ScriptAlias /runme/ "/var/www/cgi-bin/"

      to httpd.conf and then linked it like this:

      <A HREF="/runme/addresses.ext"><IMG SRC="pixel.gif" BORDER=0></A>

      I also added a single transparent pixel to the link to keep it invisible but still fool the spiders. Add the runme directory as excluded in the robots.txt and you should be on your way. Muhahahah, and so on.

    • Why on Earth would you like to block a spambot? So it doesn't get any more useful addresses?
      No way, man.
      If you realize you're serving to a bot, go on serving. Each time the bot follows the "next page" link, you /give/ it a next page. With a nicely formatted page, where words and nums are random.
      Give it thousands, millions of addresses this way.

      This would be good to do with known bad addresses, but random addresses only add more unknowing people to the list. You may add 1000 email addresses to the list and slow them down, but if even 10 of those email addresses are real, you've added to the problem. The bad addresses will be taken out as they are found to be bad, and the good ones will be left in. You've signed someone up for all the spam he can handle, even if he has taken great lengths to keep his email address off the spam lists. In theory this sounds like a great idea, until you're the guy getting your email address randomly fed to the bots.
    • Try out the Book of Infinity []. It's a CGI that generates an infinite trail of gibberish links. It could easily be modified to add gibberish e-mail addresses to each page.
    • We've recently set up a Spam Troll-box using Vipul's Razor [] on our new Tux4Kids [] dev server (you can find our troll box here []).

      A troll-box gives spam-bots a place to send their spam. When this box intercepts the spam, it reports it to the Vipul's Razor network, and everyone else on the network becomes aware of that spam (and if they are also using Vipul's Razor to filter, which chances are they are, that spam will be filtered when it reaches them).

      If Vipul's Razor isn't enough, one can even use something like SpamAssassin [] in conjunction with Vipul's Razor to get even better results.

      Of course, this isn't cutting off spam-bots at their source... but if enough sites were to cut them off at their source, then I'd imagine the spam-bot authors would get wise to this and devise a way around it. Whereas with something like a spam troll-box, the spam-bots still seem to be working, as far as the people running them can tell ;-)

  • hmm, just a wild guess, but does this technique involve using the HTTP referrer to see if there are too many clients coming from one particular address (which would obviously be a *bad* thingy), and subsequently blocking them too?

    might explain why we can't see it no more :-(

    I want it too!!! it seems to work pretty good!
  • by Elkman ( 198705 ) on Friday April 12, 2002 @09:18AM (#3328960) Homepage
    I did something rather low-tech: I created a "Contact Us" page on my web server that has an automatically-generated address at the bottom. It says, "Note: The address is not a valid contact address. It's just here to catch spammers." The number is actually the current UNIX timestamp, so I know exactly who grabbed this mail address and sent me mail.

    As it turns out, I really haven't received that much mail to this address. About the only mail I've ever received to it is someone from, who tells me that I'm not listed on a few search engines and that I can pay them to have my site listed. I need to send her a nasty reply saying that I don't care about being listed on Bob's Pay-Per-Click Search Engine, and that if she had actually read the page, she would have noticed that she was sending mail to an invalid address. Besides, the web server is for my inline skate club and we don't have a $10/month budget to pay for search engine placement.

    I think I've received more spam from my Usenet posting history, from my other web site, and from my WHOIS registrations than I've received from the skate club web site.
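    Elkman's timestamp trick is easy to reproduce (a Python sketch; the trap-&lt;timestamp&gt; address format and the domain are invented here, not his actual scheme):

```python
import time

def trap_address(domain="example.org"):
    """A bait address whose local part records when it was generated,
    so mail to it tells you exactly when the page was scraped."""
    # The trap-<timestamp> format is invented for this sketch.
    return "trap-%d@%s" % (int(time.time()), domain)

def harvested_when(address):
    """Recover the UNIX timestamp embedded in a trap address."""
    local = address.split("@", 1)[0]
    return int(local.split("-", 1)[1])
```

    Regenerate the address on every page render and any spam to it pinpoints the harvesting visit in your access logs.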

  • by cswiii ( 11061 ) on Friday April 12, 2002 @09:19AM (#3328965)
    From the website:
    The Problem: Spambots Ate My Website

  • re: spidertrap (Score:4, Interesting)

    by blibbleblobble ( 526872 ) on Friday April 12, 2002 @09:22AM (#3328972)
    My PHP spider-trap [] - See an infinity of email addresses and links in action!

  • by bluGill ( 862 ) on Friday April 12, 2002 @09:22AM (#3328978)

    Removing mailto: links is a bad solution to the problem. It might be the only solution, but it is bad.

    I hate the editor in my web browser. No spell check (and a quick read of this message will prove how disastrous that is for me), no good editing ability, and other problems. By contrast my email client has an excellent editor, and a spell checker. Let me pull up a real mail client when I want to send email, please!

    In addition, I want people to contact me, and not everyone is computer literate. I hang out in antique iron groups, I expect people there to be up on the latest in hot tube ignition technology, not computer technology. To many of them computers are just a tool, and they don't have time to learn all the tricks to make it work, they just learn enough to make it do what they want, and then ignore the rest. Clicking on a mailto: link is easy and does the right thing. Opening up a mail client, and typing in some address is error prone at best.

    Removing mailto: links might be the only solution, but I hope not. So I make sure to regularly use spamcop [].

    • 1) Put a mailto: link with a preset subject, such as: [Question] About your site (or whatever)
      2) Trash any email sent to dedicatedaddress that doesn't have the [Question] tag in the subject.

      Hope this helps.
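      The filtering side of this is nearly a one-liner (Python sketch; the header dict and tag are illustrative, not tied to any particular mail filter):

```python
def keep_message(headers, tag="[Question]"):
    """Keep mail to the dedicated address only if the Subject carries
    the tag that the mailto: link pre-fills; everything else to that
    address is presumed to come from a harvested list and is dropped."""
    return tag in headers.get("Subject", "")
```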
      • by c=sixty4 ( 35259 )
        1. Put a link such as: [Question] About your site (or whatever)
        2. Trash any email sent to dedicatedaddress that doesn't have the [Question] tag in the subject.
        Congratulations. You just ensured you can't be emailed by anyone not running Internet Explorer.
        • Re:Simple solution! (Score:3, Informative)

          by fanatic ( 86657 )
          Congratulations. You just ensured you can't be emailed by anyone not running Internet Explorer.

          This seems to work fine (the window comes up with the right email address in the to: line and the '[Question]' tag in the subject: line) in Netscape 4.76

          and Lynx Version 2.8.3rel.1

          and Mozilla 0.9.7, which implies Netscape 6.x, and Galeon will work as well, though I haven't tested these.
    • by rsidd ( 6328 ) on Friday April 12, 2002 @09:47AM (#3329101)
      Write some of your email address using HTML entity codes for the ASCII characters, like &#114; for "r".
      (Yes, I've posted about this before [], but it does work for me.) Browsers render it so users get the address they want, but spambots try to grab it from the raw html and get something meaningless.
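      A quick way to generate such addresses (Python sketch; the address is illustrative and nothing here is specific to any tool mentioned in this thread):

```python
def entity_encode(text):
    """Encode every character as a decimal HTML entity, e.g. 'r' -> '&#114;'.
    Browsers render the original text; naive harvesters see no address."""
    return "".join("&#%d;" % ord(c) for c in text)

# Example: an entity-encoded mailto link (address is made up).
addr = "me@example.org"
link = '<a href="mailto:%s">%s</a>' % (entity_encode(addr), entity_encode(addr))
```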
      • by Sangui5 ( 12317 ) on Friday April 12, 2002 @01:16PM (#3330291)

        Some spambots will render that correctly. Less likely, though, is that they'll render an email that has had this [] done to it: it's encrypted through JavaScript.

        It is a rather impressive piece of work. Uses honest-to-god RSA.

        You could also encrypt all email addresses, and then in your spambot trap, put really really CPU intensive javascript. You'll win either way: either the spambot doesn't do javascript, and it won't get your addresses, or it does do javascript, and they've just spent an eternity wasting time. It would work the same way as a tarpit, but it wouldn't eat nearly so many resources on your end.

        If you're really clever, you could have the javascript do useful work, and then have the results of that work encoded into links in the page. You could then retrieve the results when the spider follows the link.

        There was an idea called hashcash floating around a while back. The idea was that an SMTP server would refuse to deliver email if the sender didn't provide a hash collision of so many bits to some given value. The sender has to expend asymmetrically more resources to generate the collision than it takes the receiver to check it. That way one can impose a cost on sending a lot of email. It's not so much as to be a burden on ordinary users, but if you need to send thousands of emails, it will add up.
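        The hashcash idea can be sketched with any hash function (Python here; the `resource:counter` stamp format and the difficulty values are made up for illustration, not the real hashcash wire format):

```python
import hashlib
import itertools

def leading_zero_bits(digest):
    """Count the leading zero bits of a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def mint(resource, bits=20):
    """Sender's side: burn CPU until the stamp hashes with enough
    leading zero bits. Expected cost grows as 2**bits."""
    for counter in itertools.count():
        stamp = "%s:%d" % (resource, counter)
        if leading_zero_bits(hashlib.sha1(stamp.encode()).digest()) >= bits:
            return stamp

def check(stamp, resource, bits=20):
    """Receiver's side: a single hash verifies what took 2**bits to mint."""
    return (stamp.startswith(resource + ":") and
            leading_zero_bits(hashlib.sha1(stamp.encode()).digest()) >= bits)
```

        At 20 bits a single stamp is a fraction of a second on a desktop; a million stamps is days of CPU.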

  • Thanks to: Spam Bots!!! More than meets the Eye :)
  • by Masem ( 1171 ) on Friday April 12, 2002 @09:32AM (#3329024)
    After the Battle Creek incident with ORBZ, the maintainer changed the way it worked; instead of being pro-active about checking for open relays, he now has a 'honeypot'-like system: a unique email address that isn't directly visible on the site but may still be harvested by a spam bot. Any server that sends email to that address is automatically added to The List. Mail server admins who believe that they should not be on the list can argue their case to remove their server.
    • by toupsie ( 88295 ) on Friday April 12, 2002 @09:56AM (#3329147) Homepage
      he now has a 'honeypot' like system where a unique email address that isn't directly visible on the site but still may be harvested by a spam bot. Any server that sends email to that address is automatically added to

      This is the same method I have been using for a while. I have an e-mail account called "cannedham" that I had posted on several web sites as a mailto: anchor on a 1x1 pixel graphic. Any e-mail sent to that address updates my Postfix header_checks file to protect the rest of my accounts. It works like a charm.
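      The update step can be sketched like this (Python, as a stand-in for whatever actually appends to the Postfix header_checks file here; the rule format is a plausible guess, so check the Postfix docs before using it):

```python
import email.utils

def header_checks_rule(from_header):
    """Turn the trapped spam's From: header into a (plausible, unverified)
    Postfix header_checks line rejecting future mail from that sender."""
    _, addr = email.utils.parseaddr(from_header)
    return "/^From:.*%s/ REJECT\n" % addr.replace(".", r"\.")

def record_spammer(from_header, path="/etc/postfix/header_checks"):
    """Append the rule; remember to rebuild/reload as Postfix requires."""
    with open(path, "a") as fh:
        fh.write(header_checks_rule(from_header))
```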

      • function SeedFakeEmail($Email)
        {
        echo "\n<font size=\"-5\" style=\"display:none\"><a
        href=\"mailto:$Email\">Please don't email $Email</a></font>";
        }


        Put that in your pageheader and smoke it!
  • by Spackler ( 223562 ) on Friday April 12, 2002 @09:35AM (#3329041) Journal

    Superior Labs spambot_trap mirror []

  • A tip (Score:5, Informative)

    by anthony_dipierro ( 543308 ) on Friday April 12, 2002 @09:36AM (#3329047) Journal

    Here's a tip for those of you writing spambot traps... How about not blindly responding to the faked Return-Path address?

    Now that should be illegal. You people whine about your 10 spams a day, try 10,000 from 2000 different email addresses. Idiot postmasters should be caught and jailed.

    • That's not usually the spambot trap, it's usually the MTA, when the spammer sends to an invalid address.

      Although, the MTA would be looking at the envelope sender if it's any good, but most of the time those are faked too.
  • by nwc10 ( 306671 ) <nwc10+/> on Friday April 12, 2002 @09:38AM (#3329054) Homepage
    Interestingly within the article he suggests hiding your e-mail addresses by making a feedback page. One of the programs that he suggests is formmail, and he links to Matt's original version.

    formmail itself (even the most recent version) can still be abused by spammers to use your webserver as a bulk mail relay - see the advisory at []

    It's a shame he didn't suggest the more robust formmail replacement at nms [] which is maintained, and attempts to close all the known bugs and insecurities.

    • Yeah, I had a funny incident where my address was put in the From:-field of a pr0n-spam sent using a formmail exploit. I quickly made an autoresponder to the people who complained to me, but it turned out to be just a handful of people (who I then took the opportunity of educating about many things).

      But there are later versions of formmail that are patched, aren't there?

    • Yes! It's becoming a popular target for spammers. If you have formmail in a common location, it will eventually be scanned for and picked up.

      I've seen it happen to sites I administer a number of times in the past, where individuals apparently using some sort of AOL name harvesting tool were using the scripts to send mass messages. Looking at the User-Agent headers, it looks like there's a VB script out there designed specifically to automate this exploit.
  • by liquidsin ( 398151 ) on Friday April 12, 2002 @09:40AM (#3329065) Homepage
    I've found that a lot of people just won't send email if there's not a link to facilitate it. I've become rather fond of using javascript to write the address to the page. Spambots read the source so they don't piece the address together but *most* browsers will still do it right. Just use something like:

    <script>document.write("<A CLASS=\"link\" HREF=\"mailto:" + "myname" + String.fromCharCode(64) + "mydomain" + "\">mail me</A>")</script>

    Seems to work fine. Anyone know of any reason it shouldn't, or have any other way to keep down spam without totally removing the Mailto: ? I know this won't work with *every* browser, but it beats totally removing mail links. And I don't think spammers can get it without having a human actually look at the page...

    • This also makes it invisible to anyone who disabled JavaScript, and anyone using a browser that doesn't do JavaScript (lynx, links, etc.)
    • In the description of the trap, the author has a warning page just in case a real user hits one of the bogus links. That page would also benefit from a handy javascript history.go(-1). You might consider an HTTP redirect header, but the bot might be smart enough to follow that.
    • On my company's Web site we've had success with this technique. The addresses posted on the Web site have not received any significant amount of spam. I have yet to see a single spam message that hits all four of the addresses on our contact page at once, which I believe would be a likely indicator we've been hit by a spambot.

      We embed this JavaScript code on each page that needs mailtos:

      <script type="text/javascript" language="JavaScript1.3">
      // Anti e-mail address harvester script.
      function n_mail(n_user) {
      self.location = "mailto:" + n_user + "@" + "yourdomain" + "." + "com";
      }
      </script>

      And then make email address links of this form:

      <a href="javascript:n_mail('foo');">foo<!-- antispam -->@<!-- antispam -->yourdomain<!-- antispam -->.<!-- antispam -->com<!-- antispam --></a>

      Our addresses even show up correctly in lynx, but are "clickable" only in JavaScript-enabled browsers.

      Of course, it's probably only a matter of time before spambots can compensate for this code. A more secure approach would be to put email address "components" in borderless cells of tables, or as a previous poster suggested, in images.
  • by bero-rh ( 98815 ) <bero.redhat@com> on Friday April 12, 2002 @09:41AM (#3329072) Homepage
    My setup (catches some of the more commonly used spambots) uses mod_rewrite to send spammers to a trap.
    Setup details at []
  • by PanBanger ( 465405 ) on Friday April 12, 2002 @09:43AM (#3329083)
    Have your page linked on slashdot! Page gets slashdotted, problem solved.
  • I used a very nifty bit of javascript which masks your mailto address. Provided the person has JavaScript on (and let's face it, nearly everyone who doesn't read /. does) then it works well.

    You can generate the code for your own email address here [] or, if you want some source code, then you can find an implementation of it here [].

  • my spambot trap (Score:4, Informative)

    by romco ( 61131 ) on Friday April 12, 2002 @09:48AM (#3329109) Homepage
    The page is already slashdotted. Here is a little
    script that traps bots (and others) that use your robots.txt
    to find directories to look through. It requires an .htaccess file with mod_rewrite turned on.


    User-agent: *

    Disallow: /dont_go_here
    Disallow: /images
    Disallow: /cgi-bin


    <?php
    $now  = date("h:ia m/d/Y");
    $host = getenv("REMOTE_HOST");
    $ip   = getenv("REMOTE_ADDR");

    $ban_code =
    '# '."$host banned $now\n".
    'RewriteCond %{REMOTE_ADDR} ^'."$ip\n".
    'RewriteRule ^.*$ denied.html [L]'."\n\n";

    $fp = fopen("/path/to/.htaccess", "a");
    fwrite($fp, $ban_code);
    fclose($fp);

    mail($your_email_address, "Spambot Whacked!", "$host banned $now\n");
    ?>

  • Other options.. (Score:4, Informative)

    by primetyme ( 22415 ) <djc_slash&djc,f2o,org> on Friday April 12, 2002 @10:00AM (#3329163) Homepage

    A pretty good article, but installing modules into Apache may not be an option for everyone who wants to stop Spambots..

    Shameless plug, but I've got an ongoing series in the Apache section of /. that deals with easy ways that administrators *and* regular users can keep Spambots off their sites:
    Stopping Spambots with Apache []
    Stopping Spambots II - The Admin Strikes Back []

    Just some more options and choices to help people out!

  • If you use images for email addresses, what are people using text browsers supposed to do? Even worse is using them on the "warning" pages - someone with a text browser would have no idea what the image said and therefore nothing to stop them falling into the trap and getting firewalled.

    And of course if he uses ALT text for the images, then he has the same problem he was trying to avoid, of creating something the spambots can read.
  • How about sending a parameter to a page which redirects to the mailto: protocol?

    For example:


    <a href="filename.php?x=info">E-Mail Me</a>


    header("Location: mailto:" . $_GET['x'] . "@mydomain.tld");

  • by Jason Levine ( 196982 ) on Friday April 12, 2002 @10:44AM (#3329341) Homepage
    There's a spam-blacklist, so how about a spambot-blacklist?

    You'd have a standardized spambot trap (like the one described in the article) on various webservers. The new spambot info could go into a "New SpamBots" database (which wouldn't be blocked). Once a day, the webserver would connect up with a central database and submit the new spambot info it's obtained. Then the server would download a mirror of the updated "SpamBots" database which it would use to block spambots.

    The centralized SpamBots database would take all of the new SpamBot info every day and analyze them in some manner as to detect abuse of the system (ensuring that only true spambots are entered). E-mails could be fired off to the abuse/postmaster/webmaster for the offending IP address. Finally, the new SpamBot info would be integrated into the regular SpamBot database.

    This way you'd be able to quickly limit the effectiveness of the Spambot-traps across many websites.
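    The client side of the nightly sync could be sketched like this (the central server, its URL, and the report format are entirely hypothetical — no such service exists; only the merge logic is shown concretely):

    ```javascript
    // Hypothetical sync client for a shared spambot blacklist.
    // A real version would POST newly trapped IPs to the central
    // server and fetch the updated master list as a mirror.

    // Merge freshly trapped IPs into the local block list, skipping
    // duplicates, so repeated sightings don't bloat the mirror.
    function mergeBlocklist(local, fresh) {
      const seen = new Set(local);
      for (const ip of fresh) {
        seen.add(ip);
      }
      return Array.from(seen).sort();
    }

    // e.g. nightly: local = mergeBlocklist(local, fetchedMasterList);
    ```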
    • Odds are high that this system, should it become sufficiently widespread to be useful, would be vulnerable to poisoning by spammers spoofing spambot traps and causing legitimate IPs (such as Googlebot or large blocks of Net users) to be incorrectly blocked. There are countermeasures against this, but my guess is that the resulting arms race would not result in an adequately-usable system for enough of the time to be worth it. (Remember, the blacklist must update with reasonable frequency for both additions AND expirations, and must have a VERY low rate of false-positives). The authentication of "legitimate" submitters is a serious weakness of such a system. Nice thought, though...
  • Attn Spambot Authors (Score:5, Interesting)

    by NiftyNews ( 537829 ) on Friday April 12, 2002 @10:48AM (#3329367) Homepage
    Dear Spambot Authors,

    Thanks again for your interest. I hope that we were able to help you write the spambots of the future that will be able to detect and sidestep as many of the above protection schemes as possible. We tried to work all of our knowledge into one convenient thread for your development team to peruse.

    Thanks for your interest in SlashDot, home of too much information.
  • I was just checking out one of the email harvesting products and saw
    this [] in the description:

    Automatically avoids spam trap pages.

    I wonder if this is a lie.. I also think it's funny because the rest of the product literature doesn't refer to it as a spam tool, but then this blurb is straight-up admitting it.

    Here's another funny 'feature'--

    Resume at the same place it left off even if your computer

    Doesn't exactly instill confidence in the stability of this product..
    • This morning, after finding a junk fax on the office's voice mail system, I called the removal number (in little text at the bottom of the fax) and reached an automated voice system that would either 1) remove an inputted number, 2) add a new number, or 3) talk to a representative about their service.

      Well, I didn't trust (1), and (3) just got me a voice mail box instead of a person I could chew out, which I didn't use. That left (2), and I had a wicked idea:

      I hit 2, and input the number that I should call if I was interested in the fax (which appeared in BIG text right above the little text). Their own response number should start eventually getting faxes from them or, as I tend to experience, hangups.

      Cute story, I know, but what does this have to do with defeating spambots?

      I went to the page indicated...

      I was just checking out one of the email harvesting products and saw
      this [] []

      And I scrolled to the bottom, and looked at the source code, and noted two faaaaaascinating things:

      First, the HTML on that page is rather clean; I can see no evidence of anti-spambot code on their page.

      And second, the "Contact Us" link at the bottom is a mailto:.

      By all appearances, their page is vulnerable to their own spambot.

      So I had the thought... what if those generated-random-email-address pages were geared to produce not-so-random email addresses? What if the email addresses on those generated-page traps were geared to generate random email addresses at the domains of the various spambot-- (err, I mean) harvester producing companies? Let them see what it's like when less than discerning spammers use their software for evil. Hundreds of Viagra-substitutes! Thousands of hangover cures! Tens of thousands of opportunities to refinance their home mortgage!

      This is just an off-the-top-of-my-head idea. Opinions?

  • On my site, I use a custom perl script with an HTML-based form that is programmed only to send messages to me. Here it is. []

    On stuff like my FAQs, I use igPay Latin Encoded Email: ahgaray atyay ahgaray otday omcay
  • Before announcing a new useful project to the Slashdot community, create a Freshmeat/Sourceforge page first, thereby eliminating the need for my host to shut me down for excessive bandwidth.
  • What I use (Score:3, Interesting)

    by Phroggy ( 441 ) <slashdot3@[ ] ['phr' in gap]> on Friday April 12, 2002 @11:26AM (#3329626) Homepage
    Take a look at these two bits of code from [] :

    <A HREF=""
    onMouseOver="window.status='mailto:hostingsli';return true;"

    <!-- Spam trap
    (your domain) HREF="mailto:abuse@ (your domain) "
    (your domain) HREF="mailto:root@ (your domain) "
    (your domain) HREF="mailto:postmaster@ (your domain) " HREF=""
  • by dananderson ( 1880 ) on Friday April 12, 2002 @11:35AM (#3329676) Homepage
    I don't stop spambots, I feed them. I feed them phony email addresses and addresses of spammers (gathered from places such as the fake /cgi-bin/ script I use [], mentioned before on /.) to dish it out!
  • by SethJohnson ( 112166 ) on Friday April 12, 2002 @12:22PM (#3329970) Homepage Journal

    Another way your e-mail address can be susceptible to spambots is if you participate in any mailing list. If the administrator decides to archive the list on a website, in many cases the email addresses of the participants will be there in plain text. I found this out after doing a google search for my own email address and having it turn up on
    the SuSE web site []. I sent an e-mail asking that they do a regsub on the archive to substitute the '@' with [at] or something similar. That was more than six months ago and the SuSE website admin still hasn't done it.
  • What about requiring all of your users to go through a terms of service page before accessing any parts of your site?

    The page could have a form with "Accept TOS" and "Reject TOS" buttons. I wonder how many spambots would submit a form?

    And to catch spambots that did submit the form, your TOS could have some clauses that make it a violation for evil spiders (ones that don't honor "robots.txt") to use the site. Maybe you could make||lose a few bucks suing the spambotters who go through the TOS and still harvest your email addresses.
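    The gate itself is simple to sketch (field names and paths here are made up for illustration): the TOS page POSTs a form, and access is only granted when the Accept button's value comes through — something most harvesters never do, since they only follow GET links.

    ```javascript
    // Hypothetical TOS gate: only a POST carrying accept=yes earns a
    // session cookie; every other request is bounced to the TOS page.
    function tosGate(method, params) {
      if (method === 'POST' && params.accept === 'yes') {
        return { allow: true, setCookie: 'tos=accepted' };
      }
      return { allow: false, redirect: '/tos.html' };
    }
    ```

    A bot that does submit the form has then affirmatively accepted the terms, which is what makes the suing-them angle even thinkable.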
  • by Peale ( 9155 ) on Friday April 12, 2002 @12:25PM (#3329988) Homepage Journal
    Speaking of spam, I've come across this new program called mailwasher. You can check your mail while it's still on the server, and then - get this - fake a bounced message. There are probably other programs that do this, but this is the first one I've heard of.

    Anyway, AFAIK, it's WinBlows only, and available at [], although right now it seems the site is down, all I get is a 404!
  • Rather than filling the spider with a whole bunch of (potentially valid) addresses and loading your server with bogus clients you don't want, just make it difficult for them to extract the addresses.

    I wrote a bit of PHP a few months ago that applies some spamproofing à la Slashdot (only a bit less aggressive) that some might find useful.

    Highlighted Source []

    Raw Source []

    It performs the following munging, depending on what you specify:

    freaky (at) aagh (dot) net

    f&#114;&#101;aky@&#97;&#97;g&#104;&#46;n&#101;&#116;

    random one of the above

    random with entity encoding

    all of the above
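    The entity-encoding variant is easy to reproduce. Here's a sketch in JavaScript (the poster's original is PHP, linked above) that encodes every character rather than a random subset:

    ```javascript
    // Encode each character of an address as a numeric HTML entity.
    // Browsers render "&#102;..." back into normal text, but a
    // harvester grepping the raw HTML for foo@bar patterns sees nothing.
    function entityEncode(addr) {
      return addr
        .split('')
        .map(function (c) { return '&#' + c.charCodeAt(0) + ';'; })
        .join('');
    }

    // entityEncode('a@b') → '&#97;&#64;&#98;'
    ```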
  • There are "scanner" traps that start up a session and just drop it (without telling the scanner), which ties it up until the scanner software times out.

    How about writing something similar for these spambots: a special web server that responds to requests very slowly (sends out a small packet every 10 seconds) so the bot won't time out and the server won't consume much CPU time, and just feeds it a line or two of junk with each packet. Have it randomly generate a never-ending supply of useless information to keep the spambot happy. While it's busy with the useless site, it's not bothering other people, nor is it getting any real addresses.
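    A tarpit along those lines can be sketched in Node-style JavaScript (the port and drip interval are arbitrary choices): speak just enough HTTP to keep the bot connected, then drip one junk address every 10 seconds. The `.invalid` TLD is reserved, so nothing harvestable ever leaks.

    ```javascript
    const net = require('net');

    // Produce one line of plausible-looking junk for the harvester to
    // eat. Names are random base-36 strings; the domain can't resolve.
    function junkLine() {
      const name = Math.floor(Math.random() * 1e6).toString(36);
      return '<a href="mailto:' + name + '@example.invalid">' + name + '</a>\n';
    }

    // Tarpit server: answer with minimal headers, then drip one small
    // packet every 10 seconds so the bot neither times out nor makes
    // any progress.
    const server = net.createServer(function (sock) {
      sock.write('HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n');
      const timer = setInterval(function () {
        sock.write(junkLine());
      }, 10000);
      sock.on('close', function () { clearInterval(timer); });
      sock.on('error', function () { clearInterval(timer); });
    });

    // server.listen(8080);  // uncomment to actually run the tarpit
    ```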
