Why I Hate the Apache Web Server

schon writes "Today's the last day of ApacheCon Europe; there was a hilarious presentation entitled 'Why I Hate the Apache Web Server' for anyone who has expressed frustration with the various inconsistencies and nuances of the Internet's favourite config file. And yes, it includes a comparison to Sendmail."
  • Whoops (Score:4, Insightful)

    by bigwavejas ( 678602 ) * on Friday July 22, 2005 @08:48PM (#13141090) Journal
    I think the subject was supposed to read, "Why I hate PDF files."
  • by Soulfarmer ( 607565 ) * on Friday July 22, 2005 @08:48PM (#13141097) Homepage Journal
    Could the on-duty-editor-at-the-moment PLEASE add a small note after the link IF TFA is in fact A PDF file. Please? That is NOT too much to ask, I hope. Sorely hope.

    And no, I didn't RTFA, which was in fact TFPDF.
  • by chrispyman ( 710460 ) on Friday July 22, 2005 @08:50PM (#13141106)
    At least a decent Apache install can keep on chugging along even when faced with a slashdotting.
  • by Rosco P. Coltrane ( 209368 ) on Friday July 22, 2005 @08:51PM (#13141116)
    - It runs acroread slowly, instead of loading in my already opened browser quickly

    - Uses huge ugly fonts

    - Has silly graphics that bring nothing to the point

    - Acroread requires two clicks to close (one for the document, one for acroread)

    - Yes, I want a pony
  • by sockonafish ( 228678 ) on Friday July 22, 2005 @08:54PM (#13141127)
    It's not the PDF format that sucks, it's Acrobat Reader. Use Preview or XPDF [foolabs.com].

    Complaining about PDFs is like complaining about HTTP because you don't like IIS.
  • Comment removed (Score:2, Insightful)

    by account_deleted ( 4530225 ) on Friday July 22, 2005 @08:54PM (#13141133)
    Comment removed based on user account deletion
  • Hilarious? (Score:3, Insightful)

    by Exitar ( 809068 ) on Friday July 22, 2005 @08:57PM (#13141145)
    If the presentation was hilarious, I assume that future Apache configuration will be easier.
    Otherwise I'd call it "sadly realistic"...
  • by Anonymous Coward on Friday July 22, 2005 @09:00PM (#13141168)

    If you're too lazy to look at the URL before clicking on the link, then you are clearly the target demographic for most phishing schemes.

  • by NeoThermic ( 732100 ) on Friday July 22, 2005 @09:02PM (#13141182) Homepage Journal
    Sitting in #apache on Freenode is actually fun sometimes. You'll see these common things brought up by many people every day. The PDF actually touches on only a few of the "problems" that the conf file has.

    However, it's the 2GB file limit that makes me laugh. Sure, there's LFS (configure 1.3 with CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64"; it's enabled by default in 2.0.53 and higher, and in 2.1), but to be really honest, there are far better ways to send large files, and HTTP isn't one of them. There's FTP and there are also torrents, both of which have the advantage of being designed for files rather than 'hypertext', which by nature is normally text...

    NeoThermic
  • by Rosco P. Coltrane ( 209368 ) on Friday July 22, 2005 @09:05PM (#13141196)
    Don't be silly. PDF files are very useful for distributing printable materials, such as books, spec sheets, PR and corporate bullshit (ugh), brochures, etc. Remember that PDF is essentially PostScript wrapped in an Adobe straitjacket.

    What does piss me off is:

    - People who use PDFs to make read-only documents
    - People who use PDFs where HTML or plain text is adequate.

    I don't see why they require me to launch that hateful Acrobat Reader when a browser does a better job.
  • by Burdell ( 228580 ) on Friday July 22, 2005 @09:17PM (#13141255)
    HTTP doesn't really have much to do with hypertext. Only a small percentage of the bits transferred via HTTP are text/html (think images, Flash and Java, and of course PDFs). In many ways, HTTP is a better file download protocol than FTP:
    • doesn't need a second port for transfers (so no firewall "fun")
    • byte ranges allow a client to request only part of the file (great for resuming partial downloads)
    • easier to do per site, per directory, or per file authentication (since authentication is per request, not per "session")
  • by Linus Torvaalds ( 876626 ) on Friday July 22, 2005 @09:22PM (#13141282)

    I suspect most people here are able to position the cursor over the article link and look in the status bar, note the .pdf at the end of the URL, and know that this is a PDF.

    Assuming they are able to do it is one thing. Expecting them to do it every time they follow a link is another thing entirely.

  • by NanoGator ( 522640 ) on Friday July 22, 2005 @09:30PM (#13141331) Homepage Journal
    "PDF has no place on the Internet, thats why we use HTML , but that would interfere with Adobes buisness model"

    Bullshit. Have you ever tried printing a PDF file?

    PDF has its place, but I agree in this case it was silly.
  • by Penguin ( 4919 ) on Friday July 22, 2005 @09:47PM (#13141419) Homepage
    But... Why would you have your browser opening the file directly without asking you, if you don't like that behaviour?

    It's not like it's an HTML page with a lot of processor-consuming JavaScript, Java that requires a lot of loading of the Java engine, or the like. It's a totally different content type. You have every way of choosing what to do with it.

    Instead you choose to be annoyed. I don't get it.
  • by j1m+5n0w ( 749199 ) on Friday July 22, 2005 @09:49PM (#13141427) Homepage Journal
    More people who complain loudly when something doesn't work the way it should. I applaud Rich Bowen for this honest critique of Apache configuration, and I hope more people do the same for their favorite open source projects. Sometimes, that's the only way things get fixed.

    I'm also a big fan of the "Grumpy Editor's Guide" series of articles at Linux Weekly News.

  • by jadavis ( 473492 ) on Friday July 22, 2005 @09:51PM (#13141434)
    My main complaint about Apache is that it makes it difficult to divide up users' dynamic content.

    If one user wants mod_perl, one wants php, and one wants mod_ruby, you pretty much have to have different webservers running, which means an administrative hassle and separate IPs.

    There are a couple solutions I can think of:
    (1) Change Unix user permissions after it has selected a vhost, but before running any code or accessing files. Not just for CGIs, either, but for modules.
    (2) Make it easier to run separate webservers as if they were one. Basically, take the administrative hassle out of running multiple webservers.

    Right now ISPs basically just offer PHP and use safe mode. But that doesn't help other languages, and it's basically a PHP hack.

    It would also be nice if problems with one vhost didn't prevent the entire server from reloading the config. It should give a nasty error, maybe, but the webserver shouldn't shut down the working vhosts; at worst it should leave things as they were before the reload.
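    For the CGI part of (1), something close already exists in mod_suexec; here's a minimal sketch of per-vhost user separation (usernames and paths are made up), which shows both the idea and its limit, since it covers CGI only and does nothing for in-process modules like mod_perl or mod_php:

    # Sketch: per-vhost CGI privilege separation with mod_suexec
    # (Apache 2.0+). Hypothetical users and paths. CGI scripts run as
    # the named user; code run by in-process modules (mod_php,
    # mod_perl, mod_ruby) still executes as the main server user.
    NameVirtualHost *:80
    <VirtualHost *:80>
        ServerName alice.example.com
        DocumentRoot /home/alice/public_html
        SuexecUserGroup alice alicegrp
    </VirtualHost>
    <VirtualHost *:80>
        ServerName bob.example.com
        DocumentRoot /home/bob/public_html
        SuexecUserGroup bob bobgrp
    </VirtualHost>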

  • by fm6 ( 162816 ) on Friday July 22, 2005 @09:57PM (#13141470) Homepage Journal
    This was never meant to be an article, text-based or otherwise. It's a presentation -- the stuff that appears on a big screen behind somebody's head while they're talking. People put these on the web because they're all that's left of the talk. Unless somebody thought to record the talk and put that on the web. Speaking of really big files...
  • by HrothgarReborn ( 740385 ) on Friday July 22, 2005 @10:00PM (#13141480)
    PASV does NOT fix this. Passive mode still uses a second port; it just changes the direction of the connection. On all firewalls you have to load extra stuff to properly allow the connection to work, or fully open all outbound traffic. With iptables there is an ip_conntrack_ftp module to load; on a PIX there is a fixup protocol to enable. The fact is, FTP is the most broken protocol there is. There is no reason everything cannot be handled over a single port, and the security issues (race conditions, bounce scans, cleartext sign-on and transfer) are unacceptable. The ONLY thing going for it is tradition.
  • by DennyK ( 308810 ) on Friday July 22, 2005 @10:03PM (#13141496)
    The basic auth logout: yes, people have been asking for it for years, but it's HTTP itself that doesn't provide a mechanism for logging out users, it's not Apache's fault.

    This one baffled me as well. How could you have a "logout" function in a stateless protocol? Logins don't persist beyond the fulfillment of a single request. The storing of a username and password for HTTP authentication is implemented on the client side, it has nothing to do with the web server or even the protocol. Complain to Microsoft/Mozilla/Opera Software or whoever makes your browser if you don't like it.
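    To make the point concrete, here's roughly what the server side of basic auth looks like (a sketch with hypothetical paths). Notice there is no directive that could possibly mean "log out": the server checks credentials per request and holds no session to tear down.

    # Sketch of HTTP basic auth in httpd.conf (hypothetical paths).
    # Every request for /private must carry an Authorization header;
    # the server validates it each time and keeps no session state.
    <Location /private>
        AuthType Basic
        AuthName "Members Only"
        AuthUserFile /etc/apache2/htpasswd
        Require valid-user
    </Location>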
  • by ScottSpeaks! ( 707844 ) * on Friday July 22, 2005 @10:08PM (#13141515) Homepage Journal
    What's worse is that some people actually use Comic Sans to letter comic books. Comics - yes, even self-published superhero comics - deserve more respect than that.
  • by Kalak ( 260968 ) on Friday July 22, 2005 @10:10PM (#13141521) Homepage Journal
    He has some great points, and in case the fanboys missed it: he's an Apache developer. He has the right to hate some things in Apache.

    I'm glad to see that someone who works with the project has some of the same frustrations I do:

    mod_imap - why does anyone still need this?
    http and https needing separate entries in the vhost config (see the sketch below)
    vhosting in general
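    A minimal sketch of that second one (hostnames and cert paths invented): the whole site has to be declared twice, once per port, even though only the SSL lines differ.

    # Sketch: the same vhost, stated twice for http and https.
    <VirtualHost *:80>
        ServerName example.com
        DocumentRoot /var/www/example
    </VirtualHost>
    <VirtualHost *:443>
        ServerName example.com
        DocumentRoot /var/www/example
        # only the next three lines are actually new
        SSLEngine on
        SSLCertificateFile /etc/apache2/ssl/example.crt
        SSLCertificateKeyFile /etc/apache2/ssl/example.key
    </VirtualHost>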

    And to those whining about PDFs: would you rather have this posted as a PPT file? The Comic Sans probably means PowerPoint is at the root of this, and I'm guessing he picked a format everyone can read without resorting to PowerPoint's horrible HTML conversion. I hate PDFs, and really hate them viewed in the browser, but that's what "save as" is for. And I'll bet you didn't have to go get a viewer just to read this. There is no pleasing the Slashbots who would rather whine about a PDF than take the criticism in stride, and with the humor it was presented in. If you have to whine about the delivery, then you're too childish to pay attention to the message. He may not have OpenOffice installed at his work (there are places that don't allow that), and this may have been the best he could do with reasonable effort.

    I'd rather his effort go into the server than into giving us an HTML page rendered just for us. He could use that time to fix some of the annoyances! Some people have better things to do than to please everyone.

    And I say we give him a pony!
  • by Penguin ( 4919 ) on Friday July 22, 2005 @10:23PM (#13141564) Homepage
    No, I didn't skip the torrent suggestion, nor did I attack it.

    I just pointed out that your argument - "there are far better ways to send large files. HTTP isn't one of them." - didn't justify your FTP suggestion at all.

    HTTP has the ability to resume as well. I have never had a problem resuming an HTTP download. Some web browsers might not offer you this possibility for downloads (though they might use it themselves when requesting images on a page that were only partially downloaded on the last visit). But then again, those browsers might not offer resume on FTP either.

    Anonymous login still doesn't qualify as "a better way" regarding large files. It's quite irrelevant to large files and only introduces more overhead. Not that overhead matters much: once the transfer is underway, the situation is the same whether you use HTTP or FTP.

    I really can't find grounds for your statement that you should be "unable to resume in most cases". I honestly can't recall this being an issue. I often download and resume large files from different HTTP servers.
  • by totro2 ( 758083 ) on Friday July 22, 2005 @10:25PM (#13141571)
    Pardon the obvious comment I'm compelled to spew in Apache's defense:

    Due to the Open Source nature of Apache, anyone who is ready to actually improve Apache (in ways that the Apache people potentially don't like and won't accept into the code) can fork Apache and make their own even-easier-to-configure web server.

    Also remember that functionality comes before user-friendliness. It should be no surprise that there are warts on the config syntax; just be glad the damn thing works at all! If you want a real taste of ugly, go use IIS or (shudder) WebLogic. You'll run back to Apache so fast your legs will fly off.

    As apache matures even more, no doubt these warts will eventually get addressed. Maybe some kind of little task force will even form with this goal in mind.
  • by Linus Torvaalds ( 876626 ) on Friday July 22, 2005 @10:37PM (#13141622)

    I've never had a download from an FTP server ever fail. I've had *many* fail from HTTP-served downloads, so many that I really do try to avoid downloading anything over about 3MB on HTTP.

    I've downloaded many, many large files (e.g. ISOs) over HTTP with no problem and have done for years. If you are having problems downloading anything over 3MB, then I would guess that you are misconfiguring these computers. Really - you think the rest of the world is just putting up with flaky downloads?

    TCP ensures an error-free connection for both FTP and HTTP. Neither FTP nor HTTP handle that part of the work. When you say "fail", what do you mean, exactly? Dropped connections? Corrupted files?

    In any case, your personal experience and my personal experience are unimportant. That's why I was asking for stats. You are the one claiming that HTTP is unsuitable for large downloads; the burden of proof is on you to show that.

    You keep skipping over torrents.

    I think you are confusing me with somebody else.

    Really, are you trying to attack one point by ignoring points you can't argue?

    Er, what? I'm arguing that HTTP isn't as bad compared with FTP as you make it seem. BitTorrent doesn't factor into that argument whatsoever.

    Or will you acknowledge that torrents can be far better than HTTP for downloads of large files?

    I'll acknowledge that all three protocols have advantages and disadvantages. BitTorrent is not a silver bullet: the fact that users have to download and install additional software is a showstopper for many people, as is the fact that it's not simply a client downloading from a server (e.g. you have to open up ports and sacrifice upstream bandwidth to get a decent speed).

    All three protocols "can be" far better than the other two. It depends on the circumstances. For large files, it depends on what servers are available, the update schedule, the bandwidth available, and so on. It's wrong to simply call one "far better" than another.

  • by swmccracken ( 106576 ) on Friday July 22, 2005 @10:49PM (#13141679) Homepage
    LOOK, you lot have missed one critical point. The guy is a committer to the Apache httpd project itself. He's on the INSIDE. He knows more about Apache than YOU.

    He's just pointing out to his own teammates some of the silliness in Apache that people who are involved with it and use it get used to. (And even if it is documented, that doesn't mean it's not silly.)

    mod_imap? Why, for example, is that still on by default?

    As for the PDF complaints: THIS IS A PRESENTATION AT A CONFERENCE. What would you have preferred? A PPS file? And those complaining about the fonts? Get over yourselves.
  • My Biggest Request (Score:4, Insightful)

    by DarkHelmet ( 120004 ) * <mark AT seventhcycle DOT net> on Friday July 22, 2005 @10:52PM (#13141697) Homepage
    Okay...

    Why can't Apache's configuration file be XML-compliant? It would make life sooo much easier if it were.

    It would be sooo much easier to parse and validate the configuration file if it actually conformed to SOME kind of standard.

    For that matter, why not use some limited XSL syntax in order to handle conditions?
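    The maddening part is that the container syntax already apes XML without actually being XML; here's a small sketch of why no XML parser will touch it as-is:

    # Apache's config only looks XML-ish; none of this is well-formed XML:
    <VirtualHost *:80>
        # "*:80" is not a named, quoted attribute
        ServerName example.com
        # bare directives like ServerName have no closing tags,
        # and the file as a whole has no single root element
    </VirtualHost>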

  • by mcrbids ( 148650 ) on Friday July 22, 2005 @11:23PM (#13141881) Journal
    'runs as user NOBODY'

    Perchild MPM, which lets Apache run as the user owning a given vhost, has been all but dropped. [apache.org]

    A few other guys have (kind of) picked it back up again, [metux.de] and gotten it to (mostly) work, but it doesn't scale well, yet... (barfs at 256 hosts)

    Why can't somebody get this to work? (I would, but I'm not a C coder.)
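    For anyone curious, here's roughly what the perchild config looked like while it was in the tree; this is a sketch from the Apache 2.0 experimental docs, so treat the directive details as approximate:

    # Experimental perchild MPM (Apache 2.0, since dropped): dedicate
    # child processes to a user/group and tie vhosts to them.
    NumServers 5
    # run two children as alice/alicegrp
    ChildPerUserID alice alicegrp 2
    <VirtualHost *:80>
        ServerName alice.example.com
        # requests for this vhost are served by alice's children
        AssignUserID alice alicegrp
    </VirtualHost>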
  • Re:RewriteMap (Score:3, Insightful)

    by Covener ( 32114 ) on Saturday July 23, 2005 @12:03AM (#13142048)
    For my own site, I wanted a rule that catches www.* and issues a permanent redirect to the browser, pointing them to the domain without the 'www' attached. Since I had two or three domains hosted on this box, I wanted to do it globally.

    The only sane way to do this with Apache as it is today was:

    RewriteMap www prg:/etc/apache2/conf/rewrite/www.pl
    RewriteCond %{HTTP_HOST} ^www\. [NC]
    RewriteRule (.*) http://${www:%{HTTP_HOST}}$1 [R=301]

    #!/usr/bin/perl
    # strips the leading "www." from each hostname mod_rewrite feeds us
    $| = 1;
    while (<STDIN>) {
        $_ =~ s/^www\.//;
        print $_;
    }

    Compared to the simpler:

    RewriteCond %{HTTP_HOST} ^www\.(.*) [NC]
    RewriteRule (.*) http://%1$1 [R=301]

    (or UseCanonicalName?)
  • by omega9 ( 138280 ) on Saturday July 23, 2005 @12:56AM (#13142270)
    "Tcl/Tk front end?"

    And for X-less webservers? Maybe something like the menuconfig frontend to kernel building would be neat.
    His later points are pretty Apache-specific, but most of the early stuff (if-else, variables, case sensitivity, and so on) is symptomatic of trying to produce an ad hoc implementation of a general coding problem (config file parsing) instead of doing it just once in a library.

    This problem is *everywhere*. Why are we still putting up with differently-designed config files for your webserver, your ftp server, your mailserver, your nameserver and heaven knows what else, all supported by their own pieces of custom code which, like Apache's, each have the possibility of growing up to be subtly wrong?

    I know the Windows idea of a centralised registry sucks in too many ways (inscrutable binary is no match for human-readable text files), but there's one thing it's got right: all the apps which access their configuration use a consistent API to do so. Is it an impossible dream to hope that someone gets a bunch of large free software projects to agree on what needs to go into a libconfigparse, then implements it, and provides bindings for major languages? Then we might stand a chance of avoiding weird config file problems cropping up in Apache and everywhere else, slightly differently each time.

  • 2GB File limit (Score:2, Insightful)

    by BoldAndBusted ( 679561 ) on Saturday July 23, 2005 @02:08AM (#13142520) Homepage
    Man, the comments are way off the rails on PDF readers. Funny. So, back on topic... the PDF mentions one of my big problems with the current Apache:

    2 GB file limit

    Why, oh why? It's 2005! Makes throwing video around a bit limited. Please, good Apache people, make this a priority!
  • No surprise (Score:5, Insightful)

    by Sycraft-fu ( 314770 ) on Saturday July 23, 2005 @02:34AM (#13142604)
    A good number of OSS zealots (of which a good number are found here) have the need to believe that OSS is always better, in every case, and part of that is not admitting faults. If you admit faults, you admit the possibility that something else could be done better.

    I got into that some time ago over audio apps in Linux. I mentioned that one of the reasons I run Windows is pro audio work; Linux just doesn't have the tools. I was told yes it does, so I asked: like what? I mean hey, maybe they know something I don't, I'm always looking for new tools. No, I got pointed to the same ones I've already tried. So I talked about what was wrong with them, why I don't like them. In response, basically every flaw was downplayed, denied, blamed on me, or declared to be "a better way of doing things".

    Zealots, of whatever type, want to believe their product/way of life/whatever is the best there is. Thus when presented with real criticism, they are likely to either ignore it or try to change the argument to something else.
  • by Sycraft-fu ( 314770 ) on Saturday July 23, 2005 @02:44AM (#13142640)
    The attitude that so many have of "If you don't like it, fix it yourself!" is a very harmful one; it's very abrasive and turns many people off to OSS.

    I mean, you have to remember that most of the people in the world CAN'T, even if they want to, because they aren't coders. The majority of the population, well over 90%, does not know how to program. It's stupid to say they should learn how to. The whole point of specialization of labour is that people don't have to do everything. Coders code, other people use what they make.

    Then, of the few that can code, most don't have the time. It's a serious undertaking to make major changes to a codebase, and it's really hard when it's not yours. You have to spend a lot of time just learning what the fuck is going on and how it all works before you can start making changes. Well, most coders can't do that, especially for every product they happen to use. There's a fixed amount of time, and most of us have most of it taken up by more important things (like a paying job, family time, housework, etc).

    Then, even if you do have the ability and time, it's not always easy. I'd note that the guy who gave this presentation is an Apache developer, so he IS putting his money where his mouth is. It's just pretty clear that making the fixes isn't some little one-hour coding job; it's major work that needs to happen.

    So really, people who want to push OSS shouldn't take this instantly hostile "Well, fix it yourself!" attitude. Problems should be listened to, and should be fixed when possible. When that's not possible, the reasons should be explained, and the person should be helped to figure out how to work with what they have as best as possible.

    Oh, and having configured both IIS and Apache: IIS wins hands down. Easy GUI config, options do what you think they do, plenty of context-sensitive documentation. That's not to say it's a better web server, and sure as hell not a more secure one, but when it comes to configuration, there's just no contest.
  • by ensignyu ( 417022 ) on Saturday July 23, 2005 @03:07AM (#13142723)
    Are you using a browser to download files over HTTP? Most browsers have horrible resume support. Try something like GetRight (for Windows) or even wget -c. I've never run into a problem with those, but I wouldn't trust Firefox to handle a large download correctly.

    Torrents are pretty nifty, but they're more complicated to support (need a seeder, etc) and much less reliable over slow connections. Generating SHA1 hashes for a 2GB file takes a while, so you can't just drop a file in the web directory and serve it immediately.
  • by Spy Hunter ( 317220 ) * on Saturday July 23, 2005 @03:45AM (#13142835) Journal
    Amen. I only wish HTTP had an INDEX method, where you could get a real file/directory index listing in a standard XML format suitable for use in a file manager (where permitted, of course). That, and proper support for the PUT method. It would then truly blow all other file transfer protocols out of the water. Why use FTP, NFS, WebDAV, or SMB with all their bloat, complexity, and security problems when you could just be using good old HTTP, which you already know and love? If only the creators of HTTP had seen fit to include an INDEX method, it would have saved us all so much trouble. I don't know why it's never been added by anyone as an extension, either.
  • by poulbailey ( 231304 ) on Saturday July 23, 2005 @07:25AM (#13143250)
    > It runs acroread slowly, instead of loading in my already opened browser quickly

    Reader 7.0 runs okay here. It's no speed monster, but it's noticeably faster than earlier versions of the program.

    > Uses huge ugly fonts

    Christ. This is not the format's fault! Blame the content creator for being a lousy designer. If you use nice typefaces, PDF will display them just fine. You could go for a nice looking type like Adobe Garamond Pro.

    > Has silly graphics that bring nothing to the point

    Again a designer problem. You're really bad at this trolling thing, you know. :waycool:
  • by 10101001 10101001 ( 732688 ) on Saturday July 23, 2005 @09:56AM (#13143685) Journal
    I mean you have to remember, that most of the people in the world CAN'T, even if they want to, because they aren't writers. The majority of the population, well over 90%, does not know how to write. It's stupid to say they should learn how to. The whole point of specialization of labour is that people don't have to do everything. Writers write, and other people use what they make.

    I mean you have to remember, that most of the people in the world CAN'T, even if they want to, because they aren't literate. The majority of the population, well over 90%, does not know how to read. It's stupid to say they should learn how to. The whole point of specialization of labour is that people don't have to do everything. Readers read, and other people use what they hear.

    etc

    Just because it was once true that 90% of most countries were illiterate doesn't mean it's a specialized skill that only a select few should know. The same can be said for math, which many people are told has no "real world" use beyond simple arithmetic. Programming/coding is a combination of language, math, and logic to perform tasks. Perhaps if a larger percentage of the world were coders, there'd be a lot fewer people who would accept closed proprietary products; i.e., more people would demand to do their own source-code modding. There's a reason it's called computer literacy. And there's a reason why taking a course in using Excel isn't it.
