Speed Up Sites with htaccess Caching 29
produke writes "Increase your page load times and save bandwidth with easy and really effective methods using Apache htaccess directives. The first method uses mod_headers to set Expires, max-age, and Cache-Control headers on certain file types. The second method employs mod_expires to do the same thing; together with FileETag, it makes for some very fast page loads!"
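In case it helps to see what the two methods boil down to, here is a minimal .htaccess sketch along those lines (the file types and lifetimes are placeholder choices, not the article's exact values):

# Method 1: mod_headers -- set Cache-Control (and so max-age) on selected static file types
<IfModule mod_headers.c>
<FilesMatch "\.(png|gif|jpg|ico)$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>
</IfModule>

# Method 2: mod_expires -- have Apache compute Expires/Cache-Control headers for you
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/png "access plus 1 week"
ExpiresByType image/gif "access plus 1 week"
</IfModule>

# ETags built only from mtime and size, so they stay consistent across mirrored servers
FileETag MTime Size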
Increase page load times? (Score:5, Funny)
Re: (Score:3, Funny)
Re: (Score:1)
I use it all the time, but be aware.. (Score:5, Interesting)
However, if you are one to be changing images around, like using a Holiday logo or something, you have to change the image file name to force browsers to reload it.
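One option, if you'd rather not rename and you can edit the .htaccess serving the images, is to give the files you rotate a much shorter lifetime than everything else -- a rough sketch, with logo.png standing in for whatever file changes:

<IfModule mod_headers.c>
# Keep the rotating logo on a short leash even if other images are cached for a week
<Files "logo.png">
Header set Cache-Control "max-age=300, must-revalidate"
</Files>
</IfModule>

Renaming is still the bulletproof route, though, since even a short max-age means some visitors see the old image for a few minutes.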
I'm sorta surprised that slashdot doesn't use this on their images:
wget -S --spider http://images.slashdot.org/logo.png
--08:31:01-- http://images.slashdot.org/logo.png
=> `logo.png'
Resolving images.slashdot.org... 66.35.250.55
Connecting to images.slashdot.org|66.35.250.55|:80... connected.
HTTP request sent, awaiting response...
HTTP/1.0 200 OK
Date: Mon, 04 Dec 2006 14:30:12 GMT
Server: Boa/0.94.14rc17
Accept-Ranges: bytes
Cache-Control: max-age=43200
Connection: Keep-Alive
Keep-Alive: timeout=10, max=1000
Content-Length: 7256
Last-Modified: Fri, 01 Dec 2006 03:02:14 GMT
Content-Type: image/png
Length: 7,256 (7.1K) [image/png]
200 OK
Re: (Score:2)
If you're making regular changes to a particular piece of content, your max-age and/or expiration date needs to be set up to accommodate that. If you change something every day at 6am, set your expiration date for 6am. If it could change at any time and you want changes picked up within an hour, set max-age=3600. Let your caching policies work with your publishing schedule, not against it.
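For the "picked up within an hour" case, a short sketch using mod_headers (the file-type match is just an example; adjust it to whatever content you actually update):

<IfModule mod_headers.c>
<FilesMatch "\.(html|rss)$">
# Browsers and caches may reuse this for up to an hour before re-fetching
Header set Cache-Control "max-age=3600"
</FilesMatch>
</IfModule>

Note that mod_expires only speaks in relative terms ("access plus ..." / "modification plus ..."), so a fixed 6am expiry would have to be computed by the application rather than by these directives.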
Re: (Score:2)
caching htaccess? (Score:5, Informative)
They don't.
If you're going to set up caching in your server to decrease load time, make sure to set it in the main configuration files and disable .htaccess, which can otherwise add time to every page load. (The reduced hits and bandwidth may still be an advantage to you -- you'll have to benchmark to see whether this solution helps or hurts for your given platform and usage patterns.)
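For those who do control the server, moving the same rules into httpd.conf and switching off per-directory overrides might look roughly like this (the path is made up):

<Directory "/var/www/html">
# No .htaccess lookups at all under the docroot
AllowOverride None
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/png "access plus 1 week"
ExpiresByType text/css "access plus 1 week"
</IfModule>
</Directory>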
Re: (Score:3, Interesting)
What is the performance loss from .htaccess files anyway? For instance, would it be faster to have .htaccess redirect moved pages, or would it be faster to have a server-side script (e.g. PHP, Python, etc.) do the redirecting?
htaccess performance loss (Score:4, Informative)
Basically, when someone requests a file from your site, Apache has to look for, and if it's there, parse, an .htaccess file in every directory along the path from the document root down to the requested file.
And then, should the rules allow the file to be served, it'll be sent to the requestor.
So the problem isn't the directives themselves so much as the extra filesystem checks and parsing on every single request.
As for the question about redirects -- one way or another you have to tell the server how to handle requests for the moved pages and the resulting 404s; a plain-Apache version is sketched below.
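For completeness, the plain-Apache way to handle both cases (in httpd.conf, or in .htaccess on shared hosting) is a Redirect plus an ErrorDocument; the URLs here are placeholders:

# Permanently moved page: the client gets a 301 and the new location
Redirect permanent /old-page.html http://www.example.com/new-page.html
# Anything that still 404s gets handed to a custom page or script
ErrorDocument 404 /notfound.html

Whether this beats doing the redirect in PHP or Python mostly comes down to whether invoking the script costs more than the config lookup, which is worth benchmarking on your own setup.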
Re: (Score:2)
Or it could be added as an option to the already huge config file in the next release. Maybe someone wants to add the feature?
If code is not your expertise, then you can probably pay someone $100 to do it for you.
Re: (Score:2)
The nice part, however, is that web-serving is so easily and cheaply scalable that it's almost pathetic. If your alternative is to buy an extra few megabits (at guaranteed bandwidth rates, not shared-connection rates), then for what you'd pay for bandwidth in a single month, you can throw in another Apache machine to help carry the load.
I'm pretty excited for "hardware season" this year (making purchases to accommodate growth in our peak season). These [supermicro.com] are sexy, cheap, and compact. At 14" deep, I can do
Re: (Score:2)
Re: (Score:2)
The performance loss comes from Apache having to check the current directory and every directory above it up to the webroot for .htaccess files. This means that if you store your images in /foo/bar/etc/images/ and you have 50 images per page, Apache needs to check for 5*50 = 250 .htaccess files just to serve the images. A stat isn't that expensive, but they add up.
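To make that concrete: for a single request to /foo/bar/etc/images/pic.png (filename invented), these are the .htaccess files Apache has to check for, relative to the webroot:

.htaccess
foo/.htaccess
foo/bar/.htaccess
foo/bar/etc/.htaccess
foo/bar/etc/images/.htaccess

Five stats (and possibly parses) per image, times 50 images per page, is where the 250 comes from.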
httpd.conf (Score:5, Informative)
Re:httpd.conf (Score:4, Informative)
So using cache control headers is "news", huh? (Score:4, Informative)
Also, from the comments on this "innovative" article:
1. DrBacchus said:
Yes, these techniques *can* result in performance improvements, but should be put in your main server configuration file rather than in .htaccess files.
Re: (Score:2)
Anyway, next thing to do is teach equivalent techniques to PHP programmers. You, too, can learn the wonders of the HTTP specification!
Re:So using cache control headers is "news", huh? (Score:4, Insightful)
Seriously, Linux's F_NOTIFY has been around since 2.4, and other operating systems have similar facilities.
Re: (Score:3, Informative)
Increase page load times? (Score:1)
Probably not exactly what most people want to do, but yeah, if you can throttle your server so that page load time approaches infinity, bandwidth consumption will approach zero -- especially once people stop trying to use your site...
Go to Apache 2.2 (Score:2)
I meant Decrease page load times (Score:1)
Ooops.. I meant
Decrease page load times
As an example of how I implement this caching scheme..
So the js and css get cached for a week, but if I make a change to one of them the old copy can stick around in browsers for up to a week unless I force a refresh (for example by renaming the file, as mentioned above).
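A week-long js/css rule along those lines might look something like this (a sketch of the general idea, not necessarily the poster's exact config):

<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType text/css "access plus 1 week"
# Some setups serve javascript as text/javascript instead
ExpiresByType application/x-javascript "access plus 1 week"
</IfModule>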
Re: (Score:2)
Some suggestions:
It's important to be aware that the max-age Cache-Control directive is only one part of a site's caching strategy. The presence of Last-Modified and/or ETag headers allows for conditional GETs, which are also great at improving a site's performance: the server can answer with a short 304 Not Modified instead of resending the whole file.
the benefit is a well-thought-out caching scheme (Score:1)
For everyone making the point about the performance hit of running these operations in .htaccess as opposed to the httpd.conf file: yes, I don't think anyone would argue with that, but it is true that this is aimed at the huge number of people on some type of shared hosting environment who can't touch httpd.conf. Besides, you can use the AllowOverride directive in httpd.conf to allow .htaccess in the /z/image/ folder but not in the /z/df/even/cgi-b/live/ folder -- just turn it off where it isn't needed, problem solved (a sketch follows below).
Remember, the article is called "Speed Up Sites with htaccess Caching."
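A sketch of that selective AllowOverride setup, with made-up paths (on Apache 2.x the Header directive needs the FileInfo override class and the Expires* directives need Indexes, if memory serves):

<Directory "/var/www/site">
AllowOverride None
</Directory>
<Directory "/var/www/site/z/image">
# Only this folder pays the .htaccess lookup cost
AllowOverride FileInfo Indexes
</Directory>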