Originally Posted by jimtsap
I just want to add a comment on that...
That would help Google speed, but it will spend some time on decompression on the visitor's CPU. Also, many hosts don't allow gzip, in order not to overload the server's CPU....
Not really true.
The problem is (or was) that those hosters do (or did) not cache the content.
That's why they disabled gzip: to avoid the CPU load of compressing every single request.
I have an older webserver running with several thousand requests every day (over 250,000 per month).
The configuration is like this:
- static content on cookieless domains
- gzip text/html files
- cache static text/html
The expiration date for the static content is set to 365 days.
I run the memcache daemon, mod_disk_cache, mod_deflate (for gzip compression), mod_expires, mod_headers, ...
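A minimal sketch of that kind of setup, assuming Apache 2.4 module and directive names (on the 2.2 series the cache module was called mod_disk_cache; paths and types here are illustrative):

```apache
# Gzip only text-based content; images/archives are already compressed
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>

# Far-future Expires headers for static content (365 days)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 365 days"
    ExpiresByType application/javascript "access plus 365 days"
    ExpiresByType image/png "access plus 365 days"
</IfModule>

# Server-side disk cache, so responses are not regenerated/recompressed per request
<IfModule mod_cache_disk.c>
    CacheEnable disk /
    CacheRoot /var/cache/apache2/mod_cache_disk
</IfModule>
```

The key point is the last block: with a working server-side cache, the gzip cost is paid once per resource, not once per request.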
The server is sleeping most of the day, and it's only a dual-core AMD with 4 GB of RAM.
If you configure your server correctly, there is no problem even with high-traffic websites.
Decompressing gzipped content in the client's browser takes minimal CPU time.
You do not even feel the difference.
Oh, by the way, my webserver also runs a mail server with SpamAssassin/ClamAV doing its work, and that produces higher CPU load than the Apache webserver does.