From: Johannes Baagoe on
Stefan Weiss :

> IIRC, they were discussing the case where the web server wasn't
> equipped or configured to do this automatically.

I understand the exercise of writing a decent compressor in javascript.
It improves knowledge both of compression algorithms and the language.

But if one actually wants to save bandwidth to a meaningful extent,
there is no way such an exercise can compete with well-established
industrial solutions. If the present ISP doesn't provide one, switch
to one that does. Javascript is not an option.

> But since you mention mod_deflate - do you know if there is a
> reason why neither this module nor the older mod_gzip will cache
> the compressed files?

My guess: because it hardly ever is a genuine problem. You would have
to gzip several megabytes per second before the percentage of the time
spent on doing so becomes noticeable on even a modest server. And
since it doesn't make sense to compress already compressed content
like images or sounds or videos, that means several megabytes of *text*
per second. That may be a likely scenario for Google or online banks
or airline reservations, but it isn't for most sites. When it is, a
cache can only be part of a more global solution, and it hardly makes
sense to use a special cache for gzipped content only.
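To put a rough number on that claim, here is a minimal sketch (Python
and the synthetic workload are my assumptions, not anything from the
thread) that times gzip on roughly a megabyte of repetitive markup:

```python
import gzip
import time

# A hypothetical workload: ~1 MB of repetitive text, the kind of
# HTML/JS payload a web server would compress on the fly.
text = b"<p>The quick brown fox jumps over the lazy dog.</p>\n" * 20000

start = time.perf_counter()
compressed = gzip.compress(text, compresslevel=6)
elapsed = time.perf_counter() - start

ratio = len(compressed) / len(text)
print(f"{len(text)} -> {len(compressed)} bytes "
      f"(ratio {ratio:.2%}) in {elapsed * 1000:.1f} ms")
```

On anything like modern hardware the elapsed time is a small fraction
of a second, which is consistent with the point that compression time
only starts to matter at several megabytes of text per second.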

> It seems wasteful to compress the same JS library over and over
> again for every new visitor.

Why would one compress javascript libraries and nothing else? HTML,
XML and plain text benefit no less. If you gzip, gzip all text content.
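For what it's worth, "gzip all text content" is a few lines of Apache
configuration with mod_deflate; the directive is from its documentation,
but the exact list of MIME types here is only an assumption:

```apache
# Sketch: compress all common text types, not just JavaScript.
AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
AddOutputFilterByType DEFLATE application/javascript application/xml
```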

On the other hand, never compress images or sounds or video or anything
else that is already compressed. Doing that may well cause the kind
of problems you envision, because *then* the several megabytes per
second rates become a distinct possibility. The solution, however,
is "Don't gzip MPEGs", not "Cache the gzipped MPEGs".
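A quick sketch (Python is my choice here; random bytes stand in for an
MPEG, since both are high-entropy) shows why gzipping already compressed
data is pure waste:

```python
import gzip
import os

text = b"var x = 0; // the same line over and over\n" * 5000
media = os.urandom(len(text))  # stand-in for MPEG/JPEG: high-entropy bytes

gz_text = gzip.compress(text)
gz_media = gzip.compress(media)

print(f"text:  {len(text)} -> {len(gz_text)} bytes")
print(f"media: {len(media)} -> {len(gz_media)} bytes")  # typically grows slightly
```

The text shrinks dramatically; the incompressible data comes out no
smaller (gzip's framing overhead usually makes it a little larger), so
the CPU time spent on it buys nothing.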

> A pre-generated gzipped version could potentially take some of the
> load off the server.

As far as I can tell after a quick glance, mod_cache can do that.
But I have never needed it, so I haven't any experience in using it.
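In case it helps, caching on disk looks roughly like this; this is only
a sketch from the mod_cache documentation, with a hypothetical cache
path, and I have not tested it either:

```apache
# Sketch: cache responses (including deflated ones) on disk so the
# same resource is not recompressed for every visitor.
CacheEnable disk /
CacheRoot /var/cache/apache2
```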

--
Johannes
From: Stefan Weiss on
On 24/05/10 23:40, Johannes Baagoe wrote:
> Stefan Weiss :
>> But since you mention mod_deflate - do you know if there is a
>> reason why neither this module nor the older mod_gzip will cache
>> the compressed files?
>
> My guess: because it hardly ever is a genuine problem. You would have
> to gzip several megabytes per second before the percentage of the time
> spent on doing so becomes noticeable on even a modest server.

Yes, that seems to be the case. I haven't noticed any significant
change in server load with or without compression, but then I only
manage small servers; the big ones and the clusters all have experts in
charge.

I've heard a couple of lectures about the Deflate algorithm, a long time
ago. Apparently it's fast enough to make it almost unnoticeable when
small (<100k) text files are compressed, at least on modern hardware.
I'm probably worrying about nothing. I just don't like the idea of
wasted cycles.

I think I also read about a technical reason why mod_deflate doesn't
cache the content it has already compressed, something to do with the
order in which the Apache modules are activated, but I don't have a
reference at hand.

>> It seems wasteful to compress the same JS library over and over
>> again for every new visitor.
>
> Why would one compress javascript libraries and nothing else? HTML,
> XML and plain text benefit no less. If you gzip, gzip all text content.

I know. That was a rather transparent attempt to stay on topic in a JS
newsgroup ;)

>> A pre-generated gzipped version could potentially take some of the
>> load off the server.
>
> As far as I can tell after a quick glance, mod_cache can do that.
> But I have never needed it, so I haven't any experience in using it.

Thanks, I'll look into that. mod_cache is installed here on my laptop,
but not enabled by default.


--
stefan
From: Thomas 'PointedEars' Lahn on
David Mark wrote:

> Johannes Baagoe wrote:
>> David Mark :
>>> You can't just plop GZIP files on the server. You have to do content
>>> negotiation and would need two copies of each static file (one
>>> compressed, one not). I know how to do it, I just choose not to.
>>
>> http://httpd.apache.org/docs/2.0/mod/mod_deflate.html
>
> Thanks, I know about that (and other similar scripts) but my host does
> not run Apache. It is an on-the-cheap hosting service that I set up
> mainly to post examples to. I'm going to be switching soon now that I
> actually have something up there I want to promote.

You don't need Apache to do that (there's cgi_buffer, for example). You
have to do content negotiation alright, but you surely don't need "two
copies of each static file". That would be ridiculous. GZIP compression
can be done on the fly, and because it is fast, it comes with almost no
performance loss.
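As a sketch of what that on-the-fly negotiation amounts to (Python
rather than cgi_buffer's Perl/Python/PHP, and with a deliberately
simplified Accept-Encoding check that ignores q-values):

```python
import gzip

def negotiate(body: bytes, accept_encoding: str):
    """Return (payload, headers) for one static text resource,
    gzipping on the fly when the client advertises gzip support.
    A sketch of the negotiation step, not of any particular library."""
    headers = {"Content-Type": "text/html", "Vary": "Accept-Encoding"}
    if "gzip" in accept_encoding.lower():
        body = gzip.compress(body)
        headers["Content-Encoding"] = "gzip"
    headers["Content-Length"] = str(len(body))
    return body, headers

page = b"<html><body>hello</body></html>"
payload, hdrs = negotiate(page, "gzip, deflate")
print(hdrs.get("Content-Encoding"))  # gzip
```

One copy of the file on disk; the compressed representation only ever
exists in the response.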


PointedEars
--
Anyone who slaps a 'this page is best viewed with Browser X' label on
a Web page appears to be yearning for the bad old days, before the Web,
when you had very little chance of reading a document written on another
computer, another word processor, or another network. -- Tim Berners-Lee
From: David Mark on
Thomas 'PointedEars' Lahn wrote:
> David Mark wrote:
>
>> Johannes Baagoe wrote:
>>> David Mark :
>>>> You can't just plop GZIP files on the server. You have to do content
>>>> negotiation and would need two copies of each static file (one
>>>> compressed, one not). I know how to do it, I just choose not to.
>>> http://httpd.apache.org/docs/2.0/mod/mod_deflate.html
>> Thanks, I know about that (and other similar scripts) but my host does
>> not run Apache. It is an on-the-cheap hosting service that I set up
>> mainly to post examples to. I'm going to be switching soon now that I
>> actually have something up there I want to promote.
>
> You don't need Apache to do that (there's cgi_buffer, for example).

Also incompatible with my current host account I'm sure. Trust me, I've
looked into this. The best solution is to move to another host that has
properly configured servers. My current host gives me virtually no
control over such things. It's a glorified FTP site, which is all I
needed until recently.

> You
> have to do content negotiation alright, but you surely don't need "two copies
> of each static file".

You would with my host. I can't just install any old CGI program.
Trust me, I've looked into it. All I could do is write an ASP to do the
negotiation and serve the appropriate file. I've considered and
dismissed the idea (for reasons that should be obvious).

> That would be ridiculous. GZIP compression can be
> done on the fly, and because it is fast, with almost no performance loss.
>

Irrelevant in the case of my account.
From: Thomas 'PointedEars' Lahn on
David Mark wrote:

> Thomas 'PointedEars' Lahn wrote:
>> [optional gzip-compressed HTTP response body]
>> You have to do content negotiation alright, but you surely don't need "two
>> copies of each static file".
>
> You would with my host. I can't just install any old CGI program.

You don't need to install anything with cgi_buffer. You put the scripts
(written in Perl, Python or PHP) where you need them, and include them.
Yes, cgi_buffer is old, but it works well (I use the PHP version because I
want cache control for those, too).

> Trust me, I've looked into it. All I could do is write an ASP to do the
> negotiation and serve the appropriate file. I've considered and
> dismissed the idea (for reasons that should be obvious).

The reasons are not obvious. If you can have the My Library builder in ASP,
there is no good reason why you could not have a cgi_buffer equivalent in
ASP.


PointedEars
--
realism: HTML 4.01 Strict
evangelism: XHTML 1.0 Strict
madness: XHTML 1.1 as application/xhtml+xml
-- Bjoern Hoehrmann