
Journal of Matts (1087)

I work for MessageLabs [messagelabs.com] in Toronto, ON, Canada. I write spam filters, MTA software, high-performance network software, string-matching algorithms, and other cool stuff, mostly in Perl and C.

Sunday March 17, 2002
06:56 AM

Crawlers--

[ #3604 ]

Why don't web crawlers send Accept-Encoding: gzip? It's very frustrating to run AxKit web sites when the biggest bandwidth suckers of them all are crawlers. Simple web browsers use a tenth of the bandwidth of these beasties. Very annoying.
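
For what it's worth, asking for gzip from the crawler side is one extra header plus a decompression step. A minimal sketch, assuming LWP::UserAgent and Compress::Zlib are installed; the URL and bot name are placeholders:

    #!/usr/bin/perl
    # Sketch: a crawler fetch that asks for gzip and decodes it if served.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use Compress::Zlib;   # provides memGunzip()

    my $ua  = LWP::UserAgent->new(agent => 'ExampleBot/0.1');
    my $res = $ua->get('http://example.org/', 'Accept-Encoding' => 'gzip');
    die $res->status_line unless $res->is_success;

    my $body = $res->content;
    if (($res->header('Content-Encoding') || '') =~ /\bgzip\b/) {
        # memGunzip returns undef if the buffer isn't valid gzip data
        my $plain = memGunzip($body);
        die "gunzip failed" unless defined $plain;
        $body = $plain;
    }
    print length($body), " bytes after decoding\n";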

  • It would help them, too, because it would save their bandwidth.

    I was just thinking about mod_gzip the other day and the thought struck me that some day there will probably be a mod_bzip. I wonder when that will be.

    --
    J. David works really hard, has a passion for writing good software, and knows many of the world's best Perl programmers
    • I don't think we will see mod_bzip in the near future. While it does provide a better compression ratio, it is much slower and more CPU-intensive than mod_gzip (a rough comparison sketch follows below).
      --

      Ilya Martynov (http://martynov.org/ [martynov.org])
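
      A rough way to check that trade-off yourself, assuming the IO::Compress::Gzip and IO::Compress::Bzip2 CPAN modules and some local sample file ('page.html' here is just a placeholder):

      #!/usr/bin/perl
      # Sketch: compare gzip vs. bzip2 output size and CPU cost on one input.
      use strict;
      use warnings;
      use Benchmark qw(timethese);
      use IO::Compress::Gzip  qw(gzip);
      use IO::Compress::Bzip2 qw(bzip2);

      # Slurp a sample document; 'page.html' is a placeholder path.
      my $data = do { local $/; open my $fh, '<', 'page.html' or die $!; <$fh> };

      my ($gz, $bz);
      gzip(\$data => \$gz)  or die "gzip failed";
      bzip2(\$data => \$bz) or die "bzip2 failed";
      printf "input %d, gzip %d, bzip2 %d bytes\n",
          length $data, length $gz, length $bz;

      # bzip2 usually wins on size and loses on CPU time.
      timethese(50, {
          gzip  => sub { my $out; gzip(\$data => \$out)  },
          bzip2 => sub { my $out; bzip2(\$data => \$out) },
      });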

    • We'll have to wait until CPUs become a touch more powerful, but at least Konqueror already supports Content-encoding: bz, so it could happen :) (and nothing keeps you from having bzip-encoded files on your box so that Konqueror can do some content negotiation and fetch those; a serving sketch follows below).

      --
      Robin Berjon [berjon.com]
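
      On the server side, that negotiation can be as small as checking Accept-Encoding and handing back a pre-compressed variant. A sketch as a plain CGI script; the file names are placeholders, and the exact encoding token a given browser sends (bz, bzip2, x-bzip2) is worth verifying rather than assuming:

      #!/usr/bin/perl
      # Sketch: serve a pre-compressed variant when the client accepts it.
      use strict;
      use warnings;

      my $accept = $ENV{HTTP_ACCEPT_ENCODING} || '';
      my $file   = 'page.html';          # placeholder document
      my @headers;

      if ($accept =~ /\bbz(?:ip2)?\b/ && -e "$file.bz2") {
          $file = "$file.bz2";
          push @headers, 'Content-Encoding: bzip2';
      }
      elsif ($accept =~ /\bgzip\b/ && -e "$file.gz") {
          $file = "$file.gz";
          push @headers, 'Content-Encoding: gzip';
      }

      binmode STDOUT;
      print "Content-Type: text/html\r\n";
      print "$_\r\n" for @headers;
      print "\r\n";
      open my $fh, '<', $file or die $!;
      binmode $fh;
      print while <$fh>;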