You could use Perl and LWP: first build the list of pages you need to
look at, then fetch each one and count its bytes as they come in (which
keeps memory use down), adding them to a running total. It would still
take about as long as grabbing the whole site, since every page has to
come over the wire anyway, but it's much easier on disk space because
nothing ever gets written out.
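
Something like the sketch below (untested, just to show the shape of it)
would do that crawl with LWP::UserAgent and HTML::LinkExtor, both from the
libwww-perl distribution. The starting URL is a placeholder, and you'd
probably want smarter handling of query strings and non-HTTP links:

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;

my $start  = 'http://www.example.com/';   # placeholder -- the site you want to size up
my $domain = URI->new($start)->host;

my $ua    = LWP::UserAgent->new;
my %seen;
my @queue = ($start);
my $total = 0;

while (my $url = shift @queue) {
    next if $seen{$url}++;

    my $resp = $ua->get($url);
    next unless $resp->is_success;

    my $body = $resp->content;
    $total += length($body);               # count the bytes; nothing is written to disk

    # Only HTML pages get parsed for further links, and only links that
    # stay on the original host go back into the queue.
    next unless $resp->content_type eq 'text/html';

    my $extor = HTML::LinkExtor->new(undef, $url);   # base URL makes the links absolute
    $extor->parse($body);
    for my $link ($extor->links) {
        my ($tag, %attr) = @$link;
        for my $abs (values %attr) {
            next unless $abs->scheme && $abs->scheme eq 'http';
            next unless $abs->host eq $domain;
            $abs->fragment(undef);          # don't count page#anchor as a new page
            push @queue, $abs->as_string;
        }
    }
}

print "$domain: $total bytes\n";

If some of the files are large, get() also accepts a ':content_cb' option
that hands the data to a callback in chunks, so you could just add up
length() on each chunk instead of holding whole files in memory.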
Eden
Eden.Li@asu.edu
Wes Bateman <wbateman@epicrealm.com> wrote:
> Hey Guys:
>
> I've been playing around and trying to see how large a given website is
> remotely. I've been using wget and du. It can take a long time, and I
> really don't want to mirror the site (I erase the data right after I do
> it). It seems a shame to waste so much time and bandwidth when all I
> really want is the total space (size in bytes) that a website occupies.
> I've been following links to unlimited depth, but only staying within
> the original domain.
>
> There must be a way to get this info without either sucking the whole
> site down or having access to the webserver?
>
> Anyone have any ideas, suggestions, etc.?
>
> Thanks,
>
> Wes
>
>