automation of collecting web site size

Author: Wes Bateman
Date:
Subject: automation of collecting web site size
Hey Guys:

I've been playing around, trying to see how large a given website is
remotely. I've been using wget and du, but it can take a long time, and
I really don't want to mirror the site (I erase the data right after I
do it). It seems a shame to waste so much time and bandwidth when all I
really want is the total space (size in bytes) that a website occupies.
I've been following links to infinite depth, but staying within the
original domain.
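For reference, this is roughly what I've been doing (illustrative only;
example.com stands in for the real site, and the exact flags may vary):

  # Mirror the site, staying within one domain, measure it, then throw it away.
  wget --recursive --level=inf --no-parent --domains=example.com http://example.com/
  du -sb example.com/     # total size in bytes of the mirrored tree (GNU du)
  rm -rf example.com/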

There must be a way to get this info without either sucking the whole
site down or having access to the web server, right?
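One half-formed idea: let wget walk the links in spider mode, which asks
the server about each URL without keeping anything on disk, and add up
the sizes it reports. Something like this might work (untested, and the
total would only be approximate, since responses without a
Content-Length header won't be counted):

  # --spider: don't save anything; --server-response: print headers (to stderr).
  # HTML pages are still fetched so wget can find further links, then discarded.
  wget --spider --recursive --level=inf --no-parent --server-response \
       http://example.com/ 2>&1 \
    | awk 'tolower($1) == "content-length:" { sum += $2 }
           END { printf "%d bytes (approximate)\n", sum }'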

Anyone have any ideas, suggestions, etc.?

Thanks,

Wes