Tracking file storage space use
Alan Dayley
alandd at consultpros.com
Mon Jul 6 14:43:14 MST 2009
Good input, everyone. Thanks.
I'll try the script over the next few days.
Alan
On Mon, Jul 6, 2009 at 2:05 PM, Bob Elzer <bob.elzer at gmail.com> wrote:
> I found this Perl script in Linux Journal. What's great is that it creates
> a web page that everyone can look at to see who the disk hogs are.
>
> Article http://www.linuxjournal.com/article/2416
>
> The Linux Gazette version, http://linuxgazette.net/issue18/disk_hog.html,
> has a better picture of the generated web page at the end.
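>
> This isn't the article's actual script, just a minimal shell sketch of the
> same idea, assuming user data lives under /home and the web root is
> /var/www/html: summarize per-directory usage and publish it as a
> bare-bones HTML page.
>
>   #!/bin/sh
>   # Sum up each home directory's usage, largest first, and write the
>   # result out as a simple HTML page everyone can browse to.
>   {
>     echo "<html><body><h1>Disk usage by home directory</h1><pre>"
>     du -sk /home/* 2>/dev/null | sort -rn |
>       awk '{ printf "%10.1f MB  %s\n", $1/1024, $2 }'
>     echo "</pre></body></html>"
>   } > /var/www/html/diskhog.html
>
> Drop it in cron and the page stays current without anyone having to ask.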
>
> -----Original Message-----
> From: plug-discuss-bounces at lists.plug.phoenix.az.us
> [mailto:plug-discuss-bounces at lists.plug.phoenix.az.us] On Behalf Of Eric
> Shubert
> Sent: Monday, July 06, 2009 9:19 AM
> To: plug-discuss at lists.plug.phoenix.az.us
> Subject: Re: Tracking file storage space use
>
> Alan Dayley wrote:
>> I have a server running Red Hat Enterprise Linux 5. It's running very
>> well, but lately we have been running out of disk space on occasion.
>> The truth is we need more storage, and that solution is coming. In the
>> meantime, I need to figure out where all the space is being consumed.
>>
>> Every once in a while I can see 3-5 GB get consumed in about a day.
>> Then, when I warn everyone we are running out, this space suddenly
>> comes free. I think a user is eating the space and then freeing it up
>> when my warning goes out, but none of the users will admit to this
>> behavior. That's not a big deal because, whether it's a user or not,
>> I'd like to know what or who is eating this space and then releasing it.
>>
>> The server is running Samba shares for /home and other directories,
>> Bugzilla with MySQL as the database, TWiki, Subversion, CVS and ftp
>> services. Tracking each of these individually may be a bear. I was
>> thinking there may be a tool that tracks recent usage at the file
>> system level.
>>
>> What tools can I use to get a handle on this issue and increase my
>> knowledge about disk usage?
>>
>> Alan
>
> I would think that a periodic find command could suffice. You could write a
> find command that would "find all of the files over 1 gig that were created
> in the last 24 hours", then put it in cron.daily/. You can tailor the find
> command to suit your situation.
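>
> A minimal sketch of such a script, assuming /home is the volume in
> question and that root reads its mail; note that ext3 has no creation
> timestamp, so modification time (-mtime) is the usual stand-in for
> "created in the last 24 hours":
>
>   #!/bin/sh
>   # /etc/cron.daily/bigfiles -- report files over 1 GiB modified in the
>   # last day. If your find lacks the +1G suffix, use -size +1048576k.
>   find /home -xdev -type f -size +1G -mtime -1 -exec ls -lh {} \; \
>     2>/dev/null | mail -s "Large recent files on $(hostname)" root
>
> Tighten or loosen the size and age thresholds as the reports come in.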
>
> --
> -Eric 'shubes'
>