RE: too much data on disk~ 'x' won't start

Author: Craig White
Date:  
To: plug-discuss
Subject: RE: too much data on disk~ 'x' won't start
On Wed, 2004-06-30 at 10:30, Jeffrey Pyne wrote:
> On Wednesday, June 30, 2004 10:17 AM, Michael Havens wrote:
> >
> > There is so much data on my disk that when I started Linux
> > this morning 'x' wouldn't start.
> > Where could all that extra data that is filling my disk be?
>
> You can use the df and du commands to track this down. For example:
>
> -bash-2.05b$ df -k
> Filesystem           1K-blocks      Used Available Use% Mounted on
> /dev/hda2              6048352   6048352         0 100% /
> /dev/hda1               101086      8650     87217  10% /boot
> /dev/hda3              9448100   8192648    775512  92% /home
> /dev/hdb1            115380192  11906844  97612312  11% /apps1
> /dev/hde1            118169876  32456600  79710592  29% /apps2
> /dev/hdg1            118176996 101251552  10922404  91% /apps3
> none                    516128         0    516128   0% /dev/shm

>
> From this you can see that the / filesystem is 100% full. Then you can use
> du to track down which directory in that filesystem is the biggest:
>
> -bash-2.05b$ du -sk * | sort -n
> 0       misc
> 4       initrd
> 4       opt
> 4       razor-agent.log
> 44      mnt
> 424     dev
> 4536    boot
> 4964    bin
> 7920    tmp
> 13340   sbin
> 31108   etc
> 85316   lib
> 248656  var
> 922661  proc
> 3630148 usr

>
> Here you can see that the /usr directory is the biggest. So you'd cd /usr
> and repeat that du command:
>
> -bash-2.05b$ du -sk * | sort -n
> 0       tmp
> 4       dict
> 4       etc
> 72      kerberos
> 468     doc
> 3912    games
> 9928    libexec
> 39916   sbin
> 60440   java
> 72276   include
> 123288  X11R6
> 196876  bin
> 277260  src
> 496144  local
> 1075940 share
> 1273616 lib

>
> Uh oh, looks like /usr/lib is big. Keep descending into the filesystem
> until you find the culprit.
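A minimal sketch of automating that descent, assuming GNU du and a
Bourne-style shell; the starting directory is just an argument,
defaulting to /:

#!/bin/sh
# Starting from a directory (default /), keep stepping into whichever
# child is largest until the largest entry is no longer a directory.
dir=${1:-/}
while :; do
    biggest=`du -skx "$dir"/* 2>/dev/null | sort -n | tail -1 | cut -f2`
    if [ -z "$biggest" ] || [ ! -d "$biggest" ]; then
        break
    fi
    echo "descending into $biggest"
    dir=$biggest
done
echo "largest directory found: $dir"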

-----
I like this concept, but I am a bit slow at byte arithmetic...

df -h            (human readable; they had me in mind)
du -sh /home/*   gives me a summary of each folder (again, human readable)
du -sh /root/*   you get the idea
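
And if your sort is new enough to understand those human-readable units
(GNU coreutils 7.5 and later add a -h flag), you can keep the friendly
output and still get it ordered; the paths below are just examples:

du -sh /home/* | sort -h          # smallest to largest
du -sh /home/* | sort -rh | head  # the biggest offenders first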

Oh, and one for the road... the script that I wrote (with PLUG help) for
checking user profiles and emailing users whose profiles are getting
large (it can be adapted pretty easily):

# cat chksize-users-profiles
#!/bin/sh
# Check the size of each samba roaming profile and mail the owner of
# any profile that is over the limit.
wfile1=/root/scripts/users-homes             # du output: size<TAB>path
wfile2=/root/scripts/profiles-exceed-limits  # only the profiles over the limit
chdirect=/home/filesystems/samba/profiles
size_plateau=153600                          # limit in 1K blocks (150 MB)

du -s $chdirect/* > $wfile1
> $wfile2

# Field 1 of the du output is the size; keep every line whose size
# exceeds the limit.
for i in `cut -f1 $wfile1`; do
    if [ $i -gt $size_plateau ]; then
        grep $i $wfile1 >> $wfile2
    fi
done

# Field 2 is the profile path; ${i##/*/} strips the leading directories,
# leaving the username, which is also the mail recipient.
for i in `cut -f2 $wfile2`; do
    echo "Your profile is getting too large. You probably need to clean up
your email or move some of the files in your My Documents folder. You might
want to check out http://www.mullenpr.com/support/profiles.html for
suggestions on how to reduce the size of your stored profile." |
    mail -s "You are close to reaching your quota for file space on the server" ${i##/*/}
done
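
If you want to run something like that unattended, a hypothetical
crontab entry (the path and schedule are my own assumptions, not part of
the script above) might look like:

# check profile sizes every night at 2:30 AM
30 2 * * * /root/scripts/chksize-users-profiles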

Craig

---------------------------------------------------
PLUG-discuss mailing list -
To subscribe, unsubscribe, or to change your mail settings:
http://lists.PLUG.phoenix.az.us/mailman/listinfo/plug-discuss