oops... too many files and can't clean up...
John (EBo) David
plug-discuss@lists.plug.phoenix.az.us
Thu, 20 Jun 2002 20:26:36 -0700
ummm....
I have a unit and regression test suite for my ecological modeling
virtual machine. I needed to bump up one of the tests to run for a
longer time for model testing. The problem was that I forgot I am
creating an image dump for *every* variable specified on each and every
iteration... start_time=0, stop_time=25, dt=0.01... that is 2,500 images
for each of umm... looks like 8 variables, and there are 15 other unit
tests... So now I find that I have over 25,000 files in a single
directory. oops. Ok, off to clean them up....
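(Doing the math: (25 - 0) / 0.01 = 2,500 iterations, times 8 variables,
is 20,000 images from that one test alone; the other tests' leftovers
push the directory past 25,000 entries.)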
First, ls and rm complain that there are too many files to "rm *.pgm",
so I go through and delete them by group name... ok, that appears to
work. Now I am finally able to "rm *.pgm", so they should all be
cleaned up. The problem is that once I do that, "ls" still reports
hundreds of pgm files in the directory, but "ls *meta_pop*" does not.
I am afraid that I have corrupted the file system or something.
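(Thinking out loud: I believe the "too many files" complaint is the
shell's argument-length limit blowing up on the expanded glob, not rm
itself, so something like

    find . -maxdepth 1 -name '*.pgm' -print0 | xargs -0 rm

should sidestep the glob entirely; untested here, but find and xargs
do not care how many names match. And something like

    ls -b | grep pgm | head

should show, with any odd characters escaped, what the leftover "pgm"
entries really are.)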
any suggestions?
thoughts:
shut down the machine, reboot single user, fsck every partition
(including the XFS partitions), and recite some prayer to Boolean...
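(One wrinkle, if I remember right: fsck proper does not actually check
XFS; the xfsprogs tools xfs_check or xfs_repair do that against the
unmounted device, something like

    umount /dev/hda5
    xfs_check /dev/hda5

where /dev/hda5 is just a stand-in for whichever partition holds the
test directory.)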
other ideas, thoughts, intuitions as to what happens when creating tens
of thousands of files in a single directory by accident?
EBo --