Re: Number of files in a directory / performance

Author: Darrin Chandler
Date:  
To: Main PLUG discussion list
Subject: Re: Number of files in a directory / performance
On Sun, Nov 19, 2006 at 08:00:11PM -0800, keith smith wrote:
>
> About 18 years ago I was working on a file/server (not client/server) based desktop database that ran on DOS.
>
> The maker of the product suggested that we put no more than 300 files in each directory, because DOS would slow down and we would see a degradation in performance.
>
> I was thinking of this and wondering whether such a limit exists in Linux in a web server environment. In other words, if I have, say, 500 pages in one directory, is that (or some other quantity of pages) going to cause the web server to slow down?


*nix has less of a problem with this than DOS did. For any filesystem
and operating system there are parameters that can affect performance.
You usually won't need to tweak these at all, even with a couple of
thousand files. This falls under the Rules of Optimization (paraphrased
from memory, sorry):

1. Don't do it.
2. Don't do it yet.
3. Profile. (optimize the right thing)
4. Measure before and after. (did you help or hurt? See the sketch below.)
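
If you want to check rather than guess, a quick throwaway program is
enough. Here is a minimal sketch (mine, not from any particular tool):
it fills a scratch directory with a few thousand empty files and then
times a stat() of each name, which is where a slow directory lookup
would show up. The directory name and file count are arbitrary, so
adjust them and run it before and after any tweak you try.

/* Rough benchmark sketch: create NFILES empty files in a scratch
 * directory, then time how long it takes to stat() each one by name. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/stat.h>
#include <sys/time.h>

#define NFILES  5000                 /* arbitrary; try 500, 5000, 50000 */
#define TESTDIR "./lookup-test"      /* hypothetical scratch directory  */

int main(void)
{
    char path[256];
    struct stat st;
    struct timeval t0, t1;
    int i;

    if (mkdir(TESTDIR, 0755) == -1) {
        perror("mkdir");
        return 1;
    }

    /* Populate the directory with empty files. */
    for (i = 0; i < NFILES; i++) {
        int fd;
        snprintf(path, sizeof(path), TESTDIR "/file%06d", i);
        fd = open(path, O_CREAT | O_WRONLY, 0644);
        if (fd == -1) { perror("open"); return 1; }
        close(fd);
    }

    /* Time the name lookups. */
    gettimeofday(&t0, NULL);
    for (i = 0; i < NFILES; i++) {
        snprintf(path, sizeof(path), TESTDIR "/file%06d", i);
        if (stat(path, &st) == -1) { perror("stat"); return 1; }
    }
    gettimeofday(&t1, NULL);

    printf("%d lookups in %.3f seconds\n", NFILES,
           (t1.tv_sec - t0.tv_sec) + (t1.tv_usec - t0.tv_usec) / 1e6);
    return 0;
}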

If you want to see examples of a real NEED to tweak, and how to do it,
Google for what people have done when running NNTP servers (absolutely
HUGE numbers of small files, added and removed continuously). *IF* you
need to tweak, you must take your particular situation into account.
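
For reference, one common workaround when a single directory really
does get too big is to spread the files over hashed (or otherwise
partitioned) subdirectories, so no one directory holds more than a
small slice of them. A rough sketch of the idea, with a made-up hash
and layout:

/* Map a name like "article12345" to a path like "spool/a7/3f/article12345"
 * so files end up spread across 256*256 small subdirectories. */
#include <stdio.h>

static void hashed_path(const char *name, char *out, size_t outlen)
{
    unsigned int h = 5381;
    const char *p;

    for (p = name; *p; p++)          /* simple djb2-style string hash */
        h = h * 33 + (unsigned char)*p;

    snprintf(out, outlen, "spool/%02x/%02x/%s",
             (h >> 8) & 0xff, h & 0xff, name);
}

int main(void)
{
    char path[512];

    hashed_path("article12345", path, sizeof(path));
    printf("%s\n", path);            /* prints the two-level hashed path */
    return 0;
}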

-- 
Darrin Chandler            |  Phoenix BSD Users Group
   |  http://bsd.phoenix.az.us/
http://www.stilyagin.com/  |
---------------------------------------------------
PLUG-discuss mailing list - 
To subscribe, unsubscribe, or to change your mail settings:
http://lists.PLUG.phoenix.az.us/mailman/listinfo/plug-discuss