Number of files in a directory / performance

der.hans PLUGd at LuftHans.com
Sun Nov 19 23:13:10 MST 2006


On 19 Nov 2006, keith smith wrote:

> About 18 years ago I was working on a file-server (not client/server) based desktop database that ran on DOS.
>
> The maker of the product suggested that we put no more than 300 files in each directory because DOS would slow down and performance would degrade.
>
> I was thinking of this and wondering whether such a limit exists in Linux in a web server environment.  In other words, if I have, say, 500 pages in one directory, is that (or some other quantity of pages) going to cause the web server to slow down?

Depends on the filesystem and the utilities. I think shells and ext2
still don't do well with more than a few thousand files in a single
directory. I haven't tested it in a while, and I don't know whether ext3
has the same type of issues.
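
If you want to check your own setup, a quick and dirty test is easy
enough. Below is a minimal sketch (assuming Python is available on the
box; the file count, the file names, and the timing approach are purely
illustrative, not a rigorous benchmark) that creates a few thousand
empty files in a temporary directory, then times a single stat and a
full directory listing:

    import os
    import tempfile
    import time

    # Illustrative file count; bump it up to see where your
    # filesystem starts to hurt.
    N = 5000

    with tempfile.TemporaryDirectory() as d:
        # Create N empty files in one directory.
        for i in range(N):
            open(os.path.join(d, f"page{i}.html"), "w").close()

        # Time a single lookup by name.
        start = time.perf_counter()
        os.stat(os.path.join(d, f"page{N - 1}.html"))
        lookup = time.perf_counter() - start

        # Time listing the whole directory.
        start = time.perf_counter()
        names = os.listdir(d)
        listing = time.perf_counter() - start

        print(f"{len(names)} files: stat {lookup:.6f}s, listdir {listing:.6f}s")

Run it on the filesystems you care about and compare; wherever the
times start to jump as you raise the file count is your practical
limit for that filesystem.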

ReiserFS should handle that type of situation just fine.

Apache should handle it fine as well.

If some of those pages are including files from the full directories,
there might be an issue. I would think PHP, Perl, Java, etc. should be
able to handle that fine, though.

ciao,

der.hans
-- 
#  https://www.LuftHans.com/        http://www.CiscoLearning.org/
#  Join the League of Professional System Administrators  https://LOPSA.org/
#  "Human kind cannot bear very much reality."
#    -- T.S. Eliot, "Four Quartets: Burnt Norton"
