Per process memory limit?
Kevin Brown
kevin_brown@qwest.net
Mon, 15 Jan 2001 22:24:58 -0700
> > I'm doing some NIDS stuff (Network Intrusion Detection System) and I've
> > encountered an interesting problem. A cron job stops the NIDS once a day and
> > moves the logs. I then transfer the logs to another box to convert the flat
> > text logs into crosslinked html. The system it's doing the conversion on is a
> > dual PII 450 with 2GB RAM (used to be 512MB). My problem is that the perl
> > script errors out with an Out of Memory message, yet, watching top on the
> > box, the process never exceeds 40% of the RAM, and while there is a lot of
> > cached data, nothing else on the system is using more than 1% of RAM. So
> > the only thing I can guess is that there is a per-process limit on memory.
> > Do I have this right and if so, how do I increase the limit?
>
> Did the ulimit mods fix the problem, or is it something else? Any way
> to figure out when the error occurs, specifically? The reason I ask
> this is that if you're regularly using up to 40% of the RAM on a 2GB
> system processing logfiles (!) for NIDS, something is probably wrong
> on a fundamental level... unless the daily logfiles in question are
> several hundreds of megs in length (possible I guess, for an ISP..)
>
> Either way, look to the future... It seems as though an increase in
> the amount of data the script is asked to handle could easily break
> the system in other ways, if it's using that much memory to start with,
> even if hard limits are raised.
Running ulimit from the shell reports unlimited. The log files are around
100-200MB in size for each day that I'm trying to process (one day's worth
at a time).
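
As I understand it, a bare ulimit in bash reports only the file size limit,
so the next step is to dump the full set of limits the script actually runs
under from inside perl. A minimal check, assuming the BSD::Resource module
from CPAN is installed (and noting that RLIMIT_AS may not be defined on
every platform):

    #!/usr/bin/perl -w
    # Print the per-process memory limits this process actually runs under.
    # Assumes the BSD::Resource module from CPAN; RLIMIT_AS may not be
    # defined on every platform.
    use strict;
    use BSD::Resource;

    my %limit = (
        RLIMIT_DATA => RLIMIT_DATA,   # data segment size
        RLIMIT_AS   => RLIMIT_AS,     # total address space
    );

    foreach my $name (sort keys %limit) {
        my ($soft, $hard) = getrlimit($limit{$name});
        printf "%-12s soft=%-12s hard=%s\n", $name,
               map { $_ == RLIM_INFINITY ? 'unlimited' : $_ } ($soft, $hard);
    }

If either soft limit comes back as a number rather than unlimited, then
setrlimit() from the same module, or ulimit -d / ulimit -v in the shell that
launches the cron job, should raise it up to the hard limit.
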
The only thing I can think of is that either perl (5.005) or the script
itself can't use that much RAM. Running the script on an Enterprise 450
(dual 480MHz UltraSPARC with 2GB RAM, Solaris 7, perl 5.005), it works fine
and can process several days' worth of logs at one time. On this box, only
one day out of the two weeks' worth of logs could be processed, and the
funny thing is that it isn't the smallest of the logs.
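
My guess is that the script slurps the whole log, or builds one huge data
structure, before it writes any HTML. The direction I have in mind is to
process the file a record at a time and emit output as it goes. A rough
sketch, with a made-up tab-separated layout standing in for the real log
format:

    #!/usr/bin/perl -w
    # Rough sketch of line-at-a-time processing; the split pattern and
    # field names are placeholders and would have to match the real
    # NIDS log format.
    use strict;

    my %hits;    # small per-source summary kept in memory

    open(LOG, "< $ARGV[0]") or die "can't open $ARGV[0]: $!\n";
    while (<LOG>) {
        chomp;
        my ($stamp, $src, $dst, $sig) = split /\t/;
        $hits{$src}++;
        # emit the HTML for this record here instead of saving it all up
    }
    close(LOG);

    foreach my $src (sort keys %hits) {
        print "$src\t$hits{$src}\n";
    }

That keeps memory roughly proportional to the number of distinct sources
rather than to the size of the file.
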
I'm also looking into switching from flat files to logging to a remote SQL
server (MySQL or PostgreSQL) and using PHP to dig through the database.
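
On the logging/conversion side, that would look something like the following
with DBI. The DSN, credentials, table name, and columns here are all
placeholders, and it assumes DBI plus DBD::mysql (or DBD::Pg) are installed:

    #!/usr/bin/perl -w
    # Sketch of loading one day's flat log into a remote database with DBI.
    # The DSN, credentials, table, and column layout are placeholders.
    use strict;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:database=nids;host=dbhost',
                           'nids', 'secret',
                           { RaiseError => 1, AutoCommit => 0 });

    my $sth = $dbh->prepare(
        'INSERT INTO alerts (stamp, src, dst, signature) VALUES (?, ?, ?, ?)');

    open(LOG, "< $ARGV[0]") or die "can't open $ARGV[0]: $!\n";
    while (<LOG>) {
        chomp;
        my ($stamp, $src, $dst, $sig) = split /\t/;   # same made-up layout
        $sth->execute($stamp, $src, $dst, $sig);
    }
    close(LOG);

    $dbh->commit;
    $dbh->disconnect;

The PHP side then becomes a handful of SELECTs against the alerts table
instead of grinding through a couple hundred MB of flat text.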