Per process memory limit?

Jason jkenner@mindspring.com
Tue, 16 Jan 2001 05:23:17 +0000


Kevin Brown wrote:
> I'm doing some NIDS stuff (Network Intrusion Detection System) and I've
> encountered an interesting problem.  A cron job stops the NIDS once a day and
> moves the logs.  I then transfer the logs to another box to convert the flat
> text logs into crosslinked html.  The system it's doing the conversion on is a
> dual PII 450 with 2GB RAM (used to be 512MB).  My problem is that the Perl
> script errors out with an Out of Memory message, yet watching top on the box,
> the process never exceeds 40% of the RAM, and while there is a lot of cached
> data, nothing else on the system is using more than 1% of RAM.  So the only
> thing I can guess is that there is a per process limit on memory.  Do I have
> this right and if so, how do I increase the limit?
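
There is such a limit per process: the kernel enforces rlimits
(RLIMIT_DATA, RLIMIT_AS, and friends), which the shell exposes through
ulimit. 'ulimit -a' shows the current soft limits, and something like
'ulimit -d unlimited' raises the data-segment cap for that shell and
its children. A minimal sketch of checking them from inside a program,
assuming Python and its standard resource module:

    import resource

    # Each rlimit is a (soft, hard) pair; RLIM_INFINITY means no cap.
    for name in ("RLIMIT_AS", "RLIMIT_DATA", "RLIMIT_RSS"):
        soft, hard = resource.getrlimit(getattr(resource, name))
        print(name, soft, hard)

    # A process may raise its own soft limit up to the hard limit;
    # raising the hard limit requires root.
    # resource.setrlimit(resource.RLIMIT_DATA, (new_soft, new_hard))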

Did the ulimit mods fix the problem, or is it something else? Is there
any way to figure out exactly when the error occurs? The reason I ask
is that if you're regularly using up to 40% of the RAM on a 2GB
system processing logfiles (!) for NIDS, something is probably wrong
at a fundamental level... unless the daily logfiles in question are
several hundred megabytes long (possible, I guess, for an ISP...).
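
One cheap way to pin down exactly when it dies, assuming the box is
Linux and you can drop a few lines into the script: log the kernel's
own accounting from /proc at each stage of the run. A rough sketch in
Python (the log_memory() helper and its call site are hypothetical):

    def log_memory(tag):
        # VmSize is total virtual memory, VmRSS the resident portion.
        with open("/proc/self/status") as status:
            for line in status:
                if line.startswith(("VmSize", "VmRSS")):
                    print(tag, line.strip())

    log_memory("after loading logfile")

If VmSize marches steadily upward between stages, you will at least
know which step to blame.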

Either way, look to the future: if the script is using that much
memory to start with, an increase in the amount of data it is asked to
handle could easily break the system in other ways, even if the hard
limits are raised.
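
If the script turns out to be slurping each logfile into memory before
converting it, the durable fix is to stream it instead. A sketch of the
idea, with parse_line() and emit_html() standing in for whatever the
real converter does:

    def convert(path):
        # Iterating over the handle reads one line at a time, so memory
        # use stays flat no matter how large the logfile grows.
        with open(path) as log:
            for line in log:
                record = parse_line(line)   # hypothetical parser
                emit_html(record)           # hypothetical writer

That scales with the size of a line, not the size of a day's traffic.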


-- 
jkenner @ mindspring . com__
I Support Linux:           _> _  _ |_  _  _     _|
Working Together To       <__(_||_)| )| `(_|(_)(_|
To Build A Better Future.       |