Per process memory limit?

sinck@ugive.com
Tue, 16 Jan 2001 07:25:00 -0700


\_ running ulimit from the shell gives a response of unlimited.  The
\_ log files are around 100 - 200MB in size for each day that I'm
\_ trying to process (one day's worth at a time).  The only thing I
\_ can think of is that either perl (5.005) or the script can't use
\_ that much RAM.  Running the script on an Enterprise 450 (Dual
\_ 480MHz Ultra Sparc w/ 2GB RAM, Solaris 7, perl 5.005) the script
\_ works fine and is able to process several days' worth of logs at one
\_ time.  Only one day of the two weeks' worth of logs could be
\_ processed, and the funny thing is that it isn't the smallest of the
\_ logs.
\_ I'm looking into switching from flat files to logging to a remote
\_ SQL server (MySQL or PostgreSQL) and using PHP to dig through the
\_ db.

Hum.  Is the algorithm exponential, or does it have dangerous regexen in it?

Dangerous regexen would be things along the lines of 

$foo = "=XX=============================================================";
# no X follows the run of '='s, so this match can only fail
print "Hi\n" if ($foo =~ /X(.+)+X/);

Notice in particular the (.+)+ formation: before the engine can give up,
the nested quantifiers have to try every possible way of splitting that
run of '='s between them, which is exponential in its length.  You'll be
off on a scenic trip to warmer climes in a handbasket before you know it.

It's safe to run, and best yet, with only slight modification, it can
run in browsers with the same results.
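
If you'd rather see the slope than take the whole scenic trip, here is a
quick sketch (mine, not from any script above, and it assumes Time::HiRes
is installed) that times the failing match as the string grows one '=' at
a time, next to the same match rewritten without the nested quantifier:

#!/usr/bin/perl -w
# Rough sketch: compare the nested-quantifier pattern against a flat
# rewrite on strings that cannot match (no X after the run of '='s).
use strict;
use Time::HiRes qw(time);

for my $n (16 .. 22) {
    my $foo = '=XX' . ('=' x $n);     # no closing X, so both matches must fail

    my $t0   = time;
    my $slow = $foo =~ /X(.+)+X/;     # nested quantifier: exponential backtracking
    my $t1   = time;
    my $fast = $foo =~ /X[^X]*X/;     # only one way to carve up the '='s: linear
    my $t2   = time;

    printf "n=%2d  nested %.4fs   flat %.6fs\n", $n, $t1 - $t0, $t2 - $t1;
}

The [^X]* version can only match the run of '='s one way, so a failing
match is abandoned after a single pass; the nested (.+)+ version should
roughly double in time for every extra character, if your engine
backtracks naively.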

YMMV.  In fact, you may not have mileage at all.

David