Ulimit not limiting memory usage

ulimit

When writing programs, there are times when a runaway program slurps half of my RAM (generally because of a practically infinite loop building a large data structure), making the system so slow that I can't even kill the offending program. So I want to use ulimit to kill my program automatically when it starts using an abnormal amount of memory:

$ ulimit -a
core file size          (blocks, -c) 1000
data seg size           (kbytes, -d) 10000
scheduling priority             (-e) 0
file size               (blocks, -f) 1000
pending signals                 (-i) 6985
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) 10000
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 6985
virtual memory          (kbytes, -v) 100000
file locks                      (-x) unlimited
$ ./run_program

But why is my program still using more RAM than the limit I set (and yes, I'm starting the program in the same bash shell)?

Have I misunderstood something about ulimit?
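
For reference, here is a stand-in for the kind of runaway loop I mean (purely illustrative, not my actual program):

#!/bin/bash
# hog.sh - stands in for a buggy loop that keeps growing a data structure
data=$(head -c 1048576 /dev/zero | tr '\0' 'x')   # start with 1 MiB of 'x'
while true; do
    data="$data$data"                             # double it every iteration, without bound
done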

Best Answer

  • Your example should work the way you expect (the program gets killed after consuming too much RAM). I just did a small test on my shell server:

    First I restricted my limits to be REALLY low:

    ulimit -m 10
    ulimit -v 10
    

    That led to just about everything getting killed: ls, date and other small commands were shot down before they even began.

    Which Linux distribution are you using? Does your program use only a single process, or does it spawn lots of child processes? In the latter case ulimit is not always effective. (A way to repeat the low-limit test without crippling your whole shell is sketched below.)
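
    You can set the limits in a throwaway subshell, so that only the test run is affected and your interactive shell keeps its normal limits (the values below just mirror the ones from the question):

    ( ulimit -m 10000; ulimit -v 100000; exec ./run_program )   # limits apply only inside ( ... )

    If ./run_program really allocates past the cap, it should fail or be killed while the parent shell stays usable. Of -m and -v, the -v (address space) limit is generally the one doing the work: on current Linux kernels the -m resident-set limit is often not enforced.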
