Linux – Running commands in parallel with a limit on the number of simultaneous commands

bash, linux, parallel processing

Sequential: for i in {1..1000}; do do_something $i; done – too slow

Parallel: for i in {1..1000}; do do_something $i& done – too much load

How can I run the commands in parallel, but with no more than, say, 20 instances running at a time?

Right now I usually use a hack like for i in {1..1000}; do do_something $i & sleep 5; done, but it is not a good solution.
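
A plain-bash approach that should enforce such a limit is to count the running background jobs and wait for a free slot before starting the next one (a sketch; wait -n needs bash 4.3 or newer):

max_jobs=20
for i in {1..1000}; do
    # while the limit is reached, block until any one background job exits
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
        wait -n
    done
    do_something "$i" &
done
wait    # wait for the remaining jobs to finish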

Update 2:
I converted the accepted answer into a script: http://vi-server.org/vi/parallel

#!/bin/bash

# First argument is the number of tasks; the rest is the command to run.
NUM=$1; shift

if [ -z "$NUM" ]; then
    echo "Usage: parallel <number_of_tasks> command"
    echo "    Sets environment variable i from 1 to number_of_tasks"
    echo "    Defaults to 20 processes at a time, use like \"MAKEOPTS='-j5' parallel ...\" to override."
    echo "Example: parallel 100 'echo \$i; sleep \`echo \$RANDOM/6553 | bc -l\`'"
    exit 1
fi

# Export the command so the generated makefile can run it via sh -c.
export CMD="$@";

# Default to 20 simultaneous jobs unless MAKEOPTS is already set.
true ${MAKEOPTS:="-j20"}

# Generate a throwaway makefile with one phony target per task number
# and let make -j do the actual parallelism.
cat << EOF | make -f - -s $MAKEOPTS
jobs=\$(shell seq 1 $NUM)
.PHONY: all \${jobs}

all: \${jobs}

\${jobs}:
        i=\$@ sh -c "\$\$CMD"
EOF

Note that the recipe line before "i=" must start with a tab character (a makefile requirement), so you must replace the eight spaces shown above with a tab to make it work.
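
For reference, a usage sketch matching the script's built-in help (the file name parallel and its location are just placeholders):

chmod +x parallel
./parallel 1000 'do_something $i'                  # 1000 tasks, 20 at a time by default
MAKEOPTS='-j5' ./parallel 1000 'do_something $i'   # same tasks, 5 at a time

The single quotes matter: $i must reach the sh -c inside the generated makefile unexpanded, where it is set to the task number.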

Best Answer

GNU Parallel is made for this.

seq 1 1000 | parallel -j20 do_something
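
The input can also be given directly on the command line using GNU Parallel's ::: syntax (same 20-job limit):

parallel -j20 do_something ::: {1..1000}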

It can even run jobs on remote computers. Here is an example that re-encodes MP3 files to OGG using server2 and the local computer, running one job per CPU core:

parallel --trc {.}.ogg -j+0 -S server2,: \
     'mpg321 -w - {} | oggenc -q0 - -o {.}.ogg' ::: *.mp3
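
To run the same re-encoding only on the local machine, drop the remote options (-S and --trc); one job per CPU core is the default:

parallel 'mpg321 -w - {} | oggenc -q0 - -o {.}.ogg' ::: *.mp3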

Watch an intro video to GNU Parallel here:

http://www.youtube.com/watch?v=OpaiGYxkSuQ
