From: Ignoramus30118 on
#!/usr/bin/perl

#
# This script reads its standard input, or from files given on
# command line (<>).
#
# It executes every line as a separate shell command. If the --parallel
# argument is given, it executes as many jobs in parallel as
# possible, but no more than "--parallel" at any given time.
#
# This can be helpful to speed up some tasks.
#
# You need the Parallel::ForkManager module. It is available as
# an Ubuntu package.
#
# Copyright(C) Igor Chudov, 2009. All rights reserved.
# This script is made available to the public under the latest
# GPL Version found at http://www.gnu.org/licenses/gpl.html
#
# No warranty is given or implied. Refunds will not be provided.
#
# Igor Chudov, http://igor.chudov.com/
#

use strict;
use warnings;

use Getopt::Long;
use Parallel::ForkManager;

my $parallel = 0;

my $pm = Parallel::ForkManager->new( $parallel );

GetOptions(
    "parallel=i" => \$parallel,
);

while ( <> ) {
    chomp;
    # In the parent, start() returns the child's PID (true) and we move
    # on to the next line; in the child it returns 0 and we fall through.
    my $pid = $pm->start and next;
    system( $_ );      # run the line as a shell command
    $pm->finish;       # terminate the child process
}

$pm->wait_all_children;
From: RjY on
Ignoramus30118 posted:
># This script reads its standard input, or from files given on
># command line (<>).
>#
># It executes every line as a separate shell command. If the --parallel
># argument is given, it executes as many jobs in parallel as
># possible, but no more than "--parallel" at any given time.
>#

Nice, but I can't think of a use case for it that isn't adequately
covered by GNU xargs... (Its -P option will, I believe, emulate your
--parallel.)
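For instance, piping the command list through something like

  xargs -d '\n' -P4 -I{} sh -c '{}'

(untested) should run each line as its own shell command, up to four at
a time.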

[script snipped]

--
http://rjy.org.uk/
From: Ignoramus28865 on
Updated version, fixes a bug: the ForkManager object is now created
after the options are parsed, so --parallel actually takes effect.

#!/usr/bin/perl

#
# This script reads its standard input, or from files given on
# command line (<>).
#
# It executes every line as a separate shell command. If the --parallel
# argument is given, it executes as many jobs in parallel as
# possible, but no more than "--parallel" at any given time.
#
# This can be helpful to speed up some tasks.
#
# You need the Parallel::ForkManager module. It is available as
# an Ubuntu package.
#
# Copyright(C) Igor Chudov, 2009. All rights reserved.
# This script is made available to the public under the latest
# GPL Version found at http://www.gnu.org/licenses/gpl.html
#
# No warranty is given or implied. Refunds will not be provided.
#
# Igor Chudov, http://igor.chudov.com/
#

use strict;
use warnings;

use Getopt::Long;
use Parallel::ForkManager;

my $parallel = 0;

GetOptions(
    "parallel=i" => \$parallel,
);

my $pm = Parallel::ForkManager->new( $parallel );

while ( <> ) {
    chomp;
    # In the parent, start() returns the child's PID (true) and we move
    # on to the next line; in the child it returns 0 and we fall through.
    my $pid = $pm->start and next;
    system( $_ );      # run the line as a shell command
    $pm->finish;       # terminate the child process
}

$pm->wait_all_children;
From: despen on
Ignoramus30118 <ignoramus30118(a)NOSPAM.30118.invalid> writes:

> #!/usr/bin/perl
>
> #
> # This script reads its standard input, or from files given on
> # command line (<>).
> #
> # It executes every line as a separate shell command. If the --parallel
> # argument is given, it executes as many jobs in parallel as
> # possible, but no more than "--parallel" at any given time.
> #
> # This can be helpful to speed up some tasks.

Make, with the -j option.

-j     = as many as possible
-j nnn = up to nnn at a time

In addition, job interdependence can be handled.

I.e. don't start these 3 jobs until these other 2 have run to
completion successfully.

From: Ignoramus28865 on
On 2009-11-17, despen(a)verizon.net <despen(a)verizon.net> wrote:
> Ignoramus30118 <ignoramus30118(a)NOSPAM.30118.invalid> writes:
>
>> #!/usr/bin/perl
>>
>> #
>> # This script reads its standard input, or from files given on
>> # command line (<>).
>> #
>> # It executes every line as a separate shell command. If the --parallel
>> # argument is given, it executes as many jobs in parallel as
>> # possible, but no more than "--parallel" at any given time.
>> #
>> # This can be helpful to speed up some tasks.
>
> Make, with the -j option.
>
> -j     = as many as possible
> -j nnn = up to nnn at a time
>
> In addition, job interdependence can be handled.
>
> I.e. don't start these 3 jobs until these other 2 have run to
> completion successfully.
>

This is similar to what my script does, except that my script does not
require a makefile.

My immediate use of this would be to speed up image conversion with
ImageMagick. I have a script that downloads pictures from my camera and
then runs "convert" on them to make them 50%x50% of the original size.

I want to speed up "convert" by parallelizing it.
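
Roughly, the idea is to generate one "convert" line per image and pipe
the list into the script. Something like this sketch (untested; the
file names, the _small suffix and the script names are just
placeholders for my actual setup):

#!/usr/bin/perl
# Untested sketch: print one "convert" command per JPEG in the current
# directory. Pipe the output into the parallel runner, e.g.
#   ./gen_convert.pl | ./parallel_run.pl --parallel 4
use strict;
use warnings;

for my $img ( glob '*.jpg' ) {
    ( my $out = $img ) =~ s/\.jpg$/_small.jpg/;
    print "convert '$img' -resize 50%x50% '$out'\n";
}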

i