From: Chris Seberino on
Possible to make subprocess.Popen jobs run serially rather than in
parallel?

In other words, if a computer is low on memory and doesn't mind
waiting.....can Popen be configured to submit to a queue and run jobs
*ONE AT A TIME*??

That might be useful and avoid crashes and disk swapping.

cs
From: Chris Seberino on
On Jun 15, 2:03 pm, Stephen Hansen <me+list/pyt...(a)ixokai.io> wrote:

> Just call "process.wait()" after you call process = subprocess.Popen(....)

I may have not been clear.....
I *don't* want web app to block on Popen.wait.
I *do* want the Popen process to run in the background while the web app
still runs doing other things.

Rather, I don't want *MANY* Popen processes to run in the
background....just one preferably.

cs
From: Chris Seberino on
On Jun 16, 11:27 am, Stephen Hansen <me+list/pyt...(a)ixokai.io> wrote:
> On 6/16/10 7:04 AM, Chris Seberino wrote:
>
> > On Jun 15, 2:03 pm, Stephen Hansen <me+list/pyt...(a)ixokai.io> wrote:
>
> >> Just call "process.wait()" after you call process = subprocess.Popen(...)
>
> > I may have not been clear.....
> > I *don't* want web app to block on Popen.wait.
> > I *do* want the Popen process to run in the background while the web app
> > still runs doing other things.
>
> > Rather, I don't want *MANY* Popen processes to run in the
> > background....just one preferably.
>
> The simplest method that comes to mind then is to have a "Process
> Runner" thread that you start when the web app begins. Then create a
> Queue.Queue() instance, share it between said thread and your web app.
>
> When you want to run an application, do Queue.put( (argspec,) )
>
> Have Process Runner do a blocking wait with Queue.get().
>
> When you wake it up with Queue.put, have it pass the args off into
> subprocess.Popen. Then have it do process.wait() to block on said
> process's completion.
>
> Once it's done, our little infinite loop jumps to the top, and it calls
> queue.get() again -- if another process request has been put in, it
> immediately gets it and goes and runs it, thus your processes are
> executing one at a time. If nothing is ready for it, it blocks until you
> wake it up.
>
> Something like (written off of top of head, may have errors):
>
> import threading
> import Queue
> import subprocess
>
> class ProcessRunner(threading.Thread):
>     def __init__(self, queue):
>         threading.Thread.__init__(self)
>         self._queue = queue
>         self.setDaemon(True)
>
>     def run(self):
>         while True:
>             args, kwargs = self._queue.get()
>             process = subprocess.Popen(*args, **kwargs)
>             process.wait()
>
> # ... And somewhere in our real web-app initialization, we do...
>
>     runner_queue = Queue.Queue()
>     runner_thread = ProcessRunner(runner_queue)
>     runner_thread.start()
>
> # ... And later, when we want to start a process ...
>
> runner_queue.put( ((["ls", "-la"],), {"shell": False}) ) # (*) see bottom
>
> --
>
>    Stephen Hansen
>    ... Also: Ixokai
>    ... Mail: me+list/python (AT) ixokai (DOT) io
>    ... Blog: http://meh.ixokai.io/
>
> P.S. Passing in 'args' and 'kwargs' into the queue is usually in my
> experience overkill (in addition to being slightly ugly); generally the
> subprocesses I want to run are similar in nature or environment, so I
> just make the runner-thread smart. But the above is the most naive
> implementation.
>
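
For anyone with jobs that are all alike, that P.S. is worth taking: keep the
queue payload simple and let the runner build the command itself. A rough
sketch of that variant (my names, Python 3 spelling of the modules, and the
command line is just an illustration):

```python
import queue
import subprocess
import sys
import threading

class SmartRunner(threading.Thread):
    """Pulls plain job data off a queue and builds the command itself."""
    def __init__(self, jobs):
        threading.Thread.__init__(self)
        self.daemon = True                  # die with the main program
        self._jobs = jobs

    def run(self):
        while True:
            filename = self._jobs.get()     # just a string, not args/kwargs
            # All jobs run the same program, so the command line lives here,
            # not in the caller.
            cmd = [sys.executable, "-c",
                   "import sys; print('processing', sys.argv[1])", filename]
            subprocess.Popen(cmd).wait()    # one at a time
            self._jobs.task_done()

jobs = queue.Queue()
SmartRunner(jobs).start()
jobs.put("report.txt")                      # hypothetical job input
jobs.join()                                 # wait for the backlog to drain
```

The `task_done()`/`join()` pair is optional; it just lets the caller block
until the backlog is empty when it wants to.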

Thanks all. I must say I implemented the threading + Queue module
suggestion and it is incredibly simple and elegant. I'm still
recovering from the glorious light rays emanating from the Python
code.
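
For the archives, here is the pattern in one self-contained, runnable piece
(Python 3 spelling: `queue` instead of `Queue`, the `daemon` attribute
instead of `setDaemon`). Two things bit me from the off-the-top-of-the-head
sketch: `Thread.__init__` must be called, and with `shell=False` the command
has to be a list of strings, not `"ls -la"`:

```python
import queue
import subprocess
import sys
import threading

class ProcessRunner(threading.Thread):
    """Runs queued subprocesses in the background, one at a time."""
    def __init__(self, q):
        threading.Thread.__init__(self)   # easy to forget -- start() fails without it
        self.daemon = True                # die with the main program
        self._queue = q

    def run(self):
        while True:
            args, kwargs = self._queue.get()
            process = subprocess.Popen(*args, **kwargs)
            process.wait()                # serialize: block until this job finishes
            self._queue.task_done()

runner_queue = queue.Queue()
ProcessRunner(runner_queue).start()

# Each item is (Popen positional args, Popen keyword args); with shell=False
# the command must be a list of strings.
runner_queue.put((([sys.executable, "-c", "print('job done')"],),
                  {"shell": False}))
runner_queue.join()                       # optional: wait for the queue to drain
```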

cs