From: Norm Matloff on
Should be a simple question, but I can't seem to make it work from my
understanding of the docs.

I want to use the multiprocessing module with remote clients, accessing
shared lists. I gather one is supposed to use register(), but I don't
see exactly how. I'd like to have the clients read and write the shared
list directly, not via some kind of get() and set() functions. It's
clear how to do this in a shared-memory setting, but how can one do it
across a network, i.e. with serve_forever(), connect() etc.?

Any help, especially with a concrete example, would be much appreciated.
Thanks.

Norm

From: Kushal Kumaran on
On Thu, Apr 8, 2010 at 3:04 AM, Norm Matloff <matloff(a)doe.com> wrote:
> Should be a simple question, but I can't seem to make it work from my
> understanding of the docs.
>
> I want to use the multiprocessing module with remote clients, accessing
> shared lists.  I gather one is supposed to use register(), but I don't
> see exactly how.  I'd like to have the clients read and write the shared
> list directly, not via some kind of get() and set() functions.  It's
> clear how to do this in a shared-memory setting, but how can one do it
> across a network, i.e. with serve_forever(), connect() etc.?
>
> Any help, especially with a concrete example, would be much appreciated.
> Thanks.
>

There's an example in the multiprocessing documentation.
http://docs.python.org/library/multiprocessing.html#using-a-remote-manager

It creates a shared queue, but it's easy to modify for lists.

For example, here's your shared list server:
from multiprocessing.managers import BaseManager
shared_list = []
class ListManager(BaseManager): pass
ListManager.register('get_list', callable=lambda: shared_list)
m = ListManager(address=('', 50000), authkey=b'abracadabra')
s = m.get_server()
s.serve_forever()

A client that adds an element to your shared list:
import random
from multiprocessing.managers import BaseManager
class ListManager(BaseManager): pass
ListManager.register('get_list')
m = ListManager(address=('localhost', 50000), authkey=b'abracadabra')
m.connect()
l = m.get_list()
l.append(random.random())

And a client that prints out the shared list:
from multiprocessing.managers import BaseManager
class ListManager(BaseManager): pass
ListManager.register('get_list')
m = ListManager(address=('localhost', 50000), authkey=b'abracadabra')
m.connect()
l = m.get_list()
print(l)

--
regards,
kushal
From: Norm Matloff on
Thanks very much, Kushal.

But it seems to me that it doesn't quite work. After your first client
creates l and calls append() on it, it would seem that one could not
then assign to an element, e.g. do

l[1] = 8

What I'd like is to write remote multiprocessing code just like threads
code (or for that matter, just like shared-memory multiprocessing code),
i.e. reading and writing shared globals. Is this even possible?

Norm

On 2010-04-08, Kushal Kumaran <kushal.kumaran+python(a)gmail.com> wrote:
> <snipped>
From: Kushal Kumaran on
On Thu, Apr 8, 2010 at 11:30 AM, Norm Matloff <matloff(a)doe.com> wrote:
> Thanks very much, Kushal.
>
> But it seems to me that it doesn't quite work.  After your first client
> below creates l and calls append() on it, it would seem that one could
> not then assign to it, e.g. do
>
>   l[1] = 8
>
> What I'd like is to write remote multiprocessing code just like threads
> code (or for that matter, just like shared-memory multiprocessing code),
> i.e. reading and writing shared globals.  Is this even possible?
>

Try this server:
from multiprocessing.managers import BaseManager, ListProxy
shared_list = []
class ListManager(BaseManager): pass
ListManager.register('get_list', callable=lambda: shared_list,
                     proxytype=ListProxy)
m = ListManager(address=('', 50000), authkey=b'abracadabra')
s = m.get_server()
s.serve_forever()

Just changed the proxy type appropriately. See the managers.py file
in the multiprocessing source for details.

> <snipped>

--
regards,
kushal