From: IngoognI on
On Dec 11, 11:12 am, Wolodja Wentland <wentl...(a)cl.uni-heidelberg.de>
wrote:
>
> Which library would you choose?

Looking at the gallery at networkx, it seems to be all balls 'n' sticks.
How about writing the data to a file POV-Ray can read and rendering it
there?
From: Wolodja Wentland on
On Fri, Dec 11, 2009 at 07:31 -0800, IngoognI wrote:
> On Dec 11, 11:12 am, Wolodja Wentland <wentl...(a)cl.uni-heidelberg.de>
> wrote:

> > Which library would you choose?
>
> Looking at the gallery at networkx, it seems to be all balls 'n' sticks.
> How about writing the data to a file POV-Ray can read and rendering it
> there?

Huh? I am not really concerned with rendering the graphs; I am after a
library with a small memory footprint, preferably one that contains a
number of typical algorithms.
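For reference, the kind of memory-lean structure one would hope such a
library uses internally can be sketched in plain Python: a dict mapping
each node to a set of successors, plus one of the "typical algorithms"
(breadth-first search). This is a stdlib-only baseline for comparison,
not any particular library's API.

```python
# Minimal directed graph: dict of node -> set of successors.
# Stdlib only; meant as a memory-footprint baseline, not a library API.
from collections import deque

def make_graph():
    return {}

def add_edge(g, u, v):
    """Add a directed edge u -> v, creating nodes as needed."""
    g.setdefault(u, set()).add(v)
    g.setdefault(v, set())  # make sure the target node exists too

def bfs_order(g, start):
    """Return the nodes reachable from start in breadth-first order."""
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for succ in sorted(g[node]):  # sorted only for deterministic output
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    return order
```

Each node costs roughly one dict slot plus one set, so a few hundred
thousand nodes of average degree 4 should stay well under a gigabyte
with this layout.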
--
.''`. Wolodja Wentland <wentland(a)cl.uni-heidelberg.de>
: :' :
`. `'` 4096R/CAF14EFC
`- 081C B7CD FF04 2BA9 94EA 36B2 8B7F 7D30 CAF1 4EFC
From: Neal Becker on
Wolodja Wentland wrote:

> On Fri, Dec 11, 2009 at 08:55 -0500, Neal Becker wrote:
>> Bearophile wrote:
>> > Wolodja Wentland:
>> >> Which library would you choose?
>
>> > This one probably uses low memory, but I don't know if it still works:
>> > http://osl.iu.edu/~dgregor/bgl-python/
>
>> How about python interface to igraph?
>
> Don't know :-) as I have not yet worked with it. Why do you recommend it?

My understanding is that igraph is a high-performance graph library (all
implemented in C). It seems to have a very active user community.

There is also the Boost Graph Library, which IIRC also has a Python interface.

From: anand jeyahar on
On 12/11/2009 10:27 PM, Neal Becker wrote:
> Which library would you choose?
>
Hmm... I have tried python-graph and was happy with it, but the most I
used it for was complete graphs of 60-65 nodes.

There is also an experimental branch with faster implementations, which
is under development.
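For scale: a complete graph on n nodes has n*(n-1)/2 edges, so 65 nodes
means 2080 edges, which almost any pure-Python library handles
comfortably. A stdlib sketch of that arithmetic (this is not
python-graph's API, just a sanity check of the numbers above):

```python
# Build a complete undirected graph on n nodes as a set of unordered
# node pairs and check the edge count against n*(n-1)/2. Stdlib only.
from itertools import combinations

def complete_graph_edges(n):
    """All unordered pairs over nodes 0..n-1."""
    return set(combinations(range(n), 2))

edges = complete_graph_edges(65)
assert len(edges) == 65 * 64 // 2  # 2080 edges
```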

--
==============================================
Anand J
http://sites.google.com/a/cbcs.ac.in/students/anand
==============================================
The man who is really serious,
with the urge to find out what truth is,
has no style at all. He lives only in what is.
~Bruce Lee

Love is a trade with lousy accounting policies.
~Aang Jie

From: geremy condra on
On Fri, Dec 11, 2009 at 5:12 AM, Wolodja Wentland
<wentland(a)cl.uni-heidelberg.de> wrote:
> Hi all,
>
> I am writing a library for accessing Wikipedia data and include a module
> that generates graphs from the Link structure between articles and other
> pages (like categories).
>
> These graphs could easily contain several million nodes which are heavily
> linked. The graphs I am building right now have around 300,000 nodes
> with an average in/out degree of, say, 4 and already need around 1-2 GB of
> memory. I use networkx to model the graphs and serialise them to files on
> disk (using adjacency list format, pickle and/or graphml).

Huh. Using graphine, which should be somewhat more memory-hungry
than networkx, I generated a naive million-node 4-cycle graph and wound
up using something under 600 MB of RAM. Can you post some code?
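For anyone wanting to repeat the measurement without graphine, here is a
stdlib-only approximation: build a large graph as a plain dict of
neighbour sets and sum sys.getsizeof over the containers. The
ring-with-skips shape (every node linked to its neighbours one and two
steps away, degree 4) is an assumption about what the "million-node
4-cycle graph" looked like, and getsizeof ignores the node objects
themselves, so treat the number as a lower bound.

```python
# Rough, stdlib-only re-creation of the measurement: an n-node ring
# where each node links to the nodes 1 and 2 steps away in both
# directions (degree 4). sys.getsizeof counts only the dict and its
# sets, not shared int objects, so the result is a lower bound.
import sys

def ring_graph(n, skips=(1, 2)):
    """Undirected ring: node u is linked to u +/- s (mod n) for each skip s."""
    g = {}
    for u in range(n):
        g[u] = {(u + s) % n for s in skips} | {(u - s) % n for s in skips}
    return g

def container_bytes(g):
    """Bytes used by the dict and its neighbour sets alone."""
    return sys.getsizeof(g) + sum(sys.getsizeof(s) for s in g.values())

g = ring_graph(100000)
print("%.1f MB in containers alone" % (container_bytes(g) / 1e6))
```

Scaling the node count up to a million gives a ballpark figure to
compare against the ~600 MB graphine run and the 1-2 GB networkx
numbers reported above.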

Geremy Condra