From: AlexM on
On Jan 25, 2:37 pm, "Alf P. Steinbach" <al...(a)start.no> wrote:
> * AlexM:
>
>
>
> > On Jan 25, 2:07 pm, Terry Reedy <tjre...(a)udel.edu> wrote:
> >> On 1/25/2010 2:05 PM, Alexander Moibenko wrote:
>
> >>> I have a simple question to which I could not find an answer.
> >> Because it has no finite answer
>
> >>> What is the total maximal size of list including size of its elements?
> >> In theory, unbounded. In practice, limited by the memory of the interpreter.
>
> >> The maximum # of elements depends on the interpreter. Each element can
> >> be a list whose maximum # of elements ..... and recursively so on...
>
> >> Terry Jan Reedy
>
> > I am not asking about the maximum number of elements, I am asking about
> > the total maximal size of a list including the size of its elements. In
> > other words: if the size of each list element is ELEMENT_SIZE and all
> > elements have the same size, what would be the maximal number of these
> > elements on a 32-bit architecture?
> > I see 3 GB, and wonder why? Why not 2 GB or 4 GB?
>
> At a guess you were running this in 32-bit Windows. By default it reserves the
> upper two gig of address space for mapping system DLLs. It can be configured to
> use just 1 gig for that, and it seems like your system is, or you're using some
> other system with that kind of behavior, or, it's just arbitrary...
>
> Cheers & hth.,
>
> - Alf (by what mechanism do socks disappear from the washer?)

No, it is 32-bit Linux.
Alex
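The original question, the total size of a list including its elements, can be approximated from within Python itself. A rough sketch: `list_footprint` is a hypothetical helper, and `sys.getsizeof` reports shallow sizes only, so nested containers and shared objects are not accounted for.

```python
import sys

def list_footprint(lst):
    # Shallow total: the list object (including its internal pointer
    # array) plus each element's own size. Nested containers and
    # shared references are not followed, so this is an estimate.
    return sys.getsizeof(lst) + sum(sys.getsizeof(x) for x in lst)

data = ["x" * 100 for _ in range(1000)]
print(list_footprint(data))
```

For the OP's scenario (many same-sized elements), this makes it easy to see how quickly a few gigabytes are consumed.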
From: Diez B. Roggisch on
On 25.01.10 21:49, AlexM wrote:
> On Jan 25, 2:37 pm, "Alf P. Steinbach" <al...(a)start.no> wrote:
>> [snip]
>>
>> At a guess you were running this in 32-bit Windows. By default it reserves the
>> upper two gig of address space for mapping system DLLs. It can be configured to
>> use just 1 gig for that, and it seems like your system is, or you're using some
>> other system with that kind of behavior, or, it's just arbitrary...
>
> No, it is 32-bit Linux.
> Alex

I already answered that (as did Alf; the same principle applies to both OSs)
- kernel memory space is mapped into the process address space, reducing it
by 1 or 2 GB.

Diez
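The explanation above also accounts for the observed 3 GB ceiling: with a typical 3G/1G split on 32-bit Linux, only about 3 GiB of address space is usable per process. A back-of-the-envelope estimate with a hypothetical helper; the 3 GiB figure and the 4-byte pointer size are assumptions for a 32-bit build, not measured values.

```python
def max_list_elements(element_size, usable_bytes=3 * 1024**3, pointer_size=4):
    # Each list slot costs one pointer in the list's internal array,
    # plus the element object itself (assuming no elements are shared).
    return usable_bytes // (pointer_size + element_size)

# Roughly how many 100-byte objects fit before a 32-bit process
# exhausts its ~3 GiB of user address space.
print(max_list_elements(100))
```

The point is that the limit is set by the address space the kernel leaves to the process, not by the list type itself.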
From: AlexM on
On Jan 25, 2:42 pm, "Diez B. Roggisch" <de...(a)nospam.web.de> wrote:
> On 25.01.10 21:15, AlexM wrote:
>
>
>
> > On Jan 25, 2:03 pm, "Diez B. Roggisch" <de...(a)nospam.web.de> wrote:
> >> Am 25.01.10 20:39, AlexM wrote:
>
> >>> On Jan 25, 1:23 pm, "Diez B. Roggisch" <de...(a)nospam.web.de> wrote:
> >>>> On 25.01.10 20:05, Alexander Moibenko wrote:
>
> >>>>> I have a simple question to which I could not find an answer.
> >>>>> What is the total maximal size of list including size of its elements?
> >>>>> I do not like to look into the Python source.
>
> >>>> But it would answer that question pretty fast. Because then you'd see
> >>>> that all list-object-methods are defined in terms of Py_ssize_t, which
> >> is an alias for your platform's ssize_t. On 64-bit, that should be a 64-bit long.
>
> >>>> Diez
>
> >>> Then how do you explain the program output?
>
> >> What exactly? That after 3GB it ran out of memory? Because you don't
> >> have 4GB memory available for processes.
>
> >> Diez
>
> > Did you see my posting?
> > ....
> > Here is what I get on 32-bit architecture:
> > cat /proc/meminfo
> > MemTotal:      8309860 kB
> > MemFree:       5964888 kB
> > Buffers:         84396 kB
> > Cached:         865644 kB
> > SwapCached:          0 kB
> > .....
>
> > I have more than 5G in memory not speaking of swap space.
>
> Yes, I saw your posting. 32Bit is 32Bit. Do you know about PAE?
>
>    http://de.wikipedia.org/wiki/Physical_Address_Extension
>
> Just because the system can deal with more overall memory doesn't mean one
> process can get more than 4 GB (or even less, due to re-mapped memory) -
> unless it uses specific APIs, like the old hi-mem stuff under DOS.
>
> Diez

Yes, I do. Good catch! I have PAE enabled, but I guess I compiled Python
without extended memory support. So I was looking in the wrong place.
Thanks!
AlexM
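The Py_ssize_t point quoted above can be confirmed from a running interpreter without reading the CPython source; a minimal sketch:

```python
import struct
import sys

# Width of a C pointer in this interpreter build: 4 on a 32-bit build,
# 8 on a 64-bit build.
pointer_bytes = struct.calcsize("P")

# sys.maxsize is the largest Py_ssize_t value, i.e. the theoretical
# cap on the number of elements a list can hold on this build.
print(pointer_bytes * 8, "bit build; sys.maxsize =", sys.maxsize)

# The two are consistent: maxsize is 2**(bits - 1) - 1.
assert sys.maxsize == 2 ** (pointer_bytes * 8 - 1) - 1
```

On a 32-bit build this prints a maxsize of 2**31 - 1, which is far more slots than the ~3 GB of address space can actually back, so in practice memory runs out first.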
From: Diez B. Roggisch on
On 25.01.10 22:22, AlexM wrote:
> [snip]
>
> Yes, I do. Good catch! I have PAE enabled, but I guess I have compiled
> python without extended memory. So I was looking in the wrong place.


You can't compile Python with PAE support. It's an extension that doesn't
make sense for a general-purpose language. It is used by databases and the
like, which can hold large structures in memory that don't need random
access but can cope with windowing.

Diez
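Whether the CPU advertises PAE at all can be read from /proc/cpuinfo on Linux. A small Linux-specific sketch; `cpu_has_pae` is a hypothetical helper that simply returns False on systems without that file:

```python
def cpu_has_pae(cpuinfo="/proc/cpuinfo"):
    # Scan the 'flags' line of the first CPU entry for the 'pae' flag.
    try:
        with open(cpuinfo) as f:
            for line in f:
                if line.startswith("flags"):
                    return "pae" in line.split(":", 1)[1].split()
    except OSError:
        pass  # not Linux, or /proc unavailable
    return False

print(cpu_has_pae())
```

Note this only tells you what the hardware and kernel can do; as discussed above, it does not raise the per-process address-space limit.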
From: AlexM on
On Jan 25, 3:31 pm, "Diez B. Roggisch" <de...(a)nospam.web.de> wrote:
> [snip]
>
> > Yes, I do. Good catch! I have PAE enabled, but I guess I have compiled
> > python without extended memory. So I was looking in the wrong place.
>
> You can't compile it with PAE. It's an extension that doesn't make sense
> in a general purpose language. It is used by Databases or some such,
> that can hold large structures in memory that don't need random access,
> but can cope with windowing.
>
> Diez

Well, there actually is a way of building programs that may use more
than 4GB of memory on 32-bit machines under Linux with highmem kernels,
but I guess this would not work for Python.
I'll just switch to 64-bit architecture.
Thanks again.
AlexM