From: Jerry on
The following program produces a segmentation fault when run with the
array size set to 1_048_343 but not when it is set to 1_048_342.
What's going on and how do I fix it? If I put in a bit of text IO the
allowed array size is somewhat smaller.

GNATMAKE 4.4.0 20080314 (experimental) [trunk revision 133226]
OS X 10.5.8
4 GB RAM



procedure bomb is
    type Float_Array_Type is array (Integer range <>) of Long_Float;
    -- 1_048_343 causes segmentation fault, 1_048_342 does not.
    x : Float_Array_Type(1 .. 1_048_343);
begin
    x(1) := 1.0;
end bomb;


Jerry
From: Gautier write-only on
Perhaps you hit a stack limit (pure guess), or there is really a hole
in the memory :-).
In case of a stack issue you would perhaps have
raised STORAGE_ERROR : object too large
or
raised STORAGE_ERROR : EXCEPTION_STACK_OVERFLOW
(with -fstack-check)
G.
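
For what it's worth, enabling that check with GNAT looks roughly like this (a sketch; the exact flag spelling and error message may vary by GNAT version):

```shell
# Build with stack checking enabled so an overflow raises
# Storage_Error instead of segfaulting (-cargs passes the
# flag through to the gcc back end).
gnatmake bomb.adb -cargs -fstack-check

# Running it should then report something like:
#   raised STORAGE_ERROR : EXCEPTION_STACK_OVERFLOW
./bomb
```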
From: jonathan on
On Mar 17, 7:21 pm, Jerry <lancebo...(a)qwest.net> wrote:
> The following program produces a segmentation fault when run with the
> array size set to 1_048_343 but not when it is set to 1_048_342.
> What's going on and how do I fix it? If I put in a bit of text IO the
> allowed array size is somewhat smaller.
>
> GNATMAKE 4.4.0 20080314 (experimental) [trunk revision 133226]
> OS X 10.5.8
> 4 GB RAM
>
> procedure bomb is
>     type Float_Array_Type is array (Integer range <>) of Long_Float;
>     -- 1_048_343 causes segmentation fault, 1_048_342  does not.
>     x : Float_Array_Type(1 .. 1_048_343);
> begin
>     x(1) := 1.0;
> end bomb;
>
> Jerry

Have you tried setting the stack size in your shell? In bash the
command is

ulimit -s unlimited

In csh (I seem to recall):

limit stacksize unlimited

Type ulimit or limit to see what the defaults are.
When I do it on Debian Linux, your program works with arrays up to
2 gigabytes in size.
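
Putting it together in bash (the limits and output will differ from system to system):

```shell
# Query the current soft stack limit (in kilobytes, or "unlimited").
ulimit -s

# Query the hard limit -- the ceiling the soft limit can be raised to.
ulimit -Hs

# Try to raise the soft limit for this shell session; this may fail
# if the hard limit is lower, as it apparently is on OS X.
ulimit -s unlimited || echo "cannot raise stack limit"
```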

HTH ...

Jonathan



From: Georg Bauhaus on
Gautier write-only wrote:
> Perhaps you hit a stack limit (pure guess), or there is really a hole
> in the memory :-).
> In case of a stack issue you would have perhaps
> raised STORAGE_ERROR : object too large
> or
> raised STORAGE_ERROR : EXCEPTION_STACK_OVERFLOW
> (with -fstack-check)
> G.

Anything to do with
http://en.wikibooks.org/wiki/Ada_Programming/Tips#Stack_Size ?

Any news on GCC stack checking and page(?) size?
Trampolines, if applicable?

From: Jerry on
Thanks for the helpful comments.

First,
ulimit -s unlimited
does not work on OS X:
-bash: ulimit: stack size: cannot modify limit: Operation not
permitted
but I understand that it works on Linux. And possibly the reason is
the difference in the way that Linux and OS X treat stack and heap
memory. (Don't be confused and think I know what I'm talking about but
I read that somewhere.)

ulimit allows querying the hard limit of stack space
ulimit -Hs
which on OS X reports 65532 kB (= 2^16 - 4), about 67 MB. The user
via ulimit can set the stack up to that size but not higher:
ulimit -s 65532
The default soft limit on OS X is 8192 kB, found by
ulimit -s

So here's me being naive: I would have thought that Ada (or GNAT
specifically) would be smart enough to allocate memory for large
objects such as my long array in a transparent way so that I don't
have to worry about it, thus (in the Ada spirit) making it harder to
screw up. (Like not having to worry about whether arguments to
subprograms are passed by value or by reference--it just happens.)

But it seems that I will have to allocate memory for large objects
using pointers (and thus take the memory from the heap). Is that
right?

In this context, is there any advantage to declaring the large object
inside a declare block? Would that force the memory to be allocated
from the heap?
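
By "using pointers" I mean something like the following sketch (names made up by me; I haven't tested whether this is the idiomatic way):

```ada
with Ada.Unchecked_Deallocation;

procedure Bomb_Heap is
   type Float_Array_Type is array (Integer range <>) of Long_Float;
   type Float_Array_Access is access Float_Array_Type;
   procedure Free is new Ada.Unchecked_Deallocation
     (Float_Array_Type, Float_Array_Access);

   --  The object now lives on the heap, so the shell's stack
   --  limit no longer applies to it.
   X : Float_Array_Access := new Float_Array_Type (1 .. 1_048_343);
begin
   X (1) := 1.0;
   Free (X);
end Bomb_Heap;
```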

Jerry