From: Georg Bauhaus on
On 27.04.10 13:41, Martin Krischik wrote:
> On 23.04.2010, at 16:37, Georg Bauhaus
> <rm.dash-bauhaus(a)futureapps.de> wrote:
>
>> On 23.04.10 15:56, Maciej Sobczak wrote:
>
>> BTW, why do we still subscribe to the notion "integer overflow"
>> when the one thing that any sequence of what is commonly known
>> as integers cannot possibly do is to overflow? Maybe the
>> wording is at the heart of the problem.
>
> Not at all. This is an integer overflow:
>
> int dayOfMonth = 32;

I don't think that this is an integer overflow by the common
definition of "integer overflow" (which relies on "int", not on
implications of the name "dayOfMonth"---and also not on
"integer"). No overflow there, just a terrible programming
mistake.
In Ada, other than the predefined "Integer" and its relatives,
there are no ready-made named integer types. A good thing, as
this lack suggests adding at least range constraints, if not
declaring new types. In C, many programmers seem to think that
there are integers and that their name is "int".
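
A minimal sketch of those two options (the names Day_Of_Month
and Day_Number are mine, chosen only for illustration):

procedure Day_Types is
   --  Option 1: a range-constrained subtype of the predefined Integer
   subtype Day_Of_Month is Integer range 1 .. 31;

   --  Option 2: a distinct new integer type, not mixable with
   --  Integer without an explicit conversion
   type Day_Number is range 1 .. 31;

   D : Day_Of_Month := 27;
   N : Day_Number   := 27;
begin
   D := Integer (N);   --  mixing the two requires a conversion
end Day_Types;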

(On a 5-bit architecture one could even be misled to believe
that the quoted declaration of dayOfMonth would create overflow
(disregarding the logical error), since 32 > 2^5 - 1. But C's
minimum requirement for an "int" is that it store values between
-(2^15 - 1) and +(2^15 - 1), whatever width the underlying
hardware's words actually have.)

So the quoted initialization would "flow over" the range
constraint of a suitably defined Day_Of_Month subtype in Ada.
(Forgetting about February problems for the moment, which can
only be solved in type systems such as Qi's.)
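
A sketch of that Ada behaviour, reusing the Day_Of_Month subtype
from above (again my name, not from the original example):

with Ada.Text_IO;  use Ada.Text_IO;

procedure Flow_Over is
   subtype Day_Of_Month is Integer range 1 .. 31;
   Day : Day_Of_Month := 1;
begin
   Day := Day + 31;   --  1 + 31 = 32, outside 1 .. 31
   Put_Line (Integer'Image (Day));
exception
   when Constraint_Error =>
      Put_Line ("range constraint violated, not a machine-level overflow");
end Flow_Over;

The addition itself is fine for type Integer; it is the
assignment back into the constrained Day that fails its range
check and raises Constraint_Error.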


> Simplified example of course.

Uhm, the heart of the problem is that "int" is taken to mean
... well ... an integer? Yes, the above is a logical
error, one that could have been prevented mechanically
by using a good base type system, one that does not
include "the integers". Which is my point: that it is
a misconception to think of "int" as an integer.
If you think of "int" as what it is: "int", and if you are
smart, then little can go wrong.
This is easier to get right once you have a base type system
that naturally suggests not thinking in terms of an infinite set
of arbitrary high school numbers, but in terms of computer
entities. Like Ada's. I like this characterization of C's int:

"The int type was typically the most convenient native
data type for integer math." (*)

The "native" part is what seems lost in stereotypical C
knowledge.
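
For contrast, a small Ada sketch (the type name Sample_Count is
invented here): the programmer states the required range, and
the implementation picks a native base type that covers it,
which the program can then inspect:

with Ada.Text_IO;  use Ada.Text_IO;

procedure Native_View is
   --  The declaration states the requirement; the compiler chooses
   --  a native machine type whose base range covers it.
   type Sample_Count is range 0 .. 1_000_000;
begin
   Put_Line ("requested last value:" & Sample_Count'Image (Sample_Count'Last));
   Put_Line ("chosen base range, up to" & Sample_Count'Image (Sample_Count'Base'Last));
   Put_Line ("predefined Integer'Last:" & Integer'Image (Integer'Last));
end Native_View;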

(*) http://www.ibm.com/developerworks/power/library/pa-ctypes3/index.html
From: AdaMagica on
But Standard.Integer is an integer ;-)

Very_Big    : constant := Integer'Last + Integer'Last;            -- "+" of Integer
Even_Bigger : constant := Long_Integer'Last + Long_Integer'Last;  -- "+" of Long_Integer
Illegal     : constant := Long_Integer'Last + Integer'Last;       -- "+" mixing