From: Scott M. on 16 Oct 2009 13:48
"Eduardo" <mm(a)mm.com> wrote in message news:hbaaot$sp$1(a)aioe.org...
> Scott M. wrote:
>> I'm for the zero-based array, not because it's easier or harder than
>> something else. I think ANY programmer (even a new one) has to be adept
>> enough with numbers and math to be able to handle a list that starts with
>> 0 and/or a list that starts with 1 - I really don't think that is a problem.
>> The reason I'm for zero-based arrays is simply because, in my experience,
>> zero-based is the more common approach in most modern programming
>> languages and since these days, programmers are being asked to become
>> familiar with more than just one language, having more of them support
>> the same standard makes jumping back and forth easier and makes careless
>> mistakes ("over there I use zero, but over here I use 1") a non-issue.
>> We haven't even mentioned that in corporate America (and corporate
>> non-America), programmers don't program in a vacuum. They write code
>> that must integrate with other code. Having all programmers code
>> according to the default array starting index just makes more sense to me
>> in an Enterprise, than having some people use one and others using zero.
>> Can you say "standards"?
>> Just my 1 cents (but please start your cent count from zero).
> OK, those are valid reasons, but then it means that the programmers have
> to carry "a karma" forever and ever just because something is too commonly
> done.
Well, if you are interested in standards, yes. Conceptually, that may seem
very constricting, but simply take a look at the "browser wars" of the 90's
and you'll see what a lack of standards did.
1. On the one hand, it promoted innovation. Many of what have become
"standard" parts of HTML and CSS were first developed as proprietary code.
2. On the other, writing web applications to a particular audience became
tedious and/or daunting in some cases, causing application architectures to
become overly complex, development and testing time to be unduly long, and
scalability almost impossible.
Sure, a case can be made for choice #1, but strictly speaking from a
real-world perspective of Enterprise development, the choice becomes clear
that the benefits of #1 are completely outweighed by the downside of #2.
There's no doubt that without standards, you can't get past a certain point
in development. This has been shown over and over again with technology
(VHS vs. Betamax, Blu-ray vs. HD DVD). Betamax was
actually technically superior to VHS, but while the war was still raging,
when there was no standard, people became hesitant to purchase either, so
overall, both technologies remained relatively stagnant. This was the
case again when the HD disc formats had no clear standard. Now that
Blu-ray has emerged as the standard, you're starting to see people going
out and buying players again. And that will spawn innovation among
electronics manufacturers to show why Panasonic's Blu-ray player is
better than Sony's.
> Some language must be the first to change that.
> (Assuming that it would be better the other way, which is also under
> discussion.)
> I'm not saying I'm definitely right, maybe I'm wrong, but that's the way
> I see it.
From: dpb on 16 Oct 2009 13:44
> Of course there are situations when 0 based is more appropriate, or even
> required, but _my opinion_ is that in general (and not just for me),
> there are many more situations where 1-based is more natural.
I don't think there's any way to possibly quantify this as being so.
But, even if it were so, it misses the point entirely... :(
> But this must come along with automatic redim, otherwise it could make
> the things worse.
???? I have absolutely no clue why you think that should have anything
whatever to do with it.
> The goal is that the programmer could concentrate his/her efforts in the
> functions of his program and not in the technical problems of the language.
> Yes, you lose the possibility of "not checking array bounds" in the
> compiler options.
Hopeless... :( The one has nothing to do with the other. The
implementation has to have bounds whether they've been set explicitly or
implicitly. Bounds-checking is still possible (outside the question of
dummy arguments passed between code units compiled in separate
compilation units, where the required information simply isn't available).
That would in reality be a _major_ step backwards rather than forwards,
as it would allow a large class of previously detectable bugs to
become much more difficult to track down and fix, thus undoubtedly
increasing debugging effort far more than any compensating initial
coding time that could possibly be saved (and I don't see such savings
as being significant in general anyway). That doesn't even touch
upon the question of longer-term code maintenance and robustness.
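dpb's point about previously detectable bugs can be illustrated with a
minimal sketch (Python here purely for illustration): with bounds
checking in force, the classic off-by-one fails loudly at the faulty
line instead of silently corrupting something and surfacing much later.

```python
# A zero-based array of length 5 has valid indices 0..4. With bounds
# checking, the classic off-by-one (indexing with len(a)) fails
# immediately at the faulty line rather than reading a neighbor value.
data = [10, 20, 30, 40, 50]

def last_element_buggy(a):
    return a[len(a)]       # off-by-one: valid indices stop at len(a) - 1

def last_element_fixed(a):
    return a[len(a) - 1]   # correct last index

try:
    last_element_buggy(data)
    caught = False
except IndexError:         # the bounds check pinpoints the bug
    caught = True
```

Without that check, the buggy read would return whatever happens to sit
past the end of the array, and the debugging effort moves far away from
the line that caused it.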
One of the most significant enhancements in Fortran beginning with
F90/95 was the implementation of "explicit interfaces" such that
subprograms have this information available to them even across separate
compilations. The productivity gains and quicker debugging that come
with this feature are a frequent subject of conversation at c.l.f.
Again, your focus is on minimal-value issues at the expense of the
important ones.
> In that case, if you opt for the option of not checking it (the bounds)
> in order to get some speed increase, you'll have to redim the arrays in
> your code as it's today.
> You don't lose anything, because if you want to program as today, you can.
See earlier note on Matlab and delayed dynamic allocation.
The compiler-implementation details that such choices in allowable
syntax imply are not at all insignificant, and you need to begin to
appreciate those effects.
From: Karl E. Peterson on 16 Oct 2009 14:05
>> What if the person developing code works in the "0 is better" field
>> exclusively? Your preference should still reign over their development
>> environment? I don't think so... :)
> Do a census.
> Does the minority who need 0 have to reign over all the majority who need 1?
Sometimes, I "need" 1995 or 1982 or 2000 as the LBound. But I cope, especially in
languages that don't offer that option, by just adding a constant to the index.
Yeah, 0 is the *only* sensible default. I didn't appreciate that either, at first.
Humans may often start counting at 1, but by no means do they always start there.
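Karl's add-a-constant trick is easy to sketch. The 1982 base, the
10-year span, and the names below are all hypothetical, chosen purely
for illustration (Python used just to keep the sketch runnable):

```python
# Simulating an arbitrary lower bound (here LBound = 1982) on top of a
# zero-based array by subtracting a constant from the "natural" index.
LBOUND = 1982                 # hypothetical lower bound for yearly data

sales = [0.0] * 10            # zero-based storage covering 1982..1991

def set_year(year, value):
    sales[year - LBOUND] = value   # translate year -> zero-based slot

def get_year(year):
    return sales[year - LBOUND]

set_year(1985, 42.5)          # lands in zero-based slot 3
```

The point is that any lower bound reduces to "zero-based plus an
offset", so a language that only offers zero-based arrays hasn't
actually taken anything away.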
.NET: It's About Trust!
From: dpb on 16 Oct 2009 14:05
> I'm not saying I'm definitely right, may be I'm wrong, but that's the
> way I see it.
Well, look at it this way...
Computing started some 50 years or so ago and there have been quite a
number of pretty smart folks involved in that time.
There must be a reason things evolved to 0-based as default and if there
were a clear advantage to another choice it would seem that it would
have succeeded long before now.
One other way to see why is to consider the hardware...
You start w/ a bit--what are its possible values?
You then go to (say) a byte, as that's the most common next addressable
size at the moment; there have been others in the past, but that's not
important here.
How many integers are representable in a (say unsigned for convenience)
byte? It's 256, right? What are those actual 256 values,
though--they're 0 (where did that come from so sneakily??? :) ) to 255.
The lesson to be learned is that the hardware is 0-based, hence it's not
surprising to learn that implementations are simpler and more efficient
that way irrespective of the human interface.
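The byte arithmetic above can be checked in a couple of lines (a
minimal Python sketch):

```python
# An unsigned byte holds 2**8 = 256 distinct values, and because the
# all-zero bit pattern is one of them, the range runs 0..255, not 1..256.
BITS = 8
values = list(range(2 ** BITS))   # every representable unsigned value
```

The "sneaky" zero is simply the all-zero bit pattern, which is why the
count reaches 256 values while the largest one is only 255.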
If you write something as 1-based it still has to get translated to what
the machine actually does. While certain abstractions have benefits,
there's a point at which one may as well recognize that it's better to
fit to the implementation than fight it.
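The translation dpb describes amounts to the address arithmetic a
compiler emits for an element access; the base address and element size
below are hypothetical, picked only to make the sketch concrete:

```python
# Address of an array element, in the general (arbitrary-lower-bound)
# form a compiler must generate:
#     address = base + (index - lbound) * element_size
# For a zero-based array the "- lbound" term vanishes; a 1-based array
# either pays that extra subtraction or relies on the compiler folding
# it into the base address.
def element_address(base, index, lbound, element_size):
    return base + (index - lbound) * element_size

# Hypothetical layout: 4-byte integers starting at address 1000.
addr_zero_based = element_address(1000, 0, 0, 4)  # first element
addr_one_based = element_address(1000, 1, 1, 4)   # same element, 1-based
```

Either way the machine ends up computing a zero-based offset from the
base address, which is the sense in which the hardware itself is
0-based.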
The Standards argument is also a powerful one not to be minimized.
From: Eduardo on 16 Oct 2009 14:11
> Well, if you are interested in standards, yes.
But AFAIK there is still no "standard for the lower bound of arrays in
high-level programming languages".
It's just common usage for it to be 0, but not an established standard.
From the machine's point of view, 0-based is favored, but from a
high-level point of view it's not so clear.
Unfortunately, I think that many of the people who make language-design
decisions tend too much to think from the machine's point of view.
That's also why we now have 0-based as the "standard".