From: Brooks Moses on
Gary Scott wrote:
> Brooks Moses wrote:
>>I think Fortran's other big -- and largely unsung -- strength is its
>>development model. As modern languages go, it's a little behind the
>>state of the art, and will probably continue to be so (and I think this
>>is probably a good thing). But it's also backward compatible for three
>>decades. Thus, if I'm starting a new project today, and I expect that
>>in three decades I'll want to be using large parts of it in stuff that's
>>programmed with a relatively modern language three decades hence, I
>>think there's really only one clear choice. C++ will be quite old by
>>then, C will remain a painfully low-level language, and who knows where
>>today's popular things will be. Fortran will still be a decade behind
>>being up-to-date, and will be backward compatible.
>
> You don't think that F2003 is a substantial catch-up? I can only
> think of a small number of improvements to increase its broad appeal.

I'll consider F2003 relevant to that timeline when we have a compiler,
so give it a year or so. :) More relevantly -- yes, it's a substantial
catch-up, but still most of the features in F2003 have been around for
at least a decade. Certainly the biggest features of the "object
oriented" parts are nothing new. Currently, F2003 seems roughly as
modern as Java, give or take a little, and that's a decade old.

There are plenty of new features in programming languages that have been
invented more recently than that. Some of them are good ideas, many
more of them aren't. A few of the good ideas will also prove to be very
useful, and will catch on.

So, in my opinion, about a decade behind "state of the art" is about the
right place to be, for a long-lived language. I think it's about the
point where it's clear what new ideas are worth adopting, and ideally
it's also clear what mistakes were made in the first languages to use
them, and how to avoid those mistakes. And I think the fact that right
now you can only think of a small number of improvements that would
increase the appeal to most programmers is a sign that it's about the
right place to be, too.

- Brooks


--
The "bmoses-nospam" address is valid; no unmunging needed.
From: Steve O'Hara-Smith on
On Mon, 30 Oct 06 12:17:52 GMT
jmfbahciv(a)aol.com wrote:

> In article <m3vem36n18.fsf(a)garlic.com>,
> Anne & Lynn Wheeler <lynn(a)garlic.com> wrote:

> >one of the other things left over from the os/360 real memory days is
> >figuring out where the program image was to be loaded ... and then
> >having to swizzle all the "relocatable" address constants that were
> >frequently randomly distributed thru-out the program image.
>
> Yes. Is the word "relocatable" even used in CS classes anymore
> other than as a buzzword to sound impressive?

I strongly suspect that if it is mentioned it is mentioned as
something historical and baroque which has been rendered obsolete by the
more modern approaches of memory mapping and position independent code.

Cue many people to tell us where relocation is still in active
use and why it's better than PIC or memory mapping :)

--
C:>WIN | Directable Mirror Arrays
The computer obeys and wins. | A better way to focus the sun
You lose and Bill collects. | licences available see
| http://www.sohara.org/
From: Steve O'Hara-Smith on
On Mon, 30 Oct 2006 15:41:27 -0600
Charles Richmond <richchas(a)comcast.net> wrote:

> I saw a documentary about Stephen Hawking from 15 or so years
> ago, and his folks were using Pascal. I also understand that
> Algol used to be *very* popular in Europe for scientific
> programming. AFAIK, FORTRAN was *not* so popular in Europe as
> it was in the US.

Hmm they might have been using Pascal and Algol a lot in DAMTP
(where Hawking has his office) but down the road in the Cav FORTRAN ruled
the roost around 1980 - although there were many people who sang the
virtues of Algol and did their work in FORTRAN. This may point up a
difference between theoretical physicists (computing mostly concerned with
building models) and the experimentalists (computing mostly concerned with
crunching numbers to check a hypothesis).

From: Joe Morris on
Charles Richmond <richchas(a)comcast.net> writes:

>I knew several engineering graduate students in the late 70's,
>and understand that programming work for their theses *had* to
>be done in FORTRAN. The thesis would be rejected if the software
>was done in another language.

Ouch. What was the justification for that policy? And at what university?

Assuming that FORTRAN was not the *subject* of the thesis, I can't
see any reason why "the most appropriate tool available" would not
have been the test. Or *was* that the issue -- discouraging the
use of cute alternatives which at that particular university were
inferior *for the specific thesis topics*? (e.g., using COBOL
to do scientific calculations)

Joe Morris
From: jmfbahciv on
In article <12jhsjqqnja4q72(a)corp.supernews.com>,
Larry__Weiss <lfw(a)airmail.net> wrote:
>jmfbahciv(a)aol.com wrote:
>> I do not consider 128-character variable names an improved solution.
>> It's a huge mess maker.
>>
>
>I have to agree. The problems I have reading code these days
>involve either sets of those very long names where the difference is
>subtle, -or- a common simple name that applies to each of a set of
>dissimilar objects, and the binding to a particular object is
>several dozen lines back from the reference to that common name.

What you are doing is spending all your code-reading, thus
code-thinking, time proofreading. That's why I cut the
nine-edge off the code somebody displayed here.

One of the things our naming standards let us do is
immediately...well, theoretically immediately...eyeball the
bad instruction that is using a field name instead of a bit
name for a bit test.

We did that by the position of a period or percent sign within
the symbol name.

<snip--I don't want to know ;-)>

/BAH