From: Vincent LAFAGE on
Nasser M. Abbasi wrote:
> I was browsing the net for scientific software written in Ada, and came
> across this strange statement:
>
> http://farside.ph.utexas.edu/teaching/329/lectures/node7.html
>
> "Scientific programming languages
> What is the best high-level language to use for scientific programming?
> This, unfortunately, is a highly contentious question. Over the years,
> literally hundreds of high-level languages have been developed. However, few
> have stood the test of time. Many languages (e.g., Algol, Pascal, Haskell)
> can be dismissed as ephemeral computer science fads. Others (e.g., Cobol,
> Lisp, Ada) are too specialized to adapt for scientific use.
>
> ......
>
> The remaining options are FORTRAN 77 and C. I have chosen to use C "
>
> I find this strange, because I think Ada can be the best programming
> language for numerical work. So, I do not know why the author above thinks
> Ada is "too specialized to adapt for scientific use". Is there something in
> Ada which makes it hard to use for scientific programming?
>
> The main problem I see with Ada for scientific use is that it does not have
> as nearly as many packages and functions ready to use output of the box for
> this, other than that, the language itself I think is better than Fortran
> and C for scientific work.
>
> (the above quote is from a course on Computational Physics at University of
> Texas at Austin, may be I should write to the professor and ask him why he
> said that, but I am not sure I'll get an answer, my experience is that most
> professors do not answer email :)
>
> --Nasser

just my 2 (numerical) eurocents

I developed some Fortran (77) code during my PhD (17 years ago).
It was a basic Monte Carlo code simulating electron-positron
collisions.
1001 SLOC (physical Source Lines Of Code, measured with sloccount)
Not big; most of the work went into chiseling the analytical low-level
expressions so that the numerical approach would be exploited to the full.
No dedicated simplification or hardware-related optimisation beyond what
the compiler provides with -O3.
Not much software engineering: only Fortran 77 with COMMON blocks
thought of as objects, and a minimally modular approach.
The result was fast, but it would be arrogant boasting to describe how
much ;-)

Time passed and then OpenMP appeared. I wanted to test my old pal, the
reference code, on my brand new two-core CPU, without much effort, I thought.
To ease the move to parallel code, and to learn more about the now
unavoidable Fortran 90 (which simply copied the easiest parts of Ada 83),
I first translated my code to F90 (rather than simply compiling the same
code with an F90 compiler).
I turned the COMMON blocks into modules (a.k.a. packages), each with a
type inside, and according to the famous equation:
module + type = class
the result was an object-oriented code ;-) .
1150 SLOC
Time-wise, I was paying only a 20% abstraction penalty with the same
compiler. Not bad.
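The "module + type = class" refactoring he describes can be sketched in C++ terms (the names Beam, energy and scaled_energy are hypothetical, purely for illustration; his actual code is not shown):

```cpp
#include <cstddef>

// Old style: shared mutable globals, the moral equivalent of a
// Fortran 77 COMMON block -- one implicit set of state per program.
namespace beam_common {
    double energy = 0.0;   // beam energy
    double angle  = 0.0;   // scattering angle
}

// "module + type = class": the same state gathered into a type, so
// several independent instances can coexist and data flow is explicit.
struct Beam {
    double energy = 0.0;
    double angle  = 0.0;

    // behaviour travels with the data it touches
    double scaled_energy(double factor) const { return energy * factor; }
};
```

The point is not the language but the pattern: once the COMMON-style globals become fields of a type, the compiler, not convention, enforces which routine touches which state.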

The objectification process was also carried out in C++ at the same
time. A quite painful translation indeed, but C++ was my way to earn
money at that time...
1259 SLOC.
Not only was the translation painful, but the compiled program was
flawed with memory faults that I had to debug. In the end, I finally had
the same accuracy as Fortran 77, but...
The result looks like a good start for a flame war: C++ was 7.5 times
slower than Fortran 77. With the same compiler family (g++ versus
g77/gfortran).
I do not think it is so significant in fact, as my algorithm relies
heavily on complex numbers, which are well integrated into Fortran but
pay the full abstraction penalty in C++... The proper way to do it in
C++ would be to use libraries such as Boost or Blitz.
We can do it, but then do we really want to go through the hassle of
extra layers?
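For readers unfamiliar with the trade-off: a minimal sketch (a hypothetical kernel, not the original Monte Carlo code) of the kind of complex-heavy inner loop in question, using the standard std::complex type that such benchmarks typically exercise:

```cpp
#include <complex>
#include <vector>

// Hypothetical kernel: sum the squared moduli |a_k|^2 of a set of
// complex amplitudes -- the kind of operation that dominates a
// matrix-element computation in a particle-physics Monte Carlo.
double total_probability(const std::vector<std::complex<double>>& amps) {
    double sum = 0.0;
    for (const auto& a : amps)
        sum += std::norm(a);   // std::norm(z) returns |z|^2, no sqrt taken
    return sum;
}
```

Whether such a loop pays an abstraction penalty depends entirely on how well the compiler inlines and vectorizes the std::complex operators, which is exactly the variability being discussed.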

My program still wasn't parallelized, and Ada became the next candidate
for the test.
1120 SLOC.
The translation was swift, and once the compiler finally let me go, the
code ran (no debugging needed), delivering the same accuracy as
Fortran 77 and only a factor of 2 in speed loss (compared to 7.5 for
C++...). With GNAT (to stay in the same compiler family).
Given that complex numbers are not hard-wired into Ada, they should pay
the abstraction price, yet they keep that price low.

Later, I could capitalize on tasks and protected objects to
parallelize my program in Ada, something that is still not done in the
Fortran version... But that is another story.
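His Ada tasking code is not shown, but the shape of the parallelization, in C++ terms, would be roughly the following (a hypothetical sketch using std::async, not his method; parallel_sum and n_tasks are invented names):

```cpp
#include <algorithm>
#include <future>
#include <numeric>
#include <vector>

// Split a Monte Carlo accumulation across cores: each task sums its
// own slice of the samples, and the partial results are combined.
double parallel_sum(const std::vector<double>& samples, unsigned n_tasks) {
    std::vector<std::future<double>> parts;
    const std::size_t chunk = samples.size() / n_tasks + 1;
    for (unsigned t = 0; t < n_tasks; ++t) {
        const std::size_t lo = std::min<std::size_t>(t * chunk, samples.size());
        const std::size_t hi = std::min<std::size_t>(lo + chunk, samples.size());
        parts.push_back(std::async(std::launch::async, [&samples, lo, hi] {
            return std::accumulate(samples.begin() + lo,
                                   samples.begin() + hi, 0.0);
        }));
    }
    double total = 0.0;
    for (auto& p : parts)
        total += p.get();   // join each worker and combine its result
    return total;
}
```

Ada's tasks and protected objects express the same decomposition directly in the language, which is the advantage being claimed here.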

I plan to do it in Java as well, but I expect much the same result as
for C++: it will not test the strong points of the language, but rather
probe its weakest point: complex numbers.

So, as far as performance is concerned, Fortran was, IN THIS CASE, the
winner. Ada was a strong contender.
I will not elaborate too much on the ease of translation: when you
have worked for many months with such a short program, and converted it
to two other languages, the third translation cannot be that hard.

I believe that if I had to translate (or even write) other, more
significant codes, I would use Ada's ability to interface with
other languages: I would keep my low-level routines in fast Fortran and
have the general flow of the computation driven by Ada, to make the
whole picture clearer and more efficient.
In the meantime, I try to code more in Fortran 90, which is a close
nephew of Ada 83.

* I did the same kind of multi-language port, for my own education, with
a quasi-random number generator, starting from a C++ version,
and I have not yet found a way as elegant and memory-efficient to
store an upper triangular matrix as in C++.
* I also miss some syntactic sugar of Fortran that allows one to refer
to row I of a matrix M as the vector M(I, *).
=> If anyone would like to discuss these points, I will be glad to.
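The C++ idiom he is likely alluding to for the triangular matrix is packed row-major storage; a minimal sketch (the class name and layout are assumptions, since his code is not shown):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Packed storage for an n x n upper-triangular matrix: only the
// n*(n+1)/2 entries with j >= i are kept, row by row, in one vector,
// instead of n*n entries with half of them wasted.
class UpperTriangular {
public:
    explicit UpperTriangular(std::size_t n)
        : n_(n), data_(n * (n + 1) / 2, 0.0) {}

    // Row i (0-based) starts after the i previous rows, whose lengths
    // are n, n-1, ..., n-i+1, so its offset is i*n - i*(i-1)/2.
    double& at(std::size_t i, std::size_t j) {
        assert(i <= j && j < n_);
        return data_[i * n_ - i * (i - 1) / 2 + (j - i)];
    }

private:
    std::size_t n_;
    std::vector<double> data_;
};
```

The same packed layout exists in Fortran via LAPACK's "packed" matrix convention, but there it is a library convention rather than an encapsulated type.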

Best regards,
Vincent

PS: as far as language crusades go, I believe that the problem with
Fortran is "Fortran users", who are mostly not computer scientists. I
hear the Ada fans boast "Ada doesn't make assumptions about the
programmer: he can be uneducated, but the compiler will save him", but I
strongly doubt the actual population of Ada coders is AS uneducated (or
software-engineering-unconscious) as the actual population of Fortran
coders (averaged over the last 50 years).
From: Shmuel Metz on
In <4bbb5386$0$56422$c30e37c6(a)exi-reader.telstra.net>, on 04/07/2010
at 01:30 AM, "robin" <robin51(a)dodo.com.au> said:

>Because you cut the sentence and the one before it,
>you lost the significance.

No. You still don't get the significance of what you replied to.

>Restoring it, we have:

Bupkis.

>"| Dismissing Algol as ephemeral ignores its influence and continuing
>usage "| as a base of pseudo-codes. Important numerical libraries were
>first "| implemented in ALgol,

Read it carefully this time and note what words it doesn't contain.

>"No, they were first implemented in machine code,
>"and later rewritten in Algol and FORTRAN."
>you can see that it is patently obvious that "they" refers
>to "Important Numerical libraries".

Then does "No" also refer to them? Because that "No" is dead wrong.

>You will also realize that it's referring to important ones,

Who decides what's important? Do you believe that no important algorithms
were written in the late 1950's, the 1960's and the 1970's?

>and that it's disputing the claim that such libraries were first
>implemented in Algol.

Yes, because you're confusing existential quantifiers with universal
quantifiers.

>Restoring the immediately following sentence that you also cut out,

Because it was irrelevant.

>we see that I said:
> "The numerical procedures of the General Interpretive Programme
> "were written in machine code, from 1955."

Which has nothing to do with the point in dispute.


>Had you actually read my post,

ROTF,LMAO. Too bad you didn't read your own post before replying to mine.

>you would have seen that I gave reference to a important numerical
>library.

Strangely enough, I also noticed that it was a library, not an algorithm.
I also noticed that the algorithms in it were not the only algorithms ever
to be developed.

>Come to think of any numerical algorithm developed before Algol, you may
>have heard of J. H. Wilkinson's work on numerical algorithms, for which
>he wrote machine code from 1947.

Algorithms that were developed on dead trees. Translations of existing
algorithms are not what is in dispute.

--
Shmuel (Seymour J.) Metz, SysProg and JOAT <http://patriot.net/~shmuel>

Unsolicited bulk E-mail subject to legal action. I reserve the
right to publicly post or ridicule any abusive E-mail. Reply to
domain Patriot dot net user shmuel+news to contact me. Do not
reply to spamtrap(a)library.lspace.org

From: Shmuel Metz on
In <m2tyrnjc5f.fsf(a)pushface.org>, on 04/07/2010
at 08:27 PM, Simon Wright <simon(a)pushface.org> said:

>Wasn't Ada Augusta's first program an algorithm to compute Fibonacci
>numbers? That would certainly have been in machine code.

But was it a new algorithm, or merely a transcription of an algorithm that
she already knew? And, more important, do you know for a fact that *Robin*
knew about it? Note carefully what I asked and what I didn't ask.


From: Georg Bauhaus on
Vincent LAFAGE schrieb:

> Given that complex numbers are not hard-wired into Ada, they should pay
> the abstraction price, yet they keep that price low.

An example of Ada's complex type (a record) being less efficient
than the alternative can be seen in the following Mandelbrot programs;
the difference between the two Ada entries is partly caused by
one of them using type Complex:

http://shootout.alioth.debian.org/u64q/performance.php?test=mandelbrot

I guess that (some) computations involving objects of type Complex will
be faster when compilers generate SSE instructions for complex arithmetic.
From: Colin Paul Gloster on
On Mon, 5 Apr 2010, Georg Bauhaus posted:

|---------------------------------------------------------------------|
|"[..] |
| |
|As Dmitry Kazakov has recently said, when Ada run-time systems |
|starts addressing the properties of multicore hardware |
|there is hope that it could really shine: Not just because concurrent|
|sequential processes are so simple to express using Ada tasks |
|---and you'd be using only language, not a mix of libraries, |
|preprocessors, specialized compilers, programming conventions, |
|etc. But also in case the fine grained complexity of OpenMP 3.0 |
|can be bridled by simple language and a good run-time system. |
|At little cost." |
|---------------------------------------------------------------------|

I met someone today who described himself as "an ordinary FORTRAN
programmer" who advocated C for the practical reason that libraries
are designed for C. He claimed that small tasks are good for multicore
and large tasks are good for GPUs.