From: David Brown on
Lorenzo Villari wrote:
> On Tue, 12 Jan 2010 23:44:08 +0100
> David Brown <david.brown(a)hesbynett.removethisbit.no> wrote:
>
>> Thirdly, rather than having a compiler try to guess what algorithm
>> you are trying to implement, it would be better to have more
>> algorithms implemented in the libraries and preferably the language
>> itself in a way that is natural for the programmer. For example, if
>> the language has support for real arrays with functions such as
>> sorting, the programmer can work with an array and let the compiler
>> figure out the best way to implement it.
>>
>
> 1) Coding a custom algorithm is not natural to the programmer?

Coding efficient sort algorithms (since this is the example mentioned)
is not natural to most programmers. "Low-level" algorithms for data
structures, data manipulation and calculations are not something that
many programmers are good at - doing them well requires a more
mathematical background. Application algorithms are a different matter,
of course.

> 2) What's a "real" array?
>

I mean something more than syntactic sugar for a pointer and a block of
memory (which is C's concept of arrays). Some languages have direct
support for higher-level data structures such as arrays that know their
size and have useful methods, as well as dictionaries, trees, etc. My point is,
when the language and compiler support these sorts of structures, the
compiler has a better chance to generate good code than when all it has
is low-level pointer access, and it has to guess what you are doing with
the array.
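
To make the distinction concrete, here is a small sketch (the function
names and data are invented for the illustration) contrasting C's
pointer-plus-length view of an array with a C++ container that carries
its own size:

#include <cstddef>
#include <cstdio>
#include <numeric>
#include <vector>

/* C's view: just an address and a separately supplied count that the
   compiler (and the callee) has to take on trust. */
long sum_c(const int *p, std::size_t n)
{
    long s = 0;
    for (std::size_t i = 0; i < n; ++i)
        s += p[i];
    return s;
}

/* A "real" array: the container knows its own size and bounds, so the
   library and the compiler have more information to work with. */
long sum_cpp(const std::vector<int> &v)
{
    return std::accumulate(v.begin(), v.end(), 0L);
}

int main()
{
    std::vector<int> v = { 1, 2, 3, 4 };
    std::printf("%ld %ld\n", sum_c(v.data(), v.size()), sum_cpp(v));
}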

>> With C, the programmer will
>> either write their own bubble sort, or call qsort - with all its
>> overheads.
>
> Which overhead?
>

The overhead of calling the comparison function indirectly through a
pointer, and of using element sizes that are only known at run time.
Have you ever looked at the source code for a typical standard C
library qsort() routine? As with many "generic" functions in C, it is
far less efficient than if you write a dedicated quicksort function
with a known element size and comparison function.
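
To make that concrete, here is a rough sketch of a typical qsort() call
(the element type, data and helper name are just made up for the
illustration). Every comparison goes through the function pointer, and
the library has to move elements around using the run-time size:

#include <stdio.h>
#include <stdlib.h>

/* Comparison callback: qsort() only ever sees void pointers, so every
   comparison is an indirect call through this function pointer. */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int data[] = { 42, 7, 19, 3, 23 };
    size_t n = sizeof data / sizeof data[0];

    /* The element size is a run-time parameter, so the library has to
       copy and swap elements as anonymous blocks of bytes. */
    qsort(data, n, sizeof data[0], cmp_int);

    for (size_t i = 0; i < n; i++)
        printf("%d ", data[i]);
    printf("\n");
    return 0;
}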

>> This is why programs written in interpreted languages
>> such as Python can often be faster than compiled C code for code
>> working with complex data structures, hash tables, sorting, list
>> manipulation, etc. Of course the C code /could/ be faster - but
>> people don't write optimal algorithms in most code. With the
>> libraries and language support in Python, you get very good
>> algorithms for free.
>>
>
> If you can stand the formatting of python... ouch! It's the same case

Python formatting is a matter of taste. At least there you are forced
to follow a reasonably standard layout - C gives people the freedom to
write the most hideous code (though it also lets you write quite nice code).

> of custom code vs standard code, or custom C container vs STL C++
> container. One asks oneself why the C++ people agreed to have a
> standard way of doing this while the C counterpart will never agree...
>

C++ templates let you write something like a generic qsort() function
that will generate good target code.
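
For comparison, here is a rough sketch of the same kind of sort with
std::sort (data again invented for the illustration). The comparison is
visible to the compiler at the call site, so it can be inlined, and the
swaps are done on the real element type rather than on anonymous bytes:

#include <algorithm>
#include <cstdio>
#include <iterator>

int main()
{
    int data[] = { 42, 7, 19, 3, 23 };

    /* std::sort is instantiated for int; the lambda comparator is an
       ordinary function the compiler can inline. */
    std::sort(std::begin(data), std::end(data),
              [](int a, int b) { return a < b; });

    for (int v : data)
        std::printf("%d ", v);
    std::printf("\n");
}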
From: Måns Rullgård on
karthikbalaguru <karthikbalaguru79(a)gmail.com> writes:

> On Jan 15, 3:26 am, Hans-Bernhard Bröker <HBBroe...(a)t-online.de>
> wrote:
>> David Brown wrote:
>> > Nothing stops the compiler from doing this sort of thing in /theory/.
>> > But /practice/ is a different matter.
>>
>> The same thing applies to the original question. If a compiler's
>> full-blown optimizer, given practically infinite time to ponder the
>> problem, can't get that analysis job done, then no other tool can, and
>> certainly not while just looking over the programmers' shoulders as they
>> type.
>
> I think the tool should be trained to develop small pieces of logic
> by being given small pieces of information. I think the tool should
> also be trained to develop its own decision-making capabilities by
> being given lots of small tasks that initially require small
> decisions. The above should make it robust in finding alternative
> logic. Maybe it can also bank on reliable sources on the internet or
> other servers to get more information in case it runs out of steam.
> But I think it is better to avoid internet dependencies, as sometimes
> we might need to use a PC that is not connected to the internet!

There's a name for that: outsourced engineer.

--
Måns Rullgård
mans(a)mansr.com
From: David Brown on
Andy wrote:
> On Jan 14, 9:57 am, David Brown <da...(a)westcontrol.removethisbit.com>
> wrote:
>> I don't agree here (perhaps as a compiler writer you are thinking of
>> "implementation" in terms of generated target code - then I'd agree).
>> Kids use Logo to learn about programming concepts, and how to get the
>> computer to do what you want it to do. They learn to write things like:
>>
>> TO SQUARE :size
>> REPEAT 4 [ FD :size RT 90 ]
>> END
>>
>> This is entirely about writing an imperative implementation of how you
>> want the system to draw a square.
>>
>> Compare this to a sample program in a real functional programming
>> language, Haskell:
>>
>> factorial 0 = 1
>> factorial n = n * factorial(n - 1)
>>
>> Here you tell the system what you want - you give a mathematical
>> description of the results. You don't care how it gets them - maybe it
>> uses recursion, maybe it uses iteration, maybe it builds up a cache of
>> calculated values as it goes along.
>>
>
> The LOGO interpreter/compiler is just as free to implement alternative
> solutions to drawing a square as the Haskell compiler is to alter
> the described recursive implementation of a factorial. Whether the
> compiler is smart enough to do so has nothing to do with the language
> being "procedural" or "functional".
>

It is certainly true that the compiler is free to implement alternative
solutions in any language. However, the style of language has a great
deal to say about how hard a task this is for the compiler writer. In
particular, with a language that does not have mutable state or allow
side effects in functions, it is much easier for the compiler to have a
complete view of what the function does - and thus it has a better
chance of generating alternative implementations.
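
A rough C++ analogy to that point (the example is mine, and has nothing
to do with Haskell as such): a constexpr function must be evaluable
without side effects, so when its arguments are known the compiler is
free to compute the result entirely at compile time, or to restructure
the recursion however it likes:

#include <cstdio>

/* A pure function: no state, no side effects, so the compiler has a
   complete view of what it does. */
constexpr unsigned long factorial(unsigned n)
{
    return n == 0 ? 1UL : n * factorial(n - 1);
}

int main()
{
    /* The whole recursion can be done by the compiler; the program
       just prints a constant. */
    constexpr unsigned long f10 = factorial(10);
    static_assert(f10 == 3628800UL, "evaluated at compile time");
    std::printf("%lu\n", f10);
}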

You can think of compiling (or interpreting - it's the same thing in
theory) as taking a high level source code down to a low level
implementation in a series of steps. At each step down, the compiler
can pick one of many implementations at the lower level. It can also
re-arrange the code at the same level. Moving up a level from an
implementation is a much harder task, and will rarely lead to a better
result in the end - too much of the "overview" information is already
lost, and there are too many implementation details that cannot be
distinguished from requirements. Thus an ideal compiler has the best
chance of generating the best code if the source is at a higher level,
and if it is written in a language that gives a clearer and more
mathematically precise specification.

Of course, practical reality does not match directly with the theory.
In particular, vastly more man-hours have been spent to write better
compilers for C than for Haskell - the result being that you can compile
C code to tighter target code than you can with Haskell code.
From: Rainer Weikusat on
Nick Keighley <nick_keighley_nospam(a)hotmail.com> writes:
> On 13 Jan, 16:43, dj3va...(a)csclub.uwaterloo.ca.invalid wrote:
>> In article <4b4def88$0$22938$e4fe5...(a)news.xs4all.nl>,
>> [Jongware] <so...(a)no.spam.net> wrote:
>> >Walter Banks wrote:
>
>> >> Defining goals at a much higher level than C opens the possibilities
>> >> for automating algorithmic choices at the function level.
>>
>> >Aha -- wouldn't the logical end point be a programming language where
>> >you type "word processor", save it as source, compile, and have a word
>> >processor?
>>
>> Why bother to compile it? Just have it interpret on-the-fly.
>> That way you could even run it in interactive mode, and it's
>> sufficiently high-level that even non-programmers could usefully use
>> it.
>>
>> Unix people call this a "shell".
>
> I'm guessing you're trying to be funny/ironic. But in case you aren't,
> Unix has dozens of strangely incompatible Command Line Interfaces
> that Unix people call "shells". None of them are word processors.

UNIX(*) has a single type of 'interactive command processor/ simple
scripting language' and its features are described by an IEEE
standard. Your observation that lots of different people and
organizations have written application software for UNIX(*), and that
these different applications differ, is nevertheless correct.

Pearls of wisdom ...
From: Richard Tobin on
In article <87ska5ezlg.fsf(a)fever.mssgmbh.com>,
Rainer Weikusat <rweikusat(a)mssgmbh.com> wrote:

>UNIX(*) has a single type of 'interactive command processor/ simple
>scripting language' and its features are described by an IEEE
>standard.

This is pedantry of the most pointless kind. You're welcome to
your "UNIX(*)", but don't pretend that your comments have anything
to do with the real world.

-- Richard
--
Please remember to mention me / in tapes you leave behind.