From: Chris Gray on
"Andy \"Krazy\" Glew" <ag-news(a)patten-glew.net> writes:

> My only difference is that I hope that it may be possible to define a
> meta-language, in which many of the new notations can be expressed.

Something like this *might* be practical. See below.

> Do we really need a new language that defines yet another form of
> IF-THEN-ELSE syntax? Yet another form of blocking and scope?
> (E.g. yesterday I learned that Javascript's curly brace { } does not
> have the same scoping significance as in other languages.)

My answer here is a big NO. The proliferation of special-purpose languages
(many of which start small but soon expand horribly) is costing the industry
a lot of overhead in terms of learning time, maintenance time, etc. A single
consistent framework with stuff embedded in it would be much better, IMHO.

> A big problem with everyone defining their own new language, whether
> translated to C or the Java bytecodes or whatever, is that if two
> people A and B have their own new languages, it is often hard to write
> a program that uses both notations A and B in the same code. If A and
> B are completely mutually exclusive, you may have to serialize the
> data (maybe all the way to XML) to pass it from the A module to the B
> module.

> I just hope (and would love to have the chance to work on) that we
> could create a language framework, a metalanguage, so that the same
> programmer can intermix A and B notations at as fine grain a level as
> possible.
> Ideally within the same {block}. Possibly in different
> functions or classes. Possibly in different files or compilation
> units.

At the level of a compiler, there are some problems. One has to do with the
basic parsing of such code. A mini-language will typically introduce new
keywords to the language. What if there are conflicts between a pair of
mini-languages created by different people? How does the compiler parse
something that contains syntactic extensions from multiple sources? There are
solutions to problems like this, but they require care to define. My initial
thought is that a "package" containing a bunch of type definitions,
declarations, code, etc. should indicate what mini-languages it uses. That
should let the compiler handle it properly. Trying to do this kind of thing
at a finer level could easily produce an unmaintainable mess.
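That per-package declaration could be sketched roughly as follows (a minimal Python sketch, not Zed; the package registry, the `uses` list, and all names here are invented for illustration):

```python
# Sketch: each package declares which mini-languages it uses, so the
# compiler can activate only those keyword sets while parsing it and
# detect clashes up front. All names are invented for illustration.

MINI_LANGUAGES = {
    "Lists": {"Iterate"},          # keywords exported by the Lists package
    "Matrix": {"Iterate", "Sum"},  # a hypothetical conflicting package
}

def active_keywords(package_uses):
    """Collect the keywords visible inside one package, flagging clashes."""
    seen = {}
    for lang in package_uses:
        for kw in MINI_LANGUAGES[lang]:
            if kw in seen and seen[kw] != lang:
                # Conflict: force qualified names like Lists/Iterate.
                raise ValueError(
                    f"keyword {kw!r} defined by both {seen[kw]} and {lang}; "
                    f"use a qualified name")
            seen[kw] = lang
    return seen

# A package that uses only "Lists" parses cleanly...
print(active_keywords(["Lists"]))
# ...while one that pulls in both must qualify the clash.
try:
    active_keywords(["Lists", "Matrix"])
except ValueError as e:
    print(e)
```

The point of keeping the declaration at package granularity, as suggested above, is that a reader of any one file can see exactly which keyword sets are in play.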

My current project, the "Zed" system, actually provides a pretty good
framework to try out something like this. I'll add notes to my immense
"to think about" file. Currently, my extensibility is limited to things
like this (some details skipped):

struct Pair_t {
    uint p_left, p_right;
};

instance PairDList = Lists/DList(Pair_t);
type PairDListHead_t = PairDList.DHead_t;
type PairDList_t = PairDList.DList_t;

PairDListHead_t Pdlh;

proc
iterate1()void:
    Lists/Iterate("dl", Pdlh) begin
        Fmt/FmtL("first scan (", dl.p_left, ", ", dl.p_right, ')');
    end;
corp;

Package "Lists" could have been written by anyone. The point here is that it
exports a "construct" called "Iterate", which can iterate over any kind of
list. It runs at compile time and expands itself and its body into the
appropriate code. (Note that "FmtL" is another variant of compile-time proc
that replaces its call with calls to run-time routines to do output
formatting.) This gives a small bit of what Andy is looking for, but the
syntax is not very nice. One of the reasons I've been this restrictive so far
is my concern for the readability and maintainability of code. I *could* let
some kind of "construct" proc do its own parsing, but the possibilities for
badness are endless. Such a proc could essentially take over the handling of
much of the program, resulting in the program doing something quite different
from what it looks like it does. Restricting who can produce official
mini-languages might be an answer.
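The expansion step can be mimicked at run time in Python (a rough analogy only; Zed's construct procs run at compile time, and every name below is invented for illustration):

```python
# Rough run-time analogy of a compile-time "construct": iterate takes the
# head of a linked list and a body, and plays the role of the code that
# Lists/Iterate would expand inline. All names are invented.

class DNode:
    def __init__(self, left, right):
        self.p_left, self.p_right = left, right
        self.next = None

def iterate(head, body):
    """Walk the list, invoking the body once per node, the way the
    expanded Lists/Iterate loop would."""
    node = head
    while node is not None:
        body(node)
        node = node.next

# Build a two-element list and run the "expanded" body.
a, b = DNode(1, 2), DNode(3, 4)
a.next = b
iterate(a, lambda dl: print(f"first scan ({dl.p_left}, {dl.p_right})"))
```

The crucial difference is that a real construct proc does this rewriting before run time, so the body is inlined with no closure or call overhead.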

> Possibly communicating only via least-common-denominator
> datastructures common to both A and B in the base language. (E.g. no B
> datastructures accessible to A, or vice versa.) But possibly,
> occasionally, expressing the datastructures of B in a form accessible
> to the base language, so that A can manipulate them.

If this is all done within the same base language, then structures should
be universally shareable. There may be variations in the ways in which the
accesses are done.

--
Experience should guide us, not rule us.

Chris Gray cg(a)GraySage.COM
From: Robert Myers on
On Feb 20, 12:35 pm, "Andy \"Krazy\" Glew" <ag-n...(a)patten-glew.net>
wrote:

>
> The best reason to talk about supercomputer-friendly features inside CPUs and GPUs is if you have reason to expect that
> the features may be useful in commercial, consumer, etc, markets.

That just about says it all.

1. The people who buy and pay for "supercomputers" have little
understanding of the work-a-day world of the people who might want to
use them. They have even less understanding of where the science is.

2. No one designs supercomputers any more. People design networks,
where, no matter the rhetoric, the burden is on the work-a-day user to
worry about where things *actually* are, and where cache (and its
associated mindlessly-repeated truisms) has left its costly, indelible
mark.

Given those ugly realities, I think it's time for the US taxpayer to
get out of the business of paying for "supercomputers," which are no
more a supercomputer than this desktop (Core i7 920).

Robert.

From: Morten Reistad on
In article <4B8021BB.4070401(a)patten-glew.net>,
Andy \"Krazy\" Glew <ag-news(a)patten-glew.net> wrote:
>Morten Reistad wrote:
>> In article <0384a40d$0$1344$c3e8da3(a)news.astraweb.com>,
>> Nicholas King <ze(a)zerandconsulting.com> wrote:
>>> On 02/10/2010 01:31 PM, Robert Myers wrote:
>

>> But it is the only path I see to contain complexity.
>>
>> -- mrr
>
>I mainly agree with Morten on this. Invent a new notation that
>simplifies thinking about the problem, and use that
>notation directly to express the solution as a program.
>
>Even better if somebody else can make the implementation of the idioms
>of the new notation faster, better.
>
>My only difference is that I hope that it may be possible to define a
>meta-language, in which many of the new notations
>can be expressed.

I am not certain that this is a solvable problem. I would be happy
if we could make significant parts of the language toolchain common.

I have arrived here after making a few of these languages in the
completely wrong manner, as "config files" that have outgrown all
sense. Mostly it has been by adding macro processors, lexical sugar,
simple program constructs etc.

The most difficult part is not the language syntax. We have lots of
tools for that. The real challenge is the declarative powers of
such languages. The imperative programming model that sits so powerfully
with programmers (I think that is one of the lures of programming)
is under challenge here.

A telephony dial plan, an accounting plan, a permissions rule set are
all very declarative in nature, and you would want a language to handle
that directly, and have hooks for "onunits" to catch events.
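A declarative rule set with event hooks might look like this (a minimal Python sketch; the dial-plan format, the "onunit" registration, and all names are invented for illustration):

```python
# Sketch of a declarative rule set with event hooks ("onunits").
# The rules say WHAT routes exist; only the tiny engine below is
# imperative. All names are invented for illustration.

DIAL_PLAN = [
    ("112", "route:emergency"),     # exact prefix, declaratively stated
    ("0",   "route:operator"),
    ("00",  "route:international"),
]

ON_UNITS = {}  # event name -> handler, registered declaratively

def on(event):
    def register(handler):
        ON_UNITS[event] = handler
        return handler
    return register

@on("no_match")
def unmatched(number):
    return f"reject:{number}"

def route(number):
    """Longest-prefix match over the declarative plan; fall back to hooks."""
    best = None
    for prefix, action in DIAL_PLAN:
        if number.startswith(prefix) and (best is None or len(prefix) > len(best[0])):
            best = (prefix, action)
    if best is None:
        return ON_UNITS["no_match"](number)
    return best[1]

print(route("0047123"))   # longest prefix "00" wins over "0"
print(route("5551234"))   # no rule matches, so the onunit hook fires
```

The domain expert edits only the table and the hooks; the imperative matching engine stays out of sight, which is the semantic gap the surface language is meant to close.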

>Do we really need a new language that defines yet another form of
>IF-THEN-ELSE syntax? Yet another form of blocking and
>scope? (E.g. yesterday I learned that Javascript's curly brace { } does
>not have the same scoping significance as in
>other languages.)

I wouldn't dismiss such stuff so lightly. This is part of what I
call the "syntactic sugar"; it is easy to apply, and pretty powerful.
The real challenge is that the declarations and the scopes in the
language may not look very much like a traditional programming language,
and the syntax assists will have to follow the declarations and the
common definitions in the field, not vice versa.

>A big problem with everyone defining their own new language, whether
>translated to C or the Java bytecodes or whatever,
>is that if two people A and B have their own new languages, it is often
>hard to write a program that uses both notations
>A and B in the same code. If A and B are completely mutually exclusive,
>you may have to serialize the data (maybe all the
>way to XML) to pass it from the A module to the B module.

I am not so afraid if such languages will have to be interpreted by
a pretty simple interpreter, because statements become a lot more
powerful, and affect a lot more data. That is the whole point, to break
a 10Mloc project into 100k lines of interpreter, 100k lines libraries,
and 200k lines of high-level code.
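That layering can be sketched in miniature (a Python sketch; the command language, the library calls, and all names are invented for illustration):

```python
# Sketch of the layering: a tiny interpreter (stand-in for the 100k-line
# one) whose single statements each invoke a lot of library machinery.
# The command language and all names are invented for illustration.

LIBRARY = {  # the "100k lines of libraries", reduced to two operations
    "normalize": lambda xs: [x.strip().lower() for x in xs],
    "dedupe":    lambda xs: sorted(set(xs)),
}

def interpret(program, data):
    """Each line of high-level code names one powerful library operation."""
    for stmt in program.splitlines():
        stmt = stmt.strip()
        if not stmt or stmt.startswith("#"):
            continue
        data = LIBRARY[stmt](data)
    return data

# The "200k lines of high-level code", reduced to two statements:
HIGH_LEVEL = """
# clean up a name list
normalize
dedupe
"""
print(interpret(HIGH_LEVEL, [" Bob ", "alice", "BOB"]))
```

Because each interpreted statement fans out into heavy library code, the interpreter's per-statement overhead is amortized, which is why even a simple interpreter can be acceptable here.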

>I just hope (and would love to have the chace to work on) that we could
>create a language framework, a metalanguage, so
>that the same programmer can intermix A and B notations at as fine grain
>a level as possible.
> Ideally within the same {block}. Possibly in different functions
>or classes. Possibly in different files or
>compilation units.
> Possibly communicating only via least-common-denominator
>datastructures common to both A and B in the base
>language. (E.g. no B datastructures accessible to A, or vice versa.)
>But possibly, occasionally, expressing the
>datastructures of B in a form accessible to the base language, so that A
>can manipulate them.
>
>I have a dream:
> * New languages, yes
> * But no more gratuitous invention of new languages
> * New languages for the new stuff only


And try to make good languages, and keep the toolchains as common
as possible, but not commoner.

-- mrr
From: Del Cecchi on
Andy "Krazy" Glew wrote:
> Morten Reistad wrote:
>> In article <0384a40d$0$1344$c3e8da3(a)news.astraweb.com>,
>> Nicholas King <ze(a)zerandconsulting.com> wrote:
>>> On 02/10/2010 01:31 PM, Robert Myers wrote:
>
>> Programming in the real world is about dealing with complexity, and
>> doing whatever you can to contain it. But we are forgetting the most
>> important tool we have for dealing with that complexity : language.
>>
>> The success of languages like PHP is that they make layers of languages
>> to deal with that complexity. You program the machine API in plain C,
>> and build another language on top.
>> People can handle finite amounts of code lines; and complexity correlates
>> directly with code lines when the project becomes more than trivial in
>> size. This is where programming projects explode in programmer count.
>>
>> But we can address the semantics by having specialist languages....
>>
>> All the large projects I have seen the last 4 decades have had huge
>> internal semantic gaps. One example of such a gap is programming
>> business logic where you have to take care of database consistency
>> issues all the time.
>> ...
>> I have long advocated "surface languages" to address such complexity.
>> This may mean actually designing new languages for the applications.
>> There is huge resistance to doing this. But I see daily the productivity
>> that it generates. ...
>>
>> We need more languages.
>> This is where the junior programmers scream foul, of course. When
>> they have to master language implementation and a new, initially
>> pretty volatile language definition.
>>
>> But it is the only path I see to contain complexity.
>>
>> -- mrr
>
> I mainly agree with Morten on this. Invent a new notation that
> simplifies thinking about the problem, and use that notation directly to
> express the solution as a program.
>
> Even better if somebody else can make the implementation of the idioms
> of the new notation faster, better.
>
> My only difference is that I hope that it may be possible to define a
> meta-language, in which many of the new notations can be expressed.
>
> Do we really need a new language that defines yet another form of
> IF-THEN-ELSE syntax? Yet another form of blocking and scope? (E.g.
> yesterday I learned that Javascript's curly brace { } does not have the
> same scoping significance as in other languages.)
>
> A big problem with everyone defining their own new language, whether
> translated to C or the Java bytecodes or whatever, is that if two people
> A and B have their own new languages, it is often hard to write a
> program that uses both notations A and B in the same code. If A and B
> are completely mutually exclusive, you may have to serialize the data
> (maybe all the way to XML) to pass it from the A module to the B module.
>
> I just hope (and would love to have the chace to work on) that we could
> create a language framework, a metalanguage, so that the same programmer
> can intermix A and B notations at as fine grain a level as possible.
> Ideally within the same {block}. Possibly in different functions
> or classes. Possibly in different files or compilation units.
> Possibly communicating only via least-common-denominator
> datastructures common to both A and B in the base language. (E.g. no B
> datastructures accessible to A, or vice versa.) But possibly,
> occasionally, expressing the datastructures of B in a form accessible to
> the base language, so that A can manipulate them.
>
> I have a dream:
> * New languages, yes
> * But no more gratuitous invention of new languages
> * New languages for the new stuff only

Are things like SPICE, Verilog, VHDL, considered programming languages
or something else?
From: Del Cecchi on
Robert Myers wrote:
> On Feb 20, 12:35 pm, "Andy \"Krazy\" Glew" <ag-n...(a)patten-glew.net>
> wrote:
>
>> The best reason to talk about supercomputer-friendly features inside CPUs and GPUs is if you have reason to expect that
>> the features may be useful in commercial, consumer, etc, markets.
>
> That just about says it all.
>
> 1. The people who buy and pay for "supercomputers" have little
> understanding of the work-a-day world of the people who might want to
> use them. They have even less understanding of where the science is.
>
> 2. No one designs supercomputers any more. People design networks,
> where, no matter the rhetoric, the burden is on the work-a-day user to
> worry about where things *actually* are and cache (and associated
> mindlessly-repeated truisms) has left its costly, indelible mark.
>
> Given those ugly realities, I think it's time for the US taxpayer to
> get out of the business of paying for "supercomputers," which are no
> more a supercomputer than this desktop (Core i7 920).
>
> Robert.
>
You can say no one designs supercomputers any more, apparently, because
the supercomputers that people are designing or have recently designed
don't fit some personal, idiosyncratic definition of what a supercomputer
is supposed to be.

So, please enlighten us and tell us what you think a supercomputer would
be if someone were designing it, and why that is indeed superior to, say,
a Blue Gene or a Roadrunner or all the other pretend supercomputers out
there.

del