From: nmm1 on
In article <0b8a1e65-94d8-4776-99d5-ef5356a246b2(a)g19g2000yqo.googlegroups.com>,
Robert Myers <rbmyersusa(a)gmail.com> wrote:
>On Oct 18, 7:17 am, jacko <jackokr...(a)gmail.com> wrote:
>
>> This does imply that languages which forebode the software's bad
>> days are the best place to put research. I think method-local write
>> variables, with only one write point and many read points. Yes,
>> spaghetti languages are long in the tooth.
>
>Much as I loathe c, I don't think it's the problem.

It's part of the problem - even worse, the programming paradigms it
implies (and sometimes requires) are not the right ones.
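
To make the alternative concrete: even in C you can approximate
jacko's single-write-point locals with const, though nothing in the
language encourages it. A toy sketch (the function names are mine,
purely for illustration):

#include <math.h>

/* Many write points: the same variable means different things
   at different times, which is what makes spaghetti. */
double hypot_mutable(double a, double b)
{
    double t = a * a;   /* t is a squared here ...         */
    t += b * b;         /* ... the sum of squares here ... */
    t = sqrt(t);        /* ... and the final result here.  */
    return t;
}

/* One write point per name: each value is written exactly once
   and can be read as often as you like. */
double hypot_single_assign(double a, double b)
{
    const double a2  = a * a;
    const double b2  = b * b;
    const double sum = a2 + b2;
    return sqrt(sum);
}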

>I *think* the problem is that modern computers and OS's have to cope
>with so many different things happening asynchronously that writing
>good code is next to impossible, certainly using any of the methods
>that anyone learned in school.

The former is not true, and I can't speak for the latter. What WERE
you taught in school? Even calculators were not part of my formal
schooling :-)

There are plenty of good programming paradigms that can handle such
requirements. Look at Smalltalk for an example of a language that
does it better. Note that I said just "better", not "well".
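
The essence is that a state change is a message sent to the object
that owns the state, which decides for itself how to respond. A very
rough sketch of the idea, in C for familiarity's sake (every name in
it is invented for illustration):

#include <stdio.h>

/* A crude imitation of message dispatch: each object carries its
   own handler, and callers "send" events rather than poking at the
   object's state directly. */
typedef struct object {
    const char *name;
    void (*handle)(struct object *self, const char *event);
} object;

static void window_handle(object *self, const char *event)
{
    printf("%s handles %s\n", self->name, event);
}

static void send(object *obj, const char *event)
{
    obj->handle(obj, event);   /* the object decides what to do */
}

int main(void)
{
    object w = { "window-1", window_handle };
    send(&w, "mouse-click");   /* asynchronous events would be
                                  queued and delivered through here */
    return 0;
}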

>It's been presented here as a new problem with widely-available SMP,
>but I don't think that's correct. Computers have always been hooked
>to nightmares of concurrency, if only in the person of the operator.
>As we've come to expect more and more from that tight but impossible
>relationship, things have become ever more challenging and clumsy.

There are differences. 40 years ago, few applications programmers
needed to know about it - virtually the only exceptions were the
vector computer (HPC) people and the database people. That is
about to change.

>All that work that was done in the first six days of the history of
>computing was aimed at doing the same thing that human "computers"
>were doing: calculating the trajectories of artillery shells. Leave
>the computer alone, and it can still manage that sort of very
>predictable calculation tolerably well.

Sorry, but that is total nonsense. I was there, from the late 1960s
onwards.

>Even though IBM and its camp-followers had to learn early how to cope
>with asynchronous events ("transactions"), they generally did so by
>putting much of the burden on the user: if you didn't talk to the
>computer in just exactly the right way at just exactly the right time,
>you were ignored.

Ditto.

>Even the humble X-windowing system contemplates an interaction that
>would at one time have been unimaginable in the degree of expected
>flexibility and tolerance for unpredictability, and the way the X-
>windowing system often works in practice shows it, to pick some
>example other than Windows.

Ditto. A more misdesigned and unreliable heap of junk is hard to
imagine - as soon as the user starts to stress it, some events get
lost or directed to the wrong window. It is common for most of the
forced restarts (and not just session, often system) to be due to a
GUI mishandling error recovery, and the design of the X Windowing
System is integral to that. Microsoft just copied that, by legally
ripping off IBM Presentation Manager.

>In summary:
>
>1. The problem is built in to what we expect from computers. It is
>not a result of multi-processing.

It may be what you expect from them. It is not what the first or
second generation of computer users expected (and I count as one of
the latter). You would count as the third, and there is now a fourth.

>2. No computer language that I am aware of would make a noticeable
>difference.

Perhaps I am aware of a wider range, because I know of some that would.

>3. Nothing will get better until people start operating on the
>principle that the old ideas never were good enough and never will be.

That is PRECISELY the principle on which modern general-purpose
operating systems, interfaces and programming languages WERE designed.
The old mainframe people despair at them reinventing the wheel, and
not even getting the number of sides right :-(

>Eugene will tell me that it's easy to take pot shots. Well, maybe it
>is. It's also easy to keep repeating the same smug but inadequate
>answers over and over again.

Very, very true.


Regards,
Nick Maclaren.
From: Ken Hagan on
On Sat, 17 Oct 2009 10:26:17 +0100, <nmm1(a)cam.ac.uk> wrote:

> People knew how to build a functional steam locomotive in Hero's
> day - they didn't have the technological base to do it.

The Disc of Phaistos is a typed document, demonstrating both know-how and
technical base, but printing didn't take off until 1500. And maybe ancient
Iraqis were electroplating, but we had to wait a while for any other
applications of that idea, too.

Maybe we already know how to build scalable parallel machines, and there's
an example collecting dust in a museum somewhere, but the time isn't
right. :) Wouldn't *that* be annoying?
From: nmm1 on
In article <op.u11ikrbgss38k4(a)khagan.ttx>,
Ken Hagan <K.Hagan(a)thermoteknix.com> wrote:
>
>> People knew how to build a functional steam locomotive in Hero's
>> day - they didn't have the technological base to do it.
>
>The Disc of Phaistos is a typed document, demonstrating both know-how and
>technical base, but printing didn't take off until 1500. And maybe ancient
>Iraqis were electroplating, but we had to wait a while for any other
>applications of that idea, too.

Er, if you think that being able to hand-stamp a clay tablet with
hand-carved seals is a suitable technological base for building a
printer, I suggest that you try it.

>Maybe we already know how to build scalable parallel machines, and there's
>an example collecting dust in a museum somewhere, but the time isn't
>right. :) Wouldn't *that* be annoying?

We do, and have done for a long time. What we don't know how to do,
is to map the majority of the requirements we have into programs that
will run efficiently on the machines we know how to build.
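
The distinction is easy to show. A trivial sketch in C, using OpenMP
purely as an illustration (not a recommendation):

/* Compile with e.g.: cc -fopenmp example.c */

/* This maps well: every iteration is independent of the others. */
void scale(double *y, const double *x, double a, int n)
{
    #pragma omp parallel for
    for (int i = 0; i < n; i++)
        y[i] = a * x[i];
}

/* This does not, as written: each iteration needs the previous
   one's result. Making it parallel means recasting the algorithm
   entirely (as a parallel scan), which is exactly the mapping
   problem. */
void prefix_sum(double *x, int n)
{
    for (int i = 1; i < n; i++)
        x[i] += x[i - 1];
}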


Regards,
Nick Maclaren.
From: Bernd Paysan on
nmm1(a)cam.ac.uk wrote:
>>The Disc of Phaistos is a typed document, demonstrating both know-how and
>>technical base, but printing didn't take off until 1500.
>
> Er, if you think that being able to hand-stamp a clay tablet with
> hand-carved seals is a suitable technological base for building a
> printer, I suggest that you try it.

Printing had been in use in China for about 1000 years when movable
type was invented there - and that invention didn't catch on. It was
easy enough to carve the writing into stone or wood and print that
way; movable type is only cost effective if you print in low volume.
The main invention that made printing feasible was the invention of
(cheap) paper - both in Europe and in China.

Coming back to topic: massively parallel machines have been developed
in the past, as well, and are already rotting in museums. Remember
Thinking Machines? MasPar? They all built architectures somewhat
similar to current GPUs more than 20 years ago. Why didn't they catch
on? Because it's more expensive to write massively parallel programs.

That's why we are getting massively parallel CPUs through the wheel of
reinvention, via the GPUs. It's cost efficient to write some small
massively parallel programs for rendering graphics, while the main
algorithms stay serial. It's even cost efficient to put a physics
engine into the massively parallel domain. Step by step, the GPGPU
takes over tasks from the CPU.
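
The reason rendering moved across first is that each pixel is
independent of every other one. A minimal sketch of the pattern in C,
with OpenMP standing in for a real GPU (illustrative only):

/* Each output pixel depends only on its own coordinates, so the
   loop nest is embarrassingly parallel - the pattern GPUs exploit. */
void shade(unsigned char *img, int w, int h)
{
    #pragma omp parallel for collapse(2)
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            img[y * w + x] = (unsigned char)((x ^ y) & 0xff);
}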

Maybe the tools have now reached a level of sophistication at which
writing parallel programs is cheap enough. This is my point when
looking at these historical comparisons: it usually wasn't the
invention of some principle (the steam engine, movable type) that made
the breakthrough; it usually was something else that made the
resulting machine cost effective. When we think of the steam engine,
we think of James Watt, but in reality, during the time when James
Watt had his patent on a particular improvement of steam engines, not
so many were deployed. The revolution came afterwards, when an
"open source"-like community of steam engine developers and operators
rapidly improved the steam engine, so that it became cost effective in
many more use cases than before.

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/
From: nmm1 on
In article <1947351.S7ZIpNfcyU(a)elfi.zetex.de>,
Bernd Paysan <bernd.paysan(a)gmx.de> wrote:
>
>>>The Disc of Phaistos is a typed document, demonstrating both know-how and
>>>technical base, but printing didn't take off until 1500.
>>
>> Er, if you think that being able to hand-stamp a clay tablet with
>> hand-carved seals is a suitable technological base for building a
>> printer, I suggest that you try it.
>
>Printing had been in use in China for about 1000 years when movable
>type was invented there - and that invention didn't catch on. It was
>easy enough to carve the writing into stone or wood and print that
>way; movable type is only cost effective if you print in low volume.

That's not true. The converse is. Carving a whole block in reverse
and printing from that is only cost-effective if the number of
different pages is small. You need extremely skilled people to carve
them, in order to keep the error rate down. There are also problems
with movable type and Chinese characters (i.e. the sheer number of
them!)

Before movable type, almost all printing was things like pictures,
prayers, fabric patterns and other uses where the number of different
pages is small.

>The main
>invention that made printing feasible was the invention of (cheap) paper -
>both in Europe and in China.

That's true.


Regards,
Nick Maclaren.