From: Robert Myers on
On Oct 19, 4:34 am, n...(a)cam.ac.uk wrote:
> In article <0b8a1e65-94d8-4776-99d5-ef5356a24...(a)g19g2000yqo.googlegroups..com>,
> Robert Myers  <rbmyers...(a)gmail.com> wrote:
>
> >On Oct 18, 7:17 am, jacko <jackokr...(a)gmail.com> wrote:
>
> >> This does imply that languages which forebode the software bad days are
> >> the best place to put research. I think method-local write variables,
> >> with only one write point and many read points, are the answer. Yes,
> >> spaghetti languages are long in the tooth.
>
> >Much as I loathe c, I don't think it's the problem.
>
> It's part of the problem - even worse, the programming paradigms it
> implies (and sometimes requires) are not the right ones.
>
> >I *think* the problem is that modern computers and OS's have to cope
> >with so many different things happening asynchronously that writing
> >good code is next to impossible, certainly using any of the methods
> >that anyone learned in school.
>
> The former is not true, and I can't speak for the latter.  What WERE
> you taught in school?  Even calculators were not part of my formal
> schooling :-)
>
I have exactly two classroom courses about computers on my
transcripts. In the first, as a freshman at a very well-known trade
school, I learned PL, a pedagogic subset of PL/I. We also were taught
a bit of IBM assembly language. I used PL/I in graduate school for a
course in numerical analysis while everyone else was using Fortran,
and I used PL/I for the computations required by my thesis work, which
required double-precision complex arithmetic, a capability that turned
out to be buggy in the PL/I compiler. I discovered on my own that IBM's scientific subroutine
package was garbage and wrote all my own subroutines to calculate non-
elementary special functions. I didn't learn Fortran until I was "in
industry" and I've never used c professionally. I've tried most every
language that has been mentioned in this forum. I used an electronic
calculator to finish my thesis, and I believe I belonged to the last
generation actually to use slide rules (and log tables). There was a
gigantic and noisy mechanical calculator in the lab where I worked at
the previously mentioned trade school. Something about radar and the
building I worked in, but then I'd be playing your game, wouldn't I?
I also have more hardware courses than software courses on my
transcripts, but that wouldn't be very many hardware courses, either,
would it?

> There are plenty of good programming paradigms that can handle such
> requirements.  Look at Smalltalk for an example of a language that
> does it better.  Note that I said just "better", not "well".
>
> >It's been presented here as a new problem with widely-available SMP,
> >but I don't think that's correct.  Computers have always been hooked
> >to nightmares of concurrency, if only in the person of the operator.
> >As we've come to expect more and more from that tight but impossible
> >relationship, things have become ever more challenging and clumsy.
>
> There are differences.  40 years ago, few applications programmers
> needed to know about it - virtually the only exceptions were the
> vector computer (HPC) people and the database people.  That is
> about to change.
>
Vector computing was trivial, relatively speaking. I learned Cray
assembly language before I learned any microprocessor assembly
language. None of us really knew much of anything about concurrency
except how to turn loops inside out, to insert $IVDEP directives, to
use the vector mask register, and to avoid indirect addressing.
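
For anyone who never lived through that era: the whole game was
asserting to the compiler that a loop carried no dependence it could
not see for itself. A rough C sketch of the sort of loop in question
(the names are mine, and the directive spelling varies by compiler:
CDIR$ IVDEP in Cray Fortran, #pragma GCC ivdep in modern gcc):

/* Indirect addressing (a scatter through idx[]) is exactly the case
 * the compiler cannot vectorize on its own: it must assume idx[] may
 * contain repeated entries and hence a loop-carried dependence.  The
 * IVDEP-style directive is the programmer's promise that it doesn't. */
void scale_scatter(double *restrict y, const double *restrict x,
                   const int *idx, int n)
{
#pragma GCC ivdep
    for (int i = 0; i < n; i++)
        y[idx[i]] = 2.0 * x[i];
}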

> >All that work that was done in the first six days of the history of
> >computing was aimed at doing the same thing that human "computers"
> >were doing: calculating the trajectories of artillery shells.  Leave
> >the computer alone, and it can still manage that sort of very
> >predictable calculation tolerably well.
>
> Sorry, but that is total nonsense.  I was there, from the late 1960s
> onwards.
>
Where do you think I was, Nick? Do you know? I don't want to make
this personal, but when you use comp.arch as a forum for your personal
mythology, I don't know how to avoid it. I stand by my
characterization.

For one thing, the computers that were available, even as late as the
late sixties, were pathetic in terms of what they could actually do.
I've had lengthy public discussions with one of the actual players
about some of the modeling that went on, and I presented citations to
published documents that talk about the failures of that modeling. In
designing the atomic bomb, scientists at Los Alamos used human
pipelining to do calculations (that bit *was* before my time), and the
first few decades of computers were not a huge improvement on that in
terms of capability.

> >Even though IBM and its camp-followers had to learn early how to cope
> >with asynchronous events ("transactions"), they generally did so by
> >putting much of the burden on the user: if you didn't talk to the
> >computer in just exactly the right way at just exactly the right time,
> >you were ignored.
>
> Ditto.
>
Nick. I *know* when time-sharing systems were developed. I wasn't
involved, but I was *there*, and I know plenty of people who were. I
*know* what using IBM batch processes was like. I *know* what it was
like to have to make another trip to the computer lab because some
cryptic bit of JCL was suddenly required. I *know* what using a line-oriented
editor from a 300 baud tty was like. I also know how to backspace a
punch card machine.

> >Even the humble X-windowing system contemplates an interaction that
> >would at one time have been unimaginable in the degree of expected
> >flexibility and tolerance for unpredictability, and the way the X-
> >windowing system often works in practice shows it, to pick some
> >example other than Windows.
>
> Ditto.  A more misdesigned and unreliable heap of junk, it is hard
> to imagine - as soon as the user starts to stress it, some events get
> lost or directed to the wrong window.  It is common for most of the
> forced restarts (and not just session, often system) to be due to a
> GUI mishandling error recovery, and the design of the X Windowing
> System is integral to that.  Microsoft just copied that, by legally
> ripping off IBM Presentation Manager.
>
Ok, Great Wizard of Oz. Write a better display manager.

> >In summary:
>
> >1. The problem is built in to what we expect from computers.  It is
> >not a result of multi-processing.
>
> It may be what you expect from them.  It is not what either the first
> or second generation computer users did (and I count as one of the
> latter).  You would count as the third, and there is now a fourth.
>
You remind me of a plasma physicist that I knew at a national lab.
For someone of his self-proclaimed stature, he didn't know something
very basic about Green's functions for the wave equation in even-
numbered dimensions. He, too, was keen to establish his creds by
counting generations.

> >2. No computer language that I am aware of would make a noticeable
> >difference.
>
> Perhaps I am aware of a wider range, because I do.
>
I stand by my assertion, and I doubt very seriously that you know
better. Yes, there are languages, like ada, that should be used in
place of c, but they aren't. The issue has been discussed at length here,
and ada was the only language identified that had sufficient
generality to hope to replace c as a systems programming language. If
you had a big secret to reveal, I don't know why you were holding
back.

> >3. Nothing will get better until people start operating on the
> >principle that the old ideas never were good enough and never will be.
>
> That is PRECISELY the principle on which modern general-purpose
> operating systems, interfaces and programming languages WERE designed.
> The old mainframe people despair at them reinventing the wheel, and
> not even getting the number of sides right :-(
>
Unix was a Hail Mary pass, so far as I understand it, because what was
being developed when I was around was simply too ambitious. I stand
by my comment.

Robert.
From: Robert Myers on
On Oct 19, 10:24 am, "Del Cecchi" <delcecchioftheno...(a)gmail.com>
wrote:

> Actually that wasn't the aim of Itanium, at least if you are a cynic
> like me.  Some would say that it was an attempt to create a new
> proprietary architecture that was outside the web of cross licensing
> agreements that Intel had.
>
> The notion that it would be better merely had to be plausible in order
> to achieve that goal.

Agreed.

However suspect Intel's motivations may have been, I thought, and I
still think, for reasons only marginally related to hardware
architecture, that it was one of the more interesting undertakings in
the history of computing. The triumph of x86 may have been a triumph
of common sense, but common sense has never seemed very interesting to
me.

Robert.

From: "Andy "Krazy" Glew" on
Andrew Reilly wrote:
> Isn't it the case, though, that for most of that "popular
> software" speed is a non-issue?

I've been manipulating large Excel spreadsheets.

Minutes-long recalcs.

Sometimes as long as 15 minutes to open up such a spreadsheet embedded
as an OLE object in a Word document.

I'm reasonably sure it's computation, and not disk.

However, I am also fairly certain, because I have done a few
experiments, that there are a few quadratic algorithms that could be linear.

Algorithms trump hardware, nearly every time.
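
A toy C illustration of the kind of accidental quadratic I mean
(nothing to do with Excel's real internals, which I can only guess at;
the function names are mine): strcat() rescans the destination from
the start on every call, so N appends cost O(N^2), while remembering
where the string ends makes the same job O(N).

#include <string.h>

/* Quadratic: each strcat() walks the whole of dst again.
 * dst is assumed large enough to hold the result. */
void build_slow(char *dst, const char **piece, int n)
{
    dst[0] = '\0';
    for (int i = 0; i < n; i++)
        strcat(dst, piece[i]);
}

/* Linear: keep a pointer to the current end instead of rescanning. */
void build_fast(char *dst, const char **piece, int n)
{
    char *end = dst;
    for (int i = 0; i < n; i++) {
        size_t len = strlen(piece[i]);
        memcpy(end, piece[i], len);
        end += len;
    }
    *end = '\0';
}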

(Hmm.... in the past I have devised "instruction rewriting" hardware
that converted linear code such as i++;i++;...;i++ into O(lg N) latency,
O(N lg N) size parallel prefix code. I wonder if the problems I am
seeing in Excel could be optimized away by such a subsystem.)
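
For concreteness, the parallel-prefix shape I mean, written out
sequentially in C (a sketch of the textbook Hillis-Steele scan, not of
the actual hardware; names are mine): ceil(lg N) passes of O(N) adds
each, so O(N lg N) total work but only O(lg N) dependent steps if each
pass ran in parallel.

#include <string.h>

/* Inclusive prefix sum, Hillis-Steele style.  tmp must hold n
 * elements; the outer loop runs ceil(lg n) times. */
void prefix_sum(long *a, long *tmp, int n)
{
    for (int stride = 1; stride < n; stride *= 2) {
        for (int i = 0; i < n; i++)
            tmp[i] = (i >= stride) ? a[i] + a[i - stride] : a[i];
        memcpy(a, tmp, (size_t)n * sizeof a[0]);
    }
}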

From: kenney on
In article <1947351.S7ZIpNfcyU(a)elfi.zetex.de>, bernd.paysan(a)gmx.de
(Bernd Paysan) wrote:

> Printing had been used in China for about 1000 years when movable
> type was invented there -

Movable type really requires an alphabetic script. You are better off
with wood-block printing for a written language that has several
thousand characters. The Chinese attempt at movable type was associated
with the Yuan dynasty and a script that Kublai introduced; it failed as
much for political reasons as technical ones.

> movable
> type is only cost-effective if you print in low volume.

Even given that the major constraint on printing in the West was the
cost of type, I think you are wrong. Cheap books required the
introduction of the rotary press and a means to convert type to plates
that could be used in it. Still, when it was introduced, type only had
to compete with the scriptorium. There were constraints on early
printing other than type as well. Using a screw press was slow and
paper was expensive.


Ken Young
From: Terje Mathisen on
Del Cecchi wrote:
>> Once upon a time there was an attempt to satisfy "the aims of
>> science / pursuit of excellence".
>> It was called Itanium. Turned out to be a multi-billion dollar flop.
>> No more projects like that please!
>
> Actually that wasn't the aim of Itanium, at least if you are a cynic
> like me. Some would say that it was an attempt to create a new
> proprietary architecture that was outside the web of cross licensing
> agreements that Intel had.
>
> The notion that it would be better merely had to be plausible in order
> to achieve that goal.

I'm not quite as cynical as you, Del, but I still agree:

The main target of Itanium was to have a "closed-source" cpu that simply
couldn't be cloned at all:

A new, separate (joint venture) company to develop it, making existing
deals with both Intel and HP moot, and lots of funky little patented
details that were intentionally exposed to the programmer, making it
very much harder to invent around.

OTOH, I really do believe Intel intended to start delivering in 1997, in
which case it _would_ have been, by far, the fastest cpu on the planet.
When they finally did deliver, years later, it was still the fastest cpu
for dense fp kernels like SpecFP.

They delivered too little, too late, but still managed to terminate
several competing architecture development tracks at other vendors.

Terje
--
- <Terje.Mathisen at tmsw.no>
"almost all programming can be viewed as an exercise in caching"