From: Eugene Miya on
>> Look, not everything is parallel.
>> As a start, you should read Amdahl's (law) paper.

In article <1162421248.333748.269580(a)m7g2000cwm.googlegroups.com>,
BDH <bhauth(a)gmail.com> wrote:
>The core of that is pretty much obvious. But the slow things can be
>made more parallel.

Hmmm, no.
I've worked on several benchmark teams and have had to sweat various issues.
If you want brutal, non-trivial slow problems outside cryptanalysis,
I can think of many. A favorite might be Golomb rulers.
No floating point. Can fit in memory.
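(Aside, since the paper keeps getting waved at people: Amdahl's point
is just arithmetic. If a fraction p of the work parallelizes and the
rest does not, then with N processors

    speedup(N) = 1 / ((1 - p) + p/N)

which never exceeds 1/(1 - p); p = 0.95 caps you at 20x no matter how
many processors you throw at it.)

To make the Golomb case concrete, a sketch of mine (not from any of
the posters): *checking* a ruler is trivial integer work; the brutal
part is the search for optimal rulers, which blows up combinatorially
and resists parallel speedup beyond partitioning the search tree.

enum { MAXLEN = 1024 };               /* assumed bound on ruler length */

int is_golomb(const int *mark, int n) /* marks sorted ascending */
{
    char seen[MAXLEN] = {0};
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++) {
            int d = mark[j] - mark[i];
            if (d <= 0 || d >= MAXLEN || seen[d])
                return 0;             /* repeated or out-of-range difference */
            seen[d] = 1;
        }
    return 1;                         /* all pairwise differences distinct */
}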


>> I think for theoretic purposes, Backus' Turing lecture on getting out of
>> the von Neumann paradigm has potential (you have to read these words
>> carefully, FP didn't go far, and FL also stagnated).
>
>I don't know what FP and OO and half a dozen other acronyms are
>supposed to be. They both succeed when they do by making certain kinds
>of abstraction easier in a half-assed way, and then say they're really
>about something totally ridiculous. Let's get rid of state on a machine
>that's mostly state! Let's put our code in objects because...I don't
>know, some guy thinks without justification that it will make X easier!

FP was merely Backus' own shorthand for Functional Programming
(which got succeeded by Functional Language, or FL). OO is Object-Oriented:
in a way a fad since the Simula 67 days, which has its uses but is still
no silver bullet for the software problem.


>> >it's to do with how the language used to program gets turned into an RTL
>> >(register transfer language) which specifies all variables as
>> >registers, and then this has to be mapped onto a machine which only has
>> >so many regs.
>
>Thanks a lot, Mr Von Neumann. That is not a very good system.

On the contrary, what von Neumann wrote was a very good, simple system
for his time. He was an awesome intellect. I wish I had the intellect
he had in his little pinky, and I would bet that the most knowledgeable
people left here would, too. **
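To put the quoted RTL point in concrete terms, a toy sketch of mine
(not from any of the posters): the intermediate form hands every value
its own virtual register, and allocation then squeezes those onto the
machine's few real ones, spilling to memory when they don't fit.

int f(int a, int b, int c, int d)
{
    int t1 = a * b;     /* vreg t1 */
    int t2 = c * d;     /* vreg t2 -- t1 still live */
    int t3 = a + d;     /* vreg t3 -- t1 and t2 still live */
    return t1 + t2 + t3;
}

/* With only two or three hardware registers available here, some vreg
   has to be spilled to the stack and reloaded -- exactly the
   register-vs-memory traffic that "all variables as registers"
   papers over. */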


>> >this is why we use paper for calculations, so that the variable space
>> >can fit on it while the short term memory (register space) can only
>> >hold so much.
>
>You're going to argue that we should base computer I/O on human I/O?

That was the person I was responding to; they will have to answer that to you.


>> Have you used APL?
>> That's where part of the field was 30 years ago.
>
>What happened to APL? There's K and J but it just hasn't had the
>influence on modern popular languages of Lisp or Smalltalk or C. Or
>Pascal or Fortran or Algol probably.
>
>I've been half-assedly working on sort of an APL/Lisp hybrid.

A number of things killed APL. I recall the special APL character set
(the Culler-Fried keyboard was little better, and it was QWERTY).
I extracted from an officemate involved with the CDC Star-100 that the
direction of evaluation was a poor choice. But at the memorial for Ken
Iverson it's hard to say what came out. I've also had some nice
conversations with Bob Bernecky.

Part of the problem which kills simple languages like these is the
kind of numeric application that has volumetric and border structure
(edges, faces, vertices, and hyper-structures in 4 and higher
dimensions): the borders are exception cases, which then lead on to
irregular geometries, etc.
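A small sketch of mine (not from the thread) of how the borders bite
even in one dimension: the interior is a single clean whole-array
operation, APL territory, but the two ends each need their own rule,
and every added dimension multiplies faces, edges, and corners.

void smooth(const double *in, double *out, int n)
{
    for (int i = 1; i < n - 1; i++)           /* the regular volume */
        out[i] = (in[i-1] + in[i] + in[i+1]) / 3.0;
    if (n > 1) {
        out[0]   = (in[0] + in[1]) / 2.0;     /* border exception */
        out[n-1] = (in[n-2] + in[n-1]) / 2.0; /* border exception */
    }
}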

** Merge ideas:
Yesterday I attended a seminar given by Bill Dally at Stanford.
The series is an amazing collection of people (Bill, whom I had dinner
with afterward, is now the CS chair, and we synced on friends in common).
I'm going to let the cat out of the bag for people outside the Santa
Clara valley. The lecture is "taped" and put on the web for the
graduate seminar series:
ee380.stanford.edu
He'll tell you about parallelism, and he has worked on respectable
architectures. I also worked with one of the benchmarks he covered
in the talk.

The amazing thing about the series, run by the D of Dr. Dobb's and
others (I have provided a number of speakers), is the preponderance,
when you go back in time, of Santa Clara Valley talent both giving
talks and sitting in the audience. It's possible to go back and see
many archived talks.
Amazing talks on problems for EE etc. (including climatology
[application areas]). Fortunately my 380 talk was before they saved the
tapes. ;^)

--
From: Eugene Miya on
In article <1162438637.216441.54720(a)m73g2000cwd.googlegroups.com>,
BDH <bhauth(a)gmail.com> wrote:
>I am enthusiastic over humanity's extraordinary and sometimes very
>timely ingenuities. If you are in a shipwreck and all the boats are
>gone, a piano top buoyant enough to keep you afloat may come along and
>make a fortuitous life preserver. This is not to say, though, that the
>best way to design a life preserver is in the form of a piano top. I
>think we are clinging to a great many piano tops in accepting
>yesterday's fortuitous contrivings as constituting the only means for
>solving a given problem.
>- R. Buckminster Fuller

That's it?

One of my favorite smart guy quotes (caught on tape, not mine):
You see what I do is imagination with a straitjacket....
--R. P. Feynman.

Oh, to have considered going out with his goddaughter.

--
From: Eugene Miya on
>> Shirley, you're not serious.

In article <1162438715.331466.142640(a)h48g2000cwc.googlegroups.com>,
BDH <bhauth(a)gmail.com> wrote:
>Maybe I'm biased - I hate Java.

So?
The tools we have are far from perfect.
Make better ones.

--
From: Eugene Miya on
On Wed, 01 Nov 2006 21:27:06 -0800, BDH wrote:
>> So how do you build and move to a boat while sitting on a piano top?
>> Welllll, first we assume a sufficiently smart corporation, then we
>> assume a sufficiently smart compiler, then we assume sufficiently smart
>> developers...

That's the punchline to the joke: what's the difference between a
computer scientist and an EE? One of each is stranded on a "desert"
island (this being before Survivor). Both come to the conclusion that a
computer will help them. The CS reasons like the above, only with
Turing machines. The EE says: first we take some sand.....

In article <4qtfa0Foj3v6U1(a)individual.net>,
Andrew Reilly <andrew-newspost(a)areilly.bpc-users.org> wrote:
>How are the developers, however smart, going to express their algorithms
>without introducing sequential dependencies, however inadvertently?
>(What is an algorithm, without sequential dependencies?) Don't you need an
>appropriate language, and perhaps a plausible parallel machine
>abstraction, before you start on the compiler? How would your language be
>different from Verilog or VHDL or Occam? What would be different, this
>time?

Take a look at what friends did on SISAL, and give it a try.
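For a flavor of the answer (a sketch of mine in C terms, not actual
SISAL): SISAL is a single-assignment functional language, so a loop
body cannot quietly introduce the sequential dependencies asked about
above. Each value is defined exactly once, and independence between
iterations is directly visible to the compiler.

void axpb(const double *x, double *y, int n, double a, double b)
{
    for (int i = 0; i < n; i++)     /* no loop-carried dependency:  */
        y[i] = a * x[i] + b;        /* any order, or all at once    */
}

/* A true reduction, s = f(s, x[i]), is the opposite case: step i
   needs step i-1. A single-assignment language makes that ordering
   explicit -- which is what frees the compiler to parallelize
   everything that isn't ordered. */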


>Or are you, perhaps, hinting at the "High Productivity" DARPA project, or
>one of Sun, IBM or Cray's sub-projects, each of which, I assume, has
>working answers to my previous questions?

They mostly work on hardware.

--
From: Terje Mathisen on
Eugene Miya wrote:
> One of my favorite smart guy quotes (caught on tape, not mine):
> You see what I do is imagination with a straitjacket....
> --R. P. Feynman.

Feynman was a "wizard of the highest order". Even after he had figured
something out and explained it, there was no way for mere mortals to see
how they could ever have done the same.
>
> Oh, to have considered going out with his goddaughter.

Please tell!

Terje
--
- <Terje.Mathisen(a)hda.hydro.com>
"almost all programming can be viewed as an exercise in caching"