From: Eugene Miya on
In article <d5746c05-1aa7-4dc3-af0a-66fb4b387954(a)36g2000yqu.googlegroups.com>,
Robert Myers <rbmyersusa(a)gmail.com> wrote:
>On Feb 10, 11:25 pm, n...(a)cam.ac.uk wrote:
>> >as a dead issue.
>> There are people who don't think that Elvis is dead, too.
>>
>But they're wrong. Elvis and I regularly meet for a gospel singalong.

Hey, Costello doesn't count.

--

Looking for an H-912 (container).

From: Nicholas King on
On 02/10/2010 01:31 PM, Robert Myers wrote:
> Andy Glew has said that a supercomputer is nothing but a big, multi-
> tiered switch. That is certainly what supercomputers have come to be,
> and I don't want to give the impression that I think even the lame
> interconnects we get are necessarily trivial.
I think we should look at this together with Terje's comment on
programming being an exercise in caching. It seems, to an ignorant
outsider, that HPC is all about communication, and furthermore, if core
counts keep going up (which I have no doubt will continue to happen),
the same shall apply to general-purpose computing.

Programmers are just going to have to get used to the real world being
more complex. Historically, we haven't had huge success with this.

Cheers,

Nicholas King.
From: Del Cecchi on
Eugene Miya wrote:
> In article <7thjn9FbcmU1(a)mid.individual.net>,
> Del Cecchi <delcecchinospamofthenorth(a)gmail.com> wrote:
>> That is not what killed Osborne. What killed Osborne was announcing a
>> better machine that was not in production, freezing the market when
>> they didn't have the cash to cover the interval.
>
> I seem to recall another firm which announced a "better machine not in
> production...."
>
IBM nearly died from the way they announced and priced the 370. But I
presume you are talking about FS? That was actually never announced.
Itanium? Intel had enough money, but some didn't.

Whatcha talking about?

del
From: Robert Myers on
On Feb 11, 9:06 pm, Nicholas King <z...(a)zerandconsulting.com> wrote:

>
> Programmers are just going to have to get used to the real world being
> more complex. Historically, we haven't had huge success with this.

No, "we" haven't, although, in the world according to Knuth (and his
legion of admirers), the important problems are the ones he's good at.

I used to worry about this a lot. I've worried about it here, and most
of the wisdom I have on the subject, such as it is, I owe to
conversations here.

I started this thread partly because my concerns have shifted:

1. Even *if* we knew how to be smarter about programming, I think the
field is essentially dead. The problems that have been explored have
turned out to be disappointing and/or uninteresting. So now we can
virtualize a zillion things on a desktop. What does that get us,
other than that I no longer have to build separate crash and burn
machines?

2. I don't see any obvious or inobvious way around the communication
problem for big machines, compared to which the "memory barrier," that
scared the hell out of everybody a long time ago, is trivial.

Eugene's advice and Terje's cleverness and insight notwithstanding, we
don't need more hotshot programmers with sharp pencils. We need a
Wiener, a Shannon, or a Turing.

None of what I've said would necessarily be true of embedded
applications, like robotics, about which I know absolutely nothing,
but you don't need a zigaflop machine at LLNL for those applications.

Robert.

From: Terje Mathisen on
Nicholas King wrote:
> On 02/10/2010 01:31 PM, Robert Myers wrote:
>> Andy Glew has said that a supercomputer is nothing but a big, multi-
>> tiered switch. That is certainly what supercomputers have come to be,
>> and I don't want to give the impression that I think even the lame
>> interconnects we get are necessarily trivial.
>
> I think we should look at this together with Terje's comment on
> programming being an exercise in caching. It seems, to an ignorant
> outsider, that HPC is all about communication, and furthermore, if core
> counts keep going up (which I have no doubt will continue to happen),
> the same shall apply to general-purpose computing.

The fact is that all computers have been heavily (cc or not)-NUMA for a
couple of decades.

I've previously told the story of how I came in second in a programming
competition with a 486 target because I hadn't grokked what having a
cache really meant. The guy who did understand it simply blew me away:
his code ran, on average, twice as fast even though it was doing _more_
"real work" than mine.
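The effect can be sketched in a few lines. Everything below is
illustrative (the size N, the function names) and not from the original
posts: the same sum is computed twice over a flat row-major array, once
walking memory sequentially and once with a large stride. On real
hardware the strided walk lands on a new cache line at nearly every
step, which is the kind of penalty the 486 story is about.

```python
# A minimal sketch of why traversal order matters on cached hardware.
# The matrix is stored row-major in one flat list, as a C array would be.
# N and the function names are hypothetical, for illustration only.

N = 512  # hypothetical matrix dimension

# Flat row-major storage: element (i, j) lives at index i * N + j.
matrix = [float(k) for k in range(N * N)]

def sum_row_major(a, n):
    """Walk memory sequentially: consecutive j values are adjacent,
    so each cache line is consumed fully before the next is fetched."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += a[i * n + j]
    return total

def sum_column_major(a, n):
    """Stride through memory n elements at a time: on real hardware
    each access can touch a different cache line."""
    total = 0.0
    for j in range(n):
        for i in range(n):
            total += a[i * n + j]
    return total

# Same arithmetic, same result; only the memory-access order differs.
assert sum_row_major(matrix, N) == sum_column_major(matrix, N)
```

In an interpreter the dispatch overhead swamps the cache effect, but
compiled to C the strided version is typically several times slower on
large arrays, even though both loops do exactly the same arithmetic.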
>
> Programmers are just going to have to get used to the real world being
> more complex. Historically, we haven't had huge success with this.

:-)

Terje

--
- <Terje.Mathisen at tmsw.no>
"almost all programming can be viewed as an exercise in caching"