From: Quadibloc on
This article on Forbes.com

http://www.forbes.com/2010/04/29/moores-law-computing-processing-opinions-contributors-bill-dally.html

came to my attention from visiting HPC Wire.

I think that he is right that massively parallel chips, designed to
maximize throughput per transistor, are more useful for problems that
can be broken down and solved in parallel than conventional multicore
chips.

But I also think he is very wrong to claim that the only problem is
that people are reluctant to change. That would be true only if every
problem could be solved as efficiently by a parallel algorithm as by
a serial one, in which each step can benefit from information from
all the steps that have gone before.

Some problems can't be broken down into an arbitrary number of small
pieces. For those problems, we need the fastest possible serial
performance to work on the big pieces that are present. We can work
to advance our knowledge of possible parallel algorithms, but they
either exist or they don't, independently of our wishes.
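
To put a rough number on that, the usual back-of-the-envelope
argument is Amdahl's law: if a fraction p of the work parallelizes
and the rest stays serial, n processors give a speedup of
1/((1-p) + p/n). A toy C program (the 95% figure is made up purely
for illustration) shows how quickly that saturates:

#include <stdio.h>

int main(void)
{
    double p = 0.95;    /* parallelizable fraction (made up) */
    for (int n = 1; n <= 1024; n *= 4)
        printf("n = %4d  speedup = %6.2f\n",
               n, 1.0 / ((1.0 - p) + p / (double)n));
    /* saturates near 20: the serial 5% dominates */
    return 0;
}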

John Savard
From: Del Cecchi on

"Quadibloc" <jsavard(a)ecn.ab.ca> wrote in message
news:4ffd0143-1dc7-4a93-98cc-097530848a6e(a)p2g2000yqh.googlegroups.com...
> This article on Forbes.com
>
> http://www.forbes.com/2010/04/29/moores-law-computing-processing-opinions-contributors-bill-dally.html
>
> came to my attention from visiting HPC Wire.
>
> I think that he is right that massively parallel chips, designed to
> maximize throughput per transistor, are more useful for problems that
> can be broken down and solved in parallel than conventional multicore
> chips.
>
> But I also think he is very wrong to claim that the only problem is
> that people are reluctant to change. That would be true only if every
> problem could be solved as efficiently by a parallel algorithm as by
> a serial one, in which each step can benefit from information from
> all the steps that have gone before.
>
> Some problems can't be broken down into an arbitrary number of small
> pieces. For those problems, we need the fastest possible serial
> performance to work on the big pieces that are present. We can work
> to advance our knowledge of possible parallel algorithms, but they
> either exist or they don't, independently of our wishes.
>
> John Savard

Algorithms are not like rocks, lying around to be discovered. They
are creations of humans, used to express a method of performing a
calculation which in turn is often an approximation of physical
reality. Other times it is just adding up money or something.

The weather happens in a parallel fashion. Gas flows turbulently in a
parallel fashion. Bombs explode in a parallel fashion. All parallel
with respect to space. Time is at least perceived to be sequential:
first things first and all that. Unless you are from Denmark. :-)

So the fact that we don't know enough yet to build hardware and
program it to perform these calculations in a true parallel manner
doesn't mean that it can't be done.
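
As a toy illustration of that split (parallel in space, sequential
in time), here is a sketch of an explicit 1D heat-equation update in
C. The grid size, step count, and the OpenMP pragma are my own
arbitrary choices, not anything from the article:

#include <stdio.h>

#define N     1000
#define STEPS 500

int main(void)
{
    static double u[N], v[N];           /* zero-initialized */
    u[N/2] = 1.0;                       /* initial hot spot */

    for (int t = 0; t < STEPS; t++) {   /* time: serial, first things first */
        #pragma omp parallel for        /* space: every point independent */
        for (int x = 1; x < N - 1; x++)
            v[x] = u[x] + 0.25 * (u[x-1] - 2.0*u[x] + u[x+1]);
        for (int x = 1; x < N - 1; x++)
            u[x] = v[x];
    }
    printf("u[N/2] after %d steps: %g\n", STEPS, u[N/2]);
    return 0;
}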

del


From: Robert Myers on
On Apr 29, 11:23 pm, "Del Cecchi" <delcec...(a)gmail.com> wrote:
> "Quadibloc" <jsav...(a)ecn.ab.ca> wrote in message
>
> news:4ffd0143-1dc7-4a93-98cc-097530848a6e(a)p2g2000yqh.googlegroups.com...
>
> > This article on Forbes.com
>
> > http://www.forbes.com/2010/04/29/moores-law-computing-processing-opinions-contributors-bill-dally.html
>
> > came to my attention from visiting HPC Wire.
>
> > I think that he is right that massively parallel chips, designed to
> > maximize throughput per transistor, are more useful for problems that
> > can be broken down and solved in parallel than conventional multicore
> > chips.
>
> > But I also think he is very wrong to claim that the only problem is
> > that people are reluctant to change. That would be true only if every
> > problem could be solved as efficiently by a parallel algorithm as by
> > a serial one, in which each step can benefit from information from
> > all the steps that have gone before.
>
> > Some problems can't be broken down into an arbitrary number of small
> > pieces. For those problems, we need the fastest possible serial
> > performance to work on the big pieces that are present. We can work
> > to advance our knowledge of possible parallel algorithms, but they
> > either exist or they don't, independently of our wishes.
>
> > John Savard
>
> Algorithms are not like rocks, lying around to be discovered. They
> are creations of humans, used to express a method of performing a
> calculation which in turn is often an approximation of physical
> reality. Other times it is just adding up money or something.
>
> The weather happens in a parallel fashion. Gas flows turbulently in a
> parallel fashion. Bombs explode in a parallel fashion. All parallel
> with respect to space. Time is at least perceived to be sequential:
> first things first and all that. Unless you are from Denmark. :-)
>
> So the fact that we don't know enough yet to build hardware and
> program it to perform these calculations in a true parallel manner
> doesn't mean that it can't be done.

One does wonder, though, why Bill Dally thinks that opinion pieces in
Forbes will move things along any faster.

Technology and markets both seem to have minds of their own. The
success of Moore's Law is a startling exception to our apparent
inability to predict long-term trends, and, now that power scaling has
come to an end (is this really news to anyone by now?), even that
long-term prediction has lost much of its punch.

What comes next? Who knows?

To pursue your example of fluid motion, it's true that nature does
seem to be effortlessly parallel. The Navier-Stokes equations,
though, are derived from a global operation on the Boltzmann equation
that is nearly effortless on paper, but which converts local dynamics
(at least in classical physics) to global dynamics that can be done
accurately in parallel only at very high cost.

One possible (likely?) outcome is that we will ditch the Navier-Stokes
equations (at least for computation) in favor of approaches like
Lattice Boltzmann methods:

http://en.wikipedia.org/wiki/Lattice_Boltzmann_methods
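
To give a flavor of why these methods map so well onto parallel
hardware, here is a minimal C sketch of a D1Q3 lattice Boltzmann
scheme for pure diffusion (the simplest relative of the flow
solvers). The collision step is purely local and the streaming step
touches only nearest neighbors; all the parameters are arbitrary
choices of mine, not anything from the article or the Wikipedia page:

#include <stdio.h>

#define N     64      /* lattice sites */
#define TAU   1.0     /* relaxation time; diffusivity = (TAU - 0.5)/3 */
#define STEPS 100

int main(void)
{
    /* f[0] = rest particles, f[1] = moving right, f[2] = moving left */
    static double f[3][N], fs[3][N];
    const double w[3] = { 2.0/3.0, 1.0/6.0, 1.0/6.0 };

    for (int i = 0; i < 3; i++)         /* density spike in the middle */
        f[i][N/2] = w[i];

    for (int t = 0; t < STEPS; t++) {
        /* collision: local at every site, so each x is independent
           and this loop splits across any number of processors */
        for (int x = 0; x < N; x++) {
            double rho = f[0][x] + f[1][x] + f[2][x];
            for (int i = 0; i < 3; i++)
                f[i][x] += (w[i] * rho - f[i][x]) / TAU;
        }
        /* streaming: nearest neighbor only (periodic boundary) */
        for (int x = 0; x < N; x++) {
            fs[0][x]               = f[0][x];
            fs[1][(x + 1) % N]     = f[1][x];
            fs[2][(x + N - 1) % N] = f[2][x];
        }
        for (int i = 0; i < 3; i++)
            for (int x = 0; x < N; x++)
                f[i][x] = fs[i][x];
    }

    double mass = 0.0;                  /* diffusion conserves mass */
    for (int x = 0; x < N; x++)
        mass += f[0][x] + f[1][x] + f[2][x];
    printf("mass %g (should stay 1), centre density %g\n",
           mass, f[0][N/2] + f[1][N/2] + f[2][N/2]);
    return 0;
}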

I don't know that the mathematics is in great shape, but we've been
using the Navier-Stokes equations for a very long time without ever
taming their mathematics.

To me, the central feature is that this is not a problem for
professors of electrical engineering, no matter how distinguished, nor
for programmers, no matter how slick.

We get new tools, new options, and new constraints, and we adjust.
Was the progress of science and technology ever any different?

Robert.

From: nmm1 on
In article <83v0pgF5o5U1(a)mid.individual.net>,
Del Cecchi <delcecchi(a)gmail.com> wrote:
>"Quadibloc" <jsavard(a)ecn.ab.ca> wrote in message
>news:4ffd0143-1dc7-4a93-98cc-097530848a6e(a)p2g2000yqh.googlegroups.com...
>>
>> Some problems can't be broken down into an arbitrary number of small
>> pieces. For those problems, we need the fastest possible serial
>> performance to work on the big pieces that are present. We can work
>> to advance our knowledge of possible parallel algorithms, but they
>> either exist or they don't, independently of our wishes.
>
>Algorithms are not like rocks, lying around to be discovered. They
>are creations of humans, used to express a method of performing a
>calculation which in turn is often an approximation of physical
>reality. Other times it is just adding up money or something.
>
>The weather happens in a parallel fashion. Gas flows turbulently in a
>parallel fashion. Bombs explode in a parallel fashion. All parallel
>with respect to space. Time is at least perceived to be sequential:
>first things first and all that. Unless you are from Denmark. :-)
>
>So the fact that we don't know enough yet to build hardware and
>program it to perform these calculations in a true parallel manner
>doesn't mean that it can't be done.

No, but Robert Myers has pointed out the issue. Some algorithms
provably can't be parallelised. Therefore, in order to parallelise
an intractable problem, we must go back to the original, physical
(or whatever) problem and use an entirely different mathematical
approach. And, in some cases, there isn't one that is known, so
new mathematics needs inventing, which is a LOT harder than just
inventing an algorithm!
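
A trivial C illustration of the difference: the first loop below
carries a data dependence from each step to the next and, as far as
anyone knows, has no parallel form; the second is an associative
reduction that splits across any number of processors. The constants
are arbitrary:

#include <stdio.h>

int main(void)
{
    /* inherently serial: step t+1 needs the result of step t */
    double x = 0.3;
    for (int t = 0; t < 1000000; t++)
        x = 3.9 * x * (1.0 - x);        /* logistic map */
    printf("logistic map: %g\n", x);

    /* trivially parallel: reassociate the sum any way you like */
    double s = 0.0;
    for (int i = 1; i <= 1000000; i++)
        s += 1.0 / ((double)i * (double)i);
    printf("sum 1/i^2: %g (pi^2/6 = 1.644934...)\n", s);
    return 0;
}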

That being said, MOST of the problem IS only that people are very
reluctant to change. We could parallelise ten or a hundred times
as many tasks as we do before we hit the really intractable cases.


Regards,
Nick Maclaren.
From: Quadibloc on
On Apr 29, 9:23 pm, "Del Cecchi" <delcec...(a)gmail.com> wrote:

> Algorithms are not like rocks, lying around to be discovered.

It's true that an algorithm is a series of steps, specified by a
human.

But whether or not a certain algorithm achieves a desired result is
a mathematical fact.

Thus, the fact that Goldschmidt division works, that it is an
efficient way to perform division, is indeed a fact of mathematics.
So are the algorithms, based on equations by Ramanujan, that allow
pi to be evaluated quickly to high precision, and the algorithms for
factoring, or for testing for primality.

These things are considered to be discoveries, not just inventions.
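
To make that concrete, here is a toy double-precision sketch of
Goldschmidt division in C. Real hardware does this in fixed point
with a table-lookup seed; the frexp/ldexp normalization, the fixed
iteration count, and the assumption that b > 0 are simplifications
of mine, purely for illustration:

#include <stdio.h>
#include <math.h>

double goldschmidt(double a, double b)      /* assumes b > 0 */
{
    int e;
    double n = frexp(b, &e);    /* b = n * 2^e with n in [0.5, 1) */
    a = ldexp(a, -e);           /* so a/b = a' / n */

    for (int i = 0; i < 6; i++) {   /* quadratic convergence: 6 is plenty */
        double f = 2.0 - n;         /* factor that drives n toward 1 */
        a *= f;
        n *= f;
    }
    return a;       /* n is now 1 to machine precision, so a is a/b */
}

int main(void)
{
    printf("%.17g\n", goldschmidt(355.0, 113.0));   /* 3.1415929... */
    return 0;
}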

So finding an algorithm to do a basic task (like sorting) faster
than any previous algorithm has permitted is not an engineering
problem but a fundamental scientific one. And nature and mathematics
get to say "no".

Which means we won't always find algorithms as fast, as efficient, or
as parallelizable as we might like.

John Savard