From: ChrisQ on
nmm1(a)cam.ac.uk wrote:
> In article <IbGdndmlAckfTIzWnZ2dnUVZ_h6dnZ2d(a)bestweb.net>,
> Mayan Moudgill <mayan(a)bestweb.net> wrote:
>> One big problem is that there is no "big picture"; there are an
>> abundance of them, and they all admit of different solutions.
>
> Precisely. Or, in some cases, there is no known solution.
>
>> The question is (excluding graphics), are there enough problems left to
> make it worthwhile to pursue a systematic approach to parallelization
>> (i.e. languages, environments, etc.)? Or is it more cost-effective to
>> work on serial computation for the mainstream, and let the few
>> applications that need parallelism adopt ad-hoc solutions for it?
>
> Yes and no. I doubt that a single solution is worth pursuing, but
> there are areas and aspects that are.
>
> The problem with working on serial computation is that we have reached
> limits that we have no idea how to get round, and believe are absolute,
> at least in the medium term. I don't think that's quite true, but the
> only approach that I know of that has a hope is programming language
> redesign. And I know of nobody working on directions that are likely
> to help with performance.
>
>
> Regards,
> Nick Maclaren.

I think this is getting a bit overdramatic. Change in any discipline is
usually incremental, not revolutionary, and we have to make a start
somewhere to get the parallel programming mindset and idiom
established, even if it's not ideal. We already have all the hardware we
need, so no excuses possible there. We need to get the toolchain sorted.
That is, it's a software problem.

If you accept that part of what is needed is the ability of the
programmer to communicate parallel intent to the machine, then there
must be support for this at the programmer level. To get started, standard
C compilers could have extensions (pragmas) to communicate that intent.
The linker would need to be modified, and system calls would need to be
added to allocate CPU resources. Of course, where there is no OS, the
problem is simplified considerably.
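For what it's worth, OpenMP on gcc already works roughly this way. A
minimal sketch (the array names and sizes are just made up for
illustration):

    #include <stdio.h>

    #define N 1000000

    static double a[N], b[N], c[N];

    int main(void)
    {
        int i;

        for (i = 0; i < N; i++) {
            a[i] = i;
            b[i] = 2.0 * i;
        }

        /* The pragma states the parallel intent; a compiler without
           OpenMP support simply ignores it and emits a serial loop. */
    #pragma omp parallel for
        for (i = 0; i < N; i++)
            c[i] = a[i] + b[i];

        printf("c[%d] = %f\n", N - 1, c[N - 1]);
        return 0;
    }

Built with gcc -fopenmp the loop is split across the available cores;
built without the flag it is exactly the serial program.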

I don't see it as a major problem, the like of which has never been
solved before. Even VAX C in the early '90s had support for parallel
processing idioms via language extensions, though Digital did have the
advantage of owning the big picture all the way from source code to
hardware.

Why can't all this be done now with the GNU toolchain and the Linux OS,
for example? Just whinging that it's an unsolvable problem is a cop-out
and avoidance of the issue...

Regards,

Chris
From: ChrisQ on
Robert Myers wrote:
> On Nov 28, 8:42 pm, Mayan Moudgill <ma...(a)bestweb.net> wrote:
>
>> In yet others, the transforms required to parallelize the program are
>> problem specific, require domain specific knowledge, and are complicated
>> enough that the actual expression of the parallelism is a nit.
>>
>
> I have a hard time imagining how you view the world. I've made enough
> not-so-subtle comments about the IBM view of the world, so I'm going
> to drop that tack, but I assume that your world-view is driven by
> whatever priorities are given to you and that you assume that that is
> "the world."
>
> I have an even harder time imagining how Nick views the world. He
> presumably works in the midst of people like me, and yet, to read his
> posts, you'd begin to think that what the world really lacks is enough
> grubby details to obsess over endlessly.
>
> The reality is that, for people who don't particularly want to spend
> their time obsessing over details, computers have not become easier to
> use along with becoming more powerful. Quite the opposite.
>
> The recent thread on c talked as if using gcc were a perfectly
> reasonable proposition and as if code written in c really were machine
> and, even more important, environment-portable. Every scrap of non-
> trivial c code I've touched recently comes with tons of goo.
>
> ./configure is great when it works. When it doesn't, you are once
> again reduced to worrying about the kinds of details that are central
> to why you *didn't* major in EE/CS.
>
> Nick likes Fortran. I'll venture to say that I dislike Fortran much
> less than I dislike c, but the entire world of programming is geared
> to java/c programming these days, and Fortran *always* seemed like a
> step backward from PL/I.
>
> You said that dense matrix programming is in hand. What you really
> mean is that, if the builders of big clusters get their boxes to
> perform reasonably well on anything, it will be on linpack. As I
> commented on another thread recently, the linpack mentality has
> resulted in many codes being rewritten so that they don't exploit
> massive vector pipes very easily, and massive vector pipes have
> suddenly become very inexpensive. If that's what you mean by "in
> hand," then I guess things are in hand.
>
> It's no big deal, really. People like me have always worked close to
> the iron and I don't see that changing.
>
> It's the snarky, knowing tone that pisses me off.
>
> Robert.
>

It's quite easy to make C portable. Abstract all hardware references
through an abstraction layer, abstract all needed types through a
typedef header, and stick to strict ANSI with all warnings and errors
enabled. Keep going until there are no errors or warnings left.
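As a sketch of what I mean by a typedef header (the names and mappings
are just an example, to be checked once against each target):

    /* types.h -- project-wide fixed-width types, adjusted once per
       target. On a C99 compiler these could map onto <stdint.h>. */

    #ifndef TYPES_H
    #define TYPES_H

    typedef unsigned char   u8;
    typedef signed char     s8;
    typedef unsigned short  u16;
    typedef signed short    s16;
    typedef unsigned long   u32;
    typedef signed long     s32;

    #endif /* TYPES_H */

Then build everything with something like
gcc -ansi -pedantic -Wall -Wextra -Werror so that anything non-portable
shows up immediately.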

It's not rocket science (sorry :-), it's just that some people are in
too much of a hurry to do it right; they produce unreadable,
undisciplined code and then wonder why they get the wrong answers...

Regards,

Chris


From: Robert Myers on
On Dec 1, 5:53 am, ChrisQ <m...(a)devnull.com> wrote:

>
> It's quite easy to make C portable. Abstract all hardware references
> through an abstraction layer, abstract all needed types through a
> typedef header, and stick to strict ANSI with all warnings and errors
> enabled. Keep going until there are no errors or warnings left.
>
> It's not rocket science (sorry :-), it's just that some people are in
> too much of a hurry to do it right; they produce unreadable,
> undisciplined code and then wonder why they get the wrong answers...
>
>

Jokes about rocket science are in some ways richly deserved.

It's been my experience that abstraction layers are just another
obstacle to portability.

Even without e^3 (embrace, extend, extinguish), abstraction layers
morph and fork, and the net result is some never-ending version of
DLL hell. Once you include e^3, it's hopeless.

Whether it's object-oriented classes (Java or C++ variants), languages
to end all languages, object-library variants, broken APIs, weird
header variants, development forks, and things I probably can't even
imagine, it is a never-ending nightmare if all you want is to get
things done. Even if you limit yourself to the GNU toolchain (not
realistic if you are performance-driven), you run into the same
problems.

I suspect that reality explains the success of Matlab and
Mathematica. Of course, when those businesses are bought or morphed
or whatever, tons of good work will be left high and dry. We'll have
theses fluffed out with source code that no one can execute because
the appropriate run-time environment no longer exists.

The open-source community is only marginally better. At least for
open source, if you want to badly enough, you can resurrect anything.

Robert.
From: ChrisQ on
Robert Myers wrote:

>
> Jokes about rocket science are in some ways richly-deserved.
>
> It's been my experience that abstraction layers are just another
> obstacle to portability.
>
> Even without e^3 (embrace, extend, extinguish), abstraction layers
> morph and fork, and the net version is some never-ending version of
> dll hell. Once you include e^3, it's hopeless.
>
> Whether it's object-oriented classes (Java or C++ variants), languages
> to end all languages, object-library variants, broken api's, weird
> header variants, development forks, and things I probably can't even
> imagine, it is a never-ending nightmare if all you want is to get
> things done. Even if you limit yourself to the gnu toolchain (not
> realistic if you are performance-driven), you run into the same
> problems.
>

By abstraction layer, I'm thinking embedded, where you tend to run up
against the same sort of functional requirements over and over again.
Things like timers, comms handlers, queues, protocol stacks, LCD
graphics, hardware drivers and more mean that there is a real opportunity
for code reuse if the underlying hardware can at least be partly
abstracted away behind a set of defined interfaces. If you write all these
functions in a strict ANSI subset (MISRA C is not a bad reference), they
should compile with any other ANSI compiler. You just have to do some
serious design work and define all the interfaces carefully (a sketch of
the sort of interface I mean is below). The devil in the detail is that
things like interrupt handlers and any direct hardware access will always
be device specific, but that's what the abstraction layer is for. What you
end up with is a load of source libraries that are included as required
into the build / makefile of every project. It saves a lot of time and
effort, especially for systems where there is no RTOS.
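A minimal sketch of one such interface, assuming a hypothetical comms
handler (the function names and the types header are made up for
illustration):

    /* uart.h -- hypothetical portable comms interface.
       Application code sees only these calls; each target supplies
       its own uart_<board>.c with the device-specific registers
       and interrupt handlers. */

    #ifndef UART_H
    #define UART_H

    #include "types.h"   /* the project typedef header (u8, u16, u32) */

    typedef enum { UART_OK, UART_ERR_BUSY, UART_ERR_PARAM } uart_status_t;

    uart_status_t uart_init(u32 baud_rate);
    uart_status_t uart_send(const u8 *buf, u16 len);
    u16           uart_receive(u8 *buf, u16 max_len);

    #endif /* UART_H */

The portable protocol code calls uart_send() and never touches a
register; only the per-board implementation file changes when the
hardware does.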

I've always avoided Windows programming of any kind, partly because it
just looks a mess, but also because it ties the programmer down to the
uSoft methodology in terms of all the prebuilt libraries, templates,
toolchain and all the goop that comes with it. Even trad Unix, Solaris
10 here, has started to get the Linux bloat disease in terms of all the
indecipherable packages you have to add just to install, for example,
Apache. The world of computing has forgotten the KISS principle in
favour of titillation, unnecessary complexity and the satisfying of
hungry egos, methinks. Just look at a current default-install Linux
desktop to see what I mean. Give me a simple editor, makefile and
compiler any day over that.

>
> I suspect that reality explains the success of Matlab and
> Mathematica. Of course, when those businesses are bought or morphed
> or whatever, tons of good work will be left high and dry. We'll have
> theses fluffed out with source code that no one can execute because
> the appropriate run-time environment no longer exists.
>
> The open-source community is only marginally better. At least for
> open source, if you want to badly enough, you can resurrect anything.
>
> Robert.

One of the best things about open source is that an effectively
unlimited number of eyes get to review the code: the ultimate code
review, and something no company on earth could afford to do. I'm not
saying that it always produces the best code, but the possibility is
there at least. Life is lost without at least some hope, even if it must
be tempered by a healthy dose of cynicism :-)...

Regards,

Chris