From: Michael J. Mahon on
fred mueller wrote:
> Paul Schlyter wrote:
>
>> In article <S6mdnYNMG-caHhnZnZ2dnUVZ_vidnZ2d(a)comcast.com>,
>> Michael J. Mahon <mjmahon(a)aol.com> wrote:
>>
>>> Paul Schlyter wrote:
>>>
>>>> Some programming languages even had these formatting rules built-in.
>>>> In e.g. FORTRAN (up to FORTRAN-77), these rules had to be obeyed:
>>>>
>>>> Column 1: C or * marked the whole line as a comment
>>>> Columns 2-6: Labels went here - 1-5 decimal digits
>>>> Column 7: A non-space character here marked this line as a
>>>> continuation line
>>>> Columns 8-71: The program statement went here
>>>> Columns 72-80: Comment - anything here was ignored by the compiler
>>>
>>> Actually, it was:
>>>
>>> Columns 1-5: Statement number, or "C" in column 1 for comment lines
>>> Column 6: Non-blank marks continuation line (often 1-9 to indicate
>>> sequence of the continuations if more than one)
>>> Columns 7-72: FORTRAN Statement
>>> Columns 73-80: ID/sequence field, ignored by compiler
>>
>>
>> Thanks for the correction. Yep, my FORTRAN skills are getting rusty
>> -- it's some 20 years since I last coded in that language.
>>
>
> If you think you are rusty, it's been 45 years for me and on an IBM 7094
> mentioned below. And just think, that was state of the art then. It was
> fun getting the deck back together if you were clumsy and dropped it.
> Speaking from experience :-).
>
>>
>>
>>> The sequence field was often left blank on manually keypunched cards,
>>> but was almost always filled with some combination of a "deck" ID and
>>> a sequence number on machine-punched decks, as were "binary" decks.
>>>
>>> I remember many times using an IBM 82 card sorter to restore order
>>> to a deck containing gravity-induced entropy. ;-)
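
To make the corrected layout above concrete, here is a mock-up of a
comment card, a labeled statement, and one continuation card, with a
column ruler added for illustration:

    12345678901234567890
    C THIS IS A COMMENT
      100 X = A + B
         1      + C

Statement number 100 is right-justified in columns 1-5, the "1" in
column 6 flags the continuation, and the statement fields begin at
column 7.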

I think most people who used punched cards had the experience of
dropping a deck at one time or another! And if it wasn't a dropped
deck, it was a reader jam and subsequent imperfect recovery.

I habitually used a Magic Marker to make a diagonal stripe across the
top of my deck after making any changes, using different colors and
locations for the stripes over time. That is a very simple way to
"sequence" a deck, and provides instant visual verification that the
cards are in the intended order.

-michael

Parallel computing for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."
From: Paul Schlyter on
In article <hbOdncoOR8IOJxjZnZ2dnUVZ_vqdnZ2d(a)comcast.com>,
Michael J. Mahon <mjmahon(a)aol.com> wrote:

> mdj wrote:
>> Paul Schlyter wrote:
>
> <snip>
>
>> My concern is the manner in which small libraries or copy/paste chunks
>> of code pollute the relatively portable space of newer languages. In
>> .NET, you can issue a keyword and switch into old-school mode. It
>> doesn't exactly provide an environment that encourages good design. Why
>> you would include features that good developers won't use is beyond me,
>> unless of course it's lousy developers you're catering to, which may
>> well be the point.
>>
>>
>>>I suppose you mean "the commercial world" when you say "the real
>>>world". Yes, the commerical world is a continuous hectic race where
>>>there's not really any time to do solid work. It's more important
>>>that the product is flashy and that it appears early. Buggy? Of
>>>course, but so what? Let customer support take care of the
>>>complaining customers while we're developing the next product to
>>>appear early and being flashier still .... the lifetime of a software
>>>product is nowadays so short anyway.....
>>
>>
>> It's actually irrelevant what sector we speak of - productivity
>> enhancements are productivity enhancements. While there are sectors
>> where you can 'afford' the extra time investment required to develop in
>> legacy languages, there's little reason to do so.
>
> The previous exchange neglects to make the important distinction between
> time spent "up front" on design and implementation, and time spent
> "after the fact" on support and maintenance.
>
> It is a sad but inescapable fact of commercial life that there is never
> time to do a job right, but always time to try to make it work.
>
> One of my development laws is: "'Quick and dirty' is never quick but
> always dirty."
>
> The commercial pressure to get a product out the door that Paul refers
> to is so real that it generally precludes the design and implementation
> team from doing what they would do in the "best of all possible worlds"
> and instead condemns them to shipping a product that has many structural
> flaws. Those flaws will cost dearly over the next few years, but, given
> the structure of corporate software teams, it is unlikely that senior
> team members will have to deal with much of the flak.
>
> Management will be rewarded for "making the schedule" and the elevated
> support costs won't hit the fan until months later--when they can be
> blamed on an inexperienced implementation team.
>
> Matt is making an idealistic argument for what is achievable with great
> discipline--and there's nothing wrong with that! But in the "real
> world", discipline is much harder to come by, and almost impossible to
> stick to without (rare) management support.

I couldn't have said that better myself....

In the end, it's really the customers' fault -- after all, we make the
choice to get stuff as cheaply and quickly as we can, rather than wait
for, and pay for, quality stuff.

So I guess we all get what we deserve....



>>>In such an environment it's probably best to throw away old code and
>>>let everyone reinvent their wheels once more .... with humps and bumps
>>>that there's no time to polish away.
>>
>>
>> In this case, the old wheel has humps and bumps with regards to
>> portability and security. Should we throw those away? Absolutely.
>
> Never assume that doing something over will mean doing it better.
> Life is full of counterexamples.
>
> It will only be done better if a higher quality design and
> implementation can be done, and that's a big "if". For one thing,
> management always thinks of the bad old solution as a _solution_,
> so they are unwilling to invest much time and effort in re-solving
> a problem "just" to obtain some "airy-fairy benefit" on the *next*
> manager's watch... ;-(
>
>>>>This is the real-world effect of Microsoft's design decision (pollute
>>>>the new language with old problems) versus the Java model. Still
>>>>hobbling with legacy for no reason other than a couple of very
>>>>poorly conceived design ideas. And ones that history has already shown
>>>>have a simple, clean solution.
>
> Many of the short-sighted additions to otherwise clean languages are
> there *exactly* to solve short-term, schedule-driven problems.
>
> We have a long history as a species of mortgaging the future for the
> present. Whoever said "Pay me now or pay me later" neglected to
> mention the effect of high interest rates. ;-)
>
> As I often say, we tend to use our nose for a wall detector. It works
> very well in the sense that it detects all the walls, but by the time
> it works, many of the wall's consequences are already felt. ;-)
>
>>>>"Those who cannot learn from history are doomed to repeat it...."
>
> And, statistically, that would be all of us... ;-)
>
>>>Actually, there are lots of people who do this, for their enjoyment.
>>>I'm referring to vintage computing of course, and this very newsgroup is
>>>part of that movement.
>>>
>>>Yep, it belongs to "the real world" too....
>>
>>
>> Of course, but when your platform isn't evolving you don't need
>> evolving development methodologies. The older techniques are adequate,
>> and in the case of the Apple II, a great deal of fun in such a
>> constrained environment. However, old techniques only scale so high,
>> and hit those limits. In the case of C/C++, those limits have been hit,
>> or close to it. There's still a large problem domain you use these
>> tools for, as it's the most appropriate. But many new problem domains
>> demand tools that have fewer restrictions, particularly in terms of
>> development time. Sometimes this involves using a slightly more
>> constrained language. Less is more!
>
> Software folks are more likely to be afflicted with grandiosity than
> hardware folks. Perhaps it's the stronger engineering discipline of
> the hardware world, perhaps it's the real smoke when something "blows
> up", or perhaps it's because hardware people live *constantly* with
> constricting limits that discipline their dreams.
>
> Software folks seldom experience real limits anymore. They find it
> all too easy to imagine that they can accomplish *anything* through
> programming (even if they don't understand how to do it ;-).
>
> Most of the problems of today's software are a result of attempting to
> deal with more complexity than we can actually manage--and on a short
> schedule. ;-) The irony is that much of the complexity comes from
> aspects of "solutions" that are entirely optional--like GUI eye candy.
>
> <fogeymode>
>
> Most of what is done with computers today is much like what was done
> decades ago--text editing and formatting, modest number crunching.
> Only now, instead of spending 20 minutes choosing your words when
> writing a letter, you spend 10 minutes writing and 10 minutes choosing
> fonts. ;-)
>
> High quality image processing is relatively new, because of its demand
> for large memories and fast processors, but the basic image processing
> operations are purely mathematical and easily disciplined. The same is
> true for most "media" processing.
>
> Pasting databases together over the web is an example of an emergent
> capability, but one that is largely done using scripting languages.
>
> Sometimes it's hard to see just what real value is being created in
> this endless hierarchy of levels of interpretation. If anything
> turns out to be really useful, then it could be rewritten "flatter"
> to save about a factor of 1000 in computing resources!
>
> Maybe in computers, too, power corrupts. ;-)
>
> </fogeymode>
>
> -michael
>
> Parallel computing for 8-bit Apple II's!
> Home page: http://members.aol.com/MJMahon/
>
> "The wastebasket is our most important design
> tool--and it is seriously underused."
--
----------------------------------------------------------------
Paul Schlyter, Grev Turegatan 40, SE-114 38 Stockholm, SWEDEN
e-mail: pausch at stockholm dot bostream dot se
WWW: http://stjarnhimlen.se/
From: mdj on

Michael J. Mahon wrote:

> > It's actually irrelevant what sector we speak of - productivity
> > enhancements are productivity enhancements. While there are sectors
> > where you can 'afford' the extra time investment required to develop in
> > legacy languages, there's little reason to do so.
>
> The previous exchange neglects to make the important distinction between
> time spent "up front" on design and implementation, and time spent
> "after the fact" on support and maintenance.

We're straying pretty far from the comparison of approaches between
implementation languages, the core of which was portability, but
anyway....

> It is a sad but inescapable fact of commercial life that there is never
> time to do a job right, but always time to try to make it work.
>
> One of my development laws is: "'Quick and dirty' is never quick but
> always dirty."
>
> The commercial pressure to get a product out the door that Paul refers
> to is so real that it generally precludes the design and implementation
> team from doing what they would do in the "best of all possible worlds"
> and instead condemns them to shipping a product that has many structural
> flaws. Those flaws will cost dearly over the next few years, but, given
> the structure of corporate software teams, it is unlikely that senior
> team members will have to deal with much of the flak.
>
> Management will be rewarded for "making the schedule" and the elevated
> support costs won't hit the fan until months later--when they can be
> blamed on an inexperienced implementation team.
>
> Matt is making an idealistic argument for what is achievable with great
> discipline--and there's nothing wrong with that! But in the "real
> world", discipline is much harder to come by, and almost impossible to
> stick to without (rare) management support.

I agree completely, as would anyone who's done it for any length of
time. My point *is* that it's idealistic to believe that you can
expect such discipline in a development team. I'm not arguing about how
people will work, but about the development tools they're using. If
anything, Paul's argument is the idealistic one: it assumes programming
teams will produce portable C/C++ code just because it's feasible to do
so with great discipline. Sure you can do it. In the open source world
most of the code achieves architecture neutrality (but not often OS
neutrality). But there's no time pressure there.

Java teams will produce *far* more portable solutions essentially by
accident, since they're using toolsets that are designed with those
concerns in mind.
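
To give a trivial example of what I mean by "portable by accident" (a
minimal sketch, not taken from any real product):

    import java.io.File;

    public class Portable {
        public static void main(String[] args) {
            // The path separator is supplied by the runtime ("/" or "\"),
            // so nobody is tempted to hard-code one.
            File config = new File("data" + File.separator + "config.txt");
            System.out.println(config.getAbsolutePath());

            // Primitive sizes are fixed by the language spec: an int is
            // 32 bits on every platform, so there is none of the sizeof()
            // guesswork that portable C has to get right by hand.
            System.out.println(Integer.SIZE + "-bit ints, everywhere");
        }
    }

The obvious way to write it is also the portable way. The equivalent C
has to get the separator, the integer widths, and the byte order right
by discipline; here the toolset never offers the non-portable choice as
the path of least resistance.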

> >>In such an environment it's probably best to throw away old code and
> >>let everyone reinvent their wheels once more .... with humps and bumps
> >>that there's no time to polish away.
> >
> >
> > In this case, the old wheel has humps and bumps with regards to
> > portability and security. Should we throw those away? Absolutely.
>
> Never assume that doing something over will mean doing it better.
> Life is full of counterexamples.

Indeed. But if someone does something over for you (designs a new
language) and the results come out better, then you should take a look
at it. I'm not big on reinventing wheels either (hell, I'm a UNIX fan).

> It will only be done better if a higher quality design and
> implementation can be done, and that's a big "if". For one thing,
> management always thinks of the bad old solution as a _solution_,
> so they are unwilling to invest much time and effort in re-solving
> a problem "just" to obtain some "airy-fairy benefit" on the *next*
> manager's watch... ;-(

There are some good managers who see the benefits, but they're a rarity
for sure. I've seen at least one place do the opposite of everyone
else, and hire contract teams to maintain their existing systems, while
their staff went off and designed newer systems. The idea was that the
money they saved on maintenance down the track more than covered the
contracting costs, and they had better systems and happier staff, for
less money.

> >>>This is the real-world effect of Microsoft's design decision (pollute
> >>>the new language with old problems) versus the Java model. Still
> >>>hobbling with legacy for no reason other than a couple of very
> >>>poorly conceived design ideas. And ones that history has already shown
> >>>have a simple, clean solution.
>
> Many of the short-sighted additions to otherwise clean languages are
> there *exactly* to solve short-term, schedule-driven problems.

Indeed. There have been a few additions to Java of late to make it more
like C, which makes it easier to port code. They've managed not to
break the portability of Java in the process, but they've added a lot
of functionality that allows some pretty ugly code. In some ways I
*like* it that you have to rework certain things, and if it's too much
effort, use the callthroughs.
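
By "callthroughs" I mean JNI, the Java Native Interface. A minimal
sketch of the shape such a wrapper takes -- the class and method names
here are made up for illustration:

    public class LegacyWrapper {
        static {
            // Loads liblegacy.so / legacy.dll from java.library.path.
            System.loadLibrary("legacy");
        }

        // Implemented in C on the far side of the JNI boundary;
        // nothing from here down is portable "by accident" any more.
        public native int crunch(byte[] data);
    }

The ugly, non-portable part is localized behind one clearly marked
class, which is rather the point.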

The limitations placed in Java haven't prevented it from becoming one
of the most widely used languages today. The portability is a real
benefit too, especially now that we're starting to hit some real
limitations: floor space and heat. Having the option to pick different
hardware based on the restrictions of the physical environment has real
benefit.

> We have a long history as a species of mortgaging the future for the
> present. Whoever said "Pay me now or pay me later" neglected to
> mention the effect of high interest rates. ;-)

lol!

> As I often say, we tend to use our nose for a wall detector. It works
> very well in the sense that it detects all the walls, but by the time
> it works, many of the wall's consequences are already felt. ;-)

Brilliant analogy :-)

> >>>"Those who cannot learn from history are doomed to repeat it...."
>
> And, statistically, that would be all of us... ;-)
>
> >>Actually, there are lots of people who do this, for their enjoyment.
> >>I'm referring to vintage computing of course, and this very newsgroup is
> >>part of that movement.
> >>
> >>Yep, it belongs to "the real world" too....
> >
> >
> > Of course, but when your platform isn't evolving you don't need
> > evolving development methodologies. The older techniques are adequate,
> > and in the case of the Apple II, a great deal of fun in such a
> > constrained environment. However, old techniques only scale so high,
> > and hit those limits. In the case of C/C++, those limits have been hit,
> > or close to it. There's still a large problem domain you use these
> > tools for, as it's the most appropriate. But many new problem domains
> > demand tools that have fewer restrictions, particularly in terms of
> > development time. Sometimes this involves using a slightly more
> > constrained language. Less is more!
>
> Software folks are more likely to be afflicted with grandiosity than
> hardware folks. Perhaps it's the stronger engineering discipline of
> the hardware world, perhaps it's the real smoke when something "blows
> up", or perhaps it's because hardware people live *constantly* with
> constricting limits that discipline their dreams.

I agree. I don't feel that I'm being grandiose, though. I'm referring
to a language that's been in the field for a decade now. The Ruby boys
sure get a little grandiose, but hey, a little dreaming is necessary
for progress. It's all about balance.

I'm finding FPGAs pretty fun in that regard. Hardware, without
breadboards and blue smoke :-)

> Software folks seldom experience real limits anymore. They find it
> all too easy to imagine that they can accomplish *anything* through
> programming (even if they don't understand how to do it ;-).

And routinely fail to achieve anything useful in the process :-)

> Most of the problems of today's software are a result of attempting to
> deal with more complexity than we can actually manage--and on a short
> schedule. ;-) The irony is that much of the complexity comes from
> aspects of "solutions" that are entirely optional--like GUI eye candy.

This is the principal reason for evolving languages and tools. Improved
languages allow ideas to be expressed more concisely and support
encapsulation mechanisms that allow complex modules to be reused, thus
allowing complexity to be more effectively managed. Sure, it's
idealistic to expect new tools to solve all the problems; they don't.
They do, however, mitigate some of the old issues and allow some
progress to be made.
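
A sketch of the kind of encapsulation I mean (the names are
hypothetical, not from any real library):

    import java.util.HashMap;
    import java.util.Map;

    // Callers program against the contract alone...
    interface Storage {
        void put(String key, byte[] value);
        byte[] get(String key);
    }

    // ...while the messy details stay hidden and replaceable. Swapping
    // in a disk- or network-backed version touches no calling code.
    class InMemoryStorage implements Storage {
        private final Map<String, byte[]> store =
            new HashMap<String, byte[]>();

        public void put(String key, byte[] value) { store.put(key, value); }
        public byte[] get(String key) { return store.get(key); }
    }

The complexity behind the interface can grow without leaking into the
code that uses it, which is what lets a team actually manage it.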

> <fogeymode>
>
> Most of what is done with computers today is much like what was done
> decades ago--text editing and formatting, modest number crunching.
> Only now, instead of spending 20 minutes choosing your words when
> writing a letter, you spend 10 minutes writing and 10 minutes choosing
> fonts. ;-)
>
> High quality image processing is relatively new, because of its demand
> for large memories and fast processors, but the basic image processing
> operations are purely mathematical and easily disciplined. The same is
> true for most "media" processing.

Ultimately, that's all we do with machines - manipulate media. Now that
extremely cheap machines can manipulate high definition AV content in
better than realtime, we're running out of reasons to make faster
machines. This is a good thing, in many ways.

> Pasting databases together over the web is an example of an emergent
> capability, but one that is largely done using scripting languages.
>
> Sometimes it's hard to see just what real value is being created in
> this endless hierarchy of levels of interpretation. If anything
> turns out to be really useful, then it could be rewritten "flatter"
> to save about a factor of 1000 in computing resources!

It certainly opens up areas of research for 'real' Computer Science.
How do we get the computer to flatten the hierarchies for us ... ;-)

> Maybe in computers, too, power corrupts. ;-)

It sure does... Don't get me wrong, guys, I've enjoyed my journey from
BASIC, 6502 assembly, Pascal, through to the C/C++ world, through Java,
and now emerging into 'scripting' languages like Ruby (although
scripting implies something ad-hoc, which I don't like). Through this
journey I've essentially used the highest level tool that would let me
get what I need done on current computers. Each step has allowed me to
do more, with less (as in my time). It's still fun to go back and write
6502 code on an Apple II, so I do. Still fun (at times) to write C, and
at times I do that (on the Apple II these days). It's a fun ride.

Matt

From: Michael J. Mahon on
mdj wrote:
> Michael J. Mahon wrote:

<snip>

>>It will only be done better if a higher quality design and
>>implementation can be done, and that's a big "if". For one thing,
>>management always thinks of the bad old solution as a _solution_,
>>so they are unwilling to invest much time and effort in re-solving
>>a problem "just" to obtain some "airy-fairy benefit" on the *next*
>>manager's watch... ;-(
>
>
> There are some good managers who see the benefits, but they're a rarity
> for sure. I've seen at least one place do the opposite of everyone
> else, and hire contract teams to maintain their existing systems, while
> their staff went off and designed newer systems. The idea was that the
> money they saved on maintenance down the track more than covered the
> contracting costs, and they had better systems and happier staff, for
> less money.

And most organizations give maintenance tasks to new programmers, as
a kind of hazing, I think!

But not supporting the code that you produced, at least for its first
year in the field, deprives a team of the *real* learning experience,
in which you discover which of your grand ideas worked and which didn't.

And it also serves as a test of whether the code is *actually*
maintainable, as opposed to theoretically maintainable.

I see doing at least "early" maintenance as a kind of accountability.

>>>>>This is the real-world effect of Microsoft's design decision (pollute
>>>>>the new language with old problems) versus the Java model. Still
>>>>>hobbling with legacy for no reason other than a couple of very
>>>>>poorly conceived design ideas. And ones that history has already shown
>>>>>have a simple, clean solution.
>>
>>Many of the short-sighted additions to otherwise clean languages are
>>there *exactly* to solve short-term, schedule-driven problems.
>
>
> Indeed. There have been a few additions to Java of late to make it more
> like C, which makes it easier to port code. They've managed not to
> break the portability of Java in the process, but they've added a lot
> of functionality that allows some pretty ugly code. In some ways I
> *like* it that you have to rework certain things, and if it's too much
> effort, use the callthroughs.
>
> The limitations placed in Java haven't prevented it from becoming one
> of the most widely used languages today. The portability is a real
> benefit too, especially now that we're starting to hit some real
> limitations: floor space and heat. Having the option to pick different
> hardware based on the restrictions of the physical environment has real
> benefit.
>
>
>>We have a long history as a species of mortgaging the future for the
>>present. Whoever said "Pay me now or pay me later" neglected to
>>mention the effect of high interest rates. ;-)
>
>
> lol!
>
>
>>As I often say, we tend to use our nose for a wall detector. It works
>>very well in the sense that it detects all the walls, but by the time
>>it works, many of the wall's consequences are already felt. ;-)
>
>
> Brilliant analogy :-)
>
>
>>>>>"Those who cannot learn from history are doomed to repeat it...."
>>
>>And, statistically, that would be all of us... ;-)
>>
>>
>>>>Actually, there are lots of people who do this, for their enjoyment.
>>>>I'm referring to vintage computing of course, and this very newsgroup is
>>>>part of that movement.
>>>>
>>>>Yep, it belongs to "the real world" too....
>>>
>>>
>>>Of course, but when your platform isn't evolving you don't need
>>>evolving development methodologies. The older techniques are adequate,
>>>and in the case of the Apple II, a great deal of fun in such a
>>>constrained environment. However, old techniques only scale so high,
>>>and hit those limits. In the case of C/C++, those limits have been hit,
>>>or close to it. There's still a large problem domain you use these
>>>tools for, as it's the most appropriate. But many new problem domains
>>>demand tools that have fewer restrictions, particularly in terms of
>>>development time. Sometimes this involves using a slightly more
>>>constrained language. Less is more!
>>
>>Software folks are more likely to be afflicted with grandiosity than
>>hardware folks. Perhaps it's the stronger engineering discipline of
>>the hardware world, perhaps it's the real smoke when something "blows
>>up", or perhaps it's because hardware people live *constantly* with
>>constricting limits that discipline their dreams.
>
>
> I agree. I don't feel that I'm being grandiose, though. I'm referring
> to a language that's been in the field for a decade now. The Ruby boys
> sure get a little grandiose, but hey, a little dreaming is necessary
> for progress. It's all about balance.
>
> I'm finding FPGAs pretty fun in that regard. Hardware, without
> breadboards and blue smoke :-)

As we enter the era of 10 million transistor FPGAs, system compilers,
and "turnarounds" measured in seconds--in short, as the constraints on
hardware design are eased--I expect to see many of the same problems
that have afflicted software shift into the "hardware" realm.

Discipline is hard-won. Discipline can only coexist with ease and
convenience *after* it has been formed through hard experience, since
ease puts greater demands on discipline.

Tools can give the appearance of discipline by restricting expression,
but to a truly disciplined mind, tools are merely secondary.

I think of "strict" tools as "discipline for the undisciplined", but
so much of system design is outside the realm of any formal tools, that
there is no substitute for design discipline. A terrible fate awaits
those who think that there is.

My first software tools phase was a macrogenerator phase--anything
was possible, and complexity could flourish in five pages of code.
It was not well suited to team efforts. ;-)

My second software tools phase was strict typing and enforced structure.
My mantra was, "If you think you need a macro, then something is missing
from the language." Experienced programmers chafed at the "training
wheels" the language forced upon them. Some of them filled their code
with unstructured "workarounds", perhaps a sign of their resentment at
the strictures of the programming environment. (Unstructured code can
be written in any language.)

My third software tools phase was "the only thing that matters is
the team". I strove for a small team of 98th percentile people, who
implicitly understood the need for and benefits of discipline, and
who had learned this by experience. Tools are useful, but secondary.
If a tool is really needed, it will be written. (Structured code can
be written in any language.)

Although I don't consider any of the three approaches ideal, there
is no doubt that the third worked the best, both in terms of team
esprit and in terms of product quality (function & reliability).

Don't count too much on tools--it's the people that make the real
difference.

>>Software folks seldom experience real limits anymore. They find it
>>all too easy to imagine that they can accomplish *anything* through
>>programming (even if they don't understand how to do it ;-).
>
>
> And routinely fail to achieve anything useful in the process :-)

And those are the *good* cases. Many times they achieve something that
is sold and wastes many people's time, but is actually worthless.

Remember the "Serius Computer Corporation", whose products were so
full of niggly little errors and inconsistencies that no one ever
found out that they actually didn't work at all! (Paraphrased ;-)

>>Most of the problems of today's software are a result of attempting to
>>deal with more complexity than we can actually manage--and on a short
>>schedule. ;-) The irony is that much of the complexity comes from
>>aspects of "solutions" that are entirely optional--like GUI eye candy.
>
>
> This is the principal reason for evolving languages and tools. Improved
> languages allow ideas to be expressed more concisely and support
> encapsulation mechanisms that allow complex modules to be reused, thus
> allowing complexity to be more effectively managed. Sure, it's
> idealistic to expect new tools to solve all the problems; they don't.
> They do, however, mitigate some of the old issues and allow some
> progress to be made.

For balance, I have to point out that they also permit *needless*
complexity to be more effectively managed. When "Hello, World!"
executes 8 megabytes of code, you know something has gone sour.
(And, yes, I do include *all* the code executed, not just the code
in the "Hello, World!" module.)

>><fogeymode>
>>
>>Most of what is done with computers today is much like what was done
>>decades ago--text editing and formatting, modest number crunching.
>>Only now, instead of spending 20 minutes choosing your words when
>>writing a letter, you spend 10 minutes writing and 10 minutes choosing
>>fonts. ;-)
>>
>>High quality image processing is relatively new, because of its demand
>>for large memories and fast processors, but the basic image processing
>>operations are purely mathematical and easily disciplined. The same is
>>true for most "media" processing.
>
>
> Ultimately, that's all we do with machines - manipulate media. Now that
> extremely cheap machines can manipulate high definition AV content in
> better than realtime, we're running out of reasons to make faster
> machines. This is a good thing, in many ways.

Since we're running out of silicon "smoothness", and with it Moore's
"Law", I suppose its just as well that we're feeling satisfied with
where we are. ;-)

There are still orders of magnitude of improvements in cost/performance,
so the game certainly isn't up. But those lavish resources will demand
ever more discipline from designers, lest they all go down the drain
(like much of the last few orders of magnitude ;-).

I have no problem conceiving computing tasks that would require 1000
times as much processing power as is currently affordable, but it all
needs to be spent on crunching, not on 3D widgets and redundant code.

>>Pasting databases together over the web is an example of an emergent
>>capability, but one that is largely done using scripting languages.
>>
>>Sometimes it's hard to see just what real value is being created in
>>this endless hierarchy of levels of interpretation. If anything
>>turns out to be really useful, then it could be rewritten "flatter"
>>to save about a factor of 1000 in computing resources!
>
>
> It certainly opens up areas of research for 'real' Computer Science.
> How do we get the computer to flatten the hierarchies for us ... ;-)

I think that will be done as a partnership between very clever designers
and very fast machines.

-michael

Parallel computing for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."
From: Jorge ChB on
mdj <mdj.mdj(a)gmail.com> wrote:

> Now that extremely cheap machines can manipulate high definition AV
> content in better than realtime, we're running out of reasons to make
> faster machines.
>
> This is a good thing, in many ways.

Wow, wait!
What are you saying?
A good thing in what way?
More MIPS + less watts + smaller.
That's what hardware designers are after and will always be after.
That's been the key to now-possible, before-unthinkable everyday things
like mobiles, iPods, PSPs, Palms, DVB TV, ABS, EFI, portables, Google
Earth, WiFi, etc. etc. A microprocessor everywhere.
And that's the key to many unthinkable, wonderful new inventions that
are yet to come and won't be possible unless the hardware keeps
evolving == (more MIPS + less watts + smaller)

The only lack "of reasons to make faster machines" I can think of comes
from the fact that the software is evolving at a *much* slower pace
(than hardware)...
Voice recognition?
"Artificial Intelligence"?
User (human) interface?
etc... :-(
--
Jorge Chamorro Bieling