From: mdj on

Michael J. Mahon wrote:

> I've found any card with a 6522 a great source of programmable timers
> and interrupts. It's pretty easy to hook up the IRQ line if the card
> hasn't done so.
>
> Just running two counters as a 1MHz counter provides a great running
> cycle counter for sampling at interesting places.

That's true. Alas there are no 6522 cards in my Apple II parts box. I
should pick up one though.

> >>>It's good fun exploring these ideas on smaller environments that have
> >>>nice predictable behaviors, but you already know this :-)
> >>
> >>...and I *love* it! Of course, I love it even more when I fail to
> >>predict a behavior. ;-)
> >
> >
> > And it's very enlightening just how frequently this occurs, even on
> > machines that are very humble. There is so much to be learned.
>
> Hear, hear.
>
> >>Yes, I always provided a way for processes to handle interrupts
> >>directed to them. (Most language designers hated the idea.)
> >
> >
> > It's hard enough getting OS designers to acknowledge the issue :-(
>
> Most OS designers think of application programmers as wimps,
> and the *first* set of wimps are the language folks. ;-(

This is another one of those stigmas I detest. Application programmers
would rightly call OS designers arrogant, and then feel righteous about
it when OS's don't cater for the needs of applications. Vicious cycle
of destruction.

Egoism in computing is (like any other field) very destructive.

> > Most language designers are fundamentally opposed to concurrency
> > concerns entering the language. This is a folly, and a real impediment
> > to having modern systems provide realtime scheduling :-(
>
> Yep--that's been my experience, too.
>
> In most languages, concurrency further weakens the already weak
> axiom system defined by type and scope rules. The fact that it
> adds tremendous power as it does so is apparently lost on them.

There are a lot of old axioms that will, by necessity, move aside for
progress. It's very unfortunate that it will take so long.
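
To make that concrete: here's a minimal Java sketch (RaceDemo is just a
made-up illustration, not anything from a real library) of how two
threads can break an invariant without violating a single type or scope
rule:

// Illustrative sketch only: both threads respect the type and scope
// rules perfectly, yet the unsynchronized increment makes the final
// count unpredictable.
public class RaceDemo {
    static int count = 0;  // well-typed, properly scoped... and unsafe

    public static void main(String[] args) throws InterruptedException {
        Runnable work = new Runnable() {
            public void run() {
                for (int i = 0; i < 100000; i++) {
                    count++;  // read-modify-write, not atomic
                }
            }
        };
        Thread a = new Thread(work);
        Thread b = new Thread(work);
        a.start(); b.start();
        a.join(); b.join();
        // Rarely 200000; the static guarantees say nothing about this.
        System.out.println("count = " + count);
    }
}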

> >>That is the very simple approach which I believe the encoders themselves
> >>should do after taking a "census" of the system they are running on.
> >
> >
> > Yeah - it's simple enough to get a poor man's version through a simple
> > wrapper script, but you're right encoders need to do this. Bring on
> > parallelism support in languages says I !
> >
> >
> >>Quantum computing evaluates all possible computations simultaneously,
> >>so the "selection" of the answer tends to be done at the end. ;-)
> >>
> >>NP-complete problems become solvable because a huge combinatoric
> >>space of possible solutions is explored "in superposition" in the
> >>time required to explore a single solution. Then you have to "read
> >>out" the final state(s) to discover the solution(s).
> >
> >
> > Parallelism provides a reduction in time required to compute many
> > combinatorial problems too.
>
> But only (at best) in proportion to the degree of parallelism.
>
> Quantum mechanics is using the whole universe to get the answer. ;-)

We do have, though, a great degree of parallelism to exploit. A large
organisation could have 5,000 PCs sitting mostly idle on a very fast
LAN.
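
Here's a rough sketch of the "census" idea quoted above, in Java: ask
the runtime how many processors are available and size a worker pool to
match. encodeChunk() is just a hypothetical stand-in for one unit of
encoder work:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CensusEncoder {
    // Hypothetical placeholder for one unit of encoder work.
    static String encodeChunk(int chunk) {
        return "chunk " + chunk + " encoded";
    }

    public static void main(String[] args) throws Exception {
        // The "census": how much parallelism does this box actually have?
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        List<Future<String>> results = new ArrayList<Future<String>>();
        for (int i = 0; i < 16; i++) {
            final int chunk = i;
            results.add(pool.submit(new Callable<String>() {
                public String call() { return encodeChunk(chunk); }
            }));
        }
        for (Future<String> f : results) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }
}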

> > I think it has dawned on them the stock price is falling, but beyond
> > that, the actual measures of such failure tend to evade being
> > addressed. :-(
>
> I think it's more a problem of not knowing what to do. The "machine"
> of the semiconductor industry has been tuned up to the rhythm of Moore's
> "Law"--actually a simple economic prediction that if you can increase
> the number of transistors on a chip fast enough, you can create enough
> business to finance the work required to maintain the density increases.
>
> Now that physics is interfering with the rate of improvement, and making
> the transistors less ideal, business is down, and what was a virtuous
> cycle is turning vicious.
>
> Like the proverbial frog, the industry is taking a long time to figure
> out that their business model needs to change fundamentally if they are
> to survive. And that includes the PC market, where the average life
> of a system has increased from about 2.5 years to about 4 years, causing
> a massive decrease in the effective size of the (saturated) market.
>
> We're now in a *serious* buyers' market--with the real price of
> computers dropping like a rock as their effective performance has
> effectively stagnated.
>
> Now that the stage is set, it will be interesting to see what happens!

Agreed. Clinging to well-proven but no longer applicable approaches is
certainly not the way to go.

> > Indeed the things I'd like to change are fairly little things too, yet
> > the degree of defiance that one faces when suggesting it is staggering
> > :-)
>
> So if you're going to make a change, make it a *big* one--it won't get
> any worse reaction than a small one, and maybe *less*! It's certainly
> a better average return on your investment. ;-)

Hmm, the magnitude of a change ultimately affects its rate of adoption.
Java showed that being mostly C/C++-like made it more attractive, as it
reduced the learning curve. Take something like Ruby, which has been
around for over 10 years yet has only very recently gained visibility.
It's a markedly different language from those it's appropriate to
replace, and the learning curve is thus much steeper.

> Nah--the only thing we learn from history is that no one ever learns
> anything from history. ;-) (And the Barber of Seville has a beard. ;-)

It's about time we started; it's replete with patterns.

Matt

From: mdj on

Michael J. Mahon wrote:

> I expect that strictfp will become the most commonly unimplemented
> modifier in the language. ;-)

:-) Probably. It was needed for completeness, but virtually no code
would depend on its behaviour.
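
For anyone who hasn't run into it: strictfp just forces intermediate
float/double results to stay in the standard IEEE 754 value sets,
rather than letting the VM use wider native formats. A trivial sketch
(FpDemo is a made-up name):

// Made-up demo class. With strictfp, intermediates are rounded as
// ordinary doubles on every platform, so the method returns
// bit-identical results everywhere.
public strictfp class FpDemo {
    public static double sumOfSquares(double x, double y) {
        return x * x + y * y;
    }
}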

> > What remains to be done is an extension to handle richer numerics, like
> > complex numbers.
>
> Ah, yes--and simple (non-precedence changing) operator overloading.

What we might get instead is language-level support for complex
numerics, but it's hard to say. Working groups within the JCP come and
go, and rarely get much further than stating this to be a problem. Most
proposed solutions end up sounding worse than not having it at all (one
of the reasons Gosling didn't want it in there).
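
In the meantime you end up writing something like the sketch below: a
hypothetical Complex class (not a JDK type), where the arithmetic has
to be spelled out as method calls instead of operators:

// Hypothetical immutable complex number; not part of the JDK.
public final class Complex {
    public final double re, im;

    public Complex(double re, double im) { this.re = re; this.im = im; }

    public Complex add(Complex o) {
        return new Complex(re + o.re, im + o.im);
    }

    public Complex times(Complex o) {
        return new Complex(re * o.re - im * o.im, re * o.im + im * o.re);
    }

    public static void main(String[] args) {
        Complex a = new Complex(1, 2), b = new Complex(3, -1);
        // What you'd like to write: a + a * b
        Complex c = a.add(a.times(b));
        System.out.println(c.re + " + " + c.im + "i");
    }
}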

Matt

From: Michael J. Mahon on
mdj wrote:
> Michael J. Mahon wrote:
>
>
>>I've found any card with a 6522 a great source of programmable timers
>>and interrupts. It's pretty easy to hook up the IRQ line if the card
>>hasn't done so.
>>
>>Just running two counters as a 1MHz counter provides a great running
>>cycle counter for sampling at interesting places.
>
>
> That's true. Alas there are no 6522 cards in my Apple II parts box. I
> should pick up one though.

One still being made (!) is the new Mockingboard clone card.

>>>>>It's good fun exploring these ideas on smaller environments that have
>>>>>nice predictable behaviors, but you already know this :-)
>>>>
>>>>...and I *love* it! Of course, I love it even more when I fail to
>>>>predict a behavior. ;-)
>>>
>>>
>>>And it's very enlightening just how frequently this occurs, even on
>>>machines that are very humble. There is so much to be learned.
>>
>>Hear, hear.
>>
>>
>>>>Yes, I always provided a way for processes to handle interrupts
>>>>directed to them. (Most language designers hated the idea.)
>>>
>>>
>>>It's hard enough getting OS designers to acknowledge the issue :-(
>>
>>Most OS designers think of application programmers as wimps,
>>and the *first* set of wimps are the language folks. ;-(
>
>
> This is another one of those stigmas I detest. Application programmers
> would rightly call OS designers arrogant, and then feel righteous about
> it when OS's don't cater for the needs of applications. Vicious cycle
> of destruction.
>
> Egoism in computing is (like any other field) very destructive.

The basis of much of it is ignorance of the challenges faced by the
"other" group.

I have found that the best chance for building bridges is by developing
cross-discipline competency, so that mutual respect can work its magic.
It then becomes possible to form teams with members from the different
disciplines and, with some serious work, get them truly functioning as
creative teams. This has the advantage that the team members are well
prepared to "return home" and sell the benefits of their proposals in
the "home language".

>>>Most language designers are fundamentally opposed to concurrency
>>>concerns entering the language. This is a folly, and a real impediment
>>>to having modern systems provide realtime scheduling :-(
>>
>>Yep--that's been my experience, too.
>>
>>In most languages, concurrency further weakens the already weak
>>axiom system defined by type and scope rules. The fact that it
>>adds tremendous power as it does so is apparently lost on them.
>
>
> There are a lot of old axioms that will, by necessity, move aside for
> progress. It's very unfortunate that it will take so long.
>
>
>>>>That is the very simple approach which I believe the encoders themselves
>>>>should do after taking a "census" of the system they are running on.
>>>
>>>
>>>Yeah - it's simple enough to get a poor man's version through a simple
>>>wrapper script, but you're right encoders need to do this. Bring on
>>>parallelism support in languages says I !
>>>
>>>
>>>
>>>>Quantum computing evaluates all possible computations simultaneously,
>>>>so the "selection" of the answer tends to be done at the end. ;-)
>>>>
>>>>NP-complete problems become solvable because a huge combinatoric
>>>>space of possible solutions is explored "in superposition" in the
>>>>time required to explore a single solution. Then you have to "read
>>>>out" the final state(s) to discover the solution(s).
>>>
>>>
>>>Parallelism provides a reduction in time required to compute many
>>>combinatorial problems too.
>>
>>But only (at best) in proportion to the degree of parallelism.
>>
>>Quantum mechanics is using the whole universe to get the answer. ;-)
>
>
> We do have, though, a great degree of parallelism to exploit. A large
> organisation could have 5,000 PCs sitting mostly idle on a very fast
> LAN.

Absolutely. I had the pleasure of working in an organization that
"got it" in the early 1990s, and did IC design tasks across floor-
fulls of available workstations.

However, the methods of thought that lead to effective use of these
high degrees of parallelism are not much related to quantum computing,
but are the product of systematic, recursive application of rules for
dividing computations into relatively independent parts.
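
For the shared-memory case, that recursive division maps pretty
directly onto code. A minimal Java sketch (ParallelSum is a made-up
example, not a library class): split the range in half down to a depth
limit, sum the halves in parallel, and combine the results.

// Illustrative sketch, not a library class.
public class ParallelSum extends Thread {
    private final long[] data;
    private final int lo, hi, depth;
    private long result;

    public ParallelSum(long[] data, int lo, int hi, int depth) {
        this.data = data; this.lo = lo; this.hi = hi; this.depth = depth;
    }

    public void run() {
        if (depth == 0 || hi - lo < 10000) {
            // Small enough (or deep enough): just do the work directly.
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += data[i];
            result = sum;
            return;
        }
        // Otherwise divide into two relatively independent halves.
        int mid = (lo + hi) >>> 1;
        ParallelSum left  = new ParallelSum(data, lo, mid, depth - 1);
        ParallelSum right = new ParallelSum(data, mid, hi, depth - 1);
        left.start();
        right.start();
        try {
            left.join();
            right.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        result = left.result + right.result;
    }

    public static void main(String[] args) throws InterruptedException {
        long[] data = new long[1000000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        ParallelSum root = new ParallelSum(data, 0, data.length, 3);
        root.start();
        root.join();
        System.out.println("total = " + root.result);
    }
}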


>>>Indeed the things I'd like to change are fairly little things too, yet
>>>the degree of defiance that one faces when suggesting it is staggering
>>>:-)
>>
>>So if you're going to make a change, make it a *big* one--it won't get
>>any worse reaction than a small one, and maybe *less*! It's certainly
>>a better average return on your investment. ;-)
>
>
> Hmm, the magnitude of a change ultimately affects its rate of adoption.
> Java showed that being mostly C/C++-like made it more attractive, as it
> reduced the learning curve. Take something like Ruby, which has been
> around for over 10 years yet has only very recently gained visibility.
> It's a markedly different language from those it's appropriate to
> replace, and the learning curve is thus much steeper.

You don't put enough weight on the level of marketing effort expended.

Sun blitzed the entire computing world with rosy promises from day 1,
spending millions of dollars to promote a scheme that they designed
and felt they could profit from.

>>Nah--the only thing we learn from history is that no one ever learns
>>anything from history. ;-) (And the Barber of Seville has a beard. ;-)
>
>
> It's about time we started; it's replete with patterns.

Yes, like the pattern about "What happened when someone else did X
won't happen to me, because I'm smarter." We learn that one by the
time we're 3, and seldom un-learn it.

-michael

Parallel computing for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."