From: Terje Mathisen "terje.mathisen at on
Robert Myers wrote:
> My if-it-worked-more-than-once-it-will-probably-work-again machinery
> thinks it sees a pattern:
>
> 1. Anticipation is central to (almost) all cognitive and computational
> processes.

Anticipation, afaik, is just another name for prediction, right?

Pretty much all current computers (with some very low-end exceptions)
employ at least some form of this, in the form of branch prediction,
static or dynamic.
>
> 2. Anticipation cannot be anticipated. It must be learned.

Experience has shown us that dynamic (i.e. learning) predictors perform
far better than static compiler hints. :-)
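
As a minimal sketch of why, assuming nothing fancier than the textbook
two-bit saturating counter (illustrative only, not any particular
CPU's predictor), compare it against a fixed hint on a branch that
changes its behaviour part-way through:

#include <cstdio>

// Textbook two-bit saturating counter: states 0-1 predict not-taken,
// states 2-3 predict taken.  A static hint is just a constant.
struct TwoBitCounter {
    unsigned state = 2;                    // start weakly "taken"
    bool predict() const { return state >= 2; }
    void update(bool taken) {              // learn from the real outcome
        if (taken  && state < 3) ++state;
        if (!taken && state > 0) --state;
    }
};

int main() {
    TwoBitCounter dyn;
    const bool static_hint = true;         // "usually taken", fixed for good

    // A branch that is taken for a while and then flips direction.
    bool history[] = {true, true, true, true, false, false, false, false};
    int dyn_hits = 0, static_hits = 0;
    for (bool taken : history) {
        dyn_hits    += (dyn.predict() == taken);
        static_hits += (static_hint   == taken);
        dyn.update(taken);                 // the hint never updates
    }
    std::printf("dynamic %d/8, static %d/8\n", dyn_hits, static_hits);
}

The counter pays a couple of mispredictions while it re-learns, but it
follows the branch; the hint can only ever be right about one phase.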

> 3. The agenda of computer architecture, which is far from dead, is to
> keep working the if-it-worked-more-than-once-it-will-probably-work-again
> angle harder and harder. That is, after all, practically all the human
> central nervous system knows how to do (I think).
>
> If all of this seems patently obvious, feel free to say so. It has only
> slowly dawned on me.

:-)

Terje

--
- <Terje.Mathisen at tmsw.no>
"almost all programming can be viewed as an exercise in caching"
From: Robert Myers on
Terje Mathisen wrote:
> Robert Myers wrote:
>> My if-it-worked-more-than-once-it-will-probably-work-again machinery
>> thinks it sees a pattern:
>>
>> 1. Anticipation is central to (almost) all cognitive and computational
>> processes.
>
> Anticipation, afaik, is just another name for prediction, right?
>

Just to clarify: you can predict and then fail to take the right
anticipatory action, or choose not to act at all as a matter of
policy. In anticipation I include not only prediction, but taking
whatever actions seem promising based on the prediction, which could
include even further prediction and more anticipatory action.
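
A toy way to put the distinction (purely illustrative, the names are
mine): prediction just produces the guess; anticipation spends
resources on it, and can keep going from the guessed state.

#include <cstdint>
#include <cstdio>

// predict: produce a guess and do nothing else.
uint64_t predict_next(uint64_t last_addr, int64_t stride) {
    return last_addr + stride;
}

// anticipate: act on the guess (here, "issue a prefetch"), then predict
// again from the anticipated state.  Policy decides how far to go.
void anticipate(uint64_t addr, int64_t stride, int depth) {
    for (int i = 0; i < depth; ++i) {
        addr = predict_next(addr, stride);                          // the prediction
        std::printf("prefetch 0x%llx\n", (unsigned long long)addr); // the action
    }
}

int main() {
    anticipate(0x1000, 64, 4);   // guess four lines ahead and act on each guess
}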

> Pretty much all current computers (with some very low-end exceptions)
> employ at least some form of this, in the form of branch prediction,
> static or dynamic.

It's startling to me how well some fairly simple schemes work.

>>
>> 2. Anticipation cannot be anticipated. It must be learned.
>
> Experience has shown us that dynamic (i.e. learning) predictors perform
> far better than static compiler hints. :-)
>

Was obvious to all right along, no? ;-)

Robert.
From: nmm1 on
In article <5R4Yn.6004$cO.5854(a)newsfe09.iad>,
Robert Myers <rbmyersusa(a)gmail.com> wrote:
>Terje Mathisen wrote:
>
>>> 2. Anticipation cannot be anticipated. It must be learned.
>>
>> Experience has shown us that dynamic (i.e. learning) predictors perform
>> far better than static compiler hints. :-)
>
>Was obvious to all right along, no? ;-)

Well, yes and no. I don't actually think that it is right, as a
universal rule.

The point is that it has been known since time immemorial that
highly disciplined languages enable much better static prediction.
It always was obvious that static prediction was a clear loser in
the sort of C/C++/etc. spaghetti that dominates so many areas at
present. What wasn't clear (to me, at least) is whether dynamic
prediction would be enough better to be worth the extra complexity.
Well, the answer is that it is ....

Also, all of the current approaches to static prediction that I
know of seem to have been designed by hardware people living in
an ivory tower that hasn't had any contact with the programmers
since about 1960. In particular, using the program locality as
the primary key is obviously inane, once you start using an
'object-oriented' approach.
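
To make that concrete, here is my own toy example (not from any real
compiler or predictor): the virtual call below is a single location in
the program text, so anything keyed purely on program locality has one
slot for behaviour that actually depends on which objects happen to be
in the container.

#include <memory>
#include <vector>

// One call site, many behaviours: the dispatch below is one static
// location, but its target (and the branches inside each override)
// depends on the dynamic type of each element.  A predictor keyed only
// on the code address has nothing stable to key on.
struct Shape { virtual double area() const = 0; virtual ~Shape() = default; };
struct Circle : Shape { double r; explicit Circle(double r) : r(r) {}
                        double area() const override { return 3.14159 * r * r; } };
struct Square : Shape { double s; explicit Square(double s) : s(s) {}
                        double area() const override { return s * s; } };

double total_area(const std::vector<std::unique_ptr<Shape>>& shapes) {
    double total = 0.0;
    for (const auto& p : shapes)
        total += p->area();   // one PC, a different target per object type
    return total;
}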

I think that we could do a lot better if we approached static
prediction more intelligently, even with current programs, and
potentially better even than current dynamic prediction with
improved programming paradigms.


Regards,
Nick Maclaren.
From: Andy 'Krazy' Glew on
On 7/4/2010 10:02 AM, Robert Myers wrote:

>
> 1. Anticipation is central to (almost) all cognitive and computational
> processes.
> ...
> 3. The agenda of computer architecture, which is far from dead, is to
> keep working the if-it-worked-more-than-once-it-will-probably-work-again
> angle harder and harder. That is, after all, practically all the human
> central nervous system knows how to do (I think).

I don't think this is the only agenda item. My own contributions to
computer architecture have mostly not been predictors; they have been
mechanisms, usually mechanisms that work in the absence of predictors,
sometimes in competition with predictors, but best when in combination.

I think that the need for such mechanisms is not over. At least I hope
not. But I think that the money is always on the predictor guys.

E.g. my work in out-of-order execution and renaming: I remember being
*annoyed* when branch predictors became common, since I saw that as a
distraction. I was younger then.

E.g. large-window out-of-order execution tends to compete against
prefetchers. Simple prefetchers get much of the benefit of big
instruction windows. But not all - so in some ways prefetchers take you
down a dead-end road that you have to back out of, a bit, if you want
to make progress.
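
To illustrate the overlap, a sketch using GCC/Clang's
__builtin_prefetch as a software stand-in for what a simple next-line
or stride prefetcher does in hardware: for a streaming loop,
prefetching a fixed distance ahead recovers much of the miss overlap
that a large window would otherwise buy by keeping many loads in
flight.

#include <cstddef>
#include <vector>

// Streaming reduction with a fixed-distance prefetch.  A simple
// hardware stride prefetcher does essentially this on its own; a big
// out-of-order window gets similar overlap by holding many of these
// loads in flight at once.
double sum_with_prefetch(const std::vector<double>& a) {
    constexpr std::size_t DIST = 16;            // prefetch distance: a tuning guess
    double sum = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        if (i + DIST < a.size())
            __builtin_prefetch(&a[i + DIST]);   // GCC/Clang builtin; a hint only
        sum += a[i];
    }
    return sum;
}

Where the access pattern isn't a simple stream, the prefetcher stops
helping and the window has to earn its keep again.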


From: Robert Myers on
Andy 'Krazy' Glew wrote:

>
> I think that the need for such mechanisms is not over. At least I hope
> not. But I think that the money is always on the predictor guys.

Because of a decision-making bias you don't agree with?

> E.g. my work in out of order execution and renaming: I remember being
> *annoyed* when branch predictors became common, since I saw that as a
> distraction. I was younger then.
>
> E.g. large window out-of-order execution tends to compete against
> prefetchers.

For cache space? Memory bandwidth?

Robert.