From: nmm1 on
In article <4B1D1685.7080307(a)patten-glew.net>,
Andy "Krazy" Glew <ag-news(a)patten-glew.net> wrote:
>nmm1(a)cam.ac.uk wrote:
>
>> Now, there I beg to disagree. I have never seen anything reliable
>> indicating that Larrabee has ever been intended for consumers,
>> EXCEPT as a 'black-box' GPU programmed by 'Intel partners'. And
>> some of that information came from semi-authoritative sources in
>Intel. Do you have a reference to a conflicting statement from
>someone in Intel?
>
>http://software.intel.com/en-us/blogs/2008/08/11/siggraph-larrabee-and-the-future-of-computing/
>
>Just a blog, not official, although of course anything blogged at Intel
>is semi-blest (believe me, I know the flip side.)

I don't see anything there that even hints at plans to make
Larrabee available for consumer use. It could just as well be a
probe to test consumer interest - something that even I do!


Regards,
Nick Maclaren.
From: nmm1 on
In article <b81b8239-b43c-46e9-9eea-4da2a73493a0(a)k13g2000prh.googlegroups.com>,
Robert Myers <rbmyersusa(a)gmail.com> wrote:
>
>The blog post reminded me. I have assumed, for years, that Intel
>planned on putting many (>>4) x86 cores on a single-die. I'm sure I
>can find Intel presentations from the nineties that seem to make that
>clear if I dig hard enough.

Yes. But the word "planned" implies a degree of deliberate action
that I believe was absent. They assuredly blithered on about it,
and very probably had meetings about it ....

>From the very beginning, Larrabee seemed to be a technology of destiny
>in search of a mission, and the first, most obvious mission for any
>kind of massive parallelism is graphics. ...

Yes. But what they didn't seem to understand is that they should
have treated it as an experiment. I tried to persuade them that
they needed to make it widely available and cheap, so that the mad
hackers would start to play with it, and see what developed.
Perhaps nothing, but it wouldn't have been Intel's effort that was
wasted.

The same was true of Sun, but they had less margin for selling CPUs
at marginal cost.


Regards,
Nick Maclaren.
From: "Andy "Krazy" Glew" on
Del Cecchi wrote:
> "Andy "Krazy" Glew" <ag-news(a)patten-glew.net> wrote in message news:4B1D1685.7080307(a)patten-glew.net...
>> nmm1(a)cam.ac.uk wrote:
>>
>>> I have never seen anything reliable
>>> indicating that Larrabee has ever been intended for consumers,
>>
>> http://software.intel.com/en-us/blogs/2008/08/11/siggraph-larrabee-and-the-future-of-computing/
>>
>> Just a blog, not official, although of course anything blogged at
>> Intel is semi-blest (believe me, I know the flip side.)
>
> Does this mean Larrabee won't be the engine for the PS4?
>
> We were assured that it was not long ago.

My guess is that Intel was pushing for Larrabee to be the PS4 chip.

And, possibly, Sony agreed. Not unreasonably, if Intel had made a
consumer grade Larrabee. Since Larrabee's big pitch is programmability
- cache coherence, MIMD, vectors, familiar stuff. As opposed to the
Cell's idiosyncrasies and programmer hostility, which are probably in
large part to blame for Sony's lack of success with the PS3.

Given the present Larrabee situation, Sony is probably scrambling. Options:

a) go back to Cell.

b) more likely, eke out a year or so with Cell and a PS4 stretch, and
then look around again - possibly at the next Larrabee

c) AMD/ATI Fusion

d) Nvidia? Possibly with the CPU that Nvidia is widely rumored to be
working on.

AMD/ATI and Nvidia might seem the most reasonable, except that both
companies have had trouble delivering. AMD/ATI look best now, but
Nvidia has more "vision". Whatever good that will do them.

Larrabee's attractions remain valid. It is more programmer friendly.
But waiting until Larrabee is ready may be too painful.

Historically, game consoles have a longer lifetime than PCs. They were
programmed closer to the metal, and hence needed stability in order to
warrant software investment.

But DX10-DX11 and OpenGL are *almost* good enough for games, and they
allow migrating more frequently to the latest and greatest hardware.

Blue-sky possibility: the PS3-PS4 transition breaks with the tradition
of console stability. The console might stay stable in form factor, UI,
and devices - screen pixels, joysticks, etc. - but start changing the
underlying compute and graphics engine more quickly than in the past.

Related: net games.



From: Torben Ægidius Mogensen on
"Andy "Krazy" Glew" <ag-news(a)patten-glew.net> writes:

> Although I remain an advocate of GPU style coherent threading
> microarchitectures - I think they are likely to be more power
> efficient than simple MIMD, whether SMT/HT or MCMT - the pull of X86
> will be powerful.

The main (only?) advantage of the x86 ISA is for running legacy software
(yes, I do consider Windows to be legacy software). And I don't see
this applying for Larrabee -- you can't exploit the parallelism when you
run dusty decks.

When developing new software, you want to use high-level languages and
don't really care too much about the underlying instruction set -- the
programming model you have to use (i.e., shared memory versus message
passing, SIMD vs. MIMD, etc.) is much more important, and that is
largely independent of the ISA.
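That independence from the ISA can be made concrete with a small sketch (Python chosen purely for illustration; the function names are hypothetical): the kernel is written once against a high-level data-parallel model, and only the executor - serial or a MIMD-style pool - changes, never anything ISA-specific.

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy(args):
    # The "kernel": a * x + y on one element. Nothing here refers
    # to the underlying ISA - x86, ARM, or a Larrabee-style core.
    a, x, y = args
    return a * x + y

def run(executor_factory, a, xs, ys):
    # Map the kernel over the data with whatever executor the
    # runtime provides; the kernel itself is untouched.
    work = [(a, x, y) for x, y in zip(xs, ys)]
    if executor_factory is None:
        return [saxpy(w) for w in work]      # plain serial
    with executor_factory() as ex:
        return list(ex.map(saxpy, work))     # MIMD-style pool

if __name__ == "__main__":
    xs, ys = [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]
    serial = run(None, 2.0, xs, ys)
    threaded = run(ThreadPoolExecutor, 2.0, xs, ys)
    assert serial == threaded == [12.0, 24.0, 36.0]
```

The same source would compile and run unchanged on any ISA; whether the pool maps to SMT threads, many simple cores, or one core is a runtime detail.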

Torben
From: nmm1 on
In article <4B1DD041.7090009(a)patten-glew.net>,
Andy "Krazy" Glew <ag-news(a)patten-glew.net> wrote:
>Del Cecchi wrote:
>>
>> Does this mean Larrabee won't be the engine for the PS4?
>>
>> We were assured that it was not long ago.
>
>My guess is that Intel was pushing for Larrabee to be the PS4 chip.
>
>And, possibly, Sony agreed. Not unreasonably, if Intel had made a
>consumer grade Larrabee. Since Larrabee's big pitch is programmability
>- cache coherence, MIMD, vectors, familiar stuff. As opposed to the
>Cell's idiosyncrasies and programmer hostility, which are probably in
>large part to blame for Sony's lack of success with the PS3.

Could be. That would be especially relevant if Sony were planning
to break out of the 'pure' games market and produce a 'home
entertainment centre'. Larrabee's pitch implied that it would have
been simple to add general Internet access, probably including VoIP,
and quite possibly online ordering, Email etc. We know that some of
the marketing organisations are salivating at the prospect of being
able to integrate games playing, television and online ordering.

I am pretty sure that both Sun and Intel decided against the end-user
market because they correctly deduced that it would not return a
profit but, in my opinion incorrectly, did not think that it might
open up new opportunities. But why Intel seem to have decided
against the use described above is a mystery - perhaps because, like
Motorola with the 88000 as a desktop chip, every potential partner
backed off. And perhaps for some other reason - or perhaps the
rumour of its demise is exaggerated - I don't know.

I heard some interesting reports about the 48-thread CPU yesterday,
incidentally. It's not clear that it's any more focussed than Larrabee.


Regards,
Nick Maclaren.