From: nmm1 on
In article <db0caa7f-6e7f-4fe2-8f99-8e5cb0edf075(a)v37g2000vbb.googlegroups.com>,
Michael S <already5chosen(a)yahoo.com> wrote:
>
>Nick, SCC and Larrabee are different species. Both have plenty of
>relatively simple x86 cores on a single chip, but that's about the only
>thing they have in common.
>
>1. Larrabee cores are cache-coherent, SCC cores are not.
>2. Larrabee interconnects have ring topology, SCC is a mesh
>3. Larrabee cores are about vector performance (512-bit SIMD) and SMT
>(4 hardware threads per core). SCC cores are supposed to be stronger
>than Larrabee on scalar code and much, much weaker on vector code.
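
Point 3 is the easiest one to make concrete: a Larrabee core spends its
transistors on a 512-bit vector unit, so one instruction operates on 16
single-precision floats at a time, while an SCC-class core does the same
work one element per instruction. A minimal sketch in C, using AVX-512
intrinsics purely as a stand-in for Larrabee's actual 512-bit vector ISA
(LRBni), which differed in detail; the function names are illustrative
only:

    #include <immintrin.h>

    /* Scalar loop: one element per instruction, the kind of work an
     * SCC-class core would do. */
    void saxpy_scalar(float a, const float *x, float *y, int n)
    {
        for (int i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

    /* 512-bit SIMD loop: 16 floats per instruction. */
    void saxpy_512(float a, const float *x, float *y, int n)
    {
        __m512 va = _mm512_set1_ps(a);
        int i = 0;
        for (; i + 16 <= n; i += 16) {
            __m512 vx = _mm512_loadu_ps(x + i);
            __m512 vy = _mm512_loadu_ps(y + i);
            _mm512_storeu_ps(y + i, _mm512_fmadd_ps(va, vx, vy));
        }
        for (; i < n; i++)              /* scalar remainder */
            y[i] = a * x[i] + y[i];
    }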

Thanks for the correction.

I have been fully occupied with other matters, and so seem to have
missed some developments. Do you have a pointer to any technical
information?

>4. Larrabee was originally intended for consumers, both as a high-end
>3D graphics engine and as a sort-of GPGPU. Graphics as a target for the
>1st-generation chip is canceled, but it is still possible that it will
>be shipped to paying customers as a GPGPU. SCC, on the other hand, is
>purely experimental.

Now, there I beg to disagree. I have never seen anything reliable
indicating that Larrabee has ever been intended for consumers,
EXCEPT as a 'black-box' GPU programmed by 'Intel partners'. And
some of that information came from semi-authoritative sources in
Intel. Do you have a reference to a conflicting statement from
someone in Intel?


Regards,
Nick Maclaren.
From: "Andy "Krazy" Glew" on
Mayan Moudgill wrote:
> All I've come across is the announcement that Larrabee has been delayed,
> with the initial consumer version cancelled. Anyone know something more
> substantive?

I can guess.

Part of my guess is that this is related to Pat Gelsinger's departure.
Gelsinger was (a) ambitious, intent on becoming Intel CEO (said so in
his book), (b) publicly very much behind Larrabee.

I'm guessing that Gelsinger was trying to ride Larrabee as his ticket to
the next level of executive power. And when Larrabee did not pan out
as well as he might have liked, he left. And/or conversely: when
Gelsinger left, Larrabee lost its biggest executive proponent. Although
my guess is that it was technology wagging the executive career tail: no
amount of executive positioning can make a technology shippable when it
isn't ready.

However, I would not count Larrabee out yet. Hiccups happen.

Although I remain an advocate of GPU style coherent threading
microarchitectures - I think they are likely to be more power efficient
than simple MIMD, whether SMT/HT or MCMT - the pull of X86 will be
powerful. Eventually we will have X86 MIMD/SMT/HT in-order vs X86 MCMT.
Hetero is almost guaranteed. The only question will be hetero OOO/in-order,
or hetero X86 MCMT/GPU. Could be hetero X86 OOO & X86 with GPU-style
Coherent Threading. The latter could even be CT/OOO. But these "could
be"s have no sightings.
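
To make that distinction concrete: in a MIMD design (whether SMT/HT or
MCMT) every hardware thread fetches and executes its own instruction
stream and takes branches independently, while in a GPU-style coherent
threading (SIMT) design a batch of logical threads shares one
instruction stream and a divergent branch becomes a per-lane mask. A
rough sketch in C; the 8-lane batch width and the function names are
assumptions for illustration, not anything Intel has described:

    /* MIMD style: each thread/core runs this independently, following
     * whichever side of the branch its own data selects. */
    void mimd_thread_work(float *x)
    {
        if (*x > 0.0f)
            *x *= 2.0f;
        else
            *x = 0.0f;
    }

    /* Coherent-threading (SIMT) style: one instruction stream drives a
     * whole batch of lanes in lockstep; the branch turns into a mask
     * and both sides are stepped through, with inactive lanes idle. */
    #define LANES 8

    void simt_batch_work(float x[LANES])
    {
        int active[LANES];
        for (int i = 0; i < LANES; i++)   /* evaluate the predicate per lane */
            active[i] = (x[i] > 0.0f);
        for (int i = 0; i < LANES; i++)   /* "then" side, masked */
            if (active[i]) x[i] *= 2.0f;
        for (int i = 0; i < LANES; i++)   /* "else" side, masked */
            if (!active[i]) x[i] = 0.0f;
    }

The power-efficiency argument is that the SIMT batch amortizes one fetch
and decode over all of its lanes, at the cost of wasted issue slots when
the lanes diverge.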
From: "Andy "Krazy" Glew" on
nmm1(a)cam.ac.uk wrote:

> Now, there I beg to disagree. I have never seen anything reliable
> indicating that Larrabee has ever been intended for consumers,
> EXCEPT as a 'black-box' GPU programmed by 'Intel partners'. And
> some of that information came from semi-authoritative sources in
> Intel. Do you have a reference to a conflicting statement from
> someone in Intel?

http://software.intel.com/en-us/blogs/2008/08/11/siggraph-larrabee-and-the-future-of-computing/

Just a blog, not official, although of course anything blogged at Intel
is semi-blest (believe me, I know the flip side.)
From: Del Cecchi on

"Andy "Krazy" Glew" <ag-news(a)patten-glew.net> wrote in message
news:4B1D1685.7080307(a)patten-glew.net...
> nmm1(a)cam.ac.uk wrote:
>
>> Now, there I beg to disagree. I have never seen anything reliable
>> indicating that Larrabee has ever been intended for consumers,
>> EXCEPT as a 'black-box' GPU programmed by 'Intel partners'. And
>> some of that information came from semi-authoritative sources in
>> Intel. Do you have a reference to a conflicting statement from
>> someone in Intel?
>
> http://software.intel.com/en-us/blogs/2008/08/11/siggraph-larrabee-and-the-future-of-computing/
>
> Just a blog, not official, although of course anything blogged at
> Intel is semi-blest (believe me, I know the flip side.)

Does this mean Larrabee won't be the engine for the PS4?

We were assured that it was, not long ago.

del


From: Robert Myers on
On Dec 7, 9:51 am, "Andy "Krazy" Glew" <ag-n...(a)patten-glew.net>
wrote:
> n...(a)cam.ac.uk wrote:
> > Now, there I beg to disagree.  I have never seen anything reliable
> > indicating that Larrabee has ever been intended for consumers,
> > EXCEPT as a 'black-box' GPU programmed by 'Intel partners'.  And
> > some of that information came from semi-authoritative sources in
> > Intel.  Do you have a reference to a conflicting statement from
> > someone in Intel?
>
> http://software.intel.com/en-us/blogs/2008/08/11/siggraph-larrabee-and-the-future-of-computing/
>
> Just a blog, not official, although of course anything blogged at Intel
> is semi-blest (believe me, I know the flip side.)

The blog post reminded me. I have assumed, for years, that Intel
planned on putting many (>>4) x86 cores on a single die. I'm sure I
can find Intel presentations from the nineties that seem to make that
clear if I dig hard enough.

From the very beginning, Larrabee seemed to be a technology of destiny
in search of a mission, and the first, most obvious mission for any
kind of massive parallelism is graphics. Thus, Intel explaining why
it would introduce Larrabee at Siggraph always seemed a case of
offering an explanation where none should have been needed, unless
they weren't sure they believed the explanation themselves (or that
anyone else would). It just seemed like the least implausible mission
for hardware that had been designed to a concept rather than to a
mission. A more plausible claim that they were aiming at HPC probably
wouldn't have seemed like a very attractive business proposition for a
company the size of Intel.

Also from the beginning, I wondered if Intel seriously expected to be
able to compete at the high end with dedicated graphics engines using
x86 cores. Either there was something about the technology I was
missing completely, it was just another Intel bluff, or the "x86"
cores that ultimately appeared on a graphics chip for market would be
to an x86 as we know it as, say, a ladybug is to a dalmatian.

Robert.