From: Brett Davis on
Larrabee Dead
http://www.semiaccurate.com/2010/05/27/larrabee-alive-and-well/

>'Note: For what it's worth, the leading candidate at Sony right now
>is an internal design that several describe as "Emotion Engine based".
>Sony should have waited for Larrabee three.....'

Without a console win Larrabee is dead, no matter what they say.
It becomes a hobby project with little hope.

Avatar and the other 3D movies have changed the equation from back
when I posted that Larrabee won PS4.
Two years from now every new TV not in the bargain bin will do 3D.
The bandwidth demands of a second frame buffer to do 120Hz updates
likely makes things like ray tracing fall by the wayside.
Economics has dictated that the new console will not launch at prices
as high as the last generation's, which nearly killed the PS3.

A scaled up PS3 with more Cell processors can render polys to
tiles without breaking a sweat. Hits the 3D cost/performance
window, unlike Larrabee.

And of course I could be wrong again, this is all a SWAG.

Brett
From: Terje Mathisen "terje.mathisen at tmsw.no" on
Brett Davis wrote:
> Larrabee Dead
> http://www.semiaccurate.com/2010/05/27/larrabee-alive-and-well/
>
>> 'Note: For what it's worth, the leading candidate at Sony right now
>> is an internal design that several describe as "Emotion Engine based".
>> Sony should have waited for Larrabee three.....'
>
> Without a console win Larrabee is dead, no matter what they say.
> It becomes a hobby project with little hope.

Without a console win, LRB has lost the X million guaranteed production
run, which I agree is significant, but not a real showstopper since
those X million chips would probably generate close to zero real income.

I.e. Intel would have to price the chips at a point where they gave
compelling cost advantages. Having a single chip which could do both the
core game logic and all the 3D stuff at the same time still seems like a
very good idea to me for a mass-market product built at rock-bottom
production cost.
>
> Avatar and the other 3D movies have changed the equation from back
> when I posted that Larrabee won PS4.

Not at all.

> Two years from now every new TV not in the bargain bin will do 3D.
> The bandwidth demands of a second frame buffer to do 120Hz updates
> likely makes things like ray tracing fall by the wayside.

A factor of two is the smallest factor you even notice in this business.

> Economics has dictated that the new console will not launch at prices
> as high as the last generation's, which nearly killed the PS3.
>
> A scaled up PS3 with more Cell processors can render polys to
> tiles without breaking a sweat. Hits the 3D cost/performance
> window, unlike Larrabee.

The main advantage to this is of course that it is almost completely
known territory, with much smaller risks.

The original Cell/PS3 design was in fact more of a leap into the unknown
than LRB would have been.

Terje

PS. Those of my friends who have worked on LRB for some years now, are
all still doing it. :-)

--
- <Terje.Mathisen at tmsw.no>
"almost all programming can be viewed as an exercise in caching"
From: Brett Davis on
In article <ei24d7-85o.ln1(a)ntp.tmsw.no>,
Terje Mathisen <"terje.mathisen at tmsw.no"> wrote:

> Brett Davis wrote:
> > Larrabee Dead
> > http://www.semiaccurate.com/2010/05/27/larrabee-alive-and-well/
> >
> >> 'Note: For what it's worth, the leading candidate at Sony right now
> >> is an internal design that several describe as "Emotion Engine based".
> >> Sony should have waited for Larrabee three.....'
> >
> > Without a console win Larrabee is dead, no matter what they say.
> > It becomes a hobby project with little hope.
>
> Without a console win, LRB has lost the X million guaranteed production
> run, which I agree is significant, but not a real showstopper since
> those X million chips would probably generate close to zero real income.
>
> > Two years from now every new TV not in the bargain bin will do 3D.
> > The bandwidth demands of a second frame buffer to do 120Hz updates
> > likely makes things like ray tracing fall by the wayside.
>
> A factor of two is the smallest factor you even notice in this business.

Each of the PlayStation generations was ten times faster, including ten
times more memory bandwidth.

The PS3 has two memory buses, XDR and GDDR3.

The PS4 will have roughly twice the bandwidth; cost and heat have put up
a brick wall for external RAM.

Today's graphics chips are now largely limited in performance by bandwidth.
We are facing a temporary end of scaling, and that pathetic 2x increase
might be largely eaten up by the double frame buffer for real 3D.
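
Back-of-the-envelope, with numbers I am assuming purely for illustration
(720p at 60 Hz for the current generation versus stereo 1080p at 120 Hz,
32-bit color, one color write per displayed pixel):

#include <stdio.h>

/* Raw color-write traffic for a frame buffer: width * height * bytes per
   pixel * refresh rate. Ignores Z, overdraw, texture fetches and scan-out
   reads, all of which also roughly double for the second eye. */
static double fb_write_bw(double w, double h, double bpp, double hz)
{
    return w * h * bpp * hz;    /* bytes per second */
}

int main(void)
{
    double ps3_era = fb_write_bw(1280.0,  720.0, 4.0,  60.0);
    double stereo  = fb_write_bw(1920.0, 1080.0, 4.0, 120.0);

    printf("720p  @  60 Hz: %.2f GB/s\n", ps3_era / 1e9);
    printf("1080p @ 120 Hz: %.2f GB/s\n", stereo  / 1e9);
    printf("ratio         : %.1fx\n",     stereo  / ps3_era);
    return 0;
}

The color writes alone grow about 4.5x under those assumptions, and the
rest of the rendering traffic scales with them, which is why a 2x bus
upgrade gets eaten fast.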

Yes, the CPU will be 10 times faster, but aside from 3D what you see will
not be much better than today. Will the user pay $600 for a PS4 when the
PS3 costs $200 and looks the same on the 2D TV the user may still have?

Bear in mind that the Wii_HD-3D will cost $250, with the bare minimum
hardware to meet the user's needs, just like the original Wii.

For Sony the phrase is "We live in interesting times."

Brett


RRAM will get us an easy 10 times bandwidth speedup, but that is looking
like a PS5 issue.


I am betting against an embedded frame buffer, as it would have to be
HUGE and costly. Tiling looks like the way forward: because the parallax
difference between the two eyes is small, it makes sense to render the
same tile for both screens at once, to benefit from nearly identical
texture fetches, nearly identical verts, etc. That is the best use of
precious bandwidth. Cell processors are a good match for rendering tiles
in compute.
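
A minimal sketch of what I mean by rendering the same tile for both eyes
back to back; everything here (Tile, Camera, bin_geometry_for_frame and
friends) is a made-up placeholder, not any real API:

typedef struct Tile Tile;
typedef struct Camera Camera;

extern void  bin_geometry_for_frame(void);     /* verts shared by both eyes  */
extern int   tile_count(void);
extern Tile *get_tile(int index);
extern void  shade_tile(Tile *t, const Camera *eye);
extern void  resolve_tile(Tile *t, int eye);   /* write to that eye's buffer */

void render_stereo_frame(const Camera *left, const Camera *right)
{
    bin_geometry_for_frame();      /* one binning pass; the small parallax
                                      barely changes which tiles a poly hits */
    for (int i = 0; i < tile_count(); i++) {
        Tile *t = get_tile(i);

        shade_tile(t, left);       /* texture fetches warm the cache...      */
        resolve_tile(t, 0);

        shade_tile(t, right);      /* ...and are mostly reused here          */
        resolve_tile(t, 1);
    }
}

The point is simply that the second eye rides on the first eye's memory
traffic instead of doubling it.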
From: Jeremy Linton on
On 5/29/2010 12:47 AM, Brett Davis wrote:
> Today's graphics chips are now largely limited in performance by bandwidth.
> We are facing a temporary end of scaling, and that pathetic 2x increase
> might be largely eaten up by the double frame buffer for real 3D.
But is that going to cause an image quality wall? I was under the
impression that tessellation was fairly localized and provided a nice
image boost without requiring a large increase in geometry/texture/etc
bandwidth.
My basic understanding is that there are still a fair number of other
"tricks" just awaiting more processing power.

I've seen the Nvidia 3D Vision demos, and from what I understand they
are just utilizing the card's ability to render frames at rates > 30 fps.
Tweak the camera angle every other frame and you have 3D. The game
detail doesn't seem to suffer because of the 3D.
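
Roughly what I picture it doing, with made-up types (Vec3, Camera) and a
made-up render_scene(); the real driver surely does something fancier with
the projection, but the idea is just an eye offset alternating per frame:

typedef struct { float x, y, z; } Vec3;
typedef struct { Vec3 position; Vec3 right; /* unit right vector */ } Camera;

extern void render_scene(const Camera *cam);

void render_alternate_frame_stereo(const Camera *center,
                                   float eye_separation,
                                   unsigned long frame)
{
    /* even frames: half the separation to the left; odd frames: to the right */
    float side = (frame & 1) ? 0.5f : -0.5f;

    Camera eye = *center;
    eye.position.x += center->right.x * eye_separation * side;
    eye.position.y += center->right.y * eye_separation * side;
    eye.position.z += center->right.z * eye_separation * side;

    render_scene(&eye);    /* the shutter glasses pick out the matching frames */
}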