From: Tim Bradshaw on
Pascal Bourguignon wrote:

>
> Of anything. Galaxies, planets, weather, ecosystems, animals, cells,
> nanobots, chemicals, particles, etc.

I think you're missing my point (which is, I admit, a bit subtle,
especially by the standards of cll in recent history).

I don't doubt that multicore processors will make it into desktops in a
big way. But they will make it into desktops for reasons which are not
what you might think, and in particular are substantially not due to
the computational demands of desktop applications.

So for instance: do a commercially significant proportion of desktop
users spend their time simulating ecosystems, the weather, galaxies
etc? I suggest that they do not, and they will not. (As to whether
multicore CPUs are a good approach to that kind of simulation: I
suspect they're not as good as you might think, but that's another
story.)

> Let me see, in my far out corner of the net, my ISP doubled my ADSL
> speed every year (without me asking anything even). In ten years, I
> should have here 2Gb/s of Internet bandwidth. I don't think 64 cores
> will be too many to handle that.
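
For what it's worth, the arithmetic behind that projection: doubling
every year for ten years is a factor of 2^10 = 1024, so it assumes a
line of roughly 2 Mb/s today (the starting speed isn't stated). A
quick sketch, with a purely illustrative helper:

  (defun projected-bandwidth (start-mbps years)
    "Speed in Mb/s after YEARS of annual doubling from START-MBPS."
    (* start-mbps (expt 2 years)))

  ;; (projected-bandwidth 2 10) => 2048, i.e. about 2 Gb/s;
  ;; the 2 Mb/s starting point is an assumption, not a quoted figure.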

I suspect it will top out long before that. It's amazing (really, it
is a deeply spectacular feat of engineering) what can be got over a
phone line, but there are limits. More to the point, latency is not
something you can make vanish.

> That's because rendering consumes all the CPU, so that nothing else gets
> done in games (but tricks).

All the interesting rendering happens in the video card, and I don't
see that changing any time soon. Video cards already often have
considerably more (specialised) computing power than general purpose
CPUs on serious gaming PCs. They cost more than the rest of the
machine too.

> They won't be anymore.

Sorry, they will. The memory wall is emphatically not going away.

--tim

From: Rob Thorpe on
Tim Bradshaw wrote:
> Pascal Bourguignon wrote:
> > (game) AI.
>
> I know nothing about games really, but I'd lay odds that the thing that
> consumes almost all the computational resource is rendering. See above.

The rendering does, but most of it is performed in the graphics
hardware.
Of the remaining runtime, a large proportion is consumed by game "AI".
I remember reading that one recent first-person shooter spent 50% of
its runtime on the AI controlling the movement of the enemies.

Whether this type of AI actually is AI, of course, varies from game to
game.

From: Tim Bradshaw on
Rob Thorpe wrote:

> The rendering does, but most of it is performed in the graphics
> hardware.

That was my point: the computationally demanding part of the
application *already* lives in special-purpose hardware, and will
continue to do so. And I suggest that as games become closer and
closer to being photorealistic, the rendering will consume vastly more
resources, almost none of which can usefully be provided by a general
purpose multicore CPU.

> Of the remaining runtime, a large proportion is consumed by game "AI".

Yes, and the issue is: how much more of that is needed? I suggest the
answer might be `much much less than is needed in the rendering'.

--tim

From: Pascal Bourguignon on
"Tim Bradshaw" <tfb+google(a)tfeb.org> writes:

> Pascal Bourguignon wrote:
>
>>
>> Of anything. Galaxies, planets, weather, ecosystems, animals, cells,
>> nanobots, chemicals, particles, etc.
>
> I think you're missing my point (which is, I admit, a bit subtle,
> especially by the standards of cll in recent history).
>
> I don't doubt that multicore processors will make it into desktops in a
> big way. But they will make it into desktops for reasons which are not
> what you might think, and in particular are substantially not due to
> the computational demands of desktop applications.
>
> So for instance: do a commercially significant proportion of desktop
> users spend their time simulating ecosystems, the weather, galaxies
> etc? I suggest that they do not, and they will not. (As to whether
> multicore CPUs are a good approach to that kind of simulation: I
> suspect they're not as good as you might think, but that's another
> story.)

Ah, well, if you want to discuss the real reason why they'll be put in
desktop PCs, it's clearly because they cannot increase the clock
frequency much more, so instead of being able to say "Hey, my PC has a
5GHz processor!", we must be able to say "Hey, my PC has 128 cores!".



Anyway, you leave us in suspense. What is the reason we wouldn't
suspect for multicores spreading to desktops?


>> Let me see, in my far out corner of the net, my ISP doubled my ADSL
>> speed every year (without me asking anything even). In ten years, I
>> should have here 2Gb/s of Internet bandwidth. I don't think 64 cores
>> will be too many to handle that.
>
> I suspect it will top out long before that. It's amazing (really, it
> is a deeply spectacular feat of engineering) what can be got over a
> phone line, but there are limits. More to the point, latency is not
> something you can make vanish.

They'll have switched to optical fiber by then. Perhaps by nanobots
that would convert the copper in the existing cables into some kind of
transparent copper crystal ;-)


--
__Pascal Bourguignon__ http://www.informatimago.com/

This is a signature virus. Add me to your signature and help me to live.
From: Tim Bradshaw on
Pascal Bourguignon wrote:
> "Tim Bradshaw" <tfb+google(a)tfeb.org> writes:
>

> Ah, well, if you want to discuss the real reason why they'll be put in
> desktop PCs, it's clearly because they cannot increase the clock
> frequency much more, so instead of being able to say "Hey, my PC has a
> 5GHz processor!", we must be able to say "Hey, my PC has 128 cores!".

> Anyway, you leave us in suspense. What is the reason we wouldn't
> suspect for multicores spreading to desktops?

Basically what you say (so, clearly it is what you were expecting
anyway): they have to keep selling upgrades to people.

I think a couple of other reasons (don't know their respective
importance) are
- power consumption: people, I hope, will finally realise that liquid
cooled PCs are actually not funny;
- it's what the CPU vendors will have to sell: they won't want to spend
huge amounts of money developing entirely different desktop and server
processors (I'm not sure if this argument holds water).