From: Robert Myers on
On Feb 9, 5:02 pm, ga...(a)allegro.com (Gavin Scott) wrote:
> Eugene Miya <eug...(a)cse.ucsc.edu> wrote:
> > My friends who work at Pixar on their clusters have reached the point
> > that a given cluster is only good enough for 1 film.  While they have
> > not completely depreciated, they work better assembling a new cluster
> > for each film and hope that input cost is recouped by the new film.
>
> As far as I know it has always been this way and I doubt there's any
> application that depreciates hardware faster. 3D has an insatiable
> appetite for computing power.
>
> Some of the numbers for Avatar are fairly impressive. 10TB average
> amount of assets produced per day, 34,000 CPU cores, >100TB RAM,
> 47 hours (of something) per frame, 50 billion polygon scenes, etc.
> (numbers from memory from the January 3D World magazine, and IIRC).
>
> Maybe not so impressive relative to what Eugene's friends run in the
> "fluid dynamics" area, but pretty cool none the less I think. And if
> they could have afforded 10x or 100x of any or all of those parameters
> then they would have had no trouble putting that power to use.
>
> The Cray era of graphics (where the machine was needed to do the actual
> rendering rather than running a simulation of some sort) was VERY
> short. I think only one or two were ever really sold into that market
> and I think their purchasers were hoping to amortize the cost over
> significantly more years than turned out to be realistic.
>
> > Computers are much more than graphics.
>
> Indeed but haven't things like PC 3D gaming and 3D content creation
> been responsible for a lot of the market force that has driven
> innovation both in CPUs as well as 3D hardware that has led now
> to GP-GPU computing? I'd hate to imagine a world where Visicalc was
> the most powerful application anyone needed to run on a personal
> computer :)
>

Even Boeing had to sell time on its Cray to justify it, and the
important thing about a Cray at the time was that it always had
potential customers, unlike the one-off machines of today. Then, as
now, only the government could make silly purchases and not go out of
business.

As soon as I saw what was involved in realistic graphics and what the
demand was likely to be, I knew that graphics would drive computers
far into the future, and I believe that was a correct prediction.
Without graphics, DEC would still be selling overpriced machines to
compete with an even more overpriced IBM.

I'm pretty pessimistic about the future of everything having to do
with computers--from the Internet to the material that makes up
comp.risks. With respect to graphics, though, the future remains
secure, because, as you point out, it can gobble up as many flops and
bytes per second as anyone can pay for, and it adapts readily to the
lazy architectures that now dominate big machines, whether cobbled
together out of rack-mounted hardware or slickly packaged to be
photogenic.
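
If the "47 hours" quoted above are core-hours per finished frame (the
magazine wasn't clear on that, so treat it as my assumption), a quick
sanity check in Python, also assuming a ~160-minute film at 24 fps:

    # Back-of-envelope on the Avatar figures quoted above. Assumptions
    # (mine, not the magazine's): "47 hours" means core-hours per final
    # frame, the film runs ~160 minutes at 24 fps, and the 34,000 cores
    # stay perfectly busy.
    CORES = 34_000
    CORE_HOURS_PER_FRAME = 47        # assumed interpretation
    FPS = 24
    RUNTIME_MIN = 160                # assumed runtime

    frames = FPS * RUNTIME_MIN * 60
    total_core_hours = frames * CORE_HOURS_PER_FRAME
    wall_clock_days = total_core_hours / CORES / 24

    print(f"{frames:,} frames")                       # 230,400
    print(f"{total_core_hours:,} core-hours")         # ~10.8 million
    print(f"~{wall_clock_days:.0f} days wall clock")  # ~13 days

One pass through the film in under two weeks at perfect utilization;
multiply by test frames and re-renders and it's clear why they could
have put 10x or 100x of everything to use.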

Robert.


From: Rick Jones on
Robert Myers <rbmyersusa(a)gmail.com> wrote:
> On Feb 9, 5:02 pm, ga...(a)allegro.com (Gavin Scott) wrote:
> > Eugene Miya <eug...(a)cse.ucsc.edu> wrote:
> > > My friends who work at Pixar on their clusters have reached the point
> > > that a given cluster is only good enough for 1 film.  While they have
> > > not completely depreciated, they work better assembling a new cluster
> > > for each film and hope that input cost is recouped by the new film.
> >
> > As far as I know it has always been this way and I doubt there's any
> > application that depreciates hardware faster. 3D has an insatiable
> > appetite for computing power.
> >
> > Some of the numbers for Avatar are fairly impressive. 10TB average
> > amount of assets produced per day, 34,000 CPU cores, >100TB RAM,
> > 47 hours (of something) per frame, 50 billion polygon scenes, etc.
> > (numbers from memory from the January 3D World magazine, and
> > IIRC).


> Even Boeing had to sell time on its Cray to justify it, and the
> important thing about a Cray at the time was that it always had
> potential customers, unlike the one-off machines of today. Then, as
> now, only the government could make silly purchases and not go out of
> business.

I may share what I perceive to be your disdain for calling a bunch of
systems connected via a network a "supercomputer," but in the case of
Avatar, if what I've heard is correct, it was rendered on hardware
general-purpose enough that the folks who bought it could, if they
wanted to, break it up and sell the blades/chassis to people who
wanted to do general-purpose work. So I'm not sure how much of what
was used to render Avatar would be "one-off."

rick jones
--
firebug n, the idiot who tosses a lit cigarette out his car window
these opinions are mine, all mine; HP might not want them anyway... :)
feel free to post, OR email to rick.jones2 in hp.com but NOT BOTH...
From: Robert Myers on
On Feb 9, 7:44 pm, Rick Jones <rick.jon...(a)hp.com> wrote:

>
> I may share what I perceive to be your disdain for calling a bunch of
> systems connected via a network a "supercomputer," but in the case of
> Avatar, if what I've heard is correct, it was rendered on hardware
> general-purpose enough that the folks who bought it could, if they
> wanted to, break it up and sell the blades/chassis to people who
> wanted to do general-purpose work. So I'm not sure how much of what
> was used to render Avatar would be "one-off."

Andy Glew has said that a supercomputer is nothing but a big, multi-
tiered switch. That is certainly what supercomputers have come to be,
and I don't want to give the impression that I think even the lame
interconnects we get are necessarily trivial.
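
For a sense of the scale of the switch, the textbook three-tier
fat-tree (Al-Fares et al., SIGCOMM 2008) built from k-port switches
supports k^3/4 hosts at full bisection using 5k^2/4 switches. A toy
sketch with the textbook formulas only; this is no one's actual
machine:

    # k-ary fat-tree: k pods, each with k/2 edge and k/2 aggregation
    # switches, plus (k/2)^2 core switches on top.
    def fat_tree(k):
        hosts = k**3 // 4
        switches = k**2 // 2 + k**2 // 2 + k**2 // 4  # edge + agg + core
        return hosts, switches

    for k in (16, 32, 48):
        hosts, switches = fat_tree(k)
        print(f"k={k:2d}: {hosts:6,} hosts, {switches:5,} switches")

Even a modest 48-port building block gets you ~27,000 hosts, which is
why "big multi-tiered switch" is not a bad one-line description.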

Someone here asked what the subject matter of computer architecture
really is, and one of the "real" architects responded with a list of
topics that extends no further than the front-side bus or whatever has
replaced it. I'm glad the "real" computer architects are here, but the
things that once obsessed them have been pretty thoroughly worked
over.

As to recycled hardware, the economics of energy consumption have made
old hardware unattractive for all but the naive. That will cease to
be true when the chipmakers run out of power-saving tricks, but we're
not there yet.
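
The arithmetic is simple enough. With invented but plausible numbers
(every figure below is made up for illustration):

    # Hypothetical comparison: an old cluster vs. a smaller, newer one.
    # All numbers are invented for the sake of the arithmetic.
    KWH_PRICE = 0.10   # $/kWh, assumed
    PUE = 1.8          # cooling/overhead multiplier, assumed

    def yearly_power_cost(nodes, watts_per_node):
        kwh = nodes * watts_per_node * PUE * 24 * 365 / 1000
        return kwh * KWH_PRICE

    old = yearly_power_cost(1000, 400)  # 1,000 old nodes at 400 W each
    new = yearly_power_cost(500, 350)   # 500 new nodes at 350 W, 4x faster
    print(f"old: ${old:,.0f}/yr for 1,000 units of throughput")
    print(f"new: ${new:,.0f}/yr for 2,000 units of throughput")

If the new nodes really are 4x faster, the old iron costs roughly four
to five times as much in electricity per unit of work, before counting
floor space and maintenance. Hence: unattractive for all but the naive.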

Robert.
From: Eugene Miya on
In article <TaCdnXku4vEPRuzWnZ2dnUVZ_g6dnZ2d(a)supernews.com>,
Gavin Scott <gavin(a)allegro.com> wrote:
>Eugene Miya <eugene(a)cse.ucsc.edu> wrote:
>> Pixar clusters
>
>As far as I know it has always been this way and I doubt there's any
>application that depreciates hardware faster. 3D has an insatiable
>appetite for computing power.

The world had a brief flirtation with specialized hardware:
E&S, SGI, the Pixar PIC. Then came the various attached frame buffers
from Ramtek, etc. The specialized super workstations (Ardent was a
brief personal favorite) made their appearance. Specialized hardware
still holds favor with people willing to pay.

>Some of the numbers for Avatar are fairly impressive. 10TB average
>amount of assets produced per day, 34,000 CPU cores, >100TB RAM,
>47 hours (of something) per frame, 50 billion polygon scenes, etc.
>(numbers from memory from the January 3D World magazine, and IIRC).

The original Lucasfilm numbers for requirements (pixels) were fairly
impressive as well. 16K on a side with 32 bits of color depth, including alpha.
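
Taking the 16K figure at face value, the arithmetic on a single frame
buffer is striking:

    # One uncompressed frame at the quoted Lucasfilm target:
    # 16K pixels on a side, 32 bits (RGBA) per pixel.
    side = 16 * 1024   # reading "16K" as 16,384; could also mean 16,000
    frame_bytes = side * side * 4
    print(f"{frame_bytes / 2**30:.0f} GiB per frame")  # 1 GiB

A gigabyte per uncompressed frame, at a time when a megabyte of memory
was serious money.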

>Maybe not so impressive relative to what Eugene's friends run in the
>"fluid dynamics" area, but pretty cool none the less I think. And if
>they could have afforded 10x or 100x of any or all of those parameters
>then they would have had no trouble putting that power to use.

Oh, yeah, that was the Jupiter scene in the film "2010" as well.

>The Cray era of graphics (where the machine was needed to do the actual
>rendering rather than running a simulation of some sort) was VERY
>short. I think only one or two were ever really sold into that market
>and I think their purchasers were hoping to amortize the cost over
>significantly more years than turned out to be realistic.

Yep, my friend Gordon knew which drives were attached to the Cray-1.
That was three going on four decades ago...

>> Computers are much more than graphics.
>
>Indeed but haven't things like PC 3D gaming and 3D content creation
>been responsible for a lot of the market force that has driven
>innovation both in CPUs as well as 3D hardware that has led now
>to GP-GPU computing? I'd hate to imagine a world where Visicalc was
>the most powerful application anyone needed to run on a personal
>computer :)

I would suggest one step beyond PC 3D: networked 3D. A friend visited
Cameron with Katzenberg (who insisted that he was not a technologist)
one day on the Avatar set. That friend is now attempting a new venture
and trying to hire interns (cloudpic.com if people are interested).

--

Looking for an H-912 (container).

From: Eugene Miya on
In article <2c10d8e7-e588-4b1b-b65e-3a2ec0929011(a)k41g2000yqm.googlegroups.com>,
Robert Myers <rbmyersusa(a)gmail.com> wrote:
>Even Boeing had to sell time on its Cray to justify it, and the
>important thing about a Cray at the time was that it always had
>potential customers, unlike the one-off machines of today. Then, as
>now, only the government could make silly purchases and not go out of
>business.

Boeing owned multiple Crays over time.
Boeing Computer Services (BCS) was a service bureau independent of
Boeing Aircraft. For two years I had a summer job that weighed buying
time against a 7600. Boeing Aircraft, which also had various systems
of its own, thought long and hard about acquiring a Cray-2, for
instance, but opted to use ours. We didn't regard that as a silly
purchase. It was a decade before LANL caught up with us and LLNL in
doing their first 3-D sim, while they played with their two
Connection Machines.

>As soon as I saw what was involved in realistic graphics and what the
>demand was likely to be, I knew that graphics would drive computers
>far into the future, and I believe that was a correct prediction.

So long as one doesn't have to think about what goes on behind the
rendering, anyone can be dazzled.

>Without graphics, DEC would still be selling overpriced machines to
>compete with an even more overpriced IBM.

Neither firm had, or has, much idea of what goes on in graphics. To
IBM, color graphics was a 3279. With IBM, you paid for service.

>I'm pretty pessimistic about the future of everything having to do
>with computers--from the Internet to the material that makes up
>comp.risks. With respect to graphics, though, the future remains
>secure, because, as you point out, it can gobble up as many flops and
>bytes per second as anyone can pay for, and it adapts readily to the
>lazy architectures that now dominate big machines, whether cobbled
>together out of rack-mounted hardware or slickly packaged to be
>photogenic.

LOL 8^)

I don't have a lot of time for Peter's mailing list, but I see him every
now and again, and he steals desserts from me and Knuth. Probably again
in about 2 months if not before.

My cell phone's software still has problems, and I have to get around
to getting a new laptop, but I don't see a whole lot of problems with
that. It's a wide field.

--

Looking for an H-912 (container).