Slimming Down a Supercomputer
1sockchuck writes "Happy Feet animator Dr. D Studios has packed a large amount of supercomputing power into a smaller package in its new render farm in Sydney, Australia. The digital production shop has consolidated the 150 blade chassis used in the 2007 dancing penguin feature into just 24 chassis, entirely housed in a hot-aisle containment pod. The Dr. D render farm has moved from its previous home at Equinix to the E3 Pegasus data center in Sydney. ITNews has a video and photos of the E3 facility."
So instead of happy feet.... (Score:1)
Re: (Score:1)
This is a blade.
Does it blend? (Score:2)
In slow motion, please.
Priceless (Score:5, Funny)
Cost of real estate in prime metropolitan area - $15 million
Cost of state of the art server racks - $30 million
Cost of flying in a cooler the size of a small bus on a 747 - $2 million
Cost of seeing data center employee's face when they realise they're on call 24/7 for no extra cash - Priceless.
Re: (Score:2)
Value of free publicity for Happy Feet and Hewlett Packard from the advertorial: $50,000
Re: (Score:2)
A380 actually. :P
Re: (Score:2)
Cost of seeing data center employee's face when they realise they're on call 24/7 for no extra cash - Priceless.
Well, if he agreed to give away work for free because he thinks he’s worth that little, then that’s his own damn fault.
People who don’t learn to say no will obviously be walked all over. Your boss is only a client of yours. You can always get another client. Either he offers a good deal, or he can GTFO.
I read TFA (Score:2)
It was a billion times more entertaining than Happy Feet.
Re: (Score:2)
Happy Feet was a fun movie. How dare you?
Re: (Score:1)
Happy Feet was at least more entertaining than The Lord of the Rings. They had singing, dancing and walking in Happy Feet. The walking parts were also better, as penguins do that in a more entertaining way.
Re: (Score:2)
We here in Europe don’t get what you Americans like about singing in movies and shows. Always the pointless singing, while we all just collectively cringe. It ruins the whole movie for us. :)
Not judging here. Do whatever makes you happy.
But we don’t get it, and can’t stand such movies.
Re: (Score:2)
Indeed. It was an insult to pedophiles everywhere.
I work in a national lab computer center (Score:1)
Tried to check out the E3 Networks site (Score:1)
But it's in Flash. And I didn't have the patience to wait for the clouds and animation to finish.
http://www.e3networks.com.au/ [e3networks.com.au]
Who is this supposed to be targeting? You have to be a class A moron to build a data centre website using flash on the landing page.
Re:Tried to check out the E3 Networks site (Score:5, Funny)
Re: (Score:2)
Did you spot any pointy hair?
Re: (Score:1, Interesting)
Slightly off-topic, but...
1. The funny thing about their site design is that about 90% of it could have easily been done with mouseovers and no flash.
2. None of the text can be highlighted. Let's say that they were the solution for my business, and I just needed to e-mail someone in management a snippet about their site. Too bad. No copy and paste.
It feels like it's 2001 or 2002 again.
Re: (Score:2)
Re: (Score:2)
You don't wait for it to finish, there is no finish. You just click on the background and it loads the rest of the site.
Flash isn't just used for the landing page: it's the whole site. Every scrap of it is flash. I feel sick!
About relocating supercomputing power to Australia (Score:5, Funny)
Re: (Score:2, Funny)
Not only that, they are also upside down.
Re: (Score:2)
You all do realize that electrons spin backwards there, right?
Only when you are not watching.
Re: (Score:2)
You all do realize that electrons spin backwards there, right?
Moderation +2, 100% Informative
Only on Slashdot.
Re: (Score:2)
Dude you could at least CITE it. Hello!
http://en.wikipedia.org/wiki/Coriolis_effect [wikipedia.org]
Re: (Score:1)
Re: (Score:2)
I was joking to make your joke about electrons going backwards seem more real. But nobody modded me funny cuz they thought I was serious. No deadpan humor on the net. It's all about the voice.
Re: (Score:1, Funny)
It's ok, they just flip the servers upside-down.
What about the rest of it ? (Score:4, Interesting)
What racks are they using (at least 42RU in height)?
How do they get power into these (4 chassis, each with 6 x 15A power inlets)?
Are they using top-of-rack switches, or is there more equipment?
Are they using liquid-cooled doors, and if so, whose?
I once tried to get answers from HP on how to power their equipment at this density - they didn't have a clue. It's worth remembering that each of these chassis has six power supplies, each rated at up to 2.2 kW. Even allowing for a 2N configuration, that's a massive amount of power, and a lot of cables.
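Back-of-the-envelope, the numbers look like this (a sketch; 240 V supply and reading 2N as half the PSUs sitting redundant are my assumptions):

# Rough per-rack power for the density described above.
# PSU and chassis counts are from the post; 240 V and the 2N split are assumptions.
PSUS_PER_CHASSIS = 6
PSU_RATING_KW = 2.2
CHASSIS_PER_RACK = 4
VOLTS = 240  # typical supply voltage in Australia

nameplate_kw = PSUS_PER_CHASSIS * PSU_RATING_KW * CHASSIS_PER_RACK  # 52.8 kW
usable_kw = nameplate_kw / 2        # 2N: only half the PSUs carry load -> 26.4 kW
amps = usable_kw * 1000 / VOLTS     # ~110 A per rack at 240 V

print(f"{nameplate_kw:.1f} kW nameplate, {usable_kw:.1f} kW under 2N, ~{amps:.0f} A @ {VOLTS} V")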
Re:What about the rest of it ? (Score:4, Informative)
TFA says they use 48RU, and each cabinet uses 14.4 kW (60A), which in my opinion is not that impressive: you just need 3 phases at 20A, 240V.
As for cooling, you can easily get away with no water-cooling if your hot-aisle containment is well done. From the pics it is just Dell 1U servers, and if you fill one 48U rack with those you do get to 14.4 kW. But not all racks are for number-crunching; you have racks for storage, control and network, and those draw less than 8 kW.
The problem is not powering those things so much as cooling them. With good hot-aisle or cold-aisle containment you can go up to 15 kW/rack, but depending on the air volume, you're quickly screwed if the cooling fails.
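Quick sanity check on those figures (a sketch; the ~300 W per 1U server is my assumption):

# 3-phase feed vs. the per-cabinet figure quoted in TFA
rack_kw = 3 * 20 * 240 / 1000      # 3 phases x 20 A x 240 V = 14.4 kW
amps_single_phase = 14.4e3 / 240   # = 60 A, matching the quoted figure

# A 48U rack of 1U servers at ~300 W each lands on the same number:
servers_kw = 48 * 300 / 1000       # = 14.4 kW

print(rack_kw, amps_single_phase, servers_kw)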
Re: (Score:3, Interesting)
Re: (Score:2)
We just got a few 1U dual socket quad core servers from Dell, so I don't know why you're saying they can't do it.
Re: (Score:2)
Re: (Score:2)
Oh, they just released those. They have a 2U box with 4 servers sharing two PSUs now.
PowerEdge C6100 I think. But you're right, I remember looking at HP and the others and they did have them but they weren't the right price for us, as we didn't need density.
Re: (Score:2)
There comes a point where you're cramming too much heat into a case and the whole system rapidly becomes unstable.
Shoving 16 cores into a single 1U case without doing the numbers blows right past any sane risk threshold.
Great, you can get that many in one U... but Dell doesn't want to deal with supporting such hardware and all the heat issues.
There's more to a data center than how much you can stuff into the racks; it's actually got to work w
Re: (Score:2)
"Doing the numbers" is called design - in those cases enough airflow does the job.
There are of course denser setups than that anyway but that changes the price catagory - while the two servers in 1U is less than what you would pay for 2 x 1 U Dell servers of equivalent specs. If you don't need the extra drive bays it's not worth going for Dell especially if you are in a country where their suppor
Re: (Score:2)
Dell can't do two boards with 8 cores each in 1U, and I've got some of those that are a couple of years old now.
They're called blades.
The density isn't quite as high, but since you'll nearly always run out of power or cooling long before you run out of rack space, even with 1U boxes, there's not a lot of benefit from increasing density much past even a simple 1U pizza box. The benefits of blades are more in the management centralisation and reduced cabling, which you don't get in those servers you're talking a
Re: (Score:2)
My main point is that Dell lags well over a year behind many of the other vendors and often costs more, so if they won't give you support in your country there is no reason to go with them.
Re: (Score:1)
The article does mention that they're using HP blade servers, not Dells as another commenter posted. In the video they showed a BL490c G6 blade, which is a dual-socket Nehalem blade at 16 per chassis. For cooling they were using water-cooled APC pods. The power isn't really the
Re: (Score:2)
Slimming down? (Score:2)
Seems fairly run-of-the-mill (Score:2)
News story is that computers are faster and have more memory than they were 3 years ago, so they need fewer of them. They bought APC enclosed systems to avoid having a hot aisle due to open-air cooling (of course, that means they paid a non-trivial amount for that).
Re: (Score:2)
Re: (Score:2)
I may be wrong, but I believe it takes way more than that to render a frame.
And (Score:2)
who manages their security? (Score:2)
Strange... (Score:1)
I'm really thinking that this article is leaving some very important details out... It's really strange that a money-making data center would have physical space as its primary limiting factor. Things like power, cooling, network, etc. are usually far more important than square feet of tile, especially when anyone with any experience in data centers isn't going to put it in a high-value real estate market; it's going to be out in some industrial/commercial zone in the burbs where land/power/water are cheape
Re: (Score:2)
And the point is...? (Score:1)
Re: (Score:1)
A renderfarm is NOT a supercomputer! (Score:1)
So? (Score:2)
If they waited another 2 years they could pack the same processing power into a desktop PC.
Why are we posting stories about companies who are just upgrading the old PCs they use for their rendering farm?
What's next? Google server farm updates? Are we going to get posts when Red Hat upgrades its FTP servers to faster hardware just because it's cheaper than replacing the old?
I mean seriously, all they did was upgrade, and ... it wasn't even a big upgrade, I've made bigger purchases than that over the phone to d
Wot? (Score:2)
No GPUs?
Opportunity... (Score:1)
Now let's see if they could put that technology to good use by creating a good film.