Building a Data Center In 60 Days

miller60 writes "The facilities team at Australia's Pipe Networks is down to the wire in its bid to complete a data center in 60 days. And in an era when many major data-center projects are shrouded in secrecy, these guys are putting the entire effort online, with daily updates and photos on the company blog, a live webcam inside the facility, a countdown timer, and a punch-list of key tasks left to finish. Their goal is to complete the job by Friday morning eastern US time."
  • why? (Score:5, Interesting)

    by thedrunkensailor ( 992824 ) on Wednesday June 13, 2007 @08:14AM (#19489317) Homepage
    Why not forget the deadline and get it right? TFA says this was an exec's idea... go figure.
    • Re:why? (Score:5, Interesting)

      by Aladrin ( 926209 ) on Wednesday June 13, 2007 @08:22AM (#19489391)
      Why not? It's a challenge, not a true 'deadline'. Think of it as an episode of 'Monster House' where they get to keep the tools if they get the nearly-impossible project done on time. There's -always- work to be done afterwards to finish it off, but the work is complete as far as they were contracted.

      It's not 'have a fully functional data center filled with customers.' It's only 'build it.'
    • Because it's exciting!!!!!! We're really into open source too, so we're totally cool with you guys knowing all our internal security layout. In fact, the keypad combo for the data center gate for visitors is 31337! Welcome anytime 24/7! Please don't steal anything or mess with any cables though... we don't believe in cameras since they create a hostile work environment, so it'd be really cool if you were good fellows and didn't mess anything up if you want to pop on down to check us out in the middle of the night.
    • Why not forget the deadline and get it right?

      No shit. Why on earth would I want to locate in a datacentre which was intentionally thrown together in a hurry? That's like buying a parachute that was made by the lowest bidder.

      This seems like really bad advertising to me. Anyone who is careful in vendor selection will be unimpressed, and they'll have an uphill battle to convince these prospective customers that no corners were cut or harmful shortcuts taken.

      • Yeah, except that if you actually look through the process, you'll see that it was all done quite well, and better than their previous two datacentres, which they had no trouble filling to capacity.
  • by Anonymous Coward on Wednesday June 13, 2007 @08:15AM (#19489329)
    It's already Slashdotted.
  • Either pipenetworks.com has been /.'d or their 'network of pipes' needs a little visit from RotoRooter. ;-)
  • by Max Romantschuk ( 132276 ) <max@romantschuk.fi> on Wednesday June 13, 2007 @08:21AM (#19489377) Homepage
    Oh, it's just the server going up in smoke trying to serve a live webcam on Slashdot...
  • A couple black boxes (Score:5, Interesting)

    by mikaelhg ( 47691 ) on Wednesday June 13, 2007 @08:23AM (#19489405)
    They could also just have bought a couple of Sun Black Box [sun.com] datacenters in a truck container.
  • Checklist (Score:5, Funny)

    by WillRobinson ( 159226 ) on Wednesday June 13, 2007 @08:26AM (#19489435) Journal
    1. Get first DS3 up - check
    2. Setup webcam - check
    3. Setup webserver - check
    4. Post on slashdot and soak the DS3 - check
    5. Stress test in progress
    • 5. Stress test - failed. Slashdotted at 13:50 UTC. Never underestimate the power of the /.
  • Datacenter???? (Score:5, Interesting)

    by Critical Facilities ( 850111 ) on Wednesday June 13, 2007 @08:27AM (#19489449)

    Pipe's DC3 facility will be about 4,800 square feet and will be able to accommodate 170 server racks.

    I'm sorry, but 4,800 square feet and room/capacity for 170 server racks is a SERVER ROOM, not a DATACENTER. I'm not trying to troll here, but this misuse of the word datacenter gets old. Building a datacenter takes exponentially more time/effort/planning/money than upfitting an area to accommodate a few server racks.

    In short, sticking in a few Liebert CRACs and a little 150kVA UPS does not constitute "building a datacenter".
    • Re: (Score:3, Informative)

      by walt-sjc ( 145127 )
      It depends on the community it is serving. Yeah, that is pretty pathetic compared to datacenters in major cities, but for a small city it would be perfectly fine.

      When I first started using colocation back around '96, Exodus's colo room was 6 racks. They had explosive growth, and by 2001 had massive datacenters in several cities around the globe. Anyway, give them time. If they do things right, they will grow.

    • Re:Datacenter???? (Score:5, Insightful)

      by Xicarius ( 935198 ) on Wednesday June 13, 2007 @09:11AM (#19489885)
      In Australia, it's a datacentre. Relative to the number of people in Australia & the connections within Australia & to the rest of the world, it's pretty big.

      We only have two major cables out of Australia & capacity on them to the US costs hundreds of dollars a megabit/month.
    • Re: (Score:3, Informative)

      by zevans ( 101778 )
      But with blades, 1U pizza boxes, xSeries+VMWare, LPARs, 156 and 288GB spindles, etc., and the consolidation tools that all the vendors are pushing, data centres can and should be smaller than they were five years ago.
      • Re: (Score:3, Insightful)

        by walt-sjc ( 145127 )
        That depends on how the data center is designed. Is it the 300W / sq foot that typical datacenters run at, or is it designed for high-density servers and the additional power / cooling they need? From the size of the generator, there is no way they can go that dense.
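
        For a rough sense of scale, here is that density point as arithmetic, in a quick Python sketch. The 4,800 sq ft figure is from the article summary and the 300W/sq ft is the "typical" density cited above; the 150 kVA UPS is the figure tossed around upthread, and the 0.9 power factor is my own assumption:

            # Total load if the quoted floor area ran at "typical" density
            floor_area_sqft = 4800        # from the article summary
            density_w_per_sqft = 300      # "typical" figure cited above
            total_load_kw = floor_area_sqft * density_w_per_sqft / 1000
            print(total_load_kw)          # 1440.0 kW, i.e. 1.44 MW

            # What a 150 kVA UPS covers at an assumed 0.9 power factor
            ups_kw = 150 * 0.9            # 135.0 kW
            print(round(ups_kw / total_load_kw * 100, 1))   # 9.4 (% of that load)

        At full "typical" density the room would draw roughly ten times what a single 150 kVA UPS can carry, which is the point about the generator size.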
      • Re: (Score:3, Insightful)

        I think that's a little bit of wishful thinking. With the shift to online apps, the increase in streaming media, and the general hunger for bandwidth/throughput (especially on corporate LANs), I'd say that while it's true that advances in server design and virtualization have enabled the IT industry to do more with current equipment, the "market" those products/services serve has stepped up its demand as well. The idea that datacenters are serving a static need just plainly isn't true. The demand keeps growing.
    • Re: (Score:2, Informative)

      by jlf278 ( 1022347 )
      It is a data center; and actually, you're the one guilty of misuse, as "datacenter" is not, in fact, a word. http://en.wikipedia.org/wiki/Data_center [wikipedia.org]
      • Sure it is, check here [reference.com] or here [techtarget.com] or here [thefreedictionary.com] or here [sun.com].
        Also, by the link you provided, some of the criteria for a datacenter include

        To prevent single points of failure, all elements of the electrical systems, including backup system, are typically fully duplicated, and critical servers are connected to both the "A-side" and "B-side" power feeds.

        which doesn't appear in the description of the facility listed in the article.

        • Nowhere in the wikipedia entry does it say that is "criteria" for a data center. In fact, it says things like "they generally include..." or "are usually". The wikipedia entry does say this: "A data center can occupy one room of a building, one or more floors, or an entire building." So give it up. I agree that the article is a bit of a stretch, "build" apparently doesn't include planning or sourcing equipment, and they started with an existing empty area of a building. But that's no reason to go on a rant about what a "data center" vs. a "server room" is.
          • Nowhere in the wikipedia entry does it say that is "criteria" for a data center. In fact, it says things like "they generally include..." or "are usually"

            While it doesn't say that those are the "criteria", you even concede that the article says they "generally include" or "are usually" composed of the listed items, so I don't think I'm that far off the mark in thinking that the items mentioned would be considered "standard".

            But that's no reason to go on a rant about what a "data center" vs. a "server room" is.

            In fairness, I didn't go on a rant. I made a rather succinct (and OK, maybe a little sarcastic) comment about the scope of what is generally considered a datacenter, and subsequently responded to various differing opinions.

    • Re: (Score:3, Informative)

      by afidel ( 530433 )
      You think a 150kVA UPS will service 170 racks?!?!? HAHAHAHAHA
      You have lost all credibility to determine what a datacenter is. A 150kVA UPS would service about 50 moderately loaded (about half empty) racks with most current equipment (quick arithmetic below). 170 racks could serve many midsized companies; my employer's an S&P 500 company, and we have 11 racks, moderately full. Wikipedia defines a datacenter as:

      A data center is a facility used for housing a large amount of electronic equipment, typically computers and communications equipment.
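
      Spelling that sizing claim out (a Python sketch; nothing here beyond the figures already quoted in the comment above):

          # afidel's sizing claim as arithmetic
          ups_kva = 150
          racks_served = 50             # "about 50 moderately loaded racks"
          kva_per_rack = ups_kva / racks_served
          print(kva_per_rack)           # 3.0 kVA per moderately loaded rack
          print(170 * kva_per_rack)     # 510.0 kVA to carry all 170 racks at that load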
      • You think a 150kVA UPS will service 170 racks?!?!?

        No, I don't think that a 150kVA UPS will service 170 racks of modern servers; I was making an exaggerated example of what sometimes gets referred to as a datacenter/data center. However, by your Wikipedia "definition", a closet with a few servers and a window air conditioner would constitute a data center, since "large amount of electronic equipment" is a very subjective term.
    • by AK Marc ( 707885 )
      Well, aside from you calling 170 racks "a few", perhaps rather than whining that this isn't a datacenter, you could tell us how many racks are needed for a datacenter. Or is it services? You like to mention grading and such, so does a datacenter have to be a fresh build? The largest facility of this type in my state is a converted building. Of course, it was, at one time, the largest building of the largest bank, built with security and building strength in mind. All they had to do was add "a few" tons of cooling.
      • I never called 170 server racks a "few". As far as it having to be a fresh build, no, it doesn't have to be, but the headline says "Building a Data Center in 60 days", so maybe I inferred a bit.

        I'm not "whining" about anything at all, I'm just suggesting that the headline is a little misleading and senstaional. The largest facility in your state that you refer to, did they build it in 60 days? In 180 days? In 365 days? If you've been around environments like these (which I'm assuming you have)
        • by AK Marc ( 707885 )
          I'm sorry, but 4,800 square feet and room/capacity for 170 server racks is a SERVER ROOM, not a DATACENTER. I'm not trying to troll here, but this misuse of the word datacenter gets old. Building a datacenter takes exponentially more time/effort/planning/money than upfitting an area to accommodate a few server racks.

          I never called 170 server racks a "few".

          Well, you refer to 170 racks in one sentence, and while apparently on the same subject, you say that accommodating "a few" server racks…
          • What's Google have to do with this? Are you saying that they are the minimum for a datacenter, or are you saying that it's offensive for "datacenter" to cover both 170-rack installations as well as Google's facilities? With illogic like that, a Ferrari and a Yugo can't both be cars, since they are so dissimilar in action, ignoring that they both have four wheels and an engine.

            No, I'm not saying they're the minimum, I'm just trying to illustrate how extreme the differences can be. While we're on that…

            • by AK Marc ( 707885 )
              Please tell me you're kidding here. So you're claiming that your friend(s) have redundant feeds from their utility provider, redundant cooling/humidity control, all the necessary electrical components to ensure 100% uptime (static transfer switches, UPS modules and syncing gear, diesel generators, enterprise grade surge suppression, harmonics mitigating transformers, etc), and that it was built to withstand natural disasters? How much did this garage cost?

              I am telling you that I have seen someone take a 2…
    • What, are you trying to compensate vicariously or something?
      Most people's houses are less than 4,800 square feet. Most businesses fit in less than 4,800 square feet. By those accounts you can't call it a "server room" then, can you?

      170 racks, assume 42U per rack; 1U servers will get you, hmmm, a damned lot of servers (rough arithmetic below). Take a couple racks out for infrastructure and some SAN and it looks like a DC, sounds like a DC, quacks like a DC, and smells like a DC. I'd call it a DC.

      Seriously, if you insist on being pedantic:…
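
      Spelling out that "damned lot" as a Python sketch (rough arithmetic using the poster's own assumptions of 42U racks and a couple of racks reserved for infrastructure):

          # Rough 1U server count for the facility
          racks, u_per_rack = 170, 42
          infra_racks = 2               # "take a couple racks out for infrastructure and some SAN"
          print((racks - infra_racks) * u_per_rack)   # 7056 one-U servers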
      • What, are you trying to compensate vicariously or something? Most people's houses are less than 4,800 square feet.

        Are you trying to compare datacenters/computer rooms with people's homes? Where's bad analogy guy when you need him?

        * A "data centre" is a "center for data", not "a giant room filled with thousands of computers". So by that account the two racks in my garage count. They've got raid arrays, multiple servers and switches, UPS, etc..

        No, the two racks in your garage do not count. Unless, that is…

    • by DJMajah ( 738313 )
      If a shipping container can be a datacentre, this room definitely can.
  • by yohanes ( 644299 ) on Wednesday June 13, 2007 @08:30AM (#19489479) Homepage Journal
    Everyone is doing that, so why shouldn't they? Then we can fire the managers we hate the most.
  • Australia's telco(s) must be vastly more responsive than Verizon. To even get a DSL line up and running within 60 days here would be amazing - a DS3+ typically takes 4-5 months, minimum.
    • by tbcpp ( 797625 )
      I have a friend from Australia. Seems like he told me that they have a country-wide provider for cable, telephone, internet... Any Aussies want to help me out? Or am I just off my rocker?
      • That would be Telstra: a company which sells "broadband" for $30/month, which includes 200MB/month of transfers (up+down) and 15c/MB excess use (rough numbers below). You can get this price on either 8mbit cable (metro) or 256kbit ADSL (around 95% of the country). (And yes, people have had huge bills, and Telstra makes them pay.) See: http://bc.whirlpool.net.au/isp.cfm/Telstra-BigPond/1.html [whirlpool.net.au] You can get better deals from other companies, but all must give some money to Telstra (except in the very few non-Telstra-cabled areas).

        Quickest I'…
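
        To put that 15c/MB excess rate in perspective, a rough Python calculation from the plan numbers quoted above (the 1 GB of overage is a made-up example):

            # Monthly bill if you run 1 GB past the 200 MB cap
            base_monthly = 30.00          # the $30/month plan
            overage_mb = 1024             # hypothetical 1 GB of excess traffic
            excess_per_mb = 0.15          # 15c/MB, as quoted
            print(base_monthly + overage_mb * excess_per_mb)   # 183.6 -> $183.60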
      • by mcbridematt ( 544099 ) on Wednesday June 13, 2007 @09:39AM (#19490231) Homepage Journal
        Former government phone monopoly, now privatized and run by evil Americans - Telstra - basically owns 99% of fixed-line infrastructure (as they are legislated to). Capital cities got TWO separate cable networks during the 1990s - one from Optus (who got the first telco license after deregulation), and the other from Telstra, who built one thinking pay TV was the bomb - it wasn't, and both cable networks have actually shrunk by some degree since.

        (Note the Optus cable network provided, and was designed for, fixed-line telephones from the start, which makes up the small percentage of non-Telstra fixed-line infrastructure around.)

        However, Telstra, as a monopoly, MUST provide wholesale access to the fixed-line infrastructure, so most Australians are actually with internet providers who wholesale off Telstra, either over Telstra DSLAMs or their own. The wholesale prices have been ENFORCED, even DICTATED, by the Australian competition authorities, who, among other things, refuse to tolerate American crap such as "up to XXX mbps" (Australian consumers, unlike Americans, demand full line speed and no lousy contention, or else), "unlimited... up to XXX GB", etc.

        A federal election issue this year is an FTTN (fibre-to-the-node) rollout to every single location within these capital cities, and an assortment of regional centers. Two proposals are in play - one from Telstra, who set the wholesale prices high because they don't want to share and because their shareholders (investment funds, a small % of mum'n'dad investors) want returns, and one from the "G9" - favored by many, but the pricing still sucks.

        As the majority of Australian internet traffic is to/through the US, Australian bandwidth pricing is dictated by capacity on submarine cables to the US - of which there is only one - which is running out of "spare" capacity fast*, despite only being turned on a decade ago. Some providers lease additional capacity via Japan, and there are three new submarine cables in planning that attempt to remedy the bandwidth shortage, either by going to Guam to patch into Japanese capacity, or only up to Hawaii. As I've said, unlike Americans, Australian users, after suffering a few years of low broadband speeds, don't tolerate US-style bandwidth overselling (those that have tried failed miserably), and as such a lot of ISPs outside Telstra (who charge almost business rates anyway) were forced to raise prices due to the increasing use of bittorrent etc.

        * Even worse, the operators of the cable in question, the Southern Cross cable, aren't in any particular hurry to upgrade either.
        • If I'm reading things wrong I apologize, but I think there is some confusion here about DC3 and DS3. Pipe Networks is setting up DC3, but it looks like somebody thought they were talking about a DS3 connection. Also, here in America, yes, we have to put up with "up to xxx kbps" speed claims, but hardly any of our ISPs limit how much transfer volume we get. I've heard about how most UK and Aussie ISPs limit how much volume you can use each month. Despite our ISPs fudging the speed numbers, I count myself lucky.
          • by catprog ( 849688 )
            Instead you have to worry about a monopoly in each local area?

            We have a range of ISPs that serve everybody. For example:
            Inre HOME-512-Elite: 80 GB quota (shaped after quota) / free / dynamic IP / $144.95 per month
            Smart Choice 512^: no set limit (heavy downloaders are shaped during congestion) / dynamic IP / $49.95 per month

            • Most areas in the US have many ISPs to choose from. We don't have to buy our internet from the local phone company. For instance, I have internet from Speakeasy even though AT&T (formerly SBC) owns the actual wires. Of course I can get AT&T's service, but I prefer Speakeasy's features and unrestricted ports. It's a bit more expensive, but I run my FTP server from home with no problems. There are also lots of dialup services to choose from all over the country. There are some exceptions in rural areas.
    • To even get a DSL line up and running within 60 days here would be amazing
      Strange; that's also through Verizon here. Most of the time we can get a T1 within a couple of days, and our last DS3 was in under a week. I guess it's a case of YMMV.
  • by spectrokid ( 660550 ) on Wednesday June 13, 2007 @08:40AM (#19489583) Homepage

    The connection has timed out
    The server at www.pipenetworks.com is taking too long to respond.

    This does not look good!
  • by invisik ( 227250 ) on Wednesday June 13, 2007 @08:46AM (#19489633) Homepage
    Project Black Box. Just drop it off in the parking lot and plug it in.

    http://www.sun.com/emrkt/blackbox/index.jsp [sun.com]

    -m

    • Ok, I just spent like 20 minutes poking around that site. They really have thought of everything with that box. It's pretty amazing.

      The scope is limited to specific applications, but if you need a data room on the go, I can think of a few places where it's exactly ideal. Post-disaster, e.g. New Orleans, would be one. NATO actions, or similar needs for mobile infrastructure in war zones? I mean, it's a really neat idea.

      And I was like "they can't have solved the cooling problems". But, apparently, they have…
      • by eln ( 21727 ) on Wednesday June 13, 2007 @10:29AM (#19490867)
        Actually, it requires "chilled water." I took a tour of this thing when it came to the local Sun campus, and it really is quite an amazing piece of engineering. Basically, you need one (small) cargo container for the data center itself, and a chiller for the water. They are able to carry the cargo container and a chiller around in a standard-sized 18-wheeler. Obviously, if you were trying to take this into a disaster area, you'd need another truck or two to carry generators and fuel.

        Inside the building, they had a bunch of photoshopped pictures of these black boxes in various locations like on top of an offshore oil rig, stacked 3 high in a warehouse, and sitting on top of a skyscraper. The photoshopping was fairly good, but you could tell the photos were faked, mostly because at the time only 2 black boxes had actually been built, and one of them was outside in the parking lot.
      • It's pretty brilliant.

        The US (and I would assume basically every other) military has been doing this for longer than I've been alive. They have standardized connections for power, water, et cetera. They've got a darkroom-in-a-box, a hospital-in-a-box, et cetera.

        And it's not a closed system if you have to pipe cooling water into and out of it, nor if it consumes electricity. It's a "sealed box" but not a closed system. I mean, Earth isn't a closed system.


        • Oh, right, right about the "closed system". I just meant that it recirculates the same air and doesn't rely on having air vents to the outside world (such as would be required for, say, a traditional air conditioner).

          And it figures that the military would have something like this. In sci-fi novels, they often have a "city in a box" or whatever, where it's a bunch of modular buildings that you snap together like Legos and bam, instant city.

          Neat!

          ~Wx
        • Not to be too picky, but something can consume electricity and still be a closed system.

          In thermodynamics, a system is "closed" if it does not exchange matter with the surroundings (but optionally may exchange heat and/or work). A system that does not exchange heat or work is called an isolated system.

          Earth is not isolated, but it is very nearly a closed system (we launch craft into space, meteors land).
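
          In first-law notation (standard textbook definitions, not something stated in the thread), a closed system permits energy flow but no matter flow, while an isolated system permits neither:

              \text{closed:}\quad \dot{m} = 0,\qquad dU = \delta Q - \delta W
              \text{isolated:}\quad \dot{m} = 0,\quad \delta Q = \delta W = 0 \;\Rightarrow\; dU = 0

          So a box that draws electricity and sheds heat to a water loop can still be closed in this sense, just not isolated.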
    • by Barny ( 103770 )
      Just add time to ship it to Australia, get it through customs, and hope some dockworkers don't take a liking to it; probably 3-4 months after Sun sends it, it may get to its location and be plugged in.

      Did you even read the summary?
    • by Stu101 ( 1031686 )
      Could Sun make it any plainer to the would-be thief: "High quality, expensive computers in here"? It's the exact opposite of what it should be, i.e. nondescript, so all the scrotes don't even think to try and bust it open.
      • Only the demo Blackbox has Sun logos all over it; presumably the real ones are nondescript and rusted on the outside. :-)
  • It did have a webcam. It did have a blog. Ahh well...
  • Time zones (Score:5, Insightful)

    by Xiroth ( 917768 ) on Wednesday June 13, 2007 @09:26AM (#19490063)
    Uh, the article says that they aim to be complete at 9 am EST. While that would mean an American time zone in America, in Australia it means an Australian time zone (specifically AEST, or GMT+10, a.k.a. their local time). So they're actually aiming to finish on Thursday afternoon, Eastern American time (conversion sketch below).

    Just an FYI, unless there's clarification somewhere that they were speaking of the American EST.
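
    A quick check of that conversion as a Python sketch. The completion date of Friday 15 June 2007 is inferred from the comment timestamps, Brisbane as PIPE's location is my assumption, and zoneinfo needs Python 3.9+:

        from datetime import datetime
        from zoneinfo import ZoneInfo  # Python 3.9+ standard library

        # 9 am Friday AEST (Queensland observes no DST, so AEST applies year-round)
        aest = datetime(2007, 6, 15, 9, 0, tzinfo=ZoneInfo("Australia/Brisbane"))
        print(aest.astimezone(ZoneInfo("America/New_York")))
        # 2007-06-14 19:00:00-04:00, i.e. late Thursday, US Eastern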
    • by eln ( 21727 )
      It's a built-in safety mechanism. If they can't get it done by Friday morning American EST, they'll just claim they meant Australian EST the whole time.
      • Your time zones are the wrong way around... (the international date line plays a role here).
        So I guess AEST is the first goal, but since they just said EST, they could always say they meant the US time zone.
    • Do you think an article from Australia about an event IN Australia by an Australian company just might be enough context for EST to mean AEST? Sure, they should have used the correct *full* "AEST", but hey, habit is as habit does.

      And there is no "Eastern American Time"; it's EST/EDT. If you feel the need to spell it out, it's "North American" - don't forget the Canadians, eh?

      Sorry mods, nothing insightful about the parent. Informative perhaps, but certainly not bearing any insight.
  • by VinB ( 936538 ) <VinBrown@cox.net> on Wednesday June 13, 2007 @09:40AM (#19490247)
    This challenge would be great if they also had David Hasselhoff, Paula Abdul and John Schneider making comments after each piece of equipment is installed.
  • Probably not. The blog server is probably running on an old P2 under someone's desk. And is currently leaking magic smoke.
  • Building something in a hurry is not an accomplishment in itself. Keeping it well-maintained is the real challenge.

    Would you rather slap together a DIY PC in 15 minutes or spend time ensuring your cables are positioned to allow good airflow, etc.? The same principle applies.

  • And in an era when many major data-center projects are shrouded in secrecy, these guys are putting the entire effort online....

    Really? Data-center projects shrouded in secrecy?

    Maybe it is simply because you want it to work before the customers actually connect to it - unlike the actual datacenter, which can't handle the load of a few slashdot users...

    • OK, much as it is a big joke that their datacentre has been slashdotted, ha ha, it does strike me as a little unlikely that they are running the blogs etc. from the actual finished datacentre itself. After all, they haven't actually built it yet, right? I mean, if they actually managed to get a blog/webcam/whatever to work in an empty room _before_ they installed the hardware, I'd say they were pretty good at their jobs, non?
      • My only real complaint is the premise they were starting from: that most data centers are created "in secret", when in fact not too many people are interested in webcamming an empty room. It would be more logical to set up the servers and then install the webcam and post to Slashdot, but I suppose that is just me. As nice as it is for them to show themselves hard at work, I still think this story is a lot of "much ado about nothing". Some of us do this type of work routinely, and it is just a little strange to call it…

      • Yeah. As the name suggests, DC3 is the third datacentre. As a former PIPE employee and someone who helped build both DC2 and DC3, I can tell you it was devoid of computers except for a random machine that was capturing webcam footage and uploading it to the web server.
    • Maybe you didn't understand the concept of what's going on here. That seems to be going around lately. The datacenter is something they sell space in to customers. It's a third-party colocation room. Slashdot traffic doesn't go anywhere near it because there are no active servers or data comms gear in there; it's just an empty room. PIPE's a telecommunications carrier. The datacentre isn't for their own number crunching. Load on the web server has nothing to do with it.
  • .. if they'd moved hosting for the blog and webcam to North America or Europe once they were mentioned on Slashdot?

    In this case, there is such a thing as too much publicity. And I'd hate to see their bandwidth bills for this month.

    --Alex

  • by xxxJonBoyxxx ( 565205 ) on Wednesday June 13, 2007 @11:11AM (#19491547)
    This is a great PR piece! Budding marketeers take note: "experiments" like this are a great way to get all kinds of free press. I hope the marketing team at Pipe gets a raise for this.
  • In the future we won't need to worry about all this secrecy. You can just do everything virtually, on the Internet. Use encryption if you have something to hide.

    Oh, wait... never mind.

  • More details (Score:2, Informative)

    by BDPrime ( 1012761 )
    More details in this story [techtarget.com]. It definitely seems like a "we promised the customer we'd be ready by this time, so you'd better get it done" type of deal. Demand for colo space is strong, but I don't know that it's so strong that Pipe Networks has to cobble together a data center as fast as it can. They could probably have doubled the time and it wouldn't have made a difference.

    The story also says the 60-day period is just the construction time period, and not the planning behind it, etc. But whatever. They cre…

  • The /. effect is in full motion at 1:45 EST. The site is down or too busy.
