Keeping Google's In-house Database Ticking

An anonymous reader writes "ZDNet has a short but interesting piece on what Google did with its 12GB database when it became a challenge for the finance department. The database was split into three, says Chris Schulze, technical program manager for Google: one for the current financial planning projections, one for the actual current data from existing HR and general ledger systems, and one storing historic information. The article says Google has been using a variety of products from Hyperion (recently bought by Oracle) to manage its internal financial systems since 2001."
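For readers curious what a three-way split along those functional lines might look like, here is a minimal, hypothetical sketch in Python with SQLite. The store names, the ledger schema, and the close_period helper are all invented for illustration; the article does not describe Google's or Hyperion's actual schema or tooling.

```python
# Hypothetical sketch of a three-way split by function: one store for planning
# projections, one for current actuals (HR / general-ledger feeds), and one for
# historical periods. Illustrative only; not based on the article's details.
import sqlite3

STORES = {
    "projections": "finance_projections.db",  # current planning projections
    "actuals":     "finance_actuals.db",      # live HR / general-ledger data
    "history":     "finance_history.db",      # closed, archived periods
}

def connect(category: str) -> sqlite3.Connection:
    """Open the store that owns the given data category, creating the
    (illustrative) ledger table if it does not exist yet."""
    conn = sqlite3.connect(STORES[category])
    conn.execute(
        "CREATE TABLE IF NOT EXISTS ledger (period TEXT, account TEXT, amount REAL)"
    )
    return conn

def close_period(period: str) -> None:
    """Move a finished period out of the live actuals store into history,
    so maintenance on the live store touches less data."""
    src, dst = connect("actuals"), connect("history")
    try:
        rows = src.execute(
            "SELECT period, account, amount FROM ledger WHERE period = ?",
            (period,),
        ).fetchall()
        dst.executemany("INSERT INTO ledger VALUES (?, ?, ?)", rows)
        src.execute("DELETE FROM ledger WHERE period = ?", (period,))
        src.commit()
        dst.commit()
    finally:
        src.close()
        dst.close()

if __name__ == "__main__":
    # Record one actuals row, then archive that period into the history store.
    conn = connect("actuals")
    conn.execute("INSERT INTO ledger VALUES ('2007-Q1', '4000-SALES', 1250.0)")
    conn.commit()
    conn.close()
    close_period("2007-Q1")
```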
This discussion has been archived. No new comments can be posted.

  • Re:WTF WTF? (Score:5, Insightful)

    by pasamio ( 737659 ) on Friday April 27, 2007 @09:22AM (#18899245) Homepage
    It's an advertisement! Read the bottom: "Angus Kidman travelled to Orlando as a guest of Hyperion". The thing mentions Hyperion a dozen times; it's the old trick of substituting press releases for news.
  • Only 12GB? (Score:5, Insightful)

    by WapoStyle ( 639758 ) on Friday April 27, 2007 @09:28AM (#18899317)
    I don't get it; that doesn't seem like much to me.

    We have many databases here that are larger, from MSSQL to Oracle, some around the 600GB mark.

    What's so special about Google's database?
  • Re:WTF WTF? (Score:5, Insightful)

    by eln ( 21727 ) on Friday April 27, 2007 @09:35AM (#18899385)
    It's not only a press release, it's a very unimpressive one. Hyperion can handle data larger than 12 GB?! Stop the presses! You could manage a company of 50, maybe even 60 employees with that!

    Plus, the "story" says that in order to manage such a large (*cough*) amount of data, the solution was to partition the database into 3 different parts. Now, I can see partitioning it for ease of management along functional areas, but certainly not because it grew to 12 whole gigabytes. If you can't handle chunks of data larger than 4 GB without partitioning it, you're in big trouble.

    I'm guessing the "anonymous reader" who submitted this works for Hyperion.
  • Press release (Score:3, Insightful)

    by gtoomey ( 528943 ) on Friday April 27, 2007 @09:42AM (#18899449)
    1. Move on, nothing to see
    2. Sack Zonk (sorry man, you post some good stories, but this one's a stinker)
  • by suv4x4 ( 956391 ) on Friday April 27, 2007 @10:36AM (#18900239)
    FTFA:

    The database grew in size to more than 12 gigabytes, and the period restructuring required to ensure accuracy could see the system, which is now used by more than 150 staff, taken offline for two hours at a stretch.

    "Right now, we're on a not very powerful Windows box," Couglin said.


    Uhmm, maybe it's some other Google, right...?

    I can't be reading a press release from Google, the one that has more or less a copy of the whole Internet on its servers, whining about the difficulties of managing a small database on a slow Windows machine.
