858TB of Government Data May Be Lost For Good After South Korea Data Center Fire (datacenterdynamics.com)
South Korea's government may have permanently lost 858TB of information after a fire at a data center in Daejeon. From a report: As reported by DCD, a battery fire at the National Information Resources Service (NIRS) data center, located in the city of Daejeon, on September 26, has caused havoc for government services in Korea. Work to restore the data center is ongoing, but officials fear data stored on the government's G-Drive may be gone for good. G-Drive, which stands for Government Drive and is not a Google product, was used by government staff to keep documents and other files. Each worker was allocated 30GB of space.
According to a report from The Chosun, the drive was one of 96 systems completely destroyed in the fire, and there is no backup. "The G-Drive couldn't have a backup system due to its large capacity," an unnamed official told The Chosun. "The remaining 95 systems have backup data in online or offline forms." While some departments do not rely on G-Drive, those that do have been badly impacted in the aftermath of the fire. A source from the Ministry of Personnel Management said: "Employees stored all work materials on the G-Drive and used them as needed, but operations are now practically at a standstill."
They obviously did a risk analysis. (Score:4, Insightful)
Re:They obviously did a risk analysis. (Score:5, Funny)
Re:They obviously did a risk analysis. (Score:4, Funny)
Maybe the risk analysis found a catastrophe would be their only chance of moving the country and government off of ActiveX.
I heard it was a Flash fire after someone had a bit too much Java.
I’ll see saw myself out now.
Re: (Score:2)
I’ll see saw myself out now.
Thank you. Saves us the trouble of having to find our hook. :-)
Re: (Score:2)
The shockwaves from this event will reverberate around the web.
Re: (Score:2)
Not saying you don't deserve your Funny mod, but there were lots of unmodded funnies in the discussion...
Re: (Score:2)
Re: (Score:3)
One more Adaptec 2940UW and a couple of SCSI drives was too expensive.
My bet is that they had the proper tools and resources but lacked either the knowledge or motivation to implement it.
Re: (Score:2)
The Korean gov't didn't want to spend the money or were lazy.
Re: (Score:2)
Yes, my first thought as well. I mean, otherwise they would have off-site _and_ offline backups. Clearly not needed.
Re:They obviously did a risk analysis. (Score:5, Insightful)
Correct: "The G-Drive couldn't have a backup system due to its large capacity," merely means "we were too cheap to provide a backup".
Re: They obviously did a risk analysis. (Score:2)
Reminds me of that line from Contact (1997): "Why build one when you can have two at twice the price?", which was about government spending, too.
Re:They obviously did a risk analysis. (Score:5, Insightful)
Correct: "The G-Drive couldn't have a backup system due to its large capacity," merely means "we were too cheap to provide a backup".
As we always used to say, "If you think having a backup is too expensive, try not having one."
Re: (Score:2)
>"Correct: "The G-Drive couldn't have a backup system due to its large capacity," merely means "we were too cheap to provide a backup".
Or it means the CUSTOMER was too cheap to PAY for storage with a backup. I wouldn't point the finger only at the data center. It is highly probable they sold what the customer wanted/chose and it wasn't a secret that there was no off-site backup.
It might have been intended for scratch or temporary use, or staging for unimportant stuff, and they had access to more expens
Geographically Redundant Data Center Backup (Score:2)
It's not just for Microsoft anymore
Couldn't have a backup (Score:5, Interesting)
The Korean government couldn't afford a petabyte of storage to back up documents without which "operations are at a standstill".
Re:Couldn't have a backup (Score:4, Interesting)
The Korean government couldn't afford a petabyte of storage to back up documents without which "operations are at a standstill".
Yes, this. We're not talking about a huge amount of money or physical space ... for a national government. Furthermore, why do 28 thousand government workers each need 30GB of disk space? Usually information that is either critical or even just functional is stored in a database or a repository to allow access to team members or even just to survive the end of employment for that worker. This seems like a badly designed system.
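A rough sanity check on that headcount (a minimal sketch, assuming the 858TB figure is simply the sum of the per-worker 30GB quotas):

```python
# Back-of-envelope check: how many 30GB quotas fit in 858TB?
# Assumption: 858TB is just the sum of per-user allocations, with no overhead.
total_tb = 858
quota_gb = 30

users = total_tb * 1000 / quota_gb   # decimal TB -> GB
print(f"~{users:,.0f} workers at {quota_gb}GB each")  # ~28,600 workers
```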
Re: (Score:2)
It COULD be a case where some departments (marketing, for example) needed a lot of space, and it was deemed simpler to just allocate the same space to everyone whether they'd use it or not.
Also possible there was a habit or common workflow of copying active case files etc. to your G-Drive, which means ideally those case files still exist in at least an older version elsewhere.
Re: (Score:3)
30GB of "cloud storage" (which is what G-Drive is) is likely the random scratch s
Re: (Score:2)
Re: (Score:2)
Furthermore, why do 28 thousand government workers each need 30GB of disk space?
They need them to store the 100-page AI-generated weekly reports they need to submit to justify not being replaced by AI.
Re: (Score:3)
I think something must have been lost in translation.
More likely that this was "scratch" space, shouldn't have been used for critical stuff, and was never intended to be backed up.
But "the Ministry of Personnel Management" objected to the fees that they were charged to store data on on of the other 95 systems that were also destroyed in the fire and were backed up and told employees to use the G-Drive which, quite possibly, every government employee got automatically so it was "free" to the department.
Re:Couldn't have a backup (Score:5, Informative)
Re: (Score:2)
1U. 30x60TB SSDs (1350TB raw). Some storage vendors already ship them.
Project Mayhem (Score:2)
Banks next, space monkeys of Fight Club.
storage in the cloud (Score:5, Insightful)
now, it's a cloud of smoke :) (Score:5, Funny)
now, it's a cloud of smoke :)
Re: (Score:3)
No backup. (Score:4, Informative)
They could however have mirrored the data in another location.
Talk about putting all your eggs in one basket!
Heads will roll....
Re: (Score:1)
'A good year' (Score:3)
In BBC TV's 'Yes Minister' the civil servant refers to a year in which flooding destroyed a lot of files as a good year, as it allowed them to get rid of embarrassing material that reflected badly on civil servants.
Re: (Score:3)
They could however have mirrored the data in another location. Talk about putting all your eggs in one basket!
Heads will roll....
My favorite is the snowjob. "The G-Drive couldn't have a backup system due to its large capacity."
Couldn't. A quick Amazon lookup shows me a 20TB IronWolf Pro is $455 CDN. 50 of those gets you 1PB, at under $25k CDN.
Now, sure, you need some infrastructure to connect 50 drives. And you need some infrastructure and bandwidth to handle syncing between the two sites. But... that's a cloud provider's task. Bottom line is that for the price of less than a single Hyundai EV, you could own the hardware
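For what it's worth, a minimal sketch of that back-of-envelope cost math (drive price taken from the comment above; the redundancy overhead is an assumption):

```python
# Rough cost of ~1PB of raw disk at $455 CAD per 20TB drive
drive_tb = 20
drive_price_cad = 455
target_tb = 1000

drives_needed = target_tb // drive_tb                               # 50 drives for 1PB raw
print(drives_needed, f"${drives_needed * drive_price_cad:,} CAD")   # 50, $22,750 CAD

# Add ~25% extra drives for parity and hot spares (an assumed overhead) and
# the disks alone are still well under $30k CAD.
with_spares = int(drives_needed * 1.25)
print(with_spares, f"${with_spares * drive_price_cad:,} CAD")       # 62, $28,210 CAD
```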
Re: No backup. (Score:3)
Re: No backup. (Score:4, Insightful)
Re: (Score:2)
The government official never said it could not be done due to cost. Everyone here jumped to that misguided conclusion right away. He said it could not be backed up due to size. My interpretation is their current backup solution could not handle the size and they would have to design a new one. My company can easily afford to build a new petabyte server. However, installing one is not as easy as ordering a massive amount of HDDs and doing it over a weekend. There are procedures to follow when it comes to that kind of infrastructure change. Being a government agency, there were probably additional constraints on solving that problem.
I agree the official didn't say it couldn't have a backup due to cost. And I demonstrated that size is not a prohibitive factor. That quantity of data can be backed up, and can be backed up easily. Could have had a backup. Not could not have had a backup.
For the official to claim could not have when it could have is misleading. The why of it not having a backup doesn't get asked when the baseline is "couldn't have".
Once we arrive back at could have had a backup system it's just about the reasons/ex
Re: (Score:2)
For the official to claim could not have when it could have is misleading. The why of it not having a backup doesn't get asked when the baseline is "couldn't have".
In almost every case when a problem arises, every outsider's answer to the problem is "it could have been avoided." You and others do not know exactly why the problem was not avoided but do not probe into details. The problem was not technical; a petabyte server can be built easily these days. The problem was that the existing government infrastructure did not have a petabyte server in place. From my time working with government organizations, it takes lifetimes to change things. Someone could have realized they needed a
Re: (Score:2)
For the official to claim could not have when it could have is misleading. The why of it not having a backup doesn't get asked when the baseline is "couldn't have".
In almost every case when a problem arises, every outsider's answer to the problem is "it could have been avoided." You and others do not know exactly why the problem was not avoided but do not probe into details.
Thanks for replying, but in two words, "don't care". Not about your comment, but about the direction it goes. I object to misleading verbiage. What the official said was demonstrably, provably, clearly false. That is bad and a problem and that shouldn't be allowed to go unchallenged.
The why of there not being a backup is beside the point. The speed at which government moves is beside the point. That you got crap Internet is beside the point. They may all be interesting points, but they are not rele
Re: (Score:2)
Thanks for replying, but in two words, "don't care".
So you asked for an explanation and then basically don't care when one was provided. In other words, you were never really interested in the reason at all. You just wanted to complain to complain.
Not about your comment, but about the direction it goes. I object to misleading verbiage. What the official said was demonstrably, provably, clearly false.
No, it was not false. The actual verbiage: “The G-Drive couldn’t have a backup system due to its large capacity . . .” You don't work in South Korea, and you don't know of a system the South Korean government had up and running that could handle 858TB of data at the time. No, you do not. You are still stu
Re: (Score:2)
The government official never said it could not be done due to cost. Everyone here jumped to that misguided conclusion right away. He said it could not be backed up due to size. My interpretation is their current backup solution could not handle the size and they would have to design a new one.
Data of that volume typically don't appear overnight. And it seems unlikely that they just now realized "OMG! That data is critical!"
My point is that the need for backups probably didn't just spring up unexpectedly or grow too quickly for them to keep up. It seems likely that they had years to initiate and implement gradually without breaking any budgets, but chose not to do so.
Re: (Score:2)
but chose not to do so.
That depends on your definition of "chose". If you have ever worked with government organizations, it takes a lot of work to make changes. There might have been a solution that was planned but it took too long to get through all the stages of planning, approvals, budgeting, bidding, etc.
Re: (Score:2)
My favorite is the snowjob. "The G-Drive couldn't have a backup system due to its large capacity."
Agreed. Jeff Geerling made a 1.2 PB NAS [youtube.com] using a rack of drives and a Raspberry Pi. (Wholly inadequate for this South Korea job, but it does show that 1-PB storage isn't that hard or expensive.)
Also: Tony Stark was able to build this in a cave! With a box of scraps!
Re: (Score:2)
You're assuming the data is important. It obviously wasn't meant to be treated as important, but users were using it as such.
"was used by government staff to keep documents and other files. Each worker was allocated 30GB of space"
Similar things would probably happen if workers are fired and the G Drive storage is purged as they exit. This is really an example of poor file system hygiene and group ownership.
The new closet. (Score:2)
They could however have mirrored the data in another location. Talk about putting all your eggs in one basket!
Heads will roll....
Heads will roll? Guess that depends on how many skeletons just burned up in that fire. Data centers are the new closet.
858TB? Might make someone wonder how big the Epstein video surveillance archive is. Or, was.
Re: (Score:2)
They could however have mirrored the data in another location. Talk about putting all your eggs in one basket!
Heads will roll....
Heads will roll? Guess that depends on how many skeletons just burned up in that fire. Data centers are the new closet.
858TB? Might make someone wonder how big the Epstein video surveillance archive is. Or, was.
4 people have already been arrested for professional negligence: https://www.datacenterdynamics... [datacenterdynamics.com]
Important things should have remote backup... (Score:5, Interesting)
I was working 10+ years ago for one of the EU governments, and the rule was that there should be a backup located at least 30km away from the primary location.
Re: Important things should have remote backup... (Score:2)
That's it? Our corporate policy at my last job was to keep it on separate American coasts, 3000 miles apart, and separated by the Rockies and the Mississippi.
30km would have us worrying every time there was a natural disaster. Wildfires can easily cover 30km, earthquakes travel hundreds of miles, and regional electricity failures can cause widespread blackouts, putting your data centers out of commission.
Europe doesn't have disasters like that (Score:1)
Or so they believe, based on centuries of experience. The USA however is a nasty, dangerous place. ;)
Re: (Score:2)
Make a bet? The floods last year in Europe come to mind.
In the first six months of 2025, 208,000 hectares of forest have already been destroyed by wildfires, and it will only get worse over time.
Re: (Score:2)
That's it? Our corporate policy at my last job was to keep it on separate American coasts, 3000 miles apart, and separated by the Rockies and the Mississippi.
Maybe Lavandera was working for Malta, where 30 km is all they can do using their two islands.
Re: (Score:2)
Backups (Score:1)
"The G-Drive couldn't have a backup system due to its large capacity," an unnamed official told The Chosun.
I bet the backup costs sound cheap now, compared to having operations stalled for days or weeks.
If our servers burned in a fire without a backup, the company would have to dissolve immediately. It's just unacceptable. So we have snapshots, multi-region cloud backups, and 3x rotating offline backups.
Too big to be backed up (Score:2)
Lizzo's ass was unavailable for comment.
$1013/month to back up one petabyte on Amazon Deep (Score:3)
Re: (Score:2)
Re: (Score:3)
I was using Glacier to store circa 70TB for $900+/month. Heaven forbid you should want to do a read; they really get you for that.
I have come across a couple of different cases of outsourced IT which will "provide a backup service", only to discover that restoring from the backup costs extra, simply because the definition of "backup service" didn't include "backup and restore on demand service".
Re: (Score:2)
I was using Glacier to store circa 70TB for $900+/month. Heaven forbid you should want to do a read; they really get you for that.
As a cold-storage backup presumably you need to do retrievals rarely if ever (until your data center burns down). At the ~$0.07/GB I see on the AWS website for transferring data out over the internet, it would have "only" cost them ~$70k to restore. That has to be cheap compared to the cost of whatever was lost.
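As a rough cross-check of that restore estimate (a sketch using the ~$0.07/GB egress figure quoted above; actual Glacier retrieval tiers and fees vary and come on top of egress):

```python
# Ballpark egress cost to pull the whole G-Drive back out of cloud storage
data_tb = 858
egress_per_gb = 0.07   # assumed flat internet-egress rate, per the comment above

cost = data_tb * 1000 * egress_per_gb
print(f"~${cost:,.0f} to restore {data_tb}TB")   # ~$60,000, same ballpark as the ~$70k figure
```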
Too big to fail? (Score:4, Insightful)
"The G-Drive couldn't have a backup system due to its large capacity,"
What kind of horse shit is that?
Re: (Score:3)
It's interesting that a system meant to help with (Score:3)
Good news... and bad news (Score:1)
The good news is we can finally get this Oracle shit out of our system !
The bad news is we first have to destroy it all and start again.
43 drives (Score:3)
858TB in terms of 20TB drives is only 43 drives. One can put 90 drives into a single 4U server. It would weigh 200 lbs, but being a single 4U unit is somewhat portable and can be stored off-site.
We are past the days when 1PB is "too much".
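A quick check of that drive count (a minimal sketch; the 90-bay 4U chassis is the example from the comment above, and no redundancy is included):

```python
import math

# How many 20TB drives does 858TB take, and how much of a 90-bay chassis is left over?
data_tb = 858
drive_tb = 20
bays = 90

drives = math.ceil(data_tb / drive_tb)
print(drives)                                           # 43 drives, no parity or spares
print(f"{bays - drives} bays free for parity/spares")   # 47 bays free
```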
Re: (Score:2)
We're replicating about 1.6PB between two sites using "file storage appliances from a Round Rock, TX based company", and the remote side is 12U for a little less than 3PB capacity. Granted, that product is expensive, however it's pretty bulletproof once it's rolled out, and even at our size we're considered "tiny" compared to some installations that use it. Not sure why a major government couldn't get their [ stuff ] together and use LITERALLY ANYTHING to create some kind of off-site backup for that compa
Sing it! (Score:1)
Re: (Score:3)
Wrong Korea.
Re: (Score:2)
Re: (Score:2)
Your first clue is that it isn't a floppy disk fire. Though the critical server probably has a Zip disk or two.
Re: (Score:2)
A case for distributed backup (Score:2)
Too big to back up? (Score:4, Insightful)
The G-Drive couldn't have a backup system due to its large capacity
Seriously? A dataset that is close to a petabyte is too big to back up?
I've been doing business continuity planning for a couple of decades along with many other hats. Nothing is ever too large to backup.
However, the other side of the coin is to make sure backups finish and can be recovered. I've been in situations where I wasn't involved in the company's backups, but saw first-hand what happens when backups take too long to run and eventually fail, day in and day out, while everyone was assured that the backups were good. Some old data was purged, and sure enough, not a month went by before someone needed some of that old data that had been purged but was supposedly backed up, only to discover that the backups had been failing for a number of reasons, including that they would never finish, and no one had bothered to look into why or verify backup integrity.
I keep redundant, and sometimes doubly redundant, backups of most everything. I try to have backups in different geographical locations.
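A minimal sketch of the "verify your backups" point: compare checksums between source and backup instead of trusting that the job reported success (the paths and layout here are hypothetical):

```python
import hashlib
from pathlib import Path

def sha256(path: Path, chunk: int = 1 << 20) -> str:
    """Hash a file in 1MB chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify(source_dir: Path, backup_dir: Path) -> list[Path]:
    """Return source files whose backup copy is missing or differs."""
    bad = []
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        dst = backup_dir / src.relative_to(source_dir)
        if not dst.exists() or sha256(src) != sha256(dst):
            bad.append(src)
    return bad

# Hypothetical example:
# print(verify(Path("/data/g-drive"), Path("/mnt/offsite/g-drive")))
```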
It is not like South Korea (Score:2)
It is not like South Korea is a major manufacturer of storage devices, right?
Samsung could probably find 1PB of flash memory lost between their office couch cushions...
And the justification that 1PB of data was too big to back up is completely absurd. These days that amount of data is not even that much: there are people with more than that on their home servers, let alone a data center. For crying out loud, there are systems with more RAM than this in a single pod.
They didn't ask IBM (Score:3)
"The G-Drive couldn't have a backup system due to its large capacity"
IBM TS4500 stores up to 2.63 exabytes per library, compressed. 877.5 petabytes native. That's enough to store 3066 copies of the G-Drive with compression, without deduplication.
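A quick check of those numbers (capacities as quoted above; the 3:1 compression ratio is implied by the compressed vs. native figures):

```python
# How many copies of the 858TB G-Drive fit in one TS4500 library at the quoted capacities?
compressed_eb = 2.63
native_pb = 877.5
g_drive_tb = 858

print(f"{compressed_eb * 1_000_000 / g_drive_tb:,.0f} copies compressed")  # ~3,065
print(f"{native_pb * 1_000 / g_drive_tb:,.0f} copies native")              # ~1,023
```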
Data sovereignty (Score:1)
But, at least they didn't let the bad Americans hold their data. Couldn't have THAT happen, could you? More important to assert local control than it is to not lose the data...
Given their behavior and reaction, this doesn't seem to be state secrets or classified information. This is just basic private shit. Use a professional cloud provider. Don't roll your own.
Re: (Score:2)
But, at least they didn't let the bad Americans hold their data. Couldn't have THAT happen, could you? More important to assert local control than it is to not lose the data...
It's irrelevant anyway if you upload encrypted backups to cloud storage. That's one of the few cases where putting your data on someone else's server actually makes sense.
A marvellous [winter] day. (Score:2)
[last lines]
James Hacker: How am I going to explain the missing documents to "The Mail"?
Sir Humphrey Appleby: Well, this is what we normally do in circumstances like these.
James Hacker: [reads memo] This file contains the complete set of papers, except for a number of secret documents, a few others which are part of still active files, some correspondence lost in the floods of 1967...
James Hacker: Was 1967 a particularly bad winter?
Sir Humphrey Appleby: No, a marvellous winter. We lost no end of embarrassing files.
https://www.imdb.com/title/tt0... [imdb.com]
Ridiculous (Score:3)
"The G-Drive couldn't have a backup system due to its large capacity,"
**1. Tape libraries (LTO-9/10):**
Still the king for bulk, cold, or archival storage. An LTO-9 tape holds 18 TB native (45 TB compressed). A mid-range autoloader with 60–80 slots covers your 900 TB easily, costs under €100 k, and draws almost no power. Ideal for long-term off-site backups if latency isn’t critical.
**2. Object storage clusters:**
Systems like MinIO, Ceph, or AWS S3 Glacier equivalents can handle petabytes with redundancy (e.g., 3× replication = 2.7 PB raw). Hardware could be 12–24 bay servers with 22 TB disks on 100 Gb Ethernet. You can build on-prem with commodity nodes or rent from cloud providers (AWS, Wasabi, Backblaze B2).
**3. Enterprise backup appliances:**
Dell EMC Data Domain, HPE StoreOnce, Quantum DXi, or Veeam-driven scale-out repositories use deduplication to shrink footprint; 900 TB effective might need only ~300 TB physical if data repeats heavily.
A practical hybrid is active data mirrored to disk/object storage and cold copies vaulted to tape. The critical constraints are bandwidth (moving even 10 TB/hour needs roughly 22 Gbps of sustained throughput) and verification time: at that scale, backup *integrity* becomes the bottleneck, not raw capacity.
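For concreteness, a rough sketch of that bandwidth constraint (the 10 Gbps sustained link is an assumption for illustration, not a figure from the article):

```python
# How long does a full copy of the G-Drive take over a sustained 10 Gbps link?
data_tb = 858
link_gbps = 10   # assumed sustained throughput

seconds = (data_tb * 1e12 * 8) / (link_gbps * 1e9)
print(f"~{seconds / 3600:.0f} hours (~{seconds / 86400:.1f} days)")  # ~191 hours, ~7.9 days
# ...and that is before any verification pass or incremental churn.
```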
Re: (Score:2)
Too big to back up (Score:2)
Best excuse of the year (Score:2)
"The G-Drive couldn't have a backup system due to its large capacity"
Best excuse of the year.
That won't hold up in court, for sure.