Google Drive Has a Hidden File Limit (9to5google.com) 40

Google Drive is enforcing a new limit on the total number of files you can store on an account. 9to5Google reports: Some Google Drive users have recently noticed a message on their accounts saying that the account has reached its "creation limit" and won't accept any new files until existing ones are deleted. The issue was first highlighted by Ars Technica, and appears to be enforced both for free accounts and for those subscribed to Google Workspace and Google One.

The issue was flagged by users on Reddit as well as on Google's Issue Tracker, and appears to have been put in place around mid-February. The limit puts a hard ceiling of five million items on the total number of files stored in Google Drive. It ignores file size and type; it is a simple count of the number of files in your online storage bucket, including items stored in the trash (which is automatically emptied every 30 days). When that limit is reached (or if the account has already exceeded it), Google Drive shows the following message: "This account has exceeded the creation limit of 5 million items. To create more items, move items to the trash and delete them forever."

One user reports having seven million items in their account prior to the limit being enforced, with their account no longer able to add any new files. Effectively, that user and anyone else in the same situation are locked out of their accounts, with the files stored now in a "read-only" mode. Google appears to have confirmed the limit to some users via support, but has yet to speak out publicly about it.

This discussion has been archived. No new comments can be posted.

  • Google drive... (Score:5, Insightful)

    by Anonymous Coward on Friday March 31, 2023 @05:48PM (#63415458)

If you're into the millions of files, you might as well just use a Google Cloud Storage bucket (their cloud storage offering).

    • Given how many of us bought a Pixel 4 or Pixel 5 with unlimited (standard-quality) photo uploads, I'm not surprised people are hitting the 5 million limit. The offer wasn't limited to pictures taken by the phone; it could be any picture, as long as the phone was the one uploading it.

      I've used it to make backups of all my photos and shared private albums with my wife. (The original, higher-quality ones are on my OneDrive though.) This is great when we want to look through pictures on a phone or tablet and
  • of those are abusing the soft storage cap

  • by AutoTrix ( 8918325 ) on Friday March 31, 2023 @06:05PM (#63415496)
    The file limit, at least for GSuite Business accounts, is visible in the Administration dashboard and has always been there.
  • by GFS666 ( 6452674 ) on Friday March 31, 2023 @06:06PM (#63415498)
    ...I think in this case 5 million files/items is kinda acceptable. I don't think the normal user will ever exceed that number of files in several lifetimes.
    • Yeah, at this point anyone willingly using a Google service should just be thankful the company hasn't EOLed it yet.

      Don't get me wrong - setting aside my significant reservations regarding the company itself, I strongly prefer how Google's online tools work in comparison to Microsoft's. But depending on them just feels like you're living on borrowed time...

      • I remember when I loved google’s services and used them to manage my whole life. Then I got my first smartphone and I couldn’t imagine how it would ever get better.

        and then Google+ flopped and they were never good at anything ever again.

    • So what you are saying is that 5 million files ought to be enough for anybody. I hope history treats you kindly.
      • It's expensive to store lots of small files: at minimum there is processing overhead for whatever mechanism packs them, and if they aren't packed, there is disk overhead instead. If you have lots of activity across lots of small files, that's even more expensive, especially if they are packed somehow. It's obviously in Google's interest to dissuade people from storing data that way. Systems or applications that currently do that could be more efficient in several ways if they moved to another kind of store.

        It's a sha

    • I don't think the normal user

      It's not a user. It's an account. There can be multiple users.

  • Did someone come up with a way to store unlimited data in filenames/metadata for 0 sized files?
  • to get around this limit. You can use a script on macOS, Windows, Linux, or BSD to create archives.
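
    A minimal sketch of that kind of script, in Python rather than shell (the paths are whatever you choose; nothing here is Drive-specific): walk a directory of many small files and pack them into a single zip archive, collapsing thousands of "items" into one file to upload.

    ```python
    import os
    import zipfile

    def archive_tree(src_dir: str, zip_path: str) -> int:
        """Pack every file under src_dir into one zip archive.

        Returns the number of files packed, i.e. how many Drive
        "items" collapse into a single uploaded file.
        """
        count = 0
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for root, _dirs, files in os.walk(src_dir):
                for name in files:
                    full = os.path.join(root, name)
                    # Store paths relative to src_dir so the archive unpacks cleanly.
                    zf.write(full, os.path.relpath(full, src_dir))
                    count += 1
        return count
    ```

    The same idea works with tar or 7-Zip; the point is only that one archive counts as one item.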
  • Approaching a limit (Score:5, Interesting)

    by williamyf ( 227051 ) on Friday March 31, 2023 @06:16PM (#63415528)

    Probably, they were approaching some kind of limit in the underlying infrastructure (most likely a distributed-filesystem limit), and were presented with a choice...

    Either do a costly and dangerous migration to new infrastructure (probably a new distributed filesystem that can handle the total number of files they are managing) or restrict the maximum number of files per Google Drive...

    Imagine a couple (or two couples) of months ago, when there was no more time to delay that decision, in the middle of cost-cutting measures and layoffs...

    You would probably decide to implement the limit and save your behind, customers be damned, and deal with the fallout of not informing customers sooner, rather than recommend the costly migration, be overruled on that decision by the higher-ups, get fired in the next round of layoffs, and then hear through the grapevine that your replacement implemented the hard limit anyway...

    • Costly and *dangerous*?

      Costly I understand. No one works for free.

      Dangerous? If you don't know what you're doing and think rsync behind a screen session is valid enterprise IT I suppose I can understand how you'd think a planned upgrade could be dangerous.

    • I'm also getting similar vibes here. But the thing is, if Google cannot figure out a way to fix such a technical problem, that tells us a lot about where the company currently is. They might have some of the best engineers in the world, but the management is a cancer, like everywhere. Maybe Google Drive is already considered legacy tech and they just don't care to put any effort into it anymore. Maybe they ordered everyone onto the AI bandwagon.

      Whatever the reason, if this is technical limitation, and they ar

      • Peak Google was after they started buying up other non-search technologies left and right, once they owned search.

        Other than search, is there anything of note developed in house they haven't killed?

  • Comment removed based on user account deletion
  • by 140Mandak262Jamuna ( 970587 ) on Friday March 31, 2023 @06:21PM (#63415544) Journal
    For a 100 GB account this means an average of 20 KB per file. Kind of OK, I guess. It is not a full file system; such limits are inevitable. Even coding projects that create numerous tiny header files would not easily reach 5 million. Maybe with some brain-dead source control system put together locally by a long-dead Unix wizard out of awk, grep, diff, and bash. The original guru left ages ago, and the new team had no idea how it worked, but they kept using it....

    Anyway, one can always download folders, zip them, and re-upload them. Maybe some scripts to archive long-unused files. Many ways to work around it. It is not even a slow news day; some serious stuff is going on in the Manhattan DA's office.
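
    The "scripts to archive long-unused files" idea can be sketched like this (purely an illustration, assuming the files live in a locally synced folder and using modification time as the staleness signal):

    ```python
    import os
    import tarfile
    import time

    def archive_stale(src_dir: str, tar_path: str, max_age_days: float) -> list:
        """Move files untouched for max_age_days into a compressed tarball.

        Returns the list of archived paths; the originals are removed so
        the file count actually drops.
        """
        cutoff = time.time() - max_age_days * 86400
        archived = []
        with tarfile.open(tar_path, "w:gz") as tf:
            for root, _dirs, files in os.walk(src_dir):
                for name in files:
                    full = os.path.join(root, name)
                    if os.path.getmtime(full) < cutoff:
                        tf.add(full, arcname=os.path.relpath(full, src_dir))
                        archived.append(full)
        for path in archived:
            os.remove(path)
        return archived
    ```

    Run periodically, each pass replaces an arbitrary number of stale items with a single archive.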

    • For a 100 GB account

      Why would a Google Enterprise account only be using 100GB? The issue here isn't 140Mandak262Jamuna's home personal free Google Drive.

      Even coding projects that create numerous tiny header files would not easily reach 5 million.

      Just for fun I checked my work's OneDrive sync folder. I have just shy of 800,000 files. So if there were 10 of me working in a small business, we'd exceed this limit. Lots of software generates lots of working files for a variety of reasons.
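
      For anyone curious how close their own sync folder is, here is a quick way to count files the same way the limit apparently does, as a raw item count (a simple sketch; point it at any local path):

      ```python
      import os

      def count_files(root: str) -> int:
          """Count regular files under root, the way Drive counts 'items'."""
          return sum(len(files) for _root, _dirs, files in os.walk(root))
      ```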

      • I don't deny such use cases exist. But you can't expect such users to be supported for $20 a year on fully backed-up cloud storage.

        BTW, I can't recall where I saw the tale of woe by a new hire at a startup who discovered the source control was a cron job making a level-0 backup to some free cloud storage every day.

        • It is not necessarily a stretch to imagine that the number of files creates expenses for Google that can exceed the blanket fee charged for the disk space. We are aware of things like inodes, although in this case any metadata for the files is going to live in a database.

          But where your argument of "you can't expect such users to be supported for $20" falls flat is the fact that there is no way to pay more to get more. There is no $200 storage plan to store 50M files. There is not even a $200 storag

          • Google got into making a lot of new products all the time, earlier than other big tech companies (setting aside for now the small percentage of them that persist), because it was first to have this big, broad, cloudy architecture. But what products they can create is clearly limited by the characteristics of that architecture, which was designed foremost for their search and corresponding advertising products. No doubt if there are limitations which are actually affecting them (like this apparent one) the

        • Woeful engineer didn't understand startups.

          Shit like that happens at every startup. They focus on the big stuff that might make money and half-ass everything else, at best, until they are doing well enough that they can afford to hire the woeful engineer to fix their half-assed shit.

          If they wasted, and yes I literally do mean wasted, precious resources on an enterprise-quality backup system as a startup, and did everything else like that, they'd never make it.

  • I suppose you could otherwise cheat and make a 1024-byte file name on a 1-byte file. A few bytes for the index, as well as many directories, so you could get a lot of data space for cheap?
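
    That cheat can actually be made concrete (purely an illustration of the trick, not a recommendation; the chunk size and naming scheme here are arbitrary choices): payload bytes go into the names of zero-byte files and come back out by sorting the names.

    ```python
    import base64
    import os

    # base32 of 150 bytes is 240 characters, which plus a 7-character
    # index prefix stays under the common 255-character filename limit.
    CHUNK = 150

    def encode_to_filenames(data: bytes, out_dir: str) -> int:
        """Store data as zero-byte files whose names carry the payload."""
        os.makedirs(out_dir, exist_ok=True)
        n = 0
        for i in range(0, len(data), CHUNK):
            name = "%06d_%s" % (i // CHUNK,
                                base64.b32encode(data[i:i + CHUNK]).decode())
            open(os.path.join(out_dir, name), "w").close()  # 0-byte file
            n += 1
        return n

    def decode_from_filenames(out_dir: str) -> bytes:
        """Reassemble the payload from the sorted filenames."""
        parts = []
        for name in sorted(os.listdir(out_dir)):
            parts.append(base64.b32decode(name.split("_", 1)[1]))
        return b"".join(parts)
    ```

    Of course, each zero-byte file still counts as one of the 5 million items, which is presumably exactly why a count-based limit shuts this down.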

  • ...for them to prune their porn collection.

  • Guys, the -i argument isn't the number of inodes; it's a divisor. Just look at mkfs.ext4's output and it'll tell you how many inodes you're getting. If it's too many, redo it with the -i arg higher, and if it's not enough, make the -i arg lower. Yes, it feels like a backwards UI, but how often are you having to do it?
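
    Put another way, -i is bytes-per-inode, so the inode count falls out as the filesystem size divided by the -i value (a back-of-the-envelope sketch; real mkfs.ext4 rounds per block group, so treat this as an approximation):

    ```python
    def approx_inode_count(fs_bytes: int, bytes_per_inode: int) -> int:
        """Approximate mkfs.ext4's inode count: size divided by the -i divisor.

        A *larger* -i value yields *fewer* inodes, which is why the flag
        feels backwards if you expect it to be the inode count itself.
        """
        return fs_bytes // bytes_per_inode
    ```

    E.g. a 100 GiB filesystem with the default of one inode per 16384 bytes gets roughly 6.5 million inodes; doubling -i halves that.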

  • This really cannot be stressed enough. Anything Google offers to anybody (except ad customers) only serves to sell ads, and may be restricted, reduced, or removed at any time.
