Hardware Apple

Apps Reportedly Limited To Maximum of 5GB RAM In iPadOS, Even With 16GB M1 iPad Pro (macrumors.com) 159

Despite Apple offering the M1 iPad Pro in configurations with 8GB and 16GB of RAM, developers are now indicating that apps are limited to just 5GB of RAM usage, regardless of the configuration the app is running on. MacRumors reports: The M1 iPad Pro comes in two memory configurations; the 128GB, 256GB, and 512GB models feature 8GB of RAM, while the 1TB and 2TB variants offer 16GB of memory, the highest ever in an iPad. Even with the unprecedented amount of RAM on the iPad, developers are reportedly severely limited in the amount they can actually use. According to a post by the developer behind the graphic design app Artstudio Pro on the Procreate Forum, apps can only use 5GB of RAM on the new M1 iPad Pros; attempting to use any more will cause the app to crash: "There is a big problem with the M1 iPad Pro. After running a stress test and other tests on the new M1 iPad Pro with 16GB of RAM, it turned out that an app can use ONLY 5GB of RAM! If we allocate more, the app crashes. That is only 0.5GB more than in old iPads with 6GB of RAM! I suppose it isn't better on the iPad with 8GB." Following the release of its M1-optimized app, Procreate also noted on Twitter that with either 8GB or 16GB of available RAM, the app is limited in the amount of RAM it can use.
  • by Entrope ( 68843 ) on Friday May 28, 2021 @08:06PM (#61432936) Homepage

    The obvious rationale for this is that Apple would like an app's performance to be consistent across all of its (iPad) devices. Reasonable people might disagree on whether that is worth the trade-off of reduced performance on higher-RAM tablets. However, Apple making that kind of decision on behalf of developers and users is exactly par for the course.

    • by Anonymous Coward

      There is no "problem" here.

      (a) You can run more than one program at a time.

      (b) Each individual program is limited to 5GB.

      (3) If you write programs for a toy computer and you can't do it in less than 5GB, then *YOU* are the problem.

      (IV) The only people complaining about this are shitty, incompetent developers.

      • Kinda ridiculous (Score:5, Insightful)

        by raymorris ( 2726007 ) on Friday May 28, 2021 @11:15PM (#61433196) Journal

        This week at work I went to install a program that checks an input string against a list of a million "disallowed" strings.
        I had glanced at the code and it looked like the programmer was reasonably competent.

        When I ran it, I saw it used about 300MB of RAM and took over 80 milliseconds for each lookup.

        I spent two hours writing my own version that uses 12 MB of RAM (for the .Net library) and takes 0.6 milliseconds.

        So it's 100X faster and uses 20X less memory, simply by having a clue how to program. As in, actually knowing WTF I'm doing.

        I could have made it faster by using a binary index file for the textual list, but I decided I liked to have all the data easily human-readable in any text editor.

        If someone is writing software for an iPad and they're using more than 5 GB of RAM, more than 5 BILLION bytes, they just might be fucking clueless about programming.
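
        For illustration, here is a minimal Swift sketch of the kind of set-based lookup described above (the file name, list contents, and test string are hypothetical; the original was written against the .NET library):

        import Foundation

        // Load the disallowed strings once; a Set gives O(1) average lookups,
        // so each check is a hash probe rather than a scan of a million entries.
        let listURL = URL(fileURLWithPath: "disallowed.txt")   // hypothetical path
        let contents = (try? String(contentsOf: listURL, encoding: .utf8)) ?? ""
        let disallowed = Set(contents.split(separator: "\n").map { String($0) })

        func isAllowed(_ candidate: String) -> Bool {
            return !disallowed.contains(candidate)
        }

        print(isAllowed("some input string"))   // false if the string is on the list

        Memory stays proportional to the list itself, and per-lookup cost stays flat no matter how large the list grows.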

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          If someone is writing software for an iPad and they're using more than 5 GB of RAM, more than 5 BILLION bytes, they just might be fucking clueless about programming.

          Or they might be writing an audio app that needs to load gigabytes' worth of audio samples. The lack of imagination I see on Slashdot is staggering, as is the hubris.

          • Yup, or 3D rendering: people regularly use more than twice that limit for semi-complex scenes.
            Now, I have written software to process very large audio files that did it by streaming chunks of data, but it's always a trade-off: use less memory, get less speed. What works for offline processing does not necessarily work for real-time processing, etc.
            Sure, it's a tablet, but it's an interesting design decision... if it *is* a decision, and not some bug.
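
            As a rough sketch of that streaming trade-off (Swift, using the newer FileHandle API; the file name is hypothetical), peak RAM stays at roughly one chunk regardless of how big the file is:

            import Foundation

            // Read a large audio file in fixed-size chunks instead of loading it whole.
            let audioURL = URL(fileURLWithPath: "session.wav")   // hypothetical file
            let chunkSize = 4 * 1024 * 1024                      // 4 MB per read
            var totalBytes = 0

            if let handle = try? FileHandle(forReadingFrom: audioURL) {
                defer { try? handle.close() }
                while let chunk = try? handle.read(upToCount: chunkSize), !chunk.isEmpty {
                    totalBytes += chunk.count   // stand-in for the real per-chunk processing
                }
            }
            print("streamed \(totalBytes) bytes without holding the file in memory")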

            • by jwhyche ( 6192 )

              I'm a regular user of Blender and, again, I can't see any situation where anyone is going to use a tablet to do 3D rendering. People who do 3D rendering are going to use desktops and laptops with gigabytes of RAM and powerful CPUs and GPUs. I just don't see a tablet having that kind of processing power. I could see you using a tablet to display your rendered work.

          • by sxpert ( 139117 ) on Saturday May 29, 2021 @03:37AM (#61433526)

            the SSD is fast enough to stream the fscking audio samples real time.
            it's high time for these people to learn how to code properly, instead of just porking out

            • the SSD is fast enough to stream the fscking audio samples real time.

              Sure, if that's the only thing it's doing. If not, then you're going to want to preload to make sure you're adding absolutely zero latency. If you're using the audio samples in some way that is unpredictable, and doing other I/O at the same time, you're going to need to cache.

            • the SSD is fast enough to stream the fscking audio samples real time.
              it's high time for these people to learn how to code properly, instead of just porking out

              Or we could use hardware that common people have available to them, rather than specifying bizarrely high minimum system requirements of an SSD for a basic app to be functional. I buy RAM because it's a fast form of memory, and I expect you developers to use it competently, not pussyfoot around with a slower solution. If you were a remotely competent programmer you'd look at the available memory and only revert to a slower way of doing something when the faster method is exhausted.

              You sound like my wife. Spen

          • by dfghjk ( 711126 )

            "The lack of imagination I see on Slashdot is staggering, as is the hubris."

            That's exactly what I thought when I read your post. It's as though you think that everything must be loaded into memory at once, and that doing such things was impossible prior to having more than 5GB available to applications.

            The OP is right, the problem here is clueless programmers, and you appear to be one of them.

          • by jwhyche ( 6192 )

            Or they might be writing an audio app that needs to load gigabytes' worth of audio samples. The lack of imagination I see on Slashdot is staggering, as is the hubris.

            I can't think of any situation where anyone is going to load gigabytes' worth of audio samples on a tablet into RAM. Maybe into storage, but not RAM. Anyone who is going to load that much audio into RAM is going to use a real computer with gigabytes of RAM.

        • Honestly, I don't have a clue, but this discovery comes from a company that makes image manipulation software. I always thought this kind of application could use quite a lot of RAM, since it has to load, display and manipulate multiple layers of high-resolution images.
          • by teg ( 97890 )

            Honestly, I don't have a clue, but this discovery comes from a company that makes image manipulation software. I always thought this kind of application could use quite a lot of RAM, since it has to load, display and manipulate multiple layers of high-resolution images.

            A picture from Sony's flagship A1 camera is 50 megapixels. Even without compression, that should be just 200 MB (4 bytes per pixel). 5 GB should be plenty. Video, on the other hand... there's hardly a limit to how much you could use if you don't use streaming from storage.

            • by jaa101 ( 627731 )

              A picture from Sony's flagship A1 camera is 50 megapixels. Even without compression, that should be just 200 MB (4 bytes per pixel).

              Camera megapixels are different from display megapixels. There's only a single colour channel per pixel, requiring 14 bits in this case. Call it half of your estimate.

            • A picture from Sony's flagship A1 camera is 50 megapixels. Even without compression, that should be just 200 MB (4 bytes per pixel). 5 GB should be plenty.

              Per layer. And then there's the memory consumed by whatever image manipulation operations you're executing. And hey, if you don't want to experience degradation, you might want to actually increase the resolution before certain types of manipulation...

              Video, on the other hand... there's hardly a limit to how much you could use if you don't use streaming from storage.

              Everything is faster with more memory for caching.
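
              To put the numbers in this sub-thread together, a back-of-envelope sketch (assuming 4 bytes per pixel and ignoring tiling, compression, and scratch buffers):

              // Rough memory budget for layered editing of a 50-megapixel image.
              let pixels = 50_000_000
              let bytesPerPixel = 4                        // RGBA, 8 bits per channel
              let layerBytes = pixels * bytesPerPixel      // 200 MB per layer
              let cap = 5 * 1024 * 1024 * 1024             // the reported 5 GB limit
              print(cap / layerBytes)                      // about 26 uncompressed layers

              So a couple of dozen full-resolution layers already fill the budget, before undo history or working buffers are counted.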

          • "makes shitty image manipulation software"

            There fixed that for you.

        • by swilver ( 617741 )

          Congratulations. In software however, not everything is a first year student problem.

        • For a problem like that, the obvious solution is a hashset. Nothing really novel.

        • can't you do this with one line of bash written in 30 seconds?

        • by pjrc ( 134994 )
          Kinda sounds like you're saying Artstudio Pro "just might be fucking clueless" and suggesting their powerful ProCreate layer-based image editing app [procreate.art] doesn't need 5GB RAM and could be "100X faster and uses 20X less memory, by having a clue how to program". But then you insisted that you were "actually knowing WTF I'm doing". I kinda wonder if that knowledge includes knowing anything about the specific memory-hungry app mentioned in this article?
          • Re: (Score:2, Interesting)

            by drinkypoo ( 153816 )

            Your average Apple enthusiast knows dick about shit when it comes to computing. That's why they love Apple so much despite all the artificial limitations it puts on your computing. Even Microsoft does less to get in your way, although they do plenty of other offensive crap.

            You can make Linux have the same look and smell as OSX down to the mousing and keyboard behavior, with literally all the same functionality, but with the added element of choice. The only benefit to OSX over that, which admittedly is an i

            • But people who spend a lot of energy defending Apple's lock-in and walled garden approaches to computing are ignorant at best,

              Wrong.

              Embedded Developer (and Windows Application Dev.) with over 4 decades of paid Dev. Experience here. I both understand the reasoning behind, and agree with, nearly all of Apple's decisions regarding their various "ecosystems" (although I hate that term as applied to computing). I gladly accept the Walled Garden on iOS/iPadOS (and I think that will be relaxed on iPadOS as it matures away from iOS), and know how to do as I please with macOS, and when it is appropriate and safe to use Apple's wel

            • by jeremyp ( 130771 )

              The average person knows dick about shit when it comes to computing. The average person just wants to run their applications. They don't care about having choices in the operating system. They don't care about being able to compile stuff themselves. They just want to run their apps.

              The first car that I owned had a manual choke (look it up if you are too young to know what one is). It gave me much more control over the running of my car than modern cars do, but you never see me, or anybody else, lamenting t

        • I am even kind of sorry you are already at 5, Insightful, and I can’t upvote you more
        • by n0ano ( 148272 )

          If someone is writing software for an iPad and they're using more than 5 GB of RAM, more than 5 BILLION bytes, they just might be f...ing clueless about programming.

          Sigh. Said with the same mindset that said way back when "640K ought to be enough for anybody." (No, Bill Gates didn't actually say that but he still gets credit for it).

        • If someone is writing software for an iPad and they're using more than 5 GB of RAM

          I'm glad you know how to write a string compare function. I expect a little more functionality than that from my devices, and your list of "a million" is laughably small compared to the datasets that programs such as a simple image editor often work with.

        • Photogrammetry (turning a series of hi-res images into a decent-sized point cloud or a 3D mesh) traditionally laughs at 5GB, and some of that code is written in real programming languages (FORTRAN) instead of scripting languages (C#, JavaScript, Python), although some of the projects are based on a Jenga tower of libraries. And yet Apple, Google and Microsoft are apparently bundling this sort of thing into newer versions of their libraries (yay). Machine learning is also a way of soaking up VRAM. Also I believ
    • The obvious rationale for this is that Apple would like an app's performance to be consistent across all of its (iPad) devices.

      The obvious rationale for this is that iDevices are meant to be toys, and if you do real work Apple wants real money out of you, so you should buy a Mac.

      Apple making that kind of decision on behalf of developers and users is exactly par for the course.

      Yes, and the course is full of alligators and sand traps which Apple put there in order to make sure you have only one path to the green.

      • The obvious rationale for this is that iDevices are meant to be toys, and if you do real work Apple wants real money out of you, so you should buy a Mac.

        Wrong.

        Apple thinks Smartphones are essentially sophisticated Appliances. I wholeheartedly agree.

        However, it is obvious to anyone with half a brain that one of the biggest reasons Apple split-off iPadOS was that they see that Class of products more as fulfilling a "general-purpose computing device" role, and as time goes on, we will start to see some decidedly "non-Toy" Applications and Dev. Tools (Swift Playgrounds notwithstanding) specifically for iPadOS.

        I have a feeling that WWDC 2021 will have some quite

        • Apple thinks Smartphones are essentially sophisticated Appliances. I wholeheartedly agree.

          Well, they aren't, unless you place artificial limitations on them. And the problem isn't even that they do that. The problem is that they don't let you turn them off. I get why people want the walled garden, the illusion of security is very appealing. But insisting that locking you into it is for your own benefit is just internalization of abuse.

          • Apple thinks Smartphones are essentially sophisticated Appliances. I wholeheartedly agree.

            Well, they aren't, unless you place artificial limitations on them. And the problem isn't even that they do that. The problem is that they don't let you turn them off. I get why people want the walled garden, the illusion of security is very appealing. But insisting that locking you into it is for your own benefit is just internalization of abuse.

            There are ample alternatives for the adventurous. Go forth and be Happy (or is that Hacked?).

            And the Security of iOS/iPadOS and the App Store is very real. Nothing is 100% perfect, but the relative malware percentages between iOS/iPadOS and Android speak (quite loudly and clearly) for themselves.

    • However, Apple making that kind of decision on behalf of developers and users is exactly par for the course.

      Apple providing hardware that is entirely useless, on the other hand, is not par for the course. Their normal MO is to not provide hardware that people actually need.

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Friday May 28, 2021 @08:07PM (#61432938)
    Comment removed based on user account deletion
    • Well, it seems to be the first time any iOS device is ahead of the competition in the amount of RAM. So yes, it is unprecedented, even though the iPad Pro is blurring the lines with the MacBook

      • Re:Unprecedented (Score:4, Informative)

        by Entrope ( 68843 ) on Friday May 28, 2021 @08:18PM (#61432954) Homepage

        Microsoft sold versions of the Surface Pro 4 [wikipedia.org] with 16 GB of RAM starting something like 5 years ago. The Surface Pro 7+ is now available with 32 GB of RAM.

        I suppose you could argue that it's not exactly the same market, but it is pretty close.

        • I was talking about Android competitors to iOS phones/tablet.
          Of course there are PCs with more RAM.

          • I was talking about Android competitors to iOS phones/tablet. Of course there are PCs with more RAM.

            There are already Android phones with 18GB of RAM though.

        • I suppose you could argue that it's not exactly the same market, but it is pretty close.

          It's the same market approached from a wildly different side. Microsoft trying as hard as possible to limit the possibilities for customers used to a highly advanced and versatile OS, Apple trying as hard as possible to advance the possibilities for customers used to a horrendously crippled toy, and they are meeting in the middle carrying a lot of baggage.

      • Re:Unprecedented (Score:4, Interesting)

        by Freischutz ( 4776131 ) on Friday May 28, 2021 @08:46PM (#61433004)

        Well, it seems to be the first time any iOS device is ahead of the competition in the amount of RAM. So yes, it is unprecedented, even though the iPad Pro is blurring the lines with the MacBook

        Sure, because the only thing that matters for computer performance is the amount of RAM. As for the iPad Pro, it's not going to replace the MacBooks any time soon for people doing any serious everyday work, and that's not because the iPad is lacking in raw CPU power but because the iPad UI just plain sucks for serious work and the whole concept of a tablet doesn't really work all that well for most types of day-to-day work. By the time you've turned your iPad Pro into something that is halfway usable by adding a keyboard, you've basically turned it into a MacBook Air with a touch screen and a shitty, limited UI. Now cue a big noisy clown posse of Apple haters cracking shitty jokes ...

        • "By the time you've turned your iPad Pro into something that is half way usable by adding a keyboard you've basically turned it into a MacBook Air with a touch screen and a shitty, limited UI. Now cue a big noisy clown posse of Apple haters cracking shitty jokes"

          Same exact thing with Android. Yeah, I had a hacked-together (highly portable) practice programming environment set up with a rooted Kindle Fire and a Bluetooth keyboard that was just barely suitable for what I was intending it for, but I then bought a "r

          • but they expect you to only use these machines to consume and throw real money after fake online baubles.

            Old man yells at cloud.

        • The UI of iPads is actually not shitty or limited (versus macOS/OS X).
          Or do you have something special in mind?

          • Comment removed based on user account deletion
            • The equivalent of right clicking has an inbuilt delay. The equivalent of click and drag has an inbuilt delay. Selecting text has an inbuilt delay and interacting with any small menu items is a chore.

              Never mind the fact that you have to lift your hand off the physical keyboard and onto the screen to select anything, and the screen then becomes grimy and covered with oily residue. You can use the virtual software keyboard, but that takes up screen real estate, it's slow and glitchy, and using it covers the screen with even more oily residue and gunk.

              • Never mind the fact that you have to lift your hand off the physical keyboard and onto the screen to select anything, and the screen then becomes grimy and covered with oily residue. You can use the virtual software keyboard, but that takes up screen real estate, it's slow and glitchy, and using it covers the screen with even more oily residue and gunk.

                You do realize, of course, that if your fingers are oily enough to leave "oily residue and gunk" on a touchscreen, that they are leaving that same gunk on a keyboard and mouse, or even stuff like a remote control, right? Just because you can't see it as well, doesn't mean it is clean.

                Pro Tip: Wash your greasy hands after eating things like Cheeseburgers, French Fries and Cheetos if you don't want to leave "oily residue and gunk" on stuff you touch.

  • I suppose at some point down the line you'll be able to buy some premium pack in the app store that de-cripples the device.
  • by Fly Swatter ( 30498 ) on Friday May 28, 2021 @08:14PM (#61432950) Homepage
    Run the app twice!

    -

    Or thrice?
  • Deja-Moo (Score:2, Troll)

    by geekmux ( 1040042 )

    (2010 Apple iConsumer) "My iPhone4 signal sucks."

    (Steve Jobs) "You're holding it wrong."

    (2021 Apple iConsumer) "My iPad memory support sucks."

    (Tim Cook) "You're RAMming it wrong."

  • Pansies! (Score:5, Insightful)

    by methano ( 519830 ) on Friday May 28, 2021 @08:44PM (#61432998)
    What a bunch of whining pansies! "I only get 5 GB, wah!"

    My first mac only had 128K of RAM and it was pretty fancy. That was back when real men and women wrote code, close to the metal. These spoiled whimper-snappers don't know how good they've got it. "I only get 5 GB, wah!"
    • by msk ( 6205 )

      My first computer had 4K of RAM.

      Get off my lawn.

    • I would assert that a Mac Plus running Word 3.x is faster and more responsive than virtually any computer made today when it comes to UI based stuff. Amazing what people could produce when hardware was a limiting factor.

    • 128K ought to be enough for anyone.
    • My first mac only had 128K of RAM and it was pretty fancy. That was back when real men and women wrote code, close to the metal.

      The 128k Macintosh was barely capable of performing its mission because it was underdesigned for it. The system had graphics-only output yet had absolutely zero graphics acceleration hardware, and in fact Macintoshes didn't get any until the Macintosh II line, and even then only if you bought a graphics card that was more expensive than an entire PC (the 8•24 GC). And as a result performance was frankly atrocious compared to the direct (but to be fair, slightly later) competition that used the same C

    • My first mac only had 128K of RAM and it was pretty fancy.

      Pansy. Back in my day we didn't have any RAM at all. If I wanted to kill someone with a chainsaw in Doom I had to do it the old-fashioned way and end up on the 7pm news.

      I never understood the "back then we didn't have X" argument. No shit, Sherlock; you also didn't do Y. Back then you would have posted that message by getting out pen and paper and putting your comment up at the local library.

  • Wow (Score:4, Insightful)

    by Malays2 bowman ( 6656916 ) on Friday May 28, 2021 @09:14PM (#61433058)

    When I was growing up, programmers often were limited to 64 KILOBYTES or less of RAM. Often "tricks" were employed to pull off some truly amazing shit (check out the C64 demo scene).

    Maybe this is a good thing, as programmers will have to be more skilled and efficient with code, rather than expecting the user to throw more RAM and CPU at the problem they've created.

    Now if only websites were given stricter limitations on resource usage... :-\

    • Even in x86-land with 640K+ of memory installed, one was still stuck with 64K segments for quite a while (until, I think, the 386 showed up with 32-bit protected mode). If you're into demos, you're probably already aware of it... but 8088 MPH [youtube.com] is worth a watch.
    • Maybe this is a good thing, as programmers will have to be more skilled and efficient with code

      Who said the programmers are being inefficient with code? RAM is used as the fastest available memory storage medium. It exists to speed up the user experience. I highly doubt a single person is hitting the 5GB limit with their app; rather, they're running into both speed issues and limits in what the app does.

      64KB of RAM? I suspect limiting programmers that low may make it difficult to edit 4K 60fps video from the device's own camera.

      Websites? It's not websites which use RAM. It's the fact that we expect our br

  • 5 GB ought to be enough for anybody...
  • Does that 5 GB limit affect purgeable memory or just RAM allocated with malloc/new/CFAlloc/alloc/*?

    If you're using more than 5 GB of RAM that isn't purgeable, you're more than likely doing something wrong, but if you can't use more than 5 GB of purgeable memory when it isn't otherwise in use, that's probably a bug.

    See also: mmap.
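
    As a sketch of the mmap suggestion (Swift, via Foundation's mapped-reading option; the file name is hypothetical), a large read-only asset can be mapped rather than copied into allocated RAM; whether such file-backed pages count against the per-app cap is exactly the open question here:

    import Foundation

    // Map a large read-only file into the address space instead of reading it
    // into allocated memory; pages are file-backed and faulted in on demand.
    let assetURL = URL(fileURLWithPath: "samples.bin")   // hypothetical asset

    if let mapped = try? Data(contentsOf: assetURL, options: .alwaysMapped) {
        // Untouched pages cost essentially nothing; touched pages can be evicted.
        print("mapped \(mapped.count) bytes, first byte: \(mapped.first ?? 0)")
    }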

  • this is A GOOD THING !
    Time for these people to learn to code lean and mean.
    They should use temporary files for things they don't need in RAM, such as the undo stack; the SSD is sufficiently fast for this to be of absolutely no consequence for the user experience.
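
    A minimal Swift sketch of that temporary-file idea for something like an undo stack (the directory layout and snapshot format are hypothetical):

    import Foundation

    // Spill undo snapshots to temporary files so they stop counting as resident RAM;
    // restoring one is a single fast SSD read.
    let undoDir = FileManager.default.temporaryDirectory
        .appendingPathComponent("undo-stack", isDirectory: true)
    try? FileManager.default.createDirectory(at: undoDir, withIntermediateDirectories: true)

    func pushUndoSnapshot(_ snapshot: Data, step: Int) {
        try? snapshot.write(to: undoDir.appendingPathComponent("step-\(step).bin"), options: .atomic)
    }

    func popUndoSnapshot(step: Int) -> Data? {
        try? Data(contentsOf: undoDir.appendingPathComponent("step-\(step).bin"))
    }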

    • this is A GOOD THING !

      TREAD ME HARDER DADDY

      Filter error: Don't use so many caps. Filter error: Don't use so many caps. Filter error: Don't use so many caps. Filter error: Don't use so many caps.

  • by Flu ( 16236 )
    5GB of RAM is roughly 600 HD images. If you can't engineer a program that can process more than 5GB of data efficiently without keeping all of it in RAM all the time, maybe you should consider a career outside software engineering.
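
    A quick check of that figure, assuming uncompressed 1920x1080 RGBA frames (a Swift one-liner's worth of arithmetic):

    let hdBytes = 1920 * 1080 * 4             // about 8.3 MB per uncompressed HD image
    let cap = 5 * 1024 * 1024 * 1024          // the reported 5 GB limit
    print(cap / hdBytes)                      // roughly 647 images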
  • If we allocate more, app crashes.

    I don't develop in Swift or Objective-C, but from what I've been able to find out in a short time, it should be possible to catch when an allocation fails and handle it -- avoiding a crash.

    Even if a large allocation fails, it might still be possible to do smaller allocations for things such as showing an error message to the user.
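
    A minimal Swift sketch of that idea, checking a raw allocation instead of assuming it succeeds (the size is arbitrary; note that the OS may also terminate a process that crosses its memory limit before malloc ever reports failure, so this is not a guaranteed safety net):

    import Foundation

    // malloc returns nil on failure, which the app can detect and degrade from
    // gracefully instead of crashing outright.
    let wanted = 6 * 1024 * 1024 * 1024       // deliberately above the reported cap

    if let buffer = malloc(wanted) {
        defer { free(buffer) }
        print("got \(wanted) bytes")          // ... use the buffer here ...
    } else {
        // Small allocations for an error path generally still succeed.
        print("large allocation failed; fall back to a smaller working set")
    }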

    • by jythie ( 914043 )
      Yeah, I would be really curious to see what the actual crash was. Are they simply allocating till something fails, or are they allocating, treating memory like it is 'infinite', like on a desktop with an arbitrarily large swap file?

      The behavior sounds a lot like quotas being enabled, which for an iPad makes sense since you do not want any one app to be able to eat up ALL the system resources. That being said, if it is simply a quota, they should really include an option for changing it since there are
  • If you _really_ need more than 5GB to run an app, then a tablet is not the right tool for the job.

"For the love of phlegm...a stupid wall of death rays. How tacky can ya get?" - Post Brothers comics

Working...