Planar NAND Development Ends After 26 Years

Lucas123 writes: The non-volatile memory used in thumb drives, SSDs, smartphones and other mobile devices has at last hit an engineering wall. The major developers of planar NAND said this week that, now that they've reached 15 or 16 nanometer process technology, they no longer expect to shrink their lithography any further, as the capacity and economic benefits no longer make sense. Toshiba, which produced the first NAND flash chip in 1989, along with SanDisk, Intel and Micron, said they will turn their engineering efforts to 3D charge trap NAND, 3D resistive RAM and other vertically stacked non-volatile memories that offer a much longer roadmap. The manufacturers all said they'll continue to produce planar NAND while developing 3D NAND, which has already doubled previous capacities while also offering two to 10 times the erase-write endurance of previous non-volatile memories and twice the write performance. Intel and Micron are also producing a 3D NAND based on floating-gate cells, and a ReRAM that the companies say will improve performance and endurance 1,000 times over planar NAND. Toshiba and SanDisk have come out with a 48-layer 3D NAND that could allow them to produce 400GB microSD cards next year.
  • by Anonymous Coward on Thursday August 06, 2015 @11:36AM (#50263137)

    Despite these advances, iPhone 7s will still be just 32 or 64GB, with a ridiculous upcharge for the 64GB version...

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      That's nothing compared to the cost of RAM upgrades for the 2014 Mac mini. And you can't even upgrade the damn RAM yourself, so this makes their "entry-level Mac" extremely expensive unless you agree to buy something with inadequate RAM. And when you need more RAM, you need another computer. That's the complete opposite of being a green company.

      • Re: (Score:3, Insightful)

        by sremick ( 91371 )

        Apple's anti-consumer phone design has now infected their desktops and laptops as well. Apple has given the middle finger to 30+ years of standard personal computer design practice. Want to upgrade anything? Buy a whole new computer. Any part breaks? Buy a whole new computer.

        Any single reason they give for it is utter BS. Anyone who buys into it is a gullible blind sheep. I'm sorry, but I've seen too many companies do exactly what Apple says they can't for me to give an ounce of credibility to their

        • by ShanghaiBill ( 739463 ) on Thursday August 06, 2015 @12:45PM (#50263607)

          Computers smaller and thinner than Apple devices, with removable/upgradable components.

          Could you please provide a link to these mythical devices?

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          Apple's anti-consumer phone design has now infected their desktops and laptops as well.

          I think you mean anti-power user. The average Apple consumer is quite happy with the arrangement.

          • The problem comes when a consumer wants to become a power user. Without a reasonable upgrade path, the sticker shock of replacement might deter consumers from becoming power users in the first place.

            • by DedTV ( 1652495 )
              I'd think a consumer converting to a power user is much rarer than the other way around.
              Most power users start off as power users as soon as they become computer literate. People don't often spend 10 years looking at cat videos on an eMachine and then suddenly decide they want to build their own computer. But a lot of power users do seem to get fed up with constantly chasing down benchmarks for new components when kids, work and life become more time consuming and all they want to do when
              • by tepples ( 727027 )

                Perhaps I was unclear. By "power user", I didn't necessarily mean someone who builds a computer or is otherwise deeply spec-conscious. I meant somebody who puts a computer to substantial use other than just viewing entertainment works and light socialization. Is there a better term for someone who uses a computer as a tool in this manner? "Professional" doesn't cut it because I also want to include amateurs who use a computer to make things.

        • If you don't like what Apple is selling, then buy something else.

          Most computers never get upgraded. Apple thus made the reasonable tradeoff to sacrifice upgradability to make smaller, simpler, and more durable products. If you think this was a bad tradeoff, then you are free to buy something else. I'm perfectly happy with non-upgradable HW and I consider myself an informed shopper, not a gullible blind sheep. Different people care about different things.

          Maybe it's time to stop caring so much how other peopl

      • by laffer1 ( 701823 )

        You're not even accounting for the fact that it's slower than the 2012 Mac Mini quad core. They cut CPU performance significantly with the new model and then made it non-upgradable to boot.

        After test driving several Macs, I realized that I was better off putting an SSD in my 2012 mini rather than buying a new one. It was going to cost $1,800 to buy a Mac that was faster in CPU today (I paid ~$1,000 in Dec 2012), and I had to get an iMac or top-of-the-line MacBook Pro to match it.

        Apple has lost their minds on pricing at

    • by dfghjk ( 711126 )

      Curious that this is modded Insightful when it's not even true for iPhone 6 models.

      • by Khyber ( 864651 )

        Looks plenty true to me. And then we get into Apple's bullshit marketing lies. "Seamless design"? I can sure as fuck see seams. How was your device even assembled without seams?

        And only a 1080p screen on their latest model when I've got 4K screens the same size in my hand RIGHT NOW.


  • by Dan East ( 318230 ) on Thursday August 06, 2015 @11:51AM (#50263253) Journal

    I was just thinking to myself how awesome it would be to have a 1 petabyte micro SD card, but then realized, "What could I possibly use that much storage for?" Yes, I know, the supposed "640k is enough for anyone" fallacy. Well, there really is a limit to what a normal human being needs to store. Why aren't MP3 files today 100 times larger than they were 15 years ago? Because the normal human's audio perception cannot tell the difference between a 5 MB MP3 and a 500 MB MP3. So the space required to store 1,000 songs is pretty much the same as 10 years ago, for most people.

    In the last few years, we've reached the limits of human perception when it comes to image resolution. The displays on my phone and my ultrabook are both so high resolution that I cannot see individual pixels without a magnifying glass. How high a resolution does a photograph need to be to print it out at 8x10 with pixels so small that they cannot be seen? We surpassed that resolution a long time ago.
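    The print arithmetic here checks out; a quick sketch, assuming the usual ~300 dpi figure at which individual dots become invisible at normal viewing distance:

```python
def print_megapixels(width_in, height_in, dpi=300):
    """Pixels needed to print at a given physical size and dot density."""
    return width_in * dpi * height_in * dpi / 1e6

# An 8x10 print at 300 dpi needs only 7.2 MP -- well below modern camera sensors.
print(print_megapixels(8, 10))
```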

    Why don't computer monitors and image formats use 64 bit colors instead of 32 bit color that we've had for 15 years? Because the normal human cannot distinguish shades of color beyond 32 bit RGB.

    When everything is in 4k video, why would we need higher resolution (unless people are regularly projecting things on screens as wide as their house)?

    The amount of storage we need has already plateaued when it comes to certain kinds of media, and it will soon plateau in the others (video, etc) as well. At that point it's just a matter of quantity. What good would it do me to be able to store 1 million songs, or 1 million pictures on my phone? I certainly cannot produce that many myself, and I cannot even consume them either.

    For normal consumers, there will be a limit to the amount of storage we need and thus will pay for. When that occurs, research will slow down, as the profit to be gained from selling a petabyte of storage vs. an exabyte will no longer justify the research. We are quickly reaching the point where speed and longevity are more important than capacity, so I expect, within 5 years, the emphasis will switch from mainly quantity to quality.

    • I reckon that you're mostly right, to within an order of magnitude. I'd poke at the details a little bit though. For desktop work I'd happily replace multiple screens with a 55" panel. 4K is not enough for that scenario, although 8K probably is, and I would guess that 16K would be redundant. For less detailed use (movies/games) a 100" screen would be better, but the target resolution is basically the same.

      No idea how it works for VR but there doesn't seem to be much point trying it until it gets better than the

      • 55" panel? From what distance?

        At some point, you're moving away from your large screen, and the pixel density needed is less.

        If you're sitting 18" away from your 55" screen, you're doing something wrong.

        • by jfengel ( 409917 ) on Thursday August 06, 2015 @01:35PM (#50263993) Homepage Journal

          You can also turn your head and move around. You may not need to keep the entire thing in your field of view all at once. Sometimes you want the big picture, sometimes you want to drill down.

          You can emulate that in software, but there are kinaesthetic senses you can take advantage of. If you're looking at a large map, for example, it's very intuitive to move your face in to read names, and then away to see where that fits into the whole. It's faster and more effective for me to switch between a debugger window on screen B and my running program on screen A.

          I don't know what the limits are; the GP suggested 8K and that sounds about right to me. But I think that assuming a single, fixed head position for the user can be unnecessarily limiting, and miss out on one kind of gesture to enable smoother interaction.

        • 18" is very roughly the distance at which pixels become indistinguishable on a 28" screen; I'd say around 60 cm for me. On a 55" in a desktop setup I would use a viewing distance of roughly 1 m (40"), so not quite view-filling, but close enough that it would involve a bit of head motion, and windows near the bottom may as well be a different workspace from those at the top.
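          These distance figures can be sanity-checked against the usual 1-arcminute (~60 pixels per degree) acuity limit for 20/20 vision; a rough sketch, using the screen sizes and distances discussed above:

```python
import math

def pixels_per_degree(diagonal_in, horiz_res, aspect_w=16, aspect_h=9, distance_in=24):
    """Pixels subtended per degree of visual angle for a given screen and viewing distance."""
    # Horizontal screen width from the diagonal and aspect ratio
    width_in = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    px_per_inch = horiz_res / width_in
    # Length subtended by one degree of visual angle at this distance
    inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inch_per_degree

# ~60 px/deg matches the 1-arcminute acuity limit; 4K at 40" falls just short, 8K clears it.
print(pixels_per_degree(55, 3840, distance_in=40))
print(pixels_per_degree(55, 7680, distance_in=40))
```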

        • by Khyber ( 864651 )

          "If you're sitting 18" away from your 55" screen, you're doing something wrong."

          Or maybe you're just an ignorant tool that doesn't understand some people need to be that close EVEN WITH FUCKING GLASSES TO CORRECT THEIR VISION.

          You insensitive thoughtless fool.

          • But if that were the case, DPI (resolution) wouldn't matter.

            How do I know? One of my best friends has macular degeneration and indeed does need to be very close to see what's going on. I can assure you that a 55" TV at 18 inches wouldn't work for someone suffering macular degeneration; it wouldn't work for her.

            • by Khyber ( 864651 )

              "But if that were the case DPI (resolution) doesn't matter."

              Apparently you failed basic human physiology. DPI does still matter. Doesn't help if you've only got ONE CHARACTER ON SCREEN REGARDLESS OF SIZE, though, now does it?

              And if your friend truly has macular degeneration, putting her close to a screen is THE WORST thing you can possibly do, given the knowledge that blue light makes macular degeneration worse, and guess what the primary color output of CCFL and LED backlighting happens to be? BLUE.

    • by Anonymous Coward

      The formats may change, though. Some idle speculation:

      In future, we will probably see "actual" 3D video, storing Z coordinates of pixels as well as X,Y and all of the associated color information. Additional information will be packed in to allow on-the-fly changing of viewing angles (information not "by default" on the display).
      This could apply to photos, too: uncompressed 3000x3000x3000 images with associated RGB.
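      A back-of-envelope check on what such a format would cost in storage (uncompressed, 3 bytes of RGB per voxel as described above):

```python
# Uncompressed 3000x3000x3000 voxel image, 3 bytes (RGB) per voxel
voxels = 3000 ** 3
bytes_total = voxels * 3
print(bytes_total / 1e9)  # 81.0 -- about 81 GB for a single still image
```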

      In music, files may not be pre-m

      • by tepples ( 727027 )

        In future, we will probably see "actual" 3D video, storing Z coordinates of pixels

        How would cameras gather an X,Y->R,G,B,Z depth field like that, other than just by taking two X,Y->R,G,B pictures and guessing the depth value using computer vision algorithms, the way it's done in Kinect?

        In music, files may not be pre-mixed channels of the completed sound wave and instead be many individual channels of information.

        This is already true of file formats used in recording studios. But major record labels have been unwilling on the whole to distribute these to the public.

      • by Khyber ( 864651 )

        " In future, we will probably see "actual" 3D video, storing Z coordinates of pixels as well as X,Y and all of the associated color information."

        You do know we already have light-field cameras, yes? There's no 'in the future' for anyone that's been paying attention for the last THREE YEARS.

    • by rsmoody ( 791160 )
      Some of us think MP3, no matter how high the bitrate, is inadequate (and FYI, MP3 tops out at 320kbps). Do they sound OK? Mostly. All of my CDs are stored in FLAC. What I can get in higher resolution (24/48 to 24/192) I get; if at all possible, I prefer DSD (aka SACD). It's quite common for an average album to approach 5GB in DSD. And yes, it does sound better. Especially DSD, much better. So a 1 or 2 TB microSD card would be very welcome to myself and many people like me that appreciate high resolution audi
      • If you're using a phone, iPod or the like, and not a full speaker set in a dedicated room, you're just fooling yourself. Almost everyone I have ever seen tested can't tell the difference between 320kbps MP3 and FLAC. Most can't tell the difference at 250kbps.

        • MP3 has some limitations, especially for time resolution. Percussive sounds will be wrong no matter what, though we would need to find or create some test files and compare them to find out how it sounds at 320 kbps (there are some examples at lower bitrates).
          Ironically, 320 kbps or 384 kbps MP2 would not have the same problem.

          What's really bunk is higher-than-CD resolution: CD quality is all you need, or, to stretch things, 24bit/48kHz as the absolute best for music listening.
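          For reference, the raw PCM rates behind those figures are straightforward to compute (these are the standard published rates, nothing exotic):

```python
def pcm_kbps(sample_rate_hz, bits, channels=2):
    """Raw (uncompressed) PCM bitrate in kbit/s."""
    return sample_rate_hz * bits * channels / 1000

print(pcm_kbps(44_100, 16))   # CD audio: 1411.2 kbps
print(pcm_kbps(48_000, 24))   # 24/48: 2304 kbps
print(pcm_kbps(192_000, 24))  # 24/192: 9216 kbps, ~6.5x CD for the same music
```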

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      You're making (at least) two assumptions.

      1: The majority of storage space will be spent on multimedia that is to be consumed by humans.
      2: The kinds of multimedia being used will not change.

      1 could easily be violated if it's not humans, but the computer, that's making use of the data. Our AI research hasn't really made any progress towards sentience, but computers are getting damn good at processing large amounts of data and then answering questions about it.

      2 could be violated with increased demand for true

    • by Bengie ( 1121981 )

      Because the normal human cannot distinguish shades of color beyond 32 bit RGB.

      Really? I see color banding all the time in games. When color banding is no longer an issue, then the color depth will be enough. A quick Google returns answers saying the term "color" is ambiguous or misunderstood in most usage. If you define "color" the way a layman would, we can see closer to 100 million colors; most usages of the term "color" do not include luminosity.

      • by Bengie ( 1121981 )
        I forgot to also mention that the topic gets even more convoluted once you account for the fact that the eye amplifies contrast, which can increase human perception of color near contrasting edges.
      • by paulpach ( 798828 ) on Thursday August 06, 2015 @03:30PM (#50264807)

        I am a game developer.

        Indeed, many games have color banding, as do many JPEG images. But this has nothing to do with color depth.

        When a game bundles an image, it is normally compressed in a lossy format such as DXT5 or ETC1 (depending on your platform). These formats are typically much smaller than, say, a PNG, and are sent compressed to the video card. The video card has hardware that can fetch a pixel on demand from these images without having to decompress them. This saves a lot of video card memory, which can be used for more polygons and whatnot.

        These formats, like JPEG, modify the image a little bit if that helps make it smaller. A somewhat oversimplified explanation: suppose there are 5 pixels that are almost the same color, for example (red, red+1, red-1, red+2, red+1). The algorithm will change them to be the same color, (red, red, red, red, red); then, instead of saving each individual pixel, it will just store (5 red), which takes a lot less space. A particularly bad effect of this is that gradients end up not so smooth, so you see banding. Reality is a lot more complex than this, but you get the idea.

        In addition, when a texture is rendered at a distance, the hardware actually chooses a scaled-down version of the image. The farther the texture, the less precision is used, until there is only 1 pixel. This is called mipmapping. Depending on the algorithm used for blending mipmap levels, it can also generate banding.

        You could use 128 bit RGBA color depth, and you would still see the same banding due to these optimizations.
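        The banding mechanism described above can be demonstrated without any GPU at all: quantizing a smooth 8-bit gradient down to 5 bits per channel (roughly the endpoint precision DXT-style codecs work with, though the real schemes are more involved) collapses it into visible steps. A minimal sketch:

```python
def quantize_channel(value, bits=5):
    """Quantize an 8-bit channel value to the given bit depth, then expand back to 0..255."""
    levels = (1 << bits) - 1
    q = round(value * levels / 255)
    return round(q * 255 / levels)

# A smooth 0..255 gradient collapses into far fewer distinct shades -- that's the banding.
gradient = list(range(256))
banded = [quantize_channel(v) for v in gradient]
print(len(set(gradient)), len(set(banded)))  # 256 vs 32 distinct values
```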

    • by cdrudge ( 68377 )

      When everything is in 4k video, why would we need higher resolution (unless people are regularly projecting things on screens as wide as their house)?

      If I'm watching a movie on my phone, I don't need 4K. If I'm watching a movie on my 80" TV across the rec room, I want 4K. In the future, when I want to watch my 4K 3D holographic movie, I'm going to want that petabyte microSD card.

      I'd also love that petabyte microSD card to put into my home theater PC/server. I don't necessarily need SSD speeds for movie storag

    • You obviously haven't seen The Final Cut. Imagine a world where everything we see is recorded all of the time until we die. Just video everything all the time.

      • That's the idea with vlogging (video blogging): just record 24/7 and only index the relevant events in real time. Then go back and publish the important edited bits online. It's basically what YouTube is for. Just imagine a Looxcie LX2 with a lot more flash storage and battery life. Done and done!

      • by MrL0G1C ( 867445 )

        Imagine a world where everything we see is recorded all of the time until we die

        Why, who's going to watch it?

        Do you work for secret services? That sounds like something they'd love to do.

    • I think you are underestimating the memory needs of future applications. When text and spreadsheets were all we used computers for, a gigabyte seemed like overkill, but that of course proved grossly inadequate when photos, audio and video came along.

      What will be these new storage hungry applications? Well, obviously no one knows for sure (this will only become clear when the technology is cheap enough to enable it). But here are a couple ideas:
      - Immersive video (allows you to look in any direction
    • by Rinikusu ( 28164 )

      Porn. Porn will figure out a way. Choose your angle, choose your partner, hook up your Oculus, grab your lube and hang the fuck on.

    • In a future of unlimited storage & compute power, we'd no longer use MP3 - or any other lossy format for that matter. Everything would be lossless.
      It's not a question of us being able to tell the difference, but unnecessarily degrading data.

      Also, as mentioned above, our media will probably start including things such as depth & other environmental data. True VR will require much more than a simple 3D projection & stereo audio.

    • Well if anything you would expect it to follow some kind of S curve.

    • by digitalPhant0m ( 1424687 ) on Thursday August 06, 2015 @01:30PM (#50263953)

      to have a 1 petabyte micro SD card, but then realized, "What could I possibly use that much storage for?"

      blah blah blah ..... porn.

    • by hey! ( 33014 )

      Well, we can take a lesson from what happened when RAM became larger. People found more uses for it because it was there.

      My favorite laptop of all time was the old PowerBook 540c "Blackbird". On the evolutionary scale it's right at the mid-point between the very first portable computer, the Osborne 1, and the laptop I'm writing this on now. If you plot the amount of memory on these computers on a *logarithmic* scale, the Osborne has order of magnitude 4 memory; the 540c has oom 7 memory, and the laptop

    • by jfengel ( 409917 )

      Because the normal human cannot distinguish shades of color beyond 32 bit RGB.

      We can see a bit outside of the RGB color gamut. That's largely a limitation of the screens: it's cheap and easy to produce RGB, and there's not a whole lot of advantage to adding in the indigo. Artists would like it, since it's one of the biggest differences between the screen and print; they can buy special monitors, driven by special software. They require a wide-gamut format with up to 48 bits.

      That's not a huge imposition on memory. The limiting factor tends to be processing speed (as well as the limita

    • by Khyber ( 864651 )

      "Why aren't MP3 files today 100 times larger than they were 15 years ago?"

      Nowhere for the fucking stupid reasons you're positing without any obvious knowledge on the subject. If you even understood how the MP3 format worked in the first place you wouldn't be saying such stupid and ill-educated things.

  • Stacking at 2x the depth produces 2x the heat per square unit of die size. We already have heat and power budget issues that give rise to "dark silicon". Therefore, there isn't very much vertical room to expand either. (And to think that the entire human brain runs on about 20 watts...)
    • by cb88 ( 1410145 )
      That is only when running at the *same* switching frequency for transistors... keep in mind some of these technologies are not entirely transistor based so the power usage metrics that would apply for CPUs don't entirely apply to memory... memory is inherently mostly dark anyway though.

      Stacked memory like the HBM in AMD's new GPUs being a fine example, or stacked nand... can allow the same data to be transferred at lower frequencies due to physically smaller bus lengths and wider buses. Which leads to lower
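      The parent's point follows from the usual first-order CMOS dynamic power model, P ≈ α·C·V²·f: a 2x-wider bus at half the clock moves the same data, and if the lower clock also permits even a small voltage drop, total power goes down. A toy calculation (all constants invented purely for illustration):

```python
def dynamic_power(cap_farads, volts, freq_hz, activity=1.0):
    """First-order CMOS dynamic power: P = alpha * C * V^2 * f (in watts)."""
    return activity * cap_farads * volts ** 2 * freq_hz

# Same bandwidth: 2x-wider bus at half the clock (switched capacitance doubles with width).
narrow = dynamic_power(1e-9, 1.2, 1e9)
wide = dynamic_power(2e-9, 1.1, 0.5e9)  # halved clock lets voltage drop slightly
print(narrow, wide)  # the wide, slow bus burns less power for the same data rate
```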
