AI Television

Netflix Used AI To Upscale 'A Different World' and It's a Melted Nightmare (vice.com) 57

Netflix has deployed AI upscaling on the 1987-1993 sitcom "A Different World," resulting in significant visual artifacts documented by technology commentator Scott Hanselman. The AI processing, intended to enhance the original 360p footage for modern displays, has generated distortions resembling "lava lamp effects" on actors' bodies, improperly rendered mouths, and misshapen background objects including posters and tennis rackets. This marks Netflix's second controversial AI implementation in recent months, following December's AI-powered dubbing and mouth morphing on "La Palma."
This discussion has been archived. No new comments can be posted.

  • Can it slice? Can it dice? Can it prepare a five-star, three-course dinner out of a can of beans, an onion, and leftover cold cuts?

    Can it replace too-big-for-their-britches skeptics who doubt the flawlessness of AI-generated content?

    Can it down-vote my post before breakfast?

    Can it play a game?

    Can it dominate all the slashdot headlines for days on end to stay relevant?

    Who knows?

  • by backslashdot ( 95548 ) on Wednesday March 12, 2025 @10:49PM (#65229379)

    There's one screenshot on there, I think .. but where's a video showing this? One link I could find was https://www.youtube.com/watch?... [youtube.com] but it doesn't look terrible. I think this is a great idea, though they definitely need to keep improving the technology. I would like to see my favorite 70s and 80s shows get upscaled.

    • There's a funny thing with video where if you distort anything other than the focus of the scene the core audience doesn't notice it. You pretty much have to be neurodivergent to notice because it isn't the part of the scene that is intended to be the focus.

      • I guess but then why bother upscaling in the 1st place?
        • >"I guess but then why bother upscaling in the 1st place?"

          So that everything, including the focal/subject looks much, much, much better.

          For me, the only thing that would make conversions unwatchable would be if they force high frame rates. I *hate* 48-60 fps stuff; it makes everything look "too real" and "soap opera-like." My brain can't suspend disbelief with it. That could be because my entire life has been watching 24/30 FPS content. It might be OK for sports (which I never watch) or news, but for a

          • by DarkOx ( 621550 )

            I wonder how much of the problem is actually the AI up scaling not understanding what *is* supposed to be in focus.

            You have blurry low-res content; a lot of early 80s and early 90s stuff was shot on video. On top of that you have depth-of-field choices the photographer might have made intentionally.

            The computer vision sees an object it recognizes, but it can't just use stable-diffusion to try to make the most in-focus, detailed render of that thing - or if it does, it has to apply the right amount of diffuse filte

        • by Targon ( 17348 )

          For content that came out before the era of HD (2006 and earlier), the programming was VERY VERY VERY blurry. The larger your screen, the more you notice just how bad the quality is. So doing an upscale on it makes some sense. Now, there are two ways to approach it: try doing it on demand, or do it properly - do the upscale with real professionals, get it done right, and then you have something decent.

          Using AI as a tool isn't a problem, but if the people who handle the upscaled content do

      • by Calydor ( 739835 )

        But did you notice the moonwalking gorilla?

      • I don't think that's necessarily the case.

        Lots of people today have very large 8K TVs at arms length almost, in their living rooms. Pixelation and smoothing/blurring of older videos that looked fine in low resolution is a thing that people do notice, especially after they've gotten used to the higher resolution.

        Not everyone watches videos on their smartphone.

      • by Tablizer ( 95088 )

        core audience doesn't notice it. You pretty much have to be neurodivergent to notice

        Neurodivergent people on Slashdot? Neeeever.

    • by Rei ( 128717 )

      "Where's the video" is exactly what I came here to write. All they have is a long non-fast-forwardable vertical video of a guy ranting about upscaling. Terrible "journalism".

    • "I would like to see my favorite 70s and 80s shows get upscaled." Good news. Most shows from the 70s and 80s (and everything earlier) were shot on film which is better than HD quality so no AI upscaling is necessary. Hence, you already have many shows from that era and earlier in HD such as The Twilight Zone and MacGyver. It wasn't until the later 80s and 90s that SD Video started replacing film. Thankfully not all shows switched to video. Seinfeld was shot on film, for example.
      • I remember when I learned this fact and looked further into it, only to discover that film holds far more resolution than SD video ever captured - if you have the original negative, you can rescan it at HD or 4K and recover detail that never made it to broadcast. It's pretty incredible if you think about it!
    • by markdavis ( 642305 ) on Thursday March 13, 2025 @07:18AM (#65229841)

      but where's a video showing this?

      I see the article has a video in it. But it is a guy doing commentary and showing the video from his phone.

      The "Melted Nightmare" is a huge exaggeration. From what I can tell, almost everything looks very good. There are some tiny errors sprinkled around, mostly in the background, that nobody would ever really notice unless they were TRYING to notice it.

      From what I could tell, the upscaled video would be wildly better than looking at the original. One of the main problems, however, with upscaling old 4:3 content is that old TV video was designed for small screens. Shots zoom into faces/subjects much more than in movies and modern TV shows so people on 13" to 27" TVs could see stuff. But during many upscaling conversions, they crop the top/bottom to force it to also be 16:9, and that makes the "zoomed-in effect" even worse. So you end up with these huge faces in your face. Often they have to add artificial panning as well, to keep people/things in the scene. I find that VERY distracting.

      The best conversions will come from old TV *film*, where they can rescan the original for more resolution and overscan horizontally, which helps with a 16:9 conversion (as long as they are careful not to include microphone booms and such). Unfortunately, there was a period post-film (but pre-modern/digital) where the originals were shot magnetically and processed in rigid 4:3 at poor quality. For those, there isn't as much hope of fixing things well.
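A quick back-of-the-envelope calculation (my own arithmetic, not from the article) of what that top/bottom crop to 16:9 actually costs:

```python
# Cropping a 4:3 frame to 16:9 keeps the full width but discards rows.
# Using a 640x480 frame as a stand-in for SD content:
src_w, src_h = 640, 480          # 4:3 source frame
crop_h = src_w * 9 // 16         # tallest height that fits 16:9 at full width
lost_rows = src_h - crop_h       # rows thrown away top and bottom

print(crop_h)                    # 360
print(lost_rows / src_h)         # 0.25 -> a quarter of the picture is gone
```

Losing a quarter of the frame is exactly why the faces end up so "zoomed in."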

  • by 50000BTU_barbecue ( 588132 ) on Wednesday March 12, 2025 @11:02PM (#65229393) Journal

    Was the summary written by AI as well? Standard definition NTSC video would be described as 480i, come on now.

    • If the article was written by AI, it would have probably been less shitty in general.

    • The 360p is likely the source resolution Netflix serves up. Converting 480i60 to 360p30 doesn't lose much fidelity, so it's common for streaming.

      I don't understand why a less-than-ideal upscaling is a "nightmare". It's not like they destroyed the source and can't do it over with a better, future AI.
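A rough pixel-budget sanity check (my own back-of-envelope, assuming ~640 samples per line, which is not stated in the thread) of why 480i60 to 360p30 loses less than the numbers suggest:

```python
# 480i delivers 60 fields/s of 240 visible lines each; 360p30 delivers
# 30 full frames/s of 360 lines. Assuming ~640 samples per line:
i480_pixels_per_sec = 60 * 240 * 640   # interlaced source, per second
p360_pixels_per_sec = 30 * 360 * 640   # progressive stream, per second

print(p360_pixels_per_sec / i480_pixels_per_sec)  # 0.75 of the raw pixel rate
```

So the 360p30 stream carries about three quarters of the raw pixel rate, before even counting the redundancy between adjacent interlaced fields.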

    • Hard to believe they don't have a higher-quality source, but I could see some AI systems working almost as well with a smaller size and running much faster. If it's not going to look great anyway, why run slower? Also, you can see what people will put up with... I remember a fractal scaling app from the 90s which produced results kind of like that, and it was slow. Looked sort of like a smart blur.

    • I was a little confused by that too, together with one of the repliers to your comment suggesting it's easy to store 480i as 360p... what? How could that even work? 240p, yeah, that's a simple removal of one of the fields, but 360p? Google has nothing if you search for 480i to 360p - quoting it produces nothing, and unquoting it gives you a patronizing lecture on how 480p is better, together with a whole lot of links about converting 480i to 480p, because Google is shit.

      I'm going to guess there's no real 360

    • That's how the guy describes it.

  • Does anybody critically watch it before release?
    Or do they simply not care?
    Old-school upscaling doesn't require AI, and it's probably all the show needs

    • You mean the TikTok in the headline, right? I mean, sure, there are some artifacts and text that is wrong, but this headline is pure hyperbole
    • To critically watch an episode carefully for issues would probably take 3 or 4 times longer than the episode itself. The human eye can only see a small area of the screen at a time, so you would have to watch it and rewind very often to make sure you didn't miss something off to the side. In other words, it would cost Netflix about $500 per episode if they did that (and that's assuming no corrections are needed). It may not be worth it for a lot of their old content that gets only a few thousand or fewer viewers.


  • Not "360p" (Score:5, Informative)

    by PhantomHarlock ( 189617 ) on Thursday March 13, 2025 @01:56AM (#65229621)

    Programs shot for NTSC television were the analog equivalent of 720×486 interlaced at D1 aspect ratio. This is a whole lot of stuff that few people remember now.

    The frame rate was 60 fields per second, with the odd lines forming one half-vertical-resolution field and the even lines forming the other, each arriving 30 times per second. With a horizontal scan rate of about 15.7 kHz on an NTSC CRT TV, it was easy to get a noticeable flicker artifact if graphics and camera images were too sharp, so there was a lot of filtering and anti-aliasing going on to give the 'perfect picture' you would see on broadcast television - crisp but not too sharp.

    When upscaling, if not done properly it can look terrible. AI is getting better at this when not overdone. You first need to combine the two fields to make either 60 full frames per second or 30 full frames per second, and up-res along the way. It's not magic, and if you go too far (480/486i to 4K 30P or 60P) it's gonna look pretty terrible and be a waste of processing time. Right now a good compromise for software like Topaz Video is to go for 1280×720, or 2K at the most. That gives you some interpolated sharpness without looking like a Salvador Dali painting. I am sure it will improve over time, but it will be making up a lot of data that was never there to begin with. Modern TVs are so large that this is not a bad thing when done right; NTSC content is noticeably soft on our modern sets.
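The field-combining step described above can be sketched like this - a toy weave deinterlace of my own in NumPy, not any particular product's pipeline; real deinterlacers also have to compensate for the 1/60 s of motion between the two fields:

```python
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two half-height fields into one full frame.

    Toy convention: the top field supplies the even rows, the bottom the
    odd rows. (Real D1 video has 486 active lines; 480 keeps it simple.)
    """
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

# Two 240-line fields, captured 1/60 s apart, become one 480-line frame:
top = np.random.rand(240, 720)
bottom = np.random.rand(240, 720)
frame = weave(top, bottom)
print(frame.shape)   # (480, 720)
```

Only after this step does the up-res to 720 or beyond happen.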

    These TVs are already capable of performing frame doubling of films in real time, creating the dreaded 'soap opera' effect - because it looks very close to NTSC 60i footage, which is what all soap operas and lower-budget sitcoms were shot on, using television cameras instead of film. (Higher-budget series like Cheers were shot on 35mm and then transferred with a 'pulldown' where frames were doubled or blended to go from 24 FPS to 60i - the end result was seamless and looked like 24 FPS film.)
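The pulldown mentioned in the parenthetical works by repeating fields in an alternating 3-then-2 cadence, so 24 film frames fill exactly 60 fields every second. A minimal sketch of my own, with frames stood in by integers:

```python
def pulldown_3_2(film_frames):
    """Map film frames to NTSC fields using the 3:2 cadence.

    Alternate frames contribute 3 fields, then 2, so every 4 film
    frames become 10 fields (24 fps -> 60 fields/s).
    """
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

fields = pulldown_3_2(list(range(24)))   # one second of 24 fps film
print(len(fields))        # 60 -> exactly one second of 60i fields
print(fields[:5])         # [0, 0, 0, 1, 1] -- the 3:2 cadence
```

Undoing this cadence (inverse telecine) is why film-sourced shows can be restored to clean 24 FPS, while true 60i video cannot.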

    Anyway, random trivia of technical knowledge slowly being lost to time.

    -A former broadcast engineer from the 90s

    • Thanks for your insights. I think AI can eventually (10 years from now?) do a good job of "filling in the missing information" in a credible manner. Frankly, what Netflix has done doesn't seem like AI .. a lot of it looks like they simply vectorized the scene and then enlarged it to more pixels. The AI would have to be about 10 times smarter than what we have today. For example, if a close-up shot in the original reveals a person has a mole, that mole had better be there in wider shots that have b

    • by AmiMoJo ( 196126 )

      The AI is supposed to fill in realistic details, but tends to fail. Skin looks too smooth like a waxwork, and small things like faces in the background get horribly distorted.

      In theory a properly trained AI should be able to cope with that stuff. It should know what a real face looks like, and ideally you could feed it higher quality photos of the main cast so that it's not just guessing with them. But that is apparently beyond us right now, and even going from 1080p film scans to 4k, like the recent Aliens

    • by Gilmoure ( 18428 )

      I'm guessing this is why Star Trek:TOS looks so good upscaled; probably shot on film?

      Seems like the '90s Star Trek series were shot on video tape so the upscaling is noticeable?

      • Yeah TOS wasn't upscaled to make the HD versions, it was literally always recorded on a high definition medium, 35mm film.

        The thing that's going to upset most Gen Xers like me is that most of the shows we were brought up on (late 1970s to early 1990s) were made during a time when video tape was considered high enough quality for pre-recording, editing, and archival purposes, and so most of the shows we love were shot using electronic cameras at 480i or 576i standards. Those older than Gen Xers have a diffe

        • by macwhiz ( 134202 )

          It also depends on how effects-heavy the show was. Older shows shot on film and edited on film are pretty easy to re-scan in HD. Then you get to the 90s and you get shows like Star Trek: The Next Generation that were shot on film but edited on video—Paramount had to go back and re-edit the entire series, and redo a lot of the effects like phaser beams that were done with video paintboxes, to get an HD version.

          One of the saddest cases is Babylon 5. The show was shot on film and protected for 16:9, beca

          • Yep, rendering was intensive with the renderfarms available at the time. B5's effects were mostly done in Lightwave by Foundation Imaging, and if the source files were available it would tentatively be possible to re-render. The problem is all of the third-party plugins used at the time. I was working in F/X by that time using Lightwave, and we used a lot of plugins to make up for things that LW did not have natively, like global illumination, anisotropic specularity (when a spot highlight becomes a curve

      • by sirket ( 60694 )

        Yep- film doesn't need to be up-scaled- just re-scanned (sort of- it depends when special effects were added and the like- sometimes those need to be redone).

        Shows shot on tape require a lot more effort to produce a high-quality upscaled version. AI can help there- but it takes time and effort to do it right.

        As I said in another post - I have a 1080p AI-upscaled version of ST:DS9 and it looks incredible - but the folks who did it clearly cared more than Netflix and spent the time and effort to choose the right method

  • The title is misleading. The title implies that AI is to blame, unable to do what it's supposed to. Today's AI is perfectly capable of getting good results for this type of task. The problem they had is mainly due to the algorithms they chose to use and how they were implemented.
    • by sirket ( 60694 )

      I have a 1080p AI upscaled version of ST:DS9 and it looks incredible- but the folks who did it clearly cared more than Netflix and spent the time and effort to choose the right method and tune it appropriately.

  • The upscaled version looks leaps and bounds better, except for a few glitches that you have to actively look for, but it's "not good enough". OK, watch your blurry 240x360 VHS copy with no details whatsoever.

    Gratitude is completely lost on this one.

    • VHS at 240i looks way better than this.

      Also you have the resolution backwards. In fact, I don't know where you got 240x360 from anyway; not only should it be 360x240, as VHS is 240i, but the horizontal resolution itself is defined in MHz.

      VCD was designed to approximate the digital equivalent of VHS with 352x240 (for PAL it was 352x288), and that looks immensely superior to this DNR'ed crud.

  • Netflix Used AI To Upscale 'A Different World' and It's a Melted Nightmare

    ... and this wondrous technology is supposed to end all human labor by 2030, solve cancer and climate change, take us to the stars and make us a post scarcity civilization (according to the AI bros).

    • by butlerm ( 3112 )

      It should be no particular surprise when delusional AI acts delusional when you ask it to make up information out of thin air. It is like a computer on drugs. In this case at least they can review the output and tell that it is a disaster. I would hate to use it for anything you actually need to trust - i.e., for more than just pure entertainment - or in any field where the user in question is not smarter than the AI and cannot verify the output. That way lies madness.

  • I watch a lot of old shows that are not upscaled and they look fine to me, as long as my glasses are clean. If 'A Different World' was a show I was interested in the first go-around, I would much prefer to just watch the original. Truthfully, I remember when 1080 was a big thing and I found the artifacts it caused, such as moire patterns, very annoying. Never went to 4K because all my 1080 TVs still work. The best TV I have ever had for watching movies is a 540p Hitachi projector.
    • I just enjoy not having to spend thousands to get the latest TV.

      You can get a really big 4K TV for under $1000. I'm not saying you should - if you're happy, that's great - but you don't need to spend "thousands" to get a bigger and higher-res set.

  • It's like peripheral vision, but with the ability to look directly at it.

  • As the tech gets better, the AI up-scaling will as well. Baby steps...
    • by jonadab ( 583620 )
      Eh, at this point I think it has to get worse before it gets better. What we're looking at is the uncanny valley: scale-up tech has reached the point where viewers instinctively compare the results of a scale-up to material that was shot at the target resolution in the first place, and the scaled-up versions suffer from that comparison, for obvious reasons.

      Moving from nearest-neighbor to bicubic interpolation is a clear win. It doesn't try to add detail that isn't there.
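The difference is easy to see in one dimension. A tiny sketch of my own, upscaling a 4-sample "scanline" by 2x; nearest-neighbor copies the closest source sample, while linear interpolation blends the two neighbors (bicubic extends the same idea to four neighbors):

```python
def upscale_1d(samples, factor, mode):
    """Upscale a 1-D signal by an integer factor.

    'nearest' copies the closest source sample (blocky);
    'linear' blends the two neighbours (smooth, but invents no detail).
    """
    out = []
    n = len(samples)
    for i in range(n * factor):
        pos = i / factor                 # position in source coordinates
        if mode == "nearest":
            out.append(samples[min(int(pos + 0.5), n - 1)])
        else:
            lo = int(pos)
            hi = min(lo + 1, n - 1)
            t = pos - lo                 # blend weight between neighbours
            out.append(samples[lo] * (1 - t) + samples[hi] * t)
    return out

line = [0, 0, 255, 0]                    # one bright sample
print(upscale_1d(line, 2, "nearest"))    # [0, 0, 0, 255, 255, 0, 0, 0]
print(upscale_1d(line, 2, "linear"))     # ramps up to 255 and back down
```

Neither invents detail; they only redistribute what's already there, which is exactly why they never hallucinate a melted tennis racket.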
  • From the article:

    "If you’ve ever played an old show on a modern, high-definition TV, you noticed that it looked like shit, like someone smeared Vaseline over the screen. That’s because they shot programs of that era in 360p (usually), and people were watching them on CRT televisions. Nobody missed higher definition because we wouldn’t have been able to see it on our screens, anyway."

    Yeeeeeaaaah, ok. 480i SD video was actually shot in 360p and HD was something nobody knew anything about.

    We
