Graphics Displays

NVIDIA's G-Sync Is VSync Designed For LCDs (not CRTs)

Phopojijo writes "A monitor redraws itself from top to bottom because of how the electron guns in CRT monitors used to operate. VSync was created to align the completed frames computed by a video card to the start of each monitor draw; without it, a break (a horizontal tear) would be visible on screen partway through the monitor's draw, where two time-slices of animation meet. Pixels on LCD monitors do not physically need to wait for the lines above them to be drawn, but they still do, because the scan-out model was inherited from CRTs. G-Sync is a technology from NVIDIA that makes monitor refresh rates variable: the monitor times its draws to whenever the GPU is finished rendering. A scene that requires 40ms to draw will have a smooth 'framerate' of 25FPS instead of trying to fit into some fraction of 60 FPS." NVIDIA also announced support for driving three 4K displays at the same time, a combined resolution of 11520×2160.
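As a rough illustration of the arithmetic behind that claim (a sketch assuming a 60 Hz panel, not NVIDIA's actual implementation): with vsync on, a finished frame waits for the next refresh boundary, while a variable-refresh display shows it as soon as it is ready.

// Illustrative arithmetic only: fixed 60 Hz refresh with vsync vs. a
// variable-refresh (G-Sync-style) display.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;                 // fixed refresh rate
    const double refresh_ms = 1000.0 / refresh_hz;  // ~16.7 ms per refresh
    const double frame_times[] = {10.0, 20.0, 40.0};

    for (double frame_ms : frame_times) {
        // vsync on: the frame is held until the next whole refresh interval
        double presented_ms = std::ceil(frame_ms / refresh_ms) * refresh_ms;
        double fps_vsync    = 1000.0 / presented_ms;
        // variable refresh: the display refreshes when the frame is done
        double fps_vrr      = 1000.0 / frame_ms;
        std::printf("render %5.1f ms -> %4.1f FPS with fixed vsync, %4.1f FPS with variable refresh\n",
                    frame_ms, fps_vsync, fps_vrr);
    }
}

For a 40 ms frame this prints 20 FPS under fixed vsync versus the 25 FPS the summary mentions.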
  • In English (Score:2, Interesting)

    Okay, can someone who isn't wrapped up in market-speak tell us what the practical benefit is here? The fact is that graphics cards are still designed around the concept of a frame; the rendering pipeline is based on that. 'vsync' doesn't have any meaning anymore; LCD monitors just ignore it and bitblt the next frame directly to the display without any delay. So this "G-sync" sounds to me like just a way to throttle the pipeline of the graphics card so it delivers a consistent FPS... which is something we can

    • Re:In English (Score:5, Informative)

      by Reliable Windmill ( 2932227 ) on Friday October 18, 2013 @04:42PM (#45169475)
      For starters, it reduces memory contention, because the display device doesn't have to read and send the display data over the wire 60 times a second while rendering the next frame. Theoretically, if there's nothing happening on the screen, such as an idle desktop, the display device won't consume hundreds of megabytes per second of bandwidth just to show a still image.
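      A back-of-the-envelope sense of the numbers involved, assuming a 1920×1080 desktop at 32 bits per pixel scanned out 60 times a second (real display controllers may fetch or compress differently):

      // Rough estimate of the scanout bandwidth being discussed; the resolution,
      // pixel format and refresh rate here are assumptions for illustration.
      #include <cstdio>

      int main() {
          const long long width = 1920, height = 1080;
          const long long bytes_per_pixel = 4;      // 32-bit RGBA/XRGB
          const long long refresh_hz = 60;

          long long bytes_per_frame  = width * height * bytes_per_pixel;   // ~8.3 MB
          long long bytes_per_second = bytes_per_frame * refresh_hz;       // ~498 MB/s
          std::printf("scanout: %.1f MB per frame, %.1f MB/s at %lld Hz\n",
                      bytes_per_frame / 1e6, bytes_per_second / 1e6, refresh_hz);
      }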
    • Re: (Score:3, Informative)

      by LazyBoot ( 756150 )
      Vsync still tends to add noticeable input lag even in games today. And some games still have issues with tearing even on LCD screens. So I'm guessing this is what they are trying to fix.
      • by Anonymous Coward

        Yeah, I was going to say that I've got all sorts of games that have tearing on side-to-side scrolling scenes. Whether it's a problem with my panel, the videocard, or the game, I'd like it to stop (and usually enabling vsync in the game stops it).

      • "Vsync still tend to add noticeable input lag even in games today. And some games still have issues with tearing even on lcd screens. So I'm guessing this is what they are trying to fix."

        I still have tearing on 2 of my machines (Vista and 7)

        Switching to an Aero theme fixes it, even though I hate those.

      • Input lag is how long a game takes to react to your manipulation of the controls, not how long it takes to display it on the panel or CRT you're looking at. Maybe you mean output lag? Since screens get updated 50 or 60 times a second with TFTs, and CRTs often get higher refresh rates, you're looking at 20ms or less for a screen refresh when it comes to pure VSync delay. "Whoa dude, 20ms, my ping time is less than that!" I hear you say. Apart from the fact that most of the planet has ping times that are way high

        • I don't have LCD lag, either. At least, not anything appreciable. That's because my PC monitor is running at native resolution, and my TV is a SHARP Aquos and it has a spiffy scaler which can scale in a single frame cycle, and at high quality.

        • Believe it or not, enabling vsync, especially on 60 Hz LCD displays (about 16.7 ms per frame), still causes a very perceptible delay in fast-paced games (even without triple buffering). Disable it, and you no longer see the delay; movement feels much more instantaneous. Yes, scaling and adjusting refresh rates may introduce delay, but who runs their LCD in a non-native mode?

          If you don't see the difference, then your game is too slow or doesn't render enough frames (>= 100) per second.
        • High speed video of an LCD refresh occurring in real-time:
          http://www.youtube.com/watch?v=hD5gjAs1A2s [youtube.com]

          Also, input lag is the whole chain, INCLUDING how long it takes to display.
          See AnandTech's article:
          http://www.anandtech.com/show/2803/7 [anandtech.com]

          CRTs have zero input lag only at the top edge of the screen.
          CRTs even have input lag at the bottom edge of the screen, because they have a finite frame transmission time (scanning from top to bottom).
          Some gaming LCDs (certain BenQ and ASUS gaming LCDs) are the same way; the

    • by Anonymous Coward

      No more screen tearing: http://en.wikipedia.org/wiki/Screen_tearing

    • The only advantage I see is that you see the image as soon as possible after it's done rendering. If your graphics card is done rendering the scene, but your monitor refreshed just 1 ms before that, you're going to have to wait another 0.016 seconds (assuming 60 Hz) to see that frame, because that's the next time the screen refreshes. If you can make the refresh of the monitor exactly match the rate at which frames are produced, there's minimal lag between the frame being rendered, and the frame being shown
      • Scenes where there is little change between frames can operate at a much higher framerate.

        I think this opens up interesting possibilities.

        Of course I think the physical response time of the display will be a bottleneck, capping the rate at a maximum below what the GPU can pump out.

        Also it can save power. If the GPU is creating frames at less than 60 Hz, then that's less power it needs to spend to do it.

        • Comment removed based on user account deletion
          • Well, the video card of course will save power by not rendering a frame until the LCD is ready to take it.

            I'm not keen on the low level details of HDMI, but I do know that HDMI and friends send the entire image per frame over the wire. The "RAMDAC" of HDMI is sending a full frame's worth of data over that wire regardless of any changes. It would save a bit of power to not send that if there are no changes.

            Even if your videocard did nothing on a given frame, its framebuffer still got shoved over the HDMI

      • by dow ( 7718 )

        Indeed. I have a 144 Hz screen, and I noticed as soon as I went from 60 Hz to 144 Hz that even when the frame rate was below 60fps, it was still smoother than before. It was obvious that it would be smoother when getting over 60fps, but this surprised me. Thinking about it, I came to the same conclusion.

        I have also been wondering what would the picture be like if using a high refresh rate when the graphics card cannot render enough frames for one every refresh, what if it only rendered half the pixels every upd

    • by Anonymous Coward

      VSync's problem is that it is dictated by your monitor and is a fixed rate, whereas graphics card frame rates are variable. This causes either stutter or tearing, depending on whether you wait for VSync before drawing.

      This solution instead is controlled by the video card so it will never tear and the monitor will update when told to rather than at a specific time. No more tearing and no more performance loss to deal with it... also no need to triple buffer which will help reduce input lag (among other things)

    • Re:In English (Score:5, Informative)

      by The MAZZTer ( 911996 ) <(megazzt) (at) (gmail.com)> on Friday October 18, 2013 @04:54PM (#45169619) Homepage

      No marketspeak here, but if you're not familiar with the technical details you might be a bit lost. First of all, in order to understand the solution, you need to identify the problem.

      The problem is that, currently, refresh rates are STATIC. For example, if I set mine to 60Hz, the screen redraws at 60fps. If I keep vsync disabled to allow my gfx card to push out frames as fast as it can, my screen still only draws 60fps, and screen "tearing" can result as the screen redraws in the middle of the gfx card pushing out a new frame (so I see half-and-half of two frames).

      Now, let's say my gfx card is pushing out 25fps. Currently the optimal strategy is to keep vsync off, even though this can result in screen "tearing", because at low fps an even bigger problem emerges with vsync on, even though tearing would be fixed.

      Every time my gfx card pushes out a frame, since vsync is on, it waits to ensure it will not be drawing to the screen buffer while the screen is updating. Since it waits, the screen only draws complete frames. So at 60fps the screen updates at 1/60 second intervals, and the gfx card renders at 1/25 second intervals. So, at the beginning of a frame render, the gfx card renders... and the screen redraws twice, and then the gfx card has to wait for the third opportunity to draw before syncing up again. Since it is waiting instead of rendering, I am now rendering at 20fps (since I lose two out of every three refresh opportunities) instead of the optimal 25fps. If I disable vsync, I get tearing, but now 25fps.

      This "G-sync" claims to solve that issue by making refresh rates DYNAMIC. So if my gfx card renderas at 25fps, the screen will refresh at that rate. It will be synchronized. No tearing or gfx card waiting to draw.

      • by AvitarX ( 172628 )

        I think it's even worse than just 20 FPS: some frames will last longer than others, causing jerky motion. It won't be a smooth 20 FPS; it will be 20-ish, with some frames lasting extra draws compared to others.

        • Exactly, and that is especially a problem for the Oculus Rift and other virtual reality headsets that are coming onto the market, because it becomes really noticeable when you move your head quickly. I think that that is what they're mainly targeting here, although according to John Carmack, G-Sync won't work on the Rift [twitter.com]. Anyway, for those interested in the technical details, graphics programming legend Michael Abrash (currently at Valve) wrote an excellent technical piece [valvesoftware.com] about the frame timing issues you
      • by Guspaz ( 556486 )

        G-Sync enforces a 30Hz minimum refresh rate (the monitor will never wait longer than 33ms to refresh, or in the 144Hz demo monitors, it will never wait less than 7ms), so your example wouldn't work, but apart from that, yeah :)
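        Using the figures from that comment, the refresh window can be sketched as a simple clamp on the interval between refreshes (assumed numbers; the real logic lives in the G-Sync module):

        // The interval between refreshes is bounded by the panel's maximum rate
        // (144 Hz here) and a 30 Hz floor, per the comment above.
        #include <algorithm>
        #include <cstdio>

        int main() {
            const double min_interval_ms = 1000.0 / 144.0;  // ~6.9 ms: can't refresh faster
            const double max_interval_ms = 1000.0 / 30.0;   // ~33.3 ms: won't wait longer
            const double frame_times[] = {5.0, 20.0, 50.0};

            for (double frame_ms : frame_times) {
                double refresh_ms = std::clamp(frame_ms, min_interval_ms, max_interval_ms);
                std::printf("frame time %5.1f ms -> refresh interval %5.1f ms\n",
                            frame_ms, refresh_ms);
            }
        }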

      • by Chemisor ( 97276 )

        Currently the optimal strategy is to keep vsync off, even though this can result in screen "tearing"

        No, the optimal strategy is to keep vsync on and throttle your redraws exactly to it. To make it work you must set up an event loop and a phase-lock timer (because just calling glFinish to wait for vsync will keep you in a pointless busywait all the time). Unfortunately, game programmers these days are often too lazy to do this and simply ignore vsync altogether. While this may result in smoother animation, i
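        One possible shape of that event-loop approach, sketched with std::chrono and an assumed 60 Hz period; a real renderer would phase-lock to vblank timestamps reported by the driver rather than to a free-running timer:

        // Estimate the next vblank and sleep until shortly before it, instead of
        // busy-waiting in glFinish.
        #include <chrono>
        #include <cstdio>
        #include <thread>

        int main() {
            using clock = std::chrono::steady_clock;
            const auto period = std::chrono::microseconds(16667);   // assumed 60 Hz refresh
            auto next_vblank = clock::now() + period;

            for (int frame = 0; frame < 5; ++frame) {
                // ... render the frame here ...
                std::this_thread::sleep_until(next_vblank - std::chrono::milliseconds(1));
                // last millisecond: submit the buffer swap so it lands on the vblank
                std::printf("frame %d submitted near vblank\n", frame);
                next_vblank += period;                              // advance the phase
            }
        }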

      • >This "G-sync" claims to solve that issue by making refresh rates DYNAMIC. So if my gfx card renderas at 25fps, the screen will refresh at that rate. It will be synchronized. No tearing or gfx card waiting to draw.

        Well, we already have Adaptive VSYNC (if you have bothered updating your drivers in the last year), which does in fact make your GPU refresh rates somewhat dynamic to avoid the annoying 60 -> 30fps hops.

        G-SYNC looks even better, though. My only worry is that it will be horrendously overprice
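        For context, in OpenGL the driver-level adaptive-vsync behaviour is exposed through the *_EXT_swap_control_tear extensions, where a negative swap interval means "sync when running fast, tear rather than stall when a frame is late". A minimal GLFW sketch, assuming that extension is available on the system:

        // Adaptive vsync via a negative swap interval (requires the
        // WGL/GLX_EXT_swap_control_tear extension; otherwise use 1 for plain vsync).
        #include <GLFW/glfw3.h>

        int main() {
            if (!glfwInit()) return 1;
            GLFWwindow* window = glfwCreateWindow(1280, 720, "adaptive vsync", nullptr, nullptr);
            if (!window) { glfwTerminate(); return 1; }
            glfwMakeContextCurrent(window);
            glfwSwapInterval(-1);             // adaptive: tears instead of stalling on late frames

            while (!glfwWindowShouldClose(window)) {
                glClear(GL_COLOR_BUFFER_BIT); // ... draw the scene here ...
                glfwSwapBuffers(window);      // honours the swap interval set above
                glfwPollEvents();
            }
            glfwTerminate();
        }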

    • by Anonymous Coward

      Slashdot: where anything you don't understand must be bullshit.

      Basically, you know how you think monitors work? They actually don't. But with g-sync, they do.

    • Re:In English (Score:5, Informative)

      by jones_supa ( 887896 ) on Friday October 18, 2013 @05:02PM (#45169707)

      LCD monitors absolutely do not ignore VSync. Let's not forget that the primary function of the VSync signal is to tell the monitor (CRT or LCD) where the picture begins; there's also HSync to break the picture into scanlines. VSync always takes a certain amount of time, during which the monitor "takes a breath" (a CRT will also move the gun back to the top). That is the perfect moment for the GPU to quickly swap its framebuffers in video memory: the "scratch" draw buffer is promoted to the final output image, and the GPU can then begin drawing the next one in the background. The completed image is sent to the monitor in the normal picture signal when the monitor gets back to work drawing a frame. If the buffers are swapped while the monitor is in the middle of drawing a frame, parts of two different frames get shown together, which leads to the video artifact called "tearing".

      If we are good citizens and swap buffers only during the VSync period, we get a nice tear-free (typically 60fps) image. However, if it takes more than the time of one picture (which is about 16ms) to draw the next one, we have to wait a long time for the next VSync, and that means we also slide all the way down to a 30fps frame rate. Now if the game runs fast at some moments but slower at others, the bouncing back and forth between 60fps and 30fps (or even 15fps) makes this annoying jerky effect. NVIDIA's G-Sync tries to solve this problem by making the frame time dynamic.
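      A bare-bones sketch of that buffer-swap dance (toy code, not a real display driver): draw into the "scratch" back buffer, then flip it to become the scanout source during the vblank window.

      #include <array>
      #include <cstdio>
      #include <utility>

      struct Framebuffer { std::array<unsigned, 4> pixels{}; };   // tiny stand-in for video memory

      int main() {
          Framebuffer a, b;
          Framebuffer* front = &a;   // being scanned out to the monitor
          Framebuffer* back  = &b;   // being rendered into

          for (unsigned frame = 1; frame <= 3; ++frame) {
              back->pixels.fill(frame);      // "render" the next image into the scratch buffer
              // ... wait for the vblank period here ...
              std::swap(front, back);        // flip: the finished image becomes the scanout source
              std::printf("vblank %u: front buffer now shows frame %u\n", frame, front->pixels[0]);
          }
      }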

      • Re:In English (Score:5, Interesting)

        by Anonymous Coward on Friday October 18, 2013 @06:35PM (#45170479)
        Well I don't want to be "that guy", but I am "that guy". The real reason for vsync in the days of CRTs is to give time for the energy in the vertical deflection coils to swap around. There is a tremendous amount of apparent power (current out of phase with voltage) circulating in a CRT's deflection coils.

        Simply "shorting out" that power results in tremendous waste. They used to do it that way early on, they quickly went to dumping that current into a capacitor so they could dump it right back into the coil on the next cycle. That takes time.

        An electron beam has little mass and can easily be put anywhere at all very quickly on the face of a CRT. It's just that the magnetic deflection used in TVs is optimized for sweeping at one rate one way. On CRT oscilloscopes they used electrostatic deflection and you could, in theory, have the electron beam sweep as fast "up" as "left to right".

        So why didn't they use electrostatic deflection in TVs? The forces generated by an electrostatic deflection system are much smaller than those of a magnetic system; you'd need a CRT a few feet deep to get the same size picture.

        Ta dah! The wonders of autism!

    • by Anonymous Coward

      So this "G-sync" sounds to me like just a way to throttle the pipeline of the graphics card so it delivers a consistent FPS...

      Actually, it's the inverse: with G-Sync, the monitor's retrace tracks the instantaneous FPS delivered by the game. That way there is no stutter (or tearing) as a result of quantizing the display scans to a pre-determined, arbitrary frame rate.

    • Re:In English (Score:4, Informative)

      by Nemyst ( 1383049 ) on Friday October 18, 2013 @05:25PM (#45169921) Homepage
      Actually, it's the reverse. Instead of forcing the GPU to wait for the screen's refresh rate (as is the case with V-sync), potentially causing some pretty bad frame drops, G-sync makes the monitor wait for the GPU's frames. Whenever the GPU outputs a frame, the monitor refreshes with that frame. If a frame takes longer, the screen keeps the old frame shown in the meantime.

      Remember, V-sync forces the GPU to wait for the full frame's duration, regardless of how long it's taken to render the frame. If the GPU renders the frame in 3ms but V-sync is at 10ms per frame, the GPU waits around for 7ms. Flip side, if the GPU takes 11ms, it's "missed" a frame (lag) and still has to wait 9ms until it can start drawing the next frame. G-sync is supposed to make it so as soon as the GPU's done rendering a frame, it pushes it to the monitor, and as soon as the monitor can refresh the display to show that new image, it will.

      In theory, this could effectively give the visual quality of V-sync (no screen tearing) with a speed similar to straight rendering without V-sync.
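      Working through the parent's 3 ms / 11 ms example (illustrative arithmetic only):

      // With a 10 ms refresh interval, a frame is shown at the next multiple of
      // 10 ms after it finishes and the GPU idles until then; a G-Sync-style
      // scheme shows it (roughly) as soon as it is done.
      #include <cmath>
      #include <cstdio>

      int main() {
          const double refresh_ms = 10.0;            // the comment's example interval
          const double render_times[] = {3.0, 11.0};
          for (double render_ms : render_times) {
              double shown_vsync = std::ceil(render_ms / refresh_ms) * refresh_ms;
              double gpu_idle    = shown_vsync - render_ms;
              std::printf("render %4.1f ms: vsync shows it at %4.1f ms (GPU idles %4.1f ms); "
                          "variable refresh shows it at ~%4.1f ms\n",
                          render_ms, shown_vsync, gpu_idle, render_ms);
          }
      }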
    • You sync the panel's refresh rate to the application's.
      Say a frame gets delayed (40 fps instead of 60 fps, for instance); traditionally, the monitor is blissfully ignorant of that fact and just refreshes with whatever it is given when the time comes.
      Nvidia's solution is to have the GPU signal the LCD's controller, telling it when to refresh. This allows the monitor to refresh when the frame is done rendering, instead of at a fixed point in time.

      It's essentially a method for allowing the panel to be refreshed on c

    • by Anonymous Coward

      This technology will do 3 things:

      1) Reduce input lag to the lowest possible delay due to each frame being displayed immediately on the screen. With standard fixed refresh rate displays, there is almost always a delay between a frame being generated and being displayed on the screen and the delay is not constant.

      2) Remove the need for vsync to eliminate screen tearing. Since the monitor's refresh cycles are controlled by the GPU, it can be guaranteed to avoid tearing without requiring the GPU to render fra

    • by Anonymous Coward

      Oh, slashdot. Yet another ignorant "girlintraining" post modded up to 4+ interesting/informative/etc. for no discernible reason.

      LCDs do not ignore vsync. They have never ignored vsync. How on earth did you get the idea that they ignored vsync? Same comment re: "bitblt the next frame directly to the display without any delay". The "next frame" isn't even in the display, you buffoon. It's either not computed yet, or sitting in buffers on the video card. The display can't magically pull those bits out o

    • by Hsien-Ko ( 1090623 ) on Friday October 18, 2013 @10:44PM (#45171965)
      It seems strange that they didn't call it nSync....
    • by AC-x ( 735297 )

      'vsync' doesn't have any meaning anymore; LCD monitors just ignore it and bitblt the next frame directly to the display without any delay.

      Ah, but vsync does still have meaning because LCD monitors essentially emulate it. The video feed is still sent to the monitor as if it were a CRT - sequentially top to bottom left to right at a set frequency. If a game finishes drawing a frame while the video stream is still halfway down the screen then you get a tear because the display frequency is fixed.

      What this technology seems to do is allow the graphics card to send a complete frame to the monitor then tell the monitor to display it straight away. W

    • just turn off v-sync in the card settings. Doesn't screw things up if you have an LCD. Hell I notice no difference as my display only handles 59 FPS (max refresh rate) anyhow.

  • Finally (Score:5, Interesting)

    by Reliable Windmill ( 2932227 ) on Friday October 18, 2013 @04:37PM (#45169431)
    Now we just wait until they finally figure out how to employ a smarter protocol than sending the whole frame buffer over the wire when only a tiny part of the screen has changed. It would do wonders for APUs and other systems with shared memory.
    • On one hand, I see where this could be a good idea. On the other hand, I kind of like dumb displays. Stuff like displays and speakers should really just display/play the signal sent to them. They should be as simple as possible, because they are expensive, and this allows them to last for longer and be cheaper. If TVs were smart, I would probably have to upgrade my TV every time they came up with a new video encoding standard. Luckily, TVs just understand a raw signal, and I can much more easily upgrade
    • by lgw ( 121541 )

      A huge part of good remote desktop protocols is just that! Keeps the bandwidth down. If your graphics card could know "for free" that all changes were in a given rectangle, and I bet it often could, that doesn't even sound hard to do.

    • by Richy_T ( 111409 )

      This was my thought. But typically, where it would make a difference, the whole screen is probably changing anyway. I can still see some advantage to that though.

      I guess ultimately, the GPU(s) should be in the monitor and the PCIe bus would be the connection. It appears there's no defined cable length so it would probably require a standards update.

    • by tlhIngan ( 30335 )

      Now we just wait until they finally figure out to employ a smarter protocol than sending the whole frame buffer over the wire when only a tiny part of the screen has changed. It would do wonders for APUs and other systems with shared memory.

      They exist - they're called "smart" LCD displays and are typically used by embedded devices. These maintain their own framebuffer, and the LCD controller sends partial updates as it needs to then shuts down. It saves some power and offloads a lot of the logic to the scre

      • There is a standard: MIPI.
        All MIPI displays accept partial updates and use a local framebuffer.

    • Now we just wait until they finally figure out to employ a smarter protocol than sending the whole frame buffer over the wire when only a tiny part of the screen has changed.

      Wouldn't that depend on whether it's faster to just send the entire framebuffer over the wire, or do a pixel-by-pixel compare between the current framebuffer with the previous one to figure out which parts have changed?

      This sort of streaming compression makes sense when bandwidth is limited, like back when people used dialup to acce

      • You wouldn't do a naive pixel-by-pixel compare, the operating system would tell the driver what to update since it directs all screen changes and always knows what has changed. It's not about the bandwidth between the screen and the display device, but the memory bandwidth consumed from the video RAM, since the display device currently reads the screen memory 60 times per second even if there's nothing currently happening. By only sending the parts that have changed, you free up memory bandwidth for any ot
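        A toy version of that "only send what changed" idea, with hypothetical structures rather than any real display protocol: accumulate the dirty rectangles the OS reports and upload just their bounding box.

        #include <algorithm>
        #include <cstdio>
        #include <vector>

        struct Rect { int x0, y0, x1, y1; };

        int main() {
            std::vector<Rect> dirty = { {100, 100, 228, 164},    // e.g. a blinking cursor
                                        {0, 1040, 300, 1080} };  // e.g. a clock in the taskbar
            // Bounding box of everything that changed this frame.
            Rect box = dirty.front();
            for (const Rect& r : dirty) {
                box.x0 = std::min(box.x0, r.x0);  box.y0 = std::min(box.y0, r.y0);
                box.x1 = std::max(box.x1, r.x1);  box.y1 = std::max(box.y1, r.y1);
            }
            long long dirty_bytes = (long long)(box.x1 - box.x0) * (box.y1 - box.y0) * 4;
            long long full_bytes  = 1920LL * 1080 * 4;
            std::printf("upload %lld bytes instead of %lld (%.1f%% of a full 1080p frame)\n",
                        dirty_bytes, full_bytes, 100.0 * dirty_bytes / full_bytes);
        }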
  • I would feel pretty good about this if it were being proposed as some sort of standard, but from the blurb, it looks like a single-vendor lock-in situation. You will need an Nvidia graphics card to make it work, but your monitor will also need an Nvidia circuit board to regulate the framerate. The only value of this kind of variable framerate technology is for gaming. This means that the needed circuitry will appear only in monitors that are meant specifically for gamers. This means that they will be segmen
    • by Nemyst ( 1383049 )
      Interestingly, Nvidia will be providing the G-sync chips by themselves [anandtech.com], allowing people to mod their monitor to install the chip on them. I'm not sure just how compatible this would be, but it might allow you to upgrade your existing monitors with G-sync support or get someone to do it for you, depending on your capabilities and willingness to risk your monitor.
      • I don't see anything about them selling chips to end users, just stuff about them selling upgrade modules. I guess each module will be specific to one make/model of monitor and will require cooperation of the monitor manufacturer to produce.

        • You mean this quote? "Initially supporting Asus’s VG248QE monitor, end-users will be able to mod their monitor to install the board, or alternatively professional modders will be selling pre-modified monitors."

    • I know that lots of PC gamers now use big LCD televisions as their desktop monitors

      When did this come to be the case? A few years ago, people were telling me that almost nobody does that [slashdot.org].

      • You're confusing HTPCs and using panels designed as TVs for computer monitors. We're talking about people who stick a 32" monitor (or larger) on the wall in front of their desk in their office, vs putting a computer under the TV in your living room. While the components are the same, the ergonomics are different.
      • A few years ago

        An important detail there. Back then, as I recall, HD TVs (1280×720), or even lower res, were very common. While that's okay for watching movies or TV shows from the couch, that's awful for a large screen sitting within arm's reach. And even now, most TVs are only Full HD (1920×1080), no matter the size, while computer monitors often go higher; 27" monitors at WQHD (2560×1440) are getting quite popular, I heard.

  • G-sync (i.e. sync originated by the graphics card) seems like a good idea.
    It:
    allows single or multiple graphics cards within a computer to emulate genlock for multiple monitors, so that the refresh rates and refresh times of those monitors interact properly
    allows for the synchronization of frame rendering and output, i.e. reducing display lag which is important for gamers and realtime applications.
    allows for a graphics card to select the highes

    • i.e. sync originated by the graphics card

      Sync has always been originated by the graphics card so no special assistance from the monitor would be needed to lock the framerates and timings of multiple monitors together.

      The problem is that traditionally monitors don't just use the sync signals to sync the start of a frame/line, they also use them as part of the process of working out what geometry and timings the graphics card is sending. Furthermore some monitors will only "lock" successfully if the timings are roughly what they expect. So you can't

  • It's over 9000! [youtube.com]

    (Oblig.)

  • ... When movies are shown at 24fps and the motion still seems fluid?
    • Because skilled directors and camera operators have learned, over the last 100 years of movie-making history, which kinds of camera movement work, and they painstakingly avoid those that don't work at low framerates.

    • Because of this: http://en.wikipedia.org/wiki/Motion_blur [wikipedia.org]

      • by mark-t ( 151149 )
        So why doesn't adding motion blur to a 24fps video game feel much better?

        Not saying it doesn't make any difference, but why doesn't adding motion blur to a 24fps game look as good as a 120fps game?

        • Because games use shitty motion blur effects.
          There is a difference between a shitty simulation of motion blur and the real thing recorded with a camera.

    • If you freeze a movie frame shot at 24fps you'll see that moving objects are blurry. And in a fast pan it still looks anything but fluid.

    • by jandrese ( 485 )
      Because you have been trained by films for your entire life to think that blurry stuttery 24fps is smooth and cinematic. If you ever watch a movie where lots of action is happening on the screen at once, you'll probably get slightly lost because everything turns into an unrecognisable blur. For an example of this, watch any of the Michael Bay transformers movies and try to figure out which Transformer is on the scene during any random action shot.

      Unfortunately upgrading to 60fps actually causes people t
      • by Miamicanes ( 730264 ) on Saturday October 19, 2013 @01:49AM (#45172817)

        Actually, the problem is even bigger. Somewhere around 200fps, you start flying into "uncanny valley" territory. 200fps is faster than your foveal cones can sense motion, but it's still less than half the framerate at which your peripheral rods can discern motion involving high-contrast content. When it comes to frame-based video, Nyquist makes a HUGE mess thanks to all the higher-order information conveyed by things like motion blur. That's why so many people think 24fps somehow looks "natural", but 120fps looks "fake". Motion-blurred 24fps video has higher-order artifacts that can be discerned by BOTH the rods AND the cones equally. It's "fake", but at least it's "consistent". 120fps video looks flawless and smooth to the cones in your fovea, but still has motion artifacts as far as your peripheral rods are concerned. Your brain notices, and screams, "Fake!"

        • I've never met anyone who thought 120fps motion looked fake. I've heard them say it looks un-movie-like but not fake.

          Run a 120fps video of a landscape on a picture frame and it looks a lot like a window, it does not look like a movie, but it does look real.

          • Sit closer, so the screen completely fills your field of vision and immerses you in the image. Your opinion will probably change.

            If you sit back from the screen, you're using foveal cones to watch it. It's the rods along your vision field's periphery that cause the problems.

            The "uncanny valley" problem affects mainly immersive videogames where you're either sitting really close to the screen, or have additional screens off to the side that are viewed mainly with peripheral vision.

            This is a problem people in

            • I already spend most of my video watching time in front of a 103" DLP projection screen at 10' ... and I prefer IMAX high frame rate to low frame rate films because the jitter drives me nuts on lower frame rates.

  • Something the summary fails to point out is this will not work with existing LCD monitors. The monitors will have to have special hardware that supports G-Sync.

    Standard LCD monitors and TVs update the pixels the same way old CRTs do. They start from the top and update line by line until they reach the bottom.

    It is actually a little surprising they haven't done something like this for phone and laptop screens yet. The only thing that stopped them from doing it with the first LCDs was compatibility wi
  • Given how few CRT monitors there are in the wild (let alone on those computers that are running new hardware), I'm not sure why the CRT vs LCD distinction was noteworthy.

    • by TyFoN ( 12980 )

      I have friends that play CS on CRT monitors for the higher refresh rates you get on them.
      I guess those are the individuals that are most interested in this :)

        • Ouch. I don't envy them. Of course, ditching 21" CRTs for 27" LCDs didn't really save me any physical desktop space. Clutter multiplies to fit its container.

    • Because without mentioning CRTs it's hard to explain why things are the way they are right now.

      Also, CRTs do still exist and have their place.

  • by fa2k ( 881632 )

    Almost spilled my coffee there, NVidia and VSync in the same sentence? The nVidia linux driver has tearing artifacts on video almost no matter what you do, it's ridiculous. VLC, Dragon player, Totem, all have obvious tearing. mplayer looks better if you disable compositing and turn off all but one monitor, but still has some tearing if you look closely. I just tried xbmc yesterday, and it may be good.

    Anyway, "GSync" seems like a good idea. Seems nice for videos with different refresh rates, like displaying

  • Hello,

    Here's a high speed video of an LCD refreshing:
    http://www.youtube.com/watch?v=hD5gjAs1A2s [youtube.com]

    This includes regular LCD refreshing modes, as well as motion-blur-eliminating LightBoost strobe backlight modes (which allow LCDs to have less motion blur than some CRTs).

    Mark Rejhon
    Chief Blur Buster

  • I've been expecting this ever since we brought out DVI-D and then HDMI and DisplayPort. I'm in fact a little shocked it's taken this long. It's really a simple concept: when the frame buffer is ready to be drawn, tell the monitor to refresh with that data, then work on the next frame. In fact, that's exactly how people think video output works already in most cases, but it's not.
