The State of Linux Graphics

jonsmirl writes "I've written a lengthy article covering what I learned during the last two years building the Xegl display server. Topics include the current X server, framebuffer, Xgl, graphics drivers, multiuser support, using the GPU, and a new display server design. Hopefully it will help you fill in the pieces and build an overall picture of the graphics landscape."
  • ATI Drivers (Score:5, Interesting)

    by GecKo213 ( 890491 ) on Wednesday August 31, 2005 @09:22AM (#13445397) Homepage
    I just want an ATI driver that will work in full screen mode with my Dell laptop. Too much to ask, maybe, but I'm making do just fine with what I've got. (Fedora FC4 w/ Enlightenment)
    • Frankly, I'd like to see an unbroken, non-system-lagging ATI driver at all, whatever the platform... but maybe that's just me.
      • One that could play the GNOME logout fade animation without stuttering is all I want. Why oh why did I buy a laptop with an ATI X700?
        • Re:ATI Drivers (Score:3, Interesting)

          by MattBurke ( 58682 )
          At least Linux users get the occasional driver from ATI. ATI's idiocy means users of other OS's like the BSDs are well and truly stuffed. How I wish I'd bought an Nvidia-powered laptop...
    • I'm using Fedora Core 4 (with Xorg) on my Dell 600M laptop.

      It has a Mobility Radeon 9000 and I haven't had any trouble using it (hardware accelerated) with the default Fedora drivers. OpenGL is a little slower than with ATI's drivers, but it's not enough to make games look bad.
  • by Anonymous Coward on Wednesday August 31, 2005 @09:32AM (#13445484)
    I'm sure you guys remember the Looking Glass demo that Sun showed us a year or so back. This article mentions Project Looking Glass, but only explains what it does. Did Sun just throw this up for publicity? Are they ever going to open-source it or make it more widely available?
    Linuxgangster.org [linuxgangster.org]
  • lkml discussion (Score:5, Informative)

    by slavemowgli ( 585321 ) on Wednesday August 31, 2005 @09:33AM (#13445490) Homepage
    There's also a discussion about this on the linux-kernel mailing list (lkml) currently - certainly worth reading:

    http://marc.theaimsgroup.com/?t=112541793700006&r=1&w=2 [theaimsgroup.com]
  • by Anonymous Coward on Wednesday August 31, 2005 @09:34AM (#13445498)
    Interesting read. I'm quite happy with my nvidia proprietary drivers as long as Cairo/glitz/whatever will use them to make my desktop more responsive.

    I hate to see a lousy-looking window drag (i.e. sloow update) when I know I have a professional video card. It really bugs me, and this is the reason I also bought an iBook (well, besides the BSD inheritance).
    • by Trigun ( 685027 ) <evil&evilempire,ath,cx> on Wednesday August 31, 2005 @09:42AM (#13445554)
      ...and this is the reason I also bought an iBook (well, besides the BSD inheritance).

      With BSD Dying, you should be collecting that inheritance shortly.
    • After reading this excellent piece of writing, it's obvious (maybe not to the author) why his Xgl project failed to get acceptance. The GNU community does not want to move forward on this until there are open drivers to support the OpenGL desktop... and there is the problem. We will be years behind MS and Apple because they lack such requirements.

      If Nvidia and ATI released their specs today, Linux would still be at least 2 years behind Windows and OS X - that's how long it would take to make drivers from

  • A little OT, but... (Score:5, Interesting)

    by Anonymous Coward on Wednesday August 31, 2005 @09:35AM (#13445503)
    Here's a demo of a hacked version of KDE running on an Xgl server

    http://rapidshare.de/files/4553011/xgl_wanking.avi.html [rapidshare.de]

    Demoed at aKademy 2005, KDE's developers conference.

    According to the developer, this is on a 4-year-old notebook running ATI hardware. Quite impressive.
  • Thanks Jon! (Score:5, Interesting)

    by IamTheRealMike ( 537420 ) on Wednesday August 31, 2005 @09:35AM (#13445505)
    I'd just like to say thanks to Jon Smirl for writing this. I've been following X development for some time on the various mailing lists and so on, but for an outsider looking in it's nearly impossible to get an accurate picture of what's happening and which bits do what, let alone what people's plans are.

    I think it's a crying shame Jon has stopped working on Xegl - we can only hope others will pick up from where he left off. It looks like Linux graphics is going to go through a series of halfway steps before arriving at fully OpenGL-accelerated graphics: EXA-based drivers first to speed up RENDER-based graphics, then Xglx running on top of an existing X server to utilise its mode-setting and input code, then finally Xegl, which eliminates the existing X server entirely in favour of a new one that pipes all its drawing directly into the 3D pipeline.

    Question is, how long will it take?

    • Re:Thanks Jon! (Score:5, Insightful)

      by Anonymous Coward on Wednesday August 31, 2005 @09:54AM (#13445650)
      I'm posting anonymously because I don't want to seem like a suck-up - rest assured, though, that I am not Jon :)
      I think it's a crying shame Jon has stopped working on Xegl
      I think it's a crying shame that no one (e.g. Red Hat, Novell, IBM, etc.) stepped up to sponsor such an intelligent and capable guy, even with just a living wage (although I'm glad that Novell hired Reveman, at least) - and the same goes for drobbins. IBM in particular has damn-near bottomless pockets for R&D, and I bet they hire legions of lesser-skilled workers doing more menial jobs. Could they not have spared the budget equivalent of one extra support-monkey for such an exceptional talent? It boggles the mind, quite frankly.
  • by b100dian ( 771163 ) on Wednesday August 31, 2005 @09:37AM (#13445519) Homepage Journal
    My conclusion is that most people don't really know what is going on with graphics in Linux. It's understandable that people don't see the whole picture. Graphics is a large and complex area with many software components and competing developer groups.
    But it really should be like.. at most 3 indirections:

    the toolkit --> the X server --> and the driver/hw!!
    When I saw this (App->gtk+->Cairo->XRender->Xgl->GLX(X)->GL->hw) it blew my mind..

    ...now just lemme read that X server RFC...
    • by IamTheRealMike ( 537420 ) on Wednesday August 31, 2005 @09:47AM (#13445595)
      Well, it's not as bad as it looks.

      App to GTK+ is just some function calls and data structure manipulation. Like on any OS widget toolkit.

      GTK+ to Cairo is the same: Cairo and GTK+ are both shared libraries. Cairo takes drawing instructions from GTK+ and translates them into low-level primitives that map directly to the XRENDER protocol.

      XRENDER is just a wire format - a way to tell the X server what to do.

      Xgl is an X server. You need a single entity controlling video hardware, otherwise things get complicated very fast. Existing GL drivers don't like being used by lots of apps at once as they were built primarily for games. By centralising control of the hardware you can optimise things and deal with existing hardware/drivers.

      GLX->GL->hw - this is only temporary until enough infrastructure has been integrated into the kernel to obsolete the existing X server.
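
      To make that layering concrete, here's a minimal sketch of what the top of the stack looks like from the application side, using the standard cairo-xlib API (the window size and the rectangle are made up for illustration). Everything drawn through the cairo_t goes out on the wire as RENDER requests to whichever X server the app is talking to:

      /* Minimal sketch: a plain Xlib window with a Cairo context on top of it. */
      #include <unistd.h>
      #include <X11/Xlib.h>
      #include <cairo.h>
      #include <cairo-xlib.h>

      int main(void)
      {
          Display *dpy = XOpenDisplay(NULL);
          if (!dpy)
              return 1;

          int scr = DefaultScreen(dpy);
          Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0, 200, 200, 0,
                                           BlackPixel(dpy, scr), WhitePixel(dpy, scr));
          XMapWindow(dpy, win);
          XSync(dpy, False);

          /* cairo-xlib backend: the Cairo calls below become XRENDER requests. */
          cairo_surface_t *surface =
              cairo_xlib_surface_create(dpy, win, DefaultVisual(dpy, scr), 200, 200);
          cairo_t *cr = cairo_create(surface);

          cairo_set_source_rgb(cr, 0.2, 0.4, 0.8);   /* high-level drawing calls... */
          cairo_rectangle(cr, 20, 20, 160, 160);
          cairo_fill(cr);                            /* ...translated by Cairo */

          cairo_surface_flush(surface);
          XFlush(dpy);
          sleep(2);                                  /* keep the window up briefly */

          cairo_destroy(cr);
          cairo_surface_destroy(surface);
          XCloseDisplay(dpy);
          return 0;
      }

      Something like "cc demo.c $(pkg-config --cflags --libs cairo-xlib x11)" should build it, assuming a Cairo build with the xlib backend and its pkg-config file installed.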

    • by serialdogma ( 883470 ) <black0hole@gmail.com> on Wednesday August 31, 2005 @09:57AM (#13445685)
      >the toolkit --> the X server --> and the driver/hw!!
      >When I saw this (App->gtk+->Cairo->XRender->Xgl->GLX(X)->GL->hw) it blew my mind..
      Well, it is really more like this:

      App->toolset->Cairo  ->  XRender->Xgl  ->  GLX(X)->GL->hw
      [ application ]          [ X server ]      [ OpenGL drivers / hardware ]

      It is in fact the 3-part system that you think it should be; however, it has (for many different reasons) split up into subparts.

      Like, if I were to show you the way a file system works, I might draw:
      App->API->driver->hardware
      when it is more like:
      App->API->filesystem driver->device driver framework->PCI bus driver->PCI-to-IDE controller driver->disk driver
      It is still the same 4 parts as shown in the first, but this (2nd one) is more detailed.
      And as this is "News for nerds", surely we should crave the more detailed account.
    • App->gtk+->Cairo->XRender->Xgl->GLX(X)->GL->hw

      That's exactly why you don't want to use Xglx. You want to use direct rendering, which looks like:

      App -> gtk+ -> Cairo -> glitz -> GL -> hw
  • by minginqunt ( 225413 ) on Wednesday August 31, 2005 @09:37AM (#13445521) Homepage Journal

    Two years ago at FOSDEM, the Xorg fork had just occurred, and there was much excitement. Maybe this time, free from the shackles of the X Consortium and XFree86, X would actually improve to the point where we could be proud, and snicker at our Mac OS X-using chums and say "Why can't Quartz do this then, eh?"

    Unfortunately, the way I read this article is:

    1) Linux Graphics is a bloody mess.

    2) X is still an embarrassment, five years behind (at least) what Quartz and Avalon are capable of.

    3) Nobody has the time, manpower or inclination to fix it.

    Ah tits.

    Ten years ago, we were having the discussion about X being b0rken. In ten years' time, will we still be having this discussion?

    Plus ça change...?

    Actually I am still excited about X's future. Yes, X development stagnated pretty badly under XFree86. But things are moving along nicely now that X development is being conducted at X.org.

    The state of Linux Graphics isn't a mess. The controversy this article caused on LKML shows that many people are talking and working together and feel that things are improving. It may not be close to what Quartz is capable of yet. But it is still moving the right way.

    The Big Iron vendors let X stagnate because they never seemed to understand the desktop space. Stupidly, they let Bill and his minions stroll in and take it over before they really had any chance to grasp what a mistake they'd made.

    Then XFree86 let X stagnate further, thinking of itself as some exclusive Gentleman's Club.

    Fortunately, the foundations of X are right. Simple, modular, highly extensible. If there's one thing the Unix Way gets right, it's simple, modular and extensible.

    Now, perhaps, X has finally space to really thrive and grow.

    I reckon Slashdot will still be having "X Suxx0rs!!!" flamewars in 10 years. I hope also that those trolls will be even more wrong than they are now.

    Perhaps my terminal optimism is sweetly naive, but I sincerely hope and expect X to go from being "just-about-ok" now to leaving Mac OS X smoking dead in the dust in the next few years.
    • by kahei ( 466208 ) on Wednesday August 31, 2005 @09:41AM (#13445545) Homepage

      I reckon the Slashdot will still be having "X Suxx0rs!!!" flamewars in 10 years.


      It's not a flamewar if everyone agrees :D

      • Well, having seen a lot of the other comments on this story, I'd have to say a lot of people don't agree. Personally, I think X blows, and it drags most of the Linux world down with it. It's an ancient idea that was poorly implemented in the first place. Pretty much every commercial offering has a better windowing server than X.

        Actually, that might be a good idea... how about someone creates a really good commercial windowing system for those poor souls who have to use X every day? I'd love to have somethin
        • by Coryoth ( 254751 ) on Wednesday August 31, 2005 @10:50AM (#13446167) Homepage Journal
          Actually, that might be a good idea... how about someone creates a really good commercial windowing system for those poor souls who have to use X every day? I'd love to have something with the quality of Avalon or Aqua on Solaris. That would be fantastic!

          It's all a matter of what level of graphics architecture you're talking about, though. In many ways X is simply a matter of how you draw graphics to the screen, how you access the hardware. You're, for some reason, comparing it with Aqua and Avalon. In practice X is more comparable with Quartz and GDI, which Aqua and Avalon sit on top of.

          You want something comparable? Then try looking at GTK sitting on top of Cairo. Cairo provides the same sort of drawing abstraction and interface that Quartz offers, the same sort of thing Avalon offers. It also has multiple backends, so if you work in Cairo you can display on X, Quartz, Windows, or in print via PDF or Postscript. You can use Cairo accelerated over OpenGL. In terms of ease of programming, Cairo offers a nice graphics API of various drawing commands. If you want a GUI interface (as in Aqua or Avalon (I think - I'm still a little unclear on what all Avalon exactly entails)) then you'll want a toolkit to expose an interface there. Something like GTK is being converted to run on Cairo (the latest version of GTK uses Cairo for some of its rendering already).

          It's there in Free software, though it is still young. It provides a lot of what you're looking for, and X doesn't matter a bit - X is just how you draw to screen... and in a conveniently network-transparent way. X doesn't necessarily suck, but a good graphics stack in Free software is certainly fairly young right now. The need is fairly new as well, though... the desktop was not something that was much of a focus (everyone kept saying the desktop wasn't viable). It is coming along, though.
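
          To illustrate the "multiple backends" point, here's a minimal sketch (assuming a Cairo build with the PNG and PDF backends enabled; the filenames are made up): the same draw() routine is pointed at an in-memory image surface and then at a PDF surface, and only the surface-creation call changes. The xlib and glitz backends work the same way.

          /* Same Cairo drawing code, two different backends. */
          #include <cairo.h>
          #include <cairo-pdf.h>

          static void draw(cairo_t *cr)
          {
              cairo_set_source_rgb(cr, 0.8, 0.2, 0.2);
              cairo_arc(cr, 100, 100, 80, 0, 2 * 3.14159265);
              cairo_fill(cr);
          }

          int main(void)
          {
              /* Backend 1: in-memory image surface, saved as a PNG. */
              cairo_surface_t *img = cairo_image_surface_create(CAIRO_FORMAT_ARGB32, 200, 200);
              cairo_t *cr = cairo_create(img);
              draw(cr);
              cairo_surface_write_to_png(img, "out.png");
              cairo_destroy(cr);
              cairo_surface_destroy(img);

              /* Backend 2: PDF surface - identical drawing code, print-ready output. */
              cairo_surface_t *pdf = cairo_pdf_surface_create("out.pdf", 200, 200);
              cr = cairo_create(pdf);
              draw(cr);
              cairo_show_page(cr);
              cairo_destroy(cr);
              cairo_surface_destroy(pdf);

              return 0;
          }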

          Jedidiah.
    • Perhaps my terminal optimism is sweetly naive, but I sincerely hope and expect X to go from being "just-about-ok" now to leaving Mac OS X smoking dead in the dust in the next few years.

      Never happen.

      Oh, X as a protocol/platform may conceivably support Quartz functionality and beyond (Display SVG?), but in terms of the interface, I don't see any free WM in the same ballpark as OS X in terms of usability. KDE comes closest to Quartz in theory (frameworks, OO, DCOP, KParts), but there is still only a nascent appr
    • Perhaps after Sept 30 things may start to change. A stable Render and Composite will be nice.
    • The problem is that your optimism is unfounded, especially your last sentence, as history has not shown that to be even close to true. This has never happened and will never happen between Apple and Linux. Hell, even OS/2 4.0 is still years and years ahead of Linux, and OS/2 4.0 is going on 10 years old now. Apple will always be ahead of Linux, even 10-20 years from now. Apple always has been and always will be, unless something drastic happens.

      To reinforce my point, the major drawback to Linux is simply 'death
      • Where would Apple be right now if it wasn't for Steve Jobs? Where would Microsoft be without Bill Gates? Exactly. What Linux needs is for one company and/or person to do the same thing. Otherwise, Linux will always be 2nd or 3rd to something else.

        I migrated to Linux precisely because it was free from Bill Gates, Steve Jobs, and anyone else's domination. The whole appeal of Linux is that you can have your OS your way, not how Bill Gates or Steve Jobs wants you to have it. Can Linux improve? Yes, of cour

      • Too many people wanting too many different things and nothing gets done. And what does get done is usually only half-assed in its implementation.

        As far as X is concerned, I personally have some hope that the eventual modularization of the X system MIGHT help with this. Right now, X is still one gigantic project and therefore (I presume) run by one gigantic virtual committee of developers. If they can manage to modularize it, it might at least reduce the size of the committees to separate groups of people

      • by cahiha ( 873942 ) on Wednesday August 31, 2005 @04:51PM (#13449308)
        To reinforce my point, the major drawback to Linux is simply 'death by committee'.

        I have seen this phrase popping up from Mac advocates over and over recently; it seems to be the latest marketing meme from Apple.

        In fact, nothing could be further from the truth. Linux isn't designed by committee or anybody else; Linux isn't even an operating system in the sense of OS X, it's a family of operating systems. And what goes into those systems is shaped by market forces and user choice.

        Windows and OS X are designed by little self-appointed elites inside Microsoft and Apple; if anything is "designed by committee", it's those systems. Whether that's a good thing is debatable. I believe more in the power of market forces and evolution than despotism, but your preferences may differ.

        What Linux needs is for one company and/or person to do the same thing.

        There are companies that are doing just that. Have a look at Ubuntu and Linspire, for example.

        Otherwise, Linux will always be 2nd or 3rd to something else.

        Given Apple's checkered history and modest market share, it doesn't seem like Apple ought to be the model to go for. In any case, we'll take your advice for what it's worth.
    • Perhaps my terminal optimism is sweetly naive, but I sincerely hope and expect X to go from being "just-about-ok" now to leaving Mac OX smoking dead in the dust in the next few years.

      I agree. I think we're reaching a level of critical mass where the X developers actually are seeing limitations with X on the desktop. Historically, many of the X developers were either embedded guys or server/cross-network guys, and things they found to be problems got fixed rapidly.

      Personally I've never had problems with X
    • by cahiha ( 873942 ) on Wednesday August 31, 2005 @04:27PM (#13449114)
      Unfortunately, the way I read this article is:
      1) Linux Graphics is a bloody mess.


      And you think other window systems aren't? Apple tried to redo MacOS multiple times, until they eventually gave up and bought NeXT. Microsoft tried GDI+, then Avalon, and both have had big problems. Reengineering large amounts of code and augmenting interfaces that have been in use for two decades is simply a hard task. Unlike both Apple and Microsoft, which have solved the problem by starting over (and maintaining old versions for compatibility), X11 has managed to evolve.

      2) X is still an embarrassment, five years behind (at least) what Quartz and Avalon are capable of.

      Quartz didn't even really exist five years ago, it got limited 3D hardware acceleration only recently, and even today, most of it isn't hardware accelerated by default. If you really want a Quartz-like graphics subsystem under X11, there have been multiple implementations of DPS for X11 around for years; it's no coincidence that Linux desktop developers have chosen not to use them.

      And Avalon? Avalon has been delayed over and over again. Eventually, it may give you about what Firefox and several other systems already give you on Linux. With Avalon, Microsoft is years behind, not years ahead of, the state of the art.

      Now, perhaps, X has finally space to really thrive and grow.

      X has thrived and grown since its beginning, despite people badmouthing it. See, unlike the stuff Apple or Microsoft put out back then, X11 has actually survived this long, and that's because it works and it can be adapted.
  • by LegendOfLink ( 574790 ) on Wednesday August 31, 2005 @09:43AM (#13445562) Homepage
    The WinXP user in me takes graphics and GUI for granted. You turn on your PC and it just works, no matter what.

    But when I run Linux, that isn't necessarily true. I've run Red Hat, Mandrake, Fedora, and just last week, Kubuntu. It's always "just worked" for me, until I installed Kubuntu. I threw it on an old IBM laptop, and I couldn't connect to the X server for the life of me. Well, after several hours spent on Google Groups, I finally found the solution: my .conf file had the wrong PCI bus address.

    After fixing that, all worked wonderfully! Any of you who know X well enough to be able to do anything with it, props to you. Especially those developers who made it possible to just throw an install CD into a PC and have it automatically detect all the drivers AND set up X correctly. Very cool.
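
    For anyone hitting the same thing, the relevant bit is the BusID line in the Device section of /etc/X11/xorg.conf. A sketch (the identifier, driver and bus address below are placeholders - lspci tells you the real address for your card):

    Section "Device"
        Identifier "Videocard0"
        Driver     "radeon"        # or whatever driver your card uses
        BusID      "PCI:1:0:0"     # must match the card's actual bus:device:function
    EndSection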
  • Y'know... (Score:3, Insightful)

    by Otter ( 3800 ) on Wednesday August 31, 2005 @09:44AM (#13445568) Journal
    Just like discussions of Linux sound server issues underscore that the real problem is that it's insane that the user of a desktop operating system ever encounters something called a "sound server"...

    This is a very well-written, comprehensive discussion that I look forward to reading through thoroughly. But I can't help being pessimistic about how this Frankenstein is going to keep adding new pieces without a central authority to enforce a consistent plan.

    • Re:Y'know... (Score:3, Interesting)

      by MsGeek ( 162936 )
      KDE needs to put a stake through the heart of artsd...it's a fsckn disaster.

      C'mon...alsa/jack for everything!!!
      • KDE 4 will not have artsd.

        It is too big a change to do in the KDE 3 series.

        artsd has some good features, and it will be a shame to lose them, but it had some problems too. One I've noticed is a lag of a second or two in XMMS when you start a song, and sometimes the connection to it would get hosed and XMMS would just say something like "unable to connect" or some such generic error, and I'd have to restart KDE sometimes.

        There are supposedly other technical problems as well.

        The big non-technical problem is that the autho
        • Re:Y'know... (Score:3, Informative)

          by msh104 ( 620136 )
          KDE will have KDEMM (the KDE multimedia layer), and that layer will then allow you to select a backend. And one of them will be aRts...

          but it can be left out. :)
  • by Anonymous Coward on Wednesday August 31, 2005 @09:46AM (#13445585)
    There are lots of issues that need to get resolved regarding X and Unix/Linux. The biggest one I've seen is that the developers are super focused on everything being GPL all the way down to the driver level. Here's an example: I have a SiS 650; it uses the SiS 315 chipset. Currently there is no 3D driver available in X.

    But when I started to dig further into why the SiS 315 wasn't supported, I found out that the SiS 315 was the basis for all of SiS/XGI's new chipsets and included all kinds of new IP and register information/locations, and therefore datasheets could not be released to create an open driver. OK, that is reasonable. So I asked if I could view the datasheets. After signing an NDA I received all chipset datasheets within 2 weeks and an internal chip development contact. SiS/XGI was more than happy to work with me to get things to run under Linux/Unix, but their hands were just tied about releasing the specs as open. Also, they don't have the technical resources to create an X driver.

    Why can't a binary driver be accepted? I understand the implications. But seriously there are times when you need to look at the bigger picture.

    My rant is done...
    • A very nice rant indeed, I'd mod you up if I had points.

      My answer to your question

      Why can't a binary driver be accepted?

      is that if it were accepted, it couldn't be part of a "free" operating system anymore.

      What most people understand is that you can't have a totally free operating system if it runs on proprietary hardware. You need to set your priorities: do I want a free operating system, or just an alternative to Windows/OS X? In the first case, you _need_ to buy hardware from vendors that com
    • I mostly agree... it basically depends on what the goal of Linux is, which varies depending on who you ask. It is my (probably uninformed) opinion that if Linux is ever to be accepted by big business as well as non-technical users, then some concessions, like Binary Drivers, have to be made. I don't think you can say to someone like my mom or dad that they should use Linux, and, oh, you have xyz video card? Well, it's xyz's fault that it's slow because they refuse to release all of their information. N
      • "Personally I don't see supporting Binary Drivers as the death of Linux, but that's just me."

        It IS the death of Linux. Of course "big business as well as non-technical users (and) my folks" don't get it.

        Linux is an Open Source kernel. Meant for experimentation. It grew because Minix wasn't free. And there were hassles with BSD at the time. It only makes sense if it STAYS free.

        Then, investigations into other processors, architectures, etc. can take place. By introducing the CONCEPT of a "binary driver",
    • by krmt ( 91422 ) <therefrmhere@@@yahoo...com> on Wednesday August 31, 2005 @11:02AM (#13446276) Homepage
      Why can't a binary driver be accepted? I understand the implications. But seriously there are times when you need to look at the bigger picture.
      I think you need to take your own advice. What happens when you go away because SiS won't pay you any more or decides to cancel your contract? Who can port the driver or make bugfixes? In a year? Or two? What about all the users who are dependent on your driver?

      The bigger picture is that we need open drivers so that we're not reliant on you or anyone else. If you want to distribute your own binary driver, go ahead, but the rest of the world needs that driver free.

      Oh, and X.Org doesn't want things licensed under the GPL, but rather the MIT/X license, just like everything else in the tree.
    • I found out that the SiS 315 was the basis for all of SiS/XGI's new chipsets and included all kinds of new IP and register information/locations, and therefore datasheets could not be released to create an open driver. OK, that is reasonable.

      I guess I'm just dumb, but while I agree that the data can be held closely (it's theirs, after all), I don't understand what the company loses by releasing it. It's not like their competitors can scan the data-sheets and walk down to the chip fab with the design, any more t

      • by Aumaden ( 598628 ) <Devon.C.Miller@g[ ]l.com ['mai' in gap]> on Wednesday August 31, 2005 @11:59AM (#13446801) Journal
        It's not like they make their money selling drivers, so what's the point? They didn't make any money when they told you the Big Secret, so why shouldn't they tell me, Cookie Monster, and anybody else who asks? What are we gonna do -- support their hardware in new applications, possibly increasing sales? Anything but that...!

        It's not about making money, it's about not losing money. Specifically, not losing money to lawsuits. Exposing the commands implemented on the chipset may reveal that the hardware manufacturer is using some bit of logic that falls under someone else's patent. By not revealing how you actually talk to the chip, they hope to buy themselves a little safety from the vicious patent land sharks, er, lawyers.
    • Why can't a binary driver be accepted?

      Let me count the ways:

      1. Will the manufacturer support every target processor, including compiling with options optimized for each processor in a given family of processors? I doubt it.

      2. Will the manufacturer maintain feature and performance parity with Windows drivers? Maybe nVidia does; I'm not aware of anyone else.

      3. Will the manufacturer maintain API/ABI compatibility, or continue to support older hardware? ATI's Linux drivers don't even support older Radeons.

      I cou
    • The biggest one I've seen is that the developers are super focused on everything being GPL all the way down to the driver level. Here's an example: I have a SiS 650; it uses the SiS 315 chipset. Currently there is no 3D driver available in X.
      I want everything to be free software down to the driver level too. It's not all that long ago that nVidia released a Linux driver that broke old and low-end cards and didn't bother to release a fix for months.
    • The problem with binary-only drivers is that older products can't be supported by the community, if/when the maker decides to stop supporting them. A real-world example is that Nvidia has stopped supporting [nvidia.com] older cards (do a search for "TNT"). I can understand not wanting to provide Linux drivers for the aged TNT series cards (despite owning one), but the original GeForce and the GeForce 2 aren't supported any longer either. I'd like to think there's some kind of technical limitation, but the realist in me
  • Gentoo (Score:5, Funny)

    by MoogMan ( 442253 ) on Wednesday August 31, 2005 @10:00AM (#13445702)
    ...during the last two years building the Xegl display server.

    He must be using Gentoo. *ducks*
  • If this guy was really interested in Linux desktop graphics, he would have at least made a passing mention of the Open Graphics Project ( http://opengraphics.org/ [opengraphics.org] ).
  • by starseeker ( 141897 ) * on Wednesday August 31, 2005 @10:08AM (#13445781) Homepage
    Even if a "perfect" X server is implemented, that's not the end of the battle to give the Linux desktop a facelift. It's the beginning.

    Toolkits running on top of X are just as important to Desktop Goodness as the X server is, and they can only be updated AFTER the X situation is stable. GTK and Qt are the obvious ones, and I'm sure work will proceed on them, but I suspect such changes would be significant enough that they would warrant a major release, and lots of work to fully integrate new X features as opposed to just bolting them in.

    Frankly, I think the best way to proceed would be to take the useful parts of Gnome and assorted GTK apps and port them over to the Enlightenment Foundation Libraries, once they are stable. Enlightenment DR17 is probably the only environment available with the potential to pass itself off as a next generation desktop for Linux and make it stick. Can you imagine what Gimp would be like written on top of the EFLs? (drool). Of course, that's too much work to expect it to actually happen on a large scale, but it might be that Gnome's recent trend toward simplicity could make such a target easier to achieve.

    Qt I think is in good hands - Trolltech has proven quite good at making good toolkits with increasing performance in each new release. I'm sure it's just my perception, but GTK widgets feel clunky to me, and I really think a shift by the Gnome effort to the EFL base would rock the Linux desktop world. Of course, that's easy to say and hard to do, but major landscape changes are not made by minor efforts.
    • by DreadSpoon ( 653424 ) on Wednesday August 31, 2005 @10:32AM (#13446026) Journal
      "Can you imagine what Gimp would be like written on top of the EFLs?"

      Ugly, inconsistent, unusable, gimmicky, and unprofessional?

      The Enlightenment libraries are certainly great as demos of what you can do with a graphics system, but they are *not* a replacement for Xegl. That is, the Enlightenment libraries have just as much to gain from Xegl as do any GTK/cairo-using apps or Qt/Arthur-using apps.

      Switching from GTK to the Enlightenment libraries really buys you nothing. If, and *only* if, the Enlightenment libraries offered *all* of the features of GTK, including the extensive accessibility support, advanced multi-lingual support, and so on, would the Enlightenment libraries even be good enough for GIMP, or any serious application for that matter. Even then, if you already have something running on GTK/cairo, what do you hope to gain? The Enlightenment libraries pretty much give you nothing noteworthy except for some optimized rendering (which really can and should be done in GTK/cairo, removing the need to recode the entire damn application for a likely imperceptible speed boost) and some funky theming options, which likewise will probably be seen in forthcoming GTK releases now that the Cairo integration is underway. (Check out Seth's blogs on Cairo-GTK themes; his mockups/examples do many of the things that the Enlightenment libraries do, but do them without needing to rewrite your application or lose vital functionality provided by GTK/Qt.)

      Enlightenment is a lot like the graphics demo scene: they are *really* cool looking, but not particularly practical or useful. They could have spent the time writing all those new Enlightenment libraries as new GTK/Qt theme plugins and patches and had a usable, complete, functional desktop and set of development libraries today, or they could, well, spend 5+ years implementing a still incredibly incomplete environment that has little to no mindshare. Oops.

      Rewriting is usually not the answer, especially not at a high level. Xegl can be installed on your machine and all your old apps will continue to work with no changes. Drop in a new GTK theme or GTK library that uses cairo and all your existing apps get the new functionality (like rendering over GL and anti-aliasing and such) for free. Even if you have to extend the GTK API to get things like funky animated themes, it's much easier to port a GTK app to a new GTK version than it would be to port it to a totally new set of libraries.

      Summarizing with a popular phrase among engineers: "evolution, not revolution."
    • Cairo is ready to take advantage of these advanced X servers whenever they appear, and GTK already uses Cairo for its drawing. EFL looks nice, but it's not clear to me how it's better than Cairo, if at all.
  • by Anonymous Coward
    I was just skimming the comments and I noticed, just in general, the lack of understanding of the issue this person is bringing up. I think the main thrust of this article is that by the time Longhorn is released, Linux will be the last non-GPU-accelerated major desktop OS available (that's if you even consider Linux a desktop OS). What makes it worse is that Linux has no real solution in the pipeline, other than some "band-aids" applied to the current X server. These "band-aids" will still not get us to the
  • by MECC ( 8478 ) * on Wednesday August 31, 2005 @10:14AM (#13445836)
    I recently tried reinstalling Windows on a Dell, using the Dell 'recovery' CDs (OS and drivers) that came with the box. Everything worked except the video. Windows had to boot into 'safe' mode in order for the video to work, and then it was at reduced bit depth. This was a factory-shipped Dell, with factory-shipped CDs - I added nothing to it. Of course the problem can be easily fixed, but my point was that it was a problem in the first place.

    I booted the same box with Ubuntu live 5.04. X came up fine, no problems. I had to do nothing at all for it to work just fine.

    Windows: 0
    Linux: 1

    This kind of thing happens way too often. What the hell is MS doing with its time - making TPS reports? I guess this is what you get when you spend your resources buying software instead of making software.

    • Wouldn't that be more of a question on what the hell Dell was doing and not MS?
  • by master_p ( 608214 ) on Wednesday August 31, 2005 @10:15AM (#13445846)
    The X Window System runs on a variety of OSes, including every flavor of Unix, Mac OS X and Microsoft Windows.

    What I haven't understood in all these years of 3D development is why X drawing calls are not converted to OpenGL display lists. An X server could take the graphics calls and store them as OpenGL drawing commands, and each time some window is redrawn, the commands are sent to OpenGL and thus the graphics card. That would mean automatic antialiasing, full zooming, etc.
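
    A rough sketch of that idea (purely hypothetical - this is not how any existing X server works): record a window's drawing into an OpenGL display list once, then replay the list on every repaint and let the GL pipeline handle the rest. Classic fixed-function GL plus GLUT, just to keep the example self-contained:

    #include <GL/gl.h>
    #include <GL/glut.h>

    static GLuint window_list;

    /* Stand-in for "the server records the client's 2D drawing requests". */
    static void record_window_contents(void)
    {
        window_list = glGenLists(1);
        glNewList(window_list, GL_COMPILE);
        glColor3f(0.2f, 0.4f, 0.8f);          /* e.g. an X FillRectangle request */
        glBegin(GL_QUADS);
        glVertex2f(20.0f, 20.0f);
        glVertex2f(180.0f, 20.0f);
        glVertex2f(180.0f, 180.0f);
        glVertex2f(20.0f, 180.0f);
        glEnd();
        glEndList();
    }

    /* Every repaint just replays the cached commands through the GPU. */
    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glCallList(window_list);
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
        glutInitWindowSize(200, 200);
        glutCreateWindow("display-list sketch");

        /* 2D projection mapping GL coordinates 1:1 onto window pixels. */
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, 200, 200, 0, -1, 1);
        glMatrixMode(GL_MODELVIEW);

        record_window_contents();
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }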
  • by Anonymous Coward on Wednesday August 31, 2005 @10:51AM (#13446182)
    Security: Currently, X runs as root on Linux. It doesn't need to, it just does. That means 16M lines of code to audit. A set of kernel modules for Linux to handle root access, plus running the rest of X under a non-root user, would go far in solving this issue and getting wider (e.g. DoD) acceptance.

    Portability: Linux + X does things differently from other *nix + X. X should act more like an OS, with the OS interface needed for hardware access only. This would eliminate a few layers and projects currently handling different issues.

    3D vs. 2D: 2D is going away. 3D hardware is cheap and highly accelerated; 2D is not, even on older hardware. You can do 2D with 3D hardware. 3D support under X is limited and mostly proprietary... partly because of the kernel (security) and layers (portability) problems.

    There is no reason why X can't be updated to handle these problems, though it does take quite a bit of effort and a road map from X.org that currently does not exist.
  • Flamebait (Score:4, Interesting)

    by krmt ( 91422 ) <therefrmhere@@@yahoo...com> on Wednesday August 31, 2005 @10:52AM (#13446188) Homepage
    A lot of this article is flamebait. Jon is pretty obviously bitter that the rest of the X developers didn't feel his sense of urgency in moving everything to Xegl right away.

    The fundamental difficulty in getting specs to write and maintain open drivers for various video cards still exists, and any move to a fully OpenGL-based system will still have this barrier for a large number of people. If you've ever tried to run software-based Mesa, you know how slow it is, so on a fully OpenGL subsystem a large number of people will have to run it using the proprietary drivers. These work well for some people, but for others they crash constantly and integrate poorly with the rest of their system. Ultimately, the X developers have their hands tied with these drivers because they can't fix them. Imagine a world where almost everyone is running all of X on these drivers, from 3D games to xterm, and you can see a serious problem.

    Jon just brushes this off in his article ("believe it or not some people like the proprietary drivers"). Meanwhile, he calls the current effort to actually make the code work a "bandaid", even though it shows great promise to actually deliver usable drivers for a large number of people in a very short amount of time. He laments that X doesn't handle hotplugging well, but ignores the many efforts to implement this (check the X wiki for info) and the fact that no one has really figured out the best way to do it. He willfully ignores the fact that X needs to run on non-Linux systems, and as such it can't rely on many of the facilities he talks about.

    Jon's definitely a smart guy, and he understands X incredibly well, but he's unwilling to accept that maybe he's not prioritizing things very well. He certainly hasn't done a great job of selling Xegl to the rest of the X world, because if he had, he might not have written this wonderfully elaborate troll.
    • Re:Flamebait (Score:3, Insightful)

      That's rather unfair. Sure, the author has a bias, but then, given the total lack of coherent communication the X developers give to the rest of the free software community, the only people who can write this sort of article are the type who are heavily involved and therefore not detached. So I'm not surprised it's heavy on opinion.

      The issue of open source 3D drivers is a real one, but I think Jon - like perhaps many of us - has accepted that the solutions to this lie at the political level and not at the

    • Re:Flamebait (Score:5, Informative)

      by jonsmirl ( 114798 ) on Wednesday August 31, 2005 @11:45AM (#13446678) Homepage
      If you compare the 2D performance of an ATI Rage 128, a Radeon 9000, and a Radeon X850 you will discover that they all perform about the same. But if you compare 3D performance of the X850 to the R128 you will see a 500:1 improvement.

      I didn't brush the open driver issue off; I simply chose not to address a topic that is the source of a lot of controversy. I am well aware of the problems of obtaining driver documentation.

      X just needs to make a choice: continue with the flat-lined 2D performance or make the jump to the 3D hardware. If X chooses 3D, I would much rather see it use a well-designed, standardized API like OpenGL than slowly extend the existing code base to start using 3D features like EXA does.

      If you want open 3D drivers, go lobby the hardware vendors to release code and specs. However, I think it is wrong for Linux to ignore the immense performance gains available from the 3D hardware on the grounds that the hardware is not completely open. Withholding use of 3D hardware on Linux will do nothing to open up the vendors, and it will definitely result in Linux having an inferior, less competitive desktop experience.

      Have you considered that the opposite effect might happen? If Linux builds an excellent 3D desktop and attracts a lot of new users, the hardware vendors may start to take Linux seriously and open their specs.
  • By the way, the X Window System [wikipedia.org] article will be the Wikipedia front page feature [wikipedia.org] on September 3rd.
  • where you will need to download the PDF and read Part I Chapter 7 [washington.edu]. I keep a copy of the book on my desk (next to my RHCE study book, as a psychic counterweight) and often refer back to it whenever I've spent too much time with things like X configuration.

    It's very funny, even more so when you know just a little bit of Unix, and even more than that when you know too much, which is to know any, really. ;)
  • by tji ( 74570 ) on Wednesday August 31, 2005 @11:16AM (#13446426)
    One area the article didn't touch on at all was MPEG decoding. Most video cards today have hardware to accelerate decoding of MPEG2 video. Some even have MPEG4 acceleration.

    For lo-res stuff, like DVDs, this is not a big deal because modern CPUs don't break a sweat decoding that stuff. But when you go to HDTV (1920x1080i / 1280x720p) video, even fast processors feel the load. In Windows, there is a standard API (DxVA) which is supported by most video drivers. In the Linux world, there is similar support, but it's a bit trickier...

    Linux XvMC API - Enables hardware offload of iDCT and Motion Compensation in MPEG2 processing. The API is relatively new, and support has recently been added to key applications (MythTV, Mplayer, Xine, (vlc?) ).

    NVidia - supports XvMC in their binary / closed source driver releases. XvMC is supported by their FX series cards (and newer), and GeForce4 MX cards. It is not supported in the hardware of the other GeForce4 cards.

    ATI - No support for XvMC in Linux. (ATI was the pioneer of the MPEG2 acceleration hardware, available in their Radeon line for many years. But, they don't support this at all in Linux.)

    VIA/S3 Unichrome - There is a Unichrome driver project on SourceForge, which supports the excellent MPEG2 acceleration of the Unichrome integrated graphics processors. (Though it's not clear to me if it's completely open source or relies on VIA's closed drivers/libraries.) The S3 MPEG2 processing is beyond normal acceleration. They do full MPEG2 decoding in hardware - which allows for HDTV display with very low CPU requirements. S3 also has standalone video cards (DeltaChrome, GammaChrome); I don't know the state of Linux or MPEG2 support for those. The Unichrome also has hardware support for MPEG4 processing, which is not yet supported in Linux.

    Others - Any other Video cards with XvMC support in Linux that I missed?

    --- The MPEG2 acceleration support in Linux is not great yet. But at least it's better than MacOS... In OS X, the DVD player is built with MPEG2 acceleration support, but no other applications can use it (there is no open / published API). So, HDTV display has ridiculous CPU requirements (a Dual G5 is stated as required for the Elgato EyeTV 500). The vast majority of Macs have video hardware that supports MPEG2 accel, but none can actually use it.
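
    If you want to see what your own setup advertises, a small probe along these lines works (a sketch using the XvMC client library; it only checks that the extension is present - the actual decode path with contexts and surfaces is driver-specific and omitted). Link flags vary by distro, but something like -lXvMC -lXv -lX11 is typical:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/XvMClib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int ev_base, err_base;
        if (!XvMCQueryExtension(dpy, &ev_base, &err_base)) {
            printf("XvMC not available: no hardware MPEG2 offload through this driver\n");
        } else {
            int major = 0, minor = 0;
            XvMCQueryVersion(dpy, &major, &minor);
            printf("XvMC available, version %d.%d\n", major, minor);
        }

        XCloseDisplay(dpy);
        return 0;
    }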
  • The article claims that the BSDs miss "PCI and framebuffer support". All BSDs have PCI support and have had it since 1994. At least FreeBSD, NetBSD, OpenBSD and DragonflyBSD have framebuffer support. In FreeBSD, the support for framebuffers was introduced in early November 1999. (For comparison, the Linux fbdev project was started in late December 1999.)

    Next, a lot of standard features are referred to as "Linux" features. From the concept of kernel drivers to virtual terminals, these are implied to be Li

    • I've never booted BSD and I don't claim any real knowledge of it. The article just reflected what I have read and seen posted to the xorg mailing lists.

      If BSD has parallel support to Linux fbdev that will make it easier to run OpenGL/EGL on it. I was also under the impression that all of that PCI probing code was in X because BSD lacked the needed functions.

      The focus of the article was the state of Linux graphics, but write up some BSD clarifications and I'll add them as soon as we can convince our censorist
  • Kick It Up A Notch (Score:3, Interesting)

    by http101 ( 522275 ) on Wednesday August 31, 2005 @11:42AM (#13446654) Homepage
    The state of Linux graphics is, in my opinion, better than that of Windows. I've had long nights of reinstalling the OS (Windows) just because I had a bad video driver that corrupted the system. Not even restoring the system from a backup helped. But what I am certainly curious about is minimizing the compile time on systems with higher-end video cards. If GPUs can be utilized for sorting processes [slashdot.org] and some boards contain more than one [slashdot.org] processor [slashdot.org], why aren't we utilizing these [slashdot.org] high-speed processors to aid in compiling a kernel for our computers? I don't see the problem, since audio processing [slashdot.org] is already being done.
  • by Stalyn ( 662 ) on Wednesday August 31, 2005 @04:02PM (#13448878) Homepage Journal
    Seriously, I was losing hope for Slashdot. But posting articles like this reminds me of the old days. Thank you.
