Anti-Aliased Text in X11 Continued

keithp sent in a bit more information about the Font stuff we mentioned yesterday. Besides a nice shot of twm & xterm, Keith sent us proof in the form of a screenshot with Konqueror, the KDE web browser. He also says "Most of this code is in XFree86 CVS today. The hacked Tk and Qt libraries will be available in source form soon. Expect the latter to change; they were pretty seriously whacked. All of the text is rendered with the fine FreeType 2 library using 256 levels of translucency and composited to the screen using hardware acceleration at around 200000 glyphs/sec. If performance becomes an issue, I'm sure we can improve that. These images are regular anti-aliased images not optimized for any particular sub-pixel geometry. With a single X resource change, the text would be rasterized to improve quality for LCD screens as seen here." Now I'm just waiting for mozilla to support this.
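For the curious, client code against the Xft convenience library that grew out of this work looks roughly like the sketch below. Treat it as illustrative only: the exact header and function names in the current CVS snapshot may differ, and the hacked Tk and Qt libraries mentioned above do their own thing internally.

    /* Minimal sketch: draw one line of anti-aliased text through Xft.
     * Assumes the Xft convenience API layered on the Render extension;
     * exact names may differ in the CVS snapshot described above. */
    #include <X11/Xlib.h>
    #include <X11/Xft/Xft.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;

        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         0, 0, 400, 100, 0,
                                         BlackPixel(dpy, scr), WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask);
        XMapWindow(dpy, win);

        /* Open a scalable font through FreeType; hinting and AA are handled for us. */
        XftFont *font = XftFontOpen(dpy, scr,
                                    XFT_FAMILY, XftTypeString, "Times",
                                    XFT_SIZE, XftTypeDouble, 14.0,
                                    NULL);
        XftDraw *draw = XftDrawCreate(dpy, win,
                                      DefaultVisual(dpy, scr),
                                      DefaultColormap(dpy, scr));
        XftColor black;
        XftColorAllocName(dpy, DefaultVisual(dpy, scr),
                          DefaultColormap(dpy, scr), "black", &black);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)
                XftDrawString8(draw, &black, font, 20, 50,
                               (XftChar8 *) "Anti-aliased text in X11", 24);
        }
    }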
  • by Anonymous Coward

    Does anyone know which graphics cards this will work with? I had heard something about it only working with matrox cards, but wanted to verify.

    It works with all graphics cards whose drivers have been converted to use fb. In current XFree86 CVS, that means the vast majority of the supported cards.

    On the other hand, only a few (only one?) drivers accelerate this. It's a simple matter of hacking; I'm sure quite a few drivers will support it by 4.0.3.

  • by Anonymous Coward
    ... had anti-aliased text in its desktop 14 years ago ...
  • 4h? What kernel version? Which gcc? I'm certain building 2.2.14 (with 2.95.2 iirc) took a stupidly long time on my system (at least 10h, though it seemed to still be building after 20h). That was the last time I built a kernel on anything other than my desktop box: I set up multiple kernel trees for each destination box and I now have <3m builds for all my machines :)

    Bill - aka taniwha
    --

  • However, when you are looking at very small letters non-anti-aliased text is illegible. ...

    If you look at a Windows machine, you'll see that normal reading sized fonts are not anti-aliased. Only large and small fonts are.

    I'm not sure either of these statements is correct. Take a look at this:
    You'll find two images of a Word document. The text varies from 48pt down to 8pt. The left image is with the default settings for Windows (NT 4.0 at least). If you zoom in, you'll see none of the fonts use antialiasing, regardless of their size.

    The image on the right is with "Smooth edges of screen fonts" set in the Plus! tab of display properties. This turns on antialiasing. However, you'll see that for the smallest fonts, MS has decided not to use antialiasing.

    Now, one can argue as to whether that means small antialiased fonts look bad, but your statements about Windows seem to be incorrect.

  • You're right. I only went down to 8pt because that's what the picklist offers, but setting it to 6pt, it *does* get antialiased. I added this to the web page.

    However, I still think the main problem is not anti-aliasing (or lack thereof). I haven't looked at the latest releases, but a typical experience for me was to download WP for Linux, or Abiword, install it, type some text at the default pt size (8 or 12), and think "Yuck!"

    With no antialiasing, comparable font size, and out-of-the box Linux/Windows setup, Windows fonts look better (IMHO).

  • 4h? What kernel version? Which gcc? I'm certain building 2.2.14 (with 2.95.2 iirc) took a stupidly long time on my system (at least 10h, though it seemed to still be building after 20h). That was the last time I built a kernel on anything other than my desktop box: I set up multiple kernel trees for each destination box and I now have <3m builds for all my machines :)

    I will get to this point too shortly. Basically I'd like to get rid of GCC and all the development fluff for my OTHER boxen and keep a single box available to do all the compiles and other fun stuff.

    On the topic of the 4h compile, though... I reran the compile last night. Here are the results:

    • # uname -a
      Linux pokey 2.2.13 #2 Sun Jan 23 18:14:42 EST 2000 i386 unknown

      # cat /proc/cpuinfo
      processor : 0
      vendor_id : unknown
      cpu family : 3
      model : 0
      model name : unknown
      stepping : unknown
      fdiv_bug : no
      hlt_bug : no
      sep_bug : no
      f00f_bug : no
      coma_bug : no
      fpu : yes
      fpu_exception : no
      cpuid level : -1
      wp : no
      flags :
      bogomips : 6.53

      # cat /proc/meminfo
      total: used: free: shared: buffers: cached:
      Mem: 15192064 14159872 1032192 4194304 5566464 4796416
      Swap: 136208384 1245184 134963200
      MemTotal: 14836 kB
      MemFree: 1008 kB
      MemShared: 4096 kB
      Buffers: 5436 kB
      Cached: 4684 kB
      SwapTotal: 133016 kB
      SwapFree: 131800 kB

      # cat /proc/ide/hda/model
      QUANTUM FIREBALL1080A

      # time make dep clean bzImage modules
      real 221m50.261s
      user 207m33.500s
      sys 9m13.160s

    I copied my .config file and did a make mrproper before copying the configuration file back and doing the compile. 221 minutes is about 3.7 hours. Now I couldn't find a man page for time, so the compile either took 3.7h (if "real" is wall-clock time) or 7.1h (if the figures add up). I was in bed so I can't vouch for it personally. :-) Either way it's a good lot shorter than the 10-20h compile you had mentioned. This is a DTK 386DX33 from 1986 with a Weitek 80387 I finally found in a pile of old computers at a junk shop and a memory expansion riser card I found in the same shop. The motherboard takes SIPPs and the expander SIMMs. fun stuff. :-) Good old pokey is my firewall.

  • 6.54 bogomips, whee, 1 day kernel compiles

    I've got a comparable system: 80386DX/33 with 16M (8 on a proprietary riser card, doncha love the old way? heh) -- kernel compiles take about 4 hours on my system. (regular old IDE HDD) Regards, Andrew

  • I hope that FreeType 2 will support OpenType fonts as well. That, and hopefully the patent issues will be 100% resolved before this hits XFree "for real"...
  • "the entire Linux industry" - lol!
    --
  • Someone at microsoft implemented a better way than anti-aliasing. The problem with current fonts is that they are not made for the screen. When they are anti-aliased, they look fuzzy. You _can_ design fonts that don't need anti-aliasing, it's just that they won't look like the fonts we're used to. Anyway, that seems like a great idea to me. I don't know if M$ is implementing it or not, but I do believe they figured it out. Could be the first real improvement they've made.
  • ...then Rasterman will have to go back to flipping burgers.
  • I was asking about the acceleration. I know it can be done in software, hence the title of my message.. ;-)


    -- Thrakkerzog
  • Does anyone know which graphics cards this will work with? I had heard something about it only working with matrox cards, but wanted to verify.

    -- Thrakkerzog
  • Well, most of those processor cycles are on your graphics card.

    -- Thrakkerzog
  • What resolution is your monitor, how good is your vision, and how far do you sit from the monitor?

    If your eyes can't resolve individual pixels on the screen, antialiased fonts aren't going to do much for you. On my monitor, antialiased fonts are much more readable than bitmapped fonts at 640x480, but at 1280x1024 I can't tell the difference without a magnifying glass or Xmag. If you want to see some antialiased fonts right now (without installing the CVS version of X), all the text rendered by enlightenment (window titles, menus, and the Dox help browser) is antialiased.
  • ok, s/verdana/times new roman/g (or courier new or any of the truetype serif fonts coming with windows). the point still stands - at low resolution, it's hinting that matters, not anti-aliasing.

  • In windows, all of the true type fonts I use look great without anti-aliasing. If you want beautiful fonts in X windows use an X server that supports true type fonts.

    it's not a matter of them being truetype or not - it's because fonts shipped with windows are hinted properly for low resolutions.

    there are many things that go wrong when you try to render a font in extremely low res - stem widths and line thickness become unequal between letters, features become slightly dislocated, and so on. anti-aliasing doesn't solve the problem - it only makes it appear slightly less problematic.

    the right solution really is using properly hinted fonts. hinting [fontlab.com] is a process in font design where you specify additional information in the font description for low-resolution rendering.

    for an example, fire up a windows wordprocessor, and try typing some text in the verdana font (it's beautifully hinted!) at different sizes ranging from 6 to 24. now lean closer to the monitor, and take a look at pixel-level differences between the same letter in different sizes. notice, for example, that the loops (such as in d or p or o) will become sometimes rectangular, sometimes square, but consistently so for every letter of that size, line thickness (say, vertical lines in d, p, t, etc.) will change consistently for all letters as you go up in size, etc.

    now go back to your linux partition, and run netscape with the default serifed font (times, if i'm not mistaken). at low resolution it just looks sad in comparison - t is a different thickness than d, different loops have different shapes, and so on.

    the matter is not aliasing. it's using fonts that are designed for low-resolution display. unfortunately, hinting is a delicate and thankless aspect of font design (a well-hinted font doesn't get your work into the emigre catalog [emigre.com], you know!), and while there are a few, it's not clear how many more free well-hinted fonts we'll see in the future...
  • Because Linux kernel config required TK.

    That is incorrect; one method of configuration requires TK, but you can just as easily configure the kernel using Curses with
    # make menuconfig
    That would make two apps that need it for you. :) Or you can go the hardcore way and answer questions one by one at your terminal with only a rudimentary shell. Personally, I prefer menuconfig.

    But it's ugly. Having a library around just for one program is ass-ugly.

    That is your opinion, which is to be respected. Most people in the *NIX world wouldn't agree with you though. Having half a megabyte of libraries isn't that big of a deal at all. What is this single application you have? On my machine there are plenty of applications I use that need ncurses. It is a pretty common library, the typical Linux machine uses it quite a bit.

    I have no clue how. According to ktop, my memory usage hovers at 50MB while I'm in KDE 2.0, just after having run a GNOME app.

    No offense, but I see several problems with your setup that could be causing this. One is that you are using ktop to scope your processes. I would suggest to you that if memory is a big deal, use the smaller, better-tested applications such as top or ps. They are not as pretty, but they do not use so many resources. Which Gnome application are you running? That makes a big difference. If it is gnote then there is a problem.

    If you are going to be running both Gnome and KDE simultaneously, you need to be prepared for the usage fees. You are running two major systems in tandem (and be thankful that you can do that.) so do not be surprised when it uses a lot of hardware. To be honest with you, once I installed KDE 2 I haven't used Gnome at all. I never thought I would be saying that, but in my opinion KDE 2 surpasses Gnome as far as polish goes. Just my opinion. I don't even use KDE 2 that often, very rarely. Most of my applications do not reside in either the Gnome or KDE realm. I have them there in case I need them, dormant on my hard disk.

    And bleeding edge browsers are an absolute necessity for desktop use, ask any BeOS user.

    Maybe I should ask what you are defining as a desktop user. I'm a desktop user and I can do 90% of my browsing with lynx. I'm comfortable with the program and it only returns what I need, the information, not the fancy mouse rollovers, flash plugins, and other such glut that serve no purpose other than to wow the user. I fully realize that the average desktop computer user (in my definition) wouldn't understand or like lynx. I don't suggest the world uses it, however I consider myself to be a desktop user and do not need floating layers on my web pages! So your definition is flawed somewhere.

    Linux IS a server OS, but I get flamed whenever I say that. Linux is a wannabe desktop OS that just isn't there yet. Have you taken a look outside slackware land lately? Everyone is trying to cram Linux into the desktop.

    A work in progress would be a much more apt and intelligent way of putting it than 'wannabe.' Do recall that the entire desktop movement to which you are referring is only two or so years old now. It is very much in its infancy, and to assume that it is going to have every single thing you are used to with the MacOS or even Windows only displays your lack of understanding on the issues here.

    Now, that being said, I think great progress is being made. When you consider the fact that these desktop distributions, Gnome, KDE, and other solutions are so young, and driven practically 100% by volunteer work, it is very impressive, and I have great respect for the folks working on those projects.

    I'm not going to compare Linux with any other pure desktop system simply because it isn't there yet. It is getting there quickly, but not yet.

    Funny, I always seem to need those features.

    I find that difficult to believe. You actually use every feature of the latest Word versions? If you do not mind me asking, what is your career? I know that there are some positions that might require that kind of usage in large corporations and whatnot. However, for every one person that actually needs that feature-set there are 10,000 who do not. That puts you in a very, very small minority. You cannot expect the entire development community of open sourcers to cater to such a small percentage before the basics are even complete. Incidentally you might want to check out StarOffice from Sun. I believe it is open source now, and probably the most feature-full office suite out there under that label. If not, it is free.

    No, let me guess, you have tried it and it doesn't have one or two features you desperately need to get your work done. Three words: Display some adaptability. That is what makes a good employee anyhow, not one who whines non-stop because of one little detail, but one who can find ways to work around things. I've found that in working around things I usually end up finding much much better ways to do things anyway.

    You do know you could probably completely automate or imitate a lot of Word's features using the plethora of *NIX tools out there, and end up with a much more consistent and appealing output using LaTeX, don't you?

    I don't need Perl, ZSH, CSH, TK, Python, and 90% of the other stuff that the average Linux distro forces me to install. The problem is that the utility developers think that they are writing applications (they're not, utilities are OS-level apps), and indiscriminately use non-standard libraries.

    Good for you, you happen once again to be in a pretty minute minority. Most people who use Linux do in fact need those applications. Especially Perl; in fact some distributions use Perl extensively for their inner workings. Even if you do not personally ever invoke a perl command line, there is a lot of stuff going on behind your back you don't know about that is very likely done with Perl. If you really do not need these things, I suggest you switch distributions. If your distribution does not allow you to select what you install and you so irately do not want it, that is. Again, display some adaptability.

    You can very easily uninstall things you do not need too (again depending on your distribution) so even if the Beginner's Installation puts this stuff on without your knowledge, you can always take it off after install.

    Once again, ncurses, perl, TK, these are most certainly not non-standard libraries (Perl isn't really a library, but you get the point). Whoever told you that does not have a clue what they are talking about. As for Zsh, and Csh, those are shells, not libraries. I don't know any utilities or applications that require those shells to be functional. The only shell requirement is SH for installation... and please do not tell me that sh is a non-standard library. I will die laughing.

    Okay, my honest opinion after hearing you talk for a bit is that you are maybe a one-year or less user of Linux. You got into it because everybody told you it was cool. Shame on those people for not fully preparing you for reality. The reality is that it is a server OS that has a lot of folks working very hard to get desktop-ish features attached to it in a very short amount of time. I still only recommend Linux to developers, sys admins, and computer hobbyists. I don't try to get everyday gamers and whatnot here yet because it isn't quite ready yet. First impressions are important to some people, and right now it doesn't give a good first impression to somebody who has used Windows their entire life.

    There are many other factors that one has to overcome when switching OSes. The biggest thing people need to understand is that you are making a very big change in philosophy. The *NIX philosophy hardly resembles the philosophy of Windows users. Even in the way they use GUI applications. You tend to see lots of little apps spread out side by side over multiple virtual screens with X11. This is because over here, people design an application to do a few things, well. Over in the other camp, people pile their applications into one screen on top of each other, and each of these apps is monolithically huge, replacing 5 or 10 *nix apps. It comes down to opinion over which is better. Personally I prefer lots of little applications. Other people like the consistency of having everything in one big application. It is precisely this change in opinion that is required though, to make an effective switch. I had to do it when I switched. I had to realize that such and such application was not going to do everything such and such app did in Windows, but that was okay because I found other apps that worked together and in total did more! With less memory usage, with better stability. That is just one aspect; until you can get over some of these philosophy changes, it will always seem weird and "inferior."

    It extends to everything. The filesystem is based on this, the inner workings of the OS structure are based on this.

    Another area where I think you are going wrong is that you are assuming that the current state of Linux is a good indicator of what it actually is. This is not true. Linux, and the rest of the open source world, cannot be defined by a release date or a static slice in time. Things shift quickly here. Always getting better, though. So if you really do want to see what the future of it will be, stick around. Give it a year, trust me it will impress you more and more as time goes by. Don't just install it once and get hissy over details that you do not like currently. Continue using NT 4 for your games (Which is in itself absurd, why are you doing that? Why not use Win98ME??) and your MS Word needs, and keep tabs on how the Linux world is growing.

    I know the tone of this note is a bit harsh, but your outlook is a common one and needs addressing. It is part of the rift between those who have been served by the software giants all their computing life, and those who have been a part of a growing development process.

  • That is too exaggerated to work. Nobody will fall for that troll.
  • I'm using an SGI Indigo2. While the display quality is very good, I didn't notice that X did any anti-aliasing at all. If it did on your O2, how do you enable it?
  • You need to install an up to date X server
    (XFree86 CVS, or the shortly to be released
    4.0.2).

    Otherwise, you'll still be using the core fonts.
  • Another thing that Microsoft does right is to design excellent fonts for the screen, as someone else has pointed out. If you go to the Microsoft Typography [microsoft.com] page, you will find a bunch of free TT fonts. These are absolutely the best scalable screen fonts you can get anywhere. TrueType fonts can be rendered with quality comparable to hand-designed bitmap fonts. In addition, these fonts are designed especially to look good at low resolution. I use Georgia now for web browsing. I would rather have good screen fonts than antialiased bad fonts.
  • I did a "man dhcpcd", copied the text into my text editor, and went to that link you cited to compare the font rendering side by side. Guess what? My non-anti-aliased text -- the same text at the same size -- looked sharper, clearer and better formed. That's because it is using "Courier New", a TrueType font that I snagged from Microsoft's Typography page. Good fonts will beat anti-aliased bad fonts any time.
  • Type 1 fonts look like crap on XFree86, regardless of the quality of the font or its hints. The reason is that the Type 1 renderer is bad. Even when a better engine appears, TrueType has a natural advantage. Even the Freetype project acknowledges [sourceforge.net] that TT is "by far the technology that produces the best looking monochrome bitmaps". In addition, there are excellent screen-optimized TT fonts available from Microsoft, Monotype and others. There is absolutely no way to get better scalable screen fonts than to use TrueType with these fonts.
  • Anti-aliasing bad fonts will not make them good fonts. If you want scalable fonts, XFree86 is still not there, at least out of the box.

    There are good Type 1 fonts out there, but the font renderer that comes with XFree86 is junk and produces consistently bad results. Besides, TrueType is "by far the technology that produces the best looking monochrome bitmaps", as the Freetype project acknowledges [sourceforge.net].

    But TrueType fonts take real expertise to create. The best free screen fonts you can get are exclusively from Microsoft Typography [microsoft.com]. They are great (Georgia is my favorite), but cannot be freely redistributed. You have to download them from Microsoft. Designing good, well-hinted, screen-optimized TrueType fonts is beyond the ability of most, if not all, open source hobbyists. This means that for Microsoft-haters in the Linux world, the only way to get good screen fonts is through Microsoft, or to buy them.

  • ahem, a 486-33? That's a fast box. I'm using a 386-33/16M. However, as performance is an issue, X isn't even installed, let alone configured for anti-aliased fonts. (it's a firewall, what does it need X for anyway? :)

    6.54 bogomips, whee, 1 day kernel compiles :)

    Bill - aka taniwha
    --

  • How are you measuring RAM? I would guess that you're just looking at what the monitors in Windows and Linux (such as provided from /proc via top or ps) are telling you. I have no idea how the Windows memory monitor (it does come with one, yes?) measures things, but /proc includes the frame buffer on your 64MB super-duper video card.

    -Paul Komarek
  • If you have processor power enough to get good interactive performance with aa fonts, then yes it is worth it. If you don't have the processor power, then no, it is not worth it.

    It will not affect non-interactive performance, like compiling, and since most Unix applications are non-interactive, it doesn't really matter to them.
  • It is still worse than non-antialiased TrueType courier fonts under Windows. Look at the s'es, and the progressively worse white-on-black status bar.

    And this is with Courier, one of the better fonts under X. Try Times New Roman in an X application.

    Most of what people appear to be saying is that if you increase the screen resolution enough, recompile the programs, and suffer a performance hit to antialias, then you can shrink the screen fonts and they'll be reasonable.

    When you're using a 100+dpi screen with antialiasing, you approach the quality of a 150dpi printer. Isn't printing what the fonts were designed for?

    By the time X gets antialiasing and reasonable fonts, we'll all be using 200dpi flatpanels, and it won't matter anymore.

  • Gtk might be as easy, I just haven't looked.

    This mail message to "gtk-devel" from Owen Taylor [gnome.org] says

    ...the xrender extension, which we will be supporting in GTK+ soon...

    (presumably meaning in the main CVS branch; i.e., it'll eventually be in 2.0, but probably not 1.2[.x]).

  • Not to be a pain, or maybe to be a pain...

    But why the hell can't they fix the existing Xlib calls to do antialiasing!

    Certainly, add new and intelligent interfaces to draw text. But there is no excuse for not changing the existing interface to make its output as nice as possible. Without that, all existing X programs will still produce ugly text.

    And I don't give a s**t about backward compatibility. I know it won't color in the exact same pixels, but anybody who wrote a program that relied on that is an idiot anyway! I also don't care about this working on non-TrueColor visuals, those should have been eliminated long ago as well.

    Besides coming out with antialiased text SIX YEARS AGO, MicroSoft modified their existing code so older programs got antialiased text. The fact that XFree86 seems incapable of doing the same thing is an insult to the entire Linux industry!

  • Your O2 did not do antialiasing. Perhaps you had a better monitor, or perhaps a better font? The very small bitmap fonts on Irix were pretty good, but they can easily be copied to another X server.
  • ClearType is a form of antialiasing. In an idealized form the centers of the red, green, and blue filters are offset to match the centers of the colored LCD cells. In addition, an "error diffusion" pass is done to alter this result so that the output does not have any color bias (the XFree86 code does not seem to be doing this, resulting in the blue letters, but the MicroSoft sample was rather simple-minded in that their error diffusion oscillated without damping all the way to the right of the image, resulting in a pattern in the clear area).

    ClearType is actually a clever idea, but it is not a complex idea. The best ideas often are.

    Hey: A CRT version of ClearType could be done if there were a precise way to set where the colored phosphors are in relation to the pixels, too. Perhaps a program could be used which displayed patterns and the user adjusted knobs to make those patterns as brightly colored as possible, and this was used to figure the phosphor position and pitch. This idea is (C)2000 ME and may not be used in commercial software by companies whose name starts with 'M'.
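    For anyone wondering what "offsetting the filters to the sub-pixel centers" amounts to in code, here is a toy sketch (not keithp's or Microsoft's code, and without the error-diffusion pass described above): render the glyph coverage at three times the horizontal resolution, then feed each group of three samples to the red, green and blue channels of one output pixel, assuming a horizontal RGB-striped panel.

      /* Toy sub-pixel rendering: collapse a glyph coverage map rendered at 3x
       * horizontal resolution onto an RGB-striped LCD row.  Each output pixel's
       * red, green and blue channels take their coverage from the three
       * corresponding horizontal sub-samples.  No color-balancing filter or
       * error diffusion is applied, so real output would show color fringes. */
      #include <stdint.h>
      #include <stddef.h>

      struct rgb { uint8_t r, g, b; };

      void subpixel_row(const uint8_t *coverage3x, /* width * 3 samples, 0..255 */
                        struct rgb *out,           /* width output pixels */
                        size_t width,
                        struct rgb fg, struct rgb bg)
      {
          for (size_t x = 0; x < width; x++) {
              /* One coverage value per colored stripe of this pixel. */
              uint8_t cr = coverage3x[3 * x + 0];
              uint8_t cg = coverage3x[3 * x + 1];
              uint8_t cb = coverage3x[3 * x + 2];

              /* Blend foreground over background, per channel. */
              out[x].r = (uint8_t)((fg.r * cr + bg.r * (255 - cr)) / 255);
              out[x].g = (uint8_t)((fg.g * cg + bg.g * (255 - cg)) / 255);
              out[x].b = (uint8_t)((fg.b * cb + bg.b * (255 - cb)) / 255);
          }
      }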

  • Like you said, the human retina has a logarithmic response; thus you want a monitor where the mapping of the numbers put into the display buffer to the output light levels is logarithmic as well. If you don't, you are not allocating the numbers in the most efficient way possible.

    The steps are not Mach banding. Mach banding is the eye/brain detecting non-continuous first derivatives in a scene and making them appear to be an "edge" (since they probably are a corner in a real-world scene).

    I am talking about non-continuous levels themselves (ie not the derivative). The human eye can detect these obviously, but if the two levels are close enough together it is very hard. Ideally all the steps in the gray ramp (255 of them for an 8-bit display) should be equally hard to see. This requires a logarithmic mapping from the integers to the light levels.

    The reason the display on this monitor and the real world would look "identical" is that the mapping from the real-world light levels to the numbers stored in the image file is not linear either, instead it is the inverse of the monitor.

    I am well aware that the human response is logarithmic and the monitor is exponential plus a constant, and that it is impossible to make these curves match exactly. However they match far better than a straight line!

  • That is exactly what is wrong.

    But you don't want to change the DAC converters, because adjusting the monitor to be "linear" would spread the samples at the dark end out so much that the steps between them would be easily visible, while compressing the samples at the white end so close together that they cannot be distinguished, thus wasting a lot of your intensity resolution up there.

    An unadjusted monitor, through fortunate circumstances, very closely matches the non-linear response of the human eye.

    There has been a long and sad history of people thinking the monitor has to be linear to be "correct", especially in the Mac area (where they built in a gamma correction in the software of about 1.7) and in SGI's which were often set to corrections of 2.5 or more. This was all wrong and we are finally getting away from it.

    You seem to be quoting a lot of material. I am guessing we seem to be misunderstanding each other somehow, as what I am trying to say is trivial basic knowledge of how the eye responds to light that anybody who has read so much on light must know. I am guessing we are misunderstanding the terms "linear" and so on.

    The linear/logarithmic/gamma curve I am talking about is a plot from "number put into the display buffer" (called x here, and normalized so 0.0 is the smallest number in the buffer, and 1.0 is the largest number), and the "intensity of light emitted by the screen" (called y here, and normalized so that 1.0 is the maximum the screen + D/A converters produce, and 0.0 is the minimum).

    "Linear" in my terminolgy means y = x.

    "Gamma" means y = pow(x,G)

    "Logarithimic" means y = pow(B,x-1)

    The display buffer can only store discrete values; the difference between them is Dx (1/255 for an 8-bit buffer). This will produce discrete steps in the output intensity, the size of these is (approximately) dy/dx * Dx.

    My argument is that the best use of this display buffer and hardware is if the perceived steps in output intensity are about equal. This is the only way to minimize the noticeability of the steps everywhere.

    It is pretty well established that the perceived difference in intensity is a ratio, (y+Dy)/y - 1, i.e. Dy/y.

    Setting Dy/y to a constant means that (dy/dx)/y = c, or dy/dx = c*y. This is the defining property of the logarithmic curve above, which is why I say that a logarithmic curve is the ideal way to map the values in a display buffer to display intensities.

    CRT monitors naturally map the voltage level through a gamma curve to the screen intensity; this gamma exponent G is about 1.8. I also propose that this gamma curve, though not equal, is quite close to a logarithmic curve (try drawing both of them in the range 0-1), and certainly MUCH MUCH better than a straight line plotted by a linear function.
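    "Try drawing both of them in the range 0-1" is easy to do numerically. Here is a small program (my own illustration, not the poster's) that tabulates the three mappings defined above; B = 100 is an assumed ~100:1 display range, and G = 1.8 follows the comment.

      /* Numerically compare the three mappings defined above over 0..1:
       * linear y = x, gamma y = pow(x, 1.8), and logarithmic y = pow(B, x-1).
       * B = 100 is an assumption (roughly a 100:1 display range); the point
       * is that the gamma curve tracks the logarithmic one far more closely
       * than the straight line does. */
      #include <stdio.h>
      #include <math.h>

      int main(void)
      {
          const double G = 1.8;    /* CRT-like gamma, per the comment        */
          const double B = 100.0;  /* assumed dynamic range for the log map  */

          printf("   x    linear    gamma      log\n");
          for (int i = 0; i <= 10; i++) {
              double x = i / 10.0;
              printf("%4.1f  %8.4f  %8.4f  %8.4f\n",
                     x, x, pow(x, G), pow(B, x - 1.0));
          }
          return 0;
      }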

    Mach banding:

    My understanding of the term "Mach banding" is the perception of an "edge" caused by the eye and brain amplifying discontinuities in the first derivative of the intensity across the scene, even though the value is continuous. What I think we are talking about is a discontinuous value, or step in intensity. I have never heard the term "Mach banding" applied to this.

    CCD's

    Yes CCD's are linear, but you will find that the circuitry in all digital cameras forces that linear output through a lookup table to convert it to a nonlinear value. This is necessary for compatibility with all existing binary data formats which are based on binary values that are passed through a gamma function before producing output intensities.

    Logarithmic matching exponential

    Of course these don't match outside the 0-1 range, but since the monitor is physically incapable of producing these intensities anyway, it does not matter! I know perfectly well that the eye has greater dynamic range than the monitor, though it is not as great as you say if you discount the ability of the iris to reduce/increase exposure, in fact it is only about 1000:1. Inside the 0-1 range I think you will find that the gamma curve matches the exponent much better than a straight line, it is at least curved in the right direction!

    References:

    Digital Video and HDTV: Pixels, Pictures, and Perception, published by John Wiley & Sons in July, 2000.

    Also check Siggraph course notes for Charles Poynton's introduction to color science and color management. It was course #21 in the New Orleans 2000 Siggraph, but he has done the same course every year for a while now.

  • Okay, I think I'm beginning to figure out where we are misunderstanding each other.

    I totally agree that proper anti-aliasing and other compositing operators must be done in linear space, where the numerical values are equal to the light intensity multiplied by a constant. In my own work I use floating point numbers for this. This is important, as the intensity resolution must be much smaller near zero than away from it; floating point naturally does this, while using integers or fixed point results in a huge waste of resolution at the bright end and huge noticeable steps at the black end. All cgi rendering programs work this way, I am working on fixing compositing software to work this way as well.

    However, when I take that floating point number and put it into an 8-bit display buffer, I fully expect to "gamma correct" it, rather than do something simple like rint(x*255). (cheap or old cgi rendering software still does this, completely ruining the realism of the result).

    I was under the impression that you were arguing that the circuitry (D/A converter and display device) should be designed such that the output intensity is a constant multiplied by the value stored in the display buffer. As you seem to have a lot of knowledge about this, I must assume that you are well aware that this is a very inefficient way to use the 8 bits. Ideally those 8-bit numbers should be used in a formula like pow(B,(x/255)-1) to get the intensity displayed on the screen, so that the limited number of intensities is as evenly distributed as possible across the "perceptual" scale, which is approximately logarithmic.
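    In practice the usual compromise is a power-law ("gamma") encode rather than the true logarithm. Here is a tiny sketch of that encode next to the naive rint(x*255) the poster criticizes; it is my own illustration, not the poster's code, and the gamma value of 2.0 is an assumption (the thread cites real hardware values from 1.7 to 2.5).

      /* Sketch of the "gamma correct on the way into the 8-bit buffer" step,
       * versus the naive rint(x*255).  'linear' is assumed to be in [0,1].
       * Gamma 2.0 is an assumed value, not a measured one. */
      #include <stdio.h>
      #include <math.h>

      unsigned char encode_naive(double linear)
      {
          /* Sent straight to a gamma display, this comes out much too dark. */
          return (unsigned char) rint(linear * 255.0);
      }

      unsigned char encode_gamma(double linear, double gamma)
      {
          /* Store the value the display will raise back to the 'gamma' power. */
          return (unsigned char) rint(pow(linear, 1.0 / gamma) * 255.0);
      }

      int main(void)
      {
          printf("mid-gray 0.5 -> naive %d, gamma-2.0 %d\n",
                 encode_naive(0.5), encode_gamma(0.5, 2.0));
          return 0;
      }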

    Further confusion is that the default behavior of a CRT screen and a linear D/A converter happens to be very close to the ideal for human vision, the gamma curve just happens to be the right power value so that it can be scaled to closely fit the log curve of human vision over the 100:1 or so range of intensities a CRT can display. This was entirely coincidental as it was not a design decision when television was developed.

    Because of this coincidence, a good way to improve the display of an 8-bit buffer is to do nothing to the bits before sending them to the screen; this of course conflicts with the assumption that doing something complicated would be better.

    Although I think it is stamped out now, around 1990-1994 or so there was a lot of belief that something complicated had to be done between the 8 bits and the CRT, so that nothing complicated had to be done when converting the floating point linear value to 8 bits. This resulted in people trying to "linearize" the monitors, such as the 1.7 gamma still built into Macintoshes, and the SGI software that set the gamma up as high as 2.5, both of these result in horrible banding in the dark areas. People then tried to solve this problem by raising the number of bits, first to 12, and then to 16, trying to get rid of the black banding. Though the extra bits are nice, they are still using them very inefficiently, 16 bits linear just equals the quality of an 8-bit 2.0 gamma display (the step between the bottom two entries is equal), while those same 16 bits used with gamma could give you 256 times as many gray levels!

    I have only seen consumer CCD cameras so you may be talking about the internal circuitry or professional equipment, but certainly by the time they output a jpeg image for your photo cd, they have gamma-corrected the image. If they had not done so the display on a home pc through a non-correcting image viewer would be extremely dark. I agree that the CCD itself produces linear response. Also even the first tube cameras required adjustment circuitry, because even though they produced a gamma response, it was a different gamma than the television screens, and the engineers decided (for obvious reasons) to put the expensive adjustment circuitry in the camera rather than in every tv set.

    On the eye, I was talking about the reception range of an individual cell, or two adjacent cells, simultaneously; you report this range as 100:1, though I have heard more. You are right that the cells as well as the pupil adjust, I did not know that the cells' adjustment is far greater than the pupil, that is pretty interesting. However for our purposes, since all the pixels on the screen are being examined simultaneously, the range of the CRT is approaching that of the eye. My own belief is that we need to get it up to 10000:1 or so before truly realistic scenes can be displayed, as the eye adjusting to look in dark areas is quite natural and we need to stimulate that somewhat to get a fully natural feel.

  • Actually, I'm not talking about hinting. Hinting is another possible solution to the same problem. The one I'm talking about is having a set of fonts specifically for screens. The fonts that M$ created are only being used in a few applications, I believe, and aren't widely known (I don't know their names). But this is definitely different than hinting. Hinting modifies the display of _existing_ fonts, it does not create new ones.
  • ...is there a way that programs will automatically render anti-aliased text, or will they have to specifically call the FreeType libraries?
  • It would be okay (with me :) if the renderer performed the correct calculation using the host's CPU instead of the wrong calculation using the display adapter's CPU. If you can turn it off, you'll make the people with slow CPUs happy as well.

  • The higher your resolution gets, the more pretty anti-aliasing gets... At very low resolutions it can make unreadable text readable, but it does look blurry. At middling resolutions it doesn't really improve the readability much and it certainly makes the text look blurry. At high resolutions (where even the heavily anti-aliased portions of a line have a solid center) it looks FINE.



    I'm waiting very impatiently for some decent anti-aliasing. :-)



    One thing to keep in mind is that (at least IMHO) the anti-aliasing in Windows, MacOS, etc. SUCKS. Please don't base your opinion of anti-aliasing off these... Grab a FreeType library and play with the included demos, or check out the anti-aliasing in BeOS or QNX if you want to see some pretty anti-aliasing. The quality of the implementation can obviously make a big difference...


    Ethan
  • OK, I agree that hinting is more important than anti-aliasing.

    Cheers,

  • You don't want to stare at anti-aliased fonts for long periods of time. If you're coding, you really need some nice hand-crafted bitmaps.

  • It's not really fair to compare verdana to a serif font.
  • I'm curious as to why I should hold this with any higher regard than any other opinion piece.
    Near as I can tell, he's just a programmer that doesn't like the way antialiasing looks, and
    happened to write a rant about why he dislikes it...

    While I appreciate a good rant [jwz.org], I don't see this article as being anything more than a rant.

    --K
    ---
  • About goddamn time X started getting more advanced graphics features, like alpha channels and such.
    These features are needed in a modern graphics system.

    I'm sick of people bitching about how much memory
    or processor an alpha channel or AA will use - it's optional.
    If it bogs down your P133, the solution is simple - don't run it.
    No one is forcing this on you, and work like this
    is definitely not a waste of time.

    Maybe now it won't be such a shock going from OSX to X... ;)

    --K
    ---
  • [...] but at a low resolution like 640x480 (or even 800x600), it looks like barf. [...]

    And I use those resolutions for what? Gaming?

    Before you flame me, keep in mind one thing: Autodesk has not and most likely will not implement anti-aliasing in AutoCAD.

    So what? The things I do most involve text and raster graphics.
    It really doesn't matter what's best for CAD guys - I'm not one of them.
    Not to mention line algorithms on bitmap displays are naturally lossy due to downsampling...

    Don't like antialiasing? Don't use it and quit bitching.

    --K
    Don't mind me, I like feeding the trolls.
    ---
  • Interesting. I haven't used word for a while (yeay LaTeX) but I seem to remember very small fonts being anti-aliased. I could be wrong. Also, it could be that antialiasing isn't turned on again until the letters get very tiny---about 6pt or less perhaps---when the distinguishing characteristics of letters would be completely gone.

    --Ben

  • Hmmmm. I pulled up the Konqueror snapshot of the KDE.org homepage with all this cool antialiased stuff displayed (the link provided in the article). I then pulled up the same page in my nightly build of mozilla and put the two side by side. Couldn't tell a difference. If anything, the mozilla version rendered better especially with regards to links. What have I missed here? I'm sure antialiased text support in X is pretty cool, but that particular screen shot sure doesn't show it off much to me. Anybody got a shot that actually shows what a difference one might see?
  • > If your eyes can't resolve individual pixels on the screen, antialiased fonts aren't going to do much for you.
    >On my monitor, antialiased fonts are much more readable than bitmapped fonts at 640x480

    Good point. When I switched to 640x480 mode, the difference was very clear. FWIW, my typical resolution is 1600x1200 on a 20" monitor. Apparently antialiased fonts aren't gonna buy me much.

    Thomas Dorris
  • Some good technical information on exactly what sub-pixel rasterizing and ClearType is all about can be found here [Steve Gibson's Website]. (I like it because it shows that Microsoft is once again reinventing the wheel and calling it "new". :))


    The problem is that when it comes to ClearType, Steve Gibson doesn't have a frickin' clue. If you'd like to see someone who does, try reading Ron Feigenblatt's website [geocities.com/SiliconValley/Ridge/6664/ClearType.html] for a more balanced (and informed) view on this.

    And here's what the Microsoft Research team has to say about what ClearType actually is -- be warned, except for the first link, it's highly technical:
    Brief overview [microsoft.com]
    IEEE paper on the technology [microsoft.com]
    Paper for the Society for Information Display Symposium [microsoft.com]

    Try reading those. Gibson
    literally does not know what he's talking about here. For a start, what the
    Apple II does is NOT sub-pixel rendering. It's not even pixel-color splitting,
    as all the color splitting occurs in the NTSC signal, not at the phosphor level
    (you'll see more than one green phosphor per green pixel).

    Simon
  • They are for print... I believe that fonts were originally designed for use on paper, not monitors. While it may look better on the screen because of high resolutions and such, it misrepresents the fonts a bit if you change them on the screen and not on paper.

    That's what hinting algorithms are for -- to bridge the gap.

    Simon
  • I'm coding, so I spend a lot of time looking at text.

    If you don't use them already, I strongly recommend the JMK fonts for X: they provide simply the best looking monospaced fonts (especially "neep" at 18 points) a developer could ever desire for his editor. [ntrnet.net]

    Now, if only I were able to convert them into something usable also on Windows... suggestions?

  • Is antiailiased text really worth the extra processor/graphic cycles in most unix applications?

    What UN*X applications are you talking about? UN*X does everything from digital imagery for Hollywood to laying out publications like newspapers and phone directories to controlling embedded devices, to handling tax collection for governments, to controlling factories, to serving Web sites. Some of these things will benefit from anti-aliasing, some won't. Those who won't benefit won't use it, so they won't waste processor cycles.

    If you have an operating system that's really only good for one thing, you tend to think other operating systems are similar. They aren't. And, of course, X doesn't only run on UN*X.

  • QT/Embedded [trolltech.com] already supports anti-aliasing and alpha compositing. Looks pretty sweet on my iPaq. (mirror of QPE [flyingbuttmonkeys.com])

    I wonder if the QT/X that KDE uses already includes support. If not, I don't imagine it would be too hard to move it over from QT/E.

    ________________________________________
  • QPE looks good. The QPE demo for the iPaq is, of course, a demo and not polished software. But it's pretty good. I don't think X is appropriate for a handheld. However, if you really have to have remote display, QPE includes support for VNC.


    ________________________________________
  • Most people posting here haven't the first clue about anti-aliasing. AA is a DSP technique which band-limits a signal prior to sampling in order for it to be faithfully reconstructed. If this is not done the higher-frequency components "wrap around". For instance, if you sample a sound signal at 20 kHz, then you must remove all frequency components above 10 kHz (assuming perfect roll-off). If (say) a 12 kHz signal was present it would appear to be an 8 kHz signal in your digital representation of the waveform. The alias is NOT supposed to be there. It's called the Nyquist sampling theorem.

    One use of this is in downsampling - where you take a signal originally sampled at one frequency and generate samples at another lower frequency. To do this by an integral amount (e.g. downsample by factor of N) you firstly band-limit the signal via a digital filter to at most half of your new sampling rate, then take every Nth sample. If you wish to change to (say) 2/3 of the sampling rate you need to upsample by two and downsample by 3 (upsampling is a bit different and not all that relevant).

    Now, the same applies to 2-D (and 3-D) signals as well. If you are trying to view (e.g.) a 600 dpi image on a 100 dpi monitor, you are in effect down-sampling. If you wish to see what is really there (to the best ability of your display) then you MUST anti-alias. To not do so is to see artifacts which are NOT present in the original signal (remember the example of the apparently 8 kHz signal! It's not really there!).

    The only issue is the attributes of the anti-aliasing filter (roll-off, ripple etc.). This depends on the size and type of digital filter you choose - bigger and more complex digital filters may give better quality but be slower to execute. If you are complaining about "blurring" at some chunky coarse resolution, it's not the anti-aliasing, it's the poor resolution. The anti-aliasing is helping you.

    - Daniel
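    To make the "band-limit before you decimate" rule concrete, here is a toy 1-D sketch (my own, heavily simplified): naive decimation versus averaging each group of N samples first. A box average is only a crude low-pass filter, but it shows the structure of the operation; the same idea in 2-D is what takes a 600 dpi glyph down to 100 dpi without spurious artifacts.

      /* Toy 1-D downsample by factor N: naive decimation (keeps every Nth
       * sample, so frequencies above the new Nyquist alias) versus averaging
       * each group of N samples first (a crude box low-pass, i.e. a minimal
       * anti-alias filter). */
      #include <stdio.h>
      #include <stddef.h>

      void decimate_naive(const double *in, double *out, size_t n_out, size_t N)
      {
          for (size_t i = 0; i < n_out; i++)
              out[i] = in[i * N];            /* aliases anything above the new Nyquist */
      }

      void decimate_filtered(const double *in, double *out, size_t n_out, size_t N)
      {
          for (size_t i = 0; i < n_out; i++) {
              double sum = 0.0;
              for (size_t j = 0; j < N; j++) /* box filter over the N input samples */
                  sum += in[i * N + j];
              out[i] = sum / (double) N;
          }
      }

      int main(void)
      {
          double in[8] = { 1, -1, 1, -1, 1, -1, 1, -1 };  /* tone at the old Nyquist rate */
          double a[4], b[4];
          decimate_naive(in, a, 4, 2);     /* -> 1 1 1 1 : a bogus DC "signal" (an alias) */
          decimate_filtered(in, b, 4, 2);  /* -> 0 0 0 0 : the tone is correctly removed  */
          for (int i = 0; i < 4; i++)
              printf("naive %g   filtered %g\n", a[i], b[i]);
          return 0;
      }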
  • I'm not sure why you need Konqueror since you assert later on that you need Netscape 6. XFree86 4 is, in my experience, slimmer and faster than its
    ancestors. You do not need it for 3D support, but it makes it a lot easier. I don't see why it is a negative in the first place though. Anyway, why do you
    need 3d support on a server? You don't use 3dfx administration tools do you?
    >>>>>>>
    I'm talking about the desktop.

    If you never use them, then why do you need them?
    >>>>>>
    Because Linux kernel config required TK.

    My servers don't need Tk, they don't even need X11. Additionally, if you are not using them, they only
    take up disk space, and minimal disk space at that. The Ncurses library takes up less than 500k on my installation. Even if you have Tk 8.0, 8.2, and 8.3
    installed for backwards compatibility, it still only takes up under 6 megabytes of disk space. If you don't use them, they do not consume RAM at all.
    >>>>>>
    But it's ugly. Having a library around just for one program is ass-ugly.

    Another good example would be my workstation at work. It runs Gnome libraries, KDE libraries, everything you listed and more. The total running RAM
    usage of that machine hovers around 20-30 megabytes, and that is using Enlightenment as my windowmanager, hardly a slim example! I figure why not,
    everything else is using such a small amount of memory and I have 128 floating around to use.
    >>>>>>
    I have no clue how. According to ktop, my memory usage hovers at 50MB while I'm in KDE 2.0, just after having run a GNOME app.

    It sounds like you are trying to compare NT 4 with Linux as a desktop solution.
    >>>>
    Yea, I thought I made that sufficiently clear!

    Interesting comparison, using server operating systems to do so. Why not throw OS/390 in there, why not IRIX, why stop with those two. Why not Solaris? It would be ridiculous to try and compare these for how well they play Unreal Tournament, don't you think? So why are you comparing NT4 with Linux on those grounds?
    >>>>>>>
    Linux IS a server OS, but I get flamed whenever I say that. Linux is a wannabe desktop OS that just isn't there yet. Have you taken a look outside slackware land lately? Everyone is trying to cram Linux into the desktop.

    But even given that your entire premise has some sort of twisted merit, a lot of your generalizations and assumptions are faulty. Such as needing bleeding
    edge browsers and extensive component architectures. For some setups sure, they might be a necessity, but most of those setups would be better placed on
    entirely different operating systems than either NT4 or Linux.
    >>>>>>>>>>
    Desktop OSs are incredibly uniform in many of their needs. A component architecture is nice because it encourages reuse of binary code. If implemented correctly (ie NOT by Microsoft) the idea has a lot of merit. And bleeding edge browsers are an absolute necessity for desktop use, ask any BeOS user.

    Just because a feature exists and looks nice, does not mean that you need that feature. Lots of ignorant people point out that MS Word has way more features than any Linux word processor.
    >>>>>>
    Funny, I always seem to need those features. The problem is that feature-poor stuff is usually aimed at one group of people, and people who have limited, but different, needs are left out.

    I would like to ask those people to personally take a survey of everybody in the office, with a checklist on how many of those features are used. Not very many. So what if it has crazy features if you do not need them. The same goes for operating systems, desktop
    setups, server setups. Install only what you need! That is what makes Linux nice.
    >>>>>>>
    I don't need Perl, ZSH, CSH, TK, Python, and 90% of the other stuff that the average Linux distro forces me to install. The problem is that the utility developers think that they are writing applications (they're not, utilities are OS-level apps), and indiscriminately use non-standard libraries.
  • Geez, I feel bad. Sorry for being a jackass, I guess I was persnickety that day.
  • # make menuconfig
    That would make two apps that need it for you. :) Or you can go the hardcore way and answer questions one by one at your terminal with only a rudimentary shell. Personally, I prefer menuconfig.
    >>>>>>>>
    Yea, I guess I could do that, but I'm running X for a reason. Also, Cygnus's source navigator needs TK, and KDevelop doesn't have as good source browsing yet.

    That is your opinion, which is to be respected. Most people in the *NIX world wouldn't agree with you though. Having half a megabyte of libraries isn't that big of a deal at all. What is this single application you have? On my machine there are plenty of applications I use that need ncurses. It is a pretty common library, the typical Linux machine uses it quite a bit.
    >>>>>>>>
    While my ncurses goes unused (I don't use many console/GUI hybrid apps) I do consider it core functionality, so my mistake if I included it in my list of useless libraries.

    No offense, but I see several problems with your setup that could be causing this. One is that you are using ktop to scope your processes.
    >>>>>
    ktop eats up 20MB of RAM?

    Which Gnome application are you running? That makes a big difference. If it is gnote then there is a problem.
    >>>>>>>>>
    I'm running gnibbles.

    If you are going to be running both Gnome and KDE simultaneously, you need to be prepared for the usage fees.
    >>>>
    I thought Linux was free?

    You are running two major systems in tandem (and be thankful that you can do that.)
    >>>>>>
    I'm not

    so do not be surprised when it uses a lot of hardware. To be honest with you, once I installed KDE 2 I haven't used Gnome at all. I never thought I would be saying that, but in my opinion KDE 2 surpasses Gnome as far as polish goes. Just my opinion. I don't even use KDE 2 that often, very rarely. Most of my applications do not reside in either the Gnome or KDE realm. I have them there in case I need them, dormant on my hard disk.
    >>>>>>
    I don't like unneeded things dormant on my harddisk. Second, several GNOME programs are nice, as are several KDE programs. I don't think my setup would be usable without both.

    Maybe I should ask what you are defining as a desktop user. I'm a desktop user and I can do 90% of my browsing with lynx. I'm comfortable with the program and it only returns what I need, the information, not the fancy mouse rollovers, flash plugins, and other such glut that serve no
    purpose other than to wow the user.
    >>>>>
    I use Lynx too, before I install X, but a graphical browser is necessary. With all the imagemaps and javascripts, etc, using Lynx is so much less efficient.

    I fully realize that the average desktop computer user (in my definition) wouldn't understand or like lynx. I don't suggest the world uses it, however I consider myself to be a desktop user and do not need floating layers on my web pages! So your definition is flawed somewhere.
    >>>>>>>>>>>
    Not really. You're not an average desktop user, I'd go so far to say that you are in the elite realm of "UNIX grognard." It is silly to think that the mass of people that GNOME and KDE2 are aimed at will use Lynx.

    A work in progress would be a much more apt and intelligent way of putting it than 'wannabe.'
    >>>>>>>>
    Probably

    Do recall that the entire desktop movement to which you are referring is only two or so years old now. It is very much in its infancy, and to assume that it is going to have every single thing you are used to with the MacOS or even Windows only displays your lack of understanding on the issues here.
    >>>>>>>>>>>>>.
    That's the problem. I've been using Linux (I only got BeOS a year or two ago) since the Slack 3.5 days. Admittedly it's not that old, but this was before KDE and GNOME were ready, before glibc, before WordPerfect was ported. Linux has come a long way since then, but people are acting like it is all ready to take over the desktop market, and the only thing keeping people from replacing Windows is MS's dirty tricks. That's simply not true. At least somebody is willing to admit that Linux isn't nearly as ready for the desktop as everybody pretends.

    Funny, I always seem to need those features.

    I find that difficult to believe. You actually use every feature of the latest Word versions?
    >>>>>>>>
    No, I use some of the obscure features that prevent me from using a lot of free word processing packages. If most people use 10% of the features, I use 11%, and that 1% are those features that only MS bothered to put in.

    Incidentally you might want to check out StarOffice from Sun. I believe it is open source now, and probably the most feature-full office suite out there under that label. If not, it is free.
    >>>>>>>
    Good god, I tried StarOffice; worst UI design ever conceived by man. Sadly, I'm happy to see it being ported to BeOS.

    Good for you, you happen once again to be in a pretty minute minority. Most people who use Linux do in fact need those applications.
    >>>>>
    That's the problem. People are trying to push Linux into the group of people who are NOT "Most people who use Linux." These people don't need all this, and app developers shouldn't force them to have these installed. They should code to one set of APIs, and let the user choose what to use. If the user wants Python, the user can use Python. But the app developer should not use it, if there is a standard alternative (Perl) available.

    Especially Perl; in fact some distributions use Perl extensively for their inner workings. Even if you do not personally ever invoke a Perl command line, there is a lot of stuff going on behind your back that is very likely done with Perl.
    >>>>>>>>>
    When did I say Perl was useless? An OS needs a scripting language, and Perl is quite useful. Hell, I even use it on BeOS. But take urpmi, for example. Why does it use Python?

    If you really do not need these things, I suggest you switch distributions. That is, if your distribution does not allow you to select what you install and you so irately do not want it. Again, display some adaptability.
    >>>>>>>>>>>>>
    Why don't the app developers display adaptability? Why use Python when Perl is available?

    You can very easily uninstall things you do not need too (again depending on your distribution) so even if the Beginner's Installation puts this stuff on without your knowledge, you can always take it off after install.
    >>>>>>>>
    I'm not an idiot, I know that. It's the Linux app developers who force me to keep stuff that I could uninstall, want to uninstall, but can't. Are you suggesting I code an urpmi alternative? Or I could switch to Debian and apt, and live with less-than-bleeding-edge software?

    Once again, ncurses, perl, TK, these are most certainly not non-standard libraries (Perl isn't really a library, but you get the point). Whoever told you that does not have a clue what they are talking about. As for Zsh, and Csh, those are shells, not libraries. I don't know any utilities or applications that require those shells to be functional.
    >>>>>>>>>
    CVS requires csh, and something in either Mandrake or Slackware requires either zsh or ash.
    While I agree that neither Perl nor ncurses are non-standard (which is why I didn't say they were) I would say that TK is. readline is. Python is. imlib is. fnlib is. Half the stuff installed on a Linux system is there to satisfy dependencies. That shows in the 500MB minimal, compatible (GNOME + KDE) install.

    Okay, my honest opinion after hearing you talk for a bit is that you are maybe a one-year or less user of Linux.
    >>>>>>>>
    Sorry, been using it (off and on, admittedly) since '96.

    You got into it because everybody told you it was cool.
    >>>>>
    When I started using it, it was still in hacker-land.

    The reality is that it is a server OS that has a lot of folks working very hard to get desktop-ish features attached to it in a very short amount of time. I still only recommend Linux to developers, sysadmins, and computer hobbyists. I don't try to get everyday gamers and whatnot over here because it isn't quite ready yet.
    >>>>
    But it's not being hyped as such.

    First impressions are important to some people, and right now it doesn't give a good first impression to somebody who has used Windows their entire life.
    >>>>>>>
    So why the hell is everybody on Slashdot pretending that their grandmother uses Linux?

    The *NIX philosophy hardly resembles the philosophy of Windows users, even in the way they use GUI applications. You tend to see lots of little apps spread out side by side over multiple virtual screens with X11. This is because over here, people design an application to do a few things, and do them well.
    >>>>>>>>>>>
    Speaking of philosophy, that USED to be the UNIX philosophy. If you take a look at KDE, GNOME, and half of modern applications, you'll notice that that is not the philosophy anymore. For example, X does printing. Why does GNOME do printing? Why are KDE and GNOME incompatible? The *NIX way to do it would be to have a consistent interface, and then let whatever you want fulfil that interface. Kinda like how text streams provide an interface, and you use whatever you want to manipulate that text stream. The way straight *NIX does it, the way X does it, that's the way to do it. Not the way modern Linux is doing it.

    other, and each of these apps are monolithically huge,
    >>>>>>>>
    Like X, Mozilla, KDE, and GNOME? I just heard GNOME is 4 million lines of code. That's not only twice the size of the kernel, but bigger than all of BeOS!

    Give it a year, trust me it will impress you more and more as time goes by.
    >>>>>>>
    I've been waiting for it to impress me since '96.

    Continue using NT 4 for your games (which is in itself absurd; why are you doing that? Why not use Win98 or WinME?)
    >>>>>>>>>>
    Because NT4 runs OpenGL apps faster, and all the games I have are compatible. That's not even an issue; all I play is Quake3 and CorumIII anyway.

    and your MS Word needs, and keep tabs on how the Linux world is growing.
    >>>>>>.
    90% of my documents are created in BeOS's Gobe Productive.

    I know the tone of this note is a bit harsh, but your outlook is a common one and needs addressing.
    >>>
    Really? Who else is a rabid BeOS user, likes simplicity, speed, and elegance, admires the philosophy of *NIX but is disgusted by its current implementation? (BTW, Quake3 and OpenGL are the only reason why NT is still on my machine.)

    It is part of the rift between those who have been served by the software giants all their computing life, and those who have been a part of a growing development process.
    >>>>>>>>>>
    Yahoo! Be is a software giant!

    Look, I have nothing against Linux, nothing against *NIX, and I understand the philosophy behind it, I just can't stand the fact that Linux is being sullied by what is currently happening. UNIX was designed to be elegant. In its current state on the desktop, Linux is decidedly NOT.
  • Uh, no. BeOS anti-aliases all its fonts, and from two feet away on my 1152x864 display, even the size 10 fonts (which are a little smaller, as BeOS tends to render on the small side) are perfectly legible. It's nothing special; QNX does it too.
  • What ARE you talking about? A Linux environment comparable to NT4.0 (Linux 2.4, KDE2, GNOME 1.2, and Netscape 6) takes up more RAM than does NT4. Linux may have a lot of advantages but RAM usage (at least from the GUI POV) ain't one of them.
  • Mod this up, he has a point. If the render extension uses hardware acceleration to do the alpha blit, then performance is no worse than it would be with regular TrueType fonts.
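    For anyone wondering what "the alpha blit" actually computes per pixel: below is a tiny sketch (my own illustration; only the idea of 256-level coverage comes from the article, the function name and colours are made up) of blending a glyph's coverage value over a background with the usual "over" arithmetic. Hardware acceleration just means the card performs this multiply-add per pixel instead of the CPU.

    # Illustrative sketch only; this shows the arithmetic, not the XRender API.
    def over(fg, bg, coverage):
        """Blend a foreground colour over a background using 0..255 glyph coverage."""
        a = coverage / 255.0
        return tuple(round(f * a + b * (1.0 - a)) for f, b in zip(fg, bg))

    black, white = (0, 0, 0), (255, 255, 255)

    # Full coverage gives solid ink, zero coverage leaves the background,
    # and a roughly 25%-covered edge pixel lands in between.
    print(over(black, white, 255))   # (0, 0, 0)
    print(over(black, white, 0))     # (255, 255, 255)
    print(over(black, white, 64))    # (191, 191, 191)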
  • You are, in a word, stupid. While you may have it nice and all with your high end 133ppi display, we mere mortals are stuck at 80ppi, and for us, anti-aliased text beats the hell out of anything else out there. If you want to send me one of these 200ppi displays, I'll recant all my statements in favor of anti-aliasing.
  • My vid card is a mere 16MB, but I use ktop. With KDE 1.2 only, it gives me around 19-22MB, which is about right. Load Netscape and GNOME (running all popular GUI applications is a criterion here), and switch to KDE 2.0 (for the network and component stuff NT already includes), and it bumps up to 45+MB, which is way beyond NT's 30-something MB, and about on par with Win2K.
  • How am I exaggerating? Comparing NT 4.0 to Blackbox or FVWM is stupid. When people say Linux doesn't have XXX feature, they point to KDE2 or GNOME and say, "but it DOES!" When someone points out that Linux is bloated, they say, "so use Blackbox or FVWM!" KDE 2.0 and GNOME 1.2 (both, because I need, say, gnometoaster and Konqueror or whatever, mainly because I use apps from both) are necessary because NT already includes component technologies and other tech found in KDE2 and GNOME. XFree86 4.0 is needed for 3D support, Netscape 6 is needed to compete with IE 5.5, and you need various other libs such as TK and ncurses (which I never use, but exactly ONE important app on the system does), and when you add all these together, you've got a system that easily rivals Win2K in memory usage.
  • > Is anti-aliased text really worth the extra processor/graphic cycles in most unix applications?

    (I'm a long-time Linux user). Every time I boot Linux from using Windows, I see how ugly the fonts are. GNOME looks (IMHO) much nicer than Windows, but the fonts really suck. I'm coding, so I spend a lot of time looking at text. The sooner this makes it into the distributions the better. First impressions count, and Linux fonts just aren't as nice as on Windows.

    So, I think it's great.

    0.02,
    Mike.
  • They are for print... I believe that fonts were originally designed for use on paper, not monitors. While it may look better on the screen because of high resolutions and such, it is a bit of a misrepresentation of the fonts to change them on the screen and not on paper.

  • Well-made observations, but there are of course no rules without exceptions, either way. The biggest problem in X, however, is getting good enough fonts. No free distribution ships good fonts, simply because the best ones are commercial.

    Here's a resource to obtain fonts, free or not:
    http://www.linux.org/docs/ldp/howto/Font-HOWTO-10.html [linux.org]
    The easiest way is to just copy fonts from your Windows-partition or CD (if you have Windows at all, but some of them may be downloaded too).

    I found this resource good when I did this on Mandrake 7.2 (before I saw the option in DraConfig->fonts to automagically install Windows fonts):
    http://www.linuxdoc.org/HOWTO/mini/FDU/index.html [linuxdoc.org]

    I got this page from this resource (containing a lot of links):
    http://www.kegel.com/linux/tt.html [kegel.com]

    Everybody who has Linux should do this. You will not want to barf at your screen again. You will enjoy Linux, and browsing web pages is like browsing with Internet Exploder. That's partly unfortunate, since both Nutscrape and Konqueror are just as stable. However, it's free and it's improving a lot!

    - Steeltoe
  • There is only so much anti-aliasing will do to correct bad fonts.

    Most of the bad fonts are bitmaps and aren't going to be affected by antialiasing unless they are rescaled bitmaps (excuse me, if you use rescaled bitmaps... I think I'm going to be sick...).

    That said, there is no reason to restrict your font habits to truetype ones. Postscript Type 1 fonts can also be rendered well using antialiasing and can also give extremely good results. At the end of the day, the quality of the font on screen is limited by the quality of the hinting done to protect the important features of each glyph from being lost when rendered.

    In windows, all of the true type fonts I use look great without anti-aliasing. If you want beautiful fonts in X windows use an X server that supports true type fonts.

    Even at tiny point sizes? I doubt it. Antialiasing is good for increasing readability of fonts at all point sizes but especially for small fonts. Without antialiasing, small fonts become a muddle of pixels.

    Cheers,

    Toby Haynes

  • I've installed the latest CVS code (to get the proper DGA mouse code for Quake 3) and it runs quite well w/nVidia's drivers.

    http://linuxquake.com/news?start=20

    This link provides some more info on how to install the latest CVS Xfree and get it to work w/nVidia's drivers.

  • Is higher resolutions on monitors. I can anti-alias the text into a blur, but (for my eyes anyway) I'd rather see SHARPER text with better contrast. Most monitors have dot pitches in the .22 - .25 range these days, and I can definitely see a difference between the two. If I understand it right, anti-aliasing improves the apparent resolution (translation: "readability") of a text glyph. I'd like to see the actual resolution improved. Don't get me wrong, AA is nice and good (and I use it), but let's get those 10K x 10K monitors out the door!
  • How much extra processing does it require? Has anyone measured it?
  • I'm certainly not promoting Qt over Gtk; I just wanted to get some samples that people could understand. Qt has the nice property that nearly all of the text rendering is funneled down to a very small set of functions.

    Talk about bang for the buck; two hours of hacking and I had a browser with anti-aliased text.

    Gtk might be as easy, I just haven't looked. Plus, what I wanted was a browser and konqueror is pretty usable for basic browsing.

  • Windows fonts are already anti-aliased; anti-aliasing is one of the features of TTF that should have been ported to X, IMNSHO, at least 3-4 years ago. The current XFree server doesn't anti-alias any of the fonts, hence they look blocky. Hopefully, this means that the non-TrueType fonts, some of which I love, are anti-aliased too.
  • The reason for that is that Windows does not support sub-pixel positioning and does not do full supersampling. Horizontal or vertical lines are not "smoothed". The intention is to keep fonts looking as sharp as possible. The problem with this approach is that the local black-and-white distribution of the characters is modified, and since the relative error gets bigger with smaller fonts, they would look too ugly with adaptive antialiasing enabled.

    On the other hand, if you do full sub-pixel positioning and full antialiasing, small fonts really benefit from antialiasing.

    What you want is a screen resolution such that you can almost but not quite tell two pixels apart from normal viewing distance, floating point screen addressing and full antialiasing.
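    To make "sub-pixel positioning plus full antialiasing" concrete, here is a small sketch (mine, not the poster's; the function name, sample count, and stem width are arbitrary choices) that computes per-pixel coverage for a one-pixel-wide vertical stem by horizontal supersampling. A renderer limited to whole-pixel placement would snap all of the ink into a single column; with fractional positioning the ink is split across two columns, preserving the stem's weight and apparent position.

    def stem_coverage(x_offset, width=1.0, pixels=4, samples=16):
        """Per-pixel coverage of a vertical stem occupying [x_offset, x_offset + width)."""
        coverage = []
        for px in range(pixels):
            inside = 0
            for s in range(samples):
                x = px + (s + 0.5) / samples   # centre of each horizontal sub-sample
                if x_offset <= x < x_offset + width:
                    inside += 1
            coverage.append(inside / samples)
        return coverage

    print(stem_coverage(1.0))    # [0.0, 1.0, 0.0, 0.0]   (whole-pixel placement)
    print(stem_coverage(1.25))   # [0.0, 0.75, 0.25, 0.0] (quarter-pixel placement)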

  • The optimum is a high resolution display and anti-aliasing. High resolution provides sharpness, anti-aliasing provides correct black-white distribution. The latter is very important and can not be achieved with just (reasonably) higher resolution.
  • Well, Keith, the guy doing this work, has made "hacked Tk and Qt libraries [which] will be available in source form soon", according to the article. So, no, Qt doesn't have the support by default, but it appears that it is being added as we speak.
  • Unless I'm mistaken, Mozilla makes use of Gtk for doing its GUI stuff, right? And does this include the font work? Because, if that's the case, then modifying Gtk to support font anti-aliasing (as was done to Qt to get that Konqueror screenshot we saw) could result in automatic support in Mozilla (not to mention all the other Gtk applications).
  • Sub-Pixel rasterization is, in a word, stupid. Just because something can be done doesn't mean it should.

    This technology is only beneficial in LCD environments with typical filter configurations.

    It is a work-around for a rapidly diminishing problem: the requirement for extremely high resolutions to support very fine text rendering.

    The economics of LCD displays is very similar to that of RAM chips. There are technical requirements to increase density, but there are very few cost factors. LCD displays are currently displaying up to 200 PPI and will ultimately display 300+ PPI on consumer displays. This rivals laser-quality print, on screen.

    Currently available laptops from DELL and IBM display 1600x1200 on 14.x inch displays. That represents 133+ PPI! These displays really don't need sub-pixel rendering. They need scaling.

    The problem which really needs to be addressed is scaling on these very high-res displays, not dumb work-arounds for problems which don't exist. Anti-aliasing is of course always beneficial, but the perceived need for sub-pixel rendering is only being driven by Microsoft's ClearType having resurrected an otherwise dead idea. Its appeal rests solely in its apparent cleverness.

  • I have to agree with the comment about AA on non-Linux systems. When I've turned on the font-smoothing in MacOS it turns all the beautiful system fonts into an illegible mess.

    It's a goddamn computer. Personally, I want all my on-screen text to look like I'm using a computer. I prefer Courier and the even more classic computer-type fonts. These are displayed in very clear contrast colors (like black on white) that make it simple to detect the edges of the letters.

    The only time anti-aliasing is enjoyable for me is when I'm doing layout in graphics applications, or creating text in a JPG/GIF/PNG.
  • Now I'm just waiting for mozilla to support this.

    And waiting to go through a major XFree86 upgrade as well, right? X server upgrades are never fun.

    Does anybody know if the CVS code with XRender in it is even module binary-compatible with XFree 4.0.1 (for my poor nVidia drivers... sniff)? I'm guessing that this requires some fairly major changes to the guts of X to implement.

  • Sure, AA might look beautiful in higher resolutions, but at a low resolution like 640x480 (or even 800x600), it looks like barf. Furthermore, some fonts were meant to be shown without anti-aliasing (MS Sans Serif, Times New Roman, and Arial in Windows; I'm sure there's some in X).

    Before you flame me, keep in mind one thing: Autodesk has not and most likely will not implement anti-aliasing in AutoCAD. Autodesk sees no reason to implement AA, at least in the Model view. The key reason for this is that an anti-aliased picture tends to "lie" about its details. Any lies/deceptions/blurriness in CAD drawings can be fatal to a legitimate project.

  • Is antiailiased text really worth the extra processor/graphic cycles in most unix applications?

    That depends entirely on the user and the role of the computer. Me, I like things pretty on my workstations, speedy on my servers; thus, I'd be inclined to take advantage of this on a workstation and ignore it (along with the rest of X, in many cases) for a server.

    That said, there are apparently enough people out there that want anti-aliased fonts that they're being added to X, despite the added overhead. This is one of the great strengths of community-driven code: if enough users want something, the issue eventually gets addressed by the developer.

    $ man reality


  • Anti-aliasing of black-and-white text works, but it's worth noting that anti-aliasing of colored text is intrinsically flawed: blending colored text onto a colored background creates linear combinations of two different hues which are perceptually different from the original hues, creating a false dark border next to the text.

    To see what I mean, use xmag or xzoom to look at the border area between the red text and the green background in the anti-aliased xterm image [xfree86.org] (4th column, 8th row) on Keith Packard's X rendering page.
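    A rough numerical illustration of the darkening half of that effect (the hue shift itself is perceptual and harder to show with numbers): blending the 8-bit encoded values directly, as a simple rasteriser does, gives an edge pixel that emits far less light than either the text or the background, while blending in linear light avoids the dark fringe but still cannot make the mixed hue match the originals. The gamma value and helper names below are assumptions for the sketch, not anything taken from the XFree86 code.

    GAMMA = 2.2   # assumed, typical CRT-like display transfer

    def to_linear(v):          # 8-bit encoded value -> linear light, 0..1
        return (v / 255.0) ** GAMMA

    def to_encoded(lin):       # linear light -> 8-bit encoded value
        return round(255.0 * lin ** (1.0 / GAMMA))

    red, green = (255, 0, 0), (0, 255, 0)

    # Naive blend of the encoded values (what a simple rasteriser does):
    naive = tuple((r + g) // 2 for r, g in zip(red, green))
    print(naive)              # (127, 127, 0)
    print(to_linear(127))     # ~0.22: each channel emits only ~22% of full light

    # Blending in linear light and re-encoding keeps the edge near 50% intensity:
    correct = tuple(to_encoded((to_linear(r) + to_linear(g)) / 2.0)
                    for r, g in zip(red, green))
    print(correct)            # (186, 186, 0), noticeably brighter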


  • An unadjusted monitor, through fortunate circumstances, very closely matches the non-linear response of the human eye.

    Why would you want a monitor with a non-linear response? Suppose a monitor is displaying an image of some everyday street-scene. A human watching the monitor would like the monitor and the real-world to look identical. This means the light emitted at every point on the monitor should be equal to, or, less ideally, just proportional to, the intensity of light at the corresponding point in the original scene. This is identical to saying ideal monitors have a linear response.

    The visibility of steps between neighbouring gray levels in a gray level ramp/gradient image is mainly due to the Mach effect in human vision exaggerating the perceived steps, rather than being due to the size of the gray-level differences. Because of the Mach effect, viewing a gray level ramp is actually a poor method of judging the quality of a monitor and its calibration. A monitor should be calibrated with proper test equipment like a photometer, and, ideally, a spectrophotometer.

    By the way, the human retina has a logarithmic response to light intensity, while an uncalibrated monitor has a power-law response to gray level, with gamma as the exponent. Visual perception and many interesting neurophysiological experiments are discussed by Spillman and Werner (Visual Perception, 1990).
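    As a rough numerical picture of that transfer curve (the gamma of 2.2 below is an assumed typical value, not a measurement): equal steps in 8-bit gray level map to very unequal steps in emitted light, which is one more reason a gray ramp is a poor eyeball test of calibration.

    GAMMA = 2.2   # assumed display exponent

    def luminance(gray):
        """Relative emitted light (0..1) for an 8-bit gray level on a gamma-2.2 display."""
        return (gray / 255.0) ** GAMMA

    for g in (32, 64, 96, 128, 160, 192, 224):
        print(f"{g:3d} -> {luminance(g):.3f}")
    # The step from 32 to 64 adds only ~0.04 of light (0.010 -> 0.048),
    # while the same 32-level step from 192 to 224 adds ~0.22 (0.536 -> 0.752).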

  • Anti-aliasing isn't the only reason that Windows fonts look better than X fonts.

    First of all, Windows fonts are only antialiased at > 16pt or so, which covers almost none of the text on my screen. Second, Font Smoothing wasn't even enabled on this default Win2000 install. Third, the standard system font on older versions of Windows (98, NT4) is "MS Sans", which is a non-scalable bitmapped font anyway.

    No, Windows fonts look better because they spent more time and effort designing them. Particularly, they heavily use TTF's hinting features to improve screen readability, especially for the web fonts that you can download with IE or from their Truetype page.

    So, yeah, anti-aliasing can look pretty slick (especially in the XTerms in the screenshots), but the first step is still having a good font to render in the first place, which no Linux install I've seen has.
  • Anti-aliasing and blending could be done using a 3D video card (such as a Voodoo or GeForce) with an X 4.0.1 DRI driver.
    Was it ever tried?
  • by spitzak ( 4019 ) on Tuesday December 05, 2000 @11:02AM (#581219) Homepage
    You are completely mistaken. Small fonts are much more readable with antialiasing.

    The obvious example is to look at text broadcast on the TV versus the text displayed on the screen by your VCR or cable box. It should be obvious that you can read the TV text at sizes that are far smaller than the box will attempt. This is because the box is not antialiased, while text used by the stations is (especially if it is a video image of a printed piece of text).

    I also don't know what AutoCAD is talking about. Antialiased lines are far easier to see than aliased ones, especially if there are many almost-parallel ones at angles slightly off horizontal and vertical. Antialiasing of thin lines has been done for 25 years in top-of-the-line CAD workstations, despite the extreme (for then) computation overhead. Take a look at any old graphics book.

  • by mattbee ( 17533 ) <matthew@bytemark.co.uk> on Tuesday December 05, 2000 @07:48AM (#581220) Homepage
    Yes, dammit! Given that my old Acorn A3000 (based on an 8MHz ARM2, 2MB memory) had anti-aliased fonts switched on by default, and the desktop still flew along nicely (the rendering might have been slow, but the bitmap cache ensured that the desktop was always responsive for a fairly modest outlay of memory). Here's a nice shot of the font rendering [tu-muenchen.de]. No, I don't use it any more, so I can't possibly be a rabid advocate, but I know from experience that anti-aliasing isn't hard to do efficiently.
  • by tcd004 ( 134130 ) on Tuesday December 05, 2000 @07:13AM (#581221) Homepage
    Is antiailiased text really worth the extra processor/graphic cycles in most unix applications?

    sure, it's great for page layout, but that's why I have a mac.

    tcd004 Tired of Election Coverage? How about some UNCOVERAGE? [lostbrain.com]

  • by cradle ( 1442 ) on Tuesday December 05, 2000 @08:03AM (#581222) Homepage Journal
    I think decent scaled fonts may be more important than anti-aliasing. Take a look at this:
    -David
  • by joshv ( 13017 ) on Tuesday December 05, 2000 @07:18AM (#581223)
    There is only so much anti-aliasing will do to correct bad fonts.

    In windows, all of the true type fonts I use look great without anti-aliasing. If you want beautiful fonts in X windows use an X server that supports true type fonts.

    -josh
  • by frantzdb ( 22281 ) on Tuesday December 05, 2000 @09:00AM (#581224) Homepage
    Yes and no...


    That's an interesting view of anti-aliasing, but somewhat closed-minded. The author's main argument is ``Frankly, anti-aliased text just looks bad.'' This is silly. I'll agree that at some sizes, with fonts designed for the job, anti-aliased text is harder to read. That certainly doesn't make it look bad, though. The problem with anti-aliased text is when its blurriness makes your eyes strain to try to focus as you read. Non-anti-aliased and slightly anti-aliased font rendering fixes this problem. However, when you are looking at very small letters, non-anti-aliased text is illegible. Very large letters, on the other hand, look pixelated when not anti-aliased. If you look at a Windows machine, you'll see that normal reading-sized fonts are not anti-aliased. Only large and small fonts are. The author also bashes ``ClearType'' for being a form of anti-aliasing, even though it actually uses more ``pixels'' (by addressing each color of the pixels separately).

    I'll shut up before I go off on too much of a rant, but it seems like this person simply doesn't understand what he's talking about. Anti-aliasing has limitations but so does your screen.

    --Ben

  • by AntiPasto ( 168263 ) on Tuesday December 05, 2000 @07:09AM (#581225) Journal
    I've had anti-aliased text on my xterms for quite a while with this .50 dot pitch 14-inch monitor I have. Hell, sometimes it's so anti-aliased I can't even tell if it's BSD or Linux...

    ----

  • by Abcd1234 ( 188840 ) on Tuesday December 05, 2000 @07:13AM (#581226) Homepage
    AFAIK, apps have to be modified to make use of the new anti-aliasing features. Of course, if you modify the toolkits (GTK/Qt/Xt/etc) to use the anti-aliasing stuff, you're half way there already.
  • by tjwhaynes ( 114792 ) on Tuesday December 05, 2000 @08:01AM (#581227)

    Sure, AA might look beautiful in higher resolutions, but at a low resolution like 640x480 (or even 800x600), it looks like barf.

    Total and utter *&^*&^^.. :-)

    Antialiasing improves the readability of a font at small sizes. That's why Acorn went to all the trouble of having it in RISC OS back in 1987: when your vertical resolution can be as low as 256 lines or less, keeping the fonts readable as the point size drops below 6 pts is impossible without anti-aliasing. They had this resolution because that was back in the days when people used their tellies as monitors.

    Furthermore, some fonts were meant to be shown without anti-aliasing (MS Sans Serif, Times New Roman, and Arial in Windows; I'm sure there's some in X).

    That's because MS still hasn't got its antialiasing working properly - hence MS 'font smoothing' is an appropriate title. Anti-aliasing is not just about blurring the edges - it is about increasing the apparent resolution of the text by using greyscales, the same way a truecolour photo has a higher apparent resolution than a black-and-white 2-colour image on the same display. Doing it right has a massive effect on the readability of the text on screen. Because MS's implementation doesn't cut it at small point sizes, they tweaked the TrueType fonts to render more reliably to the screen instead.

    Cheers,

    Toby Haynes

  • by the coose ( 171981 ) on Tuesday December 05, 2000 @07:26AM (#581228)
    Correct. I would suggest reading the Deuglification HOW-TO. [linuxdoc.org] In 10 minutes you can tweak X 3.3.x or 4.0.x to have great looking fonts.
  • by eXtro ( 258933 ) on Tuesday December 05, 2000 @07:34AM (#581229) Homepage
    Yes, for me at least, it would be worth the extra processing power. My previous workstation was a Silicon Graphics O2. It wasn't the most powerful workstation in the world but the display quality was amazing.

    The reports I currently am working with require me to analyze them and make decisions based on the information within them. The problem is that these files are about 220 characters wide.

    I could reduce the point size on my O2 and still easily read these files. The whole line is available at once, no horizontal scrolling. Very convenient.

    I recently 'upgraded' to a linux box. The power in this box absolutely dwarfs my O2, but I can no longer use these small fonts and read stuff. I'm forced to work around it by using a text editor with horizontal scrolling rather than a simple terminal window and 'less'.

    I'm still using the same monitor from my O2, so that isn't where the weakness is. Arguably the video card could be spitting out less sharp graphics (nVidia Quadro or whatever it's called), but most of the impact seems to be from the lack of anti-aliasing.

    You've always got the option of not using it, but to make a blanket statement that it isn't useful is inane.
