Graphics Open Source

Glyphy: High Quality Glyph Rendering Using OpenGL ES2 Shaders 59

Recently presented at Linuxconf.au was Glyphy, a text renderer implemented using OpenGL ES2 shaders. Current OpenGL applications rasterize text on the CPU using FreeType or a similar library, uploading glyphs to the GPU as textures. This inherently limits quality and flexibility (e.g., rotation and perspective transforms cause the font hinting to become incorrect, and subpixel antialiasing is impossible). Glyphy, on the other hand, uploads typeface vectors to the GPU and renders text in real time, performing perspective-correct antialiasing. The presentation can be watched or downloaded on Vimeo. The slide sources are in Python, and I generated a PDF of the slides (warning: 15M due to embedded images). Source code is at Google Code (including a demo application), under the Apache License.
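The idea of evaluating glyph outlines per fragment on the GPU can be illustrated in spirit (this is an illustrative sketch, not Glyphy's actual code, which works from exact arc distances in a GLSL fragment shader) with distance-based antialiasing: given the signed distance from a pixel center to the outline, coverage ramps smoothly from opaque to transparent over roughly one pixel, and this stays correct under any transform because the distance is evaluated per fragment.

```python
def coverage_from_distance(signed_dist, pixel_width=1.0):
    """Approximate pixel coverage from the signed distance to a glyph edge.

    Negative distance = inside the glyph, positive = outside.
    Coverage ramps linearly from 1 to 0 across one pixel width,
    mimicking what a distance-field fragment shader computes.
    """
    t = 0.5 - signed_dist / pixel_width
    return max(0.0, min(1.0, t))

# Deep inside the glyph: fully opaque.
assert coverage_from_distance(-2.0) == 1.0
# Exactly on the outline: half-covered.
assert coverage_from_distance(0.0) == 0.5
# Well outside: fully transparent.
assert coverage_from_distance(2.0) == 0.0
```

Because the distance (not a pre-rasterized bitmap) is what the GPU interpolates, the antialiased edge stays sharp under rotation and perspective, which is the flexibility the summary describes.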
This discussion has been archived. No new comments can be posted.


  • Re:Surprising (Score:5, Interesting)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Wednesday January 15, 2014 @03:57PM (#45968757) Homepage

    Although rendering text correctly is maddeningly complex, the reasons described here aren't actually among them.

    The things described here are more a result of the good, established libraries having been written only for the CPU: not because the GPU is harder to target, but simply because nobody had taken the time to do it.

  • Utter dribble (Score:4, Interesting)

    by Anonymous Coward on Wednesday January 15, 2014 @05:49PM (#45969813)

    There is NOTHING that the GPU can do that software rendering on the CPU cannot do. There MAY be a speed penalty, of course (and were you using the CPU to render a 3D game rather than your GPU, the penalty would be on the order of thousands to tens of thousands of times slower).

    The reverse is NOT true. There are rendering methods available on the CPU that the GPU cannot implement, because of hardware limitations. Take COVERAGE-BASED anti-aliasing, for instance.

    On the CPU, it is trivial to write a triangle-fill algorithm that PERFECTLY anti-aliases the edges by calculating the exact percentage of each pixel that the triangle edges cover. Amazingly, this option is NOT implemented in GPU hardware. GPU hardware uses the crude approach of pixel super-sampling, which can be thought of as an increase in the resolution of edge pixels. So, for instance, all 4x GPU anti-aliasing methods effectively double the resolution of edge pixels in each dimension (a pixel becomes, in some sense, 2x2 pixels).
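The coverage argument can be made concrete with a toy example (illustrative only, using the simplest possible edge: a vertical half-plane boundary crossing a unit pixel): the exact covered area is computable in closed form, while a 2x2 supersampled resolve can only report 0, 1/4, 1/2, 3/4, or 1.

```python
def exact_coverage(edge_x):
    """Exact fraction of a unit pixel [0,1]x[0,1] covered by the
    half-plane x < edge_x -- the 'coverage-based' CPU approach."""
    return max(0.0, min(1.0, edge_x))

def supersample_coverage(edge_x, n=2):
    """n x n supersampled estimate of the same coverage: each sample
    point contributes all-or-nothing, as a GPU multisample resolve does."""
    sample_xs = [(i + 0.5) / n for i in range(n)]
    # Each x-column of samples is counted once per y-sample row.
    covered = sum(n for sx in sample_xs if sx < edge_x)
    return covered / (n * n)

# The exact method resolves any coverage value...
assert exact_coverage(0.3) == 0.3
# ...while 2x2 supersampling quantizes it to the nearest quarter.
assert supersample_coverage(0.3) == 0.5
```

The quantization error is what shows up visually as stair-stepping on near-horizontal or near-vertical edges, exactly where text rendering is most demanding.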

    Edge coverage calculations, while trivial to implement in hardware, were never considered 'useful' in common GPU solutions.

    A 'GLYPH' tends to have a highly curved contour, which sub-divides into a nasty mess of GPU unfriendly irregular tiny triangles. GPUs are most efficient when they render a stream of similar triangles of at least 16 visible screen pixels. Glyphs can be nasty, highly 'concave' entities with MANY failure conditions for fill algorithms. They are exactly the kind of geometric entities modern GPU hardware hates the most.
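The claim that concave outlines defeat naive fill algorithms is easy to demonstrate with a toy sketch (not Glyphy code): fan triangulation from a single vertex, which is valid for any convex polygon, produces flipped (wrong-winding) triangles as soon as the outline has a notch that hides vertices from the fan origin.

```python
def signed_area2(a, b, c):
    """Twice the signed area of triangle abc (positive = counter-clockwise)."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])

def fan_is_valid(poly):
    """Naive fan triangulation from vertex 0 of a counter-clockwise
    polygon: valid only if every fan triangle keeps CCW winding,
    which is guaranteed for convex polygons but not concave ones."""
    return all(signed_area2(poly[0], poly[i], poly[i + 1]) > 0
               for i in range(1, len(poly) - 1))

# A convex quad fans cleanly from any vertex...
assert fan_is_valid([(0, 0), (2, 0), (2, 2), (0, 2)])
# ...but a concave outline with a notch at (1, 0.5), like many glyph
# contours, yields a flipped triangle and an incorrect fill.
assert not fan_is_valid([(0, 0), (2, 0), (2, 2), (1, 0.5), (0, 2)])
```

Real glyph fillers therefore need ear clipping, stencil-buffer even-odd tricks, or (as Glyphy does) a shader that evaluates the outline directly instead of triangulating it.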

    It gets WORSE, much much worse. Modern GPU hardware drivers from AMD and Nvidia purposely cripple common 2D acceleration functions in DirectX and OpenGL so they can sell so-called 'professional' hardware (with the exact same chips) to CAD users and the like. The situation got so bad with AMD solutions that tech sites could not believe how slowly modern AMD GPU cards rendered the new accelerated Windows 2D interface, forcing AMD to release new drivers that backed off on the throttling just a little.

    Admittedly, so long as accelerated glyph rendering goes through the 3D pipeline and ordinary shader solutions, the crippling will not happen; but the crippling WILL be back if non-gaming forms of anti-aliasing are activated in hardware on Nvidia or AMD cards. Nvidia, for instance, boasts that drawing anti-aliased lines with its $2000-plus professional cards is hundreds of times faster than doing the same with the gaming version of the card, which uses the EXACT same hardware and actually has faster memory and GPU clocks.

    It gets WORSE. Rendering text on a modern CPU and displaying it using the 2D pipeline of a modern GPU is very power efficient. However, activate the 3D gaming mode of your GPU by accelerating glyphs through the 3D pipeline, and power usage goes through the roof.
