Graphics

Software Rendering Engine GPU-Accelerated By WebCL

Phopojijo writes "OpenGL and DirectX have been the dominant real-time graphics APIs for decades. Both are catalogs of functions which convert geometry into images using predetermined mathematical algorithms (scanline rendering, triangles, etc.). Software rendering engines calculate colour values directly from the fundamental math. Reliance on OpenGL and DirectX could diminish when GPUs are utilized as general 'large batches of math' solvers which software rendering engines offload to. Developers would then be able to choose the algorithms that best suit their project, even natively in web browsers with the upcoming WebCL."

  • by Hsien-Ko ( 1090623 ) on Wednesday October 02, 2013 @02:18PM (#45017711)
    ...is something I'd really like to see, especially one that does Glide and the dither+undither characteristic of the first three Voodoo cards.

    I know there's MAME/MESS but I don't think they do the infamous filtering.
    • by bored ( 40072 )

      This sounds like the perfect application for a shader applied to a whole-screen texture. See https://en.wikibooks.org/wiki/OpenGL_Programming/Post-Processing [wikibooks.org]

      Or is there something I'm missing? After all, there are already a number of Glide wrappers, like nGlide [zeus-software.com] and Glidos [glidos.net].
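
      For the "shader applied to a whole-screen texture" idea, here is a minimal sketch of a post-process pass that quantizes the frame to RGB565 with a 2x2 ordered-dither bias, loosely mimicking the Voodoo look. It assumes the scene has already been rendered into a texture and is being drawn back with a full-screen quad; the uniform/varying names are illustrative, not taken from any existing wrapper.

        const ditherFrag = `
          precision mediump float;
          uniform sampler2D uScene;  // the finished frame, as a texture
          varying vec2 vUv;

          void main() {
            // 2x2 Bayer threshold (0.00 0.50 / 0.75 0.25) from pixel position
            vec2 p = mod(gl_FragCoord.xy, 2.0);
            float t = p.x * 0.5 + p.y * 0.75 - p.x * p.y;

            // bias, then truncate to 5/6/5 bits per channel
            vec3 c = texture2D(uScene, vUv).rgb;
            vec3 levels = vec3(31.0, 63.0, 31.0);
            gl_FragColor = vec4(floor(c * levels + t) / levels, 1.0);
          }`;

      The "undither" (the filter the early Voodoos applied on scan-out) could be approximated the same way, with a second pass that box-filters neighbouring pixels.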

      • Or is there something I'm missing?

        Yeah, the whole API and software rendering thing.

        The advantage of early APIs like Glide was that they were much lower level than OpenGL/D3D, allowing for very efficient hardware-assisted software rendering.

        As a simple for-instance: in the transition from those lower-level APIs plus software rendering to the higher-level APIs and fully-hardware rendering, things like landscapes suddenly had to be built from polygons, and their datasets ballooned enormously. There was nearly an entire decade of fixed-function-

        • by bored ( 40072 )

          Oh, I understand the reasons for wanting a software rendering engine. But the OP was looking for functionality that is achievable right now (AFAIK), because most of the fixed-function behavior of early 3D APIs/GPUs (like Glide) is now programmable and can be simulated with the more generic pipelines in modern GPUs. In fact, there are attempts at writing full-blown ray tracers by misusing just GLSL.

          But, more to the point, I'm not even sure GPUs are necessary for much game-related drawing. I recent

        • by gl4ss ( 559668 )

          I think people are glorifying glide in this thread a biiiiiit toooo muuuch.

          glide drew triangles. what the voodoo did in hw was draw z-buffered, textured and shaded horizontal spans (sketched below). what this enabled was that people with sw engines could just point their trifiller at it and away they went. now if the engine modelled the ground as a surface (like the game outcast) then there was no fucking way to use a 3dfx voodoo to accelerate that.

          I am not aware of a SINGLE GA
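
          a minimal sketch, in plain javascript, of that z-buffered span fill; the names and the flat-colour pixel format are illustrative, and a real trifiller would also step texture coordinates and shading across the span:

            // Fill one z-buffered horizontal span: the per-pixel inner loop
            // that span-based software engines ran themselves, and that the
            // voodoo (via Glide's grDrawTriangle and friends) effectively
            // ran for them in hardware.
            function drawSpan(frame, zbuf, width, y, x0, x1, z0, z1, color) {
              const dz = (z1 - z0) / Math.max(1, x1 - x0); // z step per pixel
              let z = z0;
              for (let x = x0; x < x1; x++) {
                const i = y * width + x;
                if (z < zbuf[i]) {     // closer than what is already there
                  zbuf[i] = z;
                  frame[i] = color;
                }
                z += dz;
              }
            }

          an engine's rasterizer walks the triangle edges and emits one such span per scanline. a heightfield/surface renderer like outcast's has no triangles to hand over, which is exactly the mismatch above.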

  • STAAAAAHP! (Score:5, Insightful)

    by girlintraining ( 1395911 ) on Wednesday October 02, 2013 @02:18PM (#45017713)

    Developers would then be able to choose the algorithms that best suit their project, even natively in web browsers with the upcoming WebCL."

    If web browsers were people, that statement would have caused a mass suicide. Guys, stop trying to turn the browser into a platform. It introduces so many layers of complexity and security issues that it's a miracle anyone has any trust or faith in the internet at all. It's getting to the point where the only way to safely browse the net is to shove the entire browser into a virtual machine... and even that only manages to protect your own computer, to say nothing of your online activities, credentials, life, etc.

    We need to be making browsers simpler, not more complex. Feature bloat is turning these things into a leper colony inside your computer... a cesspool of malware and vulnerability. Don't add to it by coming up with some new way for developers to directly access the hardware of your computer because you're too fucking lazy to write an app to do whatever it is, and want to cram it into the browser instead. You're just encouraging them.

    Seriously, we need a 12 step program for these "web 2.0" people.

    • +1

      Hell, make it +10.

    • Developers would then be able to choose the algorithms that best suit their project, even natively in web browsers with the upcoming WebCL."

      If web browsers were people, that statement would have caused a mass suicide. Guys, stop trying to turn the browser into a platform. It introduces so many layers of complexity and security issues that it's a miracle anyone has any trust or faith in the internet at all. It's getting to the point where the only way to safely browse the net is to shove the entire browser into a virtual machine... and even that only manages to protect your own computer, to say nothing of your online activities, credentials, life, etc.

      We need to be making browsers simpler, not more complex. Feature bloat is turning these things into a leper colony inside your computer... a cesspool of malware and vulnerability. Don't add to it by coming up with some new way for developers to directly access the hardware of your computer because you're too fucking lazy to write an app to do whatever it is, and want to cram it into the browser instead. You're just encouraging them.

      Seriously, we need a 12 step program for these "web 2.0" people.

      Browser-based apps are not done because developers are "too fucking lazy to write an app to do whatever," but because it costs more money to do it for multiple platforms, including mobile ones, and because browser apps just work without the need to install anything (the app itself, a JRE, whatever) or to have some kind of elevated user privileges.

      • Seriously, we need a 12 step program for these "web 2.0" people.

        I can think of 5:

        1. Line 'em up.
        2. Shoot 'em.
        3. Bulldoze 'em into a ditch.
        4. Cover with lime.
        5. Repeat.

        • 5. Repeat

          Well, that actually accounts for eight of the twelve steps.

          I suggest some modding:
          1. round up politicians and remaining lawyers
          2. add them to the "web 2.0" crowd
          3. line them up
          4. Shoot 'em
          5. cover with lime
          6. bulldoze 'em into a ditch
          7. repeat (you can never be too careful here)

          And there's your 12 step program!

      • by Lennie ( 16154 )

        There are millions of web developers and only a couple hundred thousand 'native app' developers for iOS and Android, and the latter charge a lot more money.

        Really, development speed and the required knowledge of native platforms are important factors. If you only need to know one platform and can reuse code, that translates to less time, less platform-specific knowledge, and thus less cost.

        Less cost, that's what this is about. Businesses like less cost.

    • Re:STAAAAAHP! (Score:4, Interesting)

      by Phopojijo ( 1603961 ) on Wednesday October 02, 2013 @02:34PM (#45017901)
      Actually, I look at web browsers as an art platform. They are programmed against a set of open standards, which gives any person or organization the tools to build support for whatever content is relevant to society. A video game, designed in web standards, could be preserved for centuries by whoever deems it culturally relevant.

      For once, we have a gaming platform (besides Linux and BSD) which allows genuine, timeless art. If the W3C, or an industry body like them, creates an equivalent pseudo-native app platform... then great. For now, the web is the best we have.
      • POSIX has been around for a while. ISO/IEC also publish open standards for C and C++.
      • It is programmed by a set of open standards which gives any person or organization the tools to build support for the content which is relevant to society.

        I think you need to lay off the koolaid, man.

        JPEG [wikipedia.org] and GIF [wikipedia.org] both have licensing issues; they are not free. The intended replacement for these, PNG, hasn't seen widespread adoption, can't do animations, but has no licensing issues. In fact, if you take a walk down a list of all the multimedia technologies commonly used on the web, MP3, MPEG4, h.264, AAC, surround sound... you will find yourself in a veritable desert when it comes to truly free standards. The standards may be 'open', that is, published... but a

        • JPEG and [wikipedia.org] GIF [wikipedia.org] both have licensing issues; They are not free.

          Are you kidding me? The patents for GIF expired long ago [freesoftwaremagazine.com]. As for JPEG, that's as much a "living standard" as HTML5 is. It's worth researching further, but I'd think the older parts of JPEG aren't too problematic [pcworld.com].

          The intended replacement for these, PNG, hasn't seen widespread adoption, can't do animations, but has no licensing issues.

          PNG has never been and never will be an intended replacement for JPEG, as PNG is lossless and JPEG is (mostly) lossy. And in what way hasn't PNG seen widespread adoption? It is the dominant lossless image format and is used absolutely everywhere. PNG can do animations too (though it's not supported

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Yeah, they really missed the boat not freezing computing in the 70s. These kids, wanting features and interesting applications and useful computing, are a bunch of assholes. They should be forced to do things MY preferred way because my opinion is the only one that matters.

      Fucking slashtards.

    • ... or if your users are too fucking lazy to install an app. Too bad, make them do it.

      • I'm not sure that's the best solution either. With the current desktop offerings, all applications run with the full permissions of the user. Things are a little bit better on the mobile side. At least with Android I can see which permissions an app has, and by default they are very limited in what they can do. With Windows/Linux, any application I run can go and delete my entire home folder, or send it all out to some site on the web, or wreak all kinds of havoc. Currently, running in a web browser, if t
        • I'm not sure that's the best solution either. With the current desktop offerings, all applications run with the full permissions of the user. Things are a little bit better on the mobile side. At least with Android I can see which permissions an app has, and by default they are very limited in what they can do. With Windows/Linux, any application I run can go and delete my entire home folder, or send it all out to some site on the web, or wreak all kinds of havoc. Currently, running in a web browser is the only pseudo-sandbox that exists for desktop systems. I'd much rather run a web app from some random company than install some application on my computer.

          On Linux, isn't that what AppArmor or SELinux is supposed to help mitigate?

      • Or if your users happen to have chosen a platform that gives the operating system publisher veto power over apps, and the operating system publisher has chosen to exercise this power over your app. For example, see Bob's Game or any story about rejection from Apple's App Store.
        • So sad for the users (and developers who write for it).

          • by tepples ( 727027 )
            Would you prefer to be able to read Slashdot through the web but have to buy an app (and a different brand of computer to run the app) in order to post? That's what it'd be like.
            • The difference is that Slashdot is a website, not an application.

              • If you make a distinction [wikipedia.org], you need to explain the difference [logicallyfallacious.com]. Where does a website end and an application begin? Slashdot and other web boards are essentially web-based workalikes of an NNTP user agent, and that's certainly an application. I had assumed that the line was that one reads a "website" but posts using an "application". If the line is elsewhere, where do you draw it?
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Guys, stop trying to turn the browser into a platform.

      No. Or "too late", rather.

      these "web 2.0" people

      These "web 2.0" people are going to continue to ignore you until what you espouse is as obviously stupid to everyone as it is to them. I'm not sure it isn't already.

    • The silver lining in this is that it lets us dazzle the suits with just how fast our C++ apps are in comparison to the mess that the web stack has turned into.
      • It's getting much closer. Most ASM.js demos show C++-compiled-to-Javascript running at only half the performance of native C++ (and getting faster). That's the difference between 30fps and 60fps if all the code were Javascript. WebCL, on the other hand, runs at almost exactly OpenCL speeds... so for GPU-accelerated apps (depending on whether Javascript or WebCL is your primary bottleneck) you could get almost native performance.

        SmallPtGPU, from the testing I did a while ago, seems to be almost the same speed whether run in WebC
        • by Anonymous Coward

          only half performance of native C++

          Google's Native Client sandbox only suffers a few percent performance degradation relative to native software. "Half" is comparatively awful.

          • by Lennie ( 16154 )

             Half is faster than most scripting languages. Look it up: a lot of scripting languages are 100 times slower than native.

             Javascript is the fastest generally-used scripting language, after (or similar to) Lua. And Lua was optimized to be an embedded language from the start; nobody really expected Javascript to go this far, so it wasn't designed for that.

            I think it's kinda cool how far people are able to push it.

        • Handwave it all you want, but there's a huge difference between 30fps and 60fps, and "almost there" isn't good enough when you're still chasing frame rates that were common in native C++ applications in the late 90s and the customer wants your build yesterday.
          • by narcc ( 412956 )

            Meh, who cares?

            Remember VB6? (Oh, the horror!) VB6 apps were slower and bigger than the equivalent written in VC++.

            Do you know why it was so popular?

            Because the "horrible" performance was good enough for most applications. A good developer using VB6 cut his development time significantly (hours vs days, days vs. weeks). A beginner could actually get something to work in a reasonable amount of time. That's powerful.

            The web as a platform has its own set of advantages that, for many applications,

          • by Lennie ( 16154 )

            If you think people use Javascript to create the frames in high-speed games, you are being silly.

            They use native-like typed-arrays and WebGL when they make Javascript/HTML games, basically offloading most of the tasks to the native code.

            Javascript/HTML5/Open Web platform/whatever you want to call it is just a collection of APIs to talk to native code.

            Only the application specific code will be written in Javascript.
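
            A minimal sketch of that offloading pattern (hypothetical names, shader setup omitted): the only per-frame Javascript fills a typed array; the actual drawing is native/GPU code.

              const gl = canvas.getContext('webgl'); // assumes an existing <canvas>
              const positions = new Float32Array(6); // one triangle, xy pairs
              const buf = gl.createBuffer();
              gl.bindBuffer(gl.ARRAY_BUFFER, buf);

              function frame(t) {
                // application-specific logic: fill the typed array...
                positions.set([0, Math.sin(t / 1000), -1, -1, 1, -1]);
                // ...and hand it off; everything below here runs natively
                gl.bufferData(gl.ARRAY_BUFFER, positions, gl.DYNAMIC_DRAW);
                gl.drawArrays(gl.TRIANGLES, 0, 3);
                requestAnimationFrame(frame);
              }
              requestAnimationFrame(frame);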

    • Re:STAAAAAHP! (Score:4, Insightful)

      by blahplusplus ( 757119 ) on Wednesday October 02, 2013 @02:59PM (#45018211)

      " stop trying to turn the browser into a platform."

      The reason they are doing this is the big push by major industries for more DRM, even though current DRM is ineffective against more technically-inclined people. They want to eventually be able to encrypt and split up programs and data, tying them to the server. Just like how Diablo 3 held part of the program hostage across the internet and you had to constantly 'get permission' to continue playing the game.

      If you think big companies are not looking at what the game industry and others are doing locking down apps, then you haven't been paying attention.

      • by Anonymous Coward

        For the sake of all our sanity, could you please learn the distinctions between full stops (which you may know as periods... lol), commas and semi-colons before you next post? It will make your posts look less like Spot the Dog, and make people reading them hurt less.

        Thank you.

      • The reason why they are doing this, is the big push by major industries for more DRM.

        I think you are mixing up cause and effect. Major industries would like to continue using desktop apps or even "Trusted Computing". But the web is a platform, so they try to force their DRM onto it.

    • I respectfully disagree. Whether anybody likes it or not, when JavaScript support was added to browsers (a really long time ago) the browser became a platform. Great web-based apps are harder to do than native apps. Well-designed web-based apps work on all platforms and do not require client-side installation or support. The cost of distribution and maintenance of web-based apps is dramatically lower, and that reduces cost. Centralized code management makes change management much more effective and that can
    • Guys, stop trying to turn the browser into a platform.

      Then what's a better platform for developers who want to reach users of Windows, OS X, desktop Linux, Android, iOS, Windows RT, Windows Phone, and the game consoles? Making a program work on more than one platform requires severe modifications, sometimes including translation of every line of code into a different programming language. Windows Phone 7 and Xbox Live Indie Games, for example, couldn't run anything but verifiably type-safe .NET CF CIL. And all except the first four require permission from the

  • I thought the point of GPUs was to offload not only the rendering of 3D graphics but also the algorithms. Game developers don't want to have to program primary rendering algorithms with every game they create. Do they? Am I missing something?

    • Some want to use the same algorithms OpenGL and DirectX do... and those APIs are still there for them.

      Some do not. A good example is Epic Games, who in 2008 predicted "100% of the rendering code" for Unreal Engine 4 would be programmed directly for the GPUs. The next year they found the cost prohibitive, so they stuck with DirectX and OpenGL at least for a while longer. Especially for big production houses, if there is a bug or a quirk in the rendering code, it would be nice to be able to fix the problem direct
    • Since OpenGL 3.0, the traditional OpenGL rendering pipeline has (supposedly) been implemented using shaders under the hood. Also, check out the OpenGL Mathematics (GLM) library for doing in userspace the matrix operations that used to live at the driver level.
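
      To make "matrix operations in userspace" concrete, here is a sketch, in Javascript rather than GLM's C++ to match the thread's web context, of the projection matrix the fixed-function pipeline's gluPerspective used to build for you; the application now computes it and uploads it to a shader as a uniform.

        // Standard gluPerspective-style projection matrix, column-major as
        // GL expects; fovY is in radians.
        function perspective(fovY, aspect, near, far) {
          const f = 1 / Math.tan(fovY / 2);
          return new Float32Array([
            f / aspect, 0, 0,                                0,
            0,          f, 0,                                0,
            0,          0, (far + near) / (near - far),     -1,
            0,          0, (2 * far * near) / (near - far),  0,
          ]);
        }
        // e.g. gl.uniformMatrix4fv(uProjection, false, perspective(1.0, 16 / 9, 0.1, 100));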
    • by Jonner ( 189691 )

      I thought the point of GPUs was to offload not only the rendering of 3D graphics but also the algorithms. Game developers don't want to have to program primary rendering algorithms with every game they create. Do they? Am I missing something?

      Yes, you are missing something. The point of GPUs is to efficiently calculate pixel values to show on the screen. Specific algorithms can be implemented in hardware or software, and GPU hardware has been moving toward exposing more generic functionality for years, which WebCL can make available to Javascript code. It's the game engine, or the libraries used by the game engine, that worry about the low-level details of how to talk to the GPU, whether that happens via OpenGL, WebGL, Direct3D, WebCL or something else.

  • by damaki ( 997243 ) on Wednesday October 02, 2013 @02:38PM (#45017929)
    Great. I'd sure like my GPU, with its mad low-level optimizations and surely ugly code, to be used by unsigned code from random sources.
    Java applets are far too secure; let's get to the lowest level!
  • The terminology in the summary is confusing and wrong.

    First of all, software rendering vs. hardware rendering isn't the same as scanline rendering vs. "rendering from the underlying math", which I assume is a bad attempt at a layman's description of raytracing. You can have a scanline triangle renderer in software, and you can have a raytracer in hardware. It is true that most GPUs are built for scanline rendering and not raytracing, but plenty of raytracers have been written that run on GPUs.

    Second, if y

    • by Bengie ( 1121981 )
      Upcoming GPUs support protected memory, C++, and preemptive multi-tasking. A GPU is just a type of CPU. You will actually be able to pass a pointer from the CPU to the GPU and not have to translate it; the GPU will work with it natively.
    • Actually, the demo doesn't raytrace. In this demo "scene" (one triangle) it uses barycentric coordinates to determine if a pixel is inside or outside the triangle. If it is inside? It shades it with one of two functions. These two functions derive red, green, and blue from how far the pixel is from a vertex compared to the distance between that vertex and the center of the opposite edge (the animated function also has a time component). If it is outside the triangle? The pixel is skipped.

      The specific algo
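
      For illustration, a minimal Javascript sketch of the standard barycentric inside/outside test described above; this is the textbook technique, not the demo's actual code, and the colour rule is simplified to use the barycentric weights directly.

        // Barycentric weights of pixel p in triangle (a, b, c); each point
        // is {x, y}. All three weights are >= 0 exactly when p is inside.
        function barycentric(p, a, b, c) {
          const d = (b.y - c.y) * (a.x - c.x) + (c.x - b.x) * (a.y - c.y);
          const u = ((b.y - c.y) * (p.x - c.x) + (c.x - b.x) * (p.y - c.y)) / d;
          const v = ((c.y - a.y) * (p.x - c.x) + (a.x - c.x) * (p.y - c.y)) / d;
          return [u, v, 1 - u - v];
        }

        function shade(p, a, b, c) {
          const [u, v, w] = barycentric(p, a, b, c);
          if (u < 0 || v < 0 || w < 0) return null; // outside: skip the pixel
          return [u * 255, v * 255, w * 255];       // inside: weight-based colour
        }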
  • Web-based content is fine for every urban dweller out there with a 3 Mbps+ pipe, but there are a lot of people who barely have 1 Mbps DSL. Ever try to do a 60 MB Steam update on one of those sucky lines?

    • by tepples ( 727027 )
      60 MB at 1 Mbps takes 8 minutes. Let's say 10 minutes for overhead. Schedule it to run while you sleep.
    • "Perpetual Motion Engine" can operate on the FILE protocol. You can point the web browser to a web page located on your hard drive (or a USB thumb drive) and it will work.

      It can be run from a website over HTTP, but does not need to be. Heck, you could even burn it to a DVD and double-click the index.html file in it.
  • I/O Bandwidth (Score:4, Interesting)

    by Mr. Sketch ( 111112 ) <mister@sketch.gmail@com> on Wednesday October 02, 2013 @03:01PM (#45018233)

    Many 3D engines are carefully tuned to the limited bandwidth to the GPU, which provides just enough bandwidth per frame to transfer the necessary geometry/textures/etc. for that frame. The results, of course, stay on the GPU and are just output to the frame buffer. Now, in addition to that existing overhead, the engine writer would have to transfer the results/frame buffer back to the CPU to process and generate an image, which is then passed back to the GPU to be displayed? Or am I missing something?

    While I'm sure it would allow customized algorithms, they would have to be rather unique not to be handled by the current state of geometry/vertex/fragment shaders. Are they thinking of some non-triangular geometry?

    Maybe there is a way to send the result of the maths directly to the frame buffer while it's on the GPU?

    • Only if you want it to! You can share resources between OpenCL and OpenGL without passing through the CPU.

      Now, of course, you may wish to (example: copy to APU memory, run physics, copy to GPU memory, render)... but the programmer needs to explicitly queue a memory-move command to do so. If the programmer doesn't move the content... it stays wherever it is.
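
      A sketch of what that sharing looks like, assuming the GL-sharing extension of the WebCL draft (which mirrors OpenCL's cl_khr_gl_sharing); treat the exact method names as approximate. The point is that the vertex buffer never round-trips through the CPU.

        // glBuf is a WebGL vertex buffer already allocated with gl.bufferData
        const clBuf = clCtx.createFromGLBuffer(cl.MEM_READ_WRITE, glBuf);

        queue.enqueueAcquireGLObjects([clBuf]);  // GL hands the buffer to CL
        queue.enqueueNDRangeKernel(physicsKernel, 1, null, [numVerts], null);
        queue.enqueueReleaseGLObjects([clBuf]);  // CL hands it back to GL
        // gl.drawArrays(...) now renders the updated vertices directly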
    • While I'm sure it would allow customized algorithms, they would have to be rather unique to not be handled by the current state of geometry/vertex/fragment shaders. Are they thinking some of non-triangular geometry?

      TFA mentions voxel rendering for Minecraft-type applications. Although volume rendering can be achieved with traditional hardware-accelerated surface primitives, there are many algorithms that are more naturally described and implemented using data structures that don't translate so easily to hardware-accelerated primitives.

      Constructive solid geometry, vector based graphics, and ray tracing are also not such a nice fit to OpenGL and DirectX APIs. You don't always want to have to tessellate geometry tha

  • I don't get it. Using a video card to render without the video card? Since so many machines on the market have 2+ cores, why not write a software renderer that will sit on one of the extra cores?

    • by godrik ( 1287354 )

      CPU cores or processors are typically not that fast at graphical rendering. GPUs are typically much more efficient at that task. (Hey, that's what they were built for.)

    • Because a GeForce Titan has about 2700 cores and about 4.5 teraflops of performance.

      But yes, even CPUs have OpenCL drivers (albeit Intel's is buggy as heck for the time being), so you could even select your CPU as your "graphics processor" and it would run... just slowly.
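
      A sketch of that device selection through the WebCL draft API (method names per the Khronos working draft, so treat them as approximate):

        const platforms = webcl.getPlatforms();
        let device;
        try {
          device = platforms[0].getDevices(webcl.DEVICE_TYPE_GPU)[0];
        } catch (e) {
          // no usable GPU driver: fall back to the CPU device; it runs, just slowly
          device = platforms[0].getDevices(webcl.DEVICE_TYPE_CPU)[0];
        }
        const ctx = webcl.createContext({ devices: [device] });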
  • by Anonymous Coward

    Reliance on OpenGL and DirectX could diminish when GPUs are utilized as general 'large batches of math' solvers which software rendering engines offload to.

    GPUs have never not been general 'large batches of math' solvers. It's just that games have never required a very large amount of math to be done apart from rendering 3D graphics. Hell, they still don't. Most of the time this kind of stuff is actively avoided, or faked when it cannot be avoided. Why? Because there are better things you can do with the development budget than try to bring a science project to life.

    "Good" code doesn't pay as well as "fast" code. (and since it's a little ambiguous, the "fast"
