Software Rendering Engine GPU-Accelerated By WebCL 84
Phopojijo writes "OpenGL and DirectX have been the dominant real-time graphics APIs for decades. Both are catalogs of functions that convert geometry into images using predetermined mathematical algorithms (scanline rendering, triangles, etc.). Software rendering engines calculate colour values directly from the fundamental math. Reliance on OpenGL and DirectX could diminish as GPUs are utilized as general 'large batches of math' solvers that software rendering engines offload to. Developers would then be able to choose the algorithms that best suit their project, even natively in web browsers with the upcoming WebCL."
Software-rendered API wrappers through OpenCL (Score:3)
I know there's MAME/MESS but I don't think they do the infamous filtering.
Re: (Score:2)
This sounds like the perfect application for a shader against a whole screen texture. See https://en.wikibooks.org/wiki/OpenGL_Programming/Post-Processing [wikibooks.org]
Or is there something I'm missing? After all there are a number of glide wrappers nGlide [zeus-software.com] and glidos [glidos.net] already.
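The parent's suggestion amounts to running a small pure function once per pixel of the framebuffer. A real implementation would be a GLSL fragment shader sampling a screen-sized texture; this is just a hedged CPU-side sketch in Javascript of the same per-pixel idea, with a made-up scanline-darkening filter standing in for whatever "infamous filtering" you want to emulate:

```javascript
// CPU-side sketch of what a full-screen post-processing shader does:
// apply one small function to every pixel of an RGBA framebuffer.
// The filter below (darken odd scanlines) is purely illustrative.
function postProcess(pixels, width, height) {
  const out = new Uint8ClampedArray(pixels.length);
  for (let y = 0; y < height; y++) {
    const scale = (y % 2 === 0) ? 1.0 : 0.5; // darken odd scanlines
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;         // RGBA layout
      out[i]     = pixels[i]     * scale;    // R
      out[i + 1] = pixels[i + 1] * scale;    // G
      out[i + 2] = pixels[i + 2] * scale;    // B
      out[i + 3] = pixels[i + 3];            // A unchanged
    }
  }
  return out;
}

// 2x2 all-white image: row 0 stays white, row 1 is darkened.
const img = new Uint8ClampedArray(16).fill(255);
const filtered = postProcess(img, 2, 2);
console.log(filtered[0], filtered[8]);
```

In a shader the loop disappears: the GPU invokes the body once per fragment, which is why this kind of whole-screen filter is essentially free on modern hardware.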
Re: (Score:2)
Or is there something I'm missing?
Yeah, the whole API and software rendering thing.
The advantage of early APIs like Glide was that they were much lower level than OpenGL/D3D, allowing for very efficient hardware-assisted software rendering.
As a simple for-instance: in the transition from those lower-level APIs with software rendering to the higher-level APIs with fully hardware rendering, things like landscapes suddenly had to use polygons, and their datasets ballooned enormously. There was nearly an entire decade of fixed-function-
Re: (Score:2)
Oh, I understand the reasons for wanting a software rendering engine. But the OP was looking for a "function" that is currently available (AFAIK), given that most of the fixed-function behavior of early 3D APIs/GPUs (like Glide) is now programmable and can be simulated with the more generic pipelines in modern GPUs. In fact, there are attempts at writing full-blown ray tracers by (mis)using just GLSL.
But, more to the point, I'm not even sure GPUs are necessary for much game-related drawing. I recent
Re: (Score:2)
I think people are glorifying glide in this thread a biiiiiit toooo muuuch.
glide drew triangles. what the voodoo did in hw was to draw horizontal lines, zbuffered, textured and shaded. what this enabled was that people with software engines could just pop their trifiller to use that and away they went. now if the engine used something fancier, like modelling the ground as a surface (like the game outcast), then there was no fucking way to use a 3dfx voodoo for accelerating that.
I am not aware of a SINGLE GA
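The horizontal-span filling described above is easy to sketch. This is a hedged illustration, not any real engine's or driver's API: a software rasterizer's inner loop fills one scanline span at a time, interpolating depth across it and testing a z-buffer per pixel, which is roughly the work the Voodoo took over in hardware.

```javascript
// Flat-shaded, z-buffered horizontal span fill: the "trifiller" inner
// loop a software engine could hand off to hardware. All names here are
// illustrative.
function drawSpan(frame, zbuf, width, y, x0, x1, z0, z1, color) {
  const dz = (x1 > x0) ? (z1 - z0) / (x1 - x0) : 0;
  let z = z0;
  for (let x = x0; x < x1; x++) {
    const i = y * width + x;
    if (z < zbuf[i]) {   // closer than what's already there?
      zbuf[i] = z;
      frame[i] = color;
    }
    z += dz;             // interpolate depth across the span
  }
}

// One 8-pixel scanline: fill x = 2..5 at depth 0.5 with red.
const W = 8;
const frame = new Uint32Array(W);
const zbuf = new Float32Array(W).fill(Infinity);
drawSpan(frame, zbuf, W, 0, 2, 6, 0.5, 0.5, 0xff0000);
console.log(frame[3].toString(16)); // "ff0000"
```

A triangle filler is then just a loop that walks the two edges and calls this once per scanline; anything not expressible as spans of a triangle (like Outcast's heightfield surfaces) gets no benefit.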
Re: (Score:2)
but that was the suck. 24-bit is better. I think you would need to use 16-bit throughout for the effect, though, to get alpha to mess up as badly as on the voodoo (which is why you can't just slap a shader on the framebuffer to turn it into 16-bit; it wouldn't look the same as if the whole scene had been rendered with 16-bit + dither from the start).
STAAAAAHP! (Score:5, Insightful)
Developers would then be able to choose the algorithms that best suit their project, even natively in web browsers with the upcoming WebCL."
If web browsers were people, that statement would have caused a mass suicide among them. Guys, stop trying to turn the browser into a platform. It introduces so many layers of complexity and security issues that it's a miracle anyone has any trust or faith in the internet at all. It's getting to the point where the only way to safely browse the net is to shove the entire browser into a virtual machine... and even that only manages to protect your own computer, to say nothing of your online activities, credentials, life, etc.
We need to be making browsers simpler, not more complex. Feature bloat is turning these things into a leper colony inside your computer... a cesspool of malware and vulnerability. Don't add to it by coming up with some new way for developers to directly access the hardware of your computer because you're too fucking lazy to write an app to do whatever it is, and want to cram it into the browser instead. You're just encouraging them.
Seriously, we need a 12 step program for these "web 2.0" people.
Re: (Score:1)
+1
Hell, make it +10.
Re: STAAAAAHP! (Score:2)
Re: (Score:3)
Developers would then be able to choose the algorithms that best suit their project, even natively in web browsers with the upcoming WebCL."
If web browsers were people, that statement would have caused a mass suicide among them. Guys, stop trying to turn the browser into a platform. It introduces so many layers of complexity and security issues that it's a miracle anyone has any trust or faith in the internet at all. It's getting to the point where the only way to safely browse the net is to shove the entire browser into a virtual machine... and even that only manages to protect your own computer, to say nothing of your online activities, credentials, life, etc.
We need to be making browsers simpler, not more complex. Feature bloat is turning these things into a leper colony inside your computer... a cesspool of malware and vulnerability. Don't add to it by coming up with some new way for developers to directly access the hardware of your computer because you're too fucking lazy to write an app to do whatever it is, and want to cram it into the browser instead. You're just encouraging them.
Seriously, we need a 12 step program for these "web 2.0" people.
Browser-based apps are not done because developers are "too fucking lazy to write an app to do whatever", but because it costs more money to do it for multiple platforms, including mobile ones, and browser apps just work without the need to install anything (the app itself, a JRE, whatever) or to have any kind of user privileges.
Re: (Score:2)
I can think of 5:
1. Line 'em up.
2. Shoot 'em.
3. Bulldoze 'em into a ditch.
4. Cover with lime.
5. Repeat.
Missing steps.... (Score:2)
5. Repeat
Well, that actually accounts for eight of the twelve steps.
I suggest some modding:
1. round up politicians and remaining lawyers
2. add them to the "web 2.0" crowd
3. line them up
4. Shoot 'em
5. cover with lime
6. bulldoze 'em into a ditch
7. repeat (you can never be too careful here)
And there's your 12 step program!
Re: (Score:2)
Browser consistency, desktop and mobile included, is obtained by properly using a Javascript framework that handles it for you, plus something like Bootstrap for the CSS.
Re: (Score:2)
As there are millions of web developers and only a couple hundred thousand 'native app' developers for iOS and Android, who charge a lot more money.
Really, development speed and knowledge of native platforms is an important factor. If you only need to know one platform and can reuse code, this translates to less time, less required knowledge of native platforms, and thus less cost.
Less cost: that's what this is about. Businesses like less cost.
Re:STAAAAAHP! (Score:4, Interesting)
For once, we have a gaming platform (besides Linux and BSD) which allows genuine, timeless art. If the W3C, or an industry body like them, creates an equivalent pseudo-native app platform... then great. For now, the web is the best we have.
Re: (Score:2)
Make that two: POSIX and X11 (Score:2)
Re: (Score:1)
It is programmed by a set of open standards which gives any person or organization the tools to build support for the content which is relevant to society.
I think you need to lay off the kool-aid, man.
JPEG [wikipedia.org] and GIF [wikipedia.org] both have licensing issues; they are not free. The intended replacement for these, PNG, hasn't seen widespread adoption, can't do animations, but has no licensing issues. In fact, if you take a walk down a list of all the multimedia technologies commonly used on the web, MP3, MPEG4, h.264, AAC, surround sound... you will find yourself in a veritable desert when it comes to truly free standards. The standards may be 'open', that is, published... but a
Re: (Score:2)
JPEG [wikipedia.org] and GIF [wikipedia.org] both have licensing issues; they are not free.
Are you kidding me? The patents for GIF expired long ago [freesoftwaremagazine.com]. As for JPEG, that's as much a "living standard" as HTML5 is. It's worth researching further, but I'd think the older parts of JPEG aren't too problematic [pcworld.com].
The intended replacement for these, PNG, hasn't seen widespread adoption, can't do animations, but has no licensing issues.
PNG has never been and never will be an intended replacement for JPEG, as PNG is lossless and JPEG is (mostly) lossy. And in what way hasn't PNG seen widespread adoption? It is the dominant lossless image format and is used absolutely everywhere. PNG can do animations too (though it's not supported
Re: (Score:2, Insightful)
Yeah, they really missed the boat not freezing computing in the 70s. These kids and wanting features and interesting applications and useful computing are a bunch of assholes. They should be forced to do things MY preferred way because my opinion is the only one that matters.
Fucking slashtards.
Re: (Score:2)
... or if your users are too fucking lazy to install an app. Too bad, make them do it.
Re: (Score:2)
Re: (Score:2)
I'm not sure that's the best solution either. With the current desktop offerings, all applications run with the full permissions of the user. Things are a little bit better on the mobile side. At least with Android I can see which permissions an app has, and by default apps are very limited in what they can do. With Windows/Linux, any application I run can go and delete my entire home folder, or send it all out to some site on the web, or wreak all kinds of havoc. Currently, running in a web browser is the only pseudo-sandbox that exists for desktop systems. I'd much rather run a web app from some random company than install some application on my computer.
On Linux, isn't that what AppArmor or SELinux is supposed to help negate?
Rejected apps (Score:2)
Re: (Score:2)
So sad for the users (and developers who write for it).
Re: (Score:2)
Re: (Score:2)
The difference is that Slashdot is a website, not an application.
Line between website and application (Score:2)
Re: (Score:2)
Only if your OS is gimped. There is such a thing as Mandatory Access Control. You should look it up, we even have working implementations.
Re: (Score:2, Insightful)
Guys, stop trying to turn the browser into a platform.
No. Or "too late", rather.
these "web 2.0" people
These "web 2.0" people are going to continue to ignore you until what you espouse is as obviously stupid to everyone as it is to them. I'm not sure it isn't already.
Re: (Score:2)
Re: (Score:2)
SmallPtGPU, from the testing I did a while ago, seems to be almost the same speed whether run in WebC
Re: (Score:1)
only half performance of native C++
Google's Native Client sandbox only suffers a few percent performance degradation relative to native software. "Half" is comparatively awful.
Re: (Score:2)
Half is faster than most scripting languages. Look it up, a lot of scripting languages are 100 times slower than native.
Javascript is the fastest generally-used scripting language, after (or similar to) Lua. And Lua was designed to be an embedded language from the start; nobody really expected Javascript to go this far, so it wasn't designed for that.
I think it's kinda cool how far people are able to push it.
Re: (Score:2)
Re: (Score:2)
Meh, who cares?
Remember VB6? (Oh, the horror!) VB6 apps were slower and bigger than the equivalent written in VC++.
Do you know why it was so popular?
Because the "horrible" performance was good enough for most applications. A good developer using VB6 cut his development time significantly (hours vs days, days vs. weeks). A beginner could actually get something to work in a reasonable amount of time. That's powerful.
The web as a platform has its own set of advantages that, for many applications,
Re: (Score:2)
"good developer using VB6"
hahaha
Re: (Score:2)
You're an idiot.
Re: (Score:2)
If you think people use Javascript to compute the frames in high-speed games, you are being silly.
They use native-like typed arrays and WebGL when they make Javascript/HTML games, basically offloading most of the work to native code.
Javascript/HTML5/the open-web platform/whatever you want to call it is just a collection of APIs to talk to native code.
Only the application-specific code will be written in Javascript.
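The "typed arrays as the bridge to native code" point can be sketched concretely. In this hedged example (the vertex layout and function name are made up for illustration), Javascript only packs a flat `Float32Array` of interleaved vertex data; a single `gl.bufferData` call would then hand the whole contiguous block to the driver, and everything after that is native code:

```javascript
// Pack [x, y, z, u, v] per vertex into one contiguous Float32Array,
// the shape WebGL expects for a vertex buffer upload.
function packVertices(verts) {
  const stride = 5; // floats per vertex
  const buf = new Float32Array(verts.length * stride);
  verts.forEach((v, i) => {
    buf.set([v.x, v.y, v.z, v.u, v.v], i * stride);
  });
  return buf;
}

const buf = packVertices([
  { x: 0, y: 0, z: 0, u: 0, v: 0 },
  { x: 1, y: 0, z: 0, u: 1, v: 0 },
  { x: 0, y: 1, z: 0, u: 0, v: 1 },
]);
console.log(buf.length, buf[5]); // 15 1

// In a browser you would follow with (not runnable outside one):
//   gl.bufferData(gl.ARRAY_BUFFER, buf, gl.STATIC_DRAW);
```

The per-frame Javascript cost is a handful of API calls; the per-vertex and per-pixel work happens in the driver and on the GPU.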
Re:STAAAAAHP! (Score:4, Insightful)
" stop trying to turn the browser into a platform."
The reason they are doing this is the big push by major industries for more DRM. Current DRM is ineffective against more technically inclined people, so they want to eventually be able to encrypt and split up programs and data, tying them to the server. Just like how Diablo 3 took part of the program hostage across the internet and you had to constantly 'get permission' to continue playing the game.
If you think big companies are not looking at what the game industry and others are doing to lock down apps, then you haven't been paying attention.
Re: (Score:1)
For the sake of all our sanity, could you please learn the distinctions between full stops (which you may know as periods... lol), commas and semi-colons before you next post? It will make your posts look less like Spot the Dog, and make people reading them hurt less.
Thank you.
Re: (Score:2)
The reason why they are doing this, is the big push by major industries for more DRM.
I think you are mixing up cause and effect. Major industries would like to continue using desktop apps or even "Trusted Computing". But the web is a platform, so they try to force their DRM onto it.
Re: (Score:2)
What's a better cross-platform platform? (Score:2)
Guys, stop trying to turn the browser into a platform.
Then what's a better platform for developers who want to reach users of Windows, OS X, desktop Linux, Android, iOS, Windows RT, Windows Phone, and the game consoles? Making a program work on more than one platform requires severe modifications, sometimes including translation of every line of code into a different programming language. Windows Phone 7 and Xbox Live Indie Games, for example, couldn't run anything but verifiably type-safe .NET CF CIL. And all except the first four require permission from the
Missing the point? (Score:2)
I thought the point of GPUs was to offload not only the rendering of 3D graphics but also the algorithms. Game developers don't want to have to program primary rendering algorithms with every game they create. Do they? Am I missing something?
Libraries are copyrighted (Score:2)
Other than that, developers of all kinds use software libraries to implement common tasks.
The problem comes when licensing such libraries becomes cost prohibitive or requires the developer to give up the keys to his own kingdom.
Re: (Score:2)
Some do not. A good example is Epic Games, which in 2008 predicted "100% of the rendering code" for Unreal Engine 4 would be programmed directly for GPUs. The next year they found the cost prohibitive, so they stuck with DirectX and OpenGL at least for a while longer. Especially for big production houses, if there is a bug or a quirk in the rendering code, it would be nice to be able to fix the problem direct
Re: (Score:2)
Re: (Score:2)
I thought the point of GPUs was to offload not only the rendering of 3D graphics but also the algorithms. Game developers don't want to have to program primary rendering algorithms with every game they create. Do they? Am I missing something?
Yes, you are missing something. The point of GPUs is to efficiently calculate pixel values to show on the screen. Specific algorithms can be implemented in hardware or software, and GPU hardware has been moving toward exposing more generic functionality for years, functionality which WebCL can make available to Javascript code. It's the game engine, or the libraries the game engine uses, that worries about the low-level details of how to talk to the GPU, whether that happens via OpenGL, WebGL, Direct3D, WebCL or something else.
Security (Score:3)
Java applets are far too secure, let's get to the lowest level!
Re: (Score:2)
remember when dos game engines used just whatever math (voxels, fake shit, raytracing against a 2d map...) the coder could get to run fast enough to create the graphics? to not be constrained to triangles?
the "article" is about "hey, wouldn't it be cool to do that again, just with gpus?". the summary makes it sound as if the dude had done something cool with that idea.
the video in the article shows a shaded triangle. fail, waste of time. and it sort of ignores that programmable shaders are just for this purp
summary has weird language (Score:2)
The terminology in the summary is confusing and wrong.
First of all, software rendering vs. hardware rendering isn't the same as scanline rendering vs. "rendering from the underlying math", which I assume is a bad attempt at a layman's description of raytracing. You can have a scanline triangle renderer in software, and you can have a raytracer in hardware. It is true that most GPUs are built for scanline rendering and not raytracing, but plenty of raytracers have been written that run on GPUs.
Second, if y
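To make the scanline-vs-raytracing distinction concrete, here is the core step of a raytracer, ray-sphere intersection, as a hedged Javascript sketch. The same function runs identically on a CPU or as a per-pixel GPU kernel, which is the parent's point: the algorithm, not the hardware, determines whether you're raytracing.

```javascript
// Solve |origin + t*dir - center|^2 = radius^2 for t, assuming dir is
// unit-length (so the quadratic's leading coefficient is 1). Returns
// the nearest hit distance in front of the origin, or null on a miss.
function raySphere(origin, dir, center, radius) {
  const ox = origin[0] - center[0],
        oy = origin[1] - center[1],
        oz = origin[2] - center[2];
  const b = 2 * (ox * dir[0] + oy * dir[1] + oz * dir[2]);
  const c = ox * ox + oy * oy + oz * oz - radius * radius;
  const disc = b * b - 4 * c;          // discriminant of t^2 + b*t + c
  if (disc < 0) return null;           // ray misses the sphere
  const t = (-b - Math.sqrt(disc)) / 2;
  return t >= 0 ? t : null;            // ignore hits behind the origin
}

// Ray down +z from the origin toward a unit sphere centered at z = 5:
console.log(raySphere([0, 0, 0], [0, 0, 1], [0, 0, 5], 1)); // 4
```

A full raytracer is "just" this plus shading and recursion per pixel, no scanline traversal anywhere, which is why it fits a GPU compute API more naturally than a triangle API.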
Re: (Score:2)
Re: (Score:2)
The specific algo
Great if you have real broadband (Score:2)
Web-based content is fine for every urban dweller out there with a 3Mb+ pipe, but there are a lot of people who barely have 1Mb DSL. Ever try to do a 60MB Steam update on one of those sucky lines?
Re: (Score:2)
Re: (Score:2)
It can be run from a website over HTTP, but does not need to be. Heck, you could even burn it to a DVD and double-click the index.html file in it.
I/O Bandwidth (Score:4, Interesting)
Many 3D engines are carefully tuned to the limited bandwidth to the GPU card, which provides them just enough bandwidth per frame to transfer the necessary geometry/textures/etc. for that frame. The results, of course, stay on the GPU and are just output to the frame buffer. Now, in addition to that existing overhead, the engine writer would have to transfer the results/frame buffer back to the CPU to process and generate an image, which is then passed back to the GPU to be displayed? Or am I missing something?
While I'm sure it would allow customized algorithms, they would have to be rather unique not to be handled by the current state of geometry/vertex/fragment shaders. Are they thinking of some kind of non-triangular geometry?
Maybe there is a way to send the result of the maths directly to the frame buffer while it's on the GPU?
Re: (Score:2)
Now, of course, you may wish to (for example: copy to APU memory, run physics, copy to GPU memory, render)... but the programmer needs to explicitly queue a memory-move command to do so. If the programmer doesn't move the content... it stays wherever it is.
Re: (Score:2)
While I'm sure it would allow customized algorithms, they would have to be rather unique not to be handled by the current state of geometry/vertex/fragment shaders. Are they thinking of some kind of non-triangular geometry?
The FA mentions voxel rendering for Minecraft-type applications. Although volume rendering can be achieved with traditional hardware accelerated surface primitives, there are many algorithms that are more naturally described and implemented using data structures that don't translate so easily to hardware accelerated primitives.
Constructive solid geometry, vector-based graphics, and ray tracing are also not such a nice fit for the OpenGL and DirectX APIs. You don't always want to have to tessellate geometry tha
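As a hedged sketch of an algorithm that doesn't map naturally onto triangle APIs, here is a minimal voxel ray march in Javascript: step a ray through a 3D grid until it hits a solid cell. A real renderer would use a DDA grid traversal instead of fixed steps; everything here (names, grid layout) is illustrative.

```javascript
// Step a ray through a size^3 voxel grid (flat Uint8Array, index
// (z*size + y)*size + x) until a solid cell is hit. Fixed-step
// marching keeps the sketch short; a real engine would use DDA.
function marchVoxels(grid, size, origin, dir, maxDist, step = 0.1) {
  for (let t = 0; t < maxDist; t += step) {
    const x = Math.floor(origin[0] + dir[0] * t);
    const y = Math.floor(origin[1] + dir[1] * t);
    const z = Math.floor(origin[2] + dir[2] * t);
    if (x < 0 || y < 0 || z < 0 || x >= size || y >= size || z >= size)
      continue;                                  // outside the grid
    if (grid[(z * size + y) * size + x]) return [x, y, z]; // solid hit
  }
  return null;                                   // nothing hit
}

// 4x4x4 grid with one solid voxel at (2, 0, 0):
const size = 4;
const grid = new Uint8Array(size * size * size);
grid[2] = 1;                                     // index of (2, 0, 0)
console.log(marchVoxels(grid, size, [0.5, 0.5, 0.5], [1, 0, 0], 8));
```

Note there is no triangle anywhere: the scene is a data structure plus a traversal, which is exactly the kind of workload a generic compute API serves better than a rasterization API.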
and they can't use a 2nd/3rd/4th core why? (Score:2)
I don't get it. Using a video card to render without the video card? Since so many machines on the market have 2+ cores, why not write a software renderer that will sit on one of the extra cores?
Re: (Score:2)
CPU cores, or processors, are typically not that fast at doing graphical rendering. GPUs are typically much more efficient at that task. (Hey, that's what they were built for.)
Re: (Score:2)
But yes, even CPUs have OpenCL drivers (albeit Intel's is buggy as heck for the time being), so you could even select your CPU as your "graphics processor" and it would run... just slowly.
Reinventing the GPU API (Score:1)
Reliance on OpenGL and DirectX could diminish when GPUs are utilized as general 'large batches of math' solvers which software rendering engines offload to.
GPUs have never not been general 'large batches of math' solvers. It's just that games have never required a very large amount of math to be done apart from rendering 3D graphics. Hell, they still don't. Most of the time this kind of stuff is actively avoided, or faked when it cannot be avoided. Why? Because there are better things you can do with the development budget than try to bring a science project to life.
"Good" code doesn't pay as well as "fast" code. (and since it's a little ambiguous, the "fast"