WebGL Poses New Security Problems
Julie188 writes "Researchers are warning that the WebGL standard undermines existing operating system security protections and offers up new attack surfaces. To enable rendering of demanding 3D animations, WebGL allows web sites to execute shader code directly on a system's graphics card. This can allow an attacker to exploit security vulnerabilities in the graphics card driver and even inject malicious code onto the system."
WebGL was always a bad idea (Score:2, Insightful)
Leave my hardware alone.
Astroturf much? (Score:2)
Now that we finally have sandboxing in browsers, they want to let any website run code directly on your hardware.
As opposed to what?
Keep in mind that sandboxing is a way to deal with websites having direct hardware access. That's how they can do NaCl [google.com] safely.
Silverlight has direct support for XNA, which handles everything better and more safely anyway.
What's "better" or "safer" about XNA? Never mind that nothing's stopping you from porting frameworks to this anyway -- having a lower-level standard means that, ultimately, you can have something like XNA, and others can have other things.
Never mind that XNA doesn't have a particularly good OpenGL implementation, and that Silverlight is barely an open standard as it is.
Re: (Score:3)
XNA doesn't have a particularly good OpenGL implementation? As someone with years of experience with XNA, Direct3D, OpenGL, and even some experience with WebGL, I feel ethically responsible for saying that you have absolutely no idea what you're talking about at all.
Xbox 360 doesn't have the web (Score:2)
Not only that, but the games also work on the Xbox 360 and mobile phones without major porting work.
Know what else mobile phones have? The web.
Xbox 360 doesn't have the web. What do you recommend to get a game ported to a console?
Re: (Score:2)
the PLAYSTATION3 *does* have the web [...] What say you now?
I've heard it's rubbish [suite101.com], even compared to the Android browser that runs on less RAM. Does the PS3 web browser support even the 2D canvas? And how does it expose button presses on the controllers as events? Google failed me somehow.
PS3 with Firefox lacks new games (Score:2)
well, my ps3 runs firefox4
I'm assuming that your PLAYSTATION 3 console is running Firefox under Linux under Other OS under pre-3.21 PS3 system software. (Please correct me if I'm wrong.) But new PS3 games require a PS3 with post-3.21 firmware. This means people who still run a PS3 with pre-3.21 firmware aren't buying new PS3 games; they can be considered closer to the PC market than to the PS3 market.
Or has Sony replaced NetFront in the PS3 system software with Firefox?
Re: (Score:2)
Mono = an infectious disease that leaves you exhausted long after the virus is gone.
Re: (Score:2)
But by using Silverlight you'd lose a lot of platforms too - many phones, PS3/Wii, Linux, Mac, UNIX, etc.
Re: (Score:2)
Re: (Score:2)
Not true.
Firstly, you can run Silverlight applications on Linux and Mac with Moonlight.
Secondly, in the case of C# applications that use Direct3D you can use Wine and Mono together.
Thirdly, for phones, there is Mono for Android, and iOS support on the way, as described here [tirania.org].
Re: (Score:2)
Are we also supposed to write WebGL games with notepad?
No. [gnu.org]
Re: (Score:3)
No, VIM is a far better choice for not only WebGL but everything else as well :)
Re: (Score:2)
Adobe Molehill, the new accelerated 3D architecture for Flash and Flex, basically does the same things.
Are we also supposed to write WebGL games with notepad?
I've tried out WebGL several times now and I can tell you it's disgusting. JavaScript is a terrible language; the only thing that makes it close to bearable is jQuery, but even then you're still dealing with something there is almost no way to efficiently debug, that no editor fully supports, and that still doesn't have a full standard - or a standard everyone abides by. Imagine writing a full 3D game in JavaScript.
Re: (Score:2)
Adobe Molehill...
really?
I've tried out WebGL several times now and I can tell you it's disgusting.
what you mean is: "I don't understand GL"
JavaScript is a terrible language....
what you mean is: "I don't undrestand prototype based OOP and first class functions"
what you mean is: "I don't know how to use firebug, opera firefly or google to find a better debugging tool"
what you mean is: "I'dont know how XHR works and have never heard of JSON or XML"
I think you approached web development a bit too fast. You should take a step or two back, look again at what you have done there and how much of it you understand, and focus on developing your skills before starting another web application. Why don't you start with something basic [fisher-price.com] and then go up the ladder?
Re: (Score:2)
Actually, I started coding OpenGL on an SGI Octane well before it was considered a consumer technology, and I understand it quite well. I've written many applications using OpenGL and quite a few things with GLES 1 and 2. I do like GLES; I don't like GLES in JavaScript, because as a language JavaScript really is terrible. It's slow, the context of "this" can sometimes change unexpectedly, it's terrible at dealing with binary data, and it runs differently on everything.
I do use Firebug and it is easily the
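A contrived sketch of the "this" behavior complained about above (the renderer object is made up):

    var renderer = {
      scene: "teapot",
      draw: function () { console.log("drawing " + this.scene); }
    };
    renderer.draw();               // "drawing teapot" -- this is renderer
    var cb = renderer.draw;
    cb();                          // "drawing undefined" -- this is now the
                                   // global object, not renderer
    setTimeout(renderer.draw, 0);  // same surprise when passed as a callback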
Re: (Score:2)
I used to hate Flash too, but then I realized Adobe really did an excellent job with ActionScript 3, and the compiler, API, and a lot of tools are completely free (as in beer). AS3 as a language is much more powerful than JavaScript and it has far fewer issues. On top of that, AS3 had noticeably faster communication times for XHR when making a ton of tiny parallel requests. And don't think I'm just bad with JavaScript; I've done a lot of it and at times have found it a very good tool for the job (I wrote the Spi
Re: (Score:2)
Oh, and the truth is I think the idea of WebGL is a very good one. I'd really like to see sites where there is just one canvas tag with a script injecting WebGL into it. The thing is, I've tried the samples, I've played with it, and it was a terrible experience. The sole thing defaming WebGL is JavaScript. JavaScript is a text language; really, everything is text, right down to the fact that you can change the text of the script while the script is running ("feature"?). GLES is a very binary/data-oriented
Forgo iOS and Android or forgo Xbox 360 (Score:2)
Relegate yourself to Microsoft's little vertical romper room if you want.
If I ignore Xbox 360, I relegate myself to single-player and online multiplayer, as opposed to local multiplayer within a household. PCs can be connected to HDTVs and gamepads, but the majority of people appear unwilling to do so.
Re: (Score:2)
Re: (Score:2)
The main vector of attack in WebGL is through shaders. Silverlight doesn't support shaders (it only supports the Reach profile), which - for better or for worse - means it really is more secure.
Re: (Score:3)
Silverlight does support shaders - the Reach profile just means that you're limited to Shader Model 2. SM2 is sort of a baseline requirement these days for a lot of things, including Windows Aero. On the desktop it's supported by the old-school ATI Radeon 9 series or higher, or Nvidia GeForce FX or higher. On mobile, GPUs that support OpenGL ES 2.0 are generally SM2-compliant.
Re: (Score:2)
Right, my mistake. I was thinking of WP7.
Re: (Score:2, Flamebait)
Re: (Score:2)
Really? (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
I mean what could possibly be dangerous about allowing random websites to run hardware level code?
The same thing that's dangerous about allowing anyone to run random code on your system. This is not a new security problem. It's the same old security problem in a flashy new suit. Any access higher than no access risks infection.
Personally, I'm curious about what, if any, protections WDDM provides against this type of attack (by design or by accident).
Re: (Score:3)
Re: (Score:2)
Then you clearly have not been here more than 10 minutes.
Re: (Score:2)
http://en.wikipedia.org/wiki/HLSL [wikipedia.org]
Hmm, sure as hell looks a lot more hardware-level than javascript to me. I've never heard of javascript execution depending on the hardware you have, but the features of this HLSL and GLSL stuff seem to be closely tied to the particular make and model of graphics card you own. For instance, the question "can Javascript do X?" can be answered without knowing what hardware it is running on. The question "can HLSL or GLSL do X?" can not always be answered without knowing what hardware it is running on.
Re: (Score:2)
To make the example specific, let me ask you: can HLSL or GLSL do geometry shading?
Can HTML5 do Canvas, JavaScript Web Workers, SVG filters, and XMLHttpRequest 2? That's environment-specific too. Just as one has to fall back to other methods when certain HTML-family technologies turn out unavailable, one has to fall back to other methods when geometry shaders turn out unavailable.
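The fallback pattern is the same in both cases. A minimal detection sketch (the fallback branch is left empty here; what you fall back to is up to you):

    function supportsWebGL() {
      var canvas = document.createElement("canvas");
      try {
        // "experimental-webgl" covers the prefixed implementations of the day
        return !!(canvas.getContext("webgl") ||
                  canvas.getContext("experimental-webgl"));
      } catch (e) {
        return false;
      }
    }
    if (!supportsWebGL()) {
      // fall back to 2D canvas, DHTML, or static content
    }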
Re: (Score:2)
Who asked anything about HTML 5? Who said anything about "environment specific"? Your entire post is a non sequitur, relating to nothing that was actually under discussion. Let me refresh your memory: we were comparing HLSL and GLSL to javascript and asking which was more hardware specific. Now, do you have anything to add that is relevant to the topic under discussion?
Reference software renderers (Score:2)
Re: (Score:2)
Let me summarize the conversation so far. Someone said "GLSL and HLSL are about as hardware level as javascript." Someone else disagreed in no uncertain terms. I gave evidence that GLSL and HLSL were not as hardware level as javascript. You attempted to sidetrack the discussion into the realms of HTML5 and "environment" specificity. When I pointed out that that was off topic, you brought up some supposed "theory versus practice" debate, as if that would actually change the answer to the original question: Are GLSL and HLSL more hardware level than javascript?
Re: (Score:2)
I am no graphics programmer, but I am going to stretch my experience in other areas (audio APIs) and say that, while these are hardware-dependent in terms of features offered, I highly doubt the same low-level shaders that work on an ATI card will work on an NVIDIA card. In fact, I see HLSL and GLSL as wrappers, sort of like Direct3D and OpenGL themselves a generation or two ago - certain Direct3D or OpenGL calls depended on what features were provided by the underlying hardware, and this also translates to HLSL and GLSL.
still? (Score:2)
You keep using that word. I do not think it means what you think it means.
Mozilla Firefox before 3.5.16 and 3.6.x before 3.6.13, and SeaMonkey before 2.0.11
With record-setting speed and efficiency! (Score:5, Funny)
Re: (Score:2)
And running your nuclear centrifuge at the same time!
Glad I'm not using Binary Blob drivers (Score:4, Interesting)
Re:Glad I'm not using Binary Blob drivers (Score:4, Insightful)
Do any FOSS drivers even support shaders?
Re:Glad I'm not using Binary Blob drivers (Score:5, Funny)
probably why he feels more secure!
Re: (Score:2)
No, OpenGL only recently added support for Phong shading. But one day!
Re: (Score:2)
AFAIK the r600 driver for Radeon cards does support shaders, at least well enough to implement the OpenGL fixed-function pipeline.
Re: (Score:2)
Looks like it. GLSL 1.2 isn't exactly new, but I'd take old standards with a real, safe memory manager over the likes of nvidia with their history of root exploits.
Re: (Score:2)
Re: (Score:2)
the nvidia drivers work perfectly, even supporting e.g. S3 suspend
Wait, is that uncommon on your platform? On my platform of choice, if S3 suspend didn't work, I'd definitely be returning some hardware or other.
Re: (Score:2)
I'm going to assume you're running Linux or one of the *BSDs. I run Windows, and in my humble opinion, this is one of those things that ABSOLUTELY must work flawlessly before any sort of meaningful headway can be made into the desktop world.
Just my opinion.
Re: (Score:2)
Shrug; I hardly used suspend on my desktop back when I had it working (no idea whether it still works).
Fair enough, but in a business environment, sleeping several thousand machines every night can lead to real cost savings.
Info and thoughts (Score:5, Informative)
WebGL is a Javascript expression of OpenGL ES 2.0, the same OpenGL edition that appears on Apple's iOS and recent versions of Android. OpenGL ES 2.0 is essentially OpenGL 2.0 with the fixed function pipeline removed. This reduces the size of the API substantially.
Some may remember that the little ARM11-based computer that appeared [slashdot.org] last week supports OpenGL ES 2.0. OpenGL ES 2.0 is also the choice of the Wayland developers. There seems to be a big convergence happening around this particular edition of OpenGL due to embedded GPUs.
WebGL is manifested as a context of an HTML5 canvas element. Javascript has been extended with new data types to provide aligned, dense arrays for loading vertex attributes into the GL. WebGL allows vertex and fragment shader code to be loaded into the GL.
The end result is very high performance graphics driven by Javascript hosted in a browser. WebGL integrates with the browser in some convenient ways; texture data is loaded from Javascript image objects and CSS can apply 3D transforms, for example.
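Roughly what that looks like in practice - a minimal sketch, assuming a <canvas id="view"> element on the page; the texture filename is made up:

    var canvas = document.getElementById("view");
    var gl = canvas.getContext("webgl") ||
             canvas.getContext("experimental-webgl");

    // The new typed arrays: aligned, dense storage for vertex attributes
    var vertices = new Float32Array([
       0.0,  0.5,
      -0.5, -0.5,
       0.5, -0.5
    ]);
    var buf = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);

    // Texture data comes straight from an ordinary image object
    var tex = gl.createTexture();
    var img = new Image();
    img.onload = function () {
      gl.bindTexture(gl.TEXTURE_2D, tex);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
    };
    img.src = "crate.png"; // hypothetical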
WebGL has been supported in experimental form by WebKit and Mozilla since late 2010. Opera also supports WebGL. Microsoft is nowhere to be found.
Operating systems compromise security for the sake of GPUs. Obviously, exposing graphics subsystems to inevitably malicious code will get machines compromised. I think Google, Mozilla, et al. should adopt the 'no-script' paradigm for this stuff and require the operator to explicitly enable WebGL content case by case. The graphics subsystem will never prioritize security over performance so securing these code paths well enough for public exposure will never happen.
It would be nice if they gave this some thought before millions of people get owned and WebGL gets a huge black eye......
Re: (Score:2)
It would be nice if they gave this some thought before millions of people get owned and WebGL gets a huge black eye......
WebGL has been supported in experimental form by WebKit and Mozilla since late 2010. Opera also supports WebGL. Microsoft is nowhere to be found.
Well, that sounds like a good first step.
Re: (Score:2)
I'm much more worried about the "hardware acceleration" browser makers are using. WebGL is a very well-defined subset of OpenGL.
Anyway, I've seen many browsers, drivers, operating systems, and other parts of the stack fail on it.
I think it will take time for the driver, operating system, and browser makers to get it right.
You can also look at it differently: we've had exploits with images; do people disable those by default?
If you want to disable something, do it like HTML5 Geolocation - pop up a bar: "do you want to allow this site to use WebGL?"
Re: (Score:3)
I agree, the users usually make the right choice. Cue "click here for awesome smileys!".
Re: (Score:2)
I am. Certainly, I'm clamoring to develop processing-intensive 3D games for web browsers.
Think of the success of free-to-play stuff online now. Imagine you could just click a link and jump into a demo: no downloading, no security risk (assuming this crap gets fixed), and you could try games out without the "download random crap" problem. Even once you decide to pay for it, compare this to approaches like Steam -- the instant gratification of having a game right now with the rest of it progressively downloaded as needed, versus waiting a few hours for it to download, and that's on my 100 mbit fiber connection.
Re: (Score:2)
Even once you decide to pay for it, compare this to approaches like Steam -- the instant gratification of having a game right now with the rest of it progressively downloaded as needed, versus waiting a few hours for it to download, and that's on my 100 mbit fiber connection.
Kind of like Guild Wars, you mean? An installer that's a couple of megabytes and downloads the rest as required?
Re: (Score:2)
I haven't played Guild Wars, so I can't say, but I do know I haven't really seen this done right. What does "as required" mean in this case? With a sufficiently fast connection, could I conceivably walk through the entire world without seeing a loading screen?
And even in that case, there's still the part where you have to download that installer, and trust it with your system.
JavaScript, GPU, and graphics (Score:3)
Javascript is a language for use on the Web. You know, hypertext: text, links, and images.
Another word for "images" is "graphics".
There's no reason ever for this to have access to your GPU.
"GPU" stands for "graphics processing unit". You appear to claim that a language for use in a graphical environment shouldn't have access to capabilities exposed by a graphics processing unit. Can you clarify?
Are people clamoring to play processing-intensive 3D games in a web browser?
Yes. Otherwise, Adobe would not have added rudimentary 3D capability to Adobe Flash.
Re: (Score:3)
A web browser is not a video game console
Millions of FarmVille fans would disagree with your claim. Browser-based video games have been around since Flash and DHTML first came into use.
If not web-based, then what? (Score:2)
12 years later, gaming in a web browser is still a bad idea.
Browser gaming offers on-demand zero-install cross-OS deployment of video games over the Internet. It also offers sandboxed execution so that a game doesn't end up with permission to overwrite or disclose all of a user's documents. What better solution do you have that offers both of these features? Would you recommend Java Web Start or Adobe AIR?
Re: (Score:2)
The CPU has needed help with everything since it was invented, otherwise we would still have the same CPUs we had back in the 60s with absolutely no need for improvements. Your conservatism is baseless.
Re: (Score:2)
Obviously.
Re: (Score:2)
If WebGL breaks, I can disable it, fix it in my own browser (it's open source), or switch to a different browser with an entirely different implementation. If it breaks because of my video driver, I can upgrade my video driver, or switch to a different one, even swap out my entire video card, or run it on top of a software GL implementation.
If Flash breaks, I can either disable it and wait till Adobe gets off their lazy asses and fixes it, or leave it enabled and thus be vulnerable.
Hey, I'm not even necessa
Fix it in your own Gnash (Score:2)
If Flash breaks, I can either disable it and wait till Adobe gets off their lazy asses and fixes it, or leave it enabled and thus be vulnerable.
Or switch to Gnash and Moonlight, and fix any deficiencies in your own copy the same way you claimed that you can "fix [WebGL] in your own browser". Ultimately, your complaint appears to be that you haven't had the opportunity to help Gnash and Moonlight become viable.
Gee (Score:1)
Quake Live (Score:3, Interesting)
I raised this concern with Quake Live, but was quickly shut down by people. Nobody wants to listen to the possible security holes in something they want to ram through at all costs. Forgive my tone if I'm a little annoyed hearing this. Sometimes you want to be wrong about something, but now that I have been proven correct, I'm annoyed with myself.
Re: (Score:3)
Quake Live isn't WebGL. Even if it were, Quake Live wouldn't be affected. This security concern is only about a user visiting a web page that runs a malicious WebGL program, which Quake Live is not.
I don't get it (Score:2, Insightful)
Re: (Score:3)
because up until now the response from graphics card manufacturers has been "security auditing? Open specifications? What, are you running arbitrary binaries on your PC and complaining when they take over your system?", whereas now they'll need to say "security auditing? Open specifications? What, are you running a web browser conceived of before 2010?"
Re:I don't get it (Score:4, Insightful)
So they're saying that enabling shader code execution allows web sites to exploit hypothetical vulnerabilities in the graphics driver?
They're not particularly hypothetical. Graphics driver code is such that games programmers carefully work around bugs in order to not crash anything. Imagine if every program running on the main CPU had to carefully avoid certain instruction sequences in order to not crash the system -- would you run a multi-user system on that?
Then again, that was how it was in the 80's on many time sharing systems...
Re: (Score:2)
The key here is "attack surface". Having relatively uninhibited access to low-level graphics APIs that were not previously assumed to be public means there are probably lots of bugs with security implications. I wouldn't be surprised if graphics drivers eschew error checking in order to gain performance, but now malicious programmers can use that to crash the browser or OS. Shader compilers are also quite complex, and may present opportunities for specially crafted invalid programs to overflow buffers or otherwise corrupt memory.
Re: (Score:3)
Your JS interpreter doesn't run at kernel level with full access to your low level hardware.
Re: (Score:3)
Because, like Adobe formats before it (PDF or Flash), graphics card drivers were designed well before any serious thought was given to security issues. Even now, graphics drivers are 100% focused on speed, speed, speed, and security concerns are probably barely even on the radar. As GPUs move closer to general-purpose computing devices with true logic paths, this problem is only going to get worse. In other words, it's probably a softer target than a Javascript runtime environment is, and that's saying something.
Re: (Score:2)
Why do you think the OpenType sanitizer had to be created?
Example of GPU overload? (Score:2)
They got BSODs by "overloading" GPUs. By doing what, continuous high-stress activity? In which case, surely they should be sufficiently ventilated, etc. Isn't that the real issue? Surely it's not possible to kill GPUs just by giving them a lot of work. Imagine if CPUs did that; we'd be pretty screwed by now.
If they don't mean high-stress/high-throughput activities, what do they actually mean - overloading them with textures or something?
Re: (Score:2)
Re: (Score:2)
No, GPUs do DMA bus mastering and other newer sorts of independent access to RAM and the bus address space at large. If the command stream is not filtered, or the hardware does not have provisions for contexts that restrict the GPU's memory access capabilities depending on which control channel is being used, a GPU can pretty much read and scribble all over the entire system RAM.
GPU manufacturers have historically been oblivious to this and have not bothered to put much in the way of security features into the hardware.
Re: (Score:2)
Actually, at least early NVIDIA cards had pretty good hardware support for this, as far as I understood it. I don't know the status of their current cards, but their early cards had hardware support for different contexts and would generate an error if a user tried to do something that was not allowed (for example, rendering outside of the selected window). So the cards could allow a user to send commands via DMA to the card (this is often called direct rendering) in a secure manner without any risk of compromising the rest of the system.
Re: (Score:2)
It actually is very possible. I recall that AMD recently had to block FurMark in their drivers because it was stressing their video cards too much - causing them to fail. AMD had designed their power circuitry and cooling around "typical" use cases and not around the theoretical maximums - meaning you could overload the power circuitry or burn out your graphics card with software alone.
I can't wait for Native Client! (Score:5, Insightful)
Can anyone remind me why we're putting EVERYTHING in a web browser anyway?
Re: (Score:2)
Can anyone remind me why we're putting EVERYTHING in a web browser anyway?
Because Native Client is specific to Google Chrome. What other instant, sandboxed application deployment method do you recommend before other companies' browsers begin to support Native Client?
Re: (Score:2)
Can anyone remind me why we're putting EVERYTHING in a web browser anyway?
Simple. Because we really don't give a fuck about web browsers.
All we really want / need is a cross-platform, widely distributed, standardized* text & graphics display environment that can be manipulated via client-side scripting and can communicate with server-side processes; and for "applications" or "services" created with such a system to be easily discoverable by our users.
* Yes, we need across the board standards conformance for stability and to reduce development costs, too bad we don't real
Re: (Score:2)
To reach critical mass, any alternative language such as Lua would require sponsorship by each open source browser effort. Given that each has its own javascript engine as a selling point ("mine is 10% faster than yours"), such consensus might be difficult. Perhaps a higher-level API than the DOM is needed to shield the language runtime from the underlying implementation.
Perhaps a more realistic solution is to translate Lua (or another language) into javascript that can then be forwarded to the JS engine. CoffeeScript uses this approach. I
Re: (Score:3)
Can I offer an alternative suggestion? How about we DON'T replace JavaScript with <insert favourite scripting language here>. Okay, part of the problem with web scripting is that we're using JavaScript, which sucks. But the majority of the problem is that we're using a programming language at all. We've just reinvented the biggest, clunkiest wheel on the planet.
Look at the number of projects that are taking programming code, or in some cases bytecode, and compiling it u
Re: (Score:2)
If you're implying that .NET meets the above criteria, it doesn't.
Firstly, a lot of people, myself included, are wary of Mono due to Microsoft's historically negative stance towards open source and accusations of patent infringement from the open source community. There's an ongoing debate about that now which I don't want to get into, but suffice to say that forcing all web browsers to adopt .NET will not sit well with a lot of the free software people. It's also extremely Windows-centric (as you would imagine
Re: (Score:2)
So that users don't have to install anything, making it easier to distribute software.
Re: (Score:2)
Because that's Google's business plan.
And people are too lazy to deal with installation, patching, etc.
Re: (Score:2)
Ubiquity
Re: (Score:2)
Are you kidding?
Browsers are cross platform, supposedly secure and sandboxed, easy to develop for and provide an easy way to make applications while delivering ads and data-raping your users. There is nothing for the user to install and no way to pirate your app or game. Updates are instant and require no user intervention.
Perhaps now Mozilla will finally enable their whitelist (Score:2)
Which they had already written for WebGL.
http://mxr.mozilla.org/mozilla-central/source/content/canvas/src/WebGLContextUtils.cpp#101 [mozilla.org]
At least then users would have to click OK before trusting a site, in a similar fashion to location data.
And hopefully that check will be strongly worded.
Personally, I'm glad I run NoScript
Re: (Score:2)
What's also funny is that I remember discussing precisely this attack method a few weeks ago with WebGL folks. I thought at the time it'd be a bit slower than the end result, in terms of image extraction. We also joked about combining it with a WebGL game to keep the user occupied, where their clicks on the "red ball" or "blue ball" would steal more pixels - speeding up the attack while keeping them busy.
Not that big deal (Score:2)
I doubt this is going to be a major cause of future security problems.
As far as I'm aware, WebGL only allows shaders to be specified in GLSL, which is a pretty high-level shading language. Obviously there's no such thing as pointers, and unlike something like javascript there's no interaction with complex objects. Shaders form a very clean and thin interface, basically just a bunch of floating-point vector operations. The only complex objects you're really going to interact with are various textures.
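For illustration, a made-up fragment shader of the kind WebGL accepts - nothing but vector arithmetic and a texture lookup, no pointers anywhere - shown as the JavaScript string it would be passed in:

    var fragSrc =
      "precision mediump float;\n" +
      "uniform sampler2D tex;\n" +
      "varying vec2 uv;\n" +
      "void main() {\n" +
      "  vec4 c = texture2D(tex, uv);\n" +            // texture fetch
      "  gl_FragColor = vec4(c.rgb * 0.5, c.a);\n" +  // pure vector math
      "}\n";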
To disable WebGL in FF4 (Score:3)
about:config
webgl.disabled = true
-molo
Doesn't Linux solve this? (Score:3)
Re: (Score:2)
As a Linux/ATI user, I'm completely protected because there are no Linux browsers that currently support WebGL on ATI hardware with the official drivers.
Suck it, Windows users.
Much ado about nothing (Score:3)
For the most part this is a lot of security handwaving.
While the GPU itself can do DMA and whatnot, shaders don't have access to any of that. If a shader can access texture memory that hasn't been assigned to it *in certain browsers* then it sounds like a bug in the browser or the browser's WebGL plugin. Being able to "overload" the GPU and blue screen the computer sounds like Yet Another Windows Bug.
A shader isn't just some arbitrary binary blob that gets executed directly by the GPU. Even native programs can't do this. You provide the driver your shader source code, and the driver does the rest. It's intentionally a black-box process so that the driver can optimize the shader for the GPU and not force a specific instruction set or architecture onto GPU designers, thus allowing the underlying GPU design to evolve, possibly radically or in unforeseen ways, without breaking compatibility.
Furthermore, a shader can only access memory via the texture sampler units, which must be set up by the application. If the WebGL application (which is just JavaScript) can set things up to access texture memory it isn't supposed to be able to, the problem is with the WebGL and/or HTML5 implementation, not the concept of WebGL or the GPU driver.
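The hand-off looks something like this - a sketch, assuming gl is an existing WebGL context and fragmentSource is whatever shader text the page supplies:

    var shader = gl.createShader(gl.FRAGMENT_SHADER);
    gl.shaderSource(shader, fragmentSource); // source text goes to the driver
    gl.compileShader(shader);                // black box: the driver compiles
                                             // it for whatever GPU is present
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
      // the driver's compiler reports back; this parser/compiler path is
      // exactly the attack surface the security discussion is about
      console.log(gl.getShaderInfoLog(shader));
    }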
Re: (Score:2)
It was never stipulated that there's anything fundamentally insecure about WebGL. The problem is that exposing hardware acceleration to untrusted code significantly increases your attack surface. With WebGL, suddenly graphics drivers and hardware acceleration subsystems must also be made secure because they're running untrusted code from the web.
Graphics drivers aren't even close to being secure enough to allow untrusted code - I remember seeing a case a couple of months ago where a game developer had accid
Re: (Score:2)
Shaders themselves are pretty limited in scope, though. You can't really access anything beyond the GPU, textures, and model geometry.
GLSL (the language WebGL and OpenGL shaders are written in) doesn't have pointers and is most definitely NOT a general purpose language.
Even without shader support in WebGL you'd have the potential for intentionally bad model geometry crashing a really poorly written driver.
Re: (Score:2)
IIRC OpenGL ES 2.0 can, but it's all vendor-specific. And it's still intermediary bytecode, not something that will execute directly on the hardware.
Misleading (Score:3, Interesting)
This is an overstated and inaccurate headline. A large amount of harm can come from a software application accessing the memory space of a hardware adapter mapped into the app's address space, and from manipulating pointers and so on. That is an issue with C-level programming, where you have direct access to memory addresses and pointers. A javascript implementation of WebGL does not offer access to OpenGL functions by address or by pointer at all. Instead it should use a dispatch table below the javascript environment. JS is thus isolated from direct access to hardware and IS NOT using hardware addresses in any way to access functions. The WebGL program can therefore only access the functions via that dispatch table.

There are still possible issues with the OpenGL API itself that may need to be addressed for safety. However, direct access to video hardware by a javascript app is out of the question; it should not be done, and should not be allowed in a javascript environment. Javascript is a sandbox and does not provide direct access to memory address space, system calls, or other such things. If security is a concern, due to hardware faults that could be triggered by sending a series of OpenGL commands, the web browser can be configured to use a software renderer instead of hardware rendering, or to selectively block that series of OpenGL commands in some way. If there are significant problems with hardware rendering, that may be something that has to be considered as a default option. This can affect any application that is used over a network, including gaming applications, since even a gaming app that sends abstracted updates such as "soldier moves forward 200px" can trigger OpenGL calls in a predictable way. However, neither these applications nor WebGL should expose the graphics hardware, the address space, or the network protocol directly to javascript; OpenGL functions can only be called by some sort of token or symbol which is contained within the sandbox environment and does not directly reference a memory location. Those OpenGL functions are used by a wide range of different network applications, so this is not a problem particular to WebGL.
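A rough sketch of the dispatch-table isolation described above (illustrative only; nativeGL and the bindings are hypothetical stand-ins for a browser's internal GL glue):

    var dispatch = {
      // opaque token -> internal binding; no addresses ever reach script
      "clearColor": nativeGL.clearColor,   // hypothetical internal binding
      "drawArrays": nativeGL.drawArrays    // hypothetical internal binding
    };
    function glCall(token, args) {
      var fn = dispatch[token];            // lookup by symbol, not pointer
      if (!fn) throw new Error("no such GL function: " + token);
      return fn.apply(null, args);
    }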
I am a web rendering engine developer, so I have an interest in making sure these things are safe. If necessary we will software-render by default until we can be sure about safety with direct rendering. Furthermore, these are not WebGL problems; they are bugs in OpenGL itself. There are software rendering workarounds, so this will not affect the availability of WebGL.
I do support WebGL since it will help the web environment compete against Flash.
Re: (Score:2)
GPU shaders have no access to main memory, so where is the attack vector?
GPUs have access to main memory, quite possibly the entire address space of the machine (I only briefly worked on Vista drivers so I'm not sure how they compare to XP). The shaders may not be able to access memory directly through a pointer, but if you can somehow exploit a driver bug to get a texture configured to use an arbitrary system memory address, then they could potentially access any memory on the machine.
Of course that's physical memory, so you'd need an exploit which can also map logical to physical addresses.
Re: (Score:2)
The truly dangerous attack vector is not any shader code; it's all the half-decent shader compilers out there that are part of the OpenGL runtime. GLSL shaders always need to be compiled from source at runtime by the driver. So the browser must pass off a big piece of code/data to the driver's compiler - a complex, non-secure, and typically also annoyingly buggy piece of software. In other words: the compilers' fragile parsers are now directly exposed to remote websites. They were never intended to take potentially hostile input.