Initial WebGL Support Lands In WebKit
appleprophet writes "WebGL is an upcoming standard from the Khronos Group, the same standards body behind OpenCL and OpenGL ES. It defines the use of OpenGL in websites via the standard canvas element. In other words, websites will be able to render hardware-accelerated 3D graphics natively inside a web page. In the last week, WebKit, the rendering engine behind Safari and Google Chrome, has added initial support for WebGL, which means it probably won't be too long before Macs and iPhones everywhere get OpenGL web apps. This could have big implications for gaming. HTML5 has steadily been encroaching on desktop applications' territory, but I don't think many people expected browser-based, hardware-accelerated graphics this soon."
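For context, WebGL exposes an OpenGL ES-style API through the canvas element. A minimal sketch of obtaining a context; the context names varied across early builds, so probing several is an assumption here, and the getter probing is pulled into its own function purely for illustration:

```javascript
// Try a list of context names in order. Early WebKit builds used
// experimental names, so we probe several. The probing logic is a
// standalone function so it can be exercised outside a browser.
function pickContext(canvas, names) {
  for (var i = 0; i < names.length; i++) {
    var ctx = null;
    try { ctx = canvas.getContext(names[i]); } catch (e) {}
    if (ctx) return { name: names[i], ctx: ctx };
  }
  return null;
}

// In a browser (hypothetical usage):
//   var canvas = document.getElementsByTagName("canvas")[0];
//   var gl = pickContext(canvas, ["webgl", "experimental-webgl", "webkit-3d"]);
//   if (gl) { /* issue OpenGL ES-style calls via gl.ctx */ }
```

If no name succeeds, the page can fall back to the 2D canvas context or plain HTML, which is how graceful degradation would work for older browsers.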
Ads (Score:5, Insightful)
Why? (Score:5, Insightful)
Re:Ads (Score:3, Insightful)
All you have to do is tell people in marketing that a lot of people do use the GMA chipsets (all early entry-level Intel Macs), netbooks, cheap laptops, cheap desktops, etc.
The iPhone and iPod touch are also getting stronger every day, though I wouldn't be surprised if the 3D chipset in those is more powerful than an Intel GMA, relative to the screen size of the iPhone/iPod touch.
Re:Why? (Score:5, Insightful)
Browsers might be ready for GL but not Javascript (Score:3, Insightful)
I've written a few games using the 2D canvas element. Invariably these games use 99% of the CPU because Javascript doesn't have a real sleep() function. There's no decent way to manipulate sounds (like an FMOD for Javascript). Tests on my machine show that changing the line/fill color is expensive. There's no way to switch to full screen or to capture every keystroke/mouse movement. All of which is beside the really big issue: there's no decent debugger.
3D games sound like a nice idea but they'll be prohibitively expensive (time-wise) to develop, suffer bizarre bottlenecks not seen in native code, and have to work through the very limited browser interface. While Assembly demo coders might enjoy the challenge of working in such a limited environment, the rest of the world should wait for some real improvements.
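For what it's worth, the usual workaround for the missing sleep() is to drive the loop with setTimeout callbacks instead of busy-waiting, which hands control back to the browser between frames. A minimal sketch, with the update/render function names being purely illustrative:

```javascript
// A fixed-timestep accumulator: given elapsed wall-clock time, decide
// how many simulation steps to run. Pure, so it is easy to test.
function stepsToRun(accumulatedMs, stepMs) {
  var steps = Math.floor(accumulatedMs / stepMs);
  return { steps: steps, leftoverMs: accumulatedMs - steps * stepMs };
}

// In a browser, the loop yields to the event loop between frames
// instead of spinning at 99% CPU (update/render are hypothetical):
//   var last = Date.now(), acc = 0, STEP = 16;
//   function frame() {
//     var now = Date.now();
//     acc += now - last; last = now;
//     var r = stepsToRun(acc, STEP);
//     for (var i = 0; i < r.steps; i++) update();
//     acc = r.leftoverMs;
//     render();
//     setTimeout(frame, STEP);
//   }
//   setTimeout(frame, STEP);
```

This doesn't fix the sound, fullscreen, or debugger complaints, but it does keep a canvas game from pegging a CPU core.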
Malware (Score:1, Insightful)
Oh boy, I just can't wait for the onslaught of malware-induced popups in fullscreen OpenGL to consume every last FLOP of my graphics device.
Re:Why? (Score:5, Insightful)
There are some positive uses for it (such as getting us out of dependence on proprietary technologies like Flash)
There are already a lot of ways we could get rid of Flash, if people would stop using it. After all, Flash is mostly used for vector graphics (we've got SVG), interactivity (we've got AJAX/DHTML), and audio/video (we've got the audio and video tags).
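As a concrete example of the audio/video-tag alternative, feature detection is straightforward; a sketch, where falling back to Flash on failure is just one possible policy:

```javascript
// Decide whether the browser can play a given media type via the
// HTML5 video element; callers can fall back (e.g. to Flash) otherwise.
function canPlayNatively(videoEl, mimeType) {
  if (!videoEl || typeof videoEl.canPlayType !== "function") return false;
  // canPlayType returns "", "maybe", or "probably"
  return videoEl.canPlayType(mimeType) !== "";
}

// In a browser (hypothetical usage):
//   var v = document.createElement("video");
//   if (canPlayNatively(v, "video/ogg")) { /* use the video tag */ }
//   else { /* embed the Flash player instead */ }
```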
For one, this will add even more vulnerabilities to browsers which seem to already have loads of them.
So does every useful feature. I'll bet money that the first serious vulnerability is in a driver, not a browser.
And for another the web should be accessible for -everyone- from the low-end netbook to a Core i7, and even older systems should be able to browse web.
OpenGL can be implemented in software.
And you lost that argument already with Flash. When there are one or more video ads on many pages I visit, I doubt low-end netbooks are doing well.
Yeah, we all know that they should do it in HTML and that will still stick around, but how many of us have encountered sites built entirely in Flash?
The fact that technology can be misused is not a reason to avoid developing said technology.
For example: It's possible to build an entire site surrounded by an iframe, so that navigation is completely broken. That doesn't mean that iframes have no legitimate uses.
It's also possible to build an entire site as a single AJAX app. This can be done well, but it takes more work -- for example, with Gmail, notice that everywhere you go, it adjusts the hash in the URL, so that you can use browser navigation properly -- the back button works, so does bookmarking, open in a new window/tab, etc etc. Sites that don't do that could have really poor usability.
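The Gmail pattern described above can be sketched like this; the route names and the showView renderer are made up for illustration:

```javascript
// Parse a location hash like "#inbox/42" into a route object.
// Keeping this pure makes the navigation logic testable.
function parseHash(hash) {
  var parts = hash.replace(/^#/, "").split("/");
  return { view: parts[0] || "default", id: parts[1] || null };
}

// In a browser, navigation sets the hash, and back/forward/bookmarks
// work because the browser tracks hash changes in its history:
//   window.onhashchange = function () {
//     var route = parseHash(window.location.hash);
//     showView(route.view, route.id);  // hypothetical renderer
//   };
//   // e.g. clicking a message sets: window.location.hash = "#inbox/42";
```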
It's also theoretically possible to render images using massive HTML tables, with each cell representing a single pixel. Does that mean HTML tables should be made less flexible, just so no one can do that?
I could go on...
The fact is, there are ways to abuse any technology -- there's always the possibility that someone will print out a website, make the changes they want, scan it back in, and upload it as an image. The fact that people can abuse technology should never prevent us from creating new, interesting bits of technology that have real, practical applications.
While some things obviously need Flash (such as Homestar Runner because they are Flash cartoons)
Nope. They need Flash because they were authored in Flash. They could have been done with SVG, Javascript, and the audio tag; it's just that the authoring tools for those aren't anywhere near as good as Flash's, from what I understand.
adding a high-end graphics card to a computer just to view the web? That's just a bit ridiculous.
See, now you're being ridiculous.
Consider that ten or fifteen years ago, using excessively large images would be considered bad taste -- you'd be asking for way too much RAM "just to browse the web", and you'd be wasting a ton of bandwidth. That's why we came up with the idea of thumbnail galleries -- which are still useful, but a better model is really a slideshow.
It's not a "high end video card" now, either. Just about any video card is going to have some 3D capability on it -- and we're moving in the direction of compositing window managers, which will actually lead to cards supporting just 3D (and having to emulate 2D) being cheaper than cards supporting just 2D (and having to emulate 3D).
So, 10 or 15 years from now, when people want to add a little effect to their website -- or build an interesting "flash game" that's capable of actually using 3D hardware -- would you prefer it use Flash? Or maybe you'd prefer Quake Live [quakelive.com] -- a plugin for each game -- may as well just download an exe?
Re:A trusted list of sites. (Score:5, Insightful)
The point is that nobody really cares all that much about what you do, as the web industry does not revolve around you.
Re:Ads (Score:3, Insightful)
Actually, hardware accelerated advertisements would improve the user experience, compared to regular advertisements.
I find that closing my eyes improves the user experience compared with most regular ads on the web. (And on the TV too for that matter.) It's the obnoxiousness that I object to, and the added indignity of ads too often being such horrendous messes that they impact browsing of unrelated sites with common browsers, though NoScript is rather a good response to that (and good for other things too).
Re:That's a Bit Optimistic Don't You Think? (Score:4, Insightful)
As other smartphones adopt the new standards (many already use Webkit-based browsers), you instantly gain compatibility on those devices. No need to maintain three separate codebases for the iPhone, BlackBerry and Android if your app works great in the browser of all three platforms.
I also prefer sites/apps where I don't have to perform any installation to get at the content, but that's more a matter of personal preferences. Right now that's only practical for a small subset of apps, but WebGL may change that.
Why shouldn't they? (Score:2, Insightful)
But adding a high-end graphics card to a computer just to view the web? That's just a bit ridiculous.
Equally ridiculous is the suggestion that you need a "high-end graphics card" to run OpenGL. This isn't 1998 anymore.
Also ridiculous is the suggestion that this will be mandatory for web pages in general. You might as well claim that Google maps shouldn't exist, because good webpages ought to be viewable in Lynx. Or that YouTube shouldn't exist, because webpages ought to be viewable on computers with small amounts of CPU power. Or that Java shouldn't be allowed, because someone could write an application that uses a lot of your CPU power.
There's nothing wrong with having extra technologies when they're needed - such as someone doing an online game. Yes, it'll be stupid if someone requires a 3D card for what should be a simple webpage, but that's no different to them using Flash, or any other kind of CPU-hungry code. It's 2009 - 3D hardware support has been bog standard for years, and isn't any different to a website that requires a lot of CPU or RAM. Just as any computer these days has the CPU power to decode a YouTube video, they also have the GPU power to run OpenGL. Yes, someone could write an annoying webpage that sucked up your GPU power, but they could have done that to your CPU power for over a decade with Java.
Re:How will we manage to use this? (Score:4, Insightful)
How will we manage to use this? Programatically by Javascript, right? Javascript is so limited that I fail to see how it will be possible to make this actually usable [snip] Javascript also needs some cleanup and some more functionality.
Like what? It's a very rich language which is a pleasure to use. The historical shortcoming has been its libraries (primarily the DOM). But the language itself is really nice, as nearly anyone with enough experience will tell you.
And, besides, I can already imagine each browser doing it in its own way and developers having to set up multiple ways to deal with the differences.
This is the case with all standards. If there are significant implementation errors, higher-level libraries will emerge that "fix" them for the users of this API. Ajax is different between browsers, is this a problem for anyone? No, there are multiple free libraries that are super-simple to use that make the right call in each browser. $.ajax({url: "foo.html"}); Just Works (tm).
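The same wrapper idea is trivial to roll yourself. Here's a sketch of normalizing one classic browser difference, event attachment; the API shape and return values are my own invention, not from any particular library:

```javascript
// Normalize event attachment across the DOM standard (addEventListener)
// and old IE (attachEvent). Returns which path was taken, for clarity.
function addEvent(el, type, fn) {
  if (el.addEventListener) {
    el.addEventListener(type, fn, false);
    return "standard";
  } else if (el.attachEvent) {
    el.attachEvent("on" + type, fn);  // IE prefixes event names with "on"
    return "ie";
  }
  el["on" + type] = fn;  // last-resort fallback
  return "legacy";
}
```

Libraries like jQuery do exactly this, just far more thoroughly; once one wrapper exists, application authors no longer care which browser they're on.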
Re:A trusted list of sites. (Score:3, Insightful)
You're right - I bet people were thinking making an online game with this, but now they've discovered that you alone will have this disabled, there's obviously no point doing it. It's a good thing you posted to let them know, and save them the trouble!
Re:Ads (Score:5, Insightful)
While I'm sure there are some great uses for this, it also sounds like a way to serve even more resource-hungry adverts than they can with Flash. Furthermore, if this became widespread in situations not really requiring it, a decent graphics card could essentially become a requirement for web surfing.
I don't see how this is any different than the current situation with Flash.
Flash is resource hungry, and plays annoying ads in both 2D and 3D, with sound. Even if WebGL ads eventually include audio (or perhaps in the future there will be a WebAL, etc.), the situation could hardly be worse than it presently is, given how problematic Flash is.
Having seen some of the things Apple (and others, but it's really Apple that's pushing it at this point) is doing with HTML5, everything that can be done to replace Flash is a good thing. Even if it means the same annoying types of ads as now, at least they'll have less of a performance hit, and they won't be tied to a single program that is constantly plagued with security issues, has performance issues, crashes so often that Apple and Google sequester plug-ins/windows/tabs because of it, and whose updater is annoying and tries to push additional toolbars which no one wants.
Just as Flash served to kill off pretty much all the various and annoying plug-ins you had to download to make full use of the web, and replaced them all with a single plug-in to rule them all, HTML5, with things like canvas, the video tag, and WebGL, is looking to do the same to Flash--replace it with something better.
Re:Why? (Score:5, Insightful)
For one, this will add even more vulnerabilities to browsers which seem to already have loads of them.
While replacing one of the biggest ones: Flash.
WebGL won't be a plugin, it will be part of the browser. That means Firefox won't be vulnerable because of an Adobe bug, but because of a Mozilla bug, and Mozilla can fix the bug themselves.
Also, with more diversity, bugs will be less likely to be exploited, and when they are they will have a smaller path of destruction.
And for another the web should be accessible for -everyone- from the low-end netbook to a Core i7, and even older systems should be able to browse web.
This is no different than Flash is now, except that WebGL has the potential to be significantly more efficient. WebGL isn't going to replace HTML, it's going to augment it, similar to how Flash does today. Sites that want to be accessible to more users will avoid reliance on resource demanding WebGL elements, or avoid it altogether, just like sites avoid resource demanding Flash objects.
On the other hand, for those sites that want to take advantage of it, and for those users with more modern machines, we'll be able to have a richer web experience. (Really, WebGL isn't going to require advanced GPUs and high-end Core 2 Duos unless you start putting game-level 3D and AI into it; simple 3D rendering will work just fine on older hardware, better, in fact, than 3D in Flash does now.)
I absolutely detest the notion that the web should not embrace new technologies just because some people have crap computers. I agree that most web sites should be designed to be accessible across a broad range of computers (either by limiting advanced features, or providing alternate pages), but if a site wants to specialize with features that tax even modern systems (like YouTube did not too long ago), then as long as they are not essential sites (like banks, utilities, government, news media, etc), then they should be encouraged.
Already many sites are unusable without a recent version of Flash, we don't need extra hardware as requirements to view sites.
There you go. So how is this worse?
how many of us have encountered sites built entirely in Flash? Or have a requirement of Flash for simple things like navigation? While some things obviously need Flash (such as Homestar Runner because they are Flash cartoons) others use Flash for no real reason.
Which has what to do with WebGL?
But adding a high-end graphics card to a computer just to view the web? That's just a bit ridiculous.
Not as ridiculous as assuming WebGL will mean that you have to have a high-end GPU and CPU just to view the web.
Re:That's a Bit Optimistic Don't You Think? (Score:4, Insightful)
True. But WebKit is used by Android and WebOS. Also, RIM just bought a company that makes a WebKit-based browser.
So, potentially, you're running on four of the top five platforms.
Re:That's a Bit Optimistic Don't You Think? (Score:4, Insightful)
Yes, wait, no, given that the iPhone bar, the Nokia bar, the iPod touch bar, and a lot of the "other" bar (Palm Pre and Android) are *all* WebKit-based browsers. In the meantime, NetApplications shows the iPhone *alone* ahead of Opera.
Re:How will we manage to use this? (Score:3, Insightful)
The problem is: when will we get the chance to stop using wrappers to paper over the divergence between platforms, wrappers that only add extra difficulty, performance penalties, files, etc.? When will developers be able to focus on creating new things from the start, instead of first needing to create workarounds for problems created by others?