New Graphics Firm Promises Real-Time Ray Tracing
arcticstoat writes "A new graphics company called Caustic Graphics reckons it's uncovered the secret of real-time ray tracing with a chip that 'enables your CPU/GPU to shade with rasterization-like efficiency.' The new chip basically offloads ray tracing calculations and then sends the data to your GPU and CPU, enabling your PC to shade a ray-traced scene much more quickly. Caustic's management team isn't afraid to rubbish the efforts of other graphics companies when it comes to ray tracing. 'Some technology vendors claim to have solved the accelerated ray tracing problem by using traditional algorithms along with GPU hardware,' says Caustic. However, the company adds that 'if you've ever seen them demo their solutions you'll notice that while results may be fast — the image quality is underwhelming, far below the quality that ray tracing is known for.' According to Caustic, this is because the advanced shading and lighting effects usually seen in ray-traced scenes, such as caustics and refraction, can't be accelerated on a standard GPU, which can't process incoherent rays in hardware. Conversely, Caustic claims that the CausticOne 'thrives in incoherent ray tracing situations: encouraging the use of multiple secondary rays per pixel.' The company is also introducing its own API, called CausticGL, which is based on OpenGL/GLSL and will feature Caustic's unique ray tracing extensions."
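For anyone wondering what "incoherent rays" actually means here: a toy sketch, nothing to do with Caustic's actual hardware or the CausticGL API. Primary rays through neighbouring pixels point in nearly the same direction, so a GPU can trace them in lockstep; secondary rays scattered off a diffuse surface point every which way, which is what wrecks that lockstep execution.

```python
import math
import random

def primary_ray(px, py, width=640, height=480, fov=math.radians(60)):
    # Camera ray through pixel (px, py). Adjacent pixels yield
    # near-identical directions: these rays are "coherent".
    aspect = width / height
    x = (2 * (px + 0.5) / width - 1) * math.tan(fov / 2) * aspect
    y = (1 - 2 * (py + 0.5) / height) * math.tan(fov / 2)
    n = math.sqrt(x * x + y * y + 1)
    return (x / n, y / n, -1 / n)

def diffuse_bounce(rng):
    # Secondary ray: uniform random direction on the hemisphere above
    # a diffuse surface -- the "incoherent" case.
    z = rng.random()
    phi = 2 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def spread(rays):
    # Crude coherence metric: 0 means all directions identical,
    # values near 1 mean directions scattered all over.
    mean = [sum(c) / len(rays) for c in zip(*rays)]
    return 1 - math.sqrt(sum(c * c for c in mean))

rng = random.Random(42)
primaries = [primary_ray(320 + dx, 240 + dy) for dx in range(4) for dy in range(4)]
secondaries = [diffuse_bounce(rng) for _ in range(256)]
print(spread(primaries))    # tiny: a 4x4 pixel block, near-identical directions
print(spread(secondaries))  # large: directions scattered over the hemisphere
```

The GPU-unfriendliness follows directly: scattered directions hit scattered geometry and memory, so the SIMD batching that makes rasterization fast falls apart.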
Re:2009 (Score:3, Insightful)
Can't wait for the ray-traced BSD desktop version of Duke Nukem Invents The Flying Car.
I'll believe it when I see it! (Score:5, Insightful)
They've advertised Linux support too, but I haven't heard anything from these guys. Unless they're like nVidia and sit around killing kittens all day, it would be a good idea for them to actually do some research and figure out how GLX and DRI work. Even the ATI closed-source drivers still respect the GLX way of life.
(nVidia replaces the entire DRI stack. DDX, GLX, DRI, DRM, all custom. fglrx doesn't replace GLX. Just in case you were wondering.)
Re:"Caustic"? (Score:3, Insightful)
Re:Shitty summary! (Score:3, Insightful)
A post I made elsewhere on the subject (Score:5, Insightful)
Like with anything, I call vaporware until they show real silicon. Not because I think they're lying; most companies don't. However, there are plenty of overly ambitious companies out there. They think they've figured out some amazing way to leap ahead and get funding to start work... only to realize it's way harder than they believed.
A great example was the Elbrus E2K chip. Dunno if you remember that; it was back in 2000. A Russian group said they were going to make the Next Big Thing(tm) in processors, one that'd kick the crap out of Intel. Well, obviously this didn't come to pass. The reason wasn't that they were scammers; in fact, Elbrus is a real line of supercomputers made in Russia. The problem was they didn't know what they were doing with regard to this chip.
Their idea was more or less to put their Elbrus 3 supercomputer onto a chip. Ok, fine, but the things you can do at that scale don't always work at the microscale. There are all sorts of new considerations. So while their design was all nice in theory on a simulator, it was impossible to fab.
Intel and AMD aren't amazing because of the chips they design; they're amazing because they can actually fab those chips economically. You can design something that'll smoke a Core i7 in simulations. However, you probably can't make it a real chip.
This smells of the same sort of thing to me. Notice that they have press releases and some shiny demo pictures, but those were clearly done on a software simulator. Ok, well shit, I can raytrace pretty pictures too. That doesn't prove anything. Their card? Apparently not real yet; the picture of it is, well, just a raytrace.
So who knows? Maybe they really do have some amazing shit in the pipeline. Still, they've gotta make it real before any of it matters. nVidia releases pretty pictures too. The difference is that its card photos show actual cards, and its rendered pictures come off actual hardware.
I am just never impressed by sites heavy on the press releases and marketing, and light on the technical details, SDKs, engineering hardware pics, and so on.
Unanswered questions (Score:4, Insightful)
I shall remain skeptical until more information is forthcoming.
Re:ray tracing - not just for chrome spheres anymo (Score:3, Insightful)
Re:Shitty summary! (Score:1, Insightful)
Perhaps, but it's mostly ScuttleMonkey's fault for posting such a misleading summary.
No video, no pictures. It smells like hoax. (Score:4, Insightful)
For something as ambitious as what they're claiming, it's very strange that their web site has absolutely no demos of their products. No pictures, no videos, nothing.
Names that require explanation aren't good choices (Score:4, Insightful)
I remember when I first saw a very poorly drawn, shaky image of an animal, read that it was a gnu, and learned how clever the name was considered to be, since it was, they said, "recursive": GNU's Not Unix.
It didn't bother the enthusiasts that most people in the world can't pronounce the name and have never seen a Gnu.
They found someone with artistic ability to make a better image of a GNU [pasteris.it], but I've seen no evidence that anyone with technical knowledge realizes the depth of the self-defeat in choosing an obscure reference to an obscure animal.
To most people the word "caustic" means only "capable of burning, corroding, or dissolving".
Re:GI is already an inaccurate trick (Score:0, Insightful)
There are plenty of 'realistic' renderers out there. If you mean 'doing it exactly as Mother Nature does', then no. But if you look at something like the Maxwell Renderer, where you specify surface properties according to actual physical characteristics and the renderer calculates everything by brute force (no tricks to speed things up, since those invariably cause accuracy errors), then you get pretty darn close to a 'realistic' renderer.
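To make "brute force, no tricks" concrete: here's a toy sketch (not Maxwell's algorithm, just the general Monte Carlo idea such unbiased renderers use). Estimate the hemisphere integral of cos(theta), which is analytically pi, by averaging random samples. No shortcuts, no accuracy-losing approximations; it simply converges slowly, which is why these renderers take hours instead of milliseconds.

```python
import math
import random

def mc_cosine_integral(n, rng):
    # Monte Carlo estimate of the integral of cos(theta) over the
    # hemisphere: sample z = cos(theta) uniformly in [0, 1), weight each
    # sample by f / pdf, where pdf = 1 / (2*pi) for uniform directions.
    total = 0.0
    for _ in range(n):
        z = rng.random()
        total += z * 2 * math.pi
    return total / n

rng = random.Random(1)
est = mc_cosine_integral(200_000, rng)
print(est, math.pi)  # the estimate hovers near pi, tightening as n grows
```

The estimator is unbiased, so the only cost of accuracy is sample count; the error shrinks like 1/sqrt(n), the same tradeoff a brute-force physically based renderer makes.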
Re:Big deal. (Score:3, Insightful)
Even my MSX computer did real-time raytracing like a champ, provided that all the pixels came from a 2D non-reflective surface viewed at a 90-degree angle. Of course you had a limited color space, but otherwise everything ran just smoothly.
Kidding aside, I suppose it's a question of how far you want to take it. The Amiga and MSX aren't interesting anymore for about 90% of the things they did. The one exception is probably playing retro games.