Wolfenstein Ray Traced and Anti-Aliased, At 1080p
An anonymous reader writes "After Intel showed its research demo Wolfenstein: Ray Traced running on tablets, the latest progress at IDF focuses on high(est)-end gaming, now running at 1080p. Besides image-based post-processing (HDR, depth of field), there is now also a smart way of calculating anti-aliasing: mesh IDs and normals are used to drive adaptive 16x supersampling. All of that is powered by the 'cloud,' consisting of a server that holds eight Knights Ferry cards (a total of 256 cores / 1024 threads). That's a lot of hardware, but the next iteration of the 'Many Integrated Core' (MIC) architecture, named Knights Corner (and featuring 50+ cores), might be just around the corner."
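For a sense of how ID/normal-driven adaptive anti-aliasing of this kind works, here is a minimal sketch (my own illustration, not Intel's code; trace() and the per-pixel tuples it returns are hypothetical):

def needs_aa(sample_a, sample_b, normal_eps=0.95):
    """Flag a geometric edge: different mesh IDs, or normals that diverge."""
    _, id_a, n_a = sample_a
    _, id_b, n_b = sample_b
    if id_a != id_b:
        return True
    dot = sum(a * b for a, b in zip(n_a, n_b))
    return dot < normal_eps              # normals differ -> silhouette or crease

def shade_pixel(x, y, primary, trace, samples=16):
    """One primary ray everywhere; 16x supersampling only on flagged pixels.
    'primary' maps (x, y) -> (color, mesh_id, normal) from the primary pass."""
    neighbors = [primary[(x + dx, y + dy)]
                 for dx, dy in ((1, 0), (0, 1))
                 if (x + dx, y + dy) in primary]
    if not any(needs_aa(primary[(x, y)], nb) for nb in neighbors):
        return primary[(x, y)][0]        # interior pixel: keep the single sample
    grid = int(samples ** 0.5)           # edge pixel: average a 4x4 sub-sample grid
    acc = [0.0, 0.0, 0.0]
    for i in range(grid):
        for j in range(grid):
            color, _, _ = trace(x + (i + 0.5) / grid, y + (j + 0.5) / grid)
            acc = [a + c for a, c in zip(acc, color)]
    return [a / samples for a in acc]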
first ray trace (Score:3)
first ray trace...
now where is a decent link.
Re: (Score:1)
cache:http://blogs.intel.com/research/2011/09/wolfenstein_gets_ray_traced_-_2.php [googleusercontent.com]
Oh, look, there's something Bing can't do.
Re: (Score:2)
Mostly useless, though. None of the image links work...
Re: (Score:3)
Re: (Score:2)
On the other hand, the wayback machine's version does have working images [archive.org]. And it doesn't use your page view to harvest information about you.
Except that's the first article from June rather than the second article from September.
Re:first ray trace (Score:4, Funny)
That was like the ultimate combination of keywords on slashdot. That server had no chance.
Re: (Score:3)
Thanks to an AC down page:
http://software.intel.com/en-us/videos/cloud-based-ray-tracing-on-handheld-devices-at-researchintel-days-2011/ [intel.com]
Re: (Score:3)
Here is an earlier video showing the game.
http://www.youtube.com/watch?v=XVZDH15TRro [youtube.com]
Re: (Score:1)
What is up with this? Has Slashdot become so incestuous that we have to backlink through 5 articles to actually get to a real article which either won't load or doesn't show the pretty pictures?
Re: (Score:2)
www.youtube.com
searching for wolfenstein ray tracing
sort by upload date
first video being a knights demonstration
link: http://www.youtube.com/watch?v=p0PJjC-JLt0 [youtube.com]
Re: (Score:2)
Hmm... (Score:1)
Re:Hmm... (Score:4, Funny)
Re: (Score:2)
Those feebs! They are pathetic.
Amazing (Score:2)
Those giant pixels never looked better!
Re: (Score:2)
thats_the_joke.jpg
Re: (Score:2)
It's the 2009 Wolfenstein http://en.wikipedia.org/wiki/Wolfenstein:_Ray_Traced [wikipedia.org] ...dumbass
Who is going to care about a fucking reflection in a chandelier during a scene where you're shooting hot Nazi lesbian killers in leather bikinis (IIRC)?
Get some artists already (Score:2)
I think Intel would find it easier to get people excited about this technology if they actually used it to render something that looked interesting, or at the very least looked good at all.
Re: (Score:2)
Re: (Score:2)
If they used proper global illumination, then there'd be a real change. It looks like no more than one or two bounces of light to me.
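To put a rough number on how much one or two bounces miss, here is a toy back-of-envelope (my own illustration, not Intel's renderer): with surface albedo rho, an N-bounce renderer only captures a partial geometric series of the indirect light that full global illumination would.

def captured_fraction(albedo, bounces):
    """Fraction of total indirect light captured after a fixed number of bounces."""
    total = albedo / (1 - albedo)                         # full series, k >= 1
    partial = sum(albedo ** k for k in range(1, bounces + 1))
    return partial / total

for bounces in (1, 2, 4, 8):
    print(bounces, round(captured_fraction(0.7, bounces), 3))
# With albedo 0.7, two bounces capture only about half of the indirect light.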
Re: (Score:2)
Wooing them by showing them graphics that look worse than pretty much everything else on offer? How is that supposed to work out for them?
Re: (Score:3)
I think Intel is wooing the Wall Street Journal more than anyone else.
Intel: "Look at these amazing graphics!"
WSJ: "Wow! Raytraced you say?
Intel: "Yeah the future!"
WSJ: "Ooooo! Buy stock!"
Slashdotted Intel (Score:2)
Did we?? Neither this nor the previous version seem accessible...
Re: (Score:2)
I think so. The irony is delightful.
Ray Traced on Tables (Score:1)
Re: (Score:2)
The summary clearly says it's rendered in the cloud on 256 cores. That link should read: "Wolfenstein: Ray Traced" on Tablets.
Re: (Score:2)
So, in other words "OnLive but with a software raytracer on the server-side instead of a GPU."
MIC, now with ALOT (Score:2)
Re: (Score:2)
Re: (Score:2)
Yeah, but it opens up the opportunity for them to make a lot of lame jokes about "Rocking the MIC."
Unable to connect (Score:1)
Not even very good performance (Score:2)
Re: (Score:2)
Depends if the rendering server is halfway across the country or halfway across the house. I remember people talking a while back (7 years or so) about using a home server to do the number crunching and moving back towards thin clients to access it. Wireless N bandwidth and latencies are pretty good; with modern technology you could probably make the idea work. Offer a suite of products that play well together: a powerful and easily upgraded server, lightweight laptops, and tablets. If you could make the ...
Nintendo beat them to it (Score:2)
Depends if the rendering server is halfway across the country or halfway across the house
A rendering server halfway across the house is called a "Wii U console".
Re: (Score:2)
And who can play anything rendered in the cloud?
One person at a time.
Re: (Score:2)
Re: (Score:2)
Next - Ray-Traced Nethack (Score:2)
UUUUUUUUU
The umber hulk hits! - more
The umber hulk hits! - more
The umber hulk hits! - more
You die - more
Re:Next - Ray-Traced Nethack (Score:5, Funny)
The ray bounces!
The ray hits you!
You die...
Re: (Score:3)
It hits you! You're rooted in graphics system dogma.
Re: (Score:2)
Almost as much as pirates on flying sharks that are on fire that shoot fireballs.
The real market for this is server farms for Progress Quest.
Link to Video (not slashdotted yet) (Score:2, Informative)
http://software.intel.com/en-us/videos/cloud-based-ray-tracing-on-handheld-devices-at-researchintel-days-2011/ [intel.com]
Side effects of ray tracing (Score:5, Funny)
Intel is apparently running the ray tracing process on the same server their blog is on.
Re:Side effects of ray tracing (Score:5, Funny)
Their blog is ray-traced.
Re: (Score:2)
Their blog is ray-traced.
mod parent +1 awesome
Google Caches! (Score:2)
http://webcache.googleusercontent.com/search?q=cache:hmhYOuzN3B4J:blogs.intel.com/research/2011/09/wolfenstein_gets_ray_traced_-_2.php+http://blogs.intel.com/research/2011/09/wolfenstein_gets_ray_traced_-_2.php&cd=1&hl=en&ct=clnk&gl=us&client=mozilla [googleusercontent.com]
http://webcache.googleusercontent.com/search?q=cache:hmhYOuzN3B4J:blogs.intel.com/research/2011/09/wolfenstein_gets_ray_traced_-_2.php+http://blogs.intel.com/research/2011/09/wolfenstein_gets_ray_traced_-_2.php&hl=en&client=mozilla [googleusercontent.com]
A screen shot with images! (Score:3)
It finally gave me images and I took a screen shot/capture to upload to share: http://i.imgur.com/opSLl.jpg [imgur.com] ... :)
Google Cache Link (Score:2, Flamebait)
https://webcache.googleusercontent.com/search?q=cache:hmhYOuzN3B4J:blogs.intel.com/research/2011/09/wolfenstein_gets_ray_traced_-_2.php+&cd=1&hl=en&ct=clnk&gl=us [googleusercontent.com]
Here ya go.
Re: (Score:2)
Cluster = Cloud (Score:2)
Re: (Score:3)
*remote* cluster = cloud
Re: (Score:2)
*remote* cluster = cloud = unacceptable latencies for gaming.
The only way this concept works is if the rendering farm is running in a closet somewhere in your house.
Re: (Score:2)
OnLive made it work with acceptable latencies, but then they did it with a cheap GPU and not a 256-processor cluster.
Re: (Score:2)
Acceptable or not depends on where you live and how good your ISP is. Personally, my mediocre cable internet regularly has latency in the 200 ms range, which is annoying enough when trying to play online games; I can't imagine having that kind of latency for the basic I/O layer of the game. And that's not even at midnight, when they decide to push out the schedule updates to every single cable box on their network simultaneously.
Re: (Score:2)
While this may be true in your particular case, many people are within a 1000-mile radius of an OnLive data center and on a decent connection.
People talk a lot about how the network latency would make the input lag to OnLive unbearable, but consider this: 50ms of latency gets you from Montreal to Dallas (~2800km), and GTA IV on the Xbox 360 has 133-200ms of input lag [eurogamer.net] despite being local. In fact, every console game that Eurogamer measured had at least 67ms of latency, and they claim that the average seemed to ...
Re: (Score:2)
GTA players may be willing to put up with high latency, but that doesn't fly so well with button-combo fighting games (Soul Calibur and Street Fighter, both 67ms) or competitive FPS games (CoD:MW, 67-84ms). Those games just will not work with the additional latency of remote rendering over the Internet. 50ms light speed ...
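As a quick sanity check of the 50ms Montreal-Dallas figure above, here is a back-of-envelope sketch (assumed constants, not measurements): light in fiber travels at roughly c / 1.5, so the quoted number is mostly routing and queueing on top of raw propagation delay.

FIBER_KM_PER_MS = 300_000 / 1.5 / 1000      # ~200 km of fiber per millisecond

def min_rtt_ms(distance_km):
    """Theoretical round-trip propagation delay over fiber, no routing overhead."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round(min_rtt_ms(2800)))              # ~28 ms for Montreal-Dallas, before
                                            # any routing, queueing, or encoding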
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Sure, but a "core" in a GPU is far simpler than a "core" in a CPU, and Larrabee wasn't stripped down anywhere near that far. Larrabee was supposed to feature 32 cores in one package initially on a 45nm process, bumping it up to 48 on a later 32nm process. Intel is still on a 32nm process, so when they talk about a "256-core cluster", they're almost certainly talking about multiple systems; an 8-chip 32-core-per-chip system (or 4-chip 64-core) would not be a "cluster" in and of itself. And such a system does ...
Re: (Score:2)
Re: (Score:2)
It would help if I had read the summary. They've got a single server with eight Knights Ferry cards, each having 32 cores. That's where they get their 256 cores from. And they're calling the single server a "cloud".
What makes this most unimpressive is that nVidia has been making a GPU-accelerated real-time raytracing engine for years now (you can even download working demos [nvidia.com]), and before that they were selling a GPU-accelerated final-frame renderer (non-real-time raytracing). Intel is showing off in-house demos ...
Re: (Score:2)
Actually, they are talking about one system. It's a custom server designed for GPU computing, and has 8 PCIe 2.0 x16 slots filling nearly the whole back side of the chassis. They're using 8 32-core cards to render the images and video.
http://www.colfax-intl.com/ms_tesla.asp?M=102 [colfax-intl.com]
Re: (Score:2)
So what's the advantage here? They committed an eight-card, 256-core server just to render a Quake 3 era game with raytracing. nVidia has been giving away (for free, as far as I can tell) a CUDA-based real-time raytracing engine for their CUDA cards (including Tesla) for a few years now, and before then, they had a final-frame renderer (non-real-time raytracer) available that predates CUDA.
If I can do with a $300-400 GPU in a $1000 computer what takes Intel a massive custom-built server, what's the advantage ...
Re: (Score:2)
Re: (Score:2)
Right, but my point is that you don't need to wait five or ten years, you can buy a $400 graphics card that will do the same thing today.
Re: (Score:2)
The raytracing application is merely a demo. Real applications in the near term will take advantage of the fact that all of the cores on Intel's accelerator card are x86 compatible. The cores on a Nvidia graphics card are not x86 cores and probably never will be.
With lots of x86 cores you can do interesting things like implement drivers that make your multi-core accelerator card visible to your OS as if they were real CPU cores. Imagine that you have Chrome open with 100 tabs. Chrome runs each tab in a separate process. Your Intel accelerator card with 50-256 x86 cores could be used to run Chrome processes, one process per core. All of a sudden your main CPU is no longer bogged down running flash and background javascript crap for each of your 100 open tabs.
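For what the "one process per core" idea looks like in practice on an ordinary Linux box (the accelerator-cores-as-extra-CPUs scenario above is hypothetical), here is a minimal sketch using CPU affinity; os.sched_setaffinity is Linux-only.

import multiprocessing as mp
import os

def pinned_worker(core):
    os.sched_setaffinity(0, {core})     # pin this process to a single core
    # ... run the per-tab / per-task workload here ...
    return core

if __name__ == "__main__":
    cores = sorted(os.sched_getaffinity(0))            # cores visible to this process
    with mp.Pool(len(cores)) as pool:
        print(pool.map(pinned_worker, cores, chunksize=1))   # roughly one pinned process per core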
Re: (Score:2)
The raytracing application is merely a demo. Real applications in the near term will take advantage of the fact that all of the cores on Intel's accelerator card are x86 compatible. The cores on a Nvidia graphics card are not x86 cores and probably never will be.
With lots of x86 cores you can do interesting things like implement drivers that make your multi-core accelerator card visible to your OS as if they were real CPU cores. Imagine that you have Chrome open with 100 tabs. Chrome runs each tab in a separate process. Your Intel accelerator card with 50-256 x86 cores could be used to run Chrome processes, one process per core. All of a sudden your main CPU is no longer bogged down running flash and background javascript crap for each of your 100 open tabs.
If there was any real benefit to this, we'd see dual-processor consumer motherboards; those died off in the Pentium II era. These days, with a modern quad-core processor, your "main CPU" is no longer bogged down with background javascript or running flash; that's already handled by different cores.
Over the long term, Moore's law suggests that these Intel x86 accelerator cards will have enough cores, and fast enough cores, to do graphics acceleration for games that is good enough and fast enough. Eventually, all of these multitudes of cores will come standard inside every Intel CPU, no "accelerator card" needed. Intel has done exactly this with current GPU technology on their current line of processors. Their graphics performance is sufficient for everyone except serious gamers.
Or, we'll continue to see the current progression of a steadily increasing number of full-sized cores, and Intel's lots-of-tiny-cores approach will be of little interest to anybody outside HPC.
Re: (Score:2)
Have you used it? The IDF demo runs at interactive frame rates (I haven't checked, but the last Intel demo I saw was about 20fps). That ray tracer on the card takes several seconds per frame. They are not really comparable in performance.
Re: (Score:2)
I've used it... It's real time on my old GTX 285. The fanciest one, "Design Garage", gets 2-3 FPS. A modern nVidia card should be significantly faster, especially in SLI. But even in SLI, it'd still be enormously cheaper than Intel's 8-card solution.
Re: (Score:2)
I meant the SDK for ray tracing, rather than the ray-tracing demo in the SDK. I've tried that on a GTX 580 and it seemed to have two different rendering modes: low quality at 2-3fps while you move the model, and then a refinement step that took a couple of seconds to reach the highest quality.
Re: (Score:2)
Well, how much of that refinement is actually useful for Intel's target use case here? They're going to stream this as compressed video to a tablet, so antialiasing (which seems to be a large part of the refinement done in many of the nVidia demos) isn't that useful, since it's all going to get crammed into a video stream anyhow. Looking at Intel's claims in terms of performance hits for various operations versus what I saw in the nVidia demos, it's clear that Intel has a better raytracing rendering engine, but ...
Re: (Score:2)
Ok, so cluster = cloud now? Even though they both serve very different purposes?
no, a cluster is a bunch of machines working together. the 'cloud' is purely a means to acquire funding from ADHD investors.
'fluffy' is the new 'shiny'
Re: (Score:2)
Personally, I love that in the pictures of the box with the cards, they used a WD drive instead of an Intel drive in an Intel box.
I don't get it (Score:3)
What's the big deal?
Ray tracing isn't new.
Parallel processing isn't new.
It's an old game.
What makes this news?
Re: (Score:2)
They're getting close to commodity hardware. A large 256-core server today is a run-of-the-mill desktop in 5 years. Intel wants you to believe that GPUs have a limited lifespan, that they'll last only until real-time ray tracing on the CPU can produce equivalent or better results. They could be right... but the only way to find out is going to be to wait until the hardware catches up to the point that it's economically competitive and see what the GPU makers have done in the meantime. All in all, these ...
Intel keeps flogging raytracing (Score:2)
About once or twice a year they go on a big press buzz about raytracing. The reason is that they would rather you not spend money on a graphics card, and instead put that money toward a bigger processor. So they are looking into something that GPUs don't do so well, which is raytracing. They keep trying to get people excited about the idea of raytraced games, which would be done by systems with heavy-hitting Intel CPUs, rather than rasterized games done mainly with a GPU.
As long as they keep doing pre ...
Re: (Score:2)
Based on what I've read about raytracing vs. rasterization, raytracing *will* win out in the long run. I gather that RT scales better than rasterization, but the overhead is expensive. Once we get to the point where RT is about the same speed as rasterization, it will only take 1-2 generations before RT is several times faster.
Whichever company is ready to push out raytracing will stomp the market. If you release too early, your product will just be a gimmick; if you release too late, the competition will be several times faster.
Re: (Score:2)
Yes... ray tracing will indeed win in the long run.
This is because the performance is almost independent of primitive (triangle) count.
As a matter of fact... currently, ray tracing really complex models is already faster than rasterizing them.
A few hundred million triangles or more can already be done faster with RT.
This is why their choice for content baffles me:
They should NOT be using Doom datasets for this; they should be using hundreds of millions of polygons in their dataset.
That is where RT is shining. ...
Re: (Score:2)
Rasterizing is O(N) in the number of triangles.
Ray tracing is better than O(log N), even approaching O(1).
That's only really true for the theoretical best case for raytracing and the worst case for rasterization. In practice things look very different, as any real-world realtime rasterization engine will use LODs, tessellation, octrees, occlusion queries and whatever else to drastically cut down the triangle count it has to render, making it no longer O(N) but something much smaller. Equally, O(log N) is only true for static scenes; when you have dynamic ones, things look quite different, as you and ...
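To make the asymptotic argument above concrete, here is a toy cost model (the constants are invented, and as the reply notes, real engines cut both sides down considerably):

import math

PIXELS = 1920 * 1080                      # one primary ray per pixel at 1080p

def raster_cost(triangles, per_triangle=1.0):
    return per_triangle * triangles       # naive: touch every triangle

def raytrace_cost(triangles, per_node=10.0):
    # one BVH descent per ray, depth ~ log2(N)
    return PIXELS * per_node * math.log2(max(triangles, 2))

for n in (10**5, 10**7, 10**9):
    print(f"{n:>12} triangles: raster {raster_cost(n):.2e}, rt {raytrace_cost(n):.2e}")
# With these made-up constants the crossover sits at a few hundred million
# triangles, in line with the claim above that RT shines on huge models.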
Re: (Score:2)
Yep, and this is Intel, a company that knows how to play for the future (to an extent). An example is Hyper-Threading: most people dismiss it, but honestly, if you plan for it and optimize some things for it, you can see an ~80% increase in performance. The group that came up with it and started designing it began their research in, I believe, 1992.
Some companies know how to do R&D and some don't; Intel is one that does.
Re: (Score:2)
When I run a synthetic benchmark on my i7-920 (2.66GHz), this is what I get for Int/DP (64-bit float) performance:
5.32 billion ints/sec/thread, 8 threads
2.65 billion FP ops/sec/thread, 8 threads, about 21 GFLOPS without SSE
You should check out the execution units on the i7; there are a lot of duplicated FP/int units, which allows HT to get some pretty hefty performance benefits. SSE units are shared, so any SSE benchmark would take a hit with HT.
If you can get more than 1 DP op/cycle on your AMD chip with 6 "real" cores, let ...
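Quick arithmetic check of the figures quoted above (these are the poster's reported numbers, not something I re-ran):

per_thread_int = 5.32e9      # integer ops/sec per thread, as reported
per_thread_dp = 2.65e9       # double-precision ops/sec per thread, as reported
threads = 8

print(per_thread_int * threads / 1e9)   # ~42.6 billion ints/sec across 8 threads
print(per_thread_dp * threads / 1e9)    # ~21.2 GFLOPS without SSE, matching the post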
Raytracing vs Rasterization (Score:2)
The future of rendering in video games has always been evident if you look at high-end rendering for films. If you look at a product like RenderMan, ray tracing is used for specific materials in the scene, but not commonly used for rendering the whole frame. Getting realistic materials out of the renderer is the real problem, not rendering mirror balls.
Hype Never Gets Old (Score:2)
In fact I hear they have hype out on Blue Ray in 3D super HD.
Not practical (Score:2)
Re: (Score:2)
Modern online games not running through the cloud are designed to mask online latency by providing you with instant feedback for certain actions (movement, firing animations). OnLive games, unless custom-coded, would not have this luxury. It would be akin to playing the original Quake (aka N...)
Re: (Score:2)
100 ms of lag isn't even acceptable for a fucking web GUI. And that's that.
Re: (Score:2)
Lag? (Score:2)
Re: (Score:2)
Maybe for single-player games, but not for competitive twitch games online. Most modern LCDs come with gobs of post-processing that pushes display latency up into the 100-200ms range (of course, you turn this off if you care). I find it extreme enough to absolutely destroy my shotgun accuracy in a local split-screen game. I expect remote rendering would have a similar effect, only worse, because the Internet is less deterministic than a TV's post-processing pipeline.
Re: (Score:2)
I get a 19ms ping to Chicago, which is somewhere upwards of 500 miles away via the traceroute, in another state. Put in some more localized rendering farms, say one per state, and you could easily keep latency low enough for the average user.
Jitter could be an issue if the network isn't well designed. It would probably show up as micro-stuttering.
Calling a Larrabee a Larrabee (Score:2)
Re: (Score:2)
DOF (Score:2)
Re: (Score:2)
Not only games but also movies. There's some argument for making sure the person is looking at the thing you want them to, but if you need to make everything else blurry to do so, it makes you wonder.
It's one of the reasons 3D films don't work in general - they include depth of field. The only 3D film I've seen that really worked was Avatar - which has no depth of field. At all. I enjoy the movie more for this technical reason alone - I can look where I want!
Re: (Score:2)
It's code and art that's 13 years older than any 360 title.
Re: (Score:2)
Okay. 5 years. But W3D is still based on engine tech and visual standards that are much older. And I'll call "shenanigans" on your "any." There are likely many Xbox 360 games that look like crap compared with this.
And if Intel had a good-looking Xbox 360 title's source code, this box would out box that box by 359-1 at least.
Re: (Score:2)
The newer Wolfenstein running on the Doom 3 engine.
Re: (Score:2)
And also:
http://en.wikipedia.org/wiki/Castle_Wolfenstein [wikipedia.org] ...which was an interesting game back in the day.
Re: (Score:2)
Re: (Score:2)