Cloud-Based, Ray-Traced Games On Intel Tablets 91
An anonymous reader writes "After Intel showed a ray-traced version of Wolfenstein last year, rendered in the cloud and streamed to a laptop, the company, which just recently announced its shift to the mobile market, is now showing the same research project running on various x86 tablets with 5-to-10-inch screens. The heavy calculations are performed by a cloud consisting of a machine with a Knights Ferry card (32 cores) inside. The achieved frame rates are around 20-30 fps."
Cloud cloud cloud (Score:4, Funny)
I've got a CloudPad running CloudOS 0.99. It is freaking cloudtastic.
Re: (Score:2)
Re: (Score:1)
You just gave me a great idea for a Chrome/Firefox plug-in. Universal replace. You set up key-value pairs: every instance of [key] on every page you visit will be replaced with [value].
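The core of that plug-in idea is a one-liner per pair. A minimal sketch (the function name and whole-word matching are my own assumptions, not any real extension API):

```python
# Minimal sketch of a "universal replace" filter: apply user-defined
# key -> value substitutions to page text. Whole-word matching via \b
# so "cloudy" survives a "cloud" rule.
import re

def universal_replace(text, pairs):
    """Replace every whole-word occurrence of each key with its value."""
    for key, value in pairs.items():
        text = re.sub(r'\b%s\b' % re.escape(key), value, text)
    return text

page = "The cloud makes cloud computing cloudy."
print(universal_replace(page, {"cloud": "smurf"}))
# -> "The smurf makes smurf computing cloudy."
```

A real browser plug-in would run this over text nodes in the DOM rather than raw HTML, so it doesn't mangle tags and attributes.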
It's existed for a very long time in the form of Proxomitron
http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=proxomitron [google.com]
Best web filter there is... and it's browser agnostic since it affects the stream at a proxy level.
I've set it up for a parent before where every single instance of "a word they don't want their child to see" was replaced with some irrelevant 'kid' word.
And it filters ads too, lol
-AI
Re: (Score:2)
Re: (Score:2)
Sort of. Cloud is dumbed-down market-speak; it was a designed phrase that took the technobabble out of the presentation. In all seriousness, cloud is more closely aligned with phase 2 of the Gnomes' plan. [wikipedia.org]
Re: (Score:2)
See, for me, "the cloud" just means "the internet" as per networking diagrams. I swear we've lost sight of what it means, "a network that we don't want/need to draw".
It's really gotten out of hand to be honest, just whack cloud before something and it immediately sounds like you've got geek cred to idiots. From here on in I'm going to take cloud to mean the person means "a network/technology that's too complex for the journalist to understand".
Re: (Score:1)
I have a SmurfPad running SmurfOS and it's just Smurfy. Heh.
Re: (Score:2)
Re: (Score:3)
CTO: Well, what've you got?
Vendor: Well, there's servers, cloud and clients; software, hardware, cloud and peripherals; networking and cloud; servers, software and cloud; clients, software networking and cloud; cloud, software, networking and cloud; cloud, peripherals, cloud, cloud, servers, and cloud; cloud, clients, cloud, cloud, software, cloud, networking, and cloud;
Consultants (starting to chant): Cloud, cloud cloud cloud ...
Vendor: ...cloud, cloud, cloud, software, and cloud; cloud, cloud, cloud, clou
Re: (Score:2)
You reminded me of a penny arcade comic.
Lunch [penny-arcade.com]
Network lag (Score:1)
Re: (Score:2)
Having first-hand experience with OnLive (got a code for Trine on OnLive with a Humble Bundle), I'd say that it's not that bad. It complains if you're over wireless, but any wired broadband connection should be fine. It's been quite a bit smoother than a VMware PC-over-IP setup I used last year, and the server running that junk was on the local network.
If you want to try it yourself, download the client and go to the "Arena" section to watch other people playing games. Now imagine that was you; it'd almost
Re: (Score:2)
Having first-hand experience with OnLive (got a code for Trine on OnLive with a Humble Bundle), I'd say that it's not that bad. It complains if you're over wireless, but any wired broadband connection should be fine.
Cool. So tablet users will have to plug in a LAN cable to play games...
Re: (Score:2)
go to the "Arena" section to watch other people playing games. Now imagine that was you; it'd almost be easy to forget that the game isn't rendered locally.
Except that because you're not in control of the avatars in the game you have no sense of the input lag. I tried out a few games on OnLive (I have a 30Mb downstream connection as well, so that shouldn't have factored into this), and for me the input lag was too great for anything but single-player games and perhaps slow-paced multi-player games. To be fair, I did not really spend any time playing around with configuration (not actually sure if they even give you the option), so perhaps I was not connectin
Re: (Score:2)
I know, but it sure would be cool if you could play games on a low-power tablet while the actual processing is done by a server in your closet or living room.
I should pitch that idea to one of the big gaming companies, like Nintendo.
Re: (Score:2)
That's what I don't understand: put the "cloud" server in a closet somewhere in my house and connect all the other machines up through a strong wireless-N network and you might have something usable. I just can't imagine relying on your ISP's latency when it comes to graphics.
Re: (Score:2)
Maybe not for an FPS or racer but if it was a really cool Starcraft killer that would have interesting possibilities. Handle the 2D GUI with the tablet processor and then raytrace a photorealistic game engine.
You could also stream data and handle some of the primary ray data locally.
If you rendered the visible lighting locally and then used irradiance caching, the cloud could do the heavy lifting with GI and the tablet could do the kNN lookups.
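The split described above can be sketched in a few lines. This is a hedged illustration, not Intel's actual scheme: assume the server streams down expensive global-illumination samples (an irradiance cache), and the client shades a point with a cheap k-nearest-neighbour lookup. The sample format and inverse-distance weighting are my own assumptions:

```python
# Client-side irradiance-cache lookup sketch: server computes GI samples,
# tablet only interpolates among the k nearest cached samples.
import math

# (x, y, z, irradiance) samples, as if streamed from the server
cache = [
    (0.0, 0.0, 0.0, 1.0),
    (1.0, 0.0, 0.0, 0.5),
    (0.0, 1.0, 0.0, 0.8),
    (2.0, 2.0, 0.0, 0.1),
]

def shade(point, k=3):
    """Brute-force kNN lookup: inverse-distance-weighted irradiance."""
    nearest = sorted(
        (math.dist(point, s[:3]), s[3]) for s in cache
    )[:k]
    weights = [1.0 / (d + 1e-6) for d, _ in nearest]
    return sum(w * e for (_, e), w in zip(nearest, weights)) / sum(weights)

print(round(shade((0.1, 0.1, 0.0)), 3))  # dominated by the nearest sample
```

A real renderer would use a spatial index (kd-tree, grid) instead of brute force, but the division of labor is the same: heavy sampling server-side, cheap lookups client-side.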
Re: (Score:2)
In this particular case, it sounds like the server-side power required per user is very large. So I would imagine it's impractical at the moment regardless of networking issues.
Followed one of the links and read this... (Score:5, Insightful)
The first product codenamed "Knights Corner" will target Intel's 22nm process and use Moore's Law to scale to more than 50 Intel cores.
Nonsense marketing babble. Moore's Law is predictive. You can't use it to MAKE anything happen.
Who says laws are predictive? (Score:2)
Just yesterday, I used the law of gravity to make myself fall down. So there!
Re:Who says laws are predictive? (Score:4, Insightful)
Just yesterday, I used the law of gravity to make myself fall down. So there!
No you didn't. You used the law of gravity to predict that you would fall down. You utilized gravity to make it happen.
Re: (Score:2)
Is that a nerd-burn?
Re: (Score:1)
Video (Score:2)
Latency (Score:3)
Re: (Score:1)
Re: (Score:2)
And I can't accept vsync disabled, screen tearing is far more noticeable and annoying than a tiny bit of lag.
Re: (Score:1)
My point was really that if I have trouble with vsync enabled, I have no idea how I could possibly play cloud streamed games. Unfortunately, I've not even had the chance to try OnLive yet because I don't live in the US, so I dunno, maybe it'll be fine.
Re: (Score:2)
Who cares if it looks awesome if latency sucks. I'd rather have SuperNES StarFox quality graphics with no lag than ray-traced graphics with horrible latency.
It can be reduced, but I don't yet believe it's possible to make it unnoticeable. I guess I'll believe it when I see it.
Latency is an absolutely huge problem. It's a bigger problem, for me, than poor image quality.
I'll happily turn down the visuals if it makes my game more responsive. And nothing will make me throw up my hands in frustration faster than input lag.
Re: (Score:2)
And nothing will make me throw up my hands in frustration faster than input lag.
Maybe if you quit throwing your hands up and kept them on the input there wouldn't be so much lag between when you want to do something and when it gets fed into the control. ;)
So? (Score:1)
Re: (Score:3)
Intel's PR department keeps releasing press statements and journalists keep eating it up.
Input latency is a real issue. I'm not impressed that Intel can take a bank of servers to produce content for one client. The business model for that just frankly doesn't work yet, and even when the business model for that does work, input latency will remain.
Write a story when they solve input latency.
Re:So? (Score:4, Informative)
I never thought OnLive would work, but it kinda, sorta does. Over my office connection.
Not my home connection, that's too slow. Or my friend's connection, which is fast enough, but suffers from occasional hiccups that break the stream... but at work, it works great, so this should work anywhere that a mobile device can receive and sustain a 5Mbps stream of data without going over their data cap in a month.
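The data-cap worry is easy to quantify. Using the 5Mbps figure above, and assuming (my numbers) two hours of play a day for a month:

```python
# Back-of-envelope: monthly data use of a sustained 5 Mbps game stream.
mbps = 5                 # stream bitrate from the comment above
hours_per_day = 2        # assumed play time
gb_per_month = mbps / 8 * 3600 * hours_per_day * 30 / 1000  # MB -> GB
print(gb_per_month)      # -> 135.0 GB per month
```

That blows past many mobile data caps, which is why this looks more plausible on wired broadband than on a tablet's cellular plan.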
Re: (Score:2)
The last demonstration they did with Wolfenstein was 4 servers to produce the video stream for one client. Perhaps they're down to one server now. But even one server with a 32-core chip producing the video stream for a single client doesn't make financial sense yet.
Constant high def video streaming doesn't work well in the new age of data caps. Input latency depends how much overall WAN latency you have between each client and the servers. That will vary for person to person, but hardcore gamers do care ab
Re: (Score:2)
Well, why separate the client and the server? If you think about it, this could be a market for 32...1024-core desktops: ray-traced games. I mean, currently most games don't need nearly so much CPU power, and Intel wants a market for their many-core chips. They do have a market server-side, but if they can get a client-side market too? Great!
Re: (Score:2)
Re: (Score:2)
These "OnLive"-like brute-force streaming solutions will die for a simple reason: increasingly cheap computing power and ever-increasing display resolutions (and bandwidth)
I doubt it. It might certainly change its target audience, as it doesn't seem to be all that good at replacing a gamer PC or a console right now. But it's hard to beat having access to a demo in a matter of seconds vs. a matter of hours on the PC. With OnLive you could, in theory at least, flip through games like you flip through TV channels; you'll never get that ability when you have to download multiple GB before you can even see the title screen.
ray traced (Score:1)
Re: (Score:3)
Re: (Score:2)
Re: (Score:3)
Re: (Score:1)
Wolfenstein would fire one ray for each horizontal column on the screen
Aren't horizontal columns commonly called... rows?
Re: (Score:2)
It fired one ray along each pixel of the horizontal axis (ie 320 rays) to get the relevant rendering information for the entire vertical column.
Re: (Score:2)
Wolfenstein 3-D used ray casting [wikipedia.org] which is a bit different [permadi.com] than ray tracing.
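The ray-casting approach described above (one ray per screen column, not one per pixel) can be sketched in a few lines. The map, player position, and step size here are illustrative assumptions; real Wolfenstein used a DDA grid walk rather than fixed-step marching:

```python
# Wolfenstein-style ray casting sketch: one ray per screen column marches
# through a 2-D grid map; the hit distance sets the wall-slice height for
# that entire column.
import math

MAP = [
    "#####",
    "#...#",
    "#...#",
    "#####",
]

def cast_column(px, py, angle, max_dist=10.0, step=0.01):
    """March one ray from (px, py) until it enters a wall cell."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == '#':
            return dist
        dist += step
    return max_dist

# 320 columns across a 60-degree field of view, one ray each
fov = math.radians(60)
columns = [cast_column(2.0, 2.0, -fov / 2 + fov * i / 319) for i in range(320)]
print(len(columns))  # 320 distances, one per screen column
```

Ray tracing, by contrast, fires at least one ray per pixel (320x200 = 64,000 rays) plus secondary rays for reflections and shadows, which is why it needs so much more horsepower.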
Re: (Score:1)
Eheh (Score:5, Interesting)
So, basically Intel is saying, fuck OnLive with its 100ms lag times! We can go for SECONDS! No, MINUTES even! Infinite lag! We can do it!
All you need is to run an old game on hardware that can easily run the real game with an insane data plan.
The bubble is indeed back. Remember all those ideas for sites that wanted 100kb avatar images on real-time-updating forums with 500kb Flash-animated signatures? When most people were happy with a 56k modem? Served from Sun hardware? Well, this is the same thing.
I am quite willing to accept that you can render a game beautifully on a server. I am quite willing to believe tablets can be a lot cheaper if they only have to show a movie. I am even willing to believe that response time over the internet can in theory be fast enough not to produce outlandish lag in ideal circumstances.
In real life?
It is interesting geek stuff but the same thing that killed so many "great" ideas during the last bubble still is there. Bandwidth and latency are still not solved enough for this. We now finally can do the instant messaging people had dreamed up around 2000. 10 years later. (Check how long it has been since your mobile phone became truly capable of doing IM without endless waiting or insanely high prices)
Another piece of proof? Slashdot itself. Take the new "ajax" method of posting. No feedback, endless lag, errors, missing posts. It is clear what they wanted, but the tech just ain't there yet. For the instant feel to work you need servers that can process a request in a handful of milliseconds, not seconds, Mr. Slashdot.
Nice idea Mr Intel, now get out your mobile phone and make a post on slashdot. Then you will know how unrealistic your dream is.
There is a reason we carry crotch-burning CPUs and insane amounts of storage with us. Moore's law doesn't apply to the internet. AT&T won't allow it.
Re: (Score:1)
Unfortunately, it's your bandwidth and power, too. Granted, I get your point about the scalability of this, but your attack ain't free.
Re: (Score:2)
It often takes me more than 10 sec to post comments (and that's not the counter thingie; just network lag + serverload).
Re: (Score:2)
Not sure what you are meaning by feedback, but I've seen some pretty direct feedback on Slashdot when posting, such as NOSHOUTING and the like.
Usually when you enter a comment somewhere it's either accepted straight away or you get some kind of feedback to say that something is happening.
When you enter a comment on the NEW SUPER IMPROVED SLASHDOT, you sit there for twenty seconds wondering whether something is going to happen. Sure, there's a 'Working' icon at the bottom of the screen... BUT IT'S THERE ALL THE TIME. The bastard 'Working' icon is sitting there spinning away right now as I type this. What does it mean? What is it 'Working' to do? D
I thought the Blackberry tablet was bad (Score:1)
over wireless lan? (Score:1)
All this just makes me feel old (Score:2)
They are streaming content to a device; in other words, calculations happen on a central server and your device acts as a (somewhat) dumb terminal.
It's the mainframe world of Multics again, only 30 years later, much more complex, and servicing trivialities instead of business-critical apps.
The "cloud" has become a buzzword for many, but deep down it's just some central servers doing the grunt-work, and you displaying that data. The reverse of a decentralized, democratic and transparent system; more control
We can already do that without exotic hardware (Score:2)
I'm a big fan of real-time ray tracing, but this doesn't sound all that exciting, considering that about three years ago I was able to play a real-time ray-traced game on a middle-of-the-road laptop. Resolution and framerate weren't great, but it was playable. The game I refer to is Outbound [igad.nhtv.nl] an open-source game based on the Arauna engine.
It's great that this is on Intel's radar, but whenever Intel demonstrates some new real-time ray-tracing demo that requires a cluster of machines, or some other kind of
Re: (Score:1)
Bull. The Wii is dead, nobody(*) wants one or plays the one they have. That's why Nintendo is hurrying out with a new console while the Xbox/PS3 have been around much longer and will continue to be around for a few more years.
Gameplay _and_ graphics matter.
* - Yes, I know there are a few weirdos out there who do still play their Wii.
Re: (Score:2)
Perhaps you meant scanline/raster rendering is faster than ray tracing? Both use polygons though...
What about pre-rendering everything? (Score:2)
Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves? (I realize you'd then have to superimpose the bad guys over the screen, but that's what the original Wolf3D did anyway.) It would be a lot of computation up front, but then the cloud computer wouldn't have to constantly render a new frame (possibly one it's already rendered).
Or would that take so much time that it's not worth it?
Re: (Score:2)
Re: (Score:2)
I vaguely remember playing a SWAT game for PC that pretty much did this.. kind of..
Well, actually, it used real-life video in place of rendered graphics, the video changing based on which direction you went.
Re: (Score:2)
> Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves?
No, of course it wouldn't be. Let's say we have a completely featureless map that is 100m by 100m, and we track coordinates in (integer) millimeters. This gives us 100000 * 100000 = 10 billion different points we can occupy. Wait, now we can't jump, and our face is stuck at some distance above the ground. We add being prone and crouching to our positions, and the abilit
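The blow-up above is easy to check with a quick calculation. My assumptions: 1mm positional precision, 360 one-degree view headings, one uncompressed 320x200 8-bit frame per (position, heading) state, and no jumping or crouching yet:

```python
# Back-of-envelope storage for pre-rendering every possible view.
positions = 100_000 * 100_000   # 10 billion floor positions (100m x 100m at 1mm)
headings = 360                  # one frame per 1-degree view direction
frame_bytes = 320 * 200         # one uncompressed 8-bit frame
total_bytes = positions * headings * frame_bytes
print(total_bytes / 1e15)       # hundreds of petabytes, before jumping/crouching
```

Even with aggressive compression and coarser position quantization you're still orders of magnitude past any practical storage, which is why games that tried this (Myst-style) locked the player to a handful of fixed viewpoints.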
Re: (Score:2)
Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves?
Feasible? Not so much. Possible in theory, yes. The issue is that you need lots and lots of storage for complete free movement. Older CD-based games like Myst III did essentially that, but you couldn't move freely, just rotate; movement was restricted to fixed positions every few meters. There was some adventure game prototype a few months/years back (discussed on Slashdot) that moved that tech to the next level and allowed free movement on a single plane, but that still ended up looking kind of awkward as yo
net work (Score:1)