Cloud Graphics Intel Games Hardware

Cloud-Based, Ray-Traced Games On Intel Tablets 91

An anonymous reader writes "After Intel showed a ray-traced version of Wolfenstein last year, rendered in the cloud and streamed to a laptop, the company (which recently announced its shift to the mobile market) is now showing its research project running on various x86 tablets with 5 to 10 inch screens. The heavy calculations are performed by a 'cloud' consisting of a single machine with a Knights Ferry card (32 cores) inside. The achieved frame rates are around 20-30 fps."
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Wednesday June 08, 2011 @11:09AM (#36376090)

    I've got a CloudPad running CloudOS 0.99. It is freaking cloudtastic.

    • by Anonymous Coward

      I have a SmurfPad running SmurfOS and it's just Smurfy. Heh.

    • CTO: Well, what've you got?

      Vendor: Well, there's servers, cloud and clients; software, hardware, cloud and peripherals; networking and cloud; servers, software and cloud; clients, software networking and cloud; cloud, software, networking and cloud; cloud, peripherals, cloud, cloud, servers, and cloud; cloud, clients, cloud, cloud, software, cloud, networking, and cloud;

      Consultants (starting to chant): Cloud, cloud cloud cloud ...

      Vendor: ...cloud, cloud, cloud, software, and cloud; cloud, cloud, cloud, clou

    • You reminded me of a Penny Arcade comic.

      Lunch [penny-arcade.com]

  • It may barely work with desktops (if you're close to the servers), but with mobile devices I'm quite sure it's unplayable in practice.
    • I know, but it sure would be cool if you could play games on a low-power tablet while the actual processing is done by a server in your closet or living room.

      I should pitch that idea to one of the big gaming companies, like Nintendo.

    • That's what I don't understand: put the "cloud" server in a closet somewhere in my house and connect all the other machines to it through a strong wireless-N network, and you might have something usable. I just can't imagine relying on your ISP's latency when it comes to graphics.

    • Maybe not for an FPS or a racer, but for a really cool StarCraft killer this would have interesting possibilities. Handle the 2D GUI with the tablet processor and then ray-trace a photorealistic game engine.

      You could also stream data and handle some of the primary ray data locally.

      If you rendered the directly visible lighting locally and used irradiance caching, the cloud could do the heavy lifting with GI and the tablet could do the kNN lookups (a sketch follows this thread).

    • In this particular case, it sounds like the server-side power required per user is very large. So I would imagine it's impractical at the moment regardless of networking issues.
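
A minimal sketch of the irradiance-caching split suggested above: the cloud fills a cache of sparse GI samples, and the tablet shades points with k-nearest-neighbour lookups against it. All names and numbers are illustrative assumptions, not from the article or the comments:

    # Hypothetical split: cloud precomputes sparse irradiance samples,
    # tablet interpolates them with a k-nearest-neighbour average.
    import math

    class IrradianceCache:
        def __init__(self, samples):
            # samples: list of ((x, y, z), irradiance) pairs, computed cloud-side
            self.samples = samples

        def lookup(self, point, k=4):
            # Brute-force kNN; a real client would use a k-d tree.
            nearest = sorted(self.samples, key=lambda s: math.dist(s[0], point))[:k]
            # Inverse-distance weighting of the k nearest cached samples.
            weights = [1.0 / (math.dist(p, point) + 1e-6) for p, _ in nearest]
            return sum(w * e for w, (_, e) in zip(weights, nearest)) / sum(weights)

    # Tablet-side shading: direct lighting computed locally, GI from the cache.
    cache = IrradianceCache([((0, 0, 0), 0.2), ((1, 0, 0), 0.5),
                             ((0, 1, 0), 0.3), ((1, 1, 0), 0.6)])
    print(cache.lookup((0.5, 0.5, 0.0)))   # interpolated indirect light, ~0.4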

  • by RoverDaddy ( 869116 ) on Wednesday June 08, 2011 @11:09AM (#36376094) Homepage

    The first product codenamed "Knights Corner" will target Intel's 22nm process and use Moore's Law to scale to more than 50 Intel cores.

    Nonsense marketing babble. Moore's Law is predictive. You can't use it to MAKE anything happen.

  • or gtfo
  • by LikwidCirkel ( 1542097 ) on Wednesday June 08, 2011 @11:24AM (#36376276)
    Who cares if it looks awesome if latency sucks. I'd rather have SuperNES StarFox quality graphics with no lag than ray-traced graphics with horrible latency. It can be reduced, but I don't yet believe it's possible to make it unnoticeable. I guess I'll believe it when I see it.
    • by Uhyve ( 2143088 )
      I can barely play games with vsync enabled, never mind adding live streaming into the mix. I'm not saying "barely play" in a snobby way either, I actually just suck at games when I get any extra input latency.
      • And I can't accept vsync disabled; screen tearing is far more noticeable and annoying than a tiny bit of lag.

        • by Uhyve ( 2143088 )
          Oh yeah, I didn't mean to deride anyone who uses vsync. I just literally have trouble playing games with vsync enabled; I wish I didn't, because screen tearing is annoying.

          My point was really that if I have trouble with vsync enabled, I have no idea how I could possibly play cloud streamed games. Unfortunately, I've not even had the chance to try OnLive yet because I don't live in the US, so I dunno, maybe it'll be fine.
    • Who cares if it looks awesome if latency sucks. I'd rather have SuperNES StarFox quality graphics with no lag than ray-traced graphics with horrible latency.
      It can be reduced, but I don't yet believe it's possible to make it unnoticeable. I guess I'll believe it when I see it.

      Latency is an absolutely huge problem. It's a bigger problem, for me, than poor image quality.

      I'll happily turn down the visuals if it makes my game more responsive. And nothing will make me throw up my hands in frustration faster than input lag.

      • And nothing will make me throw up my hands in frustration faster than input lag.

        Maybe if you quit throwing your hands up and kept them on the input there wouldn't be so much lag between when you want to do something and when it gets fed into the control. ;)
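
To put rough numbers on the latency worries in this thread, here is a toy input-to-photon budget; every figure is an assumption chosen for illustration, not a measurement:

    # Toy input-to-photon latency budget (all numbers are assumptions).
    local_ms = {"input sampling": 8, "simulate + render": 17, "display scanout": 16}
    cloud_extra_ms = {"video encode": 10, "network round trip": 60, "video decode": 10}

    local_total = sum(local_ms.values())                      # ~41 ms locally
    cloud_total = local_total + sum(cloud_extra_ms.values())  # ~121 ms via cloud
    print(f"local: {local_total} ms, cloud: {cloud_total} ms")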

  • by mazesc ( 1922428 )
    Will this be news every time a new device is targeted?
    • Intel's PR department keeps releasing press statements and journalists keep eating it up.

      Input latency is a real issue. I'm not impressed that Intel can take a bank of servers to produce content for one client. The business model for that frankly doesn't work yet, and even when it does, input latency will remain.

      Write a story when they solve input latency.

      • Re:So? (Score:4, Informative)

        by Beardydog ( 716221 ) on Wednesday June 08, 2011 @11:36AM (#36376404)
        Isn't it just one server with a 32-core chip?

        I never thought OnLive would work, but it kinda, sorta does. Over my office connection.

        Not my home connection; that's too slow. Or my friend's connection, which is fast enough but suffers from occasional hiccups that break the stream... but at work it works great, so this should work anywhere a mobile device can receive and sustain a 5Mbps stream without blowing through its data cap in a month.
        • The last demonstration they did with Wolfenstein used 4 servers to produce the video stream for one client. Perhaps they're down to one server now. But even one server with a 32-core chip producing the video stream for a single client doesn't make financial sense yet.

          Constant high-def video streaming doesn't work well in the new age of data caps, and input latency depends on how much overall WAN latency there is between each client and the servers. That will vary from person to person, but hardcore gamers do care about it (see the numbers after this thread).

          • Well, why separate the client and the server? If you think about it, this could be a market for 32-to-1024-core desktops: ray-traced games. Currently most games don't need nearly that much CPU power, and Intel wants a market for its many-core chips. It already has a market server-side, but if it can get a client-side market too? Great!

    • This "OnLive" like brute-force streaming solutions will die for a simple reason of increasing cheap computing power and also increasing display resolutions (and bandwidth), you can barely stream 1080p now over Wifi, but wait for the next year's new "retina" resolution displays, the bandwidth needed for streaming next-gen video and 3D is beyond the current network capabilities, it is an unfortunate (VC-cow funded) solution that eats your 20GB months allowance in a few minutes, clogs the network for your peer
      • by grumbel ( 592662 )

        This "OnLive" like brute-force streaming solutions will die for a simple reason of increasing cheap computing power and also increasing display resolutions (and bandwidth)

        I doubt it. It might well change its target audience, as it doesn't seem to be a good replacement for a gaming PC or a console right now. But it's hard to beat having access to a demo in a matter of seconds versus a matter of hours on the PC. With OnLive you could, in theory at least, flip through games like you flip through TV channels; you'll never get that ability when you have to download multiple GB before you can even see the title screen.
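
Back-of-the-envelope numbers for the figures in this thread (the 5Mbps stream and 20GB cap both come from the comments above; the arithmetic is just unit conversion):

    # Data used by a constant 5 Mbps stream against a 20 GB monthly cap.
    stream_mbps = 5.0
    gb_per_hour = stream_mbps / 8 * 3600 / 1000   # megabits/s -> GB per hour
    cap_gb = 20.0
    print(f"{gb_per_hour:.2f} GB/hour")                           # 2.25 GB/hour
    print(f"{cap_gb / gb_per_hour:.1f} hours of play per month")  # ~8.9 hours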

  • Wasn't Wolfenstein ray traced to begin with?
    • Ray-cast to begin with. But I think Voxelstein 3D was ray-traced before Intel got around to doing it.
    • by Anaerin ( 905998 )
      Not even close. Wolfenstein 3D was "Ray Casted".
    • by Toonol ( 1057698 )
      The algorithm used rays, but not in the sense that ray tracing uses them. Wolfenstein would fire one ray for each horizontal column on the screen, to see where it intersected with the wall. That would be 320 rays for the full screen, and was why the maps were effectively 2D. Ray tracing, of course, uses at least one ray per pixel. (A sketch follows this thread.)
      • Wolfenstein would fire one ray for each horizontal column on the screen

        Aren't horizontal columns commonly called... rows?

        • by vux984 ( 928602 )

          It fired one ray along each pixel of the horizontal axis (ie 320 rays) to get the relevant rendering information for the entire vertical column.

    • Wolfenstein 3-D used ray casting [wikipedia.org], which is a bit different [permadi.com] from ray tracing.
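
For anyone curious about the distinction drawn in this thread, here is a minimal Wolfenstein-style ray-casting sketch: one ray marched through a 2D grid per screen column (320 per frame), not one per pixel. It is an illustrative toy, not id Software's actual algorithm:

    # One ray per screen column, marched through a 2D grid map.
    import math

    GRID = ["11111",
            "1...1",
            "1.1.1",
            "1...1",
            "11111"]               # '1' marks a wall cell in a tiny map

    def cast_ray(px, py, angle, max_dist=20.0, step=0.01):
        # March the ray forward in small steps until it enters a wall cell.
        dist = 0.0
        while dist < max_dist:
            dist += step
            x = px + math.cos(angle) * dist
            y = py + math.sin(angle) * dist
            if GRID[int(y)][int(x)] == "1":
                return dist        # distance determines drawn wall height
        return max_dist

    SCREEN_W, SCREEN_H = 320, 200  # 320 columns -> 320 rays per frame
    FOV = math.radians(60)
    px, py, facing = 2.5, 2.5, 0.0
    for col in range(SCREEN_W):
        ray_angle = facing - FOV / 2 + FOV * col / SCREEN_W
        d = cast_ray(px, py, ray_angle)
        column_height = int(SCREEN_H / (d + 1e-6))  # nearer wall, taller column
        # (Real Wolfenstein also corrects for fish-eye distortion; omitted here.)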

  • This story would have been "Bluetooth-Based Game blablabl" a couple of years ago.
  • Eheh (Score:5, Interesting)

    by SmallFurryCreature ( 593017 ) on Wednesday June 08, 2011 @11:39AM (#36376446) Journal

    So, basically Intel is saying: fuck OnLive with its 100ms lag times! We can go for SECONDS! No, MINUTES even. Infinite lag! We can do it!

    All you need is to run an old game, on hardware that could easily run the real game locally, with an insane data plan.

    The bubble is indeed back. Remember all those ideas for sites that wanted 100kb avatar images on real-time-updating forums with 500kb Flash-animated signatures, when most people were happy with a 56k modem? Served from Sun hardware? Well, this is the same thing.

    I am quite willing to accept that you can render a game beautifully on a server. I am quite willing to believe a tablet can be a lot cheaper if it only has to show a movie. I am even willing to believe that response time over the internet can, in theory, be fast enough not to produce outlandish lag in ideal circumstances.

    In real life?

    • ISPs, and especially mobile ISPs, have been trying to cut back on the amount of data consumed. Can you imagine their reaction to the demands of an UNBUFFERED HD video stream during peak hours on a MOBILE device? Most of them can barely stand you downloading a low-res YouTube video of a couple of minutes. How long is the average game level?
    • Latency. It is already bad enough on the wired internet, where with multiplayer your computer only has to transmit the input data, not the actual screen itself. On 3G or even 4G? Forget about it. For web pages the latency can easily reach a minute. In a chess game that would be too slow.
    • Hardware. It just keeps on getting more powerful; my netbook can now play some of the older games just fine. I recently replaced my Linux desktop with an AMD CPU-on-motherboard setup because I realized that for desktop use and movie watching I don't need any more than that (my game machine is absurdly overpowered). Tablets are getting more powerful all the time; someone should, for fun's sake, report how Wolfenstein runs on an Atom CPU.

    It is interesting geek stuff, but the same thing that killed so many "great" ideas during the last bubble is still there: bandwidth and latency are still not solved well enough for this. We can now finally do the instant messaging people dreamed up around 2000, 10 years later. (Check how recently your mobile phone became truly capable of doing IM without endless waiting or insanely high prices.)

    Another piece of proof? Slashdot itself. Take the new "ajax" method of posting: no feedback, endless lag, errors, missing posts. It is clear what they wanted, but the tech just ain't there yet. For the instant feel to work you need servers that can process a request in a handful of milliseconds, not seconds, Mr. Slashdot.

    Nice idea, Mr. Intel; now get out your mobile phone and make a post on Slashdot. Then you will know how unrealistic your dream is.

    There is a reason we carry crotch-burning CPUs and insane amounts of storage with us. Moore's Law doesn't apply to the internet. AT&T won't allow it.

  • I thought it was bad that the BlackBerry tablet requires your phone to get email and function correctly. This Intel tablet requires a 32-core "cloud" machine? Am I going to need a co-location provider for my backend to ensure I can play a tablet-based FPS? Or is Intel planning to provide unlimited cloud capacity for each low-power (and low-cost) tablet processor it sells?
  • If this could work over a wireless LAN to a beefy desktop computer, it might be feasible.
  • They are streaming content to a device; in other words, calculations happen on a central server and your device acts as a (somewhat) dumb terminal.

    It's the mainframe world of Multics again, only 30 years later, much more complex, and servicing trivialities instead of business-critical apps.

    The "cloud" has become a buzzword for many, but deep down it's just some central servers doing the grunt-work, and you displaying that data. The reverse of a decentralized, democratic and transparent system; more control

  • I'm a big fan of real-time ray tracing, but this doesn't sound all that exciting, considering that about three years ago I was able to play a real-time ray-traced game on a middle-of-the-road laptop. Resolution and framerate weren't great, but it was playable. The game I refer to is Outbound [igad.nhtv.nl], an open-source game based on the Arauna engine.

    It's great that this is on Intel's radar, but whenever Intel demonstrates some new real-time ray-tracing demo, it requires a cluster of machines or some other kind of hardware ordinary players don't have.

  • Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves? (I realize you'd then have to superimpose the bad guys over the screen, but that's what the original Wolf3D did anyway.) It would be a lot of computation up front, but then the cloud computer wouldn't have to constantly render new frames (possibly including some it's already rendered).

    Or would that take so much time that it's not worth it?

    • I don't think pre-rendering all possible images would be practical (except in a game like Myst where movement is confined); however, when we get into the realm of global illumination, it does make a certain amount of sense to do the ambient lighting computation (which is the same regardless of camera position) up front. In a sense, this is what modern games already do: the lighting effects are essentially painted on the walls as textures. I think the area where there's room for improvement is making that baked global illumination less static.
    • I vaguely remember playing a SWAT game for PC that pretty much did this... kind of.

      Well, actually, it used real-life video in place of rendered graphics, the video changing based on which direction you went.

    • by Thiez ( 1281866 )

      > Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves?

      No, of course it wouldn't be. Let's say we have a completely featureless map that is 100m by 100m, and we track coordinates in (integer) millimeters. That gives us 100000 * 100000 = 10 billion different points we can occupy. Wait, now we can't jump, and our face is stuck at some fixed distance above the ground. We add being prone and crouching to our positions, and the ability to look around, and the count explodes further.

    • by grumbel ( 592662 )

      Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves?

      Feasible? Not so much. Possible in theory, yes. The issue is that you need lots and lots of storage for completely free movement (some arithmetic follows this thread). Older CD-based games like Myst III did essentially that, but you couldn't move freely, just rotate; movement was restricted to fixed positions every few meters. There was some adventure-game prototype a few months/years back (discussed on Slashdot) that moved that tech to the next level and allowed free movement on a single plane, but it still ended up looking kind of awkward as you moved.
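
Putting numbers on the explosion described in this thread, with assumed quantisation steps (the stance, heading, pitch and frame-size figures are illustrative guesses, not from the comments):

    # Frame count for pre-rendering every view, under assumed quantisation.
    positions   = 100_000 * 100_000   # 100 m x 100 m at 1 mm steps = 1e10
    stances     = 3                   # standing, crouching, prone
    yaw_steps   = 360                 # 1-degree heading increments
    pitch_steps = 180                 # 1-degree look up/down increments
    frames = positions * stances * yaw_steps * pitch_steps   # ~1.9e15 frames
    frame_kb = 50                     # assume ~50 KB per compressed frame
    print(f"{frames:.2e} frames")                            # 1.94e+15
    print(f"{frames * frame_kb / 1e9:.2e} TB of storage")    # ~9.7e+07 TB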

