
Snapdragon XR2 Chip To Enable Standalone Headsets With 3K x 3K Resolution, 7 Cameras (roadtovr.com)

An anonymous reader quotes a report from Road to VR: Qualcomm today announced Snapdragon XR2 5G, its latest chipset platform dedicated to the needs of standalone VR and AR headsets. The new platform is aimed at high-end devices with support for 3K x 3K displays at 90Hz, along with integrated 5G, accelerated AI processing, and up to seven simultaneous camera feeds for user and environment tracking. While XR1 was made for low-end devices, XR2 5G targets high-end standalone headsets, making it a candidate for Oculus Quest 2, Magic Leap 2, and similar next-gen devices.

XR2 offers notable improvements over the Snapdragon 835 (one of the most common chipsets found in current standalone headsets, including Quest); Qualcomm claims 2x CPU & GPU performance, a 4x increase in pixel throughput for video playback, and up to 6x the per-eye resolution of Snapdragon 835 -- supporting up to 3K x 3K displays at 90Hz. [...] Notably, XR2 supports up to seven simultaneous camera feeds (up from four in prior platforms). This is key for advanced tracking, both of the environment and the user. [...] Qualcomm also says that XR2 offers low-latency pass-through video, which could improve the pass-through experience on headsets like Quest and potentially enable a wider range of pass-through AR use-cases. Additionally, XR2 boasts significantly accelerated AI processing (11x compared to Snapdragon 835), which could greatly benefit the sort of operations used for turning incoming video feeds into useful tracking information.
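As a rough sanity check on the throughput claim, the arithmetic below treats "3K" as 2880 pixels per side and takes the original Quest's panel (1440x1600 per eye at 72Hz) as the Snapdragon 835-class baseline; both figures are assumptions for illustration, not numbers from the article.

```python
# Back-of-the-envelope display throughput under the stated assumptions.

def pixels_per_second(width, height, hz, eyes=2):
    """Total pixels the display pipeline must deliver each second."""
    return width * height * hz * eyes

quest = pixels_per_second(1440, 1600, 72)  # assumed 835-class baseline (Quest)
xr2   = pixels_per_second(2880, 2880, 90)  # "3K x 3K at 90Hz", per eye

print(f"Quest-class: {quest / 1e6:.0f} Mpixels/s")  # ~332 Mpixels/s
print(f"XR2-class:   {xr2 / 1e6:.0f} Mpixels/s")    # ~1493 Mpixels/s, ~4.5x
```

Under those assumptions, the ratio lands close to Qualcomm's claimed 4x pixel-throughput increase.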

    • Nah, we're a long way from that kind of over-the-top.

      3Kx3K is still way below what you'd really like for VR/AR. A 4K UHD screen mostly filling your field of view will have clearly visible pixels, but would at least be pushing them down to the size where you'd mostly notice them as "jaggies" on sharp edges. Provide one of those for each eye and you're at ~8.3 Mpixels per eye (~16.6 Mpixels/frame), roughly on par with the ~9 Mpixels per eye of 3Kx3K. And you could probably double that to an 8K screen per eye and still have room for visual improvement.
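      The arithmetic behind those numbers, for the curious (illustrative only; note that if "3K" means 2880 pixels, 2880x2880 happens to be exactly the same pixel count as 4K UHD):

      ```python
      # Per-eye pixel counts for the panels discussed above.
      panels = {
          "4K UHD (3840x2160)": 3840 * 2160,  # ~8.3 Mpixels
          "3Kx3K  (3000x3000)": 3000 * 3000,  # 9.0 Mpixels (8.3 if "3K" = 2880)
          "8K UHD (7680x4320)": 7680 * 4320,  # ~33.2 Mpixels
      }
      for name, px in panels.items():
          print(f"{name}: {px / 1e6:4.1f} Mpixels/eye, {2 * px / 1e6:4.1f} Mpixels/frame")
      ```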

  • Now you just need a $9k rig to run it on high. (I'm just jealous)
  • The next step is for headsets to go to 802.11ay and hit the 60GHz spectrum for wired speeds without the tangled wires. You'll need a battery-pack belt for the first generations, but higher resolutions and full speed with no wires are what it will take for AR/VR to take off. All VR headsets need to become AR headsets with masks that turn them into VR. Then you can switch between AR and VR in 2 seconds or less, letting everyone code for a single headset for both applications.
    • Wireless VR at 3K already exists [intel.com], using Intel WiGig technology paired with some decent video compression that the most recent Intel i9 CPUs can (just) handle.
    • Right, 5G is a dumb play - it won't work in most places before these headsets are discontinued.

        I want a wireless high-res display tethered to my phone so I can get some reasonable work done on a four-hour bus ride without lugging along a laptop. Wireless keyboard/trackball can fold up nicely.

  • Because even high-end video cards have issues with this kind of resolution.

    • Really? I mean this is less than double the resolution that an Index already has, and that headset has no issue running on mid-range graphics cards.

  • I can see industrial, military, and medical uses but why does the average consumer need any sort of AR/VR? I don't get it.
    • Because. It's. Fun!

      Books and movies have industrial, military, and medical uses too, but are mostly for fun :)

    • why does the average consumer need any sort of AR/VR? I don't get it.

      Because it's incredibly frigging fun. Even now, with the technology still far from perfect, I don't regret getting a VR headset for even a moment, and I highly recommend that anyone actually try it at a local VR lab. I got hooked playing with an old Rift CV1 headset, and the current generation is even better.

    • Define "need."
    • Because GAMES !!!! Almost every game is better in VR because of the immersion. And someday, movies: when movies are generated on the fly by our entertainment AIs, the VR version where you are inside the movie will be better than the screen version.
  • I own a wireless Vive Pro, which is around 3K resolution. It's "enough" to give a crisp view. A simple standalone headset at that resolution would be a game-changer.

     

    • The Vive Pro is in total not even 3K (yes, it's almost 3K across both eyes, but certainly not in vertical resolution); here it's about 3K PER EYE, double what the Vive Pro has PER EYE. So yeah, this would be a game-changer... BUT seeing that the Nvidia RTX line already has trouble driving the Vive Pro at 90fps with good visuals, don't expect these XR2 chipsets to drive those displays with the visuals you'd expect from modern games (don't count on Red Dead Redemption 2 even running at low to medium settings).
  • by Miamicanes ( 730264 ) on Friday December 06, 2019 @01:17AM (#59490300)

    Passthrough AR (using cameras to capture the room around you for realtime display on the screen) needs a minimum of ~200-250hz, and *really* 400-1000hz if it extends into your peripheral vision. Otherwise, the 'slosh' will have you puking in no time.

    Motion sickness is a problem, but SLOSH-induced VR sickness is ENORMOUSLY worse, and not something you EVER really adapt to (besides learning to turn your head slowly & close your eyes before turning your head quickly to hide it from yourself).

    Jitter (synthetic images not anchored properly) in foveal vision is annoying, but is stressful & sickness-inducing in peripheral vision. Blame evolutionary survival mechanisms that depend upon noticing 'danger' out of the corner of your eye.

    One stopgap idea I had: do 'bluescreen' passthrough AR. Render synthetic video to an alpha-blended framebuffer at ~100hz. Capture stereo camera images at 400hz, and combine both for output to a 400hz display (repeating each frame of synthetic overlay video 4 times per camera-frame).
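    A minimal sketch of that frame cadence, assuming a 400Hz camera/display and a 100Hz synthetic render; the function names are hypothetical stand-ins, not a real headset API:

    ```python
    CAMERA_HZ = 400
    RENDER_HZ = 100
    REPEAT = CAMERA_HZ // RENDER_HZ  # each overlay frame spans 4 camera frames

    def passthrough_loop(capture_camera_frame, render_overlay, alpha_blend, present):
        """'Bluescreen' passthrough: reality at 400Hz, synthetic overlay at 100Hz."""
        overlay = None
        frame_index = 0
        while True:
            if frame_index % REPEAT == 0:
                overlay = render_overlay()         # RGBA frame, re-rendered at 100Hz
            camera = capture_camera_frame()        # fresh real-world frame at 400Hz
            present(alpha_blend(camera, overlay))  # the overlay may lag; reality never does
            frame_index += 1
    ```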

    The net effect would be similar to optical AR (e.g., Magic Leap & HoloLens) today... the synthetic image would still lag a bit, but at least the reality-anchoring passthrough video wouldn't slosh.

    Slosh is just one problem of many... but it's a BAD problem that totally kills the usability of passthrough AR @ 90hz.

    I'd also propose a short-term display compromise: 8k displays, with the understanding that only a small part can sustain synthetic video at full framerate. Use case: developers who run their IDE directly on virtual monitors in the headset. At 8k, you can render the equivalent of 3 2560x1600 monitors (one fully in view, plus ~1/2 of the adjacent ones) at normal viewing distance, so you can develop VR apps without having to constantly put on & take off the headset. Yes, everyone knows 90hz + 8k games aren't viable with present-day GPUs... and that's ok. Make 'developer' headsets with double-rez displays so they can render the IDE at 8k, but program games for 4k & just scale them 2x2 when running on a 'developer' headset.
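    The monitor-layout arithmetic checks out (illustrative only):

    ```python
    # Three 2560x1600 "virtual monitors" side by side exactly span an 8K frame.
    W8K, H8K = 7680, 4320
    MON_W, MON_H = 2560, 1600
    print(W8K // MON_W, "monitors across")           # 3
    print(f"{W8K * H8K / 1e6:.1f} Mpixels per eye")  # 33.2 -- hence render games at 4K, scale 2x2
    ```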

    Same deal with 4K monitors. Yes, we all know interface speeds & GPUs are too slow today for 4K @ 240Hz. So make the display 3840x2160 & capable of 60-120fps at that resolution... but capable of 240-480fps at 1920x1080 with 2x2 nearest-neighbor scaling, and maybe even 960-1000fps at 960x540. I'm so sick of having to choose between a display that can do 1080p120 or 2160p30, instead of both 1080p120 and 2160p30 (or 1080p240 + 2160p60). Computationally, any monitor with G-Sync or FreeSync should be able to do this. Worst-case, it might need another $5 worth of RAM to allow static double-buffering. For an expensive premium monitor, that's *nothing*.
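    Those mode pairs aren't arbitrary: they all move the same number of pixels per second, which is why a scaler-equipped panel could in principle offer all of them over one link (illustrative arithmetic only):

    ```python
    # Pixel rate for each proposed mode of a 4K panel with nearest-neighbor scaling.
    modes = [
        (3840, 2160, 60),   # native 4K UHD
        (1920, 1080, 240),  # 2x2 scaled
        (960,  540,  960),  # 4x4 scaled
    ]
    for w, h, hz in modes:
        print(f"{w}x{h} @ {hz}Hz -> {w * h * hz / 1e6:.0f} Mpixels/s")
    # All three come out to ~498 Mpixels/s: constant link bandwidth,
    # only the scaling factor changes.
    ```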

    • ^-- elaboration about the 'bluescreen passthrough' idea:

      Have the app render a synthetic scene that's ~25% taller & wider than the visible display at 1/4 the passthrough camera rate. Then, instead of simply repeating each rendered frame 4 times or attempting to use the GPU to render it at camera speed, have a DSP-like intermediate GPU that integrates the motion sensors & camera views to transform the synthetic reference frame slightly during the intermediate frames to improve its anchoring to your surroundings.

    • Passthrough AR (using cameras to capture the room around you for realtime display on the screen) needs a minimum of ~200-250hz, and *really* 400-1000hz if it extends into your peripheral vision. Otherwise, the 'slosh' will have you puking in no time.

      Completely incorrect. Passthrough AR needs improved *latency*, not improved framerates. 80/90Hz is more than enough to prevent VR sickness, provided the latency is low.

      • At 90fps, your camera-display latency will NEVER be less than the duration of a single video frame... 11.11ms.

        11.11ms is ABSOLUTELY perceptible as "slop" or "slosh". Attempting to engage in tasks requiring hand-eye coordination with 11.11ms latency would have you fumbling like you're drunk, and feeling like your bones have turned to floppy rubber. And 11.11ms is the BEST case for 90fps video.

        To reduce the latency further, you MUST increase the video framerate (at least for the camera and display, regardless of the rendering rate).
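        For reference, the per-frame latency floor at each refresh rate is simple arithmetic:

        ```python
        # A frame can't be shown before it has been captured and scanned out, so
        # frame duration is a hard floor on camera-to-display latency.
        for hz in (90, 120, 240, 400, 1000):
            print(f"{hz:>4}Hz -> {1000 / hz:6.2f} ms per frame")
        # 90Hz -> 11.11 ms (the figure cited above); 400Hz -> 2.50 ms
        ```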

          • No, you only need to increase the frame rate for the camera, not the display. Being able to display a prepared input on a 90Hz display already solves VR sickness; if it didn't, VR wouldn't be working right now. The only issue for AR is getting the image ready by the time the display is ready to switch to the next frame, and that isn't a limitation of display technology but rather of everything else around it.

          • That's the thing, though... VR *isn't* very usable today. At least, not for more than a few minutes at a time. And slosh is a big reason *why* forcing an employee to spend hours at a time working in 90hz VR would almost certainly constitute an OSHA violation.

            The injury is cumulative over the span of a few hours. If you're playing a high-adrenaline game, you might not notice the stress. If you're an architect doing CAD in a passthrough-AR environment, you absolutely WILL notice the stress that comes from enduring it for hours on end.

    • Motion sickness is a problem for only about a fifth of the population, and the younger the VR/AR user, the lower the chance of nausea.
      • Slosh != "motion sickness".

        Motion sickness is what you get when your eyes tell you that you're moving, but your vestibular system says you aren't.

        "Slosh" is what you perceive when you start turning your head, but the world around you lags by 5-100+ milliseconds. It's not as bad as motion sickness... it's much, MUCH worse. It's the main reason why passthrough AR has been non-viable so far.

        If you're in a 100% synthetic immersive cartoon/Seuss-like world, your brain can somewhat rationalize slosh. If you're in a photoreal view of the real world around you, it can't.

  • Let's not get ahead of ourselves. Yes, it might be able to drive dual 3Kx3K displays at 90Hz, but it sure won't be with graphics like those produced by GPUs such as the Nvidia RTX series (which already have trouble pumping out visuals for dual 2Kx2K displays at 90Hz). So don't go expecting Red Dead Redemption 2 at ultra settings to run on that XR2 chip (I think it won't even do that on low settings).
  • "XR2 boasts significantly accelerated AI processing"

    What else could that mean? It's not like the term would be misused as part of a marketing scheme or anything...

"Imitation is the sincerest form of television." -- The New Mighty Mouse

Working...