Xbox 360 GPU A Vector Co-Processor?

Anyone Seen Thomas? writes "While Beyond3D's article on the ATI C1 (XENOS) graphics processor in the XBOX 360 gives you all you need to know about ATI's next-generation hardware in terms of generating screen pixels, it also gives a big clue as to how it'll be useful for general-purpose vector programming. XENOS is able to write data out of its unified memory architecture into system memory, and read it back again later. So with a large pool of powerful vector hardware available, does anyone fancy the idea of having a generalised, high-performance vector processor in their PC? Read about that and the rest of XENOS." From the article: "Since XBOX 360's announcement and ATI's unleashing from the non-disclosure agreements we've had the chance to not just chat with Robert Feldstein, VP of Engineering, but also Joe Cox, Director of Engineering overseeing the XBOX graphics design team, and two lead architects of the graphics processor, Clay Taylor and Mark Fowler. Here we hope to accurately impart a slightly deeper understanding of the XBOX 360 graphics processor, how it sits within the system, understand more about its operation, as well as give some insights into the capabilities of the processor."
  • In simple terms the MEMEXPORT function is a method by which Xenos can push and pull vectorised data directly to and from system RAM.

    The only problem is if the GPU has to use the same bus as the CPU, in which case the two will have to compete with each other, potentially leading to bottlenecks and/or messy bus arbitration. Hopefully Microsoft will give the GPU direct access to the memory.

    Performance-wise, it wouldn't be quite as good as having built-in memory on the GPU, but it would be a lot cheaper.
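
    To make the idea concrete, here's a minimal C sketch of that push/process/pull round trip. The memexport_push/memexport_pull names are hypothetical stand-ins for illustration only, not the real Xenos interface:

        /* Hypothetical sketch of the MEMEXPORT round trip described above.
         * memexport_push/memexport_pull are invented names, not a real API. */
        #include <stdio.h>

        #define N 1024

        /* Pretend these hand a buffer to/from the GPU across the shared bus. */
        static void memexport_push(const float *src, float *gpu_buf, int n) {
            for (int i = 0; i < n; i++) gpu_buf[i] = src[i];   /* CPU -> GPU */
        }
        static void memexport_pull(const float *gpu_buf, float *dst, int n) {
            for (int i = 0; i < n; i++) dst[i] = gpu_buf[i];   /* GPU -> CPU */
        }
        /* Stand-in for a shader pass: the GPU's vector ALUs chew on the data. */
        static void gpu_vector_pass(float *buf, int n) {
            for (int i = 0; i < n; i++) buf[i] = buf[i] * 2.0f + 1.0f;
        }

        int main(void) {
            static float in[N], gpu[N], out[N];
            for (int i = 0; i < N; i++) in[i] = (float)i;

            memexport_push(in, gpu, N);   /* write out to "GPU" memory */
            gpu_vector_pass(gpu, N);      /* vector work happens here  */
            memexport_pull(gpu, out, N);  /* read the results back     */

            printf("out[10] = %f\n", out[10]);   /* prints 21.000000 */
            return 0;
        }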

    • Re:Interesting... (Score:3, Interesting)

      by Keeper ( 56691 )
      The GPU also functions as the main memory controller; it has full access to all of the memory on the machine.

      MEMEXPORT is exceptionally cool because it means the CPU can stream data to the cache and let the GPU access it directly. This reduces latency and leaves more usable main-memory bandwidth for both the CPU and the GPU.

      • > The GPU also functions as the main memory controller

        Cool.

        What's next? A floppy controller in charge of CPU to CPU communications?
        • Re:Outsourcing? (Score:3, Informative)

          by Keeper ( 56691 )
          Think about it for a minute. What component in the system is going to be hitting memory the most? (Hint: it renders textures to the screen.) Look at the kind of bus most PC video cards have between the GPU and on-card memory -- they're built to reduce latency as much as possible.

          Moving the memory controller onto the same die as the GPU yields a non-trivial performance benefit in both available bandwidth and latency (look at the kind of gains AMD got by moving the northbridge on-chip).

          If the controller w
        • You laugh, but in days of yore, the CPU inside the Commodore 64 disk drive (the 1571) was more powerful than the CPU in the C=64 itself.

          • > but in days of yore, the CPU inside the Commodore 64 disk drive (the 1571)
            > was more powerful than the CPU in the C=64 itself.


            Actually, the 1571 came in 1985 (as a companion to the C=128), 3 years after the C=64.
            The step up from 1 MHz to 2 MHz during that time is to be expected.
    • Check out this page [beyond3d.com] of the article. The Xbox 360 uses a unified memory architecture, and the same 512 MB of memory is used by the CPU for general-purpose work and by the GPU for storing textures or whatever. There is also 10 MB of eDRAM, which serves as a frame buffer and can handle some processing such as Z-buffering, blending, etc.

      The heart of the GPU seems to be on the same chip as the Northbridge controller, which basically means that the GPU has more direct access to the memory than the CPU.

      Check

  • Just an Example... (Score:5, Insightful)

    by superpulpsicle ( 533373 ) on Wednesday June 15, 2005 @05:46PM (#12827562)
    Does anyone else think Gran Turismo 4 actually looks better than Forza Motorsport? Point being, the console world is always 50% hardware, 50% software.

    • Don't you think that beauty is in the eye of the beholder? It's really a matter of opinion which game "looks" better. You can ask which has more polygons on screen at any given moment, in which case you could probably guess Forza and be correct, but that doesn't necessarily translate to looking better.

      I could make a car that takes all the processing power of the Xbox to draw and still have it look like crap. I'm not saying this is the case with either game, but it's a valid point.

      The software is

    • I don't. The difference between the two is the "character" of the graphics.

      Tracks: GT4 is "brighter" and more pleasant to look at, but isn't very realistic; for larger tracks GT4 is lacking detail/draw distance. Forza is "duller", but is a more accurate reproduction of the environments they're trying to re-create.

      Environment: Forza wins this one hands down; color reflections and accurate shadows are the big differences.

      Cars: A bit of a tossup; GT4 is better with some models, Forza is better with oth
    • I don't. The difference between the two is the "character" of the graphics.

      GT4 has a "brighter", more pleasant look to some of its tracks, but isn't very realistic; for larger tracks GT4 is lacking detail/draw distance. The overall graphic quality hasn't changed much since GT3, IMO.

      Forza tracks are "duller", but are a more accurate reproduction of the environments they're trying to re-create (with the exception being the ring; it "looks" like the ring, but the layout isn't much like the real thing). Forza
    • by oGMo ( 379 )
      If anyone thinks anything else, they're just being silly. The video comparison on IGN between the two had me more surprised than I thought I'd be. At first I thought the GT4 screen was the Xbox game just because it was so much more realistic.

      This is why I've recently been comparing PS2 and Xbox games. Xbox stuff hasn't seemed to improve considerably. Compare various shots/videos of GT4, God of War, Metal Gear Solid 3, Haunting Ground, Jak 3, and others to Halo 2, Forza, etc. and you'll be surprised: the

      • Make the same comparison on a large screen and you'll change your mind. Then turn on 480p or 720p in the Xbox games and you'll wish slashdot had UBB-style "Delete My Post" capabilities. :)

        Seriously, we have a 68" rear-projection TV in one room and a DLP front projector shooting a 92" picture in the HT room and PS2 games just look awful by comparison.
  • by Anonymous Coward on Wednesday June 15, 2005 @06:03PM (#12827710)
    I know that both ATI/Microsoft and Nvidia/Sony really want to 'hype' their technologies, but is anyone out there actually delusional enough to think that any of the upcoming systems (Xbox 360/PS3/Revolution) will produce graphics that are dramatically different from any of the other systems?

    To a certain extent I'm personally expecting very little in the way of technical progress in graphics and a far greater focus on artistic considerations. Let's face it, we're hitting a point where using 'brute force' and dramatically increasing the geometry in your objects is not what will produce a better-looking game; what will make a difference is well-designed objects and a more populated environment.

    Now, more technical power is needed to build those more populated environments, and it can help with designing better objects, but there is a limit to what is currently needed. I expect that, for the most part, if you can produce 4 times the geometry of the Xbox (twice the geometry per object and twice as many objects on screen), with every pixel calculated by a shader approximately 4 times as complicated as one the Xbox can run, you will meet the requirements of almost every game made in the next generation. I expect every one of the upcoming consoles to surpass these specifications.
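
    To put toy numbers on that 2x-per-object, 2x-object-count arithmetic (the baseline figures below are invented for illustration, not real Xbox specs):

        /* Back-of-envelope scaling from the post above; baselines are made up. */
        #include <stdio.h>

        int main(void) {
            long polys_per_object  = 5000;  /* assumed current-gen figure */
            long objects_on_screen = 100;   /* assumed current-gen figure */
            int  shader_ops        = 8;     /* assumed per-pixel shader cost */

            /* "Twice the geometry per object and twice as many objects" */
            long next_polys = (2 * polys_per_object) * (2 * objects_on_screen);
            /* "...a shader approximately 4 times as complicated" */
            int  next_shader_ops = 4 * shader_ops;

            printf("polygons per frame: %ld -> %ld (4x)\n",
                   polys_per_object * objects_on_screen, next_polys);
            printf("shader ops per pixel: %d -> %d (4x)\n",
                   shader_ops, next_shader_ops);
            return 0;
        }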
    • Have you seen the screenshots for Elder Scrolls: Oblivion? To me the forests look amazingly better than anything else I've seen in a game. I think there's something like 8 square miles of them total. I believe it uses the CPU to procedurally generate the geometry. It's possible to generate this much detail dynamically because you can set things up so the CPU writes data to the cache and the GPU reads directly from the cache without having to go through main memory. (A toy sketch of the procedural idea follows the screenshots below.)

      Screenshot 1 [firingsquad.com]
      Screenshot 2 [firingsquad.com]
      S [firingsquad.com]
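
      Here's a toy sketch of what "procedurally generating" geometry means: derive placement from a seed instead of storing it, so square miles of forest cost almost nothing to keep around. This is a generic illustration, not Bethesda's actual technique:

        /* Toy procedural placement: tree positions come from a seeded PRNG,
         * so they can be regenerated on demand instead of stored.
         * Not Oblivion's actual algorithm -- just the general idea. */
        #include <stdio.h>
        #include <stdlib.h>

        typedef struct { float x, z, height; } Tree;

        /* Deterministic: the same cell always grows the same trees. */
        static void grow_cell(int cell_x, int cell_z, Tree *out, int n) {
            srand((unsigned)(cell_x * 73856093u ^ cell_z * 19349663u));
            for (int i = 0; i < n; i++) {
                out[i].x = cell_x * 64.0f + (rand() % 640) / 10.0f;
                out[i].z = cell_z * 64.0f + (rand() % 640) / 10.0f;
                out[i].height = 8.0f + (rand() % 80) / 10.0f;
            }
        }

        int main(void) {
            Tree trees[32];
            grow_cell(3, 7, trees, 32);  /* regenerated on demand, never stored */
            printf("first tree at (%.1f, %.1f), %.1f m tall\n",
                   trees[0].x, trees[0].z, trees[0].height);
            return 0;
        }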

      • It's also coming out for the PC, and with current-gen graphics cards and the imminent introduction of new cards, it'll run with no problems and look just as good (it could look nicer if they took advantage of the PC's strengths).

  • So does this mean that Apple can rewrite vecLib to take advantage of future vector processors? With a fast enough interconnect (PCIe?) this would be a viable replacement for AltiVec.
    • So does this mean that apple can rewrite vecLib to take advantage of future vector processors?

      This has nothing to do with that. Although, they can and are. They've been calling it the "Accelerate Framework" [apple.com] and it's been available (well, most of it, I guess) since 10.3. But we're not talking about a vecLib rewrite. If you use vec_foo, you'll need to look for _mm_foo and rewrite your own code. But vDSP, vImage, and other higher-level functions outlined on the linked documentation page, those should 'jus
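
      For anyone curious, here's roughly what that vec_foo-to-_mm_foo rewrite looks like for a 4-wide float add -- AltiVec intrinsics on PPC, SSE intrinsics on x86. A minimal sketch; build the PPC path with -maltivec and the x86 path with -msse:

        /* The same 4-wide float add, written against both intrinsic families. */
        #include <stdio.h>

        #ifdef __ALTIVEC__
        #include <altivec.h>
        static void add4(const float *a, const float *b, float *c) {
            vector float va = vec_ld(0, a);          /* vec_foo world (PPC) */
            vector float vb = vec_ld(0, b);
            vec_st(vec_add(va, vb), 0, c);
        }
        #else
        #include <xmmintrin.h>
        static void add4(const float *a, const float *b, float *c) {
            __m128 va = _mm_loadu_ps(a);             /* _mm_foo world (x86) */
            __m128 vb = _mm_loadu_ps(b);
            _mm_storeu_ps(c, _mm_add_ps(va, vb));
        }
        #endif

        int main(void) {
            /* vec_ld requires 16-byte alignment, so align the buffers. */
            float a[4] __attribute__((aligned(16))) = {1, 2, 3, 4};
            float b[4] __attribute__((aligned(16))) = {10, 20, 30, 40};
            float c[4] __attribute__((aligned(16)));
            add4(a, b, c);
            printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);  /* 11 22 33 44 */
            return 0;
        }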

  • Can someone explain to me what a vector processor is and what it's good for?
  • does anyone fancy the idea of having a generalised, high-performance vector processor in their PC?

    Yea, occasionally I'd like one, but I'd rather it didn't take away from what my graphics card can do, wouldn't I?

    Like, maybe I'd like to have a generalized DSP chip like the one in my NeXT machine here [nationmaster.com], or one of these specialized DSP boards [google.com]?

    But if you want a GPU that's targeted to supporting DirectX, I don't know if using it for a DSP is really the right idea. Maybe it is. Or maybe Intel's own vector pro

Say "twenty-three-skiddoo" to logout.

Working...