NVIDIA nForce 4 SLI Intel Edition Launched

Spinnerbait writes "NVIDIA took the wraps off their nForce 4 SLI chipset platform for Intel Processors today and there's a full review and showcase with benchmarks up at HotHardware. As with NVIDIA's AMD version of this chipset, motherboards based on the technology will support dual PCI Express graphics cards for load sharing in 3D Gaming applications. What's perhaps even more interesting is how the new NVIDIA memory controller actually allows the platform to out-pace Intel's own i925XE in virtually all of the benchmarks."

  • by winkydink ( 650484 ) * <sv.dude@gmail.com> on Tuesday April 05, 2005 @11:08AM (#12143893) Homepage Journal
    Here [networkmirror.com]
  • At this rate, soon we will have processors that are capable of rendering real video instead of animation. Or, say, animation as real as video footage. Imagine you can select characters like Brad Pitt or Tom Cruise for the game. Way to go...
    • by BigWhiteGuy_27 ( 804307 ) on Tuesday April 05, 2005 @11:14AM (#12143938)
      Imagine you can select characters like Brad Pitt or Tom Cruise for the game

      Or CowboyNeal...
    • What a scam. I could already select Tom Cruise [vgmuseum.com] twenty years ago...
    • But not-quite-real "real video" is much, much worse than not-quite-real animation, isn't it?
    • by Ford Prefect ( 8777 ) on Tuesday April 05, 2005 @11:49AM (#12144248) Homepage
      At this rate, soon we will have processors that are capable of rendering real video instead of animation. Or, say, animation as real as video footage.

      Hardly. Most game-style rendering today is mostly smoke and mirrors; while 3D graphics hardware has improved at a ridiculous rate over the last couple of years, there's still a long way to go before certain everyday scenes can be rendered.

      Something I'd like would be a 'city-renderer', capable of rendering a decent-sized European city (i.e. not a grid) from aerial views down to individual rooms. While a clever level-of-detail system could go a long way towards this, there would still be an utterly horrendous amount of geometry for a typical skyline shot [hylobatidae.org].

      Now add traffic, crowds of humans (typical FPS-style games give up after about ten or so, strategy games use crude mannequins for more), properly reflective surfaces and whatnot, motion blur and decent HDR [daionet.gr.jp] and your quadruple-SLI Geforce 9000-Hyper-Pro-Matic setup will still grind to a halt.

      Things are slowly getting there, but I'm still waiting - like a gas, FPS-style generic corridors will expand in processing requirements until they saturate even the greatest hardware. Look at Doom 3, for example... ;-)
      • Hardly. Most game-style rendering today is mostly smoke and mirrors

        Agreed, but...

        Most movies and videos are produced using "smoke and mirrors" too. E.g. typically the lighting is changed for every camera shot, reflectors, gels and masks are used to highlight or darken parts of shots, actors are coated with makeup to compensate for the incredibly artificial lighting, etc. etc. etc. And then everything is shot on a sound stage or in front of a facade or on a virtual set.

        If you point a video camera at a ci
      • If you haven't taken a gander at it yet, you may want to take a look at OpenRT [openrt.de] and projects using OpenRT such as Quake3 Raytraced [uni-sb.de]. Also take a look at the hardware architecture [saarcor.de] as well.

        Ray tracing presents a much more detailed rendering of a scene, but has always been considerably slower than rasterization. If hardware-accelerated ray-tracing architectures grow in the market, you may see your skyline beautifully rendered in real time... with traffic, crowds, etc.
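
        (A minimal sketch in Python, not OpenRT's actual API, of why naive ray tracing is expensive: every pixel spawns a ray that has to be tested against the scene geometry, so without acceleration structures the work grows roughly as pixels x objects.)

            import math

            def intersect_sphere(origin, direction, center, radius):
                # Solve |o + t*d - c|^2 = r^2 for the nearest positive t; direction is unit length.
                oc = [oi - ci for oi, ci in zip(origin, center)]
                b = 2.0 * sum(di * oci for di, oci in zip(direction, oc))
                cc = sum(v * v for v in oc) - radius * radius
                disc = b * b - 4.0 * cc
                if disc < 0.0:
                    return None
                t = (-b - math.sqrt(disc)) / 2.0
                return t if t > 0.0 else None

            # One primary ray per pixel, tested against every object: O(width * height * objects).
            width, height = 320, 240
            spheres = [((0.0, 0.0, -5.0), 1.0), ((2.0, 0.5, -8.0), 1.5)]
            hits = 0
            for y in range(height):
                for x in range(width):
                    d = ((x - width / 2) / width, (y - height / 2) / height, -1.0)
                    n = math.sqrt(sum(v * v for v in d))
                    d = tuple(v / n for v in d)
                    if any(intersect_sphere((0.0, 0.0, 0.0), d, c, r) is not None for c, r in spheres):
                        hits += 1
            print(hits, "of", width * height, "primary rays hit something")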
    • Riiiight. Tom Cruise or Brad Pitt. I choose Pamela Anderson and Natalie Portman - here's a hint: they won't be fighting...
      • by Anonymous Coward
        Pamela Anderson? She's so '90s! Now, Natalie Portman and Keira Knightley... that would be almost like watching twins!
  • by Anonymous Coward on Tuesday April 05, 2005 @11:10AM (#12143911)
    I remember reading on the Inquirer that, in a one-on-one comparison, the nForce Intel boards weren't able to keep up with the AMD ones, on more than just a processor basis. That was a few weeks ago though, so it could have been fixed since, e.g. driver problems.
    • The test is biased towards Intel. They compare the top-of-the-line Extreme Edition Intel CPU to the Athlon 64 4000+ instead of the FX-55.

      That being said, the AMD64 system still edges out the P4 system in gaming benchmarks, although the P4 system is ahead in synthetic benchmarks.

      In other words: it looks cool, but the situation is more or less the same as it has been since AMD introduced CPUs with onboard memory controllers.
    • If anything the test was "fixed" to make the Intel CPUs look better.
      For example the test systems were Intel 3.73GHz versus the AMD 4000+ at 2.4GHz.

      Considering the competitive AMD CPU is the 2.6GHz FX55 model, this is obviously a skewed result.

      They pitted an $1,100 Intel P4 against the $500 AMD Athlon64 4000.

      Even if they had compared against the much faster AMD Athlon64 FX55, the price delta is still huge. The FX55 is an $835 chip versus the $1,100 Intel!

      Even so, on most tests the AMD soundly won.
      Some o
  • by Pants75 ( 708191 ) on Tuesday April 05, 2005 @11:16AM (#12143959)
    When are these things going to start spitting out microwaves? We're already into TV and FM radio band emissions at 200-400 MHz. Microwaves really aren't that far off.
    While there are some radar bands from 1,300 to 1,600 MHz, most microwave applications fall in the range 3,000 to 30,000 MHz (3-30 GHz). Current microwave ovens operate at a nominal frequency of 2450 MHz, a band assigned by the FCC.

    You'll be glad you kept your old steel PC case when we get this sort of speed out of motherboards.

    Pete

    • You do realize that MHz is a measure for any sort of frequency (i.e. clock cycles) and not just radio waves, right?
      • Indeed, of course.

        But when you have current switching at a given frequency, you get EM emissions at that frequency. (And others, but that's another story.)

        They may not be particularly powerful, but they are there.

        Regards

        Pete

        • They may not be particularly powerful, but they are there.

          For some applications they are powerful enough to be a nuisance. Forget picking up weak amateur radio stations when you are close to a PC. I guess AM broadcast (if weak enough) could even be disturbed by a nearby PC. And indeed also at other frequencies than the clock: your machine may run at 1 GHz, but probably has higher harmonics. Also subharmonics and in short ALL kinds of noise will be created.
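
          (A quick numerical sketch, assuming NumPy, of why a digital clock leaks energy well above its fundamental: an idealised square-wave clock has strong spectral lines at the odd harmonics, so a 1 GHz clock also shows up around 3, 5 and 7 GHz.)

            import numpy as np

            f_clock = 1e9                # 1 GHz clock
            fs = 64e9                    # sample rate well above the harmonics of interest
            t = np.arange(2048) / fs
            clock = np.sign(np.sin(2 * np.pi * f_clock * t))   # idealised square-wave clock signal

            spectrum = np.abs(np.fft.rfft(clock))
            freqs = np.fft.rfftfreq(len(clock), d=1 / fs)

            # The strongest spectral lines land on the odd harmonics: ~1, 3, 5, 7 GHz.
            for i in spectrum.argsort()[-4:][::-1]:
                print(f"{freqs[i] / 1e9:.2f} GHz  amplitude {spectrum[i]:.1f}")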

    • When are these things going to start spitting out microwaves?
      ...
      You'll be glad you kept your old steel PC case
      It's been my experience that microwaves and metal don't mix very well ;)
      • Would you rather the case get hot, or your flesh? ;-)
        • I was thinking more of electrical discharges frying your computer.
          • Everyone's a critic! Besides, you'll never have to worry about it. The manufacturers will never allow the product to emit anything harmful, for fear of a million half-cooked-geek lawsuits.
          • I was thinking more of electrical discharges frying your computer.

            Your PC would indeed 'fry' if placed in a powerful enough electromagnetic field, such as a microwave oven. This is why we usually don't put it in one. However, this has nothing to do with the PC's clock speed, except maybe that, technologically, to increase the CPU clock you need to decrease the gate length and the oxide thickness of your digital technology, making it less robust to fields.

        • Re:EM emissions (Score:2, Interesting)

          by tehcrazybob ( 850194 )
          Since computer components run at extremely low power, the radiation shouldn't be an issue.

          At the moment, computers do cause some harmful radio interference if you leave the side of the case off. Since this is Slashdot, I assume there are several people reading this who have theirs off. However, even acrylic cases or case windows are enough to stop that radio interference.

          Even if the frequency picks up enough that we were getting microwave radiation, an aluminum case would still be able to block it. Eve
          • Re:EM emissions (Score:3, Interesting)

            by Ford Prefect ( 8777 )
            Since computer components run at extremely low power, the radiation shouldn't be an issue.

            You joke, but I gather there were minor problems at Jodrell Bank [man.ac.uk] when PCs' clock frequencies (and/or harmonics) happened to coincide with important radio frequencies used for radio astronomy.

            As you say, though, it's hardly dangerous - but having done an undergraduate experiment there some years ago in which an FFT of pulsar data detected nasty big peaks at 50Hz, 100Hz, 150Hz etc. (mains power...) I'm wondering if all
          • Actually, I doubt that acrylic cases stop radio interference at all. Plastic is pretty good at letting it out. Some of the better cases might have a conductive coating that will block it.
            That being said, computers do not really emit much in the GHz range at all. That clock speed tends to be limited to the CPU. The FSB is where you will get the interference, usually around 166-200 MHz for the AMD family. Now if they ever tried to get the bus to run at 1+GHz then things would get exciting. You could have neons
    • Re:EM emissions (Score:1, Insightful)

      by Anonymous Coward
      What's the big deal with microwaves anyway? They aren't ionising. Even visible light is of a higher frequency, and I'll bet you have light bulbs in your house.
    • by Morgaine ( 4316 ) on Tuesday April 05, 2005 @12:30PM (#12144644)
      Current microwave ovens operate at a nominal frequency of 2450 MHz, a band assigned by the FCC.

      I think you'll find that the physics of water molecule resonance had something to do with the choice of this band.

      Funny how every other country in the world chose the same band, despite not being ruled by the FCC ... :)
      • Re:Physics FCC (Score:1, Flamebait)

        by idlake ( 850372 )
        I think you'll find that the physics of water molecule resonance had something to do with choice of this band.

        Less than one might think. Microwaves over a fairly broad range of frequencies work--more than enough for different countries to choose different frequencies. In fact, I wouldn't be surprised if they have.
        • Microwaves over a fairly broad range of frequencies work--more than enough for different countries to choose different frequencies.

          Yoda Hawking? Is that YOU??

          What did you do to your hair...

  • That nVidia were able to create a memory controller which outperformed Intel's comes as no great surprise when you look at the history of both companies. Intel of course has the inside information on their CPUs, but they have always aimed for a performance/reliability compromise. On the other hand, nVidia go for cutting-edge, 300-miles-per-hour-or-nothing technology, and that's why we love 'em. Of course, nVidia have always pulled off this speed with stability anyway, so you may not see where I am coming from.
  • And what of... (Score:5, Interesting)

    by Anonymous Coward on Tuesday April 05, 2005 @11:19AM (#12143987)
    All the motherboard manufacturers who dumped R&D into having to build alternative SLI solutions? One example being the Tyan S2895, which uses dual nForce4 chipsets to achieve true 16x PCI-E in SLI mode. I'm hoping that nVidia didn't try to hold this information back from motherboard manufacturers, otherwise we may see a backlash against nVidia. And considering I spent months hunting and waiting for a true 16x PCI-E SLI solution, I am a little disappointed in nVidia for waiting so long.
    • Re:And what of... (Score:4, Informative)

      by pantherace ( 165052 ) on Tuesday April 05, 2005 @11:45AM (#12144220)
      The Opteron/nForce4 boards are still the only x16/x16 SLI solutions available.

      I just double-checked on Intel's website, and the best I could find was x8/x8 (3 x8 and 1 x4 PCI Express slots, 28 lanes total), and with that it is not possible to have multiple x16 slots (heck, it's impossible to have even one). (It's possible I missed a better one; I was looking in the server section.)

      The main reason that Tyan can do that is because of AMD's superior Hypertransport-based bus design in Opterons, over the shared bus favored by Intel. It's also the reason why Opteron scales a lot better than Xeon.

      The other reason Tyan can do that is that Nvidia realized how easy it would be to make very slightly different chipsets that facilitated that. Basically they are just nForce4 chipsets that can operate in parallel, giving 40 PCI Express lanes (2-way) or 80 PCI Express lanes for a 4-way Opteron. (Note a maximum of four x16s, as the remaining lanes can only be a max of x4, due to the 20 lanes per nForce4.)

      You can't do x16/x16 with any Intel Processor, as of now. (Though having seen how little x16/x4 or x16/x2 hurts benchmarks (vs standard x8/x8) I'm not convinced it's a big deal at all.)
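
      (Working through the lane arithmetic from the figures above in a small Python sketch; the slot layouts are hypothetical, the only hard number is the 20 PCI Express lanes per nForce4.)

        LANES_PER_NFORCE4 = 20

        def lane_budget(chipsets, slots):
            # slots is a list of (chipset_index, link_width) pairs; each chipset has a 20-lane budget.
            used = [0] * chipsets
            for chip, width in slots:
                used[chip] += width
            total = chipsets * LANES_PER_NFORCE4
            fits = all(u <= LANES_PER_NFORCE4 for u in used)
            return total, used, fits

        # 2-way Opteron, two nForce4s: x16 + x16 graphics plus an x4 left over on each chip.
        print(lane_budget(2, [(0, 16), (1, 16), (0, 4), (1, 4)]))            # (40, [20, 20], True)

        # 4-way Opteron, four nForce4s: at most four x16 slots; anything extra can only be x4.
        print(lane_budget(4, [(0, 16), (1, 16), (2, 16), (3, 16), (0, 4)]))  # (80, [20, 16, 16, 16], True)

        # A fifth x16 would blow one chipset's 20-lane budget:
        print(lane_budget(4, [(0, 16), (1, 16), (2, 16), (3, 16), (0, 16)]))  # (80, [32, 16, 16, 16], False)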

    • Re:And what of... (Score:2, Insightful)

      by Slashcrap ( 869349 )
      One example being the Tyan S2895 which uses dual nForce4 chipsets to achieve true 16x pci-e in SLI mode.

      The Tyan is an insanely specified server board with something like 40 PCI-E lanes in its basic config. It's not like that because Nvidia wouldn't release specs. It's like that so you can run several high-performance, workstation-level video cards. I don't even think it uses two nForce 4 chipsets; I think you've just misunderstood the specs. Do you have evidence to the contrary, Mr Coward?

      And considerin
      • I'm looking at one of these for my next board. The SMP and SLI give a great upgrade path: from single Opteron to dual, to dual-core, to dual dual-core; from single GPU to SLI, to single next-gen, to SLI next-gen. That's a 4-8 year lifespan for that board.

        Also, it does use two nForce Pros... a 2200 and a 2050. Might want to check out the datasheet [tyan.com]. Most definitely a killer workstation board for years to come.

  • nvidia (Score:2, Insightful)

    by chrisnewbie ( 708349 )
    Sure, a benchmark is good when it reflects what most gamers have at home. Sure, they score 20 gazillion points with 3DMark, but it's almost a machine fit for NASA that would cost around $3000 to buy. Why can't they use a normal machine, like a Pentium 4 2.4 GHz with an Ultra-ATA 166 drive and 1 gig of DDR400? That's more common and more realistic.
    • > it's almost a machine fit for nasa
      > that would cost around 3000$ to buy

      Benchmarks aren't about making you feel good about your system. They're about making you feel inadequate so you'll buy a new one. And they WANT you to spend $3000 on a machine fit for NASA.

      It's a good policy, anyway. A $1000 machine lasts about a year; a $1500 machine lasts about two years; a $3000 machine lasts about *five*. My P3-500 was about $3000, and still going strong. You pay for what you get.
  • multi-everything (Score:2, Interesting)

    by Cruithne ( 658153 )
    Seems like we're trending towards multiple everything recently.. multicore CPUs, SLI.. how long before this propagates to everything?

    As a sys admin, I love the prospect of redundancy, but are there any benefits to bringing this multiplicity to anything else from a consumers perspective? Or does it stop here?
    • Can't wait to have dual Soundblaster Audigys for 10.2 surround sound.
    • This isn't redundancy. The GPUs are NOT performing the same task and being checked; they are performing different, independent tasks to get the throughput higher. If one goes down, they both go down (or at least performance is hampered).
    • Uhm..."propagates to everything?" Surely you don't mean everything. The benefits of multicore-whatever are obvious, but having two keyboards on your laptop solves nothing. Perhaps elucidate on what you mean by "everything?" IMO, any processing unit can benefit from distributed/multicore design.
    • Re:multi-everything (Score:4, Interesting)

      by kannibal_klown ( 531544 ) on Tuesday April 05, 2005 @11:34AM (#12144112)
      Seems like we're trending towards multiple everything recently.. multicore CPUs, SLI.. how long before this propagates to everything?


      Wouldn't doubt it.

      You can only improve on things so long before you need a complete redesign. Adding more to the mix is a great stopgap that extends the usefulness of technology.

      At some point AMD and Intel are going to have to perform a MAJOR redesign (even bigger than the dual-core). Granted this might not be until we reach the 7GHz mark, but there is an invisible line somewhere.

      There is one big downside for the consumer though: increased prices. Dual-Core CPU's will be more expensive than regular ones. SLI graphics will require buying 2 cards. RAID storage requires multiple hard drives.

      Personally I think it would be cool if my next computer were dual-core with SLI video ports and a RAID setup. Whether or not I can afford it, that's another story.

      With the obvious effects of distributed and grid computing Sony's supposed cell tech might actually prove to be interesting (though I'd prefer it on a more local scale).
      • At some point AMD and Intel are going to have to perform a MAJOR redesign (even bigger than the dual-core). Granted this might not be until we reach the 7GHz mark, but there is an invisible line somewhere.

        We've already hit it. Haven't you noticed how Intel already dropped their plans for 4GHz chips, and is going dual-core instead? The shift from upping clock speeds to parallelization is happening now, and we're never going to reach that 7GHz mark (at least not in the foreseeable future).

      • If we end up having two of everything (CPU, GPU, monitors, disk drives), wouldn't we be better off having two PC's with a high-speed network?
    • Really, there is no redundancy to speak of in these designs, because they are multi purely for performance and no other reason. If, say, one of your memory sticks is defective, it isn't going to recover without the memory stick being replaced or removed. From that perspective it's similar to striped RAID.
    • "As a sys admin, I love the prospect of redundancy, but are there any benefits to bringing this multiplicity to anything else from a consumers perspective? Or does it stop here?"

      Well, the reason I find these cards interesting is I can buy one now. In a few months, when prices have dropped, I can buy the other and get ~2x performance.

      YMMV etc, but it's intriguing to me. I'm sick of throwing away hardware.
    • Already has (Score:5, Funny)

      by elgatozorbas ( 783538 ) on Tuesday April 05, 2005 @11:51AM (#12144267)
      Seems like we're trending towards multiple everything recently.. multicore CPUs, SLI.. how long before this propagates to everything?

      I have even heard about a guy with TWO complete individual PCs...

    • Multi chicks? Lining up women in an SLI mode that would be willing to do so with me? THAT my friend, would be a great thing.

      Mind you, its price range is beyond my means, but that's why you have online shopping carts you can always clear out.
    • Hey, it worked for the Klingons.
  • Humbug! (Score:4, Funny)

    by Robotron23 ( 832528 ) on Tuesday April 05, 2005 @11:31AM (#12144082)
    3D gaming, 3D schaming.
    Back in my day we had the Voodoo 2s and the ol' 6MB of RAM, 12 if you were rich! Couldn't even get two separate sprites on the screen without extreme lag... but we liked it!
    • It was 8, not 6.
    • Re:Humbug! (Score:3, Interesting)

      by UWC ( 664779 )
      Voodoo 2 came in 8MB and 12MB versions, the latter having an extra 4MB of texture memory. They were pass-through cards, requiring a separate, primary video card for non-3D stuff. They could be connected to do SLI (which at that time I think was Scan-Line Interleaving, the cards handling alternate lines of monitor resolution). I think with SLI, you could play your games at an astonishing 1024x768 in glorious 16-bit color. Single cards were limited to 800x600.

      I had a single 12MB one that I bought used on eB
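
      (If it really was Scan-Line Interleaving, the split was roughly this; a toy Python sketch, not 3dfx's actual driver logic: one card renders the even scanlines, the other the odd ones, and the output is woven back together.)

        def render_sli_frame(width, height, render_line):
            # Card 0 takes even scanlines, card 1 takes odd ones; interleave the results.
            return [render_line(y % 2, y, width) for y in range(height)]

        # Toy "renderer": just record which card produced which scanline.
        for line in render_sli_frame(8, 6, lambda card, y, w: f"scanline {y} rendered by card {card}"):
            print(line)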

      • Apologies son, my memory ain't so good these days! :)

        On a separate note though, I recall a member of a popular Voodoo card fan site actually using two Voodoo cards to run Doom 3. It actually looked quite intriguing: the game was utterly stripped of its effects, including the darkness which pretty much defined it to most gamers.

        In the end it seemed the Voodoos had successfully displayed the barebones of Doom 3 in general. It almost seemed like the older Doom games.

        Interestingly though, one area which was
        • I think I remember that, too! Seeing the textures and models without all the effects added was pretty interesting. The colors and textures reminded me a lot of the original Jedi Knight game.

          I'm of two minds now. On one hand (mind? crazy metaphors), I want to find that old PC and be reminded of how older accelerated games looked. On the other, I want to build a new one and revel in the progress of consumer technology.

            • Indeed, it is tempting either way. I don't think it'd be easy for any strategy gamers to give up modern games like Rome: Total War... the sight of cavalry literally smashing apart entire files of men, for example, is truly amazing and a testament to what new technology can achieve.

              Yet seeing these games in a retro sense would be wondrous; after years of advanced effects it's pretty hard to imagine what older games are like (I recently replayed Tomb Raider 2, and was amazed at how primitive and disappointing th
    • Hmm, Voodoo2 is the old days now? IMHO the real leap came when the original Voodoo came out. They pretty much changed PC gaming forever. I remember the first time I saw GLQuake on my Diamond Monster Voodoo card and basically told everyone I knew they had to have one.
    • It's funny, just in reverse. A Voodoo 2 with 6MB of RAM is already hugely powerful. People used to do gaming on Apple IIs and Commodore 64s, with 64K of memory and no hardware acceleration, even some simple 3D games (Wolfenstein).
  • by supremebob ( 574732 ) <themejunky&geocities,com> on Tuesday April 05, 2005 @11:33AM (#12144100) Journal
    Will it work with the new dual core P4 CPU's? It doesn't make much sense to buy a high-end motherboard if you can't get the high-end CPU to go with it.
    • At the bottom of the first page:

      To test the nForce 4 SLI Intel Edition chipset, NVIDIA shipped us a reference motherboard with features that should be indicative of retail-ready products. We should note that the motherboard we tested does not support Intel's new dual-core processors, even though the nForce4 SLI Intel Edition was designed for both single and dual-core processors from the start. NVIDIA has informed us that support for dual-core processors is board dependent, and that top-tier manufacturers
    • It doesn't matter, since Intel's dual core gaming performance will "blow dead goats" [theinquirer.net]. Games being mostly single-threaded and all, and Intel's dual cores running substantially slower than their single-core counterparts.

      You'd be far better off buying an Athlon 64 FX with a nForce4 SLI board today. That's still the gaming king-of-the-hill. Torch your money responsibly.
      • Actually, people who would buy such systems with dual CPUs and video cards would be video production professionals (like the parent poster), CAD and 3D modelers, and engineers. Such systems are not cheap for the average user.

        At Liz Claiborne in the '90s, when I worked there, the merchandising team used dual Voodoos for StudioMax over the more pro video cards. They were very fast.

        All these apps fly on Intel CPUs if you look at any benchmark. This is because they contain hand-written assembly
        • Forgive my grammar in the previous post.

          I re-edited two sentences, which is why some redundancy is present.

        • Ah. Didn't know he was doing video production. In that case, I'd spend the money for a dual-processor Opteron 252 system with a Tyan Thunder K8WE [tyan.com] SLI motherboard and 8 1GB DIMMs. Two real CPUs with their own integrated memory controllers make more sense than a dual-core P4, and the power requirements are probably similar. The Opteron 252s use the new 90nm Troy core with SSE3 support, which narrows the benchmark gap with Intel on creatively Intel-optimized code (and anything not specifically Intel-opti
  • RAID 5 (Score:5, Informative)

    by gallard ( 873308 ) on Tuesday April 05, 2005 @11:34AM (#12144116) Homepage
    The one thing the Intel version has over the AMD version of this chipset is RAID 5 support. A RAID 5 controller card by itself is over 100 bucks. Dammit this is going to make me want to turn over to the dark side.
    • Is it true hardware RAID 5, or does it still use the CPU for XOR etc.? If it's just a plain software implementation, then those cards are pretty cheap.
    • Re:RAID 5 (Score:2, Informative)

      by Anonymous Coward
      It is "fakeRAID" so it is all done in software (however some fakeraid cards can do RAID1 in hardware) where as the 100 or so bucks one you can get normally has its own XOR processor (sometimes a few megs of ram).
    • I wouldn't consider Nvidia RAID a feature worth buying into at this point.

      I have one box under Windows using Nvraid, and it is just terrible. It drops drives from the RAIDs seemingly for fun, and configuring a bootable RAID is difficult under XP, and impossible under Win2k (even with an SP4 slipstream install, in case anyone was going to point that out).

      The management software is crude at best. It cannot, for example, email alerts when a drive drops off.

      My $.02.

      jh
    • Check out the Gigabyte GA-K8NXP-SLI [giga-byte.com] board which has SATA-150 RAID-5 via an extra chip.

      Damien

    • Does it really matter? This is still using your main processor for all the XORing so what is the benefit?
    • It's been said this review is a little biased.

      AMD's top-of-the-line FX was not included. Most FX chipsets are not only in the same class as the Extreme Edition Intel boards, but many come with RAID 5 as well.

  • Office and Zangband will now run faster on my computer. Yay!
  • by hirschma ( 187820 ) on Tuesday April 05, 2005 @11:42AM (#12144187)
    Hmm, let's see. Let's take an Intel processor with these characteristics:

    * Fastest consumer CPU they offer,
    * Priced at about $1100, street

    And compare it to the AMD offering, with these characteristics:

    * Second fastest CPU they offer,
    * Price of about half of the Intel offering.

    Yes, that is a most fair review. It makes perfect sense to conclude that, on mostly identical chipsets, Intel is faster.

    How much are these sites paid under the table?
    • How about we take an Intel CPU at 3.73 GHz and compare it to an AMD CPU advertised to perform around that same clock speed? That gives us the 3800+, the 4000+, and the FX-55. Traditionally, AMD's estimation has been 5 to 10 percent high in terms of what Intel processors they can match (across the board, as opposed to gaming where the estimates are about right), so we can assume:

      - The 3800+ performs at 3.42-3.61 GHz which is too low.

      - The 4000+ should perform at about 3.6-3.8 GHz, which is about right.

      - T
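
      (The 5-10 percent discount applied above works out like this; a quick Python check of the figures, nothing more.)

        for pr_rating in (3800, 4000):
            low, high = pr_rating * 0.90, pr_rating * 0.95
            print(f"{pr_rating}+ ~ {low / 1000:.2f}-{high / 1000:.2f} GHz worth of P4")
        # 3800+ ~ 3.42-3.61 GHz (short of the 3.73 GHz EE); 4000+ ~ 3.60-3.80 GHz (about right)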
      • The 3.73EE is an 'enthusiast' CPU, it ought to be compared to AMD's 'enthusiast' CPUs.

        We don't use Celerons to compare Intel & AMD64 and think it's a useful comparison.

      • It's not just the processor choice, it's the overall impression the "review" gives. I wouldn't say they're Intel shills. They've just spun everything positively for the platform they reviewed. They listed off all the nForce's features without testing them. If it had been a more thorough review, we might have learned whether the performance issues with the hardware firewall acceleration had been ironed out, or how well that nTune app tunes. Or disk performance, USB/Firewire, CPU utilization, sound quality, e
      • Your reply is just silly.

        AMD's PR ratings are hardly something that should be used as the basis of comparison. Moreover, they're supposed to suggest, from my understanding, of how a particular AMD model will compare to an Intel CPU ***from the corresponding family***. If you really don't understand this, you're effectively saying that you don't know enough about the industry to be writing about it.

        The clock speed or name of the processor just shouldn't matter. Do you really think that your logic would fac
        • > The clock speed or name of the
          > processor just shouldn't matter.

          The name, no, that shouldn't matter. Neither should the price. We're talking about performance differences, plain and simple; who made the processor and how much it cost are two things that have nothing to do with that.

          > Do you really think that your logic
          > would factor into someone's buying
          > decisions?

          No, I think it would factor into the question of which processor is comparable to another processor for purposes of benchmar
  • Maybe some knowledgeable /.ers can explain why VR seems to have died (or been stillborn) on the personal computer? If we can get 100+ fps in Doom 3 at 1600x1200, and wireless networking and battery consumption have dramatically improved, what's the deal? Is it consumer apathy? Is there some other technological hurdle that needs to be overcome? (LCDs seem to be getting pretty damn small and good, i.e. the PSP.)
    • Wearing a headset is gay?

      That, and they couldn't push the latest pixel cruncher when the display on your headset would be at most ~640x480... a GeForce 3 can push that easily... there would be no demand for 1600-by-a-billion-pixel screens with Gouraud-shaded, Phong-shaded radiosity effects... on the culled faces... let's not get talking about the exposed faces...

      All about supply and demand.

      There is a demand for the latest pixel pusher and not for the actual innovation...

      This is why most cpu lines
      • by podperson ( 592944 ) on Tuesday April 05, 2005 @02:15PM (#12145692) Homepage
        My wife operates a VR research lab (they still exist) and all the new hardware is just great for them. Unfortunately, it only addresses one of the concerns which must be addressed before we can all live in the world of "Snow Crash". The basic problems for mainstream VR are as follows:

        1) The headsets really haven't "tipped" price-wise. Kind of like LCD screens for a long time, they stay expensive (around $10k) while slowly improving in features (e.g. resolution, motion tracking). Until they get "good enough" the prices won't trend downwards. (There are cheap headsets, but they make you sick pretty fast. Even the pricey ones make you sick after 30 mins or so ... so you won't be playing WoW in them.)

        2) The big issues w.r.t. UI remain unsolved. E.g. a lot of VR setups involve complex motion tracking and setting aside a room for subjects to walk around in. Usually a second person watches the subject to prevent them from, say, running into a wall... There are rigs that allow you to suspend the subject to allow them to walk through theoretically infinite landscapes... we're talking six figures though.

        3) Behavior capture. The solutions to tracking movement remain pretty experimental and invasive. All the stuff we've talked about so far will, at best, get you walking around in a virtual landscape, capture your head movements (kind of), and maybe capture some of your arm and finger movements. Even assuming your $500,000 suspension rig captures all your body movements perfectly, we still need to capture facial expression and lip sync. (So far, spatial 3D audio is pretty primitive too... TeamSpeak is a long way from a person's voice emanating from their position in a shared world with lots of people.)

        4) Force Feedback. All this VR is going to seem pretty lame when you can walk through solid objects or your hand passes through an item you're trying to manipulate. Arguably, this is a subset of item (3) above, but in fact just allowing people to walk around in an unlimited expanse is a big enough problem...

        There are plenty of finer grained issues to deal with, but the rendering of VR scenes (at least, so far) has turned out to be the easy part. At the moment, if you wanted to play WoW in VR you'd need to set aside a large room, buy an expensive HMD, and a really expensive suspension rig. (Luckily, WoW lets you run straight through people so the UI will match this perfectly.)
    • Short answer: total lack of worthwhile content.

      Games are nice, and I'm somewhat surprised helmet displays aren't more popular for them, but that's not really VR and there's no content outside of that that would make VR more popular.
  • So, is Soundstorm gone for good?
  • I recall reading somewhere that a benchmark of memory interleave performance on the 865/875 left most other chipsets in the dust, but in the real world it offered only a marginal gain over them.
  • by Anonymous Coward
    I just bought an SLI board (AMD) and was wondering how well the RAID and SLI video configurations are supported under Linux.

    I've booted my machine into it, and to my surprise the ethernet devices worked out of the box with Xandros (based on Debian sid). I still do not know about the RAID or SLI video, however. I'm using a crappy old S3 PCI video card right now, but am about to receive two GeForce 6800 GTs in the mail. Can I use these bad boys in Linux? Anybody know?

  • My biggest concern is the fact that NVIDIA can't get drivers right it seems... Their Unified ForceWare Drivers don't ever seem to work right all around the board, no matter what card or what version you're using... Maybe I just don't get it, but I think a 1 year old mongaloid chimp that's crippled in one hand could write better video drivers... Fix what you have before you release something else is my policy... Note: I own and only have owned Nvidia graphics cards since I have been gaming... It just frust
