NVIDIA nForce 4 SLI Intel Edition Launched
Spinnerbait writes "NVIDIA took the wraps off their nForce 4 SLI chipset platform for Intel processors today, and there's a full review and showcase with benchmarks up at HotHardware. As with NVIDIA's AMD version of this chipset, motherboards based on the technology will support dual PCI Express graphics cards for load sharing in 3D gaming applications. What's perhaps even more interesting is how the new NVIDIA memory controller actually allows the platform to out-pace Intel's own i925XE in virtually all of the benchmarks."
Full article mirrored (Score:3, Informative)
Humans in my game (Score:1)
Re:Humans in my game (Score:5, Funny)
Or CowboyNeal...
Re:Humans in my game (Score:2)
Re:Humans in my game (Score:2)
Re:Humans in my game (Score:5, Insightful)
Hardly. Most game-style rendering today is smoke and mirrors; while 3D graphics hardware has improved at a ridiculous rate over the last couple of years, there's still a long way to go before certain everyday scenes can be rendered.
Something I'd like would be a 'city-renderer', capable of rendering a decent-sized European city (i.e. not a grid) from aerial views down to individual rooms. While a clever level-of-detail system could go a long way towards this, there would still be an utterly horrendous amount of geometry for a typical skyline shot [hylobatidae.org].
Now add traffic, crowds of humans (typical FPS-style games give up after about ten or so, strategy games use crude mannequins for more), properly reflective surfaces and whatnot, motion blur and decent HDR [daionet.gr.jp] and your quadruple-SLI Geforce 9000-Hyper-Pro-Matic setup will still grind to a halt.
Things are slowly getting there, but I'm still waiting. Like a gas, FPS-style generic corridors will expand in processing requirements until they saturate even the greatest hardware. Look at Doom 3, for example...
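A level-of-detail scheme like the one mentioned above can be sketched in a few lines. This toy Python version (all names and numbers are invented for illustration, not from any real engine) just halves the triangle budget each time the camera distance doubles past a base range:

```python
def lod_triangles(full_detail_tris, distance, base_range=50.0):
    """Toy LOD pick: halve the triangle budget for every doubling
    of camera distance beyond base_range (illustrative numbers only)."""
    if distance <= base_range:
        return full_detail_tris
    halvings = 0
    d = distance
    while d > base_range:
        d /= 2.0
        halvings += 1
    return max(full_detail_tris >> halvings, 12)  # never below a crude box

print(lod_triangles(100_000, 40))   # → 100000 (full detail up close)
print(lod_triangles(100_000, 400))  # → 12500 (a distant skyline building)
```

Even with aggressive LOD, the parent's point stands: a skyline shot multiplies whatever budget survives across thousands of buildings at once.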
Re:Humans in my game (Score:2)
Agreed, but...
Most movies and videos are produced using "smoke and mirrors" too. E.g. typically the lighting is changed for every camera shot, reflectors, gels and masks are used to highlight or darken parts of shots, actors are coated with makeup to compensate for the incredibly artificial lighting, etc. etc. etc. And then everything is shot on a sound stage or in front of a facade or on a virtual set.
If you point a video camera at a ci
Open Real-Time Ray-Tracing (Score:2, Informative)
Ray-tracing produces a much more detailed rendering of a scene, but has always been considerably slower than rasterization. If hardware-accelerated ray-tracing architectures gain a foothold in the market, you may see your skyline beautifully rendered in real-time.
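The core of a ray tracer is just intersection math. A minimal Python sketch of the ray-sphere case (not tied to any particular hardware or engine) looks like this:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None.

    Solves |o + t*d - c|^2 = r^2, a quadratic in t.
    Assumes `direction` is a unit vector.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c          # a == 1 for a unit direction
    if disc < 0.0:
        return None                  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# Ray from the origin straight down +z at a sphere centered 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

Hardware ray-tracing proposals amount to running millions of tests like this (plus acceleration structures to skip most of them) every frame.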
Re:Humans in my game (Score:1)
Re:Humans in my game (Score:1, Insightful)
I heard that it isn't supposed to be as good (Score:4, Informative)
Re:I heard that it isn't supposed to be as good (Score:1)
That being said, the AMD64 system still edges out the P4 system in gaming benchmarks, although the P4 system is ahead in synthetic benchmarks.
In other words: it looks cool, but the situation is more or less the same as it has been since AMD introduced CPUs with on-board memory controllers.
Re:I heard that it isn't supposed to be as good (Score:2, Insightful)
For example the test systems were Intel 3.73GHz versus the AMD 4000+ at 2.4GHz.
Considering the competitive AMD CPU is the 2.6GHz FX55 model, this is obviously a skewed result.
They pitted an $1,100 Intel P4 against the $500 AMD Athlon64 4000.
Even if they had compared against the much faster AMD Athlon64 FX55, the price delta is still huge. The FX55 is an $835 chip versus the $1,100 Intel!
Even so, on most tests the AMD soundly won.
Some o
EM emissions (Score:5, Funny)
You'll be glad you kept your old steel PC case when we get this sort of speed out of MBs
Pete
Re:EM emissions (Score:2)
Re:EM emissions (Score:1)
But when you have current switching at a given frequency, you get EM emissions at that frequency. (And others, but that's another story.)
They may not be particularly powerful, but they are there.
Regards
Pete
powerful enough to cause trouble (Score:2)
For some applications they are powerful enough to be a nuisance. Forget picking up weak amateur radio stations when you are close to a PC. I guess AM broadcast (if weak enough) could even be disturbed by a nearby PC. And indeed at frequencies other than the clock, too: your machine may run at 1 GHz, but will also emit higher harmonics, subharmonics - in short, ALL kinds of noise will be created.
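The harmonic content is easy to see numerically. This sketch (assuming NumPy is available, with an idealized ±1 square wave standing in for a clock signal) shows the spectrum peaking at the fundamental and its odd harmonics:

```python
import numpy as np

f_clock = 1e9                       # pretend 1 GHz clock
fs = 32e9                           # sample rate, well above Nyquist
t = (np.arange(1024) + 0.5) / fs    # half-sample offset avoids exact zero-crossings
square = np.sign(np.sin(2 * np.pi * f_clock * t))

spectrum = np.abs(np.fft.rfft(square))
freqs = np.fft.rfftfreq(len(square), d=1 / fs)

# The three strongest components: fundamental plus 3rd and 5th harmonics.
peaks = freqs[np.argsort(spectrum)[-3:]] / 1e9
print(sorted(round(p) for p in peaks))  # → [1, 3, 5] (GHz)
```

A perfect square wave has only odd harmonics falling off as 1/n; real signals with finite rise times and asymmetries spread energy across even more of the band.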
Re:EM emissions (Score:2)
Re:EM emissions (Score:1)
Re:EM emissions (Score:2)
Re:EM emissions (Score:1)
Nothing to do with clock (Score:2)
Your PC would indeed 'fry' if placed in a powerful enough electromagnetic field, such as a microwave oven. This is why we usually don't put it in one. However, this has nothing to do with the PC's clock speed, except maybe that technologically, to increase the CPU clock you need to decrease the gate length and the oxide thickness of your digital technology, making it less robust to fields.
Re:EM emissions (Score:2, Interesting)
At the moment, computers do cause some harmful radio interference if you leave the side of the case off. Since this is Slashdot, I assume there are several people reading this who have theirs off. However, even acrylic cases or case windows are enough to stop that radio interference.
Even if clock frequencies climbed enough that we were getting microwave radiation, an aluminum case would still be able to block it. Eve
Re:EM emissions (Score:3, Interesting)
You joke, but I gather there were minor problems at Jodrell Bank [man.ac.uk] when PCs' clock frequencies (and/or harmonics) happened to coincide with important radio frequencies used for radio astronomy.
As you say, though, it's hardly dangerous - but having done an undergraduate experiment there some years ago in which an FFT of pulsar data detected nasty big peaks at 50Hz, 100Hz, 150Hz etc. (mains power...) I'm wondering if all
Re:EM emissions (Score:2)
Re:EM emissions (Score:2)
That being said, computers do not really emit much in the GHz range at all. That clock speed tends to be limited to the CPU. The FSB is where you will get the interference - usually around 166-200 MHz for the AMD family. Now if they ever tried to get the bus to run at 1+GHz then things would get exciting. You could have neons
Re:EM emissions (Score:1, Insightful)
Physics >> FCC (Score:5, Funny)
I think you'll find that the physics of water molecule resonance had something to do with the choice of this band.
Funny how every other country in the world chose the same band, despite not being ruled by the FCC.
Re:Physics >> FCC (Score:1, Flamebait)
Less than one might think. Microwaves over a fairly broad range of frequencies work--more than enough for different countries to choose different frequencies. In fact, I wouldn't be surprised if they have.
Re:Physics >> FCC (Score:1)
Yoda Hawking? Is that YOU??
What did you do to your hair...
nVidia better than Intel (Score:1, Interesting)
Re:nVidia better than Intel (Score:5, Funny)
Re:nVidia better than Intel (Score:1)
Re:nVidia better than Intel (Score:2, Funny)
Re:nVidia better than Intel (Score:1)
Re:nVidia better than Intel (Score:1)
Re:nVidia better than Intel (Score:1, Redundant)
bleerggg... damn my lightning submit button finger...
Heisenberg (Score:1)
Re:nVidia better than Intel (Score:1)
60% Overcaffeinated
And what of... (Score:5, Interesting)
Re:And what of... (Score:4, Informative)
I just double-checked on Intel's website, and the best I could find was x8/x8 (three x8 and one x4 PCI Express slots, 28 lanes total). With that it is not possible to have multiple x16 slots - heck, it's impossible to have even one. (It's possible I missed a better one; I was looking in the server section.)
The main reason that Tyan can do that is AMD's superior HyperTransport-based bus design in the Opterons, versus the shared bus favored by Intel. It's also the reason why Opteron scales a lot better than Xeon.
The other reason Tyan can do that is that NVIDIA realized how easy it would be to make very slightly different chipsets to facilitate it. Basically they are just nForce4 chipsets that can operate in parallel, giving 40 PCI Express lanes (2-way) or 80 PCI Express lanes for a 4-way Opteron. (Note a maximum of four x16s, as the remaining 16 lanes can only be a maximum of x4, due to the 20 lanes per nForce4.)
You can't do x16/x16 with any Intel processor as of now. (Though having seen how little x16/x4 or x16/x2 hurts benchmarks vs. standard x8/x8, I'm not convinced it's a big deal at all.)
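The lane arithmetic in the parent is easy to sanity-check. In this toy Python sketch the 20-lanes-per-chip figure is taken from the comment above, not from any datasheet:

```python
# Budget figure assumed from the comment above: 20 PCIe lanes per nForce4 chip.
LANES_PER_CHIP = 20

def fits(slot_widths, chips):
    """True if the requested slot widths fit within the total lane budget."""
    return sum(slot_widths) <= LANES_PER_CHIP * chips

print(fits([16, 16], 2))          # x16/x16 on a 2-way board: True (32 <= 40)
print(fits([16, 16, 16], 2))      # three x16 on a 2-way board: False (48 > 40)
print(fits([16, 16, 16, 16], 4))  # four x16 on a 4-way board: True (64 <= 80)
```

This only checks totals; real boards also face per-chip routing limits, which is why the leftover 16 lanes in the 4-way case can't form a fifth x16 slot.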
Re:And what of... (Score:2, Insightful)
The Tyan is an insanely specified server board with something like 40 PCI-E lanes in its basic config. It's not like that because NVIDIA wouldn't release specs; it's like that so you can run several high-performance workstation-level video cards. I don't even think it uses two nForce4 chipsets - I think you've just misunderstood the specs. Do you have evidence to the contrary, Mr. Coward?
And considerin
Re:And what of... (Score:1)
Also, it does use two nForce Pros... a 2200 and a 2050. Might want to check out the datasheet [tyan.com]. Most definitely a killer workstation board for years to come.
Re:And what of... (Score:1)
To complement your nice Tyan motherboard [tyan.com], get one or two (XXX check for physical sizes) of those Realizm 800 [3dlabs.com] cards from 3Dlabs. They are the only 16-lane PCI Express video cards I know of. Not sure about GPGPU [gpgpu.org], but at 3840 x 2400, solitaire is bound to look amazing... especially if you can get some nice 9.2Mpixel displays as well: a high-end video card without a matching display, what would be the point? Check for instance the IBM T221 [ibm.com].
Anandtech
nvidia (Score:2, Insightful)
Re:nvidia (Score:1)
> that would cost around 3000$ to buy
Benchmarks aren't about making you feel good about your system. They're about making you feel inadequate so you'll buy a new one. And they WANT you to spend $3000 on a machine fit for NASA.
It's a good policy, anyway. A $1000 machine lasts about a year; a $1500 machine lasts about two years; a $3000 machine lasts about *five*. My P3-500 was about $3000, and it's still going strong. You get what you pay for.
Re:nvidia (Score:1)
Re:nvidia (Score:1)
multi-everything (Score:2, Interesting)
As a sysadmin, I love the prospect of redundancy, but are there any benefits to bringing this multiplicity to anything else from a consumer's perspective? Or does it stop here?
Re:multi-everything (Score:1)
Re:multi-everything (Score:2)
Re:multi-everything (Score:1)
Re:multi-everything (Score:1)
Re:multi-everything (Score:4, Interesting)
Wouldn't doubt it.
You can only improve on things so long before you need a complete redesign. Adding more to the mix is a great stopgap that extends the usefulness of technology.
At some point AMD and Intel are going to have to perform a MAJOR redesign (even bigger than the dual-core). Granted this might not be until we reach the 7GHz mark, but there is an invisible line somewhere.
There is one big downside for the consumer, though: increased prices. Dual-core CPUs will be more expensive than regular ones. SLI graphics will require buying two cards. RAID storage requires multiple hard drives.
Personally I think it would be cool if my next computer were dual-core with SLI video ports and a RAID setup. Whether or not I can afford it, that's another story.
With the obvious benefits of distributed and grid computing, Sony's supposed Cell tech might actually prove to be interesting (though I'd prefer it on a more local scale).
Re:multi-everything (Score:2)
We've already hit it. Haven't you noticed how Intel already dropped their plans for 4GHz chips, and is going dual-core instead? The shift from upping clock speeds to parallelization is happening now, and we're never going to reach that 7GHz mark (at least not in the foreseeable future).
Re:multi-everything (Score:2)
Re:multi-everything (Score:1)
Re:multi-everything (Score:2)
Well, the reason I find these cards interesting is I can buy one now. In a few months, when prices have dropped, I can buy the other and get ~2x performance.
YMMV etc, but it's intriguing to me. I'm sick of throwing away hardware.
Already has (Score:5, Funny)
I have even heard about a guy with TWO complete individual PCs...
Re:multi-everything - Yowza (Score:2)
Mind you, its price range is beyond my means, but that's why you have online shopping carts you can always clear out.
Re:multi-everything (Score:1)
Humbug! (Score:4, Funny)
Back in my day we had the Voodoo 2's and the ol' 6MB of RAM, 12 if you were rich! Couldn't even get two separate sprites on the screen without extreme lag... but we liked it!
Re:Humbug! (Score:2)
Re:Humbug! (Score:3, Interesting)
I had a single 12MB one that I bought used on eB
Re:Humbug! (Score:1)
On a separate note, though, I recall a member of a popular Voodoo card fan site actually using two Voodoo cards to run Doom 3. It looked quite intriguing: the game was utterly stripped of its effects, including the darkness which pretty much defined it for most gamers.
In the end it seemed the Voodoos had successfully displayed the bare bones of Doom 3. It almost looked like the older Doom games.
Interestingly though, one area which was
Re:Humbug! (Score:2)
I'm of two minds now. On one hand (mind? crazy metaphors), I want to find that old PC and be reminded of how older accelerated games looked. On the other, I want to build a new one and revel in the progress of consumer technology.
Re:Humbug! (Score:1)
Yet seeing these games in a retro sense would be wondrous; after years of advanced effects it's pretty hard to imagine what older games are like (I recently replayed Tomb Raider 2, and was amazed at how primitive and disappointing th
Re:Humbug! (Score:2)
Re:Humbug! (Score:2)
Nice motherboard, but... (Score:4, Interesting)
Re:Nice motherboard, but... (Score:1)
To test the nForce 4 SLI Intel Edition chipset, NVIDIA shipped us a reference motherboard with features that should be indicative of retail-ready products. We should note that the motherboard we tested does not support Intel's new dual-core processors, even though the nForce4 SLI Intel Edition was designed for both single and dual-core processors from the start. NVIDIA has informed us that support for dual-core processors is board dependent, and that top-tier manufacturers
Re:Nice motherboard, but... (Score:2)
You'd be far better off buying an Athlon 64 FX with a nForce4 SLI board today. That's still the gaming king-of-the-hill. Torch your money responsibly.
Re:Nice motherboard, but... (Score:3, Interesting)
When I worked at Liz Claiborne in the '90s, the merchandising team used dual Voodoos for Studio Max over the more pro video cards. They were very fast.
All these apps fly on Intel CPUs if you look at any benchmark. This is because they contain hand-written assembly.
Re:Nice motherboard, but... (Score:1)
I re-edited two sentences, which is why redundancy is present.
Re:Nice motherboard, but... (Score:2)
RAID 5 (Score:5, Informative)
Re:RAID 5 (Score:1)
Re:RAID 5 (Score:2, Informative)
Nvidia RAID... not so good. (Score:3, Interesting)
I have one box under Windows using Nvraid, and it is just terrible. It drops drives from the RAIDs seemingly for fun, and configuring a bootable RAID is difficult under XP, and impossible under Win2k (even with an SP4 slipstream install, in case anyone was going to point that out).
The management software is crude at best. It cannot, for example, email alerts when a drive drops off.
My $.02.
jh
AMD board with RAID-5 (Score:3, Informative)
Damien
Re:RAID 5 (Score:1)
Does it really matter? This is still using your main processor for all the XORing, so what is the benefit?
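For reference, the XOR work in question is simple per byte. A minimal Python sketch of RAID-5 parity and rebuild (illustrative only; real implementations stripe data and rotate parity across disks):

```python
def xor_parity(blocks):
    """Compute the RAID-5 parity block for a stripe of equal-sized data blocks."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def rebuild(surviving_blocks, parity):
    """Recover a lost block: XOR of the parity with all surviving blocks."""
    return xor_parity(list(surviving_blocks) + [parity])

d0, d1 = b"\x0f\x0f", b"\xf0\x01"   # two data blocks on two 'disks'
p = xor_parity([d0, d1])            # parity stored on a third 'disk'
assert rebuild([d1], p) == d0       # lose d0, rebuild it from d1 + parity
```

The XOR itself is cheap; the real cost of host-based RAID 5 is that every write moves data over the memory bus twice (read-modify-write of the parity), which dedicated controllers offload.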
Re:RAID 5 (Score:2)
AMD's top-of-the-line FX was not included. Most FX chipsets are not only in the same class as the extreme Intel boards, but many come with RAID 5 as well.
Re:RAID 5 (Score:3, Insightful)
And besides, using mdadm under Linux to create a RAID array out of the same drives with exactly the same configuration will work fine. I've tried it.
This is so cool... (Score:1)
Another Intel-funded CPU comparison? (Score:5, Interesting)
* Fastest consumer CPU they offer,
* Priced at about $1100, street
And compare it to the AMD offering, with these characteristics:
* Second fastest CPU they offer,
* Price of about half of the Intel offering.
Yes, that is a most fair review. It makes perfect sense to conclude, on mostly identical chipsets, that Intel is faster.
How much are these sites paid under the table?
Re:Another Intel-funded CPU comparison? (Score:2, Informative)
- The 3800+ performs at 3.42-3.61 GHz, which is too low.
- The 4000+ should perform at about 3.6-3.8 GHz, which is about right.
- T
Re:Another Intel-funded CPU comparison? (Score:1)
The 3.73EE is an 'enthusiast' CPU; it ought to be compared to AMD's 'enthusiast' CPUs.
We don't use Celerons to compare Intel and AMD64 and think it's a useful comparison.
Re:Another Intel-funded CPU comparison? (Score:1)
Re:Another Intel-funded CPU comparison? (Score:2)
AMD's PR ratings are hardly something that should be used as the basis of comparison. Moreover, they're supposed to suggest, from my understanding, how a particular AMD model will compare to an Intel CPU ***from the corresponding family***. If you really don't understand this, you're effectively saying that you don't know enough about the industry to be writing about it.
The clock speed or name of the processor just shouldn't matter. Do you really think that your logic would fac
Re:Another Intel-funded CPU comparison? (Score:1)
> processor just shouldn't matter.
The name, no, that shouldn't matter. Neither should the price. We're talking about performance differences, plain and simple; who made the processor and how much it cost are two things that have nothing to do with that.
> Do you really think that your logic
> would factor into someone's buying
> decisions?
No, I think it would factor into the question of which processor is comparable to another processor for purposes of benchmar
Re:Another troll who didn't RTFA? (Score:5, Insightful)
Normally, in a processor comparison, the processors are comparable for a reason - same positioning by the companies involved, same price point, whatever.
In this case, it appears that the only reason why the AMD proc was chosen was to give Intel "wins" in close contests, like LAME MP3 encoding, and to not make Intel's best look too awful in the cases where AMD won.
Point is, Intel was represented with its best game. Why should AMD be presented in a less favorable light?
There is little journalistic integrity with these enthusiast sites.
Where is PC Virtual Reality? (Score:1)
Re:Where is PC Virtual Reality? (Score:2)
That, and they couldn't push the latest pixel cruncher when the display on your headset would be at most ~640x480.
All about supply and demand.
There is a demand for the latest pixel pusher and not for the actual innovation...
This is why most cpu lines
Obstacles to Mainstream VR (Score:5, Insightful)
1) The headsets really haven't "tipped" price-wise. Kind of like LCD screens for a long time, they stay expensive (around $10k) while slowly improving in features (e.g. resolution, motion tracking). Until they get "good enough" the prices won't trend downwards. (There are cheap headsets, but they make you sick pretty fast. Even the pricey ones make you sick after 30 mins or so.)
2) The big issues w.r.t. UI remain unsolved. E.g. a lot of VR setups involve complex motion tracking and setting aside a room for subjects to walk around in. Usually a second person watches the subject to prevent them from, say, running into a wall... There are rigs that allow you to suspend the subject to allow them to walk through theoretically infinite landscapes... we're talking six figures though.
3) Behavior capture. The solutions to tracking movement remain pretty experimental and invasive. All the stuff we've talked about so far will, at best, get you walking around in a virtual landscape, capture your head movements (kind of), and maybe capture some of your arm and finger movements. Even assuming your $500,000 suspension rig captures all your body movements perfectly, we still need to capture facial expression and lip synch. (So far, spatial 3D audio is pretty primitive too.)
4) Force Feedback. All this VR is going to seem pretty lame when you can walk through solid objects or your hand passes through an item you're trying to manipulate. Arguably, this is a subset of item (3) above, but in fact just allowing people to walk around in an unlimited expanse is a big enough problem...
There are plenty of finer grained issues to deal with, but the rendering of VR scenes (at least, so far) has turned out to be the easy part. At the moment, if you wanted to play WoW in VR you'd need to set aside a large room, buy an expensive HMD, and a really expensive suspension rig. (Luckily, WoW lets you run straight through people so the UI will match this perfectly.)
Re:Where is PC Virtual Reality? (Score:2)
Games are nice, and I'm somewhat surprised helmet displays aren't more popular for them, but that's not really VR, and there's no content outside of games that would make VR more popular.
Soundstorm (Score:1)
The question is how well it performs in real world (Score:2)
How is the nforce4 Linux support (Score:2, Interesting)
I've booted my machine into it and to my surprise the ethernet devices worked out of the box with Xandros (based on Debian sid). I still do not know about the RAID or SLI video, however. I'm using a crappy old S3 PCI video card right now, but am about to receive two GeForce 6800 GTs in the mail. Can I use these bad boys in Linux? Anybody know?
Re:How is the nforce4 Linux support (Score:2, Informative)
What about the drivers in the first place? (Score:1)