Graphics Software

nVidia nForce

cygnus writes: "Hardcore tech upstart site ExtremeTech.com got to check out an nVidia reference board, their findings are here. While most of the drivers and hardware were beta, they gave it some positive play. The story has a few large photos that are worth a gander... The reference board has only two PCI slots and no onboard Ethernet. Ouch. I'll stick with my Mac for now."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • no onboard ethernet? perhaps you weren't reading very carefully... Connectivity: The MCP also has all the typical I/O connectivity and functionality you'd expect from a south bridge. These include two ATA/100 channels for hard drives, a hardware Ethernet 10/100 MAC and soft-modem support. The networking capability also supports HomePNA 2.0 (phone line networking). There's also the usual low-speed connections, including USB 1.1 support (two channels, for four ports), floppy disk support, PS/2 keyboard/mouse, and the legacy serial and parallel connections. The one thing missing from the picture is Firewire. nVidia chose not to implement 1394 in the MCP, which is an unfortunate choice. Windows Me already has robust support for 1394, and Windows XP will have a stronger implementation of 1394. In addition to connectivity with consumer devices, 1394 can also be used for small local area networks.
  • Tip: if you're having problems with an nVidia card under X, try playing with the

    Option "NvAGP"

    line in XF86Config. My card crashed after a few seconds of OpenGL until I set this to "2". Now it still crashes, but it takes more like half an hour than a few seconds.
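For context, that option goes in the Device section of XF86Config; here is a minimal sketch (the Identifier string is illustrative, and the meaning of each NvAGP value is documented in NVIDIA's driver README):

```
Section "Device"
    Identifier "NVIDIA Card"
    Driver     "nvidia"
    # NvAGP selects which AGP back-end the driver uses; see the
    # driver README for what each numeric value means.
    Option     "NvAGP" "2"
EndSection
```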

  • The review that the article links to discusses how nVidia has realized that they can't survive by being in just the graphics business, and that they realized this when they saw the colossal success of the Intel i810 platform, despite its horrific 3D graphics performance.

    What they then go on to insinuate, though, is that nVidia is expanding into yet another realm of the PC hardware industry, which is, frankly, a scary thought to me. Not content with having a monopoly on the video card market, they are now going after the logic set market, intending to dethrone Intel in the biggest market of them all: low-cost motherboard chipsets.

    What I'm really worried about here is the PC architecture turning into another closed, proprietary, overpriced piece of junk that the Apple has always been. When they start making video cards that are only "compatible" with nVidia chipsets, you will find out what I'm talking about. This is just the beginning of the market consolidation for nVidia. First 3DFX and now this. You can already tell that they are against open standards when they refuse to release open source drivers for their newest cards so that they can be supported under Linux and other free operating systems. What do they have to hide, anyway?

    While no one on Slashdot may be interested in buying a motherboard manufactured with this chipset, there will be a lot of people who will. While one may even get decent performance out of the chipset, perhaps even better than the Intel equivalent, one must consider: is it really the best choice that I can make in the long run? My personal opinion is that the answer is no, it ain't worth it.

    --

  • I know this one is still in beta so I can't expect too much, but can anyone tell me why so many of these video card reviews never have screen shots? I don't care if the card goes fast if it doesn't look good.

    Sigi
  • ...no onboard ethernet...
    According to the article, it does.
    The MCP also has all the typical I/O connectivity and functionality you'd expect from a south bridge. These include two ATA/100 channels for hard drives, a hardware Ethernet 10/100 MAC
    And although the ref board only has two PCI slots, the chipset supports 5. Asus already has one in the pipeline.
  • So I assume when you say it only has support for 2 PCI slots you mean, "It supports 5 PCI slots, just like all the other AMD-supporting chipsets, but this one board only implemented two of them, which is irrelevant to us because it's merely a reference board and we'll all buy the Asus full ATX that they've already designed," right?
  • by Anonymous Coward
    welcome to the wonderful world of slashdot. the last website in the world that proclaims freedom of speech, but doesn't have any.
  • "stick with my mac" means he is terrified of the DIY PC market. too many options, too many mouse buttons, too many apps, too many OSs to deal with. although the fact that he knows about mainboards is a rarity for a Mac person.
  • ummm, how are chipsets Intel's core business??? I always thought it was server processors (Xeon)? With the multitude of chipset makers (VIA, Aladdin) out there making PIII/PIV-enabled chipsets, I doubt Intel is going to shake much from this. Bottom line, AMD has a much more relaxed position on releasing the specs necessary to run their processors; I believe that is one of the reasons they have as much market share as they do. Intel, on the other hand, is much more stringent when it comes to releasing specs of that kind.

    I'm sure after Rambus, Intel will screen its clients/partners much more carefully, and nVidia's current business practices leave MUCH to be desired. Just ask Hercules.

  • by SClitheroe ( 132403 ) on Sunday June 17, 2001 @04:58PM (#144758) Homepage
    One neat thing that this chipset supports is the so-called "twin-bank" technology. This interleaves access to the two DDR DIMMs (i.e. chunk #1 is on DIMM #1, chunk #2 is on DIMM #2, chunk #3 back on DIMM #1, and so on) to pump the maximum memory bandwidth beyond what a single DDR DIMM would be capable of. I guess they had to do this since the GPU uses system memory (ugh), but it has a nice side effect of really unleashing the performance potential of 1GHz+ CPUs, especially if you ditch the onboard video and stick something decent in the AGP slot.

    And to think, my 386 had interleaved memory back in the day..I was 'leet and didn't even know it ;)
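A toy sketch of what that interleaving does, assuming (purely for illustration) two channels and a 64-byte interleave granularity; these are not the actual nForce parameters:

```python
CHANNELS = 2       # "twin bank": two DDR channels
GRANULARITY = 64   # bytes per interleave chunk (assumed, not an nForce spec)

def channel_for(addr: int) -> int:
    """Which DDR channel services a given physical address."""
    return (addr // GRANULARITY) % CHANNELS

# A sequential 512-byte stream alternates between channels, so both
# DIMMs deliver data in parallel and the usable bandwidth roughly doubles.
hits = [channel_for(a) for a in range(0, 512, GRANULARITY)]
assert hits == [0, 1, 0, 1, 0, 1, 0, 1]
```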
  • by diablovision ( 83618 ) on Sunday June 17, 2001 @04:58PM (#144759)
    AMD recently claimed [dqchannelsindia.com] they would have 30% overall market share by the end of the year, their goal before launching their 64 bit processor [zdnet.co.uk]. It seems they are on track to meet those expectations. With NVidia's chipset offering a low-cost affordable solution for low-end machines and the 760MP [tomshardware.com] chipset offering us the world's first multiprocessor AMD platform, they are putting themselves in a position to have real sway in the coming 64 bit desktop revolution.
  • by hattig ( 47930 ) on Sunday June 17, 2001 @04:59PM (#144760) Journal
    nVidia's nForce chipset looks great, and has a great next-generation chipset architecture with its crossbar architecture, dual DDR channels, etc. But it is expensive ($60 - $70 per chipset) and "only" incorporates GeForce 2MX (nForce 420 might be GeForce 2 Pro) graphics.

    This puts it in a strange position in the market. The chipset is very powerful, yet the graphics will be decidedly average by the time the chipset is finally being sold on the market. It beats all the other integrated systems out there by a massive margin (the audio system is to die for, beating the SBLive! into a cocked hat), but those other chipsets only cost like $20 a pop anyway!

    The SiS735 looks to be the other chipset to look out for, beating the VIA and AMD solutions at the moment, and being a much cheaper single chip solution.

    Both chipsets incorporate next generation interconnects - nForce uses an 800MB/s Hyperlink connection, and the SiS uses a 1.2GB/s multi-threaded connection (possible due to the single chip design).

    VIA will soon be releasing their second shot at a DDR chipset for the Athlon, called the KT266Pro, without the problems the first one had and with an improved DDR memory interface.

    Whatever, the future is looking great for the Athlon in terms of chipset support, which now covers the entire low to medium end of the market (up to workstations and low-medium end servers with the 760MP).

  • by jeroenb ( 125404 ) on Sunday June 17, 2001 @05:01PM (#144761) Homepage
    nVidia decided it was time to dive into the design and development of integrated core logic to capture a portion of this mainstream market with superior product.

    If they intend to take on Intel's i810 and i815 successes, then why do it with a chipset for an AMD CPU instead of one for Intel's? Sure, AMD is getting bigger and bigger, but I haven't seen any market share survey give them more than 30% at best. And with the current largest PC maker in the world (Dell) not selling Athlons at all, they're severely limiting the reach of this new chipset, it seems.

    Seriously, it's great that motherboard-makers like Asus are going to integrate the nForce, but what's the use? People building their own PC (the ones buying Asus/Abit/MSI/etc. boards) probably don't want to get an integrated GeForce2 MX...

    So anybody know what the deal is? Licensing?

  • by steveha ( 103154 ) on Sunday June 17, 2001 @05:06PM (#144762) Homepage
    AMD has had trouble selling into the low end of the market. Most of the trouble has been because the Intel side has low-cost motherboards with integrated video and sound. Vendors who want to build an inexpensive AMD system can get a Duron chip, but the savings are canceled out because they also have to buy video cards and audio cards.

    Now nVidia is going to make it possible to make an integrated motherboard, and the performance is going to be excellent. The Duron totally crushes the Celeron, the GeForce MX totally crushes the 810 onboard video, and the audio DSPs totally crush everything currently on the market. As long as the price for a Duron plus one of these boards is about the same as the price for a Celeron plus an 810 motherboard, they will sell a whole bunch of these.

    steveha

  • You could at least say something in the writeup about what the nForce is - is it a new graphics card? A motherboard? A chipset? A toaster oven? I know almost everyone else follows hardware development religiously, but I had no clue what this post was about. The linked article didn't help much at first since it assumed the reader already knew what was being reviewed.
  • The ellipses in your quote, "... no onboard ethernet", refer, in part, to the reference board. The article does not state, contrary to what you say, that the reference board has on-board ethernet. The fact that the chipset supports it is another matter. But then you knew this, given the last paragraph in your reply, in which you draw the distinction between the number of PCI slots on the ref board and the number supported by the chipset.
  • by hattig ( 47930 ) on Sunday June 17, 2001 @05:12PM (#144765) Journal
    A number of reasons I think:

    1. XBox was originally designed to use an Athlon. Politics dictated that the PIII would be used in the end.

    2. The PIII is at the end of its life, the Athlon has at least another 18 months ahead of it.

    3. The PIII cannot take advantage of this chipset at all.

    4. NVidia's bus licenses for the Intel platform come from Microsoft; they don't own them themselves, hence they can only make Intel chipsets for Microsoft.

    4a. Intel might have given P4 licenses to the slow chipset makers (ALi, SiS) which won't compete with Intel, but look at what is happening within Intel with regard to VIA's high-performance P4X chipset!

    Many people do not build great systems only to run games, and this would be great: good performance, cheaper than buying separate components, and still enough oomph for a quick game or two.

  • by Anonymous Coward on Sunday June 17, 2001 @05:13PM (#144766)
    Too bad you haven't bothered to read about the nForce before commenting. The nForce supports an AGP slot, and in fact in the article they mention putting a better video card in with no problems.
  • by StenD ( 34260 ) on Sunday June 17, 2001 @05:13PM (#144767)
    ...no onboard ethernet...
    According to the article, it does.
    The MCP also has all the typical I/O connectivity and functionality you'd expect from a south bridge. These include two ATA/100 channels for hard drives, a hardware Ethernet 10/100 MAC
    As you note for PCI, the chipset supports onboard ethernet, but the reference board doesn't have onboard ethernet. From "The Real Thing--Hands On":
    Curiously, although device manager reported an "nVidia Ethernet Adaptor", there was no physical connection on the motherboard
  • by Anonymous Coward
    Just because the chipset has built in ethernet, it doesn't mean that there is an ethernet connector on every board with that chipset. Looks like this reference board doesn't have the digital audio connector(s) either. In short this board is missing quite a few of the features that are supported by the chipset, but I'm sure asus or someone will make a more full featured version for their board.
  • by linatux ( 63153 ) on Sunday June 17, 2001 @05:14PM (#144769)
    This sounds great to me. Small M/B, reasonable graphics, sound etc. Mix with a wee power supply and a laptop HDD and I'll have a PC that I can comfortably carry to work & back. Personally couldn't care less if it had NO PCI slots, but it must have ethernet.
  • The question that you've got to ask yourself is: what do you expect from screenshots? As far as 2D goes, the screenshots will all look the same anyway, really. In 3D, any card will usually render whatever the current Direct3D revision specifies, or it has new and unique and (largely) unsupported features, for which you'll find enough screenshots on the card manufacturers website. So, I guess that screenshots just don't give you all that much real information.
  • by Anonymous Coward
    NVIDIA has stated in the past that some of their graphics chip technology has been licensed from 3rd parties, and that those licenses prohibit the full release of specs.

    Basically, NVIDIA is stuck with decisions from the past that were probably correct at the time, but which now cause problems.

    However, the nForce shouldn't have this problem as NVIDIA should be aware that they have to allow for Linux support, and the core chipset business is an entirely different thing than graphics chipsets in terms of how much info you have to let out.
  • by hattig ( 47930 ) on Sunday June 17, 2001 @05:22PM (#144772) Journal
    Motherboards using nForce will be performance integrated mainboards. Also the price of the chipset will drop over time, I expect it will be $45 by the end of the year.

    Also, there are integrated chipsets for AMD processors already:

    1. SiS 730/733. Not a great performer, but really cheap. Quality to boot as well (I have one running FreeBSD just great; I'm using it at the moment). It also has a network interface on-chip that performs fine.

    2. VIA KM133. Integrated Savage graphics. Outperforms the i810/i815 chipsets by a reasonable amount.

    3. VIA KL133. Integrated Trident graphics. Pretty crap I reckon, pitched against i810 though so what do you expect?

    The VIA solutions do not have integrated networking. This is why a lot of boards sold are i810/i815 - for the cheap corporate market where a motherboard needs video, crap audio and networking.

    VIA are aiming for that last market a lot with their C3 processor and PL133/PM133 chipsets. They will soon have all-in-one boards that are small, and incorporate: Video, Audio, Processor, Memory, Network, Modem, IDE, etc. There will be no need for any expansion slots (1 PCI will be provided though), and will use the new VIA iTX motherboard size (smaller than FlexATX).

  • by Anonymous Coward
    VIA = withholds chipset bug status reports, so that operating system developers can't write workarounds and it ends up eating your filesystem (Intel does have a detailed chipset errata page)

    ALi = lackluster performance but at least it doesn't eat your filesystem

    Intel = high performance with bug workarounds openly posted to their developer website.

    nForce = closed source drivers. too bad $everything_except_microsoft
  • by hillct ( 230132 ) on Sunday June 17, 2001 @05:27PM (#144774) Homepage Journal
    A couple of days ago there was an article here on /. [slashdot.org] discussing the nature of the content on the ExtremeTech.com [extremetech.com] website. Some comments on that article at the time (including mine) made the mistake of assuming that the site was being run by ZD Net, which is a subsidiary of C|Net, which most readers here look down upon; however, apparently that information was in error. The site is in fact managed by ZD Media, which is a new division of Ziff Davis Publishing.

    The earlier /. article is definitely worth a read, though, when evaluating content from the ExtremeTech website.

    Having said that, and regardless of the earlier postings here indicating inaccuracies in the ExtremeTech hardware review being discussed, I have to say the site looks quite well put together and editorially fair.

    --CTH


    ---
  • by Spoing ( 152917 ) on Sunday June 17, 2001 @05:28PM (#144775) Homepage
    ...with a smattering of "I'm a real hard core tech site" pasted on front. (Modding myself down 1 point.)
  • You missed out:

    SiS = open with specs, provides help with drivers (LinuxBIOS uses SiS boards because of the help they get)

    I disagree about NVidia. Their graphics drivers are closed source because they contain licensed technology that they are not allowed to release. Hopefully this will not be the case with nForce, so whilst nVidia will probably not be as open as SiS, they should not have a reason to close source everything.

    Also, Intel chipsets are not all high performance. ServerWorks chipsets are. Intel integrated chipsets are pretty poor really, and need I mention i820?

  • The reference board has only two PCI slots and no onboard Ethernet. Ouch. I'll stick with my Mac for now.
    Do you think this stuff could really compete with mac anyway?


    Compete is a relative term; it depends on the task (say, Photoshop vs Quake). I'm interested to learn how the onboard gfx of nForce compares to, say, a GF3 in a Mac or PC. The GF3 may be a bit faster, but it is limited by AGP 4x/6x/8x and the system RAM throughput. I'd also like to see if NVIDIA is working on some sort of unified color correction API for Windows (a la Apple's ColorSync; something that is universal, not just one part of a kluge of components).
  • Tell me, people... why is it that when someone admits they use a Mac, you all have to pounce on them like schoolyard bullies using adolescent comments? It was just a passing comment by the poster, but it seemed enough to apparently threaten some of you. Honestly, I don't get it. I use Macs, but I don't go around bashing PCs or calling the users names; that's just plain old hate/ignorance. Many Mac users would buy PCs... but they just don't offer what we want (OS X, integrated wireless support, problem-free operation, configuration in minutes, etc.).

    Back to the topic of motherboards... where can one find an x86 board with no legacy ports, just USB, Firewire, etc., and one with integrated wireless support on board (and decent quality too, not the cheap junk)? All answers would be appreciated. I'd like a modern, inexpensive x86 box to dabble with Linux. I wish the onboard stuff on the x86 motherboard side wasn't so low quality... my friends have nothing but problems with onboard video, sound, etc. They tell me it's cheap... maybe so... but they end up paying more $$ for sound and video cards. It bewilders me that someone pays money for something and doesn't mind that it doesn't work as advertised. Time is money; if you pay for something, it should just work... right?
  • Onboard graphics, ethernet, sound, and a few other goodies. Sounds like a Mac.

    Heh, I thought it sounded very Amiga-like myself! All it is missing is Firewire, and I bet that will be included on many motherboards, as this is a performance all-in-one chipset.

    Wonder if it is power-hungry, or if it can be used in laptops? This chipset would take over the performance laptop market if it did...

    Also, can you get the following: a 3.5" bay cover, with two USB ports (connect to header on motherboard), two Firewire ports (connect to header on motherboard/Firewire card), headphone socket for audio, serial port for legacy, and maybe a nifty 1-line LCD display for geeks? That would be great.

  • VIAs high-performance P4X chipset

    LOL!

    You mean the one that corrupts your filesystem when you enable ATA66 or 100 disk access?

    Well, it hasn't been released yet, so no. Also that bug is old, and relates to a fault with the Soundblaster Live! because it hogs the PCI bus. Although I agree that the PCI implementation should still be able to deal with it.

    Never had problems with ATA-100 disk access using VIA motherboards, except when installing Windows 2000 on an ECS motherboard. That is due to buggy Windows 2000 drivers anyway.

  • The crossbar architecture reminds me a lot of my Silicon Graphics Octane... built in January 1997. As far as wide memory bitpaths, I think the Sun Blade 1000 is the highest for less than $12,000 --288-bit ram path (upgrade four dimms at a time).

    At any rate, I think I'll wait for the SiS735.
  • GeForce has very low 16-bit quality compared to other solutions; it is a fact, and there are plenty of examples on the net to prove it.

    When rendering in 16-bit mode, the GeForce uses a 16-bit buffer, so the more layered textures a particular area has, the more artifacts get introduced. Other GPUs use a 32-bit buffer for all modes, and thus get great-looking 16-bit graphics (at a small performance cost).

    Anyway, the interesting GPU at the moment is the Radeon 2, with its Truform technology that looks great.
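The accumulation effect described above can be mimicked with a toy compositing loop. Here q5 stands in for one 16-bit (5-6-5 style) rounding step on an 8-bit colour channel; the layer values are purely illustrative:

```python
def q5(x: int) -> int:
    """Quantize an 8-bit channel value to 5 bits and back (one 16-bit rounding step)."""
    return (x >> 3) << 3

glow = 7             # a dim additive layer, e.g. a faint fog or halo pass
dst16 = dst32 = 0
for _ in range(4):   # the same layer composited four times
    dst16 = min(255, dst16 + q5(glow))  # 16-bit buffer: rounds on every pass
    dst32 = min(255, dst32 + glow)      # 32-bit buffer: keeps full precision

assert dst16 == 0        # the dim detail vanished entirely
assert q5(dst32) == 24   # rounding only once, at output time, preserves it
```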

  • I use a bit of everything and while I only own one mac these days, we use almost nothing but G4s at work (FinalCutPro 2 with DVcam and DVCPro gear). At home I spend most of my time on a few old SGI workstations but slowly using more and more Mac OS X on my PBG3. As soon as Apple finishes (or starts, heh) full hardware acceleration for the Aqua widgets, I'll probably go fullbore OS X and never look back. There's something very nice about using Photoshop, postgresql, apache, and iMovie all on the same machine under the same os. I do admit that OS X hasn't been perfect, but our Mac OS 9.1 machines at work have been rock solid.
  • This puts it in a strange position in the market. The chipset is very powerful, yet the graphics will be decidedly average when the chipset is finally being sold on the market.

    Why is that? I mean, if it's very powerful, the graphics should be more than average, right?

    It beats all the other integrated systems out there (the audio system is to die for, beating the SBLive! into a cocked hat) by a massive margin, but they only cost like $20 a pop anyway!

    Well, better technology should be more expensive, don't you think?
  • Being a Mac user (as well as a NeXTSTEP/Openstep, Solaris, IRIX, OpenBSD, Linux, and Windows user), I have indeed run across far too many "typical mac users". Half of which are addicted to the platform and refuse to even look at anything else... the other half not knowing the difference between USB and HBO (or IEEE1394 and the IRS 1040).

    I use a number of G4s at work as they've been powerful, stable, wonderful machines to work with for DV video editing and compositing (under Mac OS 9.1). The G4 case was a deam to work with when adding ram and a second hard drive to each machine. We currently have two SGIs and three PCs running Maya but will certainly try Maya under OS X later this summer. At home and for some of the organizations I work with, I use a variety of x86 and workstation platforms. I do my gaming and game serving on two homebrew Windows PCs. No need to bash a platform or try to convert the dissimilar. Use what you enjoy and what works for you.
  • The primary reason that these hardware reviews don't include screenshots is that the reviewers have to sign an NDA in exchange for the privilege of getting an early preview of the latest and greatest hardware. This is mostly because the company in question doesn't want to kill their product before it is even released due to reviewers badmouthing the performance of beta drivers, beta firmware, beta hardware, etc.

    --
    < )
    ( \
    X

  • why don't you just... buy a laptop?

    -------

  • It's hard to tell exactly what the chipset can do, as opposed to what the reference board does. I would assume that, if this chipset can't handle a better GPU, they'll make one before long that can. Likewise, most of the obvious problems are probably a result of them doing the motherboard when they normally do chipsets.

    If they've improved memory bandwidth enough that the GPU and CPU can both use the same memory without getting too slow, that would probably be a big advantage-- you could dynamically allocate memory size and memory bandwidth, so that you only give the graphic system a huge amount of memory when it wants it, and you can use the memory for programs when you're not doing that much graphics.
  • Didn't anyone read this:

    Hardware Reviews Online [slashdot.org]
  • Well, if you think it isn't worth it, don't buy a motherboard based on this chipset.

    The simple fact is that the current generation of PC architecture is at a dead end. Making faster processors and faster video cards is great, but at the high end they are being seriously hobbled by poor system memory performance and bus speeds.

    NVidia is offering a solution to this problem. AMD is also offering a separate solution. I'm sure Intel will 'answer' with their own solution...

  • The Mac is closed and proprietary, but hardly a piece of junk. Nvidia doesn't have a monopoly, and neither does Apple.
  • Why is that? I mean, if it's very powerful, the graphics should be more than average, right?

    No, the graphics won't be more than average - the GeForce 2MX, which is what Nvidia has integrated onto this motherboard, has been their value graphics chip for some time now. It isn't nearly as fast as GeForce2, GeForce2 Pro or GeForce3, and the motherboards aren't even on the market yet.

    What the post you replied to pointed out, is that this is a strange combination - while other features on the motherboard, like the built in sound and memory architecture are high end, the graphics aren't in the same class. So many (myself included) would like to have the same chipset, without the builtin graphics - no point in having that chip if you want to buy a top-of-the line graphics card anyway.

  • To sum up the article, here are the most notable features of the nForce reference board that they received:

    The chipset's North Bridge- Dubbed the "IGP", it provides for more efficient use of memory bandwidth using the GeForce3's crossbar switching technology. Note that this is the same kind of technology that SGI, Sun, and other high-performance UNIX hardware vendors have been using for their memory architecture (drool...)

    Support for 128-bit DDR memory- This is just amazing. Not only will their particular implementation allow you to put different sizes of DIMMs in the board, but it will also allow DDR memory to boost its bandwidth 100%, which means that instead of maxing out at 2.1GB/sec, it maxes out at 4.2GB/sec (again, drool...)

    The link between the North and South Bridges- This will support bandwidth up to 800MB/sec, which means that it will have an excess even under unusual conditions, such as maxing out ATA/100, maxing out the PCI bus and outputting 256 streams of digital audio all at the same time. Simply awesome.

    The built-in audio- The article gives a lengthy description of this. Suffice to say, it is better than any consumer card available today, built-in or otherwise.

    Built-in graphics system- This actually was a little disappointing to me, because, while they may have it running at essentially AGP 6x, they used their budget system, the GeForce2 MX, for the processing. I'm also disappointed that it uses a "shared video memory" architecture that can be found in a lot of deplorable video platforms, notably the Intel i810 chipset. Basically, it uses up to 32MB of system memory for the frame buffer in addition to its onboard memory. However, with the huge amount of bandwidth available to the system memory, this may or may not be an issue.

    Lack of support for 1394 and 64-bit PCI- Regrettably, they don't seem to have included support for either Firewire or 64-bit PCI. The lack of 1394 support is particularly surprising, since it has gotten increasingly better support in both Windows ME and Windows XP, which are the main platforms that will be running on this board.

    All in all, I think this is going to be an awesome board, a real leap ahead of everything else out there for the x86 market right now, at least in the consumer arena. I look forward to buying one, if the sticker shock isn't too harsh ;-).

    --
    < )
    ( \
    X
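As a sanity check on the 2.1GB/sec and 4.2GB/sec figures quoted above, the arithmetic for PC2100 DDR (assuming a 64-bit bus clocked at 133MHz and double-pumped) works out like this:

```python
bus_bytes = 64 // 8        # 64-bit data bus, in bytes per transfer
clock_mhz = 133            # DDR266 base clock
transfers_per_clock = 2    # DDR moves data on both clock edges

single_channel = clock_mhz * transfers_per_clock * bus_bytes  # MB/s
dual_channel = 2 * single_channel  # 128-bit "twin bank" doubles it

assert single_channel == 2128   # roughly the quoted 2.1GB/sec
assert dual_channel == 4256     # roughly the quoted 4.2GB/sec
```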

  • Uh, did you see the spex on the DDR bus? 128-bit? Sounds PERFECT for a nice 1u AMD server.. DASP sounds interesting, and integrating the north and south bridges on one chip (with HyperTransport) looks particularly interesting if they do cool stuff like 64-bit 66MHz PCI (for my ever-lovin gigabit ethernet adapters full bore.. 1 bus per slot please!)

    I was leaning towards the ASUS A7S for such a beast, but who knows? I may want to go for a 1024x768 run of UT on my nameserver.. ;)


    Your Working Boy,
    - Otis (GAIM: OtisWild)
  • by Chris Johnson ( 580 ) on Sunday June 17, 2001 @06:43PM (#144795) Homepage Journal
    Actually, this review seemed like nothing but a marketing release written by the company. I can just see the ZDnet guys placing emergency phone calls to nVidia:
    "Look, when we do X the framerate drops to one every five minutes. Uh, can we say that 'sucks'? Um, I see, well then, how about can we say it is 'not good'? No? Well, but, what do you expect us to say? Well, people are going to be asking about this, we can't really NOT say anything.. what's that? 'Stately'? Are you kidding or something? Wait, wait, no, no, I'm not being disrespectful! No, sir! We'll use that to describe it, sir! All right... thank you... thank you..."

    Get used to this sort of thing. These guys are going to be _the_ 'everybody' in the statement you may be hearing a lot of... "No, you can't put out a negative review of our product. No, not even a little tiny bit. Everybody else is perfectly happy to accept our terms on this, take it or leave it..."

    Oh joy :P

  • why are YOU so threatened by others being threatened? the guy's comment was just stupid. he said "The reference board has only two PCI slots and no onboard Ethernet. Ouch. I'll stick with my Mac for now." he's judging a reference board... anyone in the DIY scene knows that reference boards come like this, microATX style. nForce boards will be available in the fall. and there IS onboard Ethernet in the nForce.

    the "stick with my Mac for now" comment just makes him a big target for catcalls. sure pal, go ahead and stick with your "supercomputer". he's the instigator.

    as far as mainboards with no legacy ports, I'm not aware of any. but you could disable them in BIOS to free your IRQs if that's what you're worried about.

  • The main reason is Intel's licensing -- I remember reading that on Tom's Hardware Guide.

  • Another interesting problem is the issue of memory bandwidth. There is some concern, rightfully, about available system memory bandwidth when using the integrated graphics core. In theory, the CPU should always be able to grab memory bandwidth first. When using the built-in IGP graphics, the memory streaming tests came in about 15-20% lower than when a GeForce2 Ultra was dropped into the AGP slot. This is possibly related to their memory arbitration logic still being enabled and consuming cycles even when nothing else is truly vying with the CPU for memory access.

    Exactly how does the reviewer think the picture is getting on the screen if the RAMDAC isn't accessing main memory? Even suggesting that the CPU should have higher priority than the RAMDAC is either ignorance or stupidity. Of course the RAMDAC has to have priority or you are going to end up with big blank patches on the screen!!
  • The GeForce 2 MX can barely handle 800x600 in 32-bit mode, according to the review.

    So you are stuck with 16-bit graphics if you want 1024x768, which any person is going to want because this is a performance chipset at heart, for workstations and the like.

    Why can't someone integrate a KyroII into the northbridge?

  • Except that they linked to an earlier article that they had written on it on the first page, and if you didn't understand what they were talking about, you could have easily clicked on the link to brush up on it beforehand.
  • I've worked with a number of the highly integrated SiS chips - they all have graphics (2d and mediocre 3d - not a gamer's machine), ether, IDE, USB, sound, etc etc built in - the Linux drivers seem to be pretty good (with, sadly, the exception of MPEG playback even though the chips have support for motion comp) - there are both kernel and X drivers available - on the downside there's almost no published data from SiS for anyone wanting to work on the drivers.

    There are a bunch of really cheap boards available - some only have 1 PCI slot (after all, with almost everything built in, most people don't need slots at all) - you can build an OK box for $150 - many 'barebones' machines have these boards in them. Apart from that you don't see them here that much; I suspect the bulk of them are being sold in India and China.

    In fact the only real problem I've had with them is that the Windows ether driver seems to blue screen all the time - like I care :-)

  • I have a GeForce MX and it performs a little better than the Voodoo3 that was in my machine before. For most games I find it good enough. I like the idea of this. I am thinking about getting one of these to put in a 1u case for LAN parties.
  • Maybe because the PIV is a piece of junk?

    Intel tends to slump from time to time, but the i810 fiasco (including RAMBUS) followed by the PIV is very worrisome.

    I think that AMD stands to hit it big with the right chipset. This may just be it.

    -Peter

  • Actually, for the longest time, everyone has been shipping integrated graphics with AGP slots. HP used to not do that, and they caught hell for it. I doubt you'll find many computers at Best Buy that have integrated graphics and no AGP slot.
  • At any rate, I think I'll wait for the SiS735.

    Why? If you're looking for incredible integrated video and sound capabilities, the nForce will likely be the only decent choice. This chipset seems more aimed at mid-range machines built by OEMs, where the user doesn't really care what's in it, or at compact systems that need to take up as little space as possible. The SiS chipset seems far more for traditional power users who want to swap out any of their components. I don't think people who buy the nForce will be the same people who would've bought an SiS735-based board. It will, however, raise the bar against some of SiS's other chipsets that feature integrated video, audio and LAN, and force them to create more powerful video and audio in their chipsets.

  • If it's an iMac or a Cube we're talking about, then the reference board has you beat 2 to 0 on PCI Slots. Same goes for their laptops, obviously. So hopefully this is a G4 minitower or older Mac that you're sticking to.
  • Good god, man. Judge a company by what it does, not what it has the *potential* to do. There are so many good things about NVIDIA I don't have the time to enumerate them now. NVIDIA got where they are by making good products and having good support. It is disrespectful to them to suggest they're trying to control the world when they've given no such signs.

    As for the driver thing, get your head out of your ass. It's called licensed code. (Plus the fact that they have the best OpenGL implementation available. Why should they write ATI's driver for them?)
  • I've had the opposite experience with Linux and SIS based boards. The ones I've seen had some nasty problems with the SiS IDE controller that would lock up the system spontaneously when mounting CDs, or other IDE-based tasks. There were "phantom" SiS900 LAN adaptors detected (with no visible option to turn it off in the BIOS), and generally looked to me like a very shoddy board. About the only board I've never had a problem with are Intel chipset-based boards, since Intel has made a policy of documenting their chipsets and errata publically, the support for these boards in the Linux kernel seems exceptionally stable.
  • What a bunch of whiners. It didn't have an RJ-45 jack. I'm sure that it had one of the old fashioned jacks. Just requires a soldering iron, a magnifying glass, and the patience of Job.

    You folks must be waayyyyy too hung up on plug and play.

  • As for the driver thing, get your head out of your ass.

    Yup, that's how logical arguments are won. Please, keep trying to start a flamewar, you're getting better at it.

    It's called licensed code.

    That is their problem. Perhaps they shouldn't have used licensed code? Food for thought, eh?

    Why should they write ATI's driver for them?

    Mainly because they wouldn't be writing ATI's driver for them. They are based on different chipsets. You do realize that drivers are hardware specific, right?

    --

  • I can think of four boxes built by friends in college with Asus/GeForce2MX. Off the top of my head. I can only think of one that doesn't fit that description. A lot of the people building their own PCs (the ones buying Asus/Abit/MSI/etc. boards) are price-sensitive consumers.
    --
  • It might well be said that I am addicted to the Mac platform and yet... 1394 is aka FireWire and I pay my damned taxes using a 1040 form... and would not think of using windows unless under extreme duress, it's simply a philosophical thing. Linux, I'm all for it.
  • "why don't you just... buy a laptop?"

    Hmm, maybe because:

    • notebook components are expensive
    • notebook components are expensive, or
    • notebook components are expensive

    Think of it as a computer you'd take to LAN parties or any other situation that won't require a loan to purchase.

  • by Cerlyn ( 202990 ) on Sunday June 17, 2001 @08:48PM (#144814)

    Check your BIOS: If you have a setting called "Memory Interleave" or something like that, you likely can do this with your motherboard today. The motherboard I have that does this (an EPOX EP-MVP3G) has three settings: None, Two Bank, and Four Bank. Benchmarks suggest a major improvement with a K6-2/450 when Two Bank is selected, although I don't have enough slots to try Four Bank out (it just reverts to two, or so it seems).

    And dare I mention it, many Macintoshes have been able to do this for quite some time now. Just add extra RAM to a Mac, make sure all the SIMMs are identical, and you will suddenly have a nice performance boost for more than one reason.

  • The lack of 1394 support is particularly surprising, as it has gotten increasingly better support from both Windows ME and Windows XP
    If nForce was meant for the Xbox, then no 1394 support isn't that suprising.
  • by Anonymous Coward
    At least on Linux, the kernel folks are still fighting VIA bugs. So dismissing a bug as "old" is not the same thing as saying "it's fixed" which it isn't because VIA is a dogshit company with dogshit products. And they won't release specs or errata, which is good enough reason alone for open source users to avoid them. The Nvidia solution might not be very high-end, but at least it's not VIA.
  • Err, correct me if I'm wrong, but the 'onboard' video card still has ram on it. Probably something like 8 or 16mb or so (maybe 32 if they were nice). Enough for a few framebuffers. The main memory is used to store textures that the video card can't hold. This is the main idea behind AGP that is now conveniently forgotten because cheap and performance are two opposite goals...
  • Err, correct me if I'm wrong, ...

    OK, I'll correct you ;)

    The onboard video DOES NOT have integrated frame buffer memory. This is the same for most i810's, i815's, etc. In fact, you can lower system performance by running your monitor at a higher refresh rate! To find out how much memory bandwidth your monitor is using on these things, multiply your screen's number of pixels (width * height) by 2 for 16-bit color or 4 for 24/32-bit color. This will give you how many bytes per refresh the RAMDAC is using. Multiply that by your refresh rate (ex. 85) and you have bytes per second. Take this off of your maximum memory bandwidth, and that's what you have left for your CPU. Not a great situation!
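    The arithmetic described above can be sketched in a few lines of Python. The bus figures here are illustrative assumptions (a single PC2100 DDR bus), not measured nForce numbers:

    ```python
    def scanout_bytes_per_sec(width, height, bytes_per_pixel, refresh_hz):
        """Bytes/sec the RAMDAC must read from the shared frame buffer."""
        return width * height * bytes_per_pixel * refresh_hz

    # Example: 1024x768 at 32-bit color (4 bytes/pixel), 85 Hz refresh
    scanout = scanout_bytes_per_sec(1024, 768, 4, 85)

    # Assumed single PC2100 DDR bus: 64 bits * 133 MHz * 2 transfers/clock
    peak = (64 // 8) * 133_000_000 * 2

    print(f"scan-out: {scanout / 1e6:.0f} MB/s")          # ~267 MB/s
    print(f"left for CPU: {(peak - scanout) / 1e6:.0f} MB/s")
    ```

    Even this rough estimate shows a double-digit percentage of the bus consumed just by refreshing the screen, which matches the poster's point.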

  • I'm typing this on an iMac. I'm using a three-button Logitech Wingman Gaming Mouse, I've seen more apps available for download on Hotline than I would know what to do with (or be able to fit on my measly 20GB hard drive), I dual-boot between Mac OS 9.1 10.0.3, I have four x86 operating systems installed under VirtualPC, and my server runs Slackware on a SuperMicro P6DGU with on-board Ultra2 SCSI (Adaptec AIC-7890 chipset) and 512MB of ECC PC100 SDRAM. We're not as rare as you might think.

    (by the way, although I use Hotline, I do try to pay for most of the commercial and shareware software I use.)

    --

  • ATX, PCI, AGP, ATA/66, PC133, 10/100/1000 base-T, v.90, USB, IEEE1394, IEEE802.11b... Sounds awfully proprietary to me. Just because nobody else is selling PowerPC motherboards to consumers yet, doesn't mean it's a proprietary architecture. Just a nitpick. ;-)

    --

  • Actually, the nForce does allocate either 4, 8, 16, or 32 MB of main system memory for the frame buffer. I think it is in that article. It also says that there is an interconnect equivalent to 6x AGP speed, so it isn't missing the point of AGP; it is using it to the nForce's advantage.
  • by _damnit_ ( 1143 ) on Sunday June 17, 2001 @09:55PM (#144822) Journal
    The article clearly states on the next-to-last page:
    Curiously, although device manager reported an "nVidia Ethernet Adaptor", there was no physical connection on the motherboard, so we had to resort to a tried-and-true 3Com 3C905B-TX card for network connectivity. There was also a mysterious "PCI card" present, but there was no clue as to what it represented.
    Please read the entire article before you comment. Also, you whiners about moderation could follow the same advice. Perhaps some moderator knew the information was false and mod'ed him/her down. Regardless of where the truth lies, you just want to complain about the evils of moderators.


    _damnit_
  • The 'phantom' SiS900 is probably the integrated one in the north bridge - if the board you have doesn't have the full integrated ether, it may still have a header you can attach one to (it looks like a PCI panel with the ether connector/transformer on it, like the one for the modem or socket7 USB). The SiS-based boards come in all shapes and sizes - I've even seen ones with extra ether chips on them (plus the built-in ether header), presumably because the integrated driver for Windows is broken.
  • by Anonymous Coward
    Yeah, it's Ziff Davis, and believe me, they're about as hardcore as Nick at Nite. Those guys never read a press release they couldn't rewrite as an article.
  • This puts it in a strange position in the market. The chipset is very powerful, yet the graphics will be decidedly average when the chipset is finally being sold on the market. It beats all the other integrated systems out there (the audio system is to die for, beating the SBLive! into a cocked hat) by a massive margin, but they only cost like $20 a pop anyway!

    I've been wondering about this...what exactly have Creative's engineers been up to for the past 3 years? There have be no real upgrades to the SB Live! in that time, all they do is add neat little drive bay cable connections and change the box a little it seems.

    So is this a case of "We don't have to release a better product, this one is selling fine..." or did they just fire all their engineers?

    Seems like given 3 years of R&D time, Creative may have something up their sleeve. Or maybe not...who knows...

    -Sokie
  • by IronChef ( 164482 ) on Sunday June 17, 2001 @10:43PM (#144826)

    Man, I better go buy some Intel hardware. God knows they need the support. Those poor guys with their great CPUs have never amounted to anything, and now nVidia's really going to crush them. It's a shame.
  • by Xross_Ied ( 224893 ) on Sunday June 17, 2001 @11:04PM (#144827) Homepage
    nForce does not use memory interleaving, it uses a memory crossbar.

    memory interleaving:
    Interleaving of different banks of memory which are in SERIES.
    + shortens address line setup time for sequential memory access.
    + up to 30% more bandwidth (depending on type of memory FPM/EDO/SDRAM) when the cache line is double (or quadruple) the bus width.

    memory crossbar = Interleaving of banks of memory which are in PARALLEL.
    + parallel memory transactions (if each is for a different bank).
    + improvements are less deterministic, depends on which transactions can be parallelized.
    + does reduce memory contention, i.e. different devices (CPU, Video, Disk, NIC) wanting access to memory at the same time.

    For multimedia type applications (read games, DVD playback) this is a big help.

    nForce's twin-bank memory sub-system reminds me of the Amiga's twin bank memory sub-system.
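    A toy sketch of the distinction drawn above, with made-up parameters (bank count, access size) rather than nForce's actual ones:

    ```python
    CACHE_LINE = 32  # bytes per access, assumed for illustration

    def interleaved_bank(addr, banks=2):
        """Series interleaving: consecutive cache lines alternate banks,
        so a sequential burst keeps the banks' address setup overlapped."""
        return (addr // CACHE_LINE) % banks

    def crossbar_schedule(requests, banks=2):
        """Crossbar: in one cycle, issue at most one request per bank.
        Requests hitting different banks proceed in parallel."""
        issued, busy = [], set()
        for requester, addr in requests:
            bank = interleaved_bank(addr, banks)
            if bank not in busy:  # bank is free this cycle
                busy.add(bank)
                issued.append((requester, bank))
        return issued

    # A sequential CPU stream alternates banks 0,1,0,1,...
    print([interleaved_bank(a) for a in range(0, 128, CACHE_LINE)])

    # CPU and GPU hit different banks, so both issue; the disk request
    # collides with the CPU's bank and must wait for the next cycle.
    print(crossbar_schedule([("cpu", 0), ("gpu", 32), ("disk", 64)]))
    ```

    The second print shows why the crossbar's gains depend on which transactions happen to land in different banks, as the comment says.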
  • Would this be like the on board "ethernet" port on the Tanzania based Macs and Mac clones? i.e. the Power Macs 4400, Motorola StarMaxes, and the sublicensees. It looks like Apple integrated ethernet into the O'Hare ASIC, but I have I have never tried to solder an Ethernet port and related hardware on and figured out which shunts would have to be moved for Ethernet support. Personally I'm kind of leery of trying to use the Ethernet on the 4400 and its siblings. First off the 4400s that did come with Ethernet had the controller on a Comm Slot II card, which is basically a PCI slot on LSD. Second off the Powerbook 3400 which has Ethernet too uses the O'Hare ASIC and you'd think that to save money and space Apple would use the Ethernet on the O'Hare IC, instead they use a Modem/Ethernet card whose slot is more like a PCI card on crack. Now I know Apple is not known for its sound engineering decisions, but it is safe to say Apple would have used the Ethernet on the O'Hare ASIC if possible. Thirdly Apple never mentions the existance and denys the presence of Ethernet hardware with respect to the O'Hare ASIC in all of the developer notes I've seen referring to it. Apple mentions that the Heathrow ASIC has an MFM floppy controller that no Mac has/will ever use. Of course this is all out the window if Apple is lying and the 4400 has a Heathrow controller(maybe) or something else entirely(doubtful). Hmm, now that I think about it maybe Apple is lying and the 4400 does have a Heathrow ASIC as opposed to an O'Hare ASIC.
  • Just a minor correction..

    NOT 128bit DDR memory:
    nForce has two independent 64bit DDR memory buses, see..
    http://www.anandtech.com/chipsets/showdoc.html?i=1484&p=4

    When both banks are populated (with a 64bit DDR DIMM in each) you have a "virtual" 128bit bus. However, if the two banks are interleaved, you do get a 100% boost in bandwidth.

    Note: more bandwidth is not what limits 1+GHz PCs; it's memory latency and system contention.
    nForce is the 1st chipset to attempt to address the issue of contention while NOT increasing memory latencies.
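    The peak-bandwidth arithmetic implied above can be sketched like so (the 133 MHz PC2100 base clock is an assumption for illustration):

    ```python
    BUS_WIDTH_BITS = 64
    DDR_CLOCK_HZ = 133_000_000  # assumed PC2100 base clock
    TRANSFERS_PER_CLOCK = 2     # double data rate

    def peak_bw(buses):
        """Theoretical peak bandwidth in bytes/sec for N independent buses."""
        return buses * (BUS_WIDTH_BITS // 8) * DDR_CLOCK_HZ * TRANSFERS_PER_CLOCK

    single = peak_bw(1)  # one DIMM populated
    dual = peak_bw(2)    # both banks populated: the "virtual" 128-bit bus
    print(f"{single / 1e9:.2f} GB/s vs {dual / 1e9:.2f} GB/s")
    ```

    These are theoretical peaks only; as the note says, latency and contention, not raw bandwidth, are the real limiters.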
  • closed source drivers, hopefully they will have a fix for that..
  • the current generation of PC architecture is at a dead end.

    I sense a tone (not in your post, you provide the hook, thanks) that integrating somehow limits freedom to build a custom PC from components. So in 2010, will hardware hackers be building teeny weeny wearable PCs using tweezers and a magnifying glass? Of course not. We'll go through the single chip phase, then the cost will fall, then people will use entire computers as components.

  • Gee, you haven't checked that for a long time - did you?

    Upgrade to kernel 2.4.5 or upward - the problem is fixed there - I'm using 4 x 30GB on ATA 100 as RAID - and it works perfectly (with SB Live)

  • No, the graphics won't be more than average - the GeForce 2MX, which is what Nvidia has integrated onto this motherboard, has been their value graphics chip for some time now.

    You are comparing apples to oranges. nForce is by far the most powerful integrated graphics chipset there is! When you compare it to other integrated chips, you will see that there really is no comparison! Sure, add-on cards are of course faster, but when it comes to integrated graphics, nForce blows everything else away!

  • Great! Troll Mod... grunt grunt.

    I didn't read that in the article, but nevertheless, integrated devices, and a couple of pci devices, still make up for what I said.
  • If you look at SuiteSisterMary's Slashdot user information page [slashdot.org], you will see that almost all of this person's postings have been moderated up, even though they are typically trivial, not particularly insightful postings (and I mean this matter-of-factly, not as an intended insult). I had a discussion with SuiteSisterMary and noticed that each of his (her?) responses was moderated up to 2 as soon as it appeared, as was the case with all but one of SuiteSisterMary's other postings on that article. [slashdot.org] I believe that SuiteSisterMary may be maintaining a stable of slashdot accounts to use as they get moderator points (or perhaps arranging with friends to achieve the same result through their accounts).

    I do not think it is possible to always stop this kind of abuse if the perpetrator is careful enough, but on a case-by-case basis one can point it out (as I am doing here, which is why I have attached this off-topic posting here). It might be a good idea for Slashdot staff to check if SuiteSisterMary has been sloppy about this by seeing if the IP addresses from which SuiteSisterMary's postings were made are the same as the IP addresses from which they were moderated (or a group of very close ones, say, from a DHCP or modem pool) and if all of the moderation consistently happened within a minute or two of posting.

    It may also be possible to make some systemic fixes to at least reduce the problem, such as by preventing a moderator from moderating the same user's posting within one month (which would also discourage moderators from camping out on one particular thread for partisan purposes).
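    The systemic fix proposed above could be sketched roughly like this (the function names and the 30-day window are hypothetical, not anything from Slashdot's actual code):

    ```python
    from datetime import datetime, timedelta

    COOLDOWN = timedelta(days=30)
    _last_mod = {}  # (moderator, author) -> time of last moderation

    def can_moderate(moderator, author, now):
        """Refuse moderation if this moderator has moderated the same
        author within the cooldown window."""
        last = _last_mod.get((moderator, author))
        return last is None or now - last >= COOLDOWN

    def record_moderation(moderator, author, now):
        _last_mod[(moderator, author)] = now

    t0 = datetime(2001, 6, 17)
    print(can_moderate("modA", "someUser", t0))                          # first time: allowed
    record_moderation("modA", "someUser", t0)
    print(can_moderate("modA", "someUser", t0 + timedelta(days=2)))      # inside window: blocked
    print(can_moderate("modA", "someUser", t0 + timedelta(days=31)))     # window elapsed: allowed
    ```

    A per-(moderator, author) cooldown like this would also discourage the thread-camping the poster mentions, at the cost of occasionally blocking legitimate moderation.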

    I had a discussion with SuiteSisterMary and noticed that each of his (her?) responses was moderated up to 2 as soon as it appeared, as was the case with all but one of SuiteSisterMary's other postings on that article.
    Actually, if you understood Slashdot's Karma system, you'd know that if you have high enough Karma, you automatically post at +1, and if you have low enough karma, you automatically post at -1. ACs post at 0, and registered users post at 1, followed by the above modifiers. And I don't have other accounts.
  • Actually, according to the article, the motherboard itself had an ethernet chip, but there was no physical ethernet port; I guess that's actually more useless than not having an ethernet chip onboard. But once again, it was also a beta board.
  • According to SuiteSisterMary's user information page [slashdot.org] at the time I am writing this reply, 37 out of 38 of his (or her) most recent postings are moderated up, including the one to which I am replying. I encourage readers to look at those replies. They are not the sort of postings that lay out a substantial amount of new insight, or that bring in much new factual information, that normally get moderated up. Whether SuiteSisterMary is lying about having extra accounts or is doing something slightly different but close, I leave readers to draw their own conclusions, and I hope Slashdot staff will look at the IP addresses and times of SuiteSisterMary's postings and moderation events.

  • The GeForce2 MX is a great choice. Some would say that a GeForce2 GTS/GeForce3 would be nicer, but recent NV cards have 200-250MHz DDR memory just to keep up with the GPU. There is no way, without faster memory, it would make any difference in speed. Maybe if they put the framebuffer/texture buffer on the board, but that costs money. 1024x768 16-bit color is more than enough for an integrated chipset. If you want faster, go for a standalone GFX card.
  • I have no idea what your problem is. Jealousy? Self-righteousness? Penis envy?
    Oh no, somebody has more Karma than I do. But because I personally don't think they deserve it, he must have gotten it by cheating! There is NO possible way that any moderator could possibly think differently than I do, therefore there is NO possible way that any legitimate moderator would ever mod this guy up! Oh no, I've only posted 9 posts, one or two of which have gotten modded up! He's posted 30 or 40, and they almost all did! Waahhh! Waahh! It's not fair!
    Buddha flying on a gods-damned bicycle. I hope for both our sakes that somebody does decide to listen to your pissing and moaning, take a look at the logs, and tell you that yes, I only have one account, that yes, all of my modded posts were modded by people other than myself, and then you'll shut up.
  • Is it just me, or are the editors at ExtremeTech.com some of the worst out there? I read hardware reviews all the time - from Ars Technica, CNET, etc. - and never before have I really noticed grammatical and typographical errors nearly as much as after reading one article on ExtremeTech.com.

    Am I the only one who has noticed this? You'd think with all the corporate backing that they would have some decent quality.

  • Oh, and by the by, the post you were referring to, where I mentioned that I don't have any other accounts, WAS NOT MODDED UP. Read the damn slashdot faq. [slashdot.org]
    Logged-in users start at 1 (although this can vary from 0 to 2 based on their karma) and anonymous users start at 0.
  • The frame buffer is part of main memory - you set up how much main memory to use as the frame buffer in your BIOS.

    Sure it has an "AGP 6x" connection, but this doesn't help a lot when you are pumping all the data to the RAMDAC through that connection as well.
  • I must apologize. I did not realize that it was possible for any user's post to start at 2. If your posts are simply starting at two, then the number of additional moderations remaining is not that unusual, and I was wrong.

  • 1. XBox was originally designed to use an Athlon. Politics dictated that the PIII would be used in the end.

    or maybe they didn't want to have stability problems with the chip (Athlon) overheating?

    2. The PIII is at the end of its life, the Athlon has at least another 18 months ahead of it.

    wrong.. the new PIII Tualatin is due to come out shortly.. .13 microns, 256KB, 512KB+ cache. New enhancements. Starting at 1.13 and 1.26, PIIIs are far from dead.. maybe the Coppermine core will stop at 1.13 (they are reintroducing these with the latest stepping btw).. But like AMD and its Palomino.. Intel is extending the life of the PIII with the Tualatin

    3. The PIII cannot take advantage of this chipset at all.

    Single PIIIs have shown no significant improvement with DDR.. but dual PIIIs have.. this, using an Iwill DVD-266R and dual PIII 1GHz CPUs, showed 15-20% improvement over similarly configured SDR boards

    4a. Intel might have given P4 licenses to the slow chipset makers (ALi, SiS) which won't compete with Intel, but look at what is happening within Intel with regard to VIAs high-performance P4X chipset!

    ugh.. i just wish VIA would go away.. The day they make a good overall chipset.. is the day Microsoft will make all their software open source

  • Because nVidia is secretly in cahoots with AMD. They are currently developing a GeForce4/Athlon "system on a chip" which will be the worlds first processor to have world class graphics built in.

    Thinkaboutit.
  • Heh, I bet there won't be any consumer cards out in the near future that would need 64-bit PCI, but I sure would appreciate it for the Ultra160 SCSI RAID controller that I have for my own personal use ;-)

    --
    < )
    ( \
    X

  • Well, I don't know squat about how processors really work. Allow me to display my ignorance . . .

    It seems that even reasonably "optimized" code will only see a significant clock-for-clock performance gain on a PIV when it is "lucky" enough to have the right mix of work to keep the pipeline filled.

    Am I wrong here? Clearly the PIV could do some incredible stuff with the right synthetic benchmark, but do you really think that it is going to have a significant (again, clock-for-clock) advantage in real world apps once compilers catch up? If so, I think this is the minority opinion. Or am I wrong about that too?

    -Peter

Quantity is no substitute for quality, but it's the only one we've got.
