Technology

NVidia Vs. Intel: Fight To Come?

Mostly Monkey was the first to write to us regarding a new article on Tom's Hardware. The article is about Nvidia's launch of "...nForce, its integrated graphics, audio, Northbridge, and Southbridge chipset, also referred to as Crush." The implication is that Nvidia is setting itself up to move past the graphics/audio market and compete with Intel head-on in the chip market. What with AMD's recent success, that doesn't sound so unlikely.
  • by Anonymous Coward on Monday June 04, 2001 @09:09AM (#177648)
    This is pretty interesting in the sense that no company since the original Amiga team has tried to create a fully integrated system with multimedia capabilities from scratch. Instead of building an initial architecture and then tacking bus loads onto it until performance starts to suffer, you build *everything* into the architecture so that bus transfers from competing chips don't step on each other (well, as little as possible anyway).
    With all the talk about the new Amiga (software only), I'm more excited by this development.
  • Absolutely. With the "nForce", nVidia is promising a single driver that would support not only the mainboard and disk controllers, but also the on-board GeForce 2 GPU, Ethernet, and Sound Card. This should be a huge hit with the OEMs.
  • Like BIOS updates, kernel revisions and the like, you only update if you need the new features. On the other hand, if nVidia only did 3 releases per year, and release X wouldn't ship until feature Z was done, even though it could ship without feature Z but with the feature Y or fix Y that you're after, you'd have to wait.

    Release early/often works on the assumption that the user knows when to upgrade. Release rarely-and-when-it's-done works on the assumption that the user doesn't.
    John
  • by pod ( 1103 )
    Any non-zealot moderator, please pay attention! This is an extremely good point, and is the very reason nVidia doesn't release source to their drivers. Video drivers these days are very complex, they're a technology on their own. If you've ever seen the function specs on DirectX and OpenGL you'll realize how complex implementing all of that correctly, completely and efficiently is. That video drivers are under constant development should be hint enough for you. Each release gives a 2-5% speed boost; driver development is almost analogous to 3D game engine development, each generation offers immense improvements in features and speed, and there is a lot of incremental development between generations (though games don't usually come with engine updates, except maybe for Quake).

    --

  • Why should we blame nVidia for this situation? nVidia is not an Open Source(tm) company, they make hardware and drivers to make it all work. They probably don't give a flying shit about what license they use as long as they can license 3rd party code and get their stuff on a Windows install CD. And contrary to GPL and Open Source(tm), they actually think their driver code is an intellectual property asset (which arguably it is, how thick are the OpenGL and DirectX spec books?), and releasing it in source form is unacceptable, not to mention a breach of contract with the above-mentioned 3rd party code licenses. Just be glad you _have_ Linux drivers, and high quality ones at that.

    --

  • Yeah, they work REALLY great on my Alpha Linux machine. Thanks for reminding me.

    So tell me, just to make it very clear, why did you buy an nVidia card for your Alpha machine knowing it didn't work? It's not like nVidia has drivers for any OS for Alpha (not just Linux), so why'd you bother? Or is this just rhetorical FUD? Go away troll.

    --

  • It's not driver support (though that can be an issue), and it's not even performance. We've had some network cards in our servers here fail, totally stop working, for no reason. Kingston cards are the most notorious for us, we recently _gave away_ a box full of Kingston NICs. No-name and cheap NICs are just not worth the trouble of potential critical hardware failure, and the price premium on known good quality cards is not that high. For a moderate use home machine or a development server anything will work, these cards _are_ selling and they _do_ work, but sometimes you get what you pay for.

    --

  • Everything sounds good. Aside from any upcoming technical problems, this just may be the very first integrated chipset I'll buy. Everyone who knows what they're doing does the wink-wink-nudge-nudge thing when talking about integrated motherboards (you know, the ones with crappy on-board sound/ether/video).

    Sound is pretty much commodity stuff these days, just go down to your local hardware store, pick up an SB Live, and you're done. There are some more powerful cards out there for people who need them, but otherwise the SB Live is perfectly sufficient. nVidia is just doing the logical thing, integrating a more powerful sound solution, thereby eliminating most of Sound Blaster's market.

    Ether is pretty much commodity stuff as well. Aside from all the no-name clones out there, it's best to just spend the extra couple of bucks and get yourself a real card, 3com, Intel, whatever. This nVidia chip should really also have a good 10/100 component.

    And video? Well, I suppose nVidia is already in that market heavily, with all the cards based on their chip, and it's best to leave this part alone, as video cards are the most frequently performed (and most expensive) upgrade (how many times did you upgrade your sound card or network card, or even CPU?), so it's a cash cow as a stand-alone card.

    The only threat to Intel in this chipset is of course the integration and elimination of the North/South Bridges, but most importantly, nVidia's alignment with AMD. It's all AMD this, AMD that, and DDR also features prominently. Good stuff.

    --

  • I can't agree more with your statement. Ever since I read the first write-up on the Geforce3, I've been waiting for someone to bring the memory crossbar to the mobo. I know anandtech had a nice bit on the ServerworksHE or whatever, but that's a server/Highend workstation board. This is the first modestly priced board with these features.
    What questions I have remaining concern i/o and busses. Sure, the HyperTransport is fast between north and south... but how many PCI busses does the south bridge handle? How about PCI-X or 64bit/66Mhz PCI? Firewire? IrDA? I also wonder if interleaving wouldn't have been better than the "neat" feature allowing different types/amounts of memory on each bus.
    I am impressed by the feature list and eagerly await the first test drives by Tomshardware and others. This could get interesting.


    _damnit_
  • Now I've had an ATI Radeon for a month or so, and have yet to experience a single crash. And the Radeon DRI drivers are still alpha code... Go ATI, I say!

    Until you realize that they too are holding out... TV-Out, hardware MPEG2 decode... these things aren't available on your beloved ATI cards when used under Linux. Go Trident, I say! :-)

  • Hmm, would be nice if all modules worked that way. I had to add some modules for ip_sec to my firewall this weekend and tried doing it without recompiling everything. Sure enough had to patch the kernel also and recompile and stuff. Kinda sucks on dual 50Mhz Sparc procs.

    Sounds like development code to me. If you're gonna run dev code, get a development machine. My firewall is an 80386DX/33 but all my compilation takes place on a dual Cel433 with 256M of memory. It takes the sting out, let me tell you. :-)

  • Man, I'd love to see an open source OS that realizes that recompiling the kernel to add new features is the fucking stupidest idea that anyone has ever come up with - no wonder Linux is getting its ass kicked in real markets. Buh-bye VA linux and eazel - thanks for all the free shit.

    Use kernel modules, then. Want to use a new feature? Just compile the module, and set up the appropriate config in modules.conf... I use ALSA sound modules to configure my sound card, for instance (a rough sketch of that config is at the end of this comment).

    The single unified driver that nVidia is promoting may help with compatibility issues, but it practically guarantees that this motherboard will be Linux incompatible. nVidia has long since closed the GeForce drivers, and I bet that Dolby will have some reservations about opening up the APU. A partial doc/source release may satisfy GPL and kernel requirements, but the performance is likely to stink...
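
    For anyone who hasn't set that up before, here's a minimal sketch of what the modules.conf side might look like. The module names below are illustrative placeholders (they vary by sound card and ALSA version), not the exact entries:

        # /etc/modules.conf -- illustrative ALSA entries; substitute your card's module
        alias char-major-116 snd                # ALSA core
        alias snd-card-0 snd-card-emu10k1       # hypothetical driver module for card 0
        alias sound-slot-0 snd-card-0           # hook for OSS-style apps

    After that, a "modprobe snd-card-0" (or letting the kernel autoload it on first use) brings the driver in without rebuilding anything.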

  • nVidia's integrated GeForce2 chip communicates with the rest of the chipset at the equivalent of 6x AGP. Yet the integrated chip is not "top of the line"--if you want to upgrade your graphics capability (to, say, a GeForce3) you'll have to use a slower 4x AGP interface... A difficult tradeoff?
  • Yeah, but that makes no sense. Banks don't need to communicate with each other.

    In parallel computing, a 'cross-bar' is where every node has a p2p link with every other node. Basically this is the optimal setup, as long as you don't count cost or complexity.

    I'm trying to imagine how this idea could be applied to a memory bus, where chipset communicates with ram and that's it, and I can't. Oh well.
  • by Chris Burke ( 6130 ) on Monday June 04, 2001 @09:47AM (#177662) Homepage
    Sounds very nice, but it doesn't mean a thing if it doesn't push bits as fast as the old guys. Or if they can't get the thing to market.

    Hmm... what part of dual-bank DDR with h/w prefetch didn't you understand? That's double the memory bandwidth (4.2 GB/s) of any existing mobo, with a possibility for greatly reduced latency due to prefetch. And while the CPU still communicates to the chipset over a mere 1.6GB/s bus, remember that this leaves another 2.8GB/s for prefetches and DMA from the graphics chip and other components. Also, EV6 still has room for further speedups, and this will give AMD a reason to do so. I don't find their claim of 20% improvement to be unbelievable (I'll believe it when I see it, but it isn't unbelievable).

    I have no idea where the "cross-bar" part of the memory system comes into play... The idea doesn't even make sense. You have a chipset, and two banks of DDR. That's two point-point links, for which the term cross-bar seems ill-fitting. Anyway...

    Also, they've got a demo board using only 4 layers! Number of layers has a high impact on board cost, and having only four is very good. Also, according to the article a number of major board manufacturers have made commitments. So it looks like this thing will see the light of day.

  • Yeah, we know where nvidia *really* got their mcp from.

    END OF LINE.

  • However, when I said "focus on improving memory access and AGP speed", I meant that they should keep the expandability option open. Not forgetting that things do break and not work as expected. Maybe something like a GPU socket? where you have a CPU socket and a GPU socket - I think that could satisfy my "expandability" criteria.

    Not that I really disagree with your post, but how often again do the CPU sockets change? .. about once per generation? .. yup. Can you plug an Athlon into a pii slot? .. nope. CPU sockets are a really weak example for 'expandability'.

    (Even within a generation, my supermicro mobo won't take piis (it's smp) above 5(6?)00 mhz.. they didn't exist at the time. I'm currently at 350, a bargain even when I bought them .. how much exactly am I gaining there?)

    Not counting that, provided everything works, all-in-one chipsets are cheaper.

    It's not like nvidia is preventing you from sticking a matrox card in your box.. yet :)

  • Pfft. Don't worry about it - they'll just be leaking about 3 driver sets a week, and expect you to sort out which ones work and are worth keeping. This way, they'll have everyone playing beta-tester/lab rat, rather than just their video card customers...
  • For the graphics card market, they've certainly become the standard these days, and perhaps a little complacent about how they approach things.
    However, they're the newcomer to a very cutthroat market, and very soon they'll have Intel and VIA (and probably SiS and co. too) breathing down their necks.
    If they keep the programming info close to their chests, they'll be limiting the compatibility of their board. And any limiting factor these days is a Bad Thing (TM).
    A good dose of openness wouldn't hurt the adoption of the board early on, and may even become a deciding factor in its continued survival in the marketplace, once all its competitors wake up and produce a more open answer.
    One can but hope.

    Malk
  • Precompiled modules?
    Though, I very much hope that this doesn't come to pass.

    Malk
  • The article says it's hardware accelerated for DirectX.

    This, I'd assume is because a lot of this is X-Box spinoff tech.

    If I understand the article fully, then it's DirectX compliant from the word go.

    Malk
  • by malkavian ( 9512 ) on Monday June 04, 2001 @08:21AM (#177669)
    Well, as per usual, it seems to prove that extra competition in an industry forces innovation.
    With the Intel/VIA etc. wars, it just seemed to be a war to see who could produce the same solution at lower cost. Very little new actually happened.
    When NVIDIA talked about their memory specs in the GeForce3, I was wondering how long it would take for the powers that be in the industry to have the same kind of forward thinking on motherboards.
    Now it seems that it took NVIDIA to jump in with both feet to wake everyone up.
    Well, succeed or fail in this new venture, they've certainly given everyone a lot to think about, and I'll definitely be putting money aside to buy one as soon as I hear they're settled in and stable.
    It'll be fun to see where this race of new ideas ends. :)
    Long live innovation, competition, and hats off to NVIDIA.

    Malk
  • by malkavian ( 9512 ) on Monday June 04, 2001 @08:28AM (#177670)
    Well, as they said in the article, when queried about why they didn't have an Intel version:

    "We don't have an Intel bus license"

    From this, I assume that Intel are trying to do what IBM did with the MCA architecture over a decade ago.
    They're forcing developers into restrictive licenses, and possibly expensive ones, to use an architecture.
    This was a really bad idea, as IBM found when nobody used MCA, and instead went for the inferior EISA bus, which didn't have such heavy restrictions.
    I can't blame NVIDIA for trying to get their dedicated graphics cards looking and performing to the best possible standard (as they already have optimisations for the PIII and AMD processors), and it's only logical for them to optimise for P4 if they can. And they have.
    However, this new chipset is simply saying that AMD are more open, and easier to develop with, rather than having Intel breathing down their neck with licensing issues at every step of the way.
    My view on it is that AMD just gave NVIDIA the license to innovate properly, and good things are now happening.

    Malk
  • I didn't have much luck with the original link.

    here's [tomshardware.com] another one that seems to work better.

  • The suggestion that NVidia is getting into integrated chipsets in order to compete with Intel on new ground is wrongheaded.

    This is a defensive move, pure and simple. If NVidia doesn't get its hooks firmly into the integrated chipset market then they will see their share in the total graphics chip market dwindle as a larger and larger percentage of new motherboards/systems are built with integrated chipsets.

    In a similar vein, Intel's presence in the graphics chip market is itself defensive. Graphics chips are more competitive, price/performance-wise, for graphics than general purpose CPUs are. That is to say, spending 50 bucks on a 3D chip is going to do more for graphics performance than spending 50 bucks to get a pentium IV 1433MHz instead of a pentium IV 1375MHz. Therefore, each graphics accelerator sold potentially means that people buy a cheaper CPU, which eats into Intel's revenue. Selling graphics chips helps them hold on to some of that revenue.
  • nVidia has a motherboard chipset that offers integrated sound, video, and an extremely high bandwidth between the two chips?

    So now the PC platform has finally come full circle in its design and reached the point that the Amiga was at in 1987


  • My biggest problem with integrating so much into the systemboard is the lack of upgrade options. A standard PC, with separate sound, video, and other cards, allows you to upgrade any component (including the systemboard/processor) without "losing" the others.

    eg. if my systemboard blows, I can buy a new one and salvage the video card from the blown system. However, if the component is integrated into the systemboard I lose this feature.

    I'd be hardpressed to buy or recommend something like this for personal use. Perhaps computer labs would be interested (where there are 200 identical computers and any problem systems are swapped for new systems). In that respect I do see the advantage (drivers, integration, etc).

    It's cool now, but five years from now? I just don't like throwing things away because they are obsolete. Even if the system board is current, the video card may become useless - install a "full" video card into the AGP slot and the onboard chipset becomes a useless waste of space.

    Why not focus on improving memory access or AGP speed? Why not further communication between the graphic chipset and the CPU instead of developing these (IMHO) silly integrated solutions?

    Bah.


    ---
    Computer Science: solving today's problems tomorrow.
  • They're already the Microsoft of the Video card industry, why not take the Microsoft practice of "embrace-and-extend" as well?

    Look at history though. I think you'll see that NVIDIA won because it produced better stuff in a tighter development cycle than its competitors. And its competitors still exist. I think ATI has a lot of strength in the TV-Video thing and is leveraging that strength in the 3D arena. Even though NVIDIA bought up 3DFX, that doesn't leave the entire market to them (although they do have a large chunk).

    If I were NVIDIA, I'd be worried about the DOJ and see if they want to attack their monopoly.


    ---
    Computer Science: solving today's problems tomorrow.
  • > They did; furthermore, it's quite possible to
    > have very-much-higher-than AGP access speeds
    > with an on-die solution; this should give you a
    > better idea of where the future will be.
    >
    > At a guess, for example, Hammer or some near-
    > future version will have memory controllers on
    > die. I expect integration to be a strong trend,
    > because we've gotten to the point where the
    > current bottlenecks are the communication
    > channels between parts.

    Obviously if the GPU is directly tied to the motherboard you get a lot of performance wins. However, when I said "focus on improving memory access and AGP speed", I meant that they should keep the expandability option open. Not forgetting that things do break and not work as expected. Maybe something like a GPU socket? where you have a CPU socket and a GPU socket - I think that could satisfy my "expandability" criteria.

    From the press release:

    PR> Designed to deliver the ultimate multimedia
    PR> experience, nForce is the architectural
    PR> foundation for Microsoft's highly-anticipated
    PR> Xbox game console. In addition, NVIDIA has
    PR> secured design wins with the industry's top
    PR> PC ODMs, including ASUS, MSI, GIGABYTE,
    PR> Mitac, and Abit; and expects to announce many
    PR> other OEM design wins by the winter of 2001.

    The good thing is that integrated hardware developers (for things like the X-Box) will be able to bypass the AGP tech and have an onboard video card. Cool. However, the other ODMs mainly push consumer/business end PC mobos and not integrated solutions. Granted there is a place for the "all-in-one" motherboard, but my bias is towards the techy end-user (eg. the guy or gal that enjoys building their own systems with bits and pieces).

    I hold firm that integrating the video card onto the motherboard is, for the most part, "silly". I feel that, according to the specs, there is more power than the average desktop workstation needs (to justify costs, that is) as opposed to a non-integrated solution. Where I work, most users get an ATI rage IIe. Doesn't hold much for 3D, but it does fine for 2D - which is what most businesses use. Businesses or users that need the 3D power would probably be disappointed by the specs.

    My opinion: Too much power for the average worker, too little power for those that need it.

    Now, if you can put the CPU, videocard, soundcard, and ethernet adaptor all on the main processor... well.. I hope you see why I think it can get a little "silly."


    ---
    Computer Science: solving today's problems tomorrow.
  • > Look at the Linux driver - sure it's legal, but
    > is it open source? I know about the NDA
    > problems, but why does that prevent them from
    > releasing a lower performance open source
    > driver? Answer: Because it's not in their
    > commercial interests while they have the best
    > performance on Linux why let that good
    > performance get back into the community where
    > other vendors might take advantage of it?

    I suffer from this problem. I have never had a system that could run X (XFree86) properly. 99% of the problem was not having proper drivers. On my 486, I had some no-name trident that could barely pull off 72Hz at 800x600. I lived with Linux on the commandline and booted to Windows for gui stuff. On my newer comps, I went thru another Trident AGP card, a TNT, a GF1, GF2, and now a GF3, and I still get a shitty display.

    Now that 3Dfx is dead, why can't they release their friggin drivers? Arguh.

    > Who knows what the long term holds in this
    > industry - we might all be using OS/2 on PPCs
    > in 5 years!!

    Hey, that sounds sweet. ;)

    ---
    Computer Science: solving today's problems tomorrow.
  • You didn't answer the question, "how is this specific to the GPL?"

    If nVidia can't release 3rd party source-code then they can't BSDL it, put it in the public domain, or GPL it. It's all off-limits.

    So how can the GPL be to blame?
  • by gmhowell ( 26755 )
    G13 technology. Like Emil's arm.

  • Here's a great summary of the architecture for people (like me) who aren't engineers on HardOCP: http://www.hardocp.com/articles/nforce/index.html [hardocp.com]

    Also, I personally really like AnandTech, so here's a link to his detailed article: http://www.anandtech.com/showdoc.html?i=1484 [anandtech.com]

    (And no, I'm not a 'karma whore' ... who _really_ cares?)
  • Thanks - good point! They go together rather well. It'll be interesting to see the Sandra scores for this one.
  • It's certainly going to be good for real world apps (I just read a review that pointed out it's essentially like having an L3 cache in the chipset), but I'd have thought it'd show up in a synthetic streaming memory benchmark also - unless the CPU was already achieving its theoretical max (which seems unlikely).
  • by SpinyNorman ( 33776 ) on Monday June 04, 2001 @08:31AM (#177684)
    The Tom's article says that the chipset does speculative data prefetch to improve memory-CPU throughput by up to 20%...cool... but why are they bothering to do this when Athlon is about to be replaced by Palomino that has hardware prefetch built in?
  • I hope you're joking...

    MS is one of NVidia's biggest supporters...there's a reason just about every nvidia request makes it into the next version of DirectX. On top of that, figure that all of this hardware is made especially to fit perfectly within the DirectX specs and functionality to provide the best DirectX support out there. Ditching MS in the PC market is something akin to stopping breathing. Sounds like something to try, but give it a few minutes, you'll regret it.

  • I suspect NVIDIA is more interested in becoming the next Intel, not the next Microsoft. Microsoft is known for its crap products and market dominance due to cut-throat business practices.
    Intel is also fairly brutal on the business side of things, but they consistently produce high quality products. NVIDIA got to where they are today by making top-notch hardware, combined with aggressive marketing. I don't see that changing any time soon. When you have a formula that works, stick to it.

    -Mark
  • Agreed. What this will do for benchmarks like STREAM is cut the memory latency by a huge amount. The chipset prefetch will have the right data sitting in its high-speed cache to return to the CPU on the next bus cycle, rather than having to wait the 4 or 5 bus cycles for the memory access to go through. Once it has figured the pattern, it will be prefetching far enough ahead (with its higher bandwidth) that the right data should always be in the "L3" cache ready for the CPU at the next bus cycle.

    Honestly, I'd believe the 20% performance improvement. Sounds about right for removing memory latency from non-random access. Gotta keep those three pipes in the Athlon filled!!
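
    To make the "figured the pattern" bit concrete, here's a toy sketch (Python, purely illustrative - the real chipset logic isn't documented here and is surely fancier) of a stride prefetcher hiding DRAM latency on a streaming access pattern:

        # Toy stride prefetcher model -- illustrative only, not the actual nForce design.
        DRAM_LATENCY = 5     # bus cycles for a real memory access
        DEPTH = 4            # how many lines ahead to speculate

        class StridePrefetcher:
            def __init__(self):
                self.buffer = set()   # lines already pulled into the chipset's cache
                self.last = None
                self.stride = None

            def access(self, addr):
                latency = 1 if addr in self.buffer else DRAM_LATENCY
                if self.last is not None:
                    self.stride = addr - self.last   # learn the access pattern
                self.last = addr
                if self.stride:
                    for i in range(1, DEPTH + 1):    # speculate ahead of the CPU
                        self.buffer.add(addr + i * self.stride)
                return latency

        pf = StridePrefetcher()
        cycles = sum(pf.access(a) for a in range(0, 64 * 10, 64))
        print(cycles, "cycles with prefetch vs", 10 * DRAM_LATENCY, "without")

    Once the stride is locked in, everything after the first couple of accesses comes back at buffer speed, which is where a 20%-on-real-code claim starts to sound plausible.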
  • Where did you get the info that the GF2 was optional? All the stuff I see says it is part of the North Bridge (or IGP or whatever). The Dolby Decoder and twin memory controllers are the only configurable options I can see (look at http://www4.tomshardware.com/mainboard/01q2/0106041/nforce-12.html)

    It will never be as easy to use an ATI card with an nVidia mobo as it is to use an nVidia card, because of the driver issue. You plug in a GF3 and it will just work - the existing drivers will just pick it up and use it. You are effectively getting your GF3 drivers bundled with the mobo driver package. You plug an ATI card in and your first experience is the good old 640x480 resolution in Windows (assuming we are talking mass market here - Linux people tend to be cool with figuring out drivers). You also can get weird interactions - ATI end up having to devote time to make their drivers work with nVidia's mobos, while nVidia don't have to care about ATI (remember the nVidia/VIA/AGP4x thing a while back?)

    Whatever happens, if nVidia gets a monopoly (very doubtful really) then there is little chance of them not exploiting it, by the simple fact it exists. You can't fault a company for testing that its own products work well with each other!!
  • They do 'embrace and extend'. Ever seen the OpenGL nVidia extensions? Guess what are the first extensions to get coded for these days? Oops - extended a supposedly open spec!!

    Look at the Linux driver - sure it's legal, but is it open source? I know about the NDA problems, but why does that prevent them from releasing a lower performance open source driver? Answer: Because it's not in their commercial interests while they have the best performance on Linux why let that good performance get back into the community where other vendors might take advantage of it?

    nVidia, just like Microsoft, won out of sheer bloody-minded persistence. The NV1 failed badly, so nVidia sided with Microsoft and DirectX and had cards supporting the next version of DX on the shelves as the versions were released. If you wanted T&L with your DX7 games then you had to get nVidia. If you got anything else then you were SOL. Now that nVidia is king of the hill they can afford to take on the bigger challenges - kinda like MS shrugging off IBM and going it alone.

    nVidia is one company that I would be very careful of in the future. I think they have a big potential to be the next Microsoft, but don't have a monopoly to speak of at the moment and probably won't for a few years yet (ATI has to roll over first which ain't gonna happen easily). The battle is heating up at the moment and the consumers are going to be the winners in the short term. Who knows what the long term holds in this industry - we might all be using OS/2 on PPCs in 5 years!!
  • Yup. The fear component came in when I suddenly realized how much of my life I must have wasted staring at hex dumps of DOS code.
  • Oh. Yeah. Throx is stupid. Bad Throx. :-(

    You are also right about the exploitation. Thanks for setting me straight.
  • by throx ( 42621 ) on Monday June 04, 2001 @09:10AM (#177692) Homepage
    It's simple really. The chipset has a much higher bandwidth to the memory than the CPU does so it can interleave the memory fetches with normal data access. If the CPU is also using a prefetch then the chipset has a lot better chance of figuring out what the pattern is and fetching the correct data.

    Remember that the CPU is running at 1066 MB/s to the chipset but the chipset has 4200 MB/s to RAM.
  • by throx ( 42621 ) on Monday June 04, 2001 @10:45AM (#177693) Homepage
    The reason they are using a GF2 and not a GF3 is simple really - if you want a GF3 then you have to buy it from nVidia and they get to make more money. If you don't want a GF3 then they've sold you a GF2 already which will keep you from spending money at their competitor.

    Remember the GF3 driver is likely to be included in the unified driver they are going to be shipping, but if you go with the competitor then you have to worry about driver issues. Looks like nVidia is set to become the Microsoft of the chipset/hardware industry!
  • Why the hell would they not want to be the next MS... I mean, I want to be the next MS.

  • Somebody came out with a chipset for Athlon/Duron CPUs that isn't from VIA.

    The VIA chipset (and USB controller) on my board (Asus A7Pro) is the crappiest, most unstable piece of junk i've ever used - and it's not even overclocked.

    I just hope NVidia does a better job of delivering a chipset that works properly than VIA has.

  • 4-in1 drivers? Well, i was going to try that as my next step under Win2k (i used to dual boot), but..

    In attempting to make my fucking VIA USB controller work, i flashed the BIOS on my mobo using the ASUS-supplied utility.

    Upon reboot, Win2k claims the mobo is no longer ACPI-compliant and BSODs, leaving me unable to:
    a) flash the BIOS back to the old version, and
    b) access the files on my NTFS drive.

    Linux, however, boots and runs quite happily on the same BIOS, but the onboard USB controller is still fucking horked no matter which OS i try and use it under.

    A third-party USB card seems to work OK, but now all my slots are full, my machine crashes a lot (even under Linux) and i paid good money for a worthless fucking USB controller and a shoddy chipset.

    I am not a happy camper.

  • ... Intel has proven it can't make a decent video card -- why should Nvidia be able to make a processor?

    Grip
  • by bconway ( 63464 ) on Monday June 04, 2001 @08:19AM (#177698) Homepage
    While this looks like an interesting development, it looks like nVidia is trying to play both sides of the battle. The top graphics article on Tom's Hardware, here [tomshardware.com] at http://www4.tomshardware.com/graphic/01q2/010511/index.html, discusses the next nVidia driver release, which focuses entirely on Pentium 4 optimizations to the nVidia drivers. I wouldn't declare an all-out war yet, guys. ;-)
  • by xinu ( 64069 )
    Hmm, would be nice if all modules worked that way. I had to add some modules for ip_sec to my firewall this weekend and tried doing it without recompiling everything. Sure enough had to patch the kernel also and recompile and stuff. Kinda sucks on dual 50Mhz Sparc procs.
  • (laugh) oops!
  • nForce is also the first PC chipset I remember seeing with a
    • crossbar memory architecture, with point-to-point connections between individual memory banks and I/O devices and CPU.
    Sun's UPA bus architecture works this way, as do various proprietary SGI, IBM, and HP(?) workstation and server busses, with various minor distinctions among them. AGP, in a sense, has kinda provided a "good enough" variant of this by segmenting graphics onto a separate bus, but a crossbar approach is a more general solution to the same problem - maximizing bandwidth utilization without the data traffic conflicts that arise in bus-based solutions.

    --LP

  • Besides being better/faster/cheaper, one aspect of the new nforce chipset is totally novel:
    • ability to guarantee bandwidth and latency for specific I/O requests (what they're calling "StreamThru"), a feature previously only available on mid-range (Octane) and high-end SGI systems with the XIO bus.
    This can be quite handy for things like handling multiple video streams without dropping frames, or mixing video with other I/O requests. I wonder if there are other cool things you could do once you can allocate bandwidth for I/O and have said allocations enforced in hardware?

    I've never seen this capability on the Intel platform before. It was a cutting-edge SGI feature back in 1996 or so. Sun/HP/IBM/Intel were never concerned enough about video to implement this on their workstations, but I do recall some discussion about whether it'd be in Infiniband or not (don't remember the outcome).

    --LP

  • Um, I don't want to sound too negative, but this part:
    ...there's a path to an NVidia/AMD Xbox implementation at a lower price point.
    of your post doesn't exactly jibe well with this part from the end:
    this new chip is GeForce 2. [...] The XBox uses GeForce 3
    I mean, you clearly understand that just popping an AMD Palomino on an nForce-based board does not, in fact, make an Xbox. Also, I doubt that, even if it were possible, it would in fact be cheaper: a non-nForce motherboard for the Athlon CPU today is around $100, and $200 for the CPU doesn't sound unreasonable. Add memory, a hard drive and a case to that, and I think you've pretty much killed the chance to be cheaper than the Xbox--and you would still not have comparable graphics! On a more constructive (or at least less whiny) note, I think there are at least three reasons why nVidia doesn't include GeForce3-level graphics in the nForce:
    1. Competition--the GF3 is still pretty new, and budget cards (the "MX"s) are expected in the fall (at the earliest). Including the same level of graphics power in a supposedly cheap motherboard chipset would compete directly with graphics board sales.
    2. Transistor counts--the GeForce3 is >57 million transistors. Perhaps it was a bit too much to integrate?
    3. Memory bandwidth--a GeForce3 typically has a 128-bit DDR memory interface, clocked at 230 MHz. That's 7.2 GBps, or more than 50% more than all the memory bandwidth provided by the nForce!
    Now, let's just hope NVIDIA are kind enough to allow Linux to run on motherboards based on the nForce. Darn, I think my formerly so simple next-computer-purchasing-plan ("must have SMP Athlons, must have SMP Athlons...") just became a bit more complex. The nForce is *cool*!! ;^)
  • I don't think that Sandra scores would tell you anything exciting. I don't think that a synthetic benchmark will show you the advantages of something like this. Real world apps will probably show which kind of programs can benefit from this.

    -prator
  • Um, what chip would you buy for gaming from AMD that would cost 200 dollars?

    -----------------------

  • Actually, AMD processors running with a FSB of 133MHz can receive data at 133M x 64 bits x 2 (double-pumped bus) / 8 bits per byte = 2.13 GB/s, not 1.066 GB/s.

    The Pentium 4 data transfer rate is even higher, at 64/8 * 4 * 100M = 3.2 GB/s.

    So, nForce, at 4.2 GB/s, doesn't really have that much extra bandwidth to play with, especially since it's sharing the RAM bandwidth with the graphics controller (although the GC doesn't really use up that much of the bandwidth).
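
    If you want to sanity-check those numbers, the peak figure is just bus width times transfers per clock times clock rate. A quick sketch (rounded, and assuming the commonly quoted clocks):

        # Peak bus bandwidth = bytes per transfer * transfers per clock * clock rate
        def peak_gb(width_bits, pumps, mhz):
            return width_bits / 8 * pumps * mhz * 1e6 / 1e9

        print(peak_gb(64, 2, 133))      # Athlon EV6, 133 MHz double-pumped  ~2.1 GB/s
        print(peak_gb(64, 4, 100))      # Pentium 4, 100 MHz quad-pumped     ~3.2 GB/s
        print(2 * peak_gb(64, 2, 133))  # nForce, two 64-bit DDR channels    ~4.3 GB/s (the quoted 4.2)

    So the headroom over the Athlon FSB is roughly 2.1 GB/s, which is what the graphics core and the chipset prefetcher get to share.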


    ----------
  • by artemis67 ( 93453 ) on Monday June 04, 2001 @09:25AM (#177707)
    It's all been caught on video, right here [chello.nl].
  • Nevermind my emulated Amiga is only on par with my real one. Soundless too.

    I miss Autoconfig(TM). :-\


  • Why not integrate the mono-testicular GF2MX core on all mobo chipsets?

    Pretend there could be more effective chips out in the future (or now).


  • At least NVidia is still innovating at a brisk pace.


  • Why do so many people think this is funny? NVidia releases new drivers more often than a typical person would ever want to download them. see here [3dchipset.com]. Version 12.40 came out about three weeks ago. 12.60 is already out. And the current Linux release is 12.51.

    ------
  • Actually, they released quite a bit of specs some time back, just not enough to use 3D effectively. They did release open-source 3D accelerated drivers (based on Mesa) at one point in the past, but they had to leave out some key technologies due to NDAs, and all they got was trash talk from the Linux community for about a year.

    But, yeah, obviously their 2D specs are available (XFree includes a 2D NVidia driver), and they did release those 3D specs that they could at one point. Those specs were for the TNT2 and below (GeForce didn't exist then), but due to NVidia's hardware design philosophy they should work with the GeForce line as well. Of course, you won't get the GeForce features, nor the NDA'd stuff. Oh well. Frankly, after the Voodoo 5 driver fiasco, I don't really think the open source community could write a better driver anyway. (BSD support would be nice, but I'm told that BSD hackers are on the verge of getting the Linux binaries to work with BSD anyway.)

    If you love writing your own graphics drivers, then, yeah, don't buy NVidia. Personally, I prefer to write higher-level code, and have the stuff below it do what it's supposed to do.

    ------

  • Key words: "seemed to be"

    ------
  • Look on that same page. Read the paragraph under the table. It says that each board can come with either on-board graphics or with an AGP slot, making for 16 total configurations.

    I know it will be easier to use an NVidia card with NVidia's mobo. What I'm saying is that it should be just as easy to use an ATI card with NVidia's mobo as it is to use an ATI card with any other mobo. As long as that is the case, no exploitation has occurred, as they have not made it *harder* to use other cards, they have just made it *easier* to use their own.

    ------

  • There is a big difference between having a monopoly (either horizontal or vertical) and exploiting it. I don't see it as a problem if NVidia makes it convenient to use their hardware together. I would see it as a problem, though, if they purposely made it inconvenient to use other hardware. However, as long as it is just as easy to use an ATI card with NVidia's mobo as it is with any other mobo, there's no problem.

    BTW, the GF2 on the NVidia motherboard is optional. You can buy the board without it, and choose any card you wish.

    ------

  • by Temporal ( 96070 ) on Monday June 04, 2001 @10:33AM (#177716) Journal
    It's not just their unified architecture that makes their drivers special. It's their OpenGL implementation.

    I see people making this mistake time and time again. Everyone seems to think that a graphics driver is just like any other driver. They think that the code for the driver is mostly useless for any card other than the one it is written for. With graphics drivers, that is not the case.

    In order for your driver to be OpenGL certified, it has to implement *all* of OpenGL. That includes T&L, NURBS, and much much more. NVidia has an *excellent* software implementation of T&L, and I'm sure the people who made the Kyro and Kyro II (neither of which have hardware T&L) would *love* to get ahold of it. You expect NVidia to just give that to you for free? Um, I don't think so.

    Motherboards are a different issue, and I don't really think NVidia would do something dumb like keep their motherboard specs closed. I suppose we'll find out, but their record with graphics cards says nothing about it.

    ------

  • by Temporal ( 96070 ) on Monday June 04, 2001 @12:34PM (#177717) Journal
    I mean, is there any other possible explanation? NVidia's history started as a small graphics company looking to overthrow the monstrous 3dfx, who at the time seemed to be years ahead of the competition. NVidia came out of nowhere, overtook them, and destroyed them, over but a few short years. Meanwhile, the veterans of the graphics industry tried and failed to beat NVidia's technology.

    Now, NVidia is just jumping right on in to the motherboard and sound processing markets. And from the looks of it, their very first product already blows the competition out of the water. How long has this competition been here? We're talking about *Intel* and *Creative*, for crying out loud! You don't just beat these guys overnight! What have they been doing all this time? Why is NVidia able to take them out so easily? Why can't ATI or Matrox touch the GeForce 3? Why is NVidia not afraid to spend Microsoft's money to develop Linux and Mac drivers? What is going on here? At this rate, NVidia will rule the world in five to ten years!

    Obviously, NVidia's technology is being provided by aliens. There just is no other explanation. We have to stop this conspiracy before it's too late!

    ------

  • And your point regarding the GPL was what? I can't see any connection in your post that connects GPL to the problem that isn't already a problem with public domain, good ol' copyright or IP laws in general. In short you blame GPL, when you should've blamed nvidia for using 3rd-party code with various different license limitations. Unless I got the whole thing wrong? What I gathered from the post was that you hate GPL for some reason, but not why.

    Unreal Memory Leak was funny though. What few people know, is that Unreal got a very sophisticated Crash-Probability-Drive(TM) too ;*)

    - Steeltoe
  • If every human being on this planet jealously `protects' `their' `IP' and knowledge, our society would grind to a screeching halt (this is hypothetical of course). Think about it for a second.

    Why should anyone inform their colleagues of what they know when it can give them an advantage not to? By keeping information to yourself, you may become invaluable to your company. Clearly, this has already become a problem in the IT industry. Why bother fixing bugs when you can get paid to do it later? Why write good documentation when you know the whole system and want contracts? Why teach the new guy when you can get "his" salary?

    This mindset is elitist, foolish and inefficient. It comes from an individualistic view that, if followed to its extreme, would dissolve society and its development. Note that I'm not saying that is a plausible scenario. I'm just saying these companies' battle over IP is not good for society as a whole. It's not good for us, the people. What is more important? Government-granted monopolies or free sharing of what can be shared with abundance? If you disagree on this, you could just as well argue for keeping the 3rd world illiterate and enslaving them: it would produce a lot of so-called wealth for us. All in all, this is a question of what you really value.

    Our society is not built from the mindset of many bits and pieces forming a big puzzle. It's built from the perspective of a puzzle where you try to put the best pieces in the correct place. Failing to understand this (I know I didn't explain it very well) will just alienate us from ourselves even more. Our society exists because we have created it as a concept and as an admirable one to contribute to. When we stop that, society will become anarchistic and violent. Violent may mean intellectually violent too.

    So what if today's companies have IP they think they must protect in order to survive. Nothing is cast in stone yet. We mold society to our needs. The need is there, and it'll just become greater and greater.

    Please note that code for device drivers is hardly great IP since it will only work with said hardware. I know we can't just dissolve IP laws overnight, I'm not proposing that either. Btw, you didn't answer my question.

    - Steeltoe
  • Haha, you're funny :-)

    I said what I said because I meant it. If public companies aren't beneficial, but are stealing mindshare from the public by inventing the wheel over and over again or having monopolies on the wheel, maybe the system should be revised? Nowadays, it seems like the stockmarket is crumbling at its core with the increase in greed and (dis)information throughput on the Internet. It was never designed for that, and a lot of it is basically illegal. But who cares about the stock as long as you make a quick buck? Hopefully.. Whenever someone wins, someone else loses, except when REAL value is being produced - but that takes time and effort. That was why the stockmarket was created in the first place - to pay people to find the "correct" price. Tell that to the thousands of newbies creating havoc, or the "analysts" that misinform the public to earn on their own stocks.

    Btw, I don't make a lot of karma and few people shout "amen Brother!" to what I post. Maybe I get 1 karma point per 20 posts or something. It's stupidly low because I happen to have a pretty controversial view as opposed to a few others on here. Just because we disagree or something doesn't mean I'm a troll or a zealot. I believe in change; you obviously believe in justifications and keeping the status quo.

    However, with the apathetic political climate we have today, what politician dares to really make a difference? I don't believe in revolution, but it seems like something's bound to happen if we don't get our act together soon.

    Btw, I started a political program on sourceforge. It's really nice, with some nifty ideas. I don't have any code yet, but that'll be coming soon. I just have to work full time first. Join me and help shape the future of politics! ;*)

    Oh yeah, and we should share everything we have. Even underwear!

    - Steeltoe
  • Tom's Hardware had a Micro ATX demo motherboard, but does a full ATX demo board for nvidia's new chipset exist?

    How does this board benchmark against the MSI K7-Master? Is the inverted PCI slot supposed to be a CNR riser?

    Yes (from Asus [digit-life.com]), dunno (but look here [inqst.com] for benchmarks), and I think so :-)

  • So now the PC platform has finally come full circle in its design...

    It's not a circle, it's a spiral - and this time round, we're a few rungs higher up the ladder (speaking as a die-hard - but still dead - ex-Amigan here).

    The Amiga was of course cool in its day, but 4x 8bit sound channels vs. 256x 16 bit 2D/3D channels into AC-3, 14 MB/s gfx RAM b/w vs. 4.2 GB/s, 1 Mpix/s blitter vs. 350 Mpix/s multitextured 3D + T&L... I think we've advanced a little :-)

  • Lookie here [inqst.com]!
  • Well, not too surprising, whenever a chip becomes ubiquitous (sp?), it becomes integrated into the core of the system. I remember the days when I couldn't run software, because my Mac LC II lacked a math coprocessor. As these sorts of applications became more common, we started getting integrated floating point.

    Now, as virtually every game requires a 3d card, why not just integrate the damn things into the motherboard? Strikes me that this more or less follows the traditional progression within the computer industry.

    So my question is, why the Hell do I still need a sound card?

    The only "intuitive" interface is the nipple. After that, it's all learned.

  • by Animats ( 122034 ) on Monday June 04, 2001 @09:21AM (#177725) Homepage
    This has some major implications not mentioned.

    First, Microsoft was expected to lose about $200 per unit on early Xbox units. With this NVidia announcement, there's a path to an NVidia/AMD Xbox implementation at a lower price point. This gives Microsoft much more leverage with Intel in bargaining for discounts on parts for the Xbox. Microsoft probably won't lose $200 per unit. Intel will get squeezed hard.

    Second, it makes it very clear that entertainment apps now drive the PC industry. More transistors are now devoted to 3D graphics and audio processing than to the main CPU. The PS2 had that balance too; all the innovation is in the graphics and media processing, while the main CPU was an old design. Now it's taking over. Business desktops will have full 3D graphics and 5-channel audio, even if they never use it.

    Third, it's now clear that one more generation and one-chip PCs will go mainstream. There have been one-chip PCs, but they were low performance. As the chip count goes down, motherboards get smaller. With this level of integration, there's little need for slots for plug-in boards. AGP goes away, and probably so does PCI. Add-ons will go on FireWire, USB, or Ethernet.

    Finally, this all leads to downsized packaging. The computer can fit in the base of a flatscreen, and will probably end up there.

    It's not clear why this new chip is GeForce 2, rather than GeForce 3 architecture. The XBox uses GeForce 3. It's probably a production bottleneck; NVidia was having trouble getting GeForce 3 parts out the door in volume. But NVidia will fix that.

  • That wouldn't be so bad in their current state. If nVidia was to continue releasing products every six months that make their old product look like yesterday's news, then I say bring it on. Of course if nVidia was going to rule the world, they probably wouldn't have to.
  • The reason the Kyros don't have T&L is that it is very difficult to implement with tile-based rendering (or so I've read, I couldn't say for sure). Some people have said it's impossible, but it's been said by Thompson that their next card will have T&L, so we'll see if that can stand up to the mighty GeForce 3. Not to harp on you though, I realize that I responded to another comment of yours too.
  • I sure hope you're using the latest 4-in-1 drivers from viahardware.com and that you've got the latest service packs for your presumably Microsoft-based OS.

    (clever insult, no?)
  • by SLi ( 132609 )
    Without a hitch? No way. While the XFree non-3D drivers work well, using the NVidia binary crap with my TNT2 card reduced maximum uptime to something like 2 days (well, maybe 3 if I didn't touch the computer at all). Until I decided I didn't need to stand that.

    Now I've had an ATI Radeon for a month or so, and have yet to experience a single crash. And the Radeon DRI drivers are still alpha code... Go ATI, I say!

  • I'm pretty sure that the whole board is using a unified driver, so they might not want to open it up after all... check socketa.com's review for more info.

    If they don't open the video driver, they can't open the video++ driver...
  • Considering PowerVR have already created hardware T&L for their chips that Sega use in their arcade machines, I wouldn't say it's impossible...
  • Intel doesn't seem to like competition lately on their own buses. That's why they revoked VIA's license to manufacture chips using the P6 bus. In one of their craftier maneuvers, VIA simply contracted the manufacturing out to National Semiconductor, who does have a P6 license and is less likely to be pushed around by Intel, which is why National's logo appears on VIA's chipsets now.
  • by ErikTheRed ( 162431 ) on Monday June 04, 2001 @11:17AM (#177738) Homepage
    This is the one angle of this story that I feel has not been sufficiently explored. This new chipset is a potential godsend for IT managers and workers. Why? When you troubleshoot a PC, one of the first things you have to look at are the video drivers, sound drivers, NIC driver, IDE driver, chipset driver, etc. NVidia is going to make sysadmins' lives much easier if this product is 1/3 as good as it looks. It's much easier to properly maintain PCs this way. This gives AMD a much bigger wedge to use to pry into the corporate market. Saving a few dollars per unit when buying PCs is nice, but anything that drops TCO (Total Cost of Ownership) gets major attention.
  • by SquadBoy ( 167263 ) on Monday June 04, 2001 @09:19AM (#177739) Homepage Journal
    The reason their video drivers are so closed is first because of various NDAs that they (Nvidia) had to sign to get bits of tech from other people, and also because one of their great strengths in the video market is the unified driver setup that they have. Whether you agree with them or not (I don't), it seems clear why they would think that keeping that info close to their chests would be a good thing. In the mobo chipset market it is a different story: these things need to be open, and everybody else has them open. Also, the entire history of the Linux drivers seems to indicate that they really did want to open them and just found themselves unable to do so. Comparing that with this is apples and oranges, and it seems obvious they will do the right thing here.
  • I might be taking this wrong, but it looks like this architecture might bridge the gap between console gaming and PC gaming. It's like the in-between. With all the audio and video integration, it looks like a nerd-souped-up-super-gaming-setup.
  • Well, prefetch is easy:

    Just remember that what's wanted next is either a 1 or a 0. So supply a 1. Hey, half the time you'll be right, and I'm sure it wouldn't matter the rest of the time. Why bother with prefetch? Cut the crap, use this for all data!

    Tom.

    P.S. It's 17:40 here, and it's a Monday. That might explain my point of view.

  • by OverCode@work ( 196386 ) <overcode@gmaCHEETAHil.com minus cat> on Monday June 04, 2001 @08:23AM (#177747) Homepage
    What, so now I have to sign an NDA to get chip programming manuals, if I can get them at all?

    No thanks... nVidia is one of the least-open companies out there, repeatedly denying requests for programming information. They finally did produce Linux drivers for their cards, but they are binary-only. How would you like binary-only kernel support for your CPU?

    Or maybe they would support Linux. It's a fairly lucrative market in the server area, at least. But based on their past attitude I wouldn't count on it.

    -John
  • It's not clear why this new chip is GeForce 2, rather than GeForce 3 architecture.

    <opinion>
    A production bottleneck is possible, or maybe the nFORCE people couldn't integrate the newer GeForce3 chip in time. However, I would not be surprised in the least if NVidia came out with a new version of their north bridge chip (the IGP) with a GeForce 3 core.

    In any case, you can always add your GeForce 3 card later on, since the chipset supports AGP. You probably won't get the extra speed boosts or the unified drivers, but at least you're not stuck with the older GPU.
    </opinion>

  • by ageitgey ( 216346 ) on Monday June 04, 2001 @08:39AM (#177751) Homepage
    nVIDIA is trying to be the next MS? Isn't everyone trying to be the next MS? You speak as if all companies aren't trying to take control of their respective markets.
  • Okay, I know nothing about hardware programming, but as impressed with the proposed architecture as I am, I can't help but wonder if it'll take a while before DirectX will be able to work with this, especially considering they're changing the way the processors manage graphics and audio. Does anybody know whether or not this thing will handle DirectX right away or with minor tweaks?

    If not, then I brought up that question to ask this one... maybe it'll give us Linux/BSD users a chance to NOT have to play catch-up to Microsoft on a new architecture?

  • by clinko ( 232501 ) on Monday June 04, 2001 @08:17AM (#177761) Journal
    That's awesome. Now I get to have outdated and nonexistent drivers for my motherboard from nvidia instead of just this goofy video card!

    (If you don't see the sarcasm in that statement I pity you)
  • by discovercomics ( 246851 ) on Monday June 04, 2001 @09:08AM (#177763) Homepage
    If you read all the way to the end of the article there is a link [tomshardware.com] to another article about this chipset. This article starts out with a little background history then moves on to details about the chipset.
    High bandwidth HyperTransport interface as connection to the IGP.
    Dual ATA100 IDE controller
    NVIDIA APU (Audio Processing Unit) sound device with a huge feature set, including Dolby Digital encoding for AC3-output.
    Full networking feature set, including FastEthernet 100/10 Mbit, HomePNA 2.0 (home phone line networking) and SoftModem
    Six concurrent USB-ports with 2 USB-hubs
    StreamThru, allowing high bandwidth and guaranteed real time memory access for all devices connected to the MCP, as typically required by video or audio broadcasting from disk, CD, DVD, LAN, WAN, IEEE1394 or when burning a CDROM.

    To address part of your question about DirectX directly, here is a link to the page [tomshardware.com] about the audio processing unit.

    Hardware DirectX8 audio processor (the first of its kind)
    Up to 256 different stereo voices, of which 192 are 2D-voices and 64 are 3D-voices
    Dolby Digital 5.1 Encoder
    DLS2 acceleration (Downloadable Sounds Level 2)
    32 bin mixer, with 8 voice volumes mixed to each bin (to get to 256)

    As it appears that this chipset is an outgrowth of the Xbox development I can imagine that integration with directx* will be pretty much a given.

    As far as earlier comments about drivers go, this second article mentions [tomshardware.com] that "A motherboard with nForce chipset will be installed with ONE driver"

    Obviously this type of chipset won't appeal to everyone, but hey, for me it sounds great. No more NIC, audio card, graphics card etc to buy. The only thing they missed was integrating 128 MB of system RAM into the mix..

  • The point is that the two channels aren't "crossbared", but the connection to them is.

    How to explain it ...

    uhm ... the crossbar allows the north bridge (IGP) to allocate memory in both channels at the same time, thereby reducing the latency, since it can write/read to/from them simultaneously.

    I hope that helps a bit.
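
    A crude way to picture the simultaneous part: some address bit decides which channel a cache line lives in, so a streaming access alternates between the two controllers and both can be busy at once. A toy sketch, assuming 64-byte lines and plain even/odd interleaving (the real arbitration is certainly smarter):

        LINE = 64  # assumed bytes per cache line

        def channel(addr):
            # even-numbered lines -> channel 0, odd-numbered lines -> channel 1
            return (addr // LINE) % 2

        for addr in range(0, 8 * LINE, LINE):
            print(hex(addr), "-> channel", channel(addr))

    Consecutive lines land on alternating channels, so back-to-back requests overlap instead of queueing behind one controller.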
  • I think this new architecture, and design mindset, could be a very nice addition to the PC market. I want the nicest multimedia computers to play my games on, but I don't really feel like shelling out money for the nicest gaming rig for features that won't be used for gaming. If they can reduce the overall cost of this mobo, and resulting PC parts for it that make it more like a PS2 or Xbox that operates like a PC, I'm going to buy one or two. I can still have my server for playing with hardcore computer stuff, but I won't have to spend a fortune to have a really nice gaming platform to run computer games on, surf the web, and view multimedia.

    And no, a PS2 or XBox is not like a computer with a keyboard, mouse, and expandability built into its design. (Although they are getting closer to this realization).

  • by Ayende Rahien ( 309542 ) on Monday June 04, 2001 @08:35AM (#177772)
    Sorry, can't do that.
    A kernel *has* to know about the hardware.
    And as the Linux kernel is GPLed, they can't release just the binaries.
    It would be a lot of work, though, to build the spec of the chip from the kernel's sources.

  • It's funny to see that a while ago nVidia was the underdog and 3dfx were the bad guys. All of a sudden nVidia has turned into the bad guy.
    Is this some kind of dilemma in today's computer industry? You can't be big (and make lots of $$$) and good at the same time?

    Anyway i hope ATI's TRUFORM will kick their butt a bit. Competition is a good thing.
  • Take a look at HardOCP [hardocp.com] for pictures of the new board (Abit MicroATX featuring NV22 and the Crush 12).

"How to make a million dollars: First, get a million dollars." -- Steve Martin

Working...