NVidia Vs. Intel: Fight To Come?
Mostly Monkey was the first to write to us regarding
a new article on Tom's Hardware. The article is about the launch by Nvidia of "...nForce, its integrated graphics, audio, Northbridge, and Southbridge chipset, also referred to as Crush." The implication is that Nvidia is setting itself up to move past the graphics/audio market and compete with Intel head-on in the chipset market. What with AMD's recent success, that doesn't sound so unlikely.
The new Amiga? (Score:3)
With all the talk about the new Amiga (software only), I'm more excited by this development.
Re:Another major beneficiary - AMD (Score:1)
Re:sarcasm (Score:1)
Release early/often works on the assumption that the user knows when to upgrade. Release rarely-and-when-it's-done works on the assumption that the user doesn't.
John
Re:Docs (Score:1)
--
Re:What's this to do with GPL? (Score:1)
--
Re:What's this to do with GPL? (Score:1)
So tell me, just to make it very clear, why did you buy an nVidia card for your Alpha machine knowing it didn't work? It's not like nVidia has drivers for any OS on Alpha (not just Linux), so why'd you bother? Or is this just rhetorical FUD? Go away, troll.
--
Re:Sounds great [Question, OT] (Score:1)
--
Sounds great (Score:2)
Sound is pretty much commodity stuff these days: just go down to your local hardware store, pick up an SB Live, and you're done. There are more powerful cards out there for people who need them, but otherwise an SB Live is perfectly sufficient. nVidia is just doing the logical thing by integrating a more powerful sound solution, thereby eliminating most of Sound Blaster's market.
Ethernet is pretty much commodity stuff as well. Despite all the no-name clones out there, it's best to just spend the extra couple of bucks and get yourself a real card: 3Com, Intel, whatever. This nVidia chip should really have a good 10/100 component too.
And video? Well, nVidia is already heavily in that market, with all the cards based on their chips, and it's best to leave this part alone: video cards are the most frequently performed (and most expensive) upgrade (how many times have you upgraded your sound card or network card, or even CPU?), so the stand-alone card is a cash cow.
The real threat to Intel in this chipset is of course the integration and elimination of the separate North/South Bridges and, most importantly, nVidia's alignment with AMD. It's all AMD this, AMD that, and DDR also features prominently. Good stuff.
--
New blood and forward thinking (Score:1)
The questions I have remaining concern I/O and busses. Sure, HyperTransport is fast between north and south... but how many PCI busses does the south bridge handle? How about PCI-X or 64-bit/66MHz PCI? FireWire? IrDA? I also wonder whether interleaving wouldn't have been better than the "neat" feature allowing different types/amounts of memory on each bus.
I am impressed by the feature list and eagerly await the first test drives by Tomshardware and others. This could get interesting.
_damnit_
Re:Docs (Score:1)
> Now I've had an ATI Radeon for a month or so, and have yet to experience a single crash. And the Radeon DRI drivers are still alpha code... Go ATI, I say!
Until you realize that they too are holding out... TV-Out, hardware MPEG2 decode... these things aren't available on your beloved ATI cards when used under Linux. Go Trident, I say! :-)
Re:Docs (Score:1)
Hmm, it would be nice if all modules worked that way. I had to add some modules for ip_sec to my firewall this weekend and tried doing it without recompiling everything. Sure enough, I had to patch the kernel and recompile and stuff. Kinda sucks on dual 50MHz SPARC procs.
Sounds like development code to me. If you're gonna run dev code, get a development machine. My firewall is an 80386DX/33 but all my compilation takes place on a dual Cel433 with 256M of memory. It takes the sting out, let me tell you. :-)
Re:Docs (Score:2)
Use kernel modules, then. Want to use a new feature? Just compile the module, and set up the appropriate config in modules.conf... I use ALSA sound modules to configure my sound card, for instance.
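For example, here's a minimal sketch of what that looks like for an SB Live (the module and alias names are placeholders - they vary with your ALSA version and card, so check your own docs):

    # /etc/modules.conf - hypothetical ALSA setup for an SB Live
    alias char-major-116 snd                # ALSA's native device major
    alias snd-card-0 snd-card-emu10k1       # bind card 0 to the EMU10K1 driver
    alias sound-slot-0 snd-card-0           # point OSS emulation at card 0

After that, modprobe pulls the whole stack in on first use - no kernel recompile required.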
The single unified driver that nVidia is promoting may help with compatibility issues, but it practically guarantees that this motherboard will be Linux-incompatible. nVidia has long since closed the GeForce drivers, and I bet that Dolby will have some reservations about opening up the APU. A partial doc/source release may satisfy GPL and kernel requirements, but the performance is likely to stink...
AGP 4x vs 6x (Score:2)
Re:It's still about speed (Score:1)
In parallel computing, a 'cross-bar' is where every node has a p2p link with every other node. Basically this is the optimal setup, as long as you don't count cost or complexity.
I'm trying to imagine how this idea could be applied to a memory bus, where the chipset communicates with RAM and that's it, and I can't. Oh well.
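To put a number on the "cost or complexity" bit: a full cross-bar between n nodes needs

    \binom{n}{2} = \frac{n(n-1)}{2} \text{ links}, \qquad \text{e.g. } \frac{8 \cdot 7}{2} = 28 \text{ links for just 8 nodes}

and that quadratic growth is exactly why real systems don't build them large.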
Re:It's still about speed (Score:4)
Hmm... what part of dual-bank DDR with h/w prefetch didn't you understand? That's double the memory bandwidth (4.2 GB/s) of any existing mobo, with a possibility for greatly reduced latency due to prefetch. And while the CPU still communicates with the chipset over a mere 1.6GB/s bus, remember that this leaves another 2.6GB/s for prefetches and DMA from the graphics chip and other components. Also, EV6 still has room for further speedups, and this will give AMD a reason to pursue them. I don't find their claim of 20% improvement unbelievable (I'll believe it when I see it, but it isn't unbelievable).
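(Rough arithmetic behind those figures, assuming each bank is 64-bit DDR clocked at 133MHz, i.e. 266MT/s - my assumption, since only the 4.2GB/s total is quoted:

    2 \times \frac{64}{8} \text{ B} \times 266 \text{ MT/s} \approx 4.26 \text{ GB/s}, \qquad 4.2 - 1.6 = 2.6 \text{ GB/s}

left over for prefetch and DMA.)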
I have no idea where the "cross-bar" part of the memory system comes into play... The idea doesn't even make sense. You have a chipset and two banks of DDR. That's two point-to-point links, for which the term cross-bar seems ill-fitting. Anyway...
Also, they've got a demo board using only four layers! The number of layers has a big impact on board cost, and having only four is very good. And according to the article, a number of major board manufacturers have made commitments. So it looks like this thing will see the light of day.
Multimedia Communications Processor, eh? (Score:2)
END OF LINE.
Re:My thoughts... (Score:2)
Not that I really disagree with your post, but how often do CPU sockets change again? .. about once per generation? .. yup. Can you plug an Athlon into a PII slot? .. nope. CPU sockets are a really weak example of 'expandability'.
(Even within a generation, my Supermicro mobo won't take PIIs (it's SMP) above 5(6?)00 MHz.. they didn't exist at the time. I'm currently at 350, a bargain even when I bought them .. how much exactly am I gaining there?)
That aside, provided everything works, all-in-one chipsets are cheaper.
It's not like nVidia is preventing you from sticking a Matrox card in your box.. yet :)
Re:sarcasm (Score:2)
Re:Docs (Score:2)
However, they're the newcomer to a very cutthroat market, and very soon they'll have Intel and VIA (and probably SiS and co. too) breathing down their necks.
If they keep the programming info close to their chests, they'll be limiting the compatibility of their board. And any limiting factor these days is a Bad Thing (TM).
A good dose of openness wouldn't hurt the adoption of the board early on, and may even become a deciding factor in its continued survival in the marketplace once all its competitors wake up and produce a more open answer.
One can but hope.
Malk
Re:Docs (Score:2)
Though, I very much hope that this doesn't come to pass.
Malk
Re:question about drivers... (Score:2)
This, I'd assume, is because a lot of this is X-Box spinoff tech.
If I understand the article fully, it'll be DirectX compliant from the word go.
Malk
New Blood (Score:4)
With the Intel/VIA etc. wars, it just seemed to be a war to see who could produce the same solution at lower cost. Very little new actually happened.
When NVIDIA talked about their memory specs in the GeForce3, I was wondering how long it would take for the powers that be in the industry to have the same kind of forward thinking on motherboards.
Now it seems that it took NVIDIA to jump in with both feet to wake everyone up.
Well, succeed or fail in this new venture, they've certainly given everyone a lot to think about, and I'll definitely be putting money aside to buy one as soon as I hear they're settled in and stable.
It'll be fun to see where this race of new ideas ends.
Long live innovation, competition, and hats off to NVIDIA.
Malk
Re:nVidia sending mixed messages? (Score:4)
"We don't have an Intel bus license"
From this, I assume that Intel are trying to do what IBM did with the MCA architecture over a decade ago.
They're forcing developers into restrictive licenses, and possibly expensive ones, to use an architecture.
This was a really bad idea, as IBM found when nobody used MCA, and instead went for the inferior EISA bus, which didn't have such heavy restrictions.
I can't blame NVIDIA for trying to get their dedicated graphics cards looking and performing to the best possible standard (as they already have optimisations for the PIII and AMD processors), and it's only logical for them to optimise for P4 if they can. And they have.
However, this new chipset is simply saying that AMD are more open, and easier to develop with, rather than having Intel breathing down their neck with licensing issues at every step of the way.
My view on it is that AMD just gave NVIDIA the license to innovate properly, and good things are now happening.
Malk
Get your PS2 kbd, mouse, ethernet and LCD add-ons (Score:1)
correct link? (Score:2)
I didn't have much luck with the original link.
here's [tomshardware.com] another one that seems to work better.
Grabbing new ground or holding old ground. (Score:1)
This is a defensive move, pure and simple. If NVidia doesn't get its hooks firmly into the integrated chipset market, it will see its share of the total graphics chip market dwindle as a larger and larger percentage of new motherboards/systems are built with integrated chipsets.
In a similar vein, Intel's presence in the graphics chip market is itself defensive. Price/performance-wise, graphics chips do more for graphics than general-purpose CPUs. That is to say, spending 50 bucks on a 3D chip will do more for graphics performance than spending 50 bucks to get a Pentium IV 1433MHz instead of a Pentium IV 1375MHz. Therefore, each graphics accelerator sold potentially means that people buy a cheaper CPU, which eats into Intel's revenue. Selling graphics chips helps them hold on to some of that revenue.
Shades of the Amiga (Score:2)
So now the PC platform has finally come full circle in its design and reached the point the Amiga was at in 1987.
My thoughts... (Score:2)
E.g. if my systemboard blows, I can buy a new one and salvage the video card from the blown system. However, if the component is integrated into the systemboard, I lose this ability.
I'd be hard-pressed to buy or recommend something like this for personal use. Perhaps computer labs would be interested (where there are 200 identical computers and any problem system is swapped for a new one). In that respect I do see the advantage (drivers, integration, etc).
It's cool now, but five years from now? I just don't like throwing things away because they are obsolete. Even if the system board is current, the video card may become useless - install a "full" video card into the AGP slot and the onboard chipset becomes a useless waste of space.
Why not focus on improving memory access or AGP speed? Why not further communication between the graphic chipset and the CPU instead of developing these (IMHO) silly integrated solutions?
Bah.
---
Computer Science: solving today's problems tomorrow.
Re:Why its a GF2 (Score:2)
Look at history, though. I think you'll see that NVIDIA won because it produced better stuff in a tighter development cycle than its competitors. And its competitors still exist. I think ATI has a lot of strength in the TV-video area and is leveraging that strength in the 3D arena. Even though NVIDIA bought up 3DFX, that doesn't leave the entire market to them (although they do have a large chunk).
If I were NVIDIA, I'd be worried about the DOJ and whether it wants to attack their monopoly.
---
Computer Science: solving today's problems tomorrow.
Re:My thoughts... (Score:2)
> have very-much-higher-than AGP access speeds
> with an on-die solution; this should give you a
> better idea of where the future will be.
>
> At a guess, for example, Hammer or some near-
> future version will have memory controllers on
> die. I expect integration to be a strong trend,
> because we've gotten to the point where the
> current bottlenecks are the communication
> channels between parts.
Obviously, if the GPU is directly tied to the motherboard you get a lot of performance wins. However, when I said "focus on improving memory access and AGP speed", I meant that they should keep the expandability option open, not forgetting that things do break and don't always work as expected. Maybe something like a GPU socket, where you have a CPU socket and a GPU socket? I think that could satisfy my "expandability" criterion.
From the press release:
PR> Designed to deliver the ultimate multimedia
PR> experience, nForce is the architectural
PR> foundation for Microsoft's highly-anticipated
PR> Xbox game console. In addition, NVIDIA has
PR> secured design wins with the industry's top
PR> PC ODMs, including ASUS, MSI, GIGABYTE,
PR> Mitac, and Abit; and expects to announce many
PR> other OEM design wins by the winter of 2001.
The good thing is that integrated hardware developers (things like the X-Box) will be able to bypass the AGP tech and have an onboard video card. Cool. However, the other ODMs mainly push consumer/business PC mobos, not integrated solutions. Granted, there is a place for the "all-in-one" motherboard, but my bias is towards the techy end-user (e.g. the guy or gal who enjoys building their own systems from bits and pieces).
I hold firm that integrating the video card onto the motherboard is, for the most part, "silly". Judging by the specs, there is more power here than the average desktop workstation needs (or can justify the cost of) compared to a non-integrated solution. Where I work, most users get an ATI Rage IIe. It doesn't hold much for 3D, but it does fine for 2D, which is what most businesses use. Businesses or users that need real 3D power would probably be disappointed by the specs.
My opinion: Too much power for the average worker, too little power for those that need it.
Now, if you can put the CPU, videocard, soundcard, and ethernet adaptor all on the main processor... well.. I hope you see why I think it can get a little "silly."
---
Computer Science: solving today's problems tomorrow.
Re:Why its a GF2 (Score:2)
> is it open source? I know about the NDA
> problems, but why does that prevent them from
> releasing a lower performance open source
> driver? Answer: Because it's not in their
> commercial interests while they have the best
> performance on Linux why let that good
> performance get back into the community where
> other vendors might take advantage of it?
I suffer from this problem. I have never had a system that could run X (XFree86) properly; 99% of the problem was not having proper drivers. On my 486, I had some no-name Trident that could barely pull off 72Hz at 800x600. I lived with Linux on the command line and booted to Windows for GUI stuff. On my newer comps, I went through another Trident AGP card, a TNT, a GF1, a GF2, and now a GF3, and I still get a shitty display.
Now that 3Dfx is dead, why can't they release their friggin drivers? Arguh.
> Who knows what the long term holds in this
> industry - we might all be using OS/2 on PPCs
> in 5 years!!
Hey, that sounds sweet.
---
Computer Science: solving today's problems tomorrow.
Re:What's this to do with GPL? (Score:2)
If nVidia can't release third-party source code, then they can't BSDL it, put it in the public domain, or GPL it. It's all off-limits.
So how can the GPL be to blame?
G13 (Score:2)
Summations are our friends! (Score:2)
Also, I personally really like AnandTech, so here's a link to their detailed article: http://www.anandtech.com/showdoc.html?i=1484 [anandtech.com]
(And no, I'm not a 'karma whore'.)
Re:Why have the chipset to data prefetch? (Score:1)
Re:Why have the chipset to data prefetch? (Score:1)
Why have the chipset to data prefetch? (Score:3)
Re:MS Next? (Score:1)
MS is one of NVidia's biggest supporters... there's a reason just about every nVidia request makes it into the next version of DirectX. On top of that, figure that all of this hardware is made especially to fit perfectly within the DirectX specs and functionality, to provide the best DirectX support out there. Ditching MS in the PC market is something akin to stopping breathing. Sounds like something to try, but give it a few minutes and you'll regret it.
Re:great... (monopoly a la Intel, not Microsoft) (Score:1)
Intel is also fairly brutal on the business side of things, but they consistently produce high quality products. NVIDIA got to where they are today by making top-notch hardware, combined with aggressive marketing. I don't see that changing any time soon. When you have a formula that works, stick to it.
-Mark
Re:Why have the chipset to data prefetch? (Score:2)
Honestly, I'd believe the 20% performance improvement. Sounds about right for removing memory latency from non-random access. Gotta keep those three pipes in the Athlon filled!!
Re:Why its a GF2 (Score:2)
It will never be as easy to use an ATI card with an nVidia mobo as it is to use an nVidia card, because of the driver issue. You plug in a GF3 and it will just work - the existing drivers will just pick it up and use it. You are effectively getting your GF3 drivers bundled with the mobo driver package. You plug an ATI card in and your first experience is the good old 640x480 resolution in Windows (assuming we are talking mass market here - Linux people tend to be cool with figuring out drivers). You can also get weird interactions - ATI end up having to devote time to making their drivers work with nVidia's mobos, while nVidia don't have to care about ATI (remember the nVidia/VIA/AGP4x thing a while back?)
Whatever happens, if nVidia gets a monopoly (very doubtful really) then there is little chance of them not exploiting it, by the simple fact that it exists. You can't fault a company for making sure that its own products work well with each other!!
Re:Why its a GF2 (Score:2)
Look at the Linux driver - sure it's legal, but is it open source? I know about the NDA problems, but why does that prevent them from releasing a lower-performance open source driver? Answer: because it's not in their commercial interests. While they have the best performance on Linux, why let that good performance get back into the community, where other vendors might take advantage of it?
nVidia, just like Microsoft, won out of sheer bloody-minded persistence. The NV1 failed badly, so nVidia sided with Microsoft and DirectX, and had cards supporting the next version of DX on the shelves as each version was released. If you wanted T&L with your DX7 games, you had to get nVidia. If you got anything else, you were SOL. Now that nVidia is king of the hill, they can afford to take on the bigger challenges - kinda like MS shrugging off IBM and going it alone.
nVidia is one company that I would be very careful of in the future. I think they have a big potential to be the next Microsoft, but don't have a monopoly to speak of at the moment and probably won't for a few years yet (ATI has to roll over first which ain't gonna happen easily). The battle is heating up at the moment and the consumers are going to be the winners in the short term. Who knows what the long term holds in this industry - we might all be using OS/2 on PPCs in 5 years!!
Re:Why have the chipset to data prefetch? (Score:2)
Re:Why its a GF2 (Score:2)
You are also right about the exploitation. Thanks for setting me straight.
Re:Why have the chipset to data prefetch? (Score:4)
Remember that the CPU talks to the chipset at 1066MB/s, but the chipset has 4200MB/s to RAM.
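Taking those two figures at face value, a quick subtraction shows how much headroom the chipset can burn on speculative prefetches without the CPU ever noticing:

    4200 \text{ MB/s} - 1066 \text{ MB/s} \approx 3100 \text{ MB/s}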
Why its a GF2 (Score:4)
Remember the GF3 driver is likely to be included in the unified driver they are going to be shipping, but if you go with the competitor then you have to worry about driver issues. Looks like nVidia is set to become the Microsoft of the chipset/hardware industry!
Re:great... (Score:1)
Its about time.. (Score:2)
The VIA chipset (and USB controller) on my board (Asus A7Pro) is the crappiest, most unstable piece of junk I've ever used - and it's not even overclocked.
I just hope NVidia does a better job of delivering a chipset that works properly than VIA has.
Re:Its about time.. (Score:2)
In attempting to make my fucking VIA USB controller work, I flashed the BIOS on my mobo using the ASUS-supplied utility.
Upon reboot, Win2k claims the mobo is no longer ACPI-compliant and BSODs, leaving me unable to:
a) flash the BIOS back to the old version, and
b) access the files on my NTFS drive.
Linux, however, boots and runs quite happily on the same BIOS, but the onboard USB controller is still fucking horked no matter which OS I try to use it under.
A third-party USB card seems to work OK, but now all my slots are full, my machine crashes a lot (even under Linux), and I paid good money for a worthless fucking USB controller and a shoddy chipset.
I am not a happy camper.
Intel need not worry (Score:1)
Grip
nVidia sending mixed messages? (Score:3)
Re:Docs (Score:1)
You're so right!... (Score:1)
not sure it's 100% unique on Intel, but also... (Score:2)
--LP
Includes a *new* feature for Intel systems (Score:4)
I've never seen this capability on the Intel platform before. It was a cutting-edge SGI feature back in 1996 or so. Sun/HP/IBM/Intel were never concerned enough about video to implement this on their workstations, but I do recall some discussion about whether it'd be in Infiniband or not (don't remember the outcome).
--LP
Re:Implications (Score:2)
Re:Why have the chipset to data prefetch? (Score:1)
-prator
Re:Implications (Score:2)
-----------------------
Re:Why have the chipset to data prefetch? (Score:2)
The Pentium 4 data transfer rate is even higher, at 64/8 * 4 * 100 = 3.2GB/s.
So nForce, at 4.2GB/s, doesn't really have that much extra bandwidth to play with, especially since it's sharing the RAM bandwidth with the graphics controller (although the GC doesn't really use up that much of the bandwidth).
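For comparison, run the Athlon's EV6 bus (64-bit, double-pumped at 100MHz - consistent with the 1.6GB/s figure quoted elsewhere in this discussion) through the same formula:

    \frac{64}{8} \times 2 \times 100 \text{ MHz} = 1.6 \text{ GB/s}

So the nForce's spare RAM bandwidth matters far more on the AMD side than it would against a P4.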
----------
Fight, nVidia vs. Intel's minions (Score:3)
Re:Shades of the Amiga (Score:1)
I miss Autoconfig(TM). :-\
Re:Who remembers math coprocessors? (Score:1)
Pretend there could be more effective chips out in the future (or now).
Re:great... (Score:1)
Re:sarcasm (Score:1)
------
Re:Docs (Score:1)
But, yeah, obviously their 2D specs are available (XFree includes a 2D NVidia driver), and they did release those 3D specs that they could at one point. Those specs were for the TNT2 and below (GeForce didn't exist then), but due to NVidia's hardware design philosophy they should work with the GeForce line as well. Of course, you won't get the GeForce features, nor the NDA'd stuff. Oh well. Frankly, after the Voodoo 5 driver fiasco, I don't really think the open source community could write a better driver anyway. (BSD support would be nice, but I'm told that BSD hackers are on the verge of getting the Linux binaries to work with BSD anyway.)
If you love writing your own graphics drivers, then, yeah, don't buy NVidia. Personally, I prefer to write higher-level code, and have the stuff below it do what it's supposed to do.
------
Re:They get their technology from aliens (Score:1)
------
Re:Why its a GF2 (Score:1)
I know it will be easier to use an NVidia card with NVidia's mobo. What I'm saying is that it should be just as easy to use an ATI card with NVidia's mobo as it is to use an ATI card with any other mobo. As long as that is the case, no exploitation has occurred, as they have not made it *harder* to use other cards, they have just made it *easier* to use their own.
------
Re:Why its a GF2 (Score:2)
BTW, the GF2 on the NVidia motherboard is optional. You can buy the board without it, and choose any card you wish.
------
Re:Docs (Score:5)
I see people making this mistake time and time again. Everyone seems to think that a graphics driver is just like any other driver. They think that the code for the driver is mostly useless for any card other than the one it is written for. With graphics drivers, that is not the case.
In order for your driver to be OpenGL certified, it has to implement *all* of OpenGL. That includes T&L, NURBS, and much, much more. NVidia has an *excellent* software implementation of T&L, and I'm sure the people who made the Kyro and Kyro II (neither of which has hardware T&L) would *love* to get ahold of it. You expect NVidia to just give that to you for free? Um, I don't think so.
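To make that concrete, here's a minimal sketch (all names hypothetical - this is nothing like NVidia's actual code) of the kind of fallback structure the OpenGL spec forces into a driver:

    /* One entry point, two paths: the app never knows whether
     * transform & lighting ran on the GPU or on the CPU. */
    #include <stddef.h>

    typedef struct { float x, y, z, w; } vertex4;

    struct gfx_card {
        int has_hw_tnl;     /* GeForce: 1; TNT2, Kyro: 0 */
        void (*hw_transform)(vertex4 *v, size_t n, const float m[16]);
    };

    /* The crown jewels: a tuned software transform path.
     * (Column-major 4x4 matrix, as OpenGL uses.) */
    static void sw_transform(vertex4 *v, size_t n, const float m[16])
    {
        for (size_t i = 0; i < n; i++) {
            vertex4 in = v[i];
            v[i].x = m[0]*in.x + m[4]*in.y + m[8]*in.z  + m[12]*in.w;
            v[i].y = m[1]*in.x + m[5]*in.y + m[9]*in.z  + m[13]*in.w;
            v[i].z = m[2]*in.x + m[6]*in.y + m[10]*in.z + m[14]*in.w;
            v[i].w = m[3]*in.x + m[7]*in.y + m[11]*in.z + m[15]*in.w;
        }
    }

    /* The driver picks the path; the GL API above never changes. */
    void drv_transform(struct gfx_card *card, vertex4 *v, size_t n,
                       const float m[16])
    {
        if (card->has_hw_tnl)
            card->hw_transform(v, n, m);
        else
            sw_transform(v, n, m);
    }

A fast sw_transform() is exactly the sort of thing a T&L-less competitor would love to lift from an open driver.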
Motherboards are a different issue, and I don't really think NVidia would do something dumb like keep their motherboard specs closed. I suppose we'll find out, but their record with graphics cards says nothing about it.
------
They get their technology from aliens (Score:5)
Now, NVidia is just jumping right on in to the motherboard and sound processing markets. And from the looks of it, their very first product already blows the competition out of the water. How long has this competition been here? We're talking about *Intel* and *Creative*, for crying out loud! You don't just beat these guys overnight! What have they been doing all this time? Why is NVidia able to take them out so easily? Why can't ATI or Matrox touch the GeForce 3? Why is NVidia not afraid to spend Microsoft's money to develop Linux and Mac drivers? What is going on here? At this rate, NVidia will rule the world in five to ten years!
Obviously, NVidia's technology is being provided by aliens. There just is no other explanation. We have to stop this conspiracy before it's too late!
------
What's this to do with GPL? (Score:1)
The Unreal memory leak was funny though. What few people know is that Unreal got a very sophisticated Crash-Probability-Drive(TM) too.
- Steeltoe
Re:What's this to do with GPL? (Score:1)
Why should anyone inform their colleagues of what they know when it can give you an advantage not to? By keeping information to yourself, you may become invaluable to your company. Clearly, this has already become a problem in the IT industry. Why bother fixing bugs when you can get paid to do it later? Why write good documentation when you know the whole system and want contracts? Why teach the new guy when you can get "his" salary?
This mindset is elitist, foolish and inefficient. It comes from an individualistic view that, if followed to its extreme, would dissolve society and its development. Note that I'm not saying that's a plausible scenario. I'm just saying these companies' battle over IP is not good for society as a whole. It's not good for us, the people. What is more important: government-granted monopolies, or free sharing of what can be shared in abundance? If you disagree on this, you could just as well argue for keeping the 3rd world illiterate and enslaving them: it would produce a lot of so-called wealth for us. All in all, this is a question of what you really value.
Our society is not built from the mindset of many bits and pieces happening to form a big puzzle; it's built from the perspective of a puzzle where you try to put the best pieces in the correct places. Failing to understand this (I know I didn't explain it very well) will just alienate us from ourselves even more. Our society exists because we have created it as a concept, and as an admirable one to contribute to. When we stop that, society will become anarchistic and violent. Violent may mean intellectually violent, too.
So what if today's companies have IP they think they must protect in order to survive? Nothing is cast in stone yet. We mold society to our needs. The need is there, and it'll just become greater and greater.
Please note that code for device drivers is hardly great IP, since it will only work with said hardware. I know we can't just dissolve IP laws overnight; I'm not proposing that, either. Btw, you didn't answer my question.
- Steeltoe
Re:What's this to do with GPL? (Score:1)
I said what I said because I meant it. If public companies aren't beneficial, but are stealing mindshare from the public by reinventing the wheel over and over again or holding monopolies on the wheel, maybe the system should be revised? Nowadays, it seems like the stock market is crumbling at its core with the increase in greed and (dis)information throughput on the Internet. It was never designed for that, and a lot of it is basically illegal. But who cares about the stock as long as you make a quick buck? Hopefully... Whenever someone wins, someone else loses, except when REAL value is being produced - but that takes time and effort. That was why the stock market was created in the first place: to pay people to find the "correct" price. Tell that to the thousands of newbies creating havoc, or the "analysts" who misinform the public to profit on their own stocks.
Btw, I don't make a lot of karma, and few people shout "amen, Brother!" at what I post. Maybe I get 1 karma point per 20 posts or something. It's stupidly low because I happen to have a pretty controversial view, as opposed to a few others on here. Just because we disagree doesn't mean I'm a troll or a zealot. I believe in change; you obviously believe in justifications and keeping the status quo.
However, with the apathetic political climate we have today, what politician dares to really make a difference? I don't believe in revolution, but it seems like something's bound to happen if we don't get our act together soon.
Btw, I started a political program on sourceforge. It's really nice, with some nifty ideas. I don't have any code yet, but that'll be coming soon. I just have to work full time first. Join me and help shape the future of politics!
Oh yeah, and we should share everything we have. Even underwear!
- Steeltoe
Re:Is there an ATX demo board anywhere? (Score:1)
Yes (from Asus [digit-life.com]), dunno (but look here [inqst.com] for benchmarks), and I think so :-)
Re:Shades of the Amiga (Score:1)
It's not a circle, it's a spiral - and this time round, we're a few rungs higher up the ladder (speaking as a die-hard - but still dead - ex-Amigan here).
The Amiga was of course cool in its day, but 4x 8-bit sound channels vs. 256x 16-bit 2D/3D channels into AC-3, 14 MB/s gfx RAM bandwidth vs. 4.2 GB/s, a 1 Mpix/s blitter vs. 350 Mpix/s multitextured 3D + T&L... I think we've advanced a little :-)
Benchmarks! (well, some) (Score:1)
Who remembers math coprocessors? (Score:2)
Now, as virtually every game requires a 3d card, why not just integrate the damn things into the motherboard? Strikes me that this more or less follows the traditional progression within the computer industry.
So my question is, why the Hell do I still need a sound card?
The only "intuitive" interface is the nipple. After that, it's all learned.
Implications (Score:4)
First, Microsoft was expected to lose about $200 per unit on early Xbox units. With this NVidia announcement, there's a path to an NVidia/AMD Xbox implementation at a lower price point. This gives Microsoft much more leverage with Intel in bargaining for discounts on parts for the Xbox. Microsoft probably won't lose $200 per unit. Intel will get squeezed hard.
Second, it makes it very clear that entertainment apps now drive the PC industry. More transistors are now devoted to 3D graphics and audio processing than to the main CPU. The PS2 had that balance too; all the innovation was in the graphics and media processing, while the main CPU was an old design. Now that approach is taking over. Business desktops will have full 3D graphics and 5-channel audio, even if they never use it.
Third, it's now clear that one more generation and one-chip PCs will go mainstream. There have been one-chip PCs, but they were low performance. As the chip count goes down, motherboards get smaller. With this level of integration, there's little need for slots for plug-in boards. AGP goes away, and probably so does PCI. Add-ons will go on FireWire, USB, or Ethernet.
Finally, this all leads to downsized packaging. The computer can fit in the base of a flatscreen, and will probably end up there.
It's not clear why this new chip is GeForce 2, rather than GeForce 3 architecture. The XBox uses GeForce 3. It's probably a production bottleneck; NVidia was having trouble getting GeForce 3 parts out the door in volume. But NVidia will fix that.
Re:They get their technology from aliens (Score:1)
Re:Docs (Score:1)
Re:Its about time.. (Score:2)
(clever insult, no?)
Re:Docs (Score:1)
Now I've had an ATI Radeon for a month or so, and have yet to experience a single crash. And the Radeon DRI drivers are still alpha code... Go ATI, I say!
Re:Docs (Score:2)
If they don't open the video driver, they can't open the video++ driver...
Re:Docs (Score:1)
Re:nVidia sending mixed messages? (Score:2)
Another major beneficiary - AMD (Score:5)
Re:Docs (Score:4)
Bridging the gap? (Score:2)
Re:Why have the chipset to data prefetch? (Score:2)
Just remember that what's wanted next is either a 1 or a 0. So supply a 1. Hey, half the time you'll be right, and I'm sure it wouldn't matter the rest of the time. Why bother with prefetch? Cut the crap, use this for all data!
Tom.
P.S. It's 17:40 here, and it's a Monday. That might explain my point of view.
Docs (Score:5)
No thanks... nVidia is one of the least-open companies out there, repeatedly denying requests for programming information. They finally did produce Linux drivers for their cards, but they are binary-only. How would you like binary-only kernel support for your CPU?
Or maybe they would support Linux. It's a fairly lucrative market in the server area, at least. But based on their past attitude I wouldn't count on it.
-John
Re:Implications (Score:2)
<opinion>
A production bottleneck is possible, or maybe the nForce people couldn't integrate the newer GeForce3 chip in time. However, I would not be surprised in the least if NVidia came out with a new version of their north bridge chip (the IGP) with a GeForce 3 core.
In any case, you can always add your GeForce 3 card later on, since the chipset supports AGP. You probably won't get the extra speed boosts or the unified drivers, but at least you're not stuck with the older GPU.
</opinion>
Re:great... (Score:5)
question about drivers... (Score:2)
If not, then I brought up that question to ask this one... maybe it'll give us Linux/BSD users a chance to NOT have to play catch-up to Microsoft on a new architecture?
sarcasm (Score:5)
(If you don't see the sarcasm in that statement, I pity you.)
Re:question about drivers... (Score:4)
To address part of your question about DirectX directly, here is a link to the page [tomshardware.com] about the audio processing unit.
As this chipset appears to be an outgrowth of the Xbox development, I imagine integration with DirectX will be pretty much a given.
As far as earlier comments about drivers go, this second article mentions [tomshardware.com] that "A motherboard with nForce chipset will be installed with ONE driver".
Obviously this type of chipset won't appeal to everyone, but hey, for me it sounds great. No more NIC, audio card, graphics card, etc. to buy. The only thing they missed was integrating 128 MB of system RAM into the mix...
Re:It's still about speed (Score:2)
How to explain it
uhm
I hope that helps a bit.
Apart from the technical... (Score:2)
And no, a PS2 or XBox is not like a computer with a keyboard, mouse, and expandability built into its design. (Although they are getting closer to this realization).
Re:Docs (Score:3)
A kernel *has* to know about the hardware.
And as the Linux kernel is GPLed, they can't release just the binaries.
It would be a lot of work, though, to build the spec of the chip from the kernel's sources.
nvidia joins the dark side. (Score:2)
It's funny to see that a while ago nVidia was the underdog and 3dfx were the bad guys. All of a sudden nVidia has turned into the bad guy.
Is this some kind of dilemma in today's computer industry? You can't be big (and make lots of $$$) and good at the same time?
Anyway, I hope ATI's TRUFORM will kick their butt a bit. Competition is a good thing.
pictures of the nvidia board (Score:2)