

nVidia nForce 120
cygnus writes: "Hardcore tech upstart site ExtremeTech.com got to check out an nVidia reference board, their findings are here. While most of the drivers and hardware were beta, they gave it some positive play. The story has a few large photos that are worth a gander... The reference board has only two PCI slots and no onboard Ethernet. Ouch. I'll stick with my Mac for now."
no onboard ethernet? (Score:2)
Re:Maybe (Score:2)
Option "NvAGP"
line in XF86Config. My card crashed after a few seconds of OpenGL until I set this to "2". Now it still crashes, but it takes more like half an hour than a few seconds.
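For context, here is a minimal sketch of where that option lives (the Identifier is a placeholder; as commonly documented for NVIDIA's closed XFree86 driver, "0" disables AGP, "1" uses NVIDIA's internal AGP code, and "2" uses the kernel's agpgart):

```
Section "Device"
    Identifier "Card0"        # placeholder name
    Driver     "nvidia"
    # 0 = no AGP, 1 = NVIDIA internal AGP, 2 = kernel agpgart
    Option     "NvAGP" "2"
EndSection
```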
The road to closed PC hardware? (Score:2)
What they then go on to insinuate, though, is that nVidia is expanding into yet another realm of the PC hardware industry, which is, frankly, a scary thought to me. Not content with having a monopoly on the video card market, they are now going after the logic set market, intending to dethrone Intel in the biggest market of them all: low-cost motherboard chipsets.
What I'm really worried about here is the PC architecture turning into another closed, proprietary, overpriced piece of junk that the Apple has always been. When they start making video cards that are only "compatible" with nVidia chipsets, you will find out what I'm talking about. This is just the beginning of the market consolidation for nVidia. First 3DFX and now this. You can already tell that they are against open standards when they refuse to release open source drivers for their newest cards so that they can be supported under Linux and other free operating systems. What do they have to hide, anyway?
While no one on Slashdot may be interested in buying a motherboard manufactured with this chipset, there will be a lot of people who will. While one may even get decent performance out of the chipset, perhaps even better than the Intel equivalent, one must consider: is it really the best choice that I can make in the long run? My personal opinion is that the answer is no, it ain't worth it.
--
Screenshots? (Score:2)
Sigi
Incorrectness in the article (Score:1)
If by 2, you mean 5.... (Score:4)
Re:How the hell is that a troll? (Score:1)
Re:Why exactly (Score:1)
Re:This bodes well for AMD (Score:1)
I'm sure after Rambus, Intel will screen its clients/partners much more carefully, and nVidia's current business practices leave MUCH to be desired. Just ask Hercules.
Memory interleaving (Score:5)
And to think, my 386 had interleaved memory back in the day... I was 'leet and didn't even know it.
AMD in good position to take 30% (Score:5)
nForce and SiS735 the future...? (Score:3)
This puts it in a strange position in the market. The chipset is very powerful, yet the graphics will be decidedly average when the chipset is finally being sold on the market. It beats all the other integrated systems out there (the audio system is to die for, beating the SBLive! into a cocked hat) by a massive margin, but they only cost like $20 a pop anyway!
The SiS735 looks to be the other chipset to look out for, beating the VIA and AMD solutions at the moment, and being a much cheaper single chip solution.
Both chipsets incorporate next generation interconnects - nForce uses an 800MB/s Hyperlink connection, and the SiS uses a 1.2GB/s multi-threaded connection (possible due to the single chip design).
VIA will soon be releasing their second shot at a DDR chipset for the Athlon, called the KT266Pro, without the problems the first one had and with an improved DDR memory interface.
Whatever, the future is looking great for the Athlon in terms of chipset support, which now covers the entire low to medium end of the market (up to workstations and low-medium end servers with the 760MP).
But why support Athlon first? (Score:4)
If they intend to take on Intel's i810 and i815 successes, then why do it with a chipset for an AMD CPU instead of one for Intel's? Sure, AMD is getting bigger and bigger, but I haven't seen any marketshare survey give them more than 30% at best. And with the current largest PC maker in the world (Dell) not selling Athlons at all, they're severely limiting the reach of this new chipset, it seems.
Seriously, it's great that motherboard-makers like Asus are going to integrate the nForce, but what's the use? People building their own PC (the ones buying Asus/Abit/MSI/etc. boards) probably don't want to get an integrated GeForce2 MX...
So anybody know what the deal is? Licensing?
Probably will be a success (Score:5)
Now nVidia is going to make it possible to make an integrated motherboard, and the performance is going to be excellent. The Duron totally crushes the Celeron, the GeForce MX totally crushes the 810 onboard video, and the audio DSPs totally crush everything currently on the market. As long as the price for a Duron plus one of these boards is about the same as the price for a Celeron plus an 810 motherboard, they will sell a whole bunch of these.
steveha
... Use the nForce, Luke? (Score:2)
Incorrectness in the above reply (Score:2)
Re:But why support Athlon first? (Score:4)
1. XBox was originally designed to use an Athlon. Politics dictated that the PIII would be used in the end.
2. The PIII is at the end of its life, the Athlon has at least another 18 months ahead of it.
3. The PIII cannot take advantage of this chipset at all.
4. NVidia's bus licenses for the Intel platform are from Microsoft; they don't own them themselves, hence they can only make Intel chipsets for Microsoft.
4a. Intel might have given P4 licenses to the slow chipset makers (ALi, SiS) which won't compete with Intel, but look at what is happening within Intel with regard to VIA's high-performance P4X chipset!
Many people do not build great systems only to run games on, and this would be great: performance, cheaper than buying separate components, and also enough oomph for a quick game or two.
Re:Integrated devices... no thanks! (Score:3)
Read for comprehension. (Score:4)
Re:Incorrectness in the article (Score:2)
Portable PC (Score:4)
Re:Screenshots? (Score:1)
Re:The road to closed PC hardware? (Score:2)
Basically, NVIDIA is stuck with decisions from the past that were probably correct at the time, but which now cause problems.
However, the nForce shouldn't have this problem as NVIDIA should be aware that they have to allow for Linux support, and the core chipset business is an entirely different thing than graphics chipsets in terms of how much info you have to let out.
Re:Probably will be a success (Score:4)
Also, there are integrated chipsets for AMD processors already:
1. SiS 730/733. Not a great performer, but really cheap. Quality to boot as well; I have one running FreeBSD just great (using it at the moment). Also has a network interface on-chip that performs fine.
2. VIA KM133. Integrated Savage graphics. Outperforms the i810/i815 chipsets by a reasonable amount.
3. VIA KL133. Integrated Trident graphics. Pretty crap I reckon, pitched against i810 though so what do you expect?
The VIA solutions do not have integrated networking. This is why a lot of boards sold are i810/i815 - for the cheap corporate market where a motherboard needs video, crap audio and networking.
VIA are aiming for that last market a lot with their C3 processor and PL133/PM133 chipsets. They will soon have all-in-one boards that are small and incorporate video, audio, processor, memory, network, modem, IDE, etc. There will be no need for any expansion slots (1 PCI will be provided, though), and they will use the new VIA iTX motherboard size (smaller than FlexATX).
Re:This bodes well for AMD (Score:1)
ALi = lackluster performance but at least it doesn't eat your filesystem
Intel = high performance with bug workarounds openly posted to their developer website.
nForce = closed source drivers. too bad $everything_except_microsoft
accuracy of content on ExtremeTech.com (Score:5)
Having said that, and regardless of the earlier postings here indicating inaccuracies in the ExtremeTech hardware review being discussed here, I have to say the site looks quite well put together and editorially fair.
--CTH
---
FYI: It's a Ziff-Davis site... (Score:4)
Re:This bodes well for AMD (Score:1)
SiS = open with specs, provide help with drivers (LinuxBIOS use SiS boards because of the help they get)
I disagree about NVidia. Their graphics drivers are closed source because they contain licensed technology that they are not allowed to release. Hopefully this will not be the case with nForce, so whilst nVidia will probably not be as open as SiS, they should not have a reason to close source everything.
Also, Intel chipsets are not all high performance. ServerWorks chipsets are. Intel integrated chipsets are pretty poor really, and need I mention i820?
onboard graphics performance? (Score:2)
Do you think this stuff could really compete with mac anyway?
Compete is a relative term; it depends on the task (say, Photoshop vs Quake). I'm interested to learn how the onboard gfx of nForce compares to, say, a GF3 in a Mac or PC. GF3 may be a bit faster, but is limited by AGP 4x/6x/8x and the system RAM throughput. I'd also like to see if NVIDIA is working on some sort of unified color correction API for Windows (a la Apple's ColorSync... something that is universal, not just one part of a kluge of components).
Insecure? (Score:1)
Re:hmmm... (Score:1)
Heh, I thought it sounded very Amiga-like myself! All it is missing is Firewire, and I bet that will be included on many motherboards, as this is a performance all-in-one chipset.
Wonder if it is power-hungry, or if it can be used in laptops? This chipset would take over the performance laptop market if it did...
Also, can you get the following: a 3.5" bay cover, with two USB ports (connect to header on motherboard), two Firewire ports (connect to header on motherboard/Firewire card), headphone socket for audio, serial port for legacy, and maybe a nifty 1-line LCD display for geeks? That would be great.
Re:But why support Athlon first? (Score:1)
Well, it hasn't been released yet, so no. Also that bug is old, and relates to a fault with the Soundblaster Live! because it hogs the PCI bus. Although I agree that the PCI implementation should still be able to deal with it.
Never had problems with ATA-100 disk access using VIA motherboards, except when installing Windows 2000 on an ECS motherboard. That is due to buggy Windows 2000 drivers anyway.
my take (Score:2)
At any rate, I think I'll wait for the SiS735.
Re:Screenshots? (Score:1)
When rendering to 16-bit mode, the GeForce uses a 16-bit buffer, so the more layered textures a particular area has, the more artifacts get introduced. Other GPUs use a 32-bit buffer for all modes, and thus get great looking 16-bit graphics (at a small performance cost).
Anyway, the interesting GPU at the moment is the Radeon 2, with its Truform technology that looks great.
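The effect is easy to sketch numerically. Here is a toy model (purely illustrative; real GPUs blend and dither quite differently): quantizing to RGB565 precision after every blend pass loses more information than blending at full precision and quantizing once at the end.

```python
def quantize565(c):
    """Truncate an (r, g, b) triple of 8-bit channels to RGB565 precision."""
    r, g, b = c
    return (r & ~7, g & ~3, b & ~7)  # 5 bits red, 6 green, 5 blue

def blend(a, b):
    """Simple 50/50 blend of two (r, g, b) triples."""
    return tuple((x + y) // 2 for x, y in zip(a, b))

layers = [(255, 255, 255), (1, 1, 1), (1, 1, 1)]  # arbitrary example layers

# 16-bit path: the buffer re-quantizes after every blend.
acc16 = quantize565(layers[0])
for layer in layers[1:]:
    acc16 = quantize565(blend(acc16, quantize565(layer)))

# 32-bit internal path: blend at full precision, quantize once at scan-out.
acc32 = layers[0]
for layer in layers[1:]:
    acc32 = blend(acc32, layer)
acc32 = quantize565(acc32)

print(acc16)  # -> (56, 60, 56): accumulated quantization error
print(acc32)  # -> (64, 64, 64): closer to the exact answer
```

With three layers the two paths already diverge visibly; more layered textures widen the gap, which is the banding the post describes.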
well said (Score:2)
What's wrong with more powerful = more expensive? (Score:2)
Why is that? I mean, if it's very powerful, the graphics should be more than average, right?
It beats all the other integrated systems out there (the audio system is to die for, beating the SBLive! into a cocked hat) by a massive margin, but they only cost like $20 a pop anyway!
Well, better technology should be more expensive, don't you think?
opinion (Score:2)
I use a number of G4s at work as they've been powerful, stable, wonderful machines to work with for DV video editing and compositing (under Mac OS 9.1). The G4 case was a dream to work with when adding RAM and a second hard drive to each machine. We currently have two SGIs and three PCs running Maya but will certainly try Maya under OS X later this summer. At home and for some of the organizations I work with, I use a variety of x86 and workstation platforms. I do my gaming and game serving on two homebrew Windows PCs. No need to bash a platform or try to convert the dissimilar. Use what you enjoy and what works for you.
Re:Screenshots? (Score:2)
--
< )
( \
X
Re:Portable PC (Score:1)
-------
Reference vs. actual board? (Score:2)
If they've improved memory bandwidth enough that the GPU and CPU can both use the same memory without getting too slow, that would probably be a big advantage-- you could dynamically allocate memory size and memory bandwidth, so that you only give the graphic system a huge amount of memory when it wants it, and you can use the memory for programs when you're not doing that much graphics.
Wait a sec (Score:1)
Hardware Reviews Online [slashdot.org]
Re:The road to closed PC hardware? (Score:1)
The simple fact is that the current generation of PC architecture is at a dead end. Making faster processors and faster video cards is great, but at the high end they are being seriously hobbled by poor system memory performance and bus speeds.
NVidia is offering a solution to this problem. AMD is also offering a separate solution. I'm sure Intel will 'answer' with their own solution...
Re:The road to closed PC hardware? (Score:1)
Re:What's wrong with more powerful = more expensiv (Score:2)
Why is that? I mean, if it's very powerful, the graphics should be more than average, right?
No, the graphics won't be more than average - the GeForce 2MX, which is what Nvidia has integrated onto this motherboard, has been their value graphics chip for some time now. It isn't nearly as fast as GeForce2, GeForce2 Pro or GeForce3, and the motherboards aren't even on the market yet.
What the post you replied to pointed out is that this is a strange combination - while other features on the motherboard, like the built-in sound and memory architecture, are high end, the graphics aren't in the same class. So many (myself included) would like to have the same chipset without the built-in graphics - no point in having that chip if you want to buy a top-of-the-line graphics card anyway.
A strange combination, to say the least. (Score:5)
The chipset's North Bridge- Dubbed the "IGP", it provides for more efficient use of memory bandwidth using the GeForce3's crossbar switching technology. Note that this is the same kind of technology that SGI, Sun, and other high-performance UNIX hardware vendors have been using for their memory architecture (drool...)
Support for 128-bit DDR memory- This is just amazing. Not only will their particular implementation allow you to put different sizes of DIMMs in the board, but it will also allow DDR memory to boost its bandwidth 100%, which means that instead of maxing out at 2.1GB/sec, it maxes out at 4.2GB/sec (again, drool...)
The link between the North and South Bridges- This will support bandwidth up to 800MB/sec, which means that it will have an excess even under unusual conditions, such as maxing out ATA/100, maxing out the PCI bus and outputting 256 streams of digital audio all at the same time. Simply awesome.
The built-in audio- The article gives a lengthy description of this. Suffice to say, it is better than any consumer card available today, built-in or otherwise.
Built-in graphics system- This actually was a little disappointing to me, because, while they may have it running at essentially AGP 6x, they used their budget system, the GeForce2 MX, for the processing. I'm also disappointed that it uses a "shared video memory" architecture that can be found in a lot of deplorable video platforms, notably the Intel i810 chipset. Basically, it uses up to 32MB of system memory for the frame buffer in addition to its onboard memory. However, with the huge amount of bandwidth available to the system memory, this may or may not be an issue.
Lack of support for 1394 and 64-bit PCI- Regrettably, they don't seem to have included support for either Firewire or 64-bit PCI. The lack of 1394 support is particularly surprising, as it has gotten increasingly better support from both Windows ME and Windows XP, as these are the main platforms which will be running on this board.
All in all, I think this is going to be an awesome board, a real leap ahead of everything else out there for the x86 market right now, at least in the consumer arena. I look forward to buying one, if the sticker shock isn't too harsh ;-).
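For what it's worth, the 2.1GB/sec and 4.2GB/sec figures above are consistent with a 133MHz DDR ("DDR266") memory clock; a quick back-of-the-envelope check (the clock value is my assumption, not something from the article):

```python
def ddr_bandwidth(bus_bits, clock_hz, transfers_per_cycle=2):
    """Peak bytes/sec: bus width in bytes x clock x transfers per cycle (2 for DDR)."""
    return (bus_bits // 8) * clock_hz * transfers_per_cycle

single = ddr_bandwidth(64, 133_000_000)   # one 64-bit DDR bus: 2.128 GB/s
dual = ddr_bandwidth(128, 133_000_000)    # both banks in parallel: 4.256 GB/s
print(single, dual)  # -> 2128000000 4256000000
```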
Re:Integrated devices... no thanks! (Score:1)
I was leaning towards the ASUS A7S for such a beast, but who knows? I may want to go for a 1024x768 run of UT on my nameserver..
Your Working Boy,
- Otis (GAIM: OtisWild)
Re:Screenshots? (Score:3)
Get used to this sort of thing. These guys are going to be _the_ 'everybody' in the statement you may be hearing a lot of... "No, you can't put out a negative review of our product. No, not even a little tiny bit. Everybody else is perfectly happy to accept our terms on this, take it or leave it..."
Oh joy :P
Re:Insecure? (Score:1)
the "stick with my Mac for now" comment just makes him a big target for catcalls. sure pal, go ahead and stick with your "supercomputer". he's the instigator.
as far as mainboards with no legacy ports, I'm not aware of any. But you could disable them in BIOS to free your IRQs, if that's what you're worried about.
Re:But why support Athlon first? (Score:1)
I don't think he understands memory arbitration... (Score:3)
Exactly how does the reviewer think the picture is getting on the screen if the RAMDAC isn't accessing main memory? Even suggesting that the CPU should have higher priority than the RAMDAC is either ignorance or stupidity. Of course the RAMDAC has to have priority or you are going to end up with big blank patches on the screen!!
Re:Screenshots? (Score:1)
So you are stuck with 16-bit graphics if you want 1024x768, which any person is going to want because this is a performance chipset at heart, for workstations and the like.
Why can't someone integrate a KyroII into the northbridge?
Re:... Use the nForce, Luke? (Score:2)
the SIS chips are cool ...and linux support too (Score:1)
There are a bunch of really cheapo boards available - some only have 1 PCI slot (after all, with almost everything built in, most people don't need slots at all) - you can build an OK box for $150 - many 'barebones' machines have these boards in them. Apart from that you don't see them here that much; I suspect the bulk of them are being sold in India and China.
In fact the only real problem I've had with them is that the Windows ether driver seems to blue screen all the time - like I care :-)
Re:Screenshots? (Score:1)
Re:But why support Athlon first? (Score:2)
Intel tends to slump from time to time, but the i810 fiasco (including RAMBUS) followed by the PIV is very worrisome.
I think that AMD stands to hit it big with the right chipset. This may just be it.
-Peter
Re:Integrated devices... no thanks! (Score:2)
Re:my take (Score:1)
Why? If you're looking for incredible integrated video and sound capabilities, the nForce will likely be the only decent choice. This chipset seems more aimed at mid-range machines built by OEMs, where the user doesn't really care what's in it, or at compact systems that need to take as little space as possible. The SiS chipset seems far more for traditional power users who want to swap out any of their components. I don't think people who buy the nForce will be the same people who would've bought a SiS735-based board. It will, however, raise the bar against some of SiS's other chipsets that feature integrated video, audio and LAN, and force them to create more powerful video and audio in their chipsets.
The mac that you're sticking with (Score:1)
Re:The road to closed PC hardware? (Score:1)
As for the driver thing, get your head out of your ass. It's called licensed code. (Plus the fact that they have the best OpenGL implementation available. Why should they write ATI's driver for them?)
Re:the SIS chips are cool ...and linux support too (Score:1)
Re:Read for comprehension. (Score:2)
You folks must be waayyyyy too hung up on plug and play.
Re:The road to closed PC hardware? (Score:1)
Yup, that's how logical arguments are won. Please, keep trying to start a flamewar, you're getting better at it.
It's called licensed code.
That is their problem. Perhaps they shouldn't have used licensed code? Food for thought, eh?
Why should they write ATI's driver for them?
Mainly because they wouldn't be writing ATI's driver for them. They are based on different chipsets. You do realize that drivers are hardware specific, right?
--
Re:But why support Athlon first? (Score:1)
--
Re:opinion (Score:1)
Re:Portable PC (Score:2)
Hmm, maybe because:
Think of it as a computer you'd take to LAN parties or any other situation: one that won't require a loan to purchase.
You *can* do this with many motherboards (Score:4)
Check your BIOS: If you have a setting called "Memory Interleave" or something like that, you likely can do this with your motherboard today. The motherboard I have that does this (an EPOX EP-MVP3G) has three settings: None, Two Bank, and Four Bank. Benchmarks suggest a major improvement with a K6-2/450 when Two Bank is selected, although I don't have enough slots to try Four Bank out (it just reverts to two, or so it seems).
And dareth I mention it, many Macintoshes have been able to do this for quite some time now. Just add extra RAM to a Mac, make sure all the SIMMs are identical, and you will suddenly have a nice performance boost for more than one reason.
Re:A strange combination, to say the least. (Score:1)
Re:But why support Athlon first? (Score:1)
Re:I don't think he understands memory arbitration (Score:2)
Re:I don't think he understands memory arbitration (Score:3)
OK, I'll correct you ;)
The onboard video DOES NOT have integrated frame buffer memory. This is the same for most i810s, i815s, etc. In fact, you can lower system performance by running your monitor at a higher refresh rate! To find out how much memory bandwidth your monitor is using on these things, multiply your screen's number of pixels (width * height) by 2 for 16-bit color or 4 for 24/32-bit color. This will give you how many bytes per refresh the RAMDAC is using. Multiply that by your refresh rate (e.g. 85) and you have bytes per second. Take this off of your maximum memory bandwidth, and that's what you have left for your CPU. Not a great situation!
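That arithmetic can be sketched directly (the 1024x768x32 at 85Hz example and the PC133 peak figure are my own assumed numbers, not from the article):

```python
def ramdac_bytes_per_sec(width, height, bytes_per_pixel, refresh_hz):
    """Bandwidth the RAMDAC consumes from shared memory just to refresh the screen."""
    return width * height * bytes_per_pixel * refresh_hz

scanout = ramdac_bytes_per_sec(1024, 768, 4, 85)  # 1024x768, 32-bit color, 85 Hz
pc133_peak = 133_000_000 * 8                      # 64-bit SDRAM bus: ~1.06 GB/s peak
cpu_left = pc133_peak - scanout                   # what remains for the CPU et al.
print(scanout, cpu_left)  # -> 267386880 796613120
```

So at those settings the screen refresh alone eats roughly a quarter of the theoretical memory bandwidth, which is the "not a great situation" above.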
Re:Why exactly (Score:1)
(by the way, although I use Hotline, I do try to pay for most of the commercial and shareware software I use.)
--
Re:The road to closed PC hardware? (Score:1)
--
Re:I don't think he understands memory arbitration (Score:1)
No, you need to read further (Score:3)
_damnit_
Re:the SIS chips are cool ...and linux support too (Score:1)
Re:FYI: It's a Ziff-Davis site... (Score:1)
Re:nForce and SiS735 the future...? (Score:1)
I've been wondering about this... what exactly have Creative's engineers been up to for the past 3 years? There have been no real upgrades to the SB Live! in that time; all they do is add neat little drive bay cable connections and change the box a little, it seems.
So is this a case of "We don't have to release a better product, this one is selling fine..." or did they just fire all their engineers?
Seems like given 3 years of R&D time, Creative may have something up their sleeve. Or maybe not...who knows...
-Sokie
Re:The road to closed PC hardware? (Score:4)
Man, I better go buy some Intel hardware. God knows they need the support. Those poor guys with their great CPUs have never amounted to anything, and now nVidia's really going to crush them. It's a shame.
Re:Memory interleaving (Score:5)
memory interleaving:
Interleaving of different banks of memory which are in SERIES.
+ shortens address line setup time for sequential memory access.
+ up to 30% more bandwidth (depending on the type of memory: FPM/EDO/SDRAM) when the cache line is double (or quadruple) the bus width.
memory crossbar = Interleaving of banks of memory which are in PARALLEL.
+ parallel memory transactions (if each is for a different bank).
+ improvements are less deterministic, depends on which transactions can be parallelized.
+ does reduce memory contention, i.e. different devices (CPU, Video, Disk, NIC) wanting access to memory at the same time.
For multimedia type applications (read games, DVD playback) this is a big help.
nForce's twin-bank memory sub-system reminds me of the Amiga's twin bank memory sub-system.
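A toy address-to-bank model makes the series/parallel distinction above concrete (the cache-line size and bank count here are arbitrary illustration values):

```python
CACHE_LINE = 32  # bytes, hypothetical

def bank_for(addr, n_banks=2):
    """Bank that serves an address under simple cache-line interleaving."""
    return (addr // CACHE_LINE) % n_banks

# Series interleaving: sequential accesses ping-pong between banks, hiding
# each bank's address setup time behind the other's transfer.
banks = [bank_for(a) for a in range(0, 4 * CACHE_LINE, CACHE_LINE)]
print(banks)  # -> [0, 1, 0, 1]

def crossbar_can_overlap(addr_a, addr_b, n_banks=2):
    """A crossbar can serve two requesters at once iff they hit different banks,
    which is why the win is less deterministic: it depends on the access mix."""
    return bank_for(addr_a, n_banks) != bank_for(addr_b, n_banks)

print(crossbar_can_overlap(0, 32))  # -> True  (CPU and video hit different banks)
print(crossbar_can_overlap(0, 64))  # -> False (same bank: one must wait)
```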
Re:Read for comprehension. (Score:1)
Re:A strange combination, to say the least. (Score:2)
NOT 128bit DDR memory:
nForce has two independent 64bit DDR memory buses, see..
http://www.anandtech.com/chipsets/showdoc.html?
When both banks are populated (with a 64-bit DDR DIMM in each) you have a "virtual" 128-bit bus. However, if the two banks are interleaved, you do get a 100% boost in bandwidth.
Note: more bandwidth is not what limits 1+GHz PCs; it's memory latency and system contention.
nForce is the 1st chipset to attempt to address the issue of contention while NOT increasing memory latencies.
but.. but.. it still lasts longer than M$Windows.. (Score:1)
Re:The road to closed PC hardware? (Score:1)
the current generation of PC architecture is at a dead end.
I sense a tone (not in your post, you provide the hook, thanks) that integrating somehow limits freedom to build a custom PC from components. So in 2010, will hardware hackers be building teeny weeny wearable PCs using tweezers and a magnifying glass? Of course not. We'll go through the single chip phase, then the cost will fall, then people will use entire computers as components.
Re:But why support Athlon first? (Score:3)
Upgrade to kernel 2.4.5 or upward - the problem is fixed there. I'm using 4 x 30GB on ATA 100 as RAID, and it works perfectly (with SB Live).
Re:What's wrong with more powerful = more expensiv (Score:1)
No, the graphics won't be more than average - the GeForce 2MX, which is what Nvidia has integrated onto this motherboard, has been their value graphics chip for some time now.
You are comparing apples to oranges. nForce is by far the most powerful integrated graphics chipset there is! When you compare it to other integrated chips, you will see that there really is no comparison! Sure, add-on cards are of course faster, but when it comes to integrated graphics, nForce blows everything else away!
Re:Integrated devices... no thanks! (Score:1)
I didn't read that in the article, but nevertheless, integrated devices, and a couple of pci devices, still make up for what I said.
Moderation Abuse (fake accounts?) (Score:1)
If you look at SuiteSisterMary's Slashdot user information page [slashdot.org], you will see that almost all of this person's postings have been moderated up, even though they are typically trivial, not particularly insightful postings (and I mean this matter-of-factly, not as an intended insult). I had a discussion with SuiteSisterMary and noticed that each of his (her?) responses was moderated up to 2 as soon as it appeared, as was the case with all but one of SuiteSisterMary's other postings on that article. [slashdot.org] I believe that SuiteSisterMary may be maintaining a stable of Slashdot accounts to use as they get moderator points (or perhaps arranging with friends to achieve the same result through their accounts).
Although I do not think it is possible to always stop this kind of abuse if the perpetrator is careful enough, one can point it out on a case-by-case basis (as I am doing here, which is why I have attached this off-topic posting here). It might be a good idea for Slashdot staff to check if SuiteSisterMary has been sloppy about this by seeing if the IP addresses from which SuiteSisterMary's postings were made are the same as the IP addresses from which they were moderated (or a group of very close ones, say, from a DHCP or modem pool) and if all of the moderation consistently happened within a minute or two of posting.
It may also be possible to make some systemic fixes to at least reduce the problem, such as by preventing a moderator from moderating the same user's posting within one month (which would also discourage moderators from camping out on one particular thread for partisan purposes).
Re:Moderation Abuse (fake accounts?) (Score:1)
Re:Incorrectness in the above reply (Score:2)
Re:Moderation Abuse (fake accounts?) (Score:1)
According to SuiteSisterMary's user information page [slashdot.org] at the time I am writing this reply, 37 out of 38 of his (or her) most recent postings are moderated up, including the one to which I am replying. I encourage readers to look at those replies. They are not the sort of postings that lay out a substantial amount of new insight, or that bring in much new factual information, which normally get moderated up. Whether SuiteSisterMary is lying about having extra accounts or is doing something slightly different but close, I leave readers to draw their own conclusions, and I hope Slashdot staff will look at the IP addresses and times of SuiteSisterMary's postings and moderation events.
About the GPU. (Score:1)
Re:Moderation Abuse (fake accounts?) (Score:1)
Grammar school anyone? (Score:1)
Am I the only one who has noticed this? You'd think with all the corporate backing that they would have some decent quality.
Re:Moderation Abuse (fake accounts?) (Score:1)
No separate video RAM (Score:2)
Sure it has an "AGP 6x" connection, but this doesn't help a lot when you are pumping all the data to the RAMDAC through that connection as well.
Sorry, I did not realize users could start at 2 (Score:1)
I must apologize. I did not realize that it was possible for any user's post to start at 2. If your posts are simply starting at two, then the number of additional moderations remaining is not that unusual, and I was wrong.
Re:But why support Athlon first? (Score:2)
or maybe they didn't want to have stability problems with the chip (Athlon) overheating?
2. The PIII is at the end of its life, the Athlon has at least another 18 months ahead of it.
wrong.. the new PIII Tualatin is due to come out shortly..
3. The PIII cannot take advantage of this chipset at all.
Single PIIIs have shown no significant improvement with DDR
4a. Intel might have given P4 licenses to the slow chipset makers (ALi, SiS) which won't compete with Intel, but look at what is happening within Intel with regard to VIA's high-performance P4X chipset!
ugh.. i just wish VIA would go away.. The day they make a good overall chipset.. is the day Microsoft will make all their software open source
Re:But why support Athlon first (speculation)? (Score:2)
Think about it.
Re:A strange combination, to say the least. (Score:2)
Re:But why support Athlon first? (Score:2)
It seems that even reasonably "optimized" code will only see a significant clock-for-clock performance gain on a PIV when it is "lucky" enough to need the right work done to keep the pipeline filled.
Am I wrong here? Clearly the PIV could do some incredible stuff with the right synthetic benchmark, but do you really think that it is going to have a significant (again, clock-for-clock) advantage in real world apps once compilers catch up? If so, I think this is the minority opinion. Or am I wrong about that too?
-Peter