ATI Announces 512MB Graphics Card 440
Annoyed.Gamer writes "Today ATI announced their first 512MB graphics card, the X800 XL 512MB. I have some systems that don't have more than 512MB of system memory, much less on a graphics card. According to AnandTech, the 512MB card can't outperform its 256MB counterpart and costs 50% more. ATI's favorite, Half-Life 2, showed the only real performance increase in the entire article. Overall a disappointment, especially because ATI for some reason didn't outfit its highest-end GPUs with 512MB, only the mid-range X800 XL."
Some of your computers don't have 512 megs? (Score:5, Funny)
Re:Some of your computers don't have 512 megs? (Score:5, Funny)
This is Slashdot. This is where people dig up antediluvian machines and install Linux on them out of pure masochism. I'll bet you anything you like that at least three people will follow up this post to confirm that they're installing Gentoo on an old 486DX/33 and that they're expecting it to finish compiling and be able to start up X in just another week or two...
Re:Some of your computers don't have 512 megs? (Score:5, Funny)
And apparently he created his account before UIDs were implemented, so...
Re: Gentoo (Score:4, Funny)
486DX/33 (Score:3, Funny)
Next step: Turning the old 386 into a MythTV based PVR.
No, I'm New Here (Score:4, Funny)
Re:Some of your computers don't have 512 megs? (Score:2)
Re:Some of your computers don't have 512 megs? (Score:3, Interesting)
I once had Linux running on a 4MB 386SX/16. I wanted to run one block of distributed.net, just so I could get the bottom entry in their stats page, but worked out after a while that it would take about three months.
Alas, I don't have that any more; my botto
Re:Some of your computers don't have 512 megs? (Score:4, Funny)
Re:Some of your computers don't have 512 megs? (Score:2, Funny)
Re:Some of your computers don't have 512 megs? (Score:5, Funny)
I have a Pentium PC with 64MB that functions as a print server so effectively that I never see it.
Lots of RAM does not make up for a small penis.
Nor, apparently, does it make up for mediocre video processing. But, there is a market for this, just as there is a market for spoilers that mount on Chevy Cavaliers or Honda Preludes.
Re:Some of your computers don't have 512 megs? (Score:2, Funny)
Re:Some of your computers don't have 512 megs? (Score:5, Interesting)
On my to-do list is to tap into the 5V supply of my old laser printer and put the print server on the printer's power supply. I also use X-10 equipment (the power stuff, bought long before their annoying Internet campaigns for those damn cameras). In that fashion I can further reduce the standby power consumption. The printer's duty cycle is very, very low on an annual basis.
So, unless you live in some sort of situation where power is included in your rent and you don't really care about overall societal power consumption, you might want to carefully consider your print server configuration.
Re:Some of your computers don't have 512 megs? (Score:3, Interesting)
It is pretty neat for things that do not have a constant draw, like refrigerators, etc. in that it accumulates kWh from the time it is plugged in.
However, it is probably just a typo, but if you really were running only 1.8-2.1 kWh _per month_ then you would be beating the hell out of 99.99% of the "developed" world.
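The arithmetic behind that skepticism is easy to check. A minimal sketch (the wattages below are illustrative guesses, not measurements):

```python
# Convert a continuous power draw into monthly energy use.
HOURS_PER_MONTH = 24 * 30  # 720 hours

def kwh_per_month(watts):
    """Continuous draw in watts -> energy per month in kWh."""
    return watts * HOURS_PER_MONTH / 1000.0

# Hypothetical always-on boxes
for label, watts in [("old Pentium print server", 60), ("lean embedded box", 5)]:
    print(f"{label:25s} {watts:3d} W -> {kwh_per_month(watts):5.1f} kWh/month")
```

Even a 5 W device comes to 3.6 kWh per month, so 1.8-2.1 kWh/month would imply roughly 2.5-3 W of continuous draw, far below what an always-on PC pulls.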
Re:Some of your computers don't have 512 megs? (Score:4, Funny)
no, but it is needed to render my monster penis in all its manly detail.
Re:Some of your computers don't have 512 megs? (Score:5, Funny)
Isn't it hard to deliver lots of "RAM" with a small penis?
How about a focus on quality? (Score:5, Insightful)
I'd be thrilled just to have my ALL-IN-WONDER® 9800 Pro not be so damn fragile. Often it comes up with bars and artifacts, and I keep rebooting until it behaves. I've tried all the driver and firmware updates and fiddled with AGP voltage settings, to no avail. Graphics benchmarks all pass with flying colors (no pun intended), then the PC crashes when I start up some games. Meanwhile, a $37 graphics card (with a $10 rebate) from Circuit City is 100% reliable (except I can't watch TV on it). Time for the ATI/Nvidia race to focus on quality rather than quantity.
Re:How about a focus on quality? (Score:5, Informative)
Re:How about a focus on quality? (Score:4, Interesting)
I had a GeForce4 Ti which suffered from nasty screen corruption in some games, which was fixed with the aid of a CPU fan from a 486 blowing air in the general direction of the graphics card.
Yeah, high tech, I know. Even better - said fan was held in place with a mounting bracket from a 386's hard disk.
Re:How about a focus on quality? (Score:3, Interesting)
I put two case fans in mine (intake at the front, exhaust at the rear) and changed my power supply to an Enermax (with yet another fan). This stopped all my problems (Asus A7N8X nForce2 board with an AMD 2800+ CPU). A friend has basically the same configuration and was having problems as well until he added more cooling.
I've talked to techs who
Re:How about a focus on quality? (Score:3, Interesting)
I decided about two years ago to purchase a Radeon 9800 with 256MB when it first came out. I had to order it from overseas, it was so new. However, the graphics drivers suck. I see more artifacting than I ever have before. The same thing happens on my laptop, which has a Radeon 9600. It has to be ATI and not the games, because the artifacting happens in every graphical application.
It's the la
Re:How about a focus on quality? (Score:3, Interesting)
1st one > would start drawing artifacts on the screen about 30 seconds into the game.
2nd one > worked great for a week, then it started corrupting textures and vectors (weird protrusions would pop out of walls, etc.), and in 2D mode the fonts would look all sparkly, and when you typed the sparkles would change.
3rd one > DOA - didn't post at all
4th one > DOA - also didn't post
out of hand (Score:5, Interesting)
Re:out of hand (Score:5, Insightful)
On a side note, my office computer is a dual 2.8 GHz P4 machine, and I don't see a difference in normal day-to-day office stuff. Hell, my old 400 MHz G3 laptop is just as capable as my office machine for 95% of the work that I do. All those guys out there dropping $500 every 6 months on new cards are not showing the muscle under the hood, but rather their lack of brains. Or their large quantity of spending cash, due to the fact that they still live at home. (I'm totally getting flamed for that last comment, but that's cool.)
Re:out of hand (Score:3, Insightful)
I am sure I am way in the minority, but my Windows system is an Athlon 900 (slot), 512MB of SDRAM, Win98, and an ATI AIW 32MB video card. It plays all my games fine (except the latest Ghost Recon, which
Re:out of hand (Score:2)
Re:out of hand (Score:5, Insightful)
We should thank these people who are willing to pay for bleeding-edge graphics performance. They enable us to pay bottom dollar for yesterday's technology that performs 90% as well.
You do not have to understand a performance enthusiast to benefit from their pocketbook.
Some people need the ram. (Score:3, Interesting)
These cards, with the specialized software, stuff that image quite nicely into the card's memory, which allows my system to roam with a high-end display.
Course, I don't know about *this* card, just others that have 512MB.
In fact, I did inquire with one manufacturer about upgrading a card to 1GB... talk about eyeballs popping
dual p4? (Score:3, Informative)
I doubt the accuracy of this statement, especially since a dual P4 machine does not exist.
you either have:
1) a new dual-core EE CPU (unlikely)
2) a dual Xeon server (more unlikely)
3) a normal P4 with Hyper-Threading (most probable)
just because it has two CPU graphs in Task Manager does not mean you are running a dual system, my friend.
the reason you don't see a difference between a P4 2.8 and an AMD 1.4 is because the 1.4 is an AMD
put a p4 1.4 and a p4 2.
Re:out of hand (Score:5, Informative)
Good videocards allow better image quality in games. If you don't need better image quality, that's fine, but most people disagree with you.
Re:out of hand (Score:3, Insightful)
Most people do think like that. But the extreme gamers are the ones who bring the prices down for the rest of us. Please let them continue...
lol, blind leading the blind (Score:3, Interesting)
That's like asking a kid who has been blind since birth how he feels about not seeing anything his whole life. Of course he doesn't miss what he never had. Until you experience a high-end system displaying high-end graphics, you can't speak about how good or bad your old system is. You are 'blind' to what you have never seen. How can I explain what red looks like to a blind person? How can I ex
Re:out of hand (Score:2)
Re:out of hand (Score:2)
For one, need them to anticipate graphics performance a couple years into the future.
Average gamers though... yeah, I don't see the point.
Re:out of hand (Score:2)
Whoops, "developers" that is, need them to anticipate graphics performance a couple years into the future.
Re:out of hand (Score:2, Insightful)
Re:out of hand (Score:2)
First off, not all the world is video games, and in the Windows world, many high-end graphics tools correctly take advantage of extra video RAM.
That said, more video RAM directly translates to the ability to render faster at higher resolutions. No increase in performance? Try running an MMORPG with 50+ mobs/characters in front of you in 1280x
Re:out of hand (Score:2)
Who cares, as long as they keep doing it?
See, I have roughly zero interest in the latest FPS game ("Jak and Daxter" on PS2 is more my style), but this irrational push for the latest and greatest means that you and I get to buy some amazing year-old hardware for next to nothing. $30 will get you an MX 440. Joe Gamer would look on it with distaste, but it's screamingly fast for the easy work I ask of it. With the upcoming OpenGL desktops, I
Re:out of hand (Score:3, Insightful)
Re:out of hand (Score:3, Insightful)
Thus, if ATI is currently behind Nvidia, they have pressure to release a new card or drop prices (or both), otherwise they'll lose marketshare. Then Nvidia is behind, so they look to do the same. Thus you get frequent releases.
Then you just
Re:out of hand (Score:2, Interesting)
i think the only real thing that drives this kind of product leap is the small segment of the market that spends that kind of money on a video card 'because they can.'
granted, when the new HDR revision comes out for Source, then there will be a reason to upgrade because the o
I'm no Bill friggen Gates here... (Score:5, Insightful)
Re:I'm no Bill friggen Gates here... (Score:5, Informative)
Then again, unless you are tiling dozens of transparent movies, you probably won't notice this level of overkill.
Could help OS X, stores lots of things there... (Score:3, Interesting)
it's funny.. laugh.. (Score:5, Interesting)
Chicken and egg (Score:5, Insightful)
The reason is obvious: game designers target the prevalent market. Given that there are a limited number (zero) of users with 512MB of onboard memory, few video game makers are going to require 512MB of simultaneous textures (or even 256MB, and to a degree not even 128MB). Doom 3 may, as the article states, have 500MB of textures, but I highly doubt they are used simultaneously.
This is just another card for people with the money to say "just in case...".
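The working-set argument holds up on the back of an envelope. A rough sketch (the bytes-per-pixel figures are the standard uncompressed-RGBA and DXT1 rates; the budgets are just the card sizes under discussion):

```python
# How many large textures fit in a given amount of video memory?
PIXELS = 1024 * 1024  # one 1024x1024 texture

uncompressed_mb = PIXELS * 4 / 2**20    # 32-bit RGBA: 4 bytes/pixel -> 4.0 MB
dxt1_mb = PIXELS * 0.5 / 2**20          # DXT1: 4 bits/pixel -> 0.5 MB

for budget_mb in (128, 256, 512):
    n = int(budget_mb // dxt1_mb)
    print(f"{budget_mb} MB holds about {n} compressed 1024x1024 textures")
```

Even 256MB holds on the order of 500 large compressed textures, far more than any single scene draws at once, which is why doubling the pool shows so little benchmark gain.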
Re:Chicken and egg (Score:5, Informative)
Otherwise you get hitching in scenes when Doom 3 needs to quickly swap out that amount of data for another batch (opening doors, switching from rendering the level to reading the PDA, etc.), because it will be moving data over the AGP aperture from the main system memory bank.
Re:Chicken and egg (Score:2, Insightful)
But this article shows otherwise - there was almost no difference having 512MB of video card memory. The reason is most certainly that different subsets are used in different areas, and the hit on AGP/PCI Express to pull the active set into video card memory is momentary and largely irrelevant. If every 30 seconds you need to purge and cycle data in through ultra-high-speed AGP 8x or PCI Express, that really isn't t
instance of "benchmarking makes people stupider" (Score:5, Insightful)
No, it does not. It shows the limitations of a benchmark which is focused solely on frames-per-second performance.
The effects of texture thrashing will be perceptible (and distracting) at times to the human player, but they won't do much at all to affect such a benchmark.
It's a noticeable flaw, every 30 seconds. Doesn't matter if all you care about is "frames per second."
Re:instance of "benchmarking makes people stupider (Score:3, Informative)
Not necessarily. They could be taking the inverse "seconds per frame" on a per-frame basis, which should be reasonably accurate.
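The difference is easy to demonstrate with made-up frame times: a single half-second hitch barely dents the fps average, even though the player would certainly notice it.

```python
# 299 frames at a steady 60 fps, plus one 0.5 s texture-swap hitch
frame_times = [1 / 60.0] * 299 + [0.5]

# The usual benchmark number: total frames / total time
avg_fps = len(frame_times) / sum(frame_times)

# What the player actually feels: the worst single frame
worst_ms = max(frame_times) * 1000

print(f"average fps: {avg_fps:.1f}")     # ~54.7 - still looks respectable
print(f"worst frame: {worst_ms:.0f} ms")  # 500 ms - an obvious stutter
```

A benchmark that reported the worst (or 99th-percentile) frame time alongside the average would catch exactly the hitching the parent describes.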
About as useful... (Score:4, Insightful)
Re:About as useful... (Score:2, Funny)
Apparently you haven't driven on some of Cleveland's roads.
Doom3 and Ultra mode quality (Score:4, Interesting)
Re:Doom3 and Ultra mode quality (Score:5, Funny)
Hmm, maybe I should start my own business - buy some low-mem cards, glue some DRAM modules onto it, sell them, profit(!).
you mean (Score:3, Funny)
Make it and they will come... (Score:5, Interesting)
Somewhere, someone is thinking of a killer application that needs 512MB of video RAM to work.
I just can't, for the life of me, imagine what it could be...
Re:Make it and they will come... (Score:2)
Re:Make it and they will come... (Score:2)
Yes, people really do play six characters at once.
Possibly it will be used on the Mac (Score:5, Interesting)
With Quartz 2D Extreme (marketing!) putting the entire rendering of the display onto the graphics card as an OpenGL surface, and lots of the display-rendering code itself being stored there as well, you can never have too much RAM - especially with the composition manager etc. all eating up gobs of it...
Simon
Now then.. (Score:2)
Now - I'm just an end-user, sitting here working with a large project in Revit [autodesk.com], an application that brings even fast PCs stacked with RAM to their knees. It's basically a database with a graphical interface and so every little operation results in refreshes and an element of regeneration of the display. It's a good tool, with great potential, yet that lag is a total patience-killer.
If the vector oper
Re:Make it and they will come... (Score:2)
The GUI system of OS X is relying more and more on the GPU to do drawing operations. The approach that Apple seems to take is to store everything graphics related in the GPU's memory, this includes window contents (for compositing), but also bitmaps, font glyphs etc.
This approach gives improved performance, but eats up a lot of graphical memory, s
Re:Make it and they will come... (Score:2)
It'll be faster when apps use it (Score:5, Informative)
Just because some games don't use that other 256MB doesn't mean that no apps use it. The "pro" cards have been at 512MB to 640MB for a while, now. They wouldn't even bother selling them if no one knew what to do with them.
Scientific Applications (Score:5, Informative)
A lot of applications in biology (3D microscopy, macromolecule interactions, MRI etc..), weather modeling, oil field visualization, to name just a few, are hungry for more onboard video memory.
Re:Scientific Applications (Score:4, Insightful)
This card is supposed to be a gamer's card, as it's optimized for such things. Workstation cards are the opposite; most of them perform poorly in games even though their specs may lead one to believe otherwise.
How does memory and performance relate? (Score:2)
Obviously if you don't have enough (e.g. 64MB of RAM when the game engine needs about 128MB) there will be a performance hit, but if the game has all the memory it needs, what would be the point of having more?
Re:How does memory and performance relate? (Score:2)
Re:How does memory and performance relate? (Score:2)
These pseudo-techie people seem to have the misconception that a video card's performance these days is entirely reliant on how
Driver or hardware? (Score:3, Interesting)
On a somewhat unrelated note, why don't these tests ever include MMORPGs? I'd like to think that a very crowded area in EverQuest during a raid, with a lot of spell effects going off, would challenge even the highest-end video card on the market. I'd argue that including some of these other types of games (MMORPGs specifically) would be more appropriate and well-rounded than 6 different FPSs.
Of course, the problem would be fair testing of what is obviously a dynamic environment. My opinion is that two identical machines attending the same event with an almost identical viewpoint could be achieved. It would just require some social coordination to get the testers included in these events.
Re:Driver or hardware? (Score:2)
as much as that would be nice, unless your MMORPG allows you to record/play back a demo, it would be impossible to make any meaningful comparisons between runs and/or different cards.
Re:Driver or hardware? (Score:2, Interesting)
Graphic Apps (Score:3, Interesting)
Re:Graphic Apps (Score:2, Informative)
But I don't really know either =P
Re:Graphic Apps (Score:3, Interesting)
If the card manufacturer writes a hardware plug-in for Photoshop to use it, which I've never seen one outside of Radius (not for RAM but for processing).
Probably has something to do with the Tiger releas (Score:5, Informative)
From the "Core Image" page [apple.com]:
When a programmable GPU is present, Core Image utilizes the graphics card for image processing operations, freeing the CPU for other tasks. And if you have a high-performance card with increased video memory (VRAM), you'll find real-time responsiveness across a wide variety of operations.
Re:Probably has something to do with the Tiger rel (Score:3, Interesting)
Yes. Read the Ars Technica article about OS X's new desktop rendering system. Then think about how much information is stored on the video card for that to work. Then think about how the current effects are just scratching the surface.
My 128 meg card can handle it now for most things, but when I turn on a whole bunch of real-time effects it does get bogged down because it is forced to swap with system memory.
Re:Probably has something to do with the Tiger rel (Score:3, Funny)
Shouldnt they fix their drivers first? (Score:5, Interesting)
if ATI cards were twice the speed of Nvidia's, I'd still avoid them, simply because Nvidia drivers are rock solid and unfussy, whereas the ATI driver 'environment' is usually a bug-ridden barrel of unstable bloatware that avoids standards like the plague
Your mileage may vary etc blah blah
Re:Shouldnt they fix their drivers first? (Score:2, Insightful)
A sale is quantifiable on the sheet. A lost sale is an abstract concept that requires human intelligence to comprehend and take into account.
So time and money "wasted" on coding drivers looks like a pure expense with no payback to the bean counters who think the computer has all the answers.
This is the sort of shit that happens when you abdicate your rightful place as the thinking component of the system to a slice of ro
Re:Shouldnt they fix their drivers first? (Score:3, Insightful)
Haha, yeah. It's funny, Nvidia's stuff is the only thing to ever blue screen any of my win2k machines. The ATIs have always given me a little warning by shitting all over themselves and giving me time to close down and reboot the machine.
I wonder if AGP drivers are a variable which affects the stability and performance of various cards differently. There are always people who swear up and down that they have a better experience with one brand or the other, and they certa
When can we skip the CPU? (Score:2)
I wonder when the GPU will supplant the CPU? I'm sure it would be much easier for ATI to add a few million transistors for some general CPU performance than for Intel/AMD/IBM to replicate a high-power GPU. The CPU-needs of the core logic of basic applications are pretty minimal and
Compositing Window Managers (Score:5, Informative)
Others have responded with various games as the killer app.
And perhaps, today, they are the driver for this much VRAM.
However, there is a use for a card with that much VRAM that isn't gaming - compositing window managers.
Apple's MacOS, Microsoft's Longhorn, and *nix's various compositing WMs all operate by giving each active window its own chunk of memory sufficient to hold the whole window, and then treating that memory as a texture for a polygon and letting the 3D hardware do the final compositing onto the display. This allows for effects like translucent windows, smooth window movement, quick resizing of windows, simplified backing store (handling windows overlapping other windows), and many other useful items - these aren't just "eye candy", but things that make the system much more useful.
Now, think about how many windows you have open right now. Think about how many windows a power user may have open. Think about how much memory that can burn to give all those windows their own space.
512M of VRAM isn't overkill for such situations - it's barely enough, and video card vendors are starting to look to supporting virtualization for the card's memory needs (especially in PCI Express cards where the card can have a decent amount of bandwidth to system memory.)
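A back-of-envelope budget shows how quickly this adds up. A sketch (the window counts and sizes are hypothetical; assumes 32-bit RGBA backing textures):

```python
# Rough VRAM cost of giving every window its own backing texture.
BYTES_PER_PIXEL = 4  # 32-bit RGBA

def window_mb(width, height):
    return width * height * BYTES_PER_PIXEL / 2**20

# Hypothetical power-user desktop: a few big windows, many medium ones
desktop = [(1600, 1200)] * 4 + [(1024, 768)] * 12

total = sum(window_mb(w, h) for w, h in desktop)
total += 2 * window_mb(1600, 1200)  # the double-buffered framebuffer itself

print(f"~{total:.0f} MB of VRAM just for window surfaces")
```

That is roughly 80 MB before font glyphs, icons, and per-application textures are counted; add more windows or a higher resolution and the hundreds-of-megabytes range stops looking silly.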
Re:Compositing Window Managers (Score:2)
My Mac mini does pretty well with 32MB on a 2048x1536 screen resolution, which is 3.2MP.
It's all about marketing (Score:4, Funny)
512 MEGABYTES OF MEMORY!!!
TWICE THE MEMORY OF ANY OTHER GRAPHICS CARD OUT THERE!
NO OTHER GRAPHICS CARD COMPARES!
I expect ATI to come out with a sound card next month with a volume control that goes up to 11.
I'm so old... (Score:2)
Truly, we live in an age of wonders.
Hardware vs Software (Score:5, Insightful)
For instance, the Lotus SmartSuite products were way ahead of Microsoft's Office suite when they were released, but the entire package took about 25 1.4MB floppies, I think, and then would hardly run on the typical system of the time. A couple of years ago I was looking for some clip art and loaded it from CD. On modern hardware, the package was quite pleasant to use.
There were some bugs in SmartSuite, and Microsoft did a number on compatibility at the API level, but I think overall it was the bloatware aspect that hurt it the most. A few years later the package seems rather sprightly and compact.
Hardware suffers from the opposite problem: the attitude of "Why would I need that much?", which hardware vendors play into by offering products with overkill specs in the wrong areas. Since they can't double processor speed, doubling the amount of RAM is the next best thing, right?
No, the next best thing would be to offer rock-solid reliability in the hardware and drivers. Make it cheaper. Ship the source for your drivers. I want it to work, and if it doesn't work I want there to be a way to fix it.
I know that's not how the video card business works. If you're not at the cutting edge, you're an also-ran. I just wish it weren't that way.
Sorry for rambling. To tie it all together, I think vendors get caught up in having features their marketing department can brag about, rather than delivering products their customers can use most effectively.
GPUs are the future for OSX 10.4 (maybe Longhorn) (Score:5, Informative)
http://arstechnica.com/reviews/os/macosx-10.4.ars
news? (Score:2)
Two Words: "Core Image" (Score:3, Informative)
Once a Mac version of this is available, Core Image [apple.com] and "Quartz 2D Extreme" [apple.com] will put the extra vram to pretty good use.
Ars has a pretty good explanation about why the extra elbow room will make a difference, namely, the GPU won't have to hit its backing cache in RAM [arstechnica.com] as often.
who will ever need that much memory (Score:2)
(There is another post, probably more along this line.... for one [slashdot.org] and for another [slashdot.org].)
I tend to agree that people will find uses for the 512MB of memory in video cards. Of course there's the infamous Gates quote about "no one will ever need more than..." (or words to that effect)...
I have NO idea what I'd use 512MB of memory for on a video card.... My first inclination might be to back up the hard drives from my first three or four PCs into the video card's memory each night ;-). But I do know I'm using technology in
It's about more than just (choke, gasp..) games (Score:2)
Graphics cards aren't JUST designed for games...although it's hard to believe from what you read here.
Of course we have games to thank for great graphics cards which have allowed for the GUI to move onto the card
It won't stop the kiddies (Score:3, Funny)
This is the real reason why ATI even does such a weird-ass thing.
-Mommy, my penis is shrinking!
-Well son, let's get you a new videocard then!
That's just my opinion and experience of dealing with teenage computer users these days.
Re:It won't stop the kiddies (Score:2)
It has nothing to do with age.
good time to upgrade? (Score:3, Funny)
I've learned (Score:4, Insightful)
Well duh (Score:3, Insightful)
Can that have anything to do with texture resolutions not being there yet? They'll no doubt get there in the future though, so I can only see this as the first 512MB card, with more to come. I don't think it's really "bad", just a little bit ahead of its time.
Re:Well duh (Score:5, Insightful)
I see a lot of really sour posts on this one about how it's stupid, ridiculous, how a P3 500 is just fine, how last year's game runs great..
They say it costs twice as much but only helps one game? Then I say it's a sign of things to come. They've said this same crap about 3D video board memory for years. "You don't need 64MB!!!" "You'll never use 128!!" "256? You're stupid!"
If the video boards all have gobs of memory, then the games will all start to have gobs of high resolution, bump mapped, great looking textures. Why is this a bad thing? When the next generation of games hits the shelves in a year or so, they'll use that video memory.
And... (Score:3, Funny)
Graphics Research (Score:3, Informative)
I don't know where ATI expects to make the money on this (certainly not that much $$$ in the research market), but I'm personally glad that they released this card.
The big question in my mind now is how good the cache performance is on this new card.
Usefulness... (Score:4, Insightful)