New ATi 3D Chip 97
Cooper writes "Saw a piece on Sharky Extreme about a new ATI chip dubbed the Rage 6, which they say is going to be used on Microsoft's x-box as well as for PCs. It's got an on-board geometry processor like the NVIDIA GeForce. " Wow. 2 gigapixels per second? Wow. The graphics market is starting to really heat up - check the earlier story about the Voodoo 4 &5.
Intriguing... (Score:3)
Seriously, though, sheer pixels per second is kind of meaningless. (Actually, it'll be state changes per second. :) In most vector-based or polygon-based products, the bottleneck is in calculating the outer perimeter of the shapes.
Personally, I think pixel-based displays are a dead end anyway. Aliasing is horrible, and the techniques to get around it do so by making the picture too blurred to tell the difference.
What's needed is a pure analogue polygon-based display, capable of area fill and non-trivial shapes. 3D would be nice, too. Anyone got a holographic projector they could lend me?
Graphics cards (Score:2)
Anyway, which graphics card is "the business" right now, and which is expected to be next?
On another note, I'm not too enthusiastic about Microsoft's X-Box - it seems the forthcoming Sony PSX 2 is going to beat it into the ground, and the spec of Microsoft's product doesn't look that different from a PC anyway. Anyone think any differently?
The Playstation 2 looks like it might be the first games console I'll buy since the one I bought that played table tennis games in glorious black & white!
Why all those pixels? (Score:1)
But high-end video chipsets do have a purpose just a short way down the road: true immersive VR worlds. 3D gogglevision with lag time well below the perception threshold (10ms, perhaps?). Gibson-esque Cyberspace. The Matrix.
Between screaming video and fast wires, the technology is almost here. How long will it take for the content to follow?
Power? (Score:2)
I found it really funny that because each of the new chips uses 7 watts of power, if you have more than 3 of them the card can't get enough from the bus (AGP provides 20W of power)... so if you look at the picture, there's a brick that you have to plug into the wall outlet and then into the back of the card... /me is wondering how much heatsink/fannage you'll need for 4x7-watt processors...
Graphics market just starting to heat up? Huh? (Score:1)
Good god, where have you been? The acceleration of the graphical capabilities of PCs over the past two years (starting with 3dfx's DooDoo) has been incredible. Even last year's games were aeons beyond the games of the year before, and before that, improvements in graphics technology and 3D were dog slow. We have been in the middle of a revolution in 3D graphics speed and quality for more than two years now. Hardly "just starting to heat up".
Power! (Score:2)
This may parallel (fatally, as is the case for automotive analogies) automobile mufflers; I find that the really powerful automobiles have extensive exhaust systems, whilst any car with a "wimpy" exhaust system is itself necessarily "wimpy."
A graphics card that requires inordinate amounts of power might, on the one hand, be making flashy use of electricity to make it seem cool, but might truly be providing a whopping lot of "rendering power."
Of course, the killer question is whether or not there will be XFree86 accelerator code that can actually harness this power... Otherwise, these monstrous, smoking-at-the-ears graphics boards may be paper dragons, paralleling the Intel MMX, where few if any programs actually made real use of the optimization...
Re:Intriguing... (Score:1)
We've got quite a long way to go before that happens, though - think of the amount of information necessary for a 15" x 15" x 15" display that produces a true 3D holographic image!
Of course, the next step beyond that would be something that produces not only 3D but 'solid' displays that can be touched, a la the Holodeck concept. Of course - better be prepared to actually duck that rail gun shot!
Re:Intriguing... (Score:1)
Re:Why all those pixels? (Score:1)
-----
Support? (Score:2)
A lot of people give a lot of hot air to Open Source in general and Linux in particular.
Nvidia did a very nice thing by providing an optimized, glx version of the SVGA X server for the TNT cards. 3Dfx has recently made some moves in this area, but I still tend to think they are "closed thinkers" (GLIDE).
The question is: what has ATI done for me lately? Why should I even consider giving them my money?
It'll be obsolete in a year. (Score:3)
Of course, larger textures, more polygons and features like full-screen antialiasing, environmental bump mapping and true color are great, but it is hard to get worked up over any particular video card when each of them has only bits and pieces of the really cool features, and when it comes down to it, all they really do is allow some minor improvements in image quality and let you up your screen resolution and get more frames. Great for the obsessed gamer, but game developers still have to make games with the low end in mind, so the games themselves aren't tremendously more impressive from a graphical perspective.
For those reasons I'm a lot more excited about the Dolphin and PSX2 than I could ever be about a mere video card, or even a new class of video cards (like those with on-board T&L, which is meeting with mixed enthusiasm from developers btw.)
But all these incremental advances and all the competition are great. It means that maybe in 3 or 4 years, there'll be PC games where the game developers aren't limited by hardware anymore, but rather the developer's bottleneck will be the designer's creativity, effort and resources. That would be cool. :)
Re:Intriguing... (Score:1)
I don't think raster displays will work for that. Mine can't even handle 100 dpi and still be sharp. (Though I guess I can say my monitor does its own antialiasing.)
Analog has been tried before -- they're called vector displays (a la Spacewar and early Atari arcade games). For some odd reason they didn't catch on permanently. Go figure.
Finally, if you want to know how to create a 3-D image, I've found a link [mit.edu] to ways of doing it.
DAMN (Score:1)
If you think you know what the hell is really going on you're probably full of shit.
Hardware features (Score:1)
Now, comparing that FireGL card to other things available, it's quite lackluster.
- Its geometry setup actually slows down what my dual PIIs could do by themselves (at least, it feels that way)
New Graphics Card from Me, Inc. (Score:4)
Don't settle for wussy video cards that are limited to only 'pixels' and may or may not be out after Christmas. Use the Eyeball(tm) (or two) and enjoy true reality.
(please note that overclocking the Eyeball(tm) or removing the Eyeball(tm) from the EyeSocket(tm) in any way voids the warranty and may damage said Eyeball(tm))
Re:Ummm... I meant to write "VooDoo" (Score:1)
Re:Support? (GLIDE) (Score:1)
But, essentially, an implementation is available.
Pan
great more x-box rumors (Score:1)
Re:New Graphics Card from Me, Inc. (Score:1)
Re:New Graphics Card from Me, Inc. (Score:2)
All you need to do is press down on the socketsto reseat the firmware for a minute or two.
Days of fun.
Pope
WARNING: Transmeta speculation ahead... (Score:1)
Think of it as a hardware-interpreting-coprocessor!
Re:Saving my karma! (Score:2)
It's far more fun to forget about that... :-)
(No, there probably aren't many people with hundreds of karma points. My count is only in the "single" hundreds, not in multiple hundreds...)
Re:great more x-box rumors (Score:1)
Speed vs Reality (Score:2)
All of us in our shop have been reading reports of the "new-and-improved" graphics chips that are supposed to fly at this and that speed, in the giga-pixels-per-second range.
All is well, if those giga-pixels-per-second can be translated into SERIOUS USE.
For a gamer, giga-pixel performance might be sufficient, but for other serious uses - like REALTIME SIMULATION, for example - giga-pixels-per-second performance doesn't do us any good if it can't do simple refreshes at 30 frames/second or more, at a resolution of 2048*2048*32 bits/pixel.
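For reference, here is a rough back-of-the-envelope figure for what a bare 2048*2048*32bit refresh at 30 frames/second actually demands - just a sketch in C that ignores overdraw, Z reads and texture fetches, all of which multiply the real requirement several times over:

    #include <stdio.h>

    int main(void)
    {
        const double width  = 2048.0;  /* pixels */
        const double height = 2048.0;  /* pixels */
        const double bytes  = 4.0;     /* 32 bits per pixel */
        const double fps    = 30.0;    /* target refresh rate */

        double pixels_per_sec = width * height * fps;    /* ~126 Mpixels/s */
        double bytes_per_sec  = pixels_per_sec * bytes;  /* ~503 MB/s of writes */

        printf("fill rate: %.0f Mpixels/s\n", pixels_per_sec / 1e6);
        printf("framebuffer writes: %.0f MB/s\n", bytes_per_sec / 1e6);
        return 0;
    }

The raw pixel count isn't the killer; it's sustaining that bandwidth every single frame, with the real multi-pass workload on top, that the marketing numbers say nothing about.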
REALTIME simulations _ARE_ important to our shop, and the adage "a picture is worth a thousand words" rings very true for us, for there is no way we can catch faults by poring over the terabytes of data generated in a typical simulation.
We have reached the limits of our human capabilities in our demanding simulation environment, and when we _NEED_ the hardware to support us, all we see is marketing hype that does not translate into real use for us.
My hope is that one day the marketing hype will go away, and serious users like us will get the products that we truly require.
Anyone know how long we have to wait until _REALLY USEFUL_ products will come to the market?
Re:Support? (Score:1)
I don't think so, but an illuminating question... (Score:3)
That may be a gross oversimplification, but there you go...
The patents that Transmeta has been granted somewhat confirm this point of view; the patents represent ways of optimizing the emulation of those "lots-of-control-statements."
There may be a real killer graphics chip right around the corner, although it is easy to argue that the last three years have involved the continuous release of successive generations of "more-and-even-more-killer" graphics processors.
I'm not sure that there is a "Transmeta" of the graphics world; it's probably not appropriate to talk about such until next February when you might conceivably be able to buy some Transmeta product...
Re:Saving my karma! (Score:1)
I wish I had a nickel for every time someone said "Information wants to be free".
Re:Saving my karma! (Score:1)
I find it smarter to put "Go ahead and moderate me down for daring to say this" somewhere in my mail so as to come across as a martyr.
Oh, and I bet I will get moderated down for speaking the truth on this issue, but I will always be ready to give of my karma to serve the readers of slashdot!
-
We cannot reason ourselves out of our basic irrationality. All we can do is learn the art of being irrational in a reasonable way.
Re:ATI Blows (Score:2)
My ATI All-in-Wonder card works great for me. I've had absolutely no problems with it, and now that ATI has released the specs for it, I'm really looking forward to being able to use it for video capture.
Is there any particular reason you don't like ATI?
the rise of graphics = more fighting games :( (Score:2)
have become so big, there's just more and more fighting games.
Good Thing: Games are more flexible. Good ideas can become better.
Bad Thing: Since graphics have become less and less of a barrier, there's been less originality in games.
I long for the days of nes, snes, and genesis, where game programmers had to come up with ORIGINAL ideas. Now it's almost always shoot-em-ups!
Re:Speed vs Reality (Score:1)
Why are you looking to companies like ATI and 3dfx to build accelerators for realtime simulations? These companies build cards for people who want to play computer games. Duh.
What you need is to rent time in one of The Caves [uic.edu].
Next Generation Changes Hardware/Architecture (Score:2)
A number of companies have specialized in developing really unique and interesting graphics engines, which they are marketing to the big makers in the form of logic cores. They range from faster engines that render more polygons per microsecond, to more unusual designs that render blocks of the screen at a time, to really unique designs that take polygons completely out of the picture. These engines will find their way into the next generation of graphics chips very shortly.
Additionally, there have been a number of advances in the architecture of the systems themselves that will make for significant advances in graphics. The advent of really high-performance memory (Rambus, DDR, etc.) will certainly improve performance. And then there are some really unique designs coming that use copper interconnect technology to give roughly the same performance as "system on a chip" designs. And with all that, the die sizes of the chips themselves will keep shrinking (I think the new ATI chips are done on a .18 micron process). So suffice it to say, things are only going to get more interesting.
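As an aside, the "render blocks of the screen at a time" idea is easier to see in code than in prose. Here's a very rough C sketch of a tiler - the helper functions are hypothetical and purely for illustration, not any vendor's actual design - but it shows the point: color and Z live in small on-chip buffers, and each finished tile hits external memory exactly once.

    #define TILE 32  /* tile size in pixels */

    /* Hypothetical helpers: clear a tile, rasterize only the triangles
       touching it, and burst-write the finished tile to the framebuffer. */
    void clear_tile(unsigned int *color, float *depth);
    void rasterize_tile(int tx, int ty, unsigned int *color, float *depth);
    void flush_tile(int tx, int ty, const unsigned int *color);

    void render_frame_tiled(int width, int height)
    {
        unsigned int color[TILE * TILE];  /* on-chip tile buffers */
        float        depth[TILE * TILE];

        for (int ty = 0; ty < height; ty += TILE) {
            for (int tx = 0; tx < width; tx += TILE) {
                clear_tile(color, depth);
                rasterize_tile(tx, ty, color, depth);
                flush_tile(tx, ty, color);  /* one write per tile, not per pixel */
            }
        }
    }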
Re:Support? (Score:1)
It seems to me that even with the benefit of "the specs," ATI video always seems to be "acceptable" or "functional" instead of "really nice" (a la TNT) or "great".
I will continue to use Nvidia products, but I will refrain from calling people who buy ATI stuff names
-Peter
Re:DAMN (Score:1)
Re:ATI Blows (Score:1)
Re:Intriguing... (Score:1)
Seriously, though, sheer pixels per second is kind of meaningless. (Actually, it'll be state changes per second. :) In most vector-based or polygon-based products, the bottleneck is in calculating the outer perimeter of the shapes.
Not always; it depends on what you're doing. With simple scenes (less than 10k polygons), using texture maps, lightmaps, bump maps, translucency maps, etc., at high resolutions (if you want to reduce aliasing), major pixel fillrate is needed - it's still a big bottleneck (one of many).
Personally, I think pixel-based displays are a dead end anyway. Aliasing is horrible, and the techniques to get around it do so by making the picture too blurred to tell the difference.
Reality itself is aliased; it's just that the pixels are really, really small. There's no such thing as pure analog. And anti-aliasing techniques do not soften or blur the picture at all (like filtering techniques do); they make the edges cleaner but still crisp. Supersampling (a popular method of anti-aliasing that requires huge fillrates) duplicates what our vision does (the eye "supersamples" what it sees by averaging all the rays that fall on a particular rod or cone).
We don't have the retinal resolution to perceive the world around us at the atomic (let alone the Planck) level, so we see those nasty jagged edges as perfectly smooth. Once pixel-based displays get close to the limits of our eyes (1200dpi looks pretty good), we won't be able to tell the difference.
Not to say there aren't more efficient ways of doing this, but pixel displays will work perfectly well.
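Since supersampling came up: the idea really is as simple as rendering at a higher resolution and then averaging blocks of subsamples down to screen pixels. A tiny C sketch - 2x2 ordered-grid downsampling over an 8-bit grayscale buffer, with names made up purely for illustration:

    /* Average each 2x2 block of subsamples into one output pixel. */
    void downsample_2x2(const unsigned char *src, int src_w, int src_h,
                        unsigned char *dst)
    {
        int dst_w = src_w / 2, dst_h = src_h / 2;
        for (int y = 0; y < dst_h; y++) {
            for (int x = 0; x < dst_w; x++) {
                int sum = src[(2*y)     * src_w + 2*x]
                        + src[(2*y)     * src_w + 2*x + 1]
                        + src[(2*y + 1) * src_w + 2*x]
                        + src[(2*y + 1) * src_w + 2*x + 1];
                dst[y * dst_w + x] = (unsigned char)(sum / 4);
            }
        }
    }

The cost is obvious from the code: four times the pixels rendered (and four times the fillrate) for every frame you display, which is why those huge fillrate numbers actually matter here.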
Re:ATI Blows (Score:1)
I mean, the performance of all ATi Rage cards would be WAY better if they focused on producing excellent drivers for their excellent cards.
Or, if their new chip actually does 2 gigapixels, they wouldn't have to make useless stuff like the MAXX, which doesn't even match up to the GeForce 256.
AW
Odd trend in PC hardware design; where's the DSP? (Score:1)
I myself am a casual gamer and have played my share of Half-Life, Descent and Freespace (try Freespace, it's good).
Now if I look at any current game-capable home PC, I think those are quite a strange sight:
- A graphics card that generates a LOT of heat, so much that it needs its own fan.
Some graphics cards consume so much power that the mainboard's chipset starts behaving flaky. Some graphics cards consume more power than supported by not-that-old chipsets and can actually cause hardware defects on a mainboard.
- My desktop's graphics card has 32 MB of RAM. Come on, my laptop has 32 MB of system RAM and I do close to everything on it, including word processing, database development, web server stuff...
- Yet, the graphics card's awesome processing power and its RAM are used for gaming only. I mean, most of my time at a computer is spent programming or using office applications. At those times, these resources are just idly wasting power.
- What do these modern graphics cards do? They speed up 3D related calculations.
- Isn't that more or less a specialized variant of what a digital signal processor does? Or am I naive when it comes to DSPs?
- Why do we have to have specialized chips for graphics, sound, win-modems and whatever, when we could have used a single type of add-on chip to help the CPU? Why has the industry still not decided to put at least *one* versatile, programmable DSP on every modern computer mainboard? Those things are cheap, they are versatile, and they could be programmed to speed up a whole bag of different algorithms: audio (think live MP3 encoding - that was possible years ago with a DSP), graphics (Photoshop filters), encryption (see the sketch at the end of this comment). Again: or am I naive when it comes to a DSP's capabilities?
- Another related question: why the odd decision to push 4-way sound cards for 3D sound? 5-channel home movie theatre stereos have existed for a long time; why did no one in the industry decide to offer a simple card to connect to those?
Oh my. I don't claim to be an expert on DSPs or 3D audio. But still, from the little I know, I think that the last two years in PC hardware design went terribly wrong...
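To make the "whole bag of different algorithms" point concrete: the common thread is multiply-accumulate loops over streams of samples, which is exactly what a DSP is built around. A naive FIR filter in C - just a sketch of that kind of inner loop, nothing specific to any real DSP's instruction set:

    /* Naive FIR filter: out[n] = sum over k of coeff[k] * in[n-k].
       A DSP runs this multiply-accumulate loop in hardware, roughly one tap per cycle. */
    void fir_filter(const float *in, float *out, int n_samples,
                    const float *coeff, int n_taps)
    {
        for (int n = n_taps - 1; n < n_samples; n++) {
            float acc = 0.0f;
            for (int k = 0; k < n_taps; k++)
                acc += coeff[k] * in[n - k];
            out[n] = acc;
        }
    }

MP3 encoding, Photoshop-style convolution filters and software modems all boil down to loops of roughly this shape, which is why one programmable chip could in principle serve them all.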
------------------
Re:Odd trend in PC hardware design; where's the DS (Score:2)
But I think that the concept is right; IBM just failed to actually execute it properly.
------------------
Improbability Accelerator? (Score:1)
------------------
Re:DAMN (Score:1)
You are stupid.
Very stupid.
Don't you hate eating your own words?
If you think you know what the hell is really going on you're probably full of shit.
Huh? (Score:1)
"I long for the days of nes, snes, and genesis, where game programmers had to come up with ORIGINAL ideas. "
And frequently didn't. Can we say Mortal Kombat and [insert SNES Street Fighter clone here]?
"Now it's almost always shoot-em-ups!" ;)
Please don't confuse 3D first-person-shooters with the traditional shoot-em-ups; as defined at Shmups [classicgaming.com].
Why do you care? (Score:1)
Like Nintendo with their Project Reality, Ultra 64, etc. Even Nintendo now discloses more than MS. Nintendo has to be the most secretive and strange company. They do some of the strangest things, but they can pump out the games. They can make a good game and find companies that can make fun games for them.
Can Microsoft do that? Unless they basically try buying their way in, they will have more of a problem than they expect. Nintendo and Sony have deep pockets. And Microsoft has no series characters, like Pokemon, Mario, or Crash. And no Square.
And at least Nintendo basically has disclosed general info. Like the IBM copper CPU, etc.
I don't really care too much. Good games make people buy consoles, not the processor or the damn graphics card.
How to program the rapidly advancing chipsets (Score:3)
That will change over time... (Score:1)
damn preview (Score:1)
Personally, I like the expanded colour range and on the fly hardware particle generation.
All you need to do is press down on the sockets to reseat the firmware for a minute or two.
Days of fun.
Thanks for watching. I'll spell better next time.
honest.
Pope
Re:Improbability Accelerator? (Score:1)
Or are you looking to keep your tea warm?
Either would work.
Does wonders for those real-time particle generation experiments.
But it just makes the QuakeBots jumpy.
Pope
come on, look around (Score:1)
can you say Tribes!
can you say Thief!
can you say Half Life!
there's more than just Quake 3 these days
:)
Re: creativity (Score:1)
Re:ATI 3d...it makes SGI 3d look like childs play! (Score:1)
between games and professional work. Tell me ONE single professional 3D program that runs on Linux (except Houdini). The O2 came out in 1996/1997, and you are comparing a 200MHz O2 made for _modelling_ with a brand new 500MHz (or whatever you've got) P3 with a brand new graphics card made for games.
That's like comparing apples and oranges!
And the fact that DD rendered Titanic on Linux doesn't say shit, because rendering is all about CPU number-crunching and you only need an OS that doesn't crash every second and doesn't cost an arm and a leg, so they could have used *BSD instead of Linux.
Re:Power? (Score:1)
I can understand the V5's need for an external supply, since the bus alone can't provide that much power. But why an external brick? Why not an ordinary drive-style power connector internally that can tap the system's existing power supply?
Gay! Totally gay! Liberace gay!
Well.. (Score:2)
I know what you might say: "Maybe your one card was bad..." I thought of that, so I ordered another one and had the same results.
The other thing was that for its first few months it was the second-closest thing to vapourware after Daikatana. The only people who could get hold of one (after the release date) were people who review hardware.
Will the "New" ATi Rage Fury MAXX be the same story? If so then the niVidia GeForce will still be my First choice.
Conclusion: wait till they both have been released. If you can get hold of one for free, test it and post your results for others to see. And for those who can't: check online before buying either.
The age of abstraction is here to stay. (Score:2)
As 3D chipsets multiply, software abstraction becomes increasingly important. Application developers can't be expected to keep up with the onslaught of new hardware. If manufacturers want developers to take advantage of their new hardware, they have no choice but to support standard libraries.
I certainly wouldn't go out of my way to support a particular chipset if the manufacturer didn't bother to provide proper OpenGL support...
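To illustrate what "supporting standard libraries" buys the developer, here is the sort of chipset-agnostic code OpenGL makes possible - a minimal fixed-function sketch, where the vendor's driver, not the application, decides how the calls map onto the silicon:

    #include <GL/gl.h>

    /* Draw one shaded triangle through the standard OpenGL API.
       The same code runs on any card with a conforming GL driver. */
    void draw_triangle(void)
    {
        glBegin(GL_TRIANGLES);
            glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
            glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
            glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
        glEnd();
    }

Whether the chip has hardware T&L, a tiler, or nothing but a fast fillrate is the driver's problem, which is exactly the abstraction the application developer needs.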
Re:great more x-box rumors (Score:1)
Viglen, a British computer company that Amstrad took over, has the rights to splash the Microsoft logo over their machines...
Ancillary buffers (Score:1)
Re:Power? (Score:1)
It mentions (in the interview, I think) that the Voodoo5 6000 needs a brick because 3dfx feels it'd stress a normal PC PSU too much - lame, I know, but that's what they're saying. The V5 5000 and 5500 will have internal PC PSU connectors, apparently.
Hope that clears it up.
--
Re:Power? (Score:1)
here [3dfx.com] is the correct link
--
Re:I don't think so, but an illuminating question. (Score:2)
It's entirely possible to stack multiple processors onto one die.
(That's how the 486 and Pentium work, merging the 386 and 387 processors into a single unit. The Cyrix MediaGX went one step further, merging the graphics processor in as well.)
If Transmeta have a "universal instruction set" general CPU, merged with ultra-fast GPU and MPU, it would rip to shreds everything else out there.
Re:Hardware features (Score:1)
How come it won't do 3d in a window?
My Permedia2 can, but only up to 1024x768x16bit.
Have you tried to lower the resolution?
Re:Odd trend in PC hardware design; where's the DS (Score:1)
Most modern "analog modeling" synths use one or more DSP's as their sound-engines.
Clavia Modular series, Yamaha An1x, Virus, etc, etc.
Re:ATI 3d...it makes SGI 3d look like childs play! (Score:1)
You'll sink your puny little Pentium, since the card does absolutely *nothing* to help you. To get 85 fps with a low poly count, all you need is a high pixel fillrate and a texture mapper.
High-end cards aren't made for gaming, so they put lots and lots of computational power into them instead of giving them a totally pointless high fillrate.
Re:I don't think so, but an illuminating question. (Score:1)
I've done some research on the 3D graphics chips of UNIX workstations and PCs over the last five years, and Transmeta's approach doesn't make any sense as a peak-performance graphics chip. While I must apologize to any real 3D chip designers out there for my generalizations and any misconceptions they may spot with their even greater experience, I'll try to summarize why programmable chips don't make sense for fast 3D:
1) If the algorithm a chip must execute is fixed (as it generally is with 3D algorithms), nothing is faster than a well-designed hard-coded ASIC that lays down precisely the circuits needed for optimally accelerating that algorithm.
2) If the algorithm varies substantially, a general purpose processor is more useful. In cases where there is significant unpredictable branching in the algorithm, a general purpose CPU will be optimal. In cases where there is strong data parallelism, a DSP will be optimal.
3) Run-time programmable logic, such as FPGAs (field programmable gate arrays), microcode, or Transmeta-style programmable logic, has traditionally been best for cases where
3A) you want to trade off top-notch ASIC speed for programmability in case your algorithm isn't debugged or you get flaws in the silicon
3B) you want to accelerate a broad, flexible set of functions faster than, say, software, but don't want the expense of a general purpose microprocessor
3C) you really want to accelerate one algorithm now, but in a few minutes you want to accelerate another. With 3D you're changing the path picked through the 3D pipeline multiple times per frame, at every state change. You wouldn't be able to reconfigure your chip fast enough to optimize for that kind of change; at best, you might reconfigure your logic for a particular 3D game and the 3D primitives it uses.
Transmeta and FPGA chips can of course accelerate 3D logic, but what you have to realize about 3D logic is that
1) it is very branchy - lots of special cases depending on just whatever mode you're in (this makes DSPs a poor fit for rasterization, although Intergraph has used DSPs for geometry acceleration)
2) the front end geometry processing is primarily floating point (vertex) matrix manipulation, the back-end rasterization is primarily integer (pixel) manipulation; your architecture must provide both types of execution units, in the right proportion
3) the process of shipping all the various vertex, texture and triangle-to-pixel data is very timing-sensitive, requiring lots of dedicated FIFO buffers for optimal acceleration
4) many pixel operations, such as pixel walking or Gouraud shading, generally require only very simple increment operations that don't need a full integer unit as one would find in a CPU or DSP (FPGAs would be better here, ASICs better still - see the sketch after this list)
5) the data path between circuits on the chip grows practically exponentially as you go through the 3D graphics pipeline. Megabytes of vertex data turn into gigabytes of pixels. A general purpose CPU or FPGA is not typically optimized for this.
6) more so than FPGAs or CPUs, a 3D chip has to be optimized for huge output bandwidth to the frame buffer, both read bandwidth (for Z buffer and blending operations) and write bandwidth, with a separate set of data lines for reading in the initial vertex/texture primitives. The backend frame buffer bandwidth typically requires more pins than you'd have in a CPU/FPGA package. And most CPUs, DSPs, and FPGAs don't have such split memory controller setups integrated into the package, requiring a more expensive external chip.
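To illustrate point 4: once the per-span deltas are set up, Gouraud shading along a span is nothing but additions, which is why a dedicated incrementer beats a general-purpose ALU or DSP here. A rough C sketch (16.16 fixed point, packing into a 32-bit 00RRGGBB framebuffer; the names are illustrative only, not from any real design):

    typedef struct { int r, g, b; } FixedColor;  /* 16.16 fixed point */

    /* Walk one horizontal span, adding a constant color delta per pixel. */
    void shade_span(unsigned int *fb, int x0, int x1,
                    FixedColor c, FixedColor dc)
    {
        for (int x = x0; x < x1; x++) {
            fb[x] = ((unsigned int)(c.r >> 16) << 16)
                  | ((unsigned int)(c.g >> 16) << 8)
                  |  (unsigned int)(c.b >> 16);
            c.r += dc.r;  /* nothing but simple increments */
            c.g += dc.g;
            c.b += dc.b;
        }
    }

In silicon that loop collapses to a few adders and a memory write per clock; burning a full CPU- or DSP-class ALU on it is pure waste.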
If this got too technical, I apologize for not having time to make it simpler and/or more precise. But nothing I've seen about Transmeta rings any bells as having promise for making a faster 3D graphics chip - something I'd be very interested in.
--LP
Probably not a relevant strategy... (Score:2)
But that word "address" is the critical thing... In order to stack many processors together, and make use of them, you need considerable memory management hardware so that those CPUs can actually address memory, and not trample on one another whilst doing so. Parallelism isn't trivial to harness...
Re:ATI : Linux Hostile? (Score:1)
That's why I have a 19.2 modem instead of a cable modem.
Companies only understand money.
Thanks for explaining... (Score:2)
If you really want Xscreensaver to work right. . (Score:1)
Xscreensaver will work double-buffered (which means it won't flicker, if anybody else has that problem), but expect that you will have bugs in other GL apps. It will also be a lot less stable, and an unstable X can and will crash the entire system. To top it off, no matter how hard you try to optimize it with -mpentiumpro cflags and such, the CVS driver will be slower than the binary released by nvidia.
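For anyone wondering what "double buffered" means in GLX terms: you ask for a double-buffered visual up front, draw into the back buffer, and swap when the frame is done. A minimal sketch using standard GLX 1.x calls, nothing NVIDIA-specific:

    #include <GL/glx.h>

    /* Request a double-buffered RGBA visual. Drawing goes to the back
       buffer and glXSwapBuffers() flips it, which is what kills the flicker. */
    static int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER,
                             GLX_RED_SIZE, 1, GLX_GREEN_SIZE, 1, GLX_BLUE_SIZE, 1,
                             None };

    XVisualInfo *choose_visual(Display *dpy)
    {
        return glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    }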
Re:Thanks for explaining... (Score:1)
The only way I can think of to solve it would be to force moderators to read a certain set of posts that haven't been seen by other moderators (somewhat like the meta-moderation process.) I admit this would add an additional small discouragement factor to moderators.
--LP
Re:Huh? (Score:1)
Anyway, you forgot Virtua Fighter 1 + 2 for Windows 95. They are/were the definitive 3D fighting games. I wish they had more for the PC. Tekken 3 would sure as hell be nice. VF3 would probably run in some form on a PC. And while someone's at it, make a Linux port.
Re:New Graphics Card from Me, Inc. (Score:1)
Re:New Graphics Card from Me, Inc. (Score:1)
Re:Saving my karma! (Score:1)
I think people SAY that in their posts because they feel like they're going out on a limb, or have low self-esteem, or something like that; and in reality, they ARE going out on a limb, and it's limb-worthy thought that earns good moderation points.
That's the optimist's view, though.
I wish I had a nickel for every time someone said "Information wants to be free".
DSP and 3D sound stuff.. (Score:1)
With 3D audio, there is a sound card that will output sound to a Dolby Pro-Logic Surround Sound capable receiver: Creative Labs' SoundBlaster Live! series, including the original card, the value edition, the MP3 edition, the platinum edition, etc. The first main software upgrade to the sound card (LiveWare 2.0) included the ability to encode audio in Dolby Surround format, so you can plug it right into your favorite home-theater stereo and blast yourself away. Believe me, it's fun. =)
Re:Ancillary buffers (Score:2)
Re:New Graphics Card from Me, Inc. (Score:2)