Matrox Parhelia 512 Preview
SpinnerBait writes "Finally, you don't have to sift through all the unreleased and unauthorized bogus information around the net about Matrox's upcoming 3D graphics chip, the Parhelia 512. Matrox has taken the wraps off their next-generation GPU, and this Preview over at HotHardware goes through its feature set with a fine-toothed comb. They also give you a very rare glimpse inside Matrox's Montreal headquarters, as well as a look at some very impressive technology demos rendered on their new chip. Looks like impressive stuff for sure."
Macs (Score:1)
PS Frist P0st
Wow... (Score:2, Interesting)
Re:Wow... (Score:1)
sad sad sad.
oh wait, the same statement probably applies to my three computers. OUCH.
Far better look at the card here (Score:2, Informative)
http://www.hardocp.com/articles/parhelia/index.
http://www.hardocp.com/articles/parhelia/analys
Why not go to the source? (Score:2)
Re:Please provide LINKS (Score:1, Flamebait)
Is this crap? (Score:1, Insightful)
"Gigacolor", as Matrox likes to call it, is otherwise known as full display of over 1 Billion colors. Before you peg the "Marketing-Hype-O-Meter" too far, believe or not, the human eye can definitely tell the difference between 16 Million and 1 Billion colors
Now if I remember correctly there are fewer than a dozen monitors that can produce this kind of detail (please correct me if I am wrong), and no one can realistically tell the difference (once again, please correct me if I'm wrong). Anyhow, I can see something more than 5.2 on the marketing hype-o-meter
less than a dozen monitors that can... (Score:2, Informative)
It's a bit like using 24bit sound recordings to mix and then downsampling them to 16bit.
Re:Is this crap? (Score:2, Insightful)
24-bit (16 million colors) is a lot, and I certainly have difficulty telling the difference between color #70e0e0 and #70e1e0, but when you want a nice background where the top is plain blue (#0000ff) and the bottom is black (#000000), there are only 254 levels between those. And I can clearly see the lines where the blue color value changes.
And that's where more colors shine. Just using 10 bits instead of 8 reduces those color bands by a factor of 4.
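To put rough numbers on that (a quick sketch; the 1200-pixel gradient height is just an assumed example):

# Rough band-size estimate for a vertical black-to-blue gradient
# (illustrative numbers only; assumes a 1200-pixel-tall gradient).
height = 1200
for bits in (8, 10):
    levels = 2 ** bits         # distinct blue values available
    band = height / levels     # pixels stuck on the same value
    print(f"{bits}-bit blue channel: {levels} levels, bands ~{band:.1f} px tall")
# 8-bit:  256 levels  -> bands ~4.7 px tall (visible stripes)
# 10-bit: 1024 levels -> bands ~1.2 px tall (much harder to see)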
Instead of not using those alpha bits at all (in 32 bit color mode), one might as well use them for nicer colors. Now which OS supports that mode? X11?
Harald
~~~
Human colours are not the same as computer colours (Score:2, Informative)
Actually, you should make that black to blue to white. And while you'll manage to distinguish the colours at the centre of the scale (near "pure" blue), I doubt you'll be able to distinguish the colours at the top and bottom (near white and black).
The limitations of 24-bit colour can also be dealt with by dithering. Most high-end animation programs render internally at 48 / 64 bits per pixel (16 bits per component) and then dither the image when they convert it to 24 bpp (8 bpc). This would result in a much smoother transition from black to blue (and then to white), with no visible banding.
Most modern graphics cards already do real-time dithering, but only in 16-bit modes (which still work internally at 24 / 32).
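A minimal sketch of that conversion step, assuming a simple random dither (real renderers typically use ordered or error-diffusion dithering, but the effect on banding is the same idea):

import numpy as np

rng = np.random.default_rng(0)

# One row of a smooth 16-bit-per-component ramp (e.g. black to blue).
ramp16 = np.linspace(0, 65535, 4096)

# Naive conversion to 8 bits: long runs of identical values = banding.
plain8 = (ramp16 / 257).astype(np.uint8)

# Add sub-quantum noise before rounding: band edges dissolve into a
# mix of neighbouring values that the eye averages out.
dither8 = np.clip(ramp16 / 257 + rng.uniform(-0.5, 0.5, ramp16.size),
                  0, 255).round().astype(np.uint8)

def mean_run(a):
    # average length of runs of identical consecutive values
    return a.size / (np.count_nonzero(np.diff(a)) + 1)

print("mean band width, truncated:", mean_run(plain8))   # ~16 pixels
print("mean band width, dithered: ", mean_run(dither8))  # much shorter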
RMN
~~~
Re:Is this crap? (Score:2)
Think I'll talk to my contact at Matrox to see if we can get ahold of one of these and support this mode.
It is. (Score:4, Insightful)
For example, most people can distinguish between two very similar 24-bit medium greens but not between three or four similar 24-bit dark blues.
That said, no monitor can accurately represent 16 million colours, let alone several billion. Even if they could, the dynamic range of monitors is very limited compared to the range our eyes can see (i.e., monitors have very limited brightness compared to the normal sunlit world), so most of those colours would be wasted.
Higher colour precision is good because it minimises round-off errors, but this applies mainly to internal calculations (some operations are done directly on the final framebuffer, but very few). For display, 24 bits (and a good monitor) are more than enough.
RMN
~~~
Re:It is. (Score:5, Funny)
If i remember back to biology this is because there are lots of green (Natural) foods but not so many blue ones and we therefore have allocated more cones in our eyes to distinguishing greens than blues.
This is why blue M&M's are an affront to nature
It's our brain that sees, not our eyes. (Score:1)
For all I know, some food may have lovely ultra-violet or infra-red shades mixed with their yellow or green, but we just can't see them (some animals can). In fact, if we could "see" much longer wavelengths, we'd have heat vision. Cool but probably a bit confusing.
It's also interesting that, while our eyes have receptors that are sensitive to YGBL (Yellow, Green, Blue and Luminance), we tend to think in HLS (hue, luminance and saturation).
Our brain "constructs" the red parts of the image from the other signals. If our eyes see something that has high luminance, some yellow, very little green and no blue, we perceive it as red (note that here when I say "yellow" I mean "something that is detected by our 'yellow' receivers", not pure yellow).
This is actually similar to the way TV signals are transmitted (a black-and-white signal plus two colour-difference signals, so it's compatible with both B&W and colour TVs).
And, of course, not everyone's eyes are calibrated the same, so what is brownish green to one person can be greenish brown to another, and so on.
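For reference, the luma-plus-colour-difference encoding mentioned above looks roughly like this (a sketch using the BT.601 luma weights; real broadcast signals add scaling, offsets and filtering that I'm leaving out):

def rgb_to_luma_diff(r, g, b):
    # One luminance signal a B&W set can use directly, plus two
    # colour-difference signals that colour sets use to get R and B back.
    y = 0.299 * r + 0.587 * g + 0.114 * b   # note green's large weight
    return y, b - y, r - y                  # (Y, B-Y, R-Y)

# Pure red: plenty of luma, a strong positive red-difference and a
# negative blue-difference -- the receiver turns that back into "red".
print(rgb_to_luma_diff(1.0, 0.0, 0.0))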
RMN
~~~
Not if you're human... (Score:3, Informative)
I recommend reading a bit more on the subject before making such definitive statements. You can start with this:
Spectral sensitivity of the human eye [uq.edu.au]
As you can see, at 650 nm (pure red), the cones are almost blind. The brain combines this information with what it gets from the rods (luminance) and realises that there is some colour there. And since it has no blue, almost no green and only a little yellow, it's translated to "red".
TVs use RGB (red, green, blue) just as they could use CMY (cyan, magenta, yellow) or any other group of complementary colours (of which there is an infinite number - any three colours that are 120° apart on a spectrum wheel will do). It has nothing to do with the actual wavelengths that the receptors in our eyes are tuned to.
You may also want to read some more about how TV colour signals are encoded (messy but interesting) and why current standards are as they are. Do a quick search on the internet and I'm sure you'll find plenty of pages about it.
RMN
~~~
If it works well enough, it probably won't evolve (Score:2)
Things like earlobes, pubic hair, and the fact that Windows 2000 is actually quite stable.
RMN
~~~
Re:If it works well enough... (Score:2)
No. Evolution means improvement. The theory of evolution through natural selection (which is what most people mean when they say "evolution") says that species tend to improve naturally when those improvements increase their probability of reproducing successfully.
But it's not always clear if a certain change will improve a species' chances of reproduction and/or survival.
And when a certain characteristic has little or no relevance in survival and reproduction, then it will stay or go based purely on chance.
The reason we are more sensitive to green is because there was something related to it that allowed our ancient ancestors to get laid more
And if you can find out what this was, your theory may be right. Personally I cannot. That's what I meant when I said that there's a certain tendency to use the theory of evolution through natural selection to "explain" things that do not fit its definition.
If this were not true, then there would be more people who were not more sensitive to it.
The reason why we are more sensitive to green is a natural consequence of two things: 1. green wavelengths are at the middle of our visible spectrum and 2. our photon receptors aren't 100% accurate, so they don't react to just one wavelength. The result is that the blue and yellow receptors are also partially sensitive to green, so there's an increase in green "resolution".
It's sort of the way single-CCD cameras work (two of every four pixels are green). This translates not only to better spatial resolution but also to better colour accuracy.
One of these likely advantages would be the ability to more easily distinguish between plants that would kill you, and those that are nutritious.
You can't distinguish between poisonous and edible plants based on colour. There are poisonous and edible plants of just about every colour and shade.
Most animals are colour-blind and are quite able to distinguish what they can eat from what they cannot. Smell (and experience) are much more important than colour.
In fact, it works the other way around. Since almost all insects can see colour, it plays a major role in plant reproduction and survival, because the plants with the most striking colours will attract more insects and therefore reproduce more.
Human vision could be improved by covering a slightly wider spectrum, but there's no "natural" incentive for that to happen, so it doesn't. Women won't magically fall in love with me and ask me to be the father of their children just because I can see ultra-violet light (er... will they?).
When something is good enough, it'll stay that way for a long time. Nature is lazy. Which makes me a naturist.
RMN
~~~
Re:It is. (Score:1)
Sunlit world? What's that?
Re:It is. (Score:2)
RMN
~~~
Re:Is this crap? (Score:1)
Think about it this way: you apply 4 texture maps to the same object, and then have a couple of light sources... some fog. If everything gets rendered at 32-bit colour (8 bits per channel) you will get more rounding errors.
The Matrox Gigacolor will reduce banding and visual artifacts in rendered scenes...
I would assume that the end result will still be a 32bit image sent to the display.
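A toy illustration of the round-off build-up (not how any real GPU pipeline is wired, just the arithmetic; the stage values are made up):

import random

random.seed(1)
stages = [random.uniform(0.3, 1.0) for _ in range(6)]  # 4 textures + light + fog

def run(bits):
    levels = (1 << bits) - 1
    v = 0.8                                  # starting channel value
    for s in stages:
        v = round(v * s * levels) / levels   # quantise after every stage
    return v

exact = 0.8
for s in stages:
    exact *= s

for bits in (8, 10):
    err = abs(run(bits) - exact) * 255       # error measured in 8-bit steps
    print(f"{bits}-bit intermediates: final error ~{err:.2f} / 255")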
30-bit color helps mainly for movies (Score:1)
On still images the difference between 8- and 10-bit colour is not that significant; the human eye does a decent job of interpolating the bands. Where 10-bit really shines is in moving pictures, either in games or movies. When the bands move across the screen because the camera is moving past a star, the bands are really evident in 8-bit.
Re:Is this crap? (Score:2)
We are much more attuned (visually, of course) to differences in intensity (luminance) than to differences in color. It is possible for monitors to have a higher luma response than chroma response, so...
Having 30 bits' worth of intensity variation (OK, call it 10 bits if you will) can provide more distinct rendering, even on monitors that don't support 1 billion "colors".
Don't forget... (Score:4, Insightful)
Re:Don't forget... (Score:1, Informative)
Hmmm (Score:2, Interesting)
Or more seriously, I wanted to upgrade before (from my G400), but GeForce / ATI have poor 2D performance and some bad filters on their cards which require a bit of hacking to sort out (ish), and Matrox didn't have a viable home/gamer solution; sure, their 10-bit medical cards look nice, but not quite for me.
The only problems I have had in the past with Matrox cards are:
Poor OpenGL support, though the drivers seem to have been fixed as of Feb this year.
Their Linux support is a little, well, patchy. They do provide drivers, but they're only half open and a bit of a pain to get working correctly; some of the problems may have been down to old X4 versions though.
Well, I'll buy one in the next couple of months and try to post a more informed comment!!
Re:Hmmm (Score:2, Insightful)
Return to Castle Wolfenstein on a two-plus-year-old card (G400 MAX) with reasonable framerates - I'm OK with that and am looking forward to a new Matrox.
Fast asynchronous pipelines? (Score:1)
Evans & Sutherland were the people who made the military simulators long long time ago
More links on Parhelia (Score:2, Informative)
http://www.hardocp.com/articles/parhelia/index.
http://www.hardwarezone.com/articles/articles.h
http://www.matrox.com/mga/products/parhelia512/
I hope they make Linux drivers for it. Hardware text AA seems kinda cool.
Re:More links on Parhelia (Score:2)
Anandtech has a full preview on it too (Score:5, Informative)
The summary mentions quite a few interesting notes regarding the effect this card would have on current games.
- In "simple" games like Quake III Arena, the Parhelia-512 will definitely lose out to the GeForce4 Ti 4600. By simple we mean games that generally use no more than two textures and are currently bound by fill rate. NVIDIA's drivers are highly optimized (much more so than Matrox's) and in situations where the majority of the Parhelia's execution power is going unused, it will lose out to the Ti 4600. This can change by turning on anisotropic filtering and antialiasing however, where the balance will begin to tilt in favor of the Parhelia.
- In stressful DX8 games, Matrox expects the Parhelia-512 to take the gold - either performing on par or outperforming the GeForce4 Ti 4600. Once again, as soon as you enable better texture filtering algorithms and antialiasing the Parhelia-512 should begin to seriously separate itself from the Ti 4600. The quad-texturing capabilities of the core as well as the 5-stage pixel shaders will be very handy in games coming out over the next several months.
So from the look of it, the Parhelia does not wipe out NVIDIA (though I would like it to), but is a worthy competitor to NVIDIA in current games. It would be interesting to see how ATI and NVIDIA match up to this new competitor in the coming months.
Be afraid. Be vewy vewy afraid.
Re:Anandtech has a full preview on it too (Score:2)
Re:Anandtech has a full preview on it too (Score:1)
100+ fps in Q3A is NOT too much! Here's why: (Score:5, Informative)
- First off, Q3A is used as THE single standard metric to see how a card will perform under a common load. It's a very good way to judge the raw speed of a card overall, and often provides good pointers as to overall performance in fancier modes or other games, but it certainly doesn't mean every game you play will be 100+ fps.
- Second, that figure is an AVERAGE. When actually gaming, the average framerate is not the issue - the MINIMUM framerate is the killer. 60 fps average is fine, but when the framerate drops to 10-15 fps in a heavy firefight, you're in trouble. A higher average framerate usually translates to a higher minimum as well. In fact, many sites have taken to quoting minimums as well, or even showing a complete framerate graph.
- Third, the ability to manage 100 fps at e.g. 1024x768 means only around 40 fps at 1600x1200, if your monitor extends that far, or perhaps only 30 fps at 1024x768 with 4x AA if it doesn't (quick arithmetic below the list). Your card will need to score 200 fps if you want to improve your resolution/AA, or maybe even 300 fps if you want to do that and still keep your minimum fps above 60.
- Fourth, the same argument applies to other quality improvements like trilinear and anisotropic filtering. Taking 32 texture samples instead of 4 can really kill your framerate, so you'd better hope you're getting enormous framerates with non-anisotropic filtering if you hope to get acceptable speed with anisotropic filtering enabled.
- Fifth, Q3A is not the only game out there. There are a lot of more demanding games available today, even those based on the Q3A engine like RtCW, that will give you much lower framerates.
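Here's the quick arithmetic behind point three (it assumes the game is purely fill-rate-bound, which is only roughly true in practice):

# Fill-rate-limited scaling estimate: fps drops with pixels drawn per frame.
base_pixels, base_fps = 1024 * 768, 100

for w, h, note in [(1600, 1200, "no AA"),
                   (2048, 1536, "1024x768 with 2x2 supersampling")]:
    print(f"{w} x {h} ({note}): ~{base_fps * base_pixels / (w * h):.0f} fps")
# -> ~41 fps at 1600x1200, ~25 fps for 4x-supersampled 1024x768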
Combining two or more of the above factors can bring the fastest graphics card to its knees, even if it scores 200 fps in Q3A. We'll have to wait until we see scores of 300 or 400 before we can expect to play Jedi Knight II at 1600x1200 with 9x AA and 16-sample anisotropic filtering, while never dropping below at least 30 fps. But boy, will it look good when we can :-)
Ideally, a review will give individual scores for all the above - high resolution, AA, anisotropic filtering, a range of modern games, and all combinations of the above. But since this would entail a vast amount of testing and a huge array of numbers, most reviews settle for a few known tests that are indicative of performance in other tests. And the most popular of those is good old Q3A.
Re:100+ fps in Q3A is NOT too much! Here's why: (Score:2)
Re:Anandtech has a full preview on it too (Score:2)
I am having a hard time getting excited about a video card that is not out yet that will be as good as a video card that has been out for months
Re:Anandtech has a full preview on it too (Score:2)
In theory you/they are correct, but what about the Parhelia in practical tests? That's what I'm waiting for.
Think about it before you mod me down as a troll/flamebait/offtopic
The G200 looked impressive too but didn't deliver (Score:4, Insightful)
Re:The G200 looked impressive too but didn't deliv (Score:2, Interesting)
What does it actually do? (Score:1)
Link with more information & screens (Score:1, Redundant)
http://www.hardwarezone.com/articles/articles.h
We'll see... Maybe they're back? (Score:2)
At last...fresh(er) blood (Score:1)
This is what is really needed in the industry. nVidia and ATI have been the top dogs for a while and the new releases have been a little stale. Sure, the GeForce 4s have been nice, but there are those out there who think that the GeForce 3s give better image quality. Then there's ATI and its new Radeon 8500 128MB card... it's just an 8500 with 2x the memory.
Matrox entering the ring again with this new chip and its abilities should rattle the windows for a bit, and we'll see nVidia and ATI scrambling for next-gen cards to outperform Matrox.
It's a competitive situation that promotes quality product for everyone.
Now if only M$ would get the clue eh?
Better than their TV add-on cards? (Score:2)
What Supporting Hardware Does One Need? (Score:3, Interesting)
Now we move on to monitors. Could someone recommend a monitor that I can use to accurately resolve 1 billion colors? I tend to run my 2 Viewsonic PT775's at 1600 x 1200 so I've grown accustomed to that much "real estate".
This sounds like an awesome card, but I really don't know where to go or what to get to reap all the benefits of it.
Lastly, precisely when and where can a fellow technogeek acquire one? Since the HotHardware site seems to be experiencing some serious "Slashdot Effect" I was unable to finish reading the entire article. MSRP and a release date would be very useful.
Vortran out
Re:What Supporting Hardware Does One Need? (Score:3, Informative)
And this would be for the fastest Matrox card. There might be more than one flavour, but don't expect them to diversify it like Nvidia does (Ti 200, Ti 4400, Ti 4600 and the MX 420 and MX 440).
Shipping Date : Late June
Re:What Supporting Hardware Does One Need? (Score:1)
Re:What Supporting Hardware Does One Need? (Score:2)
If you build it they will come. When Voodoo introduced 32-bit maps, games were designed to support it. I'm sure that with such a powerful card, especially with regards to mapping, games will jump to look more realistic, which seems to be the goal. My concern is still the price; I don't feel like paying $300 and up for a card just to play video games - I'd rather just get a PS2. I still think that computers are not suitable gaming platforms. It's much more fun playing in my living room on my 40" TV with my friends than in my study on a 19" monitor.
Re:What Supporting Hardware Does One Need? (Score:2)
I've been doing some pricing on a new machine for myself (which I cannot afford yet), and some of the newer KT333 chipsets come with AGP 8x, as well as USB 2.0 etc.
It's out there, but do we use it? not yet....
Re:What Supporting Hardware Does One Need? (Score:2)
Monitors: Analog monitors (eg. the one you have today) can display an infinite number of colors. The DAC (digital-to-analog converter) on the graphics card creates the appropriate analog signal. The real question is whether digital DVI monitors will support more than 24 bits of color.
Where: Matrox has a list [matrox.com], including their own online store [shopmatrox.com]. CDW [cdw.com] seems to carry most Matrox products.
When: June.
Price: $450 for the top-end, low-end was not specified anywhere I could see.
Re:What Supporting Hardware Does One Need? (Score:2, Insightful)
'Hot' hardware ??? (Score:1)
Any pictures of the card (Score:1)
Mean while over at Nvidia... (Score:2, Interesting)
I kid you not!!!
Re:Mean while over at Nvidia... eh 3D labs: (Score:2)
http://www.extremetech.com/article/0,33
Which in my eyes sounded a lot better than Matrox's offering, since it was much more general-purpose. But on the other hand, Matrox knows what features are really needed, and the PS2 showed that general-purpose features won't get you anywhere if they are hard to use. Feature-wise it's a draw, but they are two different kinds of beasts.
Extremetech also has a thorough discussion of the Matrox release:
http://www.extremetech.com/article/0,3396,s=101
And don't blame me if that site doesn't have persistent links.
Non Wintel Os's (Score:1)
In some of the earlier 'previews' there was talk of OpenGL 2.0, which I'm sure this card will theoretically be compliant with (once the ARB settles on the specs, of course). But what of support for Linux, BSD, OS X? Does the hardware support both big and little endian?
It's fair that Matrox are pushing the DirectX 8.1 (and 9 no doubt) and Windows thing now, but when will we hear about other possibilities?
Re:Non Wintel Os's (Score:2)
Tom's Hardware as well (Score:1)
www.tomshardware.com/graphic/02q2/020514/index.
Beware -- I was just trying to get to the 3rd page in the review -- it appears to be getting slower..... ?
Specs VS G450 (Score:3, Interesting)
I really like the prospect of having three monitors to alleviate the issue of having a giant gap between displays due to the thick border of any display. However...
This new card claims it only does 3840 x 1024 resolution across three displays. It still has the max color depth, but the resolution has to drop. By going to this big fancy new card I'd only gain about 100,000 pixels, which in reality is next to nothing.
Is it a driver limitation, or does it take more than a 512-bit, dual 400MHz, 256MB video card to push 4800 x 1200 for simple 2D functions?
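For anyone checking the numbers (this assumes the current setup is dual 1600 x 1200, which is what makes the 100,000-pixel figure roughly come out):

current  = 2 * 1600 * 1200   # 3,840,000 px today
parhelia = 3840 * 1024       # 3,932,160 px (the triple-head limit)
wanted   = 3 * 1600 * 1200   # 5,760,000 px (4800 x 1200)
print(f"gain over today:        {parhelia - current:,} px")   # ~92,000
print(f"shortfall vs 4800x1200: {wanted - parhelia:,} px")    # ~1.8 million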
~LoudMusic
The third DAC is slow (Score:2)
Which means, if you want to run all monitors at the same res (required for "Surround Gaming", really), you're limited to the resolution of the external DAC, which probably struggles to do 1280 x 1024.
It's nothing to do with the driver, and you can always add a second PCI gfx card for more monitors to get all the area you need. Try 5 x nVidia Quadro4 400NVS cards, each with 4 monitor outputs capable of 2048 x 1536, for a total of 61 million pixels - 16 times what you have now :-)
Re:The third DAC is slow (Score:2)
I could throw in a couple additional cards now and spread out the desktop - the reason I don't is because I like to have Windows think that it's all one display. It treats windows and the desktop differently, and the tasktray spans the entire desktop field. I've got a PCI GeForce2 MX that I could throw in and add another 2 monitors, but it can only handle 1280 x 1024
~LoudMusic
Re:The third DAC is slow (Score:2)
Not sure which GF2MX you have, but the ones I've seen certainly supported up to 2048x1536 on the primary monitor at least. They have a 350 MHz DAC, IIRC. And different resolution screens should be possible too, at least under XP/Me.
Re:The third DAC is slow (Score:2)
~LoudMusic
Re:foolish comments undeserving of a 2 (Score:2)
Because the gaps aren't right in the middle of your field of view. With three displays, default pop-up behavior (dialog boxes and such) would occur on a single monitor in the middle of your viewing window without being split in half (unless it's one of those silly dialogs that's shown as a percentage width of the desktop).
There are, of course, ways to modify that behavior (using Matrox's own tools if you have one of their cards), but it would be nice to look straight ahead and get an uninterrupted center desktop. As it is now, I look straight ahead and see 3 inches of border and gap between desktops.
I spend all day now looking either to the left or to the right. Looking to the middle would be a nice change.
And the quip about reading the article was unnecessary. First, he could have tried to read the article and found the site suddenly unresponsive (as did I). Second, he could have been reading it for technical information instead of marketing info. It happens.
Parhelia Shots : Come and get'em here.. (Score:3, Insightful)
Look at the massive heatsink on that baby... Ooooh mama...
Re:Parhelia Shots : Come and get'em here.. (Score:1)
hehehe. An idea forms. I could make a scoop in my case, much like the bonnet on cars with blowers. YAY
Re:Parhelia Shots : Come and get'em here.. (Score:1)
In fact, this is a pretty hot hardware card.
Re:Parhelia Shots : Come and get'em here.. (Score:2)
Re:Parhelia Shots : Come and get'em here.. (Score:1)
Then again, it's running at 52 deg Celsius right now.
How do they do triple-head? (Score:2)
The screen caps make me drool
Wonderful for people in (broadcast) TV (Score:4, Informative)
This means that any particular pixel may have up to 30 bits of color (even though the maximum difference between colors of pixels is less than that).
Obviously, this is not something that is easily accomplished with standard 24-bit/32-bit rendering. If you convert the SDI into something that can be represented in the frame buffer of the video card, then you've lost precision. This is unacceptable for broadcast! (And no, overlay isn't generally good enough, since you want to capture the pixels for output through SDI.)
Admittedly, this card isn't perfect- It would be nice to have 8 bits of destination alpha (for a key channel). 4 shades of keying just isn't enough...
In any case, having a card (finally!) support 10 bit rendering (especially the 10 bit rendering in openGL) in hardware will be wonderful!
Not intended for people in (broadcast) TV... (Score:4, Informative)
As someone else pointed out, 10 bits of RGB does not equate to 10 bits of YUV. The Parhelia will give great 10-bit RGB previews (completely independent of output quality), and will even output a 10-bit YUV video signal - but only via S-Video, where the two colour signals get encoded together anyway. You need 10-bit component output, or 10-bit SDI, neither of which can be done by the Parhelia. It's more aimed at the 10-bit DVD market than at a professional output solution.
The two-bit alpha limitation is largely irrelevant. For display on a monitor, RGB is all you need. Processing of deep-colour images should be done with at least 16 bits per component (including alpha) in memory for best results, then dithered down to 10 bit RGB for display. Key channel output requires a second video connector, so it won't do that at all.
Anyone notice? (Score:2)
Re:Wonderful for people in (broadcast) TV (Score:2, Interesting)
10 bit YCrCb IS different, and that was what I was stating!
With 10 bit per channel RGB, we can cheat- render RGB, convert it by rerendering it (still on the video card), and then blending it with the YUV video without losing any precision, all in the framebuffer of the graphics accelerator...
I.e., we don't want to use a keyer and output key-fill, because if we did that (our application being the yellow line for football, among others) we'd have to buy video delays in order to maintain field-accurate rendering...
We do understand the sampling scheme - it's just unfortunate.
If the industry went component instead of composite, or went for RGB over SDI instead of 601, (which unfortunately requires a "dual link"), it would make me a much happier person...
Article over at toms too.. (Score:1)
Always top notch from Matrox (Score:1)
But Boss, it has Glyph Antialiasing! (Score:2)
Just when I thought that my workplace would never spring for a card with these features, up popped Page 6 [hardwarezone.com] (just ignore all those pictures of people playing games with the card) with Glyph Antialiasing for "business appeal!" Three monitors, here I come.
But Boss, does it do hinting? (Score:2)
Their edge-AA functionality would lend itself well to font rendering. It's debatable whether it'll help the speed or even quality of current Windows font rendering, but so long as you're not forced to use it, it can't hurt. The hardware gamma correction is good, and it does "de-gamma" the background before blending in the text (which should be done with linear data).
My question is, does it correctly support hinting? It's not much use unless it does.
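For what it's worth, here's a minimal sketch of why that de-gamma step matters when blending antialiased glyph coverage (gamma 2.2 assumed; sRGB is close but not identical):

GAMMA = 2.2

def to_linear(c): return c ** GAMMA        # de-gamma
def to_gamma(c):  return c ** (1 / GAMMA)  # re-encode for display

def blend_naive(bg, fg, a):
    # blending gamma-encoded values directly
    return bg * (1 - a) + fg * a

def blend_linear(bg, fg, a):
    # de-gamma, blend in linear light, re-gamma
    return to_gamma(to_linear(bg) * (1 - a) + to_linear(fg) * a)

# A 50%-coverage edge pixel of a white glyph on a black background:
print(round(blend_naive(0.0, 1.0, 0.5), 3))   # 0.5   -- displays too dark
print(round(blend_linear(0.0, 1.0, 0.5), 3))  # ~0.73 -- matches the intended 50% light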
Matrox software sucks, tho (Score:1)
I think the problem is the graphics card biz is a low margin business, and the first thing they skimp on is software. *Sigh*
Also, they're so cheap their site is completely
Re:Matrox software sucks, tho (Score:1, Interesting)
Note that the capture cards weren't advertised as 2000 compatible, and the driver was sort of a 'best effort' by Matrox (an effort which failed). The cards worked fine in 98, which is what it said on the box.
And then CPUs/ATA disks got fast enough that you could do software capture with better results without futzing with some hacky low-end MJPEG board like the Rainbow Runner (which itself works OK in software mode).
The lesson I learned wasn't necessarily that Matrox SuX0rs D00d, but instead that (1) video capture on Windows sux0rs in general because of the lack of standardized APIs, and (2) if you are serious about vidcap, it's better to spec out a semi-pro system (like the Matrox RT card), get it working, and never upgrade your OS/hardware just because it would be cool to do so. If you aren't serious, get a cheeze software capture card (WinTV or ATI or something) and prepare for driver hell.
TripleHead QuadHead QuintHead ... (Score:2, Funny)
past driver experiences (Score:1)
What Parhelia means... (Score:5, Informative)
interesting stuff
The next Matrix? (Score:1)
Linux drivers (Score:2)
This card looks really sweet, and Linux could really use some competition to NVIDIA in the 3d card market, I hope the Linux drivers are up to par.
If they're binary only, I hope they put as much effort into them as NVIDIA does.
Beware Matrox Driver Support (Score:2)
Re:Beware Matrox Driver Support (Score:2)
Preview at Tech Report (Score:2)
Hmm. If history repeats itself no one will notice (Score:5, Interesting)
The next problem is that Matrox ruined their reputation in my eyes with the G200 by lying about OpenGL. Lying about how they were going to have it in November, then December, and so on... they kept this up until they announced the G400, and then suddenly the G200 was a no-go.
Ever since the G400 series it seems Matrox has been coming up with feature laden cards... trouble was no one asked for the features they chose to offer. Now they added even more features and a buttload of performance to boot. Yet as before, GF5 will be announced about the time this card is supposed to ship, and most likely be in stores at the same time.
When and how much? (Score:2)
I wasn't able to find any info from the hothardware or matrox sites. Any rumors as to when this is coming out, and how much it's going to cost?
I demand my Nerdity points! (Score:2)
And sending all of my friends half-hourly updates.
Unfortunately I did have to go to bed and was thus unable to be the first non-NDA'd person to read the previews.
Product name too hard to spell, help!
I will just keep on referring to it as the G1000, SOOO much easier, heh.
Newletter from Matrox (Score:2)
http://www.matrox.com/mga/start/newsletter/may_2002/parhelia512.cfm [matrox.com]
Re:Slashdot cache (Score:1)
Re:You won't be seeing 1 billion colours on Linux (Score:1)
Re:You won't be seeing 1 billion colours on Linux (Score:1)
Regardless, I would expect many visual apps on linux to use OpenGL and I know that you can use more than 8 bits per channel with OpenGL as I have done that on SGI's.
Re:You won't be seeing 1 billion colours on Linux (Score:3, Informative)
You're wrong.
'man X', under 'COLOR NAMES'
The syntax is an initial sharp sign character followed by a numeric specification, in one of the following formats:
#RGB (4 bits each)
#RRGGBB (8 bits each)
#RRRGGGBBB (12 bits each)
#RRRRGGGGBBBB (16 bits each)
Re:You won't be seeing 1 billion colours on Linux (Score:2)
Ooh, on second thought, even 24-bit color doesn't do well on the gradients in title bars and background images sometimes....
Re:You won't be seeing 1 billion colours on Linux (Score:1)
Re:You won't be seeing 1 billion colours on Linux (Score:2)
That cannot be correct. The XColor structure, which is used all over the Xlib API for communicating color values, is 16 bits per gun, 48 bits total.
Re:You won't be seeing 1 billion colours on Linux (Score:1)
Re:I've just realised it doesn't matter anyway (Score:2)
Then try this experiment. Open a paint program, select a nice medium, fully saturated blue and paint half the screen with it. Now edit that color and change one of the color components by a single value, i.e. from FF to FE or something. Then paint the other half of the screen.
MOST people can see a Mach band where the two colors that differ by only one value of one color component meet in the middle. (If you don't see it at first, try a slightly darker shade of blue.) That is why you need more than 256 values per component. Even when you are only showing two of the 2^24 colors on the screen, the fact that there are "only" 2^24 colors becomes a limiting factor.
10 bits (1024 values) is reaching the level of human perception for the most part. BUT that's still not good enough, because gamma correction in the hardware can reduce the actual color resolution back down to 8 bits pretty fast. Eventually we'll all be using 16 bits per component all the way through. (Well, for graphics work anyway.) That'll give user color matching and adjustment, and hardware color matching, enough "breathing room".
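A quick way to check the gamma point yourself (gamma 2.2 is just an assumed example value):

# Push all 256 possible 8-bit values through a gamma-2.2 lookup that
# also outputs 8 bits, and count how many distinct levels survive.
GAMMA = 2.2
survivors = {round(((v / 255) ** (1 / GAMMA)) * 255) for v in range(256)}
print(len(survivors), "of 256 levels remain distinct")  # noticeably fewer than 256
# Wider intermediates (10 or 16 bits per component) keep those collisions
# below what the monitor and the eye can resolve, which is the point.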