S3 Graphics Comes out of Hiding with Chrome20

Steve from Hexus writes "S3 Graphics, having been quiet for a while, has today announced a new graphics solution, Chrome20, with which they intend to take some market share away from ATI and Nvidia. From the article: 'We were offered a chance for some hands-on play with a mid-range Chrome20 series desktop board - the machine was loaded with over 40 top games. A quick run of Half-Life 2, Far Cry, Halo and a couple of other titles demonstrated that S3G's new 90nm mainstream card was working without any visual problems and with very playable frame rates.'"

  • Sweeet! (Score:4, Funny)

    by halcyon1234 ( 834388 ) <halcyon1234@hotmail.com> on Wednesday September 07, 2005 @04:12PM (#13503093) Journal
    Finally, an alternative to all that wonderful ATI stuff.

    {blink}

  • by mjrauhal ( 144713 ) on Wednesday September 07, 2005 @04:13PM (#13503101) Homepage
    So, how about Linux drivers? Free ones?
    • I just hope that if they're going to release Linux drivers, they make them less of a P.I.T.A. to install than the Nvidia / ATI ones.

      Maybe working more closely with the kernel developers, releasing the driver module as source code with the main kernel download, so it works out of the box.
      • Now, now. ATI is a royal pain, having failed on me under several distros and on several computers - but the nVidia drivers have always been fairly reliable. Finicky maybe, but I've always been able to get them to work.
      • Sorry, but it's highly unlikely. In the early days of Linux (early to mid 90s), S3 cards were one of just a few that we all avoided at all costs, because unlike most other cards which had open specs, S3 kept theirs a closely guarded secret. (S3 drivers did show up eventually, but by that time everyone else was using Tseng Labs ET4000 W32 which was open).
    • An important question on a "Linux site." I'll consider buying one of these cards, but only if good Linux drivers are available either freely or for a few bucks. Not $20, I mean perhaps $5. Charging so little might not offset the cost of work that went into the software, but the important thing is to build a customer base.

      Hear that, S3? I know you people read Slashdot.

    • by Anonymous Coward
      The new S3 cards cause me to ponder why SGI failed.

      Back in 1995, SGI should have dumped its proprietary hardware: specialized graphics chips and MIPS. SGI should have created the following dream box: Linux + ARM + commodity graphics chips from NVIDIA, S3, Chromatics, etc.

      The special sauce that greases every component is OpenGL. SGI should have leveraged its software technology and dominated the graphics market for decades to come.

      Yet, no one at SGI listened.

      The critics warned that x86-plus-commodi

    • Solid support is *much* more important to me than politics. I use Linux because it works for me and works well; same reason I use Nvidia cards under Linux.
    • Obligatory qualifier: "open source Linux drivers with, as a minimum, feature and speed parity with the Windows drivers."
    • I'd much rather they simply released complete documentation so that people who know what they are doing can write proper drivers.
    • by Grendel Drago ( 41496 ) on Wednesday September 07, 2005 @04:55PM (#13503505) Homepage
      It's a small market, true, but what exactly would S3 lose by opening up its drivers? They'd instantly become the graphics card for anyone running Linux. It's a small but real benefit---and what, then, would be the cost to them?

      Apple users are a small market, but they're incredibly loyal. Why wouldn't S3 get in on that action?

      --grendel drago
      • by myslashdotusername ( 903486 ) on Wednesday September 07, 2005 @05:05PM (#13503590) Homepage Journal
        what exactly would S3 lose by opening up its drivers?

        Several lawsuits, as technology used in writing those drivers is patented, and they've likely cross-licensed the patents to even be able to write a modern 3-d driver.

        now you could strip all the patented code, and fix it into a working driver, and provide source for it, but ATI already has been doing that for years, yet all I see from the /. community is a bunch of Nvidia fanboy ravings of how good the closed source Nvidia drivers are.

        So I hope this answers your question as to why they cannot do what you seem to think would be so easy. And hey, even if patents were a non-issue, the drivers would still be a 'trade' secret; giving that away to your competitors for free means that they will always know how to make their product perform better than yours.
        • by fossa ( 212602 ) <pat7.gmx@net> on Wednesday September 07, 2005 @05:35PM (#13503844) Journal

          This is S3. I thought competitors *already* know how to make products better than theirs?


        • giving that away to your competitors for free means that they will always know how to make their product perform better than yours.

          Not necessarily. In reality, I think the ramifications of opening a graphics driver are analogous to a VCR company telling people where the buttons for play, fast forward, fast backward, etc... are; i.e., no useful competitive info at all.
          • You are wrong. Drivers have a dramatic influence on the performance of video cards. This is why they are constantly updated, why people who really care about 3D performance keep up with the latest drivers, and why, at least the last time I was a PC gamer, getting pre-release drivers was such a big deal. A LARGE portion of the performance for any given card lies in the drivers.


        • I frequently see posts like the Grandparent asking why hardware vendors don't open up their video card drivers. The reality is these are HARDWARE vendors. They have outsourced much of the SOFTWARE development of drivers to third-party companies that have strict licensing requirements about how their code is going to be used. It isn't even so much about "know how to make their product perform better than yours" as it is keeping their lines of code in-house and private so they can get a contract to do anothe
        • by borg ( 95568 ) on Wednesday September 07, 2005 @07:43PM (#13504749)
          Quoth the poster:

          "now you could strip all the patented code, and fix it into a working driver, and provide source for it, but ATI already has been doing that for years, yet all I see from the /. community is a bunch of Nvidia fanboy ravings of how good the closed source Nvidia drivers are."

          Correction: there is an open source radeon driver that only supports 3D acceleration for cards up to and including the 9200 models. Newer models only have 3D acceleration with the closed source 3D driver.

          Up until ATI stopped releasing 3D programming information to the community, ATI-based cards were all I bought and recommended. The reason is pragmatic: I didn't have to worry about the card working with a new kernel version or the latest -mm patchset. This was my choice, in _spite_ of occasionally incomplete GL implementations (I seem to remember problems with Scorched3D on my radeon).

          The last ATI card I bought was a 9200. Now, I buy nvidia. I may be stuck with a closed source driver, but at least it is a _good_ closed source driver. The latest version can do 3D acceleration over multiple cards (xinerama) if all GPUs are similar, which makes for a stunning game of quake on my triple-head system.

          If S3 came up with an open source driver that was included in the kernel sources and a marginally competent 3D implementation, I would use them for future purchases in a heartbeat.
        • by Grishnakh ( 216268 ) on Thursday September 08, 2005 @01:53AM (#13506818)
          Why is this modded insightful? It's outright wrong.

          Ok, so let's assume you're right and the technology is patented. So what? This means that there are NO secrets allowed by the government in this product. The whole point of getting a patent is that you have to disclose your invention fully in order to obtain legal protection for it. If I want to see this patented technology, I can just look it up at www.uspto.gov. So this cross-licensed patents argument is a pile of BS.

          Strip the patented code... why? Again, if it's patented, there are no secrets. Now maybe the companies holding the patents won't license them in such a way as to allow open-sourced drivers, but this is a licensing issue, not a patent one.

          Trade secret: well, are they patented or aren't they? You can't have a trade secret on something that's patented. The two are mutually exclusive.

          You might want to learn about the various IP protections and how they differ before running your mouth.
      • Because the second they had to close their driver back up (for some legal reason or something), they would instantly lose their "incredibly loyal" linux following, whereas apple could create a policy where if you buy a mac, you have to be raped by Steve Jobs and they would be as popular as ever. Go ahead, mod me -1, I dare you.
    • " So, how about Linux drivers? Free ones?"

      Well after visiting their web site [s3graphics.com] and not finding any Linux drivers for their existing cards, and not even any mention of Linux anywhere on their site, I wouldn't hold my breath.

  • by It doesn't come easy ( 695416 ) * on Wednesday September 07, 2005 @04:13PM (#13503103) Journal
    The picture of the fan sink was the best part.
  • by Stonent1 ( 594886 )
    Will it run all my "S3D" games that came with my 4MB Virge card 10 years ago?
    • You seriously had a 4MB video card 10 years ago? ~1995. My computer in '96 came with a 2MB STB Lightspeed 128, which was a lot of memory for a video card at the time. When the 3dfx cards came out they were "loaded" with 4MB, but that was in '96. You rich bastard!
      • Re:Yeah but.... (Score:3, Informative)

        by rAiNsT0rm ( 877553 )
        I actually was quite poor and had an S3 4MB VRAM ViRGE back in '95 or so. It was actually pretty easy: you could buy the 2MB version and it had two more VRAM spots that you just popped 2 more MB into. I then bought a junk S3 card with 2MB and put its chips in, voilà! Instant 4MB S3 ViRGE of pure pwnage.
        • Re:Yeah but.... (Score:3, Interesting)

          by afidel ( 530433 )
          Which led to an interesting diagnosis of a floormate's computer my freshman year. Guy had an S3 ViRGE card with 4MB of RAM. Under Windows the card ran fine, but if he loaded certain games he would get weird graphic artifacts on the bottom half of the screen. I figured out that it was a texture memory problem. His wholesaler had sent him a 2MB card with the additional 2MB modules installed; problem was that the memory they used was 10ns slower than the card needed (60 vs 50ns) and so textures above the 2MB bar
      • I had a 2MB S3 ViRGE in 1996, which could be upgraded to 4MB by dropping a few more RAM chips into slots. The card was by no means top-of-the-line then. My monitor at the time had a maximum resolution of 1600x1200 in some crazy 43Hz interlaced mode, and 1024x768 at a decent refresh rate (85Hz). The 2MB ViRGE could only run it in 16-bit colour at this resolution - a 24-bit frame buffer would have required 2.25MB of RAM, and a 32-bit frame buffer would have needed 3MB. As it was, I was using 1.5MB just as
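
        A minimal sketch of the frame-buffer arithmetic above (illustrative Python; only the 1024x768 mode from the post is used, and the post's "MB" is read as MiB = 2^20 bytes):

          # Frame buffer size = width * height * bytes per pixel.
          def framebuffer_mib(width, height, bits_per_pixel):
              return width * height * (bits_per_pixel // 8) / 2**20

          for bpp in (16, 24, 32):
              print(f"1024x768 @ {bpp}-bit: {framebuffer_mib(1024, 768, bpp):.2f} MiB")
          # 1024x768 @ 16-bit: 1.50 MiB   (the 1.5MB actually in use)
          # 1024x768 @ 24-bit: 2.25 MiB   (why 24-bit colour would not fit in 2MB)
          # 1024x768 @ 32-bit: 3.00 MiB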
  • by DavidNWelton ( 142216 ) on Wednesday September 07, 2005 @04:16PM (#13503127) Homepage
    Is it a "graphics solution" or a PCI card? Sheez.
  • by VincenzoRomano ( 881055 ) on Wednesday September 07, 2005 @04:17PM (#13503149) Homepage Journal
    ... when S3 will adopt the Quantum-Optical technology [slashdot.org]!
  • by i41Overlord ( 829913 ) on Wednesday September 07, 2005 @04:17PM (#13503150)
    Because you'll need that to view the slideshow that S3 cards produce in 3d games.
    • Isn't that the truth. S3 was the laughing stock of graphic chip developers back in the 90's, and will continue to be so if this new solution of theirs performs like their Virge chips did. Unfortunately I made the grave mistake of purchasing a Virge card years ago due to their price. As the old saying goes, "you get what you pay for", and what I got was complete trash. I could not play Quake at acceptable levels with that crappy card. Boy was I so pleased to put down the cash for the Pure3D 3dfx card.
      • by Silverlancer ( 786390 ) on Wednesday September 07, 2005 @05:12PM (#13503661)
        Recently I scavenged my old computers to find a PCI card to use for my second monitor (my ATI 9700 Pro could only hardware accelerate one output at a time, leading to slow graphics, even on 2D applications like Firefox, on the second monitor). But, all my newer cards were AGP, even the one in my 266MHz Pentium II computer. So I went even farther back, to my Pentium 166MHz non-MMX. This was mistake #1.

        The card in the machine was a 2MB Virge. Things I found out about the card over the next few minutes included:

        1) It supported no resolution higher than 1024x768 60Hz 16-bit color.
        2) The output looked so bad even on 2D that looking at the monitor hurt my eyes.
        3) The instant I dragged any 3D game window, even older ones, to the monitor with the Virge card, they started going at about 10 frames... per minute.

        The Virge was the worst graphics card I have ever used. A while back I even tried to run Homeworld on it (as a primary card). Lowest detail levels--check. Lowest resolution--check. Lowest memory allocation--check. End result: D3D hardware acceleration mode goes slower than software mode, at about 2 frames per minute.
        • I think you're giving the ViRGE a harder time than it deserved. For one thing it sounds like the RAMDAC on your card was rubbish, which has nothing to do with the ViRGE - I used my 2MB ViRGE to drive a monitor at 1024x768x16@85 for years. The 2D quality was acceptable for the time, although the monitor isn't superb by today's standards.

          The ViRGE was never marketed as a card for running OpenGL and Direct3D games. The OpenGL implementation was faster than software OpenGL on contemporary hardware - my Vi

          • I used to have a version of Terminal Velocity with a ViRGE rendering path, and that looked much nicer than the software version.

            Yep, that version of Terminal Velocity actually came bundled with the card.. along with optimized versions of Descent and Tomb Raider, IIRC. I remember being in awe of how nice and smooth they looked (compared to software rendering).

            It was definitely not an OpenGL card; it required the (DOS) games to be specifically tailored to use it.
        • The fact that you plucked an old 2MB Virge out of a P166, stuck it into a newer machine as a *secondary* card, and you were able to drag a modern 3D game into its monitor and it actually ran at all is nothing short of a miracle...

          The Virge was definitely a dog back in its day, probably even worse than an ATI Rage II, but I would be impressed if any of its better-performing contemporaries (e.g., Rendition or Mystique) would be capable of that feat... I just did a search, and couldn't even find any evidence

  • by MxTxL ( 307166 ) on Wednesday September 07, 2005 @04:18PM (#13503155)
    I've heard these will be bundled with a 6.8GHz 1TB RAM and 2TB HDD Laptop.
  • by vasqzr ( 619165 ) <`vasqzr' `at' `netscape.net'> on Wednesday September 07, 2005 @04:19PM (#13503166)

    Read: Nowhere near the performance of ATI/NVIDIA.

    Unless they plan on taking over the integrated graphics, $300 PC market, why bother?
    • Because it's a start. ATi and nVidia didn't just come out of nowhere with a card-to-rule-all-cards. It took them time, and, I imagine, it will take S3 some time too.
      The point is competition. Far too long have we been stuck in a dichotomy of two-superpowers.
      But this isn't their first try, either. The S3 DeltaChrome was just average at release, and even ended up as integrated graphics in a few VIA chipsets.

      Trident tried to dive back into the graphics realm. Their card didn't live up to the hype (m

    • Read: Nowhere near the performance of ATI/NVIDIA.

      Correction: Read "nowhere near the performance of ATI/NVidia's top-end models".

      Why do NVidia bother selling the GeForce FX 5200 any more? It's crap compared to a 7800 GTX!

      Oh, wait, it's because they can make a lot of money by capturing the low end of the market as well as the handful of geeks who are anal enough about frame rates to spend more on a single graphics card than the average person spends on a complete computer. Hey, you reckon S3 might just be p
  • S3 video (Score:2, Insightful)

    by highmaster ( 842311 )
    Isn't waiting for a high performance video solution a lot like waiting for a flawless shuttle launch? It has been a long time, a VERY long time since S3 could compete with any of the other major players in performance. They have always been the cheap integrated solution, or the cheapo get by with the bare minimum expansion card type of product. Not gonna hold my breath waiting for S3 to run the next generation video games, let alone current ones.
  • you don't just wake up one christmas morning and have a card that can compete with the big boys
  • by lateralus_1024 ( 583730 ) <mattbaha@gmailLISP.com minus language> on Wednesday September 07, 2005 @04:30PM (#13503277)
    *in head-to-head comparisons against high end ATI / NVidia cards in Windows Safe Mode.
  • by L0neW0lf ( 594121 ) on Wednesday September 07, 2005 @04:40PM (#13503370)
    Isn't this the way S3 does it every time? Let's see:

    Step 1: S3 introduces a new graphics card. The name is similar to one they've previously made, but you've never seen that card before because no-one wants to produce and sell one. Specs seem similar too. As usual, it's supposed to be a mid-level card that won't "take on the big boys" but is supposed to have mainstream performance.
    Step 2: Hardware review sites get a prototype board. They either experience a number of driver glitches, or performance that is vanilla enough that no-one is all that excited.
    Step 4: Joe Gamer reads the review, and buys a tried-and-true midrange solution from ATI or nVidia that doesn't have the driver issues S3 was famous for in cards that actually made it out the door.
    Step 5: S3 has teething troubles with the GPU, or the drivers, or production, delaying the chip's release until its performance is at the low-end, yet priced $20-40 above others' low-end cards.
    Step 6: The lackluster performance of the GPU relegates it to boards made by one dinky little vendor nobody has heard of and doesn't trust, with nonexistent support. S3 has to lower their prices on the GPU to get any sales at all.
    Step 7: S3 doesn't profit.

    I'm just curious...how does S3 manage to keep their graphics card business afloat? Aside from a few integrated solutions on VIA chipset mainboards, I can't see any products they manage to make money on.
    • Yeah, I suspect my experience of anything UniChrome is pretty similar to everyone else's ...

      1. Pick two machines, one Unichrome, one Intel Extreme
      2. Boot up Linux distribution
      3. Play with OpenGL screen saver
      4. ?!?!? wtf
      5. Realize that Intel 'Extreme' kicks the ass of the VIA chip, seven ways til Thursday.
      6. Buy a cheapo FX5200 card and enjoy the 3D goodness.

  • HDMI? (Score:4, Interesting)

    by fallen1 ( 230220 ) on Wednesday September 07, 2005 @04:55PM (#13503506) Homepage
    After reading the article and seeing that S3G has stated "No comment" after being asked about including HDMI on their cards, perhaps they want to shoot for the, ummm, grey market where people who DO NOT want their computers controlled by outside forces buy their equipment? Maybe even supply areas of the world that want HDMI but without the annoying HDCP that goes along with it so they can still use older monitors/TVs _AND_ still get high definition video - not "oh, that's not a registered device with Central Command Authority! Thou shalt have only 480i. No HD for you!!"

    Personally, I'm getting beyond tired of technology companies who, some singularly and definitely collectively, make more money than Holly-hood, err, Hollywood, yet bend over backward to placate them. Yes, I know that the studios/**AA control the media/content for the most part, but if the _major_ technology players stand up and say "Well, we control the technology everyone uses to get at your content, and there is no other tech company(ies) large enough to challenge all of us, so THIS is how we're going to play ball." then WTF would Hollywood do except try to get more laws passed? Then all the technology companies that opposed Hollywood could band together to fight that off as well - dollar for dollar and then some. And what would happen to the products of those companies that stood up to Hollywood - especially when the tech-oriented crowd started praising them to friends/family/etc? Sell multiples upon multiples of items that are free of DRM and friendly to the CONSUMER? Wow, what a frigging concept! Make products friendly towards the consumer, don't treat them like a dollar with a body attached, treat fair use rights as they should be treated, don't treat the customer like a criminal from the get-go, tell the **AAs to fuck off and fight piracy where it counts (you know, those media distributors in Hong Kong, Singapore, China, Russia, etc), and make millions upon millions of dollars.

    Whew, I've had a very long day.. I think I need lots of sleep now. Sorry for the rant.
  • S3 Graphics, having been quiet for a while,
    Well, when a company produces products that earn the nickname "graphics decelerators", being quiet for a while would probably be the best solution; that and going back and improving their solutions doesn't hurt either. ;)
  • GP2 (Score:5, Interesting)

    by Doc Ruby ( 173196 ) on Wednesday September 07, 2005 @04:57PM (#13503530) Homepage Journal
    I'd like to see S3 expand into the general-purpose processing market. If their new GPUs were supported as GPGPUs, they might get people to buy their cards to boost overall performance, without relying only on Intel and AMD to push CPU performance.

    I've been waiting to see "coprocessor" PCI cards become popular, especially among gamers. I remember when we could buy "math coprocessors" to augment relatively slow/cheap math onboard the x86. That was before CPU manufacturing/marketing economics selected for all CPUs to have fast math sections, but with cheaper ones leaving the circuit lines "cut" to the fast part. Maybe that marketing hustle has inhibited the addition of "redundant" coprocessor chips.

    GPUs are really just fast math coprocessors, optimized for graphics math and fitted with video coder chips. Gamers are the primary performancemongers and live at the bleeding edge of cranking performance. So they're the natural demanding market for pulling GPGPU products across the bleeding edge into mainstream architectures. Especially since GPGPUs aren't "Central", they're more likely to be "stackable", scalable processing units dynamically allocable for whatever's found at boot.

    What we really need are GPUs that have "public" interfaces, either HW or SW (open drivers) that others can harness for GPGPU. Let's see if that kind of competition expands the market for these GPUs, instead of just fighting ATI and nVidia for the current market.
    • Re:GP2 (Score:4, Interesting)

      by dreamchaser ( 49529 ) on Wednesday September 07, 2005 @05:34PM (#13503836) Homepage Journal
      What we really need are GPUs that have "public" interfaces, either HW or SW (open drivers) that others can harness for GPGPU. Let's see if that kind of competition expands the market for these GPUs, instead of just fighting ATI and nVidia for the current market.


      OpenGL is a 'public' interface that effectively hides the hardware with a standard API while also offering low-level programmability via its shader language. We already have what you're asking for.

      Check out the GPGPU [gpgpu.org] project. It sounds like it might interest you.
      • OpenGL doesn't offer an interface to general purpose computing functions on the chip. It's a public interface to graphics functions. Shaders can be hacked for non-graphics math functions, but they're limited.

        The GPGPU project is exactly what I'm talking about. I'd really like S3 to check it out more, and prioritize its projects for support. Then they might expand their graphics market into that served by GPGPU functionality.
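
        As a rough illustration of the "shaders hacked for non-graphics math" idea above: a GPGPU pass treats a texture as an array and a fragment shader as a pure per-element function, ping-ponging between two render targets. The sketch below only emulates that shape of computation on the CPU with NumPy (illustrative; no real GPU API is involved):

          import numpy as np

          # One GPGPU "pass": a shader is a pure function applied to every texel
          # of an input texture, with the result written to an output texture.
          def run_pass(kernel, texture):
              return kernel(texture)

          # Example "shader": one Jacobi-style smoothing step (average of the
          # four neighbours), the kind of per-element math coaxed out of shaders.
          def jacobi_step(tex):
              out = tex.copy()
              out[1:-1, 1:-1] = 0.25 * (tex[:-2, 1:-1] + tex[2:, 1:-1] +
                                        tex[1:-1, :-2] + tex[1:-1, 2:])
              return out

          tex = np.random.rand(512, 512).astype(np.float32)  # data "uploaded" as a texture
          for _ in range(8):                                  # ping-pong between two targets
              tex = run_pass(jacobi_step, tex)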
        • Ok, then we agree. My fault if I misunderstood what you were saying. I was just using OpenGL as an example, but you're obviously well versed on the matter.

          I've been checking out the Sh [libsh.org] project and it's a lot of fun. I agree; I'd like to see other vendors embrace the GPU as an 'it isn't just for graphics anymore' processor.
          • It's always unclear where people are coming from when we chime in on an obscure tech. Your comments were clear and helpful, even if I already had them in my experience bag :). Some other reader might find it a useful starting point that we've already passed.

            I'm intrigued by the Sony/IBM Cell architecture. Am I correct in believing they're using the Cell as both CPU and GPU, perhaps dynamically allocated per-process, or even per-boot? They might have bridged the gap between GPU and (2D/3D) DSP, for a real GP
            • I haven't seen either achieved. I'm not sure the current crop of GPU's is up to running an X Server but the thought is intriguing. However, I don't see why one couldn't code an MP3 encoder today using something like Sh.
              • The GPGPU site has some compression demos. But not the MP3 compression. I don't know the algorithm, or whether perhaps the Fraunhofer patent or something inhibits that development. But I'd love to get $1000 worth of videocards to do the work of $25,000 worth of clustered Xeons, in a single encoder host.
                • Imagine a beowulf cluster of those!

                  No, seriously...I've been looking into building a poor man's cluster to play with and distributing Sh code to the various nodes.
                  • In 1990, I worked for a company programming the brand-new tech, the AT&T DSP32c, a 12.5MFLOPS "supercomputer" on an (ISA!) board, for "digital prepress" (desktop publishing for real printing). We made our own boards, with 3-5 DSPs joined to an FPGA. And put multiple FPGA/DSP boards in a (25MHz? could it be?) 386. The CPU and bus bandwidth was so slow, compared to maybe 40DSPs+8FPGAs, that I wrote a "client/server" system. A loop on the DOS 386 polled the keyboard for commands, dumping them into a buffer
    • I've been waiting to see "coprocessor" PCI cards become popular, especially among gamers.

      I doubt the PCI bus has the required bandwidth to contribute anything worthwhile to the GFX card. SLI has taken the niche of producing twice the computing power. Not to mention, have you looked at the 7800 GTX SLI benchmarks? At 1600x1200 they're already overkill; they easily drive a 2048x1536 monitor too, provided you can find one. The only monitor that could possibly strain them is Apple's biggest monitor. Gamers don
      • GPUs running at many times the PCI bus speed can perform more complex computation on the same symbols, producing higher-quality signals or analysis. It's not a graphics "polygons per second" or FPS opportunity, but rather adding more filters to the pipeline on the same incoming data and outgoing results. That's the whole point of GPGPU. Unless you think computers are "fast enough". For example, voice recognition requires lots of processing for realtime, but not because the voice data (or ASCII results) are high band
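
        A back-of-envelope sketch of the point above, with assumed, purely illustrative numbers (classic 32-bit/33MHz PCI, rough mid-2000s shader throughput): a slow bus only hurts when the work done per transferred byte is small.

          pci_bandwidth = 133e6   # bytes/s, classic 32-bit / 33MHz PCI (assumed)
          gpu_flops     = 20e9    # FLOP/s, rough GPU shader throughput of the era (assumed)

          budget = gpu_flops / pci_bandwidth
          print(f"~{budget:.0f} floating-point ops available per byte streamed over PCI")
          # ~150 ops/byte: plenty of room to stack extra "filters" onto the same data
          # before the bus, rather than the GPU, becomes the bottleneck.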
  • Um... (Score:3, Interesting)

    by sootman ( 158191 ) on Wednesday September 07, 2005 @04:58PM (#13503542) Homepage Journal
    "Chrome20 is by no means going to take on the high-end cards, instead looking to provide good performance for your more average user."

    Average users don't tend to replace their cards very often. If they do, they'll go with a 6-month-old card from a major player, not a formerly-OK company that basically seems to be saying "Look at us! We're as good as anything else! w00t!"* And until computers run on $3/gallon gasoline, I don't think "lower power consumption" is going to move a lot of cards.

    As for "better performance" when it comes to HDTV... huh? Lots of rigs today can play HD video just fine, and unlike games, video does not benefit much from an ability to show more FPS--once you get past 30, you're pretty much done. Besides, video playback--a series of raster images--has not been much of a problem for years now. It's rendering polygons that's hard.

    Sorry, S3, but I don't think this will do much for you.

    * except for the fact that it's not actually shipping yet, and those other cards have had drivers out for years, and games are already optimized for them, and...
    • "I don't think "lower power consumption" is going to move a lot of cards."

      I've got a Shuttle XPC sitting next to my monitor with a GF6600GT sounding like a vacuum cleaner. I'd buy anything with comparable performance for $200 if it didn't have a fan or any funny "2-slot heatpipe to the back, blocking the PCI slot" cooler. That said, I don't think such a card is available yet. Nor am I a large market.

      The big guys have given up on fanless cards. If S3 says they're low power, I hope they don't need one. Fanless actually i

  • by OzPhIsH ( 560038 ) on Wednesday September 07, 2005 @05:21PM (#13503732) Journal
    I for one hope that S3 is successful in their attempt to get back into the market. More competition is a good thing. While I don't see them necessarily competing with nvidia or ATI at this point, one can only hope that they use this as a foothold to break back into the higher end markets in a few years. It can only mean faster and cheaper videocards for everyone. I understand that the cynics have a bit of history on their side when making fun of S3, but it ticks me off a little when I see people practically rooting for them to fail.
  • My Take (Score:4, Interesting)

    by ribblem ( 886342 ) on Wednesday September 07, 2005 @05:23PM (#13503753) Homepage
    I work for one of the two major players in this market, so I am probably a little biased.

    The way I read this is yet another small player wants to run with the big boys. What makes this one different? Well they admit up front that they can't compete in the high end so they will target the low end. Is this going to make a difference? I highly doubt it. I predict a flop.

    I'm not trying to be too harsh. I'm just stating it like I see it. Personally I'd like to see another player in this market, but I doubt it will ever happen unless someone like Intel decides to make high end graphics cards. Both ATI and NVIDIA spend hundreds of millions of dollars a year on R&D to make their high end cards and all that R&D is applicable to the lower end discrete cards. The lower end cards nowadays use most of the great ideas we've come up with for the high end cards, but we just do fewer pixels in parallel, thus using fewer transistors. Our lower end cards are also fairly power efficient even though this article didn't mention it (almost like they want people to assume our low end cards use 100W just like our high end cards do). Unless another company spends that kind of money I doubt they'll compete. I'm not saying it's impossible, just unlikely.

    I think the graphics industry is becoming less and less likely to have a major revolution (i.e., to something other than triangle-based rendering), which would make it much easier for a new player to get into the market. Graphics for the PC, with all its legacy software, is becoming more like the irreplaceable x86 platform every day. If we do change to something completely different it will probably come to a console first, but the longer we go on optimizing algorithms and hardware for these triangle-based systems, the less likely such a revolution becomes.

    Most people who understand CPU architecture will tell you x86 is an old inefficient design, but Intel and AMD have spent so much time/money optimizing it that nobody can seem to come up with a new general purpose CPU that is better. I think the same thing is happening with graphics. The weird coincidence is that both of these fields have 2 major players...
    • I assume you're talking about nvidia and ATI being the two major players? While it's true that they are the ones really pushing the technology at the high end, you're forgetting that Intel still has a greater market share than either ATI or nvidia when it comes to actual volume of graphics processors sold (usually integrated solutions). I imagine that this is more what S3 will be competing against. Not that competing against Intel is an easy thing to do either....
    • Re:My Take (Score:5, Informative)

      by adisakp ( 705706 ) on Wednesday September 07, 2005 @05:57PM (#13503996) Journal
      Most people who understand CPU architecture will tell you x86 is an old inefficient design

      Actually, x86 is a very inefficient instruction set. However, the inefficiency of the instruction set has been sidestepped mostly by on-the-fly hardware translation to a more efficient instruction set, large virtual register sets, out-of-order execution, and speculative execution. Neither AMD nor Intel CPUs operate on the x86 instruction set internally. Both of them translate x86 instructions into micro-ops internally and execute those instead -- believe it or not, they're doing in hardware much of what Transmeta was doing in software. The Pentium 4 doesn't even have a true L1 cache for instructions but rather uses an "execution trace cache" which holds pre-translated micro-ops.

      Furthermore, it's a chicken-and-egg problem when it comes to CPUs. A lot of optimization for X86 occurs because of the vast amount of software (Windows, etc) that runs only on X86. This software is often less than efficient and the manufacturers (Intel and AMD) optimize for the software inefficiencies with things like branch prediction, dynamic fetching, out-of-order execution, etc. Unfortunately, the optimization units to deal with x86 inefficiencies end up costing nearly as many transistors as the units that actually do the work. Other architectures that are more efficient or ship less volume will get less optimization simply because there isn't a reason to throw more $$$ at these optimization units if the core architecture and Instruction Set (IS) are already efficient.

      Video cards are not bound to a particular architecture. You can have a radically different video card programmed with a similar API (DirectX or OpenGL). Perhaps this can be considered similar to the CPU markets where AMD and Intel have different internal micro-architectures that interpret and execute the same API (of x86 instructions). However, if one architecture is much less efficient than another, it's easier to switch to the more efficient architecture with a well-designed software abstraction layer in between (DirectX/OpenGL) than to do the hardware-level translation (x86 procs). Video cards don't have to worry about software compatibility as long as they can support a minimum number of DirectX/OpenGL features. And it seems like add-on (PCIx/AGP/etc) video cards *ALWAYS* have to worry about performance and price more than CPUs. There's a market for slower, cheaper CPUs like the Sempron and Celeron, but the only market for cheap video cards is in the MB/integrated category. People aren't going to get excited about an add-on video card that's slow.
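
      A toy sketch of the "translate x86 into micro-ops" step described above (illustrative Python; real decoders are vastly more involved): an instruction with a memory operand gets cracked into simpler RISC-like load/compute/store micro-ops.

        # Illustrative only: crack an x86-style instruction with memory operands
        # into micro-ops, as the parent post describes hardware decoders doing.
        def crack(op, dst, src):
            uops = []
            if src.startswith("["):      # memory source -> load into a temp first
                uops.append(("load", "tmp_s", src))
                src = "tmp_s"
            if dst.startswith("["):      # memory destination -> load, operate, store back
                uops += [("load", "tmp_d", dst), (op, "tmp_d", src), ("store", dst, "tmp_d")]
            else:                        # plain register destination
                uops.append((op, dst, src))
            return uops

        print(crack("add", "[mem]", "eax"))
        # [('load', 'tmp_d', '[mem]'), ('add', 'tmp_d', 'eax'), ('store', '[mem]', 'tmp_d')]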
      • Just a note on video boards, and other expansion cards. You do have to produce hardware specific for your targeted platform. You can't just take a PC-targeted card and use it on a Mac or UltraSPARC, despite all being PCI. You have to bootstrap the cards differently to get them up and running on each. Things that aren't x86 don't have a BIOS, or ACPI, for example. That means you have to interface to the system differently on boot. You do different things to handshake on the bus, to allocate resources,
  • by tji ( 74570 ) on Wednesday September 07, 2005 @05:54PM (#13503963)

    Even on powerful systems, decoding and displaying HDTV content can be tough. The current S3 "Unichrome" integrated video processors include MPEG decoding capabilities. This goes well beyond MPEG acceleration in XvMC / DxVA.. It does most of the MPEG processing in hardware, rather than only the iDCT/MC (the difference is sketched below, after the list).

    Hopefully these new cards will continue to support MPEG decoding.. If so, I'll buy one & ditch my Nvidia with their closed source binary drivers.

    But, I would need to understand a few issues before taking the plunge:

    - Are the specs & source code for the card fully open? (VIA / S3 have had some issues on this front in the past).
    - Are these cards available for purchase? The S3 DeltaChrome & GammaChrome cards were not available as far as I could tell. Only the unichrome was available, as an integrated video option on VIA motherboards.
    - Does it have full MPEG2 decoding support?
    - Does it have MPEG4 accel support? How about MPEG4.10 / AVC accel (or full decoding)?
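
    Here is the sketch referred to above: a rough map of the MPEG-2 decode pipeline and which stages each acceleration level offloads. The stage list is standard; the "full HW decode" row reflects the Unichrome claim in the parent post, and the labels are purely illustrative Python, not any real API.

      PIPELINE = ["bitstream parsing", "VLD (Huffman)", "inverse quantization",
                  "iDCT", "motion compensation"]
      OFFLOAD = {
          "plain Xv (no accel)": [],                               # CPU does all decode work
          "XvMC / DxVA":         ["iDCT", "motion compensation"],  # the 'only iDCT/MC' case
          "full HW decode":      PIPELINE,                         # what the Unichrome reportedly does
      }
      for level, gpu_stages in OFFLOAD.items():
          cpu_stages = [s for s in PIPELINE if s not in gpu_stages]
          print(f"{level:20} hardware: {gpu_stages or '-'}  CPU: {cpu_stages or '-'}")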
  • by Animats ( 122034 ) on Wednesday September 07, 2005 @08:49PM (#13505126) Homepage
    What S3 really does is design the graphics controllers that go into Via chipsets. There are huge numbers of those controllers out there. They're pretty good graphics controllers, considering that they come almost for free as part of the motherboard chipset.

    That's probably the future. The plug-in graphics card is rapidly headed for the same fate as the plug-in math coprocessor chip, the plug-in MMU chip, the plug-in DMA controller chip, the plug-in serial port board, the plug-in network adapter, and the plug-in disk controller.
