
S3 Jumps On GPGPU Bandwagon

arcticstoat writes "It's sometimes easy to forget that the PC graphics market isn't owned by ATI and Nvidia, and the company that first gave us 3D acceleration, S3, is very much still around. So much so, in fact, that it's now even developed its own GPGPU technology. S3 claims that its Chrome 400 chips can accelerate gaming physics, HD video transcoding/encoding and photo enhancement. To demonstrate the latter, S3 has released a free download called S3FotoPro, which enhances color clarity and reduces haze in photos. However, the company hasn't yet revealed whether it plans to support OpenCL in the future." The Tech Report also points out that this could allow S3's parent company, VIA, to compete with Intel and AMD in graphics processing technology.
  • by squisher ( 212661 ) on Saturday October 18, 2008 @12:39PM (#25424993)

    This is definitely not the first time in recent years that we've heard S3 can compete with ATI and Nvidia again. As much as I'd like to see that, I certainly won't believe it until I see some decent independent benchmarks.

    • Re: (Score:3, Funny)

      by rhyder128k ( 1051042 )

      Hey, give them a chance. If their excellent 3D graphics chipsets are anything to go by, this could give you the power of a 386 processor ON YOUR DESKTOP! Imagine it: DooM running in practically real-time. This baby could render the teapot POV example in 3 or 4 minutes rather than the hours it would take on an older XT-class machine.

    • Seriously, I've never heard of S3. Are they not sold at normal stores like Best Buy?
      • Typically not. Standalone S3-based cards do exist, but they survive primarily in the form of embedded video on VIA boards.
      • Re: (Score:2, Funny)

        by Anonymous Coward

        normal stores like Best Buy?

        America really has gone downhill.

        • I was thinking the same thing.

          If it's not a super-dooper-insta-matic-graphics-o-tron-buzzword-overclocker-massive-profit-maker-by-stupid-customers cheapo graphics card, Best Buy doesn't want anything to do with it.

      • by OrangeTide ( 124937 ) on Saturday October 18, 2008 @01:59PM (#25425523) Homepage Journal

        Long ago they used to be, back when ATI and Trident were big names in the video card business.

        • Re: (Score:3, Interesting)

          ATi, Trident, Matrox, S3, the good old days... I remember when I worked in a computer shop, we used to burn through S3 Virge and S3 Trio cards like they were going out of fashion.

          Unfortunately they were left for dead when people no longer needed a 2D card to go with their 3DFX card - the combo cards from Diamond were killer cards and removed the need for the usual S3 Virge/Trio or Trident.

      • I had an S3 Virge about 11 years ago; they were pretty popular back then. I don't recall hearing much about them since, so I assumed they'd gone the way of 3DFX.
      • Re: (Score:2, Interesting)

        by tibman ( 623933 )

        S3 is from the age of 3dfx cards, before the Nvidia GeForce era. I don't remember any of their cards being very successful, other than some late Savage cards, and even those weren't equal to 3dfx, ATI, or Nvidia offerings.

        I still have stuff with 3dfx logos... I miss them :(

        • *Click* *Spinning Logo* = Time for some eye candy! Happy memories :-)
        • Re: (Score:3, Informative)

          As I stated in a post further up, the Trio and Virge cards are what S3 made a killing on.

          I actually remember a server board that basically required a Trio - other cards would cause the system to hang mid-use. They were great little cards, and you could even add extra memory to them.

      • The original name for "DXTC" was .... "S3TC"

        http://en.wikipedia.org/wiki/Texture_compression [wikipedia.org]

      • I actually think I've seen some of their hardware before; but, I didn't know they were still around, so I guess I'm in a similar boat as you.
        S3 Graphics [wikipedia.org]
      • by Mr Z ( 6791 )

        My S3 ViRGE sitting in my Gateway 2000 G6 Pentium II-300 weeps.

        Ah, what the heck am I talking 'bout. I kicked that turd and its 4GB HD to the curb years ago.

    • by BrentH ( 1154987 )
      While the press release doesn't make this terribly clear, I think it may be that this application automatically processes all image data and adjusts it as needed before it sends it out to the monitor. So in a way it's an image 'normalizer', like how you have audio normalizers that make all sounds have about the same volume. This actually would make sense (if it's what I think it is), because it would, for example, mean that all your photos 'look good' without any touching up, all your home movies ditto, and ditto th
    • Yep (Score:3, Interesting)

      by Sycraft-fu ( 314770 )

      First we need to see a video card that performs well. Seriously. The whole reason that nVidia (and ATi) cards can do well at GPGPU stuff is that they are fast at gaming stuff.

      Gaming graphics are, at their heart, a whole lot of parallel single-precision floating point math. Thus, that is what modern video cards are good at calculating. Well, the GPGPU idea was just someone saying "Hey, these things are amazingly fast at number crunching, and graphics aren't the only sort of thing this is relevant to. Let's get an
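
      The kind of work being described here is one identical floating-point operation repeated across millions of independent elements, which is why it maps so directly onto a GPU. A minimal sketch of that pattern as a hypothetical CUDA program (the kernel name, sizes and launch configuration are illustrative assumptions, not anything from the article):

        #include <cstdio>
        #include <cuda_runtime.h>

        // One thread per element: y[i] = a * x[i] + y[i].
        // This is the parallel single-precision math GPUs are built for.
        __global__ void saxpy(int n, float a, const float *x, float *y) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) {
                y[i] = a * x[i] + y[i];
            }
        }

        int main() {
            const int n = 1 << 20;
            float *x, *y;
            cudaMallocManaged(&x, n * sizeof(float));
            cudaMallocManaged(&y, n * sizeof(float));
            for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

            // Launch one thread per element, 256 threads per block.
            saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
            cudaDeviceSynchronize();

            printf("y[0] = %f\n", y[0]);  // expect 4.0
            cudaFree(x);
            cudaFree(y);
            return 0;
        }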

      • Re: (Score:3, Interesting)

        by bendodge ( 998616 )

        Do games later.
        For now, let's see some really small, low-power, low-heat video chips with enough power for HD video and basic 3D acceleration. If they do that and release documentation for Linux, they can pwn the netbook market. Guess what S3 appears to be aiming at?

        • by hattig ( 47930 )

          The problem is that both NVIDIA and AMD have products for this market that also double as system chipsets and thus are quite cheap to use instead of discrete graphics. For example, see the recently announced 9400M from NVIDIA (although their older AMD chipsets also had the video decode capability) and AMD's 780G.

          (If you distill AMD's solution, you will end up with an ATI Xilleon chip (although I think they sold this off to Broadcom recently) that is a SoC with the video decode unit that is used in AMD GPUs,

        • I just wish they would aim more resources at OpenGL optimization.

          Gaming hardware is developed for performance first, with low power consumption as an afterthought. A separate market of buyers exists who prioritize compactness and low wattage. Why not a separate R&D model? Both have a legitimate place in computer hardware, but like you, I think S3 has different plans than trying to compete on ATI's & nVidia's terms. And I'm excited about the possibilities.

          If they do that and release documentation for Linux, they can pwn the netbook market.

  • what (Score:4, Insightful)

    by Brian Gordon ( 987471 ) on Saturday October 18, 2008 @12:40PM (#25425001)
    How is enhancing photos the business of a video card? That can be done in software at a perfectly acceptable speed without hardware acceleration.
    • Re:what (Score:5, Informative)

      by Poltras ( 680608 ) on Saturday October 18, 2008 @12:44PM (#25425037) Homepage
      Not true. Some filters that took minutes in Adobe Photoshop CS2 take only half a second in CS4. Just doing a low-pass filter or a blur to get noise reduction would, of course, be doable on a single CPU. But once you go professional, the time saved by using the GPGPU is amazing; it means you can see results in real time, so you can make adjustments much, much faster.
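
      To see why this parallelizes so well: a blur or low-pass filter computes every output pixel independently from a small neighbourhood, so each pixel can get its own GPU thread. Here is a hypothetical CUDA sketch of a 3x3 box blur; the image layout and launch shape are assumptions for illustration, not how CS4 or S3's software actually works:

        // 3x3 box blur on a single-channel 8-bit image, one thread per pixel.
        // 'in' and 'out' are device pointers; width/height describe the image.
        __global__ void boxBlur3x3(const unsigned char *in, unsigned char *out,
                                   int width, int height) {
            int x = blockIdx.x * blockDim.x + threadIdx.x;
            int y = blockIdx.y * blockDim.y + threadIdx.y;
            if (x >= width || y >= height) return;

            int sum = 0, count = 0;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < width && ny >= 0 && ny < height) {
                        sum += in[ny * width + nx];
                        ++count;
                    }
                }
            }
            out[y * width + x] = (unsigned char)(sum / count);
        }

        // Typical launch for a width x height image:
        //   dim3 block(16, 16);
        //   dim3 grid((width + 15) / 16, (height + 15) / 16);
        //   boxBlur3x3<<<grid, block>>>(d_in, d_out, width, height);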
      • Re: (Score:3, Informative)

        Comment removed based on user account deletion
        • Re: (Score:3, Funny)

          by slimjim8094 ( 941042 )

          I'm fairly sure that's what he said. He even specifically mentioned Photoshop.

          • Pretty much... oh noes, Nvidia wasn't mentioned.

            Either someone is a bit too much of a pedant or we're seeing some fanboyism.

        • Oh. There's a company already in the market? Well, forget it then. There's no need for a second company to produce a similar product.
          Give everybody the rest of the day off, and we'll come up with some new stuff tomorrow.

          </sarcasm>

      • by aliquis ( 678370 )

        Except in Aperture, where the GPU acceleration is much slower than Lightroom's, thanks to Apple's retarded decision to use 128 MB of VRAM. Hurray for less choice.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      How is enhancing photos the business of a central processing unit? That can be done in hardware acceleration at incredible speeds without software.

      • Re: (Score:1, Funny)

        How is enhancing photos the business of a computer? That can be done with professional equipment and professional lighting.

        • Re: (Score:3, Funny)

          How is enhancing photos the business of professional equipment and lighting? That can be done with a few lenses, a cheap pen light and some cardboard.

        • by aliquis ( 678370 )

          Off-topic my ass, funny or insightful :D

          Why ISN'T enhancing photos the business of the central processing unit? Does it matter how it's done as long as you get the results you want?

          Anyway, all enhancements will still try to fix what the exposure didn't.

      • Re: (Score:3, Funny)

        by Fred_A ( 10934 )

        Or you can just squint a bit, or step back a little. Cheap and fast and used to work just as well.

  • S3 (Score:3, Funny)

    by crawly ( 890914 ) on Saturday October 18, 2008 @12:40PM (#25425007)
    I vaguely remember them, and here I thought they had gone out of business.
  • Easy to forget (Score:4, Informative)

    by Spatial ( 1235392 ) on Saturday October 18, 2008 @12:48PM (#25425059)

    It's sometimes easy to forget that the PC graphics market isn't owned by ATI and Nvidia

    That's right. Intel owns it too.

  • VIA (Score:5, Interesting)

    by blind biker ( 1066130 ) on Saturday October 18, 2008 @12:57PM (#25425113) Journal

    Looks like VIA is really serious about this whole x86 business - they are the little (compared to Intel and AMD) thorn in the side of the big boys. With so many bald decisions regarding their own x86 roadmap, it's a miracle they're still around!

    What I mean is: AMD has been on the razor's edge for many years already, always in danger of unprofitability due to the thin or sometimes non-existent margins it has accepted in order to keep up with the top dog. And AMD has a substantial slice of the x86 market, definitely way bigger than VIA's. Imagine what sort of creative management it takes for VIA to stay competitive.

    S3's role in VIA's x86 plans could be crucial. I can definitely see them helping VIA into the emerging netbook market. Cheap and low-power is what VIA and S3 are good at, and that's exactly what netbooks are all about.

    • With so many bald decisions regarding their own x86 roadmap, it's a miracle they're still around!

      Uh.... I shouldn't be posting minutes after waking up from an afternoon slumber :o)
      Obviously, I meant bold decisions! Oh well - at least it's a funny typo!

    • Re: (Score:3, Interesting)

      by PineGreen ( 446635 )

      Incidentally, I put together a completely fanless audio system based around a 1 GHz VIA Esther (the only noise-producing thing is the HD - I used a quiet Barracuda, and even that gets annoying after a while). The little box soon expanded into a torrent downloader, file and web server, and is an incredibly stable platform. The best $400 I ever spent (it is actually more stable than my 8-core Xeon monster, worth $8000 last year, that I use at work).

      • $8000? That's a crapload for a machine, even an 8-core Xeon. What have you got in there?
        • A small bag of gemstones.

          $8000? That's a crapload for a machine, even an 8-core Xeon. What have you got in there?

          One of the few pleasures of academia is that you get whatever machine you want... 8 cores at 3 GHz (which was the max last year), 8 GB RAM, 2x 750 GB RAIDed, a 30-inch screen driven by a decent video card... it adds up pretty quickly. :)

    • VIA is also an ARM licensee, but I've not seen any products from them based on ARM designs. I don't know if this is because they only sell to OEMs, or if it's because they don't exist.
      • by hattig ( 47930 )

        VIA may use ARMs as components within their other chips. It's not too hard to imagine an ARM inside their Firewire controllers, for example. ARMs are so small and efficient they get put everywhere you need some CPU functionality these days.

    • And AMD has a substantial slice of the x86 market, definitely way bigger than VIA. Imagine what sort of creative management it takes for VIA to stay competitive.

      If a company sells enough widgets to be profitable it doesn't matter if they're 90% or .01% of the widget market. If their revenues exceed their expenses they're making money. VIA and AMD don't have to dominate the CPU market to remain competitive or profitable. Whether or not they're actually profitable and competitive is what is important.

  • The linked Wikipedia article says that OpenCL will be introduced in OS X Snow Leopard. According to a post by DigiShaman above, Nvidia cards are GPGPU-capable via their CUDA API. I don't know much about the technology, but maybe this is what Apple had in mind when they put Nvidia cards in all the new MacBooks?
    • I think the whole point of OpenCL is that one doesn't need to be concerned about the underlying hardware. It can be a discrete Nvidia/ATI GPU, a chipset-integrated GPU, or even a hybrid multi-core CPU with graphics acceleration cores. I've read that both Nvidia and ATI/AMD have claimed they will support OpenCL in the future. From Apple's point of view it's a matter of choosing a vendor who can provide them with the best (by Apple's own standards) hardware solution.

    • by aliquis ( 678370 )

      Yeah, Nvidia supports CUDA on everything above the 6000-series, I think.

      Two other reasons are probably that:
      a) It's faster.
      b) Transgaming's Cider software doesn't support Intel graphics, and now with this one EA's "Mac games" such as C&C 3, Spore and so on will run on the MacBook as well.

  • by fuzzyfuzzyfungus ( 1223518 ) on Saturday October 18, 2008 @01:37PM (#25425349) Journal
    I can't say I'm wildly optimistic about the likely power of an S3 GPGPU setup, given the history of S3 GPUs. On the other hand, because their performance is likely to be somewhat mediocre, and they certainly don't have the marketshare or power of someone like NVIDIA, they are more likely to do things like release documentation in order to attract development for their platform. In general, the dominant player has the greatest incentive to go it alone, keep things proprietary, and generally try to leverage their power, while the second stringers are much more likely to be helpful in their attempt to build marketshare.
  • by 4D6963 ( 933028 )

    Is it me, or does there "S3FotoPro Enhancement" in TFA look like nothing different from mere contrast adjustment?

    • by 4D6963 ( 933028 )
      Crap crap crap and an extra slice of triple crap, I meant their not there... Damnit, the things tiredness makes you type..
    • by Teun ( 17872 )
      I'd sooner compare it with Gamma adjustment.

      But does it run on Linux? :)

    • I think the program simply performs some basic automatic tunings, such as contrast, saturation and sharpness adjustments. It is not supposed to be a replacement for Adobe Photoshop/Lightroom with tons of sophisticated filters (which are mostly meaningless without manually input parameters anyway). It's simply a demo app to show the RELATIVE performance gain one can get from GPU hardware acceleration. Even though it is likely that the stuff still runs faster on my Xeon PC without acceleration than
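
      For a sense of what such automatic tunings look like as GPU work, here is a hypothetical CUDA sketch of a per-pixel contrast and saturation tweak. The formulas and parameters are common textbook choices used purely for illustration - not S3FotoPro's actual algorithm, which S3 hasn't published:

        // Per-pixel contrast + saturation adjustment on interleaved 8-bit RGB.
        // contrast = 1.0 and saturation = 1.0 leave the image unchanged.
        __global__ void autoTune(unsigned char *rgb, int numPixels,
                                 float contrast, float saturation) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= numPixels) return;

            float r = rgb[3 * i + 0] / 255.0f;
            float g = rgb[3 * i + 1] / 255.0f;
            float b = rgb[3 * i + 2] / 255.0f;

            // Contrast: stretch each channel around mid-grey.
            r = (r - 0.5f) * contrast + 0.5f;
            g = (g - 0.5f) * contrast + 0.5f;
            b = (b - 0.5f) * contrast + 0.5f;

            // Saturation: blend each channel with the pixel's luma.
            float luma = 0.299f * r + 0.587f * g + 0.114f * b;
            r = luma + (r - luma) * saturation;
            g = luma + (g - luma) * saturation;
            b = luma + (b - luma) * saturation;

            rgb[3 * i + 0] = (unsigned char)(fminf(fmaxf(r, 0.0f), 1.0f) * 255.0f);
            rgb[3 * i + 1] = (unsigned char)(fminf(fmaxf(g, 0.0f), 1.0f) * 255.0f);
            rgb[3 * i + 2] = (unsigned char)(fminf(fmaxf(b, 0.0f), 1.0f) * 255.0f);
        }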

  • by Anonymous Coward

    No, I'm not trying to be funny (or annoying, if you prefer).

    I'm seriously asking.

    So far, S3 has the worst performance and support for Linux. At least VIA is beginning to open their drivers:
    http://linux.via.com.tw/support/downloadFiles.action

    For me at least, even if S3 starts a video revolution, I will stick to NVidia, ATI and even Intel until I hear about decent Linux performance.

    (Sorry about the link not being clickable. It seems that ACs can't post in HTML, and I don't need an account just yet)

  • If I remember correctly, it was Diamond that first brought us the Diamond EDGE cards, which used Nvidia's first 3D chip. It was the same chip used in the Sega Saturn.

    I believe the S3 Virge came after. As far as I can remember, the Diamond EDGE PC cards were the first 3D accelerator cards... and that would make Nvidia first, no?

    After the Edge flopped, Matrox gave us the Millenia cards, which had terrible 3D support on them. Voodoo then took the entire market for a while until Nvidia launched the TNT2, which was a gr

    • by Shark ( 78448 )

      Actually, my Matrox Impression Plus had 3D (OpenGL) acceleration. That was um... 1994?

      The tricky thing then was to find *anything* but the three Matrox-made sample (and crappy) games that would use it ;) It was a real killer in 3D Studio and CAD, though.

  • I want to work as a GPU designer for S3 and put my heart and soul into a product that will be laughably pathetic compared to nVidia and ATi's offerings.

    I also want to fight two MMA champs at the same time, just so I can push my body to the limit and get utterly humiliated and destroyed anyways by two laughing guys drinking beer while they are beating me up.

    • by jensend ( 71114 )

      I think you must mean you want to work for S3's driver team; it's the drivers that are laughable. S3's hardware is, like many other things VIA and its subsidiaries have come up with (such as the Envy and the Nano), good engineering.

      S3 cards have done fairly well in benchmarks compared to parts from nV and AMD/ATI of the same classes (midrange and low end; S3 doesn't bother with the high end), an impressive achievement considering that the 800-lb gorilla Intel has consistently failed to even come anywhere close

  • with HD video acceleration and decent 3D for compiz eye candy / simple games.

    If they release good-quality Linux drivers, I think S3 could make a name for themselves once again. The netbook and low-end PC market is growing. AFAIK even nVidia does not have any kind of HD video acceleration in their Linux drivers.

    Otherwise, most Windows geeks already have a preference for either nVidia or ATI, so this would largely go ignored and only end up in the laps of those who don't know the difference or who just got t

  • was that Amazon S3 is using GPGPU
