Intel Discrete Graphics Chips Confirmed

Arun Demeure writes "There have been rumors of Intel's re-entry into discrete graphics for months. Now Beyond3D reports that Intel has copped to the project on their own site. They describe it as a 'many-core' architecture aimed at 'high-end client platforms,' but also extending to other market segments in the future, with 'plans for accelerated CPU integration.' This might also encourage others to follow Intel's strategy of open-sourcing their Linux drivers. So, better watch out NVIDIA and AMD/ATI — there's new competition on the horizon."
  • More competition (Score:5, Insightful)

    by GreenEnvy22 ( 1046790 ) on Tuesday January 23, 2007 @08:10AM (#17722334)
    Competition is almost always good, so I look forward to this. I'd like to see Intel push ATI and Nvidia to create more power efficient chips, as it's quite ridiculous right now.
    • by Moraelin ( 679338 ) on Tuesday January 23, 2007 @08:21AM (#17722446) Journal
      If you look at the vast majority of chips either ATI or nVidia sell, they're actually pretty efficient.

      But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast". They'll go to some benchmark site to see some "nVidia's 8800 GTX is faster than ATI's X1900XTX!" article (not entirely unexpected, it's one generation ahead), end up with some vague "nVidia is faster than ATI" idea, then go buy a 5200, which is the lowest end of a line two generations behind that ATI, or three behind the 8800 GTX.

      Both ATI and nVidia even went through times of not even trying to produce or sell much of their headline-grabbing card. And at least ATI always introduces their latest technology in their mid-range cards first, and those tend to be reasonably energy efficient cards too. But it's like a game of chicken: the one who pulls out loses. The moment one of them gave up on having an ultra-high-end card at all, the benchmark sites and willy-waver forums would proclaim "company X loses the high performance graphics battle!"

      I don't think Intel will manage to restore sanity in that arena, sadly. Most likely Intel will end up playing the same game, with one overclocked noisy card to grab the headlines for their saner cards.
      • Re: (Score:3, Insightful)

        by Kjella ( 173770 )
        That sorta assumes you can have one without having the other - can you really have a damn good midrange card that wouldn't perform as a high-end card if you jacked up the GPU frequency and RAM speed and added a huge noisy fan? Trying to measure the midrange gets too complicated though, too many variables like noise and power consumption. Let's just have an all-out pissing contest and assume that it scales down.

        Technologically, it does. But then there's the part about market economics, you charge what the market w
      • by danpsmith ( 922127 ) on Tuesday January 23, 2007 @09:27AM (#17723100)
        But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast". They'll go to some benchmark site to see some "nVidia's 8800 GTX is faster than ATI's X1900XTX!" article (not entirely unexpected, it's one generation ahead), end up with some vague "nVidia is faster than ATI" idea, then go buy a 5200, which is the lowest end of a line two generations behind that ATI, or three behind the 8800 GTX.

        Maybe I'm in the minority of people here, but I've always gone to sites that have actual reviews of the card I will potentially be buying. Companies have different models, and each of those models has its own advantages and disadvantages. I think a lot of the people who do a lot of comparison shopping online (i.e. most of the market that's actually going to be buying/installing their own graphics card) know this and do the same. ATI and Nvidia cards are only going to sell to a certain section of the market other than OEMs, and I seriously doubt that this is the approach that the type of people upgrading video cards would use in determining which card to purchase. I know I usually check out anandtech.com and look for benchmarks in the price range that I'm in.

        This is like saying "Alpine stereos are better" and buying the lowest-model Alpine without comparing it to anything else in the price range; nobody who is going to be installing it themselves can be that stupid, unless they were fanboys looking for a reason to hype up their favorite company anyway. Either way it doesn't look like a real market strategy to me.

        • by renoX ( 11677 )
          > Maybe I'm in the minority of people here, but I've always gone to sites that have actual reviews of the card I will potentially be buying.

          I doubt that a majority of Slashdot readers do otherwise. That said, I don't know if you've noticed, but the review sites usually compare boards in the same range without listing the boards' prices, so you have to build the performance/price graph yourself (a rough sketch of the arithmetic is below).
          So it's not so easy to do the comparison.
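
          A minimal sketch of that graph-building, in Python, with made-up card names, prices, and frame rates (all hypothetical; the point is just the fps-per-dollar arithmetic):

            # Average each card's benchmark frame rates, then rank by fps per dollar.
            cards = {
                "Card A": {"price": 199, "fps": [85, 60, 42]},   # hypothetical numbers
                "Card B": {"price": 299, "fps": [110, 78, 55]},
                "Card C": {"price": 449, "fps": [130, 95, 70]},
            }

            def value(entry):
                avg = sum(entry["fps"]) / len(entry["fps"])
                return avg / entry["price"]          # frames per second per dollar

            for name, entry in sorted(cards.items(), key=lambda kv: value(kv[1]), reverse=True):
                avg = sum(entry["fps"]) / len(entry["fps"])
                print("%s: %.0f fps average, %.3f fps per dollar" % (name, avg, value(entry)))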
        • by WhoBeDaPlaya ( 984958 ) on Tuesday January 23, 2007 @12:07PM (#17725022) Homepage
          Actually, there are lots of "computer engineers" here who bought FX5200 or similar cards and think it's the cat's meow.
          "Oooh look! 256MB RAM! That shiiiny FX5200 has to be better than my friend's 128MB 9700 Pro"
        • I meant it's marketing/advertising/PR, by any other name. It's not that everyone uses _only_ those 8800 GTX benchmarks to choose a lower end card; it's that it's there to bombard you with "company X is better/cooler/higher-tech/whatever than company Y" until it hopefully starts to create a subconscious bias. It's not the only criterion, but for enough people it ends up being _a_ criterion, whether they acknowledge it or not.

          Sure, we all like to pretend that we're, like, all intelligent and stuff, and would
      • But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast".

        And, for some of us, we don't even need to know if it's all that fast. As long as it's properly supported by Xorg.

        The nVidia video card on my FreeBSD box died the other week. I went to my local PC shop, got the chea

      • The high end cards are useful for two reasons:
        - They give game developers something to work on so they can target mid-range performance levels a year in the future.
        - They keep the pressure on to continue significant performance improvements.

        People will always complain about having to upgrade / buy new hardware, but go play through Half Life I after playing Half Life II and tell me that the graphics improvements were "a waste of money" or "utterly unnecessary for the enjoyment of the game". I enjoy decen

    • Competition is almost always good, so I look forward to this. I'd like to see Intel push ATI and Nvidia to create more power efficient chips, as it's quite ridiculous right now.

      No kidding! Looking at video cards to get away from dreaded shared memory, I couldn't believe what they want for anything decent that wouldn't burn a hole in anything it touched (heat/cost). And given Intel's history with open source drivers for their wireless chips, I am not holding my breath waiting for them. AMD/ATI, I hope AMD makes ATI manag

  • by vhogemann ( 797994 ) <victor.hogemann@com> on Tuesday January 23, 2007 @08:12AM (#17722356) Homepage
    And if they enter the gaming video market, I can assure you that my next video board will be an Intel one.

    Intel drivers for Linux Just Work(TM). I installed Ubuntu 6.10 on my Acer notebook, with a i915g video adapter, and everything worked without any extra effort. And I'm even able to use Beryl/Compiz as my default window manager, without any stability issues.

    Both nVidia and ATI should learn from Intel.
    • I want to get a motherboard with Intel onboard graphics (that has free Linux drivers). I've heard of the G965 chipset; is that the one to go for? I would prefer to buy a 'workstation' rather than 'consumer' motherboard but they tend not to have integrated graphics, no?

      Are Intel's own-brand motherboards worth it? In the past I've bought Asus but that was for AMD-based systems.
      • I've heard good things about that one and how it does fine with the modern eye-candy like xgl etc. Apparently there's newer and better already out from Intel. See here [wikipedia.org].

        I've looked around for boards with these chips and found them in several brands including Asus and Intel. A search for "gma" on your favourite computer store's site should find something.
      • Re: (Score:3, Informative)

        by chrish ( 4714 )
        The Intel GMA950 is the one Apple's using in the Mac Mini and MacBook laptops, and doesn't seem too horrible for an integrated shared-memory GPU; it runs all the spiffy OS X eye-candy nicely, and I've had people tell me that playing games (World of Warcraft natively, or City of Heroes after installing BootCamp and XP) on it is fine.

        Since gaming isn't really your focus if you're running Linux ;-), I imagine the GMA950 chipset (or something newer) would be great for KDE/GNOME/etc. even when they start using O
        • Gaming on the i955, which is in the Mac mini, is an entirely different issue. The chipset is fast enough for many things - it is pretty much twice as fast as an old Radeon 9200, with more accessible memory, and has excellent video acceleration - but 3D gaming is problematic despite that. Some games work quite well, even new ones using shaders etc. Some seriously choke (like Gothic 3) with a crash, some even plainly refuse to run once they detect an Intel graphics card, and some have garbl
    • Re: (Score:3, Insightful)

      by Lonewolf666 ( 259450 )
      Intel drivers for Linux Just Work(TM)
      That might have to do with their drivers being Open Source, which has been recommended by the Linux community for a long time. According to all statements from kernel developers I've read, Open Source drivers are much easier to maintain.
    • by Andy Dodd ( 701 ) <atd7@c[ ]ell.edu ['orn' in gap]> on Tuesday January 23, 2007 @08:56AM (#17722760) Homepage
      "Intel drivers for Linux Just Work(TM). I installed Ubuntu 6.10 on my Acer notebook, with a i915g video adapter, and everything worked without any extra effort. And I'm even able to use Beryl/Compiz as my default window manager, without any stability issues."

      This is because Intel's graphics chipsets are crippled and don't implement any of the features covered by other companies' patents which force ATI and NVidia to go closed-source.

      You seem to forget that ATI had fully open-source drivers until they were forced to "go closed" due to licensing another company's IP for their chipsets. In that particular case, the first incident was S3 Texture Compression, a feature essentially required by all modern games, and apparently covered by patent licensing agreements that prohibit open-source implementations. For a few months, S3TC was why Unreal Tournament 2003 (or was it 2k4?) only ran on NVidia cards under Linux - it wasn't until ATI released binary drivers that supported S3TC that UT2k3 would run on ATI cards under Linux.

      The end result is that ultimately, the choice will not be Intel's as to whether to go open-source or not for full functionality, just as ATI had no choice but to "go closed" or simply leave certain critical features disabled/unsupported under Linux.
      • You seem to forget that ATI had fully open-source drivers until they were forced to "go closed" due to licensing another company's IP for their chipsets. In that particular case, the first incident was S3 Texture Compression, a feature essentially required by all modern games, and apparently covered by patent licensing agreements that prohibit open-source implementations. For a few months, ...

        So this is all the fault of gamers ... who use Windows ...?

        I don't know whether to laugh, cry or punch someone in the face.
      • Re: (Score:3, Insightful)

        by MartinG ( 52587 )
        Why can't they release open source drivers that cover as much functionality as possible, and optionally provide a closed-source version that includes the non-OSS-releasable parts?

      • by realnowhereman ( 263389 ) <[moc.liamg] [ta] [snikrapydna]> on Tuesday January 23, 2007 @10:54AM (#17724114)
        This is because Intel's graphics chipsets are crippled and don't implement any of the features covered by other companies' patents which force ATI and NVidia to go closed-source.

        And I should care about that why?

        Intel cards are not bleeding edge. However, if all you want is a reasonably powerful, 3D supporting card for your open source desktop, then they are perfect. I don't require a huge framerate in $LATEST_GAME, because I don't play it. If I did, then an Intel card would obviously not be for me.

        My Intel-based graphics work perfectly, and don't give a moment's trouble. I can run 3D applications if I want, and a flashy eye-candy-full desktop too. I previously had an nVidia card, and it was nothing but a fight - is my card supported with this release of the driver? Is it crashing my computer? Is it going to compile with the latest kernel?

        Nowadays, I do nothing but apt-get upgrade to keep my graphics in order and I am a lot happier for it.
      • Patents would not require the code to be closed source.

        You are probably confusing patents with copyrights on the submitted code.

        The only way patents make code closed is when a company thinks they may be violating a patent and wants to hide it.
      • Why does licensing a patent require closed source drivers? If I recall correctly, the patent system requires disclosure as reciprocation for limited monopolies. I believe there are open source 3D drivers for the Savage3 chipset, which is strange given the circumstances claimed.
        • by Andy Dodd ( 701 )
          "Why does liscencing a patent require closed source drivers?"

          As I posted in a previous reply, that is at the patent owner's discretion, even though it might not make sense.

          Either way, the drivers are closed due to intellectual property licensed in such a way as to forbid implementation of some features in an open-source driver. Whatever the exact details are, it's a fact that ATI's first closed-source Linux drivers were released shortly after the UT2003 S3TC fiasco, and S3TC support (and hence UT2K3 compat
        • by nuzak ( 959558 )
          > If I recall correctly, the patent system requires disclosure as reciprocation for limited monopolies.

          It's simpler than that: patent applications are public, and they're supposed to describe the invention in sufficient detail to make it reproducible. That's the entire point of a patent system. In return for giving up a trade secret, the inventor gets a limited-time monopoly on production of the patented device.

          Except the system is so rigged now that patents are deliberately obfuscated and often eve
      • You seem to forget that ATI had fully open-source drivers until they were forced to "go closed" due to licensing another company's IP for their chipsets.

        No they didn't. ATI has never developed open-source drivers. The DRI project developed open-source ATI drivers for ATI's hardware under NDA. ATI did not participate in this development.

        In that particular case, the first incident was S3 Texture Compression, a feature essentially required by all modern games,

        No it is not. Epic is one of the few comp

    • Re: (Score:3, Informative)

      by MrNemesis ( 587188 )
      I've also recently switched to an Intel GFX card for my Myth backend - integrated GMA X3000 in a GigaByte 965G-DS3. Nice board and, in general, nice graphics - after a bit of tinkering getting xorg working with DRI was pretty easy (although fiddling with 915resolution to get my 1680x1050 TFT working at native res was a bit of a pain, but then I guess that's the "attraction" of using Gentoo ;)).

      However, like Andy Dodd points out, there are several glaring omissions in the driver; the biggest one for me is that
    • by niko9 ( 315647 )
      I myself love the fact that current video chip sets from Intel have free open source drivers (Yay for Google Earth on my Thinkpad x40!), but there is no mention anywhere that their next generation of discrete video chip sets will have open source drivers.

      I still lament the fact that I can't get any of their current generation of video chips on a discrete card; then I could finally get rid of my Nvidia card without having to give up my Socket 939 motherboard and CPU.

      I still have more faith in the Open Graphics proj
  • by LaughingCoder ( 914424 ) on Tuesday January 23, 2007 @08:12AM (#17722362)
    Intel is years behind in this market. And they tried this once before, with dismal results: http://news.com.com/Intel+retreats+from+graphics+chips/2100-1001_3-230019.html [com.com]

    If anything the graphics market has gotten even more specialized since then. I don't know why they think they can succeed this time.
    • by CastrTroy ( 595695 ) on Tuesday January 23, 2007 @08:41AM (#17722606)
      But most people don't buy the top end. There's still a lot of computers being sold with Intel graphics chipsets, right on the motherboard, because most people could care less about which graphics card they have. They'd rather be playing games on their big TV with their console. As long as they can play Tetris variation #349 and freecell, they don't really care which graphics card they have.
      • Re: (Score:3, Informative)

        by thue ( 121682 )
        most people could care less about which graphics card they have

        They could care less? It would only be possible to care less if you actually cared.

        http://www.impleader.com/photos/blog/caringcontinuum.jpg [impleader.com]
        • Yes, could care less is correct, because it's short for the phrase:

          I suppose I could care less, but I'm not sure how.
          • by thue ( 121682 ) on Tuesday January 23, 2007 @11:21AM (#17724372) Homepage
            Yes, could care less is correct, because it's short for the phrase:
            I suppose I could care less, but I'm not sure how.


            I agree with you, and concede the point.*

            *Here "I agree with you, and concede the point" is actually short for the phrase "I could agree with you, and concede the point, but I consider using words which mean the opposite of what you are trying to say in normal conversation to be extremely silly.".
            • "I could agree with you, and concede the point, but I consider using words which mean the opposite of what you are trying to say in normal conversation to be extremely silly.".

              Oh yeah, that makes sense.
          • Yes, could care less is correct, because it's short for the phrase:

            I suppose I could care less, but I'm not sure how.

            That begs the question, "How can we work a begs-the-question-misuse joke into this thread?"

    • by Kjella ( 173770 )
      I think it's quite simply because AMD/ATI has been pushing combined solutions, which means Intel and nVidia either need to team up or roll their own. This might be just as much strategic: "nVidia, you need us more than we need you". nVidia has proven they're no slouch when it comes to business; for example, by refusing to license SLI they've muscled in on the high-end motherboard market. Intel certainly has greater ambitions than to deliver Intel CPUs to a nVidia system, and this might be a way of saying
    • Re: (Score:2, Interesting)

      If I am reading this article right, "multi-core" and "high-end" graphics probably means that Intel is going after realtime ray-tracing HW support, which is seen as the natural successor of current z-buffered graphics. There are university projects already proving that ray-tracing hardware support works fine and brings way better graphics than what is available from ATI/nVidia. The battle for the best ray-tracing HW will start soon among all 3 key players (ATI/Intel/nVidia), and Intel probably thinks this is the right time t
    • by suv4x4 ( 956391 ) on Tuesday January 23, 2007 @08:59AM (#17722806)
      I don't know why they think they can succeed this time.

      Remember when AMD made Intel clones down to the very chip architecture and it didn't matter which manufacturer you bought from?

      Remember how AMD K5 sucked and people started leaning towards Intels? And then Pentium 4 happened, and AMD's new architecture was much superior? And then Core turned things on their head again?

      Things change. I don't think we're using 3DFX cards anymore either. They used to be ahead of everyone.

      • Re: (Score:3, Funny)


        Remember when AMD made Intel clones down to the very chip architecture and it didn't matter which manufacturer you bought from?

        Remember how AMD K5 sucked and people started leaning towards Intels? And then Pentium 4 happened, and AMD's new architecture was much superior? And then Core turned things on their head again?


        Pepperidge Farms remembers.
      • The difference in your examples is that AMD was always a CPU company. Intel is not a graphics chip company. They tried to be once but failed miserably. Graphics is a specialized area with unique domain knowledge, just as CPU design is. Would you assume, if Nvidia announced they were going to make CPUs to compete with Intel and AMD that they would be successful? Now, I'm not saying Intel can't be successful. I am only saying it's not the way to bet (and I have history on my side).
    • Re: (Score:3, Interesting)

      by Creepy ( 93888 )
      I suspect the real problem is because high end cards are starting to push Shader unification [rage3d.com].

      From a chipset standpoint, Intel actually makes decent (not spectacular, but better than many) graphics hardware already; they just don't have hardware transform and lighting (T&L), which gets offloaded to the CPU. That means you can't be throttling your CPU(s)/cores, and you need a decent pipe between the graphics hardware and memory. Intel said a couple of years back that it's a myth [intel.com] that the bottleneck is usually in
  • ...who can compete with ATI and Nvidia.
    Intel has technology, has brains, has money, has plants. They can do something "as good as" the two others. Competition is a good thing (prices falling, etc); only two main actors for video cards is a bad thing.
    S3 can't compete. Matrox can't compete. 3dfx can't compete (they're dead). Others can't compete. Intel is our only hope.
    • by nbannerman ( 974715 ) on Tuesday January 23, 2007 @08:29AM (#17722516)
      Well, SONICblue (formerly S3 / Diamond) is essentially dead as well (Chapter 11, most product lines sold off), but Matrox still survives with a 3-5% share of the market, and they're doing fairly well in niche markets - scientific, medical, military and financial. As for 3dfx, their assets (intellectual and staff) were purchased by NVIDIA, so any innovation from their prime years is probably still alive and well (to a degree).
      • Re: (Score:3, Informative)

        The graphics part of S3 was sold to VIA at about the same time as it transformed to SONIC|blue. So the Chapter 11 thing is irrelevant.

    • by sjf ( 3790 )
      Intel has technology, has brains, has money, has plants.

      What they don't have, though, is ATI(AMD) and NVIDIA's patent portfolios.
  • If Intel can make a graphics card that is better than my current GeForce FX 5700LE in all areas (including shader performance) I am sold.
    Especially if they have open source Linux drivers for the thing :)
    • by bmgoau ( 801508 )
      Right now there are two separate generations of cards on the market (even an emerging 3rd), some very very affordable, which can best that card and provide (in a roundabout way) support for Linux. The utility cost of waiting for Intel to bring out their chip may be disproportionate to simply buying today's budget cards.
    • Mutually exclusive. You get either new features or open source drivers. Many of the more interesting features newer GPUs offer are covered by patents etc., which require Intel to license the stuff. The license agreements will most probably contain clauses that prohibit the disclosure of implementation details; for example because the licence comes with example code and Intel probably doesn't want to go to great lengths to be able to prove in court that their implementation is not derivative of the example c
  • Will Intel be clever enough and innovative enough to have a "GPU" socket on such motherboards? Maybe even GPU-specific memory sockets rather than shared memory?

    One can always hope.
    • I doubt this will eliminate onboard graphics. At the low-end price range and in the light-weight mobile market, they're simply necessary. But if Intel could produce an onboard graphics chip that would compete with the 300-series (low-end discrete) from Nvidia and ATi, that could change the game.

      It's also unlikely Intel boards would have a GPU slot that's not PCIe (or PCIe 2.0), since no one would buy a motherboard that locks them into only Intel. Even Crossfire/SLI boards allow you to have one of the ot
      • Considering most motherboards are sold already in PCs, I doubt most people even know or care that they are locked into Intel only; and anyway, motherboards don't lock you in that much - you can just get a new motherboard without worrying too much. Intel boards don't allow you to plug in an AMD processor chip, after all, and I don't hear many people complaining about being locked into a processor manufacturer. The performance, efficiency and manufacturing gains might well be worth producing powerful on board graphi
    • by Lonewolf666 ( 259450 ) on Tuesday January 23, 2007 @08:36AM (#17722560)
      That socket is usually called a "PCIe slot" these days. If you use a socket instead of just integrating the graphics into a chip that is onboard anyway, you might as well use the established solution.
      Another interesting approach (albeit not for high end machines, and somewhat OT here) is AMD's plan to integrate the GPU with the CPU. That way, you might have more choice than with a soldered-in chip, and GPU cooling could benefit from the availability of decent CPU coolers.
      • You missed the point entirely.

        This doesn't take up one of your expansion slots, since you already have the graphic-out ports on the motherboard in such solutions. Meaning in a small-form-factor machine, you have one more option for tweaking the system to what you want/need.
  • Intel has a chance. Intel has the experience with CPUs, and can also interface with their own new processors. I think Intel could at least put up a good fight. Why do you think AMD bought ATI? They know that Intel can do GPUs, and really good ones if they tried, and the only way AMD would be able to compete would be buying a GPU maker, which they did.
    • by dusanv ( 256645 )
      They know that Intel can do GPUs, and really good ones if they tried

      Then I guess Intel hasn't really tried yet because they haven't yet produced a half decent graphics chip in their history (current integrated graphics lineup *stinks* compared to ATI/NVidia integrated stuff). I don't think anyone is afraid of discrete Intel graphics cards.
  • I've never met an Intel graphics solution that could play anything more intense than Solitaire. Is the G965 any good?

    I just upgraded my sister's mobo + CPU. It had embedded graphics, so I figured it would be comparable to her 2 year old nVidia AGP card. Nope. I had to buy a new PCIe nVidia card to handle Sims 2.

    On a side note: Has anyone noticed that the extremely popular family-friendly 3D games are the worst performers? Sims 2 and RCT3 both take eons to load - much slower than Q4 or HL2.
    • Its performance is on a par with the IGPs from about 3 iterations back from ATI, mostly due to immature drivers. It's closer in performance to the previous generation of integrated graphics (which happens to be a chip from the previous era of GPUs, with a vastly lower power consumption due to process shrink and logic improvements...) - at some things it bests ATI's chips; at other things, ATI's chip with its current drivers pastes it all over the place. The chip's capable of quite a bit more, but it's hampered by an
    • by Eideewt ( 603267 )
      Intel isn't that bad. A 915 plays Halo fine, and it *is* a budget chip. Point taken, Intel chips don't do what the brawnier ones do, but I don't think Intel has demonstrated incompetence in the field yet.
  • by Rastignac ( 1014569 ) on Tuesday January 23, 2007 @08:52AM (#17722724)
    I've been waiting for years for such kickass videocards. I've seen running prototypes in labs/universities; quite impressive videos. After a few years, the technology should now be ready for the big market? Pixar-like technology at home!
    Real-time raytracing needs a lot of power, so a multicore videocard is a great idea! With raytracing, each core can compute one part of each picture (a rough sketch of that split is below). Better than SLI.
    Using their knowledge, Intel can build a very fast multicore real-time raytracing videocard. It will be "something different", and it will compete with ATI and Nvidia in a new innovative way...
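
    A minimal sketch of that per-core split, in Python rather than anything Intel has announced; the resolution and the shading function are invented stand-ins, and the point is only that no pixel depends on any other, so scanlines can be farmed out to as many cores as you have:

        from multiprocessing import Pool

        WIDTH, HEIGHT, CORES = 320, 240, 4

        def trace(x, y):
            # Stand-in for a real ray tracer: cast a ray through pixel (x, y)
            # and return a shade. Each pixel is computed independently.
            return (x * y) % 256

        def render_rows(rows):
            # One worker renders its own set of scanlines; nothing is shared.
            return [(y, [trace(x, y) for x in range(WIDTH)]) for y in rows]

        if __name__ == "__main__":
            # Interleave scanlines across the cores so the load stays balanced.
            bands = [range(i, HEIGHT, CORES) for i in range(CORES)]
            pool = Pool(CORES)
            image = dict(row for band in pool.map(render_rows, bands) for row in band)
            print("rendered %d scanlines on %d cores" % (len(image), CORES))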
    • But no games would work. :)
      • by Eideewt ( 603267 )
        But no games would work without driver support for OpenGL/DirectX, you mean. If the chips can handle rendering duties, then they can presumably render something like the average card's output as well.
  • by markov_chain ( 202465 ) on Tuesday January 23, 2007 @09:08AM (#17722876)
    Until this new hardware lets me display fractional polygons, I'm sticking with my continuous graphics board.
  • But will they include DVI? Better yet, dual DVI for those who run either dual monitors or really large monitors which require dual link?
    • by Slashcrap ( 869349 ) on Tuesday January 23, 2007 @10:07AM (#17723566)
      But will they include DVI? Better yet, dual DVI for those who run either dual monitors or really large monitors which require dual link?

      No, in fact they aren't even going to include DSUB outputs. They are going to use modulated RF outputs like you got on the ATARI ST and AMIGA. They will be capable of displaying NTSC resolutions at anything up to 60Hz refresh rate.

      What the fuck do you think?
      • This comes on the heels of the news about Creative ditching the Audigy brand in favor of the new "SIDmeister" series, which will be based around the MOS Technology 6581 Sound Interface Device, offering stunning three-voice 16-bit 2.0 surround sound*.

        Creative is confident it can bring the first 100,000 units to market in fall 2007. "We have already managed to find 80 SIDs," a spokesperson said, "and we're pretty sure we can get the other 99,920 in the next couple of months."


        * Sound will appear to come from
      • I don't know. How many Intel motherboards ship with DVI connectors rather than VGA's DSUB? I haven't seen any.
        • by Bradley ( 2330 )
          There's SDVO [wikipedia.org], where you put a cheap card into the PCIe x16 slot. Graphics goes over the PCIe bus (@1-2GHz), and the SDVO card can then have DVI/HDMI/TV-out/etc. You still use a slot, but don't need to have a separate card/heatsink/fan/etc.

          Only problem is that I've never actually seen one... SiL apparently makes chips, and I'm sure if I went looking I could find a card, but they're not exactly common. No idea if they work under Linux.
        • Re: (Score:3, Informative)

          MacBooks and MacMinis ship with DVI on board.
  • by BillGatesLoveChild ( 1046184 ) on Tuesday January 23, 2007 @09:18AM (#17722988) Journal
    Intel's previous foray into the discrete graphics market was the Intel i740. I got one, agreeing with the PC salesman: "Hey, you can't go wrong with Intel, can you?" It was quite a decent chip for its time, and the driver was very stable - I don't ever recall the graphics hanging once! It was disappointing when Intel bailed out of the 3D market, but to their credit they continued to update the drivers whenever a new version of DirectX rolled out.

    Intel have already made a return of sorts to 3D with the Media Accelerator 9XX series chips you'll find in many Intel laptops. It's funny, because you'd expect an embedded chipset to be lame: lowest common denominator, shared RAM and all. But this lappie has it and the graphics scream. It's faster than my two-year-old nVidia 5700. The driver is stable too; it has never crashed. If they can do this with an embedded 3D chipset, imagine what they can do when they really put their minds to it?

    nVidia and ATI have the market to themselves these days. nVidia has gotten pretty lax about driver stability, and it's damned near impossible to get support out of them. They've fobbed off support to OEMs, who slap electronics onto cards and are in no position to help with driver problems. That's the sort of thing that happens when a company dominates a market.

    If Intel can come out with some high performance electronics and stable drivers, well, Welcome back, Intel! I for one welcome you as my new Overlord!
    • I've been monitoring this thread with some interest. I'm looking to build a new home computer that will run Linux exclusively (most likely Kubuntu). Mostly, it will be my personal workstation but I do plan to install some games - mostly 1st person shooter types. While I don't require "cutting edge", I would like decent performance. Can this chipset handle things like the latest UT or Doom III on Linux?

      I mean, I like nVidia, but if Intel is supported out-of-the-box with open source drivers, then that wor
      • Intel have a Game Compatibility List on their website. Not sure what your situation is, but my lappie uses the i945. DOOM III is fine, but Quake IV isn't. If Intel want to really penetrate the graphics market, obviously their next list will have to be all green spots. "Intel: The way it's meant to be played!" ;-)

        http://www.intel.com/support/graphics/intel945gm/sb/CS-021400.htm [intel.com]

        Serious gamers bitch that the 9XX series is low end. Maybe it is. But it whips my nVidia 5700 and that's good enough for me!

        Here's a list
        • by wrook ( 134116 )
          OK.... I have a desktop with an nVidia GeForce FX 5200. And I have a laptop with an i945GM chipset. The laptop has a *much* beefier CPU, but I can't even come close to the 3D graphics performance of the GeForce. Neverball gives me about 20 frames per second. Vegastrike only gives me about 15 frames per second if I'm close to any ship.

          Beryl runs quite nicely on the laptop, but I can't find a 3D game that runs decently at all.

          Personally, if you want to do *any* 3D gaming at all, the 9XX doesn't seem to b
      • by xenocide2 ( 231786 ) on Tuesday January 23, 2007 @11:56AM (#17724850) Homepage
        As far as I know, the GMA 9xx [wikipedia.org] series is a couple of generations behind, performance-wise. It should play Quake 3 and UT2k4 just fine, but it seems to have trouble with the Doom 3 engine, and I suspect the new UT engine will also be unplayable. On the Windows side, it doesn't work with Half-Life 2 either. Seems the most likely kind of game to fail is a new FPS. But I hear Aero and Xgl/AIGLX work fine, so you may be satisfied with the current Intel offerings. The Wikipedia page seems like a good place to start researching if you're still interested.
    • by nxtw ( 866177 )
      Neither my laptop's GMA 950 nor my HTPC's GMA 900 has ever been the cause of a driver crash. I can't say that about the ATI 9600 in my old desktop or the X1400 in my old laptop. The GMA 900 outputs 1080p (1920x1080) to my TV, and the GMA 950 in my laptop drives two 1280x1024 displays (2560x1024 total).

      My current laptop is rated for 3/4 the battery life when fitted with an ATI X1300 GPU.
    • Liar! (Score:3, Informative)

      by bogie ( 31020 )
      The GMA950 is a crap 3D card. Even the most basic Google research shows that it is NOT a return of Intel to 3D, and no reviewer worth a damn has said the graphics "scream". Poor performance and incomplete 3D support are the hallmarks of the GMA950. If you play nothing but Quake II, then yeah, the GMA950 is for you.

      http://www.extremetech.com/article2/0,1697,1821814,00.asp [extremetech.com]
      http://www.anandtech.com/video/showdoc.aspx?i=2427&p=3 [anandtech.com]
      http://everythingapple.blogspot.com/2006/03/intel-gma-950-terrible-opengl.html [blogspot.com]
      • Are you trolling? I own laptops with the Intel, Radeon Xpress and nVidia 3D chipsets, and two nVidia desktops, so I reckon I'm qualified to express an opinion.

        For the games and graphics software I run, it's fine. The Intel game compatibility list says which games do and don't give acceptable frame rates. I said that in the first post.

        Compared to the 5700, yes, hate to scare you, but it really does scream. The nVidia GeForce FX is a dog of a card. It has a very slow PixelShader implementation, and nVidia
  • Driver Open Sourcing (Score:5, Interesting)

    by Midnight Warrior ( 32619 ) on Tuesday January 23, 2007 @09:22AM (#17723040) Homepage

    Has anyone considered that the reason ATI/NVidia won't open source their drivers/firmware is because there are blatant copyright and patent violations in their code? I'm not saying there are violations, but if there are, then I would expect each to violently defend against anyone seeing their source code. To date, the best argument heard is that access to the code would provide their competitors an unfair advantage into their optimization techniques, which most of us recognize to be hogwash. At worst [zdnet.com], they wrap it up in "we have licensed proprietary algorithms" declarations and refuse to give the community a chance to work around those algorithms.

    There is only one way forward. NVidia should fund the effort to rewrite their firmware/drivers, providing only the hardware register descriptions and nuances. I'm quite sure others have asked NVidia to do this already, but Intel moving forward with this plan should force the others' hands. I'm surprised that Microsoft hasn't chimed in here, because every open specification we get in the OSS world, they also get. That's where all those Microsoft drivers come from. And only on occasion is a vendor-supplied driver better than the Microsoft one. Open sourcing any drivers also helps Microsoft support more hardware out of the box, without a multitude of licensing agreements and royalty schemes.
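
    A toy illustration of why register descriptions are all a driver writer needs; everything here is invented (the register names, offsets, and values are hypothetical, and a bytearray stands in for the real memory-mapped I/O region), but it shows the shape of the job - a driver is mostly documented values written to documented offsets:

        # A toy mode-setting "driver" built from nothing but a register description.
        REGISTERS = {
            "PIPE_ENABLE": 0x0000,   # hypothetical: 1 = start scanning out
            "PIPE_WIDTH":  0x0004,   # hypothetical: active pixels per line
            "PIPE_HEIGHT": 0x0008,   # hypothetical: active lines per frame
            "FB_BASE":     0x000C,   # hypothetical: framebuffer base address
        }

        mmio = bytearray(4096)       # stand-in for the real MMIO region

        def write_reg(name, value):
            off = REGISTERS[name]
            mmio[off:off + 4] = value.to_bytes(4, "little")

        def set_mode(width, height, fb_base):
            write_reg("PIPE_WIDTH", width)
            write_reg("PIPE_HEIGHT", height)
            write_reg("FB_BASE", fb_base)
            write_reg("PIPE_ENABLE", 1)     # flip the enable bit last

        set_mode(1280, 1024, 0x01000000)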

    And of course, NVidia (and now ATI) have been adding more treasure to their war chests with the PCIe motherboards. I just bought a new motherboard and it's extremely hard to find a new board with PCI-Express that doesn't have an nForce or ATI chipset.

    It's going to be a tough game for Intel because it's not just graphics drivers. AMD could play into this game if they made a decisive move with their GPU integration into the CPU. Remember that AMD now owns ATI.

    • Re: (Score:3, Insightful)

      by Cheesey ( 70139 )
      Has anyone considered that the reason ATI/NVidia won't open source their drivers/firmware is because there are blatant copyright and patent violations in their code? I'm not saying there are violations, but if there are, then I would expect each to violently defend against anyone seeing their source code.

      Yes, this has been suggested before. These violations, if they exist, may not be deliberate though.

      Remember that software patents are often very broad. It is hard to write any software at all without violat
    • by NSash ( 711724 )
      There is only one way forward. NVIDIA should fund the effort to rewrite their firmware/drivers, providing only the hardware register descriptions and nuances.

      What is NVIDIA's incentive to do this?
    • I'm surprised that Microsoft hasn't chimed in here, because every open specification we get in the OSS world, they also get. That's where all those Microsoft drivers come from. And only on occasion is a vendor-supplied driver better than the Microsoft one. Open sourcing any drivers also helps Microsoft support more hardware out of the box, without a multitude of licensing agreements and royalty schemes.

      Isn't typical Slashdot-think quite the opposite? I.e., Microsoft prefers closed hardware specificall

  • Industry Benefit (Score:2, Interesting)

    by Darkryft ( 1054812 )
    I believe in competition being good, but I'm not sure it's all about just competition. This could well be the move that saves PC gaming as a whole. Technology-wise, PCs will always have superiority over consoles, but the economic case for a top-end gaming PC versus a console is rarely won. Microsoft and Sony take huge losses to push their hardware, and slowly but surely it pays off - Gears of War on the Xbox 360 has sold 3 million copies in just a hair over 60 days. Name one PC title that is
    • I think the biggest problem for PC gaming is that PCs, especially home PCs, have moved into the realm of notebook computers, and face it, most notebooks currently have Centrinos in them, and those Intel chipsets suck at 3D graphics. The main problem is that most games are produced for high-end hardware and nVidia only, while most PC users nowadays have an integrated graphics chip in the medium range speed-wise, which isn't even tested with many games. Speaking of producing outside of the market, the PC publishe
  • Even Intel doesn't get it perfect in the first chips.
    Another consideration is that they may have to emulate someone else's API. Too much software out there to have a new one.
  • The Future of the Linux Desktop is finally safe. Intel Graphics is so far the closest thing the world has seen to a successful open graphics project. Intel has hired some of the best minds in the Xorg project- such as Keith Packard [wikipedia.org] and Eric Anholt [anholt.net]- and has put out great free drivers for some time now. Sure the GMA 9xx series is not the best for games, but it can run a 3D accelerated desktop with the best of them.

    If nothing else the excitement in and around the Linux community over these OpenGL possibilities
