Intel Discrete Graphics Chips Confirmed

Arun Demeure writes "There have been rumors of Intel's re-entry into discrete graphics for months. Now Beyond3D reports that Intel has copped to the project on their own site. They describe it as a 'many-core' architecture aimed at 'high-end client platforms,' but also extending to other market segments in the future, with 'plans for accelerated CPU integration.' This might also encourage others to follow Intel's strategy of open-sourcing their Linux drivers. So, better watch out NVIDIA and AMD/ATI — there's new competition on the horizon."
  • More competition (Score:5, Insightful)

    by GreenEnvy22 ( 1046790 ) on Tuesday January 23, 2007 @09:10AM (#17722334)
    Competition is almost always good, so I look forward to this. I'd like to see Intel push ATI and Nvidia to create more power-efficient chips, as it's quite ridiculous right now.
  • Re:Discrete? (Score:2, Insightful)

    by Shard013 ( 530636 ) <<moc.liamtoh> <ta> <310drahs>> on Tuesday January 23, 2007 @09:16AM (#17722400)
    I am guessing in this context "discrete" means separate from the motherboard.
  • by Moraelin ( 679338 ) on Tuesday January 23, 2007 @09:21AM (#17722446) Journal
    If you look at the vast majority of chips either ATI or nVidia sell, they're actually pretty efficient.

    But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast". They'll go to some benchmark site to see some "nVidia's 8800 GTX is faster than ATI's X1900XTX!" article (not entirely unexpected, it's one generation ahead), end up with some vague "nVidia is faster than ATI" idea, then go buy a 5200. Which is the lowest end of two generations behind the ATI, or 3 behind that 8800 GTX.

    Both ATI and nVidia have gone through stretches of barely even trying to produce or sell their headline-grabbing card. And at least ATI always introduces its latest technology in its mid-range cards first, and those tend to be reasonably energy-efficient cards too. But it's like a game of chicken: the one who pulls out loses. The moment one of them gave up on having an ultra-high-end card at all, the benchmark sites and willy-waver forums would proclaim "company X loses the high-performance graphics battle!"

    I don't think Intel will manage to restore sanity in that arena, sadly. Most likely Intel will end up playing the same game, with one overclocked noisy card to grab the headlines for their saner cards.
  • by Lonewolf666 ( 259450 ) on Tuesday January 23, 2007 @09:28AM (#17722510)
    Intel drivers for Linux Just Work(TM)
    That might have to do with their drivers being Open Source, which the Linux community has recommended for a long time. According to all the statements from kernel developers I've read, Open Source drivers are much easier to maintain.
  • by CastrTroy ( 595695 ) on Tuesday January 23, 2007 @09:41AM (#17722606)
    But most people don't buy the top end. There are still a lot of computers being sold with Intel graphics chipsets right on the motherboard, because most people could care less about which graphics card they have. They'd rather be playing games on their big TV with their console. As long as they can play Tetris variation #349 and FreeCell, they don't really care which graphics card they have.
  • by jonwil ( 467024 ) on Tuesday January 23, 2007 @09:54AM (#17722742)
    If I wanted to run binary kernel modules I would just buy a 6xxx series NVIDIA and be done with it.
    I specifically said "Open Source" :)
  • by suv4x4 ( 956391 ) on Tuesday January 23, 2007 @09:59AM (#17722806)
    I don't know why they think they can succeed this time.

    Remember when AMD made Intel clones down to the very chip architecture, and it didn't matter which manufacturer you bought from?

    Remember how the AMD K5 sucked and people started leaning towards Intel? And then Pentium 4 happened, and AMD's new architecture was much superior? And then Core turned things on their head again?

    Things change. I don't think we're using 3dfx cards anymore, either. They used to be ahead of everyone.

  • by Kjella ( 173770 ) on Tuesday January 23, 2007 @10:01AM (#17722814) Homepage
    That sorta assumes you can have one without having the other - can you really have a damn good midrange card that wouldn't perform as a high-end card if you jacked up the GPU frequency and RAM speed and added a huge noisy fan? Trying to measure the midrange gets too complicated though, too many variables like noise and power consumption. Let's just have an all-out pissing contest and assume that it scales down.

    Technologically, it does (see the rough scaling sketch below). But then there's the part about market economics: you charge what the market will pay and pocket the margin. That's why they're mostly close anyway. Take Intel's Core 2 introduction. Before: AMD vs Intel was close. Intel introduces a damn good new processor, AMD slashes prices 50%, and again they're close. Who had the best technology before and after? Good question, but most of the difference doesn't show up in the market.
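
A rough back-of-the-envelope sketch of that "technologically, it scales" point: peak throughput grows roughly with shader units x clock, and memory bandwidth with bus width x memory clock, so a midrange design with the dials turned up looks like a high-end part on paper. All the numbers below are made-up illustrative assumptions, not the specs of any real card.

# Crude scaling arithmetic for a hypothetical midrange vs. high-end GPU.
# All figures are illustrative assumptions, not real specifications.

def peak_gflops(shader_units, core_clock_mhz, flops_per_unit_per_clock=2):
    """Peak throughput estimate: units * clock * FLOPs per unit per clock."""
    return shader_units * core_clock_mhz * 1e6 * flops_per_unit_per_clock / 1e9

def bandwidth_gbs(bus_width_bits, mem_clock_mhz, transfers_per_clock=2):
    """Memory bandwidth estimate: bus width (in bytes) * effective memory clock."""
    return (bus_width_bits / 8) * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

cards = {
    "midrange": {"units": 8,  "core_mhz": 500, "bus_bits": 128, "mem_mhz": 600},
    "high-end": {"units": 24, "core_mhz": 650, "bus_bits": 384, "mem_mhz": 900},
}

for name, c in cards.items():
    print(f"{name:9s}: ~{peak_gflops(c['units'], c['core_mhz']):5.1f} GFLOPS, "
          f"~{bandwidth_gbs(c['bus_bits'], c['mem_mhz']):5.1f} GB/s")

Of course, real cards rarely scale that cleanly; latency, drivers, noise and power consumption (the very variables the comment above mentions) all get in the way.
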
  • by BillGatesLoveChild ( 1046184 ) on Tuesday January 23, 2007 @10:18AM (#17722988) Journal
    Intel's previous foray into the discrete graphics market was the Intel i740. I got one, agreeing with the PC salesman: "Hey, you can't go wrong with Intel, can you?" It was quite a decent chip for its time, and the driver was very stable. I don't recall the graphics ever hanging once! It was disappointing when Intel bailed out of the 3D market, but to their credit they continued to update the drivers whenever a new version of DirectX rolled out.

    Intel has already made a return of sorts to 3D with the Media Accelerator 9XX series chips you'll find in many Intel laptops. It's funny, because you'd expect an embedded chipset to be lame; lowest common denominator, shared RAM and all. But this lappie has it and the graphics scream. It's faster than my nVidia 5700, which is two years old. The driver is stable too; never crashed. If they can do this with embedded chipset 3D, imagine what they can do when they really put their minds to it.

    nVidia and ATI have the market to themselves these days. nVidia has gotten pretty lax about driver stability lately, and it's damned near impossible to get support out of them. They've fobbed off support to OEMs, who slap electronics onto cards and are in no position to help with driver problems. That's the sort of thing that happens when a company dominates a market.

    If Intel can come out with some high performance electronics and stable drivers, well, Welcome back, Intel! I for one welcome you as my new Overlord!
  • by danpsmith ( 922127 ) on Tuesday January 23, 2007 @10:27AM (#17723100)
    But they invariably _have_ to have some benchmark-breaking super-card to grab the headlines with. The way it works is that while only a minority of people will actually buy the top-end graphics card, there are millions of people who just need a reminder that "nVidia is fast" or "ATIs are fast". They'll go to some benchmark site to see some "nVidia's 8800 GTX is faster than ATI's X1900XTX!" article (not entirely unexpected, it's one generation ahead), end up with some vague "nVidia is faster than ATI" idea, then go buy a 5200. Which is the lowest end of two generations behind the ATI, or 3 behind that 8800 GTX.

    Maybe I'm in the minority here, but I've always gone to sites that have actual reviews of the card I'm potentially buying. Companies have different models, and each model has its own advantages and disadvantages. I think a lot of the people who do comparison shopping online (i.e. most of the market that's actually going to be buying/installing their own graphics card) know this and do the same. ATI and Nvidia cards are only going to sell to a certain section of the market other than OEMs, and I doubt very much that this is the approach people upgrading their own video cards would use in deciding which card to purchase. I know I usually check out anandtech.com and look for benchmarks in the price range that I'm in.

    This is like saying "Alpine stereos are better" and buying the lowest-level Alpine model without comparing it to anything else in the price range. Nobody who is going to be installing it themselves can be that stupid, unless they were fanboys looking for a reason to hype up their favorite company anyway. Either way, it doesn't look like a real market strategy to me.

  • by Slashcrap ( 869349 ) on Tuesday January 23, 2007 @11:07AM (#17723566)
    But will they include DVI? Better yet, dual DVI for those who run either dual monitors or really large monitors which require dual link?

    No, in fact they aren't even going to include DSUB outputs. They are going to use modulated RF outputs like you got on the ATARI ST and AMIGA. They will be capable of displaying NTSC resolutions at anything up to 60Hz refresh rate.

    What the fuck do you think?
  • by Cheesey ( 70139 ) on Tuesday January 23, 2007 @11:15AM (#17723674)
    Has anyone considered that the reason ATI/NVidia won't open source their drivers/firmware is because there are blatant copyright and patent violations in their code? I'm not saying there are violations, but if there are, then I would expect each to violently defend against anyone seeing their source code.

    Yes, this has been suggested before. These violations, if they exist, may not be deliberate though.

    Remember that software patents are often very broad. It is hard to write any software at all without violating some patent or other. If you write software, and you have a lot of money, the patent trolls will come knocking. Giving away source code makes the troll's job much easier. Perhaps that is part of what NVIDIA and ATI want to avoid.

    Another problem is that they've used other people's code under NDA in their drivers. There is a similar problem with Windows - Microsoft could not release the source as free software without removing a lot of third-party components.
  • by MartinG ( 52587 ) on Tuesday January 23, 2007 @11:40AM (#17723990) Homepage Journal
    Why can't they release open source drivers that cover as much functionality as possible, and optionally provide a closed-source component that includes the parts they can't release? (One possible shape of that split is sketched below.)
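
One way to read that suggestion is an "open base plus optional proprietary plug-in" split: the open driver implements everything that can legally be released, and a closed module, if installed, overrides only the encumbered paths. A minimal sketch of that pattern follows; the module name vendor_blob and the decode function are hypothetical, purely for illustration, not any real driver's API.

# Hypothetical sketch of an "open base + optional closed plug-in" driver split.
# The open path always works; a proprietary module, if installed, overrides
# only the pieces that could not be released as open source.

import importlib

def open_decode_video(frame_bytes):
    # Fully open fallback path (illustrative stub, not a real codec).
    return {"decoded_with": "open driver", "bytes_in": len(frame_bytes)}

def load_decoder():
    """Prefer the closed-source plug-in if present, otherwise use the open path."""
    try:
        blob = importlib.import_module("vendor_blob")  # made-up module name
        return blob.decode_video
    except ImportError:
        return open_decode_video

decode_video = load_decoder()
print(decode_video(b"\x00" * 1024))

Users without the blob still get a working, fully open driver; the vendor only has to maintain the small closed piece.
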

  • by thue ( 121682 ) on Tuesday January 23, 2007 @12:21PM (#17724372) Homepage
    Yes, could care less is correct, because it's short for the phrase:
    I suppose I could care less, but I'm not sure how.


    I agree with you, and concede the point.*

    *Here "I agree with you, and concede the point" is actually short for the phrase "I could agree with you, and concede the point, but I consider using words which mean the opposite of what you are trying to say in normal conversation to be extremely silly.".
