Killer Mobile Graphics — NVIDIA's GeForce 8800M

MojoKid writes "Today NVIDIA unveiled the much-anticipated GeForce 8800M series of mobile graphics processors. The GeForce 8800M is powered by the new G92M GPU which is built on a 65nm manufacturing process and shares a lineage with the desktop-bound G92 GPU on which NVIDIA built their GeForce 8800 GT. The 8800M series will come in two flavors, a GTX and a GTS, with different configurations of stream processors, 64 for the GTS model and 96 for the high-end GTX."
  • by Zymergy ( 803632 ) * on Tuesday November 20, 2007 @01:58AM (#21417113)
    It appears Alienware will be using the GeForce 8800M GTX in their "m15x" and "m17x" models:
    http://www.alienware.com/intro_pages/m17x_m15x.aspx [alienware.com]
    NVIDIA GeForce 8800M Link: http://www.nvidia.com/object/geforce_8M.html [nvidia.com]
  • by gerf ( 532474 ) on Tuesday November 20, 2007 @02:01AM (#21417131) Journal

    Well, it's supposedly more miserly than the 512MB 8600M GT, which I have. With a 1.6 GHz Core 2 Duo, 2 GB RAM, XP Pro, and an 85 WHr battery, I can get over 5 hours of battery life, which I think is dang good for a machine that can also play games quite well. That figure isn't while gaming, of course, only web browsing and IM.

    Of course, this will vary from laptop to laptop, YMMV.
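    As a rough sanity check, those numbers hang together: an 85 WHr pack lasting a bit over 5 hours implies roughly 17 W of average system draw. A back-of-the-envelope sketch (the runtime is just the figure quoted above, not a measurement):

        # Back-of-the-envelope battery-life arithmetic for the figures quoted above.
        battery_wh = 85.0    # 85 WHr pack
        runtime_h = 5.0      # claimed light-use runtime (web browsing, IM)
        avg_draw_w = battery_wh / runtime_h
        print(f"Average system draw: ~{avg_draw_w:.0f} W")   # ~17 W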

  • by Zymergy ( 803632 ) * on Tuesday November 20, 2007 @02:15AM (#21417191)
    PowerMizer Mobile Technology page: http://www.nvidia.com/object/feature_powermizer.html [nvidia.com]

    Maybe the NVIDIA Technical Brief will yield some answers: http://www.nvidia.com/object/IO_26269.html [nvidia.com] (Warning, spawns a PDF)

    PowerMizer 7.0 Power Management Techniques:
    Use of leading edge chip process
    CPU load balancing
    Intelligent GPU utilization management
    Revolutionary performance-per-watt design
    PCI Express power management
    Aggressive clock scaling
    Dedicated power management circuits
    Display brightness management
    Adaptive performance algorithms

    CPU Offload Example (from NVIDIA's Technical Brief)
    Figures 3 and 4 (see PDF) show CPU utilization when playing a Blu-ray H.264 HD movie on the CPU and on the GPU, respectively. Under GPU video playback, 30% fewer CPU cycles are used. This dramatic reduction in CPU usage means less power is consumed by the processor, so overall system power consumption drops, resulting in longer battery life.
    Note: Testing was conducted on an Intel Centrino-based platform with a 2 GHz Core 2 Duo processor and a GeForce 8600M GS, running InterVideo WinDVD 8 playing a Casino Royale H.264 Blu-ray disc.
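    To put that 30% figure in perspective, here is a purely illustrative estimate of what it could mean for power. Only the utilization drop comes from NVIDIA's brief; the CPU power figure and the linear-scaling assumption are mine:

        # Hypothetical savings estimate for GPU-accelerated H.264 decode.
        # Only the 30% utilization drop comes from NVIDIA's brief; the rest is assumed,
        # including the assumption that CPU power scales roughly with utilization.
        cpu_power_full_w = 20.0   # assumed Core 2 Duo draw during software decode
        utilization_drop = 0.30   # 30% fewer CPU cycles with GPU decode
        power_saved_w = cpu_power_full_w * utilization_drop
        movie_hours = 2.4         # Casino Royale runs roughly 2 hours 24 minutes
        energy_saved_wh = power_saved_w * movie_hours
        print(f"~{power_saved_w:.0f} W saved, ~{energy_saved_wh:.0f} Wh over the movie")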
  • iMac (Score:3, Informative)

    by tsa ( 15680 ) on Tuesday November 20, 2007 @02:16AM (#21417195) Homepage
    Let's hope Apple puts this card in the next iMac instead of the crappy ATI they put in it now.
  • Re:Unlikely. (Score:2, Informative)

    by grahamd0 ( 1129971 ) on Tuesday November 20, 2007 @03:15AM (#21417411)

    My work machine is a brand new MacBook Pro. It's got an NVIDIA card in it, like all MacBook Pros do. [apple.com] So does the Mac Pro. [apple.com]

  • by vipz ( 1179205 ) on Tuesday November 20, 2007 @03:25AM (#21417449)
    I believe the 8800 GT on the desktop side of things uses the same G92 chip. Sparkle has already announced a passively cooled version of that: Press Release [sparkle.com.tw]. Pictures of a passively cooled Gainward card have also been floating around the net.
  • Re:iMac (Score:2, Informative)

    by TheMidnight ( 1055796 ) on Tuesday November 20, 2007 @03:28AM (#21417461)
    Well, since we're talking about laptops and mobile graphics, I feel the need to point out that my new MacBook Pro has an nVidia 8600 GT in it. Apple has provided nVidia chips in the MacBook Pro line for a few months now. You can get 128 MB or 256 MB, depending on whether you buy the 15" or 17" model.
  • by m94mni ( 541438 ) on Tuesday November 20, 2007 @03:43AM (#21417515)
    Actually, one important part of newer PowerMizer designs (>3.0 maybe) is that parts of the GPU are *turned off* when not in use. Other parts run on decreased voltage.

    That effectively decreases the number of active processors and of course saves a *lot* of watts.
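    The voltage part matters more than it might sound: CMOS switching power scales roughly with C·V²·f, so gating blocks off while also lowering voltage and clocks compounds quickly. A quick illustrative calculation (the capacitance, voltage, and clock numbers are invented, not actual G92M figures):

        # Dynamic (switching) power: P = C_eff * V^2 * f. Numbers are illustrative only.
        def dynamic_power_w(c_eff_farads, voltage_v, freq_hz):
            return c_eff_farads * voltage_v**2 * freq_hz

        full_load = dynamic_power_w(1.0e-9, 1.10, 500e6)  # all units clocked, full voltage
        idle = dynamic_power_w(0.5e-9, 0.90, 200e6)       # half the logic gated off, lower V and f
        print(f"Idle switching power is ~{idle / full_load:.0%} of full load")   # ~13%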
  • Re:iMac (Score:3, Informative)

    by Vskye ( 9079 ) on Tuesday November 20, 2007 @03:48AM (#21417535)

    Well, since we're talking about laptops and mobile graphics, I feel the need to point out that my new MacBook Pro has an nVidia 8600 GT in it. Apple has provided nVidia chips in the MacBook Pro line for a few months now. You can get 128 MB or 256 MB, depending on whether you buy the 15" or 17" model.

    Yep, that might be true... but to get 256MB of graphics memory you have to spend $2,499.00 US (MacBook Pro 15"). That's just crazy. I'm sorry, but I'll just get an iMac, buy a cheap PC-based laptop, and toss Linux on it. Personally, I'd love a MacBook, but the bang for the buck just isn't there (spec-wise, and the Pro version is just insane price-wise).
  • Re:Unlikely. (Score:5, Informative)

    by Anonymous Coward on Tuesday November 20, 2007 @05:19AM (#21417921)
    Actually, you got it the wrong way around.

    Apple was going 100% ATI, but then ATI leaked to the press that they'd got the contract, and Jobs was furious. He really HATES secrets getting let out (I've no idea why; it seems to be industry standard practice, but if you ever happen to enter an NDA with Apple, you had better honour it!)

    Anyway, Apple pulled the contract and shifted every Mac they could to NVIDIA. However, for some reason they didn't shift the iMac despite shifting everything else. I have a vague suspicion it is to force Apple developers to always code in a GPU-independent way (basically to keep NVIDIA honest), but as an iMac owner, I find it very annoying.
  • Re:Unlikely. (Score:3, Informative)

    by UserChrisCanter4 ( 464072 ) * on Tuesday November 20, 2007 @09:42AM (#21419389)
    Intel graphics are already in the MacBook (non-Pro) and the Mac mini; the low end of Apple's lineup hasn't had a dedicated GPU/VRAM setup since the PowerPC days. In fact, there are fewer product lines with ATI chips than with anything else, since ATI is now only present in the iMac and as a BTO option on the Mac Pro.

    You have to go pretty far back in Apple's product line to find a point where there wasn't a fairly even mixture of video card options available.
  • by recoiledsnake ( 879048 ) on Tuesday November 20, 2007 @11:01AM (#21420359)
    From http://www.anandtech.com/video/showdoc.aspx?i=3151&p=2 [anandtech.com] :

    As for PowerPlay, which is usually found in mobile GPUs, AMD has opted to include broader power management support in their desktop GPUs as well. While they aren't able to wholly turn off parts of the chip, clock gating is used, as well as dynamic adjustment of core and memory clock speed and voltages. The command buffer is monitored to determine when power saving features need to be applied. This means that when applications need the power of the GPU it will run at full speed, but when less is going on (or even when something is CPU limited) we should see power, noise, and heat characteristics improve. One of the cool side effects of PowerPlay is that clock speeds are no longer determined by application state. On previous hardware, 3D clock speeds were only enabled when a fullscreen 3D application started. This means that GPU computing software (like folding@home) was only run at 2D clock speeds. Since these programs will no doubt fill the command queue, they will get full performance from the GPU now.
    Hopefully, NVIDIA will follow AMD's lead.
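    In rough pseudocode, the command-buffer-driven scheme the article describes amounts to something like the loop below; the thresholds and clock levels are invented for illustration and are not AMD's or NVIDIA's actual driver logic:

        # Sketch of command-buffer-driven clock scaling (illustrative, not real driver code).
        CLOCK_LEVELS = [(200, 400), (400, 700), (600, 900)]   # (core MHz, memory MHz)

        def pick_clock_level(queued_commands):
            # The driver watches the command buffer rather than window state, so
            # GPGPU work (e.g. folding@home) that fills the queue gets full clocks
            # without a fullscreen 3D app, while a near-empty queue lets clocks drop.
            if queued_commands > 100:
                return CLOCK_LEVELS[2]
            if queued_commands > 10:
                return CLOCK_LEVELS[1]
            return CLOCK_LEVELS[0]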
  • by lonesome_coder ( 1166023 ) on Tuesday November 20, 2007 @11:03AM (#21420375)
    Actually, these cards are replaceable. The module is very easily removed and uses a standard MXM PCI Express interface, so it can be swapped out.

    Don't bring flames if you don't know what you are talking about.
