AMD Technology Hardware

AMD Plans Simultaneous Desktop and Mobile Chip Releases

wh173b0y writes "Tom's Hardware reports that AMD is planning to release both it's dual-core desktop and mobile chips at the same time. This news comes after AMD, who have been fairly quiet since the release of the Athlon FX-55, came up shorter than Intel on the release dates for it's dual-core processors. Intel on the other hand has been busy planning more than a dozen different chips to release as well as pressing its software designers to embrace its 64-bit architecture."


  • by dosius ( 230542 ) <bridget@buric.co> on Thursday March 03, 2005 @08:53PM (#11840479) Journal
    I would like to see a chip that would work as both mobile AND desktop...

    Moll.
      • My laptop currently has an AMD64 2800+... does anyone know if the dual-core 64s will work with older motherboards that support 64-bit single-core CPUs? If so, this would be great news for people like me who already have 64-bit laptops; if not... meh.
      • Re:Older Laptops (Score:2, Informative)

        by ekul.taylor ( 864730 )
        Socket 939 and 940 will work (you will probably need a BIOS update). If I remember correctly, Socket 754 should work as well. Thank you, integrated memory controller. That only holds for the first release of dual-core chips, though; once the cores have shared cache and the like, I think you'll need new core logic for the northbridge.
    • People did (and probably still do) that with mobile XP 2500+s and such, because you could get them up to very high clock speeds even air-cooled, thanks to the low power requirements.
    • Re:Eff pee? (Score:3, Insightful)

      by Eric Smith ( 4379 ) *
      I'd like to see a product that's a floor wax AND a dessert topping. But that's only gonna happen on SNL.

      Very different tradeoffs have to be made in chip design for low power vs. high performance.

    • Re:Eff pee? (Score:3, Informative)

      I currently run an AMD Mobile 2600+ (forgot the wattage). The thing is great: it overclocks easily (it isn't clock-locked) with pretty cheap, normal, quiet cooling solutions. Not only that, it also runs a lot cooler than the normal Athlon XP, which means you don't need as many fans. It is so nice to have a CPU that doesn't go over 100F, and the case temp to go with it. I really hope that in the future AMD continues to make mobile processors that you can use in desktops.
      • IDK, my A64 3400+ sits around 90F or so under normal usage. It's only under sustained full load that it hits 123F, and the mobo temp is still around 100F.

        I only have one fan aside from the Power Supply fans.
    • I would like to see a chip that would work as both mobile AND desktop...

      I thought all of AMD's mobile chips had the same pinout as the regular desktop chips - which means you can use either one for either application?
    • Two words: Pentium M. [tomshardware.com]
    • Re:Eff pee? (Score:3, Funny)

      by Jozer99 ( 693146 )
      Step 1: Buy laptop
      Step 2: Unpack laptop
      Step 3: Plug laptop power adapter into laptop
      Step 4: Plug monitor into laptop
      Step 5: Plug USB keyboard into laptop
      Step 6: Plug USB mouse into laptop
      Step 7: Turn laptop on
      Step 8: ???
      Step 9: Profit

      Seems simple to me.
    • Re:Eff pee? (Score:3, Informative)

      by Laebshade ( 643478 )
      You mean like the AMD Mobile Athlon XPs [newegg.com]? I know you meant dual-core, but still.... these nice gems go in (most) desktops and laptops.
  • by Avyakata ( 825132 ) on Thursday March 03, 2005 @08:55PM (#11840492) Homepage Journal
    Isn't that kind of a bad strategy? I mean, won't they take away the attention from each other? I'd think it'd be better to make a spectacle of one, wait for people to take an interest in it, and then, once the hype dies down, release the other to similar effect. Won't this move minimize public attention?
    • by eyegone ( 644831 ) on Thursday March 03, 2005 @09:08PM (#11840579)

      If they understood marketing, they'd be Intel.
      • by ackthpt ( 218170 ) *
        Intel, which a couple of years ago shrugged off the idea of 64-bit CPUs as something people didn't need, has made up for this gaffe and is not only getting ready to sell its dual-core line, but has already indicated the run of the Pentium 4 is soon to be over.

        Next thing you'll hear from Santa Clara, 'why, we practically invented it!'

        So what kind of Las Vegas act will they enlist to push dual core? Probably twins or something, now that Siegfried and Roy have shut down.

    • Isn't that kind of a bad strategy? I mean, won't they take away the attention from each other?
      Like the way there are only a few companies making most of the brands of laundry detergent, and they each sell basically the same product under a bunch of brand names in order to compete for shelf space at Walmart?

      Seems like a viable strategy for laundry detergent; I don't know why it wouldn't apply to microprocessors as well.

      • by Che Guevarra ( 85906 ) on Thursday March 03, 2005 @10:02PM (#11840904)
        The difference is that each brand of detergent is not stamped Procter & Gamble (at least not overtly). Most people have no idea that every single detergent on the shelf at the grocery store is made by the same company. Procter & Gamble uses this very expensive strategy to insulate each brand from negative consumer perceptions AND to eat up shelf space ANONYMOUSLY. Does AMD plan to place their name on both chips? If so, the detergent analogy does not apply. Sorry to be a punk; I'm taking a marketing class this semester.
    • The speed of chip development (Moore's law) probably doesn't permit a staggered release if both chips are on the same development schedule.
      • For general knowledge:

        Moore's law states that every 18 months, computing power doubles and the price halves.
          • Wrong. Although these statements are often referred to as Moore's law, they are much more accurately termed corollaries: the "law" actually only states that the number of transistors on a chip will double every 18 months. This is after a revision process as well, since I believe the original statement used either 6 or 12 months. According to Wikipedia [wikipedia.org], in 1975 he revised his law further, to a two-year doubling period.
            • Geek battle begin! "In 1965 Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue." http://www.intel.com/research/silicon/mooreslaw.htm
            • Okay, just looking at the start and end points on their chart, it appears that his law has been holding up pretty well, with a 22-month [google.com] overall doubling period. What is much more interesting, though, is that the rate of exponential growth appears to be increasing (or at least did in recent history): looking at the chart from 1997 to 2003 (Pentium II to Itanium 2) gives a doubling period of only 12 and a half months [google.com].
            • Yeah, sure, in Intel's revisionist history.
    • I mean, won't they take away the attention from each other? ...or maybe converge attention in a tighter spotlight previously unknown to mankind.

      Maybe they are going to spend some wang on the launch, and want to smoke two ferrets with one stone in the bush. Or something to that effect.

      Also, being first to market might count for something with dual core.

      Was I the only one who read the snippet as:

      is planning to release both it's dual-core desktop and mobile chips AT THE SAME TIME. [dun dun duuuun]

      Well, it means I can up...
  • by Anonymous Coward on Thursday March 03, 2005 @08:56PM (#11840502)
    Two cores at the same time.
  • It didn't occur to me that we'll finally be able to get semi-affordable dual CPU laptops! :) drool!
  • News flash (Score:5, Interesting)

    by Wesley Felter ( 138342 ) <wesley@felter.org> on Thursday March 03, 2005 @09:00PM (#11840525) Homepage
    Desknotes use the same processors as desktops, so of course they come out at the same time. And now that all the desktop chips have power management, there's very little difference between "desktop" and "mobile" chips.
    • Re:News flash (Score:5, Interesting)

      by GoatPigSheep ( 525460 ) on Thursday March 03, 2005 @09:28PM (#11840711) Homepage Journal
      Well, not exactly: my Centrino notebook has a 75-watt power supply for the WHOLE SYSTEM.

      A high-GHz P4 can use 1.5x that JUST FOR THE MICROPROCESSOR. The power management on the P4 is just to keep your electricity bill down...
    • Re:News flash (Score:3, Informative)

      by Jeff DeMaagd ( 2015 )
      There's a big difference between desktop and mobile chips; it's not just about having more idle modes. The thermal design power of the fastest Pentium M chip is 25 watts. The slower and ultra-low-voltage P-Ms are in the mid-single digits. I am certain that the leakage power at idle of even a slow desktop Athlon 64 is higher than the fastest Pentium M running at 100%.
      • Re:News flash (Score:3, Interesting)

        You'd be surprised. I certainly was.

        The numbers for the Athlon 64 Winchester core are quite impressive... but this is because they haven't cranked up the voltage to produce anything faster than a 3500+ core yet.

        But take this, for example:

        3200+ Winchester:
        30 W full load (2.0 GHz)
        10 W idle
        3 W idle with Cool'n'Quiet enabled (thanks to half core speed and even lower voltage)
        ~10 W moderate load (Cool'n'Quiet clocking the processor at 1 GHz most of the time, 2 GHz when performance demands it)

        I have one, and this...
    • Well, even if it is the same chip, all designs benefit from a matured manufacturing process. Normally after a few steppings, a core will be capable of lower-power operation, since manufacturing defects are reduced.
  • Arr. (Score:5, Interesting)

    by Anonymous Coward on Thursday March 03, 2005 @09:03PM (#11840543)
    "Intel on the other hand has been busy planning more than a dozen different chips to release as well as pressing its software designers to embrace its 64-bit architecture."

    Good luck with that.

    AMD already rules the x86 64-bit market. AMD chips are currently more power-efficient and produce less heat (on average; let's not compare high-efficiency chips to 'normal' chips on either side of the table). Not to mention, who needs dual core when you can have eight eight-core Opterons*? Sixty-four cores! Mmmm, there's the beef.

    It's so nice to see Intel trying desperately to catch up to AMD. ;) Insert quips about the mighty falling, tables turning, et cetera.

    * Yeah, yeah, they won't be here tomorrow. I can dream, damn it.
    • by SuperQ ( 431 ) *
      Unfortunately, there is one problem with that... and that is Dell.

      The first x86-64 machines we had at my work were Intel, simply for the fact that we are an EDU and get good prices on Dell machines.

      Now... if we do pick x86-64 for a cluster solution, Intel will probably not happen.

      If Dell shipped AMD procs, Intel would die overnight.
    • Maybe Intel is thinking 'at least one of these has got to find a market somewhere'.
    • by Wiz ( 6870 )
      Yup, and even better, AMD's 64-bit performance is better than Intel's.

      LinuxHardware [linuxhardware.org]

      Lost Circuits [lostcircuits.com]

      Notice how often AMD gains from running in 64-bit mode, whereas Intel loses performance.

    • by Anonymous Coward
      AMD already rules the x86 64-bit market

      Actually, they don't. Copying from http://news.com.com/IBM+extends+lead+in+server+market+-+page+2/2100-1010_3-5587722-2.html?tag=st.next [com.com]:

      "AMD pioneered the addition of 64-bit extensions to x86 in 2003 with its Opteron. Intel followed suit halfway through 2004. Despite AMD's earlier arrival, more revenue came from servers using Intel's 64-bit Xeon chips, McLaughlin said: $1.3 billion for Xeon servers, compared with $838 million for Opteron servers"

      Notice tha...
      • Intel has 80% of the market share

        Yeah, but a few years ago Intel had over 90% market share. AMD has come a long way. People used to give me strange looks when I told them I preferred AMD over Intel (this was in the Super Socket 7 days, when AMD had PC-100 and Intel was still using PC-66).

        There's nothing wrong with Intel products (FDIV jokes aside); they make solid chips that perform decently. But the price/performance award goes to AMD hands down, and has for some time.

  • by Husgaard ( 858362 ) on Thursday March 03, 2005 @09:03PM (#11840545)
    It drives the market forward, forcing all parties to innovate.

    But take a moment to think about the current software patent madness, and what would have happened if this had been the case with semiconductor patents in 1980. In this scenario we would be lucky if Intel announced that the 486 would hit the market next year.

    If a company has a monopoly there is no incentive to innovate. Patents are monopolies, but they have to be balanced so the incentive to innovate is not taken away.

  • It's... (Score:2, Informative)

    by neutron2000 ( 409922 )
    ...not it's, it's its. Holy crap. And it's even an entity... It took effort to be that wrong.
    • Re:It's... (Score:1, Offtopic)

      by TIMxPx ( 859220 )
      Right on, brother. I don't know how that got modded offtopic. It should have been modded +3, for being the only person with the grammatical certitude to point out that /. can't get a simple thing like that right on the FRONT PAGE. That's embarrassing. Now mod me down offtopic, too!
    • I don't think that they could afford the Euro to invest in a proofreader, alas!
  • Correction (Score:5, Insightful)

    by leathered ( 780018 ) on Thursday March 03, 2005 @09:10PM (#11840594)
    "..as well as pressing its software designers to embrace its 64-bit architecture."

    Should read 'embrace AMD's architecture'.
    • Yes, it = AMD. You didn't expect Intel to mention the competition's name did you?
    • ..as well as pressing its software designers to embrace "its" 64-bit architecture
    • No, Intel is pressing software designers to embrace Intel's 64-bit architecture. Nobody needs any pressing with regard to AMD64, because lots of software is already taking advantage of it, and most of the rest will run fine in 32-bit mode.

      Itanium, on the other hand, requires lots of pressing to convince anybody to write for it. Very small market, after all...
      • No, Intel is pressing software designers to embrace Intel's 64-bit architecture. Nobody needs any pressing with regard to AMD64, because lots of software is already taking advantage of it

        Is it? I haven't noticed any kind of rush to support AMD64 at all. Could you elaborate on "lots"?
    • >> Should read 'embrace AMD's architecture'

      Yes, embrace AMD's architecture, which is in itself a trivial extension of Intel's original x86, which AMD has copied for two decades.
  • by Anonymous Coward on Thursday March 03, 2005 @09:13PM (#11840608)
    How does the chip know which mode to run in? Probably a jumper.
  • by account_deleted ( 4530225 ) on Thursday March 03, 2005 @09:13PM (#11840610)
    Comment removed based on user account deletion
    • Uhh... duh, they do it all the time. Increasing clock speed is the standard method. But this time there are additional things that could be done, e.g. tuning compilers to take better advantage of having more than one processor available.
  • Two questions: (Score:5, Insightful)

    by MacGabhain ( 198888 ) on Thursday March 03, 2005 @09:18PM (#11840645)
    When is the last time Intel met a release schedule?

    When is the last time Intel failed to abandon at least a fourth of their in-development product line?

    Intel announcing a dozen different dual-core processors for a range of machines is a joke, and frankly isn't even very good hype. Even if I believed it, I wouldn't be impressed. You don't NEED 12 different lines. Make 5 and make them right: 1) super-low-power notebook; 2) performance notebook; 3) mainstream desktop; 4) enthusiast/gamer desktop; 5) hardcore teraflops. (Oh wait... this is Intel. Better skip that last one. They're not exactly known for putting their effort into general-purpose FPUs.)

  • by Anonymous Coward on Thursday March 03, 2005 @09:22PM (#11840673)
    And AMD have a similar number.

    - Faster Semprons
    - Faster Athlon 64s
    - Faster Athlon FXs
    - Faster Athlon 64Ms
    - Faster Opterons
    - New Dual Core Opterons
    - New Dual Core Athlon 64s
    - New Dual Core Athlon 64Ms
    - Upcoming 65nm Opterons (both single and dual core)
    - Upcoming 65nm Athlon 64s (single, dual, FX)

    And there are probably plans for Quad-core Opterons, etc, at 65nm, and so on.
  • Catch-22 (Score:3, Interesting)

    by Luthair ( 847766 ) on Thursday March 03, 2005 @09:30PM (#11840725)

    Most have no use for dual cores and devs have no reason to implement support until their customers have them.

    • Re:Catch-22 (Score:3, Insightful)

      by Grey Ninja ( 739021 )
      I think in this case, it's much like 64-bit: if you build it, they will come. High-end performance freaks like to have stuff like that. You can run Winamp and a game at the same time, or who knows what. It's not REALLY useful, but it's useful enough that people want it. As CPU manufacturers start to build them, a market will emerge. As that happens, devs will begin to support the new market.
      • Unless you're talking dual 486s, Winamp uses bugger all CPU time. I've been doing that since Win95 and Pentium Classic/MMX.

        Try something a little more hardcore -- playing games while HTPC software runs in the background, possibly capping a large HDTV stream, and/or feeding it to the rest of the house.
        • No rudeness intended, but I couldn't make heads or tails of what you meant by "uses bugger all CPU time"

          That means it uses it all? Or does that mean it doesn't use it all? Or does it depend?

          I'm American, some of the international idiom is lost on me without explanation. Thanks in advance for the clarification!
    • Re:Catch-22 (Score:5, Informative)

      by swillden ( 191260 ) * <shawn-ds@willden.org> on Thursday March 03, 2005 @10:06PM (#11840928) Journal

      Most have no use for dual cores and devs have no reason to implement support until their customers have them.

      I don't agree that most people have no use for dual cores. Sure, most applications don't make use of them, but all modern operating systems are multi-tasking, and the ability to have one CPU take care of all the common busywork while the other crunches on your main task does make a difference.

      If you don't believe me, find a dual-processor machine sometime and spend some time working on it. It's surprising how much smoother and more responsive it is -- often, a dual-processor machine *feels* faster than a single-processor machine with far more than twice the actual performance. I have a dual 500 MHz PII box that still surprises me every time I touch it. It feels faster than my 1.4 GHz Athlon and seems about as quick to respond as my Athlon 64 3400+.

      For common tasks, users will find they actually prefer two cores at 1 GHz over one core at 4 GHz. The dual-core machine will be cooler (and therefore quieter) and will often be more responsive, even though it will be much slower at straight-line CPU-bound tasks.

      People will like these.

  • by doormat ( 63648 ) on Thursday March 03, 2005 @10:05PM (#11840917) Homepage Journal
    Since it's just AMD's desktop-replacement line of chips, it's the same thing as if Intel put Pentium Ds in DTR laptops. Besides, Tom's Hardware is the Fox News of tech news: heavily Intel/NVIDIA biased.
    • Besides, Tom's Hardware is the Fox News of tech news: heavily Intel/NVIDIA biased.

      First, being the "Fox News" of tech news isn't necessarily bad - after all Fox has the highest ratings of any cable news channel. ;-)

      Further, having an NVIDIA bias isn't a bad thing either - NVIDIA has the best graphics tech right now, it makes *great* AMD64 chipsets, and it aggressively supports Linux with the best graphics and system drivers available. What's not to like?

      On the other hand, I'm not a big fan of Intel at...

      • The problem with biases is, they stick around even when they're wrong. And Fox News having the highest ratings doesn't make their news any more accurate; it just means they're misinforming more people.
        • The problem with biases is, they stick around even when they're wrong.

          What, exactly, is the difference between "opinion" and "bias"? Further, do you understand what an "editorial policy" is?

          And Fox News having the highest ratings doesn't make their news any more accurate; it just means they're misinforming more people.

          I didn't state that its news was "more accurate". However, I would say that it's clear that more viewers prefer Fox's presentation of the news to that of its competitors. The other ch...

  • by cr0y ( 670718 ) on Thursday March 03, 2005 @10:28PM (#11841048) Homepage
    A high-end mobile chip would be awesome; I would happily toss in the extra money for a desktop replacement if it ran games just as well as my current desktop (which shouldn't be hard: A64 3000+, 1024 MB RAM, R9800 Pro).

    I have been waiting for an Athlon 64 notebook with a Mobile Radeon X800 for months... anyone know when this thing is due for release?
  • So BRING IT ALREADY! (Score:2, Interesting)

    by eWarz ( 610883 )
    I've been holding back on upgrading for a LONG time. I almost upgraded when PCI Express and DDR2 came about, but with the news that dual-core CPUs were just around the corner, I decided to wait for that. I'd prefer AMD over Intel.

    FYI, I'm running an AMD Athlon 2400+, MSI K7T-Turbo2 (KT133A chipset), 3 GB RAM, GeForce 6800 Ultra, SB Audigy, Maxtor 80 GB Special Edition. This PC isn't exactly a slouch in its own right (it tends to outrun every machine I've touched, and since I do freelance computer repa...
  • Intel Roadmap? (Score:3, Insightful)

    by Anonymous Coward on Friday March 04, 2005 @12:52AM (#11841874)
    While it seems nice that Intel has tried to pass off AMD's 64-bit solution as its own and tried to pass off the idea of dual cores as its own, it seems clear that one of these companies is executing its roadmap and one is trying not to get run over by the competition.

    I still see clear technical advantages, born of foresight, in AMD's architecture (NUMA, HyperTransport) that support their dual-core designs. I see no such roadmap or foresight from Intel. How do they plan on getting data to these dual-core Xeons fast enough that their buses are not the bottleneck?
  • To quote the Intel article:

    Parallelism will allow chip developers to speed up processors ten-fold between 2005 and 2008, the executive said. "By the end of the decade, mainstream desktops will handle eight threads, mainstream servers 32 threads"...

    Although great news for games players, developers and media users, how are 8 processors going to be of any benefit to the average corporate desktop that uses MS Office, IE and a handful of other non-processor-intensive apps?

    It seems like dual and multicore...
    • Although great news for games players, developers and media users, how are 8 processors going to be of any benefit to the average corporate desktop that uses MS Office, IE and a handful of other non-processor-intensive apps?

      Because most modern operating systems are written to allow multiple threads to execute simultaneously. If you run two or more apps at once, you will see a benefit. When I'm at work I tend to be running 9 or 10 apps at a time, and I'm by no means exceptional in that regard.

  • I think that multi-core chips have an even larger potential in the mobile market than in the desktop market.

    With current processors - and even some not-so-current processors - there's not really much that an average person does on a laptop that actually uses all (or even most) of the CPU cycles. DVD playback, email, web surfing, and word processing tend to be the big apps - and none of them requires much of a CPU at all.

    However, once a person starts trying to do several things at once, then issu...
