Apple's New M1 Macs Won't Work With External GPUs (engadget.com) 103

Today, Apple showed off the first Macs powered by its new M1 system-on-chip, delivering impressive performance and excellent battery life, but they don't come without compromises. According to Engadget, citing Paul Gerhardt's tweet, "tech spec pages for the new machines reveal that none of them are compatible with external GPUs that connect via Thunderbolt." From the report: Only some people would require add-on oomph in any case, but Apple's support for external graphics cards gave it some extra gaming cachet and informed creative professionals their needs would continue to be met. Now, they'll have to wait and see if things change for higher-end models as Apple Silicon spreads throughout the company's PC lineup.

There's also been some focus on the fact that the 13-inch MacBook Pro M1 models only include two USB-C ports onboard instead of four, but whether or not you think that's enough ports, it's consistent with the cheaper Intel models it replaces. A more striking limitation is the one we've already noted, that the MBP is limited to 16GB of RAM -- if you think you'll need 32GB then you'll have to opt for an Intel-powered model.

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Don't worry (Score:5, Funny)

    by Arthur, KBE ( 6444066 ) on Tuesday November 10, 2020 @08:01PM (#60709856)
    By the next generation there won't be any more external ports to speak of.
  • Developers Only (Score:1, Interesting)

    by Cmdln Daco ( 1183119 )

    The only people who will buy one of these first-gen boxes are developers who want to get on the platform early. Apple hasn't thought the whole thing through enough yet for anybody to want to commit to one as their daily driver. Too many first-gen limitations. Apple will orphan this batch early.

    • Not sure I agree it will be developers only. A huge number of users really only ever use their computers for web browsing, email, document writing, and organizing/viewing their music/photo/video library. Apple tracks usage, so they know this. If these types of users game (big if), they play mobile/tablet-type things, not PC games.

      The only difference these types of users will see is the massive increase in battery life. So they'll be happy.

      Agree with you that Apple will likely orphan this 1st-gen batch.
    • We're looking at replacing an aging iMac (about ten years old) that is just too slow for regular productivity tools anymore. I'm considering one of the new Minis to get on the Arm track, as an Intel won't have nearly as much chance of lasting ten years at this point.

      • Bear in mind, the current-gen M1 is limited to 16GB RAM, with no upgrade path. Granted, it's early to say what impact that will have (it may be plenty under the unified memory architecture), but if you do have any RAM-intensive apps it may be worth waiting a generation.

    • Can you list out what are the short comings ("first-gen limitations") that exist on a M1 MacBook Air that doesn't exist on the Intel MacBook Air?
      • Ability to run Linux and Windows natively. This was one of the huge advantages of owning a Mac. For professionals it was nice to be able to switch between all 3 major operating systems when the need came up. You won't even be able to run X86 versions of Linux or Windows in Parallels out of the box.

        Yes you can, but with MS cloud and Linux cloud, so it's not going to be the end of the world, and then you can have 20 hours at the beach without recharging :) and get some color. I mean, why does everyone want their DB, app server, web server, etc. on the same machine when cloud is so cheap?
          • That's neither natively nor in a local VM; that's just a VM on a remote computer. It doesn't run on the Mac with the M1 processor. Maybe you need to access your data when you are on an airplane. The cloud is useful for many things, but having a local instance of the server running on your machine for development is much better.

        Are you being deliberately obtuse? Apple Silicon will not run x86 code. Your whining about not being able to run Windows as a "first-gen limitation" is just pure horse shit nonsense.
      • by MrNaz ( 730548 )

        Software availability. Adobe's Arm version will be available some time next year. Office and Adobe will get done quickly, but what about all the other software tools people use to get work done?

        • Again, what tools do you need that wouldn't work? Other than other idiots blathering about the lack of ability to run x86 virtual machines, which is something that has been explicitly stated. Can you list the software tools that wouldn't work?
          • Healthcare workers have clinical management software. Logistics companies have transport management tools. Factory managers have ERPs.

            Pretty much every industry has their own critical software. Just because you can get by with a Chromebook doesn't mean everyone else can.

              And you have confirmation that this software does not run? One of the ways Apple has worked around a different CPU is Rosetta (or Rosetta 2 in this case). Why would these ERPs and other software not work under Rosetta 2? Apple even demonstrated rather demanding games that worked fine under Rosetta 2. If you don't have confirmation that your special software doesn't work under Rosetta 2, you know what you should do? Go confirm it, then bitch about it.
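One concrete way to go confirm it: check whether the app's main binary even contains arm64 code. A minimal Python sketch that inspects the first bytes of a Mach-O binary (magic numbers and CPU-type constants are from Apple's <mach-o/loader.h>; the headers fed to it would come from reading the first bytes of the app's executable, and nothing here is Apple's own tooling, just a rough illustration):

```python
import struct

# Mach-O magic numbers and CPU types (values from <mach-o/loader.h> / <mach/machine.h>)
MH_MAGIC_64 = 0xFEEDFACF      # thin 64-bit Mach-O (stored little-endian on disk)
FAT_MAGIC = 0xCAFEBABE        # universal ("fat") binary (big-endian header)
CPU_TYPE_X86_64 = 0x01000007
CPU_TYPE_ARM64 = 0x0100000C
CPU_NAMES = {CPU_TYPE_X86_64: "x86_64", CPU_TYPE_ARM64: "arm64"}

def macho_archs(header: bytes) -> list:
    """Return the CPU architectures found in the first bytes of a Mach-O file."""
    if struct.unpack(">I", header[:4])[0] == FAT_MAGIC:
        # fat header: magic, nfat_arch, then one 20-byte fat_arch record per slice
        nfat = struct.unpack(">I", header[4:8])[0]
        archs = []
        for i in range(nfat):
            cputype = struct.unpack(">I", header[8 + i * 20: 12 + i * 20])[0]
            archs.append(CPU_NAMES.get(cputype, hex(cputype)))
        return archs
    # thin binary: header fields are little-endian on disk
    magic, cputype = struct.unpack("<II", header[:8])
    if magic == MH_MAGIC_64:
        return [CPU_NAMES.get(cputype, hex(cputype))]
    return []
```

An app whose binary reports only x86_64 would rely on Rosetta 2 on an M1 Mac; one that includes an arm64 slice runs natively.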
              • by MrNaz ( 730548 )

                So Rosetta is an emulation layer that trashes performance, and is not particularly good at being transparent. I'd say in the PPC to x86 transition about 75% of software worked properly, the rest needed tweaks. Industrial software is not like consumer software. User bases are smaller, as are development teams.

                Furthermore, there's plenty of PPC software that just never made the transition and died. Arguably, that's because there weren't enough users and resources so they died a natural death, but still, there

                • You used a lot of words to say you don't have any hard evidence of specific "software tools" that won't work. Got it.
                  • by MrNaz ( 730548 )

                    Given that these laptops haven't been released to the public, and so nobody has been able to test them with anything, it is 3 flavours of retarded for you to even ask for "hard evidence" of anything with regards to these new Macs.

                    • You are the one making unsubstantiated claims, and now you are blaming others for asking you for evidence of unsubstantiated claims? You must be a Trump voter.
                    • No, I'm making pretty straightforward observations based on rather obvious and well understood background facts.

                      ARM architecture is well understood.
                      x86 is also well understood.

                      Unless you believe that Apple somehow magically made a chip that is 10x faster than any other Arm OEM (and some of them like Samsung and Qualcomm are really, really good at it) then the claims here are false.

                      Anyway I'm done with this conversation. We'll find out the truth of the matter soon enough.

                    • Your original words were

                      Software availability. Adobe's Arm version will be available some time next year. Office and Adobe will get done quickly, but what about all the other software tools people use to get work done?

                      What has this got to do with Apple Silicon being 10 times faster? You keep moving the goal posts when you are asked to substantiate your claims. In this case, your claim that there are "software tools" that won't work (that's what I take "not available" to mean).

                      Can you focus on answering the question?

                    • No I don't. That comment was in response to this:

                      "Can you list out what are the short comings ("first-gen limitations") that exist on a M1 MacBook Air that doesn't exist on the Intel MacBook Air?"

                      It's a perfectly reasonable answer.

                    • And you feel that all the "software tools" that you use, such as ERP and so on will be broken. Based on no actual experience with the hardware. The whole point of Rosetta 2 is to bridge that pain.

                      If your claim is we don't know how effective R2 is, that's a good line to go with. That's acceptable.

                      However, your claim is that the software tools are unavailable, that's simply not true. Most ERP, or small companies, or single developer shops use normal programming languages such as C++, Swift, Java, and

                    • Ok. Whatever. We won't have to speculate for long.

                    • you may be interested in this. https://www.codeweavers.com/bl... [codeweavers.com]
    • by xack ( 5304745 )
      I'm ordering an M1 MacBook Air as an enthusiast, and also to early-adopt any emulators for Windows/Linux. I ordered one of the first white MacBooks in 2006 as well, which had the safety net of Windows XP; that won't be an option this time round.
  • External GPUs for gaming won't be needed, how many AAA games are going to run on ARM?
    • Comment removed based on user account deletion
      • The fact they’re killing OpenGL on macOS is probably a bigger reason, but the gaming engines (Unity / Unreal / Cry) are all multi-platform, although it’s not 100% seamless.

      • I have no doubt it will be comparable in the things most Mac users do, which is consume most forms of content except games. I however highly doubt it will be any good at real work. And that's simply because most professional applications leverage some pretty specific processor extensions that currently have no equivalent in ARM and can't just be reinvented overnight.

        Apple's move to ARM has essentially killed any potential for the platform to be used in a professional environment.

    • Fortnite? Oh wait, nm...

    • Comment removed based on user account deletion
      • Good luck with much gaming on x86 OSX. lol, that's Apple for you. If you want to game on Apple, you game with iOS.

        If you want to game on a PC, you're playing from Steam or GOG.com

        You do realize that x86 Apple computers can dual boot into Windows or Linux, no?

        I recall this being actually quite popular among some circles of the gaming community: macOS for work and Windows for play. Part of this is because of the poor support for eGPUs in macOS at the time. I've seen people run Windows and/or Linux on their Apple computers with no intention of returning to macOS. This could be because the Apple computer is old enough that it cannot run the latest macOS but it's been well cared for and t

        • Getting the video to run fast enough to run games under Wine on Linux on a Mac is a serious pain though, I couldn't do it and eventually gave up (maybe someone can do it).
          • by aitikin ( 909209 )

            Getting the video to run fast enough to run games under Wine on Linux on a Mac is a serious pain though, I couldn't do it and eventually gave up (maybe someone can do it).

            I'm running a 2020, 16" MacBook Pro spec'd to max RAM, and I can attest to how much of a pain it is. Currently playing through an older game (FFVII through Steam, through WINE, on Catalina (yes, that's a nightmare, but possible, hell, I've even managed to run some 32-bit ones)) and it uses as much of the system's resources as it can take, and I'm not running at top resolution.

            That being said, I don't want to have to reboot when I want to game, so I'm staying put there, but I wonder why it would be any wors

            • but I wonder why it would be any worse under Linux on a Mac, than any other hardware?

              Yes. I had no problem getting graphics to work under Wine on a Dell XPS 15 laptop.

      • "If you want to game on Apple, you game with iOS." Well, good thing then that macs are now compatible with iOS games
        • "If you want to game on Apple, you game with iOS."

          Agreed. It’s not like the full version of X-Plane actually runs on the Mac. Ditto Disco Elysium. Starcraft II and Portal don’t have Mac versions either,

          Wait.

          Yes they do.

          I see what’s happened. It generally requires an IQ of 110 or greater to realize that doing things a different way than we do them is a perfectly valid alternative, and is not an existential threat to our own existence.

          That’s a full standard deviation above the US a

      • My 2010 iMac had been utilizing GeForceNow for gaming. Buttery smooth high resolution on a crap 512MB video card.
      • Did you notice the part of the presentation where they said iOS apps will run natively on the ARM Macs? Translation: all the iPad and iPhone software will run on the M1 systems, which includes a lot more software than x86 macOS... Depending on the quality of the developer, some apps will work better than others, of course; iPad apps will probably be more usable than iPhone apps.
    • Almost anything really demanding these days (AI, video editing, photo editing, CAD, etc.) uses the GPU for acceleration; it's not just for gaming.
      It may be a fast iGPU, but that's not enough, particularly combined with a (pathetic) max of 16GB of RAM.
      • Completely agree with the 16GB RAM limitation being a deal breaker. If that limit is part of the chip arch, then M1 is DOA for any pro level use cases. It looks promising, but it still feels a lot like this is a public beta without the beta label.

        • by uberzip ( 959899 )
          Seeing as the ram is built into the SOC, I’m guessing the real limit is needing to manufacture another part. That’s the downside of the SOC... every combination of Ram requires a different product rather than just soldering on more ram to the system board. Seeing as they are just getting started my guess is they wanted to stick to two major parts... the 8gb chip and the 16gb chip. I’m guessing a 32gb version will come out probably as part of a faster M1x that would be used in higher end M
          • That’s the downside of the SOC... every combination of Ram requires a different product rather than just soldering on more ram to the system board.

            It’s been rumored that some company has invented a way to add and remove RAM without needing to replace a SOC or to solder anything.

            Basically, the motherboard has these elongated socket things, I don’t know, for the sake of discussion let’s call them “DIMM Slots”, and you can put these things called, oh I don’t know,

            • It depends a lot on whether the I/O connections to the external RAM are fast on the SoC. On-board RAM can be much faster. Do the Apple SoC parts dedicate enough pins for external RAM?

              This isn't your dad's PC with an AST Six-Pack to get the full 640K of RAM by running it through the ISA bus connector.

          • That’s the downside of the SOC... every combination of Ram requires a different product rather than just soldering on more ram to the system board.

            But it also requires that the SOC has enough address lines to even be able to access these parts.

            A lot of SBCs top out at 4GiB max (e.g. the RK3399 SoC also used inside the PineBook Pro [pine64.org]), even if 8GiB parts exist and are now affordable.

            The BCM2711 used inside the Raspberry Pi 4 and the Pi 400 PC is one of the few that can use such chips (or could, in the Pi 400 PC's case; my big prediction for Christmas 2021: the Pi Foundation announces a Pi 480 PC with 8GiB).

            The good thing with more players moving into the ARM field (Apple,

            • by aitikin ( 909209 )

              Until then, using swap on a fast NVMe will probably be the only solution to alleviate a bit of the memory pressure.

              Which (please correct me if I'm wrong here) will lead to said NVMe having a shorter life span thanks to its overly active read/write cycle.

              • Which (please correct me if I'm wrong here) will lead to said NVMe having a shorter life span thanks to its overly active read/write cycle.

                Absolutely. If you consider swap on flash, you should at all costs only consider "Industrial", "Endurance", "Pro", etc. lines of products, and look carefully at the wear levelling that they support.
                And don't fill the whole storage space; leave some significant fraction unallocated to give more wiggle room to the wear levelling.

                And like I said, only alleviate a bit of the memory pressure (like freeing some more RAM by swapping out less used parts - pages containing initialization code that isn't necessary a
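To put rough numbers on the wear concern, here is some back-of-the-envelope arithmetic (all figures are illustrative; the 600 TBW rating and the daily swap volumes are hypothetical, so check the actual drive's datasheet):

```python
def years_until_tbw(tbw_rating_tb: float, swap_gb_per_day: float) -> float:
    """Years until the drive's rated total-bytes-written is consumed,
    assuming swap is the dominant write load (decimal TB/GB, as vendors rate)."""
    return tbw_rating_tb * 1000 / swap_gb_per_day / 365

# Hypothetical 1 TB module rated for 600 TBW:
print(round(years_until_tbw(600, 50), 1))    # moderate swap, ~50 GB/day  -> 32.9
print(round(years_until_tbw(600, 500), 1))   # heavy sustained thrash, ~500 GB/day -> 3.3
```

So moderate swapping is survivable for the life of the machine, but a RAM-starved workload that thrashes all day eats through the endurance budget an order of magnitude faster.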

      • Depends on how the Neural Engine and other engines combine with the GPU/CPUs as to how AI/ML/CAD/etc. will perform. Your Intel chip-based systems didn't have these extra engines, so it will be interesting to see how they help/hinder beefier apps. x86 architecture has ALWAYS been a memory pig; I can run a small-scale Pi Kubernetes cluster on 1GB-RAM ARM chips, but can't even boot some versions of Linux on x86 with 1GB of RAM... Just because x86 needs 32GB of RAM to be useful doesn't mean other architectures need t
  • by SuperKendall ( 25149 ) on Tuesday November 10, 2020 @08:51PM (#60709968)

    Just because it's not supported initially, doesn't mean it will never be supported... the ports still support it, and in fact Apple could even make their own eGPU to start with...

    • Exactly. Not sure why some are taking this as a NEVER EVER WILL IT SUPPORT IT. They didn't support it right out of the gate previously either. When they transitioned to Intel chips they didn't support booting Windows right away either. It was added a couple of months later, just as they'll do with eGPU support.
      • Exactly. Not sure why some are taking this as a NEVER EVER WILL IT SUPPORT IT.

        Past performance. Apple doesn't have a track record of retroactively enabling hardware features on older devices.

    • Just because it's not supported initially, doesn't mean it will never be supported...

      There can be hard limitations in the hardware that could be *absolute* blockers.

      the ports still support it,

      The physical port connectors support it. But not necessarily the CPU/SoC itself.

      There's a lot of messy low-level stuff going on to map the PCIe bus resources that the GPU needs in order to talk to, and be controlled by, the CPU (BAR, I/O port space(*), etc.), and GPUs are among the most complex and resource-demanding things you can plug into a PCIe bus.

      Just look at all the reports of attempts to run GPUs on ARM SBCs that have PCIe lanes (Jeff [jeffgeerling.com]

      • by tlhIngan ( 30335 )

        There's a lot of messy low-level stuff going on to map the PCIe bus resources that the GPU needs in order to talk to, and be controlled by, the CPU (BAR, I/O port space(*), etc.), and GPUs are among the most complex and resource-demanding things you can plug into a PCIe bus.

        Just look at all the reports of attempts to run GPUs on ARM SBCs that have PCIe lanes (Jeff (Hackaday), Colin Riley (Tech republic), etc.)

        Most of those ARM CPU have roots in the embedded world (smartphone, tablets, set top boxes, etc. - where basically y

  • by WankerWeasel ( 875277 ) on Tuesday November 10, 2020 @08:56PM (#60709982)
    This is only initially. Just as Boot Camp and other tech wasn't added right when Intel chips were released, eGPU support is coming but not available right at launch.
    • Boot Camp was a simple software release, though, which added that support to all existing Intel Macs, and you could actually boot Windows very shortly after Intel Macs were released thanks to third-party hacks. Do the new Macs have the hardware to support eGPUs? Because if not, then the machine you buy today will never support eGPUs.
    • by AmiMoJo ( 196126 )

      It's not just a case of throwing a Thunderbolt transceiver in there though. ARM designs are starved for PCIe lanes to begin with and Thunderbolt support will probably be via USB 4 when it comes. Apple could do their own thing but it's a lot of work to develop.

      Apple isn't even on the latest revisions of ARM yet, they lag by a few years. E.g. other ARM SoCs are moving to DynamiQ but they are sticking with big.LITTLE for another generation at least. So I wouldn't expect things to happen too quickly on this fro

      • It's not just a case of throwing a Thunderbolt transceiver in there though. ARM designs are starved for PCIe lanes to begin with and Thunderbolt support will probably be via USB 4 when it comes.

        Thunderbolt host support requires PCIe access. What you wrote makes no sense.

        Apple could do their own thing but it's a lot of work to develop.

        Apple isn't even on the latest revisions of ARM yet, they lag by a few years. E.g. other ARM SoCs are moving to DynamiQ but they are sticking with big.LITTLE for another generation at least. So I wouldn't expect things to happen too quickly on this front.

        DynamiQ is more about cluster computing and as far as I can tell will never be used in general-purpose computers. Apple's instruction set and rough "overall" design are definitely not years behind.

        • by AmiMoJo ( 196126 )

          DynamiQ is the successor to big.LITTLE and is intended for consumer SoCs.

          You can see where they are going by looking at how AMD and Intel have developed their multi-core CPUs. Those 16 core 32 thread beasts are not just more cores thrown together, they have been improving the interconnection and resource sharing for years.

          Originally dual core CPUs shared some high level cache and bus stuff but that was about it. There are a lot of issues with threads switching between cores, contention and so forth. There was a

      • by Megane ( 129182 )

        ARM designs are starved for PCIe lanes

        I don't know if you've noticed, but Apple isn't using a third-party "ARM design". They go so far back with ARM that they have a full license to create whatever the fuck silicon they want, particularly their own cores. And PCIe lanes aren't part of the core anyhow.

        Not that I'd expect Apple to care much about making this one anything like a hot rod. It's clearly a first cut to get the architecture out there. If this goes anything like the past 35+ years of Macintosh, the PC industry will go all-in on ARM (on

  • by Joe_Dragon ( 2206452 ) on Tuesday November 10, 2020 @09:15PM (#60710026)

    A Mac Pro-level system will need more RAM and GPU power to drive at least two 8K displays. Hell, the Mac Pro had better have slots for video cards, non-Apple storage, audio cards, and 10-gig-or-faster networking.

  • I'm more interested in whether I will be able to boot from an external drive. I understand that Linux will be a no-go until the community catches up, but I like to keep my work on a separate drive from my personal stuff, especially since we use the Apple business iCloud setup.

    • No, the whole point of this is locking down the device just like an iOS device is locked down. You can't even install software that isn't from the Apple store, let alone boot to something else.
  • It appears to me that the problem of supporting an eGPU on new Macs is that there hasn't been enough time to develop the drivers. This should fix itself in time as the new platform gains users and therefore becomes a market of sufficient size to make the effort profitable.

    Another place I could see a problem is the limited bandwidth of the USB4 ports. USB4 doesn't add any more bandwidth over USB 3.x; it adds the ability to put USB packets on the "superspeed" data lines so that it can share those lines with Thunderbolt and DisplayPort packets. With USB 3.x the superspeed lines had to be either/or: dedicated either to USB data or to Thunderbolt/DP packets. Because USB4 is backward compatible with all versions of USB, all versions of Thunderbolt, and all (or most?) versions of DisplayPort, there should not be any physical limits in the connector preventing adding an eGPU. That is, unless people expect more from an eGPU than before and there isn't enough bandwidth to support an eGPU any more. But then this should also apply to any eGPU that uses the USB-C connector. Unless I'm missing something on how USB4 works.

    So, where is the problem? My best guess is that the problem is the platform is so new that there hasn't been enough time to develop and test drivers for public release yet. That's a problem that should disappear fairly quickly.
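For rough context on the bandwidth side, here are the nominal link rates involved (these are headline signalling rates only; real-world throughput is lower after encoding and protocol overhead, and Thunderbolt 3 in practice caps the PCIe payload well below the 40 Gbit/s line rate):

```python
# Nominal link rates in Gbit/s (before encoding/protocol overhead)
links_gbit = {
    "Thunderbolt 3 / USB4": 40,
    "PCIe 3.0 x4 (typical eGPU tunnel)": 32,     # 8 GT/s per lane * 4 lanes
    "PCIe 3.0 x16 (desktop GPU slot)": 128,
}

def to_gbytes(gbit: float) -> float:
    """Convert Gbit/s to GB/s (decimal)."""
    return gbit / 8

for name, gbit in links_gbit.items():
    print(f"{name}: {gbit} Gbit/s = {to_gbytes(gbit):.0f} GB/s")
```

The point of the comparison: even a best-case eGPU tunnel gives a GPU roughly a quarter of the bandwidth it would get in a desktop x16 slot, which is why eGPUs were always a compromise regardless of the host CPU.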

    • by AmiMoJo ( 196126 )

      It's interesting that Apple has not revealed how many PCIe lanes the M1 has. I suspect that's the main reason for not supporting Thunderbolt at this time, they simply don't have enough lanes available to make it work.

      Adding more PCIe lanes is not trivial.

      • I suspect that's the main reason for not supporting Thunderbolt at this time, they simply don't have enough lanes available to make it work.

        I don't know what you are talking about. The ARM-based Macs support Thunderbolt on their USB-C ports just like the Intel-based Macs. They almost have to, because Apple displays have relied on Thunderbolt for years now.

        • by AmiMoJo ( 196126 )

          Displayport over USB is not the same thing as Thunderbolt. USB-C alternative mode supports Displayport.

          You need PCIe lanes for an external GPU, that's different to Displayport used to connect a monitor.

          • Displayport over USB is not the same thing as Thunderbolt.

            Agreed. Thunderbolt was designed to coexist on the same data lanes as DisplayPort, and with USB4 we can now see USB, Thunderbolt, and DisplayPort data on the same USB-C superspeed data lanes.

            USB-C alternative mode supports Displayport.

            USB-C alternative mode supports a lot of things, including Thunderbolt.

            You need PCIe lanes for an external GPU, that's different to Displayport used to connect a monitor.

            Of course, and if anyone bothered to read the specs on the new ARMbooks they would see that they support Thunderbolt on their USB-C ports. What people should be aware of is that Thunderbolt is one implementation of an external PCIe port.

            Apple has been

      • by Anonymous Coward

        This is literally the only place on the internet where people bitch and moan about PCIe lanes.

    • Glad I still have my 2019 Mini w/Sonnet EGFX Breakaway Puck!
  • Intel didn't support >16GB on laptop CPUs until the 6th-generation Core processors. These are aimed at the low end of the market, so 16GB will be fine for them.

    • by AmiMoJo ( 196126 )

      Depends how long you want your laptop to last. I have an old NEC LaVie that is still great in every way... Except that it's limited to 4GB RAM because that's all the 2013 Intel CPU supports.

      At the time I got it I thought 4GB was okay, paired with an SSD it runs Chrome very well, handles my development and CAD needs. But 7 years later it's struggling a bit and if I could just upgrade it to 8GB or ideally 16GB I'd be happy to keep it for many more years.

      16GB seems decent today but with the price of these thin

    • by leptons ( 891340 )
      >Intel didn't support >16gb on laptop CPUs until the 6th generation core processors.

      Well that's just totally false. I have a laptop with a 3rd-gen i7 with 32GB RAM. Works great.

      i7 3rd Gen 3630QM

      https://ark.intel.com/content/... [intel.com]

      Max Memory Size: 32 GB

      https://www.notebookcheck.net/... [notebookcheck.net].

      "The Intel Core i7-3630QM is a fast quad-core processor for laptops based on the Ivy Bridge architecture and successor of the i7-3610QM."
  • Macs aren't for games. They haven't been viable for games for like 10 years. An external GPU is almost always a bad idea anyway, but it's especially bad on a Mac. No DirectX, ancient OpenGL, and proprietary Metal that only works on games designed for macOS/iOS.

    Macs are great for some things, and ARM worries me for some other things (Windows emulation in particular), but really, we have probably 10,000 Macs at work (no exaggeration) and I have never seen an external GPU.

    • An external GPU is almost always a bad idea anyway, but it's especially bad on a Mac

      I know someone at work who used a 1080Ti as an eGPU with his macbook pro to do deep learning. More than good enough for trying and testing decent sized things locally before pushing them to cloud machines for long training runs.

      Usual laptop caveats apply: it's not as fast as a desktop, but you know, it's a laptop.

    • Macs aren't for games.

      They’re “for” whatever people use them for.

      To be fair, I get what you were going for, which is I think: Macs aren’t ideal for gaming in general.

      Which may be so, but there are a fair number of titles available, and if you happen to be a dedicated player of one of those titles, then Macs ARE for gaming. At least for you.

      • Are there any mac-exclusive AAA games?

        Are macs designed to accommodate gamers?

        Macs are not for games. You can use them to play some games badly, but that's not what they are for.

      • How many games exist for PC's vs exist for iPhone/iPad?...
    • Macs are great for some things, and ARM worries me for some other things (Windows emulation in particular), but really, we have probably 10,000 Macs at work (no exaggeration) and I have never seen an external GPU.

      Someone edits a video on their MacBook, they get home, plug in the GPU, and render it.

      You’re telling me that won’t speed things up quite a bit?

      Doesn’t the ability to access CUDA or nvenc make it a better media-creation machine? What about programs that all but require CUDA to run

    • Macs aren't for games. They haven't been viable for games for like 10 years.

      For some definition of "games" maybe. Take a look at the macOS steam catalogue [steampowered.com]. There are some great titles there if you want to fire something up to keep yourself entertained for a few hours on a weekend.

      To use a car analogy, saying that a Mac isn't viable for games is like saying that a Fiat 500 isn't viable for driving. Yeah... if you're a car hobbyist who takes their car to the track, the Fiat would be a silly choice, but it'll still get you around and can be nice for an afternoon drive through the co

    • An external GPU is almost always a bad idea anyway, but it's especially bad on a Mac.

      Counterpoint - my Macbook Pro is running a Radeon RX 5700 via eGPU to drive two 49" ultrawide monitors. When I want to hit the road, I just disconnect and take the laptop with me. All the benefits of portability, with the size and comfort of a full desktop. It was the promise of twenty years ago when laptop docks first became a thing, but the ability to drive multiple monitors at high resolution from the laptop GPU has always been an issue, until eGPUs.

  • eGPU support: couldn't care less; I have literally never met a single person with an eGPU, and I don't know a single serious gamer with a Mac either. They're all die-hard PC buyers. 16GB of RAM: yeah, that may be a problem; particularly with such great virtualization support, it seems like the RAM limitation could be a big problem. As a developer, I'm very, very keen to onboard to these new machines; the Intel Macs have _so_ many problems, particularly with coming back from sleep; it's exciting to contemplate quick

    • Intel Macs have a problem with restoring from sleep? Um... that's a solved problem even with Intel architecture at this point. If Macs are having trouble with waking from sleep then it's more than likely a software problem and the change to ARM isn't going to fix that.

  • Holy moly 17 people that had no intentions of buying a first gen MacBook are going to be tweeting like crazy
  • M1? I'll wait for the M5, I heard it's going to be "The Ultimate Computer".
  • As with all new hardware, it will take time for stable drivers to become available. Especially when it's a new platform design, one breaking with previous designs, it often takes many months before third parties have working drivers available.

    • Linux ARM has quite a few drivers; I don't think it would take THAT long to glean any needed ARM-specific info from those and apply it to macOS. You could probably even glean info from the BSD ARM ports for even faster adoption. Could be done; not clear if Apple/devs will do that or just slide by on iPad/iPhone apps tweaked for M1.
      • Look, you cannot even get a proper open-source driver for Nvidia. And on ARM you also can't get proper support for even ARM's own GPUs. It's all great on paper, but when you want e.g. video encoding or Vulkan support for a Mali GPU then you're already in trouble. So even when a manufacturer decides they now want to support their GPU on Apple's new platform (i.e. AMD and Nvidia), it's still going to take a while before you get something mature, and it won't be open source, and the time for bug

  • The Apple M1 is an SoC design derived from their existing line of SoC designs for mobile applications. It rather makes sense, then, that the design doesn't include support for that sort of hardware extension. There are two issues: adding the necessary channels to the hardware for it, and writing the driver support to make it possible to use. The hardware support is logically something that can be added in newer designs.

    There are a few sort of obvious desirable architectural features for performance desktop and

  • I just can't get over the dongle hell.. I still use USB-A daily, ethernet, HDMI, among others.

    It's incredibly frustrating to have all those dongles. I have a 2018 15" fully loaded MBP, but I don't use it and just stick to my 2013 13". It's so much easier to not have to deal with all those dongles.

    Apple says they want elegant aesthetics, but an ugly multi-dongle, or several dongles, just won't do the job.

    There really is no Macbook Pro anymore.. just two versions of the Macbook Air: the one with a fan and the
