Will Nvidia Spark a New Generation of Linux PCs? (zdnet.com) 95
"I know, I know: 'Year of the Linux desktop ... yadda, yadda'," writes Steven Vaughan-Nichols, a ZDNet senior contributing editor. "You've heard it all before. But now there's a Linux-powered PC that many people will want..."
He's talking about Nvidia's newly-announced Project Digits, describing it as "a desktop with AI supercomputer power that runs DGX OS, a customized Ubuntu Linux 22.04 distro." Powered by MediaTek and Nvidia's Grace Blackwell Superchip, Project DIGITS is a $3,000 personal AI that combines Nvidia's Blackwell GPU with a 20-core Grace CPU built on the Arm architecture... At CES, Nvidia CEO Jensen Huang confirmed plans to make this technology available to everyone, not just AI developers. "We're going to make this a mainstream product," Huang said. His statement suggests that Nvidia and MediaTek are positioning themselves to challenge established players — including Intel and AMD — in the desktop CPU market. This move to the desktop and perhaps even laptops has been coming for a while. As early as 2023, Nvidia was hinting that a consumer desktop chip would be in its future... [W]hy not use native Linux as the primary operating system on this new chip family?
Linux, after all, already runs on the Grace Blackwell Superchip. Windows doesn't. It's that simple. Nowadays, Linux runs well with Nvidia chips. Recent benchmarks show that the open-source Linux graphics drivers perform about as well with Nvidia GPUs as Nvidia's proprietary drivers. Even Linus Torvalds thinks Nvidia has gotten its open-source and Linux act together. In August 2023, Torvalds said, "Nvidia got much more involved in the kernel. Nvidia went from being on my list of companies who are not good to my list of companies who are doing really good work." Canonical, Ubuntu Linux's parent company, has long worked closely with Nvidia. Ubuntu already provides Blackwell drivers.
The article strays into speculation when it adds: "maybe you wouldn't pay three grand for a Project DIGITS PC. But what about a $1,000 Blackwell PC from Acer, Asus, or Lenovo? All three of these companies are already selling MediaTek-powered Chromebooks...."
"The first consumer products featuring this technology are expected to hit the market later this year. I'm looking forward to running Linux on it. Come on in! The operating system's fine."
\o/ (Score:4, Funny)
What big eyes you have Grandma
All the better to do user-subsidised distributed-AI with my dear, *licks lips*
dumbest article yet (Score:2, Troll)
Not just anyone can steal such a dumb article, it takes EditorDavid.
Probably not (Score:2)
But Microsoft might... who wants to buy new hardware? Somebody needs to put a bit more polish on UIs and things, though, before the Luddites start to use Linux.
Re: (Score:3)
What are you talking about? 128GB is *way more* VRAM than has traditionally been available to open-source AI devs unless they're willing to pay through the nose to rent A100s. Normally you have to work on a 3090 or 4090 gaming card with 24GB. You can use multiple cards, but then you're bottlenecked by the bus. This is 128GB of unified memory, on a top-end efficient architecture.
Yeah, it's not going to have the FLOPS of a GB200, or anywhere remotely near that. Because you're paying 1/20th the price. But it's
Re: (Score:2)
$3k is the price of a good gaming rig.
If you're looking for something *really* cheap, something like the Orin Nano is more your price range. Normal GPU-level amounts of VRAM but only ~$250 total for a whole system.
Re: (Score:2)
That's exactly what it's marketed as, and as-is it is not big enough to do that.
To do ChatGPT at home, basically, you need a full DGX (which costs a million dollars and contains at least four 80GB GPUs, 320GB total), so 128GB is only about a third of what you need. Now imagine buying three of these; that's what you'd need. Roughly $10,000 at home to build your own chatbot. That's the real threat to the "cloud" AI model.
But nobody really needs a full LLM. It's doubtful there are any practical use cases for Project DIGITS
Why would anybody pay for this? (Score:2)
LLMs are mostly useless.
Re: (Score:2)
LLMs are mostly useless.
Why would anybody pay for this? Maybe because they realize that LLMs are not just ChatGPT et al. and that there are a lot of interesting and useful models besides LLMs.
Re: (Score:3)
"that there are a lot of interesting and useful models besides LLMs."
Ok name 3 in *popular* use.
Re: (Score:2)
Coqui - Voice cloning/TTS
RVC - Voice cloning
Flowtron/RadTTS - Highest end TTS you can do on a GPU
Vosk - ASR
Whisper is also a useful ASR, but it's clearly trained on pirated videos.
Those above are incredibly useful as-is and can run on CPUs or GPUs. The difference is that when you run them on a GPU instead of the CPU, they take fractions of a second where otherwise they would take a minute.
There are plenty of generative AI and LLM stuff that is not useful, but a 128GB "memory" does open it up to being abl
Re: (Score:2)
Ah, you are talking about visual and audible "better crap". Still crap and people are getting tired of it.
Re:Why would anybody pay for this? (Score:5, Insightful)
Meanwhile StackOverflow sees new questions at 23% of the rate it did before ChatGPT, but you keep telling yourself that.
Also, as the person above you pointed out, LLMs are not the only application of ML.
Re: (Score:2)
Re: (Score:2)
LLMs are mostly useless.
The same thing was true of PCs, at one point. Then people found ways to make them useful and user-friendly, and now they are ubiquitous.
Re: (Score:2)
On the other hand, the history of computing is full of failures and has very few successes in comparison. Your statement is bullshit.
Re: (Score:2)
This is true of literally every technological advancement. There was probably someone who said the electric light bulb isn't very useful because the filament keeps burning out. And then engineers got to work and made them more reliable.
Re: (Score:2)
There are also people who said that, say, Josephson circuits are not very useful. Oh, and look, they are.
The thing is that all the failures are easy to forget while the (small) number of successes stick around. Hence uneducated and clueless people fall for confirmation bias.
Why MediaTek? What are they doing? (Score:2)
Normally they make the cheap, basic parts. They take designs from people like ARM and just manufacture them. Other companies tweak and fix stuff, trying to make a better product than everyone else while leveraging the rest of the work they bought.
I know NVIDIA has its own CPU design, and has had a few of them actually, some much closer to ARM's version than others. Obviously NVIDIA is supplying the GPU/accelerator (are we going to stop calling them GPUs anytime soon?).
Is MediaTek involved just so NVIDIA does
Re: (Score:2)
Modems.
They're competing with Snapdragon, so they need to be 5G capable on these connected devices.
Local sounds nice, but... (Score:3)
Isn't it normally local, then remote? They released the 'as a service' version first.
Typically buying a device I only need part of is a bad idea. Especially versus renting someone else's.
And how is this so different from an x86 PC with a GPU? I'm sure the costs will differ, but capability-wise... I don't know if the payoff is there for the work it'll require. All those x86 game tools are gone (the deep innards of a PC differ, so tools must differ). And unless Linux has a binary translation layer I'm forgetting, I don't think you can run x86 programs on ARM hardware normally.
The parts needed to do so seem around. I just don't think that flavor has been set up. Translate everything? Sure. One program? less so...
Re: (Score:3)
128GB of VRAM, for starters.
This is not a gaming rig.
Re: (Score:2)
Why not just buy the GPU separately and stick it in a PC slot? I can certainly see that there are some potential customers for this "AI PC" but that market seems small. The average home user doesn't need it and can't afford it, and the vast majority of business users don't need it, and those in business who do want it very often are going off of hype and not an actual need.
If AI works out, it's a great new technology, but it's not a large market technology any more than a gene sequencer is.
Re: (Score:2)
What PCI GPU do you think is out there that has 128GB of VRAM?
AI tasks don't need much in terms of CPU or fast local storage. They do however impose a lot of bandwidth / latency and cooling requirements. It's easiest to just design to the task.
Re: (Score:1)
...unless Linux has a binary translation layer I'm forgetting...
There is one [box86.org].
Re: (Score:3)
Hmm, maybe it will be usable for games, then.
Really, though, the hardware isn't optimal for gaming. It's stressing VRAM, VRAM bandwidth, and low-precision FLOPS over the higher-precision FLOPS used in gaming. There are no games out there doing their 3D math in FP4 and requiring 128GB of VRAM. If you want a gaming box, get an RTX 5000 series card in a traditional PC.
Most people doing AI already use linux (Score:2)
So, no market gains there.
Meanwhile, people who want a desktop for desktop stuff will buy the MediaTek laptop that will soon come out with Windows on ARM.
So no, Nvidia will not spark a new generation of AI PCs, and this will not bring the year of the Linux desktop in 2025.
JM2C
YMMV
Re: (Score:2)
So, no market gains there.
" Will Nvidia Spark a New Generation of Linux PCs? "
No.
I'd be curious to hear what specific work... (Score:1)
I know NVIDIA's effectiveness at capitalism puts them at odds with typical OSS people like Torvalds, but it's nice to hear some praise, as I think NVIDIA has tried to do right already.
I know they were working to open source more of their driver. That's nice. Think it was mostly about organizing and making a stub for the stuff they really worried about. Not work that makes much profit for them, but could get some good will. And let other people help them.
I'm curious if the latest work they're praising w
Re: (Score:2)
"working to open source their driver"? This is probably a stupid question but why not just open source it already? What is it that they are protecting? Their hardware interfaces by obscuring them?
Re: (Score:2)
nvidia always had the same standing for everything:
You must do their way or not work with it at all!
Everyone hates working with Nvidia; hardware makers preferred to go AMD-only, or lose most of their business, rather than work with Nvidia! Nvidia enforces so many rules that making a profit with them is extremely hard (but Nvidia's profits are always increasing). They simply do not care, and that is why no console or supercomputer chooses Nvidia; they still make enough money being a bully! It is just like Apple, thei
The PC tech problem (Score:3)
Amongst the many IT roles I've held over the last 30+ years, PC tech has been the most rewarding, as it never ends and is always evolving. That said, 99.99% of the tech work I have done deals with Windows and the Intel architecture, Wintel if you will. I have knowledge of fixing Windows issues going all the way back to Windows 3. Put a Windows machine in front of me with an issue and I will figure it out and fix it. However, put a Linux machine in front of me with an issue and there's a 50/50 chance I'll be able to fix it without resorting to wiping the system and reinstalling. I just don't have enough experience with Linux because there has been so little call for it (I'm getting better, though, as I have Linux machines I use daily now). This is more than likely the case for many of the PC techs out there.
Linux is crazy powerful but can also be crazy confusing to use. The mind-boggling array of command-line tools available is impressive but also convoluted to use and remember. With Windows, on the other hand, I rarely had to resort to the command line to repair something.
My point being that the new generation of PC techs need to get up to speed with repairing Linux systems proficiently. What would really help is less reliance on the command line. PC techs don't want to be uber Linux gurus; they just want to be able to fix it quickly and efficiently for the customer that brought it to them. Also, that command line needs to be completely transparent to the average computer user. Many Linux distros have made great strides toward this, but it's not quite there yet. Just my 2 cents.
The operating system is largely irrelevant (Score:5, Insightful)
I would agree with you 20 years ago, but the majority of people rely on a browser and a handful of simple clients to cloud services, like e-mail, messaging, etc. Nerds like to install programs. Casual users are happy with Google Docs the .0001% of the time they need to do something productive from home. My sister is a successful, educated professional in a knowledge-based field. She only uses default apps: Safari, Apple Mail, Messages, etc. I am similar. All I have installed on my personal MacBook is Photomator, an extension of Apple Photos that lets me edit RAW files from real cameras, and Firefox, because I prefer it to Safari.
Each year, Linux vs. Windows vs. Mac becomes less relevant as more apps are moved to the web/cloud. If the hardware is supported by the vendor, then driver installations should ideally be easy. In fact, in my decades of running desktop Linux, I honestly found it easier than Windows. Ubuntu always supported all of my hardware; for Windows, I'd have to find the device makers' websites, and it was often some drama getting things working. I got sick of the drama and just switched over to an Xbox for gaming, both because I prefer a straightforward experience to a VERY expensive one with better graphics, and because I had kids, so I just don't have the time I used to for cultivating a gaming hobby.
The barriers you describe sound like hardware and apps...the manufacturer is supporting the hardware, so you can probably expect an Apple-like experience. The apps? Well, less and less are run each day locally...so with each year, they become less relevant and Linux becomes more viable.
Re: (Score:2)
I agree with you. However, Linux still does crash and in my experience the main cause is from installing/updating applications. Some rogue dependency needs to be added or updated and upon a reboot ... panic. These are a real PITA (in my experience, YMMV) to find and correct. The underlying OS can still have issues.
Re: (Score:2)
Windows also crashes. It's not rare at all. It is less stable than a well set up Linux distribution.
Re: (Score:2)
Why wouldn't powerful on-board AI bring about a new PC standard?
Imagine, this PC being able to figure out its own issues as long as it boots. In fact, the Unix ethos might be much better for AI to manage than Windows GUI. Linux complexity may disappear.
Re: (Score:2)
The last kernel panic was years ago, because of faulty hardware. The last Windows crash... I think last month. Because I rarely use Windows. Otherwise it would probably be last week.
Re: The PC tech problem (Score:4, Informative)
Summarizing my experience with fixing Windows issues over, say, the last 5 years:
1. Uninstall some update.
2. Add or change some cryptic Registry value.
3. Set some cryptic Group Policy Editor value.
4. Execute some cryptic PowerShell cmdlet.
5. Execute some cryptic command-line tool.
Hardly better and/or more intuitive than what I would do in Linux.
Re: (Score:1)
Re: (Score:3)
Re:The PC tech problem (Score:4, Interesting)
> Put a Windows machine in front of me with an issue and I will figure it out and fix it. However, put a Linux machine in front of me with an issue and there's a 50/50 chance I'll be able to fix it without resorting to wiping the system and reinstalling.
Heh, you describe exactly why I ditched Windows about 10 years ago: about every 6 months my system would break, and I couldn't figure out how to fix it without wiping and reinstalling. With Linux, I was always at most 3 steps from fixing it. My current install has been running 12 years without a wipe and is completely up to date.
Mileages indeed do vary.
Re: (Score:3)
Man, imagine being able to fix a Windows system better than a Linux system.
Shows me that the person doesn't really know Linux.
Windows is just so much more closed and unable to be fixed by the end user than Linux...
Re: (Score:1)
> Shows me that the person doesn't really know Linux.
That was his entire point... What is yours?
Re: (Score:2)
That you can fix more problems on Linux than you can on Microsoft.
His skill issue is the only problem here.
Re: (Score:2)
It's not just my skill issue; it's an issue many PC techs face. I probably would have devoted much more time to learning Linux back in the early days, but when I did I was met with outright hostility from Linux, let's call them, "purists". "RTFM!", "N00b!", etc., etc. The Linux "community" was full of self-righteous a-holes pushing away those that truly wanted to learn and switch... Only the "leet" were allowed. You smell like one of them.
Re: (Score:2)
So I'm an elite for pointing out that Linux is infinitely more fixable than Microsoft stuff?
F* off.
Re: (Score:2)
You know, based on this comment, I'm pretty sure why you got the response you did from the Linux community, and I can tell you that you should look in a mirror. You smell like your own problem.
Re: (Score:1)
When I used Windows for work and Linux at home, I always found Linux more fixable. The fix might range from simple to quite advanced and complex but it was always possible.
Windows was a different matter. For example, Outlook broke, I googled the symptoms and it was a *very* common problem. Half the posts about this problem mentioned three ways to fix it. The other half said they'd tried all three and none of them worked, and they'd never been able to fix it (I ended up in that group).
Another time, I worked
Re: (Score:2)
Windows from the start was for the low end home user. Whereas Unix from the start was intended for professionals, such as engineering, science, or even office professionals. It's a completely different focus. So even today, Windows which is now mostly getting money from the enterprise customers treats their users as idiots, whereas Unix/Linux/etc treat their users as capable. Thus every year Windows removes more and more customization options, whereas Unix provides as much customization as the user want
Widevine DRM ranks Linux lower than Windows (Score:1)
Widevine DRM ranks Linux lower than Windows.
Also, some Steam games flag Linux play as well.
Re: (Score:3)
Obviously, Digital Restriction Management will have a problem with any open system. Duh.
What's the point? (Score:2)
Unless you are generating porn, it's far more efficient to use the cloud. LLMs especially are incredibly inefficient locally, because performance scales nearly linearly with batch size.
Re: (Score:3)
"The point" is that this basically is the cloud, in a tiny form factor. It's a modern Blackwell architecture. Highly efficient. It has 128GB VRAM, so you can run large batches. If you're talking about AI art (from your context, I presume so), who generates one image at a time when they have the VRAM to do multiple?
And unlike shared cloud services, obviously, you're in full control of it.
Re: (Score:2)
You could run decent batches for small models. Locally you lack the demand for it though, in the cloud they can aggregate requests.
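The batching point above can be illustrated with a toy sketch (plain Python; the tiny matrices are stand-ins, not a real model): a server that aggregates requests pushes many inputs through the same weights in one pass, and on real hardware that shared pass amortizes memory traffic, which is where the throughput gain comes from.

```python
# Toy sketch (illustrative only): why serving stacks batch requests.
# Every request needs the same weight matrix W; batching shares one
# sweep over W across many inputs instead of reloading it per request.
def matvec(W, x):
    """One request: multiply weights W by input vector x."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def batched_matvec(W, X):
    """Many requests at once: same math, one shared sweep over W."""
    return [[sum(w * xi for w, xi in zip(row, x)) for row in W] for x in X]

W = [[1, 2], [3, 4]]          # stand-in for model weights
X = [[1, 0], [0, 1], [5, 5]]  # three queued "requests"

# Batched output is identical to serving each request alone; the win on
# real accelerators is throughput, not a different answer.
assert batched_matvec(W, X) == [matvec(W, x) for x in X]
```

A lone local user rarely has several requests queued at once, which is the commenter's point: the cloud can fill large batches, a desktop usually cannot.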
Re: (Score:2)
Exactly how massive of batches are you picturing there being demand for, for a given prompt?
Also, a $70k GB200 only has 3x the VRAM anyway (performance gains with batch size are logarithmic, not linear). Its main difference is FLOPS, not batching capacity.
Re: (Score:2)
(This oversimplifies the situation, but we have to decide on a particular batching architecture to discuss how you can gain from batching. With Transformers, for example, you gain from continuous batching of uneven-sized requests fitting into the given context size even if they're not similar, so there's always some gain, assuming you're dealing with different sized queries and have a server that can pack them efficiently. But you can only share a given set of token processing on queries that begin with t
Almost true. (Score:2)
"Nowadays, Linux runs well with Nvidia chips."
Kernel 6.8.51 did not get along with my GTX 680. It started fine, but one trip to system reports and I ended up with gibberish on the left and bottom and a blue swish from center to upper right.
Reverting to 6.8.49 fixed everything back up.
If you are wondering, the system is a 2010 Mac Pro which probably isn't helping things. It usually runs Mint 22 without problems although there was some software that didn't work due to a lack of AVX instructions.
Re: (Score:2)
I had a fair amount of pain getting NV cards working on my Mac Pro 2008, both with Linux and OS X. I don't know if the situation has improved on the 2010.
Once my Mac Pro's PSU died, I put together a very low-end Xeon-based game system using some low-cost server pulls. I've been running that with a 1070 Ti card for a few years now, and it hasn't given me any trouble on the graphics side of things. The usual annoyances with GNOME and PulseAudio sometimes misbehaving; that's about it.
Re: (Score:2)
I still go into a new Linux install expecting to spend at least 2 full days to get NVidia+CUDA+full pytorch environ properly set up. Though using Fedora usually makes it worse, since it uses new versions of everything, while NVidia and pytorch apps usually lag behind. I've multiple times had a Fedora OS go end-of-life before NVidia even caught up to it. I generally have to install older versions of GCC and Python and switch back and forth between them. And all of the different ways to install NVidia and
Windows on Arm is not far behind in availability (Score:1)
Huang also boasted in the CES presentation about how well WSL2 works on Windows. That suggests Windows on Arm is not far behind in availability. I don't see how Microsoft would cede this space so easily.
"now"? (Score:2)
>"But now there's a Linux-powered PC that many people will want..."
There are already Linux-powered PCs that millions of people want and use every day. Kind of a strange statement.
Re: (Score:2)
Re: (Score:1)
Not all third party commercial software gets updated in a timely fashion. There may be some important applications which have not yet been certified for 24.04.
No (Score:3)
No
Re: (Score:2)
doubt (Score:2)
There was a class of Linux-powered PCs that many people wanted; it was called netbooks: small, inexpensive, but somewhat limited in features. These AI supercomputers are at the other end of the spectrum: much more expensive.
Think you have drunk too much Kool-Aid (Score:3)
Re: (Score:2)
What we need is a Linux desktop computer that fires electrodes into the user's brain, taking control of him/her and using them as an agent to install Linux on more computers. Then I think we'll have a real shot at this being the year of the Linux desktop. The math proves it will work, and we can expect exponential growth in the Linux market too.
Re: (Score:2)
Elon already has the brain interface thing figured out. I for one welcome our new AI overlords using us as meat robots.
Make it run SteamOS well (Score:1)
and you might have something popular.
The trouble is that most games run on x86 and nothing else, and there are only so many developers.
For professional use, sure.
NVIDIA can't be bothered to release its HPC SDK for Windows. Makes sense: who would want to develop on Windows in that space? You'd spend 3/4 of your time wrangling configuration of the environment.
The latest versions of TensorFlow likewise don't exist for Windows.
The trend is clear: ML and HPC development = Linux.
a $3,000 personal AI.....Hell NO ! (Score:2)
Only a handful (Score:3)
Re: (Score:2)
I don't get who they are aiming this at, especially as, if you already have a PC, a 5090 will be about $1k cheaper than a DIGITS box and will apparently deliver a significantly more powerful AI experience for both inferencing and training.
Windows runs on Arm (Score:1)
Sure, the hardware is probably awesome and all. Still, this "you can't run Windows" line isn't accurate in the least. Of course you can; granted, NVIDIA probably hasn't done what it needed to do with Microsoft to make it happen (like write a check), but given the existence of Windows on Arm, it's already there. If anything, Windows on Arm in a VM is probably just a matter of time.
Still, would one rather own one of these, an Apple M4, or an Ampere machine to run Linux on a top-of-class Arm system? There are choices....
That's a niche market (Score:2)
"Nowadays, Linux runs well with Nvidia chips." (Score:2)
Since when? GNOME and Wayland still have problems with Nvidia display drivers today. Fedora, GNOME, and others do not ship the official Nvidia drivers with their distributions because the driver is proprietary closed source. And if you install it, you can expect that it will cause other problems with your environment.
Nvidia is *horrible* with Linux today.
Re: (Score:2)
>> Nvidia is *horrible* with Linux today.
I have to say that's a complete 180 from my own experience.
Nvidia has always been just plug-and-play for me. Admittedly I mostly just use Ubuntu and throw the factory drivers on (I can't be bothered to waste my time dicking around with Arch and Nouveau or whatever).
Meanwhile AMD GPUs have for me multiple times been a giant headache to get working stably under Linux, even just 2D/desktop stuff.
Personal AI (Score:1)
Re: (Score:2)
Any L3.3s working well for you?
Re: (Score:1)
Re: (Score:2)
Nah, Huggingface is the place to be.
Can't you run big models (slowly) with some layers offloaded to RAM? Or are even the RAM requirements too intensive?
Re: (Score:1)
No (Score:2)
Next question.