NVIDIA Demos "Digital Ira" With Faceworks On Next-Gen SoC, Under Ubuntu
MojoKid writes "NVIDIA is currently holding a tech event in Montreal to showcase a number of the tools and technologies the company has developed to foster the state of the art in game development. NVIDIA's VP of Content and Technology, Tony Tomasi, took a moment to show off Faceworks and the 'Digital Ira' face that they've demoed at various events over the last year or so. This particular demo was a little different, however, in that it was running on a Logan test kit. If you're unfamiliar, Logan is the codename for one of NVIDIA's next-gen mobile SoCs, which features a Kepler-based GPU, like the current GeForce GTX 600 and 700 series parts. The demo ran perfectly smoothly, and the quality of the imagery was as good as we've seen on any other platform to date, whether console, PC, or mobile. Incidentally, the demo was running on Ubuntu Linux."
SubjectsInCommentsAreStupid (Score:3)
Re: (Score:2)
Re: (Score:3)
I think it has more to do with the fact that ATI/AMD locked up the console market for the next generation with the Wii U, Xbox One, and PS4 contracts. Nvidia had to find a new market to shift to, which meant Android devices and Steam boxes. Both meant improving Nvidia's Linux drivers.
Express truck (Score:1)
I would be afraid of releasing a tool that automated facial expressions for fear of getting hit with a dozen patent letters.
Re: (Score:3, Funny)
Bingo (Score:2)
Re: Bingo (Score:1)
Wouldn't state-of-the-art game development involve something other than graphics presentation? Like maybe the game itself, and gameplay elements? An FPS engine is not a game.
Re: (Score:1)
Re: (Score:2)
It is, however, an extremely important game development tool. They are showcasing state-of-the-art tools they have developed for game development, not the development of state-of-the-art games. At least, that's what TFS is claiming: it sounds like it was written by Nvidia's PR department... which, actually, wouldn't surprise me. Well, except PR writing generally tends to be of a bit higher quality than the summary (so, don't think my comment was critical of your misunderstanding: poor writing begets poor understanding).
Gibberish train (Score:2)
"If you are unfamiliar", it says.
Re: (Score:2)
Is Nvidia relevant in the mobile space? (Score:2)
Seems like Apple and Qualcomm rule most of the market. I rarely see any new device coming out that doesn't have an Apple or Qualcomm SoC in it.
Re: (Score:2)
They make the Tegra, which is used in some tablets:
http://www.nvidia.com/object/tegra.html
Re: (Score:3)
Tegra 2 and 3 were pretty popular. Tegra 4 seems less so. Maybe Tegra 5 will be more popular. Whatever - competition is always good!
Devices that used Tegra [wikipedia.org]
Re: (Score:1)
Which is a little bit disappointing, considering the Tegra 4 performs really nicely in the Shield.
All I can say is (Score:1)
Damn Provos!
Re: (Score:1)
Indeed. I thought the article would be about Irish cyber-terrorists funded by American hypocrites.
P.S. I apologise for using the prefix "cyber".
Various wheels are beginning to turn (Score:5, Insightful)
Re:Various wheels are beginning to turn (Score:5, Interesting)
AMD and Intel will always get a win here or there; that's the nature of business. If you undercut someone enough to make a loss-leader product that's technically inferior, but sell it well enough, someone big will buy it so they can push out units. In the same way that no console is ever "state-of-the-art", there are myriad decisions where the balance of value, cost, specifications, and real-world performance combines to win the business.
But nVidia, it has to be said, generally has the lead. There are markets here and there and individual counter-examples, but nVidia really does the better job. As someone who owned one of the early 3dfx cards, went through the ATI Xpert@Work series and a myriad of cards in between, and on through to the present day, I can't honestly consider non-nVidia hardware nowadays, and I'll happily add £200 to a laptop price to get an equivalent model with nVidia graphics. And that's a laptop. And though I'm not your overclocking, every-fps-counts, twitch-gaming gamer, I play a damn lot of games and spend a lot of money on them, and my preference is nVidia on a laptop (game anywhere with one machine, even in a power cut, and not worry about 60/120fps pettiness) with the ability to run demanding OpenCL software.
A Steam Box using nVidia would have been my only choice. It would be suspicious and laughed at if they'd said to use AMD or Intel on a gaming box with such a high recommended spec (even though a lot of the work on Linux drivers has been focused on getting everything out of Intel chipsets). Remember, "SteamBox" means nothing - it's just a collection of hardware that runs SteamOS, so there will be AMD Steam Boxes from someone at some point (and they'll probably run AMD chips instead of Intel, too). They may be slightly cheaper, have slightly more bugs, and offer slightly less performance, but they won't be vastly different in terms of value for money if they are expected to be sold to people.
I would love a Steam Box / SteamOS. I'd probably never install it on anything. That's from someone who was on Steam from day one and has got his ex-wife, girlfriend, brother, and even parents into having their own personal Steam accounts (whether that's 100 hrs on TF2 for my brother, 1000 hrs on Bookworm for my mum, or 10 hours on point-and-click adventures for my girlfriend). I wouldn't give them a SteamBox, because they don't need one with personal laptops, but I imagine they could be a serious contender if we can get the line "Which console will you buy this year? PlayStation? Xbox? Wii? Or SteamBox?" into the public media.
However, the controller and the EXISTENCE of the OS are incredibly interesting. And the best bit is that a "Steam Box" doesn't exist as a single thing... you'll get people making "overclocker's Steam Machines" and budget Raspberry-Pi-style "Nano Steam Machines". THAT'S the exciting bit.
What card is in there is moot so long as people AREN'T able to tell just from playing on it. And AMD can play catch-up incredibly quickly if it becomes as popular as we hope. Hell, they only have to release one decent open driver for one particular chipset and EVERYONE will jump on it to make Steam Machines from it because it's the one with the open driver.
The biggest excitement? This is yet another device that will be in the home and may become a household name that runs Linux. Everyone has a TomTom or a Kindle or an Android device, and now we're pushing Linux into its traditionally weak market. Once you get a household with Linux on everything else but the home PC, how long is it before the home PC doesn't even come with a Windows license anymore? Hell, I see people selling Windows/Android laptops and netbooks and tablet PCs already.
That's the exciting part, not the announcement that the only decent gaming cards are going into a gaming computer that can use anything it likes.
Re: (Score:2)
I spent many years diagnosing blue screens and all kinds of weird behaviour with ATI/AMD graphics chips. That history doesn't just go away, and I hear enough from my IT-literate friends to think that it hasn't swung significantly in the other direction. We all have our opinions. To me, no amount of performance is worth data loss on a machine I might well be storing work on. And the benchmark differences just don't justify having to deal with that amount of hassle on any platform.
If every nanosecond matte
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Incidentally, the newest fglrx AMD graphics driver supports the new 3.12 kernel -- Nvidia's doesn't. There is also a huge performance boost for AMD cards with the 3.12 kernel; AMD is picking up speed on Linux. As far as quality goes, even on Windows Nvidia is better, so I wouldn't expect AMD to be better anytime soon. There is one place where AMD surpasses Nvidia, though: the open-source driver. Nvidia doesn't have an open-source driver, but some kids managed to hack a bit and create one; although not official and not
Re: (Score:2)
There is also a huge performance boost to AMD Cards with the 3.12 Kernel;
That is only [phoronix.com] because a problem with the ondemand performance governor was corrected [anzwix.com] (a quick way to check which governor is active is sketched just below).
I still have an AMD based system though (cpu, chipset, and gpu).
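For anyone curious which frequency-scaling governor their box is actually running, here is a minimal Python sketch. It assumes a Linux system that exposes cpufreq through sysfs (paths can vary by kernel and distro) and simply prints the active governor for each CPU:

    # Print the active cpufreq scaling governor for each CPU core.
    # Assumes Linux with cpufreq exposed under /sys (not guaranteed on all systems).
    import glob

    for path in sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor")):
        cpu = path.split("/")[5]  # e.g. "cpu0"
        with open(path) as f:
            print(cpu + ": " + f.read().strip())

If it prints "ondemand" everywhere, you're on the governor the fix above applies to; "performance" keeps the clocks pinned high instead.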
Digital IRA!? (Score:1)
The Irish Republican Army is digital now? Is that like a semi-military botnet?
Microsoft BOB (Score:2)
Sounds like Microsoft Bob all over again.
They should call Melinda Gates and get her input.
Re: (Score:2)
Tegra 5 gives Nvidia the chance to demonstrate their new ARM strategy of fusing 64-bit ARM cores (ARMv8 - desktop class) with PC-class AAA GPU cores.
Tegra 5 uses ARM's 32-bit Cortex-A15 (ARMv7) core. Tegra 6 will be Nvidia's first SoC to use their custom-designed 64-bit ARMv8 core.