Ask Slashdot: What Is the Latest and Greatest In Computer Graphics Research? 95
OpenSourceAllTheWay writes: In the world of 2D and 3D visual content creation, new tricks that ship with commercial 2D or 3D software are almost always advertised as "fantastically innovative". But when you do some digging into who precisely invented the new "trick" or "method", and when, you often find that it was first pioneered many years ago by some little-known computer graphics researcher(s) at a university somewhere. Case in point: a flashy new piece of 3D VR software released in 2018 was actually based around a 3D calculation method first patented almost 10 years ago. Sometimes you even find that the latest computer graphics software tricks go back to little-known computer graphics research papers published anywhere from 15 to 25 years ago. So the question: what, in mid-2018, is the latest and greatest in 2D or 3D computer graphics research? And which academic/scientific publications or journals should one follow to keep abreast of the latest in computer graphics research?
Barnes (Score:1)
ACM TOG & SIGGRAPH (Score:5, Informative)
If you're looking for the great classics in computer graphics, many not-so-little-known graphics papers are in the SIGGRAPH proceedings.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Since 2003, all SIGGRAPH papers have been published in ACM TOG:
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:3)
Ha ha ha ha... wow, that is... wow, so wrong.
GPUs have increased many-fold in performance since 10 years ago. Not even the fastest video card from back then could power a VR headset today, or support modern gaming on a 4K monitor. CPUs have made less of an increase in raw clock speeds, but have made huge jumps in core count and instructions per clock (especially in specialized areas, like vector units). RAM capacities have gone through the roof. Drive technology has made the jump from HDD to SSD, and then f
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Your computer in 2018 essentially works the same as your computer from 1978.
In 1978, "my computer" worked by handing a deck of punch cards to the operator and then coming back 30 minutes later to pick up the printout that said there was a syntax error on line 447.
Re: (Score:2)
No, they do not work the same, unless you mean that they are all "von Neumann" computers with sequential semantics: binary, byte-addressed with 8-bit bytes, using two's complement arithmetic, etc.
A modern processor presents an interface that looks somewhat like the old type of computer, but internally it is quite different; in fact, most modern processors use a limited form of dataflow processing in order to extract parallelism from sequential programs. They also use statistical modelling (speculative
Re: (Score:1)
Your enthusiast's standard PC:
1978-1988 was huge (Z80 1KB to 80386 1MB) and the 1978 PC was worthless.
1988-1998 was pretty big (to P4 128MB) and the 1988 PC was worthless.
1998-2008 was a huge improvement (to Dual-Core 8GB) and the 1998 PC was worthless.
2008-2018 shows 2x in any metric (Quad Core 16GB) and the 2008 PC is probably still better than a modern netbook.
Re: (Score:2)
The top-end desktop CPUs in 2008 from Intel were the first generation of the Core i7 series. The maximum amount of RAM that CPU supported was 24GB. Today, the top-end Intel and AMD desktop processors support 128GB of memory (Core X series and Threadripper both) - and if you go over to the single-socket Xeon W you can get 512GB. Dual-socket CPUs support more now, and supported more back then, but we are looking at ~5 times the RAM capacities today that were available then. 2-4GB was typical for an average de
Re: (Score:3)
Re: (Score:2)
That is just my personal system, which is actually running a CPU from ~4 generations ago (so about 5 years old). I'm on the cusp of an upgrade (likely this year) to a 6-core at over 4GHz, and if I were into any applications that benefited from higher core counts, I could get a 16-core AMD or 18-core Intel processor. It all comes down to what an individual user needs, wants, and of course can afford.
If your point was more along the lines of "basic Internet and office application usage isn't any more complex
Re: (Score:3)
Re: None (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
As I noted, my comment was just in regard to the sweeping - and very incorrect - statement made by 110010001000. I work with computers, but I do not personally perform research to design new hardware or software approaches to graphics. The original question is also very wide-open, so much so that I do not feel I can directly answer it... but I didn't want to leave such a disparaging comment about computer technology unchallenged.
Could you perhaps enlighten me as to what you think I missed regarding the ori
Re: (Score:2)
Re: (Score:2)
LOL - nice try. Check out the plethora of articles I write on a regular basis, looking specifically at performance of different computer hardware in high-end applications:
https://www.pugetsystems.com/a... [pugetsystems.com]
(any written by William George are mine, in case that wasn't obvious from my /. username)
Re: (Score:2)
Re: (Score:2)
Oh for Pete's sake... so does that mean nothing NVIDIA or Intel says can be true, just because they sell GPUs / CPUs?
Sure, a company selling something certainly might stretch the truth to try to get folks to buy things - but that doesn't mean that everything said (or written) by anyone selling a product is automatically incorrect. Where I work, we actually don't do much advertising or make crazy claims: we run real-life software to see how it performs, and then publish the results publicly. We
Re: (Score:2)
does that mean nothing NVIDIA or Intel says can be true, just because they sell GPUs / CPUs?
Moving away from a boolean value of True or False, the correct answer is "not completely true". That much is guaranteed by marketing BS.
There are hard numbers that claim revolutionary speed and quality increases. These, too, are guaranteed to be not completely true, because of marketing BS. Core counts, clock speed, and memory size do not scale linearly with results, even though marketing comparisons imply they do.
So called "real world" tests are also deliberately misleading. As they say, YMMV. I
Re: (Score:1)
I think the OP was talking more about the mainstream. There just isn't as much going on.
Case in point: I don't game (disclaimer), but my primary computer today is a late-2013 MacBook Pro (15-inch Retina). The i7 it shipped with is still perfectly adequate today. The 8GB of RAM it shipped with still feels adequate, and is still in line with most new laptops out there. The 256GB SSD has been close to full for a while, but that's still the standard SSD size... But unlike a lot of newer models, I can upgrade my SSD at wi
Re: (Score:2)
> GPUs have increased many-fold in performance since 10 years ago.
So what? The OP was talking about stuff like innovative new mathematical/logical approaches to performing graphics rendering, not what hardware you're doing it with.
Do you not understand that just doing the same old shit on a bigger chip every year is actually not scientific progress?
Re: (Score:2)
As I mentioned, my reply was not so much directed at the original question as at the comment above mine (the original "None" comment).
Regarding the original Ask Slashdot question, I am not sure if I am qualified to answer... but it seems to me that the question is pretty wide-ranging and not very focused. The query includes mention of 2D and 3D visual content creation, consumer software, VR, graphics research, patents, and more. I would think that checking out conferences like SIGGRAPH and GTC would showcase a
Makes me think of Dilbert (Score:2)
Every time I see a thread about the latest/greatest in computers or graphics, I always think of this Dilbert [dilbert.com] from 1995.
Re: Makes me think of Dilbert (Score:2)
What kinda chip you got in there, a Dorito?
https://youtu.be/qpMvS1Q1sos [youtu.be]
Simple (Score:5, Informative)
quite simply, it's animation (Score:2)
It used to be that the actual "graphics" was the limiting factor in people's minds; now, with modern textures and engines, the realism has got to the point where the limitation is the way in which things move.
For example: https://www.youtube.com/watch?v=uFJvRYtjQ4c
https://github.com/sebastianstarke/AI4Animation
regards
John Jones
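For a concrete taste of the "how things move" side: the AI4Animation repo linked above includes work on phase-functioned neural networks, where the network's weights are themselves a smooth function of a cyclic gait phase. Below is a toy sketch of that weight-blending idea in Python/NumPy; all layer sizes are invented for illustration, and this is not code from the repo.

```python
import numpy as np

# Toy sketch of the "phase-functioned" idea: the network's weights are
# interpolated around a cyclic phase variable (0..2*pi) tracking where
# the character is in its gait cycle. Sizes below are hypothetical.

rng = np.random.default_rng(0)
N_CTRL = 4                 # weight sets spaced around the phase circle
IN, HID, OUT = 32, 64, 16  # made-up feature sizes

W1 = rng.normal(size=(N_CTRL, HID, IN))   # hidden-layer weights per control point
W2 = rng.normal(size=(N_CTRL, OUT, HID))  # output-layer weights per control point

def blend(weights, phase):
    """Cyclic Catmull-Rom interpolation of per-control-point weights."""
    t = phase / (2 * np.pi) * N_CTRL
    i1 = int(t) % N_CTRL
    i0, i2, i3 = (i1 - 1) % N_CTRL, (i1 + 1) % N_CTRL, (i1 + 2) % N_CTRL
    u = t - int(t)
    a, b, c, d = weights[i0], weights[i1], weights[i2], weights[i3]
    return (b + 0.5 * u * (c - a)
              + u**2 * (a - 2.5 * b + 2.0 * c - 0.5 * d)
              + u**3 * (1.5 * (b - c) + 0.5 * (d - a)))

def network(x, phase):
    h = np.maximum(blend(W1, phase) @ x, 0.0)  # ReLU hidden layer
    return blend(W2, phase) @ h                # e.g. a predicted pose update

print(network(rng.normal(size=IN), phase=1.3).shape)  # -> (16,)
```

In a real controller the inputs would be trajectory and joint features, the outputs a pose update, and the weight sets learned rather than random; the point here is only the phase-driven interpolation of the weights themselves.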
Uncurated resource (Score:4, Informative)
If you want something more curated, it becomes trickier, but a fun way of doing it is to look for the "technical papers preview" videos online for SIGGRAPH. A fairly long-standing tradition of that particular conference is to kick off the whole thing with a very short, usually humorous blurb of every technical paper being presented that year, done by the authors of each paper, in one giant marathon session on the first day. Each paper gets like 30 seconds to pitch its idea and show it off visually, and while you can't find the full 2-3 hour presentation that contains all of them, there's usually a shortened version online with some interesting/promising examples.
Pfffft, get with the times (Score:5, Funny)
Haven't you heard of Qbit Blockchain Deep-Learning Microservice Serverless 4D.js Rendering?
Re: (Score:2)
Re: (Score:1)
"Deep-Learning" is in there (sometimes called "deep neural networks"). Actually, I missed IOT, not AI.
Re: (Score:1)
Dammit Jim, I'm a troll, not a pedant.
Mining happened (Score:2, Interesting)
Well... (Score:1)
does mining cryptocurrency count?
Bridges (Score:2)
If you're more into math and art than optimization tricks, check out Bridges [bridgesmathart.org].
(I (re)?discovered math art about 3 years ago, and it sort of reminded me of the early-90s demoscene, except this time it's for grownups. I got into Bridges as soon as I heard of it, and this is my third year taking part in some way; there's also an art exhibition and a short film festival for those of us who'd rather just show off what we do instead of giving lectures.)
Re: (Score:2)
Thanks, that is not so much a treat as a whole candy shop.
Why is this not closed yet? (Score:1)
Re: (Score:2)
Denoising path traced images (Score:3)
A lot of recent graphics papers (mostly image processing, actually) use convolutional neural networks to do various things. There has been a lot of low-hanging fruit in the areas of denoising and various image manipulation techniques, so results in those areas have been transformed in the last few years.
One such "hot" area with application to the broader field of computer graphics is the denoising of path traced images. Path tracing uses stochastic light-bouncing techniques to produce a highly accurate image (in terms of lighting effects), but these images are noisy (due to the stochastic nature of the rendering process), requiring a large number of samples to "average away" the noise, and hence being slow to render. Neural networks can learn to remove the noise from such images, potentially allowing photorealistic images to be created extremely rapidly, perhaps even in real time. In my view, this is the most exciting, game-changing area in graphics at the moment.
As an average user... (Score:2)
The most significant jump in graphical improvement came from a Voodoo card. Since then, everything has seemed incremental in comparison.
SIGGRAPH TPPT is exactly what you asked for! (Score:2)
SIGGRAPH is the ACM computer graphics research conference. You won't find anything more cutting-edge. Each year they produce a video, "SIGGRAPH $YEAR: Technical Papers Preview Trailer". This is exactly what the OP was looking for. Here's 2017's video:
https://www.youtube.com/watch?... [youtube.com]
Re: (Score:1)
The 2018 edition is here: https://www.youtube.com/watch?... [youtube.com]
Follow @id_aa_carmack on twitter (Score:3)
Oh, that's easy.
Follow John Carmack on Twitter [twitter.com]!
Who cares if it's new if it's new to you? (Score:2)
The really fundamental advances take a long time to be fully explored. There is little of significance that doesn't build upon earlier work.
Check out Conformal Geometric Algebra [cam.ac.uk], which is the basis for the company Geomerics' [siliconstudio.co.jp] Enlighten software for real-time global radiosity lighting in games. (Now part of ARM / Silicon Studio.)
See the lectures linked from the first link, in particular lecture 7 on CGA. These are by Chris Doran, one of the founders of Geomerics and a member of the Cambridge GA group. Also see Leo D
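For a flavor of why CGA appeals to graphics people (a standard result from those lectures, not anything novel): Euclidean points are embedded as null vectors of a 5D space by adding two extra null basis vectors, one for the origin and one for infinity:

$$ X = x + \tfrac{1}{2}\,x^{2}\,e_\infty + e_0, \qquad X^{2} = 0, \qquad X \cdot Y = -\tfrac{1}{2}\,\lVert x - y \rVert^{2} $$

Distances thus become inner products, and spheres, planes, circles, rotations, and translations all become uniform algebraic objects, which is part of what makes the framework practical for real-time lighting code.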