
Hacking Hi-Def Graphics and Camerawork Into 4Kb 255

Posted by timothy
from the thumbdrive-could-hold-the-entire-world dept.
TRNick writes "The old home-computing art of hacking elaborate graphics and camerawork into tiny amounts of memory has been lost, right? Not so. The demoscene is keeping ingenious coding skills alive, and TechRadar explores the latest developments. Winner of the 4kb competition at 2009's Breakpoint party was RGBA's demo 'Elevated,' a gorgeous scrolling demo featuring photorealistic landscapes and music, which fits into the memory used by one of your PC's desktop icons. This is really impressive stuff."
This discussion has been archived. No new comments can be posted.

  • Wow (Score:5, Funny)

    by Lord Byron II (671689) on Monday July 20, 2009 @12:03PM (#28757105)

    It takes a 64MB avi to store the 4KB demo!

  • I wish (Score:4, Insightful)

    by Khyber (864651) <techkitsune@gmail.com> on Monday July 20, 2009 @12:06PM (#28757149) Homepage Journal

    I wish more developers would try doing things like this. I can imagine a game along the lines of Doom 3 or Quake 4 that would fit on a floppy disk with some proper code crunching.

    Of course, the downside is that it'd be all too easy to snag tiny files like that on a torrent site.

    • Re:I wish (Score:5, Informative)

      by molukki (980837) on Monday July 20, 2009 @12:11PM (#28757201) Homepage
      Try .kkrieger [theprodukkt.com] -- FPS game in 96kB
    • by Toveling (834894) *
      Been done: http://www.theprodukkt.com/kkrieger [theprodukkt.com] ... 96k game with fairly high-quality graphics
    • Re:I wish (Score:5, Interesting)

      by slim (1652) <john@ha[ ]up.net ['rtn' in gap]> on Monday July 20, 2009 @12:20PM (#28757333) Homepage

      Try Left 4K Dead [mojang.com]

      The fact is that cramming a lot of game into a small space is still worth doing.

      • Flash games sites have bandwidth costs
      • Console gamers develop high expectations as the hardware reaches the end of its lifecycle, and developers compete to squeeze more out of it
      • Console gamers develop high expectations as the hardware reaches the end of its lifecycle

        Microsoft and Nintendo virtually stopped developing and approving new software for their older consoles (Xbox and GameCube) once their replacements (Xbox 360 and Wii) hit the market. So I take it you're talking about Sony.

        • by slim (1652)

          Microsoft and Nintendo virtually stopped developing and approving new software for their older consoles (Xbox and GameCube) once their replacements (Xbox 360 and Wii) hit the market. So I take it you're talking about Sony.

          That would be *past* the end of their lifecycle.

          • by tepples (727027)

            That would be *past* the end of their lifecycle.

            Which ties in with my point. Near the end of a Microsoft console's lifecycle, attention drifts toward Games for Windows. And at the end of a Nintendo console's lifecycle, attention drifts toward the handheld (Game Boy Advance between N64 and GameCube; Nintendo DS between GameCube and Wii). Only Sony consoles appear to get games that push the hardware in its last year, and that's because Sony overlaps its consoles' lifecycles.

      • But that kind of cramming raises the cost drastically. So you have to find a balance. Which, except for rare cases, means next to no cramming.

    • One problem that is easily foreseeable: this would require very distinct and very specific libraries to be installed on all the computers you want it to run on. That may work if all you use is DirectX, but if you use any other code libraries... well, that really severely limits it. (Example: the exe I downloaded for the 4k demo crashed.)

      I would rather have to have an entire CD (incidentally, I wouldn't be able to use a floppy anymore) and be able to run it without downloading this, that, and one other li

    • Re: (Score:2, Insightful)

      by RobVB (1566105)

      Of course, the downside is that it'd be all too easy to snag tiny files like that on a torrent site.

      That, and games would take 4 hours to start on today's processors.

    • by Hatta (162192) *

      Would libraries on the host system count? I can see how it would be easy to make a really small game, if you have hundreds of megabytes of libraries on the host system to leverage. Much harder if you have to do it all from scratch. What are the rules for these kinds of things?

      • by Knos (30446)

        No extra download on top of a fresh installation of your (proprietary) OS.

        On Linux, it's much harder to define what's in and what's not, so it's a genuinely less interesting playing field for competitions.

    • An indie game titled 'Roboblitz' uses procedural generation to 'unpack' game textures the first time the game is run. It makes the installer smaller, but the unpacking process is still time- and processor-intensive. It saves transmission bandwidth, but doesn't do the end user any other favours.
      • by tepples (727027)

        It saves transmission bandwidth, but doesn't do the end user any other favours.

        Not everybody has the opportunity to live somewhere that can get cable or DSL. So a lot of people upgrade from dial-up to satellite or 3G Internet. These plans usually have burst performance not much better than ISDN and a 5 GB/mo transfer cap, which works out to a 14.4 kbps sustained throughput, for $60 per month plus equipment costs plus prohibitive overage fees. For them, saving transmission bandwidth becomes much more important.
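      The quoted throughput figure can be sanity-checked with a line of arithmetic. A quick back-of-the-envelope script, whose only inputs are the 5 GB cap and a 30-day month, lands in the same ballpark as the 14.4 kbps quoted above:

      ```python
      # Back-of-the-envelope: sustained bitrate allowed by a 5 GB/month transfer cap.
      cap_bytes = 5 * 10**9               # 5 GB cap, decimal gigabytes
      seconds_per_month = 30 * 24 * 3600  # roughly 2.59 million seconds

      sustained_kbps = cap_bytes * 8 / seconds_per_month / 1000
      print(f"{sustained_kbps:.1f} kbps sustained")  # ~15 kbps, i.e. dial-up territory
      ```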

    • It's already easy to snag gigabytes of stuff on a torrent site; I doubt making games smaller is going to make people any more inclined to pirate them unless they were truly gargantuan to begin with. Besides, having all your games generate the levels, characters, animations, sounds, AI etc. procedurally would create some rather annoying loading times - not worth it unless you then save the uncompressed game so you don't have to go through the same procedure every time you run it.

    • No, they wouldn't (Score:5, Insightful)

      by Sycraft-fu (314770) on Monday July 20, 2009 @12:44PM (#28757695)

      While demos like this are extremely neat, there are also some real limitations to what you can do. This is by no means an all inclusive list but some of the major limitations of making something like this:

      1) All graphics are completely procedural, as in mathematically described. That means you don't get to have an artist sit down and draw them. Puts limits on how they can look and demands a fair bit of self similarity.

      2) You use a MASSIVE amount of memory in relation to your file size. You may have noticed it sits at a black screen for a bit before running. Why? It is doing all its calculations, decompressing into memory. When running on my system, it took 350MB. Rather than storing a lot on disk and streaming as needed, you store little on disk and have to use tons of RAM.

      3) You can't have things like recorded voices in the game; they take too much space. Even with extremely efficient compression (which produces audible artifacts), voices will quickly make your game larger.

      4) All assembly coding. To do this, you are writing everything as efficiently as you can. That's wonderful, but hard to maintain. For a large project that is going to need to run on a lot of systems, be patched and so on, you want a higher-level language. Doing everything in assembly would be a nightmare to maintain.

      I could go on; this is just a sample. What it comes down to is that this is neat for demos. I -love- stuff like this; Farbrausch is one of my favourites for this sort of thing. However, it is not a realistic exercise for normal applications. You do not want to sacrifice everything just to have a small program footprint. On the contrary, if increasing the on-disk size makes a program better or more efficient, then you want to do that. Disk space is extremely cheap. Better to use more of it than to sacrifice in another area.

      • by rbarreira (836272)

        RAM consumption isn't bigger than for other games. With more CPU power, it should be possible to generate more things on the fly as opposed to precomputing them, reducing RAM consumption.

        As for voices, I guess we just have to wait for better text-to-voice synthesis algorithms.

        • It is bigger than for something that simple. Sure, Mass Effect uses more RAM... however, it does a hell of a lot more.

          Again, the real question is WHY. Storage is cheap. 9GB DVDs are less than a dollar apiece produced in quantity. Hard drives cost $0.10/GB and are dropping fast. Storage is just not a big deal. As such, it doesn't make sense to worry overly much about using it. I'm not saying waste space, but don't worry about trying to squeeze everything down to a minuscule size. Why would you want to waste CP

      • by Korin43 (881732)
        There's no particular reason you can't write it in C or C++. Compilers are pretty efficient these days.
      • by tepples (727027) <{moc.liamg} {ta} {selppet}> on Monday July 20, 2009 @03:25PM (#28760433) Homepage Journal

        All graphics are completely procedural, as in mathematically described. That means you don't get to have an artist sit down and draw them.

        Then give the artist a dataflow diagram, similar to GraphEdit, to build procedures.

        Puts limits on how they can look and demands a fair bit of self similarity.

        Nature is self-similar.

        You use a MASSIVE amount of memory in relation to your file size.

        But it doesn't have to be pushed over the wire or the optical disk, which becomes important as Xbox 360 games begin to run up against the 7 GB/disc limit and PC games begin to run up against monthly download caps [wildblue.com].

        You can't have things like voices and such in the game, takes too much space. Even with extremely efficient compression (which produces audible artifacts) voices will quickly make your game larger.

        I forget: how big was the S.A.M. synthesizer on the old 8-bit home micros?

        All assembly coding. To do this, you are writing everything as efficient as you can. That's wonderful, but hard to maintain. For a large project that is going to need to run on a lot of systems, be patched and so on, you want a higher level language. Doing everything in assembly would be a nightmare to maintain.

        Sure, demos aren't intended to be maintained much past the party, but some of the procedural techniques apply just as well to C or Lisp or ML or whatever if you want to trade off some efficiency to gain maintainability.

        On the contrary, if increasing the on disk size makes it better or more efficient, then you want to do that. Disk space is extremely cheap.

        Specifically, there are places where disk space is still a lot cheaper than bandwidth.

        • by Jesus_666 (702802)

          But it doesn't have to be pushed over the wire or the optical disk, which becomes important as Xbox 360 games begin to run up against the 7 GB/disc limit and PC games begin to run up against monthly download caps.

          And, of course, one can make the content procedural but cache the generated content on disk once it's generated. That way you don't even get the long loading times every time, yet still save bandwidth. If generation time is significantly shorter than download time (not unlikely as generating 300 Mi
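          That generate-once-then-cache pattern is easy to sketch. A minimal Python illustration follows; the texture generator is a hypothetical stand-in for any expensive procedural step, not anything RoboBlitz actually does:

          ```python
          import hashlib
          import os
          import pickle
          import random
          import tempfile

          def generate_texture(seed, size):
              # Hypothetical stand-in for an expensive procedural generator.
              rnd = random.Random(seed)
              return [rnd.randrange(256) for _ in range(size)]

          def load_texture(seed, size, cache_dir):
              """Generate procedurally on the first run; reload from the disk cache after."""
              key = hashlib.sha1(f"{seed}:{size}".encode()).hexdigest()
              path = os.path.join(cache_dir, key + ".tex")
              if os.path.exists(path):            # cache hit: skip the CPU-heavy generation
                  with open(path, "rb") as f:
                      return pickle.load(f)
              tex = generate_texture(seed, size)  # cache miss: pay the generation cost once
              with open(path, "wb") as f:
                  pickle.dump(tex, f)
              return tex

          cache_dir = tempfile.mkdtemp()
          first = load_texture(42, 64, cache_dir)   # generated this time
          second = load_texture(42, 64, cache_dir)  # served from disk from now on
          assert first == second
          ```

          Only the first run pays the generation cost; every later launch reads the cached bytes, so you save download bandwidth without repeating the long load.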

      • by Twinbee (767046)

        I think the sacrifice of RAM is an admirable goal, especially with 64 bit around the corner.

        I associate bloatware more with large file sizes, disk thrashing, latency, and general sluggishness than with massive memory hogging.

      • by wildsurf (535389)
        Better to use more of it than to sacrifice in another area.

        But then there are the Hollywood studios that spend years, hundreds of millions of dollars, and multi-teraflop render farms to produce mega-blockbusters that only engage 4,096 of your brain cells. YMMV.
      • by Mex (191941)

        This is clearly highly optimized for a certain OS; it doesn't work on Windows 7. Probably XP?

      • by TCM (130219)

        It looks like someone is confusing these with game demos or some such. Demos are supposed to make your jaw drop with four kilobytes, not demonstrate what future games might look like or show you good coding practices.

        Of course they could have had some artist sit down, draw beautiful pictures, integrate voice, write it in a high-level language. But that's completely not the point.

      • >>You use a MASSIVE amount of memory in relation to your file size. You may have noticed it sits at a black screen for a bit before running. Why? It is doing all its calculations, decompressing in to memory. When running on my system, it took 350MB.

        Honestly not trying to troll, but seriously, how much of that RAM and loading time is taken up by the 4K demo itself, and how much is actually DirectX libraries being loaded?
    • by HalAtWork (926717)
      Try out .kkrieger [kk.kema.at] from farbrausch/.theprodukkt [theprodukkt.com]. It's a first-person shooter in 96k.
    • What you forget is that optimizing code that much takes several times as long as making the whole game usually does. Also, you would have to do it after the playtesting, because much changes there, and you can't possibly debug that ultra-dense code. I think working with real designers, who are not programmers, would be very hard too.
      If you find a million people willing to pay for that extra work, I can find you the developers to do it. :)

      Also because you do not want to generate all the tex

      • What you forget is that optimizing code that much takes several times as long as making the whole game usually does.

        Usually, optimizing just 5-10% of a program can result in significant improvements in memory usage, size and execution speed.

        About the file size: I don't think anybody cares for the size. I downloaded a 33 GB torrent (65 GB unpacked) a week ago.

        Good for you.

        But, you know, there are people who care about download size.
        Me, for instance.

    • by Weasel Boy (13855)

      It's not the executable size of most games that takes so much space on your disk, it's all the pictures and sounds.

  • YouTube version (Score:5, Informative)

    by slim (1652) <john@ha[ ]up.net ['rtn' in gap]> on Monday July 20, 2009 @12:12PM (#28757213) Homepage

    I strongly suspect my video card won't be up to this, so I sought out a capture of it on YouTube:

    http://www.youtube.com/watch?v=_YWMGuh15nE [youtube.com]

  • by Maxmin (921568) on Monday July 20, 2009 @12:12PM (#28757215)

    Elevated by RGBA and TBC [youtube.com].

    Impressive, though it seems the demoscene has evolved to include the use of platform libraries (graphics/synthesizer). Impressive anyway - I'm assuming the imagery is all algorithmic.

    When I last paid attention to demos, it seemed to be all in the executable, code dealing directly with hardware.

    • by Maxmin (921568) on Monday July 20, 2009 @12:26PM (#28757425)

      On second viewing, two things come to mind:

      1) the opening scene from LOTR: The Two Towers, an amazing piece of design and rendering.

      2) Audio player visualizations. The giveaway is the contrails appearing in sync with the music.

      Is it possible RGBA are using a built-in visualization library, possibly from WMP? That would explain the high level of detail and apparent use of texture maps, which I'm guessing wouldn't fit into 4kb, algorithmic or not.

      This (admittedly weak) theory can be verified by disabling the visualization library for Windows Media Player [techspot.com].

      Anyone want to volunteer to verify this?

      • by ewhac (5844)

        Is it possible RGBA are using a built-in visualization library, possibly from WMP?

        Sheesh. Kids these days...

        Go read up on MOD files [wikipedia.org]. (Then go add Scenemusic.eu [scenemusic.eu] to your playlist.) Visual synchronization to MOD files has been going on for over 20 years. It's a solved problem.

        Schwab

      • Re: (Score:3, Insightful)

        by Maxmin (921568)

        I recant my rant. It really does look possible to do what RGBA did in 4KB. This thread forced me to go learn up ...

        Producing mountain-like terrains with Perlin noise [iquilezles.org].

        More by the same author (Inigo Quilez.) [iquilezles.org]

        Truly awesome and impressive. My eyes are opened, and I'm intrigued enough to try my hand at an implementation.
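        For anyone else tempted to try an implementation, the fractal-terrain trick is tiny in code as well as in bytes. Here is a simplified 1-D sketch using value noise (true Perlin noise uses gradients at the lattice points, but the octave-summing idea is the same):

        ```python
        import math
        import random

        def value_noise(x, seed=0):
            """Smoothly interpolate random values pinned to integer lattice points."""
            def lattice(i):
                return random.Random(i * 1000003 + seed).uniform(-1.0, 1.0)
            i = math.floor(x)
            f = x - i
            t = f * f * (3.0 - 2.0 * f)  # smoothstep fade between the two lattice values
            return lattice(i) * (1.0 - t) + lattice(i + 1) * t

        def fbm(x, octaves=5, seed=0):
            """Fractal sum: each octave doubles the frequency and halves the amplitude."""
            total, amp, freq = 0.0, 1.0, 1.0
            for o in range(octaves):
                total += amp * value_noise(x * freq, seed + o)
                amp *= 0.5
                freq *= 2.0
            return total

        # A 1-D mountain silhouette: heights along a line of 100 sample points.
        ridge = [fbm(x * 0.1) for x in range(100)]
        ```

        The whole generator is deterministic: the seed plus a few constants stand in for what would otherwise be megabytes of heightmap data.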

      • Why would they use an external visualization library? Since the sound is encoded in the executable (using a format similar to MIDI), it would be trivial to sync the contrails with those notes in code. Furthermore, there's no way that WMP is good enough to pick out individual notes from a raw audio feed like that. There are a lot of smart people working on audio rhythm/beat analysis, and nobody's produced anything nearly that good. Hell, nothing's that good even when the audio file is available, so the analysis
  • Meh (Score:3, Insightful)

    by abigsmurf (919188) on Monday July 20, 2009 @12:22PM (#28757359)
    I don't deny these demos are impressive, but the fact that they use existing libraries lessens the achievement for me.

    When DirectX basically has its own 3D engine, you're turning the task of creating a demo into generating sounds, textures and models from formulae.

    Libraries used should be limited to the minimum needed to create a window or change the display mode and shouldn't do any real grunt work; or there should be a second metric of RAM and swap file usage.
    • by Knos (30446)

      It seems to me you would also forbid a demo to use a 3d accelerator in the name of having a metric you can understand.

      That's alright, but that's not really pushing your hardware to its limits.

    • by mobby_6kl (668092)

      Yeah, I'll have to agree. It's still impressive, but when using DirectX it just isn't the same. I don't have the D3D SDK installed now, but I recall it was possible to get a teapot to render in a few lines of code, so I'd think by not including any dependencies in the binary the filesize could be very small without getting into any extreme measures.
      A few more things:

      • The demos don't work on 64bit Win7, whether in XP compatibility or not.
      • They can cram HD graphics in 4Kb, but can't make a website with
    • Re:Meh (Score:5, Interesting)

      by ivoras (455934) <ivoras&fer,hr> on Monday July 20, 2009 @01:20PM (#28758347) Homepage

      No, it just raises the bar. Back when all you had to work with was CGA at 320x200, it was impressive to show a rotating cube in 4k. Today, this demo nicely shows where the virtual bar is when even considering making a 4k demo. Just as you couldn't have done "Elevated" on your 100 MHz 486 under any conditions or with any libraries, so would you be laughed at if you presented a rotating cube or a wormhole today.

      Here's an excerpt from TFA:

      for those wondering, this a (too) low density flat mesh displaced with a procedural vertex shader. there arent any texturemaps for texturing, instead texturing (and shading) is defferred and computed procedurally in a full screen quad. this means there is zero overdraw for the quite expensive material at the cost of a single geometry pass. then another second full screen quad computes the motion blur. camera movements are computed by a shader too and not in the cpu, as only the gpu knows the procedural definition of the landscape.

      If you can do better, show your work :)

  • libraries (Score:5, Insightful)

    by Anonymous Coward on Monday July 20, 2009 @12:31PM (#28757487)

    Everyone going on here about how stupid it is that they used existing libraries: mind you, typical compo rules state that it must run on a base install. Nobody here is linking to myuberleetcode.dll or anything. And think about the freaking sound for a second, or better yet, try to write a 4k yourself and then come back and talk about how stupid it is.

    • You said basically what I was going to say. Unlike the demos of a decade ago, these can use 3D libraries and music playback, so all this is doing is providing some 3D models (most likely algorithmically generated), textures, and some music sequence data (very likely using MIDI instrument sets that are built into the OS, or something similar). The ones from ages ago had their own 3D and music engines built in, outputting finished images to the framebuffer and sound to the playback buffer.
      • Re: (Score:2, Insightful)

        by illm (1106673)
        Actually, it's using its own soft synth to generate the sounds, along with the textures, at startup. This is commonplace in 4K and 64K intros today. After the sounds have been generated, it's just about triggering the notes. If you listen again, you'll notice that the soundtrack doesn't contain that many instrument "hits", or notes if you will. The drums and bass are probably short repeated sequences, triggered by start and stop flags. This is me guessing though. And also, when you have the note sequences,
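        A toy version of the scheme described above: render each instrument once at startup, then mix it into the track at note trigger times. The frequencies, timings and envelope here are made up for illustration, not taken from any real intro:

        ```python
        import math

        SAMPLE_RATE = 44100

        def render_hit(freq, duration=0.25):
            """Precompute one instrument 'hit' at startup: a sine with a decaying envelope."""
            n = int(SAMPLE_RATE * duration)
            return [math.sin(2.0 * math.pi * freq * t / SAMPLE_RATE) * math.exp(-5.0 * t / n)
                    for t in range(n)]

        # The 'score' is just (start time in seconds, frequency in Hz) pairs:
        # a handful of bytes per note once packed.
        SCORE = [(0.0, 220.0), (0.25, 330.0), (0.5, 220.0)]

        def render_track(score, length=1.0):
            """Mix the precomputed hits into an output buffer at their trigger times."""
            out = [0.0] * int(SAMPLE_RATE * length)
            cache = {}
            for start, freq in score:
                if freq not in cache:          # each instrument is synthesized only once
                    cache[freq] = render_hit(freq)
                base = int(start * SAMPLE_RATE)
                for i, sample in enumerate(cache[freq]):
                    if base + i < len(out):
                        out[base + i] += sample
            return out

        track = render_track(SCORE)
        ```

        The expensive part (synthesis) runs once; playback is just copying precomputed samples at the right offsets, which is why the note sequence itself can be so small.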
      • by adolf (21054)

        Listen to the demo, and then come back and tell me that it's canned MIDI. It's got a bunch of analog synth goodness going on, which simply isn't a function of any stock MIDI synth library (which, as a rule, resemble simple sample-playback machines). And, sure, it loops - but then, so does almost everything else we call "music."

        It's actually very similar to old PC demos in the DOS days. Back in the day, they had a set of hardware with a set of APIs (between MS-DOS and BIOS), and they got to use all of it

  • by Anonymous Coward

    While I certainly admire the technical skill involved, the demoscene is more than that. It's a form of art.

    Just look at http://www.pouet.net/prod.php?which=31571 [pouet.net]

  • by argent (18001)

    Elevated is packed down into a 4k package, but it clearly uses many, many times that space when it runs. That's practically cheating... a real 4k demo would run in 4k of memory, total, usually including the screen buffer.

    When we did our entry for the BADGE killer demo contest in the '80s (a version of Lunar Lander that ran on the Amiga workbench and required you to land on top of your open windows) I felt we were already pushing the envelope of what was really "fair", since we were getting the "terrain" for

  • Link (Score:5, Funny)

    by eulernet (1132389) on Monday July 20, 2009 @02:01PM (#28759085)

    Torrent, anyone?

  • Don't know if it is a false positive or not, but Comodo screamed about "Win32.Garbage.xxxxx" malware when the file landed in my download folder. I prefer to remain paranoid rather than risk this thing being a nasty, so those that are likewise paranoid might just want to watch the video instead of running the .exe. Just FYI.
    • by Jesus_666 (702802)
      Virus scanners occasionally scream about legit demos. Small demos use clever techniques to shrink their storage footprint. Viruses do the same thing. The scanner detects a packed exe and sounds the alarm. It's true that the demo exhibits viruslike behavior - but that behavior is simply unpacking itself.
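      A rough illustration of why packed 4k intros trip scanner heuristics: compressed bytes are statistically close to random, and a tiny high-entropy executable looks a lot like a malware dropper. This is only a toy entropy measure, not any real AV engine's logic:

      ```python
      import math
      import zlib

      def entropy_bits_per_byte(data):
          """Shannon entropy of a byte string; 8.0 would be indistinguishable from random."""
          counts = [0] * 256
          for b in data:
              counts[b] += 1
          total = len(data)
          return -sum(c / total * math.log2(c / total) for c in counts if c)

      plain = b"procedurally generated landscapes and music in 4096 bytes " * 200
      packed = zlib.compress(plain, 9)   # stand-in for an executable packer

      # The packed blob is far smaller but looks much more random byte-for-byte,
      # which is exactly the statistical signature heuristic scanners react to.
      print(len(plain), len(packed))
      print(round(entropy_bits_per_byte(plain), 2), round(entropy_bits_per_byte(packed), 2))
      ```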
  • Explanations (Score:5, Informative)

    by Anonymous Coward on Monday July 20, 2009 @08:39PM (#28764283)

    Hi guys. I'm one of the programmers behind Elevated, Inigo Quilez. I was responsible for the programming of the visuals. Christian Ronde made the music and Rune Stubbe made the synthesizer and music player.

    Apparently some people cannot believe this, but I will say it myself so there is no more useless speculation going on: this is a 4096-byte executable demo. Not 4096 bytes of source code. It's a 4096-byte executable (actually, a few bytes less), an x86 binary, which plays a realtime animation and music demonstration without using any external data file. It uses a few d3d functions to generate a rectangle, to compile an hlsl shader and to set a projection matrix. That's it. I have read some people claiming there are "3D engines" built into DirectX; I must presume those assertions came from people who actually know little about computer graphics today. Also, obviously, Microsoft didn't make any RenderCool( D3DX_MOUNTAINS, D3DX_PLEASE ) function in any of their APIs, nor LoadTerabytesOfTextures( D3DX_ROCK_AND_SNOW, D3DX_FROM_HIDDEN_SYSTEM_FOLDER );

    The demo doesn't use any external library for sound or anything else. The demo could be recompiled on OpenGL/Linux very easily (it was OpenGL in fact, just ported to DX at the last minute), and be something around 4300 bytes. We went for DX to fit in 4096 bytes, to comply with the competition rules of Breakpoint, the party where we presented Elevated.

    Regarding the music, the demo not only encodes the music track, but also implements the instrument synthesis and track playback. The complete sound system takes about 900 of the 4096 bytes; it's mainly FPU code. To see how this is possible, you can have a look at sound synthesis and DSP for now.

    The "textures" are infinite, just as the terrain itself. You can travel as far as you want on the terrain; it never ends, and the same goes for the textures. The rock, vegetation and snow texturing takes about 100 bytes, although it uses some Perlin noise functions that take about 350 bytes. So in essence, we encoded mega-, tera-, peta- and exabytes of texture in a few hundred bytes. The price, of course, is that they are just too fractal-looking. But it did the job. Cameras are based on simple sine and cosine functions; the playback code is 150 bytes or so, and the camera data itself is exactly 4 bytes for each shot (a 16-bit random seed to feed the sin/cos functions with random frequencies and phases), a velocity and a FOV value. The rest of the sequencing data and playback code (to fade in, fade out, do the summer/winter transitions, and apply the brightness/contrast and color correction parameters) is around 400 bytes. The rendering is done in a "deferred" way, for those who know a little bit about computer graphics, which means the z-buffer is filled first and then a full-screen rectangle is drawn with a shader invocation. This shader computes the surface normals, does the texturing (lakes and sky included) and then does some tonemapping and motion blur. The shader is huge, around 1500 bytes. Another 800 bytes are used for basic operations such as opening a window, initializing the rendering surface, sending the synthesized sound to the sound card, implementing the rendering loop and listening for the ESC keypress.
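    To make the 4-bytes-per-shot idea concrete, here is a guessed-at reconstruction, not RGBA's actual code: unpack a few bit fields from a 16-bit seed, use them as frequencies and phases for sine/cosine motion, and the same seed replays the same camera move forever.

    ```python
    import math

    def camera_position(seed16, t):
        """Expand a 16-bit seed into a smooth camera path sampled at time t.
        The field layout below is a hypothetical reconstruction for illustration."""
        f1 = ((seed16 >> 12) & 0xF) + 1              # two 4-bit frequency fields
        f2 = ((seed16 >> 8) & 0xF) + 1
        p1 = ((seed16 >> 4) & 0xF) * math.pi / 8.0   # two 4-bit phase fields
        p2 = (seed16 & 0xF) * math.pi / 8.0

        x = math.sin(0.1 * f1 * t + p1)
        y = math.cos(0.1 * f2 * t + p2)
        z = 0.5 * math.sin(0.05 * (f1 + f2) * t)
        return (x, y, z)

    # One seed = one complete, perfectly repeatable camera shot.
    shot = [camera_position(0xBEEF, t * 0.1) for t in range(100)]
    ```

    Because the path is a pure function of the seed and time, no keyframe data needs to be stored at all; two bytes of seed select one curve out of 65536 possible camera moves.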

    All this code is written in assembler (nasm), for those who were saying we don't know what hardware means. We spoke to the machine in this demo, as much as you can in a modern OS, at least from user code. The C version of the demo (which we used during development and debugging) is close to 4200 bytes. The demo is also self-compressed; in fact, the first thing the demo does at runtime is to allocate some memory (350 megabytes), self-decompress there, and ask Windows to run from that memory location. The uncompressed demo as it comes from Visual Studio (cl+nasm) is 7 kilobytes if I remember correctly.

    Making a fully self-contained 4-minute audiovisual piece like this is possible if you know computers, programming, maths and rendering technologies, you apply a bit of imagination, and you are lucky to be the first to do it. So, those who
