This year, I made my first ever demo! I’ve been threatening to do so for ages now, but since the lockdown hit, I’ve really had no excuse not to. For those of you unfamiliar with the demoscene, it’s a small yet internationally recognised subculture revolving around computer art. Some of the best programmers and artists produce works for all sorts of computers and electronic equipment, pushing the machines and themselves to the very limit! There are demoscene parties taking place all over Europe (and some further afield). This year I entered NOVA — the UK’s main demoscene party — and had great fun doing so.
Demos and the Demoscene
The demoscene has been around for a while now. It started out when crackers learned to break the copy protection on games. To show off their prowess, these expert programmer-pirates would leave messages at the beginning of the game, often with music, scrollers and custom art. You can draw quite a few parallels to graffiti work — and I mean really good graffiti work! Some of these ‘cracktros’ are incredible! Eventually the cracktros became the main event, and the demoscene was born.
Back in those days, before the internet, demos were distributed on disk or via bulletin boards. You could swap disks with folks you knew, dial in to a bulletin board and download them, get disks in the mail, or go to a demoparty. Demo parties are still going today — stronger than ever, one could argue. The largest demoscene party is Revision, held in Saarbrücken, Germany. I’ve been a couple of times — you’ll see some of the finest computer art in the world at this event!
Parties tend to be built around competitions. There are several categories and you can enter as many as you like. The competitions range from oldschool demos, such as those written for the Amiga or the ZX Spectrum, to 64K PC demos, where the executable must fit within 64K. There are competitions for music, ranging from new school to tracked, and competitions for pixel art and digital photography. Something for everyone.
Quite often, folks get together to form a demoscene group: musicians teaming up with programmers and illustrators to create entries none of them could make on their own. Some of these groups have been going for decades, albeit with changing members.
In the past, computers were rather more restrictive than they are now. 3D rendering on the Amiga is quite the task, whereas it’s fairly trivial on a modern PC. To keep up the challenge, size-coding competitions were devised. The idea is simple — see how much you can do within 64K, 8K or 4K. Some competitions even go as far down as 256 bytes[³]! For reference, an average email is around 400 kilobytes.
Some of these competition entries are absolutely fabulous. One of my favourites is Fermi Paradox, a 64K entry by the demogroup Mercury. I’m still amazed not only that so much can be packed into such a small space, but that so much can be said in such a short space of time. The programming skill is only matched by the art.
One thing I’ve noticed is that there are very few Linux demos. That sounds odd to me, as Linux is quite the open platform, at least compared to Windows. However, Windows has been the mainstay of the PC demoscene for much of the scene’s history. I think this is because the graphics drivers tend to be better, Windows setups are much more homogeneous (you can generally rely on certain libraries being around), and there’s more of a history of the kinds of hacking techniques often used in demo production. As more demos were released for Windows, more demotools became available — tools such as Crinkler, which compresses your demos down to a tiny size, and Shader Minifier. Linux hadn’t caught up with its tooling until recently. The Linux 4K template I used works roughly like this:
- Compress the shaders into a header file with Shader Minifier
- Build the object files with some embedded assembly (which I don’t understand!)
- Strip out any of the bits of the ELF header we don’t need
- Link everything together with the Shoddy Minsize-Oriented Linker (SMOL)
- Compress everything with the vondehi program
Vondehi is quite funky! A tiny decompressor stub sits at the front of your program and decompresses the rest of it once the program is run.
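I don’t know vondehi’s internals, but the general idea of a self-extracting program can be sketched in a few lines of Python. Everything here is illustrative — the marker, the functions and the use of gzip are my inventions; the real thing is a hand-crafted assembly stub and a far better compressor:

```python
import gzip

# Hypothetical illustration of the self-extracting idea: a packed file is a
# tiny "stub" followed by the compressed payload. The stub's only job is to
# strip itself off, inflate the rest, and hand control to the result.
STUB = b"#STUB\n"  # stands in for the decompressor code itself

def pack(program: bytes) -> bytes:
    # prepend the "stub" to the gzip-compressed program
    return STUB + gzip.compress(program)

def run(packed: bytes) -> bytes:
    # the stub "executes": skip past itself and inflate the payload
    return gzip.decompress(packed[len(STUB):])

original = b"\x7fELF...pretend this is a 4K intro"
assert run(pack(original)) == original
```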
There are no doubt other subtle things going on. If you want to know more, there is a talk from Revision you can watch that explains the process in more detail. The codebase itself seems a little complicated, but some of that is down to the tools being made for Windows; there is a dependency on Mono (among other things), for example. Nevertheless, it’s quite possible to get down to an executable of 4K or smaller.
This demo runs on every Linux setup I’ve tested it on. The de-facto standard for demoparties seems to be Ubuntu (at the time of writing, Ubuntu 18 for Revision). All size-coding demos rely on some libraries being pre-installed. Unlike Windows, there is no guarantee a given library will be installed under Linux, so this template comes with a choice of libraries for creating the OpenGL context, including SDL2 and GTK3. I went with SDL2 in the end, as it resulted in a slightly smaller filesize.
Music is definitely my weakest area. I had no idea where to begin. Thankfully, the template is set up to use a tracker or synthesizer. I honestly don’t know how this bit works, but after looking around briefly, I found a program called Renoise. Apparently, there is a set of instruments made by demosceners that can be loaded into Renoise, whereupon you can make your soundtrack. It’s claimed these instruments tend to compress rather well. Oidos, 4KLang and Clinkster are recommended in the template. I went with Clinkster for no real reason at all.
Renoise is a Windows program, unfortunately, but it seems to run well enough under Wine. I was able to create a tracked piece of music fairly quickly (and I use ‘music’ in the loosest possible sense of the word!). With the file placed in the right directory, the template makefile rolls it in quite easily.
If you want to know more about demoscene music and listen to the work of someone who really knows what they are doing, check out h0ffman. He has a good write-up on his site about the kinds of hoops a demoscene composer needs to jump through.
Raymarching Menger Sponges
With everything in place, I set to work. I’d wanted to learn a bit more about fractals and how they’re rendered, so I started with the Menger sponge — a classic fractal. I like the look of it: something weirdly alien, yet constructed. I decided to use it as the main feature around which this intro would be based.
Normally, fractals are described using recursion. It’s an elegant way of generating an image, as fractals are self-similar. However, this isn’t possible in a fragment shader: GLSL doesn’t allow recursive function calls, so the fractal has to be built iteratively instead.
It’s worth explaining what a fragment shader is. In early 3D graphics, the pipeline that took your triangles and spat out pixels on your screen was fixed; you couldn’t really tweak it. Nowadays, you can alter many different stages of the pipeline using small programs called shaders. The fragment shader is the last shader in the line before the pixels appear. It’s sometimes called a pixel shader, though strictly you’re manipulating fragments — candidate pixels — rather than pixels themselves (close enough, I guess).
The fragment shader is where all the graphics magic happens. You are given a fragment and you output a colour for that fragment. The in-between step is where the fun is. We can use a technique known as raymarching.
Raymarching (or volume ray casting) is used in a lot of demos; it’s a powerful technique. As in raytracing, you shoot a ray from your camera through the screen at the fragment position. You then need to find out what this ray hits. To do that, you use a distance field.
Distance fields are a parametric way of defining a scene. The simplest example is a sphere, which can be defined as a 3D point and a radius — four numbers. From those four numbers, you can compute how far any point along your ray is from the sphere’s surface. The trick is that you can safely march the ray forward by exactly that distance — the field guarantees nothing is closer than that — and then check again how far away you are, repeating until you’re close enough to call it a hit.
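The demo’s real code is a GLSL shader, but the core marching loop is simple enough to sketch in Python (function names and parameters here are mine, not the template’s):

```python
import math

def sdf_sphere(p, centre, radius):
    # signed distance from point p to the surface of a sphere
    return math.dist(p, centre) - radius

def raymarch(origin, direction, sdf, max_steps=128, eps=1e-4, max_dist=100.0):
    # Sphere tracing: repeatedly step along the ray by the distance-field
    # value, which is guaranteed never to overshoot a surface.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        dist = sdf(p)
        if dist < eps:
            return t        # hit: distance travelled along the ray
        t += dist
        if t > max_dist:
            break
    return None             # miss: the ray escaped the scene

# A ray fired down +z at a unit sphere centred five units away
scene = lambda p: sdf_sphere(p, (0.0, 0.0, 5.0), 1.0)
hit = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene)
# hit == 4.0: the ray stops at the near surface of the sphere
```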
You can build up an entire scene this way, with multiple objects, realistic lighting, special effects and more! It’s a really powerful technique, made famous by the website Shadertoy and its creator, Inigo Quilez. His site details a number of mathematical formulas for distance fields, though I can’t claim to understand most of them. And all of this happens in a single fragment shader.
Back to our Menger sponge, then. Since we can’t do it recursively, we’ll have to go iteratively. One way to create the sponge is to start with the field for a large cube and effectively carve out the holes using smaller cuboids and a subtraction function. It turns out all of this is easy to do in a fragment shader if you have the right functions; an excellent set of such functions can be found on Mercury’s homepage.
Back to our demo. I had the idea of using a Menger sponge, but no idea about the actual art. Where to begin? What was I trying to say? I must admit, how hard this bit was really took me by surprise! I suppose that’s a bit of a cliché, right — the engineer trying to do an art? Still, undeterred: I’d been reading a little about the roleplaying game Numenera, and I liked the idea of long-lost advanced technology. I pictured a desert with an ancient, worn-down artefact that suddenly comes alive as we observe it. I had an idea, so off I went!
The desert part is quite easy — it’s just a plane with a noise offset on its distance field. I used Iq’s lighting model for the shadows and the ambient occlusion. With the sponge rendered, I ran the same noise function over its distance field, giving it the worn, broken look.
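I haven’t reproduced the demo’s actual noise function, but the plane-plus-noise trick looks something like this sketch, with a cheap sine product standing in for proper noise:

```python
import math

def noise2(x, z):
    # cheap sine-based stand-in for a real 2D noise function
    return math.sin(1.7 * x) * math.sin(2.3 * z)

def sdf_desert(p, amplitude=0.2):
    # a flat ground plane at y = 0, with its distance field offset by
    # noise to produce dunes
    x, y, z = p
    return y - amplitude * noise2(x, z)
```

One caveat: once you add the offset, the field is no longer an exact distance, so raymarchers typically step by only a fraction of the returned value to stay safe.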
The sky is rendered with a simple gradient, though a quirk of the lighting solution makes it glow a little, which is nice. That forms the majority of the scene. The final parts are the camera animations and the scene transitions at the beginning and the end; these are hard-coded into the shader.
One of the requirements of entering the demoscene competition at NOVA this year was a video recording of the demo. The global pandemic has made things harder for everyone, and I was unable to get a decent recording of mine. The PC I had available was just about good enough to render a smaller, non-anti-aliased version at around 10fps — not exactly smooth. Thankfully a member of the UK demoscene — d0pefish — offered to record the demo for me, so big thanks to him!
I was entered into the new school intro competition. I came last — or second, if we want to be nice. The winning entry was very good indeed, with better music (did I mention I’m crap at music?) and some nice reflections. Still, I was pretty happy to come second to such a good 4K with my first ever entry.
I also entered the teletext and photography competitions, actually winning the latter with my photo ‘Bark Europa’. I was very pleased to see a teletext competition — it’s oh-so-British. The teletext editor used is online, running in a web browser, and it’s a lot of fun to play with. I enjoyed whiling away an hour or two doodling.
I really enjoyed working on my first demo. Once I’d gotten the tools part out of the way, actually tinkering and playing with the shader was great fun. My intention is to improve the Linux 4K template: replacing some of the tools, removing the reliance on Python 2, and learning a bit more about SMOL and vondehi, whilst picking up a few more techniques for raymarching.
I don’t think there are any more parties I will enter this year. I’m aiming for Revision 2021 as well as the next NOVA.
I should probably find a musician who wants to collaborate with me! ^^;