Jay Taylor's notes


Quake's 3-D Engine: The Big Picture by Michael Abrash (2000) | Hacker News

Original source (news.ycombinator.com)
Tags: software-architecture game-programming john-carmack quake-3 quake-iii michael-abrash quake 2000s news.ycombinator.com
Clipped on: 2020-12-18

Oh man - of all my 'nerd crushes' Michael is at the absolute top.

I feel like he hasn't risen to the stature of John Carmack, but if you look at the things he's shipped in his career (no less than Quake AND Windows NT) and at the sheer volume of content he produced during those halcyon days in the late 80s/90s, when the PC emerged from dumb-as-rocks beige business box to premier gaming platform, he is a key figure.

I was too young and grew up too remote to have access to his writings as they were written, but as an older engineer going back over his Graphics Black Book, the Zen of Assembly books and his Dr. Dobb's articles, there are a lot of lessons you can learn that still have a lot of value today.

Actually one of the best lessons - and most easily accessible - is in the first chapter of his Graphics Black Book, where he talks about the human element of optimisation, and that it is critical to optimise the right thing. He uses the example of an engineer he knew who, in the era of early electronic calculators (still clunky and slow, but getting faster all the time), honed his slide rule skills to wipe the floor with any desktop calculator operator. But he was spitting into the wind; it was clear calculators were getting faster and there was no more road with slide rules.

Ultimately he wasted effort on a legacy technology rather than leveraging what was up and coming.

In many ways this mirrored the PC, in its early days, it was slow, dumb and unwieldy, but it was improving at a rate of knots and eventually came to dominate.

Both Michael's Black Book and Zen of Assembly are on GitHub and maintained as PDF, ePub and other formats. The technology itself isn't terribly applicable to computing today, but they are very readable and contain a lot of general-purpose problem solving and optimisation tips that are as applicable today as they were then.

https://github.com/jagregory/abrash-black-book

Few technical writers have emerged who match Michael Abrash, and it makes me so happy he's still hacking away at interesting problems today. Shame he doesn't write as much as he did, but I'm sure he's busy trying to usher in the metaverse (Snow Crash had a big influence on him)!


I lucked my way into a lunch with Michael somewhat under 10 years ago and had a chance to pick his brain, and of course express gratitude for his writing which had an influence on me when I was younger.

What I found really remarkable, though, was he also picked my brain -- a junior programmer with no notable accomplishments, working on not very interesting problems. And when I bumped into him on a couple occasions after that, he remembered what I worked on and would ask how it was going.


That's very good manners. And I suspect a prodigious memory was a part of what made him so successful over his career.

If Michael can remember your details, he'll surely remember all kinds of other details as well and have them at his command.


When I first got the black book, I wanted to skip over the first chapter and get to "the good stuff", but forced myself to read it. It was the most useful info in that book (even more obvious today as few use the kinds of tricks found in the remainder): you can take hours/days/longer optimizing to the nth degree some algorithm that is used in a for loop, but better still is taking a bigger picture look and finding out you can accomplish the same thing a different way without needing the for loop at all.
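
(A toy illustration of that point, mine rather than the book's: you can spend hours squeezing cycles out of the loop, or step back and realize the loop isn't needed at all.)

  #include <stdint.h>

  /* Hours of tuning might shave a few cycles off this... */
  uint64_t sum_loop(uint64_t n) {
      uint64_t s = 0;
      for (uint64_t i = 1; i <= n; i++)
          s += i;
      return s;
  }

  /* ...while a bigger-picture look shows the loop isn't needed at all. */
  uint64_t sum_closed_form(uint64_t n) {
      return n * (n + 1) / 2;
  }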


There, you said it. It's one of the best books on computer science, hacking, software engineering, high performance software... But my favorite part is always those chapter starters. Light philosophy and stories about taking a step back, looking for your own bias, measuring first then optimizing, trust but verify... I also appreciate how he always gave credit, and it was kind of funny to see him always impressed by John Carmack.

I wish I could meet him one day but I expect my brain would empty out and I'd just stay there nerd-giggling and gushing.


I forgot to add, Michael did a post-mortem at GDC a year or so after shipping and you can access it online still (requires Flash though?!)

https://www.gdcvault.com/play/1014236/Quake-A-Post-Mortem-an...


Someone please transcode and upload this to somewhere like YouTube already.


In Safari I'm seeing audio only but I was able to download & play it on my way to work. Great talk. I did miss seeing the slides when he talked about some BSP & PVS features but it was great nonetheless.

And who crashed that car? Was it Jeff R.?


> Both Michael's Black Book and Zen of Assembly are on Github and maintained as PDF, Epub and other formats

Do you know of a source for Dr. Dobb's too? I've found this so far [0] but no luck yet with more recent volumes.

[0]: https://archive.org/details/dr_dobbs_journal


The archive DVDs, thank goodness, have been preserved; those will cover you up until 2008.

https://archive.org/details/DDJDVD6


Great writing too! If I remember a chapter title correctly:

"The best optimizer is between your ears!"


Ooh, I hadn't seen those repos before! I'll check them out! I've worked on converting a book to useful markdown, and hopefully epub and pdf, and there could be useful processes there.


"here’s the secret to success in just two words: Ship it"

"After you finish the first 90% of a project, you have to finish the other 90%."

I feel like this conflict is reconciled today by releasing the first 90% and the second 90% as updates.

Notably, QTest was released on February 24, 1996, followed on June 22, 1996 by the full Quake release.


>> I feel like this conflict is reconciled today by releasing the first 90% and the second 90% as updates.

Damn, I feel like you just described everything wrong with the game industry in the last 5-10(?) years in one sentence.


> Damn, I feel like you just described everything wrong with the game industry in the last 5-10(?) years in one sentence.

Which makes sense, because that's about how long games have been online-first.

Games before that era usually worked under the assumption that an internet connection (or at least a higher-bandwidth one) wasn't available. Nowadays there's barely any reason to stop development after the GM is burned to disc.

I do wonder what's going to happen with long-term use though. My Game Boy cartridges still work, but what will happen with current-gen games? Even those bought physically usually depend on patches from a server that won't be available in the long-term future.


This is why I feel GOG (DRM free game install files) is so important for PC gaming. I'm not sure what the solution is for consoles though.


I think this is why we see so many "early access" games, and why so many of them never leave that stage.


But the flip-side is we're all used to receiving automatic and continuous bug-fixes and even enhancements.

Even the maligned day-0 mega-patch is fine if it lets the gold master ship earlier and fixes the bugs before I can play it.


In the short term, yes it's a good thing that automatic bug fixes and enhancements happen. But in the long term it incentivizes shipping unfinished, buggy products. It may be a net positive, but it is not an unalloyed benefit.


Michael Abrash is an excellent chronicler of technical history. Reading through his contributions is always time well spent.

See also https://fabiensanglard.net/ for a more contemporary take. His 'Black Books' on Wolfenstein and Doom are directly inspired by Abrash.


Insert obligatory reference to the fast inverse square root calculation which, thanks to its ability to more quickly generate (1/x) by using a bit shift with a magic number 0x5F3759DF, allowed performant lighting calculations in games. This trick was known long before Quake (shout out SGI), but was popularized there for the first time.


It's not the same without the original comments. Also, I don't know which is crazier, the magic number, or the fact that the code is casting a float to long bitwise and then working on it.


  i = * (long * ) &y; // evil floating point bit level hacking
  i = 0x5f3759df - (i >> 1); // what the fuck?
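
(For context, the full routine as it circulates from id's released Quake III source, famous comments included; reproduced from memory, so treat it as illustrative rather than gospel.)

  float Q_rsqrt(float number)
  {
      long i;
      float x2, y;
      const float threehalfs = 1.5F;

      x2 = number * 0.5F;
      y  = number;
      i  = * ( long * ) &y;                      // evil floating point bit level hacking
      i  = 0x5f3759df - ( i >> 1 );              // what the fuck?
      y  = * ( float * ) &i;
      y  = y * ( threehalfs - ( x2 * y * y ) );  // 1st iteration
  //  y  = y * ( threehalfs - ( x2 * y * y ) );  // 2nd iteration, this can be removed

      return y;
  }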


The bit manipulation already generates an approximation of 1/sqrt(x), not just 1/x.


Fast inverse square root wasn't in Quake until Quake III though, right?


That's correct, though there was plenty of other asm and bit-level evil graphics voodoo that made Quake able to run so fast on consumer hardware, a few generations earlier than if they had been stuck using slower division :)


The most notable trick in Quake I (at least IMHO) is the triangle rasterizer's scanline loop, which schedules the divide needed for perspective-correct texture mapping to the FPU so that this expensive divide can run in parallel with pixel-span rasterization on the integer ALU. To my young brain, which had only done 8- and 16-bit assembly on simple CPUs before, this was nothing short of rocket science :)
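
A rough sketch of the structure in C (my reconstruction of the idea, not the actual Quake source; the real engine got the overlap by hand-scheduling the FDIV in assembly, which plain C can't express, and the names below are made up):

  #define SPAN 16  /* perspective-correct u/v only every 16 pixels */

  /* stand-in texel fetch so the sketch is self-contained */
  static unsigned char sample_texture(float u, float v) {
      return (unsigned char)(((int)u ^ (int)v) & 0xFF);
  }

  void draw_scanline(float uoz, float voz, float ooz,    /* u/z, v/z, 1/z at x0 */
                     float duoz, float dvoz, float dooz, /* per-pixel gradients */
                     int x0, int x1, unsigned char *dest)
  {
      float z  = 1.0f / ooz;            /* divide for the left edge of the first span */
      float u0 = uoz * z, v0 = voz * z;

      for (int x = x0; x < x1; x += SPAN) {
          int n = (x1 - x < SPAN) ? (x1 - x) : SPAN;

          /* the expensive divide for the right edge of this span; this is the
             FDIV Quake issued early so it retired while the inner loop ran */
          float ooz1 = ooz + dooz * n;
          float z1   = 1.0f / ooz1;
          float u1   = (uoz + duoz * n) * z1;
          float v1   = (voz + dvoz * n) * z1;

          /* affine walk across the span (integer code in the real engine) */
          float du = (u1 - u0) / n, dv = (v1 - v0) / n;
          float u = u0, v = v0;
          for (int i = 0; i < n; i++) {
              dest[x + i] = sample_texture(u, v);
              u += du; v += dv;
          }

          uoz += duoz * n; voz += dvoz * n; ooz = ooz1;
          u0 = u1; v0 = v1;
      }
  }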


Damn, that's awesome


Quake 2 had an interesting model format. It was essentially a set of OBJ mesh keyframes that get linearly interpolated between, for the smoother animation you see in Quake 2 character models when compared to Quake 1. MD2 was what it was called. One of the first things I wrote as a 3D game developer was an OBJ loader, and then an MD2 loader.

Carmack's code is something I've always admired. The system he figured out for a client/server networking model in Quakeworld for Quake 1 is still essentially the way most multiplayer games work now, as far as I know.


The Quake 1 and 2 model formats were very similar, the interpolation was a rendering feature. After the Quake 1 engine source was released in late 1999 the interpolation was quickly added by fans (although it wasn't as easy as it sounds, as IIRC the original Quake 1 source (not QuakeWorld) didn't track entities across frames on the client-side, so that had to be added first).

The main difference between the two model formats was how they encoded vertex coordinates. They both stored X, Y, Z coords as one byte each. But MDL (Quake 1's format) had a uniform scale/offset for transforming these into the final coordinate space, whereas in MD2 each animation frame had its own scale and offset. This seems like an upgrade, but when combined with interpolation it could also result in a pretty ugly "vertex swimming" (jiggling) effect when you tried to portray subtle movements, like the idle anims for the player's weapons.
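
A sketch of the difference in C (field names are illustrative, not the exact on-disk layout):

  typedef struct { unsigned char v[3]; } packed_vert_t;  /* one byte per axis */

  /* Quake 1 MDL: a single scale/translate for the whole model */
  typedef struct { float scale[3], translate[3]; } mdl_header_t;

  /* Quake 2 MD2: scale/translate stored per animation frame */
  typedef struct { float scale[3], translate[3]; packed_vert_t *verts; } md2_frame_t;

  static void md2_decode(const md2_frame_t *f, int i, float out[3]) {
      for (int k = 0; k < 3; k++)
          out[k] = f->verts[i].v[k] * f->scale[k] + f->translate[k];
  }

  /* The Quake 2 rendering feature: lerp the decoded positions of two frames.
     Each frame has its own quantization grid, hence the "vertex swimming"
     on subtle motions described above. */
  static void md2_lerp(const md2_frame_t *a, const md2_frame_t *b,
                       int i, float t, float out[3]) {
      float pa[3], pb[3];
      md2_decode(a, i, pa);
      md2_decode(b, i, pb);
      for (int k = 0; k < 3; k++)
          out[k] = pa[k] + (pb[k] - pa[k]) * t;
  }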

One of the many things I admired about Quake is that there was a pretty uniform scale of detail to everything. There wasn't really anything that had higher polygon detail, texture resolution, or animation rate compared to anything else in the world. Everything looked very solid and consistent because of that. Quantized vertex coords were one of those tricks that seem restrictive, but it didn't hurt them with the game they designed.


While we're talking about clever quantizing, we should mention the vertex normal encoding. In MD2 (iirc, not sure about MD1) each vertex normal was stored as a byte which indexed into a pre-established array of unit vectors which were more or less uniformly distributed around a sphere. It was a creative way to have good-enough per-frame normals in a tiny amount of space without forcing the engine to do any painfully slow per-frame normal generation (with the floating point division and square root which that entailed).
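
Roughly, in C (table contents omitted here; id shipped the precomputed unit vectors in anorms.h, 162 of them if I remember right):

  #define NUM_PRESET_NORMALS 162
  extern const float preset_normals[NUM_PRESET_NORMALS][3]; /* the anorms.h table */

  /* Runtime decode: one table lookup, no normalization, no sqrt. */
  static const float *decode_normal(unsigned char index) {
      return preset_normals[index % NUM_PRESET_NORMALS];
  }

  /* Offline encode (done by the exporter): pick the closest table entry
     by maximum dot product against the true normal. */
  static unsigned char encode_normal(const float n[3]) {
      int best = 0;
      float best_dot = -2.0f;
      for (int i = 0; i < NUM_PRESET_NORMALS; i++) {
          float d = n[0] * preset_normals[i][0]
                  + n[1] * preset_normals[i][1]
                  + n[2] * preset_normals[i][2];
          if (d > best_dot) { best_dot = d; best = i; }
      }
      return (unsigned char)best;
  }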


True, but games also added client-side hit prediction, which the Quake engine games never did. As late as Quake 3, if you were using a railgun you would have to lead your shots. There was the ZeroPing mod for UT, then Valve put client-side hit prediction in Counter-Strike.

There was some controversy, as people said it makes it easier to write cheats; I remember even Cliffy B from Epic Games wrote a blog post about how terrible it is, but that is where games went and IMO it's a lot better than 1990s netcode.


Only later ports of the Q1 engine interpolated between model keyframes. The initial release of Quake didn't. And it didn't matter given the resolution people were playing at. For the same reason, the crude LOD for models that Q1 had was more than enough for the time.


A very important thing with software is that the most painful work can be done by the computer itself.

Tools like Lisp are essential in your toolbox for whatever language you use for coding. If you can script and automate and test everything life is much better.

Most of the pain in software development is self-inflicted. The more boring something is for a human, the easier it is for the computer to do it.

Lots of people coming from the "code must be efficient" front, like assembly, C and C++ programmers, simply ignored the interpreted, functional and painfully slow "you don't know how things are implemented" world, and vice versa.

But both worlds are complementary.

When I was a kid I discovered Numega SoftIce. That was an incredible debugger and you could automate everything.

Turns out you can do the same with gdb and lldb today, and bugs just pop up from automatic tests.


SoftIce is in a league of its own. No other debugger can fully freeze your OS while you hunt down that elusive crypto variable that the other developer obfuscated across several DLLs in order to hide its true purpose.

Man, so much fun was back then.

Story time. One of my clients wanted to reverse engineer a trading algorithm and the only option was the nuclear option: fully disassemble it and hunt in memory for this encrypted function, split across different DLLs, that held the entire algorithm. I warned the client it would take as much as half a year and could possibly run to more than $100k. He accepted, saying that if it was successful it could gain him millions. So I started the hunt. A few weeks down the road, my client, while we were chatting about the usual status and whatnot, dropped the bomb: this algorithm was actually old, as in WinXP era. And I asked, "do you have a WinXP variant of this that you'd be satisfied with if I manage to reverse it?" And he said he did. I took that one, prepared a WinXP machine with SoftIce in it, and the job was done 3 days later.

The level of control you have with SoftIce, you can't achieve it with anything else.


Is there something that would make it impossible to write the equivalent of SoftIce for Win7/Win10, or has it just not yet been done out of difficulty/laziness/lack of market/etc?


A testament to the quality and simplicity of the engine: there are a number of QuakeWorld-powered games still actively played today. I've been working on a Team Fortress continuation for the last decade or so, and I'm always surprised that our community continues to grow, both players and developers: https://www.fortressone.org/


Michael's Black Book got me skipping classes in college. His writing was perfectly paced between asm-fu and storytelling. I would start and never stop until I was sweating.


The "(2000)" date in the subject can't be correct if he refers to QuakeWorld as still being in development. This has to be from 1996.


The articles are from 1996 - 1997.

According to the appendix (1), this content was posted to Blues News in 2000.

First article: Quake's Game Engine: The Big Picture - Dr. Dobb's Sourcebook, Spring 1997, #279, pp. 58-60. Last article: Inside Quake: Visible-Surface Determination - Dr. Dobb's Sourcebook, Jan/Feb 1996, #255, pp. 41-45.

[1] https://www.bluesnews.com/abrash/credits.shtml


This is the book that started it all for me.


Question aside... Does anyone know if it is still possible to do 100% software rendering on today's hardware? I'm really interested in doing graphics by talking directly to the video card (e.g., draw a point, line, polygon, etc.) in BSD or Linux, and getting something like what the original Quake and Quake 2 software renderers did. I think those are masterpieces that pushed the capabilities of the PC at the time to the limit. I also think it could be a great way to learn graphics from scratch. Some time ago I played https://jonathanwhiting.com/games/knossu/ which captures the way Quake engines used to render on DOS.


Sure! All you need is a way to display a buffer of pixels on the screen. No other communication with the video card required, it's all CPU until the frame is ready.

Incidentally, I have an ongoing hobby project that does just that: a pure software platform-agnostic 3D renderer with zero non-optional dependencies written in Rust. It comes with a few example programs that use SDL2 and ncurses(!) to handle display and events. I'm planning to add a Wasm example at some point as well.

The project is heavily work-in-progress, and definitely not yet optimized nearly to Quake levels, but on the other hand the code should be relatively clean and useful for learning purposes.

https://github.com/jdahlstrom/retrofire
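
For the simplest version of the "just display a pixel buffer" idea in C, here's a minimal sketch assuming SDL2 as the display layer (everything up to SDL_UpdateTexture is pure CPU work; the GPU only scales the finished frame to the window):

  #include <SDL2/SDL.h>
  #include <stdint.h>

  #define W 320
  #define H 240

  int main(void) {
      static uint32_t pixels[W * H];            /* the software framebuffer */

      SDL_Init(SDL_INIT_VIDEO);
      SDL_Window   *win = SDL_CreateWindow("soft", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, W * 2, H * 2, 0);
      SDL_Renderer *ren = SDL_CreateRenderer(win, -1, 0);
      SDL_Texture  *tex = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                                            SDL_TEXTUREACCESS_STREAMING, W, H);

      for (int frame = 0; ; frame++) {
          SDL_Event e;
          while (SDL_PollEvent(&e))
              if (e.type == SDL_QUIT) return 0;

          /* draw with the CPU only: here just a scrolling gradient */
          for (int y = 0; y < H; y++)
              for (int x = 0; x < W; x++)
                  pixels[y * W + x] = 0xFF000000u
                                    | (uint32_t)((x + frame) & 0xFF) << 16
                                    | (uint32_t)(y & 0xFF) << 8;

          /* hand the finished frame to the GPU just to display (and scale) it */
          SDL_UpdateTexture(tex, NULL, pixels, W * (int)sizeof(uint32_t));
          SDL_RenderCopy(ren, tex, NULL, NULL);
          SDL_RenderPresent(ren);
      }
  }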



Sure, but that depends on a browser (something that I really dislike). I was thinking more about building something completely from scratch (near zero dependencies) and low footprint (maybe a tiny 3D C engine).


Get yourself a Pi and have fun coding like I and others used to do, in assembly no less. :)

https://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/os/

And if you happen to be motivated enough, maybe you end up doing something like this,

https://www.raspberrypi.org/blog/pifox-bare-metal-arm-assemb...



Look into Larrabee's architecture: https://tomforsyth1000.github.io/larrabee/larrabee.html

It's another one involving Michael Abrash and it has inspired many research papers. For example, Intel developed a CPU rasterizer to handle occlusion culling, and it can be faster than the GPU equivalent as it doesn't need to retrieve data back from the GPU to cull the draw calls: https://software.intel.com/content/www/us/en/develop/article...


With Google's SwiftShader, Crysis is somewhat playable at low, low specs on a 64-core EPYC Rome CPU: https://youtu.be/HuLsrr79-Pw?t=705 MS also includes a SW renderer in Win10, apparently for use as a fallback: https://docs.microsoft.com/en-us/windows/win32/direct3dartic...


It's absolutely still possible, but note that screen resolution has increased much faster than CPU speed - if you want software-rendered 320x240 like Quake II, or even 1024x768, you'll probably be OK, but if you want to go to full HD even modern CPUs will struggle, and 4K or higher is right out.

A common solution is to make a low-resolution texture (like 800x480), draw onto it with a software renderer, then use hardware rendering to scale it up to full-screen. You can even add filtering like faux scanlines or a convex CRT effect.


Definitely, as long as you have a way to draw pixels you can write a software renderer. You can even do this on simple hardware like an Arduino: https://blog.mclemon.org/8-bit-graphics-with-arduino-and-ssd...


Modern Doom source ports (ZDoom, Zandronum, etc.) keep the ability to use software rendering, and they work on any modern computer.


Quake was not really playable over a modem yet, but QuakeWorld was. It's mentioned in the article as the next thing coming.

Strange how much new ground was quickly covered in such a short time. Of course, the platform itself was developing to make this possible. One couldn't have done dynamically lit 3D worlds with a 386, or multiplayer 3D shooters with 2400 bps modems.

There have been attempts to board new, moving platforms like VR, but no breakthrough yet?


I don't own a VR headset, but I'm cautiously optimistic that it's on track to adoption this time. Just think about how rare video chat was, and now it's the norm.


What’s holding back VR (for 30 years, now) isn’t tech, it’s a lack of real use cases. Maybe AR for more of an office environment thing. But VR gaming is a niche, just conceptually.



Really puts things in perspective when he mentions polygon counts: "... entities can contain hundreds of polygons". In entity polygon counts, games are realistically only one order of magnitude higher than 20 years ago.


Nice read! It reminds me of Jeff Minter's article about the "Bathtub-curve" http://minotaurproject.co.uk/blog/?p=452


If you are interested in id Tech 3 you may also want to read this one: https://fabiensanglard.net/quake3/index.php


Michael Abrash! Wow, this brings back some fond memories of reading Zen of Assembly Language.


Just curious, how much of the black book is relevant for today?


Any word on a modern port of Quake to iOS/macOS?



EzQuake runs on MacOS http://www.ezquake.com/




