Erik McClure

DirectX Is Terrifying


About two months ago, I got a new laptop and proceeded to load all my projects on it. Despite everything compiling fine, my graphics engine that used DirectX mysteriously crashed upon running. I immediately suspected either a configuration issue or a driver issue, but this seemed weird because my laptop had a newer graphics card than my desktop. Why was it crashing on newer hardware? Things got even more bizarre once I narrowed down the issue - it was in my shader assignment code, which hadn’t been touched in almost 2 years. While I initially suspected a shader compilation issue, there was no such error in the logs. All the shaders compiled fine, and then… didn’t work.

Now, if this error had also been happening on my desktop, I would have immediately started digging through my constant assignments, followed by the vertex buffers assigned to the shader, but again, all of this ran perfectly fine on my desktop. I was completely baffled as to why things weren’t working properly. I had eliminated all possible errors I could think of that would have resulted from moving the project from my desktop to my laptop: none of the media files were missing, all the shaders compiled, all the relative paths were correct, I was using the exact same compiler as before with all the appropriate updates. I even updated drivers on both computers, but it stubbornly refused to work on the laptop while running fine on the desktop.

Then I found something that nearly made me shit my pants.

if(profile <= VERTEX_SHADER_5_0 && _lastVS != shader) {
  //...
} else if(profile <= PIXEL_SHADER_5_0 && _lastPS != shader) { 
  //...
} else if(profile <= GEOMETRY_SHADER_5_0 && _lastGS != shader) {
  //...
}
Like any sane graphics engine, I do some very simple caching by keeping track of the last shader I assigned and only setting the shader if it had actually changed. These if statements, however, have a very stupid but subtle bug that took me quite a while to catch. They’re a standard range exclusion chain that figures out what type of shader a given shader version is. If it’s less than, say, 5, it’s a vertex shader. Otherwise, if it’s less than 10, that means it’s in the range 5-10 and is a pixel shader. Otherwise, if it’s less than 15, it must be in the range 10-15, ad infinitum. The idea is that you don’t need to check if the value is greater than 5 because the failure of the previous statement already implies that. However, adding that cache check on the end breaks all of this, because now you could be in the range 0-5, but the cache check could fail, throwing you down to the next statement checking to see if you’re below 10. Because you’re in the range 0-5, you’re of course below 10, and the cache check will ALWAYS succeed, because no vertex shader would ever be in the pixel shader cache! All my vertex shaders were being sent to DirectX as pixel shaders after their initial assignment!
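The fix, for the record, is to nest the cache check inside the range check instead of chaining it onto the same condition, so a cache miss can never fall through into the wrong branch. Something along these lines (a sketch using the same names as above):

if(profile <= VERTEX_SHADER_5_0) {
  if(_lastVS != shader) {
    //... assign vertex shader
  }
} else if(profile <= PIXEL_SHADER_5_0) {
  if(_lastPS != shader) {
    //... assign pixel shader
  }
} else if(profile <= GEOMETRY_SHADER_5_0) {
  if(_lastGS != shader) {
    //... assign geometry shader
  }
}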

For almost 2 years, I had been feeding DirectX total bullshit, and had even tested it on multiple other computers, and it had never given me a single warning, error, crash, or any indication whatsoever that my code was completely fucking broken, in either debug mode or release mode. Somehow, deep in the depths of nVidia’s insane DirectX driver, it had managed to detect that I had just tried to assign a vertex shader to a pixel shader, and either ignored it completely, or silently fixed my catastrophic fuckup. However, my laptop had the mobile drivers, which for some reason did not include this failsafe, and so it actually crashed like it was supposed to.

While this was an incredibly stupid bug that I must have written while sleep deprived or drunk (which is impressive, because I don’t actually drink), it was simply impossible for me to catch because it produced zero errors or warnings. As a result, this bug has the honor of being both the dumbest and the longest-living bug of mine, ever. I’ve checked every location I know of for any indication that anything was wrong, including hooking into the debug log of DirectX and dumping all its messages. Nothing. Nada. Zilch. Zero.

I’ve heard stories about the insane bullshit nVidia’s drivers do, but this is fucking terrifying.

Alas, there is more. I had been experimenting with Direct2D as an alternative because, well, it’s a 2D engine, right? After getting text rendering working, a change in string caching suddenly broke the entire program. It broke in a particularly bizarre way, because it seemed to just stop rendering halfway through the scene. It took almost an hour of debugging for me to finally confirm that the moment I was rendering a particular text string, the Direct2D driver just stopped. No errors were thrown. No warnings could be found. Direct2D’s failure state was apparently to simply make every single function call silently fail with no indication that it was failing in the first place. It didn’t even tell me that the device was missing or that I needed to recreate it. The text render call was made and then every single subsequent call was ignored and the backbuffer was forever frozen to that half-drawn frame.

The error itself didn’t seem to make any more sense, either. I was passing a perfectly valid string to Direct2D, but because that string originated in a different DLL, it apparently made Direct2D completely shit itself. Copying the string onto the stack, however, worked (which itself could only work if the original string was valid).

The cherry on top of all this is when I discovered that Direct2D’s matrix rotation constructor takes degrees instead of radians, unlike every single other mathematical function in the standard library. Even DirectX takes radians!
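To illustrate, here’s roughly what the two look like side by side. This is a sketch from memory of the relevant helpers, so take the exact signatures with a grain of salt:

#include <d2d1.h>
#include <d2d1helper.h>
#include <DirectXMath.h>

void RotateBy45Degrees()
{
  // Direct2D: the rotation matrix helper takes DEGREES, so a 45-degree rotation is just 45.0f.
  D2D1::Matrix3x2F rot2D = D2D1::Matrix3x2F::Rotation(45.0f, D2D1::Point2F(0.0f, 0.0f));

  // DirectXMath, like basically every other math library: the same rotation is pi/4 RADIANS.
  DirectX::XMMATRIX rot3D = DirectX::XMMatrixRotationZ(DirectX::XM_PIDIV4);
}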

WHO WRITES THESE APIs?!


Everyone Does sRGB Wrong Because Everyone Else Does sRGB Wrong


Did you know that CSS3 does all its linear gradients and color interpolation completely wrong? All color values in CSS3 are in the sRGB color space, because that’s the color space that gets displayed on our monitor. However, the problem is that the sRGB color space looks like this:

[Image: sRGB gamma curve]

Trying to do a linear interpolation along a nonlinear curve doesn’t work very well. Instead, you’re supposed to linearize your color values, transforming the sRGB curve to the linear RGB curve before doing your operation, and then transforming it back into the sRGB curve. This is gamma correction. Here are comparisons between gradients, transitions, alpha blending, and image resizing done directly in sRGB space (assuming your browser complies with the W3C spec) versus in linear RGB:

[Image: side-by-side comparisons, sRGB vs. linear RGB]
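For reference, the “correct” versions above are doing nothing more complicated than decoding each channel from sRGB to linear, interpolating there, and encoding the result back. A minimal sketch, assuming normalized float channels in [0,1] and the standard sRGB transfer function:

#include <cmath>

// Decode an sRGB channel value in [0,1] to linear RGB.
float srgb_to_linear(float c) {
  return (c <= 0.04045f) ? c / 12.92f : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

// Encode a linear RGB channel value in [0,1] back to sRGB.
float linear_to_srgb(float c) {
  return (c <= 0.0031308f) ? c * 12.92f : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}

// Gamma-correct interpolation between two sRGB channel values:
// linearize, lerp, convert back. Alpha blending is the same lerp with t = alpha.
float lerp_srgb(float a, float b, float t) {
  float la = srgb_to_linear(a);
  float lb = srgb_to_linear(b);
  return linear_to_srgb(la + (lb - la) * t);
}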
At this point you’ve probably seen a bazillion posts about how you’re doing color interpolation wrong, or gradients wrong, or alpha blending wrong, and the reason is because… you’re doing it wrong. But one can hardly blame you, because everyone is doing it wrong. CSS does it wrong because SVG does it wrong because Photoshop does it wrong because everyone else does it wrong. It’s all wrong.

The amazing thing here is that the W3C is entirely aware of how wrong CSS3 linear gradients are, but did it anyway to be consistent with everything else that does them wrong. It’s interesting that while SVG is wrong by default, it does provide a way to fix this, via color-interpolation. Of course, CSS doesn’t have this property yet, so literally all gradients and transitions on the web are wrong because there is no other choice. Even if CSS provided a way to fix this, it would still have to default to being wrong.

It seems we have reached a point where, after years of doing sRGB interpolation incorrectly, we continue to do it wrong not because we don’t know better, but because everyone else is doing it wrong. A single bad choice made long ago has locked us into compatibility hell: we got it wrong the first time, so now we have to keep getting it wrong, because everyone expects the wrong result.

It doesn’t help that we don’t always necessarily want the correct interpolation. The reason direct interpolation in sRGB is wrong is because it changes the perceived luminosity. Notice that when going from red to green, the sRGB gradient gets darker in the middle of the transition, while the gamma-corrected one has constant perceived luminosity. However, an artist may prefer the sRGB curve to the linearized one because it puts more emphasis on red and green. This problem gets worse when artists use toolchains inside sRGB and unknowingly compensate for any gamma errors such that the result is roughly what one would expect. This is a particular issue in games, because games really do need gamma-correct lighting pipelines, but the GUIs were often built using incorrect sRGB interpolation, so doing gamma-correct alpha blending gives you the wrong result because the alpha values were already chosen to compensate for incorrect blending.

In short, this is quite a mess we’ve gotten ourselves into, but I think the most important thing we can do is give people the option of having a gamma-correct pipeline. It is not difficult to selectively blend things with proper gamma correction. We need to have things like SVG’s color-interpolation property in CSS, and other software needs to provide optional gamma-correct pipelines (I’m looking at you, Photoshop).

Maybe, eventually, we can dig ourselves out of this sRGB hell we’ve gotten ourselves into.


Mathematical Notation Is Awful


Today, a friend asked me for help figuring out how to calculate the standard deviation over a discrete probability distribution. I pulled up my notes from college and was able to correctly calculate the standard deviation they had been unable to derive after hours upon hours of searching the internet and trying to piece together poor explanations from questionable sources. The crux of the problem was, as I had suspected, the astonishingly bad notation involved with this particular calculation. You see, the expected value of a given distribution $$ X $$ is expressed as $$ E[X] $$, which is calculated using the following formula:

\[ E[X] = \sum_{i=1}^{\infty} x_i p(x_i) \]
The standard deviation is the square root of the variance, and the variance is given in terms of the expected value.
\[ Var(X) = E[X^2] - (E[X])^2 \]
Except that $$ E[X^2] $$ is of course completely different from $$ (E[X])^2 $$, but it gets worse, because $$ E[X^2] $$ makes no notational sense whatsoever. For any other function in math, writing $$ f(x^2) $$ means going through and substituting $$ x $$ with $$ x^2 $$. In this case, however, $$ E[X] $$ actually doesn’t have anything to do with the resulting equation, because $$ X \neq x_i $$, and as a result, the equation for $$ E[X^2] $$ is this:
\[ E[X^2] = \sum_i x_i^2 p(x_i) \]
Only the first $$ x_i $$ is squared. $$ p(x_i) $$ isn’t, because it doesn’t make any sense in the first place. It should really be just $$ P_{Xi} $$ or something, because it’s a discrete value, not a function! It would also explain why the $$ x_i $$ inside $$ p() $$ isn’t squared - because it doesn’t even exist, it’s just a gross abuse of notation. This situation is so bloody confusing I even explicitly laid out the equation for $$ E[X^2] $$ in my own notes, presumably to prevent me from trying to figure out what the hell was going on in the middle of my final.
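This is, incidentally, why I find it so much easier to just write the thing out as a program. Stripped of the notation, the entire calculation is trivial. Here’s a rough sketch, where the distribution is nothing more than a list of (value, probability) pairs:

#include <cmath>
#include <utility>
#include <vector>

// A discrete distribution: a list of (x_i, p(x_i)) pairs whose probabilities sum to 1.
using Distribution = std::vector<std::pair<double, double>>;

// E[X] = sum over i of x_i * p(x_i)
double expected_value(const Distribution& d) {
  double e = 0.0;
  for(auto& [x, p] : d) e += x * p;
  return e;
}

// E[X^2] = sum over i of x_i^2 * p(x_i)  (only x_i is squared, NOT p(x_i))
double expected_value_of_square(const Distribution& d) {
  double e = 0.0;
  for(auto& [x, p] : d) e += x * x * p;
  return e;
}

// Var(X) = E[X^2] - (E[X])^2, and the standard deviation is its square root.
double standard_deviation(const Distribution& d) {
  double mean = expected_value(d);
  return std::sqrt(expected_value_of_square(d) - mean * mean);
}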

That, however, was only the beginning. Another question required them to find the covariance between two separate discrete distributions, $$ X $$ and $$ Y $$. I have never actually done covariance, so my notes were of no help here, and I was forced to return to Wikipedia, which gives this helpful equation.

\[ cov(X,Y) = E[XY] - E[X]E[Y] \]
Oh shit. I’ve already established that $$ E[X^2] $$ is impossible to determine because the notation doesn’t rely on any obvious rules, which means that $$ E[XY] $$ could evaluate to god knows what. Luckily, Wikipedia has an alternative calculation method:
\[ cov(X,Y) = \frac{1}{n}\sum_{i=1}^{n} (x_i - E(X))(y_i - E(Y)) \]
This almost works, except for two problems. One, $$ \frac{1}{n} $$ doesn’t actually work because we have a nonuniform discrete probability distribution, so we have to substitute multiplying in the probability mass function $$ p(x_i,y_i) $$ instead. Two, Wikipedia refers to $$ E(X) $$ and $$ E(Y) $$ as the means, not the expected values. This gets even more confusing because, at the beginning of the Wikipedia article, it used brackets ($$ E[X] $$), and now it’s using parentheses ($$ E(X) $$). Is that the same value? Is it something completely different? Calling it the mean would be confusing because the average of a given data set isn’t necessarily the same as the average expected value of a probability distribution, which is why we call it the expected value. But naturally, I quickly discovered that yes, the mean and the average and the expected value are all exactly the same thing! Also, I still don’t know why Wikipedia suddenly switched to $$ E(X) $$ instead of $$ E[X] $$, because it still means the exact same goddamn thing.
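For the record, after making that substitution, the formula I actually ended up using looks like this (sticking with brackets, since they apparently mean the same thing anyway):
\[ cov(X,Y) = \sum_{i} p(x_i,y_i)(x_i - E[X])(y_i - E[Y]) \]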

We’re up to, what, five different ways of saying the same thing? At least, I’m assuming it’s the same thing, but there could be some incredibly subtle distinction between the two that nobody ever explains anywhere except in some academic paper locked up behind a paywall that was published 30 years ago, because apparently mathematicians are okay with this.

Even then, this is just one instance where the ambiguity and redundancy in our mathematical notation has caused enormous confusion. I find it particularly telling that the most difficult part about figuring out any mathematical equation for me is usually to simply figure out what all the goddamn notation even means, because usually most of it isn’t explained at all. Do you know how many ways we have of taking the derivative of something?

$$ f'(x) $$ is the same as $$ \frac{dy}{dx} $$ or $$ \frac{df}{dx} $$ or even $$ \frac{d}{dx}f(x) $$, which is the same as $$ \dot x $$, which is the same as $$ Df $$, which is technically the same as $$ D_xf(x) $$ and also $$ D_xy $$, which is also the same as $$ f_x(x) $$ provided x is the only variable, because taking the partial derivative of a function with only one variable is the exact same as taking the derivative in the first place, and I’ve actually seen math papers abuse this fact instead of using some other sane notation for the derivative. And that’s just for the derivative!

Don’t even get me started on multiplication, where we use $$ 2 \times 2 $$ in elementary school, $$ * $$ on computers, but use $$ \cdot $$ or simply stick two things next to each other in traditional mathematics. Not only is using $$ \times $$ confusing as a multiplicative operator when you have $$ x $$ floating around, but it’s a real operator! It means cross product in vector analysis. Of course, the $$ \cdot $$ also doubles as meaning the Dot Product, which is at least nominally acceptable since a dot product does reduce to a simple multiplication of scalar values. The Outer Product is generally given as $$ \otimes $$, unless you’re in Geometric Algebra, in which case it’s given by $$ \wedge $$, which of course means AND in binary logic. Geometric Algebra then re-uses the cross product symbol $$ \times $$ to instead mean commutator product, and also defines the regressive product as the dual of the outer product, which uses $$ \nabla $$. This conflicts with the gradient operator in multivariable calculus, which uses the exact same symbol in a totally different context, and just for fun it also defined $$ * $$ as the “scalar” product, just to make sure every possible operator has been violently hijacked to mean something completely unexpected.

This is just one area of mathematics - it is common for many different subfields of math to redefine operators into their own meaning and god forbid any of these fields actually come into contact with each other because then no one knows what the hell is going on. Math is a language that is about as consistent as English, and that’s on a good day.

I am sick and tired of people complaining that nobody likes math when they refuse to admit that mathematical notation sucks, and is a major roadblock for many students. It is useful only for the advanced mathematics that takes place in university graduate programs and research laboratories. It’s hard enough to teach people calculus, let alone expose them to something useful like statistical analysis or matrix algebra that is relevant in our modern world, when the notation looks like Greek and makes about as much sense as English pronunciation rules. We simply cannot introduce people to advanced math by writing a bunch of incoherent equations on a whiteboard. We need to find a way to separate the underlying mathematical concepts from the arcane scribbles we force students to deal with.

Personally, I understand most of higher math by reformulating it in terms of lambda calculus and type theory, because they map to real world programs I can write and investigate and explore. Interpreting mathematical concepts in terms of computer programs is just one way to make math more tangible. There must be other ways we can explain math without having to explain the extraordinarily dense, outdated notation that we use.


I Tried To Install Linux And Now I Regret Everything


I am going to tell you a story.

This story began with me getting pissed off at Windows for reasons that don’t really need to be articulated. Just, pick your favorite reason to hate Windows, and let’s pretend that was the tipping point. I had some spare space on my secondary drive, so I decided to give Linux another whirl. After all, it had been three or four years since I last attempted anything like this (it’s been so long I don’t rightly remember the last time I tried), and people have been saying that Linux has gotten a lot better recently!

The primary reason I was attempting this is because of Valve’s attempts to move to linux, which have directly resulted in much better nVidia driver support. Trying to install proper nVidia drivers is the thing that wrecked my last attempt. This time, I figured, I could just install the nVidia drivers straight from the repo and everything would be hunky dory, and I could just spend the rest of my time beating on the rest of linux with a comically oversized wrench until it did what I wanted.

I’d had good experiences with XFCE and Fedora on my Linux VM, but I didn’t really like Fedora itself (though it interfaced very well with my VM environment). I wanted to install Ubuntu, because it has the best support and I don’t like trying to dig through arcane forum posts trying to figure out why my computer screen has suddenly turned into an invisible pink unicorn. Unfortunately, Ubuntu is a bloated mess, and I hate its default desktop environment. In the past, I had tried Linux Mint, which had been okay, but support had been shaky. I spotted Lubuntu, which is supposed to be a lightweight Ubuntu on top of LXDE, a minimal window manager similar to XFCE. This was perfect! So I downloaded Lubuntu 15.04 and installed it on my secondary drive and everything was nice and peachy.

Well, until Linux started, anyway. The first problem was that the package manager in pretty much every linux distro lists all the packages, including all the ones I don’t understand. I was trying to remove some pre-included Paint application and it had two separate packages, one named after itself, and one named <name>-common, but the common package wasn’t automatically removed when the program was! The nVidia packages also had both an nvidia-drivers-340[1] package, and an nvidia-drivers-340-update package, both of which had identical descriptions. I just went with the most basic one because it seemed sensible, but I felt really sorry for someone less tech savvy trying to find anything in that godforsaken list.

So after everything updated and I restarted and told the package manager to start installing my nVidia drivers, I started noticing annoying things about LXDE. A lot of annoying things. Here, let me list them for you:

  • The file manager, when unmounting anything, would helpfully close itself.
  • Trying to change the time to not display military time involves editing some arcane string whose only value is %r, and I still don’t know what that means and don’t want to know what that means. All I wanted to do was change it to say AM or PM!
  • In order to get a shortcut onto the desktop, you had to right-click the menu item and then a new menu would show up that would let you add it to the desktop. You can’t drag anything out of the start menu.
  • The shortcuts seemed disturbingly fickle, occasionally taking 4 rapid clicks to get things to start up.
  • Steam simply didn’t start at all. Unfortunately, we will never know why.
  • Skype managed to spawn a window halfway above the top of the screen, which resulted in me having to look up alt+space in order to rescue it.
  • Skype also cut off everything below the baseline of the bottom-most line of text, so you couldn’t tell if something was an i or a j.
  • Like in Windows, in Linux, modal dialogs have a really bad habit of sucking up keyboard focus, focusing on a button, and then making my next keystroke click the button. The fact that this is a widespread UI problem across most operating systems is just silly.

There were more nitpicks, but I felt like a large number of these issues could probably have been resolved by switching to a better window manager. I was honestly expecting better than this, though. LXDE was supposed to be like XFCE, but apparently it’s actually XFCE with all its redeeming qualities removed. However, it turns out that this doesn’t matter! To discover why, we have to examine what happened next.

My friend suggested Linux Mint, which has a better window manager and would probably address most of those issues. So, I downloaded the ISO. I already had a USB set up for booting that had Lubuntu on it, so I wanted to see if I could just extract the contents of the ISO and put them on the USB stick. I have no idea if this would have actually worked or not, and I never got to find out, because upon clicking and dragging the contents of the ISO out of the archive manager, with the intent of extracting them, the entire system locked up.

Now, I’m not sure how stupid trying to drag a file out of an ISO was, but I’m pretty sure it shouldn’t render my entire computer unusable. This seems like an unfair punishment. Attempts to recover the desktop utterly failed, as did ctrl-alt-del, alt-F2, alt-anything else, or even trying to open a terminal (my friend would later inform me that it is actually *ctrl-alt-F2* that opens the terminal, but I couldn’t ask him because the entire desktop had locked up!). So I just restarted the machine.

That was the last time I saw my Lubuntu desktop.

Upon restarting my machine, I saw the Lubuntu loading screen come up, and then… nothing. Blackness. About 20 seconds later, an error message pops up: “Soft Lock on CPU#0!”. Then the machine rebooted.

After just one hour of use, Linux had bricked itself.[2] I hadn’t even gotten Steam working yet, and now it was completely unusable. This is not "getting better", this is strapping a rocket to your ass and going so fast in the wrong direction you break the sound barrier. Now, if you have been paying attention, you will note that I had just finished installing my nVidia drivers before the desktop locked up, and that the drivers probably wouldn’t actually be used by the desktop environment until the process was restarted. After mounting my Linux partition in Windows and extracting my log files, I sent them to my friend, who discovered that it had indeed been the nVidia driver that had crashed the system.[3]

This is problematic for a number of reasons, because those drivers were written for Ubuntu-based distros, which means my system could potentially lock up if I installed any other Ubuntu-based distro, which is… just about all the ones that I cared about. Including Linux Mint. At this point, I had a few options:

  1. Install another ubuntu based distro anyway and hope it was a fluke.
  2. Install something based on Debian or maybe Fedora.
  3. Use the open-source drivers.
  4. Fuck this shit.

Unfortunately, I have little patience left at this point, and after watching Linux lock up over a dragged file and then brick itself, I don’t really have much confidence in the reverse-engineered open-source nVidia drivers, and I certainly am not going to entrust my video card to them in the hopes it doesn’t melt. I really, really don’t want to play whack-a-mole with Linux distros, trying to find the magical one that wouldn’t wreck my graphics card, so I have simply opted for option 4.

But it doesn’t end there.

About two or three hours after this all happened (I had been fairly distracted, and by distracted I mean *blinded by rage*), I noticed that my Windows clock was way off. At this point, I remembered something my friend had told me about - Linux, like any sane operating system would, sets the hardware clock to UTC and then modifies it based on the timezone. Windows, on the other hand, decided it would be a fantastic idea to set the hardware clock to local time. Of course, this should have been an easy fix. I just set the time back, forced an update, and bam, time was accurate again. Crisis averted, right?

No, of course not. Linux had not yet finished punishing me for foolishly believing I was worthy of installing something of its calibre. Because then I opened Skype, and started receiving messages in the past. Or more accurately, my chat logs were now full of messages that had been sent tomorrow.

It was at this point I realized what had happened. It wasn’t that the timezone had been changed, or something reversible like that. The hardware clock itself had been modified to an incorrect value. Skype had spent the past two hours happily sending messages with a timestamp 8 hours in the future because some idiot at Microsoft thought it was a good idea to set the hardware clock to local time, and now all these incorrect client side timestamps had been propagated to the cloud and synced across all my devices.

I slowly got up from my computer. I walked over to my bed, lay down, curled into a ball, and wept for the future of mankind.


[1] I may have gotten these package names slightly wrong, but I can’t verify them because Linux won’t boot up!
[2] It’s probably entirely possible to rescue the system by mounting the file system and modifying the X config file to not use nvidia, but this was supposed to just be “install linux and set it up”, not “install linux and spend the next 5 hours frantically trying to get it to boot properly.”
[3] After submitting my log files to #ubuntu to report the issue, my friend discovered that the drivers for ATI/AMD graphics cards are also currently broken for many ubuntu users. What timing! This certainly makes me feel confident about using this operating system!


We Aren't Designing Software For Robots


"I have the solution, but it only works in the case of a spherical cow in a vacuum." - *[old physics proverb](https://en.wikipedia.org/wiki/Spherical_cow)*
Whenever I am designing something, be it an API or a user interface, I try to remember that I am not designing this for a perfectly rational agent. Instead, I am designing software for a bunch of highly emotional, irrational creatures called *human beings*, all of whom have enormously different tastes. I try to include options, or customization, or if that isn't possible, a compromise. I try to keep the door open, to let my software be used as a tool to enhance someone's productivity no matter what workflow they use, instead of trying to impose my own workflow on them.

For some reason, many programmers seem to struggle with the concept of people being different. They get hung up on this naïve concept of right or wrong, as if life is some kind of mathematical equation that has a closed form solution. Let me say right now that any solution to life is going to be one heck of a chaotic nonlinear PDE, which won’t have any closed form solution at all, and certainly not one using elementary functions. When you are developing software, you must keep in mind the range of customers who will be using your product, whether they are office workers or fellow programmers.

Maybe someone is using your product to try and finish a presentation in time to go home and catch a nap before they get up to work their second job so they can support a wife and a screaming baby. Someone else might use your product to track their progress as they try to revolutionize search from their bedroom instead of study for their finals next week. Someone else might be an elderly man trying to figure out how his vacation is going to go.

We are all different. We arise from all walks of life and are bound together in a great journey on this blue ball hurtling through space. It is not cowardice when two people try to put aside their differences and work together, it is strength. It requires enormous courage to admit that there are no simple answers in life. There are no answers in the back of the textbook. There are many different answers, all different in subtle ways, all suitable for slightly different circumstances, all both right and wrong in their own, strange, quirky ways.

Some programmers seem distressingly incapable of compassion or empathy. Many claim to live through the cold hard logic of data, without apparently realizing that data itself is inherently meaningless. It is only given meaning through a human’s interpretation, and a human can interpret data to mean whatever they want. They seem to think they can solve problems by reducing everyone into neat little collections of numbers that can be easily analyzed. It’s certainly a lot less frustrating to work with real numbers instead of real people, but inevitably, a programmer must come to terms with the fact that life is about human beings, not numbers on a screen.

The cold hard logic of our code is good for speaking to computers—it is not good for speaking to other human beings.

