Today I learned that there are way too many coordinate systems, and that I’m an idiot (but that was already well-established). I have also learned to not trust graphics tutorials, but the reasons for that won’t become apparent until the end of this article.
There are two types of coordinate systems: left-handed and right-handed. By convention, almost everyone in math and science uses right-handed coordinate systems, with positive x going to the right, positive y going up, and positive z coming out of the screen. A left-handed coordinate system is the same, except positive z points into the screen. Of course, there are many other possible coordinate system configurations, each being either right- or left-handed; some modern CAD packages have y pointing into the screen and z pointing up, and screen-space in graphics traditionally has y pointing down and z pointing into the screen.
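One quick way to check which handedness a convention implies is the cross product; here's a minimal sketch in plain C++, with no graphics API assumed:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// The cross product follows the right-hand rule, so crossing the x and y
// basis vectors tells you where +z must point for the system to be
// right-handed.
Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

int main() {
    Vec3 right = { 1, 0, 0 }, up = { 0, 1, 0 };
    Vec3 z = cross(right, up);
    // Prints "0 0 1": with x right and y up, a right-handed system has +z
    // coming out of the screen. If your convention says +z goes into the
    // screen instead, you are in a left-handed system.
    printf("%g %g %g\n", z.x, z.y, z.z);
    return 0;
}
```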
If you start digging through DirectX and OpenGL, the handedness of the coordinate system being used is ill-defined, because it ultimately depends on the perspective transform. Consequently, while DirectX traditionally uses a left-handed coordinate system and OpenGL uses a right-handed one, you can simply use `D3DXMatrixPerspectiveRH` to give DirectX a right-handed coordinate system, and OpenGL actually uses a left-handed coordinate system by default in its shader pipeline - but all of this depends entirely on the handedness of the projection matrices involved. So, technically the coordinate system is whichever one you choose, but unlike the rest of the world, computer graphics has no real standard on which coordinate system to use, and so it's just a giant mess of various coordinate systems all over the place, which means you don't know what handedness a given function assumes until things start getting funky.
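For what it's worth, in the old D3DX utility library the choice really does come down to which helper builds your projection; a minimal sketch, with placeholder fov/aspect/plane values:

```cpp
#include <d3dx9.h>

// Same scene, two handednesses: the only difference is which projection
// helper builds the matrix. The fov, aspect ratio, and near/far plane
// values here are placeholders.
void buildProjections(D3DXMATRIX* projLH, D3DXMATRIX* projRH) {
    D3DXMatrixPerspectiveFovLH(projLH, D3DX_PI / 4, 16.0f / 9.0f, 0.1f, 1000.0f); // +z into the screen
    D3DXMatrixPerspectiveFovRH(projRH, D3DX_PI / 4, 16.0f / 9.0f, 0.1f, 1000.0f); // +z out of the screen
}
```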
I discovered all of this because today I found out that, for the past 6 or so years (the entire time my graphics engine has existed in any shape or form), it has been rotating everything backwards. I didn't notice.
This happened due to a number of unfortunate coincidences. For many years I simply didn't notice, because I didn't know which direction a given sign of rotation was supposed to rotate in, and even if I had, I would have assumed it was the default for graphics for some strange reason (there are a lot of weird things in graphics). The first hint was when I was integrating with Box2D and had to reverse the rotation of its bodies to match up with my images. This did trigger an investigation, but I mistakenly concluded that it was Box2D that had it wrong, not me, because I was using `atan2` to check coordinates, and I was passing them in as `atan2(v.x, v.y)`. The problem is that `atan2` is defined as `float atan2(float y, float x)`, which means my arguments were swapped and I was getting nonsense angles.
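In case the footgun isn't obvious, the entire bug fits in a few lines:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    float x = 1.0f, y = 0.0f;      // a vector pointing right; its angle is 0
    printf("%f\n", atan2f(y, x));  // correct order: prints 0.000000
    printf("%f\n", atan2f(x, y));  // swapped order: prints 1.570796 (pi/2)
    return 0;
}
```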
Now, here you have to understand that at the time I was using a standard left-handed coordinate system, with y pointing up, x pointing right, and z pointing into the screen. The thing is, I wanted a coordinate system where y pointed down, so I did what a tutorial instructed and reversed all of my y coordinates in the low-level drawing functions.
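Concretely, that tutorial-style hack looks something like this - `drawQuad` here is a hypothetical low-level call, standing in for whatever primitive-drawing function a renderer actually has:

```cpp
// drawQuad() is hypothetical: a stand-in for the engine's real low-level
// draw call.
void drawQuad(float x, float y, float w, float h);

// A sketch of the tutorial's approach: leave the projection alone and
// negate y at the very bottom of the pipeline, in every drawing function.
// The flip happens after all the transforms have already run - which is
// exactly what caused the trouble described below.
void drawSpriteYDown(float x, float y, float w, float h) {
    drawQuad(x, -y, w, h);
}
```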
So, when `atan2(x, y)` gave me bad results, I mistakenly thought, "Oh, I forgot to reverse the y coordinate!" Suddenly `atan2(x, -y)` was giving me angles that matched what my images were doing. The thing is, if you switch x and y and negate y, `atan2(x, -y) == -atan2(y, x)`. One mistake had been incorrectly validated by another mistake, which was caused by yet another mistake!
You see, by inverting those y coordinates, I was accidentally reversing the output of my rotation matrices, which caused them to rotate everything backwards. This was further complicated by how the camera rotates things - if your camera is fixed, how do you make it appear to rotate? You rotate everything else in the opposite direction! So my camera looked like it was rotating forwards while actually rotating backwards, and thanks to the reversed rotation matrices it was being rotated the right way for the wrong reason.
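You can watch the reversal happen with nothing but a 2D rotation: flipping y before and after a rotation is identical to rotating by the negative angle.

```cpp
#include <cmath>
#include <cstdio>

// Flip y, rotate by t, flip y back - i.e. S * R(t) * S with S = diag(1,-1) -
// and you get exactly R(-t). This is why negating y coordinates around the
// rotation made everything spin backwards.
int main() {
    float t = 0.5f;
    float x = 0.6f, y = 0.8f;

    // Flipped pipeline: negate y, rotate by +t, negate y again.
    float fx = x, fy = -y;
    float rx = fx * cosf(t) - fy * sinf(t);
    float ry = fx * sinf(t) + fy * cosf(t);
    printf("flipped pipeline: (%f, %f)\n", rx, -ry);

    // Plain rotation by -t produces the same point.
    printf("rotate by -t:     (%f, %f)\n",
           x * cosf(-t) - y * sinf(-t),
           x * sinf(-t) + y * cosf(-t));
    return 0;
}
```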
While I initially thought the fix would require some crazy coordinate system juggling, the actual solution was fairly simple. A coordinate system with z pointing into the screen and y pointing down is still right-handed, which means it should play nicely with rotations from a traditional right-handed system. Since the handedness of a coordinate system is largely determined by the perspective matrix, reversing y coordinates in the drawing functions was actually reversing them too late in the pipeline. Hence, because I used `D3DXMatrixPerspectiveLH`, I had a left-handed coordinate system, and my rotations ended up being reversed. `D3DXMatrixPerspectiveRH` negates the z coordinate to switch the handedness of the coordinate system, but I like positive z pointing into the screen, so I instead hacked the left-handed perspective matrix itself and negated the y-scaling parameter in cell [2,2], then undid all the y-coordinate inversion insanity that had been inside my drawing functions (you could also negate the y coordinate in any world transform matrix sufficiently early in the pipeline by specifying a negative y scale in [2,2]). Suddenly everything was consistent, and rotations were happening in the right direction again. The camera rotation now actually required the negative rotation, as one would expect, and I still got to use a coordinate system with y pointing down. Unfortunately, it also reversed several rotation operations throughout the engine, some of which were functions that had been returning the wrong value this whole time so as to match the engine's incorrect rotation - something that will give me nightmares for weeks, probably involving a crazed rabbit hitting me over the head with a carrot, screaming "STUPID STUPID STUPID STUPID!"
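The fix, roughly - this is just a sketch of the approach described above, with placeholder projection parameters:

```cpp
#include <d3dx9.h>

// Build the usual left-handed projection, then negate the y scale in the
// _22 cell. Flipping a single axis flips the handedness, so this yields a
// right-handed system with y pointing down and +z still into the screen.
// Because the flip is now part of the projection, everything upstream
// (including rotations) lives in one consistent coordinate system.
void buildYDownProjection(D3DXMATRIX* proj) {
    D3DXMatrixPerspectiveFovLH(proj, D3DX_PI / 4, 16.0f / 9.0f, 0.1f, 1000.0f);
    proj->_22 = -proj->_22; // y-down, z-in: right-handed
}
```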
What's truly terrifying is that all of this was indirectly caused by reversing the y coordinates in the first place. Had I instead flipped them in the perspective matrix itself (or otherwise properly transformed the coordinate system), I never would have had to negate y coordinates by hand, I never would have mistaken `atan2(x, -y)` for being valid, and I never would have had rotational issues in the first place.
All because of that one stupid tutorial.
P.S. the moral of the story isn't that tutorials are bad, it's that you shouldn't be a stupid dumbass who doesn't write unit tests or look at function definitions.
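For instance, a test as dumb as this one, written on day one, would have caught the whole mess (`rotate2D` is a stand-in for whatever rotation function your engine exposes):

```cpp
#include <cassert>
#include <cmath>

// rotate2D is a stand-in for the engine's actual rotation routine.
void rotate2D(float x, float y, float t, float* ox, float* oy) {
    *ox = x * cosf(t) - y * sinf(t);
    *oy = x * sinf(t) + y * cosf(t);
}

int main() {
    // Pin down the convention: with a counter-clockwise convention,
    // rotating +x by +90 degrees must land on +y.
    float ox, oy;
    rotate2D(1.0f, 0.0f, 3.14159265f / 2.0f, &ox, &oy);
    assert(fabsf(ox) < 1e-5f && fabsf(oy - 1.0f) < 1e-5f);
    return 0;
}
```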