Most of what I’ve been doing these days is support code for various other people: adding missing functions, fixing bugs, adding new UI code when it’s needed, and just keeping the airship sailing. I’ve also been dealing with mysterious issues on user machines, which is always fun for a game developer. The process is simple: a user complains that the game mysteriously doesn’t work, and it becomes a game of trying to deduce what the problem is because, after all, you have no way of getting the user’s computer into the office. To give you an idea of the fun involved, here’s a recent example, as well as a few notes for other developers who might run into this particular fun problem.
The story starts with a user noting that “when I start a new game, it crashes.” Well, that’s interesting. Does it crash when starting a new game, or when creating the world? “Starting a new game. No, creating a world. No, both.” We get a crash dump, and it’s crashing in a completely random place in the renderer which it shouldn’t be able to reach during world creation. Neat. So I ask for a copy of the user’s console, and I get that too.
We recently added a hidden debug command line option, which turns on “OpenGL Debug Spew.” Essentially, we create an OpenGL debug context, and every time OpenGL throws an error it faithfully spits out a huge wad of information into console.txt, which I then review and stare at in disbelief while drinking gin and clutching one of my many tunics. In this case, the OpenGL debug spew produces some paydirt. Previous followers of me yelling about technical stuff on the blog will note that we have to use OpenGL’s “Core” profile (3.2) because that’s all that OS X supports, and frankly it’s just better software practice. When running on this user’s machine, the OpenGL debug spew claims that we are calling deprecated functions that no longer exist in the Core profile, such as reading the projection matrix. Well…. we don’t. I can search my source tree, and no references to glMatrixMode() exist any more.
Okay, so is it SDL that’s doing this, or a third-party library? *check, check* Nope. What’s even weirder, running on my machine in the office with the same video card doesn’t produce the same debug spew – it doesn’t claim that we are calling these deprecated functions at all. So, I go back to the crash dump and I notice that the game is loading some DLLs from something called plays.tv – which also has some junk in the call stack. Well, what the heck is that all about?
Plays.tv is an online streaming application, which I think the current owners bought from AMD (who, occasionally, make graphics cards and should know a thing or two about computing.) It seems to work – and this is just speculation, mind you – by hooking the OpenGL driver entry points and injecting its own code into the application. However, it doesn’t seem to check whether the OpenGL profile being used is Core or Compatibility; it just calls compatibility-mode functions without stopping to ask whether this is a good idea. Ah-ha!
So what can I do about it?
Well, honestly, not a lot. After some conversations with Baldur Karlsson, who produces the excellent RenderDoc debugging tool, it sounds like he had the same problems with his application and plays.tv and ended up just writing to the developers to get it blacklisted. There’s nothing I can do, because if I run the game in compatibility mode, then it will stop working on OS X. Furthermore, I would guess that this *also* explains the crashes we see occasionally with Open Broadcaster Software (OBS). The end solution is probably going to be “create the OpenGL context, see if we end up loading the plays.tv DLL, and if so throw an error message to the user and require them to turn it off before playing Clockwork Empires.” This is messy, because it involves descending into the blasted hellscape that is Microsoft Windows’ Process Handling Function Calls, and I’m busy putting out fires elsewhere in our codebase right now, but I suppose it’s one of those things that will have to get done.
Anyhow, if you *are* having trouble with Clockwork Empires, turn off your streaming software, and hopefully this saves other developers some time trying to figure out their own mystery crashes.
My mac has no problem running OpenGL 4.1, and even with very old hardware it runs 3.3.
Of course your mac runs OpenGL 4.1. OpenGL 4.1 is six years old!
Intel HD 5000: OS X – 4.1, Windows – 4.3
Radeon HD 6970: OS X – 4.1, Windows – 4.4
Radeon HD 5870: OS X – 4.1, Windows – 4.4
Radeon HD 6490: OS X – 4.1, Windows – 4.4
OpenGL state in OS X is “stagnant”.
There’s also “what does Apple say it is” versus “Well, which bits actually will work without falling off?” I chose 3.2 because it’s the minimum specification required for the game we are making, and was also appropriate when I made the decision. I still wouldn’t believe Apple if it told me that it had OpenGL 4.1 support that actually worked.
I find that last screenshot very pleasant to look at for some reason.
Because that’s what we all expect to happen to our villages in the end.
Nicholas! I just love your sense of humour man!
always enjoy your posts so much. Thanks for keeping up the good work, all of you.
much appreciation and thanks for the game.