So, today's MMO topic is time. Specifically, how do you make sure everything happens when you want it to happen?
My guess is that most programmers at some point write a game of some kind, and I suspect most of those are variations on a few specific games - light cycles, or the two cannons that lob ballistic rounds at each other (aka Scorched Earth). For me it was lunar lander, a riff on the game Radio Shack used to run on its TRS-80s, back when there was a TRS-80 and a Radio Shack.
Lunar lander is a pretty simple game. Three controls: left, right, and engine (usually "A", "S", and the space bar). Left and right rotate the lander counterclockwise or clockwise, and engine fires the engine in whatever direction the lander is facing at that moment. It's basically a physics problem: gravity is accelerating you downward, inertia is carrying you along your current motion vector, and your engine is accelerating you away from the direction of thrust. In pseudo-code, Lunar Lander looks like this:
While (LanderY > 0)
    LanderX = LanderX + Xspeed
    LanderY = LanderY + Yspeed
    Yspeed = Yspeed - gravity
    LanderDirection = LanderDirection + Rotation
    If EngineOn Then
        Xspeed = Xspeed + sin(LanderDirection) * Thrust
        Yspeed = Yspeed + cos(LanderDirection) * Thrust
    End If
    Draw(Lander)
End While
Once you figure out how to read the keyboard and draw on the screen, the rest is simple math. I remember writing that program and running it for the very first time. Guess what happened. Yep: I crashed into the moon. So fast that my finger hadn't lifted off the Return key after typing "Run".
The problem is that computers are kind of fast. Even TRS-80s running BASIC are kinda fast. My physics was fine, my math was fine, my basic algorithmic process was fine, but the problem was that the TRS-80 was capable of executing that loop over 500 times per second. In about a quarter of a second it was all over. I hadn't accounted for time.
Unless you're a C programmer under the age of 35, the way most of us first learned to deal with this problem is with the idle loop. Basically, to slow down the computer, we give it a lot of nothing to do:
While (LanderY > 0)
    LanderX = LanderX + Xspeed
    LanderY = LanderY + Yspeed
    Yspeed = Yspeed - gravity
    LanderDirection = LanderDirection + Rotation
    If EngineOn Then
        Xspeed = Xspeed + sin(LanderDirection) * Thrust
        Yspeed = Yspeed + cos(LanderDirection) * Thrust
    End If
    Draw(Lander)
    Do 1000 Times
        (not much)
    Repeat
End While
We start sticking loops into our code that do nothing, but do it lots of times, and we adjust the amount of nothing until our program runs at the speed we want. This works, but it's problematic. It makes the computer spend a lot of time doing nothing specifically to slow itself down. On an old computer that could only run one program at a time, that was no big deal, but on a modern computer that multitasks, you've just written a CPU hog that slows everything down unnecessarily. It's also computer-specific: run this on a different machine and it runs at a different speed. And sometimes it doesn't work at all. If, as I said, you're a C programmer under about 35, chances are the first programs you ever wrote were built with optimizing compilers. An optimizing compiler can sometimes tell that you've written code that does nothing, and it will replace that code with nothing, thinking it just helped you out by making your program more efficient. Instead, it removes your deliberate attempt to slow your program down.
A better way is to use language facilities that can deliberately slow a program down without forcing the computer to chase its tail. Most languages have a sleep command. You say "sleep(5)" and the program goes to sleep for five seconds, ceding the processor to anything else that can use it. And it *always* sleeps for five seconds, no matter how fast your computer gets. Of course, sleep has issues of its own, most notably that it does exactly what you ask: it sleeps for a certain amount of time. But that might not be what you want. If you want your program to do something ten times per second, you might think that adding a "sleep(0.1)" would do it: your program sleeps for a tenth of a second, then wakes up. Rinse, repeat, and that happens ten times per second. But that assumes your actual program takes no time at all. Suppose the work itself takes about 0.05 seconds. Then, with the 0.1-second sleep on top, each loop takes 0.15 seconds. That's only about six or seven times per second, not ten. And on different computers, or even at different times on the same computer, those numbers change.
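One common remedy - a sketch of the general technique, not the approach I ended up with, and the names are mine - is to note when the next tick *should* happen and sleep only for whatever time is left after the work is done:

```python
import time

def run_at_rate(step, hz, iterations):
    """Call step() about hz times per second, sleeping only the leftover time."""
    period = 1.0 / hz
    next_time = time.monotonic()
    for _ in range(iterations):
        step()                               # however long this takes...
        next_time += period
        remaining = next_time - time.monotonic()
        if remaining > 0:                    # ...we only sleep for what's left
            time.sleep(remaining)
```

If step() takes 0.05 seconds, the loop sleeps for only the remaining 0.05, and the rate stays near ten per second instead of drifting down to six or seven.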
Here's where we can start to get clever. Suppose we want to run a loop once per minute. We could do that by noting what time it is right now, and then waiting until the clock reads a whole minute. So let's say right now it's 10:33:42 - 10:33 and 42 seconds. If we wait for 18 seconds, it will be exactly 10:34. We then run. Then we check the time again. Suppose it's 10:34:12. We wait 48 seconds, and then it's 10:35. Then we run again. This means no matter how fast or slow we run, we always wait the right amount of time so that we only run once per minute. With a little more math, we could run once per second, or ten times per second, or thirty times per second.
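The arithmetic in that walkthrough is just "sixty minus the seconds hand". As a tiny sketch (my name for the function, nothing from the real client):

```python
def seconds_until_next_minute(seconds_past_the_minute):
    # At 10:33:42 the seconds hand reads 42, so we wait 60 - 42 = 18 seconds.
    return 60 - (seconds_past_the_minute % 60)
```

Feeding it 42 gives back 18, and feeding it 12 gives back 48, matching the walkthrough above.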
In Python, I did something like this:
import time

def clockAlign():
    # milliseconds into the current second
    miltick = (time.monotonic() * 1000) % 1000
    # next ~33.33 ms boundary (about 30 ticks per second), padded a hair
    # so we wake up just after the tick instead of just before it
    nexttick = (int(miltick / 33.33) + 1.01) * 33.33 + 1
    print("sleep:", miltick, nexttick, nexttick - miltick)
    time.sleep((nexttick - miltick) / 1000)
And it works. Before I put the wait function in, the game client was running at an incredible 270 thousand loops per second (keep in mind it's not doing much except processing keyboard events). Now it runs at about 30 times per second, with about a millisecond of variability.
However, I'm still not happy. That's a lot of math, and it doesn't account for certain things, like what happens if the game client takes *more* than a thirtieth of a second to process a loop. Then it will skip a beat. Now, in a sense that's what I want it to do: I don't want the game client to fall behind. But the game client doesn't *detect* that it fell behind; it just figures out when the next clock tick should be, based on what time it is now. Plus, it's a little inflexible about how it handles time. I don't think it's wrong, I think it needs help. In particular, it needs to be connected to something that recognizes that different things happen on different schedules.
A game client needs to draw the screen, and quickly: 30 frames per second minimum, usually; 60 fps would be better. But separate from that, there are other things the game client might not need to do quite so often. Ditto the game server, which has to run the same kind of clock system so it can stay in sync with the client. You might want certain things to happen 30 times per second, other things 10 times per second, and still other things maybe only once per second. And that requires timers. So that's what I'm going to be thinking about next: how to extend the clock to handle timers of different frequencies, and what to do when the clock strikes a particular time. This is probably where threading comes in - a separate, complicated topic of how to have a single program run two different tasks essentially simultaneously.
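To make the idea concrete, here's a rough sketch of what a multi-frequency timer might look like - hypothetical names, no threading yet, everything polled from a single loop:

```python
import time

class Timer:
    """Fires a callback every `period` seconds when polled."""
    def __init__(self, period, callback):
        self.period = period
        self.callback = callback
        self.next_fire = time.monotonic() + period

    def poll(self, now):
        if now >= self.next_fire:
            self.callback()
            # Advance by whole periods so a slow frame skips beats
            # instead of firing a burst of catch-up calls.
            while self.next_fire <= now:
                self.next_fire += self.period

def run(timers, duration):
    """Poll every timer in one loop for `duration` seconds."""
    end = time.monotonic() + duration
    while time.monotonic() < end:
        now = time.monotonic()
        for t in timers:
            t.poll(now)
        time.sleep(0.001)  # cede the CPU instead of spinning
```

A 30-times-per-second draw timer and a once-per-second housekeeping timer can then share the same loop - roughly the shape that the threading discussion will have to improve on.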
In the meantime, it's also kind of boring to have a client that reads a keyboard and a server that knows what the client is doing but does nothing itself. I think I'm going to have to put in some sort of placeholder just to keep me from getting bored watching console debug screens. Maybe a triangle on a sphere.