Game Speed

I'm wary of using "every turn" and similar constructions because I worry about game speed. Is lag a significant problem in Inform games, and when does it tend to be a problem? Does it matter which VM you're using? (Mine is on Glulx right now.)

I'm maybe not the best person to answer such a question, since I've only been developing since April/May, but I can share a few problems I've encountered.
The game I'm making is based on real time. So far I haven't written anything concrete that gets displayed in the game, but I've started adapting/extending the Standard Rules to handle a real-time environment. I'm developing on Glulx as well.

-My time engine lags after half an hour. I don't really know why; I've decided to sort this out later, once I'm able to play my game for more than half an hour.
-Regular expressions: I've implemented a simple syntax for creating and testing what I call moods (lists of numerical parameters). It takes around one or two seconds to create one of those objects (yep, my syntax relies on regular expressions. Neat, eh?). :frowning:. The only workaround I found is to launch the game so that it builds them, then save the game. This way I can restore the game (and the dynamically created objects) when I begin a new game.
-Dynamic object creation (needs Dynamic Objects by Jesse McGrew).

[code]A zboub is a kind of thing.
The zbouba is a zboub.

When play begins:
	say “GO”;
	repeat with N running from 1 to 5000:
		let A be a new object cloned from the zbouba;
	say “[line break]DONE”;[/code]

compared to (java):

[code]public class testJavaInform {

	public static void main(String[] args) {
		System.out.println("GO !");
		Object[] container = new Object[5000];
		// Calendar.getInstance() captures the time once, so reusing a single
		// instance would report zero elapsed time; use currentTimeMillis().
		long before = System.currentTimeMillis();
		for (int i = 0; i < 5000; i++) {
			container[i] = new Object();
		}
		long after = System.currentTimeMillis();
		System.out.println("time to complete task = " + (after - before) + " ms");
		System.out.println("DONE !");
	}
}[/code]

I launched the I7 code before writing the Java version. It's still lagging (I'm even wondering if the game has frozen). Meanwhile, this is what the Java code prints in the terminal:

And I7 doesn't seem to freeze completely, but it gets slower and slower. Not to mention that my computer's fans are going bananas.
Edit: I7 seems to have frozen on the 987th object (I think it might just be a matter of a max_memory setting or something like that, but I'm wondering why I'm not getting any error message like an OutOfMemoryError).

-Input/output: I tried to make an extension for handling ASCII animations. My goal was to create one big file containing all the frames instead of hardcoding them into the source code. It displayed around 1 frame per second (at less than 100×100 characters). I tried different algorithms, and all ended with the same result. Maybe my movie file was too long (a few KB).

But I can still get pretty good results (so far): my system executes quite a lot of code every decisecond, and it still works fine. Of course, I disconnected the turn system from the advance time rule (if I remember correctly): a turn happens only when the player types something.

Glulx wasn’t designed to keep perfect time, especially over long periods (see the spec). That said, it’s possible this could be an interpreter issue. Zoom (Mac) and Gargoyle (cross-platform) have the highest resolution timers.

I’m confused as to why you would do this with dynamic objects. If you know you need 5000 zboubs, just declare 5000 zboubs as standard objects. It really only makes sense to use Dynamic Objects (which don’t have native support in I6/I7) when you need to create them on the fly.

That sounds slower than it really ought to be. The Glimmr Automap extension renders scaled png graphics from fairly large arrays (2000-3000 entries) much faster than that, converting from character codes along the way. Are you using indexed text? If so, you might want to consider using character codes instead, with a custom say statement, e.g.:

[code]To say char-code (N - a number):
	(- print (char) {N}; -)[/code]

Another potential cause is that your interpreter doesn't print text very quickly. I'm also not sure what speeds are attainable with text-grid windows, if that's where you're displaying the animation. Finally, 10,000 "pixels" (100 × 100) is quite a lot; you might consider reducing the animation size.

More generally: Be sure to test all of these effects outside of the IDE. The IDE interpreters run quite a bit slower than their standalone counterparts.


So…should I not be worried about “every turn” events?

Correct, don’t worry about it. Usually it’s not a problem.

Assuming you don’t have 1) hundreds of Every Turn rules, or 2) an Every Turn rule that’s, say, calculating pi to a gazillion digits. Might want to tuck that last one elsewhere.

Absolutely. When you're trying to keep track of relatively long stretches of time (such as half an hour), it's important not to rely on (say) having a timer tick every second and then summing all the seconds. If you do this and the timer isn't exactly accurate, you'll drift: if the ticks come at an average of 1.01 seconds rather than the expected 1 second, that would push you off by about 18 seconds over 30 minutes. In general the best way around this is to have a long-running timer to keep you in sync.
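To make the drift concrete, here's a minimal sketch (plain Java rather than Inform, matching the earlier benchmark) of what a +1% tick error does over half an hour. The specific 1010 ms figure is just an illustrative assumption:

```java
public class DriftDemo {
	public static void main(String[] args) {
		// Nominal tick: 1000 ms. Assume the real tick averages 1010 ms (+1%).
		long nominalMs = 1000;
		long actualMs = 1010;
		long ticks = 30 * 60; // 30 minutes of one-second ticks

		long summedMs = ticks * nominalMs; // what counting ticks reports
		long realMs = ticks * actualMs;    // what a wall clock would report

		System.out.println("drift after 30 min = " + (realMs - summedMs) / 1000 + " s");
		// Anchoring on a start time (elapsed = now - start) removes this
		// cumulative error entirely, however irregular the ticks are.
	}
}
```

That is, tick-counting accumulates error linearly, while measuring against a fixed start point never drifts.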

I think I was already using the “print (char) {N}” statement, but thanks anyway; I'll try to investigate this later.

What do you mean? Having both a fast and a slow timer? Or just replacing my current timer with a long-running one?
Also, I could use an external program to time my game.

In a real-time game loop I think you want your timer to keep track of the delta-t since the last tick. Then you can adjust the next tick accordingly to keep the loop in sync, by not ticking the timer until the fractional difference is made up.
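Sketched in Java (the scheduling idea would transfer to a Glulx timer handler, but this is only an illustration): the key is to compute each tick's deadline from an ideal timeline rather than from "now", so any fractional lateness gets made up on the next tick instead of accumulating.

```java
public class FixedStepLoop {
	public static void main(String[] args) throws InterruptedException {
		final long stepMs = 100; // desired tick interval (one decisecond)
		long next = System.currentTimeMillis();
		for (int tick = 0; tick < 10; tick++) {
			// ... per-tick game logic would run here ...
			next += stepMs; // schedule against the ideal timeline, not "now"
			long sleep = next - System.currentTimeMillis();
			if (sleep > 0) {
				Thread.sleep(sleep); // early: wait out the remainder
			}
			// If sleep <= 0 we are behind schedule; continuing immediately
			// makes up the fractional difference instead of letting it drift.
		}
		System.out.println("DONE");
	}
}
```

Ten ticks at 100 ms should complete in roughly one second of wall time, regardless of how long each tick's logic took (as long as it averages under the step length).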

Well, it depends on how you're implementing it. The way I'd think of doing it would be to have one timer that triggers once a minute and one that triggers once a second, and use the two together to work out the time elapsed since the start of the game. Since errors on timers are going to be approximately constant (that is, if a one-second timer's accuracy turns out to be +/- 10ms, the one-minute timer will most likely also be +/- 10ms), the minute timer will be much less prone to drift.

This would also be an argument for a Glulx opcode that allowed access to the system clock.

Glulx only allows one timer to be active at a time, so this plan is moot.

Yes. That is definitely the way to go.

(Glk call, I figure, since this is more like I/O than computation. And more likely to be platform-dependent.)

As I was just saying in the Javascript thread: more details on my schedule tonight.

Gah, I forgot that. D'oh. In that case, access to the system time definitely seems the way to go.