Glk time and date functions

I know little of these timers you speak of, save this from the new Glk spec: “The system clock is not guaranteed to line up with timer events (see ‘Timer Events’). Timer events may be delivered late according to the system clock.”

I don’t know how to repair that. I don’t even think it can be repaired at the I7 level except by turning off the timer, then turning it back on after a busy wait on the player’s microseconds. I guess.

EDIT: Oh, I think I understand, Erik. Well, I don’t want to modify microseconds because of profiling issues, but if microseconds count straight up to one second, rather than up to the next millisecond, that breaks the pattern. It would be like the day value counting up to a full year before resetting to zero.

I think I’ll add the millisecond line in the documentation with a copy-paste icon and call it a day.

I don’t understand how things are handled well enough to contextualize the technicalities there, but it sounds like we’re on the same page, and pasted code in the documentation sounds just fine. I’m excited about the prospects of having a cheap and easy profiler so please keep those microseconds as inviolate as they need to be!

Any thought of adding infglk style commands to make the I6 a little friendlier? It’s probably not in the brief for Ron’s extension to do this, but I thought I’d toss it out there.

–Erik

Thought I’d just add that I’m writing a profiler suite which will be shown at the demo fair. The code you have isn’t really ideal as it calls the Glk function twice to get the seconds and the microseconds. My aim is to have an easy extension which will handle all of the profiling for you, with statistically significant results, and you just pass it the code you want to compare.
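Roughly, the fix is to make one call and read both fields out of the same array. Something like this (the array and routine names are mine; the selector and timeval layout are from the spec):

[code]Array tv --> 3; ! timeval: seconds since 1970 (high word, low word), then microseconds

[ Snapshot;
	glk($0160, tv); ! glk_current_time(tv) fills all three words at once
	! tv-->1 (seconds, low word) and tv-->2 (microseconds) now come from
	! the same instant, with no second Glk call needed.
];[/code]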

@Erik: “infglk style commands”?

@Danni: Well, that’s not entirely true as of a couple hours ago: while suspending time, say "Time: [player's seconds]:[player's microseconds]." But as Erik alludes, I’m not aiming at high-quality profiling anyway, just exposing some new functionality with some simple uses.

infglk is the Inform 6 header file (now incorporated into Glulx.i6t) that defines the constants and wrapper routines that let Glulx Inform code read reasonably well to human eyes; in other words, it provides a more user-friendly overlay for all the opcodes and hex.
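For instance (a made-up before/after; wintype_TextBuffer is 3, and glk_window_open is selector $23):

[code]glk($0023, 0, 0, 0, 3, 201); ! bare selector and magic numbers
glk_window_open(0, 0, 0, wintype_TextBuffer, 201); ! the same call, infglk style[/code]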

–Erik

Oh, I hadn’t noticed that feature.
And yes, it looks like a good general purpose extension.

@Erik: Ah. That .i6t file wouldn’t be updated in the current version of Inform, let alone in earlier versions, hence I’ll leave the messy hex in. This is so even older versions of Inform 7 can use the interpreters’ new features. Perhaps in a year or three the new extensions can be rewritten with readable posterity in mind, but considering the slow speed at which hobbyist WIPs progress, I think immediate usability is a more important concern for now.

This sounds like a desire to fix an old solution which should be replaced instead of fixed. It’s true that the event timer is not guaranteed to be accurate; neither is the new time system, because just executing the call to retrieve the time takes up time.

The statement in the spec about “Timer events may be delivered late” is not a description of a bug. It’s an attempt to simplify the game author’s life by guaranteeing that the error in timer events will always be positive rather than negative. Cuts out half of the error cases you might have to think about.

Using a long-running timer event to keep time was just a poor way to watch the system clock; now there’s a better way.

Also: a current version of the infglk.h header (including the latest Glk functions) can always be found at github.com/erkyrath/glk-dev/blo … p/infglk.h . (Hit the “download raw” link.)

Hey Zarf, maybe you could answer a question I’ve wondered about… how do the arguments to the glk_* functions get magically sent to the @glk opcode? Especially considering that the actual arguments are only defined in the comments…

[spoiler]zarf, I might have found a bug. I seem to get zero in the datetime (destination) array the very first time I call glk($0169, countofseconds, datetime); ! glk_time_to_date_local. The following test case reproduces on my machine: [code]"asdf" by Ron Newcomb

The story creation year is 2012.

To decide if the player's time is available: (- (glk($0004, 20, 0) ~= 0) -). [! gestalt_DateTime]

When play begins when the player's time is available [and the story creation year is greater than the pyear], say "I believe your computer's date is set incorrectly. This work was published in [story creation year], which is far into the future from where you sit in p[pyear]p [the player's year] p[pyear]p ([player's month]) [the player's year] ([player's month])."

Release along with the “Quixe” interpreter.

To decide what number is the pyear: (- (datetime-->0) -).
To decide what number is the player's year: (- ((FillOutDateTimeArrays() & (datetime-->0))) -).
To decide what number is the player's month: (- ((FillOutDateTimeArrays() & datetime-->1)) -).

Include (-
Array countofseconds --> 3; ! this holds the seconds elapsed since midnight on January 1, 1970, GMT/UTC (high word, low word), plus microseconds
Array datetime --> 8; ! this holds the above broken down into year, month, day, weekday, hour, minute, second, and microseconds
Global UTCvsLocal = 1; ! truth state: 1 = local time; 0 = GMT, otherwise known as UTC

[ FillOutDateTimeArrays;
print "*";
if (glk($0004, 20, 0) == 0) return 87; ! glk_gestalt_DateTime(20, 0);
glk($0160, countofseconds); ! glk_current_time(timeval);
print "*";
glk($0169, countofseconds, datetime); ! glk_time_to_date_local(tv, date);
return -1; ! so this function can be bitwise ANDed with the result of the --> array dereference that's about to happen
];
-).

There is room.[/code]

Running it produces the following. Note that “pyear” is supposed to be zero on its first print (that happens before any Glk calls populate the array), so “p0p” is right; but right after it, “the player’s year” returns zero even though the asterisks in front of it prove the calls were all made.

Can you confirm? Or did I do something wrong? (Apologies for the messiness here; it was hard to narrow down.)[/spoiler]

Actually, it seems to have to do with the return value, so never mind.

The first I6 function argument is _vararg_count, which is a magical token meaning “the actual arguments are available on the stack.” The @glk opcode takes its arguments on the stack, so no transfer is necessary. The _vararg_count value itself is the number of arguments.
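So a wrapper like the ones in infglk.h is just a thin shell around the opcode; the pattern is roughly:

[code][ glk_current_time _vararg_count ret;
	! The caller's arguments are already on the VM stack; @glk pops
	! _vararg_count of them and stores the Glk result in ret.
	@glk $0160 _vararg_count ret;
	return ret;
];[/code]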

Ron, I’ll try to take a look at that tomorrow.

Actually, never mind, zarf. I think the Inform 6 compiler is trying to optimize “((FillOutDateTimeArrays() & (datetime-->0)))” by swapping the order of the operands. It comes out in the right order when I inspect the generated source, but the above test seems to indicate otherwise. Huh. I really need to learn more about Inform 6, it seems.

All true: I probably should have chosen a better example. The timer can also drift even if you’re calling it, say, every second (due to inherent inaccuracies, plus the fact that timer calls can be skipped if the VM is busy with something else). If you’re incrementing a counter when the timer code is called, or are otherwise trying to keep track of timer events, the ability to access the system clock is a useful way to calculate the error and attempt to rectify it. That’s really all I was trying to get at.

Thanks for the clarification!

–Erik

That doesn’t surprise me. Could be an optimization, could be just the way it generates argument data on the VM stack.

I guess you’re stuck writing more helper functions, rather than using the & trick.

Or because the VM has been backgrounded on your phone and is therefore suspending all activity.

(No iPhone jokes, please. A mobile game should suspend all its game logic processing when backgrounded.)

Right. A better way to look at it is that the timer events don’t tell you the time; they tell you that it’s time to consult the system clock and see what time it is.

For example, if you want a game event to interrupt every five minutes (real time), you might decide that a precision of five seconds is sufficient. (A guard walking down the hall isn’t going to be microsecond-precise.) So you set up a timer event with a five-second cycle; in that event, you look to see whether the “minutes” field has advanced to a multiple of five.

Then you have to worry about the suspension case. I guess you’d hold off on the guard if you see the system clock jump more than thirty seconds between timer events. (You need the same logic anyhow to deal with the player saving, quitting, and restoring.)
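Sketching that in the same raw-selector style as earlier in the thread (GuardWalks is a stand-in for whatever your game actually does):

[code]Array tv --> 3; ! timeval: seconds high, seconds low, microseconds
Array date --> 8; ! year, month, day, weekday, hour, minute, second, microseconds
Global last_sec = 0;

[ StartGuardClock;
	glk($00D6, 5000); ! glk_request_timer_events(5000): tick every five seconds
];

[ GuardTick gap;
	glk($0160, tv); ! glk_current_time(tv)
	gap = tv-->1 - last_sec; ! low word of the seconds is fine for short intervals
	last_sec = tv-->1;
	if (gap > 30) return; ! clock jumped: we were suspended, or saved and restored
	glk($0169, tv, date); ! glk_time_to_date_local(tv, date)
	if (date-->5 % 5 == 0) GuardWalks(); ! minutes field hit a multiple of five
];[/code]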

I tried changing the & to the comma, which it probably should have been anyway, but a bug in Inform 6 prevents it from working. (The construction IntegerDivide( (UpdateTime(), datetime-->7) , 1000) parses as IntegerDivide(UpdateTime(), datetime-->7, 1000) rather than UpdateTime(), IntegerDivide(datetime-->7, 1000).) I rewrote some stuff, finished the documentation and sole example, and have now sent it off to our tireless extensions librarian Mark Musante.
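(For the record, the helper-function version zarf suggested comes out something like this; PlayersMillisecond is just an illustrative name:)

[code]To decide what number is the player's millisecond: (- PlayersMillisecond() -).

Include (-
[ PlayersMillisecond;
	FillOutDateTimeArrays(); ! refresh the datetime array first
	return datetime-->7 / 1000; ! microseconds entry, scaled to milliseconds
];
-).[/code]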

Using Ron’s extension, the microsecond readout I’m getting (in Quixe 1.1.1) is actually in milliseconds. That is, the last three digits of the microseconds readout are always 000. I assume that this is not the intended behavior, but possibly my machine isn’t supplying data to the library’s expectations…? (I’m on a MacBook, 2.4GHz Intel Core 2 Duo.)

–Erik

Edited to add: The datetimetest.ulx file running in Quixe gives me similar results.

Some platforms can only provide millisecond-precision data (including JavaScript, on most browsers at least; Chrome has an extension that gives you microseconds). I had originally suggested just milliseconds, but I think Zarf was right to specify the API with microseconds for those platforms that can supply more precision. I’d consider microseconds to be an extra bonus for those archaic desktop interpreters :wink:

The situation is even worse for actual timers. In IE the best resolution is 16ms, so if you want a Glk timer to run quicker than that in Quixe on IE you’re out of luck.

I’m curious about how to pinpoint slowness in a WIP with these timers. On a desktop machine, microsecond precision isn’t fine enough to really see how long particular rules take to execute, and while microseconds would be fine for the browser interpreters, they don’t supply them, so the problem exists there too.

Granted, I don’t think this addition to Glk/Glulx was really intended for profiling purposes, but it’s natural to try to use it that way. I’d almost wish for the Glk spec to allow a bit of extra array space in its return values for smaller nanosecond and picosecond readings if the interpreter wishes to provide it, but not actually require those values to be filled in for the interpreter to be considered up-to-spec.
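One workaround in the meantime is repetition: even a coarse clock can time a single rule if you run it a few thousand times and divide. A rough sketch (CodeUnderTest is a placeholder, and the arithmetic ignores the high word of the seconds, so keep runs well under half an hour):

[code]Array tv --> 3; ! timeval from glk_current_time

[ TimeLoop n  i s0 us0;
	glk($0160, tv); s0 = tv-->1; us0 = tv-->2;
	for (i = 0 : i < n : i++) CodeUnderTest();
	glk($0160, tv);
	return (tv-->1 - s0) * 1000000 + (tv-->2 - us0); ! total microseconds; divide by n for the average
];[/code]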

There are suggestions on the Inform suggestions forum for debugging interpreters that would print call-stack information on a runtime error. Such an interpreter would be a natural fit for implementing nano- and pico-second readings if the hardware made them available. That article David linked to shows it to be possible.

Does this make sense?