While trying to implement some precise real-time behavior, I discovered that newer versions of Gargoyle (since at least the 2022 release) occasionally deliver timer events 30 ms or more late. Gargoyle 2011 - the default on older versions of Linux - does not seem to have this problem: all of its timer events arrive within 1 or 2 milliseconds of the correct time. Lectrote, similarly, has a very precise timer system.
Tracing through the source code reveals that Gargoyle delegates its timer events to the Qt library’s timer API, which offers two main timer modes - Qt::CoarseTimer and Qt::PreciseTimer - and only the latter attempts to be accurate to the millisecond. Indeed, a one-line change to the source code forcing timers to use Qt::PreciseTimer seems to alleviate the problem, reducing inaccuracies to Gargoyle 2011 levels without consuming excessive CPU time.
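For reference, the kind of change involved looks roughly like this. This is a sketch rather than the actual Gargoyle patch - the real call site and names differ - but the Qt API calls are real: a QTimer defaults to Qt::CoarseTimer, which is allowed to deviate by up to 5% of the requested interval, and switching modes is a single call to setTimerType():

```cpp
// Sketch only: forcing Qt's precise timer mode.
// 'timer' stands in for whatever QTimer drives Glk timer events.
QTimer timer;
timer.setTimerType(Qt::PreciseTimer);  // default is Qt::CoarseTimer (~5% slack allowed)
timer.start(1000);                     // request a tick every 1000 ms
```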
I’d like to collect some additional data about which versions of the interpreter are inaccurate, especially as the delay seems to be system-dependent. I’ve provided this brief Glulx test file which measures inaccuracies and reports some statistics about timer event delays.
Timer Analysis.gblorb (574.7 KB)
Interpreting the results
The program requests timer events every 1000 ms and runs for 20 cycles. At the end, it prints the average delay between timer events (which should be 1000 ms) and the standard deviation of the delay distribution (which should ideally be 0), along with a bar graph of the results. The bar printed with ‘@’ and ‘.’ characters represents 1000 ms, which should be the center point of the graph.
If any of you would like to run it briefly on your system and post your findings (as a screenshot or preformatted text), I would greatly appreciate it.