An interesting fact is that most Z-machine interpreters provide a setting for the interpreter version (header address $1F, in the form of an upper-case letter) along with a setting for the interpreter number (1 = DECSystem-20, 2 = Apple IIe, and so on), but unlike the number, I haven’t been able to find any information on what the version letter does.
Which games check it and what do they do with it? Is it a useful setting, and if so, what default value should it have?
EDIT: The tool tip in Zoom says “Most games just report this at the start and make no further use of it,” which I take to mean that it has no known effect other than this.
The low byte is the interpreter version identifier, an ASCII character which identifies the release of the given interpreter. By convention, these are letters of the alphabet starting with A. This word is set by the interpreter upon initialization.
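As a minimal sketch of what the standard describes (the offsets are the documented header addresses, $1E for the interpreter number and $1F for the version letter; the `set_interpreter_id` helper and its signature are hypothetical, not from any real interpreter), an interpreter’s initialization might stamp these bytes like this:

```c
#include <stdint.h>

/* Header offsets from the Z-Machine Standards Document; the story
   file's header occupies the first 64 bytes of memory. */
#define H_INTERPRETER_NUMBER  0x1E  /* e.g. 4 = Amiga */
#define H_INTERPRETER_VERSION 0x1F  /* ASCII letter, 'A' upward by convention */

/* Hypothetical helper: write the interpreter's identity into the
   loaded story's header during initialization. */
static void set_interpreter_id(uint8_t *memory, uint8_t number, char version)
{
    memory[H_INTERPRETER_NUMBER] = number;
    memory[H_INTERPRETER_VERSION] = (uint8_t)version;
}
```

A user-facing “interpreter version” setting, as in Zoom or Bocfel, then amounts to changing the `version` argument passed here before the game starts.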
1   DECSystem-20     5   Atari ST          9   Apple IIc
2   Apple IIe        6   IBM PC           10   Apple IIgs
3   Macintosh        7   Commodore 128    11   Tandy Color
4   Amiga            8   Commodore 64
(The DECSystem-20 was Infocom’s own in-house mainframe.) An interpreter should choose the interpreter number most suitable for the machine it will run on. In Versions up to 5, the main consideration is that the behaviour of ‘Beyond Zork’ depends on the interpreter number (in terms of its usage of the character graphics font). In Version 6, the decision is more serious, as existing Infocom story files depend on interpreter number in many ways: moreover, some story files expect to be run only on the interpreters for a particular machine. (There are, for instance, specifically Amiga versions.)
11.1.3.1
Interpreter versions are conventionally ASCII codes for upper-case letters in Versions 4 and 5 (note that Infocom’s Version 6 interpreters just store numbers here).
Modern games are strongly discouraged from testing the interpreter number or interpreter version header information for any game-changing behaviour. It is rarely meaningful, and a Standard interpreter provides many better ways to query the interpreter for information.
So while a few Infocom games would differ based on the interpreter number, nothing really depends on the version. Modern interpreters may ignore both.
Useless, in other words. I guess the spec could mention that games actually print the version letter, which I suppose is the only reason that interpreters have it as a settable option.
It’s the same reason why there are version numbers on almost all software. If you have a problem with the interpreter, you can check if there’s a new version which may fix it. If you report a problem to the maintainer, it’s good for them to know which version you are running.
Bocfel and Frotz also let you set this with a command-line switch. None of these interpreters has a user setting for their own version, because that would be weird and confusing. But they all took the trouble to let the user change which interpreter version is reported to the game.
I think I got the answer: It is tradition, and also kind of cool that old Infocom games will print whatever letter you put in that box as the interpreter version.
EDIT: Redid the images with red circles for less confusion.
It’s primarily Infocom games that care about this setting (see comments above). When the interpreters were being written, the Infocom games were no longer being updated. So we added features to the interpreters to “emulate” any environment, so that we could observe all the variations of behavior in the Infocom games.
And the conclusion is that unlike the number (4 for Amiga etc) the version letter is not actually used for anything, except that it is printed? No game changes its behaviour depending on what version letter the interpreter has, right?
I added it to Bocfel simply because it was possible and easy to do, which is probably a pretty dumb reason.
I do know of one game (Moments out of Time) which checks the version to try to determine if it’s running under Nitfol, in order to disable some features which apparently don’t properly work with that interpreter. I’m sure I discovered this after adding the feature, though, so I can’t pretend it played any part in its implementation!
The Nitfol test also includes a test of the window width, and that’s where it’s being tripped up (under Bocfel at least). The problem is that Bocfel is completely lying about most responses to @get_wind_prop, including the X size, which Moments uses to determine the width. It takes the X size and divides by 10 (not sure where the 10 comes from: the disassembly is hairy, and there’s nothing obviously setting a value to 10 in the Bocfel source).
The only way to fix it is to hack the source: modify the zget_wind_prop() function in screen.c to return a value like 500 for the X size. Bear in mind that V6 support is abysmal, so it won’t exactly look great.
I’ll probably bump the value for the next release, since the initial value of 100 was completely arbitrary anyway.
Presumably, it is assuming 10 pixels per character.
That is theoretically reported by the interpreter in window property 13 (“font size”), although in an encoded form, so it may have been missed in a basic search. (Or perhaps Moments is hard-coding it?)
Thanks again! I mostly asked out of curiosity. I’m not working on V6 support, but running V6 games in Bocfel is a nice stress test which has uncovered a number of bugs in Spatterlight.
That is theoretically reported by the interpreter in window property 13 (“font size”), although in an encoded form, so it may have been missed in a basic search.
You are completely correct. I didn’t think about the fact that font size is a word that encodes two bytes. In Bocfel’s case: (10 << 8) | 10. So that explains it, thanks!
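As a sketch of the encoding just described (the high byte is taken here as the font height and the low byte as the width; with Bocfel’s (10 << 8) | 10 value both bytes are 10, so the assignment doesn’t matter for this case, and the helper names are made up for illustration):

```c
#include <stdint.h>

/* Window property 13 ("font size") packs two bytes into one word.
   Assumed split: height in the high byte, width in the low byte. */
static uint8_t font_height(uint16_t font_size) { return font_size >> 8; }
static uint8_t font_width(uint16_t font_size)  { return font_size & 0xFF; }

/* What Moments out of Time appears to be doing: take the window's
   X size in pixels and divide by the per-character width to get a
   width in characters. */
static unsigned width_in_chars(uint16_t x_size_pixels, uint16_t font_size)
{
    return x_size_pixels / font_width(font_size);
}
```

With Bocfel’s made-up font size, `width_in_chars(500, 0x0A0A)` gives a 50-character-wide window, which matches the divide-by-10 seen in the disassembly.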
I can now at least have the X and Y sizes be correct in terms of the made-up font size of 10, removing a couple of arbitrary values in the window properties.