Ozmoo Release 14.51

Just curious why you wouldn’t use 1E and 1F (interpreter number and version).

Interpreter number was used by Infocom, to distinguish between the different platforms they released games for, and some of their games behave differently depending on the value, so we want to control that value independently. We don’t know if there are post-Infocom games that read the value and care about it as well.

We use the interpreter version, e.g. it’s “N” (the 14th letter) for Ozmoo 14, but there isn’t room for a minor version number.

And there isn’t any place in $1E and $1F where we can say, with very little risk of confusion, that this is the Ozmoo interpreter specifically.
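For reference, a minimal sketch (in Python, on a fabricated header buffer, not a real story file) of how the two fields under discussion sit in the header:

```python
# Sketch: reading the interpreter number ($1E) and interpreter
# version ($1F) from a Z-machine story header held as a byte buffer.
# The offsets follow the Z-Machine Standard; the sample values
# below are illustrative only.

def interpreter_info(header: bytes) -> tuple[int, str]:
    number = header[0x1E]        # platform ID assigned by Infocom, e.g. 6 = IBM PC
    version = chr(header[0x1F])  # a single character, e.g. "N" (14th letter) for Ozmoo 14
    return number, version

# Fabricated 64-byte header stamped with number 6 and version "N".
header = bytearray(64)
header[0x1E] = 6
header[0x1F] = ord("N")
print(interpreter_info(bytes(header)))  # (6, 'N')
```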

True, but is it a good idea anyway? I think everyone pretty much agrees that making games change behavior based on interpreter is a bad idea™ that goes against the point of a standard specification. Maybe I just don’t understand.


Isn’t that exactly what writing an Ozmoo signature into the header would accomplish, though? Letting the game know what interpreter it’s running on to act on it appropriately?

The specific reason I know of right now that a game author might want to use this information, is that Ozmoo extends the standard by providing eight additional colours in the palette.

If a game is running on an interpreter that is fully compliant with standard 1.1, the game can set the colour to any 15-bit colour value using set_true_colour, but Ozmoo can’t fully comply with the standard, and it doesn’t support setting arbitrary colour values.
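For anyone unfamiliar with the true-colour format mentioned here: standard 1.1 packs a colour into a 15-bit word, 5 bits per channel, with red in the low bits. A quick sketch of the encoding (the conversion from 8-bit channels is my own illustration, not anything from the standard):

```python
# Standard 1.1 true-colour words use 5 bits per channel:
# red in bits 0-4, green in bits 5-9, blue in bits 10-14.
# Sketch: converting 8-bit-per-channel RGB to that format by
# keeping the top 5 bits of each channel.

def to_true_colour(r: int, g: int, b: int) -> int:
    return (r >> 3) | ((g >> 3) << 5) | ((b >> 3) << 10)

print(hex(to_true_colour(255, 255, 255)))  # 0x7fff (white)
print(hex(to_true_colour(255, 0, 0)))      # 0x1f (pure red)
```

An interpreter with a fixed palette, like Ozmoo, can’t honour arbitrary values of this kind, which is the gap being discussed.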

Yes.

What about instead adding the gestalt opcode? It’s intended for extensions to the standard.

Never heard of it. Googled, and found it in the proposal for standard 1.2. The section ends with:

“Before using the opcode, ensure that you are using an 1.2 standard interpreter by checking whether the standard revision header (address $32 ) is $0102 or higher.”
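In code terms, the quoted check amounts to reading the big-endian word at $32 and comparing it against $0102 (sketch on a fabricated header buffer):

```python
# The standard revision number is the big-endian word at $32 of
# the header: high byte = major revision, low byte = minor.
# Per the quoted 1.2 proposal, gestalt should only be used when
# this word is $0102 or higher.

def supports_standard_1_2(header: bytes) -> bool:
    revision = (header[0x32] << 8) | header[0x33]
    return revision >= 0x0102

header = bytearray(64)
header[0x32], header[0x33] = 1, 1  # interpreter claims standard 1.1
print(supports_standard_1_2(bytes(header)))  # False
```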

Ozmoo doesn’t comply with standard 1.2. Or 1.1. Or even 1.0.

Well, the main thing is that opcodes could be parsed even if they then get ignored. For example, I don’t think Ozmoo needs to actually support true colour to say it’s 1.1, just the opcode.

But if you really can’t do that, then I’d think any Ozmoo extensions should only be available when it’s used as a bundled interpreter, not as a general interpreter. And so if it’s used as a bundled interpreter then it doesn’t actually need to indicate anything in the header.

Opinion noted.

Don’t the signature bytes overlap with where a username would be stored (which itself conflicts with the way Inform uses the last 4 bytes)?

What about using the header extension table? That’s kind of what it is for. A word for 16-color support could be added to the standard pretty easily.

The details are in the release notes.

There are eight bytes in the header which were used by Infocom for username. The compiler writes the compiler version in the last four. Now Ozmoo typically writes the Ozmoo signature in the first four.
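A sketch of how a game might probe those eight bytes ($38–$3F, assuming the standard header layout). The `OZMOO_SIG` value below is a placeholder, not the actual signature — the real bytes are in the release notes:

```python
# The eight "username" bytes occupy $38-$3F of the header.
# Inform writes its compiler version into the last four ($3C-$3F);
# Ozmoo writes its signature into the first four ($38-$3B).
# OZMOO_SIG is hypothetical -- see the release notes for the
# actual signature bytes.

OZMOO_SIG = b"XXXX"  # placeholder, NOT the real signature

def running_on_ozmoo(header: bytes) -> bool:
    return header[0x38:0x3C] == OZMOO_SIG

header = bytearray(64)
header[0x38:0x3C] = OZMOO_SIG   # interpreter stamped its signature
header[0x3C:0x40] = b"6.42"     # example Inform compiler version
print(running_on_ozmoo(bytes(header)))  # True
```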

We wanted to use this information in a project right now, not make a proposal for a standard extension which may or may not make it into the standard in 1, 2, 3 or 10 years. We opted for a simple solution which we think has an extremely low risk of causing trouble for anyone at all.

I’m just confused what the advantage of using this over the standard “interpreter number” field is.

It causes trouble for other interpreters - games will have behavior that players may come to expect, and other interpreters can’t supply it unless they pretend to be Ozmoo.

Edit:
Honestly if it were me I’d just add the word to the header extension table regardless of standard compliance. You already aren’t claiming standard compliance, but I think Ozmoo is fairly popular and I could see 16-color mode being something people might want to support. I’d support it in mine, gladly (despite mine not being public right now, the intent is that it will be someday). The chances of conflicting with some other future use of the extension table is low - when was the last time anything was added to it?

Another Edit: One possibility using the header extension table is to write the highest color number supported there. That way a 16, 32, or higher color mode can be supported. The only ambiguity is the specific colors assigned to each number, but that’s something that can be worked out later.


Actually, I can see a valid reason for not setting this to a new number (beyond the general “don’t do interpreter-specific stuff” argument): setting a new interpreter number will force specific behavior in existing Infocom games that already check this field.

I think the header extension table is the cleanest option that doesn’t put other interpreters in a bind if they want to support everything Ozmoo does.


How does an interpreter add something to the header extension table? The table is fixed during compilation and can’t easily be changed by the interpreter at runtime. It’s not even certain that there is a header extension table in the game.

Interpreters already need to set values in the header extension table (specifically to clear unsupported Flags3 bits, but there are other uses too). Initial values are set at game compilation time and overwritten by the interpreter as needed. The address of the header extension table is in the header. If there isn’t an extension table, then the address is zero. The first word of the table indicates the number of following word-sized entries. For the interpreter, reading beyond the length of this particular table (or if the table itself is missing) is guaranteed to return zero.

When inspecting capabilities that may or may not exist beyond those already controlled by header flags, this is the mechanism for that. I would think there’s very little difference (from a game programming perspective) between this and having a game interpret specific bytes in the header in a non-standard fashion.

The only meaningful limitation is that the header extension table only exists for V5+ games, so it cannot be used for earlier Z-code versions, but we’re talking about newly written games supporting a new feature.
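The read rules described above can be sketched like this (fabricated memory image; the table address lives in the header word at $36):

```python
# Safe read of word N from the header extension table.
# The table address is the big-endian word at $36; word 0 of the
# table holds the number of following words. A missing table
# (address 0) or a read past the declared end yields 0, per the
# standard's guarantee quoted above.

def read_extension_word(memory: bytes, n: int) -> int:
    def word(addr: int) -> int:
        return (memory[addr] << 8) | memory[addr + 1]

    table = word(0x36)
    if table == 0:
        return 0            # no extension table at all
    count = word(table)
    if n > count:
        return 0            # beyond the table's declared length
    return word(table + 2 * n)

# Fabricated memory: a 3-word extension table at address $40.
mem = bytearray(0x50)
mem[0x36:0x38] = (0x40).to_bytes(2, "big")  # table address in header
mem[0x40:0x42] = (3).to_bytes(2, "big")     # table length: 3 words
mem[0x44:0x46] = (16).to_bytes(2, "big")    # word 2: e.g. highest colour number
print(read_extension_word(bytes(mem), 2))   # 16
print(read_extension_word(bytes(mem), 9))   # 0 (past the end)
```

Which word number a “highest colour supported” entry would live at is exactly the kind of thing that would need to be agreed on; word 2 here is only a placeholder.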

If that is insufficient (for example: they want to add 16-color support to V4 or earlier games) or too complicated, then (since we’re already breaking the standard) I’d use one of the unused high bits of Flags2 in the header. At least it doesn’t repeat Infocom’s mistake of making behavior change based on a specific interpreter identifier.
