Z-machine dynamic memory - 64K is too much?

While digging around because of my questions around the previous topic:
Z-machine memory

I wrote a quick and dirty program to print out the size of dynamic memory and ran it against all Infocom titles, and many older Inform ones. I was rather surprised to see that dynamic memory rarely exceeds 32K, and pretty much never exceeds 48K. I had expected most of the later Infocom games and many of the Inform ones to be much closer to 64K. I’m wondering if the Inform games I chose were representative or not.
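The original poster's tool isn't shown, but a minimal equivalent only needs to read one header word. Per the Z-Machine Standard, the big-endian word at header offset 0x0E holds the base address of static memory, and everything below that address is dynamic memory, so that word is also the dynamic memory size. A sketch in Python (the function name is mine):

```python
import struct

def dynamic_memory_size(path):
    """Return the size in bytes of a story file's dynamic memory.

    The Z-machine header word at offset 0x0E holds the base address of
    static memory; all bytes below that address are dynamic memory, so
    the word doubles as the dynamic memory size.
    """
    with open(path, "rb") as f:
        header = f.read(64)  # the Z-machine header is 64 bytes
    if len(header) < 64:
        raise ValueError("file too short to be a story file")
    # Z-machine words are big-endian
    return struct.unpack(">H", header[0x0E:0x10])[0]
```

Run over a directory of `.z3`/`.z5`/`.z8` files, this gives the numbers being discussed here (e.g. a result of 0x8000 would be the full 32K).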


I’m wondering if the Inform games I chose were representative or not.

If they were all older games, they were not representative by definition!

Okay, there are several things going on here.

Infocom games:

Remember that through the v5 era, Infocom was still targeting 8-bit machines. A stock Apple 2e had 64k of (real) RAM. Infocom’s interpreter had to divide that between interpreter code, Z-machine RAM (kept in hardware RAM), and Z-machine ROM (swapped from floppy). That meant a tight practical RAM limit for their games. Practical ROM limit too, given that a floppy was 140k. I don’t remember what these limits were, but they definitely kept track of them for all their target platforms.

For the v6 games (Zork Zero, Journey, Arthur, Shogun), I believe they only targeted 16-bit machines like the Amiga and Apple 2gs. That changed the playing field, but I haven’t dug into how.


Inform started with the idea of replicating Infocom’s capabilities, so it took the original limitations as an assumption.

By the time people were using Inform 5 and 6, that was no longer true. Most authors’ practical target was z5 running on a modern machine (32-bit addressing, hundreds of megabytes of RAM). So there was no reason not to use 64k of RAM.

However, Inform was still pretty good at optimizing RAM use. The “normal” design patterns of Inform 6 – the kind of programming taught in the DM4 – did not eat gobs of RAM. You added objects one at a time. An object had some RAM cost (properties, attributes, dict words) and some ROM cost (game text and code), but the ROM cost was usually much higher. So most games ran into the 256k z5 ROM limit before the 64k RAM limit. Graham added z8 to address that, but it turned out that you still had to be making a frickin’ huge game to worry about RAM.

Now, I said “normal” design patterns. There were of course plenty of ways to use gobs of RAM if you wanted to. (They all started with “define some gigantic arrays.”)

I7 began to adopt some of those patterns into common use. Many-to-many relations required a lot of RAM. Dynamic strings and lists required a RAM heap, which was (under the hood) a gigantic array.

Up through Inform 7 6G60, these features were considered “extra”. The compiler went to some effort to not build them into a game unless the author explicitly invoked them. If you did, your RAM usage probably approached 64k.

However, the fact was that if you wanted to do stuff like this, 64k wasn’t good enough. The exact amount of RAM needed for dynamic memory features was, well, dynamic. The compiler made an estimate based on how much stuff your game contained. These estimates were generous. You didn’t have to be writing a very large game before Inform said “Uh, it’s getting to be time for Glulx.”

So, starting with 6L02, the Z-machine just wasn’t the primary development target. Inform normalized the use of dynamic memory features. (“Indexed text” became just “text”, for example.) You could still target the Z-machine, but it was easier to switch to Glulx than to try to scrape out every byte of Z-machine RAM.

TL;DR: There has never been a time when the 64k RAM limit was the critical bar-to-duck-under for the majority of IF authors. Either the critical limit was lower, or ROM was more important, or it was easier to dodge the whole shebang by leaving the Z-machine behind.


According to Moby Games, all the V6 games were available for the Apple IIe.

No, that’s definitely not right.

Journey: The Quest Begins (1989) release dates - MobyGames says that there was an “Apple 2” version of Journey, but the screenshots are for the 2gs. That’s how I played it. There was no 2e version as far as I know.


There is an Apple 2e version of Journey at the Internet Archive here: wozaday_Journey directory listing

EDIT: I can’t seem to get it to start properly, though. The text is garbled. Perhaps it needs some memory expansion peripheral?

An Apple 2e version of Arthur is here, but it won’t even load.

I haven’t played the IIe version myself, but there’s one listed on eBay right now: Journey : The Quest Begins (1989, Apple II) 5.25" Diskettes (B10) | eBay

I guess there is a possibility that these are fakes made to scam eBay buyers.

EDIT: I got them to work. You just have to put disk 3 in drive 2, not 1. They all seem to require a fair bit of swapping and flipping the six diskettes during play.

Shogun is here:

Zork Zero:

Okay! Thanks for the correction.



Thanks for the information. Sorry for the clickbaity subject. I was just surprised the usages weren’t higher. I get that the ROM limit was probably the bigger pain point.

Edit: Just for kicks, I once played around with using jump instructions to create stories well beyond the normal ROM limit (in the double digit megabyte range) and was able to run them successfully on Frotz. Just a curiosity really, but interesting anyway.

One of the motivating factors I had in pulling together a 'terp for the TRS-80 Models 1 & 3 was that even with a 48k memory restriction, I could still make many of the post-z3 games run on the platform. There were three that couldn’t due to memory size requirements (AMFV, Beyond Zork, Trinity) - and that was partly due to me using some memory as a disk cache to make the games playable on a real machine (reducing disk I/O was important). Other games were not playable due to the screen being 64x16 characters, but I still enjoyed that I could play Sherlock, Nord & Bert, ZTUU, BorderZone, and Arthur (without graphics).
As a bonus, pretty much anything PunyInform will run smoothly.


I’m considering finishing my Z-machine library written in Rust enough to release it publicly. Rust only targets one 16-bit platform that I know of, while the rest are 32- or 64-bit, meaning there probably isn’t much need to optimize it to fit smaller architectures.


If I read the documentation correctly, that would be the MSP430. That would be quite the feat to have it run on some of those models.

Yes, it’s the MSP430 and I agree it would be difficult to make it work. My library will work in an environment without the Rust standard library, but currently it does require some heap allocation at startup. Removing that would almost certainly require shrinking the Z-machine stack size and limiting dynamic memory.