I believe I have the correct area to post this question. I am trying to understand the Z specification and with the help of some of the posts here I am further along the way (thanks to all the contributors). I have the Folly interpreter running with debug set. When I run Zork 1, this is the first couple of lines from the debug log:
Finished release [optimized] target(s) in 0.06s
Running `target/release/examples/term zork1-r119-s880429.z3 --debug`
My question is: why do the call addresses actually executed differ from the call addresses listed in the operands? Even after scaling the addresses in the operands by 2, there is still an offset of about 7 in one case and 10 in another.
First up, your dumps are showing the words in a confusing mismatched endianness, so I have reversed them below (I think I got it right). So the first call, broken down into opcode, operand types, operands, and store variable, should be:
00050d0 a1a000b000 e0 03 2afd 83a4 ffff 00 e197
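That is: e0 is the VAR-form call opcode; 03 is the operand-types byte; 2afd 83a4 ffff are three large-constant operands, i.e. the packed routine address plus two arguments; and 00 is the store variable, meaning the result goes on the stack. The trailing e197 belongs to the next instruction.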
So the first call goes to 2*2afd = 55fa, and the first byte there is 03:
Section 5.2 says routines start with a byte telling us how many locals there are, so this routine has 3 locals. Then, because this is a version 3 storyfile, section 5.2.1 tells us that the initial values of those locals follow, two bytes each. In this case they're all zero. So the first instruction is at 55fa + 1 + 3*2 = 5601, as the log says. And again, it's another call, which I broke down into its parts.
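That header skip is exactly where the offsets in your first question come from: 1 byte for the local count plus 2 bytes per local, so 7 for a routine with 3 locals. A minimal sketch of the lookup (my own illustration, not Folly's actual code), assuming memory holds the raw bytes of a version 3 story file:

fn first_instruction_addr(memory: &[u8], packed_addr: u16) -> usize {
    // Spec 1.2.3: in versions 1-3, a packed routine address is the byte address / 2.
    let routine = packed_addr as usize * 2;    // 0x2afd * 2 = 0x55fa
    // Spec 5.2: a routine begins with its local-variable count (0 to 15).
    let num_locals = memory[routine] as usize; // 0x03 here
    // Spec 5.2.1: in versions 1-4, a 2-byte initial value follows for each local.
    routine + 1 + 2 * num_locals               // 0x55fa + 1 + 6 = 0x5601
}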
Thanks a million for your quick feedback. Yeah, I need to see whether the hexdump tool I used has a setting to dump individual bytes rather than 16-bit little-endian words.
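(If it helps: hexdump with no options prints two-byte words in the host machine's byte order, which is what scrambled these; hexdump -C, or xxd, prints the individual bytes in file order.)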
And then, based on the debug output that I have, it goes to the routine at address 2*0x2afd. My question this time is: what does the value 0x03 at address 0x50d6 do for me?
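That 03 at 0x50d6 is the operand-types byte that follows every VAR-form opcode such as e0: four 2-bit fields, most significant pair first, each giving one operand's type (spec 4.4.3). A minimal sketch of the decoding (again my own illustration, not Folly's actual code):

#[derive(Debug)]
enum OperandType {
    LargeConstant, // a 2-byte constant follows
    SmallConstant, // a 1-byte constant follows
    Variable,      // a 1-byte variable number follows
    Omitted,       // no operand
}

// Spec 4.4.3: the byte after a VAR-form opcode holds four 2-bit
// operand-type fields, read from the top bits down.
fn decode_var_types(types: u8) -> [OperandType; 4] {
    let field = |shift: u32| match (types >> shift) & 0b11 {
        0b00 => OperandType::LargeConstant,
        0b01 => OperandType::SmallConstant,
        0b10 => OperandType::Variable,
        _ => OperandType::Omitted,
    };
    [field(6), field(4), field(2), field(0)]
}

decode_var_types(0x03) gives [LargeConstant, LargeConstant, LargeConstant, Omitted]: three 2-byte operands (2afd, 83a4, ffff) and no fourth. So that 03 is not a locals count; it just tells the interpreter how to read the operands that follow it.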