I7 Parser test suite?

I’m pretty sure I know the answer given the lack of any search results, but in hope of not re-inventing a wheel I thought I’d give it a shot anyway:

Is there any publicly available test suite to put the parser through its paces? Point being, of course, so that it could then be used with extensions that modify the parser to check for unwanted side effects.

If not, well, I'd been meaning to get around to learning regtest anyway.

https://github.com/erkyrath/Inform6-Test is a partial library test suite. It uses a modified regtest script.

EDIT-UPDATE: URL is now https://github.com/erkyrath/Inform6Lib-Testing .

I intended to use the DM4 exercises as the primary test list, but I only got as far as ex52. I haven’t touched that repo for several years.

(Since it’s based on the DM4, I used the 6/11 library. Some changes will undoubtedly have to be made for 6/12.)

Note that this is not the same as https://github.com/erkyrath/Inform6-Testing , which is a compiler test suite. Sorry; bad choice of name.
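Since the suite leans on regtest: for anyone who hasn't seen it, a regtest test file looks roughly like this (the game name and check lines here are invented for illustration; see the regtest docs for the exact directive syntax):

```
** game: testgame.ulx

* take-lamp
> take lamp
Taken.

* bad-verb
> xyzzy the lamp
That's not a verb I recognise.
```

Lines starting with `>` are commands fed to the game; plain lines after them are substrings that must appear in the resulting output.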


(The reason this bogged down is that the DM4 exercise answer list only contains a few key lines for each exercise. You have to boilerplate the rest of the code to get a working, testable game. This is boring.)

Rename the first to Inform6lib-Test(s)?

Does Github break repository remotes when you do that? I guess I could read the docs.

…no, the old URLs keep working. Okay fine. :)


Thanks. I was thinking of I7’s parser; I should have been more clear. I’ll update the subject.


You could take the same approach: extract all the exercises from the I7 manual, compile them, run the test example in each one. I think you can automate the entire process, since those examples are intended to be complete and self-testing.
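Extracting the commands is mechanical, since each example's test is declared in the source with a `Test me with "..."` line. A sketch (this assumes the common single `Test me` form; examples that define multiple named tests like `Test foo with "..."` would need a little more work):

```python
import re

def extract_test_commands(source: str) -> list[str]:
    """Pull the commands out of an I7 'Test me with "..."' declaration.

    Commands in a test script are separated by slashes, e.g.
    Test me with "take lamp / go north".
    """
    m = re.search(r'[Tt]est me with "([^"]*)"', source)
    if not m:
        return []
    return [cmd.strip() for cmd in m.group(1).split("/")]
```

Run that over every extracted example, feed the commands to an interpreter, and diff the transcripts between compiler or library versions.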


Graham Nelson does exactly that, doesn't he? Pretty sure I read that. But he hasn't published his test framework.

I already have the examples extracted and compiled, so I’ve got a head start and that certainly covers a lot of ground – thanks.

What I was really hoping for was something that (aspired to) hit every path through the parser code, something that would at minimum elicit every error, use ‘all’ with all the commands that take it, etc.

So I’ve started on that, but I won’t be finished any time soon…
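For tracking progress, a small coverage check helps: scan the transcripts the tests produce and report which expected responses have been elicited. (The response strings below are a tiny illustrative sample of real parser errors, not the full list.)

```python
# A hand-picked sample of parser-error responses to look for in
# transcripts; the real suite would enumerate the complete set.
PARSER_ERRORS = [
    "That's not a verb I recognise.",
    "You can't see any such thing.",
    "I didn't understand that sentence.",
]

def coverage(transcripts: list[str], errors=PARSER_ERRORS):
    """Return (seen, missing): which expected responses appeared
    as substrings in any transcript, and which never did."""
    seen = {e for e in errors if any(e in t for t in transcripts)}
    return sorted(seen), sorted(set(errors) - seen)
```

The `missing` list then becomes the to-do list of responses still needing a test.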


Going through Syntax.preform would be a good place to start for testing that methodically. It's in one of the subdirectories of the I7 internals; the exact location depends on the platform. (On macOS it's Inform.app/Contents/Resources/Internal/Languages/English/Syntax.preform.)

There is no public documentation for Preform, but it's relatively easy to follow if you already know Inform 7. There are a few things that aren't obvious. For example, the underscore operator requires that the next token be lowercase, a caret negates a match, and square brackets come right after a | but affect the alternative that was just terminated by that |. Nonterminals that start with if don't match any tokens, but the match fails if the condition isn't true.


One of the eternal ambiguities of talking about IF systems, especially I7: I’m looking to test the player command parser from Parser.i6t, not the parser of I7 code in ni. Of course, I7 being I7, they’re not completely unrelated inasmuch as Syntax.preform shapes handling of both I7 code as input and adaptive text as output by the player command parser (and everything else)…

Thanks for the clues about reading Syntax.preform – I had previously looked for documentation and found its absence.


To clarify, even if it's off-topic: there is actually full documentation for Preform (at least for 6L02/6L38).

It wasn't meant to be public at first (at least until the release of 6L02), but someone put it on the French translation repo and we never took it away. It's docs/Syntax.pdf in the following repo. (I'd advise downloading the full repo rather than clicking the file in the web interface, which will try to load the PDF in the browser and slow things down.)

(I would prefer if this PDF had a better home, but I suppose it won’t matter anymore once Inform 7 is open source.)


I can’t recall having ever been more grateful for an off-topic reply. :heart_eyes:


Ah, scope creep. I still don’t have tests that produce all of the parser errors, but I am more than halfway through a suite that generates all 391 responses in the whole Standard Library.


At some 235 or so tests in, I'm through nearly all the straightforward cases. I feel it's useful enough to share… and, now that most of the remaining responses are each a nuisance to create, I don't have the will to finish it promptly.

There are two important parts here, an extension called Testcase Manor.i7x and a regtest file, plus a trivial story.ni that includes the extension. The contents of the extension could be embedded in the story file, of course, but my thinking is that the extension could be of general utility in testing extensions. Include it and you get a variety of containers, supporters, devices, lit and dark environments, and persuadable and unpersuadable NPCs to play with.

So my question to anyone reading this who has an opinion: if I commit this to the Friends of I7 Extensions repo, where should I put the regtest file and story.ni? (I figure it should be a scheme that can be extended for future regtest files or other ancillary files.) I figure that the end of the path should be Zed Lopez/Testcase Manor/. I had been thinking of putting it under docs, but that would end up being included in the I7 extensions Github Pages gratuitously. So maybe another top-level directory alongside docs and the author-named dirs called 'files'?
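Concretely, the layout I'm imagining looks like this (the regtest file name here is just a placeholder):

```
extensions repo
├── docs/
├── files/
│   └── Zed Lopez/
│       └── Testcase Manor/
│           ├── story.ni
│           └── testcase-manor.regtest
└── Zed Lopez/
    └── Testcase Manor.i7x
```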

(Unfortunately, this means that ni will carp about even more “not legitimate extension” files at the bottom of Extensions.html for anyone directly using a cloned I7 Extensions repo as their external dir. It’d probably be ideal if the extensions were under an Extensions dir under github.com/io/extensions and docs and files could be alongside Extensions. But that would be annoying to change at this point.)



My advice would be to make a github repository with the extension, test scripts, makefiles, and whatever else is needed. Then someone can grab the repository and run the tests without having to learn about the complications of I7 extension archiving.