I’m pretty sure I know the answer given the lack of any search results, but in hope of not re-inventing a wheel I thought I’d give it a shot anyway:
Is there any publicly available test suite to put the parser through its paces? The point, of course, being that it could then be run against extensions that modify the parser, to check for unwanted side effects.
If not, well, I’d been meaning to get to learning to use regtest…
(The reason this bogged down is that the DM4 exercise answer list only contains a few key lines for each exercise. You have to boilerplate the rest of the code to get a working, testable game. This is boring.)
You could take the same approach: extract all the exercises from the I7 manual, compile them, run the test example in each one. I think you can automate the entire process, since those examples are intended to be complete and self-testing.
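The extraction half of that pipeline can be sketched in a few lines of Python (a hypothetical helper, assuming the examples follow the documentation's standard `Test me with "..."` convention; the compile-and-run steps would still need wiring up to the compiler and an interpreter):

```python
import re

def extract_test_commands(example_source: str) -> list[str]:
    """Pull the commands out of an example's 'Test me with "..."' line.

    Self-testing examples in the I7 documentation embed a line like:
        Test me with "take lamp / east / drop lamp".
    with the individual commands separated by slashes.
    """
    match = re.search(r'[Tt]est me with "([^"]*)"', example_source)
    if not match:
        return []
    return [cmd.strip() for cmd in match.group(1).split("/")]

# A toy example source (invented, for illustration only):
example = '''
"Lamplight" by Anonymous

The Cave is a room. The brass lamp is in the Cave.

Test me with "take lamp / examine lamp / drop lamp".
'''

print(extract_test_commands(example))
# prints ['take lamp', 'examine lamp', 'drop lamp']
```

From there, each command list could be replayed against the compiled example and the transcript diffed against a saved baseline.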
I already have the examples extracted and compiled, so I’ve got a head start and that certainly covers a lot of ground – thanks.
What I was really hoping for was something that (aspired to) hit every path through the parser code, something that would at minimum elicit every error, use ‘all’ with all the commands that take it, etc.
So I’ve started on that, but I won’t be finished any time soon…
Going through Syntax.preform would be a good place to start for testing that methodically. It’s in one of the subdirectories of the I7 internals; the exact location depends on the platform. (On macOS it’s Inform.app/Contents/Resources/Internal/Languages/English/Syntax.preform.)
There is no public documentation for Preform, but it’s relatively easy to follow if you already know Inform 7. There are a few things that aren’t obvious. For example, the underscore operator requires that the next token be lowercase; a caret negates a match; and square brackets come right after a | but affect the alternative that was just terminated by that |. Nonterminals whose names start with if don’t match any tokens, but the match fails if the condition isn’t true.
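To illustrate those rules with an invented nonterminal (not a real entry from Syntax.preform, just the operators as described above):

```
<made-up-example> ::=
    take _the lamp | [annotation applying to the previous alternative]
    take ^lit lamp |
    <if-in-darkness> grope
```

Reading it by those rules: the first alternative requires the literal lowercase word ‘the’; the bracketed material sits after the first | but modifies the alternative before it; ^lit matches any word except ‘lit’; and <if-in-darkness> consumes no tokens, failing the whole alternative unless its condition holds.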
One of the eternal ambiguities of talking about IF systems, especially I7: I’m looking to test the player command parser from Parser.i6t, not the parser of I7 code in ni. Of course, I7 being I7, they’re not completely unrelated inasmuch as Syntax.preform shapes handling of both I7 code as input and adaptive text as output by the player command parser (and everything else)…
Thanks for the clues about reading Syntax.preform – I had previously looked for documentation and found its absence.
To clarify, even if it’s off-topic: there’s actually full documentation for Preform (at least for 6L02/6L38).
It wasn’t meant to be public at first (at least not until the release of 6L02), but someone put it on the French translation repo and we never took it away. It’s docs/Syntax.pdf in the following repo. (I’d advise downloading the full repo rather than clicking on the file in the web interface, because the web interface will try to load the PDF in the browser, which is slow.)
(I would prefer if this PDF had a better home, but I suppose it won’t matter anymore once Inform 7 is open source.)
Ah, scope creep. I still don’t have tests that produce all of the parser errors, but I am more than halfway through a suite that generates all 391 responses in the whole Standard Library.
Some 235 tests in, I’m through nearly all the straightforward cases. I feel it’s useful enough to share… and, now that most of the remaining responses are each a nuisance to create, I don’t have the will to finish it promptly.
There are two important parts here, an extension called Testcase Manor.i7x and a regtest file, plus a trivial story.ni that includes the extension. The contents of the extension could be embedded in the story file, of course, but my thinking is that the extension could be of general utility in testing extensions. Include it and you get a variety of containers, supporters, devices, lit and dark environments, and persuadable and unpersuadable NPCs to play with.
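For anyone unfamiliar with regtest’s format, a fragment of such a suite might look something like this (game filename, test names, and specific checks invented here; each unprefixed line after a > command is text regtest expects to find in the game’s output):

```
** game: TestcaseManor.ulx

* empty-inventory
> inventory
You are carrying nothing.

* take-scenery
> take the fireplace
That's hardly portable.
```

Each starred test runs from a fresh start of the game, so tests stay independent of one another.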
So my question to anyone reading this who has an opinion: if I commit this to the Friends of I7 Extensions repo, where should I put the regtest file and story.ni? (I figure it should be a scheme extensible to future regtest files or other ancillary files.) The end of the path should presumably be Zed Lopez/Testcase Manor/. I had been thinking of putting it under docs, but then it would end up being gratuitously included in the I7 extensions GitHub Pages. So maybe another top-level directory called ‘files’, alongside docs and the author-named dirs?
(Unfortunately, this means that ni will carp about even more “not legitimate extension” files at the bottom of Extensions.html for anyone directly using a cloned I7 Extensions repo as their external dir. It’d probably be ideal if the extensions lived under an Extensions dir within github.com/i7/extensions, with docs and files alongside Extensions. But that would be annoying to change at this point.)
My advice would be to make a github repository with the extension, test scripts, makefiles, and whatever else is needed. Then someone can grab the repository and run the tests without having to learn about the complications of I7 extension archiving.