Back in February I wrote a script to generate a contents page for the Friends of I7 extensions repo. But what I really wanted to see was something that could say which extensions work and which don’t. Generating the raw data for that is easier than it sounds, and that much was working in March. Displaying it all in some half-decent manner was harder, but it’s finally here (though there’s still plenty of HTML/CSS polishing to be done).
The environment was Linux, with the Inform 6 6.33N binary from the Inform 7 package. I compiled only for Glulx, not Z-code (I was surprised to find there aren’t any Z-Machine-only extensions). The methodology was to create a project whose story.ni included exactly one extension, the one under test, with source that printed the complete list of extension credits (i.e., with extensions using authorial modesty included), printed the regular list of extension credits, and immediately quit. It was compiled with external set to a directory containing the Friends of I7 extensions repo in its Extensions dir.
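A minimal sketch of generating such a test project in Python. The Inform 7 say-phrases for the two credit lists are the standard ones, but the ni/inform6 command-line flags below are my assumptions about the toolchain, not taken from the original script:

```python
# Sketch: build a minimal story.ni that includes exactly one extension,
# prints both extension-credit lists, and ends immediately. The ni and
# inform6 invocations are assumptions about the CLI, not the author's code.

def make_story(title, author, extension, ext_author):
    return f'''"{title}" by {author}

Include {extension} by {ext_author}.

The Testing Lab is a room.

When play begins:
\tsay the complete list of extension credits;
\tsay the list of extension credits;
\tend the story.
'''

def build_commands(project_dir, internal_dir, external_dir):
    # Two-stage compile: ni emits Build/auto.inf, then inform6 emits glulx.
    ni = ["ni", "-internal", internal_dir, "-external", external_dir,
          "-project", project_dir, "-format=ulx"]
    i6 = ["inform6", "-G", f"{project_dir}/Build/auto.inf",
          f"{project_dir}/Build/output.ulx"]
    return ni, i6

story = make_story("Smoketest", "Tester", "Measured Liquid", "Emily Short")
```

The "complete" list matters because it is the only place extensions hidden by authorial modesty show up.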
The list of what extensions a given extension includes or is included by is based solely on the output; there’s no attempt to parse the extension source.
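Recovering the include relationships is then just a matter of parsing the printed credits. A sketch, assuming each credit line looks roughly like “Title by Author” with an optional version suffix (the exact line format is an assumption):

```python
import re

# Sketch: recover include relationships from the credit lists the test
# story prints; no extension source is parsed. The assumed line format
# is "Title by Author", optionally followed by " version ...".

CREDIT = re.compile(r"^(?P<title>.+?) by (?P<author>.+?)(?: version .*)?$")

def parse_credits(text):
    found = []
    for line in text.splitlines():
        m = CREDIT.match(line.strip())
        if m:
            found.append((m.group("title"), m.group("author")))
    return found

def includes_of(complete_credits_text, tested_extension):
    # Everything credited besides the tested extension itself was pulled
    # in, directly or indirectly, by including it.
    return [(t, a) for (t, a) in parse_credits(complete_credits_text)
            if t != tested_extension]
```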
So there are three main pages: Current, Errors, and Previous (as well as pages per author). Current has the working extensions, for values of “working” that mean nothing blows up when you compile something that includes them and run the resulting glulx file. Errors is for ones where something did blow up. Things with 6G60, 6L02, or 6L38 in their names weren’t tested (without a full-blown VM running an old version of Linux I couldn’t get an old version of ni to work, not even via Docker) and are on the Previous page.
So what would people like to see? Things tried with Inform 6 6.35? The non-for-glulx-only extensions tested with zcode builds? Different organization?
(It’s not impossible that there could be different results on different platforms due to some weird edge case somewhere in Inform 6 or elsewhere – maybe not the difference between working and not working, but at least a potential difference in the exact error output. So it’d probably be ideal to run it everywhere… but I’m not eager to try to automate all of that.)
Good idea! It’s sort of straightforward, even – it already isolates the example code for the copy-to-clipboard feature. And I was already planning to integrate Quixe, which would enable the best way to present the example: just letting someone interact with it live.
I wasn’t expecting a difference for the successful compilation cases. But there are three cases of segfaulting during the inform6 compilation (currently reported badly, with blank I6 error output); I thought it was plausible they might error out gracefully elsewhere.
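On Linux a segfault shows up as the compiler being killed by SIGSEGV, which Python’s subprocess layer reports as a negative return code, so it can be distinguished from an ordinary error exit. A sketch of how the tester might classify that (the helper is mine, not the author’s code):

```python
import signal

# Sketch: classify an inform6 exit. On POSIX, subprocess.run reports a
# child killed by a signal as returncode == -signum, so a compiler
# segfault is returncode == -signal.SIGSEGV, distinct from an ordinary
# nonzero error exit.

def classify_exit(returncode):
    if returncode == 0:
        return "ok"
    if returncode < 0:
        sig = -returncode
        if sig == signal.SIGSEGV:
            return "segfault"
        return f"killed by signal {sig}"
    return f"error exit {returncode}"
```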
This looks really great, thanks for your work on it!
One caveat/issue comes to mind:
With this methodology, there are some reasons why an extension might end up as a false positive in the “Errors” category even though it’s working.
It might rely on other extensions to work properly, but instead of including those extensions itself, it just mentions them in its documentation and tells the game author to take care of that. (Maybe it follows a mix-and-match model with other extensions and wants to leave maximum flexibility to the game authors.)
Such an extension would fail the test if it’s just included in a bare project by itself, but it would actually be working as intended out of the box when the docs are followed.
An extension might rely on some minimal code being present in the game – for example, code which defines a certain table or instantiates a kind of object.
From a cursory glance, I think that is the case for “Achievements” by Juhana Leinonen and “Measured Liquid” by Emily Short, for example. “Achievements” needs a “Table of Achievements”, and “Measured Liquid” needs an instance of a “fluid container” in the world. The “Measured Liquid” problem also causes “Dishes” to be categorized as erroneous, because “Dishes” includes “Measured Liquid”. “Mood Variations” by Emily Short is also affected by a similar issue.
When the right table is defined (or, respectively, a fluid container is declared, etc.), these extensions work correctly out of the box.
So, I’d second Zarf’s suggestion to compile the included examples, which ought to mitigate the points above.
One of my misgivings with the whole endeavor is that there are all sorts of reasons things might be listed under someone’s name as “Errors” that are no fault of theirs. Lots of people can push to the Friends repo; someone introducing a compilation-blocking error in, say, Glulx Definitions by Dannii Willis would create a cascade of broken extensions.
I’m inclined to think it’s a bug in Measured Liquid that if you include it but later comment out the one fluid container you’d made, you can no longer compile… but of course it’s unhelpful to label it “not working” on that count, and it’s especially problematic because it takes out a second extension. I cut Achievements more slack because it documents the need for the Table of Achievements and it’s up against an Inform 7 limitation disallowing empty tables.
Compiling the examples and passing anything whose examples work (with maybe an asterisk describing the outcome of the compiled-in-isolation case) ought to go a long way. But any automated process is going to have some disappointing results.
Just tried it – they are indeed fixed in 6.35 – thank you! One of the three shouldn’t even have counted: despite reporting errors, ni produces an auto.inf for High Performance Indexed Text (which we already knew wasn’t going to work in 6M62), and my code ignored the errors and proceeded with an inform6 compilation anyway. The other two, with 6.35, not only don’t segfault but compile successfully. (And I’ve updated the tester output to report a segfault if one does happen instead of just showing blank error output.)
Oops, the tester had another bug, parallel to trying an inform6 compilation when there were ni errors but an auto.inf had been generated. Autosave’s and Ultra Undo’s auto.infs, compiled with either 6.35 or the current 6.36, report Error: No such constant as "GL__M" and return with exit status 1, but still produce an output.ulx that runs and even prints the credits information. So I was wrong to say they worked now, but of course it’s a big win that inform6 doesn’t segfault.
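The trap in both bugs is the same: a build artifact appearing on disk despite a nonzero exit status. A sketch of a verdict check that keys on exit codes rather than artifact existence (the function and its wording are mine, not the tester’s actual code):

```python
import os

# Sketch: decide how far a compile got. An auto.inf or output.ulx
# appearing despite a nonzero exit (as with the GL__M cases) must not
# be mistaken for success; the exit codes are authoritative.

def compile_verdict(ni_rc, i6_rc, build_dir):
    has_auto = os.path.exists(os.path.join(build_dir, "auto.inf"))
    has_ulx = os.path.exists(os.path.join(build_dir, "output.ulx"))
    if ni_rc != 0:
        return "ni failed" + (" (auto.inf still emitted)" if has_auto else "")
    if i6_rc != 0:
        return "inform6 failed" + (" (output.ulx still emitted)" if has_ulx else "")
    return "compiled"
```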
(Looks like GL__M is something from the I6 template layer that hasn’t been there since 6G60.)
It’s more special-case-y and hardcode-y than I’m thrilled about, but I’ve implemented a scheme to include predefined custom content on a per extension basis, which will get those extensions and a few others to the “Current” page. (Haven’t updated the existing web pages yet.)
Now I’ve updated the Friends of I7 extensions test demo, with some extensions moved from Errors → Current by changing the testing code, and a couple with truly trivial problems moved by actually fixing them in the repo (or at least fixing their outright compilation-blocking issues).
By the way, I like the attention to detail that you’re giving the project, for example the annotation with line numbers. Cool stuff!
If you’re willing to include a few more special cases and/or fixes, these would move the extensions below from Errors to Current.
Three cases require some additional code in the minimal stories to get them working; no changes to the extensions themselves are required:
Basic Plans by Nate Cull:
[Needs definitions from Planner in order to compile, so we include that.]
Include Planner by Nate Cull.
Include Basic Plans by Nate Cull.
The Testing Lab is a room.
Planner by Nate Cull:
[Needs at least one planning-relation declared.]
Include Planner by Nate Cull.
Being-in is a planning-relation.
The Testing Lab is a room.
Graphical Map by Xavid:
[Needs a figure declaration and a graphics file in the Figures subfolder of the project's materials folder. We can take the provided one from the github repo at https://raw.githubusercontent.com/i7/extensions/master/Xavid/Figures/Map.png .]
Include Graphical Map by Xavid.
Figure of Map is the file "Map.png".
The Testing Lab is a room.
The PNG (but any will do, of course):
And there’s also a case which requires a change in the extension itself:
Scopability by Brady Garvin:
Replace the semicolon in line 7 with a full stop: An object can be scopable or unscopable; an object is usually scopable.
→ An object can be scopable or unscopable. An object is usually scopable.
(Of course, for the latter case, I could also update it in the repo myself, although I’d need access.)
Thanks much for seeking these out! I fixed Scopability in the repo. And instead of special-casing a modification when testing Basic Plans, I just added “Include Planner” to it in the repo. Happily for my special-casing sensibilities, for Graphical Map one can get away with just the Figure of Map definition without actually having a figure file in Materials. (Creating a blorb would fail, but the tester doesn’t attempt that.) And for Planner, I followed your suggestion exactly.
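Pulling together the special cases settled in this thread, the per-extension table of extra source might look like the sketch below. The table-driven mechanism and the key format are assumptions about the tester, and “beaker” is a placeholder object name; the splice text itself comes straight from the posts above:

```python
# Sketch: per-extension extra source to splice into the minimal test
# story so extensions needing setup code can compile. Entries mirror
# the special cases discussed in this thread; the mechanism and the
# "beaker" object name are assumptions.

EXTRA_SOURCE = {
    ("Planner", "Nate Cull"):
        "Being-in is a planning-relation.\n",
    ("Graphical Map", "Xavid"):
        'Figure of Map is the file "Map.png".\n',
    ("Measured Liquid", "Emily Short"):
        "The beaker is a fluid container in the Testing Lab.\n",
}

def story_inserts(extension, author):
    # Empty string means: no special-case code for this extension.
    return EXTRA_SOURCE.get((extension, author), "")
```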
So I’ve updated the web pages with four fewer errors. And also:
Includes and Included by lists are finally alphabetized
pages don’t get built for Internal extensions; they’re just noted in the others’ include lists as appropriate
where code was inserted to make it compile, the extension page says what it was
(But the code insertion isn’t visually distinct enough, and the table isn’t formatted appropriately in the Achievements case. sigh.)
11 short months after Andrew suggested it, the smoketester now compiles examples; when successful, it generates a page letting you run the example in Quixe. Next up is automatically trying “test me” and including its output along with the example text… so that most of the time you won’t have to play it. I’ll publish new output this weekend…
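Driving “test me” automatically could be done by running the compiled glulx file under a dumb-terminal interpreter and piping commands in. A sketch under the assumption that a cheapglk-style glulxe binary is on the PATH (the binary name, the need for a quit confirmation, and the transcript trimming are all assumptions):

```python
import subprocess

# Sketch: feed "test me" to a compiled story under a dumb-glk glulxe
# build and capture the transcript. The interpreter name and the quit
# dialogue are assumptions; real output would need more cleanup.

def run_test_me(story_file, interpreter="glulxe", timeout=30):
    script = "test me\nquit\nyes\n"
    proc = subprocess.run([interpreter, story_file],
                          input=script, capture_output=True,
                          text=True, timeout=timeout)
    return proc.stdout

def trim_transcript(raw):
    # Drop blank lines and bare ">" prompts; keep everything else verbatim.
    return "\n".join(l for l in raw.splitlines()
                     if l.strip() and l.strip() != ">")
```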