Currently, the Dialog distribution contains two files of actual Dialog code: stdlib.dg, the standard library included in most projects, and stddebug.dg, a library extension that adds various debugging commands. They’re just placed in the root directory right now so they can be easily found.
However, @sue has just added a new extension, unit.dg, for unit tests. And I know @hlship has various extensions of his own that he’s hoping to add. So this seems like a good time to discuss how we want to organize these going forward—and how we want to document them, since currently stdlib.dg is the only one with any external documentation! Even stddebug.dg, despite the std- prefix, doesn’t get the documentation the library does; you just have to poke through the code yourself.
I’d like to propose two options, though they’re not the only ones:

1. We make a new directory called something like /lib or /contrib or /ext, where all extensions (any libraries that aren’t stdlib.dg) go, including stddebug.dg, which might get renamed to simply debug.dg. Right now, the documentation is divided into a language book and a library book; a new third book would contain documentation for each extension, one page per extension. This means that people downloading a Dialog release get all the extensions along with it, but it also means that new versions of an extension have to wait for the next Dialog release before they can go out.

2. We make a new repository under the Dialog-IF organization, along the lines of https://github.com/i7/extensions/, where all extensions (as above, including stddebug.dg) go. Antora can pull from multiple repositories when building the manual, so they could be documented alongside the language itself (in a third book, as above) without being tied to Dialog’s release cadence; alternatively, we could do a separate Antora build for the extensions.
Personally, I prefer the second, because it took us a year to release 1a/01 and I don’t know when the next version will come out. When we need those extensions in the main repo (e.g. unit.dg for library unit tests), the automated build process can grab them from the other repository, or just keep its own copy for internal use (ensuring that a new version of unit.dg doesn’t make the main build fail).
But what does the community think? I’m also going to tag in @bkirwi, who handles most of the automated build systems, in case there’s some big advantage or disadvantage I’m missing.
@hlship already has the start of that second idea going, with some of his contributions there. Perhaps he might be amenable to generalizing that to the community?
Another question we’ll need to answer at some point is what our curation policy is.
If the manual is a factor in the decision, I think you should delay deciding how to organise anything until it’s proven that Antora can actually produce a usable search index. The sites it builds look nice, but currently you can’t search for any Dialog syntax, because the index appears to have been created with punctuation stripped out. That’s a fairly fundamental issue: you can’t search for language syntax like (if), and searching for plain if isn’t going to be much use. It wouldn’t be great if the same limitation were inherited by anyone documenting or using an extension.
Good point! I’m currently working on adding cross-references to the syntax index, which should help with that, but it would be nice to also be able to search.
I’d say to keep stdlib and stddebug together as they’re likely closely bound.
Extensions can be tricky; I’ve divided my extensions into a few parts, but they have inter-dependencies. And I often follow the lib + debug-lib convention, so there’s tc.dg for threaded conversation, but also tc-debug.dg.
I’m currently handling dependencies with … a comment at the top of the file.
Perhaps dialog-tool could help formalize something: some kind of dgt install … command that could automatically download the necessary files, put them in the right place, and update the dialog.edn file, along with dependencies. The sky’s the limit, but it’s a waste if I’m the only one using it.
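Just to illustrate what a tool could key off, here’s a rough sketch (Python, and the `%% requires: foo.dg` header-comment convention is entirely made up for the example — it’s not something Dialog or dgt actually defines) of scanning a project’s source files for declared dependencies and reporting any that aren’t satisfied:

```python
import re

# Hypothetical convention: a .dg file declares dependencies in header
# comments like "%% requires: tc.dg" (Dialog line comments start with %%).
REQUIRES = re.compile(r"^%%\s*requires:\s*(\S+\.dg)\s*$")

def declared_dependencies(text):
    """Return the dependency filenames declared in a file's header comments."""
    deps = []
    for line in text.splitlines():
        m = REQUIRES.match(line)
        if m:
            deps.append(m.group(1))
    return deps

def missing_dependencies(sources):
    """Given {filename: contents}, return {filename: [deps no source satisfies]}."""
    present = set(sources)
    missing = {}
    for name, text in sources.items():
        absent = [d for d in declared_dependencies(text) if d not in present]
        if absent:
            missing[name] = absent
    return missing
```

A tool like dgt could run a check like this before invoking the compiler, and either warn or go fetch the missing files.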
dialog-tool looks amazing to me, but it’s macOS-only, right? That seems like a blocker for it becoming part of the larger Dialog ecosystem. Is there any chance of it becoming available on other platforms?
Homebrew is less prominent on Linux than on macOS, in my experience, but it does work; I use it to get a more up-to-date version of Frotz than my regular package manager provides.
I’ve been trying to figure out a way to allow something like (include) tc.dg in the source itself, but there are a lot of annoying logistics to figure out for that.
It is probably easier to declare what should have already been defined and then generate a warning or error if it isn’t defined, i.e. check for evidence that a declared dependency has already been loaded, instead of actively trying to load it.
I think the lack of an (include) is probably an intentional part of the design.
That’s more feasible, technically, but I’m not sure how useful it would be: if library X modifies something from library Y, then X has to be loaded before Y. So at the time X is being processed, there’s no evidence of Y’s existence yet.
It would be possible to add something like (expect) tc.dg, which ensures that a file named tc.dg appears after the current file in the source list, but that could only check the name, not anything else about it (since, well, the file hasn’t been parsed yet!).
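To make that concrete, a name-only check like the proposed (expect) could be enforced by a quick pre-compile pass over the ordered source list. This is a hypothetical sketch in Python — the (expect) syntax is just the proposal above, not an existing feature:

```python
def check_expectations(source_order, expectations):
    """source_order: filenames in compile order.
    expectations: {filename: [files it expects to appear later]}.
    Returns a list of violation messages (empty if the order is fine)."""
    position = {name: i for i, name in enumerate(source_order)}
    problems = []
    for name, expected in expectations.items():
        for dep in expected:
            if dep not in position:
                problems.append(f"{name} expects {dep}, which is not in the source list")
            elif position[dep] <= position[name]:
                problems.append(f"{name} expects {dep} to appear after it")
    return problems
```

As the post says, this can only validate the name and the ordering; it knows nothing about the expected file’s contents.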
Now, it’s currently possible to do something like this:
[Library X]
(startup)
	(if) ~(test for library Y) (then)
		WARNING: You need library Y! (par)
	(endif)

[Library Y]
(test for library Y)
But that doesn’t scale well, and since it only applies at runtime, it won’t help explain all the mysterious compile-time errors that come from missing a library or including libraries in the wrong order.
It’s a thorny problem! Letting an external tool like dgt handle it is probably the easiest option, because it can examine the files before the compiler is invoked.
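For example, an external tool that knows each file’s “must come before” relationships (however they end up being declared) could compute a valid compile order with an ordinary topological sort. A sketch, not anything dgt actually does today:

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

def compile_order(modifies):
    """modifies: {file: [files whose rules it overrides]}.
    In Dialog, a file that modifies another must be compiled *before* it,
    so each file is a predecessor of everything it modifies."""
    ts = TopologicalSorter()
    for name, targets in modifies.items():
        ts.add(name)           # make sure the node exists even with no edges
        for t in targets:
            ts.add(t, name)    # t depends on name: name must come first
    return list(ts.static_order())
```

graphlib also raises CycleError for circular dependencies, which would turn a baffling compile failure into a clear diagnostic.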
This is how it works in Inform-7-land, which is the best precedent we have, I think. (Insofar as it’s both successful overall and familiar to IF folks specifically.)
I think we’ll want to handle versioning for extensions separately and more informally. Similarly, extension authors may want to release 10 versions in a month, or only once or twice ever, and that’s fine.
I think the quality and standardization bar for extensions is different. In the stdlib, we spend a lot of time thinking about whether code is in the spirit of the original vision of Dialog, or whether it’s fully documented, etc. IMO it’s actively good that folks can share extensions that are half-baked or experimental or very idiosyncratic, and the QA we do on the way in should be pretty minimal.
I know that Inform has a smaller set of “blessed” extensions that get bundled with every release and pass some additional quality bar. That may eventually happen for Dialog as well! But I think it makes sense to start with the “big bag of extensions” repo, and consider bundling some of the best-loved extensions with the main build later on.
In terms of what this means for the build process… I think whatever tests we do in the extensions repo should be based on the latest released compiler and standard library, not the version in main. (After all, that is the version that folks are using in their projects!) So in that sense it’s good that it’s in a separate repo.
As far as how those tests should work, including requirements for docs or dependencies etc… I propose collecting the extensions first and dealing with that second. (I have some ideas, but the best way to know whether they’re any good will be to have a bunch of extensions collected already.)
It’s possible that we may want to test, in the main Dialog repo, that we don’t break extensions. For now, I’d propose we handle this the same way we test that we don’t break existing games, i.e. by storing a copy in the test directory and writing tests against that.
The individual pieces of it are cross-platform (Java and Babashka).
If you can get Babashka running, you can simply download the .zip file, unpack it, and add the unpacked directory to your path, however you do that on Windows. The only thing that won’t work well is dgt new, as that makes assumptions about Homebrew (in terms of where to copy the standard library from).
I was thinking about packaging everything in a Docker container for use on Windows? That could solve nearly all of the issues (except when dgt skein tries to open a browser window).