Techniques for visualizing logic flow and variables?

Aha! I understand your point regarding the ‘when’ of generation, and you’re right. That ‘when’ doesn’t matter to the player, as long as it’s sometime before opening the locker.

(And that is useful to me as a newbie designer/programmer, as thinking through that computational effort in terms of ‘when’ is important!)

What I wanted to avoid was the details changing ‘on the fly’ so as to fit into the puzzle or story, if that makes sense.

So, the in-game steps that I see are:

1 - Investigate Locker: provide the player with a large number of items in the locker, all of which have some level of semiotic connection to the faction question.

2 - Player chooses from among the items in the locker (let’s say 10 of them) the ones they believe prove the faction allegiance of the locker owner.

3 - Player submits this list to a judge of some kind (perhaps a teacher or the Mean Girl queen of the schoolyard). The judge evaluates the submission and then reveals the actual faction allegiance of the student, at which point the player has succeeded or failed.

This process allows the player to see how well they assessed the faction based on the items chosen.
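
To make sure I’m being concrete (and since I’m trying to learn the programming side too), here’s roughly how I picture that loop, as a throwaway Python sketch rather than actual Inform; the item names, the Nerd/Jock axis, and all the numbers are just placeholders:

```python
import random

# Rough sketch of the three steps above, not real game code.
# Every item's faction leaning is rolled once, at generation time,
# on a single hidden axis (-1.0 = pure Jock, +1.0 = pure Nerd).

def generate_locker(rng, n_items=40):
    nouns = ["souvenir photo", "ticket stub", "trading card", "gym sock",
             "annotated paperback", "protein bar", "dice bag", "varsity pin"]
    return [{"name": f"{rng.choice(nouns)} #{i}",
             "nerd_jock": rng.uniform(-1.0, 1.0)}   # hidden from the player
            for i in range(n_items)]

def judge(chosen, owner_allegiance):
    # Step 3: the teacher / Mean Girl weighs the evidence the player picked
    # against the owner's allegiance, which was also fixed at generation time.
    evidence = sum(item["nerd_jock"] for item in chosen) / len(chosen)
    guess = "Nerd" if evidence > 0 else "Jock"
    return guess == owner_allegiance

rng = random.Random(12345)                          # seeded before play begins
locker = generate_locker(rng)                       # step 1: investigate locker
owner = "Nerd" if rng.random() < 0.5 else "Jock"    # decided up front, never warped
picked = locker[:10]                                # step 2: stand-in for the player's picks
print("Success!" if judge(picked, owner) else "Wrong call.")   # step 3: the reveal
```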

This makes total sense! Everything in the locker should have some proximity to the faction allegiance; but I’m considering that the items might sit on a spectrum, so as to make the details more spread out and less obvious.

Going back to the notion of things being changed on the fly… what I want to avoid is those 10 items the player chooses becoming more valuable simply because they’ve chosen them; the aesthetic experience should include the risk that the player has guessed at a semiotic importance that turned out to be wrong.

It’s possible that I’m still confusing the ‘when’ of generation here!

But it seems to me that procgen’s ability to create a lot of detail with distributed granularity means it can create something like… a souvenir photo from a visit to the Colosseum. Internally, it’ll have a Nerd/Jock score… but the player won’t know that score till they submit all 10 items and get a response back. Maybe it’s high on Nerd because it’s a trip to a historical site, or high on Jock because it’s a temple to sports!

Trying to figure that out, experiencing that uncertainty, is part of the game that I want to make. But I want to make sure that the allegiance doesn’t warp to or against the player choice.
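
To pin down what I mean by ‘not warping’: both readings of the Colosseum photo are plausible, but generation commits to one of them (and to the reason) long before the player picks anything. Another made-up Python doodle, just to convince myself the idea holds together:

```python
import json, hashlib, random

rng = random.Random()

def make_souvenir_photo():
    # Both readings are plausible; generation picks one and records *why*,
    # so the judge's later verdict can't bend toward or away from the player.
    reading = rng.choice([
        {"nerd_jock": +0.7, "because": "a pilgrimage to a historical site"},
        {"nerd_jock": -0.6, "because": "a temple to sports"},
    ])
    return {"name": "souvenir photo of the Colosseum", **reading}

item = make_souvenir_photo()

# Testing-only paranoia: fingerprint the generated state so I can prove to
# myself that nothing shifted between generation and judging.
fingerprint = hashlib.sha256(json.dumps(item, sort_keys=True).encode()).hexdigest()
print(fingerprint[:12], "locked in before the locker is ever opened")
```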

Somewhat connected: I recently found a comment I wrote elsewhere about mystery games that was the initial genesis for this idea:

I wonder about creating a mystery that has so many details that replaying is about finding more details, not in finding a different conclusion. A harder challenge for example. Or it’s like JFK where there’s just so much information that no single experience will encompass it all.

This version of the idea didn’t include creating a new student every time, but it does highlight the experience of having TOO MUCH information and the gameplay being about deciding what to keep as much as anything else.


I understand what you’re getting at, but I think you’re still looking at it a little backwards. The thing you care about (and what the player could notice) is whether or not the puzzle remains in a consistent state. That’s just (to a first-order approximation) avoiding contradictions.

Choosing items necessarily makes them more “valuable”, in the sense that each piece of (non-red-herring) evidence reduces the volume of the problem space that can contain the solution, assuming the puzzle is solvable at all.

But the point I want to make is that if you want the player to experience uncertainty (having to make decisions with incomplete information and possibly being wrong), and you’re creating the puzzle(s) via procgen, then either a) that’s actually a specific functional design goal of the procgen algorithm, or b) you have no way of knowing whether the particular puzzle any given player gets will produce that effect.

To use an analogy…ever play the early Final Fantasy games (or other similar console RPGs)? The way random encounters work isn’t exactly procgen in the sense we’ve been talking about in this thread, but I think it illustrates a “gotcha” of using randomness in design.

It’s pretty clear that the intent was something like “the player should have a random encounter about once every 10 steps”, so they implemented a system where every step it picks a number between 1 and 10 and gives the player an encounter if it’s a 1. But of course that means that sometimes you get an encounter several steps in a row, and sometimes you walk a couple dozen steps without seeing one.
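
If you sketch that naive approach out (this obviously isn’t the actual Final Fantasy code, just the shape of the idea), the streakiness falls right out of it:

```python
import random

# Naive approach: every step, roll 1..10 and spawn an encounter on a 1.
rng = random.Random(1)

def walk(steps=10_000):
    gaps, since_last = [], 0
    for _ in range(steps):
        since_last += 1
        if rng.randint(1, 10) == 1:
            gaps.append(since_last)
            since_last = 0
    return gaps

gaps = walk()
print("average gap:", round(sum(gaps) / len(gaps), 1))   # ~10, as intended...
print("shortest:", min(gaps), "longest:", max(gaps))     # ...but wildly streaky
```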

Players aren’t going to keep a spreadsheet of the number of steps between encounters to discover that, yeah, it’s actually “fair” and for every time they had a run of an encounter every step there was a corresponding point where it was longer between encounters so it actually all averages out to about one encounter every ten steps.

If you want the player to feel like they’re getting an encounter every few steps you have to do something else. Like after every encounter pick a number between say 5 and 15 and spawn an encounter after that many steps. Or start out with zero percent chance of an encounter and incrementally increase the chance per step to produce a desired distribution. Or whatever.
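
For concreteness, both of those alternatives boil down to a few lines each (the constants here are made up; the point is the shape of the gap distribution, not the numbers):

```python
import random

rng = random.Random(1)

# Option 1: after each encounter, schedule the next one 5-15 steps out.
# Average is still ~10, but no back-to-back encounters and no long droughts.
def bounded_gap_walk(steps=10_000):
    gaps, since, next_in = [], 0, rng.randint(5, 15)
    for _ in range(steps):
        since += 1
        next_in -= 1
        if next_in == 0:
            gaps.append(since)
            since, next_in = 0, rng.randint(5, 15)
    return gaps

# Option 2: chance starts at zero and ramps up each step, resetting on a hit.
def ramping_walk(steps=10_000, ramp=0.02):
    gaps, since, chance = [], 0, 0.0
    for _ in range(steps):
        since += 1
        chance += ramp
        if rng.random() < chance:
            gaps.append(since)
            since, chance = 0, 0.0
    return gaps

for name, gaps in [("bounded", bounded_gap_walk()), ("ramping", ramping_walk())]:
    print(name, "avg:", round(sum(gaps) / len(gaps), 1),
          "min:", min(gaps), "max:", max(gaps))
```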

My observation is that it sounds like you’re fixated on what you might call abstract “fairness” or simulationism or something like that: modelling the game problem the way it would work in the real world. And the point I’d make about that is that doing it is almost always a separate thing from producing whatever specific gameplay effect you’re after (an emotional reaction in the player, conjuring a vibe, whatever).

Usually when you’re doing “conventional” gamedev, you handle this sort of thing by just jiggling the handle until it feels right and then you publish and you hope players feel the same way. With procgen stuff you can’t really tweak the “feel” the same way, except by explicitly building it all into the procgen process.

And again, not trying to talk you into or out of anything here. Just trying to hammer a specific point about procgen stuff…speaking from lots of experience doing procgen stuff. It’s one of the reasons why doing almost anything via procgen is, like, orders of magnitude more work than just hand-designing everything.

I don’t get the sense that you’re doing so, or that we’re arguing! As a neophyte designer with some solid narrative design experience (a few decades as a professional theatre director), I feel confident in my aesthetic instincts but am completely aware of my lack of experience and basic skills… which is what precipitated the question in the first place!

And I do take your point about my fixation on ‘simulationism’ when it is really just, as you say, avoiding contradictions.

But… my other reasons for considering procgen for this have to do with two elements: replayability and volume of detail.

I could hand-design this game for the aesthetic I’m going for, and it would probably be faster! But then, even if I create 100 items and the player can only pick 10, once they ‘solve’ it, it’s done… and they’d carry a meta-awareness of the value of the items into the next playthrough. I’d love for this design to allow each playthrough to feel fresh. (Not unlike a roguelike, though not as intentionally brutal as that genre.)

And volume of detail is attached to that: it seems to me that procgen allows both for replayability and for an interesting distribution of detail across a wide number of items. (Of course, it will have to be tailored; pure randomness might be ‘fair’ but wouldn’t necessarily feel true.)

Ryan Veeder’s Inform project listed in my first post was useful, because it does something similar in terms of generation.

Thinking back to my initial post and your earlier contributions, would a graph make more sense than a grammar? I’ve been attracted to this topographical map idiom!

I absolutely understand the impulse and don’t want to discourage you…but it’s probably worth pointing out that it’s very easy to overestimate the degree to which people will replay your game.

If the replaying is baked into the game (like a modern rogue-lite or soulslike, where repeating the game loop is the core mechanic) that’s one thing, but if it’s a single overarching mystery story then you probably want to really think about how your game is going to make the case for a replay to the player while they’re playing the first time.

I’d also say that adding detail is almost always easier to do by hand instead of doing it via procgen, with the exception of games with very sparse environments.

Again, not trying to talk you out of anything. And I 100% also get that you might want to design a game in a particular way simply because that’s the kind of game you feel like building. These days pretty much nobody does parser-based IF for commercial reasons, and so “I just kinda want to make a game like this” is a pretty good bullet item in the design rationale. I’m just kinda talking through the thing.

I really don’t have a feel for either what the larger design looks like or what would be easier for you to conceptualize.

For me, I tend to find myself using graphs for situations where what I care about is what the relationships between the bits look like, and a grammar more when I’m more concerned about “process”.

So for example I’m currently using an explicit graph-based model for procgen map stuff and for NPC dialog stuff. For the map stuff, the main map is “static” in the sense that it’s constructed like a regular IF map. But it connects at specific points with blobs of procgen stuff. This isn’t what I’m doing, but imagine a game where you’re a bird watcher. There’s a town and just outside of it is a forest. The town is static, but every time you go out into the woods it’s a little procgen map, constructed based on the current game state (so things you do in the rest of the game influence what happens in the procgen sections).
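
In throwaway Python rather than the actual Inform, and with the bird-watcher example standing in for what I’m really doing, the shape is something like this:

```python
import random

# Static town graph, hand-authored like a normal IF map.
TOWN = {
    "Town Square": ["General Store", "Forest Edge"],
    "General Store": ["Town Square"],
    "Forest Edge": ["Town Square"],   # fixed connection point for the procgen blob
}

def build_forest(game_state):
    # Derive the seed from game state, so progress elsewhere in the game
    # changes what the woods look like this time out.
    rng = random.Random(game_state["day"] * 1000 + game_state["birds_logged"])
    rooms = [f"Forest {i}" for i in range(1, rng.randint(5, 8))]
    forest = {"Forest Edge": [rooms[0]], rooms[0]: ["Forest Edge"]}
    for a, b in zip(rooms, rooms[1:]):            # a simple path through the trees...
        forest.setdefault(a, []).append(b)
        forest.setdefault(b, []).append(a)
    loop_a, loop_b = rng.sample(rooms, 2)         # ...plus one shortcut loop
    forest[loop_a].append(loop_b)
    forest[loop_b].append(loop_a)
    return forest

print(build_forest({"day": 3, "birds_logged": 7}))
```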

For the dialog stuff, conversations with NPCs work more or less like a two (or more) player board game, where at any given moment the abstract conversation state is a space on the board, and the player and the NPC(s) take turns moving to a different space. The board itself isn’t static, but is constructed based on the game state, player knowledge, NPC state, and so on.
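
Stripped way down (the move names here are invented, and this is nothing like the real implementation), the board amounts to a dictionary of spaces and the moves leading out of them, rebuilt whenever the conversation starts:

```python
# Each "space" is an abstract conversation state; each edge is a move that
# the player or the NPC can take on their turn. The board is built from the
# current NPC/game state rather than authored once and reused.

def build_board(npc_state):
    board = {
        "small talk": ["ask about the forest", "say goodbye"],
        "ask about the forest": ["press for details", "say goodbye"],
        "press for details": ["say goodbye"],
        "say goodbye": [],
    }
    if npc_state.get("trusts_player"):            # game state reshapes the board
        board["press for details"].insert(0, "share a secret")
        board["share a secret"] = ["say goodbye"]
    return board

def legal_moves(board, space):
    return board[space]

board = build_board({"trusts_player": True})
print(legal_moves(board, "press for details"))    # -> ['share a secret', 'say goodbye']
```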

On the other hand I use an explicit grammar model for the state-machine-based crafting system module I posted a while ago. It implements a little toy compiler that translates statements in a “recipe” syntax into in-game state machines (which are themselves rulebooks and rules under the hood).
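
Purely to illustrate the flavor (this isn’t the posted module, and the recipe syntax here is invented), the grammar approach boils down to: parse a declarative rule, emit a transition table:

```python
# Toy "recipe" compiler: each rule 'a + b -> c' becomes one transition in a
# little state machine keyed on the sorted set of inputs.

def compile_recipes(text):
    machine = {}
    for rule in text.split(";"):
        lhs, rhs = (part.strip() for part in rule.split("->"))
        inputs = tuple(sorted(tok.strip() for tok in lhs.split("+")))
        machine[inputs] = rhs
    return machine

def step(machine, held):
    return machine.get(tuple(sorted(held)))

bread = compile_recipes("flour + water -> dough ; dough + heat -> bread")
print(step(bread, ["water", "flour"]))   # -> dough
print(step(bread, ["dough", "heat"]))    # -> bread
```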

In the former cases I’m more worried about the (varying) nature of the relationships between the bits that are being modeled, and in the latter case I’m mostly interested in being able to just declare more or less explicit transformation rules.

I think I’m getting it!

I appreciate that you haven’t been trying to talk me out of it while also highlighting some design pitfalls common to the direction I’m taking. This helps quite a bit.

And you’re correct; this isn’t a commercial project. I’m trying to teach myself Inform 7 as well as some procgen understanding, and the game concept leaped out to me as I was considering how to apply the above to something.

I think I’ll try to figure out a graph for this, since I’m interested in the varying nature.

To anyone else following this thread, I hope it was useful! Alongside the resources posted, I’ve also found these:

(And I’ll add more links if I find more resources!)