CYOA and IF

A situation in which no one wins, after having ferociously competed (and therefore acted towards their goals).

The players in question may be computers, but that’s irrelevant for this discussion, surely, because we’re discussing the game, not the players. IF played by a bot would still be IF - it would not magically be a transcript, even though that’s what we, as the non-players, would perceive.

Then what happened to all our ideas of what play is?

C.

The discussion has gotten to the point where I’m more comfortable reading up on other people’s more informed thoughts and less comfortable chiming in, but if I take the bit I quoted as an idea of “play”, then it’s still on the mark. The fact remains that chess is chess, regardless of who plays it - even if the “people” are “computers”, and they’re only programmed to try and reach a winning condition. Or, to pick up on my IF example, the bot is still going through an IF play session, issuing regular, workable commands, even if they’re nonsensical and inefficient in context.

Unless I’m misunderstanding something, which is also likely.

What you say here is very logical, but it’s in search of something that isn’t there. What you have advanced is Plato’s theory of forms in different clothing. The theory of forms is one of the most specious canards in the history of philosophy. There is no such thing as a prototypical table or chair or story or game. There is only the assemblage of all actual tables or chairs or games. It makes as much sense to look for a prototypical game as it does to look for a prototypical human. DNA doesn’t work according to prototypes; each gene codes for a few traits. There is no gene that codes for the ‘essential’ traits because there is no essential trait. There is only a surge of similarities that runs through a population like a river. As human-invented forms, stories/games would certainly participate in the same unfocused (and unfocusable) surge of similarities. You are trying to eliminate conflicting versions of things from your reality, which is a noble effort, but the problem with it is that conflicting versions of things ARE reality, so it is doomed to fail.

Extending the evolutionary metaphor, there was no Adam Story and Eve Game at the outset of human culture playing with a Toy Apple and Action Figure Snake: from the very start they have all been loose populations of generally similar things with a lot of mutual overlap, and they all evolved through human culture in parallel, exchanging genetic material as each individual work of art argued separately (via the search for popular acceptance) for its own inclusion in an arbitrary milieu of entertainment categories. That’s why it’s impossible to focus your mind’s lens sharply on a dividing line without stepping wrong and excluding things that common sense will tell you shouldn’t be excluded. It’s your common sense, not your philosophy, that has the best handle on the actual reality here: which happens to just be a pile of stuff shoved together under a rule of thumb for the sole purpose of browsing convenience, and nothing more.

You are definitely to be forgiven for looking for more, however. People have been doing it for thousands of years. They have always failed, but that hasn’t stopped them from trying. In a sense, programmers are the last Platonists. We tend to think in terms of abstract classes that confer their properties onto member objects by way of classification. So objects get their properties via their class, right? That is exactly the way Plato thought, but this philosophy of the world has been debunked time and again by history’s greatest skeptics. Classes have no correspondence to reality other than the fact that they are things invented pragmatically by real humans to conveniently suit assemblages of real objects.
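
To make that contrast concrete, here’s a minimal Python sketch (all the classes and traits are made up for illustration): in the class-based habit, an object’s properties flow down from its class; in the assemblage view, there is only a degree of overlap between actual examples.

```python
# The "Platonist" habit: properties are conferred top-down by the class.
class Game:
    has_players = True
    has_goal = True

chess = Game()  # chess is declared a Game, so it "inherits" the essence

# The assemblage view: no essential traits, only overlap between real examples.
def overlap(a, b):
    """Degree of similarity: shared traits / total traits (a made-up measure)."""
    return len(a & b) / len(a | b)

chess_traits = {"players", "goal", "rules", "turns"}
cyoa_traits  = {"reader", "story", "choices", "goal"}
novel_traits = {"reader", "story", "chapters"}

print(overlap(cyoa_traits, chess_traits))  # ~0.14: a little game-like
print(overlap(cyoa_traits, novel_traits))  # 0.40: rather more story-like
```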

I cannot hope to do justice to these arguments in this space (and I haven’t come close), but Sextus Empiricus and David Hume are good reads. 8)

Paul.

EDIT: Those dudes are kinda dry though. Robert Anton Wilson’s nonfictional speeches and writings are a fun (if less rigorous) way to be exposed to some of the same sorts of thinking…

Robert Anton Wilson explains Quantum Mechanics

Nothing?

And, for many works of IF, would be preferable :slight_smile:

Although I confess that my feelings about Robert Anton Wilson’s take on quantum theory are as strong as the next guy’s, I hope you’ll forgive me if we don’t utterly sidetrack this thread…

As I understand the way the ideas are usually employed, the difference between prototype theory and Plato’s forms is that forms are fairly rigidly defined, and can’t be measured. The idea behind prototype theory is that, yes, you do have in some sense an ideal ‘chair’ somewhere in your noggin – this is where the two theories are similar – but it’s just there for comparison. It’s not a set, in other words.

The existence of prototypes is in some sense validated by measuring response times. If you are given a number of pictured creatures, and asked, “Is this a bird?” – some birds you’ll answer yes to quickly, while for others it’ll take longer.

Now why do you know a sparrow is a bird immediately, while a penguin, an ostrich or a dodo takes longer? Presumably because the sparrow is more like the prototype. You get there quicker.

So the prototype is just an inferred object which has exactly those traits and qualities such that no other set of traits and qualities would allow quicker reaction times. A lucky escape for prototype theory!

(Or for Plato’s theory of forms, depending.)
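
One way to cash out that “inferred object”: treat the prototype as the trait-wise average of the exemplars encountered so far, and predict that verification is fastest for the members nearest to it. A toy Python sketch – every trait and bird sample here is invented for illustration:

```python
import statistics

# Invented 0/1 trait vectors: (flies, small, sings, swims, extinct).
birds = {
    "sparrow": (1, 1, 1, 0, 0),
    "robin":   (1, 1, 1, 0, 0),
    "finch":   (1, 1, 1, 0, 0),
    "wren":    (1, 1, 1, 0, 0),
    "penguin": (0, 0, 0, 1, 0),
    "ostrich": (0, 0, 0, 0, 0),
    "dodo":    (0, 0, 0, 0, 1),
}

# The inferred prototype: the trait-wise average of the birds you've met.
prototype = [statistics.mean(v[i] for v in birds.values()) for i in range(5)]

def distance(vec):
    """Euclidean distance from the prototype; the prediction is that
    'Is this a bird?' response time grows with this number."""
    return sum((a - b) ** 2 for a, b in zip(vec, prototype)) ** 0.5

for name in sorted(birds, key=lambda n: distance(birds[n])):
    print(f"{name:8s} {distance(birds[name]):.2f}")
# the sparrows come out nearest, the penguin and dodo farthest
```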

Conrad.

ps - Animals do this too. Birds will protect and show interest in eggs depending on their size and the kind of spots on them. This is measurable, to the point where scientists can create fake eggs that look substantially different from the birds’ actual eggs – a different size, larger spots, and so forth – such that the birds will prefer them over their own eggs when it comes to nosing one or the other back into the nest.

To my knowledge, no one has claimed that birds have egg prototypes coded away in their brains, but it looks like the same thing to me.

It seems to me that if two automatons playing chess are “really” playing chess, the only articulation of gameplay remaining is the goal-oriented activity version. Chess computers are mindless, but they are certainly goal-oriented.

There’s no room for anything subjective, for words like John’s “recreational” or “meaningful.”

Conrad.

This was the interesting part. The idea that validation would be relevant, let alone necessary :slight_smile:

I’m sorry that you don’t seem to understand. If there’s some way I can break it down into simpler terms for you, let me know.

Also, my name is S. John.

Exactly so.

My apologies, S. John.

C.

No worries; my mother still calls me “Johnnie,” which I haven’t been since 4th Grade :slight_smile:

Yes, neither DNA nor other real things work according to prototypes. But prototypes were never meant to be conceived as out there. They are conceptual or even mental thingies that real things can be more or less like unto. That’s the very purpose of a prototype theory of concepts (or semantics).
That way prototype theory dispenses with both real inherent essences and any essential conditions for belonging to a given class of things. Everything belongs to every class to a greater or smaller degree – the more like a certain prototype concept (that no real thing needs to be exactly like), the more it belongs to the class defined by similarity to that prototype.

To my mind, the greatest skeptics are often good reading, but much less convincing than the moderate ones. Besides, given that classes are artefacts, why should we conclude that things don’t really belong to those (invented) classes?

Zackly. They are points formed by artifice, valuable in artifice to artifice, which is exactly what they need to be. Their relative positions and “masses” are, necessarily and beautifully and inevitably and correctly, subjective (though the criteria involved may be, and often are, entirely objective).

Someone who uses the term “sports” will have a gravitational center for “planet sports,” centers for orbiting satellites like “winter sports” and “team sports” and nearby planets like “games” and “exercise” and so on … and any given thing will be drawn more usefully (in subjective context) to one body than another, or (in some cases) float in Lagrange points :slight_smile: And since the “planets” can be reasonably rearranged very dramatically according to different needs and nuances of usage, the Internet has yet another way to suck :slight_smile:

Speaking of, the discussion of whether Video Game X is more “a game” or “a sport” is a conversation I never tire of staying out of … Mainly because it’s so often undertaken by those with some kind of medical/emotional need for terms with boundaries rather than centers.

I have a certain professional interest in knowing better how the mind actually works. Besides which I think it’s nifty…

Prototype theory has real applications to writing. (Indeed to any kind of communication.) Understanding prototypes lets us know what can pass being unsaid. If you mention a chair in a room, your audience is likely to understand, without further specification, that it’s a wooden straightback chair: not a lawn chair, a metal folding chair, or a Barcalounger.

Regarding the question of CYOA in IF competitions, it also seems that a prototype approach could be fruitful, and this is in part because it is a descriptive model of how the mind actually works, and not a prescriptive conceptual set-up, like logical definitions are. That has a couple of implications.

For example, consider the worry that CYOA works might interfere with IF works getting fair scores. CYOA is relatively easy to write – not as much programming – and therefore the more difficult IF works might not be able to compete. But at the same time, CYOA is a more limited form (I think most of us believe).

Prototype theory might suggest a couple of approaches to this. For example, if every judge is considered to have a well-developed notion of what CYOA and IF are, it might be enough simply to ask the judges to rate each work and assign it to a category: IF or CYOA.

Just crowdsource the problem, in other words. That might be one solution to the original question.
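
For what it’s worth, here’s a minimal sketch of that tallying (the ballot format and works are hypothetical): each judge gives a score plus a category label, and the plurality label decides where the work competes.

```python
from collections import Counter
from statistics import mean

# Hypothetical judge ballots: each judge gives a score and a category label.
ballots = {
    "Work A": [(8, "IF"), (7, "IF"), (9, "CYOA"), (8, "IF")],
    "Work B": [(6, "CYOA"), (7, "CYOA"), (5, "CYOA"), (6, "IF")],
}

for work, votes in ballots.items():
    scores = [score for score, _ in votes]
    labels = Counter(label for _, label in votes)
    category, _ = labels.most_common(1)[0]  # the judges' plurality call
    print(f"{work}: category={category}, average score={mean(scores):.1f}")
```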

Conrad.

My response already expressed my agreement on this point. I continue to agree. If you need me to agree again, just let me know.

Neither observation matches my experience with making them. If they match yours, that’s interesting. I wonder what the differences were in your own works?

In my experience, there are ways in which CYOA is the more limited form and there are ways in which parser-driven is the more limited form. Both balance out in terms of the goals I tend to have for them. Perhaps your goals have differed, and that’s the basis of our differing perspectives. I can’t even imagine a universe in which someone who’s made both would describe CYOA as “relatively easy” to make compared to a parser adventure. They are equally easy to make poorly, equally difficult to make well. They do call on different skill-sets, but they require comparable effort to achieve comparable doses of awesome.

I think this would (or should) be an embarrassment to the awards if such a rule were implemented.

It’s only a sidetrack if you don’t agree with me. For those who agree, it’s exactly on topic because it renders this thread moot.

Thanks for the link. Essentially what you are saying is that a prototype is a human prejudice, and I agree that this is true insofar as it exists, without necessarily agreeing that people really do mostly think of the same thing when they think of a ‘bird’, because I think that’s culturally determined and therefore useless for defining things like story and game in a generally useful way. I could define ‘game’ in a way that works for my peer group; it wouldn’t be that hard, but nor would it be really meaningful. It’ll probably lead me astray in the end: I’d be better off not doing it and keeping an open mind, instead.

That’s a big presumption. Maybe it’s because penguins and ostriches and dodos have so many cultural referents (i.e. stories attached, like Batman’s Penguin, or ostriches having their heads in the sand, or dodos suffering from human-caused extinction) that so much extraneous information is also called up (because neural networks are indiscriminate that way – related neurons fire together), and it’s the process of deselecting the extraneous information that slows response time. So rather than sparrows being closer to some ideal, penguins and ostriches and dodos could just be laden down with other information that is orthogonal to the question of prototypes. The experiments with animals have the same issue, but also another issue: their responses could be evolutionarily selected based on what is the optimal type of egg for survival (which is also orthogonal to the question of prototypes). Response time, or any measure of the amount of attention paid to something, really tells you nothing here, as there is no evidence of a relationship between focus of attention and prototypicality.

I am not questioning that there is such a thing as more common types of bird, and that sparrows resemble the more common types more than the others do, but if prototype theory is only making a statement about commonality, then it isn’t really saying anything important, so I figure it is attempting to go further than that, and that is where I don’t follow the logic.

If it has no existence in reality then of what use could it possibly be, besides as a way for a person who uses it to exclude something from some arena it should probably not be excluded from? I suspect the theory’s logic and its motives. Why do we need to know something that isn’t true, anyway? And why do we need to cloak what is basically prejudice in scientific terms? So we can all figure out a way to be in agreement? But we don’t need to all be in agreement, and anyway, I don’t want to live in that world. If we all could agree on what is a story and what is a game, then that would mean that all of the experiments have ended. The whole point of half of the art innovations that ever were was to challenge the pre-existing definitions of what it means to make art in that medium. It’s a good thing those innovations were always introduced into a milieu of disagreement and uncertainty, else they might have been entirely ignored as categorically invalid.

That’s the trouble with thinking in categories. It is only useful for one thing: exclusion.

Paul.

I’d like to delve into this for a moment. What are the challenges in writing a good multiple-choice game, as distinct from writing a good parser game?

IMO, on the multiple-choice game side, most authors struggle with two main challenges:

  1. Managing (exponentially?) branching plot lines. (A sketch of this follows below.)

  2. Designing “meaningful” choices that the player cares about.

A parser game could have these problems as well, if it has a highly branching plot line. But there are many good text adventures with essentially one plot line, gated by a linear series of interesting puzzles.
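
To make challenge 1 concrete, here’s a toy Python sketch (all node names invented) of a choice graph. Re-convergent paths – mentioned again later in this thread – are the standard way to keep the number of playthroughs from growing faster than the number of nodes the author has to write:

```python
# Hypothetical CYOA graph: node -> list of (choice text, next node); None ends.
story = {
    "start": [("Enter the cave", "cave"), ("Follow the river", "river")],
    "cave":  [("Light a torch", "camp"), ("Feel your way ahead", "camp")],
    "river": [("Build a raft", "camp")],
    "camp":  [("Sleep until morning", None)],  # the branches re-converge here
}

def count_paths(node):
    """Count distinct playthroughs from `node`; without re-convergence this
    grows exponentially with depth, which is exactly challenge 1."""
    if node is None:
        return 1
    return sum(count_paths(nxt) for _, nxt in story[node])

print(count_paths("start"))  # 3 playthroughs, yet only 4 nodes to write
```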

On the other side, a parser game does have two challenges unique to its format:

  1. Responding well to what the user typed, especially if the user typed something the author may not have expected. This includes responding well to unreasonable actions (did you remember to mark all of your furniture as fixed in place?) as well as disambiguating objects correctly. (A sketch of this follows below.)

  2. Signaling to the player what verbs are available, especially in the context of a puzzle, where the author doesn’t want to name the verb explicitly.

That second point is where parser games can be more limiting (for the author) than multiple-choice games. How do I signal to you that you can type things like: “BLACKMAIL QUINTUS” or “ANNEX SUDETENLAND”? In a multiple-choice game, I can put a button on the screen, and then describe the consequences and implications of your decision.
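
And a toy illustration of challenge 1 on the parser side (the handlers are invented; real IF systems handle all of this far more thoroughly): the author’s burden is supplying a graceful default for every verb/noun pair the player might try.

```python
# Invented command handlers; everything else falls through to a default.
handlers = {
    ("take", "lamp"):  "You pick up the brass lamp.",
    ("take", "table"): "The table is fixed in place.",  # furniture marked fixed
    ("rub", "lamp"):   "A genie appears!",
}

def respond(command):
    words = command.lower().split()
    if len(words) != 2:
        return "I only understood you as far as wanting a VERB and a NOUN."
    verb, noun = words
    # The hard part: a sensible reply to everything the author didn't expect.
    return handlers.get((verb, noun), f"You can't {verb} the {noun} here.")

print(respond("take lamp"))          # an anticipated action
print(respond("blackmail quintus"))  # unanticipated, but answered gracefully
```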

But I generally agree that by giving more control to the author, you can take control away from the player; this can make some things easier and some things harder for the author.

Yeah. Any story which requires a lot of personal or abstract choices (or exploring questions related to same) collides rudely with the boundaries of parser games. There are only so many ways to represent your character’s moods/suspicions/affections/worry/fear/intentions/etc by fiddling with objects and moving between rooms before the sense of blunt-instrument gimmickry induces eye-rolling. It’s a limitation that can be overcome, but even the methods for overcoming it have to be carefully exchanged and recycled before they, too, feel like blunt instruments (or, in some cases, like borrowed CYOA). Then, to really feel some parser boundaries, set the game at the Woodstock festival, or in Central Park on a Saturday afternoon, without gimmicks which limit the player’s ability to interact as he might naturally be able to. To feel even more boundaries, give the story a dozen highly-divergent plotlines.

Parser games deal with the limitations of the form the same way anything else does: by finding the comfort-zone center away from the boundaries, and staying there most of the time, venturing to the edges only for the occasional bold experiment, ironic meta-commentary, or outright parody. Hence, lots of single-plot games where fiddling with objects and moving from room to room is the most reasonable course of action, and where those “rooms” are often defined by their sense of isolation. Neither CYOA nor parser can offer tactical infinity, but the parser makes an implied promise of tactical infinity on which it must constantly renege and replace with illusion (or outright confession of inability). That gives it many design limitations from which CYOA doesn’t suffer, because CYOA never promises anything except the enjoyment of making explicit choices and exploring their consequences.

CYOA, of course, suffers from equally visible limitations, and comparably retreats most often into its own comfort-zone center, where choices are meaningful but very finite, where some paths re-converge, where divergent plots are embraced and explored at the expense of deep exploration of any one of them, and where the player doesn’t feel that an obvious and reasonable course of action is being artificially excluded. CYOA makes the promise of divergence, and the writer must work hard to fulfill that promise, or the player feels cheated … very distinct from the promise of the parser, where it’s accepted and understood by enthusiasts (if not newcomers) that the game needn’t actually fulfill the promise, only pay it a certain level of attention before smiling lamely and holding up empty hands.

Pressing the boundaries in either form can result in a sense of gimmickry; steering clear of the boundaries in either form avoids that problem but highlights the limitations of each form anyway. And both require facility with writing and game design, and neither makes either of those things any easier to do well.