I disagree. My WIP uses generated “art” as visual cues that something is going on in my parser game, where a player might otherwise miss a change in description. It also uses portraits to flesh out characterization of the NPCs and PCs (which you have a choice of, and who are referred to in the third person). I’ve extensively curated the images to create the effect I wanted, and I think the result is quite appealing.
Now I’m considering ripping all that out, which would mean a lot of changes to how I structure the game and the narrative, and I don’t think those changes would be for the better.
Please don’t. Your game was designed with art in mind, and if you take it out, I suspect you will never feel that the project is quite as good as it could have been. The personal fulfilment of completing a creative project of which we can be proud is one of the things that motivates us to work on our craft. Yes, some people will be revolted (not my word) by seeing software-generated content inside IF, but you’ll probably be prouder of creating something that’s closer to your vision than of creating something on which you’ve compromised in an attempt to please others.
I would like to believe that human intelligence is something special and not replicable by a machine, but all progress in brain science so far indicates that it’s all electricity flowing through wires. The emergent behavior we call human-level intelligence has enormous complexity, but as the years go by I feel we will eventually crack the code of intelligence at this higher level, or rather AI will crack it for(?) us. Look at the simple(!) case of protein folding.
I think most neuroscientists would agree it’s a mistake to treat the small section of an impossibly complex puzzle that we’ve solved as representative of the entire thing.
Perhaps we should distinguish between intelligence and consciousness: machines could probably reach our level of intelligence, but so far no one understands consciousness, so there is no telling whether we are even close.
It’s not about “hope.” The workings of the brain are poorly understood by science; this is not a particularly controversial statement. I read an essay once by a neuroscientist or neurologist that compared studying the brain to trying to figure out what the various ingredients in a soup are and what role they each play in the flavor profile when you’re not allowed to see the recipe, go into the kitchen, or ask the cook any questions—all you can do is taste various soups and try to compare them to each other.
Yeah, at the risk of going fully off topic, “human intelligence is electricity flowing through wires and probably nothing more” seems like a much more naive statement than “we don’t have a complete understanding of consciousness and likely won’t for some time.” To put it mildly, there are unanswered questions in neuroscience about how and why the brain works; we have a much better idea of why software does what we tell it to do.
My mother’s neurologist said that trying to treat the brain is like trying to build a watch with a hammer and a screwdriver. The tools are woefully inadequate.
I consider myself a materialist (though not materialistic), and I’m chiefly interested in the topic with regard to the actual material impact on those whose livelihoods are enhanced or (more likely) reduced by capital’s reaction to the hype surrounding “AI.”
Scientific understanding is chiefly the process of figuring out that we were wrong about how we assumed the world worked. So it was with Aristotle, so it was with Newton, so it has always been, and so it probably always will be until humanity runs out the clock. We do not live in a privileged time in which our understanding of anything - let alone of intellect and consciousness - has finally arrived at the true workings of the universe.
And part of that materialist understanding is an awareness of history: both of the progress of our scientific understanding, and of the cycles in which NFTs/Meta/AI/Quantum Computing are hyped as the Next Big Thing, with fallout that ultimately does nothing more than reveal that those who make financial decisions for capital are really bad at their jobs when it comes to evaluating technological potential.
I look at generative AI in art and I don’t see much difference from ChatGPT. When ChatGPT outputs words, it’s not trying to communicate anything. Just as an image generator makes an image without trying to make art, ChatGPT makes words without trying to say anything.
The mimicry is fascinating, though, and definitely reflects a good portion of what we do… which is integral to discovering more about ourselves.
Well, if CEOs are saying it, it must be true. Because CEOs are the smartest people. We know this because their salaries are so much higher than anyone else’s, and that’s totally merit-based.
I guess nobody told them that AI is basically harmless if you don’t actually put it in charge of anything, like driving a car or making medical decisions.
AI art is probably the least potentially lethal application, except of course for artists, for whom it understandably strikes at the raison d’être of their soul.
With respect: I understand how labor action works. And, with respect, I don’t think I’m missing the point. I’m not criticizing the concept of a boycott in general. In fact, as I said, I absolutely support the idea of direct action to get better transparency and data protections from large corporations. My point is that if what you’re worried about is putting artists out of work, then it makes sense to boycott the ones that are putting artists out of work.
So how many dollars are IF makers getting, total? Approximately bupkis. How many dollars is, e.g., Microsoft literally giving to OpenAI for the express purpose of further developing AI? Literally ten billion this year alone. Again:

[bar chart comparing the two figures]

If you’re boycotting the guys represented by the bar on the left and not the ones represented by the bar on the right… if, in fact, you’re directly financially supporting the ones on the right… then this is not a pattern for effective action.
That is: even if I accept literally all of your priors and just say sure, training on publicly available data is theft, AI art inevitably puts artists out of work, this displacement is preventable, there is a moral imperative to prevent this displacement, and there’s some nebulous connection between literally all forms of generative AI such that even a maker who is, for example, only using AI to remix their own art is still culpable… even if we accept all that, boycotting IF makers still doesn’t make sense.
I think the Singularity is just the Rapture for techbros, but Kurzweil’s The Singularity Is Near at least manages to contain one of the funniest graphs I’ve ever seen:

[Kurzweil’s log-log “Countdown to the Singularity” chart of paradigm-shift events]
It’s… fractally wrong. The more you look at it, the more problems you spot (the use of the “Descent of Man” graphic, the fact that the plot conflates “events” of wildly different category and scope, the fact that multiple distinct events are plotted at the same point, the fact that it was written a decade ago and so predicts the Singularity in the past…).
There’s a palpable air of self-promotion about tech CEOs doing thinkpieces on the impending AI apocalypse. If we take the warnings at face value, we’re expected to believe that they know they’re bringing about the doom of mankind and are either too stupid or too morally bankrupt to just stop. So, taken at face value, we should be shipping them off to The Hague or something.
But the fact that the CEOs of companies supporting or directly developing AI tech are suddenly calling for regulation should, I think, be read as an attempt to pull the ladder up behind them. That is: they want to preserve their current market position, and they frame the issue as so complex and important that it can only be understood and controlled by experts, namely them.
It’s one of the other reasons I think that trying to discourage hobbyist-level use of AI tools is counterproductive. Given the astronomical levels of investment in AI currently underway, it seems very unlikely that large corporate interests are going to just walk away from it. And if that’s true, and we’re worried about the effects on the average worker, then it’s better to have the tools in the hands of as many workers as possible. If we’re looking at the possibility of a world where literally everybody has access to generative AI versus a world where 90% of generative AI is controlled by Google or Microsoft or Adobe… who control access to the tech via subscription, who send automated C&D letters to anything that doesn’t carry their watermark, and where individual creators have to cough up their entire workflow to demonstrate authorship to get out of a DMCA takedown…
Well, we don’t have to accept all that, because I know of no one remixing their own art or even using tools trained on anything other than public data. And I totally agree that boycotting IF makers using AI has far less direct impact on the businesses creating these tools than, like, sabotaging Google’s servers or something. But that’s where (I think) the point is being missed - refusing to engage in a community that supports the use of AI writing and image generation algorithms is a moral objection. It’s not supposed to financially devastate the AI industry; it’s supposed to build a community that values and supports the work of actual artists, which I think is a realistic goal.
I remember the first time I saw this graph (at a Kurzweil speech I attended at college) and the slowly dawning realization that someone can be very intelligent in some subjects (like text-to-speech synthesis) while failing to add anything meaningful to many, many others, haha
I’ll probably regret butting in here, but are y’all going to accomplish anything with this?
My totally unsolicited 2 cents:
It is totally acceptable not to play a game for any reason. I have quit games because I didn’t like the art, or the gameplay, or the subject matter, etc. I don’t have to defend or explain that decision, and neither does anyone else.
It is totally acceptable to make your game any way you want. There will always be some people who won’t like it because of the art, or the lack of it, or the subject matter, or the gameplay, and that’s fine. Make what you want to make for the audience that appreciates it.