Big Studios and the Red Herring of Photorealistic Graphics

There's also the historical angle: increasing graphics quality was seen as necessary to justify the purchases hardware enthusiasts made in the '90s, and that behavioral pattern never actually left the games industry, long after maximum fidelity stopped meaning much and stylized visuals blew the possibilities wide open.

8 Likes

Videos of expensive assets excite shareholders, which is a very important aesthetic consideration

/s

4 Likes

This seems like a very boring topic to discuss. Nobody here can change Microsoft-Activision-Blizzard’s production specs. Nobody here can change Nvidia’s product lineup either. What’s there to argue about?

2 Likes

Well, you never know (does anyone know what @jjmcc’s job is? I feel like that much review productivity indicates high executive function, I’m just saying).

3 Likes

I literally only started this topic to give @Draconis a space to rant without derailing the previous topic.

Not every thread has to have an actionable argument. :woman_shrugging:

7 Likes

The game graphics that hold up the longest seem to combine photorealism and stylization.

A good example is Mirror's Edge, where the developers and/or designers created a distinct style but also specifically tried to make buildings look real to the exclusion of almost everything else, presumably because they knew that, within the limitations of 2008 technology, static objects could be rendered very convincingly.

Even the shadows are pre-rendered. That’s technically less “realistic” than real-time shadows. However, the shadows look very real, probably because the designers aimed to replicate remarkable shadow patterns that they studied (though that’s just speculation).

Even if there were a super-realistic day-and-night cycle that cast shadows in different positions in a photorealistic way, the result probably wouldn't look as good, because for 23 hours of the day the patterns might be pretty unappealing.
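A toy sketch of that trade-off (illustrative Python; nothing to do with Mirror's Edge's actual engine, and the scene is made up): baking pays the lighting cost once, offline, which is exactly why the result can't follow a day/night cycle.

```python
import time

# Toy scene: a 1-D "floor" with fixed occluders casting repeating shadows on it.
OCCLUDERS = [(100, 180), (400, 470), (800, 830)]  # shadowed x-ranges
WIDTH = 100_000

def visibility(x):
    """Stand-in for an expensive per-sample shadow test (e.g. a shadow ray)."""
    return 0.2 if any(a <= x % 1000 <= b for a, b in OCCLUDERS) else 1.0

# "Baking": evaluate the lighting once, offline, and store it (the lightmap).
lightmap = [visibility(x) for x in range(WIDTH)]

def frame_realtime():
    # Real-time shadows: redo the visibility test every single frame.
    return [visibility(x) for x in range(WIDTH)]

def frame_baked():
    # Baked shadows: each pixel is just a cheap texture fetch.
    return lightmap

for label, render in [("real-time", frame_realtime), ("baked", frame_baked)]:
    t0 = time.perf_counter()
    render()
    print(f"{label}: {time.perf_counter() - t0:.4f}s per frame")
```

The baked frame is nearly free, which is why a studio can afford to make it look immaculate; the price is that it's frozen in place.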

In movies, Toy Story is a good example. The animators deliberately chose to use toys as a subject because they required only limited animation. I think this is something the animators have said explicitly.

There are lots of breakthroughs on the technical side, I'm sure, but Toy Story still looks good because toys themselves only need minimal realism in the first place.

5 Likes

I would probably join in that rant, lol. I often say I prefer cartoony graphics to “realistic” ones.

Like, how many ugly polygon-3D PS1 games would’ve been better off being 2D and cartoony or stylised? Most of them, I’d say. XD

2 Likes

I actually disagree with that a bit: it's just that the improvement isn't as obvious anymore. I have to say the Unreal Engine 5 demo looked a lot better than Skyrim, and both are very much later than the '90s. The problem is just that something like that requires similarly unreal GPUs. But with ray-tracing hardware becoming more common, and Moore's law (which should still hold up pretty well for GPUs because of the parallelization), it's just a matter of time before "ultra high-end" becomes "mid-range" and affordable.

In the end, I care more about the mechanics than the graphics, though, except if it's an adventure or other story-driven game where I want to feel like I'm in the game. The stars in Stellaris are nice to look at for the ten seconds before you get bored and get on with managing your empire.

"Good graphics" doesn't necessarily have to mean "realistic graphics", though; that's just one kind of good. I also really liked the art style in Life is Strange.

3 Likes

I don't agree that it's a "red herring". Photorealistic graphics in real time aren't quite there yet. I would like to see live ray tracing, but even then it's only an approximation.

I'm doing ray-traced renders for IF. Some single images take over an hour to render. Sure, I could spend a fortune on a new GPU, but even then some would still take over 10 minutes. So we're not even close to real-time.
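To give a sense of the scale, here's a back-of-the-envelope ray budget for a single still (made-up but plausible settings; the real numbers vary per scene and renderer):

```python
# Rough ray count for one offline render (illustrative numbers only).
width, height = 3840, 2160      # a 4K still
samples_per_pixel = 512         # a common offline quality target
bounces = 4                     # path segments traced per sample

rays = width * height * samples_per_pixel * bounces
print(f"{rays:,} rays")         # ~17 billion, each intersected against the scene
```

Tens of billions of scene intersections per image is why "over an hour" is entirely believable, and why real-time engines lean so hard on denoising and far lower sample counts.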

My images still do not look as "real" as I'd like. This is mostly down to the fidelity of the modelling. Not really poly-count; more like how well it's been made. Hair is a problem, as it's so fine. It's hard to make faces look alive. The eyes are crucial, and so are lifelike expressions.

Anyway, you can play one of my efforts next week. See what you think.

I totally agree realism isn't everyone's cup of tea, and that's absolutely fine. But I do think some people like it.

3 Likes

Thankfully this thread has left the "political/social rant" course for a more interesting "technical rant". And being a Final Fantasy fan, I'll refrain from inflicting an Ars Technica-grade analysis of everything from the original FF VII to XVI/VII Rebirth :wink:
(that is, the final technical & artistic rant on photorealistic 3D graphics :smiley: )

IMVHO, what defines the relationship between Art and money is the cultural level of the patron. (The Italian word for patron, "mecenate", is literally the name of the archetypal patron, Maecenas; so one can easily see why everyone of high culture in Italy hates the sociocultural degradation of the Italian upper and upper-middle class personified by the late dwarf.)

Of course, being Italian, I can literally see all around me the outcome of the relationship between patron and artist when it is steered by high culture.

With this major premise, I think my angle on this issue is obvious: the visual impact and message depend on the artist's and the patron's cultural level. This alone should put the graphics of Japanese and European video games above those of US video games, if one agrees with Kamineko's and Zarf's opinions.

Best regards from Italy,
dott. Piergiorgio

2 Likes

There's definitely demand for high photorealism. Are large studios pushing that in an effort to limit competition? I doubt it. The technologies and effort are expensive and time-consuming - something most companies would prefer to avoid if possible.

While I enjoy some highly stylized games, I also enjoy realistic-looking stuff. It's amusing how games from 20 to 30 years ago that I thought looked amazing now look
well
bad, to be honest. Graphics face ever-increasing expectations.

4 Likes

Because they cost so much, I think it’s hard to ignore financial considerations when discussing big studio projects using photorealistic graphics. Since Final Fantasy VII Remake was mentioned, it must have cost significantly more than 100 million dollars. I won’t embarrass myself by trying to guess the exact number.

I don't have a hard time understanding the concerns of @Draconis: difficult work conditions are often a consequence of high financial stakes.

Who decides where a hundred million dollars should go? The intent of my earlier post, which may have been too quippy to come across, was to assert that the aesthetics of a large project like Final Fantasy VII Remake have tremendous financial implications. Presentation isn’t something left up to an artist as a matter of aesthetic judgement. I don’t mean to say that artists aren’t making choices about their own art for these projects. They certainly are! But I doubt that they are doing so in a vacuum.

Whether the actual return on investment for a project can be linked to graphical fidelity is a different question. I think @inventor200 is right to say that there is some holdover from the old days of buying a new video card every six months or so.

In terms of my own tastes, I don’t feel that fidelity is terribly important. To stay in the AAA game space, I don’t think Elden Ring would be a better game if the NPCs had photorealistic face models. Outside AAA, Shin Megami Tensei and Dragon Quest games would be far less appealing to me if they featured high-fidelity graphics.

Still, there are obviously scenarios where photorealism is meaningful to audiences. Naughty Dog's games (the Uncharted and The Last of Us series) are award-winning narrative games, yes, but they are also tech showpieces for Sony hardware. Fidelity is arguably a central design goal for those projects rather than a general "good". It seems common for these games to be associated with "the PlayStation experience," and I get it.

e: Though I think that even if that kind of fidelity is core to a project, it's never an excuse for enforcing unrealistic project goals. I reject crunch.

3 Likes

On graphics: DQ has its unique style, and SMT/Persona's style is developing a uniqueness of its own, but for lovers of narrative trumping graphics (the very definition of text-based IF), the best series remains, at least IMO, Legend of Heroes, from the Erebonian arc (Trails of Cold Steel) onwards: its 3D graphics are average, but the depth of story and narrative (more than two million words translated from Japanese to English) is off the scale.

(Indeed, my favorite setting for easing the learning of an IF language and messing with 'em remains Trails of Cold Steel.)

Best regards from Italy,
dott. Piergiorgio

2 Likes

My take is, the large studios are in a bit of a bind. They’re facing increasing competition from indie developers, and while 95% of those indie developers will never achieve a huge runaway success, there are a lot of them out there, and that last 5% will.

But the studios can’t really afford to rely on occasional huge runaway successes. The budget for each game is enormous; they need a return on that investment. Hiring successful indie developers isn’t much help either, because most of them can’t capture lightning in a bottle twice.

The solution is to push a definition of what makes a “good/professional video game” that correlates to how many person-hours you put into it. A big studio can’t guarantee addictive gameplay, innovative mechanics, or a unique style beyond what dozens and dozens of indie devs with nothing to lose can. But they can push their employees to the limit to have photorealistic graphics and raw hours of gameplay, so they want those to be the metrics that all games are measured by.

That’s why I think it’s a bit of a red herring. There’s no reason to aim for photorealistic graphics except that it’s a demonstration of how many person-hours you have available. And the push for it has repercussions on how employees throughout the industry are treated, because the new metric for how “professional” a video game looks comes down to maximum person-hours at minimum cost.

8 Likes

This conversation reminds me of an old riddle I once heard.

A king holds a contest between two painters. Whoever wins, wins riches and yadda-yadda. They present their finished paintings behind two curtains. The king pulls the first curtain down, and the painting is breathtaking, gorgeous, etc. Truly a masterpiece, and the first artist is truly a genius. Then the king goes to pull down the second curtain, stops, stares at the curtain, and declares the second artist to be the winner!

How could the king have reached this decision without even seeing the second painting?

Answer: Because the second painting was a painting of a curtain. It was so realistic that it deceived the king until he tried to physically interact with it.

This riddle has always struck me as having a completely backwards perspective on the value and purpose of art. But it’s a common perspective.

Personally? I’m surrounded by photorealistic imagery all the time. Give me something weird and funky that I can’t see just by opening my eyes.

10 Likes

Amen to this. This goes for TV, too. I hate the high-def TVs that show me every pore on everyone’s nose. Ingrid Bergman would not have been as beautiful if filmed in high-def.

5 Likes

Actually, one rule of thumb in the special-effects industry is that "the best special effects go completely unnoticed".

I understand that you want to perceive and appreciate art, but there is also something satisfying about the uncanny realization: "That building facade is trompe-l'œil, and the artist is a genius because I was fooled!"

filmmaking tangent

There are a lot of movies that have lots of CGI and digital replacement you don't even perceive. Brokeback Mountain is not thought of as a CGI extravaganza, but many of its vistas were enhanced or replaced. Another good example: the movie Nope does employ some really well-done CGI, but what may not be apparent is that almost every sky and cloud is digital, since they end up being major plot points, and most of the gorgeous night-time scenes were shot day-for-night in IMAX. The problem with filming at night is that for the camera to capture an image, there needs to be light on the scene. You can't go into the woods and just film night-time scenes with no light. The set either needs to be strategically lit without seeming like it's lit (often using blue lights), or you use the day-for-night process: the scene is filmed during the day so everything is lit, a filter over the lens simulates night, and the colors are corrected in post-production (there's a toy sketch of that kind of grade after the link below).

You can see the technical difference - the opening of Jaws is done non-digitally. It’s nearly impossible to light an ocean at night, so they filmed during the day and filtered it. You can tell that what is ostensibly the moon is really weird and bright - that’s because it’s actually the sun through a filter. Decades later, Nope managed to film miles and miles of night-time landscape in a desert and have it all visible for the enormous IMAX ratio. They did it much better by filming scenes simultaneously with a standard and infrared camera and digitally blending the shots for some of the best (artificially) captured desert night shots ever seen.

How Jordan Peele's NOPE Delivered The Best Day For Night Shots In Cinema - Noam Kroll
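For anyone curious, the digital version of that old lens-filter trick is basically a color grade. A minimal sketch (my own toy approximation using Pillow and NumPy, not any studio's actual pipeline; the exposure and tint values are made up):

```python
from PIL import Image
import numpy as np

def day_for_night(img, exposure=0.25, tint=(0.7, 0.8, 1.25)):
    """Crude day-for-night grade: underexpose the frame and bias it blue."""
    rgb = np.asarray(img.convert("RGB"), dtype=np.float32) / 255.0
    rgb = rgb * exposure * np.array(tint, dtype=np.float32)  # darken, then tint
    return Image.fromarray((np.clip(rgb, 0.0, 1.0) * 255).astype(np.uint8))

# Hypothetical usage: day_for_night(Image.open("day_shot.jpg")).save("night.jpg")
```

Real productions do far more than this (sky replacement, selective relighting, the infrared blending mentioned above), but "film it lit, grade it dark" is the core idea.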

Many game makers are finding that taking a step back from the uncanny valley of photorealism is often ideal. A stylized image with personality is often more relatable to an audience than an image that is almost real but not quite. Compare the frightening CGI of The Polar Express, which tried for realistic faces, to a modern Pixar or Disney film, which employs realistic lighting and textures but maintains a sense of animated drawing and design: no one will mistake the images for real ones, but they are much more pleasing and enjoyable.

Polar Express vs Encanto


And I agree: CEOs and producers without the most honed sense of artistry have an easier time quickly evaluating whether graphics are "good enough" than judging plot and game mechanics, which require time and hands-on evaluation. So it's understandable why they focus on visuals: you can get investors excited with a glance at an image.

7 Likes

Hair is so incredibly difficult in 3D art, omg. :sob:

I’ve crashed Blender so many times trying to attempt it, and have to leave single images to render overnight.

Honestly my jaw drops every time hair is done well in 3D.

In the title, I don't mean uncanny valley or anything like that, nor do I mean the style itself is bad. I mean that, in terms of game design, a mistake gets made: photorealism is assumed to be as simple to design around as low-poly styles, and the player ends up facing a deeply detailed environment where the objects of focus are remarkably hard to find.


The title is admittedly an exaggeration; I didn't put much effort into it, because I was just being silly by giving Daniel room to vent without derailing the previous thread.

EDIT: Oh there are quite a few responses. I should have gotten meds first.

4 Likes

Okay, I went blind in 2012, so graphics are mostly a moot point for me, but back when I had a working eye, I already felt like those chasing realism had hit diminishing returns. The last time a game wowed me for photorealism was something for the GameCube or PS2. After that, if a game wowed me for graphics, it was because I was playing some retro game that looked better than I expected for the hardware (Kirby's Adventure on the NES looks more like a SNES game than an NES game) or because handhelds were doing things previously only possible on home consoles. And honestly, circa the early 2010s, there were PS1 titles that I felt had aged better than some titles a generation newer (the original Spyro the Dragon aged better than Sonic Adventure, in my opinion).

I also never got the HD hype, and the only reason I got a 720p television when I switched from a CRT to an LCD was because SD LCDs weren't a thing at the time. I preferred animation to live-action when it came to movies and television, but I never watched a 640×480 DVD rip of a live-action program and thought "this doesn't look as good as real life." And I thought it a total waste that Blu-ray releases of SD content were always upscales that use the same number of discs as the DVD version, instead of using Blu-ray's larger capacity to fit an entire movie franchise or an entire television season on a single disc.
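Rough numbers make the point (nominal dual-layer capacities in marketing gigabytes; real releases vary with bitrate and extras):

```python
# Nominal disc capacities, just to show the scale of the waste.
dvd_dual_layer_gb = 8.5      # a typical SD feature release
bluray_dual_layer_gb = 50.0
movies_per_disc = bluray_dual_layer_gb / dvd_dual_layer_gb
print(f"~{movies_per_disc:.1f} DVDs' worth of SD video per Blu-ray")  # ~5.9
```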

3 Likes

I fully concur with your tangent on mattes, digital and analog.

A matte can define a scene and its impact. IMVHO the best, by far, is the one in the final scene of Munich (2005), whose background matte underscores and reinforces the rather significant dialogue between the protagonist and his handler (no spoilers; suffice to say that it exposes the chained cycle of self-feeding mistakes of the last quarter century).

Best regards from Italy,
dott. Piergiorgio.

1 Like