Zaku

Traveler
Joined
Dec 18, 2024
Messages
33
Reaction score
147
Awards
15
Website
letsdiscoverthingsthataregood.wordpress.com
I believe this is a question that has become more and more relevant in recent years as graphical prowess progresses but the overall results stagnate. We have long since reached the era of diminishing returns on increased polygon counts, and as of the PS3 era, I believe we have attained a plateau on which we can build but which we can't really elevate or surpass. PS4 is PS3+1 and PS5 is PS3+2, so to speak.

I'm sure a lot of us 30-somethings have fond memories of the evolution of graphics. Moving to a console generation that could render 3D must have been magical, and it's not something that can be properly experienced by younger generations. To a zoomer, having your mind blown by Mario Kart 64 must sound like satire, but when evolution felt like revolution, the gift of "something new" was constantly delivered. And that is something we have lost. And, believe me, there were enough people even back then who would jerk off to a Lara Croft who looked like a bunch of pyramids stacked on top of each other.

Such a revolution, a new step unlike the previous one, is no longer doable. And this is part of the overall problem with diminishing returns: we have traded seeking "new" for seeking "more".

Now, don't get me wrong, I can appreciate a top-notch graphical experience quite a bit and I don't want to turn this into "Graphics Versus Art Style: The Debate #33258", but the question remains: what's the tradeoff of more graphical power? What are the downsides of chasing ever higher polygon counts?

I can think of a few and it's become obvious the industry is now experiencing the same thoughts:
  • More and more focus goes towards the fine-tuning of graphics, leading to less overall content, smaller areas and less map diversity, as each individual section takes longer to create
  • Game optimization goes out the window: file sizes have become bloated and have even become a selling point - a 100 GB game pretends to justify its size under the illusion of "more content" and dominates the free real estate that is your storage space, becoming evergreen content and pushing out the competition
  • Game development cycles have become disastrously long - games that are in development for 8 years are no longer a rarity; comparing Rockstar Games' previous output to their modern one in quantity is a sad joke
  • Budget creep essentially means that if your game fails, so does the studio
  • This, in turn, has pushed out all creative risk-taking
  • There are fewer games overall due to the aforementioned factors
  • The death of AA games. These made up the vast majority of the PS2 library, but nowadays it's either the latest AAA juggernauts or indies
This is not sustainable, and it's reached a point where it doesn't just hurt the medium but also the industry itself. Look at the current lineup of Sony's PlayStation: even they themselves have admitted that there's nothing big on the horizon - if anything, they have announced more media-mix bullshit (TV series, movies) since then. When one of the three big console makers admits that gaming is currently not where it's at, we have an issue.

As Mark Cerny recently talked about the PS6, it struck me how ludicrous the concept of a PS6 is. What game, and which studio, is supposed to be able to properly utilize it? There are maybe two handfuls of proper, fully utilized next-gen titles halfway into this console's lifecycle. Who is this made for, and who can operate it? Plus, with 80 per cent of current playtime going to "older titles" (think Fortnite, GTA Online), what's the point of investing more and more into an increasingly smaller slice of the cake, to the point where even the existence of dedicated gaming platforms itself becomes questionable?

Not to mention that a lot of the latest games have graphical issues. They don't run well. There's always something off. I was thinking the other day about the games that I thought looked and performed best on my PS5, and it was:
  • Ghost of Tsushima
  • Death Stranding
  • Returnal
  • Ratchet & Clank
  • Demon's Souls
  • Astro Bot
  • Mafia
  • Gran Turismo 7
Wait, hold on, half of these are graphically upgraded PS4 titles! As I was playing GT7 just now, I saw this highly realistic visual spectacle running smooth as butter, thought to myself "Well, that sure is the power of the PS5", and then was shocked to remember that this is actually a PS4 game. And so are several others on this list. And it looks and runs a lot better than Dragon's Dogma 2, Final Fantasy 16 and Monster Hunter Wilds - next-generation titles.

So I'm really wondering who will be left to fight to the end of the graphical arms race, and whether any money can still be made by that point. Sure, Rockstar Games will probably persevere in terms of keeping up technologically, having the manpower and making enough money no matter how much they spend on the next Grand Theft Auto, but for the rest of the industry, it's getting more and more dire to survive thanks to, once again... diminishing returns.

It's going all in for very little to gain. Even gambling is more respectable than gaming here, for at least understanding that that's not how it works.
 
Virtual Cafe Awards

  • More and more focus goes towards the fine-tuning of graphics, leading to less overall content, smaller areas and less map diversity, as each individual section takes longer to create
  • Game optimization goes out the window: file sizes have become bloated and have even become a selling point - a 100 GB game pretends to justify its size under the illusion of "more content" and dominates the free real estate that is your storage space, becoming evergreen content and pushing out the competition

related meme
 

wavve-creator

Ontologist lost in America; dreaming.
Joined
Aug 12, 2024
Messages
402
Reaction score
1,009
Awards
115
Game optimization goes out the window: file sizes have become bloated and have even become a selling point - a 100 GB game pretends to justify its size under the illusion of "more content" and dominates the free real estate that is your storage space, becoming evergreen content and pushing out the competition
COD did this: it floods your storage so you can't install any other game.

On topic: the end point of graphics would be fooling the consumer so well they don't realize they are the ones being consumed.

 

Waninem

30 Year-Old Boomer
Joined
Aug 15, 2024
Messages
386
Reaction score
1,875
Awards
164
I'm honestly not sure. You pretty much need a 4K or even an 8K TV to fully appreciate graphical fidelity like this, and a lot of people (myself included) are still using a basic widescreen HD TV or monitor. I know what they ultimately want: graphics and physics engines that are basically indistinguishable from real life. That's been pretty much the goal since video gaming was born; make the models look better, make the textures look better, use more triangles for smoother forms, use motion capture, etc, etc. But rarely does it seem to pay off in a major way. Instead it just contributes to a culture of file-size bloat and unoptimization, like you said. I fail to see why a game needs 100 gigs of storage space. We have full-sized MMOs that use a tiny fraction of that. Maybe have high-quality textures as an option for the videophiles out there, but don't make them standard when the majority of the population probably doesn't even have the right equipment to enjoy all the details. Not even gonna get into stuff like the uncanny valley and how that danger will only grow as fidelity gets better.

All that said, I think PS4-era graphics were the functional peak. As video game generations came and went, the graphical gain from one console to the next decreased on an exponential curve, at least to me. The improvement from the PS2 to the PS3 was about half as noticeable as the improvement from the PS1 to the PS2. There were some improvements from the PS3 to the PS4, at least in terms of things being smoother and fidelity being higher, but from then on games have either stayed about the same or look noticeably worse compared with older titles. Arkham City still looks freaking amazing, and Nier: Automata, released back in 2017, was (and still is) pretty in spite of being a lower-budget game. Maybe the hope was that once graphics hit the plateau, more focus would be brought to other things, like better gameplay or gimmicks like virtual reality (maybe the one instance where realer-than-real graphics could be justifiable). Seems like they're gonna just keep sharpening that pencil to infinity though.
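That "exponential decay of noticeable improvement" can be put in rough numbers. A common hand-wavy model (an illustration, not a measured law) is that perceived quality grows with the log of raw rendering power, so each ~10x console jump buys a roughly constant absolute bump that keeps shrinking relative to what you already have:

```python
import math

# Illustrative numbers only: assume each generation has ~10x the raw
# power of the last, and perceived quality grows with log10 of raw power.
generations = ["PS1", "PS2", "PS3", "PS4", "PS5"]
raw_power = [1, 10, 100, 1000, 10000]
perceived = [math.log10(p + 1) for p in raw_power]

for i in range(1, len(generations)):
    gain = perceived[i] - perceived[i - 1]
    rel = gain / perceived[i - 1] * 100
    print(f"{generations[i-1]} -> {generations[i]}: "
          f"+{gain:.2f} perceived, {rel:.0f}% relative improvement")
```

The relative-improvement column collapses generation after generation, which matches the "about half as noticeable each time" impression.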
 

Ross_Я

Slacker
Joined
Oct 17, 2023
Messages
1,331
Reaction score
3,654
Awards
245
Website
www.youtube.com
I'm sure a lot of us 30-somethings have fond memories of the evolution of graphics. Moving to a console generation that could render 3D must no doubt have been magical and is not something that can be properly experienced by younger generations. To the zoomer, having your mind blown by Mario Kart 64 must sound like satire but when evolution felt like revolution, the gift of "something new" was constantly achieved.
Well... I don't know how constantly it was achieved, really. In my memory, we only had one (1) revolution: the aforementioned transition from 2D to 3D. Because back when we played GTA 2 and then GTA 3 came out... Now that was "Wow". Like, The Wow. W.O.W.
Nothing has compared since. Sure, Doom 3 kind of felt like "Wow", but not The Wow. And, in general, graphics haven't changed much since the first Serious Sam. Sure, they added more polygons, tweaked the lights and made textures much larger... but that's it. The first Serious Sam still has some of the most amazing sky textures, and I'm telling you the water in that game still looks good.
I guess that's why I never really paid much attention to fancy graphics. I feel like not much has changed since 2001. But if you want the objective breaking point - well, obviously, 2007 and Crysis. They made it their mission to push things to the limit with Crysis, and they did just that.
And yeah, the point is...

So I'm really wondering who will be left to fight to the end of the graphical arms race and if any money can still be made by that point.
The arms race has been over for years now.

  • Game development cycles have become disastrously long - games that are in development for 8 years are no longer a rarity, comparing the previous output of Rockstar Games to the modern one in quantity is a sad joke
Let's clear this one up: Rockstar's output is miserable not because they are busy developing something, no. They just feel comfortable enough milking their online titles to death. Those generate enough income, so Rockstar doesn't feel the need to make new games too often. After all, every new Rockstar game on the scene could distract players from their own online titles - and Rockstar definitely doesn't want to compete with itself.

  • Budget creep essentially means that if your game fails, so does the studio
  • This, in turn, has pushed out all creative risk-taking
Let's clear this one up too. Our favorite example - Concord. It was in development for 8 years and cost about 400 million dollars. And it was shut down after just two weeks. And yet Sony is fine.
Then there is Sega's Hyenas, which was also in development for years and cost millions - yet it was cancelled without any kind of release at all, despite the fact that it was supposed to be Sega's most expensive game to that point.
So, all these big flops don't really end the studios. And they aren't what pushed out all the creative risk-taking, either. As a matter of fact, budget creep and the lack of creative risk-taking are both branches of one reality: the modern games-are-a-product, business-oriented video game industry, dominated by executive-enforced decisions and corporate ethics, whose only desire is to maximize profits.
Actually, that's also why studios get closed: not because the game fails, but to minimize losses. I mean, even studios making bestselling games are getting shut down nowadays: https://www.theverge.com/2021/5/8/2...ier-video-game-studio-shutdown-book-interview
Quote from the article above: "The biggest thread is always money in some way or another. It's always like: "We ran out of money," or "We don't think that this is going to lead to enough money," or "This is not going to lead to enough growth," or "We need this money for something else." It all comes back to just the flaws of unregulated late-stage capitalism and the issues that causes and the insatiable need for growth."
https://www.youtube.com/watch?v=gZffFoQekcc said:
After talking to developers, producers and games media folk, I'm pretty much certain that there are too many money-first middlemen, investors and executives calling the shots. The creatives and sensible business savvy-folk have been shuffled out of the impactful roles and no longer have much of a say.
Thing is, these money-first people can only think short term. Simply because they don't need to think long term in the first place. Of course, they will eventually drive the video game industry into the ground, because this method has always been in their playbook. They have no incentive to improve or adapt - all they need to do is earn their massive golden parachute. A kind of "me mortuo terra misceatur igni" ("when I am dead, let the earth be mingled with fire") approach.
"The goal that I had in bringing a lot of the packaged goods folks into Activision about 10 years ago was to take all the fun out of making video games." - Bobby Kotick during the 2009 Deutsche Bank Securities Technology Conference.

Now, don't get me wrong, I can appreciate a top-notch graphical experience quite a bit and I don't want to turn this into "Graphics Versus Art Style: The Debate #33258"
But that's really the only thing left for us at this point. Actually, there's barely any debate in this area from my point of view: since graphics haven't evolved much since 2001-2007, you can appreciate them all you want, but style is the only thing that can produce any kind of "wow" effect like the one mentioned above. And so I felt some kind of "wow" when I launched Huntdown, not Vampyr.
 

mesaprotector

metric system hater
Joined
Jul 4, 2023
Messages
202
Reaction score
606
Awards
76
If you're wondering what a market looks like that has already hit the limits of what a regular end user can notice, look at SSDs. PCIe 5.0 drives have been around for a couple of years now with very little hype, just because 14 GB/s in sequential transfers is pointless to the average person. It's faster than almost anyone's internet connection, and few people are constantly copying huge files from one drive to another (and if you did that often enough to care, you'd wear out your drive). It doesn't help that those drives are power-hungry and run hot.
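To put rough numbers on why faster drives stopped mattering, here's a back-of-envelope sketch (the speeds are approximate marketing figures; real game installs involve decompression and are slower):

```python
# Back-of-envelope: copying a 100 GB game at typical sequential speeds.
drives = {
    "SATA HDD  (~150 MB/s)": 150,
    "SATA SSD  (~550 MB/s)": 550,
    "PCIe 3.0 NVMe (~3500 MB/s)": 3500,
    "PCIe 5.0 NVMe (~14000 MB/s)": 14000,
}
game_size_mb = 100 * 1000  # 100 GB, decimal units

for name, speed in drives.items():
    print(f"{name}: {game_size_mb / speed:.0f} s")
```

Going from HDD to SATA SSD saves about eight minutes; going from PCIe 3.0 to 5.0 saves about twenty seconds. That last jump is the part nobody can feel.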

For graphics - most people cannot see a resolution difference above 4K, sometimes even at lower resolutions depending on the screen size. (Only people with 20/15 or better vision, sitting closer than normal to a large screen, can sometimes tell 4K and 8K apart.)
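That 4K ceiling falls out of simple geometry. A common rule of thumb is that 20/20 vision resolves about 60 pixels per degree; here's a quick sketch (the 65-inch TV at 2.5 m is an assumed setup, not a universal one):

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Horizontal pixels packed into each degree of the viewer's vision."""
    angle = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / angle

# Assumed setup: 65" 16:9 TV (~1.44 m wide) viewed from ~2.5 m away.
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(px, 1.44, 2.5):.0f} px/deg")
# Rule of thumb: 20/20 vision resolves roughly 60 px/deg.
```

At that distance even 4K already doubles the 20/20 threshold, so 8K is pure spec-sheet material. Sit closer or buy a bigger screen and the numbers shift, which is exactly why the answer differs per person.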

With refresh rate there is some evidence that people are able to perceive differences in motion blur, etc., even at very high (500Hz+) frequencies (from a study with fighter pilots). I doubt it will matter, though, except maybe for VR. I myself can just barely feel a difference over 60Hz, and for the majority the differences above about 144Hz are small. If anything, small amounts of motion blur are a good thing as they reduce the perception of flickering/stuttering.

So then the question becomes: when will we have gaming hardware that can sustain a steady 144Hz at 4K? And the answer is either right now or very soon. Frame gen and upscaling are already close to being obsolete in 2025; they would've been a lot more groundbreaking ten years ago, when 1080p was still "HD".
 
Joined
Nov 13, 2024
Messages
28
Reaction score
51
Awards
12
Ultimately, games will continue to be made this way because they sell to the wider public. Luckily there are plenty of fantastic indie games that focus on gameplay over graphics, so I never understood why there is so much complaining about these issues. There is also room for "better than real life" graphics, such as Ghost of Tsushima, which I haven't played myself as it looks a bit uninteresting gameplay-wise, but some of its scenes look rather wonderful from an artistic viewpoint.


The current dogma about art style vs realism is a false dichotomy to me, for there is plenty of expression to be found in realistic art, just as there was in Baroque, Classical, and Romantic painting, despite all three eras falling under a "realism" umbrella. Personally, I would love to play a game that looks like Ghost of Tsushima and plays like Ninja Gaiden 2.

 
Joined
Apr 29, 2024
Messages
442
Reaction score
2,789
Awards
173
Graphics serve the game; otherwise I would just watch a film, right?
I think graphical evolution served its purpose in the world of videogames as long as it made it possible to do new things with the gameplay.
To me, gameplay is to the videogame what the camera is to cinema. Sure, there are other things that matter - good sound design, good soundtrack, good acting - but all that doesn't make a film a film; the camera does, and specifically the camera as a language. Filmed theatre is not cinema.
Otherwise we could talk about a radio drama or an opera, but not a film.
Similarly, what makes a videogame a videogame is the game. And when graphical evolution ceased to bring new tools through which we could improve the art of the game, we lost any meaningful sense of wonder at new graphical capabilities.
The jump from the SNES to the 64 was amazing due to the implications 3D had for the gameplay. Now 3D seems completely mastered; no matter how many more follicles I can see on the main character's face, the gameplay won't improve.
So I think the videogame industry should rather focus on new technology that would have an impact on the gameplay, or just use the tools already available to master the art of the game.
 

kimn

Traveler
Joined
Nov 5, 2024
Messages
39
Reaction score
80
Awards
16
Remember - the end game is not just about the user, but the developer, too.

Even when we get the first indistinguishable-from-real-life VR smell-o-vision 3000 game, the arms race won't be over. That game will have been impossibly difficult to produce; the next goal will be to make it doable by an indie team of three.

Same as Super Mario taking a massive team and millions of dollars to squeeze every last drop of performance out of primitive hardware. An indie studio could make the same thing over a weekend nowadays.
 

knowyouareloved

Internet Refugee
Joined
Dec 15, 2024
Messages
21
Reaction score
105
Awards
11
The point where increased graphical fidelity stops adding value is like the Year of the Linux Desktop; it will happen at a different point for everyone, and some will endlessly bicker that the day will never actually arrive. Eventually enough people will realize gameplay ultimately matters most and the market for AA games will reappear. Perhaps that seems unlikely but Nintendo is killing it right now. Nintendo decreed that no one is allowed to see what Luigi looks like in more than 1080P, but plenty happily accept that because the games are fun.
 

pressC4caius

Pickle enjoyer
Joined
Nov 10, 2024
Messages
13
Reaction score
19
Awards
4
I think it is worth pointing out that companies like NVIDIA are always trying to push some kind of tech to show they are on the bleeding edge. Right now it's AI frame interpolation; tomorrow it may be some other technology. They have a lot of money in the game, and it's in their best interest to push graphical fidelity so people need to keep buying their products. For example, there have been rumors that NVIDIA sponsors game development in exchange for guaranteeing that DLSS, path tracing and ray tracing are included. This supposedly creates an incentive to keep games poorly optimized so there's a continual market for DLSS and similar tech. IDK if this is true.

I personally feel like we are approaching the event horizon for graphics in conventional gaming unless something dramatic changes. VR graphics still have a long way to go, for example. OLED screens are very cool but still very expensive, though they seem to be trending downward in price. I have never personally seen an 8K screen, but I can't imagine noticing much difference from 4K. Until we see some kind of dramatic new technology, I don't see any truly drastic progression happening.

But as with everything in life, time will tell.
 
*slaps pc hw*
>this bad boy can hold onto 255gb of memory, which is 1/3rd of newest games' working memory

*pc hw slaps you back*
<clockwork overload

....................................................

It seems the next level of this should be quantum - as we approach the limits of fidelity, of memory use and materials (chips and cores, Moore's law running up against nanometer-scale physics), and of the electric bill, it's only a question of time, resources and ingenuity before we go from digital and enormous to analog and quantum computing, however unimaginable that may be right now...
 

PatHeadroom

Traveler
Joined
Jan 28, 2023
Messages
74
Reaction score
95
Awards
26
Not to mention that a lot of the latest games have these graphical issues. They don't run well. There's always something off. I was thinking the other day of the games that I thought looked the best in performance and graphics on my PS5 and it was
  • Ghost of Tsushima
  • Death Stranding
  • Returnal
  • Ratchet & Clank
  • Demon's Souls
  • Astro Bot
  • Mafia
  • Gran Turismo 7
Wait, hold on, half of these are graphically upgraded PS4 titles!
Bro living the PS5 Has No Games meme.
 

alCannium27

Well-Known Traveler
Joined
Feb 15, 2023
Messages
529
Reaction score
1,464
Awards
147
I believe the logical conclusion of today's GPU race will be making graphics settings obsolete.
DLSS is a neural-net model that uses dedicated hardware on Nvidia's RTX 2000-series and later cards to upscale game graphics, achieving higher resolutions at a lower computational cost beyond certain levels (pretty much 4K+ at this moment).
As hardware power increases, there is no reason why any traditionally rendered image couldn't be altered by AI - in a manner similar to what we can do with diffusion models now - to change its resolution, aesthetics, style, etc.
Imagine running Grand Theft Auto with a "Peter Jackson's Lord of the Rings trilogy" filter on, or Cyberpunk 2077 with a Family Guy cartoon filter.
In which case the question is no longer "how realistic?" but "how fast?". Realism will become merely a player choice, and fidelity and resolution a toggle.
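The economics behind that kind of upscaling are simple pixel counting. Shading cost scales roughly with the internal render resolution, ignoring the upscaler's own (roughly fixed) per-frame cost - a simplified sketch:

```python
# Shading cost scales roughly with pixel count, so rendering internally
# at a lower resolution and upscaling to 4K skips most of the work
# (this ignores the upscaler's own, roughly fixed, per-frame cost).
target_w, target_h = 3840, 2160  # 4K output
for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K native", (3840, 2160))]:
    share = (w * h) / (target_w * target_h)
    print(f"render {name}, upscale to 4K: shade {share:.0%} of the pixels")
```

Rendering internally at 1440p touches well under half the pixels of native 4K, which is where the "lower computational cost" comes from.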
 

alCannium27

Well-Known Traveler
Joined
Feb 15, 2023
Messages
529
Reaction score
1,464
Awards
147
Oh, and to add to the previous point: it's entirely possible that with diffusion or other AI-based rendering techniques, 3D modelling and shader coding will become completely unnecessary. All it would take is a basic depth-based scene render - without textures, or even hi-poly models - which is then rendered using a ControlNet-like model on top, in whatever style the devs or players desire.
 

Pixel the puppy

Internet Refugee
Joined
Jan 12, 2025
Messages
2
Reaction score
3
Awards
1
I believe this is a question that has become more and more relevant in recent years as graphical prowess progresses but the overall results stagnate. We have long since reached the era of diminishing returns on increased polygon counts, and as of the PS3 era, I believe we have attained a plateau on which we can build but which we can't really elevate or surpass. PS4 is PS3+1 and PS5 is PS3+2, so to speak.

Don't forget about "animation budgets". How many unique flutters in a cape, or how many strands of hair on Lara Croft's head.

I'd bet someone on a game dev team is salivating over the idea of LC in a proper wet Tshirt.

Then there's The Last of Us Part II, which gave Ellie and Abby the quadrupedal/prone/supine move set from MGSV... Speaking of which, the case study is Metal Gear Solid Delta: Snake Eater. Any team seriously into PS6 and beyond either dreads or dreams of their own version of that coming to fruition.

A digital hellscape where digital hair follicles grease and mat down realistically.
 


RisingThumb

Imaginary manifestation of fun
Joined
Sep 9, 2021
Messages
1,211
Reaction score
3,779
Awards
234
Website
risingthumb.xyz
Excellent points all around. Thanks for making a great thread, Zaku.
  • Game development cycles have become disastrously long - games that are in development for 8 years are no longer a rarity, comparing the previous output of Rockstar Games to the modern one in quantity is a sad joke
Unfortunately, games are highly complex to make. Acerola did a good video on why game development is hard. Programming is already high-complexity work, and many programming jobs are really just complexity-management jobs. Additionally, many games do not have good unit tests or E2E tests - often for good reason: E2E tests take long to run and break down if your game features non-determinism (many games using physics have this issue), and some things will pass tests but be obviously wrong when tested manually, i.e. sounds too loud, too many overlapping sounds, visual weirdness, etc. Add in that graphics programming (shaders etc.) has, afaik, no good debugging approaches, and you begin to see why the programming part is hard. Then, if your engine has limitations, you either need to work around them (compromising your design vision and taking a lot of time), extend the engine (which means more complexity to manage, getting comfortable with the harder work of engine development, and making sure it still compiles for your target platforms - I think that's possible for Unreal, as it's source-available, but I may be wrong), or get in touch with people who can fix these issues upstream in the engine (for the average developer... unfeasible for Unity and Unreal).
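On the non-determinism point, here's a hypothetical toy example of why exact-result assertions on physics are flaky, and how pinning the RNG seed restores determinism. (Real engines have more sources of non-determinism - thread scheduling, float reassociation - that a seed alone won't fix.)

```python
import random

def simulate_drift(steps, seed=None):
    """Toy 'physics': a position nudged by random wind each tick.
    Unseeded, every run differs; with a fixed seed it's reproducible."""
    rng = random.Random(seed)
    x, vx = 0.0, 1.0
    for _ in range(steps):
        vx += rng.uniform(-0.01, 0.01)  # the source of non-determinism
        x += vx
    return x

# An exact-result E2E assertion is flaky when runs are unseeded.
# Pinning the seed makes the whole trajectory, and the test, deterministic:
assert simulate_drift(1000, seed=42) == simulate_drift(1000, seed=42)
```

Games whose simulations can't be seeded like this end up relying on slow, manual playtesting instead, which feeds directly into the long development cycles above.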

This all happens in an iterative loop. Each person added to a team requires more time for communication, which extends development time and reduces the number of iterations that can be done. This is also part of why level designs have simplified over the years, as they now require communication between narrative designers, environment artists, level designers, game designers, and developers. The more communication that needs to happen, the longer and more difficult each development iteration becomes, and the more money is needed to pay for it.
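That communication cost is the classic Brooks's Law observation: the number of potential pairwise channels grows quadratically with headcount, which is a rough sketch of why bigger teams iterate slower:

```python
def channels(n):
    """Potential pairwise communication channels in a team of n people."""
    return n * (n - 1) // 2

for n in [5, 20, 100, 400]:
    print(f"{n:>4} people -> {channels(n):>6} channels")
# Doubling the team roughly quadruples the coordination surface.
```

An indie team of 5 has 10 possible channels; a 400-person AAA team has nearly 80,000, which is why they need whole layers of producers just to route information.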

That said, for many games I think it's also a case of diminishing returns. Making the most of those diminishing returns, and knowing what to leave alone so the work actually gets finished, is... another challenge. Like they say, the last 20% of the work is 80% of the effort.

Additionally, you're looking at things from a quantity point of view. The end consumer doesn't care about quantity... in fact some would see it as a mark of shovelware. This happens on Steam, where a lot of new games are shovelware. Consider: when was the last time you looked on the Play Store for a game? For me it was more than a year ago, because of the sheer quantity of shovelware on it. Quantity is not quality - it's also part of why I don't really go looking very hard for games on itch.io either. For better or worse, Steam is very good at curating games players are interested in.
  • Budget creep essentially means that if your game fails, so does the studio
This is sadly true. The only real "polyfill" for it is contract work, or trying to fill the gap with early access (completely unsuitable for some game types, like puzzle games), Kickstarters (completely unsuitable for any developer with no track record), Patreon-style funding, or even making game development courses or asset packs. Arkane Studios had a huge gap between Arx Fatalis and Dishonored that was filled with contract work for Ubisoft and later for Bethesda.

The other choice is to make games that are the spiritual successor to Oblivion's horse armour DLC. I understand why they do it, but every developer who does this instead of just making good games should quit and join a Fortune 500 company or an investment bank - they're clearly in it for the money, and those are much better ways to get it. I don't think making money is bad, but you do yourself a disservice if you try to get it in bad ways by ruining things.
  • More and more focus goes towards the finetuning of graphics, leading to less overall content, area size or map diversity as each individual section takes longer to create
This is true, but consider it a different way: your graphics are the first thing people see when they play your game. Sadly, games with excellent game design will be ignored if they have uninteresting or unappealing graphics - the exception being when the game design keeps people addicted in a "just one more" loop. Factorio does this; its graphics are a bit off-putting for new gamers. Vampire Survivors also did this.

Also, regarding your map diversity and area size point... that's partly down to tooling changes. A lot has moved to mesh-based workflows, which means your level designer has to know a 3D modelling tool to do his work, instead of a brush-based editor like TrenchBroom. This hurts their iteration speed, and it hurts the game because 3D modelling software often has no way to manage entities in the level. It's also really bad for beginner level designers, who can't dip their toes in without wearing the 3D modelling, scripting and design hats all at once. Plus their choices (see what I say below about overlapping lights and shadows) affect performance in the game, and they're not well placed to deal with performance because they're not programmers.
  • Game optimization goes out of the window: file sizes have become bloated and even become a selling point - a game that is 100 GB big pretends to justify as much under the illusion of "more content" and dominates the free real estate that is your memory space, becoming evergreen content and pushing out the competition
Game optimization doesn't go out of the window... The file sizes are largely down to either baked lighting for large maps, or to the number of texture maps each material needs (each material having 2048x2048 - often in both high and low resolutions - albedo, roughness, AO, normal maps, sometimes emission, etc.). The reason to ship lower-resolution textures is to reduce VRAM costs when shaders use them. The reason to use baked lightmaps is to reduce the GPU cost of lighting, as global illumination is a hard problem. A lot of performance issues also come from unoptimised lighting: shadows are very expensive, overlapping lights are expensive... and so on. The two choices you get are: bundle the higher-resolution textures as DLC (as Monster Hunter World does), which means accepting that people are gonna make your game look like shit in YouTube videos and turn people off it... or force everyone to download it all at high storage cost.
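To make the file-size point concrete, here's a back-of-the-envelope sketch of one uncompressed 2048x2048 PBR material. The channel counts are my illustrative assumptions, not figures from any particular engine, and real engines use block compression (BC1/BC7 etc.) which cuts this several-fold - but it shows why hundreds of materials add up fast.

```python
# Rough, uncompressed cost of one 2048x2048 PBR texture set, assuming
# 8 bits per channel. Channel counts per map are illustrative.
RES = 2048
maps = {
    "albedo":    4,  # RGBA
    "normal":    4,  # often stored as RGBA (or two-channel BC5)
    "roughness": 1,
    "ao":        1,
    "emission":  3,
}
total_bytes = sum(RES * RES * channels for channels in maps.values())
print(total_bytes / 2**20)  # -> 52.0 MiB for ONE material, before mipmaps
```

Mipmaps add roughly another third on top of that, and a AAA scene has hundreds of such materials - before baked lightmaps are even counted.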

Imagine if your first exposure to Skyrim was this, naturally from a YouTube video, a livestream or a games news article...
1736713440396.png

You would probably be turned off if you're the average consumer...

Now, if you really want to talk about game optimisation going out of the window: I partly blame Unreal, and partly blame developers who have bought into ideology around game development rather than pragmatics. Threat Interactive have done a bunch of videos on how Unreal is... culpable for this.

Between storage space and game performance, most people prefer game performance, but the cost in asset sizes is ridiculous. I mentioned the textures above; games used to channel-pack several maps into one texture (e.g. the albedo's unused alpha channel carrying the 0-1 roughness value). Many game engines and developers don't do this anymore - understandably, because it's time-consuming - but at ridiculous sizes like 100 GB, compression effort should be made. Unfortunately it isn't, because they know consumers will suck it up, and it lets them get a lower-quality product to market quicker.
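The channel-packing trick I mentioned looks like this in practice - a minimal sketch using NumPy, with small illustrative array sizes (a real map would be 2048x2048):

```python
import numpy as np

# Channel packing: stash a single-channel roughness map in the albedo
# texture's unused alpha channel, so the shader fetches one RGBA texture
# instead of two separate ones.
rng = np.random.default_rng(0)
albedo_rgb = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)
roughness  = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)

packed = np.dstack([albedo_rgb, roughness])  # RGBA; alpha = roughness

# In the shader, one sample gives both: t.rgb = albedo, t.a = roughness.
print(packed.shape)  # (256, 256, 4)
```

One packed texture means one file on disk, one texture fetch in the shader, and one slot of VRAM instead of two.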
DLSS is a neural-net model that makes use of the hardware on Nvidia's post-RTX 20-series cards to upscale game graphics, achieving higher resolutions at a lower computational cost beyond certain levels (pretty much 4k+ at this moment)
Slightly better than TAA, but it still makes everything blurry because you're upscaling from a lower resolution. DLSS uses previous frames to help upscale the current frame, but as a result it ghosts, often adds detail where at native there would be none, or smears over detail where there would be. For anything at or below 1080p it's useless in my experience, and most midrange machines still target 1080p - over 50% use some form of 1080p display, see the Steam Hardware Survey. Above that it's *probably* useful, but I don't run anything above 1080p yet... so I can't personally attest to it either way. Anyone else running 4k monitors want to weigh in here?
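To put numbers on why it struggles at 1080p: the per-axis render-scale factors below are the commonly cited approximate ones for DLSS's quality modes (treat them as assumptions, not official Nvidia figures), and a quick sketch shows what the GPU actually renders before the upscaler runs.

```python
# Commonly cited approximate per-axis render-scale factors for DLSS
# quality modes (assumptions for illustration, not official specs).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width: int, height: int, scale: float) -> tuple[int, int]:
    """Resolution the GPU renders at before the upscaler runs."""
    return round(width * scale), round(height * scale)

for name, scale in MODES.items():
    print(name, internal_res(1920, 1080, scale))
# Quality at a 1080p target renders internally at roughly 1280x720,
# which is why the upscaled result looks soft on a 1080p display.
```

At a 4k target the internal resolution in Quality mode is still around 1440p, which leaves the upscaler far more detail to work with - hence the "useful above 1080p" rule of thumb.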
Oh, and to add to the previous point: it's entirely possible that with diffusion or other AI-based rendering tech, 3D modelling and shader coding will become completely unnecessary - all it would take is a basic depth-based scene render, without textures or even hi-poly models, which is then rendered using a ControlNet-like extra model on top, in any style the devs/players desire.
If I understand correctly, you're basically describing having a bunch of rigs in a scene and then having a neural network plaster its image over them. What I've seen of it works fine for video and images... but I'll believe it when I see it for games. Even then, the idea of giving players control of the style is, imo, in the realm of modding. Most players won't spend their time configuring how the game looks in something like ReShade, so why should we expect it in a game? That said, some post-process effects like pixelation are nice to have settings to turn off (you see this in many PS1-style games). Additionally, giving this kind of control is corrosive to a game's brand recognition, as people won't know it's the same game. Look at heavily modded Skyrim, compare it to regular Skyrim, and tell me it looks like the same game. Not saying it's bad, but it's a hard sell. Maybe better for some kind of AI-generated Garry's Mod sandbox-style game? To me, the big question is persisting details temporally and spatially (these are games - you move through space, and most image and video generation isn't concerned with that).

Oh, I almost forgot the biggest issue. Stable Diffusion on my RTX 3070 takes quite a while to generate a single, relatively small image from a prompt. This would need to work in real time, at 60fps, ideally 120fps. Until you can generate a high-quality, native (or near-native) resolution image roughly every 8ms, it's simply out of the question, the same way full raytracing is out of the question. And that 8ms of computation has to be shared with physics (sometimes on the same GPU), sound, I/O, AI, etc... All of these issues are why it's one of those things where I'll believe it when I see it, because I think the hardware has plateaued quite a bit.
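The frame-budget arithmetic behind that 8ms figure is simple enough to write down:

```python
# Per-frame time budget at common target framerates.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 120 fps -> 8.33 ms
```

A diffusion model that needs whole seconds per image is several orders of magnitude outside an 8.33ms budget - and that budget isn't even all yours, since physics, audio, I/O and game AI share the same frame.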
If you're wondering what a market looks like that already has hit the performance limits of what a regular end user can notice, look at SSDs. PCIe 5.0 drives have been around for a couple years now, with very little hype around them, just because 14GB/s in sequential transfers is pointless to the average person. It's faster than almost anyone's internet connection, and few people are constantly copying huge files from one drive to another (if you did that often enough to care, you'd wear out your drive). It doesn't help that those drives are very power-hungry and run hot.
This is a good point, but I think users are noticing that games from 2017 look better than games now. DOOM 2016 looked phenomenal, while most games nowadays look blurry and smeared to shit. What's worse, with UE5 being the default engine a lot of companies use, Unreal designs their engine so that many of its graphics effects rely on TAA, or else you get graphical artifacts. The hardware improvements do exist, but they're increasingly being wasted, and due to the complexity of game development (see my first paragraph at the top) it's hard to manage that complexity, let alone optimise within it. If you're using UE5, you're naturally going to follow UE5's recommendations.

I think it's also a matter of education. Many people can't describe why their game's visuals look worse than games from almost a decade ago. Of course, what looks worse is up to taste; some people like blurry shit like motion blur. I utterly detest it. I also think there's a major quality issue nowadays: too many developers say "good enough" when it's not good enough. I talked a little about this in my article on reflection approaches. The most blatant example is how badly behind video game sound is. Most games do either mono sound, or, if they do stereo, only the most basic panning between the left and right ears - no sound reflections, no audio occlusion or absorption, etc. Once you start looking at tech that isn't graphics programming, you'll quickly see how woefully behind a lot of it is (and how consumers don't see it, or how it's paved over graphically. Hell... clipping and collision issues are still very common, and they really shouldn't be, because it's possible to find and define the walkable environment).

Ultimately, part of the issue is idiot consumers buying and not refunding shit games with major flaws... and games companies going publicly traded, at which point the audience for their products is no longer gamers but investors. Every publicly traded games company goes down the shitter. You already see this with Nvidia: DLSS is rubbish at 1080p or below, GPU RRPs have gotten absurd over the last 5 years, and they're marketing to AI and cryptobros. Nvidia's audience is not gamers, but investors.

Apologies for the long post, it's something I care about, so... deal with it
 