
Titanfall 2: Monarch's Reign Gameplay Trailer (Gaming)

by CruelLEGACEY @, Toronto, Thursday, May 25, 2017, 14:15 (2748 days ago)

HYPE!!!

Killing it with more great looking DLC. Can't wait!

Titanfall 2: Monarch's Reign Gameplay Trailer

by DEEP_NNN, Thursday, May 25, 2017, 16:24 (2748 days ago) @ CruelLEGACEY

Nice. I guess this is a paid update?

I look at T2 at 60fps on XB1 and then D2 at 30fps. Shakes head sadly at Bungie. Scorpio has literally been fucked out of 60fps for D2.


Titanfall 2: Monarch's Reign Gameplay Trailer

by Xenos @, Shores of Time, Thursday, May 25, 2017, 16:35 (2748 days ago) @ DEEP_NNN

> Nice. I guess this is a paid update?

It's not! Still free!

> I look at T2 at 60fps on XB1 and then D2 at 30fps. Shakes head sadly at Bungie. Scorpio has literally been fucked out of 60fps for D2.

Digital Foundry has a pretty good article about this. I was with you in thinking Scorpio should be able to handle it, but Digital Foundry is pretty confident that, without Bungie completely rewriting their engine, Scorpio wouldn't have the CPU power to hit 60fps in Destiny 2.


30fps vs 60fps

by CruelLEGACEY @, Toronto, Thursday, May 25, 2017, 16:47 (2748 days ago) @ Xenos

>> Nice. I guess this is a paid update?

> It's not! Still free!

>> I look at T2 at 60fps on XB1 and then D2 at 30fps. Shakes head sadly at Bungie. Scorpio has literally been fucked out of 60fps for D2.

> Digital Foundry has a pretty good article about this. I was with you in thinking Scorpio should be able to handle it, but Digital Foundry is pretty confident that, without Bungie completely rewriting their engine, Scorpio wouldn't have the CPU power to hit 60fps in Destiny 2.

The debate between hitting 30fps vs 60fps is a contentious one from the developers' side. Hitting 60fps, and keeping it there, is extremely hard. Your game needs to render more than twice as efficiently as a game running at 30fps (because you're not simply doubling the frames... certain per-frame processes can't be sped up, so the parts that can be sped up need to go more than twice as fast to make up the lost ground). And when it comes to the player experience, user research shows that the vast majority of gamers cannot tell the difference between 30 and 60fps.
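
(To put rough numbers on that, a quick back-of-the-envelope sketch; the 6ms of fixed per-frame cost is purely an assumption for illustration.)

    # Frame budgets at 30fps and 60fps, in milliseconds.
    budget_30 = 1000.0 / 30   # ~33.3 ms per frame
    budget_60 = 1000.0 / 60   # ~16.7 ms per frame

    # Assume some per-frame work can't be sped up at all
    # (the 6 ms figure is made up purely for illustration).
    fixed_cost = 6.0

    scalable_30 = budget_30 - fixed_cost   # ~27.3 ms of scalable work at 30fps
    scalable_60 = budget_60 - fixed_cost   # ~10.7 ms allowed at 60fps

    print(f"Scalable work must run {scalable_30 / scalable_60:.2f}x faster")
    # -> ~2.56x faster, not just 2x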

So to hit 60fps, you need to make loads of sacrifices in other areas (less texture data, fewer lighting and particle effects, simpler physics, etc.), and then barely anyone will notice the difference anyway.

So for a developer to internally justify all those sacrifices, they need to feel that hitting 60fps will have a significant impact on their specific audience's experience with the game. Fighting games live or die by their reception from the hardcore fighting game community, and those players will absolutely feel the difference between 30 and 60fps. While Titanfall 2 does feature a campaign this time around, it was developed first and foremost as a competitive multiplayer game, and the hardcore shooter crowd will also notice the difference.

But Destiny is a larger animal than that. The most hardcore Crucible players will certainly complain about the game running at 30fps... but that's just one of many reasons why I'd argue Destiny isn't the game for hardcore PvP players at all. It's for a broader audience, many of whom are perfectly happy with a rock-solid 30fps.


30fps vs 60fps, GPU vs CPU

by dogcow @, Hiding from Bob, in the vent core., Thursday, May 25, 2017, 17:12 (2748 days ago) @ CruelLEGACEY

>>> I look at T2 at 60fps on XB1 and then D2 at 30fps. Shakes head sadly at Bungie. Scorpio has literally been fucked out of 60fps for D2.

>> Digital Foundry has a pretty good article about this. I was with you in thinking Scorpio should be able to handle it, but Digital Foundry is pretty confident that, without Bungie completely rewriting their engine, Scorpio wouldn't have the CPU power to hit 60fps in Destiny 2.

> The debate between hitting 30fps vs 60fps is a contentious one from the developers' side. Hitting 60fps, and keeping it there, is extremely hard. Your game needs to render more than twice as efficiently as a game running at 30fps (because you're not simply doubling the frames... certain per-frame processes can't be sped up, so the parts that can be sped up need to go more than twice as fast to make up the lost ground). And when it comes to the player experience, user research shows that the vast majority of gamers cannot tell the difference between 30 and 60fps.

> So to hit 60fps, you need to make loads of sacrifices in other areas (less texture data, fewer lighting and particle effects, simpler physics, etc.), and then barely anyone will notice the difference anyway.

> So for a developer to internally justify all those sacrifices, they need to feel that hitting 60fps will have a significant impact on their specific audience's experience with the game. Fighting games live or die by their reception from the hardcore fighting game community, and those players will absolutely feel the difference between 30 and 60fps. While Titanfall 2 does feature a campaign this time around, it was developed first and foremost as a competitive multiplayer game, and the hardcore shooter crowd will also notice the difference.

> But Destiny is a larger animal than that. The most hardcore Crucible players will certainly complain about the game running at 30fps... but that's just one of many reasons why I'd argue Destiny isn't the game for hardcore PvP players at all. It's for a broader audience, many of whom are perfectly happy with a rock-solid 30fps.

What I gleaned from the various Bungie videos was that the GPU (on the PS4 Pro) had the power to render the graphics at 60fps, but the CPU didn't have the power to run the game loop (AI, physics, networking, etc.) at 60fps.

So, I don't know that you'd have to make sacrifices in textures & lighting (possibly particle effects too; can the physics for those be run on the GPU now? I don't even know anymore). BUT, you would certainly have to make sacrifices in the physics, AI, # of enemies/objects, etc.

It makes me wonder if Bungie could run the AI & other heavy/slow/low-priority things at a slower rate (30 or 15 updates per second) yet keep the physics at 60fps. They may have already decoupled those & still not been able to hit 60 "game loops/ticks" per second on the necessary elements to pull off 60fps.
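
(For what it's worth, a minimal sketch of what I mean by decoupled rates; the tick rates and the step_physics/step_ai/render callbacks are all made up for illustration.)

    import time

    PHYSICS_DT = 1.0 / 60   # physics steps at 60 ticks per second
    AI_DT = 1.0 / 15        # AI only "thinks" at 15 ticks per second

    def run(step_physics, step_ai, render, frames=600):
        """Fixed-timestep loop: each subsystem catches up at its own rate."""
        physics_acc = ai_acc = 0.0
        prev = time.perf_counter()
        for _ in range(frames):
            now = time.perf_counter()
            physics_acc += now - prev
            ai_acc += now - prev
            prev = now
            while physics_acc >= PHYSICS_DT:   # may run 0..n times per frame
                step_physics(PHYSICS_DT)
                physics_acc -= PHYSICS_DT
            while ai_acc >= AI_DT:
                step_ai(AI_DT)
                ai_acc -= AI_DT
            render()

    run(lambda dt: None, lambda dt: None, lambda: None)   # stub subsystems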

If D2's frame-rate is CPU bound, perhaps they could still pull off 60fps in Crucible? I don't see PvP carrying the same computational burden that PvE does, since there's a lower # of combatants & objects to run logic for.


30fps vs 60fps, GPU vs CPU

by ZackDark @, Not behind you. NO! Don't look., Thursday, May 25, 2017, 17:26 (2748 days ago) @ dogcow

> can the physics for those be run on the GPU now?

Yes, but most physics would be inefficient to run on the GPU, and possibly inaccurate too. The thing about GPUs is that they can do the same mathematical operation on dozens or hundreds of inputs at the same time, but if you need each input to be handled differently, or need the result of one operation to be the input of the next, the GPU will take a full cycle to get to it. The same is true of the CPU, but at least it isn't built with thousands of parallel cells, so you're not wasting processing power that could be better spent elsewhere.
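
(A rough illustration of the difference, with NumPy arrays standing in for GPU-style data parallelism; all the numbers are arbitrary.)

    import numpy as np

    dt = 1.0 / 60
    positions = np.random.rand(100_000, 3)    # 100k particles
    velocities = np.random.rand(100_000, 3)

    # GPU-friendly: the exact same operation applied to every input at once.
    positions += velocities * dt

    # GPU-unfriendly: each result is the input to the next operation,
    # so the work is forced to run one step at a time no matter what.
    x = 1.0
    for _ in range(1000):
        x = 0.5 * x + np.sin(x)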


30fps vs 60fps, GPU vs CPU

by dogcow @, Hiding from Bob, in the vent core., Thursday, May 25, 2017, 17:49 (2748 days ago) @ ZackDark

>> can the physics for those be run on the GPU now?

> Yes, but most physics would be inefficient to run on the GPU, and possibly inaccurate too. The thing about GPUs is that they can do the same mathematical operation on dozens or hundreds of inputs at the same time, but if you need each input to be handled differently, or need the result of one operation to be the input of the next, the GPU will take a full cycle to get to it. The same is true of the CPU, but at least it isn't built with thousands of parallel cells, so you're not wasting processing power that could be better spent elsewhere.

Right, highly parallel manipulation of large arrays of data. What I was specifically wondering about was the integration of physics processing units with GPUs (like PhysX). My understanding of that integration is that it can handle the physics for particle systems (but not necessarily game-simulation physics). Do console GPUs have that integrated now? Honestly, I haven't followed GPU developments or console capabilities lately.


30fps vs 60fps, GPU vs CPU

by ZackDark @, Not behind you. NO! Don't look., Thursday, May 25, 2017, 17:59 (2748 days ago) @ dogcow

Well, by definition you can't, unless it's highly parallelizable physics (like you mentioned, particle physics fits the bill, as do some implementations of fluid dynamics) or you make a GPU that also has a CPU, which might happen on discrete graphics cards, but not on embedded GPUs.


30fps vs 60fps, GPU vs CPU

by Xenos @, Shores of Time, Thursday, May 25, 2017, 17:59 (2748 days ago) @ dogcow

>>> can the physics for those be run on the GPU now?

>> Yes, but most physics would be inefficient to run on the GPU, and possibly inaccurate too. The thing about GPUs is that they can do the same mathematical operation on dozens or hundreds of inputs at the same time, but if you need each input to be handled differently, or need the result of one operation to be the input of the next, the GPU will take a full cycle to get to it. The same is true of the CPU, but at least it isn't built with thousands of parallel cells, so you're not wasting processing power that could be better spent elsewhere.

> Right, highly parallel manipulation of large arrays of data. What I was specifically wondering about was the integration of physics processing units with GPUs (like PhysX). My understanding of that integration is that it can handle the physics for particle systems (but not necessarily game-simulation physics). Do console GPUs have that integrated now? Honestly, I haven't followed GPU developments or console capabilities lately.

Someone correct me if I'm wrong, but from my understanding not much has been done using PPUs and GPUs together since DX10. From what I've heard, most companies have just switched to using the CPU, as it's cheaper and easier (and compatible with a wider range of systems), and with advancements in CPU technology they can use the cores your computer isn't using anyway for work that runs better on fast single cores.


Offloading stuff to GPU

by uberfoop @, Seattle-ish, Thursday, May 25, 2017, 18:47 (2748 days ago) @ dogcow

> Do console GPUs have that integrated now?

It's not really a hard barrier where one moment you can't do stuff and the next you can. The question is what you can offload where, how efficiently, and with how much development effort.

The GPU in the Xbox 360 only nominally supported rendering tasks. That is (roughly speaking, and I'm leaving all kinds of stuff out), you'd submit a piece of geometry, the GPU would sample textures, calculate per-pixel results based on a user-defined shader program, then output the per-pixel results to an image.
But there's nothing stopping you from architecting, within that framework, a task that isn't actually a rendering task, and then having the GPU run it.

So, perhaps your "geometry" is actually a proxy for an array of particle data (position, velocity, particle type, etc). The "textures" that you have the GPU sample are actually just the particle data from the array. Each "pixel" you output is, once again, the particle data, but now updated with new trajectory. The "image" you're creating is just a new array of particle data.
To detect collisions, you can have the GPU also sample the depth/normal buffers of the main scene at the on-screen location of each particle. If the particle's trajectory "intersects the depth buffer", you can calculate a bounce, or do some other thing based on particle type (like a raindrop particle might turn into a water splash particle).

That's exactly what Bungie was doing back in Halo: Reach to process huge numbers of particles for cheap. It's got some limitations, like particles can't interact with any geometry other than what's on-screen and forward-facing (since it's literally just using the on-screen content as the collision geometry), but with things like raindrops and sparks, people usually don't notice the quirks in the system. (Reach still handles some particles CPU-side.)
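
(A toy CPU-side sketch of that data flow, with NumPy arrays standing in for the "textures" and a made-up depth buffer; the real thing runs as shader passes, but the shape is the same.)

    import numpy as np

    W, H = 320, 180
    depth = np.full((H, W), 100.0)   # made-up scene depth per pixel
    depth[120:, :] = 10.0            # pretend the lower screen is a nearby wall

    # "Texture" of particle state per row: x, y (pixels), depth, fall speed.
    particles = np.random.rand(5000, 4) * [W, H, 50.0, 0.0]
    particles[:, 3] = 2.0            # every particle falls 2 px per frame

    def update(p):
        """One 'render pass': each output particle is a function of one input."""
        p = p.copy()
        p[:, 1] += p[:, 3]           # advance trajectory (fall down-screen)
        xi = np.clip(p[:, 0].astype(int), 0, W - 1)
        yi = np.clip(p[:, 1].astype(int), 0, H - 1)
        hit = p[:, 2] >= depth[yi, xi]   # "intersects the depth buffer"
        p[hit, 3] *= -0.5            # bounce: reflect and damp the velocity
        return p                     # the output "image" is new particle data

    particles = update(particles)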


30fps vs 60fps, GPU vs CPU

by Cody Miller @, Music of the Spheres - Never Forgot, Thursday, May 25, 2017, 19:10 (2748 days ago) @ dogcow

Programmers can correct me if I'm wrong, but the CPU has to generate the display lists to send to the GPU for rendering. This is why, even though 4-player split screen is the same number of pixels, you get slowdown: the CPU is assembling 4 display lists instead of one. The GPU can't draw what it doesn't get, so there is inescapable CPU overhead in improving the frame rate.
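
(Roughly this, in sketch form; Scene/Camera and the command format are invented for illustration.)

    from dataclasses import dataclass, field

    @dataclass
    class Camera:
        name: str
        def can_see(self, obj):      # stand-in for real frustum culling
            return True

    @dataclass
    class Scene:
        objects: list = field(default_factory=list)

    def build_display_list(scene, camera):
        """CPU-side work: cull and record one draw command per visible object."""
        return [("draw", obj, camera.name) for obj in scene.objects
                if camera.can_see(obj)]

    # Four-player split screen means four cameras: roughly 4x the CPU-side
    # list-building work, even though the total pixel count is unchanged.
    scene = Scene(objects=[f"mesh{i}" for i in range(1000)])
    cameras = [Camera(f"player{i}") for i in range(4)]
    display_lists = [build_display_list(scene, cam) for cam in cameras]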


30fps vs 60fps, GPU vs CPU

by uberfoop @, Seattle-ish, Thursday, May 25, 2017, 19:44 (2748 days ago) @ Cody Miller

> Programmers can correct me if I'm wrong, but the CPU has to generate the display lists to send to the GPU for rendering. This is why, even though 4-player split screen is the same number of pixels, you get slowdown: the CPU is assembling 4 display lists instead of one. The GPU can't draw what it doesn't get, so there is inescapable CPU overhead in improving the frame rate.

That's part of it.

Even as far as rendering goes, there's lots of extra work on the GPU side. Doing two small tasks takes longer than one big one, since context switches and stuff happen. Small polygons are more expensive to draw (relative to their size) than big ones due to how GPU rasterizers work, and LOD systems are never 100% perfect.

I don't have any exact data to back this up, but I wouldn't be surprised if Halo 3's geometric simplicity is a big part of why it's relatively uncompromised in split-screen. Fewer things in the scene means less duplicated work.


30fps vs 60fps, GPU vs CPU

by uberfoop @, Seattle-ish, Friday, May 26, 2017, 15:30 (2747 days ago) @ dogcow

> If D2's frame-rate is CPU bound, perhaps they could still pull off 60fps in Crucible?

Even if it's technically feasible, I'd strongly dislike that. Regularly switching between 60fps and 30fps in the same game feels bad. The acclimation gets faster and easier the more you do it, but still.


30fps vs 60fps, GPU vs CPU

by dogcow @, Hiding from Bob, in the vent core., Friday, May 26, 2017, 18:44 (2747 days ago) @ uberfoop
edited by dogcow, Friday, May 26, 2017, 18:54

>> If D2's frame-rate is CPU bound, perhaps they could still pull off 60fps in Crucible?

> Even if it's technically feasible, I'd strongly dislike that. Regularly switching between 60fps and 30fps in the same game feels bad. The acclimation gets faster and easier the more you do it, but still.

Having the framerate drop mid-game feels bad, yes. You can really feel that. But I don't think having a solid 60 in one game mode and a solid 30 in another would be as terrible as you suggest; Crucible would just have a different feel to it than PvE. In fact, Crucible already has a different feel than PvE (at least it did for me; I got destroyed for many, many games before I could get used to how different it was from Destiny PvE).

Edit: maybe it would be terrible, but I wouldn't write it off until I tried it.


30fps vs 60fps, GPU vs CPU

by uberfoop @, Seattle-ish, Friday, May 26, 2017, 19:46 (2747 days ago) @ dogcow

> Having the framerate drop mid-game feels bad, yes.

I don't mean mid-game. I mean within the same general product: same visual design, same control scheme.

Play a Halo game on MCC for a while, and then immediately start playing the original version on oXbox or 360. The switch from 60->30 is jarring.


30fps vs 60fps, GPU vs CPU

by dogcow @, Hiding from Bob, in the vent core., Friday, May 26, 2017, 20:21 (2747 days ago) @ uberfoop

>> Having the framerate drop mid-game feels bad, yes.

> I don't mean mid-game. I mean within the same general product: same visual design, same control scheme.

> Play a Halo game on MCC for a while, and then immediately start playing the original version on oXbox or 360. The switch from 60->30 is jarring.

Yeah, I should give that a try sometime. I purchased the MCC but never got around to playing it. It seems whenever I'd have a spare 30 minutes (that I didn't want to use in Destiny) I'd fire up my XB1 & have to update the system software, then I'd launch the MCC & it'd have a bunch of updates, & I'd never get to playing before my time was up. I assume the MCC has calmed down by now. :)

It's a shame really, my XB1 has mostly been just a paperweight. :(


30fps vs 60fps, GPU vs CPU

by Cody Miller @, Music of the Spheres - Never Forgot, Friday, May 26, 2017, 18:59 (2747 days ago) @ dogcow

60fps for Multiplayer and 30fps for single player worked for Uncharted 4.


30fps vs 60fps

by Xenos @, Shores of Time, Thursday, May 25, 2017, 17:25 (2748 days ago) @ CruelLEGACEY

> The debate between hitting 30fps vs 60fps is a contentious one from the developers' side. Hitting 60fps, and keeping it there, is extremely hard.

Yeah, this is the thing that really gets me. The two examples people usually bring up are Battlefield 1 and Titanfall 2. Battlefield 1, however, does NOT hit a consistent 60fps, dipping below 45fps on a regular basis, and Titanfall 2 only hits 60fps consistently by using dynamic resolution. For a company that obviously prides itself on making gorgeous games, neither of those solutions is good. I can completely understand why they would prefer 4K/1080p at 30fps over a spotty 60fps or a changing resolution.
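
(For reference, dynamic resolution boils down to a feedback rule like this; the budget, thresholds, and step size are invented.)

    def adjust_scale(scale, gpu_ms, budget_ms=16.7, lo=0.7, hi=1.0, step=0.05):
        """Dynamic resolution in one rule: render smaller when over budget."""
        if gpu_ms > budget_ms:
            return max(lo, scale - step)   # drop resolution to save GPU time
        if gpu_ms < 0.9 * budget_ms:
            return min(hi, scale + step)   # claw resolution back when safe
        return scale

    scale = 1.0
    for frame_ms in [15.0, 18.2, 17.5, 16.0, 14.1]:   # fake GPU timings
        scale = adjust_scale(scale, frame_ms)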


30fps vs 60fps

by Cody Miller @, Music of the Spheres - Never Forgot, Thursday, May 25, 2017, 19:05 (2748 days ago) @ CruelLEGACEY

It is not hard to hit 60fps. The NES was doing that with a CPU running at 3MHz or something like that.

You simply need to scale back the complexity of your simulation.


And/or visual complexity too

by ZackDark @, Not behind you. NO! Don't look., Thursday, May 25, 2017, 19:25 (2748 days ago) @ Cody Miller

I can hit astronomical frame rates in my entirely text-based simulation programs, and I can assure you they're doing way more complex physics, with far more particles/larger arrays, than most games out there.


30fps vs 60fps

by CruelLEGACEY @, Toronto, Thursday, May 25, 2017, 19:38 (2748 days ago) @ Cody Miller

> It is not hard to hit 60fps. The NES was doing that with a CPU running at 3MHz or something like that.

> You simply need to scale back the complexity of your simulation.

I'll rephrase. It is hard to hit 60fps while maintaining a graphical fidelity that the audience will accept.


Hardware Sprites

by uberfoop @, Seattle-ish, Thursday, May 25, 2017, 20:00 (2748 days ago) @ Cody Miller

> It is not hard to hit 60fps. The NES was doing that with a CPU running at 3MHz or something like that.

> You simply need to scale back the complexity of your simulation.

Yep.

The hardware sprite era had very different types of tradeoffs for the devs, though.

Those systems don't work by having a GPU render a frame that then gets sent to be output. Instead, the graphics hardware is set up with data describing which sprites are at which on-screen locations, and it produces the image pixel-by-pixel as it gets sent to the TV: you give it an on-screen coordinate, and it tells you the color of that pixel.

So the graphics hardware was always running in sync with the video signal. Since the video signal was 60Hz, the graphics hardware always produced frames at 60fps.

Rendering itself was never a bottleneck, so tradeoffs that would result in a 30fps game didn't have as much bang for the buck to offer, and would effectively leave half the system idle half the time. Those systems also had very little memory, so you didn't necessarily have enough stuff to work on to make it worth blowing more than 16ms per frame anyway.
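
(A toy model of that scanout style; the sprite format is invented, but the shape of it, "given a screen coordinate, return a color", is the point.)

    # Toy hardware-sprite scanout: no framebuffer. As the beam scans, the
    # "chip" answers one question per pixel: what color is at (px, py)?
    SPRITES = [
        # (x, y, width, height, color) - invented format for illustration
        (10, 5, 8, 8, "red"),
        (30, 12, 8, 8, "blue"),
    ]
    BACKGROUND = "black"

    def pixel_color(px, py):
        """Evaluated in scan order; one full pass over the screen per frame."""
        for x, y, w, h, color in SPRITES:   # real chips keep per-line sprite lists
            if x <= px < x + w and y <= py < y + h:
                return color
        return BACKGROUND

    frame = [[pixel_color(px, py) for px in range(64)] for py in range(32)]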


30fps vs 60fps

by Durandal, Friday, May 26, 2017, 14:59 (2747 days ago) @ CruelLEGACEY

I will say, from my limited time with Titanfall 2 and far more extensive time with Overwatch, I haven't really noticed the difference between 30fps and 60fps.

The performance benefits are already so far beyond my skill level that I just don't derive much observable benefit. I'd much rather Bungie add more interactive terrain, NPCs, and visual effects than strip all that out to chase higher frame-rate throughput.


Relic is back!

by Kahzgul, Thursday, May 25, 2017, 17:15 (2748 days ago) @ CruelLEGACEY

Ahh, my favorite map from TF1 returns. Sweeeeeeeet!


Relic is back!

by CruelLEGACEY @, Toronto, Thursday, May 25, 2017, 17:23 (2748 days ago) @ Kahzgul

> Ahh, my favorite map from TF1 returns. Sweeeeeeeet!

One of my favorites too! I also didn't expect how happy I would be to hear some old Titanfall 1 music again. Goosebumps :D
