
Rendering The Future (layman's version... by a layman...) (Destiny)

by uberfoop @, Seattle-ish, Sunday, March 16, 2014, 11:29 (3914 days ago) @ Ragashingo
edited by uberfoop, Sunday, March 16, 2014, 11:40

1. They did a lot of work to make sure Destiny's new graphics features worked on the 360. If I read it right, it looks like they managed to give every scene dynamic lighting (with full interactive shadows and light flickers and stuff) while still fitting within the current generation of hardware. This is awesome for me since I'll be playing on the 360 for a while.

At the very least, it sounds like they'll be wanting their "static" lights to have some degree of animation available.

If you're expecting scenes to have lots of shadow-casting dynamic lights on PS360, I wouldn't get my hopes up; it's possible, but this is a pretty ambitious game to begin with, and I won't be surprised if they just parallel-project a single shadow from a "dominant light source."

One big step forward from Reach seems to have been improving the materials and rendering systems to reduce the number of costly or hard-to-work-with special cases. In Destiny, they place a light and the entire scene, from basic level geometry to characters to transparent objects, gets lit and shadowed correctly. In Reach, they had several different light types, terrain types, and so on that all reacted differently (or not at all, meaning some combinations simply didn't work with each other, period) depending on what they were doing. Improving things on that end lets the artists more easily create what they want without worrying whether an object is going to reflect or shadow or interact with fog or sparks properly, etc.

Graphics engines have generally been moving in this direction. "Physically based rendering" is part of that phenomenon.

If you use a unified model that is largely an approximation to realistic light phenomena, an artist can define a material according to real-world understanding of how materials work, and it will look as expected when placed in any environment. More getting things right the first time, less digging through a weird list of special cases and funky magic options that don't line up with common sense.
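To make that concrete, here's a toy sketch of the idea (my own illustration, not Bungie's actual shading model): with an energy-conserving, physically based diffuse term, one set of material parameters behaves plausibly under any light, with no per-light or per-surface special cases.

```python
import math

def shade_lambert(albedo, normal, light_dir, light_radiance):
    """Energy-conserving Lambertian shading. The same material
    parameters respond consistently to any light source; there is
    no branching on 'light type' or 'surface type'."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    # The 1/pi normalization keeps reflected energy <= incoming energy,
    # so an artist-authored albedo never over-brightens the scene.
    return tuple(a / math.pi * n_dot_l * light_radiance for a in albedo)
```

An artist just sets the albedo to the real-world reflectance of the material, and the result looks right whether the light is a flickering torch or the sun.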

3. Particles were a big improvement that Reach showed over Halo 3. Through various techniques Bungie got a huge increase in the number of particles they could use at once. This showed up as more sparks, and explody bits, and my favorite, as the raindrops that could be frozen in midair in theater mode since each nearby drop actually existed.

The main thing Reach did was pull particle physics off the CPU and run it on the GPU.

From what I understand, in Halo 3, the CPU calculates the particle trajectories, and it detects when particles collide with polygons in the game world.

In Reach, particle collision is a screen-space phenomenon.

When modern 3D games are rendered, they wind up generating depth and normal buffers which describe the "shape" of the scene as seen by the player. So if a scene involving a sphere looks like this, you might have a depth buffer that looks something like this; the brightness corresponds to the depth from the camera.

(Sorry for the polygon edges in the second image, "working correctly" was not yet a feature of my rasterizer when I stored that image.)

In Reach, the GPU handles particle movement, and detects particle collision using stuff like the depth buffer. It's a GPU-friendly and highly parallelized procedure, which can run blazing fast on the 360's GPU even with a pretty large particle count.
One consequence is that it probably doesn't handle offscreen particles particularly correctly... but then, that obviously might not be a big deal.
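In the spirit of that technique, here's a toy CPU sketch of screen-space collision (my own illustration, not Bungie's code): a particle "collides" when its camera-space depth reaches the depth stored at its on-screen pixel, and off-screen particles are simply skipped, which is exactly the limitation mentioned above.

```python
def collide_screen_space(particles, depth_buffer, project):
    """Kill particles that have passed behind the surface recorded
    in the depth buffer. 'project' maps a particle to pixel coords
    plus its view-space depth. Off-screen particles are not tested."""
    survivors = []
    h, w = len(depth_buffer), len(depth_buffer[0])
    for p in particles:
        x, y, depth = project(p)
        if 0 <= x < w and 0 <= y < h and depth >= depth_buffer[y][x]:
            continue  # behind the visible surface: treat as collided
        survivors.append(p)
    return survivors
```

On a GPU this test runs independently per particle, which is why it parallelizes so well compared to CPU-side collision against actual world polygons.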

The stuff involving particles in this presentation is mostly about particle rendering, though.

The Destiny engine seems to take another similar-sized step forward, using some fancy tricks (like determining that a small particle is too far away to really be seen and taking it out of the equation early on, so the game can spend available processing power on things you actually will see) to further increase the number of particles they can have on screen at any given time.

Haha, practically every modern game seems to do that sort of thing. Though I'm not sure where the presentation talks about dropping particles based on distance.
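If a game did want that kind of culling, a minimal version might look like this (my own sketch, not anything from the presentation): drop particles whose projected on-screen size falls below a pixel threshold.

```python
def cull_small_particles(particles, focal_px, min_pixels=1.0):
    """Keep only particles big enough to matter on screen.
    Each particle is (world_radius, distance_from_camera);
    'focal_px' is the camera's focal length in pixels."""
    kept = []
    for world_radius, distance in particles:
        # Perspective projection: on-screen diameter shrinks as 1/distance.
        screen_px = 2.0 * world_radius * focal_px / distance
        if screen_px >= min_pixels:
            kept.append((world_radius, distance))
    return kept
```

The payoff is that the cost check is cheap and happens early, before any simulation or rendering work is spent on a particle nobody can see.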

The section about transparent effects mostly talks about how transparencies are shaded, and (perhaps more awesomely) how they've implemented low-resolution transparency blending. That triple-image comparison at the end, with "high resolution render" compared with "1/4 res with bilinear upsample" compared with "1/4 res with VDM" is AWESOME.

Most games that use low-resolution alpha blending have artifacts like those in the "bilinear upsample" image; in particularly extreme cases, like transparencies in Gran Turismo 5, this can make a 1280x1080 game look like it's running at 240p.
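To see where those artifacts come from, here's a minimal 2x bilinear upsample (my own toy, grayscale, lists of rows): a hard edge in the low-resolution buffer gets smeared across several output pixels, which is exactly the blurring the "bilinear upsample" comparison image shows.

```python
import math

def upsample2x_bilinear(img):
    """Upsample a grayscale image (list of rows) by 2x in each axis
    using bilinear filtering with edge clamping."""
    h, w = len(img), len(img[0])

    def sample(y, x):
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    out = []
    for oy in range(2 * h):
        sy = (oy + 0.5) / 2.0 - 0.5   # map output pixel to source coords
        y0, fy = math.floor(sy), (oy + 0.5) / 2.0 - 0.5 - math.floor(sy)
        row = []
        for ox in range(2 * w):
            sx = (ox + 0.5) / 2.0 - 0.5
            x0, fx = math.floor(sx), sx - math.floor(sx)
            top = sample(y0, x0) * (1 - fx) + sample(y0, x0 + 1) * fx
            bot = sample(y0 + 1, x0) * (1 - fx) + sample(y0 + 1, x0 + 1) * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out
```

A sharp 0-to-1 transition in the quarter-res buffer comes out as a 0, 0.25, 0.75, 1 ramp; techniques like VDM exist to recover that lost edge sharpness.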

That VDM stuff is cool (and I'm sure the PS3 version will look better for it, given that that console generally has shit for render output performance).

My one, hopefully not too layman-ish, question would be this: A lot of the new lighting system was made to fit in the 360's small EDRAM but it sorta sounded like maybe the systems were still confined to that small amount of space even on the next-gen consoles. Is this true, or can the graphics systems spread their wings on the next-gen consoles? You know... generally speaking... :p

Depends what you mean by "confined." They chose a 96-bit-per-pixel format to be able to fit decently high-resolution (almost 720p) buffers into the 360's 10MB eDRAM.

The next-gen consoles will use the same 96bpp format, but they don't have the same buffer size restriction; the XB1 has a fast 32MB pool of eSRAM and a slower 8GB pool of DDR3 DRAM to render to, while the PS4 always renders out to an 8GB pool of GDDR5 DRAM. So if you were concerned that the PS4 version was going to push 1152x720 because that's a reasonable choice for the 360, that's probably not going to happen.
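As a quick sanity check on those numbers (my own back-of-envelope arithmetic, not figures from the presentation):

```python
def gbuffer_bytes(width, height, bits_per_pixel):
    """Size in bytes of a single render target at the given format."""
    return width * height * bits_per_pixel // 8

EDRAM_BYTES = 10 * 1024 * 1024  # the 360's eDRAM pool

# 1152x720 at 96bpp is ~9.5 MB, which just squeezes into 10 MB of
# eDRAM; a full 1280x720 target at 96bpp (~10.5 MB) would not fit.
fits_1152 = gbuffer_bytes(1152, 720, 96) <= EDRAM_BYTES  # True
fits_1280 = gbuffer_bytes(1280, 720, 96) <= EDRAM_BYTES  # False
```

Which is why 1152x720 is a "reasonable choice" on the 360, and why the restriction simply evaporates once you're rendering into a 32MB eSRAM pool or main RAM.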

Furthermore, this engine is likely very configurable, and I certainly expect the PS4/XB1 to be cranked up in some respects beyond just resolution (and texture quality blah blah blah). Maybe the antialiasing is better, maybe transparencies are bucketed to higher quality, maybe more light sources cast shadows, maybe there are more light sources, maybe texture filtering is better.
Although they could certainly have used a larger buffer format and a more complex material model on PS4/XB1, using a buffer format like this doesn't necessarily mean that they aren't taking decent advantage of the offerings of the eighth-gen consoles.

