PS4 and Xbox One resolution / frame rate discussion



That's a strange one considering it's an arcade title.

That must be one demanding engine they are using.

 

I agree, that is weird; it doesn't look like a very demanding game.


I agree, that is weird; it doesn't look like a very demanding game.

Well, if it's not due to the demands of the engine, I guess you could look at other development issues.


Trials has always sided with Xbox, so it's unlikely anyone can start the "they're pro-Sony" nonsense. Must be more SDK issues yet again.


I think it's lack of experience with the One. Considering it's a racing game, it has tiny environments and the less detailed backdrops that are typical of racing games. Forza 5 managed it, so Trials should.

 

If it were a different kind of game I might understand.


I think it's lack of experience with the One. Considering it's a racing game, it has tiny environments and the less detailed backdrops that are typical of racing games. Forza 5 managed it, so Trials should.

If it were a different kind of game I might understand.

The difference between those two is that Trials uses deferred lighting, whereas Forza used pre-baked lighting.


The difference between those two is that Trials uses deferred lighting, whereas Forza used pre-baked lighting.

 

And the textures/models/backgrounds look like garbage compared to Forza.


The difference between those two is that Trials uses deferred lighting, whereas Forza used pre-baked lighting.

I hadn't heard that Forza 5 used pre-baked lighting, but it makes sense given the performance limitations of the XB1. RAGE also used pre-baked lighting to run at 60fps on X360 / PS3, and I was thoroughly unimpressed by it. Dynamic lighting is expected of modern games.


So is deferred lighting very demanding on hardware?

Yeah, any kind of realtime lighting engine is. Lighting has always been one of the most taxing things on graphics cards, if not the most. It's probably rivalled by alpha textures on things like smoke/fog, but those can be limited to small amounts, as with gunshots. The overall lighting engine affects everything on screen.
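
To put rough numbers on that, here's a minimal back-of-envelope sketch in C. Every figure in it (the light count, one evaluation per light per pixel) is a made-up assumption for illustration, not a measurement from any real game:

#include <stdio.h>

/* Illustrative arithmetic only: why a realtime lighting engine touches
   everything on screen. All numbers are assumptions, not measurements. */
int main(void)
{
    long pixels_1080p = 1920L * 1080L;  /* every pixel gets shaded */
    int  lights       = 8;              /* hypothetical visible light count */
    int  fps          = 60;

    /* Dynamic lighting: every light is evaluated at (roughly) every pixel,
       every frame. */
    long dynamic_evals = pixels_1080p * lights * fps;

    /* Pre-baked lighting: computed offline into lightmap textures, so at
       runtime each pixel is just one extra texture fetch. */
    long baked_fetches = pixels_1080p * fps;

    printf("dynamic light evaluations/sec: %ld\n", dynamic_evals);
    printf("baked lightmap fetches/sec:    %ld\n", baked_fetches);
    return 0;
}

The dynamic cost multiplies with the number of lights, while baked lighting stays a flat one fetch per pixel however the scene is lit, which is why the lighting engine dominates everything else.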

I hadn't heard that Forza 5 used pre-baked lighting, but it makes sense given the performance limitations of the XB1. RAGE also used pre-baked lighting to run at 60fps on X360 / PS3, and I was thoroughly unimpressed by it. Dynamic lighting is expected of modern games.

Yeah it does, which is probably why night time / time-of-day changes aren't there yet either.

Trials was 800p prior to patch?

 

In an email from Ubisoft's PR containing our review code for the game, we have officially received details on how Trials Fusion will look on each platform. As their first game to release on multiple platforms at launch, it seems that RedLynx has successfully mastered the art of multiplatform development. As expected, the PS4 and PC versions of the game will be running at 1080p while the Xbox One version will run at 900p after the game's day one patch. On the other hand, the Xbox 360 version of Trials Fusion will run in native 600p, "like all leading games on the platform", with each version of the game running at a constant 60 frames-per-second.
 
Additionally, the game's day one patch has been detailed, which will increase the resolution of the Xbox One version from 800p to 900p, and will improve the frame-rate of the PS4 version. Other fixes will deal with Track Central issues, leaderboard issues, and replays not being in sync. The game will release physically and digitally on Xbox One, PS4, and Xbox 360 on April 16th, with the PC version coming later on the 24th.

 

 

http://b-ten.com/trials-fusion-resolution-day-one-patch-details-revealed/


 

A game that looks that shoddy running at 800p, upgraded to 900p. Hilarious. It's basically an arcade game. Maybe the SDK needs work, but like I said earlier, Forza managed it and that's a racing game too. It can't be down just to the lighting.


A game that looks that shoddy running at 800p, upgraded to 900p. Hilarious. It's basically an arcade game. Maybe the SDK needs work, but like I said earlier, Forza managed it and that's a racing game too. It can't be down just to the lighting.

Actually, it can. Dynamic lighting requires a lot more processing power than static lighting. With games like Forza 5 they pre-bake all the shadows and reflections into the textures - this looks fine in many circumstances, but it has limitations. For instance, you can't do night races, you can't do day/night transitions, etc. It's one of the reasons RAGE was able to run at 720p @ 60fps on X360 and PS3 when most other first-person shooters ran at 30fps or dropped the resolution to run at 60fps (like the Call Of Duty series).
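
To make the trade-off concrete, here's a minimal sketch of the two approaches in plain C (the vec3 type and both shade functions are hypothetical, written for illustration rather than taken from any real engine):

#include <stdio.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Pre-baked: lighting was computed offline into a lightmap texture.
   At runtime it's one multiply - cheap, but frozen: move the sun and
   the stored result is simply wrong. */
static vec3 shade_baked(vec3 albedo, vec3 lightmap_sample)
{
    return (vec3){ albedo.x * lightmap_sample.x,
                   albedo.y * lightmap_sample.y,
                   albedo.z * lightmap_sample.z };
}

/* Dynamic: the light is evaluated per pixel, per frame, so it can change
   (night races, day/night transitions), but it costs shader work for
   every pixel on screen. */
static vec3 shade_dynamic(vec3 albedo, vec3 normal, vec3 light_dir, float intensity)
{
    float ndotl = dot3(normal, light_dir);
    if (ndotl < 0.0f) ndotl = 0.0f;
    float s = ndotl * intensity;
    return (vec3){ albedo.x * s, albedo.y * s, albedo.z * s };
}

int main(void)
{
    vec3 albedo = { 0.8f, 0.6f, 0.4f };
    vec3 lm     = { 0.9f, 0.9f, 1.0f };        /* baked result, fixed forever */
    vec3 n      = { 0.0f, 1.0f, 0.0f };
    vec3 sun    = { 0.0f, 0.7071f, 0.7071f };  /* free to move frame to frame */

    vec3 a = shade_baked(albedo, lm);
    vec3 b = shade_dynamic(albedo, n, sun, 1.0f);
    printf("baked:   %.2f %.2f %.2f\n", a.x, a.y, a.z);
    printf("dynamic: %.2f %.2f %.2f\n", b.x, b.y, b.z);
    return 0;
}

The baked path is just a multiply with a stored sample, which is why it's so cheap - and also why the sun can never move.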

 

The lack of dynamic lighting is a major omission for a modern game. But Forza 5 doesn't just cut corners there - the buildings are mostly blocky boxes with virtually no polygon detail, and the spectators are 2D and aren't animated. It's all smoke and mirrors, whereas Trials Fusion has detailed backdrops and proper lighting. Forza 5 is very good at masking its limitations, but a lot of features were cut to run at 60fps - it highlights the limitations of the next-gen consoles.


It's pretty surprising that an arcade game can outdo a AAA title in both visuals and engine sophistication. I bet their budget was much smaller too.

 

I think Forza was just pushing for the 1080p/60fps mark and sacrificed visual and engine sophistication so it'd look like the Xbox One was going to have 1080p/60fps as a standard. (It was a launch title.)


I hadn't heard that Forza 5 used pre-baked lighting, but it makes sense given the performance limitations of the XB1. RAGE also used pre-baked lighting to run at 60fps on X360 / PS3, and I was thoroughly unimpressed by it. Dynamic lighting is expected of modern games.

Yeah, any kind of realtime lighting engine is. Lighting has always been one of the most taxing things on graphics cards, if not the most. It's probably rivalled by alpha textures on things like smoke/fog, but those can be limited to small amounts, as with gunshots. The overall lighting engine affects everything on screen.

Yeah it does, which is probably why night time / time-of-day changes aren't there yet either.

The GPU is fine doing this sort of work; deferred rendering techniques do make the GPU work, but that's what the GPU is there for. The X1 has no issues with that. It's more that when you use techniques like deferred lighting, your frame buffer size flies up, which stops it all fitting in the eSRAM at 1080p. Bundles and DX12 will be the saviour of it, and won't have issues like this. With bundles, the memory management only pulls over the part of the render target that's needed for the region being rendered at the time, which allows the best management of the RAM.
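
Some back-of-envelope arithmetic shows the squeeze. The G-buffer layout assumed below (four 32-bit colour targets plus 32-bit depth, so 20 bytes per pixel) is a typical deferred setup picked for illustration, not RedLynx's actual configuration:

#include <stdio.h>

/* Assumed deferred G-buffer: 4 render targets + depth, 32 bits each. */
static long gbuffer_bytes(int w, int h)
{
    int targets      = 4;  /* e.g. albedo, normals, material, light accum */
    int bytes_per_px = 4;  /* 32 bits per target */
    int depth_bytes  = 4;  /* 32-bit depth/stencil */
    return (long)w * h * (targets * bytes_per_px + depth_bytes);
}

int main(void)
{
    long esram = 32L * 1024 * 1024;
    printf("eSRAM: %ld MB\n", esram / (1024 * 1024));
    printf("1080p: %.1f MB\n", gbuffer_bytes(1920, 1080) / 1048576.0);
    printf(" 900p: %.1f MB\n", gbuffer_bytes(1600,  900) / 1048576.0);
    printf(" 800p: %.1f MB\n", gbuffer_bytes(1280,  800) / 1048576.0);
    return 0;
}

With those assumptions 1080p needs roughly 39.6MB, which overflows the 32MB of eSRAM, while 900p comes in around 27.5MB and fits - which lines up with the resolutions this game actually ships at.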

 

EDIT: Read More @ http://blogs.msdn.com/b/directx/archive/2014/03/20/directx-12.aspx (At the bottom) 

From my understanding of the subject, the CPU overhead decrease is something that is already implemented on the X1, it being a console. It's debatable whether the better scaling across the cores has been applied. The descriptor heaps and tables and the bundles aren't implemented on the X1, along with the pipeline changes. These points make a massive difference.


It really comes down to the SDK and/or developers better utilizing the eSRAM. MS has been clear about it: if you want to get to 1080p and 60 frames, that's how you do it. If a game is 900p at 60fps or 1080p at 30fps, it's because they're not using the eSRAM in a way that will allow for it. To just write it off as a limit of the hardware alone is premature. I have no doubt that right now it's not easy to work with, but that's not always going to be the case going forward. The fact that more and more games are getting closer to that "magic mark" everyone wants so badly should be telling at this point. It's the same thing we've seen with every new console release, especially if there's something new to learn. I expect to see a number of 900p@60fps titles for a while, and then you'll get more and more at 1080p@60; time is always a factor in things.


The GPU is fine doing this sort of work; deferred rendering techniques do make the GPU work, but that's what the GPU is there for. The X1 has no issues with that. It's more that when you use techniques like deferred lighting, your frame buffer size flies up, which stops it all fitting in the eSRAM at 1080p. Bundles and DX12 will be the saviour of it, and won't have issues like this. With bundles, the memory management only pulls over the part of the render target that's needed for the region being rendered at the time, which allows the best management of the RAM.

eSRAM is a limitation rather than an asset. Microsoft might be able to reduce its impact, but it will always make game development more difficult than on the PS4. Certainly DX12 will improve things, but the new OpenGL does the same thing (reduces CPU usage and allows access to low-level APIs), so that won't do anything to close the gap. The issue will be compounded if the PS4 becomes the lead platform for most game development, which is a distinct possibility given it is easier to programme for and has outsold the XB1 - this will mean the PS4 versions receive more resources and attention, giving it a further advantage. We saw a similar thing last generation with the X360, as the PS3 was more difficult to programme for.


eSRAM is a limitation rather than an asset. Microsoft might be able to reduce its impact, but it will always make game development more difficult than on the PS4. Certainly DX12 will improve things, but the new OpenGL does the same thing (reduces CPU usage and allows access to low-level APIs), so that won't do anything to close the gap. The issue will be compounded if the PS4 becomes the lead platform for most game development, which is a distinct possibility given it is easier to programme for and has outsold the XB1 - this will mean the PS4 versions receive more resources and attention, giving it a further advantage. We saw a similar thing last generation with the X360, as the PS3 was more difficult to programme for.

It's good to know that you didn't read the topic around the subject matter I posted above before claiming OpenGL does the same things, which it doesn't by a long shot. You've also got to understand that, in terms of this discussion (consoles), the PS4 doesn't solely use OpenGL but rather a heavily modified version using in-house PlayStation technologies, which doesn't adopt the latest OpenGL implementations.

 

Without eSRAM the ROPs would have to share bandwidth with everything else trying to access DDR3 for framebuffers; you'd have a massive bottleneck in the system. With eSRAM the ROPs have a pool of memory which is accessed solely by the GPU for the purpose of framebuffers. It's just a shame that the thing is on silicon and it's too small to properly fit a large z-buffer on there at 1080p. With DX12 bundles and descriptor heaps and tables, the buffers can be managed correctly and pulled to eSRAM in bundles for rendering, which dramatically saves on space and allows the DMAs to dedicate themselves to moving buffers between DDR3 and eSRAM. It's a far from ideal solution compared to unified memory (à la GDDR5), but with bundles and those techniques in general you could see some really awesome rendering techniques, of course in 1080p.
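
Here's a minimal sketch of that "pulled over by part" idea, reusing the assumed 20 bytes per pixel from the earlier example. It's slice arithmetic only - a hypothetical illustration, not how the X1's DMA engines are actually programmed:

#include <stdio.h>

int main(void)
{
    int  height        = 1080;
    long bytes_per_row = 1920L * 20;          /* assumed 20 bytes/pixel */
    long esram         = 32L * 1024 * 1024;

    /* Largest horizontal slice whose working set fits in eSRAM. */
    int slice_rows = (int)(esram / bytes_per_row);
    if (slice_rows > height)
        slice_rows = height;

    int slices = (height + slice_rows - 1) / slice_rows;  /* round up */
    printf("slice height: %d rows -> %d slice(s) per 1080p frame\n",
           slice_rows, slices);

    /* The idea: render one slice in eSRAM while the DMA engines stream
       the previous slice back out to DDR3, hiding the copy behind GPU
       work. */
    return 0;
}

So instead of one buffer that doesn't fit, you get two slices that do, at the cost of exactly the kind of manual bookkeeping being described above.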


"When I hear people say devs need to just utilize esram better, its like saying they need to learn how to do the most basic thing again. Esram isn?t rocket science, it isn?t even complex. Its just a pain in the a**. That?s it,?

 


 

Just to clear up the claim that eSRAM is very complex and that's why devs haven't figured out how to utilize it correctly.

It's merely too small (32MB) to be much of a benefit.


 

"When I hear people say devs need to just utilize esram better, its like saying they need to learn how to do the most basic thing again. Esram isn?t rocket science, it isn?t even complex. Its just a pain in the a**. That?s it,?
 
 
Just to clear up the claim that eSRAM is very complex and that's why devs haven't figured out how to utilize it correctly.
It's merely too small (32MB) to be much of a benefit.

 

There's a difference between being easy to use on a basic level and building render engines that properly work around its small size to correctly allow a lot of layers in a 1080p frame. The latter is complex, and like I said prior, it's something DX12 will help a lot with due to it being inside the API.

 

I love how you favour an un-named developer quoted by a user who is banned from the forum the article took its source from, rather than official and technical documentation. What isn't rocket science is reading up yourself and gathering a complete and correct understanding.

 

Then again, some things will never change.


 

It's good to know that you didn't read the topic around the subject matter I posted above before claiming OpenGL does the same things, which it doesn't by a long shot. You've also got to understand that, in terms of this discussion (consoles), the PS4 doesn't solely use OpenGL but rather a heavily modified version using in-house PlayStation technologies, which doesn't adopt the latest OpenGL implementations.

 

Without eSRAM the ROPs would have to share bandwidth with everything else trying to access DDR3 for framebuffers; you'd have a massive bottleneck in the system. With eSRAM the ROPs have a pool of memory which is accessed solely by the GPU for the purpose of framebuffers. It's just a shame that the thing is on silicon and it's too small to properly fit a large z-buffer on there at 1080p. With DX12 bundles and descriptor heaps and tables, the buffers can be managed correctly and pulled to eSRAM in bundles for rendering, which dramatically saves on space and allows the DMAs to dedicate themselves to moving buffers between DDR3 and eSRAM. It's a far from ideal solution compared to unified memory (à la GDDR5), but with bundles and those techniques in general you could see some really awesome rendering techniques, of course in 1080p.

So Microsoft is able to massively improve the performance of the XB1 but Sony can't do the same with the PS4? Okay.

 

This reminds me of all the claims we heard about the XB1 and how its cloud computing would give it a massive advantage over the PS4. So far we have seen no tangible benefits from that. DX12 isn't due out for another 18 months, during which time it seems likely that the PS4 will continue to have the performance advantage. Microsoft has already had to remove the GPU reservation for the Kinect in order to compete with the PS4, something it obviously can't do again.

 

Two years is an awfully long period of time for the PS4 to have a decisive performance and price advantage. Even if Microsoft is able to turn things around - and that remains to be seen - there will still be the perception that it is a less powerful console.


So Microsoft is able to massively improve the performance of the XB1 but Sony can't do the same with the PS4? Okay.

 

This reminds me of all the claims we heard about the XB1 and how its cloud computing would give it a massive advantage over the PS4. So far we have seen no tangible benefits from that. DX12 isn't due out for another 18 months, during which time it seems likely that the PS4 will continue to have the performance advantage. Microsoft has already had to remove the GPU reservation for the Kinect in order to compete with the PS4, something it obviously can't do again.

 

Two years is an awfully long period of time for the PS4 to have a decisive performance and price advantage. Even if Microsoft is able to turn things around - and that remains to be seen - there will still be the perception that it is a less powerful console.

Please quote me where I implied this. Sony do their own changes; they have their ICE team, which works on all the software architecture on the box, and they always post updates about what they've done. I know, because I follow them. The PS4 won't have any DX11-style overheads on the machine because it's a console, like I said before. That's one little piece of the puzzle which DX12 is aiming to provide to the PC, not the console. Of course the PS4 will have updates, but a dramatic change to graphical pipelines and software architecture takes huge amounts of time and effort. You never know what they may have in store, but it's not like they're going to incorporate DX12 anytime soon, is it?

 

Cloud computing: http://channel9.msdn.com/Blogs/AndrewParsons/Cloud-Assist-Demo-from-Build-2014

MS even re-demoed that video, and one of the team posted it on the X1 sub-reddit with more details to stop people questioning whether the benefits would be tangible on the X1. I don't need to expand on that; these things take time. Forza and Titanfall have Azure deeply implemented (as you know) in their games, which makes them awesome experiences.

 

DX12 will be out for computers in 2015, not the X1. With MS stating that some of DX12 "is already available" on the X1, and with their engineers using it as a system to explore DX12 on (I really could do with finding that article again), it'd highly surprise me if it wasn't on the X1 way before the PC release. The X1 is fixed hardware and a fixed platform; in terms of development it's ten-fold easier to get a stable release out there before the PC. It's a massive change to how GPUs are coded for and how pipelines work within games; we'll see some awesome changes.

 

I'm putting this in because I feel like I have to, given you mention it in every post: yes, the PS4 will get updates. I'm in no way stating the X1 will suddenly become a supercomputer that is locally more powerful by triple-digit percentages. I'm stating how a massive, massive change in software libraries can impose a very beneficial change on the hardware it was designed for. The cloud is not a lie, and I can't wait to see server calculations like the video above incorporated into games.


Cloud computing: http://channel9.msdn.com/Blogs/AndrewParsons/Cloud-Assist-Demo-from-Build-2014

MS even re-demoed that video, and one of the team posted it on the X1 sub-reddit with more details to stop people questioning whether the benefits would be tangible on the X1. I don't need to expand on that; these things take time. Forza and Titanfall have Azure deeply implemented (as you know) in their games, which makes them awesome experiences.

Titanfall offloads a bit of AI processing onto the cloud, but no more than dedicated servers have long done. It's not exclusive to the XB1 either, as the PC and X360 releases have the same features. As for Forza 5, all it's doing is taking driving styles and sharing them in the cloud - any developer could implement the same thing without Azure. Microsoft has talked up the amount of processing power available, but that's irrelevant if it's not improving the gaming experience.

 

DX12 will be out for computers in 2015, not the X1. With MS stating that some of DX12 "is already available" on the X1, and with their engineers using it as a system to explore DX12 on (I really could do with finding that article again), it'd highly surprise me if it wasn't on the X1 way before the PC release. The X1 is fixed hardware and a fixed platform; in terms of development it's ten-fold easier to get a stable release out there before the PC. It's a massive change to how GPUs are coded for and how pipelines work within games; we'll see some awesome changes.

Microsoft has said that the first games based on DX12 will be released towards the end of 2015 and that it expects only half of games to support it. That means even if the performance improvements are as good as claimed, half of games won't benefit from them. One of the biggest strengths of DX12 is the ease with which game development can be shared across the Xbox and PC, but that isn't the case if it's only available on Xbox, plus it obviously doesn't help with the PS4 - you're basically asking developers to develop three separate games.

 

Obviously performance on the XB1 is going to improve, as it does on other platforms. However, it seems there are a lot more hurdles to getting a game properly optimised on the XB1 than on the PS4. Personally, I hope the DX12 improvements really are as worthwhile as has been suggested, as that will benefit PC gaming.

