
If it comes down to a few games here and there not hitting 1080p/60fps vs a laundry list of games not hitting it, I think the choice is incredibly clear as to which is the more powerful system. I am truly dumbfounded that this is even a discussion at this point in time. I really am.

 

Interesting snippet there:

So did they face any challenges in getting the PS4 version up and running at 1080p? "Mainly CPU issues, as our engine didn't scale for more than 2-4 cores. This required a change of threading architecture and many parts of the engine needed a rewrite," Krzysztof added.

That's kind of what DX12 will make easier/improve once engines support it in the future. I have to say I find more and more games CPU-limited rather than GPU-limited (on PC at least) because of poor multithreading.
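To put some rough numbers on why "doesn't scale for more than 2-4 cores" hurts so much, here's a toy Amdahl's-law style estimate. The 10ms of per-frame CPU work and the 40/60 serial/parallel split are made up purely for illustration, not figures from any real engine:

```python
# Toy Amdahl's-law estimate of per-frame CPU time as an engine scales
# across cores. The 10 ms of work and the 40%/60% serial/parallel split
# are invented for illustration; only the shape of the curve matters.

FRAME_CPU_MS = 10.0     # total single-threaded CPU work per frame (assumed)
SERIAL_FRACTION = 0.4   # portion that can't be spread across threads (assumed)

def frame_time_ms(cores):
    serial = FRAME_CPU_MS * SERIAL_FRACTION
    parallel = FRAME_CPU_MS * (1 - SERIAL_FRACTION) / cores
    return serial + parallel

for cores in (1, 2, 4, 6, 8):
    print(f"{cores} core(s): ~{frame_time_ms(cores):.2f} ms of CPU work per frame")

# 1 core: 10.00 ms, 2 cores: 7.00 ms, 4 cores: 5.50 ms, 8 cores: 4.75 ms.
# Past 4 cores the gains flatten out unless the serial fraction shrinks,
# which is exactly the kind of threading-architecture rewrite the quote describes.
```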

So it looks like The Order won't hit 1080p and won't even manage 60fps. Their excuse is hilarious! :laugh:

source: http://kotaku.com/a-developers-defense-of-30-frames-per-second-1580194683

Wow that is simply the most ridiculous excuse I've ever seen. Intentionally aiming for a low framerate because most movies are 24fps? You can't seriously argue that a lower framerate can lead to a better game experience. It might make it look more like a movie, all right, but can that really be considered better from a gameplay perspective? 


So it looks like The Order won't hit 1080p and won't even manage 60fps. Their excuse is hilarious! :laugh:

source: http://kotaku.com/a-developers-defense-of-30-frames-per-second-1580194683

In other words, they couldn't get the PS4 to do 1080p/60fps with the IQ they wanted, and this is a PS4 exclusive that has already been delayed (at least) once.

To be honest I have no idea what they're playing at. The only way that such a ridiculous aspect ratio and low framerate can be justified is if the graphics are absolutely stunning, which I'm not convinced about. The main reason for it is they're using 4xAA, which the next-gen consoles just can't handle properly.

 

Interesting snippet there:

That's kind of what DX12 will make easier/improve once engines support it in the future. I have to say I find more and more games CPU-limited rather than GPU-limited (on PC at least) because of poor multithreading.

 

The CPU is always the bottleneck when it comes to games, more than anything else. Today's GPUs, even the ones in the consoles, are more than fast enough to outpace the CPUs we have today, and the consoles don't exactly use the fastest CPUs out there. If a game can't do 1080p but can do 60fps, or can do 1080p but not 60fps (and this applies either way), then anything done to reduce CPU overhead will help with those. It's only in cases where a game falls a lot lower on both resolution and framerate that you probably won't see much of an impact.

 

I expect that as more XB1 games hit 900p@60fps or 1080p@30fps with the current DX11 API, the advantages DX12 brings will be what pushes them over the mark everyone is looking for. That said, it's been noted for the few games compared so far that the 900p and 1080p versions are pretty close; if you have to take a screen cap, sit and study it or zoom in to find the differences, then it doesn't matter. No one buying these games is going to do that; they're going to play them.
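For what it's worth, the raw fill-rate arithmetic behind those targets looks like this (pixel counts only; it ignores AA, effects and CPU cost, so it's just a rough sense of scale):

```python
# Raw pixel throughput for the common render targets mentioned above.
# Pure pixel-count arithmetic; AA, effects and CPU cost are ignored.

targets = {
    "900p @ 60fps":  (1600, 900, 60),
    "1080p @ 30fps": (1920, 1080, 30),
    "1080p @ 60fps": (1920, 1080, 60),
}

for name, (w, h, fps) in targets.items():
    print(f"{name}: {w * h * fps / 1e6:.1f} million pixels per second")

# 900p @ 60fps:   86.4 Mpx/s
# 1080p @ 30fps:  62.2 Mpx/s
# 1080p @ 60fps: 124.4 Mpx/s (about 44% more than 900p60, double 1080p30)
```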


Wow that is simply the most ridiculous excuse I've ever seen. Intentionally aiming for a low framerate because most movies are 24fps? You can't seriously argue that a lower framerate can lead to a better game experience. It might make it look more like a movie, all right, but can that really be considered better from a gameplay perspective?

Oh God, the comments on the Digital Foundry video are going to be a bombsite. Mario Kart 8 is being torn apart in the comments because the framerate dropped to 59fps for a second or two.

I am applying the same yardstick that "sony fans" use when it comes to X1 games. I am all for devs choosing whatever they think is best for their games, and although I would prefer 1080p/60fps, I am realistic given the console hardware.

It's funny because their reasoning is exactly the same as what Crytek gave for Ryse, but now that the game is on Sony's side, "sony fans" are suddenly changing their tune. My post in no way relates to PS4 vs XB1 or their relative power; I just wanted to point out the hypocrisy around this topic. I got the expected response.

 

A simple forum search and this popped up in the first few results (red emphasis mine):

The difference is that Crytek did not convince enough people that Ryse looks good enough to get away with only being 900p/30.

For The Order, more people are convinced that the game looks great enough to warrant 'letting it slide'. Also, the developers have laid out a pretty lengthy explanation that seems to resonate more with people online.

Personally, I think the whole argument is silly. Ryse is a good-looking game, end of story. What I have seen of The Order also makes me think it's a good-looking game. Always getting stuck in the mud of these details is just BS fodder. Sure, if there are issues such as a framerate that drops too much or other visual problems, that can affect the end result, but I have seen enough good-looking games that only have a solid 30fps framerate to be fine with a developer choosing that. Hello Infamous and Ryse: two games that did not hit 60fps and yet turned out to be good-looking games.

The difference is that Crytek did not convince enough people that Ryse looks good enough to get away with only being 900p/30.

For The Order, more people are convinced that the game looks great enough to warrant 'letting it slide'. Also, the developers have laid out a pretty lengthy explanation that seems to resonate more with people online.

Personally, I think the whole argument is silly. Ryse is a good-looking game, end of story. What I have seen of The Order also makes me think it's a good-looking game. Always getting stuck in the mud of these details is just BS fodder. Sure, if there are issues such as a framerate that drops too much or other visual problems, that can affect the end result, but I have seen enough good-looking games that only have a solid 30fps framerate to be fine with a developer choosing that. Hello Infamous and Ryse: two games that did not hit 60fps and yet turned out to be good-looking games.

Actually their reasoning is exactly the same, but it just happened that Ryse landed in the 1080p/60fps launch ###### storm from "sony fans", whereas The Order won't be out for another year. You can't convince someone who is not listening/reading.

Crytek called it "image quality" while RAD is calling it "aesthetics". If I remember right, Crytek tried to explain, but it was buried in the "LOL NOT 1080p" noise. I guess they should have put in black bars at 24fps and called it "Filmic", which seems to resonate with people.

If it comes down to a few games here and there not hitting 1080p/60fps vs a laundry list of games not hitting it, I think the choice is incredibly clear as to which is the more powerful system. I am truly dumbfounded that this is even a discussion at this point in time. I really am.

 

Same here, but more because the real question is "so what?" Even ignoring the fact that there are a handful of multi-plat games out now where the actual difference is negligible despite the on-paper difference between platforms: so what? People on both sides are having fun with the games anyway and not giving a ###### about res or framerate. Go figure.

Crytek called it "image quality" while RAD is calling it "aesthetics". If I remember right, Crytek tried to explain, but it was buried in the "LOL NOT 1080p" noise. I guess they should have put in black bars at 24fps and called it "Filmic", which seems to resonate with people.

Words matter. Crytek's mistake might have been using the words "image quality", which is more likely to feed the idea that they did it just because that was the best they could pull off. RAD gave an explanation that clearly does not say it was simply the best they could do, but rather that it was a choice made to pull off the look they wanted.

The nice thing is though that no matter what people say about Ryse or The Order regarding these points, other people will just enjoy the games. Many people have enjoyed Ryse and I have a feeling many people will enjoy The Order (based on what I have seen of it so far).

The nice thing is though that no matter what people say about Ryse or The Order regarding these points, other people will just enjoy the games. Many people have enjoyed Ryse and I have a feeling many people will enjoy The Order (based on what I have seen of it so far).

 

I personally hate movie black bars, which is why I have a 21:9 monitor (roll on TVs and projectors with that aspect ratio). I know others may like movies with black bars, but I would be surprised if that was many. In fact, when I asked around after getting my monitor, many people didn't understand why TV shows don't normally have black bars while films do; they just found them annoying throughout the movie. Although you never know, it may be seen as a positive by some people, just not many I suspect.

LOL at the comments

[image: R2I6BdD.png]

 

 

Those comments are a joke. Some games work best at 60fps and really need it; others are fine at 30fps. It depends on the game. The new Wolfenstein is an FPS, and at times it moves fast, especially if you want to play it guns blazing. It has to be 60fps, or at least higher than 30fps, because of the type of game it is. A third-person game like AC or WD, with an overall slower pace, doesn't have to be 60fps to give you a smooth playing experience. Those types of games are fine with a locked 30 or a 30-45fps range.

You do know that CryEngine has an internal target of rendering at 30fps, don't you? Sean Tracy said that's their target because the CryEngine code needs 16ms to run, with whatever else is going on added on top: at 30fps they have a 33ms window to fit it all in, while at 60fps they only have a 16ms window, which isn't enough for them. That's why you need some beefy hardware to get it up to 60fps. With games on PS4 and X1, if they shoot for 60fps like they did with Forza 5, then graphics have to be sacrificed somewhere. If they had done it at 30fps like Driveclub (which would be bad for Forza because it's a simulator) and had another year (Driveclub was a launch title and has been delayed twice), it probably would have looked like what they originally wanted.
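To make those budgets concrete, here's the frame-time arithmetic being described; the ~16ms engine cost is the figure quoted in the post, not an official CryEngine number:

```python
# Frame-budget arithmetic for the 30fps vs 60fps comparison above.
# The 16 ms per-frame engine cost is the figure quoted in the post.

ENGINE_COST_MS = 16.0  # fixed per-frame engine work (as quoted)

for fps in (30, 60):
    frame_budget_ms = 1000.0 / fps                    # total time per frame
    headroom_ms = frame_budget_ms - ENGINE_COST_MS    # time left for everything else
    print(f"{fps}fps: {frame_budget_ms:.1f} ms budget, "
          f"{headroom_ms:.1f} ms left after the engine cost")

# 30fps: 33.3 ms budget, 17.3 ms left
# 60fps: 16.7 ms budget,  0.7 ms left
```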

 

The Order is only capable of 4x MSAA because they're rendering less of the screen (1920×800 rather than a full 1920×1080, so roughly a quarter fewer vertical pixels), which is why I highly doubt you'll see any proper 1080p game with MSAA in it. Although the Uncharted devs are teasing that The Order will look crap compared to Uncharted 4, and the tech is apparently amazing...
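The sample-count maths behind that works out roughly like this (assuming the widely reported 1920×800 letterboxed framebuffer and counting one colour/depth sample per MSAA sample):

```python
# Rough MSAA sample counts for a 1920x800 letterboxed frame versus a
# full 1920x1080 frame, assuming one sample per pixel per MSAA sample.

def msaa_samples(width, height, samples=4):
    return width * height * samples

letterboxed = msaa_samples(1920, 800)    # The Order's 2.40:1 frame
full_1080p = msaa_samples(1920, 1080)    # a full 16:9 1080p frame

print(f"1920x800  @ 4x MSAA: {letterboxed:,} samples")  # 6,144,000
print(f"1920x1080 @ 4x MSAA: {full_1080p:,} samples")   # 8,294,400
print(f"saving: {1 - letterboxed / full_1080p:.0%}")    # ~26%
```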

 

For anyone having stuttering issues in Watch Dogs: if everything is on Ultra, drop textures to High; that should solve it unless you run out of VRAM. I've got 1080p, 1 GPU buffer, High textures, temporal SMAA and the rest maxed including HBAO+ High, which uses about 2.5GB of VRAM, and I get at the very lowest 25+fps even speeding through the city at night in the rain, with no slowdown, on an i7 950 (it's completely fine, it doesn't need that stupid 9000 PassMark requirement; this one is rated at 5886, and the game is heavily single-threaded, using one core for most things), 12GB of RAM (the game only uses 2.5GB at most) and a 3GB 280X (core 1020, memory 6200). I don't know what Ubisoft were smoking with the Watch Dogs specs, but it doesn't need them. If you're running Ultra textures and high MSAA you'll probably need a 4GB graphics card. I've also got an SSD, not sure if it helps. Game at 1080p, nearly completely maxed, zero stuttering or problems, 14.6 AMD drivers, done :d

Oh God, the comments on the Digital Foundry video are going to be a bombsite. Mario Kart 8 is being torn apart in the comments because the framerate dropped to 59fps for a second or two.

 

Is WASH_DOGS 1080p/60fps on the Wii U?

I still think 900p vs 1080p is just a ludicrous argument. Seriously, the image below is the difference; hardly even an issue. Let alone when you're sitting the proper distance from a screen... or when things are moving on that screen. Not to mention the fact that all these comparison shots are being critiqued on a computer, where lower resolutions are easier to spot due to the proximity of your eye to the monitor.

 

[image: resolution-comparison-two.png]


It's anything but a ludicrous argument. It wasn't ludicrous last gen for many when a higher resolution was in play, and it's no more ludicrous this gen... In fact the differences this gen are far bigger and far more noticeable.

Those who claim they can't see it... Well, mmm, whatever.

It's anything but a ludicrous argument. It wasn't ludicrous last gen for many when a higher resolution was in play, and it's no more ludicrous this gen... In fact the differences this gen are far bigger and far more noticeable.

Those who claim they can't see it... Well, mmm, whatever.

 

If you can tell the difference, you're sitting too close to your television.

 

[image: optimal-viewing-distance-television-grap]

 

[image: screen-table.gif]

 

Based on the charts above, to really tell the difference on, for example, a 46" television, you'd have to be sitting at the minimum viewing distance. Any farther away and you wouldn't really be able to tell.
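For anyone curious, the usual back-of-the-envelope acuity calculation behind charts like those goes as follows; it assumes 20/20 vision resolves roughly one arcminute per pixel, and the 46" size is just the example from the post:

```python
import math

# Distance beyond which individual pixels can no longer be resolved,
# assuming 20/20 vision (~1 arcminute per pixel). The 46" screen is the
# example from the post; the formula is the standard one behind
# viewing-distance charts.

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def max_resolving_distance_in(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Distance (inches) at which one pixel subtends one arcminute."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_pitch_in = width_in / horizontal_pixels            # one pixel's width
    return pixel_pitch_in / math.tan(ARCMIN)

for label, horizontal in (("1080p", 1920), ("900p", 1600)):
    feet = max_resolving_distance_in(46, horizontal) / 12
    print(f'{label}: individual pixels blend together beyond ~{feet:.1f} ft on a 46" set')

# 1080p: ~6.0 ft, 900p: ~7.2 ft. Between those distances the extra 1080p
# detail is resolvable; beyond ~7.2 ft neither image shows visible pixel structure.
```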

If you can tell the difference, you're sitting too close to your television.

 

[image: optimal-viewing-distance-television-grap]

 

[image: 284992.gif]

 

What an arrogant response... If someone can tell the difference, they can tell the difference. They don't need to be told they're watching their television wrongly.

What an arrogant response... If someone can tell the difference, they can tell the difference. They don't need to be told they're watching their television wrongly.

 

Arrogant? Um, no, this is science. And if you want to talk about someone insulting others for their opinions, read his post, considering he's the one dismissing the idea of the difference being marginal. But I don't see you pointing "arrogant" fingers at him, Audioboxer ;).

Seriously, ask anyone who knows anything about televisions. There's a reason these charts exist, and it's all about which TV to get and what's really worth it in a particular living space. The difference between 900p and 1080p is marginal at best, and seeing it often means you're sitting close enough to pick pixels apart. If you're that close, you're too close. What "too close" means may differ from person to person, but the reality is you shouldn't be able to pick out the pixels on a screen at all.

 

The closer you sit to any screen, the more pixels you need to pack in to keep artifacts from being visible. So if you buy a TV, be aware that there are optimal distances to view it from.

 

PS: viewing these comparisons on your 1080p computer monitor is not the same as comparing them on the television itself.


I wasn't being arrogant at all. It's also science. The difference in resolution this gen is much bigger than last gen, yet people want to dismiss it as 'nothing in it'.

It doesn't matter what distance you sit at, it's noticeable.

Assassin's Creed on PS4 looks better than it does on Xbox One on my TV, sitting at the same distance. I notice it. Many do. Those that don't seem to be the ones trying to justify it.

Arrogant? Um, no, this is science. And if you want to talk about someone insulting others for their opinions, read his post, considering he's the one dismissing the idea of the difference being marginal. But I don't see you pointing "arrogant" fingers at him, Audioboxer ;).

Seriously, ask anyone who knows anything about televisions. There's a reason these charts exist, and it's all about which TV to get and what's really worth it in a particular living space. The difference between 900p and 1080p is marginal at best, and seeing it often means you're sitting close enough to pick pixels apart. If you're that close, you're too close. What "too close" means may differ from person to person, but the reality is you shouldn't be able to pick out the pixels on a screen at all.

 

The closer you sit to any screen, the more pixels you need to pack in to keep artifacts from being visible. So if you buy a TV, be aware that there are optimal distances to view it from.

 

PS: viewing these comparisons on your 1080p computer monitor is not the same as comparing them on the television itself.

 

Those comparisons might be better suited to films/video, but games do have very noticeable differences when running at lower resolutions versus higher ones. People happily reported GTA4 looking sharper on their Xbox 360s, and the difference there is 1120×630 compared to 1280×720. Why, all of a sudden with the bigger gaps this generation, are people like yourself telling us all that if we can notice a difference we're somehow rubbing our eyeballs on our TV?

 

Even I can notice the difference between Netflix 1080p and a Blu-ray disc. If you know what compression artefacts, aliasing and "jaggies" look like, you're going to notice them. Argue about how much that affects your enjoyment (it probably doesn't all that much), but arguing that we shouldn't be able to see them... that is arrogant.
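The pixel counts back that up; using the GTA4 figures quoted above and the common 900p/1080p split this gen:

```python
# Pixel-count gaps: the GTA4 resolutions quoted in the post versus the
# 900p/1080p split that keeps coming up this generation.

def pixels(w, h):
    return w * h

gaps = {
    "last gen (GTA4)": (pixels(1120, 630), pixels(1280, 720)),
    "this gen (900p vs 1080p)": (pixels(1600, 900), pixels(1920, 1080)),
}

for label, (low, high) in gaps.items():
    print(f"{label}: {low:,} vs {high:,} pixels "
          f"({1 - low / high:.0%} fewer at the lower resolution)")

# last gen (GTA4):             705,600 vs   921,600 (23% fewer)
# this gen (900p vs 1080p):  1,440,000 vs 2,073,600 (31% fewer)
```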

I wasn't being arrogant at all. It's also science. The difference in resolution this gen is much bigger than last gen, yet people want to dismiss it as 'nothing in it'.

It doesn't matter what distance you sit at, it's noticeable.

Assassin's Creed on PS4 looks better than it does on Xbox One on my TV, sitting at the same distance. I notice it. Many do. Those that don't seem to be the ones trying to justify it.

Familiarise yourself with the differences early on from last gen; they were way bigger than these.
