PS4 and Xbox One resolution / frame rate discussion



The Xbox doesn't use the vanilla DirectX API; it never did.

Nor did I ever claim it did.  In fact I think at least once I quoted it as DX11.x+ (note the plus), but I may not have been very rigorous with that, as I don't feel it's a very important distinction.  The main point is that the DirectX variant the Xbox One does use (whatever you want to call it) is not as low level as Sony's low-level API. This situation will almost certainly be resolved with the DX12 update (which will likely similarly be DX12+ on Xbox One and not exactly identical to the PC version).


From Microsoft, concerning the eSRAM read+write capability:

"Yes you can - there are actually a lot more individual blocks that comprise the whole ESRAM so you can talk to those in parallel. Of course if you're hitting the same area over and over and over again, you don't get to spread out your bandwidth and so that's one of the reasons why in real testing you get 140-150GB/s rather than the peak 204GB/s..."

"If you're only doing a read you're capped at 109GB/s, if you're only doing a write you're capped at 109GB/s,"

You aren't going to hit that theoretical peak bandwidth in real-world situations. Let's pretend, though, that you can. Do you really think having a little bank of 32MB of faster RAM is going to make up ENTIRELY for the fact that the vast majority of your RAM is only 68.3GB/s? Yes, the eSRAM helps, it helps A LOT, and I'm not trying to downplay what it does for the Xbox One compared to not having it; it makes a HUGE difference. That said, that tiny high-speed bank isn't going to single-handedly make the overall memory performance better than a full 8GB bank of 176GB/s memory.

The bandwidth the CPU can use is only around 20GB/s, so 68GB/s compared to 172GB/s is not an issue for the CPU, except of course DDR3 has less latency, but then again the CPU is out-of-order anyway.

Now on to the GPU. You stick the g-buffer, shadow maps, and depth buffers in eSRAM; these are the bandwidth-heavy tasks, and this is where you get dinged on performance. You don't even need to stick the whole thing in there anyway. You could use small portions and just swap asynchronously with the move engines, like it says in the slide above.

Think of this like streaming video from the internet where you only have a 32MB chunk of RAM and you want to stream a 5GB movie. You know how it works: you make two buffers, one that it plays from and one that you buffer data into. It starts playing from the first buffer while, in the background, you're downloading and filling the second buffer. When it's done playing from the first buffer, you swap your pointers, and now it continues playing from the second buffer while you download and fill the first buffer again, over and over.
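A minimal sketch of that double-buffering pattern (the source object and play callback are hypothetical stand-ins for the download and playback steps described above):

    CHUNK = 16 * 1024 * 1024  # two 16 MB halves of a small 32 MB working set

    def stream(source, play):
        """source.read(n) returns the next chunk of the movie; play(chunk) consumes it."""
        front = source.read(CHUNK)        # buffer currently being played
        while front:
            back = source.read(CHUNK)     # buffer being filled "in the background";
            play(front)                   # in a real player these two steps overlap
            front = back                  # swap pointers and continue with the other buffer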

 

As for simultaneous access, well, the PS4 has a second ARM-based processor with its own RAM as well. It's only 256MB, but that's more than the eSRAM in the Xbox One. It's not super fast and it's not accessible by game developers, but what it does is run the OS and background tasks while the game is running (or while the system is in standby and the primary systems are shut down). So what that means is the PS4 OS doesn't have to make calls and use the bandwidth the games are using, while the Xbox One has to share its memory bandwidth between its much larger OS and the games. I really don't think any of this makes a HUGE difference, but if you're seriously going to argue that being able to access that whopping 32MB of fast eSRAM at the same time as the system memory is going to somehow enable the Xbox One to catch and surpass the PS4 in memory performance, I guess we're bringing out and over-hyping all the minute details.

The ARM chip is irrelevant in terms of rendering. Both consoles reserve 2 CPU cores for the OS, so that kind of stuff is done on those. The bandwidth on ARM chips is so pitiful it doesn't even matter.

 

So because MS got more of a boost from an upclock than from 14 CUs ON XBOX ONE HARDWARE, you take that to mean that no developer will be able to utilize more CUs on PlayStation 4 hardware. The upclock provided more of a boost because it boosted the CPU speed, the memory speed, the GPU speed, etc., and taken as a whole that did more for them. This did put the CPU power a bit ahead of the PS4, but the memory is still significantly slower, which makes it difficult to keep 14 CUs fed. The PS4, with its faster memory, is better able to keep more than 12 CUs fed, but you're correct that they won't be able to keep ALL 18 going on graphics alone. I never said they were going to use all 18 for graphics only, but it does allow them to outperform the Xbox One on graphics.

And at no time did I refute that. All I said was that Sony's 18 CUs offer more performance than MS's slightly higher clocked 12 (I didn't even specify graphics, so sure, some may be used for compute). Maybe only using 13 of the 18 would enable them to beat the Xbox One, maybe it would take 15; they don't have to fully utilize all 18 CUs to beat the Xbox One in performance, so proving that they currently aren't doesn't prove anything. You make a big deal about Xbox One devs not fully utilizing the eSRAM right now and how that's going to improve; well, the same thing is true for Sony devs with the CUs. That's part of the point here. People keep saying that the Xbox One is going to get better because of X, Y and Z in the future (which I totally agree with) but then continue by saying that's going to allow them to catch the PS4, as if Sony is just going to sit still and its hardware is all tapped out right now. BOTH will improve with time, but as far as hardware performance goes, the Xbox One is just weaker hardware. That doesn't mean it's a bad console; there are more reasons to buy a console than its hardware specs.

They are both true; I'll give credit where credit is due. As noted before, the upclock affected not only the GPU but the CPU as well, which gave the Xbox One a slight CPU advantage. The PlayStation 4 also has a programmable DSP; it's part of that extra core I mentioned back in the memory discussion, so it's not like the PS4 doesn't even have one, as your statement seems to imply. However, I will grant you that from what we know the Xbox One DSP is superior to the PS4's. If fancy audio is your thing, then Xbox does hold a slight edge there as well. Keep in mind, however, that part of the reason for this is that much of what a DSP does is VERY similar to what unified shaders do on GPUs. As you mentioned before, those 18 compute units on the PS4 GPU aren't intended to be for graphics alone. If you are a game developer and want to do super fancy DSP things that the PS4 DSP chip can't handle, Sony's intent is for you to still do that, but to do so by using some of those CUs from the GPU. So maybe your game uses 15 CUs for graphics, surpassing what the Xbox One is capable of, and then the remaining 3 are used to augment the DSP on audio, enabling the game to match the audio capabilities the Xbox One DSP provides on its own. I don't really think most people care that much about audio though... the PS4's programmable DSP is likely "good enough" for most people, just like very few people bother to buy PC sound cards anymore; the ones integrated with motherboards are typically fine for most people.

Here's the thing: memory is really the biggest difference between the two consoles. The techniques they learn for everything else, like compute, mostly apply to both. When some people talk about developers improving on Xbox One, we're basically talking about the memory, because it isn't straightforward like the PC and PS4. There are tricks and techniques that need to be used to extract all of its potential performance.

Also, at no time did I say the extra CUs in the PS4 have zero benefit. I'm merely arguing that simply selling the Xbox One architecture short because of the number of elements is shortsighted, because it's more complicated than that. Like I said earlier, 3 times the coherent bandwidth means shorter compute jobs could finish quicker on Xbox One even with fewer CUs. Bigger jobs would win on the PS4.

In the end, all I'm saying is watch out, because resolutiongate will turn into something like AAgate or motionblurgate.


###

Microsoft bans Xbox game developers from running their own servers on their own networks for Xbox games. Microsoft bans Xbox game developers from using publicly available server/cloud providers to host their Xbox One games. 

 

###

I'm not sure what you mean by "cloud tech" to answer your question. I know for a fact that if EA or Ubisoft for example were to build their own public cloud data centers that MS would not allow Xbox One clients to connect to them because they are not hosted on MS's Xbox Live network, 

 

 

Got a source for this? I distinctly remember MS saying publishers were welcome to use their own infrastructure if they wanted. I can't find a source so don't hold me to that, but I'd like to see yours.


There are A LOT of people who don't seem to understand that. Making that "well known" point was the only purpose of my initial post in this thread (related to cloud computing; I did cover other topics as well), and yet people seem to have taken issue with it. You've apparently turned that seemingly simple statement of fact into some perceived attack on cloud computing in general and/or MS Xbox Live Compute in particular. That was NEVER my intent. My intent (with respect to cloud computing specifically) was just to say that PS4 developers can and almost certainly will use cloud computing this generation as well, so it's not a differentiator that will allow the Xbox One to exceed the PS4 in performance. That's it. Not that cloud computing is bad, not that Xbox Live Compute is bad, not even that the Xbox One as a console is bad.

That's fine, and I understand that you're trying to counter false information, but sometimes it comes off as simply brushing aside what MS is offering.

It's like there is no middle ground. It either means everything or nothing.

The reality is that if people would just stick to what makes it a good deal, then you would silence those that wish to make it more than it is over time. It is a differentiator now, but not because it would allow the X1 to outperform the PS4 or some rubbish like that.

Microsoft bans Xbox game developers from running their own servers on their own networks for Xbox games. Microsoft bans Xbox game developers from using publicly available server/cloud providers to host their Xbox One games. Sony does not. Developers for PS3/4 games can host their own servers or have them hosted by public providers. That is the distinction I was trying to draw, I apologize if that was somehow not clear.

So this changed from the 360 then? I'm pretty sure developers were able to use their own servers with the 360. Also, if you don't mind me asking, did MS post this change somewhere? I just haven't seen it explicitly stated that a company like EA could not do what it did on the 360 if it really wanted.

I'm not trying to get you to agree. Everyone is free to make up their own mind but contrary to what you might believe there are A LOT of people out there who seem to think that "The Cloud" is some huge Xbox only advantage that MS has. You don't seem to be one of them and that's fine because it wasn't your post I replied to. I'm just trying to set the record straight and people can make their own decisions. If you prefer the "walled garden" approach, if you love Kinect, if you think MS has the best exclusive game franchises, that's all fine I'm not trying to change your mind. If you want to claim that Xbox One has a faster overall memory or graphics subsystem or that the cloud is something Xbox One has over PS4 or that DX12 is going to boost Xbox One performance beyond what the PS4 hardware is capable of or that Sony isn't going to enhance performance on the PS4 as well then we've got a debate.

The whole performance argument is over. The few around here that still insist on bringing it up seem quite stuck in their ways. Most people seem to have moved on.

The whole point of this thread was to highlight how the X1 is improving. It was not meant to say it would beat the ps4 now or do anything to insult the ps4 platform. It's frustrating when people come in and blow it up in either direction.

As far as the cloud stuff, it's not some huge advantage, but it is a small one at the moment. MS is clearly the leader among the console makers at pushing the use of the tech and trying to promote it to developers, even using it with their internal teams. Again, this is not an insult to any other platform; it's just a different set of priorities. MS built a robust platform and is offering it without cost to developers. If anything, I would think more people would be happy to hear this, since it could mean that more developers get a taste for what is possible and bring the same ideas to all platforms. As you said, the tech is not unique to one platform.

You may already be aware of this, but many of the public cloud providers have free tiers of service that are likely sufficient for even the smallest developer to develop their titles on at no charge. It's only when resource use goes up that they have to pay, and it's comparatively cheap compared to running your own servers. Heck, I have friends who host PC Minecraft servers on the Amazon cloud. This is very approachable. Again, I'm not trying to put down Xbox Live Compute. Maybe I'm mistaken, but I get the impression that people here think there is some expensive barrier to entry for cloud computing, and there isn't. The price will go up when your servers get hammered, because if your servers are getting hammered you're selling lots of games and can probably more than afford the increased cloud costs and still have a very healthy profit... even if you are a small indie. When your game's popularity wanes your costs automatically go back down, and you didn't have to invest in a ton of expensive hardware. You pay for only what you actually use; that's one of the big benefits of the cloud, it's CHEAP. MS giving it away for free is GREAT, but it's not like there's a HUGE pay barrier for others that MS is bypassing.

I am aware of the various pricing options. However, neither of us really knows all of the access that developers get on the X1. I mean, we know the top-level details, but I don't think the exact details are out there. It's hard to say what the value is in comparison to other services. Regardless, there is some sort of value there.

At this point the argument is not whether there is value, it's about how much. You agree there is value, so there is really nothing else to debate.

I'm not sure what you mean by "cloud tech" to answer your question. I know for a fact that if EA or Ubisoft for example were to build their own public cloud data centers that MS would not allow Xbox One clients to connect to them because they are not hosted on MS's Xbox Live network, Sony will allow publishers/developers to do that. I know for a fact that if developer X builds their game servers on Amazon's public cloud MS will not allow Xbox One games to connect to it, Sony will allow this for PS3/4. Did that answer your question?

Not really. Refer to my reply at the start of this post. It sounds like the rules changed for the X1 since you certainly had developers like EA using their own servers on the 360.

There is a HUGE difference between XBL and PSN.

Thanks for clearing that up; that does help me understand how it all works. I know Sony partnered with Rackspace for the servers they are using for services like PS Now, but are you saying that Sony does not run its own network of servers for multiplayer purposes? I guess it could just be limited to its own first-party titles.

There has been a long term debate over MS' 'walled garden' vs something more open like Nintendo or Sony. I'm not really sure one is better or worse, just different. I can see positives and negatives on both sides and after all of these years, after many generations, I still can't pick a favorite. MS has provided me a great service over the years, so I can't really knock their methods. On the flip side, so has Sony (although admittedly it wasn't until later in the ps3's life cycle and now the ps4 that I really got a consistently good experience from Sony). Digging up all of the old debate points seems less than useful today.


There are quite a number of "fairly reputable blogs" that talk about how the PS4 has a faster memory architecture. A few minutes of googling should find you a number of examples.

 

Well, from a standard search of "eSRAM vs GDDR" this is what I found to be the most unbiased source. But perhaps you should retort against the article's claim rather than avoiding it and dismissing it without a real counterargument?

 

 

What are you talking about? I didn't say the hardware changed; I said the performance changed from that article because of the upclocking they did. Heck, that's to the Xbox's advantage, so I don't see why you'd try to dispute it. You pointed to DirectX 12 and cloud computing in your post; they don't change the hardware either, but they certainly do affect performance. Removing the Kinect requirement doesn't change the hardware in the box, but it improves performance as well (if the developers opt to take advantage of it).

 

Perhaps we shouldn't be dismissing facts just because of the date it was posted then? Upclocking isn't going to change the fundamental functionality of ESRAM vs GDDR5 and how the architecture is designed to work. It's certainly not grounds to ignore the article.

 

Because MS doesn't currently have a low-level API for the Xbox (it uses DirectX 11.x+), they are trailing Sony in that the PlayStation 4 launched with a low-level API. I don't really see how that's debatable. If I have a car and you don't, then clearly you're trailing me in having a car. Maybe you'll eventually get a car that's better than mine, but right now the fact that I have one at all and you don't means I'm winning at having a car. How can you not understand that?

 

A low-level API. Do you even know what an API is? GNM is a low-level API; it's very close to the PS4 hardware because it's made specifically for the PS4 and doesn't have to be compatible with other hardware. DirectX 11.x and earlier, as well as OpenGL, are high-level APIs. They have high levels of hardware abstraction so that they may be used across a variety of different hardware and so that they automate tasks for programmers, making development quicker and easier. This abstraction and automation comes with a performance hit, though. What specifically does GNM offer? It offers developers the ability to manage their own memory and to split tasks across multiple processors more effectively than DX11, for one. Those are two advances that are rumored to be in DX12, but Sony has them today. DX12 will no doubt be great. I'm a PC gamer first and I can't wait for it to come out, because I expect PC games to benefit from it greatly. The Xbox One will benefit from it as well; I'm not disputing that one bit. It looks like it's going to be much better than DX11. However, the point I dispute is that it's going to somehow push the Xbox One past the PS4 in performance. A lot of what DX12 does will likely be what GNM already does. I'm sure there will be new things as well; I'm not saying it's just catch-up, and again I'm a fan of DX12, but whatever gaps there are that the PS4/Xbox One hardware can utilize, I'm sure Sony will add to GNM. They don't have to wait for some committee to approve it. Both consoles will have low-level APIs; that's not an advantage to MS. In fact, DX12 still has to be made abstract enough to work on different types of hardware, while GNM ONLY has to work on the PS4, so they can theoretically go lower than even DX12 can. Plus, DX12 is also rumored to have a bunch of mobile stuff that probably isn't relevant to the Xbox One and PS4 hardware, so Sony isn't going to add that kind of stuff to GNM.
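To make the abstraction point concrete, here is a purely illustrative sketch of the difference; every name below is hypothetical and does not correspond to any real GNM, OpenGL, or DirectX call:

    # High-level style: the API decides where the texture lives and when it is uploaded.
    def draw_highlevel(api, mesh, image):
        texture = api.create_texture(image)   # driver picks the memory and schedules the copy
        api.draw(mesh, texture)               # driver validates state and builds the commands

    # Low-level style: the developer makes those decisions explicitly.
    def draw_lowlevel(api, mesh, image):
        memory = api.allocate(size=image.nbytes, pool="fast")  # choose the memory pool yourself
        texture = api.bind_texture(memory, image)              # place the data yourself
        commands = api.begin_commands()
        commands.draw(mesh, texture)          # minimal validation, close to the hardware
        api.submit(commands, queue=0)         # choose which queue/processor runs it

The flexibility is the point: nothing here is automatically faster, but the second style lets a developer who knows the hardware skip work the first style would otherwise do on their behalf.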

 

Do I know what an API is? Hm... working in the web industry, I certainly hope I do, since practically everything is an API of some sort. APIs are merely interfaces; they allow developers to develop more rapidly by taking concerns away from them. If you think having a "low level" API makes anything work better, it doesn't. All it does is give the developer more flexibility in what they can tell the API to do. And an API's performance is dependent on the developer of that API.

 

The only reason GNM is any better than previous DX APIs is that it doesn't need to worry about multiple hardware configurations. Microsoft does this as well with their console, creating APIs specific to the hardware. This is standard practice for console companies, and to think that Sony is any further ahead than other companies is ignorant.

 

PS: Winning at having a car? I didn't know this was a discussion about win/lose. But that statement makes your goals very clear. Also, cars are an improper analogy. You can't patch a car to suddenly make it more fuel efficient or get more horsepower.

 

 

My initial post in this thread was in response to other people claiming that the eSRAM, DX12, cloud computing, etc. were going to allow MS to catch up to or even surpass the PS4 in performance. I didn't set any "arbitrary standard". Just like you, apparently, having people do that annoys me, and that's what prompted me to post. If no one in this thread had made that absurd claim I would never have commented. My initial comment was just to RESPOND (not initiate) to those people with why that was not the case. You then responded to my post; I didn't start this by responding to yours. You attacked the validity of my post and I have demonstrated that what I said was in fact correct. Hopefully you've learned something; at the very least you now know that the PS4 doesn't use OpenGL.

 

So, you think it's absurd that these tools will allow it to catch up? Sounds pretty close minded to me. Also you've demonstrated nothing. You've demonstrated ignorance in the subject matter and have completely avoided facing the points I've brought.


The bandwidth the CPU can use is only around 20GB/s, so 68GB/s compared to 172GB/s is not an issue for the CPU, except of course DDR3 has less latency, but then again the CPU is out-of-order anyway.

I'm not sure why you create these arbitrary constraints on a general performance discussion. When discussing the GPUs I at no time restricted the performance to graphics, yet you tried saying even Sony admits they don't need them all for graphics. So what? Unless you honestly believe the extra CUs won't be used for ANYTHING, they DO provide a hardware performance advantage. Maybe they're used for physics or audio or something else; it doesn't matter, it's a hardware advantage. Now you confine memory to the effect on the CPU. So let's say both the Xbox One and the PS4 are running something that uses 20GB/s of data for the CPU; that means the Xbox One now has only 48GB/s (plus the eSRAM boost) for graphics and whatever else might need memory access, while the PS4 has 152GB/s. That's still a pretty big difference. Now, I agree the eSRAM puts a SIGNIFICANT dent in that gap and that developers will continue to improve as they get used to the system, but I still contend that they're never going to be able to tweak that eSRAM to such a degree that it can outperform the PS4's faster memory overall. MAYBE you can come up with some contrived isolated cases where that could happen, maybe you can't; I don't really care. The fact remains that generally speaking the PS4 has a faster memory subsystem. A better argument is that it doesn't matter, though: maybe developers will never bother to use that extra performance so their titles can be the same cross-platform. It may very well be that the performance difference, while there, is imperceptibly small... another solid argument. Another strong argument is that hardware specs aren't everything when choosing a console. It's just not the case, however, that the Xbox One has superior overall memory performance to the PS4. The eSRAM and such make a HUGE difference, but not THAT much.

The ARM chip is irrelevant in terms of rendering. Both consoles reserve 2 CPU cores for the OS, so that kind of stuff is done on those. The bandwidth on ARM chips is so pitiful it doesn't even matter.

At no time did I say the ARM chip affected rendering. The point, again, was that at least part of the PS4 OS runs on that chip WITH ITS OWN MEMORY, so those internal system calls don't have to touch the system memory. The Xbox One has to use its system memory for such calls, and it's widely accepted that the Xbox One's OS is much larger than the PS4's, so whatever bandwidth those system calls are taking is not available to the game on the Xbox One but IS on the PS4, since the PS4 has that separate 256MB system memory bank. Now, how pitiful of an impact that has on said bandwidth I totally admitted when I mentioned it. The point was specifically to use an absurd special-case situation to underscore the absurdity of your special-case examples. You know what... if a developer writes a game that fits entirely in the eSRAM cache then it will run a lot faster on the Xbox One than on the PS4... clearly this makes the Xbox One significantly faster than the PS4.

 

Here's the thing: memory is really the biggest difference between the two consoles. The techniques they learn for everything else, like compute, mostly apply to both. When some people talk about developers improving on Xbox One, we're basically talking about the memory, because it isn't straightforward like the PC and PS4. There are tricks and techniques that need to be used to extract all of its potential performance.

At no time have I stated that Xbox One performance will not get better over time as developers improve their skills at using the eSRAM cache. I TOTALLY agree that is going to happen. That improvement however will not be so vast that it will allow the Xbox One to generally surpass the PS4 in memory performance.

If you think having a "low level" API makes anything work better, it doesn't. All it does is give the developer more flexibility in what they can tell the API to do. And an API's performance is dependent on the developer of that API.

 

Interesting back-and-forth you guys have going here, but you shot yourself in the foot with this line.

 

Low-level APIs absolutely do affect performance, in the same way using a low-level language results in better performance than using a high-level language. Less abstraction.


So this changed from the 360 then? I'm pretty sure developers were able to use their own servers with the 360. Also, if you don't mind me asking, did MS post this change somewhere? I just haven't seen it explicitly stated that a company like EA could not do what it did on the 360 if it really wanted.

No, the specific scenarios I stated did not change from the 360. What you may be thinking of, and I excluded it because it's more dedicated server and less cloud, was that MS would allow a developer/publisher to pay to host their own dedicated servers on the Xbox Live network. This means the physical box, I believe, is at the MS data center, NOT the developer/publisher's (not 100% sure on that, but I believe that's the case). I do KNOW that said server is not accessible by non-Xbox clients (except Games for Windows Live PC games for a time). So again, EA can't make their own data center and have Xbox clients connect to it, but they CAN pay MS to host servers on the Xbox Live network. I don't believe this was done very often, though, as most chose the free peer-hosting option over paying MS to host their servers (which is one of the reasons cloud-based servers are such a big deal now). I also don't know if this option still exists, as I would imagine it has been entirely replaced by the free Xbox Live Compute option. If it DOES still exist, technically you might be able to pay MS a ton of money to host a bunch of servers for you and build your own cloud on their network, but that would likely be insanely expensive, again Xbox-only, and is just not going to happen.

The whole performance argument is over. The few around here that still insist on bringing it up seem quite stuck in their ways. Most people seem to have moved on.

That was ALL my initial post in this thread was in response to. Everything I have posted since has just been me defending my points from those attempting to refute them.

Thanks for clearing that up; that does help me understand how it all works. I know Sony partnered with Rackspace for the servers they are using for services like PS Now, but are you saying that Sony does not run its own network of servers for multiplayer purposes? I guess it could just be limited to its own first-party titles.

Sony does not host multiplayer servers for non-Sony games. They DO host multiplayer servers for their own games, of course. They do host infrastructure servers, such as the common chat server. But the actual gameplay servers are either peer-to-peer or hosted by the game developer/publisher or someone they pay. Any PS3/4 can connect to any server anywhere on the public internet, so developers/publishers don't have to pay Sony to be accessible by PS3/4 clients. You can, for example, run a game server on your own PC that hosts games for PS3 or PS4 clients as far as Sony is concerned, though I doubt any devs/publishers would make a product that supports this. I'm pretty sure the only one who ever has is Epic, which I believe eventually made an Unreal Tournament III server that you could run on your PC to host games for PS3 players. It came out WAY after launch, though, and so far as I know no one else even bothered with the capability. There isn't a Sony policy that prevents it, though.

There has been a long term debate over MS' 'walled garden' vs something more open like Nintendo or Sony. I'm not really sure one is better or worse, just different. I can see positives and negatives on both sides and after all of these years, after many generations, I still can't pick a favorite. MS has provided me a great service over the years, so I can't really knock their methods. On the flip side, so has Sony (although admittedly it wasn't until later in the ps3's life cycle and now the ps4 that I really got a consistently good experience from Sony). Digging up all of the old debate points seems less than useful today.

I'm firmly against "Walled Gardens" personally. Again though my point here isn't to change anyone's mind, I'm just trying to correct misinformation when I see it and providing information that I have and others may not. If you look at all the information presented and don't care or even prefer the "Walled Garden" approach, GREAT! The point is to do that from a knowledgeable position and not from one of ignorance or misinformation (as I see many people do). Also I don't pretend to know everything and sometimes in these debates people bring up points I had no idea about and I actually enjoy the debate more when that happens.

So this changed from the 360 then? I'm pretty sure developers were able to use their own servers with the 360. Also, if you don't mind me asking, did MS post this change somewhere? I just haven't seen it explicitly stated that a company like EA could not do what it did on the 360 if it really wanted.

I looked up the EA situation and I was unaware of it, from: http://electronics.howstuffworks.com/xbox-live3.htm

"There are a few exceptions to exclusive content hosting on Xbox Live. Long-time gaming industry leader Electronic Arts (EA) originally wouldn't produce Live Enabled games for Xbox because Microsoft wouldn't allow them to use their own servers. In 2004, Microsoft struck a deal with EA, and now EA produces Live Enabled games for the Xbox [source: Electronic Arts]. Some of those games make use of EA's own system of user accounts, such as those available for EA Sports Active players. Each Xbox Live Gamertag can be associated with an EA.com account, and linking the two is irreversible"

So as I stated the general rule has not changed however apparently unknown to me is that MS has granted specific exceptions. Apparently EA refused to produce Live enabled content until MS caved but when they caved they didn't change the general rule, they only granted a specific exception. I only know what the rule is and am not privy to what exceptions MS may have granted, I have no idea if there are others besides EA but I imagine the list is pretty small and you have to have a ton of leverage over MS to get that.

I'm pretty sure this is the reason Dust514 doesn't work on Xbox 360 (not that it's a big loss.) I believe the developers expressed interest in making a 360 client but they would have needed it to access their Eve/Dust514 servers (one of the major selling points of these games is a single persistent game server) on the public internet and MS wouldn't let them, they're no EA. I'm not 100% on that one but I believe that's correct.


You can, for example, run a game server on your own PC that hosts games for PS3 or PS4 clients as far as Sony is concerned, though I doubt any devs/publishers would make a product that supports this. I'm pretty sure the only one who ever has is Epic, which I believe eventually made an Unreal Tournament III server that you could run on your PC to host games for PS3 players. It came out WAY after launch, though, and so far as I know no one else even bothered with the capability. There isn't a Sony policy that prevents it, though.

So would you agree that it's fair to say that developers have not leveraged cloud computing or simply dedicated servers in large numbers up to now on consoles? There are obviously barriers to entry that keep more developers from at least trying that for, say, the PS3/PS4. Maybe the tools don't make it easy enough, or the cost, while not enormous, factors into it.

Either way, maybe MS' deal will lead to more developers with the experience to leverage a cloud infrastructure across all platforms.

 

I'm firmly against "Walled Gardens" personally. Again though my point here isn't to change anyone's mind, I'm just trying to correct misinformation when I see it and providing information that I have and others may not. If you look at all the information presented and don't care or even prefer the "Walled Garden" approach, GREAT! The point is to do that from a knowledgeable position and not from one of ignorance or misinformation (as I see many people do). Also I don't pretend to know everything and sometimes in these debates people bring up points I had no idea about and I actually enjoy the debate more when that happens.

Then I have to ask. What is so wrong about how XBL works? I mean does it hurt developers or gamers? I would guess that the idea is that a walled garden results in less choice and less choice is bad for everyone. I can certainly see examples of that being bad, but is it always bad?

I mean if we look at the history of the service, MS seems to have provided something at least as good as what has been seen on Sony platforms even though it is more open.

Honestly, I think it's some of the policies that MS has applied to their platform that have caused more issues than the platform itself.

I looked up the EA situation and I was unaware of it, from: http://electronics.howstuffworks.com/xbox-live3.htm

"There are a few exceptions to exclusive content hosting on Xbox Live. Long-time gaming industry leader Electronic Arts (EA) originally wouldn't produce Live Enabled games for Xbox because Microsoft wouldn't allow them to use their own servers. In 2004, Microsoft struck a deal with EA, and now EA produces Live Enabled games for the Xbox [source: Electronic Arts]. Some of those games make use of EA's own system of user accounts, such as those available for EA Sports Active players. Each Xbox Live Gamertag can be associated with an EA.com account, and linking the two is irreversible"

So as I stated the general rule has not changed however apparently unknown to me is that MS has granted specific exceptions. Apparently EA refused to produce Live enabled content until MS caved but when they caved they didn't change the general rule, they only granted a specific exception. I only know what the rule is and am not privy to what exceptions MS may have granted, I have no idea if there are others besides EA but I imagine the list is pretty small and you have to have a ton of leverage over MS to get that.

I'm pretty sure this is the reason Dust514 doesn't work on Xbox 360 (not that it's a big loss.) I believe the developers expressed interest in making a 360 client but they would have needed it to access their Eve/Dust514 servers (one of the major selling points of these games is a single persistent game server) on the public internet and MS wouldn't let them, they're no EA. I'm not 100% on that one but I believe that's correct.

Well I think everyone can agree that MS is not above breaking its own rules when situations pop up.

Do you know if FF11 was also an exception? It was clearly using its own account system, so I had assumed that SE had struck a deal with MS to use their own servers as well.


So would you agree that it's fair to say that developers have not leveraged cloud computing or simply dedicated servers in large numbers up to now on consoles?

Yes.

There are obviously barriers to entry that keep more developers from at least trying that for, say, the PS3/PS4. Maybe the tools don't make it easy enough, or the cost, while not enormous, factors into it.

Sony has a fair number of first-party games that use dedicated servers, but especially on cross-platform games they have not been used, because Xbox didn't support it (without paying MS apparently more than people were willing to). Developers don't typically want two completely different sets of networking code, so most went peer-to-peer on both platforms, the lowest common denominator. That's why it's so exciting that MS is providing free cloud compute now: Xbox users can finally have dedicated server support, and 3rd-party developers will hopefully start using the capability that's always been there for the PS. So yeah, the barrier to entry was MS, and the big advance of cloud computing is that they are removing the barrier they created. It's kind of like how this new console gen having > 4GB of RAM is finally pushing developers to make PC games that REQUIRE 64-bit. It's not like it's a new PC capability, but no one used it because consoles (BOTH PS3 and Xbox 360) held developers back in the interest of making cross-platform games. So even if you don't care about consoles, two big things are happening across gaming now... the rise of cloud/dedicated servers and 64-bit games. Everyone wins because bottlenecks have been removed.

 

Then I have to ask. What is so wrong about how XBL works? I mean does it hurt developers or gamers? I would guess that the idea is that a walled garden results in less choice and less choice is bad for everyone. I can certainly see examples of that being bad, but is it always bad?

I mean if we look at the history of the service, MS seems to have provided something at least as good as what has been seen on Sony platforms even though it is more open.

Honestly, I think it's some of the policies that MS has applied to their platform that have caused more issues than the platform itself.

I really don't know how to answer this in a reasonable length post. Also like I said I'm not trying to change minds here. Furthermore I don't believe absolutes like "ALWAYS BAD" should be used lightly so I will say that it is not ALWAYS bad. I feel like you are putting me in the awkward position of having to defend something I don't like, rofl. I'll just say both have strengths and weaknesses and my personal preference is to avoid walled gardens as much as I can. Your preference may differ and that's fine. A debate on which is better could likely fill an entire new thread or more.

Sony has a fair number of first-party games that use dedicated servers, but especially on cross-platform games they have not been used, because Xbox didn't support it (without paying MS apparently more than people were willing to). Developers don't typically want two completely different sets of networking code, so most went peer-to-peer on both platforms, the lowest common denominator. That's why it's so exciting that MS is providing free cloud compute now: Xbox users can finally have dedicated server support, and 3rd-party developers will hopefully start using the capability that's always been there for the PS. So yeah, the barrier to entry was MS, and the big advance of cloud computing is that they are removing the barrier they created. It's kind of like how this new console gen having > 4GB of RAM is finally pushing developers to make PC games that REQUIRE 64-bit. It's not like it's a new PC capability, but no one used it because consoles (BOTH PS3 and Xbox 360) held developers back in the interest of making cross-platform games. So even if you don't care about consoles, two big things are happening across gaming now... the rise of cloud/dedicated servers and 64-bit games. Everyone wins because bottlenecks have been removed.

But has Sony themselves tried to leverage cloud computing beyond dedicated servers on their own consoles? I just haven't seen them push in that direction.

Overall though, it sounds pretty clear that MS was actually holding everyone back. So what they have done for the X1 merely brings them closer to parity, rather than actually offering something of value.

I just felt that MS seemed more willing to invest in the technology both at the hardware level and with their own development teams compared to say Sony.

 

 

 

I really don't know how to answer this in a reasonable length post. Also like I said I'm not trying to change minds here. Furthermore I don't believe absolutes like "ALWAYS BAD" should be used lightly so I will say that it is not ALWAYS bad. I feel like you are putting me in the awkward position of having to defend something I don't like, rofl. I'll just say both have strengths and weaknesses and my personal preference is to avoid walled gardens as much as I can. Your preference may differ and that's fine. A debate on which is better could likely fill an entire new thread or more.

Good point there. We can leave it at that.


This is the reason Netflix exists instead of everyone buying BDs.

not really... I get your point though.


I'm not sure why you create these arbitrary constraints on a general performance discussion. When discussing the GPUs I at no time restricted the performance to graphics, yet you tried saying even Sony admits they don't need them all for graphics. So what? Unless you honestly believe the extra CUs won't be used for ANYTHING, they DO provide a hardware performance advantage. Maybe they're used for physics or audio or something else; it doesn't matter, it's a hardware advantage.

They do provide an advantage for compute, but once compute is finished you have to shift the data. You can't dismiss the advantages of the other system because it's not marketing material. In GPGPU, it's a fight of more compute vs. coherent bandwidth.

Here's an example of 2 different systems and their end results:

System 1: 4 compute units, base bandwidth

System 2: 2 compute units, 3 times the base bandwidth

System 1 needs half the time to finish the compute compared to System 2, but needs 3 times longer to shift the data over. System 2 can end up finishing the job faster despite System 1 having double the compute power. In this case, System 1 clearly wins on paper specs, but not when put into action.
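A rough back-of-the-envelope version of that example, with total job time modeled as compute time plus transfer time (the numbers are arbitrary units, matching the made-up systems above):

    def job_time(work, data, compute_units, bandwidth):
        return work / compute_units + data / bandwidth

    # A job with 12 units of compute work and 12 units of data to move afterwards.
    system1 = job_time(work=12, data=12, compute_units=4, bandwidth=1)  # 3 + 12 = 15
    system2 = job_time(work=12, data=12, compute_units=2, bandwidth=3)  # 6 +  4 = 10

System 2 finishes first despite having half the compute units; make the job much heavier on compute relative to the data moved and the result flips back in System 1's favour.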

 

Now you confine memory to the effect on the CPU. So let's say both the Xbox One and the PS4 are running something that uses 20GB/s of data for the CPU; that means the Xbox One now has only 48GB/s (plus the eSRAM boost) for graphics and whatever else might need memory access, while the PS4 has 152GB/s. That's still a pretty big difference. Now, I agree the eSRAM puts a SIGNIFICANT dent in that gap and that developers will continue to improve as they get used to the system, but I still contend that they're never going to be able to tweak that eSRAM to such a degree that it can outperform the PS4's faster memory overall. MAYBE you can come up with some contrived isolated cases where that could happen, maybe you can't; I don't really care. The fact remains that generally speaking the PS4 has a faster memory subsystem. A better argument is that it doesn't matter, though: maybe developers will never bother to use that extra performance so their titles can be the same cross-platform. It may very well be that the performance difference, while there, is imperceptibly small... another solid argument. Another strong argument is that hardware specs aren't everything when choosing a console. It's just not the case, however, that the Xbox One has superior overall memory performance to the PS4. The eSRAM and such make a HUGE difference, but not THAT much.

If you don't want to respect the dual read and write ports of the eSRAM, do know that it carries a minimum of 109GB/s. Couple that with the DDR3 and you're up to 177GB/s anyway, on par with GDDR5. GPUs are highly parallel machines, so reads and writes are always happening. It isn't simply "isolated cases" where eSRAM alone defeats GDDR5. Depth testing, which happens for each pixel before it is written to a render target, is both a read and a write of the z-buffer.
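For reference, the arithmetic behind that 177GB/s figure, assuming (as the post argues) that the eSRAM and the DDR3 pool can both be kept busy at the same time:

    esram_min = 109.0   # GB/s, the read-only or write-only cap quoted earlier in the thread
    ddr3      = 68.3    # GB/s, Xbox One main memory
    gddr5     = 176.0   # GB/s, PS4 main memory

    combined = esram_min + ddr3               # about 177.3 GB/s with both pools busy at once
    print(round(combined, 1), combined >= gddr5)   # 177.3 True, though only for data held in the 32 MB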

 

At no time did I say the ARM chip affected rendering. The point, again, was that at least part of the PS4 OS runs on that chip WITH ITS OWN MEMORY, so those internal system calls don't have to touch the system memory. The Xbox One has to use its system memory for such calls, and it's widely accepted that the Xbox One's OS is much larger than the PS4's, so whatever bandwidth those system calls are taking is not available to the game on the Xbox One but IS on the PS4, since the PS4 has that separate 256MB system memory bank. Now, how pitiful of an impact that has on said bandwidth I totally admitted when I mentioned it. The point was specifically to use an absurd special-case situation to underscore the absurdity of your special-case examples. You know what... if a developer writes a game that fits entirely in the eSRAM cache then it will run a lot faster on the Xbox One than on the PS4... clearly this makes the Xbox One significantly faster than the PS4.

 

Except it isn't an absurd special case. This is how all of the last-gen consoles were programmed. The reason the PS3 was such a PITA to program was that the SPEs only had 256KB of local memory each. Games like Uncharted 3 used the SPEs to do tiled lighting, so they split the frame up into little tiles that could fit in the SPEs and did the lighting on each little piece. The EDRAM in the Xbox 360 was used the same way. 32MB is actually a dream compared to last gen. I don't understand why people think it's something that's never been done before. Even mobile GPUs tile internally at the silicon level, because mobile memory is usually tiny.

 

The Xbox 360 has 10 MB (10×1024×1024) of fast embedded dynamic RAM (EDRAM) that is dedicated for use as the back buffer, depth stencil buffer, and other render targets. Depending on the size and format of the render targets and the antialiasing level, it may not be possible to fit all targets in EDRAM at once. For example, 10 MB of EDRAM is enough to hold two 1280×720 32-bit surfaces with no multisample antialiasing (MSAA) or two 640×480 4× MSAA 32-bit surfaces. However, a 1280×720 2× MSAA 32-bits-per-pixel render target is 7,372,800 bytes. Combined with a 32-bit Z/stencil buffer of the same dimensions, it becomes apparent that 10 MB might not be sufficient.

Predicated tiling allows rendering to larger surfaces than can fit into EDRAM at any one time. In predicated tiling, the screen space is broken up into tiles (rectangles). The following figure shows the screen space broken into two tiles.

[Figure: the screen space divided into two tiles, Tile 0 and Tile 1]

In predicated tiling, the commands issued in the Draw method are recorded before execution. The recorded commands, such as DrawPrimitives calls, are then executed for each tile, predicated based on whether the rendered primitives intersect the tile. In the preceding figure, both the triangle primitives would be rendered in Tile 0. Once the primitives for a tile are fully rendered, the tile is then resolved into the texture that is used for the front buffer. Each successive tile is handled the same way and is resolved into the same texture.

http://msdn.microsoft.com/en-us/library/bb464139.aspx
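The arithmetic behind the sizes quoted in that passage is easy to check:

    def surface_bytes(width, height, bytes_per_pixel, msaa=1):
        return width * height * bytes_per_pixel * msaa

    color_2x = surface_bytes(1280, 720, 4, msaa=2)   # 7,372,800 bytes, as quoted above
    depth_2x = surface_bytes(1280, 720, 4, msaa=2)   # matching 32-bit Z/stencil buffer
    edram    = 10 * 1024 * 1024                      # 10 MB of Xbox 360 EDRAM
    print(color_2x + depth_2x > edram)               # True: both targets don't fit at once, hence tiling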


But has Sony themselves tried to leverage cloud computing beyond dedicated servers on their own consoles? I just haven't seen them push in that direction.

Why would they push in that direction? MS is "pushing" in that direction because they want you to use their product. Sony doesn't provide a product for others to use, so what would they push? "Hey PS4 devs, go make servers on MS's Azure cloud or Amazon's cloud or Google's cloud or make your own cloud!" Sony lets developers do whatever they want for their game servers, so there is no reason for them to "push" any particular solution.

I just felt that MS seemed more willing to invest in the technology both at the hardware level and with their own development teams compared to say Sony.

MS IS more willing to invest in the technology because not having good dedicated server support was one of the major knocks against their walled garden that they wanted to correct this generation. Providing cloud compute gives them dedicated server support and more WITHOUT having to tear down their walls. Furthermore Microsoft with Azure is already a competitor in the general cloud space (not console or game specific) so this made sense for them.

Sony doesn't have to compete there; there is already fierce competition, so it makes sense for them to just reap the benefits of that competition. What would paying the huge sums of money to build their own gain them that developers can't already get from the current competitors? Maybe that means PS4 games will be developed on MS's public Azure cloud. Maybe it means they'll be on Google or Amazon; what does Sony care, as long as they are making PS4 games? In reality it will probably be a huge mix of public and custom clouds for the PS4, and no one is going to push one solution because they'll all just use whatever is best for their particular application; one solution isn't necessarily the best for every case. Sony IS making its own cloud for its own use, PS Now and such; heck, I wouldn't be shocked to find that the PS chat infrastructure servers were implemented via cloud computing, but people don't care how the server infrastructure for chat is implemented, so there isn't a marketing push behind that.


Interesting back-and-forth you guys have going here, but you shot yourself in the foot with this line.

 

Low-level APIs absolutely do affect performance, in the same way using a low-level language results in better performance than using a high-level language. Less abstraction.

 

Let me clarify. Low-level APIs do not improve performance. They allow the developer to more finely control the API, so performance gains can be achieved by the developer through those lower-level controls. The API itself has nothing to do with the performance. It's really semantics, but I'm just clearing up what I meant.


They do provide an advantage for compute, but once compute is finished you have to shift the data. You can't dismiss the advantages of the other system because it's not marketing material. In GPGPU, it's a fight of more compute vs. coherent bandwidth.

Here's an example of 2 different systems and their end results:

System 1: 4 compute units, base bandwidth

System 2: 2 compute units, 3 times the base bandwidth

System 1 needs half the time to finish the compute compared to System 2, but needs 3 times longer to shift the data over. System 2 can end up finishing the job faster despite System 1 having double the compute power. In this case, System 1 clearly wins on paper specs, but not when put into action.

That's a nice fictional example but I don't see how it's relevant.

System 1: 18 compute units. 8GB 176GB/s bandwidth

System 2: 12 compute units. 8GB 68.3GB/s bandwidth PLUS 32MB of 218GB/s bandwidth. (I still don't think in regular use you're going to be able to fully saturate that ESRAM and sustain that but we'll set that aside for now and pretend that they can)

System 1 having its memory 257% faster than 99+% of System 2's memory results in faster overall system performance, even though System 2 has an extra 0.4% of RAM that is about 20% faster than System 1's. That tiny bit of slightly faster RAM doesn't make up ENTIRELY for the bulk of the RAM being significantly slower. Now, if the entire RAM of the Xbox One were eSRAM, as your System 2 example implies, that would be a totally different story.

The reason the PS3 was such a PITA to program was that the SPEs only had 256KB of local memory each. Games like Uncharted 3 used the SPEs to do tiled lighting, so they split the frame up into little tiles that could fit in the SPEs and did the lighting on each little piece. The EDRAM in the Xbox 360 was used the same way. 32MB is actually a dream compared to last gen. I don't understand why people think it's something that's never been done before. Even mobile GPUs tile internally at the silicon level, because mobile memory is usually tiny.

I totally agree the PS3 was a PITA to program. I totally agree the Xbox 360 was MUCH easier than the PS3. I totally agree that the Xbox One is easier still than the Xbox 360 and thus, by extension, the PS3. I don't see the relevance of any of this to the topic at hand, but I'll add that the PS4 is easier still than the Xbox One, which makes it the easiest of them all. This has nothing to do with hardware performance.

Let me clarify. Low-level APIs do not improve performance. They allow the developer to more finely control the API, so performance gains can be achieved by the developer through those lower-level controls. The API itself has nothing to do with the performance. It's really semantics, but I'm just clearing up what I meant.

 

Sorry dude, but you're just digging yourself into a deepening hole here. There is no other valuable statement to extract from your assertion, and semantics are not involved on that point.

 

If by being lower-level, the API has facilitated improved performance then quite simply low-level APIs improve performance. The only possible context in which your assertion makes sense is the case of a high(ish)-level API being given low-level extensions. (OpenGL, sort-of)


Sorry dude, but you're just digging yourself into a deepening hole here. There is no other valuable statement to extract from your assertion, and semantics are not involved on that point.

 

If by being lower-level, the API has facilitated improved performance then quite simply low-level APIs improve performance. The only possible context in which your assertion makes sense is the case of a high(ish)-level API being given low-level extensions. (OpenGL, sort-of)

 

Interfaces don't control performance; they control how you interface two different pieces of software. That's the definition of an API (application programming interface). You can say I'm digging a hole, but that's what I meant when I said it and I still mean it now.

 

Edit: Let me correct myself here. While what I am saying is true, the API itself can be optimized to perform better at certain tasks. That being said, that truth doesn't apply only to low- or high-level APIs but to all APIs.

 

The option of using lower-level tools, however, does not by itself mean better performance from said API.


That's a nice fictional example but I don't see how it's relevant.

System 1: 18 compute units. 8GB 176GB/s bandwidth

System 2: 12 compute units. 8GB 68.3GB/s bandwidth PLUS 32MB of 218GB/s bandwidth. (I still don't think in regular use you're going to be able to fully saturate that ESRAM and sustain that but we'll set that aside for now and pretend that they can)

System 1 having its memory 257% faster than 99+% of System 2's memory results in faster overall system performance, even though System 2 has an extra 0.4% of RAM that is about 20% faster than System 1's. That tiny bit of slightly faster RAM doesn't make up ENTIRELY for the bulk of the RAM being significantly slower. Now, if the entire RAM of the Xbox One were eSRAM, as your System 2 example implies, that would be a totally different story.

We're talking coherent GPU bandwidth with the caches, not the regular memory bus. The Xbox One chip has a massive advantage in this area.

 

I totally agree the PS3 was a PITA to program. I totally agree the Xbox 360 was MUCH easier than the PS3. I totally agree that the Xbox One is easier still than the Xbox 360 and thus, by extension, the PS3. I don't see the relevance of any of this to the topic at hand, but I'll add that the PS4 is easier still than the Xbox One, which makes it the easiest of them all. This has nothing to do with hardware performance.

It's relevant in that you're dismissing the eSRAM's importance because of its size, when in the past devs were able to utilize super fast memory for all the bandwidth-heavy tasks no matter how little of it was available, and made it function like a large contiguous block of memory running at that speed.


It's relevant in that you're dismissing the eSRAM's importance because of its size, when in the past devs were able to utilize super fast memory for all the bandwidth-heavy tasks no matter how little of it was available, and made it function like a large contiguous block of memory running at that speed.

In the past devs "were able to utilize SUPER FAST memory ... no matter how small" because those tiny SUPER FAST memory caches were an order of magnitude (10x or more) faster than the main memory and/or the competitor's. The Xbox 360, I believe, had 256GB/s access to the EDRAM, while both its main memory and the competing PS3's memory bandwidth were in the 20GB/s range. That's a HUGE difference from Xbox One vs. PS4, where even the most optimistic (and highly improbable) usage scenario has the tiny eSRAM cache at less than double the PS4's RAM bandwidth, while the main system memory of the Xbox One is less than half the speed of the PS4's. That slightly faster (2x or so, NOT super fast) tiny cache isn't going to make up for the significantly slower (less than half) system RAM; it's just not going to happen.
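The order-of-magnitude point is easy to check against the figures used in this thread (peak numbers; the sustained eSRAM figures quoted earlier were 140-150GB/s):

    # Last generation: Xbox 360 EDRAM vs. the roughly 20 GB/s main memory of either console.
    last_gen_ratio = 256.0 / 20.0    # ~12.8x faster than the memory it supplemented

    # This generation: Xbox One eSRAM peak (as quoted by MS above) vs. the PS4's 176 GB/s GDDR5.
    this_gen_ratio = 204.0 / 176.0   # ~1.16x, under two even at the theoretical peak
    print(round(last_gen_ratio, 1), round(this_gen_ratio, 2))   # 12.8 1.16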

That slightly faster (2x or so, NOT super fast) tiny cache isn't going to make up for the significantly slower (less than half) system RAM; it's just not going to happen.

 

Why won't it? Give a practical example as others have given.


It's not the TV, it's the person in front of the TV. I can easily tell the difference between 1080p and sub-1080p, but my wife has a hard time above 720p.

 

 

How can you know, & how sub-1080p? 720p we could probably all tell, but I think most would struggle to tell ~100 fewer lines. I mean, if you are reasonably tech-savvy you'll know whether the content you're playing or viewing is 1080p or not. You'd have to have a blind test to determine if you could truly tell.

 

It's like audiophiles who swear they can tell the difference between bitrates but, when they are tested, rarely do better than random guessing.

 

You may be able to, but you'd have to have pretty good eyesight and/or perception to notice, especially at a distance.


How can you know, & how sub-1080p? 720p we could probably all tell, but I think most would struggle to tell ~100 fewer lines. I mean, if you are reasonably tech-savvy you'll know whether the content you're playing or viewing is 1080p or not. You'd have to have a blind test to determine if you could truly tell.

 

It's like audiophiles who swear they can tell the difference between bitrates but, when they are tested, rarely do better than random guessing.

 

You may be able to, but you'd have to have pretty good eyesight and/or perception to notice, especially at a distance.

I can if I pay enough attention. I can definitely NOT tell that this is running at 1070 vertical lines just by looking at it :laugh: I didn't mean it that way. I can tell the difference between, say, 850 and 1080 when playing on my TV. I am not a videophile and have not done any tests to validate my claim. Having said that, there have been times when I thought something was off with the quality and then found out that the resolution wasn't Full HD.

