PS4 and Xbox One resolution / frame rate discussion



Raw numbers are impartial and non-subjective. 176GB/s is always faster than 109GB/s. How else would you compare performance, by which one gives you the happier feelings?

When you post something before the hardware is finalized it does. In fact the article you posted is wrong because MS increased the operating frequency of the production Xbox Ones and so the 102GB/s in that article was raised to the 109GB/s I stated. That's still well short of 176GB/s last I checked though.

What's not how it works? I didn't describe any functionality and I read the article. It didn't contradict anything I've said.

At no time did I say DX12 is going to be just a few minor optimizations. DX12 is going to be HUGE for Xbox One and for PC because neither has a low level API right now at all. Both DirectX 11.x and OpenGL are high level APIs, just like Sony's GNMX. MS doesn't even have anything like Sony's GNM API right now though, nor is there one for the PC. So MS, your big software maker, is actually trailing here, not leading.

I know exactly what cloud services are, I work in the IT industry. Most people reading this don't understand the ins and outs of the cloud though, so calling it dedicated servers is a way to relate to people what it does in a manner which is more understandable to the majority of gamers. I didn't just unilaterally decide to do that on my own though, MS themselves made that decision as noted here:

http://www.eurogamer.net/articles/2014-06-16-microsofts-confusing-xbox-one-cloud-message-shifts-to-dedicated-servers

I guess MS is incorrect then too according to you. You should probably tell them.

Furthermore I have no idea what you are talking about me "putting arbitrary requirements". I didn't put any requirements on anything. My point is that MS Xbox Live cloud is the ONLY cloud option you have when you are making Xbox games. For PC and PS4 you have multiple options from industry leaders INCLUDING MS, Amazon, and Google, plus the freedom to build your own cloud. You are free to make your cloud do anything (not just dedicated servers) on the PS4 and PC but with MS you only have the options MS provides to you, there is no competition, you can't make your own. The big benefit of the Xbox Live cloud is just that MS provides it for free to Xbox developers. Cloud computing is absolutely the future, it's just not exclusive to Xbox and since it's not exclusive it's not going to push the Xbox past the PlayStation from a performance perspective when PlayStation games can do the same thing.

1. You are assuming that the DDR3 RAM has to go through eSRAM, which is wrong as far as I know.

 

No one here claimed that Sony can't do whatever MS is doing with Xbox Live Compute. As of now Sony doesn't have it, and even when they do, they will always be dependent on others; it will take them time to bring it to market. Think of the situation where MS is with their game streaming service (only a proof of concept) and where Sony is with PS Now (almost ready to roll out). Game streaming is not exclusive to Sony but they have it working now whereas MS is still in the labs.

 

Sony will never be able to match the "Free" part of Azure given that they will be using a cloud that doesn't belong to them. They may even end up using Azure like Apple does for iCloud.


it's 109GB/s each way. read+write = 204GB/s. the 68GB/s DDR3 can also be used simultaneously, so the theoretical bandwidth peak is 272GB/s versus 176GB/s with GDDR5. big difference

devs are still learning the system, and many techniques are still not being used.

[slide from the "Inside Xbox One" presentation at the Microsoft/AMD developer event]

 

Interesting...where did you get that slide?


Any increase in performance is always welcome, why not? Sorry for not following the whole thread. While very happy with my X1, I am aware that the situation thus far has not been ideal. I enjoy Watch Dogs on X1 very much, but the jaggies are not something I'm thrilled to have, so any help there would be embraced with both arms. Then again that's the beauty of our hobby, we can choose what we spend our money on. Right now gearing up for a 4K PC, if I can afford it. This does not detract at all from the X1 or PS4, which I will continue to enjoy. Apologies again, I'm in rambling mode today for some reason.


<snip>

 

If you have something to add that might help people be more informed in the future then there is no reason not to share it with us. No need to be rude about it. Sometimes people hear something and they think that's how it is.

 

A gentle correction goes so much further than a harsh criticism.


So wait, you're saying that going from 900p to 1080p is an unrealistic outcome from a 10% change in resources?

I thought you were referring to the idea that it would suddenly bring parity with the ps4 gpu wise.

Then your expectations are that it will have no impact?

It does not automatically mean all games will go to 1080p, that is correct.

 

Some more antialiasing and better IQ. A few more frames per second. A few more things on screen that do not have to be calculated in the cloud.

Sure, perhaps it will mean a jump from 900p to 1080p here and there. But across the board on all games? Absolutely not. It does not guarantee that. Not at all.


It does not automatically mean all games will go to 1080p, that is correct.

 

Some more antialiasing and better IQ. A few more frames per second. A few more things on screen that do not have to be calculated in the cloud.

Sure, perhaps it will mean a jump from 900p to 1080p here and there. But across the board on all games? Absolutely not. It does not guarantee that. Not at all.

Exactly. What people don't realise is that 900p (1600x900) is 1.4 million pixels, whereas 1080p (1920x1080) is 2.1 million pixels. Going from 900p to 1080p is a 44% increase in pixels, while a jump from 720p to 1080p is a 125% increase in pixels - that's not going to happen due to a 10% increase in GPU power alone.
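
A quick sanity check of those figures as plain arithmetic (a minimal Python sketch, nothing console-specific):

```python
# Pixel counts for the render resolutions discussed in this thread.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} million pixels")

# The relative jumps quoted above:
print(f"900p -> 1080p: +{(pixels['1080p'] / pixels['900p'] - 1) * 100:.0f}%")  # ~44%
print(f"720p -> 1080p: +{(pixels['1080p'] / pixels['720p'] - 1) * 100:.0f}%")  # ~125%
```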

 

As Microsoft improves the SDK and DX12 is taken up by developers more titles will hit 1080p but it won't be enough to catch up to the PS4.


I think the two slides sum it up well, people can debate it all they want but I don't think anyone in here is working for these companies or developing AAA titles. I mean heck, that one Sony slide says it pretty clearly that their system is balanced with 14 CUs in mind and using the other 4 gives you a minor difference. This just falls back to what MS was/is/has been saying from the start, it's all about balance and not just one single part of the hardware.

 

It's sometimes mind blowing how some want to get stuck on some specific raw theoretical number like that's what you actually get out of it. Guys, how long have you been looking at hardware? The theoretical high numbers are NEVER what you get from the hardware in the real world, I don't care if the number is from MS or from Sony, that peak is just on paper, it never translates into the real world.

 

The MS slide also sums up what some developers have posted on other forums: they're not yet using the XB1 hardware to its fullest. The slide is from the MS and AMD game dev event they had a while ago; it's targeted at developers and not gamers, so what they talk about isn't BS or marketing speak.

 

We're already seeing developers getting more out of the hardware, the new XDK helps with this I'm sure, and it'll only get better; next year DX12 will help yet again. When the difference is minor, as they (game developers) have said it is, then I don't see how they don't close the gap to the point where it's so minor it doesn't even matter.


Spencer has said himself even the move to DX12 won't increase the performance much, but it does make sense to free that 10% of resources up depending on what type of game you're making.

 

 

 

The thing is people keep looking at the PC numbers for DX12 and directly applying them to the console. This is obviously a fallacy. For a PC the multi-core and low level enhancements will make a huge difference, especially on computers without a top of the line CPU. On a console, with its specially optimized DX, hardware and SDKs, a lot of this is already done. Of course DX12 will bring some performance, but not the same gains as the PC potential.


Interesting...where did you get that slide?

it's from the presentation from a few weeks ago. grab the "inside xbox one" slides from here

http://developer.amd.com/resources/documentation-articles/conference-presentations/

 

<snip>

adding the bandwidths is not numbers cobbled together from slides. it's exactly what the architect who designed the chip said.

Digital Foundry: Simultaneously? Because there's been a lot of controversy that you're adding your bandwidth together and that you can't do this in a real-life scenario.

Nick Baker:...

You do get to add the read and write bandwidth as well as adding the read and write bandwidth on to the main memory. That's just one of the misconceptions we wanted to clean up.

Digital Foundry: So 140-150GB/s is a realistic target and you can integrate DDR3 bandwidth simultaneously?

Nick Baker: Yes. That's been measured.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

if you want to talk about real code usage, instead of theoretical peak, then both companies have provided their benches. microsoft says 150GB/s for esram, and 50-55GB/s for DDR3.

 

we've measured about 140-150GB/s for ESRAM. That's real code running. That's not some diagnostic or some simulation case or something like that. That is real code that is running at that bandwidth. You can add that to the external memory and say that that probably achieves in similar conditions 50-55GB/s and add those two together you're getting in the order of 200GB/s across the main memory and internally.

whereas real code running on the ps4 where only the gpu is using the ram operates at 130GB/s, and when sharing with the cpu the bandwidth can get reduced to 100GB/s or lower for the gpu.

[slide: PS4 memory bandwidth figures]
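
Collecting the numbers quoted in this thread in one place, taking both vendors' stated figures at face value (a sketch only; the real-world contention caveats discussed elsewhere in the thread still apply):

```python
# Bandwidth figures (GB/s) as quoted in this thread, peak vs measured.
xb1_peak = 204 + 68        # eSRAM read+write peak + DDR3 peak = 272
ps4_peak = 176             # 8GB GDDR5 peak

xb1_measured = 150 + 55    # MS: eSRAM 140-150 "real code" + DDR3 50-55
ps4_gpu_only = 130         # slide above: GPU alone on GDDR5
ps4_shared = 100           # slide above: GPU share when the CPU is also active

print(f"Peak:     XB1 {xb1_peak} vs PS4 {ps4_peak}")
print(f"Measured: XB1 ~{xb1_measured} vs PS4 {ps4_gpu_only} ({ps4_shared} when shared)")
```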

sure 32MB is quite small, but it's designed specifically for certain tasks. like the slide I posted earlier said, developers are supposed to asynchronously swap data in and out of esram with the DMA/Move engines, which they may not have been doing yet.

the lead programmer of Trials Fusion even said he can fit the fat unoptimized 84MB g-buffer of Infamous SS into 32MB with the exact same data. this puts to bed the theory that 32MB of esram is not enough for 1080p.

 

Digital Foundry: There's been a lot of controversy about the design of the Xbox One. On the Beyond 3D Forum, you mentioned that 32MB has been the magic number for optimising render targets on your engine. Can you go into depth on how you approached ESRAM?

Sebastian Aaltonen: Actually that 32MB discussion wasn't about our engine; it was about an unoptimised prototype of another developer. The general consensus in that discussion thread was that it is not possible to fit a fully featured G-buffer to a 32MB memory pool. I, of course, had to disagree, and formulate a buffer layout that had all the same data tightly packed and encoded to fit to the target size.

http://www.eurogamer.net/articles/digitalfoundry-2014-trials-fusion-tech-interview
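
To put rough numbers on why layout matters more than resolution here, a sketch with invented bytes-per-pixel figures (these are hypothetical examples, not the actual formats of either game in the interview above):

```python
# Whether a 1080p G-buffer fits in 32MB of eSRAM depends almost entirely
# on bytes per pixel, i.e. how the render targets are packed and encoded.
ESRAM_MB = 32
PIXELS_1080P = 1920 * 1080

for label, bpp in [("fat, unoptimized", 40), ("tightly packed", 12)]:
    size_mb = PIXELS_1080P * bpp / (1024 * 1024)
    verdict = "fits" if size_mb <= ESRAM_MB else "does not fit"
    print(f"{label} ({bpp} B/px): {size_mb:.1f}MB -> {verdict} in {ESRAM_MB}MB")
```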


<snip>

The reality is we don't know what either console is capable of yet. Hardware numbers are just numbers, real world application is necessary to know how it will perform.

 

Also, it's just as rude to ignore what someone says via ad hominem.


Exactly. What people don't realise is that 900p (1600x900) is 1.4 million pixels, whereas 1080p (1920x1080) is 2.1 million pixels. Going from 900p to 1080p is a 44% increase in pixels, while a jump from 720p to 1080p is a 125% increase in pixels - that's not going to happen due to a 10% increase in GPU power alone.

 

As Microsoft improves the SDK and DX12 is taken up by developers more titles will hit 1080p but it won't be enough to catch up to the PS4.

 

Interestingly and counterintuitively, render size doesn't scale linearly with performance. It makes more sense when you understand the layers of rendering. Output size is just one element; meanwhile shaders, texture filtering, effects and many other things are either resolution independent or only affected by resolution to a smaller degree. This means a doubling of output resolution doesn't require a doubling of power.
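
A toy cost model of that point (the coefficients are invented purely for illustration):

```python
# Some frame cost scales with pixel count, some is resolution-independent.
def frame_ms(width, height, per_pixel_ns=4.0, fixed_ms=8.0):
    """Estimated frame time: a fixed cost (geometry, simulation, fixed-size
    shadow maps, etc.) plus a per-pixel shading cost."""
    return fixed_ms + width * height * per_pixel_ns / 1e6

for name, (w, h) in [("900p", (1600, 900)), ("1080p", (1920, 1080))]:
    print(f"{name}: {frame_ms(w, h):.1f} ms")

# 44% more pixels raises the toy frame time by well under 44%,
# because the fixed portion doesn't grow with resolution.
```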

 

If you've ever done any 3d modeling or rendering you'd be well aware of this.


<snip>

Yet you don't seem to know what very low LATENCY embedded static cache is and what it's used for.

 

If DRAM were a do-it-all memory, CPU vendors wouldn't have bothered with L1, L2, and L3 caches for CPUs.


It does not automatically mean all games will go to 1080p, that is correct.

 

Some more antialiasing and better IQ. A few more frames per second. A few more things on screen that do not have to be calculated in the cloud.

Sure, perhaps it will mean a jump from 900p to 1080p here and there. But across the board on all games? Absolutely not. It does not guarantee that. Not at all.

I didn't say guarantee that, but I understand your point.

You're saying that different developers will use the resources in different ways. Some to push the res, some to push effects. That is certainly possible. I think the idea is that the extra resources might be enough to allow a game that would have been running at 900p/30 to run at 1080p/30 for example (that is assuming the developer changes nothing else). We won't know until more developers chime in.

Still, in the end, that means a positive improvement.

 

 

Exactly. What people don't realise is that 900p (1600x900) is 1.4 million pixels, whereas 1080p (1920x1080) is 2.1 million pixels. Going from 900p to 1080p is a 44% increase in pixels, while a jump from 720p to 1080p is a 125% increase in pixels - that's not going to happen due to a 10% increase in GPU power alone.

 

As Microsoft improves the SDK and DX12 is taken up by developers more titles will hit 1080p but it won't be enough to catch up to the PS4.

Here we go again trying to make this about the X1 vs the ps4. At this point, it's not even relevant. When someone says a game will be improved to say run at 1080p, it does not mean it's suddenly on par with the ps4. It's as if there is a knee jerk reaction to defend the fact that the ps4 is more powerful. Resolution parity does not mean performance parity.

The only relevant part is what does this do for the X1 in comparison to games released so far. Does this allow more games to run at 1080p that were running at 900p? Again, assuming a developer does not change the visual effects, then maybe it does. Bungie seems to be saying that it does for their game.

You guys can argue this till the end of time, but if more games are able to hit 1080p in the future, then it's a win and that will be the only thing that matters.


A game at 900p with better post-processing can look very close to a 1080p version with little or none, IMO. After that it depends on the users but w/e. At that point you're just going around in circles.


it's from the presentation from a few weeks ago. grab the "inside xbox one" slides from here

http://developer.amd.com/resources/documentation-articles/conference-presentations/

 

adding the bandwidths is not numbers cobbled together from slides. it's exactly what the architect who designed the chip said.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

if you want to talk about real code usage, instead of theoretical peak, then both companies have provided their benches. microsoft says 150GB/s for esram, and 50-55GB/s for DDR3.

 

whereas real code running on the ps4 where only the gpu is using the ram operates at 130GB/s, and when sharing with the cpu the bandwidth can get reduced to 100GB/s or lower for the gpu.

[slide: PS4 memory bandwidth figures]

sure 32MB is quite small, but it's designed specifically for certain tasks. like the slide I posted earlier said, developers are supposed to asynchronously swap data in and out of esram with the DMA/Move engines, which they may not have been doing yet.

the lead programmer of Trials Fusion even said he can fit the fat unoptimized 84MB g-buffer of Infamous SS into 32MB with the exact same data. this puts to bed the theory that 32MB of esram is not enough for 1080p.

 

http://www.eurogamer.net/articles/digitalfoundry-2014-trials-fusion-tech-interview

 

This information is worth people reading and paying attention to. I mean it's clear now, and I don't know how many more developers have to chime in before this sinks in: the two systems are closer in performance than what fans want to think. Does the PS4 have an advantage? Yes, but is it some giant advantage? I don't think so, and I don't think the developers who work with both systems do either, because they've not said so at all.

 

The key here is that the XB1 is more complex, and the tools aren't up to the level they have to be at to help developers use the system better. Maybe the June XDK helps that to some degree but there's always going to be improvements. Both systems will get better with more optimizations put in place; the point I'm sticking to is that when it's all said and done the two will end up closer than what people want to admit at this point.

 

Right now it's down to what's easier to do. It's easier to code for the PS4 at this point because of the way they set it up; the XB1 has some bits you have to learn, like the esram and using the move engines in the best way possible to get the most out of it. So we're seeing games hitting 1080p or 60fps first on the PS4, good for them. But I'm also seeing more and more games inching higher on the XB1 as well. It could take longer, though how much longer is up for debate, but I think they'll get there in the end. Might be a few more steps in the process, a number of 900p titles and maybe something like 960p or w/e till they finally push out 1080p titles over and over, but I see zero reason as to why this wouldn't be the case and won't happen.

 

I also don't much buy into the argument that the XB1 will "never catch up to the PS4"; it doesn't have to. The goal, as far as some people seem to be stuck on it, is getting games to 1080p and 60fps, nothing so out of reach that it's not possible. Now if we're talking higher then ok, I'd agree, but this isn't the PC and we're not talking about some moving target. When more and more, or all, multiplatform games are going to be the same on both systems then I don't see what it'll matter; in the end we'll be debating who has the better AA at that point.


it's 109GB/s each way. read+write = 204GB/s. the 68GB/s DDR3 can also be used simultaneously, so the theoretical bandwidth peak is 272GB/s versus 176GB/s with GDDR5. big difference

From Microsoft concerning the eSRAM read+write capability:

"Yes you can - there are actually a lot more individual blocks that comprise the whole ESRAM so you can talk to those in parallel. Of course if you're hitting the same area over and over and over again, you don't get to spread out your bandwidth and so that's one of the reasons why in real testing you get 140-150GB/s rather than the peak 204GB/s..."

"If you're only doing a read you're capped at 109GB/s, if you're only doing a write you're capped at 109GB/s,"

You aren't going to hit that theoretical peak bandwidth in real world situations. Let's pretend though that you can. Do you really think having a little bank of 32MB of faster RAM is going to make up ENTIRELY for the fact that the vast majority of your RAM is only 68.3GB/s? Yes the eSRAM helps, it helps A LOT, I'm not trying to downplay what it does for the Xbox One compared to not having it, it makes a HUGE difference. That said, that tiny high speed bank isn't going to single-handedly make the overall memory performance better than a full 8GB bank of 176GB/s memory.

As for simultaneous access, well, the PS4 has a second ARM based processor with its own RAM as well. It's only 256MB but that's more than the eSRAM in the Xbox One. It's not super fast and it's not accessible by game developers, but what it does is run the OS and background tasks while the game is running (or the system is in standby and the primary systems are shut down). So what that means is the PS4 OS doesn't have to make calls and use the bandwidth the games are using, while the Xbox One has to share its memory bandwidth between its much larger OS and the games. I really don't think any of this makes a HUGE difference, but if you're seriously going to argue that being able to access that whopping 32MB of fast eSRAM at the same time as the system memory is going to somehow enable the Xbox One to catch and surpass the PS4 in memory performance, I guess we're bringing out and overhyping all the minute details.

with the gpu upclock, ms benchmarks showed they got a bigger graphical performance boost than if they had gone to 14 CUs.

So because MS got more of a boost from an upclock than from 14 CUs ON XBOX ONE HARDWARE you take that to mean that no developer will be able to utilize more CUs on PlayStation 4 hardware. The upclock provided more of a boost because it boosted the CPU speed, the memory speed, the GPU speed, etc. and taken as a whole that did more for them. This did put the CPU power a bit ahead of the PS4, but the memory is still significantly slower and makes it difficult to keep 14 CUs fed. The PS4 with its faster memory is more able to keep more than 12 CUs fed, but you're correct in that they won't be able to keep ALL 18 going on graphics alone. I never said they were going to use all 18 for graphics only, but it does allow them to outperform the Xbox One on graphics.
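
For reference, the raw shader throughput behind the CU argument follows from standard GCN arithmetic (64 ALUs per CU, one fused multiply-add, i.e. two FLOPs, per ALU per cycle) and the publicly known clocks:

```python
def gflops(cus, clock_ghz, alus_per_cu=64, flops_per_alu=2):
    # GFLOPS = CUs x ALUs/CU x FLOPs/ALU/cycle x clock (GHz)
    return cus * alus_per_cu * flops_per_alu * clock_ghz

print(f"Xbox One (12 CUs @ 853MHz): {gflops(12, 0.853):.0f} GFLOPS")  # ~1310
print(f"PS4      (18 CUs @ 800MHz): {gflops(18, 0.800):.0f} GFLOPS")  # ~1843
```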

Sony actually chose to use 18 CUs not to increase rendering capabilities, but because they're betting that in the future there will be a lot of compute workloads that they can move over to the gpu. sure you can use all 18 CUs for rendering, but there's little benefit. sony says it themselves in their developer manuals.

And at no time did I refute that. All I said was that Sony's 18 CUs offer more performance than MS's slightly higher clocked 12 (I didn't even specify graphics, so sure, some may be used for compute). Maybe only using 13 of the 18 would enable them to beat the Xbox One, maybe it would take 15; they don't have to fully utilize all 18 CUs to beat the Xbox One in performance, so proving that they currently aren't doesn't prove anything. You make a big deal about Xbox One devs not fully utilizing the eSRAM right now and how it's going to improve; well, the same thing is true for the Sony devs with the CUs. That's part of the point here. People keep saying that the Xbox One is going to get better because of X, Y and Z in the future (which I totally agree with) but then continue by saying that's going to allow them to catch the PS4, as if Sony is just going to sit still and its hardware is all tapped out right now. BOTH will improve with time, but as far as hardware performance goes the Xbox One is just weaker hardware. That doesn't mean it's a bad console; there are more reasons to buy a console than its hardware specs.

xbox one also has a programmable DSP and higher clocked CPU.

They are both true, I'll give credit where credit is due. As noted before, the upclock affected not only the GPU but the CPU as well, which gave the Xbox One a slight CPU advantage. The PlayStation 4 also has a programmable DSP, it's part of that extra core I mentioned back in the memory discussion, so it's not like the PS4 doesn't even have one as your statement seems to imply. However I will grant you that from what we know the Xbox One DSP is superior to the PS4 one. If fancy audio is your thing then Xbox does hold a slight edge there as well. Keep in mind however that part of the reason for this is because much of what a DSP does is VERY similar to what unified shaders do on GPUs. As you mentioned before, those 18 Compute Units on the PS4 GPU aren't intended to be for graphics alone. If you are a game developer and want to do super fancy DSP things that the PS4 DSP chip can't handle, Sony's intent is for you to still do that but to do so by using some of those CUs from the GPU. So maybe your game uses 15 CUs for graphics, surpassing what the Xbox One is capable of, and then the remaining 3 are used to augment the DSP on audio to enable the game to do similar audio capabilities as the Xbox One DSP does on its own. I don't really think most people care that much about audio though... the PS4's programmable DSP is likely "good enough" for most people, just like very few people bother to buy PC sound cards anymore; the ones integrated with motherboards are typically fine for most people.

Unfortunately no evidence to back it up, GAF have been wondering if Sony have even mentioned 900p at all.

 

This is not a rumour though, as he said he was told. Does a game developer need to then show it written down on headed paper before it's believed? Why would he lie?


Interestingly and counterintuitively, render size doesn't scale linearly with performance. It makes more sense when you understand the layers of rendering. Output size is just one element; meanwhile shaders, texture filtering, effects and many other things are either resolution independent or only affected by resolution to a smaller degree. This means a doubling of output resolution doesn't require a doubling of power.

I didn't claim it was, I was just pointing out the significant difference between 900p and 1080p. My point was that a 10% bump in GPU power isn't going to suddenly fix all of the XB1's performance issues. A 10% overclock doesn't translate to a 10% better framerate and resolution has a significant impact upon performance.

 

Here we go again trying to make this about the X1 vs the ps4. At this point, it's not even relevant. When someone says a game will be improved to say run at 1080p, it does not mean it's suddenly on par with the ps4. It's as if there is a knee jerk reaction to defend the fact that the ps4 is more powerful. Resolution parity does not mean performance parity.

Of course the PS4 is relevant to this discussion, as it's not the one with the major performance issues. Microsoft wouldn't have jettisoned the Kinect and reassigned the GPU power if it didn't need to in order to compete.


Raw numbers are impartial and non-subjective. 176GB/s is always faster than 109GB/s. How else would you compare performance, by which one gives you the happier feelings?

 

It may be faster but what if the calculations it is doing are smaller than its nearest neighbour's? More get done. See, size isn't everything and numbers DO NOT tell the whole story.


It can be an advantage for MS as long as competitors are not offering free access to the cloud infrastructure.

So are you assuming that developers/publishers aren't going to use cloud computing unless it's free? Cloud computing is the future, MS is right to embrace it. You radically underestimate its benefits if you think people aren't going to use it unless it's free. During this console generation I suspect most games will move to the cloud. Non-Xbox Live game clouds are going to have to be developed to support PC games (Windows, Mac, and Linux), to support mobile games, etc. There is every reason to believe they will exist for PlayStation as well without Sony having to provide them for free. The cloud is the future of computing in general, not just MS, not just consoles.

I just don't get why that is so downplayed. Being able to play with cloud computing without your own investment is pretty compelling.

I'm not downplaying the cloud in general, I don't know how I can hype it more than I have. My point is just that it's not an xbox specific thing as MS fanboys would have you believe. I'm also not trying to downplay that it's cool of MS to provide cloud computing for free. That's not all good though, there are downsides. The reason MS provides it for free is because they effectively ban you from running your own servers (be they stand alone, your own cloud implementation, or a commercial cloud service such as Amazon).

Once Sony offers that for free, then you can say it holds no advantage.

That's the thing, unlike MS, Sony isn't blocking developers/publishers from using any public cloud service provider or indeed developing their own. They have no reason to try to compete in this space, it's already highly competitive. So you'll have to pay a little bit, but being cheap is one of the benefits the cloud provides, as you often pay for only what you actually use, and many providers give you a free base level of service so you can "play with cloud computing without your own investment" already.

MS is one of the competitors in this space, in fact I'd say they were leading it with their Azure cloud technology, so it makes sense they are offering this service to Xbox. Who knows who will lead in the future, maybe Amazon's cloud will jump ahead or maybe a new startup will come out of nowhere and blow everyone away. With Xbox you're locked in with MS, with PS you can use whoever you want. In fact I seriously doubt MS is going to deny a customer if a developer/publisher wants to host their PS4 game servers on MS's public Azure cloud (different but based on similar tech to the Xbox Live Compute cloud). So even if MS does dominate you can still use their public cloud, just not for free, to host your PS4 and PC game servers. It's also nice to have the option to make your own cloud.

I'm sure there are going to be bumps along the way as things transition to the cloud, but it seems highly likely that EA and Ubisoft and such will build their own cloud infrastructures over the next 8 years or so of this console generation. The same cloud server farms can be used across multiple games for multiple platforms (Windows, PS3/4, Mac, Linux, Android, iOS, etc.) and scale resources between games/platforms as demand requires. Only Xbox has to be separate, because MS and their "walled garden" approach to Xbox Live prohibits access to a public cloud service. Heck, if they didn't give it away for free I'm not sure many cross-platform title devs would bother to use it, since it's limited to Xbox only and let's face it, most titles these days are cross-platform.

1. No I'll take a fairly reputable blog's word for it over your flattening of facts. Numbers are objective, the words in which they are placed are not. You can spin numbers to mean whatever you want them to mean.

There are quite a number of "fairly reputable blogs" that talk about how the PS4 has a faster memory architecture. A few minutes of googling should find you a number of examples.

2. Except the hardware hasn't changed since then so... what's your point?

What are you talking about? I didn't say the hardware changed, I said the performance changed from that article because of the upclocking they did. Heck, that's to the Xbox's advantage so I don't see why you'd try to dispute it. You pointed to DirectX 12 and cloud computing in your post; they don't change the hardware either but they certainly do affect performance. Removing the Kinect requirement doesn't change the hardware in the box but it improves performance as well (if the developers opt to take advantage of it).

3. Because they don't have a separate API they are trailing?

Because MS doesn't currently have a low level API for the Xbox (it uses DirectX 11.x+), they are trailing Sony in that the PlayStation 4 launched with a low level API. I don't really see how that's debatable. If I have a car and you don't then clearly you're trailing me in having a car. Maybe you'll eventually get a car that's better than mine, but right now the fact that I have one at all and you don't means I'm winning at having a car. How can you not understand that?

What does GNM offer that DX doesn't?

A low level API. Do you even know what an API is? GNM is a low level API; it's very close to the PS4 hardware because it's made specifically for the PS4 and doesn't have to be compatible with other hardware. DirectX 11.x and earlier, as well as OpenGL, are high level APIs. They have high levels of hardware abstraction so that they may be used across a variety of different hardware and so that they automate tasks for programmers, making development quicker and easier. This abstraction and automation comes with a performance hit though.

What specifically does GNM offer? It offers developers the ability to manage their own memory and to split tasks across multiple processors more effectively than DX11, for one. Those are two advances that are rumored to be in DX12, but Sony has them today. DX12 will no doubt be great. I'm a PC gamer first and I can't wait for it to come out because I expect PC games to benefit from it greatly. The Xbox One will benefit from it as well, I'm not disputing that one bit. It looks like it's going to be much better than DX11; however, the point I dispute is that it's going to somehow push the Xbox One past the PS4 in performance.

A lot of what DX12 does will likely be what GNM already does. I'm sure there will be new things as well though, I'm not saying it's just catchup, again I'm a fan of DX12, but whatever gaps there are that the PS4/Xbox One hardware can utilize I'm sure Sony will add to GNM. They don't have to wait for some committee to approve it. Both consoles will have low level APIs, so that's not an advantage for MS. In fact DX12 still has to be made abstract enough to work on different types of hardware; GNM ONLY has to work on the PS4, so they can theoretically go lower than even DX12 can. Plus DX12 is also rumored to have a bunch of mobile stuff that probably isn't relevant to the Xbox One and PS4 hardware, so Sony isn't going to add that kind of stuff to GNM.

And because cloud computing exists elsewhere it's suddenly not going to do enough to make it "better" than the PS4. That right there is one of the arbitrary standards I'm talking about. Why is the bar "better than the PS4"?

My initial post in this thread was in response to other people claiming that the eSRAM, DX12, cloud computing, etc. were going to allow MS to catch up to or even surpass the PS4 in performance. I didn't set any "arbitrary standard". Just like with you, apparently, having people do that annoys me and that's what prompted me to post. If no one in this thread had made that absurd claim I would have never commented. My initial comment was just to RESPOND (not initiate) to those people with why that was not the case. You then responded to my post, I didn't start this by responding to yours. You attacked the validity of my post and I have demonstrated that what I said was in fact correct. Hopefully you've learned something; at the very least you now know that the PS4 doesn't use OpenGL.

Of course the PS4 is relevant to this discussion, as it's not the one with the major performance issues. Microsoft wouldn't have jettisoned the Kinect and reassigned the GPU power if it didn't need to in order to compete.

Talk to some pc people and tell me the ps4 doesn't have performance issues. It's all about perspective.

The X1 doesn't have performance issues, it's just less powerful than the ps4 in regards to the gpu. Issues would point to some kind of fault or problem. That is not the case here. The X1 and PS4 have defined hardware that you can look at to get an idea about possible performance.

You're exactly right that MS improved their sdk and opened up more resources in order to offer a better experience to end users. However, claiming that these steps are only taken because of problems is not as clear. Sony is also working to improve its sdk/resource management, even though some would claim they are doing just fine on that front. Considering that MS themselves claimed at launch that improvements like the ones we see now were coming, it leads me to believe both companies are following the natural progression of console hardware.

 

 

So are you assuming that developers/publishers aren't going to use cloud computing unless it's free? Cloud computing is the future, MS is right to embrace it. You radically underestimate its benefits if you think people aren't going to use it unless it's free. During this console generation I suspect most games will move to the cloud. Non-Xbox Live game clouds are going to have to be developed to support PC games (Windows, Mac, and Linux), to support mobile games, etc. There is every reason to believe they will exist for PlayStation as well without Sony having to provide them for free. The cloud is the future of computing in general, not just MS, not just consoles.

I am stunned that anyone would actually try so hard to make MS' offer to developers seem so useless.

Of course companies will pay for services as needed, that still does not take away the value right now. That's all.

 

I'm not downplaying the cloud in general, I don't know how I can hype it more than I have. My point is just that it's not an xbox specific thing as MS fanboys would have you believe. I'm also not trying to downplay that it's cool of MS to provide cloud computing for free. That's not all good though, there are downsides. The reason MS provides it for free is because they effectively ban you from running your own servers (be they stand alone, your own cloud implementation, or a commercial cloud service such as Amazon).

Yes, it's not Xbox specific; that point is more than well known.

How exactly does MS ban you from running your own servers? Any developer can still build their own servers for other platforms. They can even still use Azure servers if they want, they just have to pay for that access, whereas that access is free for a game you're building for the X1. I haven't seen any evidence that MS forces a developer that makes a game for the X1 to sign something saying they are barred from using other servers for other platforms. Now if that does exist, then my opinion would change.

 

The same cloud server farms can be used across multiple games for multiple platforms (Windows, PS3/4, Mac, Linux, Android, iOS, etc.) and scale resources between games/platforms as demand requires. Only Xbox has to be separate, because MS and their "walled garden" approach to Xbox Live prohibits access to a public cloud service. Heck, if they didn't give it away for free I'm not sure many cross-platform title devs would bother to use it, since it's limited to Xbox only and let's face it, most titles these days are cross-platform.

I just don't see it as a bad thing. You do and hey, I can see your reasoning behind it, I just don't agree.

If a game developer wants to make a game for the X1, they get free access to the tools and infrastructure. This is especially great for smaller developers. Also, do we know for a fact that MS does not allow developers to use their own cloud tech with the X1? I find it hard to believe that they actively bar developers from doing that. Of course services that MS provides through XBL require whatever servers you are using to hook into XBL, but then that is no different than a developer tying servers to Sony's PSN servers to access its features.


A low level API. Do you even know what an API is? GNM is a low level API; it's very close to the PS4 hardware because it's made specifically for the PS4 and doesn't have to be compatible with other hardware. DirectX 11.x and earlier, as well as OpenGL, are high level APIs.

Xbox doesn't use the vanilla DirectX API; it never did.

1. You are assuming that the DDR3 RAM has to go through eSRAM, which is wrong as far as I know.

No, I'm not. At no time did I say that.

No one here claimed that Sony can't do whatever MS is doing with Xbox Live Compute.

My initial comment on this thread was in part to make the statement that PS4 developers/publishers could use cloud computing as well and that it wasn't an Xbox One specific thing, as many people seem to believe. If we all agree on that then there is no debate, but that's not what happened. People replied to my post refuting my statement, which effectively does mean they are trying to say Sony can't do it. I thought it was a pretty straightforward statement on my part as well, but there seem to be several people here who want to try to debate it.

As of now Sony doesn't have it

They have no reason to have it. Cloud computing is already a highly competitive field in IT. Sony doesn't have to make their own competitor; they allow developers/publishers to use whichever one they like (including MS's Azure) or even make their own. Cloud computing hosts are here now, they were available at console launch. No one has to wait on Sony. If you're a developer and you want to make your multiplayer servers on Amazon's cloud today for the PS3/4, Windows, Linux, Mac, etc., you can. If you want to set up your own cloud server farm and have the money to do so, you can. Embracing the cloud isn't going to happen overnight but I expect it will be a big thing for BOTH the Xbox One and PS4 (as well as PCs, mobile, etc.) during this console generation. If you think cloud computing isn't going to take off on PlayStation unless Sony makes it then you're sadly mistaken.

Sony will never be able to match the "Free" part of Azure given that they will be using a cloud that doesn't belong to them. They may even end up using Azure like Apple does for iCloud.

You're right, Sony will never be able to match the "free" part, but one of the big advantages of cloud computing is that it's cheap anyway; you typically only pay for what you actually use and there is often a free tier to get you going before any charges apply. Do you seriously think developers aren't going to embrace cloud computing unless it's free? That's where the future is, it's cheap, it's scalable, and just about everyone is going to be using it in large scale applications.

Sony doesn't want to own cloud computing, there is no reason for them to; they can let the big dogs compete and developers are free to use whatever service they want. Maybe Sony will use Microsoft's public Azure cloud for some of their first party games. Maybe EA will make their own cloud for all of their games to share. Maybe Ubisoft will use Amazon's cloud for this title and Google's for the next. Sony doesn't limit developers; sure, they're going to have to pay something, but cloud computing is comparatively cheap and Sony is giving developers the freedom to use or create whatever they want.

MS gives developers a cloud solution for free but that's your only cloud option. If you don't like it, tough. You can't make your own, you can't use any other cloud computing competitor, it's Xbox Live Compute or nothing. Is it good that it's free? YES!!! I'm not trying to say that it isn't, but there are downsides, and the fact that it's not free for other platforms isn't going to stop them from going to the cloud. If a developer wants to make a PC version of their game and release it on Steam they're going to have to either drop cloud support from the PC version or create/use something other than Xbox Live Compute. Whatever they choose for PC they can use for PS3/4 too, as well as Android, iOS, etc.

Yes, it's not Xbox specific; that point is more than well known.

There are A LOT of people who don't seem to understand that. Making that "well known" point was the only purpose of my initial post in this thread (related to cloud computing; I did cover other topics as well) and yet people seem to have taken issue with it. You've apparently turned that seemingly simple statement of fact into some perceived attack on cloud computing in general and/or MS Xbox Live Compute in particular. That was NEVER my intent. My intent (with respect to cloud computing specifically) was just to say that PS4 developers can and almost certainly will use cloud computing this generation as well, so it's not a differentiator that will allow the Xbox One to exceed the PS4 in performance. That's it. Not that cloud computing is bad, not that Xbox Live Compute is bad, not even that the Xbox One as a console is bad.

How exactly does MS ban you from running your own servers?

Microsoft bans Xbox game developers from running their own servers on their own networks for Xbox games. Microsoft bans Xbox game developers from using publicly available server/cloud providers to host their Xbox One games. Sony does not. Developers for PS3/4 games can host their own servers or have them hosted by public providers. That is the distinction I was trying to draw, I apologize if that was somehow not clear.

I just don't see it as a bad thing. You do and hey, I can see your reasoning behind it, I just don't agree.

I'm not trying to get you to agree. Everyone is free to make up their own mind, but contrary to what you might believe there are A LOT of people out there who seem to think that "The Cloud" is some huge Xbox only advantage that MS has. You don't seem to be one of them and that's fine, because it wasn't your post I replied to. I'm just trying to set the record straight and people can make their own decisions. If you prefer the "walled garden" approach, if you love Kinect, if you think MS has the best exclusive game franchises, that's all fine; I'm not trying to change your mind. If you want to claim that the Xbox One has a faster overall memory or graphics subsystem, or that the cloud is something the Xbox One has over the PS4, or that DX12 is going to boost Xbox One performance beyond what the PS4 hardware is capable of, or that Sony isn't going to enhance performance on the PS4 as well, then we've got a debate.

If a game developer wants to make a game for the X1, they get free access to the tools and infrastructure. This is especially great for smaller developers.

You may already be aware of this but many of the public cloud providers have free levels of service that are likely sufficient for even the smallest developer to develop their titles on at no charge. It's only when the resource use goes up that they have to pay, but it's comparatively cheap compared to running your own servers. Heck, I have friends that host PC Minecraft servers on the Amazon cloud. This is very approachable. Again, I'm not trying to put down Xbox Live Compute. Maybe I'm mistaken but I get the impression that people here think there is some expensive barrier to entry for cloud computing, but there isn't. The price will go up when your servers get hammered, because if your servers are getting hammered you're selling lots of games and can probably more than afford the increased cloud costs and still have a very healthy profit... even if you are a small indie. When your game's popularity wanes your costs automatically go back down and you didn't have to invest in a ton of expensive hardware. You pay for only what you actually use; that's one of the big benefits of the cloud, it's CHEAP. MS giving it away for free is GREAT but it's not like there's a HUGE pay barrier for others that MS is bypassing.

Also, do we know for a fact that MS does not allow developers to use their own cloud tech with the X1?

I'm not sure what you mean by "cloud tech", to answer your question. I know for a fact that if EA or Ubisoft, for example, were to build their own public cloud data centers, MS would not allow Xbox One clients to connect to them because they are not hosted on MS's Xbox Live network; Sony will allow publishers/developers to do that. I know for a fact that if developer X builds their game servers on Amazon's public cloud, MS will not allow Xbox One games to connect to it; Sony will allow this for PS3/4. Did that answer your question?

Of course services that MS provides through XBL require whatever servers you are using to hook into XBL, but then that is no different than a developer tying servers to Sony's PSN servers to access its features.

There is a HUGE difference between XBL and PSN. On XBL, with respect to gaming, EVERYTHING has to be hosted on the private XBL network and no clients from outside the XBL network are allowed access. On PSN in the PS3 era NOTHING was HOSTED on the PSN network, and in fact there is no PSN "network" any more than there is a Steam "network". XBL actually has a private network (that connects to the internet) while PSN/Steam just provide a set of services over the public internet.

The PSN services appeared over the life of the PS3 (trophies and such) and during the PS3's life they were completely optional. With the PS4 there are some services that are now mandatory, but they are infrastructure services like trophies and cross game chat (if your game has chat). This was necessary because in the PS3 days developers/publishers had to roll their own chat solutions, so they were incompatible. Now Sony provides that service so everyone uses the same thing, but the games themselves are still hosted by the developers/publishers (if they aren't peer to peer).

That's why PSN was free during the PS3 era: there wasn't a network to maintain, nor were there any mandatory servers Sony had to provide. Now, however, since they have to maintain the common chat servers and other such infrastructure, they are charging for multiplayer. They still don't have their own network though; it's just a set of rules like: if you want to release your game on PS4 and it has voice chat, then connect to this Sony provided chat server and use this API so everything works across games. You can use the same host servers for other platforms, but of course you'll have to use a different chat system for non-PS4 clients.
