Crysis Single Player Demo Released!

Wow, wow. Just amazing.

So far, daddy likes. I just installed the demo a couple of hours ago and it's very impressive, so realistic that it's just...

I was able to run everything on High with no sweat, like the rest of you. Very High would probably bring my PC to its knees, so maybe next year if I upgrade, but so far it's nice to see we're able to run this game with respectable graphics and speed.

A couple of graphical glitches here and there, but overall it's very cool, except for the boring first minutes (space bar all the way).

I kinda like it when you move the camera very fast and you get some sort of high-speed blur; the same happens when you put your gun in sniper mode.

A couple of graphical glitches here and there, but overall it's very cool, except for the boring first minutes (space bar all the way).

I personally loved the part when you're in the plane; it felt like a movie. The music and that whole scene in general were badass.

Don't even try that, ANova. SM2 and SM3 are completely different. These features, on the other hand, are exactly the same, not different in any way. This situation is completely different from what you're trying to compare it to, so don't even pull that crap.

Nice to see you went ahead and upgraded your graphics card, though... so I'm sure you forgot to 'fight the power' and went ahead and installed Bioshock ;)

Funny that you can see something one way and ignore the obvious another way. SM2 and SM3 completely different? Is that why they are subsets of the same API? You think light rays or any other shader effect cannot be achieved in SM2? Wrong. Crysis is touted as the most advanced game ever, yet it can run on my SM2 hardware. The hacks for getting Bioshock to run on SM2 hardware look very good, minus a couple of effects here and there that would require the shaders to be rewritten, yet you cannot seem to comprehend the situation. Amazing.

I haven't bought Bioshock, if that's what you're getting at. No, I upgraded to run games like this and ones in the future -- games that could RUN on my old system, but at reduced settings. The way I see it, if devs don't want to support hardware, I don't buy their product. Simple, really. Crysis runs on hardware much older than the systems Bioshock won't run on at all.

Funny that you can see something one way and ignore the obvious another way. SM2 and SM3 completely different? Is that why they are subsets of the same API? You think light rays or any other shader effect cannot be achieved in SM2? Wrong. Crysis is touted as the most advanced game ever, yet it can run on my SM2 hardware. The hacks for getting Bioshock to run on SM2 hardware look very good, minus a couple of effects here and there that would require the shaders to be rewritten, yet you cannot seem to comprehend the situation. Amazing.

I haven't bought Bioshock, if that's what you're getting at. No, I upgraded to run games like this and ones in the future -- games that could RUN on my old system, but at reduced settings. The way I see it, if devs don't want to support hardware, I don't buy their product. Simple, really. Crysis runs on hardware much older than the systems Bioshock won't run on at all.

Oh. My. Goodness.

How dense are you, ANova? You're really trying to go to great lengths here to lend some sort of credence to an argument that lacked any.

The shader model argument we had was over whether or not dated shader model technology should be supported. What I am saying now is that clearly DirectX 10 is not what we've been led to believe, and that DX9 can do everything DX10 can. How on earth are you connecting the two here, other than the fact that they both involve games and they both involve technology? :rofl:

I never once said the shader models are completely different, so again, stop trying to put words in my mouth. As I've told you before -- if you want to say something silly, don't attribute it to me, attribute it to yourself, because I said no such thing. The arguments are completely different. Read the post and you'll understand that. Shader Model 3.0 is not possible on Shader Model 2.0. You can remove features, sure, but that's not the same thing we're talking about here.

And again -- I never once said that all games are going to adopt Shader Model 3.0 as the standard. I said that, eventually, you will see more and more games adopt the standard and drop support for 2.0. Given the large number of Unreal Engine 3 licensees, I'm pretty sure I've already been proven right on that count -- wouldn't you agree?

Again, ANova, I can perfectly comprehend the situation, but you're changing the argument completely to suit yourself and make yourself look like less of a fool than you've proven to be. Yes, they could have supported 2.0. I never once denied that. Please go look at the posts instead of grasping at straws, buddy. I said they didn't need to waste development resources on something ATI chose not to support, which limited their customers. Clearly you eventually saw the light, because you realized ATI screwed you over and upgraded :laugh:

I like how you keep saying you haven't "bought" Bioshock, yet you never deny playing it on your computer, and then you say that you won't buy their product. Now, it's none of my business what your shopping habits are and, honestly, I don't care, but stop dodging my question -- did you or did you not INSTALL Bioshock on your computer? I don't care if you used a friend's copy; answer the question.

Please, don't turn this into another little debate for your campaign for developers to support rapidly aging technology. This isn't the same debate at all. What I said was that clearly DX10 "features" aren't really DX10 -- at least in Crysis, and that DX9 runs the game far better. That is absolutely nothing at all like the debate we had, so don't even try to pretend it is.

I decided to try the demo and wow, I was pleasantly surprised! While it runs at a very low resolution, it still looks impressive! :) :happy: I was expecting it to crap out and not run, or run really slowly. The only problem is my on-board sound chip can't handle the game; it stutters like crazy. I'm not too keen on buying a new sound card for one game (did that once before and it ended up being a waste of time...), so I probably won't get it :( Even so, the game rocks! :)

I was able to run everything on High with no sweat, like the rest of you.

What specs are you running at, and what in-game resolution? Look at my signature system... I cannot run on High :(

The only problem is my on-board sound chip can't handle the game; it stutters like crazy.

Never mind, the sound problem seems to have mostly solved itself (it still does it, but only rarely now)... This game pushes my system very hard, but damn, it's good! :woot:

Oh. My. Goodness.

How dense are you, ANova? You're really trying to go to great lengths here to lend some sort of credence to an argument that lacked any.

Dense certainly describes someone here, but it's not me. Your arguments have been nothing but incorrect FUD and I seriously doubt you've ever understood my point.

The shader model argument we had was over whether or not dated shader model technology should be supported. What I am saying now is that clearly DirectX 10 is not what we've been led to believe, and that DX9 can do everything DX10 can. How on earth are you connecting the two here, other than the fact that they both involve games and they both involve technology? :rofl:

Uh, gee, maybe because the situation is exactly the same? I've said SM2 can do everything SM3 can, oh, I don't know, 500 times now! You think DX10 is no big deal? Well, news flash! DX10 is a much bigger deal than SM3 ever was. Period.

I never once said the shader models are completely different, so again, stop trying to put words in my mouth. As I've told you before -- if you want to say something silly, don't attribute it to me, attribute it to yourself, because I said no such thing. The arguments are completely different. Read the post and you'll understand that. Shader Model 3.0 is not possible on Shader Model 2.0. You can remove features, sure, but that's not the same thing we're talking about here.

Oh really, you never said that? Hmm, funny, because I have you quoted saying exactly that in my original response.

SM2 and SM3 are completely different.

You sure are silly. Shader Model 3 effects ARE possible on Shader Model 2; it just requires slight modifications to the shader code. No ifs, ands, or buts about it. That is what I've been saying all along, and that is what you've been denying all along. You, my sad friend, are stuck in ignorance.

You think ATI limited their customers; I think the developers did, since it was they who made the product that sparked this debate and they who chose not to support a more widely used format. ATI cannot make decisions for developers in the future. Besides, if one format can do everything another format can, how exactly is one limited, other than by whichever format the developer up and decides to support over the other? Oh, that's right -- you consider SM2 inferior and incapable in every way just because it's older. Therein lies your futility.

:rofl:

ANova, clearly you were mistaken in your interpretation of my post and now you're just carrying on and speaking for me. Don't. Don't speak for me, ANova, because it is abundantly clear you don't know what I'm saying. Re-read my post and try again.

I did not say SM 2.0 can do everything 3.0 can... because it can't. I never said DX10 isn't a big deal... because clearly it's been made into a big deal by Microsoft.

Prove to me where Shader Model 2.0 can do everything 3.0 can. I've never once seen that to be the case. It can be implemented in a game that has 3.0 support -- I've never argued that -- but it is not the same as 3.0. With DX10 in Crysis, however, there is no difference whatsoever that we've come to see in the demo. None. That is completely different.

You're saying 2.0 could have been implemented in a 3.0 game. I never argued that once, so don't imply that I did. Again, re-read the post and come back and respond.

While you're at it, please take the time to address my questions. I seem to remember one specifically about Bioshock in my previous post. :laugh:

Newsflash, ANova: developers choose to support new technologies and drop support for old ones. They do so because of development costs, time constraints, and manpower issues. If you want to continue on in your little moral crusade, do it in another topic. Stop wasting a perfectly good topic on Crysis with your filth that clearly you don't even believe -- or else you never would have upgraded or installed/played Bioshock ;)

Damn, I'm getting a lot of artifacts and glitches. I have an ATi Radeon HD 2900 XT with Catalyst 7.10 drivers. I've Googled the issue and a lot of ATi users are getting this problem. I read that the Omega drivers let you play the demo glitch-free.

I have played through the demo many times. I am running it at 1680x1050 with AA off but forcing AF at 4x, with everything on High. It looks great. The game is a lot more open than I first thought. I love sneaking up to those logs, shooting the support and having them roll down and crush two guys. As I improve, I find myself aiming for trees to have them fall on people. I have gotten through the demo a new way each time. Hopefully some driver optimization will get me 5-10 more FPS, but right now it is more than playable on my 8800 GTS.

Damn, I'm getting a lot of artifacts and glitches. I have an ATi Radeon HD 2900 XT with Catalyst 7.10 drivers. I've Googled the issue and a lot of ATi users are getting this problem. I read that the Omega drivers let you play the demo glitch-free.

I'm also getting artifacts on my NVIDIA card, even using the latest beta drivers...

Screenshots.

I'm also getting artifacts on my NVIDIA card, even using the latest beta drivers...

Me too! :/

Also, I found a way of running Crysis perfectly smoothly on my C2D E6600 with an overclocked 8800 GTS... Wait for it...

I run it at 1280x768 windowed! It runs like butter with all settings on High. :p But this guy Cevat Yerli, or whatever his name is, should be kicked in the ass for saying that I'd be able to run on High settings at 1680x1050 -- his own reply ages ago in the Crysis forums.
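If anyone wants to set the same windowed resolution without digging through menus: CryEngine 2 reads console variables from a config file in the game folder. A minimal sketch -- the cvar names (`r_Width`, `r_Height`, `r_Fullscreen`) and the use of an `autoexec.cfg` in the demo's install directory are my assumptions from CryEngine conventions, so verify them in the in-game console before relying on this:

```
-- autoexec.cfg (assumed name/location: demo install folder)
r_Width = 1280
r_Height = 768
r_Fullscreen = 0
```

If the file is picked up, the game should start windowed at 1280x768; otherwise, the same cvars can be typed into the console directly.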

:rofl:

ANova, clearly you were mistaken in your interpretation of my post and now you're just carrying on and speaking for me. Don't. Don't speak for me, ANova, because it is abundantly clear you don't know what I'm saying. Re-read my post and try again.

I'm not carrying on. I made a few clear and concise responses to your long, drawn-out paragraph. It seems more like you're going out of your way to prove something to yourself. You ignore most of what I say, and you continue to deny what is fact. I've answered your question; whether or not I installed Bioshock does not matter, no matter how much you try to distort it in an attempt to make me look foolish. The real fact, bub, is that Bioshock is a ring in a pool: just because it was designed one way does not mean it is impossible to design something another way.

End of story.

I'm also getting artifacts on my NVIDIA card, even using the latest beta drivers...

Me too! :/

You guys are lucky to even see more than 50% of the demo. The artifacts I get are extremely bad: all I see is a black pattern flickering on my screen. I can see shreds of the soldiers' arms and some other post-processing effects, like the water on your screen. I'll try it in Vista whenever I can, but I have a feeling I'll get the same results.

Wow. The demo is so hardware-intensive. I didn't get smooth FPS @ 1680x1050 (all settings set to High, no AA). I tried the "Very High" settings to check out some DX10 visuals, and I'm not happy. I thought my rig (see sig below) could handle Crysis with DX10 settings enabled. I just hope the full game runs better than this crappy demo. :|

On the other hand, the UT3 demo played flawlessly with all settings maxed out @ 1680x1050 (4xAA/16xAF).

On the other hand, the UT3 demo played flawlessly with all settings maxed out @ 1680x1050 (4xAA/16xAF).

UT3 is doing quite a bit less than Crysis on the technology side. A hell of a lot less, and in smaller environments. It's not comparable.

ANova, you know, anything can be rendered using software rendering, even DX55. Why won't you fight for that too? :rolleyes: I want to play Crysis on my Voodoo5 PCI 1MB Cirrus Logic crap, ffs!

I don't know if I even want to try this on my 7800GT O_o

Trust me, you don't want to try it. :ermm:

I played it today, and the only way I can get smooth framerates is with everything on low (which sucks, because I really wanted to see the game in full detail).

But even on low it looks good, so I'll probably buy it.

Can someone tell me how I can run it in DX9 mode in Vista? I have a DX10 card but want to see what the performance is like. I've seen some sites telling me to right-click the .exe and select a "run in DX9 mode" option, but I don't see it. :s
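I don't see a right-click option either, but a workaround people have reported is launching the 32-bit executable directly with a DX9 switch. A sketch of a shortcut target -- the install path is assumed (adjust to wherever your demo actually lives), and the `-dx9` argument is what I've seen reported for Crysis rather than something I've verified myself:

```shell
"C:\Program Files\Electronic Arts\Crytek\Crysis SP Demo\Bin32\Crysis.exe" -dx9
```

Easiest way to use it is to make a copy of the desktop shortcut and paste that line into its Target field.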
