Nvidia presentations leak


Plus, with Doom 3, it may be better to think of it as one engine instead of one game. id's engines have a nasty habit of being used more than Vietnamese street walkers...  :shifty:

This is the most valid statement I have read in this thread.

OMG, DOOM III is optimized for Nvidia cards, of course they are better.

ATI could make the same slides, but then for HL2

I guess the slides are fake, or a joke, but if they are real, then Nvidia is pretty childish

Didn't Carmack say that if he took out the ATI optimizations, Doom 3 would run horribly? o.O

Motherboard is a Supermicro dual Intel Xeon "Nocona" board, 800MHz FSB. It has twin CPU sockets and PCI Express (NOT PCI-X, as someone said previously) slots - one x16, one x4 - that allow you to use twin NVidia cards in "SLI" mode. The exact model is one of the X6DA?-G? series, e.g. X6DA8-G2. It has eight (not six) memory slots, for a total of up to 16GB of DDR2-400.

If money was no object, this would be my next motherboard. :)

Thanks for the info, Doctor. :) And thank you, Radium, for backing me up. :)

Get the AGP version if you have a good board, chip, etc., and don't plan on buying a new computer in the next two years.

Get the PCI Express version if you're buying a new system, and make sure you get a PCI Express board (duh!), etc.

I'm not an ATI or Nvidia fan; I just buy what's better for my dollar at the time of purchase. ;)

P.S - Let the flaming begin! :devil:

Despite the parts pushing Far Cry and the talk of Doom 3 superiority (along with the hype around it), it used a lot of genuinely valid arguments (though not presented in the best way).

no ****... that doesn't look like anything that would come from nVidia. Every slide they have ever released has been very proper, with solid information, not "WE R0XX0R3 MORE!!11!!" like these slides show them doing. Complete bull**** in my opinion. I think this topic should really be seen as nothing more than a sad individual's attempt to start a flame thread; official request to lock

*note i'm not a fanboy of ATI and I don't have time to explain why I like nVidia*

Nvidia has done this in the past. They are the most unprofessional, conniving, unethical company I have seen to date, aside from Enron. They lie to their customers on a daily basis and distort facts to their benefit. Their only interest is profit.

ATI also has the same sort of presentations, and they have also cheated and used "optimizations"..

stop trying to think the ATI people are some sort of saints... give it a rest...

ps: you think ATI isn't mainly interested in profits too? Newsflash: that's the main purpose of every company, or do you think shareholders just want to give people the best products to please them... :laugh:

What an original argument. :rolleyes:

No, ATI has not had, nor does it have, presentations anywhere near this level of absurdity. Do you not remember the leaked 'Save the Nanosecond' slides from ATI? Yes, they are a company and as such like profits, but the difference is they conduct their business in a professional manner and don't blatantly lie to consumers or cheat in benchmarks. Don't give me that "oh, but ATI cheated in Quake 3" crap; it has no merit. There is a difference between optimizations that sacrifice image quality in exchange for benchmark performance for the sake of advertising, and optimizations that improve gameplay overall. Nvidia's only argument is Doom 3, because that is the only popular game they excel in, considering it was developed using their technology and runs on OGL.

Get a clue.

You really can't accept it, can you...

I liked the part where you said such gems as:

"No ATI has not had nor does have presentations anywhere near this level of absurdity"

"Yes they are a company and as such do like profits but the difference is they conduct their business in a professional manner and don't blantly lie to consumers or cheat in benchmarks."

You really think so, eh? lol... :rolleyes:

"Nvidia's only argument is Doom 3 because that is the only popular game they excel in considering it was developed using their technology and runs using OGL."

Wrong, perhaps you meant to say that Doom 3 is the only game where NVIDIA totally trashes ATI, which is kind of different...

Also, maybe ATI should make decent OpenGL drivers, no? It's not like it's anything new that ATI's OpenGL driver performance is far from stellar, but hey, you probably think crap like CCC and A.I. are a better way to burn resources, right?

BTW, if you think it's small, look again, and keep in mind that the Doom 3 engine will (like previous id engines) be used in many future games. It's probably not as insignificant as you think for ATI to be releasing proper hotfix drivers for it and to keep promising new drivers with performance boosts for that specific game..

-

open your eyes and stop seeing Nvidia as the root of all evil... you sound like Bush blaming terrorism for everything that is bad...

I could have agreed with you, if only you hadn't said:

crap like CCC and A.I.

:pinch:

edit: About the issue of id's engine becoming the craze, I don't think it will be as big as their previous engines. No outdoor environments; imagine how many games are going to use it besides Quake 4. And don't forget HL2's Source engine has already been licensed more, even before its release.

No, ATI has not had, nor does it have, presentations anywhere near this level of absurdity.

AND HOW THE **** ARE WE SUPPOSED TO KNOW THAT?

If this or any other Nvidia presentation DIDN'T leak, we wouldn't have known about it. How can YOU possibly know that ATi doesn't have the same or even worse words describing Nvidia?

You DON'T!

You really can't accept it, can you...
Oh, the irony is strong with this one.
You really think so, eh? lol...

Is that the best you can do? Where's your proof? Where are the slides from ATI that counter what I said?

Wrong, perhaps you meant to say that Doom 3 is the only game where NVIDIA totally trashes ATI, which is kind of different...
Ahahaha. Maybe you should stop concentrating on being a fanboy and realize what you just said: 5-20 fps more can hardly qualify as 'trashing ATI'.
Also, maybe ATI should make decent OpenGL drivers, no? It's not like it's anything new that ATI's OpenGL driver performance is far from stellar, but hey, you probably think crap like CCC and A.I. are a better way to burn resources, right?

Oh, all I ever hear from nvidiots is how Nvidia's drivers are better at this and better at that, blah blah blah. Now that ATI is actually working on their drivers and adding features, you claim it's a waste of resources. rofl, pathetic.

BTW, if you think it's small, look again, and keep in mind that the Doom 3 engine will (like previous id engines) be used in many future games. It's probably not as insignificant as you think for ATI to be releasing proper hotfix drivers for it and to keep promising new drivers with performance boosts for that specific game.
Nothing but unsubstantiated predictions. So far, only one other game that we know of is using the Doom 3 engine (Quake 4). Take a look at the market before mouthing off, will you? Find the percentage of games using D3D versus OGL and come back with a valid argument. Need I also remind you of Unreal Engine 3, which uses D3D and quite frankly puts the Doom 3 engine to shame. Yes, it would be in ATI's best interest to run both platforms well, but OGL isn't a huge problem area at this point. It's not like ATI hardware can't run Doom 3; hell, I had no problems playing it on a 9500 Pro at High detail, 1024x768, with an average of 35 fps. An X800 Pro/XT is around three times as fast as that, which is more than enough. And it still stands that X800s are faster in D3D while drawing less power than even a 9800 XT.
open your eyes and stop seeing Nvidia as the root of all evil... you sound like Bush blaming terrorism for everything that is bad...

If these slides aren't enough to make you stop and think, then you're a lost cause.

AND HOW THE **** ARE WE SUPPOSED TO KNOW THAT?

If this or any other Nvidia presentation DIDN'T leak, we wouldn't have known about it. How can YOU possibly know that ATi doesn't have the same or even worse words describing Nvidia?

You DON'T!

http://pwp.netcabo.pt/0239863201/Save_The_Nanosecond.ppt

These are slides that did leak from ATI a few months back. Read it and stfu.

Edited by ANova
edit: About the issue of id's engine becoming the craze, I don't think it will be as big as their previous engines. No outdoor environments; imagine how many games are going to use it besides Quake 4. And don't forget HL2's Source engine has already been licensed more, even before its release.

Doom 3 is capable of the same style of outdoor environments employed by earlier engines like Quake 3. In fact, the map format is not THAT much different from Quake 3's, and skyboxes are already shown off inside the Doom 3 game itself. True, it's no fancy tropical island like Far Cry, but we can get the fenced-off areas we all enjoyed in games like Call of Duty again ;) Possibly a tiny bit bigger, too

@ANova: I was about to mention the "Save the Nanosecond" presentation. A fun read, and I take it with as much of a grain of salt as I take any leaked document.

You are right, there were many leaked presentations. But barring one, they did not trash Nvidia, and hence did not get this kind of attention.

I dunno, they said some nasty things in "Save the Nanosecond"

Things like encouraging developers to treat all NVIDIA hardware as DX9 hardware, and not bashing NVIDIA too much because the fans would do it for them.....

This topic is now closed to further replies.