huber Registered User


Join Date: Dec 2005
Posts: 22
Trade rep: 0 (0%)
Infractions: 0/0 (0)
did NVIDIA "break" DX10.1 in AC?! huber Jun 3rd, 08, 04:49 PM #1

According to this article, we can assume that NVIDIA is pressuring game developers to remove DX10.1 support, or not implement it in the first place, so that ATI doesn't get the advantage. Very, very dirty and upsetting!

http://www.anandtech.com/showdoc.aspx?i=3320&p=6


IversonGarnett (x3m)
Elder


Join Date: Jan 2004
Location: SG/SENGKANG
Posts: 2,969
Trade rep: 18 (100%)
Infractions: 0/0 (0)
IversonGarnett (x3m) Jun 3rd, 08, 04:53 PM #2
wah...old news XD
sutyi
fubar...


Join Date: Nov 2005
Location: Budaörs, Hungary.
Posts: 258
Trade rep: 0 (0%)
Infractions: 0/0 (0)
Thanked 1 Times in 1 Post
sutyi Jun 3rd, 08, 05:36 PM #3
Quote:
Originally Posted by IversonGarnett View Post
wah...old news XD
Pretty much. Solution: don't patch if you had no problems with the game.

The Way It's Meant to be Paid, as Fudzilla put it a few days ago.
Ish718
Registered User


Join Date: Dec 2007
Posts: 616
Trade rep: 0 (0%)
Infractions: 0/0 (0)
Ish718 Jun 4th, 08, 12:02 AM #4
NVIDIA IS the top dog in the GPU market, so they can push developers around sometimes, or all the time. O_o
fLiNtY insane in the mainframe


Join Date: Oct 2007
Location: Parts Unkown
Posts: 112
Trade rep: 0 (0%)
Infractions: 0/0 (0)
fLiNtY Jun 4th, 08, 12:55 AM #5
Yeah, but people lose the picture here. As the Crytek chief said (the developers of Crysis, best graphics I've seen so far): who needs DX10.1? Now that they'll be developing games for consoles, and the Xbox 360 and PS3 can emulate pretty much the DX10 stuff, why develop games for 10.1? Because this is a hardware and not a software update, they would need to write a DX10 path anyway so they can port it easily to the console market (or at least that's what my knowledge says). Even so, there is not THAT big of a difference between DX9 and DX10, and just read the additions in DX10.1 compared to DX10. I'm happy with my DX10 card; whether it's NVIDIA or ATI doesn't make a difference.

The problem is always the misinformation of the market and of users who don't know the technical details: "now with DX10.1 support - future proof!" Sure, WHEN DX10.1 content comes out. I seriously couldn't care less if NVIDIA also supported DX10.1, because you need to draw some lines between the PC and console markets, as today most PC games are also developed for the current consoles. That means not much DX10.1-supported stuff, and even so, I have a hard time seeing a difference between my games in DX9 and DX10 while playing, so the question is: will you even notice the difference from DX10 to DX10.1?

And also with the CUDA thing, can't NVIDIA implement the same 10.1 effects through the CUDA engine? Many don't know about the CUDA possibilities; I can highly advise people to get into CUDA, it's seriously awesome stuff (even if you hate NVIDIA - I hated them too - sorry ATi & AMD, but there are no great developments on your side).
Last edited by fLiNtY; Jun 4th, 08 at 12:59 AM..
NKd Registered User


Join Date: Apr 2007
Posts: 110
Trade rep: 0 (0%)
Infractions: 0/0 (0)
NKd Jun 4th, 08, 01:30 AM #6
fLiNtY, c'mon and realize what this topic is about. The topic is about how ATI's hardware was at an advantage once anti-aliasing was enabled, and Ubisoft removed the support for no apparent reason that would justify such a decision. It's not even about CUDA. We should all stay on topic and leave the fanboyism aside while discussing; everything seems to turn into ATI vs. NVIDIA.
Last edited by NKd; Jun 4th, 08 at 01:33 AM..
fLiNtY insane in the mainframe


Join Date: Oct 2007
Location: Parts Unkown
Posts: 112
Trade rep: 0 (0%)
Infractions: 0/0 (0)
fLiNtY Jun 4th, 08, 02:01 AM #7
I am on topic, and I gave an explanation with reasons why I believe this is happening. You want to have a discussion about "NVIDIA sucks because they pressure the developers and ATI" without any arguments as to why? If you say something, explain it; otherwise you are the one off topic.
N1truX
Registered User


Join Date: Jun 2008
Posts: 136
Trade rep: 0 (0%)
Infractions: 0/0 (0)
Thanked 1 Times in 1 Post
N1truX Jun 4th, 08, 03:38 AM #8
http://www.directcreed.com/?page_id=7 - Here you can sign a petition for DX10.1 in AC
fLiNtY insane in the mainframe


Join Date: Oct 2007
Location: Parts Unkown
Posts: 112
Trade rep: 0 (0%)
Infractions: 0/0 (0)
fLiNtY Jun 4th, 08, 04:28 AM #9
Quote:
Originally Posted by N1truX View Post
http://www.directcreed.com/?page_id=7 - Here you can sign a petition for DX10.1 in AC
LOOOL, why not start a "save AMD/ATi" petition while you're at it, it would be just as hilarious.
sutyi
fubar...


Join Date: Nov 2005
Location: Budaörs, Hungary.
Posts: 258
Trade rep: 0 (0%)
Infractions: 0/0 (0)
Thanked 1 Times in 1 Post
sutyi Jun 4th, 08, 04:32 AM #10
Quote:
Originally Posted by fLiNtY View Post
Yeah, but people lose the picture here. As the Crytek chief said (the developers of Crysis, best graphics I've seen so far): who needs DX10.1? Now that they'll be developing games for consoles, and the Xbox 360 and PS3 can emulate pretty much the DX10 stuff, why develop games for 10.1? Because this is a hardware and not a software update, they would need to write a DX10 path anyway so they can port it easily to the console market (or at least that's what my knowledge says). Even so, there is not THAT big of a difference between DX9 and DX10, and just read the additions in DX10.1 compared to DX10. I'm happy with my DX10 card; whether it's NVIDIA or ATI doesn't make a difference.

The problem is always the misinformation of the market and of users who don't know the technical details: "now with DX10.1 support - future proof!" Sure, WHEN DX10.1 content comes out. I seriously couldn't care less if NVIDIA also supported DX10.1, because you need to draw some lines between the PC and console markets, as today most PC games are also developed for the current consoles. That means not much DX10.1-supported stuff, and even so, I have a hard time seeing a difference between my games in DX9 and DX10 while playing, so the question is: will you even notice the difference from DX10 to DX10.1?

And also with the CUDA thing, can't NVIDIA implement the same 10.1 effects through the CUDA engine? Many don't know about the CUDA possibilities; I can highly advise people to get into CUDA, it's seriously awesome stuff (even if you hate NVIDIA - I hated them too - sorry ATi & AMD, but there are no great developments on your side).
Crysis is pretty much DX9, even with the Vista-exclusive Very High settings; there are some geometry shader instructions used, but you can count them on one hand actually.

Furthermore, no, consoles can't emulate DX10 methods and features, because the graphics chips are basically two-generations-old DX9 hardware, especially in the PS3, which has what is basically a cut-down 7900-series GPU. The chip in the 360 is a different story, since it has a unified shader architecture with DX9 capabilities, a tessellator and so on.

As for nV not giving a damn about DX10.1, well, that's simple marketing. They are on top of their game, and most developers won't implement it because the consoles don't support the features anyway. In titles that come with a DX10 path you simply won't have any AA, or you'll have AA that isn't applied properly, especially around lights, where you lose subpixel data and get jaggies. That's down to nV not having 10.1 hardware, so developers have two choices: either you get no AA in titles whose engines use deferred rendering methods (you can count almost every UE3.0 title here), or they use shader-based AA methods or deferred supersampling, which makes any hardware crawl. Plus nV can save the R&D cost, since they can sell their two-year-old architecture again with some minor adjustments.

DX10.1 hardware, including S3's newest budget cards, supports custom MSAA through multi-sample buffer reads and writes (MSBRW), so you can apply AA to the subpixels where you would normally lose information with standard AA methods.

Plus the fact that the buffer can be read and written at any given moment gives it an edge performance-wise.
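
To put what that buys you into code (a minimal sketch, not anything from AC's renderer - it assumes an ID3D10Device1 already created at feature level 10_1, the function name is mine, and a real engine would query CheckMultisampleQualityLevels before picking 4x): on 10.1 hardware you can create the MSAA depth buffer so that it is also bindable as a shader resource, and the lighting/AA pass can then read every subsample instead of a resolved average.

Code:
// Sketch: an MSAA depth buffer a D3D10.1 device can also read in shaders,
// so a deferred renderer keeps per-sample data instead of losing it.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

HRESULT CreateReadableMsaaDepth(ID3D10Device1* device,
                                UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView1** srv)
{
    // Typeless storage so the same memory can be viewed as depth (for
    // writing) and as a readable format (for sampling in a shader).
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = 4;                 // 4x MSAA
    td.SampleDesc.Quality = 0;
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    HRESULT hr = device->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view used while laying down the G-buffer.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // Shader resource view over the multisampled surface: the lighting /
    // AA pass can then fetch each of the 4 samples per pixel.
    D3D10_SHADER_RESOURCE_VIEW_DESC1 sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView1(*tex, &sd, srv);
}

On plain DX10 hardware that SHADER_RESOURCE bind flag on an MSAA depth target is exactly what you don't get, which is why the engine has to fall back to extra passes or shader-based AA.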

In AC there were no rendering anomalies, as the developers stated, but DX10.1 performance with AA was much better after SP1 came along, because applying AA took fewer passes.

If the DX10.1 support had come along with a patch for the game and had serious problems, far fewer people would complain about Ubisoft cutting it. But it came built in and had no problems, and Ubisoft cut it out with a crippling patch in exchange for a paycheck.

That's BS, no matter how much we quibble about it.
fLiNtY insane in the mainframe


Join Date: Oct 2007
Location: Parts Unkown
Posts: 112
Trade rep: 0 (0%)
Infractions: 0/0 (0)
fLiNtY Jun 4th, 08, 06:09 AM #11
LoL, you are right on some points, but others just make me laugh... Ever heard of the hardware emulation layer in the DirectX "architecture"? And sure, try to activate your little Crysis hack and "think" you got DirectX 10 on the high quality settings, then tell me how it's physically possible for your DirectX 9 GPU to show the motion blur effect (that's right, you can't, unless you could "make/emulate" that effect on top of standard DirectX 9 code). And there is a BIIIIIG difference between hardware utilisation and emulation. Two things can "look" alike, but the utilisation/rendering method would be completely different. So that makes you wrong again that it's mostly a DirectX 9 game (even though you could also say it's DirectX 8 etc., as DirectX is something like an "umbrella standard").

Also, with the power of CUDA I wouldn't be surprised if NVIDIA could "emulate" DirectX 10.1 functions, but again, from what I've read from game developers about DirectX 10.1, it isn't a noticeable step forward, and I would rather say it's even bad, because you need a hardware upgrade. I would recommend packing it all into version 11 and dropping 10.1, but hey, that would be bad for all those ATi DX10.1 hopers who think they will get some more shiny effects... if they hadn't read the additions and changes in 10.1, they wouldn't even notice it...
DaGamer!!!
Gaming's in my blood!


Join Date: Mar 2005
Posts: 2,788
Trade rep: 33 (100%)
Infractions: 0/0 (0)
DaGamer!!! Jun 4th, 08, 01:41 PM #12
fLiNtY, I have a feeling that IF the situation were reversed, that is, nV had DX10.1 cards and ATi only DX10, you'd be singing a different tune. nV's on top now, so I don't see any game developer pushing for DX10.1 after the AC fiasco... removing DX10.1 support instead of fixing it, shame on them!
DaRig!!! | i7 3960X + Corsair H100 | Asus R4E | 16GB Corsair Dom Plat 2133mhz | 2x Sapphire R9 290X 4GB CFX | SB Zx + Logitech Z5500 | Corsair 120GB ForceGT SSD (OS) | 2TB + 3x 1TB WDC Black | Seasonic X-1250 | Caselabs M8 | CM Novatouch + Logitech G502 | 64bit Win 8.1 Pro |
DaRig2 | AMD FX8350 + CM Seidon 120v | Asus CVF | 16GB RipJawX 2133mhz | 2x GTX670 2GB SLi | 120GB Force GT SSD (OS) | 2TB Hitachi Deskstar + 1TB + 2x 500GB WDC Blue | Asus Xonar Phoebus Solo + Logitech z623 | Enermax MAXREVO 1500W | Corsair 650D | KUL ES-87 Clear + Roccat Kone XTD | 64bit Win7 Pro SP1 |
sutyi
fubar...


Join Date: Nov 2005
Location: Budaörs, Hungary.
Posts: 258
Trade rep: 0 (0%)
Infractions: 0/0 (0)
Thanked 1 Times in 1 Post
sutyi Jun 4th, 08, 02:22 PM #13
Quote:
Originally Posted by fLiNtY View Post
LoL, you are right on some points, but others just make me laugh... Ever heard of the hardware emulation layer in the DirectX "architecture"? And sure, try to activate your little Crysis hack and "think" you got DirectX 10 on the high quality settings, then tell me how it's physically possible for your DirectX 9 GPU to show the motion blur effect (that's right, you can't, unless you could "make/emulate" that effect on top of standard DirectX 9 code). And there is a BIIIIIG difference between hardware utilisation and emulation. Two things can "look" alike, but the utilisation/rendering method would be completely different. So that makes you wrong again that it's mostly a DirectX 9 game (even though you could also say it's DirectX 8 etc., as DirectX is something like an "umbrella standard").

Also, with the power of CUDA I wouldn't be surprised if NVIDIA could "emulate" DirectX 10.1 functions, but again, from what I've read from game developers about DirectX 10.1, it isn't a noticeable step forward, and I would rather say it's even bad, because you need a hardware upgrade. I would recommend packing it all into version 11 and dropping 10.1, but hey, that would be bad for all those ATi DX10.1 hopers who think they will get some more shiny effects... if they hadn't read the additions and changes in 10.1, they wouldn't even notice it...
What's with motion blur being a DX10 effect? LOL.
Yeah, you can run almost every 3D API in a software renderer, but I bet you don't want to play at PowerPoint-presentation framerates, and sure, CUDA is good for a lot of things, but it won't solve hardware limitations in the GPU.
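
Just to illustrate the software-render point (a minimal sketch, nothing to do with AC itself; the function name is mine): the D3D10.1 SDK ships a reference rasterizer, so the API itself can be driven without any 10.1 hardware at all. It's the framerate that kills you.

Code:
// Sketch: create a D3D10.1 device on Microsoft's reference (software)
// rasterizer. Swap in D3D10_DRIVER_TYPE_HARDWARE and this only succeeds
// on a GPU whose driver exposes feature level 10.1.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

ID3D10Device1* CreateSlowButCorrectDevice()
{
    ID3D10Device1* device = NULL;
    HRESULT hr = D3D10CreateDevice1(NULL,                       // default adapter
                                    D3D10_DRIVER_TYPE_REFERENCE,
                                    NULL,                       // no software DLL
                                    0,                          // no creation flags
                                    D3D10_FEATURE_LEVEL_10_1,
                                    D3D10_1_SDK_VERSION,
                                    &device);
    return SUCCEEDED(hr) ? device : NULL;
}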

D3D10.1 is not about more or better effects; mainly it just solves AA rendering problems under deferred rendering. So it won't make a better shoe-polish shine in any game, but it gets rid of the jaggies properly.

Just for the record, I had nVIDIA cards for 4 years in a row, before anyone calls me an ATi fanboy.
IversonGarnett (x3m)
Elder


Join Date: Jan 2004
Location: SG/SENGKANG
Posts: 2,969
Trade rep: 18 (100%)
Infractions: 0/0 (0)
IversonGarnett (x3m) Jun 4th, 08, 03:05 PM #14
Hmm, what if, in this situation, you had an ATI card running the DX10.1 path vs. an NVIDIA card running the DX10 path (both in the same price segment) on the same game (assume the game fully implements both DX10.1 and DX10), and NVIDIA's offering still wins in performance. How would you guys comment on that?
Lyfeforce
Abolisher of E|ektronics


Join Date: Mar 2007
Location: Tampines
Posts: 3,762
Trade rep: 2 (100%)
Infractions: 0/0 (0)
Lyfeforce Jun 4th, 08, 03:18 PM #15
The point is that such a scenario hasn't been created. It's quite similar to comparing DX10 to DX9; there are sure to be pros and cons.

I just wonder, why create DX10.1 if it's not going to be used?
A blade cuts both ways. So why take the risk? Stab instead.


