  1. #1
    Registered User
    Join Date
    Dec 2005
    Posts
    22

    Did NVIDIA "break" DX10.1 in AC?!

    According to this article, we can assume that NVIDIA is pressuring game developers to remove DX10.1 support, or not implement it at all, so as not to give ATI the advantage. Very, very dirty and upsetting!

    http://www.anandtech.com/showdoc.aspx?i=3320&p=6

  2. #2
    Elder IversonGarnett's Avatar
    Join Date
    Jan 2004
    Location
    SG/SENGKANG
    Posts
    2,969
    wah...old news XD
    Desktop Gaming PC - Windows 8 Pro x64
    Intel Core i7 920 D0 | TR Venomous X | EVGA X58 CLASSIFIED E761 | Team Xtreem TXD36144M1866HC9TC
    ECS GTX 680 2GB x 2 (SLI) | Creative X-Fi Titanium Fatality Pro | UE700 | Altec Lansing ATP3
    OCZ Agility 60GB SSD x 2 | WD1500HLFS | LG GSA-H55 SATA
    SAMSUNG SyncMaster XL2370 | Corsair HX850 | CM 690 | Logitech G15 v2 | MS Sidewinder™ X8 | HORI Real Arcade Pro.v3 SA


  3. #3
    fubar... sutyi's Avatar
    Join Date
    Nov 2005
    Location
    Budaörs, Hungary.
    Posts
    258
    Quote Originally Posted by IversonGarnett View Post
    wah...old news XD
    Pretty much. Solution: don't patch if you had no problems with the game.

    The Way It's Meant to Be Paid, as Fudzilla put it a few days ago.
    "We're going to hell, so bring your sunblock."

  4. #4
    Registered User Ish718's Avatar
    Join Date
    Dec 2007
    Posts
    616
    NVIDIA IS the top dog in the GPU market, so they can push developers around sometimes, or all the time. O_o

  5. #5
    insane in the mainframe
    Join Date
    Oct 2007
    Location
    Parts Unkown
    Posts
    112
    Yeah, but people lose the big picture here. As the Crytek chief said (the developers of Crysis, the best graphics I've seen so far): who needs DX10.1? Now that they are developing games for consoles, and the Xbox 360 and PS3 can approximate pretty much all the DX10 features, why develop games for 10.1? Because it is a hardware and not a software update, developers would need to write a DX10 path anyway so they can port easily to the console market (or at least that's my understanding). Even so, there is not THAT big a difference between DX9 and DX10, and just read the additions in DX10.1 compared to DX10. I'm happy with my DX10 card; whether it's NVIDIA or ATI doesn't make a difference.

    The real problem is always the misinformation of the market and users who don't know the technical details: "now with DX10.1 support - future proof!" Sure, WHEN DX10.1 content actually arrives. I seriously couldn't care less if NVIDIA also supported DX10.1, because you need to draw some lines between the PC and console markets: today most PC games are also developed for next-gen consoles, which means not much DX10.1-supported content. Even so, I have a hard time seeing a difference between my games in DX9 and DX10 while playing, so the question is, will you even notice the difference between DX10 and DX10.1?

    And about CUDA: can't NVIDIA implement the same 10.1 effects through the CUDA engine? Many people don't know about the CUDA possibilities; I highly advise people to look into CUDA, it's seriously awesome stuff (even if you hate NVIDIA - I hated them too; sorry ATi & AMD, but there are no great developments on your side).
    Last edited by fLiNtY; Jun 4th, 08 at 12:59 AM.

  6. #6
    Registered User
    Join Date
    Apr 2007
    Posts
    110
    fLiNtY, come on and realize what this topic is about. The topic is that ATI's hardware was at an advantage once anti-aliasing was enabled, and Ubisoft removed the support for no apparent reason that would justify such a decision. It's not even about CUDA; we should all stay on topic and leave the fanboyism aside while discussing. Everything seems to turn into ATI vs. NVIDIA.
    Last edited by NKd; Jun 4th, 08 at 01:33 AM.

  7. #7
    insane in the mainframe
    Join Date
    Oct 2007
    Location
    Parts Unkown
    Posts
    112
    I am on topic, and I gave an explanation with reasons why I believe this is happening. You want to have a discussion about "NVIDIA sucks because they pressure the developers and ATI" without any arguments why? If you say something, explain it; otherwise you are the one off topic.

  8. #8
    Registered User N1truX's Avatar
    Join Date
    Jun 2008
    Posts
    136
    http://www.directcreed.com/?page_id=7 - Here you can sign a petition for DX10.1 in AC

  9. #9
    insane in the mainframe
    Join Date
    Oct 2007
    Location
    Parts Unkown
    Posts
    112
    Quote Originally Posted by N1truX View Post
    http://www.directcreed.com/?page_id=7 - Here you can sign a petition for DX10.1 in AC
    LOOOL, why not start a "save AMD/ATi" petition while you're at it? It would be just as hilarious.

  10. #10
    fubar... sutyi's Avatar
    Join Date
    Nov 2005
    Location
    Budaörs, Hungary.
    Posts
    258
    Quote Originally Posted by fLiNtY View Post
    Yeah, but people lose the big picture here. As the Crytek chief said: who needs DX10.1? […]
    Crysis is pretty much DX9, even with the Vista-exclusive Very High settings; there are some geometry shader instructions used, but you can count them on one hand, actually.

    Furthermore, no, consoles can't emulate DX10 methods and features, because their graphics chips are basically two-generations-old DX9 hardware, especially in the PS3, which essentially has a cut-down 7900-series GPU. The chip in the 360 is a different story, because it has a unified shader architecture with DX9 capabilities, a tessellator, and so on.

    As for NV not giving a damn about DX10.1, well, that's simple marketing. They are on top of their game, and most developers won't implement it because consoles don't support the features anyway. In titles that ship with a DX10 path you simply won't get any AA, or you'll get AA which isn't applied properly, especially around lights, where you lose subpixel data and end up with jaggies. That's down to nV not having 10.1 hardware, so developers have a few choices: no AA in titles whose engines use deferred rendering methods (you can count almost every UE3.0 title here), shader-based AA methods, or deferred supersampling, which makes any hardware crawl. Plus NV gets to save the R&D cost, because they can sell their two-year-old architecture again with some minor adjustments.

    DX10.1 hardware, including S3's newest budget cards, supports custom MSAA through multisample buffer reads and writes, so you can apply AA to the subpixels where you would normally lose information with standard AA methods.

    Plus the fact that the buffer can be read and written at any given moment gives it an edge performance-wise.
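
    For anyone curious what that buffer access looks like at the API level, here is a rough, untested C++ sketch (names as in the public d3d10_1.h header; error handling and the actual resolve shader are omitted, so treat it as an illustration rather than working engine code). It asks for a feature level 10.1 device and creates a 4x MSAA depth buffer that can also be bound as a shader resource, which is what lets a custom resolve/AA pass read the individual sub-samples:

    // Rough sketch only: probe for a D3D10.1 device and create a multisampled
    // depth buffer that can also be read back in a shader during a custom
    // resolve pass. On 10.0-only hardware the device creation below fails.
    #include <d3d10_1.h>
    #pragma comment(lib, "d3d10_1.lib")

    ID3D10Device1*            g_device   = NULL;
    ID3D10Texture2D*          g_depthMS  = NULL;
    ID3D10ShaderResourceView* g_depthSRV = NULL;

    HRESULT CreateReadableMsaaDepth(UINT width, UINT height)
    {
        // Ask explicitly for feature level 10.1.
        HRESULT hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
                                        D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION,
                                        &g_device);
        if (FAILED(hr)) return hr;

        // Typeless depth format so the same texture can serve both as a
        // depth-stencil target and as a shader resource.
        D3D10_TEXTURE2D_DESC td = {0};
        td.Width            = width;
        td.Height           = height;
        td.MipLevels        = 1;
        td.ArraySize        = 1;
        td.Format           = DXGI_FORMAT_R24G8_TYPELESS;
        td.SampleDesc.Count = 4;                      // 4x MSAA
        td.Usage            = D3D10_USAGE_DEFAULT;
        td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
        hr = g_device->CreateTexture2D(&td, NULL, &g_depthMS);
        if (FAILED(hr)) return hr;

        // Multisampled view over the depth data; a pixel shader can then fetch
        // each sub-sample (Texture2DMS Load in HLSL) instead of losing them in
        // a fixed-function resolve.
        D3D10_SHADER_RESOURCE_VIEW_DESC sd = {0};
        sd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
        sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
        return g_device->CreateShaderResourceView(g_depthMS, &sd, &g_depthSRV);
    }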

    In AC there were no rendering anomalies, despite what the developers stated, and DX10.1 performance with AA was much better after SP1 came along, because applying AA took fewer passes.

    Most people wouldn't be complaining if the DX10.1 support had arrived later in a patch and had serious problems; far fewer people would object to Ubisoft cutting it in that case. But it came built in and had no problems, and Ubisoft cut it out with a crippling patch in exchange for a paycheck.

    That's BS, no matter how much we quibble about it.
    "We're going to hell, so bring your sunblock."

  11. #11
    insane in the mainframe
    Join Date
    Oct 2007
    Location
    Parts Unkown
    Posts
    112
    LoL, you are right on some points, but others just make me laugh... Ever heard of the hardware emulation layer in the DirectX "architecture"? And sure, try activating your little Crysis hack and "think" you've got DirectX 10 at the high quality settings, then tell me how it's physically possible for your DirectX 9 GPU to show the motion blur effect (that's right, you can't, unless you could "make/emulate" that effect on a DirectX 9 code base). And there is a BIG difference between hardware utilisation and emulation: two things can "look" alike while the rendering method is completely different. So that makes you wrong again that it's mostly a DirectX 9 game (even though you could also say it's DirectX 8, etc., as DirectX is something like an "umbrella standard").

    Also, with the power of CUDA I wouldn't be surprised if NVIDIA could "emulate" DirectX 10.1 functions, but again, from what I've read from game developers about DirectX 10.1, it isn't a noticeable step forward, and I would even call it bad because it requires a hardware upgrade. I would recommend rolling it into version 11 and dropping 10.1, but hey, that would be bad for all those ATi DX10.1 hopefuls who think they will get some more shiny effects... if they hadn't read the additions and changes in 10.1, they wouldn't even notice it...

  12. #12
    Gaming's in my blood! DaGamer!!!'s Avatar
    Join Date
    Mar 2005
    Posts
    2,909
    fLiNtY, I have a feeling that IF the situation were reversed, that is, nV had DX10.1 cards and ATi only DX10, you'd be singing a different tune. nV is on top now, so I don't see any game developer pushing for DX10.1 after the AC fiasco... removing DX10.1 support instead of fixing it, shame on them!
    DaRig!!! | i7 3960X + Corsair H100 | Asus R4E | 16GB Corsair Dom Plat 2133mhz | 2x Sapphire R9 290X 4GB CFX | SB Zx + Logitech Z5500 | Corsair 120GB ForceGT SSD (OS) | 2TB + 3x 1TB WDC Black | Seasonic X-1250 | Caselabs M8 | CM Novatouch + Logitech G502 | 64bit Win 8.1 Pro |
    DaRig2 | AMD FX8350 + CM Seidon 120v | Asus CVF | 16GB RipJawX 2133mhz | 2x GTX670 2GB SLi | 120GB Force GT SSD (OS) | 2TB Hitachi Deskstar + 1TB + 2x 500GB WDC Blue | Asus Xonar Phoebus Solo + Klipsch PM2.1 | Enermax MAXREVO 1500W | Corsair 650D | CM Novatouch + Roccat Kone XTD | 64bit Win7 Pro SP1 |

  13. #13
    fubar... sutyi's Avatar
    Join Date
    Nov 2005
    Location
    Budaörs, Hungary.
    Posts
    258
    Quote Originally Posted by fLiNtY View Post
    LoL, you are right on some points, but others just make me laugh... Ever heard of the hardware emulation layer in the DirectX "architecture"? […]
    What's with motion blur being a DX10 effect? LOL.
    Yeah, you can run almost every 3D API in a software renderer, but I bet you don't want to play at PowerPoint-presentation framerates. And sure, CUDA is good for a lot of things, but it won't solve hardware limitations in the GPU.

    D3D10.1 is not about more or better effects; mainly it just solves AA rendering problems under deferred rendering. So it won't put a better shoe-polish shine on any game, but it gets rid of the jaggies properly.

    Just for the record, before I get called an ATi fanboy: I had nVIDIA cards for 4 years in a row.
    "We're going to hell, so bring your sunblock."

  14. #14
    Elder IversonGarnett's Avatar
    Join Date
    Jan 2004
    Location
    SG/SENGKANG
    Posts
    2,969
    Hmm, what if, in a situation where an ATI card running the DX10.1 path is compared against an NVIDIA card running the DX10 path (both in the same price segment), in the same game (assuming the game fully implements both DX10.1 and DX10), NVIDIA's offering still wins in performance? How would you guys comment then?
    Desktop Gaming PC - Windows 8 Pro x64
    Intel Core i7 920 D0 | TR Venomous X | EVGA X58 CLASSIFIED E761 | Team Xtreem TXD36144M1866HC9TC
    ECS GTX 680 2GB x 2 (SLI) | Creative X-Fi Titanium Fatality Pro | UE700 | Altec Lansing ATP3
    OCZ Agility 60GB SSD x 2 | WD1500HLFS | LG GSA-H55 SATA
    SAMSUNG SyncMaster XL2370 | Corsair HX850 | CM 690 | Logitech G15 v2 | MS Sidewinder™ X8 | HORI Real Arcade Pro.v3 SA


  15. #15
    Abolisher of E|ektronics Lyfeforce's Avatar
    Join Date
    Mar 2007
    Location
    Tampines
    Posts
    3,763
    The point is that such a scenario hasn't been created yet. It's quite similar to comparing DX10 to DX9; there are sure to be pros and cons.

    I just wonder, why create DX10.1 if it's not going to be used?
    A blade cuts both ways. So why take the risk? Stab instead.



