
Thread: Why Do ATI Cards Score So Much Better On 3DMark2005 Then Nvidia Cards?

  1. #51
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Cybercat
    Funny, I never do.
    maybe there's a way to disable it that i don't know of? no idea...
    i don't think fudo got the point there. it's not like they are writing a new driver that has to run on all cards; it's a package of different files, and the install utility recognises what card the system is running and picks the right files. so the comment about a unified driver meaning more work does not make any sense to me. plus the latest 6x drivers ARE for the 6xxx and fx cards only and give me artifacts on my g4mx and on my buddy's g4ti, so where is the unified driver architecture here?

    nvidia loves to pat itself on the back for having a great unified driver architecture, but they broke with that idea some months ago and i have yet to see a REAL unified driver that is actually helping several cards and not just either the fx or the 6xxx cards.

    plus nvidia officially stated that they no longer support the g4mx and lower cards, so the detonators might still include files for the older cards like the tnt2, but there won't be a difference between future driver versions for tnt2 users.

    while tnt2 cards are indeed too old to play recent games and it makes sense to discontinue their support for new titles, discontinuing the gf1+2+3+4mx cards is not right imo. the g4mx has a huge market share and you can actually play doom3, half life2 and far cry with a g4mx if you have a fast system! even a geforce1 ddr can play half life2, so i really think it's lame to discontinue their support.

    Quote Originally Posted by Cybercat
    The 5800 was a rush job, not thought out thoroughly, and no one is debating its flaws. NVIDIA learned very quickly from that mistake, which is why they were recalled.
    the 5800 was a rush job?
    rather the opposite can and should be said. nvidia had been working on the fx ever since the gf4. remember when the geforce 4 was released? it was the undisputed leader for several months, then the 9700pro was released and destroyed the gf4. the 5800 was delayed over and over again and it took nvidia forever to get it ready. and the launch was just a paper launch, so add 2 more months to all that time. the 5800 was around 6 months late!

    Quote Originally Posted by Cybercat
    I don't agree. I've yet to see anything degraded about the GF6's image quality compared to ATI. While the FX series was another story (they really had no choice in the matter, since those cards would perform like utter crap if they matched ATI's IQ), there's nothing about the GF6 and its latest drivers that suggest that IQ is only "secondary".
    i never said the gf6 has degraded iq, i was referring to the fx series.

    and cutting the iq that badly without giving people the option to disable the optimizations because "the cards would perform like utter crap" otherwise is a bad thing to do.

    people with fast systems didn't really need the optimizations to play new titles and would have been happier about bug fixes and good iq rather than more and more optimizations.

    yes, there are indications that nvidia focuses on performance first and iq comes second. they release drivers to speed up their cards first and release a driver with a bug fix for artifacts etc at a later time. it's been like that with several games.

    ati instead has released a series of drivers in the last month that slowed down their cards in some games compared to the previous driver version, but fixed the bug for the time being, and later released a driver that fixed the bug without cutting the performance.

    and NO im not going to send you links about this :P
    read hardware news!

    Quote Originally Posted by Cybercat
    There's only one instance where NVIDIA has an image quality bug with the game, and that can be seen here. That is a problem on Crytek's end, as the new 1.2 patch that has been recalled fixed this issue. Other than that, I have never seen any of the artifacts or bugs that you speak of with any NVIDIA card that I've ever used, nor have I heard of any one else complain about them (have you?). They seem to be isolated incidents with either the computer you tested on, or the graphics cards themselves. I had a GF4 Ti4200 that I tried playing FarCry on, and experienced no such problems. The same holds for my current card.
    that's not true at all. there are dozens of graphics bugs in far cry. i think the latest drivers fixed most of them for the geforce 6 series, but the geforce 4 series cards still have the same problems. i made a thread about it in this section some time ago, use the search function. i added screenshots of a bunch of bugs.

    and no, the card is fine and shows no artifacts in other games, same goes for the gf4ti of my buddy's lil brother.

    yes, i've heard about far cry bugs all the time and it was in the news and was mentioned in some reviews as well. and when i posted about it in a quite large forum dedicated to graphics, almost nobody replied, and the ones who did said it's a known issue and that ever since doom3 was released nvidia pretty much stopped working on far cry driver updates!

    what drivers did you use? did you set all graphics settings to the maximum?
    and you don't see any bugs or artifacts?

    Quote Originally Posted by Cybercat
    If you could provide a link I'd be very appreciative. And yes, I'm quite lazy.
    jesus christ :P took me 3 secs to get the link
    www.xbitlabs.com -> 2k5 article -> batch test
    http://www.xbitlabs.com/articles/vid...mark05_14.html



    hmmm we are drifting away from the actual topic though. hahaha

    can somebody post his results with a 6800NU or 6800GT or Ultra with the latest 2 or 3 drivers?

    and can somebody please run the batch test with the latest drivers and tell me if they look better than the drivers xbitlabs tested?
    Last edited by saaya; 10-03-2004 at 11:33 AM.

  2. #52
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by saaya
    maybe there's a way to disable it that i don't know of? no idea...
    If you find one, let me know, because I haven't run across anything like that.

    When I had the Ti4200, I was using 44.03s, which I found to perform the best with the GF4 series. I wasn't using new drivers.

    As to everything else, I don't have a response. IMO NVIDIA isn't any more guilty of optimizations than ATI. The FX series was just a bad series, and at that time its IQ was indeed far worse than ATI's, but my point is that it doesn't hold true for their new series.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  3. #53
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    never said it would

  4. #54

  5. #55
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    interesting!

    i don't think it's an optimization from ati, it looks more like a bug. look at it: the ati card renders a bigger area of that surface with a reflection, which means more work, compared to the 6800, which has a much smaller reflective area.

    the lights on the ati card look weird though... is it exactly the same frame?

    when switching back and forth i can see several spots where the ati image looks better than the nvidia image imo. the bump mapping used by the nv card looks much flatter... hmmm, does 2k5 use 3dc btw?

  6. #56
    Xtreme Member
    Join Date
    Feb 2004
    Location
    Norway dude
    Posts
    108
    Should be the same frame. Says 985 on both shots.

    I really don't know anything about it. Just came across it on another forum. Thought I would post it here so you could brainstorm on it

    But what I find really weird is ATI's AA efforts.

    Look at these.

    4xAA and 8xAF X800: http://www.bjorn3d.com/photos/showph...cat/503/page/3

    4xAA and 8XAF 6800: http://www.bjorn3d.com/photos/showph...cat/503/page/1

    Always heard ATI was better at AA. But if this is the reason, it's no wonder they're better (in framerate) at it.
    Last edited by Veritas.no; 10-04-2004 at 02:28 AM.

  7. #57
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    aa doesn't seem to work for the x800 there... the water and the beast are way more blurry on the 6800 though... kinda weird...

    on most of the screenshots without aa you will see that the x800 has slightly sharper textures than the 6800... the 6800 renders a very sharp shadow instead, while the ati card renders a soft shadow.

    does anybody know whether the shadows are supposed to be soft or hard?

    and look at the scene with the cannon firing!!!! the 6800 is failing to render 1 if not even 2 light sources and there is almost no black fog in that scene! on the right there is no light source lighting the front of the ship, and on the lower left edge there is also a light source missing.

    and look at the scene of the beast jumping over the ship. the shot of the entire beast over the ship from the distant side view.

    the water reflections of the 6800 are MUCH worse than those of the ati card. maybe nvidia is only using 16 bit precision vs ati's 24 bit? at least the reflections look very blurry and are rendered with less detail and resolution.

    also look at the ship under the beast. it has two thin things, one on the left and one on the right. you can see them from far away on the x800 shot, but they are almost completely gone on the 6800 shot.

    also check out the trilinear test. i don't know if they just didn't know how to disable the brilinear optimization, but i doubt they didn't know how to do it...

    the x800 card shows good trilinear filtering, although it's a bit washed out in the center, and has a higher angle dependence, with 3 zones forming the left and right compared to the 5 zones forming the left and right for the 6800. the 6800 has almost no angle dependency for the trilinear filtering and is sharper in the center.

    BUT it's not running trilinear filtering at all... it looks like very optimised brilinear filtering (a mix of trilinear and bilinear)
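
    just to illustrate what i mean by brilinear, here is a rough python sketch (the blend band is made up, this is not how any driver actually implements it): trilinear blends the two mip levels over the whole transition, brilinear only blends in a narrow band and snaps to plain bilinear outside of it.

    # rough illustration of trilinear vs "brilinear" mip blending
    # (hypothetical weights, not any vendor's real implementation)

    def trilinear_weight(lod_fraction):
        # full linear blend between mip level N and N+1
        return lod_fraction

    def brilinear_weight(lod_fraction, band=0.25):
        # only blend inside a narrow band around the mip transition;
        # outside the band, snap to the nearest mip (i.e. plain bilinear)
        lo, hi = 0.5 - band / 2, 0.5 + band / 2
        if lod_fraction < lo:
            return 0.0                             # use mip N only
        if lod_fraction > hi:
            return 1.0                             # use mip N+1 only
        return (lod_fraction - lo) / (hi - lo)     # short blend region

    def filtered_texel(mip_n, mip_n1, lod_fraction, weight_fn):
        # mip_n / mip_n1 are bilinearly filtered samples from adjacent mip levels
        w = weight_fn(lod_fraction)
        return (1.0 - w) * mip_n + w * mip_n1

    # example: at lod_fraction = 0.3 trilinear still blends the two mip samples,
    # while brilinear has already snapped to the nearer mip level
    print(filtered_texel(1.0, 0.0, 0.3, trilinear_weight))   # 0.7
    print(filtered_texel(1.0, 0.0, 0.3, brilinear_weight))   # 1.0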


    after looking at all of that stuff for over an hour (LOL) i'd say the x800 has a better overall image quality.

    but not by much, both cards show anomalies which could be bugs or optimizations... who knows...

    isn't it weird btw? almost all screenshots looked like it was not the same frame from both cards, no? or was i the only one seeing it like that? it always looked as if the scene had moved a bit more forwards or backwards in the shot from the other card...


    very nice article! please post a new thread about it if there is none about it

    great iq review!
    Last edited by saaya; 10-04-2004 at 03:30 AM.

  8. #58
    Xtreme Member
    Join Date
    Dec 2002
    Posts
    345
    bjorn3d didn't apply AA correctly, there's no AA being applied in the ati shots... it has been discussed on the beyond3d board with screenshots w/ and w/o 4xAA
    "it is better to be thought of as an idiot, than to open your mouth and prove it"

  9. #59
    Xtreme Enthusiast
    Join Date
    Jul 2002
    Location
    London Ontario
    Posts
    544
    Quote Originally Posted by saaya
    isn't it weird btw? almost all screenshots looked like it was not the same frame from both cards, no? or was i the only one seeing it like that? it always looked as if the scene had moved a bit more forwards or backwards in the shot from the other card...
    I noticed that as well. It indicates that it is from the same frame, however if you look at the gun fire, it's at different points? Perhaps that is causing the reflection difference, and the light source banding difference on the floor

    Everything is lit per pixel, so even the smallest muzzle flash from a gun dramatically changes the location of reflections/shadows
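
    Roughly what I mean, as a toy sketch (nothing like the real 3DMark05 shaders, and the positions/numbers below are made up): with per-pixel lighting every pixel re-evaluates the light term, so even nudging a light (the muzzle flash) changes the result everywhere on the floor.

    # toy per-pixel point-light term: every pixel's brightness depends on the
    # light position, so moving a muzzle flash slightly changes every pixel
    import math

    def diffuse(pixel_pos, normal, light_pos, intensity=1.0):
        # vector from the surface point to the light
        to_light = [light_pos[i] - pixel_pos[i] for i in range(3)]
        dist = math.sqrt(sum(c * c for c in to_light))
        l_dir = [c / dist for c in to_light]
        n_dot_l = max(0.0, sum(normal[i] * l_dir[i] for i in range(3)))
        return intensity * n_dot_l / (dist * dist)   # lambert term with 1/r^2 falloff

    floor_point = (2.0, 0.0, 5.0)
    up = (0.0, 1.0, 0.0)
    print(diffuse(floor_point, up, (2.0, 1.0, 4.0)))  # flash at one spot   (~0.35)
    print(diffuse(floor_point, up, (2.5, 1.0, 4.5)))  # flash moved a bit   (~0.54)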

    and 05 uses 3Dc
    Let the OC'ing begin!

  10. #60
    Xtreme Member
    Join Date
    Dec 2002
    Posts
    345
    Quote Originally Posted by WildKard
    and 05 uses 3Dc
    you are probably thinking of DST
    "it is better to be thought of as an idiot, than to open your mouth and prove it"

  11. #61
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    what do you mean, bjorn applied aa incorrectly? didn't he set it in the driver panel?

    so 2k5 doesn't use 3dc?

    and wildkard,
    1. i don't think the shot is flashing the ship up for just a millisecond
    2. i don't think the light flash stops even though the cannon is still firing, which it definitely is in that frame.
    and
    3. the explosion can't possibly light the lower part of the ship, and i doubt that there are two explosions that light parts of the ship in the exact same frame and both are gone in the next frame, because even if both shots are not the same frame, a difference of more than 1 frame is not possible because the camera etc would have moved more.

  12. #62
    Xtreme Member
    Join Date
    Apr 2004
    Posts
    437
    going from this, maybe this is like the Athlon vs. P4 ...

    AthlonXP has 3 integer pipelines with 3 FPUs, giving you a 3x1 architecture (integer x FPU)

    P4 has 2 integer pipelines and 4 FPUs, giving you a 2x2 architecture ...

    it's a difference in design ... maybe ATI cards are more efficient at DX9 or whatever tasks ...

    calculating 2+2 on AXP will be faster at the same clock because the pipelines are shorter ... on P4, it will take a bit longer at the same clock because of the longer pipelines ... but P4 compensates by having a higher clock ...

    this isn't a flame or anything, I'm just saying that the difference in scores lies in the difference in design.
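
    just to put rough numbers on the pipeline-length vs clock trade-off (the stage counts and clocks below are only illustrative, not exact CPU specs) ...

    # rough illustration: time for one instruction to traverse the pipeline
    # (stage counts / clocks are illustrative, not exact CPU specs)

    def pipeline_latency_ns(stages, clock_ghz):
        # each stage takes one cycle; cycle time = 1 / clock
        return stages / clock_ghz

    short_pipe = pipeline_latency_ns(stages=10, clock_ghz=2.2)   # "AXP-like"
    long_pipe  = pipeline_latency_ns(stages=20, clock_ghz=2.2)   # "P4-like" at the SAME clock
    long_fast  = pipeline_latency_ns(stages=20, clock_ghz=3.2)   # longer pipe, higher clock

    print(round(short_pipe, 2), round(long_pipe, 2), round(long_fast, 2))
    # ~4.55 ns vs ~9.09 ns vs ~6.25 ns: the longer pipeline loses at the same
    # clock but claws back ground once the clock advantage kicks in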

    Quote Originally Posted by gaddster
    maybe it's just that ATI handles dx9 shaders much better.

  13. #63
    Xtreme Addict
    Join Date
    Mar 2003
    Location
    Denmark
    Posts
    2,055
    couldn't this just be a result of driver optimizations? I mean nvidia, among others, is/was pretty fast to push obsolete models out into the cold.
    "M-I-A"

  14. #64
    Xtreme Enthusiast
    Join Date
    Jul 2002
    Location
    London Ontario
    Posts
    544
    I believe it was stated in the futuremark forums that on ATi hardware 3Dc is used, and on NVidia hardware it runs SM3.0 (although obviously not even close to the same thing, futuremark chose to implement features from both companies)
    *EDIT* After delving deep into the forums it appears I am incorrect and there is no 3Dc implementation!


    Also, Saaya, I was referring to the reflections on the floor of the return to proxycon screenshot, sorry for not making myself more clear, but you can definitely see the muzzle flashes are different in the screenshots
    Last edited by WildKard; 10-04-2004 at 04:56 AM.
    Let the OC'ing begin!

  15. #65
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    yeah, but i think that's actually a bug/optimization on the x800 card and not just one frame ahead of the 6800 shot... at least that's what i think...

  16. #66
    Xtreme Member
    Join Date
    Apr 2004
    Location
    GrandIsland New york
    Posts
    255
    Ok, well in the first 2 pics of return to proxycon the muzzle shots do appear to be at different times, however there is blaster fire in the air near the top left of the red box they drew that doesn't move at all, so they are indeed taken at the same time. The IQ on the ati shot is a lot lower than on the 6800 shot. If you look closely at the back wall and the big tank, and even the ladder on the big tank, you will see that the ati card is rendering incorrectly. I see places where textures look almost entirely different, and on the ati shot there look to be little lines where there shouldn't be. As far as that reflection is concerned, sure, the muzzle flashes are different, but that shouldn't mean the IQ of the reflection that is already there should change. The 6800 shot looks to be fluid with no sudden changes in intensity of the reflection, while in the ati shot you can see that there are concentric arcs where the intensity of the reflection changes. I'm not an expert or anything, but I would much rather show off the 6800 pic to my friends

  17. #67
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by stayfrosty
    Ok, well in the first 2 pics of return to proxycon the muzzle shots do appear to be at different times, however there is blaster fire in the air near the top left of the red box they drew that doesn't move at all, so they are indeed taken at the same time. The IQ on the ati shot is a lot lower than on the 6800 shot. If you look closely at the back wall and the big tank, and even the ladder on the big tank, you will see that the ati card is rendering incorrectly. I see places where textures look almost entirely different, and on the ati shot there look to be little lines where there shouldn't be. As far as that reflection is concerned, sure, the muzzle flashes are different, but that shouldn't mean the IQ of the reflection that is already there should change. The 6800 shot looks to be fluid with no sudden changes in intensity of the reflection, while in the ati shot you can see that there are concentric arcs where the intensity of the reflection changes. I'm not an expert or anything, but I would much rather show off the 6800 pic to my friends
    you see textures that look almost entirely different? different compared to what? i don't get what you mean... please mark it on the pics and post it here. save it as png, as jpeg will screw up the IQ!

    http://www.reflectonreality.com/images/3dm05/1x800.png
    http://www.reflectonreality.com/images/3dm05/16800.png

    yes the x800 definitely seems to have a problem with that pale layer over the ground...

    this is just a single set, have you looked at the other screenshots? comparing them all gave me the impression that both cards have more or less the same IQ, with the x800 having slightly sharper textures and showing soft shadows while the 6800 is not showing soft shadows. and afaik the shadows in 2k5 are supposed to be soft shadows like the x800 shows them, and not hard shadows like the 6800.

  18. #68
    Xtreme Member
    Join Date
    Apr 2004
    Location
    GrandIsland New york
    Posts
    255
    Yea, after looking at the other shots I see what you mean, each card goes about doing things a lot differently. Each one has its ups and its downs. I circled the square by the tank, to me they look different, there looks to be some extra stuff on the 6800 card. I had to shrink it so imageshack would host it for me. And again, after looking at the other shots it's really hard to say who comes out on top, if there even is a winner in the IQ department

    http://img32.exs.cx/img32/2365/1x800.png

    On a side note, saaya, thanks for getting my account fixed

  19. #69
    Xtreme Member
    Join Date
    Dec 2002
    Posts
    345
    Quote Originally Posted by saaya
    what do you mean, bjorn applied aa incorrectly? didn't he set it in the driver panel?
    AA should be set to app pref. in the control panel and applied through 3dmark
    "it is better to be thought of as an idiot, than to open your mouth and prove it"

  20. #70
    Xtreme Member
    Join Date
    Apr 2004
    Posts
    437
    driver optimisations mean diddly if you don't have the hardware ...

    drivers can only be inefficient ... so you have to lower their inefficiency ...

    Quote Originally Posted by bias_hjorth
    couldnt this just be a result of driver optimizions? I mean nvidia among other is / were pretty fast to push obsolete models out in the cold.

  21. #71
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    maybe force each to run SM2.0x and get screenies to compare?

    All along the watchtower the watchmen watch the eternal return.

  22. #72
    Xtreme AMD Fanboy
    Join Date
    Nov 2002
    Location
    Salem / Corvallis, OR
    Posts
    979
    Quote Originally Posted by Kunaak
    I don't want to hear the usual dumb conspiracy theory that ATI bought Futuremark or something like that... I don't care about rumours, so unless you've got a receipt showing ATI bought Futuremark, keep it to yourself.

    I just wanna know, why do ATI cards do so damn well on 3DMark2005?
    3DMark 05'... at present, should be called "Fetch Vertex Data 05'", because that's the local bottleneck, or at least the one ATi patched up.

    That's why their PCIe card always showed the same score, while the most recent drivers allowed the AGP variant to catch up. 05' is incredibly vertex shader bound, and also happens to eat up more than 256MB of local memory, which forces allocation via GART to system memory.

    nVidia has either already done this trick and is slower due to their vertex shader implementation (6 VS @ 435 MHz vs. 6 VS @ 520 MHz), or perhaps ATi's drivers are simply more efficient lately in multiple areas (it's proven the 4.11s are faster in system bound tests as well, increase fillrate slightly, and tweak other areas too).

    Before someone yaps about different architectures, remember that the GeForce 6800s have Pixel Shader units that can, in many cases, double their ops / cycle compared to ATi hardware, which is why they compete so well at slower speeds.

    NV must optimize their VS throughput or their system memory texture loading ability.
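
    Back-of-the-envelope, using the VS counts and clocks quoted above and assuming equal per-unit efficiency (which is a simplification):

    # crude relative vertex throughput estimate: units x clock
    # (assumes equal per-unit efficiency, which is a simplification)
    nv_units, nv_clock = 6, 435      # NV vertex shaders @ MHz, as quoted above
    ati_units, ati_clock = 6, 520    # ATi vertex shaders @ MHz, as quoted above

    nv_rate = nv_units * nv_clock        # 2610 "unit-MHz"
    ati_rate = ati_units * ati_clock     # 3120 "unit-MHz"

    print(ati_rate / nv_rate)  # ~1.2 -> roughly a 20% raw clock advantage on the VS array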

    Proud Founder of Redline 3D

  23. #73
    Xtreme X.I.P.
    Join Date
    Aug 2002
    Posts
    4,764
    Quote Originally Posted by Hallowed
    05' is incredibly vertex shader bound, and also happens to eat up more than 256MB of local memory, which forces allocation via GART to system memory.
    I enabled the extra vertex shader on my 6800 and got a minimal gain.

    I used 256MB main memory with 128 and 256MB aperture sizes and saw no gain or loss, even though the recommended system memory is 512MB.

    Where do you get your theories from, and have you got any solid facts to back them up?

    Regards

    Andy

  24. #74
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by stayfrosty
    Yea, after looking at the other shots I see what you mean, each card goes about doing things a lot differently. Each one has its ups and its downs. I circled the square by the tank, to me they look different, there looks to be some extra stuff on the 6800 card. I had to shrink it so imageshack would host it for me. And again, after looking at the other shots it's really hard to say who comes out on top, if there even is a winner in the IQ department

    http://img32.exs.cx/img32/2365/1x800.png

    On a side note, saaya, thanks for getting my account fixed
    yes, i wouldn't call either of the two the winner after thinking about it again. both show clear anomalies which should not be tolerated. if both cheat there won't be a winner...

    and interesting spot! i had overlooked it so far, but imo it looks like the x800 is using a higher quality for that reflection on that part of the tank than the 6800. don't you think?

    and np for fixing the account. actually i can't change people's accounts, only fugger and kazoo can do it, so i pmed them and they did it

    Quote Originally Posted by STEvil
    maybe force each to run SM2.0x and get screenies to compare?
    good idea!

    and hallowed, nvidia's vertex units are weaker. part of sm3.0 means sort of connecting pixel and vertex shaders into one unit afaik, and in sm4.0 (wgf, longhorn) there will be only one general hybrid shader unit, and there will only be shaders that do both vertex and pixel shader operations.

    so the poor vertex processing power is sort of a trade-off for having sm3.0, i think.

    and are you sure it's the vertex power that limits the nv cards?

    nvidia's vertex units are weaker; they saved transistors there to have more transistors for other stuff. this makes sense, as the cpu can take over part of the vertex processing from the gpu as far as i know, so with a fast cpu you don't really need that many strong vertex units. this also explains why nvidia cards scale slightly better with more cpu power than ati's x800 cards. correct me if i'm wrong please

    what is vertex processing power needed for? for many things in a scene that are all moving, like vegetation and grass, and also mmorpgs (massive multiplayer online role playing games) and other 3d games with dozens if not hundreds or thousands of units that all move, like some strategy games.

    that's where the x800 cards score really well and the nv cards are up to 30% slower.

    i guess we need more vertex processing power for future games, as the ideal everybody wants is a realistic game with wind and moving objects and many units and vehicles moving. so i think it was good of fm to make 2k5 that vertex unit dependent.

    nvidia knew that 2k5 would be vertex dependent and they still decided to save transistors in that part of the gpu... tbh i would rather have kicked sli out of the core and made stronger vertex units...

    i would have left an interface for sli on the gpus and made the sli chip an external ic, like the br02 pciE to agp bridge chip...

    only a fraction of cards will actually run in sli, so they would only have to put that external sli chip on a few cards and sell those as sli capable cards.

    this would mean that normal cards would be slightly cheaper and sli capable cards would cost a bit more than the normal cards, which imo is just fair.

    atm it's like this: all people who buy an nv sli capable card are paying for sli even if they don't use it, so they are paying the extra money for the few people who actually use sli.

    if it were limited to some cards, the people who want sli would pay extra for it... just fair imo, plus it would have made the cards faster (more vertex power or other speed boosts from the transistors they would have saved)

  25. #75
    Xtreme AMD Fanboy
    Join Date
    Nov 2002
    Location
    Salem / Corvallis, OR
    Posts
    979
    Quote Originally Posted by zakelwe
    I enabled the extra vertex shader on my 6800 and got a minimal gain.

    I used 256MB main memory with 128 and 256MB aperture sizes and saw no gain or loss, even though the recommended system memory is 512MB.

    Where do you get your theories from, and have you got any solid facts to back them up?

    Regards

    Andy
    Less "theory" than "most commonly accepted explanation", but you'd have to root through Beyond3D forums to get at it.

    As far as Aperture Size, I'd figure you knew that changing it only allows the system to allocate more system memory, not speed its transfer. Transfer speed is why the X800XT PCI-E card initially had the higher score, until ATi tweaked a few settings that allowed their AGP counterpart to match its score. (see macci's statement)
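
    Rough numbers on why the bus matters once textures spill into system memory (nominal peak bus rates; the per-frame spill amount is purely hypothetical):

    # very rough: time to pull spilled textures over the bus each frame
    # (bus figures are nominal peak rates; real sustained rates are lower)
    AGP8X_GBPS = 2.1    # nominal AGP 8x peak, GB/s
    PCIE16_GBPS = 4.0   # nominal PCIe x16 peak (one direction), GB/s

    spill_mb = 64  # hypothetical amount of texture data fetched from system RAM per frame

    agp_ms  = spill_mb / 1024 / AGP8X_GBPS  * 1000
    pcie_ms = spill_mb / 1024 / PCIE16_GBPS * 1000

    print(round(agp_ms, 1), round(pcie_ms, 1))  # ~29.8 ms vs ~15.6 ms added per frame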

    The 6800 could be bottlenecked by other means, masking the vertex issue. I'm waiting for NV to optimize a bit for 05', as I'd like to see my 6800 GT compete a bit better.

    Anyway, I digress, as I have little time; there are those at B3D.. while few.. who know what they're talking about (rather than bloviating) and have concluded this to be the case.

    Proud Founder of Redline 3D
