
Thread: Why Do ATI Cards Score So Much Better On 3DMark2005 Than Nvidia Cards?

  1. #26
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    heheh pretty much

    Hmmmm, I'm sorry kanavit, you were right: 8x1 is generally faster than 4x2. But there are still situations, not just theoretical ones, where the 4x2 can be faster. My point was that it doesn't just depend on the pipe/TMU organisation but also on other aspects of the VPU, like what the TMUs and math units can do.

    Or am I wrong again, KHAN?
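
    A rough way to see both sides of that, as a minimal sketch with made-up numbers (assumed identical clock, one pixel per pipe per clock, and extra texture layers beyond the TMU count costing extra clocks):

    Code:
        # Hypothetical comparison of 8x1 vs 4x2 pipe/TMU layouts at the same clock.
        # Illustrative only; real chips also differ in what their TMUs and math units can do.
        CLOCK_MHZ = 400  # assumed core clock, identical for both layouts

        def fill_rate(pipes, tmus_per_pipe, textures_per_pass):
            # A pixel needing more texture layers than the pipe has TMUs takes extra clocks.
            clocks_per_pixel = -(-textures_per_pass // tmus_per_pipe)  # ceiling division
            return pipes * CLOCK_MHZ // clocks_per_pixel  # Mpixels/s

        for textures in (1, 2, 4):
            print(textures, "tex/pass   8x1:", fill_rate(8, 1, textures),
                  "Mpix/s   4x2:", fill_rate(4, 2, textures), "Mpix/s")

        # Single texturing: 8x1 gives 3200 Mpix/s vs 1600 for 4x2.
        # Two or more textures per pass: both land on the same number, so a 4x2
        # only pulls ahead if its pipes can do more per clock in other ways.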

  2. #27
    Registered User
    Join Date
    Sep 2002
    Posts
    61
    To make this short, here's why the 5900 sucks in 3DMark05:

    1. As in most games, the 5900 most likely acts like a 4x1 pipeline design; Doom 3 is the big exception, as already mentioned. Big performance hit.

    2. No 24-bit color precision, which is the minimum DX9 requires for full precision, so if the 5900 (or any other nVidia gfx card, for that matter) is to draw in full precision it has to use 32-bit mode. BIG performance hit (see the quick byte-count sketch below).
    This could also explain why the 6800 series gets fewer points than the X800.

    Just my 2 cents.
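
    To put rough numbers on that precision point (a back-of-the-envelope sketch, purely an illustration, not anything measured by the benchmark):

    Code:
        # Storage cost of one 4-component (RGBA) pixel-shader register at each precision.
        precisions = {
            "FP16 (NVIDIA partial precision)": 16,
            "FP24 (ATI R3xx/R4xx, the DX9 SM2.0 full-precision minimum)": 24,
            "FP32 (NVIDIA full precision on NV3x/NV4x)": 32,
        }

        for name, bits in precisions.items():
            print(name, "->", 4 * bits // 8, "bytes per register")

        # FP32 temporaries are a third bigger than FP24 and twice the size of FP16,
        # which is extra register space and internal bandwidth the FP24 parts never pay for.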

  3. #28
    Xtreme 3DTeam Member
    Join Date
    Jan 2004
    Location
    Dresden
    Posts
    1,163
    Quote Originally Posted by saaya
    heheh pretty much

    Hmmmm, I'm sorry kanavit, you were right: 8x1 is generally faster than 4x2. But there are still situations, not just theoretical ones, where the 4x2 can be faster. My point was that it doesn't just depend on the pipe/TMU organisation but also on other aspects of the VPU, like what the TMUs and math units can do.

    Or am I wrong again, KHAN?
    I think that the pipes of the NV30/35/40 are basically more powerful than single pipes on the R300/350/360/420.
    In that case you're right.

    ... but what do I know?

  4. #29
    Xtreme Member
    Join Date
    Aug 2004
    Location
    Italy
    Posts
    246
    I think it's just a shame...
    The God Father OC.Team

  5. #30
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Feature-wise, all the NV30, NV35, NV40 and NV43 pipes are "better" than the ATI pipes; they are more flexible and can do more different calculations, I think.

    Performance-wise, the NV30 and NV35 pipes are worse than the ATI pipes, while the NV40 pipes are at least as fast as the ATI pipes, if not much faster, because of their second math unit.

    Quote Originally Posted by Der_KHAN
    ... but what do I know?
    :P

  6. #31
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Yeah, the FX series has pretty poor shader performance. But what I really want to know is why the 6800NUs are beaten by 9800XTs and sometimes even 9800 Pros. Now THAT doesn't make ANY sense.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  7. #32
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Why not? The 9800s are very, very strong pixel shader cards.

    And as we all know, the 6800s have higher per-MHz performance than the X800 cards, so underclocking them hurts their performance a lot (check the 6800NU reviews), and limiting them to 12 pipes makes it even worse...

  8. #33
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    I don't care about clockspeed. The 6800NU has 12 pipelines, the 9800 Pro/XT has 8, and we all know, the more pipelines, the more pixels being drawn on the screen per cycle, clockspeeds be damned (that's why the FX series couldn't be saved by this). Even if you took clockspeeds into consideration, the 6800NU still has a 3900MT/s fillrate, while the Pro and XT only have 3040 and 3296 respectively. It just doesn't make sense.

    And the GF6 series also has strong shader performance, that's one of the things NVIDIA focused on when they made the architecture. They learned their lesson from the FX series.
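
    Those fillrate figures do check out if you multiply pipes by core clock (the 325/380/412 MHz reference clocks below are an assumption; the resulting numbers are the ones quoted above):

    Code:
        # Peak single-texturing fillrate = pixel pipelines x core clock (assumed reference clocks).
        cards = {
            "6800NU   (12 pipes @ 325 MHz)": 12 * 325,
            "9800 Pro  (8 pipes @ 380 MHz)": 8 * 380,
            "9800 XT   (8 pipes @ 412 MHz)": 8 * 412,
        }
        for card, mtexels in cards.items():
            print(card, "->", mtexels, "MT/s")

        # -> 3900, 3040 and 3296 MT/s respectively, yet 3DMark05 doesn't follow
        #    that ordering, which is exactly the puzzle being argued here.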
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  9. #34
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    I don't think it's a 2k5 thing... look at other benches like ShaderMark... the 6800NUs don't score that well there either...

    They DO, but they are slower than or only as fast as the 9800s there as well.

  10. #35
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    But in most games (including DX9 ones) the 6800NU does MUCH better, and if it's ever outperformed, it's only by the 9800XT on rare occasions (and by small margins), NEVER by the 9800 Pro.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  11. #36
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    The difference between the 9800 Pro and the XT is very small.

    About why the 9800s are actually faster than the 6800NUs... no idea... I guess they can handle all the different things the 2k5 scenes need more efficiently... better pipe organisation...

    Or maybe it has to do with 2005 using very little CPU power?
    The 6800s rely way more on CPU power to help them compared to the 9800s and X800s.
    This means Nvidia cards scale better with a strong CPU than the 9800s and X800s, which don't seem to get that much faster with a stronger CPU.

    Maybe the cards are forced to run on their own and can't use a lot of CPU power for the things they have to calculate in the game tests, compared to games, so that's why they are slower?

    I really don't know... interesting point, however...

    Or maybe it's that the 6800s have to calculate everything in 32-bit precision even though 24-bit would be enough, which is what the 9800s use...


    EDIT: Wait a minute, what "DX9 games" do you mean?

  12. #37
    Xtreme Member
    Join Date
    Mar 2004
    Location
    Denmark
    Posts
    490
    Games don't have as many shaders as benchmarks...

    That's the difference...

    You can compare all you want... A game isn't the same as a benchmark...

    The 6800NU is usually only a little faster than the 9800XT... But do you mean the 9800XT beats the 6800NU at stock? Or OC'ed?

    Cuz people that bought the 6800NU are budget people... Some people that still have a 9800 Pro/XT are just waiting for the 6800U/X800XT...

    But you can't compare the bandwidth of the cards... The 5950 had much more bandwidth than the 9800XT... but still got beaten...

    It's architecture... and driver issues, I think...

    But I disagree that ATI made a mistake... I really don't think they did... All I've heard of from SM3.0 is GI and more instructions...

    But even if you reached the max instruction count on 2.0b, it wouldn't run in realtime on the X800/6800 cards...

    And the X800 cards also have GI...

    It would end up like the FX cards... features with no power to use them... SM2.0b can do whatever SM3.0 can on these cards... SM3.0 is better, but unusable...
    P4 3.0C 800@3.64 242FSB 1.625Vcore on Ic7 max3(Watercooled)
    Corsair 2x512@DDR484Mhz PC4000( 3-4-4-8 )2.8V
    X800XT@586/594(Watercooled)
    3*Maxtor 120 GB fluid +9 SATA RAID-0
    3Dmark01: 25250--Aquamark3: 66423--3Dmark03: 13780--3dmark05: 6361

  13. #38
    Xtreme Member
    Join Date
    Jul 2004
    Location
    n^n!
    Posts
    478
    This is only what I think and I really don't have any proof or anything.

    I think the driver is the problem. The 6800 has some new stuff that I believe has lots of extra potential. I think the X800 driver is a spin-off from the R3xx series, as the VPU itself also should be. The 6800 is not such a direct spin-off from the NV3x, but it should share a lot with the NV3x, just less than in the X800/R3xx comparison. In plain words, Nvidia's drivers suck in 2k5, or at least they suck more than ATI's do.

    I think at least 25% extra performance can be gained from driver updates over the next couple of months, on both ATI and Nvidia cards. I don't believe this difference in performance is mirrored across the various benchmarks, synthetic, application, or games in general. I also think there is a difference between how a certain card performs at default speed and how well it scales in benchmarks.

    I believe Nvidia will have the advantage with the NV4x series over the R4xx series in the end. Pure speculation though, and time will tell. In general, performance- and feature-wise, ATI has the overall advantage for the moment. I believe that will change.

  14. #39
    Xtreme Member
    Join Date
    Mar 2004
    Location
    Denmark
    Posts
    490
    Quote Originally Posted by Cybercat
    I don't care about clockspeed. The 6800NU has 12 pipelines, the 9800 Pro/XT has 8, and we all know, the more pipelines, the more pixels being drawn on the screen per cycle, clockspeeds be damned (that's why the FX series couldn't be saved by this). Even if you took clockspeeds into consideration, the 6800NU still has a 3900MT/s fillrate, while the Pro and XT only have 3040 and 3296 respectively. It just doesn't make sense.

    And the GF6 series also has strong shader performance, that's one of the things NVIDIA focused on when they made the architecture. They learned their lesson from the FX series.
    The number of pipes isn't everything...

    Perhaps the 9800 series cards control their pipes better than the 6800 series? But the 6800 series has so much more power that it makes up for it... unless it's the super scaled-down 6800NU...

    But the 9600 series cards are better than the 59XX series... It's just that by the time the 9600 series runs away from the 59XX series, it's unplayable on both cards... The 5XXX series cards just suck... Hehe... not fanboyism or anything... even Nvidia gave up on the 5XXX series...
    P4 3.0C 800@3.64 242FSB 1.625Vcore on Ic7 max3(Watercooled)
    Corsair 2x512@DDR484Mhz PC4000( 3-4-4-8 )2.8V
    X800XT@586/594(Watercooled)
    3*Maxtor 120 GB fluid +9 SATA RAID-0
    3Dmark01: 25250--Aquamark3: 66423--3Dmark03: 13780--3dmark05: 6361

  15. #40
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Yes Mini, that's what I think too. The only disadvantage ATI has because it didn't go SM3.0 is that Nvidia is now defining some of the SM3.0 specs... but it's not that bad...

    And Tek... I really don't think so, and there are no facts pointing towards Nvidia still having a lot of potential in their 6800s that they couldn't unlock with the drivers...

    And why should the next-generation ATI card lose to the current-generation Nvidia card? 0_o

  16. #41
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Hmmm, I wouldn't say that, Mini. The 5900XT cards were very nice cards, with a very good price/performance ratio... not for future games... but as we both agreed, it will take time before real games make use of all those features, and until then the 5xxx cards are still "ok" to "good".

  17. #42
    Xtreme Member
    Join Date
    Mar 2004
    Location
    Denmark
    Posts
    490
    Quote Originally Posted by saaya
    Yes Mini, that's what I think too. The only disadvantage ATI has because it didn't go SM3.0 is that Nvidia is now defining some of the SM3.0 specs... but it's not that bad...

    And Tek... I really don't think so, and there are no facts pointing towards Nvidia still having a lot of potential in their 6800s that they couldn't unlock with the drivers...

    And why should the next-generation ATI card lose to the current-generation Nvidia card? 0_o
    Perhaps... but what specs?

    SM3.0 has what it has... Nvidia or ATI, it's still only SM3.0... A year from now it will have the same features...

    Of course ATI loses a little bit... but heck, the people that choose graphics cards that way are the same guys that say "the FX5900 is better than the 9800 Pro because it has 256MB of RAM and the 9800 Pro only has 128"...

    Saaya: Do you know if the R480 will be SM3.0? Or will they wait until the R520? (Btw, did you see that the first R500 for Xbox2 has been made?)

    Edit: True... It was sorta ironic... Not 100%, but still... Hehe...

    The FX series is ok... but you can see them get run over in DX9 games... and now all games are DX9... so it's not at all futureproof...

    But I think that if ATI and Nvidia put a lot more resources into drivers we would see a BIG difference... It could unlock the true potential of the cards...

    Cat 4.9 is just 4.8 fixed up... so all the small bugs and old crap in there, using resources for nothing, are still there...

    Would be lovely if they could make a 5.0 from the ground up... only for the X800 series... I think we would see a nice improvement... Of course it's unrealistic... but it would still be nice.
    Last edited by Mini; 10-02-2004 at 07:28 PM.
    P4 3.0C 800@3.64 242FSB 1.625Vcore on Ic7 max3(Watercooled)
    Corsair 2x512@DDR484Mhz PC4000( 3-4-4-8 )2.8V
    X800XT@586/594(Watercooled)
    3*Maxtor 120 GB fluid +9 SATA RAID-0
    3Dmark01: 25250--Aquamark3: 66423--3Dmark03: 13780--3dmark05: 6361

  18. #43
    Xtreme Addict
    Join Date
    May 2004
    Location
    Perth, Western Australia
    Posts
    1,622
    I always thought it was the 4x2 thing. My 5900XT was at 590 on the GPU with only 820 RAM speed and it still scores 7151 with my A64 at only 2.25GHz and a 250 FSB. A lot of OC'ed 9800 Pros have trouble even getting to 7k. Point is, on the FX, clock speed really helps.
    Also, I think that since the ATI cards are just sped-up R3xx cards with pipes added, they are really fast in benches when the quality is low. But the GeForce 6 cards are more gaming cards, with their high IQ, enhancing options and 32-bit precision.
    You may say: but isn't ATI way better at AA?
    Well, then look at AF; Nvidia's drivers are really good at high AF levels, much better than ATI's.
    And when I'm gaming I like AF more than AA, to give a sharper image. I'd rather have 16x AF and 4x AA than 8x of each...
    Quote Originally Posted by bh2k
    sorry m, OI'm a bit drunkz!
    Air benches with 3000+, DFI nf3 and 6800GT 2001SE: 26312 3d03: 13028

  19. #44
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by saaya
    The difference between the 9800 Pro and the XT is very small.
    Which is why when the 9800XT beats the 6800NU, it's only by a small margin.

    Quote Originally Posted by saaya
    Wait a minute, what "DX9 games" do you mean?
    Halo and FarCry are two main ones. While the X800 series does better than the 6800 series in FarCry, the 6800NU still does better than the 9800XT by a pretty good margin.

    Quote Originally Posted by Mini
    The 6800NU is usually only a little faster than the 9800XT...
    Usually a lot faster, actually.

    Quote Originally Posted by Mini
    But you can't compare the bandwidth of the cards... The 5950 had much more bandwidth than the 9800XT... but still got beaten...
    I wasn't comparing memory bandwidth, I was comparing fillrates, which are completely different and have a much larger effect on determining a card's performance.

    Quote Originally Posted by Mini
    The number of pipes isn't everything...

    Perhaps the 9800 series cards control their pipes better than the 6800 series? But the 6800 series has so much more power that it makes up for it... unless it's the super scaled-down 6800NU...
    I think it has more to do with the drivers than anything else. And no, pipelines aren't everything (I don't remember saying they were), but we can both agree that they do have a PROFOUND influence on performance.

    Quote Originally Posted by saaya
    I really don't think so, and there are no facts pointing towards Nvidia still having a lot of potential in their 6800s that they couldn't unlock with the drivers...
    No, there are no facts. You can't have facts in such a situation, only speculation. I always thought the same as Tek does about the GF6 series having a lot more potential yet to be unleashed than ATI's cards, since the latter are a spin-off of the original R300 architecture, which has been around for a while now. But then, with Catalyst 4.10, ATI pulled off a performance boost I didn't think was possible. It really surprised me.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  20. #45
    Xtreme Member
    Join Date
    Sep 2004
    Location
    Toronto, Canada
    Posts
    157
    I just saw it mentioned earlier that the 6800 is 8x2; it's not. It is 16x1, and it can render depth-only at 32x0, which is why it's so much faster in Doom 3. As for ATI's vs Nvidia's scores in 3DMark05, the only thing I can imagine is that the program is designed to favor ATI. If it were drivers or hardware, then the issue would show up in all games/benches. However, this effect only occurs with 3DMark05, which means it has to do with how 3DMark05 was programmed. Maybe it was intentional, maybe not; I'm sure they'll never admit to it if it was.
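
    To put rough numbers on the 16x1 / 32x0 point (a sketch assuming the 6800 Ultra's 400 MHz reference clock; the doubling only applies to depth/stencil-only passes):

    Code:
        # Colour rendering: up to 16 pixels per clock.  Depth/stencil-only: up to 32 per clock.
        CLOCK_MHZ = 400  # assumed 6800 Ultra reference clock

        color_rate  = 16 * CLOCK_MHZ   # Mpixels/s with colour writes
        z_only_rate = 32 * CLOCK_MHZ   # Mpixels/s when only depth/stencil is written

        print("colour:", color_rate, "Mpix/s   z/stencil only:", z_only_rate, "Mpix/s")
        # Doom 3 spends much of its fillrate on stencil shadow volumes with no colour
        # output at all, which is exactly where the 32x0 mode pays off.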
    Forums are the Opiate of the Masses

  21. #46
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Mini, the R480 is a higher-clocked X800 in 110nm. The R5xx is SM3.0.
    Catalyst 4.8 was a CS: Source/Half-Life 2 boost driver while 4.9 was a Doom 3 boost driver (or vice versa?).

    Reject, I still haven't seen a screenshot that shows a worse rendered image on an X800 card compared to a 6800 card. Talking about image quality, did you know that Nvidia has yet to release a SINGLE 6x.xx driver that passes WHQL certification? All the latest drivers fail ALL of the 38 precision and image quality tests.

    But don't get me wrong, I think ATI not giving people the chance to disable the optimizations is a very bad move...


    Quote Originally Posted by Cybercat
    Halo and FarCry are two main ones. While the X800 series does better than the 6800 series in FarCry, the 6800NU still does better than the 9800XT by a pretty good margin.
    Depends...

    I don't know what eye candy settings 2k5 uses... but you are right, the 6800NU is usually a bit faster than the 9800XT.

    But as somebody in this thread said before, 2k5 stresses the cards more than current games do and uses waaaaay more and bigger pixel shaders.

    But it still remains mysterious why the 6800 cards perform worse in 2k5...

    I don't think it has anything to do with ATI buying FM though...


    Quote Originally Posted by Cybercat
    I think it has more to do with the drivers than anything else. And no, pipelines aren't everything (I don't remember saying they were), but we can both agree that they do have a PROFOUND influence on performance.
    I wouldn't say that... the Volari V8 has 8 pipes and look at its performance... the bare number of pipes really doesn't mean anything... it's like saying the amount of L2 cache in a CPU defines its speed...

    Same goes for fill rate; it's all about efficiency... pure fill rate and pure bandwidth indicate the performance, but I wouldn't say a card is fast just because it has a high theoretical peak fill rate.


    Quote Originally Posted by Cybercat
    No, there are no facts. You can't have facts in such a situation, only speculation. I always thought the same as Tek does about the GF6 series having a lot more potential yet to be unleashed than ATI's cards, since the latter are a spin-off of the original R300 architecture, which has been around for a while now. But then, with Catalyst 4.10, ATI pulled off a performance boost I didn't think was possible. It really surprised me.
    Speculations based on hope are dreams...

    Real speculations are based on facts, like saying "look at 2k3, it took Nvidia 2 to 3 driver releases to finally unleash the full potential of their cards back then, so that's why I think it will take Nvidia 1 or 2 more driver releases this time before they can unleash the full GeForce 6 potential".

    THAT would be a speculation!

    Just saying "I think they will get a big performance boost... because I want it" is just dreaming.


    You convinced me, though, that there's something odd about the 6xxx cards not performing so well...

    Well, the batch test in 2k5 actually DOES indicate that Nvidia has worked MUCH less on the drivers than ATI... funny, as it was Nvidia's drivers that helped them reach and hold the performance peak in their history... they seem to have forgotten how important drivers are and probably focus way too much on hardware after the 5800 hardware disaster.

  22. #47
    Xtreme Addict
    Join Date
    May 2004
    Location
    Perth, Western Australia
    Posts
    1,622
    But Nvidia's 60.x drivers are awesome; 61.32 improved the IQ and 16x AF speed so much. They have a very good track record for optimizing drivers, e.g. 44.03, 53.03, 56.72, 61.36, 65.37, etc.
    There are a few WHQL 60.x drivers; don't know, maybe you missed them. But look at the FM-approved drivers for 03: the only Nvidia one is 52.16, which is archaic, and even for 03, Cat 4.9 is approved. Maybe ATI is pushing their drivers to FM with a little incentive of some sort?

    Look, I found 3 WHQL drivers:
    http://downloads.guru3d.com/download.php?det=789
    http://downloads.guru3d.com/download.php?det=810
    http://downloads.guru3d.com/download.php?det=881
    Quote Originally Posted by bh2k
    sorry m, OI'm a bit drunkz!
    Air benches with 3000+, DFI nf3 and 6800GT 2001SE: 26312 3d03: 13028

  23. #48
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by saaya
    Talking about image quality, did you know that Nvidia has yet to release a SINGLE 6x.xx driver that passes WHQL certification?
    Um, 60.85, 60.86, 61.21, 61.36, 61.72, 61.76, 65.73, and 66.72 were all WHQL'd. (http://downloads.guru3d.com/download.php?id=10) The main problem NVIDIA has with WHQL certification is that its drivers have to work with all current and past legacy cards, because of their "marchitecture" unified drivers that are meant to work with all cards. ATI has an easier time getting WHQL because they keep their Radeon and legacy Rage drivers separate.

    Quote Originally Posted by saaya
    I wouldn't say that... the Volari V8 has 8 pipes and look at its performance... the bare number of pipes really doesn't mean anything... it's like saying the amount of L2 cache in a CPU defines its speed...

    Same goes for fill rate; it's all about efficiency... pure fill rate and pure bandwidth indicate the performance, but I wouldn't say a card is fast just because it has a high theoretical peak fill rate.
    Well, we both know the Volari never had a chance in hell. They had other architectural problems, not to mention a terrible driver team holding them back. Again I say, pipelines are indeed not everything, but when I mention them, I mainly refer to the two graphics card giants of the time. Third party companies have other considerable disadvantages.

    Quote Originally Posted by saaya
    Well, the batch test in 2k5 actually DOES indicate that Nvidia has worked MUCH less on the drivers than ATI... funny, as it was Nvidia's drivers that helped them reach and hold the performance peak in their history... they seem to have forgotten how important drivers are and probably focus way too much on hardware after the 5800 hardware disaster.
    No, I wouldn't say that. NVIDIA isn't stupid, they know how the graphics card market works. You can't have bad hardware and expect to make up for it with great drivers, and at the same time, you can't have unoptimized drivers and expect raw hardware horsepower to do the work for you. For the most part, it takes equal effort from both aspects to win the war (yes, war, that's how serious these companies are). Last time, NVIDIA had hardware that wasn't necessarily terrible, just greatly underpowered in critical areas compared to the competition. At first, NVIDIA's drivers were the source of much controversy, but NVIDIA kept at it, trying to do the impossible by improving IQ while still boosting performance. They did that as much as possible with the FX architecture. I still find it an amazing feat that NVIDIA made a 4x2 card (with other disadvantages besides) so competitive against ATI's almost flawless 8-pipe architecture. In the end, however, competitive is where it ended. They were still a ways away from being able to win the competition, as ATI still came out on top with better IQ and better performance. NVIDIA knows exactly how important drivers are. They have an excellent driver team, working as hard as ever to get as much performance out of the GF6 architecture as they can. There have been as many as five beta driver releases in one week. Last time, NVIDIA had the drivers doing everything they could, but just didn't have the hardware. This time, NVIDIA will have both pulling for them, something ATI had all along. This is what makes this year's battle such a close and exciting race.
    Last edited by Cybercat; 10-03-2004 at 05:44 AM.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  24. #49
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Cybercat
    Um, 60.85, 60.86, 61.21, 61.36, 61.72, 61.76, 65.73, and 66.72 were all WHQL'd. (http://downloads.guru3d.com/download.php?id=10) The main problem NVIDIA has with WHQL certification is that its drivers have to work with all current and past legacy cards, because of their "marchitecture" unified drivers that are meant to work with all cards. ATI has an easier time getting WHQL because they keep their Radeon and legacy Rage drivers separate.
    They are not fully WHQL certified as far as I know; don't know about you, but I got the Windows pop-up saying they aren't certified even though Nvidia said they are WHQL.

    And saying they have more work to do because they have one driver for all cards is ...


    Quote Originally Posted by Cybercat
    Well, we both know the Volari never had a chance in hell. They had other architectural problems, not to mention a terrible driver team holding them back. Again I say, pipelines are indeed not everything, but when I mention them, I mainly refer to the two graphics card giants of the time. Third party companies have other considerable disadvantages.
    Hehehe, yeah.
    Well, my point was that the overall design matters; you can't just pick one part of the design and say that's what defines the speed... People love to make things simpler, but you can't really do it like that. People love MHz because it's a general indication of performance, yet a higher-clocked card doesn't always mean more performance.


    Quote Originally Posted by Cybercat
    No, I wouldn't say that. NVIDIA isn't stupid, they know how the graphics card market works. You can't have bad hardware and expect to make up for it with great drivers, and at the same time, you can't have unoptimized drivers and expect raw hardware horsepower to do the work for you. For the most part, it takes equal effort from both aspects to win the war (yes, war, that's how serious these companies are). Last time, NVIDIA had hardware that wasn't necessarily terrible, just greatly underpowered in critical areas compared to the competition. At first, NVIDIA's drivers were the source of much controversy, but NVIDIA kept at it, trying to do the impossible by improving IQ while still boosting performance. They did that as much as possible with the FX architecture. I still find it an amazing feat that NVIDIA made a 4x2 card (with other disadvantages besides) so competitive against ATI's almost flawless 8-pipe architecture. In the end, however, competitive is where it ended. They were still a ways away from being able to win the competition, as ATI still came out on top with better IQ and better performance. NVIDIA knows exactly how important drivers are. They have an excellent driver team, working as hard as ever to get as much performance out of the GF6 architecture as they can. There have been as many as five beta driver releases in one week. Last time, NVIDIA had the drivers doing everything they could, but just didn't have the hardware. This time, NVIDIA will have both pulling for them, something ATI had all along. This is what makes this year's battle such a close and exciting race.
    Hmmm, well, whether Nvidia is stupid or not is something you can only argue about.

    If you look at all their marketing, the media dirt wars, the way they tried to sell the FX5800 for more money than the competition even though it was slower, noisier and needed a bigger PSU, it speaks for itself... The bare fact that they honestly tried to sell those cards with such noisy coolers makes me wonder... how could they honestly have thought people don't mind a 55dB+ fan? In my opinion that WAS stupid, just one example...

    And about Nvidia's "great" driver team: they used to have one, but in the last year they seem to have focused on performance first, with IQ only secondary... a bad decision if you ask me, as the high-end cards all perform more or less the same and are all fast enough to play all games without any problems. I would choose the slower of the two top cards if it had better IQ, and so would most people... so again, that's another stupid decision.

    Something else that pisses me off about Nvidia is how they use Far Cry to promote their cards (SM3.0), yet to date there is STILL NO SINGLE DRIVER that runs Far Cry without any bugs! That's just sad... I tried Far Cry with my GF4 MX and a friend's GF4 Ti with 10 of the most recent Nvidia drivers and they all had bugs and artifacts! ant1, a buddy of mine, has a 6800GT and also tested all the latest drivers and had the same problems! Now where is that great driver team Nvidia used to have?

    Look at the batch test in 2k5 (read the X-bit article about 2k5), which clearly shows that ATI has put more effort into the latest drivers, a LOT more than Nvidia. And that test just backs up what I am seeing in several games.

    Nvidia drivers are not bad, but they are far from what they used to be, and imo ATI's drivers are at least as good, if not better.


    Nvidia's facts are not so bright: they had four partially-WHQL drivers, while their latest publicly available 61.77 is not WHQL at all.

    Our snitch sent us a complete report on a WHQL test run on an Nvidia Geforce 5950 with 61.77. It is failing each and every part of the WHQL test. It is failing all fourteen tests, including D3Dlines, Multisampling, Non power 2 conditional RenderTarget, Pixel Shader Precision, Pixel Shader Ver. 1.1, Pixel Shader Ver. 1.2, Pixel Shader Ver. 1.3, Pixel Shader Ver. 1.4, Pixel Shader Ver. 2.0, Point Sprites, Texture address, Texture stage, Update surface and the YUV Bit test.

    We are not sure what is going on with the 6X00 cards and how they are performing, but the fact that Nvidia still doesn't have WHQL drivers on its web page speaks for itself.
    http://www.theinquirer.net/?article=18730

    Correcting myself: no idea how I thought it was 38 tests before :P

  25. #50
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by saaya
    They are not fully WHQL certified as far as I know; don't know about you, but I got the Windows pop-up saying they aren't certified even though Nvidia said they are WHQL.
    Funny, I never do.

    Quote Originally Posted by saaya
    And saying they have more work to do because they have one driver for all cards is ...
    How so? Here's an Inquirer article stating the same thing I just did.

    Quote Originally Posted by saaya
    If you look at all their marketing, the media dirt wars, the way they tried to sell the FX5800 for more money than the competition even though it was slower, noisier and needed a bigger PSU, it speaks for itself... The bare fact that they honestly tried to sell those cards with such noisy coolers makes me wonder... how could they honestly have thought people don't mind a 55dB+ fan? In my opinion that WAS stupid, just one example...
    The 5800 was a rush job, not thought out thoroughly, and no one is debating its flaws. NVIDIA learned very quickly from that mistake, which is why they were recalled.

    Quote Originally Posted by saaya
    And about Nvidia's "great" driver team: they used to have one, but in the last year they seem to have focused on performance first, with IQ only secondary... a bad decision if you ask me, as the high-end cards all perform more or less the same and are all fast enough to play all games without any problems. I would choose the slower of the two top cards if it had better IQ, and so would most people... so again, that's another stupid decision.
    I don't agree. I've yet to see anything degraded about the GF6's image quality compared to ATI's. While the FX series was another story (they really had no choice in the matter, since those cards would perform like utter crap if they matched ATI's IQ), there's nothing about the GF6 and its latest drivers that suggests IQ is only "secondary".

    Quote Originally Posted by saaya
    Something else that pisses me off about Nvidia is how they use Far Cry to promote their cards (SM3.0), yet to date there is STILL NO SINGLE DRIVER that runs Far Cry without any bugs! That's just sad... I tried Far Cry with my GF4 MX and a friend's GF4 Ti with 10 of the most recent Nvidia drivers and they all had bugs and artifacts! ant1, a buddy of mine, has a 6800GT and also tested all the latest drivers and had the same problems! Now where is that great driver team Nvidia used to have?
    There's only one instance where NVIDIA has an image quality bug with the game, and that can be seen here. That is a problem on Crytek's end, as the new 1.2 patch (the one that has since been recalled) fixed the issue. Other than that, I have never seen any of the artifacts or bugs that you speak of with any NVIDIA card that I've ever used, nor have I heard anyone else complain about them (have you?). They seem to be isolated incidents with either the computer you tested on or the graphics cards themselves. I had a GF4 Ti4200 that I tried playing FarCry on, and experienced no such problems. The same holds for my current card.

    Quote Originally Posted by saaya
    Look at the batch test in 2k5 (read the X-bit article about 2k5), which clearly shows that ATI has put more effort into the latest drivers, a LOT more than Nvidia. And that test just backs up what I am seeing in several games.
    If you could provide a link I'd be very appreciative. And yes, I'm quite lazy.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.
