yes!
we want some scores from you
i think lower pcie is master slot in xfire config
RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SSD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W
RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingston HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU
SmartPhone Samsung Galaxy S7 EDGE
XBONE paired with 55'' Samsung LED 3D TV
yeah, but it was set to have the higher slot as master... and it worked. i only had a card in the upper slot and it booted fine; restarted the system and got no display and error code 78, 10 secs of nothing, then 2 fast beeps and error code 85 on the diagnostic display on my board.
took me a while to figure out the card had to be put into the lower slot to boot properly again
thanks althes, i have a fever and my throat hurts and all, but i couldn't resist playing with this thing, best distraction from the pain for me haha
anyways, uploading pics to imageshack, only single-card results so far. pics will get posted in a sec.
a64 winchester 3200+ at stock 10x200
2x256mb 2.5-3-3
1x sapphire x1600pro
2k1:
500/800 → 17345
500/900 → 17859
600/800 → 17507
600/900 → 18091
notice how upping the mem speed by 50mhz (100mhz ddr) gives over three times the boost (+514 vs +162 points) compared to increasing the core clock by 100mhz!
2k3:
500/800 → 6982
500/900 → 7331
600/800 → 7356
600/900 → 7829
in 2k3 upping the core 100mhz gives a slightly bigger boost than increasing the mem speed 50mhz (100mhz ddr)
2k5:
500/800 → 3816
500/900 → 4026
600/800 → 3995
600/900 → 4262
and surprisingly in 2k5 the mem bandwidth is more important again...
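the three tables above can be boiled down to raw point deltas; a quick sketch (numbers copied straight from the runs above, nothing else assumed):

```python
# Score deltas from the single-card runs above:
# baseline 500/800, mem OC 500/900 (+50 MHz real / +100 MHz DDR), core OC 600/800 (+100 MHz)
baseline = {"2k1": 17345, "2k3": 6982, "2k5": 3816}
mem_oc   = {"2k1": 17859, "2k3": 7331, "2k5": 4026}
core_oc  = {"2k1": 17507, "2k3": 7356, "2k5": 3995}

for bench in baseline:
    mem_gain  = mem_oc[bench]  - baseline[bench]
    core_gain = core_oc[bench] - baseline[bench]
    print(f"{bench}: mem +{mem_gain}, core +{core_gain}, "
          f"mem/core ratio {mem_gain / core_gain:.2f}")
```

the ratio makes the pattern obvious: way above 1 in 2k1, just under 1 in 2k3, and above 1 again in 2k5.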
so it's obvious those cards starve for memory bandwidth, which makes me wonder why ati went the 128-bit route and gave the xt some high-end gddr3 memory instead of making at least the xt a 256-bit card with slower and cheaper gddr2, which would have resulted in the same if not more memory bandwidth...
for nvidia's 6600s 128-bit made sense: they performed very well nevertheless, and 256-bit would have brought them too close to the 6800 series. but for the 1600s it doesn't really make sense if you ask me...
there's a very wide gap between the 1800s and the 1600s, both price- and performance-wise; a 1600 with 256-bit would have closed that gap a decent bit, if not completely. now we have to wait for the 1700 to close that gap, but i guess it will still be 128-bit and just with even faster gddr3 or even gddr4... which i don't understand either... maybe ati gets gddr3/gddr4 that much cheaper if they buy a huge bunch, so it makes more sense to use the latest high-speed memory in a 128-bit configuration rather than using older memory on a 256-bit bus, which would give the same or more bandwidth...?
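the bandwidth argument is easy to sanity-check. a minimal sketch, with illustrative clocks of my own choosing (roughly 1380mhz effective for fast gddr3 and 700mhz effective for slower, cheaper memory; neither number is taken from an official spec sheet):

```python
def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes times effective clock."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# Assumed example clocks, purely for illustration:
narrow_fast = bandwidth_gbs(128, 1380)  # 128-bit bus with fast GDDR3
wide_slow   = bandwidth_gbs(256, 700)   # 256-bit bus with slower/cheaper memory
print(f"{narrow_fast:.2f} GB/s vs {wide_slow:.2f} GB/s")
```

with those example clocks the wide/slow setup already matches the narrow/fast one, which is the whole point: the bus width doubling buys you as much as nearly doubling the memory clock.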
the 1600 pros don't make any sense if you ask me
these 1600 cards would have been 6600 killers a year ago, but now they are too slow and cost too much to be serious competition in crossfire against single cards like the 7800gt, or a useful replacement for a x850xt. the x850xt lacks sm3.0, but then again, is sm3.0 that important?
even the top-end cards struggle to perform with hdr and aa, and the 7800s can't even run hdr with aa... so does a mid-range card with sm3.0 make sense? if you ask me, no. it'll be too slow to perform well enough in future games that really use hdr, and all those lil sm3 gimmicks we've seen aren't worth upgrading to sm3.0 for yet. if you want those gimmicks, go for it; otherwise i recommend a x850xt, a x800gt or a 7800gt, depending on what kind of performance you're looking for and how much you're willing to spend.
1600s in crossfire don't really make sense, at least not for the pros, and i don't think the xts will suddenly perform much better.
first i had hoped it was the cpu holding them back in cf, but the 3ghz cf run from althes confirms what i already suspected: these 1600 cards don't need a fast cpu... they remind me of my gf4mx, which didn't scale with cpu speed either as it was already maxed out.
and 1600s as single cards make sense as they beat the 6600s, but then again, who's still buying 6600s?
the 1600s aren't maxed out but i bet they are close. will run some tests with different cpu speeds and some cf tests and post results tomorrow
horrible results, a 9800 pro would be about there
i5 3570K@ 4.8GHz 1.32v, 32GB Gskill 1866, Gigabyte g1 sniper m3
HD7970@1125/1575 stock voltage
1TB F1+2*128GB Crucial M4
Silverstone 450w, no case
2560*1440@120Hz overclocked catleap
Steelseries G6v2+5600dpi modded logitech trackball
yeah...
Originally Posted by cirthix
(made this smiley just now, let's call it the always disappointed saaya smiley )
Thanks for proving the point.
These cards shouldn't cost this much because they just aren't worth it.
Running at 3GHz these X1600 XTs in CF should have been hitting 10k with ease.
They don't scale and really are maxed out.
I may get the DFI RDX200 board again to see if you can run them in CF without a master card.
Abit KN9-SLI
X2 3800+
1gb corsair pc5400
2 evga 7900gt
200gb maxtor sata
ocz 520w
heatware:althes
ebay althes
you think cf was still not enabled?
i think it was, but the fact that we're even asking ourselves whether it was enabled at all proves how slow these cards are better than anything else
i'm still going to play with the cards in cf, it's interesting from a technology standpoint, but doesn't make any sense cost- or performance-wise...
as i said earlier, if there was a single card scoring as high as two 1600s in cf, nobody would even look at it; it's just the cf bonus that makes it interesting
We'll have some fun; I will be playing with my xfire in Doom and COD tomorrow.
thx, sure will
lmk how it goes
"only"
Originally Posted by saaya
you do remember the stepping?
nice work though
by the way, 25K in 2001 would be nice, maybe a goal for you to set
sec, lemme open the box. i'm running the winnie for now because i didn't trust the engineering-sample cf board and didn't want to lose this sweet venice from amd ^^
Originally Posted by kutteke
ada3800daa4bp
abble 0504epaw
1195157B50138
funny thing is it says 1.35v on the bottom of the ihs; i can't remember having seen this on another a64.
and 25k in 2k1... with a single card you mean?
going to post pics now. in 2k1 i get a LOWER score with cf than with a single card, lol
enabling crossfire worked fine: plugged in the second card, enabled dual gpu in the bios, booted, new device found, reboot, and then some quick and easy steps:
go to the ati symbol in the task bar
go to the 2nd primary card showing up and click extend desktop
open the ccc and hit enable crossfire, done.
hehehe... finally making the right connections, I see. I think it's time you send me all your equipment and I'll test it for you. BTW, that's a good cpu... hold on to it. Funny how quickly we forget..........
Originally Posted by saaya
nice cpus s7
in real games the x1600xt is between a 6800 and 6800gs.
Originally Posted by cirthix
Unapproved link in signature. Signature has been removed.
Please read the forums rules and guidelines. They are at the top of the forums.
Edited by IFMU
TYVM..... just an FYI... those two CPUs were the first Venices ever seen in the wild. I brought them here to XS and they created quite a stir....
Originally Posted by althes
yeah, i still remember it... man, that's quite some time ago, isn't it?
Originally Posted by s7e9h3n
good times... good times
sooo, time for a lil recap...
2k1: -6.3%
2k3: +31.0%
2k5: +69.5%
2k5 seems to scale really nicely, now let's see how games scale
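for anyone who wants the raw numbers behind those percentages, here's a sketch. note the baseline is my assumption (the single-card 600/900 runs posted earlier), not something stated explicitly:

```python
# Single-card 600/900 scores from the earlier tables, and the CF scaling
# percentages quoted above. Baseline choice is an assumption.
single  = {"2k1": 18091, "2k3": 7829, "2k5": 4262}
scaling = {"2k1": -0.063, "2k3": 0.310, "2k5": 0.695}

for bench, score in single.items():
    cf_score = score * (1 + scaling[bench])
    print(f"{bench}: ~{cf_score:.0f} with crossfire")
```

so 2k1 actually loses points with the second card, while 2k5 picks up roughly 3000.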
i have to add a few things though: the image quality on this card is notably worse than on my x850xt pe!
the dust/fog that gets blown up in 2k5 return to proxycon, when the big reservoir falls down, is yellowish on this card
textures have a slightly worse image quality as well...
i guess that changing the settings in ccc can fix this, i'm running default, but to get the same top image quality as on x850 cards you will have to pay a little performance penalty.
something i didn't like about these sapphire cards is the weird noises they make sometimes.
does anybody else experience this?
they make a high-pitched noise sometimes. it definitely comes from the front side of the cards, but i can't really tell where from exactly.
if the cpu tests in 2k3 or 2k5 are running this noise comes up all the time: a pulsating high-pitched "feeeb" where the faster the fps, the faster it goes. it does one feeb for each frame in the cpu test. it only happens between 1-4 fps; above that it either turns into a steady feeb or disappears. really annoying...
and it seems other people also had some weird stuff smeared on their cards?
wth is up with that? it appears there's some lousy quality control going on at sapphire, as it seems their personnel don't even wear antistatic gloves!?! i have a bunch of fingerprints on the back of both cards...
now, something even more concerning: the performance in 2k5 looks sweet, high fps... BUT that's not how it felt at some moments...
the fps was high enough to render the scene smoothly, 25fps and above, but it was stuttering notably... which makes me think that 2k5 uses alternate frame rendering, where one card renders one frame and the second card the next. and somehow they got out of sync and produced nearly identical frames for some time, so i got 25fps, but actually it was 12 identical or nearly identical frames.
so it was like having 12 frames and just showing each frame twice, which resulted in 25fps, but it's not really 25fps...
this was in gt3 in 2k5 btw
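the duplicated-frame idea above, in numbers. the factor of two is an assumed value for illustration, not something measured:

```python
# If AFR desyncs and both cards render (nearly) the same frame, the fps
# counter still reports every duplicate as a new frame.
reported_fps = 25          # what the benchmark counter shows
frames_per_unique = 2      # assumption: each frame effectively shown twice

effective_fps = reported_fps / frames_per_unique
print(effective_fps)  # unique frames per second actually reaching the eye
```

which is exactly why 25fps on the counter can still look and feel like 12-13fps stutter.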
there were also some weird things in gt3 in 2k5: the flying boat was moving faster, then slower, then faster again, which also indicates that both cards got out of sync or had problems staying in sync.
this happened at the beginning of the game test.
it looked like both cards came out of sync and produced the same frames; then after half a second or a second it was running smoothly again, but the fps dropped...
and in gt1, return to proxycon, there were some weird things as well. in the scene with the guy with the huge machine gun who blows away all the soldiers and boxes, there's a part where you see him from a frog's perspective while the machine gun is flashing with a huge flame bursting out. in this scene the lower third of the frame, below his machine gun, was suddenly dark, even though the machine gun flash had just lit up the upper part of the frame. this happened a few times and looks a little like dragothic in 2k1 without vertical sync, where you see partially rendered frames put on top of the previous frame, so part of the frame is already further along in the scene than the rest.
i noticed that with crossfire this partial-frame thing happens more often.
this makes me think that in 2k5's gt1 the cards are using sfr? and the cards got out of sync again...
it's all a little concerning, but then again, this will hopefully be fixed with new drivers.
where do i change what setting crossfire runs at?
where do i set it to do only super tiling or only sfr or afr?
sorry for the spelling, using another keyboard and it sux :P
2K5 is an easy answer
Originally Posted by saaya
clocks, that's all
hmm, either CF doesn't like not having a master card or there are a lot of driver issues.
These results make me wanna go X850/X800
I mean...6k on air isn't too hard for the gto with the vf-700, not to mention the perf in games matches the perf in 3d05...to think that it was me who was crazy about this from the beginning
Perkam
don't get what you mean...
Originally Posted by kutteke
and sabrewolf, where is the link to games scaling 70-100% with crossfire?
and yeah, perkam, as i said earlier... the 1600s aren't worth it... unless the price drops even more.
the original price before the recent price drop was ridiculous; it's better now, but still not good enough...
3dmark05 is all about the gpu power
Originally Posted by saaya
you should know that
you live on this forum
But I gave him links to imgs with 95+% improvement with cf
Originally Posted by kutteke