View Full Version : Out - Geforce FX5600 - in - GeForce 6800 - Holy Crap!
My system (P4 2.6, 1GB Ram, GeForce FX5600-256MB) posted 1973 on the 3DMark03 test...
Swapped the 5600 for a 6800GT 256MB card - and the system posted a freaking 10100 on the same 3DMark03 test
:)
Now I can run GTR at 1024x768 32bit and post 70fps+ in game --- vs 15fps previously.
One would not think that 800x600 32bit would be "noticeably worse" than 1024x768 32bit with anti-aliasing enabled - but it is like night and day.
If anyone is debating whether the jump from the GeForce 5xxx series to the 6xxx is worth it - stop debating and start installing... :)
TransAm
05-24-2005, 12:35 PM
That's an improvement
What display are you running this graphics card to? Big CRT or an LCD?
Well - my excitement is tempered by the fact that the FX5600 would support 3D acceleration while running dual-view drivers - which allowed 1600x1200 to my flat panel with the S-Video out to a big screen projection TV at 1024x768 with no distortion.
But the 6800GT will not allow 3D acceleration while the dual-view drivers are running - so I have to use clone display.
This means that 1024x768 or 800x600 on the TV is now stretched a little.
I may need to use the DVI to VGA adapter, then a VGA to S-Video dongle to get back my square squares and round circles.
But who knows what the quality loss will be.
But on the upside, it is 10 times faster.. :lol:
TransAm
05-24-2005, 12:55 PM
If you upgrade to a plasma TV at some stage, most of them have DVI input = no quality loss at all. Digital card to digital screen via digital interface (I think so anyway, I'm no home theatre guru)
Well - I have a few buddies with plasma and/or HD TVs - and watching 16x9 widescreen all the time is a pain - and even worse for gaming without some kind of display adjustment.
They have lossless DVI input, but other screen layout/ratio issues.
Consoles have display adjustment built in - but few PC games have it.
And I hate watching non HD/non widescreen shows all stretched out.
I will be one of the last non-widescreen holdouts on the planet :P hehe
nthfinity
05-24-2005, 01:06 PM
that's a jump I've been considering... but the price just hasn't fallen at all :( $400 is a little hard to justify...
also, my GeForce4 Ti 4400 (128 MB RAM) is posting 1500-1700s... where I thought the 5xxx series was supposed to hit 4000s... was something wrong with your card?
tuffguy
05-24-2005, 01:45 PM
Great review... that's the video card that I'm considering right now since I want to get GTR - I have a Ti4200. But I'm gonna need to upgrade my current 2.4GHz P4 system. I'm thinking of getting an Athlon 64 with the Venice core, but I'm not sure whether to get an SLI board or a non-SLI board, just in case I want to add another card later on when the prices drop.
that's a jump I've been considering... but the price just hasn't fallen at all :( $400 is a little hard to justify...
also, my GeForce4 Ti 4400 (128 MB RAM) is posting 1500-1700s... where I thought the 5xxx series was supposed to hit 4000s... was something wrong with your card?
Well - Fry's had it for $369 - so I justified it, and I think my FX5600 posted 1900 to 2000 all the time, although in the last 3 or 4 months the fan had seized up - so it was running hot.. hehehe maybe in degraded mode.
DeMoN
05-24-2005, 03:22 PM
Well yeah, one whole series up must be a WAY big improvement... but cost-wise? What is the price jump between those two cards?
I normally wait for the cards to drop to $250 before I purchase; this is the first card I have purchased so close to launch date while the price was still over $300.
I paid $259 for the FX5600 about 1 year ago.
SFDMALEX
05-24-2005, 05:36 PM
You should be able to run GTR @ 1280x1024 32bit 16AF 4AA with at least 30 cars on max graphics, except 3D wheels, at no less than 60fps :wink: with that card. Give it a go.
irrational_i
05-25-2005, 05:58 AM
I still have an FX5600 and it is still okay.
But an upgrade is needed soon.
For a cheaper version you will struggle to do better than a 6600GT. I work with hardware often and the difference between the 6600GT and 6800GT is not that great. Certainly not worth the price diff. But if you can afford it - go for the 6800GT.
I think SLi is not that great, unless you plan to run your games in 1600X1200. Then the SLi will shine.
tuffguy
05-25-2005, 11:23 AM
I work with hardware often and the difference between the 6600GT and 6800GT is not that great
I think SLi is not that great, unless you plan to run your games in 1600X1200. Then the SLi will shine.
You obviously don't work with hardware that often, or you wouldn't say that.. :P
To see what games SLI works in:
http://www.xbitlabs.com/articles/video/display/gf6800u-sli_15.html
It depends on whether the game can make use of SLI:
http://www.xbitlabs.com/articles/video/display/gf6800u-sli_22.html
Notice what happens when you turn on the visual goodies:
http://www.anandtech.com/video/showdoc.aspx?i=2284&p=7
3DMark:
http://www.xbitlabs.com/articles/video/display/gf6800u-sli_31.html
Move your mouse over the second pic:
http://www.anandtech.com/video/showdoc.aspx?i=2281&p=2
tuffguy
05-25-2005, 11:55 AM
Great deal - Leadtek 6600gt for $149:
http://www.newegg.com/Product/Product.asp?Item=N82E16814122206
The same goes for the “eye candy” mode, but the SLI configuration made of two GeForce 6600 GT cards again lacks graphics memory and falls behind the single GeForce 6800 Ultra.
http://www.xbitlabs.com/images/video/gf6800u-sli/3dm2_candy.gif
irrational_i
05-31-2005, 07:32 AM
I work with hardware often and the difference between the 6600GT and 6800GT is not that great
I think SLi is not that great, unless you plan to run your games in 1600X1200. Then the SLi will shine.
You obviously don't work with hardware that often, or you wouldn't say that.. :P
To see what games SLI works in:
http://www.xbitlabs.com/articles/video/display/gf6800u-sli_15.html
It depends on whether the game can make use of SLI:
http://www.xbitlabs.com/articles/video/display/gf6800u-sli_22.html
Notice what happens when you turn on the visual goodies:
http://www.anandtech.com/video/showdoc.aspx?i=2284&p=7
3DMark:
http://www.xbitlabs.com/articles/video/display/gf6800u-sli_31.html
Move your mouse over the second pic:
http://www.anandtech.com/video/showdoc.aspx?i=2281&p=2
I have read most of those articles. To explain my comment quickly:
The 6800GT is easily the best performance/money ratio. The 6600GT runs fine in real-world apps and is MUCH cheaper. Most people I know run games in 1024x768 on 17" CRT monitors. In this case the 6800GT is still much faster than the 6600GT, but you can't really observe it. Games we often play are Warcraft 3, Neverwinter Nights, Quake 3, Morrowind. The difference is hardly noticeable.
I recently upgraded to a 19" LCD monitor and now I would also prefer a 6800GT to run in high res with everything turned on full.
If you run games at over 1280X1024 res, then you will start noticing the differences in the newest games like Doom3 and Half Life2 with the 6600GT still being a touch slow.
Remember the 6600GT also has GDDR3 memory, not the DDR of the base 6600 or 6800. It still suffers from fewer pixel pipes and narrower memory bandwidth, which is the reason it drops far behind at very high resolutions.
When you hit 1600x1200 the SLi comes into play with a noticeable difference. The difference between 80fps and 50fps is not always noticeable, and paying double for an SLi setup is not always worth it.
So my argument is that if you have the money - go for it. If it is an issue, the 6600GT will give adequate performance at half the 6800GT price.
And I'd rather go 6800GT than 6600GT SLi.
And of course if you have sucky RAM and slow cpu, the gfx card won't come into its own. You need a complete machine. :D
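To put some rough numbers on that argument - purely illustrative, using the 80fps vs 50fps figures from the post above (SLI vs single card at 1600x1200) and the $285 6800GT street price mentioned elsewhere in this thread, with SLI assumed to cost roughly double - here is a quick back-of-the-envelope Python sketch:

# Toy price/performance comparison for the SLI argument above.
# The fps figures (50 for a single card, 80 for SLI at 1600x1200) come from the
# post; the prices are street prices mentioned in this thread. Treat the whole
# thing as illustrative, not a benchmark.
setups = {
    "single 6800GT":   {"price": 285, "fps": 50},
    "6800GT SLI pair": {"price": 2 * 285, "fps": 80},
}

for name, s in setups.items():
    ratio = s["fps"] / s["price"]
    print(f"{name}: {s['fps']} fps for ${s['price']} -> {ratio:.3f} fps per dollar")

On those (hypothetical) numbers the single card gives about 0.175 fps per dollar versus about 0.140 for the SLI pair, which is the point being made: the extra frames cost more than they are worth at that resolution.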
Minacious
05-31-2005, 12:40 PM
ATI bitches, ATI :lol:
:D :D
I jumped ship from ATI to NVIDIA. I went from a 9800XT to a 6800Ultra when they came out. I'll probably go back to ATI when the new cards arrive.
ATI driver support and stability outside of games (especially in 2D mode) sucks.
NVidia have perfected the universal driver.
All hail NVidia.. :P
Lalalalalala...
all hail Nvidia...
Lalalalalala...
ATI is the enemy...
Lalalalala
:)
nthfinity
06-02-2005, 12:02 AM
I've recently heard that the GeForce 7800 is making its debut late in June... I think I'll wait til the 6800 drops in price, and then I'll be the one laughing at the poor blokes who shelled out $370 for an awesome video card :lol:
or, that's just a rumor, and I'll have to wait longer while you keep running higher resolutions and higher FPS than I could hope to :oops:
Minacious
06-02-2005, 12:44 AM
I think I'll wait til the 6800 drops in price, and then I'll be the one laughing at the poor blokes who shelled out $370 for an awesome video card :lol:
I wish I did pay that small amount for my card. :(
Well - I have previously done the waiting game until this gen card - and I couldn't wait any longer :)
Strangely enough, the 5600's are still almost $200 - which makes no sense this late in the game.
nthfinity
06-02-2005, 02:26 AM
Well - I have previously done the waiting game until this gen card - and I couldn't wait any longer :)
Strangely enough, the 5600's are still almost $200 - which makes no sense this late in the game.
that is always the game one plays... price vs. performance
when I built my PC... not one item in it was below the latest available... and it's been quite a good machine (except a RAID problem) since I built 'er.
so 3 years on one video card is likely long enough... I remember the huge increase in performance vs. my old Voodoo 3500 card... which was light years beyond my TNT2 card.
now I'm faced with new-fangled processors, PCI Express cards, and SATA hard drives.
Global Warming
06-02-2005, 01:18 PM
ATI bitches, ATI :lol:
:D :D
I jumped ship from ATI to NVIDIA. I went from a 9800XT to a 6800Ultra when they came out. I'll probably go back to ATI when the new cards arrive.
I can overclock my 6800GT to 435MHz and the RAM to 600MHz which is faster than the Ultra and my card never goes above 60C. I would be interested to know what the max OC settings and temps are on your Ultra.
nthfinity
06-03-2005, 09:08 PM
well, I saw it (6800GT) at a new low price, so it's mine now for a low $285, so the getting is good :-D
after it's done, I'll post my 3DMark scores before/after... see you in a bit
8) :D
1557 3DMark03 ----> 9485 :!:
amazing improvement :-D
^^^ See - I told ya so... :P
But the fact you paid like $70 less than I did is a little irritating.. ;) hehe
nthfinity
06-05-2005, 04:10 AM
^^^ See - I told ya so... :P
But the fact you paid like $70 less than I did is a little irritating.. ;) hehe
thamar is right, and it's hard to play the game just right before buying the upgraded card... years ago when I built my PC, it hit the same mark for the Ti 4400... so I'm happy :P
anyway, what condolences can I offer you other than that you are supporting the GDP more than I :P :lol:
in any case, it was your review that convinced me... as it's been ages since I browsed Tom's Hardware :-D
Quick update...
My system (P4 2.6, 1GB Ram, GeForce FX5600-256MB) posted 1973 on the 3DMark03 test...
Swapped the 5600 for a 6800GT 256MB card - and the system posted a freaking 10100 on the same 3DMark03 test
Got a P4 3.4 Prescott proc today... so folks know what to expect from GPU vs CPU upgrades, 3DMark03 posts 10100 again with the new 3.4 vs the 2.6 proc.
A little disappointing, but a good exercise I guess. :)
GPU for games - and once I do some video encoding tests, hopefully the 3.4 CPU will show gains.
saadie
06-20-2005, 01:32 AM
this is interesting .... there should've been a big difference with the Prescott :?....
why don't you try a newer version of 3DMark?
btw what motherboard have you got?
this is interesting .... there should've been a big difference with the Prescott :?....
why don't you try a newer version of 3DMark?
btw what motherboard have you got?
The point is to compare like test with like test.
So there is little value in changing the test each time you change hardware.
The motherboard, memory, Windows install, hard drives and software have remained pretty much constant, so that actual gains can be measured, not "out of context" benchmarks.
It is a Gigabyte P875 8IK1100 V2.1 motherboard with 1GB of PC2700 DDR in 2 x 512MB sticks.
This setup is pretty representative of an average system.
I place very little value on out of context "bench racer" benchmarks.
Like a good dyno run, you need to control the conditions to use it as a "before and after" tool to observe the delta, and not get too caught up in the raw numbers.
The gain the GPU provided was great, the gain from the CPU not much - in a game-related context - I hope it will be evident in the video encoding exercise.
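If anyone wants to play with the numbers, here is a trivial Python sketch of that "before/after delta" idea, just tabulating the 3DMark03 scores quoted in this thread - nothing official, purely illustrative:

# Tiny sketch: tabulate the before/after 3DMark03 deltas reported in this thread.
# The scores are the ones posted above; everything else is just for illustration.
def gain(before: int, after: int) -> float:
    """Relative gain of the 'after' score over the 'before' score."""
    return after / before

upgrades = [
    ("FX5600 -> 6800GT (GPU swap, same CPU)", 1973, 10100),
    ("P4 2.6 -> P4 3.4 (CPU swap, same GPU)", 10100, 10100),
    ("Ti 4400 -> 6800GT (nthfinity's box)",   1557, 9485),
]

for name, before, after in upgrades:
    print(f"{name}: {before} -> {after}  ({gain(before, after):.2f}x)")

The GPU swaps come out at roughly 5-6x on this test; the CPU swap, with everything else held constant, shows essentially no gain.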
3.4 prescott, well how does it feel to have a toaster running all the time? :twisted:
Toaster? This thing is a freaking microwave... :P
70C resting temp, mid-80s operating temp... ;)
This chip better make a decent difference in video encoding - or I am off to the AMD side of the fence... hehe
Yeah - but I am struggling to get this thing to run at a half decent temp. :(
Damn computers... ;) hehe
Weird shit - the BIOS shows a 70C idle temp - but any hardware monitor software I load shows the CPU temp to be 53ish degrees C.
It shows hard drive temps to be 58C.
I wonder how to get accurate temps???
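One software cross-check on Windows is to read the ACPI thermal zone through WMI. This is a rough sketch only: it assumes the board actually exposes MSAcpi_ThermalZoneTemperature (plenty of boards don't), and it needs the Python 'wmi' package installed.

# Rough sketch: read the ACPI thermal zone temperature on Windows via WMI.
# Assumes the motherboard exposes MSAcpi_ThermalZoneTemperature (many don't)
# and that the 'wmi' package is installed (pip install wmi).
import wmi

w = wmi.WMI(namespace="root\\wmi")
for zone in w.MSAcpi_ThermalZoneTemperature():
    # CurrentTemperature is reported in tenths of a degree Kelvin.
    celsius = zone.CurrentTemperature / 10.0 - 273.15
    print(f"{zone.InstanceName}: {celsius:.1f} C")

If the numbers from this disagree wildly with the BIOS reading, trust a dedicated thermal probe over either.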
SFDMALEX
06-20-2005, 08:38 PM
Get yourself a thermal monitor/fan controller, because according to the data my BIOS shows, this system should have blown up a year ago.
Well - after closer observation, and getting HMonitor and a few others to compare - the CPU, clocked to 3.8GHz, is now running a consistent 53C at idle in Win2K - and under hard work when running heavy duty simulation games (flight sim, train sim or GTR) sits at 65/68C.
The drives seem to be ranging between 48C and 55C now. I mucked about with the fan placement, am working on a baffle AND I tried some new thermal compound for the CPU.
I am just thrilled at the reliability running at 3.8GHz (so far ;))
I need to work on getting some more air flowing across the hard drives though.
nthfinity
06-21-2005, 01:34 PM
Well - after closer observation, and getting HMonitor and a few others to compare - the CPU, clocked to 3.8GHz, is now running a consistent 53C at idle in Win2K - and under hard work when running heavy duty simulation games (flight sim, train sim or GTR) sits at 65/68C.
The drives seem to be ranging between 48C and 55C now. I mucked about with the fan placement, am working on a baffle AND I tried some new thermal compound for the CPU.
I am just thrilled at the reliability running at 3.8GHz (so far ;))
I need to work on getting some more air flowing across the hard drives though.
for accurate temperatures, I use an IR gun
http://www.aces.edu/department/poultryventilation/ToolsofTrade.html
and, for cool temperatures, I highly recommend this case
http://www2.technobabble.com.au/article16.html... my temperatures under load dropped 10 degrees C with the case (14 degrees cooler HDD)
OK - you just sent me to a poultry farmer equipment supplier... ;)
Here in East Texas we dun raisin hawgs... hehehe - not chixkenz... :)
nthfinity
06-22-2005, 12:55 PM
OK - you just sent me to a poultry farmer equipment supplier... ;)
Here in East Texas we dun raisin hawgs... hehehe - not chixkenz... :)
I use mine to measure destratification in a building... and to monitor hardware physical temperature :wink: ... very very accurate, and precise. besides, I was just doing a Google search for an IR gun, and that site provided a picture of the tool; and who says Texans can't raise chicken???? :P
and who says Texans can't raise chicken???? :P
That new anti-bestiality law.. ;)