
New Graphics Card

Level 27
Joined
Sep 24, 2006
Messages
4,979
So since I run Dead Space 2 at 25-30 FPS on 1080p with all settings on very high, I decided the time has come to buy a new graphics card. But I don't want to spend 300 euros on it.

So here are my three candidates (Dutch site, but the specs are in English):

Gainward GTX285
Price: € 122
Memory: 2GB (!!!)
Memory Speed: 1242 MHz
Memory Type: GDDR3 (I believe there isn't much difference between GDDR3 and GDDR5)
Memory Bus: 512-bit (!!!)
DirectX: 10 (This is the only con!)


And all that for 122 euros.

And the following two are probably equal in performance but differ in specs:

Sapphire HD5830
Price: € 129
Memory: 1GB
Memory Speed: 4000 MHz
Memory Bus: 256-bit
DirectX: 11

XFX HD 5770
Price: € 129
Memory: 1GB
Memory Speed: 5200 MHz
Memory Bus: 128-bit
DirectX: 11

As you can see, the Sapphire wins on bus width but loses on memory speed (though I think it can be overclocked using ATI's driver), while the XFX has a lot more memory speed but a narrower bus. I guess these specs make them pretty much equal, and I've heard overclocking the 5770 makes it buggy.

So my only problem with these cards is that the Gainward somehow seems to be pure overkill power at a ridiculously low cost, but doesn't support DirectX 11. I searched all over Google to find out if DX11 is really that much better, but it doesn't seem to be. (Some people compared Stalker screenshots to each other, but that wasn't much use because Stalker is a crap game; in DiRT 2 the differences between DX9 and DX11 are almost completely, if not entirely, invisible, so I don't know what's going on there.)

I also take into account that Nvidia seems to have special privileges on PC, with for example Batman: Arkham Asylum, where some settings require an Nvidia card to run well, or to be enabled at all.

EDIT - It seems the only feature of DX11 that requires a true DX11 card is called tessellation; it breaks up polygons into more polygons for more detail (kinda like AA, I guess). But I'm playing on a 1080p screen... and with the amount of memory the Gainward has, I presume I could easily set AA to 8x or maybe 16x. What the hell man, if Dead Space 2 had just 4x AA everything would be displayed as smooth as a baby's butt XD
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
Ever looked into a GeForce GTX 460? It runs SC2 on ultra no problem and is fully DX11 compliant. It is slightly weaker than a GTX 285 and has rather weak tessellation shaders (well, few of them, but at least it has them), but it was rather cheap.

Comparing stats like clock and memory bandwidth across vendors is pointless. An AMD card that "appears" faster is usually slower than an equivalent Nvidia card because of the way they are designed. AMD cards have really poor support for certain kinds of GPU-accelerated computation (which is why the physics in Arkham perform horribly on AMD cards). AMD cards, however, generally perform better on grunt work (masses of polygons, or masses of tessellation), but few games do that anymore unless for artistic style.

Be aware that an AMD card and an Nvidia card will generate a different image at the end. Due to the way they designed their cards and drivers, minor variations will exist in the final rendered image. An example could be that an Nvidia card does better quality AA (smoother) while an ATI card can crank AA up further but the AA at equivalent levels is not as good (maybe worse edge filtering), but this is just a made-up example.


DX11 against DX10 is not so straightforward. The GTX 285 does not support DX10.1 (only 10), which all DX11-compliant hardware does, so you must remember that the enhancements to AA and render accuracy that DX10.1 brought are also out of your reach with a DX10 graphics card.

Tessellation is not a replacement for AA... AA reduces the jaggy effect that occurs due to the finite sample resolution by averaging what the pixel colour should be from many samples per pixel. 16x AA, for example, will take 16 samples within 1 pixel and compose the output pixel from an average of those samples. Tessellation converts a low-geometry model (few vertices) into a high-geometry model (lots of vertices) on the fly. The vertices will still produce aliasing artifacts, as they get rendered exactly like non-tessellated geometry.
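To make the averaging concrete, here is a minimal Python sketch (the shade function and the regular sample grid are made up for illustration; real GPUs use hardware sample patterns):

```python
# Toy supersampling AA: each output pixel is the average of N x N
# sub-samples, which smooths jagged edges but adds no geometric detail.

def shade(x, y):
    """Toy 'scene': white above a diagonal edge, black below."""
    return 1.0 if y > x else 0.0

def render_pixel(px, py, samples=4):
    """Average samples*samples sub-pixel shades into one output value."""
    total = 0.0
    for i in range(samples):
        for j in range(samples):
            # Sample at evenly spaced offsets inside the pixel.
            sx = px + (i + 0.5) / samples
            sy = py + (j + 0.5) / samples
            total += shade(sx, sy)
    return total / (samples * samples)

# A pixel straddling the edge comes out grey instead of a hard jaggy step.
print(render_pixel(10, 10, samples=4))
```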

Tessellation is useful in many ways. Firstly, it lets you replace a model with a lot of geometry with a model with much less geometry plus a texture layer. As the texture can be stored in graphics memory, the CPU only needs to send the low-geometry model when issuing frame rendering instructions, saving a lot of interface bandwidth. As the model is simpler, animations are easier to compute and physics is simplified, meaning the CPU has to do less work preparing each frame. Tessellation is also scalable, so whereas now we may be restricted to fairly low geometry densities (a few tens of thousands of vertices on a main character), in the future with our 1337 machines of doom we could render them at a few million to give movie-like smoothness. It also allows for potentially nifty shortcuts, like reusing base models with different textures to create unique models instead of having to tessellate in software like all the customizable character face systems do.
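A rough Python sketch of the idea (plain midpoint subdivision, purely illustrative, not any vendor's actual tessellation scheme):

```python
# One tessellation step: each triangle becomes four, so vertex counts
# grow on the GPU while the CPU still only sends the coarse mesh.

def midpoint(a, b):
    return tuple((a[k] + b[k]) / 2 for k in range(3))

def subdivide(tri):
    """Split one triangle (3 vertices) into 4 smaller ones."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

mesh = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
for level in range(3):          # 3 levels: 1 -> 4 -> 16 -> 64 triangles
    mesh = [t for tri in mesh for t in subdivide(tri)]
print(len(mesh))                # 64
```

In a real pipeline a displacement map would then push the new vertices outward to add actual detail; in this sketch they just stay in the plane of the original triangle.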

Back to your problem. I advise Nvidia, as they have always seemed good to me. The only problem I had was my 8800 GT overheating in SC2, but that was because it was one of the first DX10 graphics cards and was designed without consideration of DX10 being used the way SC2 uses it. Both my GTX 275 and my brother's GTX 460 are very good cards and have yet to show a fault at all.

I do advise that since you are playing at 1080p, you should get a card with at least 1GB of memory, as performance drops sharply when memory runs out, and it is possible to reach that condition under stressful conditions at 1080p.
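Some back-of-envelope numbers (rough assumptions: RGBA8 colour plus a 32-bit depth buffer, both scaling linearly with the AA sample count; real drivers allocate differently):

```python
# Why 1080p eats video memory: multisampled buffers scale with the AA
# level, on top of everything the textures already use.

width, height = 1920, 1080
bytes_per_pixel = 4                      # RGBA8 colour
depth_bytes = 4                          # depth/stencil

def framebuffer_mb(msaa):
    samples = max(msaa, 1)
    colour = width * height * bytes_per_pixel * samples
    depth = width * height * depth_bytes * samples
    return (colour + depth) / (1024 ** 2)

for msaa in (1, 4, 8, 16):
    print(f"{msaa}x AA: ~{framebuffer_mb(msaa):.0f} MB just for the framebuffer")
```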
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
You will probably only get 10-20 FPS more or so... The difference in power between the 4870 and the 5830 is not that big... That is, if your graphics card is the limiting factor.

Maybe if Dead Space 2 does some major DX11 optimizations it will be bigger, but do not expect miracle leaps in performance from that card.
 
Level 27
Joined
Sep 24, 2006
Messages
4,979
Metro-2033_DX11_Benchmark.jpg


O rly. Btw, I always wonder whether these guys use an HDMI cable in their tests =/
 
Level 15
Joined
Mar 31, 2009
Messages
1,397
Dude, that's only +2 FPS, and it's still unplayable :/

The HD5830 is like an HD4890 with DX11 and half the power consumption; like Super said, it's not worth it.

You should have waited for HD5870/HD6950 to hit $200 :/
 
Level 27
Joined
Sep 24, 2006
Messages
4,979
Huh, blarghonk, my card isn't on there. I don't have a 460. It's just that Super Good said the 460 was better than a 5830...

And if you're willing to pay 200 euros for a 5870-90, you might as well buy a cheaper 6000-series card with a more modern process, higher clock rate, etc. (70 euros is still quite a difference in price).

Btw, 28 FPS is playable, see my first post. Console games run at like 30 FPS, and some games drop under that because developers always want more eye candy than the things can handle.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
Wow, I was wrong, sorry about that. I meant a 2-3 frame difference (out by a factor of 10, damn). You should really only upgrade after 3-4 series have come and gone, as then you get an easily noticeable and well worthwhile boost.

Using an HDMI cable makes no difference to framerate at all, as the broadcasting interface is essentially computationally free (it does not delay rendering at all). Of course, the use of VSync may consume more time, but that is the same for both HDMI/DVI (they broadcast almost identically) and VGA.
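A toy illustration of that VSync cost (hypothetical render times, simple double buffering assumed):

```python
# With VSync on a 60 Hz display, a finished frame can only be shown on
# a refresh boundary, so a frame that misses one waits for the next.

import math

refresh_ms = 1000 / 60            # one 60 Hz refresh interval

def vsynced_frame_ms(render_ms):
    """Round the render time up to the next refresh boundary."""
    return math.ceil(render_ms / refresh_ms) * refresh_ms

for render_ms in (10, 20, 40):
    shown = vsynced_frame_ms(render_ms)
    print(f"{render_ms} ms render -> shown every {shown:.1f} ms (~{1000/shown:.0f} FPS)")
```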

Console games run at like 30 FPS
Wrong... Console games run at 60 FPS, as required by law to prevent motion sickness. Yes, games can dip below 60 FPS in high-stress conditions, as is to be expected. Games that run at only 30 FPS need to introduce motion blur to prevent motion sickness, or greatly decrease the speed at which motion is executed.
The only games I know that routinely run at 30 FPS are DS games, because the DS splits the main GPU across both displays to give 30 FPS each. Super Smash Bros., Mario Galaxy, Final Fantasy X, XII and XIII, Starcraft II, all Lego games and all Sonic games (outside stress conditions) run at 60 FPS.

Some PAL systems run at only 50 FPS due to PAL specifications, but all modern 1080p-compliant displays support 60 FPS and will use that mode over component or HDMI connections.

Huh, blarghonk, my card isn't on there. I don't have a 460. It's just that Super Good said the 460 was better than a 5830...
That test means nothing. It did not take into account the visual quality of the images produced (there are differences), nor did it take into account general-purpose use of the GPU for physics. As mentioned, ATI can deliver a lot of raw grunt power, but more specialized workloads cause it to fall dramatically. An example is in Batman, where turning on GPU physics support will cause even the best ATI cards to fall to an unplayable 10 FPS, whereas a GTX 460 will cope easily.
 
Level 27
Joined
Sep 24, 2006
Messages
4,979
Wow, I was wrong, sorry about that. I meant a 2-3 frame difference (out by a factor of 10, damn). You should really only upgrade after 3-4 series have come and gone, as then you get an easily noticeable and well worthwhile boost.

I can't help it lol.

Wrong... Console games run at 60 FPS, as required by law to prevent motion sickness. Yes, games can dip below 60 FPS in high-stress conditions, as is to be expected. Games that run at only 30 FPS need to introduce motion blur to prevent motion sickness, or greatly decrease the speed at which motion is executed.
The only games I know that routinely run at 30 FPS are DS games, because the DS splits the main GPU across both displays to give 30 FPS each. Super Smash Bros., Mario Galaxy, Final Fantasy X, XII and XIII, Starcraft II, all Lego games and all Sonic games (outside stress conditions) run at 60 FPS.

Mh, then I find it a weird thing for the maker of Gran Turismo 5 to say that he tried to aim for 50-60 FPS performance. Makes you wonder... You can't really tell the difference between a minimum of 30 FPS and 60 FPS with the naked eye; you'd have to compare the images next to each other to see it.

That test means nothing. It did not take into account the visual quality of the images produced (there are differences), nor did it take into account general-purpose use of the GPU for physics. As mentioned, ATI can deliver a lot of raw grunt power, but more specialized workloads cause it to fall dramatically. An example is in Batman, where turning on GPU physics support will cause even the best ATI cards to fall to an unplayable 10 FPS, whereas a GTX 460 will cope easily.

How do you mean? Would a GTX 460 produce such a different image?

Arkham Asylum is not a good example of the GTX's power, as it is clearly much more compatible with that game. What if ATI could do that?... Of course Nvidia doesn't want any other card to run their precious PhysX, so they make it exclusive to their cards.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
You can't really tell the difference between a minimum of 30 FPS and 60 FPS with the naked eye; you'd have to compare the images next to each other to see it.

On the contrary. You can easily tell the difference. At 30 FPS all animations appear more jerky; at 60 FPS everything is almost silky smooth. Image quality should be the same, just the change of images improves at a higher framerate. I am aware that movies on TV are at 30 FPS, but you must remember that they use realistic motion blur to hide this restriction, unlike games, where each frame is very sharp, so it is easy for discontinuities to occur that are noticeable to your eye.
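A quick way to see the discontinuity (toy numbers, assuming no motion blur):

```python
# An object crossing a 1920-pixel screen in one second jumps twice as
# far between consecutive, perfectly sharp frames at 30 FPS as at 60.

screen_px = 1920
for fps in (30, 60):
    frame_time_ms = 1000 / fps
    step_px = screen_px / fps
    print(f"{fps} FPS: {frame_time_ms:.1f} ms/frame, {step_px:.0f} px jump per frame")
```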

Yes, 30 FPS is enough to play, but it is still noticeably worse than 60 FPS.

Arkham Asylum is not a good example of the GTX's power, as it is clearly much more compatible with that game. What if ATI could do that?... Of course Nvidia doesn't want any other card to run their precious PhysX, so they make it exclusive to their cards.
How many supercomputers are built from AMD graphics cards? You will find most of them are Nvidia for a reason; Nvidia is better at non-graphics computation.

How do you mean? Would a GTX 460 produce such a different image?
They do not work the same. Nvidia has a different set of visual enhancements from AMD, and both manufacturers have different ways of rendering images based on driver and hardware, so minor variations in the output image should be observable. An example of this is the Xbox 360 vs the PS3 on games that use the exact same graphics; yes, this is more extreme, as the 360 uses a very different graphics hardware configuration, but similar results should be observable for the two PC graphics manufacturers.
 
Level 15
Joined
Mar 31, 2009
Messages
1,397
Metro-2033_DX11_Benchmark.jpg


http://www.anandtech.com/show/2856/14
http://www.tomshardware.com/reviews/radeon-hd-5770,2446-17.html
http://www.techspot.com/review/209-ati-radeon-hd-5770/page13.html
http://www.guru3d.com/article/radeon-hd-5770-review-test/25

*puts on Jeopardy music*

The HD5770 has slightly worse performance than an HD4870 (your current card), so it's a good surrogate in the benchy you provided, meaning the FPS jump is only 2-3 FPS, like I said.



I can't help it lol.

Arkham Asylum is not a good example of the GTX's power, as it is clearly much more compatible with that game. What if ATI could do that?... Of course Nvidia doesn't want any other card to run their precious PhysX, so they make it exclusive to their cards.

Yep, and then make it run using the x87 instruction set on one CPU core for ATI users.


They do not work the same. Nvidia has a different set of visual enhancements from AMD, and both manufacturers have different ways of rendering images based on driver and hardware, so minor variations in the output image should be observable. An example of this is the Xbox 360 vs the PS3 on games that use the exact same graphics; yes, this is more extreme, as the 360 uses a very different graphics hardware configuration, but similar results should be observable for the two PC graphics manufacturers.

You keep saying this and then never give proof :/

Frankly, I think it's BS; different methods to accomplish the same thing, sure, but I think the output is the same.



Warning: Dr. is an Nvidia fanboy; I just like the best GPU for the price.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
It's hard to give proof when you only use Nvidia graphics cards. I would be willing to post comparison shots of in-game Starcraft II with an AMD user (someone else) to prove or disprove it. Of course the conditions would have to match: ultra settings, a common resolution, the same AA level, the same map and camera angles, etc.

I do know that ATI and Nvidia each support different graphics enhancements over the other, which, when enabled, will logically give different output results. Additionally, the Dolphin emulator team had a lot of problems getting Nvidia and AMD cards to work the same, with graphics glitches or crashes that only showed up on one of the two vendors.
 