
3-D

Status
Not open for further replies.

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
Starcraft 2 uses 3D vector graphics (as in they are resolution independent and use 3D vectors for positioning). As such, with the right display drivers you can view it in stereoscopic 3D, and patches have even fixed bugs encountered when doing so.

No idea how users of ATI (AMD) cards can view 3D...
No idea if it is possible on Intel graphics chipsets (how you can play SC2 on them at all is another question).

Nvidia users get a 3D display driver installed by default (unless you actively tell it not to install). This driver supports anaglyph 3D glasses (red/cyan) as well as depth adjustment for every game (some games may be visually buggy). What is great is that it works on any 32-bit colour display, although it halves the resolution to each eye, reduces the colour depth to each eye by at least a third, and requires colour-filtering glasses to work.

Nvidia also offers a special shutter-glass solution which allows full colour depth at full resolution to each eye, giving near-faultless 3D. This, however, costs £200 and requires a 3D-compatible display (120 Hz refresh rate or greater). The 3D behaves exactly like their default 3D driver but clearer (no reduction in image quality per eye). Additionally, the shutter glasses, when worn, reduce the luminance of everything by over 50%, so it is like viewing through dark glasses (you need a high-brightness display to see a clear image).

Be warned that all 3D solutions put an enormous load on your graphics card, so they should not be attempted unless you are using multiple graphics cards (recommended) or a new top-of-the-range graphics card (multiple still advisable). This is because the card has to render two frames per update (one for each eye), which means double the buffer requirements, double the card interface load, and the game can only present on every second rendered frame.
 
Level 22
Joined
Feb 4, 2005
Messages
3,971
I don't think ATI is worse. ATI cards are used by Dell and Alienware, which you probably know are adept at gaming; take into account the games with their 3D graphics and the need to play them with quality. Unless you buy some 3D shades and surround yourself with monitors :)DDD) I think you're asking for the impossible
 
Level 14
Joined
Jun 13, 2007
Messages
1,432
It has support for Nvidia 3D; you need a couple of tools and a specific screen to activate it. It's not worth the money (I bought it and used it for about a month, and then I needed a newer Nvidia driver which wasn't compatible with the Nvidia 3D driver). Use one of the free solutions instead.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
syltman, if you have that much money to spend on a £200 glasses set plus a 3D-capable display that you never use, you must be too rich.

All Nvidia drivers ship by default with some form of red/cyan anaglyph driver.

Nvidia cards are also better than ATI cards; the fact that the world's biggest supercomputer uses Nvidia kind of supports this. ATI is more cost-effective at times, though.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
Nvidia performs tessellation better than ATI, is generally more stable (better drivers), and supports PhysX (ATI cards do not). There will almost always be games Nvidia does better than ATI and vice versa, but Nvidia usually has the advantage.

Alienware and hell (I mean Dell) use ATI so much because ATI often has better performance-to-price ratios and their consumers do not know anything about computers.
 
Level 19
Joined
Feb 4, 2009
Messages
1,313
Since the "bug fix" for stereoscopic 3D (red/blue), it does not work any more (using a GeForce 9800 GT).
It worked before, but the "patch" broke it.
Previously it just crashed when the game was over,
but now there are red and blue bars on the left and right sides of the screen and the centre is black.
Screenshots are bugged as well, so I can't report the bug.

I haven't tried on my laptop yet, since it has ATI graphics and I never tried to make it work there.

One thing I like about Nvidia cards is that they limit the FPS;
ATI cards tended to overheat during the black cut scenes between video scenes in the campaign.
Also, the GLUT extension for OpenGL renders at 1600 FPS with ATI cards (60 with Nvidia) and there is no good way to slow it down.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
One thing I like about Nvidia cards is that they limit the FPS;
ATI cards tended to overheat during the black cut scenes between video scenes in the campaign.
Also, the GLUT extension for OpenGL renders at 1600 FPS with ATI cards (60 with Nvidia) and there is no good way to slow it down.

Tell that to the GeForce 8800 GT series. No frame limit + flawed cooling design = crash/fire.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
The 9800 GT may be similar, but it is still more advanced in some ways. The GeForce 8 series was one of the first DX10 graphics cards and was miles ahead of ATI at the time. I would not be surprised if the first DX10 hardware was not the most efficient and suffered from major inefficiencies when performing more complex operations.

Although the 8800 GT was a remake of the 8800 GTX/GTS with faster interfaces and improved efficiency due to a smaller die size, it probably still suffered the same early DX10 efficiency flaws that the first generation of cards did. However, because it was more energy efficient due to the smaller die and was meant to be a cheap card (it was released around the same time as the 9800 cards), they stuck reduced cooling on it compared to its predecessors. Combined with expensive, inefficient DX10 operations, this would explain how it can overheat at 100% fan speed, as the cooling reduction was based on average power consumption and not peak.

The 9800 GT was a generation more advanced and so probably had additional optimizations. Its larger size and power also meant more cooling was needed, and its higher price would allow for that.

My 275 GTX card has no overheating problems in SC2 at all. Ultra all the time, too. I suspect my brother's brand-new 460 will also not suffer any overheating.

Thus I blame the overheating on design faults from the vendors and on initial teething problems with the technology. Like the Xbox 360 with its reduced-power, non-overheating lite model, time allows for improvements in nearly every way on nearly any hardware system of a large scale (hundreds of millions of transistors).
 
Nvidia performs tessilisation better than ATI, is generally more stable (better drivers) and supports physcisX (ATI cards do not). There will almost always be games nvidia can do better than ATI and visaversa but nvidia usually has the advantage.

Alien ware and hell (I mean dell) uses ATI so much cause they often have better performance to money ratios and the consumers do not know anything about computers.
PhysX*
ViceVersa*
Alienware*
Alienware and Dell use ATI because they made a deal to not use Nvidia chipsets.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
There are many ways to do 3D.

By far the healthiest will probably be the 3DS, as that needs no glasses at all and has dedicated pixels for each eye (vertically interlaced). Of course, it is restricted in viewing angle and to only one viewer, but there should be no problem of flicker or reduced image quality due to glasses.
 