
Warcraft 3 Engine Upgrade

Level 21
Joined
May 16, 2012
Messages
638
Recently our beloved Warcraft has been getting some attention from Blizzard, which is awesome, but is there any chance of an engine update, especially an API upgrade? Warcraft is still using DirectX 8, which makes things bad for medium- to high-end builds like mine, and it's even worse for those who, like myself, like to customize the game with awesome HD models full of polygons ripped from WoW, StarCraft 2, etc. :) I know it's hard to do; redesigning the game engine is basically redesigning the game as a whole, but it would be awesome to see Warcraft 3 in HD and optimized for today's average computers.
 
IMO if there's no Warcraft 4 coming, they might as well do some remastering, at least on doodads and the environment, including lighting. :) People already use HQ doodads with regular unit models anyway.
Since our mod has come to life at last, a remastered engine (with x64 support) is really needed, so that our HD mod can support custom maps and we can include our full-quality textures without downscaling them to save RAM.
 

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
[...] Warcraft is still using DirectX 8, which makes things bad for medium- to high-end builds like mine [...] it would be awesome to see Warcraft 3 in HD and optimized for today's average computers.
Since Patch 1.27a, which was released on March 14, 2016, Warcraft has been using DirectX 9 instead. Also, Blizzard switched from Visual Studio 2005 to Visual Studio 2013.

[...] it's even worse for those who, like myself, like to customize the game with awesome HD models full of polygons ripped from WoW, StarCraft 2, etc. :) [...]
There is no need to use that many polygons for a 3D model to look good, especially ingame.
 
Since Patch 1.27a, which was released on March 14, 2016, Warcraft has been using DirectX 9 instead. Also, Blizzard switched from Visual Studio 2005 to Visual Studio 2013.


There is no need to use that many polygons for a 3D model to look good, especially ingame.

Well, to be honest, they look really bad ingame compared to our mod, StarCraft 2, or any updated or new game. 15 years ago it was awesome, but it needs official x64 support and a graphics update from Blizzard itself (at the very least, it would make our job as modders a lot easier).
 

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
You may want to experiment with increasing the polycount of some 3D models by using War3PolygonEnhancer v0.01 (from the Warcraft 3 HQ thread) by @oger-lord, to see what works best ingame for any 3D model.


To better understand why only a small number of extra polygons is required for a 3D model to look better, please have a look at the Authentic Model Improvement project for Quake. A backup of the original thread of the project *with screenshots*, courtesy of archive.org, can be found here.

For a direct download of the latest version at the time of writing, v1.72, please click here.


Regarding Warcraft 3, very detailed 3D models should imho be used only in cinematics, if and where appropriate.
 
Level 21
Joined
May 16, 2012
Messages
638
Since our mod has come to life at last, a remastered engine (with x64 support) is really needed, so that our HD mod can support custom maps and we can include our full-quality textures without downscaling them to save RAM.

I would say that saving ram
You may want to experiment with increasing the polycount of some 3D models by using War3PolygonEnhancer v0.01 (from the Warcraft 3 HQ thread) by @oger-lord, to see what works best ingame for any 3D model.


To better understand why only a small number of extra polygons is required for a 3D model to look better, please have a look at the Authentic Model Improvement project for Quake. A backup of the original thread of the project *with screenshots*, courtesy of archive.org, can be found here.

For a direct download of the latest version at the time of writing, v1.72, please click here.


Regarding Warcraft 3, very detailed 3D models should imho be used only in cinematics, if and where appropriate.

This sounds interesting; I'm going to look into it. From what you say, it can increase the number of polygons, but can it also decrease them? That way, I could create a model that is not so taxing for the game.
 

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
I do not remember War3PolygonEnhancer v0.01 being able to successfully decrease the original polycount of a 3D model.

[...] a remastered engine (with x64 support) is really needed, so that our HD mod can support custom maps and we can include our full-quality textures without downscaling them to save RAM
[RANT]
R.I.P. 8-bit paletted textures.
[/RANT]

Btw, has anyone tried to convert the 24-bit TGA files from the Map Texture Packs of QRP down to 256 colors, and then run a source port of Quake with said 256-color map textures? Just saying...
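
For illustration, here is the crudest possible way to drop 24-bit colour to one byte per pixel: a uniform 3-3-2 palette, sketched in C++. (Real converters, and presumably whatever tooling one would use on the QRP textures, do median-cut or octree quantization with a per-image palette, which looks far better; the function name is mine.)

[CODE]
#include <cstdint>

// Uniform "3-3-2" quantization: keep the top 3 bits of red and green and the
// top 2 bits of blue, packed into a single byte. A per-image 256-entry
// palette (median-cut/octree) would preserve much more detail.
uint8_t quantize332(uint8_t r, uint8_t g, uint8_t b) {
    return (r & 0xE0) | ((g & 0xE0) >> 3) | (b >> 6);
}
[/CODE]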

[...] 15 years ago it was awesome [...]
In 2002, Warcraft 3 could (barely) run on a PII 350 PC, because the foundations of its game engine were from the year 1998.


Because it is imho always a good thing to have a look at what other gaming communities develop, I am mentioning here two patches for Morrowind:
- Exe Optimizer - Timeslip's utilities, mods and patches
- 4GB Patch - NTCore's Homepage

Now, has any of the Warcraft 3 communities created an equivalent of these patches?


A long time ago, I loaded the Warcraft 3 exe and dll files of v1.20c into Hackman Disassembler v8.0a, for educational purposes. According to this tool, there were MMX instructions in the code, plus an optional code path using SSE instructions if they were available.

Now, does someone know if any additional optional code paths have been added, in order to take advantage of more advanced instruction sets, like for example SSE2?
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,255
Warcraft is still using DirectX 8, which makes things bad for medium- to high-end builds like mine
This is not true. For about two years now, Warcraft III has linked against Direct3D9, part of DirectX9. Additionally, Warcraft III never used the core feature added in Direct3D8 (part of DirectX8): programmable shaders. Instead, Warcraft III still uses the Direct3D7 (part of DirectX7) fixed-function pipeline for graphics, possibly with some Direct3D8 extensions.

Switching to more modern APIs such as Direct3D11 or Direct3D12/Vulkan could yield significant render performance improvements. That said, rendering Warcraft III is so trivial that the average map developer would benefit more from a JIT recompiler for JASS than from cutting-edge APIs.
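
For the curious, a rough sketch (not Blizzard's actual code) of what "fixed function" means in Direct3D9 terms: no shaders are bound, and the application drives a fixed set of render and texture-stage states instead.

[CODE]
#include <d3d9.h>  // requires the DirectX SDK

// Illustrative only: configure the D3D9 fixed-function pipeline.
void SetupFixedFunction(IDirect3DDevice9* device) {
    device->SetVertexShader(nullptr);               // no programmable shaders
    device->SetPixelShader(nullptr);
    device->SetRenderState(D3DRS_LIGHTING, TRUE);   // built-in vertex lighting
    device->SetTextureStageState(0, D3DTSS_COLOROP, // colour = texture * diffuse
                                 D3DTOP_MODULATE);
}
[/CODE]
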
and it's even worse for those who, like myself, like to customize the game with awesome HD models full of polygons ripped from WoW, StarCraft 2, etc.
I do not think model geometry and texture size have much of an impact on performance, as WC3 is practically never GPU bound. Sure, more bones would affect performance, but I imagine all vertices are stored in some kind of vertex buffer.
Since our mod has come to life at last, a remastered engine (with x64 support) is really needed, so that our HD mod can support custom maps and we can include our full-quality textures without downscaling them to save RAM.
The problem is that x86-64 support can decrease performance in video games, especially if memory bandwidth is a limiting factor. This is why many StarCraft II maps have performance problems in the x86-64 build that were not present in the plain x86 one: x86-64 uses more memory bandwidth, because the larger pointer sizes decrease memory density and cache efficiency.
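
A toy example of the pointer-density point (standard C++, nothing WC3-specific):

[CODE]
#include <cstddef>
#include <cstdio>

// A doubly linked node: 4+4+4 = 12 bytes on x86 (5 nodes per 64-byte cache
// line), but 8+8+4 -> 24 bytes with padding on x86-64 (only 2 per line).
struct Node {
    Node* prev;
    Node* next;
    int   value;
};

int main() {
    std::printf("sizeof(Node) = %zu bytes\n", sizeof(Node));
    std::printf("nodes per 64-byte cache line = %zu\n", 64 / sizeof(Node));
    return 0;
}
[/CODE]
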
R.I.P. 8-bit paletted textures.
I am unsure how these are implemented. As far as I am aware, GPUs do not natively support indexed colour anymore. The Vulkan API has no direct support for indexed colour lookup. One can do some programmable pixel shader hackery to perform the lookup, however this is likely slower than using 32-bit RGBA.

I do recall older versions of OpenGL and Direct3D maybe supporting indexed colour lookup. However, if they did, this functionality is entirely deprecated today and will likely perform sub-optimally on modern hardware due to emulation.
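
What engines do instead, roughly, is expand the palette on the CPU at load time; a minimal sketch (the names are mine):

[CODE]
#include <cstdint>
#include <vector>

// Expand an 8-bit paletted image to 32-bit RGBA once at load time, since the
// GPU will no longer do the index -> colour lookup for us.
std::vector<uint32_t> expandPaletted(const std::vector<uint8_t>& indices,
                                     const uint32_t (&palette)[256]) {
    std::vector<uint32_t> rgba;
    rgba.reserve(indices.size());
    for (uint8_t i : indices)
        rgba.push_back(palette[i]);  // one table lookup per texel
    return rgba;
}
[/CODE]
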
According to this tool, there were MMX instructions in the code, plus an optional code path for using SSE instructions if they were available.

Now, does someone know if any additional optional code paths have been added, in order to take advantage of more advanced instruction sets, like for example SSE2?
Yes, they have been added. They are added by the compiler. Compilers emit multiple code paths for bulk operations thanks to intrinsics and such. Building with a modern compiler can be enough to add AVX code paths. That said, some code paths might have been removed, as modern processors love complex repeat instructions: microcode and pipeline optimizations have removed the need for vector instruction intrinsics for bulk memory operations.
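
A minimal sketch of what such runtime dispatch looks like, using the GCC/Clang builtins (the kernel names are made up; MSVC would query __cpuid from <intrin.h> instead):

[CODE]
#include <cstdio>

static void kernelGeneric() { std::puts("plain x86 path"); }
static void kernelSSE2()    { std::puts("SSE2 path"); }
static void kernelAVX()     { std::puts("AVX path"); }

int main() {
#if defined(__GNUC__)
    __builtin_cpu_init();                     // populate CPU feature flags
    if (__builtin_cpu_supports("avx"))        kernelAVX();
    else if (__builtin_cpu_supports("sse2"))  kernelSSE2();
    else                                      kernelGeneric();
#else
    kernelGeneric();
#endif
    return 0;
}
[/CODE]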

Weirdly enough, I've tried a whole bunch of 4 GB patches; they work in the editor, but ingame none of them worked.
Without rebuilding for x86-64, building with an extended address range will at best allow 4 GB of memory to be allocated on x86-64 OSes. The OS might still reserve up to 2 GB of that extra address space.
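
For anyone curious whether a given 4 GB patch actually took, the /LARGEADDRESSAWARE bit can be read straight out of the PE header; a little self-contained checker (a sketch; offsets per the PE spec, little-endian host assumed):

[CODE]
#include <cstdint>
#include <cstdio>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s war3.exe\n", argv[0]); return 1; }
    FILE* f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }

    uint32_t peOffset = 0;                  // 'MZ' header stores it at 0x3C
    std::fseek(f, 0x3C, SEEK_SET);
    std::fread(&peOffset, 4, 1, f);

    uint16_t characteristics = 0;           // 18 bytes into the COFF header
    std::fseek(f, peOffset + 4 + 18, SEEK_SET);
    std::fread(&characteristics, 2, 1, f);
    std::fclose(f);

    // 0x0020 == IMAGE_FILE_LARGE_ADDRESS_AWARE
    std::printf("LARGEADDRESSAWARE: %s\n", (characteristics & 0x0020) ? "yes" : "no");
    return 0;
}
[/CODE]
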
Will Warcraft 3 someday make the most out of the features of one's CPU for processing BLP files, the same way libjpeg-turbo already does for JPEG files?
Seems a pointless optimization. It might save 2 seconds of load time for a map that must pre-load every single texture in the game.

Intel made Warcraft III's JPEG library. Although it is no longer actively supported, it was likely top quality and already highly optimized at the time. Sure, it might underperform a modern implementation using AVX, but my i7 does not support AVX so it does not help me anyway.

The single biggest improvement Blizzard could make to Warcraft III graphics is also the simplest. They just need to change the load type of all textures to sRGB and the swap chain buffer to sRGB. This would fix up the trashy-looking lighting, vastly improving how the game looks. Since Warcraft III now requires Direct3D9 anyway, I do not see any downside to this. Even performance will remain the same, as all modern GPUs are optimized for sRGB usage.
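
For reference, the Direct3D9 states involved are just these two (a sketch of the idea, not Blizzard's code):

[CODE]
#include <d3d9.h>

// Sample textures as sRGB and re-encode on write, so blending and lighting
// effectively happen in linear space.
void EnableSRGB(IDirect3DDevice9* device) {
    device->SetSamplerState(0, D3DSAMP_SRGBTEXTURE, TRUE); // linearize on fetch
    device->SetRenderState(D3DRS_SRGBWRITEENABLE, TRUE);   // encode on write
}
[/CODE]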
 

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
[...] As far as I am aware, GPUs do not natively support indexed colour anymore. [...]

I do recall older versions of OpenGL and Direct3D maybe supporting indexed colour lookup. However, if they did, this functionality is entirely deprecated today [...]
That's why I wrote R.I.P.

Seems a pointless optimization. It might save 2 seconds of load time for a map that must pre-load every single texture in the game.
There are no pointless optimizations imho.

Intel made Warcraft III's JPEG library. Although it is no longer actively supported, it was likely top quality and already highly optimized at the time. Sure, it might underperform a modern implementation using AVX, but my i7 does not support AVX so it does not help me anyway.
In 2006, an Intel representative "strongly recommends users to check the Intel IPP that will bring more benefits for functionalities and performance."

Source:
Legal obligations on usage of Intel JPEG Library Ijl15.dll

The single biggest improvement Blizzard could make to Warcraft III graphics is also the simplest. They just need to change the load type of all textures to sRGB and the swap chain buffer to sRGB. This would fix up the trashy-looking lighting, vastly improving how the game looks. Since Warcraft III now requires Direct3D9 anyway, I do not see any downside to this. [...]
I am guessing this would bump the OpenGL requirements?

GL_EXT_texture_sRGB requires OpenGL 2.1 support
GL_ARB_framebuffer_sRGB requires OpenGL 3.0 support
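
For comparison, the OpenGL side amounts to this (a sketch; assumes a loader such as GLEW and the GL 2.1/3.0-level support listed above, function name mine):

[CODE]
#include <GL/glew.h>

// Upload a texture with an sRGB internal format (GL 2.1 / GL_EXT_texture_sRGB)
// and enable sRGB-correct framebuffer writes (GL 3.0 / GL_ARB_framebuffer_sRGB).
GLuint uploadSRGBTexture(GLsizei w, GLsizei h, const void* pixels) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glEnable(GL_FRAMEBUFFER_SRGB);  // encode on framebuffer writes
    return tex;
}
[/CODE]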

[screenshot: OpenGL capability report for an Intel i915GM]
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,255
There are no pointless optimizations imho.
There is if it takes them several days of development and relicensing to implement a feature that will give little gain. It might only make a difference if every texture is loaded (which few maps do), and even then it might be in the seconds at best.

On the other hand, changing some of the data structures used to resolve localized text strings during object data loading might save several minutes of load time.
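
To make the idea concrete, the difference is essentially a linear scan versus a hash lookup per TRIGSTR_ key; an illustrative sketch (not the actual engine structures):

[CODE]
#include <string>
#include <unordered_map>

// O(1) average lookup per localized string key, versus O(n) for scanning a
// flat list once per key during object data loading.
std::unordered_map<std::string, std::string> gStringTable;

const std::string& resolveTrigStr(const std::string& key) {
    static const std::string empty;
    auto it = gStringTable.find(key);
    return it != gStringTable.end() ? it->second : empty;
}
[/CODE]
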
In 2006, an Intel representative "strongly recommends users to check the Intel IPP that will bring more benefits for functionalities and performance."
Same reason as above. Trivial benefit gains for a lot of work, and potentially more errors. If something is not broken, there is no reason to fix it, especially when so much else is still broken.
I am guessing this would bump the OpenGL requirements?
Seeing how Warcraft III requires fully D3D9-compliant hardware, I doubt that is a problem.

Gamma (Direct3D 9) (Windows)
 

pyf

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
There is if it takes them several days of development and relicensing to implement a feature that will give little gain. It might only make a difference if every texture is loaded (which few maps do), and even then it might be in the seconds at best.
Quoting someone:

"Dramaticly boosts map load speed when first opening a map. I am quite happy with the few seconds it can save when playing WarCraft III."

Source: Experimental WC3 loading performance booster

Same reason as above. Trivial benefit gains for a lot of work, and potentially more errors. If something is not broken, there is no reason to fix it, especially when so much else is still broken.
With Blizzard, nothing breaks. Instead, everything is getting improved Soon(tm).
:grin:

Because Patch 1.27a was released on March 14, 2016, that is, more or less 800 days ago, I doubt that a few days of specific development work could make any significant difference regarding their roadmap. Plus, it would contribute to paving the way to the future for Blizzard's Classic Games.

afaik, Ijl15 only supports MMX extensions.

On the other hand, changing some of the data structures used to resolve localized text strings during object data loading might save several minutes of load time.
As I said previously, there are no pointless optimizations imho.

Seeing how Warcraft III requires fully D3D9-compliant hardware, I doubt that is a problem.

Gamma (Direct3D 9) (Windows)
Can Mac users use Direct3D natively?
:wink:

Also, depending on one's hardware and drivers, 3D API compliance may vary greatly:

[screenshots: DirectX and OpenGL capability reports for an Intel i915GM]

(note: uses *software* TnL, which might lead to BSODs on startup because some games really do not expect that kind of thing)
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,255
Quoting someone:
That tool was trivial to write and is only really needed when playing on mechanical drives, due to how slow they are at random access. It saves a few seconds of initial load time by improving the performance of Warcraft III file IO on such drives, taking advantage of the OS file cache.
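
My guess at the whole trick, in a dozen lines (a sketch; I have not seen the tool's source):

[CODE]
#include <cstdio>
#include <vector>

// Read the archive once, sequentially, so the OS file cache holds it and
// Warcraft III's subsequent random reads become cache hits.
void warmFileCache(const char* path) {
    FILE* f = std::fopen(path, "rb");
    if (!f) return;
    std::vector<char> buf(1 << 20);  // 1 MiB chunks
    while (std::fread(buf.data(), 1, buf.size(), f) == buf.size()) {}
    std::fclose(f);
}
[/CODE]
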
Because Patch 1.27a was released on March 14, 2016, that is, more or less 800 days ago, I doubt that a few days of specific development work could make any significant difference regarding their roadmap. Plus, it would contribute to paving the way to the future for Blizzard's Classic Games.
It makes a huge difference when there are major outstanding bugs and errors to fix. It is like trying to demolish a molehill in your garden when there is a volcano right next to it. I could list a dozen features that would make more of an impact for everyone, some of them pretty major engine bugs. Would their time be better spent dealing with those, or optimizing JPEG-content BLP image load times?
afaik, Ijl15 only supports MMX extensions.
Which is already a fairly large boost.

Equally well, I am sure performance for JPEG decoding would be increased quite a lot by moving to x86-64. I guess when/if they decide to do that, then going for such a library, or at least a more modern Intel one, would be a worthwhile thing to do.
As I said previously, there are no pointless optimizations imho.
There are when it comes to allocating development resources. If a building is on fire, it is pointless to waste resources improving the finish on the lobby floors. Warcraft III has a lot of serious problems, so many that there is currently no justification for wasting development resources trying to save at most a few milliseconds of JPEG-content BLP load time on map load. Even as far as optimizations go, it is not that worthwhile: it would save at best a few milliseconds per map load, whereas implementing something like a basic JASS JIT compiler could easily save dozens of seconds or even minutes of execution time over the course of a custom map session.
Can Mac users use Direct3D natively?
Macs have OpenGL equivalents. All APIs implemented the feature around the same time.
Also, depending on one's hardware and drivers, 3D API compliance may vary greatly:
All modern graphics hardware should support sRGB textures. The feature has been pretty much mandatory for over 10 years now.
(note: uses *software* TnL, which might lead to BSODs on startup because some games really do not expect that kind of thing)
That does not make sense. As far as the game is concerned, it should not be able to tell the difference between software and hardware TnL, as the implementation is abstracted away from it by the graphics API. If it BSoDs, that points towards a driver bug in the implementation of software TnL.
 

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
That tool was trivial to write and is only really needed when playing on mechanical drives, due to how slow they are at random access. It saves a few seconds of initial load time by improving the performance of Warcraft III file IO on such drives, taking advantage of the OS file cache.
That tool was a PoC which was optimized at least once.

As a PoC, it has imho successfully fulfilled its goal. But the final product (hopefully an executable file written neither in Java nor in .NET) could be better imho.

Which is already a fairly large boost [...]
No, because the virtues of MMX Technology(tm) were more or less a marketing ploy, and were overhyped by Intel. In that respect, 3DNow! by AMD was not any better imho.

MMX (instruction set) - Wikipedia

[...] Equally well I am sure performance for JPEG decoding is increased quite a lot by moving to x86-64. I guess when/if they decide to do that then going for such a library, or at least a more modern Intel one, would be a worth while thing to do.
Modern technology requires up to date software, for the two to work together optimally.

It makes a huge difference when there are major outstanding bugs and errors to fix. It is like trying to demolish a molehill in your garden when there is a volcano right next to it. I could list a dozen features that would make more of an impact for everyone, some of them pretty major engine bugs. Would their time be better spent dealing with those, or optimizing JPEG-content BLP image load times?
There are when it comes to allocating development resources. If a building is on fire, it is pointless to waste resources improving the finish on the lobby floors. Warcraft III has a lot of serious problems, so many that there is currently no justification for wasting development resources trying to save at most a few milliseconds of JPEG-content BLP load time on map load. Even as far as optimizations go, it is not that worthwhile: it would save at best a few milliseconds per map load, whereas implementing something like a basic JASS JIT compiler could easily save dozens of seconds or even minutes of execution time over the course of a custom map session.
Is Warcraft 3 currently on fire? And if yes, then when did that fire start? And why?
With which version of the game did these errors and bugs become prominent?


Performance improvements are always welcome, especially if they are easy (or easy enough) to implement. Plus, such mentions are always nice to read in a whatsnew. It means that the developers care and keep preparing for the future. That way, people do not grow too impatient (up to a point, that is).

Macs have OpenGL equivalents. All APIs implemented the feature around the same time.
It would be interesting to know more about the Macs which can run the many versions of the game, and notably which versions of OpenGL they support. I do not remember reading this information anywhere.

All modern graphics hardware should support sRGB textures. The feature has been pretty much mandatory for over 10 years now.
OpenGL drivers from Intel have notoriously lagged behind their Direct3D ones. And their implementation of TnL allowed them to cut chipset production costs, at the expense of performance (more CPU power required).

That does not make sense. As far as the game is concerned, it should not be able to tell the difference between software and hardware TnL, as the implementation is abstracted away from it by the graphics API. If it BSoDs, that points towards a driver bug in the implementation of software TnL.
Try running XashXT on an Intel i915GM graphics chip (note: Xash3D runs fine), or try activating bump mapping effects in Drakan: Order of the Flame on said hardware.

Fortunately, the situation is not that bad, because a testing benchmark such as 3DMark2001 SE runs fine, even though the more demanding tests make the fan spin up.

Intel graphics chipsets are somewhat peculiar things. Unfortunately, they do not receive that much love from developers, nor from testers.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,255
That tool was a PoC which was optimized at least once.

As a PoC, it has imho successfully fulfilled its goal. But the final product (hopefully an executable file written neither in Java nor in .NET) could be better imho.
Sorry but what is the problem with Java?!
No, because the virtues of MMX Technology(tm) were more or less a marketing ploy, and were overhyped by Intel. In that respect, 3DNow! by AMD was not any better imho.
And modern processors will execute multiple simple instructions per clock cycle anyway, so the "unoptimized" route might still be comparably fast to an AVX-optimized implementation.

I am pretty sure AVX shares the same functional units as the standard x86 and x86-64 pipeline. As such, simply using it does not grant a fixed multiplier of performance. Instead, it potentially allows one to maximize throughput, but serial execution of instructions may already be close to that, depending on the shortest data path (I forget the term's name). Most of the gains come with addition/subtraction and bitwise operations, since those functional units are small and so available in large quantities; multiplication less so, as those units are larger. Division and modulus are always slow, even with AVX, as those require extremely large functional units, which are a very rare resource.
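
As a concrete illustration (AVX2 intrinsics, compile with -mavx2; a sketch only, and the 32-bit lane accumulators can overflow on huge inputs):

[CODE]
#include <immintrin.h>
#include <cstdint>
#include <cstddef>

// Scalar loop: an out-of-order core already retires several of these adds
// per cycle, which is why AVX rarely gives a clean 8x speedup.
int64_t sumScalar(const int32_t* v, size_t n) {
    int64_t s = 0;
    for (size_t i = 0; i < n; ++i) s += v[i];
    return s;
}

// AVX2 loop: eight 32-bit adds per instruction, but they flow through the
// same integer ALU ports as the scalar version.
int64_t sumAVX2(const int32_t* v, size_t n) {
    __m256i acc = _mm256_setzero_si256();
    size_t i = 0;
    for (; i + 8 <= n; i += 8)
        acc = _mm256_add_epi32(acc, _mm256_loadu_si256((const __m256i*)(v + i)));
    alignas(32) int32_t lane[8];
    _mm256_store_si256((__m256i*)lane, acc);
    int64_t s = 0;
    for (int k = 0; k < 8; ++k) s += lane[k];
    for (; i < n; ++i) s += v[i];   // remainder
    return s;
}
[/CODE]
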
Is Warcraft 3 currently on fire? And if yes, then when did that fire start? And why?
Widescreen support still has huge room for improvement, such as anchoring stuff correctly. Warcraft III still crashes extremely easily.

For example, 4:3 has wrong UI hitboxes... It never used to, but now it does! Luckily, few people use it.
With which version of the game did these errors and bugs become prominent?
Since release, and some with the recent patches.
Performance improvements are always welcome, especially if they are easy (or easy enough) to implement. Plus, such mentions are always nice to read in a whatsnew. It means that the developers care, and keep preparing the future. That way, people do not grow too impatient (up to a point, that is).
Or they could change some data structures and save nearly a minute loading some maps... Optimizations are only worth it where they matter.
It would be interesting to know more about the Macs which can run the many versions of the game, and notably which versions of OpenGL they support. I do not remember reading this information anywhere.
Mac graphics are terrible. Ask the Dolphin emulator team.
OpenGL drivers from Intel have notoriously been behind their Direct3D ones. And their implementation of TnL allowed them to cut chipset production costs, at the expense of performance (more CPU power required).
A BSoD is still a driver problem, and has nothing to do with the game using TnL. Neither OpenGL nor D3D specifies that TnL needs to be done with dedicated hardware. As long as the application uses the APIs correctly and the driver produces correct results, no one cares. For a long time, Microsoft offered full software D3D11 implementations, aimed at driver developers testing conformance and at game developers debugging. With these, one could literally run a game on a full software implementation of D3D11. The Vulkan API even has native support for such graphics devices, allowing a program to resolve whether a device is software-emulated, integrated, or discrete.
Try running XashXT on an Intel i915GM graphic chip (note: Xash3D runs fine), or try activating bump mapping effects in Drakan: Order of the Flame on said hardware.
If it causes a BSoD, that is a driver problem. Applications cannot cause BSoDs directly without running into driver bugs or purposely using OS or driver APIs wrongly.
Intel graphics chipsets are somewhat peculiar things. Unfortunately, they do not receive that much love from developers, nor from testers.
Not entirely true. Most emulator developers have more grief with AMD as far as compliance goes. AMD literally ignores their bug reports, while Intel at least resolves some of them. These are API implementation bugs. Outside of obscure graphics companies, NVIDIA usually resolves them the most swiftly, while AMD might never resolve some.

Programmers get around this with shader macros that force different shader code paths for different graphics vendors, to avoid buggy features.
 

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
Sorry but what is the problem with Java?!
The average computer user. Java is not part of the Windows OS. Therefore, people have to install it, and/or update it, and/or learn how to launch an application which uses it. Reading the WC3Boost thread speaks volumes. In fact, installing / updating / using *anything* computer-related can be an issue for some people.

As for me, I personally prefer software which uses a runtime by Microsoft with a small HDD footprint, or no runtime / dependencies at all.

I am pretty sure AVX shares the same functional units as the standard x86 and x86-64 pipeline. As such, simply using it does not grant a fixed multiplier of performance. Instead, it potentially allows one to maximize throughput, but serial execution of instructions may already be close to that, depending on the shortest data path (I forget the term's name). Most of the gains come with addition/subtraction and bitwise operations, since those functional units are small and so available in large quantities; multiplication less so, as those units are larger. Division and modulus are always slow, even with AVX, as those require extremely large functional units, which are a very rare resource.
Afaik, the only things computers are fast at are additions and the handling of integers.

Quoting Ken Silverman:
"Favorite optimization: "sub eax, 128" -> "add eax, -128" Think about it. Or not."

Every bit of optimization helps: -128 fits into a sign-extended 8-bit immediate while +128 does not, so the add form assembles to a shorter instruction than the sub form. I rest my case.

Widescreen support still has huge room for improvement, such as anchoring stuff correctly. Warcraft III still crashes extremely easily.

For example, 4:3 has wrong UI hitboxes... It never used to, but now it does! Luckily, few people use it.
Since release, and some with the recent patches.
Why put a WIP version of the game on the stable release channel, seriously...

Or they could change some data structures and save nearly a minute loading some maps... Optimizations are only worth it where they matter.
Only time will tell what they will decide to do.

For now, I am still waiting for facts and figures about the performance gains some people claim to have noticed with v1.29.2.

Mac graphics are terrible. Ask the Dolphin emulator team.
Maybe those who use the Mac version of the game can share information with us?

A BSoD is still a driver problem, and has nothing to do with the game using TnL. Neither OpenGL nor D3D specifies that TnL needs to be done with dedicated hardware. As long as the application uses the APIs correctly and the driver produces correct results, no one cares. For a long time, Microsoft offered full software D3D11 implementations, aimed at driver developers testing conformance and at game developers debugging. With these, one could literally run a game on a full software implementation of D3D11. The Vulkan API even has native support for such graphics devices, allowing a program to resolve whether a device is software-emulated, integrated, or discrete.
The BSOD on startup of Drakan with bump mapping enabled is a known issue iirc. It is related to the RioT engine itself iirc. I need to recheck that soon(tm).

XP users cannot use D3D11, but they can install the debug version of DirectX 9c, and/or use the DirectX Control Panel to switch to the Ramp Rasterizer, or to the Reference Rasterizer, for D3D rendering.

If it causes a BSoD, that is a driver problem. Applications cannot cause BSoDs directly without running into driver bugs or purposely using OS or driver APIs wrongly.
Having a look at my small collection of minidumps, and based on distant memories, I am guessing that the BSOD for XashXT is caused by the Hardware Abstraction Layer driver (hal.dll), which triggers a 'Kernel mode exception not handled' error. Needless to say, I do not want to reproduce this issue. This is why I have never reinstalled XashXT since... 2013. Again, Xash3D runs fine.

Not entirely true. Most emulator developers have more grief with AMD as far as compliance goes. AMD literally ignores their bug reports, while Intel at least resolves some of them. These are API implementation bugs. Outside of obscure graphics companies, NVIDIA usually resolves them the most swiftly, while AMD might never resolve some.

Programmers get around this with shader macros that force different shader code paths for different graphics vendors, to avoid buggy features.
Since the 1990s, ATI has been notorious for its buggy drivers. Being bought by AMD in 2006 did not improve their reputation. I personally would not buy an AMD video card. Old hatreds die hard (ATI Rage IIc, omg!...)

I used AMD and ATI hardware only in the 1990s. In the 2000s, I used Intel and Nvidia hardware, and nowadays Intel hardware only.
 

pyf

Level 32
Joined
Mar 21, 2016
Messages
2,985
Well, starting with v1.29 and afaik atm, the new 'graphicsapi' video mode command comes with the following options:
- OpenGL2 (works as the old opengl)
- OpenGL4 (CRASHES THE GAME)
- Direct3D9 (works as the old d3d, still the default)
- Direct3D11 (CRASHES THE GAME)
- Direct3D12 (CRASHES THE GAME)
- Metal2 (CRASHES THE GAME)
- Null (CRASHES THE GAME)

afaik, an x1650 is designed for Direct3D 9.0c with SM 3.0, and for OpenGL 2.x
 