
[OpenGL] Help Drawing the MDL

Hello! I've been working on a Java Warcraft 3 model editor/viewer/importer, and it has an in-development model display window. However, I'm really new to OpenGL and I've run into a bug that I don't know how to fix:

When the program renders something with an alpha channel, it still writes to the depth buffer, so anything drawn afterward that lies behind the transparent surface gets hidden as though it were "behind" solid geometry.
Oddly, parts of the same geoset that had that alpha channel still show up behind it. It's a weird bug, probably caused by me not entirely knowing what I'm doing with OpenGL.

Any help would be much appreciated! :)
 
Level 29
Joined
Jul 29, 2007
Messages
5,174
You didn't set a blend mode, so no matter what your alpha is, fragments are written as if they were fully opaque.
Here's the basic example of setting a blend mode:
Code:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Note that when drawing blended geometry you also generally disable depth writes with glDepthMask(GL_FALSE), and restore them afterwards, so that transparent surfaces don't hide what is drawn behind them later.

Blending is tedious and you are unlikely to get results that exactly match WC3 (if you do, share your code!), but here are a couple of pointers on rendering WC3 models.

What Magos did (and what I mostly copied, with a couple of changes) was a lot of enabling and disabling of depth buffer writes, but it is certainly not correct, and it shows in many cases.

For materials, you create layers, sort them by their filter mode, and then render every geoset multiple times, once per layer.
You render the opaque layers first - this is what you generally do whenever you have both opaque and blended meshes, not specific to WC3 - and then you start guessing what yields good results for the blended layers...
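The layer-sorting step above can be sketched in plain Java. The FilterMode and Layer types here are hypothetical stand-ins (the thread doesn't show anyone's actual classes); the idea is simply that opaque and alpha-tested layers get the lowest sort keys so they render before anything blended.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class LayerSort {
    // WC3 MDX layer filter modes, in the order they appear in the format.
    enum FilterMode { NONE, TRANSPARENT, BLEND, ADDITIVE, ADD_ALPHA, MODULATE }

    // Hypothetical stand-in for a material layer; a real implementation
    // would also carry texture ids, shading flags, etc.
    static class Layer {
        final FilterMode filterMode;
        Layer(FilterMode filterMode) { this.filterMode = filterMode; }
    }

    // Opaque (NONE) and alpha-tested (TRANSPARENT) layers write depth and
    // must be drawn before any blended layer, so they sort first.
    static int renderPriority(Layer layer) {
        switch (layer.filterMode) {
            case NONE:        return 0;
            case TRANSPARENT: return 1; // alpha-tested, still depth-written
            default:          return 2; // blended modes render last
        }
    }

    static List<Layer> sortForRender(List<Layer> layers) {
        List<Layer> sorted = new ArrayList<>(layers);
        sorted.sort(Comparator.comparingInt(LayerSort::renderPriority));
        return sorted;
    }
}
```

The relative order among the blended modes themselves is the guesswork part mentioned above; this sketch only guarantees opaque-before-blended.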

The actual "Transparent" filter mode is rendered as opaque; you just use alpha testing (on desktop OpenGL - ES/WebGL don't have it) or a simple condition in your fragment shader (works in any GL) to discard the pixels you aren't going to write. There is no blending here.
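As a sketch, the discard condition is just a per-fragment cutoff test; in a GLSL fragment shader it would read `if (color.a < cutoff) discard;`. The 0.75 threshold below is an assumption (a commonly quoted value for WC3's alpha test), not something confirmed in this thread.

```java
public class AlphaTest {
    // Assumed cutoff for WC3's "Transparent" filter mode; adjust if the
    // real engine turns out to use a different reference value.
    static final float CUTOFF = 0.75f;

    // Returns true if the fragment should be kept, false if discarded -
    // the CPU-side equivalent of "if (a < CUTOFF) discard;" in a shader.
    static boolean passesAlphaTest(float alpha) {
        return alpha >= CUTOFF;
    }
}
```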

For the rest, your guess is as good as anyone's without hacking WC3's rendering code.

You can check out Magos' WC3 Model Editor's code.
It uses Direct3D, but it's generally the same API calls as OpenGL.

You can also check my model viewer's source (using JavaScript and WebGL).
It is more feature-complete than Magos', but you should check both.

Blending in general is a big pain to get decent results with, and trying to mimic a 10-year-old game is even more so, so good luck.
 
Interesting... that all sounds incredibly helpful. I don't have time to look into it tonight, but I really appreciate you taking the time to share! :)

My current system used a buggy bunch of enabling and disabling of depth buffer writes (in a really limited way; you can download the compiled program here), but I had a feeling I was going about it wrong somehow. The render order was based on the arbitrary order in which the user enabled/disabled visibility of geosets, so fixing that is a good start.

My code-learning in the past has been too Java-heavy, and I must say my previous attempts at reading Magos's code never got very far. At least once I dug through it for several hours wondering how he dealt with a certain bug found in other modeling programs, only to find that in the end he hadn't, and his program was bugged too, I think? I forget exactly what the issue was - maybe whether VertexGroup references were read as signed or unsigned for larger models with 128+ vertex groups (they sometimes rolled over to negative with the MdlxConverter, and I wondered if he handled it). I forget whether that was the particular issue or not.
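The signed/unsigned rollover described above is a classic Java pitfall: `byte` is always signed, so a matrix-group index of 200 read naively comes back as -56. Masking with `0xFF` recovers the unsigned value. A minimal sketch, with a hypothetical read method and assuming the MDX field really is an unsigned byte:

```java
import java.nio.ByteBuffer;

public class UnsignedByteRead {
    // Reads one vertex-group index as an unsigned value in 0..255.
    // ByteBuffer.get() returns a signed byte, so values >= 128 would
    // otherwise wrap to negative; the 0xFF mask undoes that.
    static int readVertexGroup(ByteBuffer buf) {
        return buf.get() & 0xFF;
    }
}
```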

Anyway, I'll definitely have a look at reading your and Magos's sourcecode; thanks again!
 

Dr Super Good

Spell Reviewer
Level 63
Joined
Jan 18, 2005
Messages
27,196
Warcraft III was written targeting Direct3D 8.1. The closest render behaviour you could obtain would logically be using parts of the API that existed around that time.

As the game is old and had to perform well on old hardware, I doubt they were doing anything too hacky with their graphics, so there is probably a simpler way to render the models.
 
Level 29
Joined
Jul 29, 2007
Messages
5,174
Warcraft III was written targeting Direct3D 8.1. The closest render behaviour you could obtain would logically be using parts of the API that existed around that time.

As the game is old and had to perform well on old hardware, I doubt they were doing anything too hacky with their graphics so there is probably a simpler way to render the models.

Very likely; the programmable pipeline (aka shaders) didn't even exist back then, but again, your guess is as good as anyone's as to how they rendered things.

....maybe whether VertexGroup references were signed or unsigned for larger models with 128+ vertex groups... i.e. sometimes they rolled over to negative with the MdlxConverter, and I wondered if he dealt with it, perhaps? I forget whether that was the particular issue or not.)

If you are following Magos' MDX specs, note that they contain errors and missing information.
I said I would write updated specs, but I never did, because I am lazy, so you might want to look at my parser instead.
 

Dr Super Good

Spell Reviewer
Level 63
Joined
Jan 18, 2005
Messages
27,196
Very likely, the programmable pipeline didn't even exist then (aka shaders), but again, your guess is as good as anyone's as to how they rendered things.
Incorrect: according to Wikipedia, Direct3D 8 was actually the first version of Direct3D to support programmable shaders. However, its shaders were written in a low-level assembly-like shading language rather than the HLSL introduced with Direct3D 9, so you are partially correct in that it is not what we use today. Direct3D 8 is hailed as the beginning of easy graphics programming thanks to its programmable shaders, but obviously we cannot forget OpenGL, which offered similar functionality around that time.

I am sure there must be some way to reverse-engineer the shaders WC3 uses.
 
Level 29
Joined
Jul 29, 2007
Messages
5,174
I don't know about Direct3D, but in OpenGL shaders only became core in version 2.0 (yes, technically they existed long before that, but as extensions which a graphics card might or might not support).

It's been a couple of years since I last reverse engineered anything, so count me out.
We would need to know both the Direct3D/OpenGL API calls and the shader code, if it exists (I would bet it does not).

/Edit
Wrote the specs in this forum, hope they help.
 