
Reforged HD Historic Asset Pipeline

Hey all,

I developed my own tool for editing MDX, then hacked in some HD Reforged support when the Reforged assets were released, but I never fully learned how Blizzard created the original HD assets. When I say "how", I specifically mean which computer programs were used.

I am making this thread because I am curious whether anyone can answer the following question better than I can: if money were no object, and I wanted to make HD assets with the exact technology pipeline used to make Reforged assets just to experience it, exactly what software would I purchase, and in what order would I use it?

My current understanding is that the steps included:

  • 3D Asset is created in some tool unknown to me; I have heard names like "ZBrush" and "Substance Painter" but have never used them, nor googled them yet
  • 3D Asset is opened in Maya, which I have never used (or if I did, I forgot what it was like), but I heard of it back in the day as a 3ds Max alternative
    • Along the way to this step, up to four (4) uncompressed TIFF texture layers are generated for each surface the artist is texturing:
      • Diffuse
      • Normal
      • ORM
      • Emissive (if used)
  • Asset exported from .ma to .mdl
    • Log files in the Reforged Beta tell us there was a batch exporter of some kind for this step. They include the name of the Blizzard developer who wrote the plugin, and note that it was an internal Blizzard plugin. As far as I know, unlike the official 3ds Max Blizzard plugin published in the 2000s (used to create all in-game assets for the 2002 game and its expansion), the official Reforged export plugin was never given to the community.
    • Somewhere in this pipeline, the ORM alpha channel is specially encoded to define Team Color. I have no idea whether this happens before or after the model is loaded in Maya.
  • The MDL and uncompressed TIFF data are stored in the game's version control system until a Blizzard developer triggers the build/compile process for the Warcraft III game.
    • At this step, the MDL is converted to the binary MDX format, to boost asset load performance in the release game client.
    • Likewise, the textures are compressed from their TIFF source files to DDS, also for performance (a rough sketch of this step follows the list).
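For the texture half of that build step, here is a rough sketch of what a batch TIFF-to-DDS pass could look like. To be clear, this is my own guess using Microsoft's free texconv tool (from DirectXTex), not Blizzard's actual build system; the folder names and the BC3 format choice are assumptions on my part:

```python
"""Hypothetical build step: compress TIFF source textures to DDS.

Assumes Microsoft's texconv (DirectXTex) is on PATH. The folder names
and the BC3 block-compression format are guesses, not Blizzard's actual
build configuration.
"""
import subprocess
from pathlib import Path

SOURCE_DIR = Path("source_textures")  # uncompressed TIFFs from version control
OUTPUT_DIR = Path("built_textures")   # DDS files for the release client
OUTPUT_DIR.mkdir(exist_ok=True)

for tiff in SOURCE_DIR.rglob("*.tif"):
    # -f BC3_UNORM: block compression that keeps an alpha channel (needed
    # if the ORM alpha really does carry Team Color); -m 0 generates a
    # full mip chain; -y overwrites stale output from a previous build.
    subprocess.run(
        ["texconv", "-f", "BC3_UNORM", "-m", "0", "-y",
         "-o", str(OUTPUT_DIR), str(tiff)],
        check=True,
    )
```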

As an aside to the above, models have metadata embedded during the conversion for particles and facial animations, which are linked by filepath reference from the .mdl game assets:
  • Particle data is created in a node scene graph editor with many types of layers and color parameters, purchased from PopcornFX (PopcornFX - Real-time FX solution for particle effects).
    • Reforged renders these by including licensed software in the game code; presumably that rendering code was integrated by Blizzard Entertainment but not written by them.
    • The files are .pkfx inside the Warcraft III version control system, and include links to TIFF source assets that are compressed to DDS during the Warcraft III release client build by the same system used for the textures referenced in MDL files.
    • PKB can be used alone anywhere the Reforged game engine accepts MDX or MDL file references, but it is usually linked by a CORN-type MDX node (called "ParticleEmitterPopcorn" in the version-control "MDL" json-like text-based model format inside the office, or so I was told by someone who claimed to have reverse engineered part of the game). See the sketch after this list for one way to spot those nodes.
  • Character facial animation is configured with FaceFX, purchased from FaceFX | OC3 Entertainment.
    • It uses the ".facefx_ingame" file extension after the build, and probably ".facefx" inside version control.
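As a side note, if anyone wants to check whether a given model actually carries one of those CORN nodes, the binary MDX container can be walked chunk by chunk. Here is a minimal sketch assuming the community-documented layout (an "MDLX" magic, then repeated chunks of a 4-byte ASCII tag plus a little-endian uint32 byte length); I have not verified the CORN tag against any official tooling:

```python
"""Walk the top-level chunks of a binary MDX file and flag any CORN
(ParticleEmitterPopcorn) chunk. Assumes the community-documented layout:
'MDLX' magic, then repeated [4-byte tag][uint32 little-endian size][payload].
"""
import struct
import sys

def list_chunks(path):
    with open(path, "rb") as f:
        if f.read(4) != b"MDLX":
            raise ValueError("not an MDX file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                break  # end of file
            tag = header[:4].decode("ascii", "replace")
            size = struct.unpack("<I", header[4:])[0]
            yield tag, size
            f.seek(size, 1)  # skip the chunk payload

if __name__ == "__main__":
    for tag, size in list_chunks(sys.argv[1]):
        note = "  <-- Popcorn particle emitters" if tag == "CORN" else ""
        print(f"{tag}  {size} bytes{note}")
```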

So, this is most of what I know right now. But it would obviously cost me thousands of dollars to use this knowledge to actually get the software and make my own Reforged HD model, so I have never tried. But I was curious if others know more!
Thanks.
 
I work in animation, not game development, at a studio that primarily uses Maya, so I may be able to shed some light on some of this, but not all. I am a modeler, so I have no experience using FaceFX or PopcornFX. Those are outside my area of expertise. Hopefully others on this site know more about those!

For modeling, every artist will have their own workflow. Some studios offer more flexibility than others in which tools can be used. From what I've read and heard in panels and interviews, it sounds like the starting point for modelers on Reforged would have been building high resolution geometry, most likely in ZBrush. Other programs offer sculpting tools that can achieve the same result (Mudbox, Blender, Maya, etc.), but ZBrush is the standard. Once completed, the high resolution geometry would need to be retopologized to achieve a more reasonable poly count and an edge flow that allows for clean deformation. There are plenty of tools for this with no absolute standard. Topogun used to be popular, but now ZBrush and Maya have great retopology tools. I'm sure Blender and 3D Studio Max have perfectly viable retopology tools as well.

For texturing, Substance Painter is almost certainly used. In animation, some artists will also use 3D-Coat or Mari, but Substance Painter is by a considerable margin the most popular, and I would not be surprised if that margin is even larger in game development. The artist brings their geometry into Substance Painter and uses a combination of brush strokes and procedurals to control the look of the surface. Substance Painter lets you configure export presets that write your metalness, reflectivity, diffuse, ambient occlusion, emissive, normals, etc. to the appropriate files and channels within those files.
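To make the channel-packing idea concrete, here is a rough sketch of what that packing step looks like if you do it by hand rather than through Substance Painter's export presets. It assumes Pillow, placeholder file names, and the unconfirmed Reforged convention discussed in this thread (occlusion/roughness/metalness in RGB, team-color mask in alpha):

```python
"""Pack separate grayscale maps into a single ORM texture.

Assumes the unconfirmed Reforged convention discussed in this thread:
R = ambient occlusion, G = roughness, B = metalness, A = team-color mask.
File names are placeholders. Requires Pillow (pip install Pillow).
"""
from PIL import Image

def pack_orm(ao_path, rough_path, metal_path, team_path, out_path):
    # Load each baked map as a single-channel grayscale image ("L" mode).
    ao = Image.open(ao_path).convert("L")
    rough = Image.open(rough_path).convert("L")
    metal = Image.open(metal_path).convert("L")
    team = Image.open(team_path).convert("L")

    # All four maps must share the same resolution to be merged.
    if not (ao.size == rough.size == metal.size == team.size):
        raise ValueError("input maps must have matching dimensions")

    # One channel per map: occlusion, roughness, metalness, team color.
    orm = Image.merge("RGBA", (ao, rough, metal, team))
    orm.save(out_path)  # TIFF output stays uncompressed, like the sources

pack_orm("hero_ao.tif", "hero_roughness.tif", "hero_metalness.tif",
         "hero_teamcolor.tif", "hero_orm.tif")
```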

In theory, all the modeling, surfacing, rigging, and animation could be done just with Maya and Substance Painter, though I pity any artist who is forced to sculpt their geometry in Maya.
 
@ShadiHD might also have a thing or two to say.
I was also under the impression that ORM alpha = team color is a Reforged-wide thing, not model specific, because our early models had working team color and, to my knowledge, no specific editing of that model was done... unless RMS somehow did that internally.

What would be really interesting to know is how the normal map calculation is done, or why it breaks (a little) when converting .obj (at least, that's what we use) to .mdl or .mdx using RMS.
I probably already know the answer... it's that we don't use Maya to convert to .mdl.
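Another guess: if the converter recomputes vertex normals instead of keeping the ones stored in the .obj, the tangent basis the normal map was baked against no longer matches, and the shading shifts slightly. Just to illustrate why recomputed normals almost never reproduce the stored ones exactly, here is the standard area-weighted recipe as a sketch (assuming NumPy; this is not RMS's actual code):

```python
"""Illustration: recomputing smooth vertex normals for a triangle mesh.

If a converter does something like this instead of keeping the normals
stored in the .obj, the result rarely matches the originals exactly, so a
normal map baked against the old basis shades slightly differently.
Not actual RMS code, just the standard area-weighted recipe.
"""
import numpy as np

def recompute_vertex_normals(vertices, triangles):
    # vertices: (V, 3) float array; triangles: (T, 3) int index array.
    normals = np.zeros_like(vertices)
    v0 = vertices[triangles[:, 0]]
    v1 = vertices[triangles[:, 1]]
    v2 = vertices[triangles[:, 2]]
    # The cross product's length is proportional to triangle area, so
    # summing these un-normalized face normals area-weights the average.
    face_normals = np.cross(v1 - v0, v2 - v0)
    for corner in range(3):
        np.add.at(normals, triangles[:, corner], face_normals)
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.maximum(lengths, 1e-12)
```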
 
@Hawkwing is pretty much spot on. With ZBrush you can either block out with primitives or start right away with a dense mesh (DynaMesh) that you can shape, and at any point use an auto-remesh function that will reorganize the dense topology or halve it if you want to go lower. The cool thing is that as you start low and increase subdivisions, it stores how many times you did so via a slider, and you can go up and down in density as you please. After you're happy (usually at the final high poly pass, with whatever sculpt detail you have), you send it to Maya, which is pretty much only used for retopology and UV work.
Though since there are many different workflows, there are artists who like to do some modeling in Maya at the start to get a clean base mesh, then send it into ZBrush to subdivide and do the rest of the high res detail work there.

Normal, ambient occlusion, and other maps are baked either in Substance Painter or Marmoset Toolbag. Most now prefer Marmoset for baking since its UI gives a better visual representation of what you're getting, whereas in Painter it can be a lot of back-and-forth slider playing. Baking is when you bring your retopologized low poly stuff into the same scene as its high poly parts, and the program essentially projects the details of the high poly onto the low poly, giving you the exported maps you would then import into Painter with your lows and start texturing. (Substance was acquired by Adobe recently, and it's now part of their monthly/annual Creative Cloud subscription.)

I imagine Blizzard, or whomever they reportedly outsourced the 3D work to, developed an in-house tool to then export to the file formats RF uses.
Speaking of which, I haven't looked at the RF WE much. I know you can set up projects as SD or HD, but does it now take .obj's with normal .tga's, etc.?
 

ShadiHD
If you want to make a Reforged-level character/asset, then you do the following (since I did it with our CSW custom models):

  • Base blocking in Maya/3ds Max: you do the initial shape of the model (how it looks, the overall silhouette, etc.).
  • Then you transfer the model to either Mudbox or ZBrush, which are sculpting programs that give you very high quality results, like adding pores, skin detail, and all that. After you've done the "high polygon" version of the model in those programs, you go back to Maya to retopologize it and bring it down to a game-accepted polycount.
  • UV mapping it to make it ready for texturing.
  • By today's standards, people go for Substance Painter, since it's basically Photoshop for 3D models, so all high-end texturing happens there, including the ORM and Normal maps, and you export all three from it.

Tada, you now have a model that's ready for animation.
 