My goal was to have a computer program that spits out this model upload at the touch of a button. For example, the Team Color wings in my upload are computer generated. But like you pointed out, my converter can't distinguish TC alpha from transparency alpha yet.
Do you know if the trimming you did on the wings could be automated? Did you simply drag the UVs?
I was thinking that my triangle detector, which checks each triangle for team color and splits it into a separate geoset if it has TC, could maybe try to find triangles that are mostly transparency alpha rather than TC, and split the wings so that the frills on the end are all transparency alpha and the team color starts further up. But that might introduce a hard geometric line where Reforged's shader can do a tattered TC-colored surface, because it has two separate channels (TC and transparency) for every fragment.
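In rough pseudocode, the split I have in mind is something like the sketch below. This is just the idea, not my actual converter code: the function names are placeholders, and I'm assuming I can sample the texture over each triangle's UV footprint and vote on whether it's mostly TC or mostly see-through.

```python
import numpy as np

def classify_triangle(alpha, tc_mask, uvs, samples=64):
    """Classify one triangle as 'tc' or 'transparent' by sampling its UV footprint.

    alpha   : HxW float array, the texture's alpha channel (0..1)
    tc_mask : HxW bool array, True where a texel is team color
              (Reforged has separate channels for this; in classic you
              would have to infer it from the diffuse alpha instead)
    uvs     : (3, 2) array of the triangle's UV coordinates
    """
    h, w = alpha.shape
    tc_hits, alpha_hits = 0, 0
    rng = np.random.default_rng(0)
    for _ in range(samples):
        # uniform random barycentric sample inside the triangle
        a, b = rng.random(2)
        if a + b > 1.0:
            a, b = 1.0 - a, 1.0 - b
        u, v = (1 - a - b) * uvs[0] + a * uvs[1] + b * uvs[2]
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        if tc_mask[y, x]:
            tc_hits += 1
        elif alpha[y, x] < 0.5:  # mostly see-through texel
            alpha_hits += 1
    return 'tc' if tc_hits >= alpha_hits else 'transparent'

def split_geosets(triangles, alpha, tc_mask):
    """Route each triangle into a TC geoset or a transparency geoset."""
    geosets = {'tc': [], 'transparent': []}
    for tri in triangles:
        geosets[classify_triangle(alpha, tc_mask, tri)].append(tri)
    return geosets
```

The per-fragment version that Reforged's shader does has no equivalent here, which is exactly why a per-triangle vote like this would create that visible seam on the frills.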
Not sure if you care, but for my screenshot on this upload I was using Patch 1.22, so that's probably even older than what you are using. What technique are you using to doctor the textures? Is that manual labor, or more of an automated process like what I am going for? (Are you just loading the diffuse color? My baking script loads diffuse, normal, and ORM, then tries to integrate them by baking against a static light using a software-only version of the same logic that was in the Retera Model Studio HD render shader, although I think the software version has some bugs.)
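To give you an idea of what I mean by baking against a static light, here's a heavily simplified sketch of the approach. The real shader logic has more to it (specular, fresnel, emissive, etc.), and the names and constants here are just placeholders, not my actual script:

```python
import numpy as np

def bake_static_light(diffuse, normal_map, orm, light_dir=(0.3, 0.5, 0.8)):
    """Bake lighting into a diffuse texture using one fixed directional light.

    diffuse    : HxWx3 base color, 0..1
    normal_map : HxWx3 tangent-space normals encoded 0..1 ((0.5, 0.5, 1) = flat)
    orm        : HxWx3 occlusion / roughness / metallic, 0..1
    The Lambert-only shading and the 0.25 ambient term are simplifications.
    """
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    n = normal_map * 2.0 - 1.0                        # decode to -1..1
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    ndotl = np.clip(n @ l, 0.0, 1.0)                  # Lambert term
    occlusion = orm[..., 0:1]                         # O channel of ORM
    ambient = 0.25 * occlusion
    lit = diffuse * (ambient + ndotl[..., None] * occlusion)
    return np.clip(lit, 0.0, 1.0)
```

Doing this per-texel in software instead of on the GPU is where I suspect my bugs are creeping in, since there's no driver to normalize or clamp things for free.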