I've been pondering some design decisions concerning my method of mapmaking. I have used the basic layout in my major project for some time now and deem it very handy. But when I try to generalize it, I have to think about what I can and still want to assume, in order to strike a balance between convenience and universality.
So first explaining the current situation:
In the World Editor you face a technical view. There is an Object Editor where you create objects, a Trigger Editor where you write triggers, an Import Manager where you import resources, and so on. When you try to realize some feature for your map, like a custom ability, it usually consists of multiple technical components: one or more script pages, corresponding objects, textures, sounds, models etc., which you need to integrate. A disadvantage of this paradigm is that you normally work on one feature at a time, and the technical view makes that troublesome because the parts are mixed in with the parts of other features. There are also no strong references in the World Editor. Deleting your custom ability does not erase all of its components; the concept of your custom ability is virtual to begin with, there is no technical mechanism tying the parts together, so you have to do it manually.
What I deem rather effective, although that also depends on the map, is an organization by features. You have a directory structure like:
- featureA
- featureB
  - featureB_sub
and each feature can possess multiple components or even a sub feature as depicted above:
- featureA
  - some unit for featureA
  - some script for featureA
- featureB
  - some item for featureB
  - some comment for featureB
  - featureB_sub
    - some whatever for featureB_sub
That makes your imaginary feature compounds real under a label and gives you a scoped view.
I mirror this structure on the operating system. Each feature corresponds to a folder, each component is a file. An IDE like Eclipse can import the file system by linking to the project folder. Using the operating system is good for integrity, too: you won't be able to corrupt everything at once, and a main desire in general is to be able to edit multiple components in parallel.
A compiler with a couple of tools translates and merges the data for wc3. Currently there are two relevant component types: one is script code in textual shape (like vjass, wurst), the other is object definitions in the form of data tables. The latter provide additional script code @compiletime, which must be imported in the local scope. Ex:
featureA
- unitA
  - id = 'PlGn'
  - name = "Placid Gnoll"
  - life = 20
  - moveSpeed = 270
- scriptA

Code:
scopeA
    import unitA_gen.script

    func createLocalUnit(player owner, real x, real y, real ang)
        CreateUnit(owner, unitA.id, x, y, ang)
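The generated unitA_gen.script then exposes the table fields so that the importing scope can write unitA.id like in createLocalUnit above. Roughly, it would contain something along these lines (the exact shape and the types here are a simplified sketch, not the real generator output):

Code:
unitA_gen
    // generated from the unitA data table; fields become compile-time constants
    public struct unitA
        static constant int id = 'PlGn'
        static constant string name = "Placid Gnoll"
        static constant real life = 20
        static constant real moveSpeed = 270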
Vice versa is not necessary: object definitions do not need to know about scripts. If they depend on some variable, that variable is exported into an object, too. Objects can reference each other; the current syntax for that is '<pathToObject:field>'.
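For illustration, a hypothetical item in another feature could pull the gnoll's name from the table above (the path separator is just how I write it in this example):

featureB
- itemB
  - id = 'ItmB'
  - name = "Gnoll Charm"
  - tooltip = "Summons a <featureA/unitA:name>."

The compiler would then resolve the reference to "Placid Gnoll" when building the object data.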
issue 1:
How to integrate the object data into the scripts? The problem is that the script files span their own network of scopes, which may not be aligned with the one I described above. However, unless you want the highest level of explicitness, the object data should be bound to a script scope.
- option A:
state the target file/script scope in the object as a special field -> bad for copy-paste; adds a coupling 'object -> script', which contradicts what I just said, that the object should not need to know the script
- option B:
write the import line into the script like in the example above (see the sketch after this list) -> kind of inconvenient and redundant, though more explicit about which objects are available; not too good for copy-paste, since there are usually multiple objects that need to be imported
- option C:
auto-collect all the objects' scripts into a single one/under a single fixed label and import that with a single line -> more convenient, less explicit, better for copy-paste; does not work when there are multiple script files on the same feature level, unless all of them are targeted
- option D:
auto-import the objects into all of the local script files' scopes and have the compiler build wrappers to avoid duplicated imports, because e.g. initializers must be unique -> most convenient and copy-paste friendly, least explicit; adds a language dependency and requirements on the compiler (it needs to transform all the references or create a clusterf*ck of wrappers)
- option E:
enforce that there is only one script scope per folder and auto-import everything therein -> less flexibility, highest implicitness, puts a requirement on the scripts
- option F:
do not use the script scopes as the direct receivers of the object data, but instead invent some extra syntax so the script can address the object outputs while keeping the short, local paths of the feature the script file is in -> requires the special syntax on every reference (probably just some $ prefix plus a relative path addition in case of subfeatures); the severe objection here is that, because the object data is not strongly tied to the script scope, foreign script scopes have to address the object data directly or use the special syntax, too, and therefore need additional knowledge about the implementation
- option G (currently in use):
have an extra dedicated folder per script scope inside the feature folder, so all the object data belonging to that script scope gets dumped there -> very implicit, redundant; the folder must be specifically marked and it grants script files a special status; it also raises the question of whether objects referencing each other should account for the extra encapsulation in the path that this script scoping induces or not
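To make a couple of the options more tangible: this is roughly how scriptA from the example above would look under option B versus option C. The second object and the collected file name are made up for this sketch:

Code:
// option B: one explicit import line per object
scopeA
    import unitA_gen.script
    import abilityA_gen.script    // hypothetical second object of featureA

// option C: everything auto-collected under one fixed label
scopeA
    import featureA_objects.script    // single line; the contents stay implicit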
That's enough for starters. More questions maybe later on.