
Efficiency battle (every 0.01 of gametime vs timers)

Level 23
Joined
Apr 16, 2012
Messages
4,041
The oplimit is probably set every time a JASS thread is scheduled/rescheduled.

This should still fail to run function q, though, since the thread still crashed.

It is like when you have a loop with TriggerSleepAction but unintentionally run too many operations before calling it: there is no point of return, the thread will crash immediately, but function q was executed just fine.
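
A minimal sketch of that failure mode, assuming the reschedule-resets-the-oplimit behaviour described above (the names, bounds, and yield interval are invented, and TriggerSleepAction only works in threads that are allowed to sleep):

JASS:
function q takes nothing returns nothing
    local integer i = 0
    loop
        exitwhen i >= 1000000
        set i = i + 1
        // without this periodic yield the thread would silently hit the
        // op limit and be killed mid-loop; the sleep reschedules the
        // thread, which appears to reset the op limit
        if ModuloInteger(i, 10000) == 0 then
            call TriggerSleepAction(0.)
        endif
    endloop
    call BJDebugMsg("finished: " + I2S(i))
endfunction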
 

Cokemonkey11

Code Reviewer
Level 29
Joined
May 9, 2006
Messages
3,522
As per the map attached, that is not true: if you run only one of these functions, or just uncomment the ExecuteFunc(test), it will show 42857, whereas if you keep the code as shown it will actually say 85714, which is exactly 2 times higher. This means each conditionfunc runs in its own thread, or at least with its own oplimit. Other than TriggerSleepAction, I do not know of a way to reset the oplimit without spawning a new thread.

Indeed, I was mistaken. I failed to recall that virtual thread operation limits and virtual threads being killed by TSA were entirely different concepts.
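
A rough sketch of the kind of test the quote describes; the names and loop body are invented, and the exact counts (42857 per thread in the attached map) depend on the operations per iteration:

JASS:
globals
    integer counter = 0
endglobals

function test takes nothing returns nothing
    loop
        // burns operations until this thread hits its op limit and is killed
        set counter = counter + 1
    endloop
endfunction

function runBoth takes nothing returns nothing
    call ExecuteFunc("test") // new thread, its own op limit
    call ExecuteFunc("test") // second thread, second fresh op limit
    call BJDebugMsg(I2S(counter)) // roughly double a single run's count
endfunction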

However it is correct for attacks. As such I am not incorrect, especially with the disclaimer I provided.

If someone asks you "is this apple red?" and you answered "yes! bananas are yellow" you would only not be incorrect in the sense that the correctness of irrelevant, misleading information doesn't have ontological status.

Discrete simulations can be completely decoupled from time, as they advance their virtual time through "ticks" rather than continuously as is the case in real time. What you said is very true for reality, as time is continuous; however, in a discrete simulation where time is relative it makes no difference. If you run something twice every tick or every half tick you will still get the same sequence of events generated (runs twice, followed by a tick, in an infinite sequence).

Discretization will logically affect the frequency components the simulation can output. This is why SC2 missile movers compute two movement ticks every deterministic tick, so that they can output paths with 16 game Hz movement instead of only 8 game Hz. This is due to the Nyquist theorem applied to discrete systems.

Not all simulations are invariant on real time. Consider, for example, a game which it is desirable to run at 300 FPS, so that a screen capture of the output could be slowed down (for the purposes of this scenario, the simulation can't be re-played).

Discretization will logically affect the frequency components the simulation can output. This is why SC2 missile movers compute two movement ticks every deterministic tick, so that they can output paths with 16 game Hz movement instead of only 8 game Hz.

You keep talking about StarCraft 2 missile movers, but all I read is "SC2 made an optimization decision by defining their clock period as Z" (where Z can be anything, since such an optimization is necessary in any context). This is the same thing I said a number of days ago. Either you don't understand what I'm talking about; you're discussing tangential, undisputed information; or you're oblivious.

Most of the time not so. Non-credited kills are usually bad for gameplay as they fail to reward the killer. Any form of such a mechanic often leads to DotA Allstar style "kill deny" mechanics.

At most it could be used as an interpolation method so as not to trigger damage response events at a high frequency. However the damage response mechanism would need to be able to recall the total damage dealt over the period from the DoT effect otherwise it will incorrectly report less damage than the unit sustained. This is very important for many triggered systems and spells that involve damage response.

I said "[sometimes not being reduced or credit given are sometimes desirable]". By disputing me, you're saying that "giving credit and being reduced are always desirable" - do you see how that's illogical? I provided a link, giving at least one *general case, well accepted* context where dealing damage without credit or reduction is useful. Your response is uninteresting too - need I remind you that Warcraft 3 uses last-hitting mechanics? Not to mention 68 million+ players play top MOBA titles, all of which use last-hitting. http://en.wikipedia.org/wiki/League_of_Legends#cite_note-wsj-5 http://en.wikipedia.org/wiki/Heroes_of_Newerth#cite_note-HoNF2P-3 http://en.wikipedia.org/wiki/Dota_2#cite_note-7

That performance difference would be due to the reduction of 1 passed argument.

Right, but that's how the native works. There's no point in talking about why the performance difference is there when the discussion is about whether it's there or not.

20% is also quite trivial except in major performance-critical applications, and since the attribute is an argument you could possibly recycle the same system code to change mana as well as life (not possible with SetWidgetLife).

1) It still won't perform better (JASS has no pipelining) 2) It's not interesting how trivial or nontrivial the performance difference is, when the discussion is only about which performs better.
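
For what it's worth, a rough micro-benchmark sketch of the two natives from this side-discussion; the unit id 'hfoo' (Footman), the iteration count, and the external timing harness (not shown) are arbitrary choices:

JASS:
function BenchWidgetLife takes nothing returns nothing
    local unit u = CreateUnit(Player(0), 'hfoo', 0., 0., 270.)
    local integer i = 0
    loop
        exitwhen i >= 10000
        call SetWidgetLife(u, 100.)
        set i = i + 1
    endloop
    call RemoveUnit(u)
    set u = null
endfunction

function BenchUnitState takes nothing returns nothing
    local unit u = CreateUnit(Player(0), 'hfoo', 0., 0., 270.)
    local integer i = 0
    loop
        exitwhen i >= 10000
        call SetUnitState(u, UNIT_STATE_LIFE, 100.)
        set i = i + 1
    endloop
    call RemoveUnit(u)
    set u = null
endfunction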

Since such errors would violate determinism, a game like WC3 would use a simulation of time which is loosely synchronized to external inputs (lag delay in multiplayer as a result of synchronization to assure correct sequencing).

You're arguing that we shouldn't increase the clock rate for a DPS system, since *if* the game is multiplayer, there will be some latency associated with state synchronization.

1) It's stupid to say we shouldn't increase the clock rate, if the *baseline* for the clock isn't established. In other words, if my clock runs at 1 Hz, increasing it to 2 Hz will be an improvement on any system where the synchronization delay is less than 2 seconds (by the Nyquist sampling theorem). If you could reason that multiplayer WC3 has "at best" 25 ms delay for its state synchronization, then by the Nyquist sampling theorem you'd need an 80 Hz clock to meet the bottleneck (arithmetic spelled out after these points).

2) No one said anything about clocks being useful only in multiplayer contexts.
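
Spelling out the arithmetic from point 1 with the hypothetical 25 ms synchronization delay:

$$f_{\text{clock}} \ge 2\,f_{\text{events}}, \qquad f_{\text{events}} = \frac{1}{25\ \text{ms}} = 40\ \text{Hz} \;\Rightarrow\; f_{\text{clock}} \ge 80\ \text{Hz}$$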

Since the game is not a continuous simulation, timers will only be run at certain stages during state advancement. To assure correct sequencing some form of time event queue could be used allowing timer elements to run in what would appear as the correct relative sequence.

Yes, that is correct, but my point is that *whether or not you know* what the peak state synchronization frequency is for WC3, selecting a clock frequency will still be an optimization process (that is not to say that it's impossible to waste CPU cycles in WC3).

No it is not, as a life change will not run damage responses. This is perfect for healing (since you do not want damage responses to run from being healed) but totally useless for damage (as you do want damage responses to run).

See link provided.
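
To make the quoted distinction concrete, a minimal sketch (the attack/damage/weapon types are illustrative defaults):

JASS:
// healing: a raw life change runs no damage responses, which is desired here
function Heal takes unit u, real amount returns nothing
    call SetWidgetLife(u, GetWidgetLife(u) + amount)
endfunction

// damage: UnitDamageTarget runs damage responses (triggers, shields, armor)
function Damage takes unit source, unit target, real amount returns nothing
    call UnitDamageTarget(source, target, amount, true, false, ATTACK_TYPE_NORMAL, DAMAGE_TYPE_NORMAL, WEAPON_TYPE_WHOKNOWS)
endfunction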
 

Dr Super Good

Spell Reviewer
Level 63
Joined
Jan 18, 2005
Messages
27,192
If someone asks you "is this apple red?" and you answered "yes! bananas are yellow" you would only not be incorrect in the sense that the correctness of irrelevant, misleading information doesn't have ontological status.
That example makes no sense and is totally irrelevant...

It is more like asking "What colour is my apple?" with the response "I saw an apple before and it was red so your apple is likely to be red."

In this case the apple was green so the guess was incorrect. However the apple could have turned out to be red in which case the guess would be right. This is called making an educated guess for knowledge you do not have based on knowledge you do have. Ultimately it is still a guess so could be right or wrong. I made a guess and provided all the facts I based it on however it turned out wrong.

What I do not understand is why you are even making a fuss out of this? I even said at the time that it was quite a wild guess (not the sort one would want to put money on).

Not all simulations are invariant on real time. Consider, for example, a game which it is desirable to run at 300 FPS, so that a screen capture of the output could be slowed down (for the purposes of this scenario, the simulation can't be re-played).
Except they are invariant with regard to real time, since computers cannot precisely gauge the exact passing of time, only the relative passing of time. If you ran two game consoles such as the SEGA Megadrive in parallel, with the same state and same relative input, started at the same time, you would find that after some extended time period one is a few frames ahead of the other.

In your example what you would do is specify the tick frequency with respect to game time as 300 Hz. Each tick you then push out a rendered frame to be stored. Even if this is made to operate in real time you might find that it is operating within some error of real time (such as 299.9 Hz or 300.1 Hz). This sort of error even affects displays.

In the end, time is a highly relative quantity. Physicists have proof that time progresses at different rates throughout the universe. As such for determinism you need to define your own time progression.

You keep talking about StarCraft 2 missile movers, but all I read is "SC2 made an optimization decision by defining their clock period as Z" (where Z can be anything, since such an optimization is necessary in any context). This is the same thing I said a number of days ago. Either you don't understand what I'm talking about; you're discussing tangential, undisputed information; or you're oblivious.
Well stop saying you do not understand it then?

do you see how that's illogical? I provided a link, giving at least one *general case, well accepted* context where dealing damage without credit or reduction is useful.
Except the fact is that it is almost always not desirable. You do not want your DoT effects bypassing reductions, mana shield, triggers, etc. At one stage, abilities that failed to properly credit kills were outright rejected from the spells section. Sure, sometimes it makes for interesting mechanics, but so does lots of not very useful stuff.

need I remind you that Warcraft 3 uses last-hitting mechanics?
Kill denying was actually an oversight. In most RTS games it makes little difference who lands the final blow, and many do not even allow easy friendly fire. However, since units give some form of reward when killed (exp, resources, etc.) in WC3, you can deny your enemies that reward by having a friendly unit get the last hit. This was once considered a bug in DotA Allstars; however, it eventually made it in as a feature (even if it is a rather stupid one).

Right, but that's how the native works. There's no point in talking about why the performance difference is there when the discussion is about whether it's there or not.
The discussion is about timer periods and not the performance of natives. You may want to recap what has been said as I do agree this thread has run on far longer than it needs to.

1) It still won't perform better (JASS has no pipelining)
What does pipelining have to do with this topic or even JASS for that matter?

You're arguing that we shouldn't increase the clock rate for a DPS system, since *if* the game is multiplayer, there will be some latency associated with state synchronization.
No I am not? I was arguing that it is a waste of time to do so since players only care about what they can see.

That statement was to explain to you that games cannot couple to real time, since the interpretation of time will vary from system to system. Instead they advance time in a way that is vaguely synchronized to real time.

1) It's stupid to say we shouldn't increase the clock rate, if the *baseline* for the clock isn't established.
Except the baseline is already established. Most WC3 installations run at a 60 Hz refresh rate. Doing something like damage or movement faster than that frame rate will not appear any smoother and in fact can potentially introduce aliasing artefacts which will produce weird movement patterns.

For example, take a timer with an update rate of 0.01 (100 Hz). Since the game outputs at 60 Hz, what will happen is that most frames the result of 2 ticks will be shown, except some frames (1 in 3) will instead show the result of only 1 tick passing. If this were a movement system you would subject the player to a unit moving at a non-constant velocity, since sometimes it will appear to move less for one frame than it mostly does. Whether people will notice this in real time is another question; however, it will show up in screen captures.
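
Spelling out the frame arithmetic:

$$\frac{100\ \text{Hz ticks}}{60\ \text{Hz frames}} = \frac{5}{3}\ \text{ticks per frame} \;\Rightarrow\; \text{repeating pattern } (2,2,1),\ \text{so 1 frame in 3 shows a single tick}$$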
 

Cokemonkey11

Code Reviewer
Level 29
Joined
May 9, 2006
Messages
3,522
That example makes no sense and is totally irrelevant...

It is more like asking "What colour is my apple?" with the response "I saw an apple before and it was red so your apple is likely to be red."

In this case the apple was green so the guess was incorrect. However the apple could have turned out to be red in which case the guess would be right. This is called making an educated guess for knowledge you do not have based on knowledge you do have. Ultimately it is still a guess so could be right or wrong. I made a guess and provided all the facts I based it on however it turned out wrong.

Regardless, your statement was only not incorrect because it was irrelevant. You and I both recalled based on prior knowledge that 'sometimes dealing damage less than 1 will result in a minimum of 1 damage'. Then edo494 demonstrated we were incorrect. Case closed. Everything after that was arrogant minutiae.

What I do not understand is why you are even making a fuss out of this? I even said at the time that it was quite a wild guess (not the sort one would want to put money on).

Because I'm the dipshit who keeps coming back here and reading everything you have to say, no matter how inconsequential (!) If you're incorrect, nebulous, or otherwise ambiguous, someone should point it out - only now I regret that person being me.

Except they are invariant with regard to real time, since computers cannot precisely gauge the exact passing of time, only the relative passing of time. If you ran two game consoles such as the SEGA Megadrive in parallel, with the same state and same relative input, started at the same time, you would find that after some extended time period one is a few frames ahead of the other.

In your example what you would do is specify the tick frequency with respect to game time as 300 Hz. Each tick you then push out a rendered frame to be stored. Even if this is made to operate in real time you might find that it is operating within some error of real time (such as 299.9 Hz or 300.1 Hz). This sort of error even affects displays.

In the end, time is a highly relative quantity. Physicists have proof that time progresses at different rates throughout the universe. As such for determinism you need to define your own time progression.

What you say is true, but I fail to see the relevance. I think if you go back and read what I said once more, you'll understand that my counterexample was related to simulations being run *with respect to* real time, not *exactly in correspondence* with real time (which is obviously impossible, no need for justification).

Well stop saying you do not understand it then?

If you go back and read what I said (see a pattern here?) you'll see that I only don't understand how it is relevant to the conversation (not that I don't understand how it is factual)

Except the fact is that it is almost always not desirable. You do not want your DoT effects bypassing reductions, mana shield, triggers, etc.

"Almost" isn't good enough to assert "always".

At one stage, abilities that failed to properly credit kills were outright rejected from the spells section. Sure, sometimes it makes for interesting mechanics, but so does lots of not very useful stuff.

Such abilities are still outright rejected. The fact that you point this out tells me that either you (a) didn't actually check the reference provided, or (b) still want to mention inconsequential data regardless.

The discussion is about timer periods and not the performance of natives. You may want to recap what has been said as I do agree this thread has run on far longer than it needs to.

A side-discussion emerged, with the topic of performance of Get/SetWidgetLife, vs Get/SetUnitState.

What does pipelining have to do with this topic or even JASS for that matter?

In the aforementioned side-discussion (regarding *performance*) you pointed out that Get/SetUnitState could be desirable in the sense that life and mana could use shared code. This would *only* be relevant to performance in the case that branch prediction is available - which it isn't, trivially, because JASS isn't pipelined.

No I am not? I was arguing that it is a waste of time to do so since players only care about what they can see.

That statement was to explain to you that games cannot couple to real time, since the interpretation of time will vary from system to system. Instead they advance time in a way that is vaguely synchronized to real time.

Your statement is uninteresting for any case where only one system is involved in simulation.

Except the baseline is already established. Most WC3 installations run at a 60 Hz refresh rate. Doing something like damage or movement faster than that frame rate will not appear any smoother and in fact can potentially introduce aliasing artefacts which will produce weird movement patterns.

For example, take a timer with an update rate of 0.01 (100 Hz). Since the game outputs at 60 Hz, what will happen is that most frames the result of 2 ticks will be shown, except some frames (1 in 3) will instead show the result of only 1 tick passing. If this were a movement system you would subject the player to a unit moving at a non-constant velocity, since sometimes it will appear to move less for one frame than it mostly does. Whether people will notice this in real time is another question; however, it will show up in screen captures.

In this context, baseline refers to the initial clock frequency (say n - maybe even 1 Hz!), which is why I question your statement that the clock frequency should never be increased (why? you don't even know what the clock frequency is initially).
 
Level 23
Joined
Apr 16, 2012
Messages
4,041
So I wonder: you say time is relative, and that you should make the game go at something like 16 ticks/second, but then you are still using the computer's relative measurement of time to define one second, since you need to base the tick rate on something, because otherwise a machine with a 2 GHz processor would tick less frequently than a machine with a 3 GHz processor.
 

Dr Super Good

Spell Reviewer
Level 63
Joined
Jan 18, 2005
Messages
27,192
In the aforementioned side-discussion (regarding *performance*) you pointed out that Get/SetUnitState could be desirable in the sense that life and mana could use shared code. This would *only* be relevant to performance in the case that branch prediction is available - which it isn't, trivially, because JASS isn't pipelined.
Actually I was talking from a maintenance point of view. One system could do both life and mana, meaning the same code will power both. This has nothing to do with performance but more with development, maintenance, and map size.

I have no clue what branch prediction has to do with any of this. Since it takes a parameter to determine if life or mana is sampled/modified then one stores said parameter in an array as part of the data structure used. That is then recalled to feed the function when required. The same system now works for both life and mana, although slightly slower (but probably faster than with flow control and constants).

Branch prediction is a hardware feature and as such has nothing to do with code. The reason it exists is to lessen the impact that flow control has on execution performance in pipelined architectures. High-performance architectures often have large pipelines, and when flow control is encountered without some form of prediction the pipeline stalls, resulting in a waste of cycles the length of the pipeline. Prediction makes a guess as to which direction the code will execute, allowing the pipeline to continue being filled while the flow control is resolved. Since it makes a guess there is a chance it is wrong, in which case the entire pipeline is discarded and performance reverts to non-prediction speed. However, if it guessed right, the flow control statement is resolved like any other instruction in the pipeline and so comparatively has almost no cost.

Flow control statements in instruction sets usually correspond to any instruction which can change the program counter in a non-sequential way. Some experts argue that single branch prediction is not sufficient and that executing multiple possible branch outcomes would be faster (since then there is no "wrong" prediction); however, the hardware to do so is considerably more complicated, so the costs might outweigh the benefits.

In this context, baseline refers to the initial clock frequency (say n - maybe even 1 Hz!), which is why I question your statement that the clock frequency should never be increased (why? you don't even know what the clock frequency is initially).
What on earth are you talking about? Certainly not WC3 by the sounds of it.

In WC3, since people only (or at least mostly) play at 60 Hz, there is as good as no benefit to running a timer inside WC3 faster than 60 Hz. Running one slower will result in noticeable inconsistencies for animation. Running one faster will result in aliasing inconsistencies for animation. Since running something that never produces any user-visible results is pointless (unless it does something to the game state, there is no reason to run it), running something above 60 Hz in WC3 is pointless. State-wise you may argue that you need to run it faster; however, you still subject the user to aliasing of the results of execution, as the results are only sampled at ~60 Hz.

Games like SC2 get around this sort of problem by separating deterministic state from visual state. Visual state is advanced at the refresh rate of the display and interpolates from deterministic state. All state changes are run in the deterministic ticks, which are constant and synchronized. The game then handles synchronization of deterministic results with visual state. Since the 16 Hz deterministic ticks are under half of the average 60 Hz display refresh rate, aliasing is not a problem.

So I wonder: you say time is relative, and that you should make the game go at something like 16 ticks/second, but then you are still using the computer's relative measurement of time to define one second, since you need to base the tick rate on something, because otherwise a machine with a 2 GHz processor would tick less frequently than a machine with a 3 GHz processor.
Except it is not 16 ticks/second but rather 16 ticks/"game second". A "game second" is an artificial unit of time which determines the speed at which the game simulation advances. If a unit is moving 1 tile per second at constant speed in the game with a tick rate of 16 ticks/game second then it will move 1/16 of a tile per tick. This is because each tick is 1/16 of a game second later.

What then happens is a best effort attempt at synchronizing game seconds to real seconds. In the case of WC3 and SC2 the session server does this by coordinating with clients instructing them how far they can advance and sending out synchronized orders for that period. The clients then use this information to determine a translation from their interpretation of real time to game ticks such that the game appears to play steadily at a fixed rate (avoids getting stuck waiting for the server to advance). In SC2 if the clients are unable to cope with the determined mapping of game time to server real time then it will slow the mapping down and show a "X computer is slowing down the game" message. WC3 will start to show the "waiting for player" dialog in that case. If the host server is unable to cope (usually network reasons) then the client will run out of synchronized game time, freeze and then show the "waiting for host" dialog. If the client falls very far behind the host server (local temporary resource shortage) then it will speed up the mapping of game time per real time temporarily to catch up.

Because game time has a mapping to real time it enables you to change the real time rate at which the game executes without altering the game results. For example when watching replays in WC3 and SC2 you can "fast forward" which greatly increases the mapping rate such that 1 real second will correspond to possibly dozens of game seconds. You could also slow it down such that 1 game second corresponds to multiple real seconds for a "slow motion" effect.

Since it is a best-effort attempt at synchronization of game time to real time, it is impossible for 1 game second to be exactly 1 real second. Hence my SEGA Genesis example, whereby one console will eventually be ahead of the other. The reality is that it does not matter, as no human will notice if a game is advancing at 15.9 ticks per real second or 16.1 ticks per real second, and even the game time will appear to match up with real time if the error is low enough. As long as time progresses at a steady rate, and at approximately the expected rate, no one will notice any time synchronization errors.
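
In WC3 map scripts the same idea shows up with timers, which run on game time rather than real time; a minimal sketch at the 16 ticks/game-second rate from the example (names are placeholders):

JASS:
globals
    timer gameClock = null
    integer tickCount = 0
endglobals

function Tick takes nothing returns nothing
    // fires once per 1/16 of a *game* second; replay fast-forward or game
    // speed changes alter how fast this runs in real time, but not the
    // sequence of simulation results
    set tickCount = tickCount + 1
endfunction

function StartClock takes nothing returns nothing
    set gameClock = CreateTimer()
    call TimerStart(gameClock, 1. / 16., true, function Tick)
endfunction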
 

Cokemonkey11

Code Reviewer
Level 29
Joined
May 9, 2006
Messages
3,522
Actually I was talking from a maintenance point of view. One system could do both life and mana, meaning the same code will power both. This has nothing to do with performance but more with development, maintenance, and map size.

No, actually, what you said was:

That performance difference would be due to the reduction of 1 passed argument. 20% is also quite trivial except in major performance-critical applications, and since the attribute is an argument you could possibly recycle the same system code to change mana as well as life (not possible with SetWidgetLife).

Since the topic was about performance, anything else about software architecture and management is uninteresting.

Re branch prediction: The only way reusing code could improve performance in the case of SetWidgetLife vs SetUnitState would be if you formally asserted application order in an environment with branch prediction. I don't really want to explain how "for each constant n, map SetUnitState(n) to each unit m in units" can provide this kind of formal assertion. Hopefully you can think about this yourself.

What on earth are you talking about? Certainly not WC3 by the sounds of it.

In WC3, since people only (or at least mostly) play at 60 Hz, there is as good as no benefit to running a timer inside WC3 faster than 60 Hz. Running one slower will result in noticeable inconsistencies for animation. Running one faster will result in aliasing inconsistencies for animation. Since running something that never produces any user-visible results is pointless (unless it does something to the game state, there is no reason to run it), running something above 60 Hz in WC3 is pointless. State-wise you may argue that you need to run it faster; however, you still subject the user to aliasing of the results of execution, as the results are only sampled at ~60 Hz.

Games like SC2 get around this sort of problem by separating deterministic state from visual state. Visual state is advanced at the refresh rate of the display and interpolates from deterministic state. All state changes are run in the deterministic ticks, which are constant and synchronized. The game then handles synchronization of deterministic results with visual state. Since the 16 Hz deterministic ticks are under half of the average 60 Hz display refresh rate, aliasing is not a problem.

You're literally not listening to me and instead you're just repeating yourself. Let's try this with an ordered list:

  1. I say that a user might want a timer with some frequency n (perhaps 1Hz)
  2. I say that without knowing n, it's impossible to say whether that user could want a timer with frequency m > n, since n is not known.
  3. You say I'm wrong, because you can think of a frequency n (perhaps 60Hz), such that no m > n is justifiable.
  4. I say you're not thinking critically and just repeating anecdotal details to the benefit of no one.
 

Dr Super Good

Spell Reviewer
Level 63
Joined
Jan 18, 2005
Messages
27,192
Re branch prediction: The only way reusing code could improve performance in the case of SetWidgetLife vs SetUnitState would be if you formally asserted application order in an environment with branch prediction. I don't really want to explain how "for each constant n, map SetUnitState(n) to each unit m in units" can provide this kind of formal assertion. Hopefully you can think about this yourself.
Actually it would most likely benefit from link-time optimization, as then the extra argument could automatically be inlined for being a compile-time constant. This would make it as fast as any other solution, as there would be no conditional tests. Talking about optimizations such as this is stupid, as JASS is not a machine language. Why did you even raise it...
I say that without knowing n, it's impossible to say whether that user could want a timer with frequency m > n, since n is not known.
What on earth does this have to do with anything? It is not even what you were saying, since you were promoting the use of timers where period m < n, where n is the refresh period of WC3. Now suddenly you are saying the opposite. Please make your mind up before you confuse people.

Fact is that running a timer for animation lower than the refresh rate of WC3 (globally agreed as 60 Hz) will result in discontinuous motion (different frames show the same result).
Fact is that running a timer for animation higher than the refresh rate of WC3 (globally agreed as 60 Hz) will result in aliased motion (distance per frame varies with some frequency).

You want to minimize both for the highest quality animation.

As a result the minimum sensible period should be 0.0166 (slight aliasing) or 0.0167 (slight discontinuity). This is a saving of ~40% in execution count over the topic creator's period of 0.01 while also greatly reducing the exposure to aliasing. This sort of optimization is far larger than switching from SetUnitState with the life argument to SetWidgetLife.
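
A sketch of that recommendation (the callback name is a placeholder; 0.0166 slightly undershoots and 0.0167 slightly overshoots 1/60 s, as described):

JASS:
function OnTick takes nothing returns nothing
    // per-frame animation/movement update goes here
endfunction

function Init takes nothing returns nothing
    // ~1/60 s: one tick per frame at the assumed 60 Hz refresh rate,
    // versus 100 ticks per second at the topic creator's 0.01 period
    call TimerStart(CreateTimer(), 0.0166, true, function OnTick)
endfunction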
 

Cokemonkey11

Code Reviewer
Level 29
Joined
May 9, 2006
Messages
3,522
Actually it would most likely benefit from link-time optimization, as then the extra argument could automatically be inlined for being a compile-time constant. This would make it as fast as any other solution, as there would be no conditional tests. Talking about optimizations such as this is stupid, as JASS is not a machine language. Why did you even raise it...

You're mistaken if you think that the extra argument is the source of a 20% performance hit at the machine-code layer. The difference in performance is likely attributable to the number of characters required to parse at the interpreter layer.

Test this yourself: create function a takes integer q, integer w, integer e, integer r returns nothing, and function longFunctionName takes integer q returns nothing, then initialize and call them:

JASS:
local integer z = 1
local integer x = 1
local integer c = 1
local integer v = 1
local integer longParameterIsLong = 42
call a(z,x,c,v)
call longFunctionName(longParameterIsLong)

JASS isn't compiled, as you have correctly pointed out. However, do understand that if the JASS interpreter had a JIT, pipelining could provide the performance benefit without additional complexity, whereas compile-time optimizations can only provide a performance benefit if they were actually implemented; I hope you can understand how one is more relevant than the other.

What on earth does this have to do with anything? It is not even what you were saying, since you were promoting the use of timers where period m < n, where n is the refresh period of WC3. Now suddenly you are saying the opposite. Please make your mind up before you confuse people.

Actually, I'm still asserting that for JASS, timers with very high frequencies that need to be virtualized by running multiple cycles per tick are still a reasonable way of providing a causality relationship in a game. My argument about low frequencies like 1Hz came from the fact that you refuted the following:

Yes, that is correct, but my point is that *whether or not you know* what the peak state synchronization frequency is for WC3, selecting a clock frequency will still be an optimization process (that is not to say that it's impossible to waste CPU cycles in WC3).

Furthermore, I doubt that Warcraft 3 maintains state in singleplayer at anything below 100 Hz. I'm sure you can contrive an example to test this, but as long as you're just speculating ("based on how StarCraft 2 works"), I'm not interested.
 

Dr Super Good

Spell Reviewer
Level 63
Joined
Jan 18, 2005
Messages
27,192
JASS isn't compiled, as you have correctly pointed out. However, do understand that if the JASS interpreter had a JIT, pipelining could provide the performance benefit without additional complexity, whereas compile-time optimizations can only provide a performance benefit if they were actually implemented; I hope you can understand how one is more relevant than the other.
JASS is compiled in the sense that it goes to virtual machine code. It is not compiled in the sense that it does not statically link. All names are still resolved at run time (unlike statically compiled code), even if the lines of JASS code are all compiled at map load (it does not parse each line every time it is executed).

This is similar to how Python worked (or used to work), where it would still perform run-time name resolution of every named element, however it would compile the lines of Python into Python virtual machine code.

If you want to be technical it is still an interpreter in that it does not run native machine code; however, it does run virtual machine code and is not an interpreter in the classic sense of parsing each line as it executes. The performance problem is mostly down to the lack of static linking of named elements, meaning that its performance sits somewhere in between a full interpreter, which parses each line as it runs, and a true virtual machine language executed by a language emulator or translator.

Since each name resolution is a hashtable operation involving hashing the name string (no caching), that is where the main performance impact comes from. The performance difference is as simple as it having to resolve the unit property constant's name, as that is quite a long name. The passing of it as a parameter is probably quite fast, but the increase in name resolution is the largest impact. There is also additional logic needed to determine which property to modify; however, that will be dwarfed by the name resolution for the property constant in the first place.

Galaxy, used by SC2, does static linking. This is why it is considerably faster than JASS, even if it is slower in many respects as it performs considerably more error checking.
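
If the no-caching name-resolution theory holds, identifier length alone should be measurable; a hypothetical test sketch (globals, names, and counts are invented):

JASS:
globals
    real a = 0.
    real quiteLongVariableNameForTesting = 0.
endglobals

// if every access hashes the identifier string with no caching, the
// second loop should measure slower than the first purely because of
// the longer name
function BenchShortName takes nothing returns nothing
    local integer i = 0
    loop
        exitwhen i >= 10000
        set a = a + 1.
        set i = i + 1
    endloop
endfunction

function BenchLongName takes nothing returns nothing
    local integer i = 0
    loop
        exitwhen i >= 10000
        set quiteLongVariableNameForTesting = quiteLongVariableNameForTesting + 1.
        set i = i + 1
    endloop
endfunction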
 

Cokemonkey11

Code Reviewer
Level 29
Joined
May 9, 2006
Messages
3,522
If you want to be technical it is still an interpreter in that it does not run native machine code; however, it does run virtual machine code and is not an interpreter in the classic sense of parsing each line as it executes. The performance problem is mostly down to the lack of static linking of named elements, meaning that its performance sits somewhere in between a full interpreter, which parses each line as it runs, and a true virtual machine language executed by a language emulator or translator.

Since each name resolution is a hashtable operation involving hashing the name string (no caching), that is where the main performance impact comes from. The performance difference is as simple as it having to resolve the unit property constant's name, as that is quite a long name. The passing of it as a parameter is probably quite fast, but the increase in name resolution is the largest impact. There is also additional logic needed to determine which property to modify; however, that will be dwarfed by the name resolution for the property constant in the first place.

So you're saying longer function names don't impact performance, but longer variable names do? That's interesting (source?), but it doesn't really refute my point.

Or perhaps I don't see what you're getting at.
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
*grabs popcorn*

I wonder who will be the first one that says "Let's not argue about a limited programming language and start making a game that will completely replace Warcraft 3. One that will even be better than Warcraft 4, if that is ever released."

I can see that Cokemonkey11 and Dr Super Good would be able to do that on their own :D ... with the help of some modellers/artists.
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
I do have my own view of what would be amazing and what would be required...
However I think that there are many others that also have those ideas.

On-topic: ... Yea you might want to go back to the first post to find out what the topic is about.
I actually rewrote the system from timers to a 0.03-second loop :D
Not because of efficiency but for the sake of better collaboration between the system and WC3 :D
It turns out that this topic was not really necessary in the first place.

So this topic can be closed.

(just kiddin)
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
JASS:
set udg_EOT_Timer = CreateTimer()
call TimerStart(udg_EOT_Timer, 0.03, true, function Interval)

I was told this is better than call TriggerRegisterTimerEvent(trigger, 0.03, true).

But was that your question?

and yea, 1 periodic timer is a lot better than 1 timer per instance, because if you get 1k+ instances, WC3 will start to gasp for air :D

except that those 1k+ instances are now setting data, calling events, etc. every 0.03 seconds instead of every 1, 2, 3, 4, 5, 6, 7, 8, 9, or whatever you set the interval to.

1k timers or 1k * some variable number of gets/sets... I do not see much difference.
It was more because now I can handle standard WC3 dispels.
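
For reference, a rough sketch of the shared-periodic-timer pattern under discussion (globals and the per-instance work are placeholders):

JASS:
globals
    timer EOT_Clock = null
    integer EOT_Count = 0
    unit array EOT_Target
endglobals

function EOT_Tick takes nothing returns nothing
    local integer i = 0
    loop
        exitwhen i >= EOT_Count
        // per-instance work here (apply the periodic effect, check expiry)
        set i = i + 1
    endloop
endfunction

function EOT_Init takes nothing returns nothing
    // one 0.03 s periodic timer serves every instance,
    // instead of one timer per instance
    set EOT_Clock = CreateTimer()
    call TimerStart(EOT_Clock, 0.03, true, function EOT_Tick)
endfunction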
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
Answering your first post, I would say it is negligible.
Also it seems dumb to use a variable for a timer when you don't need it, having the event "every X seconds of gametime".

"Every (real) seconds of gametime" ís a timer.
But
1. I don't have a trigger if I use a timer instead.
2. I can pause/resume the timer which is better than enabling/disabling a trigger... as the "Every (real) seconds of gametime" timer still runs.
 

Wrda

Spell Reviewer
Level 25
Joined
Nov 18, 2012
Messages
1,870
"Every (real) seconds of gametime" ís a timer.
I know that...

But
1. I don't have a trigger if I use a timer instead.
2. I can pause/resume the timer which is better than enabling/disabling a trigger... as the "Every (real) seconds of gametime" timer still runs.
1. You will have a trigger if you use a timer, unless it is JASS/vJASS. Still, you need a function for its "actions".
2. True, but you can just have the event at 0.01 and turn it off; it won't fire more than once, and it's only a maximum of 0.01 delay.
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
1. I need a function indeed... a trigger is not needed (yes, I use JASS). Trigger actions are like 80% slower than calling a function directly... which is what a timer does.

2. I don't use 0.01 as the title says, but 0.03, as it is better and has no visual difference.
When a trigger is disabled, the timer still runs. The timer says "Run my trigger" and then the trigger says "No, I am disabled!". 0.03 seconds later, the same story... again... again... and again, until I enable it again.
Pausing a timer does not do this. It just stops the timer from calling its function. So that is way better... I don't know how much better :D
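
As a sketch of the difference (gg_trg_Interval and udg_EOT_Timer are placeholder names for the trigger and timer above):

JASS:
// disabled trigger: the 0.03 s periodic event still fires and is
// discarded at the trigger, so the scheduling work still happens
call DisableTrigger(gg_trg_Interval)

// paused timer: the callback is simply not scheduled at all
call PauseTimer(udg_EOT_Timer)
// ... later ...
call ResumeTimer(udg_EOT_Timer) // resumes with the remaining timeout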
 