
Efficiency battle (every 0.01 of gametime vs timers)

Status
Not open for further replies.
Level 24
Joined
Aug 1, 2013
Messages
4,657
Hi all.

I have a small question concerning a system that I want to show others.
The system is a Damage over Time/Heal over Time system.
One of the features is that it has to have a customizable interval...
Let's say that one DoT spell has to deal damage every second, while another one has to do it every 0.01 seconds.

The question:
Is it more efficient if I use "Every 0.01 second of gametime" instead of timers that have the duration of the intervals?

(Keep in mind that there will be a timer for every active DoT effect.
Or, with the periodic approach, there are 100 checks each second even if there is only one DoT active with a 10-second interval.)

Any help (answers that tell me that it is negligible too) is appreciated.
 
Level 21
Joined
Mar 27, 2012
Messages
3,232
Typically timers are faster, but note that having several timers instead of one is slower.
The trigger timer event is also a timer, so there is little difference in that aspect.

So for the purpose of scaling you'd use a single timer with a 0.0312500 interval (don't make it lower; there is no visible difference), but for rare solo effects you can use a separate timer.
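
To illustrate, a minimal vJASS sketch of that single-timer approach (not anyone's actual system; all names here are hypothetical, and whether such small per-tick damage values behave well is debated later in this thread): one shared 1/32-second timer drives every active instance stored in parallel arrays.

JASS:
globals
    constant real TICK = 0.0312500 // one shared 1/32-second period
    timer DotTimer = CreateTimer()
    integer DotCount = 0
    unit array DotSource
    unit array DotTarget
    real array DotDps
    real array DotTime
endglobals

function DotTick takes nothing returns nothing
    local integer i = 0
    loop
        exitwhen i >= DotCount
        // deal this tick's share of the per-second damage
        call UnitDamageTarget(DotSource[i], DotTarget[i], DotDps[i] * TICK, true, false, ATTACK_TYPE_NORMAL, DAMAGE_TYPE_UNIVERSAL, WEAPON_TYPE_WHOKNOWS)
        set DotTime[i] = DotTime[i] - TICK
        if DotTime[i] <= 0. then
            // swap-remove the expired instance
            set DotCount = DotCount - 1
            set DotSource[i] = DotSource[DotCount]
            set DotTarget[i] = DotTarget[DotCount]
            set DotDps[i] = DotDps[DotCount]
            set DotTime[i] = DotTime[DotCount]
        else
            set i = i + 1
        endif
    endloop
    if DotCount == 0 then
        call PauseTimer(DotTimer)
    endif
endfunction

function AddDot takes unit source, unit target, real dps, real duration returns nothing
    set DotSource[DotCount] = source
    set DotTarget[DotCount] = target
    set DotDps[DotCount] = dps
    set DotTime[DotCount] = duration
    set DotCount = DotCount + 1
    if DotCount == 1 then
        call TimerStart(DotTimer, TICK, true, function DotTick)
    endif
endfunction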
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
I know that 0.03 (maybe 0.04) makes no visible difference to the human eye, and probably not to WC3 either, but if you make the interval 0.02, how are you gonna handle that?

I made a little concept in which I used timers.
But I just downloaded Paladon's DOT system (because it just caught my eye) and he uses 0.01 of game time.

Every DoT is going to use the same trigger (as there is actually no big deal to use both)
So I don't really understand if you have given me an answer...

The downside of each (and I want whichever is least bad) is:
Timers: a lot of timers running, because every DoT has its own timer.
0.01 checks: a lot of checks that do nothing except read and write one variable (duration) for every active DoT.

EDIT: I will go for 0.01 game time actions.
1. I will be able to adjust the effects easily.
2. I will be able to adjust the intervals.
 
Last edited:
Level 21
Joined
Mar 27, 2012
Messages
3,232
You can also do duration with a 1/32 timer. (This is the 0.0312500 that I mentioned).
Specifically, I usually have a constant variable called TIMEOUT and an array for ticks. The number of ticks in a duration is always seconds*TIMEOUT, so when assigning a duration I just have to do duration*TIMEOUT inside the code and I'll never have to change it.
This is as accurate as can be (1/32 accuracy, logically) and doesn't really take much power.
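
In code that might look like this rough sketch (one reading of the above, with TIMEOUT as a ticks-per-second constant; names hypothetical):

JASS:
globals
    constant real TIMEOUT = 32. // ticks per second of the 1/32 timer
    integer array Ticks
endglobals

// a 5.0-second duration becomes 160 ticks; the periodic callback
// then just decrements Ticks[index] and ends the effect at 0
function SetDotDuration takes integer index, real seconds returns nothing
    set Ticks[index] = R2I(seconds * TIMEOUT)
endfunction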
 
Level 22
Joined
Sep 24, 2005
Messages
4,821
Running a function every one-hundredth of a second is more taxing than using multiple timers; might as well use multiple timers if you ask me.
 
Last edited:
Level 24
Joined
Aug 1, 2013
Messages
4,657
You can also do duration with a 1/32 timer. (This is the 0.0312500 that I mentioned).
Specifically, I usually have a constant variable called TIMEOUT and an array for ticks. The number of ticks in a duration is always seconds*TIMEOUT, so when assigning a duration I just have to do duration*TIMEOUT inside the code and I'll never have to change it.
This is as accurate as can be (1/32 accuracy, logically) and doesn't really take much power.

I place my bets that you are a programmer (for real languages)... am I right?
You could also use a 0.04 (1/25), because the human eye cannot perceive beyond that anyway, and it never has any trouble with rounding.
I actually have no idea why everyone uses 0.03 though.
That is the same as saying there is no difference between 30 and 60 fps.

Running a function every one-hundredth of a second is more taxing than using multiple timers; might as well use multiple timers if you ask me.

This is the answer I wanted... as long as it is true :)

I will just set timers and think of some other way to adjust the data.
Will be fun.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
Humans can distinguish at least up to 60 frames per second. Any claim that 24, 25, 29.97, 30, or 35 frames per second is the maximum distinguishable rate is incorrect. The only true scientific analysis recognized by Wikipedia puts the peak of human perception around 75 fps.

That being said, even if there *was* a perceivable difference between 75 and 100 FPS, you still need to choose a tick rate that sacrifices correctness for performance. Ideally, all "damage over time" calculations would occur in a continuous spectrum.

Is it more efficient if I use "Every 0.01 second of gametime" instead of timers that have the duration of the intervals?

The difference is negligible

http://en.wikipedia.org/wiki/Frame_rate#Background
 
Level 21
Joined
Mar 27, 2012
Messages
3,232
The reason people use 0.0312500 is that it gives you exactly 32 frames per second, which seems ideal in Warcraft. Higher timeouts can look less smooth with projectile systems (not applicable in this case, I suppose).
Lower timeouts, on the other hand, simply use more power for no noticeable gain.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
The reason people use 0.0312500 is that it gives you exactly 32 frames per second, which seems ideal in Warcraft.

Do you realize how ridiculous that sounds? If you want 32 frames per second, then use 1./32. Don't use magic constants. And who says 32 fps is ideal in Warcraft? How would they justify that, if they even did say it?

Higher timeouts can look less smooth with projectile systems (not applicable in this case, I suppose).
Lower timeouts, on the other hand, simply use more power for no noticeable gain.

Warcraft 3 has a framerate cap at 60 FPS. Anything programmed at less than 60 FPS can be perceived by the player.
 
Level 21
Joined
Mar 27, 2012
Messages
3,232
Do you realize how ridiculous that sounds? If you want 32 frames per second, then use 1./32. Don't use magic constants. And who says 32 fps is ideal in Warcraft? How would they justify that, if they even did say it?



Warcraft 3 has a framerate cap at 60 FPS. Anything programmed at less than 60 FPS can be perceived by the player.

I have personally tested different timeouts for projectile systems and it's at about 0.03 that the movement becomes smooth. Having it at 32 makes it easy to calculate durations and stuff like that, because I will simply know that 32 ticks is one second.
Also, it would not be any more a magic number than 1./32.
The concept of magic numbers applies to unnamed numbers within code, which is not the case here. I usually have a separate constant variable called TIMEOUT for this purpose.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
I have personally tested different timeouts for projectile systems and it's at about 0.03 that the movement becomes smooth. Having it at 32 makes it easy to calculate durations and stuff like that, because I will simply know that 32 ticks is one second.
Also, it would not be any more a magic number than 1./32.
The concept of magic numbers applies to unnamed numbers within code, which is not the case here. I usually have a separate constant variable called TIMEOUT for this purpose.

So, you're saying that you personally tested various clock frequencies and found that numbers around 30 look smooth? [1][2][3]

Also, if you choose 32 fps to make calculating other values easier, do you not see the contradiction of purpose in creating a TIMEOUT variable?

TIMEOUT = 0.0312500 would still be a magic number without documentation, because TIMEOUT doesn't describe the number at all. CLOCK_PERIOD = 1/30 is the unambiguous case that you want here.
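
In JASS that naming might look like this (a sketch of the convention being argued for, not anyone's posted code; note the real literals, since 1/32 in integer arithmetic truncates to 0):

JASS:
globals
    // the name describes the meaning, so the value is no longer "magic"
    constant real CLOCK_PERIOD = 1. / 32.
endglobals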
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
Any period less than 0.0166... seconds is pointless, since most people will never see the difference. That is the minimum period you should use, which will mean that each of the 60 frames per second of average players gets a different display value. Most people recommend far larger values like 0.02 or 0.03, but these will not update values every frame and so appear slightly discontinuous.

Blizzard generally agrees that DoTs should not run every frame. Look at StarCraft II and Heroes of the Storm, both of them have DoTs which generally run only a few times a second since there really is so little difference as long as the DPS remains the same.

Diablo III is an exception but it has a special "Damage over Time" mechanic which actually runs only 16 or so frames per second but interpolates every frame to show your health smoothly go down. WC3 and SC2 lack this which is probably why they go for longer damage tick periods.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
Xonok, my problem is the rampant spreading of misinformation.

Dr Super Good, what you say is pretty accurate; however, timer frequencies greater than WC3's maximum framerate do serve a purpose in simulation. One could find a trivial example of a race condition where a 100 FPS timer satisfies some condition and a 60 FPS one does not. Discretizing pseudo-continuous environments will always be an optimization process and have some margin of error.
 
I have personally tested different timeouts for projectile systems and it's at about 0.03 that the movement becomes smooth. Having it at 32 makes it easy to calculate durations and stuff like that, because I will simply know that 32 ticks is one second.
Also, it would not be any more a magic number than 1./32.
The concept of magic numbers applies to unnamed numbers within code, which is not the case here. I usually have a separate constant variable called TIMEOUT for this purpose.

I agree with codemonkey that this is bull. If you have a constant with the timer interval, ANY interval, you can multiply any value by this constant to know how much of it to apply every loop. Dividing a value by the interval lets you know its "per second" value. This is neither harder nor easier for any interval, because MATH.
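
A sketch of that arithmetic (the constant's name and value are arbitrary here):

JASS:
globals
    constant real INTERVAL = 0.10 // any timeout works; the maths is identical
endglobals

// per-second value to per-tick value
function PerTick takes real perSecond returns real
    return perSecond * INTERVAL
endfunction

// per-tick value back to per-second value
function PerSecond takes real perTick returns real
    return perTick / INTERVAL
endfunction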

@OP, I agree with the others that applying DoT as often as 100 times per second is a waste. I'd rather pick three times a second. Five, tops!
 
Humans can distinguish at least up to 60 frames per second. Any claim that 24, 25, 29.97, 30, or 35 frames per second is the maximum distinguishable rate is incorrect. The only true scientific analysis recognized by Wikipedia puts the peak of human perception around 75 fps.
This is simply not true. What humans can perceive differs depending on the type of stimulus. That's why animations look fluid at 30 FPS, but old tube screens running at 50-60 FPS still "flicker".

The 60-75 FPS figures only apply to drastic changes in contrast or lighting. You will not notice a difference between a unit moving at 25 FPS and a unit moving at 60 FPS. But you will notice a difference when scrolling your screen at 25 FPS and 60 FPS. This is because it's a different kind of stimulus for the human brain.

Again, any movement of a small-scale object above 32 FPS without a change of colors, contrast or lighting is just pointless. 60 FPS only matters if the content of the entire screen changes (again, scrolling!).

Anything beyond 60 FPS has no effect at all, as the maximum FPS of WC3 is capped at that.


And we aren't even talking about moving units here; we are talking DoT damage. So the FPS needed for a "smooth" transition of the health bar is actually determined by the pixel width of your health bar. For ordinary units with a collision size of around 16-32, you won't need more than 10 FPS to make the health bar look smooth, simply because the low resolution of your health bars will not allow you to tell the difference.
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
And we aren't even talking about moving units here; we are talking DoT damage. So the FPS needed for a "smooth" transition of the health bar is actually determined by the pixel width of your health bar. For ordinary units with a collision size of around 16-32, you won't need more than 10 FPS to make the health bar look smooth, simply because the low resolution of your health bars will not allow you to tell the difference.

In fact, I never opened this thread for visual performance.
I wanted to have a clear view of the difference between the system performance of both.

The problem between 0.03 and 0.01 is that the remaining duration can end up below 0, and a duration below zero would have a different meaning than a duration that is exactly zero or one that has just run out.

Besides that, the minimum FPS needed to make it look smooth is also determined by the damage that is dealt. The difference is really small, but it is still less smooth.
 
The problem between 0.03 and 0.01 is that the remaining duration can end up below 0, and a duration below zero would have a different meaning than a duration that is exactly zero or one that has just run out.
I don't have the slightest clue what you are talking about. It doesn't matter what timeout you choose; you can always make the DoT deal the correct damage. Using 0.01 seconds as the timeout is just plain laziness to avoid doing the maths.
 
Level 21
Joined
Mar 27, 2012
Messages
3,232
I don't understand how you can get negative timeouts/ticks. At worst I've seen a 0-tick-long knockback that never ended (easy to fix with a check when applying the knockback).

Well, for a system like this it is quite likely best to just use one timer that runs, at most, 32 times per second. I'd opt for something like 10 though.
For the record, the function is exactly the same no matter what the constant is.
 
Level 26
Joined
Aug 18, 2009
Messages
4,097
TriggerRegisterTimerEvent is pretty much the same speed-wise as a timer; I guess they even use the same structure internally. A timer is just more flexible through the JASS API: TriggerRegisterTimerEvent compulsorily runs a trigger, while timers can run a plain function or a trigger. I would never use the static variant.

In any case, one should never use multiple high-frequency timers for the same task. Put the instances in a collection and iterate over it when a single global timer expires. It just should not matter that a fraction of that tiny interval has already elapsed, so that your new instance's first tick comes after 0.001234 seconds instead of 0.01 seconds - who cares.

And anyway, in general, avoid stuffing gameplay features in short-interval timers. That is bound to be non-performant, not because of the timer framework but because of the actions inside. Imagine you have another trigger detecting the damage and doing checks and more actions, that boosts the burden.
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
I don't have the slightest clue what you are talking about. It doesn't matter what timeout you choose; you can always make the DoT deal the correct damage. Using 0.01 seconds as the timeout is just plain laziness to avoid doing the maths.

That is not true.

If you lower a variable by 0.03 every 0.03 seconds, you can end up with negative values instead of reaching exactly 0.
Given that negative values are not allowed because they have a different meaning, this will be a glitch.
 
That is not true.

If you lower a variable by 0.03 every 0.03 seconds, you can end up with negative values instead of reaching exactly 0.
Given that negative values are not allowed because they have a different meaning, this will be a glitch.
Then just ... idk ... make the exit condition duration <= 0?
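
As a sketch (names hypothetical), that exit condition makes the float overshoot harmless for any timeout:

JASS:
globals
    constant real INTERVAL = 0.03
    real array Duration
endglobals

// returns true once the instance at this index has run out
function DotExpired takes integer i returns boolean
    set Duration[i] = Duration[i] - INTERVAL
    // <= catches values that step past zero, which == 0. would miss
    return Duration[i] <= 0.
endfunction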
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
@OP, I agree with the others that applying DoT as often as 100 times per second is a waste. I'd rather pick three times a second. Five, tops!

It's impossible to generalize that easily. It's easy to contrive an example where precision and correctness are the only important factors, and race conditions are relevant.

This is simply not true. What humans can perceive differs depending on the type of stimulus. That's why animations look fluid at 30 FPS, but old tube screens running at 50-60 FPS still "flicker".

The 60-75 FPS figures only apply to drastic changes in contrast or lighting. You will not notice a difference between a unit moving at 25 FPS and a unit moving at 60 FPS. But you will notice a difference when scrolling your screen at 25 FPS and 60 FPS. This is because it's a different kind of stimulus for the human brain.

Again, any movement of a small-scale object above 32 FPS without a change of colors, contrast or lighting is just pointless. 60 FPS only matters if the content of the entire screen changes (again, scrolling!).

Are you claiming that a unit that takes a sizable fraction of the field of view and moves very fast would not be distinguishable at 25 and 60 FPS? That's just silly. You have been programmed as a game designer to make engineering choices, but as a scientist you must consider "why", not only how.

Anything beyond 60 FPS has no effect at all, as the maximum FPS of WC3 is capped at that.

That is incorrect. WC3 draws to screen at 60FPS, but 100 or even 1000 FPS timers will still maintain correctness.

And we aren't even talking about moving units here; we are talking DoT damage. So the FPS needed for a "smooth" transition of the health bar is actually determined by the pixel width of your health bar. For ordinary units with a collision size of around 16-32, you won't need more than 10 FPS to make the health bar look smooth, simply because the low resolution of your health bars will not allow you to tell the difference.

That would only be correct if the health bar sizes were drawn with aliased pixels (they're probably not). sub-pixel smoothing has been around for a very long time.

Besides, health bars aren't the only relevant thing here. You're arguing about appearance when game state is the actual desirable.

Not to mention that health is presented in integer form to the player, so you will hardly notice the fractions.

Actually the fractions won't work at all. You need to assert in such a system that damage ticks are never less than 1.0f

The problem between 0.03 and 0.01 is that the remaining duration can end up below 0, and a duration below zero would have a different meaning than a duration that is exactly zero or one that has just run out.

What?

It doesn't matter what timeout you choose; you can always make the DoT deal the correct damage. Using 0.01 seconds as the timeout is just plain laziness to avoid doing the maths.

That is incorrect. If you deal 10 damage over 10 seconds with a 1000FPS tick rate, you will not deal 10 damage in total.

If you lower a variable by 0.03 every 0.03 seconds, you can end up with negative values instead of reaching exactly 0.
Given that negative values are not allowed because they have a different meaning, this will be a glitch.

(Almost) Never compare floating point numbers with ==
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
Dr Super Good, what you say is pretty accurate; however, timer frequencies greater than WC3's maximum framerate do serve a purpose in simulation. One could find a trivial example of a race condition where a 100 FPS timer satisfies some condition and a 60 FPS one does not. Discretizing pseudo-continuous environments will always be an optimization process and have some margin of error.
For performance reasons you cannot afford to run periodic timers at that rate. Even real games never simulate that fast and they are written in code that executes thousands of times faster than JASS.

This is simply not true. What humans can perceive differs depending on the type of stimulus. That's why animations look fluid at 30 FPS, but old tube screens running at 50-60 FPS still "flicker".
Actually you can easily see the difference between a game running at 30 FPS and one at 60 FPS. The reason cinematography looks fluid at 30 FPS is motion blur, which is the result of integration (of sorts) of movement over a period of time, giving the appearance of continuous (natural) movement.

Human eyes actually do not have a refresh rate as they are continuous analogue systems. They do however have multiple filter functions which limit the resolvable motion.

For example, what you actually see is the result of the individual photon pickup in the cones/rods being put through a filter function with neighbouring cones/rods to produce a persistent brightness/colour which your brain then interprets. This is why CRTs appear permanently bright when viewed even though only a small part of a frame is visible at any given time (as seen on a camera).

This filtering is also why an LCD screen appears to have 32-bit colour. Your eyes are actually picking up the individual colour channels, yet the filtering performed merges the colours together to produce what you see as 32-bit colour instead of blocks of 3 different 8-bit colour channels.

That is incorrect. WC3 draws to screen at 60FPS, but 100 or even 1000 FPS timers will still maintain correctness.
Except it is pointless to perform any periodic actions sub-frame since the user will never see the results. You might as well get the processor to resolve digits of pi during that time as far as the user is concerned. As such, his claim that it has "no effect" is correct, since as far as the user is concerned there is no extra effect and it might as well be running once every frame, as the sub-frames are unresolvable.

The only time you ever need to perform a sub-frame action is to "sequence break". In the case of a trigger responding to an issued order event or a damage event you need a 0 second timeout timer to allow the result to execute so you can perform your corrections. Without it the issued order will still be issued even if you "stop" order or the damage will still be dealt even if you "heal" it (which is not possible at maximum life).

That is incorrect. If you deal 10 damage over 10 seconds with a 1000FPS tick rate, you will not deal 10 damage in total.
This is not helpful. You need to emphasise the "minimum damage" mechanic of WC3 which limits the lowest un-reduced receivable damage to 1 (before ability reductions and armor type reductions). As such if you deal 10 damage over 10 seconds at 1,000 ticks/sec you will actually deal 10,000 damage even if you correct the damage to 0.001 per tick since the minimum damage the unit will take is 1.

Even SC2 has this where the minimum damage is 0.5 health.

(Almost) Never compare floating point numbers with ==
In most cases you should always use either "less than or equal to" or "greater than or equal to" even for integers since it provides protection against a range instead of just a specific value.

For a damage over time system I would not recommend more than 16 ticks per second, both for performance and to avoid inaccuracies with the WC3 damage system. I agree that once every second is probably too slow (it can give a doomed unit almost an entire second's worth of extra life), however running it every frame or so is probably too fast.
 
Actually the fractions won't work at all. You need to assert in such a system that damage ticks are never less than 1.0f

I was under the impression that we were talking about WC3. In WC3, health is stored as a real but presented as an integer. Hence, it is perfectly legit to decrease health by a real value if, say, you want to deal 0.5 damage per second. Try it yourself.
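
For illustration, a minimal sketch of that fractional adjustment (the 1/32-second tick period is an arbitrary choice): SetWidgetLife accepts fractional reals even though the UI displays whole numbers.

JASS:
// one tick of a 0.5-damage-per-second drain on a 1/32-second timer
function FractionalTick takes unit u returns nothing
    call SetWidgetLife(u, GetWidgetLife(u) - 0.5 * 0.0312500)
endfunction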
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
For performance reasons you cannot afford to run periodic timers at that rate. Even real games never simulate that fast and they are written in code that executes thousands of times faster than JASS.

That's just your opinion. You can afford to run timers at 1000 FPS in many instances. It just depends on the map.

Your comment about game state is correct, though. League of Legends, for example, has a server tick rate around 60 FPS, if I recall correctly.

I was under the impression that we were talking about WC3. In WC3, health is stored as a real but presented as an integer. Hence, it is perfectly legit to decrease health by a real value if, say, you want to deal 0.5 damage per second. Try it yourself.

There are limitations to this. Try my suggestion:

That is incorrect. If you deal 10 damage over 10 seconds with a 1000FPS tick rate, you will not deal 10 damage in total.

If I'm incorrect please provide a demo map.
 
You should be. Warcraft reals have a precision of at least five significant digits; I'd even wager they have seven, if they are standard 32-bit floats. In any case, your example is bad, since 1000 FPS will hardly sail well in Warcraft regardless of what operations you are doing. I've been applying damage this way for ages and never run into any inaccuracies, at least not at the framerates I am using.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
From tests I ran in the past, giving a unit a million armor with huge reduction such that it read "100%" damage reduction, it still took combat damage of 1 per hit, even if the attack was only 1 damage to start with. As such, dealing damage with triggers might hit this minimum damage cap. If you are dealing damage over time with a high tick rate then chances are you will run into this minimum damage cap and could end up dealing more damage per second than intended as a result.

However that is if the minimum damage cap applies to trigger sourced damage, I could be wrong as I only tested attack damage.

That's just your opinion. You can afford to run timers at 1000 FPS in many instances. It just depends on the map.
Except it is pointless to do so since people will only see changes at 60 FPS. Again it makes no difference if something runs at a frequency of 60Hz or 1,000Hz since they still see only 60Hz. Some display hacks might allow more however most people play at 60Hz.
Your comment about game state is correct, though. League of Legends, for example, has a server tick rate around 60 FPS, if I recall correctly.
Sonic Adventure 1 and 2 also ran at 60Hz tick rate for internal update. In fact most console games until recently updated at the same frequency as the display output. Sonic 1 for example ran faster in the US than in the EU due to differences in display standard (NTSC vs PAL). I fail to see the relevance of what you are saying.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
From tests I ran in the past, giving a unit a million armor with huge reduction such that it read "100%" damage reduction, it still took combat damage of 1 per hit, even if the attack was only 1 damage to start with. As such, dealing damage with triggers might hit this minimum damage cap. If you are dealing damage over time with a high tick rate then chances are you will run into this minimum damage cap and could end up dealing more damage per second than intended as a result.

However that is if the minimum damage cap applies to trigger sourced damage, I could be wrong as I only tested attack damage.

I think similar problems occur with both triggered damage and with get/set unit state and/or widgetlife.

Except it is pointless to do so since people will only see changes at 60 FPS. Again it makes no difference if something runs at a frequency of 60Hz or 1,000Hz since they still see only 60Hz. Some display hacks might allow more however most people play at 60Hz.

Except you obviously didn't read anything I wrote because visual distinction is not the same thing as state correctness.

Sonic Adventure 1 and 2 also ran at 60Hz tick rate for internal update. In fact most console games until recently updated at the same frequency as the display output. Sonic 1 for example ran faster in the US than in the EU due to differences in display standard (NTSC vs PAL). I fail to see the relevance of what you are saying.

The difference in relevance was the fact that League of Legends is a competitive game with professional players, where correctness and game state are very important. I was pointing out that even for such developers, state correctness at higher than 60 FPS is not strictly necessary. That being said, this doesn't mean that (a) there doesn't exist a game where state correctness is valued more than in League of Legends, nor (b) that state correctness in simulation is limited to games at all.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
I think similar problems occur with both triggered damage and with get/set unit state and/or widgetlife.
Get/set widget life is not damage, however. It will not be reduced like damage, will not fire damage events like damage, and does not give kill credit like damage. It will even bypass mana shields, unlike real damage. This is why you should deal real damage on some periodic interval like 0.2 seconds.
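
A sketch of one such tick dealt as real damage (the attack/damage/weapon type arguments are illustrative choices): unlike SetWidgetLife, UnitDamageTarget is reduced by armor, fires damage events, and credits the killer.

JASS:
// one 0.2-second tick of a DoT, dealt as genuine damage
function DotDamageTick takes unit source, unit target, real amount returns nothing
    call UnitDamageTarget(source, target, amount, true, false, ATTACK_TYPE_NORMAL, DAMAGE_TYPE_UNIVERSAL, WEAPON_TYPE_WHOKNOWS)
endfunction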

Except you obviously didn't read anything I wrote because visual distinction is not the same thing as state correctness.
Once again there is no need to run stuff people cannot see. Simply run the actions multiple times in a row. This is how SC2 missile movers work.

The difference in relevance was the fact that League of Legends is a competitive game with professional players, where correctness and game state are very important. I was pointing out that even for such developers, state correctness at higher than 60 FPS is not strictly necessary. That being said, this doesn't mean that (a) there doesn't exist a game where state correctness is valued more than in League of Legends, nor (b) that state correctness in simulation is limited to games at all.
I thought it was a game started by amateurs that happened to make it big as an AoS. At first there was Demigod, then there was that AoS we have all forgotten (Heroes of Newerth), then there was League of Legends, and now Dota 2 and Heroes of the Storm.

Heroes of the Storm for example only updates at 16 frames per second (as it is based on the SC2 engine but can run a bit faster as real seconds are often slower than game seconds) and no one notices. That might come down to the fact that few people actually can do over 16 meaningful actions per second.

It all depends on how you engineer the game. Games like WC3 were engineered so that graphics and game state were mixed together to some degree, so it probably updates at 60 frames per second as a result (and that is why the frame rate cannot really go beyond 60 frames per second). SC2, on the other hand, uses parallel graphics which interpolate game state, so units will still appear to move smoothly even if the game engine is updating only 16 times per second.

Generally you do not need to advance game state faster than 16 odd Hz because as long as it looks smooth (interpolated graphics) people will seldom notice the difference. Additionally this is more future proof as in theory it will support any refresh rate since the interpolation mathematics can be adjusted.

High speed fighting games or first person shooters might need to run faster but that is because you are placed closer to the action so might notice such delays (eg perfect block or shield broken?). Additionally they have less going on so there is less performance reasons to run at lower update rates (good luck moving 1,000 actors at 60 updates per second).
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
@DSG After reading that 3 times I fail to see your point. If I was interested in incoherent ramblings I would just go to twitter
My point was explaining that what you were saying was not backed up with evidence and that it might not be the case as a result.

For example if you dealt 0.01 damage 1,000 times per second you might end up dealing 1,000 damage instead of the 10 damage you would have thought. My evidence for this is that when a unit attacks another unit the minimum damage dealt after armor reduction but before type and other reductions is 1. As such if you deal 1 damage 1,000 times in a second you will have dealt 1,000 damage even if you intended for less damage to be dealt. My evidence for this occurring is weak as I am using the mechanics of attack damage whereas trigger damage might not have such a limit, however it is more than just spouting random stuff about SetWidgetLife and that faster timers are needed to be state perfect (when game state is/should not be determined with the passing of real time).

This map will hopefully help you clear your minds up.
Was that aimed at me or Cokemonkey11? Please post the answer so people reading the thread in the future can get it immediately.
 
Level 23
Joined
Apr 16, 2012
Messages
4,041
It was mainly trying to prove that you can indeed deal less than 1 damage even before mitigation (it does chaos damage, universal too, and the pally has 0 life regen and 0 armor).

And also, kind of the obvious thing: the game can handle 200+ timer callbacks per second from a single timer.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
And also, kind of the obvious thing: the game can handle 200+ timer callbacks per second from a single timer.
You can run several thousand. Obviously there is the physical limit whereby the execution time of the timer callback exceeds the period causing a backlog during which time the game will become pretty much unresponsive.

I am not sure how WC3 timers are implemented to do this (it sure is not interrupt based!).
 
You could also use a 0.04 (1/25), because the human eye cannot perceive beyond that anyway, and it never has any trouble with rounding.
I actually have no idea why everyone uses 0.03 though.
That is the same as saying there is no difference between 30 and 60 fps.

Did you know that Maker can recognize frame rates with just his eyes?
He considers 0.04 laggy, and 0.03 smooth.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
For example if you dealt 0.01 damage 1,000 times per second you might end up dealing 1,000 damage instead of the 10 damage you would have thought.

  • First of all, what you say is incorrect, at least with some patch/data version of Warcraft 3, as shown by edo494's post. If you took the time to open it yourself instead of spouting purely anecdotal "evidence" as fact, you wouldn't have made this mistake.
  • This problem can equally occur with timers of small period (e.g. 0.01 damage 10 times per second might do 10 damage) - the general solution is to use logic to calculate when to deal 1 damage and when not to, rather than heuristically guessing what kind of timer period would make the game "close enough". See the sketch after this list.
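
One hedged sketch of that logic (all names here are hypothetical): bank the fractional damage per instance and only call the damage native once a whole point has accrued, so any minimum-damage clamping cannot inflate the total.

JASS:
globals
    real array DamagePool // fractional damage owed, per instance
endglobals

function AccrueDamage takes integer i, unit source, unit target, real amount returns nothing
    local real whole
    set DamagePool[i] = DamagePool[i] + amount
    if DamagePool[i] >= 1. then
        // deal only the whole points; keep the remainder banked
        set whole = I2R(R2I(DamagePool[i]))
        call UnitDamageTarget(source, target, whole, true, false, ATTACK_TYPE_NORMAL, DAMAGE_TYPE_UNIVERSAL, WEAPON_TYPE_WHOKNOWS)
        set DamagePool[i] = DamagePool[i] - whole
    endif
endfunction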

My evidence for this is that when a unit attacks another unit the minimum damage dealt after armor reduction but before type and other reductions is 1. As such if you deal 1 damage 1,000 times in a second you will have dealt 1,000 damage even if you intended for less damage to be dealt. My evidence for this occurring is weak as I am using the mechanics of attack damage whereas trigger damage might not have such a limit, however it is more than just spouting random stuff about SetWidgetLife...

Please, tell me how to make a "healing over time" trigger without SetWidgetLife? Your other options are invariably worse than SetWidgetLife.

...and that faster timers are needed to be state perfect (when game state is/should not be determined with the passing of real time).

Game state can only be determined by passing of real time on Harvard/Von Neumann architectures. The only real constant is time. If you have some better way of maintaining game state, please, let me know.

Attached is a trivial example, in diagrammatic form, to help get this through your thick head.
 

Attachments

  • idiot.pdf
    37.3 KB · Views: 130

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
First of all, what you say is incorrect, at least with some patch/data version of Warcraft 3, as shown by edo494's post. If you took the time to open it yourself instead of spouting purely anecdotal "evidence" as fact, you wouldn't have made this mistake.
I tested extensively with units attacking each other. My evidence is concrete for the attack behaviour not allowing armor to reduce base damage below 1. I never tested with fractional damage I do admit (units do not really like that or reflect it in the UI) and I made it clear that I am entirely basing my statement on this fact. I also said it could apply differently so you really have no room to make such comments. The fact you needed someone else to provide the evidence says you were as clueless about the mechanic as I was.

This problem can equally occur with timers of small period (e.g. 0.01 damage 10 times per second might do 10 damage) - the general solution is to use logic to calculate when to deal 1 damage and when not to, rather than heuristically guessing what kind of timer period would make the game "close enough".
The correct approach is to not deal damage sub frame (no need for timers with frequency faster than 60 Hz for this) and to do what you described.

Did you know that Maker can recognize frame rates with just his eyes?
He considers 0.04 laggy, and 0.03 smooth.
That is because 0.03 is ~1 update per 2 frames or better, and 0.04 is ~1 update per 2 frames or worse. It is a big difference because your orc Grunt will move smoothly at 60 FPS whereas both will be changing at half that rate right next to it. You will notice changes right up to a period of 0.0166 on a computer of reasonable resources, as only then will each frame receive an update and so no further smoothness can be resolved (unless you work around the 60 FPS hard cap that WC3 has, which I think some managed to with varying results).

Please, tell me how to make a "healing over time" trigger without SetWidgetLife? Your other options are invariably worse than SetWidgetLife.
The system is a Damage over Time/Heal over Time system.
SetWidgetLife will work for healing over time only. As I was specifically explaining the damage over time part, SetWidgetLife will not work for damage over time because it does not do any damage (not reduced, no credit). I do agree that for healing over time SetWidgetLife is the correct one to use since healing is not counted as negative damage. You could also use the appropriate unit life native as well with varying performance results (probably lower).
Game state can only be determined by passing of real time on Harvard/Von Neumann architectures. The only real constant is time. If you have some better way of maintaining game state, please, let me know.
Game states have and always will be advanced by "ticks". This is because time is relative on each system and in order for a game to be deterministic it has to define some standard unit of passing of time such that when executed across any supported system it will produce the same results.

In the old days each tick was coupled to the refresh rate of the output display. This is why Sonic 1 plays slower in Europe than in America, as it updated at 60 Hz in the US (since TVs were NTSC so supported 30 and 60 Hz refresh) while in Europe it only updated at 50 Hz (since TVs were PAL so supported 25 and 50 Hz refresh). It was even to such an extent that the music played back slower in Europe than in America, since the consoles were scaled down frame by frame: whereas the American console advanced 60 frames per second, the European one advanced only 50, meaning that the American system was 1/6 ahead of the European system after 1 second.

Since this was total nonsense, people moved towards decoupling refresh rate from game state advancement. However, since multi-core processors are a new invention, they often did this by forcing the main thread to wait when ahead of the display, so frames were produced at the right rate, while skipping drawing frames if state fell behind.

Now it is a lot more complicated, with graphics operating as a separate subsystem from the deterministic state as seen in games like StarCraft II. However the fact remains that deterministic state (the actual game state, not your view of the game state, which is not deterministic and does not matter) still advances in ticks.

For example StarCraft II (which is actually a pretty good text book example of a pseudo modern game) advances at 16 deterministic ticks per second. If you lower the game speed it simply calls fewer deterministic ticks (such as 12 or 8) per second so the game progresses slower. If you raise game speed it calls more deterministic ticks (such as 20 or 24) per second so the game progresses faster. Even when running slower it still looks smooth as your non-deterministic view interpolates to generate unique graphics for each display refresh (if resources allow, something that can be hard if there are 150 Thor units in view).

As a result of this, fractional timing units in StarCraft II are rounded to the nearest frame to execute. This means that it is impossible to run a timer faster than 16 times per game second. So that missile movers look smooth StarCraft II runs them at 32 times per second with each deterministic frame updating them twice at once and feeding the resulting positional data to the graphic sub system so that they appear more smooth (movement has more frequency information).

Back to physical implementations. Since processors have a non-deterministic interpretation of time (clock rate is unstable, inexact, varies from model to model etc) you cannot run anything outside of game ticks as there is no way to resolve timing deterministically (so not suitable for multiplayer, competitive play, no replay support etc). As you can probably guess, WC3 must not be using this implementation for sub-frame accuracy; it must be using some other method of determining timer execution (possibly loop-decrementing the countdown after every frame advanced).

If you define every game tick as 1/16 of a second you know deterministically that if you are at a certain frame the time is a certain amount. If you want something to run sub-frame the idea is you loop multiple times within a single deterministic tick and then interpolate the results over the output frames (aka SC2 missile movers). Since those are also ticks that means that everything in the simulation of your game ultimately comes down to how fast you advance ticks.

A final way to do time is based on time events. If everything can be interpolated between any quantity of time then you just need to "advance time" until the next time event. The problem with this is that for a useful view of the world at a given time you will need to update the entire state to match the time of the event (or at least be able to supply the results as if it was). I do admit WC3 could use this approach with its timers and it would in theory support any time interval allowed however I would also imagine the performance could be a concern for using this approach. Additionally for determinism the game still needs to update 60 times per second for each frame to be unique and deterministic, since rounding error means that the exact same sequence of time events are required to produce the exact same state at the end.

Frankly I do not know how WC3 works inside. The 60 FPS hard limit for refresh tends to lead me to believe an internal update rate of 60 ticks/second however the support for sub-tick ordered timers with reasonable accuracy leads me to believe a more event orientated approach. I have not tested enough to tell how it works however the fact remains that in theory the time continuum of a game is not continuous or even relative to the real time continuum.

Attached is a trivial example, in diagrammatic form, to help get this through your thick head.
I do not think this is a constructive attitude.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
I tested extensively with units attacking each other. My evidence is concrete for the attack behaviour not allowing armor to reduce base damage below 1. I never tested with fractional damage I do admit (units do not really like that or reflect it in the UI) and I made it clear that I am entirely basing my statement on this fact. I also said it could apply differently so you really have no room to make such comments. The fact you needed someone else to provide the evidence says you were as clueless about the mechanic as I was.

I hope you realize that your evidence is uninteresting given that you were incorrect.

The correct approach is to not deal damage sub frame (no need for timers with frequency faster than 60 Hz for this) and to do what you described.

Please see idiot.pdf - in practice it isn't necessary to do something indistinguishable to the player, but in simulation it is sometimes necessary for correctness. Personally, I use 30Hz for almost all real-time timer periods in my maps, but this has nothing to do with the actual discussion.

SetWidgetLife will work for healing over time only. As I was specifically explaining the damage over time part, SetWidgetLife will not work for damage over time because it does not do any damage (not reduced, no credit).

It works just fine for damage, as well. Sometimes not being reduced and not giving credit are desirable. See: http://www.hiveworkshop.com/forums/...pe-structureddd-extension-228883/#post2277787

I do agree that for healing over time SetWidgetLife is the correct one to use since healing is not counted as negative damage. You could also use the appropriate unit life native as well with varying performance results (probably lower).

Probably? http://www.hiveworkshop.com/forums/2456107-post1.html

If you define every game tick as 1/16 of a second you know deterministically that if you are at a certain frame the time is a certain amount. If you want something to run sub-frame the idea is you loop multiple times within a single deterministic tick and then interpolate the results over the output frames (aka SC2 missile movers). Since those are also ticks that means that everything in the simulation of your game ultimately comes down to how fast you advance ticks.

Yes, but if you don't want to violate causality you need to use sequential logic for this - which is inherent from the fact that you're operating on a modified Harvard architecture, or at least virtualizing one.

If you want to contrive a way of maintaining sequential logic on timers at 16Hz without violating causality, be my guest, but the general solution is to optimize timer period based on your specification.
 
Level 24
Joined
Aug 1, 2013
Messages
4,657
I actually thought that this conversation was over -_-

However, basic attacks do not deal damage between 0 and 1. They can do 0 and they can do 1+ but never in between.

Also set unit/widget life will not work fine for damage.
It does not simulate damage as it cannot give the damage source.
When using complete Alternative Damage Systems, you can use it as long as nothing relies on the basic "GetKillingUnit()" method as it will never have a killing unit.

There is one thing in the link you posted that bothers me...
Native comparisons
  • TriggerEvaluate
    • ~25% faster than TriggerExecute, which leads us to believe that triggercondition is more efficient than triggeraction in terms of execution time.
I really want to see the implementation of both "TriggerEvaluate()" and "TriggerExecute()" now.
Because in fact, "TriggerExecute()" simply runs something, while "TriggerEvaluate()" also checks the return type, return value and keeps track of it.
It also restricts "TriggerSleepAction()" and returns rather than waiting for the end of the sleep.
So... "TriggerEvaluate()" has to do more than "TriggerExecute()". So I wonder why it is faster.
 

Cokemonkey11

Spell Reviewer
Level 29
Joined
May 9, 2006
Messages
3,534
However, basic attacks do not deal damage between 0 and 1. They can do 0 and they can do 1+ but never in between.

This has been established.

Also set unit/widget life will not work fine for damage.
It does not simulate damage as it cannot give the damage source.
When using complete Alternative Damage Systems, you can use it as long as nothing relies on the basic "GetKillingUnit()" method as it will never have a killing unit.

It's only necessary if the damage kills the unit, otherwise it is semantically equivalent in this context.

I really want to see the implementation of both "TriggerEvaluate()" and "TriggerExecute()" now.
Because in fact, "TriggerExecute()" simply runs something, while "TriggerEvaluate()" also checks the return type, return value and keeps track of it.
It also restricts "TriggerSleepAction()" and returns rather than waiting for the end of the sleep.
So... "TriggerEvaluate()" has to do more than "TriggerExecute()". So I wonder why it is faster.

TriggerExecute() runs actions, which open a new virtual thread. TriggerEvaluate() runs conditions, which do not.

Similarly, actions can have TriggerSleepAction and conditions cannot.
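
A small sketch of the two call paths as described above (trigger and function names are hypothetical):

JASS:
globals
    trigger EvalTrig = CreateTrigger()
    trigger ExecTrig = CreateTrigger()
endglobals

function OnEval takes nothing returns boolean
    // condition body: no TriggerSleepAction allowed here
    return false
endfunction

function OnExec takes nothing returns nothing
    // action body: runs as a new virtual thread and may sleep
endfunction

function Demo takes nothing returns nothing
    call TriggerAddCondition(EvalTrig, Condition(function OnEval))
    call TriggerAddAction(ExecTrig, function OnExec)
    call TriggerEvaluate(EvalTrig) // runs the conditions only
    call TriggerExecute(ExecTrig) // runs the actions
endfunction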
 
Level 23
Joined
Apr 16, 2012
Messages
4,041
TriggerExecute() runs actions, which open a new virtual thread. TriggerEvaluate() runs conditions, which do not.

As per the map attached, not true: if you run only one of these functions, or just uncomment the ExecuteFunc(test), it will show 42857, whereas if you keep the code as shown it will say 85714, which is exactly 2 times higher. That means each conditionfunc runs in its own thread, or at least with its own oplimit; but other than TriggerSleepAction, I do not know of a way to reset the oplimit without spawning a new thread.
 

Attachments

  • sandbox.w3x
    16.6 KB · Views: 58

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,198
I hope you realize that your evidence is uninteresting given that you were incorrect.
However it is correct for attacks. As such I am not incorrect, especially with the disclaimer I provided.

Please see idiot.pdf - in practice it isn't necessary to do something indistinguishable to the player, but in simulation it is sometimes necessary for correctness. Personally, I use 30Hz for almost all real-time timer periods in my maps, but this has nothing to do with the actual discussion.
No it is not for the reasons I already mentioned. Discrete simulations can be completely decoupled from time as they advance their virtual time through "ticks" rather than continuously as is the case in real time. What you said is very true for reality as time is continuous however in a discrete simulation where time is relative it makes no difference. If you run something twice every tick or every half tick you will still get the same sequence of events generated (runs twice followed by tick in an infinite sequence).

Discretization will logically affect the frequency components the simulation can output. This is why SC2 missile movers compute two movement ticks every deterministic tick so that they can output paths with 16 game Hz movement instead of only 8 game Hz. This is due to the Nyquist theorem with discrete systems.

It works just fine for damage, as well. Sometimes not being reduced and not giving credit are desirable.
Most of the time not so. Non-credited kills are usually bad for gameplay as they fail to reward the killer. Any form of such a mechanic often leads to DotA Allstar style "kill deny" mechanics.

At most it could be used as an interpolation method so as not to trigger damage response events at a high frequency. However the damage response mechanism would need to be able to recall the total damage dealt over the period from the DoT effect otherwise it will incorrectly report less damage than the unit sustained. This is very important for many triggered systems and spells that involve damage response.

That performance difference would be due to the reduction of 1 passed argument. 20% is also quite trivial except in major performance critical applications and since the attribute is an argument you could possibly recycle the same system code to change mana as well as life (not possible with SetWidgetLife).

Yes, but if you don't want to violate causality you need to use sequential logic for this - which is inherent from the fact that you're operating on a modified Harvard architecture, or at least virtualizing one.

If you want to contrive a way of maintaining sequential logic on timers at 16Hz without violating causality, be my guest, but the general solution is to optimize timer period based on your specification.
Except computers have no idea of the exact quantity of real time that passes. They do have a concept of the passing of real time however they cannot quantify it in a comparable way. To do that they need to synchronize with each other and agree on a standard to represent time. This introduces synchronization error and clock drift into the system.

Since such errors would violate determinism, a game like WC3 would use a simulation of time which is loosely synchronized to external inputs (lag delay in multiplayer as a result of synchronization to assure correct sequencing).

Since the game is not a continuous simulation, timers will only be run at certain stages during state advancement. To assure correct sequencing some form of time event queue could be used allowing timer elements to run in what would appear as the correct relative sequence.

However only what a user sees is of any concern in a video game. As such advancing multiple ticks per frame produces hidden state changes that waste CPU time with no improvement in end result to the user. From the ground up all game targeting systems should be designed with this in mind to avoid wasting CPU time unnecessarily. Instead of depending on correct sub-frame timer sequencing for a system to work, you should instead model the interaction between timers and produce a system which produces the same (or a very close approximation) results but only operating every frame so consume considerably fewer resources.

It's only necessary if the damage kills the unit, otherwise it is semantically equivalent in this context.
No it is not as a life change will not run damage responses. This is perfect for healing (since you do not want damage responses to run from being healed) but totally useless for damage (as you do want them to run damage responses).

Sure there are a few special cases (disease cloud from Abomination) however those are special cases. Disease cloud will not kill stuff as a result so that the "who killed it" argument is solved.

As per the map attached, not true: if you run only one of these functions, or just uncomment the ExecuteFunc(test), it will show 42857, whereas if you keep the code as shown it will say 85714, which is exactly 2 times higher. That means each conditionfunc runs in its own thread, or at least with its own oplimit; but other than TriggerSleepAction, I do not know of a way to reset the oplimit without spawning a new thread.
The oplimit is probably set every time a JASS thread is scheduled/rescheduled.
 