Anima_
Most of you probably already know that I'm working on a new (and hopefully improved) framework for our upcoming RPGs.

So I'll try to give a bit of insight into the inner workings of the system while it's still in development.



We'll start with the calculations at the core of the combat system:



Hit and damage calculations

The first step is to get a DeltaHit from the characters' attributes and a random value.

This time the random value is distributed along a bell curve and at the moment goes from -58 to 59. (If you wonder why such odd numbers: we pretty much throw three 40-sided dice and subtract the average from the sum.)
DeltaHit = Attacker.accuracy + random - target.evasion

The DeltaHit is then used to look up the effectiveness in a table. In contrast to Loren it's a discrete function this time. There are five possible outcomes.

Critical Hit: DeltaHit >= 45 : 1.5 x effectiveness

Direct Hit: DeltaHit >= 15 : 1.2 x effectiveness

Clean Hit: DeltaHit >= -10 : 1.0 x effectiveness

Hit: DeltaHit >= -25 : 0.5 x effectiveness

Miss: DeltaHit < -25 : 0.0 x effectiveness

Again the table is subject to change during the development.



Finally, knowing this, we can calculate the damage outcome.
Damage = Attacker.damage * Effectiveness - target.reduction


This is actually done for pretty much any action, not only attacks. At least any action where having an effectiveness multiplier is useful.
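Put together, the whole calculation is small. Here is a minimal Python sketch of the mechanics as described so far; the function and attribute names are mine, and clamping damage at zero on a miss is an assumption the post doesn't spell out:

```python
import random

def roll_3d40():
    # Three 40-sided dice give a bell-shaped sum in 3..120;
    # subtracting the average (about 61) yields roughly -58..+59.
    return sum(random.randint(1, 40) for _ in range(3)) - 61

# Thresholds and effectiveness multipliers from the table above
# (explicitly subject to change during development).
HIT_TABLE = [
    (45, 1.5),   # Critical Hit
    (15, 1.2),   # Direct Hit
    (-10, 1.0),  # Clean Hit
    (-25, 0.5),  # Hit
]

def effectiveness(delta_hit):
    for threshold, multiplier in HIT_TABLE:
        if delta_hit >= threshold:
            return multiplier
    return 0.0   # Miss

def resolve_attack(accuracy, damage, evasion, reduction):
    delta_hit = accuracy + roll_3d40() - evasion
    eff = effectiveness(delta_hit)
    # Clamping at zero is an assumption; the post only gives the formula.
    return max(0.0, damage * eff - reduction)
```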



That's it for that topic. If you are interested in something specific, please ask. Otherwise I'll probably update this thread from time to time.
Lonestar51
Interesting Read, but...


This time the random value is distributed along a bell curve and at the moment goes from -58 to 59. (If you wonder why such an odd number we pretty much throw 3 twenty sided dice and subtract the average from the sum.)

Sure about that? Either the outcome would be -28 to 29, or you would need 3D40.


Clean Hit: DeltaHit >= -10 : 1.0 x effectiveness

Hit: DeltaHit >= -25 : 0.5 x effectiveness

So this means, even if the attacker is much weaker, there is still a good chance to get a hit? Or taken the other way round, it is very hard to really protect your heroes.



What will be the range for the values Attacker.accuracy and target.evasion?


Damage = Attacker.damage * Effectiveness - target.reduction

Hmm ... OK, damage reduction also plays a role. As the values are now, it seems as if a player would better invest in damage reduction instead of evasion.



Anyway, thanks for the insights. I am very interested how the system mutates during playtesting.
Anima_

Sure about that? Either the outcome would be -28 to 29, or you would need 3D40.

You're right, it's 3d40. Probably a mistake from my Pen & Paper days, where we had no 40-sided dice.



So this means, even if the attacker is much weaker, there is still a good chance to get a hit? Or taken the other way round, it is very hard to really protect your heroes.

Even on hit you will greatly reduce the damage taken since it's applied before reduction. In the playtests so far, evasion was the most powerful attribute.

The system tries to avoid outright misses, since they were such a frustration in PS1.


What will be the range for the values Attacker.accuracy and target.evasion?

Jack hasn't made a decision about that yet. Our probability script calculates for accuracy/evasion differences of +/- 30 points, so it will probably end up with two-digit or low three-digit values.


Hmm ... OK, damage reduction also plays a role. As the values are now, it seems as if a player would better invest in damage reduction instead of evasion.

That is of course a danger in having two different defensive attributes. One of the main differences will be how the attributes work with multi shot abilities. Reduction 10 against four shots with 15 damage each is really impressive. Reduction 10 against a single shot with 60 damage much less so.
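The arithmetic behind that comparison, assuming a clean hit (1.0x effectiveness) on every shot:

```python
reduction = 10

# Four shots of 15 damage each: the reduction is subtracted per shot.
multi_shot = 4 * (15 * 1.0 - reduction)

# One shot of 60 damage: the reduction is subtracted only once.
single_shot = 60 * 1.0 - reduction

print(multi_shot, single_shot)  # 20.0 vs 50.0 from the same 60 base damage
```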

But even so balancing the attributes will definitely be one of our major headaches.


Anyway, thanks for the insights. I am very interested how the system mutates during playtesting.

So am I, and I'm looking forward to working with you all again on making it better in the open playtests. And with a bit of luck, even before the open playtests this time around.
jack1974
I only want to add that I don't understand much of all this :lol: I mostly experiment changing values while testing :mrgreen:
Lonestar51
What will be the range for the values Attacker.accuracy and target.evasion?
Jack hasn't made a decision about that yet. Our probability script calculates for accuracy/evasion differences of +/- 30 points, so it will probably end up with two-digit or low three-digit values.

Hmm ... makes sense. Say you attack an enemy with 100 evasion: you need 75 (accuracy + random) to hit at all, so with 60 accuracy you have a bit of a chance to hit, and with 50 the chance is down to very seldom (sorry, too tired to plug in the numbers ;-) ). On the other side, to get a critical hit you need 145 (accuracy + random), so with 160 accuracy you will very seldom miss at all. (The attacker with 60 accuracy would nearly always hit a slug with 0 evasion.) Yes, this feels like it could work. That is, playtesting will have to check it out anyway.
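That back-of-the-envelope reasoning is easy to check numerically. A rough Monte Carlo sketch, using the thresholds from the table and the 3d40 roll described above (function names are mine):

```python
import random

def roll_3d40():
    # Sum of three d40 minus 61, giving roughly -58..+59.
    return sum(random.randint(1, 40) for _ in range(3)) - 61

def hit_chance(accuracy, evasion, trials=100_000):
    """Estimated probability of doing any damage (DeltaHit >= -25)."""
    hits = sum(accuracy + roll_3d40() - evasion >= -25 for _ in range(trials))
    return hits / trials

# hit_chance(60, 100) comes out around 0.25, while hit_chance(160, 100)
# is exactly 1.0 -- with a +60 delta even the worst roll (-58) still hits.
```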


That is of course a danger in having two different defensive attributes. One of the main differences will be how the attributes work with multi shot abilities. Reduction 10 against four shots with 15 damage each is really impressive. Reduction 10 against a single shot with 60 damage much less so.

In PS1 the Guardian really did shine in the late game due to exactly this effect: the number of shots per round mattered less than overcoming the damage reduction (even less than the raw damage before reduction).



So this means there should be some high-level enemies with thick armor (= high damage reduction), and some without damage reduction but with an incredible amount of hitpoints.


I only want to add that I don't understand much of all this :lol: I mostly experiment changing values while testing :mrgreen:

That way works too. ;-)
Seloun
The system seems to be quite workable, though the main thing I would note is that it seems very scale-sensitive, and the value of accuracy/evasion/damage/reduction interact in odd ways depending on where on the scale you are. One of the big questions is how scalable the system needs to be.



Based on the description of the attack roll calculation, it seems like accuracy/evasion are to be balanced by absolute delta (since the relative advantage of 100 accuracy to 80 evasion is equivalent to 20 accuracy to 0 evasion). This means in order to keep things roughly analogous at high level versus low level, the modifiers to accuracy/evasion probably should be a fixed value relative to class rather than level based. 3rd edition D&D is sort of a counterexample; their hit roll is basically the same system except with class-based level dependent scaling on the hit roll, which means the nature of the hit roll changes dramatically as level increases (at low levels, wizards can potentially contribute in combat with weapons; at high levels it becomes pointless since the absolute difference becomes so great). This also means if you want to keep the nature of the hit roll similar as level increases/game progresses, you would want relatively tight control on the variance of accuracy and evasion.



Note though that there's nothing inherently wrong with having the nature of the hit roll change throughout the game, though for balance purposes those who lose out in the longer run should get something else to compensate for it. In effect, having the hit roll be scaled with level/game progression should result in characters becoming more specialized in tasks besides the hit roll (e.g. spell progression in the D&D analogy).



The second part of the mechanic seems to value reduction somewhat highly. The way hit/clean hit are valued seems to imply that in general, reduction values are going to be much smaller than the damage value (or the existence of the regular 'hit' won't mean much). This means in absolute terms reductions are going to have to scale slower than damage as you probably want net damage to scale roughly with health values (not doing so is perfectly reasonable, too, but it means again the early game has a different feel than the late game). The value of reduction also increases as you have more of it, making it tough to assign a value to (a la Armor Pen from old WoW). This formulation is inherently unstable; this is not necessarily a bad thing, it just means you have different regimes and you can't necessarily assign a single value to a statistic (finding corner cases where something becomes suddenly far more valuable than expected is part of the fun of playing non-generic systems for mechanics monkeys, after all).



If you want reductions to scale more closely with damage (hardly a requirement, but may be a design goal), you probably want to make the effectiveness parameters tighter, depending on where you want to balance with respect to reduction. 0.5 - 1.5 doesn't really work well unless reduction is smaller than 0.5 of damage, and it can easily result in crits that are many times the amount of a regular hit (maybe this is a feature). 1.0 - 1.2ish would probably work better if you wanted to put damage and reduction on a more even footing. Moving crits out of the hit roll is also another option (more parameters generally means more options for exact tuning, though also more ways for things to break). Another possibility is to have the hit roll modulate the reduction rather than the damage; this can make damage reduction scaling more manageable.
Anima_
Based on the description of the attack roll calculation, it seems like accuracy/evasion are to be balanced by absolute delta (since the relative advantage of 100 accuracy to 80 evasion is equivalent to 20 accuracy to 0 evasion). This means in order to keep things roughly analogous at high level versus low level, the modifiers to accuracy/evasion probably should be a fixed value relative to class rather than level based. 3rd edition D&D is sort of a counterexample; their hit roll is basically the same system except with class-based level dependent scaling on the hit roll, which means the nature of the hit roll changes dramatically as level increases (at low levels, wizards can potentially contribute in combat with weapons; at high levels it becomes pointless since the absolute difference becomes so great). This also means if you want to keep the nature of the hit roll similar as level increases/game progresses, you would want relatively tight control on the variance of accuracy and evasion.

Yes, the plan is to balance it by absolute delta. Actually the problem in D&D3.5 is more that attack bonus grows far faster than AC. So a high AC becomes pretty meaningless. That's why character optimization usually goes for things like damage reduction or miss chance.
Note though that there's nothing inherently wrong with having the nature of the hit roll change throughout the game, though for balance purposes those who lose out in the longer run should get something else to compensate for it. In effect, having the hit roll be scaled with level/game progression should result in characters becoming more specialized in tasks besides the hit roll (e.g. spell progression in the D&D analogy).

Well there is the way D&D4 did it, with simply adding half the level to the hit roll. Of course for the most part that was equivalent to simply not increasing to hit and defences per level. A system using raises, like TDE or L5R do, would change that of course. Since the higher accuracy would allow more room for raises in the first place.



I agree with you that a constant class-specific accuracy value, only modified by actions, weapons and effects/situation, would make a lot of sense. If we want to keep the dynamic the same, that is. Of course that would mean losing the advancement aspect that is so inherent to RPGs for this element. Which could be remedied by a level-dependent modifier to both accuracy and evasion.

Of course that would also depend on the nature of the character advancement system we decide on. At this point I'm not sure myself what would be the best course of action and will probably change my mind during later playtests anyway. Some things look unfortunately only good on paper.


The second part of the mechanic seems to value reduction somewhat highly. The way hit/clean hit are valued seems to imply that in general, reduction values are going to be much smaller than the damage value (or the existence of the regular 'hit' won't mean much). This means in absolute terms reductions are going to have to scale slower than damage as you probably want net damage to scale roughly with health values (not doing so is perfectly reasonable, too, but it means again the early game has a different feel than the late game). The value of reduction also increases as you have more of it, making it tough to assign a value to (a la Armor Pen from old WoW). This formulation is inherently unstable; this is not necessarily a bad thing, it just means you have different regimes and you can't necessarily assign a single value to a statistic (finding corner cases where something becomes suddenly far more valuable than expected is part of the fun of playing non-generic systems for mechanics monkeys, after all).

Yes, reduction values should scale slower than damage values. A reduction of 0.5 average damage would indicate a really armoured foe. One of the design requirements I was given was "elemental" strengths and weaknesses, so the reduction value will also be different for specific attacks. That way higher reduction values will also be possible. The next post will be about the way attributes work, where I want to explain some more about the system.


If you want reductions to scale more closely with damage (hardly a requirement, but may be a design goal), you probably want to make the effectiveness parameters tighter, depending on where you want to balance with respect to reduction. 0.5 - 1.5 doesn't really work well unless reduction is smaller than 0.5 of damage, and it can easily result in crits that are many times the amount of a regular hit (maybe this is a feature). 1.0 - 1.2ish would probably work better if you wanted to put damage and reduction on a more even footing. Moving crits out of the hit roll is also another option (more parameters generally means more options for exact tuning, though also more ways for things to break). Another possibility is to have the hit roll modulate the reduction rather than the damage; this can make damage reduction scaling more manageable.

Actually, if we scale reduction to a maximum of 0.5 damage, a critical hit will at most deal two times the damage a clean hit would. Which makes me wonder if "Hit" was the proper descriptor for that level; it might cause problems.

If the scale were 1.0 - 1.2 we would have the problem again that the only choice for a bad hit roll would be an outright miss, and a critical hit with only a 20% damage increase would be a bit underwhelming. (Probably welcome if you end up on the receiving end.) Making critical hits a separate check would of course also be an option; it wouldn't change the dynamic we currently have, though, and you could end up with a mere "Hit" being a critical. The main idea was that accuracy wouldn't be just a binary choice where either you hit or you don't. In the first design accuracy had a direct effect on damage (basically we simply added the delta to the damage), but that simply made raising accuracy better than raising damage, since the expected damage increased by one point for every point of accuracy, plus the fact that you wouldn't risk a miss. Having two attributes that do the same thing doesn't make that much sense.



Moving the hit modifier to the reduction side of the equation is an interesting idea. It obviously breaks down for 0 reduction. Still it would create a very different dynamic, I'll definitely think a bit more about it.

Of course when we talk about average damage we should always keep in mind that there will be actions/weapons with a higher base damage but a lower expected damage. For example a heavy weapon that has 150% base damage, but takes twice as long to fire. Usually that weapon wouldn't be used, but against high reduction enemies it becomes a great option.
Anima_
Here is also a rough diagram for the hit level distribution we have right now:

http://i.imgur.com/I4zNcED.png

Violet is a Critical Hit, Red is a Miss. So Cyan is a Direct Hit, Green a Clean Hit and Orange a Hit.

The numbers are the accuracy/evasion difference.
Seloun

If you want reductions to scale more closely with damage (hardly a requirement, but may be a design goal), you probably want to make the effectiveness parameters tighter, depending on where you want to balance with respect to reduction. 0.5 - 1.5 doesn't really work well unless reduction is smaller than 0.5 of damage, and it can easily result in crits that are many times the amount of a regular hit (maybe this is a feature). 1.0 - 1.2ish would probably work better if you wanted to put damage and reduction on a more even footing. Moving crits out of the hit roll is also another option (more parameters generally means more options for exact tuning, though also more ways for things to break). Another possibility is to have the hit roll modulate the reduction rather than the damage; this can make damage reduction scaling more manageable.

Actually, if we scale reduction to a maximum of 0.5 damage, a critical hit will at most deal two times the damage a clean hit would. Which makes me wonder if "Hit" was the proper descriptor for that level; it might cause problems.

If the scale were 1.0 - 1.2 we would have the problem again that the only choice for a bad hit roll would be an outright miss, and a critical hit with only a 20% damage increase would be a bit underwhelming. (Probably welcome if you end up on the receiving end.) Making critical hits a separate check would of course also be an option; it wouldn't change the dynamic we currently have, though, and you could end up with a mere "Hit" being a critical. The main idea was that accuracy wouldn't be just a binary choice where either you hit or you don't. In the first design accuracy had a direct effect on damage (basically we simply added the delta to the damage), but that simply made raising accuracy better than raising damage, since the expected damage increased by one point for every point of accuracy, plus the fact that you wouldn't risk a miss. Having two attributes that do the same thing doesn't make that much sense.



Moving the hit modifier to the reduction side of the equation is an interesting idea. It obviously breaks down for 0 reduction. Still it would create a very different dynamic, I'll definitely think a bit more about it.

Of course when we talk about average damage we should always keep in mind that there will be actions/weapons with a higher base damage but a lower expected damage. For example a heavy weapon that has 150% base damage, but takes twice as long to fire. Usually that weapon wouldn't be used, but against high reduction enemies it becomes a great option.

Regarding the multipliers, the main observation is that as it is now, the rate of gain of reduction always has to be significantly smaller than the gain in damage (so in item valuation, +1 reduction is always much more valuable than +1 damage). This isn't necessarily a problem, but it makes reduction somewhat harder to balance. By moving the scale to 1.0-1.2 (e.g. 1.0, 1.1, 1.15, 1.2 instead of 0.5, 1.0, 1.2, 1.5) and letting reduction vary more comparably to damage (in this case, the 0.5 very heavily armored target is a 1.0 reduction target) you should end up with similar results, except that now the reduction scale has been 'expanded' such that +1 reduction is a lot closer to +1 damage in valuation in more places. This also makes the accuracy bonus less overwhelming against softer targets (who are taking more damage anyway on average). Whether or not these are desirable features is dependent on the requirements, of course.



In the end the choice for the parameters really depends on the scale you'd want to be working on (including the 3D40, which would have identical properties if e.g. the expected attack/evasion delta was increased by 10x and the roll became 3D400 - really you'd want to define what a 'good' delta was and retrofit the multiplier from there. Note you can get this effect explicitly simply by having a scaling constant: DeltaHit = (magic constant)*(accuracy - evasion) + random and keep a single table). However, one of the reasons why you might want damage/reduction/expected to scale together is due to the fact that the accuracy part is being handled as a delta. If damage isn't handled in a similar way, there will always be a point where accuracy ends up being more valuable than damage (at least up to maximum result of accuracy) since accuracy is a multiplier on damage (so if the damage range increases, the value of accuracy does too - while vice versa is true, the balancing of accuracy/evasion as a fixed delta means your relative accuracy remains roughly the same throughout the game); this means it might be hard to balance accuracy bonuses since +1 accuracy early is a lot less useful than +1 accuracy late compared to +1 damage. Of course, if you know the scale you'll be operating on, you can probably tweak things so that the cross-over point is unrealistically high, but it's another thing to keep an eye on.



Moving the table result to the reduction part still keeps the general function of the table (accuracy helps damage) while making it more clear how accuracy helps damage - accuracy is more beneficial if the target has more armor, while damage is better against softer targets. Applying the modifier to the damage part does this too, just in an indirect way. Moving it to reduction makes crits less painful for soft targets (I think you'd probably want this to narrow the gap between squishy and non-squishy somewhat, but whether or not this is desirable is out of scope from mechanics analysis). Mainly it separates the design/balance space of damage and accuracy a bit more (also probably works nicer with a continuous function should that be desirable).



The distribution table looks perfectly workable, though as you mentioned the naming is probably misleading; 'hit' probably makes sense for the most probable result in general (so the current 'clean hit') while the reduced damage hit should probably be something like 'glancing hit'. The other possible suggestion would be to scale the default so that the expected ranges are against more 'even' values (if the table represents the expected range, maybe ~3D132 instead to scale against -100 to 100)



WRT absolute delta, level scaling, 4th ed: While it's true that 4th ed's system is (in practice) very close to not having any scaling at all, it does account for different levelled opponents, which is really an orthogonal design axis. That's really another thing that should probably come out of the requirements (should a beginning party have any chance against an endgame opponent? what is the acceptable range of opponents for a midgame party?). The other point is that while generally levels can easily scale forever, gear usually doesn't (something I found in Loren was that levelling was often a disadvantage due to enemy scaling; even if your characters' inherent stats scaled better, your gear didn't scale past a certain point, making your advantage relatively smaller; also the value of talent points decreased dramatically with levels); 4th ed practically makes gear part of scaled statistics to avoid that issue.
Lonestar51
Regarding the multipliers, the main observation is that as it is now, the rate of gain of reduction always has to be significantly smaller than the gain in damage (so in item valuation, +1 reduction is always much more valuable than +1 damage). This isn't necessarily a problem, but it makes reduction somewhat harder to balance. By moving the scale to 1.0-1.2 (e.g. 1.0, 1.1, 1.15, 1.2 instead of 0.5, 1.0, 1.2, 1.5) and letting reduction vary more comparably to damage (in this case, the 0.5 very heavily armored target is a 1.0 reduction target) you should end up with similar results, except that now the reduction scale has been 'expanded' such that +1 reduction is a lot closer to +1 damage in valuation in more places.

The problem with upping the damage reduction to 1.0 is that it favours guardians too much. Basically, heavy weapons (few shots with high damage) will work better than pistols (which get out lots of rounds of low damage) against targets with high damage reduction.



Example (multipliers taken from Aleema):

Guardian Tom has blaster of 10 damage, 2 shots per turn

Soldier Michelle has Pistol, 5 damage, 5 shots per turn

If the target has a damage reduction of ...

1 - Michelle will damage with every hit, as does Tom - If multiplier 1.0: damage 20 to 18

3 - Michelle needs a clean hit for damage - Tom scores with every hit - If multiplier 1.0: damage 10 to 14

5 - Michelle needs direct hit - Tom needs clean hit - If multiplier 1.0: damage 0 to 10

7 - Michelle needs critical hit (for half a point of applied damage) - Tom needs clean hit - If multiplier 1.0: damage 0 to 6



From this example we can draw two conclusions:

1) Targets of high damage reduction require heavy weapons

2) Targets of low damage reduction favour weapons of high raw damage



If the multiplier spread were reduced from 0.5 - 1.5 to a narrower range, the advantage of heavy weapons would be even more pronounced.
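The example above can be tabulated directly; a quick sketch with the weapon stats from the example, assuming clean hits at the 1.0 multiplier and clamping negative per-shot damage to zero:

```python
def turn_damage(shot_damage, shots, reduction, multiplier=1.0):
    # Reduction applies per shot; negative per-shot damage is clamped to zero.
    return shots * max(0.0, shot_damage * multiplier - reduction)

for reduction in (1, 3, 5, 7):
    michelle = turn_damage(5, 5, reduction)    # pistol: 5 damage, 5 shots
    tom = turn_damage(10, 2, reduction)        # blaster: 10 damage, 2 shots
    print(f"reduction {reduction}: Michelle {michelle}, Tom {tom}")
```

This reproduces the 20/18, 10/14, 0/10 and 0/6 pairs listed above.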
Seloun

The problem with upping the damage reduction to 1.0 is that it favours guardians too much. Basically, heavy weapons (few shots with high damage) will work better than pistols (which get out lots of rounds of low damage) against targets with high damage reduction.

Setting the asymptotic damage reduction goal to 1.0 doesn't really imply much by itself as it is quite easy to design an item scaling that would still favor fast-firing weapons regardless of the goal (the only important point is the relative deltas between the weapons and the expected damage reduction; there's nothing that says weapons must be balanced against or near 0 damage reduction). The damage reduction goal doesn't imply that in general damage reductions would be equal to the damage done; it only speaks to the scaling. The absolute difference can be made arbitrarily large. The main point of the smaller scaling with the damage reduction goal change is that the value of damage reduction will vary at a closer rate to the value of damage.

What this is really addressing is the following: Should +1 to damage be equivalent in value to +1 in damage reduction?

Neither answer is inherently better, but they have different consequences. The original scaling implies that +1 damage is typically less than half as valuable as +1 in damage reduction asymptotically, since a 'heavy armor' target should roughly have half of the expected damage value (so the same increase in armor as damage implies twice as far in progression). Again, this is not necessarily a problem (you just make sure you account for it in your item development) but it's a way to start thinking about stat valuations.

Furthermore, a higher damage reduction target generally suggests a tighter grouping on the hit table multiplier as it stands now as otherwise crits become more and more overvalued (as armor increases relatively, crits become more and more important -> accuracy becomes more and more important); this is why the logic went from tighter scaling -> higher relative armor values -> reduced multipliers. One of the side effects of the reduced multipliers is also that crits are somewhat less crippling against lower than expected armor (more tolerant of low armor), which also may or may not be desirable.
Lonestar51
I would not agree with the assertion that crits are overvalued - as the base damage of the weapon plays a large part of it.



A sniper would get many crits, but the sniper weapon would have moderate damage and low rate of fire, thus it balances itself out.



----



Seloun, many of the things you write are technically correct, but maybe a bit too much seen from a PvP point of view. There may be ways to balance the different stats other than nerfing the crits: there may be enemies with machine guns and others with overpowered sniper rifles (many crits if evasion is neglected, plus high base damage) and everything in between. For every enemy the ideal build would be different. Also some enemies would have high HP/low damage reduction and others the reverse, which would favour different weapons.



This is not like PvP, where the other players will flock to an "ideal" build and you need to follow the herd or you cannot match them. Here the developer (Jack) may add enemies of different kinds which favour different builds, different classes etc.
Anima_
Actually, keep in mind that the crit chance goes to zero pretty fast for negative deltas. Someone who has neither reduction nor evasion should die pretty fast. There will probably be a trade-off between evasion and reduction in some way. Looking at the diagrams so far, reduction is better against lower base damage, while evasion is more powerful against higher base damage.


Moving the table result to the reduction part still keeps the general function of the table (accuracy helps damage) while making it more clear how accuracy helps damage - accuracy is more beneficial if the target has more armor, while damage is better against softer targets. Applying the modifier to the damage part does this too, just in an indirect way. Moving it to reduction makes crits less painful for soft targets (I think you'd probably want this to narrow the gap between squishy and non-squishy somewhat, but whether or not this is desirable is out of scope from mechanics analysis). Mainly it separates the design/balance space of damage and accuracy a bit more (also probably works nicer with a continuous function should that be desirable).

So actually we have the opposite of this dynamic: the higher the reduction, the higher the base damage needs to be. I think this is more desirable at the moment.

An additional possibility would be to make the thresholds and multipliers variable instead of static. That would add interesting dimensions to armour choice, for example if the different armour categories had different multipliers.



At the moment I won't change the table, but we will probably get back to it later in the development cycle. It's something that's pretty easily changed later on, so I want to concentrate more on the basics right now.

That's also the reason why I didn't answer that many points and posts in general; this is a pretty busy part of the process for me. I do read everything and take your suggestions into account, but I have to ask for your patience when it comes to answering. Unfortunately I'm also not the fastest writer in English, so these posts often take quite a bit of time.
Seloun
I'm not really advocating a particular scheme so much as trying to identify features of the mechanics.



Also, my position is not that crits are overvalued (period), but that crits become more and more valuable as reduction increases, if the base damage delta remains the same:



- 100 damage vs 50 reduction results in 50 damage on a regular (1.0x) hit; it does 100 damage on a critical hit (1.5x)

- 1000 damage vs 950 reduction results in 50 damage on a regular (1.0x) hit; it does 550 damage on a critical hit (1.5x)



In order to maintain symmetry in terms of the value of crit hits (which, again, may or may not be desirable; I am not suggesting one is better than the other), then, it is necessary for reductions to scale at some asymptotic ratio compared to damage (e.g. 2 damage per 1 reduction, so you end up with 1000 damage/500 reduction instead as the scaling in the latter part of the game). Again, this is not necessarily a bad thing. It is an observation that, should the reduction scale slower than damage, the value of reduction is likely to be higher per point than the value of damage since the balancing would occur at the expected value of reduction. Put another way, +10 reduction would represent further progress in the game than +10 damage would.
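That scaling argument is easy to verify with the numbers from the examples above (the helper function is mine):

```python
# If reduction scales at a fixed ratio to damage (here 2 damage : 1
# reduction), the crit-to-clean-hit ratio stays constant across the game.
def hit(damage, reduction, eff):
    return damage * eff - reduction

for damage in (100, 1000):
    reduction = damage // 2
    clean = hit(damage, reduction, 1.0)
    crit = hit(damage, reduction, 1.5)
    print(damage, crit / clean)  # ratio is 2.0 at both scales

# Contrast with the fixed-delta case from above: 1000 damage vs 950
# reduction gives 550 / 50 = 11x the clean-hit damage on a crit.
```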



Please keep in mind that I am not saying that there is something wrong with the balance (this is an impossible statement given the amount of data available). However, what can be analyzed is the scaling, as we can be reasonably sure that values will change (typically go up) as the game progresses, and thus the mechanics will be used at different points of the curve. Again, I am not saying 'nerf crits!' as there is not enough data to reasonably suggest something like that (as noted, this is highly dependent on the actual environment - i.e. the choice of enemies and itemization and the actual scale values). What I'm saying is only what I've said, among which is that if reduction and damage scale evenly, crits become more valuable, and thus if crit valuation should stay the same (which is a design decision without an immediate, obvious answer) the reduction has to scale slower than the damage.
Anima_
That is true, but didn't I already say that we want to scale it only up to 0.5? Maybe I forgot to write it here. At that level the difference is only 100% of damage which is acceptable.

Interestingly, the same thing would happen if the modifier were applied to the reduction instead.

- 100 damage vs 50 reduction Clean Hit: 100 - 1.0*50 = 50 damage, Critical Hit: 100-0.0*50 = 100 damage

- 1000 damage vs 950 reduction Clean Hit: 1000 - 1.0*950 = 50 damage, Critical Hit: 1000-0.0*950 = 1000 damage

So actually it would be even worse. Of course these numbers depend on the table used, but that goes both ways. (The 0.0 for critical hits comes from the experimental table; higher values gave disappointing results.)



To really solve the problem we would have to subtract reduction before we apply the multiplier. Then the relation between the hit level modifiers and the actual damage for each hit level would remain the same.

Might even be the better solution, since getting crits is largely luck based. On the other hand, having a recourse against high-reduction enemies other than heavy weapons would be nice as well, even if the expected damage would still be much lower than for heavier weapons. Something to think over.
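To illustrate the difference, here is a minimal sketch of the two orders of operations (function names are mine, not the framework's):

```python
def multiply_then_subtract(damage, reduction, mult):
    # Current order: Damage = damage * effectiveness - reduction
    return max(0, damage * mult - reduction)

def subtract_then_multiply(damage, reduction, mult):
    # Proposed order: Damage = (damage - reduction) * effectiveness
    return max(0, (damage - reduction) * mult)

# 1000 damage vs 950 reduction on a 1.5x critical hit:
#   multiply_then_subtract -> 550 (11x a clean hit's 50)
#   subtract_then_multiply -> 75  (exactly 1.5x a clean hit's 50)
```

With subtract-first, a critical hit is always worth exactly its multiplier relative to a clean hit, regardless of reduction.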
Seloun
That is true, but didn't I already say that we want to scale it only up to 0.5? Maybe I forgot to write it here. At that level the difference is only 100% of damage which is acceptable.

You did mention that - the computation was to explain why high reduction values tend to favor crit (again, this is not necessarily a bad thing, it's just a description of what happens).



With respect to how this interacts with the 0.5 being the desired top end ratio of reduction to damage:



Given that the ratio of reduction to damage only goes up to 0.5, the same amount of reduction is 'further along' in the game than the same amount of damage.



Assuming the 0.5 reduction to damage ratio, and that early in the game you are at 200 damage and 50 reduction:

- By the time you level/get gear/etc. enough to be at 400 damage (+200 damage) you would expect to be around maybe 150 reduction (+100 reduction). It would not be much more than that or you would be violating your expected ratio.

- By the time you level/get gear/etc. enough to be at 250 reduction (+200 reduction) you would expect to be around maybe 600 damage (+400 damage). It would not be much less than that or you would be violating the expected ratio.

This is what I mean by the scaling. While these are example numbers, the issue is with the approximate rate of increases (~2 damage to 1 reduction) rather than the specific values. So gear with say +10 reduction is likely worth a lot more than gear with +10 damage.

That in and of itself is not necessarily a problem, but it is a characteristic of the mechanics that we can derive independently, without knowing more about the gameworld (which is why I bring it up; most other analysis requires more data, which is not available yet).


Interestingly, the same thing would happen if the modifier were applied to the reduction instead.

- 100 damage vs 50 reduction Clean Hit: 100 - 1.0*50 = 50 damage, Critical Hit: 100-0.0*50 = 100 damage

- 1000 damage vs 950 reduction Clean Hit: 1000 - 1.0*950 = 50 damage, Critical Hit: 1000-0.0*950 = 1000 damage

So actually it would be even worse. Of course these numbers depend on the table used, but that goes both ways. (The 0.0 for critical hits comes from the experimental table; higher values gave disappointing results.)

You're correct here. The 'apply to reduction' is really an alternative to the 'apply to damage', and not really intended to be used with the 1.0 scaling as well (they are two different suggestions, to be applied separately). The computation I would show is using the (1.0,1.1,1.15,1.2) multipliers:

- 100 damage vs 50 reduction Clean Hit: 110 - 50 = 60 damage, Critical Hit: 120 - 50 = 70 damage

- 1000 damage vs 950 reduction Clean Hit: 1100 - 950 = 150 damage, Critical Hit: 1200 - 950 = 250 damage



70/60 ~= 116%, 250/150 ~= 167%, vs the original 200% and 1100%



What I meant before was that the smaller multiplier ranges would let +10 damage be closer in value to +10 reduction than the larger multipliers (whether this is good or bad is not a decision I am making).


To really solve the problem we would have to subtract reduction before we apply the multiplier. Then the relation between the hit level modifiers and the actual damage for each hit level would remain the same.

This does 'fix' this issue (problem is a strong term). The downside, though, is that it's somewhat boring :) (not a coincidence; interesting almost always means more easily breakable).
Might even be the better solution, since getting crits is largely luck based. On the other hand, having a recourse against high-reduction enemies other than heavy weapons would be nice as well, even if the expected damage would still be much lower than for heavier weapons. Something to think over.

Well, to be honest I would much prefer completely non-random or at most minimally luck-based results in games like these, due to the save/load issue. One kind of mechanic could be:

- Shooting someone always hits (damage is still damage - reduction, with a small random range)

- Delta accuracy 'builds' on the target (maybe with a _small_ random range)

- If the total delta accuracy is over a threshold, it is a critical hit, and delta accuracy is cleared



For example, character A and B are shooting enemy C (assume a +/- 20% randomness to accuracy buildup):

- Char A shoots enemy C, hits normally (automatic) and builds up 11 (10 +/- 2) points of accuracy on the target

- Char B shoots enemy C, hits normally (automatic) and builds up 33 (35 +/- 7) more points of accuracy (44 total)

- on enemy C's turn, enemy C drops 4 (5 +/- 1) points through evasion (40 total)

- Char A shoots enemy C, hits normally (automatic) and builds up 12 (10 +/- 2) points of delta accuracy on the target (total 52), which is over the (example) crit threshold of 50, and thus also gets a crit. Built-up accuracy on enemy C goes to 0.
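Seloun's buildup mechanic could be sketched like this (class and parameter names are illustrative; the `roll` parameter just makes the +/- 20% randomness injectable):

```python
import random

CRIT_THRESHOLD = 50  # example threshold from the post

class BuildupTarget:
    """Sketch of the 'accuracy builds up on the target' idea."""

    def __init__(self):
        self.buildup = 0.0

    def attack(self, accuracy, roll=None):
        """Every attack hits; returns True when the accumulated accuracy
        crosses the threshold, turning this hit into a critical."""
        factor = roll if roll is not None else random.uniform(0.8, 1.2)
        self.buildup += accuracy * factor
        if self.buildup >= CRIT_THRESHOLD:
            self.buildup = 0.0  # the crit consumes the built-up accuracy
            return True
        return False

    def evade(self, evasion, roll=None):
        """On the target's turn, evasion bleeds off some buildup."""
        factor = roll if roll is not None else random.uniform(0.8, 1.2)
        self.buildup = max(0.0, self.buildup - evasion * factor)
```

Feeding in the example rolls, the buildup goes 11, then 44, drops to 40, and reaches 52 on Char A's second shot, which crosses the threshold of 50 and resets to 0, exactly as in the walkthrough above.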



Obviously it doesn't have to be exactly like this, but a mechanic like that mostly removes the randomness from the battle, and makes targeting a much more interesting issue (you might not want to 'waste' a high accuracy buildup on a weaker critical attack, so you might have a good reason to spread attacks around). This could thematically also be represented as cover (shooting someone 'flushes' them from cover, which is what 'accuracy build-up' could represent; the name could certainly be altered).



- A glancing hit might be what you get if your attack would drop the built-up accuracy to below 0.



- This could also let the player react with active defenses, since they can see how much of a danger they are in (if a character is sitting near threshold of accuracy built, it might be a good idea to take a defensive action).



- Crits being really big is less of an issue since you can control it much better. Even the tank might need a 'breather' if they get close to their threshold, since normal attacks do little to them while crits are still dangerous (this also means having a second tanky character is a lot more useful than in the first game).
Anima_
Attributes

In the context of the new framework an attribute is anything we use in the internal calculation. Some of the statistics shown to the player will not be attributes in that sense.

That only makes sense if we take a look at how attributes are handled internally. Instead of a simple number, every attribute is a list of attribute segments. These segments hold the actual attribute values, and the final attribute value is the sum of all triggered segments.

Apart from the value, every segment also holds a set of internal and a set of external keywords, plus a beginning and end time. These are the conditions for the segment to trigger.

That way we can for example have an ability that increases accuracy only when the character is in the open and the target is in cover. Damage resistances are realized the same way: a fire resistance would simply be a reduction segment with the external "Fire" keyword.
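A rough sketch of how such segmented attributes might look (my own naming; the time window is omitted and the exact trigger test is an assumption based on the description):

```python
from dataclasses import dataclass

@dataclass
class Segment:
    value: float
    internal: frozenset = frozenset()  # must be a subset of the owner's state
    external: frozenset = frozenset()  # must be a subset of the other side's state

class Attribute:
    """An attribute as a list of segments; the effective value is the
    sum of all segments whose trigger conditions hold."""

    def __init__(self, segments):
        self.segments = list(segments)

    def value(self, own_state, other_state):
        return sum(s.value for s in self.segments
                   if s.internal <= own_state and s.external <= other_state)

# Base reduction of 10, plus 5 more only against attacks tagged "Fire":
reduction = Attribute([Segment(10), Segment(5, external=frozenset({"Fire"}))])
```

Here `reduction.value(set(), {"Fire"})` yields 15 against a fire attack, but only 10 against anything else.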



At the moment the following attributes are definitely in the game: HP, PP, Accuracy, Evasion, Damage, Reduction. In the end this may actually double; there are still a few mechanics up in the air, and some things that are static at the moment might become variable enough to need attributes.



Time keeping

Like Loren we will have a delay-based system again, this time with a major twist: the delay is not between an action's execution and the next time you get to choose an action. Instead it's between choosing your action and that action's execution.
[code]Choose -----------> Execute    Choose -----------> Execute[/code]
The main difference is that this system allows the interruption of actions, and planning as well. Even in the limited prototype it was already pretty fun to take cover to escape a sniper action.

In addition we keep track of the total time, so effects will have an absolute duration. This is unlike Loren, where effect length was measured in character turns.
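The choose-then-execute timeline could be driven by a priority queue keyed on absolute time; this is a guess at the structure, not the actual implementation:

```python
import heapq

events = []  # (absolute_time, seq, label) min-heap
seq = 0

def schedule(now, delay, label):
    global seq
    heapq.heappush(events, (now + delay, seq, label))
    seq += 1  # tie-breaker so equal times resolve in scheduling order

# The hero commits to a slow snipe at t=0; it only resolves at t=3.
# The enemy chooses at t=1 and could interrupt it in between.
schedule(0, 0, "choose: hero")
schedule(0, 3, "execute: hero snipes")
schedule(0, 1, "choose: enemy")

order = []
while events:
    t, _, label = heapq.heappop(events)
    order.append((t, label))
# order == [(0, 'choose: hero'), (1, 'choose: enemy'), (3, 'execute: hero snipes')]
```

Because events carry absolute timestamps, effect durations can simply be stored as end times, matching the absolute-duration behaviour described above.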



Might have been a bit brief but I think those are the important bits about those two aspects so far.
Seloun


Apart from the value every segment also holds a set of internal and external keywords each and a beginning and end time. These are the conditions for the segment to trigger.

That way we can have for example an ability that increases accuracy only when the character is in the open and the target in cover. This way we also realize damage resistances. A fire resistance would simply be a reduction segment with the external "Fire" keyword.

Interesting. I assume this is more effort to make the framework easily extensible.



It's not quite clear though how you would generally apply this. For example, how do you know that 'Fire' means the target is on fire, and not the shooter? It's interesting you chose an example with a mixed condition check. You could certainly have the logic reside in the ability check itself, but that seems to mostly defeat the purpose of having the segments attached to the attribute (actually, since presumably the condition is not stored in the attribute, just the triggers, it's not clear how this is applicable at all).

In the 'character is in the open/target is in cover' example, it's not easy to see how the segment concept helps: what segment, tied to the target's evasion or the shooter's accuracy, makes the described ability work? It's easy to see how it would work if it's just based on the target being in cover (a segment off the target's evasion) or on the shooter being in the open (a segment off the shooter's accuracy), but not how to have something that checks that both conditions are true.



A secondary, minor issue is that if you are actually using string literals in the implementation, it's notoriously easy to introduce an unintended category of effects (e.g. 'Fire' and 'fire') as a virtually silent bug. While there are some relatively simple workarounds (scanning data files against a known set of keywords, for example; basically a simple compiler) it sort of emphasizes that this is effectively trying to describe a metaprogramming language.
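That safeguard can be as small as a scan of the loaded data against a declared keyword set; `KNOWN_KEYWORDS` and the entry format here are hypothetical:

```python
KNOWN_KEYWORDS = {"Fire", "Ice", "Cover", "Open"}  # hypothetical master list

def validate(entries):
    """Scan (name, keywords) pairs from data files and flag anything
    not in the master list, catching 'fire' vs 'Fire' style typos."""
    errors = []
    for name, keywords in entries:
        for kw in sorted(keywords):
            if kw not in KNOWN_KEYWORDS:
                errors.append(f"{name}: unknown keyword {kw!r}")
    return errors
```

For example, `validate([("flame cloak", {"fire"})])` reports the lower-case typo instead of silently creating a new effect category.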



I suppose another possible minor issue is the efficiency, though with sorted lists it seems like you could keep the execution time to O(n) (though the sorting itself would impose costs).



Also, how do you keep track of which thing a segment came from? E.g. how do you know that the 'Fire' segment is the one from your chestpiece and not from a buff (a time check could potentially resolve this) or a passive ability? There are a number of possible solutions (including the 'doesn't matter' one) but it seems like a potential area of fragility.



The main difference is that this system allows the interruption of actions and planing as well. Even in the limited prototype it was already pretty fun to take cover to escape a sniper action.

In addition we keep the total time, so effects will have an absolute duration. This is unlike Loren were effect length was measured in character turns.

Time independent of character turns is good.



The delay system is interesting, but also somewhat scary. It seems like quicker actions are going to be much stronger than their amortized time cost would indicate. Taking a turn later also seems to be more of an advantage (to a point) since you have far more information. A delay action seems very strong (anything with a short time delay even if it does nothing) which means actions intended as 'interrupt' actions are probably going to be very strong.



First suggestion: don't have any really long delay actions (or very, very few, for specific reasons, e.g. a killer boss ability). Instead, make them multi-part actions if possible: instead of a snipe with a long timer, have a 'ready snipe' action, a 'select target' action and a 'snipe' action; more generally you might allow a 'ready snipe' action that builds up a buff on the shooter that gets consumed by the 'snipe' action, or something similar. If actions are all roughly the same delay, most of the problems go away.



Second suggestion: Have abilities that have a post-action delay, even if most abilities don't. The problem is that otherwise interruption-style actions become very strong, since they generally have to be fast in order to be useful. If fast is not the same as short time cost, you can have an action that can work as an interrupt without the time cost being short (e.g. a reflex shot that goes off in 2 time units, but also has a 4 time unit recovery time afterwards is easier to balance than a 2 time unit shot). Implementation wise this could be an extra 'nothing' action automatically taken after the interrupt action (so interrupt actions are actually a queue of 2 actions).
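The 'queue of 2 actions' idea from the second suggestion, reduced to a sketch (names and numbers are mine):

```python
def interrupt_action(strike_delay, recovery_delay):
    """Model an interrupt as a queue of two actions: a fast strike
    followed by a forced recovery 'nothing' action."""
    return [("reflex shot", strike_delay), ("recover", recovery_delay)]

# A 2-unit shot with a 4-unit recovery: fast enough to interrupt,
# but its real cost is 6 units, not 2.
queue = interrupt_action(2, 4)
total_cost = sum(delay for _, delay in queue)  # 6
```

Decoupling reaction speed from total time cost this way gives the balancing knob Seloun describes: the strike stays fast while the amortized cost stays honest.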
Anima_
Interesting. I assume this is more effort to make the framework easily extensible.

Yes, that was the idea. One of my earlier frameworks already had this concept, but we decided that it wasn't necessary for Loren, since it would be a much lighter RPG than Planet Stronghold. Well, things changed a bit during development, as most of you probably noticed. :wink:



I should have explained a bit more what I meant by internal and external keywords. In the example the ability would be bound to the accuracy attribute of the shooter. We then test whether the internal keywords are a subset of the shooter's state and whether the external keywords are a subset of the target's state.

As for the Fire keyword: it would be exclusive to attacks. It wouldn't even show up in a normal check, for reasons that have to do with the dual-wielding implementation.



We are planning to use a GUI-based editor for that. This problem was a never-ending source of bugs for Loren, even with its lesser use of keywords.



For the comparison we use Python's set implementation. So far the speed is acceptable.



All segments have a reference to their source and of course all sources know their segments.


The delay system is interesting, but also somewhat scary. It seems like quicker actions are going to be much stronger than their amortized time cost would indicate. Taking a turn later also seems to be more of an advantage (to a point) since you have far more information. A delay action seems very strong (anything with a short time delay even if it does nothing) which means actions intended as 'interrupt' actions are probably going to be very strong.

Yes, it's totally scary. I was kind of shocked when Jack gave the go-ahead for it. Apart from execution speed, there will also be a measure of how difficult an action is to interrupt. But yes, interrupts have the potential to become very powerful. That's a fact we will definitely keep in mind, but trying to balance it in a vacuum won't yield many results.



Regarding the advantage of going last: it's true that it gives a certain planning potential to the slower characters. On the other hand it forces them to react to the faster characters, so the faster characters have control over the flow of battle.
jack1974

The delay system is interesting, but also somewhat scary. It seems like quicker actions are going to be much stronger than their amortized time cost would indicate. Taking a turn later also seems to be more of an advantage (to a point) since you have far more information. A delay action seems very strong (anything with a short time delay even if it does nothing) which means actions intended as 'interrupt' actions are probably going to be very strong.

Yes, it's totally scary. I was kind of shocked when Jack gave the go-ahead for it. Apart from execution speed, there will also be a measure of how difficult an action is to interrupt. But yes, interrupts have the potential to become very powerful. That's a fact we will definitely keep in mind, but trying to balance it in a vacuum won't yield many results.

I'm the scarecrow 8)

Jokes aside, I wanted to use this system so there could also be an auto-combat option for lazy players, or simply for battles that are almost won. I liked that feature in games like Heroes of Might & Magic, and I'm sure the more casual players would like it too. While the game is not real-time, the battles should be much more frantic than in a classic turn-based game like Loren; I mean, you will need to consider several values before making your move. More like a chess match. It's surely going to be harder to balance (at Hard mode in particular; easier modes shouldn't be a problem) but I think it also has the potential to be even more fun than Loren's combat, which was already cool IMHO.