I've experimented with a d6 die and a small predictive program. The idea: compute the deviation above and below the average to obtain an expected range for a set number of rolls. Using that predicted deviation to correct the average gave a more accurate estimate of the results than the average alone.
I cross-checked this with ten sets of 10 rolls, comparing each result against the estimated range. 70% of the time the result fell inside the range; 30% of the time it was off by 0.6-1.6. At 100 rolls of the die, the estimate was completely accurate. That said, more experimentation is needed, given the small sample size.
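To make the method concrete, here's a minimal sketch of the range calculation I'm describing, assuming the range is the average plus/minus one standard deviation of the sum of the rolls (the exact width I used may differ, so treat the `z` factor as an adjustable assumption):

```python
import math

def d6_sum_range(n_rolls, z=1.0):
    """Expected range for the sum of n fair d6 rolls: mean +/- z standard deviations."""
    mean = 3.5 * n_rolls  # expected value of a single d6 is 3.5
    # variance of a single d6: average squared distance from 3.5 (= 35/12)
    var_single = sum((face - 3.5) ** 2 for face in range(1, 7)) / 6
    std = math.sqrt(n_rolls * var_single)  # std of the sum grows with sqrt(n)
    return mean - z * std, mean + z * std

lo, hi = d6_sum_range(10)
print(f"10 rolls: expect the sum between {lo:.1f} and {hi:.1f} (average 35)")
```

For 10 rolls this gives roughly 35 plus/minus 5.4, which is much tighter than just quoting the average and hoping.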
While the sample size is small and my experiment used a physical d6, whereas GbG's randomness comes from a virtual RNG, I'm confident the approach is representative enough to give more accurate estimates.
I think it would be greatly beneficial for the community if someone with stronger probability maths and Excel skills than mine could build an Excel spreadsheet for this: Excel calculates the deviation and gives a range of estimated attrition, based on the given number of fights and the estimated deviation for that number of fights at the X% attrition risk, alongside a confidence percentage for the estimate.
This may be complex to set up, but at the user level it should be quite straightforward: insert the number of fights and the X% attrition risk, and get back an estimated range for the attrition to expect. This should be vastly more accurate than going off the expected average alone and then being surprised when the resulting attrition is off.
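Until someone builds the spreadsheet, the same calculation can be sketched in a few lines. This assumes the usual community model of GbG attrition (each fight independently adds attrition with probability X%, i.e. a binomial distribution), which isn't confirmed by the game itself; the `z=1.96` default corresponds to roughly a 95% confidence range:

```python
import math

def attrition_range(fights, attrition_pct, z=1.96):
    """Estimated attrition gained over `fights` attempts at attrition_pct% risk each.

    Binomial model: mean = n*p, std = sqrt(n*p*(1-p)).
    z=1.96 gives an approximately 95% range (normal approximation).
    """
    p = attrition_pct / 100.0
    mean = fights * p
    std = math.sqrt(fights * p * (1 - p))
    lo = max(0.0, mean - z * std)       # attrition can't go below 0
    hi = min(float(fights), mean + z * std)  # ...or above the number of fights
    return mean, lo, hi

mean, lo, hi = attrition_range(100, 80)
print(f"100 fights at 80%: average {mean:.0f}, ~95% range {lo:.1f}-{hi:.1f}")
```

For example, 100 fights at 80% risk averages 80 attrition, but the realistic range is roughly 72-88, which is exactly the kind of spread the spreadsheet should surface instead of the bare average.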