I'm neither a toxicologist nor a biologist; I was just curious whether you had more specific grounds for your calculation. I'm sorry if my message came across as aggressive, that really wasn't my intention.
Still, if I had to make an educated guess, I'd assume it should rather be modeled with something closer to a normal distribution (because rodents would be orders of magnitude more likely to die at doses around LD50 than at doses around, say, 0.5 * LD50, implying 0.5 * LD50 << LD25). That also means LD100 wouldn't even make sense from a statistical standpoint, since under such a model the probability of death only approaches 100% asymptotically (and AFAIK that's why LD50 or LD99 are commonly used in toxicology rather than LD100: https://en.wikipedia.org/wiki/Lethal_dose#Units_and_measurement).
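To illustrate what I mean, here's a minimal sketch of a probit-style dose-response curve. This is not real toxicology data: the slope parameter sigma is a made-up value I picked purely so the numbers show the shape of the curve.

```python
# Rough sketch only: assume a probit / log-normal dose-response,
# P(death | dose d) = Phi((ln d - ln LD50) / sigma).
# sigma is a hypothetical slope; real values vary by substance and species.
import numpy as np
from scipy.stats import norm

LD50 = 1.0    # work in units of LD50, so the absolute value doesn't matter
sigma = 0.3   # assumed slope, for illustration only

def p_death(dose):
    return norm.cdf((np.log(dose) - np.log(LD50)) / sigma)

for mult in (0.5, 1.0, 2.0, 5.0):
    print(f"{mult:>4} * LD50 -> P(death) = {p_death(mult * LD50):.3f}")

# P(death) approaches 1 only as the dose goes to infinity, so an "LD100"
# is not a well-defined dose under this kind of model.
```

With that arbitrary sigma, 0.5 * LD50 kills roughly 1% and the curve only flattens toward 100% asymptotically, which is the LD100 point above.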
Sure, there are other approximations involved, as you pointed out. I also reckon the lethal dose wouldn't scale strictly proportionally with body mass, and it's probably not directly translatable from rodents to humans either.
Though I think 2 * LD50 probably well overestimates the minimal dose needed for a high probability of death (which is fine in this specific case, since death is the intended effect). It's just that you wrote "to make it 100%", which piqued my interest; I was wondering whether there were specific studies that empirically established such a result.
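For what it's worth, with the same made-up sigma = 0.3 from the sketch above, the model puts the ~90% lethality point near 1.5 * LD50 and 2 * LD50 at roughly 99%, so close to but still not a literal 100%; but again, that slope is pure illustration, not data.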
TL;DR: that was just some nitpicking on my part.