In this code I'm trying to normalize two independent probability densities separately, combine them into a normalized joint density, and then use that joint density to calculate the probability that the two variables are equal. fD(x) is a Gaussian divided by x^2 and fA(x) is a Gaussian. The first problem appears when I check the normalization of the joint density by taking the double integral of fD(x)*fA(y) dx dy over all space: I get a strangely vanishing number when the parameter "hartree" takes a certain value, namely 27.211. If I change hartree to 27, 1, or 2, everything works, but 27.211 does not. Later, when I take the single integral of fD(x)*fA(x) dx over all space to get the probability that the two variables are equal, I find the result depends on hartree. This hartree factor is a unit conversion in my physical problem and in principle should not affect either the normalization or the probability at all. I suspect a coding bug, but I can't find it. I'd appreciate any input.
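For what it's worth, the vanishing-integral effect can be reproduced outside Maple. The sketch below (Python; the mean 3/hartree and width 0.5/hartree are made-up placeholders, and fD is simplified to a plain Gaussian without the 1/x^2 factor) integrates a correctly normalized density on a fixed grid. Once hartree squeezes the peak well below the grid spacing, a fixed-resolution rule misses or mis-weights the peak and the "normalization" comes out far from 1, even though nothing is wrong with the density itself:

```python
import math

def gaussian(x, mu, sigma):
    """Normalized Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n subintervals on [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

hartree = 27.211
# A normalized density whose location and width both shrink like 1/hartree,
# mimicking the unit conversion in the post (3.0 and 0.5 are hypothetical).
f = lambda x: gaussian(x, 3.0 / hartree, 0.5 / hartree)

# Fixed grid over [0, 10]: spacing 0.1 is much wider than the peak
# (width ~0.018), so the quadrature result is essentially arbitrary.
coarse = trapezoid(f, 0.0, 10.0, 100)
# A grid fine enough to resolve the peak recovers the normalization.
fine = trapezoid(f, 0.0, 10.0, 100_000)
```

This matches the symptom: changing the integration bounds (or hartree) changes where the grid points fall relative to the peak, so some parameter combinations happen to work and others don't.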
Thank you very much!
Edit: I found that the problem with the double-integral normalization may be related to the discretization used in the numerical evaluation: if I set the lower bound to 1/hartree and the upper bound to 10/hartree, it works, but with a lower bound of 1/hartree and an upper bound of 5/hartree it does not, even though the distribution has no weight between 5/hartree and 10/hartree. However, after fixing this I still have the problem that the single integral of fD(x)*fA(x) dx over all space changes with hartree. As a probability I would expect the integral to lie between 0 and 1, but since it depends almost linearly on hartree, at hartree around 27 I get a value of about 25, which makes no sense. In fact, I now suspect the issue is not Maple but my calculation of the probability of the two random variables taking the same value; I'd appreciate it very much if someone could confirm this.
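The hartree-dependence of the second integral can be checked directly. The integral of fD(x)*fA(x) dx has dimensions of inverse length (a density evaluated along the diagonal, integrated once), so rescaling the axis by 1/hartree multiplies it by hartree; it cannot be a probability. The sketch below (Python, with both densities taken as Gaussians whose hypothetical means and widths scale like 1/hartree) shows the overlap integral doubling when hartree doubles:

```python
import math

def gaussian(x, mu, sigma):
    """Normalized Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n subintervals on [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def overlap(hartree, n=200_000):
    # Both densities live on a scale proportional to 1/hartree, mimicking
    # the unit conversion (the means 3.0, 3.2 and widths 0.5, 0.4 are made up).
    fD = lambda x: gaussian(x, 3.0 / hartree, 0.5 / hartree)
    fA = lambda x: gaussian(x, 3.2 / hartree, 0.4 / hartree)
    # Bounds scale with 1/hartree so the grid always resolves the peaks.
    return trapezoid(lambda x: fD(x) * fA(x), 0.0, 10.0 / hartree, n)

# The overlap integral scales linearly with hartree, as observed in the post.
ratio = overlap(2.0) / overlap(1.0)
```

This supports the suspicion at the end of the post: for continuous random variables, P(X = Y) is exactly zero, and the overlap integral is instead a (unit-dependent) measure of how much the two densities coincide, not a probability.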