lugita15
In the Principles of Quantum Mechanics, Dirac derives an identity involving his delta function: xδ(x)=0. From this he concludes that if we have an equation A=B and we want to divide both sides by x, we can take care of the possibility of dividing by zero by writing A/x = B/x + Cδ(x), because our original equation is equivalent to A=B+Cxδ(x).
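To make sure I'm reading him right, here is a quick check of the identity in the modern distributional sense (my own sketch, not Dirac's argument): for any test function φ,

\[
\langle x\,\delta(x),\,\varphi(x)\rangle \;=\; \langle \delta(x),\, x\,\varphi(x)\rangle \;=\; 0\cdot\varphi(0) \;=\; 0,
\]

so xδ(x) = 0 as a distribution. Because of this, A = B and A = B + Cxδ(x) say exactly the same thing, and formally dividing the latter by x is what produces A/x = B/x + Cδ(x).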
Now Dirac was writing in the 1930s, before there was a rigorous theory of distributions, so he manipulated the delta function freely. The identity he derived is certainly correct, but is adding a constant multiple of the delta function when you divide by x really mathematically justified? And if it is, why isn't this technique more widely used, for example when solving Laplace's equation or doing quantum mechanics, where delta functions abound?
Any help would be greatly appreciated.
Thank you in advance.