I just started Griffiths's Introduction to Electrodynamics and I stumbled upon the divergence of $\frac{\hat r}{r^2}$. In the book, Griffiths computes the divergence of $\vec v = \frac{\hat r}{r^2}$ and gets zero everywhere, yet the surface integral of $\vec v$ over any sphere centred at the origin comes out to $4\pi$, which he calls a paradox.
Now what is the paradox, exactly? Ignoring any physical intuition behind this (a point charge at the origin), how are we supposed to believe, mathematically, that the source of $\vec v$ is concentrated at the origin? Or are we forced to believe it because there would otherwise be a contradiction with the divergence theorem?
Also, how would the situation differ if $\vec v$ were the same vector function but did not describe a point charge? Or is that impossible?
Answer
Now what is the paradox, exactly?
The paradox is that the vector field $\vec{v}$ obviously points away from the origin and hence seems to have a non-zero divergence; however, when you actually calculate the divergence, it turns out to be zero.
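For concreteness, here is that calculation (I'm assuming the eq. (1.84) mentioned below is the divergence formula in spherical coordinates; since $\vec v$ has only a radial component, $v_r = \frac{1}{r^2}$, only the radial term survives): $$\nabla \cdot \vec v = \frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2 v_r\right) = \frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2 \cdot \frac{1}{r^2}\right) = \frac{1}{r^2}\frac{\partial}{\partial r}(1) = 0 \qquad (r \neq 0)$$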
How are we supposed to believe that the source of $\vec v$ is concentrated at the origin mathematically?
The most important point to observe is that $\nabla \cdot \vec v = 0$ everywhere except at the origin: the diverging field lines all emanate from that single point. The calculation above cannot account for this, since $\vec v$ blows up at $r = 0$; indeed, eq. (1.84) is not even valid there. In other words, $\nabla \cdot \vec v \rightarrow \infty$ at that point.
However, if you apply the divergence theorem, you will find $$\int \nabla \cdot \vec v \ \text{d}V = \oint \vec v \cdot \text{d}\vec a = 4 \pi$$ Irrespective of the radius of the sphere centred at the origin, the surface integral comes out to $4 \pi$. The only possible conclusion is that the entire contribution comes from the single point $r = 0$.
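To see why the radius drops out, evaluate the surface integral over a sphere of radius $R$ centred at the origin, where $\vec v = \frac{\hat r}{R^2}$ and $\text{d}\vec a = R^2 \sin\theta \ \text{d}\theta \ \text{d}\phi \ \hat r$: $$\oint \vec v \cdot \text{d}\vec a = \int_0^{\pi}\!\!\int_0^{2\pi} \frac{1}{R^2}\, R^2 \sin\theta \ \text{d}\phi \ \text{d}\theta = \int_0^{\pi} \sin\theta \ \text{d}\theta \int_0^{2\pi} \text{d}\phi = 2 \cdot 2\pi = 4\pi$$ The factors of $R^2$ cancel, so every such sphere gives $4\pi$, however small it is taken.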
This serves as the motivation for defining the Dirac delta function: an object that vanishes everywhere except at a single point, where it blows up, yet has a finite (unit) integral.
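In three dimensions, with $\delta^3(\vec r)$ defined by $\delta^3(\vec r) = 0$ for $\vec r \neq 0$ and $\int \delta^3(\vec r) \ \text{d}V = 1$, the paradox is resolved by writing $$\nabla \cdot \left(\frac{\hat r}{r^2}\right) = 4\pi \delta^3(\vec r)$$ so that $$\int \nabla \cdot \vec v \ \text{d}V = 4\pi \int \delta^3(\vec r) \ \text{d}V = 4\pi$$ in agreement with the divergence theorem.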