As I study Jackson, I am getting really confused by some of its key definitions. Here is what confuses me. When we substituted the electric and magnetic fields in terms of the scalar and vector potentials in the inhomogeneous Maxwell equations, we got two coupled inhomogeneous wave equations in terms of $\mathbf{A}$ and $\phi$. The book states that to uncouple them, which definitely makes our equations simpler to solve, we introduce gauge transformations, since adding a gradient to $\mathbf{A}$ (with a corresponding adjustment to $\phi$) would not affect their meaning. My question is: which quantity is the gauge, and why, in the expression for a gauge transformation $$\mathbf{A}' = \mathbf{A} + \nabla\gamma.$$ Somewhere on the internet, I read that $\gamma$ is a gauge function. So, is $\gamma$ a gauge, and if yes, then why?
Basically: What is a gauge?
Answer
In normal usage, a gauge is a particular choice, or specification, of vector and scalar potentials $\mathbf A$ and $\phi$ which will generate a given set of physical force fields $\mathbf E$ and $\mathbf B$.
More specifically, a physical situation is specified by the electric and magnetic fields, $\mathbf E$ and $\mathbf B$. A set of potentials $\mathbf A$ and $\phi$ generates the force fields if it obeys the equations \begin{align} \mathbf B & =\nabla\times\mathbf A \\ \mathbf E & = -\nabla\phi-\frac{\partial \mathbf A}{\partial t}. \end{align} As you know, for a given set of force fields, the potentials are not unique. A gauge is a specific, additional requirement on the potentials. One good example of a gauge is the Coulomb gauge, which is mostly embodied by the requirement that $\mathbf A$ also be divergenceless, $$\nabla \cdot\mathbf A=0.$$ "The Coulomb gauge" refers to the set of potentials which satisfy this.
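These two defining equations can be checked symbolically, together with the standard full gauge transformation $\mathbf A' = \mathbf A + \nabla\chi$, $\phi' = \phi - \partial\chi/\partial t$. Here is a quick sketch using Python's sympy; the particular potentials and gauge function below are arbitrary illustrative choices, not anything from the text:

```python
# Symbolic sanity check (sympy): the transformed potentials
# A' = A + grad(chi), phi' = phi - d(chi)/dt
# generate exactly the same E and B as the original pair.
import sympy as sp

x, y, z, t = sp.symbols('x y z t', real=True)

# Arbitrary illustrative potentials and gauge function
A = sp.Matrix([y*t, -x*z, sp.sin(t)*x])
phi = x*y + t**2
chi = sp.exp(x)*sp.cos(t) + y*z*t

def grad(f):
    return sp.Matrix([sp.diff(f, v) for v in (x, y, z)])

def curl(F):
    return sp.Matrix([
        sp.diff(F[2], y) - sp.diff(F[1], z),
        sp.diff(F[0], z) - sp.diff(F[2], x),
        sp.diff(F[1], x) - sp.diff(F[0], y),
    ])

def fields(A, phi):
    """B = curl A,  E = -grad phi - dA/dt."""
    return -grad(phi) - sp.diff(A, t), curl(A)

A2 = A + grad(chi)
phi2 = phi - sp.diff(chi, t)

E1, B1 = fields(A, phi)
E2, B2 = fields(A2, phi2)

print(sp.simplify(E1 - E2))  # zero vector
print(sp.simplify(B1 - B2))  # zero vector
```

The check works for any smooth $\chi$: the gradient drops out of the curl, and the $\nabla(\partial\chi/\partial t)$ and $\partial(\nabla\chi)/\partial t$ terms cancel in $\mathbf E$.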
Gauges are usually thought of as specifying the potentials uniquely. This is not really true, but they do tend to specify the potentials "uniquely up to reasonable physical assumptions". The Coulomb gauge is a good example of this: the gauge transformation to \begin{align} \mathbf A'&=\mathbf A+\nabla \chi(\mathbf r)\\ \phi'&=\phi \end{align} preserves the physical fields, and if $$\nabla^2 \chi(\mathbf r)=0$$ then it also preserves the gauge condition that $\nabla \cdot\mathbf A'=0$. This is not great for uniqueness, because there are a lot of harmonic functions that satisfy the above condition. However, if a function is harmonic throughout all of space - with no exceptions and no singularities - then it must diverge at infinity (unless it is constant), which is not really palatable in most cases. Because of that, saying that $\mathbf A$ is the vector potential in the Coulomb gauge usually means that $\nabla \cdot\mathbf A=0$ and that such 'infinite-self-energy' terms have been set to zero; this usually yields a unique set of potentials in situations where the energy in the physical fields themselves is not infinite.
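This residual freedom is easy to exhibit symbolically (again with sympy; the divergence-free $\mathbf A$ and the harmonic $\chi$ below are made-up examples). Note that the chosen $\chi$ is a polynomial that indeed diverges at infinity, just as the argument above requires:

```python
# Check that adding the gradient of a harmonic function to a
# divergence-free A preserves the Coulomb gauge condition div A = 0.
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)

# Divergence-free vector potential (illustrative)
A = sp.Matrix([-y, x, 0])

# A harmonic gauge function: laplacian(chi) = 0, but chi grows at infinity
chi = x**2 + y**2 - 2*z**2

grad_chi = sp.Matrix([sp.diff(chi, v) for v in (x, y, z)])
A2 = A + grad_chi

div = lambda F: sum(sp.diff(F[i], v) for i, v in enumerate((x, y, z)))
lap = lambda f: sum(sp.diff(f, v, 2) for v in (x, y, z))

print(lap(chi))  # 0 -> chi is harmonic
print(div(A2))   # 0 -> the Coulomb condition survives the transformation
```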
It is worth noting that, in certain situations, the word gauge can be naturally free of this ambiguity. In my field, strong-field physics, the words 'length gauge' and 'velocity gauge' are taken to mean that the total energy of an electron interacting with a laser field, at position $\mathbf r$ and with momentum $\mathbf p$, is of the form $$E=\tfrac1{2m}\mathbf p^2-e\mathbf r\cdot \mathbf E$$ and $$E=\tfrac1{2m}\left(\mathbf p-e\mathbf A\right)^2,$$ respectively. For a uniform field (i.e. in the 'dipole approximation') the two energies are equivalent via a gauge transformation. However, here the word 'gauge' is completely unambiguous except for a total constant energy which can very safely be ignored.
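For a uniform field one can exhibit this gauge transformation explicitly: starting from velocity-gauge potentials $\phi = 0$, $\mathbf A = \mathbf A(t)$, the gauge function $\chi = -\mathbf r\cdot\mathbf A(t)$ gives $\mathbf A' = 0$ and $\phi' = -\mathbf r\cdot\mathbf E$, whose interaction energy $q\phi'$ is the length-gauge $\mathbf r\cdot\mathbf E$ term (up to the charge-sign convention). A sympy sketch, where the component functions of $\mathbf A(t)$ are placeholders:

```python
# Velocity gauge -> length gauge in the dipole approximation,
# via the gauge function chi = -r . A(t).
import sympy as sp

x, y, z, t = sp.symbols('x y z t', real=True)
r = sp.Matrix([x, y, z])

# Uniform field: A depends on time only (placeholder components)
A = sp.Matrix([sp.Function('Ax')(t), sp.Function('Ay')(t), sp.Function('Az')(t)])
phi = 0
E = -sp.diff(A, t)  # E = -dA/dt, since grad(phi) = 0 here

chi = -(r.T * A)[0]

grad_chi = sp.Matrix([sp.diff(chi, v) for v in (x, y, z)])
A2 = A + grad_chi              # -> zero vector: no A in the length gauge
phi2 = phi - sp.diff(chi, t)   # -> -r . E: the dipole interaction term

print(sp.simplify(A2))
print(sp.simplify(phi2 + (r.T * E)[0]))  # 0, i.e. phi2 = -r . E
```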
So much for the technical matters. I think, though, that a lot of what worries you is the word 'gauge' itself, which is indeed a weird choice. In everyday usage, a gauge is a generic form of meter or dial. The phrase 'gauge invariance' seems to have come into physics via German, through Hermann Weyl's use of the word 'Eichinvarianz', which loosely means 'scale invariance' or 'gauge invariance' - in the sense that a choice of measuring instrument (a gauge) determines the measured physical values in a given setting, i.e. determines the scale.
This invariance under changes of scale is exactly (part of) the technical gauge invariance in general relativity, namely its invariance under coordinate transformations.
Note, though, that my source for this history is Wikipedia, so if someone can chime in with a better source it would be fantastic.