Does differentiation of a vector with respect to a vector make any sense? Even if it does, what physical meaning does it have? That is, what is the physical interpretation?
Answer
Well, a good way in is to think in terms of components. In several areas of physics, the math becomes more intuitive when you work with the components of vectors. So, instead of writing the vector $\mathbf r$ for the position of a particle, you write $x^i$ for the $i$-th component of that vector. The upper index indicates a contravariant component, as opposed to the $i$-th component of a covariant vector, $x_i$. In Euclidean geometry this distinction is irrelevant, so let's forget about it; I will always use lower indices.
Say you have a scalar function $\phi$, dependent on position: $\phi(\mathbf r(t))$. In component notation: $\phi(x_i(t))$. Its time derivative: $$ \frac{d\phi}{dt} = \sum_i \frac{\partial\phi}{\partial x_i} \frac{d x_i}{dt}. $$
So, how would one write this in vector notation? By formally "dividing" by the vector itself, since $x_i$ represents a component of $\mathbf r$: $$ \frac{d\phi}{dt} = \frac{d \phi}{d \mathbf r} \frac{d\mathbf r}{dt}. $$
This is just the chain rule. In this small example, the derivative of the scalar function with respect to a vector is what you would call the gradient: $$ \frac{d \phi}{d \mathbf r} = \nabla\phi \quad\Longrightarrow\quad \frac{d\phi}{dt} = \nabla\phi\cdot\frac{d\mathbf r}{dt}. $$
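As a quick numerical check, here is a minimal sketch in NumPy comparing the chain-rule expression $\nabla\phi\cdot d\mathbf r/dt$ against a direct finite-difference derivative of $\phi(\mathbf r(t))$. The field and trajectory below are made up purely for illustration; they are not part of the argument above.

```python
import numpy as np

# Hypothetical scalar field phi(x, y, z) = x^2 * y + sin(z)
def phi(r):
    x, y, z = r
    return x**2 * y + np.sin(z)

# Analytic gradient of the field above
def grad_phi(r):
    x, y, z = r
    return np.array([2 * x * y, x**2, np.cos(z)])

# Hypothetical particle trajectory r(t) and its velocity dr/dt
def r_of_t(t):
    return np.array([np.cos(t), np.sin(t), t])

def dr_dt(t):
    return np.array([-np.sin(t), np.cos(t), 1.0])

t = 0.7

# Chain rule: d(phi)/dt = grad(phi) . dr/dt
chain_rule = grad_phi(r_of_t(t)) @ dr_dt(t)

# Direct central-difference derivative of phi(r(t)) with respect to t
h = 1e-6
direct = (phi(r_of_t(t + h)) - phi(r_of_t(t - h))) / (2 * h)

print(chain_rule, direct)  # the two values should agree to high precision
```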
Similarly, suppose that instead of a scalar field we have a vector field $\mathbf E = \mathbf E(\mathbf r(t))$, say an electric field. In component notation: $E_i = E_i(x_k(t))$. The time derivative is: $$ \frac{dE_i}{dt} = \sum_k \frac{\partial E_i}{\partial x_k} \frac{d x_k}{dt} \quad\Longrightarrow\quad \frac{d\mathbf E}{dt} = \frac{d\mathbf E}{d\mathbf r} \frac{d\mathbf r}{dt} $$
This one is a bit trickier, but the component notation makes it clear: the object has two indices instead of one, so it is of rank two. Yes, a matrix! Let's call it $J$ and write it in component notation as $J_{ik}$: $$ J_{ik} = \frac{\partial E_i}{\partial x_k} = \left(\frac{d\mathbf E}{d\mathbf r}\right)_{ik} $$
That matrix is called the Jacobian matrix. So it does make sense to "differentiate" with respect to a vector, once you look at the component notation.
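To make the Jacobian concrete, here is a small sketch with a hypothetical field $\mathbf E$ (again, invented for illustration rather than taken from the physics above) that builds $J_{ik} = \partial E_i/\partial x_k$ and uses it to evaluate $d\mathbf E/dt = J\,d\mathbf r/dt$:

```python
import numpy as np

# Hypothetical vector field E(x, y, z) = (x*y, y*z, x*z)
def E(r):
    x, y, z = r
    return np.array([x * y, y * z, x * z])

# Analytic Jacobian J_ik = dE_i / dx_k of the field above
def jacobian_E(r):
    x, y, z = r
    return np.array([
        [y,   x,   0.0],
        [0.0, z,   y],
        [z,   0.0, x],
    ])

r = np.array([1.0, 2.0, 3.0])   # position at some instant
v = np.array([0.5, -1.0, 2.0])  # velocity dr/dt at that instant

# Chain rule in matrix form: dE/dt = J @ dr/dt
dE_dt = jacobian_E(r) @ v

# Central-difference check of the same directional derivative
h = 1e-6
direct = (E(r + h * v) - E(r - h * v)) / (2 * h)

print(dE_dt, direct)  # the two vectors should agree to high precision
```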
For the sake of curiosity: the second derivative of the scalar field gives a rank-2 object, a matrix, called the Hessian matrix. The second derivative of the vector field gives rise to rank-3 objects. The rank-$n$ generalization is called a tensor. And when the space is not Euclidean, one can build a tensor that is contravariant of rank $r$ and covariant of rank $s$, i.e., a rank-$(r,s)$ tensor.
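For instance, here is a minimal sketch of the Hessian $H_{ik} = \partial^2\phi/\partial x_i\partial x_k$ for the same kind of hypothetical scalar field used above; it is simply the Jacobian of the gradient, which the finite-difference check below confirms column by column.

```python
import numpy as np

# Analytic gradient of the hypothetical field phi(x, y, z) = x^2 * y + sin(z)
def grad_phi(r):
    x, y, z = r
    return np.array([2 * x * y, x**2, np.cos(z)])

# Analytic Hessian of the same field (a symmetric 3x3 matrix)
def hessian_phi(r):
    x, y, z = r
    return np.array([
        [2 * y, 2 * x, 0.0],
        [2 * x, 0.0,   0.0],
        [0.0,   0.0,   -np.sin(z)],
    ])

r = np.array([1.0, 2.0, 0.5])
H = hessian_phi(r)

# Check: column k of H is the derivative of grad(phi) in the k-th direction
h = 1e-6
for k in range(3):
    e_k = np.zeros(3)
    e_k[k] = 1.0
    col = (grad_phi(r + h * e_k) - grad_phi(r - h * e_k)) / (2 * h)
    print(np.allclose(col, H[:, k]))  # True for each column
```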