In the Drude model (semi-classical, but it should still apply here, I think), the conduction electrons sit in a constant electric field. Collisions with the lattice ions, which happen on average a time $\tau$ after the previous one, reset their velocities to an average of zero; between collisions they accelerate freely and gain a momentum
$$p = qE\tau \;(= mv),$$ where $q$ is the electron's charge and $E$ is the electric field from the applied voltage. In between collisions, then, the electron undergoes a constant acceleration $a = qE/m$, so that $v_f = v_i + a\tau$.
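To put rough numbers on those kinematics, here is a minimal Python sketch of the Drude picture. The relaxation time $\tau \sim 10^{-14}\,$s is my own assumed typical value for a metal like copper; it isn't taken from anything above.

```python
# Drude kinematics: between collisions a charge q in a field E accelerates
# freely for a mean time tau, then a collision resets its velocity.
# tau is an assumed typical relaxation time (~1e-14 s, roughly copper-like).

q   = 1.602e-19   # electron charge [C]
m   = 9.109e-31   # electron mass [kg]
E   = 100.0       # field: 1 V across 1 cm -> 100 V/m
tau = 1e-14       # assumed relaxation time [s]

a = q * E / m          # acceleration between collisions [m/s^2]
p = q * E * tau        # momentum gained per free flight [kg m/s]
v_drift = p / m        # = a * tau, the resulting drift velocity [m/s]

print(f"a       ~ {a:.1e} m/s^2")
print(f"p       ~ {p:.1e} kg m/s")
print(f"v_drift ~ {v_drift:.1e} m/s")
```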
Accelerating charges emit Larmor radiation. So does a DC-carrying wire give off radiation one can measure?
I suspect it does, but at an insanely small magnitude. A quick order-of-magnitude calculation with the following values gives me:
$$q \sim 10^{-19}\,{\rm C}, \quad E \sim \frac{1\,{\rm V}}{1\,{\rm cm}} = 100\,{\rm V/m}, \quad m \sim 10^{-30}\,{\rm kg} \\ \Rightarrow a = \frac{qE}{m} \sim 10^{13}\,{\rm m/s^2} \\ \Rightarrow P = \frac{q^2a^2}{6\pi\epsilon_0c^3} \sim 10^{-28}\,{\rm W}$$
That is obviously pretty small. I'm not sure what the smallest measurable power is, but I'm guessing that increasing the applied voltage would increase the radiated power. But is the idea even fundamentally sound?
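As a sanity check, here is a short Python sketch of the same Larmor estimate with un-rounded constants and the full $6\pi$ factor; it lands around $10^{-27}\,$W per electron, the same ballpark as the rough figure above (the difference is just the rounding of $q$ and $a$).

```python
import math

# Larmor power radiated by a single conduction electron,
# P = q^2 a^2 / (6 pi eps0 c^3), using the same setup as above.

q    = 1.602e-19   # electron charge [C]
m    = 9.109e-31   # electron mass [kg]
E    = 100.0       # 1 V across 1 cm -> 100 V/m
eps0 = 8.854e-12   # vacuum permittivity [F/m]
c    = 2.998e8     # speed of light [m/s]

a = q * E / m                                   # acceleration between collisions
P = q**2 * a**2 / (6 * math.pi * eps0 * c**3)   # Larmor power per electron [W]

print(f"a ~ {a:.1e} m/s^2")
print(f"P ~ {P:.1e} W per electron")
```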