Update: my question is really whether alternating current through a simple resistor has a mass-based inertia that introduces lag when the frequency of oscillation is high enough. Are there ever situations where inertial lag would be relevant? I'm not asking about the parasitic inductance of real materials.
I have a question about whether an analogy exists between:

1) a circuit with an alternating square-wave voltage source and a resistor (no capacitor, no inductor)
2) a ball thrown upward into the air against gravity with drag

My question is whether a transient slosh or lag of the current through a resistor (no C, no L) is known to occur. I started thinking about this because air resistance and RC circuits are both described by similar first-order differential equations. But Ohm's law imposes a constraint on initial conditions for the circuit that doesn't exist for the ball: current through a resistor always flows from high to low voltage.
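For concreteness, the two first-order equations I have in mind (the symbols are mine, not from the textbook) are the ball with linear drag and the charging RC circuit:

```latex
\begin{align}
  m \frac{dv}{dt} &= -mg - b\,v          && \text{(ball of mass $m$, drag coefficient $b$)} \\
  R \frac{dq}{dt} &= V_s - \frac{q}{C}   && \text{(RC circuit driven by source voltage $V_s$)}
\end{align}
```

Both relax exponentially toward a steady state, with time constants $m/b$ and $RC$ respectively. The asymmetry I'm asking about is that the ball's initial velocity can point against the drag force, while Ohm's law seems to fix the sign of the current relative to the voltage from the start.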
To conceptualize the current in the resistor, I'm imagining removing the capacitor and considering an AC circuit with only a power source and a resistor (no C, no L). The textbook I'm reading (Halliday & Resnick) shows that current and voltage stay in phase for a sinusoidal AC source in this circuit. This seems to follow from Ohm's law in DC circuits, where current is defined to flow from high to low voltage. It seems to me that Ohm's law prevents an initial condition where the current moves against its voltage gradient, in contrast to the analogy of throwing a ball up into the air.
I understand that the change in electric field and electrostatic potential propagates very rapidly (nearly the speed of light) through the circuit, but it seems like the charge carriers have still acquired a momentum that takes a certain amount of time to change. Is such a lag measurable in some simple AC resistor circuits, perhaps with very high frequency square waves? Perhaps also high-amplitude square waves? I don't know how it would be measured, or whether it would mean charge accumulation in some part of the circuit. Oscilloscopes measure voltage, and I'm trying to imagine the current. I also know that if the current were measured by introducing a capacitor, the capacitor itself would introduce a phase shift between voltage and current. My question is about isolating any phase-shift contribution from the resistor itself.

I imagine this is equivalent to asking whether the impedance of a resistor is truly frequency- and amplitude-independent. I know this isn't the way introductory textbooks present circuits, but I'm curious whether this is a known real-world phenomenon in some cases, perhaps in device physics.
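As a rough sanity check on whether carrier inertia could ever matter, here is a back-of-envelope sketch (my own estimate, not from the textbook) using the Drude model, in which the electron momentum relaxation time tau = m*sigma/(n*e^2) sets the frequency scale above which the current would start to lag the field. Standard values for copper are assumed.

```python
# Back-of-envelope Drude estimate of the inertial lag time scale in copper.
# tau = m * sigma / (n * e^2) is the momentum relaxation time; only for
# frequencies approaching 1/(2*pi*tau) would carrier inertia make a
# resistor's impedance noticeably frequency dependent.
import math

m_e   = 9.109e-31   # electron mass, kg
e     = 1.602e-19   # elementary charge, C
sigma = 5.96e7      # DC conductivity of copper, S/m
n     = 8.5e28      # conduction-electron density of copper, m^-3

tau = m_e * sigma / (n * e**2)          # momentum relaxation time, s
f_inertial = 1.0 / (2 * math.pi * tau)  # frequency where the lag becomes order unity

print(f"tau ~ {tau:.2e} s")
print(f"f   ~ {f_inertial:.2e} Hz")
```

With these numbers tau comes out around 2.5e-14 s, so the inertial lag would only become appreciable at terahertz frequencies, far beyond any ordinary square-wave source; this is an assumption-laden estimate rather than an established answer to the question.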