Suppose a region of space at a distance D from Earth is receding from us with velocity v. Since the expansion of the universe is accelerating, things at distance D from Earth should be receding faster and faster. From the equation v = HD, it would then seem that H is growing with time. However, in other sources we can read that the Hubble "constant" is getting smaller with time. How come?
Answer
Let's first derive Hubble's Law. Consider a galaxy at present distance $x$ from us. If we ignore its peculiar velocity (its local motion within its cluster), its proper distance will change over time as $$ D(t) = a(t)\,x, $$ where $a(t)$ is the so-called scale factor, normalized so that $a = 1$ today, and $x$ is the galaxy's fixed comoving distance. Taking the time derivative, we get $$ \dot{D} = \dot{a}\,x = \left(\frac{\dot{a}}{a}\right)ax, $$ which we can write as $$ v = HD, $$ where $v = \dot{D}$ is the recession velocity and $H = \dot{a}/a$ is the Hubble parameter. This is the famous Hubble Law.

In other words, the Hubble parameter is a ratio of two quantities. The expansion of the universe means that $a(t)$ increases over time, so $\dot{a} > 0$. In fact, the expansion of the universe is accelerating, which means that $\dot{a}(t)$ increases over time as well ($\ddot{a} > 0$). Nevertheless, the ratio $H = \dot{a}/a$ decreases, because $a(t)$ grows proportionally faster than $\dot{a}(t)$: the acceleration is real, but it is not strong enough to make $\dot{a}$ keep pace with the growth of $a$ itself, so $H(t)$ still falls.
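To see concretely how $\dot{a}$ can grow while $H$ shrinks, consider a simple power-law toy model (an illustration only, not the actual expansion history of our universe): $$ a(t) = C\,t^{p}, \qquad p > 1. $$ Then $\ddot{a} = p(p-1)\,C\,t^{p-2} > 0$, so the expansion accelerates and $\dot{a} = p\,C\,t^{p-1}$ grows without bound, and yet $$ H(t) = \frac{\dot{a}}{a} = \frac{p}{t} $$ steadily decreases. An increasing $\dot{a}$ and a decreasing $H$ are perfectly compatible.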
Note that if the expansion were exactly exponential, $a(t)\sim\exp(Ht)$, then $H$ would be constant. In the standard cosmological model, the universe is evolving towards a state of exponential expansion, dominated by a constant dark energy density. In other words, if the standard model is correct, the Hubble parameter is slowly converging towards a constant value, and the expansion of the universe is approaching an exponential rate.
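For a quantitative feel, here is a minimal numeric sketch (my own illustration, not part of the original answer) that integrates the first Friedmann equation for a flat universe containing matter and a cosmological constant. The parameter values ($H_0 \approx 0.07\ \mathrm{Gyr}^{-1}$, i.e. roughly 68 km/s/Mpc, with $\Omega_m = 0.3$ and $\Omega_\Lambda = 0.7$) are assumed, roughly Planck-like numbers. It shows $\dot{a}$ growing (acceleration) while $H = \dot{a}/a$ falls towards the constant limit $H_0\sqrt{\Omega_\Lambda}$:

```python
import numpy as np

# Sketch: integrate the first Friedmann equation for flat matter + Lambda,
#   da/dt = a * H0 * sqrt(Om / a**3 + OL),
# and watch da/dt grow while H = (da/dt)/a shrinks toward H0 * sqrt(OL).
# Parameter values below are illustrative assumptions, roughly Planck-like.

H0 = 0.07          # Hubble constant in 1/Gyr (~ 68 km/s/Mpc)
Om, OL = 0.3, 0.7  # matter and dark-energy density parameters (assumed)

def a_dot(a):
    """Expansion speed da/dt from the Friedmann equation (flat case)."""
    return a * H0 * np.sqrt(Om / a**3 + OL)

# Simple 4th-order Runge-Kutta integration of da/dt = a_dot(a),
# starting from a = 1 (today) and evolving 20 Gyr into the future.
a, t, dt = 1.0, 0.0, 0.01
for _ in range(2000):
    k1 = a_dot(a)
    k2 = a_dot(a + 0.5 * dt * k1)
    k3 = a_dot(a + 0.5 * dt * k2)
    k4 = a_dot(a + dt * k3)
    a += dt * (k1 + 2*k2 + 2*k3 + k4) / 6
    t += dt

print(f"after {t:.0f} Gyr: a = {a:.2f}")
print(f"a_dot grew from {a_dot(1.0):.4f} to {a_dot(a):.4f} per Gyr (acceleration)")
print(f"H fell from {H0:.4f} to {a_dot(a)/a:.4f}, limit H0*sqrt(OL) = {H0*np.sqrt(OL):.4f}")
```

Running this, $\dot{a}$ roughly triples over 20 Gyr while $H$ drops from $H_0$ towards $H_0\sqrt{\Omega_\Lambda}$, which is exactly the convergence towards exponential (de Sitter) expansion described above.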