The Hubble parameter $H$ has dimensions of $[T]^{-1}$, and hence there is a natural time-scale for the Universe, $H^{-1}$. In this lecture by Neal Weiner, he says (at around 4:40):
$H^{-1}$ is the time-scale over which the universe changes by $O(1)$.
He also said that this is how particle physicists, unlike cosmologists, think about the time scale $H^{-1}$.
Can someone explain what he means by this statement?
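For concreteness, here is a minimal numeric sketch of the size of this time-scale, assuming the commonly quoted fiducial value $H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}}$ (this value is my assumption, not taken from the lecture):

```python
# Convert the Hubble constant to an inverse time and read off the Hubble time.
# H0 = 70 km/s/Mpc is an assumed fiducial value, not from the lecture.
KM_PER_MPC = 3.0857e19          # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16      # seconds in one gigayear

H0 = 70.0 / KM_PER_MPC          # Hubble constant in s^-1
t_H = 1.0 / H0                  # Hubble time in seconds

print(f"H0  = {H0:.3e} s^-1")                              # ~2.27e-18 s^-1
print(f"t_H = {t_H / SECONDS_PER_GYR:.1f} Gyr")            # ~14 Gyr
```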
Answer
By definition, $H = \dot{a}/a$. In terms of $t_H = H^{-1}$, this reads
$$a = \dot{a} \cdot t_H.$$
So if you assumed a fixed expansion rate $\dot{a} = \text{const}$, the universe would have needed a time $t_H$ to grow to the scale $a$.
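A minimal sketch of that statement, using a matter-dominated toy history $a(t) \propto t^{2/3}$ as an assumed example (any other expansion history would do equally well):

```python
# For a sample expansion history a(t) = t**(2/3), check that a universe
# expanding at the *fixed* rate adot(t0) needs exactly t_H = 1/H(t0)
# to grow from zero to the scale a(t0).
t0 = 1.0                         # "today", in arbitrary units

a = t0 ** (2 / 3)                # scale factor today
adot = (2 / 3) * t0 ** (-1 / 3)  # its time derivative today
H = adot / a                     # Hubble parameter H = adot / a
t_H = 1.0 / H                    # Hubble time

print(f"a(t0)          = {a:.4f}")
print(f"adot(t0) * t_H = {adot * t_H:.4f}")  # equals a(t0) identically
```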
I haven't watched the video, but here's my guess at what the lecturer was getting at:
If you do a Taylor expansion of the scale factor, you end up with
$$\Delta a = \dot{a}(t_0) \cdot \Delta t + O(\Delta t^2).$$
If you want that change to be "$O(1)$", i.e. $\Delta a \approx a(t_0)$, you end up with
$$\Delta t \approx \frac{a(t_0)}{\dot{a}(t_0)} = H(t_0)^{-1}.$$
This of course assumes the validity of our first-order approximation, and I also might be completely wrong about the intended meaning of "changes by $O(1)$".
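To see both the "$O(1)$ change" and the limits of that first-order approximation at once, here is a sketch using an exponentially expanding (de Sitter) universe, $a(t) = e^{Ht}$ with constant $H$, as an assumed example:

```python
import math

# In a de Sitter universe a(t) = exp(H * t), so H(t) = adot/a = H at all times.
# Compare the exact change in a over one Hubble time with the first-order
# Taylor prediction Delta_a = adot(t0) * Delta_t = a(t0).
H = 1.0                                  # arbitrary units
t0 = 0.0
dt = 1.0 / H                             # one Hubble time

a0 = math.exp(H * t0)                    # a(t0) = 1
exact = math.exp(H * (t0 + dt)) - a0     # exact Delta_a = e - 1
first_order = H * a0 * dt                # adot(t0) * dt = a(t0)

print(f"exact Delta_a / a0       = {exact / a0:.3f}")        # ~1.718
print(f"first-order Delta_a / a0 = {first_order / a0:.3f}")  # exactly 1
```

Both estimates give a relative change of order unity over $\Delta t = H^{-1}$, which is the sense of "changes by $O(1)$"; the gap between 1.718 and 1 is exactly the $O(\Delta t^2)$ piece the first-order approximation drops.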