To put it bluntly, weather is described by the Navier-Stokes equations, whose solutions exhibit turbulence, so eventually predictions will become unreliable.
I am interested in a derivation of the time-scale where weather predictions become unreliable. Let us call this the critical time-scale for weather on Earth.
We could estimate this time-scale if we knew some critical length and velocity scales. Since weather basically lives on an $S^2$ with the radius of the Earth, we seem to have a natural candidate for the critical length scale.
So I assume that the relevant length scale is the radius of the Earth (about 6400 km) and the relevant velocity scale is some typical wind speed (say, 25 m/s, but frankly, I am taking this number out of, well, thin air). Then I get a typical time-scale of $2.6\cdot 10^5\,\mathrm{s}$, which is roughly three days.
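Spelled out, the estimate is just the ratio of these two scales:
$$\tau \sim \frac{L}{U} = \frac{6.4\cdot 10^{6}\,\mathrm{m}}{25\,\mathrm{m/s}} \approx 2.6\cdot 10^{5}\,\mathrm{s} \approx 3\ \text{days}.$$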
The result of three days does not seem completely unreasonable, but I would like to see an actual derivation.
Does anyone know how to obtain a more accurate and reliable estimate of the critical time-scale for weather on Earth?
Answer
I am not sure how useful this "back of the envelope" calculation of the reliability of Numerical Weather Prediction is going to be. Several of the assumptions in the question are not correct, and there are other factors to consider.
Here are some corrections:
Weather is three-dimensional and resides on the surface of the planet up to a height of at least 10 km, with the density decreasing roughly exponentially upwards. Many atmospheric phenomena essentially involve the third dimension, such as rising and falling air circulation and the jet streams (at 7-16 km).
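To put a number on that exponential decrease (assuming the usual atmospheric scale height of about 8 km):
$$\rho(z) \approx \rho_0\, e^{-z/H}, \qquad H \approx 8\,\mathrm{km},$$
so at the 10-12 km heights mentioned above the density is already down to roughly a quarter of its surface value.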
The equations are those of fluid dynamics plus thermodynamics. The Navier-Stokes equations are not only too complex to solve, but in a sense also inappropriate for the larger scales. One problem is that they can introduce "high-frequency" effects (akin to every individual gust of wind or lapping of waves) which should be ignored. The earliest weather prediction models went seriously wrong because the high-frequency fluctuations of pressure needed to be averaged rather than directly extrapolated. Here is a possible equation for the temperature tendency at one point of the atmosphere:
$$\frac{\partial T}{\partial t} = \text{solar} + \text{IR}_{\text{in}} + \text{IR}_{\text{out}} + \text{conduction} + \text{convection} + \text{evaporation} + \text{condensation} + \text{advection}$$
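A minimal sketch of how such a point equation could be stepped forward in time (every coefficient below is invented purely for illustration; this is not an actual NWP scheme):

```python
import math

def temperature_tendency(T, t_hours):
    """Sum of heating/cooling terms (K/s) for one grid point; every coefficient is made up."""
    solar = 2e-4 * max(math.sin(2 * math.pi * (t_hours - 6) / 24), 0.0)  # daytime short-wave heating
    ir_in = 5e-5                       # downwelling long-wave from the atmosphere
    ir_out = -8e-5 * (T / 288.0) ** 4  # outgoing long-wave, Stefan-Boltzmann-like
    other = -1e-5                      # conduction + convection + evaporation + condensation + advection, lumped
    return solar + ir_in + ir_out + other

T = 288.0    # initial temperature (K)
dt = 600.0   # time step (s)
for step in range(24 * 6):  # one model day in 10-minute steps
    t_hours = step * dt / 3600.0
    T += dt * temperature_tendency(T, t_hours)

print(f"Temperature after 24 h: {T:.2f} K")
```

A real model does this for every grid point and every prognostic variable at once, which is where the grid-size and boundary-condition issues below come in.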
The regionality of the model is important too. In a global model there will be larger grid sizes, with sources of error from the initial conditions and from the surface and top-of-atmosphere boundary conditions. In a mesoscale prediction there will be smaller grid sizes, but additional sources of error from the conditions fed in at the lateral boundaries. The smallest-scale predictions of airflow around buildings and so on might, however, be a true CFD problem using the Navier-Stokes equations.
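As a rough illustration of why the grid size matters (the spacing here is a typical round number, not tied to any particular operational model): a grid can only resolve features several grid lengths across, so
$$L_{\text{resolved}} \sim \text{a few} \times \Delta x \approx 100\,\mathrm{km} \quad \text{for} \quad \Delta x \approx 25\,\mathrm{km},$$
which is why a global model cannot represent an individual thunderstorm, let alone flow around buildings; those need finer regional grids or a genuine CFD calculation.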
I don't know that any calculation is done to predict the inaccuracies, although the different types of error source, including the numerical-analysis and chaos-related ones, can be studied separately. Models are instead tested against historical data for overall accuracy, with predictions made 6-10 days out.
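One way to make the chaos-related error growth quantitative (a standard textbook estimate, not specific to any operational model): a small error $\delta_0$ in the initial state grows roughly exponentially,
$$\delta(t) \approx \delta_0\, e^{\lambda t},$$
so the forecast stops being useful once $\delta(t)$ reaches some tolerance $\Delta$, at
$$t^{*} \approx \frac{1}{\lambda}\ln\frac{\Delta}{\delta_0},$$
which improves only logarithmically as the initial error is reduced. With error-doubling times for synoptic-scale weather commonly quoted as being on the order of a couple of days, this is consistent with forecasts staying useful out to roughly a week, rather than the three days estimated in the question from a single advective time scale.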
To assume that the atmosphere simply "goes turbulent" after 3 days seems to conflate several of these issues.