Question: How hot is the water in the pot? More precisely speaking, how can I obtain the temperature of the water as a function of time a priori?
Background & My attempt: Recently I started spending some time on cooking, and I'm curious about it. I studied mathematics as an undergraduate for four years, but I know little about thermodynamics. (I attended such a lecture once, so I've heard of $dU = T\,dS - p\,dV$, entropy, and Gibbs energy, for example, though I've forgotten almost everything; in any case, I don't think I've ever seen a formula depending on time.) So I conducted a small experiment first: I heated 100 ml of water on an IH (induction heating) cooktop at approximately 700 W and measured its temperature every 30 seconds. Here are the results.
It looks almost linear, but I think a linear approximation is inappropriate, because then the water would get hotter than $100^\circ\mathrm{C}$. So I guess it's some concave increasing function like $T(t) = 100 - \alpha e^{-t/\beta}$ for some positive constants $\alpha$ and $\beta$. But it doesn't fit the data. (In fact it does fit the data; I just made a simple calculation mistake. See my answer.)
I think I ignored too many factors, so feel free to assume anything reasonable. I would greatly appreciate it if you could help me. Thank you.
Additional question: I did an experiment and some calculations to address a problem pointed out in the comments on my answer: the poor fit at lower temperatures. However, I cannot get a better solution; the fit seems worse than before... Here are the results I got. I heated 100 ml of water in a pot of 9 cm radius on an IH cooktop at approximately 700 W. (For the calculation, I added linearly interpolated values to the graph.) How can I get a better fit? (The light blue curve is the logistic approximation $T = \dfrac{100}{1 + 1.62 e^{-0.0168 t}}$, as mentioned here.)
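(For reference, here is a minimal Python sketch, assuming numpy and matplotlib, that draws the logistic approximation above next to the exponential model from my answer below; the $20\,\mathrm{^\circ C}$ starting temperature is a placeholder, not a measured value.)

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0.0, 300.0, 301)                  # time [s]

# Logistic approximation quoted above.
T_logistic = 100.0 / (1.0 + 1.62 * np.exp(-0.0168 * t))

# Exponential model from the answer, T = 100 + (T0 - 100) exp(-k t),
# with the fitted k ~ 0.00667 1/s and a placeholder T0 = 20 °C.
k, T0 = 0.00667, 20.0
T_exp = 100.0 + (T0 - 100.0) * np.exp(-k * t)

plt.plot(t, T_logistic, label="logistic")
plt.plot(t, T_exp, label="exponential")
plt.xlabel("time [s]")
plt.ylabel("temperature [°C]")
plt.legend()
plt.show()
```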
Answer
To close this post, I am writing an answer myself, though it turned out that I had just made a simple calculation mistake.
Since the temperature increase should be monotonic and approach zero at the boiling point, it is reasonable to assume that the rate of increase $dT/dt$ is proportional to the difference $T - 100\,\mathrm{^\circ C}$, that is, $$ \frac{dT}{dt} = -k(T - 100\,\mathrm{^\circ C}) $$ for some positive constant $k$. Solving this equation gives $$ T = 100\,\mathrm{^\circ C} + (T(t_0) - 100\,\mathrm{^\circ C})\,e^{-k(t - t_0)}. $$
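As a sanity check of this solution, here is a small sketch that integrates the ODE with a forward-Euler step and compares the result to the closed form; the rate $k$ and the initial temperature of $20\,\mathrm{^\circ C}$ are placeholder values, not fitted ones.

```python
import numpy as np

k, T0, dt, t_end = 0.00667, 20.0, 0.1, 300.0   # placeholder rate, start temp, step, horizon

# Forward-Euler integration of dT/dt = -k (T - 100 °C).
T = T0
for _ in range(int(t_end / dt)):
    T += -k * (T - 100.0) * dt

# Closed-form solution with t_0 = 0: T(t) = 100 + (T0 - 100) exp(-k t).
T_exact = 100.0 + (T0 - 100.0) * np.exp(-k * t_end)
print(f"Euler: {T:.3f} °C,  exact: {T_exact:.3f} °C")
```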
Let's determine the coefficient $k$ from $N$ measurements by linear regression. Let $c$ be the time interval between measurements and $x_n$ the temperature of the water at $t_n = cn$. Estimate the slope of the tangent line by the central difference $$ y_n = \mathrm{mean}\Big(\frac{x_{n + 1} - x_{n}}{c}, \frac{x_{n} - x_{n - 1}}{c}\Big) = \frac{x_{n + 1} - x_{n - 1}}{2c} $$ for $0 < n < N$. From the equation above, there should be a relation of the form $$ y_n = -k(x_n - 100\,\mathrm{^\circ C}) + \varepsilon_n, $$ where $\varepsilon_n$ stands for the experimental error. Writing $x$ for the vector with entries $x_n - 100\,\mathrm{^\circ C}$, and $y$ and $\varepsilon$ likewise, I abbreviate this as $$ y = -kx + \varepsilon. $$ The least-squares estimator $\hat{k}$ is the value that makes the residual $\varepsilon$ orthogonal to $x$; therefore $$ \hat{k} = -\frac{(x, y)}{(x, x)} \approx 0.00667\,\mathrm{s^{-1}} $$ by calculation, and this value fits the experimental data well.
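For concreteness, here is a short Python sketch of this procedure. Since I don't reproduce the raw measurement table here, it generates synthetic data from the closed-form solution (placeholder $k$ and initial temperature) and then recovers $\hat{k}$ with the central-difference regression described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic measurements from the closed-form solution (illustrative only):
# x_n = T(t_n) + noise, sampled every c = 30 s.
c, k_true, T0 = 30.0, 0.00667, 20.0
t = c * np.arange(11)
x = 100.0 + (T0 - 100.0) * np.exp(-k_true * t) + rng.normal(0.0, 0.3, t.size)

# Central differences y_n = (x_{n+1} - x_{n-1}) / (2c) for 0 < n < N.
y = (x[2:] - x[:-2]) / (2.0 * c)
d = x[1:-1] - 100.0                # regressor: x_n - 100 °C

# Least squares: k_hat = -(x, y) / (x, x).
k_hat = -np.dot(d, y) / np.dot(d, d)
print(f"k_hat = {k_hat:.5f} 1/s (true value {k_true})")
```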
Note: I have completely rewritten this answer. Here I would like to review where my original answer went wrong. It seems to be a problem not of physics but of statistics. Last time, I solved the DE first and took the logarithm to make it linear. However, the experimental errors were transformed as well: in particular, $\ln(100 - x_n) \to -\infty$ as $x_n \to 100$. This seems to cause overfitting at higher temperatures and a bad fit at lower temperatures. (Accounting for the effect of the pot looks like a good idea, but everything I tried failed: it won't fit the data, and the problem remains open, though I already have a reasonable approximation.)
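To illustrate the pitfall, here is a sketch along the same lines, again with synthetic data generated from the model rather than my actual measurements: the log transform $\ln(100 - x_n) = \ln\alpha - k t_n$ blows up small noise near $100\,\mathrm{^\circ C}$, biasing the fitted $k$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic measurements from the model (illustrative only).
c, k_true, T0 = 30.0, 0.00667, 20.0
t = c * np.arange(31)                       # run long enough to get close to 100 °C
x = 100.0 + (T0 - 100.0) * np.exp(-k_true * t) + rng.normal(0.0, 0.3, t.size)

# Log-transform fit: ln(100 - x) = ln(alpha) - k t.
# Near the boiling point, 100 - x is tiny, so additive noise of a few
# tenths of a degree becomes huge (or undefined) on the log scale.
mask = x < 100.0                            # drop points the noise pushed past 100 °C
slope, _ = np.polyfit(t[mask], np.log(100.0 - x[mask]), 1)
print(f"log-transform fit: k = {-slope:.5f} 1/s (true value {k_true})")
```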
Thank you so much for helping me, Chris, Stefan Bischof, Michael Brown and Christoph.