So, this is a completely random and trivial question prompted by looking at my microwave oven and the back of a TV dinner. My Google searching failed to produce a meaningful answer, so I figured I'd ask here.
My TV dinner box lists different cook times based on microwave oven wattage:
1100 Watts - Cook 2 minutes, stir and cook for another 1.5 minutes. (3.5 minutes total)
700 Watts - Cook 3 minutes, stir and cook for another 2.5 minutes. (5.5 minutes total)
My oven is 900 Watts, which is right in the middle.
Assuming the times listed on the box are the scientifically optimal cook times (which is doubtful, but just go with me), is it fair to assume I should use the linear average (cook 2.5 minutes, stir, and cook for another 2 minutes, 4.5 minutes total), or is there a different scaling between 700 watt and 1100 watt ovens that would change the optimal cook time?
Answer
The rate at which a mass absorbs microwave radiation is characterized by the 'Specific Absorption Rate' (SAR), which is proportional to the electromagnetic field intensity. Wikipedia has a dedicated article on this quantity, which in short gives

$$\mathrm{SAR} = \int_{\text{sample}} \frac{\sigma(\vec{r})\,\left|\vec{E}(\vec{r})\right|^{2}}{\rho(\vec{r})}\,\mathrm{d}\vec{r}$$
Because the absorption rate is proportional to the EM field intensity, $|\vec{E}(\vec{r})|^{2}$, which is in turn proportional to the oven's power output, the relationship is indeed linear.
Assuming 100% energy efficiency (which is a wild overestimate: 20% might be more accurate, but I do not know the exact figure), the total energy transferred to your dinner will be:
$$\text{Energy} = \text{Power} \times \Delta t,$$

i.e.

$$\Delta t = \frac{\text{Energy}}{\text{Power}}.$$
The cook time will be inversely proportional to your oven power.
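To put that another way (with $\eta$ here standing for an assumed, wattage-independent heating efficiency, not a figure from the box), the energy $E$ the dinner needs is fixed, so

$$\eta\,P_{1}\,\Delta t_{1} = E = \eta\,P_{2}\,\Delta t_{2} \quad\Rightarrow\quad \Delta t_{2} = \Delta t_{1}\,\frac{P_{1}}{P_{2}},$$

i.e. the efficiency cancels as long as it is roughly the same for both ovens, and only the power ratio matters.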
1100 W for 3.5 minutes gives Energy = 1100 W × 210 s = 231,000 J.
700 W for 5.5 minutes gives Energy = 700 W × 330 s = 231,000 J.
Thus a 900 W oven would need

$$\Delta t = \frac{\text{Energy}}{\text{Power}} = \frac{231{,}000\ \text{J}}{900\ \text{W}} \approx 256.7\ \text{s} \approx 4.28\ \text{minutes}.$$
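If you want to plug in a different wattage, here is a minimal Python sketch of the same constant-energy calculation. The function names and the idea of splitting the total time into cook/stir/cook stages in the same proportion as the box instructions are my own choices, not anything printed on the box.

```python
# Constant-energy model: the box times imply a fixed energy the dinner needs,
# so cook time scales as 1/power. Splitting the stages proportionally is an
# assumption on my part, not an instruction from the box.

# Box data: wattage -> [cook stage, second cook stage] in minutes
BOX = {
    1100: [2.0, 1.5],   # cook, stir, cook again -> 3.5 min total
    700:  [3.0, 2.5],   # cook, stir, cook again -> 5.5 min total
}

def implied_energy_joules(watts, minutes_total):
    """Energy = Power x time, assuming 100% of the rated power reaches the food."""
    return watts * minutes_total * 60.0

def cook_time_seconds(my_watts, energy_joules):
    """Delta t = Energy / Power."""
    return energy_joules / my_watts

# Both box entries should imply (roughly) the same energy if the model holds.
energies = [implied_energy_joules(w, sum(stages)) for w, stages in BOX.items()]
print("Implied energies (J):", energies)          # both 231,000 J

# Time for a 900 W oven, split in the same 2.0 : 1.5 ratio as the 1100 W line.
energy = energies[0]
total_s = cook_time_seconds(900, energy)          # ~256.7 s
stages = BOX[1100]
split = [total_s * s / sum(stages) for s in stages]
print(f"900 W: {total_s / 60:.2f} min total "
      f"({split[0] / 60:.2f} min, stir, {split[1] / 60:.2f} min)")
```

With the box numbers this prints a total of about 4.28 minutes for a 900 W oven, i.e. roughly 2.4 minutes, stir, then another 1.8 minutes.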