I measured the flow rate from the pump at 18 cc/second. With the tube dumping about 54 calories per second of excess heat (the figure I use below), the water should come out roughly 3 degC hotter than it went in. Sounds fine and dandy.
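For anyone who wants to check my arithmetic, here's a quick Python sketch of the heat balance. The 54 cal/s waste heat is my working figure, and the water properties are textbook values, not anything I measured:

```python
# Quick sanity check of the heat balance (my own sketch):
# how much the cooling water warms up for a given waste heat and flow rate.

CAL_TO_J = 4.184                  # joules per calorie
waste_heat_w = 54 * CAL_TO_J      # ~226 W dumped into the cooling water (assumed)
flow_g_per_s = 18.0               # 18 cc/s of water is roughly 18 g/s
c_water = 4.184                   # specific heat of water, J/(g*K)

delta_t = waste_heat_w / (flow_g_per_s * c_water)
print(f"water temperature rise: {delta_t:.1f} degC")   # prints ~3.0
```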
Then I looked up the thermal conductivity of glass, and it is only just enough.
It's a thermal gradient thing. For the sake of argument, let us assume the glass is 2 mm thick.
If I passed the laser's excess heat (54 calories per second) through 1 square cm of 2 mm aluminium plate, one side would be 21.6 degC hotter than the other. If I did the same with a 2 mm glass plate, one side would be 4320 degC hotter than the other. Glass is not very good at conducting heat.
Fortunately I have more than 1 square cm of glass to carry the heat away. In a 40 W tube there must be at least 150 square cm, so the inside of the glass should only average around 30 degC hotter than the outside.
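Here's the same sort of check for the wall gradient, using dT = Q * d / (k * A). The conductivities are textbook figures I've plugged in (roughly 209 W/m.K for aluminium, about 1.05 W/m.K for glass), so treat it as a sketch rather than gospel:

```python
# Sanity check of the conduction gradient across the tube wall: dT = Q * d / (k * A).
# Conductivities below are assumed textbook values, not measurements.

CAL_TO_J = 4.184
waste_heat_w = 54 * CAL_TO_J      # ~226 W carried through the wall (assumed)
thickness_m = 0.002               # 2 mm wall

def wall_gradient(k_w_per_mk, area_cm2):
    """Temperature drop in degC across a flat wall of the given conductivity and area."""
    area_m2 = area_cm2 * 1e-4
    return waste_heat_w * thickness_m / (k_w_per_mk * area_m2)

print(f"2 mm aluminium, 1 cm^2: {wall_gradient(209, 1):.0f} degC")     # ~22
print(f"2 mm glass, 1 cm^2:     {wall_gradient(1.05, 1):.0f} degC")    # ~4300
print(f"2 mm glass, 150 cm^2:   {wall_gradient(1.05, 150):.0f} degC")  # ~29
```

Those come out close to the 21.6, 4320 and ~30 degC figures above, so at least the arithmetic hangs together.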
So, why does an uncooled laser tube melt the cooling ring off the end and lose power? Can someone explain? I am starting to flounder :drowning:
Here's my best guess. As I understand it, a laser has a fully reflective mirror at one end and a half-silvered mirror at the other. A percentage of the light has to reflect back through the tube to get amplified and keep things ticking. To destroy the tube you would probably need to break the end seal or vapourise the mirrors. I cannot see what else could go wrong, but I am sure something can.
I think my half-silvered mirror is klutzed and not re-energising the tube. It brought the glue (holding the cooled glass ring) up to at least 150 degC to make it let go. But with no way to dump its excess heat, the mirror should have hit red heat PDQ and dropped off, unless it failed in some fashion that left it with less heat to dump.
Have I got it right?