So a rod right out of the reactor is emitting radiation at a higher level than one that never went into the reactor, for days rather than minutes?
Apart from anything else, within the bounds of the fuel cladding are now various fission products. Prior to being exposed to the reactor's existing radiation (i.e. proximity to enriched fuel), there would largely have been only spontaneous fission events occurring (not sparked off by a suitably slow neutron impacting a fissionable ²³⁵U nucleus, just ²³⁵U or ²³⁸U breaking up on their own). After exposure to the reactor core, you've got a general excess of previously induced fission making the neutrons whiz around, hitting the ²³⁵U nuclei and sparking off more neutrons, plus a whole host of shorter-lived isotopes doing their own spontaneous/induced fissioning and re-radiating the full spectrum of radiation (nucleonic and electromagnetic).
(There have been a number of 'natural uranium reactors' discovered that, because the ore of the day was richer in ²³⁵U than today's, and had water flowing through to slow the emitted neutrons down enough to let them hit these other nuclei properly, were basically more radioactive than they 'should' have been. Today those deposits are actually less rich in ²³⁵U than they should be, having undergone a slow depletion over many thousands of years, back at the point in deep history when they were 'fizzing away'.)
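To put numbers on that slow depletion: here's a back-of-the-envelope sketch in Python (the half-lives are well-established figures; the rest is just exponential decay run backwards) showing why ore beds two billion years ago were roughly as rich in ²³⁵U as modern reactor fuel.

```python
# Back-of-the-envelope: how much richer in 235U natural uranium was,
# some number of years ago, given each isotope's half-life. Running
# radioactive decay backwards: N(t_ago) = N_today * 2**(t_ago / t_half).

T_HALF_U235 = 7.04e8    # years (well-established)
T_HALF_U238 = 4.468e9   # years (well-established)
U235_TODAY = 0.0072     # present-day atom fraction of 235U

def u235_fraction(years_ago: float) -> float:
    """Atom fraction of 235U in natural uranium `years_ago` years back."""
    n235 = U235_TODAY * 2 ** (years_ago / T_HALF_U235)
    n238 = (1 - U235_TODAY) * 2 ** (years_ago / T_HALF_U238)
    return n235 / (n235 + n238)

for gyr in (0, 1, 2):
    print(f"{gyr} Gyr ago: {u235_fraction(gyr * 1e9):.2%} 235U")
# 0 Gyr ago: 0.72% 235U
# 1 Gyr ago: ~1.6% 235U
# 2 Gyr ago: ~3.7% 235U -- roughly the enrichment of modern
# light-water-reactor fuel, which is why ordinary ground-water
# moderation was enough to get those ore beds 'fizzing'.
```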
...once you put a critical mass together...
Just want to say that while "critical" is the point at which a chain reaction is just self-sustaining (a nuclear bomb going off is "supercritical"), the public perception of "going critical" is the latter situation, so I tend not to use that term. Thinking back, this might have coloured some previous posts I made on the subject in response to some sort of "But it could go critical!" statement.
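For anyone who wants the terminology pinned down numerically: reactor physics wraps all this up in the effective multiplication factor k, the average number of follow-on fissions each fission triggers. A minimal toy sketch (ignoring all real-world neutronics, just the bookkeeping) of what sub-, exactly-, and supercritical do to a neutron population:

```python
def neutron_population(k: float, generations: int, n0: float = 1000.0) -> float:
    """Neutron count after a number of fission generations.

    k is the effective multiplication factor: k < 1 dies away
    (subcritical), k == 1 holds steady (critical, i.e. merely
    self-sustaining), k > 1 grows exponentially (supercritical).
    """
    n = n0
    for _ in range(generations):
        n *= k
    return n

for k in (0.98, 1.00, 1.02):
    print(f"k = {k:.2f}: ~{neutron_population(k, 100):.0f} neutrons after 100 generations")
# k = 0.98: ~133   (fizzles out)
# k = 1.00: 1000   (steady: 'critical' just means self-sustaining)
# k = 1.02: ~7245  (runaway: the 'supercritical' of the public imagination)
```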
Does the actual level of heat in the uranium affect the rate of decay?
Not the natural rate of decay: temperature basically does not affect a nucleus's tendency to spontaneously undergo fission. The fact that the uranium is hot is, though, indicative that induced fission is going on (neutron spray landing on fissionable atoms, sending more neutron spray around the material), so there is a kind of causal connection, but not heat -> decay; more like previous_decay -> heat + more_decay.
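To make that concrete, here's a minimal sketch of what "rate of decay" means numerically; the decay constant comes straight from the half-life, with no temperature term anywhere (the sample numbers are assumptions for scale, roughly caesium-137):

```python
import math

def activity(n_atoms: float, t_half_seconds: float) -> float:
    """Decays per second (A = lambda * N, with lambda = ln(2) / t_half).

    Note that temperature appears nowhere in this formula: heat the
    sample or freeze it, the spontaneous decay rate is unchanged.
    """
    return (math.log(2) / t_half_seconds) * n_atoms

# Illustrative numbers only: one mole of a fission product with a
# ~30-year half-life (roughly caesium-137).
N_AVOGADRO = 6.022e23
YEAR = 365.25 * 24 * 3600
print(f"{activity(N_AVOGADRO, 30 * YEAR):.2e} decays/s")  # ~4.4e14
```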
Interestingly, neutrons emitted by hot fuel, hitting fissionable atoms of hot fuel, are actually *less* likely to spark fission (at least in the materials I'm used to; I hesitate to generalise this across the whole periodic table). This is partly because the atoms are slightly further apart (thermal expansion) and partly because you get faster neutrons from 'hotter' atoms (though temperature is really a moot concept at the atomic level, to be picky), which for various reasons don't induce as many new fission events (i.e. over and above those that might happen naturally from spontaneous fission), and thus controlled or runaway chain reactions aren't as predominant.
Decay definitely produces heat, but heat itself actually squelches induced fission a bit. Not enough to be considered a good negative feedback method, when there's so much going on in a critical or supercritical mass.
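A hand-wavy sketch of the "faster neutrons spark less fission" part: at low energies the ²³⁵U fission cross-section falls off roughly as 1/v (a slower neutron spends longer near each nucleus), and typical neutron speed scales with the square root of temperature. This is a toy 1/v model only, ignoring the real resonance structure:

```python
import math

# Reference point (a standard textbook figure): the 235U fission
# cross-section is ~585 barns for neutrons at 2200 m/s (~293 K).
SIGMA_REF = 585.0   # barns
V_REF = 2200.0      # m/s

def toy_cross_section(temp_kelvin: float) -> float:
    """Toy 1/v estimate of the 235U fission cross-section, in barns.

    Typical neutron speed scales as sqrt(T); in the low-energy '1/v
    region' the cross-section scales inversely with speed, so
    sigma ~ 1/sqrt(T). (Real data has resonances; this is only the
    broad trend.)
    """
    v = V_REF * math.sqrt(temp_kelvin / 293.0)
    return SIGMA_REF * (V_REF / v)

for t in (293, 600, 1200):
    print(f"{t:>5} K: ~{toy_cross_section(t):.0f} barns")
#   293 K: ~585 barns
#   600 K: ~409 barns
#  1200 K: ~289 barns  (hotter fuel -> faster neutrons -> less induced fission)
```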
I previously thought they were cooling the rods simply to prevent them from melting, but from what you say, it sounds more like they're cooling them to slow the reaction.
I'd still say they were stopping them from melting. The mechanism of the natural reactors I mentioned early on was basically that ground-water slowed the faster neutrons, letting them strike more of the (at that time, more abundant) ²³⁵U nuclei, release some energy, spark some more neutrons, etc. But as more energy was released, the ground-water would evaporate, fewer of the neutrons would slow, and the reaction slowed, at least until more water seeped back into the rock again. It's still hard to say that this was therefore a good negative feedback method, though: it was a happenstance occurrence a couple of billion years ago, and had it gone differently it could have looked just like an unenriched ore bed, or like an area composed of rocks of a completely different nature that were the result of a runaway reaction, one which we only know didn't extinguish our distant ancestry because, if it had, we wouldn't have been here to know of it.
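If it helps to see the shape of that feedback loop, here's a crude toy simulation; every coefficient is made up purely for illustration:

```python
def natural_reactor(steps: int = 8) -> None:
    """Toy feedback loop: water moderates -> power rises -> water boils
    off -> moderation (and power) falls -> water seeps back in.

    Every constant here is invented for illustration; only the
    self-limiting *shape* of the loop is the point.
    """
    water = 1.0   # arbitrary units of ground-water present
    for step in range(steps):
        power = 0.1 + 0.9 * water  # more water -> more slow neutrons -> more fission
        water = max(0.0, water - 0.5 * power) + 0.3  # boil-off vs. seepage
        water = min(water, 1.0)
        print(f"step {step}: power={power:.2f}, water={water:.2f}")

natural_reactor()
# Power overshoots, boils water away, and the whole thing settles
# toward a modest equilibrium instead of running away.
```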
*The following was spoilered, but is now out of context...*
Hey, let's drop this really really hot stuff into cold water! What could go wrong?
When diluting certain hydrophilic acids, it is best practice to add the acid to the water rather than the water to the acid. The analogy always presented to me is that if you have a cell full of hungry prisoners, chucking a few loaves of bread in will cause fights until you can get all of them satisfied, whereas leading the prisoners into a room full of bread is a lot less problematic. I'm not saying that this has anything to do with the reasoning of the Chernobyl designers, but it is a counter-example where the same thinking that went wrong here is actually the appropriate approach.
(It's a bit late, and something is niggling away at my mind that I've miswritten something up there somewhere, but I can't find what it is. It could just be that while shuffling my text I've edited a paragraph in at the wrong place, though, rather than any actual mis-fact or a missed-out/incorrectly inserted "not" somewhere...)