According to the theory of impact erosion, the early crust was presumably much richer in heat-producing radioactive isotopes (primarily U and K), and therefore hotter and less congenial for the later development of life. Asteroid bombardment (I suppose during the Late Heavy Bombardment) then blew off a large enough chunk of this material to lower the net radioactivity of the crust.
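For a rough sense of how much those isotopes matter thermally, radiogenic heat production per kilogram of rock can be estimated from elemental concentrations and standard heat-production constants. Here is a minimal back-of-envelope sketch; the constants are the commonly quoted approximate values, and the "early crust" concentrations are purely assumed for illustration, not data:

```python
# Back-of-envelope radiogenic heat production (illustrative only).
# Approximate heat-production constants, in W per kg of element:
H_U  = 9.8e-5   # natural uranium
H_TH = 2.6e-5   # thorium
H_K  = 3.5e-9   # natural potassium (only ~0.01% is radioactive 40K)

def heat_production(c_u, c_th, c_k):
    """Heat production in W per kg of rock, given element concentrations
    as mass fractions (1 ppm = 1e-6)."""
    return c_u * H_U + c_th * H_TH + c_k * H_K

# Hypothetical enriched early crust vs. rough modern upper-crust averages:
early  = heat_production(5e-6, 20e-6, 0.04)       # assumed values
modern = heat_production(2.8e-6, 10.5e-6, 0.025)  # approximate averages
print(f"early: {early:.2e} W/kg, modern: {modern:.2e} W/kg")
```

The point is only that heat production scales linearly with the U, Th and K content, so stripping off an enriched outer layer directly lowers the crust's radiogenic heat budget.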
Was this simply because, for some reason, there was already an upward gradient of radioactivity in the primeval crust? I'm at a loss to conceive of a selective mechanism for such a phenomenon. Any paleo-geologists in the house?
Also, don't forget that even though uranium is dense, it doesn't "sink below the crust". The mantle is solid rock, and uranium is trapped in the crystal lattice of its minerals; but uranium is an incompatible element, meaning it preferentially partitions into the melt when the mantle partially melts. These melts then rise by buoyancy and actually make up the bulk of the continental crust. So it makes perfect sense that the crust is uranium-enriched and the mantle uranium-depleted. Then, if you remove some of this crust after its formation, you will indeed remove some radioactive elements relative to the original composition (I'm not saying that's what happened, but the model is perfectly logical).
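To see why an incompatible element ends up concentrated in the melt, the standard batch-melting relation is enough: C_melt / C_source = 1 / (D + F(1 - D)), where D is the bulk solid/melt partition coefficient and F is the melt fraction. A minimal sketch follows; the D and F values are assumed for illustration, not measurements:

```python
# Batch (equilibrium) partial melting: enrichment of the melt relative to
# the source rock, C_melt / C_source = 1 / (D + F * (1 - D)).
def melt_enrichment(D, F):
    """D: bulk partition coefficient, F: melt fraction (0 to 1)."""
    return 1.0 / (D + F * (1.0 - D))

# Uranium is highly incompatible (D << 1); D = 0.001 is an assumed value.
for F in (0.01, 0.05, 0.10, 0.20):
    print(f"F = {F:.2f}: melt ~{melt_enrichment(D=0.001, F=F):.0f}x enriched in U")
```

With D << 1, small melt fractions give melts tens of times richer in uranium than the source mantle, which is exactly why the continental crust built from those melts ends up enriched while the residual mantle is depleted.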