Airborne particles below 20 micrometers in diameter can penetrate the respiratory tract when inhaled. This presents a unique array of internal dosimetry considerations, particularly for radioactive particles such as naturally occurring radon decay products, fallout from nuclear weapons tests, and hypothetical dirty bomb contamination. Federal guidance documentation for local, state, and tribal radiological assessments currently assumes that individuals who ingest radioparticles are exposed to the total activity of the particle. While this is a reasonable approximation for gamma radiation, the ranges of alpha and beta particles are finite and on the order of airborne particle sizes. Depending on the size of the particle, a substantial fraction of the emitted energy can be absorbed within the particle itself, delivering less dose to tissue than predicted. This discrepancy produces a systematic overestimation of dose when these exposure models are applied, with the magnitude of the bias depending on particle size.
The net self-attenuation of alpha and beta particles depends on the characteristic energy of the isotope's decay, the particle size, and the concentration of the isotope among inert nuclides within the particulate. Radioparticulates may arise as homogeneous fragments, such as from spent fuel, or as heterogeneous contaminated-dust aggregates. To explore this landscape of factors, a series of Geant4 simulations was performed. Spherical radioparticles of transuranic oxides were modeled at varying respirable diameters with stochastic decay of characteristic radiations (Fig. 1) in order to tally the total outward intensity and average exiting energy.
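The geometric self-attenuation that these simulations tally can be illustrated with a much simpler straight-line (CSDA-style) Monte Carlo sketch: decay sites are sampled uniformly in a sphere with isotropic emission directions, and an alpha escapes only if its chord to the surface is shorter than its range in the material. The constant-stopping-power exit-energy model and all numerical inputs below are illustrative assumptions, not the Geant4 configuration used in this work.

```python
import math
import random

def escape_stats(radius_um, alpha_range_um, e0_mev, n_samples=100_000, seed=1):
    """Monte Carlo estimate of the escape fraction and mean exit energy for
    alpha decays distributed uniformly in a sphere, using straight-line
    transport and a constant-stopping-power slowing-down approximation
    (both simplifying assumptions)."""
    rng = random.Random(seed)
    escaped = 0
    exit_energy = 0.0
    for _ in range(n_samples):
        # Uniform point in a sphere: radial coordinate ~ cube root of uniform.
        r = radius_um * rng.random() ** (1.0 / 3.0)
        # Isotropic direction: only the cosine to the radial axis matters.
        mu = 2.0 * rng.random() - 1.0
        # Chord length from the decay site to the sphere surface.
        d = -r * mu + math.sqrt(radius_um**2 - r**2 * (1.0 - mu**2))
        if d < alpha_range_um:
            escaped += 1
            # Linear energy loss along the chord (constant dE/dx assumption).
            exit_energy += e0_mev * (1.0 - d / alpha_range_um)
    frac = escaped / n_samples
    mean_e = exit_energy / escaped if escaped else 0.0
    return frac, mean_e
```

For a particle much smaller than the alpha range, nearly every alpha escapes with almost its full energy; for a particle much larger than the range, the escape fraction approaches the thin-shell limit of roughly (3/4)(range/radius).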
Preliminary results of one-million-decay runs indicate dramatic reductions in escaping radiation intensity and average energy with increasing particle size for alpha emitters. When the particle diameter exceeds the range of the alphas in the radioparticle material (calculated with the Bethe formula), the losses of intensity and energy exceed 10%.
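As a rough cross-check of such alpha ranges, the CSDA range can be estimated by numerically integrating the uncorrected non-relativistic Bethe stopping-power formula. The electron density and mean excitation energy in the usage note are illustrative values for a UO2-like oxide, and the low-energy cutoff is an assumption made because the uncorrected formula breaks down at low projectile energies; none of these are the parameters used in the simulations.

```python
import math

# Constants in MeV and cm.
ME_C2 = 0.511            # electron rest energy [MeV]
MA_C2 = 3727.4           # alpha-particle rest energy [MeV]
K2 = (1.44e-13) ** 2     # (e^2 / 4*pi*eps0)^2 [(MeV*cm)^2]

def bethe_dedx(t_mev, n_e_cm3, i_mev, z_proj=2):
    """Non-relativistic Bethe stopping power [MeV/cm] for a projectile of
    charge z_proj and kinetic energy t_mev in a medium with electron
    density n_e_cm3 [1/cm^3] and mean excitation energy i_mev [MeV]."""
    beta2 = 2.0 * t_mev / MA_C2          # non-relativistic beta^2
    arg = 2.0 * ME_C2 * beta2 / i_mev
    if arg <= 1.0:
        return 0.0                       # formula invalid at this energy
    return (4.0 * math.pi * n_e_cm3 * z_proj**2 * K2
            / (ME_C2 * beta2) * math.log(arg))

def csda_range_um(e0_mev, n_e_cm3, i_mev, t_min=1.5, steps=2000):
    """CSDA range [micrometers]: integrate dE / (dE/dx) from e0_mev down
    to t_min, an assumed cutoff below which the uncorrected Bethe formula
    is unreliable (the neglected residual range is small)."""
    de = (e0_mev - t_min) / steps
    range_cm = 0.0
    t = e0_mev
    while t > t_min:
        s = bethe_dedx(t, n_e_cm3, i_mev)
        if s <= 0.0:
            break
        range_cm += de / s
        t -= de
    return range_cm * 1e4                # cm -> micrometers
```

With illustrative UO2-like inputs (electron density ~2.6e24 cm^-3, mean excitation energy ~560 eV), a 5.5 MeV alpha yields a range on the order of 10 micrometers, consistent with the respirable diameters at which self-attenuation becomes significant.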