NRE VI, International Symposium, June 5-9, 1995, Montreal, ENVIRONMENT INTERNATIONAL, Vol. 22, Suppl. 1, pp. S93-S99, 1996
AN EXPERIMENTAL-NUMERICAL METHOD FOR THE EFFICIENCY CALIBRATION OF LOW ENERGY GERMANIUM DETECTORS
Several radionuclides emit significant gamma rays at energies below 80 keV, which in many cases are essential for their detection in radioenvironmental assays. Gamma spectroscopic analysis in this energy region is conducted using planar Low Energy Germanium detectors, which have high and almost constant efficiency between 20 and 80 keV. When solid and liquid samples are analysed with these detectors, the count rate is strongly affected by the intense self-absorption of the low-energy photons. Thus, the difference in absorption properties between the calibration source and the sample requires the introduction of an efficiency correction factor. A method applicable to cylindrical geometries was adapted for the determination of this correction factor, using a newly developed experimental-numerical technique and a FORTRAN program. The program calculates the efficiency correction factor, taking as input the source-to-detector geometry and the linear attenuation coefficients (μ) of both the calibration source and the material to be analysed. The value of μ needed for this calculation is experimentally estimated for each material to be analysed. The technique has been cross-checked using standard materials. According to the results obtained, for surface soil samples, lignite and fly ash, μ values range from 0.2 to 0.9 cm⁻¹ and lead to efficiency correction factors (for the geometry used and a 4M HCl mixed radionuclide calibration source) in the range of 0.5 to 1.2 for the photons emitted by Pb-210 at 46.52 keV and Am-241 at 59.54 keV. The correction factor in the case of the 185.99 keV photons is slightly lower than 1.0, even for the most absorbing of the materials analysed.
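To illustrate the two quantities at the heart of the method, the sketch below first estimates μ from a simple transmission measurement (μ = ln(I₀/I)/t) and then forms an efficiency correction factor as the ratio of the mean transmissions of the sample and the calibration source. This is a minimal sketch in Fortran, not the authors' program: it uses a plain slab-transmission approximation rather than the full cylindrical source-to-detector integration described above, and all numerical values and variable names are illustrative assumptions.

```fortran
! Minimal sketch of the self-absorption correction (slab approximation).
! The authors' FORTRAN program integrates over the full cylindrical
! geometry; here all inputs are assumed values for illustration only.
program self_absorption_sketch
  implicit none
  real :: i0, i, t, mu_sample, mu_cal, f

  ! Transmission measurement: count rate without (i0) and with (i) the
  ! sample of thickness t (cm) placed between source and detector.
  i0 = 1000.0      ! assumed count rate, empty container
  i  = 520.0       ! assumed count rate, sample in place
  t  = 1.0         ! assumed sample thickness (cm)

  ! Experimental estimate of the linear attenuation coefficient (cm^-1)
  mu_sample = log(i0 / i) / t

  ! Attenuation coefficient of the 4M HCl calibration source at the same
  ! photon energy (assumed value)
  mu_cal = 0.25

  ! Efficiency correction factor as the ratio of the mean slab
  ! transmissions of sample and calibration source
  f = ((1.0 - exp(-mu_sample * t)) / (mu_sample * t)) / &
      ((1.0 - exp(-mu_cal * t)) / (mu_cal * t))

  print '(a, f6.3, a)', 'mu(sample) = ', mu_sample, ' cm^-1'
  print '(a, f6.3)', 'efficiency correction factor = ', f
end program self_absorption_sketch
```

Measured count rates for the sample are then divided by f (or multiplied by 1/f, depending on the convention adopted) to remove the bias introduced by the differing self-absorption of sample and calibration source.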