Units of Measurement
SI units for the measurement of radiation dose-rate have been in use in Europe for a number of years but, because of the continuing use of the original cgs-based units in the USA and the existence of long-lived installed equipment in established facilities, it is still necessary to cross-relate the two systems of units. The capacity for confusion increases with the passage of time and this note attempts to simplify matters.
HISTORY
As soon as it was discovered that x-rays were harmful, as well as useful in medicine, it became necessary to measure the amount of radiation given to patients and to express this in terms of a dose. As the primary effect of x and gamma radiation is to ionise material through which it passes, a practical method for assessing the quantity of radiation is to measure the ionisation produced in gas (air being the most convenient) in an ionisation chamber. The charge deposited in an ionisation chamber is a direct measure of the ionisation produced by a given amount of radiation and thus can be related to a dose. A corollary of this is that current flowing in an ionisation chamber is a measure of the rate of radiation and thus relates to dose-rate.
At the time x-rays were discovered, electro-technology was in its infancy and two systems of units had evolved from separate observation of the electrostatic and electromagnetic properties of electricity. The instrument available for the measurement of charge was the gold leaf electroscope, which is still found in school physics labs. Operation of the gold leaf electroscope is dependent upon mechanical displacement resulting from the electrostatic force between charged electrodes - the operating principle employed in the present day quartz fibre dosimeter (QFD).
Electrometry being achieved at the time by electrostatic measurement, it was therefore logical to define the unit of radiation in terms of the electrostatic unit (esu) of charge.
UNITS
The agreed unit of radiation exposure became the Röntgen (R) and was defined as the amount of x or gamma radiation which produces 1 esu of charge in 1 cubic centimetre of dry air at standard temperature and pressure (stp); one esu being 3.33 x 10 exp-10 Coulombs in 'modern' units.
Radiation dose was measured in terms of exposure (Röntgens) for many years but, as radiation produces different amounts of ionisation (and consequently deposits different amounts of energy) in different materials, e.g. tissue, the need was recognised for calibration in terms of the energy deposited in (absorbed by) a given mass of material, in order to provide a more meaningful measure of dose. The derived unit of calibration was called the rad.
One rad is defined as the amount of radiation which deposits 100 ergs of energy per gram of material - the material must be stated e.g. rads (air) or rads (tissue). Thus a system of practical measurement evolved whereby instruments respond to exposure dose but are calibrated in terms of a calculated absorbed dose for a given material. It is of course possible to measure absorbed dose directly - e.g. by the heat produced in the material - but at the levels involved in health protection this is hardly practicable. The erg is a very small unit of energy; one Joule equalling 10 exp 7 ergs.
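To put the size of these units in perspective, the rad just defined can be restated in SI energy terms; the short Python fragment below is a sketch using only the conversion factors quoted above (the variable names are illustrative).

ERGS_PER_JOULE = 1.0e7           # 1 Joule = 10^7 ergs, as above
RAD_IN_ERGS_PER_GRAM = 100.0     # definition of the rad

# 1 rad = 100 ergs per gram = 10^-5 J per gram = 0.01 J per kg
rad_in_joules_per_kg = RAD_IN_ERGS_PER_GRAM / ERGS_PER_JOULE * 1000.0
print(rad_in_joules_per_kg)      # 0.01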
How does the rad(air) relate to the Röntgen?
To find this it is necessary to calculate the energy (ergs) deposited in one gram of air subjected to an exposure dose of 1R.
1 esu = 3.33 x 10 exp-10 Coulombs
The charge on an electron is 1.6 x 10 exp-19 Coulombs.
Thus 1 Coulomb = 6.25 x 10 exp 18 ion-electron pairs.
and consequently 1 esu = 3.33 x 10 exp-10 x 6.25 x 10 exp 18 ion pairs
= 2.08 x 10 exp 9 ion pairs.
Now the mean absorbed energy (W) to release one ion pair in air is approximately 32.5 electron volts#, hence 1 esu of charge is the result of the absorption of 2.08 x 10 exp 9 x 32.5 = 6.76 x 10 exp 10 electron volts in a cubic centimetre of air.
Now 1 Coulomb Volt = 1 Joule and consequently 1 electron volt = 1.6 x 10 exp-19 J
Thus the energy absorbed per cc for 1 R is:
6.76 x 10 exp 10 x 1.6 x 10 exp-19 = 1.082 x 10 exp-8 J or 0.1082 ergs.
Now the density of air at stp is 1.293 x 10 exp-3 grams/cc, thus the absorbed energy per gram = (0.1082 x 10 exp 3) / 1.293 = 83.7 ergs, i.e. just under 84% of a rad(air) as defined.
Therefore 1 rad(air) = 100 / 83.7 = 1.195 Röntgens.
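The whole derivation can be checked numerically; the following is a minimal Python sketch, using only the constants quoted above and assuming W = 32.5 eV (the footnoted value, which varies between references).

ESU_IN_COULOMBS  = 3.33e-10     # 1 esu expressed in Coulombs
ELECTRON_CHARGE  = 1.6e-19      # Coulombs per ion pair
W_AIR_EV         = 32.5         # mean energy (eV) to produce one ion pair in air
EV_IN_JOULES     = 1.6e-19      # 1 electron volt in Joules
AIR_DENSITY_G_CC = 1.293e-3     # grams per cc of dry air at stp
ERGS_PER_JOULE   = 1.0e7

ion_pairs_per_esu = ESU_IN_COULOMBS / ELECTRON_CHARGE               # ~2.08 x 10^9
energy_joules_per_cc = ion_pairs_per_esu * W_AIR_EV * EV_IN_JOULES  # deposited by 1 R
ergs_per_gram = energy_joules_per_cc * ERGS_PER_JOULE / AIR_DENSITY_G_CC

print(f"1 R deposits {ergs_per_gram:.1f} ergs per gram of air")     # ~83.7
print(f"1 rad(air) = {100.0 / ergs_per_gram:.3f} R")                # ~1.195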
The corresponding value for 1 R in tissue is approximately 93 ergs per gram, and to obtain an energy response corresponding to that of human tissue, ionisation chambers can be made with 'tissue equivalent' walls and filling gases.
So far only gamma and x radiation have been considered, but the principles discussed apply equally to ionisation produced by other types of radiation: beta, alpha, neutron, etc. In these cases it is necessary to include in the calibration a multiplying factor relating to the relative biological effectiveness of the radiation (rbe). The rbe for gammas and x rays is unity but for alphas and neutrons it is much greater; that is to say that, for a given deposition of energy, the latter have greater physiological effect.
The Röntgen x rbe is the 'röntgen equivalent physical' or rep.
The rad x rbe is the 'röntgen equivalent man' or rem.
Instruments for a specific purpose, e.g. neutron monitors, were often calibrated in rem, the rbe for neutrons being between 10 and 20 according to energy.
With the adoption of SI units the unit for absorbed dose became the Gray (Gy), representing an absorbed dose of 1 J per kg. This is an irrationally large unit, equivalent to 100 rads, which requires health-related monitoring instruments to be calibrated in mGy.
The Gy x rbe is the Sievert (Sv).
There is no SI unit for exposure to directly replace the Röntgen, which the author considers to be a pity, but the Coulomb per kg is an accepted, if less practical, unit for relating radiation fluence to ionisation.
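The correspondences between the old and SI units can be summarised in a few lines of Python; the sketch below assumes the figures already quoted in this note (1 esu deposited in 0.001293 g of air for the Röntgen), from which the accepted value of 2.58 x 10 exp-4 Coulomb per kg per Röntgen follows.

RAD_PER_GRAY    = 100.0          # 1 Gy = 1 J/kg = 100 rad
REM_PER_SIEVERT = 100.0          # 1 Sv = 100 rem
ESU_IN_COULOMBS = 3.33e-10
AIR_MASS_KG_PER_CC = 1.293e-6    # mass of 1 cc of dry air at stp

def gray_from_rad(rad):
    return rad / RAD_PER_GRAY

def sievert_from_rem(rem):
    return rem / REM_PER_SIEVERT

coulomb_per_kg_per_R = ESU_IN_COULOMBS / AIR_MASS_KG_PER_CC   # ~2.58 x 10^-4

print(gray_from_rad(5.0))               # 5 rad = 0.05 Gy
print(sievert_from_rem(5.0))            # 5 rem = 0.05 Sv
print(f"1 R = {coulomb_per_kg_per_R:.2e} C/kg")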
The energy spectrum involved in x and gamma dosimetry is wide, resulting in penetration into human tissue extending from sub-epidermal to complete, and relatively recently dose measurement has become depth specific. Instruments are often calibrated in units of ambient dose equivalent, H*(10), relating to the Sievert dose or dose-rate at a depth of 10 mm. Instruments for β and soft x-radiation dosimetry may be calibrated for shallower depths.
Human dose assessment at depth is achieved by measurement within a 'phantom' (a representation of the human torso) with small ionisation chambers or thermoluminescent dosimeters (TLDs). Such measurement is predicated upon the Bragg-Gray principle, which stipulates that the energy deposited in a small 'measuring void' within a material is representative of the energy deposited in the material at the void location.
Field monitoring instruments are required to monitor dose rate, the output from an ionisation volume being current.
From the definition of the Röntgen it is clear that 1 R per second will produce a current of approximately 3.3 x 10 exp-10 A in 1 cc at stp.
Therefore 1 R per hour will produce 3.3 x 10 exp-10 / 3.6 x 10 exp 3 = 0.917 x 10 exp-13 Amps per cc, or in a practical 1 litre ionisation chamber, 0.917 x 10 exp-10 A.
Correspondingly, 1 Gy(air) per hour (= 100 rad(air) per hour = 100 / 0.837 ≈ 119.5 R per hour) will produce in the same 1 litre chamber:
0.917 x 10 exp-8 / 0.837 = 1.095 x 10 exp-8 Amps.
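A minimal Python sketch of this dose-rate-to-current relationship follows, assuming the 1 litre (1000 cc) chamber used above; the function name is illustrative.

CURRENT_PER_R_PER_S_PER_CC = 3.3e-10   # Amps per cc for 1 R per second (from the Röntgen definition)
SECONDS_PER_HOUR = 3600.0
RAD_AIR_PER_R = 0.837                  # 1 R deposits 0.837 rad(air)

def chamber_current_amps(exposure_rate_R_per_h, volume_cc=1000.0):
    # Ionisation current for a given exposure rate in Röntgen per hour.
    return CURRENT_PER_R_PER_S_PER_CC * exposure_rate_R_per_h / SECONDS_PER_HOUR * volume_cc

print(chamber_current_amps(1.0))                    # 1 R/h       -> ~0.917 x 10^-10 A
print(chamber_current_amps(100.0 / RAD_AIR_PER_R))  # 1 Gy(air)/h -> ~1.095 x 10^-8 A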
Ionisation chambers are also used to measure the amount of radioactive gas present in the environment, in which case the chamber filling gas is sampled from the working environment and the chamber current represents air concentration, i.e. Becquerels per cubic metre.
A radioactive gas of interest might be tritium, which finds extensive use in beta lights.
Tritium emits beta particles of maximum energy 18 keV. However, as will be explained in a further note, 'The Atom and Radioactivity', the beta emission spectrum is continuous and the mean particle kinetic energy is about 6 keV. As the tritium is in air, the beta particles ionise the air sample within an ionisation chamber such that on average 32.5 eV# of the particle energy is absorbed by each ionisation. Thus complete absorption of a tritium beta results in the generation of about:
6000 /32.5 = 185 ion pairs
or 2.96 x 10 exp-17 Coulombs.
Currently the maximum permitted tritium-in-air concentration is 0.185 MBq per cubic metre, which would result in the generation of 2.96 x 10 exp-17 x 0.185 x 10 exp 6 = 5.48 x 10 exp-12 Coulombs per second (Amps) in a cubic metre volume. Thus a maximum-concentration air sample in our practical 1 litre chamber would produce a current of 5.48 x 10 exp-15 A. Because some of the betas are lost to the chamber wall the actual current might be expected to be less than this but, due to their very low energy, tritium betas have a very short range in air and wall loss is not significant in tritium monitoring.
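The tritium figures can likewise be reproduced in a short Python sketch; the mean beta energy, W value and permitted concentration are those quoted above, and the chamber volume is the illustrative 1 litre chamber.

MEAN_BETA_ENERGY_EV = 6000.0     # mean tritium beta energy, ~6 keV
W_AIR_EV            = 32.5       # eV absorbed per ion pair in air
ELECTRON_CHARGE     = 1.6e-19    # Coulombs
CONCENTRATION_BQ_M3 = 0.185e6    # permitted tritium-in-air concentration, Bq per cubic metre
CHAMBER_VOLUME_M3   = 1.0e-3     # the practical 1 litre chamber

ion_pairs_per_beta = MEAN_BETA_ENERGY_EV / W_AIR_EV       # ~185
charge_per_beta = ion_pairs_per_beta * ELECTRON_CHARGE    # ~2.96 x 10^-17 C

current_per_cubic_metre = charge_per_beta * CONCENTRATION_BQ_M3   # ~5.48 x 10^-12 A
chamber_current = current_per_cubic_metre * CHAMBER_VOLUME_M3     # ~5.48 x 10^-15 A

print(f"chamber current at the permitted concentration: {chamber_current:.2e} A")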
# Subject to variation dependent upon the reference consulted.