Definition of radiometric dating
radiometric dating (rā'dē-ō-mět'rĭk): A method for determining the age of an object based on the concentration of a particular radioactive isotope contained within it. (Collins English Dictionary, 2012 Digital Edition)
For inorganic materials, such as rocks containing the radioactive isotope rubidium-87, the amount of the isotope in the object is compared to the amount of its decay product (in this case strontium-87).
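The parent-to-daughter comparison above can be sketched as a short calculation. Assuming a closed system with no daughter isotope present initially (a simplification; real Rb-Sr work uses isochrons to handle initial strontium), the age follows from the decay law N(t) = N₀e^(−λt):

```python
import math

def radiometric_age(parent, daughter, half_life):
    """Age implied by measured parent and daughter amounts,
    assuming a closed system and no initial daughter isotope.

    Derivation: daughter = parent * (e^(lambda*t) - 1),
    so t = ln(1 + daughter/parent) / lambda.
    """
    decay_constant = math.log(2) / half_life  # lambda = ln(2) / t_half
    return math.log(1 + daughter / parent) / decay_constant

# Commonly cited half-life for 87Rb -> 87Sr, about 48.8 billion years.
RB87_HALF_LIFE = 48.8e9  # years

# When daughter equals parent, exactly one half-life has elapsed.
age = radiometric_age(parent=1.0, daughter=1.0, half_life=RB87_HALF_LIFE)
```

This is a minimal sketch, not a field method: in practice geochronologists measure isotope ratios by mass spectrometry and fit an isochron across several minerals rather than assuming zero initial daughter.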
Radiometric dating of rocks and minerals using naturally occurring, long-lived radioactive isotopes is troublesome for young-earth creationists because the techniques have provided overwhelming evidence of the antiquity of the earth and life.
Some so-called creation scientists have attempted to show that radiometric dating does not work on theoretical grounds (for example, Arndts and Overn 1981; Gill 1996) but such attempts invariably have fatal flaws (see Dalrymple 1984; York and Dalrymple 2000).
The half-life concept is not confined to geology: the medical sciences, for example, refer to the biological half-life of drugs and other chemicals in the human body. The original term, half-life period, dating to Ernest Rutherford's discovery of the principle in 1907, was shortened to half-life in the early 1950s.
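Whatever the domain, the half-life describes the same exponential behavior: after each half-life, half of the remaining quantity is gone. A minimal sketch (the function name and values are illustrative, not from the source):

```python
def remaining_fraction(elapsed, half_life):
    """Fraction of the original quantity remaining after `elapsed`
    time, for a substance with the given half-life (same units)."""
    return 0.5 ** (elapsed / half_life)

# After two half-lives, one quarter of the original amount remains.
fraction = remaining_fraction(elapsed=20.0, half_life=10.0)  # 0.25
```

The same function applies equally to rubidium-87 in a rock or a drug clearing from the bloodstream; only the half-life value changes.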