In 1896, Henri Becquerel—a French scientist—made an astounding discovery. He observed that, in a dark environment, uranium salts blackened a photographic plate even through a paper barrier. He had stumbled on radioactivity: the process by which unstable atomic nuclei emit energetic particles and radiation. It occurs most often in very heavy chemical elements (such as uranium) and in unstable isotopes (such as carbon-14).
Through further experimentation, scientists were able to distinguish the different kinds of radiation: alpha particles, which carry a positive charge; beta particles, which carry a negative charge; and gamma rays, which carry no charge at all.
Radioactive decay is a random process; scientists cannot predict when any individual nucleus will disintegrate and discharge radiation. Instead, they measure how long it takes for half of a given sample to decay and call that period the "half-life." Scientists have also learned to harness radioactivity for productive purposes. Nuclear reactors, for example, tap this rich energy source and convert it into electricity. Radioactivity is also used in medical examinations and research, particularly for tracing the movement of food or medicine through the body after it has been ingested.
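The half-life idea translates directly into a simple formula: after a time t, the fraction of a sample remaining is (1/2) raised to the power t divided by the half-life. A minimal sketch in Python (the function name is illustrative; carbon-14's half-life of roughly 5,730 years is the only physical value used):

```python
def remaining_fraction(elapsed_years, half_life_years):
    """Fraction of a radioactive sample still undecayed after elapsed time.

    Each half-life halves whatever remains, so the fraction left is
    (1/2) ** (elapsed / half_life).
    """
    return 0.5 ** (elapsed_years / half_life_years)

# Carbon-14 has a half-life of about 5,730 years.
print(remaining_fraction(5730, 5730))   # one half-life  -> 0.5
print(remaining_fraction(11460, 5730))  # two half-lives -> 0.25
```

This halving rule is what makes carbon-14 useful for dating: measuring how much of it remains in organic material reveals how many half-lives have elapsed.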
Radioactivity certainly has its benefits. Nevertheless, it is a hazardous nuclear process that poses multiple risks to the environment and to the people who handle and use it. A nuclear reactor that melts down, for example, can release radioactive material that seeps into air and water, affecting the health of the surrounding community for generations. Radioactivity has also been misused as a weapon of war. Thus, its use has been the subject of many ethical debates.