"Here I was talking about the ocean, I mean the entire ocean mass, the entire biosphere, the entire atmosphere, as though it were in my test tube. . . Once you get over that, the whole carbon dating thing falls into place."
Early 14C Research
The newly devised radiocarbon dating method was published in the last week of 1949 just before calendar pages turned to January 1, 1950. That momentous date would, in time, become a significant placeholder on the Western time scale: day-zero Before Present (BP). The year 1950 was chosen to divide radiocarbon time because global atmospheric carbon levels were, by then, drastically altered by human activities. Fossil fuel emissions diluted quantities of 14C (the Suess Effect) while atomic testing resulted in increased atmospheric 14C, known as bomb carbon. The year AD 1950 represents a turning point in chronometrics and is an homage to its inventor, Willard “Wild Bill” Libby, and his colleagues' accomplishment. Arguably, "BP" was also a symbol of an increasingly secular 20th century, one in which scientific breakthroughs such as the atom bomb were rippling across the world.
The roots of radiocarbon science predate Libby's 1949 accomplishment. In 1935 no one knew that 14C even existed. Physicist Franz Kurie was the first to publish suspicions that 14C might be artificially created, based on anomalous particle behavior (recoil tracks) he observed when 14N was bombarded with "fast neutrons" in a particle accelerator. Kurie deduced that if the recoil tracks were from a proton being ejected, rather than from an alpha particle, 14N must transform into 14C. Imagery of the recoil tracks led Kurie to believe that it was indeed a proton being ejected, though additional work was needed to confirm this possibility. The next year, two different groups independently reported that the same particle behavior could be produced with "slow neutrons," though it was still uncertain whether the ejected particle was a proton.
In 1936 further support for Kurie's supposition came from a study by Burcham and Goldhaber, which showed that the particle emitted in this interaction was almost certainly a proton. Also in 1936, physical chemist Martin Kamen completed a doctoral dissertation for which he examined 730 recoil tracks; his observations matched those made by Kurie. In 1937 Kamen and Kurie began working together at the Berkeley Radiation Laboratory with the aim of investigating neutron-nuclear interactions. At this point the existence of 14C was sufficiently proved, at least in a laboratory setting, though little was known about the isotope. It was believed that 14C was an unstable, radioactive isotope with a short half-life—mere hours or days, or at most, months. However, this was yet to be confirmed.
The late 1930s was a time of burgeoning research into the use of isotopes for biological tracer studies, which investigate biological processes by tracking chemical elements found in living organisms. Scientists hoped that a radioactive isotope of one of the abundant biological elements—hydrogen, oxygen, carbon, or nitrogen—would be found to have a half-life long enough for biological tracer studies. Thus, research into 14C during this time was focused on its possible utility in such applications, not dating. Technological advances in cyclotrons (particle accelerators) made by Ernest Orlando Lawrence and internal-target preparation advances by Kamen set the stage for the future of radioactive-isotope research. Finally, in 1940, Kamen and Samuel Ruben, a student of Willard Libby, found that 14C had a much longer half-life than previously believed; however, Kamen and Ruben estimated the half-life of 14C was 25,000 years! The inaccuracy of their half-life estimate aside, Kamen and Ruben are credited with "discovering" 14C, at least as a tool for biological and chemical research.
Not only was the 14C created in labs artificial, so were the neutrons that produced 14C through bombardment of 14N. In the 1930s it was unknown whether neutrons and 14C occurred naturally. In the late 1930s, cosmic-ray physicist Serge A. Korff at the Bartol Research Foundation was trying to detect neutrons in natural radiation by sending Geiger counters to various levels of the atmosphere with balloons. Eventually Korff and Danforth found increasing neutron intensity with elevation. They suggested that this was the result of cosmic radiation interacting with the atmosphere. It followed that if neutrons could be identified in the atmosphere, 14C must also be present. This study was, according to Libby, the catalyst for his radiocarbon dating work.
Willard Libby (1908-1980) graduated from the University of California, Berkeley with his undergraduate degree in 1931. He triple-majored in chemistry, math, and physics, and built the first Geiger counter in the United States for his senior project. In 1933, Libby was awarded his doctoral degree from Berkeley. After receiving his PhD, Libby joined the Berkeley faculty and is considered Berkeley's first nuclear chemist.
Five years would pass between Libby's reading of Korff and Danforth's (1939) article and his finding the time to develop the radiocarbon method. In 1940, Libby obtained a Guggenheim Fellowship and took a sabbatical from Berkeley to conduct research at Princeton University. Soon thereafter, the United States entered World War II and Libby went to work on the Manhattan Project at Columbia University to develop nuclear weapons. In 1945, after the war, Libby began working at the University of Chicago, which was then becoming the leading institution in atomic sciences. It was there, at Chicago's Department of Chemistry and Institute for Nuclear Studies, that Libby would develop radiocarbon dating. Thirty years later, when asked why he was the person to come up with the method and not someone else, Libby answered that the obstacle for others was the idea of global mixing:
Here I was talking about the ocean, I mean the entire ocean mass, the entire biosphere, the entire atmosphere, as though it were in my test tube... Once you get over that, the whole carbon dating thing falls into place.
Libby's early work with radiocarbon dating was conducted in total secrecy, for fear that funding would be withheld because of the outlandish nature of the project. Without breaching this secrecy, Libby set a student and an assistant to work on radiocarbon research. Graduate student Ernest Anderson was assigned the task of identifying the natural abundance of 14C, and assistant James Arnold was tasked with isolating and measuring 14C. Anderson completed his project by obtaining samples of modern wood from around the world, and in doing so also addressed the aforementioned obstacle of worldwide mixing of atmospheric carbon, a foundational assumption of radiocarbon dating. See In the Environment for a review of this topic.
The radiocarbon dating method, though conceptually straightforward, faced several challenges. Libby still needed to determine whether it was feasible given the costs of access to equipment, the sample sizes required, and the time involved—it often took four days of round-the-clock counting to obtain the measurement for a single sample. Libby and his colleagues also needed access to a detector sensitive enough to count 14C. In addition, obtaining samples of historical materials of known age to assay was not easy and required the assistance of archeologists. As Libby put it, "those museum dogs were not going to give [samples] to a bunch of physical chemists to burn up, no way." But Libby's team persisted and obtained enough historic samples to test the dating method. These samples required removal of contaminants (pretreatment)—another step Libby cited as critical in the development of radiocarbon dating.
The shared history of radiocarbon dating and archeology began in 1947. By this point Libby was certain radiocarbon dating was feasible but needed funding and access to equipment to test the method. Libby first disclosed his plans for radiocarbon dating to those close to him in 1946, and in 1947 James Arnold's father provided unsolicited Egyptian specimens of known age to Libby, obtained from Ambrose Lansing at the Department of Egyptian Art at New York's Metropolitan Museum of Art.
The year 1947 also saw the informal creation of a University of Chicago seminar club to discuss the role of social science in the atomic age, spearheaded by Chicago researchers Harold Urey (a 1934 Nobel laureate in chemistry, and an ally of Libby's), associate professor Harrison Brown, and anthropologist and dean of social sciences, Robert Redfield. That same year, radiocarbon dating was presented for the first time to an audience outside Chicago, at a Viking Fund Supper Conference. Though two-dozen anthropologists and archeologists were in attendance, they were instructed to keep the development of radiocarbon dating out of the public sphere. Soon after the conference, the Viking Fund financially backed Libby's radiocarbon dating project. Many people, most notably Urey and the Viking Fund's director of research, Paul Fejos, were involved in the events culminating in this funding being secured.
Though communication about the radiocarbon method was slow and fraught with misunderstandings, Libby's project had well-connected advocates and garnered plenty of interest, as well as controversy, among archeologists. Before the method was even shown to be practicable, debate swirled around who should oversee the integration of the new method into archeology. Organizations proposed for this task included the Society for American Archaeology, the American Anthropological Association, the Committee for the Recovery of Archaeological Remains, the National Research Council, and the Viking Fund, among others. In great part the calls to delegate an organization came from fears that the radiocarbon method was going to be controlled by the University of Chicago or the Viking Fund, and that the technique would not be made available to all who sought to use it. There were other concerns as well, such as whether old-world archeologists would have representation in discussions of radiocarbon dating.
A historic meeting occurred in January 1948 with a presentation by Libby at a Viking Fund Supper Conference, which was well attended by archeologists. Here, the dispute over who should represent archeologists was settled—the American Anthropological Association was chosen as the representative body, "to collaborate with Libby's group, coach its brethren to be scrupulous in fulfilling their reciprocal responsibilities, and mediate the inevitable disputes and misunderstandings that arose."
A recording of Libby's seminal 1948 presentation at the Viking Fund Supper Conference is available courtesy of the Wenner-Gren Foundation for Anthropological Research, Inc. The Foundation is the sole owner of these recordings. The conference recordings are split into two parts.
"We are not archeologists and we have no intention of becoming archeologists. Our intention is to produce a method which we can turn over to you people. . . "
- Willard Libby (Part 2, at 35 minutes)
For many archeologists at the time, radiocarbon dating was intimidating. In part this was due to its association with the atom bomb. While radiocarbon dating was not directly related to the development of the bomb, it was developed by atomic scientists in a social climate of fear and awe of the power of the atom. Additionally, most archeologists lacked the background necessary to understand how radiocarbon dating worked, and thus were reluctant to adopt the technology. Radiocarbon dating was also viewed as a threat to established dating methods and chronologies. Some even speculated that it could render their jobs as archeologists obsolete, since seemingly all questions could now be easily answered.
Libby's first published radiocarbon assays in December 1949 were on wood with known or assumed dates. These samples consisted of two dendrochronological samples, a floor fragment from a Syrian palace, two ancient Egyptian wood fragments (from a coffin and a funerary boat), and two samples from Egyptian tombs which were assayed as one sample. The measured ages compared satisfactorily with the expected dates of the samples. The half-life used to calculate the ages was 5720 ± 47 years. The study established that the radiocarbon method was useful for up to 4600 years ago and expressed the authors' hope that future research could evaluate the accuracy of the method up to 20,000 years ago. This was radiocarbon's unveiling to the wider scientific public. Since then, instrument sensitivity has increased the measurement capabilities to approximately 55,000 years ago.
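The roughly 55,000-year ceiling follows directly from exponential decay: with each half-life, the surviving fraction of 14C halves, so after about ten half-lives only about a tenth of a percent remains, near the limit of what instruments can reliably detect. A minimal sketch in Python (using the modern "Cambridge" half-life of 5,730 years, a value established after Libby's publication) illustrates this:

```python
# Surviving fraction of 14C after t years of decay.
# Uses the modern (Cambridge) half-life of 5730 years.
def fraction_remaining(t_years, half_life=5730.0):
    return 0.5 ** (t_years / half_life)

for t in (4600, 20000, 55000):
    pct = fraction_remaining(t) * 100
    print(f"after {t:>6} years, {pct:.3f}% of the original 14C remains")
```

At 55,000 years only about 0.13% of the original 14C survives, which is why pushing the method further back requires ever more sensitive measurement.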
Though the new technology was discomfiting to many archeologists at the time of its development, by the end of the 1950s it was widely accepted in archeology as well as in other fields of study, such as geology. By then twenty radiocarbon labs had been established around the world, and the journal Radiocarbon was being published to consolidate radiocarbon date lists from the labs and ensure sufficient information was being published. Most of these early radiocarbon labs were established at universities or research institutions, though one commercial lab was opened in the United States as well.
In 1960, Libby won the Nobel Prize in Chemistry for the radiocarbon dating method, the first time (and thus far the only time) archeology has been mentioned in a Nobel award citation.
The history of radiocarbon dating does not end with Libby. It is, of course, still unfolding. After the invention of radiocarbon dating, other critical breakthroughs in radiocarbon science were achieved, including the recognition that radiocarbon ages need to be calibrated to correct for fluctuations in the production of atmospheric 14C, and the development of accelerator mass spectrometry (AMS) for radiocarbon measurement. A brief history of these follows.
Before the first calibration curve came Libby's "Curve of Knowns," in which he plotted known-age ancient Egyptian and dendrochronological samples in relation to his first published 14C half-life (5720 ± 47 years). With additional known-age samples, Libby developed a second Curve of Knowns, using a half-life of 5568 ± 30 years, the Libby half-life conventionally used today. Interestingly, the half-life Libby used in his 1949 publication is closer to the true half-life as it is measured today. Early on, Libby considered that the amount of 14C in the atmosphere might vary with time, as well as with latitude and carbon reservoir (see In the Environment for a discussion of reservoirs). However, in these early days of radiocarbon dating, prior to the development of calibration curves, the agreement between the expected dates and the calculated ages was considered adequate, supporting the hypothesis that 14C abundance had not varied significantly over the last 10,000 years or so.
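The practical consequence of the two half-lives is easy to see with a short calculation. A radiocarbon age follows from t = -(t_half / ln 2) · ln(F), where F is the fraction of the original 14C remaining. A minimal Python sketch (the sample value is hypothetical) compares ages computed with the Libby and Cambridge half-lives:

```python
import math

def radiocarbon_age(fraction_remaining, half_life):
    """Age in years from the surviving 14C fraction: t = -(t_half / ln 2) * ln(F)."""
    return -(half_life / math.log(2)) * math.log(fraction_remaining)

F = 0.5  # hypothetical sample retaining exactly half its original 14C
age_libby = radiocarbon_age(F, 5568.0)      # Libby half-life (conventional)
age_cambridge = radiocarbon_age(F, 5730.0)  # Cambridge half-life (modern)
print(round(age_libby), round(age_cambridge))  # 5568 vs 5730: a ~3% difference
```

Conventional radiocarbon ages are still reported using the Libby half-life; the roughly 3% offset, like other systematic effects, is absorbed when ages are calibrated.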
As radiocarbon dating reached the end of its first decade of use, it became apparent that measured radiocarbon ages for certain periods of time were consistently different from the expected ages. In the late 1950s, the European researchers Karl Otto Münnich, a physicist, and Hessel de Vries, a nuclear scientist and biophysicist, suggested that calendrical and radiocarbon time differed. Libby, however, argued that it was the "known ages" that were incorrect, not the radiocarbon method. In the 1960s, Libby's argument was disproved when consistent radiocarbon ages on tree rings were found to disagree with their dendrochronological dates. University of California professor Hans Suess was the first to assemble a database of paired dendrochronological dates and radiocarbon ages over a substantial span of time: 7000 years.
In the 1960s, the idea that radiocarbon ages needed to be calibrated took hold. However, there was uncertainty about how broadly the early curves could be applied; concern stemmed from inconsistencies between radiocarbon labs, whether the curves could be applied to material types other than wood, and whether a curve could be used for samples from anywhere in the world. Eventually, global variations in atmospheric 14C concentrations were identified, most notably between the northern and southern hemispheres. In addition, short and medium-term variations in atmospheric carbon were identified, which account for the wiggles in the calibration curve (the de Vries Effects, or the Suess wiggles); these variations are thought to be caused by solar activity. By the 1980s, radiocarbon labs were producing 14C measurements with enough accuracy and consistency that earlier concerns over the accuracy of paired dendrochronological and radiocarbon data from different labs were allayed.
The recognition that calibration was necessary caused more waves in the turbid wake left by the invention of radiocarbon dating. Calibration, coupled with an increase in dated samples (in great part due to the increased availability of labs), resulted in another bout of upheaval in archeological chronologies. In some places across the world this not only restructured prehistoric timelines, but also had the effect of turning understandings of technological diffusion in prehistory upside down.
AMS (Accelerator Mass Spectrometry)
Prior to Accelerator Mass Spectrometry (AMS), 14C was measured using conventional beta counting systems. These include solid-carbon counters, gas proportional counters, and liquid scintillation counters. Beta counting systems detect electrons as they are emitted during radioactive decay, and the counts are then compared with a modern standard. Rather than counting beta decays, AMS directly detects quantities of 12C, 13C, and 14C by separating them by isotopic mass in a particle accelerator.
The biggest difference between AMS and conventional methods of radiocarbon dating is the greater sensitivity of AMS, allowing measurement of much smaller quantities of 14C. With this more sensitive instrument comes faster measurement times, the ability to make measurements on more ancient samples, and a drastic reduction in the amount of sample material required to produce a date. With the very small sample size needed for AMS, archeological site sampling strategy was transformed. Dates can now be obtained from strata, features, and artifacts with little organic material present (such as the organic residue adhering to pottery vessels), and multiple measurements can be made from a single sample to evaluate controversial or unexpected results.
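Whether 14C is measured by counting decays or by counting atoms directly, the conversion to a conventional radiocarbon age is the same: the sample's 14C level relative to a modern standard is run through the decay equation using the Libby mean life of 8,033 years (the Libby half-life of 5,568 years divided by ln 2). A minimal Python sketch, with a hypothetical sample value:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; 5568 / ln(2), fixed by convention

def conventional_age_bp(fraction_modern):
    """Conventional radiocarbon age (years BP) from the measured
    14C content expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Hypothetical sample with 60% of the modern standard's 14C content:
print(round(conventional_age_bp(0.60)), "BP")  # ~4100 BP
```

A fraction of 1.0 (a fully "modern" sample) gives an age of 0 BP, and smaller fractions give progressively older conventional ages.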
AMS was developed in the late 1970s and early 1980s, and went through separate, competing iterations during that time—the cyclotron AMS and the tandem accelerator. The history of cyclotron AMS is entwined with that of nuclear defense. Its development stemmed from a think tank working for the US government, seeking to detect low levels of radioactive atoms behind passing nuclear submarines. In 1976, think tank participant and University of California, Berkeley researcher Richard Muller began experimenting with the use of a cyclotron as a "high-energy," or accelerator, mass spectrometer. Over a decade of trials, Muller achieved uneven success with the Berkeley cyclotron for radiocarbon dating.
While Muller was investigating the cyclotron for AMS, tandem accelerators for radiocarbon measurement were also being developed. In fact, they were developed by two independent groups at the same time: Erle Nelson and his colleagues at Simon Fraser University in Canada, and nuclear physicists Harry Gove of the University of Rochester and Ted Litherland of the University of Toronto, with Kenneth Purser, a private business owner. Nelson considered using a tandem accelerator with a magnetic spectrograph to make 14C measurements but was inspired to use a detector telescope in place of the spectrograph to differentiate ions of the same mass. His inspiration came from a 1977 article by cyclotron AMS developer Muller. After that, tandem accelerators for radiocarbon dating were quickly tested. Before the end of 1977, Nelson published his tandem accelerator method, as did the American physicists, in the same issue of Science magazine no less.
The first archeological samples were measured by AMS in 1982. In the early days of AMS for radiocarbon dating, the technique was very expensive and wait times could be long because the machines were also used for purposes other than radiocarbon dating. With time, smaller and more affordable AMS systems were developed. In the 1990s, AMS labs became common, and their use for radiocarbon dating has only increased since.
While AMS in and of itself does not give more precise or accurate ages than beta counters, it offers the possibility of a more detailed chronology based on small samples from a greater range of contexts. Smaller sample size requirements in turn allow for more stringent pretreatment methods. Only one milligram of graphitized carbon is needed for most AMS measurements, and for the most sensitive instruments, mere micrograms suffice. In comparison, one gram of graphitized carbon is needed for traditional radiometric measurements. With conventional beta-counting methods, one measurement on bone could require 300 grams of sample material, and in the early days of radiocarbon dating, the recommended sample size for ivory or teeth was a now-unthinkable five pounds of material!