Fire control changed the course of human evolution, allowing our ancestors to stay warm, cook food, ward off predators and venture into harsh climates. It also had important social and behavioral implications, encouraging groups of people to gather together and stay up late. Despite the significance of kindling flames, when and where human ancestors learned how to do it remains a subject of debate and speculation. There is even little consensus about which hominins—modern humans, a direct predecessor or a long-extinct branch—first acquired the skill.
The oldest unequivocal evidence, found at Israel’s Qesem Cave, dates back 300,000 to 400,000 years, associating the earliest control of fire with Homo sapiens and Neanderthals. Now, however, an international team of archaeologists has unearthed what appear to be traces of campfires that flickered 1 million years ago. Consisting of charred animal bones and ashed plant remains, the evidence hails from South Africa’s Wonderwerk Cave, a site of human and early hominin habitation for 2 million years.
The researchers found the evidence in a layer of rock containing hand axes, stone flakes and other tools attributed by previous excavations to a particular human ancestor: Homo erectus. Characterized by its upright stance and robust build, this early hominin species lived from 1.8 million to 200,000 years ago. “The evidence from Wonderwerk Cave suggests that Homo erectus had some familiarity with fire,” said Francesco Berna, an archaeology professor at Boston University and the lead author of a paper on the team’s findings.
Other groups of researchers armed with remains from Africa, Asia and Europe have also claimed that human fire control originated very early—up to 1.5 million years ago. These studies, however, rely on evidence from open-air sites where wildfires could have blazed, Berna said. And while scorched objects were found and analyzed, the deposits surrounding them were not, meaning the burning could have taken place elsewhere, he added.
Wonderwerk Cave, by contrast, is a protected environment less prone to spontaneous flames. What’s more, an analysis by Berna and his colleagues showed that sediment clinging to charred items there was also heated, suggesting fires were kindled onsite. For these reasons, the team described the singed traces unearthed at Wonderwerk as “the earliest secure evidence for burning in an archaeological context.”
Scientists working outside the realm of archaeology—most notably primatologist Richard Wrangham—have persuasively argued that Homo erectus tamed fire, Berna noted. Wrangham has long been championing the theory that cooking allowed human ancestors to consume more calories and, as a result, to develop larger brains. He has largely based his hypothesis on physical changes in early hominins—for instance, a shift toward smaller teeth and stomachs—that took place around the time Homo erectus evolved.
“So far, Richard Wrangham’s cooking hypothesis is based on anatomical and phylogenetic evidence that show that Homo erectus may have been already adapted to a cooked food diet,” Berna explained. “Our evidence from Wonderwerk is consistent with Homo erectus being able to eat cooked food.”
Berna and his colleagues have been excavating at Wonderwerk since 2004, but more work is on the horizon, he said. In addition to seeking even earlier evidence of fire control, the researchers plan to investigate whether the cave’s Homo erectus inhabitants actually cooked—for instance, by checking for cut marks on bones, Berna explained. “More work needs to be done to exclude that meat was consumed raw and bones were disposed in the fire after that,” he said.
The Earliest Example of Hominid Fire
Discussions of fire and human evolution conjure up images of cavemen sitting around a campfire roasting chunks of meat on sticks. But who were the first “cavemen” to do this? Debate goes back and forth between anthropologists who claim hominids began controlling fire nearly two million years ago and those who think our ancestors started stoking flames only a few hundred thousand years ago.
Now a new study of one-million-year-old charred bones and plant remains provides the earliest “secure” evidence of hominid fire-making, researchers say.
The new evidence comes from South Africa’s Wonderwerk Cave. Archaeological investigations there in the 1970s through 1990s turned up Acheulean tools—stone handaxes and other implements that were likely produced by Homo erectus. In 2004, Francesco Berna of Boston University and his colleagues began new excavations. They found several signs of fire, including tiny charred bone fragments and ash from burned plants. They also found ironstone—which the hominids used to make tools—with telltale fractures indicative of heating. Using a technique called Fourier transform infrared microspectroscopy, which examines how a sample absorbs different wavelengths of infrared light, the team determined the remains had been heated to more than 900 degrees Fahrenheit, with grasses, leaves or brush used as fuel.
The shape of the bone fragments and the exceptional preservation of the plant ash suggest the materials were burned in the cave—not outside and then transported in by water, the team reports this week in Proceedings of the National Academy of Sciences. Spontaneous combustion of bat guano was also ruled out (apparently this sometimes happens in caves). That left hominids as the most likely source of the fire.
This is good news for Harvard University’s Richard Wrangham and supporters of his cooking hypothesis. According to Wrangham, mastering fire was a transformative event in the history of humans. It allowed our ancestors to cook. And because cooked food is easier to digest, the hominid gut shrank, freeing up energy that was then devoted to fueling the evolution of bigger brains, which are very expensive to maintain, energetically speaking. (Brain tissue needs 22 times as much energy as an equivalent amount of muscle.)
Wrangham surmised this important transition must have occurred with the origin of Homo erectus, some 1.9 million years ago, when brain size really began to expand and the hominid body became taller and more modern.
The fire at Wonderwerk is too young to fully support Wrangham’s hypothesis, but it’s a step in the right direction. Previously, the earliest well-accepted instance of fire-building came from Israel’s Qesem Cave at 400,000 years ago. For claims of much older examples of controlled fire, such as at a 1.5-million-year-old Kenyan site called Koobi Fora, wildfires couldn’t be ruled out.
If the history of fire extends back one million years, why don’t archaeologists find more evidence of it? Last year, for example, Wil Roebroeks of Leiden University in the Netherlands and Paola Villa of the University of Colorado Museum in Boulder surveyed the European archaeological record of the last 1.7 million years. They didn’t find habitual use of fire until about 400,000 years ago, they reported in Proceedings of the National Academy of Sciences, leading them to conclude hominids must have colonized the northern latitudes without fire’s warmth.
Berna’s team thinks the problem might be in how archaeologists have been looking for fire. The new research involved examining the cave sediments, bones and plant ash at a microscopic level, which revealed information that’s normally overlooked. Perhaps with the help of such microscopic methods, anthropologists will find that the origin of fire is indeed linked to the origin of Homo erectus.
Our Mysterious Archaic Human Ancestor and Extinct Humans
During the study, researchers compared the genomes of two Neanderthals, a Denisovan and two modern African individuals. The Neanderthals (Homo neanderthalensis) were an extinct species of human that died out about 30,000 years ago and once inhabited vast areas of Eurasia. Denisovans are a mysterious species, known only through their DNA, who probably ranged across an area that covered Siberia and East Asia. The samples from modern Africans were selected because they are known not to have Neanderthal or Denisovan genes.
The spread and evolution of Denisovans (John D. Croft / CC BY-SA 3.0)
Based on the ground-breaking algorithm, the researchers were able to develop an ancestral recombination graph, which “includes a tree that captures the relationships among all individuals at every position along the genome, and the recombination events that cause those trees to change from one position to the next,” Siepel told Live Science. The team was able to build up a picture of the extensive interbreeding between different species of hominids and even gain insights into their migration patterns.
People May Have Used Fire to Clear Forests More Than 80,000 Years Ago
Humanity’s environmental impact did not start with the bang of agriculture or industrialization but with a whisper initiated long ago—one that scientists are finally learning to hear.
New archaeological and paleoenvironmental findings now date human activity that transformed our natural surroundings to more than 80,000 years ago, after early modern humans settled on the northern shores of Lake Malawi at the lower tip of eastern Africa’s Great Rift Valley. These humans dramatically modified the landscape and ecosystem by burning forests to yield a sprawling bushland that remains today, according to a report published on Wednesday in Science Advances.
The finding marks the oldest evidence yet of humans profoundly changing their environment with fire. And it could represent the earliest known case of people deliberately doing so, the researchers hypothesize. “It represents a really powerful cultural capacity to transform the landscape in a way ... that will enhance the survival of the people,” says archaeologist Amanuel Beyin of the University of Louisville, who was not involved in the new study.
Lake Malawi is one of the world’s largest lakes today, but it has dramatically fluctuated in size across the ages. In a 2018 study, paleoecologist Sarah Ivory of Pennsylvania State University and her colleagues examined fossils, pollen and minerals in two sediment cores drilled from the lake bed. Their analysis revealed that the lake’s water level and vegetation exhibited a consistent climatic pattern over the past 636,000 years. Dense forests along the lake’s shores typically disappeared during drought periods when the lake ran dry and then returned when it filled up again.
But the pollen records showed an abrupt break from this cycle when the wet period returned about 86,000 years ago. Although the lake level was high again, the shoreline forests just briefly recovered before collapsing. Only some fire-tolerant and hardy species persisted, while grasses became more widespread in the landscape.
When Ivory discussed these data with Yale University paleoanthropologist Jessica Thompson and her colleagues, who were excavating nearby archaeological sites along the northern shores of the lake, an explanation came into focus: human activity. The first known settlements in the area pop up roughly 92,000 years ago, as evidenced by tens of thousands of stone artifacts found by Thompson and others with help from their colleagues in Malawi. Many were tools likely used in hunting and cutting. The researchers observed that the humans’ appearance was followed by a spike in charcoal deposits in the lake cores, suggesting that people started intensively burning the forest just as it was growing back, thereby preventing a full recovery.
Alternative explanations are possible. The charcoal deposits could instead have stemmed from a few fires that spiraled out of control or perhaps from people at that time burning timber for cooking or warmth. But Thompson proposes that the population deliberately burned the forests, as some hunter-gatherers do today. Cleared forest areas allow a patchwork of new grasses and shrubs to emerge, enabling a mosaic habitat with a variety of food sources that attract different animal species—and hence new prey for humans. Thompson thinks the scale of burning is more consistent with this kind of continuous habitat transformation than accidental fires or wood harvesting. Doing the latter efficiently would have required tools that were not available then, she adds.
The use of fire by human ancestors dates back at least a million years, scientists have found. But during the Middle Stone Age—between 315,000 and 30,000 years ago—humans began to wield fire in new ways. For instance, around 164,000 years ago in southern Africa, people likely used fire to heat stone to render it more malleable for toolmaking. “This realization that you could use fire ... as a tool to modify the productivity of your immediate environment” would be one of many inventions that took place in this broader period, Thompson says.
Previously, some of the oldest possible evidence of humans using fire to manage their environment stemmed from the Great Cave of Niah in Malaysian Borneo. Scientists hypothesize that humans 50,000 years ago used fire in a dense tropical forest near that location to foster the growth of specific plant species. Other studies propose similar activities about 45,000 years ago in New Guinea and 40,000 years ago in Australia.
It is not easy to prove that humans rather than climatic factors ignited such fires, notes Patrick Roberts, an archaeological scientist at the Max Planck Institute for the Science of Human History in Jena, Germany, who was not involved in the new study. But he thinks the evidence it unearthed around Lake Malawi makes a fairly convincing case that humans were the culprit—given the paleoenvironmental record in the lake cores that spans more than 600,000 years and the fact that those cores were extracted close to the archaeological site.
Although human intent is also hard to prove, Roberts says he sees no reason to assume that people were not cognitively capable of taking such action to make their environment more productive. “Why else would you go and set fire to the landscape?” he asks.
Beyin suggests that the early modern humans living around Lake Malawi may have been part of populations migrating from drier environments to the north or south. When they encountered unfamiliar forests there, he says, it is possible that they “may have resorted to using fire to create ... this familiar woodland environment.” The study also underscores the value of integrating ancient environmental records such as those documented in the lake cores with classic archaeological data to detect clues to human cultural innovations, Beyin adds.
The ancient people appear to have left another impression on the landscape near Lake Malawi. After the forests disappeared, rain fell on deforested highlands, gradually eroding sediments to form large triangle-shaped deposits called alluvial fans. Over time, the process of erosion buried and preserved artifacts in the fan deposits. Thompson says she wouldn’t be surprised if more evidence of early modern humans’ environmental impact emerges over the coming years. “If we actually just think of this as something we associate with the human ‘condition’ ... if you shift your perspective that way,” she adds, “suddenly, I think, you’re going to see this stuff all over the place.”
Smoke, Fire and Human Evolution
When early humans discovered how to build fires, life became much easier in many regards. They huddled around fire for warmth, light and protection. They used it to cook, which afforded them more calories than eating raw foods that were hard to chew and digest. They could socialize into the night, which possibly gave rise to storytelling and other cultural traditions.
But there were downsides, too. Occasionally, the smoke burned their eyes and seared their lungs. Their food was likely coated with char, which might have increased their risk for certain cancers. With everyone congregated in one place, diseases could have been transmitted more easily.
Much research has focused on how fire gave an evolutionary advantage to early humans. Less examined are the negative byproducts that came with fire, and the ways in which humans may or may not have adapted to them. In other words, how did the harmful effects of fire shape our evolution?
It’s a question that’s just starting to attract more attention. “I would say it’s mostly barroom talk at the moment,” said Richard Wrangham, a professor of biological anthropology at Harvard University and the author of “Catching Fire: How Cooking Made Us Human.” His work suggested that cooking led to advantageous changes in human biology, such as larger brains.
Now, two new studies have proposed theories on how negative consequences of fire might have shaped human evolution and development.
In the first, published Tuesday, scientists identified a genetic mutation in modern humans that allows certain toxins, including those found in smoke, to be metabolized at a safe rate. The same genetic sequence was not found in other primates, including ancient hominins such as Neanderthals and Denisovans.
The researchers believe the mutation was selected for in response to breathing in smoke toxins, which can increase the risk of respiratory infections, suppress the immune system and disrupt the reproductive system.
It’s possible that having this mutation gave modern humans an evolutionary edge over Neanderthals, though it’s speculation at this point, said Gary Perdew, a professor of toxicology at Pennsylvania State University and an author of the paper. But if the speculation is correct, the mutation may have been one way that modern humans were inured against some adverse effects from fire, while other species were not.
Thomas Henle, a chemistry professor at Dresden University of Technology in Germany who was not involved with the study, has wondered whether humans also have unique genetic mutations to better handle, or even take advantage of, byproducts of fire in food. In 2011, his research group showed that the brown molecules that come from roasting coffee can inhibit enzymes produced by tumor cells, which might explain why coffee drinkers may be at lower risk for certain cancers.
Other research has suggested that these roasting byproducts may stimulate the growth of helpful microbes in the gut.
A genetic mutation that may help humans tolerate smoke toxins could be just one of many adaptations, Dr. Henle said. “I am sure that there are further human-specific mechanisms, or mutations, which are due to an evolutionary adaptation to eating heat-treated foods.”
Understanding how humans might have uniquely adapted to the risks from exposure to fire may have implications for how scientists think about medical research, Dr. Wrangham said. Other animals that didn’t evolve around fire, for instance, may not be the best models for studying how we process food or detoxify substances.
One example, he suggests, is the study of acrylamide, a compound that forms in foods during frying, baking or other high-temperature cooking. When given to lab animals in high doses, acrylamide has been shown to cause cancer. But so far, most human studies have failed to link dietary acrylamide to cancer.
“People keep ‘wanting’ to find a problem for humans,” Dr. Wrangham said, but there’s “nothing obvious at all.”
Humans may not have been able to adjust to all of the dangers of fire. The second study, published last week in Proceedings of the National Academy of Sciences, suggests that with fire’s advantageous effects for human societies also came profound new damage. It offers conjecture that the early use of fire might have helped spread tuberculosis by bringing people into close contact, damaging their lungs and causing them to cough.
With mathematical modeling, Rebecca Chisholm and Mark Tanaka, biologists at the University of New South Wales in Australia, simulated how ancient soil bacteria might have evolved to become infectious tuberculosis agents. Without fire, the probability was low. But when the researchers added fire to their model, the likelihood that tuberculosis would emerge jumped by several orders of magnitude.
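The paper's actual model is more detailed, but the core dynamic it describes, that raising the effective contact rate pushes a pathogen past the threshold where transmission chains can sustain themselves, can be illustrated with a toy branching-process simulation. Everything below (the Poisson offspring model, the specific reproduction numbers, the establishment cutoff) is an illustrative assumption, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)

def outbreak_probability(r0, trials=5000, cutoff=200):
    """Estimate the chance that a single infected individual sparks a
    sustained outbreak, modeling each generation of transmission as a
    Poisson branching process with mean offspring number r0."""
    outbreaks = 0
    for _ in range(trials):
        cases = 1
        while 0 < cases < cutoff:
            # the sum of n independent Poisson(r0) draws is Poisson(n * r0)
            cases = rng.poisson(r0 * cases)
        if cases >= cutoff:  # chain grew large: count it as established
            outbreaks += 1
    return outbreaks / trials

# sparse contact (hypothetical r0 below 1): chains almost always die out
low = outbreak_probability(0.8)
# close contact around fires (hypothetical r0 of 2): most chains take off
high = outbreak_probability(2.0)
```

For any r0 below 1 every chain eventually goes extinct, while standard branching-process theory puts the establishment probability for r0 = 2 near 0.8, so even a modest rise in contact rates produces the kind of jump the modeling describes.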
It is thought that tuberculosis has killed more than a billion people, possibly accounting for more deaths than wars and famines combined. Today it remains one of the deadliest infectious diseases, claiming an estimated 1.5 million lives each year.
Many experts believe tuberculosis arose at least 70,000 years ago. By then, humans were most certainly controlling fire. (Estimates of when human ancestors started regularly using fire vary greatly, but the consensus is that it was at least 400,000 years ago.)
“We realized that the discovery of controlled fire must have caused a significant shift in the way humans were interacting with each other and with the environment,” factors known to drive the emergence of infectious diseases, Dr. Chisholm said.
She and Dr. Tanaka believe that fire might have helped spread other airborne diseases, not just tuberculosis. “Fire, as a technological advantage, has been a double-edged sword,” Dr. Tanaka said.
Negative cultural consequences came with fire, too — and continue to leave an imprint. Anthropologists have speculated that inhaling smoke led to the discovery of smoking. Humans have long used fire to modify their environment and burn carbon, practices that now have us in the throes of climate change. Fire is even tied to the rise of patriarchy — by allowing men to go out hunting while women stayed behind to cook by the fire, it spawned gender norms that still exist today.
Investigating how fire’s harmful effects have shaped human history and evolution can provide a rich look into the relationship between culture and biology. Did we evolve biologically to guard against the health risks of inhaling smoke? Did that help us pick up the cultural practice of smoking? There are many other possibilities.
“It’s a fascinating feedback loop,” said Caitlin Pepperell, a professor at the University of Wisconsin-Madison who studies the evolution of human diseases. “I hope these studies will spur us to think more about fire, and take it in all the different directions it can go.”
Who Mastered Fire?
Mannequin of a Tautavel Man—would he have known how to make fire?
Photo credit: Eric Cabanis/AFP/Getty Images.
Richard Wrangham, an anthropologist at Harvard, claims that hominids became people—that is, acquired traits like big brains and dainty jaws—by mastering fire. He places this development at about 1.8 million years ago. This is an appealing premise no matter who you are. For those who see cooking as morally, culturally, and socially superior to not cooking, it is scientific validation of a worldview: proof that cooking is literally what makes us human. For the rest of us, it means we have a clever retort the next time one of those annoying raw-food faddists starts going on about how natural it is never to eat anything heated above 115 degrees Fahrenheit.
There’s one problem with Wrangham’s elegant hypothesis: It’s hardly the scientific consensus. In fact, since 2009, when Wrangham explained his theory in the book Catching Fire, several archaeologists have come forward with their own, wildly divergent opinions about what is arguably the oldest intellectual property debate in the world. Who really mastered fire, in the sense of being able to create it, control it, and cook with it regularly? Was it Homo erectus, Neanderthals, or modern humans?
A brief primer on these species: H. erectus originated about 1.8 million years ago. These hominids were about as tall as modern humans, but probably hairier and definitely dumber. It’s thought that both Neanderthals and Homo sapiens evolved from H. erectus, with Neanderthals emerging about 600,000 years ago (and going extinct around 30,000 years ago) and modern humans emerging around 200,000 years ago (and still going strong). Neanderthals were shorter and had more complex societies than H. erectus, and they’re thought to have been at least as large-brained as modern humans, but their facial features protruded a little more and their bodies were stouter than ours. It’s thought that Neanderthals died out from competing, fighting, or interbreeding with H. sapiens.
According to Wrangham, H. erectus must have had fire—just look at their anatomy! H. erectus had smaller jaws and teeth (and smaller faces in general), shorter intestinal tracts, and larger brains than even earlier hominids, such as Australopithecus afarensis, for instance, who were boxier, more apelike, and probably duller. Wrangham argues that H. erectus would not have developed its distinctive traits if the species hadn’t been regularly eating softer, cooked food.
This hypothesis stems from a few modern observations. When you eat cooked food, you have access to many more calories than if you eat the same food raw. There are two reasons: Our digestive systems can extract more calories from a cooked steak (for instance) than a raw steak, and it takes much less energy to cook and eat a steak than to gnaw on a raw one for hours. Access to cooked food means a hominid no longer needs enormous teeth to break down all that raw meat and roughage into swallowable hunks, nor does it need as robust a digestive system to process it all. The combination of more calories and less complicated intestines means more energy can be devoted to cogitating—hence H. erectus’ relatively big brains, which suck up a lot of calories. As evidence for his theory, Wrangham likes to point to the fact that modern-day humans can’t thrive on an all-raw diet—raw foodists tend to stop menstruating, precluding reproduction.
Wrangham’s theory is elegant, but the archaeological record is a little more complicated. There is definitely evidence of fire around 1.6 million years ago in what is now Kenya. But archaeologists dispute whether this was manmade or natural fire. Further complicating Wrangham’s hypothesis is evidence that H. erectus may not have brought fire along when it moved out of Africa into Europe around a million years ago. If fire was as transformative and beneficial as Wrangham says it was, you’d think our ancestors would have brought it with them when they moved to colder climes—or died out if they were unable to do so.
If H. erectus didn’t bring fire mastery to Europe, who did? Archaeologists Wil Roebroeks of Leiden University in the Netherlands and Paola Villa of the University of Colorado Museum found evidence for frequent use of fire by European Neanderthals between 400,000 and 300,000 years ago. Roebroeks and Villa looked at all the data collected at European sites once inhabited by hominids and found no evidence of fire before about 400,000 years ago—but plenty after that threshold. Evidence from Israeli sites put fire mastery at about the same time. H. sapiens arrived on the scene in the Middle East and Europe 100,000 years ago, but our species didn’t have a discernible impact on the charcoal record. Roebroeks and Villa conclude that Neanderthals must have been the ones who mastered fire.
One of the beautiful things about the archaeological record is that archaeologists are always willing to debate about it. Attributing fire to Neanderthals is an overly confident reading of the evidence, according to archaeologist Dennis Sandgathe of British Columbia’s Simon Fraser University. Of course the number of campsites with evidence of fire increased between 1 million and 400,000 years ago, he says—the number of campsites, period, increased during this time in proportion with population growth. But that doesn’t mean the use of fire was universal among European hominids—there are plenty of Neanderthal campsites out there that show little or no evidence of fire, and Sandgathe has personally excavated some of them. What’s more, Sandgathe told me when I asked him about Roebroeks’ and Villa’s data, “We actually have better data than they do when it comes to Neanderthal use of fire.”
According to Sandgathe and his colleagues, hominids didn’t really master fire until around 12,000 years ago—well after Neanderthals had disappeared from the face of the planet (or merged into the human gene pool via interbreeding, depending on your view). Sandgathe and his colleagues excavated two Neanderthal cave sites in France and found, surprisingly, that the sites’ inhabitants used hearths more during warm periods and less during cold periods. Why on earth would Neanderthals not build fires when it was freezing outside? In “On the Role of Fire in Neandertal Adaptations in Western Europe: Evidence from Pech de l’Azé IV and Roc de Marsal, France,” Sandgathe advances the hypothesis that European Neanderthals simply didn’t know how to make fire. All they could do was harvest natural fires—those caused by lightning, for instance—to occasionally warm their bodies and cook their food. (This explains why Sandgathe found more evidence of fire from warm periods: Lightning is far less common during cold spells.)
Roebroeks and Villa think Sandgathe’s reasoning is flawed: After all, there isn’t evidence of fire at every modern human campsite, either, when you look at sites from the Upper Paleolithic period, which concluded about 10,000 years ago. “However, nobody would argue that Upper Paleolithic hunter-gatherers were not habitual users of fire,” they wrote in a response to Sandgathe et al.’s criticism of their work. Wrangham, meanwhile, thinks both Sandgathe et al. and Roebroeks et al. ignore some critical nonarchaeological evidence: his point that contemporary humans can’t survive on a diet of uncooked food. Accepting Sandgathe’s hypothesis, Wrangham wrote in an email, “means that the contemporary evidence is wrong, or that humans have adapted to need cooked food only in the last 12,000 years. Both suggestions are very challenging!”
Why on earth can’t scientists agree on whether people mastered fire 1.8 million years ago or 12,000 years ago? That’s a 150-fold difference. Well, figuring out who burned what, when, is not an easy business. For one thing, archaeologists can’t always tell what caused a fire: a volcano, for instance, a lightning strike, or hominid ingenuity. And even if there is clear evidence of hominid fire use—a hearth at a formerly inhabited cave, for instance—it’s almost impossible to tell whether it was created by people from scratch or merely stolen from a natural fire and then transported to a hearth, where it was kept alive as long as possible. Scientists call this kind of fire use opportunistic.
What’s more, even when people were creating fires, the evidence of said fires doesn’t always stay put. Ashes have a tendency to blow away instead of embedding themselves neatly in the archaeological record, while water can take evidence of fire from its original location and carry it someplace completely different. Then there’s human error: As Sandgathe et al. write in their discussion of the available evidence, “There are … examples where residues originally interpreted as the remains of fires are later identified as something else.” (I hate it when that happens.) At one site in China, for instance, layers of earth originally believed to be ashes were later revealed to be silt and unburned bits of organic matter.
Archaeological methods are improving, and they may well end up bearing out Wrangham’s hypothesis. In a paper published earlier this year, archaeologists used advanced techniques (known as micromorphological and Fourier-transform infrared microspectroscopy) to examine sediment and reveal evidence of fire at a million-year-old South African cave site.
Wrangham is also hopeful that other disciplines will provide evidence for his theory. “I suspect genetics will help,” he says. “If we can pin down the genes underlying the adaptation to cooked food, we may be able to date the control of fire close enough to settle the big question.”
“Sure, that would be pretty compelling evidence,” admits Sandgathe. But he’s hopeful that genetics will bolster his hypothesis: that Neanderthals survived frigid glacial periods not because they regularly used fire, but because they had thick body hair. “At some point someone may announce the discovery of the gene or genes that code for thickness of body hair, and so could answer that question,” he says.
Judging from the way things are going, this debate may rage on for a good while longer. And there is room for more than one right answer: It’s possible that different groups mastered fire independently of one another at different points in time. But laypeople can take comfort in knowing that, even if we don’t know yet who first mastered fire—our simple ancestors almost 2 million years ago, our more advanced cousins 400,000 years ago, or our direct antecedents about 10,000 years ago—there’s no doubt who holds the intellectual property rights to it today. We even put it in an oven and made it our own.
Why Fire Makes Us Human
Wherever humans have gone in the world, they have carried with them two things, language and fire. As they traveled through tropical forests they hoarded the precious embers of old fires and sheltered them from downpours. When they settled the barren Arctic, they took with them the memory of fire, and recreated it in stoneware vessels filled with animal fat. Darwin himself considered these the two most significant achievements of humanity. It is, of course, impossible to imagine a human society that does not have language, but—given the right climate and an adequacy of raw wild food—could there be a primitive tribe that survives without cooking? In fact, no such people have ever been found. Nor will they be, according to a provocative theory by Harvard biologist Richard Wrangham, who believes that fire is needed to fuel the organ that makes possible all the other products of culture, language included: the human brain.
Darwin himself considered language and fire the two most significant achievements of humanity. (Illustration by Frank Stockton)
The expansion of the brain, seen in fossils from different branches of our family tree, may have been aided by fire, first used at least a million years ago. (NMNH, SI)
Every animal on earth is constrained by its energy budget: the calories obtained from food will stretch only so far. And for most human beings, most of the time, these calories are burned not at the gym, but invisibly, in powering the heart, the digestive system and especially the brain, in the silent work of moving molecules around within and among its 100 billion cells. A human body at rest devotes roughly one-fifth of its energy to the brain, regardless of whether it is thinking anything useful, or even thinking at all. Thus, the unprecedented increase in brain size that hominids embarked on around 1.8 million years ago had to be paid for with added calories either taken in or diverted from some other function in the body. Many anthropologists think the key breakthrough was adding meat to the diet. But Wrangham and his Harvard colleague Rachel Carmody think that’s only a part of what was going on in evolution at the time. What matters, they say, is not just how many calories you can put into your mouth, but what happens to the food once it gets there. How much useful energy does it provide, after subtracting the calories spent in chewing, swallowing and digesting? The real breakthrough, they argue, was cooking.
Wrangham, who is in his mid-60s, with an unlined face and a modest demeanor, has a fine pedigree as a primatologist, having studied chimpanzees with Jane Goodall at Gombe Stream National Park. In pursuing his research on primate nutrition he has sampled what wild monkeys and chimpanzees eat, and he finds it, by and large, repellent. The fruit of the Warburgia tree has a “hot taste” that “renders even a single fruit impossibly unpleasant for humans to ingest,” he writes from bitter experience. “But chimpanzees can eat a pile of these fruits and look eagerly for more.” Although he avoids red meat ordinarily, he ate raw goat to prove a theory that chimps combine meat with tree leaves in their mouths to facilitate chewing and swallowing. The leaves, he found, provide traction for the teeth on the slippery, rubbery surface of raw muscle.
Food is a subject on which most people have strong opinions, and Wrangham mostly excuses himself from the moral, political and aesthetic debates it provokes. Impeccably lean himself, he acknowledges blandly that some people will gain weight on the same diet that leaves others thin. “Life can be unfair,” he writes in his 2010 book Catching Fire, and his shrug is almost palpable on the page. He takes no position on the philosophical arguments for and against a raw-food diet, except to point out that it can be quite dangerous for young children. For healthy adults, it’s “a terrific way to lose weight.”
Which is, in a way, his point: Human beings evolved to eat cooked food. It is literally possible to starve to death even while filling one’s stomach with raw food. In the wild, people typically survive only a few months without cooking, even if they can obtain meat. Wrangham cites evidence that urban raw-foodists, despite year-round access to bananas, nuts and other high-quality agricultural products, as well as juicers, blenders and dehydrators, are often underweight. Of course, they may consider this desirable, but Wrangham considers it alarming that in one study half the women were malnourished to the point they stopped menstruating. They presumably are eating all they want, and may even be consuming what appears to be an adequate number of calories, based on standard USDA tables. There is growing evidence that these overstate, sometimes to a considerable degree, the energy that the body extracts from whole raw foods. Carmody explains that only a fraction of the calories in raw starch and protein are absorbed by the body directly via the small intestine. The remainder passes into the large bowel, where it is broken down by that organ’s ravenous population of microbes, which consume the lion’s share for themselves. Cooked food, by contrast, is mostly digested by the time it enters the colon; for the same amount of calories ingested, the body gets roughly 30 percent more energy from cooked oat, wheat or potato starch as compared to raw, and as much as 78 percent from the protein in an egg. In Carmody’s experiments, animals given cooked food gain more weight than animals fed the same amount of raw food. And once they’ve been fed on cooked food, mice, at least, seem to prefer it.
In essence, cooking—including not only heat but also mechanical processes such as chopping and grinding—outsources some of the body’s work of digestion so that more energy is extracted from food and less expended in processing it. Cooking breaks down collagen, the connective tissue in meat, and softens the cell walls of plants to release their stores of starch and fat. The calories to fuel the bigger brains of successive species of hominids came at the expense of the energy-intensive tissue in the gut, which was shrinking at the same time—you can actually see how the barrel-shaped trunk of the apes morphed into the comparatively narrow-waisted Homo sapiens. Cooking freed up time as well: the great apes spend four to seven hours a day just chewing, not an activity that prioritizes the intellect.
The trade-off between the gut and the brain is the key insight of the “expensive tissue hypothesis,” proposed by Leslie Aiello and Peter Wheeler in 1995. Wrangham credits this with inspiring his own thinking—except that Aiello and Wheeler identified meat-eating as the driver of human evolution, while Wrangham emphasizes cooking. “What could be more human,” he asks, “than the use of fire?”
Unsurprisingly, Wrangham’s theory appeals to people in the food world. “I’m persuaded by it,” says Michael Pollan, author of Cooked, whose opening chapter is set in the sweltering, greasy cookhouse of a whole-hog barbecue joint in North Carolina, which he sets in counterpoint to lunch with Wrangham at the Harvard Faculty Club, where they each ate a salad. “Claude Lévi-Strauss, Brillat-Savarin treated cooking as a metaphor for culture,” Pollan muses, “but if Wrangham is right, it’s not a metaphor, it’s a precondition.”
Wrangham, with his hard-won experience in eating like a chimpanzee, tends to assume that—with some exceptions such as fruit—cooked food tastes better than raw. But is this an innate mammalian preference, or just a human adaptation? Harold McGee, author of the definitive On Food and Cooking, thinks there’s an inherent appeal in the taste of cooked food, especially so-called Maillard compounds. These are the aromatic products of the reaction of amino acids and carbohydrates in the presence of heat, responsible for the tastes of coffee and bread and the tasty brown crust on a roast. “When you cook food you make its chemical composition more complex,” McGee says. “What’s the most complex natural, uncooked food? Fruit, which is produced by plants specifically to appeal to animals. I used to think it would be interesting to know if humans are the only animals that prefer cooked food, and now we’re finding out it’s a very basic preference.”
East Africa
The earliest evidence of humans using fire comes from several archaeological sites in East Africa, such as Chesowanja near Lake Baringo, Koobi Fora, and Olorgesailie in Kenya. The evidence at Chesowanja consists of red clay shards estimated to be 1.42 million years old. Scientists reheated some of the shards found at the site and determined that the clay must have been heated to 400 °C to harden.
At Koobi Fora, archaeological sites show evidence of control of fire by Homo erectus 1.5 million years ago, with a reddening of sediment that can only come from heating at 200–400 °C. There is also a hearth-like depression at a site in Olorgesailie, Kenya. Some very tiny fragments of charcoal were found there, but they could have come from a natural brush fire.
In Gadeb, Ethiopia, fragments of welded tuff that appeared to have been burned were found in Locality 8E, among Acheulean artifacts made by H. erectus, but the re-firing of the rocks may have been caused by nearby volcanic eruptions.
In the Middle Awash River Valley, cone-shaped depressions of reddish clay were found that could have been made by temperatures of 200 °C. These features are thought to be burned tree stumps, suggesting that early hominids carried fire away from its source to their habitation site. There are also burnt stones in the Awash Valley, but volcanic welded tuff is present in the area as well.
Southern Africa
The earliest certain evidence of human control of fire was found at Swartkrans, South Africa. Many burnt bones were found among Acheulean tools, bone tools, and bones with cut marks made by hominids. The site also shows some of the earliest evidence of H. erectus eating meat. The Cave of Hearths in South Africa has burned deposits dated from 0.2 to 0.7 mya, as do many other sites, such as Montagu Cave (0.058 to 0.2 mya) and the Klasies River Mouth (0.12 to 0.13 mya).
The most powerful evidence comes from Kalambo Falls in Zambia, where many artifacts related to human use of fire have been found, including charred wood, charcoal, reddened areas, carbonized grass stems and plants, and wooden implements that may have been hardened by fire. The site was dated to 61,000 BP by radiocarbon dating and to 110,000 BP by amino acid racemization.
Fire was used to heat silcrete stones to increase their workability before they were knapped into tools by the Stillbay culture. This evidence appears not only at Stillbay sites dating back to 72,000 BP but also at sites that could be as old as 164,000 BP.
Control of fire, and the light it provided, brought about an important change in human behavior: activity was no longer restricted to daylight hours. In addition, some mammals and biting insects avoid fire and smoke. Fire also led to better nutrition through cooked proteins.
Richard Wrangham of Harvard University argues that cooking of plant foods may have caused the brain to get bigger, because it made complex carbohydrates in starchy foods easier to digest. This allowed humans to absorb more calories from their food.   
Stahl thought that because some parts of plants, such as raw cellulose and starch, are hard to digest in uncooked form, they were unlikely to be part of the hominid diet before fire could be controlled. These parts include stems, mature leaves, enlarged roots, and tubers. Instead, the diet was made up of the parts of plants that contained simpler sugars and carbohydrates, such as seeds, flowers, and fleshy fruits. Another problem was that some seeds and carbohydrate sources are poisonous: cyanogenic glycosides, which are found in linseed, cassava, and manioc, among others, are made non-poisonous through cooking. The teeth of H. erectus, and the wear on those teeth, reflect the consumption of foods such as tough meats and crisp root vegetables.
The cooking of meat, as can be seen from burned and blackened mammal bones, makes the meat easier to eat and its protein easier to digest. The amount of energy needed to digest cooked meat is less than that needed for raw meat, and cooking gelatinizes collagen and other connective tissues as well, which “opens up tightly woven carbohydrate molecules for easier absorption.” Cooking also kills parasites and food-poisoning bacteria.
Human Ancestors May Have Evolved the Physical Ability to Speak More Than 25 Million Years Ago
Speech is part of what makes us uniquely human, but what if our ancestors had the ability to speak millions of years before Homo sapiens even existed?
Some scientists have theorized that it only became physically possible to speak a wide range of essential vowel sounds when our vocal anatomy changed with the rise of Homo sapiens some 300,000 years ago. This theoretical timeline means that language, where the brain associates words with objects or concepts and arranges them in complex sentences, would have been a relatively recent phenomenon, developing with or after our ability to speak a diverse array of sounds.
But a comprehensive study analyzing several decades of research, from primate vocalization to vocal tract acoustic modeling, suggests the idea that only Homo sapiens could physically talk may miss the mark when it comes to our ancestors’ first speech—by a staggering 27 million years or more.
Linguist Thomas Sawallis of the University of Alabama and colleagues stress that functional human speech is rooted in the ability to form contrasting vowel sounds. These critical sounds are all that differentiates entirely unrelated words like "bat," "bought," "but" and "bet." Building a language without the variety of these contrasting vowel sounds would be nearly impossible. The research team’s new study in Science Advances concludes that early human ancestors, long before even the evolution of the genus Homo, actually did have the anatomical ability to make such sounds.
When, over all those millions of years, human ancestors developed the cognitive ability to use speech to converse with each other remains an open question.
“What we’re saying is not that anyone had language any earlier,” Sawallis says. “We’re saying that the ability to make contrasting vowel qualities dates back at least to our last common ancestor with Old World monkeys like macaques and baboons. That means the speech system had at least 100 times longer to evolve than we thought.”
A screaming guinea baboon. Studies that have found monkeys such as baboons and macaques can make contrasting vowel sounds suggest that the last common ancestor between these primates and modern humans could make the sounds too. (Andyworks via Getty Images)
The study explores the origins and abilities of speech with an eye toward the physical processes that primates use to produce sounds. “Speech involves the biology of using your vocal tracts and your lips. Messing around with that as a muscular production, and getting a sound out that can get into somebody else’s ear that can identify what was intended as sounds—that’s speech,” Sawallis says.
A long-popular theory of the development of the larynx, first advanced in the 1960s, held that an evolutionary shift in throat structure was what enabled modern humans, and only modern humans, to begin speaking. The human larynx is much lower, relative to cervical vertebrae, than that of our ancestors and other primates. The descent of the larynx, the theory held, was what elongated our vocal tract and enabled modern humans to begin making the contrasting vowel sounds that were the early building blocks of language. “The question is whether that’s the key to allowing a full, usable set of contrasting vowels,” Sawallis says. “That’s what we have, we believe, definitely disproven with the research that’s led up to this article.”
The team reviewed several studies of primate vocalization and communication, and they used data from earlier research to model speech sounds. Several lines of research suggested the same conclusion—humans aren’t alone in their ability to make these sounds, so the idea that our unique anatomy enabled them doesn’t appear to hold water.
Cognitive scientist Tecumseh Fitch and colleagues in 2016 used X-ray videos to study the vocal tracts of living macaques and found that monkey vocal tracts are speech ready. “Our findings imply that the evolution of human speech capabilities required neural changes rather than modifications of vocal anatomy. Macaques have a speech-ready vocal tract but lack a speech-ready brain to control it,” the study authors wrote in Science Advances.
In a 2017 study, a team led by speech and cognition researcher Louis-Jean Boë of Université Grenoble Alpes in France, also lead author of the new study, came to the same conclusion as the macaque study. By analyzing over 1,300 naturally produced vocalizations from a baboon troop, they determined that the primates could make contrasting proto-vowel sounds.
Some animals, including birds and even elephants, can mimic human voice sounds by using an entirely different anatomy. These amazing mimics illustrate how cautious scientists must be in assigning sounds or speech to specific places in the evolutionary journey of human languages.
“Of course, vocalization involves vowel production and of course, vocalization is a vital evolutionary precursor to speech,” says paleoanthropologist Rick Potts of Smithsonian’s Human Origins Program, in an email. “The greatest danger is equating how other primates and mammals produce vowels as part of their vocalizations with the evolutionary basis for speech.”
While anatomy of the larynx and vocal tract help make speech physically possible, they aren’t all that’s required. The brain must also be capable of controlling the production and the hearing of human speech sounds. In fact, recent research suggests that while living primates can have a wide vocal range—at least 38 different calls in the case of the bonobo—they simply don’t have the brainpower to develop language.
“The fact that a monkey vocal tract could produce speech (with a human like brain in control) does not mean that they did. It just shows that the vocal tract is not the bottle-neck,” says University of Vienna biologist and cognitive scientist Tecumseh Fitch in an email.
A male Japanese macaque, or snow monkey, making a threatening expression in Jigokudani Yaen-Koen Park. (Anup Shah)
Where, when, and in which human ancestor species a language-ready brain developed is a complicated and fascinating field for further research. By studying the way our primate relatives like chimpanzees use their hands naturally, and can learn human signs, some scientists suspect that language developed first through gestures and was later made much more efficient through speech.
Other researchers are searching backward in time for evidence of a cognitive leap forward which produced complex thought and, in turn, speech language abilities able to express those thoughts to others—perhaps with speech and language co-evolving at the same time.
Language doesn’t leave fossil evidence, but more enduring examples of how our ancestors used their brains, like tool-making techniques, might be used as proxies to better understand when ancient humans started using complex symbols—visual or vocal—to communicate with one another.
For example, some brain studies show that language uses similar parts of the brain as toolmaking, and suggest that by the time the earliest advanced stone tools emerged 2 million years ago, their makers might have had the ability to talk to each other. Some kind of cognitive advance in human prehistory could have launched both skills.
Sawallis says that the search for such advances in brain power can be greatly expanded, millions of years back in time, now that it’s been shown that the physical ability for speech has existed for so long. “You might think of the brain as a driver and the vocal tract as a vehicle,” he says. “There’s no amount of computing power that can make the Wright Flyer supersonic. The physics of the object define what that object can do in the world. So what we’re talking about is not the neurological component that drives the vocal tract, we’re just talking about the physics of the vocal tract.”
How long did it take for our ancestors to find the voices they were equipped with all along? The question is a fascinating one, but unfortunately their bones and stones remain silent.