Thousands of years ago, it didn’t just rain on the Sahara Desert. It poured.
Grasslands, trees, lakes and rivers once covered North Africa’s now arid, unforgiving landscape. From about 11,000 to 5,000 years ago, much higher rainfall rates than previously estimated created that “Green Sahara,” say geologist Jessica Tierney of the University of Arizona in Tucson and her colleagues. Extensive ground cover, combined with reductions of airborne dust, intensified water evaporation into the atmosphere, leading to monsoonlike conditions, the scientists report January 18 in Science Advances.

Tierney’s team reconstructed western Saharan rainfall patterns over the last 25,000 years. Estimates relied on measurements of forms of carbon and hydrogen in leaf wax recovered from ocean sediment cores collected off the Sahara’s west coast. Concentrations of these substances reflected ancient rainfall rates.
Rainfall ranged from 250 to 1,670 millimeters annually during Green Sahara times, the researchers say. Previous estimates — based on studies of ancient pollen that did not account for dust declines — reached no higher than about 900 millimeters. Saharan rainfall rates currently range from 35 to 100 millimeters annually.
Leaf-wax evidence indicates that the Green Sahara dried out from about 8,000 to 7,000 years ago before rebounding. That’s consistent with other ancient climate simulations and with excavations suggesting that humans temporarily left the area around 8,000 years ago. Hunter-gatherers departed for friendlier locales, leaving cattle herders to spread across North Africa once the Green Sahara returned (SN Online: 6/20/12), the investigators propose.
Hunter-gatherers and farming villagers who live in worlds without lightbulbs or thermostats sleep slightly less at night than smartphone-toting city slickers, researchers say.
“Contrary to conventional wisdom, people in societies without electricity do not sleep more than those in industrial societies like ours,” says UCLA psychiatrist and sleep researcher Jerome Siegel, who was not involved in the new research.
Different patterns of slumber and wakefulness in each of these groups highlight the flexibility of human sleep — and also point to potential health dangers in how members of Western societies sleep, conclude evolutionary biologist David Samson of Duke University and colleagues. Compared with other primates, human evolution featured a shift toward sleeping more deeply over shorter time periods, providing more time for learning new skills and knowledge as cultures expanded, the researchers propose. Humans also evolved an ability to revise sleep schedules based on daily work schedules and environmental factors such as temperature.

Samson’s team describes sleep patterns in 33 East African Hadza hunter-gatherers over a total of 393 days in a paper published online January 7 in the American Journal of Physical Anthropology. The team’s separate report on slumber among 21 rural farmers in Madagascar over 292 days will appear later this year in the American Journal of Human Biology.
Sleep patterns in these groups were tracked with wrist devices that measure a person’s activity levels. Both Hadza and Malagasy volunteers slept an average of about 6.5 hours nightly, less than the roughly seven-hour average for most U.S. adults. Foragers and villagers, who slept in areas with various family and group members, awoke more frequently during the night than has been reported among Westerners. Scalp electrodes worn at night by nine villagers during nine nights revealed biological signs of relatively light sleep compared with Westerners, including shorter periods of slow-wave and rapid eye movement sleep.

But Hadza and Malagasy individuals often supplemented nighttime sleep with one or two daytime naps. Shut-eye breaks averaged 47.5 minutes for the Hadza and about 55 minutes for villagers.

Critically, Samson says, foragers and villagers displayed more consistent daily cycles of sleep and wakefulness than are characteristic of Westerners. Hadza adults tended to hit the sack — or, more commonly, the mat — shortly after midnight and nap in the early afternoon. Malagasy villagers napped once or twice during the day’s hottest hours, usually starting around noon, and retired in the early evening. At night, they slept in two phases, awakening for around an hour shortly after midnight. Historical accounts describe a similar sleep pattern among Western Europeans between 500 and 200 years ago — two sleep segments, divided by a period of activity or reflection (SN: 9/25/99, p. 205). Nighttime sleep in both populations became deeper and less fragmented as tropical humidity dipped.
Researchers also noted that hunter-gatherers and villagers got plenty of direct sunlight, unlike many Westerners. Several studies have linked inconsistent sleep-wake cycles and lack of sun exposure to health problems, including inflammation and heart problems, Samson says. “People in modern societies can take lessons from this research by attempting to get lots of light exposure during the day while reducing blue-wave light exposure after dark and dropping inside temperatures by a few degrees at night.” Smartphones and other digital devices emit blue-wave light, which can suppress melatonin production and delay sleep.
Effects of wayward sleep patterns or too little sleep on health vary across cultures and regions, says biomedical anthropologist Kristen Knutson of Northwestern University Feinberg School of Medicine in Chicago. For instance, sleeping less than six hours per night may increase appetite, as some studies suggest, but a sleep-deprived office worker surrounded by fast-food joints is more likely to become obese than a physically active hunter-gatherer faced with a limited food supply.
Samson’s research aligns with previous work by Knutson finding that rural Haitians living without electricity sleep an average of about seven hours nightly. In addition, Siegel’s team recently reported that nightly sleep averages 5.7 to 7.1 hours in three hunter-gatherer societies, including the Hadza (SN: 11/14/15, p. 10).
Until recently, researchers thought cannibalism took place only among a few species in the animal kingdom and only under extraordinary circumstances. But as zoologist Bill Schutt chronicles in Cannibalism, plenty of creatures inhabit their own version of a dog-eat-dog world.
Over the last few decades, scientists have observed cannibalism — defined by Schutt as eating all or part of another individual of the same species — among all major groups of vertebrates. The practice seems to be even more prevalent, and less discriminating, among invertebrates such as mollusks, insects and spiders, whose eggs, larvae and young are often produced in profusion and are therefore readily available, not to mention nutritious. Cannibalism, Schutt contends, makes perfect evolutionary sense, and not merely as a feeding strategy. When food supplies are low or living conditions are crowded, some mammals and birds may eat some or all of their offspring to terminate an expenditure of effort with poor chances of paying off. For birds, eating a dead or dying hatchling also is a way to get rid of a carcass that could spread infection or whose scent could attract flies or predators to the nest.
Switching to a historical and cultural perspective, Schutt tackles the various forms of human cannibalism, where, he admits, “the ick factor is high.” That includes medicinal cannibalism, from 17th- and 18th-century Europeans’ consumption of powdered mummies to modern moms’ ingestion of their newborns’ placentas to purportedly restore nutrients lost during childbirth. The author also explores survival cannibalism (think famine victims, people under siege, plane-crash survivors and the ill-fated Donner Party) and briefly addresses our natural shock and seemingly unnatural fascination with criminal cannibalism (à la Jeffrey Dahmer).
As Schutt explains, ritual cannibalism — the consumption of a foe or loved one to acquire the decedent’s strength, courage or wisdom — is a practice that has apparently taken place in different cultures throughout history. In an interesting aside, Schutt ponders whether people who consume wafers and wine during Communion, especially those who firmly believe these items are literally converted into the body and blood of Christ, are engaging in a form of ritual cannibalism.
Cannibalism is a wide-ranging, engaging and thoroughly fun read. The author’s numerous field trips and lab visits with scientists who study the phenomenon heartily enrich this captivating book.
Human gene editing to prevent genetic diseases from being passed to future generations may be permissible under certain conditions, a panel of experts says.
Altering DNA in germline cells — embryos, eggs, and sperm, or cells that give rise to them — may be used to cure genetic diseases for future generations, provided it is done only to correct disease or disability, not to enhance people’s health or abilities, a report issued February 14 by the National Academies of Sciences and Medicine recommends.

The decision contradicts earlier recommendations by organizers of a global summit on human gene editing, who concluded that gene editing with molecular scissors such as CRISPR/Cas9 should not be used to produce babies (SN: 12/26/15, p. 12).

Heritable gene editing is not yet ready to be done in people, says Alta Charo, a bioethicist at the University of Wisconsin–Madison Law School who cochaired the panel. “We are not trying to greenlight heritable germline editing. We’re trying to find that limited set of circumstances where its use is justified by a compelling need and its application is limited to that compelling need,” says Charo. “We’re giving it a yellow light.”
National Academies reports carry no legislative weight, but they often influence policy decisions in the United States and abroad. It will be up to Congress, regulatory agencies such as the U.S. Food and Drug Administration, and state and local governments to decide whether to implement the recommendations.
Supporters of new genetic engineering technologies hailed the decision.
“It looks like the possibility of eliminating some genetic diseases is now more than a theoretical option,” says Sean Tipton, a spokesman for the American Society for Reproductive Medicine in Washington, D.C. “That’s what this sets up.” Diseases such as cystic fibrosis and Huntington’s, which are caused by mutations in single genes, could someday be corrected by gene editing. More complex diseases or disorders caused by changes in multiple genes, such as autism or schizophrenia, probably would not be the focus of genome editing.
Others worry that allowing any tinkering with the germline will inevitably lead to “designer babies” and other social ills. It raises fears of stigmatization of people with disabilities, exacerbation of inequalities between people who can afford such therapies and those who can’t, and even a new kind of eugenics, critics say. “Once you approve any form of human germline modification you really open the door to all forms,” says Marcy Darnovsky, executive director of the Center for Genetics and Society in Berkeley, Calif.
Panelist Jeffrey Kahn, a bioethicist at Johns Hopkins University, says the door to heritable gene therapy remains closed until stringent requirements can be met. “It’s frankly more of a knock on the door,” he said at the public presentation of the report.
The report also changes the debate from whether to allow germline editing to instead focus on the line between therapy and enhancement, Darnovsky says. “I’m feeling very unsettled and disappointed by what they are recommending.”
Several clinical trials in the United States, China and other countries are already under way to do gene editing in people who have cancer or other diseases. But those therapies do not involve altering germline cells; instead they fix defects or make alterations to DNA in other body, or “somatic,” cells. The panel recommended that such somatic cell therapies should also be restricted to treating diseases, not allowing enhancements.
Researchers in the United Kingdom, Sweden and China have already done gene editing on early human embryos in the lab. Recent clinical trials in Mexico and Ukraine to produce “three-parent babies” are also seen as altering the germline because such children carry a small amount of DNA from an egg donor (SN Online: 10/18/16). But those children don’t have modifications of their nuclear DNA, where the genetic instructions that determine traits are stored.
Currently, researchers in the United States are effectively banned from conducting clinical trials that would produce heritable changes in the human genome, either by gene editing or making three-parent babies. The new recommendations could pave the way to allow such experiments.
But the panel lays out a number of hurdles that must be cleared before germline editing could move forward, ones that may be impossible to overcome, says Nita Farahany, a bioethicist at Duke Law School in Durham, N.C. “Some people could read into the stringency of the requirements to think that the benefits could never outweigh the risks,” she says.
One hurdle is a requirement to follow multiple generations of children whose genomes have been edited to determine whether the therapy has consequences for future generations. Researchers would never be able to guarantee that they could conduct such long-term studies, Farahany says. “You can’t bind your children and grandchildren to agree to be tracked by such studies.”
Distinctions between therapies and enhancements are also vague. Researchers may not be able to convincingly draw lines between them, says George Church, a Harvard University geneticist who has developed CRISPR/Cas9 for a variety of purposes. Virtually everything medicine has accomplished could be considered an enhancement of human life, he says. “Vaccines are advancements over our ancestors. If you could tell our ancestors they could walk into a smallpox ward and not even worry about it, that would be a superpower.”
But the new technology may be a less effective route to human enhancement than drugs are, says Charo. Gene-editing technologies are so precise and specific that someone who does not carry a disease-causing mutation would probably not benefit from the technology, she says.