A pair of simultaneous nuclear explosions, one more than 1.6 miles underground and the other 1,000 feet above it, have been proposed as a way to extract huge quantities of natural gas from subterranean rock. Each blast would be … about 2.5 times the size of the bomb used at Hiroshima. By breaking up tight gas-bearing rock formations, a flow of presently inaccessible gas may be made available.… A single-blast experiment, called Project Gasbuggy, is already planned. — Science News, December 17, 1966

Update

On December 10, 1967, Project Gasbuggy went ahead, with a 29-kiloton nuclear explosion deep underground in northwestern New Mexico. The blast released natural gas, but the gas was radioactive. The area is still regularly monitored for radioactive contamination. Today, natural gas trapped below Earth’s surface is often extracted via fracking, which breaks up rock using pressurized fluid (SN: 9/8/12, p. 20). Though less extreme, the technique has stoked fears over potential links to drinking water contamination and earthquakes.
With virtual reality finally hitting the consumer market this year, VR headsets are bound to make their way onto a lot of holiday shopping lists. But new research suggests these gifts could also give some of their recipients motion sickness — especially if they’re women.
In a test of people playing one virtual reality game using an Oculus Rift headset, more than half felt sick within 15 minutes, a team of scientists at the University of Minnesota in Minneapolis reports online December 3 in Experimental Brain Research. Among women, nearly four out of five felt sick. So-called VR sickness, also known as simulator sickness or cybersickness, has been recognized since the 1980s, when the U.S. military noticed that flight simulators were nauseating its pilots. In recent years, anecdotal reports began trickling in about the new generation of head-mounted virtual reality displays making people sick. Now, with VR making its way into people’s homes, there’s a steady stream of claims of VR sickness.
“It’s a high rate of people that you put in [VR headsets] that are going to experience some level of symptoms,” says Eric Muth, an experimental psychologist at Clemson University in South Carolina with expertise in motion sickness. “It’s going to mute the ‘Wheee!’ factor.”
Oculus, which Facebook bought for $2 billion in 2014, released its Rift headset in March. The company declined to comment on the new research but says it has made progress in making the virtual reality experience comfortable for most people, and that developers are getting better at creating VR content. All approved games and apps get a comfort rating based on things like the type of movements involved, and Oculus recommends starting slow and taking breaks. But some users still report getting sick.
The new study confirms these reports. A team led by Thomas Stoffregen, a kinesiologist who has been studying motion sickness for decades, tested the susceptibility of two sets of 18 male and 18 female undergraduates during two different VR games using an Oculus Rift DK2 headset. The first game, which involved using head motions to roll a virtual marble through a maze, made 22 percent of the players feel sick within the 15 minutes they were asked to play.
Another 36 students played the horror game Affected, using a hand-held controller to navigate a creepy building. This time, 56 percent felt sick within 15 minutes. Fourteen of 18 women, nearly 78 percent, were affected, compared with just over 33 percent of men. Though the study tested only an Oculus Rift, other companies’ VR headsets based on similar technology may have similar issues.

This gender difference shows up in almost any situation that can cause motion sickness, like a moving car or a rocking boat. But Stoffregen says the disparity can’t be explained by the most widely accepted theory of motion sickness, which suggests that it’s caused by a mismatch between the motion your body is sensing and what your eyes are seeing, like when you read in a moving car. With VR, the theory goes, your eyes think you’re moving, but your body feels stationary, and this makes you feel sick.
Stoffregen thinks motion sickness is instead caused by things that disrupt your balance, like a boat pitching over a wave. And if you try to stabilize your body in the virtual world you see — say, by leaning into a virtual turn — instead of in the physical world you’re in, you can lose stability.
Men and women are typically different shapes and sizes, so they differ in the subtle, subconscious movements that keep their bodies balanced, known as postural sway, Stoffregen says. This difference makes women more susceptible to motion sickness, he claims. For the new study, he measured participants’ balancing motions before they played the games and found a measurable difference in sway between those who reported feeling sick and those who didn’t.
Because motion sickness is a complicated set of symptoms, self-reporting by participants may not be a reliable way to measure it, Muth argues. And, he says, “I would say the science isn’t there yet to draw that conclusion” about gender bias, adding he’d like to see the result replicated with a larger group.
Even so, with VR potentially poised to jump from the gaming world into more mainstream aspects of society — Facebook CEO Mark Zuckerberg says he wants “a billion people on Facebook in virtual reality as soon as possible” — a gender disparity could become a real problem, especially if VR enters the workplace, Stoffregen says. “If it were only games, it wouldn’t matter, and nobody would care.”
It was barely more than half a century ago that the Nobel Prize–winning virologist Sir Frank Macfarlane Burnet mused about the demise of contagions. “To write about infectious disease,” he wrote in 1962, “is almost to write of something that has passed into history.”
If only. In the past several decades, over 300 infectious pathogens have either newly emerged or emerged in new places, causing a steady drumbeat of outbreaks and global pandemic scares.
Over the course of 2016, their exploits reached a crescendo. Just as the unprecedented outbreak of Ebola in West Africa was collapsing in early 2016, the World Health Organization declared Zika virus, newly erupted in the Americas, an international public health emergency. What would balloon into the largest outbreak of yellow fever in Angola in 30 years had just begun.

A few months later, scientists reported the just-discovered “superbug” mcr-1 gene in microbes collected from humans and pigs in the United States (SN Online: 5/27/16). The gene allows bacteria to resist the last-ditch antibiotic colistin, bringing us one step closer to a looming era of untreatable infections that would transform the practice of medicine. Its arrival presaged yet another unprecedented event: the convening of the United Nations General Assembly to consider the global problem of antibiotic-resistant bugs. It was only the fourth time over its 70-plus-year history that the assembly had been compelled to consider a health challenge. It’s “huge,” says University of Toronto epidemiologist David Fisman.

But even as UN delegates arrived for their meeting in New York City in September, another dreaded infection was making headlines again. The international community’s decades-long effort to end the transmission of polio had unraveled. In 2015, the WHO had declared Nigeria, one of the three last countries in the world that suffered the infection, free of wild polio. By August 2016, it was back. Millions would have to be vaccinated to keep the infection from establishing a foothold.

Three fundamental, interrelated factors fuel the microbial comeback, experts say. Across the globe, people are abandoning the countryside for life in the city, leading to rapid, unplanned urban expansions. In crowded conditions with limited access to health care and poor sanitation, pathogens like Ebola, Zika and influenza enjoy lush opportunities to spread. With more infections mingling, there are also more opportunities for pathogens to share their virulence genes.
At the same time, global demand for meat has quadrupled over the last five decades by some estimates, driving the spread of industrial livestock farming techniques that can allow benign microbes to become more virulent. The use of colistin in livestock agriculture in China, for example, has been associated with the emergence of mcr-1, which was first discovered during routine surveillance of food animals there. Genetic analyses suggest that siting factory farms full of chickens and pigs in proximity to wild waterfowl has played a role in the emergence of highly virulent strains of avian influenza. Crosses of Asian and North American strains of avian influenza caused the biggest outbreak of animal disease in U.S. history in 2014–2015. Containing that virus required the slaughter of nearly 50 million domesticated birds and cost over $950 million. Worryingly, some strains of avian influenza, such as H5N1, can infect humans.

The thickening blanket of carbon dioxide in the atmosphere resulting from booming populations of people and livestock provides yet another opportunity for pathogens to exploit. Scientists around the world have documented the movement of disease-carrying creatures including mosquitoes and ticks into new regions in association with newly amenable climatic conditions. Climate scientists predict range changes for bats and other animals as well. As the organisms spread into new ranges, they carry pathogens such as Ebola, Zika and Borrelia burgdorferi (a bacterium responsible for Lyme disease) along with them.

Since we can rarely develop drugs and vaccines fast enough to stanch the most dangerous waves of disease, early detection will be key moving forward. Researchers have developed a welter of models and pilot programs showing how environmental cues such as temperature and precipitation fluctuations and the insights of wildlife and livestock experts can help pinpoint pathogens with pandemic potential before they cause outbreaks in people. Chlorophyll signatures, a proxy for the plankton concentrations that are associated with cholera bacteria, can be detected from satellite data, potentially providing advance notice of cholera outbreaks.
Even social media chatter can be helpful. Innovative financing methods, such as the World Bank’s recently launched Pandemic Emergency Financing Facility — a kind of global pandemic insurance policy funded by donor countries, the reinsurance market and the World Bank — could help ensure that resources to isolate and contain new pathogens are readily available, wherever they take hold. Right now, emerging disease expert Peter Daszak points out, “we wait for epidemics to emerge and then spend billions on developing vaccines and drugs.” The nonprofit organization that Daszak directs, EcoHealth Alliance, is one of a handful that instead aim to detect new pathogens at their source and proactively minimize the risk of their spread.
Burnet died in 1985, two years after the discovery of HIV, one of the first of the latest wave of new pathogens. His vision of a contagion-free society was that of a climber atop a foothill surrounded by peaks, mistakenly thinking he’d reached the summit. The challenge of surviving in a world of pathogens is far from over. In many ways, it’s only just begun.
SAN FRANCISCO — One climate doomsday scenario can be downgraded, new research suggests.
Decades of atmospheric measurements from a site in northern Alaska show that rapidly rising temperatures there have not significantly increased methane emissions from the neighboring permafrost-covered landscape, researchers reported December 15 at the American Geophysical Union’s fall meeting.
Some scientists feared that Arctic warming would unleash large amounts of methane, a potent greenhouse gas, into the atmosphere, worsening global warming. “The ticking time bomb of methane has clearly not manifested itself yet,” said study coauthor Colm Sweeney, an atmospheric scientist at the University of Colorado Boulder. Emissions of carbon dioxide — a less potent greenhouse gas — did increase over that period, the researchers found. The CO2 rise “is still bad, it’s just not as bad” as a rise in methane, said Franz Meyer, a remote sensing scientist at the University of Alaska Fairbanks who was not involved in the research. The measurements were taken at just one site, though, so Meyer cautions against applying the results to the entire Arctic just yet. “This location might not be representative,” he said.
Across the Arctic, the top three meters of permafrost contain 2.5 times as much carbon as the CO2 released into the atmosphere by human activities since the start of the Industrial Revolution. As the Arctic rapidly warms, these thick layers of frozen soil will thaw and some of the carbon will be converted by hungry microbes into methane and CO2, studies that artificially warmed permafrost have suggested. That carbon will have a bigger impact on Earth’s climate as methane than it will as CO2. Over a 100-year period, a ton of methane will cause about 25 times as much warming as a ton of CO2.
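To put that 25-to-1 figure in concrete terms, here is a minimal arithmetic sketch in Python of how a release of methane compares with a release of CO2 in CO2-equivalent terms. It is an illustration only, assuming a hypothetical one-ton release and using no numbers beyond the 100-year figure quoted above.

```python
# A minimal sketch (not from the article) of CO2-equivalent arithmetic,
# using the article's round figure: over 100 years, a ton of methane
# causes about 25 times as much warming as a ton of CO2.
METHANE_GWP_100 = 25  # 100-year warming impact of methane relative to CO2

def warming_in_co2_equivalents(tons_co2: float, tons_methane: float) -> float:
    """Combined warming impact, expressed in tons of CO2-equivalent."""
    return tons_co2 + tons_methane * METHANE_GWP_100

# Hypothetical illustration: the same one-ton release warms far more as methane.
print(warming_in_co2_equivalents(tons_co2=1, tons_methane=0))  # 1 ton CO2-eq
print(warming_in_co2_equivalents(tons_co2=0, tons_methane=1))  # 25 tons CO2-eq
```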
A research station in Alaska’s northernmost city, Barrow, has been monitoring methane concentrations in the Arctic air since 1986 and CO2 since 1973. An air intake on a tower about 16.5 meters off the ground constantly sniffs the air, taking measurements. Barrow has warmed more than twice as fast as the rest of the Arctic over the last 29 years. This rapid warming “makes this region of the Arctic a great little incubation test to see what happens when we have everything heating up much faster,” Sweeney said.
Over the course of a year, methane concentrations in winds wafting from the nearby tundra rise and fall with temperatures, the Barrow data show. Since 1986, though, seasonal methane emissions have remained largely stable overall. But concentrations of CO2 in air coming from over the tundra, compared with over the nearby Arctic Ocean, have increased by about 0.02 parts per million per year since 1973, the researchers reported.
The lack of an increase in methane concentrations could be caused by the thawing permafrost allowing water to escape and drying the Arctic soil, Sweeney proposed. This drying would limit the productivity of methane-producing microbes, potentially counteracting the effects of warming. Tracking Arctic wetness will be crucial for predicting future methane emissions in the region, said Susan Natali, an Arctic scientist at the Woods Hole Research Center in Falmouth, Mass. Studies have shown increased methane emissions from growing Arctic lakes, she points out. “We’re going to get both carbon dioxide and methane,” she said. “It depends on whether areas are getting wetter or drier.”
NEW ORLEANS, La. – Skin that mostly hangs loose around hagfishes proves handy for living through a shark attack or wriggling through a crevice.
The skin on hagfishes’ long, sausage-style bodies is attached in a line down the center of their backs and in flexible connections where glands release slime, explained Douglas Fudge of Chapman University in Orange, Calif. This floating skin easily slip-slides in various directions. A shark tooth can puncture the skin but not stab into the muscle below. And a shark attack is just one of the crises when loose skin can help, Fudge reported January 5 at the annual meeting of the Society for Integrative and Comparative Biology.

Hagfishes can fend off an attacking shark by quick-releasing a cloud of slime. Yet video of such events shows that a shark can land a bite before getting slimed. To figure out how hagfishes might survive such wounds, Fudge and colleagues used an indoor guillotine to drop a large mako shark tooth into hagfish carcasses. With the skin in its naturally loose state, the tooth readily punched through skin but slipped away from stabbing into the body of either the Atlantic (Myxine glutinosa) or Pacific (Eptatretus stoutii) hagfish species. But when the researchers glued the skin firmly to the hagfish muscle so the skin couldn’t slip, the tooth typically plunged into inner tissue. For comparison, the researchers tested lampreys, which are similarly tube-shaped but with skin well-fastened to their innards. When the guillotine dropped on them, the tooth often stabbed directly into flesh.

The finding makes sense to Theodore Uyeno of Valdosta State University in Georgia, whose laboratory work suggests how loose skin might work in minimizing damage from shark bites. He and colleagues have tested how hard it is to puncture swatches of skin from both the Atlantic and Pacific species. As is true for many other materials, punching through a swatch of hagfish skin held taut didn’t take as long as punching through skin patches allowed to go slack, he said in a January 5 presentation at the meeting. Even a slight delay when a sharp point bears down on baggy skin might allow the hagfish to start dodging and sliming.
But Michelle Graham, who studies locomotion in flying snakes at Virginia Tech, wondered if puncture wounds would be a drawback to such a defense. A hagfish that avoids a deep stab could still lose blood from the skin puncture. That’s true, said Fudge, but the loss doesn’t seem to be great. Hagfish have unusually low blood pressure, and video of real attacks doesn’t show great gushes.
Hagfish blood also plays a part in another benefit of loose skin — an unusual ability to wiggle through cracks, Fudge reported in a second talk at the meeting. One of his students built an adjustable crevice and found that both Atlantic and Pacific hagfishes can contort themselves through slits only half as wide as their original body diameter. Videos show skin bulging out to the rear as the strong pinch of the opening forces blood backward.
The cavity just under a hagfish’s skin can hold roughly a third of its blood. Forcing that reservoir backward can help shrink the body diameter. Fortunately the inner body tapers at the end, Fudge said. So as blood builds up, “they don’t explode.”
Thousands of years ago, it didn’t just rain on the Sahara Desert. It poured.
Grasslands, trees, lakes and rivers once covered North Africa’s now arid, unforgiving landscape. From about 11,000 to 5,000 years ago, much higher rainfall rates than previously estimated created that “Green Sahara,” say geologist Jessica Tierney of the University of Arizona in Tucson and her colleagues. Extensive ground cover, combined with reductions of airborne dust, intensified water evaporation into the atmosphere, leading to monsoonlike conditions, the scientists report January 18 in Science Advances.

Tierney’s team reconstructed western Saharan rainfall patterns over the last 25,000 years. Estimates relied on measurements of forms of carbon and hydrogen in leaf wax recovered from ocean sediment cores collected off the Sahara’s west coast. Concentrations of these substances reflected ancient rainfall rates.
Rainfall ranged from 250 to 1,670 millimeters annually during Green Sahara times, the researchers say. Previous estimates — based on studies of ancient pollen that did not account for dust declines — reached no higher than about 900 millimeters. Saharan rainfall rates currently range from 35 to 100 millimeters annually.
Leaf-wax evidence indicates that the Green Sahara dried out from about 8,000 to at least 7,000 years ago before rebounding. That’s consistent with other ancient climate simulations and with excavations suggesting that humans temporarily left the area around 8,000 years ago. Hunter-gatherers departed for friendlier locales, leaving cattle herders to spread across North Africa once the Green Sahara returned (SN Online: 6/20/12), the investigators propose.
Hunter-gatherers and farming villagers who live in worlds without lightbulbs or thermostats sleep slightly less at night than smartphone-toting city slickers, researchers say.
“Contrary to conventional wisdom, people in societies without electricity do not sleep more than those in industrial societies like ours,” says UCLA psychiatrist and sleep researcher Jerome Siegel, who was not involved in the new research.
Different patterns of slumber and wakefulness in each of these groups highlight the flexibility of human sleep — and also point to potential health dangers in how members of Western societies sleep, conclude evolutionary biologist David Samson of Duke University and colleagues. Compared with other primates, human evolution featured a shift toward sleeping more deeply over shorter time periods, providing more time for learning new skills and knowledge as cultures expanded, the researchers propose. Humans also evolved an ability to revise sleep schedules based on daily work schedules and environmental factors such as temperature.

Samson’s team describes sleep patterns in 33 East African Hadza hunter-gatherers over a total of 393 days in a paper published online January 7 in the American Journal of Physical Anthropology. The team’s separate report on slumber among 21 rural farmers in Madagascar over 292 days will appear later this year in the American Journal of Human Biology.
Sleep patterns in these groups were tracked with wrist devices that measure a person’s activity levels. Both Hadza and Malagasy volunteers slept an average of about 6.5 hours nightly, less than the roughly seven-hour average for most U.S. adults. Foragers and villagers, who slept in areas with various family and group members, awoke more frequently during the night than has been reported among Westerners. Scalp electrodes worn at night by nine villagers during nine nights revealed biological signs of relatively light sleep compared with Westerners, including shorter periods of slow-wave and rapid eye movement sleep.

But Hadza and Malagasy individuals often supplemented nighttime sleep with one or two daytime naps. Shut-eye breaks averaged 47.5 minutes for the Hadza and about 55 minutes for villagers.

Critically, Samson says, foragers and villagers displayed more consistent daily cycles of sleep and wakefulness than are characteristic of Westerners. Hadza adults tended to hit the sack — or, more commonly, the mat — shortly after midnight and nap in the early afternoon. Malagasy villagers napped once or twice during the day’s hottest hours, usually starting around noon, and retired in the early evening. At night, they slept in two phases, awakening for around an hour shortly after midnight. Historical accounts describe a similar sleep pattern among Western Europeans between 500 and 200 years ago — two sleep segments, divided by a period of activity or reflection (SN: 9/25/99, p. 205). Nighttime sleep in both populations became deeper and less fragmented as tropical humidity dipped.
Researchers also noted that hunter-gatherers and villagers got plenty of direct sunlight, unlike many Westerners. Several studies have linked inconsistent sleep-wake cycles and lack of sun exposure to health problems, including inflammation and heart problems, Samson says. “People in modern societies can take lessons from this research by attempting to get lots of light exposure during the day while reducing blue-wave light exposure after dark and dropping inside temperatures by a few degrees at night.” Smartphones and other digital devices emit blue-wave light, which can suppress melatonin production and delay sleep.
Effects of wayward sleep patterns or too little sleep on health vary across cultures and regions, says biomedical anthropologist Kristen Knutson of Northwestern University Feinberg School of Medicine in Chicago. For instance, sleeping less than six hours per night may increase appetite, as some studies suggest, but a sleep-deprived office worker surrounded by fast-food joints is more likely to become obese than a physically active hunter-gatherer faced with a limited food supply.
Samson’s research aligns with previous findings by Knutson that rural Haitians living without electricity sleep an average of about seven hours nightly. In addition, Siegel’s team recently reported that nightly sleep averages 5.7 to 7.1 hours in three hunter-gatherer societies, including the Hadza (SN: 11/14/15, p. 10).
Until recently, researchers thought cannibalism took place only among a few species in the animal kingdom and only under extraordinary circumstances. But as zoologist Bill Schutt chronicles in Cannibalism, plenty of creatures inhabit their own version of a dog-eat-dog world.
Over the last few decades, scientists have observed cannibalism — defined by Schutt as eating all or part of another individual of the same species — among all major groups of vertebrates. The practice seems to be even more prevalent, and less discriminating, among invertebrates such as mollusks, insects and spiders, whose eggs, larvae and young are often produced in profusion and are therefore readily available, not to mention nutritious. Cannibalism, Schutt contends, makes perfect evolutionary sense, and not merely as a feeding strategy. When food supplies are low or living conditions are crowded, some mammals and birds may eat some or all of their offspring to terminate an expenditure of effort with poor chances of paying off. For birds, eating a dead or dying hatchling also is a way to get rid of a carcass that could spread infection or whose scent could attract flies or predators to the nest.
Switching to a historical and cultural perspective, Schutt tackles the various forms of human cannibalism, where, he admits, “the ick factor is high.” That includes medicinal cannibalism, from 17th and 18th century Europeans’ consumption of powdered mummies to modern moms’ ingestion of their newborns’ placentas to purportedly restore nutrients lost during childbirth. The author also explores survival cannibalism (think famine victims, people under siege, plane-crash survivors and the ill-fated Donner Party) and briefly addresses our natural shock and seemingly unnatural fascination with criminal cannibalism (à la Jeffrey Dahmer).
As Schutt explains, ritual cannibalism — the consumption of a foe or loved one to acquire the decedent’s strength, courage or wisdom — is a practice that has apparently taken place in different cultures throughout history. In an interesting aside, Schutt ponders whether people who consume wafers and wine during Communion, especially those who firmly believe these items are literally converted into the body and blood of Christ, are engaging in a form of ritual cannibalism.
Cannibalism is a wide-ranging, engaging and thoroughly fun read. The author’s numerous field trips and lab visits with scientists who study the phenomenon heartily enrich this captivating book.
Human gene editing to prevent genetic diseases from being passed to future generations may be permissible under certain conditions, a panel of experts says.
Altering DNA in germline cells — embryos, eggs, and sperm, or cells that give rise to them — may be used to cure genetic diseases for future generations, provided it is done only to correct disease or disability, not to enhance people’s health or abilities, a report issued February 14 by the National Academies of Sciences and Medicine recommends. The decision contradicts earlier recommendations by organizers of a global summit on human gene editing, who concluded that gene editing with molecular scissors such as CRISPR/Cas9 should not be used to produce babies (SN: 12/26/15, p. 12). Heritable gene editing is not yet ready to be done in people, says Alta Charo, a bioethicist at the University of Wisconsin‒Madison Law School who cochaired the panel. “We are not trying to greenlight heritable germline editing. We’re trying to find that limited set of circumstances where its use is justified by a compelling need and its application is limited to that compelling need,” says Charo. “We’re giving it a yellow light.”
National Academies reports carry no legislative weight, but do often influence policy decisions in the United States and abroad. It will be up to Congress, regulatory agencies such as the U.S. Food and Drug Administration, and state and local governments to implement the recommendations.
Supporters of new genetic engineering technologies hailed the decision.
“It looks like the possibility of eliminating some genetic diseases is now more than a theoretical option,” says Sean Tipton, a spokesman for the American Society for Reproductive Medicine in Washington, D.C. “That’s what this sets up.” Diseases such as cystic fibrosis and Huntington’s, which are caused by mutations in single genes, could someday be corrected by gene editing. More complex diseases or disorders caused by changes in multiple genes, such as autism or schizophrenia, probably would not be the focus of genome editing.
Others worry that allowing any tinkering with the germline will inevitably lead to “designer babies” and other social ills. It raises fears of stigmatization of people with disabilities, exacerbation of inequalities between people who can afford such therapies and those who can’t, and even a new kind of eugenics, critics say. “Once you approve any form of human germline modification you really open the door to all forms,” says Marcy Darnovsky, executive director of the Center for Genetics and Society in Berkeley, Calif.
Panelist Jeffrey Kahn, a bioethicist at Johns Hopkins University, says the door to heritable gene therapy remains closed until stringent requirements can be met. “It’s frankly more of a knock on the door,” he said at the public presentation of the report.
The report also changes the debate from whether to allow germline editing to instead focus on the line between therapy and enhancement, Darnovsky says. “I’m feeling very unsettled and disappointed by what they are recommending.”
Several clinical trials in the United States, China and other countries are already under way to do gene editing in people who have cancer or other diseases. But those therapies do not involve altering germline cells; instead they fix defects or make alterations to DNA in other body, or “somatic,” cells. The panel recommended that such somatic cell therapies should also be restricted to treating diseases, not allowing enhancements.
Researchers in the United Kingdom, Sweden and China have already done gene editing on early human embryos in the lab. Recent clinical trials in Mexico and Ukraine to produce “three-parent babies” are also seen as altering the germline because such children carry a small amount of DNA from an egg donor (SN Online: 10/18/16). But those children don’t have modifications of their nuclear DNA, where the genetic instructions that determine traits are stored.
Currently, researchers in the United States are effectively banned from conducting clinical trials that would produce heritable changes in the human genome, either by gene editing or making three-parent babies. The new recommendations could pave the way to allow such experiments.
But the panel lays out a number of hurdles that must be cleared before germline editing could move forward, ones that may be impossible to overcome, says Nita Farahany, a bioethicist at Duke Law School in Durham, N.C. “Some people could read into the stringency of the requirements to think that the benefits could never outweigh the risks,” she says.
One hurdle is a requirement to follow multiple generations of children who have gotten gene editing to determine whether the therapy has consequences for future generations. Researchers would never be able to guarantee that they could conduct such long-term studies, Farahany says. “You can’t bind your children and grandchildren to agree to be tracked by such studies.”
Distinctions between therapies and enhancements are also vague. Researchers may not be able to convincingly draw lines between them, says George Church, a Harvard University geneticist who has developed CRISPR/Cas9 for a variety of purposes. Virtually everything medicine has accomplished could be considered as enhancing human life, he says. “Vaccines are advancements over our ancestors. If you could tell our ancestors they could walk into a smallpox ward and not even worry about it, that would be a superpower.”
But enhancing humans with the new technology may prove harder than doing so with drugs, says Charo. Gene-editing technologies are so precise and specific that someone who does not carry a disease-causing mutation would probably not benefit from the technology, she says.
As a treatment for the ailments of aging, testosterone’s benefits are hit or miss.
For men with low testosterone, the hormone therapy is helpful for some health problems, but not so much for others, researchers report in five papers published February 21 in JAMA and JAMA Internal Medicine. Testosterone therapy was good for the bones, but didn’t help memory. It remedied anemia and was linked to a lower risk of heart attack and stroke. But treatment also upped the amount of plaque in the arteries, an early indicator of heart attack risk, researchers report. “It’s a very confusing area,” says Caleb Alexander, a prescription drug researcher at Johns Hopkins Bloomberg School of Public Health, who was not involved with the work. “Testosterone very well may help men feel more energized,” he says. “But the real question is: At what cost?”
As men age, their testosterone levels tend to drop. Researchers have suggested that boosting the levels back up to normal might counter some signs of aging, including memory loss and weakened bones. But the risks of such treatment — especially the cardiovascular risks — remain unclear, Alexander says. Dozens of studies have tackled the question, but the results “point in lots of different directions,” he says.
Despite lack of clarity on testosterone therapy’s safety and benefits, the number of men taking the hormone has soared in recent years. One 2014 analysis estimated that 2.2 million men filled testosterone prescriptions in 2013 compared with 1.2 million men in 2010. That includes many men with testosterone levels on the borderline between low and normal, men who don’t actually meet clinical guidelines for treatment, Alexander says.

The new studies attempted to answer some of the long-standing questions about the pros and cons of treatment. Four present findings from a set of clinical trials known as the Testosterone Trials, designed to evaluate the effects of testosterone therapy in men age 65 or older.

One study found that the density and strength of hip and especially spine bones improved after a year of using a daily dose of testosterone gel. Researchers don’t yet know whether these gains will translate to fewer fractures. Daily testosterone gel treatment over a year also helped men recover from anemia, raising levels of hemoglobin, an oxygen-carrying molecule in the blood, a second study showed. But testosterone gel didn’t seem to have an effect on men’s memory and cognition. In a study of 788 men, those who took the hormone performed about as well on memory and other tests as those who got a placebo.
Two studies attempted to untangle how exactly testosterone treatment affects the heart and blood vessels. One study, part of the Testosterone Trials, linked testosterone treatment with more plaque buildup in the vessels that carry oxygen-rich blood to the heart. That sounds ominous. Too much plaque can block blood flow and cripple the heart. But the second study didn’t find more heart attacks, strokes or other cardiovascular problems in men taking the hormone. In that study, researchers examined medical records of more than 44,000 men, around 8,800 of whom had been given a prescription for testosterone treatment. Over a roughly three-year follow-up period, these men actually had a lower risk of cardiovascular issues than men who hadn’t been given a testosterone prescription, researchers report.
The new work does “little overall to clarify the role of testosterone replacement” for cardiovascular risk and cognitive function, says Dimitri Cassimatis, a cardiologist at Emory University School of Medicine in Atlanta. But taken together, he says, the studies strengthen the evidence for testosterone’s benefits on bone density and anemia.