Thousands of national parks have been established around the world to preserve wildlife. But towns, farms, ranches and roads have grown up around many of these parks, creating islands of wilderness in a sea of humanity. If the creatures inside are to thrive, they need ways to travel between the islands, contends “Wild Ways,” a new documentary from the TV series NOVA.
Isolation can be especially troublesome for large predators, such as lions, that live alone or in small groups. In some areas of Africa, lions can move between populations to avoid inbreeding. But some lions, such as the few in Tanzania’s Ngorongoro Crater, are cut off from other groups. In such populations, cubs are born smaller, die younger and are more susceptible to disease. And drought or overhunting could easily wipe them out, the show notes.

To connect these isolated populations, conservationists are now building wildlife corridors between parks. One of the most ambitious projects is the Yellowstone to Yukon Conservation Initiative, which aims to link grizzly bear populations in northern Canada with those in the western United States. Other large wildlife corridors are being planned in Central America, eastern Australia and the Himalayas.

But there are often roadblocks. It can be difficult to persuade people to spend money on wildlife, and it can be even harder when those animals kill livestock or humans.
“It is important that we provide incentives for local communities, in particular, who should now look at wildlife as some form of economic asset to themselves,” says Simon Munthali of the Kavango-Zambezi Transfrontier Conservation Area, which is attempting to connect parks in five countries across southern Africa. With the right incentives, people will be more accepting of wildlife moving across land and may even benefit from it, he says in the documentary. Botswana, for instance, has developed a large ecotourism industry that provides jobs and money for local people, motivating animal protection.
The documentary is a bit too optimistic about the removal of hurdles that stand in the path of wildlife corridors, especially in the American West, where there is ongoing debate about how to manage public lands. And then there is the question of whether these corridors can be created fast enough to save the world’s dwindling animal populations. But, as Michael Soulé, one of the founders of the field of conservation biology, says: “It’s our last chance to protect the diversity of life on Earth.” “Wild Ways” makes a convincing case that we should be willing to try.
Parasitic worms may hold the secret to soothing inflamed bowels.
In studies of mice and people, parasitic worms shifted the balance of bacteria in the intestines and calmed inflammation, researchers report online April 14 in Science. Learning how worms manipulate microbes and the immune system may help scientists devise ways to do the same without infecting people with parasites.
Previous research has indicated that worm infections can influence people’s fertility (SN Online: 11/19/15), as well as their susceptibility to other parasite infections (SN: 10/5/13, p. 17) and to allergies (SN: 1/29/11, p. 26). Inflammatory bowel diseases also are less common in parts of the world where many people are infected with parasitic worms. P’ng Loke, a parasite immunologist at New York University School of Medicine, and colleagues explored how worms might protect against Crohn’s disease. The team studied mice with mutations in the Nod2 gene. Mutations in the human version of the gene are associated with Crohn’s in some people.
The mutant mice develop damage in their small intestines similar to that seen in some Crohn’s patients. Cells in the mice’s intestines don’t make much mucus, and more Bacteroides vulgatus bacteria grow in their intestines than in the guts of normal mice. Loke and colleagues previously discovered that having too much of that type of bacteria leads to inflammation that can damage the intestines. In the new study, the researchers infected the mice with either a whipworm (Trichuris muris) or a corkscrew-shaped worm (Heligmosomoides polygyrus). Worm-infected mice made more mucus than uninfected mutant mice did. The parasitized mice also had less B. vulgatus and more bacteria from the Clostridiales family. Clostridiales bacteria may help protect against inflammation. “Although we already knew that worms could alter the intestinal flora, they show that these types of changes can be very beneficial,” says Joel Weinstock, an immune parasitologist at Tufts University Medical Center in Boston.
Both the increased mucus and the shift in bacteria populations are due to what’s called the type 2 immune response, the researchers found. Worm infections trigger immune cells called T helper cells to release chemicals called interleukin-4 and interleukin-13. Those chemicals stimulate mucus production. The mucus then feeds the Clostridiales bacteria, allowing them to outcompete the Bacteroidales bacteria. It’s still unclear how the mucus encourages growth of one type of bacteria over another, Loke says.
Blocking interleukin-13 prevented the mucus production boost and the shift in bacteria mix, indicating that the worms work through the immune system. But giving interleukin-4 and interleukin-13 to uninfected mice could alter the mucus and bacterial balance without worms’ help, the researchers discovered.
Loke and colleagues also wanted to know if worms affect people’s gut microbes. So the researchers took fecal samples from people in Malaysia who were infected with parasitic worms.
After taking a deworming drug, the people had fewer Clostridiales and more Bacteroidales bacteria than before. That shift in bacteria was associated with a drop in the number of Trichuris trichiura whipworm eggs in the people’s feces, indicating that getting rid of worms may have negative consequences for some people.
Having data from humans is important because sometimes results in mice don’t hold up in people, says Aaron Blackwell, a human biologist at the University of California, Santa Barbara. “It’s nice to show that it’s consistent in humans.”
Worms probably do other things to limit inflammation as well, Weinstock says. If scientists can figure out what those things are, “studying these worms and how they do it may very well lead to the development of new drugs.”
Someone new has entered my life — a tiny, ear-splitting maniac who inchworms across the entire living room on her arched, furious back. My 3-year-old daughter hasn’t thrown temper tantrums, but to our surprise, her 1½-year-old sister is happy to fill that void.
Last week, my pleasant yet firm “No, you can’t eat the stick of butter” sent my toddler down to the floor, where she shrieked and writhed for what felt like an eternity. I’ve learned that talking to her or even looking at her only adds fuel to her fiery fits, so there’s not much to do other than ride out the storm.
It’s awful, of course, to see how miserable a kid is in the throes of a tantrum. And the fact that she can’t fully express her little heart with words just makes it worse. Adults, with our decades of practice, have trouble wrangling big emotions, so it’s no surprise that children can be easily overcome.
Because tantrums offer glimpses into the expression of strong emotions, they are a “compelling phenomenon for scientific study,” psychologist James Green of the University of Connecticut and colleagues wrote in a 2011 paper in the journal Emotion. I agree, by the way, but I am also supremely relieved that I am not the one studying them. To catch tantrums in their native habitat, the scientists outfitted 13 preschoolers with special onesies with microphones sewn into the front, and waited for the kids to lose their minds.
This masochistic experiment caught 24 emotional tsunamis in 2- to 3-year-olds. By analyzing the sounds contained in them, the researchers could deconstruct tantrums into five types of noises, each with its own distinct auditory quality. At the most intense end of the acoustic spectrum were the screams. The acoustical assault slowly eased as sounds became less energetic yells, cries, whines and fusses.
Screams and yells are more similar to each other, forming a group of sounds that often mean anger, the authors suggest. Cries, fusses and whines also group together, representing sadness. This excruciatingly detailed breakdown hints that tantrums have underlying structures. Sadly, the results do not tell us parents how to head off tantrums in the first place.
Simple preventives like keeping kids well-fed, rested and comfortable can stave off some meltdowns, but beyond those basics, we may be out of luck. My somewhat fatalistic view is that when faced with unruly emotions, some kids just can’t help themselves. After all, my older daughter doesn’t throw tantrums, at least not yet. Still, I suspect that my own behavior is involved. An illuminating study in the August 2013 Journal of Behavioral Medicine hints that parents of tantrum-prone kids can curb tantrums (or at least their perceptions of tantrums) by somehow changing their own behavior. After eight days of giving their kids “flower essence,” an inert substance sold as a tantrum reducer, parents reported fewer outbursts from their kids. The effect was “placebo by proxy,” meaning that the parents’ beliefs in the product — and possibly their subsequent behavior — may have transferred to their kids. So just believing that their kid was going to throw fewer tantrums led the parents to perceive that their kid threw fewer tantrums.
The study couldn’t say whether tantrums actually decreased or parents just perceived fewer of them. But really, either one would be an improvement in our house. Either that or my toddler is going to eat a lot of butter.
Before anybody even had a computer, Claude Shannon figured out how to make computers worth having.
As an electrical engineering graduate student at MIT, Shannon played around with a “differential analyzer,” a crude forerunner to computers. But for his master’s thesis, he was more concerned with relays and switches in electrical circuits, the sorts of things found in telephone exchange networks. In 1937 he produced, in the words of mathematician Solomon Golomb, “one of the greatest master’s theses ever,” establishing the connection between symbolic logic and the math for describing such circuitry. Shannon’s math worked not just for telephone exchanges or other electrical devices, but for any circuits, including the electronic circuitry that in subsequent decades would make digital computers so powerful.
Now is a fitting time to celebrate Shannon’s achievements, on the occasion of the centennial of his birth (April 30) in Petoskey, Michigan, in 1916. Based on the pervasive importance of computing in society today, it wouldn’t be crazy to call the time since then “Shannon’s Century.”
“It is no exaggeration,” wrote Golomb, “to refer to Claude Shannon as the ‘father of the information age,’ and his intellectual achievement as one of the greatest of the twentieth century.”
Shannon is best known for creating an entirely new scientific field — information theory — in a pair of papers published in 1948. His foundation for that work, though, was built a decade earlier, in his thesis. There he devised equations that represented the behavior of electrical circuitry. How a circuit behaves depends on the interactions of relays and switches that can connect (or not) one terminal to another. Shannon sought a “calculus” for mathematically representing a circuit’s connections, allowing scientists to design circuits effectively for various tasks. (He provided examples of the circuit math for an electronic combination lock and some other devices.)
“Any circuit is represented by a set of equations, the terms of the equations corresponding to the various relays and switches in the circuit,” Shannon wrote. His calculus for manipulating those equations, he showed, “is exactly analogous to the calculus of propositions used in the symbolic study of logic.”
As an undergraduate math (and electrical engineering) major at the University of Michigan, Shannon had learned of 19th century mathematician George Boole’s work on representing logical statements by algebraic symbols. Boole devised a way to calculate logical conclusions about propositions using binary numbers; 1 represented a true proposition and 0 a false proposition. Shannon perceived an analogy between Boole’s logical propositions and the flow of current in electrical circuits. If the circuit plays the role of the proposition, then a false proposition (0) corresponds to a closed circuit; a true proposition (1) corresponds to an open circuit. More elaborate math showed how different circuit designs would correspond to addition or multiplication and other features, the basis of the “logic gates” designed into modern computer chips.
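To make the circuit-to-logic mapping concrete, here is a minimal sketch in Python using the modern convention in which True means the circuit conducts (the reverse of the 0-for-closed labeling described above, and not Shannon's original relay notation): switches in series behave like a logical AND, switches in parallel like an OR, and combinations of such gates carry out binary arithmetic. The function names are illustrative only.

```python
# Illustrative sketch, not Shannon's 1937 notation. Modern convention:
# True means the circuit conducts (switch closed).

def series(a, b):
    # Switches in series conduct only if both are closed: logical AND.
    return a and b

def parallel(a, b):
    # Switches in parallel conduct if either is closed: logical OR.
    return a or b

def half_adder(a, b):
    # Wiring such gates together yields binary arithmetic: a one-bit adder.
    sum_bit = parallel(series(a, not b), series(not a, b))  # XOR from AND/OR/NOT
    carry = series(a, b)                                     # AND
    return sum_bit, carry

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```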
For his Ph.D. dissertation, Shannon analyzed the mathematics of genetics in populations, but that work wasn’t published. In 1941 he began working at Bell Labs; during World War II, he wrote an important (at the time secret) paper on cryptography, which required deeper consideration of how to quantify information. After the war he developed those ideas more fully, focusing on using his 1s and 0s, or bits, to show how much information could be sent through a communications channel and how to communicate it most efficiently and accurately.
In 1948, his two papers on those issues appeared in the Bell System Technical Journal. They soon were published, with an introductory chapter by Warren Weaver, in a book titled The Mathematical Theory of Communication. Today that book is regarded as the founding document of information theory.
For Shannon, communication was not about the message, or its meaning, but about how much information could be communicated in a message (through a given channel). At its most basic, communication is simply the reproduction of a message at some point remote from its point of origin. Such a message might have a “meaning,” but such meaning “is irrelevant to the engineering problem” of transferring the message from one point to another, Shannon asserted. “The significant aspect is that the actual message is one selected from a set of possible messages.” Information, Shannon decided, is a measure of how much a communication reduces the ignorance about which of those possible messages has been transmitted.
In a very simple communication system, if the only possible messages are “yes” and “no,” then each message (1 for yes, 0 for no) reduces your ignorance by half. By Shannon’s math, that corresponds to one bit of information. (He didn’t coin the term “bit” — short for binary digit — but his work established its meaning.) Now consider a more complicated situation — an unabridged English dictionary, which contains roughly half a million words. One bit would correspond to a yes-or-no answer about whether the word is in the first half of the dictionary. That reduces ignorance, but not very much. Each additional bit would reduce the number of possible words by half. Specifying a single word from the dictionary (eliminating all the ignorance) would take about 19 bits. (This fact is useful for playing the game of 20 Questions — just keep asking about the secret word’s location in the dictionary.)
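The arithmetic behind that 19-bit figure is easy to verify; a few lines of Python (the half-million-word count is just the article's round number) confirm it:

```python
import math

# Roughly half a million entries in an unabridged dictionary (the article's round figure).
n_words = 500_000

# Each yes/no question (one bit) halves the remaining candidates,
# so pinning down one word takes log2(n) questions, rounded up.
bits_needed = math.ceil(math.log2(n_words))
print(bits_needed)  # 19

# Equivalently, simulate the 20 Questions strategy: keep halving until one word is left.
questions, remaining = 0, n_words
while remaining > 1:
    remaining = math.ceil(remaining / 2)
    questions += 1
print(questions)  # 19
```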
Shannon investigated much more complicated situations and devised theorems for calculating information quantity and how to communicate it efficiently in the presence of noise. His math remains central to almost all of modern digital technology. As electrical engineer Andrew Viterbi wrote in a Shannon eulogy, Shannon’s 1948 papers “established all the key parameters and limits for optimal compression and transmission of digital information.”
Beyond its practical uses, Shannon’s work later proved to have profound scientific significance. His math quantifying information in bits borrowed the equations expressing the second law of thermodynamics, in which the concept of entropy describes the probability of a system’s state. Probability applied to the ways in which a system’s parts could be arranged, it seemed, mirrored the probabilities involved in reducing ignorance about a possible message. Shannon, well aware of this connection, called his measure entropy as well. Eventually questions arose about whether Shannon’s entropy and thermodynamic entropy shared more than a name.
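For reference, the standard textbook forms of the two quantities, which the article does not spell out, make the resemblance plain; they differ only by Boltzmann's constant and the base of the logarithm:

```latex
% Shannon entropy of a source whose messages occur with probabilities p_i (in bits):
H = -\sum_i p_i \log_2 p_i
% Gibbs form of thermodynamic entropy over microstate probabilities p_i:
S = -k_B \sum_i p_i \ln p_i
```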
Shannon apparently wasn’t sure. He told one writer in 1979 that he thought the connection between his entropy and thermodynamics would “hold up in the long run” but hadn’t been sufficiently explored. But nowadays a deep conceptual link shows up not only between Shannon’s information theory and thermodynamics, but in fields as diverse as quantum mechanics, molecular biology and the physics of black holes. Shannon’s understanding of information plays a central role, for instance, in explaining how the notorious Maxwell’s demon can’t violate thermodynamics’ second law. Much of that work is based on Landauer’s principle, the requirement that energy is expended when information is erased. In developing that principle, Rolf Landauer (an IBM physicist) was himself influenced both by Shannon’s work and the work of Sadi Carnot in discerning the second law in the early 19th century.
Something Shannon and Carnot had in common, Landauer once emphasized to me, was that both discovered mathematical restrictions on physical systems that were independent of the details of the system. In other words, Carnot’s limit on the efficiency of steam engines applied to any sort of engine, no matter what it was made of or how it was designed. Shannon’s principles specifying the limits on information compression and transmission apply no matter what technology is used to do the compressing or sending. (Although in Shannon’s case, Landauer added, certain conditions must be met.)
“They both find limits for what you can do which are independent of future inventions,” Landauer told me. That is, they have grasped something profound about reality that is not limited to a specific place or time or thing.
So it seems that Shannon saw deeply not only into the mathematics of circuits, but also into the workings of nature. Information theorist Thomas Cover once wrote that Shannon belongs “in the top handful of creative minds of the century.” Some of Shannon’s original theorems, Cover noted, were not actually proved rigorously. But over time, details in the sketchy proofs have been filled in and Shannon’s intuitive insights stand confirmed. “Shannon’s intuition,” Cover concluded, “must have been anchored in a deep and natural theoretical understanding.” And it seems likely that Shannon’s intuition will provide even more insights into nature in the century ahead.
A decent office scanner has beaten X-ray blasts from multimillion-dollar synchrotron setups in revealing how air bubbles kill plant leaves during drought.
Intricate fans and meshes of plant veins carrying water are “among the most important networks in biology,” says Timothy Brodribb of the University of Tasmania in Hobart, Australia. When drought puts the water in those veins under too much tension, air from plant tissues bubbles in, killing leaves much as bubble embolisms and clots in blood vessels can kill human tissue. As climate change and population growth increase risks of water shortage, Brodribb and other researchers are delving into the details of what makes some plants more resistant than others to drought. The high energy of X-rays destroys delicate leaf tissue. So, based on a chat with microfluidics specialist Philippe Marmottant of the French National Center for Scientific Research, Brodribb tried repeatedly scanning a leaf with a light source below it to reveal darkening lines as air bubbles shot through the veins. A microscope or scanner proved perfect. Tracked this way, an invasion of killer bubbles “looks like a lightning storm,” he says.
He was surprised to see that bigger veins, despite their robust looks, fail before tiny ones, as he observed in an oak leaf and a Pteris fern. And networks in ferns with simpler branching patterns, such as those of Adiantum ferns, crash quickly.
This system of visualizing plant plumbing gave better resolution than expensive and elaborate X-ray techniques had, Brodribb, Marmottant and Diane Bienaimé report online April 11 in the Proceedings of the National Academy of Sciences.
Arabian camels (Camelus dromedarius) have trekked across ancient caravan routes in Asia and Africa for 3,000 years. But it’s unclear how camels’ domestication has affected their genetic blueprints.
To find out, Faisal Almathen of King Faisal University in Saudi Arabia and his colleagues combed through the DNA of 1,083 modern camels and ancient remains of wild and domesticated camels found at archaeological sites going back to 5000 B.C.
Camels run high on genetic diversity thanks to periodic restocking from now-extinct wild populations in the centuries after their domestication, the team reports May 9 in the Proceedings of the National Academy of Sciences. Travel on human caravan routes also created a steady flow of genes between different domesticated populations, except in a geographically isolated group in East Africa. That diversity may give some camel populations a leg up in adapting to future changes in climate, the authors suggest.
Something catapulted a pair of stars from the outer rim of our galaxy, but astronomers aren’t sure what. A binary star known as PB 3877 is rocketing away at about 2 million kilometers per hour — possibly fast enough to escape the galaxy’s gravitational pull — and all the usual explanations for such speedy stars fall short. Astrophysicist Péter Németh of the University of Erlangen-Nuremberg in Germany and colleagues report the discovery in the April 10 Astrophysical Journal Letters.
Many galactic escapees get kicked out after a close brush with the supermassive black hole in the Milky Way’s center. But PB 3877, first noticed in 2011 and currently about 18,000 light-years away in the constellation Coma Berenices, has been nowhere near that behemoth. A supernova could be responsible; it has happened before (SN Online: 3/5/15). But PB 3877 is two stars traveling together. A supernova would have torn the two apart. Németh and colleagues propose that the duo may be left over from a smashup between the Milky Way and a smaller galaxy. If that’s the case, then there might be others like PB 3877 lurking in the galactic outskirts.
Zapping Zika
As Zika virus spreads, researchers are racing to find Zika-carrying mosquitoes’ Achilles’ heel, Susan Milius reported in “Science versus mosquito” (SN: 4/2/16, p. 30). Some approaches include genetic sterilization so mosquitoes can’t reproduce and infecting them with bacteria to decrease their disease-spreading power.
One reader had another suggestion. “Maybe we could just secure some wilderness areas for birds and bats,” Kurt Feierabend wrote on Facebook. “Let the feasting begin.”
“Some bats and birds do eat mosquitoes, but Aedes aegypti mosquitoes, which are Zika carriers, typically dwell in or around people’s houses — not an easy hunting ground for predators,” says Meghan Rosen, who also reported on the virus for this special report (SN: 4/2/16, p. 26). The mosquitoes lay eggs in water, though, so predatory fish and crustaceans could serve as a type of biological control. But, according to the U.S. Centers for Disease Control and Prevention, that option may not be practical. Ae. aegypti larvae can also develop in dog bowls, plant saucers and rain splash left in crumpled plastic bags, Milius notes.

Wandering baby Jupiter
As proto-Jupiter moved through the solar system, it may have absorbed so much planet-building material that it reduced the number of planets that could form near the sun, Christopher Crockett reported in “Jupiter could have formed near sun” (SN: 4/2/16, p. 7).
“Over what time span would this have occurred, and are any of our planets currently moving in or out?” asked online reader Mark S.
If Jupiter formed close to the sun, it spent only about 100,000 years in the inner solar system, well before the rocky planets started to form, Crockett says. “Researchers suspect that the outer planets danced around quite a bit during their formative years. All the sun’s worlds are now, fortunately, staying put for the foreseeable future,” he says.
Addicted to microbes
In “Microbes and the mind” (SN: 4/2/16, p. 22), Laura Sanders reported on the surprising ways in which gut microbes influence depression, anxiety and other mental disorders. But it’s not a one-way street. “Our behavior can influence the microbiome right back,” she wrote.
Reader George Szynal wondered how addiction to drugs, alcohol and other substances may influence microbes and vice versa. “Can treatments of microbiome enhance and aid the recovery of addicted persons?” he asked.
“That’s a fascinating idea, but so far, little research has been done on this question,” Sanders says. “Alcohol disorders have been linked to changes in the gut microbiome, and smoking has been linked to differences in mouth bacteria. But until scientists figure out whether those microbe changes are consequences or causes of the addictions, we won’t know whether changing the microbes could help people kick the habits,” she says.
Water rising
Without a sharp decrease in carbon dioxide emissions, rapid melting of the Antarctic ice sheet could raise sea levels by 60 meters by the end of the century, Thomas Sumner reported in “Tipping point for ice sheet looms” (SN: 4/2/16, p. 10).
In addition to melting ice sheets, reader Carolyn Lawson asked if human depletion of groundwater also contributes to sea level rise.
About 80 percent of groundwater losses end up in the oceans, according to a recent study in Nature Climate Change. Simulations showed that groundwater contributed about 0.02 millimeters of sea level rise annually in 1900 and increased to around 0.27 millimeters annually by 2000. “Current sea level rise is about 3 millimeters per year, so that’s a pretty large chunk,” Sumner says. “Unfortunately, Earth’s groundwater reserves are disappearing. It’s unclear whether the groundwater contribution to sea level rise will continue to increase indefinitely.”
On April 19, 1966, Roberta Gibb became the first woman to (unofficially) finish the Boston Marathon. Women were officially allowed to enter the race in 1971, and Boston crowned its first official female winner in 1972 — the year that also saw the passage of Title IX, the law that prohibits discrimination based on sex in education programs and activities receiving federal funding. This year, 13,751 women crossed the Boston Marathon finish line, making the finisher list 45 percent female. In the last 50 years, other sports have also welcomed women, from weightlifting to rugby to wrestling. And of course, women exercise noncompetitively, lifting weights, holding yoga poses and putting in hours on the track and in the gym.
Women are making up for a historical bias against them in sports. Not surprisingly, there’s also historically been a bias in sports science. “If you went all the way back to the 1950s, a lot of exercise physiology studies about metabolism talk about the 150-pound man,” says Bruce Gladden, an exercise physiologist at Auburn University in Alabama and the editor in chief of the journal Medicine and Science in Sports and Exercise. “That was the average medical student.” It was a matter of convenience, studying the people nearest at hand, he explains.
Over time, athletes (and convenient student populations) have become more diverse, but diversity in studies of those athletes has continued to lag behind. When Joe Costello, an exercise physiologist at the University of Portsmouth in England, began studying the effects of extreme cold exposure on training recovery in athletes, he found that women were under-represented in the field compared to men. He wondered, he says, “is that the case across the board in sports science?”
Digging through three influential journals in the field — Medicine and Science in Sports and Exercise, the British Journal of Sports Medicine and the American Journal of Sports Medicine — Costello and his colleagues analyzed 1,382 articles published from 2011 to 2013, which added up to more than six million participants. The percentage of female participants per article was around 36 percent, and women represented 39 percent of the total participants, the scientists reported in April 2014 in the European Journal of Sport Science.
“In my opinion, it’s not enough,” he says. The numbers are relatively close to the gender breakdowns in competitive sport, he notes, but participation in noncompetitive exercise and casual running is a lot closer to a 50:50 breakdown, and the studies don’t reflect that. Despite the gap, Costello’s study did show that women are represented in exercise science studies in general. But I wondered if the trend was improving — and if the type of study mattered. Are scientists studying women in, say, studies of metabolism, but neglecting them in studies of injury?

I looked at published studies in two top exercise physiology journals and found that women remain under-studied, especially when it comes to studies of performance. Reasons for this under-representation abound, from menstrual cycles to funding to simple logistics. But with recent requirements for gender parity from funding agencies, reasons are no longer excuses. When it comes to the race to fitness, women are well out of the starting blocks, but the science still has some catching up to do.

Let’s look at the data
I followed Costello’s lead and looked at studies published in Medicine and Science in Sports and Exercise and the American Journal of Sports Medicine, this time examining the first five months of 2015 (the former journal had articles available for free through May 2015; the latter granted me access. The third journal in the previous study, the British Journal of Sports Medicine, would only grant me access on a case-by-case basis). I excluded single case studies, animal studies, cell studies, studies involving cadavers and studies that dealt with coaches’ or doctors’ evaluations. I also excluded studies where the gender breakdown of participants wasn’t given (11 studies that included people didn’t mention the gender of the participants), and studies where there would be no reason to include women (such as those involving prostate cancer recovery).
That left me with 188 studies that included 254,813 participants. Of the 188 studies, 138, or 73 percent, involved at least some women. But overall, women made up only 42 percent of participants. While 27 percent of the studies included only men, only 4 percent were studies of only women.
These results were similar to those Costello and his group showed in 2014. But I also wondered what, exactly, those women were being studied for. I took the 188 studies and divided them into six categories:
- Studies on metabolism, obesity, sedentary behavior, weight loss and diabetes
- Studies of nonmetabolic diseases
- Basic physiology studies
- Social studies, including uses of pedometers and group exercise
- Sports injury studies
- Performance studies

In studies of metabolism, obesity, weight loss and diabetes (23 total studies), women were included in 87 percent of studies and represented 45 percent of participants, getting relatively close to gender parity. For nonmetabolic diseases (18 studies), 85 percent of studies included women, and they represented 44 percent of participants. Across the 188 studies, the percentage of studies involving women ranged from 36 percent in performance to 100 percent in social studies.
In basic physiology studies (11 total studies), including studies of knee and muscle function and studies of people in microgravity, women were included in 45 percent of studies, and represented 42 percent of all participants.
Women were represented in 100 percent of social studies (seven papers) and made up 60 percent of the participants. These included studies of self-cognition, of how well people adhere to wearing activity trackers, and of the influence of meet-up groups on exercise. “Women are more likely to take part in [or] be recruited to group training programs than men,” notes Charlotte Jelleyman, an exercise physiologist at the University of Leicester in England.
The most striking differences came when studying performance and sports injury. There were 102 studies of sports injury and recovery, from concussions and elbow and shoulder repair in baseball players to studies of injury in surfers. Women were present in 80 percent of these studies, but made up 40 percent of participants.
I was especially interested in the large number of studies (38 total) on knee and ACL repair. Women were present in 94 percent of these studies, but made up only about 42 percent of participants. “That’s a case where you would think there would be more emphasis,” Gladden notes. “ACL injuries are much more prevalent in female athletes.” Out of more than 250,000 participants in the 188 studies analyzed, the majority were men, particularly in analyses of sports performance and injury.
But the biggest difference came in sports performance — training to get better, recover faster and perform stronger. Of 30 studies, 39 percent involved women, and women made up almost 40 percent of participants. But this result was heavily skewed by a single study of more than 90,000 participants, which examined sex differences in pacing during marathons. When this study was removed, the total number of participants in all performance studies dropped to 4,001. And the percentage of female participants dropped with it — to 3 percent. Scientists may be trying to get at the secrets of the best athletes, but to do so, they are mostly looking in men.
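To see how one participant-heavy study can dominate a pooled percentage, here is a small Python sketch. The numbers are rough stand-ins reconstructed from the figures above — a roughly 90,000-runner pacing study assumed here to be about 40 percent female, alongside the remaining performance studies totaling 4,001 participants of whom about 3 percent were women — not the actual study-by-study data.

```python
# Rough, illustrative numbers only, approximating the article's figures.
big_study = {"participants": 90_000, "women": 36_000}   # assumed ~40% female marathon-pacing study
small_studies = {"participants": 4_001, "women": 120}   # remaining performance studies, ~3% women

def pct_women(*groups):
    # Participant-weighted share of women across the given groups of studies.
    women = sum(g["women"] for g in groups)
    total = sum(g["participants"] for g in groups)
    return 100 * women / total

print(f"with the big study:    {pct_women(big_study, small_studies):.0f}% women")  # ~38%
print(f"without the big study: {pct_women(small_studies):.0f}% women")             # ~3%
# One participant-heavy study dominates the pooled percentage, which is why the
# aggregate looked close to parity while most lab studies were nearly all male.
```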
Time, money and menstrual cycles
There are many reasons why women might be under-represented in exercise science. One is the same reason that haunts many sex disparities in biological research — the menstrual cycle.
With monthly hormone cycles, “[we] have to test [women] at certain phases,” even if you’re studying something seemingly unrelated, such as knee pain, explained Mark Tarnopolsky, a neurometabolic specialist at McMaster University, who has extensively studied sex differences in exercise. “One has to choose which phase — follicular or luteal phase — so I think when physiologists are limited in their funds, it’s easier to get guys to come in at any time.”
For some types of studies, scientists note that no previous studies have found sex differences. So scientists just study men — with no menstrual cycle to worry about — and apply the results to women. But “it’s not good enough,” says Jelleyman. “Just to say that because it works in men and previous studies have found no sex differences we assume it will work on women too – you have to show it.” Many scientists worry that cycling hormones means variable data points, so it’s easier to study men and state that the results probably apply to women, too. But that’s a cop-out, says Marie Murphy, an exercise scientist at Ulster University in Northern Ireland. “If you revisit [women] in the same phase, they should be no more variable than a man,” she notes. “You return to them 28 days later and that’s easy enough. It’s not a difficult thing to do. But I think if you’re looking for an excuse you’ll find one.”
Using that excuse can mean missing important differences. Before Gibb’s Boston run in 1966, many people — including scientists — viewed distance running and extreme exercise as somehow unhealthy for women, Tarnopolsky explains. After his lab studied differences in metabolism in men and women during endurance exercise, his group found that “women were at least as good, if not better able to withstand the rigors of the exercise.”
But menstrual cycles aside, studies are expensive, particularly studies involving people. In many cases, simplifying the study population is the only way to complete the work on time and within budget. As a member of the coaching teams associated with elite athletes, Louise Burke, a sports nutritionist at the Australian Institute of Sport, says she takes her research chances where she can find them. For a recent study of male race walkers, “when we decided to do the study I did think we’d have female race walkers,” she says. But she found that the pool of potential female participants was small. “We didn’t have a lot in Canberra,” she recalls. “Of the ones that were of the right caliber, we had people being injured, a couple who were doing a race that wouldn’t make them available.”
And when logistics shoot down one sex in a study, it will be the women who lose out. “Conference organizers are careful and include symposia on sex differences,” says John Hawley, an exercise physiologist at Australian Catholic University. But when it comes to actually doing studies, there can be challenges. Many of Hawley’s studies are invasive, involving biopsies that leave scars. And many women aren’t willing to get scarred for science. “If I go out to a triathlon and say to the females, ‘we’d like to do invasive work,’ they’re like ‘ooh, no biopsies,’” Hawley says. “It’s a legitimate practical issue.”
Finally, there are also cultural reasons that women end up underrepresented. Female athletes don’t get the same TV time as male athletes, and the players don’t get paid as much, even though, as in U.S. soccer, the women’s national team is more highly ranked than the men’s. This disparity might also result in [gender] disparity in performance studies, Gladden suggests. “Science unfortunately isn’t immune to those same problems.”
Leveling the playing field
Calls for equality in exercise research continue. In a recent article in The Sport and Exercise Scientist, Murphy looked at the March issue of the Journal of Sports Sciences, and found that the 13 papers in the issue included 852 participants, but only 103 women, a dismal participation rate of only 12 percent.
While Murphy notes that other fields of study may have similar findings, exercise science needs to do better. “It’s quite simple,” she says. “If we want to apply the findings to men and women, we need to test our hypotheses and do our measures in research involving men and women.”
The lack of parity for female research participants “should be alarming,” Hawley says. He notes that while scientists bear some responsibility, “the funding bodies and editors of journals should be asking more serious questions.” Scientists who peer-review each other’s work should also ask hard questions, he says. “Peer review is failing as well. … The typical responses [are] ‘unfortunately the budget does not permit females’ (a complete white lie of course), and time and practicalities. It’s not an excuse.”
As is true in many areas of science, as more women join the ranks of scientists studying exercise, they are more likely to include women in their studies. But Murphy notes that it won’t solve the problem. “I don’t think scientists think of it unless they have a particular interest in the area,” she says. “There are really good women researchers [in exercise science], but they study men, and the men study men! We’re not doing ourselves any favors.”
The broader impact of this gender imbalance is that training, fitness and diet recommendations for performance and recovery are based on science that may have only been done in men, and then downsized to fit women. Sometimes it may make no difference. But what if it could? In the end, the road to stronger, better, faster and healthier is one with studies that include everyone. “It is important to show that the general principles of exercise effectiveness are applicable to all populations whether it be males or females, older or younger, ethnically different or diseased populations,” says Jelleyman. “Sometimes it emerges that there are differences, other times less so. But it is still important to know this so that recommendations can be based on relevant evidence.”
Bad weather may have driven the mighty Mongols from Hungary. The Mongols’ retreat shows that small climate changes can influence historical events, researchers argue May 26 in Scientific Reports.
In the early 1200s, the Mongol empire had expanded across Eurasia into Russia and Eastern Europe. The Mongols invaded Hungary in 1241 but then suddenly retreated to Russia in 1242. Though theories abound, historians have never found a clear reason for the abrupt exit.
Now, Ulf Büntgen of the Swiss Federal Research Institute in Birmensdorf and Nicola Di Cosmo of the Institute for Advanced Study in Princeton, N.J., think they may have an explanation. Weather data preserved in tree rings points to a series of warm, dry summers in the region until 1242, when temperatures dropped and rainfall increased. The shift to a wet, cold climate caused flooding and created marshy terrain. That would have been less than ideal for the nomadic Mongol cavalry, reducing their mobility and pastureland.