From deep in the gold mines of South Africa’s Orange Free State has come evidence that there was some form of biologic activity on Earth at least 2.15 billion years ago. Polymerized hydrocarbon “chemo-fossils” found in the gold ores … [probably] were originally part of a rich bacterial and algal life in the Witwatersrand basin. Since the rock layers from which they come have been dated to about 2.15 billion years ago, it seems likely that photosynthesis existed on Earth before then. — Science News, March 18, 1967
UPDATE Scientists still debate when early photosynthesizing organisms called cyanobacteria began pumping oxygen into Earth’s atmosphere. Recent evidence suggests the microbes existed some 3.2 billion years ago (SN Online: 9/8/15), even though a larger oxygen surge didn’t happen until about 2.4 billion years ago (SN: 3/4/17, p. 9). Those tiny bacteria had an outsized impact on our planet, releasing extra oxygen into the atmosphere that paved the way for complex multicellular life like plants and animals.
Catch sight of someone scratching and out of nowhere comes an itch, too. Now, it turns out mice suffer the same strange phenomenon.
Tests with mice that watched itchy neighbors, or even just videos of scratching mice, provide the first clear evidence of contagious scratching spreading mouse-to-mouse, says neuroscientist Zhou-Feng Chen of Washington University School of Medicine in St. Louis. The quirk opens new possibilities for exploring the neuroscience behind the spread of contagious behaviors. Chen and colleagues traced the ghostly itch to a peptide called GRP, or gastrin-releasing peptide, and to areas of the mouse brain better known for keeping the beat of circadian rhythms. They report the results in the March 10 Science.
In discovering this, “there were lots of surprises,” Chen says. One was that mice, nocturnal animals that mostly sniff and whisker-brush their way through the dark, would be sensitive to the sight of another mouse scratching. Yet Chen had his own irresistible itch to test the “crazy idea,” he says.
Researchers housed mice that didn’t scratch any more than normal within sight of mice that flicked and thumped their paws frequently at itchy skin. Videos recorded instances of normal mice looking at an itch-prone mouse mid-scratch and, shortly after, scratching themselves. In comparison, mice with not-very-itchy neighbors looked at those neighbors at about the same frequency but rarely scratched immediately afterward. Videos of scratching mice produced the same result. More audience itching and scratching followed a film of a mouse with itchy skin than one of a mouse poking about on other rodent business.

Next, researchers looked at how contagious itching plays out in the mouse nervous system. Brains of mice recently struck by contagious urges to scratch showed heightened activity in several spots, including, surprisingly, a pair of nerve cell clusters called the suprachiasmatic nuclei, or SCN. People have these clusters, too, deep in the brain roughly behind the eyes.
Other tests linked the contagious itching with GRP, previously identified as transmitting itch information elsewhere in the mouse nervous system. Mice didn’t succumb to contagious itching if they had no working genes for producing GRP or the molecule that detects it. Yet these mice still scratched when researchers irritated their skin. Also, in normal mice, a dose of GRP injected into the SCN brain regions brought on scratching without the sight of an itchy neighbor, but a dose of plain saline solution to the same spots failed to set off much pawing.
It’s fine work, says dermatologist Gil Yosipovitch, who studies itching at the University of Miami. But he wonders how the mouse discovery might apply to people. So far, brain imagery in his own work has not turned up evidence for an SCN role in human contagious itching, he says.
The SCN is better known as a circadian timekeeper, responding to light cues. It’s unclear how the nerve cell clusters might orchestrate behavior based on seeing a scratching mouse, “a very specific and rich visual stimulus,” says psychologist and neuroscientist Henning Holle of the University of Hull in England. Other research suggests different brain regions are involved in contagious itching in people.
Tracking down the mechanisms behind the phenomenon is more than an intriguing science puzzle, Yosipovitch says. People troubled with strong, persistent itching are often unusually susceptible to contagious scratching, and new ideas for easing their misery would be welcome.
The American badger is known to cache carrion, squirreling away future meals underground. The ground acts something like a natural refrigerator, keeping the badgers’ food cool and hidden from anything that might want to steal it. Researchers, though, had never spotted badgers burying anything bigger than a jackrabbit — until 2016, when a young, dead cow went missing in a study of scavengers in northwestern Utah.
That January, University of Utah researchers had set out seven calves (all of which had died from natural causes) weighing 18 to 27 kilograms in the Great Basin Desert, each monitored by a camera trap. After a week, one of the carcasses went missing, even though it, like the others, had been staked in place so nothing could drag it off. But perhaps a coyote or mountain lion managed the feat, the researchers thought.
Then they checked the camera. What they found surprised them.
The images showed a badger happening upon the calf on January 16. The next evening, the badger returned and spent four hours digging below and around the bovine, breaking for only five minutes to snack on its find. It came back and continued digging the next afternoon and the following morning, by which time the calf had fallen into the crater the badger had dug. But that wasn’t the end. The badger then spent a couple more days backfilling the hole, covering its find and leaving itself a small entrance. The badger stayed with its meal for the next couple of weeks, venturing out briefly from time to time. (It’s impossible to know where the badger went, but getting a drink is one possibility, says the study’s lead author Ethan Frehner.) By late February, the badger was still visiting its find occasionally. But herds of (living) cows kept coming through the site, and though the badger checked on its cache several times, it never re-entered the burrow after March 6.
It turns out that this badger was not alone in taking advantage of the research project for a huge, free meal. Simultaneously, at one of the other carcass sites about three kilometers away, another badger attempted to bury a calf that had been staked out there. It only got the job partway done, though, as the anchoring stake prevented the badger from finishing a full burial. Instead, the badger dug itself a hole and spent several weeks there, periodically feeding on its find. This is the first time scientists have documented American badgers burying a carcass so much bigger than themselves (the calves were three to four times the weight of the badgers), the team reports March 31 in Western North American Naturalist.
“All scavengers play an important ecological role — helping to recycle nutrients and to remove carrion and disease vectors from the ecosystem,” Frehner says. “The fact that American badgers could bury carcasses of this size indicates that they could potentially bury the majority of the carrion that they would come into contact with in the wild. If they exhibit this behavior across their range, the American badger could be accounting for a significant amount of the scavenging and decomposition process which occurs throughout a large area in western North America.”
And that burial may have a benefit for ranchers, the researchers note: If badgers bury calves that have died of disease, that may reduce the likelihood that a disease will spread. It’s too soon to say whether that happens, but study co-author Evan Buechley notes, “that merits further study.”
[Millions of diabetics] could be indebted to a strain of diabetic mice being bred in Bar Harbor, Maine. In diabetes research, “this mouse is the best working model to date,” one of its discoverers, Dr. Katharine P. Hummel, says.… A satisfactory animal subject had eluded diabetes researchers, until the mouse was found. — Science News, August 12, 1967
UPDATE Hummel’s diabetic mice are still used in research to mimic type 2 diabetes in humans, which is linked to obesity. In the mid-1990s, researchers found that the diabetic mice carry a mutation in the leptin receptor gene, which prevents the hormone leptin from signaling fullness and triggering other metabolic processes. In people, however, the disease is more complicated. More than 40 genetic variants are associated with susceptibility to type 2 diabetes. Unlike the mouse mutation, none of those variants guarantee a person will develop the disease.
A Neandertal child whose partial skeleton dates to around 49,000 years ago grew at the same pace as children do today, with a couple of exceptions. Growth of the child’s spine and brain lagged, a new study finds.
It’s unclear, though, whether developmental slowing in those parts of the body applied only to Neandertals or to Stone Age Homo sapiens as well. If the slowdown occurred in both species, environmental conditions at the time — which are currently hard to specify — may have reduced the pace of physical development similarly in the two Homo species.

This ancient youngster died at 7.7 years of age, say paleoanthropologist Antonio Rosas of the National Museum of Natural Sciences in Madrid and colleagues. The scientists estimated the child’s age by counting microscopic enamel layers that accumulated daily as a molar tooth formed.
Previous excavations uncovered the child’s remains, as well as fossils of 12 other Neandertals, at a cave site in northwestern Spain called El Sidrón.
Much — but not all — of the Neandertal child’s skeleton had matured to a point expected for present-day youngsters of the same age, the scientists report in the Sept. 22 Science. But bones at the top and in the middle of the spine had not fully fused, corresponding to a stage of development typical of 4- to 6-year-olds today. Also, the ancient child’s brain was still growing at an age when living humans’ brains have nearly or fully reached adult size. Signs of bone tissue being reshaped on the inner surface of the child’s braincase pointed to ongoing brain expansion. Rosas’ team calculated that the youngster’s brain volume was about 87.5 percent of that expected, on average, for Neandertal adults.
Neandertals’ slightly larger brains relative to people today may have required more energy, and thus more time, to grow, the researchers suggest. And they suspect that the growth of Neandertals’ bigger torsos, and perhaps spinal cords, slowed the extinct species’ backbone development in late childhood.
Rosas’ new study “reinforces what should have been apparent for some time — that Neandertal growth rates and patterns, except for those related to well-known differences in [skeletal shape], rarely differ from modern human variations,” says paleoanthropologist Erik Trinkaus of Washington University in St. Louis.
But researchers need to compare the El Sidrón child to fossils of H. sapiens youngsters from the same time or later in the Stone Age, Trinkaus adds. Relative to kids today, ancient human youth may display slower growth rates comparable to those of the Neandertal child, he suspects.
A mysterious group of microbes may be controlling the fate of carbon in the dark depths of the world’s oceans.
Nitrospinae bacteria, which use the nitrogen compound nitrite to “fix” inorganic carbon dioxide into sugars and other compounds for food and reproduction, are responsible for 15 to 45 percent of such carbon fixation in the western North Atlantic Ocean, researchers report in the Nov. 24 Science. If these microbes are present in similar abundances around the world — and some data suggest that the bacteria are — those rates may be global, the team adds.

The total amount of carbon that Nitrospinae fix is small when compared with carbon fixation on land by organisms such as plants or in the sunlit part of the ocean, says Maria Pachiadaki, a microbial ecologist at Bigelow Laboratory for Ocean Sciences in East Boothbay, Maine, who is lead author on the new study. “But it seems to be of major importance to the productivity and health of the 90 percent of the ocean that is too deep and too dark for photosynthesis.” These bacteria likely form the base of the food web in much of this enigmatic realm, she says.
Oceans cover more than two-thirds of Earth’s surface, and most of those waters are in the dark. In the shallow, sunlit part of the ocean, microscopic organisms called phytoplankton fix carbon dioxide through photosynthesis. But in the deep ocean where sunlight doesn’t penetrate, microbes that use chemical energy derived from compounds such as ammonium or hydrogen sulfide are the engines of that part of the carbon cycle.
Little has been known about which microbes are primarily responsible for this dark ocean carbon fixation. The likeliest candidates were a group of ammonium-oxidizing archaea (single-celled organisms similar to bacteria) known as Thaumarchaeota, because those archaea are the most abundant microbes in the dark ocean.
But there was no direct proof that these archaea are the main fixers in those waters, says Pachiadaki. In fact, previous studies of carbon fixation in these depths suggested that ammonium-oxidizers weren’t performing the task quickly enough to match observations, she says. “The energy gained from ammonium oxidation is not enough to explain the amount of the carbon fixed in the dark ocean.”

She and colleagues suspected that a different group of microbes might be bearing the brunt of the task. Nitrospinae bacteria that use the chemical compound nitrite were known to be abundant in at least some parts of the dark ocean, but the microbes weren’t well studied.

So Pachiadaki’s team analyzed 3,463 genomes, or genetic blueprints, of single-celled organisms found in 39 seawater samples collected in the western North Atlantic Ocean, at depths ranging from “twilight” regions below about 200 meters to the ocean’s deepest zone below 9,000 meters. The team identified Nitrospinae as the most abundant bacteria, particularly in the twilight zone. Although still less abundant than the ammonium-oxidizing Thaumarchaeota, the nitrite-oxidizers are much more efficient at fixing carbon, requiring only a tiny amount of available nitrite.
And although scientists knew that these bacteria use nitrite to produce energy, the new study showed that the compound is the primary source of energy for the microbes.

Marine microbiologist Frank Stewart of Georgia Tech in Atlanta says the study “exemplifies how advances in genomic methods can generate hypotheses about metabolism and ecology.” These findings suggest that scientists need to rethink how energy and materials cycle in the dark ocean, he says. “While this ocean realm remains underexplored, studies like this are models for how to close our knowledge gap.”
Some scientific anniversaries celebrate events so momentous that they capture the attention of many nonscientists as well — or even the entire world.
One such anniversary is upon us. December 2 marks the semisesquicentennial (75th anniversary) of the first controlled and sustained nuclear fission chain reaction. Only four years after German scientists discovered nuclear fission, scientists in America took the first step toward harnessing it. Many of those scientists were not Americans, though, but immigrants appalled by Hitler and horrified at the prospect that he might acquire a nuclear fission weapon.
Among the immigrants who initiated the American fission effort was Albert Einstein. His letter to President Franklin Roosevelt, composed at the request and with the aid of Leo Szilard, an immigrant from Hungary, warned of nuclear fission’s explosive potential. Presented with Einstein’s letter in October 1939, Roosevelt launched what soon became the Manhattan Project, which eventually produced the atomic bomb. It was another immigrant, Enrico Fermi from Italy, who led the initial efforts to show that building an atomic bomb was possible.
Fermi had arrived in the United States in January 1939, shortly after receiving the Nobel Prize in physics for his work on creating artificial elements heavier than uranium. Except that he hadn’t actually done so — his “new elements” were in fact familiar elements produced by the splitting of the uranium nucleus. But nobody yet knew that fission was possible, so Fermi had misinterpreted his results.

Chemists Otto Hahn and Fritz Strassmann, working in Germany, conducted experiments in 1938 that produced the element barium by bombarding uranium with neutrons. So Hahn and Strassmann got the credit for discovering fission, although they didn’t really know what they had done either. It was Lise Meitner, a former collaborator of Hahn’s who had recently left Germany to escape Nazi anti-Semitism, who figured out that they had split the uranium nucleus.

Meitner’s nephew Otto Frisch revealed her insight to Niels Bohr, the world’s leading atomic physicist, just as Bohr stepped aboard a ship for a visit to America. Upon arriving in the United States, Bohr informed Fermi and Princeton University physicist John Archibald Wheeler of Hahn’s experiment and Meitner’s explanation. Fermi immediately began further experimental work at Columbia University to investigate fission, as did Szilard, also at Columbia (and others in Europe); Bohr and Wheeler tackled the issue from the theoretical side.
Fermi and Szilard quickly succeeded in showing that a fission “chain reaction” was in principle possible: Neutrons emitted from fissioning uranium nuclei could induce more fission. By September, Bohr and Wheeler had produced a thorough theoretical analysis, explaining the physics underlying the fission process and identifying which isotope of uranium fissioned most readily. It was clear that the initial speculations about fission’s potential power had not been exaggerated.
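The logic of a chain reaction boils down to a single number, often called the multiplication factor k: the average number of neutrons from each fission that go on to cause another fission. The short Python sketch below is purely illustrative — the values of k and the number of generations are made up and do not represent Fermi’s or anyone else’s actual calculation — but it shows why a neutron population fizzles out when k is below 1 and grows enormously when k is even slightly above 1.

# Illustrative only: a chain reaction modeled as simple per-generation
# multiplication of the neutron population by a factor k.
def neutron_population(k, generations, start=1.0):
    """Neutron count after the given number of fission generations."""
    neutrons = start
    for _ in range(generations):
        neutrons *= k  # each generation multiplies the population by k
    return neutrons

for k in (0.99, 1.0, 1.001):
    print(f"k = {k}: about {neutron_population(k, 5000):,.2f} neutrons after 5000 generations")

With k = 0.99 the population all but vanishes after a few thousand generations; with k = 1.001 it grows more than a hundredfold, which is why achieving a sustained reaction hinges on nudging k just past 1.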
“Almost immediately it occurred to many people around the world that this could be used to make power and that it could be used for nuclear explosives,” another immigrant who worked on the Manhattan Project, the German physicist Hans Bethe, told me during an interview in 1997. “Lots of people verified that indeed when uranium is bombarded by neutrons, slow neutrons in particular, a process occurs which releases tremendous amounts of energy.” Bethe, working at Cornell University, did not immediately join the fission project — he thought building a bomb would take too long to matter for World War II. “I thought this had nothing to do with the war,” he said. “So I instead went into radar.”
Fermi, despite being an immigrant, was put in charge of constructing an “atomic pile” (what today is called a nuclear reactor) to verify the chain reaction theory. He was, after all, widely acknowledged as the world’s leading nuclear experimentalist (and was no slouch as a theorist either); colleagues referred to him as “The Pope” because of his supposed infallibility. Construction of the pile began on a squash court under the stands of the University of Chicago’s football stadium. The goal was to demonstrate the ability to generate a chain reaction, in which any one fissioning nucleus would emit enough neutrons to trigger even more nuclei to fission.
“It became clear to Fermi almost immediately that in order to do this with natural uranium you had to slow down the neutrons,” Bethe said.
Fermi decided that the best material for slowing neutrons was graphite, the form of carbon commonly used as pencil lead. But in preliminary tests the graphite did not do the job as Fermi had anticipated. He reasoned that the graphite contained too many impurities to work effectively. So Szilard began searching for a company that could produce ultrapure graphite. He found one, Bethe recalled, that happily agreed to meet Fermi’s purity requirements — for double the usual graphite price. Ultimately Fermi’s atomic pile succeeded, producing a sustained chain reaction on December 2, 1942. That success led to the establishment of the secret laboratory in Los Alamos, N.M., where physicists built the bombs that brought World War II to an end in 1945.
By then, Bethe had been persuaded to join the project. He arrived at Los Alamos in April 1943 and witnessed the first nuclear explosion, at Alamogordo, N.M., on July 16, 1945.
“I was among the people who looked at it from a 20-mile distance,” he said. “It was impressive.”
Historians frequently cite the recollection of J. Robert Oppenheimer, director of the Los Alamos project, who said that the explosion reminded him of a line from the Hindu scripture the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.”
Bethe recalled a different response, from one of the military officials on the scene.
“One of the officers at the explosion said, ‘My god. Those longhairs have let it get away from them.’”
One of the strangest things about growing a human being inside your body is the alien sensation of his movements. It’s wild to realize that these internal jabs and pushes are the work of someone else’s nervous system, skeleton and muscles. Someone with his own distinct, mysterious agenda that often includes taekwondoing your uterus as you try to sleep.
Around the 10-week mark, babies start to bend their heads and necks, followed by full-body wiggles, limb movement and breathing around 15 weeks. These earliest movements are usually undetectable by pregnant women, particularly first-timers who may not recognize the flutters until 16 to 25 weeks of pregnancy. These movements can be exciting and bizarre, not to mention uncomfortable. But for the developing baby, these kicks are really important, helping to sculpt muscles, bones and joints. While pregnant women can certainly sense a jab, scientists have largely been left in the dark about how normal fetuses move. “It’s extremely difficult to investigate fetal movements in detail in humans,” says Stefaan Verbruggen, a bioengineer formerly at Imperial College London who recently moved to Columbia University in New York.
Now, using relatively new MRI measurements of entire fetuses wiggling in utero, researchers have tracked these kicks across women’s pregnancies. The results, published January 24 in the Journal of the Royal Society Interface, offer the clearest look yet at fetal kicking and provide hints about why these moves are so important.

Along with bioengineer Niamh Nowlan of Imperial College London and colleagues, Verbruggen analyzed videos of fetal kicks caught on MRI scans. These scans, from multiple pregnant women, included clear leg kicks at 20, 25, 30 and 35 weeks gestation. Other MRI scans provided anatomical details about bones, joints and leg sizes. With sophisticated math and computational models, the researchers could estimate the strengths of the kicks, as well as the mechanical effects, such as stresses and strains, that those kicks put on fetal bones and joints.

Kicks ramped up and became more forceful from 20 to 30 weeks, the researchers found. During this time, kicks shifted the wall of the uterus by about 11 millimeters on average. But by 35 weeks, kick force had declined, and the uterus moved less with each kick, only about 4 millimeters on average. (By this stage, things are getting pretty tight and tissues might be stretched taut, so this decrease makes sense.) Yet even with this apparent drop in force, the stresses experienced by the fetus during kicks kept increasing through 35 weeks. Increasing pressure on the leg bones and joints probably helps the fetus grow, the researchers write.
Other work has found that the mechanical effects of movement can stimulate bone growth, which is why weight-bearing exercises, such as brisk walking and step aerobics, are often recommended for people with osteoporosis. In animal studies, stationary chick and mouse fetuses have abnormal bones and joints, suggesting that movement is crucial to proper development.
The results highlight the importance of the right kinds of movements for fetuses’ growth. Babies born prematurely can sometimes have joint disorders. It’s possible that bone growth and joints are affected when babies finish developing in an environment dominated by gravity, instead of the springy, tight confines of a uterus. Even in utero position might have an effect. Head-up breech babies, for instance, have a higher risk of a certain hip disorder, a link that hints at a relationship between an altered kicking ground and development. In fact, the researchers are now looking at the relationship between fetal movements and skeletal stress and strain in these select groups.
Mechanical forces in utero might have long-lasting repercussions. Abnormal joint shapes are thought to increase the risk of osteoarthritis, says Verbruggen, “which means that how you move in the womb before you’re even born can affect your health much later in life.”
There’s a lot more work to do before scientists fully understand the effects of fetal movements, especially those in less than ideal circumstances. But by putting hard numbers to squirmy wiggles, this new study is kicking things off right.
It’s another record for SpaceX. At 3:50 p.m. Eastern on February 6, the private spaceflight company launched the Falcon Heavy rocket for the first time.
The Heavy — essentially three SpaceX Falcon 9 rocket boosters strapped together — is the most powerful rocket launched since the Saturn V, which shot astronauts to the moon during the Apollo program. SpaceX hopes to use the Heavy to send humans into space. The company is developing another rocket, dubbed the BFR, to eventually send people to Mars.

Another first for this launch: the synchronized return of two of the boosters. (The third, from the center core, didn’t descend properly, and instead of landing on a droneship, it hit the ocean at 300 mph.) Part of SpaceX’s program is to reuse rockets, which brings down the cost of space launches. The company has successfully landed the cores of its Falcon 9 rockets 21 times and has reflown boosters six times. The company landed a previously used rocket for the first time in March 2017.
But the cargo for today’s launch is aimed at another planet. The rocket carried SpaceX CEO Elon Musk’s red Tesla Roadster with “Space Oddity” by David Bowie playing on the stereo. It is now heading toward Mars.
“I love the thought of a car drifting apparently endlessly through space and perhaps being discovered by an alien race millions of years in the future,” Musk tweeted in December.

Editor’s note: This story was updated on February 7 to update the status of the booster landings, and again on February 9 to correct the rocket that SpaceX hopes to use to send people to Mars. The company intends to use its BFR rocket, not the Falcon Heavy.
Researchers from Google are testing a quantum computer with 72 quantum bits, or qubits, scientists reported March 5 at a meeting of the American Physical Society — a big step up from the company’s previous nine-qubit chip.
The team hopes to use the larger quantum chip to demonstrate quantum supremacy for the first time, performing a calculation that is impossible with traditional computers (SN: 7/8/17, p. 28), Google physicist Julian Kelly reported. Achieving quantum supremacy requires a computer of more than 50 qubits, but scientists are still struggling to control so many finicky quantum entities at once. Unlike standard bits that take on a value of 0 or 1, a qubit can be 0, 1 or a mashup of the two, thanks to a quantum quirk known as superposition.
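As a rough, textbook-level illustration (standard quantum notation, nothing specific to Google’s chip), a qubit’s state can be written as a weighted blend of the two ordinary bit values:

\[ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1, \]

where measuring the qubit yields 0 with probability \(|\alpha|^2\) and 1 with probability \(|\beta|^2\). It is this ability to hold both values at once, across many qubits simultaneously, that makes controlling dozens of them so powerful — and so finicky.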
Nicknamed Bristlecone because its qubits are arranged in a pattern resembling a pinecone’s scales, the computer is now being put through its paces. “We’re just starting testing,” says physicist John Martinis of Google and the University of California, Santa Barbara. “From what we know so far, we’re very optimistic.” The quantum supremacy demonstration could come within a few months if everything works well, Martinis says.
Google is one of several companies working to make quantum computers a reality. IBM announced it was testing a 50-qubit quantum computer in November 2017 (SN Online: 11/10/17), and Intel announced a 49-qubit test chip in January.