Hayabusa2 has blasted the asteroid Ryugu with a projectile, probably adding a crater to the small world’s surface and stirring up dust that scientists hope to snag.
The projectile, a two-kilogram copper cylinder, separated from the Hayabusa2 spacecraft at 9:56 p.m. EDT on April 4, JAXA, Japan’s space agency, reports.
Hayabusa2 flew to the other side of the asteroid to hide from debris that would have been ejected when the projectile hit (SN: 1/19/19, p. 20). Scientists won’t know for sure whether the object successfully made a crater, and, if so, how big it is, until the craft circles back. But by 10:36 p.m. EDT, Hayabusa2’s cameras had captured a blurry shot of a dust plume spurting up from Ryugu, so the team thinks the attempt worked. “This is the world’s first collision experiment with an asteroid!” JAXA tweeted.
Hayabusa2 plans to briefly touch down inside the crater to pick up a pinch of asteroid dust. The spacecraft has already grabbed one sample of Ryugu’s surface (SN Online: 2/22/19). But dust exposed by the impact will give researchers a look at the asteroid’s subsurface, which has not been exposed to sunlight or other types of space radiation for up to billions of years.
If all goes as planned, Hayabusa2 will return to Earth with both samples in late 2020. A third planned sample pickup has been scrapped because Ryugu’s boulder-strewn surface is so hazardous for the spacecraft. Comparing the two samples will reveal details of how being exposed to space changes the appearance and composition of rocky asteroids, and will help scientists figure out how Ryugu formed (SN Online: 3/20/19). Scientists hope that the asteroid contains water and organic material that might help explain how life got started in the solar system.
My youngest child, now just over a year old, has started to talk. Even though I’ve experienced this process with my older two, it’s absolutely thrilling. He is putting words to the thoughts that swirl around in his sweet little head, making his mind a little less mysterious to the rest of us.
But these early words may not mean what we think they mean, a new study hints. Unsurprisingly, when 2-year-olds were asked a series of “this or that” questions, the toddlers showed strong preferences — but not for the reasons you’d think. Overwhelmingly, the toddlers answered the questions with the last choice given. That bias, described in PLOS ONE on June 12, suggests that young children’s answers to these sorts of questions don’t actually reflect their desires. Instead, kids may simply be echoing the last thing they heard.
This verbal quirk can be used by parents to great effect, as the researchers point out in the title of their paper: “Cake or broccoli?” More fundamentally, the results raise questions about what sort of information a verbal answer actually pulls out of a young child’s mind. This murkiness is especially troublesome when it comes to questions whose answers call for adult action, such as: “Did you hit your sister on purpose or on accident?”
In the first series of experiments, researchers led by Emily Sumner at the University of California, Irvine, asked 24 1- and 2-year-olds a bunch of two-choice questions, some of which involved a polar bear named Rori or a grizzly bear named Quinn. One question, for example, was, “Does Rori live in an igloo or a tepee?” Later, the researchers switched the bear and the order of the options, asking, for example, “Does Quinn live in a tepee or an igloo?”
The toddlers could answer either verbally or, for reluctant speakers, by pointing at one of two stickers that showed the choices. When the children answered the questions by pointing, they chose the second option about half the time, right around chance. But when the toddlers spoke their answers, they chose the second option 85 percent of the time, regardless of the bear.

SECOND BEST: A toddler taking part in the study selects the second option in three either-or questions. This tendency, called the recency bias, may reflect kids' inability to juggle several choices in their minds simultaneously. Credit: E. Sumner et al/PLOS ONE 2019
This preference for the second option, a habit known as the recency bias, might arise because young children have trouble holding the first option in mind, the researchers suspect. Other experiments showed that children's tendency toward the second option got stronger as the words got longer.
Adults actually have the opposite tendency: We’re more inclined to choose the first option we’re given (the primacy bias). To see when this shift from last to first occurs, the researchers studied transcripts of conversations held between adults and children ages 1.5 to 4. In these natural conversations, 2-year-olds were more likely to choose the second option. But 3- and 4-year-olds didn’t show this bias, suggesting that the window closes around then.
The results hold a multitude of delightful parenting hacks: “Would you like to jump on the bed all night, or go to sleep?” But more importantly, the study serves as a reminder that the utterances of small children, while fascinating, may not carry the same meanings as those that come from more mature speakers. If you really want a straight answer, consider showing the two options to the toddler. But if you go that route, be prepared to hand over the cake.
A skull found in a cliffside cave on Greece’s southern coast in 1978 represents the oldest Homo sapiens fossil outside Africa, scientists say.
That skull, from an individual who lived at least 210,000 years ago, was encased in rock that also held a Neandertal skull dating to at least 170,000 years ago, contends a team led by paleoanthropologist Katerina Harvati of the University of Tübingen in Germany.
If these findings, reported online July 10 in Nature, hold up, the ancient Greek H. sapiens skull is more than 160,000 years older than the next oldest European H. sapiens fossils (SN Online: 11/2/11). It’s also older than a proposed H. sapiens jaw found at Israel’s Misliya Cave that dates to between around 177,000 and 194,000 years ago (SN: 2/17/18, p. 6).
“Multiple Homo sapiens populations dispersed out of Africa starting much earlier, and reaching much farther into Europe, than previously thought,” Harvati said at a July 8 news conference. African H. sapiens originated roughly 300,000 years ago (SN: 7/8/17, p. 6). A small group of humans may have reached what’s now Greece more than 200,000 years ago, she suggested. Neandertals who settled in southeastern Europe not long after that may have replaced those first H. sapiens. Then humans arriving in Mediterranean Europe tens of thousands of years later would eventually have replaced resident Neandertals, who died out around 40,000 years ago (SN Online: 6/26/19).
But Harvati’s group can’t exclude the possibility that H. sapiens and Neandertals simultaneously inhabited southeastern Europe more than 200,000 years ago and sometimes interbred. A 2017 analysis of ancient and modern DNA concluded that humans likely mated with European Neandertals at that time.
The two skulls were held in a small section of wall that had washed into Greece’s Apidima Cave from higher cliff sediment and then solidified roughly 150,000 years ago. Since one skull is older than the other, each must originally have been deposited in different sediment layers before ending up about 30 centimeters apart on the cave wall, the researchers say.

Earlier studies indicated that one Apidima skull, which retains the face and much of the braincase, was a Neandertal that lived at least 160,000 years ago. But fossilization and sediment pressures had distorted the skull’s shape. Based on four 3-D digital reconstructions of the specimen, Harvati’s team concluded that its heavy brow ridges, sloping face and other features resembled Neandertal skulls more than ancient and modern human skulls. An analysis of the decay rate of radioactive forms of uranium in skull bone fragments produced an age estimate of at least 170,000 years.
A second Apidima fossil, also dated using uranium analyses, consists of the back of a slightly distorted braincase. Its rounded shape in a digital reconstruction characterizes H. sapiens, not Neandertals, the researchers say. A bunlike bulge often protrudes from the back of Neandertals’ skulls. But without any facial remains to confirm the species identity of the partial braincase, “it is still possible that both Apidima skulls are Neandertals,” says paleoanthropologist Israel Hershkovitz of Tel Aviv University. Hershkovitz led the team that discovered the Misliya jaw and assigned it to H. sapiens.
Harvati and her colleagues will try to extract DNA and species-distinguishing proteins (SN: 6/8/19, p. 6) from the Greek skulls to determine their evolutionary identities and to look for signs of interbreeding between humans and Neandertals.
The find does little to resolve competing explanations of how ancient humans made their way out of Africa. Harvati’s suggestion that humans trekked from Africa to Eurasia several times starting more than 200,000 years ago is plausible, says paleoanthropologist Eric Delson of City University of New York’s Lehman College in an accompanying commentary. And the idea that some H. sapiens newcomers gave way to Neandertals probably also applied to humans who reached Misliya Cave and nearby Middle Eastern sites as late as around 90,000 years ago, before Neandertals occupied the area by 60,000 years ago, Delson says.
Hershkovitz disagrees. Ancient humans and Neandertals lived side-by-side in the Middle East for 100,000 years or more and occasionally interbred, he contends. Misliya Cave sediment bearing stone tools dates to as early as 274,000 years ago, Hershkovitz says. Since only H. sapiens remains have been found in the Israeli cave, ancient humans probably made those stone artifacts and could have been forerunners of Greek H. sapiens.
No one should have to sleep with the fishes, but new research on zebrafish suggests that we sleep like them.
Sleeping zebrafish have brain activity similar to both deep slow-wave sleep and rapid eye movement, or REM, sleep that’s found in mammals, researchers report July 10 in Nature. And the team may have tracked down the cells that kick off REM sleep.
The findings suggest that the basics of sleep evolved at least 450 million years ago in zebrafish ancestors, before the evolution of animals that give birth to live young instead of laying eggs. That’s 150 million years earlier than scientists thought when they discovered that lizards sleep like mammals and birds (SN: 5/28/16, p. 9).
What’s more, sleep may have evolved underwater, says Louis C. Leung, a neuroscientist at Stanford University School of Medicine. “These signatures [of sleep] really have important functions — even though we may not know what they are — that have survived hundreds of millions of years of evolution.”

In mammals, birds and lizards, sleep has several stages characterized by specific electrical signals. During slow-wave sleep, the brain is mostly quiet except for synchronized waves of electrical activity. The heart rate decreases and muscles relax. During REM or paradoxical sleep, the brain lights up with activity almost like it’s awake. But the muscles are paralyzed (except for rapid twitching of the eyes) and the heart beats erratically.
For many years, scientists have known that fruit flies, nematodes, fish, octopuses and other creatures have rest periods reminiscent of sleep. But until now, no one could measure the electrical activity of those animals’ brains to see if that rest is the same as mammals’ snoozing.
Leung and colleagues developed a system to do just that in zebrafish by genetically engineering them to make a fluorescent molecule that lights up when it encounters calcium, which is released when nerve cells and muscles are active. By following the flashes of light using a light sheet microscope, the researchers tracked brain and muscle activity in the naturally transparent fish larvae.
The next task was to lull the fish to sleep under the microscope. In some experiments, the team added drugs that trigger either slow-wave or REM sleep in mammals to the fish’s water. In others, researchers deprived fish of sleep for a night or tuckered the fish out with lots of activity during the day. Results from all the snooze-inducing methods were the same.
Sleeping fish have two distinct types of brain activity, the team found. One, similar to slow-wave sleep, was characterized by short bursts of activity in some nerve cells in the brain. The researchers call that state slow-bursting sleep. REM-like sleep, which the researchers dubbed “propagating-wave sleep,” was characterized by frenzied brain activity that spreads like a wave through the brain. The researchers aren’t calling the phases REM or slow-wave sleep because there are some minor differences between the way fish and mammals sleep.

A group of cells that line hollow spaces called ventricles deep in the brain seems to trigger that wave of REM-like brain activity. These ependymal cells dip fingerlike cilia into the cerebral spinal fluid that bathes the ventricles and the central nervous system. The cells appear to beat their cilia faster as the amount of a well-known, sleep-promoting hormone called melanin-concentrating hormone in the fluid increases, the researchers discovered. It’s unclear how the ependymal cells communicate with the rest of the brain to set off REM-like activity. Such cells are also present in mammals, but no one has yet been able to see that deeply into the brains of sleeping mammals to determine whether the cells play a role in sleep. But knowing about these cells may help researchers develop better sleep aids, Leung says.
Just as in mammals, zebrafish’s whole bodies are affected during sleep. Their muscles relax, and during the slow-wave–like sleep their hearts slow from about 200 beats per minute when awake to about 110 to 120 beats per minute. During the REM-like sleep, the heart slows even more, to about 90 beats per minute, and loses its regular rhythm. And the fish’s muscles also go completely slack. The one characteristic that the fish lack is rapid eye movement. Instead, the eyes roll back into their sockets, says study coauthor Philippe Mourrain, a biologist at Stanford University School of Medicine.
Lack of eye movement could indicate that emotion-processing parts of the brain, such as the amygdala, aren’t as active in zebrafish as they are in mammals, says sleep researcher Allan Pack of the University of Pennsylvania Perelman School of Medicine. With their brain-activity monitoring, the researchers have taken sleep research “to the next level,” says Pack, and “they present pretty compelling evidence” of slow-wave and REM-like sleep in the fish.
The whole-body involvement that the researchers documented solidifies the argument that fish sleep is similar to mammals, says neuroscientist Paul Shaw of Washington University School of Medicine in St. Louis. In all organisms known to snooze, “sleep is manifest everywhere” in the body, he says.
Future experiments may show why poor sleep or a lack of Zs contributes to health problems in people, such as obesity, heart disease and diabetes.
Ice sheets expanded across much of northern Europe from around 25,000 to 19,000 years ago, making a huge expanse of land unlivable. That harsh event set in motion a previously unrecognized tale of two human populations that played out at opposite ends of the continent.
Western European hunter-gatherers rode out the icy blast in place. Easterners were replaced by waves of newcomers.
That’s the implication of the largest study to date of ancient Europeans’ DNA, covering a period before, during and after what’s known as the Last Glacial Maximum, paleogeneticist Cosimo Posth and colleagues report March 1 in Nature. As researchers have long thought, southwestern Europe provided refuge from the last Ice Age’s big chill for hunter-gatherers based in and near that region, the scientists say. But it turns out that southeastern Europe, where Italy is now located, did not offer lasting respite from the cold for nearby groups, as previously assumed.
Instead, those people were replaced by genetically distinct hunter-gatherers who presumably had lived just to the east along the Balkan Peninsula. Those people, who carried ancestry from parts of southwestern Asia, began trekking into what’s now northern Italy by about 17,000 years ago, as the Ice Age began to wane.
“If local [Ice Age] populations in Italy did not survive and were replaced by groups from the Balkans, this completely changes our interpretation of the archaeological record,” says Posth, of the University of Tübingen in Germany.
Posth and colleagues’ conclusions rest on analyses of DNA from 356 ancient hunter-gatherers, including new molecular evidence for 116 individuals from 14 countries in Europe and Asia. Excavated human remains that yielded DNA dated to between about 45,000 and 5,000 years ago (SN: 4/7/21).
Comparisons of sets of gene variants inherited by these hunter-gatherers from common ancestors enabled the researchers to reconstruct population movements and replacements that shaped ancient Europeans’ genetic makeup.

For the first time, ancient DNA evidence included individuals from what’s known as the Gravettian culture, which dates from about 33,000 to 26,000 years ago in central and southern Europe, and from southwestern Europe’s Solutrean culture, which dates to between about 24,000 and 19,000 years ago. Contrary to expectations, makers of Gravettian tools came from two genetically distinct groups that populated western and eastern Europe for roughly 10,000 years before the Ice Age reached its peak, Posth says. Researchers have traditionally regarded Gravettian implements as products of a biologically uniform population that occupied much of Europe.
“What we previously thought was one genetic ancestry in Europe turned out to be two,” says paleogeneticist Mateja Hajdinjak of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, who did not participate in the new study. And “it seems that western and southwestern Europe served as a [refuge from glaciation] more than southeastern Europe and Italy.”
Descendants of the western Gravettian population, who are associated with Solutrean artifacts and remnants of another ancient culture in western Europe that ran from about 19,000 to 14,000 years ago, outlasted the Ice Age before spreading northeastward across Europe, the researchers say.
Further support for southwestern Europe as an Ice Age refuge comes from DNA extracted from a pair of fossil teeth that belonged to an individual linked to the Solutrean culture in southern Spain. That roughly 23,000-year-old adult was genetically similar to western European hunter-gatherers who lived before and after the Last Glacial Maximum, Max Planck paleogeneticist Vanessa Villalba-Mouco and colleagues, including Posth, report March 1 in Nature Ecology & Evolution.
Meanwhile, the genetic evidence suggests that hunter-gatherers in what’s now Italy were replaced by people from farther east, probably based in the Balkan region. Those newcomers must have brought with them a distinctive brand of stone artifacts, previously excavated at Italian sites and elsewhere in eastern Europe, known as Epigravettian tools, Posth says. Many archaeologists have suspected that Epigravettian items were products of hunter-gatherers who clustered in Italy during the Ice Age’s peak freeze.
But, Hajdinjak says, analyses of DNA from fossils of Ice Age Balkan people are needed to clarify what groups moved through Italy, and when those migrations occurred.
Ultimately, descendants of Ice Age migrants into Italy reached southern Italy and then western Europe by around 14,000 years ago, Posth and colleagues say. Ancient DNA evidence indicates that, during those travels, they left a major genetic mark on hunter-gatherers across Europe.
Bacteria can slip into the brain by commandeering cells in the brain’s protective layers, a new study finds. The results hint at how a deadly infection called bacterial meningitis takes hold.
In mice infected with meningitis-causing bacteria, the microbes exploit previously unknown communication between pain-sensing nerve cells and immune cells to slip by the brain’s defenses, researchers report March 1 in Nature. The results also hint at a new way to possibly delay the invasion — using migraine medicines to interrupt those cell-to-cell conversations.

Bacterial meningitis is an infection of the protective layers, or meninges, of the brain that affects 2.5 million people globally per year. It can cause severe headaches and sometimes lasting neurological injury or death.
“Unexpectedly, pain fibers are actually hijacked by the bacteria as they’re trying to invade the brain,” says Isaac Chiu, an immunologist at Harvard Medical School in Boston. Normally, one might expect pain to be a warning system for us to shut down the bacteria in some way, he says. “We found the opposite…. This [pain] signal is being used by the bacteria for an advantage.”
It’s known that pain-sensing neurons and immune cells coexist in the meninges, particularly in the outermost layer called the dura mater (SN: 11/11/20). So to see what role the pain and immune cells play in bacterial meningitis, Chiu’s team infected mice with two of the bacteria known to cause the infection in humans: Streptococcus pneumoniae and S. agalactiae. The researchers then observed where those bacteria ended up in mice genetically tweaked to lack pain-sensing nerve cells and compared those resting spots with the bacteria’s locations in mice that had the nerve cells intact.
Mice without pain-sensing neurons had fewer bacteria in the meninges and brain than those with the nerve cells, the team found. This contradicts the idea that pain in meningitis serves as a warning signal to the body’s immune system, mobilizing it for action.
Further tests showed that the bacteria triggered a chain of immune-suppressing events, starting with the microbes secreting toxins in the dura mater.
The toxins hitched onto the pain neurons, which in turn released a molecule called CGRP. This molecule is already known to bind to a receptor on immune cells, where it helps control the dura mater’s immune responses. Injecting infected mice with more CGRP lowered the number of dural immune cells and helped the infection along, the researchers found.
The team also looked more closely at the receptor that CGRP binds to. In infected mice bred without the receptor, fewer bacteria made it into the brain. But in ones with the receptor, immune cells that would otherwise engulf bacteria and recruit reinforcements were disabled. The findings suggest that either preventing the release of CGRP or preventing it from binding to immune cells might help delay infection.
In humans, neuroscientists know that CGRP is a driver of headaches — it’s already a target of migraine medications (SN: 6/5/18). So the researchers gave five mice the migraine medication olcegepant, which blocks CGRP’s effects, and infected them with S. pneumoniae. After infection, the medicated mice had fewer bacteria in the meninges and brain, took longer to show symptoms, didn’t lose as much weight and survived longer than mice that were not given the medication.
The finding suggests olcegepant slowed the infection. Even though the drug bought the mice only a few extra hours, those hours matter in meningitis, which can develop over just a few hours itself. Were olcegepant to work the same way in humans, it might give doctors more time to treat the disease. But the effect is probably not as dramatic in people, cautions Michael Wilson, a neurologist at the University of California, San Francisco who wasn’t involved with the work.
Scientists still need to determine whether pain-sensing nerve cells and immune cells have the same rapport in human dura mater, and whether migraine drugs could help treat bacterial meningitis in people.
Neurologist Avindra Nath has doubts. Clinicians think the immune response and inflammation damage the brain during meningitis, says Nath, who heads the team investigating nervous system infections at the National Institute of Neurological Disorders and Stroke in Bethesda, Md. So treatment involves drugs that suppress the immune response, rather than enhance it as migraine medications might.
Chiu acknowledges this but notes there might be room for both approaches. If dura mater immune cells could head the infection off at the pass, they might keep some bacteria from penetrating the brain’s defenses, minimizing brain inflammation.
This study might not ultimately change how clinicians treat patients, Wilson says. But it still reveals something new about one of the first lines of defense for the brain.
Like so many others, I’ve been watching the HBO series The Last of Us. It’s a classic zombie apocalypse drama following Joel (played by Pedro Pascal) and Ellie (Bella Ramsey) as they make their way across the former United States (now run by a fascist government called Fedra).
I’m a big fan of zombie and other post-apocalyptic fiction. And my husband had told me how good the storyline is in the video game that inspired the series, so I was prepared for interesting storytelling. What I didn’t expect was to be so intrigued by the science behind the sci-fi.

In the opening minutes of the series, two scientists on a fictional 1968 talk show discuss the microbes that give them pandemic nightmares. One says it’s fungi — not viruses or bacteria — that keep him awake. Especially worrisome, he says, are the fungi that control rather than kill their hosts. He gives the example of fungi that turn ants into living zombies, puppeteering the insects by flooding their brains with hallucinogens.
He goes on to warn that even though human body temperature keeps us fungus-free, that might not be true if the world got a little bit warmer. He predicts that as the thermostat climbs, a fungus that hijacks insects could mutate a gene allowing it to burrow into human brains and take control of our minds. Such a fungus could induce its human puppets to spread the fungus “by any means necessary,” he says. What’s worse, there are no preventatives, treatments or cures, nor any way to make them.
It’s a brief segment, but it had me hooked. It all sounded so chilling and … plausible. After all, fungi like ones that cause nail infections, yeast infections and ringworm already infect people.
So I consulted some experts on fungal infections to find out whether this could actually happen.
I’ve got good news and bad news.
First, the bad news.
Bad news: Climate change has already helped one fungus mutate to infect humans

I wanted to know if warming has spurred any fungi to mutate and become infectious. So I called Arturo Casadevall. He has been thinking about fungi and heat for a long time. He’s proposed that the need to avoid fungal infections may have provided the evolutionary pressure that drove mammals and birds to evolve warm-bloodedness (SN: 12/3/10).
Most fungal species simply can’t reproduce at human body temperature (37° Celsius, or 98.6° Fahrenheit). But as the world warms, “these strains either have to die or adapt,” says Casadevall, a microbiologist who specializes in fungal infections at Johns Hopkins Bloomberg School of Public Health. That raises the possibility that fungi that now infect insects or reptiles could evolve to grow at temperatures closer to human body temperature.
At the same time, humans’ average body temperature has been falling since the 19th century, at least in high-income countries, researchers reported in eLife in 2020. One study from the United Kingdom pegs average body temperature at 36.6° C (97.9° F). And some of us are even cooler.
Fungi’s possible adaptation to higher heat and humans’ cooling body temperature are on a collision course, Casadevall says. He and colleagues presented evidence of one such crash. Climate change may have allowed a deadly fungus called Candida auris to acclimate to human body temperatures (SN: 7/26/19). A version of the fungus that could infect humans independently emerged on three continents from 2012 to 2015. “It’s not like someone took a plane and spread it. These things came out of nowhere simultaneously,” Casadevall says.
Some people argue that the planet hasn’t warmed enough to make fungi a problem, he says. “But you have to think about all the really hot days [that come with climate change]. Every really hot day is a selection event,” in which many fungi will die. But some of those fungi will have mutations that help them handle the heat. Those will survive, and their offspring may withstand even hotter future heat waves, until human body temperature poses no challenge at all.
Fungi that infect people are usually not picky about their hosts, Casadevall says. They will grow in soil or — if given an opportunity — in people, pets or in other animals. The reason fungi don’t infect people more often is that “the world is much colder than we are, and they have no need of us,” he says.
When people do get infected, the immune system usually keeps the fungi in check. But fungal infections can cause serious illness or be deadly, particularly to people with weakened immune systems (SN: 11/29/21; SN: 1/10/23).
The second episode of The Last of Us reveals that the zombie-creating fungi initially spread through people eating contaminated flour. Then, the infected people attack and bite others, spreading the fungus.
In real life, most human infections arise from breathing in spores. But Casadevall says it’s “not implausible” that people could get infected by eating spores or by being bitten.
Also bad: Fungal genes can adapt to higher heat

I also wondered exactly how a fungus could evolve in response to heat. Asiya Gusa, a fungal researcher at Duke University School of Medicine, has published one possibility.
In 2020, she and colleagues reported in the Proceedings of the National Academy of Sciences on how one fungus mutated at elevated temperature to become harder to fight.
Cryptococcus deneoformans, which already infects humans (though it’s no zombie-maker), became resistant to some antifungal drugs when grown at human body temperature. The resistance arose when mobile bits of DNA known as transposons, or jumping genes, hopped into a few genes needed for the antifungals to work.
In a follow-up study, Gusa and colleagues grew C. deneoformans at either 30° C or 37° C for 800 generations, long enough to detect multiple changes in their DNA. Fungi had no problem growing at the balmy 30° C (86° F), the temperature at which researchers typically grow fungi in the lab. But their growth slowed at the higher temperature, a sign that the fungi were under stress from the heat.
In C. deneoformans, that heat stress really got things jumping. One type of transposon accumulated a median of 12 extra copies of itself in fungi grown at body temperature. By contrast, fungi grown at 30° C tended to pick up a median of only one extra copy of the transposon. The team reported those results January 20 in PNAS. The researchers don’t yet know the effect the transposon hops might have on the fungi’s ability to infect people, cause disease or resist fungus-fighting drugs.
So yeah, the bad news is not great. Fungi are mutating in the heat and at least one species has gained the ability to infect people thanks to climate change. Other fungi that infect people are more widespread than they were in the 1950s and 1960s, also thanks to a warming world (SN: 1/4/23).
But I promised good news. And here it is.
Good news: Human brains may resist zombification

It may not be our body temperature, but our brain chemistry, that protects us from being hijacked by zombifying fungi.
I consulted Charissa de Bekker and Jui-Yu Chou, two researchers who study the Ophiocordyceps fungi that are the model for the TV show’s fungal menace. These fungi infect ants, flooding the insects with a cocktail of chemicals that steer the ants to climb plants. Once in position, the ants chomp down and the chemicals keep the jaw muscles locked in place (SN: 7/17/19).
Unlike most fictional zombies, the ants are alive during this process. “A lot of people get the misconception that we work on undead ants,” says de Bekker, a microbiologist at Utrecht University in the Netherlands. She’s glad to see the show “stick to the story of the host being very much alive while its behaviors change.” The fungi even help preserve the ant, keeping it alive even while feeding on it. But eventually the ant dies. Then a mushroom rises from the corpse, showering spores onto the ground where other ants may become infected.
Related species of Ophiocordyceps infect various species of ants and other insects. But each fungal species is highly specific to its host. That’s because each fungus has had to tailor the chemicals it uses to control the particular species it infects. The ability to manipulate behavior comes at the cost of not being able to infect multiple species. A fungus that specializes in infecting ants probably can’t get past humans’ immune systems, says Chou, a fungal researcher at the National Changhua University of Education in Taiwan. “Think of a key that fits into a specific lock. It is only this unique combination that will trigger the lock to open,” he says.
Even if the fungi evolved to withstand human body temperature and immune system attacks, they probably couldn’t take control of our minds, de Bekker says. “Manipulation is like a whole different ballgame. You need a ton of additional tools to get there.” It took millions of years of coevolution for the fungi to master piloting ants, after all.
While fungi do make mind-altering chemicals that can affect human behavior (LSD and psilocybin, for instance), Casadevall agrees that fungi that mind control insects probably won’t turn humans into zombies. “It’s not one of my worries,” he says.
Infected ants don’t turn into vicious, biting zombies either, de Bekker says. “If anything, we actually see the healthy ants being aggressive toward infected individuals, once they figure out that they’re infected, to basically get rid of them.” That “social immunity” helps protect the rest of the nest from infection.
Also good: Humans are innovative enough to develop treatments
The fictional scientist’s assertion that we couldn’t prevent, treat or cure these fungal infections is also a stretch.
Antifungal drugs exist, and they cure many fungal infections, though some, particularly those that spread to the brain, can be difficult to clear. Some fungi are also evolving resistance to the drugs. And a few fungal vaccines are in the works, although they may not be ready for years.
The experts I talked to say they hope the show will bring attention to real fungal diseases.
Gusa was especially glad to see fungi in the limelight. And she shares my fondness for that retro series opening in which the scientist predicts climate change could spawn mind-controlling fungi bent on infecting every person on the planet.
“I was pretty much yelling at the TV when I watched the [show’s] intro,” in an excited kind of way, she says. “This is the foundation of a lot of my grant funding … the threat of thermal adaptation of fungi.… To see it played out on the screen was something kind of fun.”
The Milky Way is churning out far more stars than previously thought, according to a new estimate of its star formation rate.
Gamma rays from aluminum-26, a radioactive isotope that arises primarily from massive stars, reveal that the Milky Way converts four to eight solar masses of interstellar gas and dust into new stars each year, researchers report in work submitted to arXiv.org on January 24. That range is two to four times the conventional estimate and corresponds to an annual birthrate in our galaxy of about 10 to 20 stars, because most stars are less massive than the sun. At this rate, every million years — a blink of the eye in astronomical terms — our galaxy spawns 10 million to 20 million new stars. That’s enough to fill roughly 10,000 star clusters like the beautiful Pleiades cluster in the constellation Taurus. In contrast, many galaxies, including most of the ones that orbit the Milky Way, make no new stars at all.
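The conversion from a mass-based rate to a star count above is simple arithmetic. A minimal sketch, assuming a mean stellar mass of about 0.4 solar masses (a typical average for a standard stellar initial mass function; the exact figure is my assumption, not from the article):

```python
# Back-of-envelope check of the star-formation arithmetic above.
# Assumption (not from the article): a mean stellar mass of ~0.4 solar
# masses, since most stars are less massive than the sun.

MEAN_STELLAR_MASS = 0.4  # solar masses (assumed average per star)

for rate in (4, 8):  # solar masses of gas converted to stars per year
    stars_per_year = rate / MEAN_STELLAR_MASS
    stars_per_million_years = stars_per_year * 1e6
    print(f"{rate} M_sun/yr -> about {stars_per_year:.0f} stars/yr, "
          f"or {stars_per_million_years:.0e} stars per million years")
```

With that assumed mean mass, four to eight solar masses per year indeed works out to roughly 10 to 20 stars per year, or 10 million to 20 million stars every million years, matching the figures in the article.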
“The star formation rate is very important to understand for galaxy evolution,” says Thomas Siegert, an astrophysicist at the University of Würzburg in Germany. The more stars a galaxy makes, the faster it enriches itself with oxygen, iron and the other elements that stars create. Those elements then alter star-making gas clouds and can change the relative number of large and small stars that the gas clouds form.
Siegert and his colleagues studied the observed intensity and spatial distribution of emission from aluminum-26 in our galaxy. A massive star creates this isotope during both life and death. During its life, the star blows the aluminum into space via a strong wind. If the star explodes when it dies, the resulting supernova forges more. The isotope, with a half-life of 700,000 years, decays and gives off gamma rays.
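The 700,000-year half-life maps directly onto the standard exponential-decay law. A quick sketch (the half-life figure is from the article; the formula is textbook radioactive decay):

```python
# Fraction of an initial amount of aluminum-26 remaining after a given
# time, using the standard half-life decay law N(t) = N0 * (1/2)**(t / t_half).
HALF_LIFE_YEARS = 700_000  # aluminum-26 half-life

def fraction_remaining(years):
    """Fraction of the original aluminum-26 left after `years`."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

print(fraction_remaining(700_000))    # one half-life: half remains
print(fraction_remaining(7_000_000))  # ten half-lives: under 0.1 percent
```

On million-year timescales the isotope is nearly gone, which is why its gamma-ray glow traces only recent star formation.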
Like X-rays, and unlike visible light, gamma rays penetrate the dust that cloaks the youngest stars. “We’re looking through the entire galaxy,” Siegert says. “We’re not X-raying it; here we’re gamma-raying it.”
The more stars our galaxy spawns, the more gamma rays emerge. The best match with the observations, the researchers find, is a star formation rate of four to eight solar masses a year. That is much higher than the standard estimate for the Milky Way of about two solar masses a year.
The revised rate is very realistic, says Pavel Kroupa, an astronomer at the University of Bonn in Germany who was not involved in the work. “I’m very impressed by the detailed modeling of how they account for the star formation process,” he says. “It’s a very beautiful work. I can see some ways of improving it, but this is really a major step in the absolutely correct direction.”
Siegert cautions that it is difficult to tell how far the gamma rays have traveled before reaching us. In particular, if some of the observed emission arises nearby — within just a few hundred light-years of us — then the galaxy has less aluminum-26 than the researchers have calculated, which means the star formation rate is on the lower side of the new estimate. Still, he says it’s unlikely to be as low as the standard two solar masses per year. In any event, the Milky Way is the most vigorous star creator in a collection of more than 100 nearby galaxies called the Local Group. The largest Local Group galaxy, Andromeda, converts only a fraction of a solar mass of gas and dust into new stars a year. Among Local Group galaxies, the Milky Way ranks second in size, but its high star formation rate means that we definitely try a lot harder.
For the first time, researchers have harnessed the body’s own chemistry to “grow” electrodes inside the tissues of living fish, blurring the boundary between biology and machines.
The technique uses the body’s sugars to turn an injected gel into a flexible electrode without damaging tissues, experiments show. Zebrafish with these electrodes grown in their brains, hearts and tail fins showed no signs of ill effects, and electrodes tested in leeches successfully stimulated a nerve, researchers report in the Feb. 24 Science. Someday, these electrodes could be useful for applications ranging from studying how biological systems work to improving human-machine interfaces. They also could be used in “bioelectronic medicine,” such as brain stimulation therapies for depression, Parkinson’s disease and other conditions (SN: 2/10/19).
Soft electronics aim to bridge the gap between soft, curvy biology and electronic hardware. But these electronics typically still contain components prone to cracks and other failures, and inserting the devices inevitably damages tissue.
“All the devices we have made, even though we have made them flexible, to make them more soft, when we introduce them, there will still be a scar. It’s like sticking a knife into the organ,” says Magnus Berggren, a materials scientist at Linköping University in Sweden. That scarring and inflammation can degrade electrode performance over time.
Previous efforts to grow soft electronics inside tissues have drawbacks. One approach uses electrical or chemical signals to power the transformation from chemical soup to conducting electrodes, but these zaps also cause damage. A 2020 study got around this problem by genetically modifying cells in worms to produce an engineered enzyme that does the job, but the new method achieves its results without genetic alterations.
Berggren and colleagues’ electrodes instead exploit a natural energy source already present in the body: sugars. The gel cocktail contains enzymes called oxidases that react with the sugars — glucose or lactate — to produce hydrogen peroxide. That in turn activates another ingredient in the cocktail, an enzyme called horseradish peroxidase, the catalyst needed to transform the gel into a conducting electrode.
“The approach leverages elegant chemistry to overcome many of the technical challenges,” says biomedical engineer Christopher Bettinger of Carnegie Mellon University in Pittsburgh, who was not involved in the study.
To test the technique, the researchers injected the cocktail into the brains, hearts and tail fins of transparent zebrafish. The gel turns blue when it becomes conductive, giving a visual readout of its success. “The beautiful thing is you can see it: The zebrafishes’ tail changes color, and we know that blue indicates a conducting polymer,” says materials scientist Xenofon Strakosas, also of Linköping University. “The first time I saw it, I thought ‘Wow, it’s really working!’”
The fish appeared to suffer no ill effects, and the researchers saw no evidence of tissue damage. In partially dissected leeches, the team showed that delivering a current to a nerve via a soft electrode could induce muscle contractions. Ultimately, devices like this could be paired with various wireless technologies in development.
Long-term implant performance remains to be determined, however. “The demonstrations are impressive,” Bettinger says. “What remains to be seen is the stability of the electrode.” Over time, substances in the body could react with the electrode materials, degrading it or even producing toxic substances.
The team still needs to refine how precisely the electrodes can stimulate nerves, says chemical engineer Zhenan Bao of Stanford University, who was not involved in the work. She and colleagues developed the earlier approach of “growing” electrical components with the help of genetic modifications. Ensuring that stimulation is concentrated where it’s needed for a treatment, while preventing current from leaking into unwanted regions, will be important, she says.
In the new study, the relative abundance of different sugars in different tissues determines exactly where electrodes form. But in the future, a component of the main ingredient could be swapped out for elements that attach to specific bits of biology to make targeting much more precise, Berggren says. “We’re conducting experiments right now where we’re trying to bind these materials directly on individual cells.” Notes Strakosas: “There are some applications where precision is really important; that’s where we have to invest effort.”
The best shot we have at minimizing the future impacts of climate change is to limit global warming to 1.5 degrees Celsius. Since the Industrial Revolution began, humankind has already raised the average global temperature by about 1.1 degrees. If we continue to emit greenhouse gases at the current rate, the world will probably surpass the 1.5-degree threshold by the end of the decade.
That sobering fact makes clear that climate change isn’t just a problem to solve someday soon; it’s an emergency to respond to now. And yet, most people don’t act like we’re in the midst of the greatest crisis humans have ever faced — not politicians, not the media, not your neighbor, not myself, if I’m honest. That’s what I realized after finishing The Climate Book by Greta Thunberg.
The urgency to act now, to kick the addiction to fossil fuels, practically jumps off the page to punch you in the gut. So while not a pleasant read — it’s quite stressful — it’s a book I can’t recommend enough. The book’s aim is not to convince skeptics that climate change is real. We’re well past that. Instead, it’s a wake-up call for anyone concerned about the future.
A collection of bite-size essays, The Climate Book provides an encyclopedic overview of all aspects of the climate crisis, including the basic science, the history of denialism and inaction, and what to do next. Thunberg, who became the face of climate activism after starting the Fridays For Future protests as a teenager (SN: 12/16/19), assembles an all-star roster of experts to write the essays.
The first two sections of the book lay out how a small amount of warming can have major, far-reaching effects. For some readers, this will be familiar territory. But as each essay builds on the next, it becomes clear just how delicate Earth’s climate system is. What also becomes clear is the significance of 1.5 degrees (SN: 12/17/18). Beyond this point, scientists fear, various aspects of the natural world might reach tipping points that usher in irreversible changes, even if greenhouse gas emissions are later brought under control. Ice sheets could melt, raise sea levels and drown coastal areas. The Amazon rainforest could become a dry grassland.
The cumulative effect would be a complete transformation of the climate. Our health and the livelihood of other species and entire ecosystems would be in danger, the book shows. Not surprisingly, essay after essay ends with the same message: We must cut greenhouse gas emissions, now and quickly.
Repetition is found elsewhere in the book. Numerous essays offer overlapping scientific explanations, stats about emissions, historical notes and thoughts about the future. Rather than being tedious, the repetition reinforces the message that we know what the climate change threat is, we know how to tackle it and we’ve known for a long time. Thunberg’s anger and frustration over the decades of inaction, false starts and broken pledges are palpable in her own essays that run throughout the book. The world has known about human-caused climate change for decades, yet about half of all human-related carbon dioxide emissions ever released have occurred since 1990. That’s the year the Intergovernmental Panel on Climate Change released its first report and just two years before world leaders met in Rio de Janeiro in 1992 to sign the first international treaty to curb emissions (SN: 6/23/90).
Perversely, the people who will bear the brunt of the extreme storms, heat waves, rising seas and other impacts of climate change are those who are least culpable. The richest 10 percent of the world’s population accounts for half of all carbon dioxide emissions while the top 1 percent emits more than twice as much as the bottom half. But because of a lack of resources, poorer populations are the least equipped to deal with the fallout. “Humankind has not created this crisis,” Thunberg writes, “it was created by those in power.”
That injustice must be confronted and accounted for as the world addresses climate change, perhaps even through reparations, Olúfẹ́mi O. Táíwò, a philosopher at Georgetown University, argues in one essay.
So what is the path forward? Thunberg and many of her coauthors are generally skeptical that new tech alone will be our savior. Carbon capture and storage, or CCS, for example, has been heralded as one way to curb emissions. But less than a third of the roughly 150 planned CCS projects that were supposed to be operational by 2020 are up and running.
Progress has been impeded by high costs and technological failures, science writer Ketan Joshi explains. An alternative might be “rewilding,” restoring damaged mangrove forests, seagrass meadows and other ecosystems that naturally suck CO2 out of the air (SN: 9/14/22), suggest environmental activists George Monbiot and Rebecca Wrigley.
Fixing the climate problem will require transforming not only our energy and transportation systems, which often get the most attention, but also our economies (endless growth is not sustainable), our political systems and our connections to nature and to each other, the book’s authors argue.
The last fifth of the book lays out how we could meet this daunting challenge. What’s needed is a critical mass of individuals who are willing to make lifestyle changes and be heard. This could trigger a social movement strong enough to force politicians to listen and create systemic and structural change. In other words, it’s time to start acting like we’re in a crisis. Thunberg doesn’t end the book by offering hope. Instead, she argues we each have to make our own hope.
“To me, hope is not something that is given to you, it is something you have to earn, to create,” she writes. “It cannot be gained passively, through standing by and waiting for someone else to do something. Hope is taking action.”