THE WOODLANDS, TEXAS — A young, ultrabright Jupiter may have desiccated its now hellish moon Io. The planet’s bygone brilliance could have also vaporized water on Europa and Ganymede, planetary scientist Carver Bierson reported March 17 at the Lunar and Planetary Science Conference. If true, the findings could help researchers narrow the search for icy exomoons by eliminating unlikely orbits.
Jupiter is among the brightest specks in our night sky. But past studies have indicated that during its infancy, Jupiter was far more luminous. “About 10 thousand times more luminous,” said Bierson, of Arizona State University in Tempe. That radiance would have been inescapable for the giant planet’s moons, the largest of which are volcanic Io, ice-shelled Europa, aurora-cowled Ganymede and crater-laden Callisto (SN: 12/22/22, SN: 4/19/22, SN: 3/12/15). The compositions of these four bodies follow a trend: The farther a moon orbits from Jupiter, the more ice-rich it is.
Bierson and his colleagues hypothesized this pattern was a legacy of Jupiter’s past radiance. The team used computers to simulate how an infant Jupiter may have warmed its moons, starting with Io, the closest of the four. During its first few million years, Io’s surface temperature may have exceeded 26° Celsius under Jupiter’s glow, Bierson said. “That’s Earthlike temperatures.”
Any ice present on Io at that time, roughly 4.5 billion years ago, probably would have melted into an ocean. That water would have progressively evaporated into an atmosphere. And that atmosphere, hardly restrained by the moon’s weak gravity, would have readily escaped into space. In just a few million years, Io could have lost as much water as Ganymede may hold today, which may be more than 25 times the amount in Earth’s oceans.
A coruscant Jupiter probably didn’t remove significant amounts of ice from Europa or Ganymede, the researchers found, unless Jupiter was brighter than simulated or the moons orbited closer than they do today.
The findings suggest that icy exomoons probably don’t orbit all that close to massive planets.
Art historians often wish that Renaissance painters could spill the secrets of their craft. Now, scientists may have cracked one using chemistry and physics.
Around the turn of the 15th century in Italy, oil-based paints replaced egg-based tempera paints as the dominant medium. During this transition, artists including Leonardo da Vinci and Sandro Botticelli also experimented with paints made from oil and egg (SN: 4/30/14). But it has been unclear how adding egg to oil paints may have affected the artwork. “Usually, when we think about art, not everybody thinks about the science which is behind it,” says chemical engineer Ophélie Ranquet of the Karlsruhe Institute of Technology in Germany.
In the lab, Ranquet and colleagues whipped up two oil-egg recipes to compare with plain oil paint. One mixture contained fresh egg yolk mixed into oil paint, and had a similar consistency to mayonnaise. For the other blend, the scientists ground pigment into the yolk, dried it and mixed it with oil — a process the old masters might have used, according to the scant historical records that exist today. Each medium was subjected to a battery of tests that analyzed its mass, moisture, oxidation, heat capacity, drying time and more.
In both concoctions, the yolk’s proteins, phospholipids and antioxidants helped slow paint oxidation, which can cause paint to turn yellow over time, the team reports March 28 in Nature Communications.
In the mayolike blend, the yolk created sturdy links between pigment particles, resulting in stiffer paint. Such consistency would have been ideal for techniques like impasto, a raised, thick style that adds texture to art. Egg additions also could have reduced wrinkling by creating a firmer paint consistency. Wrinkling sometimes happens with oil paints when the top layer dries faster than the paint underneath, and the dried film buckles over looser, still-wet paint.
The hybrid mediums have some less-than-eggs-ellent qualities, though. For instance, the eggy oil paint can take longer to dry. If paints were too yolky, Renaissance artists would have had to wait a long time to add the next layer, Ranquet says.
“The more we understand how artists select and manipulate their materials, the more we can appreciate what they’re doing, the creative process and the final product,” says Ken Sutherland, director of scientific research at the Art Institute of Chicago, who was not involved with the work.
Research on historical art mediums can not only aid art preservation efforts, Sutherland says, but also help people gain a deeper understanding of the artworks themselves.
In a book about his travels in Africa published in 1907, British explorer Arnold Henry Savage Landor recounted witnessing an impromptu meal that his companions relished but that he found unimaginably revolting.
As he coasted down a river in the Congo Basin with several local hunter-gatherers, a dead rodent floated near their canoe. Its decomposing body had bloated to the size of a small pig.
Stench from the swollen corpse left Landor gasping for breath. Unable to speak, he tried to signal his companions to steer the canoe away from the fetid creature. Instead, they hauled the supersize rodent aboard and ate it. “The odour when they dug their knives into it was enough to kill the strongest of men,” Landor wrote. “When I recovered, my admiration for the digestive powers of these people was intense. They were smacking their lips and they said the [rodent] had provided most excellent eating.”
Starting in the 1500s, European and, later, American explorers, traders, missionaries, government officials and others who lived among Indigenous peoples in many parts of the world wrote of similar food practices. Hunter-gatherers and small-scale farmers everywhere commonly ate putrid meat, fish and fatty parts of a wide range of animals. From arctic tundra to tropical rainforests, native populations consumed rotten remains raw, fermented or cooked just enough to singe off fur and create a more chewable texture. Many groups treated maggots as a meaty bonus.
Descriptions of these practices, which still occur in some present-day Indigenous groups and among northern Europeans who occasionally eat fermented fish, aren’t likely to inspire any new Food Network shows or cookbooks from celebrity chefs.
Case in point: Some Indigenous communities feasted on huge decomposing beasts, including hippos that had been trapped in dug-out pits in Africa and beached whales on Australia’s coast. Hunters in those groups typically smeared themselves with the fat of the animal before gorging on greasy innards. After slicing open animals’ midsections, both adults and children climbed into massive, rotting body cavities to remove meat and fat.
Or consider that Native Americans in Missouri in the late 1800s made a prized soup from the greenish, decaying flesh of dead bison. Animal bodies were buried whole in winter and unearthed in spring after ripening enough to achieve peak tastiness.
But such accounts provide a valuable window into a way of life that existed long before Western industrialization and the war against germs went global, says anthropological archaeologist John Speth of the University of Michigan in Ann Arbor. Intriguingly, no reports of botulism and other potentially fatal reactions to microorganisms festering in rotting meat appear in writings about Indigenous groups before the early 1900s. Instead, decayed flesh and fat represented valued and tasty parts of a healthy diet. Many travelers such as Landor considered such eating habits to be “disgusting.” But “a gold mine of ethnohistorical accounts makes it clear that the revulsion Westerners feel toward putrid meat and maggots is not hardwired in our genome but is instead culturally learned,” Speth says.
This dietary revelation also challenges an influential scientific idea that cooking originated among our ancient relatives as a way to make meat more digestible, thus providing a rich calorie source for brain growth in the Homo genus. It’s possible, Speth argues, that Stone Age hominids such as Neandertals first used cooking for certain plants that, when heated, provided an energy-boosting carbohydrate punch to the diet. Animals held packets of fat and protein that, after decay set in, rounded out nutritional needs without needing to be heated.

Putrid foods in the diets of Indigenous peoples

Speth’s curiosity about a human taste for putrid meat was originally piqued by present-day hunter-gatherers in polar regions. North American Inuit, Siberians and other far-north populations still regularly eat fermented or rotten meat and fish.
Fermented fish heads, also known as “stinkhead,” are one popular munchy among northern groups. Chukchi herders in the Russian Far East, for instance, bury whole fish in the ground in early fall and let the bodies naturally ferment during periods of freezing and thawing. Fish heads the consistency of hard ice cream are then unearthed and eaten whole.
Speth has suspected for several decades that consumption of fermented and putrid meat, fish, fat and internal organs has a long and probably ancient history among northern Indigenous groups. Consulting mainly online sources such as Google Scholar and universities’ digital library catalogs, he found many ethnohistorical descriptions of such behavior going back to the 1500s. Putrid walrus, seals, caribou, reindeer, musk oxen, polar bears, moose, arctic hares and ptarmigans had all been fair game. Speth reported much of this evidence in 2017 in PaleoAnthropology.
In one recorded incident from late-1800s Greenland, a well-intentioned hunter brought what he had claimed in advance was excellent food to a team led by American explorer Robert Peary. A stench filled the air as the hunter approached Peary’s vessel carrying a rotting seal dripping with maggots. The Greenlander had found the seal where a local group had buried it, possibly a couple of years earlier, so that the body could reach a state of tasty decomposition. Peary ordered the man to keep the reeking seal off his boat.
Miffed at this unexpected rejection, the hunter “told us that the more decayed the seal the finer the eating, and he could not understand why we should object,” Peary’s wife wrote of the encounter.
Even in temperate and tropical areas, where animal bodies decompose within hours or days, Indigenous peoples have appreciated rot as much as Peary’s seal-delivery man did. Speth and anthropological archaeologist Eugène Morin of Trent University in Peterborough, Canada, described some of those obscure ethnohistorical accounts last October in PaleoAnthropology.

Early hominids may have scavenged rotten meat

These accounts undermine some of scientists’ food-related sacred cows, Speth says. For instance, European explorers and other travelers consistently wrote that traditional groups not only ate putrid meat raw or lightly cooked but suffered no ill aftereffects. A protective gut microbiome may explain why, Speth suspects. Indigenous peoples encountered a variety of microorganisms from infancy on, unlike people today who grow up in sanitized settings. Early exposures to pathogens may have prompted the development of an array of gut microbes and immune responses that protected against potential harms of ingesting putrid meat.
That idea requires further investigation; little is known about the bacterial makeup of rotten meat eaten by traditional groups or of their gut microbiomes. But studies conducted over the last few decades do indicate that putrefaction, the process of decay, offers many of cooking’s nutritional benefits with far less effort. Putrefaction predigests meat and fish, softening the flesh and chemically breaking down proteins and fats so they are more easily absorbed and converted to energy by the body.
Given the ethnohistorical evidence, hominids living 3 million years ago or more could have scavenged meat from decomposing carcasses, even without stone tools for hunting or butchery, and eaten their raw haul safely long before fire was used for cooking, Speth contends. If simple stone tools appeared as early as 3.4 million years ago, as some researchers have controversially suggested, those implements may have been made by hominids seeking raw meat and marrow (SN: 9/11/10, p. 8). Researchers suspect regular use of fire for cooking, light and warmth emerged no earlier than around 400,000 years ago (SN: 5/5/12, p. 18).
“Recognizing that eating rotten meat is possible, even without fire, highlights how easy it would have been to incorporate scavenged food into the diet long before our ancestors learned to hunt or process [meat] with stone tools,” says paleoanthropologist Jessica Thompson of Yale University.
Thompson and colleagues suggested in Current Anthropology in 2019 that before about 2 million years ago, hominids were primarily scavengers who used rocks to smash open animal bones and eat nutritious, fat-rich marrow and brains. That conclusion, stemming from a review of fossil and archaeological evidence, challenged a common assumption that early hominids — whether as hunters or scavengers — primarily ate meat off the bone.
Certainly, ancient hominids were eating more than just the meaty steaks we think of today, says archaeologist Manuel Domínguez-Rodrigo of Rice University in Houston. In East Africa’s Olduvai Gorge, butchered animal bones at sites dating to nearly 2 million years ago indicate that hominids ate most parts of carcasses, including brains and internal organs.
“But Speth’s argument about eating putrid carcasses is very speculative and untestable,” Domínguez-Rodrigo says.
Untangling whether ancient hominids truly had a taste for rot will require research that spans many fields, including microbiology, genetics and food science, Speth says.
But if his contention holds up, it suggests that ancient cooks were not turning out meat dishes. Instead, Speth speculates, cooking’s primary value at first lay in making starchy and oily plants softer, more chewable and easily digestible. Edible plants contain carbohydrates, sugar molecules that can be converted to energy in the body. Heating over a fire converts starch in tubers and other plants to glucose, a vital energy source for the body and brain. Crushing or grinding of plants might have yielded at least some of those energy benefits to hungry hominids who lacked the ability to light fires.
Whether hominids controlled fire well enough to cook plants or any other food regularly before around 400,000 to 300,000 years ago is unknown.

Neandertals may have hunted animals for fat

Despite their nutritional benefits, plants often get viewed as secondary menu items for Stone Age folks. It doesn’t help that plants preserve poorly at archaeological sites.
Neandertals, in particular, have a long-standing reputation as plant shunners. Popular opinion views Neandertals as burly, shaggy individuals who huddled around fires chomping on mammoth steaks.
That’s not far from an influential scientific view of what Neandertals ate. Elevated levels of a diet-related form of nitrogen in Neandertal bones and teeth hint that they were committed carnivores, eating large amounts of protein-rich lean meat, several research teams have concluded over the past 30 years.
But consuming that much protein from meat, especially from cuts above the front and hind limbs now referred to as steaks, would have been a recipe for nutritional disaster, Speth argues. Meat from wild, hoofed animals and smaller creatures such as rabbits contains almost no fat, or marbling, unlike meat from modern domestic animals, he says. Ethnohistorical accounts, especially for northern hunters including the Inuit, include warnings about weight loss, ill health and even death that can result from eating too much lean meat.
This form of malnutrition is known as rabbit starvation. Evidence indicates that people can safely consume between about 25 and 35 percent of daily calories as protein, Speth says. Above that threshold, several investigations have indicated that the liver becomes unable to break down chemical wastes from ingested proteins, which then accumulate in the blood and contribute to rabbit starvation. Limits to the amount of daily protein that can be safely consumed meant that ancient hunting groups, like those today, needed animal fats and carbohydrates from plants to fulfill daily calorie and other nutritional needs.
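That 25 to 35 percent ceiling can be translated into gram terms. A minimal sketch of the arithmetic, using the standard conversion of 4 kilocalories per gram of protein; the 2,500-kilocalorie daily diet and the function name are illustrative assumptions, not figures from the research:

```python
# Illustrative arithmetic only (not from Speth's work): convert the
# "rabbit starvation" ceiling of roughly 25-35 percent of daily calories
# from protein into grams, for an assumed 2,500-kcal daily diet.
KCAL_PER_GRAM_PROTEIN = 4  # standard nutritional conversion factor

def max_safe_protein_grams(daily_kcal, protein_fraction):
    """Grams of protein corresponding to a given share of daily calories."""
    return daily_kcal * protein_fraction / KCAL_PER_GRAM_PROTEIN

low = max_safe_protein_grams(2500, 0.25)   # ~156 g
high = max_safe_protein_grams(2500, 0.35)  # ~219 g
print(f"{low:.0f}-{high:.0f} g of protein per day")
```

On those assumptions, the safe ceiling works out to roughly 156 to 219 grams of protein per day; everything beyond that would have to come from fat or carbohydrates.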
Modern “Paleo diets” emphasize eating lean meats, fruits and vegetables. But that omits what past and present Indigenous peoples most wanted from animal carcasses. Accounts describe Inuit people eating much larger amounts of fatty body parts than lean meat, Speth says. Over the last few centuries, they have favored tongue, fat deposits, brisket, ribs, fatty tissue around intestines and internal organs, and marrow. Internal organs, especially adrenal glands, have provided vitamin C — nearly absent in lean muscle — that prevented anemia and other symptoms of scurvy.
Western explorers noted that the Inuit also ate chyme, the stomach contents of reindeer and other plant-eating animals. Chyme provided at least a side course of plant carbohydrates. Likewise, Neandertals in Ice Age Europe probably subsisted on a fat- and chyme-supplemented diet (SN Online: 10/11/13), Speth contends.
Large numbers of animal bones found at northern European Neandertal sites — often viewed as the residue of ravenous meat eaters — may instead reflect overhunting of animals to obtain enough fat to meet daily calorie needs. Because wild game typically has a small percentage of body fat, northern hunting groups today and over the last few centuries frequently killed prey in large numbers, either discarding most lean meat from carcasses or feeding it to their dogs, ethnographic studies show.
If Neandertals followed that playbook, eating putrid foods might explain why their bones carry a carnivore-like nitrogen signature, Speth suggests. An unpublished study of decomposing human bodies kept at a University of Tennessee research facility in Knoxville called the Body Farm tested that possibility. Biological anthropologist Melanie Beasley, now at Purdue University in West Lafayette, Ind., found moderately elevated tissue nitrogen levels in 10 deceased bodies sampled regularly for about six months. Tissue from those bodies served as a stand-in for animal meat consumed by Neandertals. Human flesh is an imperfect substitute for, say, reindeer or elephant carcasses. But Beasley’s findings suggest that decomposition’s effects on a range of animals need to be studied. Intriguingly, she also found that maggots in the decaying tissue displayed extremely elevated nitrogen levels.
Paleobiologist Kimberly Foecke of George Washington University in Washington, D.C., has also found high nitrogen levels in rotting, maggot-free cuts of beef from animals fed no hormones or antibiotics to approximate the diets of Stone Age creatures (SN: 1/2/19).
Like arctic hunters did a few hundred years ago, Neandertals may have eaten putrid meat and fish studded with maggots, Speth says. That would explain elevated nitrogen levels in Neandertal fossils.
But Neandertal dining habits are poorly understood. Unusually extensive evidence of Neandertal big-game consumption has come from a new analysis of fossil remains at a roughly 125,000-year-old site in northern Germany called Neumark-Nord. There, Neandertals periodically hunted straight-tusked elephants weighing up to 13 metric tons, say archaeologist Sabine Gaudzinski-Windheuser of Johannes Gutenberg University of Mainz in Germany and colleagues.
In a study reported February 1 in Science Advances, her group analyzed patterns of stone-tool incisions on bones of at least 57 elephants from 27 spots near an ancient lake basin where Neandertals lit campfires and constructed shelters (SN: 1/29/22, p. 8). Evidence suggests that Neandertal butchers — much like Inuit hunters — removed fat deposits under the skin and fatty body parts such as the tongue, internal organs, brain and thick layers of fat in the feet. Lean meat from elephants would have been eaten in smaller quantities to avoid rabbit starvation, the researchers argue.
Further research needs to examine whether the Neandertals cooked elephant meat or boiled the bones to extract nutritious grease, Speth says. Mealtime options would have expanded for hominids who could not only consume putrid meat and fat but also heat animal parts over fires, he suspects.
Neandertals who hunted elephants must also have eaten a variety of plants to meet their considerable energy requirements, says Gaudzinski-Windheuser. But so far, only fragments of burned hazelnuts, acorns and blackthorn plums have been found at Neumark-Nord.

Neandertals probably carb-loaded

Better evidence of Neandertals’ plant preferences comes from sites in warm Mediterranean and Middle Eastern settings. At a site in coastal Spain, Neandertals probably ate fruits, nuts and seeds of a variety of plants (SN: 3/27/21, p. 32).
Neandertals in a range of environments must have consumed lots of starchy plants, argues archaeologist Karen Hardy of the University of Glasgow in Scotland. Even Stone Age northern European and Asian regions included plants with starch-rich appendages that grew underground, such as tubers.
Neandertals could also have obtained starchy carbs from the edible, inner bark of many trees and from seaweed along coastlines. Cooking, as suggested by Speth, would have greatly increased the nutritional value of plants, Hardy says. Not so for rotten meat and fat, though Neandertals such as those at Neumark-Nord may have cooked what they gleaned from fresh elephant remains.
There is direct evidence that Neandertals munched on plants. Microscopic remnants of edible and medicinal plants have been found in the tartar on Neandertal teeth (SN: 4/1/17, p. 16), Hardy says.
Carbohydrate-fueled energy helped to maintain large brains, enable strenuous physical activity and ensure healthy pregnancies for both Neandertals and ancient Homo sapiens, Hardy concludes in the January 2022 Journal of Human Evolution. (Researchers disagree over whether Neandertals, which lived from around 400,000 to 40,000 years ago, were a variant of H. sapiens or a separate species.)

Paleo cuisine was tasty

Like Hardy, Speth suspects that plants provided a large share of the energy and nutrients Stone Age folks needed. Plants represented a more predictable, readily available food source than hunted or scavenged meat and fat, he contends.
Plants also offered Neandertals and ancient H. sapiens — whose diets probably didn’t differ dramatically from Neandertals’, Hardy says — a chance to stretch their taste buds and cook up tangy meals.
Paleolithic plant cooking included preplanned steps aimed at adding dashes of specific flavors to basic dishes, a recent investigation suggests. In at least some places, Stone Age people apparently cooked to experience pleasing tastes and not just to fill their stomachs. Charred plant food fragments from Shanidar Cave in Iraqi Kurdistan and Franchthi Cave in Greece consisted of crushed pulse seeds, possibly from starchy pea species, combined with wild plants that would have provided a pungent, somewhat bitter taste, microscopic analyses show.
Added ingredients included wild mustard, wild almonds, wild pistachio and fruits such as hackberry, archaeobotanist Ceren Kabukcu of the University of Liverpool in England and colleagues reported last November in Antiquity.
Four Shanidar food bits date to about 40,000 years ago or more and originated in sediment that included stone tools attributed to H. sapiens. Another food fragment, likely from a cooked Neandertal meal, dates to between 70,000 and 75,000 years ago. Neandertal fossils found in Shanidar Cave are also about 70,000 years old. So it appears that Shanidar Neandertals spiced up cooked plant foods before Shanidar H. sapiens did, Kabukcu says.
Franchthi food remains date to between 13,100 and 11,400 years ago, when H. sapiens lived there. Wild pulses in food from both caves display microscopic signs of having been soaked, a way to dilute poisons in seeds and moderate their bitterness.
These new findings “suggest that cuisine, or the combination of different ingredients for pleasure, has a very long history indeed,” says Hardy, who was not part of Kabukcu’s team.
There’s a hefty dollop of irony in the possibility that original Paleo diets mixed what people in many societies today regard as gross-sounding portions of putrid meat and fat with vegetarian dishes that still seem appealing.
Beer lovers could be left with a sour taste, thanks to the latest in a series of studies mapping the effects of climate change on crops.
Malted barley — a key ingredient in beers including IPAs, stouts and pilsners — is particularly sensitive to warmer temperatures and drought, both of which are likely to increase due to climate change. As a result, average global barley crop yields could drop as much as 17 percent by 2099, compared with the average yield from 1981 to 2010, under the more extreme climate change projections, researchers report October 15 in Nature Plants. That decline “could lead to, on average, a doubling of price in some countries,” says coauthor Steven Davis, an Earth systems scientist at the University of California, Irvine. Consumption would also drop globally by an average of 16 percent, roughly the amount of beer people in the United States drank in 2011.
The results are based on computer simulations projecting climate conditions, plant responses and global market reactions up to the year 2099. Under the mildest climate change predictions, world average barley yields would still go down by at least 3 percent, and average prices would increase about 15 percent, the study says.
Other crops, such as maize, wheat, soy and wine grapes, are also threatened by rising global average temperatures, as well as by pests emboldened by erratic weather (SN: 2/8/14, p. 3). But there’s still hope for ale aficionados. The study did not account for technological innovations or genetic tweaks that could spare the crop, Davis says.
What do you get when you flip a fossilized “jellyfish” upside down? The answer, it turns out, might be an anemone.
Fossil blobs once thought to be ancient jellyfish were actually a type of burrowing sea anemone, scientists propose March 8 in Papers in Palaeontology.
From a certain angle, the fossils’ features include what appears to be a smooth bell shape, perhaps with tentacles hanging beneath — like a jellyfish. And for more than 50 years, that’s what many scientists thought the animals were. But for paleontologist Roy Plotnick, something about the fossils’ supposed identity seemed fishy. “It’s always kind of bothered me,” says Plotnick, of the University of Illinois Chicago. Previous scientists had interpreted one fossil feature as a curtain that hung around the jellies’ tentacles. But that didn’t make much sense, Plotnick says. “No jellyfish has that,” he says. “How would it swim?”
One day, looking over specimens at the Field Museum in Chicago, something in Plotnick’s mind clicked. What if the bell belonged on the bottom, not the top? He turned to a colleague and said, “I think this is an anemone.”
Rotated 180 degrees, Plotnick realized, the fossils’ shape — which looks kind of like an elongated pineapple with a stumpy crown — resembles some modern anemones. “It was one of those aha moments,” he says. The “jellyfish” bell might be the anemone’s lower body. And the purported tentacles? Perhaps the anemone’s upper section, a tough, textured barrel protruding from the seafloor.
Plotnick and his colleagues examined thousands of the fossilized animals, dubbed Essexella asherae, unearthing more clues. Bands running through the fossils match the shape of some modern anemones’ musculature. And some specimens’ pointy protrusions resemble an anemone’s contracted tentacles. “It’s totally possible that these are anemones,” says Estefanía Rodríguez, an anemone expert at the American Museum of Natural History in New York City who was not involved with the work. The shape of the fossils, the comparison with modern-day anemones — it all lines up, she says, though it’s not easy to know for sure.
Paleontologist Thomas Clements agrees. Specimens like Essexella “are some of the most notoriously difficult fossils to identify,” he says. “Jellyfish and anemones are like bags of water. There’s hardly any tissue to them,” meaning there’s little left to fossilize. Still, it’s plausible that the blobs are indeed fossilized anemones, says Clements, of Friedrich-Alexander-Universität Erlangen-Nürnberg in Germany. He was not part of the new study but has spent several field seasons at Mazon Creek, the Illinois site where Essexella lived some 310 million years ago. Back then, the area was near the shoreline, Clements says, with nearby rivers dumping sediment into the environment — just the kind of place ancient burrowing anemones may have once called home.
Weird materials called Weyl metals might reveal the secrets of how Earth gets its magnetic field.
The substances could generate a dynamo effect, the process by which a swirling, electrically conductive material creates a magnetic field, a team of scientists reports in the Oct. 26 Physical Review Letters.
Dynamos are common in the universe, producing the magnetic fields of the Earth, the sun and other stars and galaxies. But scientists still don’t fully understand the details of how dynamos create magnetic fields. And, unfortunately, making a dynamo in the lab is no easy task, requiring researchers to rapidly spin giant tanks of a liquefied metal, such as sodium (SN: 5/18/13, p. 26).

First discovered in 2015, Weyl metals are topological materials, meaning that their behavior is governed by a branch of mathematics called topology, the study of shapes like doughnuts and knots (SN: 8/22/15, p. 11). Electrons in Weyl metals move around in bizarre ways, behaving as if they are massless.
Within these materials, the researchers discovered, electrons are subject to the same set of equations that describes the behavior of liquids known to form dynamos, such as molten iron in the Earth’s outer core. The team’s calculations suggest that, under the right conditions, it should be possible to make a dynamo from solid Weyl metals.
It might be easier to create such dynamos in the lab, as they don’t require large quantities of swirling liquid metals. Instead, the electrons in a small chunk of Weyl metal could flow like a fluid, taking the place of the liquid metal. The result is still theoretical. But if the idea works, scientists may be able to use Weyl metals to reproduce the conditions that exist within the Earth, and better understand how its magnetic field forms.
Neandertals are shaking off their reputation as head bangers.
Our close evolutionary cousins experienced plenty of head injuries, but no more so than late Stone Age humans did, a study suggests. Rates of fractures and other bone damage in a large sample of Neandertal and ancient Homo sapiens skulls roughly match rates previously reported for human foragers and farmers who have lived within the past 10,000 years, concludes a team led by paleoanthropologist Katerina Harvati of the University of Tübingen in Germany. Males suffered the bulk of harmful head knocks, whether they were Neandertals or ancient humans, the scientists report online November 14 in Nature.
“Our results suggest that Neandertal lifestyles were not more dangerous than those of early modern Europeans,” Harvati says.
Until recently, researchers depicted Neandertals, who inhabited Europe and Asia between around 400,000 and 40,000 years ago, as especially prone to head injuries. Serious damage to small numbers of Neandertal skulls fueled a view that these hominids led dangerous lives. Proposed causes of Neandertal noggin wounds have included fighting, attacks by cave bears and other carnivores, and close-range hunting of large prey animals.
Paleoanthropologist Erik Trinkaus of Washington University in St. Louis coauthored an influential 1995 paper arguing that Neandertals incurred an unusually large number of head and upper-body injuries. Trinkaus recanted that conclusion in 2012, though. All sorts of causes, including accidents and fossilization, could have resulted in Neandertal skull damage observed in relatively small fossil samples, he contended (SN: 5/27/17, p. 13). Harvati’s study further undercuts the argument that Neandertals engaged in a lot of violent behavior, Trinkaus says.
Still, the idea that Neandertals frequently got their heads bonked during crude, close-up attacks on prey has persisted, says paleoanthropologist David Frayer of the University of Kansas in Lawrence. The new report highlights the harsh reality that, for Neandertals and ancient humans alike, “head trauma, no matter the level of technological or social complexity, or population density, was common.”
Harvati’s group analyzed data for 114 Neandertal skulls and 90 H. sapiens skulls. All of these fossils were found in Eurasia and date to between around 80,000 and 20,000 years ago. One or more head injuries appeared in nine Neandertals and 12 ancient humans. After statistically accounting for individuals’ sex, age at death, geographic locations and state of bone preservation, the investigators estimated comparable levels of skull damage in the two species. Statistical models run by the team indicate that skull injuries affected an average of 4 percent to 33 percent of Neandertals, and 2 percent to 34 percent of ancient humans.
Estimated prevalence ranges that large likely reflect factors that varied from one locality to another, such as resource availability and hunting conditions, the researchers say.
Neandertals with head wounds included more individuals under age 30 than observed among their human counterparts. Neandertals may have suffered more head injuries early in life, the researchers say. It’s also possible that Neandertals died more often from head injuries than Stone Age humans did.
Researchers have yet to establish whether Neandertals experienced especially high levels of damage to body parts other than the head, writes paleoanthropologist Marta Mirazón Lahr of the University of Cambridge in a commentary in Nature accompanying the new study.
Oceans may be shrinking — Science News, March 10, 1973
The oceans of the world may be gradually shrinking, leaking slowly away into the Earth’s mantle…. Although the oceans are constantly being slowly augmented by water carried up from Earth’s interior by volcanic activity … some process such as sea-floor spreading seems to be letting the water seep away more rapidly than it is replaced.
Update Scientists traced the ocean’s leak to subduction zones, areas where tectonic plates collide and the heavier of the two sinks into the mantle. It’s still unclear how much water has cycled between the deep ocean and mantle through the ages. A 2019 analysis suggests that sea levels have dropped by as much as 130 meters over the last 230 million years, in part due to Pangea’s breakup creating new subduction zones. Meanwhile, molten rock that bubbles up from the mantle as continents drift apart may “rain” water back into the ocean, scientists reported in 2022. But since Earth’s mantle can hold more water as it cools (SN: 6/13/14), the oceans’ mass might shrink by 20 percent every billion years.
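To put that 20-percent-per-billion-years figure in perspective, a rough back-of-envelope calculation (a sketch, assuming a standard estimate of about 1.4 × 10²¹ kilograms for the total mass of Earth’s oceans, a figure not stated in the article) converts it into an annual net loss:

```python
# Back-of-envelope: what does "20 percent per billion years" mean per year?
# Assumption (not from the article): total ocean mass ~1.4e21 kg.
ocean_mass_kg = 1.4e21
fraction_lost_per_gyr = 0.20

# Net mass lost per year, spread evenly over a billion years
loss_kg_per_year = ocean_mass_kg * fraction_lost_per_gyr / 1e9

# One cubic kilometer of water has a mass of ~1e12 kg
loss_km3_per_year = loss_kg_per_year / 1e12

print(f"~{loss_kg_per_year:.1e} kg/yr, or ~{loss_km3_per_year:.2f} km^3/yr")
```

Under those assumptions the net leak works out to roughly a quarter of a cubic kilometer of water per year — tiny on human timescales, which is why the effect only matters over geologic time.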
A new species of hulking ancient herbivore would have overshadowed its relatives.
Fossils found in Poland belong to a new species that roamed during the Late Triassic, a period some 237 million to 201 million years ago, researchers report November 22 in Science. But unlike most of the enormous animals that lived during that period, this new creature isn’t a dinosaur — it’s a dicynodont.
Dicynodonts are a group of ancient four-legged animals that are related to mammals’ ancestors. They’re a diverse group, but the new species is far larger than any other dicynodont found to date. The elephant-sized creature was more than 4.5 meters long and probably weighed about nine tons, the researchers estimate. Related animals didn’t become that big again until the Eocene, 150 million years later. “We think it’s one of the most unexpected fossil discoveries from the Triassic of Europe,” says study coauthor Grzegorz Niedzwiedzki, a paleontologist at Uppsala University in Sweden. “Who would have ever thought that there is a fossil record of such a giant, elephant-sized mammal cousin in this part of the world?” He and his team first described some of the bones in 2008; now they’ve made the new species — Lisowicia bojani — official.
The creature had upright forelimbs like today’s rhinoceroses and hippos, instead of the splayed front limbs seen on other Triassic dicynodonts, which were similar to the forelimbs of present-day lizards. That posture would have helped it support its massive body weight.
A new design for sun-powered desalination technology may lead to longer-lasting devices that produce cleaner water.
The trick boils down to preventing a device’s components from touching the saltwater. Instead, a lid of light-absorbing material rests above a partially filled basin of water, absorbing sunlight and radiating that energy to the liquid below. That evaporates the water to create pure vapor, which can be condensed into freshwater to help meet the demands of a world where billions of people lack safe drinking water (SN: 8/18/18, p. 14). This setup marks an improvement over other sun-powered desalination devices, where sunshine-absorbing materials float atop the saltwater (SN: 8/20/16, p. 22). In those devices, salt and other contaminants left behind during evaporation can degrade the material’s ability to soak up sunlight. Having water in contact with the material also prevents the material from getting hotter than about 100° Celsius or producing steam above that temperature. That limits the technology’s ability to purify the final product; killing pathogenic microbes often requires temperatures of at least 121° C.
In the new device, described online December 11 in Nature Communications, the separation between the light-absorbing lid and the water’s surface helps keep the lid clean and allows it to generate vapor tens of degrees hotter than the water’s boiling point.
The lid comprises three main components: a top layer made of a metal-ceramic composite that absorbs sunshine, a sheet of carbon foam and a bottom layer of aluminum. Heat spreads from the sunlight-absorbing layer to the aluminum, from which thermal energy radiates to the water below. When the water temperature hits about 100° C, vapor is produced. The steam rises up through holes in the aluminum and flows through the lid’s middle carbon layer, heating further along the way, until it is released in a single stream out the side of the lid. There, it can be captured and condensed.
Producing superheated steam in this way, without any gunk buildup, is “a very innovative idea,” says Jia Zhu, a materials scientist at Nanjing University in China not involved in the work. Under a lamp that mimics natural sunlight in the lab, the device evaporated 100 grams of saltwater without any salt collecting on the underside of the lid. Salt crystals formed at the bottom of the basin washed away easily. In experiments in October on a rooftop in Cambridge, Mass., researchers used a curved mirror to focus incoming sunlight onto the light-absorbing layer of the device to produce steam hotter than 146° C.
“When you can access these temperatures, you can use the steam for things like sterilization, for cooking, for cleaning, for industrial processes,” says coauthor Thomas Cooper, a mechanical engineer at York University in Toronto. A device measuring 1 square meter could generate 2.5 liters of freshwater per day in sunny regions such as the southeastern United States, and at least half that in shadier regions such as New England, Cooper estimates.
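A quick energy-balance sketch shows why 2.5 liters per square meter per day is a plausible yield. The numbers below are assumptions for illustration, not from the study: roughly 6 kilowatt-hours of solar energy falls on a square meter per day in a sunny U.S. region, and vaporizing water takes about 2.26 megajoules per kilogram.

```python
# Rough energy balance for a 1 m^2 solar desalination device.
# Assumed inputs (not from the article):
#   ~6 kWh/m^2/day of insolation in a sunny region
#   latent heat of vaporization of water ~2.26e6 J/kg
daily_insolation_j = 6 * 3.6e6      # 6 kWh expressed in joules
latent_heat_j_per_kg = 2.26e6

# Upper bound: every joule of sunlight goes into evaporation
# (ignores sensible heating of the water and all losses)
ideal_yield_kg = daily_insolation_j / latent_heat_j_per_kg

# Efficiency implied by the reported 2.5 L/day (1 L of water ~ 1 kg)
implied_efficiency = 2.5 / ideal_yield_kg

print(f"Ideal yield: ~{ideal_yield_kg:.1f} kg/day")
print(f"Implied solar-to-vapor efficiency: ~{implied_efficiency:.0%}")
```

Under these assumptions, perfect conversion would evaporate a bit under 10 kilograms of water per day, so the reported 2.5 liters corresponds to a solar-to-vapor efficiency in the neighborhood of 25 percent — a reasonable figure once reflection, radiative losses and the superheated steam's extra sensible heat are accounted for.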
This sun-powered technology could also provide an ecofriendly alternative to reverse osmosis, a water purification process that involves pushing seawater through salt-filtering membranes (SN: 9/15/18, p. 10). Reverse osmosis, which runs on electricity, “is an energy-hungry technology,” says Qiaoqiang Gan, an engineer at the University at Buffalo in New York not involved in the work. “For resource-limited areas, remote areas or people who live on small islands, this [new device] might be a very good option for them to address their freshwater needs.” But researchers still need to investigate how affordable a commercial version of this device would be, Gan says.