Friday, 31 October 2025

Ancient DNA reveals the deadly diseases behind Napoleon’s defeat

Scientists from the Institut Pasteur have conducted a genetic analysis of the remains of soldiers who retreated from Russia in 1812. Their work uncovered traces of two pathogens -- the agents of paratyphoid fever and relapsing fever -- whose effects match the symptoms described in eyewitness records from that time. The findings were first shared as a preprint on bioRxiv on July 16, 2025, and later published in the journal Current Biology on October 24.

Investigating the Mystery of the 1812 Retreat

Napoleon's invasion of Russia in 1812, known as the "Patriotic War of 1812," ended in one of history's most disastrous retreats. To better understand what role disease may have played in this collapse, researchers from the Institut Pasteur's Microbial Paleogenomics Unit partnered with the Laboratory of Biocultural Anthropology at Aix-Marseille University. The team analyzed the DNA of 13 French soldiers exhumed in 2002 from a burial site in Vilnius, Lithuania, uncovered during archaeological excavations led by the Aix-Marseille University group. Using next-generation sequencing technology on ancient DNA, they searched for genetic traces of infectious organisms.
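To give a feel for how such screening works, here is a minimal sketch of the general idea: matching short sequencing reads against a pathogen reference genome by shared k-mers. This is not the Institut Pasteur team's pipeline -- the sequences, k-mer length and hit threshold are invented for illustration, and real ancient-DNA workflows additionally authenticate reads by their characteristic damage patterns.

```python
# Toy metagenomic screen: flag reads that share enough k-mers with a
# pathogen reference genome. All sequences below are made up.
def kmers(seq, k=15):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def screen_reads(reads, reference, k=15, min_hits=3):
    """Return reads sharing at least `min_hits` k-mers with the reference."""
    ref_kmers = kmers(reference, k)
    return [r for r in reads if len(kmers(r, k) & ref_kmers) >= min_hits]

reference = "ATGGCTAGCTAGGATCCGATCGATCGGATCCATGCATGCAGT" * 3  # fake genome fragment
reads = [
    reference[10:60],                              # simulated pathogen-derived read
    "TTTTGGGGCCCCAAAATTTTGGGGCCCCAAAATTTTGGGGCC",  # unrelated read
]
print(screen_reads(reads, reference))  # only the first read is flagged
```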

The researchers detected two distinct disease agents: Salmonella enterica subsp. enterica (serovar Paratyphi C), which causes paratyphoid fever, and Borrelia recurrentis, the bacterium responsible for relapsing fever. The latter is transmitted by lice and produces alternating periods of fever and recovery. Although different, both infections can cause severe fever, exhaustion, and digestive distress. Their combined impact could have intensified the soldiers' suffering at a time when cold, hunger, and poor sanitation were already taking a heavy toll.

Genetic Evidence From Napoleonic Soldiers

Out of the 13 soldiers examined, DNA from S. enterica Paratyphi C was found in four individuals, and B. recurrentis was detected in two. This marks the first direct genetic confirmation that these pathogens were present in Napoleon's army. Their exact contribution to the enormous death toll remains uncertain, but the findings complement earlier research that identified Rickettsia prowazekii (the cause of typhus) and Bartonella quintana (responsible for trench fever), both long suspected of spreading through the ranks during the retreat.

Because only a small number of samples could be analyzed compared to the thousands of remains in Vilnius, researchers cannot yet determine how widespread these infections were. The tested soldiers represent a tiny fraction -- 13 out of more than 3,000 bodies at the site and roughly 500,000 to 600,000 troops who took part in the campaign, of whom about 300,000 died during the retreat.

Understanding the Past to Protect the Future

"Accessing the genomic data of the pathogens that circulated in historical populations helps us to understand how infectious diseases evolved, spread and disappeared over time, and to identify the social or environmental contexts that played a part in these developments. This information provides us with valuable insights to better understand and tackle infectious diseases today," explains Nicolás Rascovan, Head of the Microbial Paleogenomics Unit at the Institut Pasteur and last author of the study.

Source: ScienceDaily

Thursday, 30 October 2025

Hippos once roamed frozen Germany with mammoths

 Hippos, now found only in sub-Saharan Africa, managed to survive in central Europe far longer than anyone previously believed. A new analysis of ancient bones shows that hippos lived in the Upper Rhine Graben between about 47,000 and 31,000 years ago, during the depths of the last ice age. The findings come from an international research team led by the University of Potsdam and the Reiss-Engelhorn-Museen Mannheim in collaboration with the Curt-Engelhorn-Zentrum Archäometrie, and were recently published in Current Biology.

Extinction Timeline Rewritten

Until recently, scientists thought the common hippopotamus (Hippopotamus amphibius) disappeared from central Europe roughly 115,000 years ago, when the last interglacial period ended. However, the new study -- conducted by researchers from the University of Potsdam, the Reiss-Engelhorn-Museen Mannheim, the Curt-Engelhorn-Zentrum Archäometrie Mannheim, ETH Zurich, and several international partners -- reveals that hippos actually persisted in the Upper Rhine Graben of southwestern Germany tens of thousands of years later, well into the middle of the last ice age.

The Upper Rhine Graben serves as a vital record of ancient climate conditions. Animal bones buried for millennia in layers of gravel and sand offer rare windows into the past. "It's amazing how well the bones have been preserved. For many of the skeletal remains it was possible to take samples suitable for analysis -- that is not a given after such a long time," said Dr. Ronny Friedrich, a specialist in age determination at the Curt-Engelhorn-Zentrum Archäometrie.

Genetic and Radiocarbon Clues

Researchers examined numerous hippopotamus fossils using both genetic and radiocarbon dating methods. Ancient DNA sequencing revealed that these Ice Age hippos were closely related to modern African populations and were part of the same species. Radiocarbon dating confirmed their presence during a warmer phase of the middle Weichselian glaciation, when conditions temporarily allowed the animals to survive in central Europe.
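The arithmetic behind a raw radiocarbon date fits in a few lines. The sketch below uses the conventional Libby mean-life of 8,033 years; this is generic textbook chemistry with a hypothetical input, not the calibrated dating workflow the team applied.

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; conventional radiocarbon ages use this constant

def radiocarbon_age(fraction_modern):
    """Conventional (uncalibrated) radiocarbon age in years BP."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A bone retaining ~1.5% of its original carbon-14 (illustrative value only)
# dates to within the window reported for the Rhine hippos.
print(round(radiocarbon_age(0.015)))  # ~33,700 years BP
```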

Further genome-wide analysis showed that the European hippo population had extremely low genetic diversity, suggesting it was both small and geographically isolated. Fossil evidence also revealed that these warm-adapted hippos lived alongside cold-climate animals such as mammoths and woolly rhinoceroses -- an unusual ecological mix that highlights the complexity of Ice Age environments.

Rethinking Europe's Ice Age Ecosystem

"The results demonstrate that hippos did not vanish from middle Europe at the end of the last interglacial, as previously assumed," summarizes first author Dr. Patrick Arnold. "Therefore, we should re-analyze other continental European hippo fossils traditionally attributed to the last interglacial period."

Prof. Dr. Wilfried Rosendahl, general director of the Reiss-Engelhorn-Museen Mannheim and project leader of "Eiszeitfenster Oberrheingraben," is convinced that ice age research still holds many exciting questions: "The current study provides important new insights which impressively prove that the ice age was not the same everywhere; rather, local peculiarities taken together form a complex overall picture -- similar to a puzzle. It would now be interesting and important to further examine other heat-loving animal species attributed so far to the last interglacial."

The research was carried out as part of the "Eiszeitfenster Oberrheingraben" project, supported by the Klaus Tschira Stiftung Heidelberg. This interdisciplinary effort aims to shed light on climate and environmental evolution in the Upper Rhine Graben and southwestern Germany over the past 400,000 years. The study focused on Ice Age bones from the Reis collection, housed at the Reiss-Engelhorn-Museen, which continue to reveal remarkable insights into Europe's dynamic prehistoric world.

Source: ScienceDaily

Wednesday, 29 October 2025

Exercise and omega-3s could be the secret to healthier teeth

 New research published in Scientific Reports has found that regular exercise paired with omega-3 supplementation can significantly enhance immune function and reduce the severity of chronic apical periodontitis, a type of inflammation that affects the tip of the tooth root.

Understanding Apical Periodontitis

Apical periodontitis occurs when bacteria from untreated tooth decay spread through the root canal to the apex of the tooth (the root tip), triggering inflammation in the surrounding bone. This infection can gradually destroy bone tissue in the area if left untreated.

The new study is the first to show that moderate exercise combined with omega-3 supplementation can substantially improve this inflammatory condition. Together, these two factors helped control bacterial growth, minimize bone loss, balance the production of inflammatory molecules called cytokines, and stimulate fibroblasts, the cells that repair and maintain tissues.

The Connection Between Oral and Overall Health

Untreated apical periodontitis can lead to tooth loss, but its effects extend beyond the mouth. The condition is closely linked to systemic diseases such as diabetes, metabolic syndrome, arteriosclerosis, and kidney disease. Each can worsen the other, creating a harmful feedback loop between oral inflammation and general health.

"It's a condition that patients may not even know they have because of its chronic nature, but which can evolve and lead to bone destruction and tooth mobility. In addition, in specific situations, such as a drop in immunity, it can become acute, so the patient starts to feel pain, pus forms at the site, the face can become swollen," explains Rogério de Castilho, a professor at the Araçatuba School of Dentistry at São Paulo State University (FOA-UNESP) in Brazil. Castilho supervised the study and is supported by FAPESP.

Exercise and Supplements Show Measurable Impact

"In rats, physical exercise alone brought about a systemic improvement, regulating the local immune response. In addition, when combined with supplementation, it further reduced the destructive condition caused by endodontic pathology," explains Ana Paula Fernandes Ribeiro, the first author of the study, carried out during her doctorate at FOA-UNESP.

To explore these effects, researchers induced apical periodontitis in 30 rats and divided them into three groups. One group received no treatment, the second completed a 30-day swimming routine, and the third both swam and received omega-3 supplements -- fatty acids known for reducing inflammation in chronic diseases.

The swimming-only group showed improved outcomes compared to the untreated animals, but the group that both exercised and took omega-3 supplements demonstrated the greatest improvement in immune regulation and infection control.

Lower Inflammation, Stronger Bone

Detailed immune testing showed that rats receiving both interventions had the lowest levels of the inflammatory cytokines interleukin 17 (IL-17) and tumor necrosis factor alpha (TNF-α). Those that exercised without supplementation also had reduced levels compared to untreated rats, but the combination proved most effective.

Researchers also observed fewer osteoclasts -- cells that break down bone -- in the exercise and supplement groups, indicating less bone loss. Micro CT scans confirmed these findings: animals that swam had less loss of alveolar bone (the bone that supports teeth) than the control group, and the omega-3 group showed the greatest bone preservation overall.

Implications for Human Health

According to the authors, these results add to growing evidence that exercise and omega-3 fatty acids benefit not only systemic immunity but also oral health.

"To know if the same would be true for humans, we'd need a clinical study with a significant number of patients. However, in addition to the many proven benefits of physical exercise and omega-3 consumption, this is yet another important piece of evidence," Jacinto says.

The work was supported by FAPESP through Scientific Initiation grants awarded to Michely de Lima Rodrigues (20/13089-3 and 22/04884-0), another co-author of the study.

Source: ScienceDaily

Tuesday, 28 October 2025

Dinosaurs were thriving when the asteroid struck

 For much of the past century, scientists thought dinosaurs were already in decline long before the asteroid impact that ended their reign 66 million years ago. However, a new study published in Science by researchers from Baylor University, New Mexico State University, The Smithsonian Institution, and several international partners challenges that long-standing belief.

The findings reveal that dinosaurs were not fading away at all -- they were thriving.

A final flourish in the San Juan Basin

In northwestern New Mexico, layers of ancient rock hold clues to a lively, previously overlooked chapter of Earth's history. Within the Naashoibito Member of the Kirtland Formation, scientists found evidence of rich dinosaur ecosystems that continued to flourish until just before the asteroid struck.

High-precision dating determined that fossils from these rocks are between 66.4 and 66 million years old, placing them right at the boundary between the Cretaceous and Paleogene periods, when the global extinction event occurred.

"The Naashoibito dinosaurs lived at the same time as the famous Hell Creek species in Montana and the Dakotas," said Daniel Peppe, Ph.D., associate professor of geosciences at Baylor University. "They were not in decline -- these were vibrant, diverse communities."

Dinosaurs in their prime

The fossil evidence from New Mexico tells a strikingly different story from what many had assumed. Instead of dwindling, dinosaurs across North America were thriving in distinct regional communities. By analyzing ecological and geographic patterns, researchers found that dinosaur populations in western North America were divided into separate "bioprovinces" shaped primarily by regional temperature differences rather than by mountains or rivers.

"What our new research shows is that dinosaurs are not on their way out going into the mass extinction," said first author Andrew Flynn, Ph.D. '20, assistant professor of geological sciences at New Mexico State University. "They're doing great, they're thriving and that the asteroid impact seems to knock them out. This counters a long-held idea that there was this long-term decline in dinosaur diversity leading up to the mass extinction making them more prone to extinction."

Life after impact

The asteroid impact brought the age of dinosaurs to an abrupt end, but the ecosystems they left behind became the foundation for a new evolutionary chapter. Within just 300,000 years, mammals began rapidly diversifying, developing new diets, sizes, and ecological roles.

The same temperature-related patterns that once defined dinosaur ecosystems continued into the Paleocene epoch, guiding how life recovered after the disaster.

"The surviving mammals still retain the same north and south bio provinces," Flynn said. "Mammals in the north and the south are very different from each other, which is different than other mass extinctions where it seems to be much more uniform."

Why this discovery matters

This discovery offers more than just a look into the distant past. It underscores both the resilience and fragility of life on Earth. Conducted on public lands managed by the U.S. Bureau of Land Management, the research highlights how protected landscapes can unlock vital insights into how ecosystems respond to global upheaval.

By refining the timeline of the dinosaurs' final days, the study reveals that their extinction was not a slow decline but an abrupt, catastrophic end to a flourishing era of life -- cut short by chance from beyond the sky.

About the authors

In addition to Peppe and Flynn, the research team included scientists from Baylor University, New Mexico State University, the Smithsonian Institution, the University of Edinburgh, University College London and multiple U.S. and international institutions.

  • Stephen L. Brusatte, Ph.D., The University of Edinburgh
  • Alfio Alessandro Chiarenza, Ph.D., Royal Society Newton International Fellow, University College London
  • Jorge Garcia-Giron, Ph.D., University of Leon
  • Adam J. Davis, Ph.D., WSP USA Inc.
  • C. Will Fenley, Ph.D., Valle Exploration
  • Caitlin E. Leslie, Ph.D., ExxonMobil
  • Ross Secord, Ph.D., University of Nebraska-Lincoln
  • Sarah Shelley, Ph.D., Carnegie Museum of Natural History
  • Anne Weil, Ph.D., Oklahoma State University
  • Matthew T. Heizler, Ph.D., New Mexico Institute of Mining and Technology
  • Thomas E. Williamson, Ph.D., New Mexico Museum of Natural History and Science.

Monday, 27 October 2025

MIT physicists just found a way to see inside atoms

 Physicists at MIT have introduced a technique to study the interior of an atom's nucleus by relying on the atom's own electrons as "messengers" inside a molecule.

In research published on October 23 in Science, the team precisely measured the energy of electrons orbiting a radium atom that was chemically bound to a fluoride atom, forming radium monofluoride. By using the molecular environment as a microscopic stand-in for a particle collider, they confined the radium atom's electrons and increased the likelihood that some would briefly pass through the nucleus.

Traditional experiments that investigate nuclear interiors depend on kilometer-scale accelerators that speed up electron beams to smash into and fragment nuclei. The new molecule-centered approach provides a compact, table-top way to directly probe the inside of a nucleus.

Table-Top Method Detects Nuclear "Messages"

Working with radium monofluoride, the researchers tracked the energies of the radium atom's electrons as they moved within the molecule. They observed a small shift in energy and concluded that some electrons must have briefly entered the nucleus and interacted with what lies inside. As those electrons left, they retained the energy change, effectively carrying a nuclear "message" that reveals features of the nucleus's interior.

The method opens a path to measuring the nuclear "magnetic distribution." Inside a nucleus, each proton and neutron behaves like a tiny magnet, and their orientations depend on how these particles are arranged. The team plans to use the technique to map this property in radium for the first time, a step that could inform one of cosmology's central puzzles: why the universe contains far more matter than antimatter.

"Our results lay the groundwork for subsequent studies aiming to measure violations of fundamental symmetries at the nuclear level," says study co-author Ronald Fernando Garcia Ruiz, who is the Thomas A. Franck Associate Professor of Physics at MIT. "This could provide answers to some of the most pressing questions in modern physics."

MIT co-authors include Shane Wilkins, Silviu-Marian Udrescu, and Alex Brinson, together with collaborators from several institutions, including the Collinear Resonance Ionization Spectroscopy Experiment (CRIS) at CERN in Switzerland, where the experiments took place.

Matter-Antimatter Imbalance and Radium's Role

According to current understanding, the early universe should have contained nearly equal amounts of matter and antimatter. Yet nearly everything we can detect today is matter built from protons and neutrons inside atomic nuclei.

This observation conflicts with expectations from the Standard Model, suggesting that additional sources of fundamental symmetry violation are needed to account for the scarcity of antimatter. Such effects could appear within the nuclei of certain atoms, including radium.

Unlike most nuclei, which are close to spherical, radium's nucleus has an asymmetric, pear-like shape. Theorists predict that this geometry can amplify signals of symmetry violation enough to make them potentially observable.

"The radium nucleus is predicted to be an amplifier of this symmetry breaking, because its nucleus is asymmetric in charge and mass, which is quite unusual," says Garcia Ruiz, whose group has focused on developing methods to probe radium nuclei for signs of fundamental symmetry violation.

Building Ultra-Sensitive Molecular Experiments

Peering inside a radium nucleus to test fundamental symmetries is extremely challenging.

"Radium is naturally radioactive, with a short lifetime and we can currently only produce radium monofluoride molecules in tiny quantities," says study lead author Shane Wilkins, a former postdoc at MIT. "We therefore need incredibly sensitive techniques to be able measure them."

The team recognized that embedding a radium atom in a molecule could confine and magnify the behavior of its electrons.

"When you put this radioactive atom inside of a molecule, the internal electric field that its electrons experience is orders of magnitude larger compared to the fields we can produce and apply in a lab," explains Silviu-Marian Udrescu PhD '24, a study co-author. "In a way, the molecule acts like a giant particle collider and gives us a better chance to probe the radium's nucleus."

Energy Shift Reveals Electron-Nucleus Encounters

The researchers created radium monofluoride by pairing radium atoms with fluoride atoms. In this molecule, the radium electrons are effectively squeezed, which increases the chance that they will interact with and briefly enter the radium nucleus.

Source: ScienceDaily

Sunday, 26 October 2025

Life expectancy gains have slowed sharply

 A new international analysis led by a University of Wisconsin-Madison professor reveals that the remarkable gains in life expectancy seen across wealthy nations during the early 20th century have slowed dramatically. The findings indicate that no generation born after 1939 is expected to reach an average age of 100.

Researchers Track a Century of Longevity Data

The study, published in the Proceedings of the National Academy of Sciences, was conducted by Héctor Pifarré i Arolas of the La Follette School of Public Affairs, José Andrade of the Max Planck Institute for Demographic Research, and Carlo Giovanni Camarda of the Institut national d'études démographiques. Drawing from the Human Mortality Database, the researchers examined data from 23 high-income, low-mortality countries using six independent methods to forecast mortality trends.

According to Pifarré i Arolas, "The unprecedented increase in life expectancy we achieved in the first half of the 20th century appears to be a phenomenon we are unlikely to achieve again in the foreseeable future. In the absence of any major breakthroughs that significantly extend human life, life expectancy would still not match the rapid increases seen in the early 20th century even if adult survival improved twice as fast as we predict."

A Century of Uneven Gains

Between 1900 and 1938, life expectancy in wealthy nations rose by roughly five and a half months per generation. Someone born in 1900 could expect to live an average of 62 years, while a person born in 1938 could expect to reach about 80 years -- a dramatic improvement over just a few decades.
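That per-generation figure follows directly from the quoted endpoints, reading "generation" here as a one-year birth cohort (our interpretation of the study's framing):

```python
# Consistency check on the quoted life-expectancy figures.
e_1900, e_1938 = 62, 80   # life expectancy at birth, in years
cohorts = 1938 - 1900     # one-year birth cohorts spanned
gain_months = (e_1938 - e_1900) * 12 / cohorts
print(f"{gain_months:.1f} months per cohort")  # ~5.7, i.e. "roughly five and a half"
```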

For generations born between 1939 and 2000, however, progress slowed to around two and a half to three and a half months per generation, depending on the statistical model used. Mortality forecasting models -- analytical tools that predict future lifespans using past and present mortality data -- allowed the researchers to project multiple possible futures for human longevity.

"We forecast that those born in 1980 will not live to be 100 on average, and none of the cohorts in our study will reach this milestone. This decline is largely due to the fact that past surges in longevity were driven by remarkable improvements in survival at very young ages," according to corresponding author Andrade.

In the early 20th century, rapid declines in infant mortality -- brought about by medical innovation, improved sanitation, and higher living standards -- significantly boosted average life expectancy. Today, infant and child mortality rates in high-income countries are already extremely low, meaning future gains must come from improved survival at older ages. The study concludes that such advances are unlikely to match the explosive pace of progress achieved a century ago.

Implications for Policy, Healthcare, and Planning

Although forecasts can never be fully certain, the authors emphasize that their results provide essential insights for policymakers preparing for the future. Unexpected developments such as new pandemics, medical breakthroughs, or major societal shifts could alter these trends, but current evidence suggests a long-term slowdown.

This slowdown has consequences that go beyond national statistics. While the study focuses on populations rather than individuals, slower life expectancy growth may influence how people approach saving, retirement, and long-term care. As Pifarré i Arolas and his colleagues suggest, both governments and individuals may need to adjust their expectations and plans for the decades ahead.

Source: ScienceDaily

Saturday, 25 October 2025

Scientists reveal Alaska could get up to two minutes’ warning before the next big quake

 For a wide variety of earthquake scenarios in Alaska, an earthquake early warning (EEW) system could provide at least 10 seconds of warning time for hazardous shaking, according to a new report.

Increasing the density and improving the spacing of seismic stations around the state could add 5 to 15 seconds to these estimated warning times, write Alexander Fozkos and Michael West at the University of Alaska Fairbanks. Alaska experiences tens of thousands of earthquakes each year, and has been the site of some of the world's largest and most destructive seismic events.

The study's findings, published in the Bulletin of the Seismological Society of America, could help lay the groundwork for the expansion of the U.S. ShakeAlert earthquake early warning system, which now covers California, Oregon and Washington State.

"There were a lot of studies before EEW was widely available on the West Coast, where people were looking at different scenarios," said Fozkos. "So we wanted a similar kind of science up here with numbers that are Alaska specific."

For earthquakes along well-known faults in southcentral and southeast coastal Alaska, Fozkos and West estimated potential warning times of 10 to 120 seconds for magnitude 8.3 scenarios.

For magnitude 7.3 earthquake scenarios in crustal faults in interior and southcentral Alaska, the researchers estimated potential warning times ranging from 0 to 44 seconds.

And for a set of magnitude 7.8 earthquake scenarios along the dip of the subducting slab beneath Alaska, estimated warning times ranged from 0 to 73 seconds.

"I was expecting decent warning times along the coast and for most of the subduction zone events," said Fozkos, because there is dense seismic station coverage in these areas. "I was not expecting decent warning times for the shallow crustal events, so that was the biggest surprise to me."

The scenarios used in the study vary in earthquake magnitude, depth, location and fault style -- all of which impacted warning times. The researchers' models estimated how many seconds after an earthquake's origin the quake could be detected, how many seconds after origin time an alert could be available, and minimum and maximum warning times at a location.

Warning time was defined as the difference between when the alert is issued and when peak ground motion from an earthquake arrives at a location. This definition differs from a more common one used in EEW systems, which ties warning time to the arrival of the earthquake's initial S-wave, or shear wave.

The researchers wanted to use peak ground motion instead, to create a warning time measurement that might be more relevant to people as they respond to an earthquake. The initial S-wave may not always cause significant ground motion, and strong shaking can arrive tens of seconds after the initial S-wave in large earthquakes, they explain.
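As a rough illustration of why warning times grow with distance, here is a back-of-the-envelope sketch using a generic crustal wave speed and a fixed alert latency. These are textbook assumptions rather than the paper's models, and peak shaking is crudely approximated by the S-wave arrival -- precisely the simplification the authors' peak-ground-motion definition avoids.

```python
# Back-of-the-envelope earthquake early warning estimate. The wave speed
# and alert latency below are generic assumptions, not values from the study.
VS = 3.5            # assumed crustal S-wave speed, km/s
ALERT_DELAY = 10.0  # assumed seconds from origin until an alert is issued

def warning_time(distance_km):
    s_arrival = distance_km / VS  # seconds until strong shaking (S-wave)
    return max(0.0, s_arrival - ALERT_DELAY)

for d in (50, 100, 200, 400):
    print(f"{d:4d} km: ~{warning_time(d):5.1f} s of warning")
# Nearby sites get little or no warning; distant sites get tens of seconds,
# consistent with the 0-120 second ranges reported above.
```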

The study doesn't analyze "the time it takes to disseminate the alert -- the time it actually takes to send the alert from a radio tower or from a satellite to somebody's phone and then for them to take out their phone and react to it," Fozkos noted.

The potential lag time in transmitting data and sharing an alert with the public "could be a big challenge for Alaska, but I don't think it's going to be insurmountable," he added.

The harsh Alaskan winters and wilderness locations of some seismic stations could also be challenging for an early warning system, if stations go down and can't be repaired quickly. "I think there is definitely a need for adding stations to cover redundancy for remote stations," Fozkos said.

Ocean-bottom seismometers (OBS) and more earthquake detection via distributed acoustic sensing or DAS would also be welcome additions to a warning system, he added. "Coupled with the fact that some of our biggest earthquakes are going to be offshore, tsunamigenic threats, I think OBS and DAS are probably big targets for the future."

Source: ScienceDaily

Friday, 24 October 2025

Could this new earthquake system give Alaska 50 seconds to prepare?

 A proposed earthquake early warning system could have provided several communities an alert of 10 seconds or more ahead of strong shaking from the magnitude 7.3 quake that occurred south of Sand Point near the tip of the Alaska Peninsula in mid-July.

That analysis is provided by Alex Fozkos of the Alaska Earthquake Center's systems team at the University of Alaska Fairbanks Geophysical Institute.

"Individuals in Sand Point could have expected approximately 10 seconds of warning time before shaking increased to its strongest," Fozkos said. "In King Cove, individuals could have expected a warning of approximately 20 seconds."

Sand Point is 55 miles from the epicenter; King Cove is slightly farther away. The community of Chignik, about 140 miles from the epicenter, would have received about 50 seconds of warning.

Fozkos' Sand Point analysis is based on a hypothetical early warning system. The Alaska Earthquake Center and the U.S. Geological Survey earlier this year described the first phase of implementing the proposed USGS ShakeAlert warning system, which operates in California, Oregon and Washington.

The Sand Point analysis was enabled by earthquake early warning modeling by Fozkos. That modeling system was published on August 5 in the Bulletin of the Seismological Society of America. The Sand Point analysis is not part of the research paper, however, as it occurred after the paper was submitted.

Research professor Michael West, the Alaska Earthquake Center's director and state seismologist, is a co-author.

For the research paper, Fozkos ran numerous warning time scenarios in several categories with varying inputs such as locations, magnitudes and fault configurations.

"This lays the groundwork for showing potential stakeholders how an early warning system could benefit Alaskans and why they should be paying attention," said Fozkos, who conducted the research at the UAF Geophysical Institute while a graduate student.

Fozkos and West define warning time as the time difference between when a person receives an alert and the arrival of peak ground motion.

In the research paper's scenario for the Southcentral and Southeast coasts, Fozkos simulated a magnitude 8.3 earthquake that created shaking intensities of 7 to 8. Shaking at those levels can cause moderate to heavy damage to buildings and would be widely felt.

Magnitude and intensity don't always correspond. The earthquake magnitude scale measures the energy released at the quake's source, while the shaking intensity scale describes the strength of ground shaking at specific locations.
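The energy side of that distinction can be made concrete with the standard relation log10 E = 1.5 M + const; this is textbook seismology, not a calculation from the paper.

```python
# Radiated energy scales as 10^(1.5 * M), so the constant offset cancels
# when comparing two magnitudes (standard Gutenberg-Richter relation).
def energy_ratio(m_big, m_small):
    return 10 ** (1.5 * (m_big - m_small))

print(f"M8.3 vs M7.3: ~{energy_ratio(8.3, 7.3):.0f}x the energy")  # ~32x
print(f"M8.3 vs M7.8: ~{energy_ratio(8.3, 7.8):.1f}x the energy")  # ~5.6x
```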

Alerts for Southcentral and Southeast residents in that scenario could be issued 10 to 33 seconds after a quake occurs, with an average of 24 seconds, Fozkos writes in the research paper. Alerts would be issued quickest in the Southcentral region, where sensor density is highest.

"Having more stations in an area means an earthquake can be detected faster and a warning can be issued faster," Fozkos said.

An early warning system uses a network of seismic sensors to detect an earthquake's fast-moving primary, or P, waves as soon as they begin. It then calculates the earthquake's location and magnitude to send alerts before the slower, more-damaging secondary, or S, waves arrive.

Alaska's initial ShakeAlert phase, if funded, would focus on the Anchorage, Fairbanks, Kodiak and Prince William Sound regions, which include about 90% of the state's population.

The federal-state system would consist of 450 real-time Advanced National Seismic System stations. Of those, 20 exist in the state, 270 would be new stations and 160 would be upgraded existing stations.

Fozkos' research provides essential information to show how Alaskans could benefit from an earthquake early warning system. His work assumes a generic warning system, but his modeled outcomes are assumed to be comparable with ShakeAlert.

West said the science and support to establish early warning in Alaska has been in the works for several years and that the research paper's goal is to make it "feel real and accessible."

"Alaska has so many types of earthquakes that it can be difficult to explain to people what is possible," he said. "This study takes complicated algorithms and technologies and shows what might happen in real world situations."

Fozkos said Alaska's tectonic environment is vastly different from that of the West Coast states.

"We have crustal earthquakes, we've got the deep earthquakes in the slab, we've got the interface earthquakes, there's strike slip, there's normal faulting."

"If we're going to advocate for an earthquake early warning system, then we owe Alaskans the numbers that are directly tied to Alaska and not to California, Oregon and Washington," he said.

Source: ScienceDaily

Thursday, 23 October 2025

The surprising reason timber plantations explode into megafires

 The odds of high-severity wildfire were nearly one-and-a-half times higher on industrial private land than on publicly owned forests, a new study found. Forests managed by timber companies were more likely to exhibit the conditions that megafires love -- dense stands of regularly spaced trees with continuous vegetation connecting the understory to the canopy.

The research, led by the University of Utah, University of California, Berkeley, and the United States Forest Service, is the first to identify how extreme weather conditions and forest management practices jointly impact fire severity.

Leveraging a unique lidar dataset, the authors created three-dimensional maps of public and private forests before five wildfires burned 1.1 million acres in the northern Sierra Nevada, California.

In periods of extreme weather, stem density -- the number of trees per acre -- became the most important predictor of a high-severity fire. Even in the face of accelerating climate change, how we manage the land will make a difference.

"That's a really hopeful finding because it means that we can adjust how we manage these landscapes to impact the way fires move through them," said Jacob Levine, postdoctoral researcher at the U and lead author of the study. "Strategies that reduce density by thinning out both small and mature trees will make forests more robust and resilient to fire in the future."

In a 2022 study, Levine and collaborators found that fire severity was typically higher on privately managed forests. They also discovered the risks extended to areas near to, but not owned by, private industry, threatening the wilderness, small landowners and urban areas in their shadow.

This new study is the first to identify the underlying forest structures that make high-severity fires more likely in some areas than in others.

The study was published on Aug. 20, 2025, in the journal Global Change Biology.

Lidar unlocks forest structure secrets

Plumas National Forest, the study area in California's northern Sierra Nevada, is emblematic of the wider trend of wildfire occurrence and severity. The region's mixed conifer forests are adapted to periodic, low- to medium-severity fires that cleared vegetation, creating large spaces between clumps of trees. Efforts to increase timber resources led the U.S. government to implement fire suppression policies in the 1800s, including a ban on the controlled burns that Indigenous peoples had practiced for millennia. In the absence of natural fire cycles and Indigenous burning, modern forests have more fodder to fuel high-severity fires, defined as fires that kill more than 95% of overstory trees.

Plumas National Forest is a mosaic of private industrial and public ownership, and 70% of the study area was burned in five massive wildfires between 2019 and 2021, including the largest single fire in California's recorded history, the Dixie Fire. Serendipitously, a unique dataset had been collected a year before the region burned.

In 2018, the U.S. Forest Service, Geological Survey and National Aeronautics and Space Administration surveyed the Plumas National Forest and surrounding private land using airborne light detection and ranging (lidar) flights. The lidar sensors fire billions of laser pulses at the landscape below, which bounce off the grass, shrubs, saplings, tree canopies and other structures in the forest, mapping them with high precision.
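As a toy illustration of how gridded lidar products translate into the structural metrics discussed here, the sketch below counts local maxima in a canopy height model as treetops to estimate stem density. This is not the study's method, and random numbers stand in for real lidar data.

```python
import numpy as np

# Toy stem-density estimate from a canopy height model (CHM): count cells
# taller than all eight neighbors and above a height cutoff. Real lidar
# workflows are far more involved; the CHM here is random noise.
def count_treetops(chm, min_height=5.0):
    tops = 0
    for i in range(1, chm.shape[0] - 1):
        for j in range(1, chm.shape[1] - 1):
            window = chm[i - 1:i + 2, j - 1:j + 2]
            if chm[i, j] >= min_height and chm[i, j] == window.max():
                tops += 1
    return tops

rng = np.random.default_rng(0)
chm = rng.uniform(0, 30, size=(100, 100))  # fake 100 m x 100 m canopy, 1 m cells
print(f"~{count_treetops(chm)} detected treetops on this one-hectare toy grid")
```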

"We have a really detailed picture of what the forest looked like immediately before these massive fires. It's an unbelievably valuable thing to have," Levine said. "Understanding the forest structures that lead to high-severity fire allows us to target mitigation strategies to get ahead of this massive fire problem while still producing enough timber to meet market demand."

Private vs public management strategies

Timber companies are focused on maximizing profits and providing a sustainable source of wood, a valuable resource for society and economic engine for rural communities. Most practice plantation forestry -- clear-cutting an area and replanting the trees in a tightly packed grid. After 80 to 100 years, they do it all again, leaving a patchwork of dense stands of trees of similar age and size.

"You can think about stacking a bunch of matches together in a grid -- that's going to burn a lot better than if you have those matches dispersed as smaller clumps," Levine explained. "A bigger fire can easily reach the canopy in dense forests. Then it's ripping through one tree after another, tossing out chunks of burning material miles in advance. It's a different story."

The objectives of public lands are more varied, requiring management for grazing, recreation, restoration, timber production and wildlife corridors. They're also beholden to the public, which stymies their ability to do active management. Environmental organizations often sue to stop proposed projects that would remove trees to thin down density.

Although the study demonstrates that private industrial lands fare worse, both private and public agencies have much room for improvement to protect our nation's forests. Most Sierra Nevada trees lack adaptations to recover from high-severity fires, leading to more and more of our forests turning into shrub and grasslands.

"This has major implications for timber, but also for carbon sequestration, water quality, wildlife habitat and recreation," Levine said. "Shrub and grasslands can be beautiful, but when we think of the Sierra Nevada we picture majestic forests. Without major changes in forest management, future generations could inherit a landscape that looks very different than the one we cherish today."

Other authors include Brandon Collins of the U.S. Forest Service and University of California, Berkeley; Michelle Coppoletta of the U.S. Forest Service; and Scott Stephens of University of California, Berkeley.

The study, "Extreme weather magnifies the effects of forest structure on wildfire, driving increased severity in industrial forests," was published on Aug. 20, 2025, in the journal Global Change Biology.

Source: ScienceDaily

Wednesday, 22 October 2025

Scientists uncover wildfire paradox that’s putting 440 million people in danger

Despite fewer acres burned, wildfire danger is rising where it matters most: near people. Africa dominates exposure, California suffers extreme intensity, and climate-driven fire weather makes future disasters almost certain.

Researchers at the University of California, Irvine and other institutions have spotted a contradiction in worldwide wildfire trends: Despite a 26 percent decline in total burned area from 2002 to 2021, the number of people exposed to wildfires has surged by nearly 40 percent.

The study, published on August 21 in Science, revealed another statistic that may come as a surprise to people who rely primarily on Western news sources: While high-profile wildfire disasters in the United States, Canada and Australia often dominate headlines, the researchers found that 85 percent of all human exposures to wildfires during that period occurred in Africa.

Just five central African countries -- Congo, South Sudan, Mozambique, Zambia and Angola -- accounted for half of all global human exposure. In contrast, the United States, Europe and Australia collectively constituted less than 2.5 percent of the total.

"Nevertheless, the western U.S. and particularly California are hot spots of intense fires globally," said senior author Mojtaba Sadegh, an associate professor of civil engineering at Idaho's Boise State University who earned a Ph.D. in civil and environmental engineering at UC Irvine in 2015. "Our previously published study shows that California experiences a disproportionately large share of U.S. fire impacts, accounting for 72 percent of human exposures despite comprising 15 percent of the nation's burned area."

The researchers analyzed population data and more than 18.6 million fire records from 2002 to 2021 to find that an estimated 440 million people worldwide were exposed to a wildfire encroaching on their home during this period -- a number roughly equivalent to the entire population of the European Union. They discovered that human exposure to wildland fire increased by 7.7 million people, an average of 382,700 persons per year during the study period. This surge in human exposure was prompted not by a global jump in fire activity but primarily by population growth and migration into fire-prone landscapes.
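The per-year figure is close to a simple average of the reported increase over the 20-year window; the small gap reflects rounding and the authors' trend fitting rather than this naive division.

```python
# Sanity check on the reported exposure figures.
increase = 7_700_000     # additional people exposed, 2002-2021
years = 2021 - 2002 + 1  # 20-year study window
print(f"{increase / years:,.0f} per year")  # 385,000, near the reported 382,700
```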

Another factor illuminated by the research is a significant rise in the intensity of wildfires in North and South America. This is linked to the climate change-driven amplification of "fire weather," which includes conditions like increased heat, lower humidity and strong winds.

Extreme fire weather has grown by more than 50 percent over the past four decades globally.

When combined with such human activities as land development and historical fire suppression practices, this trend has led to an escalating risk of destructive fires in regions like California. The frequency of conditions conducive to extreme-impact wildfires (like the 2025 Los Angeles fires) quadrupled from 1990 to 2022 across the state.

In Europe and Oceania, the study noted a decline in wildfire exposures, mainly because of population shifts from rural to urban areas. This highlights how both social and environmental factors play critical roles in shaping wildfire risk.

"The global paradox of decreased burn area and increased human impacts we uncovered … is due largely to an increasing overlap between human settlements and fire-prone landscapes," said co-author Amir AghaKouchak, UC Irvine Chancellor's Professor of civil and environmental engineering.

Underscoring a growing human vulnerability to wildfires - particularly in regions that receive little international attention - the research emphasizes the urgent need for proactive mitigation strategies to protect communities from the burgeoning threat of wildfires. These include vegetation management techniques like prescribed fires, public education and engineering solutions to reduce human-caused ignitions.

"As climate change intensifies fire weather and global populations continue to expand into fire-prone zones, proactive mitigation will be increasingly critical to reduce the risk of future wildfire disasters," AghaKouchak said.

Study collaborators included Matthew Jones of England's University of East Anglia; Seyd Teymoor Seydi of Boise State University; John Abatzoglou and Crystal Kolden of UC Merced; Gabriel Filippelli of Indiana University Indianapolis; Matthew Hurteau of the University of New Mexico; Charles Luce of the U.S. Department of Agriculture Forest Service's Rocky Mountain Research Station in Boise; and Chiyuan Miao of Beijing Normal University. Funding was provided by the U.S. National Science Foundation.

Source: ScienceDaily

Tuesday, 21 October 2025

Scientists create biodegradable plastic stronger than PET

The PET-alternative PDCA is biodegradable and has superior physical properties. A Kobe University team of bioengineers engineered E. coli bacteria to produce the compound from glucose at unprecedented levels and without byproducts -- and opened up a realm of possibilities for the future of bioengineering.

The durability of plastics is both the reason why they have become so widespread and why they pose environmental problems. In addition, they are mainly sourced from petroleum, making them non-renewable and contingent on geopolitics. Research groups worldwide work on both biodegradable and bio-sourced alternatives, but there are often issues with yield, purity and -- as a result -- associated production cost.

Kobe University bioengineer Tsutomu Tanaka says: "Most biomass-based production strategies focus on molecules consisting of carbon, oxygen and hydrogen. However, there are highly promising compounds for high-performance plastics that include other elements such as nitrogen, but there are no efficient bioproduction strategies. And purely chemical reactions inevitably generate unwanted byproducts." PDCA, which stands for pyridinedicarboxylic acid, is such a candidate. It is biodegradable, and materials incorporating it show physical properties comparable to or even surpassing those of PET, which is widely used in containers and textiles. "Our group approached the challenge from a new angle: We aimed to harness cellular metabolism to assimilate nitrogen and build the compound from start to finish," says Tanaka.

In the journal Metabolic Engineering, the Kobe University group now reports that it achieved the production of PDCA in bioreactors at concentrations more than seven-fold higher than previously reported. Tanaka explains, "The significance of our work lies in demonstrating that metabolic reactions can be used to incorporate nitrogen without producing unwanted byproducts, thereby enabling the clean and efficient synthesis of the target compound."

The group, however, did have some stubborn problems to solve along the way. The most stubborn of these came when they discovered a bottleneck where one of the enzymes they had introduced produced the highly reactive compound hydrogen peroxide, H2O2. The compound then attacked the enzyme that produced it, thereby deactivating it. "Through refining the culture conditions, in particular by adding a compound that can scavenge H2O2, we could finally overcome the issue, although this addition may present new economic and logistical challenges for large-scale production," says Tanaka.
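A toy kinetic model makes the logic of that fix easy to see: the faster H2O2 is scavenged, the longer the enzyme stays active and the more product accumulates. All rate constants below are invented to show the qualitative effect and are not fitted to the Kobe group's data.

```python
from scipy.integrate import solve_ivp

# Toy model of the bottleneck: enzyme E makes product P but leaks H2O2 (H),
# which inactivates E; a scavenger removes H at rate `scav`. Rate constants
# are arbitrary illustrative values.
def rhs(t, y, scav):
    E, H, P = y
    dE = -0.5 * H * E        # H2O2 attacks the enzyme that produced it
    dH = 0.2 * E - scav * H  # leaked by E, removed by the scavenger
    dP = 1.0 * E             # product formation tracks active enzyme
    return [dE, dH, dP]

for scav, label in [(0.0, "no scavenger"), (5.0, "with scavenger")]:
    sol = solve_ivp(rhs, (0, 50), [1.0, 0.0, 0.0], args=(scav,))
    print(f"{label}: final product = {sol.y[2, -1]:.1f}")
# The scavenged culture keeps its enzyme active longer and makes more product.
```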

The bioengineers already have plans on how to improve the production going forward, with every problem pointing the way to its solution. Looking at the future, Tanaka says: "The ability to obtain sufficient quantities in bioreactors lays the groundwork for the next steps toward practical implementation. More generally, though, our achievement in incorporating enzymes from nitrogen metabolism broadens the spectrum of molecules accessible through microbial synthesis, thus enhancing the potential of bio-manufacturing even further."

Source: ScienceDaily

Monday, 20 October 2025

Scientists just cracked a 60-million-year-old volcanic mystery

Sixty million years ago, a powerful mantle plume beneath Iceland triggered colossal volcanic activity across the North Atlantic, shaping landscapes from Greenland to Ireland. Cambridge researchers now show that thinner patches of Earth's tectonic plates acted like volcanic "funnels," spreading molten rock far and wide.

What do the rumblings of Iceland's volcanoes have in common with the now peaceful volcanic islands off Scotland's western coast and the spectacular basalt columns of the Giant's Causeway in Northern Ireland?

About sixty million years ago, the Icelandic mantle plume -- a fountain of hot rock that rises from Earth's core-mantle boundary -- unleashed volcanic activity across a vast area of the North Atlantic, extending from Scotland and Ireland to Greenland.

For decades, scientists have puzzled over why this burst of volcanism was so extensive. Now, research led by the University of Cambridge has found that differences in the thickness of tectonic plates around the North Atlantic might explain the widespread volcanism.

The researchers compiled seismic and temperature maps of Earth's interior, finding that patches of thinner tectonic plate acted like conduits, funneling the plume's molten rock over a wide area.

Iceland, which is one of the most volcanically active places on Earth, owes its origin largely to the mantle plume. Beyond volcanism, the Iceland Plume's influence even extends to shaping the seafloor and ocean circulation in the North Atlantic and, in turn, climate through time. Despite its global significance, many aspects of the plume's behavior and history remain elusive.

"Scientists have a lot of unanswered questions about the Iceland plume," said Raffaele Bonadio, a geophysicist at Cambridge's Department of Earth Sciences and lead author of the study.

Bonadio set out to explain why the plume's volcanic imprint was much more widespread sixty million years ago -- before the Atlantic opened -- forming volcanoes and lava outpourings stretching over thousands of kilometers. The pattern could be explained by the mantle plume spreading outward in a branched, flowing formation, Bonadio explained, "but evidence for such flow has been scarce."

In search of answers, Bonadio focused on a segment of the North Atlantic Igneous Province to better understand the complex distribution of volcanoes in Scotland and Ireland. He wanted to know if the structure of Earth's tectonic plates played a role in the surface expression of volcanism.

Using seismic data extracted from earthquakes, Bonadio created a computer-generated image of Earth's interior beneath Britain and Ireland. This method, known as seismic tomography, works similarly to a medical CT scan, revealing hidden structures deep within the planet. Bonadio coupled this with seismic thermography measurements -- a new method developed by the team -- which reveal variations in the temperature and thickness of the tectonic plate.
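A cartoon of the seismic-thermography idea: slower shear waves imply hotter rock, and hotter rock implies a thinner, weaker rigid plate. The sensitivity and reference values below are rough assumptions for illustration only, not the team's calibrated method.

```python
# Map a shear-wave speed anomaly to a temperature estimate. The sensitivity
# (~ -1 m/s per degree C) and reference values are assumed round numbers.
DVS_DT = -1.0                  # assumed m/s change in Vs per degree C
VS_REF, T_REF = 4500.0, 900.0  # assumed reference Vs (m/s) and temperature (C)

def temperature_from_vs(vs_ms):
    return T_REF + (vs_ms - VS_REF) / DVS_DT

for vs in (4550.0, 4500.0, 4420.0):
    print(f"Vs = {vs:.0f} m/s  ->  ~{temperature_from_vs(vs):.0f} C")
# The slow anomaly (4420 m/s) maps to hotter mantle -- and hence the thinner,
# weaker plate over which the ancient volcanoes cluster.
```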

He found that northwest Scotland and Ireland's volcanoes formed in areas where the lithosphere (Earth's rigid outer layer that makes up the tectonic plates) is thinner and weaker.

"We see ancient volcanoes concentrated within this corridor of thin lithosphere beneath the Irish Sea and surrounding areas," said Bonadio. He thinks the hot plume material was preferentially funneled along this corridor, ponding in the thin plate areas due to its buoyancy.

Previously, some scientists had put forward alternative, non-mantle-plume origins for the volcanic activity, said Bonadio. But his new research shows the scattered pattern could be explained by magma being diverted and re-routed to areas of thinner lithosphere.

Sergei Lebedev, from the University of Cambridge, said: "This striking correlation suggests that hot plume material eroded the lithosphere in this region. The resulting combination of thin lithosphere, hot asthenosphere and decompression melting likely caused the uplift and volcanic activity."

Previously, the authors have found a close link between the uneven distribution of earthquakes in Britain and Ireland and the thickness of the lithosphere, showing how the scars left by the mantle plume influence seismic hazards today.

Bonadio and Lebedev are also using their methods to map geothermal energy resource potential. "In Britain and Ireland, the greatest supply of heat from the Earth's mantle is in the same places where volcanoes erupted sixty million years ago, and where the lithosphere is thinner," said Lebedev. He and Bonadio are working with international colleagues to apply their new seismic thermography methods to global geothermal assessment.

Source: ScienceDaily