Wednesday, 6 May 2026

Common knee surgery found ineffective, may make things worse

A widely performed knee procedure known as partial meniscectomy may not deliver the benefits many patients expect. A major study with a 10-year follow-up has found that trimming a damaged meniscus does not improve symptoms or knee function when compared to a placebo procedure.

Partial meniscectomy is one of the most common orthopedic surgeries worldwide. While its use has declined in Finland in recent years, it remains a routine treatment in many countries.

10-Year Study Finds Worse Outcomes After Surgery

The long-term results paint a concerning picture. Patients who underwent partial meniscectomy did not experience better outcomes than those who had sham surgery. In fact, they tended to do worse.

After a decade, these patients reported more knee symptoms and poorer function. They also showed greater progression of osteoarthritis and were more likely to need additional knee surgery compared to those who did not receive the actual procedure.

Unique Trial Design Strengthens Findings

The Finnish Degenerative Meniscal Lesion Study (FIDELITY) stands out for its rigorous design. It included a sham surgery control group, allowing researchers to directly compare outcomes against a placebo procedure. Participants with degenerative meniscal tears were randomly assigned to receive either partial meniscectomy or sham surgery, and their progress was tracked for 10 years.

Teppo Järvinen, Professor at the University of Helsinki and the principal investigator of the FIDELITY study, emphasizes the broader significance of the results:

"Our findings suggest that this may be an example of what is known as a medical reversal, where broadly used therapy proves ineffective or even harmful."

Rethinking the Cause of Knee Pain

The surgery has long been based on the idea that knee pain, especially on the inner side, is caused by a meniscus tear that can be fixed surgically. However, this assumption may not hold up.

"The surgery is based on the assumption that pain in the inside of the knee is caused by a medial meniscus tear, which can be treated surgically. Such reasoning -- assumption based on biological credibility -- is still very common in medicine but in this case, the assumption does not withstand critical examination. Based on current understanding, pain in various joints, such as the knee joint in this case, is related to degeneration brought about by aging," says Raine Sihvonen, Specialist in Orthopaedics and Traumatology and the other principal investigator of the FIDELITY study.

Concerns About Risks and Long-Term Harm

Earlier registry and observational studies have already raised red flags about potential downsides of this surgery. These include a higher likelihood of arthroplasty, or joint replacement surgery, and a possible increase in complications after the procedure. However, observational data alone cannot prove cause and effect.

"Several randomized studies have already demonstrated that partial meniscectomy has not improved patients' symptoms or function in the short (1-2 years) or medium (5 years) term. Regardless, the procedure has remained widely used in many countries," says Doctoral Researcher and Specialist in Orthopaedics and Traumatology, Dr. Roope Kalske.

Why the Procedure Is Still Widely Used

Despite mounting evidence, changing clinical practice has been slow.

"For nearly a decade, many independent, non-orthopedic organizations providing clinical guidelines have recommended that the procedure should be discontinued. Still, for example, the American Academy of Orthopedic Surgeons (AAOS) and the British Association for Surgery of the Knee (BASK) have continued to endorse the surgery.

"This effectively illustrates how difficult it is to give up inefficient therapies," Järvinen sums up.

Strong Collaboration Behind the Study

The research was carried out across five hospitals, highlighting both smooth multicenter collaboration and strong patient commitment.

"The study conducted in five hospitals is an example of smooth multicenter collaboration, as well as the commitment of research patients to an interesting project. Of the original 146 participants, more than 90% took part in the final stage of the study," says the research manager Pirjo Toivonen.

The Finnish Degenerative Meniscal Lesion Study (FIDELITY) is part of the broader work of the FICEBO research group in assessing the impact of surgical therapies. The project is a collaboration between the university hospitals of Helsinki, Kuopio and Turku, Hatanpää Hospital in Tampere, Hospital Nova in Jyväskylä and the Finnish Institute for Health and Welfare.

Source: ScienceDaily

Tuesday, 5 May 2026

AI lets chemists design molecules by simply describing them

Creating new molecules is one of the toughest tasks in chemistry. Whether the goal is a life-saving drug or a cutting-edge material, each compound must be built through a carefully planned series of reactions. Mapping out these steps requires deep expertise and strategic thinking, which is why chemists often spend years mastering the process.

A major hurdle is retrosynthesis. In this approach, chemists begin with the final molecule they want and work backward to figure out simpler starting materials and possible reaction routes. This involves many decisions, such as selecting the right building blocks, deciding when to form rings, and determining whether sensitive parts of the molecule need protection. While computers can scan enormous "chemical spaces," they still struggle to match the strategic judgment of experienced chemists.

Another challenge involves reaction mechanisms, which describe how reactions proceed step by step through the movement of electrons. Understanding these mechanisms allows scientists to predict new reactions, improve efficiency, and avoid costly trial and error. Although current computational tools can suggest many possible pathways, they often lack the intuition needed to pinpoint the most realistic ones.

A New AI Approach to Chemical Reasoning

Researchers led by Philippe Schwaller at EPFL have developed a new method that uses large language models (LLMs) as reasoning tools for chemistry. Rather than directly generating chemical structures, these models act as evaluators that guide existing computational systems.

The new framework, called Synthegy, combines traditional search algorithms with AI that can interpret chemical strategies written in natural language.

"When making tools for chemists, the user interface matters a lot, and previous tools relied on cumbersome filters and rules," says Andres M Bran, the first author of the Synthegy paper published in Matter. "With Synthegy, we're giving chemists the power to just talk, allowing them to iterate much faster and navigate more complex synthetic ideas."

How Synthegy Improves Retrosynthesis Planning

Synthegy starts with a target molecule and a simple instruction written in everyday language. For example, a chemist might request that a specific ring be formed early or that unnecessary protecting groups be avoided. Standard retrosynthesis software then generates many possible pathways.

Each of these pathways is converted into text and reviewed by a language model. Synthegy scores how well each option matches the chemist's instructions and explains its reasoning. This makes it easier to rank and filter the best routes. By guiding searches with natural language, chemists can quickly focus on strategies that align with their goals.
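
The score-and-rank loop described above can be sketched in a few lines. This is an illustrative reconstruction, not Synthegy's actual code: `score_route` stands in for the LLM evaluator, here replaced by a trivial keyword-overlap heuristic so the example runs without a model, and the route strings are invented.

```python
# Illustrative sketch of the rank-and-filter idea behind Synthegy
# (not the published implementation). score_route() stands in for the
# LLM evaluator, which would read a pathway description plus the
# chemist's instruction and return a match score with reasoning.

def score_route(route_text: str, instruction: str) -> float:
    """Hypothetical scorer: fraction of instruction words found in the
    route description. A real system would query an LLM here."""
    wanted = set(instruction.lower().split())
    present = set(route_text.lower().split())
    return len(wanted & present) / len(wanted)

def rank_routes(routes: list[str], instruction: str, top_k: int = 2) -> list[str]:
    """Score every candidate pathway and keep the best matches."""
    return sorted(routes, key=lambda r: score_route(r, instruction),
                  reverse=True)[:top_k]

# Toy candidate pathways, serialized as plain text
routes = [
    "form pyridine ring early then couple fragments no protecting groups",
    "install protecting groups first then form ring late",
    "form ring early with temporary protecting groups",
]
best = rank_routes(routes, "form ring early avoid protecting groups")
print(best[0])  # the route that forms the ring early without protection
```

The point of the design is the interface: the chemist's preference lives in one natural-language string rather than in hand-built filters, so iterating on strategy means editing a sentence, not reconfiguring rules.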

Source: ScienceDaily

Monday, 4 May 2026

Scientists turn plastic waste into clean hydrogen fuel using sunlight

Scientists are developing a new way to tackle two major global problems at once: plastic pollution and the demand for clean energy. By using sunlight, they are finding ways to turn discarded plastic into useful fuels.

A recent study led by Adelaide University PhD candidate Xiao Lu examines how solar-powered systems can convert waste plastics into hydrogen, syngas, and other industrial chemicals. This approach could help create a more sustainable, circular economy by giving new value to materials that are usually thrown away.

Plastic Waste as a Hidden Energy Resource

More than 460 million tonnes of plastic are produced worldwide each year, and large amounts end up polluting land and oceans. At the same time, the need to move away from fossil fuels has intensified the search for cleaner energy alternatives.

The research, published in Chem Catalysis, shows that plastics, which are rich in carbon and hydrogen, can be treated as a resource rather than just waste.

"Plastic is often seen as a major environmental problem, but it also represents a significant opportunity," said Ms Lu. "If we can efficiently convert waste plastics into clean fuels using sunlight, we can address pollution and energy challenges at the same time."

How Sunlight Converts Plastic Into Fuel

The method, called solar-driven photoreforming, relies on light-sensitive materials known as photocatalysts. These materials use sunlight to break down plastics at relatively low temperatures.

Through this process, plastics can be transformed into hydrogen, a clean fuel that produces no emissions at the point of use, along with other valuable industrial chemicals.

Compared to traditional water splitting for hydrogen production, this approach can be more energy-efficient. Plastics are easier to oxidize, so the reactions require less energy, which improves the potential for large-scale use.

Promising Results From Early Studies

According to senior author Professor Xiaoguang Duan from the School of Chemical Engineering at Adelaide University, recent experiments have delivered strong results. Researchers have reported high levels of hydrogen production, as well as the creation of acetic acid and even diesel-range hydrocarbons. Some systems have run continuously for more than 100 hours, demonstrating improved stability and performance.

Challenges to Scaling the Technology

Despite this progress, several obstacles must be addressed before the technology can be widely adopted.

"One major hurdle is the complexity of plastic waste itself," Prof Duan said. "Different types of plastics behave differently during conversion, and additives such as dyes and stabilisers can interfere with the process. Efficient sorting and pre-treatment are therefore essential to maximise performance and product quality."

Another key issue involves the photocatalysts themselves. These materials need to be highly selective and durable, capable of operating under demanding chemical conditions without losing effectiveness. Current versions can degrade over time, which limits their long-term reliability.

"There is still a gap between laboratory success and real-world application," Prof Duan said. "We need more robust catalysts and better system designs to ensure the technology is both efficient and economically viable at scale."

Engineering and Efficiency Hurdles

Separating the final products is also a challenge. The reactions often produce a mix of gases and liquids, which must be separated through energy-intensive processes. This can reduce the overall environmental benefits.

To overcome these issues, researchers emphasize the need for a more integrated strategy. This includes improvements in catalyst design, reactor engineering, and overall system optimization. New ideas being explored include continuous-flow reactors, systems that combine solar with thermal or electrical energy, and advanced monitoring tools to improve efficiency.

A Roadmap Toward Real-World Use

Looking ahead, the team has outlined steps for scaling up the technology. Their goals include boosting energy efficiency and enabling continuous industrial operation over the coming decades.

"This is an exciting and rapidly evolving field," Ms Lu said. "With continued innovation, we believe solar-powered plastic-to-fuel technologies could play a key role in building a sustainable, low-carbon future."

Source: ScienceDaily

Sunday, 3 May 2026

MIT scientists finally reveal the hidden structure of a mysterious high-tech material

Materials known as relaxor ferroelectrics have played an important role for decades in technologies such as ultrasound imaging, microphones, and sonar. Their unusual performance comes from the way atoms are arranged inside them. However, that internal structure has been extremely difficult to measure directly, leaving scientists to rely on incomplete models.

Now, researchers from MIT and collaborating institutions have, for the first time, mapped the three-dimensional atomic structure of a relaxor ferroelectric. Their results, to be published in Science, offer a clearer foundation for improving the models used to design future computing systems, energy devices, and advanced sensors.

"Now that we have a better understanding of exactly what's going on, we can better predict and engineer the properties we want materials to achieve," says corresponding author James LeBeau, MIT's Kyocera Professor of Materials Science and Engineering. "The research community is still developing methods to engineer these materials, but in order to predict the properties those materials will have, you have to know if your model is right."

Revealing Hidden Charge Patterns in Complex Materials

In the study, the team used a cutting-edge imaging method to examine how electric charges are distributed throughout the material. What they found challenged previous assumptions.

"We realized the chemical disorder we observed in our experiments was not fully considered previously," says co-first authors Michael Xu PhD '25 and Menglin Zhu, who are both postdocs at MIT. "Working with our collaborators, we were able to merge the experimental observations with simulations to refine the models and better predict what we see in experiments."

The research team also included Colin Gilgenbach and Bridget R. Denzer, MIT PhD students in materials science and engineering; Yubo Qi, an assistant professor at the University of Alabama at Birmingham; Jieun Kim, an assistant professor at the Korea Advanced Institute of Science and Technology; Jiahao Zhang, a former PhD student at the University of Pennsylvania; Lane W. Martin, a professor at Rice University; and Andrew M. Rappe, a professor at the University of Pennsylvania.

Probing Disordered Materials at the Atomic Scale

Computer models have long suggested that when an electric field is applied to relaxor ferroelectrics, interactions between positively and negatively charged atoms within tiny regions help create their strong energy storage and sensing abilities. Until now, those nanoscale regions could not be directly observed.

To investigate further, the researchers focused on a widely used material found in sensors, actuators, and defense systems: a lead magnesium niobate-lead titanate alloy. They applied an advanced technique called multi-slice electron ptychography (MEP). This method involves scanning a nanoscale beam of high-energy electrons across the material and recording the diffraction patterns that result.

"We do this in a sequential way, and at each position, we acquire a diffraction pattern," Zhu explains. "That creates regions of overlap, and that overlap has enough information to use an algorithm to iteratively reconstruct three-dimensional information about the object and the electron wave function."

Using this approach, the team uncovered a layered hierarchy of chemical and polar structures, extending from individual atoms up to larger, mesoscopic features. They also discovered that regions with different polarization were significantly smaller than earlier simulations had predicted. By incorporating these observations into their models, the researchers were able to improve how well simulations match real-world behavior.

"Previously, these models basically had random regions of polarization, but they didn't tell you how those regions correlate with each other," Xu says. "Now we can tell you that information, and we can see how individual chemical species modulate polarization depending on the charge state of atoms."

Toward Better Materials for Future Technologies

According to Zhu, the findings highlight the growing power of electron ptychography for exploring complex, disordered materials and could lead to new lines of research.

"This study is the first time in the electron microscope that we've been able to directly connect the three-dimensional polar structure of relaxor ferroelectrics with molecular dynamics calculations," Xu says. "It further proves you can get three-dimensional information out of the sample using this technique."

The team believes this method could eventually help scientists design materials with tailored electronic properties, improving technologies such as memory storage, sensing systems, and energy devices.

Source: ScienceDaily

Saturday, 2 May 2026

The da Vinci bloodline is unlocking the genius’s genetic secrets

For more than 500 years, Leonardo da Vinci has been admired as a brilliant artist, inventor, and thinker whose talents seemed far ahead of his time. Now, an ambitious international effort known as the Leonardo DNA Project is bringing scientists closer than ever to uncovering the biological roots of his genius.

A newly published book, "Genìa Da Vinci. Genealogy and Genetics for Leonardo's DNA," brings together three decades of research led by Alessandro Vezzosi and Agnese Sabato of the Leonardo Da Vinci Heritage Association in Vinci. Supported by the Municipality of Vinci, the work reconstructs an extensive family tree stretching back to 1331. It spans 21 generations and includes more than 400 individuals, creating the foundation for an unprecedented attempt to rebuild Leonardo's genetic profile.

By carefully studying archival records and historical documents, the researchers were able to map out previously unknown branches of Leonardo's family. In the process, they identified 15 living male descendants linked directly through the paternal line to Leonardo's father and his half-brother, Domenico Benedetto.

DNA Testing Links Living Descendants

This discovery opened the door for genetic analysis. David Caramelli, who coordinates the anthropological and molecular aspects of the Leonardo DNA Project and leads the Department of Biology at the University of Florence, worked with forensic anthropologist Elena Pilli to analyze DNA from six of these descendants.

The results showed that segments of the Y chromosome matched across the participants. Because this chromosome is passed from father to son with little change, the findings confirm a continuous male lineage within the Da Vinci family dating back at least 15 generations.
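
The comparison behind that conclusion can be illustrated with a toy example. Y-chromosome STR profiles (marker name, repeat count) from different men are compared marker by marker; profiles that agree across markers are consistent with a shared paternal line. This is not the project's data or pipeline: the marker names below are standard Y-STR loci, but every repeat count is invented.

```python
# Toy illustration of the paternal-lineage test described above
# (not the Leonardo DNA Project's actual data or pipeline).
# Y-STR profiles map marker names to repeat counts; values are invented.

def matching_markers(a: dict[str, int], b: dict[str, int]) -> int:
    """Count the markers at which two Y-STR profiles agree."""
    return sum(1 for m in a if m in b and a[m] == b[m])

# Hypothetical profiles: marker -> repeat count
descendant_1 = {"DYS19": 14, "DYS390": 24, "DYS391": 11, "DYS393": 13}
descendant_2 = {"DYS19": 14, "DYS390": 24, "DYS391": 11, "DYS393": 13}
unrelated    = {"DYS19": 15, "DYS390": 22, "DYS391": 10, "DYS393": 13}

print(matching_markers(descendant_1, descendant_2))  # 4: full match
print(matching_markers(descendant_1, unrelated))     # 1: chance overlap only
```

Because the Y chromosome passes from father to son with few changes, a full match across many markers is strong evidence of a continuous male line, which is exactly what the six tested descendants showed.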

Ancient Tomb Could Hold Crucial Evidence

Researchers have also identified a Da Vinci family tomb at the Church of Santa Croce in Vinci. Archaeological excavations are currently underway in collaboration with the University of Florence. The site is believed to contain the remains of Leonardo's grandfather Antonio, his uncle Francesco, and several half-brothers: Antonio, Pandolfo, and Giovanni.

Anthropologists Alessandro Riga and Luca Bachechi have recovered bone fragments from the site, some of which have been radiocarbon dated. One specimen, consistent in age with Leonardo's relatives, has already undergone paleogenomic testing. Early analysis suggests the individual was male.

"Further detailed analyses are necessary to determine whether the DNA extracted is sufficiently preserved," says Caramelli, who is also President of the University Museum System. "Based on the results, we can proceed with analysis of Y chromosome fragments for comparison with current descendants."

If the Y chromosome from these remains matches that of living descendants, it would strengthen historical records and family lineage reconstructions. It could also make it possible to analyze biological traces connected to Leonardo himself, including material left on manuscripts or artworks, potentially enabling scientists to reconstruct his DNA.

A Global Scientific Effort

The Leonardo da Vinci DNA Project began in 2016 and is coordinated from The Rockefeller University in New York. It brings together institutions including the J. Craig Venter Institute in California and the University of Florence, with support from foundations such as the Achelis and Bodman Foundation (New York) and the Richard Lounsbery Foundation (Washington, D.C.).

The project focuses on tracking the Y chromosome, which passes largely unchanged through generations of males.

"Our goal in reconstructing the Da Vinci family's lineage up to the present day, while also preserving and valuing the places connected to Leonardo, is to enable scientific research on his DNA," says Vezzosi. "Through the recovery of Leonardo's DNA, we hope to understand the biological roots of his extraordinary visual acuity, creativity, and possibly even aspects of his health and causes of death."

"Even a tiny fingerprint on a page could contain cells to sequence," says Jesse H. Ausubel of The Rockefeller University and director of the project. "21st-century biology is moving the boundary between the unknowable and the unknown. Soon we may gain information about Leonardo and other historical figures once believed lost forever."

Beyond DNA: New Insights Into Leonardo's Life

The book goes far beyond genetics, offering a detailed exploration of Leonardo's world. Across 21 chapters, it examines historical, geographical, and genealogical evidence to better understand the environment in which he lived.

Researchers identified seven Da Vinci family homes in the village and castle of Vinci, along with two properties once owned by Leonardo himself. These properties were inherited from his uncle Francesco and were the subject of a long dispute with his half-brothers.

The study also revisits key figures in Leonardo's life. His grandfather Antonio is revealed as a traveling merchant who operated between Catalan Spain and Morocco, rather than simply a farmer. Meanwhile, new archival analysis provides a clearer view of Leonardo's mother, Caterina. Evidence suggests she may have been a slave working for a wealthy banker, Vanni di Niccolò di ser Vanni. Historical documents, including wills and donation records dating back to 1449, shed light on the relationship between this banker and Leonardo's father, ser Piero.

Source: ScienceDaily

Friday, 1 May 2026

Greenland ice melt has surged sixfold and scientists are alarmed

Climate change is dramatically reshaping how Greenland's ice sheet melts, according to a new study led by the University of Barcelona and published in Nature Communications. Researchers found that extreme melting events are now happening more often, covering larger areas, and producing significantly more meltwater than in the past.

Since 1990, the surface area affected by these extreme events has been expanding by about 2.8 million km² per decade. At the same time, the amount of water released from melting ice has surged. Between 1950 and 2023, extreme melt events produced an average of 12.7 gigatons of water per decade. Since 1990, that figure has jumped to 82.4 gigatons per decade, marking a sixfold increase.
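
The sixfold figure follows directly from the two decadal averages reported in the study; a quick arithmetic check:

```python
# Quick check of the reported meltwater averages: extreme melt events
# produced 12.7 Gt of water per decade over 1950-2023, versus
# 82.4 Gt per decade since 1990.
baseline_gt = 12.7  # Gt per decade, 1950-2023 average
recent_gt = 82.4    # Gt per decade, since 1990
ratio = recent_gt / baseline_gt
print(f"{ratio:.1f}x increase")  # ~6.5x, i.e. roughly sixfold
```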

Record-Breaking Melt Events Are Becoming More Common

Most of the most intense melting episodes have occurred in recent decades. Seven of the ten most extreme events on record have taken place since 2000, including major events in August 2012, July 2019, and July 2021. These events stand out because they have no comparable dynamic precedents, highlighting how unusual current conditions have become.

The study also shows that each extreme event is now producing more meltwater than similar events in the past. Since 1990, meltwater output during these episodes has risen by 25% compared to the 1950-1975 period when examining cases with similar anticyclonic and cyclonic air mass circulation. When considering all extreme events together, the increase reaches as high as 63%. This points to a strong thermodynamic effect, meaning rising temperatures are intensifying the melting beyond what atmospheric circulation alone would explain.

Northern Greenland Emerges as a Key Hotspot

The northern part of Greenland is now one of the regions most affected by these changes, emerging as a major hotspot for extreme melting. Looking ahead, projections under high greenhouse gas emission scenarios suggest that by the end of the century, the most intense meltwater anomalies could increase by as much as threefold.

New Methods Reveal Drivers Behind Intensifying Melt

The research was led by Josep Bonsoms, a postdoctoral researcher and professor in the Department of Geography at the University of Barcelona, with contributions from Marc Oliva, also a professor in the department. Conducted as part of the Antarctic, Arctic and Alpine Environments (ANTALP) Research Group, the study examined extreme melting events recorded between 1950 and 2023.

To better understand what is driving these changes, the team used a novel classification method that combines types of anticyclonic and cyclonic air mass circulation with a regional climate model. This approach allowed researchers to separate thermodynamic influences, which are linked to atmospheric warming, from dynamic influences tied to atmospheric circulation patterns.

Global Implications and Growing Strategic Importance

As global attention increasingly focuses on Greenland due to rapid environmental changes and their geopolitical implications, these findings carry added weight. Bonsoms, the article's lead author, says that "the rapid transformation of the ice sheet not only has global environmental consequences, such as sea level rise and possible alterations in ocean circulation, but also places the Arctic at the centre of new strategic, economic and territorial dynamics."

Understanding the processes that intensify extreme melting is critical for anticipating future risks and shaping informed policy decisions. The study is part of the GRELARCTIC project led by the UB ANTALP research group, with Marc Oliva as principal investigator, and was supported by an award from the ICREA Academia program.

Source: ScienceDaily

Thursday, 30 April 2026

This simple amino acid supplement greatly reduces Alzheimer’s damage

Alzheimer's disease (AD) is a progressive brain disorder and a leading cause of dementia worldwide. Despite years of research, there is still no cure. New antibody-based treatments that target amyloid β (Aβ) have recently emerged, but their benefits have been modest. These therapies can also be expensive and may trigger immune-related side effects, underscoring the urgent need for safer, more affordable options that can slow the disease.

A recent study published in Neurochemistry International offers a surprising possibility. Researchers from Kindai University and partner institutions found that arginine, a naturally occurring amino acid, can reduce the buildup of harmful Aβ proteins in animal models of Alzheimer's. Arginine also acts as a safe chemical chaperone, helping proteins maintain their proper structure.

The team noted that while arginine is widely available as an over-the-counter supplement, the doses and methods used in this study were specifically designed for research and are not the same as commercial products.

The research group included Graduate Student Kanako Fujii and Professor Yoshitaka Nagai from the Department of Neurology at Kindai University Faculty of Medicine in Osaka, along with Associate Professor Toshihide Takeuchi from the Life Science Research Institute at Kindai University.

Lab and Animal Studies Show Strong Effects

In laboratory experiments, the scientists first showed that arginine can block the formation of Aβ42 aggregates, which are considered especially toxic. The effect increased with higher concentrations.

They then tested oral arginine in two well-established Alzheimer's models:

  • A Drosophila model expressing Aβ42 with the Arctic mutation (E22G)
  • An AppNL-G-F knock-in mouse model carrying three familial AD mutations

In both cases, arginine treatment reduced the accumulation of Aβ and lessened its harmful effects.

"Our study demonstrates that arginine can suppress Aβ aggregation both in vitro and in vivo," explains Prof. Nagai. "What makes this finding exciting is that arginine is already known to be clinically safe and inexpensive, making it a highly promising candidate for repositioning as a therapeutic option for AD."

Improved Brain Health and Reduced Inflammation

In the mouse model, the benefits went beyond reducing protein buildup. Arginine lowered amyloid plaque levels and reduced the amount of insoluble Aβ42 in the brain. Treated mice also performed better in behavioral tests.

The researchers found that arginine reduced the activity of genes linked to pro-inflammatory cytokines, which are associated with neuroinflammation, a major feature of Alzheimer's disease. This suggests that arginine may not only prevent harmful protein aggregation but also protect brain cells more broadly.

"Our findings open up new possibilities for developing arginine-based strategies for neurodegenerative diseases caused by protein misfolding and aggregation," notes Prof. Nagai. "Given its excellent safety profile and low cost, arginine could be rapidly translated to clinical trials for Alzheimer's and potentially other related disorders."

A Low-Cost Path Toward New Alzheimer's Treatments

The study highlights the growing interest in drug repositioning, which involves finding new uses for existing, well-established compounds. Because arginine is already used clinically in Japan and has been shown to safely reach the brain, it could bypass some of the early hurdles that slow down traditional drug development.

Still, the researchers caution that more work is needed. Additional preclinical and clinical studies will be required to determine whether these results can be reproduced in humans and to establish the most effective dosing strategies.

Even so, the findings provide strong early evidence that simple nutritional or pharmacological approaches may help reduce amyloid buildup and improve brain function.

Expanding Understanding of Alzheimer's Biology

Beyond its potential as a treatment, this work sheds new light on how Aβ proteins form and accumulate in the brain. It also points to a practical and cost-effective strategy that could eventually benefit millions of people living with Alzheimer's worldwide.

Professor Yoshitaka Nagai, a neurologist and Chair of the Department of Neurology at Kindai University Faculty of Medicine in Osaka, focuses his research on neurodegenerative diseases including Alzheimer's, Parkinson's, and amyotrophic lateral sclerosis. His work centers on protein misfolding and RNA-related mechanisms, and he has received multiple honors from organizations such as the Japanese Society of Neurochemistry and the Japanese Dementia Society.

This research was supported by the Ministry of Education, Culture, Sports, Science, and Technology (MEXT) (Grant No. 20H05927), Japan Society for the Promotion of Science (JSPS) (Grant Nos. 24H00630, 21H02840, 22H02792, and 25K02432), Japan Science and Technology Agency (JST) Super-Highway Program (SHW2023-03), and National Center of Neurology and Psychiatry.

Source: ScienceDaily