Sunday 31 October 2021

SARS-CoV-2 infection: What is the role of genes?

 

  • A recent review summarizes current evidence about the role of genetic factors in a person’s response to SARS-CoV-2 infection.
  • People with certain gene variants, especially those that code for proteins involved in the immune response to SARS-CoV-2, may be more susceptible to severe COVID-19.
  • Variants of human genes encoding proteins that the virus needs at various stages of its life cycle could confer natural resistance to SARS-CoV-2 infection.
  • Identifying gene variants that confer this resistance may help scientists develop treatments for COVID-19.

SARS-CoV-2 is highly likely to transmit from one person to the members of their household.

One 2020 review suggests that, on average, the likelihood of SARS-CoV-2 transmitting to household contacts is 16.9%. But this increases to 41.5% in households consisting of the person with the infection and one other contact.

Yet some people do not develop the infection even after prolonged contact with household members who have it. This suggests that these people may be resistant to SARS-CoV-2 infection.

Genetic factors are known to play a significant role in determining response to infectious disease.

A recent review published in Nature Immunology summarizes current evidence about genetic factors that could explain the variability in individual response to SARS-CoV-2 infection. Specifically, it describes genes that may result in increased susceptibility to SARS-CoV-2 and those that could potentially confer resistance.

The review was authored by researchers participating in the COVID Human Genetic Effort, an international collaboration that aims to understand the genetic and immune factors underlying SARS-CoV-2 infection.

Paul Bastard, a doctoral student at the Imagine Institute in Paris and a collaborator in the COVID Human Genetic Effort, explained to Medical News Today the significance of identifying genes that may confer natural resistance against SARS-CoV-2:

“This would be of major importance, as it could help identify the pathways involved in the fight against COVID-19. It could help us better understand the pathogenesis of COVID-19. In addition, it could potentially lead to the development of new therapeutics.”

Previous studies have shown that possessing certain genetic variants can increase susceptibility to tuberculosis. These genes generally encode proteins that are involved in the immune response.

Similarly, scientists have found mutations in genes that are involved in or influence the type-1 interferon response in people with severe COVID-19. Type-1 interferons are important chemical messengers in the immune system and are crucial to our antiviral response.

Certain gene variants can also, however, protect a person from severe illness and even confer resistance to an infectious disease.

For instance, people with a mutation in the gene that encodes the CCR5 receptor are naturally resistant to HIV-1. The CCR5 receptor binds to chemokines, a family of immune proteins, and is used by HIV-1 to enter human cells and spread in the body. People with a CCR5 gene mutation express a shorter version of the CCR5 protein, preventing HIV from entering and infecting cells.

The discovery of this natural resistance led to the development of drugs that block the receptor. This example shows how characterizing genes that confer natural resistance can facilitate the development of treatments for infectious diseases.

Likewise, scientists have identified several candidate genes that could potentially confer resistance against SARS-CoV-2 infection.

SARS-CoV-2 enters human cells by binding to the angiotensin-converting enzyme 2 (ACE2) protein, which is expressed on the surface of a wide variety of cells.

A recent preprint study, which has yet to be peer reviewed, showed that a rare gene variant located close to the ACE2 gene is associated with a lower risk of SARS-CoV-2 infection and severe illness.

Moreover, the study suggests that these protective effects may result from the variant gene’s ability to reduce ACE2 expression and, thus, potentially influence the entry of SARS-CoV-2.

Other laboratory studies have identified human proteins that interact with SARS-CoV-2 and facilitate processes essential for viral infection. Variants of the genes encoding these proteins could thus potentially confer resistance to SARS-CoV-2.

The characterization of genes that confer resistance to SARS-CoV-2 requires the identification of individuals with a natural resistance to the infection. However, there are a few major methodological obstacles.

One is demonstrating that a person has contracted SARS-CoV-2 in the past. Polymerase chain reaction (PCR) tests using nasal swabs or other respiratory samples only provide information about recent exposure to the virus. While detecting antibodies in plasma samples can provide information about a prior SARS-CoV-2 infection, a small percentage of individuals who have had the infection do not have detectable levels of antibodies.

It can also be challenging to distinguish individuals who have never been exposed to the virus from those who possess natural resistance.

The authors of the Nature Immunology review are currently conducting a study to characterize genes that may confer resistance to SARS-CoV-2 infection and propose a strategy to address these challenges.

To identify people with natural resistance to SARS-CoV-2 infection, the authors intend to enroll participants who do not have the infection but have a household member, especially a spouse or partner, with symptomatic COVID-19.

They also intend to include people without the infection who have been in contact, without protective equipment, with a symptomatic person during the first 3–5 days of their infection.

And in addition to PCR and antibody testing, they propose to assess the participants’ T-cell responses.

The immune response to a SARS-CoV-2 infection is characterized by the production of antibodies and a response by T cells, a type of white blood cell. The absence of a T-cell response specific to the virus, along with negative PCR and antibody tests, could thus help confirm the absence of a prior SARS-CoV-2 infection.
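The combined criterion described above can be sketched as a simple predicate. This is an illustrative simplification, not the study's actual enrollment protocol; the function name and the boolean inputs are hypothetical.

```python
def likely_never_infected(pcr_positive, antibodies_detected, t_cell_response):
    """Return True only when all three markers are negative.

    Any single negative test can miss a past infection (e.g. antibodies
    can wane to undetectable levels), so the proposed strategy combines
    PCR, antibody and virus-specific T-cell testing before treating a
    household contact as never having been infected.
    """
    return not (pcr_positive or antibodies_detected or t_cell_response)

# A contact negative on all three assays is a candidate "resistant"
# participant; a positive T-cell response alone would rule them out.
print(likely_never_infected(False, False, False))  # True
print(likely_never_infected(False, False, True))   # False
```

The point of the conjunction is that each assay covers a blind spot of the others: PCR only detects recent infection, and a minority of previously infected people lack detectable antibodies.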

After analyzing the genomes of these participants to identify genes associated with natural resistance to SARS-CoV-2 infection, the authors will conduct subsequent studies to determine the role of the genes in the infection process.

Source: Medical News Today

Saturday 30 October 2021

Mars' surface shaped by fast and furious floods from overflowing craters

 On Earth, river erosion is usually a slow-going process. But on Mars, massive floods from overflowing crater lakes had an outsized role in shaping the Martian surface, carving deep chasms and moving vast amounts of sediment, according to a new study led by researchers at The University of Texas at Austin.

The study, published Sept. 29 in Nature, found that the floods, which probably lasted mere weeks, eroded more than enough sediment to completely fill Lake Superior and Lake Ontario.

"If we think about how sediment was being moved across the landscape on ancient Mars, lake breach floods were a really important process globally," said lead author Tim Goudge, an assistant professor at the UT Jackson School of Geosciences. "And this is a bit of a surprising result because they've been thought of as one-off anomalies for so long."

Crater lakes were common on Mars billions of years ago when the Red Planet had liquid water on its surface. Some craters could hold a small sea's worth of water. But when the water became too much to hold, it would breach the edge of the crater, causing catastrophic flooding that carved river valleys in its wake. A 2019 study led by Goudge determined that these events happened rapidly.

Remote sensing images taken by satellites orbiting Mars have allowed scientists to study the remains of breached Martian crater lakes. However, the crater lakes and their river valleys have mostly been studied on an individual basis, Goudge said. This is the first study to investigate how the 262 breached lakes across the Red Planet shaped the Martian surface as a whole.

The research entailed reviewing a preexisting catalog of river valleys on Mars and classifying the valleys into two categories: valleys that got their start at a crater's edge, which indicates they formed during a lake breach flood, and valleys that formed elsewhere on the landscape, which suggests a more gradual formation over time.

From there, the scientists compared the depth, length and volume of the different valley types and found that river valleys formed by crater lake breaches punch far above their weight, eroding away nearly a quarter of the Red Planet's river valley volume despite making up only 3% of total valley length.

"This discrepancy is accounted for by the fact that outlet canyons are significantly deeper than other?valleys," said study co-author Alexander Morgan, a research scientist at the Planetary Science Institute.

At 559 feet (170.5 meters), the median depth of a breach river valley is more than twice that of other river valleys created more gradually over time, which have a median depth of about 254 feet (77.5 meters).
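A quick arithmetic check ties together the figures quoted above. The 24% volume share is an assumption standing in for the article's "nearly a quarter"; the depths are taken directly from the text.

```python
# Values quoted in the article.
breach_depth_m = 170.5   # median depth, breach-formed valleys
other_depth_m = 77.5     # median depth, gradually formed valleys

ratio = breach_depth_m / other_depth_m
print(f"Breach valleys are {ratio:.1f}x deeper")  # 2.2x, i.e. "more than twice"

# Roughly a quarter of the eroded volume from only 3% of total valley
# length implies breach valleys moved far more material per unit length.
volume_share, length_share = 0.24, 0.03  # 0.24 approximates "nearly a quarter"
print(f"~{volume_share / length_share:.0f}x more volume per unit length")
```

The second figure shows why the depth difference alone can account for the discrepancy: a small fraction of valley length carrying several times the sediment load per kilometer dominates the global total.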

In addition, although the chasms appeared in a geologic instant, they may have had a lasting effect on the surrounding landscape. The study suggests that the breaches scoured canyons so deep they may have influenced the formation of other nearby river valleys. The authors said this is a potential alternative explanation for unique Martian river valley topography that is usually attributed to climate.

The study demonstrates that lake breach river valleys played an important role in shaping the Martian surface, but Goudge said it's also a lesson in expectations. The Earth's geology has wiped away most craters and makes river erosion a slow and steady process in most cases. But that doesn't mean it will work that way on other worlds.

"When you fill [the craters] with water, it's a lot of stored energy there to be released," Goudge said. "It makes sense that Mars might tip, in this case, toward being shaped by catastrophism more than the Earth."

The study's other co-authors are Jackson School postdoctoral researcher Gaia Stucky de Quay and Caleb Fassett, a planetary scientist at the NASA Marshall Space Flight Center.

Source: ScienceDaily

Friday 29 October 2021

Asteroid Vesta a window to the early solar system

 The asteroid Vesta is helping scientists better understand the earliest era in the formation of our solar system. Two recent papers involving scientists from the University of California, Davis, use data from meteorites derived from Vesta to resolve the "missing mantle problem" and push back our knowledge of the solar system to just a couple of million years after it began to form. The papers were published in Nature Communications Sept. 14 and Nature Astronomy Sept. 30.

Vesta is the second-largest body in the asteroid belt at 500 kilometers across. It's big enough to have evolved in the same way as rocky, terrestrial bodies like the Earth, moon and Mars. Early on, these were balls of molten rock heated by collisions. Iron and the siderophile, or 'iron-loving,' elements, such as rhenium, osmium, iridium, platinum and palladium, sank to the center to form a metallic core, leaving the mantle poor in these elements. As the planet cooled, a thin solid crust formed over the mantle. Later, meteorites brought iron and other elements to the crust.

Most of the bulk of a planet like Earth is mantle. But mantle-type rocks are rare among asteroids and meteorites.

"If we look at meteorites, we have core material, we have crust, but we don't see mantle," said Qing-Zhu Yin, professor of earth and planetary sciences in the UC Davis College of Letters and Science. Planetary scientists have called this the "missing mantle problem."

In the recent Nature Communications paper, Yin and UC Davis graduate students Supratim Dey and Audrey Miller worked with first author Zoltan Vaci at the University of New Mexico to describe three recently discovered meteorites that do include mantle rock: ultramafic rocks, in which the mineral olivine is a major component. The UC Davis team contributed precise analysis of isotopes, creating a fingerprint that allowed them to identify the meteorites as coming from Vesta or a very similar body.

"This is the first time we've been able to sample the mantle of Vesta," Yin said. NASA's Dawn mission remotely observed rocks from the largest south pole impact crater on Vesta in 2011 but did not find mantle rock.

Probing the early solar system

Because it is so small, Vesta formed a solid crust long before larger bodies like the Earth, moon and Mars. So the siderophile elements that accumulated in its crust and mantle form a record of the very early solar system after core formation. Over time, collisions have broken pieces off Vesta that sometimes fall to Earth as meteorites.

Yin's lab at UC Davis had previously collaborated with an international team looking at elements in lunar crust to probe the early solar system. In the second paper, published in Nature Astronomy, Meng-Hua Zhu at the Macau University of Science and Technology, Yin and colleagues extended this work using Vesta.

"Because Vesta formed very early, it's a good template to look at the entire history of the Solar System," Yin said. "This pushes us back to two million years after the beginning of solar system formation."

It had been thought that Vesta and the larger inner planets could have gotten much of their material from the asteroid belt. But a key finding from the study was that the inner planets (Mercury, Venus, Earth and its moon, Mars and the inner dwarf planets) got most of their mass from colliding and merging with other large, molten bodies early in the solar system. The asteroid belt itself represents the leftover material of planet formation but did not contribute much to the larger worlds.

Additional coauthors on the Nature Communications paper are: James Day and Marine Paquet, Scripps Institute of Oceanography, UC San Diego; Karen Ziegler and Carl Agee, University of New Mexico; Rainer Bartoschewitz, Bartoschewitz Meteorite Laboratory, Gifhorn, Germany; and Andreas Pack, Georg-August-Universität, Göttingen, Germany. Yin's other coauthors on the Nature Astronomy paper are: Alessandro Morbidelli, University of Nice-Sophia Antipolis, France; Wladimir Neumann, Universität Heidelberg, Germany; James Day, Scripps Institute of Oceanography, UCSD; David Rubie, University of Bayreuth, Germany; Gregory Archer, University of Münster, Germany; Natalia Artemieva, Planetary Science Institute, Tucson; Harry Becker and Kai Wünnemann, Freie Universität Berlin.

The work was partly supported by the Science and Technology Development Fund, Macau, the Deutsche Forschungsgemeinschaft and NASA.


Source: ScienceDaily

Thursday 28 October 2021

Chang'e-5 samples reveal key age of moon rocks

 A lunar probe launched by the Chinese space agency recently brought back the first fresh samples of rock and debris from the moon in more than 40 years. Now an international team of scientists -- including an expert from Washington University in St. Louis -- has determined the age of these moon rocks to be close to 1.97 billion years.

"It is the perfect sample to close a 2-billion-year gap," said Brad Jolliff, the Scott Rudolph Professor of Earth and Planetary Sciences in Arts & Sciences and director of the university's McDonnell Center for the Space Sciences. Jolliff is a U.S.-based co-author of an analysis of the new moon rocks led by the Chinese Academy of Geological Sciences, published Oct. 7 in the journal Science.

The age determination is among the first scientific results reported from the successful Chang'e-5 mission, which was designed to collect and return to Earth rocks from some of the youngest volcanic surfaces on the moon.

"Of course, 'young' is relative," Jolliff said. "All of the volcanic rocks collected by Apollo were older than 3 billion years. And all of the young impact craters whose ages have been determined from the analysis of samples are younger than 1 billion years. So the Chang'e-5 samples fill a critical gap."

The gap that Jolliff references is important not only for studying the moon, but also for studying other rocky planets in the solar system.

As a planetary body, the moon itself is about 4.5 billion years old, almost as old as the Earth. But unlike the Earth, the moon doesn't have the erosive or mountain-building processes that tend to erase craters over the years. Scientists have taken advantage of the moon's enduring craters to develop methods of estimating the ages of different regions on its surface, based in part on how pocked by craters the area appears to be.

This study shows that the moon rocks returned by Chang'e-5 are only about 2 billion years old. Knowing the age of these rocks with certainty, scientists are now able to more accurately calibrate their important chronology tools, Jolliff said.

"Planetary scientists know that the more craters on a surface, the older it is; the fewer craters, the younger the surface. That's a nice relative determination," Jolliff said. "But to put absolute age dates on that, one has to have samples from those surfaces."

"The Apollo samples gave us a number of surfaces that we were able to date and correlate with crater densities," Jolliff explained. "This cratering chronology has been extended to other planets -- for example, for Mercury and Mars -- to say that surfaces with a certain density of craters have a certain age."

"In this study, we got a very precise age right around 2 billion years, plus or minus 50 million years," Jolliff said. "It's a phenomenal result. In terms of planetary time, that's a very precise determination. And that's good enough to distinguish between the different formulations of the chronology."

Other interesting findings from the study relate to the composition of basalts in the returned samples and what that means for the moon's volcanic history, Jolliff noted.

The results presented in the Science paper are just the tip of the iceberg, so to speak. Jolliff and colleagues are now sifting through the regolith samples for keys to other significant lunar science issues, such as finding bits and pieces tossed into the Chang'e-5 collection site from distant, young impact craters such as Aristarchus, and possibly determining the ages of these small rocks and the nature of the materials at those other impact sites.

Jolliff has worked with the scientists at the Sensitive High Resolution Ion MicroProbe (SHRIMP) Center in Beijing that led this study, including study co-author Dunyi Liu, for over 15 years. This long-term relationship is possible through a special collaboration agreement that includes Washington University and its Department of Earth and Planetary Sciences, and Shandong University in Weihai, China, with support from Washington University's McDonnell Center for the Space Sciences.

"The lab in Beijing where the new analyses were done is among the best in the world, and they did a phenomenal job in characterizing and analyzing the volcanic rock samples," Jolliff said.

"The consortium includes members from China, Australia, the U.S., the U.K. and Sweden," Jolliff continued. "This is science done in the ideal way: an international collaboration, with free sharing of data and knowledge -- and all done in the most collegial way possible. This is diplomacy by science."

Jolliff is a specialist in mineralogy and provided his expertise for this study of the Chang'e-5 samples. His personal research background is focused on the moon and Mars, the materials that make up their surfaces and what they tell about the planets' history.

As a member of the Lunar Reconnaissance Orbiter Camera science team and leader of the Washington University team in support of NASA's Apollo Next Generation Sample Analysis (ANGSA) program, Jolliff investigates the surface of the moon, relating what can be seen from orbit to what is known about the moon through the study of lunar meteorites and Apollo samples -- and now, from Chang'e-5 samples.

Source: ScienceDaily

Wednesday 27 October 2021

Happiness in early adulthood may protect against dementia

 While research has shown that poor cardiovascular health can damage blood flow to the brain, increasing the risk for dementia, a new study led by UC San Francisco indicates that poor mental health may also take its toll on cognition.

The research adds to a body of evidence linking depression with dementia. While most studies have examined this association in later life, the UCSF study shows that depression in early adulthood may lead to lower cognition 10 years later and to cognitive decline in old age.

The study publishes in the Journal of Alzheimer's Disease on Sept. 28, 2021.

The researchers used innovative statistical methods to predict average trajectories of depressive symptoms for approximately 15,000 participants ages 20 to 89, divided into three life stages: young adulthood, midlife and older age. They then applied these predicted trajectories and found that, in a group of approximately 6,000 older participants, the odds of cognitive impairment were 73 percent higher for those estimated to have elevated depressive symptoms in early adulthood, and 43 percent higher for those estimated to have elevated depressive symptoms in later life.

These results were adjusted for depressive symptoms in other life stages and for differences in age, sex, race, educational attainment, body mass index, history of diabetes and smoking status. For depressive symptoms in midlife, the researchers found an association with cognitive impairment, but it disappeared when they adjusted for depression in other life stages.
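The percentages above are restatements of adjusted odds ratios from the study's regression models: "odds 73 percent higher" corresponds to an odds ratio of about 1.73. A minimal sketch of the conversion, assuming a standard logistic-regression framing:

```python
import math

def percent_higher(odds_ratio):
    """Express an odds ratio as a percent increase in odds."""
    return (odds_ratio - 1.0) * 100

print(round(percent_higher(1.73)))  # 73: elevated symptoms in early adulthood
print(round(percent_higher(1.43)))  # 43: elevated symptoms in later life

# In logistic regression, the odds ratio for a binary predictor is the
# exponential of its coefficient, so the mapping runs both ways:
beta = math.log(1.73)
print(round(math.exp(beta), 2))  # 1.73
```

Note that higher odds are not the same as higher probability; for outcomes that are not rare, a 73% increase in odds corresponds to a smaller increase in absolute risk.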

Excess Stress Hormones May Damage Ability to Make New Memories

"Several mechanisms explain how depression might increase dementia risk," said first author Willa Brenowitz, PhD, MPH, of the UCSF Department of Psychiatry and Behavioral Sciences and the Weill Institute for Neurosciences. "Among them is that hyperactivity of the central stress response system increases production of the stress hormones glucocorticoids, leading to damage of the hippocampus, the part of the brain essential for forming, organizing and storing new memories."

Other studies have linked depression with atrophy of the hippocampus, and one study has shown faster rates of volume loss in women, she said.

In estimating the depressive symptoms across each life stage, researchers pooled data from younger participants with data from the approximately 6,000 older participants and predicted average trajectories. These older participants, whose average age was 72 at the start of the study and who lived at home, had been enrolled in the Health, Aging and Body Composition Study and the Cardiovascular Health Study. They were followed annually or semiannually for up to 11 years.

U-Shaped Curve Adds Credence to Predicted Trajectories

Because no longitudinal studies have followed depressive symptoms across the entire life course, the authors had to rely on imputed values. "Imputed depressive symptom trajectories fit a U-shaped curve, similar to age-related trends in other research," they noted.

Participants were screened for depression using a tool called the CESD-10, a 10-item questionnaire assessing symptoms in the past week. Moderate or high depressive symptoms were found in 13 percent of young adults, 26 percent of midlife adults and 34 percent of older participants.

Some 1,277 participants were diagnosed with cognitive impairment, based on neuropsychological testing, evidence of global decline, documented use of a dementia medication or hospitalization with dementia as a primary or secondary diagnosis.

"Generally, we found that the greater the depressive symptoms, the lower the cognition and the faster the rates of decline," said Brenowitz, who is also affiliated with the UCSF Department of Epidemiology and Biostatistics. "Older adults estimated to have moderate or high depressive symptoms in early adulthood were found to experience a drop in cognition over 10 years."

With up to 20 percent of the population suffering from depression during their lifetime, it's important to recognize its role in cognitive aging, said senior author Kristine Yaffe, MD, of the UCSF departments of Psychiatry and Behavioral Sciences, and Epidemiology and Biostatistics. "Future work will be needed to confirm these findings, but in the meantime, we should screen and treat depression for many reasons."

Source: ScienceDaily

Tuesday 26 October 2021

Scientists discover 14 genes that cause obesity

 Promising news in the effort to develop drugs to treat obesity: University of Virginia scientists have identified 14 genes that can cause and three that can prevent weight gain. The findings pave the way for treatments to combat a health problem that affects more than 40% of American adults.

"We know of hundreds of gene variants that are more likely to show up in individuals suffering obesity and other diseases. But 'more likely to show up' does not mean causing the disease. This uncertainty is a major barrier to exploit the power of population genomics to identify targets to treat or cure obesity. To overcome this barrier, we developed an automated pipeline to simultaneously test hundreds of genes for a causal role in obesity. Our first round of experiments uncovered more than a dozen genes that cause and three genes that prevent obesity," said Eyleen O'Rourke of UVA's College of Arts & Sciences, the School of Medicine's Department of Cell Biology and the Robert M. Berne Cardiovascular Research Center. "We anticipate that our approach and the new genes we uncovered will accelerate the development of treatments to reduce the burden of obesity."

OBESITY AND OUR GENES

O'Rourke's new research helps shed light on the complex intersections of obesity, diet and our DNA. Obesity has become an epidemic, driven in large part by high-calorie diets laden with sugar and high-fructose corn syrup. Increasingly sedentary lifestyles play a big part as well. But our genes play an important role too, regulating fat storage and affecting how well our bodies burn food as fuel. So if we can identify the genes that convert excessive food into fat, we could seek to inactivate them with drugs and uncouple excessive eating from obesity.

Genomicists have identified hundreds of genes associated with obesity -- meaning the genes are more or less prevalent in people who are obese than in people with healthy weight. The challenge is determining which genes play causal roles by directly promoting or helping prevent weight gain. To sort wheat from chaff, O'Rourke and her team turned to humble worms known as C. elegans. These tiny worms like to live in rotting vegetation and enjoy feasting on microbes. However, they share more than 70% of our genes, and, like people, they become obese if they are fed excessive amounts of sugar.

The worms have produced great benefits for science. They've been used to decipher how common drugs, including the antidepressant Prozac and the glucose-stabilizing metformin, work. Even more impressively, in the last 20 years three Nobel prizes were awarded for the discovery of cellular processes first observed in worms but then found to be critical to diseases such as cancer and neurodegeneration. They've also been fundamental to the development of therapeutics based on RNA technology.

In new work just published in the scientific journal PLOS Genetics, O'Rourke and her collaborators used the worms to screen 293 genes associated with obesity in people, with the goal of defining which of the genes were actually causing or preventing obesity. They did this by developing a worm model of obesity, feeding some a regular diet and some a high-fructose diet.

This obesity model, coupled with automation and supervised machine-learning-assisted testing, allowed them to identify 14 genes that cause obesity and three that help prevent it. Enticingly, they found that blocking the action of the three genes that prevented the worms from becoming obese also led to their living longer and having better neuro-locomotory function. Those are exactly the type of benefits drug developers would hope to obtain from anti-obesity medicines.

More work needs to be done, of course. But the researchers say the indicators are encouraging. For example, blocking the effect of one of the genes in lab mice prevented weight gain, improved insulin sensitivity and lowered blood sugar levels. These results (plus the fact that the genes under study were chosen because they were associated with obesity in humans) bode well that the results will hold true in people as well, the researchers say.

"Anti-obesity therapies are urgently needed to reduce the burden of obesity in patients and the healthcare system," O'Rourke said. "Our combination of human genomics with causality tests in model animals promises yielding anti-obesity targets more likely to succeed in clinical trials because of their anticipated increased efficacy and reduced side effects."

Source: ScienceDaily


Monday 25 October 2021

Brain cell differences could be key to learning in humans and AI

 Imperial researchers have found that variability between brain cells might speed up learning and improve the performance of the brain and future artificial intelligence (AI).

The new study found that by tweaking the electrical properties of individual cells in simulations of brain networks, the networks learned faster than simulations with identical cells.

They also found that the networks needed fewer of the tweaked cells to get the same results, and that the method is less energy intensive than models with identical cells.

The authors say that their findings could teach us about why our brains are so good at learning, and might also help us to build better artificially intelligent systems, such as digital assistants that can recognise voices and faces, or self-driving car technology.

First author Nicolas Perez, a PhD student at Imperial College London's Department of Electrical and Electronic Engineering, said: "The brain needs to be energy efficient while still being able to excel at solving complex tasks. Our work suggests that having a diversity of neurons in both brains and AI systems fulfils both these requirements and could boost learning."

The research is published in Nature Communications.

Why is a neuron like a snowflake?

The brain is made up of billions of cells called neurons, which are connected by vast 'neural networks' that allow us to learn about the world. Neurons are like snowflakes: they look the same from a distance but on further inspection it's clear that no two are exactly alike.

By contrast, each cell in an artificial neural network -- the technology on which AI is based -- is identical, with only its connectivity varying. Despite the speed at which AI technology is advancing, artificial neural networks do not learn as accurately or as quickly as the human brain -- and the researchers wondered whether this lack of cell variability might be a culprit.

They set out to study whether emulating the brain by varying neural network cell properties could boost learning in AI. They found that the variability in the cells improved their learning and reduced energy consumption.

Lead author Dr Dan Goodman, of Imperial's Department of Electrical and Electronic Engineering, said: "Evolution has given us incredible brain functions -- most of which we are only just beginning to understand. Our research suggests that we can learn vital lessons from our own biology to make AI work better for us."

Tweaked timing

To carry out the study, the researchers focused on tweaking the "time constant" -- that is, how quickly each cell decides what it wants to do based on what the cells connected to it are doing. Some cells will decide very quickly, looking only at what the connected cells have just done. Other cells will be slower to react, basing their decision on what other cells have been doing for a while.
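The time-constant idea above can be sketched with a simple leaky unit: its state drifts toward its input at a rate set by tau, so a small tau reacts quickly while a large tau averages over a longer history. This is a minimal illustration, not the paper's actual spiking-network model.

```python
def simulate(tau, inputs, dt=1.0):
    """Leaky integration: the state decays toward the current input
    at a rate set by the time constant tau (toy discrete model)."""
    state, trace = 0.0, []
    for x in inputs:
        state += (dt / tau) * (x - state)
        trace.append(state)
    return trace

# Input switches from 0 to 1 at t=5; compare a fast and a slow unit.
step = [0.0] * 5 + [1.0] * 20
fast = simulate(tau=2.0, inputs=step)   # decides quickly, tracks recent input
slow = simulate(tau=10.0, inputs=step)  # reacts slowly, integrates history
print(round(fast[9], 2), round(slow[9], 2))  # 0.97 0.41
```

Mixing units like these in one network is the kind of heterogeneity the study explored: fast units capture rapid changes while slow units retain context, which helps on tasks whose relevant information spans multiple timescales.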

After varying the cells' time constants, they tasked the network with performing some benchmark machine learning tasks: to classify images of clothing and handwritten digits; to recognise human gestures; and to identify spoken digits and commands.

The results show that by allowing the network to combine slow and fast information, it was better able to solve tasks in more complicated, real-world settings.

When they changed the amount of variability in the simulated networks, they found that the ones that performed best matched the amount of variability seen in the brain, suggesting that the brain may have evolved to have just the right amount of variability for optimal learning.

Nicolas added: "We demonstrated that AI can be brought closer to how our brains work by emulating certain brain properties. However, current AI systems are far from achieving the level of energy efficiency that we find in biological systems.

"Next, we will look at how to reduce the energy consumption of these networks to get AI networks closer to performing as efficiently as the brain."

This research was funded by the Engineering and Physical Sciences Research Council and Imperial College President's PhD Scholarship.

Source: ScienceDaily

Sunday 24 October 2021

Neuroscientists roll out first comprehensive atlas of brain cells

When you clicked to read this story, a band of cells across the top of your brain sent signals down your spine and out to your hand to tell the muscles in your index finger to press down with just the right amount of pressure to activate your mouse or track pad.

A slew of new studies now shows that the area of the brain responsible for initiating this action -- the primary motor cortex, which controls movement -- has as many as 116 different types of cells that work together to make this happen.

The 17 studies, appearing online Oct. 6 in the journal Nature, are the result of five years of work by a huge consortium of researchers supported by the National Institutes of Health's Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative to identify the myriad of different cell types in one portion of the brain. It is the first step in a long-term project to generate an atlas of the entire brain to help understand how the neural networks in our head control our body and mind and how they are disrupted in cases of mental and physical problems.

"If you think of the brain as an extremely complex machine, how could we understand it without first breaking it down and knowing the parts?" asked cellular neuroscientist Helen Bateup, a University of California, Berkeley, associate professor of molecular and cell biology and co-author of the flagship paper that synthesizes the results of the other papers. "The first page of any manual of how the brain works should read: Here are all the cellular components, this is how many of them there are, here is where they are located and who they connect to."

Individual researchers have previously identified dozens of cell types based on their shape, size, electrical properties and which genes are expressed in them. The new studies identify about five times more cell types, though many are subtypes of well-known cell types. For example, cells that release specific neurotransmitters, like gamma-aminobutyric acid (GABA) or glutamate, each have more than a dozen subtypes distinguishable from one another by their gene expression and electrical firing patterns.

While the current papers address only the motor cortex, the BRAIN Initiative Cell Census Network (BICCN) -- created in 2017 -- endeavors to map all the different cell types throughout the brain, which consists of more than 160 billion individual cells, both neurons and support cells called glia. The BRAIN Initiative was launched in 2013 by then-President Barack Obama.

"Once we have all those parts defined, we can then go up a level and start to understand how those parts work together, how they form a functional circuit, how that ultimately gives rise to perceptions and behavior and much more complex things," Bateup said.

Together with former UC Berkeley professor John Ngai, Bateup and UC Berkeley colleague Dirk Hockemeyer have already used CRISPR-Cas9 to create mice in which a specific cell type is labeled with a fluorescent marker, allowing them to track the connections these cells make throughout the brain. For the flagship journal paper, the Berkeley team created two strains of "knock-in" reporter mice that provided novel tools for illuminating the connections of the newly identified cell types, she said.

"One of our many limitations in developing effective therapies for human brain disorders is that we just don't know enough about which cells and connections are being affected by a particular disease and therefore can't pinpoint with precision what and where we need to target," said Ngai, who led UC Berkeley's Brain Initiative efforts before being tapped last year to direct the entire national initiative. "Detailed information about the types of cells that make up the brain and their properties will ultimately enable the development of new therapies for neurologic and neuropsychiatric diseases."

Ngai is one of 13 corresponding authors of the flagship paper, which has more than 250 co-authors in all.

Bateup, Hockemeyer and Ngai collaborated on an earlier study to profile all the active genes in single dopamine-producing cells in the mouse's midbrain, which contains structures similar to those in the human brain. This same profiling technique, which involves identifying all the specific messenger RNA molecules and their levels in each cell, was employed by other BICCN researchers to profile cells in the motor cortex. This type of analysis, using a technique called single-cell RNA sequencing, or scRNA-seq, is referred to as transcriptomics.

The scRNA-seq technique was one of nearly a dozen separate experimental methods used by the BICCN team to characterize the different cell types in three different mammals: mice, marmosets and humans. Four of these involved different ways of identifying gene expression levels and determining the genome's chromatin architecture and DNA methylation status, which is called the epigenome. Other techniques included classical electrophysiological patch clamp recordings to distinguish cells by how they fire action potentials, categorizing cells by shape, determining their connectivity, and looking at where the cells are spatially located within the brain. Several of these used machine learning or artificial intelligence to distinguish cell types.

"This was the most comprehensive description of these cell types, and with high resolution and different methodologies," Hockemeyer said. "The conclusion of the paper is that there's remarkable overlap and consistency in determining cell types with these different methods."

A team of statisticians combined data from all these experimental methods to determine how best to classify or cluster cells into different types and, presumably, different functions based on the observed differences in expression and epigenetic profiles among these cells. While there are many statistical algorithms for analyzing such data and identifying clusters, the challenge was to determine which clusters were truly different from one another -- truly different cell types -- said Sandrine Dudoit, a UC Berkeley professor and chair of the Department of Statistics. She and biostatistician Elizabeth Purdom, UC Berkeley associate professor of statistics, were key members of the statistical team and co-authors of the flagship paper.

"The idea is not to create yet another new clustering method, but to find ways of leveraging the strengths of different methods and combining methods and to assess the stability of the results, the reproducibility of the clusters you get," Dudoit said. "That's really a key message about all these studies that look for novel cell types or novel categories of cells: No matter what algorithm you try, you'll get clusters, so it is key to really have confidence in your results."
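One simple way to probe cluster stability in the spirit Dudoit describes is to re-run a clustering algorithm under perturbation (here, different initializations) and check whether pairs of points keep landing in the same cluster. This is a hedged sketch on synthetic data, not the consortium's actual pipeline:

```python
import numpy as np

def kmeans(X, k, seed, iters=20):
    """Plain Lloyd's k-means with greedy farthest-point initialization."""
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):  # next center = point farthest from existing ones
        dists = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(dists)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

def pairwise_agreement(a, b):
    """Fraction of point pairs grouped the same way by two labelings
    (invariant to relabeling of the clusters)."""
    a, b = np.asarray(a), np.asarray(b)
    same_a = a[:, None] == a[None, :]
    same_b = b[:, None] == b[None, :]
    off_diag = ~np.eye(len(a), dtype=bool)
    return (same_a == same_b)[off_diag].mean()

rng = np.random.default_rng(0)
# Two well-separated synthetic "cell populations" in a 2-D feature space.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
runs = [kmeans(X, k=2, seed=s) for s in range(5)]
stability = min(pairwise_agreement(runs[0], r) for r in runs[1:])
# stability near 1.0 -> the clusters are robust to initialization
```

A cluster that dissolves or reshuffles under such perturbation is a candidate artifact of the algorithm rather than a genuinely distinct cell type.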

Bateup noted that the number of individual cell types identified in the new study depended on the technique used and ranged from dozens to 116. One finding, for example, was that humans have about twice as many different types of inhibitory neurons as excitatory neurons in this region of the brain, while mice have five times as many.

"Before, we had something like 10 or 20 different cell types that had been defined, but we had no idea if the cells we were defining by their patterns of gene expression were the same ones as those defined based on their electrophysiological properties, or the same as the neuron types defined by their morphology," Bateup said.

"The big advance by the BICCN is that we combined many different ways of defining a cell type and integrated them to come up with a consensus taxonomy that's not just based on gene expression or on physiology or morphology, but takes all of those properties into account," Hockemeyer said. "So, now we can say this particular cell type expresses these genes, has this morphology, has these physiological properties, and is located in this particular region of the cortex. So, you have a much deeper, granular understanding of what that cell type is and its basic properties."

Dudoit cautioned that future studies could show that the number of cell types identified in the motor cortex is an overestimate, but the current studies are a good start in assembling a cell atlas of the whole brain.

"Even among biologists, there are vastly different opinions as to how much resolution you should have for these systems, whether there is this very, very fine clustering structure or whether you really have higher level cell types that are more stable," she said. "Nevertheless, these results show the power of collaboration and pulling together efforts across different groups. We're starting with a biological question, but a biologist alone could not have solved that problem. To address a big challenging problem like that, you want a team of experts in a bunch of different disciplines that are able to communicate well and work well with each other."

Other members of the UC Berkeley team included postdoctoral scientists Rebecca Chance and David Stafford, graduate student Daniel Kramer, research technician Shona Allen of the Department of Molecular and Cell Biology, doctoral student Hector Roux de Bézieux of the School of Public Health and postdoctoral fellow Koen Van den Berge of the Department of Statistics. Bateup is a member of the Helen Wills Neuroscience Institute, Hockemeyer is a member of the Innovative Genomics Institute, and both are investigators funded by the Chan Zuckerberg Biohub.


Source: ScienceDaily

Saturday 23 October 2021

Hypertension, gut bacteria, and sleep apnea: Is there a link?

In this Special Feature, we examine a potential link between sleep apnea, hypertension, and gut bacteria. Although a link between the three may seem unlikely on the surface, scientists are unraveling the connections.

Before taking a look at the bonds between these three seemingly unconnected things, we should start with an explainer: What is sleep apnea?

Sleep apnea is a condition where an individual stops breathing for periods throughout the night. Obstructive sleep apnea (OSA), the most common form, occurs when throat muscles temporarily relax and block the airways.

Central apnea, which is relatively rare, occurs when the brain does not send appropriate signals to the muscles engaged in breathing. The issue here lies within the central nervous system, rather than a physical obstruction of the airway.

OSA affects an estimated 22% of men and 17% of women. If undetected, it can increase the risk of developing heart disease and depression. Research also suggests a link between OSA and endocrine disorders such as type 2 diabetes, neurological disorders including epilepsy, and hypertension.

Hypertension needs no introduction. It is even more prevalent than sleep apnea, affecting nearly half of adults in the United States. Worldwide, high blood pressure affects around 1.28 billion people aged 30–79.

Although treatment for hypertension is available, the drugs do not work effectively for some individuals. It is not always clear why this is the case, but OSA is common in individuals with drug-resistant hypertension.

Also, research suggests that there is a dose-response relationship between sleep-disordered breathing and hypertension. In other words, individuals with more severe breathing difficulties at night have an increased risk of developing hypertension.

Scientists are still exploring the mechanisms involved in the link between OSA and hypertension. Multiple pathways are likely, but some researchers believe that the gut microbiome might play a role.

If this article had appeared 20 years ago, we might have needed to explain the importance of gut bacteria. Today, though, with the microbiome featuring prominently in both scientific research and yogurt marketing campaigns, most people will be familiar with the concept.

In brief, we harbor up to 100 trillion microorganisms in our mouths and gastrointestinal tract.

These microbes are historically famous for helping us digest certain components of our food, but scientists now suggest associations between them and a wide range of health conditions, including obesity, diabetes, Parkinson's disease, and dementia.

Studying the interactions between our microbial residents and our health is notoriously challenging, and many questions remain. What we do now know for certain is that we would be far worse off without them.

The link between our intestinal residents and disease will take some unpicking, but one of the key factors seems to be dysbiosis, an imbalance or reduction in microbial diversity.

In oversimplified terms, changes to the bacteria’s environment might, for instance, hinder “good” bacteria while giving “bad” bacteria an opportunity to thrive.

Now, to the matter in hand.

The question remains: How could bacteria in our gut play a role in sleep apnea-related hypertension?

A review in the Journal of Clinical Sleep Medicine (JCSM) outlines one theory as to how disordered breathing during sleep might influence bacterial populations.

Sleep apnea causes intermittent hypoxia or low levels of oxygen in the blood throughout the night. This hypoxia produces periodic decreases in the partial oxygen pressure gradient inside the tubes of the gastrointestinal system.

Consequently, bacteria that can only grow in low oxygen environments — obligate anaerobes — and those that can thrive with or without oxygen — facultative anaerobes — get a boost. As with any finely balanced ecosystem, when certain populations receive a leg up, they might push others aside.

In one study, scientists induced intermittent hypoxia in mice over the course of 6 weeks. After analyzing their feces, they note a “significant effect of intermittent hypoxia on global microbial community structure.”

Specifically, the study notes an abundance of Firmicutes and a reduction in Bacteroidetes and Proteobacteria compared with control mice.

An increase in Firmicutes with a reduction in Bacteroidetes is considered a hallmark of dysbiosis.

Some people refer to Bacteroidetes as good bacteria because they produce short-chain fatty acids (SCFAs) — we will hear much more about the importance of SCFAs later. Conversely, some think of Firmicutes as bad because they have a negative influence on glucose and fat metabolism.

These two bacterial phyla, or types, make up around 90% of the bacteria in our gut, so changes to these groups are likely to impact the delicately balanced ecosystem within us.

In animal studies, researchers have shown that a shift in bacterial populations can cause the degradation of mucins in the gut. Mucins help keep the gut lining or epithelium healthy. If they suffer damage, the epithelium can become more permeable or “leaky” as the damage disrupts junctions between epithelial cells.

Also, good bacteria, which produce SCFAs from dietary fiber, are in shorter supply.

SCFAs are a source of nutrition and energy for the epithelium. With the limited production of SCFAs, the epithelium takes a second hit. Again, this can cause dysfunction in the epithelium.

On top of the mucin degradation and drop in SCFAs, intermittent hypoxia itself can physically damage the epithelium.

Here, we have outlined how disordered breathing at night can cause damage to the intestinal epithelium through mucin breakdown, reduced levels of SCFAs, and physical damage through hypoxia.

But we must ask again, what has this got to do with hypertension?

A damaged intestinal epithelium or “leaky gut” allows increased traffic from the gut into the blood. Compounds that the gut would normally trap and excrete from the body can now enter the blood and travel to distant organs and systems.

Another effect of dysbiosis is an increase in bacterial species that produce toxins. A healthy epithelium might block these toxins, but an overly permeable epithelium allows these compounds to slip into the blood.

Once they are in circulation, the body mounts a low-grade inflammatory response. Here, finally, we meet the intersection of OSA, gut bacteria, and hypertension.

Inflammation occurs with a wide range of conditions, including atherosclerosis, a buildup of a fatty substance on the walls of arteries.

Inflammation also appears to play a role in the development of hypertension. For instance, a study that followed 3,274 men without hypertension at baseline for 9 years supports this association.

To summarize, intermittent hypoxia experienced during the night disturbs the microbiome. This bacterial disturbance makes the gut epithelium leaky and allows toxins to leach into the blood. This sparks inflammation and, therefore, increases the risk of hypertension and other cardiovascular problems.

As the authors of the JCSM review conclude:

“OSA animal models confirmed that [intermittent hypoxia during sleep] and sleep fragmentation […] are associated with worsening gut microbiota profile with the emergence of species negatively influencing the gut through damage to the epithelial cells and producing endotoxins leading to a state of ‘low-grade systemic inflammation.’”

This, they explain, “may contribute to the pathophysiologic mechanisms for many chronic inflammatory, cardiovascular, and neoplastic disorders.”

Source: Medical News Today