Scientists Simplifying Science

Reporting from the Lab

hot research from labs' oven

Antisense-ing Alzheimer’s

in Reporting from the Lab

Enhanced life expectancy has led to a rise in aging-associated disorders such as Alzheimer’s disease (AD). Two important pathological hallmarks of AD are beta-amyloid plaques and neurofibrillary tangles (NFTs) in the brain. Beta-amyloid plaques form when beta-amyloid protein clusters abnormally between neurons. NFTs arise from aggregates of Tau, a protein critical for microtubule stability and axonal transport. Both processes disrupt neuronal communication, subsequently leading to neuronal damage and loss.

While current treatment options only manage the cognitive symptoms of the disease, there is a quest to target the specific underlying disease mechanisms. In a fascinating study in Science Translational Medicine, DeVos et al. propose the use of an antisense oligonucleotide (ASO) to decrease the accumulation of misfolded Tau protein in the brain and to reverse Tau deposition in older mice.

In this paper, the authors designed an ASO that specifically targets human Tau and reduces its expression. ASOs are synthetic single-stranded nucleotides that bind complementary mRNA or precursor pre-mRNA (the transcript that undergoes splicing and other modifications) and consequently inhibit or reduce protein expression or modify protein function. For this study, the researchers used transgenic PS19 mice, which express a mutant P301S human Tau protein responsible for the development of AD-like pathology in these mice. Upon administration of a synthetic ASO targeting human Tau into the brains of these mice, the expression of human Tau protein was significantly reduced. Additionally, to determine whether ASO treatment can prevent the toxic accumulation of Tau, they administered the ASO at 6 months of age and examined Tau levels in the brain at 9 months of age. Interestingly, treated mice showed significantly reduced Tau levels, suggesting that the ASO can prevent Tau-associated pathology as the mice age. The researchers further demonstrated that the NFT accumulation seen in 9-month-old mice can be reversed upon ASO treatment, underscoring the therapeutic potential of these ASOs. AD progression is driven by the propagation of Tau within the brain; this ability of pathologic Tau to misfold naïve Tau was also reduced upon ASO treatment. Ultimately, ASO treatment increased the survival of these mice without any decline in their ability to complete a functional task such as building a nest, a common measure of cognition, social behavior, and motor capabilities in mice.

These in vivo preclinical studies in mice were further supported by studies in non-human primates, cynomolgus monkeys. As Tau is naturally present in the nervous system, the researchers showed that ASO treatment reduces endogenous Tau levels in the brain and spinal cord of these monkeys. Further, Tau levels in the cerebrospinal fluid (CSF) can serve as a surrogate marker of treatment efficacy, as they correlated directly with the reduction of Tau protein within the brains of ASO-treated monkeys.

A major limitation in the treatment of current Tau pathologies is the inability to reverse the damage already caused by these aggregates. In this regard, one of the most remarkable features of this study is the ability of the human Tau ASO not just to prevent but also to reverse the Tau pathology observed in PS19 mice. Advancement to clinical trials requires further studies establishing its efficacy in clearing Tau aggregates without affecting general cognitive functions in humans. In the current study, the ASO was administered through the surgical placement of an osmotic pump in the brains of mice; it is therefore critical to identify feasible and safe mechanisms for delivering ASOs to the central nervous system of patients. This has been an ongoing area of research for several neurological disorders.

ASOs have been studied extensively as therapeutic molecules for various disorders, especially devastating neurodegenerative diseases such as spinal muscular atrophy (SMA) and amyotrophic lateral sclerosis (ALS). A recent breakthrough in the ASO therapeutic field came with the FDA approval of Biogen and Ionis Pharmaceuticals’ nusinersen, an ASO to treat SMA, a debilitating disease affecting children. An ASO (BIIB067) that disrupts the production of misfolded protein produced by the mutant SOD1 gene in ALS is also in clinical trials. In partnership with Roche, Ionis is also conducting clinical trials with an ASO (IONIS-HTTRx) that can reduce the levels of Huntingtin protein in Huntington’s disease.

AD is in dire need of a drug that targets specific pathogenic activity and not just the symptoms of the disease. Unfortunately, promising candidates that prevent the formation of beta-amyloid plaques, such as Pfizer and Johnson & Johnson’s bapineuzumab, Eli Lilly’s solanezumab, and Merck’s verubecestat, have failed in clinical trials. Despite these setbacks, companies continue to investigate novel therapies to fight this disease. The preclinical studies in this paper with an ASO that reduces Tau protein levels could transform the Alzheimer’s therapeutic landscape.

Journal article:

DeVos SL, et al. Tau reduction prevents neuronal loss and reverses pathological tau deposition and seeding in mice with tauopathy. Science Translational Medicine. 2017. DOI: 10.1126/scitranslmed.aag0481

Additional newsfeed:

http://www.thescientist.com/?articles.view/

http://www.sciencemag.org/news

https://www.alz.org/research/science/

https://www.eurekalert.org/pub_releases/

https://www.newscientist.com/article/2119254

Photo source: 

www.alzheimersreadingroom.com

Edited by Isha Verma 

About the author 

Radhika completed her PhD from Cornell University and is currently a Postdoctoral fellow at the Brigham and Women’s Hospital. Her research interests have centered around oncology and neuroimmunology. Among other things, she is striving to effectively communicate scientific discoveries to the community.

Dare to Share – The dilemma surrounding data sharing

in Reporting from the Lab

Preview note: As the scientific community slides into an era of towering collaborative and multidisciplinary projects, it is now impossible to ignore the importance of open-source and free data sharing. However, the community is gravely divided on this ground. As we progress through the discussion below, we find that both the pros and cons are justified and deserve open-minded consideration. Inspired by a heated online debate on the official Facebook page of the Career and Support Group on 29th March 2017, Rohit decided to reasonably curate the views under one roof. This is a very enticing article, especially for early-stage researchers who often find themselves in the dilemma of ‘To be shared or not to be’. - Rituparna Chakrabarti


Data sharing is an integral part of collaborative scientific research and is often encouraged within the scientific community. The National Science Foundation has emphasized the importance of sustainable data sharing and management for the progress of science and engineering, and has proposed policies in its favor. The New England Journal of Medicine has published a number of articles and editorials highlighting the importance of, and new developments in, data sharing, especially in the clinical sciences. Generally speaking, then, while there may be multiple aspects and finer details attached to individual arguments, it is accepted that data sharing has a positive impact on scientific research and is encouraged1. There is, however, a part of the data sharing conversation that is often experienced firsthand by fresh PhDs, postdoctoral fellows and other young scientists and researchers.


As young researchers attempt to embark upon new career opportunities, they must rely upon the limited research experience they have accumulated thus far. It is natural that they want to use this experience to sell their skills and knowledge to a prospective employer during a job interview. It is also expected that the prospective employer will sometimes want to learn more about the candidate’s past research and evaluate them based on their work. This may involve seeking relevant data, research methodologies and the innovations involved in that research. If the said research has already been published (or has been accepted for publication), the subsequent process is fairly straightforward and the candidate can triumphantly share the past research. If, however, the research is as yet unpublished and/or under peer review, then data sharing can be tricky. The Principal Investigator (PI) heading the research might not be too keen on sharing it outside the lab until it is published. Scientific research is a competitive domain, and it is a valid concern for such a PI, who might be at risk of getting scooped if the data ends up in the hands of a competitor. And the impact of publishing novel research on the career of a scientist is well known. But what, then, must young researchers do? They are not allowed to share the research they have worked hard on, even if they want to, while a prospective employer will want to evaluate the candidate’s research skills, and such a discussion might require discussing unpublished work. Everyone involved appears to be justified in their stance.


There are some suggestions2, ranging from better interpersonal communication to changing mindsets in the field, that may allow young researchers to circumvent this issue. At the outset, it is essential to establish that all data and other research content generated in an academic setting belong to the hosting university or institute; the employees may therefore not be at complete liberty to disseminate this content without prior permission from the university. But this doesn’t mean that the research may not be discussed at all with a third party (unless some form of non-disclosure agreement is involved). In such a situation, researchers seeking employment outside the university may choose to have a prior discussion with the prospective employer and ask to share only the published research details (preferably pre-approved by the PI/university), which might demonstrate their competence and research acumen. But this may not always be easy. Talking about the dilemma of how much research one should share when interviewing at a company while working at another, Derek Lowe has noted: “No published work worth talking about, no patent applications, no nothing. I actually did go out and give an interview seminar under those conditions once, and it was an unpleasant experience. I had to talk about ancient stuff from my post-doc, and it was a real challenge convincing people that I knew what was going on in a drug company. I don’t recommend trying it.” If unpublished data must be discussed, it can be done as part of a more general problem-to-solution-to-impact discussion, i.e., without communicating any specific details, presenting any slides, or going beyond a verbal mention of the data. The interviewee must be clear about what can and cannot be discussed, because once the idea is out there, anyone may be able to claim rights to it if it is not already published or patented. Data sharing under these circumstances does present a unique challenge. As an innovative solution, the PI and the prospective employer may discuss a publication strategy beforehand that benefits all parties involved. This approach is more likely to succeed if the candidate takes the initiative to establish such a communication channel and is open about the prospects: high risk, high reward.


Needless to say, this problem may not arise at all if the research in question has already been published on a preprint platform such as arXiv or bioRxiv. While the jury is still out on the pros and cons of the preprint strategy, it is undeniable that preprints have been gaining popularity due to their open nature and ease of submission. In spite of the benefits, this option may be more favorable to researchers publishing in physics, mathematics and informatics related fields (primarily on arXiv) than to those publishing in the life sciences (primarily on bioRxiv) and chemistry related fields. This argument is supported by the statistics as they currently stand. Launched in November 2013, bioRxiv had received ~3100 submissions by January 2016 (~114 submissions/month). arXiv, on the other hand, was launched in August 1991 and has received ~1.2 million submissions to date* (~4050 submissions/month). The reason it took some 23 years for a life-science-centric preprint server to be launched might have to do with both the culture and the content of life sciences research. As may be clear by now, publishing one’s research on a preprint server is not the end of the road. Most researchers eventually want to publish the same (or an improved version of the) research in a high-impact peer-reviewed academic journal. Policies on publishing manuscripts that have appeared elsewhere, including on preprint servers, differ by journal. Even though researchers would like to publish on a preprint server (for all the reasons discussed above), they are wary of the fact that they may ultimately be unable to publish the same research in high-impact venues like NEJM or the AACR journals. The problem is more acute in life science journals, while most physics and mathematics journals accept research previously posted on preprint servers.
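
As a quick sanity check on those rates, here is a back-of-the-envelope calculation in Python. The totals are the ones quoted above; the endpoint dates are assumptions (bioRxiv counted through January 2016, arXiv through early 2017, roughly when this piece was written), so the per-month figures land close to, but not exactly on, the numbers in the text.

```python
from datetime import date

def months_between(start: date, end: date) -> int:
    """Approximate number of calendar months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month)

# Totals quoted in the text; endpoint dates are assumptions.
biorxiv_rate = 3_100 / months_between(date(2013, 11, 1), date(2016, 1, 1))
arxiv_rate = 1_200_000 / months_between(date(1991, 8, 1), date(2017, 1, 1))

print(f"bioRxiv: ~{biorxiv_rate:.0f} submissions/month")  # ~119
print(f"arXiv:   ~{arxiv_rate:.0f} submissions/month")    # ~3934
```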


But things seem to be looking up in the preprint world in general. bioRxiv is far younger than arXiv, but its rate of submissions has been steadily increasing since inception, which means that more and more researchers are opting for this route. On that note, it is now possible to directly submit bioRxiv preprints to leading academic journals. Encouraged by the success of the preprint approach and its potential to accelerate scientific discovery, the Chan Zuckerberg Initiative, which has pledged $3 billion over 10 years to biomedical research, has decided to fund bioRxiv. What’s more, the American Chemical Society has decided to launch a preprint server for chemists. These developments point to the preprint approach becoming the leading way to share research data before it is published in a peer-reviewed journal. Therefore, in the absence of extraordinary circumstances, if the PI can be persuaded to post the research to a preprint server, the candidate may avoid the difficulties around data sharing. It is, therefore, important to foster a productive, amicable and strong professional relationship with the PI.


Even with the increase in the sheer amount of data, data sharing today is easier than ever. The question is how much we are willing to share and how much we are allowed to share. For young researchers and graduates transitioning into new research positions, these questions can be the difference between success and failure. The suggestions above aim to provide a template and facilitate decision making for these very researchers. Eventually, a more collaborative effort and understanding by all the stakeholders is required.

1Further reading on the importance of Data sharing:

https://www2.usgs.gov/datamanagement/share/guidance.php

http://blogs.nature.com/methagora/2013/07/importance-of-data-sharing.html

https://www.nature.com/nbt/journal/v25/n4/full/nbt0407-398.html


2Certain suggestions are adapted from a recent discussion on the official Career and Support Group Facebook page, which inspired this post. The contribution of the members to this discussion is acknowledged and appreciated.


*At the time of publication of this article.

Acknowledgements: Somdatta Karak (Editing), Rituparna Chakrabarti (Featured image).


About the author: Rohit Arora obtained his PhD from the ENS in France. Post-PhD, he worked as a postdoc in France in collaboration with a major pharmaceutical company. He is currently a postdoc at Beth Israel Deaconess Medical Center. His research focus includes understanding biological structure-function relationships and developing novel tools to make sense of “big data” in biology. He enjoys reading about his newfound interest in the history of mathematics, geometry, and philosophy. He can be reached on Twitter @RealRohitArora (sure, you try to come up with a better handle for a name this common).

This work by ClubSciWri is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Bad complements?

in Reporting from the Lab

Recently, we observed Rare Disease Day to raise awareness of the rare diseases that afflict millions of people in the United States. While the definition of a rare disease varies by country, in the United States a rare disease is one that affects fewer than 200,000 people at a given time. Scientists worldwide have made tremendous progress in identifying and understanding the clinical manifestations and pathogenesis of these diseases, leading to several treatment options and saving many lives. In a recent article in the journal Nature, Pandey et al. unraveled a critical role played by the body’s immune system in Gaucher disease, thereby prompting a potential treatment option.

Gaucher disease is an inherited disorder caused by mutations in the gene Gba1 that lead to a deficiency in the enzyme glucocerebrosidase (GCase). This enzyme resides in the lysosome, the digestive system of the cell, which contains numerous enzymes necessary for the breakdown of complex molecules. GCase, in particular, is required for the breakdown of a fatty compound, glucosylceramide (GC), into simpler components that can be recycled and utilized for other cellular processes. Consequently, a deficiency of this enzyme leads to the accumulation of GC within the lysosomes of immune cells in the spleen, bone marrow and liver, leading to chronic inflammation. The reason for this tissue inflammation has remained elusive.

Enzyme replacement therapy (ERT) is effective in compensating for the enzyme deficiency associated with Gaucher disease; Genzyme’s Cerezyme (imiglucerase) was the first approved ERT treatment. Alternatively, substrate reduction therapy (SRT) prevents the formation of GC itself, thereby reducing its accumulation. Two oral drugs, Actelion Pharmaceuticals’ Zavesca (miglustat) and Genzyme’s Cerdelga (eliglustat), are commonly used for SRT. However, neither approach addresses the inflammation associated with the disease. The study by Pandey et al. identifies a novel target that can help overcome some of the limitations of current treatment and potentially benefit patients.

In a mouse model of Gaucher disease, Gba19V/-, Pandey et al. found elevated levels of C5a in the immune cells of the spleen, liver and lung compared to normal mice. This was accompanied by an increased expression of its receptor, C5aR1, in these cells. C5a is part of the complement system, an arm of the immune system that plays an important role in inflammation and homeostasis. It is a cleavage product of C5 and is produced upon activation of macrophages and circulating monocytes, cells of the innate immune system that play a critical role in protecting the body by generating effective immune responses. C5a binds to the receptor C5aR1 on the surface of some innate immune cells, which perpetuates the inflammatory response by activating another type of immune cell, namely T cells. Strikingly, double-deficient mice lacking GCase and the C5aR1 receptor (Gba19V/- C5aR1-/-) showed little to no GC accumulation and a significantly reduced inflammatory response, with improved survival. Another interesting feature of these double-deficient mice was the decreased expression of glucosylceramide synthase (GCS), an enzyme required for the synthesis of GC. Treatment of Gba19V/- mice with an antagonist compound that blocks C5aR1 (C5aRA) also resulted in decreased GC accumulation and reduced inflammation.

This study suggests that targeting C5aR1, or C5 itself, can potentially ameliorate inflammation and GC accumulation. There are two options available pharmaceutically to test this proposition in preclinical models. Alexion Pharmaceuticals’ Soliris (eculizumab) is an anti-C5 monoclonal antibody that binds C5 and prevents its cleavage into C5a; it is currently approved for the treatment of a rare disorder, paroxysmal nocturnal hemoglobinuria. Another option is to target the C5a receptor (C5aR1) using antagonists such as avacopan (CCX168), developed by ChemoCentryx, which is currently in clinical trials for the treatment of inflammatory disorders that affect the kidney.

Another interesting implication of this study arises from the observation that Gaucher disease is closely associated with the neurodegenerative disorder Parkinson’s disease (PD). Studies indicate that GCase and alpha-synuclein, the protein whose dysfunction is a major phenomenon in PD, have a reciprocal relationship, and several ongoing investigations are focused on teasing apart this connection. The study by Pandey et al. opens up an area of investigation to determine the interplay between the complement system and inflammation in the brain that may explain the correlation between these diseases.

There are over 50 lysosomal storage disorders (LSDs), rare inherited diseases commonly driven by enzyme deficiencies that lead to unwanted accumulation of materials in the body. This study provides a promising therapeutic strategy not only for Gaucher disease but also for other LSDs associated with chronic inflammation.

Journal article:

Manoj K. Pandey et al. Complement drives glucosylceramide accumulation and tissue inflammation in Gaucher disease. Nature (2017). DOI: 10.1038/nature21368

Additional newsfeed:

https://gaucherdiseasenews.com/2017/03/02/study-says-suppression-of-protein-could-lead-to-new-gaucher-therapies/

http://www.alzforum.org/papers/complement-drives-glucosylceramide-accumulation-and-tissue-inflammation-gaucher-disease

https://www.sciencedaily.com/releases/2017/02/170222131459.htm

http://healthmedicinet.com/nature-study-suggests-new-therapy-for-gaucher-disease/

Photo source: Stocktrek Images

Edited by Isha Verma

About the author 

Radhika completed her PhD from Cornell University and is currently a Postdoctoral fellow at the Brigham and Women’s Hospital. Her research interests have centered around oncology and neuroimmunology. Among other things, she is striving to effectively communicate scientific discoveries to the community.

Identifying the lemurs

in Biodiversity and Environment/Reporting from the Lab

In the last century, we lost many magnificent animal species, including the Honshu wolf, California grizzly bear, Tasmanian tiger, Barbary lion, Caribbean monk seal, Arabian ostrich, and Japanese sea lion. Many other species now face the risk of extinction. Among them are the lemurs – you might remember them as the cute fuzzy creatures from the movie Madagascar. Lemurs are a unique group of primates endemic to the island of Madagascar, off the southeastern coast of Africa, and are considered the most threatened group of mammals on Earth. According to the International Union for Conservation of Nature (IUCN) Red List of Threatened Species, of the 111 lemur species, 24 are critically endangered, 49 are endangered and 20 are vulnerable. This highlights the urgent need to develop conservation strategies for these animals.

To do so, it is important to acquire knowledge of the behavior, ecology, and evolution of the various lemur species, including data on life history, fitness, longevity, and reproductive patterns. Such data can be acquired through long-term studies of known sets of lemurs. However, long-term studies are limited by the difficulty of tracking known individuals over extended periods of time. The most common method of lemur identification is to capture and tag them with unique identifiers. However, this method is expensive, can harm the animals, and is not suitable for large-scale studies. Alternatively, researchers rely on variations in the appearance of lemurs, such as differences in body size and shape, to identify them. But this is highly subjective, prone to errors, and requires substantial training of the researchers. Addressing these problems, scientists (Crouse et al.) recently published a study in BMC Zoology in which they adapted human facial recognition technology to develop a highly accurate computer-assisted lemur facial recognition system termed LemurFaceID. This system uses variations in the facial patterns of lemurs to identify them from photographs.

For the prototype development, the researchers generated a dataset of 462 photographs of 80 red-bellied lemurs (Eulemur rubriventer), mostly from individuals in Madagascar. Additionally, to increase the size of the lemur photo gallery, another database was generated containing images of lemurs belonging to other species. Each image in the database was subjected to multiple pre-processing steps, and further normalizations were performed to reduce the effects of ambient illumination and the lemurs’ facial hair on the accuracy of LemurFaceID. The corrected image was subjected to feature extraction using the multi-scale local binary pattern (MLBP) method. The final feature vector was constructed using linear discriminant analysis (LDA), which helped minimize the variation between photographs of the same individual. To perform face matching, the lemur dataset was divided into (i) a training set, used to train the LemurFaceID system, and (ii) a testing set, used to test the accuracy of the system. Within the test set, two-thirds of the images of each individual were used as a gallery in the system database, while the remaining one-third were used as queries. Each query consisted of one or more images to be identified against the gallery database.
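
To make the pipeline concrete, here is a minimal, hypothetical sketch of the same idea in Python (local binary pattern features, an LDA projection, then nearest-neighbor matching against a gallery), using scikit-image and scikit-learn. This is not the authors’ code: LemurFaceID uses multi-scale LBP on eye-aligned, illumination-normalized photographs and a two-thirds/one-third gallery-query split, whereas this toy example fakes “individuals” with synthetic textures and a simple alternating split, purely to show how the pieces fit together.

```python
# Toy sketch of an LBP + LDA face-matching pipeline in the spirit of
# LemurFaceID. NOT the published implementation: synthetic textures stand
# in for aligned, illumination-normalized lemur face crops.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def lbp_histogram(img, P=8, R=1.0):
    """Summarize one image as a normalized histogram of uniform LBP codes."""
    codes = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(0)
features, labels = [], []
for individual in range(10):          # 10 fake "lemurs"
    base = rng.random((64, 64))       # one underlying "face" texture each
    for _ in range(6):                # 6 "photographs" per individual
        photo = base + 0.2 * rng.random((64, 64))  # per-photo noise
        features.append(lbp_histogram(photo))
        labels.append(individual)
X, y = np.array(features), np.array(labels)

# LDA projects the features into a space that pulls photos of the same
# individual together and pushes different individuals apart.
lda = LinearDiscriminantAnalysis(n_components=9).fit(X, y)

# Alternating split: half of each individual's photos form the gallery,
# the other half serve as queries.
gallery, queries = lda.transform(X[::2]), lda.transform(X[1::2])
gallery_ids, query_ids = y[::2], y[1::2]

# Closed-set matching: the nearest gallery image decides the identity.
dists = np.linalg.norm(queries[:, None, :] - gallery[None, :, :], axis=2)
predicted = gallery_ids[dists.argmin(axis=1)]
print(f"closed-set rank-1 accuracy: {(predicted == query_ids).mean():.2%}")
```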

The researchers conducted face recognition experiments in two different modes. The open-set mode was based on the assumption that, during the experiments, queries might be encountered that do not match any of the images in the gallery. This corresponds to conditions in the wild, where one might encounter novel lemur individuals that have not been spotted before and are consequently absent from the dataset. On the other hand, experiments in the closed-set mode were performed under the assumption that all query lemurs were present in the gallery. This simulates conditions in captive lemur colonies, where all individuals are already identified. Across 100 trials performed in the closed-set mode, LemurFaceID identified lemurs with an accuracy of about 93.3% for 1-image queries and 98.7% for 2-image queries. However, the results in the open-set mode were less accurate, suggesting a need to further improve the technique, perhaps by increasing the size of the lemur database. In the future, the researchers plan to test the system in the field to compare its accuracy with that of trained and untrained field observers.
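
Continuing the sketch above, the open-set mode amounts to adding a rejection rule: a query is assigned an identity only if its nearest gallery match is close enough, and is otherwise flagged as a novel individual. The threshold below is an arbitrary placeholder; a real system would tune it on held-out data.

```python
# Open-set extension of the sketch above: reject matches that are too far.
best_dist = dists.min(axis=1)
threshold = np.percentile(best_dist, 80)  # placeholder value, not tuned
open_set_pred = np.where(best_dist <= threshold, predicted, -1)  # -1 = novel individual
```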

LemurFaceID provides a novel tool that will greatly facilitate long-term research on known lemur populations and help develop informed strategies for lemur conservation. As lemurs also face the threat of being live-captured and kept as pets, this technique can be developed into a tool to identify captive lemurs and report their sightings. The IUCN has started a lemur conservation program under the auspices of its Save Our Species (SOS) initiative and has been trying to tackle the various threats faced by lemurs; LemurFaceID can boost these efforts. In the future, face recognition tools similar to LemurFaceID could be developed for other animals that show similar variation in facial and skin patterns, such as bears and red pandas. Such innovative approaches, combined with advanced technology, have the potential to create better solutions for conserving our biodiversity.

Journal reference:

Crouse D, Jacobs RL, Richardson Z, Klum S, Jain A, Baden AL, Tecot SR. LemurFaceID: a face recognition system to facilitate individual identification of lemurs. BMC Zoology. 2017, 2:2. DOI: 10.1186/s40850-016-0011-9.

Other references:

http://www.sciencenewsline.com/news/2017021701450007.html

http://www.livescience.com/57995-lemur-facial-recognition-software.html

https://phys.org/news/2017-02-facial-recognition-lemurs.html

http://www.seeker.com/facial-recognition-tech-could-help-save-endangered-lemurs-2268739486.html

http://stateschronicle.com/save-endangered-lemurs-18714.html

https://www.iucn.org/news/species/201610/major-donation-boosts-efforts-save-madagascar%E2%80%99s-lemur-species-extinction

Featured image source: Pixabay

About the author:

Isha Verma is currently pursuing her PhD in stem cell research at the Indian Institute of Science, Bangalore. She loves reading and traveling.

Edited by: Radhika Raheja

For the love of sleep

in Reporting from the Lab

After a long and tiring day, we all love to go to our beds and get lost in the sweet world of sleep. Sleep provides us a break from the outside world and rejuvenates our bodies and minds; it is essential for our physical and mental well-being. Research has shown that sleep serves many important purposes, including energy conservation, replenishment of cellular supplies, waste clearance, memory processing, and learning. However, sleep remains a mysterious biological phenomenon, as we do not completely understand its mechanisms and functions.

Researchers have specifically linked sleep to the normal functioning of the brain through the synaptic homeostasis hypothesis (SHY). According to this hypothesis, a core function of sleep is to restore the strength of synapses, the structures that allow the neurons in our brains to communicate with each other. SHY states that learning occurs during wakefulness, when we are under the influence of signals from the environment, through the process of synaptic potentiation, which increases the strength of synapses. While we are asleep, on the other hand, synaptic depression takes place in our brains, decreasing the strength of synapses. Hence, sleep helps renormalize synaptic strength. This strengthening and weakening of synapses, termed synaptic scaling, occurs regularly across the wake/sleep cycle and is crucial for the integration of new information in our brains.

In a recent study published in Science, scientists (de Vivo et al.) at the University of Wisconsin-Madison provided morphological evidence of synaptic scaling occurring in the mouse brain across the wake/sleep cycle. To study this phenomenon, the researchers isolated the brains of three groups of mice: the first group got proper sleep, whereas the mice in the other two groups were either forcefully kept awake or stayed awake on their own. The researchers then used a technique called serial block-face scanning electron microscopy to image the synapses in two different regions of the isolated brains. From these images, they calculated the axon-spine interface (ASI), i.e., the surface area of direct contact between the axonal bouton and the dendritic spine head, which serve as the transmitting and receiving ends of the neuron, respectively. The ASI was used as the parameter for assessing the strength of synaptic connections.

The researchers made the interesting observation that the ASI decreased by about 18% in the first group of mice, which got sufficient sleep, as compared to the two groups that were awake, indicating a downscaling of synaptic connections during sleep. However, downscaling was not observed uniformly across all synapses: it was limited to the small and medium synapses, which represented about 80% of the total. The remaining 20% of synapses, which were larger, did not undergo a decrease in ASI. In addition, while downscaling was observed in spines that were structurally unstable and contained endosomes, which facilitate structural changes by recycling cellular material, it did not occur in spines lacking endosomes. The decrease in ASI during sleep was also minimal in dendritic spines with high synaptic density. The researchers suggest that synapses that are large, lack endosomes, or sit in regions of high synaptic density might be associated with committed and stable memory circuits, and hence escape downscaling during sleep.
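
As a hedged back-of-the-envelope reading of those numbers (an interpretation, not the paper’s analysis): if the ~18% figure is taken as the average relative ASI change across all synapses, and the largest ~20% do not shrink at all, then the affected small and medium synapses must shrink by somewhat more than 18% on average.

```python
# Assumptions: ~18% is the population-wide average relative ASI decrease,
# and the largest ~20% of synapses are completely unchanged (simplifications).
overall_drop = 0.18       # reported average ASI decrease after sleep
affected_fraction = 0.80  # share of synapses (small/medium) that downscale

drop_in_affected = overall_drop / affected_fraction
print(f"implied shrinkage among affected synapses: ~{drop_in_affected:.1%}")
# -> ~22.5%
```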

Although it might not be possible to replicate this study in humans, the researchers suggest that synaptic scaling across the wake/sleep cycle also occurs in our brains. Taken together, this study provides strong evidence for SHY and establishes that while wakefulness results in an increase in synaptic strength, an important function of sleep is to selectively reduce synaptic strength back to normal levels. In simpler terms, sleep provides a useful mechanism of “smart forgetting”: when we are asleep, our brains can comprehensively analyze the memories made while awake, keeping the ones that are important while discarding the ones that are irrelevant. Another group of researchers has independently supported these results by identifying a gene involved in synaptic downscaling during sleep.

This study highlights the importance of sleep for the proper functioning of our brains. A good night’s sleep enhances our reasoning and problem-solving skills and helps us concentrate and memorize. So, the next time you are feeling confused or unable to make a decision, just try to sleep on it. And hopefully, when you wake up, your brain will be able to think much more clearly!

Journal reference:

de Vivo L, Bellesi M, Marshall W, Bushong EA, Ellisman MH, Tononi G, Cirelli C. Ultrastructural evidence for synaptic scaling across the wake/sleep cycle. Science. 2017 Feb 3;355(6324):507-510. doi: 10.1126/science.aah5982.

Other references:

http://science.sciencemag.org/content/355/6324/511.long

http://www.cell.com/neuron/abstract/S0896-6273(13)01186-0

https://www.theguardian.com/science/neurophilosophy/2017/feb/03/sleep-may-help-us-to-forget-by-rebalancing-brain-synapses

https://www.sciencedaily.com/releases/2017/02/170202141913.htm

http://www.learningscientists.org/blog/2017/2/23-1

https://www.psychologytoday.com/blog/memory-medic/201702/sleep-perhaps-learn

Featured image source: Pixabay

About the author:

Isha Verma is currently pursuing her PhD in stem cell research at the Indian Institute of Science, Bangalore. She loves reading and traveling.


Got fat? Let’s migrate!

in Reporting from the Lab

Targeted cancer therapy, for the most part, focuses on restricting the uncontrolled growth of a tumor. While such treatment strategies have been successful during the early stages of cancer, there is a constant need to identify treatment options for tumors that have undergone metastasis, i.e., whose cells have dispersed from the primary site and colonized other organs of the body. In a recent study published in Nature, Pascual et al. shed major insight into the process of metastasis and identified a fatty acid receptor, CD36, as a potential target to impair metastasis.

The researchers generated tumors in mice by injecting them with oral carcinoma cell lines and patient-derived cells. These cells were stained with a fluorescent dye that diminishes with every cell division. In the tumors that developed, they were able to identify both slowly dividing, dye-retaining cells and rapidly dividing, dye-negative cells. A transcriptome analysis, performed to identify differences in the gene signatures of these two populations, showed an enrichment of genes involved in metastasis and lipid metabolism in the slowly dividing, dye-retaining cells. CD36, a cell surface receptor and a crucial component of lipid uptake and metabolism, was one of the top genes implicated in their analysis. Cell surface receptors sense specific molecules in the extracellular environment and transmit signals into the cell, which consequently dictate cellular processes.

How does CD36 affect metastasis? Interestingly, loss of CD36 in mice reduced the ability of tumors to spread to other organs by 80-100%, while it did not affect primary tumor formation. Consistent with its requirement for metastasis, antibodies that block the CD36 receptor significantly inhibited metastasis in mice without affecting the size of the tumor. Furthermore, CD36 expression was greatly increased when mice were fed a high-fat diet. In a series of subsequent experiments, the authors concluded that the metastatic potential of tumors is increased by a high-fat diet in a CD36-dependent manner.

There are several aspects of this study that are interesting.

This work shifts the paradigm of cancer metastasis theories, in which tumor cells are believed to undergo a transition from an adhering epithelial cell to a migratory mesenchymal cell (the epithelial-mesenchymal transition, EMT) in order to invade distant sites. The CD36-expressing cells did not exhibit a mesenchymal gene signature. While further experimentation is required to link CD36 and EMT, it is conceivable that these processes facilitate metastasis independently of each other. A detailed mechanism of how CD36 initiates and regulates metastasis remains to be determined.

A high-fat diet including palmitic acid (a major component of palm oil) enhanced metastasis in a CD36-dependent manner in these mice. Palm oil is a key ingredient in several food products, including Nutella. A press release early last year claimed that the breakdown products of palm oil are potentially carcinogenic, thereby correlating Nutella consumption with cancer risk. While such correlative claims require further scrutiny, validation, and support from causation studies in humans, it is imperative to understand the impact of an exclusively high-fat diet on health.

The constantly evolving landscape of cancer research has witnessed the discovery of promising molecules to combat the most aggressive forms of the disease. A majority of these molecules are immuno-oncology targets that enhance the anti-tumor immune response and prevent tumor spread. In 2016, the FDA approved two drugs, Bristol-Myers Squibb’s Opdivo for metastatic head and neck squamous cell carcinoma and Genentech’s Tecentriq for metastatic non-small cell lung carcinoma; these drugs target the immune checkpoint proteins PD-1 and PD-L1, respectively. Other drugs in the pipeline include Bristol-Myers Squibb and ASLAN Pharmaceuticals’ ASLAN002, an inhibitor of the receptor tyrosine kinase RON, which regulates immune surveillance and whose activation enhances tumor metastasis. Innate Pharma’s anti-CD73 blocks the enzyme CD73, whose activity contributes to an immunosuppressed and pro-angiogenic tumor microenvironment. What makes the fatty acid receptor CD36 unique, so far, is that it exclusively affects metastasis without affecting primary tumor formation. While its crosstalk with the immune system remains to be investigated, CD36 represents a novel class of potential anti-metastatic targets that requires further validation. Targeting CD36 by itself, or perhaps in combination with the other aforementioned drugs, might have the potential to treat some of the most aggressive forms of cancer and thereby have a positive impact on patients’ lives.

Journal article:

http://www.nature.com/nature/journal/v541/n7635/full/nature20791.html

Additional newsfeed:

http://www.nature.com/nature/journal/v541/n7635/nature20791/metrics/news

https://www.worldwidecancerresearch.org/blog-post/new-research-links-major-component-of-palm-oil-to-cancer-spread/

https://www.sciencedaily.com/releases/2016/12/161207132117.htm

http://healthmedicinet.com/i/preventing-cancer-spread-mouse-study-points-to-fat/

https://www.centerwatch.com/drug-information/fda-approved-drugs/therapeutic-area/12/oncology

http://www.aslanpharma.com/drug/aslan002/

http://www.innate-pharma.com/en/pipeline/first-class-anti-cd73-checkpoint-inhibitor-program

Photo source: Shutterstock

Edited by Abhinav Dey.

About the author 

Radhika completed her PhD from Cornell University and is currently a Postdoctoral fellow at the Brigham and Women’s Hospital. Her research interests have centered around oncology and neuroimmunology. Among other things, she is striving to effectively communicate scientific discoveries to the community.

Turning back the hands of time

in Reporting from the Lab

Aging is one of the most complex biological processes and a prime driver of many human diseases. For a long time, it was considered a unidirectional process. In 2006, however, Shinya Yamanaka (who later won the Nobel Prize for this work) showed that mature, differentiated cells of the body, such as skin cells, can be converted into undifferentiated, embryonic-like cells by the process of cellular reprogramming. This can be achieved by the expression of the Yamanaka factors, four genes named Oct4, Sox2, Klf4 and c-Myc (OSKM). These factors induce epigenetic changes in the cells, i.e., heritable changes in cellular gene expression without any change in the DNA sequence. The reprogrammed cells are termed induced pluripotent stem cells (iPSCs). iPSCs have unlimited self-renewal ability, and under suitable conditions they can give rise to all the differentiated cell types found in the body.

Interestingly, the process of cellular reprogramming has been shown to improve various age-related phenotypes in cells under in vitro (laboratory culture) conditions. An important question that remains, however, is whether reprogramming can be used to slow or reverse the process of aging without converting the cells all the way to iPSCs. A major concern is that when cellular reprogramming is performed in vivo, i.e., at the organismal level, it results in tumor development and high mortality. Addressing these issues, scientists at the Salk Institute for Biological Studies (Ocampo et al.) recently published their findings in the journal Cell, where they developed an in vivo partial reprogramming method that can reverse signs of aging without the risk of tumor formation.

The researchers initially tested their method under in vitro conditions. For this purpose, they used skin cells isolated from a mouse model of premature aging and induced short-term expression of the Yamanaka factors for 2-4 days. It is important to note that complete reprogramming of mature cells into iPSCs typically requires these factors to be expressed for 2-3 weeks. Partial reprogramming, induced by short-term expression of the Yamanaka factors, did not alter the identity of the skin cells. However, the researchers made the interesting observation that this method reduced the generation of DNA double-strand breaks and the production of reactive oxygen species, and also lowered the expression of various genes involved in aging-associated pathways. Further experiments revealed that epigenetic remodeling of the cells during partial reprogramming is the main driver of these improvements in the hallmarks of aging.

When applied in the aging mouse model, in vivo partial reprogramming increased the average lifespan of the animals from 18 weeks to 24 weeks. It also rescued the development of cardiovascular alterations in these mice and maintained normal proliferation rates in multiple organs. The researchers also tested the applicability of this method in physiologically aged mice: partial induction of OSKM in 12-month-old mice resulted in better pancreatic function and enhanced muscle regeneration.

In vivo partial reprogramming might not be possible in humans because the expression of the Yamanaka factors requires genetic engineering. However, the researchers showed that partial reprogramming of aged human cells under in vitro conditions improved a few of the hallmarks associated with aging. Although further experiments are necessary for its validation, if successful, this technique will provide a unique platform to tackle age-related disorders, such as Alzheimer’s and diabetes, in humans. Epigenetic-modifying drugs that mimic the process of partial reprogramming could be developed to ameliorate various effects of aging.

In conclusion, while the results from this study are very encouraging, it is important to keep in mind that we have not discovered the Philosopher’s Stone, and it might not be possible to stop the process of aging completely. However, in the future, various strategies can be developed to control age-associated diseases, resulting in healthier living and increased longevity. Until then, we can hope to turn back the hands of time, at least at the cellular level!

Journal reference:

Ocampo A, Reddy P, Martinez-Redondo P, Platero-Luengo A, Hatanaka F, Hishida T, Li M, Lam D, Kurita M, Beyret E, Araoka T, Vazquez-Ferrer E, Donoso D, Roman JL, Xu J, Rodriguez Esteban C, Nuñez G, Nuñez Delicado E, Campistol JM, Guillen I, Guillen P and Izpisua Belmonte JC. In Vivo Amelioration of Age-Associated Hallmarks by Partial Reprogramming. Cell. 2016 Dec 15; 167(7):1719-1733.e12. doi: 10.1016/j.cell.2016.11.052.

Other references:

http://www.cell.com/cell/fulltext/S0092-8674(16)31662-2

http://www.sciencedirect.com/science/article/pii/S1471491416300533

https://www.sciencedaily.com/releases/2016/12/161215143541.htm

http://www.salk.edu/news-release/turning-back-time-salk-scientists-reverse-signs-aging/

https://www.sciencenews.org/article/proteins-reprogram-cells-can-turn-back-mices-aging-clock

https://www.regmednet.com/users/24433-naamah-maundrell/posts/14169-forever-young-reversing-the-hallmarks-of-aging

About the authors:

Isha Verma is currently pursuing her PhD in stem cell research at the Indian Institute of Science, Bangalore. She loves reading and traveling.

Radhika Raheja completed her PhD from Cornell University and is currently a Postdoctoral fellow at the Brigham and Women’s Hospital. Her research interests have centered around oncology and neuroimmunology. Among other things, she is striving to effectively communicate scientific discoveries to the community.

About the illustrator:

Ipsa Jain is a Ph.D. student at IISc. She wants to gather and spread interestingness. She prefers painting and drawing over writing.

Rethink your diet

in Reporting from the Lab


The new year is often the time to make lifestyle changes, most commonly dietary restrictions, to stay healthy and happy. It is no surprise to most of us that these alterations affect us as well as our gut microbiome – the microbial community associated with the human digestive tract. Several studies have underpinned the role of the gut microbiome in regulating our physiology, including metabolic functions and the immune response. As a consequence, variations in the gut microbiome have been associated with autoimmune disorders, cancer, obesity and cardiovascular disease.

To understand how the gut microbiome influences our health, a fundamental question that remains to be answered is: which dietary components influence our microbiome? Scientists (Holmes et al.), in a recent publication in the journal Cell Metabolism, have made some insightful observations addressing this question. The researchers used experimental as well as simulation models to thoroughly investigate the influence of 25 different dietary compositions on the gut microbiome of 858 mice fed over a period of 15 months.

One of the key findings from this study was that microbial diversity depends on the energy density as well as the nutrient distribution of the food, i.e., the ratio of protein to carbohydrate. The study also showed that this diversity is largely governed by the utilization of nutrients by the host body and their subsequent availability to the gut microbiome. In simpler terms, the complex proteins and carbohydrates we consume are broken down into end products primarily made up of carbon and nitrogen, which are reabsorbed for our metabolic activities and physiological functions. The microbiome, however, has two major sources of nutrients: (i) endogenous secretions in our gut, such as mucin, and (ii) digestion-resistant or partially digested carbohydrates and proteins present in our diet.

Interestingly, the researchers were able to define ‘guilds’ of bacterial species that differ in the substrates they use for nutrients. For instance, a high protein-to-carbohydrate intake led to an increased abundance of Firmicutes, which utilized dietary carbon and nitrogen for their metabolism. On the other hand, Bacteroidetes were more abundant on a low protein-to-carbohydrate diet, where host endogenous secretions were the primary source of nutrients. Further analysis revealed that the nitrogen contributed by dietary protein played a key role in governing these microbial shifts in the gut, suggesting that the ratio of protein to carbohydrate is critical in mediating gut diversity. Previous studies by the same group indicated that low protein intake in mice is associated with better immune responses and intestinal function, and other studies have linked the relative abundance of Bacteroidetes and Firmicutes to obesity. Overall, this study has tremendous implications as we usher in an era of host-gut-diet interactions to understand disease processes.

It is imperative to acknowledge that such studies are difficult to recapitulate in humans, where several confounding variables exist within the diet. Nevertheless, they have begun to provide an understanding of the qualitative and quantitative aspects of our diet that can predict microbiome composition and, consequently, our health. In a period where precision medicine is the new norm, it will not be surprising if precision diet regimens are generated for an optimal symbiotic relationship between us and our gut microbiome for healthy living. Ultimately, while these findings warrant further study, you might want to think twice before you completely exclude those carbs and indulge in your legumes. Remember, it’s all relative!

Journal reference:

Andrew J. Holmes, Yi Vee Chew, Feyza Colakoglu, John B. Cliff, Eline Klaassens, Mark N. Read, Samantha M. Solon-Biet, Aisling C. McMahon, Victoria C. Cogger, Kari Ruohonen, David Raubenheimer, David G. Le Couteur, Stephen J. Simpson. Diet-Microbiome Interactions in Health Are Controlled by Intestinal Nitrogen Source Constraints. Cell Metabolism. 2017 Jan 10;25(1):140-151. DOI: 10.1016/j.cmet.2016.10.021

Additional newsfeed:

https://www.sciencedaily.com/releases/2016/11/161123141436.htm

http://www.nutraingredients.com/Research/Nitrogen-key-for-gut-health-Study

http://www.sbs.com.au/topics/science/humans/article/2016/11/25/nitrogen-holds-key-healthy-gut-bacteria

https://sydney.edu.au/news-opinion/news/2016/11/24/major-finding-identifies-nitrogen-as-key-driver-for-gut-health.html

Article contributions: 

Isha Verma is currently pursuing her PhD in stem cell research at the Indian Institute of Science, Bangalore. She loves reading and traveling.

Ipsa Jain (Illustration). Ipsa is a PhD student at IISc. She wants to gather and spread interestingness. She prefers painting and drawing over writing.
