Category Archives: science news

Microbiome and research reproducibility

This week the journal Nature published a long-overdue perspective in which the authors argued that one critical, but frequently unaccounted-for, reason for pre-clinical research irreproducibility is the role of the microbiome in shaping the physiology and phenotype of laboratory animals.

Written by Thaddeus Stappenbeck and Herbert Virgin, the article explains that when designing experiments and analyzing research data we need to consider the effect of the “metagenome,” defined “as the sum of all host genes plus all organism genes of the microbiome.”

The term ‘microbiome’ refers not only to endogenous bacteria, the authors clarify, but “also the virome, archaea, the mycobiome (fungi) and meiofauna (for example, protists and helminths)”.

I am going to highlight some of the important messages from this article.

First, the authors acknowledge that “nearly all aspects of human physiology, as well as model organisms such as mice, are influenced by the microbiome and metagenome”. This is especially true for conditions in which the immune system is heavily implicated, but even organs such as the lung, pancreas, and brain are directly influenced by the microbiome.

A gold standard for working with gene-modified organisms, the authors argue, is to use littermate controls to account for the effects of the microbiome and metagenome. For example, they correctly point out that “the common practice of comparing wild type mice to mice with a mutation when the wild-type and mutant mice are bred separately or one group is purchased from an outside facility” is deeply flawed, and no self-respecting investigator or science journal should publish results obtained from such sloppy experiments. Littermate controls are not a new concept, and many publications specifically mention them, but their use should become a mandatory requirement for gene-modified animal experiments going forward.

Another important recommendation is to conduct key experiments “in multiple animal facilities in order to draw firm conclusions about the generality of a role of host and/or microbiome genes in a phenotype.” This is akin to human clinical research, which “relies on multi-centre trials as its gold standard for treatment efficacy”.

I fully support the recommendations proposed in this article because they are sound observations derived from the analysis of decades of scientific experimentation.

Of course, implementing these rules would be expensive and time-consuming. But without them, a vast number of experiments are wasted from the start, not to mention the millions of lab animals sacrificed for no good cause. I will go even further and say that it is the moral and ethical obligation of every scientist to do his/her best to minimize undue experimentation on live animals.

Finally, one way to accomplish such a transformation is for funding agencies and scientific journals to demand higher standards for “practices and controls for mouse experiments”.

David Usharauli


My view on “An incentive-based approach for improving data reproducibility”

This week Science Translational Medicine published a commentary by Michael Rosenblatt, Chief Medical Officer at Merck & Co., addressing the problem of research data reproducibility.

He correctly pointed out that “The problem of [research] data that cannot be replicated has many potential origins: pressures to publish papers or secure grants, criteria for career advancement, deficiencies in training, and nonrigorous reviews and journal practices, with fraud an infrequent cause”.

He proposed that one way to improve the confidence in and reliability of translational research would be “if universities stand behind the research data that lead to collaborative agreements with industry”. “In the instance of failure [i.e. irreproducible data],” he asks, “what if universities offered some form of full or partial money-back guarantee [to the industry partner]?”

The starting point for this proposal is the fact that currently “industry expends and universities collect funding, even when the original data cannot be reproduced.” The “compensation clause” proposed by Rosenblatt is an attempt to put additional “accountability” [call it an “incentive” if you prefer] on universities’ shoulders.

Would such an arrangement work? Unlikely, in my opinion. Why? Accepting it would naturally imply that academic scientists are less than “good” at their work. It would suggest that a university does not have confidence in its own scientists. It would ultimately impinge on academic freedom by dividing scientists into reliable and non-reliable ones (in my view, simple double-blind peer review of scientific manuscripts would greatly improve the quality of academic research).

In addition, this proposal would unnecessarily ease the burden on industry executives, who are ultimately responsible for selecting the best academic projects for commercial development.

These are a few examples of why this would be an extremely sensitive proposal to implement. I have no illusion that the data reproducibility issue has a simple solution. One aspect not even mentioned in the commentary is whether our heavy reliance on animal [mouse] models is misplaced and is actually one of the main causes of the failure to “translate” findings into human research.

posted by David Usharauli

How T cells interpret signaling delivered through chimeric antigen receptor (CAR)

Recently, chimeric antigen receptor (CAR)-transduced T cell-based immunotherapy has made headlines around the globe. Human trials, conducted mostly against B cell-derived malignancies, have shown extraordinary medical benefit in a large percentage of treated patients, extending their disease-free periods by several years.

Below is a diagram of a CAR molecule borrowed from Juno Therapeutics’ home page; Juno is one of several biotech/pharma companies leading this field. As the diagram shows, the current CAR construct combines several signaling modules (CD3zeta, CD28, etc.).

 

[Diagram of a CAR construct, from Juno Therapeutics’ website]

People familiar with basic immunology will notice that the CAR construct artificially combines two principally distinct signaling pathways, commonly known as signal 1 (CD3zeta) and signal 2 (CD28, etc.). It is believed that combining the two pathways increases CAR T cell vitality and effector differentiation.

I would like to remind readers that the two-signal model of T cell activation was introduced to account for tolerance towards self-antigens. It is thought to operate as a kind of fail-safe mechanism for discriminating between self and nonself antigens: since the vast majority of nonself antigens carry other attributes of “foreignness”, the second signal is selectively available to nonself-specific T cells, but rarely to self-specific T cells.

The separation of signal 1 and signal 2 in T cells must have biological significance for how T cells interpret incoming signals. This is especially critical for the generation of long-term memory since, unlike a short-term immune response, memory cells with self-renewal potential could cause long-term damage to the host’s tissue when inappropriately activated.

In my opinion, this is why innate cells, including NK cells, for the most part lack truly long-term memory potential. Innate cells are typically signal 1-only cells, meaning they can become fully activated after receiving a specific signal 1 alone. The absence of memory formation among innate cells ensures that, once activated, they do not go on damaging the host’s tissue indefinitely.

So I see a similarity between how innate cells interpret activation signals and how CAR-T cells interpret CAR signaling. The absence of spatio-temporal separation of the two signaling pathways in CAR-T cells may affect their memory differentiation potential, essentially transforming CAR-T cells into signal 1-only cells. If this model is correct, the persistence and self-renewal potential of CAR-T cells in the host will be drastically reduced compared to normal memory T cells. Practically, this could translate into diminished long-term medical benefit.

posted by David Usharauli

Future of Medicine should not be Elysium

A long and healthy life has been a dream of humankind since the inception of civilization. At the current pace, within 20-50 years medical science will reach a level where many difficult-to-manage diseases, such as cancers and some infectious diseases, become easily treatable.

The most recent example is the hepatitis C virus story. Just 5 years ago it seemed there was no hope for people affected by Hep C: there was no approved vaccine and no approved drug capable of stopping this virus, which causes extensive liver fibrosis. Now, almost unexpectedly, we have anti-viral medicines that show up to a 99% effectiveness rate (in some combinations) and with fewer side effects, a double benefit that few could have imagined.

What about recent successes in cancer therapy? Treatments with a patient’s own tumor-specific re-engineered T cells, or with antibodies that awaken the body’s own immune system, have achieved an unprecedented level of success, reaching up to a 50% effectiveness rate.

This is all amazing news for anyone, and especially for individuals with these medical conditions. Yet even though the future seems bright, there is something unsettling in all of this: the price tag.

Costs are so prohibitively high that almost 95% of the world’s population would not be able to afford these treatments on their own (for example, tumor immunotherapy with a patient’s own immune cells can cost more than $500,000, and Hep C treatment can cost between $50,000 and $120,000 per course).

More worrisome is the fact that, unlike the small molecules used in conventional pharmacology, which can be manufactured at mass scale with minimal investment as generics, many advanced new treatments nowadays are based on large, complex molecules called biologicals (such as humanized antibodies) or on even more complex procedures, like harvesting and re-engineering a patient’s own T cells. These are completely different categories of medicine with no real path to mass-scale production.

How can we afford such costly medicine in order to live longer and healthier lives?

The movie Elysium shows one such scenario. In this movie, society is divided into the super-rich and everyone else. The super-rich live on a giant orbital station and are completely free of disease thanks to medical technologies that are inaccessible to ordinary people on Earth.

In my opinion, with few rare exceptions, we do not control our own health trajectory. Hence, every human should have an inherent right of access to the most advanced medical technologies. However, no federal or state budget can afford medical costs this high.

So what is the solution? How do we make cutting-edge medical technologies affordable to the average person? The answer may lie in technology, but a different kind of technology that is still in its infancy: gene therapy (gene silencing, cutting, replacing, modifying). Gene-based medical technologies have two major advantages over other types of biologicals: (a) gene therapy will be much cheaper (anyone who has worked with PCR or RT-PCR can attest that primer nucleotides cost far less than antibodies), and (b) gene therapy will treat the cause (genes) rather than the outcome (proteins) of medical conditions.

Current medicine is based on repeated engagement with the body’s proteins (receptors, cytokines, transcription factors). Gene therapy will be based on a single or a few engagements with the body’s DNA or RNA for long-term effect (for example, using the CRISPR-Cas9 system). Such an intervention can produce permanent fixes; the overall cost will be low, and the majority of people will be able to afford it.

posted by David Usharauli

Online access to federally funded research publications must be free of charge from day 1

The absolute majority of published scientific research is funded by taxpayers’ money through federal grants from the NIH or NSF. Every scientist dreams of publishing his/her work in prestigious journals; publication is the simplest way for a scientist to showcase the quality of his/her work.

Scientists, unlike social media users, do not really care how many people see their publication, as long as it is seen by their fellow scientists and can be cited in submissions to granting agencies. Furthermore, the absolute majority of scientists would be fine with their research being freely available from day 1 of its publication.

So why are journals still charging for online access to new articles? Some 30-35 years ago, when the majority of scientific research was conducted on university campuses by academic scientists, such access fees were acceptable, since universities could afford multiple annual subscriptions to subject-matter journals.

These days, however, small biotech companies conduct a big chunk of scientific research but may lack the funds for annual subscriptions to many important subject-matter journals. Many times, when a new treatment fails in clinical trials, the source of the failure can be traced back to a lack of access to the scientific literature at earlier stages of treatment development.

Nowadays, there is a 12-month delay before a federally funded publication becomes open access. Can you believe that before the 2007 legislation that made the 12-month requirement law, there was no such requirement at all?

But even 12 months is too long. Today, research has accelerated thanks to improvements in computer technology and in large-scale data mining machinery and software. I just read that, beginning January 1, 2017, the Bill and Melinda Gates Foundation will require its grantees to make their research openly available immediately upon publication. A great decision, though I am still puzzled why it is not effective from January 1, 2015.

Journals should generate their revenue through advertisements on their pages or through minimal submission fees. A subscription-based revenue system is not viable in the long run. Nowadays, many science organizations have stopped purchasing monthly hard copies of journals, including Nature and Science, and have switched to online-only subscriptions to cut costs. Eventually, scientists will be asked to publish in cheaper open access journals.

In the end, the prestige of any journal comes from publishing high-quality papers. Migration of high-quality submissions from today’s top journals to open access journals will raise the latter’s prestige and benefit everyone else.

posted by David Usharauli

How to hire in a way that works

I just read a very good blog post from Science Careers about some of the limitations of current hiring practices. The author correctly points out that too much emphasis on a job applicant’s non-job-related skills may lead to the unintentional exclusion of highly talented individuals from the hiring pool.

I totally agree. First of all, despite the multiple criteria employers use to screen job candidates, the only measures with reasonable objectivity are actual work accomplishments (in prior positions) or actual grades received (when evaluating college graduates). Beyond this, everything is like a coin toss.

When an employer starts to focus on the social or behavioral characteristics of applicants at the expense of their job skills, one inevitably ends up with workers who spend a considerable amount of time learning or improving their “office politics” skills rather than advancing the project’s needs. In general, in my view, conformity to the preexisting work culture becomes the dominant work culture.

This culture of conformity, however, comes at the cost of quality. People with a talent for a science field are charismatic by nature, meaning they spend a lot of time sharpening their science skills. In most cases, they are not as polished in their interpersonal skills as their more social colleagues. Nature rarely produces people with advanced skills in both categories. A cursory overview of the biographies of famous scientists easily proves this point (just see the upcoming movie about enigma code-breaker Alan Turing). The absolute majority of famous scientists could be characterized as “difficult” employees in today’s terminology.

So how do you make the best-informed decision about a science job applicant? I have the following suggestion: “hard” skills drive the scientific enterprise forward, and “soft” skills maintain the status quo. If an employer has a particular scientific or technological problem to solve, or a new protocol to develop in order to move forward, then one should hire the individual with the best record of using the specific “hard” scientific skills. However, if one only needs to continue with existing protocols, then preference goes to individuals with the “soft” skills.

posted by David Usharauli

Do we really train more scientists than we need?

I want to return to this topic once more and share my opinion.

Today I read an article titled “Postdocs speak up” in the journal Science, in which the authors discuss a recent meeting focused on opportunities for young scientists. The basic plot is well known: we train far more scientists than the system (let’s say the biomedical system) can absorb.

In general, I would like to point out that the title “Postdocs speak up” is quite to the point. There is definitely a perception that scientists, especially young scientists, are vocal people. However, this is quite inaccurate. Science is a very conservative and hierarchical system, and postdocs who “speak up” are rare. Individualism is neither encouraged nor perceived as a positive trait. Basically, people in the lower echelons (grad students, postdocs) follow directives from people in the higher echelons (principal investigators, etc.). This is the standard; very few PIs allow truly independent thinking, and it is an evolutionarily stable strategy. Postdocs in the biomedical field cannot just write some computer code and change the world. Research in biomedical science costs money, and since PIs bring in the money, they dictate who can “speak up”.

Now, regarding the number of young scientists: I personally think that training more scientists is an evolutionarily stable strategy and that we will need even more of them in the future.

Why is that? I think this trend has to do with a self-imposed reduction of productivity per scientist. We think that scientists (both in academia and industry) work 12-16 hours daily. This was true in the past, when access to higher education was restricted and only a very brilliant or very dedicated person was able to complete the training necessary to become a scientist.

Over the past 20 years, however, access to higher education has become more attainable, not because it costs less but mainly because qualification criteria have become less stringent. This change has allowed people with ordinary skills, but no zeal, to receive science degrees.

Ordinary people, however, view science as ordinary work, not a vocation. They spend less time in the lab. Where postdocs once rarely took vacation time, people now take any opportunity to spend time outside the lab. It may be called work-life balance, but that is just a polite way of saying “I only spend the minimally required time in the lab”. This leads to a decrease in productivity, and the only practical solution to the dilemma is to hire more people with similar skills in order to maintain the same output with less work per person. This scenario applies equally to academia and industry.

I predict that in the future, scientists with ordinary skills will work less and less, creating the need to hire more and more people with similar skills.

So we actually do need more people with PhDs in the biomedical sciences. Since it is far more difficult to find people with zeal, it is better to train more people with less dedication.

posted by David Usharauli

What are Collaborative Cross recombinant inbred (CC-RI) mice?

I was pleasantly surprised to see that on October 30, 2014, the journal Science published an online article, in advance of print, describing experimental results with Ebola virus in Collaborative Cross recombinant inbred (CC-RI) mice.

Just a day before, on October 29, 2014, I had suggested that common laboratory mouse strains, like B6 or Balb/c, may not represent adequate experimental models for studying virus-host interactions relevant to human health.

To overcome some of the limitations of common laboratory mice, the authors of this paper used the so-called Collaborative Cross recombinant inbred (CC-RI) mice.

Honestly, I had not heard about these mice until I read this paper, so I read a little about CC-RI mice to understand the advantage of using them in this study. It appears that CC-RI mice are derived from eight founder strains (C57BL/6J, A/J, 129S1/SvImJ, NOD/ShiLtJ, NZO/H1LtJ, CAST/EiJ, PWK/PhJ, and WSB/EiJ) that together capture around 90% of the genetic diversity of the mouse.

I would like to explain how CC-RI mice are useful. Take, as an example, Ebola virus and host susceptibility or resistance to it, as in this paper.

How do you study which genes confer resistance or susceptibility to Ebola? What would be your control? Any given laboratory mouse strain will have a unique response to Ebola virus. Comparing two laboratory mouse strains may tell you that one strain is more or less susceptible or resistant [to Ebola virus] than the other, but it will not tell you what accounts for that difference. Why? Because the genetic difference between mouse strains is too large, and there is no way to pinpoint any particular gene or gene locus.

This is where CC-RI mice are helpful. Starting from the eight founder strains, CC-RI mice are derived by multiple rounds of marker-assisted inter-crossing between the founder strains and their offspring. In the end, one obtains CC-RI strains that differ from one another at just small, known loci across the entire genome; each such locus may contain one gene or only a few genes. Of course, the number of CC-RI strains runs into the hundreds.

See Genetics. 2012 Feb;190(2):389–401, Figure 1.

Now one can conduct experiments on these different CC-RI strains and determine which gene or gene loci are responsible for an observed phenotype, as in the toy sketch below.
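To make the logic concrete, here is a minimal toy sketch in Python (not taken from the paper). The strain names, the founder allele carried at the candidate locus, and the phenotype values are all invented for illustration; real studies use formal QTL mapping across many strains and loci, not a simple two-group comparison.

```python
# Toy sketch: relating a phenotype to a candidate locus across CC-RI strains.
# All strain names, genotypes, and phenotype values below are hypothetical.
from statistics import mean

# Hypothetical per-strain data: (founder allele at one candidate locus,
# average post-infection weight loss in % for that strain).
strains = {
    "CC001": ("B6", 5.0),   "CC002": ("CAST", 18.0), "CC003": ("B6", 6.5),
    "CC004": ("CAST", 21.0), "CC005": ("B6", 4.2),   "CC006": ("CAST", 17.5),
}

# Group phenotype values by the allele present at the locus.
by_allele = {}
for strain, (allele, phenotype) in strains.items():
    by_allele.setdefault(allele, []).append(phenotype)

# A large difference between allele groups suggests the locus is associated
# with susceptibility; in practice this is tested statistically genome-wide.
for allele, values in by_allele.items():
    print(f"allele {allele}: mean weight loss {mean(values):.1f}% (n={len(values)})")

effect = abs(mean(by_allele["B6"]) - mean(by_allele["CAST"]))
print(f"difference between allele groups: {effect:.1f} percentage points")
```

The point of the sketch is only that, because each CC-RI strain's genotype at a given locus is known, phenotypes can be grouped by allele and the locus with the largest, statistically supported effect becomes the candidate region to examine for causal genes.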

In this new Science paper the authors used 47 CC-RI strains, with 4-5 mice per group or per time point. They found that the endothelial tyrosine kinase Tek (Tie2) determines susceptibility to Ebola virus in CC-RI mice. TEK signaling promotes the activation of coagulation factors, such as thrombin, which makes sense considering that Ebola virus affects blood coagulation.

This type of experiment requires enormous resources (imagine conducting experiments on hundreds of different CC-RI strains with 5-8 mice per strain).

I hope someone will come up with a new idea for doing this type of screening in an easier way.

posted by David Usharauli