Happy Birthday Carl!

One of my all-time childhood heroes, Carl Sagan, whose work as a scientist, author and broadcaster played a huge role in my decision to pursue science as a career, would have been 81 years old today. Growing up I avidly read everything he wrote and watched everything he made for TV. He was a true polymath and a gifted communicator. I remember him talking about us “watching from the shores of a cosmic ocean” and I really hope that’s exactly what he’s enjoying right now.

Happy Birthday Carl. We miss you.

© The Digital Biologist

Academic research publishing is a broken system that needs to be fixed

Many in the life sciences agree that academic research is a broken and dysfunctional system that has largely ceased to be a true meritocracy. The evidence certainly supports the broader consensus among researchers that the field is in crisis. At the heart of the problem is the current system of academic publishing. Since well before U.S. life science research funding started to stagnate in recent years, the insidious ‘Publish or Perish’ mantra has been progressively eroding the quality and integrity of academic research through its perpetuation of a faux aristocracy of elite journals and its emphasis on bogus and self-fulfilling metrics of academic success like impact factors.

By defining and measuring academic success in terms of the number of publications in these ‘elite’ journals, the research community has (perhaps unwittingly) ceded enormous and undue influence over the direction of academic research to the editors, executives and shareholders of multi-billion dollar publishing corporations. The leading academic publishing companies even have an indirect but significant influence over which research actually gets done, since it is largely an assessment of research publications that determines who gets funding and who does not.

With better research funding comes a better chance of having your research published in an elite journal. With more publications in elite journals, the better are your chances of getting research funding. This is the vicious feedback loop that sustains the status quo in academic research, creating something of an echo chamber in which the same few voices are increasingly and disproportionately represented.

Even the academic peer review process upon which the research community has always depended to eliminate bias and ensure the integrity of published research is itself deeply flawed and, at times, unequivocally corrupt.

That many of these elite journals also build their own lucrative paywalls around the results of publicly funded research has only compounded the problem. The eye-watering prices that these academic publishing companies charge for their journals play a considerable role in further draining public money from a research system that is already enduring a major funding crisis. By some estimates, the subscriptions that universities must pay for access to these journals swallow up as much as 10% of the public research funding that they receive. This public money is essentially being channeled away from research and into the coffers of private sector corporations.

In addition to paying for the wealth of materials and resources necessary to conduct scientific research – laboratory reagents, consumables, equipment, instruments and so on – these funds must, in most cases, also cover the salaries of the researchers who are already woefully under-compensated for their experience and expertise. It is a testament to how expensive access to these journals has become that even Harvard University, one of the wealthiest institutions of higher education in the world, recently sent a memo to its faculty members informing them that it could no longer afford the price hikes imposed by many large journal publishers.

The publishers have repeatedly tried to claim that the prices they charge scientists and institutions for access to their journals are necessary to offset the costs incurred in their work to ensure the quality of the published research. In reality however, the major academic publishers have significantly increased their profit margins over the last few years while institutional libraries struggle to keep up with the rising costs. The University of Montreal, for example (much to the dismay of its research staff), has started scaling back its journal subscriptions as its annual expenditure on academic journals has reached an unsustainable $7 million. One of its own researchers, in its School of Library and Information Science, summarized the situation very well:

The quality control is free, the raw material is free, and then you charge very, very high amounts – of course you come up with very high profit margins.

Vincent Larivière, University of Montreal

The economics of the current situation are not the only issue for academic research. The pressure to publish in order to have a successful academic career has also given rise to a dramatic increase in the publishing of fraudulent results, and a corresponding and even more dramatic decrease in the reproducibility of academic research. To quote a 2013 report in the Economist: “A rule of thumb among biotechnology venture-capitalists is that half of published research cannot be replicated. Even that may be optimistic.” This observation by itself is a serious indictment of the current state both of academic research, and of the peer-reviewed academic publishing process that is supposed to ensure the integrity, accuracy and honesty of published scientific research.

To be fair, there are undoubtedly other forces at work here as well – the government’s reluctance even to sustain current levels of public funding for research for example. Yet through their willingness to generate excessive profit at the expense of publicly funded research, and their inordinate and undue influence over its direction – the leading academic publishers have played a central and significant role in the steady deterioration of both the quality of academic research, and the professional lives of the scientists who pursue it.

New and better models of research publishing must of necessity take center stage in any attempt to reform academic research.

Thankfully, researchers are increasingly starting to recognize that the current situation in academic publishing is untenable and there are already signs of a movement against it. Scientists worried about the future of their profession have started to organize in order to give voice to their concerns. Researchers and research institutions are openly championing an end to the publication of publicly funded research in subscription journals. Interestingly and most recently, we have even started to see signs of dissent at the heart of the academic publishing industry itself. Witness for example, the spectacular mutiny of the entire editorial board of an academic journal owned by Elsevier, over the issue of allowing the journal to become Open Access. Elsevier apparently refused to address the editors’ longstanding concerns, with the result that the departing editors are now preparing to launch a directly competing Open Access journal that seems certain to reduce the readership of Elsevier’s own subscription-based publication.

In the effort to reform academic research, there are undoubtedly important battles to be fought on several different fronts, the most visible and easily understood of which is the public funding of research. In the U.S. at least, this has been an uphill battle in recent years as a result of opposing political forces and a seeming apathy in the court of public opinion for the case to fund scientific research with public money. It is, however, hard to imagine – even given a windfall of additional public funding – that the quality and integrity of academic research, or the lot of its researchers, could be significantly improved so long as the current system of academic publishing remains in place. Academic research can only flourish if it is able to function as a true meritocracy, and that will mean (amongst other things) breaking its damaging dependence upon the current system of academic publishing for its validation.

© The Digital Biologist

Symposium On The Future Of Research 2015

After enduring years of steady erosion of their funding, pay, career opportunities and work/life balance, it’s great to see researchers taking things into their own hands and getting behind a truly grassroots organization like Future Of Research (FOR) that was founded in October 2014 to address these issues. I was fortunate enough to be able to attend the 2015 symposium and what follows are my thoughts on what I heard there and where things stand for academic researchers, a year on from the launch of FOR.

I’m not going to reproduce all of the sombre statistics that were presented, illustrating the extent to which the working conditions for academic researchers in the U.S. have deteriorated under a system that rewards the worst kind of aggressive individualism while paying lip service to ideas like collaboration and collective effort. And all of this while trying to squeeze more and more out of researchers in return for low pay, low status and poor career opportunities.

The current crisis in academic research in the U.S. has been years in the making and is already well documented.

To a very large extent, it can also be viewed as the unfortunate sequela of a much broader social crisis that we are currently witnessing in the U.S.

Let me state from the outset that the opinions that follow are entirely my own. I have no affiliation with the Future of Research. I would say that for me, the 2015 FOR Symposium was very enjoyable for the most part but also something of a disappointment, mainly for the following reason: Despite the general consensus that the current academic research system is broken and unsustainable, the great majority of the content presented at the symposium focused upon how researchers can optimize their career prospects within this broken system, rather than trying to change the system or seek alternatives to it.

For me, the 2015 symposium was less about the “Future Of Research” than it was about how to survive the current crisis.

This is not to say that any of this isn’t well-intentioned or laudable, but I feel that it misses the essential point of trying to forge a better future for researchers. There is already an entire industry of recruiters, coaches and consultants built around helping researchers navigate the current system, and I felt that recycling this material, however well-intentioned, was not really in the spirit of FOR’s stated goals of improving the scientific endeavor.

Again, it’s important to note that this is only my personal perspective on the symposium, based largely upon my own expectations. The conversation in which FOR is engaging the research community is an important one and there are undoubtedly many possible paths to take, depending upon what each of us feels represents a real solution to the crisis. I may differ with the FOR organizers over the content and approach of  their 2015 symposium, but not over the substance of their mission, which I support wholeheartedly.

I would also like to make another observation that I feel is directly germane to the choices made by the organizers of the FOR symposium. My overwhelming impression of the general sentiment amongst the academics who were present was one of helplessness and disempowerment; a feeling that there is an academic research ‘establishment’ composed of a relatively select group of gatekeepers who hold the keys to everything, and without whose help and blessing, nothing meaningful can be done to improve the current situation. From such a perspective, it is much easier to contemplate seeking permission to act within the constraints of the system (and presumably with the approbation of its establishment), rather than taking the much more challenging path of trying to change the system.

I do not accept the argument that one researcher voiced to me, that this is just a kind of pragmatic wisdom that offers a better chance of success because it’s based upon a more realistic perception of how things are.

These establishment gatekeepers – the academic journals, the NIH, college admission and tenure committees, etc. – dominated the conversation either in absentia or, in the case of the journal publishers, actually in the room. I have to confess that I was somewhat perplexed by the organizers’ decision to give a platform to the publisher Cell Press. As a corporation that takes publicly funded research and puts a paywall around the results, they are much more a part of the current problem than they are any part of its solution. They also play a big role in the kind of self-fulfilling academic career inequalities born of insidious metrics like ‘impact factors’ and the awful ‘publish or perish’ mentality that can make academic research such a miserable experience, even for many really good and gifted researchers.

Even worse, most of the publishing industry panelists on the stage proceeded to school the attendees on their expectations for how manuscripts should be prepared and submitted, while disingenuously engaging in the charade that this was all to the benefit of the researchers (yes, let’s add ‘unpaid editor’ to the list of the overworked and poorly paid scientist’s responsibilities). Even the research faculty panelists who sat on the stage with the publishers, wasted little time in perpetuating the notion that there is really only one path to success in academic research and that it involves playing by the publishers’ rules.

Despite the presence of The Winnower and Faculty Of 1000 on the panel, who were beacons of hope in this otherwise rather depressing portion of the symposium, it was disheartening to see the academic publishing status quo going relatively unchallenged. Indeed, this part of the program seemed to be a thinly veiled message from the traditional publishers to all present, that this is just how things are and you really need to get with the program if you’re to have any chance of success in your academic career. All of this passing, I would add, with little or no serious discussion of other, better ways to do things.

status quo: 1, alternatives: 0

For all of the polite acceptance of the way things are and willingness to ‘play nice’ with it, there were one or two moments during the symposium in which some of the braver souls did step up and make impassioned pleas for something better – for example (and I’m paraphrasing): “How can we change things so that academic research is no longer a system of abuse?” and: “How are postdocs supposed to start a family when their compensation during what would be their child-rearing years is so dismally low that it essentially excludes the possibility?”

I was really horrified when a tenured faculty member on the panel actually suggested that this latter issue is not really a problem since the postdoc (a woman in this case) always has the option to leave academia and find a career path more conducive to starting a family.

Really?

After hearing earlier from that same faculty member that academic research needs to attract and retain “the brightest and the best”, we should probably append the caveat “so long as they don’t want to start a family”.

Another tenured faculty member on the panel dismissed the postdocs’ grievances by suggesting that the whole sorry state of affairs was just the result of the fact that we live in a capitalist society with free markets (and therefore, that we should presumably just accept things the way they are).

status quo: 10, alternatives: 0

Start to see a theme here?

It has been my experience that it is generally a waste of time to make an appeal to change the system, to those who have been raised up by the system and who are its beneficiaries. This is what drives my overwhelming feeling about what could be done to improve things for researchers and now if you’ll bear with me, I would like to connect the dots that integrate this idea with an issue raised in one of the more interesting portions of the symposium – the panel on diversity.

As one of the diversity panelists rightly pointed out (and again, I’m paraphrasing here), by the time you’re considering diversity at the graduate school or postdoctoral level, you’re already at the tip of a huge iceberg beneath which much of the diversity race has already been run. The inequalities start at a very young age in the public schools and whether you’re a woman, a racial minority, or gay – by the time you get to the research level, the number of you still left in the race has already been greatly depleted by years of systematic bias. How do you even start to address the diversity issue in your research department when, as one panelist recounted, only 1 faculty applicant in 200 is a minority? For sure you can and should do everything in your power to support diversity at this level, but sadly, much of the damage is being done long before the graduate school, postdoctoral or faculty hiring process.

But here’s where we can connect the dots with regard to the current situation in academic research and the fears of many who participate in it. One of the diversity panelists (a woman) really nailed the crux of the issue when she said that her boss, an older, white male, had certain expectations for the way that a female colleague should behave and express herself. Remaining silent with regard to her (very legitimate) grievances on how she was being treated as a woman in the workplace was essentially ‘rewarded’, but of course, under such circumstances nothing changes for the better. Airing those grievances on the other hand, was considered unseemly and overly aggressive in a way that it would not have been if she were a man. This behavior would also most likely result in the proverbial blot on her career copybook – an awful Catch-22 if ever there was one.

Which brings me to what I consider to be the crux of the issue for the research crisis.

We really need to be exploring alternatives to the current system rather than just trying to reform it. 

It’s sad but true I think, that the great majority of those ‘old, white guys’* alluded to by the female diversity panelist, who are in positions of authority and influence, are just not going to substantively help in any effort to dismantle what has gotten them (and keeps them) where they are. Now granted that this is an extreme analogy, but imagine the starving workers in pre-revolutionary St Petersburg, knocking on the Russian tsar’s palace door and politely asking that he and his nobles please address the current inequalities that are keeping them in poverty (as it turns out, this was in effect exactly what they did, and the response was brutal).

To be absolutely clear, I’m trying to make a point here – maybe with my tongue very firmly in my cheek, but a real point nonetheless – and of course I’m not advocating any kind of bloody revolution. But the current situation for scientific research is grave and I believe that what it does call for are the kind of alternatives that might be termed ‘revolutionary’ – revolutionary in the way that YouTube is transforming communication and self-expression – revolutionary in the way that Kickstarter is transforming the funding of business and the arts.

Even as we speak, there are already those who have taken the first steps into a new model of scientific research, both in academia and in industry. They are pioneers characterized by a willingness to look outside the bounds of the current model, many of whose leaders and influencers are all too eager to tell us that this is still the only game in town.

I firmly believe that it is not.

*Disclaimer: The author of this article is also an old, white guy, but of the kind with neither authority nor influence. You can therefore, quite safely tell him to go jump in a lake if you disagree with him, without fear of any damaging consequences for your career 🙂

© The Digital Biologist

The healthcare sector now has its own technology cheating scandal

So after the whole Volkswagen scandal, now the science and technology sector has its own “caught cheating” story as well. This time it’s the turn of Theranos – a medical diagnostic company whose technology and founder have been much lauded in the media and held up as exemplars of technological innovation and progress.

Based upon some investigative journalism by the Wall Street Journal, which includes information disclosed by Theranos employees and others with inside knowledge of the secretive company’s workings, it seems that Theranos is, for the most part, using other companies’ technologies for the great majority of its diagnostic blood tests, and may even have done so in order to duplicitously win FDA approval for its own much lauded but never publicly disclosed diagnostic technology – the same proprietary technology that underpins the company’s $9 billion valuation.

In the world of technology and medicine, it is perhaps no surprise that there are always some with serious skin in the game, who are not above resorting to hype, exaggeration and outright dishonesty where there’s money at stake. The startup world is certainly no stranger to this – and now it looks like Theranos has been caught cheating in order to advance its own blood test technology, while trying to hide this from regulatory agencies and investors in a cover-up that has some parallels with the Volkswagen scandal.

Perhaps even more interesting though, is that these kinds of problems seem to be exacerbated by the cult of celebrity that surrounds those who are perceived to be the movers and shakers of the startup world. Had Theranos founder Elizabeth Holmes not been put on such a high pedestal by her investors, peers and the press, it is interesting to wonder whether she might not have been taken to task much earlier for refusing to scientifically substantiate the efficacy and accuracy of her company’s proprietary diagnostic technology.

Even in the dry and ‘objective’ world of science and technology, people still seem to need to create heroes and to build a kind of mythology around them and their work, however disconnected the stories are from the reality.

As this Wired article points out, when this kind of bubble of delusion and deception gets built around something like a social network startup, what stands to be lost is mostly money. That’s already bad enough, but when the company in question has a hand in your healthcare, what’s at stake could be much, much more serious.

Image courtesy of PostMemes

© The Digital Biologist

What Innovation Is Not

Innovation is seen as something as American as apple pie – something that everybody from the US President on down is talking about. From university presidents and corporate leaders to Silicon Valley tycoons, all agree that we need more of it. Against this background of hype, my colleague and fellow scientist Alex Lancaster finds that Scott Berkun’s book “The Myths of Innovation” is a refreshing and unpretentious take on this overused buzzword.

© The Digital Biologist

Complex adaptive systems pioneer John Holland passes away at 86

Sad to see the passing of John Holland, one of the great thinkers in the field of complex adaptive systems. I greatly admired his work on evolutionary optimization and his unconventional approach. In the early 2000s I even published a research paper of my own, describing an evolutionary computational approach to the phase problem in x-ray crystallography, that was directly inspired by his work. He was a great scientist and a great communicator.

My colleague Alex Lancaster, who was similarly inspired and influenced by Holland’s groundbreaking work, wrote a very nice piece on his blog to mark his passing. You can read it here.

© The Digital Biologist

The Future Of Research

These are difficult times for researchers. In inflation-adjusted terms, research funding is actually down compared to recent years and everybody is talking about the apparent surplus of researchers being produced by graduate school and post-doctoral training programs. If you care about the future of research, the Symposium on the Future of Research will interest you.

This event is exceptional insofar as it is actually being organized and run by the people who are at the heart of this crisis – the postdocs themselves. In this respect, attendees should expect to hear a range of insights and perspectives on this crisis that is far more wide-ranging than those of the scientific establishment voices we typically hear in the media. It’s great to see a group of Boston area postdocs from several of the region’s excellent schools taking matters into their own hands.

So if you care about the future of research, would like to hear from those who are on the front line of this issue and even add your own voice to the conversation, you should register for the symposium that will be held at the Boston University campus on October 2nd and 3rd, 2014.

© The Digital Biologist | All Rights Reserved

The art of deimmunizing therapeutic proteins

The consideration of potential immunogenicity is an essential component in the development workflow of any protein molecule destined for use as a therapeutic in a clinical setting. If a patient develops an immune response to the molecule, in the best case scenario the patient’s own antibodies can neutralize the drug, blunting or even completely ablating its therapeutic activity. In the worst case scenario, the immune response to the drug can endanger the health or even the life of the patient.

Thanks to the incredible molecular diversity that can be achieved by VDJ recombination in antibody-producing lymphocytes (B-cells), the antibody repertoire of even a single individual is so vast (as many as 10¹¹ distinct antibodies for a single individual) that it is difficult to imagine ever being able to design all potential antibody (or B-cell) epitopes out of a protein while still preserving its structure and function. There is however a chink in the antibody defense’s armor that can be successfully exploited to make therapeutic proteins less visible to the immune system – the presentation of antigens to T-cells by antigen-presenting cells (APCs), a critical first step in the development of an adaptive immune response to an antigen.

Protein antigens captured by antigen-presenting cells such as B-cells are digested into peptide fragments that are subsequently presented on the cell surface as a complex of the peptide bound to a dual chain receptor coded for by the family of Major Histocompatibility Complex (MHC) Class II genes. If this peptide/MHC II complex is recognized by a T-cell antigen receptor of one of the population of circulating T-helper (Th) cells, the B-cell and its cognate T-cell will form a co-stimulatory complex that activates the B-cell, causing it to proliferate. Eventually, the continued presence of the B-cell antigen that was captured by the surface-bound antibody on the B-cell will result not only in the proliferation of that particular B-cell clone, but also in the production of the free circulating form of the antibody (it should be noted that antibody responses to an antigen are typically polyclonal in nature, i.e. a family of cognate antibodies is generated against a specific antigen). It is through this stimulatory T-cell pathway that the initial detection of an antigen by the B-cell is escalated into a full antibody response to the antigen. Incidentally, one of the major mechanisms of self-tolerance by the immune system is also facilitated by this pathway, via the suppression of T-cell clones that recognize self-antigens presented to the immune system during the course of its early development.

This T-helper pathway is therefore a key process in mounting an antibody-based immune response to a protein antigen. While the repertoire of structural epitopes that can be recognized by B-cells is probably far too vast to practically design a viable therapeutic protein that is completely free of them, the repertoire of peptides that are recognized by the family of MHC Class II receptors and presented to T-cells (T-cell epitopes), while still considerable in scope, is orders of magnitude smaller than the set of potential B-cell epitopes.

So as designers of therapeutic proteins and antibodies, how can we take advantage of this immunological “short-cut”, to make our molecules more “stealthy” with regard to our patient’s immune system?

The solution lies in remodeling any peptide sequences within our molecules that are determined to have a significant binding affinity for the MHC Class II receptors. The two chains of an MHC Class II receptor form a binding cleft on the surface of an APC into which peptide sequences of approximately 9 amino acids can fit. The ends of the cleft are actually open, so longer peptides can be bound, but the binding cleft itself is only long enough to sample about 9 amino acid side chains. It is this cleft with the bound peptide that is presented on the surface of an APC for recognition by T-cells.

The genetic evolution of MHC Class II alleles in humans is such that there are about 50 very common alleles that account for more than 90% of all the MHC Class II receptors found in the human population. There are of course, many more alleles in the entire human population, but they become ever rarer as you go down the list from the 50 most common ones, with some of the rarer alleles being entirely confined to very specific populations and ethnicities. What this means for us as engineers of therapeutic proteins is that if we can predict potential T-cell epitopes for the 50 or so most common MHC Class II alleles, we can predict the likelihood of a given peptide sequence being immunogenic for the vast majority of the human population.

It actually turns out that some researchers have published experimental peptide binding data for the 50 most common MHC Class II alleles and their results are very encouraging for the would-be immuno-engineer. The peptide binding motif of the MHC II receptor essentially consists of 9 pockets, each of which has a variable binding affinity across the 20 amino acid side chains that is independent of the side chains bound in the other 8 pockets. This last property is of particular importance because it means that we can calculate the relative MHC II binding affinity for any particular 9-mer peptide by the simple summation of the discrete binding pocket/side chain affinities, rather than having to consider the vast combinatorial space of binding affinities that would be possible if the amino acid binding affinity of each pocket were dependent upon the side chains bound in the other 8 pockets.
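The additive scoring scheme described above is simple enough to sketch in a few lines of code. Note that the pocket affinity values below are illustrative placeholders, not real measured binding data; a real tool would load published matrices for each common MHC Class II allele.

```python
# Sketch of additive MHC Class II 9-mer scoring, assuming a
# position-specific affinity matrix of 9 pockets x 20 amino acids.
# The affinity numbers here are toy placeholders, NOT real binding data.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def make_uniform_matrix(value=0.0):
    """Build a 9-pocket matrix giving every residue the same affinity."""
    return [{aa: value for aa in AMINO_ACIDS} for _ in range(9)]

def score_9mer(peptide, matrix):
    """Score a 9-mer as the simple sum of independent pocket affinities."""
    if len(peptide) != 9:
        raise ValueError("peptide must be exactly 9 residues")
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

# Toy example: pretend pocket 1 strongly prefers hydrophobic anchor residues.
matrix = make_uniform_matrix(0.1)
for aa in "FILMVWY":
    matrix[0][aa] = 1.0

print(score_9mer("FAAAAAAAA", matrix))  # ≈ 1.8 (1.0 + 8 × 0.1)
```

Because each pocket contributes independently, scoring a peptide is a single pass over its 9 positions rather than a search over a 20⁹ combinatorial space.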

This is the point at which a computer and some clever software can be enormously helpful. While I was employed at a major biotechnology company, I created software that could use a library of this kind of MHC II peptide affinity data to scan the peptide sequences of protein drugs and antibodies that we were developing for the clinic. The software not only predicted the regions of the peptide sequence containing potential T-cell epitopes, but it also used other structural and bioinformatics algorithms to help the scientist successfully re-engineer the molecule to reduce its immunogenicity while preserving its structure and function.
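A minimal version of that scanning step might slide a 9-residue window along the protein sequence and flag any window whose additive score exceeds a threshold for at least one allele. The allele name, the matrices and the threshold below are hypothetical stand-ins for real binding data, not the actual software described above.

```python
# Sliding-window T-cell epitope scan: flag every 9-mer window whose
# additive score exceeds a threshold for at least one allele matrix.
# Allele matrices, allele name and threshold are hypothetical stand-ins.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def window_score(window, matrix):
    """Sum of independent per-pocket affinities for a 9-residue window."""
    return sum(matrix[i][aa] for i, aa in enumerate(window))

def scan_epitopes(sequence, allele_matrices, threshold):
    """Return (start, 9-mer, allele) for each window scoring above threshold."""
    hits = []
    for start in range(len(sequence) - 8):
        window = sequence[start:start + 9]
        for allele, matrix in allele_matrices.items():
            if window_score(window, matrix) > threshold:
                hits.append((start, window, allele))
    return hits

# Toy matrices: one allele that only favors lysine in pocket 1.
neutral = [{aa: 0.0 for aa in AMINO_ACIDS} for _ in range(9)]
k_anchor = [dict(pocket) for pocket in neutral]
k_anchor[0]["K"] = 5.0

hits = scan_epitopes("AAKAAAAAAAAA", {"DRB1*toy": k_anchor}, threshold=1.0)
print(hits)  # [(2, 'KAAAAAAAA', 'DRB1*toy')]
```

In a real workflow the flagged windows would then be handed to the re-engineering step, where candidate amino acid substitutions are weighed against their likely impact on structure and function.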

This last phrase explains why I used the word “art” in the title of this article.

What we learned from experience was that while it is relatively easy to predict T-cell epitopes in a peptide sequence, reengineering the sequences while preserving the structure and function of the protein is the much greater challenge.

Based upon this experience, it was no surprise to me that the great majority of the thousands of lines of Java code that I wrote for our deimmunization software was dedicated to functionality that guided the scientist in selecting amino acid substitutions with the highest probability of preserving the structure and function of the protein. Even with this software, however, the essential elements in this process were still the eyes and brain of the scientist, guided by training and experience in protein structure and biochemistry.
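As a rough illustration of the substitution-guidance idea (this is emphatically not my original Java code; the similarity scores and affinities below are invented placeholders, and the real software drew on structural analysis rather than a simple lookup), one might rank candidate residues by how much they reduce the binding score, breaking ties in favour of the more conservative substitution:

```python
# Toy sketch of substitution guidance: rank candidate residues for a
# position in a flagged 9-mer by the drop in additive binding score,
# then by a "conservativeness" similarity to the original residue.
# All values are invented placeholders, for illustration only.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
SIMILAR = {("L", "I"): 2, ("L", "V"): 1}  # toy similarity scores

def suggest_substitutions(window, pos, candidates, table):
    """Return candidate residues for window[pos], best-ranked first."""
    def score(p):
        return sum(table[j][aa] for j, aa in enumerate(p))
    base = score(window)
    return sorted(
        candidates,
        key=lambda aa: (base - score(window[:pos] + aa + window[pos + 1:]),
                        SIMILAR.get((window[pos], aa), 0)),
        reverse=True)

# Toy allele table: pocket 4 (index 3) strongly prefers Leu.
table = [{aa: (3.0 if (p == 3 and aa == "L") else 0.0)
          for aa in AMINO_ACIDS} for p in range(9)]

print(suggest_substitutions("AAALAAAAA", 3, "AVI", table))
# ['I', 'V', 'A'] -- all abolish the epitope; Ile is most conservative
```

A ranking like this narrows the search, but deciding whether a suggested substitution really preserves fold and function is exactly where the scientist's judgement comes in.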

In other words, the art and craft of the experienced protein engineer.

Much like the old joke "My car is an automatic but I still have to be there", the software could not substitute for the knowledge and experience of a skilled protein engineer, but it could make her life a lot easier by suggesting amino acid substitutions with a high probability of being structurally and functionally conservative, and by keeping track of all the changes and their impact upon the sequence and structure.

The software really showed its value in the improvement it brought to our success rate in converting computational designs into successful molecules in the laboratory. For any given project with a new biologic, we would typically design a batch of variants to be tested in the lab, of which one or two might have all the properties we were shooting for. Once we started using the software, a noticeably larger proportion of our designs tested well in the lab than before. This was interesting to me insofar as it showed that while the software could not replace the scientist's knowledge and experience, it could certainly enhance and augment their application to the problem at hand, probably by keeping track of the many moving parts in the deimmunization process so that the scientist is free to think more carefully about the actual science.

In spite of all this technological support however, a successful deimmunization depends heavily upon skill and experience in protein engineering, and there’s arguably still as much art in successfully re-engineering T-cell epitopes as there is science in predicting them.

© The Digital Biologist | All Rights Reserved

Ebola: The next big frontier for protease inhibitor therapies?

While I was on vacation this summer, the news was full of stories about the Ebola Virus outbreak in Africa and the health workers who had contracted the virus through working with the infected population there. Then on the heels of all of this comes a very timely paper, "High Content Image-Based Screening of a Protease Inhibitor Library Reveals Compounds Broadly Active against Rift Valley Fever Virus and Other Highly Pathogenic RNA Viruses", in the journal PLOS Neglected Tropical Diseases.

While the primary pathogen tested in the article is the Rift Valley Fever Virus, the library of protease inhibitors was also screened for efficacy against Ebola Virus and a range of other related RNA viruses, and shown to have activity against those pathogens as well.

Given the incredible successes we have seen with the use of protease inhibitors in other virally induced diseases like HIV/AIDS, it is tempting to wonder whether there might be a similarly promising new medical frontier for protease inhibitors in the treatment of these extremely dangerous viral hemorrhagic fevers.

Interestingly, most of the compound screening for these kinds of antiviral therapies in the last couple of years has been focused upon signaling molecules like kinases, phosphatases and G-Protein Coupled Receptors (GPCRs). The use of protease inhibitors as antiviral compounds therefore represents something of a departure from the mainstream in this research field. The authors of the current study, however, felt that the success of protease inhibitors in the treatment of other diseases was grounds for a study of their efficacy against RNA viruses.

When I think about all of the people who are still alive today thanks to protease inhibitors controlling their HIV/AIDS, the early signs described in this new research article of a similarly efficacious class of compounds for treating hemorrhagic fevers definitely give one hope for a future in which there are successful therapies for deadly (and really scary) diseases like Ebola.

© The Digital Biologist | All Rights Reserved