A TALE OF MANY PANDEMICS – PART 3: THE FINISHING LINE

In parts one and two of this series on pandemics, I took you on a journey into the world of zoonotic diseases, from the 14th century Black Death to modern-day pandemics caused by zoonotic viruses. In this final part of the series, I will attempt to shed light on how pandemics come to an end. Just to offer a little light at the end of this long and dark Covid-19 tunnel. 

Herd immunity

As a concept, herd immunity is incredibly controversial. In theory, it sounds appealing as it is a rapid way to end a pandemic. If the disease is allowed to spread uncontrolled and infect at least 60-85% of a population, it can fizzle out as those people are now immune and the virus effectively runs out of people to infect. However, the human cost of such a strategy is humongous. In a country like India, for example, nearly 800 million people would need to be infected. Considering India's current case fatality rate of 1.64 deaths per 100 cases, this would translate to roughly 13 million deaths by the time herd immunity is reached. COVID-19 is also a disease that is still being studied. There is no concrete evidence to indicate a sustained immune response, so there is no guarantee that herd immunity will result in a permanent eradication of COVID-19. Sweden was indirectly working towards herd immunity with its COVID-19 strategy. The jury is still out as to whether it was a success, or a moral and scientific failure.  
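The back-of-the-envelope calculation above can be reproduced in a few lines. The 60% threshold, the ~1.33 billion population figure and the 1.64% case fatality rate are the rough numbers quoted in the text, not precise epidemiological estimates:

```python
# Rough herd-immunity arithmetic for India, using the figures quoted above.
population = 1_330_000_000   # approximate population of India
threshold = 0.60             # lower bound of the 60-85% herd-immunity range
cfr = 1.64 / 100             # deaths per case, as quoted

infections_needed = population * threshold   # ~800 million infections
deaths = infections_needed * cfr             # ~13 million deaths

print(f"{infections_needed:,.0f} infections -> {deaths:,.0f} deaths")
```

Even at the most optimistic end of the threshold range, the human cost is staggering, which is why the strategy is so controversial.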

Containment

At the outset of a disease outbreak, the primary strategy is containment. This involves well-developed public health measures such as isolating infected people, tracing their contacts, quarantining said contacts and extensive testing. These measures were brilliantly implemented by Taiwan, which did not need to enforce an economically debilitating lockdown. Instead, it shut its borders, stockpiled personal protective equipment (PPE) and enacted the robust epidemiological methods mentioned above. Since the beginning of the pandemic, Taiwan has registered only 535 cases and 7 deaths (as of October 16th). It helps that Taiwan is a small country whose vice president is the epidemiologist who effectively handled the SARS outbreak in 2003. By contrast, countries that executed containment measures poorly, due to a lack of preparation and rampant misinformation, have suffered economic devastation from harsh lockdowns. Some, like India, allowed the pandemic to get out of control in spite of buying preparation time with one of the longest and most restrictive lockdowns in the world. It should be noted, though, that the case load and death toll in India would have been far higher without the lockdown, given its large, densely packed population and poor rural healthcare facilities. 

SARS-CoV2 becomes endemic

Imagine a worst-case scenario wherein we do not find an effective vaccine against SARS-CoV2. What then? The answer can be found in the trajectory of the 1918 H1N1 influenza. Early 20th century doctors did not have access to modern medicine and virology. In the absence of therapeutics and vaccines, the disease spread like wildfire, infecting 500 million and killing 50-100 million people. Its march was only halted when enough people developed immunity. Once this occurred, the virus became endemic. It continued to circulate for the next 40 years as a seasonal flu, before being displaced by the onset of the H2N2 influenza pandemic in 1957. To this day, we don't know why one virus was able to effectively eliminate the other. Unless there is extensive vaccination, it is likely that SARS-CoV2 will also become endemic. It might circulate seasonally like other endemic coronaviruses, but a combination of vaccination and natural immunity through infection would downgrade it from pandemic status. 

Vaccination

This is the most effective way out of a pandemic. The strategy worked brilliantly during the 2009 novel H1N1 swine influenza pandemic. Admittedly the virus was not as infectious as initially thought, but 6 months into the pandemic, scientists developed a vaccine. This slowed the spread, and this version of H1N1 now simply circulates seasonally as the flu. Flu shots are given annually to reduce the number of infections from endemic influenza viruses. Unlike vaccines against diseases like smallpox, which granted lifelong immunity and helped eradicate the disease, the flu vaccine only offers temporary protection. This is because the virus, in a bid to survive, mutates rapidly and is able to evade the protection bestowed by vaccination. The hope is for a vaccine against SARS-CoV2 to behave more like the smallpox vaccine than the flu vaccine, although the latter is more likely. Currently, there are 180 vaccines in development. Russia has authorised the use of its Sputnik V vaccine, but having skipped crucial phase three clinical trials before approval, it is being viewed with extreme scepticism. Its phase three trials are currently ongoing. The most high-profile vaccine in phase three clinical trials (meaning that it is very close to the end of the testing phase) is being developed by the University of Oxford and AstraZeneca. It uses a weakened chimpanzee common cold virus engineered to produce the same spike protein that is found on the surface of the SARS-CoV2 virus. This has shown promising results in ongoing trials and the hope is for it to be rolled out to the public early next year. Although a safe and effective vaccine may be delivered within a staggeringly rapid timeframe, there are several questions that need answering. Firstly, it is unknown how many doses may be required. It is probable that at least two doses will be needed, which means a demand of 16 billion doses to vaccinate the entire world population. 
This raises questions about how many doses can realistically be produced immediately. AstraZeneca says it has the capacity to produce 3 billion doses and has signed an agreement with the Serum Institute in Pune, India to produce a further 100 million doses. There is also uncertainty around vaccine rollout. This encompasses questions such as: Who gets it first? Will poorer countries be able to buy approved vaccines? How affordable will it be for an average person? Hopefully, world governments and the WHO can come up with answers, we can be vaccinated next year and this nightmare can finally end. 

Featured image: The Conversation. https://www.inverse.com/mind-body/how-do-pandemics-end

A TALE OF MANY PANDEMICS – PART 2: ZOONOTIC VIRUSES

My previous post described the oldest and best-known pandemic, the Plague. This was an example of a zoonotic bacterial disease with an insect intermediary (the flea). Any disease that originates in animals and successfully crosses the species barrier to infect humans is called a zoonosis, or zoonotic disease. These diseases can be caused by bacteria, viruses, fungi or parasites. In part II of this series on pandemics, I will delve into those caused by viruses that jumped from animals to humans and created absolute chaos when they emerged. 

Pandemics caused by zoonotic viruses

During lockdown, I watched the movie Contagion against my better judgement. At the end of the movie, a montage depicts how the novel virus enters the human population. Essentially, it jumps from bats (which are disturbed by human activity), uses pigs as an intermediary and finds its way to humans via a chef with poor personal hygiene. This fictional novel virus is an example of a zoonotic virus. Not all zoonotic viruses cause deadly pandemics. Most viruses are endemic, meaning they circulate constantly at a baseline level within a human population in one geographical location. The coronaviruses that cause the common cold are examples of endemic viruses. Other viruses are epidemic, whereby they are restricted to a small geographic location but cause infections at higher-than-normal rates. A classic example is Ebola, which is thought to have spilled over from fruit bats, although this is still unconfirmed. There have been several outbreaks since the 1970s, with the largest, the West African outbreak of 2013-2016, being declared an epidemic. When zoonotic viruses acquire a passport, cross international borders and cause abnormally high rates of infection, the disease is declared a pandemic. Zoonotic viruses have been responsible for some of the worst pandemics to hit humanity, not counting the current one we're living through. 

  1. HIV/AIDS – People don't usually think of Acquired Immunodeficiency Syndrome (AIDS) as a pandemic, and it is sometimes designated an epidemic by the WHO. However, the causative virus, Human Immunodeficiency Virus (HIV), has infected roughly 38 million people globally as of 2019. AIDS is thought to have started in the Democratic Republic of Congo around 1920, when HIV jumped from chimpanzees to humans. However, the current pandemic is only estimated to have started in the mid- to late 1970s. Recent advances in antiretroviral therapy mean that HIV infection is no longer a death sentence, as it is possible to suppress the virus with drugs and avoid developing AIDS. However, there is still no vaccine or permanent cure. 
  2. Influenza – There have been four influenza pandemics in the last 100 years, with the Spanish Flu of 1918 being the worst. The causative organism for all of them is the influenza A virus, which is predominantly found in birds. It is classified into several subtypes based on two identifying proteins (haemagglutinin {H} and neuraminidase {N}) on its surface. Each influenza A subtype is named for the particular variants of these surface proteins it carries. For instance, H1N1, the causative virus of the 1918 pandemic, carries haemagglutinin type 1 and neuraminidase type 1. Flu pandemics are different from the seasonal flu as they are caused by a virus that humans have never encountered before. The influenza A and B viruses that cause the seasonal flu are simply the latest iterations of the same viruses, modified by spontaneous genetic mutations. There is a vaccine for the seasonal flu, which should be taken at this time of year to reduce strain on our healthcare systems. 
  3. SARS/MERS – These two diseases were caused by coronaviruses, named for the glycoproteins on their surface that give them a crown-like appearance. Severe Acute Respiratory Syndrome (SARS) emerged in February 2003. The causative SARS-CoV virus is thought to have emerged from bats in 2002. It then jumped to civet cats before finding a home in humans. Middle East Respiratory Syndrome (MERS) is not classified as a pandemic, but I will mention it as it was a novel coronavirus with pandemic potential. It was first reported in Saudi Arabia in 2012 and is caused by the MERS-CoV virus. Like SARS-CoV, this virus is thought to originate in bats, but it was transmitted to humans from camels. Neither disease was as infectious as COVID-19, as case numbers remained in the thousands. However, the case fatality rate was extremely high (10% for SARS and 35% for MERS). There are currently no vaccines for either disease. 

Crossing the species barrier: Why does this happen? 

Why do these viruses spill over to humans from animals? The simple answer is proximity. Research into virus 'spill-over' events has revealed two stark truths: One, the majority of zoonotic viruses with pandemic potential arise from wildlife. Two, they can jump to humans because we have the unfortunate habit of exploiting wild animals and destroying their natural habitats. It should be noted that zoonotic viruses have also jumped from domesticated animals (swine flu) and birds (avian influenza), but the pandemics of the last 100 years all had wildlife origins. Once a virus jumps to humans, it still needs to establish itself and propagate through human-to-human transmission. Interestingly, many novel viruses cannot make it past the first human host. For a virus to cause an outbreak, several factors need to be satisfied. These include, but are not limited to: (1) frequency of contact between the animal host and humans, (2) mechanism of spread, for example airborne, waterborne or sexually transmitted, (3) ability of the virus to evade the host immune system and (4) efficiency with which the virus gains entry into human cells. If a novel virus is able to easily infect humans and spread without being detected, it has pandemic potential. SARS-CoV2 is causing the deadliest pandemic since the Spanish influenza because it is one of those rare viruses that satisfies all the above criteria. Through the illegal wildlife trade and wet markets, the virus had ample opportunity to encounter human hosts. It is easily spread through droplets released by coughing, sneezing and talking. It can evade the host immune system by suppressing the first line of defence against an infection and, finally, it can easily enter human cells using the ACE2 receptor on lung cells. 

How do we prevent the next big pandemic? 

Honestly? We might not be able to. Since the turn of the century, there have been pandemics or emergent novel viruses practically every decade. There are epidemiologists and virologists whose job it is to identify zoonotic viruses and estimate the risk of 'spill-over' events. However, without major shifts in policy and human lifestyle, their efforts will be meaningless. Although some wildlife trade has been banned and wet markets better regulated, critics say these measures are insufficient. Pandemics are not caused by animals going berserk. They are a direct result of irresponsible human behaviour. Changing the way humans interact with wildlife and farmed animals could significantly reduce the risk of future pandemics, even if it will not completely eliminate them. Researchers at the University of Cambridge recently came up with a list of solutions that can be implemented by policy-makers. Some of these are common-sense measures such as keeping farmed animals away from humans and wildlife, using effective personal protective equipment (PPE) and maintaining good animal health and hygiene standards. Others would involve significant expenditure on an international scale and new laws that could impact people's livelihoods. All humans have agendas, and we dislike having them interfered with. However, if we don't get our act together and take immediate, decisive action, in a decade we might find ourselves in the throes of yet another deadly pandemic. 

Featured image: SARS-CoV2 diagram courtesy New Scientist (CDC/SCIENCE PHOTO LIBRARY)- https://www.newscientist.com/term/covid-19/

A TALE OF MANY PANDEMICS-PART 1: THE PLAGUE

We live in unprecedented times. At the inception of 2020, who knew that we would exist in a world where our actions are defined by the risk of catching Covid-19? Consumed by the uprooting of our lives, it is easy to forget that this is not the first pandemic to bedevil the human race, nor will it be the last. In part I of this series on pandemics, I will discuss the Plague. 

Bubonic plague

In Shakespeare's Romeo and Juliet, the character of Mercutio comes off worse in a fight and curses his opponents with a "plague on both your houses!" as he is dying. The disease that Mercutio refers to is the bubonic plague, commonly attributed to the 14th century Black Death and two other plague pandemics: the Plague of Justinian in 541 AD and the Third plague pandemic in the mid-19th century. The bubonic plague is caused by a bacterium called Yersinia pestis, which usually lives on fleas, which in turn are primarily hosted by rats. As Y. pestis multiplies, it blocks the flea's digestive tract, which makes the flea very infectious and very hungry. As the flea feeds, it infects its rat host with Y. pestis, which eventually kills the rat. Now without a host to feed on and starving, the flea goes looking for another host. If this happens to be a human in close proximity, which was often the case on crowded, rat-infested ships, that unfortunate person gets bubonic plague. 

What really caused the Black Death? 

We've been told for over a century that the Black Death was the bubonic plague. What if it wasn't? Scientists started questioning the narrative when they compared the spread of the Black Death in the 14th century to that of the third plague pandemic. Here's the evidence they presented. 

  1. The disease spread too rapidly for it to be a rat and flea borne disease. The Black Death spread across Northern Europe in the middle of a harsh winter (conditions unfavourable for flea and rat survival). The rate of spread was 4 km per day, much faster than a rat can move. In contrast, the 19th century plague in India took 6 months to move 100 meters. The disease was also identified in towns hundreds of kilometres away from each other, without any reported intermediary infections. 
  2. The disease was violent for too long. The Black Death ravaged Europe for 300 years. In fact, France would have annual outbreaks and some of these spawned European epidemics. Typically, infected fleas kill their rodent hosts, resulting in cycling of rodent populations. This does not allow the disease to establish itself as there cannot exist a stable, disease resistant rodent population. 
  3. The mortality rate was too high. The Black Death had a mortality rate of nearly 100%. This was 10 times higher than the reported mortality of modern bubonic plague. In fact, humans can be infected with Y. pestis without falling ill and there are milder forms of the bacteria that do not kill. 
  4. The Black Death was directly infectious. Mandatory social distancing (4 meters) was imposed during the Black Death. It was found to be extremely beneficial, indicating that disease transmission was propagated by human to human contact. This is in direct contrast to what we know about the bubonic plague requiring intermediary flea carriers.
  5. Development of disease resistant mutations in humans. There is a specific genetic mutation (CCR5-∆32) that is enriched in European populations. Research suggests that this CCR5 deletion appeared roughly 2000 years ago, and its frequency increased to present levels due to positive selection by several plague epidemics/ pandemics, including the Black Death. Basically, this mutation protected people from dying of the plague, so survivors passed it on to their progeny, hence amplifying the mutation in the human population. The CCR5 protein is actually used by the HIV virus to access human cells. Hence, people with this mutation that deletes CCR5 are protected from HIV. However, there is no evidence to suggest that this protein is also used by Y. pestis to gain cell access. 
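The spread-rate contrast in point 1 above can be made concrete with some quick arithmetic, taking the figures as quoted (4 km per day for the Black Death; 100 metres in roughly 6 months, approximated here as 180 days, for the 19th century plague in India):

```python
# Comparing the quoted spread rates of the Black Death and the Indian plague.
black_death_m_per_day = 4000           # 4 km per day
india_plague_m_per_day = 100 / 180     # 100 metres over ~180 days

ratio = black_death_m_per_day / india_plague_m_per_day
print(f"The Black Death spread roughly {ratio:,.0f}x faster")
```

A difference of several thousand-fold is exactly the kind of discrepancy that made scientists question whether a flea-and-rat vector could account for the Black Death at all.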

Based on this circumstantial evidence, scientists came up with an alternative hypothesis: perhaps the Black Death was caused by a virus. More specifically, a haemorrhagic virus like Ebola, due to the similarities in symptoms between the two diseases. It should be noted that this theory is backed by circumstantial evidence rather than hard, irrefutable data. 

A virus? Really? 

Fierce critics of this theory have offered up compelling counterarguments in support of the widely accepted bubonic plague explanation. 

  1. Rats and fleas were not the carriers. Although Y. pestis is typically transmitted by fleas that live on rats, perhaps this was not the primary transport mechanism during the Black Death. Scientists identified human body lice that can transmit Y. pestis from sick to healthy rabbits. Replace rabbits with humans and this could explain how the disease could spread in a cold climate. Similarly, plague does not exclusively require rats to propagate. Potentially, an animal that is not as severely affected as rats and humans could have developed resistance and sustained the disease for a long time. There were also no records of a spike in rat deaths around this time, lending credence to the idea that they were not the only animal carrier. 
  2. Fleas may have spread the disease earlier than thought. There are several flea species that can become infectious before multiplying bacteria block off their stomachs and starve them. Hence, an infectious flea can jump from human to human, spreading the infection in the absence of rats. Standards of personal hygiene were low in the 14th century, and humans infested with fleas or lice were not uncommon. This might explain why social distancing seemed to help curb the spread of the Black Death. 
  3. Pneumonic, not bubonic plague. Another form of plague that is also caused by Y. pestis is pneumonic plague. It infects the lungs and can spread from person to person via droplets. In the absence of antibiotics in the 14th century, this plague would’ve had a 100% fatality rate. This could explain the human- human transmission and rapidity of the spread. 
  4. DNA analysis of plague victims. Scientists went digging in the burial sites of plague victims to isolate DNA from the causative pathogen. Early results were conflicting, as 700-year-old DNA can be quite degraded, making it difficult to interpret results. Scientists tweaked their methods and identified a small circular piece of DNA called a plasmid, which is characteristic of bacteria. Sequencing this plasmid DNA revealed that it was a match to plasmid DNA found in modern versions of Y. pestis. Further sequencing of the nuclear DNA revealed changes in the sequence of ancient Y. pestis compared to its modern relative. Another genetic analysis identified two previously unknown strains of Y. pestis in victims of the Black Death. These data support the presence of a different strain of Y. pestis in Black Death victims: essentially the same pathogen, but one that appears to have caused a very different disease. It is possible that these ancient strains are no longer present, or perhaps they mutated into the less virulent strain that caused the modern plague pandemic in the 19th century. 

My inexpert opinion

I should stress that I am not an expert and this section is purely speculative. With the available evidence, I am inclined to agree with the majority that the Black Death was the bubonic plague, albeit caused by a different bacterial strain. However, I do think there are unanswered questions. In my mind, the biggest question is why there was an increase in the frequency of the CCR5 deletion. There are some data implying this could be due to smallpox (another viral disease) and not the bubonic plague as originally thought. The smallpox virus family is known to access immune cells using proteins similar to CCR5 and has been around more consistently than the plague. This could silence some sceptics, but the true impact of smallpox on positive gene selection needs to be better evaluated. There is clearly hard DNA data indicating that the Black Death was bubonic plague. However, it should be noted that acquiring and analysing ancient DNA is not without flaws. Due to DNA degradation and myriad contaminants, extracting the right DNA is a Herculean task that can have a high error rate. Although the plasmid DNA data presented in support of bubonic plague is compelling, the 'different strain' theory is incomplete without full genome analysis of the ancient bacteria. Personally, I don't think the virus theory should be entirely discounted until it is comprehensively disproved. If there really is an unknown, devastating virus out there that can re-emerge at any time, we should do our best to establish its existence (or lack thereof). After all, we do not want to end up with another COVID-19 situation, do we? 

ALL ABOUT THAT BASIC SCIENCE RESEARCH

I was chatting with a scientist cousin of mine and the one vexation we had about the state of research these days was the lack of investment in basic science research. After spending some time complaining, I felt it would be interesting to write about basic science research, how it is different from applied research and explain why it is so important. I should stress that the opinions expressed here are purely my own. 

At this point, I think it would be instructive to briefly describe the premise of scientific research. There is a lot we do not know about the natural world. Obviously, this raises a lot of questions. Scientists come up with hypotheses about the potential answers to these questions and design experiments to test them. The answers usually raise more questions, and research goes on, allowing scientists like me to hold jobs and buy way too many books at Blackwell's.

What is basic science research anyway?

Basic or fundamental science is the study of the world around us to understand how it works. It can be a study of cells (cell biology), microorganisms (microbiology) or molecules (molecular biology). Irrespective of the subject matter, scientists do basic research to increase human knowledge. To illustrate my point, I was looking for some intellectual quotes about basic science and came across this real humdinger by the organic chemist Homer Burton Adkins: "Basic research is like shooting an arrow into the air and, where it lands, painting a target." Sounds esoteric, right? What Dr. Adkins means is that basic science is a lot like looking for something unknown and, once it is accidentally-on-purpose found, focussing all further research on that one entity. 

How is that different from applied research?

As the name suggests, applied research is the application of knowledge acquired through basic research to solve problems faced by specific groups of people. Penicillin is a classic example. Alexander Fleming was a Scottish scientist and physician who accidentally stumbled upon penicillin, much like Dr. Adkins's analogy. He was a brilliant but messy scientist who left an uncovered petri dish of Gram-positive bacteria called Staphylococcus aureus on the window sill of his lab and went on a month-long holiday. When he returned, he found that his plates were contaminated with mould. Interestingly, the bacteria closest to the mould were dying, as evidenced by a clearing of bacterial colonies. He isolated the mould and found that it was a member of the Penicillium genus of fungi. Further investigation showed that this mould was effective in killing other Gram-positive bacteria responsible for deadly diseases like meningitis and diphtheria. He also realised that the mould was killing the bacteria via a 'juice' it secreted. He named this 'mould juice' penicillin. While this was very exciting, it was of little use if it could not be given to patients. This is where Howard Florey and Ernst Chain came in: nearly 12 years after the initial discovery, they were able to purify penicillin in large enough quantities to treat sick soldiers during World War II. The initial discovery of penicillin and its potential applications are an example of basic research. Its subsequent purification and deployment to the front lines in WWII are examples of applied research. 

Why do I think basic science is so important? 

It is obvious that without basic research, the modern technology and medicine that significantly improve people's lives would not exist. Unfortunately, the current research landscape does not appear to value purely basic science research. Most often, grants that ask for basic science funding have to be packaged as applied research grants, and even then, scientists might not get the money they've asked for. 

Without the initial observation that a plant pathogen was small enough to pass through a filter that was designed to trap bacteria, scientists would not have discovered viruses. Without further basic research into the biology of viruses, scientists would not have figured out how viruses infect and propagate in hosts. It is only by applying this knowledge about the structure and behaviour of viruses, that scientists have been able to develop drugs and vaccines to combat them. Without an initial investment in basic research and development, we would not have the internet, smartphones or GPS. We would not have drugs that treat cancer without basic research to understand the underlying mechanisms of why cells become cancerous and how they spread like wildfire. I can name so many more examples, but then this post would become unreadable. 

It is obvious that increasing human knowledge through basic research improves the health, quality of life and security of human beings. Why then is there a reluctance to fund and support such research? Naturally, in times of adversity, it is human nature to invest time and resources into finding an immediate solution. However, we will rapidly run out of innovative solutions without a deeper understanding of the underlying problem. Consider investment in basic research as setting up a retirement plan. We should save up and invest long-term for a brighter and more secure future. 

CALMING THE STORM: CONTROLLING CYTOKINE RAMPAGE

Crazy little thing called Covid-19

The onset of a global pandemic has meant that suddenly everyone is a virologist, immunologist or epidemiologist. People throw around phrases such as “flatten the curve” and “cytokine storm”, without really understanding what they are. I’m not an epidemiologist, so I’m not going to touch the first one. I’m not an immunologist either, but I can definitely have a go at discussing the second one. 

Storm? What storm?

Not an actual storm, obviously. A cytokine storm, or cytokine release syndrome (CRS), is simply a massive and rapid release of proteins called cytokines into the blood. Why does this happen, though? Let us assume that patient X is infected with SARS-CoV2, the coronavirus that causes Covid-19. The virus enters the body and travels down the throat until it reaches the lungs. There, it enters the lung cells via a door called the ACE2 receptor. Once inside, it hijacks the protein production machinery and starts rapidly multiplying. As it is a novel coronavirus, the immune system does not immediately recognise it as an invader, and the virus has time to make thousands of copies of itself. Once it has reached a certain threshold, the virus bursts out of the cells, effectively killing them. At this point, there are enough virus particles for the immune system to realise that something is very wrong, and it leaps into action. Like any war, this is a violent process, and the lung cells end up becoming collateral damage. This immune response also results in inflammation due to cytokines that are released by the stressed cells. This manifests in patient X as a fever of 37.8°C or higher. There is a delicate balance to be maintained: the inflammation response is essential for several functions, including recruiting more immune cells to the fight and removing dead cells. Sometimes, however, the immune system goes overboard. Sort of like that trigger-happy fighter who won't listen to orders to stand down. When this happens, there is an uncontrolled inflammation response, leading to a large release of cytokines. These start to attack other organs in the body, resulting in multi-organ failure and eventually death. 

Proteins to the rescue!

Unfortunately, once CRS sets in, it is very difficult to control and treat. Recently, however, scientists at MIT developed a novel protein-based tool that can be used to 'mop up' excess cytokines.1 This has the potential to mitigate the cytokine storm and reduce consequent fatalities. As a protein biochemist with an unnatural love for protein structure and function, this was the most exciting thing to happen to me during self-isolation!

Most people are aware that proteins are composed of amino acids. Of the twenty amino acids in humans, nine are hydrophobic (incompatible with water). These are abundant in the membrane-spanning regions of proteins that reside in cell or organelle membranes (membrane proteins). Protein receptors that bind to free cytokines are examples of membrane proteins. The tool developed at MIT is essentially a protein modification method that makes these receptors water compatible. This means that they can be deployed in the blood stream to bind the extra cytokines floating around and prevent them from creating systemic havoc. So how does this tool work? The scientists selected four hydrophobic amino acids: isoleucine (I), leucine (L), valine (V) and phenylalanine (F), and replaced them with three hydrophilic (water compatible) amino acids, namely glutamine (Q), threonine (T) and tyrosine (Y)2. The resultant protein was christened a QTY-variant receptor. The QTY receptors were fused to part (the Fc region) of an immunoglobulin G (IgG) protein, creating an antibody-like structure. These water-soluble fusion cytokine receptors were able to bind free cytokines with extremely high specificity and affinity, hence showing promise as a therapy for CRS. Before getting too excited, however, it is important to note that this effect was seen in a test tube. The QTY fusion receptors need to be tested for safety and efficacy in animal model studies, and effective delivery methods have to be designed. If these tests are passed, this could be the answer doctors are looking for in the treatment of CRS during Covid-19 and other diseases. 
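The substitution scheme described above can be sketched as a simple lookup. This is a toy illustration only, not the authors' actual method, and the example sequence is made up; the residue mapping (L to Q, I and V to T, F to Y) follows the QTY code described in the papers:

```python
# Toy sketch of the QTY code: swap the four hydrophobic residues
# for their structurally similar hydrophilic counterparts.
QTY_MAP = {"L": "Q", "I": "T", "V": "T", "F": "Y"}

def qty_variant(sequence: str) -> str:
    """Return the QTY-variant of a one-letter amino acid sequence,
    leaving all other residues unchanged."""
    return "".join(QTY_MAP.get(aa, aa) for aa in sequence)

# A hypothetical transmembrane-like stretch, for illustration:
print(qty_variant("GLIVFAL"))  # -> "GQTTYAQ"
```

The clever part is that the substituted residues are similar in shape to the originals, so the receptor keeps its structure (and cytokine-binding ability) while becoming water soluble.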

References

1.        Hao, S., Jin, D., Zhang, S. & Qing, R. QTY code-designed water-soluble Fc-fusion cytokine receptors bind to their respective ligands. QRB Discov. 1–18 (2020). doi:10.1017/qrd.2020.4

2.        Qing, R. et al. QTY code designed thermostable and water-soluble chimeric chemokine receptors with tunable ligand affinity. Proc. Natl. Acad. Sci. U. S. A. 116, 25668–25676 (2019).

CANCER CELLS ARE VERY BAD NEIGHBOURS

The expansionism of cancer cells

Scientists have made the startling discovery that cancer cells have the capacity to ‘corrupt’ their healthy neighbours into aiding and abetting their metastatic crusade.1 Sort of like that bad egg who convinces their straight-laced friends that doing drugs is a good idea. As part of a larger study to investigate the tumour microenvironment (TME), Dr. Ilaria Malanchi’s group at the Francis Crick Institute in London, in collaboration with other research groups across Europe, identified a population of cells that bear resemblance to stem cells and are able to support cancer development. This is exciting because this insight into the TME allows scientists to identify new drug targets and study how cancer cells are able to proliferate so rapidly.

The tumour microenvironment

Let me take a step back. What is this tumour microenvironment I speak of? To put it simply, it is the environment surrounding a tumour. It includes blood vessels, various cell types including fibroblasts and inflammatory cells, signalling molecules and the extracellular matrix.2 It has been known for a while now that cancer cells and the TME share a symbiotic relationship. However, not much is known about how metastatic cells cause early cellular changes within the TME. To answer this question, scientists designed an ingenious system wherein the metastatic cancer cells were engineered to secrete a fluorescent protein that could be taken up by neighbouring cells. This allowed them to spatially study the immediate microenvironment of those metastatic cells. 

The fluorescent protein labelling system

Fluorescent labels are amazing. Potentially, any protein in a cell can be tagged with a fluorophore and easily tracked using a confocal microscope. Also, the pictures are usually so colourful and pretty!3 In this study, a modified peptide (a very small protein) called sLP was tagged with mCherry, which fluoresces a lovely red colour when excited with a laser at 540–590 nm. Mouse-origin breast cancer cells called 4T1, which are typically used to study metastasis, were engineered to co-express sLP-mCherry and GFP. Why GFP? As the 4T1 cells would be the only ones expressing GFP, researchers could distinguish them from other neighbouring cells. When these engineered 4T1 cells were injected intravenously into mice to induce a lung metastasis, researchers found that sLP-mCherry could label surrounding host tissue cells. It could do that because it is a secreted protein that is also lipid-soluble: once it is secreted from a cell, it can cross the cell membrane of a neighbouring cell. This specific labelling allowed researchers to differentiate lung tissue immediately affected by the metastatic cells from more distal lung tissue. Using this strategy, the scientists could study all the various cell types that make up the TME around metastatic cells. It was during this process that they identified lung epithelial cells that expressed markers normally present in progenitor cells (cells that differentiate into other cell types). Through in vitro studies, they were able to establish that this modification in the epithelial cells was a direct result of the breast cancer cells metastasising to the lung.
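The sorting logic of this two-colour system can be sketched as a simple decision rule. This is a toy illustration of the gating idea, not the authors' actual analysis pipeline, and the intensity threshold is an invented placeholder.

```python
# Toy sketch of the two-colour gating logic from the labelling system:
# GFP marks the injected 4T1 cancer cells themselves, while uptake of the
# secreted sLP-mCherry marks host cells in their immediate neighbourhood.
# The threshold value is invented purely for illustration.
def classify_cell(gfp: float, mcherry: float, threshold: float = 100.0) -> str:
    if gfp > threshold:
        return "cancer cell (4T1)"    # expresses GFP itself
    if mcherry > threshold:
        return "labelled neighbour"   # took up secreted sLP-mCherry
    return "distal host tissue"       # outside the metastatic niche

print(classify_cell(gfp=500, mcherry=300))  # cancer cell (4T1)
print(classify_cell(gfp=5, mcherry=250))    # labelled neighbour
print(classify_cell(gfp=5, mcherry=10))     # distal host tissue
```

In practice this kind of gating is done on flow cytometry or imaging data with carefully calibrated controls, but the principle is the same: two colours, three cell populations.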

Why should people care about this research? 

The science sounds really cool, but other than that, why should people care? Why is this research important? I should emphasise at this point that the opinions here are my own. Firstly, the discovery that breast cancer metastasis to the lung leads to a stem-cell-like regenerative phenotype in epithelial cells gives scientists a new target for therapy. Studying these cell populations in the TME can allow for the development of novel drugs that could potentially target these cells and halt, or at least slow down, cancer metastasis. The fluorescent protein technology used in this paper can also be used to interrogate the TME and metastasis of other cancers. It can potentially be applied to other fields too, such as stem cell biology and developmental biology, to track cellular differentiation.4 This opens up so many new avenues for research and development. The bottom line is that cancer cells are not great in general, but it turns out that they can also be a really bad influence on their healthy neighbours.

References

1.        Ombrato, L. et al. Metastatic-niche labelling reveals parenchymal cells with stem features. Nature 572, 603–608 (2019).

2.        Wang, M. et al. Role of tumor microenvironment in tumorigenesis. J. Cancer 8, 761–773 (2017).

3.        Toseland, C. P. Fluorescent labeling and modification of proteins. J. Chem. Biol. 6, 85–95 (2013).

4.        Islam, I. et al. In Vitro Osteogenic Potential of Green Fluorescent Protein Labelled Human Embryonic Stem Cell-Derived Osteoprogenitors. Stem Cells Int. 2016, (2016).

MY EXPERIENCE WITH IMPOSTOR SYNDROME

Impostor syndrome. That wonderful (insert sarcastic eye roll) feeling of insufficiency at a job we are perfectly capable of doing. For the blissfully uninitiated, impostor syndrome or impostor phenomenon is a psychological state in which people who have clearly achieved things in their lives feel like frauds. They live with the constant anxiety that their accomplishments are down to sheer dumb luck and that they will soon be revealed as impostors masquerading as experts in their field. This is not limited to experts though. I myself suffer from crippling impostor syndrome and I am quite far from being an expert in my field. In fact, it took a huge amount of self-motivation to start this blog. It’s funny to think of now, but it took me nearly an hour of hovering before I hit the submit button on my first post! While the positive feedback has been fantastic, the little devil on my shoulder that is impostor syndrome is still waiting for the other shoe to drop.

In her book The Secret Thoughts of Successful Women: Why Capable People Suffer from the Impostor Syndrome and How to Thrive in Spite of It, Dr. Valerie Young has compiled a helpful list of subgroups into which sufferers can be classified.

  1. The perfectionist – Perfectionism and impostor syndrome are a lethal combination. Perfectionists tend to set lofty, sometimes unrealistic goals for themselves, and when they fail to meet those expectations, major self-doubt can set in. 
  2. The superwoman/superman – These people tend to push themselves to the brink to overcome a false sense of inadequacy. Usually the first into the office and the last out, they inevitably put their mental and physical health at risk. 
  3. The natural genius– A close relative of the perfectionist, this group believes in the need to be a natural achiever or genius. They assume that they have failed if they have not achieved their goals on the first try. 
  4. The soloist– These people are fiercely independent. They feel that they have failed and been revealed as phonies if they have to ask for help to complete their work. 
  5. The expert– Experts determine their worth based on the amount of knowledge they possess. Their biggest fear is that they will never know enough and be exposed as inexperienced or ignorant. 

I’m going to make this post personal and discuss my own impostor syndrome. I identify with categories 4 and 5 the most: the soloist and the expert. I suppose that this is an outcome of my training as a scientist. The standard expectation of a graduate student or postdoctoral scientist is scientific independence. In me, that manifested as a reluctance to ask for help even when I needed it, due to my irrational fear that people would think I was incompetent. Many times, this proved to be a huge detriment because I would needlessly struggle with tasks on my own, when it would have been so much more efficient and better for my sanity if I had asked for help. Another facet of being a scientist is the expectation that we are experts in something, however small. I spend an unnecessary amount of time stressing that I will do something that will expose me as totally inexperienced and serve me up for the judgement of my peers. It is interesting because impostor syndrome is an illogical state that stands in stark contrast to my analytical and logical nature. 

So how do I deal with this on a daily basis? Whenever crippling self-doubt hits, I stop what I’m doing and take a minute to regroup. This usually involves a few deep breaths and counting to 10. Then I remind myself that I managed to get a PhD from a reputed university and get a postdoc at one of the world’s top universities. I wouldn’t have got here if I was genuinely rubbish at my work. On a more practical note, I have recently started forcing myself to start thinking about scientific questions and issues removed from my own. For example, I am a protein biochemist by training and the majority of my research has been on proteins involved in multidrug resistance. I recently contributed an article on plastic degrading bacterial enzymes to my departmental science communication journal. My first instinct was to shy away from the task due to the fear that people would find inaccuracies in my article. I then went through my ‘Nah you’re awesome!’ pep talk and delivered an article that fortunately got a lot of positive feedback. This blog started as a way for me to pen down my thoughts and discuss interesting science. It has also incidentally fed my mission to banish or at least diminish my impostor syndrome. I doubt that I will ever reach a point in my life where I have absolute confidence in my abilities. However, this trial by fire approach seems to be helping reinforce my scientific abilities so far. Everybody’s experiences are different. I don’t expect my approach to help everyone, but if I manage to positively impact even one other person with this blog post, I will consider that a huge win. I have no illusions about my influence, but I really hope this can start an honest conversation about impostor syndrome and how we can deal with it. 

EVOLUTION: HAVE WE FOUND A KEY MISSING LINK IN THE STORY?

Obviously evolution is real. It’s not ‘just a theory’, it’s not a belief system; it’s a scientific theory backed by an overwhelming body of evidence. Now that the scientific disclaimer is out of the way, how did complex organisms such as Homo sapiens (humans like you and me) really come to be? How did humans evolve from single-celled organisms whose physiology and cell architecture are rather primitive compared to ours? At this juncture it is important to state another disclaimer: I’m not an evolutionary biologist. I just find evolution endlessly fascinating. 

Life forms are broadly classified into two groups: prokaryotes and eukaryotes. The prokaryotes appeared first on the infant Earth and include all the various bacteria. Eukaryotes evolved from them about 1–1.5 billion years later and are far more complex. Humans are essentially multicellular eukaryotes. If we put a single prokaryotic cell and a single eukaryotic cell next to each other and just study their architecture, there are several striking differences. One of these is the presence of several tiny organs or organelles in eukaryotic cells that have diverse functions. The mitochondrion, more popularly known as the powerhouse of the cell, is one such organelle. Through oxygen-based metabolism, mitochondria generate a molecule called adenosine triphosphate or ATP, which is the energy source that drives cell function. Mitochondria are curious organelles because they seem to function like independent organisms. They also have their own genetic material (DNA), which encodes some of their components. These observations have led to the endosymbiotic theory: modern eukaryotes evolved from a symbiotic relationship between two or more prokaryotes, essentially bacteria living within larger cells.

Based on current data, it is thought that eukaryotes may have evolved from an ancient lineage with the best possible name. The Norse mythology / Marvel comics nerd in me loves the idea that we might have evolved from an organism belonging to the group called Asgard archaea. Members are named after popular characters such as Thor, Loki, Odin, Hela and Heimdall. Asgard archaea are thought to be the most likely ancestral candidates due to their eukaryote-like genomic features. However, in spite of such compelling evidence, it has always been difficult to clearly understand the evolutionary transition from Asgard archaea to eukaryotes. The main problem has been the lack of a pure sample of Asgard archaea that could be genetically analysed. Much of the data to date has been unreliable due to sample contamination and variations in protocols. It is possible that this problem has now been overcome and we might be getting answers to this baffling question.

A group in Japan spent nearly 12 years attempting to isolate an Asgard archaeon related to Lokiarchaeota. As an aside, can I just marvel at how the Japanese are willing to fund basic research for 12 long years? This group isolated the archaea from deep sea sediments found 2533 m below the sea surface. Trials, tribulations and several cool electron micrographs later, they identified round (coccoid) archaea with an average diameter of 550 nm. These organisms were christened Prometheoarchaeum syntrophicum strain MK-D1, referred to as MK-D1 from now on. By isolating a pure sample, they were able to achieve a ‘closed genome’, allowing them to prove that MK-D1 represents the closest archaeal relative of eukaryotes. A key feature of MK-D1 was its dependence on at least one other type of archaea (Methanogenium in this case) for sustenance via the catabolism (breakdown) of amino acids. This phenomenon, called syntrophic growth, is the cornerstone of this evolutionary theory. Genetic, biochemical and morphological analysis of MK-D1 in association with Methanogenium led to the ‘Entangle-Engulf-Enslave’ evolutionary model. According to this model, when an Asgard archaeal ancestor was challenged by a rapidly oxygenating Earth, it might have been forced to syntrophically interact with an oxygen-scavenging alphaproteobacterium (PA). Through protrusions on its cell surface, the archaeon might have established a close physical interaction with PA, ultimately leading to its engulfment. This might eventually have led to the enslavement of PA by entrusting it with ATP-generating metabolism, causing the new fused organism to switch from amino acid catabolism to the transport of ATP from PA to the cytoplasm for sustenance. Hence, PA is essentially a primitive mitochondrion. 

This research might seem insignificant to most, but it is a huge step forward in our understanding of how we evolved. It is not the end point though; this research needs to be refined before we have a definitive understanding of how eukaryotes evolved. One of the biggest questions that needs to be answered is: how did the other cell organelles evolve? MK-D1 did not have any cell organelles, so how did those appear? Additionally, it is important to understand how the enslaved PA evolved into the modern mitochondrion. Was it just natural selection? Now that we have a clear starting point for research, I’m really looking forward to finding out the answers. Maybe 12 years from now?

Reference: Imachi, H. et al. Isolation of an archaeon at the prokaryote-eukaryote interface. bioRxiv 726976 (2019). doi:10.1101/726976