Transitions

Hello! This blog was formerly known as DNA Rambles. I created this site while working as a bench scientist in a cancer genetics lab, but I’ve moved on, and I think it’s high time this blog moved on too.

I am now a PA (physician assistant) student at Boston University School of Medicine! I’ve got the white coat and everything. What I don’t have is much free time. The program started in April, and since then I’ve been buried in lectures, exams, and a mountain of information. Medical school is no joke.

A creative outlet is necessary though, and so I’m returning to writing. As my brain is completely immersed in medicine right now, I’m planning to write creative fiction based on accurate medical science.

For example, have you ever considered what a tick bite would look like from a cellular perspective?  Below is a preface to a short story I would like to flesh out, and will hopefully post sometime soon. Enjoy:

Preface


It was a quiet night in The Knee. The Heart thumped sonorously at a steady, slow rate of 43 beats per minute, unhurriedly swirling erythrocytes through the popliteal artery as it snaked its way down the back of The Knee. But something was amiss. For buried in the nook created by the tendons of the biceps femoris, a tick poised itself to engorge on a bloodmeal. The arachnid’s pincer-like jaws clamped down, spiking through the epidermis and penetrating the vasculature pulsing languidly beneath. As the cytoplasmic contents of the shredded cells triggered alarm bells in the subcutaneous tissue, salivary secretions dripped from the tick’s feasting maw into the wound. Within those secretions tiny coccobacilli rode the wave to their new host.

Antibodies and Scientific Credibility

There are no tweezers small enough, no pipette tips narrow enough, to let scientists physically touch proteins. They’re just too dang small. So how do we get around that? Antibodies.

Without these small glycoproteins, modern biology as we know it would not exist. In fact, without antibodies you would not exist. These proteins recognize pathogens in your body, bind to them, and direct white blood cells to destroy them. This elegant defense is the work of the adaptive immune system. The “adaptive” part of that name refers to the ability of B-cells to create new antibodies based on interactions with pathogens. Basically, if your body encounters something that will make you sick, it not only stores the molecular fingerprint of the offender, but also produces antibodies that match that fingerprint, so that any similar attacker can be caught and destroyed.


An antibody binding to a pathogen. Image from turbosquid.com

Scientists have learned how to take advantage of this adaptive quality of the immune system to make molecular fingerprints of proteins they are interested in. They take their protein, inject it into a rabbit, mouse, or some other host animal, and then collect the antibodies the animal raises in response to the injection. Of course, this is a stripped-down version of antibody generation, but you get the idea.

You can imagine that this procedure is both expensive and time-consuming. A scientist has more pressing matters to attend to than generating antibodies for their experiments. It was logical, then, for commercial reagent companies to step in and take on that role.

Of course, once you allow the free market to step in, you have multiple companies selling competing antibodies. Some companies are definitely better than others, and this becomes a problem when, as a scientist, you are trying to interpret results. It is not enough to trust the scientist doing the work; you also have to trust the company that manufactured the antibody.

So, if you are looking at results from immunofluorescence staining, for example, not only do you have to question the results in terms of their biological relevance, but you also have to question the specificity and validity of the antibodies used.


Antibodies are used to highlight cellular structures, like the Schwann cell (red) in this neuromuscular junction.


This is not to say that our current modus operandi is inherently flawed. Rather, because of the lack of regulation and verification, commercial antibodies not only introduce waste into scientific endeavors (both in terms of time and money), but also add a layer of suspicion to most results. If antibodies were vetted by some unbiased third party, much of the fat in laboratory expenses could be trimmed, and that, I think, is worthwhile.

The intersection of -omics

One could argue that the art of Science is choosing which lens to view your subject through. For certain diseases, you could apply a variety of -omics (genomics, proteomics, metabolomics, etc.) to explore the mechanisms behind the pathology. Each -omic, each lens, will offer the investigator different insights into the same larger truth.

The problem is that people become deeply invested in their own particular lenses, and therefore see only one angle of the deeper reality. A geneticist may treat a mutation as the only important finding in the same tumor sample that a metabolomics expert is keenly interested in because he believes he can pinpoint how cancer cells use local lipid stores to fuel their progression. Both points of view provide insight into a larger truth: how the tumor grows in vivo, at the intersection and ultimate culmination of every -omic occurring simultaneously.

What is the answer to this problem? Undoubtedly, collaboration is important. Bringing together experts from each of the -omics to provide a cohesive picture is not only helpful for contextualizing one another’s research, but imperative for establishing a framework within which to target interventions.

But I like to think of a scientist as the person behind the microscope, able to flit between one lens and the next with practiced ease. To me, if you want to attack a pathology with the keen scientific insight required to tackle Modern Science’s most difficult questions, you have to have a macro, multi-faceted view of your subject. It is not enough to know just one angle. A scientist must strive for the larger truth behind the sliver they normally see. That truth? It lies at the intersection of the -omics.


Eye of the Beholder

The human eye is estimated to have a resolution of 576 megapixels. The camera attached to my laboratory’s microscope has 8. That’s the same number of pixels an iPhone camera collects, and while an iPhone can capture breathtaking images that convey everything a viewer might wish to know in one shot, those images will never hold a candle to what can be seen with the human eye.

What happens, then, when the human eye is removed from the equation? Many microscopes these days are built without eyepieces, and are instead controlled and viewed entirely through (often proprietary) software. This not only cuts costs for both the manufacturer and the customer (optics are expensive to fashion and align), but also allows for fun environmental perks, such as live-cell imaging with temperature and carbon dioxide control. And don’t get me wrong, these machines can take beautiful images: check out the picture below if you don’t believe me.


Dysmorphic vasculature in the mouse kidney.

However, with immunofluorescence in particular, a common problem plaguing the interpretation of data is autofluorescence, or background fluorescence. When you are looking through the eyepiece of a standard confocal microscope, your eye usually does a pretty good job of discerning true signal from false autofluorescence. This, in turn, allows you to adjust the camera’s settings so that you acquire a biologically accurate image.

But with these software-controlled microscopes, using a camera as an eyepiece means you cannot easily detect the nuances in emission from your fluorophore. Already I’ve begun to see images creeping into the literature that are overexposed, full of background, and generally of such poor quality that they do little to highlight biologically significant features. That is why I am asking you, dear reader, whether you are a student, post-doc, PI, or anywhere in between, to always opt for optics. The fanciest software in the world is no match for the power of the human eye.
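If you cannot fully trust a camera-as-eyepiece at the scope, you can at least sanity-check the exposure afterward. Here is a minimal sketch of that idea (not any vendor’s software; it assumes a single-channel image already loaded into a NumPy array, and the percentile cutoffs are purely illustrative): it flags clipped pixels and estimates the background floor so that overexposed, background-heavy images get caught before they creep into a figure.

```python
import numpy as np

def exposure_report(img, bit_depth=16):
    """Flag overexposure and estimate background in one fluorescence channel.

    img: 2D array of pixel intensities. bit_depth: camera bit depth
    (16-bit is common for scientific cameras; adjust if yours differs).
    """
    max_value = 2 ** bit_depth - 1
    saturated = float(np.mean(img >= max_value))   # fraction of clipped pixels
    background = float(np.percentile(img, 5))      # dim pixels ~ autofluorescence floor
    signal = float(np.percentile(img, 99.9))       # brightest pixels ~ stained structures
    return {
        "saturated_fraction": saturated,
        "background_level": background,
        "signal_to_background": signal / max(background, 1.0),
    }

# Quick demonstration on a synthetic image: dim, noisy background plus one bright structure.
rng = np.random.default_rng(0)
img = rng.normal(500, 50, size=(512, 512)).clip(0, 65535)
img[200:220, 200:220] = 60000
print(exposure_report(img))
```

A saturated fraction above zero, or a signal-to-background ratio hovering near 1, is exactly the kind of image I am complaining about here.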

You’re a very special scientist

At what point did you decide you were a scientist?  Was it the first time you looked through a microscope, or the first box of gloves you went through?  I think that more often than not it is a gentle gradient.  You keep doing experiments, failing, doing them again, and after a while, they work more than they fail.  You feel like you can do science, so you’re a scientist.

But science isn’t just about getting a specific protocol right, is it?

I think it is.

It is well-accepted that a fully-fledged scientist fills a very narrow niche.  One tiny lacuna in the large matrix of bone, which is part of a massive SKELETON OF SCIENCE.  Sorry, I’ve been brushing up on my bone anatomy.

Which leads me to my point.  My PI, a man held in high regard in his field (the genetics of tuberous sclerosis complex, for those interested), is extremely reluctant to dip his toe into bone biology.  A mouse model I’ve been working on has some interesting bone pathology, and we’re wondering whether or not to pursue it further.  His reluctance stems from his lack of knowledge of the field.  But as a scientist, shouldn’t he be equipped to set up a rigorous, well-thought-out experiment, execute it, then carefully examine the results, regardless of subdiscipline?

No.

Bone biology is very specific, with its own particular lexicon of the esoteric.  One of the primary tools for bone biology, for example, is computed tomography, or CT.  In a hospital, if you get a CT, it is read by a radiologist – a person with a specific degree or certification to understand the minutiae that will yield your final results.  That radiologist has had years of training to comprehend this work – as a geneticist, that is not something you take on lightly.

When you learn about science, you learn about ideology.  You learn the rigor of the scientific method, experimental design, and statistical analysis.  You are taught that the experiment you set up should reflect the question you’re asking.  That is to say, if you want to set up an experiment using a model organism, the organism you choose should depend on the experiment, not on your previous use of said organism.  This is even emphasized in protocols for animal care and use.  But no one follows it.  If you work on Drosophila, you will continue to work with fruit flies.  Same with zebrafish, mice, etc.

My point is that scientists, while theoretically equipped to face any question with a variety of tools and techniques, become entrenched in their habits.  Personally, I think this is a good thing.  I want an expert, not a jack-of-all-trades, when it comes to pushing the bounds of human knowledge.

But this presents a dilemma when we approach problems.  As a geneticist, do I attempt to answer a question in bone biology, or do I let it slide?  What is lost and what is gained?

Collaboration is the answer in most cases, but then your experiment becomes bogged down in the politics of modern-day science.

I don’t have an answer here.  These are just DNA Rambles.  What my gut tells me, though, is that as scientists we must constantly search for a new vantage point, a new way to look at something.  If that means leaving our comfort zone, our established field, then so be it.  Let’s return to the ideology.  Let’s learn, make mistakes, then try again.  Sure, I like being a geneticist, but bone biologist sounds pretty sweet too.

Alan Turing and the Necessity of Gay Scientists

If you are reading this, you can thank a gay scientist. An idiosyncratic man by the name of Alan Turing – one of the most brilliant minds this world has ever seen – also happened to be gay.  If you know anything about his history, or have watched The Imitation Game, you’ll know that his sexual orientation proved to be the root of his untimely and tragic downfall.  The UK still criminalized homosexuality at the time, and when given the choice between jail and chemical castration, Turing (an unsung national hero) was forced to choose the latter. Chemical castration is one of the more inhumane practices society has inflicted upon individuals, and it drove him to suicide. Now, today actually, Turing’s family walked in the largest Gay Pride parade London has ever seen.

This is a benchmark for how far along we have come as a society, but think about what was lost.  Turing’s machine paved the way for modern computers.  A whole science – computer science –  was invented by him.  Yet he died much too soon, 16 days before his 42nd birthday.  We will never know what genius marvels he would have thought up had society not been too backward to let him lead the life he wished.

And this is exactly my point.  Science is genderless, unbiased, and secular.  No matter who you are, if your ideas can stand the rigor of the scientific method, you can do science.  But, as we recently saw when Nobel Laureate Sir Tim Hunt made off-hand sexist comments, science is by no means free of bias. The problem is that science is done by scientists who are, believe it or not, people.  And people can be sexist, racist, homophobic, transphobic, and so on. Not to mention the micropolitics of individual research institutions and competing labs.  All of this adds up to social mores impeding the progress of science.

So here’s my point: don’t let your biases and preconceptions stop scientists from doing science.  Treat every man, woman, and transperson with the same respect, intellectual criticism, and encouragement regardless of race, sex, gender, or sexual orientation.  Let scientists do science, and let the next Alan Turing usher us into a new age of technological wonder we have yet to even imagine.

Inexperience: Frustrations of a Liberal Arts Graduate

I watched with nervousness as my PI scanned through the wells of the plate.  Five of the six wells held happy little pericytes growing blissfully, while one well was fraught with contamination.  Some sort of fungus or bacteria had taken up residence, killing my precious culture.  The thing is, I have very little experience with cell culture.  So when my PI looked at the Well Of Shame, he, understandably, became very upset.

SEM of pericyte

You see, cell culture contamination can ruin experiments, even month-long projects.  As soon as you even suspect contamination, you need to bleach the plate, spray everything it ever touched with ethanol, and maybe burn the clothes you were wearing.  It’s seriously that bad.  But me?  After I noticed it, I simply marked the well with a red marker and moved on, thinking I would take care of it later.  That’s like noticing you’re not wearing your glasses and continuing to drive anyway.  It’s stupid and irresponsible, and it’s representative of many of the situations I’ve faced as a liberal arts graduate, all of them stemming from inexperience.

This is not to say that I don’t value my college education.  Having attended Grinnell College, one of the premier liberal arts institutions in the country, I consider myself intellectually equipped to face almost any challenge.  I can adapt easily, and have been trained to think critically and creatively, attributes highly praised in science.  The problem is, I have very little experience with the technical aspects of science.

Grinnell College in winter is a beautiful, bone-chillingly cold thing.

I could tell you every step of PCR, but that doesn’t mean I know why a reaction might fail or how I should dilute my primers to create easy-to-use stocks.  There are so many minutiae in the day-to-day workings of a lab that never make it into a liberal arts education, and at times, like yesterday in the culture room, I really feel that lack.

It’s an odd dichotomy: feeling intelligent yet inexperienced, mentally equipped yet technically lacking.  It often creates a frustrating sense of helplessness that can only be overcome by asking for help.  But sometimes it’s difficult to gauge what you need help with and what you can figure out on your own.  I’ve learned that asking for help is almost always your best bet, but there are only so many hours in a day, and research is an incredibly time-consuming endeavor; sometimes post-docs are simply too busy, and sometimes you’re so busy you decide it’s too small an issue to get worked up about.

As someone who enjoys science and is currently working in research, but is not anticipating a career in it, I am glad that my education affords me a macroscopic view of science as a discipline and as a human endeavor.  But sometimes I really wish someone had told me to bleach the contaminated well.

Happy (belated) International Women’s Day!

Yesterday was International Women’s Day, which celebrates women all around the world.  So today, I want to tell the (abridged) stories of three women in science whose work has shaped my life in science.  Often under-credited and overlooked, these women made contributions in a field that largely rejected their ideas as inferior, and in doing so, they not only proved that women can achieve amazing scientific feats, but paved the way for generations of scientists to come.


Rosalind Franklin


Sometimes referred to as “The Dark Lady of DNA,” Rosalind Franklin is the real scientist behind Watson and Crick’s famous “discovery” of the double-helix structure of DNA.  Franklin was an X-ray crystallographer, meaning she used X-rays to image crystal structures – often biological molecules coaxed into crystal form.  This requires math chops like you wouldn’t believe; modern X-ray crystallography leans heavily on computation.  But Franklin, being the badass that she was, did the calculations by hand.

Now, according to many first-hand accounts, Watson and Crick were imaginative and creative, but also kind of jackasses.  Franklin, being the hard-nosed scientist she was, did not work well with them.  However, they did collaborate, and together they tried to elucidate the structure of DNA.  Watson and Crick were more interested in modeling than in the science, though, and when Franklin viewed their premature model, she acerbically noted, “It’s very pretty, but how are they going to prove it?”

Eventually, of course, Watson and Crick did manage to build a model of DNA, largely based on Franklin’s work.  The picture below, for example, an X-ray diffraction image of DNA, definitively shows DNA as a helical structure.

Did Rosalind Franklin receive any credit for this monumental discovery?  No!  It wasn’t until TWENTY-FIVE YEARS LATER that Franklin’s contribution was even acknowledged, and it was buried behind some pretty sexist bullshit in Watson’s The Double Helix.  While Watson has since changed his tune, saying that Franklin should have been awarded the Nobel Prize in Chemistry, the fact remains that his Nobel stands on the unseen shoulders of the real scientist behind the discovery: Rosalind Franklin.


Marie Curie


You cannot talk about women in science without bringing up Marie Curie.  To quote Wikipedia, “She was the first woman to win a Nobel Prize, the first person and only woman to win twice, the only person to win twice in multiple sciences, and was part of the Curie family legacy of five Nobel Prizes.”

This is a woman who quite literally devoted her life to science.  Before her pioneering research, radioactivity wasn’t even a word – she coined it!  She discovered two elements, began the first radiation treatments for cancer, and developed novel techniques for isolating isotopes.

During the time of her research, people still were not convinced that there was anything smaller than the atom; after all, atom literally means indivisible.

Curie’s work on radioactivity rested on the hypothesis that radiation is not the result of molecular interactions but comes from the atoms themselves.  This of course proved to be correct, and was instrumental in shifting how we view the fundamental particles of our universe.

Sadly, Curie was a pioneer in a field whose dangers had not yet been mapped.  She died of aplastic anemia as a result of radiation exposure.  She is immortalized in her field of research: a unit of radioactivity, the curie, bears her name.


Carolyn Porco


Last, but certainly not least, I want to introduce you to a wonderful woman named Dr. Carolyn Porco.  Dr. Porco is a planetary scientist who has devoted much of her career to studying Saturn.  She has been involved in the Voyager missions (no, not Star Trek), as well as, more recently, the Cassini mission.  The discoveries she has made through the fantastic images taken way out in the deep space of our solar system have changed how we view ourselves.

Don’t believe me?  Well, Dr. Porco is not only a great scientist, but also an amazing speaker and science advocate.  She has been on the TED stage several times, and has spoken at many other conferences around the world.

I STRONGLY encourage you to watch her TED talk on her Saturn work.  She can inspire you in a way that a mere blog post could never hope to match.

Dr. Porco continues to do research and push forward the boundaries of human intellect wherever she goes.


I hope you’ve enjoyed my short blurbs on these amazing women, and I want to end by saying that science is a genderless endeavor, and all who are gripped by scientific curiosity should be allowed to pursue it.

What if? Microsoft Kinect and Confocal Microscopy

It is a well-known phenomenon that some of your best ideas come when you are thinking late at night.  That happened to me recently, and I wanted to share some of those thoughts.  But first, I have to cover a little background.

Confocal Microscopy

Immunofluorescent Microscopy: adding visual effects to science!

Much of my undergraduate work involved immunofluorescence: a type of staining in which fluorescently-tagged antibodies are applied to a specimen, target your epitope of interest, and fluoresce when exposed to certain wavelengths of light.  In other words: they glow under the right conditions, showing you exactly the structures of the cell you want to see.  In effect, this means that when you look through the eyepieces of a fluorescent microscope, the cellular world explodes in a vivid landscape of vibrant hues before your eyes.

If you don’t believe me, check out the gif below.  With the help of some excellent people at the University of Iowa’s microscopy core, we were able to visualize the lizard neuromuscular junction using a confocal microscope so sensitive it can distinguish between fluorophores of extremely similar wavelengths: it could tell the difference between two hues of red, for example.  The result, in my opinion, is both intellectually and aesthetically pleasing.


Now, the microscope I was using for these experiments moved the slide around much the way a traditional microscope does: the right hand controls movement in the x and y directions while the left hand controls focus along the z axis.  This begins to feel quite intuitive after repeated use.

The microscope I used this afternoon, though, has no such controls.  You can’t even see the slide.  You simply put the slide on a plate, put the plate in the machine, and close the lid.  The rest of the work – light exposure, positioning, everything – is done by software provided by the microscope’s manufacturer.  This means the centuries-old stereotype of a scientist hunched over his microscope is becoming obsolete.  Instead, in our technological age, a scientist is simply a person at a computer, flicking through tissues and cellular components with casual keystrokes and mouse clicks.


An important paradigm shift as a result of this change is that the people who will now be controlling how we view the microscopic world are computer scientists and software engineers.  This means that our microscopic world is beginning to be manipulated with the infinite possibility of the digital realm.  Already we have seen incredibly detailed computer simulations of the cell and proteins visualized in 3D on your smartphone.  What will the ever-improving world of computer science give to biology next?

I don’t know what industry leaders are planning, but if it were me, I’d want to make the microscopic world fun and accessible.  The mouse clicks and keystrokes I was using to control my collaborator’s microscope this afternoon represent an interface between me, the software, and consequently the microscope.  This interface could be swapped for more accessible, intuitive ones.  The next step would undoubtedly be a touch surface, probably first through a touch desktop machine and perhaps simplifying to a tablet.  And normally I would say that would be cool enough – God knows I would start drooling at the idea of making in-depth images from actual slides through something as simple as a tablet.

But I walked past the Microsoft store the other day, and one of the workers was playing with a Kinect.  The gestures were so simple and intuitive, and I suddenly imagined viewing the microscopic world in such a way.  The Kinect, after all, is simply another interface.  It is within the realm of possibility – given current technology – that I could image cells at resolutions smaller than a micrometer using only hand gestures.  Switch objectives with a wave, change excitation wavelengths by voice command, and take an image with a snap of the fingers: to me, this is far cooler than mouse clicks and keystrokes.
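To make the idea concrete, here is a toy sketch of what such an interface might boil down to.  Nothing in it is a real Kinect SDK or a real microscope API – the class, method names, and gesture labels are all hypothetical – but it shows how thin the layer between a recognized gesture and a microscope command could be.

```python
# Hypothetical sketch: map recognized gestures and voice commands onto
# microscope actions. A real system would receive these events from a
# sensor SDK and call a vendor's scripting interface instead of print().

class Microscope:
    """Stand-in for a microscope's scripting interface (methods are made up)."""

    def switch_objective(self):
        print("Rotating to the next objective")

    def next_excitation_wavelength(self):
        print("Switching excitation wavelength")

    def capture_image(self):
        print("Acquiring image")


scope = Microscope()

# "wave", "voice:wavelength", and "snap" are placeholders for whatever events
# the gesture-recognition layer emits.
gesture_actions = {
    "wave": scope.switch_objective,
    "voice:wavelength": scope.next_excitation_wavelength,
    "snap": scope.capture_image,
}

for event in ["wave", "voice:wavelength", "snap"]:
    gesture_actions[event]()  # dispatch each detected event to the scope
```

The point is not the code; it is that once the microscope is driven by software, the interface in front of that software is entirely up for grabs.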

On the Eloquence of Intelligence

This morning I had the privilege of listening to Dr. Darius Ebrahimi-Fakhari discuss some of his latest work regarding mitophagy in neurons.  He had oodles of cool videos depicting mitochondria trucking back and forth down the axon – similar to the gif below (pulled from Dr. Kittler’s lab at UCL).


The work itself was elegant, with an organized, methodical approach using the newest techniques available.  However, what truly captured my attention was the fluency with which he spoke about his work.  Becoming a scientist is not hard: you just have to look at questions in a certain way.  Working within a specific field IS hard.  You have to become intimately familiar with the relevant signaling pathways, how they operate under different conditions, and, of course, the nonsensical and often multiple names for the proteins within those pathways.

Not only that, but SO much emphasis is placed on knowing who published what, where, and when.  It is not enough to say, “p62 knockdown causes mitochondria to accumulate within the soma of the neuron.”  Rather, as Darius did, you should say, “oh, Alessia in her 2009 paper showed that p62…” and so on.

Now, some of this comes with sheer time.  Once you’ve been working in the same environment talking about the same thing, concepts become ingrained.  But I’ve been a research technician in this lab for over two years and I’m only JUST beginning to be able to name some of the ancillary proteins of the mTOR pathway offhand.

Also, when you and I talk, we use a lot of “filler” sounds: “um,” “uh,” and “like,” to name a few.  These verbal spaces reflect our mental state: we’re still searching for the right words.  However, if you want to sound intelligent (and scientists especially do), you have to remove those filler words entirely.  Not only will this produce a more cogent speech pattern, but it will also give you confidence.  That, in turn, will make the listener more confident that you actually know what the heck you are talking about, which is the crux of scientific discourse.

This is not to say you should sound like a research paper when you’re talking, but rather that the appropriate names for all of those wacky proteins should be at the tip of your tongue, and that you should know EXACTLY what they do and WHO discovered they do it.  Oh, and in what model and in what context would also be helpful.

I know, I know, this all sounds like a headache and a half.  However, if you can master this lexicon of the esoteric, you will sound like the smartest guy alive.