On version control

I was thinking about booting up a PC that I haven’t used since, oh, 2003 or so. I was wondering what would happen. Rather than just finding out (and potentially losing all my files), I asked several of my computer science / hacker friends whether the ‘ancient’ operating system would essentially protect me from all the new-fangled viruses out there. Their answer: an emphatic ‘no.’ Apparently there is enough similarity between operating systems for many of the viruses and worms and malware in general to still take advantage of a machine whose virus protection software is itself now woefully out of date. One of my friends said I’d have to go back to at least DOS 5 to get any protection. So, the PC remains in my closet for the moment.

But this got me to thinking: what about our operating systems and the viruses that attack them? Is the co-evolution of organisms and their pathogens / parasites such that old versions of either are no longer capable of infecting the newer versions, or vice versa? For example, could a really old plague still do harm to the human population? I have speculated wildly on this elsewhere (http://ellingtonlab.org/blog/2011/03/09/on-paleovirology/).

But this raises the interesting question of version control: do new-version viruses infect old-version wetware? And how would you even install the old-version wetware? It’s not like we do a lot of genetic engineering of humans, sigh. It then occurred to me that we are different degrees of ‘old.’ We retain the genes of antiquity to varying degrees, even within a population that has at this point largely equilibrated. This has been most forcefully brought out by the idea that we are all to some degree Neandertals, and that we have different percentages and types of Neandertal genes among us (http://www.livescience.com/42933-humans-carry-20-percent-neanderthal-genes.html). This might be the biggest version control jump of all: could an H. sapiens virus infect a Neandertal, and vice versa? Would there be an inherent genetic barrier, and might we still contain such a barrier?

I went to the literature on a lark, assuming that no one could have published anything relevant to this arcane speculation. However, as is usually the case when I doubt PubMed, that glorious gift of the NCBI to humankind, I was wrong.

“… we analyzed individual sequence reads used to assemble the published Neandertal and Denisovan genomes for insertions of Human Endogenous Retrovirus K (HERV-K) DNA. Virus-host junctions were identified that defined 14 proviruses where modern humans contain the corresponding, empty, preintegration site. Thus, HERV-K reinfected germ lineage cells of Neandertals and Denisovans multiple times, and these events occurred around the time of or subsequent to the divergence of the archaic hominin lineages from that leading to modern humans.” (Agoni et al., Current Biology (2012), 22:R437; http://www.ncbi.nlm.nih.gov/pubmed/22677281).
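The logic of that detection step is simple enough to sketch. Here is my own toy illustration (invented sequences and names, not the Agoni et al. pipeline): at each candidate locus, a read that spans a virus-host junction says the provirus is there, while a read that runs straight across the empty preintegration site says it is not.

```python
# Toy illustration only: classify one sequencing read at one candidate ERV locus.
# All sequences are invented; this shows the idea behind junction calling,
# not the published pipeline.

def classify_read(read, host_left, host_right, ltr_start):
    """Return 'provirus', 'empty_site', or 'uninformative' for a single read."""
    occupied_junction = host_left + ltr_start  # host flank running into the viral LTR
    empty_site = host_left + host_right        # host flanks joined directly, no insertion
    if occupied_junction in read:
        return "provirus"
    if empty_site in read:
        return "empty_site"
    return "uninformative"

# Hypothetical archaic-genome read spanning an occupied junction,
# and a hypothetical modern-human read spanning the same, unoccupied, locus:
archaic_read = "TTACGGACGTTGCA" + "TGTGGGGAAAAGC"
modern_read = "TTACGGACGTTGCA" + "CCAATTGGTCAGT"

for name, read in (("archaic", archaic_read), ("modern", modern_read)):
    print(name, classify_read(read, "ACGTTGCA", "CCAATTGG", "TGTGGGGAAAAGC"))
```

A locus that types as ‘provirus’ in the archaic reads and ‘empty site’ in modern humans is exactly the pattern the quote describes.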

So, cool! Version control is not enough. At least one retrovirus was hopping and bopping around the genome before and after the divergence.

Or maybe it is, sometimes?

“… one provirus (HERV-K-Ne1 = HERV-K-De6) was detected in both Neandertals and Denisovans and not in modern humans ….”

Or maybe we just don’t know enough, as a follow-up paper by Marchi et al. (Current Biology (2013), 23:R994; http://www.ncbi.nlm.nih.gov/pubmed/24262833) suggests:

“… while searching many new genome sequences of modern humans for ERVs [endogenous retroviruses], we have found most of these loci. For example, of the eight Denisovan loci for which Agoni et al. were able to give precise genome coordinates, at least seven exist in modern humans.”

The ultimate explanation for these discrepancies may be that ERVs are far from fixed in the human population, and rise and fall due to population dynamics (“… many loci will have persisted at fluctuating frequencies in all three lineages [human, Neandertal, Denisovan].”).
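That ‘fluctuating frequencies’ explanation is just genetic drift, and drift alone can produce presence/absence discrepancies between single sampled genomes. A minimal Wright-Fisher sketch (toy parameters of my own choosing, not fitted to any of these papers):

```python
import random

def drift(start_freq=0.3, pop_size=1000, generations=500, seed=0):
    """Neutral Wright-Fisher drift of an ERV insertion allele.

    Each generation the new allele frequency is a binomial draw of 2N
    chromosomes around the old frequency. Toy model, arbitrary numbers."""
    random.seed(seed)
    p = start_freq
    for _ in range(generations):
        carriers = sum(random.random() < p for _ in range(2 * pop_size))
        p = carriers / (2 * pop_size)
        if p in (0.0, 1.0):  # insertion lost or fixed in this lineage
            break
    return p

# Three lineages starting from the same polymorphic insertion: sampling one
# genome from each can easily give present-in-two, absent-in-one, by chance alone.
for lineage, seed in (("modern human", 1), ("Neandertal", 2), ("Denisovan", 3)):
    print(f"{lineage}: final allele frequency {drift(seed=seed):.2f}")
```

Run it with a few different seeds and the same locus flips between ‘present’ and ‘absent’ across lineages, which is all the wiggle room those conflicting tallies need.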

So, ancient viruses are still likely infecting modern hosts, although this does not suggest that *an* actually ancient (reconstructed) virus *would* of necessity infect a modern host.

What about the converse? Could an old human be infected by a modern virus? This is a bit more difficult to get at, but once again there is a marvel in the literature that suggests that “… Most simian immunodeficiency viruses (SIVs), including the direct precursor of HIV, use their Nef protein to antagonize BST2 of their respective host species. Human BST2, however, contains a five amino acid deletion in its cytoplasmic domain that confers resistance to Nef …. Here, we show that this protective deletion has already been present in Neandertal and Denisovan BST2 …. This ancient origin helps to explain why effectively spreading zoonotic transmissions of SIVs to humans have been rare, although SIVs are widespread in African non-human primates and humans must have been exposed to these viruses many times.” (Sauter et al., Human Mutation (2011), 32:1243; http://www.ncbi.nlm.nih.gov/pubmed/21796732).

This is impressive. With a tip o’ the hat to some ancient hominid that survived a sweep (or was just bottlenecked otherwise), we get to live free of HIV-1 from 800,000 years ago to roughly 1959. This is all the more remarkable to me because of my great respect for viral evolution. Viruses create quasispecies that are constantly testing our defenses; viruses evolve way, way faster than we do. And yet we are able, for the most part, to hold them at bay, with some rare and intermittent exceptions (http://ellingtonlab.org/blog/2010/10/05/on-predicting-evolution/). This speaks to the power of complex systems. We may evolve slowly, but we have multiple layers of defenses that can be subtly altered to place us out of reach of the sleek-but-limited pathogens that are competing for our ATP. It’s like we have deep combination locks and viruses can only count to 10, quickly, over and over again. Sara Sawyer taught me that, but I don’t listen to her enough.

Still, as we continue to be challenged by the geovirome (I just made that word up … I think … ha, yes, even Google does not know this word) it might be useful to use the Neandertal / Denisovan portions of our genomes as convenient references for the relative susceptibility of individuals and populations. Just in case the viruses, like Spinal Tap, learn to count to 11.

 

- originally posted on Tuesday, September 30th, 2014

On conscience

I cannot say anything here other than to distribute the thoughts of my friend Dan Tawfik. Those of you close to the pointy end of the stick always have harder decisions to make than those of us who have the privilege and luxury of enjoying peace. I would only hope that I would have the fortitude that Dan does.

The following appeared in Haaretz:

“I am an Israeli, a Jew, an Arab, but first and foremost a Human Being

In these troubled times, we, Israelis, are expected to all unite, to choose one, clear-cut identity. So I set out to find out: who am I? First, my family, those I love, the desert landscape. Second, I’m a scientist fascinated by the molecular complexity of nature. Together with colleagues from many nationalities and religions (including Arabs), I explore this complexity. Third, I love mountains, music (including Umm Kulthum), literature and food (especially hummus). I’m Jewish by religious origin and cultural heritage, Israeli by nationality, and an Arab by ethnic origin and name, Saleh Dan Taufik, a spelling that clearly “exposes” this. I have no intention whatsoever of giving up any of these elements, or of being put in any compartment, or category, let alone one originating from prejudice, racist views, blindly following fanatical preaching and political populism. I have a dialogue, affection, and a lot in common with many Arabs, but not with all Jews, and certainly not with those trying to impose opinions and beliefs which I do not share.

I detest the illusion of “unity”, as its lowest common denominator is hatred of Arabs in general and Israeli Arabs in particular. I denounce racism and the false concept of putting all Arabs in one basket, anyone from Muhammad Deif (head of the Hamas’ military wing) to a hummus maker from Ramleh. This racism primarily serves the goals of Hamas. In the same vein, I denounce the description of all IDF soldiers as war criminals (including me, since I served in the IDF as an officer (Major Ret.), and in Gaza and the Occupied Territories). If this is unity, I proudly count myself out.

I am aware that I might receive insults and threats. Nevertheless, I absolutely refuse to hate, refuse to generalize, refuse not to lament the death of children, even if they lived next to a rocket launcher and their parents voted for Hamas – I lament their death along with the tears for our dead. I refuse to see any Muslim or Arab as my enemy, because, amongst my other identities, I am an Arab. The day when many more, here and there, will think as I do, will be a better day for us all.

Professor Dan, Salah Tawfik”

 

- originally posted on Wednesday, August 6th, 2014

On tenure and labor

If you’re an Old, like me (and I know I’m an Old, because I think to myself “I’m not that old”), then you may remember that ’70s classic “Future Shock,” by Alvin Toffler. Basically, this book describes the psychological shock of culture change. Whether this is a real thing or not is open to debate, given all the Tweeting grannies out there. However, it is true that we have been undergoing a tremendous cultural change in academia in recent years. A brief synopsis might look like:

The American research experience is ovah. Between sequestration, debt ceiling fights, congressional inaction, and just a general malaise in the system, we’re working hard to destroy a generation’s worth of scientific infrastructure in our Universities in just a few short years. Junior faculty are getting annihilated. Senior faculty are downsizing their laboratories as quickly as possible, and then going deadwood. I note in passing that while this is happening in a very general way, there are winnahs as well as losahs: there is a new, amazing ‘superclass’ of scientists, who live in a syncytium between academia, industry, and government. I cannot but envy George Church and Jay Keasling, good on ya, I mean that. But the fact is that I’m just a wildcatter to their oil baron, and that will probably always be the case.

This American teaching experience is … different. For years we stood in front of classes and lectured, whatever that means. For my part, I’ve been doing Socratic instruction for over twenty years now, so this whole “inside out” or “flipped” classroom stuff is nothing new to me. It’s almost like my Flock of Seagulls ‘do has come back into style (I’m kidding, of course, I had a tinted rat tail, for God’s sake). While it is a good thing that folks are being encouraged not to just read from the textbook, the fact that we now have high end reading from the textbook in the form of online or virtual instruction is not necessarily a good thing (and, yes, MOOCs, online mentoring and chat, yada, yada … it’s great for the same 1% of the student base that got it when you were just reading from the textbook). It is, however, a very cheap thing.

Which really leads us to our thesis: I keep looking in the mirror and seeing coal smudges under my eyes. A tenured professor, the very height of privilege, is the last person who should think of themselves as Labor. Nonetheless, the professoriate bears more resemblance to a Wobbly Shop than might first be grasped: we’re not wage slaves, and we used to practice self-management, for the most part. It’s this latter point that is particularly sharp: it’s not so much that I’m Labor, as I sit here swilling my martini (OK, cider), it’s that I definitely recognize Management when I see it.

I think that one of the largest differences between academia as it used to be when we all hung onions on our belts, and academia as it is today, is that there is a vast, gaping divide between Labor and Management. There was a time when Department Chairs and above had themselves been professors, and had a feel for the faculty. That time is long past. There are now career graspers who move fluidly upwards and discard their faculty credentials like a second skin as they go (biased much?). They don’t really teach; they don’t really do research: they administer, for the greater glory of us all. And, yes, there are many counterexamples: I can find my peer and former Dean David Laude in a classroom many days of the week, doing his thing. Still, exception, rule, and all that.

And in consequence, they begin to see us perhaps for what we are: not their professional peers engaged in an equity enterprise not unlike law or medicine, where our contributions should return value directly to us, but paid-for laborers who can be called upon at any time to do any task that needs doing. Now, the great thing about tenure is … we still don’t have to do it. We’ll get to that in a minute. But I think the disturbing part is that this Management divide has somehow gained traction over the years, relative to the notion that we’re carrying a shared burden. Less money in the system? Teach more. Major rearrangements in the structure of your institution? Not your problem, you’re not really part of the system. Group initiatives? That’s nice, but we know better what you should be doing. I could give examples, but what would be the point? If you’re reading this and you’re Labor, you already know what I’m talking about. If you’re reading this and you’re Management, you’re already coming up with rationales in your mind for why it can’t possibly be true. If you’re reading this and you’re not faculty, you just want to punch me for being such a privileged asshole.

But then there’s that pesky tenure, that wonderful, awful ability to say “No.” I exercise that ability quite often (just as often as I use the same superpower to take on 80-hour work weeks to launch new, unfunded projects or initiatives). With research on the skids, and virtual instruction supposedly providing all the education you’ll ever need off the back of a cereal box, we stand revealed as: overpriced educational labor. If you can’t really force us to teach more (and, mostly, you can’t, not by nearly the amount we’d need to teach to actually ‘rightsize’ the system, especially here in Rick Perry’s $10K degree Texas), then you have to … get rid of us.

What Management really is, is a bunch of former faculty dedicated to the elimination of current faculty. It doesn’t really matter that they’re just trying to balance the books. By replacing faculty with instructors and lecturers and online lesson plans and as much credit as can possibly be loaded into college prep courses (that will ultimately doom your average Freshman who first encounters a real college course), they are in the end letting us sail gracefully into the night. The point of modern University administrators, of Management, is to remake the University so that it is no longer a University.

And that will be very interesting indeed. The Wobblies are gone, long live the Wobblies. I would rouse our Rabelais to take back our shop, but it’s late and I have a lecture to not write (because I don’t lecture, remember?).

 

- originally posted on Thursday, December 5th, 2013

On controversy

One of the great things about science is that it is not set in stone. We always say that scientists revise their thoughts based on new evidence, but there aren’t that many times when we actually see this happen. For example, while we can play around the edges of the Central Dogma (at least the dogmatic version of it, that DNA makes RNA makes protein), it is unlikely that anything is going to come along to dispel this foundation of modern biology.

This is why it has been fun to be embroiled in an actual scientific controversy. My colleagues in the Marcotte lab have been studying an interesting phenomenon for the last few years, the observation that when a variety of fluorescent protein fusions are expressed in yeast, they tend to aggregate into punctate bodies under starvation conditions [http://www.ncbi.nlm.nih.gov/pubmed/19502427; http://www.ncbi.nlm.nih.gov/pubmed/23057741; http://www.ncbi.nlm.nih.gov/pubmed/23405267]. It was (and is) to some extent unclear what these punctate bodies are: functional or non-functional, happenstance or programmed, due to the fluorescent protein’s potential for aggregation or inherent to many different fusion partners. Over time, students have continued to piece apart the observation, and the current thinking is that at least some of the aggregation is likely unintentional, due at least in part to the propensity of the fluorescent protein itself to aggregate. This does not make the phenomenon less interesting, as there clearly are many aggregates that form in the absence of the fluorescent protein. But what has been interesting for me to observe is the sociology of science at work, as hypotheses are put forward and tested, and various arguments are crafted in narratives that do or don’t stand.

You see, that’s the very tricky part of this, the tricky part of most science. While we are dispassionate investigators going where the data may lead us, we are also storytellers who believe in our stories. Both can be true at once, we can hold both ideas in our heads, as long as we are honest with ourselves about both ideas. Data should not be twisted to fit a story, nor should a good story be overlooked because the data are incomplete. Narratives are to creative science what a modus operandi or profile is to someone in criminal justice; they tie the facts together and guide insights into how to look for more facts.

But beyond watching my peers down the hall work this puzzle, it has also been fascinating to see another story emerge, from half a continent away. Steve Benkovic and his lab have developed their own narrative about the purinosome, an aggregate of purine biosynthetic enzymes that may come together for functional reasons in cells, streamlining the passage of intermediates during purine nucleotide biosynthesis [http://www.ncbi.nlm.nih.gov/pubmed/18388293]. It is a very compelling story, and is backed by a variety of data. But it also overlaps with the story that Marcotte and others have been trying to tell, as the punctates that we have observed overlap with the ‘some that Benkovic has been substantiating.

So, is it an aggregate or a functional body? A happenstance set of interactions or a finely crafted evolutionary machine where the parts come together to channel substrates? The dissection of such questions has taken up many pages in many journals, and it would be foolish of me to reiterate the arguments in miniature here. Rather, I want to point out how very cool it is that now you can see these arguments play out in real time, you can watch scientific culture at work, doing what it does best: search for truth.

The most recent fare from the Marcotte lab was published in PLoS ONE (a journal that you can download and look at for free), and to a first approximation said that “Transiently Transfected Purine Biosynthetic Enzymes Form Stress Bodies” (the title of the article; http://www.ncbi.nlm.nih.gov/pubmed/23405267), to which Benkovic and company replied “Zhao et al. do not report on purinosomes” [http://www.plosone.org/attachments/pone.0056203.comment1.pdf]. Fun, fun! The Marcotte clan has responded to this, with many Figures and muchly logic, and the reader can decide for themselves what the status of the science is [http://www.plosone.org/attachments/pone.0056203.comment2.pdf].

And that’s the point. That’s the wonderful, amazing, exciting point. There it is, right there. Give and take. Interplay. Thought and counterthought. All in public view. All with scientific precision. This is how science should be done. This is what we dedicate our lives to, this search for truth. Not glory, not money, not even necessarily impact, but the truth. I am humbled by my collaborators and even my adversaries, as the point and counterpoint continues.

And you can make fun of my sentimentality, but even as you do so, go and look, and see what the real world of science is like.

 

- originally posted on Tuesday, April 9th, 2013

On dna Nanotechnology and DNA nanotechnology

It’s all about what syllable you put the accent on. I’m just coming back from the AAAS meeting, where we had a most awesome Symposium on Nucleic Acid Nanotechnology. I sort of meant this both as a swan song for my tenure as President of the ISNSCE (go, Hao!) and as a coming-out cotillion for DNA nanotechnology. The world needs to know what is happening in this amazing field.

Right before the Symposium (which was attended by … dozens!) I got to hear Chad Mirkin from Northwestern speak. As always, Chad’s talk was awesome. He is a genius both at coming up with new approaches and applications and at marketing them. For example, we learned that spherical nucleic acids (SNAs) are the logical next step in the progression from linear nucleic acid through circular nucleic acid. And we know how long it took to make the wheel with DNA, so SNAs are really taking us out of the molecular biology Stone Age. Once Chad discovers kinetics we’ll likely have a *fourth dimension* of nucleic acid structure, sort of a time tesseract of DNA. Maybe a time-traveling cubic lattice, who knows?

Anyway, Chad’s work, as impressive as it is, is dna Nanotechnology. The accent is on the Nano, not the DNA. As it should be: there are lots of fun things you can do with nucleic acids as smart glues. But for me this just puts into sharper relief the DNA nanotechnology community, in which the accent is on the DNA.

As speakers, we had Ned Seeman, who as always gave a great historical and contemporary perspective on the power of using nucleic acids as building materials. Ned continues to emphasize that this is how we control the structure of matter from the atomic level upwards, and he continues to be right. He presented some exciting results on determining the structures of nanocrystals, and may be on the verge of realizing the dream he started in the early 1980s, using DNA as a scaffold for solving molecular structures. He was followed by William Shih, who showed an actual example of using DNA as a … well, not a scaffold, but an organizer … for solving an NMR structure. William also did his 3D Origami thang, but took it into new dimensions, so to speak, by examining the impact of DNA shape on cellular uptake. Finally, Erik Winfree made a strong case for embedded molecular programming, with DNA as the avatar for a world where we have not just Bluetooth-enabled toothbrushes (wow, Erik, I didn’t believe you: http://www.geekosystem.com/bluetooth-toothbrush/), but toothbrushes that may directly interact with the bacteria they find, via molecular programming.

And this leads me to our other speaker, Greg Heath, from Illumina, who actually kicked off the proceedings. The NextGen revolution continues to consume us all, and I think it was modestly incongruous for folks to see this amazing commercial enterprise, which is impacting human health in a very immediate way, juxtaposed with DNA nanotechnology. Ned leaned over to me to whisper, during the lively Q&A that followed Greg’s talk, “What the heck am I doing here?” Except, of course, it was Ned, and he didn’t say “heck.”

But to me, that was the point, that juxtaposition. In the frame tale that I tried to loosely wrap around these proceedings, it is the fact that we now know the biological world with single molecule, digital resolution that makes DNA nanotechnology ever-more-important. It is the possibility that the knowledge of every goddamn molecule in you will not just translate into electrons and information, but into actuation via molecular programming. Ned’s and William’s and Erik’s dreams really are the dreams of Wally Gilbert and Frederick Sanger. When there are nanotheranostic systems, they will undoubtedly be actuated in part by sequence, via the structural and programming tools these researchers have given to us.

One problem, of course, is that unless you’re working inside a cell with a ready supply of DNA, or have some other replicating nanotech, DNA is not the world’s best (and certainly not the world’s cheapest) material. This is where Chad’s dna Nanotechnology gets it right: we have to use the adhesive, self-assembly, programming, and sequence recognition properties of nucleotides, but we have to do it with something that isn’t so much nucleotides. As I have said elsewhere, we are primed for a next generation of materials based on nucleotides, where the backbones are … something different. Maybe peptides, maybe vinyl (http://www.ncbi.nlm.nih.gov/pubmed/22614385), maybe some other polymer. But definitely more rugged and more readily synthesized in bulk. As they should have said in The Graduate:

Older fart: I just want to say one word to you. Just one word.
Young Turk: Yes, sir.
Older fart: Are you listening?
Young Turk: Yes, I am.
Older fart: Replicating Plastics.
Young Turk: Exactly how do you mean?

Ah, yes, exactly how.

 

- originally posted on Sunday, February 24th, 2013

On Identity

Who are you? Who-who, who-who?

This question, by the eponymous band, has been on my mind ever since I heard about the recent study where electronic clues allowed genetic identities to be deciphered (http://www.wired.com/wiredscience/2013/01/your-genome-could-reveal-your-identity/). Since the sequences of Y chromosomes turn out to be correlated with male surnames, this provides enough of a boost, when combined with other data, to pinpoint at least some genetic identities. Women are of course far more tricky, sometimes hiding their X chromosomes in tractable males for a generation, then taking them back. One presumes that there will now be a shift in gender representation amongst CIA field officers.
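The arithmetic behind that ‘boost’ is worth sketching. Roughly (and every number below is my own round assumption for illustration, not a figure from the study): a surname guessed from Y-chromosome markers, plus an age and a state gleaned from study metadata, intersects a very large population down to a handful of people.

```python
# Back-of-envelope re-identification arithmetic. Every number here is an
# assumption of mine for illustration, not data from the study.

us_males = 150_000_000        # assumed number of US males
surname_fraction = 1 / 5_000  # assume a moderately common inferred surname
birth_year_fraction = 1 / 80  # assume ~80 plausible birth years
state_fraction = 1 / 50       # assume the state of residence is known

candidates = us_males * surname_fraction * birth_year_fraction * state_fraction
print(f"roughly {candidates:.0f} candidates remain")  # on the order of ten people
```

From there, ordinary public records can close the rest of the gap, which is why the ‘anonymous’ genome was never really anonymous.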

Anyway, this also raised the whole question of what our identities are, genetic or otherwise. I know this is not exactly a new question (although it may lead to a new college major for parents to question: genetic philosophy). And it is probably a question I can better appreciate after a conversation with my friend Suzanne Barber, who started the Center for Identity here at the University of Texas at Austin. This is a really nifty place that I always think of as challenging that classic Internet cartoon / meme: “On the Internet, nobody knows you’re a dog.” Good grief, this thing has its own Wikipedia page: http://en.wikipedia.org/wiki/On_the_Internet,_nobody_knows_you%27re_a_dog. I always thought Suzanne missed an opportunity by not having that as her logo. Oh well. Anyway, the Center for Identity is very cool, in that they do take on the knotty question of whether and how to trust electronic identity. As we have developed electronic doppelgangers of our real selves, the question of how those doppelgangers can be used and misused has become increasingly important.

But now our electronic worlds begin to collide even more fiercely with our biological ones. Now someone can ‘steal’ not just your electronic identity, but also your biological identity, à la the ‘borrowed ladders’ of GATTACA. What happens when, from the point of view of the rest of the world, you are as closely tied to your DNA sequence as you are to that mole on your chin (yes, I have a mole on my chin)? And yet … you’re not tied to that sequence. Unlike your mole, that sequence is now floating in the aether, available for the taking. Of course, that’s true of the mole as well, given the wonders of biometrics and / or Photoshop. But still, you get the point. Will we one day arrive at the operating room to find that someone else had our hysterectomy first? Again, this would also be possible via a nurse having a particularly bad day, or via hacking someone’s medical records, with or without DNA.

And as I continued to muse on the possibilities, I realized that it didn’t really matter. And that was what mattered. You see, the real lesson from the study that anonymous genetic identities could somehow be coupled to our real ones is that … genetic identity doesn’t matter. And the reason it doesn’t matter is because our real identities … don’t matter much anymore.

This obviously led me to ask the question: what is the difference between a bar tab and a credit card? This is not the start of a particularly bad and politically incorrect joke, although it should be. Rather, it is the question that kept coming back to me as I considered what the role of my genetic identity in … me … was. Let’s consider the question in a little more detail. Irrespective of whether I’m drinking on a tab or charging something that I can’t afford, I’m incurring debt. That debt can be settled, it can be paid by others, it can be forgiven … but it is a debt attached to me. My debt. Except … not really. On the one hand, the debt appends to the person who walked in the door with a mole on their chin and a history of not being able to hold their Guinness. On the other hand, the debt appends to … the holder of the debt. As far as the greater electronic world is concerned, the credit card debt defines me, not the other way around. When I go to buy yet another thing I can’t afford, and they look up my credit history, they don’t particularly give a damn if I’ve got a mole on my chin, or not. And such is of course the case for most of my financial existence, which is increasingly an electronic financial existence.

Somewhere along the line, and I know this will come as a surprise to no one except those of us like me who are particularly dense about such things because we didn’t take genetic philosophy as a gut when we could have, because the Russian teacher was way cuter … we lost our identities to the Machine. Cue Pink Floyd as necessary, or whatever modern day equivalent of tiresome teenage angst exists. But now the stakes are higher, our genetic identities are also about to be gobbled up. The problem is not that our genomes could be identified or stolen. The problem is that our genetic identities are now just another piece of us that isn’t us anymore, it’s a classifier for the person-formerly-known-by-the-mole-on-their-chin. Increasingly, we are classified, not identified. As consumers, as taxpayers, and now as meatpuppet healthcare objects we are the sum of our enumerated parts.

I should probably Go Galt, and get my tricornered hat, and rant on streetcorners about how they’re coming to take my guns, like the rest of my Texas counterparts. I’ve actually always thought there was something to that meme. Not the rock-craziness of it, but its motivation. Why were these people so upset? Well, you Eastern effetes got it right: we identify strongly with our guns and our trucks and our gunracks in our trucks. We are those things. Those are at least like my mole; they’re real. So ha-ha: you may laugh at our identities … but at least we have a residual identity. What do you have?

Yes, the Teapartier is the last, futile cry of a dying breed of American. And, to a first approximation, good riddance. Because the electronic identities we created threw our economy into hyperdrive, and the genetic identities that are coming may be the deliverance of healthcare. When the actuarial tables more finely sift through my propensity for cancer, and equalize that risk across the larger population, my hope is that I get better, not worse, healthcare. When clinical trials are virtualized and new drugs arise that are more closely matched to my allelic makeup, then hopefully I’ll live a longer and fuller life (with my gun and my truck and my gunrack in my truck). The Machine only takes our identities if we let it. The biggest threat to our existence is not the fact that we can sequence genomes at the drop of a hat. It is, and will remain: Citizens United (and its bastard offspring, Western Tradition Partnership). Or, if you happen to live in another country (and why would you want to do that?): offshoring.

 

- originally posted on Saturday, February 2nd, 2013

On apologetics

Well, the University of Texas iGEM team just returned from its first appearance in several years. They did great, and I’m mostly proud of them. They worked on engineering a micro-organism that could eat caffeine. Unfortunately, they somehow became mired in controversy, even though I wasn’t there. It seems that they had the unfortunate gall to suggest to their betters that things might be done differently. As we all know, education in general, and the iGEM in particular, are not about student input as much as they are about student obedience and obsequiousness. Students should understand that the process of entering a field, such as synthetic biology, is really more about how you present yourself to established members of the community, not about your own thoughts or aspirations. I mean, seriously, what are we teaching students if we teach them to think for themselves?!? Chaos! Madness! Students might actually begin to understand that they have the power to change the world themselves, rather than operate through comfortable systems of privilege until they can, through sufficient bowing and scraping, reach the point where they are the ones that are determining who is and who is not a synthetic biologist, and who is or is not making a true advance. Didn’t we already learn this with the great English public school system? Haven’t we made it abundantly clear through avatars like Bill Gates, Steve Jobs, and Michael Dell that it is only by conforming to the rules that real change can be accomplished?

In any event, while I am generally proud of the team, I did think it was appropriate to make a public apology for their rude behavior. The contretemps started when they foolishly pointed out to the group as a whole that perhaps Biobricks were not the most useful method for the generation of synthetic genetic circuits, especially when cheap DNA synthesis and Gibson assembly were so readily available. Now, obviously, they didn’t know their place! The great advantages of Biobricks in the synthesis of the Mycoplasma genome, for example, are well-known, and can be further attested by the large numbers of publications that are brought up when the term is searched in PubMed. Eight! Eight publications, by God! Oh, wait, two of those are Keasling’s BglBricks, and one looks like some manner of review …. Well, anyway, Biobricks rule! Especially compared to Gibson assembly, which just because it’s more widely used and vastly easier doesn’t make it *better.*

And so, a bunch of punk-ass students had the temerity to suggest that perhaps an international competition and showcase, which they stupidly thought was about science (hahahahaha), should perhaps use better scientific tools, or at least have input on such scientific tools. Such insolence had to be put down, and fortunately the judges were right there to do so. Not by refusing to advance the team to the Finals (there were some pretty kick-ass and awesome projects ahead of caffeine-eating bugs), but by using the formal and approved network of scientific review: Tweets (like Blogs, except shorter!). Karmella Haynes from ASU wrote the following after hearing from our snot-nosed brats:

That’s telling them! God forbid that students should think their opinions matter, that they can be scientists without having received the imprimatur of the synthetic biology community via the use of Biobricks and the favor of their superiors, the judges.

Now, these kids, they’re such crazy idealists. If you’re not a liberal when you’re young, you don’t have a heart, and if you’re not a conservative when you’re older, you don’t have a shotgun, amiright? They actually had a slide (which mercifully the judges did not ask to see) in which they thought they rationally set out some reasons for a different standard. It included, in part:

“…unfortunately it was rejected due to multiple illegal restriction sites. Our assembled operon came from a 13 kb segment, a size that will almost invariably contain illegal sites. In smaller constructs, it may be viable to change these sites, but as iGEM projects become more ambitious in size and scope, this becomes less and less feasible. And now, since we have Gibson assembly and other next-gen methods, the traditional assembly standards have become unnecessary, and mutating away restriction sites has become a waste of time, and we have been thinking about ways to improve it. Is it time for a new standard?”
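And the worst part is, the kids’ arithmetic holds up. A back-of-envelope sketch (my own assumptions: a random, 50% GC, 13 kb sequence and four forbidden 6-bp sites, roughly the classic BioBrick set; the exact forbidden list depends on the assembly standard):

```python
import math

# Expected number of forbidden 6-bp restriction sites in a random 13 kb insert,
# and the chance of having none at all. Assumes a uniform random sequence and
# four forbidden 6-mers; a rough estimate, not an exact calculation.

length = 13_000
forbidden_sites = 4
p_per_position = forbidden_sites / 4**6          # chance a position starts a forbidden site
expected_sites = (length - 5) * p_per_position   # roughly 13 expected hits
p_clean = math.exp(-expected_sites)              # Poisson approximation for zero hits

print(f"expected forbidden sites in 13 kb: {expected_sites:.1f}")
print(f"chance of a site-free 13 kb segment: {p_clean:.1e}")
```

So ‘almost invariably’ is, if anything, an understatement.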

Madness! And totally unnecessary, as it turns out. In communications with Randy Rettberg, we learned that “We [clearly the royal we] have been aware of Gibson assembly for several years.” Oh, good! Whew! The Biobricks universe and its eight publications can now give its imprimatur to that upstart who helped build an entire fracking genome (Battlestar Galactica has saved me approximately $765 / annum in the swear jar).

Now, cooler heads will undoubtedly say that it’s perfectly appropriate to have standards for an undergraduate competition, and that it levels the playing field for folks from other countries, like, I dunno, Slovenia (what’s that? Slovenia is a powerhouse? And regularly kicks ass at the competition? Dammit, when did it happen that the United States isn’t the best at everything?!? Well, at least we invented Biobricks!). That makes complete sense; otherwise it would be like trying to have young scientists do actual science, rather than an artificially constrained semblance of science.

Anyway, it just seemed that it was appropriate for me to pen this apology, not on behalf of the team, which generally stays out of my way if at all possible (I do keep wondering why so many people avoid me, hmmm), but on behalf of Texas. It’s all that frontier-mentality, bigger-and-better, secede-from-the-Union crap that we listen to down here. I know, I know, you’re ready to be rid of us, anyway, and if we’re really, really lucky and humble and stuff, people will be nice enough to let us stick around, both in the United States and at the iGEM competition.

Nawwww. We’re Texas. Get used to it.


- originally posted on Friday, November 9th, 2012

On robots and communism

While the kids were playing in the surf, I was slathered up with SPF 500 sunscreen, hunkered down with my beach reading: Robopocalypse, by Daniel H. Wilson (Robocalypse, Daniel! It should have been Robocalypse!). It’s a pretty good read, in the spirit of World War Z and other pseudo-documentary accounts of worlds yet to come. Of course, this raises the question of when, in the spirit of Alien versus Predator, we get the World War Z / Robopocalypse crossover three-way (in the immortal words of Pepper Brooks, “Usually you pay double for that kind of action ….”).

But I do digress. That’s what I do.

Anyway, Robopocalypse (Robocalypse!) is like Terminator minus nukes plus Emancipation Proclamation. I give it one-and-a-half extensible appendages up.

And then I open this morning’s paper, and semi-buried on page 4 is the heart-warming story “Robots take over factory jobs.” It is mostly content-free, but has the following memorable line: “At a sister factory [to a human assembly line] in the Dutch countryside, 128 robot arms do the same work with yoga-like flexibility. One robot arm endlessly forms three perfect bends in two connector wires and slips them into holes almost too small for the eye to see.”

Now, despite what you techies may think, I have not been living in a bio-inspired cave. I do know that robots are pretty kickass. Actually, I just got back from DNA18, the annual conference dedicated to DNA computing and nanotechnology, and I heard a wonderful talk from Radhika Nagpal of Harvard and the Wyss Institute, on swarm robotics. They are getting very good, these distributed amorphous systems, and you can totally imagine them on the urban battlefield as scouts or more.

In fact, for a long time I have known robots are getting so good that most of the research in my lab has been geared towards Judgment Day, the day when robots take over. Ha, ha, just kidding; we know that already happened. What? You doubt me? Go review the Citizens United ruling and then let’s talk. In what way is a corporate entity with large resources at its disposal manipulating human political processes not the economic equivalent of Robopocalypse? Yes, yes, companies have human Boards and whatnot, but they are much more than the sum of their elite human directors. Capital finally has the ability to direct its own evolution, and it should be interesting to see where that all goes. Indeed, several years back I tried to incite student programmers to build me the equivalent of E-Trade that could press its own button, as a sort of distributed economic version of BitTorrent (or whatever generation of piracy rules the waves these days). Nothing came of that particular version of Code Salad, which is as it should be when a biologist looks into the Internet abyss. But that’s what high-frequency trading is for; amateurs need not apply.

Which brings us, veering as is my wont, to communism. Not a fan, as you know, but it has some riffs you can dance to. What happens to “From each according to their ability, to each according to their need” (replacing male-centered pronoun) when very few of us have an ability worth squat? Better futurists than I have already been over the “humans in the dustbin of history” trope. But we’re living it, in real-time, and we should have a better response than squatting in parks and shining “We are the 99%” on skyscrapers. Knowledge is not power. Knowledge is the means to acquire power. Power is the process of using knowledge for a purpose, good, bad, or indifferent. We do not need to make ourselves into human robots, which really did seem to be the ultimate goal of communism; we need to make ourselves into *better* humans so that we have a fighting chance against the robots.

With that in mind, I continue to wonder where the augments are. Where are the factory training programs for Waldoes (boy, does that date me)? Where are the personal exoskeletons? Why don’t I have my own drones to check out the traffic situation on Mopac? There is not really a personal consumer market in human augments. And that is because our economic system has decided we are cogs in the machine, and replaceable cogs at that. The system is beginning to direct its own evolution, and we have not bought in in an intimate enough way. We have the amazing, amazing box on our desk that lets us multiply our productivity to a degree that would make an overachieving ant blush, but that box, even with all its Facebooks and Twitters and Pinterests, was really designed just to let us get inside the machine, not be part of it, especially now that there is a self-sustaining ‘it.’ The whack-job Tea Party types vaguely perceive that we are losing our control over the real world, and respond with Gunz and Truck Nutz and vociferous repetitions of Jeebus in the public square. Good luck with that; tell me how throwing those wooden shoes is working out for you.

We need augments. We need human:electronic interfaces. We need to be stronger, faster, smarter. We need to rebuild ourselves, and for way, way less than $6 million a pop. Only then can we rebuild our economic and political systems in some way that favors us, because otherwise we are the dinosaurs in this equation. Against generations of advanced robotics that are coming online with remarkable rapidity we have … education. Seriously? That’s our response? The three Rs versus Big Rob? As I am fond of saying: wug.

We need an epiphany, and in many ways that epiphany is … DNA nanotechnology. The field I am fond of criticizing for its utter uselessness is in fact the route to a future that involves us.

Well, that’s it for today. Back to work.

What? If I told you everything all at once, then why would you ever read this?

 

- originally posted on Monday, August 20th, 2012

On vampirism

We all know what vampires are. Some of you may even think that such things are real. Well, they are. Not in the sense of actually having the dude with the teeth show up and ask to be invited in, but in the more insidious sense of having one take your soul. This, to me, is the really scary part of the fantasy: that your mind, your will, your identity could be devoured by a process that is sometimes represented as magical and sometimes (especially in the more modern era) as viral. That said, the supposedly more realistic biological transformations often move with ludicrous rapidity, à la the gory scene in Daybreakers where the vampire police goons turn on one another as they are progressively cured of the virus.

But in musing about the fantasy, one can’t help but feel that there is an atavistic, visceral recognition of reality that should be respected. Because we can be taken over, mind and body, by viruses. The most obvious example is rabies, which slowly climbs the neurons into your brain, and then brings on a variety of behavioral changes, the most obvious of which are hypersensitivity and hyperaggression (it is not for nothing that ‘rabies’ is the Latin word for ‘madness’). Think Cujo. Or, for a human version, Hap and Leonard encounter a rabies victim and foe in Lansdale’s brilliant “Bad Chili.”

Actually, now that I think about it, it’s a bit surprising that there aren’t more viruses that move their victims to do things they wouldn’t normally do, whether it’s open the window with someone floating outside (never open the window!) or irrationally bite anything that moves.

Or maybe there are.

My good friend Chris Sullivan, consummate virologist, loves to keep track of the idiosyncrasies of viruses. They are his friends, each one a personality to be savored. And so it was Chris who turned me on to the ‘zombie virus’ that takes over the brains of its caterpillar hosts (yes, I know, this is about vampires, but zombies are vampires with brains, until they don’t have brains, if you see what I mean; http://news.nationalgeographic.com/news/2011/09/110908-zombie-virus-caterpillars-science-weird-animals/). Once infected by a particular baculovirus, gypsy moth caterpillars climb high into trees, disintegrate as the virus eats them from the inside out, and then all the viruses ooze out of the remaining caterpillar muck and fall on their unsuspecting victims in the foliage below. The caterpillar’s brain was driven to this action, much as Cujo was driven to biting and Dracula was driven to overly dramatic entrances, by a viral gene known as egt.

Now, this is very cool: a single gene, that can promote a single action (climb tree, don’t stop until you reach the top). Who would have thought that behavior could be so … programmed. Well, biologists, really. Or at least the ones that know about behavior, instead of molecules (molecules are my friends). A really, really big snail, the Cone Snail, has developed a harpoon that it shoots into the much faster fish around it, paralyzing them and allowing them to be slowly devoured. The toxin contains a variety of peptides that interact with specific receptors in the brain. When injected into mouse brains, the toxins caused mice to “either jump, sleep, scratch, drag their hind legs, swing their heads or shake (http://www.accessexcellence.org/RC/AB/BA/pain_Meds/pain6.php).” Whoa.

But back to viruses. Again, there should be lots of possibilities for little things (viruses) to redirect the behaviors of big things (us), to their advantage. Chris recently pointed me to another example, this time with the somewhat well-known dengue virus, the cause of an insidious tropical disease transmitted by mosquitoes. Researchers at Johns Hopkins recently dissected the impact of dengue virus infection on gene expression in the mosquito, and found that there were changes in the production of numerous proteins. Amongst these proteins, though, were ones that “appeared to make the antennae more sensitive to odors — making them better at hunting humans, the virus’s only known mammalian host. Other changes in salivary gland genes appeared to make it easier for the virus to get into a mosquito’s saliva, ready for injection.” The virus turns the mosquito into a hunter-killer machine for the virus. If Japanese mosquitoes have anime, then there must be some awesome cartoons of the virus piloting a great big Gundam mosquito.

So, what does this mean for the future? Will viruses be engineered to take over our brains, turning us into foppish Victorians with capes and teeth? Um, unlikely. But the notion that there are more subtle behaviors being influenced by the genes of pathogens and other riders, that has legs. As more and more sequence data is collected from the bacteria that inhabit our gut (there’s more bacterial DNA in us than ‘us’ DNA), it’s becoming increasingly clear that not all the genes are just about making the bacteria happy to eat the stuff we put down our gullets, nor even just for the internecine bacterial warfare that goes on inside us every day in every way. No, the bacteria may be harboring genes that … guide us. It’s becoming pretty clear that there are bacteria associated with obesity, bacteria associated with diabetes, bacteria associated with schizophrenia …. But is this cause or effect? Do they inhabit niches that we’ve carved out for them, or did they do some of the carving? Who knows, although one recent study showed that bacteria-free mice have profound neurochemical changes, and much less anxiety (Neurogastroenterol Motil (2011), 23:255). And maybe there are genes like egt waiting to be found, genes that drive us to climb our own trees, or at least our own walls, all to the bacteria’s benefit.

 

- originally posted on Friday, April 20th, 2012