The Death of Microarrays, revisited

Nearly a year ago, back when I was still over on the Nature Blogs site, I decided to announce the obvious: Microarrays are dead.  Of course, I meant that in the research setting, and not that we should all throw out everything that’s microarray related.  In the long term, microarrays are simply going to be pushed further and further into niche applications.  I think I was pretty verbose on that point – there will always be niches for certain technologies, and microarrays will always reign supreme in diagnostics and mail-order SNP panels.   My opinion on it hasn’t really changed.

However, I did have an opportunity to talk with people today who work on microarrays, and one of my internet friends reminds me, every time one of her microarray projects works, that I’ve already declared them dead, so I figure the topic is worth revisiting.

The real catalyst for this blog entry, however, came from an article I read on GenomeWeb.  Unfortunately, I don’t have a premium account and the article is not freely available… which raises the question of how I read it in the first place.  I haven’t the faintest. Twitter link?

In any case, the major point of the article was that the death of arrays has been greatly exaggerated and that there are still experiments that do better with arrays than with next gen sequencing.

Well… yeah.  No one claimed that there weren’t applications.  My internet friend’s arrays are helping her understand horse colour patterns, and I know that diagnostics can be done much more efficiently using arrays.  Clearly, those are niche applications where microarrays have the edge – and next gen sequencing is overkill in cost, in the bioinformatics required AND in the amount of information gathered.

Unfortunately, in claiming that the death of microarrays is premature, the GenomeWeb article basically cites the niche applications, rather than demonstrating a resurgence of microarrays into cutting edge science.  I don’t find it particularly convincing, really.

So, here’s my challenge: if you’d like to announce that microarrays aren’t dead, you’ll have to show their use outside of the niche applications.  To be clear, let’s enumerate the niches where microarrays will flourish:

  1. Large sample sets with small numbers of genes.  If you want expression levels of transcribed genes across a large number of patients, microarrays are probably much cheaper, and are likely to remain cheaper, as long as you don’t mind gene-level resolution.  (There’s a rough sketch of what I mean after this list.)
  2. Diagnostics:  You only want information on an exact set of traits.  Extraneous information is actually a hindrance, rather than a benefit.
  3. Personal medicine: well, this isn’t any different from diagnostics (number 2), except that the information is probably going directly to the consumer.
  4. Experiments that would have been cutting edge on Drosophila in the ’80s or ’90s.  Not all organisms have been well studied.  Horse colouring, for instance, is just one of those things that hasn’t been explored in great detail and is now the topic of research.  Again, you don’t need the depth of next gen sequencing to study a simple set of genomic traits.
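
To make niche #1 a little more concrete, here’s a rough sketch of the kind of analysis arrays are well suited to: one expression value per gene, compared across a big pile of patients. The numbers below are simulated stand-ins rather than real array data, and the cutoff is just for illustration – it’s a sketch of the idea, not anyone’s actual pipeline.

```python
# Gene-level differential expression across a large cohort -- the "niche #1"
# use case.  Everything here is simulated: the expression matrix stands in
# for a normalized (log2) microarray dataset with one value per gene.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes, n_cases, n_controls = 1000, 200, 200

# One row per gene, one column per patient -- gene-level resolution only.
cases = rng.normal(loc=8.0, scale=1.0, size=(n_genes, n_cases))
controls = rng.normal(loc=8.0, scale=1.0, size=(n_genes, n_controls))
cases[:50] += 1.0  # pretend the first 50 genes are truly up-regulated

# One t-test per gene across the whole cohort.
t_stats, p_values = stats.ttest_ind(cases, controls, axis=1)

# Crude Bonferroni cutoff; a real analysis would use an FDR procedure.
significant = p_values < (0.05 / n_genes)
print(f"{significant.sum()} genes pass the Bonferroni threshold")
```

That’s the whole experiment: cheap per sample, easy to scale over patients, and completely blind to anything below the gene level – which is exactly the trade-off described above.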

So, did you see the theme?  Simple experiments, nothing too in-depth and nothing where you’re fishing for something new and unexpected.  While you CAN do experimental work with microarrays, I just don’t buy that cutting edge work will happen on that platform anymore.  That’s not to say that there’s nothing left of value (there CLEARLY is), but those aren’t going to be studies that give you new mechanisms or huge insight into a well-studied organism.

Microarrays have, from my perspective, passed beyond the realm of cutting edge genomics into the toolbox of “oldschool” applications.  Again, it’s not an insult to microarrays, and oldschool doesn’t mean useless. (For instance, see my posts on Complete Genomics – they’re truly the oldschool of sequencing, and they’re doing some fantastic things with it.)

So, in conclusion, I’m going to stand by my original post and reiterate that microarrays are dead – at least as far as doing cutting edge personal medicine and research goes. But, hey, that doesn’t mean you have to throw them all out the window.  They’ll still be around, hiding in the quiet corners, pumping out volumes of data… just more slowly than sequencing.  I just don’t expect them to jump out and surprise me with a resurgence and displace next gen technologies, which are only going to keep pushing microarrays further into the shadows.

4 thoughts on “The Death of Microarrays, revisited”

  1. Hi
    I have to declare an interest as someone who has worked on both ma and seq… that said, I take issue with your description of these as ‘niche’.

    1, 2, and 3 are not ‘niche’; they are the bread and butter of science. And I have to say that everything we have learned about disease, genomics and diagnostics has shown that expression over a large number of patients IS far, far more important than depth over a small set.

    I would recommend that clinical trials be done in conjunction with patient ma… not seq.

    If anything, finding depth of transcript over a small set is the ‘niche’ application.

    That said … more and more stuff will be done by seq as it becomes a default.

    • Hi Stephen,

      Thanks for the reply! I really enjoy being challenged on my points, and if nothing else, I should be getting used to defending my opinions in advance of my defense. (-;

      “And I have to say that everything we have learned about disease, genomics and diagnostics has shown that expression over a large number of patients IS far, far more important than depth over a small set.”

      I think you’ve missed my point. Expression over a small set is useless, so I wasn’t advocating that. I’m simply saying that the resolution of microarrays is too poor compared to RNA-seq to allow microarrays to be the default position for most cutting edge science. For instance, indels, SNVs and alternative splicing are all key components of diseases, and epigenetics is certainly a major contributing factor, so just looking globally at expression isn’t the bread and butter of science anymore – it’s just one small facet of the bigger picture. And focusing solely on expression in the face of this much larger expansion of our understanding of genomics and transcriptomics is exactly the definition of niche.

      Many of the roles of microarrays have eroded with the introduction of next gen sequencing, leaving us with a much smaller number of applications for which arrays are appropriate, and anyone who’s focused solely on expression is either doing a niche application or hiding from the massive changes sweeping through the field.

      “If anything, finding depth of transcript over a small set is the ‘niche’ application.”

      Very few people are focused on using RNA-seq only as a digital expression counter at this point: it’s hard to imagine someone doing RNA-seq and then ignoring the SNV, indel and splicing information generated. That makes your comparison somewhat hollow, in my opinion.
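
      To be clear about what I mean by a digital expression counter, here’s a toy sketch: expression becomes nothing more than the number of reads falling inside each gene’s coordinates. The gene models and read positions below are made up for illustration; a real pipeline would count from a BAM file with something like htseq-count or featureCounts.

```python
# Toy "digital expression counter": tally RNA-seq reads per gene.
# Gene coordinates and read positions are hypothetical examples.
from collections import Counter

# Hypothetical gene models on one chromosome: name -> (start, end).
genes = {"GENE_A": (1_000, 5_000), "GENE_B": (8_000, 12_000)}

# Hypothetical alignment start positions of RNA-seq reads.
read_positions = [1_200, 1_350, 4_900, 8_100, 8_150, 9_999, 11_500, 20_000]

counts = Counter()
for pos in read_positions:
    for gene, (start, end) in genes.items():
        if start <= pos <= end:
            counts[gene] += 1

# The per-gene tallies are the "digital expression" values, but the same
# reads also carry SNV, indel and splice-junction information that a
# simple counter like this throws away.
print(dict(counts))  # {'GENE_A': 3, 'GENE_B': 4}
```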

      Furthermore, the cost of doing RNA-seq is dropping rapidly, so, if anything, this field will continue to expand as it becomes increasingly cost-effective to do larger sample sizes. It is only a matter of time (5 years?) before it becomes cheaper to do RNA-seq than arrays. Thus, I think your point is a little overstated. RNA-seq will only continue to displace microarrays for the foreseeable future, pushing arrays further into what I would consider niche applications.

  2. Pingback: Dear Affymetrix: You Suck. | blog.fejes.ca

  3. Pingback: Y al tercer día resucitaron | Die Biochemie
