Nearly a year ago, back when I was still over on the Nature Blogs site, I decided to announce the obvious: Microarrays are dead. Of course, I meant that in the research setting, not that we should all throw out everything that's microarray related. In the long term, microarrays are simply going to be pushed further and further into niche applications. I think I was pretty verbose on that point – there will always be niches for certain technologies, and microarrays will always reign supreme in diagnostics and mail-order SNP panels. My opinion on it hasn't really changed.
However, I did have an opportunity to talk with people today who work on microarrays, and one of my internet friends reminds me, every time one of her microarray projects works, that I've already declared them dead, so I figure the topic is worth revisiting.
The real catalyst for this blog entry, however, came from an article I read on GenomeWeb. Unfortunately, I don't have a premium account and the article is not freely available… which raises the question of how I read it in the first place. I haven't the faintest. Twitter link?
In any case, the major point of the article was that the death of arrays has been greatly exaggerated and that there are still experiments that do better with arrays than with next-gen sequencing.
Well… yeah. No one claimed that there weren't applications. My internet friend's arrays are helping her understand horse colour patterns, while I know that diagnostics can be done much more efficiently using arrays. Clearly, those are niche applications where microarrays have the edge – and next-gen sequencing is overkill in cost, in the bioinformatics required AND in the amount of information gathered.
Unfortunately, in claiming the death of microarrays is premature, the GenomeWeb article basically cites the niche applications, rather than demonstrating a resurgence of microarrays into cutting-edge science. I don't find it particularly convincing, really.
So, here's my challenge: If you'd like to announce that microarrays aren't dead, you'll have to show their use outside of the niche applications. To be clear, let's enumerate the niches where microarrays will flourish:
- Large sample sets with small numbers of genes. If you want expression levels of transcribed genes over a large number of patients, then microarrays are probably much cheaper, and are likely to remain cheaper, as long as you don't mind gene-level resolution.
- Diagnostics: You only want information on an exact set of traits. Extraneous information is actually a hindrance, rather than a benefit.
- Personal medicine: well, this isn't any different from the diagnostics niche above, except the information is probably going direct to the consumer.
- Experiments that would have been cutting edge on Drosophila in the 80's or 90's. Not all organisms have been well studied. Horse colouring, for instance, is just one of those things that hasn't been explored in great detail and is now the topic of research. Again, you don't need the depth of next-gen sequencing to study a simple genomic set of traits.
So, did you see the theme? Simple experiments, nothing too in-depth and nothing where you're fishing for something new and unexpected. While you CAN do experimental work with microarrays, I just don't buy that cutting-edge work will happen on that platform anymore. That's not to say that there's nothing left of value (there CLEARLY is), but those aren't going to be studies that give you new mechanisms or huge insight into a well-studied organism.
Microarrays have, from my perspective, passed beyond the realm of cutting-edge genomics into the toolbox of "oldschool" applications. Again, it's not an insult to microarrays, and oldschool doesn't mean useless. (For instance, see my posts on Complete Genomics – they are truly the oldschool of sequencing, and they're doing some fantastic things with it.)
So, in conclusion, I'm going to stand by my original post and reiterate that microarrays are dead – at least as far as cutting-edge personal medicine and research are concerned. But, hey, that doesn't mean you have to throw them all out the window. They'll still be around, hiding in the quiet corners, pumping out volumes of data… just more slowly than sequencing. I just don't expect them to jump out, surprise me with a resurgence and displace next-gen technologies, which are only going to keep pushing microarrays further into the shadows.