Larry Moran points to a couple of posts critical of microarrays (The Problem with Microarrays):
Microarrays are small chips covered with short stretches of single-stranded DNA (probes). People hybridize labeled DNA from some source to the microarray, and spots on the array light up wherever the sample DNA hybridizes to the probes.
Most biologists are familiar with microarrays being used to measure gene expression. In this case, cDNA reverse-transcribed from the sample's mRNA is hybridized to the array, and the intensity of the signal at each spot is used as a proxy for the transcriptional level of a large sample of genes. Other uses include identifying copy number polymorphism, genotyping single nucleotide polymorphisms (SNPs), and capturing sequences of interest for downstream analysis.
However, many of these uses are much better implemented with next generation sequencing. For example:
- Gene expression can be measured using Solexa sequencing (doi:10.1101/gr.079558.108). This digital quantification is far more precise than microarray analysis, which relies on hybridization intensities.
- Copy number polymorphism can be identified by 454 sequencing using paired-end reads (doi:10.1126/science.1149504).
- SNP genotyping can be performed with next-gen sequencing (doi:10.1016/j.gde.2006.10.009).
- Additionally, Solexa sequencing is replacing microarrays in the high-throughput identification of DNA sequences in chromatin immunoprecipitation (ChIP-seq).
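The "digital" versus hybridization-intensity contrast above boils down to counting: sequencing-based expression measurement is, at its core, tallying reads per gene. A minimal sketch (the gene names, read assignments, and counts below are made up for illustration; real assignments come from a read aligner):

```python
from collections import Counter

# Hypothetical aligned reads: each sequenced read has been mapped to a gene.
aligned_reads = ["geneA"] * 120 + ["geneB"] * 30 + ["geneC"] * 3

# Digital quantification: expression is an integer read count per gene,
# rather than an analog fluorescence intensity.
counts = Counter(aligned_reads)

# Normalize by library size so counts are comparable across sequencing
# runs (reads per million mapped reads).
total = sum(counts.values())
rpm = {gene: n * 1_000_000 / total for gene, n in counts.items()}

for gene in sorted(counts):
    print(gene, counts[gene], round(rpm[gene], 1))
```

Because the measurement is a count, its precision grows with sequencing depth, whereas array intensities saturate at high expression and disappear into background noise at low expression.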
Now, all of these techniques require a completely sequenced genome (or transcriptome). But so do microarrays, so the up-front requirements aren't very different. Also, sequence capture with microarrays can't yet be replaced by another technology, though it does rely on next-generation sequencing for downstream analysis.
Okay, so the question "Do people still use microarrays?" is a bit of hyperbole. But will microarrays be obsolete any time soon? Not if people are still using Sanger sequencing.