The prize is specifically for a series of papers, beginning in 1998, in which two teams independently measured the redshifts and luminosity distances of samples of type Ia supernovae, and showed that, combined with other known constraints, the results were consistent with an expansion that has been speeding up in the recent past, and therefore an accelerating universe.
"Measurements of Omega and Lambda from 42 High-Redshift Supernovae", Perlmutter et al. 1999
"Observational Evidence from Supernovae for an Accelerating Universe and a Cosmological Constant", Riess et al. 1998
"The High-Z Supernova Search: Measuring Cosmic Deceleration and Global Curvature of the Universe Using Type Ia Supernovae", Schmidt et al. 1998
Riess et al. and Schmidt et al. were submitted concurrently on May 15th, from the same collaboration, the High-Z Supernova Search, while Perlmutter et al. was submitted later in 1998, with data from the Supernova Cosmology Project.
The motivation for the searches for distant supernovae is to measure cosmological parameters, in particular: H0, the Hubble constant, measuring the current expansion rate of the Universe; Ω, the mean density of the Universe as a fraction of the "critical density"; and q0, the "deceleration parameter", which, crudely, measures the rate of change of the expansion.
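In standard notation (spelling out definitions the post takes for granted), these parameters are:

```latex
H(t) = \frac{\dot a}{a}, \qquad
\Omega = \frac{\rho}{\rho_{\rm crit}}, \quad
\rho_{\rm crit} = \frac{3H_0^2}{8\pi G}, \qquad
q_0 = -\left.\frac{\ddot a \, a}{\dot a^2}\right|_{t_0}
```

where a(t) is the cosmic scale factor; q0 > 0 corresponds to a decelerating expansion, q0 < 0 to an accelerating one.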
"All of cosmology is the search for two numbers: H0 and q0" - Allan Sandage
A key part of the effort is to measure absolute distances, specifically the luminosity distance, DL, to objects that are cosmological distances away, and compare them with their redshift, z.
The cosmological redshift directly measures the fractional expansion of the universe since the light was emitted from the object, and, crudely, by comparing z and DL for many objects at different z, we can measure the expansion rate of the Universe at different times.
A conceptually straightforward way to do this measurement is to find standard candles, which enable a measurement of DL by comparing the apparent brightness of an object with its true, known, brightness.
The catch is that these objects have to be very luminous to be seen at cosmological distances.
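As a rough illustration of how comparing z and DL discriminates between cosmologies, here is a minimal sketch for a flat universe - H0 = 70 km/s/Mpc and the density parameters below are illustrative round numbers, not values taken from the papers above:

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light, km/s
H0 = 70.0            # Hubble constant, km/s/Mpc (illustrative value)

def luminosity_distance(z, omega_m, omega_lambda):
    """Luminosity distance in Mpc, assuming a flat universe
    (omega_m + omega_lambda = 1)."""
    def inv_E(zp):
        # 1/E(z), with E(z)^2 = Omega_m (1+z)^3 + Omega_Lambda
        return 1.0 / np.sqrt(omega_m * (1.0 + zp) ** 3 + omega_lambda)
    comoving, _ = quad(inv_E, 0.0, z)
    return (1.0 + z) * (C_KM_S / H0) * comoving

def distance_modulus(d_l_mpc):
    """m - M = 5 log10(D_L / 10 pc), with D_L in Mpc."""
    return 5.0 * np.log10(d_l_mpc * 1.0e6 / 10.0)

# At z = 0.5, a Lambda-dominated universe puts a supernova further
# away (fainter) than an Omega = 1 matter-only universe:
mu_lcdm = distance_modulus(luminosity_distance(0.5, 0.3, 0.7))
mu_eds = distance_modulus(luminosity_distance(0.5, 1.0, 0.0))
print(mu_lcdm - mu_eds)  # a few tenths of a magnitude
```

That few-tenths-of-a-magnitude offset at z ≈ 0.5 is the size of the effect the two teams had to measure.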
Kirshner and Kwan (1974) proposed the use of supernovae as standard candles, and Bob Wagoner (1977) showed how this could be used to measure q0.
Kirshner is a founding member of High-Z and Riess and Schmidt were his graduate students.
The road to measuring q0 was long and tortuous, and involved contributions from very many astronomers doing bread-and-butter astrophysics: measuring the actual brightnesses of nearby supernovae, correlating them with their spectra and time variability, and calibrating the data so that the supernovae could be shown to be robust standard candles.
Type II (gravitational collapse) supernovae turn out to have too many sub-classes and too wide a range of luminosities to be useful calibrated standard candles, as yet; but type Ia supernovae (thermonuclear detonations of roughly 1.4 solar mass stellar cores) work - a set of type Ia standard candles can be robustly calibrated by bootstrapping directly from measurements of the supernovae themselves.
The calibration is, of course, cross-calibrated to other distance ladder measurements.
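The bootstrap calibration rests on the empirical "brighter-slower" relation: type Ia peak luminosity correlates with how fast the light curve declines. A minimal sketch of a Phillips-style correction - the slope and pivot here are made-up round numbers for illustration, not fitted values from the literature:

```python
def standardised_peak_mag(m_peak, dm15, slope=0.8, pivot=1.1):
    """Correct an observed peak magnitude using the decline rate dm15
    (magnitudes faded in the 15 days after peak).  Slow decliners
    (small dm15) are intrinsically brighter, so shifting every event
    to a common fiducial decline rate tightens the standard candle.
    slope and pivot are illustrative, not fitted values."""
    return m_peak - slope * (dm15 - pivot)

# A slow decliner is corrected fainter, a fast decliner brighter:
print(standardised_peak_mag(19.0, 0.9))  # slow decliner
print(standardised_peak_mag(19.0, 1.5))  # fast decliner
```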
The Hubble Space Telescope played a critical role in following up and later (with new wider field cameras) finding distant type Ia supernovae.
The expectation in cosmology back in the '90s was that q0 would be found to be positive, indicating a decelerating Universe. Theoretical prejudice suggested that Ω = 1 - that the density of the Universe was at, or very close to, the critical density, at which expansion continues indefinitely but the rate of expansion approaches zero, as gravity decelerates the initial expansion.
Observations, however, were not consistent with the preconceptions, and by the mid-'90s a lot of astronomers were starting to contemplate alternative configurations for the Universe.
The complementary data from the High-Z and SCP teams strongly suggested that q0 was negative, that the Universe was accelerating.
This was a known theoretical possibility; the simplest model in which it might happen, due to Einstein, has a "cosmological constant", Λ, representing a constant embedded energy density pushing the Universe outwards.
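In the standard Friedmann models, the textbook acceleration equation shows how Λ does this:

```latex
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}
```

Ordinary matter and radiation make the right-hand side negative (deceleration), but a positive Λ contributes a constant positive term, which dominates once the matter density has diluted enough - at which point ä > 0 and the expansion accelerates.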
There are a host of other possibilities, many conceptually covered by the label of "dark energy", which differ in the physical process by which the Universe is pushed to accelerate, overcoming the decelerating pull of gravity.
Thus, when the type Ia data suggested the presence of "dark energy", the time was scientifically ripe, and the general idea was rapidly and broadly adopted by the community: it fixed a lot of accumulating discrepancies and was an elegant solution to cosmological quandaries. Subsequent data from other observations provided consistent support for a Universe with "dark energy" - in fact a Universe dominated by Λ.
Other related papers include:
"Discovery of a Supernova Explosion at Half the Age of the Universe and its Cosmological Implications", Perlmutter et al., Nature, 1998
"Supernova Limits on the Cosmic Equation of State", Garnavich et al. 1998
There is still much to do: basic astronomy, like continued calibration and checking of the photometric and spectroscopic properties of type Ia supernovae; the hunt to find and calibrate independent standard candles to cross-check the supernova results; and the progenitor problem - we still do not know the progenitors of type Ia supernovae, and urgently need to find and understand their (multiple) formation channels. Many channels have been postulated, but none seems adequate individually.
There is also a major push to measure the properties of dark energy: the simplest characterisation is to measure the equation-of-state parameter of Λ, w.
A cosmological constant has w = -1, and w', the rate of change of w, is zero.
Other dark energy concepts in general have w' non-zero, and therefore w is not constant.
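For a dark-energy component with constant equation-of-state parameter w, the standard fluid equation gives the textbook scaling:

```latex
p = w \rho c^2 \quad \Longrightarrow \quad \rho(a) \propto a^{-3(1+w)}
```

so w = -1 gives an energy density constant in time (a cosmological constant), while w > -1 dilutes as the Universe expands; measuring departures of w from -1, or a non-zero w', is what distinguishes the alternatives.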
Or there may well be some other physical effect mimicking a cosmological constant without the underlying physics of a constant dark vacuum energy.
As this is the dominant component of the Universe in the current era, and one which seems likely to control the future evolution of the Universe, it seems prudent to follow up on these results.
The SNAP mission was proposed to do that, and was expedited into the NASA mission lineup in the last decade, but has not met budgetary or technological goals.
Currently the NASA concept for such a dark energy mission is WFIRST, but in the current budgetary climate there is no prospect for WFIRST flying in the foreseeable future.
The conceptually similar, but less ambitious, European Space Agency mission, EUCLID, has just been selected as part of the future M-class flight missions by ESA.
PS: via telescoper - EUCLID and Solar Orbiter selected for ESA M-class mission, PLATO loses.
Λ - the COBE and WMAP website at GSFC has an excellent discussion, at all levels, of issues in cosmology, including the history and current role of the cosmic microwave background measurements.
Excellent work done by Saul Perlmutter, Adam Riess and Brian Schmidt. They deserve this prize.
Yet whether our universe ultimately cools should be disputed. When calculating back to point 1 (the Big Bang), one can calculate the distance in time. Yet when calculating back to point 0, time becomes less of a clear measurement. If the Big Bang was a large-scale phenomenon, and for instance included a center that did not materialize (dark matter), then our material universe did not have an exact center, but arrived from the 'skin' of a gigantic mass without the mass itself materializing. Matter can then be seen as similar to photons escaping the sun, while dark matter would then be seen as the sun that stayed in place.
If the size of the initial moment of materialization is, say, the size of a galaxy, then time did not start where we envision it today, because nothing of the center materialized. Time started then at the edge of dark matter.
Regarding the cosmological constant
Researchers recently have made great strides in advancing our knowledge of the rate of expansion of the universe. Theorists have been less at the forefront of things since Einstein called the Cosmological Constant the "greatest error" of his life.
If, however, we take a step back, all the way back, a picture becomes clear.
Either the universe was empty (of energy, of momentum, of everything) at the moment of the big bang, and now it sums to something - this is the mathematical equivalent of saying 0=1. I hardly need to refute this. Reductio ad absurdum.
Alternatively, the universe has always held (and always will due to the conservation of energy) the same energy today as at the big bang. 1=1 in maths, but not consistent with observation - and it does not answer the question "how did the universe come about"...it was always there, which flies in the face of the big bang theory.
The only remaining solution is that there was nothing, and it split.... 0 = 3 - 2 - 1 ..... fine in maths, explains where the universe came from (nowhere) and has measurable consequences.
If I am right - the universe MUST be in balance, neither open nor closed, Einstein did not make an error and Ω = 1, the universe is at critical density - always has been - always will be. We don't even need fruitless searches for dark matter!
How this split from nothing to "something - something" happened is actually quite simple. We are familiar with the spontaneous cessation of existence of particles - radioactivity. We are less familiar with spontaneous creation, yet it is just the reverse and happens all the time.
The universe may appear simple, but is in fact complex (in the mathematical sense). The "imaginary" number is essential to modern maths and physics. My view is that the universe has a 5th dimension, the "imaginary", above the 3 physical and time - and there is plenty of evidence for this, especially in quantum physics. We do not even have to invoke the countless extra dimensions of string or brane theory, for which there is no evidence.