Supernova pairs have been used to accurately measure cosmological distances.

Within the last 20 years, astronomers have discovered that the Universe is expanding at an ever-increasing rate due to dark energy. One technique used to detect this acceleration is to measure the distances to bright supernovae.

Only Type Ia supernovae are used for this kind of measurement. They arise either from the merger of two white dwarf stars or from a single white dwarf feeding off a companion star. In either case, the result is a gargantuan explosion whose brightness can outshine the host galaxy. The resulting supernova is dazzlingly luminous for about a week, gradually fading over a period of about three months.

These sudden bursts of stellar radiation have long been used to measure distances to far-off galaxies. Known as ‘standard candles’, these thermonuclear explosions have nevertheless proved to be far from standard, varying in brightness by up to 40%. But a paper published in November 2015 describes a new method for reducing the uncertainty in their luminosity.
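The reason this scatter matters comes down to the inverse-square law: a candle of luminosity L observed with flux F lies at a distance d = sqrt(L / 4πF), so any error in the assumed luminosity feeds directly into the inferred distance. The short Python sketch below, using made-up numbers purely for illustration, shows that a 40% luminosity error skews the distance estimate by roughly 18%:

```python
import math

def inferred_distance(luminosity, flux):
    """Distance implied by the inverse-square law, F = L / (4 * pi * d**2)."""
    return math.sqrt(luminosity / (4 * math.pi * flux))

# Illustrative numbers only (not from the paper): a candle assumed to have
# luminosity 1.0, observed with a flux of 1e-18 in the same arbitrary units.
assumed_L = 1.0
observed_flux = 1e-18
d_assumed = inferred_distance(assumed_L, observed_flux)

# If the supernova is really 40% more luminous than assumed, its true
# distance is sqrt(1.4) times larger, i.e. the estimate is off by ~18%.
d_true = inferred_distance(1.4 * assumed_L, observed_flux)
print(f"distance error from a 40% luminosity error: {d_true / d_assumed - 1:.1%}")
```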

The problem with using a Type Ia as a standard measure is that the detection of its light can be hampered by intervening clouds of interstellar gas and dust, which dim the supernova's apparent brightness and introduce scatter into the measured values.

Now, Dr Hanna Fakhouri and colleagues at the ‘Nearby Supernova Factory’ (SNfactory), a collaboration based at the U.S. Department of Energy's Lawrence Berkeley National Laboratory, have reduced the brightness dispersion of Type Ia supernovae to as little as 8%. Their research sampled almost 50 nearby supernovae, identifying cosmic ‘twins’: pairs with closely matching spectra. Using spectrophotometric time-series observations, Fakhouri and her team have developed a method to improve cosmological measurements. The results have helped to standardise the brightness of these explosive events as distance indicators.
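Because distance scales as the square root of luminosity, the fractional uncertainty in distance is roughly half the fractional dispersion in brightness. The sketch below illustrates that rule of thumb (it is not a calculation from the paper itself), comparing the distance precision implied by the earlier ~40% dispersion with the 8% achieved by twin matching:

```python
def distance_uncertainty(brightness_dispersion):
    """Approximate fractional distance uncertainty for a given fractional
    brightness dispersion, using d proportional to sqrt(L) (small-scatter limit)."""
    return 0.5 * brightness_dispersion

for label, dispersion in [("unmatched Type Ia supernovae", 0.40),
                          ("twin-matched Type Ia supernovae", 0.08)]:
    print(f"{label}: ~{distance_uncertainty(dispersion):.0%} distance uncertainty")
```

Halving the brightness dispersion in this way translates directly into sharper distance measurements, which is what makes twin matching useful for cosmology.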