Monday, December 24, 2007


The stars visible to the naked eye range more than a thousandfold in brightness, from the most brilliant one, Sirius, to those that can only just be glimpsed on the darkest of nights. Astronomers term a star's brightness its magnitude. The magnitude system is one of the odder conventions of astronomy.
Naked-eye stars are ranked in six magnitude classes, from first magnitude (the brightest) to sixth magnitude (the faintest). A difference of five magnitudes is defined as equalling a brightness ratio of exactly 100 times. Hence a step of one magnitude corresponds to a brightness ratio of about 2.512 times (the fifth root of 100). A difference of two magnitudes corresponds to a brightness ratio of 2.512 x 2.512 = 6.3 times, three magnitudes to 2.512 x 2.512 x 2.512 = 16 times, and so on.
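The arithmetic above can be sketched in a few lines of Python. This is just an illustration of the defined relation (5 magnitudes = a factor of 100), not anything from the original article; the function name is my own.

```python
# A 5-magnitude step is defined as exactly a 100-fold brightness ratio,
# so one magnitude corresponds to a factor of 100 ** (1/5), about 2.512.

def brightness_ratio(magnitude_difference):
    """Brightness ratio corresponding to a given magnitude difference."""
    return 100 ** (magnitude_difference / 5)

for dm in (1, 2, 3, 5):
    print(f"{dm} magnitude(s) -> {brightness_ratio(dm):.2f}x brighter")
```

Running this reproduces the figures in the text: one magnitude is a factor of about 2.51, two magnitudes about 6.31, three about 15.85, and five exactly 100.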
A star 2.512 times brighter than one of magnitude 1.0 is said to be of magnitude 0. Objects brighter still are assigned negative magnitudes. Sirius, the brightest star in the sky, has a magnitude of -1.46.
The magnitude system can be extended indefinitely to take in both the brightest and the faintest objects. For example, the Sun has a magnitude of about -27. Objects fainter than sixth magnitude are classified in succession as seventh magnitude, eighth magnitude, and so on. The faintest objects that can be detected by telescopes on Earth are about magnitude 25.
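As a rough illustration of how far the scale stretches, the same relation can be applied to the two extremes quoted above: the Sun at about magnitude -27 and the faintest telescopic objects at about magnitude 25. This is my own back-of-the-envelope sketch, not a figure from the article.

```python
# Brightness ratio across the full span of the magnitude scale
# quoted in the text: from the Sun (-27) to the faintest
# telescopic objects (magnitude 25), a span of 52 magnitudes.

def brightness_ratio(magnitude_difference):
    return 100 ** (magnitude_difference / 5)

span = 25 - (-27)  # 52 magnitudes
ratio = brightness_ratio(span)
print(f"The Sun is roughly {ratio:.1e} times brighter "
      f"than the faintest detectable objects")
```

That 52-magnitude span works out to a brightness ratio of roughly 6 x 10^20.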