magnitude


A measure of an object's brightness, or faintness, as perceived by the human eye. In the system used by astronomers, the higher the magnitude, the fainter the object. The magnitude and apparent brightness of a star are related in a logarithmic fashion: for every five steps in magnitude, the apparent brightness of a star, galaxy, or nebula changes by a factor of 100. For example, we receive 100 times as much light energy from Vega -- a zero-magnitude star -- as from Eta Ursae Minoris -- a fifth-magnitude star in the Little Dipper. Under the clearest, darkest skies, your eye cannot see stars fainter than sixth magnitude. With the aid of binoculars, the human eye can detect 10th-magnitude stars.
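
Written as an equation (the standard form of the magnitude-brightness relation, added here to make the factor-of-100 rule explicit), the brightness ratio corresponding to a magnitude difference is

\[
\frac{F_1}{F_2} = 100^{(m_2 - m_1)/5} \approx 2.512^{\,(m_2 - m_1)},
\]

so for Vega (magnitude 0) and a fifth-magnitude star, the ratio is 100^{5/5} = 100, matching the example above.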

The magnitude scale is organized logarithmically because that is how our eyes perceive brightness, whether the source is a light bulb or a star. For instance, your eye would perceive the same brightness difference between a 25- and a 50-watt light bulb as between a 100- and a 200-watt light bulb. Likewise with stars, your eye would detect the same brightness difference between a first- and a second-magnitude star as between a second- and a third-magnitude star. If you repeat this step-by-step comparison down to a sixth-magnitude star, the first-magnitude star's brightness (the amount of light received on Earth) works out to 100 times that of the sixth-magnitude star.
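
The same arithmetic can be sketched in a few lines of Python. This is only a minimal illustration of the factor-of-100-per-five-magnitudes rule described above; the function and variable names are illustrative, not part of any astronomy library.

```python
def brightness_ratio(mag_faint, mag_bright):
    """How many times brighter the lower-magnitude object appears,
    using the rule that 5 magnitudes correspond to a factor of 100."""
    return 100 ** ((mag_faint - mag_bright) / 5)

# First- versus sixth-magnitude star: five steps, so a factor of 100.
print(brightness_ratio(6, 1))   # 100.0

# A single magnitude step corresponds to a factor of about 2.512.
print(brightness_ratio(2, 1))   # ~2.512
```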