Magnitudes were first placed on stars by the Greek astronomer Hipparchus, more
than two thousand years ago. He listed stars from first magnitude (the brightest)
to sixth magnitude (the faintest), with each one-magnitude increase corresponding to a
star about half as bright. In the mid-1800s astronomers adopted a more precise
definition of magnitude, fixing the intensity ratio between successive
magnitudes at 2.512, the fifth root of 100. This means that a second-mag. star appears
about 2.5 times as bright as a third-mag. star (close to Hipparchus' value!), and it works
out perfectly so that a first-mag. star is exactly 100 times as bright as a sixth-mag. star.
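To make the arithmetic concrete, here is a minimal sketch in Python; the function name
brightness_ratio is purely illustrative, not a standard routine from any library.

```python
# Magnitude arithmetic: a difference of 5 magnitudes is defined as exactly
# a factor of 100 in brightness, so one magnitude is 100 ** (1/5), about 2.512.

def brightness_ratio(delta_mag):
    """How many times brighter the lower-magnitude star appears,
    given the magnitude difference delta_mag between two stars."""
    return 100 ** (delta_mag / 5)

print(brightness_ratio(1))   # ~2.512: stars one magnitude apart
print(brightness_ratio(5))   # 100.0: a first-mag. star vs. a sixth-mag. star
```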
With further measurements it was found that some stars are brighter than
first-mag., but instead of changing the scale these stars were given magnitudes of
zero or even negative values; the four brightest stars in the sky have negative
magnitudes. The most important thing to remember is that as magnitude
decreases, a star's brightness increases.