What is the baseline for determining the magnitude scale of celestial objects? Why do brighter objects have negative numbers?

Dean Treadway, Knoxville, Tennessee
The star Vega has an apparent magnitude of 0. NASA/JPL-Caltech/University of Arizona
The first observer to catalog differences in star brightness was the Greek astronomer Hipparchus. Around 135 B.C., he compiled a catalog of roughly 850 stars divided into six brightness ranges. He called the brightest 1st magnitude and the faintest 6th magnitude. Observers used this system for more than 1,500 years.

But then came Galileo Galilei. In addition to discovering the phases of Venus, Jupiter’s large moons, and more, he noted that his telescope did not simply magnify: it revealed stars the eye alone could not see. In 1610, Galileo coined a new term when he called the brightest stars below naked-eye visibility “7th magnitude.”

The telescope, therefore, demanded an expansion of Hipparchus’ magnitude system, but not only on the faint end. Observers noted that 1st-magnitude stars varied greatly in brightness. Also, to assign a magnitude to the brightest planets, the Moon, and especially the Sun, scientists would have to work with negative numbers.

In 1856, English astronomer Norman R. Pogson suggested that astronomers calibrate all magnitudes so that a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100. (For example, a 1st-magnitude star is 100 times brighter than a 6th-magnitude one.) We still use Pogson’s scale today.
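Pogson's calibration can be sketched in a few lines of Python. The function name here is illustrative, not from the article; the math is just the standard relation that each magnitude step equals the fifth root of 100, or about 2.512:

```python
import math

def brightness_ratio(mag_diff):
    """Brightness ratio corresponding to a magnitude difference,
    using Pogson's calibration: 5 magnitudes = a factor of exactly 100."""
    return 100 ** (mag_diff / 5)  # equivalently 10 ** (0.4 * mag_diff)

# A 1st-magnitude star vs. a 6th-magnitude star: 5 magnitudes apart
print(brightness_ratio(5))            # 100.0

# One magnitude corresponds to a factor of about 2.512
print(round(brightness_ratio(1), 3))  # 2.512
```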

Astronomers routinely use two main kinds of magnitude to describe the same object. “Apparent magnitude” describes how bright an object looks from Earth. Observers once measured apparent magnitudes by eye; today, ultrasensitive CCD cameras provide measurements accurate to 0.01 magnitude.

With “absolute magnitude,” astronomers indicate how bright an object really is. Two things determine this number, which tracks an object’s luminosity: apparent magnitude and distance. Absolute magnitude is the brightness an object would have if it were exactly 10 parsecs (32.6 light-years) from Earth. So any object closer than 32.6 light-years has an apparent magnitude brighter than its absolute magnitude; for any object farther away, the absolute magnitude is the brighter of the two. — Michael E. Bakich, Senior Editor, Astronomy magazine
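The relationship between apparent and absolute magnitude described above is the standard distance-modulus formula, M = m − 5 log₁₀(d / 10 pc). A minimal sketch (the function name is illustrative; the Sun's apparent magnitude of about −26.74 is a standard textbook value):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance in
    parsecs, via the distance modulus: M = m - 5 * log10(d / 10)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# At exactly 10 parsecs, apparent and absolute magnitude are equal
print(absolute_magnitude(0.0, 10))  # 0.0

# The Sun: apparent magnitude about -26.74, at 1 AU (1/206265 parsec),
# comes out to roughly +4.8 -- an ordinary-looking star from 10 parsecs
print(round(absolute_magnitude(-26.74, 1 / 206265), 2))  # 4.83
```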