# Star magnitude


Star magnitude refers to the apparent brightness of stars.[1] More generally, the term "apparent magnitude" is a measure of how bright any astronomical object appears as seen from Earth, while the absolute magnitude of an object is its magnitude as it would appear from a distance of 10 parsecs. Somewhat confusingly, lower magnitudes correspond to brighter objects.

## History

Stars were first classified by brightness by Hipparchus in 129 BC. He produced the first well-known catalog of stars, in which the brightest stars were listed as "first magnitude", the next-brightest grouping as "second magnitude", and so on down to sixth magnitude.[2] The differing brightness of stars is also mentioned in the New Testament: "The sun has one kind of splendor, the moon another and the stars another; and star differs from star in splendor."[3]

In 140 AD, Claudius Ptolemy refined the scale by adding "greater" and "smaller" within each magnitude class to separate the stars further.

With the telescope, Galileo was able to make out stars much dimmer than sixth magnitude.

## Modern definition

In 1856, Norman R. Pogson proposed and codified magnitude as a logarithmic scale on which a difference of five magnitudes corresponds to a brightness ratio of exactly one hundred. The brightness ratio from one magnitude to the next is therefore the fifth root of one hundred, about 2.512, known as the Pogson ratio.[1] For example, a magnitude +5.0 star is about 2.512 times brighter than a magnitude +6.0 star.
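The scale can be checked numerically; a minimal Python sketch (the variable name is illustrative):

```python
# The Pogson ratio: the brightness factor corresponding to one magnitude step.
pogson_ratio = 100 ** (1 / 5)

# Five steps of one magnitude compound to a factor of exactly 100,
# so a magnitude +1.0 star is 100 times brighter than a magnitude +6.0 star.
print(pogson_ratio)       # about 2.512
print(pogson_ratio ** 5)  # essentially 100, up to floating-point rounding
```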

The apparent magnitude of a star, $m_*$, can be computed using[4]

$$m_* = m_c - 2.5 \log_{10}\left(\frac{b_*}{b_c}\right)$$

where $m_c$ is the apparent magnitude of a comparison star and $b_*$ and $b_c$ are the brightnesses of the star and the comparison star as seen on Earth. The $\log_{10}$ refers to a base-10 logarithm. Historically, the star Vega was used as the comparison star and defined to have an apparent magnitude of +0.0, though recent measurements place it at about +0.03 in order to be consistent with other stars.[2]
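The formula translates directly into code. A minimal Python sketch, using the historical convention that the comparison star (Vega) sits at magnitude 0.0; the function name and default value are illustrative assumptions:

```python
import math

def apparent_magnitude(b_star, b_comp, m_comp=0.0):
    """Apparent magnitude of a star with measured brightness b_star,
    relative to a comparison star of brightness b_comp and magnitude
    m_comp (default 0.0, the historical Vega zero point)."""
    return m_comp - 2.5 * math.log10(b_star / b_comp)

# A star 100 times dimmer than the comparison star is 5 magnitudes fainter:
print(apparent_magnitude(1.0, 100.0))  # approximately 5.0
```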

The apparent magnitude of an object depends not only on how much light the object emits, but also on how far away it is. To compare the intrinsic brightness of objects, the absolute magnitude is often used. It is defined as the magnitude of an object as seen from a distance of 10 parsecs (32.6156 light years). The absolute magnitude, $M$, can be calculated using[5]

$$M = m - 5 \log_{10}(d) + 5$$

where $m$ is the apparent magnitude and $d$ is the distance to the object from Earth. $d$ must be measured in parsecs for this equation to work.
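As a sketch of this formula in Python (Sirius's approximate values, m ≈ −1.46 at about 2.64 parsecs, are used purely for illustration):

```python
import math

def absolute_magnitude(m, d_parsecs):
    """Absolute magnitude from apparent magnitude m and distance in
    parsecs, via M = m - 5*log10(d) + 5."""
    return m - 5 * math.log10(d_parsecs) + 5

# Sirius: m = -1.46 at d = 2.64 pc, giving M of roughly +1.4
print(round(absolute_magnitude(-1.46, 2.64), 2))
```

Note that an object exactly 10 parsecs away has equal apparent and absolute magnitudes, since the logarithm term cancels the constant.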

## Examples

• Naked eye limit with a dark sky: +6
• Visual limit of a 12" telescope: +14
• Visual limit of a 200" telescope: +20
• Sirius: –1.5
• Venus: –4.4
• Full moon: –12.5
• The Sun: –26.7
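The magnitude values above translate into brightness ratios; a small Python sketch using the Sun and full-moon figures from the list (the function name is illustrative):

```python
def brightness_ratio(m_bright, m_dim):
    """How many times brighter an object of magnitude m_bright appears
    than one of magnitude m_dim; each magnitude step is a factor of
    100**(1/5), about 2.512."""
    return 100 ** ((m_dim - m_bright) / 5)

# Sun (-26.7) versus full moon (-12.5): roughly half a million times brighter.
print(round(brightness_ratio(-26.7, -12.5)))
```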

## References

1. Magnitude. Encyclopædia Britannica. Retrieved on July 28, 2020.
2. Brightest Stars: Luminosity & Magnitude. space.com (October 11, 2017). Retrieved on July 28, 2020.
3. Bible, 1 Corinthians 15:41 (New International Version). Bible Gateway. “Credit:Holy Bible, New International Version®, NIV® Copyright © 1973, 1978, 1984, 2011 by Biblica, Inc.®”
4. Formulas - Magnitudes. astronomyonline.org. Retrieved on July 28, 2020.
5. Absolute Magnitude. astronomy.swin.edu.au. Retrieved on July 28, 2020.