Star magnitude

Star magnitude refers to the apparent brightness of stars.

History

Stars were first classified by brightness by Hipparchus in 129 BC. He produced the first well-known catalog of stars, in which the brightest stars were listed as "first magnitude" and the next grouping as "second magnitude"; the divisions continued to sixth magnitude.[1] The differing brightness of stars is also noted in the New Testament: "The sun has one kind of splendor, the moon another and the stars another; and star differs from star in splendor."[2]

In 140 AD, Claudius Ptolemy added "greater" and "smaller" qualifiers within each magnitude class to separate the stars further.

With Galileo and the telescope, it became possible to observe stars much dimmer than sixth magnitude.

Today

In the 1800s, Norman R. Pogson proposed and codified magnitude as a logarithmic scale on which a difference of five magnitudes corresponds to a brightness ratio of exactly one hundred, so a first-magnitude star is one hundred times brighter than a sixth-magnitude star. The ratio from one magnitude to the next is therefore the fifth root of one hundred, about 2.512, known as the Pogson ratio.[3]
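Equivalently, a magnitude difference of Δm corresponds to a brightness ratio of 100^(Δm/5). The short Python sketch below illustrates the arithmetic; the function name brightness_ratio is chosen here for illustration and is not part of the original article.

    # Pogson relation: a difference of 5 magnitudes corresponds to a
    # brightness ratio of exactly 100, so one magnitude step is the
    # fifth root of 100, about 2.512.
    def brightness_ratio(delta_magnitude):
        """Return how many times brighter the lower-magnitude object is,
        given the magnitude difference between two objects."""
        return 100 ** (delta_magnitude / 5)

    print(brightness_ratio(1))  # about 2.512, the Pogson ratio
    print(brightness_ratio(5))  # 100.0, e.g. first vs. sixth magnitude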

Examples

  • Naked eye limit with a dark sky: +6
  • Visual limit of a 12" telescope: +14
  • Visual limit of a 200" telescope: +20
  • Sirius: –1.5
  • Venus: –4.4
  • Full moon: –12.5
  • The Sun: –26.7
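
As a rough back-of-the-envelope check (not from the original article), the Pogson relation applied to the example magnitudes above gives, for instance:

    # Back-of-the-envelope comparison using the example magnitudes above.
    sun, full_moon, sirius = -26.7, -12.5, -1.5

    def times_brighter(m_bright, m_faint):
        # Brightness ratio from the Pogson relation, 100 ** (delta_m / 5).
        return 100 ** ((m_faint - m_bright) / 5)

    print(times_brighter(sun, sirius))     # roughly 1.2e10
    print(times_brighter(sun, full_moon))  # roughly 4.8e5

On these figures the Sun appears on the order of ten billion times brighter than Sirius and about half a million times brighter than the full moon.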

References

  1. http://skytonight.com/howto/basics/3304516.html
  2. Bible, 1 Corinthians 15:41 (New International Version). Bible Gateway.
  3. http://www.astro.ufl.edu/~guzman/ast4402/lectures/lect06.html