Revision as of 15:45, May 18, 2008
Star magnitude refers to the apparent brightness of stars.
History
Stars were first classified by brightness by Hipparchus around 129 BC. He produced the first well-known star catalog, in which the brightest stars were listed as "first magnitude", the next group as "second magnitude", and so on down to sixth magnitude.[1]
Around 140 AD, Claudius Ptolemy added the qualifiers "greater" and "smaller" within each magnitude class to distinguish stars further.
With Galileo and the invention of the telescope, it became possible to observe stars much dimmer than sixth magnitude.
Today
In 1856, Norman R. Pogson proposed and codified magnitude as a logarithmic scale on which a difference of five magnitudes (for example, between first and sixth magnitude) corresponds to a brightness ratio of exactly one hundred. The ratio from each magnitude to the next is therefore the fifth root of one hundred, about 2.512, known as the Pogson ratio.[2]
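The Pogson scale can be sketched in a few lines of Python. The function name below is illustrative, not from any standard library; it simply evaluates the ratio 100^(Δm/5) implied by the definition above.

```python
def brightness_ratio(m_bright, m_dim):
    """Brightness ratio between two objects given their apparent magnitudes.

    Lower magnitude means brighter; each step of 5 magnitudes is a
    factor of exactly 100 in brightness (Pogson's definition).
    """
    return 100 ** ((m_dim - m_bright) / 5)

# One magnitude step is the Pogson ratio, about 2.512:
print(brightness_ratio(1, 2))

# The Sun (-26.7) compared with the full moon (-12.5),
# a difference of 14.2 magnitudes:
print(brightness_ratio(-26.7, -12.5))
```

Using the example values listed below, the 14.2-magnitude gap between the Sun and the full moon works out to a brightness ratio of 100^(14.2/5), on the order of several hundred thousand.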
Examples
- Naked eye limit with a dark sky: +6
- Visual limit of a 12" telescope: +14
- Visual limit of a 200" telescope: +20
- Sirius: –1.5
- Venus: –4.4
- Full moon: –12.5
- The Sun: –26.7