Star
The visual magnitude system is defined as follows: a star of any given magnitude is about 2.512 times as bright as a star of the next fainter magnitude. [[Hipparchus]] devised the magnitude system, and [[Ptolemy]] refined it further. By convention, the twenty or so brightest stars that they could observe were assigned to the first magnitude, and the stars that they could barely observe were assigned to the sixth. Sixth-magnitude stars are one hundredth as bright as first-magnitude stars, and magnitude levels between these extremes are assigned on a logarithmic scale. Thus, given two stars of brightness l<sub>1</sub> and l<sub>2</sub>, their magnitude difference (V<sub>2</sub> - V<sub>1</sub>) relates to their respective brightnesses in this way:<ref name=Haworth>Haworth, David. "[http://www.stargazing.net/david/constel/magnitude.html Star Magnitudes]." ''[http://www.stargazing.net/david/index.html Observational Astronomy]'', 2003. Accessed April 21, 2008.</ref>
<math>\,\!V_2 - V_1 = 2.5 \times \log_{10} \frac{l_1}{l_2}</math>
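As a check on this relation, here is a minimal sketch in Python (the function name is illustrative, not from the source):

<syntaxhighlight lang="python">
import math

def magnitude_difference(l1, l2):
    """Return V2 - V1, the magnitude difference between two stars
    of brightness l1 and l2 (the brighter star has the lower magnitude)."""
    return 2.5 * math.log10(l1 / l2)

print(magnitude_difference(100.0, 1.0))   # 5.0: a 100-fold brightness ratio spans 5 magnitudes
print(magnitude_difference(2.512, 1.0))   # ~1.0: one magnitude step is a factor of about 2.512
</syntaxhighlight>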
The ''absolute'' magnitude of any star is the visual magnitude that it would have at a distance of ten parsecs. To convert apparent magnitude V to absolute magnitude M, use this formula:
<math>\,\!M = V + 5 \times \log_{10} \frac{s_0}{s}</math>
where s is the star's actual distance and s<sub>0</sub> is the standard distance of ten parsecs, or about 2,062,650 AU.
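A minimal Python sketch of this conversion, assuming the distance s is given in parsecs (the names are illustrative); applying it to the Sun recovers its familiar absolute magnitude of about 4.83:

<syntaxhighlight lang="python">
import math

AU_PER_PARSEC = 206_264.8  # astronomical units in one parsec

def absolute_magnitude(v, s, s0=10.0):
    """Convert apparent magnitude v of a star at distance s (in parsecs)
    to absolute magnitude M, using the standard distance s0 = 10 pc."""
    return v + 5.0 * math.log10(s0 / s)

# The Sun has apparent magnitude -26.74 at a distance of 1 AU.
sun_distance_pc = 1.0 / AU_PER_PARSEC
print(absolute_magnitude(-26.74, sun_distance_pc))  # ~4.83
</syntaxhighlight>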