Friday, 23 October 2009

How do you figure out the magnitude of stars?

You must remember that there are two kinds: absolute magnitude (measured as if from a fixed distance of 10 pc) and apparent magnitude (measured from Earth).
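
As a side note, the two are linked by the standard distance modulus, $M = m - 5\log_{10}(d/10\,\mathrm{pc})$, where $d$ is the star's distance; knowing any two of $m$, $M$ and $d$ gives you the third.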



Historically, magnitudes ranged from first to sixth and were assigned by eye by Ptolemy. His first magnitude was the equivalent of saying "first size", so it covered the brightest stars, while sixth magnitude was what he assigned to the dimmest stars visible to the naked eye.



Nowadays we use roughly the same scale, but with real numbers, ranging from -26.74 (the Sun) upwards. The star Vega is defined to have apparent magnitude exactly 0.0 (and colour 0.0 in all filters, for that matter), and then we use the formula



$m_1 - m_{\mathrm{ref}} = -2.5 \log_{10}\frac{I_1}{I_{\mathrm{ref}}}$



to define every other apparent magnitude from that reference by measuring intensities.
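
As a rough sketch of how that formula gets used in practice (the intensity numbers below are made-up placeholders, not real measurements), in Python:

import math

def apparent_magnitude(intensity, ref_intensity=1.0, ref_magnitude=0.0):
    # m = m_ref - 2.5 * log10(I / I_ref); Vega is the reference with m_ref = 0.0
    return ref_magnitude - 2.5 * math.log10(intensity / ref_intensity)

# A star measured to be 100 times fainter than Vega comes out 5 magnitudes fainter:
print(apparent_magnitude(0.01))  # 5.0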



So, to answer your question: yes, we measure apparent magnitudes rather accurately. For stars other than the Sun, they range from -1.46 for Sirius up to about +8 on a perfect night for a trained naked eye (typically +6 for a normal eye on a normal night far from city lights).
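
As a quick worked example of what that range means: going from Sirius at $m = -1.46$ to the naked-eye limit at $m = +6$ is a difference of 7.46 magnitudes, which by the formula above is an intensity ratio of $10^{0.4\times 7.46}\approx 960$, so Sirius is roughly a thousand times brighter than the faintest stars you can see.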



Reference: http://en.wikipedia.org/wiki/Magnitude_(astronomy)
