<sect1 id="ai-magnitude">
<sect1info>
<author><firstname>Girish</firstname> <surname>V</surname> </author>
</sect1info>
<title>Magnitude Scale</title>
<indexterm><primary>Magnitude Scale</primary>
<seealso>Flux</seealso> <seealso>Star Colours and Temperatures</seealso> </indexterm>
<para>2500 years ago, the ancient Greek astronomer Hipparchus classified the brightness of the visible stars in the sky on a scale from 1 to 6. He called the very brightest stars <quote>first magnitude</quote>, and the very faintest stars he could see <quote>sixth magnitude</quote>. Amazingly, two and a half millennia later, Hipparchus's classification scheme is still widely used by astronomers, although it has since been modernised and quantified.</para>
<note><para>The magnitude scale runs backwards from what you might expect: brighter stars have <emphasis>smaller</emphasis> magnitudes than fainter stars. </para>
</note>
<para>The modern magnitude scale is a quantitative measurement of the <firstterm>flux</firstterm> of light coming from a star, with a logarithmic scaling: </para>

<para>m = m_0 - 2.5 log (F / F_0) </para>

<para>If you do not understand the maths, this just says that the magnitude of a given star (m) differs from that of some standard star (m_0) by 2.5 times the logarithm of their flux ratio. The factor of 2.5 in front of the logarithm means that a flux ratio of 100 corresponds to a magnitude difference of 5; so, a 6th magnitude star is 100 times fainter than a 1st magnitude star. The reason Hipparchus's simple classification translates to a relatively complex function is that the human eye responds logarithmically to light. </para>

<para>There are several different magnitude scales in use, each of which serves a different purpose. The most common is the apparent magnitude scale; this is simply a measure of how bright stars (and other objects) look to the human eye. The apparent magnitude scale defines the star Vega to have magnitude 0.0, and assigns magnitudes to all other objects using the above equation and a measurement of each object's flux ratio to Vega. </para>

<para>It is difficult to understand stars using apparent magnitudes alone. Imagine two stars in the sky with the same apparent magnitude, so that they appear to be equally bright. You cannot know just by looking whether the two have the same <emphasis>intrinsic</emphasis> brightness; it is possible that one star is intrinsically brighter, but further away. If we knew the distances to the stars (see the <link linkend="ai-parallax">parallax</link> article), we could account for their distances and assign <firstterm>absolute magnitudes</firstterm>, which would reflect their true, intrinsic brightness. The absolute magnitude is defined as the apparent magnitude the star would have if observed from a distance of 10 parsecs (1 parsec is 3.26 light-years, or 3.1 x 10^18 cm). The absolute magnitude (M) can be determined from the apparent magnitude (m) and the distance in parsecs (d) using the formula: </para>

<para>M = m + 5 - 5 log(d) (note that M = m when d = 10 parsecs). </para>

<para>The modern magnitude scale is no longer based on the human eye; it is based on photographic plates and photoelectric photometers. With telescopes, we can see objects much fainter than Hipparchus could see with his unaided eyes, so the magnitude scale has been extended beyond 6th magnitude. In fact, the Hubble Space Telescope can image stars nearly as faint as 30th magnitude, which is one <emphasis>trillion</emphasis> times fainter than Vega. </para>

<para>One further note: the magnitude is usually measured through a colour filter of some kind, and these magnitudes are denoted by a subscript describing the filter (&eg;, m_V is the magnitude through a <quote>visual</quote> filter, which is greenish; m_B is the magnitude through a blue filter; m_pg is the photographic plate magnitude, &etc;). </para>
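<para>To put some numbers to the formulae above, here is a short Python sketch (an illustration only, not part of the KStars code) that converts between flux ratios and magnitude differences. It reproduces the two worked examples from this section: a flux ratio of 100 is 5 magnitudes, and a 30th-magnitude star is one trillion times fainter than Vega: </para>

<programlisting>
import math

def magnitude_difference(flux_ratio):
    """Magnitude difference corresponding to a given flux ratio."""
    return 2.5 * math.log10(flux_ratio)

def flux_ratio(delta_m):
    """Flux ratio corresponding to a given magnitude difference."""
    return 10 ** (delta_m / 2.5)

print(magnitude_difference(100))  # 5.0: a flux ratio of 100 is 5 magnitudes
print(flux_ratio(30 - 0))         # 1e12: a 30th-magnitude star is one
                                  # trillion times fainter than Vega (mag 0.0)
</programlisting>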
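<para>Similarly, a minimal sketch of the absolute magnitude formula, checked against two cases: the reference distance of 10 parsecs (where M = m by definition), and Vega, which lies at a distance of roughly 7.7 parsecs: </para>

<programlisting>
import math

def absolute_magnitude(m, d_parsecs):
    """Absolute magnitude from apparent magnitude m and distance d in
    parsecs, using M = m + 5 - 5 log(d)."""
    return m + 5 - 5 * math.log10(d_parsecs)

print(absolute_magnitude(0.0, 10.0))  # 0.0: M equals m at 10 parsecs
print(absolute_magnitude(0.0, 7.7))   # about 0.57: Vega's absolute magnitude
</programlisting>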
</sect1>