Magnitude
In astronomy it is common to use the concept of Apparent Magnitude, the measure that expresses the brightness of a star as we see it.
The magnitude scale has a historical origin but has been redefined in the modern era. It is explained in detail in our eBook Magnitude & Distance.
In summary, apparent magnitude (symbol m):
- indicates the brightness of a star as we see it
- is a scale that runs opposite to brightness (the smaller the magnitude, the brighter the star)
- is centred at the apparent magnitude of Vega (m = 0)
- allows negative values for bright stars
- is a logarithmic scale in which five units in magnitude correspond to a factor of 100 in brightness
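The last point can be written out as a formula. As a minimal sketch, using F1 and F2 for the observed brightnesses (fluxes) of two stars with magnitudes m1 and m2 (these symbols are introduced here for illustration and do not appear elsewhere in this section):

```latex
% Magnitude difference versus brightness (flux) ratio:
% five magnitudes correspond to a factor of 100 in brightness.
\[
  \frac{F_2}{F_1} = 100^{\,(m_1 - m_2)/5} = 10^{\,0.4\,(m_1 - m_2)}
\]
```

So a star of magnitude 1 is 100 times brighter than a star of magnitude 6, and about 2.512 times brighter than a star of magnitude 2.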
Absolute magnitude (symbol M) uses the same scale as apparent magnitude and is defined as the apparent magnitude a star would have if it were located at a standard distance of 10 pc from Earth.
As an example, the Sun has an apparent magnitude m = -26.73 but an absolute magnitude M = +4.75; that is the magnitude we would see if the Sun were at a distance of 10 pc. Because the standard distance is the same for every star, absolute magnitude is directly related to the luminosity of a star and is a way to express that luminosity on the magnitude scale.
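One way to make that link explicit, as a sketch using L1 and L2 for the luminosities of two stars (symbols introduced here for illustration): since every star is imagined at the same 10 pc, a difference in absolute magnitude translates directly into a ratio of luminosities.

```latex
% Difference in absolute magnitude <-> ratio of luminosities
% (follows from the "5 magnitudes = factor 100" rule applied at a fixed 10 pc)
\[
  M_1 - M_2 = -2.5 \log_{10}\!\left(\frac{L_1}{L_2}\right)
  \qquad\Longleftrightarrow\qquad
  \frac{L_1}{L_2} = 10^{\,0.4\,(M_2 - M_1)}
\]
```

For example, a star with M = -0.25 would be 100 times more luminous than the Sun (M = +4.75).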
Now if:
- we have a measure of the absolute magnitude (M) of a star and
- we compare that with the apparent magnitude (m),
then we can use the Inverse Square Law to find the distance.
The method used for this in astronomy is based on the difference (m - M), which is called the Distance Modulus.
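As a sketch of how this works (d denotes the distance in parsecs; the rearranged form for d is implied by, but not written in, the text above):

```latex
% Distance modulus: apparent minus absolute magnitude fixes the distance d (in pc)
\[
  m - M = 5 \log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right)
  \qquad\Longrightarrow\qquad
  d = 10^{\,(m - M + 5)/5}\ \mathrm{pc}
\]
```

As a rough check with the Sun's values quoted above (m = -26.73, M = +4.75), this gives d of about 5 x 10^-6 pc, which is roughly 1 AU, as expected.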