
The Magnitude Scale

By Tim Trott, Monday 21st April 2008 in Astronomy Basics, History of Astronomy

One of the fundamental concepts in astronomy is the Magnitude Scale. You cannot get far in astronomy without hearing a reference to it, even on the television weather forecast when clear skies are predicted. In this article we will look at the Magnitude Scale and what it means to you.

The brightness of an object is a basic observable quantity. It is easy to observe two stars and say that star A is brighter than star B, but it would be handy if we had a way of quantifying this brightness so we can say that star A is x times as bright as star B. To this end the Magnitude Scale was introduced.

History

The Greek mathematician Hipparchus is widely credited with originating the magnitude scale, but it was Ptolemy who popularised it and brought it into the mainstream.

In his original scale, only naked-eye objects were categorised (excluding the Sun): the brightest stars were classified as magnitude 1 and the faintest objects were magnitude 6, the limit of the human eye. Each step in magnitude was considered to be twice the brightness of the one before, so magnitude 2 objects are twice as bright as magnitude 3 objects. This makes it a logarithmic magnitude scale.

With the invention of the telescope and other observational aids the number of known objects soared, and the system needed modification to categorise so many new objects accurately. In 1856 Norman Robert Pogson formalised the magnitude scale by defining a first magnitude object as exactly 100 times brighter than a sixth magnitude object. Each step of one magnitude therefore corresponds to a brightness ratio of the fifth root of 100, so a first magnitude star is 2.512 times brighter than a second magnitude star.
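As a quick illustration of Pogson's definition, here is a minimal Python sketch (the function and variable names are my own, purely for illustration) that turns a magnitude difference into a brightness ratio:

```python
# A minimal sketch of Pogson's ratio: a difference of one magnitude
# corresponds to a brightness ratio of 100 ** (1/5), roughly 2.512.

def brightness_ratio(mag_faint, mag_bright):
    """How many times brighter the brighter object is than the fainter one."""
    delta_m = mag_faint - mag_bright
    return 100 ** (delta_m / 5)

print(brightness_ratio(2, 1))  # one magnitude apart   -> about 2.512
print(brightness_ratio(6, 1))  # five magnitudes apart -> exactly 100
```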

Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they switched first to Vega as the standard reference star, and later to tabulated zero points for the measured fluxes. This is the system used today.

A few common visual magnitudes

Two Magnitude Scales

Going back to star A and star B, let's say that star A is magnitude 2 and star B is magnitude 3. According to the magnitude scale, star A appears 2.512 times as bright as star B. Here we are referring to the star's Apparent Magnitude, that is, its brightness as seen from Earth. This is how most magnitudes are presented on TV, in planetarium software and in magazines.

But how do we know that star A is actually brighter than star B? It is entirely possible for star A and star B to have the same luminosity, with star B simply lying further away than star A and therefore appearing dimmer to us from Earth.

We need another scale which compares the actual brightness of stars as if they were at a fixed distance from the Earth. This scale is called the Absolute Magnitude, and the fixed distance is set at an internationally agreed 10 parsecs. A parsec is the distance from the Earth to an astronomical object which has a parallax angle of one arcsecond (1⁄3,600 of a degree). We will cover parallax in another article, but for now 1 parsec is equal to 3.26 light-years or 1.92 × 10¹³ miles.
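As a quick sanity check on that figure, here is a hypothetical snippet; the constant of roughly 5.88 × 10¹² miles per light-year is the standard conversion:

```python
# Rough conversion check: 1 parsec is about 3.26 light-years,
# and 1 light-year is roughly 5.88e12 miles.
LIGHT_YEARS_PER_PARSEC = 3.26
MILES_PER_LIGHT_YEAR = 5.88e12

miles_per_parsec = LIGHT_YEARS_PER_PARSEC * MILES_PER_LIGHT_YEAR
print(f"{miles_per_parsec:.2e} miles")  # about 1.92e13 miles
```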

Absolute Magnitude is given the symbol M, while Apparent Magnitude is given lowercase m.

Our Sun has an apparent magnitude of -26.73, which easily makes it the brightest object visible in the sky. However, the Sun would not be nearly as bright if it were 10 parsecs away: at that distance it would shine at an apparent magnitude of only about 4.8, so it would be quite faint in the night sky. The Sun's magnitude at 10 parsecs is called its Absolute Magnitude.

Sirius, the next brightest star in the sky, has an apparent magnitude of -1.47, but it lies only 2.64 parsecs away, so it is relatively close. If it were moved to the standard 10 parsecs it would have an absolute magnitude of about 1.4, making it around 23 times brighter than our Sun at the same distance.
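To see where those figures come from, here is a hedged Python sketch (the helper name is my own; the distances and magnitudes are the ones quoted above) that applies the distance modulus, M = m − 5 log₁₀(d / 10 pc):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Distance modulus: M = m - 5 * log10(d / 10 parsecs)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# The Sun: apparent magnitude -26.73 at 1 AU, which is about 4.848e-6 parsecs.
sun_M = absolute_magnitude(-26.73, 4.848e-6)
print(f"Sun:    M = {sun_M:.2f}")    # roughly 4.8

# Sirius: apparent magnitude -1.47 at a distance of 2.64 parsecs.
sirius_M = absolute_magnitude(-1.47, 2.64)
print(f"Sirius: M = {sirius_M:.2f}")  # roughly 1.4

# How many times brighter is Sirius than the Sun at the same distance?
print(f"Ratio: {100 ** ((sun_M - sirius_M) / 5):.0f}x")  # around 23x
```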

Here's a quick way of remembering the difference between absolute and apparent magnitude:

Apparent magnitude appears to be brightest, Absolute magnitude absolutely is the brightest.

Here is a more technical comparison between apparent and absolute magnitude which looks at the mathematics behind the magnitude scale and how the distance modulus affects them.


About the Author

Tim Trott

Tim is a professional software engineer, designer, photographer and astronomer from the United Kingdom. You can follow him on Twitter to get the latest updates.