*The apparent magnitude of the Sun is listed as -26.74. I want to know what is the formula used to compute this? How is this figure of -26.74 arrived at? Can this formula be employed for calculating the apparent magnitudes of stars of different spectral types too?*

The idea of "apparent magnitude" goes all the way back to the Greek astronomer Hipparchus. Basically, he looked at the stars in the sky and classified them by how bright they appear -- the brightest stars were "magnitude 1", the next brightest were "magnitude 2", etc., down to "magnitude 6", which were the faintest stars he could see. It is this basic classification from over 2,000 years ago that led to the magnitude scale we have now!

This scale seems nice and simple, but it turns out that it doesn't correspond very well to how bright the stars actually are. This is because the human eye tends to compress very big differences in brightness, making them look much smaller. Thus, you might have expected that if the magnitude scale were set up "correctly", the difference in brightness between a magnitude 1 and a magnitude 2 star would be the same as the difference in brightness between a magnitude 2 and a magnitude 3 star -- this would be a so-called *linear* brightness scale. However, when we measure how much light we receive from these stars (using precise instruments such as digital cameras, or CCDs), we actually find that the brightness *ratios* between stars of different magnitudes are constant. That is, a magnitude 2 star is around 2.5 times fainter than a magnitude 1 star, while a magnitude 3 star is around 2.5 times fainter than a magnitude 2 star and therefore 2.5 x 2.5 ≈ 6.3 times fainter than a magnitude 1 star. Going all the way down to 6th magnitude (the faintest Hipparchus could see), we find that a magnitude 6 star is around 100 times fainter than a magnitude 1 star. This type of behavior characterizes a so-called *logarithmic* brightness scale.

In the modern day, we've tweaked Hipparchus's definitions a bit to make them more precise and more convenient. Thus, we now *define* a difference of five "magnitudes" as exactly equal to a brightness ratio of 100. Furthermore, we tie the whole scale to the star Vega, which is *defined* to have a magnitude of 0 (or very close to it, at any rate -- modern, precise readjustments of the magnitude scale actually now put Vega at closer to 0.03, but that's a technical point). Finally, even though Hipparchus's scale only went from 1 to 6, we make no restrictions on how far the scale can go in either direction -- anything brighter than Vega simply has a negative number as its apparent magnitude! (An astronomy professor of mine in college used to speculate that maybe Hipparchus was only looking in one direction of the sky when he made up his magnitude scale; then he turned around and saw a brighter star like Vega and therefore said, "Whoops, I guess we'll have to make that magnitude 0...")
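To make the logarithmic behavior concrete, here is a minimal sketch in Python of the modern definition: five magnitudes correspond to a flux ratio of exactly 100, so each single magnitude step is a factor of 100^(1/5) ≈ 2.512. (The function name `flux_ratio` is my own, just for illustration.)

```python
import math

# Modern definition: 5 magnitudes = a flux ratio of exactly 100,
# so one magnitude step is a factor of 100**(1/5) ≈ 2.512.
STEP = 100 ** (1 / 5)

def flux_ratio(delta_mag):
    """How many times fainter a star is than one delta_mag magnitudes brighter."""
    return STEP ** delta_mag

print(round(flux_ratio(1), 3))   # one step: ≈ 2.512 times fainter
print(round(flux_ratio(2), 3))   # two steps: ≈ 6.31 times fainter
print(round(flux_ratio(5), 3))   # five steps: exactly 100 times fainter
```

Note how two steps give roughly the 6.3× ratio quoted above, and five steps recover Hipparchus's full range of about 100×.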

The end result of the above mess is that we get the following formulas relating the brightness and magnitudes of any two stars:

F_{2} / F_{1} = 100^{(m_{1} - m_{2}) / 5}

OR

m_{1} - m_{2} = -2.5 log_{10} ( F_{1} / F_{2} )

In the above, F_{1} and F_{2} are the amount of light (or flux) we receive from each star and would measure with a CCD camera, while m_{1} and m_{2} are the apparent magnitudes of the two stars.
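The two formulas above are inverses of each other, which a short Python sketch can verify directly (the function names here are my own, not from the original):

```python
import math

def flux_ratio_from_mags(m1, m2):
    # F2 / F1 = 100**((m1 - m2) / 5)
    return 100 ** ((m1 - m2) / 5)

def mag_difference(f1, f2):
    # m1 - m2 = -2.5 * log10(F1 / F2)
    return -2.5 * math.log10(f1 / f2)

# A star delivering 1/100 the flux is 5 magnitudes fainter:
print(mag_difference(1.0, 100.0))        # ≈ 5.0

# And a star 5 magnitudes brighter delivers 100 times the flux:
print(flux_ratio_from_mags(6.0, 1.0))    # 100.0
```

Notice the sign convention: larger magnitudes mean *fainter* stars, which is why the minus sign appears in the logarithmic form.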

So the way we find that the Sun has a visual magnitude of -26.74 is that we measure how bright it appears from Earth and observe that we receive around 51 billion times as much light from the Sun as we do from Vega. Plugging this ratio in for F_{1} / F_{2} in the above formula (and remembering that Vega's magnitude is about 0.03 rather than exactly 0) gives us a value of -26.74 for the Sun's apparent magnitude.
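This calculation can be sketched in a few lines of Python, assuming the roughly 51-billion-to-one Sun-to-Vega flux ratio quoted above and Vega's modern magnitude of about 0.03:

```python
import math

m_vega = 0.03            # Vega's modern apparent magnitude (approximate)
sun_to_vega_flux = 5.1e10  # roughly 51 billion times more light from the Sun

# m_sun - m_vega = -2.5 * log10(F_sun / F_vega)
m_sun = m_vega - 2.5 * math.log10(sun_to_vega_flux)
print(round(m_sun, 2))   # ≈ -26.74
```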

It is important to point out that apparent magnitude doesn't measure how bright objects *actually* are; it measures how bright they appear to us, which also depends on how close they are. (That's why the Sun's apparent magnitude is so extreme, even though it is really just a normal star.) We can also define something called "absolute magnitude" which measures how bright objects actually are -- it is defined as the apparent magnitude that an object *would* have if it were located at a distance of 10 parsecs from us. The Sun works out to have an absolute magnitude of 4.83.
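The Sun's absolute magnitude can be checked with the same machinery. The step below uses the "distance modulus" relation M = m - 5 log10(d / 10 pc), which isn't written out above but follows from the inverse-square law combined with the magnitude formulas: moving an object from distance d to 10 parsecs changes its flux by a factor of (d/10)^2.

```python
import math

AU_IN_PARSECS = 1 / 206265   # 1 astronomical unit expressed in parsecs

m_sun = -26.74               # apparent magnitude as seen from Earth (1 AU away)

# M = m - 5 * log10(d / 10), with d in parsecs
M_sun = m_sun - 5 * math.log10(AU_IN_PARSECS / 10)
print(round(M_sun, 2))       # ≈ 4.83, matching the value quoted above
```

Seen from 10 parsecs, the Sun would be a dim but naked-eye-visible star of about 5th magnitude.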

To answer your final question, the above formulas work for any stars, regardless of their spectral type or color. However, the fact that stars have different colors means that the brightness ratio between a given pair of stars can change depending on what color of light you are observing. Therefore, the magnitude of each star also depends on the color of the light, so we have to define exactly what type of light we're looking at when we quote a star's magnitude. In the above, we've been talking about "visual" or "V band" magnitude, which corresponds to magnitudes measured using light that is similar to the light your eyes are most sensitive to (greenish-yellow).

For some opinions on the pros and cons of the apparent magnitude system (despite its complexity, there are more pros than you might think), have a look at this essay by Steve White from Kitt Peak National Observatory.

*This page updated on June 27, 2015*