Calculate star luminosity and magnitude with this astronomy tool
Star luminosity is the total energy a star emits per unit time; it is a measure of the star's intrinsic brightness. Magnitude, by contrast, measures how bright a star appears from Earth. There are two kinds of magnitude: apparent magnitude (how bright the star looks from Earth) and absolute magnitude (how bright it would look at a standard distance).
The luminosity of a star is related to its surface temperature and radius. The Stefan-Boltzmann law states that the total energy radiated per unit surface area of a blackbody across all wavelengths per unit time (also known as the blackbody's emissive power) is proportional to the fourth power of the blackbody's temperature [1]. Astronomers use this law to calculate a star's luminosity.
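The Stefan-Boltzmann relation above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual code; the function name and the solar values used in the example are assumptions.

```python
import math

# Stefan-Boltzmann constant, in W m^-2 K^-4
SIGMA = 5.670374419e-8

def luminosity(radius_m: float, temp_k: float) -> float:
    """Total power radiated by a spherical blackbody: L = 4 * pi * R^2 * sigma * T^4."""
    return 4 * math.pi * radius_m**2 * SIGMA * temp_k**4

# Example: the Sun (R ≈ 6.957e8 m, T_eff ≈ 5772 K)
L_sun = luminosity(6.957e8, 5772)  # ≈ 3.8e26 W
```

The fourth-power dependence on temperature is why a modestly hotter star can be dramatically more luminous at the same radius.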
Magnitude is measured on a logarithmic scale, and the scale is inverted: brighter objects have lower (even negative) magnitudes, and a difference of 5 magnitudes corresponds to a factor of 100 in brightness. The absolute magnitude of a star is defined as the apparent magnitude it would have if it were at a standard distance of 10 parsecs from Earth [2].
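The definition of absolute magnitude translates directly into the distance-modulus formula, sketched below. The function name is hypothetical, and the Sirius figures in the example are approximate published values used only for illustration.

```python
import math

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Convert apparent to absolute magnitude: M = m - 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Example: Sirius, apparent magnitude m ≈ -1.46 at d ≈ 2.64 pc
M_sirius = absolute_magnitude(-1.46, 2.64)  # ≈ 1.43
```

Note that for stars closer than 10 parsecs, such as Sirius, the absolute magnitude is larger (fainter) than the apparent magnitude.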
| Parameter | Value |
|---|---|
| Luminosity (L/L☉) | 1.23 |
| Absolute Magnitude | 4.56 |
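The two quantities in the table are linked: given a luminosity in solar units, the absolute magnitude follows from the Sun's own absolute magnitude. A minimal sketch, assuming the Sun's absolute visual magnitude of about 4.83 as the reference point (the exact result depends on whether visual or bolometric magnitudes are used):

```python
import math

M_SUN = 4.83  # assumed absolute visual magnitude of the Sun

def abs_mag_from_luminosity(l_ratio: float) -> float:
    """M = M_sun - 2.5 * log10(L / L_sun); brighter stars get lower magnitudes."""
    return M_SUN - 2.5 * math.log10(l_ratio)

# Using the luminosity ratio from the table above
M = abs_mag_from_luminosity(1.23)  # ≈ 4.6
```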