The brightness of an astronomical body. The apparent magnitude is the brightness of an object as seen from Earth, while the absolute magnitude is the apparent magnitude the object would have if placed at a standard distance of 10 parsecs (32.6 light-years) from Earth. Absolute magnitude is used to compare the intrinsic brightness of stars and other astronomical objects.

The magnitude scale is an inverse logarithmic scale, which means the lower the number, the brighter the object. For example, the Sun has an apparent magnitude of -26.7, the full Moon has an apparent magnitude of -12.7, and Sirius, the brightest star in the night sky, has an apparent magnitude of -1.5.

The magnitude scale has been constructed so that a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100. Each single magnitude step therefore corresponds to a brightness factor of the fifth root of 100, about 2.512.
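This relationship can be sketched in a few lines of Python. The helper name `brightness_ratio` is illustrative, not a standard library function; it simply applies the rule that a magnitude difference of Δm corresponds to a brightness ratio of 100^(Δm/5).

```python
def brightness_ratio(m1: float, m2: float) -> float:
    """Return how many times brighter object 1 is than object 2,
    given their magnitudes m1 and m2 (lower magnitude = brighter)."""
    return 100 ** ((m2 - m1) / 5)

# A 5-magnitude difference is exactly a factor of 100:
print(brightness_ratio(0.0, 5.0))  # 100.0

# Sun (-26.7) vs. full Moon (-12.7): 14 magnitudes apart,
# so the Sun appears about 400,000 times brighter.
print(brightness_ratio(-26.7, -12.7))
```

Because the scale is logarithmic, magnitude differences multiply: a 14-magnitude gap is 100^(14/5), roughly 398,000.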