Francis raised her hand. "In astronomy, brightness is measured by the apparent magnitude scale. This is a system that uses the stars themselves as standards rather than energy units. The magnitude scale was developed by the Ancient Greek astronomer Hipparchus around 120 BC. It was originally a rough visual system: the brightest stars were said to be of the first magnitude, the next brightest were second magnitude stars, and so on down to sixth magnitude stars at the limit of naked-eye visibility. Note that the brighter stars are associated with the smaller magnitudes. There are five magnitude steps between 1 and 6, and in the modern definition a first magnitude star is exactly 100 times brighter than a sixth magnitude star. Since 100 is 10^2, a little mathematics shows that each whole number value on the magnitude scale differs from the next by a factor of 10^(2/5) (the fifth root of 10 squared), which is roughly 2.512. In other words, a first magnitude star is 2.512 times brighter than a second magnitude star, a second magnitude star is 2.512 times brighter than a third magnitude star, and so on. A difference of two magnitudes corresponds to a factor of 2.512 x 2.512, approximately 6.310, so a first magnitude star is 6.310 times brighter than a third magnitude star."
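To make the arithmetic concrete, here is a minimal Python sketch of the ratio Francis describes; the function name brightness_ratio is an illustrative label rather than anything from the discussion.

    def brightness_ratio(magnitude_difference):
        """How many times brighter one star is than another, given the
        difference between their apparent magnitudes."""
        # Five magnitude steps correspond to a factor of 100, so each
        # single step corresponds to a factor of 100 ** (1/5), about 2.512.
        return 100 ** (magnitude_difference / 5)

    print(brightness_ratio(1))  # ~2.512  (first vs. second magnitude)
    print(brightness_ratio(2))  # ~6.310  (first vs. third magnitude)
    print(brightness_ratio(5))  # 100.0   (first vs. sixth magnitude)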
Francis continued, "Modern stellar magnitudes are often given to two decimal places. For example, the magnitude of the star Deneb is given as 1.25, and Aldebaran has a magnitude of 0.85. When the magnitudes of stars were measured accurately using this new definition, some stars were found to be brighter than first magnitude. Arcturus, for example, has a magnitude of -0.05, and Sirius, the brightest star in the night sky, has a magnitude of -1.46."
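The same relation can be applied to the measured values Francis quotes, for example to compare Sirius with Deneb; the variable names below are just illustrative.

    # Apparent magnitudes quoted above: smaller (more negative) is brighter.
    sirius = -1.46
    deneb = 1.25

    # Subtract the brighter star's magnitude from the fainter one's,
    # then convert the magnitude difference into a brightness ratio.
    ratio = 100 ** ((deneb - sirius) / 5)
    print(f"Sirius is about {ratio:.1f} times brighter than Deneb.")  # ~12.1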