A number specifying how bright a star looks, without correcting for its distance or other factors. Because different stars lie at different distances, it is not a good measure of a star's true brightness. Also see absolute magnitude.
A measure of a star's (or other celestial object's) brightness as measured from Earth. Originally developed as a six-point scale by Hipparchus, now extended and open ended. Sirius, the brightest star in the night sky, has an apparent magnitude of -1.47, whilst the faintest detectable by the naked eye is magnitude 6.
The brightness of a body, as it appears to the observer, measured on a standard magnitude scale. It is a function of the luminosity and distance of the object, and the transparency of the medium through which it is observed.
This scale was first developed by Hipparchus (160-127 BC), whose system assigned the visible stars a number from 1 to 6, with magnitude 1 the brightest and magnitude 6 the dimmest. The scale was later refined so that each magnitude is about 2.5 times as bright as the next dimmest (more precisely, the fifth root of 100, about 2.512, so that a difference of five magnitudes corresponds to a brightness factor of exactly 100). This kept close to the initial scale although, since the invention of telescopes, we have been able to see much fainter stars with magnitudes well above 6. On the modern scale, some stars even have negative magnitudes: the brightest star, Sirius, has magnitude -1.42, and the planets also often have negative magnitudes. The above definition is strictly a definition of apparent visual magnitude, that is, a measure of light intensity at visual wavelengths. Another sort of magnitude is apparent bolometric magnitude, which would be the value if all wavelengths of light were considered. See also absolute magnitude.
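The refined scale described above is logarithmic, and the arithmetic can be sketched numerically. A minimal Python illustration, assuming the modern convention that a difference of five magnitudes equals a brightness factor of exactly 100 (the function name is only for this sketch):

```python
# Brightness ratio implied by a magnitude difference, using the modern
# convention: 5 magnitudes = a factor of exactly 100 in brightness.

def brightness_ratio(m_faint, m_bright):
    """Return how many times brighter the m_bright object is than m_faint."""
    return 100 ** ((m_faint - m_bright) / 5)

# One magnitude step: the fifth root of 100, about 2.512.
print(round(brightness_ratio(2.0, 1.0), 3))

# Five magnitude steps: a factor of exactly 100.
print(brightness_ratio(6.0, 1.0))
```

A single step of one magnitude therefore comes out as roughly 2.512, which is why the rounded value 2.5 appears in informal statements of the scale.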
The brightness of a celestial object as measured on the magnitude scale. The apparent magnitude of an object can vary depending on the method used in the observation. Astronomers use CCD cameras with filters designed to pass blue (B), visual (V), red (R) and infrared (I) radiation. Thus, when talking about the apparent magnitude of an object, it is important to know the wavelength range in which the object was observed. For example, many galaxies have been measured for B magnitude. However, since galaxies include stars across the full range of the color spectrum, their V magnitudes are often about one magnitude brighter than their B magnitudes.
A system used to compare the apparent brightness of celestial objects. The lower an object's apparent magnitude, the brighter it is. A change in magnitude of 1.0 corresponds to a change in brightness by a factor of about 2.5 (precisely, the fifth root of 100, about 2.512). Objects with a magnitude of 6.0 represent the approximate limit of what can be seen with the naked eye under good observing conditions.
The system used to give the brightness of stars in the sky. Brighter stars have lower numbers and dimmer stars have higher numbers. The dimmest objects visible with giant telescopes have a magnitude of +30. A good portable telescope might see down to magnitude +15. Binoculars can see down to magnitude +9 and the faintest naked eye stars have a magnitude of +6. Very bright objects have a negative magnitude, the brightest star has a magnitude of -1.4, the full Moon has a magnitude of -12.7 and the noon Sun has a magnitude of -26.8.
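The limiting magnitudes quoted above span an enormous range of brightness. A quick illustrative Python calculation, using the standard relation that five magnitudes equal a factor of 100 (the variable names are only for this sketch):

```python
# How much fainter are the dimmest giant-telescope targets (+30) than the
# faintest naked-eye stars (+6)? Standard relation: ratio = 100 ** (diff / 5).

naked_eye_limit = 6.0    # faintest star visible to the naked eye
telescope_limit = 30.0   # faintest objects visible with giant telescopes

ratio = 100 ** ((telescope_limit - naked_eye_limit) / 5)
print(f"{ratio:.3g}")  # about four billion times fainter
```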
The magnitude of a star or other celestial body as measured from Earth. Apparent magnitude depends upon the intrinsic brightness of the object and on its distance; that is, nearby objects appear brighter than more distant objects of the same intrinsic brightness. See also absolute magnitude.
The apparent magnitude (m) of a star, planet or other celestial body is a measure of its apparent brightness as seen by an observer on Earth. The brighter the object appears, the lower the numerical value of its magnitude.