An astronaut on the Moon could only be seen by reflecting the Sun's light towards Earth. Stars, on the other hand, emit their own light.
To first order, the solar flux incident upon the Moon is the same as that at the Earth: about 1.4 kW/m$^{2}$.
Let us assume that an astronaut is perfectly reflective and that the relevant reflective area that we can see from the Earth is 1 m$^2$. NB: If the astronaut is not lit up by the Sun, then there is obviously no way that they can be seen.
Treating the astronaut on the Moon as an isotropic point source emitter of reflected light, we have a light source of power 1.4 kW at a distance of 400,000 km. The flux at the Earth is therefore $7\times10^{-16}$ W m$^{-2}$.
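If you want to check that number yourself, here is a quick sketch of the inverse-square arithmetic (the 1.4 kW and 400,000 km figures are just the values assumed above):

```python
import math

P_reflected = 1.4e3   # W: 1.4 kW/m^2 of sunlight over the assumed 1 m^2 reflective area
d_moon = 4.0e8        # m: Earth-Moon distance (~400,000 km)

# Isotropic point source: the reflected power spreads over a sphere of radius d
flux_astronaut = P_reflected / (4 * math.pi * d_moon**2)
print(f"{flux_astronaut:.1e} W/m^2")  # ~7.0e-16
```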
How does that compare with starlight? Well, the total luminosity of the Sun is $3.8\times10^{26}$ W and its absolute magnitude is 4.8, i.e. it would have apparent magnitude 4.8 if placed at 10 pc. Move it out to about 20 pc and the distance modulus adds another $5\log_{10}(2)\approx1.5$ magnitudes, taking it to about 6.3: roughly as faint as the faintest naked-eye star in the sky. The flux received at the Earth from such a star would be $8\times 10^{-11}$ W m$^{-2}$, about 100,000 times brighter than the astronaut.
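The same inverse-square arithmetic for the star, with a distance-modulus sanity check thrown in (the parsec conversion and round numbers are the ones used above):

```python
import math

L_sun = 3.8e26    # W: solar luminosity
pc = 3.086e16     # m per parsec
d_star = 20 * pc  # Sun-like star placed at ~20 pc

flux_star = L_sun / (4 * math.pi * d_star**2)
print(f"{flux_star:.0e} W/m^2")  # ~8e-11

# Sanity check via the distance modulus: m = M + 5*log10(d / 10 pc)
m = 4.8 + 5 * math.log10(20 / 10)
print(f"apparent magnitude: {m:.1f}")  # ~6.3, near the naked-eye limit

print(f"star/astronaut flux ratio: {flux_star / 7e-16:.0e}")  # ~1e5
```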
No need to worry about the resolution of the eye, since both the star and the astronaut (at the distance of the Moon) are unresolved points.
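For the sceptical, a rough comparison of angular sizes against the eye's roughly one-arcminute resolution (the 1 m astronaut size and a Sun-sized star are assumptions for illustration):

```python
import math

eye_resolution = math.radians(1 / 60)  # ~1 arcminute, in radians

theta_astronaut = 1.0 / 4.0e8          # 1 m object at the Moon's distance
theta_star = 1.4e9 / (20 * 3.086e16)   # Sun-sized star (1.4e9 m) at 20 pc

print(theta_astronaut / eye_resolution)  # ~9e-6: far below the eye's resolution
print(theta_star / eye_resolution)       # ~8e-6: likewise unresolved
```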
Nor is there any need to go into the problem of contrast against the Moon's bright surface (which you would need to consider if the astronaut's reflective area were 100,000 times bigger): the reflected light from the astronaut is simply too faint to be seen at that distance.