
The supermoon

Doug Kerr

Well-known member
Last night, 2012.05.05 (in the US time zones), an astronomical event often popularly called a "supermoon" occurred. There are many misunderstandings about this event. (The actual time was 2012.05.06 0335 UTC.)

The event was that a lunar perigee (the closest approach of the moon to the earth on some lunar orbital cycle) almost precisely coincided in time with the full moon.

Thus, we have a situation in which the visual size of the moon (its subtended angle) at full moon is greater than for other full moons (when the moon is not at perigee - is a little farther away).

What made this particular event so "special" is that the times of the lunar perigee and of the full moon were so nearly identical - differing by only about two minutes. (To the nearest minute, perigee was at 0334 UTC, full moon at 0336 UTC.)

This event did not represent the closest approach in many years of the moon to the earth, as we often hear stated. The perigee distance varies from lunar cycle to cycle, owing to the gravitational interaction between (mostly) the earth, moon, and sun. In 2011, for example, there were times at which the moon was closer to the earth than last night, but none of those times so closely coincided with the time of a full moon.

Basically, the subtended visual angle (apparent size) of the (entire) moon is about 6% greater at perigee than the average over a typical lunar cycle. When we heard that the moon was "14% bigger than normal", that referred to its subtended solid angle (its apparent area).
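The relationship between those two figures can be sketched numerically. This is a rough illustration only: the lunar radius and the two distances below are round reference values (assumptions), not the figures for the 2012.05.05 event, and the comparison is perigee against the mean distance rather than against the average over one particular cycle.

```python
import math

# Illustrative round figures (assumptions, not measurements of this event):
R_MOON = 1737.4       # km, lunar radius
D_MEAN = 384400.0     # km, mean earth-moon distance
D_PERIGEE = 363300.0  # km, a typical perigee distance

def subtended_angle(d):
    """Full angular diameter (radians) of the moon at distance d."""
    return 2.0 * math.asin(R_MOON / d)

def solid_angle(d):
    """Solid angle (steradians) of the lunar disk at distance d."""
    half = subtended_angle(d) / 2.0
    return 2.0 * math.pi * (1.0 - math.cos(half))

lin = subtended_angle(D_PERIGEE) / subtended_angle(D_MEAN)
area = solid_angle(D_PERIGEE) / solid_angle(D_MEAN)

print(f"angular diameter ratio: {lin:.3f}")   # roughly 6% greater
print(f"solid angle ratio:      {area:.3f}")  # roughly the square of that
```

The point of the sketch is the scaling: for small angles, the solid angle (apparent area) ratio is essentially the square of the angular diameter ratio, which is why a modest increase in apparent diameter becomes a noticeably larger percentage when quoted as apparent area.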

We heard that during last night's event the moon was "30% brighter than normal". The "inverse square law" is often cited in this connection.

We here realize that the perceived luminance of an extended source (an object surface) does not vary with the distance to the observer (we assume no attenuation by the intervening medium). The inverse square law is not involved.

Thus, the luminance of the moon as seen from the earth does not depend on its distance from the earth.

It does depend on its solar illuminance - the degree to which it is illuminated by the sun. This varies with its distance from the sun, which of course varies slightly during a lunar cycle. (The inverse square law is involved there.) (There are also issues relating to the angles of incidence and observation and the non-lambertian nature of the moon's surface.)

This distance will be slightly less during a full moon that occurs at or near lunar perigee, but that is not of any great consequence, and the distance will vary to a greater degree over the year owing to the ellipticity of the earth's orbit around the sun.
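Those two scales of variation can be compared with back-of-the-envelope arithmetic. The figures below are round orbital reference values (assumptions), and the "over a lunar cycle" case compares full moon to new moon, i.e. the moon one earth-moon distance farther from versus nearer to the sun.

```python
# Round reference figures (assumptions, not measurements):
D_SUN = 1.496e8   # km, mean earth-sun distance
ECC = 0.0167      # eccentricity of the earth's orbit
D_MOON = 3.844e5  # km, mean earth-moon distance

# At full moon the moon is about one earth-moon distance farther from
# the sun than at new moon. For a small fractional change in distance,
# the inverse square law gives roughly twice that fractional change in
# illuminance.
lunar_cycle_change = 2.0 * D_MOON / D_SUN

# Over a year, the earth-sun distance swings between (1-e) and (1+e)
# times the mean, so solar illuminance varies by ((1+e)/(1-e))^2.
annual_change = (1.0 + ECC) ** 2 / (1.0 - ECC) ** 2 - 1.0

print(f"over a lunar cycle: about {100 * lunar_cycle_change:.1f}%")
print(f"over a year:        about {100 * annual_change:.1f}%")
```

The annual variation (several percent) indeed dwarfs the sub-percent variation over a lunar cycle, consistent with the observation above.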

The "supermoon" event is sometimes, but imprecisely, described "technically" as a lunar perigee-syzygy. Syzygy (in this context) refers to three astronomical bodies lying along a straight line. Since at full moon, the earth, sun, and moon nearly lie in a straight line, the term syzygy is often used by show-offs to mean "the situation of a full moon".

Of course, when the sun, moon, and earth actually lie precisely in a straight line (true syzygy), we have a lunar eclipse.

But not last night.

Best regards,

Doug
 

Asher Kelman

OPF Owner/Editor-in-Chief
Thanks for the reminder!



[attached image: _60060702_jex_1398779_de27-1.jpg]


Source for more info.
 

Doug Kerr

Well-known member
Hi, Asher,

Thanks for the reminder!



[attached image: _60060702_jex_1398779_de27-1.jpg]


Source for more info.
Evidently shot from a long distance from the ancient colonnade (some might say, "with a long focal length lens", but of course we can't directly tell, and this might just have been a small crop from the original image.)

By the way, for those who speak of the "enormous size" of the supermoon, here's how that same shot would have looked with the moon at an "average" distance from the earth:

[attached image: Supermoon-01-02.jpg]


Not-so-super moon

Sadly, in the article you cited, the spokesman for the Royal Astronomical Society says:

"The eye is so good at compensating for changes in brightness that you simply don't notice (that element) so much."

In fact, there is no difference in brightness to be expected from the difference in distance from the earth. (Glad he was not speaking for the Royal Photometric Society.)

Best regards,

Doug
 

Doug Kerr

Well-known member
I thought I would discuss the misconception that the perceived brightness of the moon varies with its distance from the earth, in particular that it varies in accordance with the "inverse square law".

What is the "inverse square law"? This refers to the fact that for light emitted by a point source of a certain luminous intensity, the luminous flux density at some distant location varies as the inverse of the square of the distance of that location from the point source.

A corollary is that, for light emitted by a point source of a certain luminous intensity, the illuminance upon a distant surface, for a given angle of incidence (for example, for a surface perpendicular to the line of travel of the light) varies as the inverse of the square of the distance of that surface from the point source.
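That corollary is easy to check numerically. A minimal sketch, with an arbitrary example intensity:

```python
# Inverse square law for a point source: doubling the distance cuts the
# illuminance on a perpendicular surface to one quarter.
def illuminance(intensity, distance):
    """Illuminance on a surface perpendicular to the light, from a
    point source of the given luminous intensity (candela)."""
    return intensity / distance ** 2

I = 100.0                 # candela (arbitrary example value)
e1 = illuminance(I, 1.0)  # at 1 m
e2 = illuminance(I, 2.0)  # at 2 m

print(e1 / e2)  # 4.0: twice the distance, one quarter the illuminance
```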

This relationship is approximated when the source is not a point source but rather a source of finite dimensions so long as its size is small compared to the distance to the distant location of interest.
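How quickly that approximation becomes good can be illustrated with a textbook case: the on-axis illuminance from a uniform Lambertian disk of radius R and luminance L is pi·L·R²/(R²+d²), while the point-source approximation gives pi·L·R²/d². (The disk geometry is chosen purely to illustrate the convergence; nothing here is specific to the moon.)

```python
import math

def exact_disk(L, R, d):
    """Exact on-axis illuminance from a uniform Lambertian disk."""
    return math.pi * L * R ** 2 / (R ** 2 + d ** 2)

def point_approx(L, R, d):
    """Point-source (inverse square) approximation to the same."""
    return math.pi * L * R ** 2 / d ** 2

L, R = 1.0, 1.0
for d in (2.0, 10.0, 100.0):
    err = point_approx(L, R, d) / exact_disk(L, R, d) - 1.0
    print(f"d = {d:>5} R: approximation high by {100 * err:.2f}%")
```

The relative error is R²/d²: at ten radii the point-source approximation is already good to about 1%, and it keeps improving as the square of the distance.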

But here we are not concerned with the luminous flux density, at the earth, of the luminous radiation from the moon, nor with the illuminance caused on the earth's surface by the luminous radiation from the moon (and if we were, we would need to note that, from the earth, the moon is not a good approximation of a point source).

Rather, we are concerned with the luminance ("brightness") of the moon as observed from the earth. If the diameter of the observing eye's pupil is small compared to the distance of the observer from the moon (and it certainly is), then the observed luminance is unaffected by that distance.

Let's look into that a little more. First, we must assume a fixed pupil diameter. (If the pupil diameter changes from case-to-case, as a result of the adaptation of the eye to the overall "scene" it observes, then of course the relationships I am about to discuss will not hold constant.)

The perceived brightness of an arbitrary small region on an observed surface is proportional to the illuminance upon the retina of the image of that region. This is the ratio of the luminous flux falling upon the retina across the image of that region to the area of the image of the region.

Setting aside "transmission loss" in the eye (or assuming it to be constant), the luminous flux falling on the retina over the image of some arbitrary small region of the surface being observed is the luminous flux coming from that region and captured by the pupil (the aperture of the eye).

Now consider the surface being observed at varying distances, and we will always consider a small region on the surface whose image on the retina is the same size. That would be a region on the surface that always subtends the same solid angle from the eye.

Starting with a certain location of the surface being observed, we then move it to twice the distance from the eye. The region we now consider has four times the area on the surface as the region we first considered.

Any tiny portion of that region may be considered a point source. Let us consider such a tiny portion of the region whose area is the same regardless of the distance from which we observe the surface.

Having moved the surface to twice its original distance from the eye, then for any such tiny area, treated as a point source, its luminous flux density upon the pupil of the observer's eye will be only 1/4 what it was for the original situation, and so the luminous flux from that tiny area gathered by the pupil (whose area is constant) will be 1/4 as much as in the original situation.

But the region of the surface whose image on the retina is the same size as the image in the first case now has four times the area of the region in the first case.

Thus that region comprises four times the number of "tiny areas" as in the first case. Accordingly, the total luminous flux incident on the pupil, and captured by it, from the entire (new) region of interest is the same as for the region of interest in the first case.

Since, by definition, the size of the image on the retina of the new region in the second case is the same as for the image of the original region in the first case, and the flux captured by the pupil from the pertinent region is the same in both cases, the illuminance on the retina for the image of the region is the same. Thus the perceived brightness of the surface (at the region of interest) is the same in each case.

Quod erat demonstrandum.
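The argument above can also be checked numerically. In the sketch below, all the photometric values are arbitrary illustrative constants (assumptions): we fix the pupil area and the solid angle of the observed region (so its retinal image size is fixed), move the surface to several distances, and watch the two d² factors cancel.

```python
import math

# Arbitrary illustrative constants (assumptions, for the demonstration):
SURFACE_EXITANCE = 1000.0  # lm/m^2 leaving the surface
PUPIL_AREA = 1.0e-5        # m^2, held constant
SOLID_ANGLE = 1.0e-4       # sr, region whose retinal image size is fixed

def flux_into_pupil(d):
    """Flux (lm) captured by the pupil from the region of the surface
    subtending SOLID_ANGLE at distance d."""
    # The region filling a fixed solid angle grows as d^2 ...
    region_area = SOLID_ANGLE * d ** 2
    # ... while each tiny patch, treated as a point source, delivers
    # flux to the fixed pupil falling as 1/d^2 (Lambertian geometry
    # folded into the constant factor).
    flux_per_unit_area = SURFACE_EXITANCE * PUPIL_AREA / (math.pi * d ** 2)
    return region_area * flux_per_unit_area

for d in (1.0, 2.0, 10.0):
    print(f"d = {d:>4} m: flux = {flux_into_pupil(d):.6e} lm")
# The d^2 growth of the region and the 1/d^2 falloff per patch cancel,
# so the flux onto the fixed-size retinal image - and hence the retinal
# illuminance, and the perceived brightness - is the same at every distance.
```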

There can of course be second-order effects that may cause the perceived brightness of the moon in a given phase (state of illumination by the sun) to vary slightly with distance from the earth. But the degree of this variation is certainly not what would be expected from the (inappropriate) application of the "inverse square law".

Best regards,

Doug
 