Michael A. Covington, Ph.D.

Daily Notebook

Popular topics on this page:
Explore Scientific 8x50 finderscope
Eyecup for Explore Scientific finder
Dependency parsing comes of age
In praise of "useless" research
Piaget on mental development
A cover for a telescope pier
Can a glove translate sign language into English?
Now it's Nikon...?

Astrophotos:
Sunspot (yes, just one)
Transit of Mercury
Mars
Saturn
Jupiter with shadow of Ganymede
M81 and M82
M104 with the Arrow and Stargate
Many more...
This web site is protected by copyright law.
Reusing pictures or text requires permission from the author.

For more topics, scroll down, press Ctrl-F to search the page, or check previous months.

For the latest edition of this page at any time, use this link: www.covingtoninnovations.com/michael/blog

2016
May
29-31

Very short note

I'm writing a lot of Digital SLR Astrophotography and a lot of software, but not a lot in the Daily Notebook. I've added some material to the Nikon review directly below, and thanks to the generosity of a friend, am about to test quite a few Nikon lenses on the stars. I'll see you in June!

2016
May
26-28

Nikon D5300 impressions

[Updated.]

I've had the Nikon D5300 for 3 days and haven't taken a presentable astronomical photograph with it, thanks to clouds (and Tropical Storm Bonnie). But here is a presentable nature photograph. The native format of the camera is 4000×6000, so the image you see here has been downsampled by a factor of 8.3.

I have done some astrophotographic tests, as well as a bit of daytime photography.

Pleasant surprises:

  • I don't need to cover up the light on the front! Exposure delay (equivalent to a 2-second self-timer with mirror prefire) doesn't make it light up.
  • The D5300 can take any Nikon F-mount lens made since 1959, except for a few strange ones that don't fit most other Nikons either. On the lens mount, it has one AI sensor (at the lower left, to see if an AF lens is set to minimum aperture), but the sensor is designed to press down harmlessly when a pre-AI lens is attached. Nikon Support has confirmed this for the D5300, even though Nikon's instruction manual for the D5300 says pre-AI lenses cannot be used.
  • The sensor noise really is low.
  • The "Canon tartan" row and column noise isn't there. Even the most extreme stretching of astronomical pictures produces only random noise and a very slight gradient near the bottom (the latter totally dealt with by dark frame or bias subtraction; there's a minimal sketch of that subtraction after this list).
  • According to published tests, the included 18-55mm "kit lens" is quite sharp and has little distortion from 35 to 55 mm (quite a bit of barrel distortion below 35, like its Canon competitor).
  • Reportedly, this camera can output uncompressed video through its HDMI port (to an external capture device). That opens up some interesting possibilities with lunar and planetary work. Most DSLRs cannot output uncompressed video in any form.
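
For anyone unfamiliar with the dark-frame and bias subtraction mentioned above, here is a minimal sketch of the idea in Python with numpy. The function names and the simple mean-combine of calibration frames are my own illustrative choices; real stacking software does considerably more.

    import numpy as np

    def master_frame(frames):
        """Average several dark or bias frames into one low-noise master."""
        return np.mean(np.stack(frames), axis=0)

    def calibrate(light, master):
        """Subtract the master frame from a light frame, clipping negatives to zero."""
        return np.clip(light.astype(np.float64) - master, 0, None)

    # Synthetic stand-ins; in practice the frames come from a raw converter or FITS reader.
    light = np.full((4, 6), 100.0)
    darks = [np.random.normal(10.0, 1.0, (4, 6)) for _ in range(16)]
    print(calibrate(light, master_frame(darks)).round(1))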

Disappointments or unfulfilled wishes:

  • With non-electronic lenses, the camera can't use its light meter or aperture-priority auto exposure. This is not a necessary limitation — Canons don't have it — but it's how Nikon chose to do things. What this means for astrophotographers is that the exposure for flat fields has to be determined by trial and error.
  • No electronic first-curtain shutter (a la Canon 40D "Silent Shooting"). So this is not the camera for still photos of the moon and planets.
  • No "movie crop" mode that uses the full resolution of the sensor. (By the way, does anybody have any use for Canon's movie crop mode, except for astronomers doing planetary imaging?) As best I can determine, the highest-resolution video mode has the pixels binned 3×3; there's a sketch of what that means after this list. I'll have to see how that works out for planetary imaging.
  • When focusing an astrophoto in Live View, I need to turn the camera up to its highest ISO setting in order to see stars.
  • GPS (for recording location data, and, more importantly for an astronomer, for setting the camera's clock) didn't work until I downloaded an updated "assisted GPS" file from Nikon. They tell me I may have to do this periodically. Hmmm...
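
To picture what 3×3 binning does to the image, here is a toy Python/numpy sketch. Block-averaging is my assumption about what "binned" means here, not a description of Nikon's firmware, and the tiny stand-in frame is just for illustration.

    import numpy as np

    def bin3x3(frame):
        """Average non-overlapping 3x3 blocks of a 2-D sensor frame."""
        h, w = frame.shape
        h, w = h - h % 3, w - w % 3        # trim so both dimensions divide by 3
        blocks = frame[:h, :w].reshape(h // 3, 3, w // 3, 3)
        return blocks.mean(axis=(1, 3))

    frame = np.arange(12 * 18, dtype=float).reshape(12, 18)   # small stand-in frame
    print(bin3x3(frame).shape)   # (4, 6); a real 4000x6000 frame would come out 1333x2000

Binning this way trades resolution for lower noise, which is part of why binned video can still look clean.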

Although this is a great camera, I don't have a good lens for doing serious deep-sky work with it. My first astrophotos will probably be taken through a 50-mm lens. I'm plotting and scheming... Nikon lenses fit on Canons with an adapter, but not vice versa, so if I want to use both kinds of bodies, I need to revert to the lineup of big Nikon lenses that I was using ten years ago. My long-term plan is to have a hybrid system with both Canon and Nikon bodies and lenses.

Of course, this camera is great for finding fault with any lens, because its sensor is sharper than any lens can be. It has 250 pixels to the millimeter, so a one-pixel star image would be only 4 microns across. That is the diffraction-limited Airy disk diameter for about f/6, and for lenses faster than f/6, aberrations predominate. So maybe what Nikon has actually made is a lens-testing instrument.
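
As a rough sanity check on those figures, here is a minimal calculation in Python; the 23.5-mm sensor width and 550-nm wavelength are assumptions, and I'm taking the radius of the first minimum of the Airy pattern as the size of the star image.

    # Pixel pitch of a 6000-pixel-wide APS-C sensor, and the focal ratio at which
    # the Airy pattern's first-minimum radius (1.22 * wavelength * f-ratio) spans one pixel.
    sensor_width_mm = 23.5                                 # assumed APS-C width
    pixels_across = 6000
    pitch_um = sensor_width_mm * 1000 / pixels_across      # about 3.9 microns, ~255 px/mm
    wavelength_um = 0.55                                   # assumed green light
    f_ratio = pitch_um / (1.22 * wavelength_um)            # about 5.8, i.e. roughly f/6
    print(round(pitch_um, 1), round(f_ratio, 1))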

I note that the D5300 does not have an optical anti-aliasing filter. The rationale is that no detail actually rendered by a lens could possibly be small enough to tangle with the individual pixels.

2016
May
24-25

Mars

Mars is closer to the earth than it has been in some years, and I got a good picture of it last night (May 24). 8-inch telescope, 3x focal extender, DFK video camera; stack of the best 75% of several thousand video frames. To the right is a map generated with WinJUPOS to tell you what you're looking at.

Bear in mind that Mars has weather (such as the light-colored clouds at the south pole at the bottom of the picture) and also that the dark streaks can shift as the wind blows the dust around.



Saturn

Saturn is near Mars in the sky (much farther away in space, of course), and I got a picture of Saturn too. Same telescope as with the Mars image above, but because Saturn is so much fainter, I used a 2x extender and my Canon 60Da recording video. The Canon produces compressed video, which loses some subtle detail compared to the DFK planetary camera, but it is more sensitive to light.

2016
May
22-23

Can a glove translate sign language into English?

Engineers in a number of places are working on special gloves that sense the position of the hand and recognize some form of sign language of the deaf.

Such devices will surely be useful, but there are two things the public needs to know:

(1) Sign language does not translate into English word-for-word or letter-for-letter. (There are signs for letters of the alphabet, but that is not the usual way of using sign language.) Signs stand for ideas; they are put together in different ways than English words; and there are different sign languages in different English-speaking countries. All of them are as expressive as spoken languages.

(2) Sign language involves things that a glove cannot detect, such as facial expressions and gestures involving nearby objects. For example, a tabletop might substitute for one hand in a two-hand sign.

Concern about these points has prompted an open letter to the University of Washington publicity office. The authors of the letter point out that the proposed gloves would not be universal communication tools. They would help only in a situation where one of the parties does not understand sign language at all, and, for some reason, the other one would rather use partial sign language with the glove than resort to writing or other alternatives.

In the letter and the surrounding discussion, you will find two words related to advocacy for the deaf. "Audism" is the prejudice that people who can't talk (or can't speak English) are stupid, which of course they aren't, but the prejudice has been hard to overcome. "Deaf" written with a capital letter refers to people who form a community held together by sign language and its associated culture.



Now it's Nikon...?

I've just ordered a Nikon D5300 DSLR camera to try out for astrophotography. This may be a brief experiment, or it may be the start of an avalanche.

At the end of the film era, I was an avid Nikon user. When Olympus discontinued the OM series, I moved to a Nikon F3HP for astrophotography and an autofocus Nikon N70 (F70) for daytime photography. I built up a fine collection of lenses and took some excellent pictures.

But then came the DSLR era, and for the first ten years (about 2004-2014), Canon DSLRs were the ones to use for astronomy. I tried a Nikon D70 and didn't get good results. So I built up my present Canon system, which currently includes three bodies and about a dozen lenses, many of them excellent.

Time to re-assess. Nikon, Pentax, and Sony (all using Sony sensors) seem to be pulling ahead of Canon in some aspects of sensor performance. The sensor on the Nikon D5300, in particular, has unusually low noise, or so I'm told. It's one of several good ones. Nikon even makes an astronomical DSLR, the D810A, and although it's expensive, it has an enormous sensor by astronomical standards.

The astronomy-related feature that the D5300 lacks is electronic (vibrationless) first-curtain shutter. So no matter how well it serves me, it won't take the place of my Canons for full-face sun and moon shots.

My D5300 is going to have a deep red body rather than the usual black. If it becomes my main deep-sky camera, in due course I will have the hydrogen-alpha filter modification done on it by Lifepixel.

And at less than $600 with lens, I think it's the cheapest DSLR I've ever bought.

2016
May
22

(Extra)

Sunspot (yes, just one)

The sun has only one spot at the moment, but it's a big one. I was easily able to see this spot with just a filter in front of my eyes (no telescope), and you may be able to see it when the sun is setting and dimmed by clouds.

These pictures were taken with my (vintage 1980) Celestron 5, a Thousand Oaks plastic solar filter, and my Canon 60Da at the f/10 focus. The upper one is a stack of fifteen still images, processed by PIPP, stacked with AutoStakkert, and sharpened with RegiStax. The lower one is from a 2600-frame movie taken with the camera in movie crop mode, processed with the same software, which extracted the best 25% of the video frames.

2016
May
22

A cover for a telescope pier

My telescope pier (a permanent steel pipe) has a new cover. It's a Char-Broil 8919401 smoker cover from (as best I recall) Lowe's. It has Velcro at the bottom, and by wrapping it around, I can make the Velcro meet even though my pier is much narrower than the smoker it was intended to cover.

Past experience suggests it will last about 4 years out in the sun and rain.

2016
May
19-21

One more difference between orality and literacy

Following up the previous entry, one noteworthy difference between the way people think in oral cultures and in literate cultures has to do with whether things outside your immediate experience are real.

Plenty of people can read but still have largely oral (pre-literate) habits of thinking. To them, things like the Eiffel Tower, the roundness of the earth, and Henry VIII may seem almost as fictitious as Sherlock Holmes or Mickey Mouse. They are "things we read about" and are found in textbook-land, which is a lot like storybook-land.

Those of us who are highly literate are accustomed to using books, mass media, Wikipedia, etc., to get true information about things that really exist, and we make a really sharp distinction between reality and fiction. (Baker Street is real; Queen Victoria was real; Sherlock Holmes was fictional.) It's not that way for everyone.

It has been observed that when rural people go to college, they quit smoking. I don't think it's just peer pressure; it's also acculturation into a set of people who believe that those warnings about lung cancer, etc., are real, as opposed to just being erudite talk from distant characters in textbook-land.



Piaget on developmental psychology

While we're talking about mental development, another useful tool of thought is the set of stages described by Jean Piaget, pioneer child psychologist. The three most relevant, for education, are:

Preoperational stage (early childhood): The child interacts with people and things; learns what they are like and what they do; but engages in little abstract reasoning.

Concrete operational stage (mid-childhood): The child reasons about people and things; learns how games and machines work; learns procedures for doing things; and can make generalizations from observations. "If-then" statements have to refer to relatively familiar things and situations.

Formal operational stage (adolescence to adulthood): The person thinks about thinking and about hypothetical situations, which can be far removed from reality; keeps track of logical connections elaborately; and can keep track of differences between different people's thoughts.

People who are weak on the formal operational stage easily get mixed up about hypothetical situations; they recognize that what is described is not real, but they aren't good at keeping track of the conditions that apply. Think about people who are attracted to the lottery because "you can't win if you don't play" but don't seem to grasp that big winnings are always rare. Think also about people who mix up "all X are Y" with "all Y are X" and people who can't do word problems in mathematics.

We don't all move into the formal operational stage in all areas of life. Much of everyday life can be lived on the concrete operational stage; assembly-line work can even be done preoperationally (you see something and do the usual thing with it, without pondering).

Further, we probably go through all three stages when learning a new activity. Consider learning to play chess as an adult. You might experience a preoperational stage (see the pieces, observe how they move); a concrete operational stage (make legal moves but only follow shallow guidelines or tactics for winning); and, finally, a formal operational stage (imagine various ways the game might go, and try to make one of them happen).

Teaching people to write, I've often tried to nudge them into the formal operational stage. At the concrete stage, they want procedures and recipes; they imagine writing is largely about following rules, which most people hate but a few delight in. The threshold of the formal operational stage, as a writer, is when you think of more than one way to say something, then choose the best. The same is true of the other arts; the novice comes up with one way to do something; the expert thinks of several good ways, then makes a choice.

2016
May
15-18

Facebook, a pre-literate culture that uses writing?

Do you think like a person who can't read? Surely not, or you wouldn't have found this and read it.

But I've recently put my finger on something that bugs me about Facebook and other social media (some going back to the earliest online forums).

On Facebook and in similar places, we often see people using written communication but behaving like a pre-literate (oral) culture.

Marshall McLuhan and many others have pointed out that reading changes the way people process information, and when lots of people learn to read, a whole society changes from "oral" to "literate" habits.

Consider some specific contrasts.

ORAL: People experience language almost entirely as one-on-one or small-group conversations.
LITERATE: People commonly read or hear statements addressed to the whole public, to large groups, or to third parties.

ORAL: People are often unaware of controversies; when aware of them, they take them as issues of personal loyalty, and they associate only with like-minded people.
LITERATE: People are accustomed to hearing about controversies from more than one side; they view controversies as occasions for fact-finding or for getting a message out to the public.

ORAL: People get information from a few trusted individuals; trust depends on who told you.
LITERATE: People get information from sources that have some claim to authority, such as newspapers, encyclopedias, and pamphlets from businesses and organizations; trust depends on who originally said it.

Now think about your Facebook experience. The first of these 3 contrasts explains why some people seem so mixed up about privacy. "I wasn't talking to you!" Then why did you put it where it would be shown to me? "My ex is stalking me on Facebook." Well, stop putting things where he can see them. The Internet is a public place. You can make things (reasonably) private, but if you show us things, we'll see them.

The second contrast explains one of the worst things about social media — that it becomes entirely too easy to surround yourself with people who have the same opinions (and are ignorant of the same things) and never learn about the rest of the world. That makes it far too easy to turn disagreements into personal conflicts instead of questions of fact.

And the third contrast is why gossip is such a problem. In a pre-literate culture, you can't find out who originally said anything; you just have to decide whether to trust the close friend who passed it along to you. That's what people do on the Internet: "This has to be true, my sister-in-law forwarded it to me and she don't lie!" Never mind that it says Abraham Lincoln will give you a billion dollars for forwarding his e-mail a thousand times, or something like that. People with literate habits know that Google and Snopes are just a click away — or at least that if you don't know the original source of something, you can't rely on it.

So, to sociologists of the future, I commend Facebook as an object of study. Lengthy written records of oral cultures must be uncommon, but this is a copious one.

2016
May
14

In praise of "useless" research

[Minor revision.]

Further to what I was just saying, everybody please note that the dependency-grammar research that led to Google's language analyzer was considered useless at the time. Many educated people thought that theoretical linguistics as a whole was almost useless.

That is why it pains me to see journalists who don't know much about science ranting about "useless" research grants. "Why would anybody want to study the effect of cocaine on goldfish? Yuk, yuk, yuk!"

It wouldn't have been funded if experts hadn't thought it was worthwhile. And even experts don't always get it right.

Anyone who wants to invest in pure research — even if the investor is the government — is going to have to let experts judge it, not journalists.

I want to separate this from a different issue: whether the government should fund basic research at all. Obviously, the government has to fund some research that it needs for its own purposes in areas such as defense, economics, and public health. But should the government fund pure research that isn't for its own needs? That's a political question. And if that's the question you want to argue about, please do so! Just don't muddy the waters by claiming that spending is "useless" or "wasteful" when you don't actually know.

We got the Google language analyzer (which Google has made available free for everyone) from a long line of seemingly useless linguistic research.

If and when we get a cure for AIDS or cancer, it will come from "useless" cellular biology, from laboratory work that might revolve around lower animals or even plants. (Genetics, after all, was pioneered with English peas.) It won't come from an AIDS or cancer clinic. It won't even get into a clinic until lots of research and development has been done.

Incidentally, my work on dependency grammar was not funded (except by my salary at the University of Georgia, and they were paying me to teach other people how to do the same kind of things as I was doing).

2016
May
13

Dependency parsing comes of age

Allow me to blog, and even brag, for a moment about what I do in my day job.

Any computer that understands English has to recognize sentence structure. What that means is that grammar, and ways of analyzing it, are now an important part of cutting-edge software. I am, by training, a theoretical linguist; this is exactly the kind of thing I've been studying all my life.

There are two ways to describe sentence structure. You can break the sentence up into phrases, like this:

or you can establish links from word to word, like this:

These are called, respectively, constituency grammar and dependency grammar. In my student days, and for decades afterward, the second one was very much in disfavor. Linguists felt that the first one was right and the second one was somehow equivalent to it, but more obscure; it was advocated only by a few eccentric Europeans.

Nonetheless, I promoted dependency grammar in papers such as this one and especially this one, which was written hastily for a small conference but has turned out to be one of the most influential papers I ever wrote.

I can't claim full credit, but that paper was at least one of the things that led Joakim Nivre to pursue computer implementations of dependency grammar.

And this week I was gratified to see two very interesting developments.

First, John Hale, Marisa Boston, and others at Cornell University have been using Nivre's work as a framework for modeling how the human brain processes language. John just visited UGA and gave a talk about this.

Second, Google has released an open-source parser (language analyzer) that is unabashedly based on dependency grammar and uses Nivre's basic methods.

So I can perhaps claim to be one of the grandparents of Google's parser.

Why use dependency grammar? Several reasons.

First, unlike constituency grammar, dependency grammar doesn't introduce separate labels for phrases (NP, VP, S). That means you have fewer objects to handle when analyzing a sentence. The phrases are still there; in the example, the phrase "all dogs" is exactly what you can get to by following the left arrow from the verb. But the phrases are implicit rather than explicit.

Second, relations like "subject of the verb" and "object of the verb" are what you actually need when you're trying to compute what the sentence means. If you just draw constituency trees, you then have to analyze the trees to get these relations; why not just start with them?
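
To make that concrete, here is a toy Python sketch. The sentence ("All dogs chase cats"), the relation labels, and the hand-written triples are my own stand-ins for illustration; they are not Google's output format, but a real dependency parser produces something equivalent automatically.

    # A dependency analysis represented as (head, relation, dependent) triples.
    parse = [
        ("chase", "subj", "dogs"),   # "dogs" is the subject of "chase"
        ("chase", "obj",  "cats"),   # "cats" is the object of "chase"
        ("dogs",  "det",  "All"),    # "All" modifies "dogs"
    ]

    def argument(triples, head_word, relation):
        """Return the word standing in the given relation to the given head."""
        return next(dep for head, rel, dep in triples if head == head_word and rel == relation)

    print(argument(parse, "chase", "subj"))   # -> dogs
    print(argument(parse, "chase", "obj"))    # -> cats

The subject and object come straight off the arcs; with a constituency tree you would first have to walk the tree to find them.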

(A third advantage, which I made much of in my 1990 paper, is that if the language has highly variable word order, the dependency analyzer has less opportunity to get tangled up. Apparently, this spurred some people in Korea to try it and gave rise to a widely used method for parsing Korean.)

By the way, neither of the diagrams above is the kind of sentence diagramming that you may have learned in high school. The Reed and Kellogg sentence diagram (introduced in 1878) is mostly dependency- rather than constituency-based, although in 1878 none of the theory had been developed.

2016
May
12

M81 and M82 with an apo refractor

Although I do most of my medium-wide-field astrophotography with a top-quality 300-mm telephoto lens, the usual instrument for the purpose is an apochromatic (apo) refractor telescope. I was privileged to be allowed to use a friend's AT65EDQ, which is a 65-mm f/6.5 four-element, flat-field apo refractor from Astronomics. Here's what happened when I photographed M81 and M82 (compare March 4).

First you see the full-field image, to show that the stars are sharp to the corners of an APS-C sensor. (The telescope actually claims to cover appreciably larger sensors, and surely does.) There was a hint of vignetting — unavoidable unless you want to build a very glare-prone telescope — but PixInsight took care of it (and flat-field correction would have taken care of it better).

The second image is the central region, enlarged, with adjustments to color saturation and contrast. You can see some ionized hydrogen (red) in the turbulent central region of M82.

2016
May
11

To a familiar place, vale atque ave

Since retiring, I have spent more time in the University of Georgia Science Library than ever before. I have time to read things I couldn't read before, and it's a good place to sit with my laptop and work on whatever I'm writing or coding.

The main (second) floor (reference) was remodeled in 2012 and is much more comfortable now. At the same time, they changed the overhead lights throughout the library to a new type that doesn't reflect as much on laptop screens — sadly, the new lights also don't illuminate vertical book spines very well, but they're fine to read by.

This year the third floor (physical sciences, mathematics, and engineering) gets a major renovation, and, to my alarm, all books published before 2000 will supposedly be moved to the Repository. That means they're moving out a lot of books whose positions I have known for over 40 years (including some that I wrote!). I do hope there will be some discretion; they have a complete run of Sky and Telescope and a complete run of QST, and many other things of historical interest.

Update: Far fewer books were removed than we were led to expect. See August 11 for the outcome.

Anyhow, I'm sure I'll enjoy sitting at the desks on the renovated third floor and keeping the librarians busy with repository requests. In the meantime, here are some pictures that may, perhaps, be full of memories only for me. But here they are, our last look at the third floor in its present state.

2016
May
10

Transit of Mercury

For the second time in my life I have observed a passage of Mercury in front of the sun. The first time was May 9, 1970, exactly 46 years ago. There have been several transits of Mercury in between, but I did not see them. The May 9 date is no coincidence; we are at the right position in our orbit on that date, and sometimes Mercury is in the right position in its orbit on that date too, and we see a transit.

Why don't we see a transit every time we catch up with Mercury going around the sun? Because the earth's and Mercury's orbits are not in the same plane. Usually, Mercury passes north or south of the sun.

As an eighth-grader, I observed the May 1970 transit with Edward Van Peenen II (where is he now?) at Valdosta State College Observatory. We did sun projection with a 3-inch refractor. Professor Van Peenen also took photographs of the projected image using his venerable Exakta VXIIb (I think it was).

I listened to time signals on the shortwave radio, timed the end of the transit, and sent the results to Sky and Telescope. At that time, amateurs were asked to time the transit as seen from their locations in order to measure the position of Mercury more precisely. My timing was published and was apparently not very accurate.

Today, 46 years later, I used my Celestron 5 (which is 36 years old!) with a Thousand Oaks solar filter and Canon 60Da camera. Dodging high clouds, I got a lot of bad images and a few good ones. (The cloud layer is responsible for the overall hazy appearance of this picture.)

You can see a sunspot group above center, a few tiny sunspots elsewhere, and Mercury below center. This is a stack of five 1/800-second exposures.

2016
May
9

Rubber eyecup (eyeglass protector) for Explore Scientific finder

I keep rubber eyecups on all my eyepieces so that accidental contact with the eyepiece will not scratch my glasses. (I usually observe with glasses off, but sometimes with glasses on, particularly during setup and while focusing the telescope for other people to look through.) My new Explore Scientific finder had an eyepiece that was particularly hazardous to eyeglasses and did not have an eyecup, but I found a solution.

That O-ring is sized 1 1/8 inch i.d., 1 3/8 inch o.d., and 1/8 inch thick. The size is critical, and O-rings of that size are not common, but I found this one at Home Depot.

Properly sized, the O-ring will stay in place under its own slight tension, but I tacked it down with a few dabs of contact cement applied with a toothpick. Crucially, when you're putting the O-ring on, don't let part of it twist separately from the rest; if you do, it will pop right off.

This isn't really an eyecup, but it does protect my glasses.

2016
May
8

M104 with the Arrow and Stargate

M104 in Virgo is one of the brightest galaxies in the sky, though not the largest, as seen from Earth. It is a good choice if you want to show someone a galaxy in your telescope and their eyes are not trained or not dark-adapted. As you can see, it has a prominent dark dust lane encircling it almost like the rings of Saturn.

The picture also shows two compact groups of stars (asterisms). The "arrow" is near M104 and seems to point to it. Toward the lower right is the "Stargate," which resembles the triangular insignia of Stargate SG-1.

This is a stack of sixteen 3-minute exposures with a Canon 60Da and 300-mm f/4 lens at f/5.6, on an AVX mount in my yard. The air was not very steady, stars were twinkling, and the autoguider tried to "chase the seeing" (follow the air turbulence), resulting in rather poor tracking, though still good enough for this picture.



Bing!

Starting today, the search boxes at the bottom of this page and on the Previous Months page use Microsoft's Bing search engine rather than Google. I should have done this long ago. Google simply will not index the Daily Notebook properly. It sometimes fails to realize I've started a new month, so an entire page gets neglected for a long time, or even permanently.

2016
May
7

Io, Ganymede, and Jupiter with a black eye

On the evening of May 5, I set up both telescopes, merely to install and adjust finderscopes. The air was rather unsteady. Looking at Jupiter, though, I saw a black spot, which turned out to be the shadow of Ganymede, its largest satellite.

Celestron 8 EdgeHD, Meade 3x extender, DFK color camera. Best 25% of about 2700 video frames.

2016
May
5-6

Advantages of a premium-quality finderscope
Explore Scientific 8x50 straight-through finder

(Based on a brief review posted on Cloudy Nights.)

I've just upgraded the finder on my Celestron 8 EdgeHD. I was using a vintage Meade 8x50 finder that Melody gave me for Christmas in 2001. The finder that came with the EdgeHD, on the other hand, had been placed on my even-more-vintage Celestron 5. Yes, I mix and match components...

The immediate problem was that nowadays I use the telescope with my glasses off (especially now that I have astigmatism correctors on the eyepieces), so I'd like to be able to use the finder the same way. On most finders, however, the reticle is permanently focused for distance vision. You can adjust the front lens to make the stars focus with the reticle, but you can't focus the reticle. Accordingly, I have to use such a finder with my glasses on.

So I wanted a finder that focuses at the eyepiece. I also wanted to be able to change the focus easily, since during the initial stages of setup, I still have my glasses on.

Others recommended the Explore Scientific, and I bought it from Astronomics while it was on sale a couple of weeks ago.

As usual with Explore Scientific, the optics do not disappoint. The mechanics — well, we'll get to that.

Optically, the finderscope gives a sharp image, and its reticle, with thin lines and a central circle, is calibrated in degrees. I really appreciate being able to tweak the focus easily. (The reticle does not rotate with focusing, and I had the presence of mind to align the reticle lines north-south and east-west, a practice I recommend.) Overall, the higher-quality finder makes the whole telescope seem more solid and professional.

However, there is no rubber eyecup, so beware of bumping your eyeglasses on the eyepiece. The eye relief is about 12 or 15 mm, not enough for comfortably viewing the whole field with glasses on. However, you do not normally need the whole field, and my solution is simply not to move in too close. The place where an eyecup might fit is 30 mm in diameter and only 2 mm deep, so aftermarket eyecups are, in general, not applicable, but I may be able to improvise an eyeglass protector by gluing an O-ring in place, and if so, I'll write about it here. (See May 9.)

The illuminator seemed too dim (the opposite of most people's complaints about it) until I took it apart and put it together again. There had been a loose connection. Now it is a variable-brightness red LED illuminator, as advertised. During setup in twilight and when doing planetary work, I need a rather bright reticle, and it delivers. I can also turn it down until the illumination is barely visible. I am thinking of trying to get one or two more and modifying them — bright yellow for planetary work, dim red for deep-sky work.

The internal construction of the illuminator is, shall we say, very economical. It looks as if maybe it was built with components from a different illuminator, with last-minute improvisations. For example, one battery terminal is the curled-up positive lead of the LED.

The illuminator fits into a hole that is threaded 8 × 0.75 mm. I am told other illuminators are widely available that fit the same hole. Mine, however, is serving me well.

The bracket fits standard Meade and Explore Scientific bases, so I popped it into the Meade base that was already on the telescope. But then I found it hard to use, and I ended up reverting to my original Meade bracket, which had been fitted with plastic screws long ago.

The Explore Scientific bracket looks nice. It has two plastic screws and one spring plunger on each of its two rings. It's easy to adjust.

The problem is, the spring plungers hold the finderscope so lightly that it drifts fore and aft under its own weight! Now we know why, when Celestron introduced a finder bracket with a spring plunger in it, they used an O-ring at the front, rather than a second set of three screws. A bracket of that type is on my C5 and works well; the single spring plunger makes it easy to adjust.

One way to continue using the Explore Scientific bracket is to replace at least one of the two spring plungers with a 1/4-20 plastic screw, and maybe replace the other one with a stiffer spring plunger. The plungers are threaded 6 × 1.25 mm, but the holes are 1/4-20; within loose tolerances, those two sizes are the same.

(One more finder-bracket hint. If you want to put a finder other than the original one into a Celestron bracket, you may need a thinner O-ring. Go to Lowe's and buy one or more of their large O-ring assortments, and you're set.)

Bottom line? A great finder, but you may have to replace or fix the illuminator, and you will almost certainly want to modify the bracket if you use it.

2016
May
1-4

A victim of doublethink

There's a lot going on here, and I haven't been blogging regularly, but it's time to start the month of May — on a political rather than scientific note.

We have just seen the Republican nominating process collapse, to the point that a man who is not historically Republican and has not stood for what the party stands for is apparently going to be its nominee for President.

And will secure victory for the Democrats, because although Trump is popular with his special-interest group, he is not popular with the rest of the American people.

I never publicly endorse candidates, but this year I have made half of an exception — I have publicly de-endorsed Donald Trump. The reasons should be fairly obvious.

And I think the nominating process has been a victim of doublethink. Are primary elections the first round of general elections, in which anyone can vote? Or are they internal activities of political parties, open only to party members?

By mixing the two concepts, the Republican Party has failed to function as a party. Its leaders have not been allowed to choose a candidate who represents the party and has a good chance of winning the general election. Instead, we have had a popularity contest, outside the Republican Party proper, and a special-interest faction has won it.



New book coming

Work has begun on a new edition of Digital SLR Astrophotography. Don't hold your breath — it will probably take a year to reach the market. But what that means is that most of my technical writing about astrophotography is no longer going into this blog. I'll continue to share pictures, of course.


If what you are looking for is not here, please look at previous months.