Michael A. Covington, Ph.D.

Daily Notebook

Popular topics on this page:
Ira Edward Aaron centennial
Electronics in the new century
Software appearing tiny on high-resolution screen
NGC 6633
Veil Nebula
Field of Gamma Cygni
Jupiter (Great Red Spot breakup)
Many more...

This web site is protected by copyright law. Reusing pictures or text requires permission from the author.
For more topics, scroll down, press Ctrl-F to search the page, or check previous months.
For the latest edition of this page at any time, create a link to "www.covingtoninnovations.com/michael/blog"



On the evening of July 10 I photographed both Jupiter and Saturn. Celestron 8 EdgeHD, 3× converter, best 50% of a large number of video frames (2889 for Jupiter, 1297 for Saturn) stacked and processed with AutoStakkert and RegiStax.



Alcor rides again

What got me back into microcontroller work was that someone asked me for help with Alcor, a drive controller for old-style telescope motors that I designed in (gasp!) about 1996 and released in final form (or so I thought) in 1999.

It felt strange to be working on an assembly-language program that I had last edited more than 20 years earlier. It's probably the last assembly-language program I'll ever work on, now that the price of a good C compiler has fallen to zero.

What I've done is add a version for motors that have special gears to produce sidereal rate at exactly 60.000 Hz, revise Alcor's web page, and add another page about how to program an old-style PIC16F84A with modern ICSP programmers such as the inexpensive PICkit.
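The arithmetic behind the two versions is simple enough to sketch. Here's a quick back-of-envelope check (the day lengths are the standard astronomical values, used for illustration; nothing here is taken from the Alcor firmware):

```python
# Why a standard 60 Hz synchronous motor can't track the stars directly:
# the stars gain about 4 minutes on the Sun per day, so a solar-rate
# drive must run slightly fast. (Standard day lengths; illustrative.)
SOLAR_DAY = 86400.0        # seconds in a mean solar day
SIDEREAL_DAY = 86164.0905  # seconds in a sidereal day

# Frequency a solar-rate (60 Hz) motor needs in order to track at
# sidereal rate, absent special gearing:
sidereal_hz = 60.0 * SOLAR_DAY / SIDEREAL_DAY
print(round(sidereal_hz, 3))  # ≈ 60.164
```

Motors with sidereal-rate gearing absorb that 0.27% difference mechanically, so the controller can feed them exactly 60.000 Hz; ordinary motors need the controller to synthesize the odd frequency instead.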

Here are some scenes from the development and testing process. I have quite a few more projects in mind.





Software is tiny on a high-resolution screen
Known problem with MPLAB X among many others

While compiling microcontroller code on my PC, I ran into a problem that has also beset me with older versions of Photoshop and several other products: On my giant 4K screen, some of the windows (especially file dialogs) were tiny, and others switched between tiny and normal size at inopportune moments.

The problem is as follows. Traditionally, computer screens had no more than about 120 pixels per inch, and software addressed the pixels individually. But now we have high-resolution displays ("retina" displays, as some call them) with the pixels much closer together. Windows automatically scales the software to the right size if the software tells it how; when it can't, it has to guess, and occasionally it guesses wrong.
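To put numbers on "much closer together," here is a rough calculation for a 27-inch 4K monitor (an illustrative example, not a spec for any particular display):

```python
import math

# Pixel density of a 27-inch 3840x2160 ("4K") monitor -- illustrative
# figures, well above the traditional ~120 pixels per inch.
width_px, height_px = 3840, 2160
diagonal_in = 27.0

diagonal_px = math.hypot(width_px, height_px)
ppi = diagonal_px / diagonal_in
print(round(ppi))  # ≈ 163
```

At that density, a program drawn pixel-for-pixel appears at roughly 96/163 of its intended size — well under two-thirds — which is why Windows has to step in and scale it.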

The cure is in the compatibility settings for the software. There are two ways to get to them:

  • Find the executable file (e.g., in "C:\Program Files" or "C:\Program Files (x86)"), right-click on it, and choose Properties.
  • Alternatively, right-click on the Start Menu icon for the software, choose More, Open File Location, find the link to the software of interest, right-click on it, and choose Properties.

Now choose the Compatibility tab, click "Change high DPI settings," and choose "Override high DPI scaling behavior," selecting "System (Enhanced)" in the box below it. At least, that's what worked for me, with MPLAB X. You have several settings to play with.



The second microprocessor revolution


There have been two microprocessor revolutions. The first, in the 1970s and 1980s, was when microprocessors became available. The second, today, is when they are cheaper than even small amounts of other circuitry.

I'm not kidding. Nowadays, if all you want to do is make an LED flash, it can be cheaper to do it with a low-end microcontroller than with conventional ICs. The micro costs no more than a 555 or a dual op-amp. It costs less than a pair of transistors and a handful of resistors and capacitors. And you don't have to stock such a variety of parts — just program the micro to do what you need.

A complete (but small) computer on a chip, with ROM and RAM and numerous I/O devices, can cost as little as 35 cents in quantity; maybe as much as 70 cents singly. So it's no surprise that almost all functions involving switching, logic, and timing are being given to micros rather than conventional ICs.

I think that is a bigger change than the introduction of transistors (1950s) or conventional ICs (1970s). Those just gave us more convenient forms of circuits that had been invented (or nearly invented) in the tube era. Microprocessors give us something new: software (called "firmware" when stored inside the chip itself). Now you program your chips instead of just connecting them together. And, sadly, the things we build aren't repairable by people who don't have the software. You can't just buy new parts and put them in.

The cost of using a micro involves three things, and all three have plummeted:

  • Cost of the chips themselves (now under 50 cents, down from $40 in the 1980s);
  • Cost of the apparatus and software to program them (now often around $20, formerly hundreds);
  • Effort required to develop the programs (now a few minutes of coding in C rather than days of labor with quirky assembly language).

The last one may be the biggest change. The Arduino came along and was both affordable and very easy to program (no special apparatus, free software, C-like language). That put pressure on manufacturers to make bare microcontrollers just as easy to use. And they're doing it.

Consider for example the $20 "Curiosity" board shown above. I'm experimenting with one. It's a programmer for new-style PIC microcontrollers (the ones that all have the same pinout, except that part of the package is simply left out of the smaller ones). It is also a prototype board — you can modify it and have it actually be the chassis for the prototype of what you're building — ideal for student projects. It has buttons and LEDs connected to some of the pins, as well as connections for several kinds of add-ons, and there are lots of removable 0-ohm surface-mount resistors so you can break connections and make changes. With it, you get free programming and debugging software for your PC (including Linux) and a good C compiler. What's more, the software includes a "code configurator" that generates C code to set up the pin definitions, operating mode, etc., so you don't overlook anything. Setup was actually the hardest part of using assembly language, because every micro had its own requirements and it was so easy to overlook things.

The other thing the Arduino gave us was a community of users with shareable code. Thorny programming problems have already been solved by other people. If you want to interface to an LCD display or some other special device, find someone who has already done it and shared the code. This saves a lot of work.

That idea has spread in two directions: Raspberry Pi for those who need more computing power (still super-cheap), and, in the other direction, easier use of bare microcontrollers, in which Microchip, Inc., takes the lead.

So, in the foreseeable future, I'll use bare micros for the simplest tasks, Arduinos for moderately complex ones, and Raspberry Pi for those that need power comparable to a small PC.

And I may be doing my last assembly language coding right now, updating a telescope drive controller that I designed in 1999. More about that soon.

More about how electronics has changed

I said above that the microcontroller revolution, replacing wiring with software, is a bigger change than simply making circuits smaller with transistors and ICs.

That's not the only thing that has changed in electronics since my youth. I'm making a real effort to get caught up. Around 2003-2006 I wrote part of an introductory electronics book, which has never been published. I struggled with keeping it from being a 1980s nostalgia piece. I think it needs another revamping before I do much more with it.

Already in the 1970s, early in the IC era, I noticed that the real divide is not between hobbyists and professionals, but between one-off and mass-produced designs. Before that, in the Heathkit era, there had been no divide, and hobbyists could (if they cared to) build equipment that looked just like manufactured gear, inside and out. The decor of the panels might be the only visible difference. But not now!

That divide has widened. The high-density surface-mount printed circuit boards used in manufactured products are not cost-effective in quantity one. Instead, custom-builders (hobbyist or pro) use ready-made prototype boards or breakout boards, where a chip is already soldered to a circuit board with the required support components already installed, and there's space for adding more. In fact, all the most useful ICs (voltage converters, audio amplifiers, etc.), if they are available only in surface-mount form, are being marketed on breakout boards by entrepreneurs in Asia. Instead of buying a chip, you can easily buy a half-inch-wide circuit board to incorporate into your own design; it may even ride on your circuit board as if it were an IC. (And technically it is an IC, hybrid, not monolithic.)

For one who started following electronics in the 1960s, it's sad that we can no longer build things the way the manufacturers do — but we can certainly get good results! The amount of functionality that we can build into a small case is greater than ever.

The other big technical change I've seen is a move toward 3.3-volt and even lower supply voltages. In the middle of my career, we always expected 5 volts for logic, 12 volts for power. Not any more. Many devices use so little power that button-cell batteries can power them, delivering 2.7 volts. This means I need to rethink my stock of parts.

As an educator, I have another concern. As everything moves into ICs and firmware, will people still learn electronics? It's hard to think of diodes and transistors as important if you never see them or make any decisions about them. There's a connection between nostalgia for the discrete-component era and a serious interest in how circuits work. (Just as many automotive enthusiasts are interested in older engines with little or no electronic control.) Machines are more interesting if you can see how they work.


Ira Edward Aaron centennial

Yesterday (July 9) would have been the 100th birthday of my distinguished great-uncle, Ira Edward Aaron, 1919-2016. To mark the occasion, today I visited the University of Georgia's education library, which is located one floor directly below where his office used to be, and which has a reading area and a book collection named in his honor.





A really good lens: Sigma 105/2.8 DG EX

Back in 2005, for Father's Day, my family gave me a Sigma 105-mm f/2.8 DG EX (or EX DG) telephoto lens for my Canons. Although billed as a macro lens (for close-up photography), this lens also is very sharp at infinity, and I've taken a lot of excellent astrophotos with it.

Now that I'm using two Nikon bodies extensively, this year I went looking for a lens like it in Nikon mount. That was a difficult quest. I found out the hard way that the Sigma 105-mm f/2.8 EX (not DG) lens is not as good; it has one fewer element and not as flat a field. I also tested two older Nikon 85-mm f/1.8 AF-D lenses and found that they just aren't built to the standards of the digital era; digital cameras demand about five times as much sharpness as film.
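A rough calculation shows where the "five times" comes from. The sensor dimensions below are typical published figures for a 24-megapixel APS-C camera, and the film figure is a round number, both used purely for illustration:

```python
# Pixel pitch of a 24-megapixel APS-C sensor vs. the detail film
# typically records in practice (round, illustrative figures).
sensor_width_mm = 23.5   # typical APS-C sensor width
pixels_across = 6000     # 6000 x 4000 = 24 megapixels

pitch_um = sensor_width_mm / pixels_across * 1000.0
print(round(pitch_um, 1))  # ≈ 3.9 microns

# Color film in practice resolves detail on the order of 20 microns,
# so the sensor out-resolves it by roughly:
film_detail_um = 20.0
print(round(film_detail_um / pitch_um))  # ≈ 5
```

A lens that looked sharp on film can therefore fall visibly short on a modern sensor.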

For quite a while, no 105/2.8 DG EX lenses in excellent condition were available on the American used market. I eventually bought one from an eBay camera dealer in Japan and was startled that they got it here in three days, for about $20 in shipping. And I got a good price on it, too.


Sigma doesn't make a tripod collar for this lens, but I have long used the collar for a different lens (Sigma 70-200/2.8 APO) with some padding added. The collar covers up the focusing scale, which you don't need to see anyhow:


It is an 11-element lens derived from the classic Zeiss Sonnar. Here is an optical diagram (from Digital SLR Astrophotography) showing this lens and two of its distinguished ancestors:


I had to wait for good weather to do tests, but I am glad to report that the lens is good. Of course, on a modern 24-megapixel camera, it shows some aberrations at the edges of the field at f/2.8, but less than most lenses. At f/4, it is almost perfect, with just a small area of degradation at one edge, probably due to slight decentration of an element, and only visible when the image is enlarged so much that the whole picture would be several feet wide.
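To make "several feet wide" concrete, consider a 24-megapixel frame viewed pixel-for-pixel on an ordinary monitor (my figures, for illustration only):

```python
# Size of a 6000-pixel-wide image displayed at 1:1 on a ~96-ppi screen.
image_width_px = 6000
screen_ppi = 96

width_in = image_width_px / screen_ppi
print(round(width_in / 12.0, 1))  # ≈ 5.2 feet wide
```

Any defect you can only see at that magnification is a small one indeed.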

This is not the best Sigma can do. I am told that their 105-mm f/1.4 Art lens was designed with astrophotography in mind. It costs $1600, so for now, I'll stick with the one I got used for $200. (The 105/2.8 also lives on as a DG OS lens, adding optical image stabilization, around $500.) Sigma is positioning itself as a maker of excellent lenses; in tests, Sigma products compete well against Canon, Nikon, and Zeiss. Also, $1600 is not absurd when you consider what a decent telephoto lens cost in 1970, and scale for inflation. We've gotten used to reasonably good lenses that cost a lot less; but, inflation-scaled, really good lenses cost no more than ever, and they are sharper than ever before.
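The inflation arithmetic is easy to sketch. The CPI index values below are approximate U.S. figures, supplied here for illustration:

```python
# Roughly what $1600 in 2019 corresponds to in 1970 dollars,
# using approximate U.S. CPI-U index values (illustrative figures).
cpi_1970 = 38.8
cpi_2019 = 256.0

factor = cpi_2019 / cpi_1970
print(round(factor, 1))            # ≈ 6.6
print(round(1600.0 / factor, -1))  # ≈ 240 (1970 dollars)
```

By that reckoning, a $1600 lens today costs about what a $240 lens did in 1970 — squarely in the range of good telephoto lenses of that era.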

NGC 6633 and star clouds


Here you see the star cluster NGC 6633, another cluster, and the edge of the Ophiuchus Milky Way, with dark nebulosity at the bottom.

Here and elsewhere, the cross spikes on bright stars are largely due to diffraction from crosshairs I added in front of the lens to help make the bright stars stand out. Some diffraction spikes are also due to the lens's diaphragm.

This was taken as a test of how sharply the 105/2.8 renders stars, and also of how well a PEC-corrected AVX mount will track the stars with this lens without guiding corrections (answer: perfectly for at least 4 minutes). This is a stack of eight 4-minute exposures, Sigma 105/2.8 at f/4, Nikon D5500 H-alpha modified, ISO 200.
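Stacking pays off because random noise averages down. A quick idealized check (the standard shot-noise rule of thumb, ignoring read noise and calibration — not anything measured from this image):

```python
import math

# For N equal subexposures, signal adds N-fold while random noise adds
# as sqrt(N), so signal-to-noise ratio improves by sqrt(N) (idealized).
n_frames = 8
snr_gain = math.sqrt(n_frames)
print(round(snr_gain, 2))  # ≈ 2.83
```

So eight 4-minute frames behave, noise-wise, roughly like one 32-minute exposure, without the tracking risk of a single long exposure.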

The Veil, through a veil, darkly


I've had the modified Nikon D5500 over two months and, until June 30, had not used its superpower, its enhanced ability to see red nebulae due to the filter modification. I was waiting for the earth's orbital motion to bring the Milky Way into the evening sky, and also, often, waiting for good weather.

But here's an example, in poor weather (the sky was hazy). This is the Veil Nebula in Cygnus: a stack of four 4-minute exposures, Sigma 105/2.8 at f/4, AVX mount, Nikon D5500 H-alpha modified, ISO 200. You are looking at only the central part of the picture.

To compare how this nebula looked with an unmodified camera (under better conditions), click here. This nebula emits both blue and red light, and the other camera only picked up the blue.

The nebulae around Gamma Cygni


Still photographing through hazy air, I ended the session by taking a series of nine 4-minute exposures of the nebulae around Gamma Cygni. (Same setup and camera as above.) You may recall that this field was what I looked at through binoculars the very first time I observed the sky, over 50 years ago, although of course I couldn't see nebulae, only stars. It remains one of my favorite parts of the sky.

(When you click through, note that the other picture does not have north straight up, but this one does.)


Eclipse, from a distance

I'm glad my friends in Chile had excellent weather for today's solar eclipse. I didn't go (though I turned down an opportunity; I just wasn't sure I could get away). Instead, I used the Internet to view the European Southern Observatory's live video feed while I worked...


Advantages of having a two-screen computer...

And I'd like to point out something. There were no sunspots today. And how often do we get to see a solar eclipse on a day when there are no sunspots? Because of low solar activity, the corona was unusually smooth and symmetrical, not jutting out in random directions. Here's a video frame from the ESO:


Of course, more coronal structure will be visible in high-dynamic-range pictures that I have not yet seen. But I think pictures from today will become the textbook example of an eclipse at a time of low solar activity.

Jupiter is still at it


The Great Red Spot continues to shrink and spin off material. This isn't a great picture, and none of this season's pictures are going to be great because Jupiter is low in the sky; it's on the side of the solar system that the earth's axis is tilted away from, at least as seen from the Northern Hemisphere. My friends in the tropics and in Australia are having much better luck.

Stack of the best 50% of about 3000 video frames, C8 EdgeHD, 3× extender, ASI120MC-S camera, sharpened with RegiStax.

If what you are looking for is not here, please look at previous months.