Saturday, September 3, 2016

Loss of the Night Sky

This is not my usual computer geek blog; this post is on the theme of

"It should be dark at night"

Earth at Night


NASA has published this famous picture of what the dark side of the Earth looks like from space. This picture shows how bright the cities of the world are at night, and how much of the developed world is lit up 24/7. (A really large version of this map, 10 MB, is here; the biggest, 40 MB, is here.)

How many features of the Earth can you see without sunlight? Find the Nile River, the Hawaiian Islands, and the border between North and South Korea.

A lot of places 'left the lights on'.

Light Pollution

All of these lights cause light pollution, light that obscures the night sky.

Here is an interactive map of world light pollution. Light pollution has been put on a color scale: red is like Times Square, NYC; black is Point Nemo. Where do you live? Orange, yellow, maybe green. [Map from Falchi et al.]

[Good background, nice tweet, good references for mapping light pollution on the ground.]

Years ago a scale was developed for the apparent magnitude of stars in the night sky. The scale is logarithmic (read the link for all the maths :) ). The limit of human vision is about 6.5 on this scale. Just over 9,000 stars can be seen by the human eye if the sky is dark enough (the article has a lot more detail on the exact count and where that count is valid).
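To make 'logarithmic' concrete, the standard Pogson relation between two stars' magnitudes and their brightnesses (textbook math, not something from the article linked above) is:

m_1 - m_2 = -2.5 \log_{10}(F_1 / F_2)

Every 5 magnitudes is a factor of 100 in brightness, so a star at the 6.5 limit of human vision is roughly 250 times dimmer than a bright magnitude 0.5 star.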

Add light pollution to obscure the dim stars, and with a limiting magnitude of 4 the number of stars you can see drops to only HUNDREDS.
Here's a great set of charts showing how the constellation Cygnus the Swan changes as the limiting magnitude goes from 0 to 6.

And here is an excellent demonstration of the difference between two areas with different levels of light pollution.

Milky Way

Have you ever seen the Milky Way, horizon to horizon? Seeing it makes you understand why the ancient Greeks called it gala, 'milk'.

All ancient civilizations have stories on how the Milky Way was created or what it represents. In the modern world, with our night skies, that connection is lost.

Citizen Science, Apps, No Telescope Needed

There is a great mobile phone app, "Loss of the Night" (iTunes here). You become a citizen scientist. The program guides you through locating dimmer and dimmer stars until it determines how dark your sky is. It takes 15 or so 'sightings' to get enough data for a reasonably accurate measurement. The authors have a blog that shows the results; good stuff. They are developing a map of how the night sky is changing, literally from the ground up. See this for more details.

Another, more manual, citizen science project, one that has been running longer than 'Loss of the Night', is Globe at Night. Simple 5-step directions are here.

Dark Sky Reserves and Parks

The International Dark-Sky Association (IDA) has started declaring areas Dark Sky Reserves. Getting an area designated as such a site is a rigorous process; only 11 such sites are listed.

A lower designation is a Dark Sky Park. The US has many of these, associated with its National Park system.

Finally, a Dark Sky Sanctuary is like a Dark Sky Reserve, but remote, with limited access.

It is a fact of modern times that one must create a park to see the night sky as it was ONLY 100 or so years ago, just outside living memory.

Man Made Lights and Lighting Science

The culprit in the loss of the night sky is the electric light. The IDA has an excellent page on outdoor lighting and what communities can do to "bring back the night".

One result of light pollution is the brightening of the night sky itself, known as Sky Glow.

Another excellent resource on lighting is The Lighting Institute at Rensselaer Polytechnic Institute. The Lighting Institute conducts a two-day course on outdoor lighting, and much more. It is a great resource for municipal public works departments, planning and zoning committees, and state departments of transportation (yes, USA-centric terminology).

Look Up, Measure Your Night Sky, Report It, Get Involved


Lots of links in this article. Lots of people to talk to.

Learn about outdoor lighting.

Learn how you can educate local government officials about the loss of a NATURAL RESOURCE, the night sky. They can control lighting issues with local zoning.

Many US states are passing laws regarding the lighting of highways and roads, state buildings, etc.

Bring the Milky Way back, for good!



Friday, July 8, 2016

Software Archaeology

This post discusses what a software engineer, me, has to do when a program has suffered years of neglect.

I work in the embedded systems space, so this post will talk about embedded programs: not Windows, not Unix, but embedded programs. Some were written exclusively in assembly, some in C. Most run with no threads or other OS assistance.

Definition: Software Archaeology - the investigation, research, documentation, and rewriting needed to gain a meaningful understanding of long-abandoned or neglected software programs.

What causes a program to be abandoned or neglected? Why is archaeology required in the first place?

The programs I have worked with were written in the early '90s. Standard software practices have improved since then. Many projects today are run better than they were back then, but many are still built the same way they were 20 years ago.

Software consultant Joe, talking to fellow consultant John:

"How's the new assignment going?" asked John. "Oh, they're writing legacy code," replied Joe.

When software is written, it combines 1) the author's domain knowledge, and 2) the author's understanding of the underlying hardware.

The software is constrained by how well the underlying hardware can accomplish the task to be completed. The software is also constrained by the author's knowledge of the domain problem that is the source of information for the task. The author then brings their personality, experience, drive, and insight to the writing of software.

Software is the art and science of translating human goals into a language where a computer can perform the task expounded in the goal.

What do I find when I read code from another era? I find the remnants of enough of the domain and programming language knowledge to do the job, but no more.

I took a computer languages course in 1976. I was introduced to Algol, PL/1, Snobol, and APL. I do not use any of these languages today. I don't know who does. I learned and used FORTRAN in other courses, which is still widely used in numerical computing applications. C was just starting to be used in research labs.

If I had to resurrect a program from that era, I would have to learn, to a certain extent, the actual computer language, its syntax and nuance, to understand how the program functioned.

Sometimes the source code is not even necessary. Sometimes all that is required is a complete definition of the inputs and outputs of the program. This is probably a simple program, but if you know that a certain list of numbers goes into a program, a set of operations is performed on that list, and a new set of numbers comes out, then any programming language that handles those inputs and outputs can be used to satisfy the task of converting the input to the output.
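As a purely hypothetical illustration (none of these names or numbers come from a real recovered program), suppose the old program read raw counts on its input and wrote scaled values on its output. Any language that honors that contract can replace it; a minimal sketch in C:

#include <stdio.h>

/* Hypothetical replacement for an old number-crunching program.
 * The contract it must honor: read one integer count per input line,
 * write one scaled value per output line. The scale factor 0.25 is
 * invented for this sketch. */
int main(void)
{
    int raw;

    while (scanf("%d", &raw) == 1)
        printf("%.2f\n", raw * 0.25);

    return 0;
}

As long as the new program maps every input to the same output as the old one, the language and internal structure are free choices.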

The constraint on recovering an old program is that its inputs and outputs are still in place; they cannot be changed. What is missing is the exact definition of those inputs and outputs. The program 'knows' what they are, but the complete definition lives only in the code.

Unfortunately, the program can't expound on its nuances or give background on what the author was thinking. Comments help, when they are present.

The techniques of refactoring are the ones that will provide the most insight, with the best chance of documenting the inputs and outputs.

The other difficulty with working with old programs is the tools. With each generation of processors comes a new generation of tools.

In the '90s, the methods by which one debugged an embedded program were either very primitive or very sophisticated, with not much in between. The primitive method was to output RS-232 messages to display the current state of the code. Each output would reveal the changing state; analysis would then determine what might be wrong. The very sophisticated, and thus very expensive, method was to use an In-Circuit Emulator, or ICE.
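To show what that primitive method looked like, here is a minimal sketch of '90s-style serial debug output. The UART register addresses and status bit are invented for illustration; every board had its own:

/* Minimal sketch of RS-232 debug output on a bare-metal target.
 * The register addresses and TX_READY bit are hypothetical. */
#define UART_TX     (*(volatile unsigned char *)0xFFFF8000)
#define UART_STATUS (*(volatile unsigned char *)0xFFFF8001)
#define TX_READY    0x01

static void debug_putc(char c)
{
    while (!(UART_STATUS & TX_READY))
        ;                   /* busy-wait: this is what disturbs timing */
    UART_TX = (unsigned char)c;
}

static void debug_puts(const char *s)
{
    while (*s)
        debug_putc(*s++);
}

/* Calls like debug_puts("state=RX_WAIT\r\n"); were sprinkled through
 * the code to reveal the changing state. */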

Memory was expensive in the '90s. Embedded processors did not have cache. Programs ran from read-only memory, which might have been PROMs, EPROMs, or Flash. The processor would have breakpoint capability, but only if the memory location could be changed to an 'illegal instruction' to cause a jump to the interrupt handler that provided the debugging support. This only worked if the program was running in RAM; inserting an illegal instruction into ROM is impossible. The same mechanism is used today for software breakpoints. Hardware breakpoints were nowhere to be seen.
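A sketch of that mechanism, assuming the code runs in RAM. The opcode 0x4AFC is the Motorola 68000's ILLEGAL instruction, used here only as an example; other CPUs use different trap opcodes:

/* Setting a software breakpoint by patching an illegal instruction
 * over the real one. Works only when the code lives in RAM. */
typedef unsigned short insn_t;          /* 68000 opcodes are 16 bits */

#define ILLEGAL_OPCODE ((insn_t)0x4AFC) /* 68000 ILLEGAL instruction */

static insn_t saved_insn;

void set_breakpoint(volatile insn_t *addr)
{
    saved_insn = *addr;       /* remember the real instruction        */
    *addr = ILLEGAL_OPCODE;   /* patch in the trap; impossible in ROM */
}

void clear_breakpoint(volatile insn_t *addr)
{
    *addr = saved_insn;       /* restore before resuming execution    */
}

/* Executing ILLEGAL_OPCODE vectors the CPU to the illegal-instruction
 * handler, which hands control to the debug monitor. */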

The ICE provided a way for a host computer, a PC, to substitute RAM for the target's ROM, as well as take over the clock operations of the processor, allowing the user to watch the processor step through each instruction in as much detail as desired.

Breakpoints are essential.

RS-232 output disturbs the timing of the program and uses up precious memory. The ICE was an emulator, and thus provided the debugging functions without rewriting code or using any additional memory.

If the program was neglected, then the tools have been neglected too. The ICE unit may no longer power on, if it can be found at all.

The history of how the author came to write the code is lost. The author learned the nuances of the language and the hardware, most likely by trial and error. That history is not documented.

All in all, a puzzle.

That's another good definition of software archaeology: the study of puzzles created by time and neglect.