From: Milton Aupperle <milton@outcastsoft.com>

Date: July 28, 2008 2:35:29 PM MDT

To: Astro_IIDC@yahoogroups.com

Subject: Re: [Astro_IIDC] Monochrome???


Hi Dave;


On 28-Jul-08, at 1:24 PM, doodlebun wrote:

-Milton,


If I understand your explanation, the workflow you are recommending is:

1. Capture R, G, and B Monochrome movies with the DMK21AF04
2. Do NOT select USE MONOCHROME CHANNEL FOR SHARPNESS AND ALIGNMENT; instead just change MONOCHROME to RED (or G or B) for processing a movie through the red filter.


Yes; which color channel you select doesn't matter for monochrome cameras. So you can select "Red" and then run any mono (Luma, Red, Green or Blue) filtered movie through, and you will save a bit of processing time.


The reason would be to speed up the stacking in the step leading to display of the confidence setting histogram screen?!?


Actually it speeds up both the sharpness selection before the histogram and the alignment after the histogram. The biggest gain is on the Sharpness Estimation side, because there is very little computation needed now; the main limit there is just pulling the frames off the hard drive.
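
Just to illustrate why the single-channel path is so cheap (this is a toy Python / numpy sketch of the general idea, not Astro IIDC's actual code, and "sharpness_score" is just a made-up name): a per-frame sharpness score can be as simple as the variance of a Laplacian over one channel, which is a single quick pass per frame.

import numpy as np

def sharpness_score(channel):
    # Variance of a 3x3 Laplacian response; higher generally means sharper.
    lap = (-4.0 * channel[1:-1, 1:-1]
           + channel[:-2, 1:-1] + channel[2:, 1:-1]
           + channel[1:-1, :-2] + channel[1:-1, 2:])
    return lap.var()

frame = np.random.rand(480, 640)   # stand-in for one mono 640x480 frame
print(sharpness_score(frame))

With a mono filtered movie every frame is already a single channel, so there is no Bayer decode or channel mixing to do before scoring, and reading the frame off the disk ends up being the slow part.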


After you accept the subset of frames in the histogram screen, the stacking begins, which is rather slow. Are you saying the steps stated above will speed up that process, producing a finished stacked image ready for wavelet adjustments much sooner, with no downside to the technique?


It will speed up a bit, not hugely but somewhat. You might also want to experiment with a smaller "Pixel Matching Size" rather than 128x128, which will save a lot more time on the aligning side. I rarely go over 64x64 pixels unless I'm double or triple scaling the image.
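
To make the "Pixel Matching Size" point concrete, here is a toy Python / numpy sketch (not the program's own alignment routine) of a cross-correlation shift measurement on a small box. The work roughly scales with the box area, so a 64x64 box is on the order of a quarter of the work of a 128x128 box.

import numpy as np

def measure_shift(a, b):
    # Return (dy, dx) such that box a is roughly box b shifted by (dy, dx).
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

box = 64                                   # matching box size in pixels
ref = np.random.rand(box, box)             # patch around the feature in frame 1
frame = np.roll(np.roll(ref, 3, axis=0), -2, axis=1)   # same patch, shifted
print(measure_shift(frame, ref))           # about (3, -2)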


As an aside, I recently replaced the original Apple Matsui 5400 rpm hard drive in my G4 laptop with a 5400 rpm Western Digital Scorpion and doubled the throughput for sharpness estimation. The original hard drive was 150% slower than the $85 replacement. Adding the new drive and another 512 MB of RAM (1 GB total now) to this G4 laptop has made it a lot faster, especially with the resource hog that is Leopard.



NOW CONSIDER THIS THEORY OF MINE ABOUT COLOR VS MONOCHROME.


If you scan the net for planetary pictures taken with Imaging Source cameras you will notice only a tiny fraction of them will be taken with color cameras like the DBK or DFK series. 95% will use the monochrome with filters.


Actually, you will still see a lot of people using the older USB 1.1 Philips color cameras too, which are firmware "hacked" to deliver raw Bayer video out. The main reason people switched was that the DMK delivers much faster throughput. But among the TIS camera users, many do use the DMK mono series with filters.


Now here comes the heresy from Florida in the form of axioms.


1. The surface brightness of Saturn is rather dim, which makes a LRGB approach prudent.

2. The luminance frame of a Saturn image is much brighter and has higher resolution than the R, G, or B filtered image in the DMK21AF04. This makes the LRGB combo image the way to go.

3. The surface brightness of Jupiter is so high that taking Luminance frames is a waste of time. The resolution of a processed luminance image is no better than a processed red image.

4. A color camera like the DBK21AF04 will produce images of Jupiter that are indistinguishable from monochrome RGB combined images. Monochrome images must be rushed to finish in less than 3 minutes. Jupiter's moons like Io move so fast that the RGB images will never register over each other anyway, making the color camera the logical choice.


The color images coming from a DBK / DFK will be somewhat lower resolution than those from a DMK mono camera. The color cameras use a Bayer Color Filter Array (CFA) over the CCD, so each pixel on the sensor records a single R, G or B color. The array is generally laid out with alternating Red and Green pixels on one line and Green and Blue pixels on the next, so you wind up with twice as many Green pixels as Red or Blue pixels. Kodak is one of the big pioneers of digital color imaging:


http://en.wikipedia.org/wiki/Color_filter_array


http://en.wikipedia.org/wiki/Bayer_filter


The Green dominance approximates human vision, which is most sensitive to the green part of the spectrum; we get the bulk of our image detail from green too.


When software goes to display the color image, we use various algorithms to mix the adjacent color pixels to create the RGB image you see. Since adjacent pixels are used, the resolution is decreased somewhat; generally speaking you're using an average of the 4 adjacent pixels around edges.
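
If it helps to see the idea spelled out, here is a toy Python / numpy sketch of an RGGB mosaic and a plain bilinear (neighbor averaging) demosaic. It is only an illustration of the concept, not the actual algorithm any particular camera or program uses.

import numpy as np

def conv3(img, k):
    # 3x3 neighborhood sum weighted by k, zero padded at the borders.
    p = np.pad(img, 1)
    out = np.zeros(img.shape)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

h, w = 8, 8
# RGGB layout: R at (even, even), G at (even, odd) and (odd, even), B at (odd, odd).
r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
g_mask = 1 - r_mask - b_mask               # twice as many green sites

true_rgb = np.random.rand(h, w, 3)         # stand-in for the real scene
mosaic = (true_rgb[..., 0] * r_mask        # what the sensor actually records:
          + true_rgb[..., 1] * g_mask      # one color sample per pixel
          + true_rgb[..., 2] * b_mask)

# Bilinear demosaic: fill in each channel by averaging the nearby samples.
kg  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
krb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
recon = np.dstack([conv3(mosaic * r_mask, krb),
                   conv3(mosaic * g_mask, kg),
                   conv3(mosaic * b_mask, krb)])

Since every missing value is an average of its neighbors, fine detail gets smoothed compared with a true per-pixel mono image, which is where the resolution loss comes from.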


However, unless you've got phenomenal seeing, you're likely not able to make full use of each pixel anyhow. That is the main reason I rarely shoot with monochrome cameras: I live under the jet stream (with mountain outflows, inversions, etc.). In my situation, the time spent shooting 3 or 4 streams, processing them and then re-combining them is not worth it over an RGB Bayer camera.



   In conclusion:  


    Saturn:  use DMK21AF04 with LRGB filters

    Jupiter:  use a DMK21AF04 alone and color balance with Astro IIDC.


I think you meant "Jupiter : use a DBK21AF04 or DFK21AF04 camera with Astro IIDC", but I agree with the basic premise.


Another possibility, if money / cost were not an issue, would be to use a flip mirror and two cameras (a DMK21AF04 mono and a DBK21AF04 / DFK21AF04 color). You could capture Luma for the details with the DMK21AF04 and then shoot the color with the DBK21AF04 / DFK21AF04. That means you only need to capture 2 streams to get the full Luma plus RGB range, so fewer steps and less processing time too.
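
One simple way to fold the two stacks together (again just a toy Python / numpy sketch of the general LRGB idea, not Astro IIDC's own method) is to rescale the color stack so its luminance matches the mono stack:

import numpy as np

def lrgb_combine(luma, rgb, eps=1e-6):
    # Scale the RGB stack so its luminance matches the mono luma stack.
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    scale = luma / np.maximum(y, eps)
    return np.clip(rgb * scale[..., None], 0.0, 1.0)

luma_stack = np.random.rand(480, 640)      # stand-in for the stacked DMK result
color_stack = np.random.rand(480, 640, 3)  # stand-in for the stacked DBK/DFK result
print(lrgb_combine(luma_stack, color_stack).shape)

The detail then comes from the mono luminance, and the color camera only has to supply the much more forgiving chrominance.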


When I process my Bayer color movies, I select whichever color channel makes the most sense and has the greatest detail. For Luna, Saturn and Jupiter I generally stack on just the Green channel, but for Mars I will use the Red channel. Having a look at the image in different light reveals different features too. And the reason most people don't do the enhanced-color Luna images like I do with a Bayer color camera is that shooting 3 colors, combining them, and hoping the seeing was "the same" for all 3 is basically not likely to work out.


On a side note, as Allan pointed out, narrow-band imaging offers improvements, but with a Bayer RGB camera you're automatically "narrow band" filtering each pixel anyhow when you capture it. One thing I would love to see is, under average and good steady skies, what the actual improvement is when shooting Bayer versus 3 RGB colored filters.



Hope that helps..


Milton J. Aupperle