Sunday, April 4, 2021

First Speckle Data is Confusing

I took my first speckle data last night (03 Apr 2021).  I got data on Sirius (looking for Sirius B) and Rigel (also a close binary).  While collecting the data and watching the screen, I saw something strange but I pressed on and collected it all.

When I was done I immediately started looking at the data closely.  What I saw was this typical 33 millisecond 'color' image (taken with my ZWO ASI385MC):


Let me explain this image a little more....

This is a very short exposure -- 33 milliseconds -- of Sirius.  An exposure that short effectively 'freezes' the atmosphere, so for this instant we see the extent and character of the distortion.  Above the atmosphere, Sirius would land on essentially a single pixel.  Instead, we get this complicated blob.  This distortion is very dynamic and causes what astronomers call 'seeing'.

There are two kinds of distortion.  The first affects the amplitude of the wavefront; the second affects its phase.  Amplitude modulation caused by atmospheric distortion is called 'scintillation', often known as 'twinkling'.  Phase modulation causes 'seeing' and the complex structure you see in this image.  As the wavefront moves through the atmosphere, differences in the refractive index of the air (caused by temperature differences, humidity, other aerosols, wind speed and direction, etc.) distort the wavefront and throw it out of phase.  This distorts the image spatially in the very complex pattern you see.

The Airy rings (partial, in this case) are caused by the telescope optics -- you can hopefully see these rings surrounding a central round blob.  That round blob is the diffraction-limited image of Sirius, as it would appear if there weren't any atmosphere.  Speckle interferometry takes advantage of these frozen speckle patterns and attempts to reconstruct a high-resolution image from many short exposures (I have 3000 frames; at 30 fps that's 100 seconds).  In this case I'm hoping to see the white dwarf Sirius B.
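The reconstruction idea above (Labeyrie's classic approach) can be sketched in a few lines of NumPy.  This is just a toy illustration, not my actual processing pipeline: the atmosphere randomizes the *phase* of each frame's Fourier transform, but averaging the *power spectra* of thousands of short exposures preserves diffraction-limited information -- a binary star shows up as cosine fringes in the averaged power spectrum.  The frame size, separation, and brightness ratio below are made-up numbers for the demo.

```python
import numpy as np

def mean_power_spectrum(frames):
    """Average the 2-D power spectra of many short-exposure frames.

    Each frame's FFT phase is scrambled by the atmosphere, but |FFT|^2
    discards phase, so the average retains high-spatial-frequency
    information -- e.g. a binary's separation, encoded as fringes.
    """
    acc = np.zeros(frames[0].shape, dtype=float)
    for frame in frames:
        acc += np.abs(np.fft.fft2(frame)) ** 2
    return acc / len(frames)

# Toy demo: a "binary star" (two points, 5 px apart in x, 2:1 brightness)
# whose position jitters randomly from frame to frame, standing in for seeing.
rng = np.random.default_rng(0)
n = 64
frames = []
for _ in range(100):
    img = np.zeros((n, n))
    y, x = rng.integers(20, 40, size=2)   # random jitter ~ atmospheric motion
    img[y, x] = 1.0
    img[y, x + 5] = 0.5                   # the fainter companion
    frames.append(img)

ps = mean_power_spectrum(frames)
# The random jitter cancels out: ps shows clean fringes with period n/5
# along the x-frequency axis, encoding the 5-pixel separation.
```

In a real reduction you'd also divide by the power spectrum of a nearby single reference star to remove the atmospheric and telescope transfer function, but the core trick is just this averaging.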

A very important part of speckle interferometry is to do the observations using a narrow-band filter.  I just so happen to have a Hydrogen-Beta filter that's pretty narrow band (25 nanometers, 250 Angstroms).  It's a wonderful blue and, in fact, my favorite color.  The 'color' image above doesn't do it justice.
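The reason the narrow band matters is that speckle positions scale linearly with wavelength, so a wide passband radially smears every speckle away from the image center.  A quick back-of-the-envelope, using my filter's numbers (the 40-pixel radius is just an illustrative value):

```python
# How much does a 25 nm passband smear speckles?
# Speckle positions scale with wavelength, so a speckle at radius r
# from the image centre smears radially by roughly r * (dlambda / lambda).
lam = 486.1    # H-beta centre wavelength, nm
dlam = 25.0    # my filter's passband, nm
frac = dlam / lam
print(f"fractional bandwidth: {frac:.3f}")            # about 0.051, i.e. ~5%

r = 40  # example: a speckle 40 pixels from centre
print(f"radial smear at r = {r} px: {r * frac:.1f} px")  # about 2 px
```

So even this "pretty narrow" filter smears the outer speckles by a couple of pixels; a truly broadband image would wash them out entirely.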

So here's the problem I have with this image: why am I seeing 'blue' parts and 'white' parts?????

Here's another image:


I'm completely baffled by this.  The only 'color' I should see is BLUE, since that's the only wavelength I'm (supposedly) letting through with the H-Beta filter.  The white pixels (which mean red, green, AND blue light are all getting through!) shouldn't be there -- or at least I don't think so! (please see update 2021.04.06 below)

Assuming that it's correct, what would cause this?

Assuming that it's NOT correct, what would make it happen?

So what to do?  I thought about putting my eyeball on it with the hope of seeing something I'm not seeing with the camera.  If I don't see it with my eyeball, then I know it's a camera feature and I can focus on that.  But if I DO see the same blue/white thing, then what????  Well, the problem isn't in the camera, then.  Before the sensor is the filter.  Then the barlow, then the diagonal, then the secondary, then the primary, then the corrector plate.

Could the filter be doing this?  Maybe.  Could it be leaking light at an offset?  Some kind of optical flaw?

The barlow is super simple -- nothing there.

The rest of the optical path is fine, as far as I can tell visually.

So what is going on???

I guess if I put an eyeball on it with and without the filter, I can eliminate that.  I could also eliminate the diagonal, but I'd have to refocus.  So that's what I'll do tonight (4 Apr 2021).

I'd appreciate any thoughts anyone has....

Update: clouds tonight.  Maybe tomorrow???

Update 2021.04.06

I'm pretty sure what I'm seeing in these images are two different speckle patterns: blue and white.  Notice how there are more blobs in the blue than in the white?  That's exactly what I'd expect using a narrow-band filter versus a wide-band filter.  A wide-band leak would also produce the white light.  What I still don't understand is the relative offset -- why is blue on the left and white on the right?  I'm still baffled about how this could happen.
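One way to test the two-pattern idea quantitatively would be to split the debayered frames into channels and compare where the blue-only pixels sit versus the "white" (all-three-channels-bright) pixels.  This is just a diagnostic sketch, not something I've run on the real data -- the threshold and the toy frame below are made up -- but comparing the two masks' centroids across many frames would show whether the white pattern really sits consistently offset from the blue one:

```python
import numpy as np

def split_blue_vs_white(rgb, thresh=0.2):
    """Given a debayered frame as an (H, W, 3) float array in [0, 1],
    flag pixels that are blue-dominated versus pixels where all three
    channels are bright ('white').  thresh is an arbitrary cutoff."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    blue_only = (b > thresh) & (r < thresh) & (g < thresh)
    white = (b > thresh) & (r > thresh) & (g > thresh)
    return blue_only, white

def centroid(mask):
    """Mean (row, col) of the True pixels, or None if the mask is empty."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return ys.mean(), xs.mean()

# Toy frame mimicking what I see: a blue blob on the left,
# a white blob offset to the right.
frame = np.zeros((32, 32, 3))
frame[10:14, 5:9, 2] = 1.0       # blue-only region
frame[10:14, 20:24, :] = 1.0     # white region (R = G = B)

blue_mask, white_mask = split_blue_vs_white(frame)
print(centroid(blue_mask), centroid(white_mask))  # offset shows up in the x centroid
```

If the centroid offset is the same in every frame and scales with nothing atmospheric, that would point at something fixed in the optical path (a filter leak at an angle, say) rather than the sky.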