In Saturated pixels and Segment’s clumps, we described how you can run Segment such that saturated pixels do not interfere with its clumps.
However, due to the very extended wings of the PSF, the default definition of “objects” should also be modified for the scenario here.
To see the problem better, let's now inspect the
OBJECTS extension, focusing on the objects with labels between 50 and 150 (which include the main star):
$ astscript-fits-view sat-seg.fits -hOBJECTS --ds9scale="limits 50 150"
We can see that the detection corresponding to the star has been broken into different objects. This is not a good object segmentation image for our scenario: the objects over the outer wings of the bright star's detection harbor a lot of the extended PSF, and we want to keep them with the same “object” label as the star (we only need to mask the “clumps” of the background sources). To do this, we will make the following changes to Segment's options (see Segmentation options for more on these options):

--gthresh=-10: allow the true clumps to grow far into the noise (down to 10 times the sky standard deviation below the Sky), so the diffuse wings are absorbed into the grown clumps.

--objbordersn=0: consider any touching grown clumps as a single object, effectively disabling the splitting of a detection into multiple objects.
Let’s make these changes and check if the star has been kept as a single object in the
OBJECTS extension or not:
$ astsegment sat-nc.fits --convolved=sat-fill-conv.fits \
             --gthresh=-10 --objbordersn=0 \
             --output=sat-seg.fits --rawoutput
$ astscript-fits-view sat-seg.fits -hOBJECTS --ds9scale="limits 50 150"
Now we can extend these same steps to the whole image. To detect signal, we can run NoiseChisel using the command below. We have changed the default values of two of its options; the reasons for those changes are explained below. See Detecting large extended targets for more on optimizing NoiseChisel.
Furthermore, since both NoiseChisel and Segment need a convolved image, we will do the convolution once beforehand and feed it to both (to save running time). But with the first command below, let's delete all the temporary files we made above.
Since the image is large (+300 MB), to avoid wasting storage, any temporary file that is no longer necessary for later processing is deleted after it is used. You can visually check each of them with DS9 before deleting them (or not delete them at all!). Generally, within a pipeline it is best to remove such large temporary files, because space runs out much faster than you think (for example, once you get good results and want to use more fields).
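Incidentally, when such a pipeline is wrapped in a shell script, a `trap` on EXIT guarantees that temporary files are removed even if an intermediate step fails. This is a generic POSIX-shell sketch with hypothetical file names, not part of this tutorial's pipeline (which instead deletes each file as soon as it is no longer needed, to keep peak storage low):

```shell
#!/bin/sh
# Generic cleanup pattern for pipelines with large temporary files.
# The directory and file names below are hypothetical.

# Scratch directory for intermediate products.
tmpdir=$(mktemp -d)

# Delete the scratch directory when the script exits, even on failure.
trap 'rm -rf "$tmpdir"' EXIT

# Any intermediate product written here is cleaned up automatically,
# but remains usable by all later steps of the script.
echo "intermediate data" > "$tmpdir/step1.txt"
cat "$tmpdir/step1.txt"
```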
$ rm *.fits
$ mkdir label
$ astmkprof --kernel=gaussian,2,5 --oversample=1 \
            -olabel/kernel.fits
$ astarithmetic flat/67510.fits set-i i i 2200 gt \
                2 dilate 2 dilate 2 dilate 2 dilate nan where \
                --output=label/67510-masked-sat.fits
$ astconvolve label/67510-masked-sat.fits --kernel=label/kernel.fits \
              --domain=spatial --output=label/67510-masked-conv.fits
$ rm label/kernel.fits
$ astarithmetic label/67510-masked-sat.fits 2 interpolate-maxofregion \
                --output=label/67510-fill.fits
$ astarithmetic label/67510-masked-conv.fits 2 interpolate-maxofregion \
                --output=label/67510-fill-conv.fits
$ rm label/67510-masked-conv.fits
$ astnoisechisel label/67510-fill.fits --outliernumngb=100 \
                 --detgrowquant=0.8 --detgrowmaxholesize=100000 \
                 --convolved=label/67510-fill-conv.fits \
                 --output=label/67510-nc.fits
$ rm label/67510-fill.fits
$ astsegment label/67510-nc.fits --output=label/67510-seg-raw.fits \
             --convolved=label/67510-fill-conv.fits --rawoutput \
             --gthresh=-10 --objbordersn=0
$ rm label/67510-fill-conv.fits
$ astscript-fits-view label/67510-seg-raw.fits
We see that the saturated pixels have not caused any problem and the central clumps/objects of bright stars are now a single clump/object. We can now proceed to estimating the outer PSF. But before that, let’s make a “standard” segment output: one that can safely be fed into MakeCatalog for measurements and can contain all necessary outputs of this whole process in a single file (as multiple extensions).
The main problem is again the saturated pixels: we interpolated them to be the maximum of their nearby pixels. But this will cause problems in any measurement that is done over those regions. To let MakeCatalog know that those pixels should not be used, the first extension of the file given to MakeCatalog should have blank values on those pixels. We will do this with the commands below:
## First HDU of Segment (Sky-subtracted input)
$ astarithmetic label/67510-nc.fits -hINPUT-NO-SKY \
                label/67510-masked-sat.fits isblank nan where \
                --output=label/67510-seg.fits
$ astfits label/67510-seg.fits --update=EXTNAME,INPUT-NO-SKY

## Second and third HDUs: CLUMPS and OBJECTS
$ astfits label/67510-seg-raw.fits --copy=CLUMPS --copy=OBJECTS \
          --output=label/67510-seg.fits

## Fourth HDU: Sky standard deviation (from NoiseChisel):
$ astfits label/67510-nc.fits --copy=SKY_STD \
          --output=label/67510-seg.fits

## Clean up all the unnecessary files:
$ rm label/67510-masked-sat.fits label/67510-nc.fits \
     label/67510-seg-raw.fits
You can now simply run MakeCatalog on this image and be sure that saturated pixels will not affect the measurements. As one example, you can use MakeCatalog to find the clumps containing saturated pixels: recall that the --area column only counts non-blank pixels, while --geo-area counts all pixels of the label (independent of whether their values are blank):
$ astmkcatalog label/67510-seg.fits --ids --ra --dec --area \
               --geo-area --clumpscat --output=cat.fits
The information of the clumps that have been affected by saturation can easily be found by selecting the rows with differing values in the AREA and AREA_FULL columns:
## With AWK (the second command counts the number of rows)
$ asttable cat.fits -hCLUMPS | awk '$5!=$6'
$ asttable cat.fits -hCLUMPS | awk '$5!=$6' | wc -l

## Using Table arithmetic (compared to AWK, you can use column
## names, save as FITS, and it is faster):
$ asttable cat.fits -hCLUMPS -cRA,DEC --noblankend=3 \
           -c'arith AREA AREA AREA_FULL eq nan where'

## Remove the table (which was just for a demo)
$ rm cat.fits
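The '$5!=$6' AWK idiom above is general-purpose: it prints every row in which two given columns differ. Here is a self-contained toy demonstration on a plain-text table with made-up numbers, independent of Gnuastro:

```shell
# A toy whitespace-separated table: ID, AREA, AREA_FULL (made-up values).
printf '1 10 10\n2 8 12\n3 5 5\n' > toy.txt

# Print the rows whose 2nd and 3rd columns differ (the same logic as
# '$5!=$6' on the clump catalog above); only row 2 qualifies here.
awk '$2 != $3' toy.txt

# Count how many such rows there are.
awk '$2 != $3' toy.txt | wc -l
```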
We are now ready to start building the outer parts of the PSF in Building outer part of PSF.