
CRUSH: SCUBA-2 data reduction

Introduction

As of version 2.30 (September 2015), CRUSH includes renewed and thoroughly reworked support for SCUBA-2. Unlike the old, separate, limited, and proprietary module, the new support is included freely, without restrictions or binding agreements. It now falls under the same GNU General Public License (GPLv3) as the rest of CRUSH.

The new, bundled module has been thoroughly rewritten to support imaging with all subarrays, and comes with many improvements and refinements. This document aims to get you started, so you are just minutes away from reducing your own SCUBA-2 data...

It is assumed that you are already familiar with the contents of CRUSH's main README (inside the distribution directory), especially its Section 1 (Getting Started). If you run into difficulty following this document, you will probably find it useful to study the main README a little longer.

 

Computing requirements

CRUSH does not require any special computing hardware to reduce SCUBA-2 data. You just need a run-of-the-mill PC or Mac (laptop, desktop, or server) with a few GB of RAM (to fit the SCUBA-2 data), running any flavor of UNIX (such as Linux, BSD, or Mac OS X) or Windows. (CRUSH will also run on ARM-based platforms, but don't expect blazing speeds on these mobile-class devices...)

You will also need Java 6 (a.k.a. 1.6.0) or later on the machine, but you probably have that already...
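
If you are unsure, you can quickly check which Java is installed from a terminal, e.g.:

$ java -version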

Your main limitation will likely be the RAM available, which will limit how many scans you can reduce at once. But, you might be surprised just how far a few GB will get you...

 

Performance

CRUSH is lean and mean when it comes to memory use and reduction speed. It is also thoroughly parallelized, and will use all processing cores in your machine to maximum effect.

If your machine has N processing cores, you can expect to reduce a minute of typical 850 μm SCUBA-2 data (4 subarrays per imaging band) in about one N-th of a minute (excluding OS and processor overheads, and disk reads & writes). Thus, with a dual-core i5 mobile CPU, my Lenovo T430s laptop churns through a minute of 850 μm SCUBA-2 data in around 30 seconds. An i7 desktop with 4 cores should get there in 15 seconds, while a 12-core Xeon processor should clock in at around 5 seconds of reduction time per minute of observation (excluding system overheads!). Imaging at 450 μm will typically take about twice as long.

You might find the best performance when reducing multiple scans together (especially when the number of scans is greater than the number of virtual cores on your machine), since the overhead of sharing data between threads is then effectively minimized. (This is especially true for some high-end servers with dual multicore CPUs...)

 

Setup and personalization

Start by installing CRUSH, as instructed by the main README.

Starlink

CRUSH will readily read SCUBA-2 data in FITS format, when these are available. More typically, though, your SCUBA-2 data comes in the NDF (.sdf) format of the Starlink suite, and so you will need to convert your files to FITS for use with CRUSH. Fortunately, CRUSH can make the conversion seamless, as long as the necessary Starlink tools are accessible. (Once you have FITS files, you no longer need Starlink, or the original .sdf files.)
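
(For reference, a one-off conversion by hand might look something like the line below, assuming the usual in/out parameters of the Starlink ndf2fits task; the file name here is just a placeholder:

$ $STARLINK_DIR/bin/convert/ndf2fits in=myscan.sdf out=myscan.fits

With the automatic conversion set up as described here, however, you should not normally need to do this yourself.)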

To set up the automatic NDF to FITS conversion, first make sure the Starlink suite is installed on your machine. (You only really need the ndf2fits tool included in Starlink...). You can now continue personalizing CRUSH.

Your personal settings

Start by creating (or editing) .crush2/scuba2/default.cfg in your home directory (this file will contain your personalized configuration entries). E.g. on UNIX and Mac OS:

$ mkdir -p ~/.crush2/scuba2

Then create/edit 'default.cfg' therein, e.g. with 'nano':

$ nano ~/.crush2/scuba2/default.cfg

(Replace nano with your preferred editor, such as vi or emacs or gedit.) In this file you can specify your default values for the SCUBA-2 scan data directory (datapath), and where CRUSH should write its output files, such as the reduced images (outpath). You might also want to specify where the Starlink ndf2fits conversion tool is. E.g.:

datapath /home/data/scuba2
outpath ~/images
ndf2fits /usr/local/Starlink/bin/convert/ndf2fits

(The ndf2fits setting may not be needed if you already have FITS files, or if the STARLINK_DIR environment variable points to your Starlink distribution folder...). Also, make sure the output directory actually exists. :-)
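
For example (in a bash-like shell, and with the path below being merely illustrative), you could point CRUSH to your Starlink installation via that environment variable instead:

$ export STARLINK_DIR=/usr/local/Starlink

(Add this line, without the '$', to your shell startup file, e.g. ~/.bashrc, to make it permanent.)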

You can also specify and override any of these settings ad-hoc on the command line later. E.g.:

$ crush scuba2 -datapath=/home/myData -outpath=../myImages [...]
 

Reducing SCUBA-2 data

Now, let's reduce some data. Suppose you want to reduce pointing scan 62 from 2015-06-30, at 850 μm:

$ crush scuba2 -850um -date=2015-06-30 62

(The -850um or -450um options select the imaging band for the reduction — CRUSH will default to 850um imaging if the imaging band is not explicitly specified.)
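
For example, to reduce the same pointing scan at 450 μm instead, you could use:

$ crush scuba2 -450um -date=2015-06-30 62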

You may have to prepend the path to crush when invoking. E.g. use

$ ./crush [...]

(if running crush from within its own directory), or

$ ~johndoe/crush/crush [...]

After the reduction completes, you can look at your images with CRUSH's own show tool. E.g.:

$ show ~/images/URANUS.20150630.62.850um.fits

And, you can post-process output images using imagetool (and also show, which takes all of imagetool's options). E.g.:

$ imagetool -smooth=7.5 ~/images/URANUS.20150630.62.850um.fits

To see a help screen with the available image processing options, run

$ imagetool -help

or check out the online manuals on the CRUSH site. Of course, you may also use your own favorite FITS viewer/tool, if you prefer.

 
Figure 1. Uranus scan 62 observed on 2015-06-30, reduced by CRUSH, and the 450 μm image displayed with show after smoothing by a 5" FWHM Gaussian kernel. Full linear scale with colors (left panel), and a stretched flux image showing sidelobes and a faint, ~200 mJy/beam (0.2% level), ring in the PSF at a radius of ~190" from the central peak (right panel). Both images are roughly 8 by 8 arcminutes.
 

Pointing and calibration

For pointing scans, CRUSH will automatically perform a pointing fit at the end of the reduction, complete with peak and integrated (aperture) flux measures.

Suppose you find that the pointing was off by -1.2, 4.6 arcsec (in Az/El). And, suppose the flux (peak or integrated) you found was too low by a factor of 1.14. You can then apply these corrections (to this scan, or others) as:

$ crush scuba2 [...] -pointing=-1.2,4.6 -scale=1.14 [...]

Now, you are ready to reduce some science scans from the same day (say scan numbers 63-66 and 70 with the above pointing, and 81-83 with a different pointing):

$ crush scuba2 [...] -pointing=-1.2,4.6 63-66 70 -pointing=2.3,1.8 81-83

Similarly, you can apply different calibration scalings to the different scans. In general, an option applies to all scans listed after the point where it is set, redefined, cleared (using -forget), or blacklisted (via -blacklist) on the command line. Thus, global options should come up front, before any of the scans are listed...
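
For example, different flux scalings could be applied the same way (the second scale factor here is merely an illustrative value):

$ crush scuba2 [...] -scale=1.14 63-66 70 -scale=1.08 81-83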

CRUSH comes with a rough built-in integrated (aperture) flux calibration for SCUBA-2, with around 10% rms blind calibration error. It is based on a handful of primary calibrator scans on Uranus and Neptune, and the temperature model in GILDAS/ASTRO. It is possible that with a more thorough analysis and a larger representative sample of calibrator scans, the blind calibration may be improved. If you would like to help improve the blind calibration, please contact Attila Kovács (attila[AT]submm.caltech.edu).

 

Source brightness

The examples above produce a default reduction, which is meant to work with a wide variety of sources (brightness and extent), but you may do better than that for your particular target. For example, if your source is faint you may use the -faint option. For deep fields (very faint point sources) you can use -deep which will aggressively filter the large scales... E.g.:

$ crush scuba2 -faint [...]

or

$ crush scuba2 -deep [...]

For very bright sources (which might be flagged excessively by the default settings) you may try -bright.
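
Following the same pattern, e.g.:

$ crush scuba2 -bright [...]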

You can find more details on pointing and calibration in the sections below. A more detailed general overview is in the main README (e.g. the Quick Start guide there, and the various advanced topics).

 

Extended sources

By default, CRUSH is optimized for reducing compact sources, and to provide the cleanest possible images. This works well up to scales of 2 to 4 arcmins across (depending on how bright the source is). If you have a more extended source, you may try the extended option. E.g.:

$ crush scuba2 -extended [...]

or

$ crush scuba2 -faint -extended [...]

(I.e., extended may be combined with other options, like faint, to tune the reduction for your particular source.) The extended option will recover more of the large scales (up to the FoV or, for bright sources, beyond it; see the corresponding section in the main README), but you will pay the price of reduced sensitivity (increased noise, especially on the large scales).

You can further fine-tune extended mode reductions by specifying an approximate source extent (FWHM) via the sourcesize option (in arcsecs), and by changing the number of iterations via rounds. E.g.:

$ crush scuba2 -extended -sourcesize=300 -rounds=20 [...]

(The above will set the expected source size to 300" FWHM and will iterate 20 times.) The larger the sourcesize, and the more you iterate, the more extended emission you will recover, but you will also see more and more noise, especially on the larger scales (such as a wavy background). You need to make your own call as to what is best.

You can find more information on the recovery of extended emission in the main README, under the Advanced Topics section.

 

Compact sources

For compact sources (≤ 2') you may get a cleaner image if you remove gradients from the exposures. The rationale is that the gradient removal will reject sky-noise to first order. It will, however, also toss out any structure that is comparable to, or larger than, FoV/2, whereas structures much smaller than that will remain largely unaffected.

Gradient removal is the default for faint and deep mode reductions. If you want to turn on gradient removal otherwise, you can do so via the gradients flag. E.g.:

$ crush scuba2 -gradients [...]
 

Smoothing images

You have the option to smooth your images via the smooth option. Beware, however, that CRUSH typically applies one smoothing for intermediate maps (to increase redundancy between iterations), and another for the final output map. Therefore, to smooth the output map by a 6" FWHM Gaussian, you will want to use:

$ crush scuba2 -final:smooth=6.0 [...]

(The prepended -final: directive instructs crush to apply this setting in the final iteration, overriding whatever setting was specified for it before.)

Beyond specifying the size (FWHM) of the smoothing kernel, you can also set one of: minimal, halfbeam, 2/3beam, or beam to specify the smoothing kernel relative to the beam size (minimal is 1/3 of a beam). E.g.:

$ crush scuba2 -final:smooth=halfbeam [...]

Default reductions are unsmoothed, whereas faint images will be smoothed with 2/3beam, and deep images will be beam smoothed, by default.

To disable smoothing your output image, you can use:

$ crush scuba2 -final:forget=smooth [...]

To learn more about the effects of smoothing, please refer to the section Pixellization and Smoothing under Advanced Topics in the main README.

 

Closing remarks

The SCUBA-2 support in CRUSH is not a sanctioned, or official, SCUBA-2 reduction suite. It was created essentially as a "hobby" project. (I can only hope that I'll be paid for it in kind... :-) It is provided to you in the hope that it will be useful. It is far from being a finished product, and yet it should be quite capable, easy to use, and fast to run. If you find issues, or possible areas of improvement (such as calibration, or tweaking of the default configurations), then please do not hesitate to contact me about them.

With help from you and others, and by being tried on a multitude of data (i.e. far more than I have time or energy to reduce myself), CRUSH may evolve and become better and more powerful over time.

If you give it a go, and you like what you see, but would like more help getting your science data reduced more optimally, feel free to contact me (Attila Kovács) for further help and tips.

 

Credits

This re-incarnation of the CRUSH support for SCUBA-2 would not have happened without essential help from others. Thanks go to Tony Mroczkowski, for the persistent prodding to create this module and for seeking out the necessary training data for configuring it; and to Ed Chapin, without whom I would not have made enough sense of the data. If you like the outcome, you might want to invite them (and me) for a beer or two, if you happen to bump into them (or me). :-)

 
Copyright © 2015 Attila Kovács (attila[AT]submm.caltech.edu)