The first step of this tutorial is to download the necessary input datasets. First, to keep things clean, let’s create a gnuastro-tutorial directory and continue all future steps in it:
$ mkdir gnuastro-tutorial
$ cd gnuastro-tutorial
We will be using the near-infrared Wide Field Camera 3 (WFC3-IR) dataset. If you already have these images in another directory (for example, XDFDIR, with the same FITS file names), you can set the download directory to be a symbolic link to XDFDIR with a command like this:
$ ln -s XDFDIR download
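To see how the symbolic-link approach behaves, here is a self-contained sketch using a scratch directory in place of XDFDIR (the `-demo` names and `example.fits` are purely illustrative, not part of the tutorial's data):

```shell
# Illustrative only: 'xdfdir-demo' stands in for your real XDFDIR,
# and 'download-demo' for the 'download' link above.
mkdir -p xdfdir-demo
ln -sf xdfdir-demo download-demo

# Files placed in the real directory are visible through the link,
# so later steps can read 'download/...' without copying anything.
touch xdfdir-demo/example.fits
ls download-demo/
```

The link costs no disk space and avoids duplicating the (large) FITS files.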
Otherwise, if the following images are not already present on your system, make a download directory and download them into it:
$ mkdir download
$ cd download
$ xdfurl=http://archive.stsci.edu/pub/hlsp/xdf
$ wget $xdfurl/hlsp_xdf_hst_wfc3ir-60mas_hudf_f105w_v1_sci.fits
$ wget $xdfurl/hlsp_xdf_hst_wfc3ir-60mas_hudf_f125w_v1_sci.fits
$ wget $xdfurl/hlsp_xdf_hst_wfc3ir-60mas_hudf_f160w_v1_sci.fits
$ cd ..
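Whichever route you took (symbolic link or fresh download), it is worth confirming that all three expected files are actually in place before continuing. A minimal sketch (file names taken from the wget commands above):

```shell
# Count how many of the three expected science images are under download/.
present=0
for f in f105w f125w f160w; do
  file=download/hlsp_xdf_hst_wfc3ir-60mas_hudf_"$f"_v1_sci.fits
  if [ -e "$file" ]; then
    present=$((present + 1))
  else
    echo "missing: $file"
  fi
done
echo "$present of 3 files present"
```

If anything is reported missing, re-run the corresponding wget command before moving on.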
In this tutorial, we will only use these three filters. Later, you may need to download more. To do that, you can use the shell’s for loop to download them all in series (one after the other) with a single command like the one below for the WFC3 filters. Use this command instead of the three wget commands above. Recall that all the extra spaces, backslashes (\), and new lines can be ignored if you are typing the commands directly on the terminal.
$ for f in f105w f125w f140w f160w; do \
    wget $xdfurl/hlsp_xdf_hst_wfc3ir-60mas_hudf_"$f"_v1_sci.fits; \
  done
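The loop works by splicing each filter name into the file-name template through the "$f" expansion. If you want to see exactly what it will fetch before committing to the downloads, a harmless dry run is to echo the URLs instead of passing them to wget:

```shell
# Dry run: print each URL the loop would fetch, without downloading.
xdfurl=http://archive.stsci.edu/pub/hlsp/xdf
count=0
for f in f105w f125w f140w f160w; do
  url=$xdfurl/hlsp_xdf_hst_wfc3ir-60mas_hudf_"$f"_v1_sci.fits
  echo "$url"
  count=$((count + 1))
done
```

Once the printed URLs look right, replace echo with wget to start the real downloads.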
Note that you only have one connection to the internet, so downloading the files in parallel will actually be slower than downloading them in series.