The Ohio State University, College of Mathematical & Physical Sciences, Department of Astronomy
This document describes how to run data through the OSU data reduction pipeline. It specifically refers to the version of the OSU pipeline in use since the 2001 Bulge Season, modified for the 2011 observing season. The pipeline has evolved somewhat from the original PLANET collaboration software, and includes a number of new features reflecting how our approach to running the observing network has evolved since 2000.
To review, the OSU pipeline is a suite of programs (C and Fortran) run by scripts (both Perl and csh). These programs include DoPhot and related programs, and a suite of binary photometry archiving software (the "Quy" package adopted by the old PLANET collaboration). All of the reduction scripts were rewritten at OSU to reflect how we organize our data reduction, provide error trapping, and streamline the process (these scripts run more quietly than the stock PLANET pipeline).
See the Quick Quick Guide for even less verbiage.
All data reduction takes place in event directories under ~microfun/Data/ on microfun (a large multi-TB storage and analysis system dedicated to the MicroFUN project).
The name of the directory is composed of 2 parts: an abbreviation of the observatory site and a short form of the event name. For example, images of microlensing event OGLE-BLG-2011-0123 taken by the ANDICAM with the CTIO 1.3-meter telescope at CTIO would be processed in an event directory named "CT13_OB110123". For images taken by the Auckland Observatory of the same target, the event directory would be "AO_OB110123". To see a list of site codes for the various active observatories, type
microsites
Starting in 2011, the OGLE-4 project is officially operating, and event names from OGLE-4 have 4-digit numbers; by mid-season 2011 we had crossed the 1000-event line. MOA is still using 3-digit event numbers, and for now we plan to stay with 3 digits for MOA unless they make a change.
The event directory name also becomes the rootname of all image files for that event from that observatory. For example, V and I images taken of OB110123 with the CTIO 1.3m telescope and ANDICAM would have names
CT13_OB110123I0001.fits
CT13_OB110123I0002.fits
...
CT13_OB110123V0001.fits
CT13_OB110123V0002.fits
...

The filter ID must be a CAPITAL LETTER, usually one of UBVR or I. For sites that take data without a filter in front of the CCD camera, we use "U" to designate "Unfiltered". Some small-telescope sites use a very wide R+I filter (actually a glass version of the Kodak #12 yellow cut-on long-pass filter), and for those we use "R" for "Red-pass".
The 4-digit number following the filter ID indicates the sequence in which the images were acquired during the season, starting with 0001. In principle numbers can run up through 9999, but we only occasionally exceed 1000 images per event per site.
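The naming convention above can be checked mechanically. Here is a minimal sketch of a parser; the regular expression and the function name are my own illustration of the convention described in this document, not code from the pipeline itself:

```python
import re

# Illustrative pattern for <SITE>_<EVENT><FILTER><NNNN>.fits, e.g.
# CT13_OB110123I0001.fits. The event short form is assumed to be two
# capital letters plus 5 or 6 digits (3-digit MOA vs 4-digit OGLE-4
# event numbers); this is an assumption based on the examples here.
NAME_RE = re.compile(
    r"^(?P<site>[A-Za-z0-9]+)_(?P<event>[A-Z]{2}\d{5,6})"
    r"(?P<filter>[A-Z])(?P<seq>\d{4})\.fits$"
)

def parse_image_name(name):
    """Return (site, event, filter, sequence) or None if malformed."""
    m = NAME_RE.match(name)
    if m is None:
        return None
    return (m.group("site"), m.group("event"),
            m.group("filter"), int(m.group("seq")))
```

For example, `parse_image_name("CT13_OB110123I0001.fits")` splits the name into the CT13 site code, the OB110123 event, the I filter, and sequence number 1.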
Finally, all images must be FITS format images in 16-bit signed integer format (BITPIX=16). The DoPhot program does not work with either 32-bit integer or floating point images, so you may have to convert your images into proper 16-bit integer format before processing.
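The numeric side of that conversion is just a clip-and-cast into the signed 16-bit range. The sketch below shows only that arithmetic; in practice you would read and write the FITS file with a tool such as astropy.io.fits or IRAF, which also updates the BITPIX header keyword for you:

```python
import numpy as np

def to_bitpix16(data):
    """Clip pixel data into the signed 16-bit range required by
    DoPhot (BITPIX=16) and cast to int16.

    Illustrative only: this handles the pixel values, not the FITS
    headers. Values outside [-32768, 32767] are saturated at the
    limits, which loses information for very bright pixels.
    """
    clipped = np.clip(np.rint(data), -32768, 32767)
    return clipped.astype(np.int16)
```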
To create a new event directory for a new microlensing event, type:
newfield CT13 OB110123

The first argument is the site code (here CT13 = "CTIO 1.3m ANDICAM"); the second argument is the short form of the event name (OB110123 = OGLE-2011-BLG-0123). If both are valid, you will be asked a series of questions to set up the event directory. newfield uses the event name to extract the RA/Dec coordinates from the most recent OGLE-4 and MOA event catalogs, and the MicroFUN site table to get the instrument options relevant for that observing site.
The newfield script creates a directory named CT13_OB110123/ and all of its various subdirectories as used by the pipeline programs. It also creates customized archive and .setup scripts for this directory, as follows:
Raw/ - directory to hold the unmeasured (raw) FITS images
Reduced/ - directory to hold the measured (reduced) FITS images
Work/ - scratch space for temporary files
Dop/ - DoPhot photometry (dop) files
Par/ - DoPhot parameter (par) files
Archive/ - the photometry archives for each filter
Archive/DopClean/ - archived DoPhot photometry (dop) files

All new images (after pre-processing to remove bias and flat field) are copied into the Raw directory. The pipeline draws images to be measured from the Raw directory. If it successfully measures an image, it puts a copy into the Reduced directory.
cd Raw/
findBest

This will estimate the mean FWHM of the stars in all .fits images in the current (Raw/) directory and return the names of the 20 images with the best seeing. You can search for another number by adding it to the command line (e.g., findBest 30 will find up to the 30 best images). The units are FWHM in pixels. Choose 3 or 4 candidate images with small mean FWHMs and examine them with ds9 or your favorite image processing package (IRAF, XVista, etc.). A "good" template image is one which satisfies these criteria:
cd ..
mktemplate I 11

In this example, we have chosen CT13_OB110123I0011.fits as the template image, hence "I 11" for I-band image 11 in the sequence ("I0011").
mktemplate examines the candidate template image and creates two files in Dop/ if successful: a master list of all measurable stars on the image (the "master catalog"), and a short list of the brightest stars on the image (the "offset catalog"). The master catalog becomes the list of all stars to be measured on all images for this data set, while the offset catalog is used to compute the relative shift between each image and the template.
If mktemplate is successful, it creates a file named ".template" in the event directory that contains the full name of the template image. This file is read by photproc and other pipeline scripts to make sure they always use the same template image. If you choose a new template image, .template is updated, and the old template name is kept in .template.BAK as a backup.
If you later decide you want to use a new template file, you need to reset the event directory using the resetEvent script first to clean out any delinquent working files and "unprocess" the images in the Raw/ directory.
photproc all

If instead you only want to process a subset of the unprocessed images, you would first make a list of these images, say "imlist", and then type
photproc imlist

In either case, photproc processes the images using the template image stored in the .template file created by mktemplate.
After photproc is done, check the logs for errors and repeat the above steps if the errors are correctable. Sometimes they are not, and you have to ditch the images as too bad to reduce. This is usually done by renaming a bad image in Raw/ by appending ".bad" to the name. This will allow it to be ignored by subsequent photproc runs.
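Marking a frame as bad is just a rename. A minimal helper, sketched here for illustration (the function name is mine; the pipeline only cares that the ".bad" suffix is present):

```python
import os

def mark_bad(raw_dir, image_name):
    """Rename Raw/<image>.fits to Raw/<image>.fits.bad so that
    subsequent photproc runs skip it.

    Illustrative helper, not a pipeline script: the convention of
    appending ".bad" is taken from this document.
    """
    src = os.path.join(raw_dir, image_name)
    dst = src + ".bad"
    os.rename(src, dst)
    return dst
```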
After photproc runs, you will have a number of cleaned DoPhot output files (.dop extension) in the Dop/ directory, and a set of DoPhot runtime parameter files (.par extension) in the Par/ directory. If you are not going to be using the binary photometry archive, you are essentially done in that you now have DoPhot output to work with using your own photometry programs.
Common Processing Errors
If errors are encountered processing a frame, photproc will print a message to the screen like:
*** Processing failed for CT13_OB110123I0047 (see Work/photlog.024)

Look at the file noted and see what it says. The most common faults are:
If the cause of the crash still makes no sense, it could mean that you have tripped across unrecognized, deeper programming problems and you should bring the matter to Rick's attention so he can try to run it down.
archive I 1 10 C

This creates a new photometry archive file named Archive/CT13_OB110123I with parameters as set up in the "archive" script. The field name, here "CT13_OB110123", is taken from the .fieldname file in the current directory. If .fieldname is missing, it is a signal that archive is being run in the wrong place (or that the directory has not been set up correctly).
The arguments are: I is the filter band (V, I, etc.) of the images to archive (each filter has a separate archive), 1 10 are the images to be archived (here CT13_OB110123I0001 through CT13_OB110123I0010), and the C flag means Create a new archive file (overwriting the old one).
To append new data to the end of an existing archive, you would instead have typed:
archive I 11 20 A

This adds the records for frames 11-20 to the photometry archive file named "Archive/CT13_OB110123I".
It is OK to run archive over a range of numbers (e.g., 11-20) that includes missing numbers. The archive script will simply jump over any bogus numbers. This lets you archive good data right away while dealing with the problem images at your leisure.
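The skip-over behavior can be pictured as a simple existence check across the requested range. This sketch is my own illustration of the idea, not code from the archive script (the directory argument and function name are assumptions):

```python
import os

def frames_to_archive(image_dir, root, band, first, last):
    """List the image rootnames in [first, last] whose FITS files
    actually exist, skipping missing sequence numbers the way the
    archive script jumps over bogus numbers.

    Names follow the convention <root><band><NNNN>.fits described
    in this document.
    """
    found = []
    for n in range(first, last + 1):
        name = "%s%s%04d" % (root, band, n)
        if os.path.exists(os.path.join(image_dir, name + ".fits")):
            found.append(name)
    return found
```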
The process of identifying the microlens and some suitable reference stars is the last major task that must be accomplished before subsequent images for a field can be "pipelined" day after day. This section steps you through the process, which is the most involved of the initial setup tasks. After this, the rest is pretty much rock-n-roll until the event is over or the observing season ends.
Step 1: Create an initial photometry archive.
Once you have created a template and done an initial photometry pass over a set of images, you need to create an initial photometry archive by running the archive script as shown above. Even though you haven't yet identified the microlensing target or any reference stars in the images, this initial "first draft" archive is needed in order to search for them using the tvphot program. Once you've identified the lens and reference stars, you will re-run archive and replace this first-draft archive with the real thing.
Use the results from your first successful photproc run (see above) to generate the initial archive:
archive I 1 5 C

This builds an initial I-band photometry archive for the event using the first 5 images (note: these must include the template image!). The "C" flag tells archive to create a new archive file, here named CT13_OB110123I, in the Archive directory. It will also copy the template image into CT13_OB110123I.fits in the Archive directory. You need both the initial archive and the template image in order to identify stars.
Step 2: Display the template image and examine it using tvphot
After building the initial archive, go into the Archive directory
cd Archive

and start up ds9:
ds9 &

Once it is going, run tvphot on the archive:
tvphot CT13_OB110123I

tvphot will first display the template image and then start the interactive display cursor. If you are familiar with the IRAF imexamine command, tvphot uses the same "blinking donut" cursor. Typing "?" will display a list of tvphot functions.
tvphot CT13_OB110123V CT13_OB110123I

where the ".fits" is left off and appended by the program.
Step 3: Pick stars on the image and get their DoPhot ID numbers from the archive
You need to find the DoPhot ID numbers corresponding to the microlensing target and your candidate reference stars.
To get the ID number of a star in the template image, center the tvphot cursor over the star you want information about and hit the "f" key.
tvphot will search the archive for all stars within 5 of the cursor position. It then marks each star it finds with a yellow diamond labeled by its ID number, and prints the information to the xterm screen. For example, if you have the cursor on a star near X=225, Y=255, and hit "f" (for "find"), it will type out lines like:
# Star       X        Y      Mag    Err    RMS  Typ
     6   172.00   156.70   14.254  0.003  0.010   11

This tells you that the star nearest the cursor is star #6 in the archive. The other data are as follows:
Note that the microlensing event, especially if faint, may be blended and so designated as DoPhot Type 13. This is OK, as we have no control over which star is the microlensing target.
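When collecting candidate IDs, it can help to pull the fields out of these output lines programmatically. The parser below is illustrative only; the column order (ID, X, Y, Mag, Err, RMS, Type) is taken from the example in this document, not from the tvphot source:

```python
def parse_tvphot_line(line):
    """Split one line of tvphot 'f' output into labeled fields.

    Assumes the column layout shown in this document:
    ID, X, Y, Mag, Err, RMS, Type.
    """
    star_id, x, y, mag, err, rms, typ = line.split()
    return {"id": int(star_id), "x": float(x), "y": float(y),
            "mag": float(mag), "err": float(err),
            "rms": float(rms), "type": int(typ)}
```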
Record the ID numbers of the microlensing target and 3 to 5 likely reference stars. When you are done picking stars, exit from tvphot by typing "q" to quit.
Step 4: Edit the archive script
Once you have the ID numbers of the microlensing event and some reference stars, edit the archive script:
cd ..
edit archive

(where "edit" is your favorite editor, be it vi or emacs).
First change the "microlens" ID number, for example:
$lensID = 118;

Note that this is Perl, so there must be a semicolon (;) after the number. Here the lensing event is star 118 in the template image.
Next change the entries in the "refstars" list for each filter. For example:
$refStars{I} = "5 22 23 45 46 47";
$refStars{V} = "5 22 23 45 46 47";

where we have identified six reference stars for the V and I bands (since both V and I use the same I-band template image, the lists are the same). The archive script uses a Perl associative array (aka "hash table") for the reference star list. The sample archive script created by newfield generally only has one example refStars{} array defined.
Once you've made and verified your changes, exit from the editor, saving the updated archive script.
Step 5: Rebuild the photometry archive
Now that you have put real ID numbers into the archive script for the microlens and some reference stars, you need to rebuild the photometry archive for the event.
Re-run the archive script using the C option as described above to create a new photometry archive. This will overwrite the initial archive you created in step 1, and give you an archive you can now build up over time as the new observations arrive. Eventually you may wish to choose a new template (e.g. one with better seeing near the event peak - especially if the event starts out faint and blended), or to select better reference stars once you have enough data accumulated to weed out poor initial choices. At least some choices are needed to get started.
findref searches through the archive to select candidate reference stars matching the following criteria:
cd Archive
findref CT13_OB110123I stars.ref

This will search archive CT13_OB110123I for stars matching the criteria above, making a list of candidates in the file stars.ref in the current directory. If stars.ref already exists, findref will append the new candidates to the end.
If in the first pass findref finds many stars matching the search criteria, refine the search by reducing the variability threshold. For example:
findref CT13_OB110123I stars.ref -s 0.01

will restrict the search to type 11 stars within 30 arcsec that have rms variability less than 0.01 magnitudes. From the resulting candidate list, select likely candidates and enter them into the archive script.
On images with poor or highly variable seeing (leading to more blended stars and greater image-to-image scatter), it may be necessary to widen the search radius. For example, when reducing a subset of the 1998 UTas data, I used a 45-arcsec radius to find enough suitable (unblended) reference stars:
findref UOB98014I refs -s 0.02 -r 45

If you fail to find reference stars on the first pass, increase the rms threshold (-s flag), and then work your way down as you refine the reference star selection. For example, I needed to start with 0.02 mag for the UTas archives, but usually beat it down to 0.01 mag after finding suitable first-cut reference stars. For YALO data I began with the default threshold of 0.015 mag, but generally selected 5 stars which gave rms below 0.01 mag after 3 or 4 iterations. In general, archives that have poor starting cuts will not give low-scatter reference star solutions for more than 3-5 stars. There is no obligation to find 10 reference stars for all images.
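The iterative tightening described above can be sketched as a loop over progressively stricter rms cuts that stops when too few type-11 stars survive. This is my own illustration of the strategy, working on assumed (id, rms, type) tuples rather than the real binary archive:

```python
def select_refstars(stars, cuts=(0.02, 0.015, 0.01), want=5):
    """Tighten the rms cut step by step, keeping the tightest cut
    that still leaves at least `want` DoPhot type-11 stars.

    `stars` is an assumed list of (id, rms, type) tuples; the real
    findref works on the photometry archive, not Python tuples.
    Returns the final threshold and the surviving star IDs.
    """
    threshold = cuts[0]
    picked = [s for s in stars if s[2] == 11 and s[1] < threshold]
    for cut in cuts[1:]:
        tighter = [s for s in stars if s[2] == 11 and s[1] < cut]
        if len(tighter) < want:
            break  # too few stars survive; keep the previous cut
        threshold, picked = cut, tighter
    return threshold, [s[0] for s in picked]
```

The design mirrors the advice in the text: a loose starting cut guarantees candidates exist, and each refinement pass only commits to a stricter cut if enough low-scatter stars remain.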
Each time you select new reference stars, you will need to either completely regenerate the archive by running the archive script (preferred), or you can change the reference stars in-situ using the writepl command.
At the end of the season when the event is over, the field will likely be re-reduced by choosing the overall best template and selection of reference stars. While the event is ongoing, it is important to make good, if not ultimately the best, choices in order to provide photometry good enough to judge when something interesting is happening so we can take different observations.
exphot CT13_OB110123I I.pho

This will extract the lens and reference star light curves from the archive and create an ASCII file named I.pho. The header of the file is a set of #-delimited comments, so that this file may be immediately imported into programs such as SM, mongo, etc. for plotting.
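Reading a .pho file only requires skipping the #-delimited header. The sketch below assumes nothing beyond that comment convention; the exact data columns depend on the exphot version, so rows are returned as untyped whitespace-split fields:

```python
def read_pho(path):
    """Read an exphot .pho light-curve file, skipping the
    '#'-delimited header comments and blank lines.

    Illustrative reader: only the comment convention stated in this
    document is assumed. Each data row is returned as a list of
    whitespace-separated string fields.
    """
    rows = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            rows.append(line.split())
    return rows
```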
cd ..
cleanup

This will walk through the various subdirectories (mainly Work/) and get rid of the debris left over from running the pipeline.
For example, to reset the CTIO 1.3m ANDICAM OB110123 reductions in this example, you would do the following
cd ~microfun/Data/CT13_OB110123
resetEvent

The run would go something like this:
% resetEvent
You are attempting to reset the CT13_OB110123 working directory.
Is this what you want to do?: Y
OK, here goes nothing...
Clearing the working directories...
  removed old Reduced directory - 1 files deleted
  created new, empty Reduced directory
  removed old Work directory - 6 files deleted
  created new, empty Work directory
  removed old Dop directory - 1 files deleted
  created new, empty Dop directory
  removed old Par directory - 1 files deleted
  created new, empty Par directory
  removed old Archive directory - 2 files deleted
  created new, empty Archive directory
  created new, empty Archive/DopClean directory
Removing the .template file...
Removing all .processed tags from Raw FITS images...
  0 raw FITS images restored
Restoring pipeline parameter files...
Done: CT13_OB110123 working directory reset.

You will need to make sure the FITS images in the Raw directory are good, and start over from the template selection step.