The Ohio State University
College of Mathematical & Physical Sciences
Department of Astronomy

OSU Photometric Reduction Pipeline

Instructions - 2011 Revision

This document describes how to run data through the OSU data reduction pipeline. It specifically refers to the version of the OSU pipeline in use since the 2001 Bulge Season, modified for the 2011 observing season. The pipeline has evolved some from the original PLANET collaboration software, and includes a number of new features reflecting how our approach to running the observing network has evolved since 2000.

To review, the OSU pipeline is a suite of programs (C and Fortran) run by scripts (both Perl and csh). These programs include DoPhot and related programs, and a suite of binary photometry archiving software (the "Quy" package adopted by the old PLANET collaboration). All of the reduction scripts were re-written at OSU to reflect how we organize our data reduction, to provide error trapping, and to streamline the process (these scripts run more quietly than the stock PLANET pipeline).

Index of Topics:

Create/configure a directory for a new microlensing event (newfield)
Make a DoPhot reduction template for the field (mktemplate)
Process a list of images with DoPhot (photproc)
Archive reduced photometry data (archive)
Identify the lens & reference stars on the template image (tvphot)
Refine the reference star selection (findref)
Export a light curve in text format (exphot)
Clean up after reductions (cleanup)
Reset and Start Over (resetEvent)
See the Quick Reference Guide for the steps with less verbiage.

See the Quick Quick Guide for even less verbiage.


Create/configure an event directory for a new microlensing event

To start reducing data for a new microlensing event, you have to first create and configure a new event directory. For the 2011 season, all new event directories are in
    ~microfun/Data/
on microfun (a large multi-TB storage and analysis system dedicated to the MicroFUN project).

The name of the directory is composed of 2 parts: an abbreviation of the observatory site and a short form of the event name. For example, images of microlensing event OGLE-2011-BLG-0123 taken with the ANDICAM on the CTIO 1.3-meter telescope would be processed in an event directory named "CT13_OB110123". For images taken by the Auckland Observatory of the same target, the event directory would be "AO_OB110123". To see a list of site codes for the various active observatories, type

 microsites 

Starting in 2011, the OGLE-4 project is now officially operating, and event names from OGLE-4 have 4-digit numbers. By mid-season 2011 we had crossed the 1000-event line. MOA is still working with 3-digit event names, and for now we are going to stay with 3 digits for MOA unless they make a change.
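The short event names follow a simple pattern: a one-letter survey code, a one-letter field code, the 2-digit year, and the survey's event number. A hypothetical helper sketching the convention (the survey-code mapping is my reading of the examples in this document, not pipeline code):

```python
def short_name(full_name):
    """Build the short event name used in directory names.

    Illustrative sketch of the naming convention, e.g.
    "OGLE-2011-BLG-0123" -> "OB110123".  OGLE-4 events keep their
    4-digit numbers; 3-digit MOA numbers are left at 3 digits.
    """
    survey, year, field, number = full_name.split("-")
    prefix = {"OGLE": "O", "MOA": "M"}[survey]  # assumed mapping
    return prefix + field[0] + year[2:] + number
```
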

The event directory name also becomes the rootname of all image files for that event from that observatory. For example, V and I images taken of OB110123 with the CTIO 1.3m telescope and ANDICAM would have names

    CT13_OB110123I0001.fits
    CT13_OB110123I0002.fits
    ...
    CT13_OB110123V0001.fits
    CT13_OB110123V0002.fits
    ...
The filter ID must be a CAPITAL LETTER, usually one of UBVR or I. For sites that take data without a filter in front of the CCD camera, we use "U" to designate "Unfiltered". Some small-telescope sites use a very wide R+I filter (actually a glass version of the Kodak #12 yellow cut-on long-pass filter) and for those we use "R" for "Red-pass".

The 4-digit number following the filter ID indicates the order in which the image was acquired during the season, starting with 0001. In principle numbers can run up through 9999, but we only occasionally exceed 1000 images per event per site.
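Putting the pieces together, the filename convention can be expressed as a small helper (hypothetical, for illustration only; the pipeline scripts enforce this convention themselves):

```python
def image_name(site, event, filt, seq):
    """Compose a pipeline image filename, e.g.
    image_name("CT13", "OB110123", "I", 11) -> "CT13_OB110123I0011.fits".

    Illustrative sketch of the naming rules described above:
    a single capital-letter filter ID and a 4-digit sequence number.
    """
    if not (len(filt) == 1 and filt.isalpha() and filt.isupper()):
        raise ValueError("filter ID must be a single capital letter")
    if not 1 <= seq <= 9999:
        raise ValueError("sequence number must be in 0001-9999")
    return "%s_%s%s%04d.fits" % (site, event, filt, seq)
```
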

Finally, all images must be FITS format images in 16-bit signed integer format (BITPIX=16). The DoPhot program does not work with either 32-bit integer or floating point images, so you may have to convert your images into proper 16-bit integer format before processing.
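The essential step of that conversion is forcing pixel values into the signed 16-bit range. In plain Python the clipping looks like this (illustration only -- use IRAF, XVista, or a FITS library for real conversions, since the FITS header keywords must be rewritten too):

```python
def to_int16(pixels):
    """Clip pixel values into the signed 16-bit range (-32768..32767)
    and round to integers, as required for BITPIX=16 FITS images.

    Illustrative sketch only: a real conversion must also update the
    FITS header (BITPIX, and BSCALE/BZERO if scaling is used).
    """
    return [max(-32768, min(32767, int(round(p)))) for p in pixels]
```
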

To create a new event directory for a new microlensing event, type:

  newfield CT13 OB110123
The first argument is the site code (here CT13 = "CTIO 1.3m ANDICAM"), the second argument is the short form of the event name (OB110123=OGLE-2011-BLG-0123). If both are valid, you will be asked a series of questions to setup the event directory. newfield uses the event name to extract the RA/Dec coordinates from the most recent OGLE-4 and MOA event catalogs, and the MicroFUN site table to get the instrument options relevant for that observing site.

The newfield script creates a directory named CT13_OB110123/ and all of its various subdirectories as used by the pipeline programs. It also creates customized configuration files and an archive script for this directory, as follows:

.piperc:
.piperc is the pipeline runtime configuration file, and contains a series of parameters used for this event. These include the full directory path to the OSU pipeline executables and the instrument configuration for this observatory.

archive:
A customized Perl script to archive the photometry measurements. At later stages in the reduction process, this file will need to be edited to specify the observation number of the template image, and to enter the DoPhot ID numbers of the lensing event and reference stars.

.fieldname:
A text file containing the rootname of the field (e.g., CT13_OB110123). .fieldname is used by archive and other scripts that are customized specifically for this directory.
newfield creates the following sub-directories:
   Raw/     - directory to hold the unmeasured (raw) FITS images
   Reduced/ - directory to hold the measured (reduced) FITS images
   Work/    - scratch space for temporary files
   Dop/     - DoPhot photometry (dop) files
   Par/     - DoPhot parameter (par) files
   Archive/ - the photometry archives for each filter
   Archive/DopClean/ - archived DoPhot photometry (dop) files 
All new images (after pre-processing to remove bias and flat field) are copied into the Raw directory. The pipeline draws images to be measured from the Raw directory. If it successfully measures an image, it puts a copy into the Reduced directory.
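The directory layout above can be sketched as follows (illustrative Python; the real newfield is a script that also writes the .piperc, archive, and .fieldname files):

```python
import os

def make_event_dirs(root):
    """Create the subdirectory layout described above for an event
    directory (a sketch of part of what newfield sets up)."""
    for sub in ("Raw", "Reduced", "Work", "Dop", "Par",
                "Archive", os.path.join("Archive", "DopClean")):
        os.makedirs(os.path.join(root, sub), exist_ok=True)
```
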



Make a DoPhot reduction template for the field.

Choose an image with good seeing and near the target peak brightness (if possible) to use as the template image. To do this, cd into the event directory's Raw/ directory and type
    cd Raw/
    findBest
This will estimate the mean FWHM of the stars in all .fits images in the current (Raw/) directory and return the names of the 20 images with the best seeing. You can search for another number by adding it to the command line (e.g., findBest 30 will find up to the 30 best images). The units are FWHM in pixels. Choose 3 or 4 candidate images with small mean FWHMs and examine them with ds9 or your favorite image processing package (IRAF, XVista, etc.). A "good" template image is one which satisfies these criteria:
  1. Stars are round and well-defined (no streaking or elongated images)
  2. The lensing event is well-centered in the frame, distinct and easily identified - if you can't find the lens, chances are DoPhot won't, either.
  3. The image does not have a very high background (e.g., moonlight or twilight), nor are there a lot of saturated, bleeding star images.
When you have identified a good candidate template image, return to the main event directory and run mktemplate
   cd ..
   mktemplate I 11
In this example, we have chosen CT13_OB110123I0011.fits as the template image, hence "I 11" for I-band image 11 in the sequence ("I0011").

mktemplate examines the candidate template image and creates two files in Dop/ if successful: a master list of all measurable stars on the image (the "master catalog"), and a short list of the brightest stars on the image (the "offset catalog"). The master catalog becomes the list of all stars to be measured on all images for this data set, while the offset catalog is used to compute the relative shift between each image and the template.

If mktemplate is successful, it creates a file named ".template" in the event directory that contains the full name of the template image. This file is read by photproc and other pipeline scripts to make sure the same template image is always used. If you choose a new template image, .template is updated, and the old template name is kept in .template.BAK as a backup.
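The .template bookkeeping described above can be sketched as (hypothetical Python, not the actual mktemplate script):

```python
import os

def set_template(event_dir, template_name):
    """Record the template image name in .template, preserving any
    previous choice in .template.BAK -- a sketch of the bookkeeping
    mktemplate is described as doing."""
    path = os.path.join(event_dir, ".template")
    if os.path.exists(path):
        os.replace(path, path + ".BAK")   # keep old choice as backup
    with open(path, "w") as f:
        f.write(template_name + "\n")
```
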

If you later decide you want to use a new template image, you need to reset the event directory using the resetEvent script first to clean out any leftover working files and "unprocess" the images in the Raw/ directory.


Measure photometry for event images with DoPhot

To process all of the unmeasured FITS files in the Raw/ directory with the DoPhot pipeline, from the event directory you type
   photproc all
If instead you only want to process a subset of the unprocessed images, you would first make a list of these images, say "imlist", and then type
   photproc imlist
In either case, photproc processes the images using the template image stored in the .template file created by mktemplate.

After photproc is done, check the logs for errors and repeat the above steps if the errors are correctable. Sometimes they are not, and you have to ditch the images as too bad to reduce. This is usually done by renaming a bad image in Raw/ by appending ".bad" to the name. This will allow it to be ignored by subsequent photproc runs.
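The .bad renaming convention amounts to a simple file rename (in practice you would just use mv; this hypothetical helper only illustrates the convention):

```python
import os

def mark_bad(raw_dir, rootname):
    """Rename Raw/<rootname>.fits to <rootname>.fits.bad so that
    subsequent photproc runs ignore it (sketch of the convention
    described above)."""
    src = os.path.join(raw_dir, rootname + ".fits")
    os.rename(src, src + ".bad")
```
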

After photproc runs, you will have a number of cleaned DoPhot output files (.dop extension) in the Dop/ directory, and a set of DoPhot runtime parameter files (.par extension) in the Par/ directory. If you are not going to be using the binary photometry archive, you are essentially done in that you now have DoPhot output to work with using your own photometry programs.

Common Processing Errors

If errors are encountered processing a frame, photproc will print a message to the screen like:

    *** Processing failed for CT13_OB110123I0047 (see Work/photlog.024)
Look at the file noted and see what it says. The most common faults are:
  1. Really bad seeing, making it impossible to identify enough reference stars to perform a geometric transformation within the DoPhot thresholds. This would be the case if you see errors like "Error: only 3 matches found for ..." or "no matches found for ...".
  2. Errors like "Segmentation Fault" are (after fixing the other problems) often caused by bright cosmic-ray events landing close enough to where DoPhot expects stars to fool it, causing a subsequent ploffset crash. I find that if I get such a failure, opening up XVista or IRAF, cleaning off the bright cosmic-ray hits, and then re-running fixes it most times.
  3. An error in ploffset claiming that no stars were found can mean that something is wrong with the image: (a) the image is of a different field, (b) the image is flipped in one or both axes (someone changed the instrument configuration), etc. The best advice is to look at the image causing the problem and make a call.
If an image persistently fails, note it down in the logs and rename it to XXX.fits.bad, where "XXX" is whatever the rootname is (e.g., XXX=CT13_OB110123I0002) for future reference.

If the cause of the crash still makes no sense, it could mean that you have tripped across unrecognized, deeper programming problems and you should bring the matter to Rick's attention so he can try to run it down.



Archive reduced photometry data

After producing the DoPhot photometry files, you need to add these to the binary photometry archive file for the event.
   archive I 1 10 C
This creates a new photometry archive file named Archive/CT13_OB110123I with parameters as set up in the "archive" script. The field name, here "CT13_OB110123", is taken from the .fieldname file in the current directory. If .fieldname is missing, it is a signal that archive is being run in the wrong place (or that the directory has not been set up correctly).

The arguments are: I is the filter band (V, I, etc.) of the images to archive (each filter has a separate archive), 1 10 are the images to be archived (here CT13_OB110123I0001 through CT13_OB110123I0010), and the C flag means Create a new archive file (overwriting the old one).

To append new data to the end of an existing archive, you would instead have typed:

   archive I 11 20 A
This adds the records for frames 11-20 to the photometry archive file named "Archive/CT13_OB110123I".

It is OK to run archive over a range of numbers (e.g., 11-20) that includes missing numbers. The archive script will simply jump over any bogus numbers. This lets you archive good data right away while dealing with the problem images at your leisure.
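The gap-skipping behavior amounts to checking which frames in the range actually produced .dop files; a sketch (illustrative Python, not the actual Perl archive script):

```python
import os

def frames_to_archive(dop_dir, field, filt, first, last):
    """Return the .dop files present for frames first..last, silently
    skipping missing numbers -- the behavior described above when the
    archive script is given a range with gaps (illustrative sketch)."""
    found = []
    for n in range(first, last + 1):
        name = "%s%s%04d.dop" % (field, filt, n)
        if os.path.exists(os.path.join(dop_dir, name)):
            found.append(name)
    return found
```
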



Identify the lens & reference stars on the template image

Once you've created (or changed) the template image for an event, (e.g. adopting an image taken in good seeing with the source brighter), you need to update the parameters in the archive script that identify the microlensing event and the reference stars for relative photometry.

The process of identifying the microlens and some suitable reference stars is the last major task that must be accomplished before subsequent images for a field can be "pipelined" day after day. This section steps you through the process, which is the most involved of the initial setup tasks. After this, the rest is pretty much rock-n-roll until the event is over or the observing season ends.

Step 1: Create an initial photometry archive.

Once you have created a template and done an initial photometry pass over a set of images, you need to create an initial photometry archive by running the archive script as shown above. Even though you haven't yet identified the microlensing target or any reference stars in the images, this initial "first draft" archive is needed in order to search for them using the tvphot program. Once you've identified the lens and reference stars, you will re-run archive and replace this first-draft archive with the real thing.

Use the results from your first successful photproc run (see above) to generate the initial archive:

   archive I 1 5 C
This builds an initial I-band photometry archive for the event using the first 5 images (note: this must include the template image!). The "C" flag tells archive to create a new archive file, here named CT13_OB110123I, in the Archive directory. It will also copy the template image into CT13_OB110123I.fits in the Archive directory. You need both the initial archive and the template image in order to identify stars.

Step 2: Display the template image and examine it using tvphot

After building the initial archive, go into the Archive directory

   cd Archive
and start up ds9:
   ds9 &
Once it is going, run tvphot on the archive:
   tvphot CT13_OB110123I
tvphot will first display the template image and then start the interactive display cursor. If you are familiar with the IRAF imexamine command, tvphot uses the same "blinking donut" cursor. Typing "?" will display a list of tvphot functions.

NOTE: The example above assumes that the archive and template image both have the same rootname, e.g., archive file CT13_OB110123I and template image CT13_OB110123I.fits. If the template image has a different name, the command syntax is:
   tvphot CT13_OB110123V CT13_OB110123I
where the ".fits" is left off and appended by the program.

Step 3: Pick stars on the image and get their DoPhot ID numbers from the archive

You need to find the DoPhot ID numbers corresponding to

  1. The microlensing target
  2. At least 3-5 unblended stars to use as reference stars
in the archive. DoPhot assigns an ID number (1 through the number of stars) to each star it finds in the template image. These ID numbers are then carried forward for all images measured for this field by the pipeline. [Note: If you pick a new template image, all of the numbers will be reassigned, and you will need to pick new IDs for all stars].

To get the ID number of a star in the template image, center the tvphot cursor over the star you want information about and hit the "f" key.

tvphot will search the archive for all stars within 5 pixels of the cursor position. It then marks each star it finds with a yellow diamond labeled by its ID number, and prints the corresponding information to the xterm screen. For example, if you have the cursor on a star near X=172, Y=157 and hit "f" (for "find"), it will type out lines like:

 
# Star   X       Y     Mag    Err   RMS  Typ
    6  172.00  156.70 14.254 0.003 0.010  11
This tells you that the star nearest the cursor is star #6 in the archive. The other data are as follows:
  1. X & Y: pixel coordinates of star 6 in the template image.
  2. Typ: is the DoPhot Object Type, in this case Type 11 (a "perfect" star).
  3. Mag: is the mean instrumental magnitude of the star, averaged over all observations.
  4. Err: is the typical formal error on each measurement of Mag, averaged over all observations.
  5. RMS: is the rms variation in Mag over all observations.
All reference stars that you choose must be DoPhot Type 11. Note that for your first pass the mean instrumental magnitude will be bogus since we have not yet identified reference stars. These numbers are more useful after more data has been accumulated to refine the choices of reference stars and investigate the data more carefully.
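The "f" lookup is essentially a radius search over the archive's star positions. A small Python sketch of the idea (a hypothetical helper, not tvphot's actual code):

```python
def find_stars(catalog, x, y, radius=5.0):
    """Return (id, distance) pairs for catalog stars within `radius`
    pixels of the cursor position, nearest first -- a sketch of what
    tvphot's "f" key does against the archive.

    `catalog` maps star ID -> (x, y) template-image coordinates.
    """
    hits = []
    for star_id, (sx, sy) in catalog.items():
        d = ((sx - x) ** 2 + (sy - y) ** 2) ** 0.5
        if d <= radius:
            hits.append((star_id, d))
    return sorted(hits, key=lambda h: h[1])
```
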

Note that the microlensing event, especially if faint, may be blended and so designated as DoPhot Type 13. This is OK, as we have no control over which star is the microlensing target.

Record the ID numbers of the microlensing target and 3 to 5 likely reference stars. When you are done picking stars, exit from tvphot by typing "q" to quit.

Step 4: Edit the archive script

Once you have the ID numbers of the microlensing event and some reference stars, edit the archive script:

   cd ..
   edit archive
(where "edit" is your favorite editor, be it vi or emacs).

First change the "microlens" ID number, for example:

   $lensID = 118;
Note that this is Perl, so there must be a semicolon (;) after the number. Here the lensing event is star 118 in the template image.

Next change the entries in the "refstars" list for each filter. For example:

   $refStars{I} = "5 22 23 45 46 47";
   $refStars{V} = "5 22 23 45 46 47";
where we have identified six reference stars for the V and I bands (since both V and I use the same I-band template image, they are the same). The archive script uses a Perl associative array (aka "hash table") for the reference star list. The sample archive script created by newfield generally only has one example refStars{} array defined.

Once you've made and verified your changes, exit from the editor, saving the updated archive script.

Note: The layout and operation of the archive script has significantly changed compared to previous versions. In particular, some very confusing logic about template filter bands has been eliminated.

Step 5: Rebuild the photometry archive

Now that you have put real ID numbers into the archive script for the microlens and some reference stars, you need to rebuild the photometry archive for the event.

Re-run the archive script using the C option as described above to create a new photometry archive. This will overwrite the initial archive you created in step 1, and give you an archive you can now build up over time as the new observations arrive. Eventually you may wish to choose a new template (e.g. one with better seeing near the event peak - especially if the event starts out faint and blended), or to select better reference stars once you have enough data accumulated to weed out poor initial choices. At least some choices are needed to get started.



Refine the reference star selection

Once you've accumulated enough data that you can get reasonably good statistics on the stars in the archive, you will want to refine the choice of reference stars to those that are most constant. To do this we use the findref program.

findref searches through the archive to select candidate reference stars matching the following criteria:

  1. Within 30-arcseconds of the microlensing target.
  2. RMS variability among many measurements of <0.015 magnitudes.
  3. DoPhot Type 11
To search an archive, you would type:
   cd Archive
   findref CT13_OB110123I stars.ref
This will search archive CT13_OB110123I for stars matching the criteria above, making a list of candidates in the file stars.ref in the current directory. If stars.ref already exists, findref will append the new candidates to the end.
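The three criteria amount to a simple filter over the archive's star list. A minimal Python sketch of the selection logic (illustrative only -- the real findref works on the binary archive, and the pixel scale used here is an assumed parameter):

```python
def candidate_refs(stars, lens_id, radius_arcsec=30.0, rms_max=0.015,
                   scale=0.4):
    """Filter a star list by findref's default criteria: within
    radius_arcsec of the lens, rms variability < rms_max mag, and
    DoPhot Type 11.

    `stars` maps ID -> (x, y, rms, dophot_type); `scale` is an
    assumed pixel scale in arcsec/pixel (illustrative sketch only).
    """
    lx, ly = stars[lens_id][0], stars[lens_id][1]
    out = []
    for sid, (x, y, rms, dtype) in stars.items():
        if sid == lens_id or dtype != 11 or rms >= rms_max:
            continue
        if scale * ((x - lx) ** 2 + (y - ly) ** 2) ** 0.5 <= radius_arcsec:
            out.append(sid)
    return sorted(out)
```
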

If in the first pass findref finds many stars matching the search criteria, refine the search by reducing the variability threshold. For example:

   findref CT13_OB110123I stars.ref -s 0.01
will restrict the search to Type 11 stars within 30 arcsec that have rms variability less than 0.01 magnitudes. From the resulting list, select likely candidates and enter them into the archive script.

On images with poor or highly variable seeing (leading to more blended stars and greater image-to-image scatter), it may be necessary to widen the search radius. For example, when reducing a subset of the 1998 UTas data, I used a 45-arcsec radius to find enough suitable (unblended) reference stars:

   findref UOB98014I refs -s 0.02 -r 45
If you fail to find reference stars on the first pass, increase the rms threshold (-s flag), and then work your way down as you refine the reference star selection. For example, I needed to start with 0.02mag for the UTas archives, but usually beat it down to 0.01mag after finding suitable first-cut reference stars. For YALO data I began with the default threshold of 0.015mag, but generally selected 5 stars which gave rms below 0.01 mag after 3 or 4 iterations. In general, archives that have poor starting cuts will not give low-scatter reference star solutions for more than 3-5 stars. There is no obligation to find 10 reference stars for all images.

Each time you select new reference stars, you will need to either completely regenerate the archive by running the archive script (preferred), or you can change the reference stars in situ using the writepl command.

At the end of the season when the event is over, the field will likely be re-reduced by choosing the overall best template and selection of reference stars. While the event is ongoing, it is important to make good, if not the ultimately best, choices in order to provide us with good enough photometry to judge when something interesting is happening so we can take different observations.



Export a light curve in text format

Once you've updated the photometry archive, you will need to extract the lens and reference star light curves into a convenient ASCII text format file for subsequent analysis by typing:
   exphot CT13_OB110123I I.pho
This will extract the lens and reference star light curves from the archive and create an ASCII file named I.pho. The header of the file is a set of #-delimited comments, so that this file may be immediately imported into programs such as SM, mongo, etc. for plotting.
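The flavor of exphot's output can be sketched as follows (illustrative Python; the column layout shown here is an assumption for the example, not exphot's exact format):

```python
def write_lightcurve(path, field, records):
    """Write a light curve as #-commented ASCII, in the spirit of
    exphot's output: a header of #-delimited comments followed by
    whitespace-separated data columns.

    `records` is a list of (hjd, mag, err) tuples; the exact columns
    exphot writes may differ (this is a sketch, not exphot itself).
    """
    with open(path, "w") as f:
        f.write("# Light curve for %s\n" % field)
        f.write("# HJD  mag  err\n")
        for hjd, mag, err in records:
            f.write("%.5f %7.3f %6.3f\n" % (hjd, mag, err))
```

Because the header lines all start with "#", the file can be read directly by plotting packages that treat "#" as a comment character.
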



Clean Up

After you are finished reducing the data, it is a good idea to clean up the many working files and junk created by DoPhot and friends. Go into the main event directory and run the cleanup program:
   cd ..
   cleanup
This will walk through the various subdirectories (mainly Work/) and get rid of the debris left over from running the pipeline.



Reset and Start Over

In rare circumstances, you may find that you need to reset everything and just start all over from scratch. A way to do this that preserves the basic setup (e.g., newfield) is to use the resetEvent program. You run this inside an event directory you want to reset.

For example, to reset the CTIO 1.3m ANDICAM OB110123 reductions in this example, you would do the following

    cd ~microfun/Data/CT13_OB110123
    resetEvent
The run would go something like this:
% resetEvent

You are attempting to reset the CT13_OB110123 working directory.

Is this what you want to do? : Y

OK, here goes nothing...


Clearing the working directories...
   removed old Reduced directory - 1 files deleted
   created new, empty Reduced directory
   removed old Work directory - 6 files deleted
   created new, empty Work directory
   removed old Dop directory - 1 files deleted
   created new, empty Dop directory
   removed old Par directory - 1 files deleted
   created new, empty Par directory
   removed old Archive directory - 2 files deleted
   created new, empty Archive directory
   created new, empty Archive/DopClean directory
Removing the .template file...
Removing all .processed tags from Raw FITS images...
   0 raw FITS images restored
Restoring pipeline parameter files...

Done: CT13_OB110123 working directory reset.

You will need to make sure the FITS images in the Raw directory are good, and start over from the template selection step.


Updated: 2011 Aug 16 [rwp]