README -- Describes contents of HEMCO/TOMS_SBUV/v2015-03 directory
27 Aug 2015
GEOS-Chem Support Team
geos-chem-support@g.harvard.edu

Overview:
===============================================================================

This directory contains the merged TOMS/SBUV overhead ozone column data,
which is required by certain GEOS-Chem full-chemistry simulations.

NOTE: Newer met field products (e.g. GEOS-FP) contain overhead ozone
columns.  GEOS-Chem will use the overhead ozone from the met fields
wherever possible, thus rendering this data more or less obsolete.  We
shall keep it for backwards compatibility with older model versions.

This directory also contains the present-day (2006-2010 mean) and
long-term (1971-2010 climatological mean) TOMS ozone column data.  This
data was submitted by Jenny Fisher on August 21, 2015 in bpch format for
use in the mercury simulation.

Files:
===============================================================================

TOMS_O3col_*.geos.1x1.nc -- NetCDF files (converted from binary punch
     format) containing 12 months of TOMS/SBUV data on the GEOS-Chem
     1x1 grid, for years 1971 to 2010.

     Resolution  : GEOS-Chem 1 x 1 grid (360 x 180 boxes)
     Units       : dobsons (O3 columns); dobsons/day (tendencies)
     Timestamps  : Monthly, 1971/01 thru 2010/12
     Compression : Level 1 (e.g. nccopy -d1)

TOMS_O3col_presentday.geos.2x25.nc -- NetCDF file containing 12 months
     of TOMS/SBUV 2006-2010 mean data on the GEOS-Chem 2x2.5 grid.

     Resolution  : GEOS-Chem 2 x 2.5 grid (144 x 91 boxes)
     Units       : dobsons (O3 columns)
     Timestamps  : Monthly, 1985/01 thru 1985/12 (dummy year)
     Compression : Level 1 (e.g. nccopy -d1)

TOMS_O3col_longterm.geos.2x25.nc -- NetCDF file containing 12 months of
     TOMS/SBUV 1971-2010 mean data on the GEOS-Chem 2x2.5 grid.

     Resolution  : GEOS-Chem 2 x 2.5 grid (144 x 91 boxes)
     Units       : dobsons (O3 columns)
     Timestamps  : Monthly, 1985/01 thru 1985/12 (dummy year)
     Compression : Level 1 (e.g. nccopy -d1)

References:
===============================================================================

The 1x1 data comes from the raw data file

   toms_sbuv.v8.mod_v3.70-11.5x10.rev5.txt

which is available from the website:

   http://acdb-ext.gsfc.nasa.gov/Data_services/merged/

The data are first split into individual monthly files with the IDL
routine:

   /home/bmy/archive/data/IDL/cut.pro

Then the data are regridded and saved to binary punch format with the
IDL program:

   /home/bmy/archive/data/IDL/reader_10x5.pro

Currently we have data from 1971-2010.

These data files are used by routine "toms_mod.f" of the GEOS-Chem
source code, especially with older met field products.  This data was
processed by Jenny Fisher (jennyf@uow.edu.au).  Also see the original
README file below.

The 2x2.5 present-day and long-term data come from the bpch files:

   TOMS_O3col_presentday.geos.2x25
   TOMS_O3col_longterm.geos.2x25

These files were submitted by Jenny Fisher as part of the Arctic Hg
update incorporated into GEOS-Chem v11-01c in August 2015.  The bpch
files were converted to netCDF by the GEOS-Chem Support Team as
described below.

How the netCDF files were created:
===============================================================================

1x1 O3 data:

Bob Yantosca wrote the following Perl script to convert the existing
binary punch files to COARDS-compliant netCDF files for use with HEMCO.
This script calls the GAMAP routine BPCH2COARDS and then uses NCO
commands to fix variable names and attributes accordingly.

   #!/usr/bin/perl -w

   require 5.003;   # Need this version of Perl or newer
   use English;     # Use English language
   use Carp;        # Get detailed error messages
   use strict;

   my $year    = "";
   my $inFile  = "";
   my $outFile = "";
   my $ncFile  = "";
   my $txt     = "";
   my $cmd     = "";
   my $batch   = "batch.pro";
   my $result  = "";
   my $endDate = "";

   foreach $year ( 1971 .. 2010 ) {

     # Input file for bpch2coards
     $inFile = "./TOMS_O3col_$year.geos.1x1";

     # Output file for bpch2coards
     $outFile = "./TOMS_O3col_$year.geos.1x1.%date%.nc";

     print "$inFile\n$outFile\n";

     # Make an IDL batch script that calls BPCH2COARDS
     $txt = <<EOF;
   bpch2coards, '$inFile', '$outFile'
   exit
   EOF

     open( O, ">$batch" ) or die "Cannot open $batch!\n";
     print O $txt;
     close( O );

     # Split the bpch file up into *.nc files
     $cmd    = "idl $batch";
     $result = qx/$cmd/;

     # Concatenate files
     $ncFile = "TOMS_O3col_$year.geos.1x1.nc";
     $cmd    = "ncrcat -hO TOMS_O3col_$year.geos.1x1.$year*01.nc $ncFile";
     $result = qx/$cmd/;

     # Remove temporary files
     if ( -f $ncFile ) {
       $cmd    = "rm TOMS_O3col_$year.geos.1x1.$year*01.nc";
       $result = qx/$cmd/;
     } else {
       print "Problem creating the file $ncFile!\n";
       exit(1);
     }

     # Use NCO commands to edit the attributes and variable names
     $endDate = "$year"."1231";
     $cmd  = "";
     $cmd .= qq/ncrename -v TOMS_O3__TOMS,TOMS $ncFile\n/;
     $cmd .= qq/ncatted -a gamap_category,TOMS,o,c,"TOMS-O3" $ncFile\n/;
     $cmd .= qq/ncatted -a units,TOMS,o,c,"dobsons" $ncFile\n/;
     $cmd .= qq/ncatted -a missing_value,TOMS,o,f,-999.0 $ncFile\n/;
     $cmd .= qq/ncatted -a _FillValue,TOMS,o,f,-999.0 $ncFile\n/;
     $cmd .= qq/ncrename -v TOMS_O3__DTOMS1,DTOMS1 $ncFile\n/;
     $cmd .= qq/ncatted -a gamap_category,DTOMS1,o,c,"TOMS-O3" $ncFile\n/;
     $cmd .= qq/ncatted -a units,DTOMS1,o,c,"dobsons\/day" $ncFile\n/;
     $cmd .= qq/ncatted -a missing_value,DTOMS1,o,f,-999.0 $ncFile\n/;
     $cmd .= qq/ncatted -a _FillValue,DTOMS1,o,f,-999.0 $ncFile\n/;
     $cmd .= qq/ncrename -v TOMS_O3__DTOMS2,DTOMS2 $ncFile\n/;
     $cmd .= qq/ncatted -a gamap_category,DTOMS2,o,c,"TOMS-O3" $ncFile\n/;
     $cmd .= qq/ncatted -a units,DTOMS2,o,c,"dobsons\/day" $ncFile\n/;
     $cmd .= qq/ncatted -a missing_value,DTOMS2,o,f,-999.0 $ncFile\n/;
     $cmd .= qq/ncatted -a _FillValue,DTOMS2,o,f,-999.0 $ncFile\n/;
     $cmd .= qq/ncatted -a End_Date,global,o,c,"$endDate" $ncFile\n/;
     $cmd .= qq/ncatted -a End_Time,global,o,c,"23:59:59.9" $ncFile\n/;
     $cmd .= qq/ncatted -a Contact,global,o,c,"GEOS-Chem Support Team (geos-chem-support\@as.harvard.edu)" $ncFile\n/;
     $cmd .= qq/ncatted -a References,global,o,c,"www.geos-chem.org; wiki.geos-chem.org" $ncFile\n/;
     $cmd .= qq/ncatted -a Title,global,o,c,"TOMS\/SBUV column ozone for use with GEOS-Chem" $ncFile\n/;
     $cmd .= qq/ncatted -a delta_t,time,o,c,"0000-01-00 00:00:00" $ncFile\n/;
     $result = qx/$cmd/;
   }

   exit(0);
   #EOC

2x2.5 O3 data (present-day and long-term):

Lizzie Lundgren wrote the following bash shell script to convert the
existing binary punch files to COARDS-compliant netCDF files for use
with HEMCO.  This script calls the GAMAP routine BPCH2COARDS and then
uses CDO and NCO commands to fix variable names and attributes
accordingly.

   #!/bin/bash

   FILE1="TOMS_O3col_presentday.geos.2x25"
   FILE2="TOMS_O3col_longterm.geos.2x25"
   OFILE1="tmp1.%DATE%.nc"
   OFILE2="tmp2.%DATE%.nc"
   OUTFILE1="TOMS_O3col_presentday.geos.2x25.nc"
   OUTFILE2="TOMS_O3col_longterm.geos.2x25.nc"

   # Read files and convert to netCDF
   idl << idlenv
   bpch2coards, '$FILE1','$OFILE1'
   bpch2coards, '$FILE2','$OFILE2'
   idlenv

   # Merge into output files
   cdo mergetime tmp1.*.nc $OUTFILE1
   cdo mergetime tmp2.*.nc $OUTFILE2

   # Change variable names
   cdo chname,TOMS_O3__,TOMS_PD $OUTFILE1 tmp1.nc
   cdo chname,TOMS_O3__,TOMS_LT $OUTFILE2 tmp2.nc

   # Change units
   ncatted -a units,TOMS_PD,o,c,'dobsons' tmp1.nc $OUTFILE1
   ncatted -a units,TOMS_LT,o,c,'dobsons' tmp2.nc $OUTFILE2

   # Change long names
   ncatted -a long_name,TOMS_PD,o,c,'Present-day TOMS O3 columns (2006-2010 mean)' $OUTFILE1 tmp1.nc
   ncatted -a long_name,TOMS_LT,o,c,'Long-term TOMS O3 columns (1971-2010 mean)' $OUTFILE2 tmp2.nc
   mv tmp1.nc $OUTFILE1
   mv tmp2.nc $OUTFILE2

   # Cleanup
   rm -f tmp1.*.nc
   rm -f tmp2.*.nc
   rm -f tmp*.nc

   exit 0

###############################################################################
#####           ORIGINAL README FOLLOWS BELOW (1x1 data)                  #####
###############################################################################

README -- describes contents of TOMS_200906
27 Feb 2012
GEOS-Chem Support Team
geos-chem-support@g.harvard.edu

Data source and version:
-------------------------

Version 8 Merged Ozone Data Sets
Total Ozone Revision 05
DATA THROUGH: DEC 2011
LAST MODIFIED: 25 JAN 2011

http://acdb-ext.gsfc.nasa.gov/Data_services/merged/index.html

TOMS/SBUV MERGED TOTAL OZONE DATA, Version 8, Revision 5.
Resolution:  5 x 10 deg.

* Includes reprocessed N16 and N17 SBUV/2 data using latest calibration.
* OMI data updated from Collection 2 to Collection 3.
* New offsets derived based on revised data sets.
* 1970-1972 N4 BUV data added with no adjustments.  User may wish to
  apply offset based on comparisons between BUV and Dobson measurements.

Responsible NASA officials:

   Dr. Richard Stolarski (Richard.S.Stolarski@nasa.gov)
   Stacey Frith (Stacey.M.Frith@nasa.gov)

Methodology:
-------------------------

FAST-J comes with its own default O3 column climatology (from McPeters
1992 & Nagatani 1991), which is stored in the input file "jv_atms.dat".
These "FAST-J default" O3 columns are used in the computation of the
actinic flux and other optical quantities for the FAST-J photolysis.

The TOMS/SBUV O3 columns and 1/2-monthly O3 trends (contained in the
TOMS_200701 directory) are read into GEOS-Chem by routine READ_TOMS in
"toms_mod.f".  Missing values (i.e. locations where there are no data)
in the TOMS/SBUV O3 columns are defined by the flag -999.

After being read from disk in routine READ_TOMS, the TOMS/SBUV O3 data
are passed to the FAST-J routine "set_prof.f".  In "set_prof.f", a test
is done to make sure that the TOMS/SBUV O3 columns and 1/2-monthly
trends do not have any missing values at the (lat,lon) location for the
given month.  If the data are valid, then the TOMS/SBUV O3 column data
are interpolated to the current day and used to weight the "FAST-J
default" O3 column.  This essentially "forces" the "FAST-J default" O3
column values to better match the observations, as defined by TOMS/SBUV.
If there are no TOMS/SBUV O3 columns (and 1/2-monthly trends) at a
(lat,lon) location for a given month, then FAST-J will revert to its own
"default" climatology for that location and month.  Therefore, the TOMS
O3 can be thought of as "overlay" data -- it is only used where it
exists.

Note that there are no TOMS/SBUV O3 columns at the higher latitudes
(i.e. during polar night).  At these locations, the code will revert to
using the "FAST-J default" O3 columns.  But because we don't call FAST-J
in boxes where it is night, this essentially means that we have replaced
the entire FAST-J O3 column climatology with the TOMS/SBUV O3 column
product.

As of Feb 2012, we have TOMS/SBUV data for 1971 thru 2011.  The 2012
data are incomplete as of this writing.  Because Jan 2012 data are
needed to process the 2011 data, only 1971-2010 have been processed.

This methodology was originally adopted by Mat Evans.
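The overlay-and-interpolation logic described above can be sketched as
follows.  This is a minimal Python illustration, not the actual Fortran
in "set_prof.f": the function name is hypothetical, the half-month split
is an assumed reading of the DTOMS1/DTOMS2 tendencies, and the real
routine rescales the FAST-J vertical profile so that its column matches
the interpolated value rather than returning the column directly.

```python
MISSING = -999.0  # missing-data flag used in the TOMS/SBUV files

def overhead_o3_column(toms, dtoms1, dtoms2, day, default_col):
    """Sketch of choosing the O3 column for one (lat,lon) box.

    toms        : monthly TOMS/SBUV O3 column [dobsons]
    dtoms1      : first-half-of-month tendency [dobsons/day]
    dtoms2      : second-half-of-month tendency [dobsons/day]
    day         : current day of the month (1-31)
    default_col : "FAST-J default" climatological column [dobsons]
    """
    # "Overlay" behavior: fall back to the FAST-J default climatology
    # wherever TOMS/SBUV has no data (flagged by -999)
    if MISSING in (toms, dtoms1, dtoms2):
        return default_col

    # Interpolate the mid-month TOMS value to the current day using
    # the appropriate half-month daily tendency
    if day <= 15:
        return toms + dtoms1 * (day - 15)
    return toms + dtoms2 * (day - 15)
```

For example, a box flagged -999 simply keeps the default climatology,
while a valid box drifts away from its mid-month value at the tendency
rate as the month progresses.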