README -- describes contents of Yuan_XLAI directory
08 Mar 2019
Jenny Fisher (jennyf@uow.edu.au)
GEOS-Chem Support Team (geos-chem-support@g.harvard.edu)

Overview:
===============================================================================

This directory contains the Yuan et al. (2011) processed MODIS LAI
(BNU product).

Reference:
----------
Yuan, H., Dai, Y., Xiao, Z., Ji, D., Shangguan, W., 2011. Reprocessing the
MODIS Leaf Area Index Products for Land Surface and Climate Modelling.
Remote Sensing of Environment, 115(5), 1171-1187. doi:10.1016/j.rse.2011.01.001

URL:
----
http://globalchange.bnu.edu.cn/research/lai

These files were processed from the data files:

   ExtData/CHEM_INPUTS/MODIS_LAI_201806/MODIS.LAI.vBNU.generic.01x01.2005.nc

in order to separate out the LAI values corresponding to the 73 land types
(0..72) of the 2001 Olson Land Map into separate netCDF variables.  This
facilitates regridding by HEMCO, as each land type can then be multiplied by
a mask denoting the grid boxes where that land type is present.

The files listed below were created by following the steps described in the
sections that follow.

Files:
===============================================================================

Yuan_proc_MODIS_XLAI.025x025.YYYY.nc
-- Weekly Yuan processed MODIS LAI data for a given year (2005-2015).
   The "XLAI" denotes that the LAI data has been separated into 73 netCDF
   variables, one per Olson land type.  This is necessary for regridding
   via HEMCO (for backwards compatibility with the existing dry deposition
   module in GEOS-Chem).

   Source      : Yuan et al. (2011), http://globalchange.bnu.edu.cn/research/lai
   Resolution  : Generic 0.25 x 0.25 grid (1440 x 720 boxes)
   Units       : cm2/cm2
   Timestamps  : Every 8 days, starting from YYYY/01/02, where YYYY is the
                 year listed in the file name
   Compression : Level 1 (e.g. nccopy -d1)
   Chunking    : nccopy -c lon/1440,lat/720,time/1

   NOTE: The HEMCO philosophy is to use the data at the highest resolution
   possible; however, the native files are at an extremely high resolution
   of 30 arc-seconds.  We have regridded here to 0.25 degrees not only to
   save space, but also to match the resolution of the Olson land map
   (which is necessary for regridding by land type via HEMCO to work).

Condensed_MODIS_XLAI_vBNU.025x025.YYYY.nc
-- The same MODIS LAI data as in the MODIS_XLAI_vBNU.025x025.YYYY.nc files,
   but with each of the 73 Olson land types stored as a separate level
   instead of as 73 separate variables.  This is necessary to facilitate
   GCHP I/O (at least for MAPL prior to v1.0.0).

   Source      : Yuan et al. (2011), http://globalchange.bnu.edu.cn/research/lai
   Resolution  : Generic 0.25 x 0.25 grid (1440 x 720 boxes)
   Units       : cm2/cm2
   Timestamps  : Every 8 days, starting from YYYY/01/02, where YYYY is the
                 year listed in the file name
   Compression : Level 1 (e.g. nccopy -d1)
   Chunking    : nccopy -c lon/1440,lat/720,time/1
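The Compression and Chunking settings listed above for both files can be
applied in a single nccopy call.  A minimal sketch (in.nc and out.nc are
placeholder file names):

   # Deflate at level 1 and chunk as lon/1440, lat/720, time/1
   nccopy -d1 -c lon/1440,lat/720,time/1 in.nc out.nc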
How the 0.1 x 0.1 files (MODIS.LAI.vBNU.generic.01x01.2005.nc) were created:
===============================================================================

The 0.1 x 0.1 files were created as follows:

1) Download the original 30-second resolution netCDF files from
   http://globalchange.bnu.edu.cn/research/lai#download.
   This results in a set of very high resolution yearly files:

      global_30s_YYYY.nc

2) Regrid using cdo.  Note this requires a large amount of memory; on my
   system at least, it required submitting to an interactive queue with a
   request for 30 GB of memory.  The cdo command uses the included grid
   description file:

      cdo remapdis,globalgrid_gen_01x01 global_30s_YYYY.nc LAI.01x01.YYYY.nc

   Note: YYYY in the above command should be replaced by the year in question.
   This results in a set of 0.1 x 0.1 resolution yearly files:

      LAI.01x01.YYYY.nc

3) Fix time, variable names, attributes, etc.  The remapping command above
   messes up the time dimension, which must be extracted from the original
   file and added to the new file.  Other attributes must also be fixed,
   compression added, etc.  All steps are done using the included bash script:

      ./fix_LAIv2.bash YYYY

   Note: YYYY in the above command should be replaced by the year in question.
   This results in a set of 0.1 x 0.1 resolution yearly HEMCO-ready files:

      MODIS.LAI.vBNU.generic.01x01.YYYY.nc

4) Repeat steps 1-3 for all years.  The following intermediate files can then
   be deleted:

      global_30s_YYYY.nc
      LAI.01x01.YYYY.nc

Further steps for regridding the data to 0.25 x 0.25 for use with HEMCO:
===============================================================================

All scripts are stored in the scripts/ folder.

NOTE: Originally data was stored in for_HEMCO and scripts were stored in
for_HEMCO/scripts.

Step 1:
-------
Run the Python script regrid_BNU_LAI_to_025x025.py.  This regrids the
0.1 x 0.1 degree data files MODIS.LAI.vBNU.generic.01x01.YYYY.nc to
0.25 x 0.25 degree resolution.  The new files are saved with filenames
MODIS.LAI.vBNU.generic.025x025.YYYY.nc, where YYYY = {2005..2016}.

Step 2:
-------
Edit the NCL script make_XMODIS_for_HEMCO.ncl: infile_modis should be the
path that points to the 0.25 x 0.25 degree data files
MODIS.LAI.vBNU.generic.025x025.YYYY.nc.

Step 3:
-------
Run the script make_XMODIS.run.  This will call the script
make_XMODIS_for_HEMCO.ncl, which produces a single netCDF file for each of
the 46 time slices of output.  Output files will be placed in per-year
folders, e.g.:

   2005/MODIS_XLAI_vBNU.025x025.2005_002.nc

Step 4:
-------
Run the script combine_files.sh.  This will use the NCO ncrcat operator to
combine all of the 46 files (one per MODIS BNU time slice) into a single
file (see the sketch after Step 5):

   MODIS_XLAI_vBNU.025x025.2005.nc

Step 5:
-------
Run the script fix_attrs.sh.  This will use the NCO ncatted operator to fix
variable attributes for lon, lat, and time for COARDS compliance.
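For reference, a minimal sketch of the kind of NCO commands that Steps 4 and
5 rely on is shown below.  This is an illustration only: the actual
combine_files.sh and fix_attrs.sh scripts may use different file paths and
attribute values, and the attribute values shown here are assumptions.

   # Concatenate the 46 per-time-slice files along the time dimension
   # (paths follow the Step 3 example and are placeholders):
   ncrcat 2005/MODIS_XLAI_vBNU.025x025.2005_*.nc MODIS_XLAI_vBNU.025x025.2005.nc

   # Overwrite coordinate-variable attributes with COARDS-style values
   # (the attribute values below are assumed examples):
   ncatted -a units,lon,o,c,"degrees_east"  \
           -a units,lat,o,c,"degrees_north" \
           -a long_name,time,o,c,"Time"     \
           MODIS_XLAI_vBNU.025x025.2005.nc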
To create the Condensed*.nc files, follow these additional steps:
==============================================================================

Step 6:
-------
Clone the CsGrid repository if you do not already have it:

   git clone https://github.com/geoschem/CsGrid.git

Step 7:
-------
Edit the Matlab script condense_data.m as follows:
 (a) Add the full pathname to your local clone of the CsGrid repository.
 (b) Add the full pathname to the folder where the
     MODIS_XLAI_vBNU.025x025.YYYY.nc files are found.
 (c) Change the start and end years of the data (default is 2005 to 2016).

Step 8:
-------
Open Matlab (ask your IT staff if you have to first load it via a module or
other system command).  Then once the Matlab prompt appears, type:

   condense_data

Step 9:
-------
Compress and chunk the Condensed*.nc data files.  We recommend that you use
our nc_chunk.pl script.  See:
http://wiki.geos-chem.org/Working_with_netCDF_data_files#Chunking_and_deflating_a_netCDF_file_to_improve_I.2FO

Further notes:
==============================================================================

The GCSC has requested that we call this product "Yuan processed MODIS LAI".
Therefore, we have to run two additional scripts:

rename.sh        : Renames data files to reflect that these are now Yuan
                   processed MODIS LAI data.
change_titles.sh : Changes the title string in the netCDF global attributes.
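For reference, a minimal sketch of the kind of commands these two scripts run
is shown below.  This is an illustration only: the actual rename.sh and
change_titles.sh scripts may differ, and both the rename pattern and the
title string shown here are assumed examples.

   # rename.sh step: rename files to the "Yuan processed" naming convention
   for f in MODIS_XLAI_vBNU.025x025.*.nc; do
      mv "$f" "${f/MODIS_XLAI_vBNU/Yuan_proc_MODIS_XLAI}"
   done

   # change_titles.sh step: overwrite the "title" global attribute
   ncatted -a title,global,o,c,"Yuan processed MODIS LAI (XLAI)" \
           Yuan_proc_MODIS_XLAI.025x025.2005.nc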