Frequently Asked

  • How do I pull Simple Subset Wizard subsetted AIRS data from the GES DISC using wget, before and after 3 Oct 2016, the date at which FTP is no longer supported?

    The GES DISC announced on 28 Jun 2016 that access to GES DISC data will require users to be registered with the Earthdata Login System.  This affects users who pull data using SSW as well as other processes that employ anonymous FTP. Go to this URL to register, to learn about the newly supported download procedures, and to learn how to set up the items in your home directory that they require:

    URL: http://disc.sci.gsfc.nasa.gov/registration/registration-for-data-access

    The GES DISC has provided a test area for exercising the registration process and data access via HTTP and OPeNDAP, from the command line and from applications. All GES DISC data users are encouraged to register with the Earthdata Login system and to test their data access via HTTP/OPeNDAP and scripts using this test area. User credentials for the Earthdata Login system will remain the same once user registration is enabled.

    One of the supported procedures uses wget.  The instructions for its use provided by the GES DISC are bare bones. After some experimentation, we have created the following C-shell script to aid users.  It assumes that the items the GES DISC advises you to create in your home directory are present.

    NOTE: there are at least 3 versions of wget currently available on various operating systems.
    Many users have version 1.12, which may encounter problems unless the workaround described below is performed. Versions 1.14 and 1.18 do not encounter this problem, but it may be best to execute the workaround anyway, just in case.

    GES DISC advised that after you register, you take the following actions on your system (I am assuming UNIX in the following text)
       cd ~  or cd $HOME
       touch .netrc
       echo "machine urs.earthdata.nasa.gov login <yourID> password <yourPW>" > .netrc
       chmod 0600 .netrc
       touch .urs_cookies

    WORKAROUND if you have wget version 1.12
      cd ~  or  cd $HOME
      rm .urs_cookies
      wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies http://airsl1.gesdisc.eosdis.nasa.gov/data/.dummy.hdf -O /dev/null

    NOW you must get the URL list of subsetted files from the SSW and save the list in a text file named SSW.txt, so that you may use the following shell script to download them into the current directory (i.e., execute the shell script while the current directory contains SSW.txt).  Note that the GES DISC allows only 2000 SSW URLs to be presented at a time, so if you are attempting to access more data than that you must make multiple subset lists using SSW. These lists may be concatenated using the UNIX "cat" command, but be sure to remove the blank line that may exist at the end of each 2000-URL list before doing so.
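    The concatenation step just described can be sketched as a short pipeline. The filenames and URLs below are placeholders, created only so the example is self-contained; substitute whatever names you gave your saved SSW lists:

```shell
# Placeholder input files standing in for two saved SSW URL lists,
# each ending in a blank line (as saved SSW lists may):
printf 'http://example.invalid/granule1.hdf\n\n' > SSW_part1.txt
printf 'http://example.invalid/granule2.hdf\n\n' > SSW_part2.txt

# Concatenate the lists, dropping any blank lines, to build SSW.txt:
cat SSW_part1.txt SSW_part2.txt | sed '/^[[:space:]]*$/d' > SSW.txt

wc -l < SSW.txt   # 2 (one line per URL, blank lines removed)
```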

    Once you have SSW.txt in your current directory, you can execute this shell script (I call mine ssw_wget)
    (NOTE: once you are registered, you may use this to download SSW files before October 3rd)

    #!/bin/csh
    # download data from GES DISC using wget to current directory
    # I define shell variable OPTS to make the wget command line of manageable length
    # Starting 3 Oct 2016, the GES DISC will no longer support anonymous FTP
    # ~/.netrc is edited and then made private with chmod 0600 .netrc
    # content of .netrc is single line:
    #       machine urs.earthdata.nasa.gov login <userid> password <password>
    # create .urs_cookies in ~ using touch
    #
    #
    #NOTE: if using wget Version 1.12, you must first do the following
    # rm ~/.urs_cookies
    # wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies http://airsl1.gesdisc.eosdis.nasa.gov/data/.dummy.hdf  -O /dev/null
    # Then this script will work
    # there are wget versions V1.14 and V1.18
    #
    #
    # NOTE:
    # -A "match string including wild cards" allows me to specify filter for what is to be downloaded, including using wildcard
    # -r is recursive, -l1 means maximum depth of 1, --no-parent means references to the parent directory are ignored
    # -nc meaning no clobber, i.e. if wget was interrupted and started again, will not overwrite files already downloaded
    # -np meaning do not pull from other directories
    # -nd meaning do not create directories --- this results in pulling the file(s) I want and placing them in current directory without replicating directory tree -- BINGO!!!
    # wget -nd -nc -np -r -l1 --no-parent -A "AIRS.*.hdf" $OPTS $CPU/$DIR/
    #
    echo "wget SSW data from GES DISC using file downloaded by browser"
    set CURRDIR = `pwd`
    echo "Files Transferred into Current Directory: $CURRDIR"
    echo ""
    set OPTS = " --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies "
    echo "Options: $OPTS"
    wget --content-disposition -nd -nc -np -r -l1 --no-parent $OPTS -i ./SSW.txt
    exit

     

  • How do I pull AIRS data from the GES DISC using wget, before and after 3 Oct 2016, the date at which FTP is no longer supported?

    The GES DISC announced on 28 Jun 2016 that access to GES DISC data will require users to be registered with the Earthdata Login System.  This affects users who pull data using SSW as well as other processes that employ anonymous FTP. Go to this URL to register, to learn about the newly supported download procedures, and to learn how to set up the items in your home directory that they require:

    URL: http://disc.sci.gsfc.nasa.gov/registration/registration-for-data-access

    The GES DISC has provided a test area for exercising the registration process and data access via HTTP and OPeNDAP, from the command line and from applications. All GES DISC data users are encouraged to register with the Earthdata Login system and to test their data access via HTTP/OPeNDAP and scripts using this test area. User credentials for the Earthdata Login system will remain the same once user registration is enabled.

    One of the supported procedures uses wget.  The instructions for its use provided by the GES DISC are bare bones. After some experimentation, we have created the following C-shell script to aid users.  It assumes that the items the GES DISC advises you to create in your home directory are present.

    NOTE: there are at least 3 versions of wget currently available on various operating systems.
    Many users have version 1.12, which may encounter problems unless the workaround described below is performed. Versions 1.14 and 1.18 do not encounter this problem, but it may be best to execute the workaround anyway, just in case.

    GES DISC advised that after you register, you take the following actions on your system (I am assuming UNIX in the following text)
       cd ~  or cd $HOME
       touch .netrc
       echo "machine urs.earthdata.nasa.gov login <yourID> password <yourPW>" > .netrc
       chmod 0600 .netrc
       touch .urs_cookies

    WORKAROUND if you have wget version 1.12
      cd ~  or  cd $HOME
      rm .urs_cookies
      wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies http://airsl1.gesdisc.eosdis.nasa.gov/data/.dummy.hdf -O /dev/null

    NOW you can execute the following shell script (I call mine urs_wget) to download the identified data into the current directory.
    This example pulls 6 granules of AIRS Level 2 RetStd.
    NOTE: once you have registered, you can use the wget script to pull data, even before October 3rd
    #!/bin/csh
    # download data from GES DISC using wget to current directory
    # I define shell variable OPTS to make the wget command line of manageable length
    # Starting 3 Oct 2016, the GES DISC will no longer support anonymous FTP
    # ~/.netrc is edited and then made private with chmod 0600 .netrc
    # content of .netrc is single line:
    #       machine urs.earthdata.nasa.gov login <userid> password <password>
    # create .urs_cookies in ~ using touch
    #
    # Although this script DOES work with wget Version 1.12, you might still wish to do the following
    # rm ~/.urs_cookies
    # wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies http://airsl1.gesdisc.eosdis.nasa.gov/data/.dummy.hdf  -O /dev/null
    # there are wget versions V1.14 and V1.18
    #
    # -A "match string including wild cards" allows me to specify filter for what is to be downloaded, including using wildcard
    # -r is recursive, -l1 means maximum depth of 1, --no-parent means references to the parent directory are ignored
    # add -nc meaning no clobber, i.e. if wget was interrupted and started again, will not overwrite files already downloaded
    # add -np meaning do not pull from other directories
    # add -nd meaning do not create directories
    #
    echo "wget AIRS data from GES DISC"
    set CURRDIR = `pwd`
    echo "Current Directory: $CURRDIR"
    echo ""
    set OPTS = " --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies "
    echo "Options: $OPTS"
    # pull granules 120 through 125 of AIRS L2 RetStd
    set CPU = "ftp://airsl2.gesdisc.eosdis.nasa.gov"
    set DIR = "data/s4pa/Aqua_AIRS_Level2/AIRX2RET.006"
    set YYYY = "2016"
    set DOY = "180"
    wget --content-disposition -nd -nc -np -r -l1 --no-parent -A "AIRS.*12[0-5].*.hdf" $OPTS $CPU/$DIR/$YYYY/$DOY/
    exit

     

  • How do I use FTP to pull AIRS data from the GES DISC without using a browser (until 3 Oct 2016, when FTP will no longer be supported)?

    Here is how to pull AIRS Level 1 data granules, starting from the UNIX prompt

    initiate the session by typing
    ftp airsl1.gesdisc.eosdis.nasa.gov
    all commands you type that follow will be at the ftp> prompt

    you will be asked to provide a username and password
    simply type anonymous for the username and guest for the password

    change directory to the data and date you wish to access
    in this example, the data are AIRS L1 radiances and the date is 28 June 2016
    note that date is day of year 180

    ftp> cd data/s4pa/Aqua_AIRS_Level1/AIRIBRAD.005/2016/180

    if you list the contents of that subdirectory by typing ls you will find it contains 4 different types of data
    the data files end in  .jpg  .gz   .xml  .hdf
    the data you wish are those files ending in .hdf
    Note that there are up to 240 of each type of data, since there are 240 AIRS granules per day

    ftp supports mget, and you can easily specify multiple files by using the wildcard, *
    but you must also suppress the ftp prompt, or you will be asked to respond y or n for each file
    turn off the prompt by typing
    ftp> prompt
    this is a toggle, i.e. you can turn it on again by typing prompt again

    now you can download all 240 .hdf files by typing
    ftp> mget AIRS.*.hdf
    the download of all files will take a while

    if you only wished to download granule 120, you could type
    ftp> mget AIRS.*.120.*.hdf

    when you wish to end the ftp session, type
    ftp> bye
    and you will be returned to your UNIX prompt on the local CPU. The data you downloaded will reside
    in the current directory
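    The interactive session above can also be scripted. The sketch below only builds the command file (AIRIBRAD is the AIRS IR L1B shortname, matching the walkthrough's example date); the final piped invocation is left commented out, since it requires the anonymous FTP service to still be available:

```shell
# Write the ftp commands from the walkthrough above into a file.
# "user anonymous guest" replaces the interactive login prompts.
cat > ftp_cmds.txt <<'EOF'
user anonymous guest
cd data/s4pa/Aqua_AIRS_Level1/AIRIBRAD.005/2016/180
prompt
mget AIRS.*.120.*.hdf
bye
EOF

# To run non-interactively (-n suppresses auto-login):
#   ftp -n airsl1.gesdisc.eosdis.nasa.gov < ftp_cmds.txt
```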

  • How do I FTP AIRS Level 1 and Level 2 data products from the GES DISC using a browser (until 3 Oct 2016, when FTP will no longer be supported)?

    The FTP server for AIRS V5 Level 1 data products is
         airsl1.gesdisc.eosdis.nasa.gov
    and the directory path to data products is
         data/s4pa/Aqua_AIRS_Level1/SHORTNAME.005/YYYY/DDD
    where:
     SHORTNAME is the product
          AIRIBRAD for AIRS IR, AIRABRAD for AMSU MW, AIRVBRAD for AIRS visible/near-IR
     and
          YYYY is the year (2002 through the current year)
          DDD is the day of year (001 through 365, or 366 for a leap year)
    You may access L1 data products via your browser using this URL
         ftp://airsl1.gesdisc.eosdis.nasa.gov/data/s4pa/Aqua_AIRS_Level1/SHORTNAME.005/YYYY/DDD

    The FTP server for AIRS V6 Level 2 physical data products is
         airsl2.gesdisc.eosdis.nasa.gov
    and the directory path to data products is
         data/s4pa/Aqua_AIRS_Level2/SHORTNAME.006/YYYY/DDD
    where:
    SHORTNAME is the product
         AIRX2RET for AIRS/AMSU Standard L2 physical product
         AIRX2SUP for AIRS/AMSU Support L2 physical product
         AIRI2CCF for AIRS/AMSU L2 Cloud Cleared IR radiances
    and
         YYYY is the year
         DDD is the day of year
    You may access L2 data products via your browser using this URL
         ftp://airsl2.gesdisc.eosdis.nasa.gov/data/s4pa/Aqua_AIRS_Level2/SHORTNAME.006/YYYY/DDD
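    As a sketch, substituting SHORTNAME/YYYY/DDD into the Level 2 pattern above can be done in the shell (POSIX sh syntax here, unlike the C-shell scripts elsewhere in this FAQ). The values chosen (AIRX2RET, 28 June 2016 = day 180) are just one possible example:

```shell
# Build the L2 directory URL from the pattern above.
SHORTNAME=AIRX2RET   # the L2 standard physical product
YYYY=2016
DDD=180              # day of year for 28 June 2016
URL="ftp://airsl2.gesdisc.eosdis.nasa.gov/data/s4pa/Aqua_AIRS_Level2/${SHORTNAME}.006/${YYYY}/${DDD}"
echo "$URL"
```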

  • How do I access multi-year ozone time series data for a specific location, for example Mt Licancabur, Bolivia (22.48S, 67.47W), using the Simple Subset Wizard (SSW) or GIOVANNI?

    There are two options:

    a) download the V6 Level 2 data using the Simple Subset Wizard (SSW) for your analysis

    b) use the online GIOVANNI tool

    Here is how to use the two options.

    a) Use SSW to identify AIRS granules within your desired date range and spatial bounding box, and to subset desired products
    Navigate to the URL: http://disc.sci.gsfc.nasa.gov/SSW/
    In the pull-down box for data set:
      Choose Goddard Earth Sciences Data and Information Services Center
      Choose Aqua AIRS
      Choose AIRX2RET v006 [2002-08-30 - Present]
    This will result in the URL becoming:
      http://disc.sci.gsfc.nasa.gov/SSW/#keywords=AIRX2RET%20006
    Now specify the date range and spatial bounding box. Assume you wish to access 9 Jan 2004 through 18 Jan 2004, and data within a 2 deg x 2 deg spatial box centered upon Mt Licancabur, Bolivia
      Enter date range: 2004-01-09 to 2004-01-18
      Enter spatial bounding box: -24.48, -69.47, -20.48, -65.47
        (South, West, North, East)
      (example image attached, SSW_example_1.tiff)
    Then click the Search for Data Sets button
    A new page appears, stating that 1 subsettable data set was found
      (example image attached, SSW_example_2.tiff)
      If you click on the box to the right of the +, you will see that all 216 variables are currently selected
      This is the full content of each granule of AIRS L2 data
    If this is what is desired, then click the View Subset Results button to continue to the download option

    If you wish to subset the data, click the + ; for example, suppose you desire only total column O3
      (example image attached, SSW_example_3.tiff, shows selection of the items you might require)
    Then click Subset Selected Data Sets
    After a brief period, a new page is shown stating that the number of variables selected = 4
    (example image attached, SSW_example_4.tiff)

    Click on View Subset Results
      You get a new page showing the list of 28 subsetted granules that match your date range and spatial bounding box.
    Subset: Variables for AIRX2RET v006 (Get list of URLs for this subset in a file) (Downloading instructions)
    AIRS.2004.01.09.058.L2.SUBX2RET.v6.0.7.0.G13109011211.hdf
    AIRS.2004.01.09.179.L2.SUBX2RET.v6.0.7.0.G13109014503.hdf
    AIRS.2004.01.09.180.L2.SUBX2RET.v6.0.7.0.G13109015408.hdf
    AIRS.2004.01.10.049.L2.SUBX2RET.v6.0.7.0.G13109021613.hdf
    AIRS.2004.01.10.065.L2.SUBX2RET.v6.0.7.0.G13109021359.hdf
    AIRS.2004.01.10.187.L2.SUBX2RET.v6.0.7.0.G13109024643.hdf
    AIRS.2004.01.11.056.L2.SUBX2RET.v6.0.7.0.G13109030751.hdf
    AIRS.2004.01.11.177.L2.SUBX2RET.v6.0.7.0.G13109035014.hdf
    AIRS.2004.01.11.178.L2.SUBX2RET.v6.0.7.0.G13109035154.hdf
    AIRS.2004.01.12.063.L2.SUBX2RET.v6.0.7.0.G13109041947.hdf
    AIRS.2004.01.12.185.L2.SUBX2RET.v6.0.7.0.G13109044051.hdf
    AIRS.2004.01.13.054.L2.SUBX2RET.v6.0.7.0.G13109051208.hdf
    AIRS.2004.01.13.175.L2.SUBX2RET.v6.0.7.0.G13109053608.hdf
    AIRS.2004.01.13.176.L2.SUBX2RET.v6.0.7.0.G13109054504.hdf
    AIRS.2004.01.14.061.L2.SUBX2RET.v6.0.7.0.G13109060646.hdf
    AIRS.2004.01.14.183.L2.SUBX2RET.v6.0.7.0.G13109063512.hdf
    AIRS.2004.01.15.052.L2.SUBX2RET.v6.0.7.0.G13109065830.hdf
    AIRS.2004.01.15.174.L2.SUBX2RET.v6.0.7.0.G13109073939.hdf
    AIRS.2004.01.15.190.L2.SUBX2RET.v6.0.7.0.G13109073547.hdf
    AIRS.2004.01.16.059.L2.SUBX2RET.v6.0.7.0.G13109075707.hdf
    AIRS.2004.01.16.180.L2.SUBX2RET.v6.0.7.0.G13109082739.hdf
    AIRS.2004.01.16.181.L2.SUBX2RET.v6.0.7.0.G13109082805.hdf
    AIRS.2004.01.17.050.L2.SUBX2RET.v6.0.7.0.G13109085345.hdf
    AIRS.2004.01.17.066.L2.SUBX2RET.v6.0.7.0.G13109085748.hdf
    AIRS.2004.01.17.188.L2.SUBX2RET.v6.0.7.0.G13109092611.hdf
    AIRS.2004.01.18.057.L2.SUBX2RET.v6.0.7.0.G13109095110.hdf
    AIRS.2004.01.18.178.L2.SUBX2RET.v6.0.7.0.G13109102416.hdf
    AIRS.2004.01.18.179.L2.SUBX2RET.v6.0.7.0.G13109102414.hdf
    Now you can click on the link “list of URLs” to download. Click on “Downloading instructions” for info.

    b) GIOVANNI to access and do some processing/image creation
    Navigate to the URL: http://giovanni.gsfc.nasa.gov/giovanni/
    Choose type of analysis you wish to perform (Maps, Comparisons, Time Series, etc, etc)
    For example, select Time Series: Area-Averaged.
    Then specify the same date range and bounding box as above
    (note that the order of spatial bounds is different: West, South, East, North)
    Discipline:
      Atmospheric Chemistry
        Choose Ozone Total Column Daytime AIRX3STD v006
        and Ozone Total Column Nighttime AIRX3STD v006 (you could do them separately)
    Click Plot Data
      GIOVANNI cranks away for a while and produces two images
      (daytime and nighttime separately, images attached)
    GIOVANNI also allows the user to download the data used to create the images.
    Click on “Downloads” and choose ASCII CSV for data and PNG for images.

  • How do I download AIRS L3 Monthly product in ASCII format?

    The Simple Subset Wizard (SSW) option allows you to select an area and time range and convert to ASCII:

    http://disc.sci.gsfc.nasa.gov/SSW/

    Fill in the data set keyword, spatial range and date range
    AIRX3STM
    2003-01-01  2003-12-31     (this gets the full year of 2003)
    22,39,60,79                         (bounding box: S, W, N, E)

    Then click Search for Data Sets

    you will be presented with two data sets, V6 and V5.  CHOOSE V006 (click the check box at the beginning, just to the right of the + sign) and select ASCII from the pull-down menu to the right of the subset
    (note: if you click on the + sign at the left, you will be presented with subset selection)

    Click Subset Selected Data Sets (even though you have not subsetted)

    Click View Subset Results

    If you click on a file hot link, it will immediately download

    You can instead click on “list of URLs” to get a set of FTP instructions to execute (see the downloading instructions)

  • How do I subset Version 6 data?

    You may subset AIRS V6 Data Products using the Simple Subset Wizard at the GES DISC

    Access the AIRS V6 Data Products URL
    http://disc.sci.gsfc.nasa.gov/AIRS/data-holdings/by-data-product-V6
    Click on DataAccess link for the product of interest

    Example: choose DataAccess for AIRX2SUP, the AIRS L2 Support Product
    Clicking DataAccess leads to this URL
    http://disc.sci.gsfc.nasa.gov/datacollection/AIRS2SUP_V006.html?AIRS2SUP&#tabs-2

    Click on the Simple Subset Wizard link
    This opens up the web page
    http://disc.sci.gsfc.nasa.gov/SSW/#keywords=AIRX2SUP%20006

    Specify a date range and spatial bounding box, then click the Search button.
    This opens a page stating that a subsettable data set was found
    Click on the empty check box to the right of the + sign to select ALL products in the data set, OR...
    Click on the + sign to see a list of the products, which allows selection of those you wish

    Each product has a + sign next to it that allows you to subset further, i.e., you can filter out fields you do not wish to include in your subset by leaving them unchecked.  For example, Air Temperature Variables includes the TAirSup 100-level vector as well as TSurfAir, Temp_ave_kern, and other items.  You could simply choose TSurfAir, TSurfAir_QC, and TSurfAirErr if that particular product is the only one of interest.

    After selecting fields of interest, choose between HDF and gzipped HDF format for the delivered data, and then click the SUBSET SELECTED DATA SETS button at the bottom of the page.

    The data will be subsetted.  Then click the View Subset Results button.  You will see the list of subsetted granules that have been prepared.

    Click the Get List of URLs or the Downloading Instructions links.

  • Using ArcGIS, I get different bounds for Level 3 data products depending upon whether they are V5 or V6, despite the fact that both are 1x1 deg grids covering the same spatial area.

    The V5 and V6 Level 3 gridded products share identical 1x1 degree grids.  The latitudes and longitudes of the grid box centers are provided in the data (LATITUDE, LONGITUDE).  The upper left box center location is (89.5,-179.5) and the lower right box center location is (-89.5, +179.5). The spatial extent of the 1x1 degree grid spans the upper left (+90, -180) to lower right (-90, +180).

    Many software packages, and ArcGIS is one such, look at the metadata to determine the spatial extent of the data sets rather than the grid resolution and spatial arrays themselves. The V6 metadata incorrectly specifies the spatial extent because its values are set based on the upper left and lower right grid box centers.  The V5 metadata correctly provided the spatial extent values based on the upper left and lower right grid box outer edges.

    For example, if you peruse the V6 metadata, you will find:

    GROUP=GridStructure
        GROUP=GRID_1
            GridName="ascending"
            XDim=360
            YDim=180
            UpperLeftPointMtrs=(-179030000.000000,89030000.000000)
            LowerRightMtrs=(180030000.000000,-90030000.000000)


    Whereas, if you peruse the V5 metadata, you will find:

    GROUP=GridStructure
        GROUP=GRID_1
            GridName="ascending"
            XDim=360
            YDim=180
            UpperLeftPointMtrs=(-180000000.000000,90000000.000000)
            LowerRightMtrs=(180000000.000000,-90000000.000000)

    If your software analysis package uses the metadata rather than the grid resolution and latitude/longitude data arrays to determine spatial extent, you must override the grid bounds metadata of the V6 data set.

    The V6 metadata providing the corners, as required by ArcGIS, is incorrect and this will be fixed in a future release of AIRS products.  We recommend that ArcGIS users of the V6 L3 data products override the values of UpperLeftPointMtrs and LowerRightMtrs, setting them to those found in the V5 L3 data products metadata, i.e.:

            UpperLeftPointMtrs=(-180000000.000000,90000000.000000)
            LowerRightMtrs=(180000000.000000,-90000000.000000)
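    As a cross-check, the corrected corner values can be derived from the grid itself rather than taken on faith: on a 1x1 degree grid, the outer edges sit half a cell beyond the outermost box centers. A small awk sketch of that arithmetic:

```shell
# Derive the grid's outer edges from the box centers reported in the data
# (upper-left center 89.5N,179.5W; lower-right center 89.5S,179.5E; 1 deg cells).
awk 'BEGIN {
    res    = 1.0                      # grid resolution, degrees
    ul_lon = -179.5; ul_lat =  89.5   # upper-left box center
    lr_lon =  179.5; lr_lat = -89.5   # lower-right box center
    printf "UpperLeft:  (%.1f, %.1f)\n", ul_lon - res/2, ul_lat + res/2
    printf "LowerRight: (%.1f, %.1f)\n", lr_lon + res/2, lr_lat - res/2
}'
# prints (-180.0, 90.0) and (180.0, -90.0), matching the V5 metadata values
```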

  • How do I convert the format of AIRS granules from HDF to netCDF?

    The GES DISC provides a service that allows you to convert a granule from HDF to netCDF on the fly before download.  Here is how you do it:

    1) Access the AIRS V6 data by product URL at GES DISC

        http://disc.sci.gsfc.nasa.gov/AIRS/data-holdings/by-data-product-V6

    2) Choose the shortname of the product you wish to access.  For this demonstration let us assume we wish to pull the AIRS-Only L2 support product granule number 210 for 23 Jan 2013.  The shortname for the AIRS-Only L2 support product is AIRS2SUP. Find this shortname in the first column of the first table on the web page and click on the "DataAccess" link in the last column.

    3) Several data access services are provided on the resulting URL.  The one that currently supports on-the-fly conversion to netCDF is OPENDAP. Click on the data service access URL provided under the OPENDAP section. Note that the URL already has the shortname and version number in it as keywords (they are AIRS2SUP and 006)

    4) You are now presented with a listing of years (look to ensure that the subdirectory listed in the path at the top of the page is AIRS2SUP.006).  Since the year for the granule is 2013, click on that year.

    5) You are now presented with a listing of day numbers, from 1 to 365 (366 if a leap year). The January 23 date is easy, for that day is 023 of the year.  If you were looking for March 1, you would have to be aware of whether the year is a non-leap year (daynum = 31 + 28 + 1 = 60) or a leap year (daynum = 31 + 29 + 1 = 61).  Click on 023 since that is the date desired.

    6) Now you are presented with links to every one of the 240 granules that are created each day. Note that the name of the granule contains year, month, day of month, granule number, L2, RetSup_IR, and a version number.  Also note that the granules are HDF.  Click on granule 210.

    7) In the action line, click on the "Get as NetCDF" button.  There will be a pause as GES DISC converts the HDF file to a netCDF file. Then you will be given the opportunity to download the result to your computer.

    8) IMPORTANT NOTE: By default, if you follow the steps above the complete contents of the HDF file will be converted over to the netCDF file.  You are also given the capability to subset only the variable you desire. Before clicking "Get as NetCDF", choose the variables you desire by checking them off in the "Variables" section.  Then click "Get as NetCDF".  Only those variables will be present in the (much smaller) netCDF file.

  • How does one convert AIRS Level 3 support product specific humidity column density from molecules per square centimeter to millimeters of precipitable water vapor?

    The AIRS V6 Level 3 support product provides specific humidity column density profiles as layer quantities (H2OCDSup), in units of molecules/cm^2, on 100 support pressure levels. Use the TqJoint (ascending or descending) quantity.
    H2OCDSup is reported on the support pressure level that bounds the layer closest to the surface.  For example, the 75th element of the 100 in the H2OCDSup vector corresponds to the 496.63 hPa pressure level, and provides the average specific humidity for the layer bounded by that pressure level and the next one higher in the atmosphere (which is the 477.961 hPa pressure level).

    Define:
    Mwv = 18.0154 grams/mole = water molecular mass
    RHOwv = 1.0 grams/cm^3  = water mass density
    NA = 6.02214 x 10^23 molecules/mole = Avogadro's number
    CD = water vapor column density, molecules/cm^2 = H2OCDSup[n] where n=75 for the layer bounded by pressure levels 477.961 hPa and 496.63 hPa
    pwv = precipitable water vapor contained within the layer, millimeters

    Calculation:
    pwv = 10.0 x CD x Mwv / (NA x RHOwv)

    Multiple layers in a profile are accumulated by performing the calculation for each individually and then summing the results.  Performing this calculation for a profile will not provide an exact match to the total precipitable water vapor (totH2OStd) in the AIRS product because the bottom layer is determined by the surface pressure (and not the standard pressure array) and the calculation for that layer nearest the surface must be done by finely slicing it to achieve an accurate value for its water vapor content.
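    A quick numeric sketch of the calculation above, using a hypothetical column density of CD = 1.0 x 10^22 molecules/cm^2 (an illustrative value, not taken from any granule):

```shell
# pwv = 10 * CD * Mwv / (NA * RHOwv), evaluated with awk
awk 'BEGIN {
    Mwv   = 18.0154      # water molecular mass, g/mole
    RHOwv = 1.0          # water mass density, g/cm^3
    NA    = 6.02214e23   # Avogadro constant, molecules/mole
    CD    = 1.0e22       # hypothetical column density, molecules/cm^2
    pwv   = 10.0 * CD * Mwv / (NA * RHOwv)
    printf "pwv = %.3f mm\n", pwv    # about 3 mm for this layer
}'
```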

  • How does one convert AIRS Level 2 support product specific humidity column density from molecules per square centimeter to millimeters of precipitable water vapor?

    The AIRS V6 Level 2 support product provides specific humidity column density profiles as layer quantities (H2OCDSup), in units of molecules/cm^2, on 100 support pressure levels.
    H2OCDSup is reported on the support pressure level that bounds the layer closest to the surface.  For example, the 75th element of the 100 in the H2OCDSup vector corresponds to the 496.63 hPa pressure level, and provides the average specific humidity for the layer bounded by that pressure level and the next one higher in the atmosphere (which is the 477.961 hPa pressure level).

    Define:
    Mwv = 18.0154 grams/mole = water molecular mass
    RHOwv = 1.0 grams/cm^3  = water mass density
    NA = 6.02214 x 10^23 molecules/mole = Avogadro's number
    CD = water vapor column density, molecules/cm^2 = H2OCDSup[n] where n=75 for the layer bounded by pressure levels 477.961 hPa and 496.63 hPa
    pwv = precipitable water vapor contained within the layer, millimeters

    Calculation:
    pwv = 10.0 x CD x Mwv / (NA x RHOwv)

    Multiple layers in a profile are accumulated by performing the calculation for each individually and then summing the results.  Performing this calculation for a profile will not provide an exact match to the total precipitable water vapor (totH2OStd) in the AIRS product because the bottom layer is determined by the surface pressure (and not the standard pressure array) and the calculation for that layer nearest the surface must be done by finely slicing it to achieve an accurate value for its water vapor content.

  • How does one convert AIRS Level 3 standard specific humidity product from grams of water vapor per kilogram of dry air (g/kg) to millimeters of precipitable water vapor?

    The AIRS V6 Level 3 provides standard product specific humidity profiles as level quantities (H2O_MMR) and as layer quantities (H2O_MMR_Lyr).  Both are given in units of g/kg of dry air, but only one is easily convertible to millimeters of precipitable water vapor.

    Use the TqJoint (ascending or descending) quantity, H2O_MMR_Lyr, which provides the average specific humidity in a layer bounded by two standard pressure levels.  It is reported on the pressure level that bounds the layer closest to the surface.  For example, the 7th element of the 15 in the H2O_MMR_Lyr vector corresponds to the 500 hPa pressure level, and provides the average specific humidity for the layer bounded by that pressure level and the next one higher in the atmosphere (which is the 400 hPa pressure level).


    Define:
    dP = thickness of layer, hPa = (lower boundary pressure - upper boundary pressure)
    g = 9.80665 m/s^2 = gravitational acceleration at the surface of the Earth
    w = water vapor mixing ratio, g/kg = H2O_MMR_Lyr[n] where n=7 for the layer bounded by pressure levels 400 hPa and 500 hPa
    pwv = precipitable water vapor contained within the layer, millimeters

    Calculation:
    pwv = 0.1 x (dP/g) x w

    and for the example, dP = 100 hPa

    Multiple layers in a profile are accumulated by performing the calculation for each individually and then summing the results.  Performing this calculation for a profile will not provide an exact match to the total precipitable water vapor (totH2OVap) in the AIRS L3 product because the bottom layer is determined by the surface pressure (and not the standard pressure array), and the calculation for the layer nearest the surface must be done by finely slicing it to achieve an accurate value for its water vapor content.  Note that totH2OVap units are kg/m^2, which is equivalent to millimeters of precipitable water, since the density of water is 1 g/cm^3.
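    A quick numeric sketch of the layer calculation above, for the example layer (dP = 500 - 400 = 100 hPa) and a hypothetical mixing ratio of w = 1.0 g/kg (an illustrative value, not taken from any granule):

```shell
# pwv = 0.1 * (dP / g) * w, evaluated with awk
awk 'BEGIN {
    g   = 9.80665   # gravitational acceleration, m/s^2
    dP  = 100.0     # layer thickness, hPa (500 hPa - 400 hPa)
    w   = 1.0       # hypothetical mixing ratio, g/kg
    pwv = 0.1 * (dP / g) * w
    printf "pwv = %.3f mm\n", pwv    # about 1 mm for this layer
}'
```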

  • How does one convert AIRS Level 2 specific humidity product from grams of water vapor per kilogram of dry air (g/kg) to millimeters of precipitable water vapor?

    The AIRS V6 Level 2 product provides specific humidity profiles as level quantities (H2OMMRLevStd) and as layer quantities (H2OMMRStd).  Both are given in units of g/kg of dry air, but only one is easily convertible to millimeters of precipitable water vapor.

    Use the quantity, H2OMMRStd, which provides the average specific humidity in a layer bounded by two standard pressure levels.  It is reported on the pressure level that bounds the layer closest to the surface.  For example, the 7th element of the 15 in the H2OMMRStd vector corresponds to the 500 hPa pressure level, and provides the average specific humidity for the layer bounded by that pressure level and the next one higher in the atmosphere (which is the 400 hPa pressure level).

    Define:
    dP = thickness of layer, hPa = (lower boundary pressure - upper boundary pressure)
    g = 9.80665 m/s^2 = gravitational constant of acceleration at the surface of the Earth
    w = water vapor mixing ratio, g/kg = H2OMMRStd[n] where n=7 for layer bounded by pressure levels  400 hPa and 500 hPa
    pwv = precipitable water vapor contained within the layer, millimeters

    Calculation:
    pwv = 0.1 x (dP/g) x w

    and for the example, dP = 100 hPa

    Multiple layers in a profile are accumulated by performing the calculation for each layer individually and then summing the results.  Performing this calculation for a full profile will not exactly match the total precipitable water vapor (totH2OStd) in the AIRS product, because the bottom layer is bounded by the surface pressure (not the standard pressure array), and the layer nearest the surface must be finely sliced to obtain an accurate value for its water vapor content.
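    The layer calculation above can be sketched in Python.  The pressure boundaries and mixing-ratio values below are illustrative only, not taken from a real granule:

    ```python
    # Convert an AIRS layer mixing ratio (H2OMMRStd, g/kg) to precipitable
    # water vapor in millimeters using pwv = 0.1 * (dP / g) * w.
    # The layer boundaries and mixing ratios below are hypothetical examples.

    G = 9.80665  # m/s^2, gravitational acceleration at the Earth's surface

    def layer_pwv(p_lower_hPa, p_upper_hPa, w_g_per_kg):
        """Precipitable water vapor (mm) in one layer bounded by two pressure levels."""
        dP = p_lower_hPa - p_upper_hPa   # layer thickness, hPa
        return 0.1 * (dP / G) * w_g_per_kg

    # Example: the layer bounded by 500 hPa (lower) and 400 hPa (upper),
    # with an assumed average mixing ratio of 1.2 g/kg
    pwv = layer_pwv(500.0, 400.0, 1.2)

    # Accumulate several layers by computing each individually, then summing
    layers = [(600.0, 500.0, 2.0), (500.0, 400.0, 1.2), (400.0, 300.0, 0.5)]
    total = sum(layer_pwv(lo, hi, w) for lo, hi, w in layers)
    ```

    As noted above, summing all standard layers this way will approximate, but not exactly reproduce, the total column quantity reported in the product.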

  • Where might I download the AIRS Spectral Response Functions (SRFs)?

    You can download the AIRS SRFs by pointing your browser at this URL:

    ftp://asl.umbc.edu/pub/airs/srf/

  • Does the GES DISC allow the user to request AIRS HDF granules be converted to netCDF for download?

    Yes, both the Level 2 and Level 3 data products can be converted from HDF version 4 to netCDF before they are made available to the user for download. Level 2 may be converted to netCDF and Level 3 may be converted to netCDF or ASCII.

    There are two methods by which this may be accomplished:
      a) MIRADOR data access method allows subsetting and conversion to netCDF of both L2 and L3
      b) Simple Subset Wizard method allows subsetting of L2 and L3 but only converts L3 to netCDF or ASCII

    For MIRADOR, go to the V6 Data Products URL
    http://disc.sci.gsfc.nasa.gov/AIRS/data-holdings/by-data-product-V6

    Then click on the DataAccess link of the Data Product of interest (for example, AIRX2RET for L2)

    Select the URL for MIRADOR, in this example
    http://mirador.gsfc.nasa.gov/cgi-bin/mirador/homepageAlt.pl?keyword=AIRX2RET

    Set the Time Span and spatial location area and click the search button

    Select the desired data set among those found, click “add selected files to cart”

    Click on “Subset by Variable”

    Click the netCDF button under Output Format Options at the top of the page (or leave as HDF if desired)

    Modify the variable selection as desired (default is all)

    Then click the “Submit” button

    Then click “Continue to Cart”

    Then click “Checkout”

    You will be presented with several options for downloading the data

    For the Simple Subset Wizard, go to this URL
    http://disc.sci.gsfc.nasa.gov/SSW

    Click the “Select Data Sets”

    A window opens listing various data sets

    Expand the “Goddard Earth Sciences Data and Information Services Center” option

    Expand the "Aqua AIRS v006”  sub-option

    Click the “AIRX3STD 006” option (this is L3 daily) and then click “Choose” button at bottom of this window

    Enter the Date Range

    Enter the Spatial Bounding Box (click on map symbol to left to do this graphically)

    Click on “Search for Data Sets”

    You will be presented with a “found N subsettable data sets”

    Expand the set if you wish to subset, otherwise just check it. netCDF conversion is default but you can choose ASCII

    Then Click “Subset Selected Data Sets”

    Then click “View Subset Results”

    Then click the hot link to download, or click “Get list of URLs for this subset in a file”

    The list of URLs can then be provided to wget (see the Downloading instructions)
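    As a minimal sketch of that last step, the script below assembles a wget invocation that uses the urs_cookies file described in the registration instructions (wget reads the Earthdata credentials from ~/.netrc automatically).  The URL list filename, subset_urls.txt, is an assumption; substitute the file you saved from “Get list of URLs”:

    ```python
    # Sketch: feed a Simple Subset Wizard URL list to wget, reusing the
    # Earthdata Login cookie file created per the registration instructions.
    # "subset_urls.txt" is a hypothetical filename for the saved URL list.
    import os
    import subprocess

    def build_wget_command(url_list, home=None):
        """Assemble the wget argument list for a cookie-aware batch download."""
        home = home or os.path.expanduser("~")
        cookies = os.path.join(home, "urs_cookies")
        return [
            "wget",
            "--load-cookies", cookies,
            "--save-cookies", cookies,
            "--keep-session-cookies",
            "--content-disposition",   # keep the server-supplied filenames
            "-i", url_list,            # read the download URLs from the list file
        ]

    cmd = build_wget_command("subset_urls.txt")
    # subprocess.run(cmd, check=True)   # uncomment to start the download
    ```

    Keeping the command construction in a function makes it easy to adapt if your cookie file lives elsewhere.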

  • How do I access the AIRS Near Real Time Products?

    The AIRS Level 1B and Level 2 Near Real Time Products (NRT) are available through the LANCE server, and are available within 3 hours of Aqua satellite overpass.  

    The NRT products use a less accurate ephemeris/attitude than the official products (which become available at the GES DISC about 3 days after Aqua satellite overpass), resulting in positional offsets that are typically less than 100 meters.  Granule processing does not include the previous or subsequent granule (as is done for the official products), which introduces very small radiance biases in the first and last scans of each granule relative to the radiances in the official products.  The small positional errors also lead to minor errors in the interpolated surface pressure used by the Level 2 retrieval stage.  The impact of the radiance biases and surface pressure errors on the Level 2 NRT products is small, but sufficient to warrant caution if attempting to use the NRT products for precision scientific analyses.  The official products should be used for those types of analyses.  The NRT products are intended for quick-look studies in which time is of the essence, such as severe weather monitoring and imagery.

    See the web page
    http://disc.sci.gsfc.nasa.gov/nrt/data-holdings/airs-nrt-products/airs-nrt-products#Data

    The NRT products are L1B AIRS, AMSU, and VIS radiance products (AIRIBRAD, AIRABRAD, AIRVBRAD)

    The L2 Standard Products
    AIRS L2 physical retrieval standard product (AIRX2RET)
    AIRS L2 Cloud Cleared radiance product (AIRI2CCF)

    The L2 Support Product
    AIRS L2 physical retrieval support product (AIRX2SUP) and the QA products for L1B AIRS and VIS (AIRIBQAP, AIRVBQAP)

    Users who wish to access the NRT data products should use the link to the LANCE Registration page
    http://disc.sci.gsfc.nasa.gov/AIRS/data-holdings/nrt/announcements/to-all-lance-airs-and-lance-mls-near-real-time-users

    and self-register at the link
    https://earthdata.nasa.gov/urs/register

  • How do I access the AIRS data?

    Version 6 is our most recent release, and it is available for the full mission, Sept 2002 to the recent past.

    Version 6 documents are available at this URL
    http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v6_docs

    It is important that you read the documentation before you attempt to use the AIRS data.

    The key documents are:
    Data Release User Guide, Data Disclaimer, L2 Product User Guide, and  L3 User Guide

    Version 6 data products are available at this URL
    http://disc.sci.gsfc.nasa.gov/AIRS/data-holdings/by-data-product-V6

    You may wish to begin with the Level 3 data products, which are gridded at 1x1 deg resolution.  Use the _TqJoint data set contained within these products, as it applies consistent quality assurance selection filters to ensure that full profiles are averaged.  The product provides total integrated column water vapor burden and profiles (level quantities at pressure levels and layer quantities integrated between bounding pressure levels).  Atmospheric composition products (except for CO2) and surface products are also available in the Level 3 Standard Data Product files.

    The links to the AIRS Daily L3 Standard Data Product are available at this URL
    http://disc.sci.gsfc.nasa.gov/datacollection/AIRX3STD_V006.html?AIRX3STD&#tabs-2

    The FTP archive link for the AIRS Daily L3 Standard Data Product is
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRX3STD.006/

    The 8-day L3 standard product links are under the URL
    http://disc.sci.gsfc.nasa.gov/datacollection/AIRX3ST8_V006.html?AIRX3ST8&#tabs-2

    The calendar monthly L3 standard product links are under the URL
    http://disc.sci.gsfc.nasa.gov/datacollection/AIRX3STM_V006.html?AIRX3STM&#tabs-2

    Links to the FTP archives of each are also available for these two L3 products as well:
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRX3ST8.006/
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRX3STM.006/


    The Level 2 data products are calibrated and geolocated retrievals at instrument field-of-view resolution.

    The links to the AIRS L2 Standard Data Product are available at this URL
    http://disc.sci.gsfc.nasa.gov/datacollection/AIRX2RET_V006.html?AIRX2RET&#tabs-2

    The FTP archive link for the AIRS L2 Standard Data Product is
    ftp://airsl2.gesdisc.eosdis.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level2/AIRX2RET.006/

  • How do I access the AIRS CO2 data?

    The AIRS CO2 data product is currently produced by ingesting the Version 5 L2 data products, so these data are located on the V5 data page.  The URL is
    http://disc.sci.gsfc.nasa.gov/AIRS/data-holdings/by-data-product-v5

    Before January 2012, the post-processing AIRS CO2 retrieval stage ingested the AIRS L2 products derived using AIRS infrared and AMSU microwave radiances.  Due to a steady degradation of the noise figure of AMSU channel 5, the L2 product yield decreased throughout 2011, resulting in a drop in CO2 yield as well.  The AIRS CO2 retrieval stage was modified to ingest the L2 products derived using only the AIRS infrared radiances, and CO2 products from January 2012 onward are retrieved in this manner.  The CO2 post-processing algorithm is being modified to ingest the V6 L2 products.

    L2 data are calibrated and geolocated retrievals at a resolution of 100km x 100km.

    L3 data are binned on a 2° x 2.5° grid.  In the V6 release this will be changed to 1° x 1°.

    The AIRS CO2 data products are to be found at these URLs:

    Level 2, from September 2002 through December 2011:
    http://disc.sci.gsfc.nasa.gov/datacollection/AIRX2STC_V005.html?AIRX2STC&#tabs-2

    FTP archives:
    ftp://airsl2.gesdisc.eosdis.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level2/AIRX2STC.005/

    Level 3 Daily from September 2002 through December 2011:
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRX3C2D.005/

    Level 3 8-Day from September 2002 through December 2011:
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRX3C28.005/

    Level 3 Calendar Monthly from September 2002 through December 2011:
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRX3C2M.005/

    Level 2, from January 2012 to present:
    http://disc.sci.gsfc.nasa.gov/datacollection/AIRX2STC_V005.html?AIRX2STC&#tabs-2

    FTP archives:
    ftp://airsl2.gesdisc.eosdis.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level2/AIRX2STC.005/

    Level 3 Daily from January 2012 to present:
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRS3C2D.005/

    Level 3 8-Day from January 2012 to present:
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRS3C28.005/

    Level 3 Calendar Monthly from January 2012 to present:
    ftp://acdisc.gsfc.nasa.gov/ftp/data/s4pa/Aqua_AIRS_Level3/AIRS3C2M.005

  • How do I read the AIRS data products?

    The AIRS data products are provided in HDF version 4 format.  The Level 2 data products use the HDF-EOS swath format, whereas the Level 3 data products use the HDF-EOS grid format.

    We provide procedures in IDL and MATLAB (and more primitive ones in FORTRAN and C) on the documentation web page.
    http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v6_docs

    Scroll to the bottom of the page and you will find these two links, which download the codes together with README files and sample input and output files:

    http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v5_docs/AIRS_V5_Release_User_Docs/IDL_MATLAB_READERS.tar.gz

    http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v6_docs/v6releasedocs-1/FORTRAN_C_READERS.tar.gz

    The IDL and MATLAB procedures are well-documented and easiest to use.

Didn't find your question?

Submit a question and we'll get back to you as soon as possible. We'll also consider the question for inclusion in Ask AIRS.