Downloading Input Data

Static and initialization data

Two types of datasets will need to be available on your system to run WPS and WRF. The first is a static geographical dataset that is interpolated to the model domain defined by the geogrid program in WPS. To reduce the necessary size footprint, only a subset of coarse static geographical data is provided. The second is model initialization data that is also processed through WPS and the real.exe program to supply initial and lateral boundary condition information at the start and during the model integration.

In addition, a tarball containing the CRTM coefficient files is needed for GSI data assimilation, and a tarball of shapefiles is needed for running the Python plotting scripts.

This tutorial details how to download the data. If you already have the datasets on your local machine, you can instead point to that data when running the containers.
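Pointing to local data is done by bind-mounting the host directory into the container at run time with Docker's -v host:container flag. The sketch below only builds and prints the command rather than running it; the local path and image name are placeholders, so substitute the values for your setup:

```shell
# Build (dry-run) a docker run command that bind-mounts local data into a container.
LOCAL_DATA="/path/to/local/data"   # placeholder: where your datasets live on the host
IMAGE="dtcenter/example-image"     # placeholder: the component image you are running
MOUNT_ARG="-v ${LOCAL_DATA}:/data" # host:container mapping understood by docker run
CMD="docker run --rm ${MOUNT_ARG} ${IMAGE} ls /data"
echo "$CMD"
```

Review the printed command, substitute real paths, and run it directly once it looks right.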

Please follow the appropriate section below that fits your needs.

NOTE: If you do not plan on running all the test cases, you do not need to download all of the model input data; download only the cases you are interested in. All cases require the CRTM and WPS_GEOG data, however.
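If you do plan to fetch all three case datasets as Docker data containers (the pull/create steps are covered case by case in the sections below), the repeated commands can be scripted. This sketch is a dry run: it only prints the commands, with case names and dates taken from this tutorial, so you can review them before piping the output to `sh`:

```shell
# Print (dry-run) the pull/create commands for the three case data containers.
CMDS=""
for pair in sandy:20121027 snow:20160123 derecho:20120629; do
  name=${pair%%:*}   # text before the colon: the case name
  date=${pair##*:}   # text after the colon: the case date
  CMDS="${CMDS}docker pull dtcenter/${name}
docker create -v /data/${name}_${date} --name ${name} dtcenter/${name}
"
done
printf '%s' "$CMDS"
```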

On platforms other than NCAR's Cheyenne machine (covered below), you can download this data from the DTC website:

cd ${PROJ_DIR}/
mkdir data/
cd data/
curl -SL https://dtcenter.ucar.edu/dfiles/container_nwp_tutorial/tar_files/container-dtc-nwp-derechodata_20120629.tar.gz | tar zxC .
curl -SL https://dtcenter.ucar.edu/dfiles/container_nwp_tutorial/tar_files/container-dtc-nwp-sandydata_20121027.tar.gz | tar zxC .
curl -SL https://dtcenter.ucar.edu/dfiles/container_nwp_tutorial/tar_files/container-dtc-nwp-snowdata_20160123.tar.gz | tar zxC .
curl -SL https://dtcenter.ucar.edu/dfiles/container_nwp_tutorial/tar_files/CRTM_v2.3.0.tar.gz | tar zxC .
curl -SL https://dtcenter.ucar.edu/dfiles/container_nwp_tutorial/tar_files/wps_geog.tar.gz | tar zxC .
curl -SL https://dtcenter.ucar.edu/dfiles/container_nwp_tutorial/tar_files/shapefiles.tar.gz | tar zxC .

You should now see all the data you need to run the three cases in this directory, grouped into five directories:

ls -ald -- *
drwxr-xr-x   3 user  admin    96 Jul 21  2020 gsi/
drwxr-xr-x  35 user  admin  1120 May 13 22:52 WPS_GEOG/
drwxr-xr-x   3 user  admin    96 Nov 12  2018 model_data/
drwxr-xr-x   4 user  admin   128 Nov 12  2018 obs_data/
drwxr-xr-x   3 user  admin    96 Sep 10  2021 shapefiles/
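As a quick sanity check, you can confirm that all five directories unpacked. The sketch below simulates the expected layout in a temporary directory so it is safe to run anywhere; to check your real download, point the check loop at your data/ directory instead:

```shell
# Simulate the expected data/ layout in a temp dir, then verify every directory exists.
DATA_DIR=$(mktemp -d)
for d in gsi WPS_GEOG model_data obs_data shapefiles; do
  mkdir -p "${DATA_DIR}/${d}"    # stand-in for the directories the tarballs create
done

MISSING=""
for d in gsi WPS_GEOG model_data obs_data shapefiles; do
  [ -d "${DATA_DIR}/${d}" ] || MISSING="${MISSING} ${d}"
done

if [ -z "$MISSING" ]; then
  echo "all expected data directories present"
else
  echo "missing:${MISSING}"
fi
rm -rf "$DATA_DIR"
```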

For users of the NCAR Cheyenne machine, the input data has been staged on disk for you to copy. Make a directory named "data" and unpack the data there:

cd ${PROJ_DIR}/
mkdir data/
cd data/
Under tcsh:

foreach f (/glade/p/ral/jntp/NWP_containers/*.tar.gz)
  tar -xf "$f"
end

Under bash:

for f in /glade/p/ral/jntp/NWP_containers/*.tar.gz; do tar -xf "$f"; done

You should now see all the data you need to run the three cases in this directory:

ls -al
drwxr-xr-x 3 user ral 4096 Jul 21  2020 gsi
drwxrwxr-x 3 user ral 4096 Nov 12  2018 model_data
drwxrwxr-x 3 user ral 4096 Nov 12  2018 obs_data
drwxrwxr-x 4 user ral 4096 Sep 10  2021 shapefiles
drwxrwxr-x 35 user ral 4096 May 13 16:52 WPS_GEOG
jwolff Tue, 03/19/2019 - 11:36

Sandy Data

jwolff Mon, 03/25/2019 - 09:43

For the Hurricane Sandy case example, Global Forecast System (GFS) forecast files initialized at 12 UTC on 20121027 and extending out to 48 hours in 3-hr increments are provided. Prepbufr files from the North American Data Assimilation System (NDAS) are provided for point verification and data assimilation, and Stage II precipitation analyses are included for gridded verification purposes.
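Three-hourly output over a 48-hour forecast means 17 forecast times, from hour 000 through hour 048. The enumeration below makes that count concrete; the f0NN labels are only illustrative, not the actual file naming in the dataset:

```shell
# Enumerate the 3-hourly forecast hours of a 48-h forecast (f000 ... f048).
HOURS=""
h=0
while [ "$h" -le 48 ]; do
  HOURS="${HOURS} $(printf 'f%03d' "$h")"   # zero-padded forecast-hour label
  h=$((h + 3))
done
echo "$HOURS"
COUNT=$(echo "$HOURS" | wc -w)
echo "number of forecast times expected: $COUNT"
```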

There are two options for establishing the image from which the data container will be instantiated:

  1. Pull the image from Docker Hub
  2. Build the image from scratch

Please follow the appropriate section below that fits your needs.

Option 1: Pull the dtcenter/sandy image from Docker Hub

If you do not want to build the image from scratch, simply use the prebuilt image by pulling it from Docker Hub.

cd ${PROJ_DIR}
docker pull dtcenter/sandy

To instantiate the case data container, type the following:

docker create -v /data/sandy_20121027 --name sandy dtcenter/sandy
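The -v flag above creates a volume at /data/sandy_20121027 inside the (stopped) data container; other containers can attach that volume with Docker's --volumes-from flag. The sketch below builds the inspection command as a string for review rather than executing it, and busybox is assumed here only as a minimal image for listing the volume's contents:

```shell
# Build (dry-run) a command that lists the Sandy data volume from another container.
VOLUME="/data/sandy_20121027"
CHECK_CMD="docker run --rm --volumes-from sandy busybox ls ${VOLUME}"
echo "$CHECK_CMD"   # run this once the sandy data container has been created
```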

To see what images you have available on your system, type:

docker images

To see what containers exist on your system (both running and stopped), type:

docker ps -a

Option 2: Build the dtcenter/sandy image from scratch

To access the model initialization data for the Hurricane Sandy case from the Git repository and build it from scratch, first go to your project space where you cloned the repository:

cd ${PROJ_DIR}/container-dtc-nwp/components

and then build an image called dtcenter/sandy:

cd case_data/sandy_20121027 ; docker build -t dtcenter/sandy . ; cd ../..

This command goes into the case_data/sandy_20121027 directory and reads the Dockerfile directives. Please review the contents of the case_data/sandy_20121027/Dockerfile for additional information.

To instantiate the case data container, type the following:

docker create -v /data/sandy_20121027 --name sandy dtcenter/sandy

To see what images you have available on your system, type:

docker images

To see what containers exist on your system (both running and stopped), type:

docker ps -a

Snow Data

jwolff Mon, 03/25/2019 - 09:40

For the snow case example, Global Forecast System (GFS) forecast files initialized at 00 UTC on 20160123 and extending out to 24 hours in 3-hr increments are provided. Prepbufr files from the North American Data Assimilation System (NDAS) are provided for point verification and data assimilation, and MRMS precipitation analyses are included for gridded verification purposes.

There are two options for establishing the image from which the data container will be instantiated:

  1. Pull the image from Docker Hub
  2. Build the image from scratch

Please follow the appropriate section below that fits your needs.

Option 1: Pull the dtcenter/snow image from Docker Hub

If you do not want to build the image from scratch, simply use the prebuilt image by pulling it from Docker Hub.

cd ${PROJ_DIR}
docker pull dtcenter/snow

To instantiate the case data container, type the following:

docker create -v /data/snow_20160123 --name snow dtcenter/snow

To see what images you have available on your system, type:

docker images

To see what containers exist on your system (both running and stopped), type:

docker ps -a

Option 2: Build the dtcenter/snow image from scratch

To access the model initialization data for the snow case from the Git repository and build it from scratch, first go to your project space where you cloned the repository:

cd ${PROJ_DIR}/container-dtc-nwp/components

and then build an image called dtcenter/snow:

cd case_data/snow_20160123 ; docker build -t dtcenter/snow . ; cd ../..

This command goes into the case_data/snow_20160123 directory and reads the Dockerfile directives. Please review the contents of the case_data/snow_20160123/Dockerfile for additional information.

To instantiate the case data container, type the following:

docker create -v /data/snow_20160123 --name snow dtcenter/snow

To see what images you have available on your system, type:

docker images

To see what containers exist on your system (both running and stopped), type:

docker ps -a

Derecho Data

jwolff Mon, 03/25/2019 - 09:42

For the derecho case example, Global Forecast System (GFS) forecast files initialized at 12 UTC on 20120629 and extending out to 24 hours in 3-hr increments are provided. Prepbufr files from the North American Data Assimilation System (NDAS) are provided for point verification and data assimilation, and Stage II precipitation analyses are included for gridded verification purposes.

There are two options for establishing the image from which the data container will be instantiated:

  1. Pull the image from Docker Hub
  2. Build the image from scratch

Please follow the appropriate section below that fits your needs.

Option 1: Pull the dtcenter/derecho image from Docker Hub

If you do not want to build the image from scratch, simply use the prebuilt image by pulling it from Docker Hub.

cd ${PROJ_DIR}
docker pull dtcenter/derecho

To instantiate the case data container, type the following:

docker create -v /data/derecho_20120629 --name derecho dtcenter/derecho

To see what images you have available on your system, type:

docker images

To see what containers exist on your system (both running and stopped), type:

docker ps -a

Option 2: Build the dtcenter/derecho image from scratch

To access the model initialization data for the Derecho case from the Git repository and build it from scratch, first go to your project space where you cloned the repository:

cd ${PROJ_DIR}/container-dtc-nwp/components

and then build an image called dtcenter/derecho:

cd case_data/derecho_20120629 ; docker build -t dtcenter/derecho . ; cd ../..

This command goes into the case_data/derecho_20120629 directory and reads the Dockerfile directives. Please review the contents of the case_data/derecho_20120629/Dockerfile for additional information.

To instantiate the case data container, type the following:

docker create -v /data/derecho_20120629 --name derecho dtcenter/derecho

To see what images you have available on your system, type:

docker images

To see what containers exist on your system (both running and stopped), type:

docker ps -a