==========================================
Workflow Examples for GFS Initialization
==========================================

Introduction
============

This section takes the reader through the entire workflow from GFS input data to
Flexpart WRF SRM output. It does so in an incremental way, first introducing the
basics and illustrating with the use of standalone NWP components. By design,
complete workflows can be constructed in this standalone fashion, which allows
great flexibility in how and when individual components are executed. Several
examples are also provided to illustrate what will likely be the more common
workflows, in which a single workflow namelist (*wfnamelist*) is created for the
entire simulation.

By the end of this document, the reader should be able to start creating their
own custom workflows using GFS as input data. The use of ERA data is broadly
similar, but a few aspects differ enough to warrant separate coverage in the
document that follows.

Note that copies of all of the following wfnamelists (and other important files)
are available in the repository directory
**packages/ehratmworkflow/docs/UserPerspective/sample_workflows/**

.. code-block:: bash

    $ tree sample_workflows
    sample_workflows
    ├── flexp_input.txt
    ├── namelist.input.gfs_twonest
    ├── small_domain_twonest.nml
    ├── wfnamelist.donothing
    ├── wfnamelist.ecmwf-fullwps-twonest-4mpitasks
    ├── wfnamelist.ecmwf-metgridonly-twonest-4mpitasks
    ├── wfnamelist.ecmwfungrib
    ├── wfnamelist.flexwrf+srm
    ├── wfnamelist.fullworkflow
    ├── wfnamelist.fullwps-twonest
    ├── wfnamelist.geogrid-twonest-4mpitasks
    ├── wfnamelist.gfs-metgridonly
    ├── wfnamelist.gfs-realonly
    ├── wfnamelist.gfsungrib
    └── wfnamelist.gfs-wrfonly

    0 directories, 15 files

Finally, a copy of a recent :doc:`Workflow Namelist Reference ` document is
available for understanding the format and options within a wfnamelist.

----

ehratmwf.py - the workflow driver
=================================

We now begin the process of using **ehratmwf.py** to run a collection of
workflows, starting with the extremely simple and incrementally increasing the
complexity of the workflow namelists (*wfnamelist*):

* Basic usage message
* Simple “do nothing” wfnamelist, which ultimately generates an error message

With these first two examples, we gain confidence that our basic setup is
correct, and then move on to actual NWP workflows.

With the **PYTHONPATH** set, with a copy of **ehratmwf.py** in our working
directory, and with a link, **epython3**, to the Conda Python that’s already
available, we start the process of running **ehratmwf.py** by looking at its
usage message

.. code-block:: bash

    $ ./epython3 ehratmwf.py --help
    usage: ehratmwf.py [-h] -n WF_NAMELIST_PATH

    Enhanced High Res ATM

    optional arguments:
      -h, --help            show this help message and exit
      -n WF_NAMELIST_PATH, --namelist WF_NAMELIST_PATH
                            path to workflow namelist

As you can see, there is one required argument - the path to the wfnamelist.
So, in the next example we use an actual, but incomplete, wfnamelist,
**wfnamelist.donothing**, just to ensure that it is processed and that its
incompleteness is detected, generating an error

.. code-block:: bash

    $ cat wfnamelist.donothing
    &control
        workflow_list = 'ungrib'
        log_level = 'warning'
        workflow_rootdir = '/tmp'
    /
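Before running it, the environment described above needs to be in place. The
following is a minimal, illustrative setup sketch - the paths are hypothetical
and must be adapted to your own installation:

.. code-block:: bash

    # Hypothetical setup -- substitute the paths for your own installation
    export PYTHONPATH=/path/to/packages:${PYTHONPATH}   # lets ehratmwf.py import its packages
    ln -sf /path/to/conda/bin/python3 ./epython3        # the convenience link used in these examples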
.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.donothing
    Log level set to wfnamelist specified value: WARNING
    2023-09-11:18:29:31 --> Workflow started
    2023-09-11:18:29:31 --> Start process_namelist()

    *********************************************
    SETTING LOG LEVEL TO WFNAMELIST-SPECIFIED VALUE: 30
    *********************************************

    CRITICAL [ehratmwf:ehratmwf.py:run_workflow:1751] --> process_namelist() ABEND ==>
    Use log level of DEBUG for full stack trace and messages

The important takeaway from this example, which is expected to fail, is that
**ehratmwf.py** did indeed process the wfnamelist and generated error messages
that a developer or troubleshooter would be interested in. If you were to change
the **log_level** in the wfnamelist to **debug**, you would see much more
verbose output, and if you were to change the value of **workflow_list** to some
“junk” value, or the **workflow_rootdir** to a nonexistent path, you would see
other errors. In this way, the default behaviour is to keep the output “clean,”
with generated messages generally just letting you know that something went
wrong. Changing the **log_level** to a more verbose setting provides a generous
amount of detail for troubleshooting.

So, at this point, despite the error message, things have worked as expected,
and there is confidence that we can continue on to more meaningful examples. In
the next set of examples, we demonstrate the execution of the WPS components
**ungrib**, **geogrid** and **metgrid**, first as standalone operations, then in
a unified wfnamelist.

----

Simple GFS ungrib
=================

Starting with an ungrib component, we specify the single-component workflow in
**wfnamelist.gfsungrib**

.. code-block:: bash

    &control
        workflow_list = 'ungrib'
        workflow_rootdir = '/dvlscratch/ATM/morton/tmp'
    /

    &time
        start_time = '2021051415'
        end_time = '2021051421'
        wrf_spinup_hours = 0
    /

    &grib_input1
        type = 'gfs_ctbto'
        subtype = '0.5.fv3'
        hours_intvl = 3
        rootdir = '/ops/data/atm/live/ncep'
    /

The *workflow_rootdir* is where the run directory will be created, and
ultimately where output will be located. You will obviously want this to be a
directory writable by you.

.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.gfsungrib
    2023-09-12:00:49:45 --> Workflow started
    2023-09-12:00:49:45 --> Start process_namelist()
    2023-09-12:00:49:45 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_004945.055280
    2023-09-12:00:49:45 --> Start run_ungrib()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:00:49:45 --> Workflow started
    2023-09-12:00:49:45 --> Start process_namelist()
    2023-09-12:00:49:45 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_004945.055280
    2023-09-12:00:49:45 --> Start run_ungrib()
    2023-09-12:00:49:53 --> Workflow completed...

Running it should be straightforward, and it’s important to note the location
of the run directory, as we’ll need to point metgrid at the ungrib data located
there.
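Before moving on, it can be worth confirming that ungrib completed cleanly; one
simple check is to inspect the end of its log file inside the run directory:

.. code-block:: bash

    # Inspect the tail of the ungrib log in the run directory noted above
    $ tail /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_004945.055280/ungrib_rundir_gfs_ctbto/WPS/ungrib.log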
We can take a look into the run directory

.. code-block:: bash

    $ ls -F /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_004945.055280/ungrib_rundir_gfs_ctbto/WPS
    geogrid/           GRIBFILE.AAB@   namelist.wps.all_options  ungrib.exe@
    geogrid.exe@       GRIBFILE.AAC@   namelist.wps.fire*        ungrib.log
    GFS:2021-05-14_15  link_grib.csh*  namelist.wps.global       util/
    GFS:2021-05-14_18  metgrid/        namelist.wps.nmm          Vtable@
    GFS:2021-05-14_21  metgrid.exe@    README
    GRIBFILE.AAA@      namelist.wps    ungrib/

and what we find is a complete run directory for the ungrib operation just
performed - in fact, a full WPS run directory. We see the output files -
**GFS:2021-05-14_\*** - present as expected. At this point, we could actually
change into that directory and run **./ungrib.exe** manually, producing the
same output. This is particularly useful if we want to experiment with or
troubleshoot the process, using a run directory that has already been set up
for the task.

----

Simple two-nest geogrid
=======================

For this one, we’ll use **wfnamelist.geogrid-twonest-4mpitasks**, which refers
to the domain definition namelist, **small_domain_twonest.nml** (both of which
we copy to the local working directory for the following test). Note that most
of the file/dir entries in the wfnamelist support full pathnames; if these are
not used, paths are taken relative to the current working directory.

The domain(s) created by geogrid are defined in the domain definition namelist,
**small_domain_twonest.nml**

.. code-block:: bash

    &domain_defn
        parent_id         = 1,  1,
        parent_grid_ratio = 1,  3,
        i_parent_start    = 1,  20,
        j_parent_start    = 1,  20,
        e_we              = 51, 19,
        e_sn              = 42, 16,
        geog_data_res     = '10m', '5m'
        dx = 30000,
        dy = 30000,
        map_proj = 'lambert',
        ref_lat = 50.00,
        ref_lon = 5.00,
        truelat1 = 60.0,
        truelat2 = 30.0,
        stand_lon = 5.0,
    /

whose path (in this case, a relative path) is specified in the wfnamelist,
**wfnamelist.geogrid-twonest-4mpitasks**

.. code-block:: bash

    &control
        workflow_list = 'geogrid'
        workflow_rootdir = '/dvlscratch/ATM/morton/tmp'
    /

    &domain_defn
        domain_defn_path = 'small_domain_twonest.nml'
    /

    &geogrid
        num_mpi_tasks = 4
    /

.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.geogrid-twonest-4mpitasks
    2023-09-12:02:08:49 --> Workflow started
    2023-09-12:02:08:49 --> Start process_namelist()
    2023-09-12:02:08:49 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207
    2023-09-12:02:08:49 --> Start run_geogrid()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:02:08:49 --> Workflow started
    2023-09-12:02:08:49 --> Start process_namelist()
    2023-09-12:02:08:49 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207
    2023-09-12:02:08:49 --> Start run_geogrid()
    2023-09-12:02:09:00 --> Workflow completed...

Again, note the location of the run directory, as we’ll need this information
to run the standalone metgrid component.

.. code-block:: bash

    $ ls -F /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207/geogrid_rundir/WPS
    geo_em.d01.nc     geogrid.log.0001  metgrid.exe@              namelist.wps.nmm
    geo_em.d02.nc     geogrid.log.0002  namelist.wps              README
    geogrid/          geogrid.log.0003  namelist.wps.all_options  ungrib/
    geogrid.exe@      link_grib.csh*    namelist.wps.fire*        ungrib.exe@
    geogrid.log.0000  metgrid/          namelist.wps.global       util/
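As with ungrib, this run directory is fully set up for manual re-runs. A sketch
of re-running geogrid by hand with the same four MPI tasks - assuming an MPI
launcher such as **mpirun** is available on your system (the launcher the
workflow itself uses may differ):

.. code-block:: bash

    $ cd /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207/geogrid_rundir/WPS
    $ mpirun -np 4 ./geogrid.exe   # regenerates geo_em.d0?.nc and per-task geogrid.log.000? files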
The geogrid output files are **geo_em.d0?.nc**, and can be viewed with the
**ncview** utility (if it’s not in the general system path, a copy of the
executable is available in the repository, in **misc/utilities/ncview**)

.. code-block:: bash

    $ ncview /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207/geogrid_rundir/WPS/geo_em.d01.nc

.. |ncview-landmask| image:: LANDMASK-d01.png
   :scale: 70 %

|ncview-landmask|

----

Simple two-nest metgrid
=======================

For this one, we’ll use **wfnamelist.gfs-metgridonly**. Because it is
standalone, we need to specify where it can find the ungrib and geogrid files
produced above. You will obviously need to edit these paths for your own run
directories.

.. code-block:: bash

    &control
        workflow_list = 'metgrid'
    /

    &time
        start_time = '2021051415'
        end_time = '2021051418'
        wrf_spinup_hours = 0
    /

    &metgrid
        ug_prefix_01 = 'GFS'
        ug_path_01 = '/dvlscratch/ATM/morton/tmp/ehratmwf_20230912_004945.055280/ungrib_rundir_gfs_ctbto/WPS'
        geogrid_path = '/dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207/geogrid_rundir/WPS'
        num_nests = 2
        hours_intvl = 3
    /

And then, we run

.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.gfs-metgridonly
    2023-09-12:02:33:09 --> Workflow started
    2023-09-12:02:33:09 --> Start process_namelist()
    2023-09-12:02:33:09 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878
    2023-09-12:02:33:09 --> Start run_metgrid()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:02:33:09 --> Workflow started
    2023-09-12:02:33:09 --> Start process_namelist()
    2023-09-12:02:33:09 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878
    2023-09-12:02:33:09 --> Start run_metgrid()
    2023-09-12:02:33:11 --> Workflow completed...

.. code-block:: bash

    $ ls -l /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878/metgrid_rundir/WPS
    total 9752
    lrwxrwxrwx. 1 morton consult      91 Sep 12 02:33 geo_em.d01.nc -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207/geogrid_rundir/WPS/geo_em.d01.nc
    lrwxrwxrwx. 1 morton consult      91 Sep 12 02:33 geo_em.d02.nc -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_020849.715207/geogrid_rundir/WPS/geo_em.d02.nc
    drwxr-xr-x. 2 morton consult    4096 Sep 12 02:33 geogrid
    lrwxrwxrwx. 1 morton consult      75 Sep 12 02:33 geogrid.exe -> /dvlscratch/ATM/morton/WRFDistributions/WRFV4.3/WPS/geogrid/src/geogrid.exe
    lrwxrwxrwx. 1 morton consult     104 Sep 12 02:33 GFS:2021-05-14_15 -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_004945.055280/ungrib_rundir_gfs_ctbto/WPS/GFS:2021-05-14_15
    lrwxrwxrwx. 1 morton consult     104 Sep 12 02:33 GFS:2021-05-14_18 -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_004945.055280/ungrib_rundir_gfs_ctbto/WPS/GFS:2021-05-14_18
    -rwxr-xr-x. 1 morton consult    1331 Sep 12 02:33 link_grib.csh
    -rw-r--r--. 1 morton consult 4273660 Sep 12 02:33 met_em.d01.2021-05-14_15:00:00.nc
    -rw-r--r--. 1 morton consult 4273660 Sep 12 02:33 met_em.d01.2021-05-14_18:00:00.nc
    -rw-r--r--. 1 morton consult  589300 Sep 12 02:33 met_em.d02.2021-05-14_15:00:00.nc
    -rw-r--r--. 1 morton consult  589300 Sep 12 02:33 met_em.d02.2021-05-14_18:00:00.nc
    .
    .
    .

Again, this is a run directory set up for this specific metgrid component, with
links to the needed geogrid and ungrib inputs. The output is in the
*met_em.d0\*.nc* files, and can be viewed with the *ncview* utility.
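For example, to inspect the first output time of the outer domain (path taken
from the listing above):

.. code-block:: bash

    $ ncview /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878/metgrid_rundir/WPS/met_em.d01.2021-05-14_15:00:00.nc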
----

Full WPS ungrib + geogrid + metgrid Workflow
============================================

In general, we will be interested in a single wfnamelist that defines a full
workflow. The above examples were provided as more of an academic demonstration
of how components can be chained together separately for asynchronous
workflows. In the following, we define the entire workflow in one wfnamelist.
In this scenario, we will not need to keep track of where the geogrid and
ungrib outputs are - **ehratmwf.py** will handle all of that.

**wfnamelist.fullwps-twonest**

.. code-block:: bash

    &control
        workflow_list = 'ungrib', 'geogrid', 'metgrid'
    /

    &time
        start_time = '2021051415'
        end_time = '2021051421'
        wrf_spinup_hours = 0
    /

    &grib_input1
        type = 'gfs_ctbto'
        subtype = '0.5.fv3'
        hours_intvl = 3
        rootdir = '/ops/data/atm/live/ncep'
    /

    &domain_defn
        domain_defn_path = 'small_domain_twonest.nml'
    /

    &metgrid
        hours_intvl = 3
    /

Then, we run it, again noting the run directory which, in this case, will
include run directories for ungrib, geogrid and metgrid

.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.fullwps-twonest
    2023-09-12:02:43:34 --> Workflow started
    2023-09-12:02:43:34 --> Start process_namelist()
    2023-09-12:02:43:34 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_024334.533613
    2023-09-12:02:43:34 --> Start run_ungrib()
    2023-09-12:02:43:43 --> Start run_geogrid()
    2023-09-12:02:43:54 --> Start run_metgrid()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:02:43:34 --> Workflow started
    2023-09-12:02:43:34 --> Start process_namelist()
    2023-09-12:02:43:34 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_024334.533613
    2023-09-12:02:43:34 --> Start run_ungrib()
    2023-09-12:02:43:43 --> Start run_geogrid()
    2023-09-12:02:43:54 --> Start run_metgrid()
    2023-09-12:02:43:58 --> Workflow completed...

.. code-block:: bash

    $ tree -L 2 -F /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_024334.533613
    /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_024334.533613
    ├── geogrid_namelist.wps
    ├── geogrid_rundir/
    │   ├── GEOG_DATA -> /dvlscratch/ATM/morton/WRFDistributions/WRFV4.3/GEOG_DATA/
    │   ├── WPS/
    │   └── WRF/
    ├── metgrid_rundir/
    │   ├── GEOG_DATA -> /dvlscratch/ATM/morton/WRFDistributions/WRFV4.3/GEOG_DATA/
    │   ├── WPS/
    │   └── WRF/
    ├── namelist.wps
    ├── namelist.wps_gfs_ctbto
    ├── tmpmetdata-gfs_ctbto/
    │   ├── GD21051415 -> /ops/data/atm/live/ncep/2021/05/14/0.5.fv3/GD21051415
    │   ├── GD21051418 -> /ops/data/atm/live/ncep/2021/05/14/0.5.fv3/GD21051418
    │   └── GD21051421 -> /ops/data/atm/live/ncep/2021/05/14/0.5.fv3/GD21051421
    └── ungrib_rundir_gfs_ctbto/
        ├── GEOG_DATA -> /dvlscratch/ATM/morton/WRFDistributions/WRFV4.3/GEOG_DATA/
        ├── WPS/
        └── WRF/

    13 directories, 6 files

Each of the run directories is self-contained and fully set up for execution of
its specific case component, and the interested user can go into them and rerun
for experimentation and/or troubleshooting.

----

Real component
==============

Both **real** and **wrf** require the provision of a fully functional
**namelist.input** (in most cases, this can be the same file for both), and
**flexwrf** requires the provision of a standard **flexp_input.txt** file.
Future implementations should have these generated automatically.

The following example will continue to use the output produced in previous
stages - in this case, we only need the **met_em\*** files produced by
**metgrid**. Like the other sample wfnamelists and related samples, the
following are found in the repository directory,
**packages/ehratmworkflow/docs/UserPerspective/sample_workflows/**

Note that the **metgrid_path** comes from the output of the previous
metgrid-only case, and that **anne_bypass_namelist_input** must be a full,
absolute path to the real/wrf namelist.
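As a rough idea of what such a file contains, below is a heavily truncated,
illustrative sketch consistent with the times and domains used in these
examples. The actual **namelist.input.gfs_twonest** in the repository carries
many more required settings (physics, dynamics, I/O, and so on), so treat this
only as orientation:

.. code-block:: bash

    &time_control
        run_hours        = 3,
        start_year       = 2021, 2021,
        start_month      = 05,   05,
        start_day        = 14,   14,
        start_hour       = 15,   15,
        interval_seconds = 10800,      ! matches the 3-hourly met_em files
        ! ... many more entries in the real file
    /

    &domains
        max_dom = 2,
        e_we    = 51, 19,              ! consistent with small_domain_twonest.nml
        e_sn    = 42, 16,
        ! ... many more entries in the real file
    /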
**wfnamelist.gfs-realonly**

.. code-block:: bash

    &control
        workflow_list = 'real'
    /

    &time
        start_time = '2021051415'
        end_time = '2021051418'
        wrf_spinup_hours = 0
    /

    &domain_defn
        domain_defn_path = 'small_domain_twonest.nml'
    /

    &real
        metgrid_path = '/dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878/metgrid_rundir/WPS'
        num_nests = 2
        hours_intvl = 3

        ! This needs to be a full, absolute path
        anne_bypass_namelist_input = '/dvlscratch/ATM/morton/ehratm-test1/namelist.input.gfs_twonest'
    /

.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.gfs-realonly
    2023-09-12:18:31:04 --> Workflow started
    2023-09-12:18:31:04 --> Start process_namelist()
    2023-09-12:18:31:04 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836
    2023-09-12:18:31:04 --> Start run_real()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:18:31:04 --> Workflow started
    2023-09-12:18:31:04 --> Start process_namelist()
    2023-09-12:18:31:04 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836
    2023-09-12:18:31:04 --> Start run_real()
    2023-09-12:18:31:05 --> Workflow completed...

As always, the run directory is provided and can be examined, and **real.exe**
may be re-run in there for experimentation and troubleshooting.

.. code-block:: bash

    .
    .
    .
    lrwxrwxrwx. 1 morton consult     111 Sep 12 18:31 met_em.d01.2021-05-14_15:00:00.nc -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878/metgrid_rundir/WPS/met_em.d01.2021-05-14_15:00:00.nc
    lrwxrwxrwx. 1 morton consult     111 Sep 12 18:31 met_em.d01.2021-05-14_18:00:00.nc -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878/metgrid_rundir/WPS/met_em.d01.2021-05-14_18:00:00.nc
    lrwxrwxrwx. 1 morton consult     111 Sep 12 18:31 met_em.d02.2021-05-14_15:00:00.nc -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878/metgrid_rundir/WPS/met_em.d02.2021-05-14_15:00:00.nc
    lrwxrwxrwx. 1 morton consult     111 Sep 12 18:31 met_em.d02.2021-05-14_18:00:00.nc -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_023309.185878/metgrid_rundir/WPS/met_em.d02.2021-05-14_18:00:00.nc
    .
    .
    .
    -rw-r--r--. 1 morton consult 2629060 Sep 12 18:31 wrfbdy_d01
    lrwxrwxrwx. 1 morton consult      64 Sep 12 18:31 wrf.exe -> /dvlscratch/ATM/morton/WRFDistributions/WRFV4.3/WRF/main/wrf.exe
    -rw-r--r--. 1 morton consult 7529596 Sep 12 18:31 wrfinput_d01
    -rw-r--r--. 1 morton consult 1038900 Sep 12 18:31 wrfinput_d02

The expected real output files are present - **wrfbdy_d01**, **wrfinput_d01**
and **wrfinput_d02**. They are NetCDF files, and can therefore be viewed with
the **ncview** utility.

----

Wrf component
=============

In this case (as in most cases), we can use the same **namelist.input** that we
used for the **real** component. We use the output of the previous step,
referenced via **real_path**. The following example continues to use the output
produced in previous stages - in this case, we need the **wrfbdy_d01**,
**wrfinput_d01** and **wrfinput_d02** files produced by **real**.

**wfnamelist.gfs-wrfonly**

.. code-block:: bash

    &control
        workflow_list = 'wrf'
    /

    &time
        start_time = '2021051415'
        end_time = '2021051418'
        wrf_spinup_hours = 0
    /

    &domain_defn
        domain_defn_path = 'small_domain_twonest.nml'
    /

    &wrf
        real_path = '/dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836/real_rundir/WRF'
        num_nests = 2
        hours_intvl = 3
        anne_bypass_namelist_input = '/dvlscratch/ATM/morton/ehratm-test1/namelist.input.gfs_twonest'

        ! This is commented out because the tiny domain is too small for
        ! parallel domain decomposition
        !num_mpi_tasks = 4
    /

It’s often useful to specify parallel execution for **WRF** by assigning a
number of MPI tasks. In this particular example, though, the domain is so tiny
that parallel processing fails, so the setting is commented out in the above
wfnamelist.
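Before running, one can verify that the files produced by **real** are actually
visible at **real_path**; a quick, illustrative check:

.. code-block:: bash

    # Confirm real's outputs, referenced by real_path, are in place
    $ ls -l /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836/real_rundir/WRF/wrfbdy_d01
    $ ls -l /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836/real_rundir/WRF/wrfinput_d0?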
.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.gfs-wrfonly
    2023-09-12:18:52:11 --> Workflow started
    2023-09-12:18:52:11 --> Start process_namelist()
    2023-09-12:18:52:11 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_185211.615991
    2023-09-12:18:52:11 --> Start run_wrf()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:18:52:11 --> Workflow started
    2023-09-12:18:52:11 --> Start process_namelist()
    2023-09-12:18:52:11 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_185211.615991
    2023-09-12:18:52:11 --> Start run_wrf()
    2023-09-12:18:52:59 --> Workflow completed...

.. code-block:: bash

    $ ls -l /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_185211.615991/wrf_rundir/WRF
    .
    .
    .
    lrwxrwxrwx. 1 morton consult      85 Sep 12 18:52 wrfbdy_d01 -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836/real_rundir/WRF/wrfbdy_d01
    lrwxrwxrwx. 1 morton consult      64 Sep 12 18:52 wrf.exe -> /dvlscratch/ATM/morton/WRFDistributions/WRFV4.3/WRF/main/wrf.exe
    lrwxrwxrwx. 1 morton consult      87 Sep 12 18:52 wrfinput_d01 -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836/real_rundir/WRF/wrfinput_d01
    lrwxrwxrwx. 1 morton consult      87 Sep 12 18:52 wrfinput_d02 -> /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_183104.256836/real_rundir/WRF/wrfinput_d02
    -rw-r--r--. 1 morton consult 7200936 Sep 12 18:52 wrfout_d01_2021-05-14_15:00:00
    -rw-r--r--. 1 morton consult 7200936 Sep 12 18:52 wrfout_d01_2021-05-14_16:00:00
    -rw-r--r--. 1 morton consult 7200936 Sep 12 18:52 wrfout_d01_2021-05-14_17:00:00
    -rw-r--r--. 1 morton consult 7200936 Sep 12 18:52 wrfout_d01_2021-05-14_18:00:00
    -rw-r--r--. 1 morton consult 1002160 Sep 12 18:52 wrfout_d02_2021-05-14_15:00:00
    -rw-r--r--. 1 morton consult 1002160 Sep 12 18:52 wrfout_d02_2021-05-14_16:00:00
    -rw-r--r--. 1 morton consult 1002160 Sep 12 18:52 wrfout_d02_2021-05-14_17:00:00
    -rw-r--r--. 1 morton consult 1002160 Sep 12 18:52 wrfout_d02_2021-05-14_18:00:00

The expected inputs from **real** are made available by symlinks to the
previous run’s directory, and the expected wrfout files are present in the new
run directory. These, too, are NetCDF files and can be viewed with the
**ncview** utility. For example

.. code-block:: bash

    $ ncview /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_185211.615991/wrf_rundir/WRF/wrfout_d02_2021-05-14_18:00:00

----

Flexwrf component
=================

In this standalone case we use the output of the previous **wrf** component and
a user-provided **flexp_input.txt** to run **Flexpart WRF**, and then produce
an SRM output file. Users will need to ensure that they specify their own
**wrfout_path** and **bypass_user_input** paths.

**wfnamelist.flexwrf+srm**

.. code-block:: bash

    &control
        workflow_list = 'flexwrf', 'srm'
    /

    &time
        start_time = '2021051415'
        end_time = '2021051418'
        wrf_spinup_hours = 0
    /

    &flexwrf
        wrfout_path = '/dvlscratch/ATM/morton/tmp/ehratmwf_20230912_185211.615991/wrf_rundir/WRF'
        wrfout_num_nests = 2
        wrfout_hours_intvl = 1

        ! This needs to be a full, absolute path
        bypass_user_input = '/dvlscratch/ATM/morton/ehratm-test1/flexp_input.txt'
    /

    &srm
        levels_list = 1
        multiplier = 1.0e-12
    /
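Since both paths must point at existing data, a quick pre-flight check can save
a failed run; for illustration:

.. code-block:: bash

    # Confirm the hourly wrfout files and the user-provided flexp_input.txt exist
    $ ls /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_185211.615991/wrf_rundir/WRF/wrfout_d0?_*
    $ ls /dvlscratch/ATM/morton/ehratm-test1/flexp_input.txt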
.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.flexwrf+srm
    2023-09-12:23:01:31 --> Workflow started
    2023-09-12:23:01:31 --> Start process_namelist()
    2023-09-12:23:01:31 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_230131.638927
    2023-09-12:23:01:31 --> Start run_flexwrf()
    2023-09-12:23:06:31 --> Start run_srm()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:23:01:31 --> Workflow started
    2023-09-12:23:01:31 --> Start process_namelist()
    2023-09-12:23:01:31 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_230131.638927
    2023-09-12:23:01:31 --> Start run_flexwrf()
    2023-09-12:23:06:31 --> Start run_srm()
    2023-09-12:23:06:31 --> Workflow completed...

The expected output is present in the run directory produced by the standalone
component

.. code-block:: bash

    $ ls -l -F /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_230131.638927/flexwrf_rundir/flexwrf_rundir
    .
    .
    .
    -rw-r--r--. 1 morton consult   212 Sep 12 23:02 grid_conc_20210514160000_001
    -rw-r--r--. 1 morton consult   312 Sep 12 23:04 grid_conc_20210514170000_001
    -rw-r--r--. 1 morton consult   312 Sep 12 23:06 grid_conc_20210514180000_001
    -rw-r--r--. 1 morton consult   252 Sep 12 23:02 grid_conc_nest_20210514160000_001
    -rw-r--r--. 1 morton consult   428 Sep 12 23:04 grid_conc_nest_20210514170000_001
    -rw-r--r--. 1 morton consult   552 Sep 12 23:06 grid_conc_nest_20210514180000_001
    -rw-r--r--. 1 morton consult 10422 Sep 12 23:01 header
    -rw-r--r--. 1 morton consult  1794 Sep 12 23:01 header_nest
    lrwxrwxrwx. 1 morton consult    87 Sep 12 23:01 IGBP_int1.dat -> /dvlscratch/ATM/morton/FlexWRFDistributions/FlexWRFv3.3/flexwrf_code/data/IGBP_int1.dat*
    -rw-r--r--. 1 morton consult 10640 Sep 12 23:01 latlon_corner_nest.txt
    -rw-r--r--. 1 morton consult 84035 Sep 12 23:01 latlon_corner.txt
    -rw-r--r--. 1 morton consult 10640 Sep 12 23:01 latlon_nest.txt
    -rw-r--r--. 1 morton consult 84035 Sep 12 23:01 latlon.txt
    .
    .
    .

    $ ls -l -F /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_230131.638927/srm_rundir
    total 16
    -rw-r--r--. 1 morton consult 156 Sep 12 23:06 control_level_01
    -rw-r--r--. 1 morton consult  87 Sep 12 23:06 SITE_A.fp.20210514150000.l1.l0.dry.srm
    -rw-r--r--. 1 morton consult  87 Sep 12 23:06 SITE_A.fp.20210514150000.l1.l0.wet.srm
    -rw-r--r--. 1 morton consult 591 Sep 12 23:06 SITE_A.fp.20210514150000.l1.srm

----

Full workflow
=============

Finally, we illustrate a complete workflow, based on all the components
demonstrated above, integrated so that all of the run directories lie under the
same root directory and the tracking of outputs from one component as inputs to
another is handled cleanly.

**wfnamelist.fullworkflow**

.. code-block:: bash

    &control
        workflow_list = 'ungrib', 'geogrid', 'metgrid', 'real', 'wrf', 'flexwrf', 'srm'
    /

    &time
        start_time = '2021051415'
        end_time = '2021051418'
        wrf_spinup_hours = 0
    /

    &grib_input1
        type = 'gfs_ctbto'
        subtype = '0.5.fv3'
        hours_intvl = 3
        rootdir = '/ops/data/atm/live/ncep'
    /

    &domain_defn
        domain_defn_path = 'small_domain_twonest.nml'
    /

    &metgrid
        hours_intvl = 3
    /

    &real
        num_nests = 2
        hours_intvl = 3

        ! This needs to be a full, absolute path
        anne_bypass_namelist_input = '/dvlscratch/ATM/morton/ehratm-test1/namelist.input.gfs_twonest'
    /

    &wrf
        num_nests = 2
        hours_intvl = 3
        anne_bypass_namelist_input = '/dvlscratch/ATM/morton/ehratm-test1/namelist.input.gfs_twonest'
    /

    &flexwrf
        wrfout_num_nests = 2
        wrfout_hours_intvl = 1

        ! This needs to be a full, absolute path
        bypass_user_input = '/dvlscratch/ATM/morton/ehratm-test1/flexp_input.txt'
    /

    &srm
        levels_list = 1
        multiplier = 1.0e-12
    /
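Since this workflow runs for several minutes end to end (as the timestamps
below show), one might optionally capture the console output in a file and let
it run unattended; an illustrative alternative invocation:

.. code-block:: bash

    # Illustrative: run the full workflow detached, saving the event log to a file
    $ nohup ./epython3 ehratmwf.py --namelist wfnamelist.fullworkflow > fullworkflow.log 2>&1 &
    $ tail -f fullworkflow.log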
.. code-block:: bash

    $ ./epython3 ehratmwf.py --namelist wfnamelist.fullworkflow
    2023-09-12:23:23:12 --> Workflow started
    2023-09-12:23:23:12 --> Start process_namelist()
    2023-09-12:23:23:12 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_232312.048757
    2023-09-12:23:23:12 --> Start run_ungrib()
    2023-09-12:23:23:18 --> Start run_geogrid()
    2023-09-12:23:23:29 --> Start run_metgrid()
    2023-09-12:23:23:33 --> Start run_real()
    2023-09-12:23:23:34 --> Start run_wrf()
    2023-09-12:23:24:23 --> Start run_flexwrf()
    2023-09-12:23:29:12 --> Start run_srm()

    -----------------------
    Workflow Events Summary
    -----------------------
    2023-09-12:23:23:12 --> Workflow started
    2023-09-12:23:23:12 --> Start process_namelist()
    2023-09-12:23:23:12 --> Started run_workflow(): /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_232312.048757
    2023-09-12:23:23:12 --> Start run_ungrib()
    2023-09-12:23:23:18 --> Start run_geogrid()
    2023-09-12:23:23:29 --> Start run_metgrid()
    2023-09-12:23:23:33 --> Start run_real()
    2023-09-12:23:23:34 --> Start run_wrf()
    2023-09-12:23:24:23 --> Start run_flexwrf()
    2023-09-12:23:29:12 --> Start run_srm()
    2023-09-12:23:29:12 --> Workflow completed...

After running, we find all of the run directories, each fully set up, ready for
re-execution, experimentation or troubleshooting

.. code-block:: bash

    $ ls -l -F /dvlscratch/ATM/morton/tmp/ehratmwf_20230912_232312.048757
    total 52
    drwxr-xr-x. 3 morton consult 4096 Sep 12 23:24 flexwrf_rundir/
    -rw-r--r--. 1 morton consult  455 Sep 12 23:23 geogrid_namelist.wps
    drwxr-xr-x. 4 morton consult 4096 Sep 12 23:23 geogrid_rundir/
    drwxr-xr-x. 4 morton consult 4096 Sep 12 23:23 metgrid_rundir/
    -rw-r--r--. 1 morton consult  229 Sep 12 23:23 namelist.wps
    -rw-r--r--. 1 morton consult  204 Sep 12 23:23 namelist.wps_gfs_ctbto
    drwxr-xr-x. 4 morton consult 4096 Sep 12 23:23 real_rundir/
    drwxr-xr-x. 2 morton consult 4096 Sep 12 23:29 srm_rundir/
    drwxr-xr-x. 2 morton consult 4096 Sep 12 23:23 tmpmetdata-gfs_ctbto/
    drwxr-xr-x. 4 morton consult 4096 Sep 12 23:23 ungrib_rundir_gfs_ctbto/
    drwxr-xr-x. 2 morton consult 4096 Sep 12 23:24 wrfout_d01/
    drwxr-xr-x. 2 morton consult 4096 Sep 12 23:24 wrfout_d02/
    drwxr-xr-x. 4 morton consult 4096 Sep 12 23:23 wrf_rundir/

.. toctree::