=================================
Demo of Single-Component Usage
=================================

Introduction
=============

One of the primary design goals of **nwpservice** has been to support the
paradigm of loosely-coupled distributed components, whereby each of the
WPS/WRF and FlexpartWRF functional components can be used, tested, debugged
and enhanced in standalone mode, independent of the other components. Each
component is initialised with a set of arguments and produces a Python
dictionary as output, giving other components access to its output files.

In this section we look at a set of single-component demo drivers that I
built for familiarity and testing purposes. They are located in the
**tests/demos/** directory of the **nwpservice** package:

.. code-block:: bash

    $ tree -F -L 1 demos
    demos
    ├── flexwrf_test_driver.py*
    ├── full_workflow_driver_test_anne_eu.py*
    ├── full_wpswrf_workflow_driver.py*
    ├── geogrid_test_driver.py*
    ├── metgrid_test_driver.py*
    ├── real_test_driver.py*
    ├── ungrib_test_driver.py*
    └── wrf_test_driver.py*

They all make use of test data found in **tests/testcase_data/**.

All of the drivers feature a similar block of code at the beginning, allowing
users to hard-code paths to input and output files and the desired testcase.
For example, from ``metgrid_test_driver.py``:

.. code-block:: python

    . . .

    ## Uncomment exactly one of the following
    SYSTEM_NAME = 'EHRATM_VM'
    #SYSTEM_NAME = 'CTBTO_DEVLAN'

    #### These values are generally system-specific
    if SYSTEM_NAME == 'EHRATM_VM':
        USER_ROOT = '/home/ctbtuser'
        GIT_REPO_DIR = os.path.join(USER_ROOT, 'git/high-res-atm')  # Local git repo
        TMP_ROOT_DIR = os.path.join(USER_ROOT, 'tmp')      # Dir for temp files
        OUTPUT_ROOT_DIR = os.path.join(USER_ROOT, 'tmp')   # Dir for output products
        WPSWRF_DISTRO = '/scratch/WRFV4.3-Distribution'    # Dir of WPS/WRF distro
        MPIRUN = '/usr/lib64/openmpi/bin/mpirun'           # mpirun executable
    elif SYSTEM_NAME == 'CTBTO_DEVLAN':
        USER_ROOT = '/dvlscratch/ATM/morton'
        GIT_REPO_DIR = os.path.join(USER_ROOT, 'git/high-res-atm')  # Local git repo
        TMP_ROOT_DIR = os.path.join(USER_ROOT, 'tmp')      # Dir for temp files
        OUTPUT_ROOT_DIR = os.path.join(USER_ROOT, 'tmp')   # Dir for output products
        WPSWRF_DISTRO = '/scratch/WRFV4.3-Distribution'    # Dir of WPS/WRF distro
        MPIRUN = '/usr/lib64/openmpi/bin/mpirun'           # mpirun executable
    else:
        print('SYSTEM_NAME not supported: %s' % SYSTEM_NAME)
        sys.exit()

    #### ---------------------------------------------------------
    #### These values should not vary across systems
    #### ---------------------------------------------------------

    #### Possible testcases - Haven't recently tested all of these in each driver
    ####
    #### TESTCASE = 'gfs_spain_simple_twonest'
    #### TESTCASE = 'nam_nodak_simple_wps_wrf'
    ####
    ###############################################################
    TESTCASE = 'gfs_spain_simple_twonest'    # Testcase to use

    # Testcase files, dependent on TESTCASE value
    TESTCASE_DATA_DIR = os.path.join(GIT_REPO_DIR,
                                     'packages/nwpservice/tests/testcase_data',
                                     TESTCASE)
    TESTCASE_UNGRIB_DATA_DIR = os.path.join(TESTCASE_DATA_DIR, 'WPS/output')
    TESTCASE_GEOGRID_DATA_DIR = os.path.join(TESTCASE_DATA_DIR, 'WPS/output')

    NUM_MPI_TASKS_WPS = 2

    # If set to True, the run directories will be retained
    NO_CLEANUP = True

    . . .
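To make the pattern concrete, the sketch below shows roughly how a driver such
as ``metgrid_test_driver.py`` ties these hard-coded values together: construct
the component from the settings above, execute it, and then stage its products
to obtain the output-manifest dictionary. This is illustrative only - the
import path, class name and constructor arguments shown here
(``nwpservice.metgrid``, ``Metgrid``, ``wpswrf_distro``, etc.) are placeholders
and may not match the actual API, which is defined in the package sources; only
``__init__`` and ``stage_output`` appear in the log output further below.

.. code-block:: python

    import os

    # Hypothetical sketch only -- the real class and argument names are defined
    # in nwpservice's metgrid module and may differ from those shown here.
    from nwpservice.metgrid import Metgrid          # assumed import path

    # Unique run directory under TMP_ROOT_DIR, as the demo drivers create
    rundir = os.path.join(TMP_ROOT_DIR, 'metgrid_test_demo')

    # Initialise the component from the hard-coded settings above
    metgrid = Metgrid(
        wpswrf_distro=WPSWRF_DISTRO,                 # WPS/WRF distro to link against
        geogrid_data_dir=TESTCASE_GEOGRID_DATA_DIR,  # geogrid output from the testcase
        ungrib_data_dir=TESTCASE_UNGRIB_DATA_DIR,    # ungrib output from the testcase
        mpirun=MPIRUN,
        num_mpi_tasks=NUM_MPI_TASKS_WPS,
        rundir=rundir,
    )

    metgrid.run()                                    # execute metgrid.exe in the run dir

    # Stage the products and obtain the output manifest (a plain dict) that a
    # downstream component would use to locate the met_em files
    output_manifest = metgrid.stage_output(OUTPUT_ROOT_DIR)
    print('output_manifest:', output_manifest)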
Although these drivers have all been tested in various ways, on both a
CTBTO-configured VM and on devlan, the specific files shown here have been set
up for execution on my own CTBTO-configured VM (``EHRATM_VM``) and against
single testcases. In other words, they have not recently been configured for
use on *devlan* (although they have been used there before), nor have they
recently been tested against all of the available testcases. Still, the demo
files in this directory do work as-is, and it should be straightforward for
the user to run them on devlan.

Before running any of these, it is important to ensure a compatible Python
environment and a correct ``PYTHONPATH``, for example:

.. code-block:: bash

    $ conda activate ehratmv0.02
    $ export PYTHONPATH=/home/ctbtuser/git/high-res-atm/packages/nwpservice/src

(Note that a suitable Python environment is defined in the *high-res-atm*
repository in **misc/conda/ehratmv0.02.yml**.)

Then, it should be a simple matter of running the script, for example using
the hard-coded paths for ``EHRATM_VM`` in the above excerpt:

.. code-block:: bash

    $ ~/git/high-res-atm/packages/nwpservice/tests/demos/metgrid_test_driver.py
    setup...
    DEBUG [metgrid:metgrid.py:__init__:80] --> started
    . . .
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    ! Successful completion of metgrid.   !
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    output_manifest:
    {'rundir': '/home/ctbtuser/tmp/metgrid_test_20230813.223705/WPS',
     'metgrid': {'files': {'met_em.d01.2017-10-23_03:00:00.nc': {'bytes': 2904236},
                           'met_em.d01.2017-10-23_06:00:00.nc': {'bytes': 2904236},
                           'met_em.d02.2017-10-23_03:00:00.nc': {'bytes': 1632560},
                           'met_em.d02.2017-10-23_06:00:00.nc': {'bytes': 1632560}}}}
    DEBUG [metgrid:metgrid.py:stage_output:450] --> dest_dir: /home/ctbtuser/tmp/metgrid_output_20230813.223705
    DEBUG [metgrid:metgrid.py:stage_output:467] --> Copying met_em files: ['met_em.d01.2017-10-23_03:00:00.nc', 'met_em.d01.2017-10-23_06:00:00.nc', 'met_em.d02.2017-10-23_03:00:00.nc', 'met_em.d02.2017-10-23_06:00:00.nc']
    DEBUG [metgrid:metgrid.py:stage_output:477] --> Copying aux files: ['metgrid.log.0000', 'metgrid.log.0001', 'namelist.wps']
    Output stage success: True
    Keeping the test run dir: /home/ctbtuser/tmp/metgrid_test_20230813.223705

Upon successful completion, one could then verify the expected output in
``/home/ctbtuser/tmp/metgrid_output_20230813.223705`` (as indicated in the
above excerpt):

.. code-block:: bash

    $ ls -l /home/ctbtuser/tmp/metgrid_output_20230813.223705
    total 9116
    -rw-rw-r--. 1 ctbtuser ctbtuser 2904236 Aug 13 22:37 met_em.d01.2017-10-23_03:00:00.nc
    -rw-rw-r--. 1 ctbtuser ctbtuser 2904236 Aug 13 22:37 met_em.d01.2017-10-23_06:00:00.nc
    -rw-rw-r--. 1 ctbtuser ctbtuser 1632560 Aug 13 22:37 met_em.d02.2017-10-23_03:00:00.nc
    -rw-rw-r--. 1 ctbtuser ctbtuser 1632560 Aug 13 22:37 met_em.d02.2017-10-23_06:00:00.nc
    -rw-rw-r--. 1 ctbtuser ctbtuser  119746 Aug 13 22:37 metgrid.log.0000
    -rw-rw-r--. 1 ctbtuser ctbtuser  119746 Aug 13 22:37 metgrid.log.0001
    -rw-rw-r--. 1 ctbtuser ctbtuser     214 Aug 13 22:37 namelist.wps

In a workflow, this directory name would be passed to a downstream component
that needs this data as input (a small sketch of such a consumer follows the
listing below). And, because the ``NO_CLEANUP`` variable was set to ``True``,
the run directory is also retained, so a user can go in, examine it, and even
manually execute the ``metgrid`` component there for experimentation or
troubleshooting:

.. code-block:: bash

    $ ls -l /home/ctbtuser/tmp/metgrid_test_20230813.223705/WPS
    total 9148
    lrwxrwxrwx. 1 ctbtuser ctbtuser     126 Aug 13 22:37 FILE:2017-10-23_03 -> /home/ctbtuser/git/high-res-atm/packages/nwpservice/tests/testcase_data/gfs_spain_simple_twonest/WPS/output/FILE:2017-10-23_03
    . . .
    -rw-rw-r--. 1 ctbtuser ctbtuser 2904236 Aug 13 22:37 met_em.d01.2017-10-23_03:00:00.nc
    -rw-rw-r--. 1 ctbtuser ctbtuser 2904236 Aug 13 22:37 met_em.d01.2017-10-23_06:00:00.nc
    -rw-rw-r--. 1 ctbtuser ctbtuser 1632560 Aug 13 22:37 met_em.d02.2017-10-23_03:00:00.nc
    -rw-rw-r--. 1 ctbtuser ctbtuser 1632560 Aug 13 22:37 met_em.d02.2017-10-23_06:00:00.nc
    drwxr-xr-x. 2 ctbtuser ctbtuser     214 Aug 13 22:37 metgrid
    lrwxrwxrwx. 1 ctbtuser ctbtuser      57 Aug 13 22:37 metgrid.exe -> /scratch/WRFV4.3-Distribution/WPS/metgrid/src/metgrid.exe
    -rw-rw-r--. 1 ctbtuser ctbtuser  119746 Aug 13 22:37 metgrid.log.0000
    -rw-rw-r--. 1 ctbtuser ctbtuser  119746 Aug 13 22:37 metgrid.log.0001
    -rw-rw-r--. 1 ctbtuser ctbtuser     214 Aug 13 22:37 namelist.wps
    . . .
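Because the ``output_manifest`` dictionary printed by the driver records each
staged file and its size, a downstream consumer (or a quick sanity check)
needs nothing more than that dictionary and the staged output directory. The
minimal sketch below is hypothetical, but the directory path and manifest
layout are taken directly from the run output above; it simply verifies that
every ``met_em`` file listed in the manifest was staged with the expected
size.

.. code-block:: python

    import os

    # Staged output directory reported by the driver (see the run output above)
    staged_dir = '/home/ctbtuser/tmp/metgrid_output_20230813.223705'

    # The manifest printed by the driver -- layout taken from the excerpt above
    output_manifest = {
        'rundir': '/home/ctbtuser/tmp/metgrid_test_20230813.223705/WPS',
        'metgrid': {'files': {
            'met_em.d01.2017-10-23_03:00:00.nc': {'bytes': 2904236},
            'met_em.d01.2017-10-23_06:00:00.nc': {'bytes': 2904236},
            'met_em.d02.2017-10-23_03:00:00.nc': {'bytes': 1632560},
            'met_em.d02.2017-10-23_06:00:00.nc': {'bytes': 1632560},
        }},
    }

    # Check that every met_em file listed in the manifest exists in the staged
    # directory with the recorded size -- the kind of verification a downstream
    # component might perform before consuming the files.
    for fname, info in output_manifest['metgrid']['files'].items():
        path = os.path.join(staged_dir, fname)
        ok = os.path.isfile(path) and os.path.getsize(path) == info['bytes']
        print('%-40s %s' % (fname, 'OK' if ok else 'MISSING/SIZE MISMATCH'))

A real workflow component would perform a similar lookup against the manifest
before building its own run directory from these files.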
Some observations to note from the above:

* The ``metgrid`` component has used the input arguments to create a
  functional virtual WPS/WRF run directory, creating appropriate links to
  input files in other directories and to the executables located in the
  specified WPS/WRF distribution.
* This is a fully functional run directory; a user could go in and re-run
  ``metgrid.exe``, modify the namelist, etc.

Summary
=========

Each of the other individual components can be executed in a similar way. The
user would need to:

* Ensure that the hard-coded paths at the top of the demo driver are correct
  for their environment
* Ensure that a suitable Python environment has been set up
* Ensure that **nwpservice/src** has been included in the **PYTHONPATH**

There are several testcases available in **tests/testcase_data/**:

.. code-block:: bash

    $ tree -F -L 2 testcase_data
    testcase_data
    ├── ecmwf_anne_2014_eu/
    │   ├── ecmwf_anne_2014_eu.png
    │   ├── FLEXWRF/
    │   ├── metdata/
    │   ├── WPS/
    │   └── WRF/
    ├── ecmwf_twonest_fdda/
    │   ├── FlexWRF/
    │   ├── WPS/
    │   └── WRF/
    ├── flexwrf_sicily/
    │   ├── AVAILABLE1
    │   ├── flxp_input_binary_outgrid.txt
    │   ├── flxp_input_binary_outgrid_withnest.txt
    │   ├── flxp_input.txt
    │   ├── output/
    │   ├── output_binary_single/
    │   ├── output_binary_withnest/
    │   └── wrfout_metdata/
    ├── gfs_spain_onenest_fdda/
    │   ├── WPS/
    │   └── WRF/
    ├── gfs_spain_simple_twonest/
    │   ├── flexwrf/
    │   ├── metdata/
    │   ├── README.md
    │   ├── WPS/
    │   └── WRF/
    └── nam_nodak_simple_wps_wrf/
        ├── FLEXWRF/
        ├── metdata/
        ├── README.md
        ├── WPS/
        └── WRF/

    27 directories, 7 files

and the ``TESTCASE`` variable in each of the demo drivers can be set
accordingly. Unfortunately, not all testcases are set up to work with all
components - they were installed for different purposes - and this should be
documented better in the future. However, the ``gfs_spain_simple_twonest/``
and ``nam_nodak_simple_wps_wrf/`` testcases probably have the best chance of
success, and the values that have been hard-coded into the current demo
drivers have been verified to work as-is.

.. toctree::