Error while running forecast wrapper

Submitted by vtangutu on Thu, 09/23/2021 - 05:20
Forum: Users | Forecast

Hi,
I have downloaded the Hurricane IRMA dataset from the website (https://dtcenter.org/community-code/hurricane-wrf-hwrf/datasets#data-4) to test my HWRF installation, and my DATA folder contains the following directories:
abdecks, fix, gfs.2017090512, loop, recon.2017090512, SYNDAT-PLUS, TEMPDROP

I configured the launcher_wrapper with "config.run_gsi=no" "config.run_ensemble_da=no" "config.run_relocation=no", and I was able to successfully launch and finish the following wrappers:
launcher_wrapper
init_gfs_wrapper
init_ocean_wrapper
init_bdy_wrapper
unpost_wrapper

When launching the forecast wrapper, I am facing issues. My system.conf has:
[wrfexe]
nio_groups=1          ;; Number of WRF I/O server groups per domain in init jobs
nio_tasks_per_group=0,0,0 ;; Number of I/O servers per group in init jobs
poll_servers=yes      ;; Turn on server polling in init jobs if quilt servers are used (They are not.)
nproc_x=-1            ;; Init job WRF processor count in X direction (-1 = automatic)
nproc_y=-1            ;; Init job WRF processor count in Y direction (-1 = automatic)

[runwrf]
nio_groups=1          ;; Number of WRF I/O server groups per domain
nio_tasks_per_group=4,4,4 ;; Number of I/O servers per group
poll_servers=no      ;; Turn on server polling if quilt servers are used (They are not.)
nproc_x=18           ;; WRF processor count in X direction (-1 = automatic)
nproc_y=36            ;; WRF processor count in Y direction (-1 = automatic)
wrf_compute_ranks=648  ;;

TOTAL_TASKS=673

With the above configuration, I get this error:

An error occurred in MPI_Comm_split
reported by process [8813528854213558273,270]
*** on communicator MPI COMMUNICATOR 4 DUP FROM 3
*** MPI_ERR_ARG: invalid argument of some other kind
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
The command that is failing is mpiexec['-np','4','/hwrfrun/sorc/ncep-coupler/cpl_exec/hwrf_wm3c.exe',':','-np','9','/hwrfrun/sorc/pomtc/ocean_exec/hwrf_ocean_fcst.exe',':','-np','648','/hwrfrun/sorc/WRF/run/wrf.exe',':','-np','12','/hwrfrun/sorc/WRF/run/wrf.exe'].out('/hwrfrun/output/pytmp/hwrfrun/2017090512/11L/cpl.out',append=False)
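For what it's worth, the rank counts in the failing mpiexec command appear internally consistent with the [runwrf] settings above. A quick sketch of the arithmetic (the roles of the four executables — 4 coupler ranks, 9 ocean ranks, 648 WRF compute ranks, and 12 WRF I/O quilt-server ranks — are read off the command line itself):

```python
# Cross-check of the rank arithmetic implied by the [runwrf] section of
# system.conf and by the failing mpiexec command quoted above.
nproc_x, nproc_y = 18, 36                # from nproc_x / nproc_y
nio_groups = 1                           # from nio_groups
nio_tasks_per_group = [4, 4, 4]          # from nio_tasks_per_group (per domain)

wrf_compute = nproc_x * nproc_y          # should equal wrf_compute_ranks=648
wrf_io = nio_groups * sum(nio_tasks_per_group)   # quilt-server ranks

coupler_ranks = 4                        # hwrf_wm3c.exe in the mpiexec command
ocean_ranks = 9                          # hwrf_ocean_fcst.exe

total = coupler_ranks + ocean_ranks + wrf_compute + wrf_io
print(wrf_compute, wrf_io, total)        # 648 12 673, matching TOTAL_TASKS=673
```

So the decomposition 4 + 9 + 648 + 12 = 673 matches TOTAL_TASKS, which suggests the MPI_Comm_split failure is not a simple rank-count mismatch.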

Hi,

The errors don't tell us much. Can you try setting poll_servers=true in the [runwrf] section? Re-run the launcher and try the forecast job again. Please let me know if the issue persists. Please also check the log files within the runwrf directory to see if they provide any useful error information.
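For clarity, the suggested change goes in the [runwrf] section of system.conf (the comment is ours; since nio_tasks_per_group is nonzero here, quilt servers are in use, so the stock "(They are not.)" note no longer applies):

[runwrf]
poll_servers=true     ;; Turn on server polling; quilt servers are in use here

Re-running the launcher_wrapper afterwards is needed so the run's configuration is regenerated from the edited file.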

Thanks

Biswas


In reply to by biswas

Hi,
Thank you for the response.

I tried modifying poll_servers=true, then re-ran the launcher_wrapper and the forecast_wrapper.
I still face the same error: "An error occurred in MPI_Comm_split".

Attaching the log files rsl.error.0000, cpl.out, and the output of the forecast_wrapper (forecast).

Can you please help me get past this error and proceed further?

 

Thanks 

Hemanth