Started by an SCM change
Obtained jenkins_tests/cam_gpu_test/Jenkinsfile from git https://github.com/larson-group/cam.git
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /home/jenkins/workspace/cam_gpu_test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
The recommended git tool is: git
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/larson-group/cam.git
 > git init /home/jenkins/workspace/cam_gpu_test # timeout=10
Fetching upstream changes from https://github.com/larson-group/cam.git
 > git --version # timeout=10
 > git --version # 'git version 2.34.1'
 > git fetch --tags --force --progress -- https://github.com/larson-group/cam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/larson-group/cam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
 > git rev-parse refs/remotes/origin/clubb_silhs_devel^{commit} # timeout=10
Checking out Revision a64171b81dfe0bfd91289455c71c25230c5ecc6e (refs/remotes/origin/clubb_silhs_devel)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a64171b81dfe0bfd91289455c71c25230c5ecc6e # timeout=10
Commit message: "Autoupdated CLUBB_core
Commit e430c6b5a72a8a88fa602e13b4cb946d3361895e
Author: domkesteffen
Date: Thu Jan 16 16:06:31 2025 -0600
    Added new hole-filling method which takes TKE from up2 and vp2 (#1217)
    Added new hole-filling method which takes TKE from up2 and vp2
    CLUBB ticket #1165"
 > git rev-list --no-walk b3c6f29d0d1a50b165ee1e44a515e6d23991285c # timeout=10
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Checkout Externals and Copy Custom Files)
[Pipeline] sh
+ python run_scripts/set_up_repo.py .
Checking out externals...
Copying ccs_config files
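The "Checkout Externals and Copy Custom Files" stage is driven entirely by run_scripts/set_up_repo.py, which, per the two lines above, checks out the model's external components and copies the custom ccs_config files. A minimal sketch of reproducing this stage outside Jenkins, assuming nothing beyond what the log shows (a fresh clone of the clubb_silhs_devel branch and the same script):

    $ git clone https://github.com/larson-group/cam.git && cd cam
    $ git checkout clubb_silhs_devel
    $ python run_scripts/set_up_repo.py .    # checks out externals, copies ccs_config files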
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Remove Old Output)
[Pipeline] sh
+ rm -rf /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/ /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Running ne3 with OpenACC)
[Pipeline] sh
+ source /etc/profile.d/larson-group.sh
++ export GIT_EDITOR=vi
++ GIT_EDITOR=vi
++ export SVN_EDITOR=vi
++ SVN_EDITOR=vi
++ export OMP_STACKSIZE=1048579
++ OMP_STACKSIZE=1048579
++ export LMOD_ROOT=/opt/lmod/
++ LMOD_ROOT=/opt/lmod/
++ source /opt/lmod//lmod/lmod/init/bash
+++ '[' -z '' ']'
+++ case "$-" in
+++ __lmod_vx=x
+++ '[' -n x ']'
+++ set +x
Shell debugging temporarily silenced: export LMOD_SH_DBG_ON=1 for this output (/usr/local/spack/opt/spack/linux-pop22-cascadelake/gcc-12.2.0/lmod-8.7.37-bi3kyxcdrfgw3y7vv2k7c5rjxg75qzju/lmod/lmod/init/bash)
Shell debugging restarted
+++ unset __lmod_vx
+++ find /usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core -print -quit
++ export MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
++ MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
+ ulimit -s 8388608
+ run_scripts/run_cesm_uwm_coarse_res_gpu_no_silhs.sh
----- Case Setup -----
Compset longname is 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV
Compset specification file is /home/jenkins/workspace/cam_gpu_test/cime_config/config_compsets.xml
Automatically adding SESP to compset
Compset forcing is 1972-2004
ATM component is CAM cam7 physics:
LND component is clm6.0:Satellite phenology:
ICE component is Sea ICE (cice) model version 6 :prescribed cice
OCN component is DOCN prescribed ocean mode
ROF component is MOSART: MOdel for Scale Adaptive River Transport
GLC component is Stub glacier (land ice) component
WAV component is Stub wave component
ESP component is Stub external system processing (ESP) component
Pes specification file is /home/jenkins/workspace/cam_gpu_test/cime_config/config_pes.xml
Machine is larson-group
Pes setting: grid match is a%ne3np4
Pes setting: grid is a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7
Pes setting: compset is 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV_SESP
Pes setting: tasks is {'NTASKS_ATM': 24, 'NTASKS_LND': 24, 'NTASKS_ROF': 24, 'NTASKS_ICE': 24, 'NTASKS_OCN': 24, 'NTASKS_GLC': 24, 'NTASKS_WAV': 24, 'NTASKS_CPL': 24}
Pes setting: threads is {'NTHRDS_ATM': 1, 'NTHRDS_LND': 1, 'NTHRDS_ROF': 1, 'NTHRDS_ICE': 1, 'NTHRDS_OCN': 1, 'NTHRDS_GLC': 1, 'NTHRDS_WAV': 1, 'NTHRDS_CPL': 1}
Pes setting: rootpe is {'ROOTPE_ATM': 0, 'ROOTPE_LND': 0, 'ROOTPE_ROF': 0, 'ROOTPE_ICE': 0, 'ROOTPE_OCN': 0, 'ROOTPE_GLC': 0, 'ROOTPE_WAV': 0, 'ROOTPE_CPL': 0}
Pes setting: pstrid is {}
Pes other settings: {}
Pes comments: none
setting additional fields from config_pes: {}
Compset is: 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV_SESP
Grid is: a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7
Components in compset are: ['cam', 'clm', 'cice', 'docn', 'mosart', 'sglc', 'swav', 'sesp']
No project info available
No charge_account info available, using value from PROJECT
cesm model version found: cam_4ncar_20240605_2f3c896-727-ga64171b8
Batch_system_type is none
Creating Case directory /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
ERROR: Unknown Job Queue specified use --force to set
Creating batch scripts
Writing case.run script from input template /home/jenkins/workspace/cam_gpu_test/ccs_config/machines/template.case.run
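The "Case Setup" block above is standard CIME output from creating a new case. The exact invocation is inside run_scripts/run_cesm_uwm_coarse_res_gpu_no_silhs.sh and is not echoed in this log, but judging from the compset, grid, machine, and case directory reported above, it is roughly equivalent to:

    $ cime/scripts/create_newcase \
          --case /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc \
          --compset 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV \
          --res a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7 \
          --machine larson-group --run-unsupported

(--run-unsupported is a guess, and the script presumably also selects the nvhpc compiler, given the case name; neither appears in the log.) The "ERROR: Unknown Job Queue specified use --force to set" line is emitted during case creation but is evidently not fatal here: Batch_system_type is none on this machine and the run proceeds straight into the compile phase below.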
Creating file .case.run
Writing case.st_archive script from input template /home/jenkins/workspace/cam_gpu_test/ccs_config/machines/template.st_archive
Creating file case.st_archive
Creating user_nl_xxx files for components and cpl
Running cam.case_setup.py
If an old case build already exists, might want to run 'case.build --clean' before building
You can now run './preview_run' to get more info on how your case will be run
----- Compile Configuration -----
----- Compile -----
Building case in directory /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
sharedlib_only is False
model_only is False
Generating component namelists as part of build
2025-01-17 03:18:01 atm
Create namelist for component cam
Calling /home/jenkins/workspace/cam_gpu_test/cime_config/buildnml
...calling cam buildcpp to set build time options
CAM namelist copy: file1 /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/Buildconf/camconf/atm_in file2 /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/run/atm_in
2025-01-17 03:18:02 lnd
Create namelist for component clm
Calling /home/jenkins/workspace/cam_gpu_test/components/clm//cime_config/buildnml
2025-01-17 03:18:02 ice
Create namelist for component cice
Calling /home/jenkins/workspace/cam_gpu_test/components/cice//cime_config/buildnml
RUN: /home/jenkins/workspace/cam_gpu_test/components/cice/bld/generate_cice_decomp.pl -ccsmroot /home/jenkins/workspace/cam_gpu_test -res ne3np4.pg3 -nx 486 -ny 1 -nproc 1 -thrds 1 -output all
FROM: /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
output: 486 1 486 1 1 roundrobin square-ice
2025-01-17 03:18:02 ocn
Create namelist for component docn
Calling /home/jenkins/workspace/cam_gpu_test/components/cdeps/docn/cime_config/buildnml
docn_mode is prescribed
2025-01-17 03:18:02 rof
Create namelist for component mosart
Calling /home/jenkins/workspace/cam_gpu_test/components/mosart//cime_config/buildnml
2025-01-17 03:18:02 glc
Create namelist for component sglc
Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sglc/cime_config/buildnml
2025-01-17 03:18:02 wav
Create namelist for component swav
Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/swav/cime_config/buildnml
2025-01-17 03:18:02 esp
Create namelist for component sesp
Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sesp/cime_config/buildnml
2025-01-17 03:18:02 cpl
Create namelist for component drv
Calling /home/jenkins/workspace/cam_gpu_test/components/cmeps/cime_config/buildnml
Writing nuopc_runconfig for components ['CPL', 'ATM', 'LND', 'ICE', 'OCN', 'ROF']
Building gptl with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/gptl.bldlog.250117-031801
Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/build_scripts/buildlib.gptl
Building pio with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/pio.bldlog.250117-031801
Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/build_scripts/buildlib.pio
Building csm_share with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/csm_share.bldlog.250117-031801
Calling /home/jenkins/workspace/cam_gpu_test/share/buildlib.csm_share
Building CDEPS with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/CDEPS.bldlog.250117-031801
Calling /home/jenkins/workspace/cam_gpu_test/components/cdeps/cime_config/buildlib
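The namelist generation and shared-library builds above are all driven from the case directory, and each "Building ... with output to file ..." line names the bldlog that captures the full compiler output for that library. If the case already exists, the same steps can be rerun by hand with the standard CIME case scripts (paths as created above):

    $ cd /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
    $ ./preview_namelists    # regenerate the component namelists (atm_in, etc.)
    $ ./case.build           # rebuild gptl, pio, csm_share, CDEPS and the component libraries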
- Building clm library
Building lnd with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/lnd.bldlog.250117-031801
clm built in 1172.945179 seconds
- Building atm Library
Building atm with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.250117-031801
- Building ice Library
Building ice with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ice.bldlog.250117-031801
- Building ocn Library
Building ocn with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ocn.bldlog.250117-031801
- Building rof Library
Building rof with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/rof.bldlog.250117-031801
docn built in 0.715084 seconds
mosart built in 66.831562 seconds
cice built in 187.704345 seconds
cam built in 487.137470 seconds
ERROR: BUILD FAIL: cam.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.250117-031801
Error building CAM
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] sh
+ chmod -R 755 /home/jenkins/cam_output/
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.*.gz'
gzip: /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.*.gz: No such file or directory
[Pipeline] echo
Not all logs were compressed, this indicates a failure.
[Pipeline] }
[Pipeline] // script
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
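The failure is in the atmosphere build: the actual compiler messages are in the atm.bldlog file named on the "ERROR: BUILD FAIL: cam.buildlib failed" line. The pipeline's post action expects that log to have been gzipped after a successful build; because the build failed, no atm.bldlog.*.gz exists, gzip -d reports "No such file or directory", and the pipeline echoes that the missing archive indicates a failure. To locate the compile error, something along these lines should work on the Jenkins node (the grep patterns are only a guess at typical nvhpc/linker failure text):

    $ less /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.250117-031801
    $ grep -niE 'error|nvfortran|undefined reference' \
          /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.250117-031801 | head -n 40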