Started by an SCM change
Obtained jenkins_tests/cam_scam_gfortran_debug_test/Jenkinsfile from git https://github.com/larson-group/cam.git
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /home/jenkins/workspace/cam_scam_gfortran_debug_test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
The recommended git tool is: git
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/larson-group/cam.git
 > git init /home/jenkins/workspace/cam_scam_gfortran_debug_test # timeout=10
Fetching upstream changes from https://github.com/larson-group/cam.git
 > git --version # timeout=10
 > git --version # 'git version 2.34.1'
 > git fetch --tags --force --progress -- https://github.com/larson-group/cam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/larson-group/cam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
 > git rev-parse refs/remotes/origin/clubb_silhs_devel^{commit} # timeout=10
Checking out Revision 97aa2c69603ad8dd327196c2190dae80292aa917 (refs/remotes/origin/clubb_silhs_devel)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 97aa2c69603ad8dd327196c2190dae80292aa917 # timeout=10
Commit message: "Autoupdated CLUBB_core
Commit 969bc4aa2a1db2664c0f92b98d2df5544c554c32
Author: Brian Griffin
Date: Sat Mar 22 23:39:40 2025 -0500
Clubb test correct gg (#1237)
* Updated the generalized vertical grid test to use the run_bindiff_w_flags_config_core_flags.json file, which performs 18 overall flag configuration file tests and includes almost every configurable model flag found in CLUBB core."
 > git rev-list --no-walk 6f0563a327d1cfefda2306f4c6d51ff65557810f # timeout=10
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Checkout Externals and Copy Custom Files)
[Pipeline] sh
+ python run_scripts/set_up_repo.py .
Checking out externals...
Copying ccs_config files
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Remove Old Output)
[Pipeline] sh
+ rm -rf /home/jenkins/cam_output/scratch/test_scam_ATEX /home/jenkins/cam_output/scratch/test_scam_ARM97
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Run atex)
[Pipeline] sh
+ source /etc/profile.d/larson-group.sh
++ export GIT_EDITOR=vi
++ GIT_EDITOR=vi
++ export SVN_EDITOR=vi
++ SVN_EDITOR=vi
++ export OMP_STACKSIZE=1048579
++ OMP_STACKSIZE=1048579
++ export LMOD_ROOT=/opt/lmod/
++ LMOD_ROOT=/opt/lmod/
++ source /opt/lmod//lmod/lmod/init/bash
+++ '[' -z '' ']'
+++ case "$-" in
+++ __lmod_vx=x
+++ '[' -n x ']'
+++ set +x
Shell debugging temporarily silenced: export LMOD_SH_DBG_ON=1 for this output (/usr/local/spack/opt/spack/linux-pop22-cascadelake/gcc-12.2.0/lmod-8.7.37-bi3kyxcdrfgw3y7vv2k7c5rjxg75qzju/lmod/lmod/init/bash)
Shell debugging restarted
+++ unset __lmod_vx
+++ find /usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core -print -quit
++ export MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
++ MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
+ run_scripts/run_scam.bash ATEX
----- Case Setup -----
Compset longname is 2000_CAM60%SCAMATEX_CLM50%SP_CICE%PRES_DOCN%DOM_SROF_SGLC_SWAV
Compset specification file is /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime_config/config_compsets.xml
Automatically adding SESP to compset
Compset forcing is 1972-2004
ATM component is CAM cam6 physics:
LND component is clm5.0:Satellite phenology:
ICE component is Sea ICE (cice) model version 6 :prescribed cice
OCN component is DOCN prescribed ocean mode
ROF component is Stub river component
GLC component is Stub glacier (land ice) component
WAV component is Stub wave component
ESP component is Stub external system processing (ESP) component
Pes specification file is /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime_config/config_pes.xml
Compset specific settings: name is RUN_STARTDATE and value is 1969-02-15
Compset specific settings: name is STOP_N and value is 47
Compset specific settings: name is STOP_OPTION and value is nhours
Compset specific settings: name is START_TOD and value is 0
Compset specific settings: name is PTS_MODE and value is TRUE
Compset specific settings: name is PTS_LAT and value is 15.0
Compset specific settings: name is PTS_LON and value is 345.0
Machine is larson-group
vid is comment
vid is ntasks
vid is nthrds
vid is rootpe
Pes setting: grid is a%T42_l%T42_oi%T42_r%null_g%null_w%null_z%null_m%gx1v7
Pes setting: compset is 2000_CAM60%SCAMATEX_CLM50%SP_CICE%PRES_DOCN%DOM_SROF_SGLC_SWAV_SESP
Pes setting: tasks is {'NTASKS_ATM': 1, 'NTASKS_LND': 1, 'NTASKS_ROF': 1, 'NTASKS_ICE': 1, 'NTASKS_OCN': 1, 'NTASKS_GLC': 1, 'NTASKS_WAV': 1, 'NTASKS_CPL': 1}
Pes setting: threads is {'NTHRDS_ATM': 1, 'NTHRDS_LND': 1, 'NTHRDS_ROF': 1, 'NTHRDS_ICE': 1, 'NTHRDS_OCN': 1, 'NTHRDS_GLC': 1, 'NTHRDS_WAV': 1, 'NTHRDS_CPL': 1}
Pes setting: rootpe is {'ROOTPE_ATM': 0, 'ROOTPE_LND': 0, 'ROOTPE_ROF': 0, 'ROOTPE_ICE': 0, 'ROOTPE_OCN': 0, 'ROOTPE_GLC': 0, 'ROOTPE_WAV': 0, 'ROOTPE_CPL': 0}
Pes setting: pstrid is {}
Pes other settings: {}
Pes comments: none
setting additional fields from config_pes: {}
Compset is: 2000_CAM60%SCAMATEX_CLM50%SP_CICE%PRES_DOCN%DOM_SROF_SGLC_SWAV_SESP
Grid is: a%T42_l%T42_oi%T42_r%null_g%null_w%null_z%null_m%gx1v7
Components in compset are: ['cam', 'clm', 'cice', 'docn', 'srof', 'sglc', 'swav', 'sesp']
This is a CESM scientifically supported compset at this resolution.
No project info available
No charge_account info available, using value from PROJECT
cesm model version found: cam_4ncar_20240605_2f3c896-775-g97aa2c69
Batch_system_type is none
Creating Case directory /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/scripts/test_scam_ATEX
This component includes user_mods /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime_config/usermods_dirs/scam_mandatory
Adding user mods directory /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime_config/usermods_dirs/scam_mandatory
RUN: /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/scripts/test_scam_ATEX/shell_commands
FROM: /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/scripts/test_scam_ATEX
Creating batch scripts
Writing case.run script from input template /home/jenkins/workspace/cam_scam_gfortran_debug_test/ccs_config/machines/template.case.run
Creating file .case.run
Writing case.st_archive script from input template /home/jenkins/workspace/cam_scam_gfortran_debug_test/ccs_config/machines/template.st_archive
Creating file case.st_archive
Creating user_nl_xxx files for components and cpl
Running cam.case_setup.py
If an old case build already exists, might want to run 'case.build --clean' before building
You can now run './preview_run' to get more info on how your case will be run
----- Compile Configuration -----
----- Run configuration -----
----- Compile -----
Building case in directory /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/scripts/test_scam_ATEX
sharedlib_only is False
model_only is False
Generating component namelists as part of build
2025-03-24 03:47:57 atm
Create namelist for component cam
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime_config/buildnml
...calling cam buildcpp to set build time options
CAM namelist copy: file1 /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/scripts/test_scam_ATEX/Buildconf/camconf/atm_in file2 /home/jenkins/cam_output/scratch/test_scam_ATEX/run/atm_in
2025-03-24 03:47:57 lnd
Create namelist for component clm
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/components/clm//cime_config/buildnml
WARNING: CLM is starting up from a cold state
2025-03-24 03:47:58 ice
Create namelist for component cice
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/components/cice//cime_config/buildnml
2025-03-24 03:47:58 ocn
Create namelist for component docn
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/components/cdeps/docn/cime_config/buildnml
docn_mode is prescribed
2025-03-24 03:47:58 rof
Create namelist for component srof
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/non_py/src/components/stub_comps_nuopc/srof/cime_config/buildnml
2025-03-24 03:47:58 glc
Create namelist for component sglc
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sglc/cime_config/buildnml
2025-03-24 03:47:58 wav
Create namelist for component swav
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/non_py/src/components/stub_comps_nuopc/swav/cime_config/buildnml
2025-03-24 03:47:58 esp
Create namelist for component sesp
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sesp/cime_config/buildnml
2025-03-24 03:47:58 cpl
Create namelist for component drv
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/components/cmeps/cime_config/buildnml
Writing nuopc_runconfig for components ['CPL', 'ATM', 'LND', 'ICE', 'OCN']
Building gptl with output to file /home/jenkins/cam_output/scratch/test_scam_ATEX/bld/gptl.bldlog.250324-034757
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/build_scripts/buildlib.gptl
Component gptl build complete with 21 warnings
Building pio with output to file /home/jenkins/cam_output/scratch/test_scam_ATEX/bld/pio.bldlog.250324-034757
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/build_scripts/buildlib.pio
Building csm_share with output to file /home/jenkins/cam_output/scratch/test_scam_ATEX/bld/csm_share.bldlog.250324-034757
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/share/buildlib.csm_share
Component csm_share build complete with 254 warnings
Building CDEPS with output to file /home/jenkins/cam_output/scratch/test_scam_ATEX/bld/CDEPS.bldlog.250324-034757
Calling /home/jenkins/workspace/cam_scam_gfortran_debug_test/components/cdeps/cime_config/buildlib
Component CDEPS build complete with 200 warnings
- Building clm library
Building lnd with output to /home/jenkins/cam_output/scratch/test_scam_ATEX/bld/lnd.bldlog.250324-034757
Traceback (most recent call last):
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/scripts/test_scam_ATEX/./case.build", line 267, in <module>
    _main_func(__doc__)
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/scripts/test_scam_ATEX/./case.build", line 251, in _main_func
    success = build.case_build(
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/build.py", line 1321, in case_build
    return run_and_log_case_status(functor, cb, caseroot=caseroot)
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/utils.py", line 2482, in run_and_log_case_status
    rv = func()
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/build.py", line 1305, in <lambda>
    functor = lambda: _case_build_impl(
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/build.py", line 1195, in _case_build_impl
    logs = _build_libraries(
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/build.py", line 869, in _build_libraries
    _build_model_thread(
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/build.py", line 939, in _build_model_thread
    analyze_build_log(compclass, file_build, compiler)
  File "/home/jenkins/workspace/cam_scam_gfortran_debug_test/cime/CIME/utils.py", line 2391, in analyze_build_log
    with open(log, "r") as fd:
FileNotFoundError: [Errno 2] No such file or directory: '/home/jenkins/cam_output/scratch/test_scam_ATEX/bld/lnd.bldlog.250324-034757'
Error building CAM
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Run ARM97 with SILHS)
Stage "Run ARM97 with SILHS" skipped due to earlier failure(s)
[Pipeline] getContext
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] sh
+ chmod -R 755 /home/jenkins/cam_output/
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/test_scam_ATEX/bld/cesm.bldlog.*.gz'
gzip: /home/jenkins/cam_output/scratch/test_scam_ATEX/bld/cesm.bldlog.*.gz: No such file or directory
[Pipeline] echo
ATEX WARNING: One or more of the build logs were not zipped, this means compilation did not finish
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/test_scam_ATEX/run/cesm.log.*.gz'
gzip: /home/jenkins/cam_output/scratch/test_scam_ATEX/run/cesm.log.*.gz: No such file or directory
[Pipeline] echo
ATEX WARNING: One or more of the run logs were not zipped, this means the code did not run to completion
[Pipeline] sh
+ cat '/home/jenkins/cam_output/scratch/test_scam_ATEX/bld/cesm.bldlog.*'
cat: '/home/jenkins/cam_output/scratch/test_scam_ATEX/bld/cesm.bldlog.*': No such file or directory
[Pipeline] echo
ATEX WARNING: One of the log files (build or run) was not found - the failure must have occurred before their creation
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/test_scam_ARM97/bld/cesm.bldlog.*.gz'
gzip: /home/jenkins/cam_output/scratch/test_scam_ARM97/bld/cesm.bldlog.*.gz: No such file or directory
[Pipeline] echo
ARM97 WARNING: One or more of the build logs were not zipped, this means compilation did not finish
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/test_scam_ARM97/run/cesm.log.*.gz'
gzip: /home/jenkins/cam_output/scratch/test_scam_ARM97/run/cesm.log.*.gz: No such file or directory
[Pipeline] echo
ARM97 WARNING: One or more of the run logs were not zipped, this means the code did not run to completion
[Pipeline] sh
+ cat '/home/jenkins/cam_output/scratch/test_scam_ARM97/bld/cesm.bldlog.*'
cat: '/home/jenkins/cam_output/scratch/test_scam_ARM97/bld/cesm.bldlog.*': No such file or directory
[Pipeline] echo
ARM97 WARNING: One of the log files (build or run) was not found - the failure must have occurred before their creation
[Pipeline] }
[Pipeline] // script
[Pipeline] script
[Pipeline] {
[Pipeline] emailext
Sending email to: messnermet@uwm.edu
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE