Console Output

Started by an SCM change
Obtained jenkins_tests/cam_gpu_test/Jenkinsfile from git https://github.com/larson-group/cam.git
[Pipeline] Start of Pipeline
[Pipeline] node
Running on overlie in /home/jenkins/workspace/cam_gpu_test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
The recommended git tool is: git
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/larson-group/cam.git
 > git init /home/jenkins/workspace/cam_gpu_test # timeout=10
Fetching upstream changes from https://github.com/larson-group/cam.git
 > git --version # timeout=10
 > git --version # 'git version 2.34.1'
 > git fetch --tags --force --progress -- https://github.com/larson-group/cam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
Checking out Revision fe5c039b9bba8aa4acb3a124acc0ad44cbe4cf2d (refs/remotes/origin/clubb_silhs_devel)
Commit message: "Autoupdated CLUBB_core
  Commit 418c202a9b0512fc8c4f421918cc978164db852d
  Author: Gunther Huebler
  Date: Wed Dec 17 14:58:28 2025 -0600
  Fixes for fill_holes and edsclr code (#1266)
  * Fixing bugs in new hole fillers that caused error when ran in descending mode, the issue was that dz is negative in descending mode, so we need abs() around it when calculating the normalized mass. Also fixing a related bug in the way edsclr code calls the hole filler - it was not using the generalized bounds. Also removing the ifdef in favor of normal ifs"
 > git config remote.origin.url https://github.com/larson-group/cam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse refs/remotes/origin/clubb_silhs_devel^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fe5c039b9bba8aa4acb3a124acc0ad44cbe4cf2d # timeout=10
 > git rev-list --no-walk f1c6a5ece8f3616577a7dca89d68f4fe968ffa86 # timeout=10
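(Aside: the checkout above can be replayed outside Jenkins with the same git invocations; the URL, refspec, and commit hash below are copied verbatim from the trace, and only the local directory name is illustrative.)

    # minimal local replay of the SCM checkout, assuming a recent git on PATH
    git init cam_gpu_test && cd cam_gpu_test
    git fetch --tags --force --progress -- https://github.com/larson-group/cam.git \
        '+refs/heads/*:refs/remotes/origin/*'
    git checkout -f fe5c039b9bba8aa4acb3a124acc0ad44cbe4cf2d  # refs/remotes/origin/clubb_silhs_devel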
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Checkout Externals and Copy Custom Files)
[Pipeline] sh
+ python run_scripts/set_up_repo.py .
Checking out externals...
Copying ccs_config files
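(The trace does not show what set_up_repo.py does internally; in CESM-derived trees the externals step is conventionally driven by checkout_externals, so a hand-run equivalent might look like the sketch below. The manage_externals path and the source directory for the ccs_config copy are assumptions, not from the log.)

    # hedged sketch of the set-up step; both paths are assumptions
    ./manage_externals/checkout_externals        # fetch component externals (assumed tool)
    cp -r my_custom_ccs_config/. ccs_config/     # hypothetical source for the copied ccs_config files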
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Remove Old Output)
[Pipeline] sh
+ rm -rf /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/ /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Running ne3 with OpenACC)
[Pipeline] sh
+ source /etc/profile.d/larson-group.sh
++ export GIT_EDITOR=vi
++ GIT_EDITOR=vi
++ export SVN_EDITOR=vi
++ SVN_EDITOR=vi
++ export OMP_STACKSIZE=1048579
++ OMP_STACKSIZE=1048579
++ export LMOD_ROOT=/opt/lmod/
++ LMOD_ROOT=/opt/lmod/
++ source /opt/lmod//lmod/lmod/init/bash
+++ '[' -z '' ']'
+++ case "$-" in
+++ __lmod_vx=x
+++ '[' -n x ']'
+++ set +x
Shell debugging temporarily silenced: export LMOD_SH_DBG_ON=1 for this output (/usr/local/spack/opt/spack/linux-pop22-skylake_avx512/gcc-12.2.0/lmod-8.7.37-fq24mybyn2li6got2bxzk62ejh5d3p4z/lmod/lmod/init/bash)
Shell debugging restarted
+++ unset __lmod_vx
+++ find /usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core -print -quit
++ export MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
++ MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
+ ulimit -s 8388608
+ run_scripts/run_cesm_uwm_coarse_res_gpu_no_silhs.sh
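(Reproducing this stage by hand amounts to the three traced commands; everything below is taken verbatim from the trace. The ulimit value is in kilobytes, i.e. an 8 GiB stack.)

    source /etc/profile.d/larson-group.sh                 # exports editors, OMP_STACKSIZE, Lmod paths
    ulimit -s 8388608                                     # raise the stack limit (value in kB = 8 GiB)
    run_scripts/run_cesm_uwm_coarse_res_gpu_no_silhs.sh   # drives case setup, build, and run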
----- Case Setup -----
Compset longname is 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV
Compset specification file is /home/jenkins/workspace/cam_gpu_test/cime_config/config_compsets.xml
Automatically adding SESP to compset
Compset forcing is 1972-2004
ATM component is CAM cam7 physics:
LND component is clm6.0:Satellite phenology:
ICE component is Sea ICE (cice) model version 6 :prescribed cice
OCN component is DOCN   prescribed ocean mode
ROF component is MOSART: MOdel for Scale Adaptive River Transport
GLC component is Stub glacier (land ice) component
WAV component is Stub wave component
ESP component is Stub external system processing (ESP) component
Pes     specification file is /home/jenkins/workspace/cam_gpu_test/cime_config/config_pes.xml
Machine is larson-group
Pes setting: grid match    is a%ne3np4 
Pes setting: grid          is a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7 
Pes setting: compset       is 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV_SESP 
Pes setting: tasks       is {'NTASKS_ATM': 24, 'NTASKS_LND': 24, 'NTASKS_ROF': 24, 'NTASKS_ICE': 24, 'NTASKS_OCN': 24, 'NTASKS_GLC': 24, 'NTASKS_WAV': 24, 'NTASKS_CPL': 24} 
Pes setting: threads     is {'NTHRDS_ATM': 1, 'NTHRDS_LND': 1, 'NTHRDS_ROF': 1, 'NTHRDS_ICE': 1, 'NTHRDS_OCN': 1, 'NTHRDS_GLC': 1, 'NTHRDS_WAV': 1, 'NTHRDS_CPL': 1} 
Pes setting: rootpe      is {'ROOTPE_ATM': 0, 'ROOTPE_LND': 0, 'ROOTPE_ROF': 0, 'ROOTPE_ICE': 0, 'ROOTPE_OCN': 0, 'ROOTPE_GLC': 0, 'ROOTPE_WAV': 0, 'ROOTPE_CPL': 0} 
Pes setting: pstrid      is {} 
Pes other settings: {}
Pes comments: none
setting additional fields from config_pes: {}
 Compset is: 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV_SESP 
 Grid is: a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7 
 Components in compset are: ['cam', 'clm', 'cice', 'docn', 'mosart', 'sglc', 'swav', 'sesp'] 
No project info available
No charge_account info available, using value from PROJECT
cesm model version found: cam_4ncar_20240605_2f3c896-875-gfe5c039b9
Batch_system_type is none
 Creating Case directory /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
ERROR: Unknown Job Queue specified use --force to set
Creating batch scripts
Writing case.run script from input template /home/jenkins/workspace/cam_gpu_test/ccs_config/machines/template.case.run
Creating file .case.run
Writing case.st_archive script from input template /home/jenkins/workspace/cam_gpu_test/ccs_config/machines/template.st_archive
Creating file case.st_archive
Creating user_nl_xxx files for components and cpl
Running cam.case_setup.py
If an old case build already exists, might want to run 'case.build --clean' before building
You can now run './preview_run' to get more info on how your case will be run
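(A minimal sketch of the CIME calls this setup phase corresponds to, assuming the run script wraps the standard create_newcase/case.setup flow; the compset, grid, machine, and compiler are taken from the output above, and --run-unsupported is an assumption for a non-standard machine.)

    ./cime/scripts/create_newcase \
        --case /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc \
        --compset 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV \
        --res a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7 \
        --mach larson-group --compiler nvhpc --run-unsupported
    cd /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
    ./case.setup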
----- Compile Configuration -----
----- Compile -----
Building case in directory /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
sharedlib_only is False
model_only is False
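(The compile phase is driven by the case's build script; a short sketch, with the --clean hint taken from the case-setup output above.)

    cd /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
    ./case.build --clean   # optional: clear a stale build first, as case_setup suggests
    ./case.build           # generates namelists, builds shared libs, then component libs in parallel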
Generating component namelists as part of build
  2025-12-18 03:18:04 atm 
Create namelist for component cam
   Calling /home/jenkins/workspace/cam_gpu_test/cime_config/buildnml
     ...calling cam buildcpp to set build time options
CAM namelist copy: file1 /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/Buildconf/camconf/atm_in file2 /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/run/atm_in 
  2025-12-18 03:18:05 lnd 
Create namelist for component clm
   Calling /home/jenkins/workspace/cam_gpu_test/components/clm//cime_config/buildnml
  2025-12-18 03:18:05 ice 
Create namelist for component cice
   Calling /home/jenkins/workspace/cam_gpu_test/components/cice//cime_config/buildnml
RUN: /home/jenkins/workspace/cam_gpu_test/components/cice/bld/generate_cice_decomp.pl -ccsmroot /home/jenkins/workspace/cam_gpu_test -res ne3np4.pg3 -nx 486 -ny 1 -nproc 1 -thrds 1 -output all 
FROM: /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
  output: 486 1 486 1 1 roundrobin square-ice

  2025-12-18 03:18:05 ocn 
Create namelist for component docn
   Calling /home/jenkins/workspace/cam_gpu_test/components/cdeps/docn/cime_config/buildnml
docn_mode is prescribed
  2025-12-18 03:18:05 rof 
Create namelist for component mosart
   Calling /home/jenkins/workspace/cam_gpu_test/components/mosart//cime_config/buildnml
  2025-12-18 03:18:05 glc 
Create namelist for component sglc
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sglc/cime_config/buildnml
  2025-12-18 03:18:05 wav 
Create namelist for component swav
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/swav/cime_config/buildnml
  2025-12-18 03:18:05 esp 
Create namelist for component sesp
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sesp/cime_config/buildnml
  2025-12-18 03:18:05 cpl 
Create namelist for component drv
   Calling /home/jenkins/workspace/cam_gpu_test/components/cmeps/cime_config/buildnml
Writing nuopc_runconfig for components ['CPL', 'ATM', 'LND', 'ICE', 'OCN', 'ROF']
Building gptl with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/gptl.bldlog.251218-031804
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/build_scripts/buildlib.gptl
Building pio with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/pio.bldlog.251218-031804
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/build_scripts/buildlib.pio
Building csm_share with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/csm_share.bldlog.251218-031804
   Calling /home/jenkins/workspace/cam_gpu_test/share/buildlib.csm_share
Building CDEPS with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/CDEPS.bldlog.251218-031804
   Calling /home/jenkins/workspace/cam_gpu_test/components/cdeps/cime_config/buildlib
         - Building clm library 
Building lnd with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/lnd.bldlog.251218-031804
clm built in 660.124609 seconds
         - Building atm Library 
Building atm with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251218-031804
         - Building ice Library 
Building ice with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ice.bldlog.251218-031804
         - Building ocn Library 
Building ocn with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ocn.bldlog.251218-031804
         - Building rof Library 
Building rof with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/rof.bldlog.251218-031804
cice built in 0.092090 seconds
cam built in 0.094207 seconds
docn built in 0.120966 seconds
mosart built in 0.120295 seconds
Traceback (most recent call last):
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2482, in run_and_log_case_status
    rv = func()
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1305, in <lambda>
    functor = lambda: _case_build_impl(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1230, in _case_build_impl
    _build_model(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 402, in _build_model
    expect(not thread_bad_results, "\n".join(thread_bad_results))
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 176, in expect
    raise exc_type(msg)
CIME.utils.CIMEError: ERROR: BUILD FAIL: cice.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ice.bldlog.251218-031804
BUILD FAIL: cam.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251218-031804
BUILD FAIL: docn.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ocn.bldlog.251218-031804
BUILD FAIL: mosart.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/rof.bldlog.251218-031804

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 267, in <module>
    _main_func(__doc__)
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 251, in _main_func
    success = build.case_build(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1321, in case_build
    return run_and_log_case_status(functor, cb, caseroot=caseroot)
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2494, in run_and_log_case_status
    append_case_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2084, in append_case_status
    append_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2068, in append_status
    with open(os.path.join(caseroot, sfile), "a") as fd:
FileNotFoundError: [Errno 2] No such file or directory: '/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/CaseStatus'
Error in sys.excepthook:
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/apport_python_hook.py", line 76, in apport_excepthook
    binary = os.path.realpath(os.path.join(os.getcwd(), sys.argv[0]))
FileNotFoundError: [Errno 2] No such file or directory

Original exception was:
Traceback (most recent call last):
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2482, in run_and_log_case_status
    rv = func()
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1305, in <lambda>
    functor = lambda: _case_build_impl(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1230, in _case_build_impl
    _build_model(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 402, in _build_model
    expect(not thread_bad_results, "\n".join(thread_bad_results))
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 176, in expect
    raise exc_type(msg)
CIME.utils.CIMEError: ERROR: BUILD FAIL: cice.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ice.bldlog.251218-031804
BUILD FAIL: cam.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251218-031804
BUILD FAIL: docn.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ocn.bldlog.251218-031804
BUILD FAIL: mosart.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/rof.bldlog.251218-031804

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 267, in <module>
    _main_func(__doc__)
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 251, in _main_func
    success = build.case_build(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1321, in case_build
    return run_and_log_case_status(functor, cb, caseroot=caseroot)
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2494, in run_and_log_case_status
    append_case_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2084, in append_case_status
    append_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2068, in append_status
    with open(os.path.join(caseroot, sfile), "a") as fd:
FileNotFoundError: [Errno 2] No such file or directory: '/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/CaseStatus'
Error building CAM
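(When case.build fails like this, the next step is to read the per-component build logs named in the CIMEError; the loop below uses the four paths verbatim from the message.)

    for comp in ice atm ocn rof; do
        # each failing buildlib names its log; print them all for triage
        cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/${comp}.bldlog.251218-031804
    done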
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] sh
+ chmod -R 755 /home/jenkins/cam_output
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*.gz'
gzip: /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*.gz: No such file or directory
[Pipeline] echo
WARNING: One or more of the build logs were not zipped; this means compilation did not finish
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/run/cesm.log.*.gz'
gzip: /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/run/cesm.log.*.gz: No such file or directory
[Pipeline] echo
WARNING: One or more of the run logs were not zipped; this means the code did not run to completion
[Pipeline] sh
+ cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251218-031804
ERROR: Makes no sense to have empty read-only file: /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/env_case.xml
[Pipeline] sh
+ cat '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*'
cat: '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*': No such file or directory
[Pipeline] echo
WARNING: One of the log files (build or run) was not found - the failure must have occurred before its creation
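(Note: the gzip and cat steps above single-quote their globs, so the shell passes them literally even when matching files do exist; here the logs are genuinely absent, but the quoting would also break these steps after a successful build. A hedged sketch of a more robust post-action, with the path taken from the log:)

    shopt -s nullglob
    logs=(/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*.gz)
    if (( ${#logs[@]} )); then
        gzip -d "${logs[@]}"                    # decompress only when the glob matched something
    else
        echo "no compressed build logs found" >&2
    fi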
[Pipeline] }
[Pipeline] // script
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE