
Console Output

Started by an SCM change
Obtained jenkins_tests/cam_gpu_test/Jenkinsfile from git https://github.com/larson-group/cam.git
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /home/jenkins/workspace/cam_gpu_test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
The recommended git tool is: git
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/larson-group/cam.git
 > git init /home/jenkins/workspace/cam_gpu_test # timeout=10
Fetching upstream changes from https://github.com/larson-group/cam.git
 > git --version # timeout=10
 > git --version # 'git version 2.34.1'
 > git fetch --tags --force --progress -- https://github.com/larson-group/cam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/larson-group/cam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
 > git rev-parse refs/remotes/origin/clubb_silhs_devel^{commit} # timeout=10
Checking out Revision f4776e99c62d8fb12936020cbc47f854c088a006 (refs/remotes/origin/clubb_silhs_devel)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f4776e99c62d8fb12936020cbc47f854c088a006 # timeout=10
Commit message: "Autoupdated CLUBB_core Commit 988ea0f2cc24e424e0a204b65bfd59bd4e9eee5b Author: Gunther Huebler Date: Thu Dec 4 20:47:54 2025 -0600 Small bug fix to make gfortran+debug happy, without the _core_rknd this could be a potential type error"
 > git rev-list --no-walk 97e5861a2d216548b7ab9f32ac60f0423361561d # timeout=10
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Checkout Externals and Copy Custom Files)
[Pipeline] sh
+ python run_scripts/set_up_repo.py .
Checking out externals...
Copying ccs_config files
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Remove Old Output)
[Pipeline] sh
+ rm -rf /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/ /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Running ne3 with OpenACC)
[Pipeline] sh
+ source /etc/profile.d/larson-group.sh
++ export GIT_EDITOR=vi
++ GIT_EDITOR=vi
++ export SVN_EDITOR=vi
++ SVN_EDITOR=vi
++ export OMP_STACKSIZE=1048579
++ OMP_STACKSIZE=1048579
++ export LMOD_ROOT=/opt/lmod/
++ LMOD_ROOT=/opt/lmod/
++ source /opt/lmod//lmod/lmod/init/bash
+++ '[' -z '' ']'
+++ case "$-" in
+++ __lmod_vx=x
+++ '[' -n x ']'
+++ set +x
Shell debugging temporarily silenced: export LMOD_SH_DBG_ON=1 for this output (/usr/local/spack/opt/spack/linux-pop22-cascadelake/gcc-12.2.0/lmod-8.7.37-bi3kyxcdrfgw3y7vv2k7c5rjxg75qzju/lmod/lmod/init/bash)
Shell debugging restarted
+++ unset __lmod_vx
+++ find /usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core -print -quit
++ export MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
++ MODULEPATH=/usr/local/spack/share/spack/lmod/linux-pop22-x86_64/Core
+ ulimit -s 8388608
+ run_scripts/run_cesm_uwm_coarse_res_gpu_no_silhs.sh
----- Case Setup -----
Compset longname is 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV
Compset specification file is /home/jenkins/workspace/cam_gpu_test/cime_config/config_compsets.xml
Automatically adding SESP to compset
Compset forcing is 1972-2004
ATM component is CAM cam7 physics:
LND component is clm6.0:Satellite phenology:
ICE component is Sea ICE (cice) model version 6 :prescribed cice
OCN component is DOCN   prescribed ocean mode
ROF component is MOSART: MOdel for Scale Adaptive River Transport
GLC component is Stub glacier (land ice) component
WAV component is Stub wave component
ESP component is Stub external system processing (ESP) component
Pes     specification file is /home/jenkins/workspace/cam_gpu_test/cime_config/config_pes.xml
Machine is larson-group
Pes setting: grid match    is a%ne3np4 
Pes setting: grid          is a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7 
Pes setting: compset       is 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV_SESP 
Pes setting: tasks       is {'NTASKS_ATM': 24, 'NTASKS_LND': 24, 'NTASKS_ROF': 24, 'NTASKS_ICE': 24, 'NTASKS_OCN': 24, 'NTASKS_GLC': 24, 'NTASKS_WAV': 24, 'NTASKS_CPL': 24} 
Pes setting: threads     is {'NTHRDS_ATM': 1, 'NTHRDS_LND': 1, 'NTHRDS_ROF': 1, 'NTHRDS_ICE': 1, 'NTHRDS_OCN': 1, 'NTHRDS_GLC': 1, 'NTHRDS_WAV': 1, 'NTHRDS_CPL': 1} 
Pes setting: rootpe      is {'ROOTPE_ATM': 0, 'ROOTPE_LND': 0, 'ROOTPE_ROF': 0, 'ROOTPE_ICE': 0, 'ROOTPE_OCN': 0, 'ROOTPE_GLC': 0, 'ROOTPE_WAV': 0, 'ROOTPE_CPL': 0} 
Pes setting: pstrid      is {} 
Pes other settings: {}
Pes comments: none
setting additional fields from config_pes: {}
 Compset is: 2000_CAM70_CLM60%SP_CICE%PRES_DOCN%DOM_MOSART_SGLC_SWAV_SESP 
 Grid is: a%ne3np4.pg3_l%ne3np4.pg3_oi%ne3np4.pg3_r%r05_g%null_w%null_z%null_m%gx3v7 
 Components in compset are: ['cam', 'clm', 'cice', 'docn', 'mosart', 'sglc', 'swav', 'sesp'] 
No project info available
No charge_account info available, using value from PROJECT
cesm model version found: cam_4ncar_20240605_2f3c896-871-gf4776e99c
Batch_system_type is none
 Creating Case directory /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
ERROR: Unknown Job Queue specified use --force to set
Creating batch scripts
Writing case.run script from input template /home/jenkins/workspace/cam_gpu_test/ccs_config/machines/template.case.run
Creating file .case.run
Writing case.st_archive script from input template /home/jenkins/workspace/cam_gpu_test/ccs_config/machines/template.st_archive
Creating file case.st_archive
Creating user_nl_xxx files for components and cpl
Running cam.case_setup.py
If an old case build already exists, might want to run 'case.build --clean' before building
You can now run './preview_run' to get more info on how your case will be run
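A minimal sketch for reproducing this step by hand in the generated caseroot, assuming a standard CIME case directory (the caseroot path, './preview_run', and 'case.build --clean' all come from the output above; rebuild.log is just an arbitrary capture file):
    cd /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
    ./preview_run                         # inspect how the case will be run
    ./case.build --clean                  # optional: clear any stale build first
    ./case.build 2>&1 | tee rebuild.log   # rebuild and capture the output for inspection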
----- Compile Configuration -----
----- Compile -----
Building case in directory /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
sharedlib_only is False
model_only is False
Generating component namelists as part of build
  2025-12-06 03:17:49 atm 
Create namelist for component cam
   Calling /home/jenkins/workspace/cam_gpu_test/cime_config/buildnml
     ...calling cam buildcpp to set build time options
CAM namelist copy: file1 /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/Buildconf/camconf/atm_in file2 /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/run/atm_in 
  2025-12-06 03:17:49 lnd 
Create namelist for component clm
   Calling /home/jenkins/workspace/cam_gpu_test/components/clm//cime_config/buildnml
  2025-12-06 03:17:49 ice 
Create namelist for component cice
   Calling /home/jenkins/workspace/cam_gpu_test/components/cice//cime_config/buildnml
RUN: /home/jenkins/workspace/cam_gpu_test/components/cice/bld/generate_cice_decomp.pl -ccsmroot /home/jenkins/workspace/cam_gpu_test -res ne3np4.pg3 -nx 486 -ny 1 -nproc 1 -thrds 1 -output all 
FROM: /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc
  output: 486 1 486 1 1 roundrobin square-ice
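The -nx 486 passed to generate_cice_decomp.pl matches the physics-column count of the ne3np4.pg3 grid, assuming the usual cubed-sphere layout: 6 faces x (3 x 3) elements per face x (3 x 3) physics columns per element = 486, placed here on a single task with a round-robin, square-ice decomposition (the "486 1 486 1 1 roundrobin square-ice" line above).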

  2025-12-06 03:17:50 ocn 
Create namelist for component docn
   Calling /home/jenkins/workspace/cam_gpu_test/components/cdeps/docn/cime_config/buildnml
docn_mode is prescribed
  2025-12-06 03:17:50 rof 
Create namelist for component mosart
   Calling /home/jenkins/workspace/cam_gpu_test/components/mosart//cime_config/buildnml
  2025-12-06 03:17:50 glc 
Create namelist for component sglc
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sglc/cime_config/buildnml
  2025-12-06 03:17:50 wav 
Create namelist for component swav
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/swav/cime_config/buildnml
  2025-12-06 03:17:50 esp 
Create namelist for component sesp
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/non_py/src/components/stub_comps_nuopc/sesp/cime_config/buildnml
  2025-12-06 03:17:50 cpl 
Create namelist for component drv
   Calling /home/jenkins/workspace/cam_gpu_test/components/cmeps/cime_config/buildnml
Writing nuopc_runconfig for components ['CPL', 'ATM', 'LND', 'ICE', 'OCN', 'ROF']
Building gptl with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/gptl.bldlog.251206-031749
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/build_scripts/buildlib.gptl
Building pio with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/pio.bldlog.251206-031749
   Calling /home/jenkins/workspace/cam_gpu_test/cime/CIME/build_scripts/buildlib.pio
Building csm_share with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/csm_share.bldlog.251206-031749
   Calling /home/jenkins/workspace/cam_gpu_test/share/buildlib.csm_share
Building CDEPS with output to file /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/CDEPS.bldlog.251206-031749
   Calling /home/jenkins/workspace/cam_gpu_test/components/cdeps/cime_config/buildlib
         - Building clm library 
Building lnd with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/lnd.bldlog.251206-031749
clm built in 513.821702 seconds
         - Building atm Library 
Building atm with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251206-031749
         - Building ice Library 
Building ice with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ice.bldlog.251206-031749
         - Building ocn Library 
Building ocn with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ocn.bldlog.251206-031749
         - Building rof Library 
Building rof with output to /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/rof.bldlog.251206-031749
cam built in 0.188454 seconds
mosart built in 0.173775 seconds
cice built in 0.183705 seconds
docn built in 0.186921 seconds
Traceback (most recent call last):
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2482, in run_and_log_case_status
    rv = func()
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1305, in <lambda>
    functor = lambda: _case_build_impl(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1230, in _case_build_impl
    _build_model(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 402, in _build_model
    expect(not thread_bad_results, "\n".join(thread_bad_results))
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 176, in expect
    raise exc_type(msg)
CIME.utils.CIMEError: ERROR: BUILD FAIL: cam.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251206-031749
BUILD FAIL: mosart.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/rof.bldlog.251206-031749
BUILD FAIL: cice.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ice.bldlog.251206-031749
BUILD FAIL: docn.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ocn.bldlog.251206-031749

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 267, in <module>
    _main_func(__doc__)
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 251, in _main_func
    success = build.case_build(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1321, in case_build
    return run_and_log_case_status(functor, cb, caseroot=caseroot)
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2494, in run_and_log_case_status
    append_case_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2084, in append_case_status
    append_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2068, in append_status
    with open(os.path.join(caseroot, sfile), "a") as fd:
FileNotFoundError: [Errno 2] No such file or directory: '/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/CaseStatus'
Error in sys.excepthook:
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/apport_python_hook.py", line 76, in apport_excepthook
    binary = os.path.realpath(os.path.join(os.getcwd(), sys.argv[0]))
FileNotFoundError: [Errno 2] No such file or directory

Original exception was:
Traceback (most recent call last):
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2482, in run_and_log_case_status
    rv = func()
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1305, in <lambda>
    functor = lambda: _case_build_impl(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1230, in _case_build_impl
    _build_model(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 402, in _build_model
    expect(not thread_bad_results, "\n".join(thread_bad_results))
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 176, in expect
    raise exc_type(msg)
CIME.utils.CIMEError: ERROR: BUILD FAIL: cam.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251206-031749
BUILD FAIL: mosart.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/rof.bldlog.251206-031749
BUILD FAIL: cice.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ice.bldlog.251206-031749
BUILD FAIL: docn.buildlib failed, cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/ocn.bldlog.251206-031749

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 267, in <module>
    _main_func(__doc__)
  File "/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/./case.build", line 251, in _main_func
    success = build.case_build(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/build.py", line 1321, in case_build
    return run_and_log_case_status(functor, cb, caseroot=caseroot)
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2494, in run_and_log_case_status
    append_case_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2084, in append_case_status
    append_status(
  File "/home/jenkins/workspace/cam_gpu_test/cime/CIME/utils.py", line 2068, in append_status
    with open(os.path.join(caseroot, sfile), "a") as fd:
FileNotFoundError: [Errno 2] No such file or directory: '/home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/CaseStatus'
Error building CAM
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] sh
+ chmod -R 755 /home/jenkins/cam_output
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*.gz'
gzip: /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*.gz: No such file or directory
[Pipeline] echo
WARNING: One or more of the build logs were not zipped; this means compilation did not finish
[Pipeline] sh
+ gzip -d '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/run/cesm.log.*.gz'
gzip: /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/run/cesm.log.*.gz: No such file or directory
[Pipeline] echo
WARNING: One or more of the run logs were not zipped; this means the code did not run to completion
[Pipeline] sh
+ cat /home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/atm.bldlog.251206-031749
ERROR: Makes no sense to have empty read-only file: /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/env_case.xml
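The atm build log contains no compiler output, only a CIME complaint about an empty env_case.xml, which is consistent with the earlier 'ERROR: Unknown Job Queue specified use --force to set' leaving the caseroot only partially populated (the missing CaseStatus file in the tracebacks above points the same way). A purely illustrative check, not part of the pipeline:
    ls -l /home/jenkins/cam_output/caseroot/UWM_ne3_gpu_no_stats_larson-group_nvhpc/env_case.xml   # per the error above, expected to exist but be empty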
[Pipeline] sh
+ cat '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*'
cat: '/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*': No such file or directory
[Pipeline] echo
WARNING: One of the log files (build or run) was not found - the failure must have occurred before its creation
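The gzip and cat failures above report the literal glob pattern as the missing file, which happens both when the pattern is quoted in the Jenkinsfile and when it simply matches nothing (bash passes an unmatched glob through unchanged by default). A minimal sketch of a more tolerant post step, assuming plain bash and the same paths:
    shopt -s nullglob                     # unmatched globs expand to nothing instead of themselves
    logs=(/home/jenkins/cam_output/scratch/UWM_ne3_gpu_no_stats_larson-group_nvhpc/bld/cesm.bldlog.*.gz)
    if [ ${#logs[@]} -eq 0 ]; then
        echo "no cesm build logs to unzip"   # distinguish 'nothing matched' from a real gzip failure
    else
        gzip -d "${logs[@]}"
    fi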
[Pipeline] }
[Pipeline] // script
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE