
Page 1:

ARSC & IARC Report

Juanxiong He 1,2, Greg Newby 1

1. Arctic Region Supercomputing Center, 2. International Arctic Research Center

DOD/RACM Workshop, December 5-6, 2008

Page 2:

Sketch

• Analyze CCSM4 and CPL7 code: http://www.oc.nps.edu/NAME/he_cpl7andwrf.ppt

• Integrate WRFV3 into CCSM4:

– compilation successful

– all components initialize successfully

– mapping initialization successful

– dies at the domain grid check

Page 3:

Integration Strategy

• Follow the framework of sequential CCSM4 and CPL7, since they are excellent!

• Avoid changing the physical schemes and the dynamical framework of WRFV3 as far as possible.

• Use the macros CCSMCOUPLED and SEQ_MCT as switches to control whether WRFV3 runs alone or coupled with CCSM4 (see the sketch after this list).

• Add a module named atm_comp_mct to WRFV3 to cooperate with CCSM4.
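As a minimal illustration of that switch, here is a hedged sketch of a CCSMCOUPLED/SEQ_MCT guard. The macro names come from this report; the subroutine and its body are illustrative placeholders, not the actual patch.

subroutine surface_flux_switch( dom_id )
   ! Sketch: compile-time switch between standalone and coupled WRF.
   implicit none
   integer, intent(in) :: dom_id   ! stands in for WRF's domain object
#if defined( CCSMCOUPLED ) && defined( SEQ_MCT )
   ! coupled run: surface fields were already imported from CPL7
   ! by atm_import_mct, so nothing to compute here
   continue
#else
   ! standalone run: WRF computes its own surface fluxes as usual
   print *, 'computing standalone surface fluxes for domain ', dom_id
#endif
end subroutine surface_flux_switch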

Page 4:

Outline

• atm_comp_mct

• PBL issue

• Radiation

• Restart

• Parallel

• Time mechanism

• Compilation

• Case run

Page 5:

atm_comp_mct.F

• Public interface: atm_init_mct, atm_run_mct, atm_final_mct

• Private interface: atm_SetgsMap_mct, atm_domain_mct, atm_import_mct, atm_export_mct, atm_read_srfrest_mct, atm_write_srfrest_mct

• Import the variables before the integration begins and export the variables after the integration finishes. As Tony suggested, this behavior should be tested in the future!

(Callouts: atm_SetgsMap_mct and atm_domain_mct handle the grid distribution over the different processors; atm_import_mct and atm_export_mct export and import state variables and fluxes; atm_read_srfrest_mct and atm_write_srfrest_mct read and write the restart file. A skeleton sketch follows below.)
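A hedged skeleton of how atm_comp_mct.F can be organized. The interface names come from this slide; the argument lists (which in CCSM4 carry the driver clock, the cdata structure and the x2a/a2x attribute vectors) and the restart pair are elided to keep the sketch self-contained.

module atm_comp_mct_sketch
   ! Skeleton of the WRF coupling module described above (bodies elided).
   implicit none
   private
   public :: atm_init_mct, atm_run_mct, atm_final_mct
contains
   subroutine atm_init_mct()
      call atm_setgsmap_mct()   ! distribute the grid over the processors
      call atm_domain_mct()     ! describe the WRF domain to the coupler
      call atm_export_mct()     ! first export of WRF state to the coupler
   end subroutine atm_init_mct
   subroutine atm_run_mct()
      call atm_import_mct()     ! import coupler fields before the integration
      ! ... advance WRF over one coupling interval ...
      call atm_export_mct()     ! export state and fluxes after the integration
   end subroutine atm_run_mct
   subroutine atm_final_mct()
   end subroutine atm_final_mct
   subroutine atm_setgsmap_mct()
   end subroutine atm_setgsmap_mct
   subroutine atm_domain_mct()
   end subroutine atm_domain_mct
   subroutine atm_import_mct()
   end subroutine atm_import_mct
   subroutine atm_export_mct()
   end subroutine atm_export_mct
end module atm_comp_mct_sketch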

Page 6:

Export variables

• Bottom atmosphere level layer (7): (grid%phb+grid%ph_2)/9.8 (height), grid%u_2 (zonal wind), grid%v_2 (meridional wind), grid%pb+grid%p (pressure), grid%t_2+300.0 (potential temperature), (grid%t_2+300.0)*((grid%pb+grid%p)/100000)**0.285 (temperature), grid%alt (air density), grid%moist (humidity); see the sketch after this list

• Radiation (6): grid%gsw (downward net shortwave radiation), grid%glw (downward longwave radiation), grid%swndr (NIR direct downward), grid%swvdr (VIS direct downward), grid%swndf (NIR diffuse downward), grid%swvdf (VIS diffuse downward)

• Surface (4): grid%raincv (convective precipitation), grid%rainncv (large-scale precipitation), grid%snowncv (large-scale frozen precipitation), grid%psfc (surface pressure)
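The temperature entry above is the usual Exner-function conversion from WRF's perturbation potential temperature. A hedged sketch of how atm_export_mct might compute the lowest-level fields, with the 3-D arrays flattened to 2-D and all names chosen for clarity rather than taken from the actual code:

subroutine export_bottom_level( its, ite, jts, jte, phb, ph_2, pb, p, t_2, &
                                zbot, pbot, tbot )
   ! Sketch: derive the exported lowest-level fields from WRF's
   ! base-state + perturbation arrays.
   implicit none
   integer, intent(in) :: its, ite, jts, jte
   real, intent(in),  dimension(its:ite,jts:jte) :: phb, ph_2, pb, p, t_2
   real, intent(out), dimension(its:ite,jts:jte) :: zbot, pbot, tbot
   integer :: i, j
   do j = jts, jte
      do i = its, ite
         zbot(i,j) = ( phb(i,j) + ph_2(i,j) ) / 9.8   ! height of lowest level (m)
         pbot(i,j) =   pb(i,j)  + p(i,j)              ! full pressure (Pa)
         ! temperature from potential temperature via the Exner function
         tbot(i,j) = ( t_2(i,j) + 300.0 ) * ( pbot(i,j) / 100000.0 )**0.285
      end do
   end do
end subroutine export_bottom_level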

Page 7:

Import variables

• PBL (8 variables): grid%lh (latent heat, W/m2), grid%hfx (sensible heat, W/m2), grid%qfx (upward moisture flux, kg/m2/s), taux (zonal wind stress, N/m2), tauy (meridional wind stress, N/m2), grid%tsk (surface skin temperature, K), grid%t2 (2 m reference temperature, K), grid%q2 (2 m reference humidity, kg/kg)

• Radiation (5 variables): grid%lwupb (upward longwave radiation, W/m2), grid%rasdir (VIS direct albedo), grid%raldir (NIR direct albedo), grid%rasdif (VIS diffuse albedo), grid%raldif (NIR diffuse albedo)

• Surface (4 variables): grid%sst (sea surface temperature, K), grid%snowh (snow depth, m), grid%xice (sea ice mask), grid%xland (land mask); a sketch of the import direction follows below
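For the opposite direction, a hedged sketch of the import step: coupler-side arrays (illustrative names and flattened indexing) are copied into the WRF grid arrays listed above.

subroutine import_surface_sketch( n, sst_c, snowh_c, xice_c, sst, snowh, xice )
   ! Sketch: copy coupler-provided surface fields into WRF's arrays.
   implicit none
   integer, intent(in)  :: n
   real,    intent(in)  :: sst_c(n), snowh_c(n), xice_c(n)   ! from the coupler
   real,    intent(out) :: sst(n), snowh(n), xice(n)         ! WRF grid arrays
   sst   = sst_c      ! sea surface temperature (K)
   snowh = snowh_c    ! snow depth (m)
   xice  = xice_c     ! sea ice mask
end subroutine import_surface_sketch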

Page 8:

Planetary boundary layer issue

• All WRF PBL schemes need (1) the momentum, heat and moisture similarity functions at the lowest layer; (2) the surface roughness length; (3) surface temperature and humidity; (4) the bulk Richardson number. YSU also needs (5) the wind at 10 m.

• The coupler provides sensible heat, evaporation and wind stress, so we must derive the variables above from them (under the constraint of flux conservation).

• Surface roughness length is a very important variable for air-sea interaction, especially under high-wind-speed conditions.

Page 9:

YSU (MRF)

• A typical, popular and efficient K-profile method (Louis, 1979; Troen and Mahrt, 1986; Vogelezang and Holtslag, 1996; Noh et al., 2003)

• Counter-gradient transport in the mixed layer and penetration of the entrainment flux at the inversion layer

• PBL height determined from the bulk Richardson number

Primary equation:

$$ \frac{\partial C}{\partial t} = \frac{\partial}{\partial z}\left[ K_c \left( \frac{\partial C}{\partial z} - \gamma_c \right) - \overline{w'c'}_h \left( \frac{z}{h} \right)^3 \right] $$

Required inputs: roughness length, Monin-Obukhov length, bulk Richardson number.

Page 10:

Computation I

Friction velocity:

$$ u_* = \left( \frac{\sqrt{\tau_x^2 + \tau_y^2}}{\rho_1} \right)^{1/2} $$

Temperature and humidity scales (H = sensible heat flux, Q = moisture flux):

$$ \theta_* = \frac{H}{\rho_1 u_*}, \qquad q_* = \frac{Q}{\rho_1 u_*} $$

Virtual temperature scale:

$$ \theta_{v*} = \theta_* + 0.61\,\theta\, q_* $$

Monin-Obukhov length:

$$ L = \frac{\theta_v\, u_*^2}{k\, g\, \theta_{v*}} $$

Roughness length, Charnock equation (Beljaars, 1994):

$$ z_0 = 0.11\,\frac{1.5\times 10^{-5}}{u_*} + 0.018\,\frac{u_*^2}{9.8} $$
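A hedged Fortran sketch of Computation I as it might be coded in atm_import_mct. The formulas are the ones above (note that, as on the slide, the temperature scale divides the sensible heat flux only by rho*u*, so any cp factor is assumed to be folded into H); all names are illustrative.

subroutine similarity_scales_sketch( taux, tauy, shf, qflx, rho1, theta, thetav, &
                                     ustar, thstar, qstar, thvstar, obklen, z0 )
   ! Sketch: similarity scales, Monin-Obukhov length and Charnock
   ! roughness length from the coupler-provided fluxes.
   implicit none
   real, intent(in)  :: taux, tauy      ! wind stress components (N/m2)
   real, intent(in)  :: shf             ! sensible heat flux H
   real, intent(in)  :: qflx            ! moisture flux Q (kg/m2/s)
   real, intent(in)  :: rho1            ! lowest-level air density (kg/m3)
   real, intent(in)  :: theta, thetav   ! potential / virtual potential temperature (K)
   real, intent(out) :: ustar, thstar, qstar, thvstar, obklen, z0
   real, parameter :: karman = 0.4, grav = 9.8, visc = 1.5e-5
   ustar   = sqrt( sqrt( taux**2 + tauy**2 ) / rho1 )         ! friction velocity
   ustar   = max( ustar, 1.0e-4 )                             ! guard against zero stress
   thstar  = shf  / ( rho1 * ustar )                          ! temperature scale
   qstar   = qflx / ( rho1 * ustar )                          ! humidity scale
   thvstar = thstar + 0.61 * theta * qstar                    ! virtual temperature scale
   obklen  = thetav * ustar**2 / ( karman * grav * thvstar )  ! Monin-Obukhov length
   z0      = 0.11 * visc / ustar + 0.018 * ustar**2 / grav    ! Charnock (Beljaars, 1994)
end subroutine similarity_scales_sketch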

Page 11:

Computation II

Unstable condition:

$$ \psi_m\!\left(\frac{z}{L}\right) = \ln\left[\left(\frac{1+x}{2}\right)^{2}\left(\frac{1+x^{2}}{2}\right)\right] - 2\arctan x + \frac{\pi}{2}, \qquad x = \left(1 - 15\,\frac{z}{L}\right)^{1/4} $$

$$ \psi_h\!\left(\frac{z}{L}\right) = \ln\left[\left(\frac{1+y}{2}\right)^{2}\right], \qquad y = \left(1 - 9\,\frac{z}{L}\right)^{1/2} $$

Stable condition:

$$ \psi_m = -4.7\,\frac{z}{L}, \qquad \psi_h = -\frac{4.7}{0.74}\,\frac{z}{L} $$

Bulk Richardson number after Louis (1979).
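A hedged sketch of the stability functions above as a small helper; the constants are the ones in the formulas, everything else is illustrative.

subroutine psi_functions_sketch( zol, psim, psih )
   ! Sketch: integrated similarity functions psi_m and psi_h for z/L.
   implicit none
   real, intent(in)  :: zol          ! z/L
   real, intent(out) :: psim, psih
   real :: x, y
   real, parameter :: pi = 3.14159265
   if ( zol < 0.0 ) then             ! unstable condition
      x = ( 1.0 - 15.0*zol )**0.25
      y = ( 1.0 -  9.0*zol )**0.5
      psim = log( ((1.0+x)/2.0)**2 * (1.0+x*x)/2.0 ) - 2.0*atan(x) + pi/2.0
      psih = log( ((1.0+y)/2.0)**2 )
   else                              ! stable condition
      psim = -4.7 * zol
      psih = -4.7/0.74 * zol
   end if
end subroutine psi_functions_sketch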

Page 12:

Computation III

[Figure: grid staggering, with u points and v points surrounding the pressure point, where φ, u_phy and v_phy are defined]

• Halo-region exchange of u and v before every step's computation (halo_EM_couple_in_A.inc)

Page 13:

List of changed files

• All the PBL-related computation is completed in the subroutine atm_import_mct.

• Change gz1oz0, br, psim and psih from i1 type to state type in Registry/Registry.EM

• Add the following entry to Registry.EM:

  halo HALO_EM_COUPLE_IN_A main 4:u_2,v_2

• Change solve_em.F, module_first_rk_step_part1.F and module_first_rk_step_part2.F: these changes just delete the dummy arguments related to gz1oz0, br, psim and psih

Page 14:

Atmosphere radiation issue

• Use the CAM package (ra_lw_physics=3, ra_sw_physics=3)

• Add asdir, aldir, asdif and aldif as subroutine input variables (if they are not provided, set all of them equal to albedo).

• Expose swndr, swvdr, swndf and swvdf as subroutine output variables.

• Unlock the CAM-related variables in Registry.EM

• Add asdir, aldir, asdif, aldif, swndr, swvdr, swndf and swvdf to Registry.EM as state variables.

• Change module_first_rk_step_part1.F, module_radiation_driver.F and module_ra_cam.F to add the corresponding dummy arguments (see the sketch below).
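A hedged sketch of the albedo-dummy pattern described above, shown for one of the four spectral albedos (the other three follow the same pattern). The real radiation driver interface is far larger; names here are illustrative.

subroutine set_spectral_albedo( n, albedo, asdir_used, asdir_in )
   ! Sketch: take the spectral albedo as an optional input and fall
   ! back to the broadband albedo when it is absent (standalone WRF).
   implicit none
   integer, intent(in)           :: n
   real,    intent(in)           :: albedo(n)       ! broadband albedo
   real,    intent(out)          :: asdir_used(n)   ! albedo actually used
   real,    intent(in), optional :: asdir_in(n)     ! VIS direct albedo from the coupler
   if ( present( asdir_in ) ) then
      asdir_used = asdir_in
   else
      asdir_used = albedo
   end if
end subroutine set_spectral_albedo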

Page 15:

Restart issue

• Based on the restart mechanism in WRFV3

• Add two new files to WRFV3: wrf_restart_couplein.F and wrf_restart_coupleout.F

• Add code to and revise module_io_domain.F, output_wrf.F, input_wrf.F and module_io_wrf.F

• Enhance the Registry framework:

– Revise Registry tools: gen_wrf_io.c, reg_parse.c, data_.h and registry.h

– Add a new attribute 'c' for reading and writing the restart coupling file in the Registry framework

– Add several new entries of rconfig type to the Registry, such as io_form_restart_couple, rst_couple_inname and rst_couple_outname

Page 16:

subroutine atm_write_srfrest_mct( grid, config_flags )
   .........
   CALL domain_clock_get( grid, current_timestr=timestr )
   .........
   CALL construct_filename2a ( rstname, config_flags, &
                               grid%rst_couple_outname, grid%id, 2, timestr )
   .........
   CALL open_w_dataset ( rid, TRIM(rstname), grid, &
                         config_flags, output_restart, "DATASET=RESTART", ierr )
   .........
   CALL output_restart_couple ( rid, grid, config_flags, ierr )
   .........
   CALL close_dataset ( rid, config_flags, "DATASET=RESTART" )
end subroutine atm_write_srfrest_mct

A snapshot of the top-level restart subroutine

Page 17:

Parallel issue

• Most of the MPI activities in WRF take place in the "local_communicator" group

• The subroutine split_communicator in module_dm initializes the communicator first

• Block MPI initialization and hand the MPI communicator to WRF in subroutine split_communicator:

  wrf_set_dm_communicator( mpi_communicator_atm )

• Pass the communicator to WRF from CCSM4 in atm_init_mct (a fuller sketch follows below):

  call seq_cdata_setptrs(cdata_a, ID=ATMID, mpicom=mpicom_atm, &
       gsMap=gsMap_atm, dom=dom_a, infodata=infodata)
  mpi_communicator_atm = mpicom_atm
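A hedged sketch of the split_communicator change described here: when coupled, skip WRF's own MPI initialization and adopt the communicator that CCSM4 handed over. The standalone branch is illustrative.

subroutine split_communicator_sketch
   use module_atm_communicator, only : mpi_communicator_atm
   implicit none
   include 'mpif.h'
   integer :: ierr
   logical :: initialized
#if defined( CCSMCOUPLED )
   ! MPI was already initialized by the CCSM4 driver; just adopt the
   ! atmosphere communicator stored by atm_init_mct
   call wrf_set_dm_communicator( mpi_communicator_atm )
#else
   call MPI_Initialized( initialized, ierr )
   if ( .not. initialized ) call MPI_Init( ierr )
   call wrf_set_dm_communicator( MPI_COMM_WORLD )
#endif
end subroutine split_communicator_sketch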

Page 18:

List of changed files

• Revise and add new source code to WRFV3/frame/init_module.F, WRFV3/frame/module_io_quilt.F and WRFV3/external/RSL_LITE/module_dm.F

• Add a new file, module_atm_communicator.F, to WRFV3/frame:

  module module_atm_communicator
     integer, public :: mpi_communicator_atm
  end module module_atm_communicator

• Change WRFV3/frame/Makefile

Page 19:

Timer issue

• CCSM4 uses two sets of clocks: one outside for the driver, the other inside for each component.

• Both the WRFV3 internal timer and the CCSM4 driver timer are based on ESMF. Most of their modules, subroutines, functions and variables are the same.

• There are many minor differences between the WRF and ESMF timers: some items have the same name with different functionality, and some have different names with the same functionality.

• Don't change the framework of the two timers; just rename some items to avoid name and use-association conflicts, and resolve the differences.

Page 20:

Resolve the differences

• To avoid name conflicts, change some CCSM4-related functions and structures in seq_ccsm_drv.F90 and atm_comp_mct.F:

  use ESMF_MOD, CCSM_Clock=>ESMF_Clock
  use ESMF_MOD, CCSM_time_initialize=>ESMF_initialize
  use ESMF_MOD, CCSM_time_finalize=>ESMF_finalize

• To avoid use-associated function and subroutine conflicts, rename all the ESMF modules in WRFV3 to the *_WRF form, for example ESMF.mod -> ESMF_WRF.mod.

Page 21:

List of changed files

• All the files under the directory external/esmf_time_f90

• external/io_esmf/module_symbol_utils.F90, share/dfi.F

• seq_ccsm_drv.F90

Page 22:

Other very minor changes to ESMF in CCSM4

• Comment out the line related to timeintchecknormalize in ESMF_TimeIntervalGet. seq_timemgr_clockInit seems not to deal with seq_timemgr_alarm_datestop and seq_timemgr_alarm_history correctly in the real-time case: it gives Interval ymd < 0, and then seq_timemgr_clockPrint crashes.

• Change date=date-off to date=date+off in the functions get_curr_calday and get_curr_date in the lnd component. It seems to be a typo: for a real-time run it gives a beginning Julian day of 366.97..., and then stops. (A minimal sketch follows below.)
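A minimal sketch of that one-line lnd fix; the surrounding function is illustrative, only the sign change comes from this report.

integer function shifted_date( date, off )
   ! get_curr_calday / get_curr_date, lnd component:
   ! original (buggy):  shifted_date = date - off
   implicit none
   integer, intent(in) :: date, off
   shifted_date = date + off    ! fixed sign
end function shifted_date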

Page 23:

Driver clock structure

(seq_timemgr_clockPrint) Clock = drv 1
(seq_timemgr_clockPrint) Start Time = 20010101 00000
(seq_timemgr_clockPrint) Curr Time = 20010101 00000
(seq_timemgr_clockPrint) Ref Time = 20010101 00000
(seq_timemgr_clockPrint) Stop Time = 20010106 00000
(seq_timemgr_clockPrint) Step number = 0
(seq_timemgr_clockPrint) Dtime = 1800
(seq_timemgr_clockPrint) Alarm = 1 seq_timemgr_alarm_run
(seq_timemgr_clockPrint) Alarm = 2 seq_timemgr_alarm_stop
(seq_timemgr_clockPrint) Alarm = 3 seq_timemgr_alarm_datestop
(seq_timemgr_clockPrint) Prev Time = 00000000 00000
(seq_timemgr_clockPrint) Next Time = 20010106 00000
(seq_timemgr_clockPrint) Intervl yms = 9999 0 -1795851392
(seq_timemgr_clockPrint) Alarm = 4 seq_timemgr_alarm_restart
(seq_timemgr_clockPrint) Alarm = 5 seq_timemgr_alarm_history
(seq_timemgr_clockPrint) Alarm = 6 seq_timemgr_alarm_atmrun
(seq_timemgr_clockPrint) Alarm = 7 seq_timemgr_alarm_lndrun
(seq_timemgr_clockPrint) Alarm = 8 seq_timemgr_alarm_icerun
(seq_timemgr_clockPrint) Alarm = 9 seq_timemgr_alarm_ocnrun
(seq_timemgr_clockPrint) Alarm = 10 seq_timemgr_alarm_ocnnext

Page 24:

ATM outside clock structure

(seq_timemgr_clockPrint) Clock = atm 2
(seq_timemgr_clockPrint) Start Time = 20010101 00000
(seq_timemgr_clockPrint) Curr Time = 20010101 00000
(seq_timemgr_clockPrint) Ref Time = 20010101 00000
(seq_timemgr_clockPrint) Stop Time = 20010106 00000
(seq_timemgr_clockPrint) Step number = 0
(seq_timemgr_clockPrint) Dtime = 1800
(seq_timemgr_clockPrint) Alarm = 1 seq_timemgr_alarm_run
(seq_timemgr_clockPrint) Alarm = 2 seq_timemgr_alarm_stop
(seq_timemgr_clockPrint) Alarm = 3 seq_timemgr_alarm_datestop
(seq_timemgr_clockPrint) Prev Time = 00000000 00000
(seq_timemgr_clockPrint) Next Time = 20010106 00000
(seq_timemgr_clockPrint) Intervl yms = 9999 0 -1795851392
(seq_timemgr_clockPrint) Alarm = 4 seq_timemgr_alarm_restart
(seq_timemgr_clockPrint) Alarm = 5 seq_timemgr_alarm_history
(seq_timemgr_clockPrint) Prev Time = 00001130 00000
(seq_timemgr_clockPrint) Next Time = 99991201 00000
(seq_timemgr_clockPrint) Intervl yms = 9999 0 -1795851392

Page 25:

Compile issue

• Threads=1 in CCSM4 at present; no OpenMP.

• No optimization and no OpenMP in WRFV3; otherwise the compilation and linking of seq_ccsm_drv may crash.

• Change configure.wrf, the top-directory Makefile, share/Makefile, main/Makefile, frame/Makefile and other files related to setting up the compilation environment variables in WRFV3

• Keep the Buildexe/cam.buildexe.csh name, but replace all of its content

Page 26:

#! /bin/csh -f
......
# for atm_mct_comp and WRF compiling
./compile em_real
# prepare library file for seq_ccsm_drv compiling
cd $CODEROOT/WRFV3/external/esmf_time_f90
ar ru $CODEROOT/WRFV3/main/libwrflib.a *.o
......
# copy file to $LIBROOT
......
cp -p main/libwrflib.a $LIBROOT/libatm.a
# prepare namelist, parameter tables and initial dataset for WRF run
...
cp $CODEROOT/WRFV3/run/ETAMPNEW_DATA .
cp $CODEROOT/WRFV3/run/namelist.input .
cp $CODEROOT/WRFV3/run/wrfbdy_d01 .
cp $CODEROOT/WRFV3/run/wrfinput_d01 .

cam.buildexe.csh

Page 27:

A dirty case

• Follow the CCSM4 f4x5_g3x5 case

• WRFV3: 69x48x27 grid, global, timestep = 900 s, 20010101:00-00-00 - 20010106:00-00-00

• Compilation is successful, all components initialize successfully, and most of the coupling subroutines work. But the program dies in seq_domain_check_mct when checking whether the atmosphere and land grids are the same.

• The reason seems to be that the atmosphere and land grids are not the same in the WRF case. New remapping files must be generated with SCRIP.

• CCSM4 now supports a different grid for each component.

• Code directory: /wrkdir/jhe/ccsm4

• Case compilation, run and output directory: /wrkdir/jhe/case2

Page 28:

A snapshot of ccsm.log.081203-164400:

.........
plat= 48 12
nextsw_cday = 1.000000000000000
--------------------- atm_init_mct finish ------------------
(seq_mct_drv) : Initialize lnd component
8 pes participating in computation for CLM
.........
Solver options (barotropic solver)
........
Preconditioner choice: diagonal
......
eps= -438.8614196777344
data2= data1= n=

(Slide callouts: atm_init_mct finishes; lnd_init_mct finishes; ocn begins integration; the failure occurs in seq_domain_check_mct.)

Page 29:

A snapshot of lnd.log.081203-164400:

All fields on history tape 1 are grid averaged
Number of time samples on history tape 1 is 1
Output precision on history tape 1 = 2
hist_htapes_build Successfully initialized clm2 history files
------------------------------------------------------------
Successfully initialized the land model
begin initial run at: nstep= 0 year= 2001 month= 1 day= 1 seconds= 0
************************************************************
Attempting to read monthly vegetation data .....
nstep = 0 month = 1 day = 1
..........
Successfully read monthly vegetation data for ..........

Page 30:

A snapshot of ocn.log.081203-164400:

Constants used in this run:
..........
stefan_boltzmann = 5.670000000000000E-08 W/m^2/K^4
latent_heat_vapor = 2.501000000000000E+06 J/kg
latent_heat_fusion = 3.337000000000000E+09 erg/g
ocn_ref_salinity = 3.470000000000000E+01 psu
sea_ice_salinity = 4.000000000000000E+00 psu
T0_Kelvin = 2.731500000000000E+02 K
pi = 3.141592653589793E+00
End of initialization
========================================================================

ocn_init_mct: iday0 = 2 start_day= 1.

Page 31:

A snapshot of ice.log.081203-164400:

read_global 15 0 -18764.24097909558 32433.52928192394
ice mask for dynamics
read_global 15 0 0.000000000000000 1.000000000000000
Finished writing case2.cice.i.0000-01-01-00000.nc
(ice_init_mct) idate from sync clock = 20010101
(ice_init_mct) tod from sync clock = 0
(ice_init_mct) resetting idate to match sync clock
istep1: 0 idate: 20010101 sec: 0

Page 32:

Next steps

• Merge with VIC and move toward RACM.

• Generate new remapping files for atmosphere, ocean and land, then test.

• Test the restart mechanism.

Page 33:

Thank you!