
C-grid crash with velocity instabilities #992

Open · anton-seaice opened this issue Nov 4, 2024 · 43 comments

@anton-seaice (Contributor)

This may well be related to #941, but I'll write it up separately for clarity.

I ran CICE6 on the C-grid, coupled to MOM6 and using data-atmosphere forcing. It's a nominal 0.25-degree resolution; we've been running with a CICE6 / MOM6 baroclinic / coupling timestep of 1350 s (possibly too long) and a shorter MOM6 barotropic timestep (675 s).

After ~15 months, CICE crashes with negative ice area. This is the crash location, on the Greenland coast.

[Screenshot: crash location on the Greenland coast]

The last instantaneous output shows clear patterning in sea ice concentration at the crash location:

[Screenshot: sea ice concentration showing patterning at the crash location]

And instabilities in the C-grid velocities. Note that the main instability shown is in uvelE and vvelN:

[Screenshots: time series of uvelE and vvelN showing the instability]

It's possible, of course, that this is not an issue with C-grid advection: our ocean model configuration is still undergoing changes. However, we are confident that this configuration wouldn't crash with the CICE B-grid. I am happy to do re-runs, provide extra plots, etc. to investigate further.

Crash log:

New mass < 0, i, j =          50          31
 Old mass =  1.584993725340139E-004
 New mass = -5.727752649206490E-006
 Net transport = -1.642271251832203E-004
 istep1, my_task, iblk, cat =       28799          80           4           2
 (print_state) negative area (ice)
 (print_state) istep1, my_task, i, j, iblk:       28799          80          50
          31           4
 (print_state) Global block:         450
 (print_state) Global i and j:        1069        1002
 (print_state) Lat, Lon (degrees):   81.6865092532964     
  -14.8643879243489     
  
 aice    1.00000000000000     
 aice0  1.110223024625157E-016
  
 n =           1
 aicen  0.000000000000000E+000
 vicen  0.000000000000000E+000
 vsnon  0.000000000000000E+000
 Tsfcn  -1.79028623325174     
  
  
 n =           2
 aicen  1.584993725340139E-004
 vicen  1.541077117507753E-004
 vsnon  2.501025765687183E-009
 hin  0.972292251300263     
 hsn  1.577940483739432E-005
 Tsfcn  -20.1079585645912     
  
  
 n =           3
 aicen  2.710599464180833E-004
 vicen  5.327318905188697E-004
 vsnon  4.277164629733349E-009
 hin   1.96536558631641     
 hsn  1.577940483739432E-005
 Tsfcn  -22.7879628146431     
  
  
 n =           4
 aicen  1.058556202804489E-002
 vicen  3.713120284791096E-002
 vsnon  4.344231959587402E-003
 hin   3.50772143694754     
 hsn  0.410392187781622     
 Tsfcn  -26.5975737336460     
  
  
 n =           5
 aicen  0.988984878653003     
 vicen   8.07225859281393     
 vsnon  0.575890443169084     
 hin   8.16216583999580     
 hsn  0.582304598987849     
 Tsfcn  -26.7320323153733     
  
 qice, cat            1  layer            1  0.000000000000000E+000
 qice, cat            1  layer            2  0.000000000000000E+000
 qice, cat            1  layer            3  0.000000000000000E+000
 qice, cat            1  layer            4  0.000000000000000E+000
  
 qice, cat            2  layer            1  -307755649.562915     
 qi/rhoi  -335611.395379406     
 qice, cat            2  layer            2  -305955239.880947     
 qi/rhoi  -333648.026042472     
 qice, cat            2  layer            3  -305976569.446496     
 qi/rhoi  -333671.286201195     
 qice, cat            2  layer            4  -305655112.966541     
 qi/rhoi  -333320.733878453     
  
 qice, cat            3  layer            1  -306509750.424470     
 qi/rhoi  -334252.726744242     
 qice, cat            3  layer            2  -305988097.504735     
 qi/rhoi  -333683.857693277     
 qice, cat            3  layer            3  -305997732.789325     
 qi/rhoi  -333694.365091957     
 qice, cat            3  layer            4  -305915232.195830     
 qi/rhoi  -333604.397160120     
  
 qice, cat            4  layer            1  -330320531.213201     
 qi/rhoi  -360218.681802837     
 qice, cat            4  layer            2  -321686608.131194     
 qi/rhoi  -350803.280404791     
 qice, cat            4  layer            3  -312374812.336550     
 qi/rhoi  -340648.650312486     
 qice, cat            4  layer            4  -298970272.633928     
 qi/rhoi  -326030.831661863     
  
 qice, cat            5  layer            1  -335702127.033382     
 qi/rhoi  -366087.379534768     
 qice, cat            5  layer            2  -324703839.344245     
 qi/rhoi  -354093.608881401     
 qice, cat            5  layer            3  -313196657.706246     
 qi/rhoi  -341544.882994815     
 qice, cat            5  layer            4  -297369967.905740     
 qi/rhoi  -324285.679286521     
  
 qice(i,j)  -4984078201.07574     
  
 qsnow, cat            2  layer            1  -110121000.000000     
 qs/rhos  -333700.000000000     
 Tsnow  0.000000000000000E+000
  
 qsnow, cat            3  layer            1  -110121000.000000     
 qs/rhos  -333700.000000000     
 Tsnow  0.000000000000000E+000
  
 qsnow, cat            4  layer            1  -125310280.183219     
 qs/rhos  -379728.121767329     
 Tsnow  -21.7393727617777     
  
 qsnow, cat            5  layer            1  -126251426.824129     
 qs/rhos  -382580.081285240     
 Tsnow  -23.0863712635802     
  
 qsnow(i,j)  -471803707.007348     
  
 sice, cat            1  layer            1  0.000000000000000E+000
 sice, cat            1  layer            2  0.000000000000000E+000
 sice, cat            1  layer            3  0.000000000000000E+000
 sice, cat            1  layer            4  0.000000000000000E+000
 sice, cat            2  layer            1  0.649201856400313     
 sice, cat            2  layer            2   2.35458111192744     
 sice, cat            2  layer            3   3.03109206979846     
 sice, cat            2  layer            4   3.18929777221700     
 sice, cat            3  layer            1  0.649201856400314     
 sice, cat            3  layer            2   2.35458111192744     
 sice, cat            3  layer            3   3.03109206979846     
 sice, cat            3  layer            4   3.18929777221700     
 sice, cat            4  layer            1  0.649201856400311     
 sice, cat            4  layer            2   2.35458111192744     
 sice, cat            4  layer            3   3.03109206979848     
 sice, cat            4  layer            4   3.18929777221700     
 sice, cat            5  layer            1  0.649201856400316     
 sice, cat            5  layer            2   2.35458111192745     
 sice, cat            5  layer            3   3.03109206979847     
 sice, cat            5  layer            4   3.18929777221700     
  
 uvel(i,j)  0.421109861482645     
 vvel(i,j)  0.626738511376921     
 uvelE(i,j)   2.01817027718648     
 uvelN(i,j)  0.109513019152795     
  
 atm states and fluxes
             uatm    =    11.4565684617028     
             vatm    =   -5.35149824880375     
             potT    =    247.552100916471     
             Tair    =    247.552100916471     
             Qa      =   2.939509003865902E-004
             rhoa    =    1.39554159484570     
             swvdr   =   0.000000000000000E+000
             swvdf   =   0.000000000000000E+000
             swidr   =   0.000000000000000E+000
             swidf   =   0.000000000000000E+000
             flw     =    173.710396449191     
             frain   =   6.817549163993367E-009
             fsnow   =   3.857187849140834E-006
  
 ocn states and fluxes
             frzmlt  =    1130.89758885969     
             sst     =   -1.78566706771846     
             sss     =    32.9096734053628     
             Tf      =   -1.79028623325174     
             uocn    =   7.000659626299037E-002
             vocn    =   4.925604188204023E-002
             strtltxU=  -7.371310560342537E-003
             strtltyU=   3.605963438896476E-002
  
 srf states and fluxes
             Tref    =    247.355015644017     
             Qref    =   3.016600190419180E-004
             Uref    =    12.5024800827170     
             fsens   =    33.0991790281445     
             flat    =   -3.38962097696834     
             evap    =  -1.175569424330643E-006
             flwout  =   -207.304897334251     
  
  
 (abort_ice)ABORTED: 
 (abort_ice) error = (diagnostic_abort)ERROR: negative area (ice)
 ERROR: (abort_ice)(diagnostic_abort)ERROR: negative area (ice)

(Built CICE using 12dd204; I can't see any commits since then that would impact this.)

@phil-blain (Member)

Hi Anton!

@JFLemieux73 is not at the office this week, but I think his first question would be: does it also crash with upwind advection?

@anton-seaice (Contributor, Author)

Thanks Phil - I will try that, and I will also make some figures of spatial plots of uvelE/vvelN similar to #976 (comment)

@anton-seaice (Contributor, Author)

Neither re-running with upwind advection nor re-running with a 900 s timestep crashed the model. Neither is a workable solution for us, although I could experiment with timesteps some more if there is nothing else strange going on.

@JFLemieux73 (Contributor)

Hi Anton, I have the impression this is an instability related to the coupling. Can you restart just before the oscillations develop and show plots of uvel, vvel as above with a smaller time step? I am adding @dupontf to the conversation. Fred, any idea?

@dupontf commented Nov 12, 2024

Hi Anton, what is your ice-ocean drag? And when you said above

re-running with a 900s timestep didn't crash

I suppose you mean that both the coupling (CICE) and MOM time-steps were 900s?

I am trying to eliminate a coupling instability: which time-level is passed to CICE from MOM?

@anton-seaice (Contributor, Author)

Thanks both

Here is the plot from the 900 s timestep (the coupling, CICE and MOM all run at this 900 s timestep).

[Screenshot: uvelE/vvelN time series with the 900 s timestep]

You can see the instability, but it resolves; it doesn't show up in the sea-ice concentration. Maybe this is normal?

@dupontf commented Nov 14, 2024

Thanks Anton. This could still correspond to a coupling instability to me, as 900 s corresponds to a reduction of the coupling timestep. So again: which drag value, and which time level, are exchanged between CICE and MOM?

I have found in the past, for instance with NEMO and LIM2, that giving the ocean current at time level n to CICE for calculating the ice velocity at time level n+1, with the associated stress given back to NEMO for computing the current at time level n+1 (and with leapfrog time-stepping), yields an unstable situation akin to $$u^{n+1}=u^{n-1}+C_d (u_{ice}-u^n)$$
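A toy iteration makes the growth obvious. This is only an illustrative sketch of the recurrence above (the drag coefficient and velocities are made-up numbers, not CICE/MOM values):

```python
# Iterate u^{n+1} = u^{n-1} + C_d*(u_ice - u^n): a drag term evaluated
# with a one-step lag under leapfrog time-stepping. Illustrative only.
Cd = 0.2                # nondimensional drag coefficient (made up)
u_ice = 0.1             # constant ice velocity (made up)
u_prev = u = u_ice      # start at the steady state u = u_ice
u += 0.01               # small perturbation

for n in range(20):
    u_prev, u = u, u_prev + Cd * (u_ice - u)
    print(f"step {n:2d}: u = {u: .4f}")

# The perturbation alternates in sign and grows each step: one root of
# the characteristic equation r**2 + Cd*r - 1 = 0 has |r| > 1 for any
# Cd > 0, so this form of lagged coupling is unconditionally unstable.
```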

@dupontf commented Nov 15, 2024

  • During the CICE virtual meeting, it was determined that MOM is using RK2 time-stepping, which should not be as sensitive as leapfrog to the time level exchanged. So a coupling issue is less likely.
  • Elizabeth raised the possibility of a Coriolis mode. This would require more investigation.
  • D. Bailey clarified that the momentum exchange is done on T-points through the coupler.
  • A. Roberts suggested plotting the phase lag between ice and ocean.

@JFLemieux73 (Contributor)

Hi @anton-seaice,

It would be good to have an idea if this comes from the remapping. Could you please restart just before the oscillations develop and use upwind instead? Use your usual time steps: coupling timestep of 1350s and MOM6 barotropic timestep of 675s. Do we still see the oscillations in the velocity (please show plots as above)?

@JFLemieux73 (Contributor)

About Coriolis...I also coded a semi-implicit approach:

https://github.com/JFLemieux73/CICE/tree/semi_imp_Coriolis

This is not up to date with the current code; if you want to try it, you might have to cherry-pick some parts in ice_dyn_evp and ice_dyn_shared.

@eclare108213 (Contributor)

A few thoughts from me:

What we know (anything else?):

  1. Configuration
    ~0.25 deg resolution
    1350 s CICE6 / MOM6 Baroclinic / Coupling timestep
    675 s MOM6 barotropic timestep
  2. C-grid fails, B-grid does not
    Oscillations in space and time lead to negative area and mass
    ~15 months into the simulation
    Location on the northern Greenland coast
  3. Using upwind instead of remapping does not fail, but it’s very diffusive so that does not tell us much.
  4. Using a 900 s time step for CICE, MOM and their coupling does not fail but still shows some instability.

Some suggestions:

  1. To debug, it’s helpful to have a restart file immediately before the oscillations develop.
  2. In the failing models, is the momentum exchange done at cell centers or at the C-grid points? If it’s at the C-grid points, does it still fail when coupling at cell centers?
  3. Could this be coming from residual elastic waves? Crank up EVP subcycling to ~1200 or more.
  4. To further understand time step dependence of coupling versus component model time steps, run CICE and MOM at 900 s (or smaller, to eliminate potential instabilities coming from them) and couple less often, e.g. 1350 s. Andrew's coupling diagnostics could also help here.
  5. Try JF’s semi-implicit Coriolis approach.

@eclare108213 (Contributor)

Considering the location, what happens if you turn on landfast ice? Maybe the increased tensile stress would help?

If none of those things provides any insight, start simplifying your configuration as much as possible while maintaining the failure mode, e.g. turn off the thermodynamics; turn off the ice-ocean coupling, or individual terms associated with the dynamics; turn off advection. Try to set up a similar configuration in a box model (grid size, thickness, velocity magnitude and angle with the coast, etc.).

@anton-seaice (Contributor, Author)

Thank you all for the suggestions!

Re: upwind advection

It would be good to have an idea if this comes from the remapping. Could you please restart just before the oscillations develop and use upwind instead? Use your usual time steps: coupling timestep of 1350s and MOM6 barotropic timestep of 675s. Do we still see the oscillations in the velocity (please show plots as above)?

These are the plots from the experiment with upwind advection:

[Screenshots: uvelE and vvelN time series with upwind advection]

This appears to somewhat rule out the advection scheme, as the instability is still there with upwind advection.

@JFLemieux73 (Contributor)

Good! This is very useful.

@daveh150 (Contributor)

I made a new test after updating to CICE 6.6.0. All exchange variables are on the T grid. CICE on the B-grid runs a 384-hour forecast to completion. CICE on the C-grid crashes after 56 hours with the error below. I noticed the print state is for open water, not ice. The location is in Hudson Bay. Ice is melting, as the date is July 13, 2021.

New mass < 0, i, j = 6 3032
Old mass = 0.999402585760957
New mass = -7.225630273900660E-002
Net transport = -1.07165888849996
(print_state) negative area (open
(print_state) istep1, my_task, i, j, iblk: 7552693 426 6
3032 1
(print_state) Global block: 427
(print_state) Global i and j: 2561 3031
(print_state) Lat, Lon (degrees): 58.5415201362827
-89.4008593032007

aice 5.974142390434661E-004
aice0 0.999402585760957

A few notes:

  • Our global 1/12 degree grid.
  • Timestep = 225 seconds.
  • 120 subcycles (gasp, I know! Runtime is important for operations.)
  • Landfast ice is turned on.
  • Model coupling frequency is 3600 seconds.
  • Ocean and ice grids identical.
  • Ocean/ATM/ICE data passed on T grid for all cases.

I'd like to start making the ice phase plots mentioned above. Could I get help on what data to look at, and where/when I should be writing what ice information? Thanks for the help!

@apcraig (Contributor) commented Nov 25, 2024

So you are coupling every 16 ice timesteps? I think most other coupled systems couple the ice model every timestep because of averaging challenges when fields vary in area. Could the time averaging operation you're using for coupling be generating errors in the C-grid coupling that don't exist when running on the B-grid?
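As a toy illustration of the averaging issue (arbitrary numbers, not anyone's coupler code):

```python
# When ice area a and stress tau both vary across the ice steps inside
# one coupling window, the mean of the product is not the product of
# the means, so averaging them separately loses conservation.
a   = [0.2, 0.5, 0.9, 1.0]   # ice area fraction over 4 ice steps
tau = [0.1, 0.4, 0.8, 1.0]   # ice-ocean stress (arbitrary units)

mean = lambda x: sum(x) / len(x)
print(mean([ai * ti for ai, ti in zip(a, tau)]))   # <a*tau>   = 0.485
print(mean(a) * mean(tau))                         # <a>*<tau> = 0.37375
```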

@anton-seaice (Contributor, Author)

@proteanplanet - Did you have a paper on how to examine the phase / coupling diagnostics?

@daveh150 (Contributor)

Hi Tony. Sorry if I am mistaken or misunderstand, but I am not aware of anything like time-averaging in our setup. At the coupling frequency (one hour), instantaneous data are exchanged; that data is used for the next forecast hour until data is exchanged again. I have not tried a higher coupling frequency; it would be interesting to see the impact on runtime if we coupled more often than once an hour.

@apcraig (Contributor) commented Nov 25, 2024

Thanks @daveh150. So you are coupling instantaneous data and not conserving; that's clearer now. I guess you are resolving the diurnal cycle and inertial period pretty well, coupling once per hour. Maybe the timestepping/coupling is not playing a role.

@proteanplanet (Contributor)

@proteanplanet - Did you have a paper on how to examine the phase / coupling diagnostics?

Hello @anton-seaice and @daveh150. I sent an email to Anton suggesting we meet online to go over your coupling configuration, since there are a number of things that aren't clear to me from the configuration described here. It would be easier to go over this in person so you can explain how your model is configured. I invited @eclare108213, and @apcraig may also like to listen in.

@anton-seaice (Contributor, Author)

For anyone playing along at home: daveh's setup and ours (ACCESS) are different. Yes, we both use CICE6, but our coupling is different (and I'm not sure about the ocean).

  2. C-grid fails, B-grid does not

As noted, the B-grid runs. This is the B-grid velocity from the same time that the C-grid version crashes; the sharp spike might be odd?

[Screenshot: B-grid velocity time series at the crash time]

  4. Using a 900 s time step for CICE, MOM and their coupling does not fail but still shows some instability.

These are the plots in #992 (comment)

Some suggestions ...
2. In the failing models, is the momentum exchange done at cell centers or at the C-grid
points? If it’s at the C-grid points, does it still fail when coupling at cell centers?

The exchange is done at the cell centers. It's not something that is easy to change at this point.

  3. Could this be coming from residual elastic waves? Crank up EVP subcycling to ~1200 or more.

I ran with ndte=1200 and the results look basically the same, with large velocity instabilities, except that the model recovers and continues.

  4. To further understand time step dependence of coupling versus component model time steps, run CICE and MOM at 900 s (or smaller, to eliminate potential instabilities coming from them) and couple less often, e.g. 1350 s. Andrew's coupling diagnostics could also help here.

It would take some work to do this exactly as described, although I think I achieved something similar:

I set the thermodynamic timesteps in MOM and CICE and the coupling timestep to 2700 s, the MOM baroclinic timestep to 1350 s, and ndtd = 2 in CICE. I thought this change would make the model less stable; however, the results look almost normal:

[Screenshots: velocities with the 2700 s coupling timestep and ndtd = 2]

  5. Try JF's semi-implicit Coriolis approach.

Considering the location, what happens if you turn on landfast ice? Maybe the increased tensile stress would help?

This is possible; we are currently using Ktens = 0, and the basal stress parameterisation is off.

If none of those things provide any insight, start simplifying your configuration as much as possible while maintaining the failure mode. E.g. Turn off the thermodynamics. Turn off the ice-ocean coupling, or individual terms associated with the dynamics. Turn off advection. Try to set up a similar configuration in a box model (grid size, thickness, velocity magnitude and angle with the coast, etc)....

There are no instabilities if run with advection turned off (ktransport = -1). Is it worth trying to turn the other processes on/off?

@proteanplanet (Contributor)

Hi @anton-seaice. There's a bunch of things potentially going on here that aren't related to anything mentioned on this thread. I can talk now if you are available. I've emailed a WebEx link.

@dupontf commented Nov 26, 2024

Hi guys! Back from Europe, and just to add to the noise: I have spoken to the SI3 guys in Paris, and they have had the explicit Coriolis treatment in their C-grid implementation for about 10 years (i.e., since the LIM3 period) with no adverse reports.

@daveh150 (Contributor)

Sorry if I missed the discussion on coupling @proteanplanet suggested. I certainly would be interested in learning from y'all about your coupling setup and trying to correct/improve ours if another opportunity arises.

@daveh150 (Contributor)

@anton-seaice Thank you for posting your results. Just to clarify, did your run with ndtd=2 run to completion on the C-grid? From what I can tell, that halves the dynamic timestep. Did you also have ndte=1200 in that test? Also, if I understood correctly, the model did not crash with ndte=1200, though you still saw spikes in velocity? Thanks!

@JFLemieux73 (Contributor)

Thanks @proteanplanet for helping.

I would guess the problem is related to the ice-ocean stress. @anton-seaice, I would like to suggest an additional test to verify whether the rheology is involved or not. Could you please use your default setup (and time steps) and set Pstar=0 (if kstrength=0) or Cf=0 (kstrength=1)? Doing this sets the rheology term to zero. Please restart just before the instability and plot the velocities as above.

@JFLemieux73 (Contributor)

I see two potential issues related to the ice-ocean stress:

  1. our treatment at the ice edge (uvelE=vvelN=0 instead of freedrift).
  2. at steady-state when aice is small (and vice) the momentum balance in CICE is aice*tau_air = aice*tau_io. tau_air is the air stress and tau_io is the ocean stress on the ice. We could have a problem (and large velocities) if aice on the left is not equal to aice on the right.

For point 2), @daveh150, @dabail10 and @anton-seaice, could you please confirm in your coupled setup that aice on the left is equal to aice on the right? In other words, are they at the same time level?
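To spell out point 2 with the standard quadratic drag law (a sketch; signs and constants may differ from the code): the reduced steady-state balance is

$$a\,\tau_{air} = a\,\tau_{io}, \qquad \tau_{io} = \rho_w C_w \left| \mathbf{u}_{ocn} - \mathbf{u}_{ice} \right| \left( \mathbf{u}_{ocn} - \mathbf{u}_{ice} \right)$$

If the aice weighting the air stress is taken at a different time level than the aice weighting the ocean stress, the two sides no longer cancel, and the residual must be absorbed by the relative velocity, i.e. by a large ice velocity.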

@daveh150 (Contributor)

@JFLemieux73 sorry I need some hand-holding here... can you point out where that is in the code? Please forgive my inexperience in finding where the momentum balance is. I am missing where the stresses are multiplied by aice. Thanks!

@anton-seaice (Contributor, Author)

@anton-seaice Thank you for posting your results. Just to clarify, did your run with ndtd=2 run to completion on the C-grid? From what I can tell, that halves the dynamic timestep. Did you also have ndte=1200 in that test?

Elizabeth suggested lengthening the coupling timestep; however, in our set-up the coupling & model timesteps are linked. So instead I doubled the normal (thermodynamic) timesteps and then set ndtd=2 (and similar in the ocean). This was with the default ndte (120).

Also, if I understood correctly, the model did not crash with ndte=1200, though you still saw spikes in velocity? Thanks!

Yes, correct: ice velocities were unstable/oscillating but recovered (probably due to the atmospheric wind-stress forcing changing).

@anton-seaice (Contributor, Author)

Thanks again @JFLemieux73

I see two potential issues related to the ice-ocean stress:

  1. our treatment at the ice edge (uvelE=vvelN=0 instead of freedrift).

I am very suspicious of this but can't turn it into anything certain.

This is uvelE just before the instability. The cell which fails is adjacent (to the left) to a cell with 0 velocity. I guess the cell with 0 velocity (grey) is sensible enough, as the motion (positive to the right) is pushing the ice onto a land cell (white).

[Screenshot: map of uvelE just before the instability]

My feeling is this is worth fixing if possible even if unrelated to this instability.

  2. at steady-state when aice is small (and vice) the momentum balance in CICE is aice*tau_air = aice*tau_io. tau_air is the air stress and tau_io is the ocean stress on the ice. We could have a problem (and large velocities) if aice on the left is not equal to aice on the right.

I don't quite follow but before the crash aice is close to 1.

For point 2), @daveh150, @dabail10 and @anton-seaice, could you please confirm in your coupled setup that aice on the left is equal to aice on the right? In other words, are they at the same time level?

We use relative stresses, so I assume both tau_air and tau_io are calculated from wind/ocean/ice velocities from the previous timestep. We use a data atmosphere, so I would have to double-check for the wind.

I would guess the problem is related to the ice-ocean stress. @anton-seaice, I would like to suggest an additional test to verify whether the rheology is involved or not. Could you please use your default setup (and time steps) and set Pstar=0 (if kstrength=0) or Cf=0 (kstrength=1)? Doing this sets the rheology term to zero. Please restart just before the instability and plot the velocities as above.

Results are completely smooth with Cf=0:

[Screenshots: uvelE and vvelN time series with Cf=0]

@JFLemieux73 (Contributor)

Thanks @anton-seaice. The ice is compact, so I don't think my points above (ice edge velocity and aice in the water and air stresses) are good candidates. Can you try your default setup (and time steps) with kstrength=0? (Don't set Pstar to zero.)
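For reference, kstrength=0 is the Hibler (1979) strength (standard form, with the namelist constants):

$$P = P^{*}\, h\, e^{-C^{*}(1-A)}$$

where $h$ is the mean ice thickness and $A$ the ice concentration, so Pstar and Cstar control it directly. kstrength=1 (Rothrock 1975) instead builds the strength from the potential energy change of ridging, scaled by Cf.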

@JFLemieux73 (Contributor)

@eclare108213 I am wondering if it is not related to the stability issue described in

Lipscomb, W. H., E. C. Hunke, W. Maslowski, and J. Jakacki (2007), Ridging, strength, and stability in high-resolution sea ice models, J. Geophys. Res., 112, C03S91, doi:10.1029/2005JC003355.

Maybe the C-grid is more sensitive to this problem...

@eclare108213 (Contributor)

@JFLemieux73 that's an excellent thought. It would make sense, too, that @anton-seaice's runs with 2 dynamics iterations per thermo time step would be more stable. Using the Hibler ice strength formulation should also be a good indicator, as you suggest. If this is the issue, then we can focus on the ridging parameters and if necessary, the participation / redistribution functions.

@JFLemieux73 (Contributor)

Thanks @eclare108213. @anton-seaice, if this is the problem, it is because the ice strength is explicit when solving the momentum equation. It should be less sensitive with the Hibler formulation. Due to the time difference, I would like to ask right away for a few more experiments. Could we see time series of the velocity (as above) for kstrength=0 with

  1. Pstar = 2.75e4, Cstar = 20 (as I requested above)
  2. Pstar = 1e4, Cstar = 20
  3. Pstar = 2.75e4, Cstar = 10

Thanks!

@JFLemieux73 (Contributor)

@anton-seaice, is it in a region with some of the smallest grid cells? What are roughly the dimensions of the grid cells (in km) where it crashes?

@anton-seaice (Contributor, Author)

Thank you for all the help on this @JFLemieux73. The cells are small but not the smallest: the X dimension is ~7 km and the Y dimension is ~11 km. The instability grows to the left / negative X direction, which is the small dimension of the grid in this area.

The crash point shows excessive ridging prior to the crash in the original run, and the internal ice-stress history output also looks erroneous (as always, hard to know what is cause vs effect), which I think is consistent with the suggestion of an instability related to ridging.

From the original run: blue is atmosphere-ice stress, orange is ocean-ice stress, green is internal stress, all in the x direction at the C-grid u point. Note that the internal stress field is larger in magnitude than the surface stresses:

[Plot: atmosphere-ice (blue), ocean-ice (orange) and internal (green) stresses at the crash point]

With the Hibler ice strength formulation the model runs; however, the velocities are greatly reduced:

[Plot: velocities with the Hibler strength formulation]

@JFLemieux73 (Contributor)

Thanks a lot @anton-seaice. It really fits with what is described in the paper cited above (Lipscomb et al. 2007). I also have a paper about this if you are interested:

Lemieux, J.-F., D. A. Knoll, M. Losch, and C. Girard (2014), A second-order accurate in time IMplicit-EXplicit (IMEX) integration scheme for sea ice dynamics, J. Comput. Phys., 263, 375–392, doi:10.1016/j.jcp.2014.01.010.

No worries if the velocities are small at this location with the Hibler formulation. With Pstar=2.75e4 the ice is probably strong enough not to fail in compression (given the current ice and wind conditions). The fact that the blue and green curves are the same is because aice ~ 1.

As @eclare108213 wrote, we might have to adjust the ridging parameters to fix the problem with kstrength=1.

For the moment I would suggest using kstrength=0. Note that this is what we use at ECCC in all our operational systems. With kstrength=1 we get unrealistic thicknesses in tidally active regions (it might be the same problem that is described here...). Thicknesses are more realistic with kstrength=0. We use Pstar=2.25e4.

@daveh150 (Contributor) commented Dec 2, 2024

Thank you @anton-seaice for the clarification.

Another data point: before the holiday last week I ran a test with ndtd=2. I kept all our other parameters the same, including ndte=120 and our standard dt=225. This test ran successfully for 16 days. This seems to agree with @anton-seaice's results.

I'll make another test with ndtd=1 and kstrength=0 to see if that works too. Thank you also @JFLemieux73 for the info on using Pstar=2.25e4 with kstrength=0 and the issue with thick ice in tidally active regions. I haven't looked at that in our system but will do so now!

@anton-seaice (Contributor, Author)

Hi @daveh150 - is your crash occurring at the coast with high winds (>10 m/s) leading to high ridging / internal ice strength values?

@anton-seaice (Contributor, Author)

Apologies for the very long post. The TL;DR is, approximately: ridging when using the C-grid kicks off the instability, which can be worsened by coupling with the ocean. Prior to the instability, the internal ice stress is more sensitive to high wind speeds when using the B-grid, whereas the ice velocities are more sensitive when using the C-grid.

Thanks to @proteanplanet, @eclare108213 and @apcraig for helping identify that we have a two-timestep lag in the ocn-ice coupling. This is related to us running the ice & ocn models concurrently.
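A sketch of where the two steps come from (scheduling logic only, not our actual coupler):

```python
# Which time level of the ice stress is embedded in the ocean state the
# ice model sees? Compare concurrent vs. serial (ocean-first) execution.
def trace(mode, nsteps=3):
    ocn_used_tau = {0: 0}  # ice-stress time level each ocean state used
    for n in range(1, nsteps + 1):
        ocn_used_tau[n] = n - 1      # ocean always steps with tau^{n-1}
        if mode == "concurrent":
            ice_used_ocn = n - 1     # ice steps with last exchanged ocn
        else:                        # serial: ocean runs first,
            ice_used_ocn = n         # so ice sees this step's fresh ocn
    lag = nsteps - ocn_used_tau[ice_used_ocn]
    print(f"{mode}: ice at step {nsteps} feels its own stress "
          f"from {lag} step(s) ago")

trace("concurrent")  # -> 2 step(s) ago
trace("serial")      # -> 1 step(s) ago
```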

I ran some more tests to try and determine the impact of this.

  1. Original config with 2 CICE dynamics timesteps per CICE timestep (ndtd = 2)
  2. Coupling/CICE/MOM barotropic timestep halved (i.e. 675 s)
  3. Original config with the ice → ocn coupling deactivated

For all cases, there are a few timesteps where the ice internal stress term oscillates; however, it doesn't grow unstably. There is minor patterning in the strint term in the surrounding cells. The ridging term in these results still looks implausible (from run case 3):

[Plot: ridging term from run case 3]

  4. Run ocean first and then ice.
    I also ran the ocn model first, then the ice model (serially), but otherwise with the original config. This reduces the lag in the stress terms to one timestep. The results here are quite interesting: the ridging rates and internal stress term are very high, but the model recovers.

These are two figures from the ice-ocn serial run:

[Plots: ridging rates and internal stress from the ice-ocn serial run]

These results are a bit counter-intuitive. Both reducing the timestep and running without feedbacks from the ocean model give results which show some instability that recovers. The initial perturbation for the instability comes from the ridging term in CICE.

I went back and looked at the ridging term in the original B-grid runs.

This configuration uses a data atmosphere, and all the coupling is on the A-grid/center/tracer points (i.e. grid_atm = A), with CICE interpolating to the B- or C-grid points. The differences between the B-grid and C-grid (marked with E/N) atmosphere stresses are small:

[Plot: B-grid vs C-grid (E/N) atmosphere stresses]

However, the ridging rate in the C-grid run is much, much larger.

In the C-grid results, prior to the instability, the velocity is higher and the internal stress is lower before the ridging occurs, and then the ridging term appears to cause the instability.

[Plot: C-grid stresses, velocity and ridging rate prior to the instability]

In the B-grid results, before the time of the instability (i.e. before 16:00), the internal stress term looks more sensitive to changes in strair and the velocity is lower. Ridging then occurs in a more plausible/stable way.

[Plot: B-grid stresses, velocity and ridging rate over the same period]

@JFLemieux73 (Contributor)

Hi @anton-seaice, just to clarify...are these results above all with kstrength=1 (Rothrock)?

@anton-seaice (Contributor, Author)

Yes, with Rothrock and the default Cf value (17).

@JFLemieux73 (Contributor)

Hi @anton-seaice,

Following our discussion yesterday, I think you should have a look at section 4 in Lipscomb et al. 2007. You could set up a similar idealized experiment. Let me know if you want to proceed, and I could help set up the experiment. That would be a nice way to see if the C-grid is more sensitive than the B-grid to this instability issue.
