Run a diffusion problem to equilibrium

Hi,

I am working on a diffusion problem with a small gradient. The anticipated equilibrium time is ~30,000 years, but the longest simulated time I can reach so far is ~1,000 years. I have tried several methods to extend the simulated time, but none of them seem to work well.
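
For context, a rough sanity check of that time scale using the usual diffusion scaling t_eq ≈ L²/D (the L and D values below are placeholders, not my actual model parameters):

```python
# Back-of-envelope estimate of the equilibrium time for pure diffusion,
# using t_eq ~ L^2 / D. L and D are hypothetical placeholder values.

SECONDS_PER_YEAR = 3.156e7

L = 0.3        # domain length, m (placeholder)
D = 1.0e-13    # effective diffusion coefficient, m^2/s (placeholder)

t_eq_years = (L**2 / D) / SECONDS_PER_YEAR
print(f"characteristic diffusion time: {t_eq_years:.3g} years")  # ~2.9e4 years
```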

So far I have tried the following (see the time-stepping sketch after this list):

1. Tightening the convergence criterion (RE1). This shortened every time step, but the total simulated time did not increase.
2. Adjusting the initial time step size, which only caused a small fluctuation in the total simulated time.
3. Removing the limit on DELTMX, which led to a convergence failure: no steady state was reached. Sometimes the convergence issue occurred even with DELTMX limited, as shown in the attached figure.
4. Coarsening the grid from 1 mm to 4 mm; anything coarser is not really acceptable for a diffusion problem. The result was slightly better, but still far from equilibrium.
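
My current mental model of the time stepping, as a minimal sketch: I am assuming TOUGH-style behavior where the step size doubles after each converged step up to DELTMX, and a step-count limit (e.g. MCYC) ends the run. All numbers below are placeholders:

```python
# Sketch of geometric time-step growth with a cap, loosely modeled on
# TOUGH-style stepping (dt doubles after each converged step, capped at
# DELTMX). All values are placeholders; max_steps stands in for whatever
# step-count limit (e.g. MCYC) ends the run.

SECONDS_PER_YEAR = 3.156e7

dt0 = 1.0e4          # initial time step, s (placeholder)
deltmx = 3.156e9     # DELTMX cap on dt, ~100 years in s (placeholder)
max_steps = 1000     # step-count limit, e.g. MCYC (placeholder)

t, dt = 0.0, dt0
for _ in range(max_steps):
    t += dt
    dt = min(dt * 2.0, deltmx)  # double on convergence, never exceed DELTMX

print(f"simulated time after {max_steps} steps: {t / SECONDS_PER_YEAR:.3g} years")
```

If this picture is right, then once dt saturates at DELTMX the reachable time is roughly (step limit) x DELTMX, which would explain why RE1 and the initial step size barely change the total simulated time.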

Any advice on resolving this issue would be much appreciated. Thanks in advance.

Best,

Yaxin
