pomekrank
Hello,
I have a typical 1D advection problem where a cold fluid flows over a flat plate. I did an energy balance including conduction, convection, and friction loss, and obtained the PDEs for the fluid and the solid. I used finite differences to solve the system for T(x, t) in both the fluid and the solid. After simplification, the explicit scheme gives a matrix of this form.
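(I can't embed the matrix here, but to show the kind of structure I mean: a minimal sketch of a generic explicit update matrix A, so that T_new = A @ T_old, for first-order upwind advection of the fluid temperature. The node count and Courant number are illustrative assumptions, not my actual values.)

```python
import numpy as np

# Sketch only: generic explicit upwind update matrix, T_new = A @ T_old.
# n and c are illustrative assumptions, not the real problem parameters.
n = 5                       # small node count so the matrix is easy to inspect
c = 0.4                     # Courant number u*dt/dx, assumed < 1

A = np.eye(n) * (1.0 - c)   # diagonal: each node keeps (1 - c) of its own value
A += np.eye(n, k=-1) * c    # sub-diagonal: each node receives c from upstream
A[0, 0] = 1.0               # inlet row: Dirichlet boundary, value held fixed

T = np.zeros(n)
T[0] = 1.0                  # step change in temperature at the inlet
for _ in range(10):
    T = A @ T               # one explicit time step per matrix multiply
```

Every row of A sums to 1 and all entries are non-negative when c < 1, which is exactly why the explicit upwind scheme stays bounded under the CFL condition.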
(see attached matrix.png)
The problem is that I want to optimise the number of nodes (and the time step) in my simulation to reduce the computation time while ensuring stability and convergence. I read about the CFL criterion, but it doesn't always seem to work in my case, even when CFL < 1. Here (advection.png) is an image of the temperature distribution at a specific node for 2 different cases. By trial and error, I found that the minimum number of nodes needed to capture the full phenomenon is around 20. However, is there an analytical way to determine this value?
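(To make the question concrete, here is a minimal sketch of the setup I'm describing: explicit upwind advection with a convective exchange term toward the plate. The velocity u, length L, exchange coefficient h, and wall temperature are illustrative assumptions; the point is that the explicit source term adds its own stability limit on dt, on top of the advective CFL condition dt <= dx/u.)

```python
import numpy as np

# Sketch with assumed values (u, L, h, T_wall are NOT from my real case).
u, L, h = 1.0, 1.0, 0.5     # velocity [m/s], plate length [m], exchange coeff [1/s]
n = 20                      # number of spatial nodes (the ~20 found by trial)
dx = L / (n - 1)
cfl = 0.9                   # target Courant number, kept below 1
dt = cfl * dx / u           # time step from the advective CFL condition

T_wall = 1.0                # hot plate temperature (normalized)
T = np.zeros(n)             # cold fluid enters at temperature 0
for _ in range(200):        # march in time toward steady state
    Tn = T.copy()
    # first-order upwind advection plus explicit convective exchange term
    T[1:] = Tn[1:] - u * dt / dx * (Tn[1:] - Tn[:-1]) + dt * h * (T_wall - Tn[1:])
    T[0] = 0.0              # inlet boundary condition

# both limits must hold for the explicit scheme to stay bounded
assert u * dt / dx <= 1.0 and h * dt < 1.0
```

Even when both limits are satisfied, too few nodes can still smear the front through numerical diffusion, which may be why CFL < 1 alone doesn't guarantee I see the full phenomenon.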
Thank you,
Steven