I am trying to troubleshoot some other problems I am currently having with CMC and SO, and along the way I found what seems like an inconsistency. Here is my question and problem.
Why is it that I can have an output precision of 6, a minimum integrator step size of 1e-8, and an integrator error tolerance of 1e-5 in both my SO setup file and my CMC setup file, yet the two tools print results at different time steps?
For example, in SO, my output precision is 6 and the results are printed every 0.01 seconds, which was my data collection frequency. In CMC, my output precision is also 6, but my results are printed MUCH more often than this. How can I change how often the results are printed to the file in CMC?
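For reference, the relevant block of my CMC setup file looks roughly like this (I am writing the property names from memory, so the exact element names may be slightly off, but the values are the ones I described):

```xml
<CMCTool name="subject01_walk">
    <!-- Precision (significant digits) used when writing results files -->
    <output_precision>6</output_precision>
    <!-- Integrator settings: same values as in my SO setup file -->
    <maximum_integrator_step_size>1</maximum_integrator_step_size>
    <minimum_integrator_step_size>1e-8</minimum_integrator_step_size>
    <integrator_error_tolerance>1e-5</integrator_error_tolerance>
</CMCTool>
```

As far as I can tell, none of these properties directly controls how often CMC writes a row to the results files, which is what I am trying to change.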
I tried to change the minimum integrator step size to 0.01 in my CMC setup file, but two things happened.
1) The simulation failed, whereas previously it completed.
2) My error says “… at time 1.32333 for reason: Exception caught…”, which implies that the program is not taking the 0.01-second time steps I assumed I had specified.