
phaseField FAQs

Shiva Rudraraju edited this page Sep 3, 2015 · 4 revisions

FAQs

Question: Why do I get an error when running the Cahn-Hilliard problem in parallel in debug mode?

The Cahn-Hilliard problem sets random initial conditions for the composition field, generated using std::rand. In parallel, each processor generates its initial conditions independently, so the values at points on the processor sub-domain boundaries (ghost nodes) may differ between the processors that share them. In debug mode, as a precautionary measure, the underlying deal.II library checks that all sharing processors set equal values on the ghost nodes; the mismatched random values cause an Assert() to fail and throw an error. In release mode this check is skipped, so the error is not reported.

So, long story short: this is completely expected with random initial conditions. It should only be an issue in parallel debug runs; serial runs and parallel release (optimized) runs are fine.

Question: Why is my code slow? Or why is debug mode slow?

Are you in debug mode? Debug mode is only for development. Run in release mode (make release; make run).

Please note that the default setting when you "make" your program is debug compilation. For production jobs, call "make release" to switch to optimized mode.

Question: I have both pvtu and vtu output files...which do I open in VisIt/ParaView?

If your program generates *.pvtu files, open the pvtu files in the visualization software instead of the vtu files. A pvtu file is a small text file containing pointers to the vtu files written by each processor in a parallel job.
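As an illustration, a pvtu file is an XML index in the VTK parallel file format whose Piece entries name the per-processor vtu pieces (the file names below are hypothetical, and a real file also declares the point data fields; this is a trimmed sketch):

```xml
<?xml version="1.0"?>
<VTKFile type="PUnstructuredGrid" version="0.1">
  <PUnstructuredGrid GhostLevel="0">
    <!-- one Piece entry per processor's output file -->
    <Piece Source="solution-0000.0.vtu"/>
    <Piece Source="solution-0000.1.vtu"/>
  </PUnstructuredGrid>
</VTKFile>
```

Opening the pvtu file lets the visualization software load all the pieces and stitch the full domain together.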

Question: Why does the visualization package show only a small portion of my problem domain?

This is expected if your pvtu files point to vtu files left over from a previous job that was run with a different number of processors. Delete the existing pvtu and vtu files and rerun your job.