Is cmake still broken for linux svsolver?

Post by Ryan Pewowaruk » Thu Feb 08, 2018 1:00 pm

Hello,

I am trying to install svSolver on a cluster running Scientific Linux (similar to CentOS), and I was concerned by this note in the GitHub documentation.

NOTE: The CMake system is currently broken for svSolver and is under revision.

Is CMake still broken for svSolver, and if so, how should I go about installing it?

Thanks,
Ryan

Re: Is cmake still broken for linux svsolver?

Post by Justin Tran » Thu Feb 08, 2018 3:11 pm

Hi Ryan,

Thanks for your question! We have separated svSolver from the main SimVascular code to make compiling it on clusters easier, so you will not have to go through the CMake system. Below are instructions that use the Makefile system for compiling svSolver on clusters (a consolidated sketch of the commands follows the list):

1. Download the svsolver build straight from git using the command: "git clone https://github.com/SimVascular/svSolver.git svsolver"
2. Change into the build directory with the command: "cd svsolver/BuildWithMake"
3. Download the VTK binaries using the command: "./get-vtk-binaries.sh centos_6"
4. Copy an overrides file to specify cluster-specific settings: "cp SampleOverrides/centos_6/global_overrides.mk ." <--- notice the period at the end here!
5. IMPORTANT: Open global_overrides.mk with your favorite text editor (e.g., vim global_overrides.mk) and make the following changes:
SV_USE_DUMMY_MPI=0
SV_USE_MPICH=1
6. Compile the code using the command: "make"
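
Putting those steps together, a minimal end-to-end sketch looks like this (assuming a CentOS 6-compatible cluster with MPICH; swap the centos_6 target and overrides directory to match your platform):

Code: Select all

git clone https://github.com/SimVascular/svSolver.git svsolver
cd svsolver/BuildWithMake
./get-vtk-binaries.sh centos_6                    # pick the option matching your system
cp SampleOverrides/centos_6/global_overrides.mk . # trailing period = copy into the current directory
# edit global_overrides.mk and set:
#   SV_USE_DUMMY_MPI=0
#   SV_USE_MPICH=1
make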

Take care when following steps 3-4, as these might differ for your cluster. For step 3, you will have to download different VTK binaries if your system is ubuntu, macosx, or msvc (if you open get-vtk-binaries.sh in a text editor, you will see the different options). Likewise, in step 4 you will have to copy over the overrides file appropriate for your system. Step 5 will change depending on the MPI implementation on your cluster (either MPICH or OpenMPI).

Hope that helps! Let us know if you have trouble compiling svSolver on your cluster.

Re: Is cmake still broken for linux svsolver?

Post by Ryan Pewowaruk » Fri Feb 09, 2018 11:55 am

Thanks Justin!!

In the svsolver/BuildWithMake/Bin folder I now have

svpost.exe
svpre.exe
svsolver-openmpi.exe

along with the corresponding gcc-gfortran.exe files.

Is there anything else I can do to verify that I've installed svsolver correctly?

Best,
Ryan

Re: Is cmake still broken for linux svsolver?

Post by Justin Tran » Fri Feb 09, 2018 3:23 pm

Hi Ryan,

That's great! Those are all the executables expected after a successful compilation. I would just recommend running a simulation on your cluster with those executables, then examining the results to see if they make sense. If you have a verified simulation that you ran on another system, you can run that same simulation on your cluster and check that you get the same results.

I would also check how the code scales with an increasing number of processors. We have tested svSolver to have good scaling on the clusters that we run on, so you should see your run time approximately cut in half when you run with twice as many processors (assuming you have a large enough mesh so each processor has enough work to do). If you are experiencing poor scaling, please let us know and we can help you out!
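
If you want to put numbers on that, one rough approach is to time the same case at a few core counts and compare the elapsed times; a sketch (the core counts and executable path are placeholders, and each run assumes the solver input files are in the working directory):

Code: Select all

# time the same simulation at increasing core counts; with good scaling,
# each doubling of cores should roughly halve the elapsed time
for n in 4 8 16 32; do
    echo "=== $n processes ==="
    ( time mpiexec -n $n /path/to/svsolver-openmpi.exe ) 2>&1 | tail -n 3
done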

Re: Is cmake still broken for linux svsolver?

Post by Ryan Pewowaruk » Mon Feb 19, 2018 3:20 pm

Hi Justin,

I'm currently running into another issue that I would appreciate if you could help me with.

When I try to run svSolver on the cluster, instead of one job running on X cores, I get the same job running X times on 1 core each.

For example, using the following code in my sbatch script:

Code: Select all

module load mpi/gcc/openmpi-1.6.4
mpiexec -n 32 /home/pewowaruk/svsolver/BuildWithMake/Bin/svsolver-openmpi.exe
I end up running the same job 32 times!

Do you have any thoughts on whether this is an issue with how I'm calling svSolver, or whether I should seek help from the cluster staff?

Thanks

Re: Is cmake still broken for linux svsolver?

Post by Justin Tran » Tue Feb 20, 2018 2:11 pm

Hi Ryan,

Are you loading the OpenMPI module before you launch the job, or is the batch script the first place it gets loaded? Usually I like to put all my module loads in my .bashrc file so they are loaded whenever I log in to the cluster. I would try loading the OpenMPI module before launching the batch script and see if that helps.
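
For example, making the batch script explicit about the environment is a cheap sanity check; a sketch along these lines (the module name and path are taken from your post, so adjust them for your cluster) confirms that mpiexec and the solver come from the same OpenMPI install:

Code: Select all

#!/bin/bash
#SBATCH --ntasks=32

module load mpi/gcc/openmpi-1.6.4   # load MPI before calling mpiexec
which mpiexec                       # sanity check: should point into the openmpi-1.6.4 install
mpiexec -n 32 /home/pewowaruk/svsolver/BuildWithMake/Bin/svsolver-openmpi.exe

The symptom you describe (N single-core copies of the same job) often means the mpiexec being picked up does not match the MPI library the solver was linked against, so each process starts up as its own 1-process run.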
