MPI Issue
- William Yoon
- Posts: 13
- Joined: Wed May 12, 2021 12:45 pm
MPI Issue
Hello,
I am currently running the latest SimVascular and svSolver on macOS.
I installed Open MPI with "brew install open-mpi".
locate mpiexec
/Applications/ParaView-5.9.0.app/Contents/MacOS/mpiexec
/Applications/ParaView-5.9.0.app/Contents/bin/mpiexec
/usr/local/Cellar/open-mpi/4.1.1_2/bin/mpiexec
/usr/local/Cellar/open-mpi/4.1.1_2/share/man/man1/mpiexec.1
/usr/local/bin/mpiexec
/usr/local/share/man/man1/mpiexec.1
/usr/local/sv/svFSI/2021-05-03/bin/mpiexec
/usr/local/sv/svsolver/2021-06-15/bin/mpiexec
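To see which of these is actually first on the PATH (i.e. what a plain "mpiexec" resolves to), something like this should work:
Code: Select all
# Show which mpiexec is found first on the PATH
which mpiexec

# /usr/local/bin/mpiexec should be a Homebrew symlink into the Cellar
ls -l /usr/local/bin/mpiexec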
But somehow I can't get SimVascular to recognize this. When I click MPI to run a simulation, I get the following message:
svSolver requires OpenMPI but an OpenMPI MPI Implementation was not found.
Under the SimVascular Preferences MPI tab:
mpiexec: /usr/local/bin/mpiexec
MPI Implementation: MPICH
I have been reading the forum to fix this issue, but I have not been successful.
Any help would be greatly appreciated.
Thank you,
- Ryan DeGroff
- Posts: 7
- Joined: Tue Jun 08, 2021 8:23 am
Re: MPI Issue
Following, I have the same issue.
- David Parker
- Posts: 1775
- Joined: Tue Aug 23, 2005 2:43 pm
Re: MPI Issue
Hi William and Ryan,
Post here what the output is when you execute:
Code: Select all
mpiexec --version
Cheers,
Dave
- Ryan DeGroff
- Posts: 7
- Joined: Tue Jun 08, 2021 8:23 am
Re: MPI Issue
Hi David,
Here's my output:
-Ryan
- William Yoon
- Posts: 13
- Joined: Wed May 12, 2021 12:45 pm
Re: MPI Issue
davep wrote: ↑Thu Jul 08, 2021 11:32 am
Hi William and Ryan,
Post here what the output is when you execute:
Code: Select all
mpiexec --version
Cheers,
Dave
Hi Dave,
Here is my version:
mpiexec --version
mpiexec (OpenRTE) 4.1.1
Report bugs to http://www.open-mpi.org/community/help/
Thanks,
William
- David Parker
- Posts: 1775
- Joined: Tue Aug 23, 2005 2:43 pm
Re: MPI Issue
Hi Guys,
I think the problem is that SV is reading the MPI version from a database stored in the $HOME/Library/Application Support/SimVascular directory. Remove the files in that directory and see if that fixes the problem.
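For example, something along these lines (the exact file names in that directory can vary between versions, and the backup directory name here is just an arbitrary choice, so moving the files aside is safer than deleting them outright):
Code: Select all
# Move SimVascular's cached preference/database files out of the way,
# then restart SimVascular so it re-detects the MPI installation.
mkdir -p ~/sv_prefs_backup
mv "$HOME/Library/Application Support/SimVascular/"* ~/sv_prefs_backup/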
Cheers,
Dave
- William Yoon
- Posts: 13
- Joined: Wed May 12, 2021 12:45 pm
Re: MPI Issue
Hi Dave,
Thanks for your quick response.
I deleted the files under the SimVascular directory and tried to run the simulation.
This time, I get the following message:
Simulation job "name" has failed with non-zero exit code 1.
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Z-MacBook-Pro.local:02104] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Z-MacBook-Pro.local:02110] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Z-MacBook-Pro.local:02115] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Z-MacBook-Pro.local:02119] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[32037,1],0]
Exit code: 1
--------------------------------------------------------------------------
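One thing I noticed: the help-file paths in this output point at /usr/local/Cellar/open-mpi/4.0.5/, while locate found my Homebrew install under 4.1.1, so the runtime seems to be looking for a different Open MPI version than the one installed. A quick way to check what is actually present (assuming the default Homebrew prefix) would be:
Code: Select all
# List the Open MPI versions installed in Homebrew's Cellar
ls /usr/local/Cellar/open-mpi/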
- Ryan DeGroff
- Posts: 7
- Joined: Tue Jun 08, 2021 8:23 am
Re: MPI Issue
I got the same error as William.
- David Parker
- Posts: 1775
- Joined: Tue Aug 23, 2005 2:43 pm
Re: MPI Issue
Try running the solver from the command line in your SV project Simulations/JOBNAME/ directory:
Code: Select all
mpiexec -np 4 /usr/local/sv/svsolver/2021-06-15/bin/svsolver solver.inp
Cheers,
Dave
- William Yoon
- Posts: 13
- Joined: Wed May 12, 2021 12:45 pm
Re: MPI Issue
Hi Dave,
I entered the commands and got the following results.
Z-MacBook-Pro:~ A$ cd ~/Desktop/CoronaryProject/Simulations/pulsatile_sim
Z-MacBook-Pro:pulsatile_sim A$ mpiexec -np 4 /usr/local/sv/svsolver/2021-06-15/bin/svsolver solver.inp
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found