Hello,
We use OpenMPI with the solvers.
What do you mean when you say 'it works but I am not able to run a simulation with MPI enabled'? What error do you see?
You can set the mpiexec binary from the SV Preferences panel by clicking on the ... to the right of the text field.
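To find the path to enter there, you can check from a terminal which mpiexec your shell resolves and confirm it runs (standard commands; the output will differ per machine):
Code:
which mpiexec       # path to enter in the Preferences text field
mpiexec --version   # confirm the binary actually runs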
Cheers,
Dave
- Evangelos Stamos
- Posts: 5
- Joined: Thu Oct 19, 2023 10:10 am
Re: Fail on MPI configuration on Mac
Hey, thank you for your quick response, Dave.
By 'it works but I am not able to run a simulation with MPI enabled' I mean that SimVascular seems to be fully functional so far, but a simulation cannot be run with the 'Use MPI' option enabled.
Update: I performed
Code:
rm -rf "$HOME/Library/Application Support/SimVascular"  # quoted, since the path contains a space; per https://simtk.org/plugins/phpBB/viewtopic.php?p=39176&sid=569c919aa9c0c562dc48456ddba7805c#p39176
The installed path of open-mpi from brew is
Code:
/usr/local/Cellar/open-mpi/5.0.0
and it seems to be functional and linked to
Code:
/usr/lib/bin
mpiexec reports:
Code:
estamos@mbp-euangelos ~ % mpiexec --version
mpiexec (Open MPI) 5.0.0
Report bugs to https://www.open-mpi.org/community/help/
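It may also be worth checking that the brew-installed binary is the one PATH resolves and where the symlink points (standard commands; /usr/local/bin/mpiexec is where brew usually links it, but that is an assumption about this setup):
Code:
which -a mpiexec              # every mpiexec on PATH, in resolution order
ls -l /usr/local/bin/mpiexec  # where the brew symlink points, if brew linked it there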
When I select 'Use MPI' for a simulation, I receive an error. Is there any known open-mpi version that works with svsolver 2022.07.22? I would like to kindly ask if there is any temporary workaround to run a simulation on multiple cores with MPI on my macOS version, since I have to conduct a study for my Biomechanics course with a set deadline.
In my view, it could be related to the open-mpi version or to sv4guiMPIPreferences::DetermineMpiImplementation, and I am willing to further contribute and investigate.
I am using:
- SimVascular 2022.08.02
- svsolver 2022.07.22 (installed because I could not run a simulation)
- macOS Sonoma Version 14.1.2 (23B92)
I have tried to build open-mpi natively with configure and cmake, with no success yet due to errors in the configure phase.
Any hint would be highly appreciated.
Cheers,
Evangelos.
- David Parker
- Posts: 1716
- Joined: Tue Aug 23, 2005 2:43 pm
Re: Fail on MPI configuration on Mac
Hi Evangelos,
svSolver was built using OpenMPI 4.1, so you might try installing that version.
SV still can't be built on the newer macOS versions; we are working on that.
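For reference, building a 4.1.x release from source usually follows the standard configure/make flow; a minimal sketch, assuming the usual open-mpi.org download layout and a /usr/local prefix (adjust both as needed):
Code:
# fetch and unpack the 4.1.6 release (URL follows the open-mpi.org release layout)
curl -LO https://download.open-mpi.org/release/open-mpi/v4.1/openmpi-4.1.6.tar.gz
tar -xzf openmpi-4.1.6.tar.gz
cd openmpi-4.1.6
# configure, build, and install under /usr/local (assumed prefix)
./configure --prefix=/usr/local
make -j4 all
sudo make install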
Cheers,
Dave
- Evangelos Stamos
- Posts: 5
- Joined: Thu Oct 19, 2023 10:10 am
Re: Fail on MPI configuration on Mac
Hi David,
I have successfully installed Open MPI 4.1.6 with cmake, with prefix /usr/local/bin.
Code:
estamos@mbp-euangelos bin % mpiexec --version
mpiexec (OpenRTE) 4.1.6
Report bugs to http://www.open-mpi.org/community/help/
On the SimVascular 2022.08.02 Preferences panel, under the MPI option, the path is correctly set to /usr/local/bin/orterun, and orterun is recognised as a valid MPI implementation. Still, when I try to run a simulation with 'Use MPI' enabled, I receive exactly the same error as Sara (quoted below).
Looks like MPIExecPath, even when correctly set via sv4guiMPIPreferencePage::SelectMPIExecPath(), is reset to /usr/local/Cellar/open-mpi/4.1.4/; or maybe that is just the default location shown in the error output when something goes wrong and we have no clue what the error is related to.
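One way to check what SV actually stored for the MPI path is to grep the preferences directory; the location below is an assumption, based on the reset command earlier in this thread:
Code:
# search the SV preferences for the stored MPI path (location assumed)
grep -rE "mpiexec|MPIExecPath" "$HOME/Library/Application Support/SimVascular" 2>/dev/null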
I will try to dig into the code further, but I am not sure it's worth it, since SV can't be built on macOS Sonoma.
Any hint or approach to investigate would be really helpful. Also, if there is any relevant GitHub issue, please let me know.
Much appreciated.
Cheers,
Evangelos.
sara-benchara wrote: ↑ Sat Mar 04, 2023 11:00 am
Hello David,
Thank you for your answer. I solved this problem by installing OpenMPI. However, when I try to launch a simulation with MPI, an error message appears: "Simulation job 'mul2_1_5_1_copy' has failed with non-zero exit code 1." with the following details:
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[MacBook-Pro-de-Sara.local:02474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
[The same opal_init / orte_init / mpi_init "couldn't open the help file" messages and MPI_Init abort repeat for each of the remaining MPI processes (02475, 02476, 02477, 02478, 02479); repeated output omitted.]
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[22959,1],2]
Exit code: 1
--------------------------------------------------------------------------
Thanks for your help,
Sara
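A note on the log above: the missing help-file paths suggest the runtime is looking under a hard-coded build prefix (/usr/local/Cellar/open-mpi/4.1.4/) that does not exist on the machine. Open MPI supports relocated installations via the OPAL_PREFIX environment variable, so pointing it at the actual install tree before launching SV might be a workaround worth trying; a sketch, with an assumed install path:
Code:
# point the MPI runtime at the real install tree (path is an assumption; adjust)
export OPAL_PREFIX=/usr/local/Cellar/open-mpi/4.1.6
export PATH="$OPAL_PREFIX/bin:$PATH"
mpiexec --version   # confirm the relocated runtime is picked up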
- Florian Gartner
- Posts: 12
- Joined: Tue Dec 12, 2023 7:13 am
Re: Fail on MPI configuration on Mac
Hi everybody,
I have a similar issue on my Mac when I want to "Create Data Files for Simulation", except that I don't get any error message. It just says "Creating Data files: bct, restart, geombc" in the lower-left corner and then nothing happens. I waited for about 40 minutes, but apparently the process didn't terminate. If I go to the pertinent folder, I find that a folder "Simulations" has been created with the relevant files in it (like .vtp, .vtu, .dat, .inp, etc.); however, all these files appear faded and I cannot click on them. The same problem occurs whether I use MPI or not. Any ideas what the problem might be? I already tried to build svSolver again from source on my Mac, and I checked that gfortran and cmake are properly installed, but that didn't work either.
Thanks for any help!
- David Parker
- Posts: 1716
- Joined: Tue Aug 23, 2005 2:43 pm
Re: Fail on MPI configuration on Mac
Hello,
What do you mean when you say 'all these files appear faded'? Do the files contain any data?
Creating simulation data does not use MPI.
What macOS version are you running?
If you upload your SV project someplace I can download it from, I'll have a look.
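In the meantime, one way to surface an error is to run the presolver by hand on the job's .svpre file and watch the terminal output; a sketch, assuming a default svsolver install location (adjust both paths to your setup):
Code:
# run the presolver directly so its messages appear in the terminal
cd /path/to/YourProject/Simulations/your_job     # folder SV created for the job (assumed layout)
/usr/local/sv/svsolver/bin/svpre your_job.svpre  # svpre location is an assumption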
Cheers,
Dave