Fail on MPI configuration on Mac

Provides a system for patient-specific cardiovascular modeling and simulation.
David Parker
Posts: 1740
Joined: Tue Aug 23, 2005 2:43 pm

Re: Fail on MPI configuration on Mac

Post by David Parker » Fri Dec 15, 2023 10:28 am

Hello,

We use OpenMPI with the solvers.

What do you mean when you say "it works but I am not able to run a simulation with MPI enabled"? What error do you see?

You can set the mpiexec bin file from the SV Preferences panel by clicking on the ... to the right of the text field.

Cheers,
Dave

Evangelos Stamos
Posts: 5
Joined: Thu Oct 19, 2023 10:10 am

Re: Fail on MPI configuration on Mac

Post by Evangelos Stamos » Sat Dec 16, 2023 12:24 am

Hey, thank you for your quick response, Dave.

By 'it works but I am not able to run a simulation with MPI enabled' I mean that SimVascular seems to be fully functional so far, but a simulation cannot be run with the 'Use MPI' option enabled.

Update: I performed

Code: Select all

rm -rf "$HOME/Library/Application Support/SimVascular" # according to https://simtk.org/plugins/phpBB/viewtopic.php?p=39176&sid=569c919aa9c0c562dc48456ddba7805c#p39176
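Note that the space in "Application Support" means the path must be quoted, otherwise the shell splits it into two separate arguments:

```shell
# The space in "Application Support" must be quoted, or the shell passes
# "Application" and "Support/SimVascular" as two separate arguments to rm.
sv_prefs="$HOME/Library/Application Support/SimVascular"
rm -rf "$sv_prefs"
```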
Screenshot 2023-12-18 at 16.44.29.png
Is there any known Open MPI version that works with svsolver 2022.07.22?

The installed path of Open MPI via brew is

Code: Select all

/usr/local/Cellar/open-mpi/5.0.0
and it seems to be functional and linked to

Code: Select all

/usr/lib/bin
.
Screenshot 2023-12-18 at 16.38.47.png

Code: Select all

estamos@mbp-euangelos ~ % mpiexec --version
mpiexec (Open MPI) 5.0.0

Report bugs to https://www.open-mpi.org/community/help/
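For reference, a quick way to check which mpiexec is actually first on the PATH (a sketch; paths will differ per machine):

```shell
# Report which mpiexec is first on PATH and which Open MPI it belongs to;
# prints a note instead of failing when none is installed.
if mpi_path=$(command -v mpiexec); then
    echo "mpiexec resolves to: $mpi_path"
    "$mpi_path" --version | head -n 1
else
    echo "mpiexec not found on PATH"
fi
```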
When I select 'Use MPI' for the simulation, I receive an error.
Screenshot 2023-12-18 at 16.47.41.png
I would kindly like to ask whether there is any temporary workaround to run a simulation on multiple cores with MPI on my macOS version, since I have to conduct a study for my Biomechanics course with a set deadline.

In my view, it could be related to the Open MPI version, and I am willing to investigate further and contribute.

The relevant code appears to be sv4guiMPIPreferences::DetermineMpiImplementation.

I am using:

- SimVascular 2022.08.02
- svsolver 2022.07.22, since I could not run a simulation
- macOS Sonoma Version 14.1.2 (23B92)


I have also tried to install Open MPI natively with configure and cmake, with no success yet due to errors in the configure phase.

Any hint would be highly appreciated.
Cheers,
Evangelos.

David Parker
Posts: 1740
Joined: Tue Aug 23, 2005 2:43 pm

Re: Fail on MPI configuration on Mac

Post by David Parker » Mon Dec 18, 2023 12:44 pm

Hi Evangelos,

svSolver was built using OpenMPI 4.1, so you might try installing that version.

SV still can't be built on the newer macOS; we are working on that.

Cheers,
Dave

Evangelos Stamos
Posts: 5
Joined: Thu Oct 19, 2023 10:10 am

Re: Fail on MPI configuration on Mac

Post by Evangelos Stamos » Wed Dec 20, 2023 1:14 am

Hi David,

I have successfully installed Open MPI 4.1.6 (built with cmake, prefix /usr/local/bin).

Code: Select all

estamos@mbp-euangelos bin % mpiexec --version
mpiexec (OpenRTE) 4.1.6

Report bugs to http://www.open-mpi.org/community/help/
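For anyone trying the same, a typical source build of Open MPI 4.1.6 looks roughly like this (a sketch; the download URL, prefix, and -j value are assumptions, so adjust to your setup):

```shell
# Typical autotools build of Open MPI 4.1.6 (sketch; adjust prefix as needed).
curl -LO https://download.open-mpi.org/release/open-mpi/v4.1/openmpi-4.1.6.tar.gz
tar xzf openmpi-4.1.6.tar.gz
cd openmpi-4.1.6
./configure --prefix=/usr/local
make -j4 all
sudo make install
```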
On SimVascular 2022.08.02, under Preferences > MPI, the path is correctly set to /usr/local/bin/orterun, and orterun is recognised as a valid MPI implementation.

However, when I try to run a simulation with "Use MPI" enabled, I receive exactly the same error as Sara (quoted below).

It looks like

Code: Select all

MPIExecPath
even though it is correctly set in

Code: Select all

sv4guiMPIPreferencePage::SelectMPIExecPath()
is reset to

Code: Select all

/usr/local/Cellar/open-mpi/4.1.4/
or perhaps it is just the default location displayed when something goes wrong and we have no clue what the error relates to.
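One possibly related detail I found while searching: Open MPI prints exactly those "couldn't open the help file" messages when an install has been moved away from the prefix it was configured with (e.g. Homebrew upgrading away the 4.1.4 keg). Setting OPAL_PREFIX to the current install location before launching is a known workaround (a sketch; the path here is an assumption, use your actual install directory):

```shell
# Point a relocated Open MPI install at its current prefix so it can find
# its share/openmpi help files. The path is an assumption for illustration.
export OPAL_PREFIX=/usr/local
export PATH="$OPAL_PREFIX/bin:$PATH"
```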

I will try to dig into the code further, but I am not sure it's worth it since SV can't be built on macOS Sonoma.
Any hint or approach to investigate would be really helpful. Also, if there is a relevant GitHub issue, please let me know.

Much appreciated.

Cheers,
Evangelos.
sara-benchara wrote:
Sat Mar 04, 2023 11:00 am
Hello David,

Thank you for your answer. I solved this problem by installing openMPI. However, when I try to launch a simulation in MPI, an error message appears: " Simulation job 'mul2_1_5_1_copy' has failed with non-zero exit code 1." with the following details:

--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[MacBook-Pro-de-Sara.local:02474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
[The same opal_init / orte_init / mpi_init help-file messages and MPI_Init abort repeat for processes 02475-02479.]
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[22959,1],2]
Exit code: 1
--------------------------------------------------------------------------


Thanks for your help

Sara

Florian Gartner
Posts: 12
Joined: Tue Dec 12, 2023 7:13 am

Re: Fail on MPI configuration on Mac

Post by Florian Gartner » Mon Jan 08, 2024 4:23 pm

Hi everybody,

I have a similar issue on my Mac when I want to "create data files for simulation", only that I don't get any error message. It just says "Creating Data files: bct, restart, geombc" in the lower left corner and then nothing happens; I waited for about 40 minutes, but apparently the process didn't terminate.

If I go to the pertinent folder, I find that a "Simulations" folder has been created with the relevant files in it (.vtp, .vtu, .dat, .inp, etc.); however, all these files appear faded and I cannot click on them. The same problem occurs whether I use MPI or not.

Any ideas what the problem might be? I already tried to build svSolver again from source on my Mac and checked that gfortran and cmake are properly installed, but that didn't work either.
Thanks for any help!

David Parker
Posts: 1740
Joined: Tue Aug 23, 2005 2:43 pm

Re: Fail on MPI configuration on Mac

Post by David Parker » Tue Jan 16, 2024 6:14 pm

Hello,

What do you mean when you say "all these files appear faded"? Do the files contain any data?

Creating simulation data does not use MPI.

Which macOS version are you running?

If you upload your SV project someplace I can download I'll have a look.

Cheers,
Dave
