Fail on MPI configuration on Mac

Provides a system for patient-specific cardiovascular modeling and simulation.
Sara BENCHARA
Posts: 7
Joined: Tue Nov 30, 2021 4:22 am

Fail on MPI configuration on Mac

Post by Sara BENCHARA » Sat Feb 18, 2023 4:04 pm

Hello everyone

I just installed SimVascular on a Mac with an M2 Pro chip and macOS Ventura. I would like to run simulations using MPI. However, when I ask SimVascular to find the path to the mpiexec file, at first the mpiexec file in the bin folder is not found, and then when it does appear I get the error message in the attached screenshot.

Thanks for your help.

Sara
Attachments
Capture d’écran 2023-02-18 à 23.56.17.png

David Parker
Posts: 1740
Joined: Tue Aug 23, 2005 2:43 pm

Re: Fail on MPI configuration on Mac

Post by David Parker » Tue Feb 21, 2023 8:25 pm

Hi Sara,

The mpiexec program should be set to /usr/local/bin/mpiexec. Perhaps SV is not automatically setting it correctly.
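To see what SV should be picking up, you can check from a terminal where mpiexec actually resolves (a plain shell sketch, nothing SimVascular-specific):

```shell
# Report where mpiexec resolves to, if anywhere; on a standard Open MPI
# install this is typically /usr/local/bin/mpiexec.
MPIEXEC="$(command -v mpiexec || true)"
if [ -n "$MPIEXEC" ]; then
    echo "mpiexec resolves to: $MPIEXEC"
    ls -l "$MPIEXEC"    # shows whether it is a symlink and where it points
else
    echo "mpiexec is not on PATH"
fi
```

If the path it reports differs from what SV has in its preferences, that is the path to enter there.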

Cheers,
Dave
Attachments
Screen Shot 2023-02-21 at 7.21.50 PM.png

Sara BENCHARA
Posts: 7
Joined: Tue Nov 30, 2021 4:22 am

Re: Fail on MPI configuration on Mac

Post by Sara BENCHARA » Sat Mar 04, 2023 11:00 am

Hello David,

Thank you for your answer. I solved this problem by installing Open MPI. However, when I try to launch a simulation with MPI, an error message appears: "Simulation job 'mul2_1_5_1_copy' has failed with non-zero exit code 1." with the following details:

--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.1.4/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[MacBook-Pro-de-Sara.local:02474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
[The same opal_init / orte_init / mpi_init "couldn't open the help file" messages and MPI_Init abort repeat for each of the other five processes (02478, 02475, 02479, 02476, 02477).]
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[22959,1],2]
Exit code: 1
--------------------------------------------------------------------------


Thanks for your help

Sara

David Parker
Posts: 1740
Joined: Tue Aug 23, 2005 2:43 pm

Re: Fail on MPI configuration on Mac

Post by David Parker » Tue Mar 07, 2023 9:59 am

Hi Sara,

It seems that MPI was very sorry, but it did not mention what it was sorry about!

Try running the simulation without MPI to see if you can determine what the problem might be; you may get a more useful error message.

Cheers,
Dave

Sara BENCHARA
Posts: 7
Joined: Tue Nov 30, 2021 4:22 am

Re: Fail on MPI configuration on Mac

Post by Sara BENCHARA » Wed Mar 08, 2023 1:15 am

Hello David,
I can run the simulations without MPI without any problem.
However, installing Open MPI does not create the /usr/local/Cellar folder that SimVascular is looking for.
I'm going to dig in this direction.
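For reference, on Apple Silicon Homebrew's prefix is /opt/homebrew rather than /usr/local, so a brew-installed Open MPI will not create /usr/local/Cellar. Assuming Homebrew was used for the install, this asks it where the files actually went:

```shell
# On Apple Silicon, Homebrew installs under /opt/homebrew, not
# /usr/local/Cellar, so ask Homebrew where it actually put Open MPI:
if command -v brew >/dev/null 2>&1; then
    prefix="$(brew --prefix open-mpi 2>/dev/null)" \
        && echo "Open MPI prefix: $prefix" \
        || echo "open-mpi is not installed via Homebrew"
else
    echo "Homebrew is not installed"
fi
```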

Thanks for your answers.

Cheers,

Sara

Giuseppe Dalla Vecchia
Posts: 7
Joined: Tue Aug 22, 2023 5:56 am

Re: Fail on MPI configuration on Mac

Post by Giuseppe Dalla Vecchia » Tue Dec 05, 2023 7:44 am

Good morning everyone,

I've tried to install SimVascular several times on my MacBook Air with an M2 chip and macOS Sonoma, and each time there seems to be an issue. I've checked multiple times to ensure that the required programs, such as gcc, VTK, and Open MPI, are installed. Specifically, the program works up until the simulation section, where the "Create Data Files for Simulation" command gives me the following error:

dyld[61448]: Library not loaded: /usr/local/opt/gcc/lib/gcc/11/libgfortran.5.dylib
Referenced from: <7F8D4F10-902B-3ADE-B253-9A20D4A22593> /usr/local/sv/svsolver/2022-07-22/bin/svpre
Reason: tried: '/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/svExternals/lib/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/svExternals/lib/plugins/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/lib/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/./libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/bin/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/lib/plugins/libgfortran.5.dylib' (no such file), '/usr/local/opt/gcc/lib/gcc/11/libgfortran.5.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64')), '/System/Volumes/Preboot/Cryptexes/OS/usr/local/opt/gcc/lib/gcc/11/libgfortran.5.dylib' (no such file), '/usr/local/opt/gcc/lib/gcc/11/libgfortran.5.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64')), '/usr/local/lib/libgfortran.5.dylib' (no such file), '/usr/lib/libgfortran.5.dylib' (no such file, not in dyld cache), '/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/svExternals/lib/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/svExternals/lib/plugins/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/lib/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/./libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/bin/libgfortran.5.dylib' (no such file), '/Applications/SimVascular.app/Contents/Resources/lib/plugins/libgfortran.5.dylib' (no such file), '/opt/homebrew/Cellar/gcc@11/11.4.0/lib/gcc/11/libgfortran.5.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64')), 
'/System/Volumes/Preboot/Cryptexes/OS/opt/homebrew/Cellar/gcc@11/11.4.0/lib/gcc/11/libgfortran.5.dylib' (no such file), '/opt/homebrew/Cellar/gcc@11/11.4.0/lib/gcc/11/libgfortran.5.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64')), '/usr/local/lib/libgfortran.5.dylib' (no such file), '/usr/lib/libgfortran.5.dylib' (no such file, not in dyld cache)

With my professor, we have agreed that the issue perhaps lies in the installation of svSolver, even though we have checked the preferences and they seem to be set correctly (I am attaching a photo).
Screenshot 2023-12-05 alle 15.37.15.png
Perhaps the problem lies in the setting of mpiexec, which doesn't match what is shown in the MPI section of the preferences. However, I am unable to select the correct file, and it seems stuck on that 'orterun' executable. Nonetheless, the issue persists even for simulations without MPI.
Screenshot 2023-12-05 alle 15.40.36.png
What I can't understand is that, unlike Sara Benchara's case above, in my case even the simulation without MPI does not work. I thought it might be a problem related to Apple's new chips, but apparently the issue is different.

Could you suggest a possible alternative solution? I am available to contact you privately so that I can explain the problem more thoroughly.

Thanks for your help.
Giuseppe Dalla Vecchia

David Parker
Posts: 1740
Joined: Tue Aug 23, 2005 2:43 pm

Re: Fail on MPI configuration on Mac

Post by David Parker » Tue Dec 05, 2023 10:38 am

Hello Giuseppe,

The error "Library not loaded: /usr/local/opt/gcc/lib/gcc/11/libgfortran.5.dylib" indicates that the Fortran libraries used to build svSolver can't be located.

You could try installing gcc 11, or building svSolver from source on your Mac.
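To confirm the mismatch the dyld log is reporting (svpre is an x86_64 binary, and the copies of libgfortran it found are arm64), you can ask `file` which architectures a library contains. The path below is one of the candidates from your log:

```shell
# Check the architecture of a candidate library from the dyld error;
# svpre is x86_64, so an arm64-only libgfortran cannot be loaded by it.
LIB=/opt/homebrew/Cellar/gcc@11/11.4.0/lib/gcc/11/libgfortran.5.dylib
if [ -e "$LIB" ]; then
    file "$LIB"
else
    echo "no library at $LIB"
fi
```

An x86_64 build of libgfortran (for example from a Rosetta-based Homebrew under /usr/local) is what svpre would need.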

Cheers,
Dave

Giuseppe Dalla Vecchia
Posts: 7
Joined: Tue Aug 22, 2023 5:56 am

Re: Fail on MPI configuration on Mac

Post by Giuseppe Dalla Vecchia » Tue Dec 05, 2023 10:55 am

Hi Dave,

I've already installed gcc 11 and Open MPI 4.1.6, but I'm still having these issues. I used Homebrew for the installation. Could this be the problem?

I'll try building svSolver from source on my Mac.

Thanks,

Giuseppe

David Parker
Posts: 1740
Joined: Tue Aug 23, 2005 2:43 pm

Re: Fail on MPI configuration on Mac

Post by David Parker » Tue Dec 05, 2023 11:18 am

Hi Giuseppe,

Macs are known to place libraries and such in non-standard locations, which causes problems when running executables.

You could try creating a soft link (ln -s) from /usr/local/opt/gcc/lib/gcc/11/libgfortran.5.dylib to the actual location of libgfortran.5.dylib. For example, on my Mac libgfortran.5.dylib is stored in /opt/homebrew/Cellar/gcc/11/lib/current/
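Something like this (the Homebrew path is the example from my machine; use `find /opt/homebrew -name libgfortran.5.dylib` to locate yours, and run with sudo when the target is under /usr/local):

```shell
# Minimal sketch of the soft link. EXPECTED/ACTUAL default to the paths
# from this thread -- verify and override them for your machine.
EXPECTED="${EXPECTED:-/usr/local/opt/gcc/lib/gcc/11/libgfortran.5.dylib}"
ACTUAL="${ACTUAL:-/opt/homebrew/Cellar/gcc/11/lib/current/libgfortran.5.dylib}"
mkdir -p "$(dirname "$EXPECTED")"   # create the directory dyld searches
ln -sf "$ACTUAL" "$EXPECTED"        # point it at the real library
echo "linked $EXPECTED -> $ACTUAL"
```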

Cheers,
Dave

Evangelos Stamos
Posts: 5
Joined: Thu Oct 19, 2023 10:10 am

Re: Fail on MPI configuration on Mac

Post by Evangelos Stamos » Fri Dec 15, 2023 7:59 am

Hi,
I was facing the same issue on macOS Sonoma Version 14.1.2 (23B92): I couldn't "Create Data Files for Simulation" either with or without the MPI option selected. Since I didn't manage to find a fix, I decided to go back to SimVascular 2022.07.20 and keep the same svSolver version 2022.07. It works, but I am not able to run a simulation with MPI enabled.

I have tried installing open-mpi and mpich through Homebrew, but since the mpiexec field in the MPI section of the SimVascular preferences is not editable, I can't change the path to the location where they are installed.

Is there a specific MPI version that works?
