MPI Issue

Provides a system for patient-specific cardiovascular modeling and simulation.
David Parker
Posts: 1775
Joined: Tue Aug 23, 2005 2:43 pm

Re: MPI Issue

Post by David Parker » Thu Jul 08, 2021 5:29 pm

Hi William,

You don't seem to have the Fortran libraries. Have you installed gcc using brew install gcc? I think that should include the Fortran libraries. Which macOS version are you running?
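
A quick way to check (assuming Homebrew's default /usr/local prefix on an Intel Mac) would be:

Code: Select all

# gfortran and its runtime library should both be present after "brew install gcc"
gfortran --version
ls /usr/local/opt/gcc/lib/gcc/*/libgfortran.5.dylib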

Cheers,
Dave

William Yoon
Posts: 13
Joined: Wed May 12, 2021 12:45 pm

Re: MPI Issue

Post by William Yoon » Thu Jul 08, 2021 5:34 pm

Hi Dave,

I am using macOS 11.4 on an Intel chip.
I've already uninstalled and reinstalled GCC using "brew install gcc", without success...

Z-MacBook-Pro:~ A$ brew install gcc
Updating Homebrew...
==> Auto-updated Homebrew!
Updated 2 taps (homebrew/core and homebrew/cask).
==> Updated Formulae
Updated 47 formulae.
==> Updated Casks
Updated 6 casks.

Warning: gcc 11.1.0_1 is already installed and up-to-date.
To reinstall 11.1.0_1, run:
brew reinstall gcc

David Parker
Posts: 1775
Joined: Tue Aug 23, 2005 2:43 pm

Re: MPI Issue

Post by David Parker » Thu Jul 08, 2021 6:26 pm

Have you run xcode-select --install?

If not, uninstall gcc, run xcode-select --install, and then install gcc again.
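
That is, something like:

Code: Select all

# same steps as above, in order
brew uninstall gcc
xcode-select --install
brew install gcc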

Sorry this is taking so long! I develop primarily on a Mac, and it seems that macOS gets worse, development-wise, with every new release.

Cheers,
Dave

William Yoon
Posts: 13
Joined: Wed May 12, 2021 12:45 pm

Re: MPI Issue

Post by William Yoon » Thu Jul 08, 2021 9:44 pm

I appreciate all your help Dave.

I uninstalled gcc, ran xcode-select --install, and then reinstalled gcc.
Unfortunately, I encountered the same issue...


Z-MacBook-Pro:~ A$ xcode-select --install
xcode-select: error: command line tools are already installed, use "Software Update" to install updates
Z-MacBook-Pro:~ A$ brew install gcc
Updating Homebrew...
==> Auto-updated Homebrew!
Updated 1 tap (homebrew/core).
==> Updated Formulae
Updated 10 formulae.

==> Downloading https://ghcr.io/v2/homebrew/core/gcc/manifests/11.1.0_1
Already downloaded: /Users/A/Library/Caches/Homebrew/downloads/f7bb8eeac80de9990fc7ff7ec216d4e1eb0cf861e852e8d62317f52eb667bb3b--gcc-11.1.0_1.bottle_manifest.json
==> Downloading https://ghcr.io/v2/homebrew/core/gcc/bl ... 8e83ce46f4
Already downloaded: /Users/A/Library/Caches/Homebrew/downloads/7b6c50d89dfe506752f9adea32f7ef5943121783145a5f8e4c1a1b9fefdc140a--gcc--11.1.0_1.big_sur.bottle.tar.gz
==> Pouring gcc--11.1.0_1.big_sur.bottle.tar.gz
๐Ÿบ /usr/local/Cellar/gcc/11.1.0_1: 2,163 files, 460.2MB
Z-MacBook-Pro:~ A$ brew install open-mpi
==> Downloading https://ghcr.io/v2/homebrew/core/open-m ... ts/4.1.1_2
Already downloaded: /Users/A/Library/Caches/Homebrew/downloads/467c84611ee1e10e3019e70726660e44ed32f67dd172d18fa9ede03dddd2ff49--open-mpi-4.1.1_2.bottle_manifest.json
==> Downloading https://ghcr.io/v2/homebrew/core/open-m ... :da310195e
Already downloaded: /Users/A/Library/Caches/Homebrew/downloads/fb40e3afddec91e3219c5b6ee06dfbf5c5f25c3fe23587d8aa76852a1b716486--open-mpi--4.1.1_2.big_sur.bottle.tar.gz
==> Pouring open-mpi--4.1.1_2.big_sur.bottle.tar.gz
๐Ÿบ /usr/local/Cellar/open-mpi/4.1.1_2: 760 files, 15.9MB
Z-MacBook-Pro:~ A$ brew doctor
Your system is ready to brew.
Z-MacBook-Pro:~ A$ cd ~/Desktop/CoronaryProject/Simulations/pulsatile_sim
Z-MacBook-Pro:pulsatile_sim A$ mpiexec -np 4 /usr/local/sv/svsolver/2021-06-15/bin/svsolver solver.inp
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found
dyld: Library not loaded: /usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/sv/svsolver/2021-06-15/bin/svsolver
Reason: image not found
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 3 with PID 0 on node Z-MacBook-Pro exited on signal 6 (Abort trap: 6).

Ryan DeGroff
Posts: 7
Joined: Tue Jun 08, 2021 8:23 am

Re: MPI Issue

Post by Ryan DeGroff » Fri Jul 09, 2021 6:42 am

davep wrote: ↑
Thu Jul 08, 2021 2:48 pm
mpiexec -np 4 /usr/local/sv/svsolver/2021-06-15/bin/svsolver solver.inp
Hey Dave, I should have mentioned that using mpiexec through the command line has been the only way I have been able to run simulations. Will we be able to use the application user interface, or do we need to rely on command line execution?

David Parker
Posts: 1775
Joined: Tue Aug 23, 2005 2:43 pm

Re: MPI Issue

Post by David Parker » Fri Jul 09, 2021 10:25 am

Hi Ryan,

You should be able to run simulations from the GUI. What MacOS are you using?

Try removing the files in $HOME/Library/Application Support/SimVascular.
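
For example (the quotes matter because of the space in the path):

Code: Select all

rm -rf "$HOME/Library/Application Support/SimVascular"/*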

Cheers,
Dave

David Parker
Posts: 1775
Joined: Tue Aug 23, 2005 2:43 pm

Re: MPI Issue

Post by David Parker » Fri Jul 09, 2021 10:37 am

Hi William,

Post the output of

Code: Select all

locate libgfortran.5.dylib
On my Mac I get

Code: Select all

/usr/local/Cellar/gcc/10.2.0/lib/gcc/10/libgfortran.5.dylib
/usr/local/lib/gcc/10/libgfortran.5.dylib
/usr/local/sv/svsolver/2021-06-15/lib/libgfortran.5.dylib
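
(If locate isn't set up on your Mac, Spotlight's mdfind should turn up the same files:)

Code: Select all

mdfind -name libgfortran.5.dylib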
Cheers,
Dave

Ryan DeGroff
Posts: 7
Joined: Tue Jun 08, 2021 8:23 am

Re: MPI Issue

Post by Ryan DeGroff » Fri Jul 09, 2021 11:38 am

davep wrote: ↑
Fri Jul 09, 2021 10:25 am
Hi Ryan,

You should be able to run simulations from the GUI. What MacOS are you using?

Try removing the files in $HOME/Library/Application Support/SimVascular.

Cheers,
Dave
I am using macOS Big Sur 11.4. I removed the files in that Application Support folder for SimVascular as you described, but when I try running the simulation I get the following error. It looks as if the GUI is explicitly looking for open-mpi version 4.0.5, but my installed version is 4.1.1.
Simulation job 'ctest1_smooth' has failed with non-zero exit code 1.

--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30476] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30488] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30500] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30506] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30511] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30515] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
opal_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-orte-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
mpi_init:startup:internal-failure
But I couldn't open the help file:
/usr/local/Cellar/open-mpi/4.0.5/share/openmpi/help-mpi-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Ryans-MBP:30482] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[49875,1],0]
Exit code: 1
--------------------------------------------------------------------------

This was the same error William got before, but I do have the Fortran libraries and the Xcode command line tools, since svSolver works from the Terminal.
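
For reference, these commands show which open-mpi versions Homebrew actually has on disk and which mpiexec is on the PATH (assuming the default /usr/local prefix):

Code: Select all

# which open-mpi kegs Homebrew has installed
ls /usr/local/Cellar/open-mpi/
# which mpiexec is on the PATH, and its version
which mpiexec && mpiexec --version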

William Yoon
Posts: 13
Joined: Wed May 12, 2021 12:45 pm

Re: MPI Issue

Post by William Yoon » Fri Jul 09, 2021 2:25 pm

davep wrote: ↑
Fri Jul 09, 2021 10:37 am
Hi William,

Post the output of

Code: Select all

locate libgfortran.5.dylib
On my Mac I get

Code: Select all

/usr/local/Cellar/gcc/10.2.0/lib/gcc/10/libgfortran.5.dylib
/usr/local/lib/gcc/10/libgfortran.5.dylib
/usr/local/sv/svsolver/2021-06-15/lib/libgfortran.5.dylib
Cheers,
Dave

Hi Dave,

Here is my output
Thanks!

Code: Select all

Z-MacBook-Pro:~ A$ locate libgfortran.5.dylib
/Applications/ParaView-5.9.0.app/Contents/Libraries/libgfortran.5.dylib
/usr/local/Cellar/gcc/11.1.0_1/lib/gcc/11/libgfortran.5.dylib
/usr/local/sv/svFSI/2021-05-03/lib/libgfortran.5.dylib
/usr/local/sv/svsolver/2021-06-15/lib/libgfortran.5.dylib
Z-MacBook-Pro:~ A$ 

Ryan DeGroff
Posts: 7
Joined: Tue Jun 08, 2021 8:23 am

Re: MPI Issue

Post by Ryan DeGroff » Mon Jul 12, 2021 10:55 am

rdegroff wrote: ↑
Fri Jul 09, 2021 11:38 am
davep wrote: ↑
Fri Jul 09, 2021 10:25 am
Hi Ryan,

You should be able to run simulations from the GUI. What MacOS are you using?

Try removing the files in $HOME/Library/Application Support/SimVascular.

Cheers,
Dave
I am using macOS Big Sur 11.4. I removed the files in that Application Support folder for SimVascular as you described, but when I try running the simulation I get the following error. It looks as if the GUI is explicitly looking for open-mpi version 4.0.5, but my installed version is 4.1.1.
Hi David, have you found a solution to this issue yet? I've tried uninstalling my current open-mpi 4.1.1 and installing version 4.0.5, but simulations still fail when launched from the GUI.
