Open MPI
More information: https://www.open-mpi.org/
Compiling applications:
module unload cray-mpich
module unload cray-libsci
module load openmpi
module load libfabric
mpicc openmpi.c
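The compile command above assumes a source file named openmpi.c; its contents are not shown in this guide, but a minimal MPI program could look like the following sketch:

```c
/* openmpi.c — a minimal MPI example (the file name comes from the compile
 * command above; the program contents here are an illustrative assumption). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);                 /* initialize the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* rank of this process */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of ranks */

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();                         /* shut down MPI cleanly */
    return 0;
}
```

Compiling with `mpicc openmpi.c` produces `a.out`, the binary checked with ldd and launched with mpirun below.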
Note
In order to fully utilize the Slingshot high-speed network, make sure that the application is linked against the Cray libfabric library:
$ ldd a.out | grep libfabric
libfabric.so.1 => /opt/cray/libfabric/1.15.2.0/lib64/libfabric.so.1 (0x00007fcc2f4f4000)
See the Open MPI release notes on supported networks: https://docs.open-mpi.org/en/v5.0.x/release-notes/networks.html
Running applications:
#!/bin/bash
#SBATCH -A hpcteszt
#SBATCH --partition=cpu
#SBATCH --job-name=openmpi
#SBATCH --output=openmpi.out
#SBATCH --time=06:00:00
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=32
mpirun -np 64 ./a.out
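The batch script above requests 2 nodes with 32 tasks each, matching the 64 ranks given to mpirun. Assuming the script is saved as openmpi.sbatch (the file name is a placeholder), it can be submitted and monitored like this:

```shell
# Submit the job script (openmpi.sbatch is an assumed file name)
sbatch openmpi.sbatch

# Check the job's state in the queue
squeue --me

# Once the job has finished, inspect the output file named in the script
cat openmpi.out
```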
Open MPI in containers
Introduction to using MPI with containers: https://permedcoe.github.io/mpi-in-container/
Running MPI parallel jobs using Singularity containers: https://imperialcollegelondon.github.io/2020-07-13-Containers-Online/12-singularity-mpi/index.html
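A common pattern described in the guides above is the "hybrid" model: the host's MPI launches the ranks, and each rank executes inside the container. A sketch of such a launch, where the image name openmpi.sif and the binary path are placeholders, could look like:

```shell
# Hybrid model: host mpirun starts 64 ranks; each rank runs the
# application inside the container. Image and binary names are placeholders.
mpirun -np 64 singularity exec openmpi.sif ./a.out
```

For this to work, the MPI version inside the container should be compatible with the host's Open MPI installation, as the linked guides explain.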