User Manual

  • Application Guide
  • Status of System
  • Usage Guide
  • Compute partitions
    • CPU CLX partition
      • Workflow CPU CLX
      • Slurm partition CPU CLX
      • Examples and Recipes
        • Compilation on CPU CLX
        • OpenMPI on CPU CLX
        • Intel MPI on CPU CLX
        • OpenMP on CPU CLX
        • Hybrid Jobs
        • Linking the MKL version of fftw3
        • Multiple programs multiple data
      • Fat Tree OPA network of CLX partition
      • Operating system migration from CentOS to Rocky Linux
    • CPU Genoa partition
    • GPU A100 partition
    • GPU PVC partition
    • Next-Gen Technology Pool
  • Software
  • FAQ
  • NHR Community
  • Contact


    Multiple programs multiple data

    July 02, 2020

    Using multiple programs on different data within a single job takes a bit of setup, as you need to tell the MPI starter exactly what to run and where to run it.

    Jobscript

    Example script hello.slurm for a code with two binaries

    • one OpenMP binary hello_omp.bin running on 1 node with 2 MPI tasks per node and 4 OpenMP threads per task,
    • one MPI binary hello_mpi.bin running on 2 nodes with 4 MPI tasks per node.
    Intel MPI
    #!/bin/bash
    #SBATCH --time=00:10:00
    #SBATCH --nodes=3
    #SBATCH --partition=medium:test
    
    module load impi
    # Let Intel MPI handle process pinning instead of Slurm.
    export SLURM_CPU_BIND=none
    # Four OpenMP threads per task of hello_omp.bin.
    export OMP_NUM_THREADS=4
    
    # Build a machine file with one "host:slots" entry per node: the first
    # node gets 2 slots (for hello_omp.bin), the remaining two nodes get
    # 4 slots each (for hello_mpi.bin).
    scontrol show hostnames $SLURM_JOB_NODELIST | awk '{if(NR==1) {print $0":2"} else {print $0":4"}}' > machines.txt
    
    # Start both binaries within a single MPI_COMM_WORLD:
    # 2 tasks of hello_omp.bin and 8 tasks of hello_mpi.bin.
    mpirun -machine machines.txt -n 2 ./hello_omp.bin : -n 8 ./hello_mpi.bin
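
    For a three-node job, the generated machines.txt contains one "host:slots" line per node. With hypothetical node names it might look like this:

    node001:2
    node002:4
    node003:4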
    • This also works with srun, as sketched below. Unfortunately, none of our systems is up at the moment...
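
    As a sketch of the srun variant: Slurm's --multi-prog option maps task ranks to executables through a plain-text configuration file. The file name mpmd.conf below is arbitrary, and matching the exact 2/4/4 task placement from the machine file above may additionally require distribution options such as --distribution.

    # Write a multi-prog configuration: ranks 0-1 run the OpenMP binary,
    # ranks 2-9 run the MPI binary.
    cat > mpmd.conf <<'EOF'
    0-1 ./hello_omp.bin
    2-9 ./hello_mpi.bin
    EOF
    
    # Launch all 10 tasks in one MPI_COMM_WORLD.
    srun --ntasks=10 --multi-prog mpmd.conf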

    Related articles

    • Page: Workflow CPU CLX
    • Page: Hybrid Jobs
    • Page: INTEL-MPI version 19 slower than INTEL-MPI version 18
    • Page: Multiple programs multiple data
    • Page: Floating point exception with Intel MPI 2019.x using one task per node


