Conda with salloc and sbatch

Goal: Demonstrate best practices for using Conda environments with salloc and sbatch.


Conda with salloc and sbatch: Best Practice

As indicated in Modules and using salloc and sbatch: Best Practice, we recommend performing a module purge when starting an interactive session or submitting a job.

After performing a conda activate on a login node, the environment variables it sets will typically be inherited after performing an salloc. But notice what happens to the command-line prompt:

[@mblog1]$ module purge
[@mblog1]$ module load miniconda3/24.3.0
[@mblog1]$ conda activate py_env
(py_env) [@mblog1]$ python --version
Python 3.12.4
(py_env) [@mblog1]$ salloc -A arcc -t 10:00
salloc: Granted job allocation 1243597
salloc: Nodes mbcpu-025 are ready for job
[@mbcpu-025]$ python --version
Python 3.12.4

Which Conda environment is currently active?
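One way to answer this from the compute node is to query Conda itself rather than relying on the prompt. A minimal check (CONDA_DEFAULT_ENV is the standard variable Conda sets on activation, and conda env list marks the active environment with an asterisk):

[@mbcpu-025]$ echo $CONDA_DEFAULT_ENV
[@mbcpu-025]$ conda env list

Since the environment variables are typically inherited, you would expect py_env to still be reported here, even though the prompt no longer shows it.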

We suggest, as with performing a module purge, explicitly activating Conda environments after performing an salloc, and within the scripts that you sbatch.

[@mblog1]$ salloc -A arcc -t 10:00
salloc: Granted job allocation 1243600
salloc: Nodes mbcpu-025 are ready for job
[@mbcpu-025]$ module purge
[@mbcpu-025]$ module load miniconda3/24.3.0
[@mbcpu-025]$ conda activate py_env
(py_env) [@mbcpu-025]$ python --version
Python 3.12.4
(py_env) [@mbcpu-025]$ conda deactivate
[@mbcpu-025]$ exit
[@mblog1]$

Again, when these steps are detailed within a script that you sbatch, ARCC can see and replicate exactly what you are doing if there is an issue.


Conda sbatch: Example

#!/bin/bash
#SBATCH --account=<project-name>
#SBATCH --time=10:00
#SBATCH --job-name=conda_test
#SBATCH --output=conda_results_%A.out

echo "SLURM_JOB_ID:" $SLURM_JOB_ID
start=$(date +'%D %T')
echo "Start:" $start

module purge
module load miniconda3/24.3.0

conda activate /cluster/medbow/project/<project-name>/<username>/conda/py_env
python --version
conda deactivate

end=$(date +'%D %T')
echo "End:" $end
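Assuming the script above is saved as conda_test.sbatch (the filename is arbitrary), it can be submitted and monitored from a login node, with the results written to the conda_results_<jobid>.out file named by the --output option:

[@mblog1]$ sbatch conda_test.sbatch
[@mblog1]$ squeue -u $USER
[@mblog1]$ cat conda_results_<jobid>.out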