Conda with salloc and sbatch
Goal: Demonstrate best practices for using Conda environments with salloc and sbatch.
Conda with salloc and sbatch: Best Practice
As indicated in the Modules and using salloc and sbatch: Best Practice section, we recommend performing a module purge when starting an interactive session or submitting a job.
If you conda activate an environment on a login node, the environment variables it sets will typically be inherited by a subsequent salloc session. But notice what happens to the command-line prompt:
[@mblog1]$ module purge
[@mblog1]$ module load miniconda3/24.3.0
[@mblog1]$ conda activate py_env
(py_env) []$ python --version
Python 3.12.4
(py_env) [@mblog1]$ salloc -A arcc -t 10:00
salloc: Granted job allocation 1243597
salloc: Nodes mbcpu-025 are ready for job
[@mbcpu-025]$ python --version
Python 3.12.4

Which Conda environment is currently active?
We would suggest, as with performing a module purge, that you conda activate environments explicitly after performing an salloc, and within the scripts that you sbatch.
[@mblog1]$ salloc -A arcc -t 10:00
salloc: Granted job allocation 1243600
salloc: Nodes mbcpu-025 are ready for job
[@mbcpu-025]$ module purge
[@mbcpu-025]$ module load miniconda3/24.3.0
[@mbcpu-025]$ conda activate py_env
(py_env) [@mbcpu-025]$ python --version
Python 3.12.4
(py_env) [@mbcpu-025]$ conda deactivate
[@mbcpu-025]$ exit
[@mblog1]$

Again, when these steps are detailed explicitly within a script that you sbatch, ARCC can see and replicate exactly what you are doing when there is an issue.
Conda sbatch: Example
Here is a very minimal example of what a submission script would look like:
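The following is a sketch of what run_conda.sh might contain, assuming the arcc account, the miniconda3/24.3.0 module, and the py_env environment from the examples above; the exact #SBATCH options and echo formatting are illustrative and should be adjusted for your own project:

#!/bin/bash
#SBATCH --account=arcc
#SBATCH --time=10:00
#SBATCH --job-name=run_conda
#SBATCH --output=conda_results_%j.out

echo "SLURM_JOB_ID:" $SLURM_JOB_ID
echo "Start:" $(date +"%m/%d/%y %T")

# Start from a clean set of modules, then load and activate explicitly.
module purge
module load miniconda3/24.3.0
conda activate py_env

python --version

echo "End:" $(date +"%m/%d/%y %T")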
Submitting and running this script would look something like this:
[]$ sbatch run_conda.sh
Submitted batch job 1902001
[]$ cat conda_results_1902001.out
SLURM_JOB_ID: 1902001
Start: 08/15/24 10:47:23
Python 3.12.2
End: 08/15/24 10:47:24