...

Info

We suggest, as with performing a module purge, that you conda activate your environments explicitly after performing an salloc, and within the scripts that you sbatch.

Code Block
[salexan5@mblog1 ~]$ salloc -A arcc -t 10:00
salloc: Granted job allocation 1243600
salloc: Nodes mbcpu-025 are ready for job
[salexan5@mbcpu-025 ~]$ module purge
[salexan5@mbcpu-025 ~]$ module load miniconda3
[salexan5@mbcpu-025 ~]$ conda activate py_env
(py_env) [salexan5@mbcpu-025 ~]$ python --version
Python 3.12.4

Again, since this is now detailed within an sbatch-ed script, ARCC can see and replicate exactly what you are doing when there is an issue.
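If you want a script to fail fast when the expected environment is not active, one approach (a sketch, not an ARCC-provided helper; the function name is hypothetical) is to check the CONDA_DEFAULT_ENV variable that conda activate exports:

```shell
# Hypothetical guard: verify the expected conda environment is active.
# conda activate sets CONDA_DEFAULT_ENV to the active environment's name.
check_conda_env() {
    local expected="$1"
    if [ "${CONDA_DEFAULT_ENV:-}" != "$expected" ]; then
        echo "Error: expected conda env '$expected', found '${CONDA_DEFAULT_ENV:-none}'" >&2
        return 1
    fi
}
```

Calling check_conda_env py_env right after the activate line makes the job stop with a clear message, rather than running the wrong Python, if activation silently failed.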

...

Conda sbatch: Example

Info

Here is a very minimal example of what a submission script would look like:

Expand
run_conda.sh
Code Block
#!/bin/bash
#SBATCH --account=arccanetrain
#SBATCH --time=10:00
#SBATCH --job-name=conda_test
#SBATCH --output=conda_results_%A.out

echo "SLURM_JOB_ID:" $SLURM_JOB_ID
start=$(date +'%D %T')
echo "Start:" $start

module load miniconda3/24.3.0
conda activate /cluster/medbow/project/arcc/salexan5/conda/py_env
python --version

end=$(date +'%D %T')
echo "End:" $end
Info

Submitting this script and viewing its output would look something like this:

Code Block
[]$ sbatch run_conda.sh
Submitted batch job 1902001

[]$ cat conda_results_1902001.out
SLURM_JOB_ID: 1902001
Start: 08/15/24 10:47:23
Python 3.12.2
End: 08/15/24 10:47:24
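The Start and End lines above are human-readable timestamps. If you also want the elapsed wall time printed in the output file, one approach (a sketch; the sleep stands in for your real workload) is to capture epoch seconds around the work:

```shell
# Sketch: capture epoch seconds around the workload so the job output
# also reports elapsed wall time (date +%s is POSIX-portable).
start_s=$(date +%s)
sleep 1                      # stand-in for the real workload
end_s=$(date +%s)
elapsed=$((end_s - start_s))
echo "Elapsed: ${elapsed} seconds"
```

After the job finishes, SLURM's accounting records report the same information, e.g. sacct -j 1902001 --format=Elapsed.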

...
