...

Code Block
#SBATCH --nodes=5
#SBATCH --ntasks=4
#SBATCH --cpus-per-task=4
#SBATCH --partition=teton
Code Block
#SBATCH -N 5
#SBATCH -n 4
#SBATCH -c 4
#SBATCH -p teton
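
The two blocks above are equivalent: -N, -n, -c and -p are the short forms of --nodes, --ntasks, --cpus-per-task and --partition. As a rough sanity check of what such a request adds up to (a sketch only; the numbers mirror the directives above, and note that --ntasks is the *total* task count, not tasks per node):

```shell
#!/bin/sh
# Back-of-the-envelope check of the CPU request implied by the directives above.
# In SLURM, the CPU request is ntasks * cpus-per-task; --nodes only constrains
# how many nodes those tasks may be spread across.
ntasks=4
cpus_per_task=4
total_cpus=$((ntasks * cpus_per_task))
echo "total CPUs requested: $total_cpus"
```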

How Many Cores and/or Memory Should I Request?

  • There are no hard-and-fast rules for configuring your batch files: in most cases the right settings depend on the size of your data and the extent of your analysis.

  • You will need to read the documentation for each plugin/command you run and understand how to use it, as behaviour varies between them.

  • Memory is still likely to be the major factor in how many cpus-per-task you choose.

  • In the example above we were only able to use 32 cores because we ran the job on one of the teton-hugemem partition nodes. On a standard Teton node we were only able to use 2 cores. Even so, 2 cores cut the run time to 9 hours and 45 minutes, compared to 17 hours with a single core; with 32 cores on a hugemem node, the job ran in 30 minutes!

    • Remember, hugemem nodes can be popular, so you might end up queuing for days to run a half-hour job, when you could have started immediately on a standard Teton node and already have the longer-running job finished.

    • Depending on the size of data/analysis you might be able to use more cores on a Teton node.
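
One way to find out what actually worked for a past run (assuming the seff utility is installed on your cluster, as it is on many SLURM systems) is to check a completed job's CPU and memory efficiency and adjust future requests accordingly:

```shell
# Report CPU and memory efficiency for a finished job.
# 12345678 is a placeholder; substitute a real job ID from squeue/sacct.
seff 12345678
```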

Note

You will need to perform/track analysis to understand what works for your data/analysis. Do not just use a hugemem node!
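
If memory turns out to be the limiting factor, one common approach is to request memory per CPU explicitly so the scheduler scales the memory request with your core count. The values below are illustrative only, not a recommendation for your data:

```shell
#SBATCH --cpus-per-task=4
#SBATCH --mem-per-cpu=8G   # illustrative value; total request = 4 x 8G = 32G per task
```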

Summary

In this introduction we've looked at using the four sbatch options --nodes, --ntasks-per-node, --cpus-per-task and --ntasks, and various combinations of them.

...