ParFlow is currently not available as a module, but has been installed individually within a project (because that project extends the code base).
If you would like standard ParFlow installed as a module, please request it via our Service Portal:
Multicore
ParFlow is built upon OpenMPI and can therefore run across multiple nodes.
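As a sketch, a multi-node run could be submitted with a Slurm script of the following shape. The module versions and paths match the install process below; the node/task counts, time limit, and my_run.tcl input script are placeholders you would replace with your own:

```shell
#!/bin/bash
#SBATCH --job-name=parflow-run
#SBATCH --nodes=2
#SBATCH --ntasks=16
#SBATCH --time=04:00:00
#SBATCH --account=<project>

# Same toolchain the build below uses.
module load gcc/12.2.0 openmpi/4.1.4 hypre/2.26.0-ompi

export PARFLOW_DIR=/project/<project-name>/software/parflow/source/parflow

# my_run.tcl is a placeholder for your own ParFlow TCL input script; its
# Process.Topology settings should multiply out to --ntasks (16 here).
tclsh my_run.tcl
```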
Local Install Process:
ParFlow has a number of dependencies that are typically provided by pre-installed modules (such as hypre). We have not been able to use our module version of silo, and have instead installed it locally, as demonstrated in the process below.
This example uses specific versions/releases of ParFlow and silo. ParFlow releases can be found at the link at the start of this page; silo releases can be found here: Silo Releases
You will need to replace <project-name> with your own project folder name, or update the folder paths with your own locations.
[source]$ pwd
/project/<project-name>/software/parflow/source/
[source]$ ls
parflow-3.12.0.zip silo-4.10.2.tar
[source]$ tar -xf silo-4.10.2.tar
[source]$ unzip parflow-3.12.0.zip
[source]$ ls
parflow-3.12.0 parflow-3.12.0.zip silo-4.10.2 silo-4.10.2.tar
[source]$ module load gcc/12.2.0 openmpi/4.1.4 hypre/2.26.0-ompi cmake/3.27.9
[source]$ cd silo-4.10.2/
[silo-4.10.2]$ ./configure --prefix=$(pwd) --disable-silex
[silo-4.10.2]$ make
[silo-4.10.2]$ make install
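At this point it is worth confirming that make install produced the static library and header that the ParFlow cmake flags further below point at. A small sketch of such a check (it assumes the install landed inside the silo source tree, as in the layout used here):

```shell
# check_silo_install takes the silo install prefix and reports whether
# the artifacts ParFlow links against are present.
check_silo_install() {
    for f in lib/libsilo.a include/silo.h; do
        if [ -e "$1/$f" ]; then
            echo "found $1/$f"
        else
            echo "MISSING $1/$f"
        fi
    done
}

check_silo_install "/project/<project-name>/software/parflow/source/silo-4.10.2"
```

Any MISSING line means the prefix passed to configure does not match the paths the cmake invocation below expects.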
[silo-4.10.2]$ cd ..
[source]$ mv parflow-3.12.0 parflow
[source]$ cd parflow/
[parflow]$ pwd
/project/<project-name>/software/parflow/source/parflow
[parflow]$ mkdir build
[parflow]$ cd build/
[build]$ export PARFLOW_DIR=/project/<project-name>/software/parflow/source/parflow
[build]$ cmake .. \
-DPARFLOW_ENABLE_ZLIB=ON \
-DPARFLOW_HAVE_CLM=ON \
-DPARFLOW_AMPS_LAYER=mpi1 \
-DPARFLOW_AMPS_SEQUENTIAL_IO=true \
-DPARFLOW_ENABLE_TIMING=true \
-DPARFLOW_ENABLE_HYPRE=ON \
-DPARFLOW_ENABLE_SILO=ON \
-DPARFLOW_ENABLE_SIMULATOR=ON \
-DPARFLOW_ENABLE_SLURM=ON \
-DPARFLOW_ENABLE_TOOLS=ON \
-DSILO_INCLUDE_DIR=/project/<project-name>/software/parflow/source/silo-4.10.2/include \
-DSILO_LIBRARY=/project/<project-name>/software/parflow/source/silo-4.10.2/lib/libsilo.a \
-DCMAKE_INSTALL_PREFIX=$PARFLOW_DIR \
-DMPIEXEC="mpiexec" \
-DMPIEXEC_NUMPROC_FLAG="-n" \
-DTCL_INCLUDE_PATH=/usr/include/
[build]$ make
[build]$ make install
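After make install, a quick smoke check can confirm the main executable landed under the install prefix. A minimal sketch, assuming the PARFLOW_DIR layout used above (bin/parflow is the solver binary the build installs there):

```shell
# check_parflow_install takes the install prefix and reports whether the
# main solver binary is present and executable.
check_parflow_install() {
    if [ -x "$1/bin/parflow" ]; then
        echo "ok: found $1/bin/parflow"
    else
        echo "MISSING $1/bin/parflow"
        return 1
    fi
}

check_parflow_install "${PARFLOW_DIR:-/tmp/parflow}" || true
```

A MISSING line here usually means PARFLOW_DIR does not match the prefix passed to cmake via -DCMAKE_INSTALL_PREFIX.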
Testing:
After the make install step, you can run make test to verify the installation.
Due to the nature of these tests, you MUST NOT run them on the login nodes: they will utilize the majority of cores and take over nine hours to complete, which will affect ALL users on that login node.
Instead, submit a job to the cluster using a Slurm script of the form below, which calls make test from within the parflow build folder:
#!/bin/bash
#SBATCH --job-name=parflow
#SBATCH --ntasks=30
#SBATCH --time=12:00:00
#SBATCH --account=<project>
...
module load gcc/12.2.0 openmpi/4.1.4 hypre/2.26.0-ompi cmake/3.27.9
cd /project/<project-name>/software/parflow/source/parflow/build/
export SILO_DIR=/project/<project-name>/software/parflow/source/silo-4.10.2/
export PARFLOW_DIR=/project/<project-name>/software/parflow/source/parflow
make test
In our testing we observed:
...
Start 135: LW_var_dz_spinup.tcl_1_4_1
135/220 Test #135: LW_var_dz_spinup.tcl_1_4_1 .......................***Timeout 3600.13 sec
Start 136: LW_var_dz_spinup.tcl_4_1_1
136/220 Test #136: LW_var_dz_spinup.tcl_4_1_1 .......................***Timeout 3600.13 sec
...
98% tests passed, 4 tests failed out of 220
Total Test time (real) = 32122.82 sec
The following tests FAILED:
131 - LW_var_dz.tcl_1_4_1 (Timeout)
132 - LW_var_dz.tcl_4_1_1 (Timeout)
135 - LW_var_dz_spinup.tcl_1_4_1 (Timeout)
136 - LW_var_dz_spinup.tcl_4_1_1 (Timeout)
Errors while running CTest
Output from these tests are in: /project/<project-name>/software/parflow/source/parflow/build/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.
make: *** [Makefile:71: test] Error 8
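The failed cases can be re-run individually, as the CTest output above suggests. A sketch of such a re-run (from within the build folder, with the same modules and environment variables loaded as in the job script above; --timeout optionally raises the per-test limit beyond the 3600 s at which these cases failed):

```shell
cd "/project/<project-name>/software/parflow/source/parflow/build/"
ctest --rerun-failed --output-on-failure --timeout 7200
```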
We assume ParFlow’s test suite has a 3600 s per-test timeout built into it. Given that 98% of the tests pass and the only failures are these timeouts, we are satisfied that this installation works.