This page shows you how to run a regular bash script as a pipeline. The runAsPipeline
script, accessible through the rcbio/1.0
module, converts an input bash script to a pipeline that easily submits jobs to the Slurm scheduler for you.
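For context, the input is just an ordinary bash script. Below is a minimal illustrative sketch of the kind of script one might convert; the file name, sample names, and commands are hypothetical, and runAsPipeline may expect additional annotations that are not shown here:

```bash
#!/bin/bash
# work.sh - hypothetical input script; each loop iteration does
# independent work that could be submitted as a separate Slurm job
for sample in sampleA sampleB; do
    grep -H "John" $sample.txt >> John.txt
done
```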
...
Start an interactive job, and create a working folder
For example, for user abc123, the working folder under the scratch3 directory is created like this:
```
srun --pty -p interactive -t 0-12:0:0 --mem 2000MB -n 1 /bin/bash
mkdir -p /n/scratch3/users/a/abc123/testRunBashScriptAsSlurmPipeline
cd /n/scratch3/users/a/abc123/testRunBashScriptAsSlurmPipeline
```
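With the working folder in place, the rcbio/1.0 module mentioned above needs to be loaded so that runAsPipeline and its helper scripts (such as sendJobFinishEmail.sh) are on the PATH. A minimal sketch, assuming the cluster's standard module command:

```bash
# Load the module that provides runAsPipeline and its helper scripts
module load rcbio/1.0
```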
...
In case you are wondering how it works, here is a simple example to explain: for each step of each loop, the pipeline runner creates a file that looks like this (here it is named flag.sh):
```
#!/bin/bash
# Run the step's commands on a compute node; create flag.success only if they all succeed
srun -n 1 bash -c "{ echo I am running...; hostname; otherCommands; } && touch flag.success"
sleep 5
export SLURM_TIME_FORMAT=relative
echo Job done. Summary:
sacct --format=JobID,Submit,Start,End,State,Partition,ReqTRES%30,CPUTime,MaxRSS,NodeList%30 --units=M -j $SLURM_JOBID
sendJobFinishEmail.sh flag
# The exit status tells Slurm whether the step succeeded
[ -f flag.success ] && exit 0 || exit 1
```
Then submit with:
```
sbatch -p short -t 10:0 -o flag.out -e flag.out flag.sh
```
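After the job finishes, the outcome can be checked from the flag files, following the file names used in the flag.sh example above:

```bash
cat flag.out        # combined stdout/stderr of the step
ls -l flag.success  # present only if all of the step's commands succeeded
```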
sendJobFinishEmail.sh is in /n/app/rcbio/1.0/bin.
There is a bug in that script; please change:
[ -f $flag.failed ]
to:
[ ! -f $flag.success ]
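For reference, here is a hypothetical sketch of how the corrected test might sit inside sendJobFinishEmail.sh; the variable names and surrounding logic are assumptions, and only the test itself comes from the fix above. The original test looked for a .failed flag that the flag.sh wrapper never creates, whereas checking for the absence of the .success flag correctly identifies a failed step:

```bash
#!/bin/bash
# Hypothetical excerpt: decide whether the step succeeded based on the flag file
flag=$1                             # e.g. "flag", matching the prefix used in flag.sh
if [ ! -f $flag.success ]; then     # corrected test: missing success flag means the step failed
    status="Failed"
else
    status="Succeeded"
fi
# ... compose and send the notification email using $status ...
```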
Let us know if you have any questions. Please include your working folder and the commands used in your email. Any comments and suggestions are welcome!