Submit cloud jobs to process data in /working

/working is Adelaide's local storage and is mounted on all Adelaide local Tier 3 UIs (aui.coepp.org.au). Because Adelaide's local Tier 3 does not have much compute power, /working is also mounted on all of the SA cloud's worker nodes (Torque node property SA), so that Adelaide users can submit jobs to the cloud Tier 3 to process data in /working.
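
As a quick sanity check, you can confirm that /working is mounted on whichever node you are logged in to:

df -h /working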

To do that, copy your data to a sub-directory under /working, then ssh to cxin01.cloud.coepp.org.au or cxin02.cloud.coepp.org.au, cd to that sub-directory under /working, and run qsub from there.
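
A minimal sketch of these steps (the username yourname, the data paths, and the script file name testjob.pbs are all placeholders):

# Copy the data into a sub-directory under /working
rsync -av /path/to/mydata/ /working/yourname/mydata/

# Log in to one of the cloud interactive nodes
ssh yourname@cxin01.cloud.coepp.org.au

# Submit from the data directory; the script below cds back to it via $PBS_O_WORKDIR
cd /working/yourname/mydata
qsub testjob.pbs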

Your job submission script should request SA worker nodes. Here is an example:

#!/bin/bash

### Job name
#PBS -N testjob

#PBS -S /bin/bash

### Join queuing system output and error files into a single output file
#PBS -j oe

### Send email to user when job ends or aborts
#PBS -m ae

### email address for user
### Email address for user (remove one leading '#' and set your address to enable)
##PBS -M Your-email-Address

### Queue name that job is submitted to
#PBS -q short

### Request nodes, memory, walltime. NB THESE ARE REQUIRED
#PBS -l nodes=1:SA
#PBS -l walltime=00:10:00

# Set up the ATLAS software environment and ROOT
setupATLAS
localSetupROOT

# This job's working directory
echo Working directory is $PBS_O_WORKDIR
cd $PBS_O_WORKDIR
echo Running on host `hostname`
echo Time is `date`

# List the data directory (replace 'shunde' with your own sub-directory)
ls /working/shunde

# Load module(s) if required
###module load application
# Run the executable
hostname
sleep 10

The most important line here is #PBS -l nodes=1:SA, which requests one core on one node with the SA property, i.e. an SA cloud worker node.
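
Assuming the script above is saved as testjob.pbs (the file name is a placeholder), submitting and checking it uses the standard Torque commands:

qsub testjob.pbs     # prints the job ID on success
qstat -u $USER       # list your jobs and their states
qstat -f <job-id>    # full details for a single job

If a job needs more than one core, the usual Torque syntax is #PBS -l nodes=1:ppn=4:SA, which asks for four cores on a single SA worker node.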
