How to Use the Molecular Modeling Facility Cluster

If you want an account on the cluster, please contact the Director at 4-4508 or e-mail nathan.crawford@uci.edu.

From an X11-capable terminal on your workstation, you can open a secure shell connection with X forwarding:
ssh -Y yourusername@gplogin1.ps.uci.edu (or gplogin2 or gplogin3)

General Admonitions

  1. Change your temporary password immediately.
  2. Do not share your account with anyone else.
  3. Do not run any calculations on the login node -- use the queuing system.
  4. Do not run any calculations that read and write big files in your home directory -- use your assigned scratch space, /work/cluster/yourusername (available from any node).
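The scratch workflow in rule 4 can be sketched as follows. Directory names here are illustrative: on the cluster your scratch root is /work/cluster/yourusername and your inputs live under $HOME, while the temporary directories below are stand-ins so the sketch runs anywhere.

```shell
# Stand-ins so this sketch runs outside the cluster; on the cluster you would
# use /work/cluster/yourusername and a directory under your $HOME instead.
SCRATCH="$(mktemp -d)/myjob"
HOMEDIR="$(mktemp -d)"
echo "coordinates" > "$HOMEDIR/input.com"   # a small input file kept in home

mkdir -p "$SCRATCH"
cp "$HOMEDIR/input.com" "$SCRATCH/"         # stage inputs into scratch
cd "$SCRATCH"
# ... the queued calculation reads and writes its big files here ...
echo "final energies" > output.log          # placeholder for real output
cp output.log "$HOMEDIR/"                   # copy only final results back home
```

Keeping the heavy I/O in scratch spares the shared home filesystem; only the small final results are copied back.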

Login Nodes (gplogin1.ps.uci.edu, gplogin2, and gplogin3)

Non-compute-intensive programs can be run on the login nodes. These include result-visualization and molecule-building programs such as the following (some software packages need an environment module loaded first; "module avail" lists what is installed):

  1. Molden
    "gmolden"
  2. GaussView
    "module load gaussian/09"

    "gv"

  3. VMD
    "vmd"
     
  4. Maestro/MacroModel
    "module load schrodinger/2014-3"
    "maestro"

Compute Nodes

The following packages are set up to run from the queue on the compute nodes:

  1. Turbomole 6.6
  2. Gaussian 09
  3. NAMD 2.8
  4. CP2K 2.1.394
  5. GAMESS-US
  6. Many others -- look in /modfac/apps for installed packages.

SLURM Queuing System

Full documentation can be found on the SchedMD website, but here is a short list of common commands:

Command                   Description
sbatch submission_script  submits a job
scancel job_number        kills a job
squeue -u yourusername    shows your current jobs
sview                     graphical interface to SLURM
sprio                     shows the priority of everyone's pending jobs

You will also need to copy the example run script to your calculation setup directory:

cp /modfac/etc/run_slurm[.software] ~/path/to/your/calculation/input/directory/

Go to your calculation directory and edit the run script per its internal documentation (use gedit, gvim, emacs, etc.).
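The run scripts differ from package to package, but a minimal SLURM submission script generally looks like the sketch below. The partition name is taken from the table that follows; the job name, core count, time limit, module, and input/output file names are illustrative placeholders, not the contents of the real run_slurm files.

```shell
#!/bin/bash
#SBATCH --job-name=my_calc        # name shown in squeue output
#SBATCH --partition=mf_nes2.8     # one of the partitions listed below
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8       # match the chosen partition's core count
#SBATCH --time=24:00:00           # requested wall-clock limit

module load gaussian/09           # load the package's environment module
g09 < input.com > output.log      # placeholder command and file names
```

Lines beginning with #SBATCH are directives read by sbatch at submission time; everything after them runs as an ordinary shell script on the allocated compute node.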

The job submission partitions are listed below:

Name       Processor Type                               RAM       Nodes
mf_m-c1.9  1.9 GHz AMD 48-core "Magny-Cours"            64 GB     10
mf_m-c2.2  2.2 GHz AMD 48-core "Magny-Cours"            64 GB     2
mf_ilg2.3  2.3 GHz AMD 32-core* "Interlagos"            128 GB    26
mf_nes2.8  2.8 GHz Intel 8/12-core "Nehalem/Westmere"   12-24 GB  121
mf_i-b2.8  2.8 GHz Intel 20-core "Ivy Bridge"           128 GB    12

*mf_ilg2.3 "cores" are modules, each with two integer units and a shared floating-point unit

Once your job is ready to go, type "sbatch run_slurm" and wait patiently for your results to come back.