IPM Installed

IPM is a popular MPI profiling library.  To use IPM and create an HTML-formatted report:

% module load openmpi/1.8_adv
% mpicc testfile.c -L/usr/local/pkg/ipm/version/lib -Wl,-rpath=/usr/local/pkg/ipm/version/lib -lipm
% mpirun -np 4 ./a.out	# Should create a .xml file of performance data
% ipm_parse -full -html *xml

IPM is installed only for openmpi/1.8_adv at present.  Other versions can be made available on request.
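The build line above assumes a source file called testfile.c; as a sketch (this program is our own illustration, not part of IPM), a minimal MPI program that gives the profiler some communication to record might look like:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* A small reduction so IPM has an MPI call to profile. */
    int local = rank + 1, total = 0;
    MPI_Reduce(&local, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("Sum over %d ranks = %d\n", size, total);

    MPI_Finalize();
    return 0;
}
```

Compile it with the mpicc line shown above, run it under mpirun, and the IPM .xml file should appear in the working directory, ready for ipm_parse.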

Ansys License Server Outage

There will be an outage of the Ansys license server from Friday night, 16/10/2015, through to Monday morning, 19/10/2015.  No Ansys programs (mainly Fluent and CFX) will be available during this time.



Slurm Installed

The SLURM scheduler is now installed on our demo HPC machines: hp1, hp2 and hp3.  A good introduction to SLURM is available online, but bear in mind that our installation is smaller and much simpler.  Here is a SLURM script to run an OpenMP job on hp1, which has 56 cores:

#!/bin/bash
#SBATCH --job-name=OMP_simulate
#SBATCH --output=slurm.out
#SBATCH --error=slurm.err
#SBATCH --partition=bigdata
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=54

export OMP_NUM_THREADS=$SLURM_NTASKS_PER_NODE	# match thread count to the allocation
./my_omp_program				# replace with your own executable

UC HPC Blog Created

This is the blog that will be used for all UC HPC (formerly BlueFern) blog posts.