The ARCHER Service is now closed and has been superseded by ARCHER2.

Compiling Code_Saturne 3.0.3 on ARCHER (Cray XC30)

Source Modification

A call to getpwuid() was removed from the supplied source package. This works around a crash that occurs when the function is used on ARCHER compute nodes, where the nscd daemon is unavailable (see https://sourceware.org/bugzilla/show_bug.cgi?id=12491).

File diff follows:

$ diff -r code_saturne-3.0.3/src/base/cs_system_info.c saturne-3.0.3-archer/src/base/cs_system_info.c
269c269
<   pwd_user = getpwuid(geteuid());
---
>   pwd_user = NULL; /* getpwuid(geteuid()); */

Modify lib/python2.7/site-packages/code_saturne/cs_exec_environment.py to cater for the aprun job launcher. In the __init_mpich2_3__() routine, add an extra elif branch to the selection statement starting at line 1381, so that it reads as follows:

if launcher_base[:7] == 'mpiexec':
    self.mpiexec_n = ' -n '
elif launcher_base[:6] == 'mpirun':
    self.mpiexec_n = ' -np '
elif launcher_base[:5] == 'aprun':
    self.mpiexec_n = ' -n '

Change the file lib/python2.7/site-packages/code_saturne/cs_exec_environment.py as follows to allow running on the MOM/compute nodes:

$ diff cs_exec_environment.py.bak cs_exec_environment.py
787a788,790
>         # DSM EPCC Edit 13-05-2015 for Q521721
>         self.type = 'MPICH2'
>
931,934c934,942
<         launcher_names = ['mpiexec.mpich', 'mpiexec.mpich2', 'mpiexec',
<                           'mpiexec.hydra', 'mpiexec.mpd', 'mpiexec.smpd',
<                           'mpiexec.gforker', 'mpiexec.remshell',
<                           'mpirun.mpich2', 'mpirun.mpich2', 'mpirun']
---
>         #launcher_names = ['mpiexec.mpich', 'mpiexec.mpich2', 'mpiexec',
>         #                  'mpiexec.hydra', 'mpiexec.mpd', 'mpiexec.smpd',
>         #                  'mpiexec.gforker', 'mpiexec.remshell',
>         #                  'mpirun.mpich2', 'mpirun.mpich2', 'mpirun']
>
>         # DSM EPCC Edit 13-05-2015 for Q521721:
>         # mom nodes have a non-functional mpiexec on the PATH that breaks this detection loop
>         # hack to work around this
>         launcher_names = []
956a965,968
>         # DSM EPCC Edit 13-05-2015 for Q521721
>         self.mpiexec = 'aprun'
>         basename = 'aprun'
>
1000,1005c1012,1021
<         if launcher_base[:7] == 'mpiexec':
<             self.mpmd = MPI_MPMD_mpiexec | MPI_MPMD_configfile | MPI_MPMD_script
<             self.mpiexec_n = ' -n '
<         elif launcher_base[:6] == 'mpirun':
<             self.mpiexec_n = ' -np '
<             self.mpmd = MPI_MPMD_script
---
>         # DSM EPCC Edit 13-05-2015 for Q521721: hardcode aprun to stop mpiexec freezing on mom node
>         self.mpiexec_n = ' -n '
>         self.mpmd = MPI_MPMD_mpiexec | MPI_MPMD_configfile | MPI_MPMD_script
>
> #        if launcher_base[:7] == 'mpiexec':
> #            self.mpmd = MPI_MPMD_mpiexec | MPI_MPMD_configfile | MPI_MPMD_script
> #            self.mpiexec_n = ' -n '
> #        elif launcher_base[:6] == 'mpirun':
> #            self.mpiexec_n = ' -np '
> #            self.mpmd = MPI_MPMD_script

Environment

Edit the InstallHPC.sh script located in the directory containing SATURNE_3.0.3. Change --prefix= to point to the required installation directory, e.g.

--prefix=/work/y07/y07/cse/code-saturne/3.0.3/GNU
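If preferred, the prefix in the configure line can be changed from the command line with a sed one-liner along these lines (the installation path shown is just a placeholder; substitute your own):

sed -i 's|--prefix=[^ ]*|--prefix=/path/to/your/install|' InstallHPC.sh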

Build with the GNU programming environment:

module swap PrgEnv-cray PrgEnv-gnu
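As an optional sanity check that the GNU environment is active, the Cray compiler wrappers should now report the GNU compilers:

cc --version    # should report gcc once PrgEnv-gnu is loaded
ftn --version   # should report GNU Fortran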

Compilation

(The contents of the InstallHPC.sh script are listed in full at the bottom of this page.)

./InstallHPC.sh
cd SATURNE_3.0.3/saturne-3.0.3.build/arch/Linux
make
make install

Testing

After building, run the test suite from the build directory:

make check

To rerun the tests against the installed version:

make installcheck
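To run a case with the installed build on the compute nodes, a PBS job script along the following lines can be used; with the cs_exec_environment.py edits above, the Code_Saturne run scripts launch the solver via aprun from the MOM node. The budget code, job size, walltime and case directory below are placeholders, and the PATH assumes the --prefix used above:

#!/bin/bash --login
#PBS -N cs_example
#PBS -l select=1
#PBS -l walltime=00:20:00
#PBS -A budget_code            # replace with your ARCHER budget code

cd $PBS_O_WORKDIR

# Make the GNU build of Code_Saturne visible (assumes the --prefix above)
export PATH=/work/y07/y07/cse/code-saturne/3.0.3/GNU/bin:$PATH

# The modified cs_exec_environment.py launches the solver with aprun,
# so the case can be started directly from the MOM node.
cd MY_CASE/SCRIPTS             # placeholder case directory
./runcase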

Contents of InstallHPC.sh

#!/bin/sh

#####################
## Which machine ? ##
#####################

UNAME_N=`uname -n`

case ${UNAME_N} in
eslogin*) MACHINE=CRAY ;;
esac

echo $MACHINE, ${UNAME_N}

#################################
## Which version of the code ? ##
#################################

CODE_VERSION=3.0.3
SATURNE=SATURNE_${CODE_VERSION}
KER_VERSION=${CODE_VERSION}
KERNAME=saturne-${KER_VERSION}

################################################
## Installation PATH in the current directory ##
################################################

INSTALLPATH=`pwd`
INSTALLPATH=${INSTALLPATH}

echo $INSTALLPATH

#####################################
## Environment variables and PATHS ##
#####################################

NOM_ARCH=`uname -s`

export NOM_ARCH

##############
## Cleaning ##
##############

rm -rf $INSTALLPATH/$SATURNE/$KERNAME/arch/*
rm -rf $INSTALLPATH/$SATURNE/$KERNAME.build

#########################
## Kernel Installation ##
#########################

KERSRC=$INSTALLPATH/$SATURNE/$KERNAME
KERBUILD=$INSTALLPATH/$SATURNE/$KERNAME.build/arch/$NOM_ARCH
KEROPT=$KERSRC/arch/$NOM_ARCH

METISPATH=$INSTALLPATH/../SOFTWARE/gcc/parmetis-4.0.3
SCOTCHPATH=$INSTALLPATH/../SOFTWARE/gcc/scotch_6.0.0
CGNSPATH=$INSTALLPATH/../SOFTWARE/gcc/cgnslib_3.1.4
LIBXML2=$INSTALLPATH/../SOFTWARE/gcc/libxml2-2.9.0

export KEROPT

mkdir -p $KERBUILD
cd $KERBUILD

if [ "$MACHINE" = "CRAY" ] ; then
   $KERSRC/configure --host=x86_64-unknown-linux-gnu \
      --disable-shared --without-modules --disable-rpath --disable-dlloader \
      --disable-gui --disable-sockets \
      --with-libxml2=$LIBXML2 --with-libxml2-lib=$LIBXML2/lib --with-libxml2-include=$LIBXML2/include \
      --enable-long-gnum --disable-mei \
      --with-metis=$METISPATH --with-metis-include=$METISPATH/include --with-metis-lib=$METISPATH/lib \
      --with-scotch=$SCOTCHPATH --with-scotch-include=$SCOTCHPATH/include --with-scotch-lib=$SCOTCHPATH/lib \
      --with-cgns=$CGNSPATH --with-cgns-include=$CGNSPATH/include --with-cgns-lib=$CGNSPATH/lib \
      --prefix=/work/y07/y07/cse/code-saturne/3.0.3/GNU \
      CC="cc" CFLAGS=" " FC="ftn" FCFLAGS=" " \
      LDFLAGS=" -Wl,-z -Wl,muldefs " LIBS=" /usr/lib64/libxml2.a -ldl -lz " FCLIBS=" "
fi


###make
###make install
###make distclean

cd $INSTALLPATH
###rm -rf $INSTALLPATH/Kernel/ncs-${KER_VERSION}.build
