Extending CPL library to enable HPC simulations of multi-phase geomechanics problems

eCSE08-03

Key Personnel

PI/Co-Is: Dr. Catherine O'Sullivan - Imperial College London

Technical: Edward Smith - Imperial College London

Relevant documents

eCSE Technical Report: Extending CPL library to enable HPC simulations of multi-phase geomechanics problems

Project summary

Project description

The project has enabled the simulation of fluid-particle systems by combining the molecular dynamics code LAMMPS and the computational fluid dynamics code OpenFOAM on ARCHER. The resulting system will be of use to engineers and researchers in geotechnical (civil) engineering, geology, and petroleum engineering.

The core software, CPL library, manages the exchange of messages between the two codes. Prior to this project, CPL library had been used to couple LAMMPS and OpenFOAM where the two domains meet at an interface, as illustrated schematically in Figure 1(a). Within this eCSE-funded project, this capability was extended to allow the fully overlapping domains needed for granular mechanics, shown in Figure 1(b). Note that the extended code operates at a much larger scale: rather than the molecules simulated previously, it considers particles in a size range of approximately 100 microns to 5 mm. CPL library had not previously been used on ARCHER (although it had been used on HECToR), so to demonstrate deployment on ARCHER, the scaling of CPL library was tested on up to 10,000 cores.
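As a picture of the pattern CPL library manages, the sketch below splits one MPI job into a CFD realm and an MD/DEM realm and exchanges velocity and force data between paired ranks each coupling step. This is illustrative mpi4py code, not CPL library's actual API; the realm split, array sizes and the stand-in drag response are assumptions made purely for the example.

    # Illustrative coupling pattern (NOT CPL library's actual API):
    # split one MPI job into two realms and exchange overlap-region
    # fields between paired ranks. Run with an even number of ranks,
    # e.g. mpiexec -n 4 python couple_sketch.py
    from mpi4py import MPI
    import numpy as np

    CFD_REALM, MD_REALM = 0, 1

    world = MPI.COMM_WORLD
    # First half of the ranks play the CFD code, the rest the DEM code
    # (a real coupled run launches two separate executables instead).
    realm = CFD_REALM if world.rank < world.size // 2 else MD_REALM
    realm_comm = world.Split(color=realm, key=world.rank)

    # Inter-communicator between the two realms; the leaders are rank 0
    # of each realm (world rank 0 and world rank size//2 respectively).
    remote_leader = world.size // 2 if realm == CFD_REALM else 0
    inter = realm_comm.Create_intercomm(0, world, remote_leader, tag=0)

    ncells = 8  # overlap-region cells per rank (illustrative)
    for step in range(5):
        if realm == CFD_REALM:
            # CFD side: send velocity field, receive reaction forces.
            u = np.random.rand(ncells, 3)
            inter.Send(u, dest=realm_comm.rank)
            f = np.empty((ncells, 3))
            inter.Recv(f, source=realm_comm.rank)
        else:
            # DEM side: receive velocities, send drag forces back.
            u = np.empty((ncells, 3))
            inter.Recv(u, source=realm_comm.rank)
            f = -0.1 * u  # stand-in drag response
            inter.Send(f, dest=realm_comm.rank)

    inter.Free()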

To allow easy simulation of coupled granular systems, additional functionality has been implemented. Most important was the incorporation of a number of drag models to enable the force generated by the (continuum) fluid flow to be applied to the individual grains. A framework allowing the user to include new, customised drag force models is also provided; a sketch of this structure is given below. All the new models were unit tested, which was achieved by creating a self-contained class for forces and fields; this process is automated using Travis CI. Extensive documentation has been developed for both new and experienced users, with deployment supported through installation instructions, ARCHER scripts and an Anaconda package.
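As an illustration of that hook-based structure, the sketch below shows a self-contained drag-model class with a single force() hook and the kind of unit test that can be automated in CI. The class names are hypothetical and the Stokes expression is just one standard drag law; this is not the project's actual class hierarchy.

    # Hypothetical sketch of a hook-based drag-model framework: each
    # model is a self-contained class overriding one force() hook, so
    # new models can be added and unit tested in isolation.
    import numpy as np

    class DragModel:
        """Base class: user-defined models override the force() hook."""
        def force(self, u_fluid, v_particle, diameter):
            raise NotImplementedError

    class StokesDrag(DragModel):
        """Stokes drag on a sphere: F = 3*pi*mu*d*(u_fluid - v_particle),
        valid at low particle Reynolds number."""
        def __init__(self, viscosity):
            self.mu = viscosity

        def force(self, u_fluid, v_particle, diameter):
            return 3.0 * np.pi * self.mu * diameter * (u_fluid - v_particle)

    def test_stokes_drag_on_stationary_particle():
        """Self-contained unit test of the kind run automatically in CI."""
        model = StokesDrag(viscosity=1e-3)              # water, Pa s
        u = np.array([0.01, 0.0, 0.0])                  # fluid velocity, m/s
        f = model.force(u, np.zeros(3), diameter=1e-4)  # 100 micron grain
        expected = 3.0 * np.pi * 1e-3 * 1e-4 * 0.01
        assert np.allclose(f, [expected, 0.0, 0.0])

    test_stokes_drag_on_stationary_particle()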

Achievement of objectives

  1. Port CPL library to ARCHER and demonstrate coupling between existing versions of OpenFOAM and LAMMPS on the platform; success metrics include:

    (i) Demonstrate the automated verification cases previously tested on Tier 2 systems to check the correctness of data exchange and demonstrate the validity of coupled simulations.

    The testing framework passes as expected on ARCHER.

    (ii) Scaling tests for the coupled code system using up to 10,000 cores to ensure scalability of both codes.

    Testing of the core coupler library has been performed, showing greater than 70% efficiency on up to 10,000 cores (a worked example of this arithmetic is given after this list).

  2. Extend the capabilities of CPL library to couple MD and CFD codes with overlapping, rather than abutting, domains. Success measures are:

    (i) A suite of topological and communications tests to allow automated validation of both codes independently on ARCHER, then CPL library itself and finally the complete coupled code.

    A range of test cases to assess communication has been run on ARCHER, and a general framework for batch running has been developed.

    (ii) Accurate simulation of carefully constrained granular mechanics validation cases, including calculating the terminal velocity of a particle falling in a fluid, and reproducing the displacement of a column of n > 100 spheres following Suzuki et al. (2007).

    A range of different cases was chosen which are more fundamental, target coupling-specific problems and highlight specific features, including a single particle under gravity, a single particle on a wall, flow past an FCC lattice, hydrostatic flow, Couette flow, Couette flow with a drag constraint, a partially obstructed domain and a stationary sphere, among others. The priority here has been to develop a framework which can be easily extended by users (using a clear hook-based structure with documentation), with the aim of ensuring that all development proceeds by building up this body of validation cases. A worked sketch of the terminal-velocity check is given after this list.

  3. Develop documentation of the coupled simulation; success metrics include:

    (i) A novice user is able to download and build in a single command on ARCHER (to be tested by PhD students at Imperial and Edinburgh; Bernhardt (Arkansas) and Huang (Tongji, China) will be involved in remote testing).

    Rather than a PhD student, a postdoc with limited prior Linux experience, Dr. Adnan Sufian, agreed to collaborate on this, and his feedback has helped shape the development of all the documentation, together with discussions with a number of remote users (including Dr. Otsubo at the University of Tokyo).

    (ii) Detailed instructions on the API, with example usage for every function, so more advanced users can easily develop their own interface to CPL library.

    The description of the interface is available on the website. The most likely extensions will involve the drag expression that calculates the forces passed between the two codes. Focussing on this need for future users, a range of examples of increasing complexity, CPL force objects, and instructions on how to extend them are documented on the wiki.

  4. Scaling, performance checking and optimisation

    (i) Tune and optimise through input, compiler and run settings

    A new framework to allow scaled simulations, SimWrapPy, has been designed, with various examples. Given the wide range of possible simulations enabled by the coupled software, general tuning and optimisation is not possible, so a tool to facilitate local tuning was designed and made open source.

    (ii) Load balancing and dynamic MPI spawning

    Dynamic spawning is not supported on Cray systems, but serial tests are used to calibrate load balancing before running, and MPI_Wait times are monitored and can be minimised to balance the two coupled codes.
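Two of the measures above can be made concrete with short sketches. First, the parallel efficiency quoted in objective 1(ii): assuming the usual strong-scaling definition, efficiency is the achieved speed-up over a baseline run divided by the ideal speed-up. The timings below are invented purely to show the arithmetic.

    # Strong-scaling parallel efficiency: E(N) = (T_ref * N_ref) / (T_N * N).
    # Timings are made up for illustration only.
    def efficiency(t_ref, n_ref, t_n, n):
        """Efficiency of an n-core run against an n_ref-core baseline."""
        return (t_ref * n_ref) / (t_n * n)

    # Hypothetical: 1000 s on 100 cores, 14 s on 10,000 cores -> 71%.
    print(f"{efficiency(1000.0, 100, 14.0, 10_000):.0%}")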
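Second, the terminal-velocity validation case in objective 2(ii) has a closed-form target in the Stokes regime, which is what makes it a convenient check for the coupled code. The material parameters below are illustrative, not those used in the project.

    # Analytical Stokes terminal velocity of a settling sphere:
    #     v_t = (rho_p - rho_f) * g * d**2 / (18 * mu)
    # A coupled DEM-CFD run of a single falling particle should
    # converge to this value.
    rho_p = 2650.0   # particle density, kg/m^3 (quartz sand)
    rho_f = 1000.0   # fluid density, kg/m^3 (water)
    mu    = 1e-3     # dynamic viscosity, Pa s
    g     = 9.81     # gravity, m/s^2
    d     = 100e-6   # particle diameter, m (lower end of the size range)

    v_t = (rho_p - rho_f) * g * d**2 / (18.0 * mu)
    print(f"Stokes terminal velocity: {v_t:.2e} m/s")   # ~9.0e-03 m/s

    # Sanity check: Stokes drag assumes particle Reynolds number << 1.
    Re = rho_f * v_t * d / mu
    print(f"Particle Reynolds number: {Re:.2f}")        # ~0.90, borderline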

Summary of the Software

All the developed software is open source under the GNU GPL v3 licence, available in a public repository, with documentation and a range of quick-start examples provided on the project website.

A version of LAMMPS and OpenFOAM must also be downloaded, with the interface to CPL library provided by two APPS folders, one for LAMMPS and one for OpenFOAM.

These are stored in a separate repository from CPL library itself, which is designed to couple any pair of codes. To deploy a coupled OpenFOAM-GranLAMMPS instance on ARCHER (or locally), a script is available in the utilities folder on the GitHub page.

An Anaconda deployment is also available to allow the code to be used with a one-click install, and work is currently underway to allow this to work on ARCHER using MPI ABI compatibility. In addition, an ecosystem of helper tools has been developed and tailored to this project, including:

  1. A range of tools to visualise the output of OpenFOAM, GranLAMMPS and the coupled simulation.
  2. A framework which allows the batch running of LAMMPS and OpenFOAM, both locally and via PBS on ARCHER (a minimal sketch of this pattern is given below).
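The sketch below shows the general shape of such a batch-running helper: generate a PBS script per run and submit it with qsub, launching the two coupled executables in MPMD mode with aprun. This is not SimWrapPy's actual interface; the template, paths and executable names are assumptions made for illustration (ARCHER nodes had 24 cores).

    # Hypothetical batch-run helper (NOT SimWrapPy's actual interface):
    # write one PBS script per parameter value and submit it with qsub.
    import subprocess
    import textwrap
    from pathlib import Path

    PBS_TEMPLATE = textwrap.dedent("""\
        #!/bin/bash
        #PBS -N {name}
        #PBS -l select={nodes}
        #PBS -l walltime=01:00:00
        cd $PBS_O_WORKDIR
        # Launch the coupled pair in MPMD mode with aprun.
        aprun -n {cfd_procs} ./openfoam_case : -n {md_procs} ./lammps_case
        """)

    def submit(name, nodes, cfd_procs, md_procs, dry_run=True):
        """Write a PBS script for one coupled run and submit it."""
        script = Path(f"{name}.pbs")
        script.write_text(PBS_TEMPLATE.format(
            name=name, nodes=nodes, cfd_procs=cfd_procs, md_procs=md_procs))
        if dry_run:
            print(f"[dry run] would submit {script}")
        else:
            subprocess.run(["qsub", str(script)], check=True)

    # Sweep the DEM core count, one job per value (24 cores per node).
    for i, md_procs in enumerate([24, 48, 96]):
        submit(name=f"couple_{i}", nodes=(24 + md_procs) // 24,
               cfd_procs=24, md_procs=md_procs)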
