Optimisation of the EPOCH laser-plasma simulation code

eCSE03-1

Key Personnel

PI/Co-I: Prof Tony Arber, Dr Keith Bennett - University of Warwick

Technical: Dr Michael Bareford - EPCC, University of Edinburgh

Relevant Documents

eCSE Technical Report: minEPOCH3D Performance and Load Balancing on Cray XC30

Project summary

EPOCH (Extendable PIC Open Collaboration) is a mature laser-plasma MPI simulation code with a large international user base. The code uses particle-in-cell (PIC) techniques: Monte-Carlo-sampled particles are moved through a fixed grid on which the field variables are updated, so the core scheme is dominated by particle pushes and by particle-to-field and field-to-particle interpolations. The purpose of this eCSE project was both to improve the performance of EPOCH and to future-proof the code for next-generation multi-core architectures.
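
To illustrate the core scheme, here is a minimal Fortran sketch of the per-particle update: a field-to-particle gather by linear interpolation, followed by the particle push. The routine and its names are hypothetical, a 1D electrostatic toy rather than EPOCH's 3D electromagnetic scheme, and the particle-to-grid deposition step and boundary handling are omitted.

! Hypothetical 1D sketch of the PIC inner loop, not EPOCH's actual code.
subroutine push_particles(np, x, v, q_over_m, nx, dx, ex, dt)
  implicit none
  integer, intent(in)    :: np, nx
  real(8), intent(in)    :: q_over_m, dx, dt
  real(8), intent(in)    :: ex(0:nx)      ! electric field on grid nodes
  real(8), intent(inout) :: x(np), v(np)  ! particle positions and velocities
  integer :: ip, i
  real(8) :: w, e_part
  do ip = 1, np
    i = int(x(ip) / dx)                   ! index of the cell holding the particle
    w = x(ip) / dx - real(i, 8)           ! fractional position within the cell
    e_part = (1.0d0 - w) * ex(i) + w * ex(i + 1)  ! field-to-particle gather
    v(ip) = v(ip) + q_over_m * e_part * dt        ! velocity update (the push)
    x(ip) = x(ip) + v(ip) * dt                    ! position update
  end do
end subroutine push_particles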

During this project, the decision was taken to strip the 3D version of EPOCH down to its essentials, removing the code responsible for physical processes such as ionisation, collisions and particle production. This stripped-down version of EPOCH3D was given the name minEPOCH.

The project investigated the performance improvements that could be gained by changing how particles are stored in memory: specifically, whether better performance could be achieved by storing particles within arrays - either as an array of particle structures (AoS) or as separate arrays of particle properties (SoA) - instead of in linked lists (LL).
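
The three layouts can be sketched as Fortran type definitions (illustrative declarations only, not minEPOCH's actual source):

! Hypothetical sketches of the three particle storage layouts.
module particle_layouts
  implicit none

  ! Linked list (LL): one heap-allocated node per particle;
  ! traversal follows pointers, which scatters memory accesses.
  type :: ll_particle
    real(8) :: pos(3), mom(3), weight
    type(ll_particle), pointer :: next => null()
  end type ll_particle

  ! Array of Structures (AoS): one contiguous array of records,
  ! so all properties of a given particle sit together in memory.
  type :: aos_particle
    real(8) :: pos(3), mom(3), weight
  end type aos_particle

  ! Structure of Arrays (SoA): one contiguous array per property,
  ! so the push loop streams through unit-stride arrays and is
  ! easier for the compiler to vectorise.
  type :: soa_particles
    real(8), allocatable :: pos_x(:), pos_y(:), pos_z(:)
    real(8), allocatable :: mom_x(:), mom_y(:), mom_z(:)
    real(8), allocatable :: weight(:)
  end type soa_particles
end module particle_layouts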

This project also investigated a new scheme for balancing the workload within EPOCH. The default scheme splits the domain such that each MPI process is assigned an identically sized section of grid space; this approach, although suitable for simulations with a uniform particle distribution, performs poorly when particle concentrations vary markedly across the domain. For this reason, an alternative scheme was implemented and tested that balances the number of particles per process rather than the sub-grid volume.
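
The idea behind particle-based balancing can be shown with a simplified 1D sketch (hypothetical code, not minEPOCH's implementation, which balances along a space-filling curve through the 3D grid): given per-cell particle counts, assign each rank a contiguous run of cells holding roughly an equal share of the particles.

! Hypothetical 1D sketch: split ncells cells across nproc ranks so
! that each rank owns roughly the same number of particles.
subroutine balance_by_particles(ncells, counts, nproc, last_cell)
  implicit none
  integer, intent(in)  :: ncells, nproc
  integer, intent(in)  :: counts(ncells)    ! particles per cell
  integer, intent(out) :: last_cell(nproc)  ! last cell owned by each rank
  integer :: ic, rank, running, total, cum_target
  total = sum(counts)
  rank = 1
  running = 0
  do ic = 1, ncells
    running = running + counts(ic)
    cum_target = (rank * total) / nproc     ! cumulative share for this rank
    if (running >= cum_target .and. rank < nproc) then
      last_cell(rank) = ic                  ! close this rank's run of cells
      rank = rank + 1
    end if
  end do
  last_cell(nproc) = ncells                 ! last rank takes the remainder
end subroutine balance_by_particles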

The SoA layout with particle sorting or with Intel vectorisation runs faster than the linked-list version, with speed-ups of between 1.03 and 1.44 depending on compiler and grid resolution; the smaller speed-ups typically occur at the higher grid resolutions. However, SoA with particle balancing showed strong performance gains for the higher of the two grid resolutions tested, with the speed-up improving as the number of compute nodes was increased.

The implementation of particle balancing now allows EPOCH to be used for a wider variety of simulations featuring non-uniform particle distributions. However, there are a number of improvements that could be made to this work - for further details, see Section VII: Conclusions and Further Work in the technical report.

Summary of the software

The EPOCH code has been developed at the University of Warwick as the basis for a standard, extendable PIC code. EPOCH is written in standard Fortran 95 with MPI and is open source, enabling it to be easily modified for specific use cases. The code has seen widespread adoption, with several hundred registered users, and has become one of the standard PIC codes used by the plasma physics research community.

The various versions of minEPOCH, the bare-bones version of the code used for this project, are held in a GitLab repository hosted at the University of Warwick; see git@cfsa-pmw.warwick.ac.uk:bareford/minEPOCH.git

The different versions are stored on distinct branches:

  • master: the linked list version of minEPOCH3D;
  • AoS: Array of Structures;
  • SoA: Structure of Arrays;
  • AoSoA: Array of Structures of Arrays;
  • SoA_vec: the version of SoA with the vectorised particle push loop;
  • SoA_sfc: the version of SoA that incorporates particle load balancing through the use of a Hilbert space-filling curve (sfc); a sketch of the Hilbert index mapping follows this list.
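
For illustration, a 2D form of the classic iterative Hilbert index mapping is sketched below (hypothetical Fortran, not taken from minEPOCH, whose curve threads the 3D grid). Ordering sub-grid blocks by this index keeps blocks that are adjacent along the curve close in space, so each rank can be given a contiguous, spatially compact run of blocks.

! Hypothetical sketch: map block coordinates (x, y) on an n-by-n grid
! (n a power of two) to a distance d along the 2D Hilbert curve.
module hilbert_sketch
  implicit none
contains
  integer function xy2d(n, x_in, y_in) result(d)
    integer, intent(in) :: n, x_in, y_in
    integer :: x, y, rx, ry, s
    x = x_in
    y = y_in
    d = 0
    s = n / 2
    do while (s > 0)
      rx = merge(1, 0, iand(x, s) > 0)
      ry = merge(1, 0, iand(y, s) > 0)
      d = d + s * s * ieor(3 * rx, ry)
      call rotate(n, x, y, rx, ry)     ! rotate/flip the quadrant
      s = s / 2
    end do
  end function xy2d

  subroutine rotate(n, x, y, rx, ry)
    integer, intent(in)    :: n, rx, ry
    integer, intent(inout) :: x, y
    integer :: t
    if (ry == 0) then
      if (rx == 1) then
        x = n - 1 - x
        y = n - 1 - y
      end if
      t = x                            ! swap x and y
      x = y
      y = t
    end if
  end subroutine rotate
end module hilbert_sketch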

A particular branch can be checked out as follows:

git clone --recursive -b AoS gitlab:bareford/minEPOCH.git

The above command uses a host alias called gitlab, which is specified within the ~/.ssh/config file:

Host gitlab
User git
HostName cfsa-pmw.warwick.ac.uk
IdentityFile ~/.ssh/gitlab_rsa

If you wish to create an account on the University of Warwick GitLab server, please contact Keith Bennett at k.bennett@warwick.ac.uk.
