Enhancing long-range dispersion interaction functionality of CASTEP

eCSE10-010

Key Personnel

PI/Co-I: Dominik B. Jochym - Rutherford Appleton Laboratory; Dawn L. Geatches - STFC; Matt I. J. Probert and Phil J. Hasnip - University of York; Ian Rosbottom - University of Leeds; Anthony M. Reilly - Dublin City University

Technical: Peter Byrne - University of York

Relevant Documents

eCSE Technical Report: Enhancing long-range dispersion interaction functionality of CASTEP

Project summary

Methods for performing first-principles simulations of materials (i.e. those that solve the quantum-mechanical Schrödinger equation via parameter-free approximations) have had a profound and pervasive impact on science and technology, spreading from physics, chemistry and materials science to diverse areas including electronics, geology and medicine. Methods based on density functional theory (DFT) have led the way in this success, offering a favourable balance of computational cost and accuracy compared to competing methods. Nevertheless, the common approximations used in DFT neglect dispersion forces, such as van der Waals interactions, with the result that several important classes of materials are described poorly. Fortunately, van der Waals forces can be added back into a DFT calculation by using a semi-empirical dispersion correction (SEDC), and several schemes have been published, including those of S. Grimme, and of A. Tkatchenko and M. Scheffler. Amongst the most sophisticated of these is the many-body dispersion (MBD) method of Ambrosetti, Reilly, DiStasio & Tkatchenko [JCP 140, 18A508 (2014)].
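As a point of reference (schematically only; the damping function and the way the dispersion coefficients are obtained differ between schemes), pairwise SEDC corrections of this type take the form

    E_\mathrm{disp} = -\frac{1}{2} \sum_{A \neq B} f_\mathrm{damp}(R_{AB}) \, \frac{C_6^{AB}}{R_{AB}^{6}}

where R_{AB} is the distance between atoms A and B, C_6^{AB} is a pairwise dispersion coefficient, and f_damp switches the correction off smoothly at short range, where the underlying DFT functional already performs well.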

CASTEP is a DFT computer program developed in the UK and is commonly used on ARCHER to model periodic systems of up to a few thousand atoms per unit cell, using up to a few thousand cores (with larger systems scaling well to larger numbers of cores). CASTEP already had an implementation of several SEDC schemes, including the MBD scheme, but the methods were implemented using an explicit summation over atoms in real space, and large portions of the computation ran in serial. Furthermore, the MBD forces were computed using a finite-difference method.
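To illustrate why this is a bottleneck: a real-space pairwise sum amounts to an explicit double loop over atom pairs, whose cost grows as N^2 and which gains nothing from additional cores if run in serial. The following is a minimal Python sketch of the idea; the function and parameter names are illustrative, not CASTEP's actual routines, and a periodic code must additionally sum over lattice images.

    import numpy as np

    def pairwise_dispersion_energy(positions, c6, r0=3.0, d=20.0):
        """Illustrative O(N^2) real-space pairwise dispersion sum.

        positions : (N, 3) array of Cartesian atomic coordinates
        c6        : (N, N) array of pairwise C6 coefficients
        r0, d     : damping-function parameters (illustrative values)
        """
        n = len(positions)
        energy = 0.0
        for a in range(n):                 # explicit double loop over atoms:
            for b in range(a + 1, n):      # work grows as N^2
                r = np.linalg.norm(positions[a] - positions[b])
                fdamp = 1.0 / (1.0 + np.exp(-d * (r / r0 - 1.0)))  # Fermi-type damping
                energy -= fdamp * c6[a, b] / r**6
        return energy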

In this project the SEDC schemes were refactored to improve their sustainability, enabling alternative methods to be implemented much more straightforwardly. By linking against a new software library, the newer D3 and D4 schemes of Grimme were added to CASTEP. The MBD scheme was also refactored and re-implemented, substantially improving its performance, accuracy and parallel scaling (often by more than an order of magnitude).
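As a rough illustration of the kind of restructuring involved (a schematic in Python, not CASTEP's Fortran source; all names here are invented), giving every scheme a common interface means a new correction can be added by writing and registering a single routine:

    # Hypothetical registry: every SEDC scheme exposes the same interface.
    SEDC_SCHEMES = {}

    def register(name):
        def wrap(fn):
            SEDC_SCHEMES[name] = fn
            return fn
        return wrap

    @register("pairwise")
    def pairwise_scheme(positions, params):
        return 0.0  # placeholder: e.g. the C6/R^6 sum sketched above

    @register("mbd")
    def mbd_scheme(positions, params):
        return 0.0  # placeholder: e.g. a coupled-oscillator model

    def dispersion_energy(scheme, positions, params):
        # Single call site; adding a scheme needs no changes here.
        return SEDC_SCHEMES[scheme](positions, params)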

CASTEP has a large user base: currently over 850 academic research groups and many companies worldwide. Within the UK, it is used frequently by the UKCP and Materials Chemistry HEC consortia, and by members of CCP9 and CCP-NC for crystalline NMR simulations. This user base spans a wide range of materials research in Physics, Chemistry, Materials Science, Earth Sciences and Engineering departments. The success of this project means that CASTEP can now be used more efficiently than ever before, and that calculations can scale to larger core counts for an even greater speed-up. The development of an extensible parallel model also reduces the need to tune input parameters, and provides a sustainable framework for improving the parallelism further in the future. The net result of this work is a new science capability: running larger system sizes in less time.

Achievement of objectives

The initial goal (WP1 and WP2.a) was to rewrite the old SEDC module, which relied heavily on module-level variables and was quite difficult to follow. We combined this with initial work from a previous eCSE project to implement the Ewald summation method, and this work is complete. It has already been merged into the main CASTEP codebase and will be a key feature in the next CASTEP release, due in November.
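The problem with module-level state, and the shape of the fix, can be shown with a small schematic (Python here for brevity; CASTEP itself is Fortran, and the names below are invented):

    from dataclasses import dataclass

    # Before: parameters live in module-level variables, mutated from many
    # places, so a routine's real inputs are hidden from its signature.
    _c6_table = None
    _damping = None

    def sedc_energy_old(positions):
        global _c6_table, _damping   # hidden coupling; hard to test in isolation
        ...

    # After: the state is gathered into one object and passed explicitly,
    # so every dependency is visible at the call site.
    @dataclass
    class SedcParams:
        c6_table: object
        damping: float

    def sedc_energy(positions, params: SedcParams):
        ...   # uses only params.c6_table and params.damping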

The second goal (WP2.b) was to increase the modularity of the code and, more generally, to make it easier to implement new functionality. This was demonstrated by adding the D3/D4 library of Grimme et al. The inclusion of this functionality is controlled by a compile flag in the CASTEP Makefile and can be activated in the academic release of CASTEP.

The final goals centred on improving the efficiency of the many-body dispersion method (WP4) and implementing analytic forces (WP3). Initially we had planned to optimise the parallel matrix diagonalisation, but a reciprocal-space formulation of the MBD algorithm was developed recently, and this was implemented instead because the expected parallel gains were much larger. This has been integrated into the next CASTEP release.
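For context, in the cited scheme of Ambrosetti et al. the MBD energy is obtained from the eigenvalues \lambda_p of a 3N x 3N coupled-oscillator (dipole) matrix:

    E_\mathrm{MBD} = \frac{1}{2} \sum_{p=1}^{3N} \sqrt{\lambda_p} - \frac{3}{2} \sum_{A=1}^{N} \omega_A

where the \omega_A are the (screened) atomic characteristic frequencies, so the matrix diagonalisation dominates the cost. Roughly speaking, in a reciprocal-space formulation of a periodic system one such matrix is diagonalised per k-point, with N the number of atoms in the unit cell, rather than one much larger matrix for a real-space supercell; this is why the expected parallel gains were larger.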

There was an additional task (WP5.a) to compute lattice energies for a range of organic molecular crystals; this work is being undertaken by project partners from the ADDoPT consortium.

Summary of the software

CASTEP (www.castep.org) is a UK-based, state-of-the-art implementation of DFT and a flagship code for UK HPC. It was rewritten in 1999-2001 according to sound software engineering principles and with HPC in mind. It is available as a system-installed binary on ARCHER, and version 19.1, which includes the developments from this project, will be available following its release.

CASTEP and its source code are available under a free-of-charge licence to all UK academics, from: ccpforge.cse.rl.ac.uk/gf/project/castep/.

Pre-compiled CASTEP programs are marketed worldwide by BIOVIA Inc. along with their GUI; for more information, see castep.org/CASTEP/GettingCASTEP.

The refactored SEDC code has already been merged into the codebase for CASTEP 19.1, to be released later this year. The MBD improvements will be merged soon and are also expected to be released in CASTEP 19.1. Both sets of developments will be fully integrated into the main codebase, and hence available to all academic and commercial users worldwide.
