Delivering a step-change in the performance and functionality of the Fluidity shallow water solver

eCSE03-07

Key Personnel

PI/Co-I: Prof. Matthew D. Piggott, Dr. David A. Ham - Imperial College London

Technical: Dr. Christian T. Jacobs - Imperial College London

Relevant Documents

eCSE Technical Report: Delivering a step-change in performance and functionality to the Fluidity shallow water solver through code generation

Project summary

Fluidity (fluidity-project.org) is a fully-featured, open-source computational fluid dynamics (CFD) framework. It comprises several advanced numerical models based on the finite element method, as well as a number of novel numerical features (e.g. mesh adaptivity), making it suitable for multi-scale simulations. It is largely unique in its ability to also solve large-scale geophysical/oceanographic problems. Key examples include marine renewable energy, tsunami simulation and inundation, and palaeo-tidal simulations for hydrocarbon exploration.

The current Fluidity codebase comprises hand-written Fortran code to perform the finite element discretisation. Not only is this hand-written code potentially sub-optimal, but it also presents issues regarding maintainability and longevity: should one want to run Fluidity on a newer hardware architecture better suited to larger-scale problems in the future, the entire codebase may have to be re-written. Furthermore, the need for numerical modellers to be not only experts in their field of science but also well-versed in parallel programming and code optimisation is unsustainable in the long term.

This eCSE project delivers a step-change in the performance and functionality of the shallow water model within Fluidity, accomplished by using the Firedrake framework (firedrakeproject.org) for the automated solution of partial differential equations using code generation techniques. A key aim is to remove Fluidity's existing hand-written Fortran finite element discretisation code and instead generate it automatically from a higher-level model description, thereby hiding complexity through layers of abstraction. This allows the users of the resulting models to focus on the problem specification and the end results of simulations. The Firedrake project achieves all of this in a performance-portable manner, using the PyOP2 framework (github.com/OP2/PyOP2) to target and optimise the automatically generated code for a desired hardware architecture. Moreover, in contrast to traditional hand-written models such as Fluidity, the use of code generation techniques has been shown to deliver significantly enhanced performance, as well as improved code maintainability.
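To illustrate what such a higher-level model description looks like, the sketch below expresses a single backward Euler step of the linearised shallow water equations in Firedrake. It is a minimal, hypothetical example rather than the Firedrake-Fluids model itself: the mesh, function spaces, parameter values and solver options are all illustrative choices. The modeller writes only the weak form; the low-level assembly kernels are generated, optimised and compiled by Firedrake and PyOP2 when the solve is invoked.

from firedrake import *

# Illustrative problem setup (not the Firedrake-Fluids configuration):
# linearised shallow water on a unit square with a P2-P1 velocity/elevation pair.
mesh = UnitSquareMesh(64, 64)
V = VectorFunctionSpace(mesh, "CG", 2)   # velocity
Q = FunctionSpace(mesh, "CG", 1)         # free-surface elevation
W = V * Q

g = Constant(9.81)    # gravitational acceleration
H = Constant(10.0)    # mean water depth (illustrative value)
dt = Constant(0.01)   # time-step size (illustrative value)

# Initial state: fluid at rest with a Gaussian free-surface perturbation.
x, y = SpatialCoordinate(mesh)
u_old = Function(V)                      # zero-initialised velocity
eta_old = Function(Q)
eta_old.interpolate(exp(-50.0 * ((x - 0.5)**2 + (y - 0.5)**2)))

u, eta = TrialFunctions(W)
v, q = TestFunctions(W)

# Backward Euler weak form of
#   du/dt + g*grad(eta) = 0,    d(eta)/dt + H*div(u) = 0
a = (inner(u, v) + dt * g * inner(grad(eta), v)
     + eta * q + dt * H * div(u) * q) * dx
L = inner(u_old, v) * dx + eta_old * q * dx

w = Function(W)
# Firedrake/PyOP2 generate and JIT-compile the assembly kernels at this call.
solve(a == L, w,
      solver_parameters={"mat_type": "aij", "ksp_type": "gmres", "pc_type": "bjacobi"})

The same script runs unchanged in serial or under MPI: Firedrake distributes the mesh and the assembly, and PETSc handles the parallel linear solve.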

Summary of the software

Firedrake-Fluids is available under the open-source GNU GPL licence (version 3). The source code is under Git version control and hosted on GitHub at the following URL: http://github.com/firedrakeproject/firedrake-fluids

The dependencies of Firedrake-Fluids (namely Firedrake itself and its associated packages) can be made available on ARCHER via the module system:
# Location of the Firedrake installation on ARCHER
export FDRAKE_DIR=/work/y07/y07/fdrake
# Add its modulefiles to the module search path, ready for a subsequent 'module load'
module use $FDRAKE_DIR/modules

After loading the necessary modules, Firedrake-Fluids can then be run on ARCHER using a PBS submission script such as the one provided in the repository at:
https://github.com/firedrakeproject/firedrake-fluids/blob/master/tools/parallel_scaling/src/submit_archer.pbs
