The ARCHER Service is now closed and has been superseded by ARCHER2.

Enabling NAME for distributed memory parallelism

eCSE09-010

Key Personnel

PI/Co-I: Matthew Rigby (U. Bristol), Michele Weiland (EPCC), Ben Devenish (UKMO), David Thomson (UKMO)

Technical: Kevin Stratford (EPCC)

Relevant Documents

eCSE Technical Report: Enabling NAME for distributed memory parallelism

Project summary

The Numerical Atmospheric dispersion Modelling Environment (NAME) is owned by the United Kingdom Met Office and is used to perform simulations of a wide range of atmospheric dispersion phenomena. These include, but are not limited to: nuclear accidents, volcanic eruptions, chemical accidents, smoke from fires, and transport of airborne disease vectors.

NAME is a Lagrangian model, which represents atmospheric dispersion by tracking simulated 'particles'. Movement through the atmosphere is typically driven by numerical weather prediction data (either historical or forecast), while particles also undergo a random motion that represents small-scale turbulence. Particles are generated by a specified source, or set of sources, which may be natural (e.g., a volcano or a fire) or man-made (e.g., a factory or other known source of pollution). Particles may be removed from the atmosphere by a number of different processes, such as fall-out owing to gravity, impact with the ground, and 'wash-out' by rain.
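
The per-particle update this describes can be pictured with a minimal sketch (our illustration, not NAME's actual scheme: the wind function, the diffusivity K and the step length are placeholder values):

```python
import numpy as np

rng = np.random.default_rng(42)

def wind(x, t):
    """Placeholder for NWP wind interpolated to positions x, shape (n, 3)."""
    return np.broadcast_to(np.array([10.0, 0.0, 0.0]), x.shape)

def step(x, t, dt, K=50.0):
    """Advance particle positions by one time step of length dt seconds.

    Advection follows the deterministic wind; small-scale turbulence is
    modelled as an isotropic random walk with diffusivity K (m^2/s).
    """
    drift = wind(x, t) * dt
    noise = rng.normal(scale=np.sqrt(2.0 * K * dt), size=x.shape)
    return x + drift + noise

# Release 100,000 particles from a point source and track them for an hour.
x = np.zeros((100_000, 3))
for n in range(60):
    x = step(x, t=60.0 * n, dt=60.0)
```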

A sophisticated chemistry model is included, with 42 different chemical species represented. Particles may carry a payload of one or more species, which then contribute to chemical reactions at a given location. This is done by aggregating chemical concentrations from all the particles in a given Eulerian grid cell (which defines the region in which chemistry takes place), and then redistributing the reaction products to the same particles.
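
The gather-react-scatter pattern might look like the following toy sketch, with two species and a single made-up first-order reaction A → B; the grid dimensions, cell size and rate constant are illustrative assumptions, not the NAME chemistry scheme:

```python
import numpy as np

def chemistry_step(x, mass, cell_size=1000.0, nx=100, k=0.1, dt=60.0):
    """x: (n, 2) horizontal positions in metres; mass: (n, 2) per-particle
    payload of species A and B. Returns the updated payloads."""
    # Gather: map each particle to a grid cell and accumulate payloads.
    ij = np.clip((x // cell_size).astype(int), 0, nx - 1)
    cell = ij[:, 0] * nx + ij[:, 1]
    total = np.zeros((nx * nx, 2))
    np.add.at(total, cell, mass)

    # React: first-order loss of A producing B, per cell, over dt seconds.
    dA = total[:, 0] * (1.0 - np.exp(-k * dt))

    # Scatter: return products to the particles in each cell, in
    # proportion to each particle's contribution of A to that cell.
    with np.errstate(invalid="ignore", divide="ignore"):
        frac = np.where(total[cell, 0] > 0, mass[:, 0] / total[cell, 0], 0.0)
    new = mass.copy()
    new[:, 0] -= frac * dA[cell]
    new[:, 1] += frac * dA[cell]
    return new

# Example: 10,000 particles carrying only species A on a 100 km square grid.
x = np.random.default_rng(0).uniform(0.0, 100_000.0, size=(10_000, 2))
mass = np.column_stack([np.ones(10_000), np.zeros(10_000)])
mass = chemistry_step(x, mass)
```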

One advantage of a Lagrangian model is that features such as volcanic plumes may be represented accurately at different scales, independent of any underlying mesh scale. The degree of accuracy of this representation then depends largely on the number of particles that can be used. However, as the resolution of the meteorological data increases, the number of particles required to maintain statistical accuracy also increases. The associated increase in memory and computational requirements then favours a distributed memory implementation to complement the existing shared memory implementation.
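
The particle-count scaling follows from a standard Monte Carlo argument (our illustration; the report does not spell it out): the relative sampling error of a concentration estimate in a grid cell falls as the inverse square root of the number of particles in that cell.

```latex
\[
  \epsilon_{\mathrm{cell}} \sim \frac{1}{\sqrt{N_{\mathrm{cell}}}},
  \qquad
  N_{\mathrm{cell}} = \frac{N}{n_{\mathrm{cells}}}
  \quad\Longrightarrow\quad
  N \propto n_{\mathrm{cells}} \ \text{at fixed accuracy.}
\]
```

Doubling the horizontal resolution thus quadruples the number of cells and calls for roughly four times as many particles at the same per-cell accuracy.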

Achievement of objectives

The project proposal set out three objectives:

  1. Objective 1: To be able to perform a greenhouse gas simulation with an increase in the number of particles by a factor of 100, allowing increased resolution in the underlying meteorological data without loss of statistical accuracy. Met Office UK-V 1.5 km resolution data has now been used with such an increase (48 million particles versus the original 480 thousand) to perform a simulation on ARCHER with an overall time-to-solution of about 90 minutes on 16 nodes (cf. 45 minutes on 1 node for the original). Runs at the increased resolution with 480 million particles are possible, taking around 12 hours on 16 nodes. Such runs are made possible by the message passing implementation produced as the first deliverable of the work; a minimal sketch of this style of spatial decomposition appears after this list.
  2. Objective 2: To be able to perform a simulation of long-range dispersion of volcanic ash with an increase by a factor of 10 in both the vertical resolution and the number of particles. A simulation of an eruption of Askja in Iceland has been performed with 7.2 million particles and a vertical resolution of 250 feet (compared with an original of 720 thousand particles and 2500 feet). Time-to-solution for this problem is not significantly improved by distributed parallelism, owing largely to serial output constraints.
  3. Objective 3: To perform air quality simulations with chemical reactions for years rather than months. As the second deliverable of the work, the OpenMP implementation of the chemistry scheme has been completely rewritten. OpenMP performance for the chemistry scheme is now close to ideal for a suitable problem (e.g., a speed-up of around 22 on ARCHER's 24-core nodes, and around 29 using both hardware threads per core), where before performance was rather poor. The chemistry scheme is also decomposed in space and integrates with the message passing implementation.
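
To make the message passing approach concrete, here is a minimal mpi4py sketch of a spatially decomposed Lagrangian step with particle migration between ranks. The 1-D slab decomposition, the placeholder wind and turbulence values, and all names are illustrative assumptions, not NAME's actual design:

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

LX = 1.0e6                      # domain extent in x (metres)
width = LX / size               # each rank owns one slab in x

rng = np.random.default_rng(rank)
x = rng.uniform(rank * width, (rank + 1) * width, size=(10_000, 3))

def advance(x, dt=60.0):
    """Placeholder time step: uniform wind plus random turbulence."""
    x = x + np.array([10.0, 0.0, 0.0]) * dt
    x = x + rng.normal(scale=50.0, size=x.shape)
    x[:, 0] %= LX               # periodic in x for this toy example
    return x

for _ in range(100):
    x = advance(x)
    # Migrate particles whose x-coordinate has left this rank's slab.
    dest = np.minimum((x[:, 0] // width).astype(int), size - 1)
    outgoing = [x[dest == r] for r in range(size)]
    incoming = comm.alltoall(outgoing)     # pickle-based exchange
    x = np.concatenate(incoming)

total = comm.reduce(len(x), op=MPI.SUM, root=0)
if rank == 0:
    print(f"particles after migration: {total}")
```

Run with, e.g., mpirun -n 4 python sketch.py. A production code would likely exchange typed, buffer-based messages rather than pickled arrays, but the advance-then-migrate pattern is the essence of the decomposition.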

Summary of the software

The NAME code is owned by the United Kingdom Met Office and is available for research use under licence. Interested parties should contact the Met Office.

The software can be made available on ARCHER if there is demand.
