The ARCHER Service is now closed and has been superseded by ARCHER2.




Parallel supermeshing for multimesh modelling

eCSE03-08

Key Personnel

PI/Co-I: Dr James R. Maddison (University of Edinburgh), Dr Patrick E. Farrell (University of Oxford)

Technical: Mr Iakovos Panourgias (EPCC)

Relevant Documents

eCSE Technical Report: Parallel supermeshing for multimesh modelling

Project summary

Models which use multiple non-matching unstructured meshes typically need to solve a computational geometry problem: constructing intersection meshes, a process known as supermeshing [1-2]. The algorithm for solving this problem is known [2-4] and has an existing implementation in the unstructured finite element model Fluidity [5], but that implementation is deeply embedded within the code and unavailable for wider use. This project addresses the issue through the creation of a standalone general-purpose numerical library, libsupermesh, which can easily be integrated into new and existing numerical models.
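
At the element level, supermesh construction reduces to intersecting pairs of elements drawn from the two meshes. The 2D kernel can be illustrated by clipping one triangle against another with the Sutherland-Hodgman algorithm. The sketch below is purely illustrative; the function names are hypothetical and do not reflect libsupermesh's actual API:

```python
def clip(subject, clipper):
    """Clip convex polygon `subject` against convex polygon `clipper`
    (both counter-clockwise vertex lists) via Sutherland-Hodgman."""
    def inside(p, a, b):
        # p lies on (or to the left of) the directed edge a -> b
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0.0

    def intersect(p, q, a, b):
        # Intersection of segment p-q with the infinite line through a-b
        d1 = (q[0] - p[0], q[1] - p[1])
        d2 = (b[0] - a[0], b[1] - a[1])
        denom = d1[0] * d2[1] - d1[1] * d2[0]
        t = ((a[0] - p[0]) * d2[1] - (a[1] - p[1]) * d2[0]) / denom
        return (p[0] + t * d1[0], p[1] + t * d1[1])

    output = list(subject)
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        polygon, output = output, []
        for j in range(len(polygon)):
            s, e = polygon[j - 1], polygon[j]
            if inside(e, a, b):
                if not inside(s, a, b):
                    output.append(intersect(s, e, a, b))
                output.append(e)
            elif inside(s, a, b):
                output.append(intersect(s, e, a, b))
    return output

def area(poly):
    # Shoelace formula
    return 0.5 * abs(sum(poly[i][0] * poly[(i + 1) % len(poly)][1]
                         - poly[(i + 1) % len(poly)][0] * poly[i][1]
                         for i in range(len(poly))))

# Two overlapping triangles; their intersection is the triangle
# (1,0), (2,0), (1,1), with area 0.5.
tri_a = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
tri_b = [(1.0, -1.0), (3.0, 1.0), (1.0, 3.0)]
overlap = clip(tri_a, tri_b)
```

Accumulating such pairwise element intersections over all candidate pairs yields the supermesh, on which fields from either parent mesh can be represented without loss.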

This project further addresses a general parallelisation problem: a numerical model may need to consider not only two non-matching unstructured meshes, but also allow the two meshes to have different parallel partitionings. The existing supermeshing functionality in Fluidity assumes that the meshes used have domain decompositions which match perfectly. An implementation of a parallel supermeshing algorithm in the general case was previously lacking, and this has limited the scale of multimesh modelling applications.
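
One common way to handle independent domain decompositions (sketched here only as an illustration; libsupermesh's actual communication scheme may differ) is for each rank to advertise the axis-aligned bounding box of its local partition, and to exchange candidate elements only between ranks whose boxes overlap:

```python
# Sketch: which pairs of ranks must exchange mesh data when two meshes
# have independent domain decompositions? The rank counts and bounding
# boxes below are illustrative, not taken from libsupermesh.

def boxes_overlap(a, b):
    # a, b: ((xmin, ymin), (xmax, ymax)); touching boxes count as overlapping
    (axlo, aylo), (axhi, ayhi) = a
    (bxlo, bylo), (bxhi, byhi) = b
    return axlo <= bxhi and bxlo <= axhi and aylo <= byhi and bylo <= ayhi

# Partition bounding boxes: mesh A on 4 ranks, mesh B on 2 ranks
mesh_a = {0: ((0, 0), (1, 1)), 1: ((1, 0), (2, 1)),
          2: ((0, 1), (1, 2)), 3: ((1, 1), (2, 2))}
mesh_b = {0: ((0, 0), (2, 1)), 1: ((0, 1), (2, 2))}

# Candidate (rank_a, rank_b) communication pairs
pairs = [(ra, rb) for ra, ba in mesh_a.items()
                  for rb, bb in mesh_b.items() if boxes_overlap(ba, bb)]
```

The test is deliberately conservative: a spurious candidate pair (from boxes that merely touch) costs only a little extra communication, because the exact element intersections are computed afterwards.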

A parallel supermeshing algorithm has been designed and implemented. The functionality was benchmarked, demonstrating scaling to 10,000 cores on a problem with one hundred million degrees of freedom. Using the parallel supermeshing library, the runtime for supermeshing two meshes (each with two hundred million degrees of freedom) was reduced from 6,980 seconds (roughly two hours) to 3.85 seconds on 2,000 cores, or 1.63 seconds on 10,000 cores. This parallel implementation will allow multimesh simulations to be conducted at a previously inaccessible scale.
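
From the timings quoted above, the speedups over the original implementation can be worked out directly (the arithmetic below is derived from those figures, not an additional measurement):

```python
# Timings from the benchmark above (two meshes, each with two hundred
# million degrees of freedom).
serial_time = 6980.0                 # seconds, existing embedded implementation
timings = {2000: 3.85, 10000: 1.63}  # cores -> seconds, parallel library

for cores, t in sorted(timings.items()):
    print(f"{cores:>6} cores: {serial_time / t:7.0f}x faster")

# Relative efficiency going from 2,000 to 10,000 cores (5x the cores)
rel_eff = (timings[2000] / timings[10000]) / (10000 / 2000)
print(f"relative parallel efficiency, 2,000 -> 10,000 cores: {rel_eff:.0%}")
```

The 2,000-to-10,000-core step retains roughly half its ideal efficiency, which is unsurprising for a communication-heavy computational geometry workload at that scale.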

The library is straightforward to use: it exposes a small, simple interface and avoids complex data structures. The standalone library has been integrated with Fluidity, providing immediate access to the software in an application in current use on ARCHER and demonstrating how easily the library can be adopted.

References

  1. P. E. Farrell, M. D. Piggott, C. C. Pain, G. J. Gorman, and C. R. Wilson, "Conservative interpolation between unstructured meshes via supermesh construction", Computer Methods in Applied Mechanics and Engineering, 198, pp. 2632-2642, 2009
  2. P. E. Farrell and J. R. Maddison, "Conservative interpolation between volume meshes by local Galerkin projection", Computer Methods in Applied Mechanics and Engineering, 200, pp. 89-100, 2011
  3. M. J. Gander and C. Japhet, "An algorithm for non-matching grid projections with linear complexity", in "Domain Decomposition Methods in Science and Engineering XVIII", M. Bercovier, M. J. Gander, R. Kornhuber, and O. Widlund (editors), Springer Berlin Heidelberg, pp. 185-192, 2009
  4. M. J. Gander and C. Japhet, "Algorithm 932: PANG: Software for nonmatching grid projections in 2D and 3D with linear complexity", ACM Transactions on Mathematical Software, 40, pp. 6:1-6:25, 2013
  5. Fluidity [online]. Available at: http://fluidityproject.github.io

Summary of the software

The standalone libsupermesh library is available for download under the LGPL 2.1 license at: https://bitbucket.org/libsupermesh/libsupermesh

On ARCHER, Fluidity is maintained by the CSE team and will soon be available as an ARCHER package. It is expected that new versions of Fluidity will be compiled and offered to ARCHER users.

ARCHER users will also be able to access the code from the public repositories and compile their own version using the tools available on ARCHER.

libsupermesh has been tested with the latest versions of the GNU (5.1.0), Intel (15.0.2.164) and Cray (8.4.1) compilers available on ARCHER.

Copyright © Design and Content 2013-2019 EPCC. All rights reserved.
