Bulletin of the American Physical Society
62nd Annual Meeting of the APS Division of Plasma Physics
Volume 65, Number 11
Monday–Friday, November 9–13, 2020; Remote; Time Zone: Central Standard Time, USA
Session NM10: Mini-Conference on Growing An Open Source Software Ecosystem For Plasma Science II
Chair: David Schaffner, Bryn Mawr College
Wednesday, November 11, 2020 9:30AM - 9:55AM
NM10.00001: OMFIT: A Community and Framework for Integrated Modeling and Analysis Sterling Smith, Orso Meneghini OMFIT is a software framework developed for integrated modeling and data analysis, whose main applications have been developed by the magnetic fusion community. At the core of the OMFIT framework is a series of application programmer interfaces (APIs) for common tasks such as remote code execution, data transfers, file parsing and writing, database fetching, and GUI building. A set of over 110 physics modules (collections of data and scripts for carrying out scientific studies; see https://omfit.io/modules.html for the full list) enables over 400 scientists, spread across 25 institutions worldwide, to carry out a broad range of leading-edge fusion research, including validation of models against experiment. Software engineering best practices such as extensive documentation, automated regression testing and software deployment, and code review have been critical elements in supporting its nearly 100 community contributors. Overall, the OMFIT project has consolidated the efforts of many talented independent scientists into a technical solution and a community that are greater than the sum of their parts.
Wednesday, November 11, 2020 9:55AM - 10:10AM
NM10.00002: An OMFIT Module for Event Detection Using Semi-Supervised Learning Kevin Montes, Cristina Rea, Robert Granetz This contribution describes the development of a new OMFIT\footnote{Meneghini O. et al., Nuclear Fusion 55 083008 (2015)} module designed to accelerate the assembly of large databases of disruption precursor events. Given a dataset of relevant 0D signals from a large number of shots and a few manually recorded times at which the event of interest occurs, the module implements an event detection algorithm based on the label propagation\footnote{Zhu X. et al., "Learning from labeled and unlabeled data with label propagation." (2002)} and label spreading\footnote{Zhou D. et al., Advances in Neural Information Processing Systems 16, 321-328 (2004)} methods. Each step in the module workflow is supported by a graphical user interface, allowing for ease of analysis and validation of individual event detections. For a dataset of $\sim$ 300 discharges with manually identified events, it has been shown that both H-L back transitions and initially rotating locked modes can be detected with high accuracy ($>$85\%) when $<$3\% of the events are initially labeled by the user. In addition to reproducing this analysis with a predefined dataset used in the study, users can apply the module to detect other events in a large dataset for which manual identification of events is too time consuming.
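To make the label-spreading idea (Zhou et al. 2004) behind this module concrete, here is a minimal NumPy sketch, not the module's actual code: a few known labels diffuse over a similarity graph until unlabeled samples inherit the labels of their cluster. The data, sigma, and alpha values below are illustrative.

```python
import numpy as np

def label_spreading(X, y, alpha=0.9, sigma=1.0, n_iter=50):
    """Semi-supervised label spreading (Zhou et al. 2004).

    X : (n, d) feature matrix
    y : (n,) labels; -1 marks unlabeled samples
    Returns predicted labels for all n samples.
    """
    n = len(X)
    classes = np.unique(y[y >= 0])
    # Gaussian affinity matrix with zero diagonal
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalization S = D^{-1/2} W D^{-1/2}
    Dinv = 1.0 / np.sqrt(W.sum(axis=1))
    S = W * Dinv[:, None] * Dinv[None, :]
    # One-hot seed matrix Y; rows of unlabeled samples stay zero
    Y = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        Y[y == c, j] = 1.0
    # Iterate F <- alpha*S*F + (1 - alpha)*Y toward the fixed point
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1 - alpha) * Y
    return classes[F.argmax(axis=1)]

# Two 1-D clusters, one labeled sample in each
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
y = np.array([0, -1, -1, 1, -1, -1])
print(label_spreading(X, y))  # every point inherits its cluster's label
```

In the module's setting, X would hold features extracted from the 0D signals and the few user-marked event times would provide the seed labels.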
Wednesday, November 11, 2020 10:10AM - 10:25AM
NM10.00003: PSYDAC: a parallel finite element library with automatic code generation Yaman G\"{u}\c{c}l\"{u}, Said Hadjout, Ahmed Ratnani PSYDAC is a Python~3 library for the solution of partial differential equations, with a focus on isogeometric analysis using B-spline finite elements. Support for multi-patch geometries and finite element exterior calculus is under development. In order to use PSYDAC~[1], the user defines the geometry and the model equations in an abstract form using SymPDE~[2], an extension of Sympy~[3] that provides the mathematical expressions and checks their semantic validity. Once a finite element discretization has been chosen, PSYDAC maps the abstract concepts to concrete objects, the basic building blocks being MPI-distributed vectors and matrices. Python code is generated for all the computationally intensive operations (matrix and vector assembly, matrix-vector products, etc.), and it is accelerated using either Numba~[4] or Pyccel~[5]. We illustrate the library's capabilities with some plasma physics examples. \newline \newline \textbf{References (open source software)} \newline [1] PSYDAC: https://github.com/pyccel/psydac \newline [2] SymPDE: https://github.com/pyccel/sympde \newline [3] Sympy: https://www.sympy.org \newline [4] Numba: https://numba.pydata.org \newline [5] Pyccel: https://github.com/pyccel/pyccel
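At a vastly smaller scale than PSYDAC, the essence of a finite element Poisson solve can be sketched in a few lines of NumPy. This is a hand-assembled 1-D toy with piecewise-linear elements, not PSYDAC's B-spline machinery or the SymPDE abstract layer:

```python
import numpy as np

def poisson_1d(f, n):
    """Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 using
    n interior nodes and piecewise-linear finite elements."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)          # interior nodes
    # Stiffness matrix of linear elements: tridiagonal (-1, 2, -1)/h
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h
    b = h * f(x)                             # lumped load vector
    return x, np.linalg.solve(A, b)

# Manufactured solution: u(x) = sin(pi x)  =>  f = pi^2 sin(pi x)
x, u = poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x), n=50)
err = np.abs(u - np.sin(np.pi * x)).max()
print(f"max error: {err:.2e}")  # second-order accuracy in h
```

PSYDAC automates exactly the steps that are hand-coded here (assembly, parallel data structures, acceleration) from the user's abstract SymPDE problem statement.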
Wednesday, November 11, 2020 10:25AM - 10:40AM
NM10.00004: SMILEI, a user oriented plasma simulation ecosystem Arnaud Beck, Julien Derouillat, Mathieu Lobet, Frederic Perez, Tomaso Vinci, Francesco Massimo, Mickael Grech SMILEI is an open source, general purpose, electromagnetic Particle-In-Cell code. As such, it federates inputs from the astrophysics, laboratory plasma physics, and computer science communities, whose contributors constitute the backbone of the growing SMILEI ecosystem. An efficient and massively parallel core, required for all scientific applications, is maintained by computer scientists. The various geometries and physics models included and built around this core by physicists are driven by the needs of the scientific communities and developed accordingly. Priority is given to features requested by the user community, which forms the bulk of the ecosystem; its feedback is essential to guide the development of solid software that meets users' needs in terms of capability and usability. A significant effort is put into a good user interface, post-processing, documentation, and education of this community, with as much interaction as possible. The ecosystem is completed by the national computing centers. They help set up optimized environments and make sure to acquire computing systems well adapted to the code. This is achieved by benchmarking prototypes against various SMILEI simulations before the acquisition of new hardware.
Wednesday, November 11, 2020 10:40AM - 10:55AM
NM10.00005: System-scale simulation based on kinetic theory: the ECsim code Giovanni Lapenta, Joost Croonen, Giuseppe Arro We present the ECsim approach [1] to modelling macroscopic systems. The kinetic approach is valid at all scales, but it becomes costly to use at large scales. The most common approach to kinetic plasma modelling is the explicit particle-in-cell (PIC) method. This approach requires resolving all scales, down to the smallest electron scales; failure to do so incurs rapid and disruptive numerical heating. If one desires to resolve only intermediate or large scales, the explicit PIC method cannot be of help: it still carries the burden of resolving all electron scales. But what if one is interested in studying the electron motion in ion-scale features without resolving the electromagnetic fields at electron scales? For this task we use a new energy-conserving semi-implicit PIC method [1]. This approach has been used for a number of years in the modelling of reconnection in space plasmas [2]. Here we show two new applications of ECsim: modelling of fusion devices and modelling of the heliosphere from 10 solar radii to beyond the orbit of the Earth. [1] Lapenta, Giovanni. JCP 334 (2017): 349-366. [2] Lapenta, Giovanni, et al. ApJ (2020): 888 (2)
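The numerical-heating constraint described above can be demonstrated on a toy harmonic oscillator: an explicit leapfrog integrator of an oscillation at frequency omega is stable only for omega*dt < 2, which is why an explicit PIC step must resolve the fastest (electron) frequency in the system. A minimal sketch of that stability boundary, not ECsim itself:

```python
def leapfrog_amplitude(omega, dt, steps=50):
    """Integrate x'' = -omega^2 x with explicit (kick-drift) leapfrog
    and return the maximum |x| seen.  The scheme is stable only when
    omega * dt < 2."""
    x, v = 1.0, 0.0
    amp = abs(x)
    for _ in range(steps):
        v -= dt * omega**2 * x   # kick
        x += dt * v              # drift
        amp = max(amp, abs(x))
    return amp

# Resolving the oscillation (omega*dt = 0.5): amplitude stays bounded.
stable = leapfrog_amplitude(omega=1.0, dt=0.5)
# Under-resolving it (omega*dt = 2.5): the amplitude explodes,
# the analogue of numerical heating in under-resolved explicit PIC.
unstable = leapfrog_amplitude(omega=1.0, dt=2.5)
print(stable, unstable)
```

A semi-implicit, energy-conserving scheme like ECsim's removes this constraint, allowing the time step to be set by the ion-scale physics of interest.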
Wednesday, November 11, 2020 10:55AM - 11:10AM
NM10.00006: Expanding VPIC Portability to Large Scale GPU Systems Nigel Tan, Michela Taufer, Scott Luedtke, Robert Bird, Brian Albright Vector Particle-In-Cell (VPIC) is a state-of-the-art plasma physics simulation code with a history of large scale simulations, with recent simulations reaching 10 trillion particles over 2 million processes. The key to VPIC performance is its platform-specific optimizations. The growing diversity in heterogeneous platforms makes continuously rewriting legacy codes, including VPIC, infeasible for the community due to portability issues. These issues can be addressed by frameworks such as Kokkos that enable developers to write codes once and compile them for different platforms. In so doing, scientists can focus on scientific models and discovery, while delegating hardware-specific tuning to the Kokkos runtime. Our work is part of a broader effort to modernize VPIC portability across heterogeneous platforms, while reaching new milestones in particle scale and simulation performance. Here we present a high performance, portable variant of VPIC with platform agnostic algorithm optimizations using Kokkos and the lessons learned from running our variant on the GPU-accelerated Power9 system, Summit. Our results show near linear weak scaling on over 12,000 Summit GPUs.
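The write-once, tune-per-platform idea that Kokkos provides for C++ can be caricatured in Python: the physics kernel is expressed once against an abstract backend, and the concrete implementation (serial loop vs. vectorized) is chosen at run time. This is a schematic analogy only; Kokkos itself is a C++ template library, and VPIC's real kernels look nothing like this:

```python
import numpy as np

# The "kernel" is written once, against an abstract backend interface.
def push_positions(backend, x, v, dt):
    """Advance particle positions: x += v * dt."""
    return backend.axpy(dt, v, x)

class SerialBackend:
    """Reference implementation: explicit Python loop."""
    @staticmethod
    def axpy(a, x, y):
        return [a * xi + yi for xi, yi in zip(x, y)]

class VectorBackend:
    """'Tuned' implementation: NumPy vectorization stands in for the
    GPU/SIMD code paths a real portability layer would dispatch to."""
    @staticmethod
    def axpy(a, x, y):
        return (a * np.asarray(x) + np.asarray(y)).tolist()

x = [0.0, 1.0, 2.0]
v = [1.0, 1.0, 1.0]
# Same kernel source, either backend, identical results.
print(push_positions(SerialBackend, x, v, 0.5))   # [0.5, 1.5, 2.5]
print(push_positions(VectorBackend, x, v, 0.5))   # [0.5, 1.5, 2.5]
```

The payoff the abstract describes is exactly this separation: the science code stays fixed while the backend is retargeted to new hardware.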
Wednesday, November 11, 2020 11:10AM - 11:25AM
NM10.00007: Nata: Python package for post-processing and visualization of simulation output for particle-in-cell codes Anton Helm, Fabio Cruz, Ricardo Fonseca, Luis Silva In plasma science and technology, large-scale, massively parallel simulations play a prominent role, in scenarios as diverse as the design of future accelerators, the dynamics of astrophysical plasmas, secondary sources driven by intense lasers and their applications in biology and medicine, or nuclear fusion. Particle-in-cell (PIC) simulations play a fundamental role in plasma physics research due to their ability to resolve the smallest and shortest plasma scales and to couple to other physics modules from first principles or phenomenologically (e.g., collisions, ionization processes, Quantum Electrodynamics). PIC simulations produce very high-fidelity output, which results in large amounts of data. The post-processing, analysis, and preparation of publication-quality plots from PIC simulation codes quickly becomes a cumbersome task. We present the open-source Python package nata, developed to provide a user-friendly and straightforward interface to read, process, and visualize output generated by PIC codes. It is designed to benefit from the vibrant scientific ecosystem of Python and be minimalistic yet rich in functionality, allowing users to adopt the workflow quickly. We describe the core concepts and capabilities of nata and illustrate them with real simulation data from multiple PIC codes. We discuss how the plasma physics community can use nata to ease their day-to-day workflow and contribute to the project. We also discuss potential uses for nata beyond PIC codes.
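The kind of day-to-day task nata streamlines (load a field dump, take a lineout, extract a spectrum) can be mocked up with plain NumPy. This is a generic sketch on synthetic data, not nata's actual API:

```python
import numpy as np

# Synthetic stand-in for a 2-D field dump from a PIC run:
# mode number 3 along x, uniform along y.
nx, ny = 64, 32
x = np.arange(nx) / nx
field = np.tile(np.sin(2 * np.pi * 3 * x), (ny, 1))

# Typical post-processing steps: lineout, then spatial spectrum.
lineout = field[ny // 2, :]             # 1-D cut at the mid-plane
spectrum = np.abs(np.fft.rfft(lineout))
k_peak = int(spectrum.argmax())
print(f"dominant mode number: {k_peak}")  # recovers the injected mode
```

A package like nata wraps these steps behind a uniform interface so the same analysis works across the output formats of different PIC codes.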
Wednesday, November 11, 2020 11:25AM - 11:50AM
NM10.00008: Input {\&} Output Standardization Efforts A Huebl, R Lehe, J-L Vay, M Thevenet, D P Grote, I F Sbalzarini, S Kuschel, M Bussmann, D Sagan, C Mayes, F Perez, F Koller Open frameworks and tools in plasma physics provide potential for complex, cross-domain modeling and analysis activities. To connect them, standardization efforts aid compatibility and improve usability. We present openPMD (open standard for particle-mesh data files) and PICMI (particle-in-cell modeling interface) as two projects focusing on data and software input compatibility.
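To make the data side concrete: openPMD prescribes a hierarchical layout (iterations containing mesh and particle records, plus standardized metadata attributes) on top of containers such as HDF5. The nested-dict mock-up below only sketches that shape in plain Python; the attribute set shown is illustrative and abbreviated, not the normative standard, and real files are read and written with libraries such as openPMD-api:

```python
# Schematic, abbreviated picture of an openPMD-style file as nested dicts.
series = {
    "attributes": {            # root metadata (abbreviated, illustrative)
        "openPMD": "1.1.0",
        "basePath": "/data/%T/",
        "meshesPath": "meshes/",
    },
    "data": {
        100: {                 # one iteration, keyed by its number
            "meshes": {
                "E": {"x": [0.0, 0.1]},   # a mesh record component
            },
        },
    },
}

def list_meshes(series, iteration):
    """Return the mesh record names stored for one iteration."""
    return sorted(series["data"][iteration]["meshes"])

print(list_meshes(series, 100))  # ['E']
```

Because every conforming writer produces this same shape, a single analysis tool can consume output from any openPMD-compliant code.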
Wednesday, November 11, 2020 11:50AM - 12:10PM
NM10.00009: Balancing Flexibility and Usability in the Gkeyll Simulation Framework James Juno, Ammar Hakim, Noah Mandell, Manaure Francisquez, Tess Bernard, Petr Cagas, Liang Wang, Rupak Mukherjee, Jason TenBarge, Gregory Hammett It is the goal of many software projects to leverage common functionality and thus build flexible tools that can be deployed for a wide range of problems. But flexibility can come at the cost of usability. As a software framework is designed to handle more general cases, actually using the software to solve a particular problem of interest can become more challenging. In this talk, we will present how the Gkeyll simulation framework solves these issues by not only providing an abstract layer on which to build solvers for desired equation systems, such as two-fluid, gyrokinetics, and Vlasov-Maxwell, but also abstracting out the requirement of the user to specify the complete simulation pipeline by packaging desired functionality into Gkeyll’s App system. As part of this presentation, we will show the evolution of a Gkeyll input file from the burdensome general input files employed previously that required users to “do everything themselves,” to the more compact and user-friendly app-driven input files. In doing so, we both demonstrate the ease with which a user can construct simulations with the open-source Gkeyll simulation framework and provide a template for other flexible code frameworks for improving maintainability and usability of their codes.
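The shift described above, from "wire the whole pipeline yourself" to a declarative App that fills in the boilerplate, can be sketched generically. Gkeyll's actual input files are Lua and drive real solvers; this Python analogue only illustrates the design pattern, with a trivial decay equation standing in for a physics system:

```python
import math

class App:
    """Toy 'App': the user declares what to solve; the App owns the
    time loop, defaults, and bookkeeping users once wrote by hand."""
    def __init__(self, rhs, u0, t_end, dt=1e-4):
        self.rhs, self.u0, self.t_end, self.dt = rhs, u0, t_end, dt

    def run(self):
        t, u = 0.0, self.u0
        while t < self.t_end:            # forward-Euler time loop
            u += self.dt * self.rhs(u)
            t += self.dt
        return u

# Declarative "input file": solve du/dt = -u, u(0) = 1, up to t = 1.
app = App(rhs=lambda u: -u, u0=1.0, t_end=1.0)
u_final = app.run()
print(abs(u_final - math.exp(-1.0)) < 1e-3)  # matches exp(-t)
```

The user states only the equation, initial condition, and end time; every other detail has a sensible default but remains overridable, which is the flexibility/usability balance the talk describes.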
Wednesday, November 11, 2020 12:10PM - 12:30PM
NM10.00010: Open source plasma simulation in the MOOSE framework Steven Shannon, Davide Curreli, Corey DeChant, Grayson Gall, David Green, Casey Icenhour, Shane Keniley, Alexander Lindsay The Multi-Physics Object Oriented Simulation Environment (MOOSE) is an open source framework originally developed for nuclear reactor simulation. Because of its original intention, the framework has a very well established development environment for deploying, tracking, and updating applications and ensuring that code is well documented and verified. Recently, MOOSE has grown into a broader range of applications as the need for open source environments in science has grown. In this talk, we will present a brief overview of the MOOSE ecosystem and present results from three recently developed plasma applications: 1.) ZAPDOS, a two-fluid plasma simulation tool, 2.) CRANE, a plasma chemistry application for integration of complex chemical pathways into plasma simulation through ZAPDOS, and 3.) ELK, an electromagnetic solver designed to couple to ZAPDOS and enable solution of a broader class of plasma problems. These results will combine new discoveries and validation efforts, with emphasis on how the MOOSE ecosystem works to advance open source community development of plasma simulation tools.