65th Annual Gaseous Electronics Conference
Volume 57, Number 8
Monday–Friday, October 22–26, 2012;
Austin, Texas
Session QR1: Plasma Modeling and Simulations II
10:30 AM–12:30 PM,
Thursday, October 25, 2012
Room: Amphitheatre 204
Chair: Thomas Mussenbrock, Ruhr-Universität Bochum
Abstract ID: BAPS.2012.GEC.QR1.1
Abstract: QR1.00001 : Atomic and Molecular Input Data for Plasma Modelling: a user's perspective
10:30 AM–11:00 AM
Author:
Jan van Dijk
(Eindhoven University of Technology)
With the advent of cheap yet powerful computers, self-consistent numerical simulation has become a viable tool for understanding, designing and improving technological and scientific plasma sources. Nowadays, multi-dimensional models that are capable of simulating time-dependent discharge behaviour are in use at various universities and research institutes. One such computer code is Plasimo, a PLAsma SImulation MOdel that is being developed at Eindhoven University of Technology (http://plasimo.phys.tue.nl). Plasimo provides kinetic (Monte Carlo), hybrid and fluid models for transport-sensitive and equilibrium plasmas.
Codes like Plasimo obviously require a multitude of input data to function properly, but the measurement or calculation of such data is mostly outside the project's reach. In this contribution we, as Plasimo developers, will therefore provide a user's perspective on the subject of atomic and plasma data.
In the first part of this contribution, we will give an overview of the various kinds of input data needed for the types of plasma modelling supported by Plasimo. The discussion will be guided by real-world examples of models for low- and high-pressure plasma sources.
In the second part of the contribution, we will discuss how modern Internet technologies can help us fulfil our input-data needs. Today, input data are typically either hard-coded in computer programs or read from local input files. Moreover, data pre-processing tasks, such as integrating cross sections into rate coefficients, are usually carried out locally as well. We will demonstrate how Web Services (http://www.w3.org/2002/ws/) can be used to manage, disseminate and manipulate data sets more conveniently. We will also identify various input-data pre-processing tasks that could be taken over by data distributors, suggest how this could be implemented, and sketch the workflow that would result from such an effort.
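As an illustration of the pre-processing step mentioned above, the following minimal Python sketch converts a tabulated electron-impact cross section into a rate coefficient by integrating it, weighted with the electron speed, over a Maxwellian electron energy distribution. The energy grid, the model cross section and the function names are illustrative assumptions and are not taken from Plasimo.

    # Minimal sketch (not Plasimo code): cross section sigma(eps) -> rate
    # coefficient k(T_e) for a Maxwellian electron energy distribution.
    # Units: eps and T_e in eV, sigma in m^2, k in m^3/s.
    import numpy as np

    E_CHARGE = 1.602176634e-19    # J per eV
    M_E = 9.1093837015e-31        # electron mass, kg

    def maxwellian_eedf(eps, T_e):
        """Maxwellian electron energy distribution f(eps), normalised to 1 (units 1/eV)."""
        return 2.0 * np.sqrt(eps / np.pi) * T_e**-1.5 * np.exp(-eps / T_e)

    def rate_coefficient(eps, sigma, T_e):
        """k(T_e) = integral of sigma(eps) * v(eps) * f(eps) d(eps), in m^3/s."""
        v = np.sqrt(2.0 * eps * E_CHARGE / M_E)     # electron speed, m/s
        return np.trapz(sigma * v * maxwellian_eedf(eps, T_e), eps)

    # Hypothetical tabulated cross section with a 12 eV threshold:
    eps = np.linspace(0.01, 100.0, 2000)                            # energy grid, eV
    sigma = np.where(eps > 12.0, 1e-20 * (1.0 - 12.0 / eps), 0.0)   # m^2
    print(rate_coefficient(eps, sigma, T_e=3.0))                    # k at T_e = 3 eV

In the workflow sketched above, such an integration could equally well be performed by the data distributor, with the model code requesting the rate coefficient directly instead of the raw cross section.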
To cite this abstract, use the following reference: http://meetings.aps.org/link/BAPS.2012.GEC.QR1.1