Department of Energy
National Nuclear Security Administration
Advanced Simulation and Computing
Academic Strategic Alliances Program
Table of Contents
1.  Table of Contents ........................................ 1.1
2.  Executive Summary ........................................ 2.1
3.  Scientific and Engineering Advances ...................... 3.1
    3.1  Simulation Development .............................. 3.1
    3.2  Simulation Demonstrations ........................... 3.3
    3.3  .................................................... 3.5
    3.4  Fluid Dynamics ...................................... 3.8
    3.5  .................................................... 3.12
    3.6  Computer Science .................................... 3.15
    3.7  Computational Environment ........................... 3.16
         Visualization ....................................... 3.18
4.  .......................................................... 4.1
5.  .......................................................... 5.1
6.  Legacy ................................................... 6.1
    6.1  Code Legacy ......................................... 6.1
    6.2  Software Engineering ................................ 6.2
A.  Center Legacy ............................................ A.1
Executive Summary
The goal of CSAR is the detailed, whole-system simulation of solid propellant rockets from
first principles under both normal and abnormal operating conditions. The design of solid
propellant rockets is a sophisticated technological problem requiring expertise in diverse
subdisciplines, including the ignition and combustion of composite energetic materials; the
solid mechanics of the propellant, case, insulation, and nozzle; the fluid dynamics of the interior flow and exhaust plume; the aging and damage of components; and the analysis of various potential failure modes. These problems are characterized by very high energy densities,
extremely diverse length and time scales, complex interfaces, and reactive, turbulent, and
multiphase flows.
The scientific and technological needs of the U.S. Department of Energy, posed by the
Accelerated Strategic Computing Initiative/Academic Strategic Alliances Program
(ASCI/ASAP), encouraged the University of Illinois at Urbana-Champaign (UIUC) to establish the Center for Simulation of Advanced Rockets (CSAR) in September 1997. The outstanding quality of the faculty and staff, facilities, and research infrastructure offered by
UIUC has enabled a unique partnership between university researchers and the DOE/NNSA
Defense Program laboratories to advance the state of the art in computational simulation of
complex systems. State, regional, and university resources are also supporting the program,
and an experienced research team is fulfilling the mission of the Center.
CSAR is focusing on the reusable solid rocket motor (RSRM) of the NASA Space Transportation System, better known as the Space Shuttle, as its long-term simulation vehicle. The
RSRM is a well-established, globally recognized commercial rocket, and, most importantly,
its design data and propellant configurations are available. Various smaller scale rockets are also
simulated to provide validation data for CSAR codes. Simulations that include full geometric
and materials complexity require a sequence of incremental developments in engineering
science, computer science, and systems integration over an extended period of time. From
the outset, our emphasis has been on system integration rather than separate threads of development that eventually come together at some point in the future. Rapid exploration of critical system integration issues demanded the use of simplified (but fully integrated) models
and interfaces initially, followed by successively refined models and interfaces as experience
was gained. CSAR staff have designed and implemented a fully integrated code that includes
characterization of various burn scenarios and the onset of potential component failures. Refined multiscale component models and advanced system integration concepts based on lessons learned from this effort constitute the key features in our proposed research. Use of the
simulation code to explore scientific and engineering issues in complex fluid-structure interactions is a major focus for the new program.
More than 100 UIUC faculty, students, and researchers contribute to the success of the
Center. An External Advisory Board provides critical guidance in rocket simulation and
computational science. The DOE-supplied budget has been sufficient to maintain an aggressive research program. In addition, the University of Illinois has provided funds for ancillary
research expenditures, computer workstations, and facility renovation. Center personnel have
traveled widely to explore rocket science and technology, identify technical collaborators,
describe the ASCI/ASAP program, and establish relationships among Center investigators,
DOE/NNSA DP scientists, and industry leaders.
3.1 Simulation Development
The central goal of CSAR is the detailed, whole-system simulation of solid propellant rockets
from first principles under both normal and abnormal operating conditions. Full simulations
(Figure 3.1) of such complexity require a sequence of incremental developments in engineering science, computer science, and systems integration over an extended period of time. From the outset, however, our emphasis has been on system integration rather than separate threads of development that eventually come together at some point in the future. Rapid exploration of critical system integration issues has demanded the use of simplified (but fully integrated) models and interfaces initially, followed by successively refined models and interfaces as experience is gained (Figure 3.2).
Simulation Roadmap

Fig. 3.1: Current 3-D fully coupled code includes structural dynamics, combustion, and fluid dynamics simulation modules and ignition model. Images show interaction of solid propellant, combustion layer, and fluid flow following ignition of RSRM (clockwise from upper left: 5, 10, 15, and 20 ms). Colors in solid propellant depict local stress, colored arrows in fluid represent flow direction and speed, and colored isosurfaces in fluid show temperature distribution.

The CSAR Simulation Roadmap (Figure 3.3) depicts the evolution of increasingly sophisticated computational models for the primary rocket components and their interactions. Our initial implementation of an integrated simulation code (GEN1), which was operational at the end of 2000, provided a simplified characterization of various burn scenarios. The GEN1 code employed macroscopic models for the
separate components to enable a strong focus on the definition and resolution of system integration issues. Refined, multiscale component models and advanced system integration concepts, based on lessons learned from GEN1, constitute the key features in the second-generation (GEN2) code, developed during Years 4 and 5 and beyond. The refined models
also reflect the synthesis of fundamental, subscale studies (bottom right side of Figure 3.3)
that are critical for detailed simulations of accident scenarios and for reliable simulation of
multiscale phenomena such as combustion and turbulence. The code numbers in the diagram
indicate dependence of the refined and accident models on the subscale simulations.
The Roadmap indicates the close coupling among the components: physical quantities
such as temperature (T), mass flow rate (ṁ), pressure (p), heat flux (q), concentrations (cᵢ), and
geometry must be exchanged between the solid rocket booster (SRB) component models. The computer science integration efforts define the framework for these interconnections and, consequently, their
eventual impact on overall code performance. In the right-center box on the diagram, computer science research and development activities are shown that support the SRB simulation through the implementation and optimization of the
component models and subscale simulations, the integration of component models and the computational infrastructure
required to do large-scale parallel computation.
[Figure 3.3 diagram: code generations GEN0, GEN1 Family, and GEN2 Family arranged along axes of geometrical complexity (1-D, 2-D, 3-D; joints, star grain, accidents) and physical complexity (weakly coupled, fully coupled, detailed).]
Finally, the central placement of validation efforts in the diagram highlights the priority
assigned to this activity. Each subscale, component, and integrated simulation must be validated against existing analytical, numerical, and experimental data available in the open literature or obtained from NASA, DOD research agencies, the U.S. rocket industry, or in experiments in laboratories on our own campus.
Fig. 3.3: CSAR Roadmap showing completed tasks (dark boxes) and planned activities for Y6-10.

System integration involves two major tasks to ensure the physical, mathematical, geometric, numerical, and software compatibility of the component models and the codes implementing them. The first task is providing information transfer across component boundaries. Boundary conditions for the component models must be compatible mathematically
(e.g., an outflow from one component becomes an inflow for a neighboring component). The
discretizations of neighboring components must fit together geometrically. Different spatial
resolutions and discretization methodologies must be reconciled via interpolation where necessary. These issues have been addressed by developing innovative algorithms for mesh association and common refinement of interface meshes, together with new data transfer methods that are both physically conservative and highly accurate numerically.
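The common-refinement idea can be illustrated in one dimension. The following is a hypothetical sketch (not the actual CSAR mesh-association code): it transfers cell-averaged data between two non-matching grids through the union of their cell edges, so the integral of the transferred quantity is conserved exactly.

```python
from bisect import bisect_right

def transfer_conservative(src_edges, src_vals, dst_edges):
    """Conservatively transfer cell-averaged data from a source grid to a
    destination grid (both covering the same interval) via their common
    refinement, i.e. the sorted union of the two sets of cell edges."""
    common = sorted(set(src_edges) | set(dst_edges))
    dst_vals = [0.0] * (len(dst_edges) - 1)
    for a, b in zip(common, common[1:]):
        mid = 0.5 * (a + b)
        i = bisect_right(src_edges, mid) - 1   # source cell containing this sub-cell
        j = bisect_right(dst_edges, mid) - 1   # destination cell receiving it
        dst_vals[j] += src_vals[i] * (b - a)   # accumulate the sub-cell integral
    widths = [r - l for l, r in zip(dst_edges, dst_edges[1:])]
    return [v / w for v, w in zip(dst_vals, widths)]

# Piecewise-constant data on a 3-cell grid, transferred to a uniform 5-cell grid;
# the total integral over [0, 1] is identical on both grids.
new_vals = transfer_conservative([0.0, 0.3, 0.7, 1.0], [1.0, 2.0, 4.0],
                                 [0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
```

The production problem is of course 3-D and involves curved, moving surface meshes, but the conservation property being enforced is the same.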
The other major task is temporal coupling of the components so that the whole system is
evolved in a self-consistent manner. Different components require different time step sizes
due to the choice(s) of algorithm(s) (e.g., explicit vs. implicit methods), spatial resolution,
and/or the physics of the subproblem that the module solves. The computational cost of
forcing each module to take a time step determined by the module requiring the shortest step
is often prohibitive. In a broad, important research area that has come to be known as time
zooming, we continue to investigate multiple strategies for coupling modules requiring different time step sizes while maintaining the accuracy of the overall simulation.
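The subcycling strategy behind such coupling can be sketched with a toy model (hypothetical names and equations, not a GENx algorithm): a slow "structure" value takes one large step per coupling interval while a stiff "fluid" value takes several substeps, each seeing the interface value frozen at the start of the interval.

```python
def coupled_step(u, s, dt, nsub):
    """One system-level step of size dt for a toy fluid-structure pair.
    The slow module integrates ds/dt = u - s with a single explicit step;
    the fast module integrates the stiff relaxation du/dt = -10 (u - s)
    with nsub explicit substeps of size dt/nsub (a staggered scheme)."""
    s_new = s + dt * (u - s)            # slow module: one large step
    du = dt / nsub
    for _ in range(nsub):               # fast module: subcycled substeps,
        u = u + du * (-10.0 * (u - s))  # using the start-of-step value of s
    return u, s_new

# Despite the mismatched step sizes, the two values converge to a
# common interface state.
u, s = 1.0, 0.0
for _ in range(100):
    u, s = coupled_step(u, s, 0.1, 10)
```

Forcing the slow module to use dt/nsub would cost nsub times as many of its (expensive) evaluations; the open research question is how much accuracy such staggering sacrifices for strongly coupled physics.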
Our approach to system integration has been to develop a single executable code containing modules for the various components and the interface code for tying them together.
We are following an object-oriented design methodology that hides the data structures and
other internal details of the individual component codes. This simplifies development and
maintenance of the interface code and the component codes, and also makes it easier to swap
different versions of the same component, a critical capability for determining the most efficient algorithms and implementations.
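A minimal sketch of this design principle (the method names are illustrative, not the actual GENx interface): the orchestration loop depends only on an abstract module interface, so an explicit and an implicit solver with entirely different internal data structures can be swapped freely.

```python
from abc import ABC, abstractmethod

class StructuresModule(ABC):
    """Interface that hides each structural solver's internal details from
    the orchestration code (hypothetical API for illustration only)."""
    @abstractmethod
    def advance(self, dt, surface_load): ...
    @abstractmethod
    def surface_displacement(self): ...

class ExplicitSolver(StructuresModule):
    def __init__(self):
        self._x = 0.0                       # internal state: a bare float
    def advance(self, dt, surface_load):
        self._x += dt * surface_load        # forward step of dx/dt = load
    def surface_displacement(self):
        return self._x

class ImplicitSolver(StructuresModule):
    def __init__(self):
        self._state = {"x": 0.0}            # different internal layout
    def advance(self, dt, surface_load):
        # backward-Euler step of the toy model dx/dt = load - x
        self._state["x"] = (self._state["x"] + dt * surface_load) / (1.0 + dt)
    def surface_displacement(self):
        return self._state["x"]

def run(solver: StructuresModule, steps, dt, load):
    """Orchestration loop: works identically for any conforming module."""
    for _ in range(steps):
        solver.advance(dt, load)
    return solver.surface_displacement()
```

Because `run` never touches solver internals, timing two implementations against each other requires changing only the constructor call.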
3.2 Simulation Demonstrations
tion of improved physical models (turbulence, aluminum droplets, smoke, radiation, plume
chemistry; constitutive relations for the propellant, case, and insulation; heat transfer, ignition, and burn rate, etc.) may warrant revisiting a given problem to demonstrate better
agreement with experimental results. This preliminary list of problems includes Titan IV
propellant slumping, flexible inhibitor, aerodynamic loads (rocket in a wind tunnel), detailed
full burn of various small rockets supplied by the U.S. rocket industry, RSRM normal burn,
flow past O-rings, and Titan case rupture.
Integrated Rocket Simulations
Titan IV propellant slumping This real-life solid propellant booster accident is described in the
open literature. Published documentation includes detailed pressure sensor and hoop strain gauge
measurements, as well as 2-D simulation results for comparison with our 3-D simulation results. The
current GENX code will soon be able to reproduce the conditions that lead to rupture of the case.
This will require a 10 to 15-day run on 512 processors for a coarse mesh study. The amount of
slumping that can be followed is limited by fluids mesh motion. December 2002
Flexible bore inhibitor The current GENX code should be able to simulate a rocket with a flexing
inhibitor provided the deformation is not too great, perhaps a maximum deflection angle of ~45 degrees. The maximum deformation of an inhibitor in a full rocket model might be less than it is in our
model of just a section, unless we carefully choose time-dependent boundary conditions. We have
already generated a preliminary fluids mesh. December 2002
Aerodynamic loads on the case (for lab scale rockets) This topic is of particular interest to the
rocket industry, and the current GENX code could easily be extended to include the air in a region
outside the rocket. The fluids domain would include a virtual wind tunnel outside the case, with a
time-dependent inflow speed determined by the thrust history of the rocket. The non-rigid case
would have a fluid-structure boundary on the outside, as well as the usual loads from the internal gas
pressure. Most of the work to be done would involve setting up appropriate boundary conditions and
meshing the initial geometry. June 2003
Lab scale rockets from ignition to burn-out We can already reproduce the measured pressure
overshoot, but the pressure rise is much slower in the experimental data than in our simulations using the current ignition model (Rocburn v2.2). The CEM and Fluids teams will collaborate to develop an improved model of the heat flux from the gas to the propellant. To simulate propellant
burn-out, we need at least a rudimentary remeshing capability to handle the gradually changing topology. We expect that the unstructured fluids and structures solvers (Rocflu and Rocfrac) along
with the serial mesh repair code will be able to address this problem in a time frame of about a year.
June 2003
ARC Rockets Atlantic Research Corporation (ARC) has agreed to supply us with detailed design
and experimental data for small guidance rockets, both as a validation test for our code and to assist
them with improving their designs. The most difficult aspects of simulating these small motors are accurately modeling ignition transients and adapting the meshes automatically as the propellant burns
fluid domain as the system evolves, and cracks can form in the propellant and other components. An
advanced 3-D remeshing capability is required for such topology changes. December 2003
RSRM Challenger accident (simulate flow past O-rings) This problem requires a structures
solver that handles contact, so perhaps this is an item for the addendum on what could be done if we
are given additional funding. The fluids domain topology changes when the gas begins to pass
around the O-ring, which requires advanced remeshing capabilities. June 2004
RSRM complete normal burn The ignition transients for the Space Shuttle booster are well characterized in the open literature, and we have access to extensive test data. We hope to reproduce the
waterfall plot (amplitude of vibration as a function of frequency) at various stages during firing.
The most difficult aspect of simulating the entire history of a large motor is reducing the run time.
For a fluids mesh that is fine enough to allow accurate turbulence modeling, for example, time
zooming techniques under consideration by the Fluids team will be required to reach 120 seconds of
physical problem time. December 2004
Titan IV case rupture accident In this simulation, the pressure builds up until the case fails. The
failure begins as a crack that propagates all the way through the case. The fluid domain would include the air outside the case, so hot gas would begin to leak from the crack in the case into the surrounding air. The case would tear open, relieving the pressure. In the test firing, the rocket broke up
violently into many pieces, destroying the test stand, but there was no detonation. By the time we
implement the required remeshing capability, we may also have an advanced material model for the
propellant that includes the effect of voids and dewetting. June 2005
[Figure: Structures Group roadmap: structural simulations (Rocfrac, explicit; Rocsolid, implicit); case modeling; solid propellant modeling; and constitutive and failure modeling at the macroscale (continuum: MTS, Hill's potential) and mesoscale (continuum: multi-grain, particulate composite).]
will involve the combination of a cohesive finite element scheme to capture the spontaneous
dewetting of the particles; the virtual internal bond (VIB) method to simulate the tearing of
the matrix; and packing algorithms to create realistic microstructures. The analysis will be
performed under both quasi-static and dynamic conditions, and will rely on a proposed novel
iterative solver that takes advantage of the slowly evolving nature of the secant or tangent
stiffness matrix as cohesive and/or VIB elements progressively fail. This project involves
close collaboration with research groups at Los Alamos and Sandia National Laboratories.
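The idea of exploiting a slowly evolving stiffness matrix can be illustrated generically (this is a sketch of the principle, not the proposed solver): a Richardson iteration that reuses the factorization of an earlier stiffness matrix, supplied here as an inverse-apply callback, as a preconditioner for the slightly changed current matrix.

```python
def solve_with_lagged_matrix(K_old_inv, K_new, b, tol=1e-10, max_iter=200):
    """Solve K_new x = b by Richardson iteration x <- x + K_old^{-1} r.
    When K_new differs only slightly from K_old (e.g. a few cohesive
    elements have failed since the last load step), the old factorization
    is an effective preconditioner and re-factorization is avoided."""
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        # residual r = b - K_new x
        r = [b[i] - sum(K_new[i][j] * x[j] for j in range(n)) for i in range(n)]
        if max(abs(v) for v in r) < tol:
            break
        dx = K_old_inv(r)                       # reuse the old factorization
        x = [xi + di for xi, di in zip(x, dx)]
    return x
```

Convergence degrades as K_new drifts away from K_old, which is exactly why the slow evolution of the secant or tangent stiffness during progressive failure makes the approach attractive.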
Micromechanics Approach to Study of Damage and Constitutive Response P. Sofronis
Finite element methodology and principles of mechanics of materials are used to study and
understand issues governing the mechanical damage response of a solid propellant under
both static and dynamic loading. We will address macroscopic propellant nonlinearity resulting from the interaction between the micromechanisms of damage at the microscale. The
derived constitutive laws will be used to investigate the phenomena of shear localization and
hot spot generation under dynamic loading. These phenomena take place spatially at both
nano (interface cohesive laws) and micron-scale (porosity, localized-band width), and may
evolve in time over picoseconds (initiation of dewetting) or a few seconds (void elongation).
The importance of atomistic simulations in assessing specific material-parameter inputs to
the micromechanical models will be explored. This project is part of an ongoing research
collaboration on void growth with the Lawrence Livermore National Laboratory.
Atomistic Calculations of Debonding, Plastic Behavior and Fracture R. Averback
This project provides convenient interfacing between atomistic simulations and continuum
modeling. In achieving these goals, we will develop analytic potentials that describe interatomic interactions in and between different classes of materials: metals, ceramics, and polymers. Efforts will focus on developing appropriate reactive potentials for polymer/metal and
polymer/metal-oxide interfaces and on developing a scheme to combine molecular dynamics
(MD) with kinetic Monte Carlo (KMC) methods for alloy systems. Atomistic simulations are
then employed to calculate the mechanical response of multiphase materials subjected to high
strain conditions at various temperatures and to develop constitutive laws near interfaces.
These results will provide input data for other CSAR continuum models.
Structural System Simulation
Adaptive Version of Rocfrac P. Geubelle and S. Breitenfeld
Developed over the past three years, Rocfrac is a structural solver used in the GEN2 integrated code development effort. The code is based on an explicit time stepping scheme and
relies on a 3-D Arbitrary Lagrangian Eulerian formulation to capture the regression of the
solid propellant. It includes nonlinear kinematic description to account for possible large deformations and/or rotations, and contains a variety of volumetric responses, ranging from linear elasticity to the Arruda-Boyce nonlinearly elastic model used for the grain response and
to a specially developed elasto-viscoplastic model for the case. The Rocfrac element library
currently includes 4- and 10-node tetrahedral volumetric elements used to model the bulk
response of the rocket's structural components, and 6- and 12-node cohesive elements used
to capture the initiation and rapid propagation of one or more cracks in the grain and along
the grain/case interface. We will continue the development and integration of this structural
solver, and, in particular, address the following aspects:
- Develop and implement the next generation of Rocfrac, with special emphasis on the introduction of improved low-order tetrahedral elements and on dynamic mesh adaptivity
- Complete the full integration of Rocfrac within the Charm++/FE (SWIFT) Framework
- Apply the integrated unstructured rocket code to various dynamic fracture problems, with special emphasis on the Titan IV grain slumping accident
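The explicit time-stepping scheme underlying solvers of this kind can be sketched in its simplest lumped-mass form. This is a generic semi-implicit (central-difference-style) update for illustration, not the actual Rocfrac kernel:

```python
def explicit_step(x, v, masses, internal_force, dt):
    """One explicit step for a lumped-mass system: accelerations from the
    current internal forces, then velocities, then positions (semi-implicit
    Euler, equivalent to central difference up to a half-step velocity shift)."""
    f = internal_force(x)
    a = [fi / m for fi, m in zip(f, masses)]
    v = [vi + dt * ai for vi, ai in zip(v, a)]   # update velocities first
    x = [xi + dt * vi for xi, vi in zip(x, v)]   # then positions
    return x, v

# Single elastic degree of freedom, f = -k x with k = m = 1: the scheme
# tracks the oscillation with near-constant energy for small dt.
x, v = [1.0], [0.0]
for _ in range(1000):
    x, v = explicit_step(x, v, [1.0], lambda q: [-q[0]], 0.01)
```

Because each step needs only a force evaluation and no linear solve, the cost per step is low, but the step size is limited by the stability (CFL-type) bound of the stiffest element.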
tion technique positioned to be uniquely responsive to CSAR's simulation requirements. Inherent O(N) computational complexity, rich parallel structure, support of fully unstructured
spacetime grids, strong element-wise conservation properties, intrinsic high-order stability
and natural shock-capturing properties make STDG methods an ideal vehicle for high-resolution simulations. Parallel, hp-adaptive STDG implementations promise to significantly
extend our capabilities for long-duration and multi-scale rocket simulations. We propose
continuing research on STDG methods, leveraged by a parallel effort in CPSD, to realize and
demonstrate these advantages. This program will involve new research on STDG formulations, spacetime mesh generation, visualization of spacetime data sets, and parallel implementations. The research involves mesh motion and adaptive grids, time zooming, and multiscale modeling of propellants.
3.4 Fluid Dynamics
The Fluid Dynamics Group addresses system-scale SRM multiphase compressible core flow code development, as well as subscale model development relevant to the turbulent dynamics of the combustion interface. These include injection, dispersion, and combustion of aluminum droplets in the core flow; formation, dispersion, and slag accumulation of aluminum oxide particles; and flow within cracks and other defects in the propellant. Broadly viewed, Fluids Group research includes multiphysics code module development (Rocflo and Rocflu), individual physics modules (Rocturb, Rocpart, Rocsmoke, Rocrad, and Rocspecies), and fundamental research projects needed to support simulation code development (Figure 3.5).

Fig. 3.5: Fluid Dynamics Group technical roadmap provides supporting fundamental research for integrated code components. (Diagram: 3-D structured (Rocflo) and unstructured (Rocflu) code development; modeling modules for LES turbulence (Rocturb), Al droplets (Rocpart), Al oxide smoke (Rocsmoke), radiation (Rocrad), and chemistry (Rocspecies); supporting research on optimal LES, multi-phase flow, and injection-driven flow.)
great importance to model the turbulence in the core flow. The primary turbulence modeling
paradigm that has been selected for this application is Large Eddy Simulation (LES). LES
has the advantage of resolving the large-scale turbulence fluctuations, which are needed in
modeling the interactions of turbulence with other phenomena (e.g. particle dispersion, interaction with the combustion layer, shedding from inhibitors). Earlier simulations of simple
rocket motors were performed with no-model LES, in which artificial dissipation is
used to regularize the fluid flow equations. However, such a numerical treatment lacks a rigorous physical basis. It fails, for instance, to adequately predict transition without adjusting
the level of artificial dissipation in an ad hoc manner. The main purpose of the Rocturb module
is therefore to provide physically based models for simulating turbulent flow inside a rocket
motor.
In its development and improvements, Rocturb relies on four areas of basic research.
First, research on application of turbulence models within the solid rocket simulation framework itself. This research is directly related to the turbulence phenomena that occur in the
rocket flow and its simulation. Results of these studies can, thus, be directly implemented in
the Rocturb module. Second is research on a new class of turbulence models called optimal
LES, in which LES models are developed by directly approximating the best possible deterministic LES model. The first use of optimal LES in an applications fluids code will be in
Rocfluid through the Rocturb module. Third is research on the physics and modeling of the
turbulence fluctuations that are generated by the combustion layer and its interaction with the
core flow. This poorly understood process sets the boundary conditions for the turbulence in
the core flow. Finally, there is research on the LES of multiphase flows. The particles in the
rocket are subgrid in size. They therefore influence and are dispersed by the subgrid flow
fields. This interaction must be modeled, and represents a coupling between Rocturb,
Rocpart and Rocsmoke models.
Rocpart F. Najjar, S. Balachandar, and J. Ferry
Modern solid-propellant rocket motors have aluminum particles added to the propellant mix, and their
inclusion contributes up to 30% of the heat generation. Within the present multi-physics
framework, these Al droplets and the large oxide caps are taken into account through
Rocpart. Several enhancements to the present Rocpart are proposed: incorporation of improved force laws, heat and mass transfer correlations, and burn-rate models to better account for flow conditions within the rocket; an improved injection model; modeling of oxide smoke entrapment resulting in slag accumulation in the submerged nozzle; modeling of the effect of subgrid turbulence on droplet evolution; improved Lagrangian-Eulerian coupling; and thorough verification and validation. These enhancements require fundamental investigation, and in particular we propose continuing our very productive study of an isolated particle in complex ambient flow.
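As one example of the kind of force-law correlation such a module encapsulates, the widely used Schiller-Naumann finite-Reynolds-number drag correction for an isolated sphere, and the resulting droplet velocity response time, can be written as follows (illustrative only; the correlations actually adopted in Rocpart may differ):

```python
def drag_coefficient(re):
    """Schiller-Naumann correlation for an isolated sphere,
    Cd = (24/Re) (1 + 0.15 Re^0.687), valid up to Re ~ 1000.
    Reduces to the Stokes result Cd = 24/Re as Re -> 0."""
    if re < 1e-12:
        return float("inf")          # Stokes limit; caller uses tau_p instead
    return 24.0 / re * (1.0 + 0.15 * re ** 0.687)

def particle_relaxation_time(rho_p, d_p, mu, re):
    """Droplet velocity response time: the Stokes value
    tau_p = rho_p d_p^2 / (18 mu), divided by the same
    finite-Re correction factor."""
    tau_stokes = rho_p * d_p ** 2 / (18.0 * mu)
    return tau_stokes / (1.0 + 0.15 * re ** 0.687)
```

The response time relative to the resolved flow time scales determines how strongly a droplet lags the gas, which is why improved correlations feed directly into dispersion and slag-accumulation predictions.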
Rocsmoke and Rocspecies J. Ferry, S. Balachandar, and F. Najjar
Aluminum oxide smoke is an important by-product of combustion in modern aluminized
propellants and it impacts the flow physics inside the rocket motor in several critical ways.
Smoke modulates the radiative transfer of energy, it gets trapped in the submerged nozzle as
slag, and it can represent over 20% of the mass inside a motor. This mass exhibits variations
of concentration that are far more complex than the variations in gas density. Therefore we
propose to continue our fundamental multiphase flow research as a means to assess, develop,
and improve continuum models of multiphase flow. Specific areas of focus are the continued
development of the Equilibrium Eulerian method; internal boundary conditions (allowing
different methods to be used in adjacent regions); compressibility effects (interaction of
smoke with acoustic and shock waves); interphase coupling; slag dynamics; collision and
diffusion modeling; and LES modeling for smoke. This research encompasses the development of methods for handling species, especially taking into account real gas effects.
Fundamental Supporting Fluid Dynamics Research
Experimental Studies of Rocket Fluid Mechanics R. Adrian
We propose continuing an experimental program to provide physical data on the fluid mechanics of turbulent core-flow, combustion and two-phase flow in solid fuel rockets. These
data support the development of physical models needed for the computational fluid dynamics formulations and provide validation of computations of basic rocket configurations. We
employ small-scale models and rocket flow simulations that allow phenomena important to
the numerical simulations to be isolated and studied in well-defined conditions and simple
geometries amenable to computational development. The studies include investigations of
the sensitivity of the core flow to detailed structure of the wall velocity boundary condition,
detailed measurements of the velocity field of the hot exhaust plume, and construction and
testing of a standard rocket model whose fuel can be modified to study characteristics of
turbulence, two-phase flow, and combustion in the burning exhaust plume.
Coagulation and Fragmentation of Al Particles and Ash H. Aref and S. Balachandar
We are studying equations describing coagulation and fragmentation phenomena with a view
to applying the results to the aluminum particles used in solid-rocket fuels and the ash particles produced once these particles have burned. Modeling of the size and mass distribution of
fuel and ash particles is an important input to the global simulation codes being constructed.
This research is one element of a broader thrust within the CSAR particle group.
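The governing equations referred to here are of Smoluchowski type. A minimal discrete sketch (illustrative, not the CSAR model) shows one explicit step and the mass bookkeeping involved:

```python
def coagulation_step(n, K, dt):
    """One explicit Euler step of the discrete Smoluchowski coagulation
    equations. n[k] is the number density of particles of size k+1 and
    K(i, j) the collision kernel (sizes are 1-based). Each collision of
    sizes i and j consumes one particle of each and produces one of
    size i+j; the 0.5 factor corrects for counting ordered pairs twice."""
    N = len(n)
    dn = [0.0] * N
    for i in range(1, N + 1):
        for j in range(1, N + 1):
            rate = K(i, j) * n[i - 1] * n[j - 1]
            dn[i - 1] -= rate                  # size i consumed
            if i + j <= N:
                dn[i + j - 1] += 0.5 * rate    # size i+j produced
    return [ni + dt * dni for ni, dni in zip(n, dn)]
```

Total mass (the first moment, sum of k times n_k) is conserved as long as coagulation products stay within the truncated size range; fragmentation adds analogous source and sink terms of opposite sign.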
Particle Dynamics in High-Speed Flows H. Aref
The dynamics of particles advected by a flow are only known approximately, in general, although they are known with considerable accuracy in the limits of irrotational flow and when
the Reynolds number of the relative motion of particle and fluid is small. Much of the literature ignores effects from particles being non-spherical and from the center of mass and the
centroid of the particle not coinciding. We propose to gain a thorough understanding of particle motions in the types of flow that prevail in the solid booster rocket by considering the
Kirchhoff-Kelvin equations and modifications thereof. These equations should enable us to
achieve insight into the motion of irregular objects, both homogeneous objects of irregular
shape and objects of regular shape with an inhomogeneous mass distribution; to parametrize
the effects of particle-flow interactions in an imposed, essentially unidirectional flow; and to parametrize particle-particle interaction effects. These
issues are relevant as a basis for modeling individual particle motions in the rocket, particle
interactions and collisions, and processes leading to coagulation of particles and build-up of
ash and slag.
Turbulence Research R. Moser, R. Adrian, J. Freund, B. Wasistho, and A. Haselbacher
The flow inside an SRM is of very large Reynolds number and is therefore turbulent. The
turbulence affects the transport of aluminum and aluminum oxide particles, the interaction of
the gas flow with the combustion layer and with solid components such as inhibitors and the
nozzle, heat transfer to solid surfaces (e.g. the nozzle), and many other fluid interactions. In
the Rocfluid code, turbulence is modeled using Large Eddy Simulation (LES) and several
LES models have been incorporated and tested in the Rocturb module. There are, however, a
number of areas in which further development and research are required.
New model development and implementation: A new class of LES models, called optimal
LES models, is being developed. The first implementation of an optimal LES model for
a real application is planned for Rocturb within the year. Further development to support
more complex flow physics and better statistical models is also planned.
Combustion-driven turbulence fluctuations: Among the driving forces for the development of turbulence are the fluctuations introduced at the combustion layer and the shear
instabilities produced by long-wave axial acoustic modes. Using information from
Rocburn and hydrodynamic simulations, the physics of this interaction will be further investigated, and physics-based models for the injected turbulence fluctuations will be developed and implemented in Rocturb or as a boundary condition module.
Multiphase LES: Subgrid turbulence interacts with aluminum and oxide particles in the
flow. The particles act to produce subgrid turbulence and the subgrid turbulence disperses
the particles. Models are needed for both of these effects, and they must be integrated
with Rocturb, Rocpart and Rocsmoke.
Validation Data: Validating the turbulence modeling and simulation in a solid rocket is
challenging because the hostile flow environment makes detailed measurements difficult.
Our LES models must thus be validated against simple laboratory experiments and detailed (DNS)
simulations. The ongoing work to produce the data needed for validation and other purposes is planned to continue.
3.5
modeling of solid propellant flames, combustion instability analysis, and constitutive modeling of energetic materials.
The objective of our work on combustion of
ammonium perchlorate (AP) solid propellants is
to develop a simulation capability that will allow
reliable prediction of macroscopic combustion
behavior, particularly burning rate, from fundamental material properties and formulation parameters, including AP particle size distribution.
Our approach is based on characterizing the microscopic behavior of a burning propellant in the
critical, burning rate-determining region near the
surface of the propellant and developing microstructural combustion models that can simulate
both micro- and macro-burning behavior.
[Figure: schematic of the combustion zone near the propellant surface, showing AP grains and binder in the solid propellant, the melt layer, the combustion layer and flame, aluminum particles (including a burning Al particle), mass injection at the regression rate, and the outflow region.]
3.6
Computer Science
further develop several classes of parallel preconditioners based on our previous work and
that of others.
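As a concrete illustration of this class of methods, a block-Jacobi preconditioner performs an independent solve on each diagonal block, so the blocks map naturally onto processors. The sketch below (plain Python with NumPy; a minimal illustration under our own naming, not the Center's solver code) uses it inside a standard preconditioned conjugate gradient iteration.

```python
import numpy as np

def laplacian_1d(n):
    """Standard 1-D Poisson matrix: 2 on the diagonal, -1 off-diagonal."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def block_jacobi_apply(A, r, block_size):
    """Apply z = M^{-1} r, where M is the block-diagonal part of A.
    Each block solve is independent, so blocks map naturally to processors."""
    n = A.shape[0]
    z = np.empty_like(r)
    for start in range(0, n, block_size):
        end = min(start + block_size, n)
        z[start:end] = np.linalg.solve(A[start:end, start:end], r[start:end])
    return z

def pcg(A, b, precond, tol=1e-10, maxiter=200):
    """Textbook preconditioned conjugate gradient iteration."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

n = 64
A = laplacian_1d(n)
b = np.ones(n)
x = pcg(A, b, lambda r: block_jacobi_apply(A, r, block_size=16))
print(np.linalg.norm(A @ x - b))  # small residual
```

Because the diagonal blocks of a symmetric positive definite matrix are themselves SPD, the block solves are well defined and the preconditioner preserves the symmetry that CG requires.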
Mesh Adaptation and Refinement D. Guoy and M. Heath
Integrated, multi-component simulations require a wide range of meshing capabilities. Perhaps the greatest challenge is dealing effectively with dynamically changing geometries and
correspondingly changing meshes. Integrated rocket simulations require a variety of element
types (hexahedral and tetrahedral), mesh types (volume meshes and surface meshes), and
levels of adaptation (from minor mesh motion to substantial geometric change requiring major mesh repair or complete remeshing). We will develop techniques for addressing each of
these cases. We have already developed effective techniques for smoothing volume meshes
subject to gradual change, such as in propellant burning, and we plan to extend these to
smoothing surface meshes, such as the interface between fluid and solid regions, where relocated nodes must be constrained to lie on the (generally curved) surface. For more substantial
geometric changes, such as flexible inhibitors or propagating cracks, we will develop further
our mesh repair capabilities based on multiple steps of node movement, mesh coarsening,
and mesh enrichment. Finally, to deal with more drastic geometric change, such as propellant
burnback over an extended period, we plan to develop new capabilities for generating an entirely new mesh based in part on our previously developed techniques for sink insertion and
sliver removal.
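Gradual mesh motion of the kind described above is often handled by constrained Laplacian smoothing, in which each free node is relaxed toward the centroid of its neighbors while boundary nodes stay fixed. The sketch below is a minimal generic version with names of our own choosing, not the Center's actual smoother.

```python
import numpy as np

def laplacian_smooth(coords, neighbors, fixed, iterations=20, relax=0.5):
    """Generic Laplacian mesh smoothing: each free node is relaxed toward
    the centroid of its neighbors. `neighbors[i]` lists the nodes adjacent
    to node i; `fixed[i]` marks constrained nodes that must not move."""
    coords = coords.copy()
    for _ in range(iterations):
        new = coords.copy()
        for i, nbrs in enumerate(neighbors):
            if fixed[i] or not nbrs:
                continue
            centroid = coords[nbrs].mean(axis=0)
            new[i] = (1 - relax) * coords[i] + relax * centroid
        coords = new
    return coords

# A distorted 1-D chain with pinned endpoints relaxes toward even spacing.
pts = np.array([[0.0], [0.5], [2.0], [3.0], [4.0]])
nbrs = [[1], [0, 2], [1, 3], [2, 4], [3]]
pinned = [True, False, False, False, True]
smoothed = laplacian_smooth(pts, nbrs, pinned, iterations=200)
```

For surface meshes, as the text notes, the relaxed position would additionally be projected back onto the (generally curved) constraint surface; that projection step is omitted here.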
Mesh Correlation, Data Transfer, and Interface Propagation M. Heath and X. Jiao
We will continue our work on the issues of correlating multiple meshes, data transfer between meshes, and interface propagation. These problems are directly related to the coupled
simulation code. Mesh correlation and data transfer are needed in many situations that involve multiple meshes, such as the interface between fluid and solid domains and a dynamic
solver involving adaptive remeshing. We have achieved reasonable success on mesh correlation and data transfer for surface meshes. We propose further work on adaptivity, parallelization, robustness, and generalization to more types of meshes (for example, volume
meshes). The problem of interface propagation is to track the motion of the interface. The
current propagation scheme in GEN2 is fairly ad hoc and is problematic at ridges and corners. We have developed a new approach in two dimensions and propose extending this
method to three dimensions. We propose tackling the problem in two steps. First, we will develop a limited algorithm assuming fixed connectivity for quick prototyping and easy integration. Second, we will develop a more general algorithm that allows full adaptivity of the
interface for better accuracy.
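Conservative data transfer between non-matching meshes can be illustrated in one dimension: integrate the source field over the overlaps of source and target cells (a common refinement of the two meshes) and divide by the target cell size. This sketch is illustrative only and is not the Rocface implementation, which addresses three-dimensional surface meshes.

```python
import numpy as np

def common_refinement_transfer(src_edges, src_vals, tgt_edges):
    """Conservative transfer of piecewise-constant data between two
    non-matching 1-D meshes: integrate over the overlap of every source
    and target cell (the 'common refinement'), then divide by target size."""
    tgt_vals = np.zeros(len(tgt_edges) - 1)
    for j in range(len(tgt_edges) - 1):
        lo, hi = tgt_edges[j], tgt_edges[j + 1]
        total = 0.0
        for i in range(len(src_edges) - 1):
            overlap = min(hi, src_edges[i + 1]) - max(lo, src_edges[i])
            if overlap > 0:
                total += overlap * src_vals[i]
        tgt_vals[j] = total / (hi - lo)
    return tgt_vals

out = common_refinement_transfer([0.0, 1.0, 2.0, 3.0], [1.0, 2.0, 3.0],
                                 [0.0, 1.5, 3.0])
```

By construction, the integral of the field over the interface is identical on both meshes, which is the property that matters for coupled conservation laws.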
Computational Environment
Software Support for System Integration, Error Estimation, and Error Control E. de
Sturler and X. Jiao
An important effort of computer science is to develop software tools for easing the development and integration of physical components. We propose to develop a set of new tools for
error estimation, error control, and stability analysis. These tools will provide a means to
monitor the accuracy and stability of both the individual applications and the integrated code,
so that the accuracy and stability of a simulation can be established or violations of such assumptions can be identified. This will make it easier for application developers and
users to perform code verification and debugging. Further, it can also serve as a basis for solution-based mesh adaptation and time zooming. Much of this effort will take advantage of
our previous work and be conveniently built upon our existing software framework. We have
developed a set of software tools for supporting intercomponent communication and orchestration, including Roccom, Rocnum, and Rocface. We also propose the continuing development and maintenance of these tools to meet new challenges of the more dynamic behavior
of the rocket simulation code in the coming years.
AMPI and Component Interface Techniques L. Kale and O. Lawlor
The integrated codes that we propose to develop arise from multiple research groups and development teams. Each will be parallel, have dynamic behavior and need to communicate
data to other modules, in addition to internal communication within each parallel module.
This project proposes further developing AMPI, an adaptive implementation of MPI, and an
orchestration and interface framework to help accomplish these objectives. The AMPI system currently provides automatic load balancing, automatic checkpointing and the ability to
communicate across independent modules for MPI programs. However, porting MPI codes
to AMPI still requires some effort, notably in writing pack-unpack functions and in collecting
global variables. We aim to fully automate this process, so no extra work needs to be done by
an MPI programmer. Further, AMPI itself will be enhanced by making the implementation
more comprehensively standard-conforming (including some MPI-2 features) and by improving its communication efficiency. An orchestration and interface framework will be developed that allows each component to publish its boundary data without being explicitly
aware of which other modules will use it, and correspondingly use complementary boundary
data coming from other modules in an equally opaque manner.
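The pack-unpack functions mentioned above serialize a module's state so the runtime can migrate a virtual MPI process between processors. The following toy sketch (Python, with a hypothetical SolverModule class of our own devising; AMPI itself targets C/C++/Fortran codes) shows only the round-trip idea.

```python
import pickle

class SolverModule:
    """Toy stand-in for one virtual MPI process's state (names hypothetical).
    A pack-unpack routine like this is what lets a runtime such as AMPI
    checkpoint a module or migrate it to another processor for load balance."""
    def __init__(self, mesh, solution, step):
        self.mesh = mesh          # connectivity / coordinates
        self.solution = solution  # field data
        self.step = step          # time-step counter

    def pack(self):
        # Serialize everything the module needs to resume after migration.
        return pickle.dumps({"mesh": self.mesh,
                             "solution": self.solution,
                             "step": self.step})

    @classmethod
    def unpack(cls, buf):
        # Reconstruct an equivalent module on the destination processor.
        state = pickle.loads(buf)
        return cls(state["mesh"], state["solution"], state["step"])

m = SolverModule(mesh=[0, 1, 2], solution=[0.0, 0.1, 0.2], step=5)
m2 = SolverModule.unpack(m.pack())
```

Automating exactly this bookkeeping, including the migration of global variables, is the porting effort the text proposes to eliminate.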
Component Frameworks L. Kale and O. Lawlor
Although the parallel application modules being developed for rocket simulation, as well as
for other physical simulations, employ a large number of different algorithms, discretizations,
and numerical methods, they all rely on a few basic data structures, such as particles, structured grids, and unstructured meshes. Parallel implementations of each type of code, when
attempted from scratch, require a significant programming effort beyond that needed for sequential programming. Further, adaptive refinements present a significant challenge for each
data structure: Adaptive Mesh Refinement (AMR) for structured grids; unstructured mesh
refinements and changes such as insertion of cohesive elements; and tree rebuilds for particles. We are developing component frameworks that dramatically reduce the effort needed to
use these algorithms in a parallel program, by providing abstractions that encapsulate or hide
the parallelism in the runtime system. The frameworks include: the MBlock framework for
structured grid based applications, with automatic handling of communication for curvilinear
geometries, an AMR extension to MBlock, an unstructured-mesh framework with applications in structures and finite-volume methods, and a particle framework. Work on the last
two will leverage support from other grants as well as CSAR funding. The proposed work
includes both new capability development and help with integrating the frameworks
into mainstream Roc* codes.
Load Balancing and Communication Optimizations L. Kale and O. Lawlor
Current and future CSAR simulations will have dynamic behavior and run on large next-generation machines, where communication is expensive relative to computation. We plan to
Visualization
mats including HDF5 and streaming data, which would allow on-the-fly visualization of a
running simulation. Scalability of the MPI parallel server Houston will be improved by including dummy cell values in each data block to reduce communication significantly.
Rocketeer surpasses many visualization tools in its ability to handle a wide variety of grids
on which the data is defined. The grid may be non-uniform, structured or unstructured, and
multiblock. Rocketeer can display multiple data sets for multiple materials from multiple
files in a single image. It can perform the same set of graphics operations automatically on a
series of data files to produce frames for animation. Voyager is a fully featured MPI parallel
batch mode version of Rocketeer that takes advantage of a parallel platform by processing
concurrently a different snapshot on each CPU. Voyager takes as input a text file saved during an interactive Rocketeer session which specifies the camera position, a list of graphics
operations to perform, and a list of all HDF files to process.
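The snapshot-per-CPU scheme can be sketched as a simple static assignment of snapshot files to ranks. The function below is illustrative only; Voyager's actual distribution policy may differ.

```python
def assign_snapshots(files, ncpus):
    """Round-robin assignment of snapshot files to CPUs, one snapshot per
    task, in the spirit of a batch renderer that processes a different
    snapshot concurrently on each CPU (the scheme here is our own sketch)."""
    return {rank: files[rank::ncpus] for rank in range(ncpus)}

plan = assign_snapshots(["f0.hdf", "f1.hdf", "f2.hdf", "f3.hdf", "f4.hdf"], 2)
```

Because each snapshot is rendered independently with the same saved list of graphics operations, no communication between CPUs is needed during the batch run.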
3.7
generate these problems. For the integrated code, methods for computing values such as mass
conservation across the solid/fluid interface will be extended.
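A discrete interface conservation check of the kind mentioned here can be as simple as summing the flux mismatch over interface faces. The sketch below is a generic illustration with hypothetical names and data layout, not the integrated code's actual diagnostic.

```python
def interface_mass_balance(solid_flux, fluid_flux, areas):
    """Discrete mass-conservation check across a solid/fluid interface:
    the net imbalance is the sum over interface faces of
    (solid-side mass outflux - fluid-side mass influx) * face area.
    A nonzero result flags a conservation error in the coupling."""
    return sum((s - f) * a for s, f, a in zip(solid_flux, fluid_flux, areas))

balanced = interface_mass_balance([1.0, 2.0], [1.0, 2.0], [0.5, 0.5])
leaky = interface_mass_balance([1.1], [1.0], [2.0])
```

In a coupled burn simulation, the solid-side flux would come from the regression rate and propellant density, and the fluid-side flux from the injection boundary condition; the two should agree face by face.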
Validation
Validation, on the other hand, is making sure the model solves the correct equations. Thus,
it is important to be able to compare the results of the code(s) against known physical results.
Several problem sets are currently available (Titan IV, labscale rocket), but more need to be
developed (other labscale geometries, Space Shuttle RSRM, etc.). Collection and analysis of
appropriate data is a large effort, as is constructing the runtime models for these problems.
Running any of these physical problems requires a large amount of computer time and
generates many GB of output data. Currently the Rocketeer visualization tool is available to
view the resultant data sets, but no tool is available to analyze the raw data, and/or compare
two data sets. We propose a tool be constructed to enable the extraction of arbitrary data
points, sets, etc. from CSAR code output, and that the new tool be able to compare two ostensibly similar data sets to assess their convergence. This tool would be available on
demand and would also interface with the Roctest automated test suite to assess the results of
the periodic regression tests.
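The core of such a comparison tool is a difference metric between two ostensibly similar fields. A minimal sketch, with a hypothetical interface of our own devising:

```python
import numpy as np

def compare_fields(a, b):
    """Compare two ostensibly similar data sets: return the max-norm and
    RMS of the pointwise difference, the raw numbers a regression or
    convergence check would threshold (interface is hypothetical)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    return {"max_abs": float(np.max(np.abs(diff))),
            "rms": float(np.sqrt(np.mean(diff ** 2)))}

report = compare_fields([1.0, 2.0, 3.0], [1.0, 2.0, 4.0])
```

In an automated test suite, a run would pass when both metrics fall below tolerances recorded with the reference data set.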
A series of grid-convergence studies using a lab-scale rocket has begun and will continue.
Grid-convergence studies on other problems and at other scales will also be performed, as will
grid-convergence studies for models with known analytical results, as verification
activities. Code uncertainty and sensitivity to model parameters are also important. Because a full statistical uncertainty analysis would require prohibitively long CSAR code runs, we propose researching methods that can estimate uncertainties in code results without years of run time.
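In grid-convergence studies with three systematically refined grids, the observed order of accuracy is commonly estimated from the standard three-grid formula, assuming the grids lie in the asymptotic range and share a constant refinement ratio:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
    """Observed order of accuracy from solutions on three systematically
    refined grids (standard grid-convergence analysis; valid when the
    grids are in the asymptotic range with a constant refinement ratio)."""
    return math.log(abs(f_coarse - f_medium) /
                    abs(f_medium - f_fine)) / math.log(refinement_ratio)

# Manufactured second-order behavior: f = 1 + 0.01*h**2 with h = 4, 2, 1.
p = observed_order(1.16, 1.04, 1.01, refinement_ratio=2)
```

Comparing the observed order against the scheme's formal order is a compact verification statement, and the same differences feed a Richardson estimate of the discretization error.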
Many of the simulation demonstrations listed in Section 3.2 will also serve as validation
tests. In particular, we either already have or expect to obtain detailed test data for a number
of lab-scale rockets and small guidance motors for comparison with our simulation results. In
addition, under our Space Act Agreement with NASA Marshall Space Flight Center in
Huntsville, Alabama, we have received access to extensive design and test data for the Space
Shuttle RSRM. Such test data will enable detailed validation of our integrated simulation results. Further, laboratory experiments on our own campus will be used in validating individual component models in propellant combustion and turbulence.
Center-NNSA/DP Interactions
Center personnel have traveled extensively and have been involved in a large number of
technical and informational meetings. These included meetings intended to explore rocket
science and technology, identify technical collaborators, describe the ASCI/ASAP program,
and establish relationships among Center investigators, DOE lab scientists, and industry
leaders. Individual CSAR senior investigators and technical staff have traveled to DOE DP
labs to serve on ASCI/ASAP panels, to participate in ASAP-wide workshops (e.g., materials
and computational environment), to offer research seminars and technical interaction, to receive training on the ACSI computational resources, and to discuss ASCI resource issues
with the CRT. We will continue to work closely with lab representatives on the TST to identify opportunities for detailed interaction. During the first five years of the program, we have
documented hundreds of CSAR-lab interactions in the following areas:
Ali Pinar, Lawrence Berkeley National Laboratory (PhD, Computer Science, 2002)
Thomas Hafenrichter, SNL (MS, Mechanical Engineering, 2002)
Michelle Duesterhaus, SNL (MS, Mechanical and Industrial Engineering, 2001)
Jason Hales, SNL (PhD, Civil and Environmental Engineering, 2001)
Jack Yoh, LLNL (PhD, Theoretical and Applied Mechanics, 2001)
Benjamin T. Chorpening, SNL-L (PhD, Mechanical Engineering, 2000)
Burkhard Militzer, LLNL (PhD, Physics, 2000)
Christopher D. Tomkins, LANL (PhD, Theoretical and Applied Mechanics, 2000)
Jeff J. Murphy, SNL-L (PhD, Mechanical Engineering, 1999)
Jin Yao, LLNL (PhD, Theoretical and Applied Mechanics, 1999)
Donald Siegel, SNL (PhD, Physics, 1999)
Steven F. Wojtkiewicz, SNL (PhD, Aero and Astro Engineering, 1999)
Boyana Norris, Argonne National Laboratory (PhD, Computer Science, 1999)
Giulia Galli, LLNL (PhD, Physics, 1998)
Arne Gullerud, SNL (PhD, Civil Engineering, 1998)
Michael Ham, LANL (MS, Computer Science, 1998)
Although CSAR has been the ASAP leader in placing former students and staff in
DOE/NNSA laboratory positions, we seek to improve our record in the coming years. An
important facet of lab placement is citizenship: it is often difficult to place non-U.S. citizens in NNSA research positions. To increase the likelihood of CSAR students and staff accepting positions at the labs, we will work diligently to increase the number and quality of
U.S. citizens in the program.
For several years we have encouraged faculty investigators to preferentially hire U.S.
graduate research assistants. In an unwritten policy, we have hired additional students for research projects when U.S. citizen students were identified, even if the initial project budget
was insufficient to cover the additional student. We will continue to extend this support to
encourage faculty to seek qualified domestic students and will attempt to broaden the extra
support to postdoctoral research associates, as well.
Undergraduate students at the University of Illinois are nearly all U.S. citizens (>98%)
and are predominantly Illinois residents (~85%). We propose rapidly growing our program of
using undergraduate students in our laboratories, and in particular, employing them to work
on meshing and laboratory experiment projects. We will make available funds to hire as
many undergraduate students as our faculty and research staff can identify.
Professor Michael T. Heath, CSAR Director, and the members of the Science Steering
Committee provide world-class leadership and focus for the Center for Simulation of Advanced Rockets. The Center is administratively housed within the Computational Science
and Engineering Program of the UIUC College of Engineering, reporting to the Dean of Engineering, David Daniel.
The Computational Science and Engineering Program is inherently interdisciplinary, requiring expertise in advanced computing technology, as well as in one or more applied disciplines. The purpose of the CSE Degree Option is a perfect complement to the academic goals of ASC/ASAP: to foster interdisciplinary, computationally oriented research among all fields of science and engineering, and to prepare students to work effectively in such an environment (Figure 5.1).
[Fig. 5.1: CSAR is one of two research centers in the UIUC Computational Science and Engineering Program; the CSE education program is a graduate student academic degree option. The chart shows the CSE education program (Computational Science & Engineering Option: 12 departments, 130 faculty associates, 10 graduate fellows, 80 graduate students enrolled) alongside two research centers: the Center for Simulation of Advanced Rockets (DOE funded, $20 million over 5 years, first contract ends in 2002, new 5-year contract, 38 faculty, 36 graduate students, 3 undergrads, 27 professional staff) and the Center for Process Simulation and Design (NSF and DARPA funded, $2.5 million over 3 years ending Sep 2001, $4 million over 5 years beginning Sep 2001, 12 faculty, 13 students & postdocs).]
The CSE Program does not independently admit students or confer graduate degrees; students wishing to elect the CSE Option must first be admitted to one of the participating departments before enrolling in the CSE Program. Similarly, all faculty members affiliated with CSE have regular faculty appointments in one of the participating departments.
Students electing the CSE Option become proficient in computing technology, including numerical computation and the practical use of advanced computer architectures, and in one or
more (traditional) applied disciplines. Such proficiency is gained, in part, through courses
that are specially designed to reduce the usual barriers to interdisciplinary work. Thesis research by CSE students is computationally oriented and actively advised by faculty members
from multiple departments.
Management
The Director and Science Steering Committee members are responsible for nurturing the research program, administering the Center, and maintaining and expanding relationships with
the DOE DP laboratories. This directorate provides the leadership necessary to ensure that
the Center identifies the most important research areas, attracts the most qualified researchers, and pursues and completes the work effectively over the long term. A small administrative staff works to execute Center activities (Figure 5.2). Each of the Research Groups has
Organization, Management, and Academic Program
[Fig. 5.2: CSAR management structure provides clear direction. Organization chart: the Chancellor, Provost, and Dean above the Director and Managing Director; an Internal Advisory Committee and an External Advisory Board (external advisors, TST Chair); the Science Steering Committee (technical program coordination, team collaboration, DOE DP liaison); a Tech Program Manager (daily operations, code coordination, financial management, programmer management, outreach, HW/SW support); administrative staff; and the Research Groups: Combustion and Energetic Materials, Fluid Dynamics, Structures and Materials, Computational Math and Geometry, and Computational Environment.]
The membership of the External Advisory Board (EAB) consists of individuals chosen from the DOE DP labs, industry, other governmental agencies, and other universities (Figure 5.3). The External Advisory Board reviews CSAR research studies, makes research recommendations, and provides expertise for translating research findings into practice. An active communications link has been established with the EAB. The Board annually assesses the progress of the Center in reports to the CSAR Director and the Dean of the College of Engineering.
[Fig. 5.3: External Advisory Board membership is drawn from the DOE DP labs, the rocket industry, other governmental agencies, and other universities.]
[Figure: CSAR research organization. Groups: Combustion and Energetic Materials; Fluid Dynamics; Structures and Materials; Computer Science. Teams: Combustion and Energetic Materials; Fluid Dynamics; Structures and Materials; Computational Environments; Computational Mathematics and Geometry; System Integration; Engineering Code Development; Software Integration Framework; Validation, Accidents, and Specification.]
Legacy
6.1
Code Legacy
The integrated code legacy of CSAR will be a scalable, plug-and-play simulation code and framework for performing coupled simulations that automatically adapts to changing topologies, plus a collection of validated state-of-the-art physics
modules that may be used to solve a wide variety of multiphysics problems. Further, the simulation code will enable
exploration of scientific and engineering issues in a broad array of complex fluid-structure interactions.
tion, the CSAR External Advisory Board members continue to be excited about the potential
for use of our integrated simulation tool to explore their proprietary rocket designs. The level
of interest in industry, combined with that of NASA, indicates that long-term development,
maintenance, and dissemination of the code are worthwhile.
6.2
Center Legacy
The research of the ASC/ASAP centers is expected to drive advances in critical computer
and computational science areas. The major goals of the Alliance Program were to:
Solve science and engineering problems of national importance through the use of largescale, multidisciplinary modeling and simulation.
Establish and validate large-scale modeling and simulation as a viable scientific methodology across scientific applications requiring both integration across disciplines and complex simulation sequences.
Enhance overall ASC efforts by engaging academic experts in computer science, computational mathematics, and simulations in science and engineering.
Leverage relevant research in the academic community, including basic science, highperformance computing systems, and computational environments.
Strengthen education and research in areas critical to the long-term success of ASC and
the Stockpile Stewardship Program.
Strengthen ties among the Defense Programs laboratories and participating U.S. universities.
CSAR has been remarkably successful in helping ASC fulfill these vital goals.
The Center has had a major impact on the University of Illinois in a variety of ways.
Above all, it has engendered an unprecedented level of collaboration across disciplines and
departments. Even within single disciplines, such as fluid dynamics or structural analysis,
faculty collaboration across departmental lines has been enhanced enormously. As a result,
the Center has become a model for other interdisciplinary, interdepartmental research initiatives. In addition, because of the broad applicability of the technologies it represents, CSAR
has also provided leverage to, and benefited greatly from, many other separately funded programs on our campus, both individual faculty research grants and other large centers such as
NCSA.
Research Initiatives and Computational Simulation
Several new College of Engineering federally-funded research initiatives have sprung from
CSAR, and the Center is viewed as a source of talent and expertise in interdisciplinary management. The NSF/DARPA Center for Process Simulation and Design was proposed and
funded in 1998 (three year grant, $2.8 million) to explore simulation in materials manufacturing and solidification processes. This past year, a five-year, $3 million extension to CPSD
was funded by NSF to support "Multiscale Models for Microstructure Simulation and Process Design."
A complementary center proposal was developed this past year and submitted to NASA
under the University Research, Engineering, and Technology Institutes program. This still-open proposal requested the establishment of the Space Access Institute at the University
of Illinois to explore technologies needed for third generation launch vehicles. CSAR investigators, students, and Rocflo-MP may be involved in the study of airframe computational
fluid dynamics and heat transfer for launch vehicles being designed for use in 2020 and beyond.
Together with new collaborators from our own campus and elsewhere, we are planning to
apply the codes developed at CSAR to entirely new problems, both within the aerospace field
(e.g., designing new inlet technologies for supersonic jets) and well beyond (e.g., designing
implantable ventricular assist devices for heart patients). The opportunities for CSAR legacies such as these can be expected to grow in the coming years.
The Center has enhanced the awareness on our campus of computational simulation, and
it has substantially increased the visibility and influence of our interdisciplinary Computational Science and Engineering (CSE) Program, which administratively houses the Center.
The computationally oriented, interdisciplinary educational program provided by CSE fits
perfectly with the needs of CSAR, and the students in this program are ideally trained to participate in the research activities of the Center. CSE courses are specially designed to lower
the usual barriers to interdisciplinary course work and enable students to master both applied
and computational disciplines.
Staff and Students
By hiring more than 50 new professional staff and postdoctoral associates during the first
five years of the program, the Center has significantly enlarged the local technical talent
pool, providing a whole new set of collaborators for existing faculty and staff. The Center
has also hosted a number of visitors, both long-term and short-term, and has organized a very
popular seminar series that was designed specifically to reach out across disciplinary boundaries to enhance collaboration. All of these will continue.
The Center spans ten UIUC units, and its recognition and influence are pervasive
throughout the College of Engineering and beyond. We work very closely with NCSA,
which contributes both research personnel and computer time toward our effort. Several key
members of our research team are also research scientists at NCSA. It has been especially
convenient to do initial code development locally on parallel systems in CSE and NCSA before full implementation on the remote ASC platforms.
Another major impact of the Center has been on graduate education and training. CSAR
plays a major role in educating a new generation of scientists and engineers prepared to work
in computational simulation of complex systems by supporting more than forty graduate students at any given time. By virtue of this experience, the students we train are already attuned
to the needs of interdisciplinary collaboration. The level of involvement by undergraduate
students has been growing and will continue to expand.
Legacy
6.3
Addendum
Our strategy for efficient use of additional funds includes supporting both new and expanded
efforts. We propose establishing three new research projects (the first two of which are on
our critical path and hence will accelerate the completion of the integrated code), reestablishing the research support for our two off-campus subcontractors that was cut due to the
budget reduction between Y5 and Y6, and expanding the effort of three continuing projects.
A.1
Research Subcontractors
cracks in the solid propellant. The 3-D mesh repair capability will be interfaced with Rocflu
to enable more realistic and more challenging SRM simulations. For the moving inhibitor
problem, the deformation of the boundary surface is sufficiently mild that it is possible to
maintain the connectivity of the surface triangulation throughout the simulation. For the
crack propagation problem, however, there are significant changes in the extent and shape of
the boundary surface. New techniques must be developed to modify and repair a boundary
surface that is undergoing substantial change.
A.3
Expanded Efforts