
U.S. Department of Energy
National Nuclear Security Administration
Advanced Simulation and Computing
Academic Strategic Alliances Program

Center for Simulation of Advanced Rockets

Cover image: Fully coupled, 3-D simulation of "slumping" propellant in Titan IV rocket motor showing velocity vectors in core fluid and total stress in solid propellant.

Y6-10 Statement of Work


2262 Digital Computer Laboratory
1304 West Springfield Avenue
College of Engineering
University of Illinois at Urbana-Champaign
Urbana, Illinois 61801
May 2002

Table of Contents

1. Table of Contents
2. Executive Summary
3. Scientific and Engineering Advances
   3.1 Simulation Development
   3.2 Simulation Demonstrations
   3.3 Structures and Materials
   3.4 Fluid Dynamics
   3.5 Combustion and Energetic Materials
   3.6 Computer Science
       Computational Mathematics and Geometry
       Computational Environment
       Visualization
   3.7 Verification and Validation
4. Outreach and Interaction with NNSA/DP Laboratories
5. Organization, Management, and Academic Program
6. Legacy
   6.1 Code Legacy
       Software Engineering
   6.2 Center Legacy
A. Addendum: Additional Research Projects

2. Executive Summary

The goal of CSAR is the detailed, whole-system simulation of solid propellant rockets from
first principles under both normal and abnormal operating conditions. The design of solid
propellant rockets is a sophisticated technological problem requiring expertise in diverse
subdisciplines, including the ignition and combustion of composite energetic materials; the
solid mechanics of the propellant, case, insulation, and nozzle; the fluid dynamics of the interior flow and exhaust plume; the aging and damage of components; and the analysis of various potential failure modes. These problems are characterized by very high energy densities,
extremely diverse length and time scales, complex interfaces, and reactive, turbulent, and
multiphase flows.
The scientific and technological needs of the U. S. Department of Energy posed by the
Accelerated Strategic Computing Initiative/Academic Strategic Alliances Program
(ASCI/ASAP) encouraged the University of Illinois at Urbana-Champaign (UIUC) to establish the Center for Simulation of Advanced Rockets (CSAR) in September 1997. The outstanding quality of the faculty and staff, facilities, and research infrastructure offered by
UIUC have enabled a unique partnership among university researchers and the DOE/NNSA
Defense Program laboratories to advance the state of the art in computational simulation of
complex systems. State, regional, and university resources are also supporting the program,
and an experienced research team is fulfilling the mission of the Center.
CSAR is focusing on the reusable solid rocket motor (RSRM) of the NASA Space Transportation System, better known as the Space Shuttle, as its long-term simulation vehicle. The
RSRM is a well-established, globally recognized commercial rocket, and, most importantly, design data and propellant configurations are available. Various smaller scale rockets are also
simulated to provide validation data for CSAR codes. Simulations that include full geometric
and materials complexity require a sequence of incremental developments, in engineering science, computer science, and systems integration, over an extended period of time. From the outset, our emphasis has been on system integration rather than separate threads of development that eventually come together at some point in the future. Rapid exploration of critical system integration issues demanded the use of simplified, but fully integrated, models
and interfaces initially, followed by successively refined models and interfaces as experience
was gained. CSAR staff have designed and implemented a fully integrated code that includes
characterization of various burn scenarios and the onset of potential component failures. Refined multiscale component models and advanced system integration concepts based on lessons learned from this effort constitute the key features in our proposed research. Use of the
simulation code to explore scientific and engineering issues in complex fluid-structure interactions is a major focus for the new program.
More than 100 UIUC faculty, students, and researchers contribute to the success of the
Center. An External Advisory Board provides critical guidance in rocket simulation and
computational science. The DOE-supplied budget has been sufficient to maintain an aggressive research program. In addition, the University of Illinois has provided funds for ancillary
research expenditures, computer workstations, and facility renovation. Center personnel have
traveled widely to explore rocket science and technology, identify technical collaborators,
describe the ASCI/ASAP program, and establish relationships among Center investigators,
DOE/NNSA DP scientists, and industry leaders.


3. Scientific and Engineering Advances

3.1 Simulation Development

The central goal of CSAR is the detailed, whole-system simulation of solid propellant rockets
from first principles under both normal and abnormal operating conditions. Full simulations
(Figure 3.1) of such complexity require a sequence of incremental developments, in engineering science, computer science, and systems integration, over an
extended period of time. From the
outset, however, our emphasis has
been on system integration rather
than separate threads of development
that eventually come together at some
point in the future. Rapid exploration
of critical system integration issues
has demanded the use of simplified, but fully integrated, models
and interfaces initially, followed by
successively refined models and interfaces as experience is gained (Figure 3.2).
Simulation Roadmap
Fig. 3.1: Current 3-D fully coupled code includes structural dynamics, combustion, and fluid dynamics simulation modules and ignition model. Images show interaction of solid propellant, combustion layer, and fluid flow following ignition of RSRM (clockwise from upper left: 5, 10, 15, and 20 ms). Colors in solid propellant depict local stress, colored arrows in fluid represent flow direction and speed, and colored isosurfaces in fluid show temperature distribution.

The CSAR Simulation Roadmap (Figure 3.3) depicts the evolution of increasingly sophisticated computational models for the primary rocket components and their interactions. Our initial implementation of an integrated simulation code (GEN1), which was operational at the end of 2000, provided a simplified characterization of various burn scenarios. The GEN1 code employed macroscopic models for the
separate components to enable a strong focus on the definition and resolution of system integration issues. Refined, multiscale component models and advanced system integration concepts, based on lessons learned from GEN1, constitute the key features in the second-generation (GEN2) code, developed during Years 4 and 5 and beyond. The refined models
also reflect the synthesis of fundamental, subscale studies (bottom right side of Figure 3.3)
that are critical for detailed simulations of accident scenarios and for reliable simulation of
multiscale phenomena such as combustion and turbulence. The code numbers in the diagram
indicate dependence of the refined and accident models on the subscale simulations.
The Roadmap indicates the close coupling among the components; physical quantities
such as temperature (T), mass flow rate (ṁ), pressure (p), heat flux (q), concentrations (c_i), and geometry must be exchanged between the solid rocket booster (SRB) component models. The computer science integration efforts define the framework for these interconnections and, consequently, their
eventual impact on overall code performance. In the right-center box on the diagram, computer science research and development activities are shown that support the SRB simulation through the implementation and optimization of the
component models and subscale simulations, the integration of component models, and the computational infrastructure
required to do large-scale parallel computation.

Fig. 3.2: CSAR follows a staged approach to system integration, advancing along axes of geometrical complexity (1-D, 2-D, and 3-D; joints, star grain, accidents) and physical complexity (weakly coupled, fully coupled, detailed) from GEN0 through the GEN1 and GEN2 code families.

Finally, the central placement of validation efforts in the diagram highlights the priority
assigned to this activity. Each subscale, component, and integrated simulation must be validated against existing analytical, numerical, and experimental data available in the open literature or obtained from NASA, DOD research agencies, the U.S. rocket industry, or from experiments in laboratories on our own campus.
Fig. 3.3: CSAR Roadmap showing completed tasks (dark boxes) and planned activities for Y6-10.

System integration involves two major tasks to ensure the physical, mathematical, geometric, numerical, and software compatibility of the component models and the codes implementing them. The first task is providing information transfer across component boundaries. Boundary conditions for the component models must be compatible mathematically
(e.g., an outflow from one component becomes an inflow for a neighboring component). The
discretizations of neighboring components must fit together geometrically. Different spatial
resolutions and discretization methodologies must be reconciled via interpolation where necessary. These issues have been addressed by developing innovative algorithms for mesh association and common refinement of interface meshes, together with new data transfer methods that are both physically conservative and highly accurate numerically.
The other major task is temporal coupling of the components so that the whole system is
evolved in a self-consistent manner. Different components require different time step sizes
due to the choice(s) of algorithm(s) (e.g., explicit vs. implicit methods), spatial resolution,
and/or the physics of the subproblem that the module solves. The computational cost of
forcing each module to take a time step determined by the module requiring the shortest step
is often prohibitive. In a broad, important research area that has come to be known as time
zooming, we continue to investigate multiple strategies for coupling modules requiring different time step sizes while maintaining the accuracy of the overall simulation.
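
As a schematic of one such strategy (a deliberately simplified sketch, not the Center's time-zooming algorithms), the toy problem below subcycles a fast module inside each time step of a slow module, exchanging coupling data only at the slow step:

    # Two coupled scalar "modules": a fast one (fluid-like, small dt) and a
    # slow one (structure-like, large dt). The fast module subcycles within
    # each coupling window while the slow interface state is held frozen.
    def coupled_subcycling(T=1.0, dt_slow=0.01, nsub=10):
        dt_fast = dt_slow / nsub
        f, s = 1.0, 0.0                # fast and slow state variables
        t = 0.0
        while t < T - 1e-12:
            s_frozen = s               # interface data seen by the fast module
            for _ in range(nsub):      # fast module subcycles
                f += dt_fast * (-50.0 * f + 10.0 * s_frozen)
            s += dt_slow * (1.0 - s + 0.1 * f)  # slow module: one large step
            t += dt_slow
        return f, s

    print(coupled_subcycling())

The accuracy question studied under time zooming is precisely how stale frozen coupling data like s_frozen may become before the overall simulation degrades.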
Our approach to system integration has been to develop a single executable code containing modules for the various components and the interface code for tying them together.
We are following an object-oriented design methodology that hides the data structures and
other internal details of the individual component codes. This simplifies development and
maintenance of the interface code and the component codes, and also makes it easier to swap
different versions of the same component, a critical capability for determining the most efficient algorithms and implementations.
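
A minimal sketch of this design idea, with hypothetical class names standing in for the actual component codes:

    from abc import ABC, abstractmethod

    class FluidSolver(ABC):
        """Interface seen by the coupling driver; internal data stay hidden."""
        @abstractmethod
        def advance(self, dt): ...
        @abstractmethod
        def interface_traction(self): ...

    class StructuredSolver(FluidSolver):    # a Rocflo-like module, say
        def __init__(self): self._p = 1.0
        def advance(self, dt): self._p *= (1.0 - 0.1 * dt)
        def interface_traction(self): return self._p

    class UnstructuredSolver(FluidSolver):  # a Rocflu-like module, say
        def __init__(self): self._p = 1.0
        def advance(self, dt): self._p -= 0.1 * dt
        def interface_traction(self): return self._p

    # The driver is written once against the interface; swapping solver
    # implementations requires changing only the constructor call.
    for solver in (StructuredSolver(), UnstructuredSolver()):
        for _ in range(10):
            solver.advance(0.01)
        print(type(solver).__name__, solver.interface_traction())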
3.2 Simulation Demonstrations

Team Coordinators: Mark D. Brandyberry and Robert A. Fiedler


Research Programmer: Johnny C. Norris
Overview
Simulating solid propellant rocket motors requires solving an extremely complex, fully coupled, multidisciplinary set of physics problems that span a wide range of length and time
scales. As noted above, we have taken a staged approach in developing an integrated whole-system rocket simulation package, beginning with relatively simple physical models, geometries, and coupling in our first-generation code (GEN1), and progressing toward more detailed physics, geometries, and interactions in our second-generation code (GEN2).
Integrated Rocket Simulations
We propose a variety of challenging fully-integrated, 3-D simulations to exercise our GENX
code over the next few years. They are a collection of validation studies that include normal
and abnormal operating conditions, geometries supplied from the open literature, NASA,
DOD, and our U.S. rocket industry partners, and experimental- and production-model rocket
motors. These target problems comprise the annual integrated simulation goals required in
the Statement of Work. Approximate completion dates are included, although implementation of improved physical models (turbulence, aluminum droplets, smoke, radiation, plume
chemistry; constitutive relations for the propellant, case, and insulation; heat transfer, ignition, and burn rate, etc.) may warrant revisiting a given problem to demonstrate better
agreement with experimental results. This preliminary list of problems includes Titan IV
propellant slumping, flexible inhibitor, aerodynamic loads (rocket in a wind tunnel), detailed
full burn of various small rockets supplied by the U.S. rocket industry, RSRM normal burn,
flow past O-rings, and Titan case rupture.
Integrated Rocket Simulations

Titan IV propellant slumping - This real-life solid propellant booster accident is described in the
open literature. Published documentation includes detailed pressure sensor and hoop strain gauge
measurements, as well as 2-D simulation results for comparison with our 3-D simulation results. The
current GENX code will soon be able to reproduce the conditions that lead to rupture of the case.
This will require a 10 to 15-day run on 512 processors for a coarse mesh study. The amount of
slumping that can be followed is limited by fluids mesh motion. December 2002

Flexible bore inhibitor - The current GENX code should be able to simulate a rocket with a flexing
inhibitor provided the deformation is not too great, perhaps a maximum deflection angle of ~45 degrees. The maximum deformation of an inhibitor in a full rocket model might be less than it is in our
model of just a section, unless we carefully choose time-dependent boundary conditions. We have
already generated a preliminary fluids mesh. December 2002

Aerodynamic loads on the case (for lab scale rockets) - This topic is of particular interest to the
rocket industry, and the current GENX code could easily be extended to include the air in a region
outside the rocket. The fluids domain would include a virtual wind tunnel outside the case, with a
time-dependent inflow speed determined by the thrust history of the rocket. The non-rigid case
would have a fluid-structure boundary on the outside, as well as the usual loads from the internal gas
pressure. Most of the work to be done would involve setting up appropriate boundary conditions and
meshing the initial geometry. June 2003

Lab scale rockets from ignition to burn-out - We already can reproduce the measured pressure
overshoot, but the pressure rise is much slower in the experimental data than in our simulations using the current ignition model (Rocburn v2.2). The CEM and Fluids teams will collaborate to develop an improved model of the heat flux from the gas to the propellant. To simulate propellant
burn-out, we need at least a rudimentary remeshing capability to handle the gradually changing topology. We expect that the unstructured fluids and structures solvers (Rocflu and Rocfrac) along
with the serial mesh repair code will be able to address this problem in a time frame of about a year.
June 2003

ARC Rockets - Atlantic Research Corporation (ARC) has agreed to supply us with detailed design
and experimental data for small guidance rockets, both as a validation test for our code and to assist
them with improving their designs. The most difficult aspects of simulating these small motors are accurately modeling ignition transients and adapting the meshes automatically as the propellant burns
away completely. There are flexible baffles in some of these motors that may divide and reunite the
fluid domain as the system evolves, and cracks can form in the propellant and other components. An
advanced 3-D remeshing capability is required for such topology changes. December 2003

RSRM Challenger accident (simulate flow past O-rings) - This problem requires a structures
solver that handles contact, so perhaps this is an item for the addendum on what could be done if we
are given additional funding. The fluids domain topology changes when the gas begins to pass
around the O-ring, which requires advanced remeshing capabilities. June 2004

RSRM complete normal burn - The ignition transients for the Space Shuttle booster are well characterized in the open literature, and we have access to extensive test data. We hope to reproduce the
waterfall plot (amplitude of vibration as a function of frequency) at various stages during firing.
The most difficult aspect of simulating the entire history of a large motor is reducing the run time. For a fluids mesh that is fine enough to allow accurate turbulence modeling, for example, time
zooming techniques under consideration by the Fluids team will be required to reach 120 seconds of
physical problem time. December 2004

Titan IV case rupture accident - In this simulation, the pressure builds up until the case fails. The
failure begins as a crack that propagates all the way through the case. The fluid domain would include the air outside the case, so hot gas would begin to leak from the crack in the case into the surrounding air. The case would tear open, relieving the pressure. In the test firing, the rocket broke up
violently into many pieces, destroying the test stand, but there was no detonation. By the time we
implement the required remeshing capability, we may also have an advanced material model for the
propellant that includes the effect of voids and dewetting. June 2005

3.3 Structures and Materials

Group Leaders: Philippe H. Geubelle and Keith D. Hjelmstad


Faculty Investigators: Robert S. Averback, Armand J. Beaudoin, Jeff G. Erickson, Philippe
H. Geubelle, Michael J. Garland, Robert B. Haber, Yonggang Huang, Keith D. Hjelmstad,
Petros Sofronis, and Daniel A. Tortorelli
Postdoctoral Research Associate: Yinon Ashkenazy
Research Programmers: M. Scot Breitenfeld and Changyu Hwang
Overview
The Structures and Materials group is responsible for the analysis of the solid parts of the
rocket: the rocket case, unburned solid fuel, insulation and inhibitors, case segment joints,
etc. The activities of this group divide into two broad areas: failure modeling and constitutive
behavior of components and materials, and integrated system simulation.
Failure Modeling and Constitutive Behavior

Fig. 3.4: Structures and Materials Technical Roadmap provides explicit (Rocfrac) and implicit (Rocsolid) simulation schemes and an array of materials modeling methods for case and solid propellant modeling: macroscale continuum (MTS, Hill's potential), mesoscale continuum (multi-grain, particulate composite), and atomic scale (dislocations at grain boundaries, grain/binder interface; atomistic simulations).

Mesoscale Modeling of Constitutive and Failure Response of SP and Case - A. Beaudoin, E. de Sturler, P. Geubelle, and Y. Huang

The continued development and implementation of physically-based codes needed to simulate 3-D constitutive and failure behavior of solid propellant
(SP) is proposed. Three phenomena taking place in the vicinity of a crack front propagating
in SP will be captured: dewetting of the larger AP particles, progressive growth of the voids,
and coalescence of voids through tearing of the matrix. The simulation effort will be performed at the mesoscale and will account for the discrete microstructure of SP. The analysis will involve the combination of a cohesive finite element scheme to capture the spontaneous
dewetting of the particles; the virtual internal bond (VIB) method to simulate the tearing of
the matrix; and packing algorithms to create realistic microstructures. The analysis will be
performed under both quasi-static and dynamic conditions, and will rely on a proposed novel
iterative solver that takes advantage of the slowly evolving nature of the secant or tangent
stiffness matrix as cohesive and/or VIB elements progressively fail. This project involves
close collaboration with research groups at Los Alamos and Sandia National Laboratories.
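
For orientation, the sketch below implements a generic bilinear traction-separation (cohesive) law of the kind used in cohesive finite element schemes; the parameters are illustrative placeholders, not propellant data:

    import numpy as np

    def bilinear_cohesive_traction(delta, delta_c=1e-3, delta_f=1e-2, t_max=1.0):
        """Bilinear traction-separation law: traction rises linearly to t_max
        at opening delta_c, then softens linearly to zero at delta_f, where
        the interface (e.g., a particle/binder bond) is fully dewetted."""
        delta = np.asarray(delta, dtype=float)
        loading = t_max * delta / delta_c
        softening = t_max * np.clip((delta_f - delta) / (delta_f - delta_c), 0.0, None)
        traction = np.where(delta <= delta_c, loading, softening)
        damage = np.clip((delta - delta_c) / (delta_f - delta_c), 0.0, 1.0)
        return traction, damage

    for d in (0.0, 5e-4, 1e-3, 5e-3, 1e-2, 1.2e-2):
        t, dmg = bilinear_cohesive_traction(d)
        print(f"opening {d:.1e}: traction {float(t):.3f}, damage {float(dmg):.2f}")
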
Micromechanics Approach to Study of Damage and Constitutive Response - P. Sofronis
Finite element methodology and principles of mechanics of materials are used to study and
understand issues governing the mechanical damage response of a solid propellant under
both static and dynamic loading. We will address macroscopic propellant nonlinearity resulting from the interaction between the micromechanisms of damage at the microscale. The
derived constitutive laws will be used to investigate the phenomena of shear localization and
hot spot generation under dynamic loading. These phenomena take place spatially at both
nano (interface cohesive laws) and micron-scale (porosity, localized-band width), and may
evolve in time over picoseconds (initiation of dewetting) or a few seconds (void elongation).
The importance of atomistic simulations in assessing specific material-parameter inputs to
the micromechanical models will be explored. This project is part of an ongoing research
collaboration on void growth with the Lawrence Livermore National Laboratory.
Atomistic Calculations of Debonding, Plastic Behavior and Fracture - R. Averback
This project provides convenient interfacing between atomistic simulations and continuum
modeling. In achieving these goals, we will develop analytic potentials that describe interatomic interactions in and between different classes of materials: metals, ceramics, and polymers. Efforts will focus on developing appropriate reactive potentials for polymer/metal and
polymer/metal-oxide interfaces and on developing a scheme to combine molecular dynamics
(MD) with kinetic Monte Carlo (KMC) methods for alloy systems. Atomistic simulations are
then employed to calculate the mechanical response of multiphase materials subjected to high
strain conditions at various temperatures and to develop constitutive laws near interfaces.
These results will provide input data for other CSAR continuum models.
Structural System Simulation
Adaptive Version of Rocfrac - P. Geubelle and S. Breitenfeld
Developed over the past three years, Rocfrac is a structural solver used in the GEN2 integrated code development effort. The code is based on an explicit time stepping scheme and
relies on a 3-D Arbitrary Lagrangian Eulerian formulation to capture the regression of the
solid propellant. It includes a nonlinear kinematic description to account for possible large deformations and/or rotations, and contains a variety of volumetric responses, ranging from linear elasticity to the Arruda-Boyce nonlinearly elastic model used for the grain response and
to a specially developed elasto-viscoplastic model for the case. The Rocfrac element library
currently includes 4- and 10-node tetrahedral volumetric elements used to model the bulk
response of the rocket's structural components, and 6- and 12-node cohesive elements used
to capture the initiation and rapid propagation of one or more cracks in the grain and along
the grain/case interface. We will continue the development and integration of this structural
solver, and, in particular, address the following aspects:


- Develop and implement the next generation of Rocfrac, with special emphasis on the introduction of improved low-order tetrahedral elements and on dynamic mesh adaptivity

- Complete the full integration of Rocfrac within the Charm++/FE (SWIFT) Framework

- Apply the integrated unstructured rocket code to various dynamic fracture problems, with
special emphasis on the Titan IV grain slumping accident
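
For reference, the following sketch shows the explicit central-difference (leapfrog) update that underlies explicit structural solvers of this type, reduced to a 1-D chain of linear springs; it illustrates the scheme and its stability-limited time step, not Rocfrac itself:

    import numpy as np

    # Minimal 1-D chain of linear springs advanced with the explicit
    # central-difference (leapfrog) scheme: M*a = f_ext - f_int(u).
    n = 20
    k, m, dt = 1.0e4, 1.0e-3, 1.0e-4  # stiffness, nodal mass, time step
    # Explicit stability requires dt below ~sqrt(m/k) for this chain.
    u, v = np.zeros(n), np.zeros(n)
    f_ext = np.zeros(n)
    f_ext[-1] = 1.0                   # constant end load

    def f_int(u):
        # Spring forces; the left end is fixed (u = 0 at the wall).
        stretch = np.diff(np.concatenate(([0.0], u)))
        force = k * stretch
        fi = force.copy()             # spring behind each node pulls back
        fi[:-1] -= force[1:]          # spring ahead of each node pulls forward
        return fi

    for _ in range(5000):
        a = (f_ext - f_int(u)) / m
        v += dt * a                   # velocity update (kick)
        u += dt * v                   # displacement update (drift)

    # Undamped, so the tip oscillates about the static answer n*F/k = 2e-3.
    print("tip displacement:", u[-1])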

Parallel Implicit Finite Element Computations - K. Hjelmstad and D. Tortorelli


The unified, implicit, parallel, finite element program development effort will continue. Implicit time integration methods offer a significant advantage over explicit methods in that the
size of the analysis time step is not limited by numerical stability. Future simulations able to
model more than the first few seconds of a complex burn like the RSRM may demand implicit computations. Further, the methods being developed for time zooming rely on implicit computational methods. This work will be carried out in collaboration with the Computer Science Group's efforts on a finite element framework and numerical solvers.
Convective Burning of Cracks - P. Geubelle, C. Hwang, L. Massa, and B. Asay (LANL)
Over the past three years, a 2-D fully-integrated fluid/structure code (referred to hereafter as
GEN2D) has been developed as a 2-D version of GEN2. GEN2D has played an important
role in the success of its 3-D counterpart (GEN2) as it has allowed us to test the explicit
fluid/structure coupling algorithm that is at the heart of the integrated code. The 2-D code has
been used over the past two years to test the ability of the coupling algorithm to capture aeroelastic flutter and to conserve mass across a regressing boundary. The existence of this 2-D
code is a key reason why the development and implementation of GEN2 was achieved so
rapidly and successfully. GEN2D is expected to play a key role over the next five years, as
new coupling algorithms, physical models and meshing strategies are developed for the integrated code.
This project has two facets. First, we will continue an ongoing investigation of the multiphysics problem of convective burning of cracks in energetic materials in collaboration with
the CEM group and the LANL Combustion group (led by B. Asay). This study will be performed both at the macroscale, where the discrete nature of the SP microstructure is neglected and homogenized constitutive and combustion properties are assumed, and at the
mesoscale, where the structural solver will be combined with the 2-D version of Rocfire to
simulate the process of crack propagation in a burning solid propellant. Second, GEN2D will
be used to test new coupling strategies before these are used in the 3-D code. Special emphasis will be placed here on problems requiring adaptivity of the interface code (Rocface), i.e.,
those for which important geometrical changes take place along the interface due to substantial motion of the interface. Separation of a piece of solid propellant from the rest of the rocket core is one class of problems to be investigated in which a new internal boundary is created spontaneously.
Spacetime Discontinuous Galerkin Methods for Coupled Simulations - R. Haber, J. Erickson, and M. Garland
Spacetime discontinuous Galerkin (STDG) finite element methods have emerged as a powerful new class of numerical methods for solving hyperbolic problems and nonlinear conservation laws. Research in CSAR and its sister center, the NSF Center for Process Simulation and
Design (CPSD), has moved spacetime methods from blue-sky concepts to a viable simulation technique positioned to be uniquely responsive to CSAR's simulation requirements. Inherent O(N) computational complexity, rich parallel structure, support of fully unstructured
spacetime grids, strong element-wise conservation properties, intrinsic high-order stability
and natural shock-capturing properties make STDG methods an ideal vehicle for high-resolution simulations. Parallel, hp-adaptive STDG implementations promise to significantly
extend our capabilities for long-duration and multi-scale rocket simulations. We propose
continuing research on STDG methods, leveraged by a parallel effort in CPSD, to realize and
demonstrate these advantages. This program will involve new research on STDG formulations, spacetime mesh generation, visualization of spacetime data sets, and parallel implementations. The research involves mesh motion and adaptive grids, time zooming, and multiscale modeling of propellants.
3.4 Fluid Dynamics

Group Leaders: Robert D. Moser and S. Balachandar


Faculty: Ronald J. Adrian, Hassan Aref, S. Balachandar, Jonathan Freund, and Robert D.
Moser
Research Scientists: Jiri Blazek, James P. Ferry, Andreas C. Haselbacher, Fady M. Najjar,
and Bono Wasistho
Overview

Fig. 3.5: Fluid Dynamics Group technical roadmap provides supporting fundamental research for integrated code components. The Fluids Multi-Physics Framework comprises 3-D unstructured development (Rocflu) and 3-D structured development (Rocflo); physics modules for LES turbulence modeling (Rocturb), Al droplet modeling (Rocpart), Al oxide modeling (Rocsmoke), radiation modeling (Rocrad), and chemistry modeling (Rocspecies); and supporting research on optimal LES, multiphase flow, and injection-driven flow.

The Fluid Dynamics Group addresses system-scale SRM multiphase compressible core flow code development, as well as subscale model development relevant to the turbulent dynamics of the combustion interface. These include injection, dispersion, and combustion of aluminum droplets in the core flow; formation, dispersion, and
slag accumulation of aluminum oxide particles, and flow within cracks and other defects in
the propellant. Broadly viewed, Fluids Group research includes multiphysics code module
development (Rocflo and Rocflu), individual physics modules (Rocturb, Rocpart, Rocsmoke,
Rocrad, and Rocspecies), and fundamental research projects needed to support simulation
code development (Figure 3.5).

Multiphysics Fluid Dynamics Code Development


SRM boosters contain many geometrically complex features, such as the star grain region
and the submerged nozzle. Moreover, additional geometrically complex regions may develop
because of large time-dependent deformations during burning of the propellant. Examples
include cracks in the solid propellant, slumping propellant segments, and even periodic motions of inhibitors left partially exposed by propellant burn back. To simulate fluid-dynamic

Scientific and Engineering Advances

3.8

processes in complex geometries, CSAR is developing an unstructured-grid code called


Rocflu. The simpler regions of the rocket geometry will be modeled with the block-structured
Rocflo code. Simulation of one of our key validation studies, the Titan IV slumping propellant accident, may depend on the interaction among Rocflu, Rocflo, and Rocfrac (Figure 3.6).
Rocflu - A. Haselbacher

Rocflu addresses fluid flow in complex and evolving geometric regions. It uses unstructured grids composed of arbitrary combinations of tetrahedra, hexahedra, prisms, and pyramids. The development of Rocflu is tightly coupled to that of the block-structured code Rocflo to maximize code reuse through a framework of common data structures, variable definitions, naming conventions, and coding standards. Rocflu development efforts will concentrate on issues relating to grid motion and repair, the incorporation of optimal LES stencils, and the integration of the physical modeling modules for multiphysics simulations. Research related to the code development effort will focus on the computation of stencil weights for optimal LES on unstructured grids, and the continued development of commuting filter operators for unstructured grids.

Fig. 3.6: Schematic interaction among Rocflu, Rocflo, and Rocfrac for the case of a slumping propellant segment. Note that interface surface grids between Rocflu and Rocflo need not be matching.
Rocflo-MP - J. Blazek
A high fidelity simulation of a solid rocket motor based on first principles requires an accurate, fast and numerically robust fluids code. We will continue the development of our structured, multi-block flow solver Rocflo-MP for multi-physics simulations of flows inside and
outside of solid rocket motors. It is also proposed to conduct research in numerical methods
in order to enable detailed and accurate simulations of the complete burn of a solid rocket
motor and of possible failure scenarios. Our foci include: significantly reducing the computational time of a simulation while retaining temporal accuracy; developing a robust mesh movement/regeneration methodology; increasing the spatial resolution to allow for reliable
Large Eddy Simulations (LES) inside the full motor; developing numerically robust schemes
for the treatment of stiff source terms arising from chemical species, radiation, burning particles and smoke; and including the effects of gravity, acceleration and aerodynamic loads on
the case.
Individual Fluids Code Module Development
Rocturb - R. Moser, F. Najjar, and B. Wasistho
A substantial fraction of the gaseous flow in a solid propellant rocket chamber is turbulent,
and this turbulence affects many of the physical processes occurring in an SRM. It is thus of great importance to model the turbulence in the core flow. The primary turbulence modeling
paradigm that has been selected for this application is Large Eddy Simulation (LES). LES
has the advantage of resolving the large-scale turbulence fluctuations, which are needed in
modeling the interactions of turbulence with other phenomena (e.g. particle dispersion, interaction with the combustion layer, shedding from inhibitors). Earlier simulations of simple
rocket motors were performed with simple no-model LES, in which artificial dissipation is
used to regularize the fluid flow equations. However, such a numerical treatment lacks a rigorous physical basis. It fails, for instance, to adequately predict transition without adjusting
the level of artificial dissipation in an ad hoc manner. The main purpose of the Rocturb module
is therefore to provide physically based models for simulating turbulent flow inside a rocket
motor.
In its development and improvement, Rocturb relies on four areas of basic research. First is research on the application of turbulence models within the solid rocket simulation framework itself. This research is directly related to the turbulence phenomena that occur in the
rocket flow and its simulation. Results of these studies can, thus, be directly implemented in
the Rocturb module. Second is research on a new class of turbulence models called optimal
LES, in which LES models are developed by directly approximating the best possible deterministic LES model. The first use of optimal LES in an applications fluids code will be in
Rocfluid through the Rocturb module. Third is research on the physics and modeling of the
turbulence fluctuations that are generated by the combustion layer and its interaction with the
core flow. This poorly understood process sets the boundary conditions for the turbulence in
the core flow. Finally, there is research on the LES of multiphase flows. The particles in the
rocket are subgrid in size. They therefore influence and are dispersed by the subgrid flow
fields. This interaction must be modeled, and represents a coupling between Rocturb,
Rocpart and Rocsmoke models.
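
For concreteness, the sketch below evaluates the classical Smagorinsky closure, the simplest physically based subgrid model, on a periodic 2-D grid; it is shown purely for illustration, since optimal LES replaces this kind of algebraic ansatz with statistically derived model stencils:

    import numpy as np

    def smagorinsky_nu_t(u, v, dx, Cs=0.17):
        """Smagorinsky subgrid viscosity nu_t = (Cs*Delta)^2 * |S| on a
        periodic 2-D grid, with |S| = sqrt(2 S_ij S_ij)."""
        dudx = (np.roll(u, -1, 0) - np.roll(u, 1, 0)) / (2 * dx)
        dudy = (np.roll(u, -1, 1) - np.roll(u, 1, 1)) / (2 * dx)
        dvdx = (np.roll(v, -1, 0) - np.roll(v, 1, 0)) / (2 * dx)
        dvdy = (np.roll(v, -1, 1) - np.roll(v, 1, 1)) / (2 * dx)
        S11, S22 = dudx, dvdy
        S12 = 0.5 * (dudy + dvdx)
        Smag = np.sqrt(2 * (S11**2 + S22**2 + 2 * S12**2))
        return (Cs * dx) ** 2 * Smag

    n = 64; dx = 2 * np.pi / n
    x, y = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx, indexing="ij")
    u, v = np.cos(x) * np.sin(y), -np.sin(x) * np.cos(y)  # Taylor-Green vortex
    print("max subgrid viscosity:", smagorinsky_nu_t(u, v, dx).max())
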
Rocpart - F. Najjar, S. Balachandar, and J. Ferry
Modern solid-propellant rocket motors have aluminum particles added to the mix, and their inclusion contributes up to 30% of the heat generation. Within the present multi-physics
framework, these Al droplets and the large oxide caps are taken into account through
Rocpart. Several enhancements to the present Rocpart are proposed: incorporation of improved force law, heat and mass transfer correlations, and burn-rate models to better account
for flow conditions within the rocket, improved injection model, modeling of oxide smoke
entrapment resulting in slag accumulation in the submerged nozzle, modeling of the effect of
subgrid turbulence on droplet evolution, improved Lagrangian-Eulerian coupling, and thorough verification and validation. These enhancements require fundamental investigation and
in particular we propose continuing our very productive study of an isolated particle in complex ambient flow.
Rocsmoke and Rocspecies - J. Ferry, S. Balachandar, and F. Najjar
Aluminum oxide smoke is an important by-product of combustion in modern aluminized
propellants and it impacts the flow physics inside the rocket motor in several critical ways.
Smoke modulates the radiative transfer of energy, it gets trapped in the submerged nozzle as
slag, and it can represent over 20% of the mass inside a motor. This mass exhibits variations
of concentration that are far more complex than the variations in gas density. Therefore we
propose to continue our fundamental multiphase flow research as a means to assess, develop, and improve continuum models of multiphase flow. Specific areas of focus are the continued
development of the Equilibrium Eulerian method; internal boundary conditions (allowing
different methods to be used in adjacent regions); compressibility effects (interaction of
smoke with acoustic and shock waves); interphase coupling; slag dynamics; collision and
diffusion modeling; and LES modeling for smoke. This research encompasses the development of methods for handling species, especially taking into account real gas effects.
Fundamental Supporting Fluid Dynamics Research
Experimental Studies of Rocket Fluid Mechanics - R. Adrian
We propose continuing an experimental program to provide physical data on the fluid mechanics of turbulent core-flow, combustion and two-phase flow in solid fuel rockets. These
data support the development of physical models needed for the computational fluid dynamics formulations and provide validation of computations of basic rocket configurations. We
employ small-scale models and rocket flow simulations that allow phenomena important to
the numerical simulations to be isolated and studied in well-defined conditions and simple
geometries amenable to computational development. The studies include investigations of
the sensitivity of the core flow to detailed structure of the wall velocity boundary condition,
detailed measurements of the velocity field of the hot exhaust plume, and construction and
testing of a standard rocket model whose fuel can be modified to study characteristics of
turbulence, two-phase flow, and combustion in the burning exhaust plume.
Coagulation and Fragmentation of Al Particles and Ash - H. Aref and S. Balachandar
We are studying equations describing coagulation and fragmentation phenomena with a view
to applying the results to the aluminum particles used in solid-rocket fuels and the ash particles produced once these particles have burned. Modeling of the size and mass distribution of
fuel and ash particles is an important input to the global simulation codes being constructed.
This research is one element of a broader thrust within the CSAR particle group.
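
The equations in question are the discrete Smoluchowski coagulation equations. A self-contained illustration of one step (constant kernel, explicit Euler, truncated size range; purely schematic, not our production model):

    import numpy as np

    def smoluchowski_step(n, K, dt):
        """One explicit Euler step of the discrete Smoluchowski equations:
        dn_k/dt = 0.5*sum_{i+j=k} K_ij n_i n_j - n_k sum_j K_kj n_j,
        where n[k] is the number density of clusters of mass k+1."""
        kmax = len(n)
        dndt = np.zeros(kmax)
        for k in range(kmax):
            gain = 0.5 * sum(K[i, k - 1 - i] * n[i] * n[k - 1 - i]
                             for i in range(k))
            loss = n[k] * sum(K[k, j] * n[j] for j in range(kmax))
            dndt[k] = gain - loss
        return n + dt * dndt

    kmax = 30
    n = np.zeros(kmax); n[0] = 1.0     # monodisperse initial population
    K = np.ones((kmax, kmax))          # constant kernel (simplest choice)
    for _ in range(200):
        n = smoluchowski_step(n, K, dt=0.01)
    # Mass beyond the largest tracked size is lost to truncation.
    print("total mass retained:", np.dot(np.arange(1, kmax + 1), n))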
Particle Dynamics in High-Speed Flows - H. Aref
The dynamics of particles advected by a flow are only known approximately, in general, although they are known with considerable accuracy in the limits of irrotational flow and when
the Reynolds number of the relative motion of particle and fluid is small. Much of the literature ignores effects from particles being non-spherical and from the center of mass and the
centroid of the particle not coinciding. We propose to gain a thorough understanding of particle motions in the types of flow that prevail in the solid booster rocket by considering the
Kirchhoff-Kelvin equations and modifications thereof. These equations should enable us to
achieve insights into the motion of irregular objects, both homogeneous objects of irregular
shape, and objects of regular shape with an inhomogeneous mass distribution, and achieve
a parametrization of the effects of particle-flow interactions in an imposed, essentially unidirectional flow, and establish a parametrization of particle-particle interaction effects. These
issues are relevant as a basis for modeling individual particle motions in the rocket, particle
interactions and collisions, and processes leading to coagulation of particles and build-up of
ash and slag.
Turbulence Research - R. Moser, R. Adrian, J. Freund, B. Wasistho, and A. Haselbacher


The flow inside an SRM is of very large Reynolds number and is therefore turbulent. The
turbulence affects the transport of aluminum and aluminum oxide particles, the interaction of the gas flow with the combustion layer and with solid components such as inhibitors and the nozzle, heat transfer to solid surfaces (e.g., the nozzle), and many other fluid interactions. In
the Rocfluid code, turbulence is modeled using Large Eddy Simulation (LES) and several
LES models have been incorporated and tested in the Rocturb module. There are, however, a
number of areas in which further development and research are required.

- Continued Rocturb development: Continued development is planned to incorporate both more models and new capabilities (e.g., unstructured grid support, zonal modeling).

- New model development and implementation: A new class of LES models called optimal LES is being developed. The first implementation of an optimal LES model for a real application is planned for Rocturb within the year. Further development to support more complex flow physics and better statistical models is also planned.

- Combustion-driven turbulence fluctuations: Among the driving forces for the development of turbulence are the fluctuations introduced at the combustion layer and the shear instabilities produced by long-wave axial acoustic modes. Using information from Rocburn and hydrodynamic simulations, the physics of this interaction will be further investigated, and physics-based models for the injected turbulence fluctuations will be developed and implemented in Rocturb or as a boundary condition module.

- Multiphase LES: Subgrid turbulence interacts with aluminum and oxide particles in the flow. The particles act to produce subgrid turbulence, and the subgrid turbulence disperses the particles. Models are needed for both of these effects, and they must be integrated with Rocturb, Rocpart, and Rocsmoke.

- Validation data: Validating the turbulence modeling and simulation in a solid rocket is difficult because the hostile flow environment precludes detailed measurements. Our LES models must thus be validated against simple laboratory experiments and detailed (DNS) simulations. The ongoing work to produce the data needed for validation and other purposes is planned to continue.

3.5 Combustion and Energetic Materials

Group Leader: M. Quinn Brewster


Faculty: M. Quinn Brewster, John D. Buckmaster, David M. Ceperley, Herman Krier, Richard M. Martin, Mark Short, and D. Scott Stewart
Research Scientists and Programmers: Thomas L. Jackson, Kung-Chyun Tang, and Xiaojian
Wang
Postdoctoral Research Associate: Luca Massa
Overview
Combustion of solid propellant composed of energetic materials is the driving thermomechanical force in the operation of a solid rocket motor. Accurate modeling and simulation of
the combustion and resulting regression of solid propellants entails research activities in several areas, including the description and propagation of the propellant combustion interface, modeling of solid propellant flames, combustion instability analysis, and constitutive modeling of energetic materials.
The objective of our work on combustion of
ammonium perchlorate (AP) solid propellants is
to develop a simulation capability that will allow
reliable prediction of macroscopic combustion
behavior, particularly burning rate, from fundamental material properties and formulation parameters, including AP particle size distribution.
Our approach is based on characterizing the microscopic behavior of a burning propellant in the
critical, burning rate-determining region near the
surface of the propellant and developing microstructural combustion models that can simulate
both micro- and macro-burning behavior.

Fig. 3.7: Goal of CEM Group is integrated simulation of the propellant combustion layer: outflow region, combustion layer, melt layer, and solid propellant (AP grains, binder, aluminum particles, burning Al particles, flame), with mass injection at the regression rate.

Significant progress has been made in resolving unsteady 1-D issues, except for ignition.
Present and near-term multidimensional efforts focus, and will continue to focus, on 2-D capability with laminate propellants, addressing issues of
non-premixedness of solid fuel and oxidizer, partially premixed diffusion flame structure,
and coupling with the solid components. A key part of our approach is a strong emphasis on
physical validation of modeling assumptions and identification of new phenomena or explication of such via simulation. The long-term objective is to develop a 3-D simulation tool
that has the capability to successfully represent quantitatively the effects of AP particle size
as well as environmental parameters on propellant burning response, both steady and unsteady, something that has eluded propellant combustion modelers for decades.

Ignition and Radiation in AP Propellants - M. Brewster and K. Tang


Ignition of AP-composite propellants will be simulated computationally. The simulation will
use a modified Zeldovich-Novozhilov (ZN) theoretical approach, compatible with the nonlinear dynamic burning model (Rocburn) that has already been developed and validated. The
ignition and nonlinear dynamic burning package will form a unified tool for simulating the
entire ignition and propellant combustion process. Radiative energy will be considered as the
ignition source first, due to the strong role of radiation from burning metal in pyrotechnic igniters. Convective heating will be added after the radiant heating mode is developed and
validated. The effects of radiant flux level, spectral energy distribution, and propellant optical
properties on ignition delay will be investigated. The model will be used to predict ignition
behavior of AP and AP-composite propellant and will be validated with experimental results
for AP and space shuttle propellant. Upon validation the model will be used to simulate ignition and motor pressurization in lab-scale motors, tactical motors, and large-scale space
boosters. We also propose to simulate thermal radiation coupling with the emitting, absorbing, and scattering species (molecular gases and condensed phase particulates) in the core
flow in order to include the effect of radiation on propellant ignition and burning rate in integrated motor simulations.
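
As a back-of-the-envelope companion to this effort, the sketch below evaluates the classical inert-heating estimate of radiant ignition delay for a semi-infinite solid under constant absorbed flux, the elementary starting point that dynamic-burning models such as Rocburn refine; the property values are illustrative, not propellant data:

    import numpy as np

    # Inert-heating estimate: a semi-infinite solid heated by constant absorbed
    # flux q ignites when its surface reaches T_ign. From the constant-flux
    # conduction solution, t_ign = (pi/4)*k*rho*c*(T_ign - T0)**2 / q**2.
    def ignition_delay(q, T0=300.0, T_ign=700.0, k=0.4, rho=1800.0, c=1200.0):
        return 0.25 * np.pi * k * rho * c * (T_ign - T0) ** 2 / q ** 2

    for q in (5e5, 1e6, 2e6):  # absorbed radiant flux, W/m^2
        print(f"q = {q:.1e} W/m^2  ->  t_ign ~ {ignition_delay(q)*1e3:.1f} ms")

The strong inverse-square dependence on flux is why the radiant heating mode is developed and validated first before convective heating is added.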


Heterogeneous Propellant Combustion - J. Buckmaster, T. Jackson, M. Short, L. Massa, and X. Wang
The development of propellant morphology modeling, homogenization strategies to account
for the smallest AP particles, and a phase-coupled 3-D unsteady propellant burning code now
permit an attack on the central problems of propellant combustion. A broad spectrum of
problems will be addressed that include improved surface kinematics (presently we can only
account for surfaces described by single-valued functions); the eruption, melting, agglomeration, and detachment of aluminum particles, and the effect of this on the propellant combustion; a combined study with the Fluids Group of the role that the combustion code Rocfire
has on the turbulent chamber flow, and the effect that the turbulent chamber flow has on
combustion (erosive burning); and a numerical study of the response of the burning rate to
acoustic disturbances for the purposes of validation.
Simulation of Condensed Phase and its Kinetics - S. Stewart
A comprehensive modeling effort will be carried out to study and develop accurate descriptions of the condensed phase processes in energetic materials used in solid propellants and
explosives. The continuum modeling reflects the simultaneous appearance of delineated
solid, liquid and gas phases, phase boundary regions and exothermic (or endothermic)
chemical reaction. Submodels that can describe large-deformation elastic and viscoelastic solid behavior will be used in collaboration/consultation with the Structures and Materials group. A simplified chemistry scheme for condensed phase kinetics will be employed.
Constitutive information will be sought from molecular dynamics (MD) especially to model
the kinetics of phase change. Specific and representative problems include a multiphase
simulation of monopropellant flames like AP and HMX. Simulation tools include existing
multi-material codes like GIBBS2D and recently built combustion solvers. We will use high
order methods and level set representation of material interfaces to the extent possible.
Ab Initio Methods for Calculation of Energetic Material Properties - D. Ceperley and R. Martin
We propose developing new ab initio methods to calculate properties of materials at the high temperatures and densities relevant to rocket engines. We
will use a combination of density functional and quantum Monte Carlo methods that are
needed for realistic studies. The former are the most widely used methods in physics and
chemistry and the latter are the only methods capable of high accuracy and reliability for a
wide range of conditions at high temperatures and densities. The methods will be benchmarked on relevant energetic materials. Our recent work supported by CSAR is making important steps toward these goals, including simulations of hydrogen compared to shock wave
data from LLNL and phase transformations of nitrogen at high pressure. Active collaborations with scientists at LLNL on these topics are ongoing. Our future research will be enhanced by collaborating on energetic materials projects sponsored under a MURI grant at
Oklahoma State University (D. L. Thompson, PI). The grant is "Accurate Theoretical Predictions of the Properties of Energetic Materials," and the fact that we were approached to join this work is recognition of the importance of ab initio simulations for understanding energetic materials. However, the funding is not sufficient by itself to make rapid progress on
these difficult topics.


3.6 Computer Science

Group Leaders: Michael T. Heath and Laxmikant V. (Sanjay) Kale


Faculty: Eric de Sturler, Jeff G. Erickson, Michael J. Garland, Michael T. Heath, Laxmikant V.
Kale, Daniel A. Reed, Paul Saylor, and Marianne Winslett
Research Scientists/Programmers: Michael T. Campbell, Robert A. Fiedler, Xiangmin (Jim)
Jiao, Orion S. Lawlor, and Johnny C. Norris
Postdoctoral Research Associates: Damrong Guoy and Joerg Liesen
Overview
Two teams, Computational Mathematics and Geometry, and Computational Environment,
address research in Computer Science. In addition, critical CSAR resources are tools developed by the Visualization subteam in Computational Environment. Work in Computational
Mathematics and Geometry focuses on the areas of linear solvers, mesh generation and adaptation, and data transfer between diverse discretizations. The target of our Computational Environment team is to provide the broad computational infrastructure necessary to support complex, large-scale simulations in general, and rocket simulation in particular. Areas of research include parallel programming environments, parallel performance analysis and tools, parallel input/output, and visualization.

Fig. 3.8: CS Group drives integrated code development; components include Rocdriver, Rocflu, Rocsolid, Rocburn, Rocflo, Rocfrac, Roccom, Rocface, Rocnum, Rocpanda, Tetmesh, Gridgen, Metis, and Makeflo.
Computational Mathematics and Geometry
Krylov Subspace Methods and Preconditioners for Very Large Scale Problems - E. de Sturler and P. Saylor
As we simulate increasingly more complicated problems, we must solve ever-larger linear
systems. ASC's intention is to solve systems of more than a billion unknowns on massively parallel computers. In addition, these systems become increasingly harder to solve: more ill-conditioned, non-Hermitian, indefinite, and highly non-normal. This calls for robust methods
like GMRES, GCROT, or variations; however, these are expensive. We would like to use
Lanczos-type methods, which are cheap per iteration and have low memory requirements,
such as BiCG and methods derived from it (QMR, TFQMR, and BiCGSTAB). However,
these methods often lack robustness and converge slowly or not at all.
In order to obtain methods with low cost per iteration (memory and CPU), fast convergence, and robustness we propose to derive methods that combine the virtues of both classes.
We have done some work on this in the past, and we are currently working on ways to make
Lanczos-type methods (BiCG and relatives) more robust and faster converging.
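
The trade-off can be demonstrated with off-the-shelf solvers. The sketch below (using SciPy purely for illustration; our production solvers are not SciPy-based) applies GMRES, a robust but comparatively expensive method, and BiCGSTAB, a cheap Lanczos-type method, to the same nonsymmetric system:

    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import gmres, bicgstab

    # Nonsymmetric, convection-diffusion-like tridiagonal test matrix.
    n = 500
    A = diags([-1.0, 2.0, -0.3], [-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)

    for name, solver in (("GMRES", gmres), ("BiCGSTAB", bicgstab)):
        x, info = solver(A, b)   # info == 0 signals convergence
        print(f"{name}: info={info}, residual={np.linalg.norm(b - A @ x):.2e}")

On the hard systems described above, BiCGSTAB-like methods may stagnate where GMRES does not, which is exactly the robustness gap the proposed hybrid methods aim to close.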
Another important way to get more robust methods and faster convergence is to develop
better preconditioners, especially for large parallel computers. We propose to analyze and further develop several classes of parallel preconditioners based on our previous work and
that of others.
Mesh Adaptation and Refinement - D. Guoy and M. Heath
Integrated, multi-component simulations require a wide range of meshing capabilities. Perhaps the greatest challenge is dealing effectively with dynamically changing geometries and
correspondingly changing meshes. Integrated rocket simulations require a variety of element
types (hexahedral and tetrahedral), mesh types (volume meshes and surface meshes), and
levels of adaptation (from minor mesh motion to substantial geometric change requiring major mesh repair or complete remeshing). We will develop techniques for addressing each of
these cases. We have already developed effective techniques for smoothing volume meshes
subject to gradual change, such as in propellant burning, and we plan to extend these to
smoothing surface meshes, such as the interface between fluid and solid regions, where relocated nodes must be constrained to lie on the (generally curved) surface. For more substantial
geometric changes, such as flexible inhibitors or propagating cracks, we will develop further
our mesh repair capabilities based on multiple steps of node movement, mesh coarsening,
and mesh enrichment. Finally, to deal with more drastic geometric change, such as propellant
burnback over an extended period, we plan to develop new capabilities for generating an entirely new mesh based in part on our previously developed techniques for sink insertion and
sliver removal.
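
The simplest member of this smoothing family is Laplacian relaxation, sketched below for illustration; production smoothing adds element-quality safeguards and, for surface meshes, projection of relocated nodes back onto the curved geometry:

    import numpy as np

    def laplacian_smooth(coords, neighbors, boundary, iters=20, relax=0.5):
        """Relax each interior node toward the centroid of its neighbors;
        boundary nodes are held fixed."""
        coords = coords.copy()
        for _ in range(iters):
            new = coords.copy()
            for i, nbrs in enumerate(neighbors):
                if i in boundary:
                    continue
                centroid = coords[list(nbrs)].mean(axis=0)
                new[i] = (1 - relax) * coords[i] + relax * centroid
            coords = new
        return coords

    # 1-D example: nodes crowded near one end relax toward uniform spacing.
    coords = np.array([[0.0], [0.05], [0.1], [0.2], [1.0]])
    neighbors = [(1,), (0, 2), (1, 3), (2, 4), (3,)]
    print(laplacian_smooth(coords, neighbors, boundary={0, 4}).ravel())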
Mesh Correlation, Data Transfer, and Interface Propagation - M. Heath and X. Jiao
We will continue our work on the issues of correlating multiple meshes, data transfer between meshes, and interface propagation. These problems are directly related to the coupled
simulation code. Mesh correlation and data transfer are needed in many situations that involve multiple meshes, such as the interface between fluid and solid domains and a dynamic
solver involving adaptive remeshing. We have achieved reasonable success on mesh correlation and data transfer for surface meshes. We propose further work on adaptivity, parallelization, robustness, and generalization to more types of meshes (for example, volume
meshes). The problem of interface propagation is to track the motion of the interface. The
current propagation scheme in GEN2 is fairly ad hoc and is problematic at ridges and corners. We have developed a new approach in two dimensions and propose extending this
method to three dimensions. We propose tackling the problem in two steps. First, we will develop a limited algorithm assuming fixed connectivity for quick prototyping and easy integration. Second, we will develop a more general algorithm that allows full adaptivity of the
interface for better accuracy.
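
The difficulty at corners is easy to exhibit. The sketch below propagates a closed 2-D polyline by moving each vertex along the average of its incident edge normals; this normal averaging is precisely the step that behaves poorly at ridges and corners:

    import numpy as np

    def propagate_polyline(pts, speed, dt):
        """Move each vertex of a closed 2-D polyline (counterclockwise) along
        the average of its two incident edge normals."""
        fwd = np.roll(pts, -1, axis=0) - pts          # edge vectors
        edge_n = np.column_stack((fwd[:, 1], -fwd[:, 0]))  # outward edge normals
        edge_n /= np.linalg.norm(edge_n, axis=1, keepdims=True)
        vert_n = edge_n + np.roll(edge_n, 1, axis=0)  # average incident normals
        vert_n /= np.linalg.norm(vert_n, axis=1, keepdims=True)
        return pts + dt * speed * vert_n

    theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
    circle = np.column_stack((np.cos(theta), np.sin(theta)))
    grown = propagate_polyline(circle, speed=1.0, dt=0.1)
    print("mean radius:", np.linalg.norm(grown, axis=1).mean())  # ~1.1

On a smooth curve this behaves well, but at a sharp corner the averaged normal underestimates the true propagation distance, which motivates the more careful two- and three-dimensional treatment proposed above.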
Computational Environment
Software Support for System Integration, Error Estimation, and Error Control - E. de Sturler and X. Jiao
An important effort of computer science is to develop software tools for easing the development and integration of physical components. We propose to develop a set of new tools for
error estimation, error control, and stability analysis. These tools will provide a means to
monitor the accuracy and stability of both the individual applications and the integrated code,
so that the accuracy and stability of a simulation can be established or violations of such assumptions can be identified. This will make it easier for application developers and users to perform code verification and debugging. Further, it can also serve as a basis for solution-based mesh adaptation and time zooming. Much of this effort will take advantage of
our previous work and be conveniently built upon our existing software framework. We have
developed a set of software tools for supporting intercomponent communication and orchestration, including Roccom, Rocnum, and Rocface. We also propose the continuing development and maintenance of these tools to meet new challenges of the more dynamic behavior
of the rocket simulation code in the coming years.
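
The following sketch suggests the flavor of one such tool (the class, threshold, and reporting here are hypothetical; the actual tools would hook into Roccom-registered field data): a monitor that flags abnormal growth of a solution norm between time steps.

// Sketch of a runtime stability check of the kind such tools could
// provide (hypothetical threshold and interface): flag any step in
// which a registered field norm grows faster than allowed, suggesting
// instability in a component or at a coupling interface.
#include <cmath>
#include <cstdio>
#include <vector>

class StabilityMonitor {
  double prevNorm = -1.0;
  double maxGrowth;   // allowed factor of norm growth per step
public:
  explicit StabilityMonitor(double growthFactor) : maxGrowth(growthFactor) {}
  bool check(const std::vector<double>& field, int step) {
    double s = 0;
    for (double v : field) s += v * v;
    double norm = std::sqrt(s);
    bool ok = (prevNorm < 0) || norm <= maxGrowth * prevNorm;
    if (!ok)
      std::fprintf(stderr, "step %d: field norm grew %g -> %g\n",
                   step, prevNorm, norm);
    prevNorm = norm;
    return ok;
  }
};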
AMPI and Component Interface Techniques – L. Kale and O. Lawlor
The integrated codes that we propose to develop arise from multiple research groups and development teams. Each will be parallel, exhibit dynamic behavior, and need to communicate data to other modules, in addition to the internal communication within each parallel module.
This project proposes further developing AMPI, an adaptive implementation of MPI, and an
orchestration and interface framework to help accomplish these objectives. The AMPI system currently provides automatic load balancing, automatic checkpointing and the ability to
communicate across independent modules for MPI programs. However, porting MPI codes
to AMPI still requires some effort, notably in writing pack-unpack functions and in collecting
global variables. We aim to fully automate this process, so no extra work needs to be done by
an MPI programmer. Further, AMPI itself will be enhanced by making the implementation more comprehensively standard-conforming (including some MPI-2 features) and by improving its communication efficiency. An orchestration and interface framework will be developed that allows each component to publish its boundary data without being explicitly
aware of which other modules will use it, and correspondingly use complementary boundary
data coming from other modules in an equally opaque manner.
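
For context, the pack-unpack step we aim to automate currently looks roughly like the following hand-written sketch against Charm++'s C PUP interface (the struct and field names are illustrative, and the AMPI registration call, which has varied across versions, is omitted):

// Sketch of a hand-written AMPI pack/unpack (PUP) routine of the kind
// this project would generate automatically. Global variables are
// gathered into one struct; a single routine then sizes, packs, and
// unpacks that state when the runtime migrates a virtual MPI process.
#include <stdlib.h>
#include "pup_c.h"   /* Charm++/AMPI C PUP interface */

typedef struct {
  int     nCells;
  double  dt;
  double *temperature;  /* field of length nCells */
} SolverGlobals;

void pupSolverGlobals(pup_er p, void *data) {
  SolverGlobals *g = (SolverGlobals *)data;
  pup_int(p, &g->nCells);
  pup_double(p, &g->dt);
  if (pup_isUnpacking(p))  /* allocate on the destination processor */
    g->temperature = (double *)malloc(g->nCells * sizeof(double));
  pup_doubles(p, g->temperature, g->nCells);
}
/* This routine is registered with AMPI's migration hooks; the exact
   registration call has varied across AMPI versions, so it is omitted. */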
Component Frameworks – L. Kale and O. Lawlor
Although parallel application modules being developed for rocket simulation, as well as
other physical simulations, employ a large number of different algorithms, discretizations,
and numerical methods, they all rely on a few basic data structures, such as particles, structured grids, and unstructured meshes. Parallel implementations of each type of code, when
attempted from scratch, require a significant programming effort beyond that needed for sequential programming. Further, adaptive refinements present a significant challenge for each
data structure: Adaptive Mesh Refinement (AMR) for structured grids; unstructured mesh refinements and changes such as insertion of cohesive elements; and tree rebuilds for particles. We are developing component frameworks that dramatically reduce the effort needed to
use these algorithms in a parallel program, by providing abstractions that encapsulate or hide
the parallelism in the runtime system. The frameworks include the MBlock framework for structured-grid-based applications, with automatic handling of communication for curvilinear geometries; an AMR extension to MBlock; an unstructured-mesh framework with applications in structures and finite-volume methods; and a particle framework. Work on the last
two will leverage support from other grants, as well as CSAR funding. The proposed work
includes both new capability development and help with integrating the frameworks
into mainstream Roc* codes.
Load Balancing and Communication Optimizations – L. Kale and O. Lawlor
Current and future CSAR simulations will have dynamic behavior and run on large next generation machines, where communication is expensive relative to computation. We plan to
develop techniques, including dynamic load balancing and communication optimizations, which will automatically improve the performance of CSAR simulations. Our ideas are based
on two principles: virtualization and the principle of persistence. Virtualization is a technique
we have been exploring for the past decade in the context of the Charm++ system. With this, an
application developer focuses on decomposing an application into many parallel chunks, but
is freed from deciding which processor will run each chunk, and how the communication
between them is implemented. This gives the runtime system the flexibility (and the challenge) to change the mapping of chunks to processors at runtime; to do this, we have implemented several measurement-based load-balancing strategies. We plan to improve these
strategies for next generation parallel machines, where we must take the network topology
into account to minimize communication bandwidth, and employ fully distributed strategies,
in contrast to current centralized decision making, to retain scalability. Communication between chunks also needs to be optimized, especially when there are multiple communicating
entities (chunks) per processor.
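
A minimal sketch of the measurement-based idea, in the spirit of the centralized greedy strategies mentioned above (not the actual Charm++ implementation): chunks are placed heaviest-first, each on the currently least-loaded processor.

// Greedy measurement-based mapping sketch: given the measured load of
// each chunk, assign chunks (heaviest first) to the processor with the
// least accumulated load. Returns owner[chunk] = processor id.
#include <algorithm>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

std::vector<int> mapChunks(const std::vector<double>& load, int nProcs) {
  std::vector<int> order(load.size()), owner(load.size());
  for (int i = 0; i < (int)order.size(); ++i) order[i] = i;
  // heaviest measured chunks first
  std::sort(order.begin(), order.end(),
            [&](int a, int b) { return load[a] > load[b]; });
  // min-heap of (accumulated load, processor id)
  using Entry = std::pair<double, int>;
  std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> procs;
  for (int p = 0; p < nProcs; ++p) procs.push({0.0, p});
  for (int c : order) {
    Entry e = procs.top();          // least-loaded processor so far
    procs.pop();
    owner[c] = e.second;
    procs.push({e.first + load[c], e.second});
  }
  return owner;
}

The topology-aware and fully distributed strategies proposed above would refine this further, since a centralized greedy pass ignores both communication locality and the cost of gathering all measurements in one place.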
Intelligent Performance Tuning – D. Reed and M. Campbell
During the past five years, the Pablo group and CSAR have had a mutually beneficial relationship. We have used the Pablo performance toolkit to identify and correct problematic
CSAR code. Likewise, the CSAR code has helped identify the needs of effective performance tools, and the Pablo toolkit has evolved, via funding from other sources, to fulfill those
requirements. Our goal is to strengthen and build on this symbiotic relationship. The Pablo
performance toolkit will continue to evolve in response to experience with performance
analysis of CSAR applications. However, based on our experiences to date, we propose focusing software development and enhancements on scalability analysis, offline performance
tuning and guidance, and online dynamic adaptation and multiversion code selection.
Parallel I/O and Data Migration – M. Winslett
Our research group supplies the Rocpanda parallel I/O library used by GENX, along with
Rocpanda's active buffering facilities that keep GENX I/O costs low. We will continue to
meet GENX I/O needs, while enhancing the facilities currently available. In particular, we
propose to add integrated support for compressed data migration in GENX, improve Rocpanda's I/O load balancing facilities, develop data declustering strategies for the parallel
version of Rocketeer, and continue our work on self-tuning parallel I/O systems. As GENX
adopts adaptive grids and time zooming, these features will raise new I/O load imbalance issues, which will need to be addressed in the coming years.
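
The active-buffering idea itself can be sketched as follows (illustrative only; Rocpanda's actual design differs in detail): the compute side deposits a snapshot and returns immediately, while a background writer drains buffers to disk, overlapping I/O with computation.

// Active-buffering sketch: deposit() queues a snapshot and returns at
// once; a background thread streams queued snapshots to disk so the
// simulation rarely blocks on I/O.
#include <condition_variable>
#include <fstream>
#include <mutex>
#include <queue>
#include <thread>
#include <utility>
#include <vector>

class ActiveBuffer {
  std::queue<std::vector<double>> q;
  std::mutex m;
  std::condition_variable cv;
  bool done = false;
  std::thread writer;   // declared last so it starts after members exist
public:
  explicit ActiveBuffer(const char* path)
    : writer([this, path] {
        std::ofstream out(path, std::ios::binary);
        std::unique_lock<std::mutex> lk(m);
        while (!done || !q.empty()) {
          cv.wait(lk, [this] { return done || !q.empty(); });
          while (!q.empty()) {
            std::vector<double> snap = std::move(q.front()); q.pop();
            lk.unlock();              // write without holding the lock
            out.write(reinterpret_cast<const char*>(snap.data()),
                      static_cast<std::streamsize>(snap.size() * sizeof(double)));
            lk.lock();
          }
        }
      }) {}
  void deposit(std::vector<double> snapshot) {
    { std::lock_guard<std::mutex> lk(m); q.push(std::move(snapshot)); }
    cv.notify_one();
  }
  ~ActiveBuffer() {                   // flush remaining buffers, then stop
    { std::lock_guard<std::mutex> lk(m); done = true; }
    cv.notify_one();
    writer.join();
  }
};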
Visualization
Rocketeer Visualization Suite – R. Fiedler and J. Norris
We will continue to enhance the Rocketeer suite of visualization tools. The most useful tool
in the Rocketeer suite for very large, remote data sets is the client/server version
(Apollo/Houston), but the serial workstation version (Rocketeer) and the MPI parallel batch
mode version (Voyager) share the same code base. Support for large data sets will be improved by limiting the number of polygons in images sent to the desktop client. Numerous
useful features will be added, including fly-through animations, streamlines, support for tensors, data manipulation capabilities, support for advanced displays, and X-Y plots along arbitrary lines. Additional reader plug-ins will be developed to enable input of new data formats, including HDF5 and streaming data, which would allow on-the-fly visualization of a
running simulation. Scalability of the MPI parallel server Houston will be improved by including dummy cell values in each data block to reduce communication significantly.
Rocketeer surpasses many visualization tools in its ability to handle a wide variety of grids on which data may be defined: the grid may be non-uniform, structured or unstructured, and multiblock. Rocketeer can display multiple data sets for multiple materials from multiple files in a single image, and it can perform the same set of graphics operations automatically on a series of data files to produce frames for animation. Voyager is a fully featured MPI parallel batch mode version of Rocketeer that takes advantage of a parallel platform by processing a different snapshot concurrently on each CPU. Voyager takes as input a text file, saved during an interactive Rocketeer session, that specifies the camera position, a list of graphics operations to perform, and a list of all HDF files to process.
Verification and Validation

Team Coordinator: Mark D. Brandyberry


Our code development effort requires the coding, testing, and integration of multiple physics
codes, along with codes for mesh matching, parallel I/O, job management, etc. The effort is
multi-language and involves many developers. It is the goal of the Software Engineering
(SE) and Verification and Validation (V&V) efforts to ensure that the computer code products resulting from CSAR efforts are of professional quality, fully tested, and well documented. The proposed work on SE and V&V for the second five years of CSAR will be discussed as separate topics. Software engineering is discussed in the Code Legacy section, below.
Activities that fall under the SE or V&V areas include code and documentation configuration management, various forms of testing (unit, regression, verification, validation, etc.),
code review, build management, and code design. These activities are often not standalone,
and interact with each other. As an example, a test case designed to validate a portion of the
integrated code can be used also as a regression test case if it has an appropriate runtime.
Verification
Verification is making sure the code solves the equations correctly. The goal of the effort is
to ensure that the results of simulations performed with CSAR codes are consistent and accurate as compared to the equations and models that are built into the code(s). Documentation
review and code review are important parts of the verification of the codes in the CSAR integrated code suite and will be continued and enhanced. To characterize the accuracy of CSAR
codes in comparison to their models, methods exist to estimate result (grid) errors both locally and globally. We propose that research be initiated on an error estimation framework: a set of tools for performing error analysis of grid-based computer codes. The framework would be callable from these codes and would give developers consistent, easily used tools to verify the computational accuracy of their codes. This framework
may be part of a larger effort that would include analyses to support grid refinement. Further
work on development of verification problems must be performed to generate analytic results
with which to compare verification runs of individual physics codes. It is important to ensure
that each component code be verified before attempting to validate the integrated code. Methods such as manufactured solutions will also be investigated and used as appropriate to
generate these problems. For the integrated code, methods for computing values such as mass
conservation across the solid/fluid interface will be extended.
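
To illustrate the manufactured-solutions idea on a model problem (purely illustrative, not a proposed CSAR test case), one picks a smooth exact field, derives the source term it induces, and then checks that the code reproduces the exact field at the discretization's design order:

\begin{align*}
  u^*(x,t) &= e^{-t}\sin(\pi x),\\
  s(x,t)   &= u^*_t - \alpha\, u^*_{xx}
            = \left(\alpha\pi^2 - 1\right) e^{-t}\sin(\pi x).
\end{align*}

Solving $u_t = \alpha u_{xx} + s(x,t)$ with matching initial and boundary data must then yield $\|u_h - u^*\| = O(h^p)$ for a scheme of design order $p$; any lower observed order signals a coding or consistency error.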
Validation
Validation, on the other hand, is making sure the model solves the correct equations. Thus,
it is important to be able to compare the results of the code(s) against known physical results.
Several problem sets are currently available (Titan IV, lab-scale rocket), but more need to be developed (other lab-scale geometries, Space Shuttle RSRM, etc.). Collection and analysis of
appropriate data is a large effort, as is constructing the runtime models for these problems.
Running any of these physical problems requires a large amount of computer time and
generates many GB of output data. Currently the Rocketeer visualization tool is available to
view the resultant data sets, but no tool is available to analyze the raw data, and/or compare
two data sets. We propose that a tool be constructed to enable the extraction of arbitrary data points, sets, etc. from CSAR code output, and that the new tool be able to compare two ostensibly similar data sets to assess their convergence. This tool would be available on demand and would also interface with the Roctest automated test suite to assess the results of
the periodic regression tests.
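
At its core, such a tool needs a difference norm between two ostensibly similar data sets; a minimal sketch follows, assuming both sets have already been sampled or interpolated at common points (which is the hard part in practice):

// Relative L2 difference between two data sets sampled at the same
// points; the reference set b supplies the normalization.
#include <cmath>
#include <cstddef>
#include <stdexcept>
#include <vector>

double relativeL2Diff(const std::vector<double>& a,
                      const std::vector<double>& b) {
  if (a.size() != b.size())
    throw std::invalid_argument("data sets must be sampled identically");
  double num = 0, den = 0;
  for (std::size_t i = 0; i < a.size(); ++i) {
    num += (a[i] - b[i]) * (a[i] - b[i]);
    den += b[i] * b[i];
  }
  return den > 0 ? std::sqrt(num / den) : std::sqrt(num);
}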
A series of grid-convergence studies using a lab-scale rocket has begun and will continue, and such studies will be extended to problems at other scales. Grid convergence studies for models with analytical results should also be performed as verification activities. Code uncertainty and sensitivity to model parameters are also important. Because the CSAR codes generally take too long to run for a full statistical uncertainty analysis, we propose researching methods that can estimate uncertainties in code results without years of run time.
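
For the grid-convergence studies, the standard Richardson-extrapolation bookkeeping (a sketch of the textbook formulas, not a specific CSAR procedure, and assuming monotone convergence) estimates an observed order and an extrapolated value from three systematically refined grids:

// Grid-convergence bookkeeping: from a solution functional f computed
// on coarse, medium, and fine grids with constant refinement ratio r,
// estimate the observed order p and a Richardson-extrapolated value.
#include <cmath>

struct Convergence { double order, extrapolated; };

Convergence gridConvergence(double fCoarse, double fMedium, double fFine,
                            double r /* refinement ratio, e.g. 2 */) {
  // assumes monotone convergence: (fCoarse - fMedium)/(fMedium - fFine) > 0
  double p = std::log((fCoarse - fMedium) / (fMedium - fFine)) / std::log(r);
  double fExtrap = fFine + (fFine - fMedium) / (std::pow(r, p) - 1.0);
  return {p, fExtrap};
}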
Many of the simulation demonstrations listed in Section 3.2 will also serve as validation
tests. In particular, we either already have or expect to obtain detailed test data for a number
of lab-scale rockets and small guidance motors for comparison with our simulation results. In
addition, under our Space Act Agreement with NASA Marshall Space Flight Center in
Huntsville, Alabama, we have received access to extensive design and test data for the Space
Shuttle RSRM. Such test data will enable detailed validation of our integrated simulation results. Further, laboratory experiments on our own campus will be used in validating individual component models in propellant combustion and turbulence.


Outreach and Interaction with NNSA/DP Laboratories

4.1 Center-NNSA/DP Interactions

Center personnel have traveled extensively and have been involved in a large number of
technical and informational meetings. These included meetings intended to explore rocket
science and technology, identify technical collaborators, describe the ASCI/ASAP program,
and establish relationships among Center investigators, DOE lab scientists, and industry
leaders. Individual CSAR senior investigators and technical staff have traveled to DOE DP
labs to serve on ASCI/ASAP panels, to participate in ASAP-wide workshops (e.g., materials
and computational environment), to offer research seminars and technical interaction, to receive training on the ASCI computational resources, and to discuss ASCI resource issues
with the CRT. We will continue to work closely with lab representatives on the TST to identify opportunities for detailed interaction. During the first five years of the program, we have
documented hundreds of CSAR-lab interactions in the following areas:

• CSAR researcher visits to DOE Labs
• ASC PI Meetings
• DOE Panel Reviews
• ASC visits and seminars at CSAR
• Faculty sabbaticals at DOE Labs
• ASC Workshops
• Code Sharing
• Software/hardware visits from DOE Labs
• Joint research
• Detailed technical discussions

In addition to traditional individual and small-group visits to the DOE/NNSA labs, we will be taking large teams of CSAR investigators (~15 people) to make presentations and
participate in technical discussions later this year. A visit to LLNL and SNL-CA has already
been scheduled and a similar trip in the near future will include SNL-NM and LANL.
A leading topic of discussion between lab staff and CSAR investigators will be the potential use within the labs of specific technologies developed at CSAR.
Immediate technology candidates include our technology for data transfer at component interfaces and our framework for integration of separately developed codes with automated
load balancing.
4.2 Student-NNSA/DP Interactions

CSAR has been remarkably successful in encouraging student-lab interactions. Leading opportunities for UIUC graduate and undergraduate student interactions with the NNSA/DP laboratories include:

• Summer student interns at DOE Labs
• Joint research
• Undergrads hired in CSAR labs performing collaborative research with NNSA/DP scientists


Former CSAR/CSE Students at DOE/NNSA Labs

Ali Pinar, Lawrence Berkeley National Laboratory (PhD, Computer Science, 2002)
Thomas Hafenrichter, SNL (MS, Mechanical Engineering, 2002)
Michelle Duesterhaus, SNL (MS, Mechanical and Industrial Engineering, 2001)
Jason Hales, SNL (PhD, Civil and Environmental Engineering, 2001)
Jack Yoh, LLNL (PhD, Theoretical and Applied Mechanics, 2001)
Benjamin T. Chorpening, SNL-L (PhD, Mechanical Engineering, 2000)
Burkhard Militzer, LLNL (PhD, Physics, 2000)
Christopher D. Tomkins, LANL (PhD, Theoretical and Applied Mechanics, 2000)
Jeff J. Murphy, SNL-L (PhD, Mechanical Engineering, 1999)
Jin Yao, LLNL (PhD, Theoretical and Applied Mechanics, 1999)
Donald Siegel, SNL (PhD, Physics, 1999)
Steven F. Wojtkiewicz, SNL (PhD, Aero and Astro Engineering, 1999)
Boyana Norris, Argonne National Laboratory (PhD, Computer Science, 1999)
Giulia Galli, LLNL (PhD, Physics, 1998)
Arne Gullerud, SNL (PhD, Civil Engineering, 1998)
Michael Ham, LANL (MS, Computer Science, 1998)

Former CSAR Employees at DP Labs

Jeffrey Vetter, LLNL
James Quirk, LANL
Dennis Parsons, LLNL

4.3 Increasing the Numbers of U.S. Citizen Students and Postdocs

Although CSAR has been the ASAP leader in placing former students and staff in
DOE/NNSA laboratory positions, we seek to improve our record in the coming years. An
important facet of lab placement is citizenship: it is often difficult to place non-U.S. citizens in NNSA research positions. To increase the likelihood of CSAR students and staff accepting positions at the labs, we will work diligently to increase the number and quality of
U.S. citizens in the program.
For several years we have encouraged faculty investigators to preferentially hire U.S.
graduate research assistants. In an unwritten policy, we have hired additional students for research projects when U.S. citizen students were identified, even if the initial project budget
was insufficient to cover the additional student. We will continue to extend this support to
encourage faculty to seek qualified domestic students and will attempt to broaden the extra
support to postdoctoral research associates, as well.
Undergraduate students at the University of Illinois are nearly all U.S. citizens (>98%)
and are predominantly Illinois residents (~85%). We propose rapidly growing our program of
using undergraduate students in our laboratories, and in particular, employing them to work
on meshing and laboratory experiment projects. We will make available funds to hire as
many undergraduate students as our faculty and research staff can identify.


Organization, Management, and Academic Program

No changes to the organization, management structure, or academic program are proposed.

5.1 Organization and Management

Professor Michael T. Heath, CSAR Director, and the members of the Science Steering
Committee provide world-class leadership and focus for the Center for Simulation of Advanced Rockets. The Center is administratively housed within the Computational Science
and Engineering Program of the UIUC College of Engineering, reporting to the Dean of Engineering, David Daniel.
The Computational Science and Engineering Program is inherently interdisciplinary, requiring expertise in advanced computing technology as well as in one or more applied disciplines. The purpose of the CSE Degree Option is a perfect complement to the academic goals of ASC/ASAP: to foster interdisciplinary, computationally oriented research among all fields of science and engineering, and to prepare students to work effectively in such an environment (Figure 5.1).

Fig. 5.1: CSAR is one of two research centers in the UIUC Computational Science and Engineering Program; the CSE education program is a graduate student academic degree option. (The figure shows the CSE Option, with 12 departments, 130 faculty associates, 10 graduate fellows, and 80 graduate students enrolled, alongside the DOE-funded CSAR and the NSF/DARPA-funded Center for Process Simulation and Design.)
The CSE Program does not independently admit students or confer graduate degrees; students wishing to elect the CSE Option must first be admitted to one of the participating departments before enrolling in the CSE Program. Similarly, all faculty members affiliated with CSE have regular faculty appointments in one of the participating departments. Students electing the CSE Option become proficient in computing technology, including numerical computation and the practical use of advanced computer architectures, and in one or
more (traditional) applied disciplines. Such proficiency is gained, in part, through courses
that are specially designed to reduce the usual barriers to interdisciplinary work. Thesis research by CSE students is computationally oriented and actively advised by faculty members
from multiple departments.
Management
The Director and Science Steering Committee members are responsible for nurturing the research program, administering the Center, and maintaining and expanding relationships with
the DOE DP laboratories. This directorate provides the leadership necessary to ensure that
the Center identifies the most important research areas, attracts the most qualified researchers, and pursues and completes the work effectively over the long term. A small administrative staff works to execute Center activities (Figure 5.2). Each of the Research Groups has
co-leaders who coordinate the technical program in that area. Nine technical teams are in place to address specific areas within the research effort. The Integrated Code Development Team (Incode) was formed to clearly identify the lead authors of the integrated code and to assure that resources are available.

The membership of the External Advisory Board (EAB) consists of individuals chosen from the DOE DP labs, industry, other governmental agencies, and other universities (Figure 5.3). The External Advisory Board reviews CSAR research studies, makes research recommendations, and provides expertise for translating research findings into practice. An active communications link has been established with the EAB. The Board annually assesses the progress of the Center in reports to the CSAR Director and the Dean of the College of Engineering.

Fig. 5.2: CSAR management structure provides clear direction. (The organization chart places the Director under the Chancellor, Provost, and Dean, advised by an Internal Advisory Committee of UIUC department heads and an External Advisory Board, and supported by the Managing Director, the Science Steering Committee, the Technical Program Manager, and the Research Groups.)

Administrative Staff

The Center has a small team of very high quality professional staff that provides experienced management for the program. William Dick serves as Managing Director of the CSAR, and Sheryl Hembrey is the Assistant Director for Budget and Resource Planning. Mr. Dick's role in CSAR is to manage the day-to-day operations of the program, provide strategic direction, address facilities and equipment needs (including ASCI computing resources), and assure that the Center is responsive to the DOE and ASCI. Robert Fiedler is the CSAR Technical Program Manager. Dr. Fiedler manages the code development process and convenes the System Integration Team.

Fig. 5.3: Critical constituencies included on EAB. (Rocket industry: Aerojet, Alliant Techsystems, Atlantic Research, Geisler Enterprises, Lockheed-Martin Missiles & Space, Thiokol Propulsion. Government research agencies: Air Force Research Laboratory, Army Research Office, Lawrence Berkeley National Laboratory, NASA Headquarters, NASA Marshall Space Center, Naval Air Warfare Center China Lake, Sandia National Laboratory. Computer industry: Hewlett Packard, Intel, IBM. Universities: Caltech, University of Colorado, University of Tennessee Space Institute, Yale University.)


Research Group Structure


The program is being carried out in a collaborative manner by a number of teams, each with
specific responsibilities indicated below. To facilitate communication and cooperation among
teams, there are appropriate overlaps in membership (Figure 5.4).
System Integration Team (SITeam): Responsible for overall system integration, including the
mathematical model selection for the system components and the specification of compatible
interfaces between component models. Includes both physical compatibility of component
models and software and data interfaces between corresponding component codes.
Integrated Code Development Team (Incode): New in 2000, this team brings together each of
the lead code authors from the four Research Groups. Responsible for developing the integrated simulation code.

Software Integration Framework Team (SWIFT): Responsible for crafting and executing a strategy for developing a general software architecture for component integration.

Validation, Accident, and Specification Team (VAST): Responsible for specifying detailed blueprints of devices to be simulated, including physical dimensions and materials. This team has worked closely with NASA and Thiokol in the past to collect detailed performance data for the Space Shuttle RSRM that will be used for validating CSAR simulations.

Fig. 5.4: Team efforts contribute to Research Groups. The Integrated Code Development Team was new in Y3, responding to a suggestion of the DOE Review Team.

Combustion and Energetic Materials Team: Responsible for combustion-injection modeling and corresponding codes for simulating burning of composite propellant. Also responsible for continuum-mechanical and molecular-level modeling and corresponding codes for simulating the thermo-mechanical behavior of energetic materials.
Fluid Dynamics Team: Responsible for fluid-mechanical modeling and corresponding codes
for simulating the interior cavity flow and exhaust plume.
Structures and Materials Team: Responsible for solid-mechanical and thermal modeling and
corresponding codes for simulating the case, nozzle, insulation, and propellant.
Computational Mathematics and Geometry Team: Responsible for parallel numerical algorithms,
such as sparse linear system solvers, as well as algorithms for mesh generation, partitioning,
and adaptive refinement, needed for various component codes.
Computational Environments Team: Responsible for specifying compatible data structures
and data formats for scientific data management and also for parallel I/O and visualization.
Also responsible for parallelization strategies, performance evaluation, and tuning of individual component codes as well as integrated system code.


Legacy

6.1 Code Legacy

The integrated code legacy of CSAR will be a scalable, plug-and-play simulation code and framework for performing coupled simulations that automatically adapts to changing topologies, plus a collection of validated state-of-the-art physics modules that may be used to solve a wide variety of multiphysics problems. Further, the simulation code will enable exploration of scientific and engineering issues in a broad array of complex fluid-structure interactions.

Fig. 6.1: ASC Program advances national strengths in computational and simulation science.

Software Engineering Principles


Critical to the code legacy is adherence to center-wide software engineering practices that enable ease of authorship, maintenance, and multiauthor coding. We propose additions and extensions, to be developed during Years 6-10, to software engineering as currently practiced at CSAR. These include development of automated multiplatform build and test
suites, development of a grid-error framework for error estimation, construction of further
verification and validation problems, and implementation of other software engineering
practices. The proposed work on SE and V&V for the second five years of CSAR is discussed as separate topics in this Statement of Work, although the two form a coherent whole in
terms of software engineering process. The activities discussed have been chosen to provide
the highest return to the Center, with the least cost, and the least time taken from researchers.
They have not been chosen to address any specific level of software engineering practice, but are among the best practices, as evidenced by their use in other large organizations developing complex software.
Code Dissemination
A year ago we negotiated a Space Act Agreement with NASA Marshall Space Flight Center to exchange the integrated code for test and flight data from the RSRM to be used in our
validation studies. This agreement enables NASA to execute the code to simulate the performance of other launch vehicles on a nonexclusive, royalty-free basis. More recently we
have worked with Atlantic Research Corporation, a major U.S. manufacturer of satellite
steering rockets, to use the GEN2 integrated code on one of their SP motor designs. In addition, the CSAR External Advisory Board members continue to be excited about the potential
for use of our integrated simulation tool to explore their proprietary rocket designs. The level
of interest in industry, combined with that of NASA, indicates that long-term development,
maintenance, and dissemination of the code is worthwhile.
6.2 Center Legacy

The research of the ASC/ASAP centers is expected to drive advances in critical computer
and computational science areas. The major goals of the Alliance Program were to:

• Solve science and engineering problems of national importance through the use of large-scale, multidisciplinary modeling and simulation.

• Establish and validate large-scale modeling and simulation as a viable scientific methodology across scientific applications requiring both integration across disciplines and complex simulation sequences.

• Enhance overall ASC efforts by engaging academic experts in computer science, computational mathematics, and simulations in science and engineering.

• Leverage relevant research in the academic community, including basic science, high-performance computing systems, and computational environments.

• Strengthen education and research in areas critical to the long-term success of ASC and the Stockpile Stewardship Program.

• Strengthen ties among the Defense Programs laboratories and participating U.S. universities.

CSAR has been remarkably successful in helping ASC fulfill these vital goals.
The Center has had a major impact on the University of Illinois in a variety of ways.
Above all, it has engendered an unprecedented level of collaboration across disciplines and
departments. Even within single disciplines, such as fluid dynamics or structural analysis,
faculty collaboration across departmental lines has been enhanced enormously. As a result,
the Center has become a model for other interdisciplinary, interdepartmental research initiatives. In addition, because of the broad applicability of the technologies it represents, CSAR
has also provided leverage to, and benefited greatly from, many other separately funded programs on our campus, both individual faculty research grants and other large centers such as
NCSA.
Research Initiatives and Computational Simulation
Several new College of Engineering federally-funded research initiatives have sprung from
CSAR, and the Center is viewed as a source of talent and expertise in interdisciplinary management. The NSF/DARPA Center for Process Simulation and Design was proposed and
funded in 1998 (three-year grant, $2.8 million) to explore simulation in materials manufacturing and solidification processes. This past year, a five-year, $3 million extension to CPSD was funded by NSF to support "Multiscale Models for Microstructure Simulation and Process Design."
A complementary center proposal was developed this past year and submitted to NASA
under the University Research, Engineering, and Technology Institutes program. This still-open proposal requested the establishment of the Space Access Institute at the University
of Illinois to explore technologies needed for third generation launch vehicles. CSAR investigators, students, and Rocflo-MP may be involved in the study of airframe computational
fluid dynamics and heat transfer for launch vehicles being designed for use in 2020 and beyond.
Together with new collaborators from our own campus and elsewhere, we are planning to
apply the codes developed at CSAR to entirely new problems, both within the aerospace field
(e.g., designing new inlet technologies for supersonic jets) and well beyond (e.g., designing
implantable ventricular assist devices for heart patients). The opportunities for CSAR legacies such as these can be expected to grow in the coming years.
The Center has enhanced the awareness on our campus of computational simulation, and
it has substantially increased the visibility and influence of our interdisciplinary Computational Science and Engineering (CSE) Program, which administratively houses the Center.
The computationally oriented, interdisciplinary educational program provided by CSE fits
perfectly with the needs of CSAR, and the students in this program are ideally trained to participate in the research activities of the Center. CSE courses are specially designed to lower
the usual barriers to interdisciplinary course work and enable students to master both applied
and computational disciplines.
Staff and Students
By hiring more than 50 new professional staff and postdoctoral associates during the first
five years of the program, the Center has significantly enlarged the local technical talent
pool, providing a whole new set of collaborators for existing faculty and staff. The Center
has also hosted a number of visitors, both long-term and short-term, and has organized a very
popular seminar series that was designed specifically to reach out across disciplinary boundaries to enhance collaboration. All of these will continue.
The Center spans ten UIUC units, and its recognition and influence are pervasive
throughout the College of Engineering and beyond. We work very closely with NCSA,
which contributes both research personnel and computer time toward our effort. Several key
members of our research team are also research scientists at NCSA. It has been especially
convenient to do initial code development locally on parallel systems in CSE and NCSA preceding full implementation on the remote ASC platforms.
Another major impact of the Center has been on graduate education and training. CSAR
plays a major role in educating a new generation of scientists and engineers prepared to work
in computational simulation of complex systems by supporting more than forty graduate students at any given time. By virtue of this experience, the students we train are already attuned
to the needs of interdisciplinary collaboration. The level of involvement by undergraduate
students has been growing and will continue to expand.


Addendum

Our strategy for efficient use of additional funds includes supporting both new and expanded
efforts. We propose establishing three new research projects (the first two of which are on
our critical path and hence will accelerate the completion of the integrated code), reestablishing the research support for our two off-campus subcontractors that was cut due to the
budget reduction between Y5 and Y6, and expanding the effort of three continuing projects.
A.1 New Research Efforts

Time Zooming – Fluid Dynamics Group


The propellant regression time scale is much longer than any other physical time scale in our
rocket simulations. This calls for a multi-time-scale approach, which we have dubbed time-zooming. This approach can be formalized asymptotically, but only an informal description
is given here. On the time scale of the propellant regression, all the more rapid processes average out, leaving a description in terms of the space-dependent average regression rate,
which varies only slowly in time. On the time scale of the fluid flow (for example), the regression is so slow that it has no effect. These characterizations can be used to formulate a
mixed time-stepping scheme that accomplishes time-zooming. In one method, one performs
a fast time simulation and averages it for a time sufficient to determine an average regression
rate. Then this average regression rate is used to advance the propellant surface in time on the
regression time scale. The result is a macroscopic change in core geometry. This new geometry is then used to do another fast-time simulation, and the process is repeated. There are
a number of practical issues to sort out, such as how long one should run each type of simulation, and how to most effectively restart the fast-time simulation. There are also refinements to the general approach that should be explored. For example, for the solid, which has
a meaningful steady solution, a quasi-steady solver could be used during the long time-scale
simulation.
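
Schematically, the cycle reads as follows (a sketch with hypothetical hooks standing in for the fast-time solver, the surface regression step, and remeshing; choosing the fast-time window and restarting consistently are among the open issues noted above):

// Time-zooming skeleton: alternate short fast-time simulations with
// long advances of the propellant surface at the averaged regression
// rate, repairing the mesh after each macroscopic geometry change.
#include <functional>
#include <vector>

struct Geometry { /* core/propellant geometry, elided in this sketch */ };

void timeZoom(Geometry& geom, double tFinal, double tFast, double dtSlow,
    std::function<std::vector<double>(Geometry&, double)> fastSimAvgRate,
    std::function<void(Geometry&, const std::vector<double>&, double)> regress,
    std::function<void(Geometry&)> remesh) {
  for (double t = 0.0; t < tFinal; t += dtSlow) {
    // 1. run a fast-time simulation just long enough to converge an
    //    average regression rate over the burning surface
    std::vector<double> rbar = fastSimAvgRate(geom, tFast);
    // 2. advance the surface on the slow (regression) time scale
    regress(geom, rbar, dtSlow);
    // 3. repair or regenerate the mesh for the new core geometry before
    //    restarting the next fast-time simulation
    remesh(geom);
  }
}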
Radiation – Fluid Dynamics Group
It is important to compute the radiation field within the rocket chamber as it influences the
energy balance of the core and nozzle flow, and more importantly, the burn rate of the propellant. This requires the evaluation of the radiation equations along with the hydrodynamic
equations. The opacity of the medium plays an important role in determining the level of radiation modeling required. During the ignition transient, radiation occurs mainly through a
transparent gaseous medium, while under fully-ignited burn conditions the medium can be
considered to be optically thick, except within a very thin propellant combustion zone.
Two classes of radiation modeling will be pursued: (a) the equilibrium diffusion approximation, and (b) the flux-limited diffusion approximation. The first of these is simpler and applies where radiation can be assumed to be a local phenomenon, as expected for an optically thick medium. The second approach is expected to be able to bridge
the transition between the optically thick and optically thin regions in the core flow. Both
these approaches will be implemented as part of Rocrad.
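
For reference, one common form of the gray flux-limited diffusion closure is shown below, using the Levermore limiter as an illustration; the closure actually adopted in Rocrad may differ:

\begin{align*}
  &\frac{\partial E}{\partial t} - \nabla\cdot\left(D\,\nabla E\right)
    = c\,\kappa\left(aT^4 - E\right),\\
  &D = \frac{c\,\lambda(R)}{\kappa}, \qquad
   \lambda(R) = \frac{2+R}{6+3R+R^2}, \qquad
   R = \frac{|\nabla E|}{\kappa E}.
\end{align*}

The limiter recovers equilibrium diffusion ($\lambda \to 1/3$) in optically thick regions and caps the flux at the free-streaming limit $cE$ where the medium is optically thin, which is exactly the bridging behavior sought.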


Macroscale Response Via Physics-Based Atomistic Calculations – D. Johnson, Y. Huang, and A. Beaudoin (Structures and Materials Group)
Direct embedding of atomistic models within continuum simulations represents a significant
advance in multiscale materials modeling. Direct embedding has advantages in certain
classes of problems, but is not always sensible. Therefore, we propose a two-pronged approach to coupling atomistic and continuum models. First, we will use first-principles calculations to compute effective properties that govern continuum models. To this end, we will
develop a physics-based, system-dependent interaction potential for (initially metallic) alloys
that will provide direct input to, for example, the Virtual Internal Bond microstructural models of Y. Huang, and the strength reduction simulations of A. Beaudoin. We propose a sensible data-interface strategy to connect first-principles techniques directly to mesoscale FEM
response models (e.g., estimates of interfacial strength, or solute strength reduction at grain
boundaries). The data-interface strategy allows for quick modifications and improvements to
individual methods without having to modify software interfaces or other analysis modules.
A second part of this project involves a novel scheme for directly embedding a quantum-mechanical atomistic zone within a continuum finite element model.
A.2 Research Subcontractors

Propellant Kinetics – M. Smooke, Yale University (Combustion and Energetic Materials Group)
Computers are not powerful enough for three-dimensional, unsteady calculations using real
rocket propellant chemistry. For this reason, simplified chemistry must be adopted, defined
either by reduced chemistry strategies or by ad hoc models. These simplifications must be
validated by comparison with full simulations in simpler geometries. The Yale program is
pursuing this path and will examine the use of ethane as a surrogate for the propellant binder
fuel. Combustion models that are used in the simulation of pollutant formation, ignition phenomena, and in the study of chemically controlled extinction limits, often combine detailed
chemical kinetics with complicated transport phenomena. As the number of chemical species
and the geometric complexity of the computational domain increases, the modeling of such
systems becomes computationally prohibitive on even the largest supercomputer. While parallel architectures and algorithmic improvements have the potential of enhancing the level of
problems one can solve, the modeling of three-dimensional time-dependent systems with
complex transport and detailed finite rate chemistry will remain beyond the reach of combustion researchers for several years to come. The situation is even less promising if one
wants to consider direct numerical simulation of turbulence with finite rate chemistry. While
some applications can be studied effectively by lowering the dimensionality of the computational domain, there are many systems in which this is neither feasible nor scientifically
sound.
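
As a concrete example of the simplest such simplification (illustrative only; the Yale effort will derive reduced mechanisms systematically rather than assume this form), a one-step global reaction collapses the detailed mechanism into a single Arrhenius rate whose constants must be calibrated against full-chemistry solutions:

\[
  \text{Fuel} + \nu\,\text{Oxidizer} \rightarrow \text{Products},
  \qquad
  \dot{\omega} = A\,[\text{F}]^{a}\,[\text{Ox}]^{b}
                 \exp\!\left(-\frac{E_a}{R_u T}\right),
\]

where the prefactor $A$, the exponents $a$ and $b$, and the activation energy $E_a$ are fitted parameters.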
Mesh Generation – T. Baker, Princeton University (Computer Science Group)
Work will continue on software modules needed by CSAR for 3-D tetrahedral mesh generation, rapid generation of very large tetrahedral meshes by extrusion from a 2-D mesh, and a
3-D mesh repair algorithm. The module development is driven by our immediate needs for
mesh motion and repair for the moving inhibitor problem and to track the propagation of cracks in the solid propellant. The 3-D mesh repair capability will be interfaced with Rocflu
to enable more realistic and more challenging SRM simulations. For the moving inhibitor
problem, the deformation of the boundary surface is sufficiently mild that it is possible to
maintain the connectivity of the surface triangulation throughout the simulation. For the
crack propagation problem, however, there are significant changes in the extent and shape of
the boundary surface. New techniques must be developed to modify and repair a boundary
surface that is undergoing substantial change.
A.3 Expanded Efforts

Parallel Implicit Finite Element Computations – K. D. Hjelmstad and D. A. Tortorelli (Structures and Materials Group). Add 0.75 postdoctoral research associate.

Multiscale Modeling of Propellants and Energetic Materials – D. S. Stewart (Combustion and Energetic Materials Group). Add 0.5 postdoctoral research associate.

Component Frameworks – L. V. Kale (Computer Science Group). Add one graduate research assistant.


Center for Simulation of Advanced Rockets

University of Illinois at Urbana-Champaign


2262 Digital Computer Laboratory
1304 West Springfield Avenue
College of Engineering
University of Illinois at Urbana-Champaign
Urbana, Illinois 61801 USA
Telephone 217 333-3247
Fax 217 333-1910
www.csar.uiuc.edu
