of physics. The focus is on the appearance of chaos in a beam distribution. The study of the problem is based on two
observations. The first observation is that, using the Lyapunov method and its extensions, we obtain solutions of partial
differential equations. Using this approach we discuss the problem of finding a solution of the Vlasov-Poisson equation, i.e., a
stationary solution in which the magnetic field is treated as a disturbance with a small parameter. The solution of the Vlasov
equation then takes the form of an asymptotic series for which the solution of the Vlasov-Poisson equation is the basis solution.
The second observation is that physical chaos is a weak limit of the well-known Landau bifurcations. We have proved this fact
using ideas on the Nature of Turbulence.
NTIS
Partial Differential Equations; Vlasov Equations; Poisson Equation; Particle Accelerators; Beams (Radiation)
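For context, the Vlasov-Poisson system presumably being perturbed here can be written in a standard textbook normalization; the abstract does not give the authors' normalization or the precise form of their small parameter, so the following is only a reference statement:

    \begin{align}
      \frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{x}} f
        + \frac{q}{m}\,\mathbf{E}\cdot\nabla_{\mathbf{v}} f &= 0, \\
      \nabla\cdot\mathbf{E} &= \frac{q}{\varepsilon_0}\int f(\mathbf{x},\mathbf{v},t)\,d^{3}v .
    \end{align}

Restoring the magnetic force term (q/m)(v x B) . grad_v f as a small disturbance then generates the asymptotic series mentioned above, with the Vlasov-Poisson solution as its basis term.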
20050169872 Brookhaven National Lab., Upton, NY, USA
Surface Reactions Studied by Synchrotron Based Photoelectron Spectroscopy
Hrbek, J.; 1999; 70 pp.; In English
Report No.(s): DE2004-770789; BNL-66043; No Copyright; Avail: Department of Energy Information Bridge
The goal of this article is to illustrate the use of synchrotron radiation for investigating surface chemical reactions by
photoelectron spectroscopy. A brief introduction and background information are followed by examples of layer-resolved
spectroscopy and of the oxidation and sulfidation of metallic, semiconducting, and oxide surfaces.
NTIS
Chemical Reactions; Photoelectron Spectroscopy; Surface Reactions; Synchrotron Radiation
20050169873 Brookhaven National Lab., Upton, NY, USA, Florida Univ., Gainesville, FL, USA
Investigation of Coherent Emission from the NSLS VUV Ring
Carr, G. L.; Kramer, S. L.; Murphy, J. B.; La Veigne, J.; Lobo, R. P. S. M.; Mar. 1999; 10 pp.; In English
Report No.(s): DE2004-770804; BNL-66994; No Copyright; Avail: Department of Energy Information Bridge
Bursts of coherent radiation are observed from the NSLS VUV ring near a wavelength of 7 mm. The bursts occur when
the electron beam current exceeds a threshold value which itself varies with ring operating conditions. Beyond threshold, the
average intensity of the emission is found to increase as the current squared. With other parameters held nearly constant, the
threshold current is found to increase quadratically with the synchrotron frequency, indicating a linear dependence on
momentum compaction. It is believed that the coherent emission is a consequence of micro-bunching of the electron beam due
to the microwave instability.
NTIS
Coherent Radiation; Synchrotrons; Synchrotron Radiation; Ultraviolet Radiation
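The scalings reported above can be summarized compactly; the proportionality constants are not given in the abstract, and the final step uses the standard storage-ring relation that the synchrotron frequency scales as the square root of the momentum compaction factor:

    \begin{align}
      \langle P_{\mathrm{coh}} \rangle &\propto I_b^{2} \qquad (I_b > I_{\mathrm{th}}),\\
      I_{\mathrm{th}} &\propto f_s^{2} \propto \alpha_c ,
    \end{align}

where I_b is the beam current, I_th the burst threshold current, f_s the synchrotron frequency, and alpha_c the momentum compaction factor.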


20050169875 Brookhaven National Lab., Upton, NY
Instabilities in the Spallation Neutron Source (SNS)
Blaskiewicz, M.; 1999; 10 pp.; In English
Report No.(s): DE2004-770762; BNL-65933; No Copyright; Avail: Department of Energy Information Bridge
The 2 MW Spallation Neutron Source (SNS) will have a D.C. beam current of 40 A at extraction, making it one of the
world's most intense accelerators. Coherent instabilities are a major concern and efforts to predict beam behavior are described.
NTIS
Neutron Sources; Spallation; Stability; Particle Accelerators
20050169876 Brookhaven National Lab., Upton, NY
Flying Wire System in the AGS
Huang, H.; Buxton, W.; Mahler, G.; Marusic, A.; Roser, T.; 1999; 10 pp.; In English
Report No.(s): DE2004-770758; BNL-65923; No Copyright; Avail: Department of Energy Information Bridge
As the AGS prepares to serve as the injector for RHIC, monitoring and control of the beam transverse emittance have become
an important topic. Before the installation of the flying wire system, the emittance was measured with ionization
profile monitors in the AGS, which require correction for space-charge effects. It is desirable to have a second means of
measuring the profile that is less dependent on intensity. A flying wire system has recently been installed in the AGS to perform
this task. This paper discusses the hardware and software setup and the capabilities of the system.
NTIS
Emittance; Wire; Synchrotrons
20050169878 Brookhaven National Lab., Upton, NY, USA
Design of a Resonant Extraction System for the AGS Booster
Brown, K.; Cullen, J.; Glenn, J. W.; Lee, Y. Y.; McNerney, A.; 1999; 10 pp.; In English
Report No.(s): DE2004-770771; BNL-65976; No Copyright; Avail: Department of Energy Information Bridge
The Booster Application Facility (BAF) will employ heavy ion beams of many different ion species and at beam energies
ranging from 0.04 to 3.07 GeV/nucleon. Resonant extraction is required in order to deliver a continuous stream of particles.
In this report they describe the beam requirements and the system design. The basic design is a third-integer resonant extraction
process which employs a single thin magnetic septum and a thick septum ejector magnet. The expected extraction efficiency
is about 85%, based on the thin septum thickness and the predicted step size of the resonant beam at the septum. This is more
than sufficient for the low-intensity, low-energy heavy-ion beams needed for the BAF. In this report they present a detailed

discussion of the design of the various elements and a discussion of the detailed modeling of resonant extraction from the AGS
Booster. The extraction process was modeled using a BNL version of MAD, which allowed them to interactively observe
detailed particle tracking of the process. Having this tool in hand permitted them to pose and answer various questions in a
very short period of time.
NTIS
Extraction; Ion Beams; Structural Design
20050169883 Brookhaven National Lab., Upton, NY
Collimator Systems for the SNS Ring
Ludewig, H.; Simos, N.; Walker, J.; Thieberger, P.; Aronson, A.; 1999; 10 pp.; In English
Report No.(s): DE2004-770754; BNL-66594; No Copyright; Avail: Department of Energy Information Bridge
The requirements and performance goals for the collimators are to reduce the uncontrolled beam loss by 2 x 10(sup -4),
absorb 2 kW of deposited heat, and minimize production and leakage of secondary radiation. In order to meet these
requirements a self-shielding collimator configuration consisting of a layered structure was designed. The front layers (in the
direction of the proton beam) are relatively transparent to the protons, and become progressively less transparent (blacker)
with depth into the collimator. In addition, a high density (iron) shield is added around the outside. The protons will be stopped
in the center of the collimator, and thus the bulk of the secondary particles are generated at this location. The conceptual design
is described, the method of analysis is discussed, and preliminary performance parameters are outlined.
NTIS
Collimators; Proton Beams
20050169884 Brookhaven National Lab., Upton, NY
Optimization of the Parameters in the RHIC Single Crystal Heavy Ion Collimation
Biryukov, V. M.; Chesnokov, Y. A.; Kotov, V. I.; Trbojevic, D.; Stevens, A.; 1999; 10 pp.; In English
Report No.(s): DE2004-770753; BNL-69593; No Copyright; Avail: Department of Energy Information Bridge
In the framework of the project to design and test a collimation system prototype using bent channeling crystal for
cleaning of the RHIC heavy ion beam halo, the authors have studied the optimal length and bending angle of a silicon (110)
single crystal proposed to be a primary element situated upstream of the traditional heavy amorphous collimator. Besides the
channeling and collimation efficiency, they also looked into the impact the crystal may have on the non-channeled particles
that go on circulating in the ring, so as to reduce the momentum offset of the particles scattered off the crystal.
NTIS

Collimation; Particle Accelerators
20050169885 Brookhaven National Lab., Upton, NY
Design of an AC-Dipole for use in RHIC
Parker, B.; Bai, M.; Jain, A.; McIntyre, G.; Meth, M.; 1999; 10 pp.; In English
Report No.(s): DE2004-770751; BNL-66578; No Copyright; Avail: Department of Energy Information Bridge
The authors present two options for implementing a pair of AC-dipoles in RHIC for spin flipping, measuring linear optical
functions, and nonlinear diagnostics. AC-dipoles are magnets that can be adiabatically excited and de-excited with a continuous
sine-wave in order to coherently move circulating beam out to large betatron amplitudes without incurring emittance blow up.
The AGS already uses a similar device for getting polarized proton beams through depolarizing resonances. By placing the
magnets in the IP4 common beam region, two AC-dipoles are sufficient to excite both horizontal and vertical motion in both
RHIC rings. While they initially investigated an iron-dominated magnet design using available steel tape cores, they now
favor a new air-coil-plus-ferrite design featuring mechanical frequency tuning, in order to best match available resources to
the demanding frequency-sweeping requirements. Both magnet designs are presented here along with model magnet test results.
The challenge is to make AC-dipoles available for year 2000 RHIC running.
NTIS
Particle Accelerators; Betatrons; Continuous Radiation
20050169886 Brookhaven National Lab., Upton, NY
BNL-Built LHC Magnet Error Impact Analysis and Compensation
Ptitsin, V.; Tepikian, S.; Wei, J.; 1999; 10 pp.; In English
Report No.(s): DE2004-770749; BNL-66506; No Copyright; Avail: Department of Energy Information Bridge
Superconducting magnets built at the Brookhaven National Laboratory will be installed in both Insertion Regions IP2
and IP8, and in the RF Region of the Large Hadron Collider (LHC). In particular, the field quality of these IR dipoles will become
important during LHC heavy-ion operation, when the (beta)* at IP2 is reduced to 0.5 meters. This paper studies the impact
of the magnetic errors in BNL-built magnets on LHC performance at injection and collision, both for proton and heavy-ion
operation.
NTIS
Particle Accelerators; Superconducting Magnets
20050169951 Lafayette Coll., Easton, PA, USA
Volumetric and Optical Studies of High-Pressure Phases of MgSO4-H2O with Applications to Europa and Mars

Hogenboom, D. L.; Dougherty, A. J.; Kargel, J. S.; Mushi, S. E.; Lunar and Planetary Science XXXVI, Part 8; [2005]; 2 pp.;
In English; See also 20050169945; Original contains color and black and white illustrations; Copyright; Avail: CASI;
A01, Hardcopy; Available from CASI on CD-ROM only as part of the entire parent document
We report the first measurements and images obtained using a new high-pressure volumetric cell with sapphire windows
to study phase equilibria in a 17 wt.% sample of MgSO4 in H2O. Magnesium sulfate was chosen for study because it is
regarded as among the most likely constituents of Europa’s ocean and icy shell and constitutes key salts on Mars. The 17 wt.%
composition is close to the eutectic. The new data, when combined with data from our earlier study of the density vs. pressure
and temperature of MgSO4 solutions, will enable us to identify the phases with greater certainty and describe the phase
transitions with greater precision. For example, we observe that the process of solidification of the supercooled sample
involves a sequence in which a fine-grained structure forms rapidly, followed by the generation of liquid and then slower
growth of large-grained crystals. The addition of visual images to our capability to track the changes in sample volume is also
valuable to assess both stable and reversible phase changes and metastable phase transitions. Metastability has proven a key
aspect of this system in the lab and in nature. Additional information is included in the original extended abstract.
Author (revised)
Magnesium Sulfates; Water; Volumetric Analysis; High Pressure; Phase Stability (Materials)
20050170464 SEMATECH, Austin, TX, USA
The Quantitation of Surface Modifications in 200 and 300 mm Wafer Processing with an Automated Contact Angle
System
Carpio, Ronald; Hudson, David; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop;
[1998], pp. 272-277; In English; See also 20050170458; Copyright; Avail: Other Sources
Contact angle measurement, using advanced instrumentation, is assuming an increased role in monitoring those
semiconductor manufacturing processes which modify the surface characteristics of wafers. Such measurements can provide
rapid, nondestructive, and spatially as well as time resolved data in an automated mode. This information can be related to
processing uniformity and can in many cases provide information on the chemical state of the surface. Illustrations are
provided in the wafer cleaning, lithography, and interconnect areas. New application areas illustrated include measuring the
uniformity of UV photostabilization processes, measurement of contrast curves, and determination of receding and advancing
contact angles of processed copper wafers.
Author
Quantitative Analysis; Surface Defects; Wafers; Automatic Control

20050170468 International Business Machines Corp., Essex Junction, VT, USA
Wafer Line Productivity Optimization in a Multi-Technology Multi-Part-Number Fabricator
Maynard, Daniel N.; Rosner, Raymond J.; Kerbaugh, Michael L.; Hamilton, Richard A.; Bentlage, James R.; Boye, Carol A.;
1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998], pp. 34-42; In English; See
also 20050170458; Copyright; Avail: Other Sources
Successful semiconductor manufacturing is driven by wafer-level productivity. Increasing profits by reducing
manufacturing cost is a matter of optimizing the factors contributing to wafer productivity. The major wafer productivity
components are chips per wafer (CPW), wafer process or fabricator yield (WPY), and wafer final test (WFT) or functional yield.
CPW is the count of product chips fitting within the usable wafer surface, and is dependent upon the chip size, dicing channel
(kerf) space, and wafer-field size. WPY is the percentage of wafers successfully exiting the line; losses include scrap from
broken wafers and wafers that fail specifications. WFT is the percentage of chips that meet all final parametric and functional
electrical test specifications.
Derived from text
Wafers; Productivity; Optimization; Chips
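As a rough illustration of the decomposition described in the entry above, the three components multiply to give good chips per wafer started; the function name and the example numbers below are hypothetical, not taken from the paper.

    # Sketch of the wafer productivity decomposition described above.
    # The CPW, WPY, and WFT values are made-up examples, not data from the paper.
    def good_chips_per_wafer_start(cpw: int, wpy: float, wft: float) -> float:
        """Good chips per wafer started = chips/wafer x line yield x final-test yield."""
        return cpw * wpy * wft

    if __name__ == "__main__":
        cpw = 250   # product chips fitting on the usable wafer surface
        wpy = 0.95  # fraction of wafers successfully exiting the line
        wft = 0.80  # fraction of chips passing final electrical test
        print(good_chips_per_wafer_start(cpw, wpy, wft))  # -> 190.0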
20050170471 International Business Machines Corp., Essex Junction, VT, USA
Correlation of Digital Image Metrics to Production ADC Matching Performance
Blais, Jennifer; Fischer, Verlyn; Moalem, Yoel; Saunders, Matthew; 1998 IEEE/SEMI Advanced Semiconductor
Manufacturing Conference And Workshop; [1998], pp. 86-92; In English; See also 20050170458; Copyright; Avail: Other
Sources
Automatic Defect Classification (ADC) tool matching requires that consistent-quality images be captured on all tools. Image
metrics have been developed, and the variance of these metrics has been correlated to classifier matching. It is shown that
in order to maintain matching, image color balance, focus, and shadowing need to be monitored and maintained at acceptable
values. Of these metrics, inappropriate color balance has the greatest effect on matching.
Author
Image Analysis; Defects; Classifications
20050170476 Fairchild Semiconductor Corp., South Portland, ME, USA
In-Situ Gate Oxide/Electrode Deposition for a 0.5 micron BiCMOS Process Flow
Carbone, Thomas A.; Solomon, Gary; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop;
[1998], pp. 174-180; In English; See also 20050170458; Copyright; Avail: Other Sources

A method of depositing the gate oxide and electrode in a single chamber for BiCMOS processing is discussed. The
advantages of the deposition of in-situ gate electrode (DIGE) over the conventional two-step oxidation and polycrystalline
silicon deposition are related to cycle time and increased gate oxide integrity. TEM images and a correlation to metrology
measurements are presented.
Author
Deposition; Oxides; Gates (Circuits); Electrodes
20050170492 Analog Devices, Inc., Woburn, MA, USA
A Comparison of Critical Area Analysis Tools
Fitzpatrick, Sean; O'Donoghue, Geoffrey; Cheek, Gary; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing
Conference And Workshop; [1998], pp. 31-33; In English; See also 20050170458; Copyright; Avail: Other Sources
The application of Critical Area Analysis has become more mainstream in the semiconductor industry. The critical area
of a circuit is a measure of the sensitivity of a product layout to defects, which is subsequently used in accurate yield models.
Intuitively, if a circuit is more dense, its defect sensitivity is higher than that of a less dense circuit. Only recently have commercial
tools become available to measure critical area. Several approaches have been developed to measure layout critical area; a
short summary of each approach is given, along with a brief description of how critical area is incorporated into a yield
model. The results of applying critical area analysis are then described.
Author
Semiconductors (Materials); Defects; Layouts
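One common way critical area enters a yield model is through a Poisson-type expression, in which the expected number of killer defects is the defect density times the critical area; the abstract does not say which model the commercial tools use, so the sketch below, with made-up numbers, only illustrates the idea.

    # Poisson yield model driven by critical area: Y = exp(-D0 * A_crit).
    # The D0 and A_crit values below are illustrative only.
    import math

    def poisson_yield(defect_density_per_cm2: float, critical_area_cm2: float) -> float:
        """Probability that a die has no killer defects."""
        return math.exp(-defect_density_per_cm2 * critical_area_cm2)

    print(poisson_yield(0.5, 0.4))  # denser layout, larger critical area, lower yield
    print(poisson_yield(0.5, 0.1))  # sparser layout, smaller critical area, higher yield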
20050170495 Georgia Inst. of Tech., Atlanta, GA, USA
Towards Real-Time Fault Identification in Plasma Etching Using Neural Networks
Zhang, Ben-Yong; May, Gary S.; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop;
[1998], pp. 61-65; In English; See also 20050170458
Contract(s)/Grant(s): NSF DDM-93-58163; Copyright; Avail: Other Sources
As the IC industry moves further into submicron fabrication technology, optimal utilization of fabrication
equipment is essential. Timely and accurate equipment malfunction identification can be a key to success. It is also desirable
to predict malfunctions well in advance of their actual occurrence. In this paper, we use neural networks to model time series
data extracted from a three-step plasma etch process for defining active areas in a CMOS ASIC circuit. The data consists of
real-time measurements from the three-step etch process for 40,000 silicon wafers collected over a six-month period from
a Drytek plasma etcher. Two types of anomalies were present in this data: 1) constant or slowly advancing time (indicating

the presence of a machine fault); and 2) missing steps (indicating something unexpected happened during the etch). Data
preprocessing is carried out to eliminate any data acquisition errors in the original data and to correctly separate the total time
sequence into three sub-sequences (one for each etch step). A pattern recognition technique is used to determine the process
step number for each record. The classification results and the prediction error demonstrate accurate determination of the etch
step number from the chamber state. Dynamic neural network models are then constructed for each step. We initially focus
on modeling the time series associated with chamber pressure. The time series of pressure data is modeled as a function of
its previous values and the current time. We use this approach to construct time series models of the pressure variations in the
etching system using only an initial condition and the time value as inputs.
Author
Real Time Operation; Fault Detection; Plasma Etching; Neural Nets; Pattern Recognition
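The time-series structure described above (pressure modeled from its previous values and the current time) can be sketched as follows. A plain least-squares autoregression stands in for the paper's dynamic neural network, and the lag count and synthetic pressure trace are assumptions for illustration only.

    # Sketch: model p[t] from its previous values and the time index, mirroring the
    # structure described in the abstract. A linear least-squares fit is used here
    # as a simple stand-in for the paper's dynamic neural network.
    import numpy as np

    def fit_autoregressive(p: np.ndarray, lags: int = 3) -> np.ndarray:
        """Fit p[t] ~ w0 + w1*t + sum_k a_k * p[t-k] by least squares."""
        t = np.arange(lags, len(p))
        X = np.column_stack([np.ones_like(t, dtype=float), t.astype(float)]
                            + [p[t - k] for k in range(1, lags + 1)])
        coeffs, *_ = np.linalg.lstsq(X, p[t], rcond=None)
        return coeffs

    # Synthetic "chamber pressure" trace, for demonstration only.
    time = np.arange(200)
    pressure = 100.0 + 5.0 * np.exp(-time / 50.0) + 0.1 * np.random.randn(200)
    print(fit_autoregressive(pressure))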
20050170497 Cypress Semiconductor Corp., San Jose, CA, USA
Development of New Methodology and Technique to Accelerate Region Yield Improvement
Wong, K.; Mitchell, P.; Nulty, J.; Carpenter, M.; Kavan, L.; Jin, B.; McMahon, G.; Seams, C.; Fewkes, J.; Gordon, A.;
Sandstrom, C.; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998], pp. 82-86; In
English; See also 20050170458; Copyright; Avail: Other Sources
A focus on region yield is demonstrated to improve the systematic yield from 75% to the upper 90% range and to achieve a
quick defect-density learning curve on new products. A learning curve that drives both the random and systematic yield
simultaneously is important to accelerate yield learning on new products as well as existing products. This paper shows the
systematic yield improvement traced from a module integration issue to an equipment setup and capability issue. A new
methodology has been defined to look at the edge region of the wafer, and it is used to address wafer edge issues with
systematic approaches to drive yield improvement. The process variability at the center of the wafer is low, but as one
approaches the edge of the wafer, large process variations arise which depress the yield at the wafer edge. This decrease in
yield can be caused by technology architecture, process uniformity, wafer misalignment, and mark alignment scheme issues.
Author
Yield; Defects; Learning Curves; Technology Assessment
20050170498 International Business Machines Corp., Essex Junction, VT, USA
Intelligent Line Monitor: Maximum Productivity through an Integrated and Automated Line Monitoring Strategy
Pilon, Tom; Burns, Mark; Fischer, Verlyn; Saunders, Matthew; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing
Conference And Workshop; [1998], pp. 93-102; In English; See also 20050170458; Copyright; Avail: Other Sources
This paper describes an Intelligent Line Monitor system and highlights the features which make it superior to

conventional line monitor systems. By citing examples from an IBM 0.25 micron technology fabricator, we show that an
integrated and automated line monitoring strategy reduces time-to-results, provides a low cost-of-ownership, and delivers a
short time to return-on-investment. The natural expansion and growth possibilities of such a system are also explored.
Author
Monitors; Smart Structures; Automatic Control
20050170512 Analog Devices, Inc., Wilmington, MA, USA
Manufacturing and Reliability Improvements in Metal-Oxide-Metal Capacitors - MOMCAPs
Lowell, Larry; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998], pp. 181-186;
In English; See also 20050170458; Copyright; Avail: Other Sources
Metal-Oxide-Metal Capacitors, MOMCAPs, have historically demonstrated less than optimal leakage and breakdown
characteristics and yields. Additionally, the Cpk for capacitance is low. Any previous work done to improve the dielectric
uniformity has resulted in further degradation of the capacitor characteristics. In this paper we will show that the parametric
and reliability characteristics are very dependent on the bottom plate material. Our standard Ti bottom plate interacts with the
capacitor dielectric resulting in degraded performance. That interaction renders a more uniform dielectric film unusable. We
have developed a MOMCAP using TiW as the bottom plate electrode, which minimizes those interactions and improves
capacitor characteristics.
Author
Manufacturing; Mom (Semiconductors); Reliability; Capacitors
20050170522 KLA-Tencor Corp., Orlando, FL, USA
Correlation of Ellipsometric Modeling Results To Observe Grain Structure for OPO Film Stacks
Robinson, Tod E.; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998],
pp. 278-288; In English; See also 20050170458; Copyright; Avail: Other Sources
One significant, but potentially variable, parameter in the deposition and subsequent processing of polysilicon is its
microstructure. The purpose of this work was to correlate the model parameters, in this case, percent volume fraction of phase
components of polysilicon, generated by regression of model dispersion using Bruggerman Effective Media Approximation
to data acquired by the Spectroscopic Ellipsometry technique. Several samples are prepared consisting of SiO2/Undoped Poly
Si / SiO2 film stacks in order to measure their as-deposited average grain sizes. Ellipsonometric data is obtained for the center
site of each sample which are then compared to AFM results from similar samples. Various grain geometry approximations
are applied along with the assumption that the polysilicon structure may be modeled to consist of three components; crystalline

Si in a continuous Amorphous Si matrix, and voids. A mathematical relation is established between the percent concentration
of crystalline Silicon and the mean grain size for the two cases of equiaxed and columnuar microstructures. Results indicate
there to be good correlation withAFM measured grain sizesAdditional work is required to further demonstrate the correlation,
and develop software applications to enable in-line product monitoring.
Author
Ellipsometry; Grain Size; Silicon Polymers; Microstructure; Mathematical Models
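For reference, the multi-component Bruggeman effective medium approximation typically used in this kind of regression is written as follows; the specific dispersion models and fitted volume fractions from the paper are not given in the abstract:

    \sum_{i} f_i \,\frac{\varepsilon_i - \varepsilon_{\mathrm{eff}}}
                       {\varepsilon_i + 2\,\varepsilon_{\mathrm{eff}}} = 0,
    \qquad \sum_{i} f_i = 1,

where f_i and epsilon_i are the volume fraction and dielectric function of component i (here crystalline Si, amorphous Si, and voids) and epsilon_eff is the effective dielectric function fitted to the ellipsometric spectra.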
20050170523 International Business Machines Corp., Essex Junction, VT, USA
Beyond Cost-of-Ownership: A Causal Methodology for Costing Wafer Processing
Miraglia, Stephanie; Miller, Peter; Richardson, Thomas; Blunt, Gregory; Blouin, Cathy; 1998 IEEE/SEMI Advanced
Semiconductor Manufacturing Conference And Workshop; [1998], pp. 289-293; In English; See also 20050170458;
Copyright; Avail: Other Sources
Classical cost-of-ownership data provides detailed cost data of equipment assets but does not provide wafer processing
costs. Starting with a cost-of-ownership model, a wafer processing cost model was developed and validated. This
cost-of-processing model provides wafer processing cost data from raw wafer through final passivation and parametric testing.
This new model goes beyond classical cost-of-ownership data and captures more than just equipment costs: process, product,
and fabricator costs are also captured. These costs are then causally spread to wafers via various algorithmic methodologies.
In order to do this, some historical cost problems had to be addressed, such as how to properly weight equipment usage and
account for dedicated equipment requirements, deal with measurement sampling, incorporate idle time and contingency, and
account for different photolithographic field sizes. Output from the model was fully validated against actual spending and tied
to accounting data in order to assure a full dollar capture. The model is currently being used for product costing,
decisionmaking, and cost reduction activities at the IBM Microelectronics Division Manufacturing Facility in Essex Junction,
Vermont.
Author
Cost Reduction; Cost Analysis; Data Processing Equipment; Wafers; Technology Assessment
20050170524 Tefen Ltd., Foster City, CA, USA
Simulation of Test Wafer Consumption in a Semiconductor Facility
Foster, Bryce; Meyersdorf, Doron; Padillo, Jose M.; Brenner, Rafi; 1998 IEEE/SEMI Advanced Semiconductor
Manufacturing Conference And Workshop; [1998], pp. 298-302; In English; See also 20050170458; Copyright; Avail: Other
Sources
A discrete event simulation methodology was developed to assist in managing test wafer usage in semiconductor fabs.

The purpose of modeling test wafer usage is to predict the number of new test wafers required, test wafer WIP levels, and how
to downgrade test wafers to reduce costs of purchasing new test wafers. The test wafer simulation methodology is a detailed
yet accurate way to predict test wafer consumption. The methodology has been implemented in a 200mm development facility
resulting in considerable cost savings by reducing the overall WIP levels of test wafers.
Author
Technology Assessment; Wafers; Performance Tests; Models
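A toy sketch of the bookkeeping such a simulation performs follows; the daily usage rate, downgrade probability, and starting WIP are hypothetical values, not parameters from the paper.

    # Toy model of test-wafer consumption: monitor operations consume wafers each
    # day; a used wafer is either downgraded back into WIP or scrapped.
    # All rates below are illustrative assumptions.
    import random

    def simulate_test_wafers(days: int = 30, start_wip: int = 500,
                             daily_use: int = 40, downgrade_prob: float = 0.6) -> int:
        """Return the number of new test wafers that had to be purchased."""
        random.seed(0)
        wip, purchased = start_wip, 0
        for _ in range(days):
            for _ in range(daily_use):
                if wip == 0:
                    purchased += 1   # WIP exhausted: buy a new test wafer
                else:
                    wip -= 1         # take a wafer from WIP
                if random.random() < downgrade_prob:
                    wip += 1         # wafer downgraded and returned for reuse
        return purchased

    print(simulate_test_wafers())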
20050170525 UniSil Corp., Santa Clara, CA, USA
Improvement of Silicon Wafer Minority Carrier Lifetime Through The Implementation of a Pre-Thermal Donor
Anneal Cleaning Process
Martines, Larry; Wang, Charley; Hardenburger, Tom; Barker, Nancie; Shomers, Brian; 1998 IEEE/SEMI Advanced
Semiconductor Manufacturing Conference And Workshop; [1998], pp. 303-307; In English; See also 20050170458;
Copyright; Avail: Other Sources
In recent years, to meet device makers' continuously shrinking device geometries, the requirements on silicon wafer quality
have become more and more stringent. The silicon wafer minority carrier lifetime, or diffusion length, has now become a
routinely required parameter. It is well known that, in addition to crystal growth, metal contamination is one of the major
limiting factors for the minority carrier lifetime in silicon wafers. It is therefore critical to optimize the silicon wafer
manufacturing process flow to minimize metal contamination sources during wafer processing.
Author
Silicon; Wafers; Carrier Lifetime; Minority Carriers; Cleaning
20050170526 International Business Machines Corp., Essex Junction, VT, USA
Design for Manufacturability: A Key to Semiconductor Manufacturing Excellence
Wilcox, R.; Forhan, T.; Starkey, G.; Turner, D.; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And
Workshop; [1998], pp. 308-313; In English; See also 20050170458; Copyright; Avail: Other Sources
This paper reviews measures of manufacturing excellence and presents a design-for-manufacturability (DFM) program
organized around early design and manufacturing teamwork and the economic analysis of design options. Typical measures
of manufacturing excellence for a semiconductor fabricator are expressed in terms of either operational or economic results.
Those expressed in terms of operational results are independent of the product mix in the fabricator while those expressed in
terms of economic results integrate both fabricator and product design attributes into a single parameter like revenue/wafer.
Improvements in the operational measures of manufacturing excellence focus upon increases in capacity and throughput,

defect density reductions, and cost containment. Improvements in the economic measures of manufacturing excellence must
focus on both fabricator processing efficiency and the productivity of the design. Design-for-manufacturability practices can
improve design productivity, time-to-market, and product performance and reliability by closely coupling semiconductor
fabrication knowledge with product requirements during the initial phase of a product design. Every design decision produces
both technical and economic consequences; understanding these consequences and using this knowledge in the design process
to optimize product productivity and profitability is key to achieving manufacturing excellence for that product.
Author
Semiconductors (Materials); Manufacturing; Design Analysis; Economic Analysis
20050170527 Siemens, France
Highly Selective Oxide to Nitride Etch Processes on BPSG/Nitride/Oxide Structures in a MERIE Etcher
Graf, W.; Basso, C.; Gautier, F.; Martin, J. M.; Sabouret, E.; Skinner, G.; 1998 IEEE/SEMI Advanced Semiconductor
Manufacturing Conference And Workshop; [1998], pp. 314-319; In English; See also 20050170458; Copyright; Avail: Other
Sources
This study is on an oxide etch selective to nitride using a C4F8/CO/Ar/O2 chemistry in an RIE chamber. It has been tested
in a manufacturing environment on several applications for 16 and 64 megabit DRAM chips. Film stacks tested included a
BPSG/nitride Self-Aligned Contact type of application and a BPSG/nitride/oxide application. Aspect ratios ranged from 4:1
to 8:1. Critical dimensions were typically 0.4 microns and 0.3 microns, but for one application, the oxide etch ultimately had
to occur in a 0.09 micron wide space. Process development started with a Design of Experiments on patterned wafers in order
to understand the major trends of the chemistry. The wafers were analysed using an SEM. Fine tuning of processes for each
application involved Optical Emission Spectroscopy (OES) and electrical test yield analysis.
Author
Oxides; Nitrides; Etching; Wafers; Manufacturing; Chips
20050173487 South Carolina Univ., Columbia, SC USA
WBGS Epitaxial Materials Development and Scale Up for RF/Microwave-Millimeter Wave Devices
Khan, M. A.; Simin, G.; Shur, M.; Gaska, R.; May 2005; 9 pp.; In English; Original contains color illustrations
Contract(s)/Grant(s): DAAD19-02-1-0236
Report No.(s): AD-A432964; 15530FA16; No Copyright; Avail: Defense Technical Information Center (DTIC)
The project aimed at significant improvement of the III-nitride based epitaxial materials and device design and fabrication
for high-power heterostructure field-effect transistors (HFETs). The key innovative approaches implemented in this program

include a novel pulsed atomic layer epitaxy (PALE) technique to grow the buffer layer with low defect density, improved
epitaxial uniformity in a multi-wafer MOCVD reactor, and growth of HFET wafers with sheet resistance below 300 Ohm/square.
Design improvements include double-heterostructure devices (DHFETs) with an InGaN electron confinement layer, an
insulated-gate design using an SiO2 gate insulator (MOSDHFETs), and an innovative field-plate design. These new devices
demonstrated high RF power of 15-20 W/mm at a drain bias of 50-65 V, and good parameter stability at 19 W/mm CW power,
as confirmed by 100+ hours of testing.
DTIC
Aluminum Gallium Arsenides; Epitaxy; Microwave Equipment; Millimeter Waves; Radio Frequencies; Semiconductors
(Materials)
77
PHYSICS OF ELEMENTARY PARTICLES AND FIELDS
Includes quantum mechanics; theoretical physics; and statistical mechanics. For related information see also 72 Atomic and Molecular
Physics, 73 Nuclear Physics, and 25 Inorganic, Organic and Physical Chemistry.
20050169773 Brookhaven National Lab., Upton, NY
RHIC Data Correlation Methodology
Michnoff, R.; D’Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.; 1999; 10 pp.; In English
Report No.(s): DE2004-770722; BNL-66031; No Copyright; Avail: Department of Energy Information Bridge
A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data
gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and
to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration
because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event
Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous
acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data
to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation
task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide
guidelines for software developers. The overall data correlation methodology will be presented in this paper.
NTIS
Data Acquisition; Particle Accelerators; Data Correlation
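A minimal sketch of the conversion step the methodology calls for, mapping a native acquisition trigger (here an assumed event-link tick counter) onto a common wall-clock axis while keeping the native trigger value alongside; the field names and the tick rate are hypothetical, not RHIC specifics.

    # Convert a native trigger timestamp (tick counter) to a common wall-clock time
    # using one reference (tick, wall-clock) pair and an assumed constant tick rate.
    from dataclasses import dataclass

    TICK_HZ = 720_000.0  # assumed tick rate, for illustration only

    @dataclass
    class Sample:
        native_ticks: int    # preserved native acquisition trigger
        wall_clock_s: float  # common time reference used for correlation
        value: float

    def to_wall_clock(native_ticks: int, ref_ticks: int, ref_wall_s: float) -> float:
        """Linearly map a tick count onto the common wall-clock axis."""
        return ref_wall_s + (native_ticks - ref_ticks) / TICK_HZ

    s = Sample(native_ticks=1_440_123,
               wall_clock_s=to_wall_clock(1_440_123, ref_ticks=1_440_000, ref_wall_s=100.0),
               value=3.7)
    print(s)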
20050169775 Brookhaven National Lab., Upton, NY, USA
Littlest Higgs Model and One-Loop Electroweak Precision Constraints

Chen, M.; Dawson, S.; 2004; 10 pp.; In English
Report No.(s): DE2004-15009978; BNL-873293-2004-CP; No Copyright; Avail: Department of Energy Information Bridge
We present in this talk the one-loop electroweak precision constraints in the Littlest Higgs model, including the
logarithmically enhanced contributions from both fermion and scalar loops. We find the one-loop contributions are comparable
to the tree level corrections in some regions of parameter space. A low cutoff scale is allowed for a non-zero triplet VEV.
Constraints on various other parameters in the model are also discussed. The role of triplet scalars in constructing a consistent
renormalization scheme is emphasized.
NTIS
Fermions; Scalars; Electroweak Model
20050169843 Brookhaven National Lab., Upton, NY, USA
Singlet Free Energies of a Static Quark-Antiquark Pair
Petrov, K.; 2004; 12 pp.; In English
Report No.(s): DE2004-15009925; BNL-73191-2004-CP; No Copyright; Avail: Department of Energy Information Bridge
We study the singlet part of the free energy of a static quark-antiquark (Q(bar Q)) pair at finite temperature. The model
is three-flavor QCD with degenerate quark masses, using N(sub (tau)) = 4 and 6 lattices with the Asqtad staggered fermion action.
We look at the thermodynamics of the system around the phase transition and study its scaling with lattice spacing and quark masses.
NTIS
Free Energy; Quarks; Thermodynamics; Antiparticles
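For context, the color-singlet free energy studied in such lattice calculations is commonly extracted from the correlator of temporal Wilson lines (Polyakov loops) in a fixed gauge, schematically (renormalization constants omitted):

    e^{-F_{1}(r,T)/T} \;=\; \frac{1}{3}\,
      \Bigl\langle \operatorname{Tr}\bigl[\, L(\mathbf{0})\, L^{\dagger}(\mathbf{r}) \,\bigr] \Bigr\rangle ,

where L(x) is the temporal Wilson line at spatial site x and r is the quark-antiquark separation.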
20050171013 Jefferson (Thomas) Lab. Computer Center, Newport News, VA, USA
Nucleon Electromagnetic Form Factors
de Jager, K.; 2004; 34 pp.; In English
Report No.(s): DE2004-834525; No Copyright; Avail: Department of Energy Information Bridge
Although nucleons account for nearly all the visible mass in the universe, they have a complicated structure that is still
incompletely understood. The first indication that nucleons have an internal structure was the measurement of the proton
magnetic moment by Frisch and Stern (1933), which revealed a large deviation from the value expected for a point-like Dirac
particle. The investigation of the spatial structure of the nucleon, resulting in the first quantitative measurement of the proton
charge radius, was initiated by the HEPL (Stanford) experiments in the 1950s, for which Hofstadter was awarded the 1961
Nobel prize. The first indication of a non-zero neutron charge distribution was obtained by scattering thermal neutrons off
atomic electrons. The recent revival of its experimental study through the operational implementation of novel instrumentation

has instigated strong theoretical interest. Nucleon electromagnetic form factors (EMFFs) are optimally studied through the
exchange of a virtual photon in elastic electron-nucleon scattering.
NTIS
Nucleons; Form Factors; Neutrons; Protons
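The elastic electron-nucleon cross section referred to here is conventionally written in the one-photon-exchange (Rosenbluth) form, which is how the form factors G_E and G_M enter:

    \frac{d\sigma}{d\Omega} \;=\;
      \left(\frac{d\sigma}{d\Omega}\right)_{\mathrm{Mott}}
      \left[ \frac{G_E^{2}(Q^{2}) + \tau\, G_M^{2}(Q^{2})}{1+\tau}
             + 2\tau\, G_M^{2}(Q^{2}) \tan^{2}\!\frac{\theta}{2} \right],
    \qquad \tau = \frac{Q^{2}}{4M^{2}},

where Q^2 is the negative four-momentum transfer squared, M the nucleon mass, and theta the electron scattering angle.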
20050173407 Oxford Univ., Oxford, UK
An Investigation of Certain Thermodynamic Losses in Miniature Cryocoolers
Reed, Jaime; Jan. 2005; 30 pp.; In English; Original contains color illustrations
Contract(s)/Grant(s): FA8655-04-1-3011
Report No.(s): AD-A432813; EOARD-SPC-04-3011; No Copyright; Avail: Defense Technical Information Center (DTIC)
Stirling cycle cryocoolers developed at Oxford have typically been designed using second-order methods whereby the
ideal Stirling efficiency is degraded by a number of discrete loss mechanisms. In all cases the eventual machines perform less
well than expected, and it always appears as if an additional thermodynamic loss is acting. This empirically calibrated loss
is therefore included as part of the normal design procedure, and there is anecdotal evidence that this approach is also taken by
other manufacturers. Although this loss might be caused by imperfect heat transfer, existing theories do not agree with its
magnitude. A project was therefore started to measure the losses in the simplest possible geometry, a linear compressor with
a plain 'top-hat' cylinder head. It was hoped that by characterizing the losses in this geometry and applying them to full
machines, this so-called 'compression loss' could be explained. Since the loss is quite large, explaining it could allow significant
improvements to be made for future machines. A well calibrated measurement system was developed and a linear compressor
commissioned. To enable a sufficiently good energy balance to be produced, electromagnetic motor losses and windage were
measured. It immediately became clear that these were more significant than had been assumed in previous studies. In fact it
appeared as if a significant proportion of the 'compression loss' might be explained by these new measurements. The remaining
losses were compared with the losses expected from analytic analyses. Agreement was not perfect, however, and this is thought to be due to the
incompleteness of the heat transfer theory, particularly with regard to the flow through the clearance seal. Future possibilities
for work are suggested and it is hoped that these measurements can be used as a baseline for testing theoretical work which
will enable efficiencies to be increased not just in Stirling type coolers, but also in pulse tubes and linear alternators.
DTIC
Coolers; Cryogenic Cooling; Thermodynamics
80
SOCIAL AND INFORMATION SCIENCES (GENERAL)

Includes general research topics related to sociology; educational programs and curricula. For specific topics in these areas see
categories 81 through 85.
20050169645 Utah State Univ., Logan, UT, USA
Making a World of Difference: Recruitment of Undergraduate Students at USU
Furse, Cynthia; Price, Jana; IEEE Antennas and Propagation Society International Symposium, Volume 1; [1999], pp. 70-73;
In English; See also 20050169565; Copyright; Avail: Other Sources
This paper describes two creative methods that are used to recruit undergraduate students at Utah State University. The
first is 'Engineering State', a four-day hands-on immersion in a wide array of engineering disciplines, and the second is
a slide show called 'Making a World of Difference - Women in Engineering' that was created by a woman engineering student
to provide an uplifting, upbeat look at the difference a woman can make in the world if she becomes an engineer, and to
encourage young women to take a second look at the opportunities engineering presents for them. The Engineering State
program has been in place since 1992 and has had clear, measurable benefits in the recruitment of students to USU engineering
programs. One of the goals of Engineering State is to attract women and minorities to consider an engineering career. The
engineering students who have seen the preliminary portions of the new slide show have responded with excited comments,
smiles on their faces, and a renewed positive outlook. It is hoped that this exposure will provide much-needed role models, success stories,
and a new look at engineering as the people-oriented profession that it can be. This slide show is available free to interested
faculty, and is downloadable in PowerPoint format from the web.
Author
Universities; Students; Education; Occupation
20050169648 Brigham Young Univ., Provo, UT, USA
Microwave Engineering Design Laboratories: C-Band Rail SAR and Doppler Radar Systems
Jensen, Michael A.; Arnold, David V.; Crockett, Donald E.; IEEE Antennas and Propagation Society International Symposium,
Volume 1; [1999], pp. 82-85; In English; See also 20050169565; Copyright; Avail: Other Sources
National trends appear to indicate that student and faculty interest in electromagnetic principles and practices is waning.
Ironically, given the current industrial emphasis on high-frequency communications, high-speed computational systems, and
high-bandwidth interconnection requirements, we find ourselves in a situation where increasing numbers of engineers need to
have a grasp of high-frequency fundamentals. To address this need, we have re-focused the electromagnetic teaching
laboratories in the Electrical and Computer Engineering Department at Brigham Young University to provide students with
relevant, motivational design experiences with microwave systems. Our current laboratories are based on a 6 GHz Doppler
radar in our Junior-level course, and a 6 GHz Synthetic Aperture Radar (SAR) in the Senior-level course. These laboratories

focus on taking students through the entire design process, beginning with system-level engineering and moving through
computer-aided design, fabrication, and testing. Because the end product is a functional, useful system, students can directly
see the relevance of the experience as well as the associated theory taught in the accompanying course. We have observed a
significant increase in student motivation since the original inception of the revised laboratory experience.
Author
Microwaves; Electrical Engineering; C Band; Synthetic Aperture Radar; Doppler Radar
20050170455 NASA Langley Research Center, Hampton, VA, USA
Evaluating the Effectiveness of the 2003-2004 NASA SCIence Files(trademark) Program
Caton, Randall H.; Ricles, Shannon S.; Pinelli, Thomas E.; Legg, Amy C.; Lambert, Matthew A.; May 05, 2005; 55 pp.; In
English
Contract(s)/Grant(s): 23-079-99-OE
Report No.(s): NASA/TM-2005-213756; L-19120; No Copyright; Avail: CASI; A04, Hardcopy
The NASA SCI Files is an Emmy award-winning series of instructional programs for grades 3-5. Produced by the NASA
Center for Distance Learning, programs in the series are research-, inquiry-, standards-, teacher- and technology-based. Each
NASA SCI Files program (1) integrates mathematics, science, and technology; (2) uses Problem-Based Learning (PBL) to
enhance and enrich the teaching and learning of science; (3) emphasizes science as inquiry and the scientific method; (4)
motivates students to become critical thinkers and active problem solvers; and (5) uses NASA research, facilities, and
personnel to raise student awareness of careers and to exhibit the ‘real-world’ application of mathematics, science, and
technology. In April 2004, 1,500 randomly selected registered users of the NASA SCI Files were invited to complete a survey
containing a series of questions. A total of 263 surveys were received. This report contains the quantitative and qualitative
results of that survey.
Author
NASA Programs; Education; Telecommunication; Science; Engineering; Mathematics
81
ADMINISTRATION AND MANAGEMENT
Includes management planning and research.
20050170461 Shape Memory and Superelastic Technologies, Boeblingen, Germany
Automated Lot Tracking and Identification System

Rohrer, Ulrich; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998], pp. -; In
English; See also 20050170458; Copyright; Avail: Other Sources
Meeting exactly the agreed-upon delivery dates and product volumes is an essential part of the relationship between
semiconductor manufacturers and their customers. Especially in the ASIC business, with a multitude of part numbers and yet
small lot sizes, this has become a major criterion for 'customer satisfaction'. Worldwide competition is the driving force to
reduce manufacturing cycle time, especially for design verification or product qualification using express or RTAT lots. To
supply these high priority lots in the least possible time to the proper manufacturing equipment is a critical factor towards
achieving short overall cycle times.
Derived from text
Application Specific Integrated Circuits; Automatic Control; Manufacturing; Tracking (Position)
20050170469 Motorola, Inc., Mesa, AZ, USA
Improvement of AME 8110 Oxide Etcher Daily Clean
Welp, Kevin; Fisher, Paul; Holden, Joan; Wang, Ping; Gunn, Mynetta; Franco, Jennie; 1998 IEEE/SEMI Advanced
Semiconductor Manufacturing Conference And Workshop; [1998], pp. 50-54; In English; See also 20050170458; Copyright;
Avail: Other Sources
In semiconductor manufacturing, continuously increasing production capacity to meet customer demands is a challenge
for many mature fabs. Purchasing new equipment or building additional fabrication areas is rarely an option. Therefore,
new ways to improve capacity using existing resources must be explored. Motorola’s Bipolar 3 fab has done this in the case
of Applied Materials 8110 Reactive Ion Etchers (RIE).
Derived from text
Semiconductors (Materials); Manufacturing
20050170472 Texas Instruments, Inc., Dallas, TX, USA
The Effect of Performance Based Incentive Plans
Ingersoll, Tim; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998], pp. 115-118;
In English; See also 20050170458; Copyright; Avail: Other Sources
This report describes a method to simultaneously achieve and maintain production and quality goals through Performance
Based Incentive Plans. Historically, at Texas Instruments’ DMOS IV wafer fab, the focus on one metric resulted in a loss of
another. Achievement or failure to achieve Fab goals had no noticeable impact on production specialists. Throughout this time,
incentive plans were tried, but their metrics were complicated and not easily recognized by direct labor because they were
outside of their immediate line of sight. By modifying, improving, and evolving our incentive program to meet business goals,

DMOS IV experienced seven record output quarters over 2 years while improving in all other industry established metrics.
Author
Industrial Management; Personnel Management; Production Management; Management Methods; Human Performance;
Incentives
20050170473 Texas Instruments, Inc., Dallas, TX, USA
Rewards, Structure and Alignment Affect Goal Attainment
Gentleman-Ingersoll, Janet; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998],
pp. 128-132; In English; See also 20050170458; Copyright; Avail: Other Sources
Competing in today's global market requires creative solutions and ideas that surpass those any individual alone can
conceive or achieve. Organizations will only succeed when employees work together, leverage diverse ideas, and unite their
efforts in a focused direction. This paper presents a strategy to create an environment where individual contributors, teams, or
organizations want to work collectively to accomplish a common goal. This paper addresses alignment, structure and rewards
that both encourage and support collaborative effort.
Author
Teams; Organizations
20050170493 International Business Machines Corp., Essex Junction, VT, USA
The Advantages of Using Short Cycle Time Manufacturing (SCM) Instead of Continuous Flow Manufacturing (CFM)
Martin, Donald P.; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998], pp. 43-49;
In English; See also 20050170458; Copyright; Avail: Other Sources
Over the past two decades, continuous flow manufacturing (CFM) has been the principal operational tool to help
manage and improve the utilization of manufacturing assets. As the name connotes, the key focus of CFM is to measure and
manage the throughput of the tools/toolsets that comprise the manufacturing line. To this end, a variety of systems have been
proposed to help manage throughput (e.g., PUSH, PULL, theory of constraints) with their attendant control methodologies
(e.g., MRP, KANBAN, drum-buffer-rope, etc.). This paper explores how the X-factor (normalized cycle time) rather than
throughput is used as the prime line control and line analysis parameter; hence, the name short cycle time manufacturing
(SCM). Because manufacturing lines have both throughput and X-factor commitments, it is essential to understand the
fundamental relationships between throughput, capacity and X-factor. This paper also demonstrates that X-factor is a much
more sensitive indicator of capacity problems than throughput, because X-factor increases rapidly as the throughput
approaches the effective capacity. This sensitivity in X-factor can be used as a powerful diagnostic tool to uncover

unanticipated capacity issues. Short cycle time manufacturing (SCM) allows each tool/toolset to be analyzed depending on its
demonstrated X-factor and capacity versus target to determine which tools/toolsets need improvement, since the overall
X-factor of the line is just the weighted sum of the component toolset X-factors. In addition, this paper analyzes the impact
of mix and volume with a cycle time constraint on the capacity of tools that are affected by batch or train size. Thus, SCM
provides significant advantages over CFM in helping to manage and improve manufacturing asset utilization.
Author
Time Dependence; Manufacturing; Flow Charts
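The sensitivity claim in the entry above can be illustrated with the simplest queueing stand-in: for a single M/M/1 toolset the X-factor (cycle time divided by raw process time) is 1/(1-u), which diverges as utilization u approaches effective capacity. The formula is an illustration of the qualitative behavior, not the paper's model.

    # X-factor versus utilization for an M/M/1 toolset: X = CT / RPT = 1 / (1 - u).
    # Used only to show why X-factor reacts sharply as load nears effective capacity.
    def x_factor_mm1(utilization: float) -> float:
        if not 0.0 <= utilization < 1.0:
            raise ValueError("utilization must be in [0, 1)")
        return 1.0 / (1.0 - utilization)

    for u in (0.50, 0.80, 0.90, 0.95):
        print(f"u={u:.2f}  X-factor={x_factor_mm1(u):.1f}")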
20050170494 University Coll., Cork, Ireland
Semiconductor Metrics: Conflicting Goals or Increasing Opportunities?
Sattler, Linda; Schlueter, Robert; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop;
[1998], pp. 55-60; In English; See also 20050170458; Copyright; Avail: Other Sources
In order to improve semiconductor manufacturing performance, companies typically utilize various metrics such as cycle
time, throughput and yield. By tracking the progress of one or more of these metrics and setting achievement goals, many
companies are able to make significant metric improvements. However, metric improvement is only beneficial if it results in
actual manufacturing improvement. Metrics may be influenced by forces outside of manufacturing, they may conflict with
other metrics, or they may actually increase undesirable outcomes in the lab. This paper highlights some of the current
problems with metric utilization in semiconductor fabs. Examples from industry and results using data from the Competitive
Semiconductor Manufacturing Study at the University of California at Berkeley are given. We present some practical solutions
highlighting the Overall Equipment Effectiveness Teams at Texas Instruments which have been designed to minimize many
of the semiconductor metric problems.
Author
Semiconductors (Materials); Manufacturing; Improvement
20050170496 INTEL Ireland Ltd., Leixlip, Ireland
A80 A New Perspective on Predictable Factory Performance
Cunningham, Calum; Babikian, Richard; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And
Workshop; [1998], pp. 71-76; In English; See also 20050170458; Copyright; Avail: Other Sources
Predictable output performance that maximizes asset utilization is the cornerstone of successful volume manufacturing.
The Theory of Constraints uses the principles of covariance and dependent events to describe how equipment or operations
that dominate factory performance should be managed. In practice the ‘true constraint’ is elusive and is seldom the designed

constraint. This paper introduces a new statistically based equipment performance management methodology called A80
which focuses on equipment or operation performance variability to rapidly identify and improve the performance of the ‘true
constraint'. The A80 methodology, initially developed at Intel's Fab 10 facility and subsequently adopted by all Intel 200mm
facilities, rejects the traditional use of average availability as a primary indicator of equipment performance and capacity
because it provides no indication of stability and thus invariably fails to prompt the correct response to performance
inconsistencies. This paper will describe the A80 concept, tools, and methods developed in Fab 10 and will use data and case
study materials to show how the methodology is
Author
Management Planning; Statistical Analysis; Performance Prediction; Equipment
20050170504 Fairchild Semiconductor Corp., South Portland, ME, USA
Enhancing Fab Performance Under Team Council Methodology
Dupuis, Ronald N., Jr.; Gervais, John; Park, Steven; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference
And Workshop; [1998], pp. 119-121; In English; See also 20050170458; Copyright; Avail: Other Sources
The objective of this paper is to outline and describe the process of developing Team Councils in a Fab Organization. At
first we will present an historical background and why we thought this type of approach was necessary to achieve high
performance from all levels of the Organization. A Road Map to success as well as a Task Level Migration matrix will describe
different levels of responsibility needed to achieve the results described in the conclusion of this paper. Though this process
is still evolving and developing in South Portland, the paper describes the necessary steps to implement this process.
Author
Matrices (Mathematics); Migration
20050170505 Motorola, Inc., Austin, TX, USA
Risk Management Exercise in a Wafer Fab Utilizing Dynamic Simulation
McCay, Todd; DePinto, Gary; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998],
pp. 122-127; In English; See also 20050170458; Copyright; Avail: Other Sources
In the semiconductor industry, companies must be prepared to effectively respond to emergency situations that threaten
their employees’ safety and their manufacturing sites. Most emergencies are small incidents with minor impact; however, the
potential human and financial loss resulting from a large-scale emergency can be very great. Prior experience had shown that
although the Motorola and City of Austin emergency response groups operate effectively on an independent basis, cross-group
communication and coordination needed improvement. To assist with this, a large-scale, multiple emergency drill involving
all groups was conducted. A forty-two member simulation team was organized to design and implement a scenario using

Dynamic Simulation in order to make the drill as realistic as possible. A five hour drill was successfully completed without
interruption to manufacturing with approximately eighty responders at eight different, simultaneous activity areas across a 245
acre campus containing five manufacturing facilities. Several opportunities to improve and refine the processes of preplanning,
response, follow-up and drill implementation were identified. Annual drills of this magnitude and style will be institutionalized
as part of how each group manages risk and protects their employees and other assets.
Author
Industrial Safety; Safety Management; Industrial Plants; Emergencies; Drills
20050170506 International Business Machines Corp., Essex Junction, VT, USA
Quantifying Capacity Loss Associated with Staffing in a Semiconductor Manufacturing Line
Pollitt, Clinton; Matthews, John; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop;
[1998], pp. 133-137; In English; See also 20050170458; Copyright; Avail: Other Sources
Even on a total-time basis, staffing-related capacity loss is one of the major contributors to underperforming tools. The loss of capacity caused by staffing, whether planned or unplanned, has the potential to be the single most significant operational detractor in a semiconductor line. A number of issues related to staffing strategies and operational methodologies for a semiconductor line will affect capacity loss. On the one hand, there is the need to be cost competitive by reducing staffing and increasing productivity. On the other, the cost of idle equipment and the loss of tool capacity because of insufficient staffing must be considered. Many issues are involved in accurately determining capacity loss due to staffing and identifying the components of that loss. This paper discusses ways to determine capacity loss and other concerns related to staffing on various tool sets in a semiconductor manufacturing line using the techniques of multiobservation study (MOS) and data analysis. Additionally, it describes the link between the magnitude of a given loss and its main contributing factors. Finally, it examines some strategies for reducing the effect of staffing on capacity.
Author
Personnel Management; Losses; Industrial Management; Assembling
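[Editor's note] The staffing-related capacity loss discussed in the entry above can be illustrated with a simple work-sampling calculation. The sketch below is not the paper's multiobservation study procedure; it is a minimal, hypothetical Python example showing how the fraction of observations in which a tool sits idle for lack of an operator translates into lost capacity hours. The state labels and observation counts are invented for illustration.

# Minimal work-sampling sketch (illustrative only, not the MOS method from the paper).
# Each observation records the state of a tool at a random sampling instant.
observations = {
    "processing": 410,          # tool running product
    "idle_no_wip": 55,          # no material available
    "idle_no_operator": 85,     # material waiting, operator unavailable (staffing loss)
    "down_maintenance": 50,
}

total_obs = sum(observations.values())
staffing_fraction = observations["idle_no_operator"] / total_obs

# Scale the sampled fraction to a planning horizon, e.g. one week of total time.
hours_per_week = 7 * 24
staffing_loss_hours = staffing_fraction * hours_per_week

print(f"Estimated staffing-related capacity loss: {staffing_fraction:.1%} of total time, "
      f"about {staffing_loss_hours:.1f} hours per week per tool")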
20050170507 Southwest Research Inst., San Antonio, TX, USA
Filling the Technology Gap through Balanced Joint Development Projects and Contracted Independent Research
Providers
Runnels, Scott; Miceli, Frank; Kim, Inki; Easter, Bill; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference
And Workshop; [1998], pp. 138-141; In English; See also 20050170458; Copyright; Avail: Other Sources
Over the past several years, a noticeable share of the semiconductor manufacturing industry’s overall R&D burden has shifted from chip manufacturer to equipment supplier. However, it is difficult for equipment suppliers to support the permanent
dedicated research staff required to bear their increasing R&D burden. Likewise, their counterparts inside the chip
manufacturer are urged to focus on current process development, integration, and efficiency issues. This shift in the R&D
burden has been widely recognized in the supplier community, which has referred to it as the ‘Technology Gap.’ This paper
describes one way of dealing with that technology gap. A successful joint development project (JDP) between SpeedFam
Corporation and Lucent Technologies is described and used to exemplify how the R&D burden can be properly balanced by
allowing each organization to focus on their core competency. Key to the success of the JDP was the use of private,
independent R&D supplied under contract by Southwest Research Institute, which also helped facilitate the balance through
preliminary self-funded R&D. The paper explains how issues regarding intellectual property protection and ownership were
successfully resolved and will briefly describe the technology produced from the project.
Author
Intellectual Property; Research; Manufacturing
20050170509 International Business Machines Corp., Essex Junction, VT, USA
Dynamic Capacity Modeling
Mercier, James R.; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And Workshop; [1998],
pp. 148-150; In English; See also 20050170458; Copyright; Avail: Other Sources
Today’s semiconductor fabricators often experience large part-number variation and short product lives, which can lead to capacity shortfalls. Fluctuation in the part-number mix can lead to multiple pinch points in the production process. To contain
wafer starts, new process qualification must be quickly implemented. However, this may introduce ‘risk’ into the line work
in process (WIP). In addition, any production pinch points will hamper the fabricator’s ability to maintain adequate line cycle
time. This paper demonstrates a methodology that can be used to relate part number variation in the fabricator to the available
tool capacity in various process sectors. This methodology allows for real time analysis, and is primarily intended for proactive
management of capacity-constrained production sectors.
Author
Industrial Management; Dynamic Models; Fabrication; Management Methods; Manufacturing
20050170510 Osaka Univ., Osaka, Japan
Effect of 300mm Wafer and Small Lot Size on Final Test Process Efficiency and Cost of LSI Manufacturing System
Nakamae, Koji; Chikamura, Akihisa; Fujioka, Hiromu; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing
Conference And Workshop; [1998], pp. 151-155; In English; See also 20050170458; Copyright; Avail: Other Sources
The effect of lot size change on the current final test process efficiency and cost due to the transition in wafer size from conventional 5 or 6 inches to 300mm (12 inches) is evaluated through simulation analysis. Results show that high test efficiency and low test cost are maintained regardless of lot size, over the range of one to 25 sheets of 300mm wafers, by using an appropriate dispatching rule and a small processing and moving lot size close to the batch size of the testing equipment in the final test process.
Author
Wafers; Low Cost; Manufacturing; Size Distribution
20050170515 JEOL System Technology Co. Ltd., Tokyo, Japan
Yield Management for Development and Manufacture of Integrated Circuits
Koyama, Hiroshi; Inokuchi, Masayuki; 1998 IEEE/SEMI Advanced Semiconductor Manufacturing Conference And
Workshop; [1998], pp. 208-211; In English; See also 20050170458; Copyright; Avail: Other Sources
The purpose of this paper is to outline a strategic element of yield management methodologies for the development and
fabrication of advanced Ultra Large Scale Integration (ULSI) circuits. Fundamental ideas regarding knowledge conversion and
a detailed yield management system are described.
Author
Management Systems; Large Scale Integration
82
DOCUMENTATION AND INFORMATION SCIENCE
Includes information management; information storage and retrieval technology; technical writing; graphic arts; and micrography. For
computer program documentation see 61 Computer Programming and Software.
20050169840 Aspen Systems Corp., Silver Spring, MD, USA
What Works in Partnership Building for HMIS: A Guide for the Los Angeles/Orange County Collaborative
Apr. 30, 2003; 40 pp.; In English
Report No.(s): PB2005-105941; No Copyright; Avail: CASI;
A03, Hardcopy
To inform its implementation of a countywide homeless information management system, the Los Angeles/Orange
County (LA/OC) Collaborative is interested in identifying and understanding successful models for collaboration on
information technology. This document presents descriptions of how other jurisdictions around the country have implemented
an HMIS in their communities. The document highlights ‘What Works’ in each community: examples of decisions and practices that can help inform the LA/OC HMIS decision-making process. The LA/OC Collaborative comprises the Cities of Glendale, Los Angeles, Long Beach, Pasadena, Pomona, and Santa Monica, and Los Angeles and Orange Counties.
NTIS
Information Management; Management Systems; Identifying
20050170924 American Geological Inst., Alexandria, VA, USA
National Geoscience Data Repository System. Phase III: Implementation and Operation of the Repository. Semiannual
Progress Report. 1st Half FY01 (Report for October 2000-March 2001)
Apr. 2001; 24 pp.; In English
Report No.(s): DE2004-834777; No Copyright; Avail: Department of Energy Information Bridge
The NGDRS has attained 72% of its targeted goal for cores and cuttings transfers, with over 12M linear feet of cores and
cuttings now available for public use. Additionally, large-scale transfers of seismic data have been evaluated, but based on the
recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic
data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core
and cuttings preservation strategy evolve as well. A Steering Committee meeting held on November 30, 2000 focused on
current achievements, how the situation in the petroleum industry affects the NGDRS activities, and the nature of the study
by the National Research Council on data preservation.
NTIS
Geology; Data Base Management Systems
20050170925 American Geological Inst., Alexandria, VA, USA
National Geoscience Data Repository System. Phase III: Implementation and Operation of the Repository. Semiannual
Progress Report. 2nd Half FY02. (Report for April 2002-October 2002)
Oct. 2002; 22 pp.; In English
Report No.(s): DE2004-834768; No Copyright; Avail: Department of Energy Information Bridge
The NGDRS has facilitated the transfer of 85% of the cores, cuttings, and other data identified as available for transfer to the public sector. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized
a 6.5 to 1 return on investment to Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but
based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of
the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the
primary core and cuttings preservation strategy evolve as well.
NTIS

Geology; Data Base Management Systems; Geophysics
20050172075 American Geological Inst., Alexandria, VA, USA
National Geoscience Data Repository System Phase III: Implementation and Operation of the Repository
Apr. 2000; 16 pp.; In English
Report No.(s): DE2004-834609; No Copyright; Avail: Department of Energy Information Bridge
In the past six months the NGDRS program has continued to engage new contacts, identify additional data transfer
targets, and improve the metadata catalog for both easier use and long-term maintainability. With industry conditions
continuing to rapidly change and evolve, the primary core and cuttings preservation strategy has evolved as well. With the
severe lack of available public data repository space and the establishment of a major national geoscience data repository
facility unlikely in the near future, the focus is on increasing public awareness and access to nonproprietary company data
holdings that remain in the public and private sector. Efforts still continue to identify and facilitate the entry of new repository
space into the public sector. Additionally, AGI has been working with the National Academy of Sciences Board on Earth
Sciences and Resources staff to initiate a study and workshop to develop a policy recommendation on geoscience data
preservation and prioritization of efforts.
NTIS
Data Bases; Geology; Geophysics
20050173127 Army Medical Dept. Activity, Heidelberg, Germany
Developing a Strategic Information Systems Plan for the Heidelberg US Army Medical Department Activity
Walker, Dennis W.; Apr. 2004; 74 pp.; In English
Report No.(s): AD-A432039; No Copyright; Avail: CASI;
A04, Hardcopy
The Heidelberg Military Healthcare System does not have a strategic information systems plan for the future. The hospital
is operating in a turbulent environment on an aging information system structure. The Heidelberg hospital recently underwent
significant changes and is anticipating more within the next three to five years. This study consists of a qualitative analysis
of the information systems for the Heidelberg healthcare system. Using a six-step customized planning methodology, the study
develops four recommended information management goals, aligns these goals with the organization’s strategic goals and
objectives, defines the information technology architecture, and identifies some resource requirements. Using the
recommended strategic information systems plan, the hospital must create a strategic control action plan, developing measurements and committing capital resources.

DTIC
Biomedical Data; Hospitals; Information Systems
20050173132 National War Coll., Washington, DC USA
The Encryption Export Policy Controversy: Searching for Balance in the Information Age
Miller, Marcus S.; Jan. 2000; 25 pp.; In English
Report No.(s): AD-A432212; No Copyright; Avail: Defense Technical Information Center (DTIC)
The Information Age challenges old paradigms and severely tests the government’s ability to devise appropriate and
effective national policies. The federal government’s encryption export policy highlights a complex Information Age issue
involving seemingly insurmountable conflicts between national security, law enforcement, privacy, and business interests.
Encryption employs mathematical algorithms, implemented in either hardware or software, to encode or scramble a sequence
of data. Although cryptography has been used for centuries, the rise of the Internet and electronic commerce pushed the issue
of encryption control to the forefront of public debate during the 1990s. Formerly the near-exclusive domain of governments,
the majority of today’s encryption products flow from private industry backed by private funding for use in the private sector.
While encryption rose to increasing importance in cyberspace to secure communications and establish trustworthiness, the
federal government continued to follow the traditional national security paradigm of export controls. A series of policy
decisions by the Clinton Administration on encryption export controls during the 1990s ignited a heated public discourse and
a continuing search for a balance between competing interests. The Administration’s pursuit of balance apparently reached its
end-state with an announcement on September 16, 1999 to reverse US export restrictions on strong encryption, a radical
departure from previous reliance on export controls. The federal government’s search for balance among competing interests
in its encryption export policy illustrates the substantial difficulties facing policy makers in the Information Age. While the
search for policy balance appears to prove the ultimate adequacy of the Constitutional framework and the policy making
process to deal with complex issues in cyberspace, it clearly highlights the imperative for national policy makers to recognize
Information Age realities.
DTIC
Cryptography; International Trade; Policies
20050173133 Army Medical Research and Materiel Command, Fort Detrick, MD USA
Alternative Approaches to Improve Physiological Predictions
Oleng, Nicholas; Reifman, Jaques; Berglund, Larry; Hoyt, Reed; Dec. 2004; 9 pp.; In English; Original contains color
illustrations

Report No.(s): AD-A432214; No Copyright; Avail: Defense Technical Information Center (DTIC)
Recent advancements in technology have resulted in new biosensors and information processing capabilities that permit
on-line, real-time measurement of physiological variables. This has, in turn, given rise to the possibility of developing
soldier-specific, data-driven predictive models for assessing physiological status in the battlefield. This paper explores how the
accuracy of a predictive model based on first-principles physiology can be enhanced by data-driven ‘black box’ techniques
of modeling and predicting human physiological variables. Such hybrid techniques are employed here in the prediction of core
temperature. Preliminary results show that the mean square error of prediction can be reduced by up to fifty percent for
prediction horizons of up to 30 minutes.
DTIC
Biological Effects; Data Processing; Detection; Physiology
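[Editor's note] The hybrid approach described in the entry above can be sketched very loosely as a physics-based baseline prediction corrected by a simple data-driven term fitted to the baseline's recent residuals. The Python sketch below is illustrative only: the baseline model, the correction scheme, and all signals are synthetic placeholders, not the models or data used in the paper.

# Illustrative hybrid predictor: first-principles baseline plus a data-driven
# residual correction. Synthetic data; not the models used in the paper.
import math
import random

random.seed(0)

def baseline_model(t_min):
    # Hypothetical first-principles core-temperature prediction (deg C).
    return 37.0 + 0.4 * (1.0 - math.exp(-t_min / 60.0))

# Simulated "measured" temperatures: baseline plus a persistent individual bias and noise.
measured = [baseline_model(t) + 0.3 + random.gauss(0.0, 0.05) for t in range(120)]

def hybrid_prediction(now, horizon, window=15):
    # Data-driven correction: mean of the baseline's residuals over a trailing window.
    past = range(max(0, now - window), now)
    residuals = [measured[k] - baseline_model(k) for k in past]
    correction = sum(residuals) / len(residuals) if residuals else 0.0
    return baseline_model(now + horizon) + correction

horizon = 30  # a 30-minute prediction horizon, as in the abstract
base_err = [(measured[t + horizon] - baseline_model(t + horizon)) ** 2 for t in range(30, 90)]
hyb_err = [(measured[t + horizon] - hybrid_prediction(t, horizon)) ** 2 for t in range(30, 90)]
print("baseline MSE:", round(sum(base_err) / len(base_err), 4))
print("hybrid MSE:  ", round(sum(hyb_err) / len(hyb_err), 4))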
20050173172 Defence Research and Development Canada, Valcartier, Quebec Canada
Capturing and Modeling Knowledge Objectives: The Sacot Project
Auger, Alain; Jan. 2004; 3 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432283; No Copyright; Avail: Defense Technical Information Center (DTIC)
One of the strategic objectives for Information and Knowledge Management (IKM) in Canadian Command and Control
Information Systems (C2IS) consists in investigating and advancing knowledge creation and discovery techniques through
which information is collected and processed to support situation analysis and gain sufficient situational awareness to be able
to project possible future courses of action or trends with confidence. In 2001, the Future Army Capabilities report (DND,
2001) pointed out that without some fundamental change, current army ISR will be incapable of providing the degree of
knowledge that will be required by future commanders. Therefore all relevant data, information and knowledge must be
available at all levels, but managed in a way that produces a current, rapid and coherent understanding of the battlespace, while
at the same time allowing the various levels of command to process the relevant material for their specific purposes.
DTIC
Command and Control; Information Management; Information Systems
20050173181 Naval Postgraduate School, Monterey, CA USA
Analysis of Career Progression and Job Performance in Internal Labor Markets: The Case of Federal Civil Service
Employees
Spyropoulos, Dimitrios; Mar. 2005; 87 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432317; No Copyright; Avail: Defense Technical Information Center (DTIC)
The objective of this thesis is to investigate various factors that influence the job performance and promotion of DOD civilian workers. The data used in this study were drawn from the Department of Defense Civilian Personnel Data Files provided by the Defense Manpower Data Center (DMDC). The data were restricted to employees who were initially hired in 1995, stayed in service until 2003, and were paid under the General Schedule (GS) pay system. Three general
performance measures were used: compensation (salary), annual performance ratings and promotions. Multivariate models
were specified and estimated for each of these performance measures. The results indicate that females receive lower annual
and hourly compensation and are less likely to be promoted than men even though they receive better performance ratings.
The results also indicate that minorities are paid less and are less likely to be promoted than majority workers while veterans
are paid more, perform better, and are more likely to become supervisors. The models also reveal that performance rating is a weak measure of productivity and that more highly educated employees are paid more and are more likely to be promoted, even if they are not always the best performers.
DTIC
Human Performance; Labor; Manpower; Occupation; Personnel; Personnel Management
20050173183 Maryland Univ., College Park, MD USA
Distributed Domain Generation Based on the Network Environment Characteristics for Dynamic Ad-Hoc Networks
Manousakis, Kyriakos; Baras, John S.; Dec. 2004; 3 pp.; In English; Original contains color illustrations
Contract(s)/Grant(s): DAAD19-2-01-0011
Report No.(s): AD-A432323; No Copyright; Avail: Defense Technical Information Center (DTIC)
Ad hoc networks are very important for scenarios where there is no fixed network infrastructure. These scenarios arise in both the military and the commercial world. Even though there has been much advancement in the area of these networks, the main drawback is that ad hoc networks do not scale well because the existing protocols (e.g., MAC, routing, security) cannot tolerate their dynamics. A remedy to this problem could exist if these protocols were applied in a hierarchical manner. Hierarchy generation in these dynamic environments can be advantageous since the numerous topological changes can be tolerated more easily and the various protocols can perform better when dealing with smaller groups of nodes. On the other hand, hierarchy has to be generated carefully in order to be beneficial to the network; otherwise it may harm it because of the imposed maintenance overhead. The weakness of the existing network clustering algorithms is that they do not take into consideration the dynamics of the network environment, so in cases of increased mobility their overhead may deteriorate network performance instead of improving it. In this paper we present a new dynamic distributed clustering (DDC) algorithm. The basic characteristic of this algorithm is that it takes into consideration the network dynamics for the generation of robust and efficient clusters. DDC can be applied in highly mobile networks, and we show that it presents better scalability and robustness characteristics than well-known existing clustering algorithms.
DTIC
Communication Networks; Hierarchies
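[Editor's note] The DDC algorithm itself is not specified in the abstract above, so the Python sketch below is only a generic illustration of the underlying idea of mobility-aware clustering: when electing cluster heads, prefer nodes that are well connected and whose neighborhoods change slowly over highly mobile ones. The scoring weights, stability metric, and sample topology are invented for illustration.

# Generic, illustrative mobility-aware cluster-head election (not the DDC algorithm).
# Each node advertises its neighbor set now and as it was a short time ago;
# link stability is the overlap between the two sets.

def stability(old_neighbors, new_neighbors):
    union = old_neighbors | new_neighbors
    return len(old_neighbors & new_neighbors) / len(union) if union else 1.0

def elect_cluster_heads(snapshots, weight_degree=0.4, weight_stability=0.6):
    # snapshots: node -> (old neighbor set, new neighbor set)
    max_degree = max(len(new) for _, new in snapshots.values())
    scores = {}
    for node, (old, new) in snapshots.items():
        scores[node] = (weight_degree * len(new) / max_degree
                        + weight_stability * stability(old, new))
    # Greedy election: highest score first; neighbors of a head join its cluster.
    heads, assigned = [], set()
    for node in sorted(scores, key=scores.get, reverse=True):
        if node in assigned:
            continue
        heads.append(node)
        assigned.add(node)
        assigned.update(snapshots[node][1])
    return heads, scores

snapshots = {
    "A": ({"B", "C"}, {"B", "C"}),        # stable, well connected
    "B": ({"A", "C", "D"}, {"A", "C"}),
    "C": ({"A", "B"}, {"A", "B", "E"}),
    "D": ({"B"}, {"E"}),                  # highly mobile
    "E": ({"D"}, {"C", "D"}),
}
heads, scores = elect_cluster_heads(snapshots)
print("cluster heads:", heads)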
20050173189 Naval Postgraduate School, Monterey, CA USA
Requirements Analysis and Course Improvements for EO3502 Telecommunications Systems Engineering
Wagner, Michael D.; Turner, Nathan L.; Mar. 2005; 115 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432333; No Copyright; Avail: Defense Technical Information Center (DTIC)
This thesis evaluates the requirement for, and provides course improvement recommendations for, Telecommunications Systems Engineering EO3502, taught at the Naval Postgraduate School. Other graduate programs in Information Technology Management were evaluated to determine the standard for telecommunications engineering expected from some of the most respected academic institutions. Graduates of NPS’s Information Technology Management (ITM) and Information Systems and Operations (ISO) curriculums were surveyed to determine how important telecommunications engineering is for their follow-on assignments. In addition, lesson topic vignettes were developed to provide fleet/field examples to reinforce the relevance of individual topics. Finally, recommendations were provided for improving EO3502 and the ITM curriculum in
general.
DTIC
Information Systems; Systems Engineering; Telecommunication
20050173209 L-3 Communication Government Services, Inc., Rome, NY USA
Open Radio Communications Architecture Core Framework V1.1.0 Volume 1 Software Users Manual
Gudaitis, Mike; Hallatt, Dave; Bagdasarova, A.; Yax, Mike; Feb. 2005; 159 pp.; In English; Original contains color
illustrations
Contract(s)/Grant(s): F30602-01-C-0205; Proj-APAW
Report No.(s): AD-A432385; AFRL-IF-RS-TR-2005-59-VOL-1; No Copyright; Avail: Defense Technical Information Center
(DTIC)
This document describes software developed to support the Joint Tactical Radio System (JTRS) program. The software
implementation includes a Core Framework (CF) and sample applications that are based on the Software Communications
Architecture (SCA) v2.2. The software was designed for a desktop computer running the Linux operating system (OS). It was
developed in C++, uses ACE/TAO for CORBA middleware, Xerces for the XML parser, and Red Hat Linux for the Operating
System. The software is referred to as the Open Radio Communications Architecture Core Framework (OrcaCF), formerly known as LinuxFC; this document describes version 1.1.0 of OrcaCF. This Software User Manual (SUM) tells a hands-on software user how to install and use the OrcaCF v1.1.0 subsystem. The architecture and requirements are based on the JTRS
SCA v2.2.
DTIC
C (Programming Language); Computer Programs; Manuals; Radio Communication; User Manuals (Computer Programs)
20050173234 Naval Postgraduate School, Monterey, CA USA
An Analysis of the Effect of Marital and Family Status on Retention, Promotion, and On-the-Job Productivity of Male
Marine Corps Officers
Cerman, Guray; Kaya, Bulent; Mar. 2005; 140 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432436; No Copyright; Avail: Defense Technical Information Center (DTIC)
This thesis investigates the effect of marital and family status on the performance and job productivity of male U.S.
Marine Corps officers. The analysis includes evaluation of fitness reports, retention, and promotion to O-4 and O-5 ranks as
performance measures. The primary goal is to examine the existence of any marriage premium on officers’ performance and
productivity and to investigate potential causal hypotheses. The personnel database used for the analysis includes more than
27,000 male Marine officers who entered the Marine Corps between FY 1980 and 1999. After controlling for selection, estimating fixed effects, and using panel data in order to capture time-varying effects, this study finds that there is a marriage
premium for all performance measures. The thesis rejects the explanation that such premiums are due to supervisor favoritism.
Moreover, married male officers obtain higher fitness report scores, higher promotion probabilities, and higher retention
probabilities than single officers. Each additional year spent in marriage increases fitness report scores and retention probabilities. Having additional non-spousal dependents increases fitness report scores and retention probabilities. On the other hand, currently single but ‘to-be-married’ officers show a higher premium, much as married officers do, for all productivity and performance indicators. This supports selectivity into marriage as a partial explanation of the source of the marriage premium.
DTIC
Data Bases; Males; Military Personnel; Personnel Management; Productivity
20050173243 Universidade Nova de Lisboa, Lisbon, Portugal
New Initiatives for Electronic Scholarly Publishing: Academic Information Sources on the Internet
Ramalho Correia, Ana Maria; Teixeira, Jose C.; Dec. 2004; 23 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432461; No Copyright; Avail: Defense Technical Information Center (DTIC)
No abstract available
Data Processing; Electronic Publishing; Information Systems; Internets

20050173272 Office of the Under Secretary of Defense (Acquisitions and Technology), Washington, DC USA
Report of the Defense Science Board Task Force On Information Warfare -Defense (IW-D)
Nov. 1996; 206 pp.; In English
Report No.(s): AD-A432539; No Copyright; Avail: Defense Technical Information Center (DTIC)
The national security posture of the USA is becoming increasingly dependent on U.S. and international infrastructures.
These infrastructures are highly interdependent, particularly because of the inter-netted nature of the information components
and because of their reliance on the national information infrastructure. The information infrastructure depends, in turn, upon
other infrastructures such as electrical power. Protecting the infrastructures against physical and electronic attacks and
ensuring the availability of the infrastructures will be complicated. These infrastructures are provided mostly (and in some
cases exclusively) by the commercial sector; regulated in part by federal, state, and local governments; and significantly
influenced by market forces. Commercial services from the national information infrastructure provide the vast majority of the
telecommunications portion of the Defense Information Infrastructure (DII). These services are regulated by Federal and state
agencies. Local government agencies regulate the cable television portion of the information infrastructure. Power generation
and distribution are provided by very diverse activities-the Federal government, public utilities, cooperatives, and private
companies. Interstate telecommunications are regulated by the Federal Communications Commission, intrastate telecommunications by the state public utilities commissions. Interstate power distribution is regulated by the Federal Energy Regulatory Commission, intrastate power generation and distribution by the state public utilities commissions.
DTIC
Security; Warfare
20050173273 Naval Health Research Center, San Diego, CA USA
Test and Evaluation of Medical Data Surveillance System at Navy and Marine Corps MTFs
Melcer, T.; Bohannan, B.; Burr, R.; Leap, T.; Reed, C.; Jeschonek, B.; Apr. 2003; 54 pp.; In English; Original contains color
illustrations
Contract(s)/Grant(s): Proj-M2332
Report No.(s): AD-A432540; NHRC-03-14; No Copyright; Avail: Defense Technical Information Center (DTIC)
Recent Department of Defense (DoD) directives call for joint medical surveillance. Joint Vision 2010-2020 states the
goals of Information Superiority and Full Spectrum Dominance. In addition, the emphasis on early detection of chemical and
biological attacks makes it imperative to conduct rigorous testing and evaluation (T&E) of medical informatics technologies
under development to enhance joint force protection. The Medical Data Surveillance System (MDSS) is a Web-based
automated surveillance and data analysis tool intended to integrate medical information for surveillance of deployed forces and patient populations in the USA. The present study evaluated MDSS version 3.1, focusing on its functioning and utility
for end users at Navy and Marine Corps MTFs.
DTIC
Data Systems; Evaluation; Medical Services; Navy; Surveillance; System Effectiveness
20050173299 Office of Naval Research, Arlington, VA USA
Science and Technology Metrics
Kostoff, Ronald N.; Jan. 2005; 979 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432576; No Copyright; Avail: Defense Technical Information Center (DTIC)
This document describes the rationale for, and implementation of, the expanded use of the proper metrics in the evaluation
of science and technology (S&T). The document starts with an Executive Overview and Conclusions regarding the application
of metrics to the entire S&T development cycle, including its key role in setting incentives for S&T development. Then, after
describing how the evolution of S&T has influenced the present burgeoning interest in quantitative S&T metrics, this
monograph defines different types of S&T metrics, followed by the main principles of high quality metrics-based S&T
evaluations. After a broad overview of quantitative approaches to research assessment, the document focuses on the main
approaches of bibliometrics and econometrics, including a novel section on bibliometric collaboration indicators. It then
describes the bibliometrics-related family of approaches known as co-occurrence phenomena, describes a network modeling
approach to quantifying research impacts, and ends the main text body with a description of a metrics-based expert systems
approach for supporting research assessment. There are a substantial number of Appendices that make the present document
essentially a self-contained monograph. Appendix 12 contains extensive data describing the infrastructure of the S&T metrics
literature (including the seminal documents in S&T metrics), and it is followed by a very extensive Bibliography that contains
over 7500 key references in S&T metrics. The Bibliography includes both those specific references identified in the body of
this document’s text, and suggestions for further reading in this broad technical area.
DTIC
Cost Analysis; Cost Effectiveness; Research and Development; Technologies; Technology Assessment
20050173338 Army War Coll., Carlisle Barracks, PA USA
The Role of Public Diplomacy and Public Affairs in the Global War on Terrorism
Huntley, Henry L.; Mar. 2005; 35 pp.; In English
Report No.(s): AD-A432672; No Copyright; Avail: Defense Technical Information Center (DTIC)
On 12 September 2001, the day after the horrible attacks on the Pentagon and the World Trade Towers, the USA Government (USG) and the American military officially began the global war on terrorism (GWOT). In response to the overwhelming flow of compassion from the international Arab and Muslim communities, President Bush quickly reached out to America and the rest of the world to make the USG’s case for responding to the terrorist activity around the world. Proposing a global war on terrorism, he delivered an eloquent but stern message that successfully framed why America and freedom-loving citizens around the world needed to unite to fight the war on terrorism. Almost two years later, as America faced a second war with the brutal government of Iraq, the USG again engaged the international community to state its case for war. This time, engaging too slowly, the positive support America had gained through public diplomacy and public affairs would quickly dissipate, making it very difficult to convince the world and the Arab and Muslim communities that America and the coalition were doing the right thing in going to war with Iraq for a second time. This SRP will examine the importance of Public Diplomacy and Public Affairs. It will review the current USG policy on public diplomacy and the military’s role in public affairs. Further, the paper will discuss world opinion of USG policy, assess whether the U.S. military should carry the burden of public diplomacy to win hearts and minds, and provide a recommendation for improving the USG Public Diplomacy posture in our current global war on terrorism.
DTIC
Public Relations; Terrorism; United States; Warfare
20050173343 Geological Survey, Reston, VA USA
Electronic Collection Management and Electronic Information Services
Cotter, Gladys; Carroll, Bonnie; Hodge, Gail; Japzon, Andrea; Dec. 2004; 21 pp.; In English; Original contains color
illustrations
Report No.(s): AD-A432684; No Copyright; Avail: Defense Technical Information Center (DTIC)
No abstract available
Data Management; Electronic Publishing; Information Management; Information Systems; Libraries; Management
Information Systems; Pulse Communication; User Requirements
20050173345 Information International Associates, Inc., Havertown, PA USA
Metadata for Electronic Information Resources
Hodge, Gail; Dec. 2004; 20 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432686; No Copyright; Avail: Defense Technical Information Center (DTIC)
No abstract available
Data Management; Electronic Publishing; Indexes (Documentation); Information Management; Metadata; Security; Subjects

20050173356 General Hospital (121st) APO, New York, NY USA
Pharmaceutical Logistics at the 121st General Hospital, Seoul, Korea
Giraud, Roger S.; Apr. 2004; 54 pp.; In English
Report No.(s): AD-A432702; AMDCS-35-04; No Copyright; Avail: CASI;
A04, Hardcopy
The USA Forces Korea has continued to deter North Korean aggression, and the 121st General Hospital (121st GH) has provided health care support during this period. The 121st GH pharmacy is an integral piece in the provision of health care
in Korea. The purpose of the study is to determine the indicators of effective pharmacy support and determine if our current
pharmaceutical logistics practice is efficient. The study reports an innovative application of multivariate approaches to predict
order ship time (OST). The sample consists of 122 days of pharmaceutical requisitions. Pharmaceutical logistics data are used
to estimate a multiple regression model of OST for demand satisfaction and accommodation, requisition cost and volume and
source of supply. Multivariate correlations among five independent variables and the dependent variable, OST, are calculated.
The average OST is 6.99 days. Demand satisfaction, requisition volume and source of supply measures make statistically
significant contributions to the shared variance in overall OST, and yield an R(exp 2) of .225 (F(5, 116) = 6.72; p < .0001).
The study’s results, its usefulness for enhancing leadership’s ability to evaluate pharmaceutical logistics, and its implications
for current systems are discussed. By improving pharmaceutical logistics, the 121st General Hospital may deliver better health
care on the Korean peninsula.
DTIC
Drugs; Hospitals; Korea; Logistics; Logistics Management; Management Systems; Medical Services; Pharmacology
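[Editor's note] The entry above reports a multiple regression of order ship time (OST) on demand satisfaction, accommodation, requisition cost and volume, and source of supply (R(exp 2) = .225). The Python snippet below is a generic sketch of fitting such a model with ordinary least squares; the predictor names mirror the abstract, but the data are synthetic placeholders, not the 121st GH requisition data set.

# Generic OLS sketch for a multiple regression of order ship time (OST).
# Synthetic placeholder data; not the study's data.
import numpy as np

rng = np.random.default_rng(1)
n = 122  # same number of observations as days of requisitions in the study, for flavor

# Hypothetical predictors (values are random).
demand_satisfaction = rng.uniform(0.7, 1.0, n)
accommodation = rng.uniform(0.7, 1.0, n)
requisition_cost = rng.uniform(10, 500, n)
requisition_volume = rng.integers(1, 60, n)
local_source = rng.integers(0, 2, n)  # 1 = local source of supply

# Synthetic OST (days) with noise, so the fit is deliberately imperfect.
ost = (12 - 5 * demand_satisfaction - 2 * accommodation
       + 0.002 * requisition_cost + 0.03 * requisition_volume
       - 1.5 * local_source + rng.normal(0, 2.0, n))

X = np.column_stack([np.ones(n), demand_satisfaction, accommodation,
                     requisition_cost, requisition_volume, local_source])
beta, *_ = np.linalg.lstsq(X, ost, rcond=None)

fitted = X @ beta
r_squared = 1 - np.sum((ost - fitted) ** 2) / np.sum((ost - ost.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2:", round(float(r_squared), 3))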
20050173394 Army War Coll., Carlisle Barracks, PA USA
Evaluation of Information Assurance Requirements in a Net-Centric Army
Miller, Scot; Mar. 2005; 31 pp.; In English
Report No.(s): AD-A432792; No Copyright; Avail: Defense Technical Information Center (DTIC)
Network centric capabilities are a key enabler for the transformational army and planned employment of Units of Action
in the future. Information Assurance refers to the security and assurance of the information that is being passed within the
myriad networked systems at multiple data rates and security classifications. This paper will examine the requirements and
concurrent capabilities necessary for this key strategic imperative of future Army operations as part of a joint and coalition
force.
DTIC
Information; Information Transfer; Military Operations; Security; User Requirements

20050173396 Texas Univ., Austin, TX USA
Future Force and First Responders: Building Ties for Collaboration and Leveraged Research and Development
O’Brien, William J.; Hammer, Joachim; Dec. 2004; 8 pp.; In English; Original contains color illustrations
Contract(s)/Grant(s): CMS-0075407; CMS-0122193
Report No.(s): AD-A432794; No Copyright; Avail: Defense Technical Information Center (DTIC)
Visions for the information needs and operational capabilities of the Future Force are similar to those for First Responders
who comprise the backbone of Homeland Security personnel. There is also an increasing role for collaboration between Future Force warriors and First Responders, both in response to domestic incidents and internationally through peacekeeping and related operational roles (US Army 2001; US Army 2004). The purpose of this position paper is to summarize the information
environment of First Responders from the perspective of the IT/C4ISR community, seeking to highlight areas for
collaboration, extension of research, and opportunities for leveraged R&D.
DTIC
Medical Personnel; Security
20050173408 Naval Postgraduate School, Monterey, CA USA
Planning for Success: Constructing a First Responder Planning Methodology for Homeland Security
Jankowski, Thaddeus K., Sr; Mar. 2005; 103 pp.; In English; Original contains color illustrations
Report No.(s): AD-A432814; No Copyright; Avail: Defense Technical Information Center (DTIC)
The planning methodologies used today by most U.S. fire departments are excellent for traditional missions, but wholly
inadequate for the threats posed by terrorism. Planning in the fire service and the rest of the first responder community
historically has relied on a one-dimensional approach that uses a scenario-based planning (SBP) methodology. This thesis
argues that the fire service and others in the first responder community will be able to contribute to homeland security missions
much more effectively, and efficiently, by switching to specially adapted versions of capabilities-based planning. This thesis
proposes a new integrated planning methodology that combines the planning strengths of scenario-based planning, threat-based planning, and capabilities-based planning. The new method identifies capabilities that could be used to manage and mitigate
the consequences of the different types of contingencies within the various response spectrums. It allows an organization to
perform analysis and efficiency studies to evaluate the different spectrums of contingencies against existing capabilities and
create a menu of capabilities necessary for the first responder to respond to all its missions, including immediate threats and
terrorism, in the most efficient and cost-effective manner.
DTIC

Cost Effectiveness; Security; Terrorism; Transponders
20050173419 Army Research Lab., Aberdeen Proving Ground, MD USA
Urban Combat Data Mining
Bodt, Barry A.; Heilman, Eric G.; Kaste, Richard C.; O’May, Janet F.; Dec. 2004; 8 pp.; In English; Original contains color
illustrations
Report No.(s): AD-A432834; No Copyright; Avail: Defense Technical Information Center (DTIC)
We describe an approach and its implementation involving simulation and data mining for improved understanding of the
potential relationships among battle parameters and battle outcomes in an urban setting.
DTIC
Combat; Data Mining; Information Retrieval; Simulation; Terrain; Warfare
20050173440 Mayo Foundation, Rochester, MN USA
Molecular Database Construction and Mining: A General Approach to Unconventional Pathogen Countermeasures
Pang, Yuan-Ping; Dec. 2004; 4 pp.; In English
Contract(s)/Grant(s): DAAD19-01-1-0322
Report No.(s): AD-A432883; ARO-42295.1-LS; No Copyright; Avail: Defense Technical Information Center (DTIC)
One general approach to unconventional pathogen countermeasures is to use specific inhibitors to cripple enzymes such
as proteases that are pivotal to pathogen invasions. For example, botulinum toxins can be detoxified by inhibitors targeting
the zinc endopeptidase located in the light-chain region of botulinum toxins, and anthrax can be detoxified by inhibitors
targeting anthrax’s lethal factor which is a zinc protease. The generality of this approach rests on the fact that all pathogen
invasions are enzyme-dependent. Furthermore, viral and bacterial enzymes have high substrate specificity and can therefore
be inhibited by specific inhibitors without interfering with other enzymes required for normal functions. This approach has
been conceptually validated by the clinical use of protease inhibitors for treating various pathogen invasions. It is, however,
not suitable for military use in its present form, because typically ten years are required to develop an effective protease
inhibitor. Here we propose to use advanced supercomputing technology to shorten the drug discovery process.
DTIC
Construction; Countermeasures; Data Bases; Information Retrieval; Microorganisms; Pathogens
20050173471 Singapore Inst. of Manufacturing Technology, Singapore
Unmanned Tracked Ground Vehicle for Natural Environments
Ibanez-Guzman, J.; Jian, X.; Malcolm, A.; Gong, Z.; Chen, Chun Wah; Tay, Alex; Dec. 2004; 9 pp.; In English; Original
contains color illustrations

Report No.(s): AD-A432934; No Copyright; Avail: Defense Technical Information Center (DTIC)
The deployment of an autonomous and teleoperated vehicle in tropical environments presents numerous challenges due
to the extreme conditions encountered. This paper presents the transformation of a M113 Armored Personnel Carrier into an
autonomous and teleoperated vehicle for operation in jungle-like conditions. The system was partitioned into functional systems: Vehicle Control/Mobility, Piloting, Visual Guidance, Teleoperation, and Communications. Details of the system architecture and major components are included. Emphasis is placed on the perception mechanisms developed for visual guidance, the vehicle conversion into a computer-controlled system, and the implementation of navigation algorithms for localization and path planning. A suite of onboard active and passive sensors is used in the visual guidance system. Data fusion is performed on the outputs of the different types of sensors. The fusion result is fed to the path planner, which generates heading and speed commands to maneuver the vehicle towards the desired position. The vehicle controller executes the speed and heading commands and ensures a fast and safe vehicle response. The results from field trials completed in tropical forest
conditions that are unique to the region are included.
DTIC
Architecture (Computers); Personnel; Tracked Vehicles; Unmanned Ground Vehicles
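[Editor's note] The perception-to-control pipeline summarized in the entry above (sensor fusion feeding a path planner that issues heading and speed commands to the vehicle controller) can be sketched, very loosely, as the control loop below. It is a toy Python illustration with made-up function names, weights, and a point-mass vehicle model, not the M113 system's software.

# Toy fuse -> plan -> control loop, loosely mirroring the pipeline in the abstract.
# All models, names, and numbers are illustrative.
import math

def fuse(lidar_pos, gps_pos, w_lidar=0.7):
    # Trivial weighted fusion of two (x, y) position estimates.
    return tuple(w_lidar * a + (1 - w_lidar) * b for a, b in zip(lidar_pos, gps_pos))

def plan(position, goal, max_speed=2.0):
    # Generate heading (rad) and speed commands toward the goal.
    dx, dy = goal[0] - position[0], goal[1] - position[1]
    distance = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)
    speed = min(max_speed, distance)  # slow down near the goal
    return heading, speed

def step_vehicle(position, heading, speed, dt=0.5):
    # Point-mass "controller": execute the commands for one time step.
    return (position[0] + speed * dt * math.cos(heading),
            position[1] + speed * dt * math.sin(heading))

goal = (10.0, 5.0)
true_pos = (0.0, 0.0)
for _ in range(40):
    estimate = fuse(true_pos, (true_pos[0] + 0.2, true_pos[1] - 0.1))  # offset "GPS" fix
    heading, speed = plan(estimate, goal)
    true_pos = step_vehicle(true_pos, heading, speed)
    if math.hypot(goal[0] - true_pos[0], goal[1] - true_pos[1]) < 0.3:
        break
print("final position:", tuple(round(c, 2) for c in true_pos))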
20050173478 Air Force Research Lab., Rome, NY USA
FPGA Acceleration of Information Management Services
Linderman, Richard W.; Linderman, Mark H.; Lin, Chun-Shin; Feb. 2005; 25 pp.; In English; Original contains color
illustrations
Report No.(s): AD-A432952; No Copyright; Avail: Defense Technical Information Center (DTIC)
Field Programmable Gate Arrays (FPGAs) are widely known for their ability to accelerate ‘number crunching’
applications, such as filtering for signal and image processing. However, this paper reports on the ability of FPGAs to greatly
accelerate non-numerical applications, particularly fundamental operations supporting publish subscribe information
management environments. The specific core service accelerated by FPGAs is the brokering of XML metadata of publications
against the XPATH logical predicates expressing the types of publications that the subscribers wish to receive. The
acceleration is not achieved solely by the FPGA, but by its close coordination with a programmable processor within a
Heterogeneous HPC architecture (HHPC). Two subtasks addressed by the FPGA are the parsing of the ASCII XML publication metadata into an exploitable binary form, followed by the partial evaluation of up to thousands of subscription predicates, with results reported back to the programmable processor. On the first subtask, the FPGA implements a state machine that parses one ASCII character per clock cycle, presently with a 50 MHz clock on 6M-gate Xilinx Virtex II FPGAs. This reduces the parse time of typical information object metadata from 2 milliseconds to around 50 microseconds (a 40X speedup). Once the data is parsed, the fields are broadcast to parallel logic, which evaluates the subscription predicates. The FPGA synthesis tools do a surprisingly effective job of optimizing the logic to evaluate these XPATH predicates. In one typical case, 2000 predicates compiled down to require only 2.9% of the 6M-gate FPGA resources.
DTIC
Computer Programming; Field-Programmable Gate Arrays; Information Management; Information Systems
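[Editor's note] The parse-time figures quoted in the entry above are mutually consistent, as the quick check below shows; the roughly 2500-character metadata size is inferred from those figures and is not stated in the abstract.

\[
t_{\text{char}} = \frac{1}{50\,\text{MHz}} = 20\,\text{ns}, \qquad
\frac{50\,\mu\text{s}}{20\,\text{ns/char}} \approx 2500\ \text{characters}, \qquad
\text{speedup} = \frac{2\,\text{ms}}{50\,\mu\text{s}} = 40\times .
\]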
20050173483 Space and Naval Warfare Systems Center, San Diego, CA USA
Integrated Control Strategies Supporting Autonomous Functionalities in Mobile Robots
Sights, B.; Everett, H. R.; Pacis, E. B.; Kogut, G.; Thompson, M.; Jan. 2005; 7 pp.; In English; Original contains color
illustrations
Report No.(s): AD-A432959; No Copyright; Avail: Defense Technical Information Center (DTIC)
High-level intelligence allows a mobile robot to create and interpret complex world models, but without a precise control
system, the accuracy of the world model and the robot’s ability to interact with its surroundings are greatly diminished. This
problem is amplified when the environment is hostile, such as in a battlefield situation where an error in movement response
may lead to destruction of the robot. As the presence of robots on the battlefield continues to escalate and the trend toward
relieving the human of the low-level control burden advances, the ability to combine the functionalities of several critical
control systems on a single platform becomes imperative.
DTIC
Autonomy; Robotics; Robots
20050173532 Objective Interface Systems, Inc., Herndon, VA USA
High-Assurance Security/Safety on HPEC Systems: an Oxymoron?
Beckwith, Bill; Vanfleet, W. M.; Feb. 2005; 8 pp.; In English; Original contains color illustrations
Report No.(s): AD-A433019; No Copyright; Avail: Defense Technical Information Center (DTIC)
To address the need for security in high performance systems, an architecture based on a small separation, or partitioning, kernel was proposed. This architecture, termed the MILS (Multiple Independent Levels of Security) architecture, classifies the components of a system into three layers: the Partitioning Kernel; the Middleware layer (which includes many operating system functions commonly found combined with an OS kernel, as well as code more traditionally termed middleware); and the Application layer. This approach can be implemented and used effectively in high performance systems. In MILS, basic, general-purpose security policies are enforced at lower levels by the Partitioning Kernel and middleware layer. Enforcement of these basic security policies permits the top layer to implement other, application-specific security policies, such as Bell-LaPadula (BLP), Biba, Community of Interest, etc., with confidence that the code that implements these policies will have the characteristics of a reference monitor: Non-bypassable, Evaluatable, Always-invoked, and Tamper-proof (NEAT). The ability
of these systems to transfer data at high speed is not compromised by a MILS design. These concepts are extended to a
collection of MILS nodes called an enclave. The PCS (Partitioning Communication System) provides the high-assurance
secure communication between the MILS nodes in the enclave. The PCS was designed with HPEC systems in mind. The PCS
includes zero-copy semantics for secure communications. Like the Partitioning Kernel, the PCS requires formal methods and
mathematical models to assure correctness. The presentation will describe the performance impact and optimizations of the
PCS on HPEC environments.
DTIC
Information Transfer; Safety; Security
20050173543 Kentucky Univ., Lexington, KY USA
The Manuscript Option Dissertation: Multiple Perspectives
De Jong, Marla J.; Moser, Debra K.; Hall, Lynne A.; Dake, Marcia A.; May 2005; 13 pp.; In English
Report No.(s): AD-A433038; AFIT-CI04-1065; No Copyright; Avail: CASI;
A03, Hardcopy
In the dissertation process, the doctoral candidate designs, conducts, and presents scholarly research that is intended to
generate new knowledge. The traditional dissertation generally consists of several chapters, including an introduction, review
of literature, methods, results, and discussion, but far more dissertations remain unpublished than published. This practice does a disservice to all who participated directly or indirectly in the research, including the graduate, the dissertation committee and advisor, participating individuals or organizations, and the funding agency. An alternate format, the manuscript option dissertation, is
becoming more popular at universities throughout the USA and consists of a series of manuscripts that are either published
or ready for journal submission. The University of Kentucky College of Nursing adopted the manuscript option for the
dissertation in 2002, leaving the decision regarding that option versus a traditional dissertation open to the student and advisor.
This paper describes our experience with the manuscript option dissertation from the perspectives of the program director, the
advisor, the doctoral candidate, and the journal editor.
DTIC
Medical Science; Theses
83
ECONOMICS AND COST ANALYSIS

Includes cost effectiveness studies.
20050170513 Stanford Univ., Stanford, CA, USA
New Business Models for Standard and ASIC Products in the Semiconductor Industry: Competing on Cost and
Time-to-Market
Akella, Ram; Kleinknecht, Jochen; Gillespie, Jaysen; Kim, Byunggyoo; Frederick, Al; 1998 IEEE/SEMI Advanced
Semiconductor Manufacturing Conference And Workshop; [1998], pp. 190-196; In English; See also 20050170458;
Copyright; Avail: Other Sources
Many semiconductor companies in the ASIC business struggle with the new competitive environment, which requires
better and better operational performance. We detail ways of improving their current business model in order to become more
responsive to customers’ orders and more profitable at the same time. Based on a study of customer change order behavior,
we motivate why these companies should base their business and operations on unit volume and not on the degree of
standardization of their products. Furthermore, we suggest to device new contract schemes and introduce the concept of
delayed product differentiation.
Author
Industrial Management; Management Methods; Economic Factors; Commerce; Market Research