
2.11 Safety Instrumentation and Justification of Its Cost


H. M. HASHEMIAN

Industrial processes such as power plants, chemical processes, and manufacturing plants depend on temperature,
pressure, level, and other instrumentation not only to control
and operate the process, but also, and more importantly, to
ensure that the process is operating safely. Normally, for
safety purposes, the output signal from a process instrumentation channel is compared with a set point that is established
based on the process design limits. If the set point is
exceeded, then alarms are initiated for operator action or the
plant is automatically shut down to avoid an accident. In
some cases, more than one instrument is used to measure the
same process parameter to provide redundancy and added
protection in case of failure of the primary instrument. In
some processes such as nuclear power plants, up to four
redundant instruments are used to measure the same process
parameter. For example, the steam generator level is measured with four differential pressure sensors. Often, the same
type of sensor and signal conditioning equipment are used
for these redundant measurements. However, there are plants
in which redundant instruments are selected from different
designs and different manufacturers to provide diversity as
well as redundancy and minimize the potential for a common
mode failure.
When redundant instruments are used, a logic circuit is often implemented in the process instrumentation system to help determine the course of action that should be taken when a process limit is exceeded. For example, when there are three redundant instruments, the process is automatically shut down if two of the three instruments agree that the safe limit is exceeded.
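The two-out-of-three voting logic described above can be sketched as follows; the function name and readings are illustrative, not taken from any particular safety system:

```python
def two_out_of_three_trip(readings, set_point):
    """Return True (trip) if at least 2 of 3 redundant readings exceed the set point."""
    votes = sum(1 for r in readings if r > set_point)
    return votes >= 2

# Example: three redundant pressure readings (bar) against a 150 bar set point.
print(two_out_of_three_trip([148.0, 152.5, 151.0], 150.0))  # two channels exceed -> True
```

The same voting structure generalizes to other arrangements (e.g., one-out-of-two or two-out-of-four) by changing the vote threshold.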
Because industrial processes are inherently noisy due to
vibration, process parameter fluctuations, interferences, and
other causes, electronic or mechanical filters are sometimes
used to dampen the noise and avoid false alarms and inadvertent plant trips. The selection of the filters must be done
carefully to ensure that the filter does not attenuate any transient information that may indicate a sudden or significant
change in a process parameter requiring action to protect the
safety of the plant.
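The filtering trade-off described above can be illustrated with a first-order low-pass (exponential smoothing) filter; the signal and smoothing constant below are illustrative only:

```python
def low_pass(samples, alpha):
    """First-order (exponential) low-pass filter.
    alpha in (0, 1]: smaller alpha gives heavier noise damping but a slower
    response to genuine process transients -- the trade-off noted in the text."""
    filtered = []
    y = samples[0]
    for x in samples:
        y = y + alpha * (x - y)  # move a fraction of the way toward the new sample
        filtered.append(y)
    return filtered

# A real step change at sample 3, with the filter lagging behind it.
noisy_step = [0.0, 0.0, 0.0, 5.0, 5.0, 5.0, 5.0]
print(low_pass(noisy_step, 0.5))
```

Note how the filtered output reaches the new level only gradually: too small an alpha would delay recognition of a genuine transient, which is exactly the concern raised above.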
This section describes the characteristics and costs of
typical instrumentation that is used in industrial processes to
protect the plant and ensure the safety of both plant personnel
and the general public. Improved safety instrumentation has
received special attention in recent decades due to major
industrial accidents that have caused loss of life and property.

© 2002 by Béla G. Lipták


Although major industrial accidents have fortunately been
few, a great deal of effort and money are spent to prevent
these accidents. New plants are now equipped with digital
instrumentation and advanced sensors that take advantage of
21st-century technology and maximize safety. Also, old
plants are upgrading their safety instrumentation systems and
retrofitting them with new equipment, including digital
instrumentation to improve safety and overcome equipment
degradation due to aging. Furthermore, new monitoring, testing, and surveillance activities have been implemented in
some process industries to monitor the health of instrumentation and identify instruments that have anomalous performance. Some of these new monitoring, testing, and surveillance techniques are described in this section and additional
information is given in Reference 1.

SAFETY INSTRUMENTATION CHANNEL


To protect the safety of a process, accurate and timely measurements of key process parameters are usually made on a
continuous basis. The information is provided to a hardware
logic module or software algorithm to determine if and when
action must be taken to ensure safety.
To measure a process parameter, an array of instruments
is typically employed. This array is often referred to as an
instrument channel. Figure 2.11a shows the typical components of an instrument channel in an industrial process.
The channel consists of a sensor such as a resistance temperature detector (RTD) or a thermocouple to measure temperature, or a pressure sensor to measure the absolute or
differential pressure for indication of pressure, level, or flow.
The connection between the sensor and the process is referred
to as the process-to-sensor interface; an example is a thermowell in the case of temperature sensors, and sensing lines
or impulse lines in the case of pressure sensors. Figures 2.11b
and 2.11c show examples of how a thermowell or a sensing
line is typically used in an industrial installation.
As will be seen later, it is important for process safety
to verify that temperature sensors are properly installed in
their thermowell and that sensing lines, which bring the pressure information from the process to the sensor, are not
obstructed. Any appreciable air gap in the interface between

[Figure: block diagram showing sensor (RTD, thermocouple, or pressure sensor) → signal converter → signal conditioning (amplifier, lead/lag, filter) → control/trip comparison → actuation devices (valves, pumps, breakers).]

FIG. 2.11a
Block diagram of a typical instrument channel.

[Figure: a thermowell installation through a pipe or vessel wall (connection head, conduit connection, RTD, fluid stream), and a pressure transmitter installation showing sensing lines, isolation valves, an equalizing valve, root valves, and an orifice plate.]
FIG. 2.11b
A typical sensor-in-thermowell installation.

the temperature sensor and its thermowell, or clogging or
obstruction of pressure sensing lines, can cause measurement
delays with a potential to affect the safety of the process.
As shown in Figure 2.11a, the output of the sensor is
typically connected to a signal conversion or a signal conditioning unit such as a Wheatstone bridge, which converts
the resistance of an RTD or a strain gauge to a measurable
voltage proportional to temperature or pressure, or a current-to-voltage converter, which may be used in a pressure
instrumentation channel. There are sometimes amplifiers in
the instrument channel to increase the signal level and
improve the signal-to-noise ratio, and electronic filters to
dampen any extraneous noise. The next stage in a typical
safety instrumentation channel is a logic circuit to compare

FIG. 2.11c
Two views of typical pressure sensing lines.

the process parameter against a safety set point and ensure
that the process parameter is within a safe limit. If the
process parameter is found to exceed the safety set point,
then actuation systems, such as relays, are used to initiate
an alarm, trip the channel, or trip the plant. Normally, plant
trips are not initiated unless redundant instrument channels
indicate that a safe limit is exceeded.


Designing a Safe Plant

PERFORMANCE CHARACTERISTICS OF SAFETY INSTRUMENTATION
The safety of a process is monitored on an ongoing basis
during operation using instrumentation that measures the key
process parameters. As such, the performance of the sensors
or instrumentation system is very important to safety.
The two most important performance characteristics of
instrumentation are accuracy and response time. The accuracy of an instrument is established by calibration and its
response time is established by an appropriate response time
testing technique. Normally, instruments are calibrated
before they are installed in a process and then periodically
calibrated after installation in the process. Recently, methods
have been developed to verify the calibration of instruments
remotely while the process is operating. Referred to as online calibration verification, these methods are discussed in
Reference 2.
Response time is important in processes where a parameter can experience a sudden and significant change. For
example, in a reactor or a boiler, the temperature or pressure
can experience a large change suddenly under certain operating conditions. In these processes, it may be important for
the process sensors to respond quickly to the change so that
the plant operators or the plant safety system can initiate
timely action to mitigate any consequences of the anomaly.
Because accuracy (calibration) and response time are two of
the most important characteristics of instrumentation, a more
detailed description is provided below for each of these two
characteristics.
Accuracy of Temperature Sensors
The accuracy of a temperature sensor is established through
its calibration. Temperature sensors are normally calibrated
in a laboratory environment and then installed in the process.
After installation, temperature sensors are rarely removed
and recalibrated. This is especially true of thermocouples
because thermocouple recalibration is not usually practical
or cost-effective. Therefore, it is often best to replace thermocouples if and when they lose their calibration. More
information on thermocouple calibration is provided in Reference 3.
RTDs are different from thermocouples in that RTDs can
easily be recalibrated. In fact, it is often better to recalibrate
an RTD than to replace it. This is because RTDs usually
undergo a type of curing while in the process. They mature
beyond the infant mortality stage and reach a stable stage in
which they normally perform well for a long period of time
(e.g., 10 to 20 years).
Industrial temperature sensors are often calibrated in an
ice bath, water bath, oil bath, sand bath, or a furnace. For
calibrations up to about 400°C, the ice bath, water bath, and
oil bath are all that is needed. In these media, temperature
sensors, especially RTDs, can be calibrated to very high
accuracies (e.g., ±0.1°C). Better accuracies can be achieved
for RTDs with additional work or through the use of fixed-point
cells. However, industrial RTDs are not normally
required to provide accuracies of better than ±0.1°C, nor can
they normally maintain a better accuracy for a long period
of time. As for thermocouples, the best calibration accuracy
that can reasonably be achieved and maintained is about
±0.5°C up to about 400°C.
The calibration process normally involves a standard thermometer such as a standard platinum resistance thermometer
(SPRT), or a standard Type S thermocouple to measure the
temperature of the calibration bath. Also, it is through the
standard thermometer that calibration traceability to a national
standard is typically established for a temperature sensor.
The bath temperature is measured with the standard thermometer as well as the temperature sensor under calibration.
This procedure is typically repeated in an ice bath (at about
0°C) and in an oil bath at several widely spaced temperatures
that cover the desired operating range of the sensor (e.g., 100,
200, 300°C). Figure 2.11d shows an example of a temperature
sensor calibration facility in which two oil baths and one ice
bath are shown with connections to a central computer system
that controls the calibration process and records the calibration data.
The results of the calibration are normally presented in
terms of a table referred to as a calibration table. This table
is typically generated by fitting the calibration data to a
polynomial to interpolate between the calibration points. Calibration tables can be extrapolated beyond the calibration
points to a limited degree. A reasonable limit is about 20%,
based on research results (Reference 4).
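The table-generation step described above can be sketched with a least-squares polynomial fit; the calibration points below are illustrative, not real calibration data:

```python
import numpy as np

# Illustrative (not real) calibration data: bath temperature (deg C) vs. RTD resistance (ohms).
bath_temp_c = np.array([0.0, 100.0, 200.0, 300.0])
resistance_ohm = np.array([100.0, 138.5, 175.8, 212.0])

# Fit a quadratic polynomial giving temperature as a function of resistance.
coeffs = np.polyfit(resistance_ohm, bath_temp_c, deg=2)
cal_curve = np.poly1d(coeffs)

# Interpolate between the calibration points to build the calibration table.
table_r = np.arange(100.0, 213.0, 10.0)
table_t = cal_curve(table_r)
print(np.round(table_t[:3], 2))
```

Extrapolating `cal_curve` far beyond the 0 to 300°C calibration range would be subject to the roughly 20% limit noted above.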
In addition to calibration of sensors, the accuracy of
industrial temperature measurements depends on installation
of the sensors and other factors. As such, the overall accuracy
of a temperature instrumentation channel is evaluated based
not only on the accuracy of its initial calibration, but also on
the effect of installation and process operating conditions on
the accuracy. These effects and how to verify the accuracy
of an installed temperature sensor are discussed in References
1 and 4.
Accuracy of Pressure Sensors
As with temperature sensors, the accuracy of a pressure sensor
depends on how well it is calibrated. The procedure for
calibrating pressure sensors is more straightforward than that
for temperature sensors. Pressure sensor calibration involves
actual adjustments to sensor zero and span potentiometers as
opposed to the data recording and data fitting that is required
for temperature sensors.
Typically, a constant-pressure source and a pressure standard are used to calibrate pressure sensors. Depending on the
sensor, its application, and its range, a number of precise
pressures are applied to the sensor while its output is adjusted
to indicate the correct reading. Typically, pressure sensor
calibration is performed at five points covering 0, 25, 50, 75,
and 100% of the span. Sometimes the calibration data are

[Figure: two oil baths and an ice bath, each with its own controller, under a hood and fan, connected to a central data acquisition and analysis computer.]
FIG. 2.11d
Photograph of a precision temperature sensor calibration facility.

recorded with both increasing and decreasing pressure inputs,
and the resulting outputs are averaged to account for the
hysteresis effect.
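The five-point calibration with up/down averaging described above can be sketched as follows; the readings are illustrative, not real sensor data:

```python
# Five-point calibration at 0, 25, 50, 75, and 100% of span, recorded with
# increasing and then decreasing pressure to average out hysteresis.
span_points = [0, 25, 50, 75, 100]  # % of span applied by the pressure standard
up_readings = [0.1, 25.3, 50.4, 75.2, 100.1]    # indicated output, rising pressure (illustrative)
down_readings = [0.3, 25.7, 50.8, 75.6, 100.1]  # indicated output, falling pressure (illustrative)

averaged = [(u + d) / 2 for u, d in zip(up_readings, down_readings)]
errors = [a - p for a, p in zip(averaged, span_points)]
for p, a, e in zip(span_points, averaged, errors):
    print(f"{p:3d}% applied -> {a:6.2f}% indicated (error {e:+.2f}%)")
```

The per-point errors would then be compared against the sensor's accuracy specification, with zero and span potentiometers adjusted as needed.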
Conventionally, deadweight testers have been used in
most industrial processes for the calibration of pressure sensors. Although deadweight testers are still in common use,
newer calibration equipment has become available to streamline the calibration process. Referred to as automated calibration systems, this equipment enables the user to calibrate a pressure sensor in a fraction of the time that is
normally required with conventional equipment. The new
equipment incorporates computer-aided data acquisition, data
analysis, automated report preparation, trending, and control
of the test equipment. Figure 2.11e shows a simplified block
diagram of an automated pressure sensor calibration system.
A problem with calibration of some pressure sensors is
that sensors are sometimes calibrated when the plant or process is shut down or at less than normal operating conditions.
That is, the effect of temperature on the calibration is often
not accounted for. Furthermore, the effect of static pressure
is not accounted for in calibrating differential pressure sensors. These effects may be important and can cause the calibration of a pressure sensor to be different during shutdown
as compared to normal operating conditions.


RESPONSE TIME TESTING


The response time of process instrumentation channels can
be very important to safety. In transient conditions, such as
sudden changes in temperature or pressure, process instrumentation must respond quickly to notify the operators or
initiate automatic safety actions. In some processes, instruments must respond within a fraction of a second to prevent
the process from falling into an unsafe condition.
Typically, the response time of the sensor in an instrument channel is the most important component of the channel
response time. This is because the electronics of an instrument channel are usually fast and are often located in mild
environments of the plant. As such, the electronics typically
suffer less calibration drift and less response time degradation
than sensors that are normally located in or near the process
in a harsh environment. In the past, the common wisdom was
that sensor response time is a relatively constant characteristic
that does not change with the age of the sensor. This assumption
has since been rejected, as research results and test
data gathered over the last two decades have shown that
sensor response time can degrade significantly. In the case
of temperature sensors, response time degradation occurs not
only as a result of the sensor itself, but also from changes in


[Figure: a programmable pressure source with built-in pressure standard connected by hydraulic or pneumatic lines to the pressure transmitter being calibrated, with a computer and printer controlling the test.]

FIG. 2.11e
Block diagram of an automated pressure sensor calibration system.

the sensor-thermowell interface. As for pressure sensors, response
time degradations occur mostly because of blockages and
voids in the sensing lines, although pressure sensors themselves can also suffer response time degradation.
Because sensor response time degradation can occur, it
is important to make measurements from time to time to
verify that the response time of a sensor has not increased to
an unacceptable level. This may be accomplished in a number
of ways such as the two described below.
Response Time Testing of Temperature Sensors
In a laboratory environment, temperature sensor response
time testing is performed by subjecting the sensor to a step
change in input. Typically, a method called the plunge test
is used. The plunge test involves sudden immersion of the
sensor into an environment at a different temperature than
the sensor. For a typical temperature sensor, the plunge test
is performed in room-temperature water flowing at 1 m/s.
This procedure is prescribed by the American Society for
Testing and Materials (ASTM) in Standard E 644 (Reference 5).
Figure 2.11f illustrates the test setup for the plunge test of a temperature sensor.
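The response time extracted from a plunge test is conventionally the time for the sensor output to complete 63.2% of its total step change; a sketch of that calculation, using a made-up first-order transient:

```python
import math

def plunge_response_time(times, outputs):
    """Plunge-test response time: time for the output to complete 63.2% of its
    total step change (the first-order time constant definition)."""
    start, final = outputs[0], outputs[-1]
    target = start + 0.632 * (final - start)
    for t, y in zip(times, outputs):
        if y >= target:
            return t
    return None

# Made-up first-order transient with a 2 s time constant, sampled every 0.1 s.
ts = [0.1 * i for i in range(100)]
ys = [1.0 - math.exp(-t / 2.0) for t in ts]
print(round(plunge_response_time(ts, ys), 2))  # -> 2.0
```

In a real plunge test the recorded transient is noisy and not perfectly first order, so the 63.2% crossing is usually taken from a curve fit rather than from raw samples.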
The plunge test is useful for measurement of response
time of temperature sensors in a laboratory environment.
After the sensor is installed in a process, its response time
can only be measured by in situ testing at process operating
conditions. If the sensor is removed from its thermowell in


the process and plunge-tested in a laboratory, the response
time results could be much different from the actual in-service response time of the sensor when it is installed in the
process. This is due to the effect of process conditions such
as flow and temperature on response time. More importantly,
the response time of a temperature sensor depends on its
installation in its thermowell if a thermowell is used.
The above arguments stimulated the development of a
new technique for in situ response time testing of temperature sensors. The method is referred to as the loop current
step response (LCSR) technique. It is based on sending an
electric current through the sensor leads to cause Joule heating in the sensing element of the sensor. Based on an analysis of the sensor output during the Joule heating or after
the heating is stopped, the response time of the sensor is
identified. This method is well established and approved by
the U.S. Nuclear Regulatory Commission for the measurement of response time of temperature sensors in the safety
systems of nuclear power plants. The details are presented
in Reference 6.
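The actual LCSR analysis uses a multi-mode heat transfer transformation (Reference 6); as a greatly simplified, hypothetical illustration of the final fitting step, a single-time-constant fit to a heating transient might look like this:

```python
import math

def fit_time_constant(times, outputs):
    """Fit a single exponential y(t) = yf*(1 - exp(-t/tau)) by log-linear
    regression; a one-mode simplification, NOT the full LCSR transformation."""
    yf = outputs[-1]  # assume the transient has settled by the last sample
    xs, zs = [], []
    for t, y in zip(times, outputs):
        frac = 1.0 - y / yf
        if frac > 0.05:  # skip points too close to the final value (log blows up)
            xs.append(t)
            zs.append(math.log(frac))
    slope = sum(x * z for x, z in zip(xs, zs)) / sum(x * x for x in xs)  # = -1/tau
    return -1.0 / slope

# Synthetic heating transient with a 1.5 s time constant.
ts = [0.1 * i for i in range(1, 200)]
ys = [1.0 - math.exp(-t / 1.5) for t in ts]
print(round(fit_time_constant(ts, ys), 2))  # -> 1.5
```

A production LCSR analysis additionally corrects for the difference between internally generated (Joule) heating and externally imposed temperature changes, which this sketch omits.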

Response Time Testing of Pressure Sensors


For pressure sensors, the response time is typically measured
using a ramp pressure input. The ramp test signal is applied
to the sensor under test and simultaneously to a fast-response
reference sensor as shown in Figure 2.11g. The asymptotic

[Figure: plunge test rig with a rotating tank of water, hot air blower, timing probe, signal conditioning, and a dual-channel recorder; the recorded step transient is read at 63.2% of its final amplitude A to give the response time.]
FIG. 2.11f
Illustration of equipment setup for laboratory measurement of response time of temperature sensors.

delay between the two sensors is then measured as the response
time of the pressure sensor, as shown in Figure 2.11g. The
details are presented in Reference 7.
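The ramp-test calculation described above can be sketched as follows; for a first-order sensor, the asymptotic delay behind a ramp input equals its time constant (all data below are synthetic):

```python
import math

def asymptotic_ramp_delay(times, reference, test_output, ramp_rate):
    """Response time as the settled time lag of the test sensor behind the
    fast reference sensor under a common ramp input (averaged over the tail)."""
    tail = range(len(times) - 10, len(times))
    lags = [(reference[i] - test_output[i]) / ramp_rate for i in tail]
    return sum(lags) / len(lags)

rate = 5.0  # ramp rate, e.g., psi per second (illustrative)
tau = 0.8   # test sensor modeled as first order with a 0.8 s time constant
ts = [0.01 * i for i in range(1000)]
ref = [rate * t for t in ts]  # the fast reference sensor tracks the ramp
test = [rate * (t - tau * (1.0 - math.exp(-t / tau))) for t in ts]  # first-order lag
print(round(asymptotic_ramp_delay(ts, ref, test, rate), 2))  # -> 0.8
```

Dividing the settled output difference by the ramp rate converts it from engineering units into the time delay plotted in Figure 2.11g.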
Unlike temperature sensors, the response time of pressure
sensors does not depend on the process operating conditions.
Nevertheless, a new method was developed to measure the
response time of pressure sensors as installed in an operating
process. Referred to as noise analysis, the method is based
on analyzing the process fluctuations that often exist at the
output of process sensors while the plant is operating. The details
are presented in Reference 8. The advantage of the noise
analysis technique is that its results provide the response
time of not only the pressure sensor, but also the sensing
lines that bring the pressure information from the process to
the sensor. This is important because any significant clogging
or voids in the pressure sensing line can cause a major delay
in measurement of transient pressure signals with a potential
effect on the safety of the process. Figure 2.11h shows how


the response time of representative pressure sensors may


increase with sensing line blockages as they reduce the sensing line diameter.
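In its simplest form, the noise analysis idea above can be sketched by fitting a first-order model to the sensor's own output fluctuations; this is a heavily simplified illustration, not the spectral/autoregressive method of Reference 8, and the data are simulated:

```python
import math
import random

def noise_time_constant(samples, dt):
    """Estimate a first-order time constant from the sensor's output noise by
    fitting an AR(1) model x[n+1] = a*x[n] + e[n]; then tau = -dt / ln(a).
    A greatly simplified stand-in for full noise analysis."""
    mean = sum(samples) / len(samples)
    x = [s - mean for s in samples]
    a = sum(x[i] * x[i + 1] for i in range(len(x) - 1)) / sum(v * v for v in x[:-1])
    return -dt / math.log(a)

# Synthetic "process noise" as seen through a first-order sensor (tau = 0.5 s).
random.seed(1)
dt, tau = 0.01, 0.5
y, out = 0.0, []
for _ in range(20000):
    y += (-y / tau) * dt + random.gauss(0.0, 1.0) * dt ** 0.5  # driven first-order lag
    out.append(y)
print(round(noise_time_constant(out, dt), 2))
```

The appeal of the technique, as noted above, is that the estimate reflects the installed sensor plus its sensing line, with no test signal injected into the operating process.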
DESIGN, DEVELOPMENT, AND TESTING REQUIREMENTS
Table 2.11i provides examples of some of the key requirements for safety instrumentation. The list is divided into
initial requirements and recurring requirements as described
below. Initial requirements are those to be met in producing
the instruments and recurring requirements are those to be
met in maintaining the instruments.
Initial Requirements
Safety-grade instrumentation shall be designed by qualified
and experienced electrical and mechanical engineers or
equipment designers and constructed by trained personnel

[Figure: a ramp pressure test signal applied simultaneously to a fast reference sensor and the sensor being tested; the response time is read as the asymptotic time offset between the two recorded output ramps.]

FIG. 2.11g
Illustration of equipment arrangement for response time testing of pressure sensors.

[Figure: response time increase (up to about 200%) vs. sensing line blockage (30 to 80% of original inner diameter) for Barton, Foxboro, and Rosemount transmitters.]

FIG. 2.11h
Effect of blockages on dynamic response time of representative pressure transmitters.

using documented procedures. High-quality material shall be
used in construction of the instruments to help provide the
desired performance, reliability, and a long life. The first prototype should be aged by such methods as accelerated aging
techniques and subsequently tested as necessary to establish
the qualified life of the instrument and verify that the instrument is sufficiently reliable for the intended service.
Upon construction, each instrument shall be tested individually under an approved quality assurance (QA) program
and the test results should be documented. These tests should
include calibration, response time measurements, and functionality tests as applicable. The tests must be performed
using appropriate standard equipment with valid calibration
traceable to a recognized national standard.


In addition to qualification tests and factory acceptance
measures mentioned above, safety-grade instrumentation
shall be tested after installation in the process to ensure that
no damage or degradation has occurred during the installation. Examples of measurements that may be made include
loop resistance measurements, insulation resistance measurements, continuity tests, capacitance measurements, cable
testing, response time measurements, calibration checks, etc.
Recurring Requirements
While installed in the process, safety-grade instrumentation
shall be tested periodically to verify continued reliability and
compliance with the expected performance requirements.

A number of online monitoring techniques have been developed over the last decade or so for in situ testing of the performance of process instruments while they remain installed in a process. Several of these techniques have been mentioned in this section, and more are found in the references provided at the end of this section.
Recently, in addition to instrument performance verification, tests are performed to verify the condition of cables and cable insulation materials. For example, the time domain reflectometry (TDR) technique is now in common use for testing cables in a variety of applications. This test is based on sending a signal through the cable and measuring the signal reflection. Based on the characteristics of the reflected signal, one can identify changes in the cable characteristics and verify the condition of the conductor and insulator of the cable. By using computer-aided testing, these types of measurements can be performed precisely, accurately, and effectively. Figure 2.11j shows typical results of TDR testing of cables leading from instrument cabinets in the control room of a plant to an RTD in the field.
The frequency of testing of safety instrumentation depends on the instrument, its performance characteristics, and its importance to process safety. Some instruments may require testing on a monthly or quarterly basis, and others would only need testing once every operating cycle or longer. Performance trending is an effective means for determining objective testing periods. Recently, reliability engineering techniques have been used to develop risk-based maintenance programs. These programs are used to determine the importance of instruments to plant safety and to arrive at objective maintenance schedules.

TABLE 2.11i
Example of General Requirements for Safety-Grade Instrumentation

Initial Requirements
Good design and construction
High-quality construction material
Good calibration (accuracy): achieved by (1) precision laboratory calibration of each instrument prior to installation into the process and (2) appropriate steps taken during installation and use to minimize errors
High stability in calibration so that the sensor maintains its accuracy for a long period of time
Fast dynamic response time: (1) achieved by designing the system for fast response and (2) verified by laboratory tests of the prototype sensor
Reliability and longevity
Good insulation resistance, robust cabling, and high-quality connectors

Recurring Requirements
Periodic calibration on the bench, or calibration checks and calibration verifications in situ while the instrument remains installed in the process, at operating or shutdown conditions as appropriate
Periodic response time measurements using in situ response time testing techniques as much as possible
Verification of the condition of the process-to-sensor interface (thermowell, sensing line, etc.)
Testing of cables and connectors using in situ cable testing technology
Insulation resistance and capacitance measurements
Other electrical measurements (e.g., TDR)

SAFETY INSTRUMENTATION COSTS

The cost of a safety instrumentation channel depends on the cost of the sensors, the cost of the instrument channel, and maintenance costs. These cost components, their typical prices, and the justification of their costs are discussed below.
Cost of Sensors
Because of high accuracy, fast dynamic response, and high
reliability requirements, the instrumentation that is used to
ensure the safety of a process could be much more expensive
than conventional instrumentation. For example, a generalpurpose RTD for a typical industrial temperature measurement application would normally cost no more than about
$300.00 for a good-quality sensor. (All prices in this chapter

[Figure: TDR trace (reflection coefficient vs. distance in feet, out to about 900 ft) for a cable running from a control-room connection panel through the reactor containment penetration to an RTD in the field.]

FIG. 2.11j
Example of a TDR trace for an RTD circuit.

[Figure: a smart sensor comprising the sensor, an analog-to-digital converter, a microprocessor (sensor linearization, reranging, transfer function, engineering units, damping, diagnostics, communications), nonvolatile memory (linearity and range constants, transmitter configuration), a digital-to-analog converter, and a communications unit on the current loop to existing plant instrumentation.]

FIG. 2.11k
Typical components of a smart sensor.

are in U.S. dollars in the year 2002.) This compares with


about $3000.00 for a safety-grade temperature sensor with
its own calibration table, response time test results, etc. In
nuclear power plants where safety is vital, the cost of plant
protection instrumentation is typically very high. For example,
a set of 40 RTDs and their matching thermowells for a typical
safety-related installation can cost as much as $500,000.00.
This cost usually includes a specific calibration table for each
RTD, response time test certificates, QA documentation,
nuclear qualification test results, sensor installation services,
post-installation testing, etc. A major portion of the high cost
is for nuclear safety-related qualification, but the quality of
the sensors is also superior. Reference 4 describes a research
project in which commercial-grade temperature sensors were
compared with nuclear-grade temperature sensors. The sensors in the research project were aged under typical process
operating conditions and then tested to determine their performance after aging. The results illustrated that the nuclear
grade sensors were at least twice as good as the commercial-grade sensors in terms of performance.
As for pressure sensors, the cost is normally about
$1000.00 or less for a general-purpose industrial pressure
sensor. The same type of sensor for a nuclear safety-related
application could cost as much as $10,000.00. There is, of
course, a difference between the quality, longevity, and reliability of the two sensors, although a major portion of the
extra cost is for nuclear safety qualification.
Pressure sensors for safety-related applications in processes other than nuclear power plants would not cost as much,
although a higher cost would be justified for the safety benefit,
longevity, and good performance that is achievable with high-grade sensors. In addition to safety, high-quality sensors help
with plant efficiency and economy. For example, the lower cost
of maintenance of high-quality sensors over the life of the plant
can easily compensate for the extra cost of the sensor.


Recently, smart pressure sensors have become available


that provide superior performance at very reasonable costs.
These sensors are becoming very popular in many industries,
including the nuclear power industry where smart sensors are
being qualified by some utilities for nuclear safety-related
applications. Figure 2.11k shows typical components of a
smart sensor.
Cost of Instrument Channel
The overall cost of a safety-grade instrument channel could
range up to about $200,000.00, including not only the sensor,
but also the cables, electronics, design, and the labor to install
and test the channel. As such, the cost of the sensor is typically a small portion of the cost of the total instrument channel.
Because the sensor is the most important component of
an instrument channel and is usually the one that resides in
the harsh environment in the field, its high cost is readily
justified. Because replacing a sensor is typically very expensive, it is often important to use high-quality and reliable
sensors, regardless of their cost, to minimize the frequency
of replacement.
Recently, digital instrumentation systems have become
available with many advantages over the analog equipment.
Digital instrumentation systems are relatively drift free,
have better noise immunity, and are much more versatile
than analog equipment. They are, however, more expensive, experience subtle failures, and are susceptible to error
due to environmental effects such as electromagnetic and
radio-frequency interference (EMI/RFI). In addition, and
more importantly, digital instrumentation is prone to common mode failures because of the use of the same software
in redundant equipment. The fear of common mode failures
is the chief reason digital instrumentation is subject to
stringent licensing requirements for use in process safety


systems. Furthermore, obsolescence is more of a concern
with digital equipment than with analog equipment. That is,
rapid advances in digital technologies could make it difficult for users to purchase spare parts and obtain support
for digital equipment as it ages in a process.

Maintenance Costs

Although the cost of components in an instrument channel can add up to a large sum, it is really the maintenance costs that can overshadow the cost of the components. For example, a safety system instrument channel must be calibrated periodically, response time tested, maintained, and sometimes upgraded or replaced during the life of the plant. The cost of these activities over the life of the plant can exceed the original cost of the equipment itself. Therefore, any extra investment in the purchase of high-quality instruments is often justified by the savings that would result in the cost of maintenance, performance verification, or replacement.
In the past, instrument maintenance and performance verification were typically performed manually, one channel at a time. Today, automated equipment and techniques have been developed for predictive maintenance of instruments and for instrument performance verification that can test several channels at one time. Some of the techniques, such as online calibration verification methods and in situ response time testing techniques, were mentioned earlier. The cost of test equipment and implementation of these techniques ranges from $100,000.00 to $500,000.00, depending on the scope of the implementation. The cost is justified considering the time that it would take to perform the tests manually on one instrument at a time. Because this new test equipment often provides for testing of multiple instruments simultaneously and reduces the test time, it can help reduce plant maintenance costs and outage time.

References

1. Hashemian, H. M. et al., Advanced Instrumentation and Maintenance Technologies for Nuclear Power Plants, U.S. Nuclear Regulatory Commission, NUREG/CR-5501, August 1998.
2. Hashemian, H. M., On-Line Testing of Calibration of Process Instrumentation Channels in Nuclear Power Plants, U.S. Nuclear Regulatory Commission, NUREG/CR-6343, November 1995.
3. Hashemian, H. M., New Technology for Remote Testing of Response Time of Installed Thermocouples, U.S. Air Force, Arnold Engineering Development Center, Report Number AEDC-TR-91-26, Volume 1, Background and General Details, January 1992.
4. Hashemian, H. M. et al., Aging of Nuclear Plant Resistance Temperature Detectors, U.S. Nuclear Regulatory Commission, Report Number NUREG/CR-5560, June 1990.
5. ANSI/ASTM E 644-78, Standard Methods for Testing Industrial Resistance Thermometers, Philadelphia: American Society for Testing and Materials, 1978.
6. Hashemian, H. M. and Petersen, K. M., Loop Current Step Response Method for In-Place Measurement of Response Time of Installed RTDs and Thermocouples, in Seventh International Symposium on Temperature, Vol. 6, Toronto, Canada: American Institute of Physics, May 1992, 1151-1156.
7. Hashemian, H. M. et al., Effect of Aging on Response Time of Nuclear Plant Pressure Sensors, U.S. Nuclear Regulatory Commission, NUREG/CR-5383, June 1989.
8. Hashemian, H. M., Long Term Performance and Aging Characteristics of Nuclear Plant Pressure Transmitters, U.S. Nuclear Regulatory Commission, NUREG/CR-5851, March 1993.
