STEPHEN R. CHITWOOD
George Washington University
Over the past decade there has been widespread criticism of the federal government for poor management. In response to the growing censure, there have been attempts, in the past several years, to reform basic decision processes, including planning, budgeting, evaluation, personnel management, procurement, and contracting. Federal agencies have also been subject to numerous personnel ceilings, hiring freezes, and special budget controls in areas including travel expenditures, paperwork, and ADP. Recently, of course, the sizable budget reductions proposed by the administration and approved by Congress have created an austere, cutback environment in most civilian agencies.
EVALUATION REVIEW, Vol. 8, No. 3, June 1984, 283-305
© 1984 Sage Publications, Inc.
The purpose of management analysis studies is to provide input to management decision making. The management analysis function is supposed to bring expertise in management, analysis, and evaluation to complex management decisions (OPM, 1972a: 3; 1972b: 1).
Several reasons have been cited for the limited use of evaluation results: study findings are not available in time for decision making; answers are provided to questions for which managers have no interest; political and funding issues are often more important than evaluation results; and evaluation findings are not communicated to agency personnel in ways that they can be used.
The issue at this time is not the search for a single formula of utilization success, nor the generation of ever-longer lists of possible factors affecting utilization. The task for the present is to identify and define a few key variables that may make a major difference in a significant number of evaluation cases.
Although the reasons for this are not fully understood, some hypotheses may be stated: (a) these nonrigorous styles better fit the decision-making styles and needs of administrators; (b) there is greater pressure on corrections for system improvement than for client improvement, and these studies provide adequate rationales for system change; (c) in times of rapid change, conditions are not favorable for the use of strong research designs; and (d) correctional administrators have not yet supported rigorous designs to the extent required to make them generally effective.
Mark Van de Vall has studied the effect of methodology and user participation on the utilization of applied social science and social policy findings. Van de Vall and Bolas (1979) have concluded that the impact of social policy research upon organizational decisions is higher when the research sponsor and research consumer are identical or closely linked, rather than consisting of two separate and independent organizations.
Van de Vall reached a similar conclusion from an analysis of research use in industrial settings:

The more the methodological mixture of applied social research favors qualitative methods, the more intensively the projects are utilized in company policies.
Macro factors, such as turnover of personnel or organizational contraction or expansion, can affect the acceptability of recommendations.
Micro factors, the focus of this effort, include dimensions such as
methodology, the way decisions were made about how the study would
be conducted, the kind of information (data) collected, who collected
the information, how the information (data) was analyzed and interpreted, and how recommendations were developed. Micro factors also
include the characteristics of the kind of study conducted such as the
purpose of the study, the topical content, the initiator, and the retrospective or prospective emphasis. The analyst may be in a position to control or to alter these micro factors.
This study explores the relationship among the kinds of management evaluations, the ways that they are conducted, and the acceptance of recommendations by decision makers. To examine this question, three typologies, or classification schemes, have been established relating to (1) the nature of the study, (2) the methodology used, and (3) the interpersonal processes used in the effort.
(see Figure 1).
The conduct of management evaluations may be viewed in terms of "content" and "process." The second and third typologies deal with these areas. For the purposes of this analysis, "content activities" are the analytic methodologies employed in the study. "Process" pertains to the interpersonal processes used and is concerned with who is involved in making decisions about how to conduct the evaluation and who carries out the study.
METHODOLOGY
An exploratory, descriptive approach was selected for this research
because little theory has been developed about factors affecting study
acceptance. The research methodology applied involves a series of
structured, multiple case studies. Information for the cases was gathered
by interviews, questionnaires, and examination of study reports.
The twelve cabinet-level civilian departments (with all of their bureaus) and three independent agencies, the General Services Administration (GSA), the National Aeronautics and Space Administration (NASA), and the Veterans Administration (VA), were the organizational entities from which particular MA units were chosen. GSA, NASA, and VA were included from among the independent agencies because they have relatively large numbers of management analysts.
LIMITATIONS OF THE RESEARCH
There are a number of limitations to this research.
To be sampled, a study had to meet several criteria: the purpose of the study should have been to assist in decision making; the study should be completed and sufficient time should have elapsed for information about the acceptance of recommendations to be available; and the study should have been of a substantive subject large enough in scope.
300
301
The largest study meeting these criteria, in terms of analyst time, was selected for examination.
A minimum of three interviews was conducted in each MA unit. The first interview was conducted with the head of the MA unit to get an overview of the unit and to select one study for detailed examination. Second, a structured interview with the analyst who conducted the study was carried out. In most cases, it was necessary to conduct two interviews with each analyst. A copy of the study report was reviewed before the interview with the analyst(s) who conducted the study. Finally, a decision maker responsible for accepting the recommendations was interviewed.
302
Some studies were done for more than one decision maker at more than one organizational level. For example, a study was initiated by an associate director of administration, but final approval was the responsibility of the bureau administrator. Studies done for more than one decision maker were usually initiated by a lower-level decision maker while final approval rested with the higher-level decision maker. These studies appeared to have a lower-than-average level of acceptance. Studies of larger-scale questions that were often complex and difficult to define had a lower level of acceptance than studies of more focused issues.
Studies that used advanced quantitative methodologies, such as inferential statistics, correlation, regression, or experimental designs, appeared to have lower levels of acceptance if more qualitative methods were not also used. When quantitative methods were combined with descriptive approaches, levels of acceptance appeared to be at least average. It should be noted that our sample of studies using advanced quantitative methods was small. Adams (1975) and Van de Vall et al. (1976) have also found that the use of simple data analysis techniques can result in higher acceptance.
REFERENCES
ADAMS, S. (1975) Evaluative Research in Corrections: A Practical Guide. U.S.
Department of Justice, Washington, DC: Government Printing Office.
COX, G. B. (1977) "Managerial Style: implications for the utilization of program
evaluation information." Evaluation Q. 1(August): 499-509.
LEVITON, L. C. and E.F.X. HUGHES (1981) "A review and synthesis of research on
utilization of evaluations." Evaluation Rev. 5: 525-545.
U.S. Office of
(April/May/June): 172-173.
WALLER, J. D. Developing Useful Evaluation Capability: Lessons from the Model Evaluation Program. U.S. Department of Justice. Washington, DC: Government Printing Office.
YOUNG, C. J. (1978) "Evaluation utilization." Presented at the Evaluation Research Society Second Annual Meeting, Washington, DC, November 2-4.
Ray C. Oman, Ph.D., a supervisory management analyst, is a Branch Chief in the Plans Policy Division of the Management Systems Support Agency, Department of the Army. He has graduate degrees from Pennsylvania State University and George Washington University, specializing in program analysis and evaluation, management, and finance and economics. One of his current research interests is the relationship between the methodology and processes used in management studies and the acceptance of findings by decision makers.