
International Journal of Forecasting 26 (2010) 144-161

www.elsevier.com/locate/ijforecast

Forecasting task-technology fit: The influence of individuals, systems and procedures on forecast performance
Carlo D. Smith a,*, John T. Mentzer b,1
a Operations and Supply Chain Management, Central Washington University, 400 E. University Way, Ellensburg, WA 98926, United States
b University of Tennessee, 310 Stokely Management Center, Knoxville, TN 37996, United States

Abstract

This study establishes and tests a theoretically-based model of forecasting practice. Forecasting Task-Technology Fit (FTTF) addresses the relationship between forecasting support systems, the procedures that guide forecast creation, and their fit with the capabilities of the forecasting support system user. An analysis of survey responses collected from 216 forecasting practitioners confirms a positive relationship between specific forecasting support system characteristics and the system users' perceptions of system quality and access. The results also support a positive relationship between the system users' perceptions of the quality of, and their access to, procedures that guide forecast creation and their perceptions of system quality and access. Finally, the results confirm a positive relationship between the users' assessment of system quality and access and a dependent variable measuring forecast performance.
© 2009 International Institute of Forecasters. Published by Elsevier B.V. All rights reserved.

Keywords: Forecasting support systems; Forecasting management; Sales forecasting; Task technology fit; Forecasting practice

* Corresponding author. Tel.: +1 509 963 2032.
E-mail addresses: cdssan2@gmail.com (C.D. Smith), jmentzer@utk.edu (J.T. Mentzer).
1 Tel.: +1 865 974 1652.
0169-2070/$ - see front matter © 2009 International Institute of Forecasters. Published by Elsevier B.V. All rights reserved.
doi:10.1016/j.ijforecast.2009.05.014

1. Introduction

Demand forecasts are employed in a wide range of business operations, including inventory management, transportation, distribution, replenishment, production, maintenance and supplier collaboration (Fildes, Goodwin, & Lawrence, 2006; Küsters, McCullough, & Bell, 2006). When used effectively, improved forecasting performance helps organizations and supply chains to adapt to changing market conditions and improve operating performance (Fildes & Beard, 1992; Gardner, 1990; Wacker & Lummus, 2002). When performance degrades, companies frequently rely on inventory assets, expedited services and other expensive actions to support operations and maintain service (Armstrong, 1988; Winklhofer, Diamantopoulos, & Witt, 1996). In the worst cases, poor forecasting performance can contribute to major financial losses. Forecasting missteps at Cisco Systems were believed

to have contributed to a multi-billion-dollar write-off of products and components during the dot-com bust in 2000 (Berinato, 2001).

Researchers have long recognized the significant role of forecasting in business practice, and forecasting journals have emphasized the need to investigate management factors that influence model development and implementation (Fildes, 2006; Makridakis, 1996; Winklhofer et al., 1996). Since its founding, the International Journal of Forecasting has published two editorials calling for more research into forecasting practice (Lawrence, 2000; Schultz, 1992). The volume of such research, however, has remained limited (Fildes, 2006). Schultz (1984), for example, investigated model implementation and the structures and processes related to model application. Reviewing practice-related research, Winklhofer et al. (1996) cited 35 surveys and six case studies that offered insights related to forecast design, selection/specification and evaluation.

Few studies have presented a theoretical grounding to explain how factors and relationships may influence forecast development and application (Davis & Mentzer, 2007; Fildes & Hastings, 1994; Fildes et al., 2006; Mentzer, Bienstock, & Kahn, 1999; Winklhofer & Diamantopoulos, 2003). Proposed frameworks have considered the influence of market environment, organization, management support, and information systems on forecast performance and application. They have addressed forecasting at a departmental and individual unit of analysis. They have yet, however, to provide empirical evidence to confirm such relationships. The lack of an empirically tested model of forecasting development and application represents a significant gap in efforts to understand and improve forecasting practice.

One factor consistently represented across earlier models of forecasting is conceptualized in one form or another as the forecasting system. Schultz (1984) identified system format and system quality in his forecasting model implementation profile. Fildes and Hastings (1994) outlined what they considered to be the characteristics of an 'idealized' (quotes original) forecasting system to guide a discussion of organizational factors in market forecasting. Mentzer et al. (1999) addressed systems as being one of the four dimensions of forecasting management that impact performance. Davis and Mentzer (2007) included information technology as part of an information logistics component of their sales forecasting management (SFM) framework.

The present study emphasizes the role of systems in forecasting. It integrates research drawn from the forecasting, information systems and decision support systems literature to address the relationship between forecasting support system effectiveness and forecast performance. Fildes et al. (2006) employed the term forecasting support system (FSS) to describe a forecasting system as a type of decision support system (DSS) used to prepare forecasts. Their description of system characteristics included database capabilities used to collect demand history and related information, a range of forecasting techniques, and the ability to incorporate managerial judgments.

The study also benefits from a stream of research concerned with improving information and decision support system effectiveness. It draws on the theory of Task-Technology Fit (TTF) (Goodhue, 1995; Goodhue & Thompson, 1995) to evaluate the relationship between the skills and abilities of individuals involved in forecasting, the task and system characteristics that support forecast creation, and the resulting forecast performance, defined as an assessment of forecast accuracy.

The next section reviews the literature concerned with forecasting models and systems, as well as information systems research. The review provides a theoretical grounding to explain how factors influence the effectiveness of forecast support systems. Section 3 develops a series of hypotheses that address the relationship between the capabilities of individuals involved in forecasting, the tasks and systems that support forecasting, and forecast accuracy. Section 4 discusses the methodology for testing the hypotheses. Section 5 discusses the results, and Section 6 addresses conclusions and implications for forecasting practice and research.

2. Relevant literature

Research investigating the role of systems in forecasting practice has described the system characteristics which are useful to forecast applications in areas such as production and inventory control (Fildes & Beard, 1992), vendor managed inventory (Nachiappan, Jawahar, Parthibaraj, & Brucelee, 2005), and parts management (Snyder, 1993).

Table 1
Studies addressing forecasting system characteristics.

Studies compared (and research method): Fildes and Beard (1992), technique application; Mentzer and Schroeter (1994), case study; Mentzer and Bienstock (1998), benchmark; Mentzer and Kent (1999), case; Mentzer et al. (1999), benchmark; Fildes et al. (2006), case.

System capabilities addressed across these studies: client/server; central database; electronic linkage/customers; developer/user access; item-level time series; history/forecast aggregation; multiple methods; time series; regression; qualitative; performance reporting; user-defined reporting; note capabilities; easy to use; process driven; graphics; item/data flagging; experimental module; automated selection.

It has addressed features for facilitating management judgment (Moriarty, 1985) and event information (Webby, O'Connor, & Edmundson, 2005) during forecast creation.

Forecasting research has also identified general system features and capabilities which are considered important to forecast creation and implementation (Fildes et al., 2006; Fildes & Beard, 1992; Mentzer et al., 1999; Mentzer & Bienstock, 1998; Mentzer & Kent, 1999; Mentzer & Schroeter, 1994). These include capabilities that can help individuals with little knowledge of forecasting techniques create forecasts (Fildes et al., 2006; Küsters et al., 2006). While such capabilities could be considered helpful to practice, if they lead to system misuse they may also contribute to poor forecast performance (McCarthy, Davis, Golicic, & Mentzer, 2006; Mentzer & Cox, 1984; Mentzer & Kahn, 1995). Concern for such circumstances may have contributed to the characterization of forecasting systems as a 'black box' (McCarthy et al., 2006).

The first column of Table 1 offers a list of capabilities reflected in articles that present normative discussions of the system requirements expected to help improve forecasting. The studies include case analyses of the forecasting practices in one or more companies (Mentzer & Kent, 1999; Mentzer & Schroeter, 1994) and in a particular application domain (Fildes & Beard, 1992). They draw insights based on benchmark analyses across organizations (Mentzer & Bienstock, 1998; Mentzer et al., 1999), and, in one case, draw on elements discussed as part of a framework for guiding forecasting support system design (Fildes et al., 2006).

These studies offer a number of system recommendations, including the ability to process time series data at the lowest unit of analysis and aggregate data based on different criteria. Such a capability provides individuals in different functions with a means of reviewing forecasts at the level and unit of analysis where they are most comfortable. For example, production or logistics frequently rely on the item level or item by location level unit demand, whereas marketing may be more interested in the demand by product category, and finance may be more interested in the forecasted dollar demand rather than units.
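To make the aggregation capability concrete, the sketch below (an illustration, not part of the study; the column names and the small demand table are assumptions) shows how one item-by-location demand history can be rolled up into the unit, product-line and dollar views described above, using pandas.

```python
# Illustrative sketch: rolling up SKU-by-location demand into views used by
# different functions. All data and column names are hypothetical.
import pandas as pd

demand = pd.DataFrame({
    "product_line": ["A", "A", "A", "B"],
    "sku":          ["A-1", "A-1", "A-2", "B-1"],
    "location":     ["DC-East", "DC-West", "DC-East", "DC-East"],
    "month":        ["2009-01"] * 4,
    "units":        [120, 80, 40, 200],
    "unit_price":   [9.5, 9.5, 12.0, 4.25],
})

# Logistics/production view: unit demand at the lowest (SKU-by-location) level.
skul_view = demand.groupby(["sku", "location", "month"])["units"].sum()

# Marketing view: unit demand rolled up to the product line.
line_view = demand.groupby(["product_line", "month"])["units"].sum()

# Finance view: the same history expressed in dollars rather than units.
demand["dollars"] = demand["units"] * demand["unit_price"]
dollar_view = demand.groupby(["product_line", "month"])["dollars"].sum()

print(skul_view, line_view, dollar_view, sep="\n\n")
```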

Recommendations suggest that systems be able to draw from alternative forecasting techniques and select the best technique based on the unique demand characteristics of the product or service. This is particularly valuable as users become responsible for greater numbers of items that must be reviewed and reforecasted on a regular basis (Fildes et al., 2006). In the wholesale and retail industries, for example, individuals can be responsible for forecasting tens of thousands of items on a monthly or weekly basis.
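As an illustration of this kind of automated technique selection (a sketch under assumed data and deliberately simple candidate methods, not a description of any particular FSS), the following Python code scores a few basic techniques on a holdout window and keeps the one with the lowest MAPE:

```python
# Illustrative sketch: choose a forecasting technique per item by holdout accuracy.
import numpy as np

def naive(history, h):                 # repeat the last observation
    return np.repeat(history[-1], h)

def moving_average(history, h, window=3):
    return np.repeat(history[-window:].mean(), h)

def ses(history, h, alpha=0.3):        # simple exponential smoothing
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return np.repeat(level, h)

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

def select_method(series, holdout=4):
    """Fit each candidate on the early history and score it on the holdout."""
    train, test = series[:-holdout], series[-holdout:]
    candidates = {"naive": naive, "moving_average": moving_average, "ses": ses}
    scores = {name: mape(test, f(train, holdout)) for name, f in candidates.items()}
    return min(scores, key=scores.get), scores

demand = np.array([102, 98, 110, 105, 99, 108, 112, 104, 107, 111, 109, 115])
best, scores = select_method(demand)
print(best, scores)
```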
Other recommendations include the ability to present data in graphic and tabular formats that allow for user reviews in a manner consistent with their approach to analysis, give the ability for forecast experimentation outside the production system, and provide a means of adjusting history to address unexplained demand or anticipate future demand. In addition, many indicate the need to be able to flag and record reasons for forecast changes.

To assess performance, recommendations include the automatic capture of common performance metrics such as mean squared error or mean absolute percent error, and encourage metrics related to the performance of functions that use forecasts. Some suggest that measures be captured at any point during the process where forecasts may be altered. Doing so can pinpoint where changes are made that affect performance, and may help to identify the need for training or process changes.
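A minimal sketch of such stage-by-stage measurement is shown below (the demand figures and the two-stage process are assumptions for illustration); it records the statistical forecast, the judgmentally adjusted forecast and the actual demand, then reports MAPE after each step so the effect of the adjustment can be isolated:

```python
# Illustrative sketch: capture accuracy at each point where the forecast is altered.
import numpy as np

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

# One row per item/month: system (statistical) forecast, forecast after a
# manager's adjustment, and the demand actually observed. Values are made up.
log = {
    "statistical": np.array([100, 140, 95, 210]),
    "adjusted":    np.array([110, 120, 95, 240]),
    "actual":      np.array([104, 131, 90, 225]),
}

for stage in ("statistical", "adjusted"):
    print(f"MAPE after {stage} step: {mape(log['actual'], log[stage]):.1f}%")
# Comparing the two figures shows whether the adjustment step added value or
# degraded accuracy, pointing to possible training or process changes.
```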
The majority of these studies prescribe system features and capabilities that should lead to improved forecast performance when implemented. They have, however, offered little evidence of the linkage between system capabilities, their application and forecast performance. Descriptive studies of forecasting practice suggest that system advances may not be leading to improved performance (McCarthy et al., 2006).

Acknowledging this continuing challenge to improving forecasting performance, Fildes et al. (2006) employed decisional guidance and restrictiveness (Silver, 1991) to suggest how the forecasting system design can improve the likelihood that appropriate techniques will be applied to demand information and result in improved forecast performance. Their system design emphasized guidance over restrictiveness, asserting that guidance is more consistent with a flexible system and promotes features that may be easier to use. Fildes et al. (2006) adopted the concept of ease of use as a key to user acceptance of forecasting tools. Their approach drew on the Technology Acceptance Model (TAM) in information systems research (Venkatesh & Davis, 2000).

The Technology Acceptance Model (Davis, 1986; Davis, Bagozzi, & Warshaw, 1989; King & He, 2006) adapts relationships outlined in the Theory of Reasoned Action (TRA) (Ajzen & Fishbein, 1980; Fishbein & Ajzen, 1975). TAM posits that the perceived usefulness and perceived ease of use of a technology help to establish an individual's attitude toward the technology, their intention to use it, and the resulting use behavior (Davis et al., 1989). TAM2 (Venkatesh & Davis, 2000), which was referenced by Fildes et al. (2006), conceptualizes perceived usefulness in terms of system results demonstrability, along with ease of use, as an antecedent to system use behavior. Forecasting, they noted, offers a means of demonstrability, using accuracy as a measure of performance.

TAM and TAM2 focus on use as the dependent variable. While system use is requisite to obtaining benefit from a FSS, Delone and McLean (1992) acknowledged the limitations of focusing solely on use as a measure of system success.

'Simply saying that more use will yield more benefits, without considering the nature of this use, is clearly insufficient. Researchers must also consider the nature, extent, quality, and appropriateness of the system use' (Delone & McLean, 2003, p. 16).

This observation is relevant to forecasting. Schultz (1984) acknowledged that forecasting models (a component of forecasting systems) may not improve organizational effectiveness if used incorrectly.

TAM has received considerable attention in information and decision support systems research. An alternative stream of research found in the same domain draws theoretical support from cognitive cost-benefit research (Christensen-Szalanski, 1980; Creyer, Bettman, & Payne, 1990) and organizational contingency theory (Van de Ven & Delbecq, 1974; Van de Ven & Drazin, 1985). Task-Technology Fit (TTF) (Goodhue, 1995) considers the interactions between tasks, defined as actions carried out by individuals to turn inputs into outputs; technologies, defined as the tools used by individuals in carrying out their tasks; and an individual's capabilities. The concept is based on the proposition that:

'. . . (a) an individual's performance is affected by how well technology options fit his or her task requirements, (b) fit operates through its impact on task processes, and (c) individuals can evaluate fit and choose technologies on that basis' (Goodhue, 1995, p. 1830).

Applied to forecasting practice, TTF suggests that individuals who use forecasting support systems are able to evaluate the fit between the tasks that guide forecast development and the systems that support forecasting. To the extent that these factors coincide, use of the system will lead to higher performance, which, in the case of forecasting, may be evaluated based on forecast accuracy.

Fildes et al. (2006) outlined technology features believed to support improved forecasting (see Table 1). They addressed forecasting tasks that they described as involving 'the derivation of the statistical (model-based) forecasts, and judgmental adjustment of the statistical forecast to take into account special factors and other available information' (p. 353). Their consideration for decisional guidance in forecasting system design acknowledges the importance of user capabilities as outlined in TTF. Decisional guidance offers a system-centric approach to fitting the capability of the users (particularly poorly trained users) with the tasks and systems that support forecasting. Where the dependent variable in TAM is the behavior of system use, the dependent variable in TTF is concerned with user perceptions of performance. In forecasting, such perceptions would receive grounding from tangible measures of accuracy (Fildes et al., 2006).

Goodhue (1995) confirmed that users are able to evaluate the task-technology fit of systems they employ. Goodhue and Thompson (1995) included system utilization as a behavioral construct in their Technology-to-Performance Chain (TPC) model. Their study confirmed a positive relationship between TTF and performance, as well as a relationship between utilization and performance, and indicated that TTF predicted performance better than utilization alone. The findings did not support a relationship between TTF and utilization.

Dishaw and Strong (1999) evaluated TTF and TAM both separately, and as an integrated model with technology utilization as the dependent variable. Their findings indicated the strongest support for an integrated model (51% of utilization variance explained), followed by support for TTF (41% of utilization variance explained), and finally TAM (35% of utilization variance explained). The context of their study appears more relevant to forecasting. Where Goodhue (1995) and Goodhue and Thompson (1995) investigated TTF and TPC, respectively, applied to general system applications, Dishaw and Strong (1999) focused on a specific application area involving the use of CASE tools for software maintenance. The next section adopts TTF as a means of explaining how forecasting support systems may contribute to improvements in forecast performance.

Fig. 1. The forecasting task-technology fit model.

3. Evaluating forecasting task-technology-fit (FTTF)

Fig. 1 illustrates the task-technology fit model applied to forecasting support systems and their impact on forecast performance. Within the context of forecasting, the hardware and software used to analyze demand related data and produce forecasts represent the technology characteristics of the forecasting support system, and task characteristics are represented by the procedures and activities that guide forecast development. Forecasting task-technology fit (FTTF) is based on the evaluation of the individuals responsible for preparing forecasts. Finally, forecasting performance is represented based on measures of forecast accuracy. Each of these elements and their relationships are explained in the following subsections.

3.1. Forecasting support systems

The technology characteristics of forecasting support systems are the hardware, software and network features and capabilities of the system.

Goodhue (1995) suggested that users are able to compare the characteristics of the systems they use with those prescribed as part of an idealized system. This perspective, known as profile deviation, conceptualizes the user's system assessment based on its adherence to a specified profile (Venkatraman, 1989; Zigurs & Buckland, 1998). Drawing on the characteristics reflected in normative models of forecasting support systems (Table 1), the profile of an idealized forecasting system should include attributes such as detailed demand capture and a means of aggregating demand; the ability to change the unit of measurement based on user requirements; the ability to recommend techniques; and the ability to display demand and forecast information in tabular and graphic format. Such systems should be able to capture performance information from the original forecast and at any point where forecast modifications occur. They should provide users with a means of recording reasons for demand or forecast modifications.

Goodhue (1995) relied on a separate panel of information system experts to determine the technology characteristics of the general systems being evaluated. Goodhue and Thompson (1995) employed dummy variables to assess technology characteristics. The evaluation of FSS technology is more consistent with Dishaw and Strong (1999), who focused on user assessments of the functionality and capabilities of the maintenance support software tools used. Forecasting technology characteristics are defined as the extent to which FSS technologies correspond to those of an idealized FSS. Though system characteristics may differ between companies, their support of core forecasting functionalities is likely to share common elements. By evaluating FSS user perceptions as a comparison with the features and capabilities of an idealized system, expert opinion regarding system design is integrated in the assessment.

3.2. Forecasting procedures

Forecasting procedures are the actions carried out by individuals in turning inputs, such as historical demand and information related to product and market characteristics, into outputs, in this case demand forecasts. Forecasting procedures may be influenced by a number of contingent factors, for example, organizational factors, including whether forecasting responsibility resides in a separate forecast function or within an existing function such as sales or marketing. Forecasting procedures may focus solely on historical demand data or may include qualitative input from individuals throughout the organization.

Forecasting procedures may be affected by product or service characteristics. Capital equipment forecasting may require a means of gathering input regarding the timing and probability of receiving a contract. Consumer product forecasting, on the other hand, might focus on methods of collecting more specific demand data and data concerned with the influence of promotion and competitive pricing activities.

Forecasting procedures may also be defined by market characteristics. Wholesale and retail forecasting tasks will be more focused on collecting historical demand data. Agricultural product forecasting may require the collection of weather information that can influence seasonal demand, or commodity prices that can influence the selection of crops that will be planted from year to year.

Goodhue and Thompson (1995) and Goodhue (1995) evaluated task characteristics based on dimensions of non-routineness and interdependence. Dishaw and Strong (1999) measured tasks based on user reports of the relative frequency of task activities related to planning, knowledge building, diagnosis and modification.

Support for forecasting procedures in the FTTF model draws on research concerned with the role of cognitive fit in problem-solving (Vessey, 1991; Vessey & Galletta, 1991). Cognitive fit views problem solving as '. . . the outcome of the relationship between the problem representation and the problem-solving task' (Vessey, 1991, p. 220). Processes (procedures) link the two factors and contribute to the individual's mental representation of the problem. When the information contained in the problem representation corresponds to the problem-solving task, common processes facilitate the individual's mental representation of the problem and contribute to the effectiveness of the solution. Vessey and Galletta (1991) extended the cognitive fit model to address the individual's problem-solving skills. In this case, when the individual's problem-solving skills correspond to both the problem representation and the problem-solving task, the problem-solving performance is expected to improve.

Viewed in a forecasting context, when the procedures that guide forecast creation are consistent with the forecast information and the task of forecasting, adherence to those procedures will contribute to the forecaster's mental representation of the forecasting problem and improve forecasting effectiveness. When the forecaster's skills match both the forecast information and the task of forecasting, the forecast performance will improve.

Forecasting procedures are operationalized along two dimensions. Procedure quality is defined as the degree to which the procedures guiding the forecasting process are perceived to assist in the creation of demand forecasts. Procedure access is defined as the degree to which the procedures guiding forecast creation are perceived to be available to assist in the creation of demand forecasts. Evaluating FSS user perceptions of procedure quality and procedure access offers a means of assessing the correspondence between the task of forecasting, the information represented in the forecasting problem and the forecaster's skills.

3.3. User evaluation of forecasting task-technology fit

Forecasting task-technology fit involves the perceptions of the FSS user regarding the extent to which the features and technologies represented in the forecasting support system fit or facilitate their efforts to complete the forecasting task. Goodhue (1995) confirmed the ability of system users to evaluate task-system fit. His task-system fit construct consisted of 12 dimensions. Goodhue and Thompson (1995) re-confirmed the validity and reliability of a task-technology fit instrument, and represented user evaluations of task-technology fit in a factor consisting of eight dimensions (quality, locatability, authorization, compatibility, ease of use/training, production timeliness, system reliability, and relationship with users).

The user evaluation of FTTF is defined as the respondent's perception of the correspondence between forecasting support system capabilities and forecasting procedures that facilitate the creation of forecasts. Rather than the 12 dimensions proposed by Goodhue (1995), the present study integrates the dimensions of right data, right detail and data accuracy (Goodhue, 1998) to establish a measure of system quality. System quality is defined as the degree to which the tools and information in the forecasting support system assist individuals to perform their forecasting related tasks.

Items drawn from the dimensions of accessibility, ease of use, and system reliability (Goodhue, 1998) were used to establish a measure of system access. System access is defined as the degree to which the tools and information in the forecasting system are available to assist individuals in performing their forecasting tasks. Combined, the constructs of system quality and system access represent the factors of a higher order FTTF construct.

Forecasting research has addressed forecasting processes (Fildes & Hastings, 1994; Mentzer et al., 1999), system requirements (Fildes & Beard, 1992; Fildes et al., 2006; Mentzer & Kent, 1999) and user capabilities (Fildes et al., 2006; Goodwin, Fildes, Lawrence, & Nikolopoulos, 2007) as parts of an overall picture of forecasting practice. The FTTF model provides an opportunity to assess the roles of these three factors in an integrated fashion. Furthermore, studies support an assertion that individuals are able to evaluate the extent to which systems can help them complete their tasks successfully (Goodhue, 1995). Applied to the domain of forecasting, TTF posits that users of a forecasting support system are able to evaluate forecasting procedures and forecasting support systems, and the extent to which the correspondence between procedures and systems supports them in the successful creation of demand forecasts. These relationships lead us to our first two hypotheses.

H1: Forecasting support system characteristics are positively related to user evaluations of forecasting task-technology fit.

H2: Forecasting task characteristics are positively related to user evaluations of forecasting task-technology fit.

3.4. Forecast performance

Goodhue and Thompson (1995) confirmed a relationship between user perceptions of TTF and perceived performance. They also noted the importance of identifying objective measures of performance. Forecasting systems provide just such an opportunity for more tangible measures (Fildes et al., 2006). Traditional forecasting performance measures include

statistics reflected in calculations of the mean absolute percent error (MAPE), mean absolute deviation (MAD), and similar quantifications of forecast accuracy. Each represents the comparison of the forecasted demand with the actual demand experienced. Forecasting performance is therefore defined as the extent to which demand forecasts match the actual demand for a product or service over a stated time horizon.
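Using A_t for the actual demand in period t, F_t for the corresponding forecast, and n for the number of periods, the two measures just named are conventionally written as:

```latex
% Standard definitions of the accuracy measures cited above.
\mathrm{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left|\frac{A_t - F_t}{A_t}\right|,
\qquad
\mathrm{MAD} = \frac{1}{n}\sum_{t=1}^{n}\left|A_t - F_t\right|
```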
The technology acceptance model focuses on utilization as a dependent variable, and suggests that individuals have a choice whether or not to use the system. TAM posited that perceptions of ease of use and usefulness result in system use. If a system is well designed, its use should subsequently lead to improved performance. The information systems literature, and more specifically the forecasting literature, has recognized that the connection between use and performance is insufficient.

TTF makes a different assumption. TTF assumes that the individual will not necessarily have a choice whether or not to use the system, as use may be mandated (Goodhue, 1995). However, users are still able to assess the extent to which the technology helps them accomplish their tasks. If, for example, technologies do not effectively support task requirements, TTF posits that users may become more frustrated with their efforts, and performance will suffer.

The emphasis in forecasting research on evaluating user satisfaction with systems, along with the inherent complexity of forecasting methods and the frequency with which the forecasting task must be repeated (weekly, monthly, etc.), suggest a greater likelihood that system use may be mandated by policy or by the volume of activity. The effectiveness of the system in helping the users complete their tasks then becomes important to the creation and performance of the forecasts. This is reflected in the following hypothesis.

H3: User evaluations of forecast task-technology fit are positively related to forecast performance.

The next section describes the method employed to evaluate these hypotheses.

4. Methodology

The study employed a non-experimental mail survey methodology (Kerlinger, 1992). Surveys were distributed to the senior person responsible for forecasting at companies that manufacture or distribute products through consumer or industrial product supply channels. The criteria guiding sample selection were broad, in hopes of achieving a moderate level of external validity (Cook & Campbell, 1979), and to contribute to the generalizability of the results. The unit of analysis was the respondent's perceptions of the system used and the procedures followed to create forecasts, and the resulting forecasting performance.

Scale development followed the procedures and guidelines recommended by Bienstock, Mentzer, and Bird (1997), Churchill (1979), Dunn, Seaker, and Waller (1994), Gerbing and Anderson (1988), Garver and Mentzer (1999) and Mentzer and Flint (1997). Existing scales provided models for the development of scales for measuring forecasting task-technology fit and the tasks that support forecast creation. Forecasting research provided guidance for developing scales to measure forecasting support systems as well as forecasting performance.

Variable definitions emerged through an iterative process involving experience, literature reviews and exploratory research. Depth interviews were conducted with forecasting managers in seven companies. The interviews were unstructured and started with a discussion of the individual's role in the company and their involvement in forecasting. The goal of the process was to confirm the relationships conceptualized in the FTTF model, and ensure a precise definition of the variables to be operationalized in the survey instrument.

A new scale for measuring forecasting support systems was developed to tap respondent perceptions regarding the features and capabilities of their forecasting system. The features included were based on those commonly prescribed by forecasting experts. Measures of forecasting procedure characteristics were derived from measures of task technology fit, and were adapted to address forecasting tasks along two dimensions, procedure quality (defined as the degree to which forecasting procedures and methods assist individuals in their forecasting tasks), and procedure accessibility (defined as the degree to which forecasting procedures are available to assist individuals in performing their forecasting tasks). Measures of forecasting task-technology fit were grounded in existing scales developed by Goodhue

(1998). Rather than the original 12 dimensions, items drawn from the dimensions of right data, right detail, and data accuracy were consolidated to establish a dimension of system quality. Items drawn from the original dimensions of accessibility, ease of use and system reliability were consolidated to establish a second dimension of system access.

The dependent variable of forecasting performance was represented by a scale for evaluating forecast accuracy at different levels of aggregation (stock keeping unit by location (SKUL), stock keeping unit (SKU) and product line), over a one month time horizon.

4.1. Survey pretest

The survey plan was implemented through the procedures outlined by Dillman (1978). The survey was distributed to individuals responsible for forecasting at 280 companies. After three waves of mailings, 21 surveys were returned blank, with bad addresses, or after the cut-off date for analysis, resulting in a net mailing to 259 individuals. Of the 259 surveys, 129 usable surveys were returned, resulting in a response rate of 49.8%. The forecasting manager pretest descriptive statistics revealed no skewness or kurtosis problems with the data.

To assess a potential response bias, a comparison of responses between the first and third waves of survey returns was analyzed (Armstrong & Overton, 1977). In addition, a random sample of 26 non-respondents was contacted by phone and asked five non-demographic questions (Mentzer & Flint, 1997). The results of both analyses indicated no bias.

Confirmatory analysis was used to analyze scale unidimensionality (Gerbing & Anderson, 1988). Reviews of factor loadings, modification indices and substantive content were used to evaluate items that loaded weakly on the hypothesized factor for elimination.

The pretest asked respondents to record the forecast accuracy for their products at multiple levels of aggregation. Because of the large number of missing responses, measurement refinements could not be accomplished as part of the pretest. To improve responses, the final test instrument employed scales associated with forecast accuracy at the different levels of aggregation. Seven point scales presented ranges of accuracy, namely <70%, 70%-74%, 75%-79%, 80%-84%, 85%-89%, 90%-94%, and 95%-100%.
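A small sketch of how such scale responses relate to an underlying accuracy figure is shown below (purely illustrative; the helper is not part of the survey instrument):

```python
# Illustrative sketch: map a numeric forecast accuracy (percent) onto the
# seven-point response scale used in the final instrument.
import numpy as np

# Lower edges of the upper six ranges: <70, 70-74, 75-79, 80-84, 85-89, 90-94, 95-100.
edges = np.array([70, 75, 80, 85, 90, 95])

def accuracy_to_scale(accuracy_pct):
    """Return the 1-7 scale point for a measured accuracy percentage."""
    return int(np.digitize(accuracy_pct, edges)) + 1

for acc in (62.0, 74.9, 88.0, 96.5):
    print(acc, "->", accuracy_to_scale(acc))
# 62.0 -> 1, 74.9 -> 2, 88.0 -> 5, 96.5 -> 7
```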
The reliability of first order factors was evaluated based on Cronbach's coefficient alpha (Cronbach, 1951). The reliability of the two second order factors was calculated using Nunnally's (1978) formula for the reliability of linear combinations. The reliabilities for the six first order factors, as well as the two higher order constructs, were above the recommended threshold of 0.70.
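For readers who want to reproduce this type of reliability assessment, the sketch below (with simulated item responses, not the study's data) computes coefficient alpha for two first-order factors and a Nunnally-style reliability for their linear combination:

```python
# Illustrative sketch: coefficient alpha and reliability of a linear combination.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x item-score matrix for one factor."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def linear_combination_reliability(scores, reliabilities):
    """Reliability of an unweighted sum of factor scores (Nunnally-style)."""
    scores = np.asarray(scores, float)
    var = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    error_var = (var * (1 - np.asarray(reliabilities))).sum()
    return 1 - error_var / total_var

# Simulated correlated Likert-type responses for 216 respondents (assumption).
rng = np.random.default_rng(0)
latent_q = rng.normal(size=(216, 1))
latent_a = 0.8 * latent_q + 0.6 * rng.normal(size=(216, 1))
quality_items = latent_q + 0.7 * rng.normal(size=(216, 4))   # e.g. 4 quality items
access_items = latent_a + 0.7 * rng.normal(size=(216, 5))    # e.g. 5 access items

a_q = cronbach_alpha(quality_items)
a_a = cronbach_alpha(access_items)
factor_scores = np.column_stack([quality_items.mean(1), access_items.mean(1)])
print(a_q, a_a, linear_combination_reliability(factor_scores, [a_q, a_a]))
```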
Convergent validity was assessed for each first order factor using confirmatory factor analysis. Item-to-factor loadings were all significant at the α = 0.05 level. As a result, convergent validity was supported with regard to all first order factors.

The discriminate validity for the higher level task construct and task/technology fit construct was assessed through a comparison of three models (Bienstock et al., 1997). Model 0 proposes a structure of no factors and all traits. Model 1 proposes a structure of one factor consisting of all traits. Model 2 proposes a structure consistent with the final dimensional structure (e.g. two dimensions of forecasting task-technology fit), with each dimension being associated with its respective trait. A significant difference in the chi-square comparison of Model 0 with Model 1 provides evidence of convergent validity, and the comparison between Model 1 and Model 2 provides evidence of discriminate validity (Bienstock et al., 1997; Widaman, 1985). The evidence supported convergent validity for each construct, as well as discriminate validity among the sub-dimensions of each construct.

4.2. Final survey

The final survey was directed to the individuals responsible for forecasting at each of 289 companies. Fourteen surveys were returned incomplete or declining participation, resulting in a net mailing of 275 surveys. After three waves, 216 usable survey responses were received. The effective response rate for the final sample was 78.54%. The participants included individuals from companies ranging in size from less than $10 million to greater than $5 billion. The inventory approaches employed for finished goods included inventory to stock (72.2%), inventory to order (20.4%), and both methods (6.5%). Two companies did not report their inventory approach. Table 2 provides descriptive statistics for each of the final measurement items, as well as correlations between measures from the final sample.

Table 2
Final sample descriptive statistics.

Correlation matrix

Item / Mean / Std. Dev / Skew / Kurt.
Correlation columns (construct rows): Proced. quality, Proced. access, System quality, System access, Fcstg. support system, Fcstg. perf.
1 4.00 1.45 0.14 0.60
2 3.99 1.42 0.07 0.75
3 3.34 1.46 0.17 0.70
4 3.76 1.47 0.15 0.93
Procedure quality 1.00
5 3.80 1.90 0.04 1.14
6 4.68 1.59 0.44 0.56
7 4.03 1.73 0.08 0.95
8 4.05 1.58 0.13 0.57
9 3.91 1.66 0.01 0.77
Procedure access 0.83 1.00
10 4.62 1.71 0.47 0.75
11 4.04 1.64 0.09 0.80
12 3.41 1.57 0.31 0.96
13 4.06 1.58 0.17 0.81
System quality 0.74 0.72 1.00
14 4.61 1.70 0.45 0.66
15 4.44 1.66 0.28 0.75
16 4.03 1.58 0.20 0.88
17 4.58 1.72 0.45 0.79
System access 0.68 0.75 0.84 1.00
18 4.02 1.92 0.12 1.19
19 4.44 2.02 0.40 1.18
20 4.46 1.91 0.35 1.02
21 4.25 1.82 0.23 1.02
22 3.88 2.02 0.10 1.34
Fcstg. support system 0.59 0.63 0.81 0.81 1.00
23 4.74 1.47 0.83 0.34
24 3.89 1.71 0.36 0.92
25 3.47 1.78 0.06 1.09
Fcstg. perf. 0.46 0.39 0.42 0.36 0.37 1.00
Fcst procedures 0.90 0.92 0.82 0.79 0.67 0.37
Fcst Task-Tech Fit 0.78 0.80 0.94 0.91 0.89 0.42

The means for the items ranged from 3.34 to 4.74, and the standard deviations for the items ranged from 1.42 to 2.02.

Because of the high response rate associated with the final survey of forecasting managers, the test for response bias was limited to a comparison of differences in the mean response between the first and third waves of the survey mailing for five pre-selected questions. No response bias was identified. The revised measures of forecast accuracy were well received in the final survey and the measures were included in the confirmatory analysis.

A confirmatory measurement model was run to assess unidimensionality (Gerbing & Anderson, 1988). The fit of the measurement model was good, with a chi square/df ratio of 1.80, a RMSEA of 0.061, which falls within the acceptable fit range of 0.05-0.08, and a CFI of 0.945, which is greater than the 0.90 value accepted as a minimum indicator of consistency with the observed data from which it was estimated. Regression weights of all items on their latent variables were in the appropriate direction and were significant at the 0.001 level. The results supported convergent validity and unidimensionality.

Fig. 2. Forecasting task-technology fit model.

Discriminate validity for each of the second order constructs was confirmed as outlined in the pretest section.
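The fit indices quoted above can be related to the underlying chi-square statistics as in the following sketch (the degrees of freedom and null-model values are assumptions chosen only so the output matches the reported figures; they are not reported in the paper):

```python
# Illustrative sketch: how RMSEA and CFI derive from model and null chi-squares.
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2, df, chi2_null, df_null):
    """Comparative fit index against the independence (null) model."""
    d_model = max(chi2 - df, 0.0)
    d_null = max(chi2_null - df_null, d_model)
    return 1.0 - d_model / d_null

# Hypothetical values for a sample of 216 respondents (assumed, not the study's output).
chi2, df = 520.0, 289            # model chi-square and degrees of freedom
chi2_null, df_null = 4500.0, 300  # independence-model chi-square and df
print(round(chi2 / df, 2),                      # ~1.80
      round(rmsea(chi2, df, 216), 3),           # ~0.061
      round(cfi(chi2, df, chi2_null, df_null), 3))  # ~0.945
```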
5. Results

The FTTF model (Fig. 2) illustrates the relationships tested in this study. Forecasting support systems (ξ2) and forecasting procedures (ξ1) represent exogenous variables that are proposed to influence the endogenous variable, forecasting task-technology fit (η3). Forecasting procedures consist of two dimensions, procedure quality (η1) and procedure access (η2). Forecasting task-technology fit also consists of two dimensions, system quality (η4) and system access (η5). Forecasting task-technology fit is subsequently proposed to influence the endogenous variable of forecasting performance (η6). Structural equation modeling was used to evaluate the FTTF model. AMOS 7.0 and SPSS version 15 software were used to conduct the analysis. The results supported a good fit of the FTTF path model (chi square/df ratio of 2.16, a RMSEA of 0.072 and a CFI of 0.921). Table 3 provides the path estimates and critical ratios associated with each of the factor measures and relationships illustrated in Fig. 2.

5.1. Hypothesis 1

Hypothesis 1 stated that forecasting support system characteristics are positively related to user evaluations of forecasting task-technology fit. Based on a path weight of 0.671 with a t-value of 8.251, the hypothesis was supported, and was significant at the 0.001 level (Table 3). Forecasting research has prescribed system capabilities that are expected to help improve forecast performance. While simulation studies support a relationship between system capabilities (e.g. multiple methods) and performance, such a relationship has not been confirmed in practice.

The present study incorporated commonly prescribed forecasting system capabilities as measures of forecasting support systems. Rather than employing the capabilities solely as a normative model of system capabilities that support improved forecasting, task-technology fit considers the user role in system assessment. Fildes et al. (2006) recognized the user role when they proposed that forecasting support systems be able to guide users with system application. From a TTF perspective, design attributes that guide forecast creation may contribute to user perceptions of system quality and access by those users who have not received forecasting training. For those with forecasting training, the extent to which system features facilitate their forecasting task may also contribute to perceptions of system quality and access.

Fildes et al. (2006) also suggested that guidance may be more important than restrictiveness in forecasting support system design. This is also reflected in the forecasting task-technology fit model. From a TTF perspective, design attributes that restrict access to data and techniques may not significantly influence the perception of system quality and access by users with little training in forecasting.

Table 3
Forecasting Task-Technology Fit model results.

Path: estimate (critical ratio)
Fcst Procedures → Procedure Quality: 0.89 (9.41)
Fcst Procedures → Procedure Access: 0.92 (n/a)
Fcst Procedures → Fcst TTF: 0.67 (7.88)
Fcst TTF → System Quality: 0.90 (9.48)
Fcst TTF → System Access: 0.88 (n/a)
Fcst Support Systems → Fcst TTF: 0.67 (8.25)
Fcst TTF → Fcst Performance: 0.38 (4.76)
Procedure Quality - Item 1: 0.88 (n/a)
Procedure Quality - Item 2: 0.87 (17.53)
Procedure Quality - Item 3: 0.84 (16.21)
Procedure Quality - Item 4: 0.81 (15.41)
Procedure Access - Item 1: 0.72 (n/a)
Procedure Access - Item 2: 0.76 (10.67)
Procedure Access - Item 3: 0.87 (12.26)
Procedure Access - Item 4: 0.81 (11.40)
Procedure Access - Item 5: 0.83 (11.71)
System Quality - Item 1: 0.77 (n/a)
System Quality - Item 2: 0.86 (12.93)
System Quality - Item 3: 0.41 (5.79)
System Quality - Item 4: 0.83 (12.52)
System Access - Item 1: 0.79 (n/a)
System Access - Item 2: 0.88 (13.90)
System Access - Item 3: 0.81 (12.65)
System Access - Item 4: 0.68 (10.25)
Fcst Support Systems - Item 1: 0.71 (n/a)
Fcst Support Systems - Item 2: 0.76 (10.11)
Fcst Support Systems - Item 3: 0.78 (10.32)
Fcst Support Systems - Item 4: 0.74 (9.90)
Fcst Support Systems - Item 5: 0.62 (8.37)
Fcst Performance - Item 3: 0.75 (n/a)
Fcst Performance - Item 4: 0.97 (11.96)
Fcst Performance - Item 5: 0.72 (10.83)

For those with more extensive forecast training, restricting access to a limited set of data and techniques may lead to frustration with the forecasting task, and may detract from user perceptions of system quality and access.

5.2. Hypothesis 2

Hypothesis 2 stated that forecasting procedures are positively related to user evaluations of forecasting task-technology fit. Based on a path weight of 0.668 with a t-value of 7.879, this hypothesis was supported, and was significant at the 0.001 level. Goodhue (1995) and Goodhue and Thompson (1995) measured task characteristics along the dimensions of non-routineness and interdependence. Forecasting procedures generally reflect more clearly defined activities. The degree of routineness and interdependence of forecasting procedures reflects contingent factors such as the market demand characteristics, the availability of demand related information, communication requirements, and so forth. The present study evaluated forecasting procedures based on conceptual underpinnings similar to those of the FTTF construct. In this case, forecasting procedures are viewed as the technologies that facilitate the task of forecasting. Forecasting products with different market and demand characteristics may call for different forecasting procedures. For example, fashion products may rely on more qualitative assessments of larger market trends than grocery items, which rely more on demand

history. The access to defined procedures and the perceived quality of those procedures may be expected to be related to forecaster assessments of how technologies support their forecasting efforts.

5.3. Hypothesis 3

Hypothesis 3 stated that the user evaluation of forecasting task-technology fit is positively related to forecast performance. Based on a path weight of 0.381 with a t-value of 4.775, the hypothesis was supported, and was significant at the 0.001 level. Research has generally prescribed certain system characteristics that should help to improve forecast performance. The confirmation of this third hypothesis recognizes that the individuals responsible for forecast creation play an important role in determining the extent to which processes and systems influence forecast performance.

The evaluation of the model for forecasting task-technology fit confirms the proposition that forecast support system users are able to evaluate the correspondence between the forecasting support system capabilities, the procedures that guide forecast development, and their own forecasting skills and abilities. The results of the study confirmed that there is a positive relationship between forecasting support system capabilities and user evaluations of forecasting task-technology fit; between forecasting procedures and user evaluations of forecasting task-technology fit; and between forecasting task-technology fit and forecast performance.

6. Conclusions and future directions

Drawing on forecasting, information, and decision support system research, this study empirically tested a theoretically-based model of forecasting task-technology fit. The results of the study illustrated the correspondence between forecasting system characteristics, forecast procedures, and user assessments of system quality and access which influence forecast performance.

6.1. Implications for practice

For businesses, improved forecast performances offer a means of improving the performances of systems and activities that use forecasts for planning and management. There is support for this assertion. Simulation studies have illustrated operational and financial benefits from employing more accurate forecasts (Bowersox, Closs, Mentzer, & Sims, 1979; Gardner, 1990; Zhao, Xie, & Lau, 2001). Studies of forecasting practice, however, have reflected a continuing challenge in identifying those factors that may improve forecast performance (Fildes & Hastings, 1994; McCarthy et al., 2006).

This study acknowledges the significant role the user plays in the implementation of forecasting support systems. The contribution of a forecasting support system relies on the assessment of the individuals who use the system to create forecasts; specifically, user perceptions of the extent to which the system contains the right data and the techniques they believe are needed to forecast demand, and the extent to which the system provides access to accurate data and appropriate techniques. If the forecaster perceives the support system to not contain the techniques or data they believe are needed to create forecasts, or to not provide easy access to the techniques or data, the resulting forecasts are likely to suffer in terms of accuracy.

This relationship may also be reflected by the phenomenon identified in case and benchmark studies as 'Islands of Analysis' (Moon, Mentzer, & Smith, 2003, p. 18). Islands of Analysis are described as the informal implementation of forecasting systems and processes in multiple, and not always authorized, locations throughout an organization. The emergence of Islands of Analysis may reflect efforts on the part of forecasters to establish systems that they perceive to be of better quality or to provide better access than the official forecasting support system.

This study supports specific capabilities that are frequently recommended in forecasting support systems. Forecasting practitioners should lean toward support systems that include the following capabilities: the use of multiple forecasting techniques, the ability to display the forecast in different measurement units and at different levels of aggregation, the ability to distinguish product importance, and the ability to capture forecast adjustments. In order to support improved forecast performance, however, it is important that these features be perceived to contribute to forecaster assessments of system quality and access.

The results of this study also highlight the need to establish procedures for guiding forecast creation that are clearly documented, accessible to those responsible for forecasting, and viewed by forecasters as appropriate and easy to follow. This finding is consistent with business trends that emphasize process mapping and evaluation as part of broader continuous improvement practices.

6.2. Implications for research

Forecasting research has presented new algorithms for forecast creation and compared those models with existing methods in an effort to improve performance. Surveys have described the techniques and systems used in industry, together with the satisfaction with various techniques. Case analyses have provided a greater understanding of the complex management and organizational factors that influence forecast creation and application. Recently, studies have started to develop a grounding for theoretical frameworks that identify factors and explain relationships that illustrate the forecasting phenomena.

The present study represents one of the first empirical assessments of a theoretically based model of factors that influence forecasting performance in organizations. The FTTF model offers a means of evaluating forecasting procedures and technologies that recognize the role of the forecaster and the contingent nature of forecasting practices in different organizations. This study contributes to forecasting research by offering a series of psychometrically rigorous measures of forecasting systems, procedures and performance. A new measure of forecasting system capabilities was developed to assess the extent to which the characteristics of forecasting support systems reflect those prescribed in idealized forecasting systems. Additional measures for capturing forecaster perceptions of the quality of and access to forecasting procedures and forecasting systems were adapted from measures used in TTF research.

This study contributes to information and decision support system research by further validating the relationship conceptualized in the theory of task-technology fit. Where earlier TTF studies have relied on user evaluations of performance, this study employed more objective measures of performance based on forecast accuracy. While previous studies evaluating task-technology fit have been conducted across individuals within the same organization, this study evaluated FTTF across companies, and represents an alternative assessment of the generalizability of the concept.

Among the limitations of the study, the original conceptualization of task-technology fit incorporated 12 dimensions. The present study condensed them into two dimensions representing system quality and access. While these two dimensions appear to reflect a significant amount of the variability in forecasting performance, it will be beneficial to more thoroughly test the broader dimensions of TTF within the domain of forecasting.

Another potential limitation involves the approach taken to evaluate forecast performance. Establishing a scale for measuring forecast performance helped to ensure the capture of performance data, though perhaps at the expense of measurement variability. Future studies that include forecasting performance should attempt to establish more detailed measures of accuracy, and measures of accuracy at different forecast horizons.

6.3. Future directions

This study represents an initial investigation of the role of task-technology fit within the domain of forecasting. It would be beneficial to both decision support research and forecasting research to develop a greater depth of understanding of the variables and relationships represented in the current model. Qualitative research methods offer a means of investigating TTF within a forecasting context, will help to validate current results, and may reveal other factors that can influence forecast development and performance.

Task-technology fit is also part of a larger model referred to as the Technology to Performance Chain (TPC) model (Goodhue & Thompson, 1995). The TPC model suggests that performance feedback may influence the decision whether or not to use a technology, or the way in which the user assesses task-technology fit. Locke (1968) and others have identified feedback clarity and challenge to be important aspects which lead to improved performance. Forecasting appears to offer a clear metric to those responsible for forecast development, and should be considered in an extension of the FTTF model.

Table A.1
Respondents were asked to choose a level of agreement or disagreement with the questions based on a scale ranging from 1 (Strongly disagree) to 7 (Strongly agree).

Procedure quality: the degree to which the procedures guiding the forecasting process are perceived to assist in the creation of demand forecasts.
Our forecasting procedures. . .
. . . support forecast development very effectively
. . . provide clear directions to guide forecast development
. . . are exactly what are needed to create accurate forecasts
. . . ensure the right methods (techniques) are used to forecast

Procedure access: the degree to which the procedures guiding forecast creation are perceived to be available to assist in the creation of demand forecasts.
Our forecasting procedures. . .
. . . are documented
. . . ensure that forecasts are created in a timely manner
. . . are readily available to help guide forecast development
. . . are easy to follow
. . . are always up-to-date

Forecasting support systems: the extent that forecasting support system technologies correspond to those of an idealized forecasting support system.
Our forecasting system. . .
. . . uses a number of quantitative methods (techniques) for forecasting
. . . can display forecasts in different measurement units that might be needed by forecast participants (e.g. units, cases, dollars, etc.)
. . . can display forecasts at different levels of our product hierarchy that might be needed by forecast participants (product by location, product, product line, division, etc.)
. . . gives us the ability to manage products on an ABC ranking, or other means of distinguishing product importance
. . . can capture forecast adjustments made by each forecasting participant

Forecasting system quality: the degree to which information in the forecasting support system assists individuals in performing their forecasting related tasks.
Our forecasting system. . .
. . . contains data at the right level(s) of detail to support forecasting
. . . contains the right methods (techniques) needed to create forecasts
. . . contains data that is not accurate enough to create effective forecasts
. . . contains all the data needed to create accurate forecasts

Forecasting system access: the degree to which information in the forecasting system is available to assist individuals in performing their forecasting tasks.
Our forecasting system. . .
. . . is easily accessible to those who need it to create forecasts
. . . makes it easy to get access to demand and forecast data
. . . makes it easy to access the forecasting methods (techniques) needed to develop forecasts
. . . is always up and available when forecasting participants need it

Forecast performance: the extent to which demand forecasts match the actual demand for a product or service over a stated time horizon. A seven point scale with the following ranges was used to capture performance: <70%, 70%-74%, 75%-79%, 80%-84%, 85%-89%, 90%-94%, 95%-100%.
For each level of your forecast hierarchy, indicate how accurate forecasts are, based on a monthly time horizon. . .
. . . product line forecasts
. . . product (SKU) forecasts
. . . product (SKU) by location forecasts (DC level)

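To make the response formats above concrete, the sketch below (not part of the original instrument) shows one way the 1-7 agreement items could be averaged into a construct score, and how a numeric accuracy figure could be assigned to one of the seven printed accuracy ranges. The simple mean, the example responses, and the treatment of the overlapping 90% boundary are assumptions made for illustration.

def construct_score(item_responses):
    """Average a set of 1-7 agreement ratings into a single construct score."""
    return sum(item_responses) / len(item_responses)

# Hypothetical responses to the four procedure quality items above.
print(construct_score([6, 5, 4, 6]))    # 5.25

# Accuracy ranges as printed in Table A.1, keyed by an assumed lower bound
# (the printed 85%-90% / 90%-94% boundary is ambiguous; 90% is assigned to
# the higher range here).
RANGES = [(95, "95%-100%"), (90, "90%-94%"), (85, "85%-90%"),
          (80, "80%-84%"), (75, "75%-79%"), (70, "70%-74%"), (0, "<70%")]

def accuracy_range(accuracy_pct):
    """Map a numeric accuracy percentage onto its questionnaire category."""
    for lower_bound, label in RANGES:
        if accuracy_pct >= lower_bound:
            return label
    return "<70%"

print(accuracy_range(87.3))             # 85%-90%
print(accuracy_range(66.0))             # <70%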
In addition to performance feedback, the TPC model extends the dependent variable to include the impact on organizational performance. In cases where forecasts are applied directly to decision support systems without user intervention, we might expect improved performance in those parts of the organization that apply the forecasts. Simulation research, for example, has presented evidence that improved forecasting contributes to improved performance in areas such as inventory management (Gardner, 1990). In many cases, forecasts are not implemented in isolation. Forecast users have been acknowledged in cases discussed by Fildes and Hastings (1994), Mentzer et al. (1999) and Schultz (1984). Future
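As a rough, hypothetical illustration of the kind of simulation evidence referred to above (e.g., Gardner, 1990), the sketch below applies the standard normal-error safety stock formula to two forecasts of differing accuracy. The error levels, lead time and service target are assumed, and the calculation is not drawn from this study.

import math
from statistics import NormalDist

# Safety stock under the usual normal forecast-error model:
#     safety_stock = z * sigma_error * sqrt(lead_time)
z = NormalDist().inv_cdf(0.98)          # z-value for a 98% cycle service level
lead_time_months = 2

for label, sigma_error in [("less accurate forecast", 40.0),
                           ("more accurate forecast", 25.0)]:
    safety_stock = z * sigma_error * math.sqrt(lead_time_months)
    print(f"{label}: safety stock of roughly {safety_stock:.0f} units")

Under these assumptions, reducing the forecast-error standard deviation lowers the inventory required to meet the same service level, which is the mechanism by which better forecasting translates into organizational performance.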
Goodhue, D. L. (1995). Understanding user evaluations of information systems. Management Science, 41(12), 1827-1844.
Goodhue, D. L. (1998). Development and measurement validity of a task-technology fit instrument for user evaluations of information systems. Decision Sciences, 29(1), 105-137.
Goodhue, D. L., & Thompson, R. L. (1995). Task-technology fit and individual performance. MIS Quarterly, June, 213-236.
Goodwin, P., Fildes, R., Lawrence, M., & Nikolopoulos, K. (2007). The process of using a forecasting support system. International Journal of Forecasting, 23(3), 391-404.
Kerlinger, F. N. (1992). Foundations of behavioral research (3rd Ed.). New York: Harcourt Brace College Publishers.
King, W. R., & He, J. (2006). A meta-analysis of the technology acceptance model. Information and Management, 43(6), 740-755.
Kusters, U., McCullough, B. D., & Bell, M. (2006). Forecasting software: Past, present and future. International Journal of Forecasting, 22(3), 599-615.
Lawrence, M. (2000). What does it take to achieve adoption in sales forecasting. International Journal of Forecasting, 16(2), 145-146.
Locke, E. A. (1968). Toward a theory of task motivation and incentives. Organizational Behavior and Human Performance, 3, 157-189.
Makridakis, S. (1996). Forecasting: Its role and value for planning and strategy. International Journal of Forecasting, 12(2), 513-537.
McCarthy, T. M., Davis, D. F., Golicic, S. L., & Mentzer, J. T. (2006). The evolution of sales forecasting management: A 20 year study of forecasting practices. Journal of Forecasting, 25(5), 303-324.
Mentzer, J. T., & Cox, J. E. (1984). Familiarity, application, and performance of sales forecasting techniques. Journal of Forecasting, 3, 27-36.
Mentzer, J. T., & Schroeter, J. (1994). Integrating logistics forecasting techniques, systems, and administration: The multiple forecasting system. Journal of Business Logistics, 15(2), 205-225.
Mentzer, J. T., & Kahn, K. B. (1995). Forecasting technique familiarity, satisfaction, usage, and application. Journal of Forecasting, 14, 465-476.
Mentzer, J. T., Bienstock, C. C., & Kahn, K. B. (1999). Benchmarking sales forecasting management. Business Horizons, 42(3), 48-56.
Mentzer, J. T., & Kent, J. L. (1999). Forecasting demand in the Longaberger Company. Marketing Management, 8(2), 46-51.
Mentzer, J. T., & Flint, D. J. (1997). Validity in logistics research. Journal of Business Logistics, 18(1), 199-216.
Mentzer, J. T., & Bienstock, C. C. (1998). The seven principles of sales-forecasting systems. Supply Chain Management Review, Fall, 76-83.
Moon, M., Mentzer, J. T., & Smith, C. D. (2003). Conducting a sales forecasting audit. International Journal of Forecasting, 19(1), 5-25.
Moriarty, M. M. (1985). Design features of forecasting systems involving management judgement. Journal of Marketing Research, 22(4), 353-364.
Nachiappan, S. P., Jawahar, N., Parthibaraj, S. C., & Brucelee, B. (2005). Performance analysis of forecast driven vendor managed inventory system. Journal of Advanced Manufacturing Systems, 4(2), 209-226.
Nunnally, J. (1978). Psychometric theory (2nd Ed.). New York: McGraw-Hill.
Schultz, R. (1984). The implication of forecasting models. Journal of Forecasting, 3, 43-55.
Schultz, R. (1992). Fundamental aspects of forecasting in organizations. International Journal of Forecasting, 7(4), 409-411.
Silver, M. S. (1991). Decisional guidance for computer-based support. MIS Quarterly, 15(1), 105-133.
Snyder, R. D. (1993). A computerized system for forecasting spare parts sales: A case study. International Journal of Operations and Production Management, 13(7), 83-92.
Van de Ven, A. H., & Delbecq, A. L. (1974). A task contingent model of work-unit structure. Administrative Science Quarterly, 19(2), 183-197.
Van de Ven, A. H., & Drazin, R. (1985). The concept of fit in contingency theory. In L. L. Cummings, & B. M. Staw (Eds.), Research in organizational behavior (pp. 333-365). Greenwich, CT: JAI Press.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186-204.
Venkatraman, N. (1989). The concept of fit in strategy research: Toward verbal and statistical correspondence. Academy of Management Review, 14(3), 423-444.
Vessey, I. (1991). Cognitive fit: A theory-based analysis of the graphs versus tables literature. Decision Sciences, 22(2), 219-240.
Vessey, I., & Galletta, D. (1991). Cognitive fit: An empirical assessment of information acquisition. Information Systems Research, 2(1), 63-84.
Wacker, J. G., & Lummus, R. R. (2002). Sales forecasting for strategic resource planning. International Journal of Operations and Production Management, 22(9), 1014-1031.
Webby, R., O'Connor, M., & Edmundson, B. (2005). Forecasting support systems for the incorporation of event information: An empirical investigation. International Journal of Forecasting, 21(3), 411-423.
Widaman, K. F. (1985). Hierarchically nested covariance structure models for multitrait-multimethod data. Applied Psychological Measurement, 9(1), 1-26.
Winklhofer, H., Diamantopoulos, A., & Witt, S. F. (1996). Forecasting practice: A review of the empirical literature and an agenda for future research. International Journal of Forecasting, 12, 193-221.
Winklhofer, H., & Diamantopoulos, A. (2003). A model of export sales forecasting behavior and performance: Development and testing. International Journal of Forecasting, 19(2), 271-285.
Zhao, X., Xie, J., & Lau, R. S. M. (2001). Improving the supply chain performance: Use of forecasting models versus early order commitments. International Journal of Production Research, 39(17), 3923-3939.
Zigurs, I., & Buckland, B. K. (1998). A theory of task/technology fit and group support systems effectiveness. MIS Quarterly, 22(3), 313-334.
Carlo D. Smith (Ph.D., University of Tennessee) is Assistant Professor of Operations and Supply Chain Management at Central Washington University and Associate Director of the CWU Supply Chain Management Institute. His research focuses on logistics in the supply chain, inventory management, and forecasting management. His articles have appeared in the Journal of Business Logistics, International Journal of Forecasting, Journal of Business Forecasting, Business Horizons, and the Journal of Consumer Satisfaction, Dissatisfaction and Complaining Behavior. Carlo received his undergraduate and Master's degrees in Business Logistics from the Pennsylvania State University. Carlo has over 13 years of industry experience as a logistics consultant and executive educator.

John T. Mentzer (Ph.D., Michigan State University) is the Chancellor's Professor and the Harry J. and Vivienne R. Bruce Chair of Excellence in Business in the Department of Marketing and Logistics at the University of Tennessee. He has published 8 books and more than 200 articles and papers in the International Journal of Forecasting, Journal of Forecasting, Journal of Business Logistics, Harvard Business Review, Journal of Marketing, International Journal of Physical Distribution and Logistics Management, and other journals. He was the 2004 recipient of the Council of Logistics Management Distinguished Service Award, the 2007 recipient of the International Society of Logistics Armitage Medal, and the 2008 recipient of the Academy of Marketing Science Distinguished Service Award.