A Compendium of Recent Feature Articles by CMP Editors
BI Grows Up
A new generation of pervasive business intelligence points the way to more efficient businesses
Savvy companies hungry for a competitive edge know that business intelligence isn’t just for data-analysis
experts anymore.
Today’s BI systems, known variously as “BI 2.0” and “Pervasive BI,” no longer focus only on summaries
of historical data. They combine timely information gathered from the breadth of systems across an
enterprise to inform daily decision-making. Some BI 2.0 implementations also embed reporting tools
and monitors into select enterprise applications to suggest on-the-fly responses to new opportunities
and challenges as they arise.
But the roadmap to pervasive BI is still being charted and technical potholes remain for the unwary.
That’s why we’ve compiled this series of watershed stories about the latest generation of BI from CMP’s
family of business technology publications. Read these articles to learn about the technologies, trends,
and best practices that can help your company launch and profit from BI 2.0.
For example, Optimize Magazine documents the changing landscape of BI and real-time analytics in the
must-read story “Finally, Intelligent Businesses.” Contending that BI as we know it will change dramati-
cally over the next five years, this article lays out the reasons why innovative enterprises are embedding
analytics into operational processes to gain competitive advantage. “The need to take action, not just
be informed, is more urgent than ever,” says author Neil Raden. A range of converging business forces
“all are pushing BI in a new and exciting direction,” he adds.
For further proof that leading companies need to find new ways to understand and use information,
read “Turning Data Chaos into Business Gold,” also from Optimize Magazine. The story explains how
enterprises can glean better intelligence from their available data and then turn those analyses into
actions for forging closer relationships with customers, suppliers, and employees.
Optimize Magazine rounds out its BI series with “Smarter Use Of Business Intelligence,” which examines
why CIOs have made BI one of their top investment priorities. One catalyst: as BI applications become
easier to use, they become available to a wider range of workers and business processes than ever
before. The future of BI lies in its methods, practices, and technologies becoming completely integrated
into our everyday work, the authors argue. They add that IT executives must develop business tools that
absorb and analyze an explosion of information from partners and customers. Based on current spend-
ing patterns, that future has already arrived at many enterprises.
Nancy Feig explores how large enterprises are rethinking their BI strategies in the quest to make better
and more-profitable business decisions in “The Next Level in Business Intelligence,” from Bank Systems
& Technology. While the goals of BI always have been to improve the value of customer relationships
and bottom lines, increasingly competitive business environments are forcing organizations to develop a
new philosophy centered around the enterprise-wide use of BI tools, Feig explains.
“Executive Insight: Enterprise Data Warehousing Renaissance” follows the evolution of enterprise data
warehousing from its essentially theoretical roots a decade ago into what’s becoming a mainstream
architecture today. This Insurance & Technology story explains how sophisticated BI tools and strategies
now are making it possible for companies to launch enterprise-scale data repositories and see data-
warehousing success in much shorter time periods than in years past.
“Operational intelligence” isn’t just one of the IT industry’s latest and greatest buzzwords, as Ventana
Research found out when it investigated the changes that are reshaping BI. Intelligent Enterprise presents
Ventana’s findings in “Operational Intelligence Enters the Spotlight.” The story explains how rapidly
evolving event-driven architectures are spawning intelligence systems that monitor the current status of
business processes and activities and merge intelligence with process workflows. The result: companies
become more competitive by acting quickly to address new opportunities and challenges.
Dashboards and BI may be two concepts closely linked in the minds of many business managers. But
dashboard success requires more than a one-size-fits-all approach, argues Dan Everett in “BI Is More
Than a Dashboard,” from Intelligent Enterprise. To assure that end users take advantage of dashboards,
IT managers need to consider the roles and technical skills of the target audience. See how correctly
designed dashboards can be a key ingredient to BI implementations that deliver real business value.
To understand how BI is becoming a vital tool for day-to-day decision making, read “Oracle Delivers
Compliance, Risk, Reporting Tool” in TechWeb Technology News. Laurie Sullivan describes how Oracle
embedded analytics into enterprise applications to give companies enhanced compliance management
reporting capabilities.
Finally, in “Oracle Plunges Into BI” TechWeb Technology News reports on Oracle’s push earlier this year
into the burgeoning BI sector with its Oracle Business Intelligence Suite. The introduction is part of a
larger trend, the story explains. “During the past five years companies have moved from wanting to col-
lect the data, to gaining insight from the data,” says a researcher quoted in the story. As a result, the BI
software and services market could rise 10 percent to $6 billion this year.
Few technologies are evolving as quickly as today’s BI. These stories will quickly get you up to speed.
Alan Joch
Editor in Chief
The Enterprise Edge
and taxonomies outside the user’s frame of reference.

Applying search technology to the interactive BI experience offers a long-overdue improvement. It also promises to link indexed, unstructured text, though real apps are a year or two away. BI vendors such as Cognos and SAS are responding to customer interest in search and navigation by inking partnerships with Google, Inxight Software, and other search and unstructured content-analysis vendors.

One enterprise customer, law firm Morrison & Foerster LLP, has turned to Recommind’s MindServer tools to go beyond keyword search and use concepts to gather information from assorted sources, including documents, databases, Web pages, and E-mail. The objective is to feed information to the firm’s more than 1,000 attorneys across 19 offices worldwide regarding external developments that affect clients and ongoing cases.

• Master data management. MDM helps companies share reference data among constituencies, including business functions and external partners. Although several application and information-management domains are trying to “own” MDM, the discipline is emerging as something separate—and taking a big bite out of that traditional mega-enabler of BI, the data warehouse.

Creating a single repository of integrated data—the so-called single version of the truth—is a core data-warehousing promise. However, a single, common set of master data is too important to an organization to bury in the murky processes of a data warehouse. Data definitions mediated by those who participate in the data-warehouse design don’t necessarily serve the enterprise view, which is more operational than analytical.

• Semantics. Because the data warehouse can play only a limited role in delivering MDM capabilities, semantic technology becomes relevant. The semantic Web of frictionless information exchange that Tim Berners-Lee and the World Wide Web Consortium (W3C) envision is still far off, but technology created in its pursuit is showing promise. Not only do ontologies provide richer and more flexible ways to represent definitions, meanings, and relationships; they also let computers draw inferences. Asserting things in an ontology lets you create a vast new resource of implicit information that has its own discoverable patterns and relationships.

Not everyone is sold on ontologies, however. Pessimists argue that syllogisms are applicable only to a very small part of our lives. And while semantic technology is used in the intelligence and military communities to do things like unify databases across law-enforcement and intelligence agencies, it has yet to gain significant traction in commercial enterprises. Recently, Google, Yahoo, and others have pursued semantic Web technology, which becomes part of what commercial enterprises use to discover and interpret search results.

Semantic technology is also finding its way into data integration, search, and navigation, providing the basis for “supercharged” metadata efforts. For example, IBM, which acquired Unicorn Solutions in May, is applying advanced metadata management to service-oriented architecture (SOA) Web-service governance and management, among other business-integration challenges.

Business problems might also prompt user organizations to adopt semantics. For instance, most organizations exist as tightly bound groups, communicating in languages that only internal participants can readily understand. But as businesses outsource, offshore, and otherwise externalize processes, fluid communication with outside partners becomes essential.

Realistically, individual groups or functions are rarely willing to exhaustively delineate all the elements and combinations required for translation and better understanding. But semantic
models—especially those based on first-order logic, as ontology is—let structure emerge over time and machines to draw inferences without requiring human participants to code each relationship explicitly.

• Operational BI. Operational BI—or more accurately, operational analytics—offers a classic example of a boundary object: something identified by different domains, and therefore given very different meanings. Classic data warehousing and BI attempted to meet some operational reporting requirements with the addition of the operational data store. ODS has tried to integrate atomic-level data, generally for later aggregation or summarization by data warehouses, OLAP, and BI tools.

Unfortunately, the challenges of developing an ODS to meet the low- to zero-latency demands of hybrid operational-analytical applications have mostly proved beyond the capabilities of existing data-warehouse infrastructure and mainstream BI tools. To meld operations and analytics, tools would have to step up to the service levels of operational software—impossible with prevailing best practices. And an ODS’ ability to fully integrate with data warehouses remains questionable.

Among database vendors, Teradata seems furthest along in providing what it calls an Active Data Warehouse. In this scenario, the warehouse supports tactical decision making, with performance levels and data refreshes approaching those of transactional systems. However, most existing BI environments provide weak support for the kind of real-time query federation between a data warehouse and other operational data sources that’s necessary to make operational-analytical hybrids serve business processes.

One alternative is to be selective about the data elements you involve, using real-time agents either to gather the data as it flows through a message queue or to read the applications’ logs for changed data. By providing these and other capabilities as part of its core technology, vendor Celequest has found a way for BI to participate in an SOA cooperatively.

Operational BI must be as lightweight and configurable as services. Grabbing a piece of historical data from a data warehouse, aligning it with current information from an operational process, and perhaps dynamically generating a forecast based on trend analysis must all happen transparently and in near-real time. Although many operational-analytical hybrids can operate in a more relaxed time frame, the demand for analytical services will drive the development of fast, thin applets.

A second factor shaping BI’s future is the evolution of technology architecture. SOA, Web services, W3C standards, and AJAX all came to us through the exploding use of the Internet. Other Web 2.0 technologies of interest are standards and tools for collaboration, such as RSS feeds and social networking—both of which are based on semantic Web technology. And blogging is becoming a respectable way to share and vet analysis.

Management woes

The Web 2.0 era is also spreading Google-like software licensing and distribution, featuring versionless upgrades. Even in a straightforward BI environment, managers face a dozen or more pieces of software, all on different upgrade schedules. This leads to excessive downtime and wasted effort, not to mention cost.

Polling databases is another consuming BI effort. However, systems are beginning to incorporate unattended agents that know what to look for and the most efficient way to find it. Tibco Software, webMethods, and a few other vendors offer agents that can choose the right means of communicating analysis results and information automatically. As SOA and Web services mature, it will be simple to devise and deploy bots to poll and search for you.

In addition to new capabilities and evolving architecture, three external factors will drive BI
into new modes of usefulness. The open-source movement is one, although it remains to be seen whether open source’s impact will be simply to drive down software costs or to start something bigger. Everyone knows about Linux, but BI software doesn’t have a similarly vast number of interested parties behind it. On the other hand, the pressure might be a good influence on BI vendors, which might otherwise feel they can get away with mediocre software at exorbitant prices.

Another outside factor seems more influential: the continued externalization of business. Today, all organizations must transact business with partners and customers electronically. Historically, a single large customer such as Wal-Mart, or a manufacturer such as Procter & Gamble, could dictate the format of business-to-business electronic interchange. Participants had to make a big investment in proprietary technology, including mainframe systems and EDI. Those that didn’t were locked out.

With open standards solidifying and technology lowering the cost barriers to almost zero, nearly anyone can participate in E-commerce. The stumbling block is getting all parties to effectively communicate and share information in real time. Leading organizations must push beyond conventional BI and data-warehousing approaches and seek adaptable, agile solutions.

Finally, the time has come to radically rethink BI’s basic methodologies. Caching, virtual data warehousing, query federation, autonomous agents, and real-time and direct access to operational data stores are all more feasible now and must be built into the new BI. Doing so will help companies alleviate the delays and rigidity that confound current approaches.

The division of analytical from operational systems, which was artificial in the first place, has made it inefficient for managers and operational employees to use analysis for decisions and action.

Embedded analytics, composite applications, and operational BI all lead us to further co-processing of operational and analytical data and formulas. Since SOA will increasingly be the foundation of applications, companies will prefer to distribute and deploy analytical applets as needed. Monolithic BI suites with expensive per-seat licenses will lose favor—as will BI and data warehousing stacks cobbled together from different generations of technology. Market advantage will go to newer, less-comprehensive entrants. Even older applications that provide specific functionality—such as visualization, Monte Carlo simulation or other stochastic processes, and industry-specific analytics—will get a leg up.

BI needs to do a better job of connecting with the new user experience. Data exploration without guided search and more useful abstraction from physical data representation won’t capture attention. And those layered architectures that look so good on PowerPoint slides won’t cut it in a future dominated by flatter architectures characterized by SOA. Rather than a typical data architecture that looks like a layer cake and has all sorts of arrows with no explanation, advanced organizations will embrace the cooperative, peer relationship inherent among services.

Analysis is a collaborative, not a singular effort. If BI is to permeate the enterprise, it must thrive within the network of business processes. In the not-too-distant future, the best BI may be that which you don’t even see.
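The selective, agent-based alternative described earlier—real-time agents that watch data as it flows through a message queue and keep only chosen data elements—can be sketched in a few lines. Everything here is an invented illustration (the field names, the JSON message shape, the per-region running total), not any vendor’s actual product:

```python
import json
import queue

# Hypothetical agent that is selective about data elements: it watches a
# message queue and keeps only the fields an operational-analytical hybrid
# cares about. Field names and message shape are assumptions for illustration.
WATCHED_FIELDS = {"order_total", "region"}

def extract_changes(message: str) -> dict:
    """Parse one queued message and keep only the watched data elements."""
    record = json.loads(message)
    return {k: v for k, v in record.items() if k in WATCHED_FIELDS}

def run_agent(q):
    """Drain the queue, maintaining a near-real-time total per region."""
    totals = {}
    while True:
        try:
            msg = q.get_nowait()
        except queue.Empty:
            break
        change = extract_changes(msg)
        region = change.get("region")
        if region is not None:
            totals[region] = totals.get(region, 0.0) + change.get("order_total", 0.0)
    return totals

# Two order events flow through the queue; the agent sees only what it watches.
q = queue.Queue()
q.put(json.dumps({"order_total": 120.0, "region": "EMEA", "sku": "A1"}))
q.put(json.dumps({"order_total": 80.0, "region": "EMEA", "sku": "B2"}))
print(run_agent(q))  # {'EMEA': 200.0}
```

The point of the filter step is exactly the selectivity the article recommends: the agent never materializes whole records, only the handful of elements the analytical process actually consumes.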
The potential exists to derive actionable intelligence about customers, the supply chain, employees, facilities, and competitors, to name a few. This promise isn’t due to any fundamental changes in the technical capabilities of data warehouses, but may be attributed to the way we interact with content—and the changing nature of content itself.

Inside the enterprise, changes to the way we use content and data have been slow in coming. Portals, enterprise search, and business-intelligence tools are among the primary means of accessing information, and changes are incremental. Outside the enterprise, however, change is far more dramatic. The range of technologies and practices in this sphere, often referred to as Web 2.0, is characteristic of many of these developments. These rapid-fire changes impact how we create, adapt, distribute, and consume content.

The phenomenon of blogs and wikis is the most notable development in distributed content.

A central theme running through these changes is the prevalence of bottom-up, distributed approaches that are often anathema to the enterprise. Inside the enterprise, we tend to rely on a small number of centralized, CIO-approved applications for internal data intended to be most things to most people. However, that’s all changing with the gradual adoption of bottom-up, decentralized methods within the enterprise—one of the hallmarks of Enterprise 2.0.

Changing content

Content itself has changed even more than the ways we’ve used it. Over the past 15 years we’ve accumulated massive amounts of data in four key areas. The challenge going forward is to develop intelligent business applications that scale accordingly.

First, the widespread deployment of ERP systems has resulted in the systematic accumulation of business process data.
Second, there’s been dramatic growth in data about people. Businesses have increasingly captured the activities of consumers, patients, employees, and students, allowing companies to change the way they make products, deliver services, and maintain relationships.

Third, sensor data is increasingly available. The broad proliferation of sensors—including cameras, RFID tags, microphones, accelerometers, and GPS—is making the physical world directly visible to computers. The widespread use of cheap sensors and wireless communications makes it possible to collect data from remote environments extending well beyond factories—including homes, battlefields, forests, and even vineyards—often in real time.

porating insights derived from these sources of data. We’ve encountered four key challenges: using data in real time, working with probabilistic data, exploiting unstructured content, and achieving semantic integration.

Regarding the first challenge, businesses have traditionally gathered data about the past to help them decide what to do in the future. But it’s now becoming increasingly possible for businesses to act in real time. For example, mobile devices mounted to shopping carts give the retailer a channel to its customers as they stroll the store aisles. But just how to use this new capability is still being decided. If a business can whisper in the ear of a customer at any time, what should it say, and when?
of the sensors or our interpretation of the data. Currently, our emphasis is on the fusion of this type of imperfect sensor data to provide an overall account of what’s going on. The result is not a definitive account of where everyone is, but rather a probability distribution of everyone’s location. For example, the system might indicate there’s a 50% chance that an individual is in Conference Room A, a 20% chance that he’s in his office, and a 10% chance that he’s in Conference Room B.

While we can and should reduce uncertainty, we can never totally eliminate it. Our goal is to build applications that recognize and gracefully accommodate it.

The third challenge involves deriving value from unstructured content. Not only is unstructured content growing faster than traditional data, most of it is external to the organization, yet still of potential value. For example, the number of blogs doubles every five months. Furthermore, the kind of unstructured content funneled into organizations is changing. (For a related viewpoint on the explosion of data, see Thomas Davenport’s article, E2.0? Marginal At Best.)

In another example, 8 million Sprint customers swapped 300 million pictures. While it can seem overwhelming, businesses have the opportunity to glean valuable insights from this content—by analyzing opinions expressed online, gathering information about products and services in the marketplace, or reviewing pictures and video depicting a problem with a product encountered by a user. To do so, however, businesses need methods of monitoring, aggregating, and analyzing this content.

Automated approaches won’t be able to “understand” unstructured content to the same depth as humans. More modest goals, such as identifying key themes in a document or reconciling conflicting descriptions of an object, can still be useful. With this in mind, we’re developing tools and algorithms for extracting product attributes and values—such as size, material, and other specifications—from unstructured product descriptions on the Web. The methodology includes natural-language processing techniques and an active-learning feedback loop that interactively refines the inferred hypotheses in a manner that optimizes the use of human attention.

One way to make good use of unstructured content is to use it to augment traditional structured data. For example, consider a retailer that has years’ worth of transaction data. In principle, it should be possible to go back and look at customers and understand their tastes and buying habits. Yet when you actually examine the data, it often includes little more than the SKU, date of purchase, and price.

If you’re lucky, the data will tell you that the item is women’s wear—maybe even that it’s a shirt. So what does that tell you about the customer? At best, that he or she likes women’s SKUs. More of this type of data won’t help. What’s missing isn’t more data, but richer data. In this case, to learn more about the customer or the store, we need to know more about the product: Is it trendy? Conservative? Sporty? We’re unlikely to find a database with this information.

However, one place where this information implicitly exists is in the marketing associated with each product. Descriptions about fabric and style may tell us valuable things about a customer who would purchase such a shirt, a retailer who would sell it, or a brand that would produce it.

With this in mind, we built Product Profiler, a system that—given such natural-language product descriptions—uses machine learning to recognize attributes such as trendiness and sportiness. It’s effective at augmenting impoverished databases and can be used to support a number of the applications mentioned above. This and similar approaches are behind a variety of applications, including assortment planning, brand management, and catalog mapping.

For the CIO, such methods pose a series of challenges for data quality and data governance. In a 2005 Accenture study of more than 100
CIOs at global organizations, 77% said having a clearly defined approach to data quality was very important. Correctness, clarity, and completeness are critical. Yet each of these concepts needs to be reconsidered in light of the challenges and applications ahead.

When we’re working with probability distributions, for example, it’s understood that actual values may vary. Correctness becomes more a question of verifying not that the data values are correct, but that the distributions governing the possible values are correct—a much harder proposition. How should we consider completeness when we’re extracting concepts from unstructured content? Did we extract every concept we should have? Is a given extracted concept really an appropriate instance, given the context? In many cases, metrics used in information retrieval—such as recall and precision—may be more appropriate.

In the case of data governance, consider some of the challenges for probabilistic databases. What will constitute new evidence that can impact a probability? Will the methods used to update a belief be subject to governance? What will the policies be when the probability associated with a belief is updated? In Accenture’s CIO survey, 64% of respondents said defining clear data ownership is very important. Yet many of the applications we consider involve examining large numbers of external data sources and combining them in novel ways. What will be the status of information derived from these sources? Will we have to apply data-quality standards to each and every outside source?

These questions aren’t showstoppers. And there are approaches evolving for many of these issues. They simply suggest that the kinds of BI systems we’ll start seeing will require approaches to data governance and quality that differ from what we’ve traditionally relied on.

Making sense of information

Semantic integration—the ability to make sense of information represented in different ways—is the fourth challenge we increasingly face. For example, enterprise business-intelligence tools generally focus on using data that’s flowing through corporate systems to provide decision-makers with near-real-time visibility into their company’s internal operations. This can be a great improvement over what we’ve seen in the past, when a comprehensive picture of the operation might have been days or weeks out of date by the time it was compiled.

However, as they now exist, these business-intelligence systems lack awareness of what’s going on in the broader competitive ecosystem outside the enterprise. Although a great deal of information about things that customers, partners, suppliers, and competitors are doing is now available on the Internet, the information is typically unstructured; enterprises lack the tools to process these information sources and relate the information to automate their operations.

Business Event Advisor, a tool we’re developing in our labs, will model the competitive relationships that impinge on a company’s operations, then use the new model to process various sources of information. By applying a combination of text processing and automatic inference, the system will detect business-relevant events and infer their potential implications for a particular customer. In this way, the system will help knowledge workers monitor the external environment in which they operate, spotting potential threats and opportunities as early as possible. By scanning, filtering, categorizing, and analyzing, the tool will translate unstructured Web information into a stream of structured descriptions of both reported and inferred events that can be used to trigger alerts, populate decision-support portals, and integrate with the enterprise BI system.

So when a competitor’s supplier drops a product line, or when a manufacturer of complementary products changes prices, the tool will scan many sources of relevant external data, integrate the findings, and report back on the possible consequences of the news. By mining for gems in the mountain of external data, companies will be
These concepts, currently focused primarily on consumer markets, will impact enterprises as the Internet remains the primary vehicle for consumerization. Web platforms will emerge as vendor-specific implementations of Web architecture that will increasingly define the model for personal and enterprise systems.

While it’s straightforward to add specific technologies like AJAX and RSS to products, platforms, and applications, it’s more difficult to add a social dimension, such as user-contributed content, or a new kind of business model if it hasn’t been built into the application. Adding these requires rethinking the design of the system, and possibly its target audience. It’s therefore more challenging, and Gartner expects adoption within large enterprises—as opposed to startups—to be at a much lower level than the simple incorporation of technology.

Web 2.0 is consistent with what we refer to as global-class computing—an approach to designing systems and architectures that extends computing processes outside the enterprise and into the culture of consumers, mobile workers, and business partners. Key to this approach is an emphasis on interoperability via Web-based standards. The architecture of the Web, especially as it evolves, can be thought of as the basis for the next generation of global class.—David Mitchell Smith, VP and fellow at Gartner Research
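The information-retrieval metrics invoked earlier as yardsticks for concept extraction—precision and recall—are simple to compute once you have a hand-labeled reference set. A minimal sketch; the concept sets below are invented for illustration, not drawn from any real extraction run:

```python
# Precision/recall for a set of extracted concepts, compared against a
# hand-labeled reference set. Concept names are hypothetical examples.
def precision_recall(extracted: set, relevant: set) -> tuple:
    """Precision = correct / extracted; recall = correct / relevant."""
    correct = len(extracted & relevant)
    precision = correct / len(extracted) if extracted else 0.0
    recall = correct / len(relevant) if relevant else 0.0
    return precision, recall

# The extractor found four concepts; annotators say five were present.
extracted = {"shirt", "cotton", "trendy", "blue"}
relevant = {"shirt", "cotton", "trendy", "sporty", "womenswear"}

p, r = precision_recall(extracted, relevant)
print(p, r)  # 0.75 0.6
```

High precision with low recall means the extractor is cautious but misses concepts; the reverse means it over-extracts. Tracking both answers the completeness questions posed above more honestly than a single correctness score.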
Smarter Use
Of Business Intelligence
With BI as a top priority, CIOs must expand
business strategies across the entire organization
By Betsy Burton and Mark McDonald

Powerful external forces are shaping the CIO’s agenda. Enterprises are broadening their focus from providing efficient operations to creating new sources of advantage in highly competitive markets. In response, IT executives must develop business tools that absorb and analyze an explosion of information from partners and customers. This direction is evident from our January 2006 Gartner Executive Programs survey of 1,400 CIOs.

Business expectations for IT have changed dramatically—executives are pushing CIOs to move beyond cost, security, and quality and focus more on specific business needs. And CIOs believe that business intelligence (BI) can help achieve these goals.

According to the survey, BI applications are CIOs’ top technology priority in 2006—and, for the second consecutive year, business-process improvement is their No. 1 business concern (see chart, above).

These priorities highlight how CIOs are working across dual tracks to improve the business. One track will require working in the business trenches, delivering technology services and improving processes; the other will involve new capabilities to support competitive differentiation and customer choice. BI can help connect these tracks, but it requires CIOs to rethink their strategies.

To launch successful BI initiatives, CIOs must adhere to the following principles:

1. Understand overall business goals and objectives.

2. Assess the organization’s level of maturity in relation to technology infrastructure and information analysis.

3. Define an organizational structure that balances business, analysis, and technology to support the evolving and dynamic business objectives.

Changing BI focus

Traditionally, BI was considered a layered technology tied to a specific database-management system. The goal was simply to provide executives with reports and dashboardlike views of what departments were doing, how well they were doing it, and where opportunities for growth might lie. But this approach limited thinking and support. Now, organizations are increasingly supporting BI so that diverse people can better use the information and analysis. This way, multiple BI users can lead, decide, measure, manage, and optimize performance to evaluate new sources of information and reap financial benefits.

The current holistic view of BI encompasses business objectives, performance management, people, processes, analytics, reporting, online analytical processing (OLAP), and query technologies—all sitting on an information-management infrastructure. This BI is about using information and analysis to spur business growth and transformation.

To capitalize on the real value and potential of BI, users should turn more traditional approaches upside down—shifting the focus from technology that serves a small segment of decision-makers to a much broader initiative that puts people and business objectives first. Leading BI initiatives are interactive, flexible processes that take into account the needs and skills of people within the company. This means viewing BI as a continuum that spans diverse users—including managers, workers, sales representatives, senior executives, partners, customers, and suppliers—where tools identify new business opportunities, integrate business processes, and build collaboration across the business.

In assessing an organization’s readiness for using BI pervasively, Gartner uses a graph that plots the maturity of the technology infrastructure against the maturity in the use and analysis of information (see chart, above). The ideal maturity curve would approximate a linear relationship between the technology-maturity axis and information analysis: The more investment you make in BI-related technology, the broader the use and analysis of information. However, most organizations don’t follow the ideal trajectory, but fall into one of three places on the curve:

• Analytic obsession. This is where the technology infrastructure is incapable of support-

the requirements, including data quality and governance; and help the company understand
ing the broad and pervasive use of BI. how insights should be interpreted and applied
Therefore, islands of analysis have emerged,
typically sitting on top of specialized infra-
structures and managed by disparate IT
groups.
…integrated into our everyday work. This won’t just be within front-end applications. To support flexibility and diverse applications, user-oriented visualization tools will be based on finer-grained service-oriented architectures.

Infrastructure services that manage and deliver data will be constructed with the understanding that it’s the content quality and ease of translating data between multiple business-context definitions that creates the highest degree of data reuse.

CIOs should think beyond traditional roles, questions, and audiences, and consider the opportunities to support a new breed of BI users and requirements. To staff a BICC, they’ll need to find creative, energetic thinkers from every business level who will strive to make the business better.

BI’s role will continue to change dramatically as a new generation of users, applications, and processes takes hold. CIOs face a unique opportunity as information pours in from traditional BI sources, customers, and partners. By productively gathering all this information, executives can fully integrate BI with core IT architecture and business processes to develop new strategies for growth.

Betsy Burton is a VP and distinguished analyst at Gartner. Mark McDonald is a group VP and head of research in Gartner Executive Programs (EXP).
Now, banks need to “get together on a strategic level,” she advises.

Other industry observers share Burton’s point of view. “Banks have very disparate [BI] technology systems,” says Guillermo Kopp, VP, cross industry, for TowerGroup. “It’s all over the place.”

To centralize BI decision making, financial institutions are creating dedicated, cross-functional groups within the organization known as business intelligence competency centers (BICCs). BICCs are akin to advisory panels or user groups that oversee the overall business strategy within a bank. The job of the BICC is to assess the goals of business intelligence initiatives, oversee technology implementations and measure the success of completed projects. “As BI becomes increasingly more strategic, IT departments are looking for ways to manage and support deployments across divisions, regions and functions,” according to “Building a Better Business Intelligence Competency Center,” a June report from Ottawa-based BI-software provider Cognos. “A BI competency center can provide the centralized knowledge and best practices to help make this broader BI initiative possible.” (See related sidebar for more on BICCs.)

Banks with established BICCs seem happy with the results. “We have made a significant investment in business intelligence technologies and want business users to take full advantage of this,” said Yves Roelandt, department head, BICC, KBC Bank & Insurance Group (Brussels; US$402 billion in assets), in a June 2005 release from BI provider SAS (Cary, N.C.). “The competency center provides a single point of contact to the BI expertise and knowledge in our organization. This enables us to efficiently support the needs of business users and transfer the skills they need to get the intelligence they need to drive the business forward.”

The quality of the available information goes a long way toward determining the success of any BI initiative, and large data volumes — often stored in silos — present a major challenge for BI at banks. As a result, data management is a huge piece of the BI puzzle. According to Kopp, there are several stages needed in data management to achieve the ultimate goal of creating customer value: awareness, then governance and finally combination.

“We are challenged with the complexity of cultivating millions of individual customer relationships across multiple channels with increasing volumes of data,” said Matt Harris, head of CRM for London-based Barclays Bank ($575 billion in assets), in a release. To better understand and leverage its customer data, Barclays recently implemented the Teradata CRM software application from Teradata, a division of NCR (Dayton, Ohio). Harris explained that the Teradata CRM solution enables event-triggered marketing at Barclays through the creation of business rules that are automatically deployed within the bank’s data warehouse. The solution identifies significant changes in customer behavior that are indicative of new financial interests, allowing the bank to act upon the information with relevant
and timely offerings that meet the customer’s needs, according to Teradata.

Wachovia ($497 billion in assets) also turned to BI to better understand its customers. To measure customer loyalty and drive future business decisions, the Charlotte, N.C.-based bank recently chose SAS. Using survey results alongside historical customer data, the bank will attempt to understand what drivers affect customer loyalty, says Dan Thorpe, SVP and statistics and modeling director of Wachovia’s customer analysis research and targeting group.

The next step in the equation, Thorpe notes, is evaluating customer equity. Customer equity, he explains, is a lifetime measurement of how much value each customer brings to the bank, and how Wachovia can improve on that value. “Customer equity is a way of looking at the customer beyond just the products,” Thorpe says. “It’s all the products the customer will have, how long they have those products and the value of those products over a customer’s lifetime,” he adds.

“Executives want to know, if we increase a customer’s value, how much extra profit that means to the bank every year,” Thorpe continues. “We are using SAS to really understand and model our customers’ balance history, their revenues, their profits, the number of products and their tenure.” As a result, Thorpe says, Wachovia is able to offer customers products and services that meet their needs as well as determine what strategy would be most profitable for the bank.

Thorpe stresses that the bank depends heavily on SAS for the analysis, which allows the bank to have a more holistic view of what each customer needs and target its marketing, adding that Wachovia chose SAS because of the comprehensive nature of the vendor’s solution. “We nailed it faster than if we would have had to work with three or four different companies,” Thorpe says.

“It’s very important that a business intelligence solution handle everything that’s needed, from pulling data, cleaning data, creating intelligence and delivering it to the right person,” notes Susan Duchesneau, SAS global industry strategist for financial services. She recommends that banks target a robust BI solution that runs the gamut of capabilities.

But, “With sophistication comes complications,” Wachovia’s Thorpe concedes. The bank’s biggest challenges are with some of the solution’s graphical displays — or dashboards — because of the huge amount of data with which the financial institution is dealing for its more than 13 million customers, he relates. Sometimes the sheer volume of data can create information overload, Thorpe says.

Rise of Predictive Analytics

Still, the vast wealth of data at banks’ disposal presents a tremendous opportunity, and how and when banks utilize this data is changing. With advances in predictive analytics, “now” is now too late when it comes to delivering decision-making information inside the bank. Instead of real-time information, decision makers at banks now are asking for BI systems that predict the future. BI is “going up steps from reactive to proactive,” says Ronnie Ray, VP, marketing for Infovista, a Herndon, Va.-based performance-management software company.

With more and more sophisticated information, banks can spot trends and cycles, even by customer type, according to TowerGroup’s Kopp. Predictive analytics can be used across the bank, particularly in the areas of marketing, capacity planning and risk management, he says.

“For years, banks have talked about cross-selling,” says Kelly Pennock, CEO of Bellevue, Wash.-based Intelligent Results, a provider of analytics and decision-management software. “The real key to cross-selling is being able to determine what new offer will motivate which individual customers to buy a new product or service.”

Worldwide revenue from predictive analytics will grow through 2008 with a compound annual growth rate of 8 percent, according to a 2005 study from IDC. The Framingham, Mass.-based research and consulting firm defines predictive analytics as the use of sophisticated analytics, which are more complex in their mathematics than core analytics, to determine the likelihood of future trends or events.

Some banks are taking predictive analytics a step further. Cincinnati-based U.S. Bank ($208.9 billion in assets), for example, recently selected Fair Isaac’s (Minneapolis) Strategy Science tool for its risk management practice. Fair Isaac is developing a decision model for U.S. Bank’s credit line management that leverages the bank’s own data and analytics to go beyond predictions to make actual decisions. To do that, Strategy Science combines methodology, process and platform, and models the economics of customer decisions, according to Fair Isaac.

A decision model maps the relationship between multiple input variables to the range of decision choices available to the user. Strategy Science works by establishing optimal actions for organizations to take for customers in any area, says Sally Taylor-Shoff, VP of analytic product management at Fair Isaac. “It explicitly manages the trade-off between risk and reward or cost and benefit,” she adds. Strategy Science enabled U.S. Bank to see exactly what would happen if a customer’s credit line was increased and make the appropriate decision based on its risk-reward formula, Taylor-Shoff contends, helping the bank increase its profits by $7 per active account.

Intelligent Results also recently released a product that integrates BI analytics and decision management. The new software, Predigy, includes five modules that map each step in the analysis-to-action progression, according to the firm’s Pennock. With such developments in predictive and actionable analytics, “From now on, banks can focus on action, not just customer outreach,” she says.

[Sidebar]

To leverage business intelligence more effectively across the enterprise, many banks are implementing business intelligence competency centers (BICCs). In fact, many experts say that without a BICC, it is almost impossible to effectively run BI tools within an organization.

While there is no specific formula for the optimal BICC, typically a BICC includes about 10 to 12 executives, drawn from both the IT side and the business side of the bank (those who will actually be using business intelligence). According to Ottawa-based BI software provider Cognos, “As an absolute minimum,” the BICC should consist of a BICC director/manager, a business analyst and a technical consultant.

According to Cary, N.C.-based BI solutions provider SAS, a BICC should provide the following functions:

Oversight of a BI Program — Define and monitor implementation of the BI strategy; be responsible for consistent BI deployment; and provide standards, technology assessments, knowledge management, best practices and business analytics expertise.

Data Stewardship — Provide metadata management and set data standards.

Support — Respond to user questions.

BI Delivery — Oversee front-end development, testing and maintenance.

Data Acquisition — Oversee data integration and data store development, testing and maintenance.

Advanced Analytics — Guide data mining and statistical modeling.

Training — Train end users and project teams.
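The risk-reward trade-off that Taylor-Shoff describes (weighing the expected reward of a credit-line increase against the expected loss if the customer defaults) can be sketched in a few lines. This is a minimal illustration of the idea, not Fair Isaac's actual model; the probabilities, margins, and loss figures below are invented for the example.

```python
# Illustrative sketch (not Fair Isaac's Strategy Science): pick the
# credit-line action with the highest expected profit for one account.

def expected_profit(p_default: float, balance_lift: float, margin: float,
                    loss_given_default: float) -> float:
    """Expected annual profit of a line action: reward if the customer
    pays, minus the expected loss if the customer defaults."""
    return (1 - p_default) * balance_lift * margin - p_default * loss_given_default

def choose_action(p_default: float) -> str:
    """Compare a menu of hypothetical line actions and keep the best one."""
    actions = {
        "hold":     expected_profit(p_default, 0.0,    0.18, 0.0),
        "raise_1k": expected_profit(p_default, 1000.0, 0.18, 800.0),
        "raise_2k": expected_profit(p_default, 2000.0, 0.18, 1600.0),
    }
    return max(actions, key=actions.get)

# A low-risk customer justifies the larger increase; a risky one does not.
print(choose_action(0.01))  # raise_2k
print(choose_action(0.30))  # hold
```

The point of the sketch is the structure, not the numbers: each possible action is scored on both sides of the risk-reward formula before a decision is made, which is what distinguishes a decision model from a pure prediction.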
…proper approach is taken in order to ensure the successful design and implementation of an EDW.

1. Integrate, integrate, integrate

At its most granular level, an insurance organization revolves around the policy and several key business functions: underwriting, claims, actuarial and finance/investments. Over the years, the legacy business has created numerous silos of data stores, each with its own data quality protocols, business rules, data structures, etc. Each silo profile usually reflects one line of business, or a major component of the business, such as claims or policy administration. In some extreme cases, no two silos are alike in terms of structures, data types and business rules, and, as a consequence, neither is the information they produce. In order to create an environment conducive to enterprise-level business intelligence, it is necessary to integrate all these types of data into one logical and physical structure that reflects the business architecture and process flow.

It is only then, when a high degree of integration is achieved, that a comprehensive view of the insurance business can be created.

2. A multi-polar world of data - Integrated subject areas and data domains

In order to unify data under a single architecture, and supply it to the business for meaningful business intelligence, all transaction systems or information subject areas should be represented in the warehouse. Complete, clean data from all business areas will enable the creation of a unified and unique view of the entire organization’s business processes and their consequences in the marketplace, via basic unaltered data — or, as it is commonly known, a “single version of the truth.” The multiple subjects allow “information catering” for anyone interested in extracting intelligence from the data. Simultaneously, the integrated subject areas provide a complete view of the enterprise based on an unaltered and unique value of the objective business facts collected by the transaction systems.

Creating a complete view of the business is a long journey, requiring a road map that clearly identifies all milestones in building the warehouse. This will enable the integration of various subject areas in an iterative fashion based on company goals and objectives. This process of integrating subject areas will be threaded by the same architectural design concept and, like pieces of a puzzle, will add to the details of the full business picture.

It is also important to note that special consideration should be given to both logical and physical designs. Logical integration does not necessarily mean a monolithic physical architecture. The physical architecture should be flexible and allow for quick delivery and superior performance.

3. Flexible design - To normalize or denormalize?

Since enterprise data warehouses consist of an integrated architecture of various subject areas and data domains, they necessarily gravitate more toward a normalized schema. Normalization creates a flexible environment that permits rapid accommodation of data in an iterative fashion. New data will always be the reality of such data architectures. Due to the size and versatility of an EDW, changes will always occur and should be accounted for in the schema construction.

Normalization helps designers and implementers accommodate changes quickly and in the most economical manner. It also serves as a technique to directly reflect the business process workflow and can, therefore, change as fast as these processes do. Both of these observations express a crucial characteristic of data warehouses: adaptability.

It is generally difficult to anticipate what future requirements will be addressed by the warehouse or how it ultimately will be used. However, it is clear that even with advancements in hardware and RDBMS software, normalization will remain a problem in terms of performance and complexity. It may get costly to optimize and beef up the hardware to the point that performance is not an issue. The complexity of normalized structures also becomes a business issue, since less technically proficient users (business power users) of data and information will find them difficult to navigate and understand.

Therefore, for some time to come, denormalization still will be seen as a way to improve performance and, first and foremost, as a technique that simplifies data relationships for business users. It is also the basis for unifying all standard business views or conformed dimensions around the same base metrics, the same version of the truth. But does denormalization violate the principles of EDW architecture? Not necessarily. The EDW should be an instrument for organizing data, while integrating and delivering information in the most comprehensive way to all facets of an organization. As a result, denormalization needs to accommodate all types of structures without becoming totally dependent on a certain degree of granularity or native relationships between data.

4. Performance vs. scalability

…attributes that define a highly scalable architecture as well. Those attributes can include enterprise metadata, data quality, match/consolidation, performance monitoring, application check-pointing and the degree of freedom power users have when exploring the data in the warehouses. The EDW solution should be able to accommodate any type of query, regardless of its complexity and data volume, and bring back high-quality information within a reasonable amount of time. That means it is imperative to rationalize the design across all subject areas, and make it flexible enough to accept changes at any time and at any level of complexity.

To summarize, scalability for an EDW means: a flexible logical and physical model, query freedom, data quality, rich metadata, a high-availability system, high-performance loading and querying, support for high concurrency and, last but not least, tight operational control of the entire processes/systems.
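The normalize-or-denormalize trade-off above can be made concrete with a toy example: normalized structures keep each fact in exactly one place and absorb schema change easily, while a denormalized view flattens the joins so business users can read it directly. This is a schematic sketch with invented insurance tables, not a recommended warehouse design.

```python
# Toy illustration of the trade-off: normalized "tables" (each fact stored
# once) joined on demand into a flat, denormalized view for business users.

policies = [
    {"policy_id": 1, "holder_id": 10, "line": "auto"},
    {"policy_id": 2, "holder_id": 11, "line": "home"},
]
holders = {10: {"name": "Ada"}, 11: {"name": "Grace"}}
claims = [
    {"claim_id": 100, "policy_id": 1, "amount": 2500.0},
    {"claim_id": 101, "policy_id": 1, "amount": 400.0},
]

def denormalized_claims() -> list[dict]:
    """Join the normalized tables into one flat row per claim."""
    rows = []
    for c in claims:
        policy = next(p for p in policies if p["policy_id"] == c["policy_id"])
        rows.append({
            "claim_id": c["claim_id"],
            "amount": c["amount"],
            "line": policy["line"],
            "holder": holders[policy["holder_id"]]["name"],
        })
    return rows

for row in denormalized_claims():
    print(row)
```

Note how a change such as renaming a holder touches one record in the normalized tables but would touch every copied row in a physically denormalized store; that is the adaptability the article attributes to normalization, while the flat view is what makes the data navigable for power users.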
Operational Intelligence Enters the Spotlight

Understanding the technology components of OI.

By Dan Everett

Summary

Ventana Research has defined a new category of software called operational intelligence (OI). For front-line workers who need to improve their execution of the daily tasks that contribute to achieving strategic goals, OI reduces latency in awareness, evaluation and management of changes in the state of business performance. Unlike business intelligence (BI), OI uses an event-driven architecture to detect the current state of activities and processes and analyze them against expected states. Unlike business activity monitoring (BAM), OI embeds intelligence into process workflows that helps users determine the most appropriate response to threats and opportunities.

Operational intelligence incorporates technology components from other software categories, among them business intelligence, business rules, complex event processing and business process management. These components address different aspects of events, the data they contain and the processes they are part of, to improve front-line decision-making.

View

The first requirement of OI is to integrate events from multiple sources that flow across the enterprise service bus. Doing this requires a broad spectrum of input adapters for things such as system logs, network protocols, relational databases, API calls, messaging queues and Web services. The input adapters must be able to detect events at different points in the business process workflow to enable throughput and temporal analysis. (From an OI perspective, an “event” is an object that contains information about a change in the state of an operational activity or process. For example, a new customer order would be an event that contains information such as products, quantity and price, which is used in the process of filling the order.)

Further, OI must be able to detect events across multiple processes and nonlinear workflows in a single process, which enables correlation of activities to changes in business performance. This last aspect requires workflow modeling capabilities or the ability to integrate with modeling tools using workflow specification languages such as Business Process Execution Language (BPEL).

OI also requires the capability to model the attributes, constraints and dependencies of events. As an information object, an event is an entity that has attributes, such as data values, time of occurrence, process ID and workflow…
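The event notion described above, an object carrying information about a state change with attributes such as data values, time of occurrence and process ID, can be sketched as follows. The field names and the review threshold are assumptions for illustration, not part of any Ventana or vendor specification.

```python
# Minimal sketch of an OI-style event: an object describing a change in
# the state of an operational activity, checked against an expected state.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Event:
    process_id: str          # which business process emitted the event
    activity: str            # which activity changed state
    attributes: dict         # payload, e.g. products, quantity, price
    occurred_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def check_order(event: Event, max_total: float = 10_000.0) -> str:
    """Compare an order event's state against an expected state:
    flag unusually large orders for review instead of straight-through
    processing."""
    total = event.attributes["quantity"] * event.attributes["price"]
    return "review" if total > max_total else "ok"

order = Event("p-42", "new_order",
              {"product": "widget", "quantity": 500, "price": 25.0})
print(check_order(order))  # review: 500 * 25.0 exceeds the threshold
```

A real OI deployment would receive such events through the input adapters described above (log readers, message-queue consumers, Web-service listeners) rather than constructing them by hand; the sketch only shows the shape of the object and the detect-and-evaluate step.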
…report formulas to Excel formulas, and preservation of formulas and formatting when a query is refreshed.

Our research on operational BI shows that front-line workers access information most frequently through prebuilt HTML or PDF reports. These workers also run custom reports to get information not available via the prebuilt templates; they do so by selecting query parameters from predefined lists built into the user interface. The most important criteria for user satisfaction here are access to financial and operational data, the ability to support multiple output formats, ease of scheduling report delivery and the ability to create custom reports without having to know SQL or Excel functions and macros.

Assessment

Our research shows user evaluations of application success are significantly higher for IT organizations that involve users in defining the information-access requirements. So IT organizations that want business users to recognize the value they provide should ensure BI interfaces are tailored to the needs of the various business groups that use them. Ventana Research therefore recommends that when evaluating vendors as candidates to become the corporate standard for BI, a primary evaluation criterion should be the product’s ability to flexibly address users’ access needs, technical skills and roles in the organization.

About Ventana Research

Ventana Research is the leading Performance Management research and advisory services firm. By providing expert insight and detailed guidance, Ventana Research helps clients operate their companies more efficiently and effectively. These business improvements are delivered through a top-down approach that connects people, process, information and technology. What makes Ventana Research different from other analyst firms is a focus on Performance Management for finance, operations and IT. This focus, plus research as a foundation and reach into a community of over two million corporate executives through extensive media partnerships, allows Ventana Research to deliver a high-value, low-risk method for achieving optimal business performance. To learn how Ventana Research Performance Management workshops, assessments and advisory services can impact your bottom line, visit www.ventanaresearch.com.
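The reporting pattern Ventana describes, in which front-line users select query parameters from predefined lists rather than writing SQL, can be sketched like this. The parameter lists, report data and output formats below are invented for illustration; real BI tools implement the same validate-then-render idea inside their report UIs.

```python
# Sketch of parameterized reporting: users pick parameters from
# predefined lists; anything outside the list is rejected before the
# query runs, so no SQL knowledge (or SQL injection) is involved.

ALLOWED = {
    "region": ["east", "west"],
    "format": ["html", "pdf"],
}

def run_report(region: str, fmt: str, sales: dict) -> str:
    """Validate parameters against the predefined lists, then render."""
    if region not in ALLOWED["region"] or fmt not in ALLOWED["format"]:
        raise ValueError("parameter not in predefined list")
    body = f"Sales for {region}: {sales[region]:,}"
    return f"<p>{body}</p>" if fmt == "html" else body

print(run_report("east", "html", {"east": 120_000, "west": 95_000}))
```

Here the predefined lists double as both the UI's dropdown contents and the server-side validation, which is one common way to give non-technical users custom reports safely.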
SAP surrounds its applications stack with reporting and analysis, whereas Oracle considers the applications stack and the data, Hagerty said.

By year’s end, Oracle Business Intelligence Suites will support SAP’s warehouse management application, said Christina Kolotouros, director of Oracle BI Product Management. “The applications also will integrate tightly with Microsoft Office,” she said.

Both the Enterprise Edition and the Standard Edition are available now. The Enterprise Edition costs $1,500 per user. The Standard Edition costs $400 per user.

The Standard Edition One, meant to compete with Microsoft Corp.’s enterprise applications geared toward small and midsize businesses, will roll out after June 1 this year.