
Traditional vs. Agile BI

INTRODUCTION

Since Business Intelligence emerged into mainstream awareness in the 1990s, the imperative of delivering a “single version of the truth” has been an extremely challenging vision to realize in most organizations. Legacy BI has centered on the assumption that more information yields better decisions, and apart from supporting highly routine decisions made in mature, stable environments, this model has largely resulted in failure. Three factors are mainly to blame: 1) the difficulty and time required to integrate all of an organization’s data before any analysis can be done has made implementations extremely challenging; 2) legacy BI technology has been optimized not for how decisions are made, but for working around technical limitations (many of which have been removed or are rapidly being removed); and 3) traditional BI platforms are unable to adapt to change, which is inevitable given that we operate in a competitive environment where everything is dynamic.

Traditional BI technologies have focused on solving data storage, integration, processing, and presentation issues. With the goal of decision support left unachieved, a new model of BI has emerged, called Agile BI, built on opposing assumptions: looser data integration, the use of less but more targeted information to make decisions, and the reality of continuous system change. Agile BI shifts the focus from data-driven to decision-driven.

MORE IS BETTER

Legacy BI is based on the assumption that by having access to every piece of information about every aspect of a business process, we can make better decisions. This so-called “single version of the truth” has led to the ideal of the “Enterprise Data Warehouse,” in which a complete, unified view of the entire enterprise can be found. By knowing everything, in the context of everything else, our decision making will be foolproof.

This is a very attractive idea that unfortunately just doesn’t work in practice. Even if it is theoretically possible to construct a unified, complete “single version of the truth,” the competition will likely have outmaneuvered you long before you are able to act upon it. And in a competitive environment, the “truth” changes as new markets open, as new competitors come onto the field, and as market dynamics change the game.

Not only is it impractical for most organizations to build a universal view, but current research in the decision sciences also indicates that decision models that attempt to include all available information don’t perform well in the real world. Such models do a great job of “predicting” the data you already have, but fail in new situations. And if you want humans to participate in the decisions, understand them, and take action, more information inputs and greater complexity lead to poorer adoption, as well as to difficulty in judging when the model might be failing. These factors have been echoed in poor BI adoption rates and in the spectacular failures of so-called “data-driven” organizations during the recent financial industry crises.

Agile BI focuses on the requirements of the decisions being made, rather than on corralling all available data. Data may be tightly integrated to support decisions, or it may be loosely joined without the need for conformed dimensional models. As the decision model changes, information that seemed critical may fade in importance and new data source requirements will emerge. To be Agile, BI must quickly integrate (and dis-integrate) this information for the decision maker.
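To make the contrast concrete, the sketch below loosely joins two hypothetical extracts on a single shared key using Python and pandas; the file names, columns, and join key are illustrative assumptions, not features of any particular product.

    # A minimal sketch of "loosely joined" decision support: two sources are
    # combined on an ad hoc key for one decision, with no conformed dimensions.
    import pandas as pd

    # Hypothetical extracts: an internal sales feed and an external market file.
    sales = pd.read_csv("sales_extract.csv")      # e.g. region, product, revenue
    market = pd.read_json("market_share.json")    # e.g. region, competitor_share

    # Join only on the attribute the current decision needs (here, region).
    view = sales.merge(market, on="region", how="left")

    # If the decision model changes, this source can be dropped as quickly as it
    # was added; no enterprise-wide model has to be reworked.
    summary = view.groupby("region").agg(
        revenue=("revenue", "sum"),
        competitor_share=("competitor_share", "mean"),
    )
    print(summary)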

THE “TRUTH” CHANGES CONSTANTLY

The success of the Agile methodology in software development is largely due to the fact that it accepts constant change as the norm. Every principle of the methodology is centered on delivering value-producing functionality quickly, and in a way that anticipates significant changes in direction. Much like the software industry, BI has historically been plagued by constant changes in requirements. Anecdotes abound of end users viewing a report for the first time and immediately responding with new requirements. Yet despite this, legacy BI architecture has failed to achieve any form of agility.

Executive Summary

This paper explains the fundamental assumption of traditional BI platforms that was made when business intelligence first emerged into the mainstream in the 1990s, and why it is no longer valid. Given this false assumption, we put forth the implications for how traditional platforms operate, and how this compares to more agile, lightweight platforms.


Traditional BI technologies, in assuming the goal of a single version of the truth, have focused largely on overcoming the performance issues associated with meeting that goal. They have done so through a heavyweight process of transforming data, dimensional modeling, summarizing or “cubing” data, creating metadata layers, and so on. This architecture builds key aspects of the decision process into every layer of the stack. The implication is that even relatively simple requirement changes can trigger significant rework throughout the entire architecture. For example, changes to source systems, ETL jobs, the dimensional model, and the metadata often take months to deliver.
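To see why the rework ripples so far, consider the toy sketch below, which models the legacy pipeline as a chain of stages in Python; the function and field names are hypothetical, but adding even one new reporting attribute would have to be threaded through every stage.

    # Toy model of a legacy BI pipeline: each stage bakes decision assumptions in,
    # so a single new reporting attribute must be carried through all of them.

    def extract(source_rows):
        # 1. ETL job: must change to pull any new column from the source system.
        return [{"region": r["region"], "revenue": r["revenue"]} for r in source_rows]

    def conform(rows):
        # 2. Dimensional model: the attribute must be added to a conformed dimension.
        return [{**r, "region_key": r["region"].upper()} for r in rows]

    def cube(rows):
        # 3. Summary/cube layer: aggregates must be rebuilt around the new attribute.
        totals = {}
        for r in rows:
            totals[r["region_key"]] = totals.get(r["region_key"], 0) + r["revenue"]
        return totals

    def report(totals):
        # 4. Metadata/presentation layer: the report definition changes last.
        for key, value in sorted(totals.items()):
            print(f"{key}: {value}")

    report(cube(conform(extract([
        {"region": "east", "revenue": 100},
        {"region": "west", "revenue": 250},
    ]))))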

When decisions are well known (“routine decisions”), such an architecture can support them; the requirements for basic financial reporting, for example, don’t change often. But many times we need to make decisions that are novel, such as which markets to expand into, which products to introduce, or how to respond to a new competitor. These types of decisions often require new information and are often very iterative in nature: the way the decision is made changes as the decision maker gains more information. And once made, such decisions can change the landscape entirely. This is not a job for legacy BI.

The monolithic approach of legacy BI has actually led desktop analysis tools (king of which is the spreadsheet) to become the standard for such decisions. When Oracle decides to acquire another BI vendor, that decision will be made in Excel, not OBIEE. Why? Agility: the agility to change the decision model on the fly. The monolithic architecture was required when 16-bit computing and nascent relational database technology made performance the primary barrier to decision support. With the emergence of 64-bit computing, columnar databases, cloud computing, and extreme-data-volume technologies like Hadoop, legacy BI architecture needlessly sacrifices agility to solve last century’s performance barriers. If BI is to be Agile, it must adopt an architecture that assumes constant change in requirements at all levels and is focused on the decision being made.

AGILITY: DRIVING CHANGE, NOT RESPONDING

David Weinberger, a senior researcher at Harvard’s Berkman Center, talks about a phenomenon he calls the “changing shape of knowledge.” The idea is very much at the root of why BI requirements change so much: as we learn more, we tend to change the way we view what we knew in the first place. Most organizations struggle to keep up with the changing shape of knowledge in the marketplace, and legacy BI likewise consistently fails to respond to these changes.

Weinberger studies the impact of the internet on society, and the internet’s role in driving information globally has been the key information technology success of the last century. The internet was built on assumptions completely antithetical to legacy BI, providing small, focused pieces of information that could be loosely joined to any other information stored anywhere in the world. And the move to Web 2.0 has taken the web from linking pages to a world in which we can now mash up rich applications.

Organizations have started to move away from wholesale adoption of full ERP packages and back to a best-of-breed approach supported by standards-based integration architectures such as SOAP-based web services. Capabilities can be added, changed, or removed without the need to completely re-architect the entire system, thus providing agility to the business. This architecture achieves the same type of flexibility that has made spreadsheets proliferate, but provides a powerful framework to avoid isolated, redundant, and conflicting information silos.
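As a minimal sketch of this loose coupling, assume a hypothetical currency-rates capability: as long as each provider honors the same small interface, one service can be swapped for another without re-architecting anything that depends on it.

    # Sketch of best-of-breed integration behind a stable interface. The providers
    # and endpoint below are hypothetical stand-ins for whatever standards-based
    # services (SOAP, REST, etc.) an organization actually uses.
    import json
    import urllib.request
    from typing import Protocol

    class RatesProvider(Protocol):
        def latest_rate(self, currency: str) -> float: ...

    class HttpRatesProvider:
        """Calls an external web service; the URL shape is an assumption."""
        def __init__(self, base_url: str) -> None:
            self.base_url = base_url

        def latest_rate(self, currency: str) -> float:
            with urllib.request.urlopen(f"{self.base_url}/rates/{currency}") as resp:
                return float(json.load(resp)["rate"])

    class FixedRatesProvider:
        """A stand-in capability; swapping it in requires no re-architecting."""
        def latest_rate(self, currency: str) -> float:
            return {"EUR": 1.1, "GBP": 1.3}.get(currency, 1.0)

    def convert(amount: float, currency: str, provider: RatesProvider) -> float:
        # Callers depend only on the interface, not on any one provider.
        return amount * provider.latest_rate(currency)

    print(convert(100.0, "EUR", FixedRatesProvider()))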

Agile BI will follow this model, allowing domain- and decision-specific information applications to be joined together to form BI platforms. These information applications will no longer be stand-alone “enterprise BI systems,” but will often appear embedded in the context of the transactional or other applications already in use. And as requirements change, applications can be added, removed, or updated quickly, because they don’t require an assessment of their impact on a universal “single version of the truth” data model.

Agility is about driving these changes in the marketplace. Think of Wal-Mart or Amazon: paragons of Business Intelligence that stretch their implementations beyond traditional views of BI and use their insights to redefine and dominate their industries. To obtain and maintain such competitive positions, these organizations cannot wait for legacy architectures to catch up to emerging requirements. When you are redefining the competitive rules, you need decision support systems that can keep up.

A NEW LANDSCAPE FOR BI

The environment in which any decision support system must operate has completely transformed since the early attempts to create an EIS in the 1970s. While legacy BI architectures continue to hold many of the same assumptions about information and computing that were true in the early 1990s, we are seeing virtualization and cloud computing, Web 2.0 technologies, emerging standards and Service-Oriented Architectures, advanced analytics and visual analysis, and a variety of other innovations that have completely changed the landscape.

The current imperative in BI is to abandon the assumptions that have led to such rigid solutions and to leverage modern approaches to decision support that provide greater agility for the business. BI teams must move beyond legacy BI architectures and adopt technologies that support a rapid, iterative development style. The ability to rapidly source information, connect it to other information in both a tightly and a loosely integrated fashion, and quickly connect BI applications together will be critical in meeting rapidly changing requirements.